Intro Machine Learning
Strata 2 – Teknologi Informasi
STTS
Computer Science Areas
• Artificial Intelligence
• Web Programming
• Algorithm and Data Structure
• Computer Architecture
• Computer Graphics
• Software Engineering
• Database and Operating System
• Theory of Computation
• etc.
References:
https://en.wikipedia.org/wiki/Outline_of_computer_science
http://www.cs.cornell.edu/Info/Department/Ugrad/Subareas.html
Computer Science: definition
Computer Science is the study of computers and
computational systems.
Unlike electrical and computer engineers,
computer scientists deal mostly with software
and software systems; this includes their theory,
design, development, and application.
References:
http://undergrad.cs.umd.edu/what-computer-science
Artificial Intelligence: definition
Artificial Intelligence (AI) is an area of computer
science that emphasizes the creation of
intelligent machines that work and react like
humans.
Parts of Artificial Intelligence
• Expert System / KBS
• Machine Learning
• Solution Search
• Computer Vision
• Digital Image Understanding
• Digital Image Processing
• Game Playing
• Voice Recognition
• Speech Recognition
• Robotics
Machine Learning
• Machine Learning: the field of study that gives computers the ability to learn without being explicitly programmed (Arthur Samuel, 1959).
Machine Learning
(Diagram: Problem, Learning Process, Data)
Terms:
• Supervised
• Unsupervised
• Discrete
• Continuous
Supervised vs Unsupervised
(Figure: two scatter plots, each with axes x1 and x2)
Example 1
(Figure: house prices, Price ($) in 1000's from 0 to 400 on the y-axis, against Size in feet² from 0 to 2500 on the x-axis, with a linear fit and a polynomial fit; the "?" marks the predicted price at 750 feet²)
By Learning Method: Supervised (Regression)
By Data: Continuous
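As a rough illustration of Example 1, here is a minimal regression sketch in Python, assuming NumPy is available; the house sizes and prices are made-up values, not data from the slide, and only the 750 feet² query comes from the figure.

```python
import numpy as np

# Illustrative training data (size in feet^2, price in $1000's);
# the specific numbers are made up for this sketch.
sizes = np.array([500, 1000, 1500, 2000, 2500], dtype=float)
prices = np.array([100, 180, 250, 320, 390], dtype=float)

# Fit a linear model: price = a * size + b.
a, b = np.polyfit(sizes, prices, deg=1)

# Fit a polynomial model as well, like the second curve on the slide.
poly = np.polyfit(sizes, prices, deg=2)

# Predict the price of a 750 feet^2 house, the "?" in the figure.
query = 750
print("linear prediction:", a * query + b)
print("polynomial prediction:", np.polyval(poly, query))
```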
Example 2
Breast Cancer: Malignant OR Benign
(Figure: Malignant? as 1 (Y) or 0 (N), plotted against Tumor Size)
By Learning Method: Supervised (Classification)
By Data: Discrete → YES or NO
Other Example 3
- Clump Thickness
- Uniformity of Cell Size
- Uniformity of Cell Shape
- …
(Figure: Age plotted against Tumor Size)
By Learning Method: Supervised (Classification)
By Data: Discrete → YES or NO
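A minimal sketch of the tumor-size classifier in Example 2, assuming scikit-learn is available; the tumor sizes, labels, and the choice of logistic regression are illustrative assumptions, since the slides do not name a specific classifier.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative data: tumor size (single feature) and a discrete label,
# 1 = malignant (Y), 0 = benign (N). The numbers are made up.
tumor_size = np.array([[1.0], [1.5], [2.0], [3.0], [3.5], [4.0], [5.0], [6.0]])
malignant = np.array([0, 0, 0, 0, 1, 1, 1, 1])

# Supervised classification: fit on labeled examples.
clf = LogisticRegression()
clf.fit(tumor_size, malignant)

# Predict whether a new tumor of size 3.2 is malignant (1) or benign (0).
print(clf.predict([[3.2]]))
```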
Exercise
• Problem 1:
You have a large inventory of identical items. You want to
predict how many of these items will sell over the next 3
months.
• Problem 2:
You’d like software to examine individual customer accounts,
and for each account decide if it has been hacked/
compromised.
Answer
• Treat problem 1 as a regression problem
• Treat problem 2 as a classification problem
Classification Example
Training
(Diagram: Training Images → Image Features → Training, together with Training Labels → Trained Classifier)
Cont..
Testing
(Diagram: Image Features → Trained Classifier → Prediction, e.g. "Outdoor")
Classification
• Given some set of features with corresponding labels
• Learning a function to predict the labels from the features
• Training labels dictate whether two examples are the same or different, in some sense
• Features and distance measures define similarity
• Classifiers try to learn weights or parameters for features and distance measures so that feature similarity predicts label similarity
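A minimal sketch of the training/testing pipeline from the previous slides, assuming scikit-learn; the feature vectors, the nearest-neighbor classifier, and the "Outdoor"/"Indoor" labels are illustrative assumptions.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Illustrative image features (e.g. simple color statistics) and training labels.
train_features = np.array([[0.9, 0.1], [0.8, 0.2], [0.2, 0.9], [0.1, 0.8]])
train_labels = np.array(["Outdoor", "Outdoor", "Indoor", "Indoor"])

# Training: features + labels -> trained classifier.
classifier = KNeighborsClassifier(n_neighbors=1)
classifier.fit(train_features, train_labels)

# Testing: features of a new image -> prediction.
test_features = np.array([[0.85, 0.15]])
print(classifier.predict(test_features))   # e.g. ["Outdoor"]
```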
Clustering Example
Cont..
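The clustering slides are image-only; as an illustration, here is a minimal unsupervised sketch using k-means, assuming scikit-learn; the data points and the choice of k-means itself are assumptions, since the slides do not name an algorithm.

```python
import numpy as np
from sklearn.cluster import KMeans

# Illustrative, unlabeled 2-D points forming two loose groups.
points = np.array([[1.0, 1.1], [0.9, 1.0], [1.2, 0.8],
                   [5.0, 5.2], [5.1, 4.9], [4.8, 5.0]])

# Unsupervised: no labels are given; the algorithm groups the points itself.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0)
cluster_ids = kmeans.fit_predict(points)
print(cluster_ids)   # e.g. [0 0 0 1 1 1]
```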
Exercise
(supervised OR unsupervised)
• Given email labeled as spam/not spam, learn a spam filter.
• Given a set of news articles found on the web, group them into sets of articles about the same story.
• Given a database of customer data, automatically discover market segments and group customers into different market segments.
• Given a dataset of patients diagnosed as either having diabetes or not, learn to classify new patients as having diabetes or not.
Intro to NN
Neural Network (NN)
• Origins: Algorithms that try to mimic the brain.
• Was very widely used in the 80s and early 90s; popularity diminished in the late 90s.
• Recent resurgence: State-of-the-art technique for many applications.
NN Representation
NN Model
A Neural Network has more than 20 models, each of which can be distinguished by:
• Architecture
• How it learns
Architecture
• Fully Connected Graph
• Feed Forward
Terms
(Diagram legend: Node / Vertex, Edge, Layer)
McCulloch-Pitts
(Diagram: inputs X1, X2, X3, …, Xn with weights W1, W2, W3, …, Wn connected to output Y1)
• The input nodes are called X
• The weights are called W
• The layer-1 weights are called V
• The layer-2 weights are called W (most of these networks use only up to 2 layers)
• The change in V is called ΔV
• The change in W is called ΔW
• The output is called Y
Sum of Products
• The sum of the products of inputs and weights:
$Y_{in} = (x_1 \cdot w_1) + (x_2 \cdot w_2) + (x_3 \cdot w_3) + \dots + (x_n \cdot w_n) = \sum_{i=1}^{n} x_i \cdot w_i$
Activation Function
• Binary Hard Threshold:
$f(x) = \begin{cases} 1, & \text{if } x \geq \theta \\ 0, & \text{otherwise} \end{cases}$
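A minimal Python sketch of the sum of products and the binary hard threshold defined above; the function names are made up for the sketch.

```python
def sum_of_products(x, w):
    """Weighted sum of the inputs: Y_in = sum_i x_i * w_i."""
    return sum(xi * wi for xi, wi in zip(x, w))

def hard_threshold(y_in, theta):
    """Binary hard threshold activation: 1 if y_in >= theta, else 0."""
    return 1 if y_in >= theta else 0

def mcculloch_pitts(x, w, theta):
    """One McCulloch-Pitts neuron: weighted sum followed by the threshold."""
    return hard_threshold(sum_of_products(x, w), theta)
```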
Example 1
• The following example shows that the weights represent knowledge, so that when a question (an input) is given, the answer can be determined.
(Diagram: X1 and X2 connected to Y1 with weights 1 and 1)

X1  X2  Y1
0   0   0
0   1   0
1   0   0
1   1   1

With θ = 2 and
$f(x) = \begin{cases} 1, & \text{if } x \geq \theta \\ 0, & \text{otherwise} \end{cases}$
Example 2
(Diagram: X1 and X2 connected to Y1 with weights 2 and 2)
With θ = 2 and
$f(x) = \begin{cases} 1, & \text{if } x \geq \theta \\ 0, & \text{otherwise} \end{cases}$

X1  X2  Y1
0   0   0
0   1   1
1   0   1
1   1   1
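A quick check of the truth tables in Examples 1 and 2; the weights and θ are taken from the slides, while the mcculloch_pitts helper is a made-up name for the neuron sketched earlier.

```python
def mcculloch_pitts(x, w, theta):
    # Weighted sum followed by the binary hard threshold.
    return 1 if sum(xi * wi for xi, wi in zip(x, w)) >= theta else 0

# Example 1: weights (1, 1) with theta = 2 gives AND.
# Example 2: weights (2, 2) with theta = 2 gives OR.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2,
              "AND:", mcculloch_pitts([x1, x2], [1, 1], theta=2),
              "OR:", mcculloch_pitts([x1, x2], [2, 2], theta=2))
```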
Example 3
Think about what the weights should be for:
X1 AND (NOT X2)
(Diagram: X1 with weight 2 and X2 with weight -1 connected to Y1)
and
X2 AND (NOT X1)
(Diagram: X1 with weight -1 and X2 with weight 2 connected to Y1)
With θ = 2
Example 4
• Think about what the weights should be to handle:
X1 XOR X2
This is the same as...
(X1 AND (NOT X2)) OR (X2 AND (NOT X1))
(Diagram: a two-layer network; X1 and X2 connect to hidden node Z1 with weights 2 and -1, and to hidden node Z2 with weights -1 and 2; Z1 and Z2 connect to Y1 with weights 2 and 2)
With θ = 2
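A minimal sketch of the two-layer XOR network in Example 4, using the weights and θ = 2 shown on the slide; the helper and function names are made up.

```python
def neuron(x, w, theta=2):
    # McCulloch-Pitts unit: weighted sum + binary hard threshold.
    return 1 if sum(xi * wi for xi, wi in zip(x, w)) >= theta else 0

def xor_network(x1, x2):
    # Hidden layer: Z1 = X1 AND (NOT X2), Z2 = X2 AND (NOT X1).
    z1 = neuron([x1, x2], [2, -1])
    z2 = neuron([x1, x2], [-1, 2])
    # Output layer: Y1 = Z1 OR Z2.
    return neuron([z1, z2], [2, 2])

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "XOR:", xor_network(x1, x2))
```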
Cases
• Linearly Separable: AND, AND NOT, OR
(Figure: three plots in the x1-x2 plane, one each for AND, AND NOT, and OR, each separable by a single straight line)
• Non-Linearly Separable: XOR
(Figure: a plot in the x1-x2 plane for XOR, which cannot be separated by a single straight line)
Summary
• Different weights can store different knowledge
• The more nodes there are, the more complex the knowledge that can be stored