(Rule-Based Classifier) Consider the decision tree below.
Convert the decision tree into a set of classification rules (a general sketch of the rule-extraction mechanics follows these questions).
Are the rules mutually exclusive?
Is the rule set exhaustive?
Is ordering needed for this set of rules?
Do you need a default class for the rule set?
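Since the decision tree itself is not reproduced here, the minimal Python sketch below uses a small hypothetical tree (the attribute names, branch values, and class labels are placeholders, not those from the question) to illustrate the general mechanics: each root-to-leaf path becomes one IF-THEN rule.

```python
# Minimal sketch: extracting classification rules from a (hypothetical) decision tree.
# The attributes, values, and class labels below are placeholders only.

tree = {
    "attr": "Outlook",
    "branches": {
        "Sunny": {"attr": "Humidity",
                  "branches": {"High": "No", "Normal": "Yes"}},
        "Overcast": "Yes",
        "Rain": "No",
    },
}

def extract_rules(node, conditions=()):
    """Walk every root-to-leaf path; each path becomes one IF-THEN rule."""
    if isinstance(node, str):                      # leaf: class label
        return [(list(conditions), node)]
    rules = []
    for value, child in node["branches"].items():
        rules += extract_rules(child, conditions + ((node["attr"], value),))
    return rules

for conds, label in extract_rules(tree):
    antecedent = " AND ".join(f"{a} = {v}" for a, v in conds)
    print(f"IF {antecedent} THEN class = {label}")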
(Nearest Neighbor Classifier) The table below lists a dataset that was used to create a nearest neighbor model that predicts whether it will be a good day to go surfing.
ID   Wave Size (ft)   Wave Period (secs)   Wind Speed (mph)   Good Surf
1    6                15                   5                  Yes
2    1                6                    9                  No
3    7                10                   4                  Yes
4    7                12                   3                  Yes
5    2                2                    10                 No
6    10               2                    20                 No
Assuming that the model uses Euclidean distance to find the k nearest neighbors, what prediction will the model return for the following query instance when k = 1 and when k = 3? (A computational sketch follows the query table.)
Wave Size (ft)   Wave Period (secs)   Wind Speed (mph)   Good Surf
5                6                    7                  ?
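As a sanity check of the hand calculation, the sketch below (Python, using the table above) computes the Euclidean distance from the query to every training instance and returns the majority class among the k nearest neighbors for k = 1 and k = 3.

```python
from collections import Counter
from math import dist

# Training data from the table: (wave size, wave period, wind speed) -> Good Surf
train = [
    ((6, 15, 5), "Yes"),
    ((1, 6, 9), "No"),
    ((7, 10, 4), "Yes"),
    ((7, 12, 3), "Yes"),
    ((2, 2, 10), "No"),
    ((10, 2, 20), "No"),
]
query = (5, 6, 7)

def knn_predict(query, train, k):
    """Majority class among the k training points closest in Euclidean distance."""
    neighbors = sorted(train, key=lambda row: dist(query, row[0]))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

for k in (1, 3):
    print(f"k = {k}: {knn_predict(query, train, k)}")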
(Naïve Bayes Classifier) The table below gives details of the symptoms that patients presented with and whether they were suffering from meningitis.
ID   Headache   Fever   Vomiting   Meningitis
1    True       True    False      False
2    False      True    False      False
3    True       False   True       False
4    True       False   True       False
5    False      True    False      True
6    True       False   True       False
7    True       False   True       False
8    True       False   True       True
9    False      True    False      False
10   True       False   True       True
Headache   Fever   Vomiting   Meningitis
False      False   True       ?
Use a naïve Bayes classifier to determine whether a patient with the above symptoms has meningitis.
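A minimal sketch of the calculation, assuming plain relative-frequency estimates with no smoothing: it compares the unnormalized scores P(M) * P(Headache | M) * P(Fever | M) * P(Vomiting | M) for Meningitis = True and Meningitis = False at the query Headache = False, Fever = False, Vomiting = True.

```python
# Dataset from the table: (Headache, Fever, Vomiting, Meningitis)
data = [
    (True,  True,  False, False),
    (False, True,  False, False),
    (True,  False, True,  False),
    (True,  False, True,  False),
    (False, True,  False, True),
    (True,  False, True,  False),
    (True,  False, True,  False),
    (True,  False, True,  True),
    (False, True,  False, False),
    (True,  False, True,  True),
]
query = (False, False, True)  # Headache, Fever, Vomiting

def score(cls):
    """Unnormalized naive Bayes score: P(M = cls) * prod_i P(x_i | M = cls)."""
    rows = [r for r in data if r[3] == cls]
    prior = len(rows) / len(data)
    likelihood = 1.0
    for i, value in enumerate(query):
        likelihood *= sum(r[i] == value for r in rows) / len(rows)
    return prior * likelihood

scores = {cls: score(cls) for cls in (True, False)}
print(scores, "-> prediction:", max(scores, key=scores.get))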
(Bayesian Network) Given the Bayesian network below, what is the joint probability P(Exercise = Yes, Diet = Healthy, Heart Disease = No, Chest Pain = No, Blood Pressure = Low)?
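The network figure and its conditional probability tables are not reproduced here, so the sketch below only illustrates the mechanics under an assumed structure (Exercise and Diet as parents of Heart Disease, which in turn is the parent of Chest Pain and Blood Pressure); every probability value in the code is a placeholder to be replaced with the numbers from the figure.

```python
# Placeholder CPT values -- substitute the numbers from the network's tables.
p_exercise_yes = 0.7          # P(Exercise = Yes)                      (placeholder)
p_diet_healthy = 0.25         # P(Diet = Healthy)                      (placeholder)
p_hd_no = 0.75                # P(HD = No | Exercise = Yes, Diet = Healthy)  (placeholder)
p_cp_no_given_hd_no = 0.9     # P(Chest Pain = No | HD = No)           (placeholder)
p_bp_low_given_hd_no = 0.8    # P(Blood Pressure = Low | HD = No)      (placeholder)

# Joint probability under the assumed structure:
# P(E, D, HD, CP, BP) = P(E) * P(D) * P(HD | E, D) * P(CP | HD) * P(BP | HD)
joint = (p_exercise_yes * p_diet_healthy * p_hd_no
         * p_cp_no_given_hd_no * p_bp_low_given_hd_no)
print(joint)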
(Perceptron) Given the perceptron Y = sign(0.4X1 + 0.3X2 + 0.7X3 - 0.5), what is the prediction when X1 = 0.5, X2 = -0.5, X3 = 0.8?
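A short check of the arithmetic, assuming the common convention that sign(0) is treated as +1:

```python
# Perceptron output: Y = sign(0.4*X1 + 0.3*X2 + 0.7*X3 - 0.5)
def perceptron(x1, x2, x3):
    activation = 0.4 * x1 + 0.3 * x2 + 0.7 * x3 - 0.5
    return 1 if activation >= 0 else -1   # assumes sign(0) maps to +1

print(perceptron(0.5, -0.5, 0.8))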
(SVM) Given w = (0.4, 0.3, 0.7) and b = -0.5.
What is the margin of the SVM?
What is the prediction when X1 = -0.5, X2 = -0.5, X3 = 0.8? (2 points)
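A minimal sketch, assuming the standard formulation in which the margin is the distance 2/||w|| between the supporting hyperplanes w·x + b = +1 and w·x + b = -1, and the prediction is sign(w·x + b):

```python
from math import sqrt

w = (0.4, 0.3, 0.7)
b = -0.5

# Margin between the supporting hyperplanes w.x + b = +1 and w.x + b = -1.
margin = 2 / sqrt(sum(wi ** 2 for wi in w))

def svm_predict(x):
    """Class label from the sign of the decision function w.x + b."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score >= 0 else -1

print("margin:", margin)
print("prediction:", svm_predict((-0.5, -0.5, 0.8)))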
(Performance Evaluations) Given the confusion matrix below:
Calculate the accuracy
Calculate the F-measure
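The confusion matrix itself is not reproduced here, so the counts in the sketch below are placeholders; it only illustrates the formulas accuracy = (TP + TN) / (TP + TN + FP + FN) and F-measure = 2 * precision * recall / (precision + recall).

```python
# Placeholder counts -- replace with the values from the confusion matrix.
tp, fn = 40, 10    # actual positive: predicted positive / negative  (placeholders)
fp, tn = 5, 45     # actual negative: predicted positive / negative  (placeholders)

accuracy = (tp + tn) / (tp + tn + fp + fn)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f_measure = 2 * precision * recall / (precision + recall)

print("accuracy:", accuracy)
print("F-measure:", f_measure)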