Question
The following table summarizes a data set with three attributes A, B, C and two class labels +, -. Build a two-level decision tree.

A    B    C    Number of Instances
               -      +
T    T    T    5      0
F    T    T    0      10
T    F    T    10     0
F    F    T    0      5
T    T    F    0      10
F    T    F    25     0
T    F    F    10     0
F    F    F    0      25

a. According to the classification error rate, which attribute would be chosen as the first splitting attribute? For each attribute, show the contingency table and the gains in classification error rate.
b. Repeat for the two children of the root node.
c. How many instances are misclassified by the resulting decision tree?
d. Repeat parts (a), (b), and (c) using C as the splitting attribute.
e. Use the results in parts (c) and (d) to conclude about the greedy nature of the decision tree induction algorithm.
Step by Step Solution
The solution involves 3 steps.
Step: 1
Choose the first splitting attribute (part a). At the root there are 50 "-" and 50 "+" instances, so the error rate before splitting is 50/100 = 0.5. The contingency tables for the three candidate splits are: A = T gives (25 -, 10 +) and A = F gives (25 -, 40 +), leaving 10 + 25 = 35 errors and a gain of 0.50 - 0.35 = 0.15; B = T gives (30 -, 20 +) and B = F gives (20 -, 30 +), leaving 40 errors (gain 0.10); C = T gives (15 -, 15 +) and C = F gives (35 -, 35 +), leaving 50 errors (gain 0). A yields the largest gain and is chosen as the root split.
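Since part (a) is purely mechanical counting, the arithmetic can be cross-checked programmatically. Below is a minimal Python sketch (the name error_rate_gain is illustrative, not from the source) that builds each attribute's root-level contingency table and computes its gain in classification error rate; on the table as transcribed above it reports gains of 0.15 for A, 0.10 for B, and 0.00 for C.

```python
from collections import defaultdict

# (A, B, C, n_minus, n_plus) -- one tuple per row of the table above.
ROWS = [
    (True,  True,  True,   5,  0),
    (False, True,  True,   0, 10),
    (True,  False, True,  10,  0),
    (False, False, True,   0,  5),
    (True,  True,  False,  0, 10),
    (False, True,  False, 25,  0),
    (True,  False, False, 10,  0),
    (False, False, False,  0, 25),
]

def error_rate_gain(attr):
    """Contingency table and gain in classification error rate for a
    root-level split on attribute index `attr` (0=A, 1=B, 2=C)."""
    counts = defaultdict(lambda: [0, 0])   # attribute value -> [n_minus, n_plus]
    for row in ROWS:
        counts[row[attr]][0] += row[3]
        counts[row[attr]][1] += row[4]
    n_minus = sum(c[0] for c in counts.values())
    n_plus = sum(c[1] for c in counts.values())
    n = n_minus + n_plus
    parent_error = min(n_minus, n_plus) / n                   # error before splitting
    child_error = sum(min(c) for c in counts.values()) / n    # majority class per child
    return dict(counts), parent_error - child_error

for name, attr in (("A", 0), ("B", 1), ("C", 2)):
    table, gain = error_rate_gain(attr)
    print(f"{name}: {table}  gain = {gain:.2f}")
```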
Step: 2
Split the two children of the root (part b). For the A = T child (25 -, 10 +), splitting on B leaves 5 + 0 = 5 errors while splitting on C leaves 0 + 10 = 10, so B is chosen. For the A = F child (25 -, 40 +), splitting on B leaves 10 + 0 = 10 errors while splitting on C leaves 0 + 25 = 25, so B is chosen there as well.
Step: 3
Count the misclassifications (parts c-e). The greedy tree (A at the root, B at both children) misclassifies 5 + 10 = 15 instances. Forcing C at the root instead (part d), the C = T child (15 -, 15 +) is separated perfectly by a split on A, while the best split of the C = F child (35 -, 35 +) is B, which still leaves 10 + 10 = 20 errors, so that tree misclassifies 20 instances. For part (e), compare the two totals: greedy, one-level-lookahead induction is not guaranteed to find the best tree, and the comparison between 15 and 20 errors shows how it fares on this particular data set.
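As a cross-check on parts (b)-(d), the sketch below (the helper names split_errors and two_level_errors are again illustrative) grows a two-level tree by greedily minimizing classification error, optionally with a forced root attribute, and totals the misclassified instances; on this table it prints 15 errors for the greedy tree and 20 when C is forced at the root.

```python
ROWS = [  # (A, B, C, n_minus, n_plus), transcribed from the table above
    (True,  True,  True,   5,  0), (False, True,  True,   0, 10),
    (True,  False, True,  10,  0), (False, False, True,   0,  5),
    (True,  True,  False,  0, 10), (False, True,  False, 25,  0),
    (True,  False, False, 10,  0), (False, False, False,  0, 25),
]

def split_errors(rows, attr):
    """Misclassifications if `rows` are split on attribute index `attr`
    and each child predicts its majority class."""
    errors = 0
    for value in (True, False):
        child = [r for r in rows if r[attr] == value]
        n_minus = sum(r[3] for r in child)
        n_plus = sum(r[4] for r in child)
        errors += min(n_minus, n_plus)
    return errors

def two_level_errors(root=None):
    """Total misclassifications of a two-level tree; `root` forces the
    root attribute (0=A, 1=B, 2=C), otherwise it is chosen greedily."""
    attrs = (0, 1, 2)
    if root is None:
        root = min(attrs, key=lambda a: split_errors(ROWS, a))
    total = 0
    for value in (True, False):
        child = [r for r in ROWS if r[root] == value]
        rest = [a for a in attrs if a != root]
        total += min(split_errors(child, a) for a in rest)
    return root, total

print(two_level_errors())        # greedy root -> (0, 15)
print(two_level_errors(root=2))  # forced to split on C first -> (2, 20)
```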