Question: In the recursive construction of decision trees, it sometimes happens that a mixed set of positive and negative examples remains at a leaf node, even after all the attributes have been used. Suppose that we have p positive examples and n negative examples.
a. Show that the solution used by DECISION-TREE-LEARNING, which picks the majority classification, minimizes the absolute error over the set of examples at the leaf.
b. Show that the class probability p/(p + n) minimizes the sum of squared errors.
Step by Step Solution
This question brings a little bit of mathematics to bear on the analysis of the learning problem.
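Part (a). A sketch of the standard argument, writing \hat{y} for the single value the leaf predicts for every example that reaches it (\hat{y} is notation introduced here, not in the question). With p positive and n negative examples, the absolute error of predicting \hat{y} \in [0,1] is

E_{\mathrm{abs}}(\hat{y}) = p\,|1-\hat{y}| + n\,|0-\hat{y}| = p(1-\hat{y}) + n\hat{y} = p + (n-p)\,\hat{y}.

This is linear in \hat{y}, so it is minimized at an endpoint of the interval: at \hat{y}=1 (classify positive) when p > n, and at \hat{y}=0 (classify negative) when p < n, giving error \min(p,n) in either case. That is exactly the majority classification picked by DECISION-TREE-LEARNING; when p = n, every choice of \hat{y} gives the same error, so the tie can be broken arbitrarily.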
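Part (b). For the sum of squared errors, again with \hat{y} as the predicted value,

E_{\mathrm{sq}}(\hat{y}) = p(1-\hat{y})^2 + n\hat{y}^2.

Differentiating with respect to \hat{y} and setting the derivative to zero gives

\frac{dE_{\mathrm{sq}}}{d\hat{y}} = -2p(1-\hat{y}) + 2n\hat{y} = 0 \quad\Longrightarrow\quad \hat{y} = \frac{p}{p+n}.

Since \frac{d^2 E_{\mathrm{sq}}}{d\hat{y}^2} = 2(p+n) > 0, this stationary point is the global minimum, so the class probability p/(p+n) minimizes the sum of squared errors.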
