Question:

In the recursive construction of decision trees, it sometimes happens that a mixed set of positive and negative examples remains at a leaf node, even after all the attributes have been used. Suppose that we have p positive examples and n negative examples.

a. Show that the solution used by DECISION-TREE-LEARNING, which picks the majority classification, minimizes the absolute error over the set of examples at the leaf.
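One way to see the claim in part (a) is to treat the leaf's output as a constant prediction $\hat{y} \in \{0, 1\}$, with positive examples labeled $1$ and negative examples labeled $0$ (a labeling convention assumed here for concreteness). The absolute error over the leaf's examples is then:

```latex
E_1(\hat{y}) \;=\; \sum_{i=1}^{p+n} \lvert y_i - \hat{y} \rvert
            \;=\; p\,(1 - \hat{y}) \;+\; n\,\hat{y},
```

so $E_1(1) = n$ and $E_1(0) = p$. Predicting the majority class therefore incurs the smaller of the two error counts: output $1$ when $p \ge n$ and $0$ otherwise, which is exactly the majority rule.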

b. Show that the class probability p/(p + n) minimizes the sum of squared errors.
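For part (b), the standard calculus argument runs as follows: with the same $0/1$ labeling, allow the leaf to output any constant $\hat{y} \in [0, 1]$ and minimize the sum of squared errors,

```latex
E_2(\hat{y}) \;=\; p\,(1 - \hat{y})^2 \;+\; n\,\hat{y}^2, \qquad
\frac{dE_2}{d\hat{y}} \;=\; -2p\,(1 - \hat{y}) \;+\; 2n\,\hat{y} \;=\; 0
\;\Longrightarrow\; \hat{y} \;=\; \frac{p}{p+n}.
```

Since $\dfrac{d^2 E_2}{d\hat{y}^2} = 2(p+n) > 0$, this stationary point is the unique minimum, so the class probability $p/(p+n)$ minimizes the sum of squared errors.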
