Question: Given the parameterizations of Example 7.22 (page 316):

(a) When the features are a, b, and c, what decision tree will the decision-tree learning algorithm find to represent t (assuming it maximizes information gain and only stops when all examples at a leaf agree)?

(b) When the features are a, b, and c, what is a smallest decision tree that can represent t? How does this compare to using x, y, and z?

(c) How can x, y, and z be defined in terms of a, b, and c?
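Since the data of Example 7.22 is not reproduced here, the algorithm the question refers to can still be sketched. Below is a minimal ID3-style learner that greedily splits on the feature with maximum information gain and stops only when every example at a leaf has the same target value, exactly the stopping rule stated in part (a). The dataset `data` (with target `t = a AND b`) is a hypothetical stand-in, not the book's actual parameterization.

```python
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(examples, feature, target):
    """Expected reduction in entropy of `target` from splitting on `feature`."""
    base = entropy([e[target] for e in examples])
    n = len(examples)
    remainder = 0.0
    for value in {e[feature] for e in examples}:
        subset = [e[target] for e in examples if e[feature] == value]
        remainder += len(subset) / n * entropy(subset)
    return base - remainder

def learn_tree(examples, features, target):
    """ID3-style learner: split on the max-gain feature; stop only
    when all examples at a leaf agree on the target value."""
    labels = [e[target] for e in examples]
    if len(set(labels)) == 1:
        return labels[0]  # pure leaf: every example agrees
    best = max(features, key=lambda f: info_gain(examples, f, target))
    rest = [f for f in features if f != best]
    return {best: {v: learn_tree([e for e in examples if e[best] == v],
                                 rest, target)
                   for v in {e[best] for e in examples}}}

# Hypothetical data: all 8 assignments to a, b, c with t = a AND b.
data = [{'a': a, 'b': b, 'c': c, 't': a & b}
        for a in (0, 1) for b in (0, 1) for c in (0, 1)]

tree = learn_tree(data, ['a', 'b', 'c'], 't')
print(tree)  # → {'a': {0: 0, 1: {'b': {0: 0, 1: 1}}}}
```

On this toy data, `c` carries zero information gain and is never split on, which illustrates the question's contrast between the features actually needed to represent t and the features offered to the learner.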