Question:

Given the parameterizations of Example 7.22 (page 316):

(a) When the features are a, b, and c, what decision tree will the decision-tree learning algorithm find to represent t (assuming it maximizes information gain and only stops when all examples at a leaf agree)?

(b) When the features are a, b, and c, what is a smallest decision tree that can represent t? How does this compare to using x, y, and z?

(c) How can x, y, and z be defined in terms of a, b, and c?
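
For reference, below is a minimal Python sketch of the greedy learner that part (a) describes: split on the feature with maximum information gain and stop only when all examples at a leaf agree on the target. The dataset here is a hypothetical stand-in (t = a XOR b), since the actual examples from Example 7.22 are not reproduced on this page; the function names and tree representation are likewise assumptions, not the book's code.

```python
import math
from collections import Counter

def entropy(examples, target):
    """Shannon entropy of the target values over the given examples."""
    counts = Counter(e[target] for e in examples)
    total = len(examples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(examples, feature, target):
    """Entropy reduction from splitting the examples on `feature`."""
    total = len(examples)
    remainder = 0.0
    for value in {e[feature] for e in examples}:
        subset = [e for e in examples if e[feature] == value]
        remainder += len(subset) / total * entropy(subset, target)
    return entropy(examples, target) - remainder

def learn_tree(examples, features, target):
    # Stop only when all examples at this leaf agree on the target.
    values = {e[target] for e in examples}
    if len(values) == 1:
        return values.pop()
    if not features:
        # Fallback (shouldn't arise if the features can represent t):
        # return the majority target value.
        return Counter(e[target] for e in examples).most_common(1)[0][0]
    # Greedy step: choose the feature with maximum information gain.
    best = max(features, key=lambda f: information_gain(examples, f, target))
    children = {}
    for value in {e[best] for e in examples}:
        subset = [e for e in examples if e[best] == value]
        rest = [f for f in features if f != best]
        children[value] = learn_tree(subset, rest, target)
    # An internal node is a (feature, {value: subtree}) pair.
    return (best, children)

# Hypothetical data, not the book's table: t = a XOR b, c = a.
data = [
    {"a": 0, "b": 0, "c": 0, "t": 0},
    {"a": 0, "b": 1, "c": 0, "t": 1},
    {"a": 1, "b": 0, "c": 1, "t": 1},
    {"a": 1, "b": 1, "c": 1, "t": 0},
]
print(learn_tree(data, ["a", "b", "c"], "t"))
```

On this XOR-style data every single-feature split has zero information gain, so the greedy learner still grows a two-level tree; that kind of mismatch between what greedy search finds and the smallest representable tree is exactly what parts (a) and (b) ask you to compare.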
