Question

The following table summarizes a data set with three attributes A, B, C and two class labels +, −. Build a two-level decision tree.

A  B  C    + instances    − instances
T  T  T         5              0
F  T  T         0             10
T  F  T        10              0
F  F  T         0              5
T  T  F         0             10
F  T  F        25              0
T  F  F        10              0
F  F  F         0             25

(a) According to the classification error rate, which attribute would be chosen as the first splitting attribute? For each attribute, show the contingency table and the gains in classification error rate.
(b) Repeat part (a) for the two children of the root node.
(c) How many instances are misclassified by the resulting decision tree?
(d) Repeat parts (a), (b), and (c) using C as the splitting attribute.
(e) Use the results in parts (c) and (d) to draw a conclusion about the greedy nature of the decision tree induction algorithm.
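To make part (a) concrete, here is a minimal sketch of how the error-rate comparison at the root can be checked numerically. It assumes the +/− instance counts as read from the table above; the names used (data, node_error, split_error) are illustrative helpers, not from any particular library.

```python
# Minimal sketch: classification-error gain for each candidate root split.
# The +/- counts below are the instance counts read from the table above.

data = {
    # (A, B, C): (number of '+' instances, number of '-' instances)
    ('T', 'T', 'T'): (5, 0),
    ('F', 'T', 'T'): (0, 10),
    ('T', 'F', 'T'): (10, 0),
    ('F', 'F', 'T'): (0, 5),
    ('T', 'T', 'F'): (0, 10),
    ('F', 'T', 'F'): (25, 0),
    ('T', 'F', 'F'): (10, 0),
    ('F', 'F', 'F'): (0, 25),
}

ATTR_INDEX = {'A': 0, 'B': 1, 'C': 2}

def node_error(pos, neg):
    # Instances misclassified by a majority-vote leaf.
    return min(pos, neg)

def split_error(attr):
    # Total misclassified instances after a single split on `attr`,
    # printing that attribute's contingency table along the way.
    idx = ATTR_INDEX[attr]
    total = 0
    for value in ('T', 'F'):
        pos = sum(p for key, (p, _) in data.items() if key[idx] == value)
        neg = sum(n for key, (_, n) in data.items() if key[idx] == value)
        print(f"  {attr}={value}: +{pos:>3}  -{neg:>3}")
        total += node_error(pos, neg)
    return total

pos_all = sum(p for p, _ in data.values())
neg_all = sum(n for _, n in data.values())
base_error = node_error(pos_all, neg_all)
print(f"Error before splitting: {base_error}")

for attr in ATTR_INDEX:
    print(f"Contingency table for {attr}:")
    err = split_error(attr)
    print(f"  error after split = {err}, gain = {base_error - err}")
```

Running this prints the contingency tables and error-rate gains asked for in part (a); the same computation, restricted to the rows that reach each child of the chosen root attribute, answers part (b), and summing the leaf errors of the finished two-level tree answers part (c).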
