
Question


Consider the training examples shown in the following table for a binary classification problem.

CID | Gender | Type   | Size        | Class
1   | M      | Family | Small       | C0
2   | M      | Sports | Medium      | C0
3   | M      | Sports | Medium      | C0
4   | M      | Sports | Large       | C0
5   | M      | Sports | Extra Large | C0
6   | M      | Sports | Extra Large | C0
7   | F      | Sports | Small       | C0
8   | F      | Sports | Small       | C0
9   | F      | Sports | Medium      | C0
10  | F      | Luxury | Large       | C0
11  | M      | Family | Large       | C1
12  | M      | Family | Extra Large | C1
13  | M      | Family | Medium      | C1
14  | M      | Luxury | Extra Large | C1
15  | F      | Luxury | Small       | C1
16  | F      | Luxury | Small       | C1
17  | F      | Luxury | Medium      | C1
18  | F      | Luxury | Medium      | C1
19  | F      | Luxury | Medium      | C1
20  | F      | Luxury | Large       | C1

(d) What is the information gain if the split is based on Size?

(e) Which attribute provides the best split if information gain is the splitting criterion?

(f) Based on the given training data set, calculate the probabilities required for a Naive Bayes classifier. Using Laplace smoothing with k = 1, estimate the following probabilities (G = Gender, T = Type, S = Size; for Type, F/S/L denote Family/Sports/Luxury; for Size, S/M/L/E denote Small/Medium/Large/Extra Large):

P(C=C0) = ?        P(C=C1) = ?
P(G=M|C=C1) = ?    P(G=F|C=C0) = ?
P(G=M|C=C0) = ?    P(G=F|C=C1) = ?
P(T=F|C=C0) = ?    P(T=F|C=C1) = ?
P(T=S|C=C0) = ?    P(T=S|C=C1) = ?
P(T=L|C=C0) = ?    P(T=L|C=C1) = ?
P(S=S|C=C0) = ?    P(S=S|C=C1) = ?
P(S=M|C=C0) = ?    P(S=M|C=C1) = ?
P(S=L|C=C0) = ?    P(S=L|C=C1) = ?
P(S=E|C=C0) = ?    P(S=E|C=C1) = ?
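For parts (d) and (e), the information gain of a split on attribute A is the entropy of the class labels minus the size-weighted entropy of each partition: Gain(A) = H(Class) - sum over values v of (|D_v|/|D|) * H(Class | A = v). The following is a minimal Python sketch (not part of the original post; the data list simply transcribes the table above, and the function names are my own) that computes the gain for each of the three attributes:

```python
from collections import Counter
from math import log2

# Training records transcribed from the table above: (Gender, Type, Size, Class).
data = [
    ("M", "Family", "Small", "C0"),
    ("M", "Sports", "Medium", "C0"),
    ("M", "Sports", "Medium", "C0"),
    ("M", "Sports", "Large", "C0"),
    ("M", "Sports", "Extra Large", "C0"),
    ("M", "Sports", "Extra Large", "C0"),
    ("F", "Sports", "Small", "C0"),
    ("F", "Sports", "Small", "C0"),
    ("F", "Sports", "Medium", "C0"),
    ("F", "Luxury", "Large", "C0"),
    ("M", "Family", "Large", "C1"),
    ("M", "Family", "Extra Large", "C1"),
    ("M", "Family", "Medium", "C1"),
    ("M", "Luxury", "Extra Large", "C1"),
    ("F", "Luxury", "Small", "C1"),
    ("F", "Luxury", "Small", "C1"),
    ("F", "Luxury", "Medium", "C1"),
    ("F", "Luxury", "Medium", "C1"),
    ("F", "Luxury", "Medium", "C1"),
    ("F", "Luxury", "Large", "C1"),
]
ATTRS = {"Gender": 0, "Type": 1, "Size": 2}

def entropy(labels):
    """Shannon entropy (base 2) of a list of class labels."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def info_gain(attr):
    """Entropy of the whole set minus the weighted entropy of each partition."""
    idx = ATTRS[attr]
    parent = entropy([row[3] for row in data])
    weighted = 0.0
    for value in {row[idx] for row in data}:
        subset = [row[3] for row in data if row[idx] == value]
        weighted += len(subset) / len(data) * entropy(subset)
    return parent - weighted

for attr in ATTRS:  # the Size line answers (d); the largest printed gain answers (e)
    print(f"Gain({attr}) = {info_gain(attr):.4f}")
```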

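For part (f), a common convention for Laplace (add-k) smoothing of a conditional estimate is P(A=v | C=c) = (count(A=v, C=c) + k) / (count(C=c) + k * |values of A|). The short continuation below reuses the `data` and `ATTRS` defined in the sketch above and prints every estimate listed in the question; whether the class priors are also smoothed depends on the course's convention, so they are printed unsmoothed here, with the smoothed alternative noted in a comment:

```python
K = 1  # Laplace smoothing parameter from the question
classes = ["C0", "C1"]
class_counts = Counter(row[3] for row in data)
n = len(data)

# Class priors, unsmoothed; use (count + K) / (n + K * 2) instead if your
# course also smooths the priors.
for c in classes:
    print(f"P(C={c}) = {class_counts[c]}/{n} = {class_counts[c] / n:.3f}")

# Conditionals: (count(value, class) + K) / (count(class) + K * number_of_values).
for name, idx in ATTRS.items():
    values = sorted({row[idx] for row in data})
    v = len(values)
    for value in values:
        for c in classes:
            count = sum(1 for row in data if row[idx] == value and row[3] == c)
            p = (count + K) / (class_counts[c] + K * v)
            print(f"P({name}={value} | C={c}) = "
                  f"({count}+{K})/({class_counts[c]}+{K}*{v}) = {p:.3f}")
```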

