Questions
Question 22 (3 points)

Suppose we know that 80% of the time Hay Fever (HF) causes red eyes (R) in those with the disease. At any point in time, 10% of the general population has Hay Fever. But among the people who don't have Hay Fever, 15% still have red eyes. Note also that we have no prior information about the percentage of the population that might have red eyes. You have red eyes. What is the probability that you have Hay Fever?

- None of the above
- 0.372
- 0.012
- 0.56
- 0.08

Question 23 (3 points)

Now suppose that, in addition to red eyes, you also have congestion (C). We know that Hay Fever results in congestion 60% of the time. Furthermore, based on a recent study, if you have red eyes you are 30% likely to also have congestion (so P(C|R) = 0.3). Assuming that congestion and red eyes are independent given Hay Fever, what is the updated probability that you have Hay Fever (given that you have both red eyes and congestion)?

- 0.112
- None of the above
- 0.744
- 0.180
- 0.067

Question 35 (3 points)

Consider the following training data indicating whether bank customers received loans based on their credit history, income level, and debt:

Instance  Credit  Income  Debt  Loan?
1         good    high    high  yes
2         good    high    low   yes
3         good    medium  low   yes
4         good    medium  high  no
5         good    low     low   yes
6         bad     high    high  yes
7         bad     high    low   yes
8         bad     medium  high  no
9         bad     low     high  no
10        bad     low     low   no

We want to use the ID3 decision tree learning algorithm to determine what feature/attribute should be the root of the decision tree. What is the information gain for the attribute Credit (i.e., Gain(Credit))? Select the closest answer. [Note: when computing entropies, use log base 2.]

- 0.845
- 0.12
- 0.97
- None of the above

Question 35, continued

Using the same training data and the ID3 decision tree learning algorithm to determine what feature/attribute should be the root of the decision tree, what is the information gain for the attribute Income (i.e., Gain(Income))? Select the closest answer. (Note: when computing entropies, use log base 2.)

Question 38 (2 points)

In this problem, we'll use the same bank data for Naive Bayes classification (with discrete variables). As an example, the conditional probability table for Credit is given below:

Credit  Yes    No
Good    0.667  0.25
Bad     0.333  0.75

E.g., P(Credit = Good | No) is 0.25. Compute the conditional probability P(Income = High | Yes).

- 0.4
- 0.167
- 0.667

Question 40 (3 points)

Continuing with the Naive Bayes classification problem on the same training data, suppose a new instance X is to be classified: X = … The posterior probability P(X | No) is (approximately):

- 0.0375
- 0.37
- 0.63
- 0.72
Step by Step Solution
There are 3 steps involved.
Step 1: Bayes' rule (Questions 22 and 23)
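A minimal Python sketch of the Bayes-rule arithmetic behind Questions 22 and 23 (the variable names are mine; the Question 23 step relies on the conditional-independence assumption stated in the question):

```python
# Question 22: P(HF | R) by Bayes' rule.
p_hf = 0.10               # prior P(HF)
p_r_given_hf = 0.80       # P(R | HF)
p_r_given_not_hf = 0.15   # P(R | not HF)

# Total probability of red eyes.
p_r = p_r_given_hf * p_hf + p_r_given_not_hf * (1 - p_hf)   # 0.215
p_hf_given_r = p_r_given_hf * p_hf / p_r                    # ~0.372

# Question 23: fold in congestion. With C and R conditionally
# independent given HF, P(C | HF, R) = P(C | HF), so
#   P(HF | R, C) = P(C | HF) * P(HF | R) / P(C | R).
p_c_given_hf = 0.60       # P(C | HF)
p_c_given_r = 0.30        # P(C | R)
p_hf_given_rc = p_c_given_hf * p_hf_given_r / p_c_given_r   # ~0.744

print(round(p_hf_given_r, 3), round(p_hf_given_rc, 3))      # 0.372 0.744
```

Both results land on listed options: 0.372 for Question 22 and 0.744 for Question 23.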
Step 2: ID3 information gain (Question 35)
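A self-contained sketch of the entropy and information-gain computation for Question 35, with the training table transcribed as Python tuples (`entropy` and `gain` are illustrative helper names, not from the source):

```python
from collections import Counter
from math import log2

# Bank-loan training data from Question 35: (credit, income, debt, loan).
data = [
    ("good", "high",   "high", "yes"),
    ("good", "high",   "low",  "yes"),
    ("good", "medium", "low",  "yes"),
    ("good", "medium", "high", "no"),
    ("good", "low",    "low",  "yes"),
    ("bad",  "high",   "high", "yes"),
    ("bad",  "high",   "low",  "yes"),
    ("bad",  "medium", "high", "no"),
    ("bad",  "low",    "high", "no"),
    ("bad",  "low",    "low",  "no"),
]

def entropy(labels):
    """Shannon entropy (log base 2) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def gain(attr_index):
    """Information gain of the attribute at attr_index w.r.t. Loan?."""
    remainder = 0.0
    for value in {row[attr_index] for row in data}:
        subset = [row[-1] for row in data if row[attr_index] == value]
        remainder += len(subset) / len(data) * entropy(subset)
    return entropy([row[-1] for row in data]) - remainder

print(round(gain(0), 3))  # Gain(Credit) ~ 0.125
print(round(gain(1), 3))  # Gain(Income) ~ 0.420
```

Gain(Credit) ≈ 0.125, so the closest listed option is 0.12; Gain(Income) ≈ 0.42, the highest of the three attributes, so Income would be chosen as the root.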
Step 3: Naive Bayes conditional probabilities (Questions 38 and 40)
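A sketch of the Naive Bayes counting for Questions 38 and 40, reusing the same table. The `cond_prob` and `likelihood` helpers are my own, and since the attribute values of instance X in Question 40 did not survive in the source, the final call uses a hypothetical instance purely to show the mechanics:

```python
# Bank-loan training data: (credit, income, debt, loan).
data = [
    ("good", "high",   "high", "yes"),
    ("good", "high",   "low",  "yes"),
    ("good", "medium", "low",  "yes"),
    ("good", "medium", "high", "no"),
    ("good", "low",    "low",  "yes"),
    ("bad",  "high",   "high", "yes"),
    ("bad",  "high",   "low",  "yes"),
    ("bad",  "medium", "high", "no"),
    ("bad",  "low",    "high", "no"),
    ("bad",  "low",    "low",  "no"),
]

def cond_prob(attr_index, value, label):
    """Maximum-likelihood estimate of P(attribute = value | Loan? = label)."""
    rows = [row for row in data if row[-1] == label]
    return sum(1 for row in rows if row[attr_index] == value) / len(rows)

# Question 38: sanity-check against the given Credit table, then Income.
print(round(cond_prob(0, "good", "no"), 3))    # P(Credit=Good | No)  = 0.25
print(round(cond_prob(1, "high", "yes"), 3))   # P(Income=High | Yes) = 0.667

# Question 40: under the naive assumption, P(X | label) is the product of
# the per-attribute conditionals. The actual X is not recoverable from the
# source, so the instance below is hypothetical.
def likelihood(x, label):
    p = 1.0
    for i, value in enumerate(x):
        p *= cond_prob(i, value, label)
    return p

print(likelihood(("bad", "medium", "high"), "no"))  # 0.75 * 0.5 * 0.75 = 0.28125
```

The first print reproduces the given value P(Credit = Good | No) = 0.25, and the second gives P(Income = High | Yes) = 4/6 ≈ 0.667 for Question 38.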