
Question

Q1 Learning a Tree (2 points). Select one option.
Consider the following dataset.

[Table: feature values for x[1], x[2], x[3] with +1/-1 labels; the entries were garbled in transcription.]

If we use the decision tree algorithm to learn a decision tree from this dataset, what feature would be used as the split for the root node?
( ) x[1]
( ) x[2]
( ) x[3]

Q2 Decision Boundaries (2 points).
Which of the following pictures show decision boundaries that could be learned by a decision tree classifier using only the features Age and Income? The regions with the green background are predicted positive and the regions with the orange background are predicted negative.

[Figures: candidate decision-boundary plots over Age (roughly 0-40) and Income (roughly $0K-$80K); the plots did not survive transcription.]

Q3 Depth (1 point). True or false: when learning decision trees, a smaller depth tree usually translates to lower training error.
( ) True
( ) False

Q4 Test Error (1 point). True or false: if decision tree T1 has lower training error than decision tree T2, then T1 will always have better test error than T2.
( ) True
( ) False

Q5 Bagging (2 points). Select all that apply for a random forest classifier:
[ ] The trees are built on mutually exclusive subsets of the main data.
[ ] The trees are trained with the same features so that the classifiers are comparable.
[ ] The trees are built on samples with replacement of the original dataset.
[ ] The trees are built with different features so that the algorithm can explore the effect of different features.

Q6 Boosting (1 point). True or false: AdaBoost decreases the weights for the incorrectly predicted observations.
( ) True
( ) False

Q7 AdaBoost (1 point). Select one option.
Suppose we are running AdaBoost using decision tree stumps. At a particular iteration, the data points have weights according to the figure (large points indicate heavier weights). Which of the following decision tree stumps is most likely to be fit in the next iteration?
Hint: Notice the labels on the decision boundary. The predicted label for each side of the decision boundary appears under/to the right of the word "Predict".

[Figures: a scatter plot over X1 and X2 with point sizes proportional to weight, followed by candidate stump decision boundaries; the plots did not survive transcription.]

Q8 Which Model? (1 point). Choose the model that best fits the description provided: an ensemble model where each of the models in the ensemble can easily be trained in parallel (i.e., in any order).
( ) Decision Tree
( ) Random Forest
( ) AdaBoost

Q9 Stopping (1 point). True or false: it is possible for AdaBoost to overfit if there are too many trees in the ensemble.
( ) True
( ) False
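Because the Q1 dataset table did not survive transcription, here is a minimal sketch of the technique the question tests, on hypothetical placeholder data: for each candidate feature, split the data on that feature's values, predict the majority label in each branch, and choose the feature whose split has the lowest classification error. The values of `X` and `y` below are assumptions, not the quiz's actual data.

```python
from collections import Counter

# Placeholder dataset: the original table was garbled in transcription,
# so these feature values and labels are hypothetical.
X = [(-1, -1, +1), (0, -1, 0), (-1, 0, -1), (0, 0, -1), (+1, 0, 0)]
y = [+1, -1, -1, +1, -1]

def split_error(feature_idx):
    """Classification error of a one-level split on feature `feature_idx`,
    predicting the majority label within each branch."""
    branches = {}
    for xi, yi in zip(X, y):
        branches.setdefault(xi[feature_idx], []).append(yi)
    # Each branch contributes its non-majority labels as mistakes.
    mistakes = sum(len(labels) - Counter(labels).most_common(1)[0][1]
                   for labels in branches.values())
    return mistakes / len(y)

errors = {f"x[{i + 1}]": split_error(i) for i in range(3)}
print(errors)                                      # error of each candidate split
print("root split:", min(errors, key=errors.get))  # lowest-error feature wins
```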

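For Q6 and Q7, the relevant mechanics are AdaBoost's weight update: a weak learner with weighted error e gets coefficient w_t = 0.5 * ln((1 - e) / e), and the data weights of misclassified points are multiplied by exp(w_t) (increased) while correctly classified points are multiplied by exp(-w_t) (decreased), after which all weights are renormalized. The sketch below implements those standard formulas; it is an illustration, not the blurred expert solution.

```python
import math

def adaboost_update(weights, correct):
    """One AdaBoost round: compute the weak learner's weighted error,
    its coefficient w_t, and the renormalized data weights.
    `correct[i]` is True iff point i was classified correctly."""
    weighted_error = sum(a for a, c in zip(weights, correct) if not c) / sum(weights)
    # Coefficient of the weak learner: large when its error is small.
    w_t = 0.5 * math.log((1 - weighted_error) / weighted_error)
    # Misclassified points are up-weighted, correct ones down-weighted.
    new = [a * math.exp(w_t if not c else -w_t) for a, c in zip(weights, correct)]
    z = sum(new)
    return w_t, [a / z for a in new]

# Example: four equally weighted points, one misclassified.
w_t, new_weights = adaboost_update([0.25] * 4, [True, True, True, False])
print(w_t, new_weights)  # the misclassified point's weight grows to 0.5
```

Note the contrast with bagging (Q5, Q8): a random forest trains each tree on an independent bootstrap sample (drawn with replacement) and a random feature subset, so its trees can be built in parallel, whereas AdaBoost's rounds are sequential because each round's weights depend on the previous learner's mistakes.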
