node, the split criteria, etc.). Explain which parameters you experiment with and why. Report on if and how different parameters affect performance. Describe the pruning method used here. How do you examine the effect of different values of cp, and how do you select the best pruned tree? Which decision tree parameter values do you find to be useful for developing a good model?

(b) Consider another type of decision tree (C5.0); experiment with its parameters till you get a 'good' model. Summarize the parameters and performance you obtain. Also develop a set of rules from the decision tree, and compare performance. Does performance differ across different types of decision tree learners? Compare models using accuracy, sensitivity, precision, recall, etc. (as you find reasonable; your answer to Question (a) above should clarify which performance measures you use and why). Also compare performance on lift, ROC curves, and AUC. How do the models obtained from these decision tree learners differ?

(c) Decision tree models are referred to as "unstable" in the sense that small differences in training data can give very different models. Examine the models and performance for different samples of the training-test data (by changing the random seed). Do you find your models to be unstable? Explain.

(d) Which variables are important for separating 'Good' from 'Bad' credit? Determine variable importance from the different 'best' trees. Are there similarities, differences?

(e) Consider partitions of the data into 70% for training and 30% for test, and 80% for training and 20% for test, and report on model and performance comparisons (for the decision tree learners considered above). In the earlier question, you had determined a set of decision tree parameters to work well. Do the same parameters give 'best' models across the 50-50, 70-30, and 80-20 training-test splits? Are there similarities among the different models (in, say, the upper part of the tree) and what does this indicate?
Is there any specific model you would prefer for implementation?
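The workflow these questions describe (grow a tree, prune it by a complexity parameter, compare metrics across seeds and split ratios, inspect variable importance) can be sketched with scikit-learn. This is only an illustrative sketch: the German credit data is not loaded here, so a synthetic two-class dataset stands in for it, and scikit-learn's `ccp_alpha` plays the role of rpart's `cp` parameter; all names below are assumptions, not part of the assignment.

```python
# Illustrative sketch only: synthetic data stands in for the German credit
# data, and ccp_alpha (cost-complexity pruning) is the scikit-learn analogue
# of trying different cp values and selecting the best pruned tree.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, roc_auc_score)

# Stand-in for the credit data: label 1 = 'Good', 0 = 'Bad' credit.
X, y = make_classification(n_samples=1000, n_features=10,
                           n_informative=5, random_state=0)

def fit_and_score(train_size, seed):
    """Fit pruned trees for one train/test split and return test metrics."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, train_size=train_size, random_state=seed, stratify=y)
    # Enumerate the cost-complexity pruning path of a fully grown tree
    # (the analogue of examining different cp values).
    path = DecisionTreeClassifier(random_state=seed) \
        .cost_complexity_pruning_path(X_tr, y_tr)
    best = None
    for alpha in path.ccp_alphas:
        clf = DecisionTreeClassifier(ccp_alpha=alpha,
                                     random_state=seed).fit(X_tr, y_tr)
        auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
        if best is None or auc > best[0]:
            best = (auc, clf)           # keep the best pruned tree by AUC
    auc, clf = best
    y_hat = clf.predict(X_te)
    return {
        "auc": auc,
        "accuracy": accuracy_score(y_te, y_hat),
        "sensitivity": recall_score(y_te, y_hat),  # recall on 'Good' class
        "precision": precision_score(y_te, y_hat, zero_division=0),
        # Feature importances address the variable-importance question.
        "importances": clf.feature_importances_,
    }

# Compare 50-50, 70-30, and 80-20 splits; vary the seed to probe instability.
for train_size in (0.5, 0.7, 0.8):
    for seed in (1, 2):
        m = fit_and_score(train_size, seed)
        print(train_size, seed,
              round(m["auc"], 3), round(m["accuracy"], 3))
```

Running the loop with several seeds per split ratio shows directly how much the chosen tree and its metrics move with the training sample, which is the instability question (c); comparing `importances` across the best trees addresses question (d).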