Question
Linear Model Selection and Regularization
This programming assignment will use the Tidy Models platform. It will take a look at regularization models and hyperparameter tuning. These models contain a regularization term. This assignment will use parsnip for model fitting and recipes and workflows to perform the transformations, and tune and dials to tune the hyperparameters of the model.
You will be using the Hitters data set from the ISLR package. You wish to predict a baseball player's Salary based on several different characteristics which are included in the data set.
Since you wish to predict Salary, you need to remove any missing data from that column; otherwise, you won't be able to run the models. Store the output as Hitters.
library(tidymodels)
library(ISLR)
# Your code here
# Hitters
# your code here
# Hidden Tests
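As a hedged sketch (not the graded answer), dropping the rows with a missing Salary can be done with dplyr's filter(), which is attached by tidymodels:

```r
library(tidymodels)
library(ISLR)

# One possible approach: keep only rows where Salary is observed
Hitters <- ISLR::Hitters %>%
  filter(!is.na(Salary))
```

tidyr's drop_na(Salary) would accomplish the same thing.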
You will use the glmnet package to perform ridge regression. parsnip does not have a dedicated function to create a ridge regression model specification, so you need to use linear_reg() and set mixture to specify a ridge model. The mixture argument specifies the amount of each type of regularization: mixture = 0 specifies only ridge regularization and mixture = 1 specifies only lasso regularization.
Setting mixture to a value between 0 and 1 lets us use both. When using the glmnet engine you also need to set a penalty to be able to fit the model. You will set this value to 0 for now; it is not the best value, but you will look at how to select the best value in a little bit.
ridge_spec <- linear_reg(mixture = 0, penalty = 0) %>%
  set_mode("regression") %>%
  set_engine("glmnet")
Once the specification is created you can fit it to your data. You will use all the predictors. Use the fit() function here.
ridge_fit <- fit(ridge_spec, Salary ~ ., data = Hitters)
The glmnet package will fit the model for all values of penalty at once, so you can now see what the parameter estimates for the model are at the penalty you set. You can use the tidy() function to accomplish this specific task.
tidy(ridge_fit)
Let us instead see what the estimates would be at a larger penalty. Store your output to tidy. What do you notice?
# Your code here
# tidy
# your code here
# Hidden Tests
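The pattern looks like the following sketch; the penalty value 100 here is purely illustrative, not the value the assignment asks for:

```r
# Inspect coefficient estimates at a specific (illustrative) penalty value;
# replace 100 with the penalty given in the assignment
tidy(ridge_fit, penalty = 100)
```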
Now look at the parameter estimates for an even larger penalty. Store your output to tidy. Once again, use the tidy() function to accomplish this task.
# Your code here
# tidy
# your code here
# Hidden Tests
You can visualize how the magnitude of the coefficients is regularized towards zero as the penalty goes up. Use the autoplot() function to accomplish this task; the output variable here is ridge_fit. Your image should look like this:
ridge_fit %>% autoplot()
Prediction is done as normal; if you use predict() by itself, then the penalty you set in the model specification is used.
predict(ridge_fit, new_data = Hitters)
But you can also get predictions for other values of penalty by specifying it in predict(). Test this with another penalty value. Store your output to predict.
# Your code here
# predict
# your code here
# Hidden Tests
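A hedged sketch of the call; again, the penalty value 100 is only an illustrative placeholder for whatever value the assignment specifies:

```r
# Predictions at an explicitly supplied (illustrative) penalty value
predict(ridge_fit, new_data = Hitters, penalty = 100)
```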
You saw how to fit a ridge model and make predictions for different values of penalty. But it would be great if you could find the "best" value of the penalty. This is something you can use hyperparameter tuning for. Hyperparameter tuning is, in its simplest form, a way of fitting many models with different sets of hyperparameters in an attempt to find one that performs "best".
The complexity in hyperparameter tuning can come from how you try different models. You will keep it simple for this lab and only look at grid search over evenly spaced parameter values. This is a fine approach if you have one or two tunable parameters, but it can become computationally infeasible as the number grows.
See the chapter on iterative search from Tidy Modeling with R for more information.
You begin as normal by setting up a validation split (testing and training sets). A K-fold cross-validation data set is created on the training data set.
Hitters_split <- initial_split(Hitters, strata = "Salary")
Hitters_train <- training(Hitters_split)
Hitters_test <- testing(Hitters_split)
Hitters_fold <- vfold_cv(Hitters_train, v = 10)
You can use the tune_grid() function to perform hyperparameter tuning using a grid search. tune_grid() needs a few different things:
a workflow object containing the model and preprocessor,
an rset object containing the resamples the workflow should be fitted within, and
a tibble containing the parameter values to be evaluated.
Optionally a metric set of performance metrics can be supplied for evaluation. If you don't set one then a default set of performance metrics is used.
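Putting those pieces together, the call can be sketched as follows. This is an assumption-laden sketch, not the graded answer: it assumes a tunable workflow named ridge_workflow (built in the next steps, with penalty = tune() in the model spec) and uses dials' grid_regular() and penalty() to build evenly spaced candidate values on a log10 scale; the range and number of levels are illustrative.

```r
# Evenly spaced candidate penalties on a log10 scale (illustrative range)
penalty_grid <- grid_regular(penalty(range = c(-5, 5)), levels = 50)

# Fit the workflow across the resamples for every candidate penalty
tune_res <- tune_grid(
  ridge_workflow,          # assumed: workflow with a tunable penalty
  resamples = Hitters_fold,
  grid = penalty_grid
)
```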
You already have a resample object created in Hitters_fold. Now you should create the workflow specification next.
You just used the data set as is when you fit the model earlier. However, ridge regression is scale sensitive, so you need to make sure the predictors are on the same scale before fitting.
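One way this preprocessing is commonly expressed in recipes is sketched below; the step choices and object names (ridge_recipe, ridge_workflow) are assumptions for illustration, not the assignment's exact answer.

```r
# Sketch: dummy-encode factors, drop zero-variance columns, and normalize
# all predictors so they are on the same scale
ridge_recipe <- recipe(Salary ~ ., data = Hitters_train) %>%
  step_dummy(all_nominal_predictors()) %>%
  step_zv(all_predictors()) %>%
  step_normalize(all_predictors())

# Bundle the preprocessor and model into a workflow for tuning/fitting
ridge_workflow <- workflow() %>%
  add_recipe(ridge_recipe) %>%
  add_model(ridge_spec)
```

For tuning, the model specification inside the workflow would use penalty = tune() instead of a fixed value.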