
Linear Model Selection and Regularization
This programming assignment will use the tidymodels platform. It takes a look at regularization models and hyperparameter tuning. These models contain a regularization term. The assignment will use parsnip for model fitting, recipes and workflows to perform the transformations, and tune and dials to tune the hyperparameters of the model.
You will be using the Hitters data set from the ISLR2 package. You wish to predict a baseball player's Salary based on several different characteristics which are included in the data set.
Since you wish to predict Salary, you need to remove any missing data from that column first. Otherwise, you won't be able to run the models.
Set the output as Hitters:
library(tidymodels)
library(ISLR2)

# Load the data and drop rows with a missing Salary
Hitters <- ISLR2::Hitters
Hitters <- Hitters %>% drop_na(Salary)
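As a quick sanity check (not part of the graded answer), you can confirm the drop worked. In ISLR2, Hitters starts with 322 rows, 59 of which have a missing Salary:

```r
# Verify the cleaned data: 322 - 59 = 263 rows should remain
dim(Hitters)                # 263 rows, 20 columns
sum(is.na(Hitters$Salary))  # 0 - no missing salaries left
```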
── Attaching packages ──────────────── tidymodels 1.0.0 ──
broom     1.0.4   recipes      1.0.5
dials     1.1.0   rsample      1.1.1
dplyr     1.1.0   tibble       3.2.0
ggplot2   3.4.1   tidyr        1.3.0
infer     1.0.4   tune         1.0.1
modeldata 1.1.0   workflows    1.1.3
parsnip   1.0.4   workflowsets 1.0.0
purrr     1.0.1   yardstick    1.1.0
── Conflicts ──────────────── tidymodels_conflicts() ──
purrr::discard() masks scales::discard()
dplyr::filter()  masks stats::filter()
dplyr::lag()     masks stats::lag()
recipes::step()  masks stats::step()
Use suppressPackageStartupMessages() to eliminate package startup messages
# Hidden Tests
You will use the glmnet package to perform ridge regression. parsnip does not have a dedicated function to create a ridge regression model specification, so you use linear_reg() and set mixture = 0 to specify a ridge model. The mixture argument specifies the blend of regularization types: mixture = 0 specifies only ridge regularization and mixture = 1 specifies only lasso regularization.
Setting mixture to a value between 0 and 1 lets us use both. When using the glmnet engine you also need to set a penalty to be able to fit the model. You will set this value to 0 for now; it is not the best value, but you will look at how to select the best value in a little bit.
ridge_spec <- linear_reg(mixture = 0, penalty = 0) %>%
  set_mode("regression") %>%
  set_engine("glmnet")
Once the specification is created, you can fit it to your data using all the predictors. Use the fit() function here.
ridge_fit <- fit(ridge_spec, Salary ~ ., data = Hitters)
The glmnet package will fit the model for all values of penalty at once, so you can now see what the parameter estimates for the model are at penalty = 0. You can use the tidy() function to accomplish this specific task.
tidy(ridge_fit)
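Because glmnet fits the entire regularization path at once, tidy() can also report coefficient estimates at penalty values other than the one in the specification. Looking ahead to selecting the penalty with tune and dials, the sketch below is a minimal illustration, not the assignment's required answer: the grid range, the 10-fold cross-validation, and names like penalty_grid and tune_res are my own assumptions.

```r
# The fitted path lets tidy() report estimates at any penalty value
tidy(ridge_fit, penalty = 100)    # moderate shrinkage
tidy(ridge_fit, penalty = 11498)  # heavy shrinkage: coefficients near zero

# Sketch of penalty selection with tune and dials (assumed settings)
ridge_tune_spec <- linear_reg(mixture = 0, penalty = tune()) %>%
  set_mode("regression") %>%
  set_engine("glmnet")

Hitters_folds <- vfold_cv(Hitters, v = 10)                    # assumed 10-fold CV
penalty_grid  <- grid_regular(penalty(range = c(-5, 5)),      # log10 scale
                              levels = 50)

tune_res <- tune_grid(
  workflow() %>% add_model(ridge_tune_spec) %>% add_formula(Salary ~ .),
  resamples = Hitters_folds,
  grid      = penalty_grid
)
select_best(tune_res, metric = "rmse")  # best penalty by cross-validated RMSE
```

finalize_workflow() can then plug the selected penalty back into the workflow before the final fit.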
