Question
class Optimize:
    def __init__(self, learning_rate=1e-4, reg=1e-3):
        self.learning_rate = learning_rate
        self.reg = reg

    def update(self, model):
        pass

    def apply_regularization(self, model):
        '''
        Apply the L2 penalty to the model. Update the gradient dictionary in the model.
        :param model: The model with gradients
        :return: None, but the gradient dictionary of the model should be updated
        '''
        #############################################################################
        # TODO:                                                                     #
        # 1) Apply the L2 penalty to the model weights based on the regularization  #
        #    coefficient.                                                           #
        #                                                                           #
        # Weights are stored in a dictionary, as you can see in the example below:  #
        #   self.weights['W1'] = 0.001 * np.random.randn(inputsize, n_classes)      #
        #   self.gradients['W1'] = np.zeros((inputsize, n_classes))                 #
        #############################################################################
I need to implement this function in Python as the instructions describe, but I'm having a hard time.
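
For reference, below is a minimal sketch of one way the L2 term is typically folded into the gradients, assuming the model exposes weights and gradients dictionaries keyed by parameter name, as the 'W1' entries in the snippet suggest. The toy Model class is only there so the example runs end to end; it is not part of the assignment. The sketch uses the convention that the penalty in the loss is 0.5 * reg * ||W||^2, whose derivative with respect to W is reg * W; if the course defines the penalty as reg * ||W||^2, the added term would be 2 * reg * W instead, and the assignment may also want bias terms excluded.

import numpy as np

class Model:
    # Toy stand-in for the real model, only so the sketch is runnable.
    def __init__(self, input_size=4, n_classes=3):
        self.weights = {'W1': 0.001 * np.random.randn(input_size, n_classes)}
        self.gradients = {'W1': np.zeros((input_size, n_classes))}

class Optimize:
    def __init__(self, learning_rate=1e-4, reg=1e-3):
        self.learning_rate = learning_rate
        self.reg = reg

    def apply_regularization(self, model):
        # For each parameter, add the derivative of the L2 penalty
        # (d/dW of 0.5 * reg * ||W||^2, i.e. reg * W) to its stored gradient.
        for name, W in model.weights.items():
            model.gradients[name] = model.gradients[name] + self.reg * W

model = Model()
opt = Optimize(reg=1e-3)
opt.apply_regularization(model)
print(model.gradients['W1'])  # gradients now include the reg * W term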