Question
Manually run the linear regression process by computing the cost function and the gradient-descent update for the weights for one iteration.
Data set: [(1,1), (4,2)]
Initial weights: w_0 = 5, w_1 = -5
[Further explanation:
This means that the first data point has x=1, y=1, and the second data point has x=4, y=2. So you have two points in the plane.
Linear regression is designed to find the line of best fit (the line as close to all the points as possible at the same time).
If you compare with the notebook from the linear regression lecture, you have n=1 (just one feature x) and m=2 (just two data points).
The line that you'll find is y = w_0 + w_1 x
]
Q1. Compute the cost at the current step. (Acceptable error: 0.1)
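The cost at the initial weights can be checked with a short sketch. This assumes the squared-error cost with the common 1/(2m) convention, J = (1/(2m)) * sum((h(x) - y)^2); if your lecture notebook defines the cost with 1/m instead, double the result.

```python
# Hypothesis h(x) = w0 + w1*x with the given initial weights.
data = [(1, 1), (4, 2)]
w0, w1 = 5.0, -5.0
m = len(data)

# Squared-error cost J = 1/(2m) * sum((h(x) - y)^2)
# (1/(2m) convention assumed; use 1/m if your course defines it that way).
J = sum((w0 + w1 * x - y) ** 2 for x, y in data) / (2 * m)
print(J)  # 72.5
```

The two residuals are h(1) - 1 = -1 and h(4) - 2 = -17, so J = (1 + 289) / 4 = 72.5.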
Q2. Apply gradient descent once to compute the updated value of w_0.
Data set: [(1,1), (4,2)]
Initial weights: w_0 = 5, w_1 = -5
Learning coefficient alpha = 0.1
Recall that
w_i = w_i - alpha*(dcost/dw_i)
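The single update for w_0 can be sketched as follows. With the 1/(2m) cost convention assumed above, the partial derivative is dJ/dw_0 = (1/m) * sum(h(x) - y).

```python
data = [(1, 1), (4, 2)]
w0, w1 = 5.0, -5.0
alpha = 0.1
m = len(data)

# Partial derivative of the 1/(2m) squared-error cost with respect to w0:
# dJ/dw0 = (1/m) * sum(h(x) - y) = (-1 + -17) / 2 = -9.0
grad_w0 = sum((w0 + w1 * x - y) for x, y in data) / m
w0_new = w0 - alpha * grad_w0
print(w0_new)  # 5.9
```

So w_0 moves from 5 to 5 - 0.1 * (-9) = 5.9.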
Q3. Apply the gradient descent method once to compute the updated w_1.
Data set: [(1,1), (4,2)]
Initial weights: w_0 = 5, w_1 = -5
Learning coefficient alpha = 0.1
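The update for w_1 follows the same rule, except the gradient carries an extra factor of x: dJ/dw_1 = (1/m) * sum((h(x) - y) * x), again under the 1/(2m) cost convention assumed above.

```python
data = [(1, 1), (4, 2)]
w0, w1 = 5.0, -5.0
alpha = 0.1
m = len(data)

# dJ/dw1 = (1/m) * sum((h(x) - y) * x) = (-1*1 + -17*4) / 2 = -34.5
grad_w1 = sum((w0 + w1 * x - y) * x for x, y in data) / m
w1_new = w1 - alpha * grad_w1
print(w1_new)  # -1.55
```

So w_1 moves from -5 to -5 - 0.1 * (-34.5) = -1.55.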
Q4. What will be the stable value of w_0 (to two decimal places) if you let the algorithm run for many, many iterations (say 10000 or more)?
Data set: [(1,1), (4,2)]
Initial weights: w_0 = 5, w_1 = -5
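With only two data points, the line of best fit passes exactly through both of them, so gradient descent should converge to the line through (1,1) and (4,2): slope w_1 = 1/3 and intercept w_0 = 2/3 ≈ 0.67. A sketch that runs the loop to check (alpha = 0.1 happens to be small enough for convergence on this data):

```python
data = [(1, 1), (4, 2)]
w0, w1 = 5.0, -5.0
alpha = 0.1
m = len(data)

# Run simultaneous gradient-descent updates for many iterations.
for _ in range(10000):
    g0 = sum((w0 + w1 * x - y) for x, y in data) / m
    g1 = sum((w0 + w1 * x - y) * x for x, y in data) / m
    w0, w1 = w0 - alpha * g0, w1 - alpha * g1

print(round(w0, 2), round(w1, 2))  # 0.67 0.33
```

Note that both weights must be updated simultaneously (from the same old values) in each iteration, which is why the code computes both gradients before applying either update.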