Question
Q-2.
Consider a data set in which the two points {(−1, −1), (1, 1)} belong to one class, and the other two points {(1, −1), (−1, 1)} belong to the other class. Start with perceptron parameter values at (0, 0), and work out a few stochastic gradient-descent updates with α = 1. While performing the stochastic gradient-descent updates, cycle through the training points in any order.
(a) Does the algorithm converge in the sense that the change in objective function becomes extremely small over time?
(b) Explain why the situation in (a) occurs.
Use this book: Neural Networks and Deep Learning by Charu C. Aggarwal; go to Chapter 1 and look for Exercise 4.
If you have any problem, please feel free to comment. I need help! Thank you in advance.
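As a rough illustration (not the book's worked solution), here is a minimal Python sketch of the updates the exercise asks you to work out by hand. It assumes the bias-free perceptron from Chapter 1 with prediction sign(w·x) and update w ← w + α·y·x on a misclassified point, treats sign(0) as a misclassification, and uses the number of misclassified points after each pass as a stand-in for the objective; the fixed ordering of the points is my choice.

```python
import numpy as np

# The four training points from the exercise: diagonal pair is class +1,
# anti-diagonal pair is class -1 (an XOR-like layout).
X = np.array([[-1.0, -1.0], [1.0, 1.0], [1.0, -1.0], [-1.0, 1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

w = np.zeros(2)   # start at (0, 0) as the exercise specifies
alpha = 1.0       # learning rate alpha = 1

for epoch in range(4):                       # a few passes over the data
    for xi, yi in zip(X, y):                 # cycle through the points in a fixed order
        if np.sign(np.dot(w, xi)) != yi:     # misclassified (sign(0) counts as wrong here)
            w += alpha * yi * xi             # perceptron update: w <- w + alpha * y * x
    # crude objective: how many points are still misclassified after this pass
    errors = sum(np.sign(np.dot(w, xi)) != yi for xi, yi in zip(X, y))
    print(f"after pass {epoch + 1}: w = {w}, misclassified = {errors}")
```

With the ordering used here, the printed output shows the weights being pushed back and forth (they return to (0, 0) after every pass) while the four points remain misclassified, which is exactly the behavior parts (a) and (b) ask you to examine and explain.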