Question
Consider a sequence of 2-dimensional data points x^(1), x^(2), ..., x^(n) and their corresponding labels y^(1), y^(2), ..., y^(n). Recall that the perceptron algorithm updates the parameters whenever y^(i) h(x^(i); θ, b) ≤ 0, where h(x^(i); θ, b) = sign(θ · x^(i) + b). Assume that the points are linearly separable, and that both θ and b are initialized to zero. Let α_i denote the number of times x^(i) is misclassified during training.

(a) (1pt) Derive the final decision boundary for the perceptron in terms of α_i, x^(i), and y^(i).

(b) (1pt) Show that the shortest signed distance from the boundary to the origin is equal to b/‖θ‖.

(c) (2pts) The following table shows a dataset and the number of times each point is misclassified when one applies the perceptron algorithm (with offset). Assuming θ and b are initialized to zero, what are the values of θ and b after training?

(d) (1pt) Given a set of linearly separable points, does the order in which the points are presented to the algorithm affect whether or not the algorithm will converge? In general, could the order affect the total number of mistakes made?

Answer
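For reference, a standard derivation for parts (a) and (b) (a sketch, not necessarily the graded solution): since θ and b start at zero and each misclassification of x^(i) adds y^(i) x^(i) to θ and y^(i) to b, the trained parameters are

    θ = Σ_{i=1}^{n} α_i y^(i) x^(i),        b = Σ_{i=1}^{n} α_i y^(i),

so the final decision boundary is

    Σ_{i=1}^{n} α_i y^(i) (x^(i) · x) + Σ_{i=1}^{n} α_i y^(i) = 0.

Plugging the origin x = 0 into the signed-distance formula (θ · x + b)/‖θ‖ gives b/‖θ‖, which answers (b). For (d), the perceptron convergence theorem guarantees convergence on linearly separable data regardless of presentation order, though the order can change the total number of mistakes made before convergence.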
Step by Step Solution
There are 3 steps involved.
Step: 1
Step: 2
Step: 3
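The update rule described in the question can be sketched in Python. The dataset below is a made-up toy example (the table from part (c) is not reproduced in this page), and the final assertions check the α-representation from part (a): θ and b end up equal to the α-weighted sums of the per-point updates.

```python
import numpy as np

def perceptron_with_offset(X, y, max_epochs=100):
    """Perceptron with offset; also counts how often each point is misclassified.

    Illustrative sketch only -- the function name and toy data are assumptions,
    not taken from the original problem.
    """
    n, d = X.shape
    theta = np.zeros(d)
    b = 0.0
    alpha = np.zeros(n, dtype=int)  # alpha[i] = # times x^(i) was misclassified
    for _ in range(max_epochs):
        mistakes = 0
        for i in range(n):
            # update whenever y^(i) (theta . x^(i) + b) <= 0
            if y[i] * (theta @ X[i] + b) <= 0:
                theta += y[i] * X[i]
                b += y[i]
                alpha[i] += 1
                mistakes += 1
        if mistakes == 0:  # a full pass with no mistakes: converged
            break
    return theta, b, alpha

# Hypothetical linearly separable data, for illustration only
X = np.array([[1.0, 1.0], [2.0, 2.0], [-1.0, -1.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
theta, b, alpha = perceptron_with_offset(X, y)

# Part (a): theta = sum_i alpha_i y^(i) x^(i) and b = sum_i alpha_i y^(i)
assert np.allclose(theta, (alpha * y) @ X)
assert np.isclose(b, np.sum(alpha * y))
```

Note that the mistake counts α_i, and hence θ and b, can differ if the rows of X are presented in a different order, which is the point of part (d).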