Question
2. [20 points] Multi-layer perceptrons: Assume a two-layer perceptron (with one hidden layer). The equations for the input-to-hidden layer are a_j = Σ_i w_ji x_i and z_j = g^(1)(a_j), and the equations for the hidden-to-output layer are a_k = Σ_j w_kj z_j and z_k = g^(2)(a_k).

a) [10 points] Show that the multi-layer perceptron is equivalent to a single-layer perceptron if the hidden-layer unit z_j is a linear function of its input a_j. Write down the equation for the equivalent single-layer perceptron.

b) [3 points] What happens if z_j = Σ_j' u_jj' a_j', where {u_jj'} is a further set of weights? Do you still get an equivalent single-layer perceptron?

c) [7 points] Argue, at a conceptual level, the minimum requirement for keeping the higher layers of a multi-layer perceptron from "crashing down" and giving you an effective single-layer perceptron. (Once again, a purely mathematical answer with no conceptual elaboration will not be given many points.)
Step by Step Solution
There are 3 steps involved.
Step 1 (part a): Suppose the hidden unit is linear, z_j = c·a_j for some constant c. Then

z_j = c Σ_i w_ji x_i,

so the input to the output layer becomes

a_k = Σ_j w_kj z_j = Σ_j w_kj c Σ_i w_ji x_i = Σ_i (c Σ_j w_kj w_ji) x_i = Σ_i w̃_ki x_i,

where w̃_ki = c Σ_j w_kj w_ji. Hence the network output is z_k = g^(2)(Σ_i w̃_ki x_i), which is exactly a single-layer perceptron with weights w̃_ki and activation g^(2). The composition of two linear maps is a single linear map, so the hidden layer contributes nothing beyond a reparameterization of the weights.
Step 2 (part b): With z_j = Σ_j' u_jj' a_j', the hidden layer is still a linear function of its inputs, only now a full matrix U of weights rather than a single scalar. The same collapse occurs:

a_k = Σ_j w_kj Σ_j' u_jj' Σ_i w_j'i x_i = Σ_i w̃_ki x_i, with w̃_ki = Σ_j Σ_j' w_kj u_jj' w_j'i.

So yes, you still get an equivalent single-layer perceptron; adding extra linear weights between layers does not increase the expressive power of the network.
Step 3 (part c): The minimum requirement is that the hidden-layer activation g^(1) be nonlinear. Conceptually, each layer of a perceptron is a linear map followed by an activation; if every activation up to the output is linear, the whole stack is a composition of linear maps, and a composition of linear maps is itself a single linear map. No matter how many layers you stack, the network can then represent only what one linear layer can represent, so the higher layers "crash down" into an effective single layer. A nonlinearity between layers breaks this collapse: the hidden layer can then compute features that are not linear combinations of the inputs, which is what lets a multi-layer network represent functions (e.g. XOR) that no single-layer perceptron can.
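The collapse in steps 1 and 2 can be checked numerically. Below is a minimal numpy sketch (the names W1, W2, c, and the choice of tanh for g^(2) are illustrative assumptions, not part of the original problem): a two-layer network with a linear hidden layer z = c·(W1 x) produces the same output as a single-layer perceptron with effective weights W_eff = c·(W2 W1).

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 3, 4, 2

# Illustrative random weights for the two layers (assumed, not from the problem)
W1 = rng.normal(size=(n_hidden, n_in))   # input-to-hidden weights w_ji
W2 = rng.normal(size=(n_out, n_hidden))  # hidden-to-output weights w_kj
c = 0.5                                  # linear hidden activation: z_j = c * a_j
x = rng.normal(size=n_in)

# Two-layer network with a LINEAR hidden layer
z = c * (W1 @ x)          # hidden layer: z_j = c * a_j
y_mlp = np.tanh(W2 @ z)   # output layer with nonlinearity g^(2) = tanh

# Equivalent single-layer perceptron: w~ = c * W2 W1
W_eff = c * (W2 @ W1)
y_single = np.tanh(W_eff @ x)

print(np.allclose(y_mlp, y_single))  # True: the two networks are identical
```

The equality holds for any x because W2 (c (W1 x)) = (c W2 W1) x exactly; only a nonlinear g^(1) between the layers would break this identity.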