Question

Consider a specific 2-hidden-layer ReLU network with inputs $x \in \mathbb{R}$, 1-dimensional outputs, and 2 neurons per hidden layer. This function is given by

$$h(x) = W^{(3)} \max\{0,\; W^{(2)} \max\{0,\; W^{(1)} x + b^{(1)}\} + b^{(2)}\} + b^{(3)}$$
with weights:

$$W^{(1)} = \begin{bmatrix} 0.5 \\ 0.5 \end{bmatrix}, \quad b^{(1)} = \begin{bmatrix} 0 \\ 1 \end{bmatrix}, \quad W^{(2)} = \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}, \quad b^{(2)} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}, \quad W^{(3)} = \begin{bmatrix} 1 & 1 \end{bmatrix}, \quad b^{(3)} = 1$$
An interesting property of networks with piece-wise linear activations like the ReLU is that, as a whole, they compute piece-wise linear functions. At each of the following points $x = x_0$, determine the value of the new weight $W \in \mathbb{R}$ and bias $b \in \mathbb{R}$ such that $\frac{dh(x)}{dx}\big|_{x=x_0} = W$ and $W x_0 + b = h(x_0)$.

$x_0 = 2$
$x_0 = -1$
$x_0 = 1$
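One way to check answers to this kind of question is to evaluate the network numerically. The sketch below uses the weights as transcribed above (note that the all-ones $W^{(2)}$ is a best-effort reading of the garbled matrix in the source, so treat it as an assumption). On each linear piece, the slope is the product $W^{(3)} D_2 W^{(2)} D_1 W^{(1)}$, where $D_1, D_2$ are diagonal 0/1 masks recording which ReLUs are active, and the intercept follows from $b = h(x_0) - W x_0$.

```python
import numpy as np

# Weights as transcribed from the problem statement; the all-ones W2 is an
# assumption, since the original matrix was garbled in the source.
W1 = np.array([[0.5], [0.5]]); b1 = np.array([0.0, 1.0])
W2 = np.array([[1.0, 1.0], [1.0, 1.0]]); b2 = np.array([0.0, 0.0])
W3 = np.array([[1.0, 1.0]]); b3 = 1.0

def local_linear(x0):
    """Return (W, b) of the linear piece of h(x) containing x = x0."""
    # Forward pass, keeping pre-activations to read off the ReLU masks.
    z1 = W1.flatten() * x0 + b1
    a1 = np.maximum(0.0, z1)
    z2 = W2 @ a1 + b2
    a2 = np.maximum(0.0, z2)
    h = (W3 @ a2 + b3).item()
    # Diagonal 0/1 masks: a ReLU contributes slope only where its input > 0.
    D1 = np.diag((z1 > 0).astype(float))
    D2 = np.diag((z2 > 0).astype(float))
    W = (W3 @ D2 @ W2 @ D1 @ W1).item()  # chain rule through active units
    b = h - W * x0                        # intercept so that W*x0 + b = h(x0)
    return W, b

for x0 in (2.0, -1.0, 1.0):
    W, b = local_linear(x0)
    print(f"x0 = {x0:+.0f}:  W = {W},  b = {b}")
```

With these (assumed) weights, the piece at $x_0 = -1$ differs from the one at $x_0 = 1$ and $x_0 = 2$ because the first hidden unit's pre-activation $0.5x$ is negative there, zeroing that unit's contribution to the slope.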

