
Question


Note that:
The ReLU activation function is defined as follows: for an input \(x\),
\[\mathrm{ReLU}(x) = \max(0, x).\]
The Softmax function is defined as follows: given inputs \(x_i,\ i = 1, \dots, n\), the outputs are
\[s(x_i) = \frac{e^{x_i}}{\sum_{j=1}^{n} e^{x_j}}.\]
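As a worked illustration (not part of the original question): with two inputs \(x_1 = 1\) and \(x_2 = 2\),
\[s(x_1) = \frac{e^{1}}{e^{1} + e^{2}} \approx 0.269, \qquad s(x_2) = \frac{e^{2}}{e^{1} + e^{2}} \approx 0.731,\]
so the softmax outputs are positive and sum to 1.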
Suppose that the weights of the network are
\(w_{11} = -0.5,\ w_{12} = -1.0,\ w_{21} = 1.2,\ w_{22} = 1.0,\ w_{31} = -0.6,\ w_{32} = 1.0\)
\(h_{11} = -0.5,\ h_{12} = -1.0,\ h_{21} = 1.0,\ h_{22} = 1.5,\ h_{31} = 0.4,\ h_{32} = 0.3\)
For an input of \(x_1 = 1.0\) and \(x_2 = 1.0\):
i. Calculate the net inputs \(a_1\) and \(a_2\).
ii. Calculate the ReLU outputs.
iii. Calculate \(y_1\) and \(y_2\).
iv. Calculate \(z_1\) and \(z_2\).
(The network diagram referenced by the question was not transcribed.)
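Because the network diagram was not transcribed, the architecture below is an assumption: two inputs \(x_1, x_2\) plus a bias unit feeding two hidden ReLU units, with \(w_{3j}\) and \(h_{3j}\) treated as bias weights, and two softmax output units \(z_1, z_2\). The Python sketch computes the four requested quantities under that reading; if the diagram wires the weights differently, the indexing (and the numbers) will change.

```python
import math

# Assumed architecture (the original network diagram was not transcribed):
# inputs x1, x2 plus a bias, two hidden ReLU units, two softmax outputs.
# w[i][j]: weight from input i (bias for i = 2, 0-indexed) to hidden unit j.
# h[i][j]: weight from hidden unit i (bias for i = 2) to output unit j.
w = [[-0.5, -1.0],   # w11, w12
     [ 1.2,  1.0],   # w21, w22
     [-0.6,  1.0]]   # w31, w32 (assumed to be bias weights)
h = [[-0.5, -1.0],   # h11, h12
     [ 1.0,  1.5],   # h21, h22
     [ 0.4,  0.3]]   # h31, h32 (assumed to be bias weights)
x = [1.0, 1.0]

def relu(v):
    return max(0.0, v)

def softmax(vs):
    exps = [math.exp(v) for v in vs]   # e^{x_j} for each input
    total = sum(exps)
    return [e / total for e in exps]   # e^{x_i} / sum_j e^{x_j}

# i.   Net inputs to the hidden layer: a_j = w_{1j}*x1 + w_{2j}*x2 + w_{3j}
a = [w[0][j] * x[0] + w[1][j] * x[1] + w[2][j] for j in range(2)]
# ii.  ReLU outputs of the hidden layer
r = [relu(v) for v in a]
# iii. Net inputs to the output layer: y_j = h_{1j}*r1 + h_{2j}*r2 + h_{3j}
y = [h[0][j] * r[0] + h[1][j] * r[1] + h[2][j] for j in range(2)]
# iv.  Softmax outputs
z = softmax(y)

print("a =", a)   # ≈ [0.1, 1.0] (up to floating-point rounding)
print("r =", r)   # ≈ [0.1, 1.0]
print("y =", y)   # ≈ [1.35, 1.7]
print("z =", z)   # ≈ [0.413, 0.587]
```

Under this assumed wiring, \(a_1 = 0.1\), \(a_2 = 1.0\), the ReLU outputs are \((0.1, 1.0)\), \(y_1 = 1.35\), \(y_2 = 1.7\), and the softmax gives \(z_1 \approx 0.413\), \(z_2 \approx 0.587\).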


