Question
(c) A fully connected neural network is given below. Note that:

- The ReLU activation function is defined as follows: for an input x, ReLU(x) = max(0, x).
- The Softmax function is defined as follows: given the inputs x_i, i = 1, ..., n, the outputs are s(x_i) = e^{x_i} / Σ_{j=1}^{n} e^{x_j}.

Suppose that the weights of the network are:

w11 = -1.5; w12 = -1.0; w21 = 1.2; w22 = 1.0; w31 = -0.6; w32 = (not given);
h11 = -0.5; h12 = -1.0; h21 = 1.0; h22 = 1.5; h31 = 0.4; h32 = 0.3.

For an input of x1 = 1.0 and x2 = 2.0:

i. Calculate the net inputs a1 and a2.
ii. Calculate the ReLU outputs.
iii. Calculate y1 and y2.
iv. Calculate z1 and z2.
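The four steps (net inputs, ReLU, second-layer net inputs, softmax) can be sketched as a forward pass in Python. Since the network figure is not reproduced in the text, the wiring below is an assumption: w_1j and w_2j feed input x1 and x2 into hidden unit j, with w_3j read as a bias term, and the h weights wired the same way into the output layer. The value of w32 is missing from the source, so `W32_PLACEHOLDER` is a hypothetical stand-in, not a value from the question.

```python
import math

# Assumed architecture (figure not available in the source text):
#   a_j = w_1j*x1 + w_2j*x2 + w_3j   (w_3j treated as a bias)
#   r_j = ReLU(a_j)
#   y_k = h_1k*r1 + h_2k*r2 + h_3k   (h_3k treated as a bias)
#   z   = softmax(y1, y2)

def relu(x):
    """ReLU(x) = max(0, x), as defined in the question."""
    return max(0.0, x)

def softmax(xs):
    """s(x_i) = e^{x_i} / sum_j e^{x_j}, as defined in the question."""
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

W32_PLACEHOLDER = 0.0  # w32 is not given in the source; this is a stand-in

def forward(x1, x2):
    # i. net inputs
    a1 = -1.5 * x1 + 1.2 * x2 + (-0.6)
    a2 = -1.0 * x1 + 1.0 * x2 + W32_PLACEHOLDER
    # ii. ReLU outputs
    r1, r2 = relu(a1), relu(a2)
    # iii. second-layer net inputs
    y1 = -0.5 * r1 + 1.0 * r2 + 0.4
    y2 = -1.0 * r1 + 1.5 * r2 + 0.3
    # iv. softmax outputs
    z1, z2 = softmax([y1, y2])
    return a1, a2, r1, r2, y1, y2, z1, z2

print(forward(1.0, 2.0))
```

Under this wiring, a1 = -1.5(1.0) + 1.2(2.0) - 0.6 = 0.3 regardless of the missing weight; a2 and everything downstream of it depend on the placeholder, so those numbers are illustrative only.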