
Question


PLEASE ANSWER THIS QUESTION, DON'T COPY PASTE AND DON'T WRITE THEORETICAL EXPLANATION !!!!!

For an input of x1 = 1.0 and x2 = 2.0, what are the net inputs a1 and a2, and the ReLU outputs?

[Network diagram not included in the transcription.]

A fully connected neural network is given below. Note that:

- The ReLU activation function is defined as follows: for an input x, ReLU(x) = max(0, x).
- The Softmax function is defined as follows: given inputs x_i, i = 1, ..., n, the outputs are s(x_i) = e^(x_i) / (e^(x_1) + ... + e^(x_n)).

Suppose that the weights of the network are:

w11 = -1.5; w12 = -1.0; w21 = 1.2; w22 = 1.0; w31 = -0.6; w32 = 1.0
h11 = -0.5; h12 = -1.0; h21 = 1.0; h22 = 1.5; h31 = 0.4; h32 = 0.3
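Since the network figure did not transcribe, the wiring below is an assumption: w_ij connects input x_i to hidden unit a_j, and w_31, w_32 act as bias weights fed by a constant input of 1.0 (a common setup for a 2-input, 2-hidden-unit layer with six first-layer weights). A minimal sketch of the first-layer computation under that assumption:

```python
# Assumed topology (verify against the original figure):
#   a_j = w1j * x1 + w2j * x2 + w3j * 1.0   (w3j treated as a bias weight)
x1, x2 = 1.0, 2.0

w11, w12 = -1.5, -1.0
w21, w22 = 1.2, 1.0
w31, w32 = -0.6, 1.0   # assumed bias weights

def relu(x):
    """ReLU(x) = max(0, x), as defined in the problem."""
    return max(0.0, x)

# Net inputs to the hidden units.
a1 = w11 * x1 + w21 * x2 + w31   # -1.5 + 2.4 - 0.6 = 0.3
a2 = w12 * x1 + w22 * x2 + w32   # -1.0 + 2.0 + 1.0 = 2.0

# ReLU outputs of the hidden units.
z1, z2 = relu(a1), relu(a2)

print(a1, a2, z1, z2)
```

Under this assumed wiring, a1 = 0.3 and a2 = 2.0; both are positive, so the ReLU outputs equal the net inputs. If the figure routes the weights differently (e.g. w_ij as unit-i, input-j), swap the indices accordingly.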


