
Question

**Part A**: Suppose each of the weights is initialized to $W^k = 1.0$ and each bias is initialized to $b^k = -0.5$. Use forward propagation to find the activities and activations associated with each hidden and output neuron for the training example $(x, y) = (0.5, 0)$. Show your work. Answer the Peer Review question about this section.

**Part B**: Use back-propagation to compute the weight and bias derivatives $\partial \ell / \partial W^k$ and $\partial \ell / \partial b^k$ for $k = 1, 2, 3$. Show all work. Answer the Peer Review question about this section.

**Part C**: Implement the following activation functions:
* ReLU: $f(x) = \max(0, x)$
* Sigmoid: $f(x) = \frac{1}{1 + e^{-x}}$
* Softmax: $f(x_i) = \frac{e^{x_i}}{\sum_{j=1}^{n} e^{x_j}}$
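A minimal Python sketch of the Part C activation functions, using only the standard library (the function names and the scalar/list signatures are my own choices; the question does not fix an interface). The short loop at the end illustrates the Part A forward pass under the *assumption* of a chain of single sigmoid neurons with the given $W^k = 1.0$, $b^k = -0.5$; the exact network topology and hidden activation are not fully specified in the question, so treat it as an illustration only:

```python
import math

def relu(x):
    # ReLU: max(0, x)
    return max(0.0, x)

def sigmoid(x):
    # Sigmoid: 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

def softmax(xs):
    # Softmax over a list of scores; subtracting the max before
    # exponentiating avoids overflow without changing the result.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical Part A forward pass: a chain of three single sigmoid
# neurons with W^k = 1.0, b^k = -0.5 (an assumed architecture).
a = 0.5  # input x
for k in range(3):
    z = 1.0 * a - 0.5  # activation z^k = W^k * a + b^k
    a = sigmoid(z)     # activity a^k
    # With these values, z = 0.0 and a = sigmoid(0) = 0.5 at every layer.
```

Note that with $W^k = 1.0$, $b^k = -0.5$, and $x = 0.5$, the pre-activation is $z = 0$ at each layer under this assumed architecture, so every sigmoid unit outputs exactly $0.5$.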


Students also viewed these Databases questions

Question

Can I borrow a similar item instead?
