Question
2. Consider the following simple neural network with only one output node. Ignore the bias node for this example. The values on the edges indicate the weights associated with the "receiving" node.

[Figure: a two-input network (x1, x2) with a hidden layer feeding a single output node; the edge weights shown in the figure (e.g., 0.1, 0.6, 2) did not survive extraction intact.]

For cases where the Sigmoid activation function is assumed, you will need the following small snippet of Python code to compute the sigmoid activation s for a value z:

import numpy as np
s = 1.0 / (1.0 + np.exp(-1.0 * z))

Or you can use a scientific calculator, MATLAB, etc. Refer to the class notes/slides and do the following.

Perform a forward pass on the network. Consider two neural networks, A and B, with the exact same architecture as above, such that A uses Sigmoid activation at each node/unit and B uses ReLU activation at each node/unit. Remember that ReLU activation is equal to max(0, x); i.e., it simply outputs the input if it is positive, and 0 otherwise. After performing the forward pass using the two activation functions, compare the final output activation for networks A and B for different inputs:

a) If x1 = 0 and x2 = 1, the output activation for A will be the same as the output activation for B
b) If x1 = 4 and x2 = 2, the output activation for A will be smaller than the output activation for B
c) If x1 = -4 and x2 = 1, the output activation for A will be the same as the output activation for B
d) If x1 = 1 and x2 = 4, the output activation for A will be the same as the output activation for B
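Below is a minimal sketch of the forward pass for both networks. Since the edge weights in the figure did not survive extraction, the weight matrix W_hidden and output vector w_out here are placeholder values for illustration only, and a 2-2-1 architecture (two inputs, two hidden units, one output, no biases) is assumed; substitute the actual weights from the figure before comparing the options.

import numpy as np

def sigmoid(z):
    # Same formula as the snippet given in the question.
    return 1.0 / (1.0 + np.exp(-1.0 * z))

def relu(z):
    # ReLU: max(0, x), applied elementwise.
    return np.maximum(0.0, z)

def forward(x, W_hidden, w_out, act):
    # Hidden layer: one weighted sum per hidden unit (no bias), then activation.
    h = act(W_hidden @ x)
    # Output node: weighted sum of hidden activations, then the same activation.
    return act(np.dot(w_out, h))

# Hypothetical weights (NOT the figure's values; replace with the real ones).
W_hidden = np.array([[0.1, 0.6],
                     [0.3, 0.2]])
w_out = np.array([0.5, 0.4])

for x1, x2 in [(0, 1), (4, 2), (-4, 1), (1, 4)]:
    x = np.array([x1, x2], dtype=float)
    a = forward(x, W_hidden, w_out, sigmoid)   # network A
    b = forward(x, W_hidden, w_out, relu)      # network B
    print(f"x1={x1}, x2={x2}: A (sigmoid) = {a:.4f}, B (ReLU) = {b:.4f}")

One general observation that helps when comparing the options: a sigmoid unit always outputs a value strictly between 0 and 1, whereas a ReLU unit outputs exactly 0 for any non-positive input and grows without bound for large positive inputs. So for large positive pre-activations B's output can exceed A's, while for negative pre-activations B outputs 0 but A still outputs a positive value.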