
Question


a) [8 points] Say that the network uses linear units: that is, the output of a unit is C(W · A) for some fixed constant C. Let the weight values w_i be fixed. Re-design the neural network to compute the same function without using any hidden units. Express the new weights in terms of the old weights and the constant C.

b) [4 points] Is it always possible to express a neural network made up of only linear units without a hidden layer? Justify your answer.

c) [8 points] Another common activation function is a threshold, where the activation is t(W · A), where t(x) is 1 if x > 0 and 0 otherwise. Let the hidden units use sigmoid activation functions (the activation of a unit is then (1 + exp(-W · A))^-1) and let the output unit use a threshold activation function. Find weights which cause this network to compute the XOR of X1 and X2 for binary-valued X1 and X2. Keep in mind that there is no bias term for these units.
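The two constructions the question asks about can be sketched numerically. This is a hedged illustration, not the graded answer: the constant C = 2, the example weight matrices W1 and W2, and the XOR weight assignment (hidden weights (1, 1) and (10, 10), output weights (-10, 8)) are one possible choice among many that satisfy the constraints.

```python
# Part (a) sketch: with linear units (output = C * W . A), a two-layer
# network collapses to a single layer, since
#   C*W2 @ (C*W1 @ x) = (C^2 * W2 @ W1) @ x,
# so the new weights are W' = C^2 * W2 @ W1.
import math

C = 2.0  # arbitrary fixed constant for the linear units (assumption)

def linear_two_layer(x, W1, W2):
    h = [C * sum(w * xi for w, xi in zip(row, x)) for row in W1]
    return [C * sum(w * hi for w, hi in zip(row, h)) for row in W2]

def collapsed(x, W1, W2):
    # New weights expressed in terms of the old ones: W' = C^2 * W2 @ W1
    Wp = [[C * C * sum(W2[i][k] * W1[k][j] for k in range(len(W1)))
           for j in range(len(W1[0]))] for i in range(len(W2))]
    return [sum(w * xi for w, xi in zip(row, x)) for row in Wp]

W1 = [[1.0, -2.0], [0.5, 3.0]]  # example hidden-layer weights (assumption)
W2 = [[2.0, 1.0]]               # example output-layer weights (assumption)
x = [1.0, 1.0]
print(linear_two_layer(x, W1, W2), collapsed(x, W1, W2))  # both print [6.0]

# Part (c) sketch: sigmoid hidden units, threshold output, no bias terms.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def t(z):
    return 1 if z > 0 else 0

def xor_net(x1, x2):
    h1 = sigmoid(1.0 * x1 + 1.0 * x2)    # shallow-slope sigmoid
    h2 = sigmoid(10.0 * x1 + 10.0 * x2)  # steep-slope sigmoid
    return t(-10.0 * h1 + 8.0 * h2)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_net(a, b))  # prints the XOR truth table: 0, 1, 1, 0
```

The XOR trick here is that both hidden units see the same sum x1 + x2 but at different slopes: at (0, 0) both output exactly 0.5 (no bias means σ(0) = 0.5), at one-hot inputs the steep unit saturates near 1 while the shallow one is only σ(1) ≈ 0.73, and at (1, 1) the shallow unit catches up enough that the negative output weight dominates again.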

Step by Step Solution

There are 3 steps involved in it.

