Question
Suppose you have a neural network with 3 layers. The first layer is the input layer, with 20 neurons that simply encode the 20 binary input values (0 or 1). The second layer consists of 30 perceptrons, each with its own weights and bias. The final layer is a single output perceptron with its own weights and bias. Neurons in adjacent layers are fully connected, so this network computes a function y = f(x1, x2, …, x20) where all variables take binary values. Assume that w·x + b is never exactly 0 for any neuron in the network on any possible input x. Is it possible to change the weights and biases of the perceptrons in a systematic way so that the new network computes a function z = h(x1, x2, …, x20) satisfying h(x1, x2, …, x20) = 1 − f(x1, x2, …, x20) for all possible inputs x1, x2, …, x20? If so, how?
Step by Step Solution
There are 3 steps involved:
Step: 1
Recall how a perceptron computes its output: it outputs 1 if w·x + b > 0 and 0 if w·x + b < 0. By assumption, the borderline case w·x + b = 0 never occurs for any neuron on any input.
Step: 2
Leave the 30 hidden-layer perceptrons completely unchanged, and negate every weight and the bias of the single output perceptron. For any vector a of hidden-layer activations, the modified output perceptron computes (−w)·a + (−b) = −(w·a + b), which is positive exactly when w·a + b is negative.
Step: 3
Since w·a + b is never 0, the new output perceptron fires exactly when the old one did not. The hidden-layer activations are identical in both networks, so the new network outputs z = h(x1, x2, …, x20) = 1 − f(x1, x2, …, x20) for every binary input. So yes, it is possible, and only the output perceptron's weights and bias need to change.
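A minimal sketch in Python to sanity-check the construction (the random weights and the names perceptron_layer, f, and h are illustrative assumptions, not from the source): it builds a random 20-30-1 perceptron network, negates only the output perceptron's weights and bias, and verifies h(x) = 1 − f(x) on sampled binary inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random parameters for a fully connected 20 -> 30 -> 1 perceptron network.
# With continuous random weights, w.x + b == 0 occurs with probability zero,
# matching the "never exactly 0" assumption in the question.
W1 = rng.normal(size=(30, 20))  # hidden layer: 30 perceptrons, 20 inputs each
b1 = rng.normal(size=30)
w2 = rng.normal(size=30)        # output perceptron: 30 inputs
b2 = rng.normal()

def perceptron_layer(W, b, x):
    # Perceptron rule: output 1 if w.x + b > 0, else 0.
    return (W @ x + b > 0).astype(int)

def f(x):
    a = perceptron_layer(W1, b1, x)  # hidden activations
    return int(w2 @ a + b2 > 0)      # original output perceptron

def h(x):
    a = perceptron_layer(W1, b1, x)      # hidden layer unchanged
    return int((-w2) @ a + (-b2) > 0)    # negated weights and bias

# Check h == 1 - f on a sample of the 2**20 possible binary inputs.
for _ in range(10_000):
    x = rng.integers(0, 2, size=20)
    assert h(x) == 1 - f(x)
print("h(x) == 1 - f(x) held on all sampled inputs")
```

The check samples inputs rather than enumerating all 2^20 of them, but the argument in Steps 1-3 shows the identity holds for every input, not just the sampled ones.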