Question
1. [10 points] Consider a single-input neuron with a bias. We would like the output to be 0 for inputs less than a given threshold and 1 for inputs greater than or equal to that threshold.
a. What kind of transfer function is required?
b. What weight and bias would you suggest? Is your bias in any way related to the weight? If so, how?
c. Summarize your network by naming the transfer function and stating the bias and the weight. Draw a diagram of the network.
d. Write a MATLAB function that implements your transfer function (or use the appropriate transfer function from the NND code). Then write a neuron function that uses that transfer function to implement your neuron design. Make sure you follow the convention of putting each MATLAB function in a .m file of the same name. Verify using MATLAB that it performs as expected. Include both functions and demonstration output in your document.
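For orientation only, a minimal sketch of what such a pair of functions might look like follows; the function names (hardlim_tf, single_neuron) are hypothetical, and the built-in or NND hardlim could be used instead of writing one:

% hardlim_tf.m  (hypothetical file name)
function a = hardlim_tf(n)
    % Hard-limit transfer function: 1 where n >= 0, 0 otherwise.
    % Elementwise comparison, so vector inputs also work.
    a = double(n >= 0);
end

% single_neuron.m  (hypothetical file name)
function a = single_neuron(p, w, b)
    % Single-input neuron with bias: a = f(w*p + b).
    a = hardlim_tf(w*p + b);
end

With a hard-limit function, the bias is tied to the weight by b = -w*t, where t is the desired switching point; for example, w = 1 and b = -3 would switch the output from 0 to 1 at an input of 3 (a placeholder threshold for illustration, not the value from the original problem statement).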
2. Modify your single-input neuron function so that it takes multiple inputs and computes the scalar output for that neuron. This function should take three arguments: an input vector, a weight vector, and a bias (also a scalar). It should call the same transfer function as the single-input neuron function from question 1. Implement this multi-input neuron function two ways (illustrative sketches of both follow this question):
a. Use a for loop to iterate over the inputs and weights to compute their inner product.
b. Use the fact that MATLAB can directly multiply vectors, plus the MATLAB transpose operator, to compute the inner product in a single mathematical expression.
Note that you can get the size of a MATLAB vector using the length function; this should allow you to write a function that can determine how many inputs are presented to the neuron.
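As a hedged illustration (the function names are assumptions, and hardlim_tf is the hypothetical transfer function sketched under question 1), the two versions might look like this:

% multi_neuron_loop.m  (hypothetical file name)
function a = multi_neuron_loop(p, w, b)
    % Multi-input neuron; inner product accumulated with an explicit for loop.
    n = b;
    for i = 1:length(p)
        n = n + w(i)*p(i);
    end
    a = hardlim_tf(n);
end

% multi_neuron_vec.m  (hypothetical file name)
function a = multi_neuron_vec(p, w, b)
    % Multi-input neuron; inner product in a single expression using transpose.
    % w(:)' * p(:) yields a scalar regardless of row/column orientation.
    a = hardlim_tf(w(:)' * p(:) + b);
end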
3. Implement a PerceptronLayer class that implements a single layer of multiple neurons of the type from question 2. The class should have the following characteristics (a sketch of one possible class appears after this list):
The constructor takes three arguments and can be called one of two ways, depending on the first two arguments: with two scalar arguments (the number of inputs and the number of outputs), or with a weight matrix and a bias vector. You can check the dimensionality of the first two arguments in the constructor using the size function to determine which is the case. In the former case, it should preallocate a weight matrix and a bias vector of the correct dimensions, with random values uniformly distributed in the specified range. In the latter case, it should set the internal weight matrix and bias vector from those arguments (and, of course, the number of inputs and outputs for the layer are then determined by the dimensions of the weight matrix). The third argument to the constructor should be a string: the name of the transfer function to use. A PerceptronLayer object should remember this, so that the forward method will use it.
Private transfer function methods. At a minimum, implement hardlim and hardlims. You can always add other transfer functions later.
A forward method that takes an input vector and returns an output vector. This implements the operation that computes the output of a layer of multiple neurons. The method should call the private transfer method that was indicated at construction time. Implement this method two ways (i.e., implement it twice: first one way and then the other):
a. Use a for loop to iterate over the neurons in the layer, computing each neuron's output and appending it to the output vector, thus assembling the layer's output vector sequentially.
b. Use MATLAB matrix-vector mathematical operations to compute the outputs of all of the neurons in the layer in parallel, using no loop. You will likely want to start by ensuring that your transfer function works for vector-valued inputs.
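For orientation only, one possible shape of such a class is sketched below. The property names, the [-1, 1] initialization range, and showing only the vectorized forward are assumptions rather than requirements from the problem statement; the for-loop version of forward would simply iterate over the rows of W:

% PerceptronLayer.m  (sketch, not a definitive implementation)
classdef PerceptronLayer
    properties
        W            % weight matrix, one row per neuron
        b            % bias column vector, one entry per neuron
        transferFcn  % name of the transfer function ('hardlim' or 'hardlims')
    end
    methods
        function obj = PerceptronLayer(arg1, arg2, transferFcn)
            if isscalar(arg1) && isscalar(arg2)
                % arg1 = number of inputs, arg2 = number of outputs
                % assumed initialization range: [-1, 1]
                obj.W = 2*rand(arg2, arg1) - 1;
                obj.b = 2*rand(arg2, 1) - 1;
            else
                % arg1 = weight matrix, arg2 = bias vector
                obj.W = arg1;
                obj.b = arg2(:);
            end
            obj.transferFcn = transferFcn;
        end
        function a = forward(obj, p)
            % Vectorized version: all neurons computed at once.
            n = obj.W * p(:) + obj.b;
            switch obj.transferFcn
                case 'hardlim'
                    a = obj.hardlim(n);
                case 'hardlims'
                    a = obj.hardlims(n);
            end
        end
    end
    methods (Access = private)
        function a = hardlim(~, n)
            a = double(n >= 0);          % outputs 0 or 1
        end
        function a = hardlims(~, n)
            a = 2*double(n >= 0) - 1;    % outputs -1 or +1
        end
    end
end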
Use MATLAB to demonstrate that each of the two above implementations works for a two-input, two-output network in which the inputs are binary (0 or 1) and the outputs are binary (0 or 1). So, for the four possible input patterns (think of these as two-bit binary patterns), one of the layer's outputs will be four values (one value for each of the four input patterns) corresponding to the logical AND of the input values, and the other output will be the logical OR of the input values. Note that you are not yet implementing a learning algorithm; a little trial and error should allow you to set reasonable weight and bias values. Include in your document your final code in an appendix, with the vectorized forward method, and example runs that show the weights and biases and demonstrate that your functions work.
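One hedged example of such a demonstration follows; the weights and biases are one hand-picked trial-and-error choice under the hardlim / 0-1 convention, and the class and method names come from the sketch above:

% Row 1 of W and b realizes AND, row 2 realizes OR (one possible choice).
W = [1 1; 1 1];
b = [-1.5; -0.5];
layer = PerceptronLayer(W, b, 'hardlim');

patterns = [0 0; 0 1; 1 0; 1 1];   % the four binary input patterns, one per row
for k = 1:size(patterns, 1)
    a = layer.forward(patterns(k, :)');
    fprintf('p = [%d %d]  ->  AND = %d, OR = %d\n', ...
            patterns(k, 1), patterns(k, 2), a(1), a(2));
end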
Note that you will be building future implementations from this starting point, so spend some time documenting and designing it for future expansion. Note further that the above implementation uses transfer functions that are methods of the PerceptronLayer class, with a string used to select which one the object uses. An acceptable alternative implementation would be to write the transfer functions …