Question
Deep Learning
Assignment 1
Forward and Backpropagation
Problem 1. Fully Connected Neural Network
In this exercise, you will implement a fully connected neural network using
Python and NumPy. Your task is to implement the following functions:
• init_weights(n_inputs, n_hidden, n_output)
This function should randomly initialize the neural network's weights using the
normal distribution. It should return the weight matrices W0, W1, and W2 for the
input-to-hidden, hidden-to-hidden, and hidden-to-output layers, respectively. The
number of input, hidden, and output units should be passed as arguments to the
function.
• feedforward(x, W0, W1, W2)
This function should implement the feedforward operation of the neural network. It should
take as input an example x and the weight matrices W0, W1, and W2, and return the pre-activations z0, z1, and z2 and the activations a0, a1, and a2 for the input, hidden, and
output layers, respectively. The feedforward operation should consist of matrix
multiplications and pointwise application of the non-linear function f, which can be
chosen as the sigmoid function.
Hint: The function should first augment each weight matrix by appending its bias vector
as the last column and update the activation matrix A by appending a column of ones.
• predict(x, W0, W1, W2)
This function should take as input an example x and the weight matrices W0, W1, and
W2, and return the prediction of the neural network for that example. This can be done
by passing the example x through the feedforward operation and returning the network's
output.
• train(X_train, Y_train, n_inputs, n_hidden, n_output, n_epochs,
learning_rate)
This function should train the neural network using the training data X_train and Y_train.
The number of input, hidden, and output units should be passed as arguments, along with
the number of training epochs and the learning rate. The function should return the trained
weight matrices W0, W1, and W2. (Sketches of possible implementations of these functions appear below.)
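As a rough guide, here is a minimal sketch of init_weights, feedforward, and predict under a few assumptions that go beyond the statement above: a sigmoid activation, two hidden layers of size n_hidden (matching the three weight matrices), the bias-handling convention from the hint (each weight matrix carries its bias vector as an extra last column, and a 1 is appended to the incoming activations), and the convention that z0/a0, z1/a1, and z2/a2 refer to the first hidden, second hidden, and output layers. Treat it as a sketch, not the reference solution.

import numpy as np

def sigmoid(z):
    # Logistic function, applied elementwise
    return 1.0 / (1.0 + np.exp(-z))

def init_weights(n_inputs, n_hidden, n_output):
    # Each matrix has one extra column that holds the bias vector.
    W0 = np.random.randn(n_hidden, n_inputs + 1)   # input -> hidden 1
    W1 = np.random.randn(n_hidden, n_hidden + 1)   # hidden 1 -> hidden 2
    W2 = np.random.randn(n_output, n_hidden + 1)   # hidden 2 -> output
    return W0, W1, W2

def feedforward(x, W0, W1, W2):
    a_in = np.ravel(x)                       # bias-augmented input (length n_inputs + 1)
    z0 = W0 @ a_in                           # first hidden layer pre-activation
    a0 = np.append(sigmoid(z0), 1.0)         # first hidden activation, plus bias unit
    z1 = W1 @ a0                             # second hidden layer pre-activation
    a1 = np.append(sigmoid(z1), 1.0)         # second hidden activation, plus bias unit
    z2 = W2 @ a1                             # output layer pre-activation
    a2 = sigmoid(z2)                         # network output
    return z0, z1, z2, a0, a1, a2

def predict(x, W0, W1, W2):
    # The prediction is simply the output activation of the feedforward pass.
    z0, z1, z2, a0, a1, a2 = feedforward(x, W0, W1, W2)
    return a2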
You can test your implementation using the following code snippet:
import numpy as np

# Example usage
n_samples = 1000
n_inputs = 10
n_hidden = 5
n_output = 3
# Generate toy data
X_train = np.random.randn(n_samples, n_inputs)
# Generate a vector of ones with the same number of rows as X_train
b = np.ones((X_train.shape[0], 1))
# Concatenate the ones vector to X_train horizontally (along the columns)
X_train = np.concatenate((X_train, b), axis=1)
# Generate random class labels for Y_train (0 or 1)
Y_train = np.zeros((n_samples, n_output), dtype=int)
# Randomly set one element in each row to 1
for i in range(n_samples):
    random_idx = np.random.randint(n_output)
    Y_train[i, random_idx] = 1
W0, W1, W2 = init_weights(n_inputs, n_hidden, n_output)
# Train the network
W0, W1, W2 = train(X_train, Y_train, n_inputs, n_hidden, n_output,
n_epochs=100, learning_rate=0.1)
# Test the network
x_test = np.concatenate((np.random.randn(1, n_inputs), np.ones((1, 1))), axis=1)  # append the bias column, as for X_train
y_pred = predict(x_test, W0, W1, W2)
print(y_pred)
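Since each row of Y_train is one-hot encoded, the predicted class for x_test can be read off with np.argmax(y_pred).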
Hint: Create the following helper functions (a sketch of backprop, loss, and train follows this list):
1- sigmoid(z), tanh(z), and/or relu(z)
2- backprop(X_train, Y_train, W0, W1, W2, learning_rate)
3- loss(Y_pred, Y): mean squared error loss
4- train(X_train, Y_train, n_inputs, n_hidden, n_output, n_epochs,
learning_rate)
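Continuing the sketch above (sigmoid activations, two hidden layers, bias columns in each weight matrix), one possible loss/backprop/train might look as follows. The per-example stochastic gradient-descent loop, the use of mean squared error with a sigmoid output, and the epoch-wise loss printout are assumptions, not requirements of the assignment.

def loss(Y_pred, Y):
    # Mean squared error over all outputs and examples
    return np.mean((Y_pred - Y) ** 2)

def backprop(X_train, Y_train, W0, W1, W2, learning_rate):
    # One stochastic gradient-descent sweep over the training set.
    for x, y in zip(X_train, Y_train):
        z0, z1, z2, a0, a1, a2 = feedforward(x, W0, W1, W2)
        # Output-layer error term for squared error with a sigmoid output
        delta2 = (a2 - y) * a2 * (1.0 - a2)
        # Propagate the error backwards, excluding the bias columns/units
        h1 = a1[:-1]
        delta1 = (W2[:, :-1].T @ delta2) * h1 * (1.0 - h1)
        h0 = a0[:-1]
        delta0 = (W1[:, :-1].T @ delta1) * h0 * (1.0 - h0)
        # Gradient-descent weight updates
        W2 -= learning_rate * np.outer(delta2, a1)
        W1 -= learning_rate * np.outer(delta1, a0)
        W0 -= learning_rate * np.outer(delta0, np.ravel(x))
    return W0, W1, W2

def train(X_train, Y_train, n_inputs, n_hidden, n_output, n_epochs, learning_rate):
    W0, W1, W2 = init_weights(n_inputs, n_hidden, n_output)
    for epoch in range(n_epochs):
        W0, W1, W2 = backprop(X_train, Y_train, W0, W1, W2, learning_rate)
        Y_pred = np.array([predict(x, W0, W1, W2) for x in X_train])
        print(f"epoch {epoch + 1}: loss = {loss(Y_pred, Y_train):.4f}")
    return W0, W1, W2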
Submission: Please submit one Jupyter Notebook file with the code and any comments
or figures you have created by hand to show your thought process.