Question
a) (15 points) Take the pretrained BERT BertForSequenceClassification model. Extract the logits and the per-token vectors using BERT for a sentence. For example, for the sentence "Hello how are you doing today?" you will get vectors corresponding to each token in the sentence. To get the embeddings, take the BERT output and do embeddings = output.last_hidden_state.
b) Extract BERT embeddings for all positive and negative sentences in the train.tsv file we used in class for sentiment classification.
c) Train a single-layer LSTM network with a hidden dimension of … to do sentiment classification from the BERT outputs. Use a learning rate of … and a batch size of …
You don't have to tune the number of iterations. Just run a few iterations and report the accuracy.
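Part (c) could be sketched as below. The hidden dimension (128), learning rate (1e-3), and batch size (16) used here are placeholder assumptions, since the question's actual values were lost in extraction, and random tensors stand in for the BERT embeddings and labels from part (b).

```python
# Sketch of part (c): single-layer LSTM classifier over BERT token embeddings.
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    def __init__(self, input_dim=768, hidden_dim=128, num_classes=2):
        super().__init__()
        # num_layers=1 per the question; hidden_dim=128 is an assumption.
        self.lstm = nn.LSTM(input_dim, hidden_dim,
                            num_layers=1, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):
        # Classify from the final hidden state of the sequence.
        _, (h_n, _) = self.lstm(x)      # h_n: (1, batch, hidden_dim)
        return self.fc(h_n.squeeze(0))  # (batch, num_classes)

# Dummy stand-in for part (b)'s output: 32 "sentences" of 10 tokens
# each, 768-dim, with random binary sentiment labels.
X = torch.randn(32, 10, 768)
y = torch.randint(0, 2, (32,))

model = LSTMClassifier()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)  # lr assumed
loss_fn = nn.CrossEntropyLoss()

for epoch in range(3):  # just a few iterations, as the prompt allows
    for i in range(0, 32, 16):  # batch size 16 (assumed)
        xb, yb = X[i:i + 16], y[i:i + 16]
        opt.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        opt.step()

with torch.no_grad():
    acc = (model(X).argmax(dim=1) == y).float().mean().item()
print(f"train accuracy: {acc:.3f}")
```

With real embeddings from part (b), the variable-length sequences would be padded to a common length (e.g. with torch.nn.utils.rnn.pad_sequence) before batching.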