Question
1 Approved Answer
Complete the graded function below:

# GRADED FUNCTION: Siamese
def Siamese(text_vectorizer, vocab_size=36224, d_feature=128):
import tensorflow as tf

# GRADED FUNCTION: Siamese
def Siamese(text_vectorizer, vocab_size=36224, d_feature=128):
    """Returns a Siamese model.

    Args:
        text_vectorizer (TextVectorization): TextVectorization instance, already adapted to your training data.
        vocab_size (int, optional): Length of the vocabulary. Defaults to 36224.
        d_feature (int, optional): Depth of the model. Defaults to 128.

    Returns:
        tf.keras.Model: A Siamese model.
    """
    ### START CODE HERE ###

    branch = tf.keras.models.Sequential(name='sequential')
    # Add the text_vectorizer layer. This is the text_vectorizer you instantiated and adapted before.
    branch.add(text_vectorizer)
    # Add the Embedding layer. Remember to call it 'embedding' using the name parameter.
    branch.add(tf.keras.layers.Embedding(
        input_dim=vocab_size,
        output_dim=d_feature,
        embeddings_initializer='uniform',
        embeddings_regularizer=None,
        activity_regularizer=None,
        embeddings_constraint=None,
        mask_zero=False,
        input_length=None,
        sparse=False,
        name='embedding'))
    # Add the LSTM layer. Recall that you want the LSTM layer to return sequences, not just one value.
    # Remember to call it 'LSTM' using the name parameter.
    branch.add(tf.keras.layers.LSTM(
        units=d_feature,
        activation='tanh',
        recurrent_activation='sigmoid',
        use_bias=True,
        kernel_initializer='glorot_uniform',
        recurrent_initializer='orthogonal',
        bias_initializer='zeros',
        unit_forget_bias=True,
        kernel_regularizer=None,
        recurrent_regularizer=None,
        bias_regularizer=None,
        activity_regularizer=None,
        kernel_constraint=None,
        recurrent_constraint=None,
        bias_constraint=None,
        dropout=0.0,
        recurrent_dropout=0.0,
        return_sequences=True,
        return_state=False,
        go_backwards=False,
        stateful=False,
        time_major=False,
        unroll=False,
        name='LSTM'))
    # Add the GlobalAveragePooling1D layer. Remember to call it 'mean' using the name parameter.
    branch.add(tf.keras.layers.GlobalAveragePooling1D(
        data_format='channels_last',
        name='mean'))
    # Add the normalizing layer using the Lambda function. Remember to call it 'out' using the name parameter.
    branch.add(tf.keras.layers.Lambda(
        lambda x: tf.math.l2_normalize(x),
        name='out'))
    # Define both inputs. Remember to call them 'input_1' and 'input_2' using the name parameter.
    # Be mindful of the data type and size: each input is a raw string of shape (1,).
    input1 = tf.keras.layers.Input((1,), dtype=tf.string, name='input_1')
    input2 = tf.keras.layers.Input((1,), dtype=tf.string, name='input_2')
    # Define the output of each branch of your Siamese network. Both branches share the same
    # weights, but they each receive different inputs.
    branch1 = branch(input1)
    branch2 = branch(input2)
    # Define the Concatenate layer. You should concatenate columns, which you can set with the axis parameter.
    # This layer is applied over the outputs of each branch of the Siamese network.
    conc = tf.keras.layers.Concatenate(axis=1, name='conc_1_2')([branch1, branch2])

    ### END CODE HERE ###

    return tf.keras.models.Model(inputs=[input1, input2], outputs=conc, name="SiameseModel")
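To see what the last three layers of each branch actually compute, the pooling, normalization, and concatenation steps can be sketched without TensorFlow. This is a minimal pure-Python stand-in, not the Keras implementation; the helper names `mean_pool` and `l2_normalize` are illustrative only:

```python
import math

def mean_pool(seq):
    """Average a sequence of vectors over the time axis,
    analogous to GlobalAveragePooling1D on (timesteps, features)."""
    n, d = len(seq), len(seq[0])
    return [sum(vec[j] for vec in seq) / n for j in range(d)]

def l2_normalize(v):
    """Scale a vector to unit L2 norm, analogous to tf.math.l2_normalize."""
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

# Toy LSTM output sequences: two timesteps each, d_feature = 2.
seq_1 = [[1.0, 2.0], [3.0, 4.0]]   # branch receiving input_1
seq_2 = [[0.0, 2.0], [0.0, 4.0]]   # branch receiving input_2

# Each branch collapses its sequence to one unit-length vector ('mean' then 'out').
v1 = l2_normalize(mean_pool(seq_1))
v2 = l2_normalize(mean_pool(seq_2))

# Concatenate(axis=1) glues the two row vectors side by side, giving 2 * d_feature columns.
conc = v1 + v2

# Because both halves are unit length, cosine similarity reduces to a dot product,
# which is what the downstream triplet loss relies on.
cos_sim = sum(a * b for a, b in zip(v1, v2))
```

Normalizing inside the model (the 'out' Lambda layer) is what makes this simplification possible: the loss never has to divide by vector norms.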