Question: What is the benefit of using mini-batch gradient descent over stochastic gradient descent?

A. It guarantees finding the global minimum of the loss function
B. It balances the computational efficiency of batch gradient descent with the faster convergence of stochastic gradient descent
C. It requires fewer hyperparameters to be tuned
D. It eliminates the need for backpropagation in training

Step by Step Solution

Step 1: Recall the three variants of gradient descent. Batch gradient descent computes the gradient over the entire training set for each update, which is accurate but expensive; stochastic gradient descent (SGD) updates on a single example, which is cheap but noisy; mini-batch gradient descent updates on a small subset of examples at a time.

Step 2: Eliminate the incorrect options. No gradient descent variant guarantees the global minimum of a non-convex loss function (A); mini-batch gradient descent adds a hyperparameter, the batch size, rather than requiring fewer (C); and every variant still relies on backpropagation to compute gradients when training neural networks (D).

Step 3: The correct answer is B. Mini-batch gradient descent balances the computational efficiency of batch gradient descent with the faster convergence of stochastic gradient descent, and its batched updates also vectorize efficiently on modern hardware.
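To make the distinction concrete, here is a minimal NumPy sketch of mini-batch gradient descent applied to linear regression with a mean-squared-error loss. The function name, parameters, and data are illustrative assumptions, not part of the original question.

```python
import numpy as np

def minibatch_gd(X, y, batch_size=32, lr=0.01, epochs=100, seed=0):
    # Illustrative sketch: fit linear weights w minimizing
    # 0.5 * mean((X @ w - y)^2) using mini-batch updates.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        # Shuffle once per epoch so batches are drawn without replacement.
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            # Gradient of the loss averaged over this mini-batch only.
            grad = Xb.T @ (Xb @ w - yb) / len(batch)
            w -= lr * grad
    return w

# Usage: recover known weights from synthetic data (hypothetical example).
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=1000)
print(minibatch_gd(X, y))  # approximately [2.0, -1.0, 0.5]
```

In this sketch, setting batch_size=1 reduces the loop to stochastic gradient descent, while batch_size=len(X) reduces it to full-batch gradient descent; intermediate batch sizes give the trade-off described in option B.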
