Monday, January 2, 2023

CONFUSION KILLER: EPOCH vs BATCH SIZE vs ITERATION


*Epoch:-*

1 epoch = 1 forward pass (FP) + 1 backward pass (BP) over the entire training dataset. We can think of an epoch as a round: for however many epochs (rounds) we set, the neural network goes over all the samples and updates its weights and biases in every round of training.

*Batch Size:-*

Batch size is the number of samples in one batch.

*No. of Batches:-*
No. of batches = Total no. of samples / Batch size

Suppose we have 1000 samples and we make each batch with a size of 50 samples; then the entire dataset is divided into 20 batches.
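
As a quick sanity check, here is the same arithmetic in a minimal Python sketch (the variable names and numbers are just the example values above, not part of any particular framework):

```python
# Minimal sketch: computing the number of batches from the example above.
total_samples = 1000   # total number of training samples
batch_size = 50        # number of samples in one batch

# No. of batches = Total no. of samples / Batch size
num_batches = total_samples // batch_size
print(num_batches)     # 20
```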



*Iterations:*

For the neural network to see all the samples, we have to pass each batch (which contains 50 samples) through it. So here we have to pass 20 batches to complete one forward pass and one backward pass over all the samples; these 20 passes are known as the iterations for 1 epoch.
So one epoch takes 20 iterations for the neural network to see all the samples.

And if we decide to go with 100 epochs, then 20 batches are processed in each epoch, and over 100 epochs a total of 2000 batches (20 * 100) are processed.
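
To tie the three terms together, here is a minimal plain-Python sketch of how epochs, batches, and iterations relate; `train_step` is a hypothetical placeholder for one forward and backward pass on a batch, not a function from any specific library:

```python
# Minimal sketch of the epoch / batch / iteration relationship.
total_samples = 1000
batch_size = 50
epochs = 100

num_batches = total_samples // batch_size   # 20 iterations per epoch

def train_step(batch_index):
    # Hypothetical placeholder: the forward pass, loss, backward pass,
    # and weight update for one batch would go here.
    pass

total_iterations = 0
for epoch in range(epochs):                  # 100 rounds over the full dataset
    for iteration in range(num_batches):     # 20 iterations in each epoch
        train_step(iteration)
        total_iterations += 1

print(total_iterations)   # 2000 batches processed in total (20 * 100)
```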

                                                            By - Uttam Kumar

