Difference between batch_size and epoch
Suppose you want to train your machine learning model with some data. That data is called the training data.
The thing is, for a large training set you can't feed the whole thing to your model at once because of limits on computer memory.
So what we do is split the whole training set into smaller batches that can fit into memory at once. The number of samples in each batch is the batch_size.
We then feed these batches one by one to our model for training.
When all the batches have been fed exactly once, you have completed what is called an epoch. Basically, one epoch is equivalent to showing your model the whole training set once.
You usually have to do this many times for training to succeed, which is why we train for multiple epochs.
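To make this concrete, here is a minimal sketch of the epoch/batch loop using NumPy. The dataset size, batch_size, and number of epochs are made-up illustrative values, and the actual model update is left as a placeholder comment:

```python
import numpy as np

# Hypothetical training set: 1000 samples, 10 features each (illustrative only).
num_samples = 1000
X = np.random.rand(num_samples, 10)

batch_size = 32   # samples fed to the model per step
num_epochs = 3    # full passes over the training data

# Number of batches needed to cover the whole dataset once (one epoch).
batches_per_epoch = int(np.ceil(num_samples / batch_size))

for epoch in range(num_epochs):            # each epoch = one full pass
    for i in range(batches_per_epoch):
        batch = X[i * batch_size : (i + 1) * batch_size]
        # model.train_on_batch(batch, ...)  # placeholder for the update step
```

With 1000 samples and a batch_size of 32, each epoch consists of 32 batches: 31 full batches of 32 samples plus one final batch of 8.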