To understand the difference between these terms, you first need to understand the gradient descent algorithm and its variants.
Before I start with the actual answer, I would like to build some background.
A batch is the complete dataset. Its size is the total number of training examples in the available dataset.
A mini-batch is a small part of the dataset, of a given mini-batch size.
The mini-batch size is the number of training examples the learning algorithm processes in a single pass (one forward and one backward pass).
The number of iterations is the number of mini-batches of data the algorithm has processed, i.e. the number of parameter updates it has made (each iteration is one forward and backward pass over a single mini-batch).
The number of epochs is the number of times the learning algorithm sees the complete dataset. This is generally not equal to the number of iterations, because the dataset is usually processed in mini-batches and a single iteration covers only part of the data: with N training examples and a mini-batch size of b, one epoch takes N/b (rounded up) iterations.
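The relationship between these quantities is just arithmetic; here is a minimal sketch, where the numbers (1,000 examples, mini-batch size 100, 5 epochs) are made up purely for illustration:

```python
import math

num_examples = 1000      # total training examples (the "batch" in the sense above)
mini_batch_size = 100    # examples processed per forward/backward pass
num_epochs = 5           # complete passes over the dataset

# One epoch = enough iterations to cover the whole dataset once.
iterations_per_epoch = math.ceil(num_examples / mini_batch_size)   # 10
total_iterations = num_epochs * iterations_per_epoch               # 50

print(iterations_per_epoch, total_iterations)
```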
In the case of batch gradient descent, the whole dataset (the full batch) is processed in each training step, so every update uses the exact gradient. Convergence is therefore smoother than with mini-batch gradient descent, but each step is more expensive and training takes longer. Batch gradient descent is guaranteed to converge to the global minimum for convex error surfaces, and to a local minimum for non-convex ones, provided the learning rate is chosen appropriately.
Stochastic gradient descent is a special case of mini-batch gradient descent in which the mini-batch size is 1, so each parameter update is computed from a single training example.
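To see how the three variants relate, here is a minimal sketch, assuming plain NumPy and a simple linear-regression (mean-squared-error) loss; the model, data shapes, and learning rate are illustrative assumptions, not part of the original answer. The same loop gives batch gradient descent when `batch_size` equals the dataset size, stochastic gradient descent when `batch_size` is 1, and mini-batch gradient descent for anything in between.

```python
import numpy as np

def gradient_descent(X, y, batch_size, epochs=10, lr=0.01):
    """Mini-batch gradient descent on the MSE loss of a linear model.

    batch_size == len(X) -> batch gradient descent
    batch_size == 1      -> stochastic gradient descent
    otherwise            -> mini-batch gradient descent
    """
    n, d = X.shape
    w = np.zeros(d)                                    # model parameters
    for epoch in range(epochs):                        # one epoch = one full pass over the data
        indices = np.random.permutation(n)             # shuffle once per epoch
        for start in range(0, n, batch_size):          # one iteration per mini-batch
            batch = indices[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            grad = 2 * Xb.T @ (Xb @ w - yb) / len(batch)   # MSE gradient on this mini-batch
            w -= lr * grad                                  # one parameter update per iteration
    return w

# Toy data, purely for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=1000)

w_batch = gradient_descent(X, y, batch_size=len(X))   # batch gradient descent
w_sgd   = gradient_descent(X, y, batch_size=1)        # stochastic gradient descent
w_mini  = gradient_descent(X, y, batch_size=32)       # mini-batch gradient descent
```

Note how the noisiness of the updates grows as the mini-batch shrinks: the full batch gives the exact gradient at each step, while a mini-batch (or a single example) gives only a noisy estimate of it, which is why its convergence looks less smooth.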