Difference between Steps, Batch size, and Epoch in TensorFlow

https://stackoverflow.com/questions/42816124/steps-vs-num-epochs-in-tensorflow-getting-started-tutorial
Let's go through them in reverse order:

1) Steps - the number of times the training loop of your learning algorithm runs to update the parameters of the model. In each iteration it processes one chunk of data, which is one batch. Usually this loop is based on the Gradient Descent algorithm.

2) Batch size - the size of the chunk of data you feed to the learning algorithm at each step. You can feed the whole data set, in which case the batch size equals the data set size; you can feed one example at a time; or you can feed some number N of examples.

3) Epoch - the number of complete passes you make over the data set, extracting batches to feed to the learning algorithm. The sketch after this list ties the three terms together.
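
Here is a minimal TF 2.x sketch of these three terms (my own illustration, not from the original answer); the tiny linear model, the synthetic data, and the variable names are assumptions made just for the example:

```python
import tensorflow as tf

num_examples, batch_size, num_epochs = 1000, 100, 1

# Synthetic data for a toy linear-regression problem (illustrative only).
x = tf.random.normal([num_examples, 1])
y = 3.0 * x + 2.0

dataset = (tf.data.Dataset.from_tensor_slices((x, y))
           .batch(batch_size)      # batch size: examples fed per step
           .repeat(num_epochs))    # epochs: passes over the data set

w = tf.Variable(0.0)
b = tf.Variable(0.0)
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

step = 0
for batch_x, batch_y in dataset:   # each loop iteration is one step
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square(w * batch_x + b - batch_y))
    grads = tape.gradient(loss, [w, b])
    optimizer.apply_gradients(zip(grads, [w, b]))  # one parameter update
    step += 1

print(step)  # 10: (1000 examples / 100 per batch) * 1 epoch
```

Each pass through the `for` loop is one step, each step consumes one batch, and `repeat(num_epochs)` controls how many passes over the data the loop makes.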

Say you have 1000 examples. With batch size = 100, one epoch (one pass over the entire data set) consists of 1000 / 100 = 10 batches, and each batch processed is one step, so one epoch takes 10 steps. If you set epochs = 25, the process repeats 25 times and you get 25 x 10 = 250 steps, with 250 batches seen altogether. A steps setting acts as a cap on the total number of parameter updates: with steps = 200, training stops after 200 steps (i.e., 20 full epochs), even if more epochs were requested.
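
The same bookkeeping in a few lines of Python (the variable names are just for illustration):

```python
num_examples = 1000
batch_size   = 100
num_epochs   = 25

steps_per_epoch = num_examples // batch_size    # 10 batches per pass
total_steps     = steps_per_epoch * num_epochs  # 250 parameter updates

# With a cap of steps = 200, training stops at whichever limit comes first:
max_steps    = 200
actual_steps = min(total_steps, max_steps)      # 200 steps = 20 full epochs

print(steps_per_epoch, total_steps, actual_steps)  # 10 250 200
```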

Why do we need these knobs? There are many variations of gradient descent (batch, stochastic, mini-batch), as well as other algorithms for optimizing the model parameters (e.g., L-BFGS). Some of them need to see the data in batches, while others see one datum at a time. Also, some of them include random factors/steps, so you might need multiple passes over the data to get good convergence. The sketch below shows how the classic gradient-descent variants differ only in the batch size you choose.
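
A hypothetical comparison (names and numbers are my own, for illustration) of how batch size alone distinguishes the three variants:

```python
# The classic gradient-descent flavors differ only in how many examples
# contribute to each step's gradient estimate.
num_examples = 1000

variants = {
    "batch GD":      num_examples,  # whole data set per step
    "stochastic GD": 1,             # one example per step
    "mini-batch GD": 100,           # N examples per step
}

for name, bs in variants.items():
    steps_per_epoch = -(-num_examples // bs)  # ceiling division
    print(f"{name:>13}: batch size {bs:>4} -> {steps_per_epoch} steps per epoch")
```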
