Dropout is a technique used to tackle overfitting. The Dropout layer in the keras.layers module takes a float value between zero and one, which is the fraction of neurons to drop from a particular layer. When the layer is applied during training, the activations of some randomly chosen nodes are set to zero. This is a way to prevent overfitting.
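A minimal sketch of what a dropout layer does to activations during training (plain Python for illustration, not the actual Keras implementation; the rate and activation values here are made up):

```python
import random

def dropout(activations, rate, seed=None):
    """Zero out roughly a fraction `rate` of the activations and scale the
    survivors by 1/(1 - rate) (inverted dropout, as done at training time)."""
    rng = random.Random(seed)
    keep = 1.0 - rate
    return [a / keep if rng.random() < keep else 0.0 for a in activations]

acts = [0.2, 1.5, 0.7, 0.9, 1.1, 0.3]
dropped = dropout(acts, rate=0.5, seed=0)
print(dropped)  # some activations are zeroed, the rest are scaled up
```

At prediction time dropout is turned off, which is why the kept activations are scaled up during training: the expected magnitude of the layer's output stays the same.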
Validation_data or Validation_set:
With a validation set, you essentially take a fraction of the samples out of your training set, or create an entirely new set altogether, and hold the samples in this set out of training.
During each epoch, the model will be trained on the samples in the training set but will not be trained on the samples in the validation set. Instead, the model will be validated on each sample in the validation set.
The purpose of doing this is to judge how well your model can generalize, meaning how well the model is able to predict on data it has not seen while being trained.
Having a validation set also provides great insight into whether your model is overfitting. This can be judged by comparing the accuracy and loss on your training samples to the val_acc and val_loss on the validation samples. For example, if your acc is high but your val_acc is lagging way behind, that is a good indication that your model is overfitting.
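In Keras you can hand the held-out samples to model.fit through the validation_split or validation_data arguments. The hold-out idea itself can be sketched in plain Python (the data and split fraction here are made up):

```python
import random

def train_val_split(samples, val_fraction=0.1, seed=42):
    """Hold out a fraction of the samples as a validation set;
    the model never trains on these held-out samples."""
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    n_val = int(len(shuffled) * val_fraction)
    return shuffled[n_val:], shuffled[:n_val]  # train, validation

data = list(range(100))
train, val = train_val_split(data, val_fraction=0.1)
print(len(train), len(val))  # 90 10
```

The two sets are disjoint, so the accuracy measured on the validation set reflects performance on data the model has never trained on.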
By setting verbose to 0, 1, or 2, you choose how you want to see the training progress for each epoch.
- verbose = 0 will show nothing
- verbose = 1 will show a progress bar. Ex: Train on 2684 samples, validate on 300 samples Epoch 1/2 2684/2684 [==============] loss: 0.1 acc: 0.9 val_loss: 0.2 val_acc: 0.8
- verbose = 2 will show one line per epoch. Ex: Train on 2684 samples, validate on 300 samples Epoch 1/1
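The difference between the settings can be sketched with a small helper that mimics the shape of the per-epoch log line (format_epoch_log is a made-up function for illustration, not part of Keras; the metric values are invented):

```python
def format_epoch_log(epoch, epochs, loss, acc, val_loss, val_acc, verbose):
    """Mimic the per-epoch log line: silent for verbose=0,
    one summary line per epoch otherwise."""
    if verbose == 0:
        return None  # verbose=0 shows nothing
    return (f"Epoch {epoch}/{epochs} - loss: {loss:.1f} acc: {acc:.1f} "
            f"val_loss: {val_loss:.1f} val_acc: {val_acc:.1f}")

print(format_epoch_log(1, 2, 0.1, 0.9, 0.2, 0.8, verbose=2))
# Epoch 1/2 - loss: 0.1 acc: 0.9 val_loss: 0.2 val_acc: 0.8
```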
MLP for Binary Classification :
The accuracy of the model is not that good because we are not using a validation set in this model, so to see the practical use of one, check the link to our previous blog.
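A minimal sketch of what such an MLP computes at prediction time, written in plain Python (the weights below are made up for illustration; a real Keras model would learn them during training):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mlp_predict(x, w_hidden, b_hidden, w_out, b_out):
    """One ReLU hidden layer, a sigmoid output unit, threshold at 0.5 -
    the basic shape of an MLP for binary classification."""
    hidden = [max(0.0, sum(wi * xi for wi, xi in zip(w, x)) + b)
              for w, b in zip(w_hidden, b_hidden)]
    p = sigmoid(sum(wo * h for wo, h in zip(w_out, hidden)) + b_out)
    return 1 if p >= 0.5 else 0

# made-up weights for a 2-input, 2-hidden-unit network
w_hidden = [[1.0, -1.0], [-1.0, 1.0]]
b_hidden = [0.0, 0.0]
w_out = [1.0, 1.0]
b_out = -0.5
print(mlp_predict([2.0, 0.0], w_hidden, b_hidden, w_out, b_out))  # 1
```

The sigmoid output squashes the final score into a probability between 0 and 1, which is why thresholding at 0.5 gives a binary class label.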