Dropout regularization experiments
The following are two experiments that we performed using a network of the same size, with different dropout rates, to observe the differences in performance. We started with a dropout rate of 0.1 and progressively scaled it up to 0.6 to see how this affected our performance in recognizing handwritten digits. As we can see in the following diagram, increasing our dropout rate seems to reduce overfitting, as the model's superficial accuracy on the training set progressively drops. We can also see that our training and test accuracy converge near a dropout rate of 0.5, after which they exhibit divergent behavior. This simply tells us that the network seems to overfit the least when dropout layers with a rate of 0.5 are added:


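To make the setup concrete, the following is a minimal sketch of such a dropout-rate sweep in Keras, assuming MNIST digits and a small fully connected network. The layer sizes, optimizer, number of epochs, and the exact list of rates are illustrative assumptions, not necessarily the configuration used in the experiments above.

```python
# A hedged sketch of a dropout-rate sweep on MNIST: the same architecture is
# rebuilt for each rate, and training vs. test accuracy is reported so the
# gap between them (a rough proxy for overfitting) can be compared.
from tensorflow.keras.datasets import mnist
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout
from tensorflow.keras.utils import to_categorical

# Load and flatten the MNIST digits, scaling pixel values to [0, 1]
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train = x_train.reshape(60000, 784).astype("float32") / 255.0
x_test = x_test.reshape(10000, 784).astype("float32") / 255.0
y_train, y_test = to_categorical(y_train), to_categorical(y_test)

def build_model(dropout_rate):
    """Same network every time; only the dropout rate changes."""
    model = Sequential([
        Dense(512, activation="relu", input_shape=(784,)),
        Dropout(dropout_rate),
        Dense(512, activation="relu"),
        Dropout(dropout_rate),
        Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="rmsprop",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Sweep dropout rates from 0.1 to 0.6 and record train vs. test accuracy
for rate in [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]:
    model = build_model(rate)
    history = model.fit(x_train, y_train, epochs=10, batch_size=128, verbose=0)
    train_acc = history.history["accuracy"][-1]
    _, test_acc = model.evaluate(x_test, y_test, verbose=0)
    print(f"dropout={rate:.1f}  train_acc={train_acc:.4f}  test_acc={test_acc:.4f}")
```

If the behavior described above holds, the printed training accuracy should fall as the dropout rate grows, with the train and test figures sitting closest together around a rate of 0.5.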