Dropout is a regularization technique used to prevent overfitting in neural networks.
During training, randomly selected neurons are ignored ("dropped out"): they are temporarily removed from the network, so they contribute nothing to the activations of downstream neurons on the forward pass, and no weight updates are applied to them on the backward pass.
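The mechanism above can be sketched in a few lines of NumPy. This is a minimal illustration using the common "inverted dropout" variant, in which surviving activations are scaled up during training so that no rescaling is needed at inference time; the function name and signature are hypothetical, not from any particular library.

```python
import numpy as np

def dropout_forward(x, p_drop, training=True, rng=None):
    """Inverted dropout (illustrative sketch): zero each activation with
    probability p_drop, then scale survivors by 1/(1 - p_drop) so the
    expected activation is unchanged."""
    if not training or p_drop == 0.0:
        return x  # at inference time, dropout is a no-op
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p_drop  # True = neuron kept
    return x * mask / (1.0 - p_drop)

# Example: with p_drop=0.5, roughly half the activations become 0
# and the survivors are doubled to preserve the expected value.
activations = np.ones((4, 5))
dropped = dropout_forward(activations, p_drop=0.5)
```

At inference the function simply returns its input, matching the usual practice of disabling dropout outside of training.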
Use the slider to simulate dropout percentage. Notice how connections disappear as neurons are deactivated, forcing the network to learn more robust features that don't rely on specific neurons.