Regularizing neural networks with dropout and with DropConnect
31 points by ZygmuntZ 11 years ago | 2 comments

- gamegoblin 11 years ago
If anyone doesn't fully understand and is looking for a quick TL;DR:
Neural networks are non-linear function approximators that use artificial neurons with connections between the neurons to do calculations. Dropout is a technique in which one randomly turns off neurons during training to make the network more generalized by making neurons less strongly interdependent. DropConnect furthers this idea by, rather than turning off entire neurons, merely turning off single connections between neurons.
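The distinction can be sketched in a few lines of NumPy: dropout draws one Bernoulli mask per neuron and zeroes whole outputs, while DropConnect draws one mask per weight and zeroes individual connections. This is an illustrative toy layer, not code from either paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear layer: 4 inputs -> 3 neurons (values are arbitrary).
W = rng.standard_normal((3, 4))
x = rng.standard_normal(4)
p = 0.5  # keep probability

# Dropout: one Bernoulli draw per neuron; a dropped neuron outputs nothing.
neuron_mask = rng.random(3) < p
dropout_out = (W @ x) * neuron_mask

# DropConnect: one Bernoulli draw per connection (weight); a neuron can
# survive with only some of its incoming connections severed.
weight_mask = rng.random(W.shape) < p
dropconnect_out = (W * weight_mask) @ x

print(dropout_out)
print(dropconnect_out)
```

Note that dropout is the special case of DropConnect where all weights feeding a given neuron share one mask bit, which is the sense in which DropConnect generalizes it.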
That being said, I follow Hinton pretty closely, and I'm a bit surprised he or one of his students didn't think of this generalization when developing dropout in the first place! It seems like a pretty natural extension.
- cracker_jacks 11 years ago
Because it's not clear that DropConnect actually outperforms dropout. Also, this is a generalization only in a very weak sense; DropConnect is very unlikely to behave the same way as dropout.