Batch normalization is a technique for improving the performance and stability of artificial neural networks. It provides any layer in a neural network with inputs that have zero mean and unit variance over each mini-batch, and then adjusts and scales the normalized activations with learned parameters. The technique was introduced by Sergey Ioffe and Christian Szegedy in a 2015 paper.
- "Understanding the backward pass through Batch Normalization Layer". kratzert.github.io. Retrieved 24 April 2018.
- Ioffe, Sergey; Szegedy, Christian (2015). "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift" (PDF). arXiv:1502.03167.
- "Glossary of Deep Learning: Batch Normalisation". medium.com. Retrieved 24 April 2018.
- "Batch normalization in Neural Networks". towardsdatascience.com. Retrieved 24 April 2018.