Document Type
Article
Publication Date
3-25-2020
Abstract
Weight-sharing is one of the pillars behind Convolutional Neural Networks and their successes. However, in physical neural systems such as the brain, weight-sharing is implausible. This discrepancy raises the fundamental question of whether weight-sharing is necessary. If so, to what degree of precision? If not, what are the alternatives? The goal of this study is to investigate these questions, primarily through simulations where the weight-sharing assumption is relaxed. Taking inspiration from neural circuitry, we explore the use of Free Convolutional Networks and neurons with variable connection patterns. Using Free Convolutional Networks, we show that while weight-sharing is a pragmatic optimization approach, it is not a necessity in computer vision applications. Furthermore, Free Convolutional Networks match the performance observed in standard architectures when trained using properly translated data (akin to video). Under the assumption of translationally augmented data, Free Convolutional Networks learn translationally invariant representations that yield an approximate form of weight-sharing.
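The core idea of relaxing weight-sharing can be illustrated with a convolution-like layer in which every output position keeps its own filter instead of reusing one shared kernel. The sketch below is illustrative only, not the authors' code: it assumes a PyTorch implementation, and the class name LocallyConnected2d and its initialization are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LocallyConnected2d(nn.Module):
    """Convolution-like layer with untied (non-shared) weights:
    each output location has its own filter, in the spirit of the
    'free' convolutional layers described in the abstract."""

    def __init__(self, in_channels, out_channels, in_size, kernel_size, stride=1):
        super().__init__()
        h, w = in_size
        k = kernel_size
        self.out_h = (h - k) // stride + 1
        self.out_w = (w - k) // stride + 1
        self.kernel_size = k
        self.stride = stride
        n_positions = self.out_h * self.out_w
        # One independent filter per output position:
        # shape (positions, out_channels, in_channels * k * k).
        self.weight = nn.Parameter(
            torch.randn(n_positions, out_channels, in_channels * k * k) * 0.01
        )
        self.bias = nn.Parameter(torch.zeros(n_positions, out_channels))

    def forward(self, x):
        # x: (batch, in_channels, h, w)
        # Extract sliding patches: (batch, in_channels*k*k, positions).
        patches = F.unfold(x, self.kernel_size, stride=self.stride)
        patches = patches.transpose(1, 2)  # (batch, positions, in_channels*k*k)
        # Per-position matrix multiply; no filter is reused across positions.
        out = torch.einsum('blf,lof->blo', patches, self.weight) + self.bias
        # (batch, positions, out_channels) -> (batch, out_channels, out_h, out_w)
        return out.transpose(1, 2).reshape(
            x.size(0), self.weight.size(1), self.out_h, self.out_w
        )

# Example: on a 28x28 input with a 5x5 kernel and 8 output channels,
# a shared-weight convolution has 8*1*5*5 = 200 weights, while this
# untied layer has 24*24*8*25 = 115,200, illustrating why weight-sharing
# is a pragmatic optimization even if it is not strictly necessary.
layer = LocallyConnected2d(in_channels=1, out_channels=8,
                           in_size=(28, 28), kernel_size=5)
y = layer(torch.randn(2, 1, 28, 28))  # -> (2, 8, 24, 24)
```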
Recommended Citation
Ott, J., Linstead, E., LaHaye, N., & Baldi, P. (2020). Learning in the machine: To share or not to share? Neural Networks, 126, 235-249. https://doi.org/10.1016/j.neunet.2020.03.016
Copyright
The authors
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Included in
Computer and Systems Architecture Commons, Nervous System Commons, Neurosciences Commons, Other Computer Sciences Commons, Other Electrical and Computer Engineering Commons, Systems Architecture Commons
Comments
This article was originally published in Neural Networks, volume 126, in 2020. https://doi.org/10.1016/j.neunet.2020.03.016