Neural networks that grow themselves from noise
Nature builds brains from a single cell. This study shows how an artificial network can do something similar - growing itself from "noise".
Inspired by the early visual system (retina to LGN), the authors propose a simple developmental algorithm that starts with one "cell" and self-organizes a functional, layered network. The key ingredients: spontaneous, wave-like activity in the first layer (random-looking but structured) and a local learning rule in the second layer that captures that pattern. Together, they wire up a convolutional pooling layer - no hand-designed architecture required.
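To make the idea concrete, here is a minimal sketch (not the authors' exact algorithm): spontaneous, spatially structured "wave" activity on a 1-D input layer, plus a simple winner-take-all Hebbian rule in the second layer. Under these assumptions, second-layer units tend to specialize to contiguous patches of the input, i.e. pooling-like receptive fields. All names and parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_OUT = 64, 8          # layer-1 ("retina") and layer-2 sizes
STEPS, LR = 5000, 0.05       # training steps and learning rate

def spontaneous_wave(n, width=6.0):
    """Random-looking but spatially structured activity: a Gaussian bump
    at a random position (a crude stand-in for retinal waves)."""
    center = rng.uniform(0, n)
    x = np.arange(n)
    return np.exp(-0.5 * ((x - center) / width) ** 2)

# Random initial feed-forward weights, kept non-negative and normalized.
W = rng.random((N_OUT, N_IN))
W /= W.sum(axis=1, keepdims=True)

for _ in range(STEPS):
    a_in = spontaneous_wave(N_IN)    # layer-1 spontaneous activity
    a_out = W @ a_in                 # layer-2 response
    winner = np.argmax(a_out)        # competition: strongest unit learns
    # Local Hebbian update for the winning unit only (winner-take-all).
    W[winner] += LR * a_in * a_out[winner]
    W[winner] /= W[winner].sum()     # keep weights bounded

# After training, each layer-2 unit should pool a contiguous patch of layer 1.
for i, w in enumerate(W):
    patch = np.flatnonzero(w > w.max() * 0.5)
    print(f"unit {i}: pools inputs {patch.min()}..{patch.max()}")
```

The paper's mechanism is richer (2-D wave dynamics, growth from a single cell, tolerance to faulty units), but the sketch captures the core pairing: structured spontaneous activity plus a purely local learning rule.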
Why it matters: the method adapts to many input geometries, tolerates malfunctioning units, and can grow different pool sizes and shapes. It’s a primitive but concrete step toward AI systems that develop rather than being painstakingly engineered.
Paper: http://arxiv.org/abs/1906.01039v1
Authors: Guruprasad Raghavan, Matt Thomson
Register: https://www.AiFeta.com
#AI #NeuralNetworks #DeepLearning #SelfOrganization #BioInspired #Neuroscience #MachineLearning #Research