A deep neural network of light
DOI: 10.1063/PT.3.3654
Our brain’s neurons are, in essence, living logic gates: They take averages of the signals they receive from their neighbors and, depending on the results, either fire or don’t. In artificial neural networks, that process is replicated using matrix multiplication, but the task can be time- and energy-intensive. Now a study by Marin Soljačić, Dirk Englund (both at MIT), and colleagues has demonstrated that the matrix operations underlying neural network computing can be performed quickly and efficiently using photonic circuits. The team’s circuits exploit micron-sized beamsplitters and programmable phase shifters to manipulate input signals from an array of neurons and compute the values that determine neuronal responses. Couplings between waveguides can be adjusted to mimic how neuronal connections strengthen and weaken during real learning.
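The numerical idea behind such a photonic mesh can be sketched with a few lines of linear algebra: any layer’s weight matrix can be factored by singular-value decomposition into two unitary matrices (the kind of transformation a mesh of beamsplitters and phase shifters can realize) and a diagonal scaling. The snippet below is a minimal illustration of that decomposition, not the authors’ implementation; the 2×2 `mzi` parameterization is an assumed stand-in for one programmable interferometer element.

```python
# Sketch: factor an arbitrary weight matrix into pieces a photonic mesh could
# realize -- M = U @ diag(s) @ Vh, with U and Vh unitary and diag(s) a scaling.
import numpy as np

rng = np.random.default_rng(0)
M = rng.normal(size=(4, 4))       # an arbitrary 4x4 layer weight matrix

U, s, Vh = np.linalg.svd(M)       # singular-value decomposition

# The two unitaries plus the diagonal reproduce the original matrix exactly.
assert np.allclose(U @ np.diag(s) @ Vh, M)

def mzi(theta, phi):
    """One 2x2 beamsplitter-plus-phase-shifter unitary (illustrative form);
    meshes realizing U and Vh are built from many such programmable blocks."""
    return np.array([[np.exp(1j * phi) * np.cos(theta), -np.sin(theta)],
                     [np.exp(1j * phi) * np.sin(theta),  np.cos(theta)]])

# Verify the building block is unitary.
block = mzi(0.3, 1.2)
print(np.allclose(block.conj().T @ block, np.eye(2)))
```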

The researchers used the photonic circuits to build a deep neural network—one comprising several layers of artificial neurons—and trained it to recognize simple speech. After exposure to 180 recordings of four different vowel sounds spoken in a variety of voices, the neural network correctly classified subsequent clips of the vowel sounds more than 75% of the time. That accuracy should improve as the team incorporates more neurons into the network. The proof-of-concept network had just four neurons per layer, but state-of-the-art nanofabrication techniques should allow thousands. As of now, the researchers’ network still relies on conventional electronics to simulate neuronal firing. But if that step can be implemented optically—say, with nonlinear dyes or semiconductors—only the training stage will require electronics. Once trained, the network would then be able to perform calculations two orders of magnitude faster than its fully electronic counterparts. (Y. Shen et al., Nat. Photonics 11, 441, 2017, doi:10.1038/nphoton.2017.93.)
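The division of labor described above—linear matrix products handled by the optics, the nonlinear “firing” step handled electronically—can be illustrated with a toy forward pass. This is a sketch under stated assumptions, not the reported system: the four-neuron layer width mirrors the proof of concept, while the saturable nonlinearity is an assumed placeholder for whatever optical or electronic response ultimately plays that role.

```python
# Toy two-layer forward pass: the SVD-factored linear step represents what the
# photonic mesh computes; the nonlinearity represents the separate "firing"
# step that the prototype still performs electronically.
import numpy as np

rng = np.random.default_rng(1)

def photonic_linear(M, x):
    """Apply M through its SVD factors, as a unitary-mesh/attenuator cascade would."""
    U, s, Vh = np.linalg.svd(M)
    return U @ (s * (Vh @ x))

def firing_nonlinearity(z, sat=1.0):
    """Assumed saturable response standing in for the neuronal firing step."""
    return z / (1.0 + np.abs(z) / sat)

layers = [rng.normal(size=(4, 4)) for _ in range(2)]  # two four-neuron layers
x = rng.normal(size=4)                                 # input signal vector
for M in layers:
    x = firing_nonlinearity(photonic_linear(M, x))
print(x)
```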