Brain-Inspired Computing Using Spintronic Technologies

Matthew Daniels, NIST

July 17, 2020

As the demand for brain-like computing increases, the quest to build neuromorphic computers turns toward integrating circuits with nanodevices that possess stochastic, analog, or other "brain-like" properties. In this talk, I explore the role that thermally fluctuating nanomagnets can play in on-chip machine learning, and throughout I trace our journey as physicists into this new area. Starting from a desire to compute "energy efficiently" and "like the brain," we quickly found that these two goals are not always compatible, and that it serves us better to understand and augment modern CMOS technology than to try to replace its existing functionality. I show how, with a few simple circuits, superparamagnetic tunnel junctions can serve as the primitive elements of an energy-efficient computing scheme based on strings of random bits. I explain how this randomness-based computing enables a very simple neural-network construction that requires less energy and area than would be possible without spintronic technologies. Finally, I discuss how the insights we gained from learning about adjacent fields in computer and electrical engineering shape our perspective on what it means, going forward, to work at the physics/engineering interface.
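The abstract mentions a computing scheme based on strings of random bits. As background, the canonical trick in stochastic computing, which is not necessarily the exact scheme presented in the talk, is to encode a probability as the mean of a random bitstream, so that a single AND gate multiplies two such values. A minimal sketch of this idea (all names here are illustrative, not from the talk):

```python
import random

def bitstream(p, n, rng):
    """Length-n random bitstream whose mean encodes the probability p.
    In hardware, a superparamagnetic tunnel junction could supply these
    random bits; here we use a pseudorandom generator for illustration."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def stochastic_multiply(a, b):
    """Bitwise AND of two independent streams: the output stream's mean
    is (approximately) the product of the input probabilities."""
    return [x & y for x, y in zip(a, b)]

rng = random.Random(42)   # fixed seed for reproducibility
n = 100_000
sa = bitstream(0.6, n, rng)
sb = bitstream(0.5, n, rng)
prod = stochastic_multiply(sa, sb)
estimate = sum(prod) / n  # ≈ 0.6 * 0.5 = 0.3, up to sampling noise
```

The appeal for neural networks is that the multiply, normally an expensive digital circuit, collapses to a single logic gate, at the cost of precision that improves only as the stream length grows.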