Artificial neural networks are everywhere now because they are so useful, doing everything from predicting the weather to writing essays and translating them into Spanish. Their abilities have exploded over the last dozen years as the networks have grown far larger. Unfortunately, bigger neural networks also require more energy to train and run. Our brains, by contrast, use only about a fifth of the calories we eat, yet learn and perform a far greater variety of tasks. How does the network of neurons that makes up the brain manage all this at such low energy cost? The answer is that it does not rely on a central computer: it learns on its own, with each neuron updating its connections without knowing what all the other neurons are doing. Liu has developed an approach to learning that shares this key property but is far simpler than the brain's, exploiting physics itself to learn and perform tasks for us. Using this approach, she and her collaborators have built physical systems that learn and carry out machine learning on their own. Liu's work establishes a new paradigm for scalable learning.
ABOUT ANDREA LIU
Andrea Liu is a theoretical soft and living matter physicist. She was a faculty member in the Department of Chemistry and Biochemistry at UCLA for ten years before joining the Department of Physics and Astronomy at the University of Pennsylvania in 2004, where she is the Hepburn Professor of Physics and Director of the Center for Soft and Living Matter. She is a fellow of the American Physical Society (APS), the American Association for the Advancement of Science (AAAS), and the American Academy of Arts and Sciences, and a member of the National Academy of Sciences (NAS). Liu is currently a Councilor of the AAAS and the NAS, as well as a member of the Committee on Human Rights of the National Academies.