New Yorker: Computer software is advancing to the point that machines are beginning to be able to interpret not just human speech but also human emotions. They do so by scanning faces and noting so-called “action units”—key features, such as the eyebrows or the corners of the lips, that move when the person reacts to a situation. Beginning in the 1960s, research psychologist Paul Ekman worked to define human emotions as simple geometrical patterns of those deformable points in combination with nondeformable ones, such as the eyes and nose. His system of facial movements, called the Facial Action Coding System, has become the basis of many software programs used in a variety of fields, from computer animation to law enforcement. This New Yorker article by Raffi Khatchadourian focuses on Rana el Kaliouby, an Egyptian scientist living near Boston, whose company has developed a facial-reading program called Affdex.
January 29, 2026 12:52 PM