Washington Post: The way our brains interpret visual stimuli has been translated into video. Jack Gallant of the University of California, Berkeley, and colleagues used functional magnetic resonance imaging (fMRI) to measure blood flow in the brain's visual cortex while test subjects watched video clips. A computer program recorded the participants' brain activity and learned to associate the visual patterns in the clips with the corresponding neural responses. The participants then viewed a second set of clips, and the resulting brain activity was used to test the reconstruction algorithm: the program was fed 18 million seconds of random YouTube video that the participants had never seen, chose the hundred clips most likely, given the recorded brain activity, to resemble the video the subjects had actually watched, and combined them into a rough reconstruction of the original footage. The process, while indirect, is still a dramatic and somewhat eerie demonstration of how the human brain sees things.
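The selection-and-averaging step the article describes can be sketched roughly as follows. This is a toy illustration with random data, not the study's code: every variable name, dimension, and the use of simple correlation as the matching score are assumptions for demonstration purposes.

```python
import numpy as np

# Toy sketch of "pick the 100 best-matching clips and average them".
# Random numbers stand in for real fMRI data and video frames throughout.
rng = np.random.default_rng(0)

n_voxels = 500      # hypothetical number of visual-cortex voxels
n_library = 10_000  # stand-in for the 18 million seconds of YouTube video
frame_dim = 64      # hypothetical size of a clip's flattened frame
top_k = 100

# A library of candidate clips, plus the brain response a trained
# encoding model would predict for each clip (random stand-ins here).
library_frames = rng.standard_normal((n_library, frame_dim))
predicted = rng.standard_normal((n_library, n_voxels))

# Brain activity actually measured while the subject watched the target clip.
measured = rng.standard_normal(n_voxels)

# Score each candidate by the correlation between its predicted response
# and the measured response.
pred_z = (predicted - predicted.mean(1, keepdims=True)) / predicted.std(1, keepdims=True)
meas_z = (measured - measured.mean()) / measured.std()
scores = pred_z @ meas_z / n_voxels

# Average the 100 best-matching clips into a rough reconstruction.
best = np.argsort(scores)[-top_k:]
reconstruction = library_frames[best].mean(axis=0)
print(reconstruction.shape)  # → (64,)
```

The averaging step is why the reconstructions look blurry and dreamlike: the output is a blend of the hundred closest matches, not any single clip.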