What Hollywood gets wrong (and right) about neuroscience


By Eric H. Chudler


Become a martial arts expert by uploading the ability to fight directly to your brain. Build a new body and insert the mind of a lost loved one into this newly created person. Actor Keanu Reeves did both—at least his characters did in the movies The Matrix (1999) and Replicas (2018). Perhaps you would like to tap into the 90% of your brain that you do not use to improve your memory and thinking power. Actor Bradley Cooper did this playing the character Eddie Morra in Limitless (2011), as did Scarlett Johansson as the title character in Lucy (2014). Do Hollywood movies get the neuroscience right? If not, what is the harm? Perhaps Hollywood will provide ideas and inspiration for scientists to pursue new innovations in neuroscience and neurotechnology.

The brain science depicted in many Hollywood blockbusters sounds wonderful and could have far-reaching benefits for people who suffer from diseases or disorders of the nervous system. Such innovations would also appeal to those who want to boost their cognitive abilities, if only the science were real. At present, our knowledge of how the brain codes information is incomplete. Over the past 75 years, scientific research has made substantial progress toward understanding the cellular mechanisms responsible for memory and learning, but how the brain creates our consciousness, thoughts, emotions and intellect remains elusive. This lack of understanding, together with the limited technical ability to upload information and skills directly into the brain, prevents us from accomplishing the feats of The Matrix and Replicas.

Neuroscience continues to make discoveries that may bring ideas off the silver screen and into medical clinics and homes. While we do not yet have the visor worn by Geordi La Forge to see outside the normal range of vision (Star Trek: The Next Generation, 1987–1994), we do have retinal implants that allow people who are blind to see shapes. We do not yet have the bionic ear of Jaime Sommers, who could hear sounds from incredible distances (The Bionic Woman, 1976–1978), but thousands of people have received cochlear implants to restore their hearing. We do not yet have drugs to boost intelligence (Limitless and Lucy), but drugs to slow memory loss in people with Alzheimer’s disease are currently available.

Technology that fuses the brain with machines has been used to tell stories for decades. Movies and TV shows such as Minority Report (2002), Avatar (2009), Elysium (2013) and Black Mirror (2016) have imagined worlds where the brain interacts seamlessly with robots, prosthetics and computers. Doc Ock in Spider-Man 2 (2004) could even control multiple tentacles fused to his spinal cord. Although Doc Ock’s neuroprosthetic technology does not exist, scientists in laboratories around the world have recorded electrical signals from inside, on top of, or outside the brain and have used this activity to control devices such as robotic arms and computer cursors. Stimulation of the brain with electrical or magnetic pulses has also been used successfully to treat the symptoms of depression, stroke, pain, epilepsy and movement disorders. Although additional work is needed to identify the mechanisms that underlie the success of these methods, these innovations establish the foundation for future therapies.

While Hollywood films may inspire researchers and inventors to investigate new ideas and technologies, film viewers must remember that the neuroscience they see on the big screen is science fiction. For example, we use all of the brain rather than just 10% (Limitless; Lucy), the full moon does not result in more abnormal behavior such as crimes or traffic accidents (or werewolves; An American Werewolf in London, 1981), and there is no way to download consciousness into a computer (Transcendence, 2014; Upload, 2020). Naïve belief in fictional films may not have devastating consequences, but trusting Hollywood with scientific facts can impair scientific literacy and lead to the inappropriate use of limited resources. For example, if law enforcement believes that more crime will occur when the moon is full, time and money may be wasted by assigning extra officers to the street when they are not needed. Films with neuroscientific themes can instead be used to correct misunderstandings about the brain and encourage critical thinking.

Nonetheless, the promise of neurotechnology has not escaped the attention of entrepreneurs intent on commercializing the merging of mind and machine. Elon Musk, CEO of SpaceX and Tesla, is the co-founder of Neuralink, a company that would like to implant a small device in the brain to read and write information wirelessly. Synchron, one of Neuralink’s main competitors, has already received FDA approval to conduct clinical trials of its implantable device for similar purposes. Other companies have jumped on the brain-computer interface bandwagon with noninvasive devices that are worn on a user’s head rather than implanted into the brain. Whether the public wants and would purchase these products, especially those devices that require neurosurgical implantation, is yet to be determined.

Neuroscientific research will likely result in the development of new therapies and treatments that alleviate suffering and restore some function in people who have experienced cognitive, motor, and sensory loss. These innovations may also find their way into everyday life in products that are controlled directly by brain activity. Perhaps life would be better if we could manipulate a phone or browse the Internet with thought alone. On the other hand, society may pay a high cost when brains are linked directly to machines. For example, unexpected side effects may change a person’s identity and how others view this person. The cost for neuroenhancing pharmaceuticals and technologies will also likely be prohibitive for most people, and the ability to pay for these new products may create new divisions within society. Security and privacy of brain data must be protected, and access to the data stream out of and into the brain must be safeguarded. We are already flooded with advertisements when we shop, watch TV, and visit websites—I, for one, do not need advertisements beamed directly into my brain. Now, there are some ideas for the next Hollywood film!


Eric H. Chudler is executive director of the Center for Neurotechnology and a neuroscientist at the University of Washington in Seattle. His books include Brain Bytes: Quick Answers to Quirky Questions about the Brain and The Little Book of Neuroscience Haiku.

Notes

Chudler, E. H. The Full Moon and Behavior. http://faculty.washington.edu/chudler/moon.html

Chudler, E. H. Do We Use Only 10% of Our Brains? http://faculty.washington.edu/chudler/tenper.html

Stewart, H. L. and Chudler, E. H. Neuroscience in the cinema. Science Scope, 25:76–81, 2002.

Wiertelak, E. P. And the winner is: inviting Hollywood into the neuroscience classroom. J Undergrad Neurosci Educ. 2002 Fall;1(1):A4–A17. Epub 2002 Oct 15. PMID: 23493171; PMCID: PMC3592583.

Wijdicks, E. F. M. Neurocinema: When Film Meets Neurology. 1st ed. Boca Raton (FL): CRC Press, 2015.