
The Promises and Perils of Neurotechnology

By Daniel Burkett
16 Jun 2023

In late May, a groundbreaking study published in Nature outlined how new developments in neurotechnology have allowed a man to walk again after being paralyzed for the better part of a decade. The patient in question – Gert-Jan Oskam – sustained a spinal cord injury in a cycling accident ten years prior, leaving him entirely unable to walk. This injury – like most spinal injuries – essentially meant that Oskam had suffered an interruption in the communication between his brain and certain parts of his body. In the revolutionary new procedure, Oskam received a brain-spine interface (BSI) that essentially created a “digital bridge” between the brain and spinal cord. The treatment was highly effective, with Oskam recovering the ability to stimulate leg muscles mere minutes after implantation. Within a year, Oskam was once again able to stand, walk, climb stairs, and navigate complex terrain.

The rapid development of neurotechnology will provide a raft of new medical interventions: from treating spinal injuries such as Oskam’s to enabling the control of prosthetic limbs. It also creates promising opportunities for the treatment of dementia and Parkinson’s disease, as well as more common mental health issues such as depression, insomnia, anxiety, and addiction. Given this battery of potential medical applications, it would seem that neurotechnology is clearly a force for good. But is this really the case?

On a consequentialist analysis, we must not only consider the benefits of new scientific developments, but also their potential costs. What concerns, then, might arise in the context of neurotechnology?

Given its highly invasive nature, neurotechnology’s greatest threats involve potential breaches to both our (1) privacy, and (2) autonomy. Consider, first, privacy. Neurotechnology literally creates a digital connection to our minds – the very thing that makes us us. In doing so, it holds the capacity to gain intimate knowledge of our (previously) most private psychological states. There are very real concerns, then, about what those with access to neurotechnology might do with this information. Many of us know the surprise, frustration, and – perhaps – indignation that comes when we are targeted by a commercial tailored specifically to our internet browsing history. Imagine, then, what would happen if such marketing were based on neurotechnology’s knowledge of our innermost thoughts. Consider the audacity of receiving an advert for the latest SUV just moments after thinking “I really need to buy a new car.”

Of course, this threat to privacy already exists thanks to the ubiquity of technology in our daily lives. While not nearly as invasive, digital technology currently enjoys unprecedented access to our lives via our phones and myriad other smart devices (all, of course, in communication with each other and with unfettered access to our social media, digital communications, and financial transactions). In this way, then, neurotechnology might only represent a difference in the degree of our loss of privacy, rather than an entirely novel intrusion in our lives.

Consider, then, how neurotechnology might instead threaten our autonomy. A vital component of autonomy is retaining complete control over our thoughts and actions. The inclusion, via neurotechnology, of any kind of “digital bridge” necessarily weakens this control – creating a vulnerability that might compromise our autonomy. If there is a digital “middleman” between my psychological desire to lift a glass of water and my hand’s physical performance of this task, then there is the opportunity for my autonomy to be threatened. What if my BSI refuses to perform the action I desire? What if the BSI is hacked, and I am forced to perform an action that I do not desire? In this sense, neurotechnology poses a threat that prior technological advancements – like phones and smart devices – have not yet created. While social media implements algorithms to monopolize our attention, and advertisers might use every trick in the book to manipulate us into purchasing their products, they have not (yet) been able to wrest control of our physical bodies. With the advent of neurotechnology, however, this may become a possibility.

In addition to concerns relating to our privacy and autonomy, there is the larger concern that neurotechnology might threaten our very humanity. There is, of course, much debate in philosophy about what it means to be “human” – or whether there is any such thing as “human nature” in the first place. However, in Enough: Staying Human in an Engineered Age, author Bill McKibben argues that human life would be meaningless if every challenge we faced could be easily overcome. By this reasoning, then, neurotechnology might threaten to strip meaning from our lives by allowing us to overcome adversity without hard work and the development of important skills and character traits.

Of course, this doesn’t imply that the use of all neurotechnology is wrong. We routinely implement medical technology to make people’s lives better, and certain applications of neurotechnology – like the BSI that allowed Gert-Jan Oskam to walk again – are really no different from this. The novelty of neurotechnology, however, is in its capacity to go beyond therapy and provide enhancement – to take us beyond our traditional capacities and, in doing so, threaten our very nature. This concern – coupled with those regarding the threats it raises to privacy and autonomy – means we should exercise caution in its development and implementation. What remains to be seen, however, is whether such fears are merely the techno-paranoia of Luddites, or reasonable concerns about the wholesale exploitation of technology to threaten our privacy, autonomy, and humanity.

Daniel Burkett received his PhD in Philosophy from Rice University, and is now a lecturer in the Philosophy Department at Binghamton University. His primary research interests are in ethics and political philosophy – particularly issues surrounding punishment and climate change.