The rapid advancement of neuroscience and technology has given rise to neurotechnology, a field concerned with studying, interacting with, and manipulating the human brain. Its applications range from medical devices that assist in treating neurobiological disorders to the mapping of neurological data, with many developments yet to come. While medicine is, on the whole, well regulated globally, neurotechnology is growing at an unprecedented pace and therefore requires oversight.
In July 2023, mindful of this growth, the United Nations Educational, Scientific and Cultural Organisation (UNESCO) held a conference to discuss the ethics of neurotechnology.
Some critical aspects of neurotechnology include Brain-Computer Interfaces (BCIs), which enable direct communication between the brain and external devices. These interfaces have the potential to help individuals with disabilities regain mobility or control robotic systems. A related category, Brain-Machine Interfaces (BMIs), integrates neural signals with external machines, enabling brain-controlled devices such as prosthetics and exoskeletons.
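To make the concept concrete, the sketch below illustrates the basic control loop such interfaces rely on: sample a neural signal, decode it into an intent, and translate that intent into a device command. It is a purely illustrative, minimal example; the signal source, decoder threshold and "prosthetic" commands are all hypothetical stand-ins, not drawn from any real BCI system or product.

```python
# Illustrative sketch of a generic BCI control loop (sample -> decode -> actuate).
# All signal sources, thresholds and commands are hypothetical; real systems use
# trained decoders and medical-grade acquisition hardware.

import random
from typing import List


def read_neural_sample(n_channels: int = 4) -> List[float]:
    """Stand-in for signal acquisition (e.g. an EEG or implant readout)."""
    return [random.gauss(0.0, 1.0) for _ in range(n_channels)]


def decode_intent(sample: List[float], threshold: float = 1.0) -> str:
    """Toy decoder: maps average channel activity to a movement intent."""
    mean_activity = sum(sample) / len(sample)
    if mean_activity > threshold:
        return "grip"
    if mean_activity < -threshold:
        return "release"
    return "hold"


def send_command(intent: str) -> None:
    """Stand-in for actuating an external device such as a prosthetic hand."""
    print(f"prosthetic command: {intent}")


if __name__ == "__main__":
    # Run a few iterations of the sample -> decode -> actuate loop.
    for _ in range(5):
        send_command(decode_intent(read_neural_sample()))
```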
Neuroimaging, another branch of neurotechnology, provides access to neurological responses and data. Neurostimulation, which applies electrical or magnetic stimulation to specific brain areas to modulate neural activity, can alter human behaviour and offers therapeutic possibilities for conditions such as depression and epilepsy.
Combined with Artificial Intelligence, these innovations threaten human privacy, autonomy and dignity: they open the door to behavioural changes, hormonal responses and mental influence, all without the active participation of the subject.
This field requires regulation, not only of its areas of impact but also of neurological data itself. Currently, there are no guidelines that regulate the gathering, study and use of neurological data as a category distinct from other sensitive data. Neuralink, Elon Musk’s neuroscience company, recently announced an upgraded brain implant chip approved for human trials. The implant can allegedly alter memories and treat hearing loss, blindness, paralysis and depression. Since such technology is not new to the private sector, regulation is urgently needed.
While neurotechnological innovation carries many benefits, the need for ethical guidelines in this field is widely considered essential. In 2019, the Organisation for Economic Cooperation and Development (OECD) released the first international-level guidelines on the ethical use of neurotechnology in clinical and research settings. UNESCO’s International Bioethics Committee (IBC) has also published a report examining the ethical issues raised by neurotechnology, along with recommendations to guide ethical innovation.
Neurowarfare: The intersection of biowarfare and cyberwarfare
Advancements in synthetic biology have opened the door to integrating brain functions into warfare, potentially within the next decade. These developments encompass a range of neurotechnological agents that can affect neurological abilities, including neuropharmacological agents such as amphetamines, which can alter neurological functioning, and neurotechnological devices.
Concurrently, advances in neuroscience and neurotechnology have necessitated discussions on how such developments could be used as weapons in national security, intelligence, and defence contexts.
As with many technologies, neurotechnology is dual-use, with both civilian and military applications. Neurowarfare refers to the use of neurotechnologies in military operations. While still largely speculative, potential applications include cognitive enhancement, which seeks to improve soldiers’ cognitive abilities, such as memory, attention and decision-making, to optimise their performance in war zones.
One critical case study in neurowarfare is Havana Syndrome, experienced by United States (US) intelligence personnel. The syndrome is a combination of symptoms affecting memory and balance, causing vertigo and confusion. A team of neuroscientists funded by the US Department of State (DOS) claimed that the symptoms, first reported in Cuba in 2016, were the result of an intentional attack, while other studies attributed them to a directed-energy weapon.
Other suspected cases have been reported in Guangzhou, China, though the DOS, citing privacy concerns, has not confirmed their severity.
Technology and medical intervention can also be used in neuropsychological warfare to impair the cognitive abilities of targets both on and off the battlefield, induce confusion and fear, or alter decision-making during conflicts. Additionally, brain-computer interfaces could allow soldiers to control weapons and equipment through neural signals, potentially increasing accuracy and response times.
Ethical concerns and considerations
Just as neurotechnological intervention in civilian populations for clinical, research or economic purposes raises ethical concerns, so does the use of neurotechnology in warfare.
The first is informed consent and privacy, not only for soldiers but also for civilians. Neurotechnology and neuropharmacology should not be used without informed consent, and there should be oversight of the use of such innovations to harm opponents by reducing their cognitive function, whether temporarily or permanently.
Secondly, psychological harm is a possible outcome of neurotechnological weapons. Before such technology is deployed, let alone becomes the norm, its impact on individual psychology must be studied and appropriate limits identified.
Finally, non-combatants must be protected from the use of neurotechnology against them, especially from the manipulation of neurological data, loss of privacy, potential abuse, and use without consent.
Given the significant ethical implications, responsible governance is crucial in guiding the development and deployment of neurotechnology in military settings.
International cooperation is the first step in guiding such measures. As mentioned earlier, the OECD has already outlined ethical considerations for neurotechnology in clinical and research settings, and UNESCO is actively working in the same area.
However, global governance of neurotechnological and neuropharmacological warfare still needs to be established. To make these ethical guidelines relevant to warfare, international organisations cannot confine their governance to research settings; disarmament forums also need to incorporate neurowarfare into their frameworks. Accountability for research and use should not be limited to academia but should extend to government organisations, prioritising transparency and ethical oversight.
Disarmament measures must also expand alongside emerging technologies and act with foresight to cover neurowarfare, its scope for use, and the limitations that may need to be imposed.
Beyond international disarmament measures and cooperation, accountability for state actors needs to be established through reporting systems.
Neurotechnology has the potential to revolutionise various aspects of human life, from medicine to communication and beyond. However, its application in neurowarfare raises profound ethical challenges and demands responsible governance to mitigate risks and ensure ethical conduct. Striking a balance between technological advancement and ethical considerations will be essential in shaping the future of neurotechnology and neurowarfare to safeguard human rights and global security.