The Durham Law Review is a student-run society commenting on contemporary legal and commercial issues. It publishes feature articles alongside regular commercial and legal updates.

Can Human Rights Law Catch Up with “Neurocapitalism”?

Neuralink and the Wider Picture 

In July 2019, Elon Musk, founder of Tesla, SpaceX and the recently founded Neuralink, held a live-streamed presentation at the California Academy of Sciences, announcing that the company anticipated beginning experiments with human subjects in 2020. The implantable brain-machine interface (BMI) consists of a module placed outside the head that wirelessly receives information from thin, flexible electrode threads embedded in the brain. The system was reported to include ‘as many as 3,072 electrodes per array distributed across 96 threads’, each 4-6 micrometres in width, which would be embedded into the human brain by a robotic apparatus designed to avoid damaging blood vessels. 

Though this news might sound fictitious, Neuralink is not the first tech company to announce research into, or even production of, BMIs. In fact, Mark Zuckerberg’s Facebook and other companies such as Kernel, Emotiv and NeuroSky are all working on brain tech, and claim to be building it for ethical purposes, such as helping people with paralysis control their devices. 

All this means that BMIs are moving closer and closer to reality. Someday they might read thoughts and become widely available to the average consumer; as Samuel (2019) put it, ‘your brain, the final privacy frontier, may not be private much longer’. Human rights law must therefore develop in order to tackle the humanitarian issues that might arise. 

These issues have been well addressed by neuroethicist Marcello Ienca, a researcher at ETH Zurich, who proposed four new human rights in order to prepare humanity for “neurocapitalism”.

The Right to Cognitive Liberty

You should have the right to freely decide whether you want to use a given neurotechnology or to refuse it (Ienca, 2017). In China, the government is already mining data from employees’ brains by having them wear caps that scan their brainwaves for depression, anxiety, rage or fatigue, and the US military is also looking to neurotechnologies to make soldiers more fit for duty. 


The Right to Mental Privacy 

You should have the right to seclude your brain data or to publicly share it (Ienca, 2017). If these BMIs are able to read thoughts, then governments and authorities around the world are very likely to use them for interrogations and investigations in the near future. 


The Right to Mental Integrity 

You should have the right not to be harmed physically or psychologically by neurotechnology (Ienca, 2017). BMIs may facilitate a new form of brainwashing through technology, whether by socialist regimes, religious authorities who want to indoctrinate people, or terrorist groups seeking new recruits. 


The Right to Psychological Continuity 

You should have the right to be protected from alterations to your sense of self that you did not authorise (Ienca, 2017). Advertisers may try to work out how the brain makes purchasing decisions and then exploit that knowledge, leaving us unable to make rational and prudent financial decisions as a result. 


The Way Forward 

To catch up with the rapidly growing and, may I say, scary development of this technology, some jurisdictions have already put protections in place: Chile has advanced a NeuroProtection agenda that would make brain data protection a human right, while Europe has established a set of nine new principles for regulating the use of brain data. 

The UK, on the other hand, is slightly falling behind in this regard. The General Data Protection Regulation (GDPR) regards individuals as ‘data subjects’, and asserts each data subject’s rights over any personal data from which they might be identified. Different types of data attract different levels of protection, depending on sensitivity. Health data receives higher protection than some other types, as very sensitive information may be derived from it about the data subject. If brain data is obtained through medical devices, such as devices used for rehabilitation, then it regularly qualifies as health data. However, if it stems from consumer products, it might not. Treating such data as health data (Rainey and Bublitz, 2019), or passing a new Act of Parliament on neurotechnology and BMIs, would afford individuals a higher level of protection, shielding us from potential exploitation by enterprises and authorities during this technological boom. The transition period following the UK’s departure from the EU, which runs until January next year, would be a great opportunity for the country to catch up not only with the protection offered in the EU, but also with the advancement of humanity and civilisation.
