Mind-Reading Tech Is Dangerously Close To Becoming A Reality So We Need New Laws To Protect Us

“Nothing was your own except the few cubic centimeters inside your skull.”

This line from George Orwell's dystopian novel 1984, published in 1949, was meant to highlight the repressive surveillance state the characters inhabit. But on the brighter side, it shows how lucky they were that their brains were still private.

In recent weeks, Facebook and Elon Musk’s Neuralink have publicized their initiatives in building tech to read human minds.

Mark Zuckerberg's company is backing research on brain-computer interfaces (BCIs) that can pick up thoughts directly from your neurons and interpret them into words. According to the researchers, they've already developed an algorithm that can decode words from brain activity in real time.

Elon Musk’s neurotechnology company has developed flexible “threads” that can be implanted into a brain and could someday enable you to control your smartphone or computer with just your thoughts. Musk even wants to start human trials by the end of next year.

Other companies in the field, like Kernel, Emotiv, and NeuroSky, are also working on brain tech. They claim to be working for ethical purposes, like helping people with paralysis control their devices.

All this sounds like science fiction, but it's already changing people's lives. Over the past decade, a number of paralyzed patients have benefitted from brain implants that allow them to move a computer cursor or control robotic arms. Brain implants that can directly read thoughts are still years away from commercial availability, but research in this field is moving faster than most people realize.

Your brain, the final privacy frontier, may not be private for long.

Some neuroethicists contend that the potential for misuse of these technologies is so high that we need updated human rights laws, a new "jurisprudence of the mind," to protect us. These technologies have the latent ability to interfere with rights so basic that we may not even think of them as rights, such as our ability to determine where our selves end and machines begin. Our existing laws are not equipped to address this.

4 New Rights That May Need To Be Protected In Law

Several countries are already contemplating how to handle "neurorights."

One of the leading advocates for these new human rights is neuroethicist Marcello Ienca, a researcher at ETH Zurich, one of Europe's top science and technology universities. In 2017, he released a paper describing four specific rights for the neurotechnology age that he believes we should enshrine in law.

Commenting on the recent revelations from Facebook and Neuralink, he said, "I'm very concerned about the commercialization of brain data in the consumer market. And I'm not talking about a farfetched future. We already have consumer neurotech, with people trading their brain data for services from private companies."

He cited neurogaming, where you control your actions in a video game using your brain activity instead of a traditional controller, and self-tracking, where wearable devices monitor your sleep.

Ienca says, “I’m tempted to call it neurocapitalism.”

BCI tech comprises systems that "read" neural activity to decode what it's already expressing, often with the aid of AI-processing software, and mechanisms that "write" to the brain, providing new inputs to actually alter how it's functioning. Some systems can do both.
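The read/write split can be pictured as a simple feedback loop. The sketch below is purely illustrative: every function name, signal, and threshold is a hypothetical stand-in, not the API of any real BCI device.

```python
# Toy sketch of a hybrid "read/write" BCI loop. All names and numbers
# here are hypothetical illustrations, not any real device's interface.

def read_attention(eeg_window):
    """'Read' side: reduce a window of raw EEG samples to an attention score."""
    # Toy proxy: mean absolute amplitude stands in for a real trained decoder.
    return sum(abs(s) for s in eeg_window) / len(eeg_window)

def write_stimulation(attention, threshold=0.5):
    """'Write' side: decide whether to send a neuromodulation pulse."""
    return "stimulate" if attention < threshold else "idle"

def bci_step(eeg_window):
    """One pass of a hybrid BCI: read a deficit, then write a correction."""
    return write_stimulation(read_attention(eeg_window))
```

A read-only consumer headset would stop after the first step; a hybrid system, like the military research Ienca describes later, closes the loop with the second.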

Ienca explains each of the four human rights he believes we need with a concrete example of how neurotechnology might violate it.

  1. The Right To Cognitive Liberty

Humans should have the right to freely decide whether they want to use a given neurotechnology or to refuse it.

In China, the government is already mining data from the brains of some employees by making them wear caps that scan their brainwaves for depression, anxiety, rage, or fatigue.

Ienca said, "If your employer wants you to wear an EEG headset to monitor your attention levels, that might qualify as a violation of the cognitive liberty principle." Even if you're told that wearing the device is optional, you'll probably feel implicit pressure to do so, since you don't want to be at a competitive disadvantage.

He added that the US military is also exploring neurotechnologies to make soldiers more fit for duty. Down the line, they may be pressured to accept interventions.

Referring to the Defense Department's advanced research agency, Ienca said, "There is already military-funded research to see if we can monitor decreases in attention levels and concentration, with hybrid BCIs that can 'read' deficits in attention levels and 'write' to the brain to increase alertness through neuromodulation. There are DARPA-funded projects that attempt to do so."

  2. The Right To Mental Privacy

Humans should have the right to seclude their brain data or to publicly share it.

Ienca stressed that neurotechnology has huge implications for law enforcement and government surveillance.

He explained, “If brain-reading devices have the ability to read the content of thoughts, in the years to come governments will be interested in using this tech for interrogations and investigations.”

The right to remain silent and the principle against self-incrimination that are enshrined in the US Constitution might become meaningless in a world where the authorities can eavesdrop on your mental state without your consent.

This grim scenario is reminiscent of the sci-fi movie Minority Report, where a special police unit called the PreCrime Division identifies and arrests murderers even before they commit their crimes.

  3. The Right To Mental Integrity

Humans should have the right not to be harmed physically or psychologically by neurotechnology.

BCIs armed with a “write” function can enable new forms of brainwashing, notionally enabling all sorts of people to exert control over our minds. This could be misused by religious authorities who want to indoctrinate people, political regimes that want to quash dissent or even terrorist groups seeking new recruits.

Additionally, devices like those being built by Facebook and Neuralink may even be vulnerable to hacking. Neuroethicists refer to this as brainjacking.

Ienca explained, “This is still hypothetical, but the possibility has been demonstrated in proof-of-concept studies. A hack like this wouldn’t require that much technological sophistication.”


  4. The Right To Psychological Continuity

Humans should have the right to be protected from alterations to their sense of self that they did not authorize.

In a study, a woman with epilepsy who was fitted with a BCI developed such a radical symbiosis with it that, she said, "It became me." When the company that implanted the device in her brain went bankrupt, forcing her to have it removed, she lamented, "I lost myself."

Ienca calls this an example of how psychological continuity can be disrupted by the imposition and removal of a neurotechnology: "This is a scenario in which a company is basically owning our sense of self."

Ienca argues that neurotechnologies should be taken out of the control of private companies and reclassified as public goods. This, he says, would both prevent companies from inflicting harm and stop them from reserving the benefits for a privileged class.

He said, “One risk is that these technologies could become accessible only to certain economic strata and that’ll exacerbate preexisting social inequalities. I think the state should play an active role in ensuring these technologies reach the right people.”

It's hard to say whether neurorights campaigns like Ienca's will effectively keep neurotechnology's risks in check, but given the pace at which this tech is developing, it does seem likely that we'll need new laws to protect us. Now is the time for experts to articulate our rights, because lawmakers move slowly; once BCI devices like Facebook's or Neuralink's hit the market, it could prove too late.

Ienca warned, “Brain data is the ultimate refuge of privacy. When that goes, everything goes. And once brain data is collected on a large scale, it’s going to be very hard to reverse the process.”
