Q&A: The Ethics of Using Brain Implants to Upgrade Yourself

Whether people will be willing to get elective brain surgery

IEEE Spectrum: We currently need to use invasive technology, electrodes that are implanted in the brain tissue, if we want to get really precise signals out of the brain or into the brain. Will the need for brain surgery keep this tech from being widely adopted?

Anders Sandberg: That might hold people back a bit, because there’s a scariness factor. There’s also an important practical factor: If I want to upgrade my cellphone, I go to the store. If I want a neural upgrade, I’d have to go to the hospital and have something removed from my brain. Maybe this will work if surgery becomes very simple and painless, and you can just drop by and get a quick upgrade. Maybe nanomachines will do the surgery, or the technology is so small you can take it as a pill. But that’s not going to happen anytime soon.

The important question about brain-computer interfaces is, "What is the killer app?" Right now it's replacing biological function that has been lost. That's a killer app for a very small percentage of the population (such as people dealing with paralysis or blindness). For the larger population, it has to be something we really want to have, and can't get any other way. If the implant is just supplying information, it has to be a lot better than smartphones or computer screens or even virtual reality. Maybe it would be an implant that controls your weight setpoint, so you can say, "I want to lose 20 kilos." That might actually make you want to put some electrodes in your head.

Spectrum: Some researchers and companies are focused on brain-computer interfaces (BCIs) that “read out” brain signals and use them to control something in the external world, while others are building BCIs that “write in” information into the brain. Which do you think is more likely to take off for enhancement purposes?

Sandberg: Read-out seems to be much easier to achieve than write-in. And in some applications you’re well off with just one or the other: For paralysis, read-out is really useful, and an implant for artificial vision would just need write-in. But most of the really important applications will be about enhancing communication, either between people or between people and machines, and communication is usually two ways. If we had perfect read-out but bad write-in, we would be somewhat stuck.

Spectrum: Should we view brain enhancements achieved through hardware and software as fundamentally different from enhancements achieved via drugs? Are the technological enhancements more alarming, or are they just new?

Sandberg: I think it’s mostly that they’re new. We tend to think new technology is scary and problematic, whereas old technology we take for granted—and “old” means it arrived before you were a teenager. But there’s no philosophical reason to treat neurotechnology as fundamentally different from anything else. Putting an electrode in the brain doesn’t change the brain’s mode of operation. If you take a piano lesson or take a drug or use a brain implant—none of these give you the instant ability to play piano, but they might all make it easier to learn.


Cognitive, emotional, and moral enhancements

Spectrum: You’ve studied brain enhancements that cause changes to people’s cognitive, emotional, and moral systems. Which of these do you think is most likely to become real? Do any give you qualms?

Sandberg: There was an interesting study that asked students about various mental traits and whether they'd be willing to use an enhancement technology to improve them. The students were very willing to use an enhancement to improve cognitive traits like attention, alertness, and rote memory. But they were loath to enhance other traits like empathy and kindness. Only 9 percent of the students were willing to be enhanced in kindness.

The authors had a theory to explain their results. They also asked how central these traits are to the person’s sense of self, their sense of who they are. With traits like memory and language ability, the students said they’re part of me, but rather remote from my sense of self. But emotions, those are close to my heart. If this holds true—and I think this is a great study, it should be replicated—it tells us something very cool about how we think about ourselves. So I think cognitive enhancement will be seen as pretty acceptable. And it’s no secret that in academia there are a number of students interested in cognitive enhancement.

Spectrum: What kind of society would that bring about?

Sandberg: It’s interesting to ask which kinds of enhancements would be good for the world. I can get numbers for how society would benefit if people were a bit smarter. But it’s really hard to find numbers for what would happen if people were happier, or if they were more able to trust other people.

You can look at the effect of lead in drinking water, which does impact intelligence and cause worse school performance. We can imagine a brain implant that acts like an anti-lead, and say that an IQ point might be worth about 1 percent of GDP. Other researchers are trying to look at IQ and lifespan and life outcomes. There’s a correlation between being smart and doing better in school and getting better jobs. It’s not always the case, and not every smart person is a happy person. But that’s what we see overall. And people with lower intelligence are much more likely to be victims of a crime.

And people with high intelligence cooperate better. So overall, a society where everybody is a bit smarter would likely be a much better place. Even people who aren't enhanced would be better off, because they would be surrounded by people who are good at cooperating and being nice. So everyone might have a rational reason to want everyone else to be enhanced, even if they don't want to be enhanced themselves.


A neurotech love potion

Spectrum: In a recent study using prairie voles, rodents that form lifelong monogamous bonds, researchers stimulated certain brain regions and could cause two voles to form that bond, even though they weren’t allowed to mate. Can you imagine similar emotional meddling in the human brain?
