How Deepfakes could help implant false memories in our minds

It’s startlingly easy

Background

A team of researchers from universities in Germany and the UK today published pre-print research detailing a study in which they successfully implanted and removed false memories in test subjects.

Per the team’s paper, it’s relatively easy to implant false memories. Getting rid of them is the hard part.

The study was conducted on 52 subjects who agreed to allow the researchers to attempt to plant a false childhood memory in their minds over several sessions. After a while, many of the subjects began to believe the false memories. The researchers then asked the subjects’ parents to claim the false stories were true.

The researchers discovered that the addition of a trusted person made it easier to both embed and remove false memories.

False memory planting techniques have been around for a while, but there hasn’t been much research on reversing them, which means this paper comes not a moment too soon.

Enter Deepfakes

There aren’t many positive use cases for implanting false memories. But, luckily, most of us don’t really have to worry about being the target of a mind-control conspiracy that involves being slowly led to believe a false memory over several sessions with our own parents’ complicity.

Yet, that’s almost exactly what happens on Facebook every day. Everything you do on the social media network is recorded and codified in order to create a detailed picture of exactly who you are. This data is used to determine which advertisements you see, where you see them, and how frequently they appear. And when someone in your trusted network happens to make a purchase through an ad, you’re more likely to start seeing those ads.

But we all know this already, right? Of course we do; you can’t go a day without seeing an article about how Facebook, Google, and all the other big tech companies are manipulating us. So why do we put up with it?

Well, it’s because our brains are better at adapting to reality than we give them credit for. Once we know there’s a system we can manipulate, we start to believe the system says something about us as humans.

A team of Harvard researchers wrote about this phenomenon back in 2016.

What does this have to do with Deepfakes? It’s simple: if we’re so easily manipulated through tidbits of exposure to tiny little ads in our Facebook feed, imagine what could happen if advertisers started hijacking the personas and visages of people we trust.

You might not, for example, plan on purchasing some Grandma’s Cookies products anytime soon, but if it was your grandma telling you how delicious they are in the commercial you’re watching… you might.

Using existing technology it would be trivial for a big tech company to, for example, determine you’re a college student who hasn’t seen their parents since last December. With this knowledge, Deepfakes, and the data it already has on you, it wouldn’t take much to create targeted ads featuring your Deepfaked parents telling you to buy hot cocoa or something.

But false memories?

It’s all fun and games when the stakes just involve a social media company using AI to convince you to buy some goodies. But what happens when it’s a bad actor breaking the law? Or, worse, what happens when it’s the government not breaking the law?

Police use a variety of techniques to solicit confessions. And law enforcement officers are generally under no obligation to tell the truth when doing so. In fact, it’s perfectly legal in most places for cops to outright lie in order to obtain a confession.

One popular technique involves telling a suspect that their friends, families, and any co-conspirators have already told the police they know it was them who committed the crime. If you can convince someone that the people they respect and care about believe they’ve done something wrong, it’s easier for them to accept it as a fact.

How many law enforcement agencies in the world currently have an explicit policy against using manipulated media in the solicitation of a confession? Our guess would be: close to zero.

And that’s just one example. Imagine what an autocratic or iron-fisted government could do at scale with these techniques.

The best defense…

It’s good to know there are already methods we can use to extract these false memories. As the European research team discovered, our brains tend to let go of the false memories when challenged but cling to the real ones. This makes them more resilient against attack than we might think.

However, it does put us perpetually on the defensive. Currently, our only defense against AI-assisted false memory implantation is to either see it coming or get help after it happens.

Unfortunately, the unknown unknowns make that a terrible security plan. We simply can’t plan for all the ways a bad actor could exploit the loophole that makes it easier to edit our brains when someone we trust is helping the process along.

With Deepfakes and enough time, you could convince someone of just about anything as long as you can figure out a way to get them to watch your videos.

Our only real defense is to develop technology that sees through Deepfakes and other AI-manipulated media. With brain-computer interfaces set to hit consumer markets within the next few years and AI-generated media becoming less distinguishable from reality by the minute, we’re closing in on a point of no return for technology.

Just like the invention of the firearm made it possible for those unskilled in sword fighting to win a duel and the creation of the calculator gave those who struggle with math the ability to perform complex calculations, we may be on the cusp of an era where psychological manipulation becomes a push-button enterprise.

Story by Tristan Greene

Tristan is a futurist covering human-centric artificial intelligence advances, quantum computing, STEM, physics, and space stuff. Pronouns: He/him