Test subjects ‘successfully incepted with fake memories’ using deepfake videos
RESEARCHERS have discovered that they can implant fake memories using artificial intelligence-generated deepfakes.
Deepfakes are synthetic media that have been digitally altered to replace one person’s face with that of another.
Now, in a new study, scientists said they have been able to successfully create “fake memories” using deepfake technology.
In the experiment, 436 participants were shown deepfake clips of fictitious movies.
These included remakes of popular movies such as The Matrix; in the deepfake version, Will Smith starred as Neo.
Another clip showed Brad Pitt starring in “The Shining” – in reality, the role belonged to Jack Nicholson.
After viewing the clips, participants were asked questions about the movies and asked to rank them.
Shockingly, many subjects identified the deepfake movie remakes as the real versions.
In total, scientists observed a staggering average false memory rate of 49 percent.
What’s more, scientists found that even just simple text descriptions of fictitious events were enough to distort memories.
“Deepfakes were no more effective than simple text descriptions at distorting memory,” the paper reads.
In fact, researchers noted that many studies have shown “misinformation in non-technical forms like simple narratives are extremely effective at distorting memory.”
The study has prompted concern about deepfake technology among experts; however, the study authors believe more research needs to be done.
“We shouldn’t jump to predictions of dystopian futures based on our fears around emerging technologies,” lead study author Gillian Murphy, a researcher at University College Cork in Ireland, told The Daily Beast.
“Yes there are very real harms posed by deepfakes, but we should always gather evidence for those harms in the first instance, before rushing to solve problems we’ve just assumed might exist.”
In conclusion, the team said that their findings suggest we might be underestimating “how readily our memories can be distorted without any technological input.”
They added that they support “growing calls to understand deepfakes as a cultural technology, where social concerns and fears should be engaged with critically and any interventions or regulations should be evidence-based.”