

The disturbing trend of state media use of deepfakes

An AI-generated image of Pope Francis wearing a white puffer jacket went viral online, with users wondering if it was real. Credit: Reddit

Social media has been awash with fake images of public figures, such as the viral picture of Pope Francis in a white puffer jacket.

Such AI-generated images and videos, or deepfakes, have become increasingly accessible due to advances in artificial intelligence. As more sophisticated fabricated images spread, it will become increasingly difficult for users to differentiate the real from the fake.

Deepfakes get their name from the technology used to create them: deep learning. When trained on a dataset, these algorithms learn its patterns and can replicate them in novel—and convincing—ways.
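
As a rough illustration only: the article names only deep learning, but one common instance of this pattern-learning-and-replication loop is adversarial training, sketched below under assumptions of my own (the PyTorch library, the toy data, and every name in the script are illustrative, not from the article). A small generator learns to mimic a simple data distribution while a discriminator tries to tell real samples from generated ones.

```python
# Minimal adversarial-training sketch (illustrative assumptions, not the article's method).
import torch
import torch.nn as nn

torch.manual_seed(0)

# "Real" data: samples from a pattern the generator is never told about directly.
def real_batch(n=128):
    return torch.randn(n, 1) * 0.5 + 3.0  # Gaussian centred at 3

G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))  # generator
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))  # discriminator (outputs logits)
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    # 1) Train the discriminator to separate real samples from generated ones.
    real = real_batch()
    fake = G(torch.randn(128, 8)).detach()
    loss_d = bce(D(real), torch.ones(128, 1)) + bce(D(fake), torch.zeros(128, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # 2) Train the generator to fool the discriminator.
    fake = G(torch.randn(128, 8))
    loss_g = bce(D(fake), torch.ones(128, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

# After training, generated samples should cluster near 3.0: the network has
# "learned the pattern" in the data and can reproduce it convincingly.
print(G(torch.randn(5, 8)).detach().squeeze())
```

The point is not this particular architecture but the dynamic: with enough data and capacity, the same adversarial pressure that pushes these toy samples toward the right distribution pushes image and video generators toward output that people struggle to distinguish from the real thing.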

While this technology can be used for entertainment, it also has dark potential, raising serious ethical concerns.

Unlike simple fabricated stories or memes, which differ little from the propaganda and doctored photographs of earlier regimes, deepfakes have a high degree of realism. Their accessibility to the public and to states could erode our shared sense of reality.

Fake news anchor

Beyond growing concerns about their misuse by individuals, deepfakes can be used as unchecked mouthpieces for organizations and states.

Leading the way, Chinese state media have introduced an AI-generated news anchor, Ren. Ren, although still recognizably artificial, illustrates both the commitment to the technology and the incremental increases in realism.

Other countries have also launched AI-generated anchors.

Looking at these anchors, we might object that only the most naive viewer would mistake them for real humans. Yet these technologies are still in their infancy, and we cannot dismiss them.

Fabricated news

China's transparency in using AI-generated news anchors stands in contrast to Venezuela's fabricated news coverage. Venezuelan state media circulated favorable reports about the country's progress, purportedly produced by international English-language news outlets. However, the presenters in these reports were not real journalists but AI-generated avatars.

The use of these videos in Venezuela is particularly troubling because they serve as external validation of the government's activities. By claiming that the videos come from outside the country, the government presents them as independent sources that bolster its claims.

Venezuela is not the only country to adopt these methods. Fabricated videos of political leaders have also circulated during the ongoing Russia-Ukraine conflict.

Fabricated images and videos are merely the tip of the deepfake iceberg. In 2021, Russia was accused of using deepfake technology to impersonate a political figure in live video calls with European politicians. The ability to mimic political figures and interact with others in real time is a truly disturbing development.

As these technologies become increasingly accessible to everyone, from harmless meme-makers to would-be social engineers, the boundary between the real and the imagined becomes progressively blurred.

The proliferation of deepfakes foreshadows a future defined by a fractured geopolitical landscape, opinion echo chambers and mutual distrust that can be exploited by governmental and non-governmental organizations.

Disinformation and believable fakes

Understanding the spread of disinformation requires that we understand how ideas, innovations and behaviors spread within a social network, a process often referred to as social contagion.
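
As a minimal sketch of that idea (the random friendship network, the adoption probability and the independent-cascade rule below are illustrative assumptions, not a model from the article), the following script simulates how a single fabricated claim can reach much of a small network once each exposed user has even a modest chance of passing it on.

```python
# Toy social-contagion simulation (illustrative assumptions, not the article's model).
import random

random.seed(1)

# Build a small random friendship network: node -> set of neighbours.
N, p_edge = 200, 0.03
network = {i: set() for i in range(N)}
for i in range(N):
    for j in range(i + 1, N):
        if random.random() < p_edge:
            network[i].add(j)
            network[j].add(i)

def spread(seeds, p_adopt=0.2):
    """Independent-cascade rule: each newly convinced user gets one chance
    to convince each of their neighbours with probability p_adopt."""
    convinced = set(seeds)
    frontier = list(seeds)
    while frontier:
        nxt = []
        for person in frontier:
            for friend in network[person]:
                if friend not in convinced and random.random() < p_adopt:
                    convinced.add(friend)
                    nxt.append(friend)
        frontier = nxt
    return convinced

# A single "patient zero" posting a fake video can expose a sizeable share of the network.
reached = spread(seeds=[0])
print(f"{len(reached)} of {N} users exposed to the claim")
```

In models like this, small increases in the per-contact probability can tip a cascade from fizzling out to reaching most of the network, which is one reason believability matters so much.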

Cognitive science is concerned with "information"—anything that reduces our uncertainty about the actual state of the world. Disinformation has the appearance of information, except that it reduces uncertainty at the expense of accuracy.

The observation that familiar claims are more readily believed likely stems from the fact that when a message is repeated, it increases our confidence in it.

Disinformation spreads for a variety of reasons. It must appear close enough to the "truth" that it is believable: if a new "fact" is incompatible with what we already know, we are inclined to reject it even if it is true, or we feel a tension between the two and seek to resolve it. People will also ignore the structure and quality of an argument when its conclusion fits what they already believe.

Deepfakes move us beyond text-based persuasion, because images are more vivid—and more memorable—than abstract concepts alone. Their use in spreading disinformation is therefore far more concerning.

The structure of the information environment is also critical. People attend to media selectively, focusing on information that confirms their prior beliefs. By increasing the frequency of images, ideas and other media, we increase people's familiarity with them and their willingness to accept them as true.

Social networks and contagion

While we look to credible sources—experts or peers—for information, our memory stores information separately from its source. Over time, this results in our retrieving information from memory without recalling where it came from.

Through advertising and propaganda, marketers and politicians have exploited these techniques for generations.

The introduction of AI will only accelerate this process by permitting tighter and more opaque control of the information environment.

Legal, social and moral issues

Producing, managing and disseminating information grants people authority and power. When information ecosystems become flooded with disinformation, truth itself becomes debased.

The accusation of "fake news" has become a tactic used to discredit any argument. Deepfakes are a variation on this theme. Social media users have already falsely claimed that genuine videos of public figures and events are fake.

Social movements and claims of wrongdoing often rely on the compelling qualities of video evidence.

The time required for verification—especially if left to the user—allows disinformation to propagate. Fact-checkers and public fact-checking websites can help, but they need legitimacy to foster trust.

Brazil provides a recent demonstration of such an attempt. After the government launched a verification website, critics accused it of pro-government bias. However, officials noted that the site was not meant to replace private initiatives.

There is no simple solution. Rather than remaining passive consumers of media, we must actively challenge our own beliefs.

The only way to combat harmful forms of artificial intelligence is to cultivate human intelligence.

Provided by The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.

