Exposing The Truth Behind Deepfakes In Political Ads

Political advertisements showing clips of newscasts or
politicians making speeches and talking to voters are nothing new,
and with a little over a year left until the 2024 elections,
voters can expect to be bombarded with these ads. But the rise of
generative artificial intelligence (AI) has raised a new concern
about political advertising: How do we know that what’s being
shown in these ads is real?

Over the last year, generative AI platforms have quickly become
a popular way to create music, artwork, text and videos. At this
point, most people have played around with ChatGPT or listened to
AI-generated music. Now, political advertisers are joining the
generative AI movement and using it to create photos, videos and
audio clips that align with their political message. For example, a
Ron DeSantis-supporting political action committee (PAC) used
generative AI to create a “deepfake” of Donald
Trump’s voice attacking the Republican governor of Iowa.
Meanwhile, the Republican National Committee used AI-generated
images of boarded-up storefronts and the military on the streets
of U.S. cities to show what it envisions happening if President
Biden is re-elected.

These developments have raised concerns for online platforms,
lawmakers and regulators trying to combat misinformation in
political advertising.

Google’s New Policy

Google recently announced a new policy aimed at alerting
viewers when a political advertisement contains AI-generated
content. Starting in November 2023, Google will require any
political advertisement that features “synthetic content”
(such as AI-generated photos or videos) that “inauthentically
represents real or realistic-looking people or events” to
include a clear and conspicuous disclosure informing viewers that
the ad contains AI-generated content.

As an example, Google specified that any ad with synthetic
content that “makes it appear as if a person is saying or
doing something they didn’t say or do” or that
“alters footage of a real event or generates a realistic
portrayal of an event to depict scenes that did not actually take
place” would need a disclosure.

Notably, Google also clarified that ads that use AI in an
“inconsequential” way (such as image resizing or color
correction) would not need a disclosure.

Lawmakers and the FEC Look to Regulate AI-Generated Political Ads

Earlier this year, Rep. Yvette Clarke introduced the REAL
Political Ads Act, which would require a disclaimer on any
political ads that use images or video generated by AI, no matter
the medium or platform on which those ads appear. Similar
legislation was also introduced in the Senate by Senators Amy
Klobuchar, Cory Booker and Michael Bennet. However, given partisan
gridlock in both the Senate and House of Representatives, the
bill’s future is unclear.

The Federal Election Commission (FEC) is also taking steps to
regulate AI-generated political advertisements. The agency opened
a public comment period on a petition asking it to regulate this
content by amending the regulations that prohibit a candidate or
their agent from “fraudulently misrepresenting other candidates
or political parties” to make clear that this prohibition applies
to deliberately deceptive AI-generated campaign ads. Transparency
advocates see this as a sign that the FEC is taking the issue
seriously.

Other Online Political Ad Regulations

As online platforms, political advertisers and their agencies
watch for further movement on regulating AI use in political
advertising, they should also be aware of the complicated
patchwork of laws governing online political advertising more
generally.

FEC regulations have long required that political advertisements
on television and radio disclose who paid for such ads. Last year,
the FEC approved new regulations that expand its disclosure
requirements to cover “internet public communications.”
This includes “any public communication over the internet that
is placed for a fee on another person’s Web site, digital
device, application, or advertising platform.” While the exact
form and content of the required disclaimer vary depending on the
entity that authorizes and finances the advertisement (for example,
a PAC, independent expenditure or candidate/campaign), generally
speaking, these internet communications will now require a
disclaimer stating who paid for and authorized the advertisement.
This move closes a loophole that previously exempted ads placed on
applications, digital devices and advertising platforms from FEC
disclosure requirements, as earlier FEC regulations only required
disclaimers on ads “placed for a fee on another person’s Web
site.”

Many states have also enacted legislation over the last few
years aimed at regulating online political advertisements. These
laws vary greatly from state to state: some simply extend
existing requirements for TV and radio advertising to online
political advertising, while others go further and impose new
obligations (most commonly robust recordkeeping requirements) on
online platforms and ad networks.

Because state laws vary so greatly, political advertisers and
online platforms should carefully review each advertisement they
place or accept to ensure compliance with all applicable legal
requirements.

The Bottom Line

  • Generative AI tools have made it easier and cheaper for
    political groups and campaigns to create convincing but fictitious
    attack ads targeting political rivals.
  • An array of proposals would require political advertisers to
    clearly and conspicuously disclose when generative AI tools were
    used to create the content in their advertising.
  • Publishers and agencies working with political campaigns and
    operatives to create and distribute political advertising should
    familiarize themselves with existing rules and requirements and
    closely watch the rapidly changing developments heading into the
    2024 election cycle.

The content of this article is intended to provide a general
guide to the subject matter. Specialist advice should be sought
about your specific circumstances.
