Missouri ‘Taylor Swift Act,’ other deepfake AI bills a first step but not enough | Opinion


You may have heard about the New Hampshire robocall in which a voice that sounded very much like President Joe Biden asked people not to go to the polls.

You’ve probably heard about, if not seen, the video of President Barack Obama supposedly saying, well, things he wouldn’t say.

And more recently, someone faked realistic pornographic images of Taylor Swift and shared them on X, formerly known as Twitter.

Welcome to the not-so-new world of artificial intelligence and deepfakes — authentic-looking digital depictions that are manufactured with AI technology. Concern about deepfakes goes back to just before the pandemic, but it’s 2024, and the technology that creates these troublesome photos, audio and video is cheaper and easier to use.

Missouri lawmakers have turned their attention to the technology. It’s about time.

Last week, the Missouri House heard House Bill 2628, which would prohibit someone, within 90 days of an election, from distributing a “synthetic media message of any candidate or party for elective office who will appear on a state or local ballot” that is not labeled as artificial.

The House perfected the bill, which now awaits a final vote; if passed, it heads to the Senate.

Rep. Ben Baker, a Newton County Republican, sponsored HB 2628. “We must address the AI issue, especially in elections,” Baker said in an email. “Protecting the integrity of our elections should be a priority of the legislature and this bill accomplishes that. This is about protecting the voters from purposeful deception by the use of deepfake AI generated media, it is not about protecting politicians from attacks,” he said.

Another Missouri House bill, 2573, was introduced last week by Rep. Adam Schwadron, a St. Louis County Republican, and is referred to as the Taylor Swift Act. It would allow anyone who becomes the subject of an “intimate digital depiction” without their consent to bring a civil action against the creator. It was introduced after the superstar’s image was used via deepfake technology to show her doing things she didn’t do, and certainly wouldn’t consent to. A public hearing was held March 5.

These bills don’t forbid this technology — how could lawmakers expect to? — but try to rein it in a little. Very little.

The Taylor Swift Act emphasizes consent, as it should, and the other, House Bill 2628, would require labeling political content as a digital representation with a notice such as: “This (video, photo, audio, or description of the content) has been manipulated or generated by artificial intelligence.”

Penalties aren’t that severe, however. More on this in a moment.

Celebrities aren’t the only targets

I’m hopeful these bills make it to the governor’s desk, because this kind of digital depiction can be dangerous — not just to politicians and celebrities, but to you and me. Do you have an image or video of yourself online somewhere? You’re fair game.

What are deepfakes? The Sloan School of Management at Massachusetts Institute of Technology explains it this way: “A deepfake refers to a specific kind of synthetic media where a person in an image or video is swapped with another person’s likeness.” The term was coined in 2017 by a Reddit user, according to MIT. Back then, this problem wasn’t even conceivable to Missouri lawmakers. But we’ve all caught up, haven’t we?

Deepfake technology can be used for good. The World Economic Forum suggests using such tech to create health care models that diagnose and treat illnesses better, for example.

I’m not alone in my concerns here in the Show-Me State. Three Missouri groups partnered to present a webinar this year on the challenges of AI and its impact on local elections. You can watch the webinar from the Missouri School Boards’ Association, the Missouri Association of Counties and the Missouri Municipal League on YouTube.

The technology is incredible but opens a veritable Pandora’s box of problems. In the case of the Taylor Swift depiction, it’s about money: Porn sells. In the case of deepfakes deployed for political purposes, message, power and influence are the endgame.

This is bad enough, but what frightens me most is the sheer ability to sow propaganda and confusion. To alter our reality. We’ve seen it before, and not just in that “Black Mirror” episode on Netflix.

What about when Russian trolls attempted to interfere in the 2016 U.S. presidential election with disinformation campaigns? And remember Cambridge Analytica, the political consulting firm that improperly harvested personal user data from Facebook (now Meta) and shared it with Donald Trump’s presidential campaign?

The uproar got big enough to bring Meta CEO Mark Zuckerberg to Washington, and his company ultimately paid $725 million to settle a class-action lawsuit. Facebook (and other social media sites) started deleting “questionable” accounts all over the place.

Good call, but they went a little too far. My Facebook account got taken down, too. Oh well. It wasn’t too hard to create another profile, something the trolls do all the time.

The inconvenience stung, but stopping phony info and deepfakes is more important than me. And that’s why the penalties in this legislation have to get tougher.

Don’t outlaw tech, but penalties are too small

The movement by the Missouri House is encouraging, but it has a way to go. If these bills become law, Missouri would be in rare company. I found only 13 states with deepfake bills signed by their governors: California, Florida, Georgia, Hawaii, Illinois, Michigan, Minnesota, New Mexico, New York, South Dakota, Texas, Virginia and Washington. The rest of the states appear to have legislation in the works.

But even if the Missouri bills pass both chambers of the General Assembly, and Gov. Mike Parson signs off, I’m not hopeful they will do much unless the punishment is stronger. They need teeth, because I don’t believe the fakers care much about following the law. Do you?

And what if the fakers don’t follow the law? HB 2628’s penalties include a Class B misdemeanor (punishable by up to six months in jail and a $1,000 fine), a Class A misdemeanor (punishable by up to one year in jail and a $2,000 fine) if there is harmful intent, or a Class E felony (punishable by up to four years in prison) if the person commits the violation within five years of a prior conviction.

Compare these penalties to Georgia and Minnesota, where the fine is up to $100,000. (In Georgia, the act is also a felony.)

Missouri’s Taylor Swift Act does go further. It mentions injunctions against showing the material, recovery of money made by the faker, plus actual and punitive damages, but it spells out liquidated damages of only up to $150,000.

Still, it seems like small potatoes for a victim of this kind of digital depiction, whose personal life and career could be turned upside down. What about a million dollars? Five million with serious jail time? How about making life extremely difficult for the fakers?

In particular, HB 2628’s penalties seem like a slap on the wrist for those who wish to prey on our vulnerable identities.

I encourage lawmakers to toughen up and pass these bills. Would it keep deepfakes from being created? Probably not, but at least there could be some remedy.

I’m no censor, so I won’t go so far as to suggest we outlaw the tech itself. Some days, I’m tempted.

The deepfakes so far have mostly targeted public figures, who traditionally have a more difficult time suing for defamation damages when they don’t like how they are portrayed in media — it’s the old “actual malice” standard, which is important to news journalists and folks like me.

You’d have a difficult time arguing that these deepfakes are absent of any malice.

In my ethics law class back in college, I learned that people become public figures by “thrusting themselves into the public vortex.” I’ve never forgotten that line. It means they became public figures on purpose.

Sure sounds like most of us these days, as we engage in the modern-day pastime of being famous on social media. And that’s why, more than ever, this legislation just might be important to you.

We aren’t Taylor Swift, but most of us are no longer anonymous. Neither is your teenage daughter or son these days. Do Missouri lawmakers want to protect them?

Kansas, with a couple of bills introduced in January, you’re next.
