Fake nude photos with faces of underage celebrities top some search engine results

Deepfake images that graft a child’s face onto sexually explicit material are easily found in top image search results on leading search engines and mainstream social media platforms, despite a U.S. law that appears to ban such material.

Such images often feature the faces of well-known celebrities, taken before they turned 18, combined with an adult’s nude body. Two of the top 10 image search results for the term “fake nudes” on Microsoft’s Bing were sexually explicit deepfakes of female celebrities from when they were ages 12 and 15, according to a review conducted by NBC News. One of those images was also returned on the first page of Google image search results for one of the celebrity’s names plus “fake nudes.”

Microsoft and Google both said they removed the specific material identified by NBC News from top search results. However, searches for terms like “fake nudes,” and for specific celebrities’ names plus “fake nudes” and “deepfakes,” still return pages of similar nonconsensual sexually explicit images on both platforms. The remaining deepfake results include faces that appear to be adults as well as young-looking faces that NBC News has not investigated.

“Microsoft has a long-standing commitment to advancing child safety and removing illegal and harmful content from our services. We’ve removed this content and remain committed to strengthening our defenses and safeguarding our services from abusive content and conduct online,” a spokesperson for Microsoft said in a statement.

“Google Search has strong protections to limit the reach of abhorrent content that depicts CSAM [child sexual abuse material] or exploits minors, and these systems are effective against synthetic CSAM imagery as well,” a Google spokesperson said in a statement. “We proactively detect, remove and report such content in Search, and have additional protections in place to filter and demote content that sexualizes minors. In line with our policies, we’ve removed the image that was reported to us, and we are constantly updating and improving our algorithms and policies to combat these evolving threats.”

In January, NBC News found two examples of sexually explicit fake media featuring teen Marvel star Xochitl Gomez that appeared at the top of X (formerly Twitter) search results for her name and the word “nudes.” X didn’t respond to a request for comment at the time, but the results have since disappeared.

A deepfake is fake or misleading media that is generated or edited using AI tools. Nonconsensual sexually explicit deepfakes generally refer to material created with AI technology, as opposed to the computer-generated or manually edited fake nude photos that predate the current AI wave.

A federal law that two legal experts say prohibits the knowing production, distribution, receipt or possession of computer-generated sexually explicit images of children has been in the U.S. Code since the late 1990s. But the experts say the law has not been enforced against tech companies hosting images like the ones NBC News identified; rather, it has occasionally been used to convict individual perpetrators. There are a variety of reasons such material stays up on social media platforms and search engines, the experts said.

“Most people don’t really know what the law on child pornography is, because it’s not obvious. It doesn’t just mean ‘naked pictures of kids,’” said Mary Anne Franks, a professor at George Washington University Law School and a frequent adviser on legislation about nonconsensual sexually explicit material.

“A lot of people just kind of assume they know what the law bars, but you really don’t, and it’s not something people really want to talk about or ask about,” Franks said. “There’s this kind of inhibition even for good-faith people who are trying to figure out what the law is to not want to investigate it, because even trying to figure out research on it, depending on your search terms, could actually present you with CSAM.”

Franks said social media platforms and search engines may not be liable for simply hosting such material. Legal arguments based on Section 230, a separate part of the U.S. Code that shields web platforms from liability for user-generated content, have previously protected tech companies from litigation over illegal behavior facilitated through their services.

But legal experts have challenged that interpretation of Section 230. Franks also said an argument could be made that platforms are part of the process of generating such material, for example by hosting links to apps and services that can create sexually explicit deepfakes of children, and that such content is not necessarily protected by Section 230.

On Microsoft’s Bing, an image search for “fake nudes” from an incognito browser with safety filters off (allowing pornographic content to be shown) returns pages of fake nude images of female celebrities. Two of the first 25 results include the faces of minors.

Six former Disney Channel stars, including Miley Cyrus, appear in the top 25 results. Cyrus did not respond to a request for comment.

One of the first 10 image search results shows Cyrus’ face cut out and pasted onto a pre-existing pornographic image. The result could legally be considered CSAM, as the image used was of Cyrus when she was 15 years old. Such an image can be created with photo-editing tools that have been available to consumers since the early aughts. It can also be created much more quickly and easily — and sometimes made more realistic-looking — with the AI-enhanced tools available today. The image result on Bing links to a forum post made in December 2013.

A reverse-image search for the picture of Cyrus’ face in the fake nude image retrieved its original location: It was the cover of her 2008 single “Start All Over.” The song was released in March 2008, meaning Cyrus was 15 or younger when the photo was taken. A Google search for “Miley Cyrus fake nudes” returned the same fake nude image showing a 15-year-old Cyrus in the top 10 Google image search results.

The fourth result for “fake nudes” on Bing shows the face of actor Peyton List, who appeared in the Disney Channel series “Jessie” beginning in 2011, when she was 13. A reverse-image search for List’s face in the fake nude image retrieves a match for a Getty photo of List from 2010, when she was 12. The search result links to a since-deleted Tumblr post on a blog that has not been active for seven years. This deepfake of List may or may not clear the bar for CSAM, since it depicts nudity but not necessarily sexually explicit conduct.

The result next to the underage fake nude photo of Cyrus is a similar fake nude photo of Selena Gomez. The Gomez image links to a post in the same December 2013 forum thread. A reverse-image search found that the photo of Selena Gomez’s face was uploaded on July 30, 2010, eight days after she turned 18, demonstrating how difficult it can be, especially by guesswork alone, to determine whether a face in a fake nude image belongs to a child or an adult. List and Selena Gomez didn’t immediately respond to requests for comment.

The fake nude images NBC News reviewed in Bing search results have been circulating online for close to 15 years, meaning they predate the current wave of artificial intelligence technology that has been used to victimize women and girls at scale with increasingly sophisticated fake media.

“Even before the 2000s, in the 1990s, as soon as the internet makes it possible for images to be supported, there are all these porn galleries that do this really crude Photoshopping of celebrities onto naked bodies,” Franks said.

When it comes to adult victims in the U.S., there is only a patchwork of state laws governing the issue of nonconsensual sexually explicit deepfakes. For underage girls, there is less of a legal gray area — but enforcement and paths to recourse have been lackluster, leaving some child deepfake victims disillusioned.

NBC News’ report on the fake nude images of Xochitl Gomez followed a podcast appearance in which she expressed frustration at being unable to get the images taken down.

“Why is it so hard to take down? That was my whole thought on it, was, ‘Why is this allowed?’” Gomez said on the podcast, explaining that her mother and team had tried and failed to get the images taken down from X. X has specific policies against nonconsensual sexually explicit deepfake images of anyone.

Deepfake nude and sexually explicit images of children are already illegal, according to a Lawfare paper published this month by Riana Pfefferkorn of the Stanford Internet Observatory.

Pfefferkorn cited Section 2252A of the U.S. Code, which was first enacted in 1996 and amended by Congress’ 2003 Protect Act. The section prohibits knowingly creating, sharing, receiving or possessing with the intent to distribute fake sexually explicit material depicting a child (referred to as “morphed” images), as well as “a digital image, computer image, or computer-generated image that is, or is indistinguishable from, that of a minor engaging in sexually explicit conduct.”

Pfefferkorn, a research scholar with years of litigation experience, has been writing about deepfakes and the law since around 2018. She has observed the evolution of the technology and the explosion of deepfake sexual abuse, but Pfefferkorn said she was surprised that media coverage of high-profile incidents neglected to address the existing laws around computer-generated CSAM. For example, some cases involving students at middle and high schools have prompted criminal police investigations, but others have not.

“The salient question might be, why isn’t that getting enforced?” Pfefferkorn said. “Is it a hesitancy to apply the law against teenage boys? Is it that the police don’t know about it? Is it that they’re calling the FBI and the FBI isn’t picking up? You know, what’s the disconnect here?”

“We see hearing after hearing after hearing in Congress where they’re calling up the CEOs of tech companies, and I haven’t seen anything where the FBI or DOJ is being asked to account for themselves,” Pfefferkorn continued.

Microsoft CEO Satya Nadella commented on the issue of nonconsensual deepfakes of adults, like Taylor Swift, in a January interview with NBC News. He said companies like Microsoft needed to “move fast” to create “guardrails” around AI technology, and also urged law enforcement and government to act on the issue.

The Department of Justice and FBI did not respond to requests for comment.

To fill state-level legal gaps, Pfefferkorn said she has been in communication with state lawmakers in Connecticut and California who are working on legislation related to morphed deepfake images and children. In 2020, California passed a law that allows victims of nonconsensual sexually explicit deepfakes of any age to file a civil suit, but it does not criminalize such images. In February, the Alabama House passed legislation to combat AI-generated CSAM, which includes the ability for courts to award punitive damages and directs the State Board of Education to require local school board policies for incidents related to such material.

Franks told NBC News that people victimized by such material can be discouraged by their lawyers from attempting to pursue takedowns legally, given the time, money and effort such cases take.

Franks said big tech companies like X and Microsoft are not likely to proactively search for deepfake CSAM or other fake nude images depicting children, in part because of the additional responsibility of determining the depicted person’s age and whether they’re a real or fictitious child. In 2002, the Supreme Court ruled that computer-generated CSAM depicting fictitious children is constitutionally protected, unlike morphed images that superimpose a real child’s face on an adult’s nude body.

“I think a big priority for companies that do care about this issue was to go after what seemed to be ‘real’ CSAM not only that is seemingly the most harmful, but it’s also the clearly violative thing that’s illegal and has no First Amendment protection,” Franks said.

“Any time they would actively go searching for it, they would put themselves in a different potential frame of liability than if they just didn’t realize it was there,” she added. “In criminal law, there’s a distinction between ‘You didn’t know it was there and you didn’t do anything’ and ‘You did know it was there and you didn’t do anything.’ There’s almost this incentive not to go looking for it.”
