
White House: Sexually explicit fake images of Taylor Swift ‘very alarming’

US President Joe Biden’s spokesperson has said fake, sexually explicit images of Taylor Swift that circulated online were “very alarming”.

White House Press Secretary Karine Jean-Pierre said on Friday that social media companies have “an important role to play in enforcing their own rules”, as she urged Congress to legislate on the issue.

The fake images of the pop star, believed to have been made using artificial intelligence (AI), were spread widely on social media this week, with one picture on X viewed 47 million times before the account was suspended.

The group Reality Defender, which detects deepfakes, said it tracked a deluge of nonconsensual pornographic material depicting Swift, particularly on X but also on Meta-owned Facebook and other social media platforms.

The researchers found several dozen different AI-generated images. The most widely shared were football-related, showing a painted or bloodied Swift in images that objectified her and, in some cases, depicted violent harm being inflicted on the deepfake version of her.

Ms Jean-Pierre said: “We’re going to do what we can to deal with this issue.

Image: Taylor Swift, right, supporting her boyfriend Travis Kelce at a Kansas City Chiefs game last week. Pic: AP

“So while social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation, and non-consensual, intimate imagery of real people.”


The spokesperson added that lax enforcement against false images too often disproportionately affects women.

Researchers have said the number of explicit deepfakes has increased in recent years, as the technology used to produce such images has become more accessible and easier to use.


In 2019, a report released by the AI firm DeepTrace Labs found that explicit deepfake images were overwhelmingly weaponised against women.

Most of the victims were Hollywood actors and South Korean K-pop singers, the report said.

X wrote in a post on the site on Friday that it is "actively removing all identified images and taking appropriate actions against the accounts responsible for posting them.

“We’re closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed.”

Video: Is AI an existential threat?

Meanwhile, Meta said in a statement that it strongly condemns “the content that has appeared across different internet services” and has worked to remove it.

“We continue to monitor our platforms for this violating content and will take appropriate action as needed,” the company said.

Taylor Swift’s representatives did not respond to a Sky News request for comment.
