Dive Brief:
- Attorneys general from 44 states asked payment service providers, including Apple and Visa, to confront computer-generated nonconsensual intimate imagery, commonly known as deepfake pornography or revenge porn. They also addressed a similar letter to three search engine companies.
- The AGs wrote to a half dozen financial technology companies on Friday through their association, asking how the companies identify and remove payment authorization for deepfake tools and content. The letter from the National Association of Attorneys General also sought a commitment from the companies “to take further action to avoid being complicit in the creation and spread” of such images.
- The letter was addressed to six companies – American Express, Apple, Google, Mastercard, PayPal Holdings and Visa – and five of them did not immediately respond to a request for comment. The states’ letter asked the payment companies to “enforce their policies and avoid playing any role in allowing people to profit” from nonconsensual faked images.
Dive Insight:
Payment platforms have previously used their terms of service and acceptable use policies to “prohibit their payment services being used in connection with harmful content,” the AGs wrote.
The letter also drew signatures from the AGs of American Samoa, Puerto Rico and the U.S. Virgin Islands, while the attorneys general of six states – Alabama, Florida, Indiana, Kansas, Montana and Texas – did not sign.
“As this technology becomes more powerful and creates more potential for harm to the public, businesses that help people search for, create, and distribute this content need to be aware of their role in propagating this content and work to prevent its spread,” the AGs wrote.
Visa, Mastercard and American Express are the largest U.S. card payment networks and operate internationally as well, while Apple, Google and PayPal are some of the largest tech companies that offer digital wallets and payment services.
“We have zero tolerance for unlawful activity on our network,” a spokesperson for Mastercard said Thursday in an emailed statement. “When we see or are made aware of specific instances of such activity, we investigate the allegations and take action to ensure compliance with both local laws and our rules and standards.”
In the AGs’ letter, the states’ top law enforcement officers “respectfully” requested responses explaining how the companies identify and remove payment authorization for deepfake porn tools and content. They also sought corporate commitments to take action against nonconsensual deepfake content, often referred to as revenge porn. This material is imagery or video of real people that has been digitally manipulated to create a pornographic depiction of them.
“Payment platforms must be more aggressive in identifying and removing payment authorization for deepfake NCII tools and content,” the AGs wrote.
“We urge you to treat the problem of deepfake NCII seriously and to work to better prevent your own complicity in the creating and sharing of this content.”
The states opened their letter by noting “that we have the utmost respect for the First Amendment and your rights under it,” and asked the companies to assess how they can mitigate nonconsensual deepfake content.
The AGs sent a second, similar letter Monday to the top legal officers of Google, Microsoft and Yahoo, addressing those companies’ efforts to combat nonconsensual imagery on their search engines.
In May, President Donald Trump signed into law the Take It Down Act, or Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks. The law requires “covered platforms” to comply with “certain notice and takedown obligations with respect to intimate visual depictions and deepfakes” by May 19, 2026, the law firm Skadden, Arps, Slate, Meagher & Flom wrote in a briefing about the act.
Under the new law, platforms must create processes to remove deepfake images within 48 hours of a request from people depicted and “make reasonable efforts to identify and remove” copies of the depiction.
Covered platforms include public websites, online services and applications and mobile applications that “primarily provide a forum for user-generated content or are primarily designed to publish nonconsensual intimate visual depictions,” the Skadden Arps attorneys wrote.