Google has announced its decision to clamp down on advertisements promoting deepfake pornography and services facilitating the creation of synthetic nudity.
This policy shift, set to take effect on May 30th, marks a departure from the tech giant's previous stance on sexually explicit content.
While Google's existing advertising guidelines already prohibit the promotion of sexually explicit material, the updated policy now explicitly bans advertisements for services that aid in the production of deepfake pornography and altered or synthetic nude content.
According to Google spokesperson Michael Aciman, the aim is to crack down on platforms and apps offering to generate or manipulate content of a sexual nature.
Aciman emphasized that Google will promptly remove any ads found to violate the new policies, using a combination of human review and automated systems to enforce compliance. Notably, that enforcement machinery removed more than 1.8 billion ads for sexual content violations in 2023 alone, according to the company's annual Ads Safety Report.
The decision to implement these changes comes amid growing concerns over the proliferation of nonconsensual deepfake pornography and its detrimental impact on individuals and communities.
Recent incidents, such as the arrest of two Florida middle school students for creating AI-generated nude images of their classmates, underscore the urgency of the problem.
Legislative efforts to combat deepfake pornography are also gaining traction, with the introduction of the DEFIANCE Act aimed at providing legal recourse for victims of digital forgery.
This legislation seeks to hold accountable those responsible for the creation and dissemination of nonconsensual deepfakes.
Google's move signals a firmer commitment to protecting users from harmful online content. By tightening its advertising policies, the company aims to curb the spread of synthetic nudity and shield individuals from the damage that manipulated imagery can cause.