Google has updated its Inappropriate Content Policy to include language that expressly prohibits advertisers from promoting websites and services that generate deepfake pornography. While the company already has strong restrictions in place for ads that feature certain types of sexual content, this update leaves no doubt that promoting "synthetic content that has been altered or generated to be sexually explicit or contain nudity" is in violation of its rules.
Any advertiser promoting sites or apps that generate deepfake porn, provide instructions on how to create it, or endorse and compare deepfake porn services will be suspended without warning and barred from publishing ads on Google. The company will start enforcing the rule on May 30 and is giving advertisers until then to remove any ad that violates the new policy. As 404 Media notes, the rise of deepfake technology has led to a growing number of ads promoting tools that specifically target users who want to create sexually explicit material. Some of those tools reportedly even pose as wholesome services to get listed on the Apple App Store and Google Play Store, but the masks come off on social media, where they openly promote their ability to generate manipulated porn.
Google has, however, already prohibited services that create sexually explicit deepfakes in Shopping ads. Similar to the upcoming, broader policy, the company bans Shopping ads for services that "generate, distribute, or store synthetic sexually explicit content or synthetic content containing nudity." That ban also covers deepfake porn tutorials and pages advertising deepfake porn generators.
This article originally appeared on Engadget at https://ift.tt/qlSLYVI