Apple and Google are reportedly pointing users to nudify apps

Published by qimuai · First-hand compilation
Source: https://www.engadget.com/apps/apple-and-google-are-reportedly-pointing-users-to-nudify-apps-065144277.html?src=rss

Summary:

According to a new report from the Tech Transparency Project, the Apple and Google app stores are still promoting multiple "nudify" AI apps. These apps can generate nude images from photos of real people or insert them into pornographic videos, and some are even rated as suitable for everyone to download, with no safeguards for minors.

The report shows that searching for keywords such as "undress" in the Apple and Google stores not only surfaces dozens of such apps, but the platforms also run ads for them in the search results. These apps have collectively been downloaded 483 million times and generated around $122 million in revenue. One of them, called Video Face Swap AI, demonstrates its features by compositing an actress's face onto a semi-nude body, yet it is still rated as suitable for all ages.

Although both companies say they ban pornographic content, and Google even has a specific policy against "nudify" apps, the investigation found that the apps remained available three months after first being exposed. Apple responded that it has removed 15 of the named apps, while Google said it suspended some of them. The project's director noted: "The platforms are not merely failing at review; they are actively directing users to these apps."

The proliferation of such apps has already triggered regulatory action in several countries. The UK's Children's Commissioner recently called for a ban on AI apps that generate nude images of children, and the California Attorney General has sent a cease-and-desist order to the social media platform X over explicit deepfake content. Worldwide, legislation targeting AI deepfake pornography is accelerating.

English source:

Apple and Google are reportedly pointing users to nudify apps
Apps that make real people nude or put them into pornographic videos were labeled as 'suitable' for kids.
Earlier this year it was revealed that Apple and Google were offering "nudify" apps on their stores despite having clear policies barring such content. Nearly three months later, such apps are not only still available, but being actively promoted on the iOS App Store and Google Play, according to a new report from the Tech Transparency Project (TTP). Many of those were labeled "E" for Everyone, meaning they can be downloaded by children.
Searching for "nudify," "undress" and other terms in those stores gives users access to apps that can make real people nude or put them into pornographic videos. "The platforms are key participants in the spread of AI tools that can turn real people into sexualized images," TTP wrote in the report. The app stores even ran ads for similar nudifying apps in the search results.
The group identified 18 nudify apps in Apple's App Store and 20 in Google Play. Some were marketed with sexual images, while others weren't advertised as such but could still be used for deepfakes. Those apps have collectively generated around $122 million in revenue and been downloaded 483 million times, according to the report.
"It’s not just that the companies are failing to actually appropriately review these apps and continue to approve them and profit from them," TTP director Katie Paul told Bloomberg. "They are actually directing users to the apps themselves."
Apple and Google both have policies banning sexual or pornographic material, and Google has a specific policy against nudifying apps. Apple told Bloomberg that it removed 15 apps identified by the group, while Google said that it suspended a number of them. One of the apps cited in the report, called Video Face Swap AI: DeepFace, advertises itself by showing an actress's face swapped onto another actress's body and allows users to put a real person's face on the bodies of partially undressed women. The app was rated "E" for Everyone.
The proliferation of nudify and deepfake apps has pushed some governments to propose laws against them. The UK's Children's Commissioner recently called for a ban on AI deepfake apps that create nude or sexual images of children. The US and other countries have proposed or created laws banning explicit deepfakes, and the California Attorney General recently sent Elon Musk's X a cease and desist order over Grok's explicit deepfakes.

Engadget