
Apps that can ‘strip’ victims still available on Apple and Google stores

Apps that allow users to create AI-generated “nude” photos of real people are still available in the Apple and Google app stores.

The creation of sexually explicit deepfakes is illegal in the UK, following outrage over Elon Musk’s Grok being used to create sexualized images of women and children.

However, The Independent found that many apps that can be used to “strip” photos are still available for download from the country’s two largest app stores.

Apps that allow digital removal of clothing are still available on the Google Play Store and Apple App Store (Alamy/PA)

It comes after research by the Technology Transparency Project (TTP) found 55 apps in the US version of the Google Play Store that can digitally strip women of their clothes and show them fully or partially naked, or with very little clothing. Similarly, 47 such apps were found to be available in the US Apple App Store.

A search by The Independent showed that, in addition to the apps identified in the TTP research, several similar apps were also available in the UK versions of the app stores.

The Google Play Store policy on inappropriate content states: “We don’t allow apps that contain or promote sexual content or profanity, including pornography, or any content or services intended to be sexually gratifying.

“We do not allow apps or app content that appears to solicit or encourage a sexual act in exchange for compensation. We do not allow apps that contain or encourage content associated with sexually predatory behavior or distribute non-consensual sexual content.”

Apple says it removed 28 of 47 apps flagged by TTP (PA Wire)

Apple App Store policy states that apps “must not contain content that is offensive, insensitive, upsetting, disgusting, in extremely bad taste, or simply creepy.”

It said “overtly sexual or pornographic material” was prohibited, defined as “explicit description or display of sexual organs or activities intended to stimulate erotic rather than aesthetic or emotional feelings.”

However, both platforms hosted apps that allowed images of women to be digitally “stripped”.

One app highlighted in the TTP investigation was able to take a photo the user uploaded and create a video of the woman pictured taking off her top and dancing. The app was still available in both stores as of Friday afternoon and has been downloaded more than five million times.

Another app, available on the Google Play Store, advertises the ability to “try on” clothes and shows images of women in bikinis.

Apple said it had removed 28 of the apps identified in TTP’s report and had contacted the developers of the others to give them a chance to fix the rule violations. Google also appears to have removed some of the apps.

Following the Grok controversy, women’s rights advocates including Refuge, Women’s Aid and Womankind Worldwide said the “disturbing” rise in AI-enabled intimate image abuse was leading to “dangerous” consequences for women and girls, including for their safety and mental health.

Emma Pickering, head of technology-facilitated abuse and economic empowerment at the charity Refuge, said: “As technology improves, the safety of women and girls depends on tighter regulations on image-based abuse, whether real or deepfake, as well as specialist training for prosecutors and police.

“Women have the right to use technology without fear of abuse, and when this right is violated, survivors should have access to swift justice and robust protection.”

The Independent contacted Google and the government for comment.
