Meta’s adult content advertising policy has a new challenger to contend with: Wired found over 29,000 explicit “AI girlfriend” ads across Meta’s many platforms. The company has committed to artificial intelligence with its entire soul, relentlessly promoting its AI features on Facebook, Instagram, and WhatsApp. The tacky Instagram redesign nudging you toward its own AI is one thing, but the company’s advertising filters have somehow let these AI girlfriend ads slip through its digital fingers. Despite carrying terms like “NSFW” and “NSFW AI,” the ads evaded the company’s notice and turned up in Meta’s ad library.
Explicit AI Girlfriend Ads Show Up on Meta Platforms
It can’t be easy running a social media platform and policing all the content that millions of users upload every day. Sympathies aside, it is what a company signs up for when it decides to dominate every possible social media use case without pacing its growth. The explicit AI girlfriend ads are a natural progression of everything that happens on the internet—sexual content is inevitable on every platform because it is a quick way to draw attention and generate clicks on unsafe content. Redditors will tell you there’s a rule for what happens to everything that exists, but we’re not going to dive into their theory right now.
In this specific case, Wired found over 29,000 ads for explicit AI girlfriends, primarily on Facebook and Instagram. These could be viewed in Meta’s ad library, a database where you can search “all the ads currently running across Meta technologies, ads about social issues, elections or politics that have run in the past seven years, and ads that have run anywhere in the EU in the past year.” It’s a useful tool for those who want to study advertising trends, but it also helps hold the company accountable for the advertising it allows on its platforms.
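For readers who want to poke at the ad library themselves, Meta exposes it programmatically through the Graph API’s `ads_archive` endpoint. The sketch below builds a search query for it; the API version string, the exact field names, and the placeholder access token are assumptions for illustration, so check Meta’s developer documentation before relying on them.

```python
# Minimal sketch: constructing a Meta Ad Library API search URL.
# The version string (v19.0), field list, and token are assumptions --
# consult Meta's Ad Library API docs for current values.
from urllib.parse import urlencode

GRAPH_BASE = "https://graph.facebook.com/v19.0/ads_archive"  # assumed version

def build_ad_library_query(search_terms: str, countries: list,
                           access_token: str, limit: int = 25) -> str:
    """Construct a search URL for the public Ad Library endpoint."""
    params = {
        "search_terms": search_terms,                 # e.g. "NSFW AI"
        "ad_reached_countries": ",".join(countries),  # required by the API
        "ad_type": "ALL",
        "fields": "id,ad_creative_bodies,ad_delivery_start_time,page_name",
        "limit": limit,
        "access_token": access_token,
    }
    return f"{GRAPH_BASE}?{urlencode(params)}"

url = build_ad_library_query("NSFW AI", ["US"], "YOUR_ACCESS_TOKEN")
print(url)
```

Fetching that URL (with a real token) returns a paginated JSON list of matching ads, which is presumably how researchers tally figures like the 29,000 ads Wired reported.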
As per the report, the AI girlfriend ads on Meta’s platforms contained sexually explicit images paired with equally inappropriate text, all of it undeniably pornographic no matter how you look at it. These AI chatbots even went so far as to use popular animated characters as the face of the ad, which is one way to shatter anyone’s childhood innocence. The outcomes could be dangerous if children happened to be scrolling through these apps unmonitored.
The AI Girlfriend Ads Are Not Surprising but They Are Alarming
Meta AI itself is a work in progress, and considering the focused lens on its every move, the company is more careful about the content it puts out. When it comes to advertising, however, the efforts have always been laxer. The explicit AI girlfriend or “companion” ads are worrying all by themselves, appearing as they do on apps that claim to have a clear adult content advertising policy in place. Worse still, these ads often lead to very real apps you can download from the Google Play Store and Apple’s App Store. For all the talk about security policy and the vetting of apps before they reach these storefronts, it is concerning to see this form of explicit content so openly portrayed.
If you go to the Meta ad library right now, many of these ads are still up and running with an “active” status. A few carry a disclaimer that states, “This content was removed because it didn’t follow our Advertising Standards.” If it is this easy for us to identify the AI girlfriend ads on Meta’s platforms, how have they escaped those in charge of removing them? It’s something to consider. A Meta spokesperson told Wired that the company works quickly to remove such ads and improve its systems, but the fact that we can still view them suggests something is missing in its checks and balances even now.
Meta’s Adult Content Advertising Policy
The explicit AI girlfriend ads bring the focus back to Meta’s adult content advertising policy and what the company takes a stand against on its platforms. Artistic content depicting nude figures without implying sexual context is allowed, as are medical diagrams. Despite this, many artists have found their work repeatedly flagged and taken down by the company, which makes it all the more frustrating to see explicit ads arrive unfiltered.
Drawing the line between what should and shouldn’t be allowed is complex, but there are guidelines in place to help navigate these rules. Adult content prohibited on Meta’s platforms includes:
- Any depictions of nudity or implied nudity that are not permitted by their policies. The policy makes an exception for nudity as “a form of protest, to raise awareness about a cause or for educational or medical reasons”
- Any depictions of excessive visible skin, even when the content is not particularly sexual in nature
- A focus on individual body parts
- Ads with partly-clothed models promoting dating services
The AI girlfriend ads on Meta’s platforms aren’t the only ones being called out recently. After YouTube’s relentless crackdown on ad blockers and modded apps, users have turned to Google to ask what the company is doing to end the solicitation they put up with from spam ads on its platform. If companies want to cling to their advertising revenue, they need to be more proactive about ensuring that only appropriate ads make it onto their platforms.
The post Tracking The AI Apocalypse—29,000 Explicit AI Girlfriend Ads Plague Meta appeared first on Technowize.