

Meta's Dark Secret: How Reels is Exposing Kids to X-Rated Content

2025-03-24
That this is happening on a platform aimed at kids and young adults shouldn't be surprising given what we've recently learned about Meta's algorithm. As one expert quoted in the report put it, "It has been shown time after time from various accounts of individuals being misclassified as having inappropriate content," warning that misclassification like this could cause real trouble for children and young adults, particularly those whose parents are already on edge about their child's wellbeing.

It's also less shocking in light of the accusations that Meta deliberately targeted kids under 13. The bigger issue goes beyond the content itself: how were these accounts created, who manages them, and how can any company be this reckless? The central question is whether Meta will fix its algorithm so that kids stop seeing this content, or whether a deeper problem remains that the platform has yet to answer for, and if so, why children in particular are being served "unrated" material despite their age.

On top of the accusations about deliberately created accounts, Meta now has to answer for Reels, its head-to-head competitor to TikTok, which a recent WSJ report found was serving up X-rated material by mistake. That points to a problem larger than the under-13 accusations alone, and one that is still with us today. If there is any truth to these claims, Meta should be held accountable, especially given its response so far, because children are the ones suffering the effects of watching content that is not shaped by anything good. Three points stand out:
**1.** In a statement to the WSJ on this issue, Meta said its algorithm would be improved so that kids wouldn't see such content, yet numerous instances show these measures are still neither fully effective nor guaranteed, raising the age-old question of whether an organization like Meta can be trusted. More interesting still, the WSJ found that Meta didn't directly answer its questions about the Reels accounts that served users X-rated material when the content was supposed to be geared towards children only, leaving open the larger question of who could be behind such recommendations and why.
**2.** For an organization like Meta, especially given the recent accusations about creating accounts for kids under 13, any less-than-ideal outcome looks bad. The issue grows larger still while questions remain about who was behind the X-rated content and why it appeared on Reels, a platform supposedly geared towards children, which begs the question of what Meta actually plans for its youngest users.

**3.** This raises a bigger concern: if this kind of thing is happening now, what becomes of these accounts as the algorithm keeps driving Reels, especially after the platform has already been caught serving X-rated content? One can only imagine the measures Meta must take to show it isn't backing down but is looking beyond itself, beyond simply not wanting kids to be served certain material, to the fact that the very account holders who should be protected end up receiving more "unrated" content, all while the company still presents itself as a platform that helps its users.
Meta owes more than an answer; it owes full transparency into how Reels serves X-rated material and why kids receive it on accounts supposedly geared to their age. That naturally raises questions about the company's internal workings, which do not come off well here. Recent reports have repeatedly made Meta look less than ideal, especially now that we know even "geared" accounts have been served X-rated material by mistake. And when asked whether Reels had measures in place to stop kids from being served this content, the WSJ found that Meta's responses, judging by how they were worded, may amount to little more than lip service.
