Handing a smartphone over to a child in 2025 is like putting a child in the middle of a junkyard and calling it a playground.
Yes, the space contains useful tools and materials for adults who know what they are doing. They arrive with knowledge, caution, and protective equipment. They know where to step and what to avoid. And if adults do get hurt through recklessness or inattention, they have enough life experience to know how to mitigate the harm.
And even if you post warning signs at the gate, the environment itself remains full of hazards — rusted metal, broken glass, exposed wiring, spilled gasoline. A child placed in that environment is vulnerable not because the child disobeys the signs, but because the space was never designed for them to navigate safely. The danger is structural.
But kids are naturally curious, and they like to explore. Five out of every 10 children who spend time in that “playground” will be significantly, perhaps even fatally, harmed by the experience.
This is the situation we have created by normalizing smartphones for children. Smartphones were never intended for young users, yet in the U.S., more than 60% of children ages 5 to 11 and 84% of teens now have one.
Those devices are portals to an enormous ecosystem of apps — approximately 1.8 million available in Apple’s App Store alone. According to Apple’s 2023 Transparency Report, 500 experts assess about 132,000 apps each week. That breaks down to around 265 apps per reviewer per week or about nine minutes per app.
Nine minutes to determine how the app collects and stores data, whether it enables account creation (and deletion), whether it uses copyrighted materials, whether it meets hardware and software standards, whether it contains illegal or harmful content, and whether it can be used to facilitate illegal or harmful activity.
Most readers could not read the Apple App Review Guidelines in nine minutes, let alone meaningfully evaluate an app’s design, mechanics, and community-risk profile.
All that before you even get to questions of safety.
Little wonder, then, that so many apps that seem innocuous at first blush are later discovered to be a predator’s playground.
A recent New York Post headline warns, “Wizz is like ‘Tinder for kids,’ as teens use the app to hook up while adult predators lurk.” Wizz is marketed to users ages 12 to 18 as a way to meet new friends who share common interests. In practice, it functions more like a teen version of Tinder, complete with profile swiping and private messaging that connects minors with total strangers, including adults posing as teens. The Post details three cases of adult men who allegedly used the app to meet underage girls.
Wizz is far from the only example.
This fall, a married 42-year-old father of two was convicted in the U.K. for encouraging a child to self-harm. The man created six fake profiles on Discord and Snapchat, each one posing as a teenaged boy, in order to ensnare, blackmail, and abuse a 13-year-old girl.
The investigation was hampered because he used stolen identities and fake accounts to communicate with his victim and because the apps he used allowed him to set messages to "auto-delete," leaving no digital trail for investigators to follow.
Kik, an anonymous messaging app, was considered a haven for child predators because it provided anonymity and allowed users to communicate without sharing phone numbers. Vice reported in 2019 that the app was shutting down, but it is still available for download on Apple’s App Store and, as recently as this summer, was linked to a number of child exploitation cases.
Any social media platform targeted specifically to young users is ripe for abuse, but often parents do not know about the dangers until the harm has already been done. We rely on the imagined expertise and authority of professional reviewers.
If the app is available on an app store, we assume it has been properly vetted.
But the truth is that app stores rely on developers’ self-reported age-ratings and safety claims. And with less than 10 minutes to spend reviewing each app, the deck is stacked against children and families. Missing red flags is not just possible — it is inevitable.
Congress must act and pass the App Store Accountability Act.
The bill would require app stores to be transparent about how apps handle data, how they moderate interactions, and for whom their products are intended. It would establish clear responsibility when apps marketed to minors become vehicles for grooming, harassment, or exploitation. And it would ensure that companies profiting from child-facing platforms cannot simply shrug and point to the fine print when harm occurs.
The App Store Accountability Act will not eliminate every risk, but it will help end the era of Big Tech reviewing itself and calling it protection. That would be a big win for families.
Melissa Henson