Roblox and Discord, two popular California-based web platforms, are facing renewed scrutiny from tech experts and parents following a series of high-profile child exploitation cases, including abductions and other unlawful conduct.
Roblox, in particular, has made headlines in the past for alleged inaction regarding child safety measures.
The platform, which welcomes more than 85 million players daily, hosts a vast young audience: roughly two in five Roblox players are under the age of 13.
With an overwhelming majority (85%) of American children playing video games, and 41% logging in daily, parental controls and abuse prevention remain at the forefront of guardians’ and lawmakers’ minds.
Kids are unwittingly using their online avatars to connect with predators, and some users on the platform emulate violent and frightening acts.
Consumers, Experts Call for Improved Controls
Historically, web-based parental controls have filtered unsafe content with keyword blocklists, age verification, and other barriers once thought impenetrable, digital walls built to keep minors away from concerning content.
However, today’s tech landscape differs starkly from the post-Y2K era in which online forums and social media platforms first emerged and gained mass adoption.
As such, platforms must update their protocols to account for modern kids’ tech fluency, high screen time, and often unfettered access to generative artificial intelligence.
“The digital spaces kids use today are fast-moving and social,” notes a spokesperson with global online gaming brand Mobile Premier League (MPL) in a news release. “Safety needs to be built in at the design stage, not added later. Controls must be flexible, proactive, and truly protective. Effective regulation of digital environments is essential to protect young players and ensure safer gaming communities.”
In the news release, MPL explicitly details three ways global platforms can improve safety for young users:
Integrated Real-Time Monitoring. An AI-driven system with human oversight would leverage natural-language processing and image recognition to detect and neutralize emerging threats across text, voice, and video streams (a simplified sketch of this idea follows the list).
User-Driven Customization. Dashboards should empower parents to configure settings per app or interaction type, adjusting filters, time limits, contact whitelists, and keyword alerts as a child’s maturity and usage evolve (see the configuration sketch below).
Industry Collaboration and Standardization. Technology companies, regulators, and child-safety advocates must unite behind a unified framework: interoperable reporting APIs, regular third-party audits, shared best practices, and safety certifications to close protection gaps left by fragmented protocols.
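MPL’s release names the goal rather than an implementation, but the first recommendation can be illustrated with a toy Python sketch: a crude pattern matcher stands in for a trained NLP model, high-confidence hits are blocked outright, and uncertain cases are escalated to a human-review queue. Every name here (RISK_PATTERNS, ReviewQueue, moderate) is hypothetical, and a real system would rely on trained models across text, voice, and video rather than a handful of regular expressions.

```python
import re
from collections import deque
from dataclasses import dataclass

# Hypothetical grooming-risk patterns; purely illustrative stand-ins
# for what a trained natural-language model would detect.
RISK_PATTERNS = [
    re.compile(r"\b(what'?s your address|send (me )?a photo)\b", re.I),
    re.compile(r"\b(don'?t tell your parents|keep (this|it) secret)\b", re.I),
]

@dataclass
class Message:
    user_id: str
    text: str

def risk_score(message: Message) -> float:
    """Crude stand-in for an NLP model: fraction of patterns matched."""
    hits = sum(1 for p in RISK_PATTERNS if p.search(message.text))
    return hits / len(RISK_PATTERNS)

class ReviewQueue:
    """Human-oversight stage: uncertain cases wait for a moderator."""
    def __init__(self) -> None:
        self._pending = deque()

    def submit(self, message: Message, score: float) -> None:
        self._pending.append((score, message))

    def next_case(self):
        return self._pending.popleft() if self._pending else None

def moderate(message: Message, queue: ReviewQueue) -> str:
    score = risk_score(message)
    if score >= 0.5:   # high confidence: block immediately
        return "blocked"
    if score > 0.0:    # uncertain: escalate to a human moderator
        queue.submit(message, score)
        return "escalated"
    return "allowed"

if __name__ == "__main__":
    q = ReviewQueue()
    print(moderate(Message("u1", "gg, nice round!"), q))                      # allowed
    print(moderate(Message("u2", "keep it secret, what's your address"), q))  # blocked
```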
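The second recommendation, user-driven customization, boils down to per-app configuration data that a dashboard reads and writes. The sketch below shows what such settings might look like; the class and field names (AppControls, ChildProfile) are illustrative assumptions, not any platform’s actual API.

```python
from dataclasses import dataclass, field

@dataclass
class AppControls:
    """Per-app settings a parent could tune from a dashboard.

    All field names are illustrative, not a real platform's API.
    """
    chat_filter_level: str = "strict"        # "strict" | "moderate" | "off"
    daily_minutes: int = 60                  # screen-time limit
    contact_whitelist: set[str] = field(default_factory=set)
    keyword_alerts: set[str] = field(default_factory=set)

@dataclass
class ChildProfile:
    name: str
    age: int
    per_app: dict[str, AppControls] = field(default_factory=dict)

    def loosen_for_maturity(self, app: str) -> None:
        """Example of settings evolving as the child gets older."""
        controls = self.per_app.setdefault(app, AppControls())
        if self.age >= 13:
            controls.chat_filter_level = "moderate"
            controls.daily_minutes = 90

# Usage: a parent whitelists known friends and sets keyword alerts.
profile = ChildProfile("Alex", 12)
profile.per_app["roblox"] = AppControls(
    contact_whitelist={"friend_from_school"},
    keyword_alerts={"address", "photo", "secret"},
)
```

Keeping the settings per app, rather than global, mirrors MPL’s point that controls must adapt as a child’s maturity and usage evolve.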
Too Little, Too Late
While both Discord and Roblox have strengthened safety protocols for users younger than 13, frightened and frustrated parents and police contend that the proper time to address these problems was yesterday.
The brands have been on the receiving end of lawsuits, including one that New Jersey Attorney General Matthew J. Platkin announced just last week against the digital messaging app Discord. The lawsuit alleges the platform has misled parents about the reliability of its safety measures.