
The addictive nature of social media and the negative toll it takes on our mental health — particularly for young’uns with developing brains — isn’t exactly a new concept. A generation of iPad kids is facing startling rates of depression, anxiety, and attention spans shrinking faster than wool socks in a hot dryer.

But companies like Alphabet (which operates Google and YouTube) argue that it isn’t their fault, and that protecting children has “always been core to their work.”


Well, many Americans — and even entire school districts — vehemently disagree. Now, four social media titans, Alphabet, Meta (Facebook/Instagram), ByteDance (TikTok), and Snap (Snapchat), are facing an onslaught of lawsuits.

In their defense, the companies argued they were immune from liability and pushed for the dismissal of all claims against them, citing the U.S. Constitution's First Amendment and Section 230 of the federal Communications Decency Act, a provision that shields social media companies from liability for anything users publish on their platforms.

In a landmark decision last Tuesday, U.S. District Judge Yvonne Gonzalez Rogers ruled against the four horsemen of the mental health-pocalypse, denying them protection from litigation. Now, they’ll have to face the music. 

Rogers said the issue wasn’t simply a matter of third-party content influencing impressionable minds; the companies also have a legal responsibility to design reasonably safe products and warn users of known defects. Because they neglected to help users limit screen time, used ineffective age-verification tools, and created barriers to account deactivation, social media developers can’t get away with shifting blame onto their content creators.

As a CX professional, do you have any ideas for how social media companies can create safer experiences for their users? As we’ve previously discussed, establishing trust with your customer base is uber-important. What should these companies do to re-establish trust with the legions of concerned parents, school boards, and even legislators? 

Safety First: Which Companies Are Getting it Right?

One of Rogers’ suggestions for social media companies to improve child safety was the application of age-verification tools that notify parents when their kids are online — but of course, that alone won’t put an end to a full-blown mental health crisis. Where else can we look for examples of effective child safety features in the digital realm? 

Perhaps Mr. Zuckerberg should take a page from the playbook of the experts: Roblox. 


For those of you who aren’t well-versed in the world of pre-teen gaming, Roblox is a platform that allows its users — predominantly children — to create and upload their own video games and explore the creations of other users. It's free to play, but there are paid features to unlock additional content. Like many major social media channels, Roblox allows users to send friend requests and engage in chats.

Eighty percent of its users are under 16. With 70.2 million daily users and 214 million monthly users, this gaming platform has risen to rank alongside Minecraft and Fortnite in terms of popularity amongst kids. Of course, Roblox doesn’t have a spotless record when it comes to child safety issues. However, they’ve made a strong effort to enforce safety and security features, with a wealth of resources for parents so they can effectively monitor their kids’ activity. There are also loads of third-party forums for parents to seek answers and advice about how to help their kids play responsibly.

24/7 Customer Service and Safety Helplines

Roblox ensures that someone is always available to address parental concerns as soon as possible. That’s a pretty solid CX strategy, in our humble opinion! They’ve also assembled a Trust & Safety Advisory Board comprised of some of the world’s top experts in digital safety.

A Huge Team of Moderators, Plus Some Help from AI

Every piece of content uploaded to Roblox undergoes a rigorous safety review. The company currently employs over 1600 moderators to monitor inappropriate content in games and chats. After implementing machine-learning software to supplement their moderation efforts, they saw an 84% reduction in the number of users exposed to policy violations.
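Roblox hasn’t published the internals of that pipeline, but to picture how a hybrid human-plus-ML workflow might fit together, here’s a minimal sketch in Python. Everything in it — the `classify_content` stand-in model, the thresholds, the `triage` routing — is a hypothetical illustration, not Roblox’s actual system: a classifier pre-screens each upload, auto-blocks high-confidence violations, and sends anything uncertain to a human moderator.

```python
# Hypothetical sketch of an ML-assisted moderation pipeline.
# All names and thresholds are illustrative, not Roblox's real system.
from dataclasses import dataclass

BLOCK_THRESHOLD = 0.95   # confident violation -> auto-reject
REVIEW_THRESHOLD = 0.50  # uncertain -> route to a human moderator

@dataclass
class Upload:
    item_id: str
    text: str

def classify_content(upload: Upload) -> float:
    """Stand-in for a trained classifier; returns a violation probability."""
    flagged_terms = {"scam", "phishing"}
    hits = sum(term in upload.text.lower() for term in flagged_terms)
    return min(1.0, 0.5 * hits)

def triage(upload: Upload, human_review_queue: list) -> str:
    score = classify_content(upload)
    if score >= BLOCK_THRESHOLD:
        return "blocked"                    # never reaches other users
    if score >= REVIEW_THRESHOLD:
        human_review_queue.append(upload)   # a moderator makes the final call
        return "pending_review"
    return "published"

queue: list = []
print(triage(Upload("g1", "Fun obby for everyone!"), queue))          # published
print(triage(Upload("g2", "Free robux! totally not a scam"), queue))  # pending_review
```

The point of a setup like this is exactly the trade-off the numbers suggest: machines catch the obvious violations at scale, while the 1600-plus human moderators spend their time on the judgment calls.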

Age-Based Content Ratings

Some of the games feature mild violence, so Roblox categorizes content accordingly. For example, a game with sword-wielding knights would get a 13-and-up tag, whereas the tamer games get a 9-and-up rating.  

Customizable Parental Controls 

Parents who don’t want their kids engaging in any chat features can disable or limit chat entirely. Those who do allow chat can easily monitor chat logs and friend requests. Parents can also customize access to different games and experiences based on age recommendations, and place restrictions on spending to avoid any surprises on their credit card statement.
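To make that a bit more concrete, here’s a small Python sketch of how a per-child settings object like this could be modeled. The field names (`chat_enabled`, `monthly_spend_limit_usd`, `max_content_rating`) are hypothetical and not Roblox’s real API; the sketch just shows the three levers described above: chat on or off, a spending cap, and access filtered by age rating.

```python
# Hypothetical parental-control settings; field names are illustrative only.
from dataclasses import dataclass

@dataclass
class ParentalControls:
    child_age: int
    chat_enabled: bool = False            # chat off unless a parent opts in
    monthly_spend_limit_usd: float = 0.0  # 0 = no in-game purchases allowed
    max_content_rating: int = 9           # e.g. allow 9+ but not 13+ experiences

    def can_play(self, experience_rating: int) -> bool:
        """Allow only experiences at or below the chosen age rating."""
        return experience_rating <= self.max_content_rating

    def can_spend(self, amount_usd: float, spent_so_far: float) -> bool:
        """Block a purchase that would push spending past the monthly cap."""
        return spent_so_far + amount_usd <= self.monthly_spend_limit_usd

controls = ParentalControls(child_age=10, chat_enabled=False,
                            monthly_spend_limit_usd=10.0, max_content_rating=9)
print(controls.can_play(9))                          # True  -> 9+ experience allowed
print(controls.can_play(13))                         # False -> 13+ experience blocked
print(controls.can_spend(4.99, spent_so_far=7.50))   # False -> over the monthly cap
```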

Working Together to Keep Kids Protected

When it comes to limiting usage and preventing a full-blown gaming addiction, a lot of the responsibility, of course, falls on parents. But considering the number of customizable resources, safety features, and available support agents, Roblox empowers parents with the tools and information they need to control their kids’ gaming experiences. There are no complicated hoops to jump through and no excessively long wait times — just simple, streamlined processes and a helpful team of customer experience agents.

Want to learn more about the importance of developing ethical digital products that prioritize user well-being? Check out this episode of The Product Manager podcast, where we chat with Samantha Gonzalez, Director of Product Strategy at DockYard, Inc., to talk about ethical product strategies.