Child Safety, Digital Platforms, and the Roblox Litigation

Online platforms that cater to children and teens face growing scrutiny over how effectively they protect young users from exploitation. That scrutiny has intensified around Roblox, a popular platform used primarily by minors.
State attorneys general and private plaintiffs have alleged that Roblox failed to adequately protect children from sexual exploitation, grooming, and sextortion, prioritizing platform growth and profit over child safety. While denying liability, Roblox has faced mounting pressure to implement new protections, including AI-assisted age-verification tools.
JBF Experience Highlight
The Joel Bieber Firm has been at the forefront of Roblox child-safety litigation, including spearheading state actions. The firm was among the earliest to file suit against Roblox over the sexual exploitation of minors and the distribution of child sexual abuse material (CSAM).
This work requires navigating complex legal terrain, including anticipated defenses under Section 230 of the Communications Decency Act. JBF attorneys have developed extensive research and briefing on Section 230 jurisprudence and closely monitor evolving state and federal legislation aimed at protecting minors online, including proposed updates to the Children's Online Privacy Protection Act (COPPA) and related digital-safety frameworks.