Article

AI Moderation and Legal Frameworks in Child-Centric Social Media: A Case Study of Roblox

Department of Criminal Law, Naif Arab University for Security Sciences, Riyadh 11452, Saudi Arabia
Laws 2025, 14(3), 29; https://doi.org/10.3390/laws14030029
Submission received: 8 March 2025 / Revised: 16 April 2025 / Accepted: 22 April 2025 / Published: 25 April 2025

Abstract

This study focuses on Roblox as a case study to explore the legal and technical challenges of content moderation on child-focused social media platforms. As a leading Metaverse platform with millions of young users, Roblox provides immersive and interactive virtual experiences but also introduces significant risks, including exposure to inappropriate content, cyberbullying, and predatory behavior. The research examines the shortcomings of current automated and human moderation systems, highlighting the difficulties of managing real-time user interactions and the sheer volume of user-generated content. It investigates cases of moderation failures on Roblox, exposing gaps in existing safeguards and raising concerns about user safety. The study also explores the balance between leveraging artificial intelligence (AI) for efficient content moderation and incorporating human oversight to ensure nuanced decision-making. Comparative analysis of moderation practices on platforms like TikTok and YouTube provides additional insights to inform improvements in Roblox’s approach. From a legal standpoint, the study critically assesses regulatory frameworks such as the GDPR, the EU Digital Services Act, and the UK’s Online Safety Act, analyzing their relevance to virtual platforms like Roblox. It emphasizes the pressing need for comprehensive international cooperation to address jurisdictional challenges and establish robust legal standards for the Metaverse. The study concludes with recommendations for improved moderation strategies, including hybrid AI-human models, stricter content verification processes, and tools to empower users. It also calls for legal reforms to redefine virtual harm and enhance regulatory mechanisms. This research aims to advance safe and respectful interactions in digital environments, stressing the shared responsibility of platforms, policymakers, and users in tackling these emerging challenges.
Keywords: content moderation; Roblox; artificial intelligence; legal frameworks; online safety

Share and Cite

MDPI and ACS Style

Chawki, M. AI Moderation and Legal Frameworks in Child-Centric Social Media: A Case Study of Roblox. Laws 2025, 14, 29. https://doi.org/10.3390/laws14030029


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
