Digital Focus

Roblox Introduces New Online Safety Measures to Protect Children

Roblox, the widely popular online gaming platform known for hosting millions of user-created games, is taking new steps to strengthen protections for children. Starting soon, the platform will require players to undergo AI-powered facial age estimation to help verify their age.

This system will work alongside ID-based verification and confirmed parental consent, offering a more reliable measure of a user’s age than simply relying on self-reported information.

Initially, the age-check requirement will be enforced in select markets, including Australia, New Zealand, and the Netherlands, with plans to expand globally in early January.

In addition to these measures, Roblox is launching a dedicated online safety center designed to guide families in understanding and setting up parental controls.

These enhanced protections come amid lawsuits from families and legal authorities in Kentucky and Louisiana accusing Roblox, along with other platforms such as Discord, of failing to prevent sexual predators from targeting children. Florida is also investigating Roblox over alleged lapses in safeguarding children.

Previously, Roblox announced plans to extend age verification for all users wishing to access communication features, ensuring that interactions between adults and children are restricted unless they know each other in real life.

This initiative reflects the importance of upholding children’s rights to safety and protection in digital spaces, aligning with the United Nations Convention on the Rights of the Child, which emphasizes protecting children from all forms of physical or mental harm.
