Major video game companies could face fines approaching $50 million if they ignore orders to curb child grooming and radicalization.
Roblox, Fortnite, Minecraft, and Steam have been formally asked by Australia’s eSafety Commissioner to detail their strategies for detecting, preventing, and addressing severe online threats.
There are growing concerns that these platforms serve as initial contact points for sexual predators to approach children or for extremists to disseminate violent ideologies and recruit followers.
Australian Catholic University professor Niusha Shafiabady lauded the move as a “global precedent,” calling it a significant advance in child protection efforts.
“Given the massive scale of a game like Roblox, complete control is unfeasible, but any steps to minimize these threats are better than none,” she told the Australian Associated Press on Wednesday.
Roblox and Fortnite, among the most favored games for younger audiences, have previously been linked to several controversies.
Neo-Nazi, anti-Semitic and violent content has been found on Fortnite, including a map based on a concentration camp where 100,000 people were killed during World War II.
Terrorist attacks and mass shootings have reportedly been recreated on Roblox.
In a statement to SBS News, Roblox said it welcomes engagement with eSafety on the topic.
“Roblox has policies that strictly prohibit content or behaviour that incites, condones, supports, glorifies, or promotes any terrorist or extremist organisation or individual, which we work tirelessly to enforce,” a spokesperson said.
The spokesperson said the platform swiftly removes such content when found and uses technology to prevent extremist iconography from being published.
“Our team works regularly with law enforcement, civil society groups, and other organisations with specific subject matter expertise in countering those who would seek to promote violent extremism.”
University of Sydney researcher Milica Stilinovic described the commissioner’s efforts as “essentially playing whack-a-mole” because the internet is fluid, but said compelling platforms to be forthcoming about the user experience was necessary.
“Seeking transparency from these particular platforms is crucial because they’re not coming to the table in terms of how the plumbing works on the back end,” Stilinovic said.
The video game platforms face fines of up to $825,000 per day should they fail to comply with the commissioner’s notice.
“Gaming platforms are amongst the online spaces most heavily used by Australian children, functioning not only as places to play, but also as places to socialise and communicate,” eSafety Commissioner Julie Inman Grant said.
“Predatory adults know this and target children through grooming or embedding terrorist and violent extremist narratives in gameplay, increasing the risks of contact offending, radicalisation and other off-platform harms.”
About nine in 10 Australian children between the ages of eight and 17 have played games online.
Online services are required to implement processes to protect Australians from illegal and restricted material, including measures to address risks of grooming.
Roblox has pledged to make private by default those accounts belonging to children under 16, and will introduce tools to prevent adults from contacting them without parental consent.
Fortnite developer Epic Games uses chat filters to remove hate speech and has implemented systems to automatically report potentially harmful chat interactions with those under 18.
Players under 16 are not allowed to use text or voice chat until a parent consents.