Elon Musk’s X, formerly Twitter, announced its plan to establish a new “Trust and Safety Center of Excellence” in Austin, Texas, dedicated to upholding content and safety standards on the platform.
The move comes as the company faces scrutiny for cutbacks in trust and safety operations under Musk’s leadership, prompting concerns about the platform’s commitment to user safety.
The company plans to recruit 100 full-time content moderators at the new Texas location. The primary focus of this team will be to combat material related to child sexual exploitation, aligning with X’s commitment to creating a safer online environment.
However, the company did not provide a specific timeline for the opening of the new centre.
X’s age policy requires users to be at least 13 years old, with less than one percent of daily users falling in the 13-17 age range. Individuals under the age of 18 cannot be targeted by advertisers on the platform.
Criticism has been directed at Elon Musk for scaling back X’s trust and safety operations since assuming leadership in October 2022.
Musk has also rolled back certain policies, including those related to misinformation, with the goal of reinstating “free speech” on the platform.
The announcement coincides with X CEO Linda Yaccarino’s upcoming appearance before the Senate Judiciary Committee, scheduled for next week.
Yaccarino is expected to testify on child safety online alongside CEOs from major technology companies, including Meta, Snap, TikTok, and Discord.
The 2023 Child Online Safety Index, published by international think-tank The DQ Institute, found that nearly 70% of children and adolescents aged 8-18 worldwide had experienced at least one cyber risk incident in the past year.
In September, Ireland's Data Protection Commission (DPC), TikTok's lead regulator in the EU, fined TikTok $370m for violating laws on the handling of children's personal data. TikTok announced last month that it would contest the fine.
The DPC’s investigation found that profile settings for child user accounts were set to public by default, meaning anyone could view content posted by a child user.
Last year, Australia’s eSafety Commissioner, Julie Inman Grant, accused Big Tech companies including Google and Musk’s X of facilitating the spread of misinformation and child abuse material.
Australia’s online safety regulator, eSafety, accused Google of not blocking links to known child sexual exploitation material, despite the availability of databases from expert organisations like the UK-based Internet Watch Foundation.