X Corp., which owns the social media platform X (formerly known as Twitter), has cut its worldwide trust and safety team by 30% and its number of safety engineers by 80% since billionaire Elon Musk took control in 2022, according to an Australian internet safety watchdog.
A summary of X's responses to queries about how it enforces its rules against hate speech was released by Australia's eSafety Commission, which describes itself as the world's first government agency dedicated to keeping people safer online.
According to the commission, while X had previously provided estimates of the workforce reduction, the responses were the first publicly released detailed data on which departments had been cut.
Decrease Within the Trust and Safety Team
According to a report by ABC News, X's worldwide trust and safety workforce fell from 4,062 to 2,849 employees and contractors between October 28, 2022, the day before Musk took control of the San Francisco-based company, and May 31, 2023, the end of the reporting period. That amounts to a 30% decline worldwide and a 45% decline in the Asia-Pacific region.
The number of X engineers focused primarily on trust and safety fell by 80%, from 279 to 55. Full-time content moderators dropped from 107 to 51, a 52% reduction, and contract content moderators fell from 2,613 to 2,305, a 12% decrease.
Meanwhile, X disclosed that it had reinstated 6,100 previously banned accounts identified as Australian, including 194 accounts that had been suspended for hate speech, according to the commission. Tech publication Platformer reported in November 2022 that 62,000 suspended accounts had been reinstated, but X did not provide the commission with worldwide figures.
According to the commission, these accounts were reinstated without further review, even though they had previously violated X's rules.
Under X's policy against hateful conduct, users are prohibited from directly attacking other people on the basis of race, ethnicity, national origin, caste, sexual orientation, gender, gender identity, religious affiliation, age, disability, or serious illness.
What Could Happen?
Since Musk took control, X has been slower to respond to user reports about harmful material.
eSafety Commissioner Julie Inman Grant warned that the decline in safety personnel and the reinstatement of banned accounts would lead to an increasingly toxic and unsafe social media network. She added that while X could not be forced to raise its user safety standards, its brand image and advertising income were at risk if it failed to do so.
Marketers like to place their ads on platforms they see as pleasant, safe, and free of harmful content, and users who find a platform dangerous or toxic will also take action by leaving, Inman Grant said.