Bluesky released its moderation report for the past year on Friday, noting the significant growth the social network experienced in 2024 and how that affected the workload of its Trust & Safety team. The report also noted that the largest number of reports came from users reporting accounts or posts for harassment, trolling, or intolerance, an issue that has plagued Bluesky as it has grown and that has, at times, led to large-scale protests over individual moderation decisions.
The company’s report did not address or explain why it did or did not take action against individual users, including those on the most-blocked list.
The company added more than 23 million users in 2024, as Bluesky became a new destination for ex-Twitter/X users for a variety of reasons. Throughout the year, the social network benefited from several changes at X, including its decisions to change how blocking works and to train AI on user data. Other users left X after the results of the US presidential election, over how the politics of X owner Elon Musk began to dominate the platform. The app also gained users while X was temporarily banned in Brazil in September.
To meet the demands of that growth, Bluesky grew its moderation team to roughly 100 moderators, the company said, and is continuing to hire. The company also began offering team members psychological counseling to help them cope with the difficult work of constant exposure to graphic content. (An area we hope AI will address one day, as humans aren’t built to handle this kind of work.)
In total, there were 6.48 million reports to Bluesky’s moderation service in 2024, up 17x from 2023, when there were just 358,000 reports.
Starting this year, Bluesky will begin accepting moderation reports directly from its app, similar to X, which will allow users to more easily track actions and updates. Later, it will also support in-app appeals.
When Brazilian users flocked to Bluesky in August, the company was seeing as many as 50,000 reports per day at its peak. That led to a backlog in addressing moderation reports and required Bluesky to hire more Portuguese-speaking staff, including through a contract vendor.
In addition, Bluesky began automating more categories of reports beyond just spam to help it address the influx, though this sometimes led to false positives. Still, automation helped cut processing times to just “seconds” for “high-certainty” accounts. Before automation, most reports were handled within 40 minutes. Human moderators are now kept in the loop to address false positives and appeals, even when they aren’t handling the initial decision.
Bluesky says 4.57% of its active users (1.19 million) made at least one moderation report in 2024, down from 5.6% in 2023. The majority of those, 3.5 million reports, were for individual posts. Account profiles were reported 47,000 times, often over a profile picture or banner photo. Lists were reported 45,000 times and DMs 17,700 times, with feeds and Starter Packs receiving 5,300 and 1,900 reports, respectively.
Most of the reports were of anti-social behavior such as trolling and harassment – a signal from Bluesky users that they want to see a less toxic social network compared to X.
The other reports were for the following categories, Bluesky said:
- Misleading content (impersonation, misinformation, or false claims about identity or affiliation): 1.20 million
- Spam (excessive mentions, replies, or duplicate content): 1.40 million
- Unwanted sexual content (nudity or adult content not properly labeled): 630,000
- Illegal or urgent matters (clear violations of the law or Bluesky’s terms of service): 933,000
- Other (issues that do not fit into the above categories): 726,000
The company also offered an update on its labeling service, which covers labels added to posts and accounts. Human labelers added 55,422 “sexual images” labels, followed by 22,412 “rude” labels, 13,201 “spam” labels, 11,341 “intolerant” labels, and 3,046 “threatening” labels.
In 2024, 93,076 users submitted a total of 205,000 appeals of Bluesky’s moderation decisions.
There were also 66,308 account takedowns by moderators and 35,842 automated account takedowns. Bluesky also fielded 238 requests from law enforcement, governments, and law firms; the company responded to 182 of them and complied with 146. Most were law enforcement requests from Germany, the US, Brazil, and Japan, it said.
Bluesky’s full report also delves into other types of issues, including trademark and copyright claims and child safety/CSAM reports. The company noted that it submitted 1,154 confirmed CSAM reports to the National Center for Missing and Exploited Children (NCMEC).