Data Shows That X Has Significantly Fewer Moderation Staff Than Other Platforms

Does X now have a lot fewer moderators than other apps, following its cull of around 80% of its total staff in 2022?

While we don’t have full insight into each app’s staffing, X has publicly endorsed its crowd-sourced “Community Notes” fact-checking program as a supplement to its reduced moderation workforce, and sees it, in many ways, as a better solution.

But how much has that workforce actually shrunk, and how does it compare to other apps?

The latest E.U. transparency reports provide some insight.

Under the E.U. Digital Services Act (D.S.A.), all large online platforms are required to regularly report their E.U. user and moderation staff counts, in order to provide more transparency into their operations.

Over the last week, all of the major social apps have shared their latest reports, which provide a comparison of total users and moderation staff for each.

Which stands as follows:

Social platform moderation staff

Based on this, X does have the worst ratio of moderation staff to users, at 1/60,249, with LinkedIn coming in second (1/41,652), then TikTok (1/22,586) and Meta (1/17,600).

Though there are some provisos here.

Meta, for example, reports that it has 15,000 content reviewers working across both IG and Facebook, each of which has 260 million E.U. users. In that sense, Meta’s staff-to-user ratio could arguably be doubled, though even then, it would still be better than X’s and LinkedIn’s.
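That doubling argument is simple arithmetic, and can be checked with the figures quoted above (15,000 reviewers; 260 million E.U. users per platform). This is just a back-of-envelope sketch of the article's numbers, not an official Meta calculation:

```python
# Figures as quoted in the article (from the E.U. transparency reports).
meta_reviewers = 15_000            # content reviewers shared across Facebook and IG
eu_users_per_platform = 260_000_000

# Ratio if the reviewers are counted against one platform's users...
single = eu_users_per_platform / meta_reviewers          # ~17,333 users per reviewer
# ...versus counted against the combined Facebook + IG user base.
combined = (2 * eu_users_per_platform) / meta_reviewers  # ~34,667 users per reviewer

print(f"1 per {single:,.0f} vs 1 per {combined:,.0f}")
```

Even the "doubled" figure of roughly one reviewer per 34,667 users sits well below X's reported one per 60,249.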

X’s total user count also includes logged-out guests, which the other platforms seemingly don’t count. Though guests on Facebook, LinkedIn, and IG can’t see as much content, so that’s probably not a major factor in this context.

X also notes that these 1,849 moderators “are not specifically designated to only work on EU matters”. So its ratio here may actually be even worse.

On balance, then, X does have a lot fewer manual staff moderating content, in Europe at least. And if we’re to assume that this is indicative of other regions, then X would seemingly have a lot fewer moderation staff than other apps.

Which X hasn’t really made a secret of, but that would presumably also have an impact on its capacity to detect and action violative content.

Which aligns with third-party reports that more rule-breaking content is now visible on X, and could point to a weakness of Community Notes as an enforcement mechanism.

Various online safety experts have said that Community Notes is not an adequate safety solution, due to shortfalls in its process, and while X regards it as a better approach to moderation calls, it may not be enough in certain circumstances.

Even X has acknowledged this, to some degree, by pledging to build a new moderation center in Texas. Though since that announcement (in January), no further news on the project has come out of X HQ.

Essentially, if you’re concerned that X may not be doing as much to address harmful content, these stats likely underline that concern, though it’s important to note that the numbers here relate to E.U. staffing only, and may not be indicative of X’s broader measures.
