An undercover reporter with the U.K.'s Channel 4 visited Cpl, a content moderation outsourcing firm in Dublin, and came away rather discouraged at what they saw: queues of flagged content waiting to be reviewed, videos of kids fighting staying online, orders from above not to take action on underage users. It sounds bad, but the truth is there are pretty good reasons for most of it, and in the end the report comes off as rather naive.
Not that it's a bad thing for journalists to keep big companies (and their small contractors) honest, but the situations called out by the Channel 4 reporter seem to reflect a misunderstanding of the moderation process rather than problems with the process itself. I'm not a big Facebook fan, but in the matter of moderation I think they are sincere, if hugely unprepared.
The bullet points raised by the report are all addressed in a letter from Facebook to the filmmakers. The company points out that some content needs to be left up because, abhorrent as it is, it isn't in violation of the company's stated standards and may be informative; underage users and content have some special requirements but in other ways can't be assumed to be real; popular pages do need to exist on different terms than small ones, whether they're radical partisans or celebrities (or both); hate speech is a delicate and complex matter that often needs to be reviewed multiple times; and so on.
The biggest problem doesn't seem to be negligence by Facebook at all: there are reasons for everything, and as is often the case with moderation, those reasons are often unsatisfying but legitimate. The problem is that the company has dragged its feet for years on taking responsibility for content, and as such its moderation resources are nowhere near adequate to the task. The volume of content flagged by both automated processes and users is immense, and Facebook hasn't staffed up accordingly.
Why do you think it's outsourcing the work? By the way, did you know that this is a horrible job? The short film 'The Moderators' takes a look at the thankless job of patrolling the web.
Facebook says in a blog post that it is working on doubling its 'safety and security' staff to 20,000, of which 7,500 will be on moderation duty. I've asked what the current number is, and whether that includes people at companies like this one (which has about 650 reviewers), and will update if I hear back.
Update: Facebook got back to me with some specifics. The 7,500 figure does in fact include full-timers, contractors, and employees of vendors like Cpl. But the representative indicated that the company has actually surpassed that figure and is currently at around 15,000 total.
Even with a staff of thousands, the judgments that need to be made are often so subjective, and the volume of content so great, that there will always be backlogs and mistakes. That doesn't mean anyone should be let off the hook, but it doesn't necessarily indicate a systematic failure other than, perhaps, a lack of labor.
If people want Facebook to be effectively moderated, they may need to accept that the process will be carried out by thousands of humans who imperfectly execute the task. Automated processes are useful, but they are no replacement for the real thing.
The result is a huge international group of moderators, overworked and cynical by profession, doing a messy and at times inadequate job of keeping the platform's content in check.