‘Stressed’ Facebook content moderator died on duty in US: Report


San Francisco: A Facebook content moderator at a US site operated by professional services vendor Cognizant died on duty last year at a facility that routinely failed to meet the 98 per cent “accuracy” target set by the social networking giant, a chilling report by The Verge has revealed.

Facebook moderators have the harrowing task of walling off disturbing content, such as images of suicides and child pornography, that users try to post every day.

The site in Tampa, Florida, where 42-year-old Keith Utley worked routinely failed to meet the 98 per cent “accuracy” target set by Facebook, the report claimed.

The workers face tremendous pressure to improve enforcement of the community standards set by the social networking giant.

Utley, a former Coast Guard lieutenant commander, started working as a moderator for Facebook after leaving the military. He is survived by his wife, Joni, and two young daughters.

According to his former co-workers, the stress of the job weighed on Utley and he was constantly worried about getting fired.

He slumped over at his desk on March 9 last year. Co-workers noticed he was in distress when he began sliding out of his chair.

There was no defibrillator in the building, so two of his co-workers began performing CPR. A manager called for an ambulance, but paramedics arrived 13 minutes after the first call because they had trouble locating the office, the report said.

By the time they arrived, Utley had already begun to turn blue, according to the report based on interviews with 12 current and former moderators and managers at the Tampa site.

Paramedics raced Utley to a hospital, where doctors found he had suffered a heart attack. He was declared dead a short while later.

Senior management at the Tampa site initially discouraged employees from discussing the incident, for fear that it would hurt productivity.

Facebook moderators who work for professional services vendors are often paid less and suffer from post-traumatic stress disorder and related conditions, likely due to regular exposure to images of graphic violence and child exploitation.

Facebook says it will conduct an audit of its partner sites and make other changes to promote the well-being of its contractors, the report said.

The company said it would consider making more moderators full-time employees in the future, and hopes to someday provide counseling for moderators after they leave.

An earlier report in February, also by The Verge, revealed that Cognizant employees tasked with vetting Facebook posts flagged for pornographic material, graphic violence or hate speech resorted to drugs and sex at the workplace.

That report revealed that nearly 1,000 Cognizant employees at its Phoenix, Arizona, office had been told “not to discuss the emotional toll their job takes on them, even with loved ones, leading to increased feelings of isolation and anxiety”.
