The dreadful effect of working as a Facebook moderator


Working as a Facebook moderator continues to leave a psychological mark on the people who do it, the Guardian has found, despite months of efforts to improve conditions for the company’s thousands of contractors.

He also said that he and others had been pushed to their limits by the hate speech and fake news they read every day.

They describe being worn down and numbed by the constant stream of violence, nudity, and bullying they handle, working eight-hour shifts that include nights and weekends, for “practically minimum wage.”

One little-discussed aspect of Facebook moderation was of particular concern to the contractors: reviewing private conversations between adults and minors that algorithms had flagged as potential sexual exploitation.

One of the moderators said that these private conversations, of which “90% are sexual,” were “violating and frightening.”

“You understand something more about the kind of bipolar society that we build every day,” he said.

“We have wealthy men from Europe, from the United States, writing to children from the Philippines … trying to get sexual photos in exchange for $10 or $20.”

“You can’t ask someone to work quickly, do a good job, and see graphic content. The things we saw are just not right.”

The moderators, whose names have been changed, spoke on condition of anonymity because they have signed non-disclosure agreements with Facebook.

Daniel, a former moderator, said: “We are a kind of pawn in this field … It is a completely new job, and everything about it is basically an experiment.”

“I’m here today because I want to stop others from falling into this hole,” said John, his former colleague.

“As a modern society, we are moving into this new thing – the internet – and we need to find some rules to deal with it.

“In a social network, for example, it’s important to have a team whose aim is to protect users from abuse, hateful language, racial bias, pornography and so on, but I think this opens up a discussion about how that mission is carried out.

“It’s important that we share our stories, because people know nothing about us, about our jobs, or about what we actually do.”

Some of the moderators’ stories echoed reports of similar problems in other countries.

He admitted, for example, that he had become genuinely fearful of walking the streets at night, or of being surrounded by foreigners.

“Maybe it’s because we have to confront all this hate speech every day, and somehow it affects our political outlook.

“So the average person, the liberal person, maybe even the progressive person, can become more conservative on issues like immigration, for example.

“A lot of the hateful language we receive on a daily basis is fake news … aimed at spreading very particular political views.”

In February, the technology site The Verge published one of the first reports on the conditions faced by Facebook’s US moderation contractors.

Like their colleagues in Berlin, the American moderators said the conspiracy videos and memes they saw every day gradually led them to embrace fringe views.

Others were coping with the trauma by self-medicating. The Arizona moderators reportedly used drugs and alcohol, as did some of the Germans.

“I saw a lot of heavy drug consumption at the company,” Daniel said. “We have no luck. The company, technically, is against drugs.”

As for those who tried the more legitimate route of seeking help, US moderators complained about the psychological support provided.

“On-site counselors were largely ineffective, relying on moderators to identify signs of anxiety and depression and seek help themselves,” wrote Verge reporter Casey Newton.

The Berlin moderators also criticized the counseling services provided, suggesting that they relied too heavily on the state healthcare system.

“We had some colleagues who went to [the counselor], and when it became clear that they had real problems, they were told to leave the company and find a proper psychologist.”

The Verge report appears to have triggered reforms. The Berlin moderators said that after the article was published, there was direct attention from Facebook’s head office.

Previously, they had handled more than 1,000 tickets a day – more than one per moderator every 30 seconds over an eight-hour shift.

In February, John said, they were visited by someone from Facebook’s Dublin office. “This person decided to remove the 1,000 maximum after that meeting. We had no limits for a while, but they have now established new limits.

Now the maximum is between 400 and 500 tickets.” The new maximum is half the previous number, but workers still have to handle roughly one ticket per minute – about the workload his American colleagues faced before the reforms.

The Berlin moderators have discussed seeking help from unions, but say the nature of the work makes organizing difficult.

“I wouldn’t say no one is interested, but it’s likely that nobody will do anything real,” Gina said.

“They’re very tired,” John said.

Still, the moderators agreed that the problems were solvable.

“I think it’s important to open a discussion about this work,” Daniel said, adding that one easy part of the solution was “to hire more people.”

In a statement, Facebook said: “We work closely with our partners to ensure the people who do this work get the support they need, including training, psychological support and technology to reduce their exposure to graphic content.

“Content moderation is a new and challenging industry, so we are always learning and want to improve how it is managed.”
