Monday, May 10, 2021

Facebook content moderators say they receive little support, despite company promises - NBC News

Despite Facebook’s repeated assurances that it would address poor workplace conditions for its content moderators, often contractors who spend their days reviewing graphic and violent images posted on the site, little has changed at the company, a former Facebook content moderator said in an interview that aired Monday on “NBC Nightly News.”

Josh Sklar, a former Accenture subcontractor based in Austin, Texas, who moderated content for Facebook and Instagram from September 2018 through March 2021, said that working conditions had scarcely improved for content moderators and that they continue to review large quantities of often traumatic posts. Sklar was one of approximately 15,000 people who spend hours a day combing through the dark side of the social network, flagging posts that violate its content policies.

“You're reviewing, you know, maybe hundreds of pieces of content a day, and of that content, a really healthy percentage of it is bad stuff,” he said. “I mean, we're talking everything from hate speech, to animal mutilation, to videos of people committing suicide, child pornography.”

Sklar said he was assigned to do this for over six hours every day, with few breaks, as disturbing images flooded into his queue. In one case, he said, a reconfigured algorithm that selects what moderators review caused him to see “a lot more gore,” including an image of the corpses of Palestinian women who had been killed in an explosion “over and over and over and over again.”

“Sometimes it'll be that you realize that you've become desensitized to this, and you're like, well, that doesn't seem like a good thing,” he said. “I don't really want to be numb to human suffering.”

Sklar is one of several content moderators who have spoken up in recent years, but he said that so far speaking up has resulted in little change. In one instance, he said, he objected internally to a policy update that allowed images of animal mutilation to remain online, unflagged, for months. He said he repeatedly raised the issue with quality assurance employees, or QAs, who then brought it up to Facebook.

Sklar said that even though a QA told him that Facebook’s response was, “Oh, that’s not supposed to be what’s happening,” the policy still did not change in the near term.

Sklar also said he had to sign a nondisclosure agreement, which he said he never saw again, along with another document warning that he might suffer from post-traumatic stress disorder. He said he was told he was responsible for addressing those health issues himself.

Before he left in March, Sklar wrote an internal memo about his experience on Workplace, an internal company communication tool. In it, he called the wellness program meant to support moderators and their mental health “inadequate” and made suggestions such as giving moderators more wellness time and the ability to expense therapy.

Facebook company spokesperson Drew Pusateri said in response to Sklar’s accounts, “We appreciate the important work that content reviewers do to keep this content off of our platform and often change and improve our policies based on their feedback.” An Accenture company spokesperson said in a statement that the company makes its workers’ well-being “a top priority” and that “our people have unrestricted access to 24/7 well-being support, which includes proactive, confidential and on-demand counseling.”

Repeat history

This is not the first time Facebook has been accused of mistreating its content moderators.

In February 2019, Business Insider published an article reporting that moderators at the Austin facility where Sklar worked had alleged, in an internal letter on Workplace, that newly added workplace restrictions cost them their “sense of humanity.” At the time, Facebook told Business Insider that no new rules had been introduced and that the complaints stemmed from a “misunderstanding” of rules already in place, and said it would address employee concerns. Accenture referred Business Insider to Facebook for comment.

The following week, The Verge published an article reporting that Facebook content moderators in Phoenix, subcontracted through Cognizant (which has reportedly since exited the content moderation business), suffered from mental health and trauma issues, were given less than 10 minutes a day of “wellness time” to decompress after viewing harsh content, and had “inadequate” coping resources, which led some of them to resort to drugs. A Facebook spokeswoman told The Verge that the claims “do not reflect the day-to-day experiences of most of its contractors, either at Phoenix or at its other sites around the world.”

In May 2020, Facebook settled a lawsuit and said it would pay $52 million to content moderators who alleged they had developed mental health issues, like PTSD, while on the job, and make more mental health resources available, like monthly group therapy sessions and weekly one-on-one coaching sessions.

Six months later, over 200 content moderators, including Sklar, alleged in a letter to executives at Facebook, Accenture and CPL, another contractor, that the companies had “forced” them back into the office during the pandemic.

“Before the pandemic, content moderation was easily Facebook’s most brutal job. We waded through violence and child abuse for hours on end. Moderators working on child abuse content had targets increased during the pandemic, with no additional support,” the moderators wrote in the letter. “Now, on top of work that is psychologically toxic, holding onto the job means walking into a hot zone.”

At the time, Facebook told NPR it “exceeded health guidance on keeping facilities safe for any in-office work” and prioritized the health and safety of its moderators. Accenture said it was gradually inviting workers back into its office but “only where there is critical need to do so and only when we are comfortable that we have put the right safety measures in place, following local ordinances.” CPL told NPR that the workers’ roles were “deemed essential” and “due to the nature of the work, it cannot be carried out from home.”

But Cori Crider, co-founder of Foxglove, an advocacy organization that supports social media content moderators and published the letter, said Facebook could have done more.

“Facebook could absolutely afford to hire these people directly and treat these people better,” Crider said. “You can't have a healthy public square if the people you rely on to defend it are working in digital sweatshops.”

https://www.nbcnews.com/business/business-news/facebook-content-moderators-say-they-receive-little-support-despite-company-n1266891