Accenture is questioning its massive contract with Facebook, which pays the elite consulting firm to scrub the site of pornography, violence, suicides and other toxic posts.
According to an Aug. 31 New York Times story, thousands of employees and contractors spend eight hours a day reviewing hundreds of videos, pictures and posts for the social media giant to prevent them from spreading online. Employees began feeling depressed, anxious and paranoid, and one joined a class action lawsuit to push back against the working conditions that include seeing rapes, animal tortures, dead bodies and grisly scenes from the Syrian war.
While top executives have raised questions about whether Accenture should continue the work, the matter is unresolved, the Times reports, and the lucrative contract continues. The effort is part of CEO Mark Zuckerberg’s pledge to clean up Facebook after intense criticism. Facebook uses AI to remove about 90% of offending posts but outsources much of the remaining work to at least 10 consulting firms. TaskUs, one of those firms, now gets a third of its business – at $150 million a year – from Facebook, the Times reports.
Accenture was once part of the accounting firm Arthur Andersen, which was ordered to split off its consulting arm in 2000. Andersen Consulting, as it was called at the time, rebranded as Accenture in 2001. The company provides consulting primarily in accounting and tech, but it also earns $500 million a year from Facebook and supplies one-third of the workers moderating its content, a Times investigation found. Accenture won an accounting contract with Facebook in 2010 and expanded the arrangement to include content moderation two years later.
“Their contracts, which have not previously been reported, have redefined the traditional boundaries of an outsourcing relationship,” write Times reporters Adam Satariano and Mike Isaac, who interviewed more than 40 current and former Accenture and Facebook employees, labor lawyers and others. “Accenture has absorbed the worst facets of moderating content and made Facebook’s content issues its own. As a cost of doing business, it has dealt with workers’ mental health issues from reviewing the posts. It has grappled with labor activism when those workers pushed for more pay and benefits. And it has silently borne public scrutiny when they have spoken out against the work.”
Meanwhile, top executives have joined workers in questioning the ethics of the contract – CEO Pierre Nanterme raised concerns in 2017 (he died in 2019), and current CEO Julie Sweet ordered a review, sending observers to watch managers and employees doing the work. Sweet made changes, including a lengthy legal disclosure that says the work has “the potential to negatively impact your emotional or mental health.” Accenture also listed content moderation as a business risk in its annual report last year.
Facebook and Accenture did not make executives available for comment, although an Accenture spokeswoman told the Times that the work was a public service that was “essential to protecting our society by keeping the internet safe.” Accenture now also moderates content for YouTube, Twitter, Pinterest and others.