Facebook has failed to fix the systems that allow human trafficking organizations and drug cartels to repeatedly operate on its platform, according to a report by The Wall Street Journal.
Dozens of internal Facebook documents obtained by the outlet showed how employees have expressed concerns about how the social media giant is being used in countries around the world, and how Facebook has failed to respond adequately to these problems.
Some of the documents reportedly showed Facebook employees worried about human trafficking organizations in the Middle East using Facebook to lure women. Other documents showed Facebook employees alerting their superiors to groups involved in organ sales and pornography.
The outlet reported that while some of the employee-reported groups and pages have been removed, dozens more remain active on the social media site.
Another document detailed a Facebook employee’s investigation into a drug cartel active on the social media site. The employee, a former police officer, was able to identify the Jalisco New Generation cartel’s network of accounts on Facebook and on Instagram, which Facebook owns.
The employee wrote in the report that his team found Facebook posts between cartel recruiters and potential recruits “about being badly beaten or killed by the cartel if they tried to leave training camp.”
The documents reportedly showed the cartel was open about its criminal activities, with several pages on the social media site showing “gold-plated guns and bloody crime scenes.” The Wall Street Journal reported that even after the employee recommended that Facebook increase its enforcement against these groups, documents showed that Facebook had not completely removed the cartel from its site, instead saying it had removed content linked to the group. Just nine days after the employee’s report, his team found a new Instagram account linked to the cartel, which included several violent posts.
Numerous documents apparently showed employees worried about how the social media giant is used in developing countries, such as militant groups in Ethiopia using Facebook to promote violence against minority groups.
Brian Boland, a former vice president of Facebook, told The Wall Street Journal that the social media site sees these problems in developing countries as “just the cost of doing business.”
“There is very rarely a significant and concerted effort to invest in repairing these areas,” Boland said.
In a statement sent to Newsweek, a Facebook spokesperson said: “In countries at risk of conflict and violence, we have a comprehensive strategy, including relying on global teams with native speakers spanning over 50 languages, educational resources and partnerships with local experts and third-party fact-checkers, to keep people safe.”
In a series of tweets Thursday, Facebook spokesperson Andy Stone wrote: “As The Wall Street Journal made clear, we have a team of experts who help us uncover patterns of harmful behavior so that we can disrupt it. We have arguably more experts and resources dedicated to this work than any other consumer technology company in the world.”
“While there is always more we can do, these teams have helped us find and disrupt the gangs and traffickers operating on our platform,” Stone wrote in a subsequent tweet. “We use a variety of tools against criminal organizations, including designating them under our Dangerous Organizations policies, human scrutiny, a wide range of AI, and network disruption.”
Stone concluded his tweet by writing, “We know we still have work to do, which is exactly why we are hiring specialists in key areas to help us research and understand the issues so that we may improve our technology, our people and our policies to address them.”
In 2018, Facebook said it agreed with a report from the nonprofit Business for Social Responsibility that it was “not doing enough to prevent our platform from being used to foment division and incite violence offline” in Myanmar following violence against the Rohingya minority.
In a statement issued following that report, Facebook product policy manager Alex Warofka said: “We agree that we can and should do more.”