I was able to pull transcripts of sale negotiations for teen girls that traffickers were engaging in on Facebook Messenger, the private messaging function. In exhibit documents, there were pictures of trafficking victims being advertised for sale in Instagram’s Stories function. Money and logistics had been discussed. In the cases we found, none of these crimes had been detected or flagged by Meta.
McNamara and I contacted former contract workers who had been employed to moderate Facebook and Instagram, tasked with reporting and removing harmful content. Many were traumatised by the content they had had to review each day. All said their efforts to flag and escalate possible child trafficking on Meta platforms often went nowhere, and harmful content was rarely taken down by the company. They felt helpless, and believed Meta's criteria for escalating possible crimes to law enforcement were too narrow.
They make money out of trafficking, scams, etc. All the interactions sell ads, and the scammers even buy ads! Why would Facebook remove any of it? It would have to cost them money to make it stop, so fine them and make them accountable for the horrors they allow.
An article from the Guardian patting itself on the back. Pass.
“hey, we did the work, now we expect something to be done on it” is far from “look what we found! Give us a Pulitzer!”
Real jail time should probably be on the table, as well.
I wonder who made the original tip. Probably one of those traumatized moderators whose reports had gone nowhere.
In all likelihood, yes.