<https://www.theguardian.com/global-development/2024/mar/14/facebook-messenger-meta-pay-child-sexual-abuse-exploitation>


How Facebook Messenger and Meta Pay are used to buy child sexual abuse material


When police in Pennsylvania arrested 29-year-old Jennifer Louise Whelan in 
November 2022, they charged her with dozens of counts of serious crimes, 
including sex trafficking and indecent assault of three young children.

One month earlier, police said they had discovered Whelan was using three 
children as young as six, all in her care, to produce child sexual abuse material. 
She was allegedly selling and sending videos and photos to a customer over 
Facebook Messenger. She pleaded not guilty.

The alleged buyer, Brandon Warren, was indicted by a grand jury in February 
2022 and charged with nine counts of distribution of material depicting minors 
engaged in sexually explicit conduct. Warren also pleaded not guilty.

Court documents seen by the Guardian quote Facebook messages between the two in 
which Warren allegedly describes to Whelan how he wants her to make these 
videos.

“I’ll throw in a little extra if you tell him it makes mommy feel good and get 
a good length video,” he tells Whelan, according to the criminal complaint 
document used for her arrest.

Whelan received payment for the footage over Meta Pay, Meta’s payment system, 
according to the criminal complaint against Warren. “Another 250 right? Heehee,” 
she allegedly wrote to Warren after sending him a video of her abusing a young 
girl.

Meta Pay, known as Facebook Pay before rebranding in 2022, is a peer-to-peer 
payment service enabling users to transfer money over the company’s social 
networks. Users add their credit card, debit card or PayPal account details to 
Facebook Messenger or Instagram to send and receive money.

A spokesperson for Meta confirmed that the company has seen and reported 
payments via Meta Pay on Facebook Messenger that are suspected of being linked 
to child sexual exploitation.

“Child sexual exploitation is a horrific crime. We support law enforcement in 
its efforts to prosecute these criminals and invest in the best tools and 
expert teams to detect and respond to suspicious activity. Meta reports all 
apparent child sexual exploitation to NCMEC [the National Center for Missing and 
Exploited Children], including cases involving payment transactions,” the 
spokesperson said.

Through reviewing documents and interviewing former Meta content moderators, a 
Guardian investigation has found that payments for child sexual abuse content 
taking place on Meta Pay are probably going undetected, and unreported, by the 
company.

Court documents show Whelan and Warren’s actions were not spotted or flagged by 
Meta. Instead, Kik Messenger, another social platform, reported Warren had 
uploaded videos suspected to be child sexual abuse material (CSAM) to share 
with other users. This triggered a police investigation in West Virginia, where 
Warren lives. His electronics were seized, and police then discovered the eight 
videos and five images that he had allegedly bought from Whelan over Facebook 
Messenger.

“We responded to valid legal process,” said a Meta spokesperson, in response to 
the Guardian’s findings that the company did not detect these crimes.

Additionally, two former Meta content moderators, employed between 2019 and 
2022, told the Guardian that they saw suspicious transactions taking place via 
Meta Pay that they believed to be related to child sex trafficking, yet they 
were unable to communicate with Meta Pay compliance teams to flag these 
payments.

“It felt like [Meta Pay] was an easy-to-use payment method since these people 
were communicating on Messenger. The amounts sent could be hundreds of dollars 
at a time,” says one former moderator, who spoke on condition of anonymity 
because they had signed a non-disclosure agreement as a condition of 
employment. The moderator, employed for four years until mid-2022 by 
Accenture, a Meta contractor, reviewed interactions between adults and children 
over Facebook Messenger for inappropriate content.

Payments for sex or CSAM are typically just a few hundred dollars or less in 
cases reviewed by the Guardian. According to a former Meta Pay compliance 
analyst, transactions of such small amounts are unlikely to be flagged for 
review by Meta’s systems.

This means that payments connected to illicit activities are probably taking 
place undetected, financial crimes experts said.

A Meta spokesperson said that the company uses a combination of automated and 
human review to detect suspicious financial activity in payment transactions on 
Messenger.

“The size of the payment is just one signal our teams use to identify 
potentially suspicious activity, and our compliance analysts are trained to 
assess a variety of signals,” said the Meta spokesperson. “If our teams had 
reason to suspect suspicious activity, especially activity involving a child 
and even if the payments are small, it would be investigated and reported 
appropriately.” The spokesperson also said that the company had “a strong ‘see 
something, say something’ culture”.
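
To illustrate the gap the two sides are describing, the sketch below, written 
in Python, contrasts an amount-only rule with a score that combines several 
weak signals. The thresholds, weights and field names are entirely hypothetical 
and are not Meta's; the point is only that a payment of a few hundred dollars 
sails past a size-based rule but can still surface for review when other 
signals are weighed together.

    # Illustrative sketch only. All thresholds, weights and field names are
    # hypothetical; this is not Meta's detection system.
    from dataclasses import dataclass

    @dataclass
    class Payment:
        amount_usd: float
        note: str                      # free-text note attached to the payment
        sender_account_age_days: int
        recipient_report_count: int    # prior user reports against the recipient

    def amount_only_flag(p: Payment, threshold_usd: float = 2_000) -> bool:
        """A naive rule that looks only at size: small payments never surface."""
        return p.amount_usd >= threshold_usd

    SUSPICIOUS_TERMS = {"vid", "custom", "young"}  # placeholder keyword list

    def multi_signal_score(p: Payment) -> float:
        """Combine several weak signals so small payments can still surface."""
        score = 0.0
        score += 0.2 if p.amount_usd < 500 else 0.0          # small amounts typical of these cases
        score += 0.4 if any(t in p.note.lower() for t in SUSPICIOUS_TERMS) else 0.0
        score += 0.2 if p.sender_account_age_days < 30 else 0.0   # newly created account
        score += 0.4 * min(p.recipient_report_count, 3) / 3       # prior reports against recipient
        return score

    p = Payment(amount_usd=250, note="another vid",
                sender_account_age_days=400, recipient_report_count=2)
    print(amount_only_flag(p))            # False: slips past an amount-only rule
    print(multi_signal_score(p) >= 0.5)   # True: combined signals exceed a review threshold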

In cases where American men were grooming underage girls abroad, payments could 
cover things like a phone or school supplies, the moderator said.

“Most of what we saw were older men from America, targeting girls in Asian 
countries and often travelling there,” the moderator added.

“When it comes to child exploitation and CSAM, it’s really all about small 
amounts,” said Silvija Krupena, director of the financial intelligence unit at 
RedCompass Labs, a London-based financial consultancy. “It’s a global crime and 
criminals, with different types of offenders. In low-income countries like the 
Philippines, $20 is big money. The production usually happens in those 
countries. These are small amounts that can fall through the cracks when it 
comes to traditional money-laundering controls.”

Meta has a team of about 15,000 moderators and compliance analysts who are 
tasked with monitoring its platforms for harmful and illegal content. Possible 
criminal behavior is supposed to be escalated by Meta and reported to law 
enforcement. Anti-money laundering regulations also require money services 
businesses to train their compliance staff and give them access to enough 
information to detect illegal financing when it occurs.

Yet contractors monitoring Meta Pay transaction activity do not receive 
specific training for detecting and reporting money flows that could be related 
to human trafficking, including the language, codewords and slang that 
traffickers typically use, said the former Meta Pay compliance analyst, who was 
employed as a contractor.

“If a human trafficker is using a codeword for selling girls, we didn’t get 
into that. We didn’t really get trained on those,” said the former compliance 
analyst. “You don’t even give it a second thought or even dig into that kind of 
stuff at all.”

A Meta spokesperson disputed the payment compliance analyst’s claims.

“Compliance analysts receive both initial and ongoing training on how to detect 
potentially suspicious activity – which includes signs of possible human 
trafficking and child sexual exploitation. Our program is regularly updated to 
reflect the latest guidance from financial crime regulators and safety 
experts,” the spokesperson said.

Meta’s history with accusations of child exploitation

Meta’s platforms have been linked to alleged child exploitation and the 
distribution of CSAM in the past. In December, the New Mexico attorney 
general’s office filed a lawsuit against the company, alleging Facebook and 
Instagram are “breeding grounds” for predators targeting children for human 
trafficking, grooming and solicitation. The suit followed an April 2023 
Guardian investigation, which revealed how child traffickers were using Meta’s 
platforms to buy and sell children into sexual exploitation.

As a money services business, Meta Pay is subject to the US anti-money 
laundering and “know your customer” (KYC) banking regulations, which require 
businesses to report illicit financing to the US treasury department’s 
Financial Crimes Enforcement Network (FinCEN).

If Meta fails to detect and report these payments, it could be in violation of 
US anti-money laundering laws, financial crimes experts have said.

“Regulations apply to any company that participates in a payments business. But 
for social media because they can see users, they see their lives, their 
transactions, they can see abuse and see contact. It’s such a low-hanging fruit 
for them to detect this,” said Krupena.

Other peer-to-peer payment apps have faced scrutiny for their practices in 
preventing illicit activity. In 2023, Senate Democrats requested detailed fraud 
detection and prevention methods from PayPal, Venmo and Cash App. Sex 
trafficking “ran rampant” on Cash App, according to a report last year by US 
investment research firm Hindenburg. Block, Cash App’s owner, disputed these 
claims, threatening legal action.

Meta introduced end-to-end encryption to Facebook Messenger in late 2023, but 
even before this, payment compliance analyst contractors could not access the 
Messenger chat between the two users exchanging funds. The former Meta 
compliance analyst told the Guardian their team could see only the transaction, 
any note attached to it and the relationship between the two users.

“I don’t know how you do compliance in general without being able to see 
intentions around transacting,” said Frances Haugen, a former Facebook employee 
turned whistleblower, who released tens of thousands of damaging documents 
about the company’s inner workings in 2021. “If the platforms actually wanted to keep 
these kids safe, they could.”

Siloed work prevents flagging suspicious transactions, say ex-moderators

Other former content moderators interviewed by the Guardian compared their jobs 
to call center or factory work. Their jobs entailed reviewing content flagged 
as suspicious by users and artificial intelligence software and making quick 
decisions on whether to ignore, remove or escalate the content to Meta through 
a software program. They say they could not communicate with the Meta Pay 
compliance analysts about suspicious transactions they witnessed.

“We were not allowed to contact Facebook employees or other teams,” one former 
moderator said. “Our managers didn’t tell us why this was.”

Gretchen Peters, executive director of the Alliance to Counter Crime 
Online, has documented the sale of narcotics, including fentanyl, over Meta’s 
platforms. She also interviewed Meta moderators who were not permitted to 
communicate with other teams in the company. She said this siloing was a “major 
violation” of “know your customer” banking regulations.

“We’ve heard from moderators at Meta they can see illegal conduct is occurring 
and that there are concurrent transactions through Meta Pay, but they have no 
way of communicating what they are seeing internally to moderators at Meta 
Pay,” said Peters.

A Meta spokesperson said the company prohibits the sale or purchasing of 
narcotics on its platforms and removes that content when it finds it.

“Meta complies with all applicable US anti-money laundering laws,” the 
spokesperson said. “It is also untrue to suggest that there is a lack of 
communication between teams. Content moderators are trained to escalate to a 
specific point of contact, who brings in the appropriate specialist team.”

In December, Meta announced it had rolled out end-to-end encryption for 
messages sent on Facebook and via Messenger. Encryption hides the contents of 
messages from anyone but the sender and intended recipient by converting text 
and images into unreadable ciphertext that is unscrambled on receipt.
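
As a rough illustration of the principle, the Python sketch below uses the 
third-party "cryptography" package to show how two parties can derive a shared 
key from exchanged public keys and encrypt a message so that anyone relaying it 
sees only ciphertext. It is a toy example of the general technique, not 
Messenger's actual protocol.

    # Toy end-to-end encryption sketch; not Messenger's actual protocol.
    # Requires: pip install cryptography
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    def derive_key(own_private, peer_public) -> bytes:
        """Both ends derive the same 32-byte symmetric key from the key exchange."""
        shared = own_private.exchange(peer_public)
        return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                    info=b"demo e2ee").derive(shared)

    # Each party generates a key pair; only the public halves are exchanged.
    sender_priv = X25519PrivateKey.generate()
    recipient_priv = X25519PrivateKey.generate()

    send_key = derive_key(sender_priv, recipient_priv.public_key())
    recv_key = derive_key(recipient_priv, sender_priv.public_key())
    assert send_key == recv_key

    # The sender encrypts; the platform relaying the message sees only ciphertext.
    nonce = os.urandom(12)
    ciphertext = AESGCM(send_key).encrypt(nonce, b"hello", None)

    # Only the intended recipient, holding the matching key, can unscramble it.
    assert AESGCM(recv_key).decrypt(nonce, ciphertext, None) == b"hello"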

Yet this move could also affect the company’s ability to prevent illicit 
transactions on Meta Pay. Child safety experts, policymakers, parents and law 
enforcement criticized the move, arguing encryption obstructs efforts to rescue 
child sex trafficking victims and the prosecution of predators.

“When Meta Pay is linked to Messenger or Instagram, the messages associated 
with payments could uncover illicit behaviors,” said Krupena. “Now that this 
context is removed, the implications are significant. It almost feels like 
encryption is inadvertently facilitating illicit activity. This opens many 
opportunities for criminals to hide in plain sight.”

A Meta spokesperson said the decision to move to encryption was to “provide 
people with privacy”, and that the company encourages users to self-report 
private messages related to child exploitation to the company.

“Moving to an encrypted messaging environment does not mean we will sacrifice 
safety, and we have developed over 30 safety tools, all of which work in 
encrypted messaging,” said the spokesperson. “We’ve now made our reporting 
tools easier to find, reduced the number of steps to report and started 
encouraging teens to report at relevant moments.”

FinCEN declined to comment. PayPal did not respond to a request for comment.
