Statistics obtained from the National Center for Missing & Exploited Children and recent court documents show that Amazon’s chat app Wickr has become a top destination for trading child sexual abuse material, and that the company has done little to identify such material or prevent it from being shared on its platform.
According to court records, online forums, law enforcement, and anti-exploitation activists, Wickr Me, an encrypted messaging software owned by Amazon Web Services, has become a go-to location for people to share images of child sexual abuse.
According to statistics acquired from the National Center for Missing & Exploited Children, or NCMEC, Wickr is not the only internet platform that needs to crack down on illegal content.
Experts and law enforcement officials say Amazon is doing comparatively little to aggressively combat the problem, making the app attractive to people who wish to trade such content because there is less risk of detection than in brighter corners of the internet.
Using a combination of private and public legal and news databases and search engines, researchers examined court documents from 72 state and federal child sexual abuse or child pornography prosecutions in the United States, United Kingdom, and Australia over the last five years where the defendant allegedly used Wickr (as it’s commonly known).
Nearly every prosecution that has concluded resulted in a conviction; the rest are still pending. Aside from a few instances in which Wickr was legally compelled to divulge information through a search warrant, almost none of the criminal complaints reviewed mention cooperation from Wickr at the time of filing.
Over a quarter of the prosecutions originated from law enforcement undercover operations on Wickr and other internet platforms.
Wickr-related posts soliciting child sexual abuse material are also strewn across the internet. NBC News discovered dozens of forums, accounts, and blogs on social media platforms including Reddit, Tumblr, and Twitter where hundreds of posts listed Wickr screen names while soliciting minors, people with access to them, or others interested in trading child sexual abuse material. No images of child sexual abuse were viewed during the reporting of this story.
“Wickr needs to do more in regards to identifying and taking steps to prevent child sexual abuse material being traded on their platform,” said John Shehan, vice president of NCMEC.
Meta-owned apps such as Facebook, WhatsApp, and Instagram use algorithmic detection to examine unencrypted text and media submitted to their platforms, such as content on a user’s profile, for evidence of child sexual abuse imagery.
Under U.S. law, electronic communication service providers must report known or discovered child sexual abuse material to the National Center for Missing & Exploited Children.
In 2021, Meta sent millions of reports to the center: Facebook sent 22,118,952 reports, Instagram sent 3,393,654 reports, and WhatsApp sent 1,372,696 reports. A high amount of reporting, according to experts, is a good thing because it shows that a company is proactive in detecting child exploitation material on its platform.
Wickr has considerably fewer users than similar platforms, but it self-reported only 15 incidents of child sexual abuse imagery, even though experts and law enforcement say the app is clearly used by people trading such content online.
Shehan said that roughly 3,500 reports of child sexual abuse content on Wickr originated from third parties unaffiliated with the company, suggesting that the company isn’t actively identifying such material itself, but rather allowing it to persist on the platform until users discover and report it.
“It’s very clear that they’re not taking any proactive efforts on their own to identify this type of activity,” he said, referring to the numbers.
“We act quickly on reports of illegal behavior, respond immediately to requests from law enforcement, and take the appropriate actions,” a company spokesperson said. “Anyone found to be in violation of our terms is subject to account termination.”
“Wickr absolutely responds appropriately to, and cooperates with, law enforcement on these critical matters,” the spokesperson said.
From Reddit and Twitter to Wickr
Child sexual abuse imagery has been a problem on the internet since the beginning of the consumer web, but the problem has grown in recent years as creating and sharing content has become easier than ever.
Wickr, one of the first end-to-end encrypted messaging apps, works much like other privacy-focused messengers. Users communicate with individuals or groups through encrypted channels that strip identifying details from messages.
This ensures that only the sender and receiver can see their messages, leaving little evidence of the conversation for law enforcement or Amazon to retrieve. Wickr’s technology, along with settings that allow messages to self-delete, has made it an appealing tool for many people seeking anonymity, including criminals.
Unlike its competitors WhatsApp and Signal, Wickr does not require any personal information to sign up, merely a username and password. Users can interact with others individually or in group chats via search or an invite once they’ve downloaded the app.
“Social media or more open spaces, or online gaming environments, will be used by adults to recruit — to approach — children to have more private contact in more private spaces,” she said.
Some Reddit users are well aware of the problem, and some subreddits have prohibited the posting of Wickr handles because of their association with child sexual abuse content.
“It’s been brought to my attention that people making posts about ‘taboo’ and ‘perv’ chats and posting their Wickr handles are really people looking to trade child porn and discuss pedophilia,” read a message pinned to the top of one subreddit dedicated to meth use.
Online, the terms “taboo” and “perv” are regularly employed as code for content about child sexual abuse.
“Avoid posting illegal content or soliciting or facilitating illegal or prohibited transactions,” Reddit warns in its content guidelines. Many Wickr communities still exist on the site, but NBC News uncovered seven banned subreddits whose titles included the word “Wickr.”
Five of the bans were for content that directly violated Reddit’s rules against sexually suggestive content involving children. In banning two other subreddits, “taboowickr” and “wickr nsfw,” Reddit cited rules concerning unmoderated communities and communities created to deliberately evade community norms. Reddit did not respond to questions about why the Wickr subreddits were banned.
In a statement, a Reddit spokesperson said: “Our sitewide policies explicitly prohibit any sexual or suggestive content involving minors or someone who appears to be a minor. This includes child sexual abuse imagery and any other content that sexualizes minors.
Our dedicated Safety teams use a combination of automated tooling and human review to detect and action this content across the platform. We regularly ban communities for engaging in the behavior in question, and we will continue to review and action violating subreddits, users, and content.”
Reddit isn’t the only platform where Wickr users try to find one another.
A search for “Wickr” on Twitter turned up tweets with Wickr usernames and hashtags like “teen,” “perv,” and “nolimits.” One user who appeared to be selling child sexual abuse material posted a Wickr handle alongside the phrases “Sixteen is a cool number” and “Sells to anyone” and the acronyms “map” and “aam,” which stand for minor-attracted-person and adult-attracted-minor, respectively. Other posts promoted the sale of drugs.
Twitter said it suspended several accounts that were identified during the reporting of this article. “Twitter has a zero-tolerance policy for child sexual exploitation content,” Twitter spokesperson Trenton Kennedy said in a statement.
“We aggressively fight online child sexual abuse and have heavily invested in technology and tools to enforce our policy. We have rules against non-consensual nudity and take strong enforcement action against this content.”
The court cases
The court filings reviewed show how Wickr users openly sell child sexual abuse material after connecting with groups or other users on the app. Even when law enforcement has amassed large amounts of evidence, Wickr’s cooperation appears to be minimal, according to the company’s responses cited in court filings and its own web page describing how it responds to legal requests.
Wickr is clear in its “Legal Process Guidelines” about the limited information it is willing to share with law enforcement. According to the page, “Non-public information about Wickr users’ accounts will not be released to law enforcement except in response to appropriate legal process such as a subpoena, court order, or other valid legal process.”
“Requests for the contents of communications require a valid search warrant from an agency with proper jurisdiction over Wickr. However, our response to such a request will reflect that the content is not stored on our servers or that, in very limited instances where a message has not yet been retrieved by the recipient, the content is encrypted data which is indecipherable.”
Wickr says its terms of service prohibit illegal activity, but the company has previously opposed efforts to expand law enforcement access to tech platforms. The Wickr Foundation, the company’s charitable arm, filed a friend-of-the-court brief in support of Apple in 2016, arguing against giving law enforcement tools that would allow access to encrypted communications.
According to the brief, “deliberately compromised digital security would undermine human rights around the globe.” In that case, Apple had been ordered to help law enforcement unlock an iPhone belonging to a mass shooter in San Bernardino, California. The order was eventually withdrawn.
The debate highlighted a growing divide between law enforcement and technology companies over encryption and potential evidence access in encrypted contexts. Wickr’s stance wasn’t groundbreaking at the time, and it reflected the concerns of many companies seeking to protect the security of encrypted settings.
Wickr’s apparent inaction in developing alternative ways to prevent crime on its platform, short of a “backdoor” around encryption, sets it apart from other tech companies such as Meta and Microsoft. Microsoft developed PhotoDNA, a technology that has been critical in identifying and combating the spread of child sexual abuse material across the internet and is now used to scan files in Microsoft’s OneDrive cloud.
A hands-off approach
Wickr’s inaction contrasts with what other corporations have done to address the issue of child sexual abuse content.
Baines noted that WhatsApp, which is also end-to-end encrypted, dramatically increased its reporting of child sexual abuse material by evaluating components of user accounts that sit outside encrypted chats, such as profile photos, usernames, and metadata.
“It’s morally the right thing to do to go looking for it,” Baines added, referring to the legal obligation to report such content.
Shehan cited one report to NCMEC’s tip line from a Wickr user as an example of what goes unchecked on the app: the user flagged a Wickr account named “BabyAbuse,” which used a profile photo of an infant being sexually abused.
“I would expect a company like Wickr, especially being a company and property advertised as being so closely aligned with AWS and Amazon, that they will be taking the right measures to identify this type of activity, especially even the account names and I mentioned that that’s the lowest hanging fruit that’s possible,” he said.
Some human rights activists warned against blaming Wickr’s issues with child abuse imagery on end-to-end encryption.
Anjana Rajan, the chief technology officer of Polaris, which runs the National Human Trafficking Hotline, disagreed with the argument that Wickr and other tech platforms must compromise privacy in order to prevent trafficking and child exploitation, and said that governments should instead focus on addressing societal issues that lead to crime.
“The debate is not around whether or not encryption is good or bad. It’s about how are traffickers exploiting vulnerabilities of vulnerable communities, and where are they doing that, and how do we actually get ahead of that vulnerability and meet that need,” she said.
“I think there’s oftentimes a bit of a boogeyman made around emerging technologies,” she said. “Technology is just a tool in which [crime] happens, but the underlying mechanisms need to be understood at its very core.”
Rajan believes encryption is part of a “human rights toolkit” that can help protect and empower victims. “How do we prevent abuse of these technologies rather than passing a broad, sweeping critique of a tool?” she asked.
“We really feel that in an encrypted environment, there are still ways that this activity can be identified,” Shehan said, adding that Wickr could do more without sacrificing its encrypted environment. “Companies like Wickr should be exploring how to make that happen within their platforms, while also preserving security.”
But, if it comes down to it, he believes that in the debate over technology and child sexual abuse material, children should come first. “We certainly definitely are big fans and supporters of privacy, but at the end of the day, not at the cost of children.”