FBI, law-enforcement authorities struggle to keep up with thousands of reports pouring in about pedophiles and child porn on Facebook

In a dark attic, a little girl, no older than 9, appears with a distrustful look on her face, wearing only a training bra. She glimpses the camera out of the corner of her eye.
Author Chelsea Schilling writes:
She is just one of 40 preteen girls on a single profile who are dressed in sexy swimsuits, bras or nothing at all.
It is just one photo among thousands of explicit images and videos of child sexual exploitation – all available on the social network 901 million users have come to know and love, Facebook. Most of America’s users have no idea that the social network is home to an enormous collection of unreported child pornography and sexual violence.
Editor’s note: This is Part 2 of a multi-part series by WorldNetDaily investigating child pornography on Facebook. We thank WND for giving The User Group Network and the UGN InfoManager permission to reprint this article. If you do not read the whole article (8,000+ words), at least watch the videos at the end.
As WND previously reported, most of these predators aren’t merely looking at child pornography images. A 2007 Federal Bureau of Prisons study, in which psychologists conducted an in-depth survey of online offenders’ sexual behavior, revealed that 85 percent of convicted Internet offenders said they had committed acts of sexual abuse against minors, from inappropriate touching to rape.
The U.S. Department of Justice explains:
Unfortunately, the child pornography market exploded in the advent of the Internet and advanced digital technology. The Internet provides ground for individuals to create, access, and share child sexual abuse images worldwide at the click of a button. … Child pornography offenders can connect on Internet networks and forums to share their interests, desires, and experiences abusing children in addition to selling, sharing, and trading images.
Moreover, online communities have promoted communication between child pornography offenders, both normalizing their interest in children and desensitizing them to the physical and psychological damages inflicted on child victims. Online communities may also attract or promote new individuals to get involved in the sexual exploitation of children.
As part of an undercover news investigation, WND used alias Facebook profiles and located dozens of child-porn images after “friending” many likely pedophiles and predators who trade thousands of pornographic photos on the social network. WND immediately reported graphic images of children and sex abuse to the FBI.
Other photos and videos of young children wearing G-strings and posing provocatively were reported to Facebook first to test the social network’s response.
Early in the investigation, Facebook was slow to remove photos or profiles – and, in some cases, didn’t appear to act at all. After three weeks of observation, some newly reported profiles were removed within 48 hours. However, most explicit videos and interest groups remain on the social network today, and new pedophile profiles are astonishingly simple to locate.
Despite repeated requests over the course of almost two months, Facebook did not respond to phone calls and emails from WND about the numerous images and videos shared by its users of children being sexually abused or posing nude. However, after Part 1 of this series was published, the social network provided a brief, emailed statement that WND posted here.
(Several news outlets reported a similar Facebook response just months ago following a deluge of complaints about its many pages promoting and joking about rape.*)
Raymond Bechard, author of “The Berlin Turnpike: A True Story of Human Trafficking in America,” launched Men Against Prostitution And Trafficking, the first anti-human trafficking political action committee in the U.S. He has used alias Facebook profiles to find child predators and turn them over to federal authorities.
Bechard told WND:
If you report the images to Facebook, its system is woefully inadequate and extremely unresponsive to reporting a crime of this nature … You report it to Facebook, and you really don’t know if anything has happened. There’s no response – ever – and then you have to go back and see if it’s ever been removed. If a photo/video link does disappear, and they do take it down, you have no way of telling whether the crime you have witnessed has been reported to law enforcement.
In one case, nearly 80 photos of a young girl, about 8 years old, revealed a child posing in hot-pink thong underwear and climbing a tree. The girl spread her legs as the photographer took pictures of her buttocks and crotch area while standing only one or two feet from the young subject.
When WND reported the photos to the social network by phone and its online application on March 20, Facebook neither returned calls nor removed the images. The photos were still there on April 3 – when WND notified the FBI.
Within 24 hours of filing a report with the FBI, the entire user profile was removed.
FBI officials responding to reports of child porn on Facebook were courteous and helpful. However, WND asked FBI spokeswoman Jenny Shearer whether Facebook reports child pornography to authorities and readily cooperates with investigations.
“We don’t talk about our working relationships with private industry, so I can’t offer a comment,” Shearer told WND, refusing to discuss law-enforcement dealings with Facebook or Twitter in cases of child pornography.
Asked whether social media is becoming a magnet for child porn, Shearer replied, “It seems that people use social networking sites for all sorts of interests, including those that may involve, unfortunately, predation of children.”
In its “Statement of Rights and Responsibilities,” Facebook tells users: “You will not post content that: is hateful, threatening, or pornographic; incites violence; or contains nudity or graphic or gratuitous violence.”
In a twist of irony, the social network blocked WND’s alias accounts due to “security reasons” immediately after dozens of images and videos of child sexual abuse were reported by those accounts. In many cases, the reported content had been posted for months and had been viewed and shared by dozens or even hundreds of pedophiles.
Richard Lepoutre has been actively involved in the fight to protect children from sexual abuse for more than 25 years and is a co-founder of the Stop Child Porn on Facebook campaign, which targets pedophiles on the social network. He also wages the battle against commercial sexual exploitation through his work with the Stop Online Exploitation Campaign and Men Against Prostitution and Trafficking.
“The futility of this is that, as you are reporting that individual image and trying so hard to get it taken down, it is, of course, being replicated hundreds and perhaps thousands of times in all sorts of other locations,” he explained. “It doesn’t really go away. As long as it’s up there, there are other exchange partners and other profiles that are grabbing this image. While you can ask for it to be taken down in one place under one profile, it’s very likely that it continues to propagate digitally. The real issue here is preventing the photo from getting up there in the first place.”
“This community of perverts and criminals” has its own “viral” phenomenon…
You can be sure a prized photo that shows up at some point is distributed and shared probably thousands of times within hours … It’s like chasing cockroaches. You think you’re getting them, but the image is like a proverbial cockroach that just clones and clones and clones. It’s hiding and it’s all over the place. You turn on the light and they scatter. But just because you don’t see them, doesn’t mean they’re not there.
The U.S. Department of Justice notes:
[V]ictims of child pornography suffer not just from the sexual abuse inflicted upon them to produce child pornography, but also from knowing that their images can be traded and viewed by others worldwide. Once an image is on the Internet, it is irretrievable and can continue to circulate forever. The permanent record of a child’s sexual abuse can alter his or her life forever. Many victims of child pornography suffer from feelings of helplessness, fear, humiliation, and lack of control given that their images are available for others to view in perpetuity.
During the course of their work, Bechard and Lepoutre have reported numerous images and profiles to Facebook. While accounts and links to child-pornographic material may be deactivated by the website, many pedophiles reappear within weeks or even days, they said. In such cases, the repeat offenders repost their massive albums of abuse.
Why aren’t they preventing it from being posted in the first place? And what are they doing to investigate where it came from? Is there any data mining they can do to find out if they’re coming from one place in particular? Why not aim resources at this to truly investigate it the way this level of crime deserves to be investigated?
Lepoutre listed names of some of the largest adult video pornography websites in the world. The material for the websites is submitted and uploaded by individuals, much like the process for posting videos on YouTube or Facebook.
You can go to any of the huge porn sites and you will not find child pornography on those websites … It stands to reason that they have the means and technology and apparently the motivation not to allow child pornography to be uploaded. Therefore, technically speaking, and from a resource perspective, why can’t Facebook do just as well as the major porn sites?
Foreign screeners making $1/hour?
Facebook has divulged very little about its content screening process. However, on Feb. 16, the news blog Gawker.com reported that it had interviewed Amine Derkaoui, a 21-year-old Moroccan man who claimed to have spent weeks training to screen illicit Facebook content through the California-based outsourcing firm oDesk.
Derkaoui said he was paid $1 an hour.
According to Gawker, Derkaoui provided some internal documents explaining how Facebook censors its content.
He said the content moderation team used a Web-based tool to view photos, videos and wall posts reported by Facebook users. The moderators have a choice between three actions: 1) confirm the flag and delete the content, 2) unconfirm it and let the content stay or 3) escalate the content to a higher level of moderation for examination by Facebook employees. (See Facebook’s “cheat sheet” for screeners)
After acing a written test and an interview, [Derkaoui] was invited to join an oDesk team of about 50 people from all over the third world – Turkey, the Philippines, Mexico, India – working to moderate Facebook content … They work from home in 4-hour shifts and earn $1 per hour plus commissions (which, according to the job listing, should add up to a ‘target’ rate of around $4 per hour).
According to the report, the job posting made no mention of Facebook. Derkaoui also said his oDesk managers never openly said Facebook was the client. However, Gawker noted, a Facebook spokesman confirmed the social network was oDesk’s client.
Other sources claiming to have been Facebook moderators complained about the nature of their work cleaning up the website.
“Think like that there is a sewer channel,” one person said during a Skype chat with the blog, “and all of the mess/dirt/waste/sh-t of the world flow towards you and you have to clean it.”
Another person quit after only three weeks of moderating.
“Pedophelia, necrophelia, beheadings, suicides, etc,” he recalled. “I left [because] I value my mental sanity.”
When WND asked Facebook about this report, a Facebook spokesman responded with the following statement:
In an effort to quickly and efficiently process the millions of reports we receive every day, we have found it helpful to contract third parties to provide precursory classification of a small proportion of reported content. These contractors are subject to rigorous quality controls and we have implemented several layers of safeguards to protect the data of those using our service. Additionally, no user information beyond the content in question and the source of the report is shared. We have, and will continue, to escalate the most serious reports internally, and all decisions made by contractors are subject to extensive audits.
We are constantly improving our processes and review our contractors on an ongoing basis. This document provides a snapshot in time of our standards with regards to one of those contractors, for the most up to date information please visit Facebook.com/CommunityStandards.
Facebook and PhotoDNA
Both the National Center for Missing and Exploited Children, or NCMEC, and Facebook tout the use of PhotoDNA software to combat child pornography. PhotoDNA* creates a digital code to represent a particular image – a sort of fingerprint or signature – and locates that exact image within large data sets. The tool is capable of finding specific images, even if they have been altered.
Microsoft donated the software to the NCMEC in December 2009. Just last year, Facebook began using PhotoDNA to hunt for thousands of registered illegal images uploaded by its users. The software had a detection accuracy of 99.7 percent in tests. It finds and removes only known, reported and specific images of sexual exploitation of pre-pubescent children.
Chris Sonderby, Facebook’s associate general counsel, was a prosecutor with the U.S. Department of Justice for 12 years. From 2006 until mid-2010, Sonderby served as the DOJ’s representative in Asia based at the U.S. embassy in Bangkok, where he worked with U.S. and foreign law-enforcement authorities on large-scale transnational criminal matters. Before his tour in Bangkok, Sonderby served as chief of the Computer Hacking and Intellectual Property unit of the U.S. Attorney’s office in San Jose, Calif.
He declared in a May 20, 2011, livestream event titled “Facebook D.C. Live: Protecting Kids Online”:
PhotoDNA is truly a game-changing technology in this fight, and we’re very excited for the opportunity to apply it on our site and really be successful in significantly reducing the amount of child exploitation images that are allowed to proliferate. In our case, we intend to put the technology to use against between 200 and 300 million photo images that are uploaded to Facebook every day. So the technology will allow us to block their upload, to prevent their distribution and the revictimization of the children who are depicted in those images, and it will also allow us to immediately refer and report those instances to law enforcement so they can take immediate action. We really think again that this is a game changer and we’re thrilled to be a part of this partnership and looking forward to continuing to work together on this.
Michelle Collins, vice president for the exploited children division at NCMEC, told WND, “I really feel like in the industry … while certainly the problem continues to grow, so does the response of many of the companies and certainly the response of law enforcement.”
We have a lot of voluntary initiatives with large companies here in the U.S. where we will provide them with PhotoDNA, in which they look specifically for child pornography images so they can remove them. It’s like locating a needle in a haystack. They have so many images coming through their servers. They utilize technology that allows them to identify and pull out those images that are known to be illegal child pornography images.
However, PhotoDNA does not locate and remove new photos of abuse. It only finds images that have already been identified and listed in NCMEC’s database of photos. Furthermore, the image-matching technology does not locate videos of child sexual abuse for removal.
In one of Facebook’s own news releases on the subject, it stated: “[PhotoDNA] won’t be able to identify new pictures of child pornography nor will it tag your typical child photos as pornography. It will only catch those already known by the NCMEC.”
Lepoutre said the social network’s use of PhotoDNA software is a positive step, but it is not a “proactive effort” to prevent photos from appearing on Facebook in the first place:
I would ask: How is chasing down a picture that is already out there with PhotoDNA ‘proactive’? What a bunch of PR bullsh-t! Isn’t that amazing? Perhaps there are some people eyeballing some stuff in Morocco, but none of it is ‘proactive.’
Asked about this PhotoDNA limitation, Collins told WND:
It is true. In order for a PhotoDNA signature to be generated, you need to have the image. We do, of course, see new images and new videos appearing online.
People 20 years ago would have to risk exposure by trying to find individuals who had the same sexual interests in children. With the Internet and these tools, it’s easy for people to feel cloaked in anonymity and be able to normalize and validate their sexual interests in children by talking to people all around the globe. It certainly fuels the production of more images and videos.
Facebook responds to lawmaker inquiry
In an Aug. 4, 2011, email to a congressional staffer in the office of Rep. John Larson, D-Conn., Facebook responded to an inquiry on the topic of child pornography on its website. It stated:
There are several mechanisms we utilize at Facebook to surface [child pornography] material and distributors. I have included some of our protections here:
* Deploy sophisticated technology to detect groups, both open and closed, that pertain to child abuse in any form, including work with leaders in the technology industry to collaboratively develop new technologies
* This is over and above our existing technologies that monitor and flag suspicious behavior by individuals
* This includes those obvious terms, but also codes and descriptors that these groups often use to avoid being detected
* All of this technology is supplemented by expert human eyes and ears constantly searching and shutting down groups and blocking users
* We identify the most up-to-date words and techniques used by child abuse rings/groups through expert insight from NCMEC and Interpol
* We also work in conjunction with CEOP/NCMEC to share information and flag cases in rare instances of specific concern
Nothing is more important to Facebook than the safety of the people that use our site and this material has absolutely no place on Facebook.
Unfortunately, people have attempted to use technology to distribute illegal and deeply offensive content from the earliest days of the public Internet. We have zero tolerance for this activity on Facebook and are extremely aggressive in preventing and removing child exploitive content as well as reporting it and the people responsible for it to law enforcement. We’ve built complex technical systems that either block the creation of this content, including in private groups, or flag it for quick review by our team of investigations professionals.
Additionally, we maintain a robust reporting infrastructure that leverages the over 500 million people who use our site to keep an eye out for offensive or potentially dangerous content. This reporting infrastructure includes report links on pages across the Facebook site, systems to prioritize the most serious reports, and a trained team of analysts who respond to reports and escalate them to law enforcement as needed. This team treats reports of exploitative content as an utmost priority.
We’ve also worked with the National Center for Missing and Exploited Children and New York State Attorney General Andrew Cuomo in the U.S., as well as the Child Exploitation and Online Protection Centre in the UK, to use known databases of child exploitive material to improve our detection and bring those responsible to justice.
As a result of that particular correspondence, Bechard said, Facebook simply agreed to remove the “like” group for “Nude Teen.” However, he said, search results for “Nude Teen” show the phrase is back and as popular as ever – and the overall problem of explicit “likes” and groups does not appear to have improved.
Some people, including those with law-enforcement backgrounds, are becoming frustrated in their efforts to report content to Facebook.
I just took a phone call from a retired police officer in the Midwest who was quite frantic … He called because he found child porn on Facebook.com and he said, ‘I’m a retired police officer. I thought I could maybe reach Facebook and report what I’ve just seen.’
The man said he tried to reach Facebook by phone, but his efforts had been futile.
He saw Facebook’s ‘report the image’ business, but he wanted to talk to somebody and report this problem … After several hours of trying to reach someone at Facebook, I explained to him what could and could not be done with regard to an IC3 form [FBI reporting procedure]. He said he was going to go do that.
Not only does there seem to be this disconnect with people like us who have been studying this issue, the folks out there wanting to report this stuff can’t do anything about it, either.
After Facebook executives read Part 1 of this WND series, they contacted Sgt. Greg Lombardo, commander for the Silicon Valley Internet Crimes Against Children taskforce, and asked him to speak with WND about his experiences dealing with the social network on this issue.
When WND asked Lombardo about the limitations of PhotoDNA, how easy it is for pornographers to open numerous alias Facebook accounts, and why Facebook allows explicit groups and “likes,” he responded:
I do understand what you are saying. I think most of the things you talked about are probably better answered by Facebook. All I can tell you about is our communication with Facebook and the cooperation that we’ve received from them. In the last six months, they’ve really done a lot to help us out. If we need search warrant information, they get it to us right away. They actually started a new portal that allows us to get Facebook information right away, and we’ve used it many times. They’re working with us, so I don’t have any complaints. I think they’ve been great lately. I’m sure there are going to be areas where they can improve, and that’s probably why we have people like you who are trying to probe and find out what’s going on.
I read your article, and it was well-written. I just want to make it clear that they are cooperating with us. I don’t have any issues with them at all. If there are things they can improve on that you’ve found, then that’s perfect, because we’re all trying to do the same thing here. We’re all trying to stop this. You may have found other things for them to look at here.
However, Bechard explained that an FBI special agent told him it had taken Facebook as long as eight months to respond to a single child-pornography inquiry.
Bechard asked WND:
Is anyone comparing the numbers of profiles and videos that get reported and removed from Facebook as child pornography – and therefore criminal – and the number of reports made to law enforcement? That number should be the same. If they take something down, acknowledging that it’s a crime, they should be reporting the crime to law enforcement. The biggest frustration is that it doesn’t seem like that is happening. The crime seems to just go away at the Facebook level.
Asked whether there is a way to cross-check the number of child-pornography reports Facebook receives from its users against the number of reports Facebook sends to law-enforcement authorities, Lombardo said:
That’s another good question for Facebook, because what we receive is what we get from NCMEC. I can’t compare to a number of reports Facebook received. There are things I can’t answer because I don’t know.
Bechard, frustrated with the reporting process, explained:
I can’t think of another crime – at least at this felonious level – where people can witness it and 1) they actually have no idea how to report it and 2) everyone in law enforcement tells them to go to an outside, nonprofit agency to report it [NCMEC]. And then that agency is the clearinghouse for investigative material for law enforcement to use. I don’t know of any other crime that is reported that way.
Asked how he would respond to claims that Facebook has implemented numerous proactive steps to ensure child porn is eradicated from its site and is overall doing a great job in the areas of prevention and elimination of the content, Bechard replied:
“I would put them in front of a computer and – within 30 seconds – show them child pornography on Facebook. Lots of it!”
Chelsea Schilling is a commentary editor and staff writer for WND, an editor of Jerome Corsi’s Red Alert, and a proud homeschooling mother. Schilling joined the Army at age 17, receiving the exceptional designation of expert marksman three times. In addition to WND, Schilling has worked as a news producer at USA Radio Network and as a news reporter for the Sacramento Union.
Part 1 of this series
Facebook Pulls A Few Controversial Rape Pages, But Many Remain
The Berlin Turnpike: A True Story of Human Trafficking
Richard Lepoutre: Stop Child Porn on Facebook campaign
PhotoDNA : Facebook’s New Way to Combat Child Pornography