Is there illegal child pornography on YouTube?
By Zachary Margulis-Ohnuma
An article in today’s New York Times suggests that there is an “open gate for pedophiles” on YouTube because of the way the video hosting service suggests videos to users.
If you look at one video of a partially clothed child on YouTube, the service’s algorithm will send you to more and more videos that are similar, the Times reports.
That has the effect of turning harmless videos of children into “sexualized imagery.”
While any single video may be appropriate, the net effect is "unmistakable."
The result: hundreds of thousands of people might look at a single home video made by a child—obviously far more people than the family and friends the video was meant for.
The Times article blows a lot of dog whistles to scare people. Each individual video is clearly harmless, and YouTube denies steering people toward sexual or illegal content. The company made some changes in response to the criticism, including disabling comments on most videos of children. A few months ago, Wired reported that comments on specific videos were being used to steer people toward sexualized (or, perhaps better put, sexualizable) content involving children.
So does that mean there is child porn on YouTube?
First, let’s state the obvious: it is a very bad idea to use YouTube to obtain erotic imagery of children. At the same time, it is probably not illegal.
In the United States, even erotic images are protected by the First Amendment unless they involve the actual victimization of a child.
Under federal law and the laws of most states, illegal child pornography is content that includes sexual conduct, which can include both sexual interactions and the “lewd” or “lascivious” display of a child’s genitals. How do you know if a picture of a child is “lewd” or “lascivious”? Many courts apply the so-called Dost factors, named after a 1986 case. The factors include:
whether the focal point of the visual depiction is on the child’s genitalia or pubic area;
whether the setting of the visual depiction is sexually suggestive, i.e., in a place or pose generally associated with sexual activity;
whether the child is depicted in an unnatural pose, or in inappropriate attire, considering the age of the child;
whether the child is fully or partially clothed, or nude;
whether the visual depiction suggests sexual coyness or a willingness to engage in sexual activity;
whether the visual depiction is intended or designed to elicit a sexual response in the viewer.
So under Dost, a fleeting image of buttocks or even a child’s crotch in a non-sexual home movie will generally not constitute child pornography. Context matters. If a case goes to court, a judge and jury may have to decide.
But even if it is not illegal, it is dangerous to search on YouTube for erotic or eroticized content involving children. Google, which owns YouTube, is not shy about referring suspected child pornography to the police.
YouTube searches provoke law enforcement scrutiny. Our law office has defended cases where people were arrested for ambiguous images; the prosecutor dropped those charges but then went after our clients for other images found on their electronic devices.
By looking for sexual images of children on YouTube, you put yourself in jeopardy. If a person is charged with possession or distribution of child pornography, YouTube viewing history—even unrelated to the charges—can be used to show the person’s knowledge and intent with respect to material that is clearly illegal.
If you have any question about that, you should consult with a lawyer with experience defending child pornography cases.
On the other side of the screen, it is a mistake for parents to allow children to upload any content for public viewing on YouTube if they are concerned the content could be sexualized.
With or without the algorithm, people will find what they are looking for in videos that come anywhere near the line.
So in the end, it’s best to keep your kids’ videos entirely private.