YouTube Kids, YouTube’s initial entry into making its video-sharing service more accessible to parents and their children, has come under fire yet again. Two consumer groups, the Center for Digital Democracy and the Campaign for a Commercial-Free Childhood, are alleging that the app includes a number of videos that are inappropriate for children, including ones that reference sex, alcohol and drug use, child abuse, pedophilia and more.
The organizations compiled a video detailing their findings, using clips they pulled from the YouTube Kids application.
According to their review, the app allowed children to view videos that included sexual language; unsafe behaviors like playing with matches or juggling knives; profanity (e.g., in a parody of the film “Casino” featuring Bert and Ernie from Sesame Street); adult discussions about family violence, pornography, and child abuse; jokes about drug use and pedophilia; and alcohol ads.
The CDD and CCFC today provided the FTC with a video showing some of the content they were able to find.
[vimeo 127837914 w=500 h=281]
“Google is deceiving parents by marketing YouTube Kids as a safe place for children under five to explore when, in reality, the app is rife with videos that would not meet anyone’s definition of ‘family friendly,’” the groups said in a statement released today.
This is not the first time the YouTube Kids app has been reported to the FTC. Earlier in April, these two groups, along with the American Academy of Child and Adolescent Psychiatry and Consumer Watchdog, filed a complaint asking the FTC to investigate how advertising is handled in the YouTube Kids application. The groups argued that the way hosts of the online videos sometimes sell products inside their shows would be illegal if those shows aired on TV instead of online.
YouTube responded by noting that it worked with a number of partners and child advocacy groups when developing its app and said that it disagreed with the groups’ assertion that “no free, ad-supported experience for kids will ever be acceptable.” (YouTube collaborated with Common Sense Media, ConnectSafely.org, the Family Online Safety Institute, the Internet Keep Safe Coalition (iKeepSafe) and WiredSafety.org to build YouTube Kids).
But while parents may have bristled at the idea that YouTube Kids advertises to children in ways that are different from, and potentially more influential than, what they’re used to on TV, news that the app could actually expose kids to very adult content may make them sit up and take notice.
The problem in this situation could stem from the fact that the app is continually updated with new content as more “kid” videos are uploaded to YouTube. That ongoing deluge (the family entertainment sector on YouTube has grown by 200 percent year-over-year) could have allowed some inappropriate videos to slip through the cracks of YouTube’s review mechanisms.
When YouTube Kids first launched, video content was carefully selected in preparation for the app’s debut and included shows and other videos pulled from YouTube then organized into categories like Shows, Music, Learning and Explore.
Aimed largely at the preschool crowd, the app includes videos from well-known children’s entertainment brands like DreamWorks TV, Jim Henson TV, Mother Goose Club, Talking Tom and Friends, National Geographic Kids, Reading Rainbow, and Thomas the Tank Engine, as well as other shows from YouTubers like Vlogbrothers and Stampylonghead.
YouTube also claimed at the time that new content would be doubly filtered for quality control: first algorithmically, then by an internal team that would manually sample videos. The company said this two-step process would mean a delay between videos arriving on YouTube.com and becoming available in the Kids app.
But a larger problem with the YouTube Kids app is that it offers a search feature that can lead kids to videos that haven’t been explicitly included and organized into the app’s various sections. Parents, of course, can disable search from the parental controls, but many may not be aware this option exists or that they should make the change.
Google noted that anyone can flag a video, and that flagged videos are reviewed and removed from the app if found inappropriate. The company also suggested that parents disable search if they’re concerned about what children might be able to find.
Following these complaints, however, it may make more sense for search to be disabled by default instead of the other way around.
Despite the issues, YouTube Kids remains a better alternative to YouTube proper for parents who let their children watch YouTube videos on mobile. It’s arguable, though, that YouTube – even vetted and curated for kids – is not an ideal service for younger viewers. There’s still a clear risk in letting kids use it: their ability to accidentally stumble across adult content could make the app a non-starter for parents looking for an “iPad babysitter” rather than something parent and child use together.