YouTube is equally to blame for Logan Paul’s video

It appears that YouTube is more responsible for the first crisis of the year on its video platform than was initially thought.

Yesterday, the internet was rightly outraged by news that YouTube star Logan Paul, who has 15 million subscribers and is part of YouTube’s Red subscription service, posted and later deleted a video that included extensive footage of a suicide victim filmed at Japan’s ‘Suicide Forest’.

Paul deleted the video less than 24 hours after posting it, following widespread outrage, but not before it had been watched by some six million people and, it now emerges, approved by YouTube's own moderation team.

That revelation comes from a member of YouTube's own content assessment team, who posted a screenshot showing that the video had been approved on January 1 after being flagged by concerned viewers, as BuzzFeed first noted.

Let that sink in for a minute. A person who is paid to keep unsuitable content off the platform reviewed this video, including the footage of the victim's barely-blurred-out corpse hanging from a tree, and decided that it was the kind of thing that should exist on the internet's most popular video service.

The video included the hanging body in its thumbnail and was titled "We found a dead body in the Japanese Suicide Forest." Yet despite that, and the disturbing scenes it included, it not only passed YouTube's moderation check but also went on to rank among the site's top ten trending videos, exposing those scenes to viewers well beyond Paul's already-popular channel. (Notably, many of Paul's subscribers are under 18.)

YouTube’s guidelines specifically state that “it’s not okay to post violent or gory content that’s primarily intended to be shocking, sensational, or disrespectful.”

Paul has since apologized a second time, but now the focus must be on how and why YouTube did not remove the video.

“Our hearts go out to the family of the person featured in the video. YouTube prohibits violent or gory content posted in a shocking, sensational or disrespectful manner. If a video is graphic, it can only remain on the site when supported by appropriate educational or documentary information and in some cases it will be age-gated. We partner with safety groups such as the National Suicide Prevention Lifeline to provide educational resources that are incorporated in our YouTube Safety Center,” a YouTube spokesperson told TechCrunch in a statement.

The spokesperson did not comment on whether YouTube had taken additional action against Paul, such as issuing a strike against his channel. According to its policies, channels that receive three strikes inside a three-month period are removed from the service, but each strike expires after three months.

According to the pseudonymous YouTube content moderator, other channels that reposted Paul's video, predominantly to criticize it, were hit with strikes.

A still from the video, via YouTuber Kavos

The incident may seem like a minor wrinkle in YouTube's ongoing troubles, given that the video was deleted within 24 hours, but it exposes just how broken YouTube's current system is. It's all the more worrying when you consider that YouTube claims over a billion users, who "each day… watch a billion hours of video, generating billions of views."

YouTube has pledged to increase its investment in artificial intelligence moderation and grow its army of content checkers and moderators to 10,000 people, but a more thorough revamp of its approach seems to be needed. There's also plenty of well-justified concern that relying on AI won't be enough, as evidenced by Google's failure to respond to questions and examples aired by the Home Affairs Committee in the UK's Parliament just weeks ago.

Perhaps the most damning criticism of Paul’s video came from another video star.

PewDiePie, YouTube's most popular channel owner with over 50 million subscribers, has been a vocal critic of Paul and his equally brash younger brother Jake, though he is himself no stranger to getting into hot water over video content.

“It encompasses everything wrong with YouTube, the clickbait, the sensationalism, the thing that’s got to keep pushing [the envelope]. At the end of the day, it just shines bad on everyone,” the YouTuber, real name Felix Kjellberg, said in a video.

Note: article updated to correct that PewDiePie has 50 million subscribers, not 50, because of course he does.