
Most parents will admit to monitoring the television shows their children watch, making sure they're appropriate for their age. This can range from gritty crime dramas to reality shows to the newest CW hit.

But one thing parents don't seem to monitor as closely is the YouTube channels their children watch. Older internet users know that not everything online is appropriate for every age, but many parents haven't yet realized the kinds of content that can be found there.

Just before last year ended, one of the biggest content creators on YouTube, Logan Paul, whose large following is made up mostly of kids aged 9–14, uploaded a horrifying video to his channel. He went to the Aokigahara forest in Japan, notorious as a place where people go to commit suicide. There he and his friends found someone's deceased body, and Paul had the audacity to film the entire encounter.

By showing footage of the deceased individual in the video, using it in the thumbnail (or preview image) and titling the video "We found a dead body in the Japanese Suicide Forest . . .," Paul used the death of another human being to gain views and make money.

The video gained over 6.4 million views before Paul deleted it. For reference, that's about as many people as tuned in for the Season 4 premiere of Game of Thrones. The majority of his viewers were kids, and they saw no problem with the content until adults started pointing out how horribly disgusting it was.

This isn't the first time YouTube has run into trouble with content aimed at younger children missing the mark. In November, it was revealed that some horrible adults were taking advantage of YouTube's algorithms to show disturbing content on the YouTube Kids app.

YouTube Kids is an app that provides a family-friendly version of YouTube for younger children, with parental controls and video filters. But those parental controls and video filters didn't account for the horrible nature of people looking for a quick buck.

Videos were being posted that featured familiar cartoon characters (Elsa from Disney's Frozen, Paw Patrol, Peppa Pig, Spider-Man) and placed them in violent and sexual situations. Because these videos were tagged with the character and show names, they would show up on YouTube Kids.

And who doesn't know parents who just plop an iPad down in front of their young children without closely monitoring the content?

YouTube is a democratized platform, meaning creators and the audience itself are expected to make sure everyone follows the terms of service. But how can a 4-year-old know how to report a video showing Spider-Man violently kidnapping Elsa and killing Peppa Pig? Especially when it pops up in between videos of Disney songs and Sesame Street clips?

Parents, and even older siblings or cousins, should monitor what content is being consumed on YouTube. It's not like television, where you can just turn on Nick Jr. or Disney Junior and know the content will be appropriate.

Editorial policy is determined by the student editor, and views expressed in editorials are those of the majority of The Vidette’s Editorial Board. Columns that carry bylines are the opinions of the author and do not necessarily represent those of The Vidette or the University.
