
Latest scandals suggest it's time to build a better YouTube

Moderation woes prompt calls for a radical redesign
With 12 days' worth of video being uploaded every minute, YouTube faces a challenge ensuring that all its content is safe. (Pixabay)

The first YouTube video ever uploaded featured co-founder Jawed Karim describing elephants at the San Diego Zoo. That was in 2005.

How times have changed.

Today, 12 days' worth of video is uploaded to the platform every single minute. If you took one of the most popular YouTube videos ever posted, Psy's "Gangnam Style," and played it back to back, once for every time it's been watched, it would take you 6,000 years.

In other words, there is a lot of content on the Google-owned site — and that, especially lately, is causing a lot of problems.

Even though it's incredibly powerful, the algorithm that screens content for safety simply can't always keep up, according to Jamie Cohen, director of the New Media program at Molloy College in New York. And bad actors are able to exploit that gap.

Scandals plague YouTube

It's been a bad month for YouTube as it tries to manage its latest scandal. Some users have been exploiting the way the platform's algorithm suggests videos, and using the comment sections to send coded messages. That has led to allegations that links to child pornography are being exchanged via YouTube comments, and to people highlighting and sharing timestamps in videos of children that some viewers might find provocative.

To be clear, the videos themselves aren't illegal, but they are content that can be abused by some people.

YouTube says it has removed the ability to comment on most material that features children in an attempt to stop the abuse.


But it's not the first time YouTube has had to deal with this sort of abuse of its platform. This latest issue shows just how hard it can be for a platform that hosts billions of hours of video to control how it's being used.

As is so often the case with social-media platforms, it's a question of content and comment moderation.

Cohen said YouTube and its parent company, Google, need to recognize that it's no longer merely the video-sharing platform it started out as.

"YouTube is both an employer and a system, but refuses to act as a media production company," he told Spark host Nora Young.

"I argue that the algorithm itself should be reconsidered as a media environment," he said. "I think that's an entire TV environment that we've never planned for in the history of media."

Cohen suggested that YouTube, like Facebook, is reluctant to call itself a media producer, because then it has to take more responsibility for the content on the site. That scares investors, he said.

How to do YouTube differently

The algorithm that decides what is safe for people to view has to strike a balance between what is in the public interest and what keeps people's eyes glued to their screens through constant video recommendations.

Given that young people, especially, see YouTube as an authoritative source, that trade-off is dangerous, Cohen said.

Cohen suggested that a delay between when a video is uploaded and when it becomes publicly available might give the algorithm more time to make better decisions.

More aggressively, he suggested that YouTube be broken up into different segments, separating professional "YouTubers" from more DIY-style contributors.

Ultimately, preventing the posting of inappropriate content, or abuse of the comments section, isn't an easy problem to solve, a point Google CEO Sundar Pichai acknowledged in testimony before the U.S. House of Representatives in December.

"Maybe it's just scaling so large that as one system it will continue to have these problems pretty much forever," Cohen said.