YouTube allows toxic videos on platform

On April 2, Bloomberg published a report alleging that YouTube hasn't clamped down on harmful content because that content is so profitable. The article paints a picture of a company with a severely bifurcated culture. YouTube's executives are presented as largely uninterested in content moderation.

By contrast, the video-sharing service's rank-and-file employees have grown increasingly concerned about the site's distribution of conspiracy theories, political extremism, and other objectionable content.

The piece also outlines why YouTube has become a hotbed for toxic videos.


An Inauspicious Beginning

Bloomberg argues that YouTube's problems began shortly after its 2005 launch. Millions of people visited the platform every day, but it wasn't taken seriously because its most-viewed content was inconsequential and unprofitable.

However, the company’s moderators could restrict or remove harmful content like pro-anorexia videos during this period.

Google acquired YouTube in 2006, but it didn't get serious about monetizing the platform until years later. In 2012, the company determined that the path to transforming its video segment into a moneymaker was engagement: the more time people spent watching videos, the more advertisements the firm could attach to the clips it hosted.

To that end, then-YouTube CEO Salar Kamangar set a goal of 1 billion hours of video watched per day. It would take the corporation four years and one problematic change to reach that milestone.

“Bad Virality”

Around the middle of this decade, various YouTube staffers noticed a trend: outrageous content tended to perform very well on the site in terms of both views and watch time. Because the recommendation algorithms were tuned to maximize those same engagement metrics, they began unintentionally surfacing videos of an extremely controversial or disturbing nature.
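To see why engagement-driven ranking drifts toward outrage, consider a minimal sketch of a recommender that scores videos purely by predicted watch time. The video data and scoring logic below are invented for illustration; this is not YouTube's actual system, just a toy model of the incentive Bloomberg describes.

```python
# Toy illustration of "bad virality": a ranker that optimizes only for
# watch time favors whatever holds attention longest, regardless of how
# objectionable it is. All titles and numbers here are invented.

videos = [
    {"title": "Cooking tutorial",   "avg_watch_minutes": 4.2},
    {"title": "Cat compilation",    "avg_watch_minutes": 5.1},
    {"title": "Outrage conspiracy", "avg_watch_minutes": 11.8},
]

def engagement_score(video):
    """Score a video by average watch time alone -- the single metric a
    watch-hours goal rewards. Nothing here penalizes harmful content."""
    return video["avg_watch_minutes"]

# Recommend the highest-scoring videos first.
for v in sorted(videos, key=engagement_score, reverse=True):
    print(f'{v["title"]}: {engagement_score(v):.1f} min')

# The conspiracy video ranks first because it holds attention longest;
# the objective function has no notion of "harmful."
```

The point of the sketch is that no one has to intend the outcome: once the objective is raw engagement, outrage wins the ranking by default.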

Bloomberg named Infowars founder Alex Jones as one creator who benefited from the "bad virality" trend. Despite pushing conspiracy theories that got him taken to court and banned from other platforms, he amassed a large audience on YouTube.

At the height of its popularity, Jones' channel had 2.4 million subscribers and 1.6 billion views. His following grew in part because YouTube didn't have a conspiracy theory policy in place until 2017. Eventually, Jones' repeated violations of community guidelines got him kicked off the platform, but similar channels are still thriving.

As an example, a channel called ihealthtube has racked up more than 78 million views over 12 years despite hosting videos that promote anti-vaccination conspiracies. Because toxic videos increase engagement, however, the company's executives have rejected suggestions to purge or quarantine such content.

Information Cues Aren’t Enough

YouTube now embeds text boxes beneath conspiracy videos containing information from online encyclopedias about the video's topic. The firm's current chief executive, Susan Wojcicki, unveiled these "information cues" at the 2018 South by Southwest event.

The company’s textual solution seems designed to clamp down on the spread of fake news while not taking an editorial position that would endanger its safe harbor status. However, the text inserts are flawed in a few different ways. As Bloomberg reports, information cues aren’t applied to all questionable videos on a channel.

Moreover, the service sometimes applies text boxes to videos that don’t feature misleading content. For example, a clip of WWE superstar John Cena announcing the death of Osama bin Laden features an information cue about the terrorist.

Presumably, the video was flagged because Cena refers to SEAL Team Six as "we," and an algorithm or a non-native-English-speaking moderator mistakenly took the reference to mean the professional wrestler was claiming responsibility for killing bin Laden.

While moderating content for a platform the size of YouTube is obviously difficult, the company’s current solutions are plainly inadequate. Moreover, the Bloomberg article makes a compelling case that the firm’s executives aren’t doing more because they don’t want to cut into their profit margins.

Analysts believe YouTube generated $15 billion in ad revenue last year.

But in the aftermath of the New Zealand massacre, tech companies are starting to realize that a hands-off approach to user content doesn't work. More than 1.58 billion people watch the platform's videos every month, giving it tremendous reach and influence. Hopefully, its leadership will soon harness that power in a way that benefits the public as well as the company.
