There is a vast network of conspiracy videos on YouTube that feeds off tragic events — including the recent shooting in Parkland, Florida — according to a prominent misinformation researcher.
In a nutshell: if you are creative, outrageous or different, do wild and unexpected things (like biting into Tide Pod detergent) or espouse provocative alternative views, you, too, can get to the top of a social media Trending tab, especially on YouTube.
The Google-owned network has been a hotbed of controversy for much of 2017 and into 2018, for a variety of reasons.
Jonathan Albright, research director for the Tow Center for Digital Journalism at Columbia University, has studied misinformation on YouTube going back to the 2016 presidential election.
Instead, Albright focused on the finding that the many thousands of conspiracy videos he could identify were being pushed by YouTube’s “Up next” recommendations, which are designed to keep users on the site watching more videos.
Each of these tragedies has been followed by waves of conspiracy theory videos challenging the facts of what happened. Several became YouTube hits.
In each instance, YouTube said it would take down videos if they were “flagged” by a viewer.
And after a flurry of public outrage, and some advertiser grumbling, YouTube apologized. In the case of conspiracy videos, it said it would change its algorithm to allow only credible news sources in the trending chart, a useful tool for viewers to find what’s new and hot.
YouTube displays a list of suggested videos whenever a user watches a video on its platform. How that list is created is a secret, much like the algorithms that run Google’s search engine and Facebook’s news feed, but it is known to be optimized to keep users on YouTube for as long as possible.
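To illustrate the general idea in the abstract, here is a hypothetical sketch of a ranker that orders candidate videos purely by a crude predicted-watch-time score. Every field, function name and weight in it is invented for illustration; it bears no relation to YouTube’s actual, undisclosed system.

```python
# Hypothetical illustration only: a toy "up next" ranker that orders candidate
# videos by a crude predicted-watch-time score. All data and weights are invented.

def predict_watch_time(video, viewer_history):
    """Guess how long a viewer will stay on a candidate video.

    Assumption for this sketch: longer videos on topics the viewer has
    already watched hold attention better, so they score higher.
    """
    topic_affinity = viewer_history.count(video["topic"]) + 1
    return video["duration_minutes"] * topic_affinity

def rank_up_next(candidates, viewer_history, limit=5):
    """Return candidates sorted by predicted watch time, highest first."""
    return sorted(
        candidates,
        key=lambda v: predict_watch_time(v, viewer_history),
        reverse=True,
    )[:limit]

if __name__ == "__main__":
    history = ["news", "conspiracy", "conspiracy"]
    candidates = [
        {"title": "Local news recap", "topic": "news", "duration_minutes": 3},
        {"title": "What they aren't telling you", "topic": "conspiracy", "duration_minutes": 25},
        {"title": "Cooking basics", "topic": "cooking", "duration_minutes": 10},
    ]
    for video in rank_up_next(candidates, history):
        print(video["title"])
```

The point of the sketch is only that a ranker optimized for time on site will, by construction, keep serving whatever a viewer has already shown a willingness to watch, with no notion of whether it is true.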
Viewed by over 1.5 billion people monthly, YouTube gets 400 hours of new video submitted every minute, and the Google-owned unit has built a thriving business based on a computer-run operation, with little human intervention. (Google generated $32 billion in revenue in the most recent quarter, mostly from ad sales that are primarily self-service.)
Blogger Rakesh Agrawal says tech firms have a libertarian ethic of providing a platform for many different points of view, and don’t want to be seen as being in the position of censoring material.
“That’s not a good approach anymore,” he says. “Trending needs to be reviewed. There’s no reason an editorial team can’t look at this.”