    An ex-Google engineer is scraping YouTube to pop our filter bubbles

    He’s built a website that shows which videos YouTube’s algorithm recommends most often, so you can find out where it wants to take you.

    If you’ve ever used YouTube, you’ve probably noticed that it’s easy to fall into a sort of viewing trance: you start out watching a funny cat video, and suddenly it’s an hour later and you’ve blown through a bunch more—each one recommended to you on the right side of the screen, served up helpfully by the site’s algorithm.

    As we’re increasingly learning from social networks like Facebook and Twitter, algorithms can be used to manipulate people in all kinds of ways by showing us more of one thing and less of another. This might be fine (and fun!) when you’re looking for cute kitten videos, but it can lead you down a path of fear, conspiracy theories, and one-sided thinking if you find yourself watching videos about some other subjects, such as vaccines, recent school shootings, or climate change.

    YouTube—whose more than a billion users watch over a billion hours per day—shows us some data, like how many times a video has been viewed, liked, or disliked. But it hides more granular details about each video, like how often the site recommended it to other people. Without the full picture, it can be hard to know why, exactly, its algorithm is steering you in a certain direction.

    Photo: Guillaume Chaslot, a programmer who used to work for YouTube and Google, built AlgoTransparency.org. (Courtesy of Guillaume Chaslot)

    Guillaume Chaslot, a computer programmer who spent some time working on recommendations at YouTube and on display advertising at its parent company, Google, thinks this is a problem, and he’s fighting to bring more transparency to the ways videos are recommended. He built a website, AlgoTransparency, so you can see where YouTube’s algorithm takes you if you follow its recommendations—whether you’re searching for videos about recent elections, mass shootings, science, or a handful of other general topics and related search terms he’s cherry-picked to scrape.

    “Everybody should know, if you start to spend time on YouTube, where it’s going to take you,” he says.

    Since he started tracking recommendations in 2016, he’s found that for some phrases, like “vaccines facts” or “global warming,” YouTube’s recommendation algorithm pushes viewers toward conspiracy-theory, anti-science, or anti-media videos. And the algorithm seems to favor videos of more divisive politicians, who talk in an aggressive, bullying manner, he says.

    He also tracks terms that are mentioned most frequently in the most-recommended videos. On April 1, for instance, results for “is the earth flat or round” most commonly featured words like “flat,” “NASA,” “proof,” and “secret.”
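
    That kind of tally is simple once you have the titles of the recommended videos in hand; the scraping is the hard part. A minimal sketch in Python, using an invented list of titles as a stand-in for scraped data:

    ```python
    # Sketch of the word-frequency count behind AlgoTransparency's term
    # rankings. The titles below are made-up stand-ins for scraped data.
    from collections import Counter

    titles = [
        "FLAT EARTH PROOF they don't want you to see",
        "NASA insider reveals the secret",
        "Is the earth flat or round? Hidden proof",
    ]

    stopwords = {"the", "they", "you", "to", "or", "is", "a", "don't", "want", "see"}

    words = (
        word.strip("?!.,").lower()
        for title in titles
        for word in title.split()
    )
    counts = Counter(w for w in words if w and w not in stopwords)
    print(counts.most_common(4))  # most frequent non-stopword terms
    ```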

    Chaslot worked at YouTube in 2011 and then at Google until 2013. (He claims he was fired for trying to give users more control over the algorithms that recommend content; neither Google nor YouTube addressed that contention in response to a request for comment about this and other issues he has raised.) He figured this out by tracking YouTube’s suggestion algorithm: he built software that simulates starting out watching one video and then clicking the recommended “Up next” video (which will also play automatically if you have YouTube’s autoplay feature turned on), over and over and over.
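
    His crawler isn’t reproduced here, but the core loop is easy to sketch. A minimal version in Python, assuming a hypothetical helper get_up_next() that scrapes the top “Up next” video ID from a watch page (YouTube’s real markup changes often, so the helper is left unimplemented):

    ```python
    # Minimal sketch of simulating YouTube's "Up next" chain, the core of
    # Chaslot's approach. get_up_next() is a hypothetical helper; actual
    # scraping depends on YouTube's ever-changing page markup.

    def get_up_next(video_id: str) -> str:
        """Hypothetical: return the ID of the top 'Up next' recommendation."""
        raise NotImplementedError("depends on YouTube's current markup")

    def follow_recommendations(seed_id: str, hops: int = 10) -> list[str]:
        """Start at one video, then repeatedly 'click' the top recommendation."""
        chain = [seed_id]
        for _ in range(hops):
            chain.append(get_up_next(chain[-1]))  # simulates autoplay/clicking
        return chain

    # Running many chains from different seed videos and tallying which IDs
    # recur reveals what the algorithm most often steers viewers toward.
    ```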

    In addition to following where the algorithms can lead, Chaslot wants to make YouTube viewers think more about how recommendations could be used to rack up views. For instance, if you search for “Parkland shooting” and are served a video that has 2.5 million views on YouTube, was it algorithmically recommended 50 million times to get those 2.5 million people to watch, or was it recommended 500,000 times and then shared organically?

    “That’s a world of difference,” Chaslot says. And there’s currently no way to know.
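
    The arithmetic behind that difference is simple. A quick back-of-the-envelope check in Python, using the hypothetical numbers above:

    ```python
    # Back-of-the-envelope: the same 2.5 million views imply very different
    # stories depending on how often the video was recommended.
    views = 2_500_000
    for recommendations in (50_000_000, 500_000):
        ratio = views / recommendations
        print(f"{recommendations:,} recommendations -> {ratio:.0%} views per recommendation")
    # 50M recommendations -> 5%: the algorithm drove the views.
    # 500K recommendations -> 500%: most views must have come from elsewhere,
    # e.g. organic sharing.
    ```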

    Replying to questions about how and why YouTube suggests videos to users, a YouTube spokeswoman provided a statement saying its recommendation system has “changed substantially” over time and no longer works the way it did five years ago, when Chaslot was an employee. Whereas YouTube used to focus on watch time, it says, it’s now also looking at how “satisfied” people are, as measured by surveys, likes, dislikes, and other evidence.


    YouTube is also making changes to address issues with its recommendations. A version of the YouTube Kids app will reportedly ditch the algorithm for video recommendations in favor of having humans make them. And in March, YouTube CEO Susan Wojcicki said the site would add Wikipedia links to videos that included “significantly debated” topics, such as those focused on conspiracy theories.

    For this second move, Chaslot wonders why YouTube wouldn’t just add Wikipedia links for all kinds of topics relevant to its videos. “That would be more natural,” he says.

    Furthermore, he doesn’t think it’s hard to build tools that can get people to browse more broadly. While he was at YouTube in 2011, he says, he actually prototyped a tool that worked with users’ Google search results and took their search history into account to broaden their horizons.

    If you searched for, say, “Ford Focus,” you’d see specific results, but as you scrolled down the page, you’d see more general car results. If you kept going, you’d see results related to other things you’d searched for in the past (say, ice skating).
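
    Chaslot hasn’t published that prototype, but the tiered blending he describes can be sketched with made-up result lists standing in for real search sources:

    ```python
    # Sketch of the result-broadening Chaslot describes: specific matches
    # first, broader results next, then results tied to past searches.
    # All three lists are invented stand-ins for real result sources.

    def broaden(specific, general, from_history):
        """Order result tiers so that scrolling widens the horizon."""
        return list(specific) + list(general) + list(from_history)

    results = broaden(
        specific=["Ford Focus review", "Ford Focus 2018 specs"],
        general=["best compact cars", "how car safety ratings work"],
        from_history=["ice skating basics"],  # drawn from earlier searches
    )
    for rank, result in enumerate(results, start=1):
        print(rank, result)
    ```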

    “It’s easy to build tools to get people outside of their filter bubbles, enable them to go further, see what’s beyond their filter bubble,” he says.
