People are creating rabbit holes of misinformation

Have you ever tried to search for something only to find little to no relevant information on your topic? That’s what we call a “data void.”
How do search engines find results?
When you do an online search, search engines like Google or Bing use algorithms to try to “guess” what you’re looking for. They rely on statistical models built from patterns in previous Internet searches, as well as data they can gather about you from your device (like your geographical location and your search history).
The algorithms then try to match all of this to the text on webpages. That’s why webpage optimization, or SEO (search engine optimization), matters so much to businesses – it helps search engines find their pages more easily. The better you are at optimizing your site, the more likely you are to land on the first page of results.
Seems simple, right?
But we’ve all searched for something only to be frustrated by irrelevant results. This happens when we search for something odd, when we misspell things, or when we’re simply looking for more information on a new topic that people haven’t written about yet.
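To make the idea concrete, here is a minimal sketch in Python of how simple keyword matching can leave a gap for an obscure phrase. The pages, web addresses, and queries are all invented for illustration; this is not how any real search engine actually ranks results, only a toy version of the underlying matching idea.

# Toy relevance scoring: count how many query words appear on a page.
# All URLs and page text below are made up for this example.
from collections import Counter

def score(query, page_text):
    """Return a crude relevance score: occurrences of query words in the page."""
    query_words = set(query.lower().split())
    page_words = Counter(page_text.lower().split())
    return sum(page_words[w] for w in query_words)

# A tiny, hypothetical "index" of pages.
index = {
    "news.example/flu-clinic": "flu season vaccine clinic hours and locations updated",
    "encyclopedia.example/influenza": "influenza is a contagious viral infection of the respiratory tract",
    "fringe.example/secret-cure": "the secret miracle cure for the flu they do not want you to know",
}

# A common query matches several mainstream pages...
print(sorted(index, key=lambda url: score("flu vaccine", index[url]), reverse=True))

# ...but an obscure phrase may only match the one page built to capture it.
# That gap is a data void waiting to be exploited.
print(sorted(index, key=lambda url: score("secret miracle cure", index[url]), reverse=True))

Real ranking systems are vastly more sophisticated, but the basic vulnerability is the same: if almost nothing legitimate matches a phrase, whatever does match wins.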
Exploiting data voids
It’s rare for a search engine to produce nothing at all when you do a search; at the very least, it will offer suggestions.
The problem is that there are people out there who are very good at predicting or finding these “data voids.” They then create webpages to exploit them, redirecting searchers to sites peddling anything from ads and viruses to conspiracy theories and misinformation.

Manipulating users into the voids
Maybe you’re thinking you don’t need to worry about any of this. But the truth is that people are taken in every day by websites created solely to manipulate and mislead.
To top it off, once a webpage is created for a unique search term, the people who built it go around to other websites (like Wikipedia or Reddit) and leave “breadcrumbs” that encourage people to visit their sites.
A shocking example
Michael Golebiewski and danah boyd (yes, that’s how she styles her name) of Microsoft wrote an illuminating report on this issue in 2018. It didn’t get nearly enough attention at the time, though, because people don’t like to think they can be so easily duped. Then came the COVID-19 pandemic and its associated data voids (after all, “COVID-19” was a brand-new phrase).
Now, their report is the cornerstone of understanding data voids, particularly when it comes to issues like white supremacy. They use a powerful example to illustrate exactly how these data voids can change the world for the worse:
When Dylann Roof murdered nine Black churchgoers (and injured one more) at the Emanuel African Methodist Episcopal Church in Charleston in 2015, he left behind a manifesto describing how he became a radical white supremacist.
Roof said:
The event that truly awakened me was the Trayvon Martin case. I read the Wikipedia article and right away I was unable to understand what the big deal was. It was obvious that Zimmerman was in the right. But more importantly this prompted me to type in the words ‘black on white crime’ into Google, and I have never been the same since that day. The first website I came to was the Council of Conservative Citizens. There were pages upon pages of these brutal black on white murders. I was in disbelief. At this moment I realized that something was very wrong. How could the news be blowing up the Trayvon Martin case while hundreds of these black on white murders got ignored?
Because the phrase “black on white crime” was not a popular search term for anyone except white supremacists, all he got back were stories in which Black people were accused of murder.
Filling the void
However, when Roof’s case hit the news, thousands of new websites mentioned the phrase. So now if you search for the term, the results are much more diverse. You’ll find news stories and fact sheets at the top of the search results instead of racist propaganda. There’s no data void there anymore.
Golebiewski and boyd also explained just how adaptable these bad actors can be. They know the Internet well enough to simply find a new search term to exploit, one that will lead people down another rabbit hole of misinformation. And they don’t stop at webpages; they make splashy videos too.
The fact that most people now get their news from social media only makes things worse.
Once it’s on the Internet, it becomes fodder for everyone from talk radio hosts to angry people who spend all day on social media sharing stories and videos that reinforce their view of the world.
So how do we stop it? Is it possible to teach everyone to be more “search savvy”? Or do companies have a responsibility to seek out and police these voids? If they do, how do they get around people claiming the right to freedom of speech, even when it’s hateful?
Read more:
Michael Golebiewski and danah boyd, “Data Voids: Where Missing Data Can Easily Be Exploited” (Data & Society, 2018)
Joshua Benton, “Watch Your Language: ‘Data voids’ on the web have opened a door to manipulators and other disinformation pushers” (Nieman Lab, 2019)
Francesca Tripodi, “Google and the Cost of ‘Data Voids’ During a Pandemic” (Wired, 2020)
Ari Schneider, “How Trolls Are Weaponizing ‘Data Voids’ Online” (Slate, 2020)
(Video) Annelise Wunderlich, “YouTube Algorithms: How To Avoid the Rabbit Hole” (KQED, 2019)
Kevin Roose, “Caleb Cain Was A College Dropout Looking For Direction. He Turned To YouTube” (The New York Times, 2019)
Karen Hao, “DeepMind Is Asking How AI Helped Turn the Internet Into An Echo Chamber” (MIT Technology Review, 2019)