Don’t Go Down the Rabbit Hole

https://www.nytimes.com/2021/02/18/opinion/fake-news-media-attention.html

For an academic, Michael Caulfield has an odd request: Stop overthinking what you see online.

Mr. Caulfield, a digital literacy expert at Washington State University Vancouver, knows all too well that at this very moment, more people are fighting for the opportunity to lie to you than at perhaps any other point in human history.

Misinformation rides the greased algorithmic rails of powerful social media platforms and travels at velocities and in volumes that make it nearly impossible to stop. That alone makes information warfare an unfair fight for the average internet user. But Mr. Caulfield argues that the deck is stacked even further against us: the way we’re taught from a young age to evaluate and think critically about information is fundamentally flawed and out of step with the chaos of the current internet.

“We’re taught that, in order to protect ourselves from bad information, we need to deeply engage with the stuff that washes up in front of us,” Mr. Caulfield told me recently. He suggested that the dominant mode of media literacy (if kids get taught any at all) is that “you’ll get imperfect information and then use reasoning to fix that somehow. But in reality, that strategy can completely backfire.”

In other words: Resist the lure of rabbit holes, in part, by reimagining media literacy for the internet hellscape we occupy.

It’s often counterproductive to engage directly with content from an unknown source, and people can be led astray by false information. Influenced by the research of Sam Wineburg, a professor at Stanford, and Sarah McGrew, an assistant professor at the University of Maryland, Mr. Caulfield argued that the best way to learn about a source of information is to leave it and look elsewhere, a concept called lateral reading.

For instance, imagine you were to visit Stormfront, a white supremacist message board, to try to understand racist claims in order to debunk them. “Even if you see through the horrible rhetoric, at the end of the day you gave that place however many minutes of your time,” Mr. Caulfield said. “Even with good intentions, you run the risk of misunderstanding something, because Stormfront users are way better at propaganda than you. You won’t get less racist reading Stormfront critically, but you might be overloaded by information and overwhelmed.”

Our current information crisis, Mr. Caulfield argues, is an attention crisis.

“The goal of disinformation is to capture attention, and critical thinking is deep attention,” he wrote in 2018. People learn to think critically by focusing on something and contemplating it deeply — to follow the information’s logic and the inconsistencies.

That natural human mind-set is a liability in an attention economy. It allows grifters, conspiracy theorists, trolls and savvy attention hijackers to take advantage of us and steal our focus. “Whenever you give your attention to a bad actor, you allow them to steal your attention from better treatments of an issue, and give them the opportunity to warp your perspective,” Mr. Caulfield wrote.

One way to combat this dynamic is to change how we teach media literacy: Internet users need to learn that our attention is a scarce commodity that is to be spent wisely.

In 2016, Mr. Caulfield met Mr. Wineburg, who suggested modeling the process after the way professional fact checkers assess information. Mr. Caulfield refined the practice into four simple principles:

1. Stop.

2. Investigate the source.

3. Find better coverage.

4. Trace claims, quotes and media to the original context.

Otherwise known as SIFT.

Mr. Caulfield walked me through the process using an Instagram post from Robert F. Kennedy Jr., a prominent anti-vaccine activist, falsely alleging a link between the human papillomavirus vaccine and cancer. “If this is not a claim where I have a depth of understanding, then I want to stop for a second and, before going further, just investigate the source,” Mr. Caulfield said. He copied Mr. Kennedy’s name in the Instagram post and popped it into Google. “Look how fast this is,” he told me as he counted the seconds out loud. In 15 seconds, he navigated to Wikipedia and scrolled through the introductory section of the page, highlighting with his cursor the last sentence, which reads that Mr. Kennedy is an anti-vaccine activist and a conspiracy theorist.

“Is Robert F. Kennedy Jr. the best, unbiased source on information about a vaccine? I’d argue no. And that’s good enough to know we should probably just move on,” he said.

He probed deeper into the method to find better coverage by copying the main claim in Mr. Kennedy’s post and pasting that into a Google search. The first two results came from Agence France-Presse’s fact-check website and the National Institutes of Health. His quick searches showed a pattern: Mr. Kennedy’s claims were outside the consensus — a sign they were motivated by something other than science.

The SIFT method and the instructional unit that accompanies it (about six hours of class work) have been picked up by dozens of universities across the country and by some Canadian high schools. What is potentially revolutionary about SIFT is that it focuses on making quick judgments. A SIFT fact check can, and should, take just 30, 60 or 90 seconds to evaluate a piece of content.

The four steps are based on the premise that you often make a better decision with less information than you do with more. Also, spending 15 minutes to determine a single fact in order to decipher a tweet or a piece of news coming from a source you’ve never seen before will often leave you more confused than you were before. “The question we want students asking is: Is this a good source for this purpose, or could I find something better relatively quickly?” Mr. Caulfield said. “I’ve seen in the classroom where a student finds a great answer in three minutes but then keeps going and ends up won over by bad information.”

SIFT has its limits. It’s designed for casual news consumers, not experts or those attempting to do deep research. A reporter working on an investigative story or trying to synthesize complex information will have to go deep. But for someone just trying to figure out a basic fact, it’s helpful not to get bogged down. “We’ve been trained to think that Googling or just checking one resource we trust is almost like cheating,” he said. “But when people search Google, the best results may not always be first, but the good information is usually near the top. Often you see a pattern in the links of a consensus that’s been formed. But deeper into the process, it often gets weirder. It’s important to know when to stop.”

Christina Ladam, an assistant political science professor at the University of Nevada, Reno, has seen the damage firsthand. While teaching an introductory class as a Ph.D. student in 2015, she noticed her students had trouble vetting sources and distinguishing credible news from untrustworthy information. During one research assignment on the 2016 presidential race, multiple students cited a debunked claim from a satirical website claiming that Ben Carson, a candidate that year, had been endorsed by the Ku Klux Klan. “Some of these students had never had somebody even talk to them about checking sources or looking for fake news,” she told me. “It was just uncritical acceptance if it fit with the narrative in their head or complete rejection if it didn’t.”

Ms. Ladam started teaching a SIFT-based media literacy unit in her political science classes because of the method’s practical application. The unit is short, only two weeks long. Her students latched onto quick tricks like how to hover over a Twitter handle and see if the account looks legitimate or is a parody account or impersonation. They learned how to reverse image search using Google to check if a photo had been doctored or if similar photos had been published by trusted news outlets. Students were taught to identify claims in Facebook or Instagram posts and, with a few searches, decide — even if they’re unsure of the veracity — whether the account seems to be a trustworthy guide or if they should look elsewhere.

The goal isn’t to make political judgments or to talk students out of a particular point of view, but to get them to understand the context of a source of information and make decisions about its credibility. The course isn’t precious about insisting on formal academic sources, either.

“The students are confused when I tell them to try and trace something down with a quick Wikipedia search, because they’ve been told not to do it,” she said. “Not for research papers, but if you’re trying to find out if a site is legitimate or if somebody has a history as a conspiracy theorist and you show them how to follow the page’s citation, it’s quick and effective, which means it’s more likely to be used.”

As a journalist who can be a bit of a snob about research methods, it makes me anxious to type this advice. Use Wikipedia for quick guidance! Spend less time torturing yourself with complex primary sources! A part of my brain hears this and reflexively worries these methods could be exploited by conspiracy theorists. But listening to Ms. Ladam and Mr. Caulfield describe disinformation dynamics, it seems that snobs like me have it backward.

Think about YouTube conspiracy theorists or many QAnon or anti-vaccine influencers. Their tactic, as Mr. Caulfield noted, is to flatter viewers while overloading them with three-hour videos laced with debunked claims and pseudoscience, as well as legitimate information. “The internet offers this illusion of explanatory depth,” he said. “Until 20 seconds ago, you’d never thought about, say, race and IQ, but now, suddenly, somebody is treating you like an expert. It’s flattering your intellect, and so you engage, but you don’t really stand a chance.”

What he described is a kind of informational hubris that is quite difficult to fight. But what SIFT and Mr. Caulfield’s lessons seem to do is flatter their students in a different way: by reminding us that our attention is precious.

The goal of SIFT isn’t to be the arbiter of truth but to instill a reflex that asks whether something is worth one’s time and attention, and to turn away if not. Because the method is less interested in political judgments, Mr. Caulfield and Ms. Ladam have noticed, students across the political spectrum are more likely to embrace it. By the end of the two-week course, Ms. Ladam said, students are better at finding primary sources for research papers. In discussions they’re less likely to fall back on motivated reasoning, and they tend to be less defensive when confronted with a piece of information they disagree with. Even if their opinions on a broader issue don’t change, a window opens that makes conversation possible. Perhaps most promising, she has seen her students share the methods with family members who post dubious news stories online. “It sounds so simple, but I think that teaching people how to check their news source by even a quick Wikipedia can have profound effects,” she said.

SIFT is not an antidote to misinformation. Poor media literacy is just one component of a broader problem that includes more culpable actors like politicians, platforms and conspiracy peddlers. If powerful, influential people with the ability to command vast quantities of attention use that power to warp reality and platforms don’t intervene, no mnemonic device can stop them. But SIFT may add a bit of friction into the system. Most important, it urges us to take the attention we save with SIFT and apply it to issues that matter to us.

“Right now we are taking the scarcest, most valuable resource we have — our attention — and we’re using it to try to repair the horribly broken information ecosystem,” Mr. Caulfield said. “We’re throwing good money after bad.”

Our focus isn’t free, and yet we’re giving it away with every glance at a screen. But it doesn’t have to be that way. In fact, the economics are in our favor. Demand for our attention is at an all-time high, and we control supply. It’s time we increased our price.
