

The crowd can do as good a job spotting fake news as professional fact-checkers -- if you group up enough people

"Quantity has a quality all its own" is an infamous quote, but it's not wrong.

Alexandru Micu
September 2, 2021 @ 10:15 pm


New research suggests that relatively small, politically balanced groups of laymen could do a reliable job of fact-checking news for a fraction of today’s cost.

Image credits: Gerd Altmann.

A study from MIT researchers reports that crowdsourced fact-checking may not actually be a bad idea. Groups of normal, everyday readers, it explains, can be virtually as effective as professional fact-checkers at assessing the veracity of news from the headline and lead sentences of an article. This approach could help address our current misinformation problem by increasing the number of fact-checkers available to curate content, at a lower cost than is currently possible.

Power to the people

“One problem with fact-checking is that there is just way too much content for professional fact-checkers to be able to cover, especially within a reasonable time frame,” says Jennifer Allen, a Ph.D. student at the MIT Sloan School of Management and co-author of a newly published paper detailing the study.

Let’s face it: we’re all on social media, and we’ve all seen some blatant disinformation out there, disinformation that, to add insult to injury, people were happily liking and retweeting. Calls for platforms to moderate content better have been raised again and again. Steering clear of the question of where exactly moderation ends and manipulation or censorship begins, one practical issue blocking such efforts is the sheer volume of work. There is a lot of content out in the online world, and more is published every day. By contrast, professional fact-checkers are few and far between, and the job doesn’t come with particularly high praise or high pay, so not many people are lining up to become one.

With that in mind, the authors wanted to determine whether nonprofessional fact-checkers could help stem the flow of misleading news. It turns out they can, if you group enough of them together. According to the findings, crowdsourced judgments from relatively small, politically balanced groups of ordinary readers can be virtually as accurate as those from professional fact-checkers.

The study examined over 200 news pieces that Facebook’s algorithms had flagged as requiring further scrutiny, either because of their content, because of the speed and scale at which they were being shared, or because they covered topics such as health. The participants, 1,128 U.S. residents, were recruited through Amazon’s Mechanical Turk platform.

“We found it to be encouraging,” says Allen. “The average rating of a crowd of 10 to 15 people correlated as well with the fact-checkers’ judgments as the fact-checkers correlated with each other. This helps with the scalability problem because these raters were regular people without fact-checking training, and they just read the headlines and lead sentences without spending the time to do any research.”

Participants were shown the headline and lead sentence of 20 news stories and asked to rate each along seven dimensions: how “accurate,” “true,” “reliable,” “trustworthy,” “objective,” and “unbiased” it was, and how much it “describ[ed] an event that actually happened”. These ratings were pooled to generate an overall score for each story.
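The paper describes the exact aggregation scheme; as a rough illustration only, with made-up numbers and names that are not from the study, pooling the seven per-dimension ratings into a single score per story could look something like this:

```python
import numpy as np

# Hypothetical example: 15 raters x 20 stories x 7 dimensions
# ("accurate", "true", "reliable", "trustworthy", "objective", "unbiased",
#  "describes an event that actually happened"), each rated on a 1-7 scale.
rng = np.random.default_rng(0)
ratings = rng.integers(1, 8, size=(15, 20, 7))

# Pool the seven dimensions into one score per rater and story,
# then average across the crowd to get a single score per story.
per_rater_score = ratings.mean(axis=2)      # shape: (raters, stories)
crowd_score = per_rater_score.mean(axis=0)  # shape: (stories,)
print(crowd_score.round(2))
```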

These scores were then compared to the verdicts of three professional fact-checkers, who evaluated all 207 stories involved in the study after researching each one. Although the ratings these three produced were highly correlated with one another, they didn’t see eye to eye on everything, which, according to the team, is par for the course in fact-checking research. All three fact-checkers agreed on the verdict for 49% of the stories; on 42%, two of the three agreed while the third did not; and on the remaining 9%, all three reached different verdicts.

When the regular reader participants were sorted into groups with equal numbers of Democrats and Republicans, the average ratings were highly correlated with those of the professional fact-checkers. When these balanced groups were expanded to include between 12 and 20 participants, their ratings were as strongly correlated with those of the fact-checkers as the fact-checkers’ were with each other. In essence, these groups matched the performance of the fact-checkers, the authors explain. Participants were also asked to take a political knowledge test and a test of their tendency to think analytically.
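The paper reports these comparisons as correlations between group averages and fact-checker ratings. A minimal sketch of the idea, using simulated data and a simple Pearson correlation rather than the authors’ actual dataset or analysis, might look like:

```python
import numpy as np

rng = np.random.default_rng(1)
n_stories = 207

# Hypothetical data: fact-checker verdicts and individual reader ratings on a
# 1 (false) to 7 (true) scale, with readers labeled by party affiliation.
fact_checker = rng.uniform(1, 7, size=n_stories)
dem_ratings = fact_checker + rng.normal(0, 1.5, size=(50, n_stories))
rep_ratings = fact_checker + rng.normal(0, 1.5, size=(50, n_stories))

def balanced_crowd_score(dem, rep, group_size):
    """Average the ratings of a politically balanced group:
    half Democrats and half Republicans for every story."""
    half = group_size // 2
    return np.vstack([dem[:half], rep[:half]]).mean(axis=0)

# Larger balanced groups track the fact-checkers more closely.
for size in (4, 12, 20):
    crowd = balanced_crowd_score(dem_ratings, rep_ratings, size)
    r = np.corrcoef(crowd, fact_checker)[0, 1]
    print(f"balanced group of {size:2d}: correlation with fact-checkers = {r:.2f}")
```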

Overall, the ratings of people who were better informed about civic issues and engaged in more analytical thinking were more closely aligned with the fact-checkers.

Judging from these findings, the authors explain, crowdsourcing could allow fact-checking to be deployed on a wide scale at low cost. They estimate that having news verified in this way would cost roughly $0.90 per story. This doesn’t mean the system is ready to implement, or that it could fix the issue completely by itself; mechanisms would have to be put in place to ensure that it can’t be gamed by partisans, for example.

“We haven’t yet tested this in an environment where anyone can opt in,” Allen notes. “Platforms shouldn’t necessarily expect that other crowdsourcing strategies would produce equally positive results.”

“Most people don’t care about politics and care enough to try to influence things,” says David Rand, a professor at MIT Sloan and senior co-author of the study. “But the concern is that if you let people rate any content they want, then the only people doing it will be the ones who want to game the system. Still, to me, a bigger concern than being swamped by zealots is the problem that no one would do it. It is a classic public goods problem: Society at large benefits from people identifying misinformation, but why should users bother to invest the time and effort to give ratings?”

The paper “Scaling up fact-checking using the wisdom of crowds” has been published in the journal Science Advances.
