Studies highlight social-media platforms' misinformation problem, underlining the role of the news media

Recent studies show that misinformation is rife on the popular social-media platforms Instagram and Facebook, highlighting not only the importance of the news media as a source of reliable information but also the difficulty of its task, as conspiracy-minded Americans seek out biased or false content about politics, the pandemic, and more.

One study found that Instagram recommended many posts containing misinformation about the pandemic last fall. "The study is the latest effort to document how social media platforms' recommendation systems contribute to the spread of misinformation, which researchers say has accelerated over the past year, fueled by the pandemic and the fractious U.S. presidential election," Shannon Bond reports for NPR. "Facebook, which owns Instagram, has cracked down more aggressively in recent months. It has widened its ban on falsehoods about Covid-19 vaccines on its namesake platform and on Instagram in February. But critics say the company has not grappled sufficiently with how its automated recommendations systems expose people to misinformation. They contend that the social networks' algorithms can send those who are curious about dubious claims down a rabbit hole of more extreme content."

A new study by New York University-based group Cybersecurity For Democracy "found that far-right accounts known for spreading misinformation are not only thriving on Facebook, they're actually more successful than other kinds of accounts at getting likes, shares and other forms of user engagement," Michel Martin reports for NPR. After studying more than 8 million Facebook posts from nearly 3,000 news and information sources over a five-month period, the researchers confirmed "what some Facebook critics — and at least one anonymous executive — have been saying for some time: that far-right content is just more engaging. In fact, the study found that among far-right sources, those known for spreading misinformation significantly outperformed non-misinformation sources."

Laura Edelson, who helped lead the study, told NPR that misinformation peddlers in other partisan categories don't gain as much traction. "There could be a variety of reasons for that, but certainly the simplest explanation would be that users don't find them as credible and don't want to engage with them," she said. The researchers called this the "misinformation penalty."

A Facebook spokesperson told NPR that extreme partisan content isn't quite as pervasive as the study suggests, and said that engagement isn't the same as how many people actually see a post. Edelson responded by urging Facebook to back up that assertion by being transparent about how it tracks impressions and content promotion. "I think what's very clear is that Facebook has a misinformation problem," Edelson said. "I think any system that attempts to promote the most engaging content, from what we can tell, will wind up promoting misinformation."

Facebook cracked down after the election, demoting posts and users known to spread misinformation. To many, the move served as proof that Facebook could do more to halt the spread of misinformation but generally chose not to in the name of driving more traffic, The Washington Post reports.

"All of these changes may, in fact, make Facebook safer. But they also involve dialing back the very features that have powered the platform’s growth for years. It’s a telling act of self-awareness, as if Ferrari had realized that it could only stop its cars from crashing by replacing the engines with go-kart motors," Kevin Roose writes for The New York Times.

And Facebook's efforts arguably didn't help much, the Post reports: Facebook users who wanted to read that type of content responded not by consuming less of it but by decamping to Parler and other social-media apps popular with conservatives. 

Journalists have warned readers for years about the growing threat of QAnon, the Proud Boys, and other extremist groups that routinely organize on social media, but such warnings don't do much to sway Americans who distrust the news media, Rob Tornoe reports for Editor & Publisher.

"Most of these people aren’t just going to suddenly start reading real news. It’s not going to happen," Ben Collins, who covers disinformation, extremism and the internet for NBC News, told Tornoe.

from The Rural Blog https://ift.tt/2PLWM6J
