Research Suggests Facebook’s Algorithm Is ‘Influential’ but Doesn’t Necessarily Change Beliefs

Mike Isaac and Sheera Frenkel, reporting last week for The New York Times:

In the papers, researchers from the University of Texas, New York
University, Princeton and other institutions found that removing
some key functions of the social platforms’ algorithms had “no
measurable effects” on people’s political beliefs. In one
experiment on Facebook’s algorithm, people’s knowledge of
political news declined when their ability to reshare posts was
removed, the researchers said.

At the same time, the consumption of political news on Facebook
and Instagram was highly segregated by ideology, according to
another study. More than 97 percent of the links to news
stories rated as false by fact checkers on the apps during the
2020 election drew more conservative readers than liberal readers,
the research found. […] Still, the proportion of false news
articles that Facebook users read was low compared with all news
articles viewed, researchers said.

Consumption of false news articles was low overall, but the articles deemed false were overwhelmingly consumed by conservatives. That’s no surprise, but to me it gets to the heart of the controversy. A hypothetical social media algorithm that promotes true stories and suppresses false ones, with perfect accuracy, is going to be accused by conservatives of being biased against conservatives, because conservatives are drawn to false stories.

Jeff Horwitz, reporting for The Wall Street Journal (News+ link), on Meta overstating the degree to which these new studies exonerate its platforms:

Science warned Meta earlier this week that it would publicly
dispute an assertion that the published studies should be read as
largely exonerating Meta of a contributing role in societal
divisions, said Meagan Phelan, who oversees the communication of
Science’s findings.

“The findings of the research suggest Meta algorithms are an
important part of what is keeping people divided,” Phelan told
Meta’s communications team on Monday, according to an excerpt of
her message she shared with The Wall Street Journal. She added
that one of the studies found that “compared to liberals,
politically conservative users were far more siloed in their news
sources, driven in part by algorithmic processes, and especially
apparent on Facebook’s Pages and Groups.”

 ★ 