New research published Thursday offers an unprecedented look into political behavior on Facebook and Instagram, two of the largest online platforms where people express and engage with their political beliefs. The research, conducted by an interdisciplinary team of outside researchers in collaboration with internal Meta groups, consists of four papers published in Science and Nature that examine activity on both platforms around the time of the 2020 U.S. election.
The studies, the first of many to be published in the coming months, arose from the 2020 Facebook and Instagram Election Study (FIES), an unusual collaboration between Meta and the academic research community. On the academic side, the project was led by University of Texas Professor Talia Jomini Stroud of the Center for Media Engagement and New York University Professor Joshua A. Tucker, co-director of the Center for Social Media and Politics.
The discoveries are numerous and complex.
In one study on Facebook’s ideological echo chambers, researchers set out to learn how many of the platform’s users were exposed primarily to information they already agreed with. “Our analyses highlight that Facebook, as a social and informational setting, is substantially segregated ideologically—far more than previous research on internet news consumption based on browsing behavior has found,” the researchers wrote.
The data yielded at least two notable findings. First, the researchers found that content posted in Facebook Groups and Pages exhibited far higher “ideological segregation” than content shared by users’ friends. “Pages and Groups contribute much more to audience polarization and segregation than users,” the researchers wrote.
That may seem obvious, but Groups and Pages have historically played a significant role in disseminating misinformation and bringing like-minded users together around dangerous shared interests, such as QAnon, anti-government militias (including the Proud Boys, who used Facebook to recruit), and potentially life-threatening health conspiracies. Misinformation and extremism researchers have long expressed concern about the two Facebook products’ roles in political divisiveness and the spread of conspiracy theories.
“Our results uncover the influence that two key affordances of Facebook—Pages and Groups—have in shaping the online information environment,” the researchers wrote. “Pages and Groups benefit from the easy reuse of content from established producers of political news and provide a curation mechanism by which ideologically consistent content from a wide variety of sources can be redistributed.”
That analysis also revealed a significant disparity between liberal and conservative political content on Facebook. Meta’s third-party fact-checking system rated a “far larger” share of conservative Facebook news content false, meaning conservative Facebook users are exposed to far more online political misinformation than their left-leaning counterparts.
“… Misinformation shared by Pages and Groups has audiences that are more homogeneous and completely concentrated on the right,” the researchers wrote.
In another experiment conducted with Meta’s assistance, users’ algorithmic feeds on Facebook and Instagram were swapped for a reverse chronological feed, a common rallying cry among those weary of social media’s endless scrolling and addictive designs. The change had little effect on how users felt about politics, how politically engaged they were offline, or how much political knowledge they gained.
There was one significant difference for users who received the reverse chronological feed. “We found that users in the Chronological Feed group spent dramatically less time on Facebook and Instagram,” the authors wrote, highlighting how Meta stimulates engagement, and drives compulsive patterns of use, by mixing content into an algorithmic jumble.
These findings are only a sample of what the four papers contain, and a small portion of what will appear in future publications. Meta has spun the new studies as a vindication, a framing that reduces complex findings to a marketing exercise. Whatever Meta’s interpretation of the results, and however unusual the arrangement between the academics and the company, this data is critical for future social media research.