Boler in Media

How social media echo chambers influence your emotions and your political compass

An expert explains how Facebook lives on psychology, politics and your attention
By Megan DeLaire, Dec. 22, 2020

“The fact that we no longer have privacy in the digital sphere means our political system is increasingly being run by algorithms, by social media operations that are influencing what news we receive and what political ads we receive according to what they know about us,” Boler said.

“And they are reinforcing what we’ve told them about ourselves until we are a more rabid version of ourselves.”

What is an echo chamber?

An echo chamber is an environment in which a person encounters mostly beliefs and opinions that support and reinforce their own. These beliefs “echo” continuously, while alternative ideas are largely absent or ignored.

Facebook produces echo chambers in two ways: by using algorithms — coded instructions that automate the platform’s functions — to filter the content users see, and by relying on users’ tendency to interact with people whose opinions align with their own.

Facebook earns its revenue by selling ad space on the platform, and posts that capture users’ attention also draw their eyes to ads. So Facebook places content that will generate an emotional response in front of the users most likely to react.

“A really emotional story is going to be profitable for advertising because a lot of people look at it,” Boler said. “And those are eyeballs on ads.”

As Facebook gathers more detailed data about a user through their activity on the platform, as well as on third-party apps and websites linked to their Facebook account, it learns what kind of content is most likely to hold that user’s attention, which in turn generates more ad revenue.
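
Boler’s description suggests a simple feedback loop: score each candidate post by how likely this particular user is to react, then show the highest-scoring posts first. The sketch below is a minimal illustration of that kind of engagement-weighted ranking, not Facebook’s actual system; the post fields, the interest and affinity weights, and the scoring formula are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    predicted_reactions: float  # platform's estimate of likes/comments/shares

# Hypothetical per-user signals a platform might infer from past activity
user_topic_interest = {"politics": 0.9, "sports": 0.2, "cooking": 0.4}
user_friend_affinity = {"alice": 0.8, "bob": 0.1}

def engagement_score(post: Post) -> float:
    """Toy ranking: weight predicted reactions by how much this user already
    cares about the topic and the author. Posts that match existing interests
    and close ties float to the top, which is what reinforces the echo."""
    topic_weight = user_topic_interest.get(post.topic, 0.1)
    affinity = user_friend_affinity.get(post.author, 0.1)
    return post.predicted_reactions * topic_weight * (1 + affinity)

feed = [
    Post("alice", "politics", predicted_reactions=120),
    Post("bob", "cooking", predicted_reactions=150),
]

# Show the highest-scoring posts first
for post in sorted(feed, key=engagement_score, reverse=True):
    print(post.author, post.topic, round(engagement_score(post), 1))
```

In this toy version the politics post from a close friend outranks a more broadly popular cooking post, because the score multiplies predicted reactions by the user’s existing interests. That is the reinforcement dynamic Boler describes: the more the system knows about what already engages you, the more of the same it serves.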

One topic that reliably inflames passions is politics. Because a steady stream of partisan content nudges people toward one end of the political spectrum or the other, Boler said, Facebook can, and does, influence how people vote.

Users also create their own echo chambers by befriending people they share opinions with and blocking and “unfriending” people they don’t. The result is that users’ Facebook feeds show content that reinforces their beliefs rather than content that presents alternative viewpoints.

What can you do about it?

Boler believes Facebook users can weaken the impact of echo chambers on the platform in a few ways.

They can limit the amount of information they feed Facebook’s algorithms by adjusting their privacy settings to remove Facebook’s access to their data on third-party websites and apps, and they can close the app on their phone when not actively using it, to prevent it from gathering data in the background.

They can also make an effort not to ignore, “unfriend” or block Facebook friends whose perspectives don’t align with theirs.

Most importantly, she said, they can develop a healthy, diverse, independent media diet that includes “ideally five to 10 media sources, including publicly owned sources such as CBC, international sources like the Guardian and independent and corporate media.”

If we don’t take these steps, she said, our democratic system could suffer.

“People should care about this,” Boler said. “Because it’s very clear social media is eroding democracy and it is eroding our capacity to have conversations across the political aisle.”

https://www.toronto.com/news-story/10292506-how-social-media-echo-chambers-influence-your-emotions-and-your-political-compass/