Like-minded sources on Facebook are prevalent but not polarizing.
Brendan Nyhan, Jaime Settle, Emily Thorson, Magdalena Wojcieszak, Pablo Barberá, Annie Y. Chen, Hunt Allcott, Taylor Brown, Adriana Crespo-Tenorio, Drew Dimmery, Deen Freelon, Matthew Gentzkow, Sandra Gonzalez-Bailon, Andrew M. Guess, Edward Kennedy, Young Mie Kim, David Lazer, Neil Malhotra, Devra Moehler, Jennifer Pan, Daniel Robert Thomas, Rebekah Tromble, Carlos Velasco Rivera, Arjun Wilkins, Beixian Xiong, Chad Kiewiet de Jonge, Annie Franco, Winter Mason, Natalie Jomini Stroud, Joshua A. Tucker
Published in: Nature (2023)
Many critics raise concerns about the prevalence of 'echo chambers' on social media and their potential role in increasing political polarization. However, the lack of available data and the challenges of conducting large-scale field experiments have made it difficult to assess the scope of the problem[1,2]. Here we present data from 2020 for the entire population of active adult Facebook users in the USA showing that content from 'like-minded' sources constitutes the majority of what people see on the platform, although political information and news represent only a small fraction of these exposures. To evaluate a potential response to concerns about the effects of echo chambers, we conducted a multi-wave field experiment on Facebook among 23,377 users for whom we reduced exposure to content from like-minded sources during the 2020 US presidential election by about one-third. We found that the intervention increased their exposure to content from cross-cutting sources and decreased exposure to uncivil language, but had no measurable effects on eight preregistered attitudinal measures such as affective polarization, ideological extremity, candidate evaluations and belief in false claims. These precisely estimated results suggest that although exposure to content from like-minded sources on social media is common, reducing its prevalence during the 2020 US presidential election did not correspondingly reduce polarization in beliefs or attitudes.