A few weeks on X and you become right-wing

Social network feeds promise to be a faithful mirror of our interests, a window on the world that filters information according to our tastes and nothing more. If that were not the case, they would quietly turn into propaganda tools. And sadly, that appears to be exactly what is happening. The confirmation comes from a study published in Nature, which investigated how X's algorithmic "for you" feed influences users' political opinions.

The study

To reach this conclusion, the researchers, including Bocconi economist Germain Gauthier, conducted a rigorous experiment on almost five thousand US users, monitoring them for seven weeks during 2023. The team divided the participants into two distinct groups: the first used X's algorithmic "for you" feed, while the second used only the chronological feed of the accounts they follow.

The researchers then analyzed how participants' opinions had changed by the end of the experiment, focusing on users who had had to change their usual feed habits.

A push to the right

The results were clear. By the end of the experiment, those exposed to the algorithm's choices for the first time were on average 4.7 percent more likely (compared with the start) to prioritize the Republican Party's demands, to consider the judicial investigations into Donald Trump unacceptable, and to endorse positions close to Russia's on the war in Ukraine. Switching to the feed of followed profiles, or continuing to use the app as before the experiment, produced no change in participants' political opinions.

In practice, the "for you" feed steers us toward accounts we would probably never have searched for or followed on our own, creating a vicious circle that tends to permanently reshape our opinions.

How the algorithm works

By studying the content the algorithm proposed, the study's authors identified the mechanisms by which it promotes conservative positions: it gives slight but measurable priority to content expressing right-wing opinions, and it penalizes content from newspapers and other traditional media in favor of posts by political activists and anonymous users.

We are therefore faced with a technology that, far from being neutral, acts as a powerful and opaque editorial force, capable of shaping the population's opinions subtly and discreetly, but with tangible and lasting effects. The solution? We should probably demand transparency from the platforms about how their algorithms work, so that they can be subjected to scrutiny by the public, experts and institutions. This is what Europe is trying to do, after all, with its Digital Services Act (the DSA) and with the European regulation on AI. But given the response so far from companies like X, it is unlikely to be an easy road.