One week before the US presidential election, the EU is questioning the influence of social media on political opinion. In a report, the Commission examines how online technologies shape decision-making and political behavior.
While it is clear that search engines play a vital role in the political landscape, no one can determine how much responsibility algorithms bear in this process. It cannot be ruled out that algorithms, known for their inherent complexity, influence our preferences and perceptions. Moreover, information flows and automated recommendation systems are designed to “maximize user attention by anticipating our presumed preferences”, which can lead to the publication of “polarizing, misleading, extremist or otherwise problematic content”, the report acknowledges.
Recent research in the United States has shown that even email is not immune to the political effects of algorithmic sorting. Gmail automatically sorts incoming messages into different inboxes. More troubling still, the sorting differed significantly among political candidates: 63% of one candidate's emails landed in the primary inbox, compared to 0% for many others (including Joe Biden and Elizabeth Warren).
The Commission says the evidence for the existence of filter bubbles (i.e. the algorithmic separation of users' information content) is “ambivalent”, even though there are “serious and legitimate concerns” about echo chambers (i.e. groups formed by users' self-selection of content), it continues.
These echo-chamber and filter-bubble phenomena on platforms are driving “increasing political polarization and radicalization”. YouTube's recommendation system, for instance, tends to offer viewers content that becomes “more extreme every step of the way”, the report notes. For example: “Users who viewed videos of Donald Trump during the 2016 presidential campaign were subsequently shown videos featuring white supremacists and Holocaust deniers. After showing videos of Bernie Sanders, YouTube suggested videos relating to leftist conspiracies, such as the claim that the US government was behind the 9/11 attacks.” A recent preregistered study of YouTube's recommendation system confirmed that it was likely to promote and amplify conspiratorial content even in response to relatively harmless search terms.
The report further argues that, because of the complex interactions between algorithmic and human behavior, attributing responsibility is difficult. What remains clear to the Commission is that algorithms do nothing to combat online polarization.