While members of Congress seem reluctant to legislate against extremism and disinformation on the internet, Mark Zuckerberg has offered his own contribution. The Facebook CEO appeared before a House of Representatives committee on Thursday, where he suggested ways to amend Section 230 of the US Communications Decency Act.
Platforms accused of not controlling fake news
“The principles of Section 230 are as relevant today as they were in 1996, but the internet has changed dramatically,” said Mark Zuckerberg in his testimony before a subcommittee of the House Energy and Commerce Committee. He appeared before the subcommittee along with Alphabet CEO Sundar Pichai and Twitter CEO Jack Dorsey.
Right from the start, lawmakers expressed their anger at the social media leaders, who have failed to curb disinformation on their platforms. They notably denounced content spreading false information about Covid-19 vaccines, as well as the false information and hate messages that led to the attempted insurrection at the Capitol in January.
“You have the means [to stop disinformation], but every time, you choose engagement and profit” over a healthy civic discourse or public health and safety, charged the chairman of the Subcommittee on Communications and Technology, Mike Doyle (D-PA). “We will legislate to end this situation.”
Putting safeguards in place against illegal content
Lawmakers have been discussing changes to Section 230 of the Communications Decency Act, part of the Telecommunications Act of 1996, for some time. The provision shields online platforms from liability for content published by third parties.
Mark Zuckerberg suggested changing the law in a manner broadly consistent with existing Facebook practices: “We believe Congress should consider making platforms' intermediary liability protection for certain types of unlawful content conditional on companies' ability to meet best practices to combat the spread of this content,” he argued. “Instead of being granted immunity, platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it. Platforms should not be held liable if a particular piece of content evades their detection – that would be impractical for platforms with billions of posts per day – but they should be required to have adequate systems in place to address unlawful content.”
The definition of an adequate system could be proportionate to the size of the platform and set by a third party, the Facebook CEO suggested. As for “best practices,” he added, they should not include unrelated issues such as encryption or privacy, which deserve a full debate of their own.
The algorithmic choice
Sundar Pichai, meanwhile, argued that “recent proposals to change Section 230 […] would have unintended consequences – harming both free expression and the ability of platforms to take responsible action to protect users in the face of constantly evolving challenges.”
Instead, he added, the industry should focus on “processes for addressing harmful content and behavior. Solutions might include developing content policies that are clear and accessible, notifying people when their content is removed and giving them ways to appeal content decisions, and sharing how systems designed for addressing harmful content are working over time.”
Jack Dorsey, for his part, did not address Section 230 in his testimony. He preferred to propose principles that social platforms could adopt, such as “algorithmic choice.”
“We believe that people should have transparency or meaningful control over the algorithms that affect them,” he said. “We recognize that we can do more to provide algorithmic transparency, fair machine learning, and controls that empower people.”