The Digital Services Act (DSA) has the potential to be the "gold standard" globally and to inspire other countries to "move forward with new rules that secure our democracies," emphasized Ms Haugen. However, she warned that the rules must be strong in terms of transparency, supervision and enforcement. Otherwise, "we will lose this unique opportunity in our generation to shape the future of technology and democracy."
Protection of users' rights and increased accountability
Ms Haugen's revelations about Facebook's practices and their impact on users and their fundamental rights worried MEPs. They expressed concern, inter alia, about the mental health of children and adolescents and about micro-targeting, including for political purposes. Questions focused on how to make platforms more accountable and how to ensure that the risk assessment and risk mitigation provisions in the proposed Digital Services Act are strong enough, with the aim of preventing abuses and polarization and countering threats to democracy.
Members also asked Ms Haugen about her views on regulating not only illegal but also harmful content, about content moderation tools, and on whether targeted advertising should be banned. They wanted to know what safeguards she would like to see included in EU digital rules, and whether the package currently on the table is sufficient. Other issues raised during the hearing included tools to ensure the DSA has teeth, the transparency of algorithms, and giving academics, NGOs and investigative journalists access to data from the platforms concerned.
Disclosing data and making algorithms safer
In her responses, Ms Haugen underlined the importance of companies such as Facebook publicly disclosing their data and how it is collected, so that people can make informed decisions, and of banning "dark patterns" on the internet. She added that individuals in these companies, not committees, should be held personally accountable for their decisions.
On countering disinformation and demoting harmful content, Ms Haugen stressed that Facebook is much less transparent than other platforms and could do much more to make its algorithms safer. She praised legislators for their content-neutral approach, but warned about possible loopholes and exemptions for media organizations and trade secrets.
During her presentation, Ms Haugen also stressed how important it is for governments to protect technology whistleblowers, arguing that their testimony will be crucial in protecting people from the harm caused by digital technologies in the future.
The hearing was organized by the European Parliament's Committee on the Internal Market and Consumer Protection, in cooperation with the committees on Industry, Legal Affairs and Civil Liberties, and the special committees on disinformation and artificial intelligence.
Work is underway in Parliament to regulate platforms
The Committee on the Internal Market and Consumer Protection is currently discussing how the proposal for a Digital Services Act, presented by the European Commission in December 2020, should be changed and improved. Ms Haugen's presentation will feed into the committee's work on the DSA ahead of the vote. This legislation is Europe's chance to shape the digital economy at EU level, and it could become a global standard setter for digital regulation.
Background: Ms Haugen has released thousands of internal Facebook documents
Ms Frances Haugen is a former Facebook employee specializing in computer engineering and, in particular, in algorithmic product management. At Facebook, Ms Haugen worked as Lead Product Manager on the Civic Misinformation team, which examined electoral interference around the world and addressed issues related to democracy and disinformation. Facebook disbanded the team after the 2020 US election; shortly afterwards, Haugen contacted the Wall Street Journal.
Haugen says Facebook is deliberately not making its platforms safer
Haugen released thousands of internal documents she collected while working for Facebook. Among the most striking findings supported by the leaked documents is that Instagram use seriously harms adolescents' mental health, especially by exacerbating eating disorders and body image issues. The leaked documents also show how Facebook's public claims on a variety of topics often contradict its internal research. Overall, Ms Haugen argues that Facebook (which owns other widely used social media platforms such as Instagram) is deliberately not making these platforms safer for users because doing so would impact its bottom line.
Source: European Parliament