Major digital platforms such as Facebook and Google hold dominant positions in several areas. They play a crucial role in controlling which information and opinions reach billions of users and which do not. Through their opaque algorithms, they have the power to influence public debate, or even to sway the result of elections. Their dominance of the online advertising market has also brought them huge economic power. In addition, platforms act as gatekeepers between businesses and consumers for important digital services.

No wonder, then, that platforms have come to the center of regulatory attention. The earlier view that tech giants, as private companies, may do as they please now seems to have been superseded.

With the aim of correcting this perceived imbalance and creating a safer digital space, the EU proposed a new regulatory package in December 2020: the Digital Services Act (DSA) and the Digital Markets Act (DMA). The two planned regulations include, among other things, obligations to increase transparency and to improve interaction between platforms and users, and between platforms and authorities, in cases of complaints or illegal content.

The DSA puts forward measures that will oblige platforms to suspend their services to recipients who frequently share manifestly illegal content. It also requires platforms to provide reasons for their decisions and to report suspected criminal offenses to the competent local law enforcement authorities. The package is expected to be adopted in 2022.

Simultaneously, states are trying to counterbalance the platforms’ economic power with the help of copyright. Thanks to their size and the immense amount of user data they collect, platforms offer more effective advertising solutions than publishers do. This draws advertising money away from publishers, while the cost of content production does not burden the platforms.

Australia has recently adopted a News Media Bargaining Code that obliges publishers and platforms to reach an agreement within a year; otherwise, the state will determine how platforms must pay for the linking and sharing of articles.

The EU is trying to achieve a similar result with a different regulatory method: an amendment of the Copyright Directive, which EU member states are supposed to implement by this month. The amendment introduces a neighboring right for publishers, enabling them to authorize or prohibit the reproduction and making available to the public of their articles, and to receive remuneration when their articles are reused by, for example, search engines. In France, the first member state to transpose the directive, Google at first refused to pay for snippets. Then, in January 2021, Google announced that it would pay members of an association of French publishers for their articles.

Questions of free speech are always on the agenda. Some states adopt regulation to force platforms to take down content, whereas others intend to force platforms not to take down content. In the United States, platforms struggle with First Amendment issues.

Germany passed a controversial law in 2017 requiring social media companies such as Facebook or Twitter to remove illegal content, including hate speech, defamation, and incitement to violence, within 24 hours or face an initial fine of €5m, which could rise to €50m. France raised the stakes in 2020, ordering social media platforms to delete hateful and illegal content, such as racism, sexual discrimination, child pornography, and terrorism, within one hour, with potential fines capped at €1.25m.

Former US President Donald Trump and his media addiction caused a headache for many platforms. The attack on the US Capitol and Trump’s ambiguous behavior towards the attackers brought free speech questions into sharp focus. After the storming of the Capitol, several social media platforms (Facebook, Twitter, YouTube, Instagram, Snapchat, Twitch, and TikTok) suspended Trump’s accounts to prevent further violence and then de-platformed the president. Apple, Google, and Amazon banned the right-wing Twitter alternative Parler indefinitely (though it came back online last February and was allowed back into Apple’s App Store in May).

De-platforming worked, and extremist attitudes on social media diminished. But the public debate about possible censorship intensified. It revolves around the question of whether platforms are obliged to publish content within the limits of the First Amendment, or whether they can justify any decision to delete content as a business measure.

Meanwhile in Europe, afraid of losing publicity, Poland’s government has proposed a new law to stop social media platforms from deleting content or banning users who do not break Polish law. Last January, the Hungarian Minister of Justice promised a similar Facebook law. Yet the Polish draft bill seems to be off the agenda for now.

As a result of lobbying, the Hungarian Justice Minister has just announced that there will be no Facebook law, since the provisions of the DSA and the DMA will fulfill the Hungarian government’s requirements: “In Hungary, similarly to other European countries, we do not want to set any other expectations for large tech companies than a legal, transparent and controllable operation. Nothing more than what applies to other companies and small businesses too.”

At this point, it is impossible to tell whether these new regulations will achieve their desired goals, or to foresee how they will impact our lives. But one thing is certain: if we do not even try to control the economic power of platforms and their business dominance continues to grow, they may become dangerously powerful. At the same time, enhancing transparency around how tech companies moderate or promote content on their platforms will also be crucial.

Zsuzsa Detrekői is a TMT lawyer and the former general counsel of a major Hungarian online content provider. This article was first published by the Center for Media, Data and Society.

Photo: Marin Meyer (Unsplash)