Do filter bubbles radicalize us? Are NFTs a threat to state interests? Can the EU's new rules protect us? We explore the Internet's dangers and how the EU has responded to them.
Coverage from the „Átláthatóság, felelősség, szólásszabadság az internetes platformokon: az uniós Digital Services Act (DSA) jelentősége a nemzeti érdekek tükrében” (Transparency, accountability, and freedom of expression on internet platforms: the importance of the EU Digital Services Act (DSA) in the light of national interests) conference, which was held in Budapest, Hungary.
The Digital Services Act (DSA) was approved by the European Parliament on 5 July 2022, almost two years after the European Commission proposed it. The European Parliament and the Council of the European Union had reached a political agreement on the DSA on 23 April 2022, a month after their deal on its companion regulation, the Digital Markets Act (DMA), on 24 March 2022.
The DSA and its sibling, the DMA, are landmark pieces of legislation that could fundamentally change how digital services are delivered in the European Union. This article focuses on the DSA and, in particular, on the conference named above, which was organized by the Institute of the Information Society at the National University of Public Service (UPS) and the Mathias Corvinus Collegium.
What are European policymakers trying to protect us from?
The Digital Services Act (DSA) aims to create a safer digital space where users’ fundamental rights are protected. Zsolt Ződi, a senior research fellow at the National University of Public Service, stressed that user protection covers not only consumers but also small businesses. Attila Menyhárd, a Hungarian lawyer and professor at Eötvös Loránd University, argued that the core of the new proposal is not innovative: rather than introducing new ideas, it reinforces existing ones, the result of a compromise between the Member States. Menyhárd also pointed out that the future-oriented character of the legislation is questionable, as it is designed mainly for centralized systems and will not work with decentralized structures. “We send a message to the service provider if it is not doing something right, but what if there is no such central service provider?” the professor asked. Blockchain-based systems, such as NFTs, are examples of decentralized structures. Regarding NFTs, he specifically mentioned newer types that contain text or even a quote instead of an image or a video clip. There is a risk that such products may contain statements damaging to state interests, but the DSA does not cover these cases. Nevertheless, Menyhárd stressed that the DSA is one of the best legislative proposals in years.
The DSA primarily aims to protect users of online platforms like Facebook from the service providers’ harmful practices. An interesting new requirement obliges platforms to disclose the qualifications and language skills of their moderators.
The DSA also contains rules on General Terms and Conditions that are intended to protect children. It is high time to make terms and conditions easier for children to understand, as a large share of the platforms’ users are minors. Hardly anyone reads the General Terms and Conditions, and even those who do face a complicated, hard-to-understand text. The DSA’s solution is to have these documents rewritten in all EU languages in a way simple enough for even children to understand.
The new Act will protect us not only from social networks’ harmful practices but also from harmful content. Kinga Sorbán, a research fellow at the National University of Public Service, explained that the DSA should not be interpreted as a general law, since national legislation also has to be taken into account. It is therefore possible that some content will be illegal in one country while anyone can legally access it in another Member State. Sorbán believes the effectiveness of the DSA in this context is questionable, as VPNs will allow everyone to access controversial content. In addition, a new problem may arise: even more isolated filter bubbles and echo chambers could emerge on social media platforms, in this case on a territorial basis.
Are we becoming radicalized?
There is a lot of discussion about the negative effects of Facebook and similar social media platforms. Filter bubbles, also known as echo chambers, are a serious problem. According to researchers, the greatest danger of this phenomenon is that isolated groups in which no dissent is heard can make radicalism flourish. Echo chambers thus contribute strongly to the polarization of society.
According to János Tamás Papp, a lawyer at the National Media and Infocommunications Authority, democratic transparency is essential because a country needs “well-informed citizens who make reasoned, good decisions,” whether in a parliamentary election or a referendum. He pointed out that research on the effect of filter bubbles is contradictory: some studies suggest that this form of radicalization affects only 2-8% of users, while other papers find that filter bubbles’ impact is negligible and immeasurable. Interestingly, those who get their information from a single source are mainly informed by TV and other traditional media. So new media cannot be blamed alone for polarization: conservatives read conservative newspapers and watch conservative TV channels. Referring to echo chambers’ adverse effects, Papp added that “echo chambers actually allow in dissenting opinions, but they are mocked in them.” Under posts reflecting opposing views, the comment sections tend to be more heated, which may magnify the effect. Papp argues that researchers must reconsider their approach to filter bubbles’ effects.
The DSA tries to address the issue of filter bubbles: the mechanism of every ranking system on a platform (e.g., recommendations, the news feed) must be disclosed in a clear and comprehensible way. If there is more than one recommendation system, users must be given the option to choose between them. Users must also be offered recommendations without profiling, although this makes little sense unless we want to read content utterly irrelevant to us. The DSA places particular emphasis on transparency, especially regarding algorithms and political advertising. Transparency also means that researchers must be given wider access to platform data.
Papp said that, ultimately, we do not know precisely how the DSA will affect social media platforms’ operations, but more reliable sources will certainly come first in our feeds. The question, however, is who will actually decide which sources are reliable. The lawyer concluded his presentation with the thought that polarization is not caused by social media alone and cannot be solved by legislation.
Graphics by Réka Pisla | Hype&Hyper