On October 4, 2022, the Council of the EU gave its final approval to the Digital Services Act (DSA), which aims to limit the spread of illegal content online. The DSA has been labeled by some as a gold standard for content and platform governance in the EU. Under the DSA, online platforms will be required to promptly remove illegal content and dangerous goods, and they must have mechanisms in place that allow users to flag illegal online content.
Just as the General Data Protection Regulation (GDPR), which protects EU consumers, changed the worldwide rules for personal data privacy, the Digital Services Act is expected to set new rules for a more transparent and safe online environment, especially for Big Tech. It marks the end of industry self-regulation in the field and a new start for the future of the internet.
What is the Digital Services Act?
The Digital Services Act was adopted by the EU on October 4, 2022. The law becomes applicable on 17 February 2024.
The Digital Services Act is designed to regulate the business of so-called intermediary services, i.e., services that connect individuals to goods, services, or content online.
The DSA is the new EU law that will regulate how companies, including online platforms, search engines, cloud services, file-sharing services, social networks, online marketplaces, and every other significant provider of digital services, moderate and manage content. It covers illegal content, hate speech, disinformation, and the protection of children's rights.
Who does the law apply to?
The DSA applies to hosting services, marketplaces, search engines, cloud services, file-sharing services, social networks, and online platforms that offer services in the EU, regardless of their place of origin. In other words, the DSA protects individuals residing in the EU no matter where the service provider is established.
Moreover, the DSA singles out Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs): platforms and search engines with more than 45 million average monthly active users in the EU. In this way, the law rightly emphasizes Big Tech's influence over public discourse and the often manipulative effect these services have on people's behavior online.
All online platforms and search engines will have to report the number of average monthly active users of their services to determine whether they qualify as VLOPs or VLOSEs. Providers of VLOPs or VLOSEs will be subject to the obligations of the DSA four months after the European Commission designates them as such.
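The designation rule above is simple enough to sketch in code. Below is a minimal, hypothetical Python sketch of the 45-million-user threshold; the function and category names are illustrative only and not part of any official system (actual designation is made formally by the European Commission):

```python
# Illustrative sketch of the DSA size-designation rule described above.
# Names are hypothetical; real designation is a formal Commission decision.

VLOP_THRESHOLD = 45_000_000  # average monthly active users in the EU

def dsa_designation(service_type: str, eu_monthly_active_users: int) -> str:
    """Return the DSA size category for a platform or search engine."""
    if eu_monthly_active_users < VLOP_THRESHOLD:
        return "regular"
    # Above the threshold, search engines become VLOSEs, other platforms VLOPs.
    return "VLOSE" if service_type == "search_engine" else "VLOP"

print(dsa_designation("online_platform", 50_000_000))   # VLOP
print(dsa_designation("search_engine", 100_000_000))    # VLOSE
print(dsa_designation("online_platform", 2_000_000))    # regular
```

The reported user numbers, not the sketch above, are what the Commission uses; the point is only that the threshold itself is a single bright-line figure.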
What does this mean for individuals?
The DSA strengthens the following fundamental rights online:
- Greater algorithmic accountability. The European Commission and the Member States will have access to the algorithms of VLOPs. The law requires online platforms to disclose to vetted researchers, not-for-profit bodies, organizations, and associations the number of removal orders issued by national authorities and all notices about the presence of illegal content. The DSA also requires all online platforms to publicly report on how they use automated content moderation tools, their error rates, and how their content moderators are trained.
- Uniform response to illegal content online. The DSA defines unified criteria for so-called notice-and-action procedures, the system that determines when platforms should be held liable for the dissemination of illegal content, and its intermediary liability rules define how online platforms should act when they detect such content. Note that the DSA prohibits general content monitoring: it distinguishes between acting on knowledge of specific illegal content in order to remove it, and scanning everything to find any illegal content.
- Ban on dark patterns (deceptive design). The DSA seeks to ban manipulating users' choices through deceptive or confusing user interfaces that give visual prominence to a particular choice.
- Ban on targeting using sensitive data. The DSA sets strict rules for online advertising that exploits people's vulnerabilities. It bans advertising based on profiling that uses special categories of sensitive data, such as sexual orientation or political views.
- More user control over content. At the moment, individuals cannot understand how and why specific content is served to them. The DSA obliges all online platforms to disclose why people see some information more often than other information, and this explanation should be easily accessible via their terms of service. Importantly, people will have the right to choose a recommender system that is not based on profiling.
- More detailed moderation reporting. Companies are obliged to “release detailed biannual reports of their moderation efforts, including the number of staff, expertise, languages spoken and the use of artificial intelligence to remove illegal content”.
- Better reporting, dispute mechanisms, and compensation for individuals. Individuals will have better means to report harmful content and to appeal decisions made about the removal of their own content. Users of digital services will have a right to seek compensation for any damages or loss experienced due to infringements by platforms.
How will the Digital Services Act impact children?
Besides the general requirements, the DSA offers the following benefits for children:
- Clear recognition of children's rights. The DSA references the Convention on the Rights of the Child and the United Nations' General Comment No. 25 on children's rights in relation to the digital environment. A child here is anyone under the age of 18.
- Swift removal of illegal online content. This covers child sexual abuse material, terrorist content, illegal hate speech, and illegal products. Victims of online harassment will be better protected against the unlawful non-consensual sharing of private images, which must be removed immediately.
- A ban on advertising targeted at children. Targeting online ads based on sexual orientation, health information, religion, or political beliefs will be banned for all individuals, including children.
What does this mean for businesses?
Under the DSA, VLOPs and VLOSEs are required to:
- Conduct mandatory risk assessments;
- Deploy risk mitigation measures;
- Undergo independent audits each year;
- Publish their terms and conditions in the official languages of all the EU Member States in which they offer their services;
- Grant authorities access to data for the purposes of monitoring and assessing DSA compliance, and explain the design, logic, and testing of their algorithmic systems;
- Establish an independent compliance function that reports DSA compliance measures to senior management;
- Pay an annual supervisory fee to the European Commission for the costs associated with its oversight;
- Comply with certain actions required by the European Commission during a crisis, where the activities of their services could lead to serious threats to public security or public health.
The special due diligence obligations for VLOPs and VLOSEs acknowledge the systemic risks to fundamental human rights stemming from these services' systems and operations. However, the effectiveness of these measures will depend on future guidelines that are yet to be drafted by the Commission.
Companies will have to implement all of the above requirements concerning fundamental human rights online. Companies that fail to comply can be fined up to 6% of their worldwide annual turnover.
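For a sense of scale, the 6% cap works out as a one-line calculation. The sketch below is purely illustrative: actual fines are set case by case by regulators and the 6% figure is only the statutory maximum.

```python
def max_dsa_fine(worldwide_annual_turnover: float) -> float:
    """Upper bound on a DSA fine: 6% of worldwide annual turnover."""
    return 0.06 * worldwide_annual_turnover

# A company with EUR 10 billion in turnover faces fines of up to EUR 600 million.
print(max_dsa_fine(10_000_000_000))
```

Because the cap scales with global revenue rather than EU revenue, non-compliance is expensive even for companies with a modest EU footprint.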
Follow the CookieScript blog not to miss updates on the DSA and other news regarding online privacy, content, and platform governance.
Choose the CookieScript Consent Management Platform, and we will take care of your website's compliance with the GDPR and other privacy laws!
Frequently Asked Questions
What is the Digital Services Act (DSA)?
The Digital Services Act (DSA) is the new EU law that regulates how companies, including online platforms, search engines, cloud services, file-sharing services, social networks, online marketplaces, and every other significant provider of digital services, moderate and manage content, including illegal content, hate speech, disinformation, and the protection of children's rights. The law becomes applicable on 17 February 2024.
How will the Digital Services Act impact individuals?
The DSA brings individuals greater algorithmic accountability, a uniform response to illegal content online, a ban on dark patterns, a ban on targeted advertising based on sensitive data, more control over the content they see, more detailed moderation reporting, and better reporting, dispute, and compensation mechanisms.
Who does the Digital Services Act apply to?
The DSA applies to hosting services, marketplaces, search engines, cloud services, file-sharing services, social networks, and online platforms that offer services in the EU, regardless of their place of origin. Moreover, the DSA singles out Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs): platforms and search engines with more than 45 million average monthly active users in the EU.
How will the Digital Services Act impact businesses?
The law becomes applicable on 17 February 2024. Under the DSA, VLOPs and VLOSEs are required to conduct mandatory risk assessments, deploy risk mitigation measures, undergo independent audits each year, publish their terms and conditions in the official languages of all the EU Member States in which they offer their services, grant authorities access to data for the purposes of monitoring and assessing DSA compliance, pay an annual supervisory fee to the European Commission, and meet other requirements.
When will the Digital Services Act take effect?
The Digital Services Act was adopted by the EU on October 4, 2022, and becomes applicable on 17 February 2024. Follow the CookieScript blog not to miss updates on the DSA and other news regarding online privacy and content governance.