
The Digital Services Act and the Digital Markets Act - a coordinated response to the constant evolution of the digital market
E-commerce has been growing exponentially in recent years. According to the United States International Trade Commission (USITC, 2017), global e-commerce grew from US$19.3 trillion in 2012 to US$27.7 trillion in 2016 and reached US$29 trillion in 2017.
For France alone, internet sales in 2019 exceeded 103.4 billion euros (according to a study presented on 5 February 2020 by Fevad to the Ministry of the Economy and Finance). The COVID-19 health crisis has only amplified this phenomenon and has shown the importance of digital technologies in all aspects of modern life.
It is therefore no coincidence that the digital domain is the subject of numerous debates in the Member States of the European Union, which wish to provide a harmonised response to the constant evolution of the digital market.
In light of this, on 15 December 2020, the European Commission presented the Digital Services Act and the Digital Markets Act, two texts designed to supervise the actors of the digital economy through principles of transparency and the protection of users' fundamental rights.
Currently going through the ordinary legislative procedure, the two texts were the subject of an opinion of the European Data Protection Supervisor on 10 February 2021, which welcomed the framing of the digital environment for European users while recalling the importance of respecting fundamental principles such as the rule of law and of giving users better control over the data they generate.
The Digital Services Act - European framework for online services (Part 1)
The Digital Services Act (DSA) builds on the E-Commerce Directive of 8 June 2000, which has become obsolete in view of the evolution of the digital market.
WHAT IS ITS SCOPE?
Following the legal framework established by the E-Commerce Directive of 2000(1), this draft regulation will apply both to providers established in the European Single Market and to online providers located outside the EU that offer their services within the Single Market.
It broadly encompasses all digital services that connect consumers to digital content, as well as to goods and services.
The draft distinguishes four categories of actors(2) (digital providers) that will be subject to new obligations:
- intermediary services offering network infrastructure such as ISPs, domain name registrars;
- hosting services;
- online platforms bringing together sellers and consumers such as marketplaces, app stores, collaborative economy platforms, social networks etc.;
- very large online platforms that present particular risks of illegal content distribution and societal harm.
WHAT ARE THE GENERAL OBLIGATIONS FOR ALL PROVIDERS?
The creation of a single contact point for communication between providers and the various authorities (Articles 10 and 11 of the Regulation)
The Regulation provides for formal cooperation between digital providers and the authorities to respond to any request concerning the implementation of the Digital Services Act.
In this respect, Article 10 of the draft requires digital providers to put in place a means of communication with the authorities responsible for overseeing the framework(3).
The details of this "direct communication" must be made public in order to facilitate its identification and, therefore, its use.
Article 11 supplements this provision for digital providers that are not established in the EU but whose services are directed in particular at European users: a legal representative, whether a natural or legal person, will have to be appointed so that the authorities can address to it any request for information they may have.
Access to information about the legal representative must also be facilitated, as is the case for the newly created contact point for digital providers established in the EU. It should be noted that the Regulation introduces a dual liability in case of non-compliance: the legal representative may be held liable for any breach of the obligations of the provider it represents, without this excluding the provider's own liability.
A reinforced transparency obligation (Articles 12 and 13 of the Regulation)
The methods used by digital providers to moderate digital content (the various means used that lead to restrictions on access to certain content)(4) must be explicitly set out in their general terms and conditions.
Such restrictions must be based on the principles of objectivity and proportionality in order to ensure that the fundamental rights of users are respected(5).
Article 13 logically requires digital providers to publish a clear, easily understandable and detailed annual report to users on the moderation actions taken.
It details the information that must be included in the report, including all injunctions issued by the competent authorities of each Member State concerning the moderation of illegal content, classified by type, as well as the time the provider needed to comply with each injunction. Moderation actions taken by providers on their own initiative must also be mentioned, as well as the handling of complaints submitted by users.
It should be noted that this obligation does not apply to micro-enterprises and small enterprises in the European sense (as defined in Recommendation 2003/361/EC: entities with fewer than 10 employees and a turnover or balance sheet total not exceeding EUR 2 million, and entities with fewer than 50 employees and a turnover or balance sheet total not exceeding EUR 10 million, respectively). Providers "with a certain economic impact" are therefore clearly targeted by this provision.
WHAT ARE THE OBLIGATIONS OF HOSTING PROVIDERS AND ONLINE PLATFORMS?
A notification mechanism giving the user a central role in moderating illegal content
The regulation of illegal content is at the heart of the provisions for content hosts and online platforms.
Under Article 14 §1, all users must have easy access to a mechanism for notifying content they deem illegal. The notification must contain "complete and accurate" information allowing rapid identification of the content in question (URL), the reasons for the notification, the identity of the user and his or her e-mail address, as well as a declaration of good faith.
The Commission thus places the onus on the user to identify and disclose illegal content.
The procedure is far from insignificant: the notification issued by the user will be deemed to confer knowledge of manifestly unlawful content, raising a real liability issue for hosts and online platforms. If the text is adopted as it stands, they will therefore be obliged to respond to users' notifications, at the risk of being held liable for the continued availability of illegal content despite the warning received.
An obligation on hosting providers and platforms to inform users of the reasons for moderation decisions
This information obligation, set out in Article 15 of the draft regulation, comes into play after the fact, once the platform or host has moderated content that is illegal or incompatible with its general terms and conditions: if the content is no longer accessible to users because it has been removed, the digital provider will have to inform the person concerned by the decision (i.e. the author of the illegal content or of the content deemed incompatible with the general terms and conditions) of the reasons for the moderation.
These reasons shall include the facts, the legal basis, if any, the means of redress, as well as the scope of the moderation action, while satisfying the requirement of clarity and comprehensibility of the information provided.
A public database will have to record all decisions taken by hosting providers and online platforms. Surprisingly, this database will be managed not by the provider in question but by the Commission, with the proviso that "personal" information cannot be included in it.
WHAT ADDITIONAL OBLIGATIONS DO ONLINE PLATFORMS HAVE?
Section III of Chapter III of the draft, comprising Articles 16 to 24, applies only to online platforms that do not qualify as micro or small enterprises within the meaning of Recommendation 2003/361/EC. Hosting providers that are not online platforms are likewise excluded.
Thus, in addition to the above-mentioned notification mechanism, online platforms will also have to set up an internal complaint handling system. It is also specified that this system must be available free of charge.
This system will allow users to challenge decisions to remove or restrict access to content deemed illegal or "incompatible with the general terms and conditions" of the platform.
Illegal content may take several forms: it may consist of information, of the provision of a product or service, or even of a user account.
The DSA also lays down guiding principles that the system must take into account(6). The system must be accessible to all and, in this respect, easy to use. Furthermore, the user's complaint must be dealt with "within a reasonable time" and in an objective manner. The platform must also inform the user by any means of the decision taken and remind him or her of the existence of an alternative dispute resolution method, through an out-of-court body designated for this purpose(7).
This provision is complemented by Article 19, which provides for the creation of "trusted flaggers". Authorised(8) by the digital services coordinator of the competent Member State, a trusted flagger will receive users' notifications of illegal content.
Approval may be withdrawn following an investigation by the Member State's coordinator. The Commission therefore leaves it to the Member States to ensure that complaints are dealt with on their own territory.
Article 20 of the draft addresses the risk of abuse by users, both in their use of the platform and in their recourse to the complaints system(9).
First, the platform may restrict access to its services for any user providing illegal content, provided that it has given prior notice and only for a "reasonable" period of time, which logically calls for an "in concreto" assessment.
In order not to jeopardise the effectiveness of the system, access to the complaints system may also be suspended, in particular because of a large number of unfounded complaints. Here again, an "in concreto" assessment will necessarily be required.
In this respect, the platform's policy on this matter should be clearly set out in its general terms and conditions, determining in particular what may constitute abuse and the duration of the suspension (§ 4).
Article 23 also complements the information obligation contained in Article 13, in that it obliges online platforms to provide a set of additional information on the actions taken in response to users' complaints(10). The Member State's digital services coordinator will be provided with all of this information on request.
This Section III is then completed by Article 24, again concerning the obligation of transparency, but this time with regard to advertising. Each user must be able to identify advertising content displayed on an online platform, in particular through an express indication of its advertising nature, of the person on whose behalf the advertisement is displayed, and of the parameters used to determine its recipients. Naturally, this identification of the advertising nature of a piece of content must satisfy the principles of clarity and comprehensibility of the information communicated to users.
ADDITIONAL OBLIGATIONS SPECIFIC TO "VERY LARGE PLATFORMS"
The draft defines a "very large platform" as any entity providing services to at least 45 million European users on a monthly basis. It is specified that this is an average indicator, which may be adjusted so that it continues to reflect a certain percentage of the European population(11).
The digital services coordinator of each Member State will again have a role to play in the qualification of a "very large platform": it will be responsible for verifying that the criterion on the number of recipients (users) has been met by the platforms under its jurisdiction. This verification will be carried out at least every six months(12).
Several obligations are placed on these platforms: they will have to present an annual analysis of any "systemic risk" resulting from the use and operation of their services.
The Commission has clarified the concept of "systemic risk": it should be remembered that the DSA is part of a logic of transparency and protection of users' fundamental rights. Thus, "systemic risks" include possible negative effects on freedom of expression and information, privacy and family life, as well as "fake news" defined as "intentional manipulation (...) with actual or foreseeable negative effect on the protection of public health, minors, civic discourse (...) or related to the electoral process and public security"(13).
In this respect, the "very large platforms" will have to take all measures to address the "systemic risk". They will also have to cooperate with the competent authorities.
The "very large platforms" will also be obliged to publish a report on their moderation system every 6 months. As for the impact study on the "systemic risk" generated by their use, the audits and results of the latter, they will have to be made available to the public, but also communicated annually to the Commission and the coordinator. However, the public nature of these documents will necessarily be mitigated by the requirement for confidentiality of certain information.
Finally, and no doubt inspired by the COVID-19 health crisis, Article 37 of the Regulation provides for crisis protocols for extraordinary circumstances affecting public security or public health. The Commission and the "very large platforms" will thus be able to draw up a joint action plan for providing information on the crisis situation.
WHAT ARE THE SANCTIONS?
In the event of non-compliance with the obligations imposed by the Regulation, the Commission leaves it to the Member States to determine the system of penalties ("effective, proportionate and dissuasive") applicable and to "take all necessary measures" to ensure its implementation.
However, the Commission provides a framework for these penalties by stating that:
- the maximum amount of penalties imposed for failure to comply with the established obligations may not exceed 6% of the annual income or turnover of the provider concerned;
- the maximum amount of penalties imposed for the supply of incorrect, incomplete or misleading information, for failure to reply to or rectify such information, or for failure to submit to an on-site inspection may not exceed 1% of the annual income or turnover of the provider concerned;
- the maximum amount of a periodic penalty payment may not exceed 5% of the average daily turnover of the provider of intermediary services concerned in the preceding financial year per day, calculated from the date specified in the decision concerned.
By: Lora Shalganova
(1) Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 (OJ L 178, 17.7.2000, p. 1);
(2) In addition to intermediary service providers, the Regulation creates two new categories of operators: platforms and very large platforms;
(3) The Commission, the Member States and the European Digital Services Committee (a new body created by the Regulation);
(4) Art. 12 of the Regulation;
(5) Such as freedom of expression, the right to an effective remedy, non-discrimination, children's rights, protection of personal data, online privacy, etc.;
(6) Art. 17 of the Regulation;
(7) Art. 18 of the Regulation;
(8) The conditions to be met by a "trusted flagger" are listed in Article 19 of the Regulation;
(9) Art. 20 of the Regulation;
(10) The number of cases submitted to alternative dispute resolution and the number of unfounded complaints, as well as the number of restrictions imposed;
(11) Art. 25 of the Regulation;
(12) In this respect, the digital services coordinator is "assisted" by the platforms, which have an obligation under Article 23 to publish, at least once every six months, information on the average monthly number of active service users in each Member State, calculated as an average over the last six months;
(13) Art. 26 §1 c) of the Regulation.