
Potential EU law sparks global concerns over end-to-end encryption for messaging apps 

By: djohnson
6 October 2025 at 14:25

Tech experts and companies offering encrypted messaging services are warning that a pending European regulation, which would grant governments broad authority to scan messages and content on personal devices for criminal activity, could spell “the end” of privacy in Europe.

The European Union will vote Oct. 14 on a legislative proposal from the Danish Presidency known as Chat Control — a law that would require mass scanning of user devices for abusive or illegal material. Over the weekend, Signal warned that Germany — a longtime opponent and bulwark against the proposal — may now move to vote in favor, giving the measure the support needed to pass into law.

On Monday, Signal CEO Meredith Whittaker warned that her company, which provides end-to-end encrypted communications services, could exit the European market entirely if the proposal is adopted.

“This could end private comms – [and] Signal – in the EU,” Whittaker wrote on Bluesky. “Time’s short and they’re counting on obscurity: please let German politicians know how horrifying their reversal would be.”

According to data privacy experts, Chat Control would require access to the contents of apps like Signal, Telegram, WhatsApp, Threema and others before messages are encrypted. While ostensibly aimed at criminal activity, experts say such features would also undermine the integrity of all other users’ encrypted communications, including those of journalists, human rights activists, political dissidents, domestic abuse survivors and other victims who rely on the technology for legitimate purposes.

The pending EU vote is the latest chapter in a decades-long battle between governments and digital privacy proponents about whether, and how, law enforcement should be granted access to encrypted communications in criminal or national security cases. 

Supporters point to the increasing use of encrypted communications by criminal organizations, child traffickers, and terrorist groups, arguing that unrestricted encryption impedes law enforcement investigations, and that some means of “lawful access” to that information is technically feasible without imperiling privacy writ large.

Privacy experts have long argued that there are no technically feasible ways to provide such services without creating a backdoor that could be abused by other bad actors, including foreign governments.

Whittaker reportedly told the German Press Agency that “given a choice between building a surveillance machine into Signal or leaving the market, we would leave the market,” while calling repeated claims from governments that such features could be implemented without weakening encryption “magical thinking that assumes you can create a backdoor that only the good guys can access.”

The Chaos Computer Club, an association of more than 7,000 European hackers, has also opposed the measure, saying its efforts to reach out to Germany’s Home Office, Justice Department and Digital Minister Karsten Wildberger for clarity on the country’s position ahead of the Chat Control vote have been met with “silence” and “stonewalling.”

The association and U.S.-based privacy groups like the Electronic Frontier Foundation have argued that the client-side scanning technology that the EU would implement is error-prone and “invasive.”

“If the government has access to one of the ‘ends’ of an end-to-end encrypted communication, that communication is no longer safe and secure,” wrote EFF’s Thorin Klosowski.
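To make the technical objection concrete, below is a minimal, purely illustrative sketch (in Python) of how client-side scanning interacts with end-to-end encryption: content is checked against a blocklist on the sender’s device before it is ever encrypted, so the scanning component sees plaintext no matter how strong the transport encryption is. The function names and the hash database are hypothetical; real proposals contemplate perceptual hashing or AI classifiers rather than the exact cryptographic hash used here for brevity, which is one reason experts describe the approach as error-prone.

```python
import hashlib

# Hypothetical blocklist of SHA-256 hashes of known prohibited images, assumed
# to have been pushed to the device. Real systems would use perceptual hashes
# (robust to resizing and re-encoding) or classifiers, a source of false positives.
KNOWN_HASH_DATABASE = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def matches_blocklist(attachment: bytes) -> bool:
    """Check the plaintext attachment against the blocklist on the device,
    before any end-to-end encryption is applied."""
    return hashlib.sha256(attachment).hexdigest() in KNOWN_HASH_DATABASE


def send_attachment(attachment: bytes, encrypt_and_send, report_match) -> None:
    """Illustrative send path: the scanner inspects plaintext even though the
    transport itself remains end-to-end encrypted."""
    if matches_blocklist(attachment):
        # In a client-side scanning regime, a match (or the content itself)
        # would be reported to a third party instead of, or before, delivery.
        report_match(attachment)
        return
    encrypt_and_send(attachment)  # normal E2EE delivery path


if __name__ == "__main__":
    send_attachment(
        b"holiday photo bytes",
        encrypt_and_send=lambda data: print("encrypted and sent"),
        report_match=lambda data: print("match reported before encryption"),
    )
```

The point of the sketch is simply that the check happens before encryption on one of the “ends,” which is why privacy advocates argue the communication is no longer end-to-end private once such a component is mandated.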

Beyond the damage Chat Control could cause to privacy, the Chaos Computer Club worried that its adoption by the EU might embolden other countries to pursue similar rules, threatening encryption worldwide.

“If such a law on chat control is introduced, we will not only pay with the loss of our privacy,” Elina Eickstädt, spokesperson for the Chaos Computer Club, said in a statement. “We will also open the floodgates to attacks on secure communications infrastructure.”

The Danish proposal leaves open the potential to use AI technologies to scan user content, calling for such technologies “to be vetted with regard to their effectiveness, their impact on fundamental rights and risks to cybersecurity.”

Because Chat Control is publicly focused on curtailing child sexual abuse material (CSAM), the initial scanning will target both known and newly identified CSAM, focusing on images and internet links. For now, text and audio content, as well as scanning for evidence of grooming — a more difficult crime to define — are excluded.

Still, the Danish proposal specifies that scanning for grooming is “subject to … possible inclusion in the future through a review clause,” which would likely require even more intrusive monitoring of text, audio and video conversations. 

It also calls for “specific safeguards applying to technologies for detection in services using end-to-end encryption” but does not specify what those safeguards would be or how they would surmount the technical challenges laid out by digital privacy experts.


Data Act FAQs – Key Takeaways for Manufacturers and Data Holders

On 3 February 2025, the European Commission (“EC”) published an updated version of its frequently asked questions (“FAQs”) on the EU Data Act.[1]  The Data Act, which is intended to make data more accessible to users of IoT devices in the EU, entered into force on 11 January 2024 and will become generally applicable as of 12 September 2025.

The FAQs, first published in September 2024, address the key concepts of “connected product” and “related service.” The latest iteration of the FAQs contains incremental updates which provide greater insight into how the EC believes that manufacturers and data holders should interpret their obligations under the Data Act.

Key Takeaways for Manufacturers and Data Holders

  1. “Connected Products” includes various smart devices, including smartphones and TVs.[2]  The FAQs acknowledge the broad definition of connected products under the Data Act and provide examples of devices that would fall under this category. In particular, despite ambiguity created by previous iterations of the Data Act, the EC has confirmed its view in the FAQs that devices such as smartphones, smart home devices and TVs are in-scope as connected products.
  2. Two conditions must be satisfied for a digital service to constitute a “Related Service.”[3]  It is expressly noted that the following conditions must be satisfied for a digital service to be a related service: (a) there must be a two-way exchange of data between the connected product and the service provider, and (b) the service must affect the connected product’s functions, behaviour, or operation. The FAQs also provide several factors that could help businesses determine whether a digital service is a related service, including user expectations for that product category, replaceability of the digital service, and pre-installation of the digital service on the connected product. Although these factors are not determinative, they may provide helpful guidance to businesses assessing whether their services fall within this definition (for example, if the service can easily be replaced by a third-party alternative, it may not meet the threshold of a related service). Ultimately, the EC has noted that practice and courts’ interpretations will play an essential role in further delineating if a digital service is a related service – so time will tell.
  3. Manufacturers have some discretion as to whether data will be directly or indirectly accessible.[4]  Importantly, the FAQs suggest that manufacturers/providers have a significant degree of discretion whether or not to design or redesign their connected products or related services to provide direct access to data. The FAQs list out certain criteria which can be taken into account when determining whether to design for direct access[5] or indirect access.[6] In this respect, the FAQs note that the wording of Article 3(1) (access by design) leaves flexibility as to whether design changes need to be implemented and it is acknowledged that data holders may prefer to offer indirect access to the data. It is also noted that the manufacturer may implement a solution that “works best for them” and consider, as part of its assessment, whether direct access is technically possible, the costs of potential technical modifications, and the difficulty of protecting trade secrets or intellectual property or of ensuring the connected product’s security.
  4. Readily available data without disproportionate effort.[7]  The FAQs confirm the position that readily available data is “product data and related service data that a data holder can obtain without disproportionate effort going beyond a simple operation.”  The EC provided some further clarity by highlighting that only data generated or collected after the entry into application of the Data Act (i.e., after 12 September 2025) should be considered “readily available data” as the definition does not include a reference to the time of their generation or collection. However, the FAQs do not provide further clarity on what would constitute “disproportionate effort” – arguably leaving businesses with further discretion to interpret this in the context of their products and services.
  5. Data made available under the Data Act should be ‘easily usable and understandable’ by users and third parties.[8]  The FAQs expressly note that data holders are required to share data of the same quality as they make available to themselves, to facilitate the use of the data across the data economy. This indicates that raw and pre-processed data may require some additional investment to be usable. However, the FAQs make clear that there is no requirement for data holders to make substantial investments into such processes. Indeed, it may be the case that where the level of investment into processing the data is substantial, the Chapter II obligations may not apply to that data.
  6. Data generated outside of the EU may be subject to the Data Act.[9]  The EC’s position is that when a connected product is placed on the market in the EU, all the data generated by that connected product both inside and outside the EU will be subject to the Data Act. For example, if a user purchases a smart appliance in the EU and subsequently takes it to the US with them on vacation, any data generated by the use of the appliance in the US would also fall within the scope of the Data Act.
  7. Manufacturers will not be data holders if they do not control access to the data.[10]  It is explained in the FAQs that determining who is the data holder depends on who “controls access to the readily available data”. In particular, the FAQs acknowledge that manufacturers may contract out the role of “data holder” to a third party for all or part of their connected products. This seems to suggest that where the manufacturer does not control access to the readily available data, it will not be a data holder. In addition, a related service provider that is not the manufacturer of the connected product may also be a data holder if it controls access to readily available data that is generated by the related service it provides to the user. The FAQs further confirm that there may be instances where there is no data holder, i.e., in the case of direct access, where only the user has access to data stored directly on the connected product without the involvement of the manufacturer.
  8. Data holders can use non-personal data for any purpose agreed with the user (subject to limited exceptions).[11]  The FAQs reaffirm the position that a data holder can use the non-personal data generated by the user for any purpose, provided that this is agreed with the user.[12]  Furthermore, the data holder must not derive from such data any insights about the economic situation, assets and production methods of the user in any other manner that could undermine the commercial position of the user. Where data generated by the user includes personal data, data holders should ensure any use of such data is in compliance with the EU GDPR. To ensure compliance with the GDPR, data holders may apply privacy-enhancing technologies (“PETs”); however, the EC’s view is that applying PETs does not necessarily mean that the resulting data will be considered ‘derived’ or ‘inferred’ such that they would fall out-of-scope of the Data Act.
  9. Users may be able to request access to data from previous users of their connected product.[13]  The FAQs note that the Data Act “can be read as giving users the right to access and port readily available data generated by the use of connected objects, including data generated by other users before them.” Subsequent users may therefore have a legitimate interest in such data, for example, in respect of updates or incidents. However, the rights of previous users and other applicable law (e.g., the right to be forgotten under the EU GDPR) must be respected. Moreover, data holders are able to delete certain historical data after a reasonable retention period.[14] 

Although the initial set of FAQs, and the subsequent incremental updates, provide further guidance for businesses whose products or services may fall in scope of the Data Act, there are still areas of uncertainty that are yet to be addressed. As the FAQs are a “living document”, they may continue to be updated as and when the EC deems it necessary. It is also important to note that while the FAQs provide some useful guidance on Data Act interpretation, the Data Act is subject to supplemental domestic implementation and enforcement by national competent authorities of EU member states. Businesses should therefore pay careful attention to guidance published by national authorities in the member states and sectoral areas in which they operate.


[1] See https://digital-strategy.ec.europa.eu/en/library/commission-publishes-frequently-asked-questions-about-data-act.

[2] See Question 7 of the FAQs.

[3] See Question 10 of the FAQs.

[4] See Questions 17 and 22 of the FAQs.

[5] I.e., ‘where relevant and technically feasible’ the user has the technical means to access, stream or download the data without the involvement of the data holder. For further information, see Article 3(1) of the Data Act.

[6] I.e., the connected product or related service is designed in such a way that the user must ask the data holder for access. For further information, see Article 4(1) of the Data Act.

[7] See Question 4 of the FAQs.

[8] See Question 5 of the FAQs.

[9] See Question 9 of the FAQs.

[10] See Question 21 of the FAQs.

[11] See Question 29 of the FAQs and Question 13 of the FAQs.

[12] See also Article 4(13) of the Data Act.

[13] See Question 33 of the FAQs.

[14] See Recital 24 of the Data Act.

Cybersecurity Law Enters Into Force

On July 17, 2024, Law No. 90/2024 containing provisions for strengthening national cybersecurity and addressing cybercrime (the “Cybersecurity Law”) entered into force.

The new legislation strengthens national cybersecurity, at a time when cyber-attacks have increased significantly.[1]

The Cybersecurity Law:

  1. seeks to strengthen the resilience of (a) public administrations, (b) operators that are subject to the application of the Italian National Cybersecurity Perimeter (“Perimeter”) legislation, (c) operators of essential services and providers of digital services, as defined in Italian Legislative Decree No. 65/2018, which implements EU Directive 2016/1148 on security of network and information systems (the first NIS Directive) (“NIS 1 Operators”), and (d) operators providing public communications networks or publicly accessible electronic communications services (“Telecommunication Operators”), by establishing detailed rules on public procurement of IT goods and services that are essential for the protection of national strategic interests;
  2. imposes new incident reporting obligations;
  3. increases the role of the National Cybersecurity Agency (the “NCA”);
  4. enhances data security measures by establishing the National Cryptographic Center; and
  5. significantly focuses on the fight against cybercrime by increasing penalties for existing criminal offenses and introducing new criminal offenses in relation to individuals and entities under Italian Legislative Decree No. 231/2001 (“Decree 231”).

The Cybersecurity Law provisions are in addition to the existing Italian cybersecurity regulatory framework, which includes, as mentioned, the Perimeter legislation (Decree Law No. 105/2019),[2]  the Digital Operational Resilience Act (Regulation (EU) 2022/2554, “DORA”), and Italian Legislative Decree No. 65/2018, which implements the NIS 1 Directive.[3]

1. Scope

The Cybersecurity Law imposes obligations on Public Administrations[4] and on in-house companies that provide Public Administrations with: IT services; transportation services; urban, domestic or industrial wastewater collection, disposal or treatment services; and waste management services (“Public Operators”). These in-house companies are included within the scope of the law as they are considered to be critical infrastructure providers, in relation to which cybersecurity vulnerabilities may impact the entire supply chain of goods and services.

In addition, the Cybersecurity Law increases some of the obligations imposed on NIS 1 Operators, Telecommunication Operators and operators included in the Perimeter.

2. Incident reporting obligation

According to Article 1 of the Cybersecurity Law, Public Operators are required to report to the NCA all incidents impacting networks, information systems, and IT services listed in the taxonomy included in the NCA Resolution.[5]

Public Operators must submit an initial report within 24 hours of becoming aware of the incident and a complete report within 72 hours, using the channels available on the NCA website.

Public Operators may also voluntarily report incidents not included in the NCA Resolution taxonomy. These voluntary reports are processed only after mandatory ones to avoid unduly burdening the Italian Computer Security Response Team. Furthermore, submitting a voluntary report shall not impose any new obligations on the notifying party beyond what would be required if the report was not submitted.[6]

In the case of non-compliance with the reporting obligation, Article 1(5) of the Cybersecurity Law requires the NCA to issue a notice to the Public Operator, informing it that repeated non-compliance over a 5-year period will result in an administrative fine ranging from €25,000 to €125,000. Additionally, the NCA may conduct inspections within 12 months of identifying a delay or omission in compliance with the reporting obligation to verify that the Public Operator has taken steps to enhance resilience against the risk of incidents.

The incident reporting obligation takes effect immediately for central public administrations included in the Italian National Institute of Statistics (“ISTAT”) list, as well as for regions, the autonomous provinces of Trento and Bolzano, and metropolitan cities. For all other Public Operators, this obligation will take effect 180 days after the law enters into force.

Under Article 1 of the Cybersecurity Law, the reporting obligation is extended to more entities than those included in the Perimeter. In addition, the amendment to Article 1(3-bis) of Italian Decree-Law No. 105/2019 (establishing the Perimeter) extends the reporting procedure and timeframes set out in the Cybersecurity Law (initial reporting within 24 hours and complete reporting within 72 hours) to incidents that affect networks, information systems, and IT services other than ICT Assets[7] of entities included in the Perimeter.

The reporting obligation under Article 1 of the Cybersecurity Law does not apply to (i) NIS 1 Operators; (ii) operators included in the Perimeter, in relation to incidents affecting ICT Assets (for which the provisions of the Perimeter legislation remain applicable); (iii) State bodies in charge of public and military security; (iv) the Department of Security Information; and (v) the External and Internal Information and Security Agencies.

3. Addressing cybersecurity vulnerabilities reported by the NCA

The Cybersecurity Law outlines how to handle reports of the NCA addressed to Public Operators, entities included in the Perimeter, and NIS 1 and Telecommunication Operators.

In particular, the NCA may identify specific cybersecurity vulnerabilities that could affect the abovementioned recipients. These entities are required to promptly address the identified vulnerabilities within a maximum of 15 days, unless justified technical or organizational constraints prevent them from doing so immediately or necessitate postponement beyond the specified deadline.

Failure to comply with this provision will result in an administrative fine ranging from €25,000 to €125,000.

4. Contact person and cybersecurity structure

Public Operators must establish a cybersecurity structure and designate a cybersecurity contact person (with specific expertise). This contact person, whose name must be communicated to the NCA, will be the NCA’s contact point for cybersecurity matters.

The obligations introduced for Public Operators are similar to those provided for entities included in the Perimeter. For instance, Public Operators are required to: (i) implement internal information security policies; (ii) maintain an information risk management plan; (iii) set out the roles and responsibilities of the parties involved; (iv) implement actions to enhance information risk management based on NCA guidelines; and (v) continuously monitor security threats and system vulnerabilities to ensure timely security updates when necessary.

5. Enhancing data security measures

Public Operators, as well as operators included in the Perimeter and NIS 1 Operators, must verify that computer and electronic communication programs and applications use cryptographic solutions that comply with the guidelines on encryption and password storage issued by the NCA and the Data Protection Authority. In particular, in order to prevent encrypted data from being accessible to third parties, these entities must also ensure that the applications and programs specified in the regulation are free from known vulnerabilities.

Within the framework of the national cybersecurity strategy, the NCA has an increased role in promoting cryptography. This involves the development of standards, guidelines, and recommendations to strengthen information system security. Furthermore, the NCA conducts evaluations of cryptographic system security and coordinates initiatives aimed at advocating for cryptography as a critical cybersecurity tool.

For this purpose, the Cybersecurity Law provides for the creation of a National Cryptographic Center within the NCA, which operates under the guidelines set out by the NCA’s General Director.

6. Public procurement of ICT goods, systems and services

When procuring certain categories of ICT goods, systems and services for activities involving the protection of strategic national interests, public administrations, public service operators, publicly controlled companies,[8] and entities included in the Perimeter must ensure that the ICT goods and services acquired comply with particular criteria and technical standards, thereby safeguarding the confidentiality, integrity, and availability of processed data. These essential cybersecurity standards will be set out in a DPCM, to be adopted within 120 days of the Cybersecurity Law coming into force.

This new obligation stands alongside the existing requirement for entities included in the Perimeter to carry out an evaluation process through the Centre for National Evaluation and Certification (the “CVCN”) to ensure the security of ICT Assets intended for deployment under the Perimeter, as set out in the DPCM dated June 15, 2021. Accordingly, entities under the Perimeter are required, in addition, to assess compliance with essential cybersecurity standards outlined in the abovementioned DPCM for ICT goods and services that are not subject to CVCN evaluation.

7. Restrictions on personnel recruitment

The Cybersecurity Law introduces several restrictions on the ability of private entities to hire individuals who have held specific roles within certain central public administrations; if breached, the contract entered into becomes null and void (Articles 12 and 13).

For instance, the Cybersecurity Law precludes, for a period of two years starting from the last training course, NCA employees who have attended, in the interest and at the expense of the NCA, specific specialized training courses, from taking positions with private entities aimed at performing cybersecurity-related tasks.

8. Amendments to the DORA Regulation scope

Lastly, the Cybersecurity Law amends the law implementing the DORA Regulation to include, in addition to “financial entities”, financial intermediaries[9] and Poste Italiane S.p.A. in relation to its Bancoposta business.

The objective of this amendment is to ensure a high level of digital operational resilience and to maintain stability across the financial sector. Consequently, in the exercise of the delegated power, the Government will make the appropriate adjustments and additions to the regulations governing these entities to align their operational resilience measures with those outlined in the DORA Regulation. These changes will apply to the activities undertaken by each entity concerned. Additionally, the Bank of Italy will assume supervisory, investigative, and sanctioning responsibilities over these entities.

9. Main amendments to the regulation on cybercrime

The Cybersecurity Law strengthens the fight against cybercrime by introducing significant amendments to both the Italian Criminal Code (the “ICC”) and the Italian Code of Criminal Procedure (the “ICCP”).

In particular, the Cybersecurity Law:

  • Increases criminal penalties for a range of cybercrimes, including the crime of unauthorized access to computer systems and the crime of destruction of computer data, information, and programs;
  • Introduces new aggravating circumstances.  It extends the aggravating circumstance which applies when the crime is committed “by a public official or a person in charge of a public service, through abuse of power or in violation of the duties of his or her position or service, by a person who, also abusively, exercises the profession of private investigator, or by abuse of the position of computer system operator”, to apply to all cybercrimes covered by the Cybersecurity Law.  It introduces a new aggravating circumstance for the crime of fraud in cases where the act is committed remotely by means of computer or telematic tools capable of impeding one’s own or another’s identification.[10] It also increases the penalties provided for the existing aggravating circumstances;
  • Introduces two new mitigating circumstances (Articles 623-quater and 639-ter ICC), applicable to specific cybercrimes,[11] which can reduce penalties by (i) up to one-third if the crime can be considered to be “minor” because of the manner in which it was committed, or if the damage or risk is particularly insignificant;  (ii) from one-half to two-thirds if the offender takes steps to prevent further consequences of the crime. This includes actively assisting the authorities in gathering evidence or recovering the proceeds of the crime or the instruments used to commit the crime;
  • Repeals Article 615-quinquies ICC, which punishes the unlawful possession, distribution and installation of instruments, devices or programs designed to damage or interrupt a computer or telematic system, and replaces it with the new criminal offense outlined in Article 635-quater.1 ICC; [12]
  • Introduces the new crime of cyber-extortion (Article 629(3) ICC), which punishes by imprisonment of 6 to 12 years and a fine of € 5,000 to € 10,000 (penalties that may be increased if certain aggravating circumstances are met)[13] anyone who, by committing or threatening to commit specific cybercrimes,[14] forces another person to do or refrain from doing something in order to obtain an unjust benefit for himself or herself or for others to the detriment of others. For example, the new crime could apply in cases where a person, having hacked into a computer system and manipulated or damaged information, data or programs, demands a ransom for the restoration of the computer system and its data.

In addition, the Cybersecurity Law provides for: (i) the allocation of the preliminary investigation of cybercrimes to the district prosecutor’s office; (ii) the application of a “simplified” system for granting an extension of the preliminary investigation period for cybercrimes;[15] and (iii) the extension of the maximum period for preliminary investigation to two years.

10. Amendments to Decree 231 and next steps for companies

The Cybersecurity Law introduces significant amendments to Decree 231. In particular, the Cybersecurity Law:

  • Increases the penalties for cybercrimes established by Article 24-bis of Decree 231, providing for (i) a maximum fine of € 1,084,300 for the offenses referred to in Article 24-bis(1)  of Decree 231,[16] and (ii) a maximum fine of € 619,600 for the offenses referred to in Article 24-bis(2) [17]  of Decree 231;[18]
  • Expands the list of crimes that may trigger liability for companies and other legal entities under Decree 231, by including the new crime of cyber-extortion (new Article 24-bis(1-bis) of Decree 231) which is subject to the following penalties (i) a maximum fine of € 1,239,200, and (ii) disqualification penalties set out in Article 9(2) of Decree 231 (i.e., disqualification from conducting business; suspension or revocation of authorizations, licenses or concessions instrumental to the commission of the crime; prohibition from entering into contracts with the public administration; exclusion from grants, loans, contributions and subsidies with the possible revocation of those already granted; and ban on advertising goods and services) for a period of at least two years.

In light of these developments, companies should consider reviewing and updating their policies and procedures to ensure that they are adequate to prevent new offenses that may trigger liability under Decree 231. In particular, companies should consider implementing new and more specific control measures, in addition to those already in place to prevent the commission of cybercrimes (which may already constitute a safeguard, even with respect to the newly introduced crime of cyber-extortion). Measures may include ensuring the proper use of IT tools, maintaining security standards for user identity, data integrity and confidentiality, monitoring employee network usage, and providing targeted information and training to company personnel.

11. Conclusion

The new Cybersecurity Law, while fitting into a complex regulatory framework that will need further changes in the short term (notably, the NIS 2 Directive must be implemented by October 2024), nevertheless represents a concrete response to the sudden and substantial increase in cyber threats. In particular, the expansion of incident reporting requirements to include new stakeholders and the introduction of stricter reporting deadlines for incidents not affecting ICT Assets aim to enhance national cyber resilience and security. This approach ensures that critical infrastructure providers have better control over cybersecurity incidents.

The increased penalties for cybercrimes, the introduction of new criminal offenses, and the developments regarding corporate liability under Decree 231 are also consistent with the above objectives. These measures are intended to tackle the increasing threat of cybercrime, although their effectiveness in practice remains to be seen.


[1] According to the 2024 report published by the Italian Association for Information Security (“CLUSIT”), in 2023 cyber-attacks increased by 11% globally and by 65% in Italy.

[2] Together with the relevant implementing decrees: Italian President of the Council of Ministers’ Decree (“DPCM”) No. 131 of July 30, 2020; Italian Presidential Decree (“DPR”) No. 54 of February 5, 2021; DPCM No. 81 of April 14, 2021; Italian Legislative Decree No. 82 of June 14, 2021; DPCM of June 15, 2021; DPCM No. 92 of May 18, 2022; and the NCA Resolution of January 3, 2023 (the “NCA Resolution”).

[3] However, the Cybersecurity Law does not specifically refer to Directive (EU) 2022/2555 (the “NIS 2 Directive”), which Member States are required to implement by October 17, 2024.

[4] Specifically, according to the Cybersecurity Law, the following are considered public administrations: central public administrations included in ISTAT annual list of public administrations; regions and autonomous provinces of Trento and Bolzano; metropolitan cities; municipalities with a population of more than 100,000 inhabitants and in any case, regional capitals; urban public transportation companies with a catchment area of not less than 100,000 inhabitants; suburban public transportation companies operating within metropolitan cities; and local health care companies.

[5] See https://www.gazzettaufficiale.it/eli/id/2023/01/10/23A00114/sg.

[6] See Article 18, paragraphs 3, 4 and 5 of Italian Legislative Decree No. 65/2018.

[7] Defined, in accordance with Art. 1, letter m) of DPCM 131/2020, as a “set of networks, information systems and information services, or parts thereof, of any nature, considered unitarily for the purpose of performing essential functions of the State or for the provision of essential services.”

[8] Operators referred to in Article 2(2) of the Digital Administration Code (Italian Legislative Decree No. 82/2005).

[9] Listed in the register provided for in Article 106 of the Consolidated Law on Banking and Credit, referred to in Italian Legislative Decree No. 385/1993.

[10] New paragraph 2-ter of Article 640 ICC.

[11] In particular, Article 623-quater ICC applies to the criminal offenses set out in Articles 615-ter (Unauthorized access to a computer or telematic system), 615-quater (Possession, distribution and unauthorized installation of tools, codes and other means of access to computer or telematic systems), 617-quater (Unlawful interception, obstruction, or disruption of computer or telematic communications), 617-quinquies (Possession, distribution and unauthorized installation of tools and other means to intercept, obstruct or interrupt computer or telematic communications) and 617-sexies ICC (Falsifying, altering or suppressing the content of computer or telematic communications). Article 639-ter ICC instead applies to the criminal offenses set out in Articles 629(3) (new crime of cyber-extortion), 635-ter (Damage to information, data and computer programs of a public nature or interest), 635-quarter.1 (Unauthorized possession, distribution, or installation of tools, devices, or programs designed to damage or interfere with a computer or telematic system) and 635-quinquies ICC (Damage to public utility computer or telematic systems).

[12] The new provision addresses the same conduct for which penalties were provided for under former Article 615-quinquies ICC and provides for the same penalties, with the addition of the aggravating circumstances set out in Article 615-ter(2.1) and Article 615-ter(3) ICC.

[13] In particular, a penalty of imprisonment of 8 to 22 years and a fine of € 6,000 to € 18,000 applies if the aggravating circumstances referred to in the paragraph 3 of Article 628 ICC (i.e., the aggravating circumstances provided for the crime of robbery) are met, or where the crime is committed against a person incapacitated by age or infirmity.

[14] That is, those set out in Articles 615-ter, 617-quater, 617-sexies, and 635-bis (Damage to computer information, data and programs), 635-quater (Damage to computer or telematic systems) and 635-quinquies ICC.

[15] In particular, the “simplified” regime is provided for under Article 406(5-bis) ICCP, which provides that the judge shall issue an order within ten days from the submission of the request for extension of the preliminary investigation period by the public prosecutor. This provision, which is reserved for particularly serious crimes, is intended to allow a more timely and effective investigation of the commission of the crime.

[16] That is, the crimes under Articles 615-ter, 617-quater, 617-quinquies, 635-bis, 635-ter, 635-quater and 635-quinquies ICC.

[17] That is, the crimes under Articles 615-quater and 635-quater(1) ICC.

[18] The disqualification penalties provided for these cybercrimes remain unchanged.

EHDS – The EU Parliament formally adopts the Provisional Agreement: Key Takeaways and Next Steps

In our Alert Memorandum of 19 July 2022 (available here), we outlined the European Commission’s (the “Commission”) proposal for a regulation on the “European Health Data Space” (the “Regulation” or the “EHDS”). The proposal, which was published in May 2022, is the first of nine European sector- and domain-specific data spaces set out by the Commission in 2020 in the context of its “European strategy for data”.

The EU is now reportedly aiming to conclude the EHDS dossier and adopt the Regulation before the end of the EU Parliament’s current term (June 2024). To this end, on 15 March 2024, the EU Council and the EU Parliament announced that they had reached a provisional agreement on the text of the Regulation (the text is available here). And on 24 April 2024, the EU Parliament formally adopted the text of the provisional agreement.

Background:

The proposed Regulation is an initiative that attempts to create a “European Health Union” to make it easier to exchange and access health data at EU level. The Regulation builds on other recent EU reforms such as the recently enacted Data Act and the proposed AI Act. It seeks to tackle legacy systemic issues that have hindered lawful access to electronic health data. It promotes the electronic exchange of health data by enhancing individuals’ access to and portability of these data and by enabling innovators and researchers to process these data through reliable and secure mechanisms. It contains rules that govern both primary use (i.e., use of such data in the context of healthcare) and secondary use of health data (e.g. use for non-healthcare purposes such as research, innovation, policy-making, statistics).

Recent Proposals:

On 6 December 2023, the EU Council issued a press release (available here) confirming the agreement on the EU Council’s position and its mandate to start negotiations with the EU Parliament as soon as possible in order to reach a provisional agreement on the proposed Regulation (see the EU Council’s proposed amendments here). Subsequently, on 13 December 2023, the EU Parliament finalised its proposed amendments to the Regulation (see the EU Parliament’s proposed amendments here).

Following the inter-institutional trilogue negotiations between the EU Parliament, the EU Council and the Commission, on 15 March 2024, the EU Council and the EU Parliament issued a press release (available here) confirming that a provisional agreement had been reached on the text of the Regulation. They introduced new rules and also modified or clarified some of the rules that were originally proposed (some of which were outlined in our Alert Memorandum of 19 July 2022).

Some of the highlights from the provisional agreement are as follows:

  • Scope of Prohibited Purposes: The new text seeks to expand and clarify the scope of prohibited purposes for secondary use of health data. For instance, the Regulation now provides that the secondary use of health data to take decisions that will produce economic or social effects should be prohibited – this provides an additional prohibition on top of the original proposal, which intended to prohibit secondary use of health data only where the decisions produced “legal” effects. In addition, the Regulation further includes within the scope of the prohibited purposes: (i) decisions in relation to job offers; (ii) offering less favourable terms in the provision of goods or services; (iii) decisions regarding conditions of taking loans or any other discriminatory decisions taken on the basis of health data.
  • Categories of Personal Data subject to Secondary Use: As above, electronic health data can be subject to “secondary use” and health data holders should make certain categories of electronic data available for secondary use. The EU Parliament and the EU Council confirmed in their provisional agreement that Member States will be able to establish trusted data holders that can securely process requests for access to health data in order to reduce the administrative burden. The text includes a number of amendments to the categories of electronic data that can be made available for secondary use.
  • IP and Trade Secrets:
    • The EU Commission’s first draft of the Regulation did not include specific measures to preserve the confidentiality of IP rights and trade secrets; however, the Regulation now includes a set of new provisions on the protection of IP rights and trade secrets (Recital 40c, Article 33a). Accordingly, where health data is protected by IP rights or trade secrets, the Regulation should not be used to reduce or circumvent such protection. The provisions impose, among other things, an obligation on the “health data access bodies”[1] to take all specific measures, including legal, organisational and technical measures, that are necessary to preserve the confidentiality of data entailing IP rights or trade secrets. Such measures could include common contractual arrangements for electronic health data access, specific obligations in relation to the rights that fall within the data permit, pre-processing the data to generate derived data that protects a trade secret but still has utility for the user, or configuration of the secure processing environment so that such data is not accessible by the health data user. If granting access to electronic health data for a secondary purpose would entail a serious risk of infringing intellectual property rights, trade secrets and/or the regulatory data protection right that cannot be addressed in a satisfactory manner, the health data access body must refuse access and explain the reason to the health data user (see Article 33a(1)(d) of the Regulation).
    • In addition, the Regulation now includes additional obligations to health data holders[2] with respect to electronic health data that entail IP rights or trade secrets. For example, the original proposals required a health data holder to make the electronic data they hold available upon request to the health data access body in certain circumstances. The Regulation now requires health data holders to inform the health data access body of such IP rights or trade secrets, as well as to indicate which parts of the datasets are concerned and justify why the data needs the specific protection which the data benefits from, when communicating to the health data access body the dataset descriptions for the datasets they hold, or at the latest following a request from the health data access body.
    • The Regulation also requires health data access bodies to apply certain criteria when deciding to grant or refuse access to health data. These criteria include whether the request demonstrates sufficient safeguards to protect the health data holder and the natural persons concerned; whether there is a lawful basis under the GDPR in the case of access to pseudonymised health data; and whether the requested data is necessary for the purpose described in the access application. In addition, the health data access body must also take into account certain risks when deciding on the same. The health data access body must permit the data access where it concludes that the above-mentioned criteria are met and the risks that it must take into account are sufficiently mitigated.
  • Transparency: The Regulation now intends to impose an additional obligation on the data holders to provide certain information to natural persons about their processing of personal health data. This information obligation is intended to supplement the transparency obligations that the data holders may have under the GDPR.
  • Right to access personal electronic health data: The Regulation now adds the individuals’ right to download their electronic health data and specifies that the right to access personal electronic health data in the context of the EHDS complements the right to data portability under Article 20 of the GDPR (see Recital 11). In this context it should be noted that the GDPR right to data portability is limited only to data processed on the basis of consent or contract – which excludes data processed under other legal bases, such as when the processing is based on law – and only concerns data provided by the data subject to a controller, excluding many inferred or indirect data, such as diagnoses or test results.
  • Right to opt-out and need to obtain consent: New Recital 37c and Article 35f provide patients with a right to opt-out of the processing of all their health data for secondary use, except for purposes of public interest, policy making, statistics and research purposes in the public interest. In addition, individuals shall be provided with sufficient information on their right to opt-out, including on the benefits and drawbacks when exercising this right. In addition, Member States may put in place stricter measures governing access to certain kinds of sensitive data, such as genetic data, for research purposes.
  • Data localisation: Data localisation requirements are imposed in Articles 60a and 60aa. These provisions are intended to require that personal electronic health data be stored, for the purposes of primary and secondary use, exclusively within the territory of the EU or in a third country, territory or one or more specified sectors within that third country covered by an adequacy decision pursuant to Article 45 of the GDPR. These proposed changes are seemingly intended to address some of the concerns expressed by the European Data Protection Board (the “EDPB”) and the European Data Protection Supervisor (the “EDPS”) in their joint opinion of 12 July 2022. However, in certain ways the provisions do seem to go beyond the recommendations of the EDPB / EDPS (for example, with respect to the localisation of data, the EDPB/EDPS opinion actually proposed to require that electronic health data be stored in the EEA, but to allow for transfers under Chapter V of the GDPR, i.e. including, for example, transfers under standard contractual clauses or under the derogations provided for in Article 49 of the GDPR).

Next steps:

The provisional agreement will now have to be endorsed by the EU Council. It has been reported that the aim of the institutions is to conclude the EHDS dossier and adopt the Regulation before the end of the EU Parliament’s term (June 2024).

Once formally adopted and published in the Official Journal of the EU, the EHDS will be directly applicable following a grace-period (currently, two years) after the entry into force of the Regulation (with the exception of certain provisions which will have different application dates).


[1] This is a body that Member States will set up to be responsible for granting access to electronic health data for secondary use.

[2] This means the natural or legal person that has the ability to make available data; however note that negotiations between the EU Parliament, the EU Council and the EU Commission are still ongoing on the definition of “data holders”.

EU Court of Justice confirms earlier case law on broad interpretation of “personal data” and offers extensive interpretation of “joint controllership”, with possible broad ramifications in the AdTech industry and beyond

On March 7, 2024, the Court of Justice of the European Union (the “CJEU”) handed down its judgment in the IAB Europe case, answering a request for a preliminary ruling under Article 267 TFEU from the Brussels Market Court.[1]  The case revolves around IAB Europe’s Transparency and Consent Framework (“TCF”) and has been closely monitored by the AdTech industry ever since the Belgian DPA investigated and subsequently imposed a 250,000 euro fine on IAB Europe for alleged breaches of GDPR and e-Privacy rules back in 2022.[2]

Factual Background

IAB Europe is a European-level standard-setting association for the digital marketing and advertising ecosystem.  Back in 2018, when the GDPR became applicable, it designed the TCF as a set of rules and guidelines that addresses challenges posed by GDPR and e-Privacy rules in the context of online advertising auctions (such as real-time bidding).  The goal was to help AdTech companies that do not have any direct interaction with the website user (i.e., any company in the AdTech ecosystem that is not the website publisher, such as ad-networks, ad-exchanges, demand-side platforms) to ensure that the consent that the website publisher obtained (through cookies or similar technologies) is valid under the GDPR (i.e., freely given, specific, informed and unambiguous) and that, therefore, those AdTech companies can rely on that consent to serve ads to those users in compliance with GDPR and e-Privacy rules.

On a technical level, and in simplified terms, the TCF is used to record consent (or lack thereof), or objections to the reliance on legitimate interests under GDPR, among IAB’s members by storing the information on consents and objections in a Transparency and Consent String (the “TC String”).  The TC String is a coded representation (a string of letters and numbers) of a user’s preferences, which is shared with data brokers and advertising platforms participating in the TCF auction protocol, who would not otherwise have a way to know whether users have consented or objected to the processing of their personal data.[3]
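For illustration only, the sketch below shows the general idea behind such a consent string: per-vendor consent flags packed into a compact bit field and encoded so that it can be passed along the ad-serving chain. The function names are hypothetical, and this is not the actual TC String format, whose versioned, multi-field layout (purposes, legitimate interests, publisher restrictions, and so on) is defined in IAB Europe’s TCF specifications.

```python
import base64


def encode_consent_string(vendor_consents: dict, max_vendor_id: int) -> str:
    """Pack per-vendor consent booleans into a bit field and base64url-encode it.

    Purely illustrative: the real TC String carries many more fields and a
    defined version/layout; this only demonstrates the compact-encoding idea.
    """
    bits = ["1" if vendor_consents.get(vid, False) else "0"
            for vid in range(1, max_vendor_id + 1)]
    bitstring = "".join(bits).ljust(-(-len(bits) // 8) * 8, "0")  # pad to whole bytes
    raw = int(bitstring, 2).to_bytes(len(bitstring) // 8, "big")
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode("ascii")


def decode_consent_string(tc_string: str, max_vendor_id: int) -> dict:
    """Reverse the encoding: recover the per-vendor consent flags."""
    padded = tc_string + "=" * (-len(tc_string) % 4)
    raw = base64.urlsafe_b64decode(padded)
    bitstring = bin(int.from_bytes(raw, "big"))[2:].zfill(len(raw) * 8)
    return {vid: bitstring[vid - 1] == "1" for vid in range(1, max_vendor_id + 1)}


# Example: the user consents to vendors 2 and 5 out of 8 participating vendors.
tc = encode_consent_string({2: True, 5: True}, max_vendor_id=8)
print(tc)                                          # compact, shareable string
print(decode_consent_string(tc, max_vendor_id=8))  # {1: False, 2: True, ...}
```

In the real framework, platforms receiving the string decode it to determine whether they may process the user’s data before bidding; the legal dispute is precisely whether that string, combined with an identifier such as an IP address, amounts to personal data and who is responsible for it.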

First Question: Does the TC String constitute Personal Data?

The CJEU now ruled, echoing its earlier decision in Breyer,[4] that the TC String may constitute personal data under the GDPR to the extent those data may, by “reasonable means”, be associated with an identifier such as an IP address, allowing the data subject to be (re-)identified.  The fact that IAB Europe can neither access the data that are processed by its members under its membership rules without an external contribution, nor combine the TC String with other factors itself, did not preclude the TC String from potentially being considered personal data according to the CJEU.[5] 

Second Question: Does IAB Europe act as Data Controller?

Secondly, the Court decided that IAB Europe, as a sectoral organization proposing a framework of rules regarding consent to personal data processing, which contains not only binding technical rules but also rules setting out in detail the arrangements for storing and disseminating personal data, should be deemed a joint controller together with its members if and to the extent it exerts influence over the processing “for its own purposes” and, together with its members, determines the means behind such operations (e.g., through technical standards).  In the IAB Europe case, this concerns in particular the facilitation by IAB of the sale and purchase of advertising space among its members and its enforcement of rules on TC String content and handling.  It also seemed particularly relevant to the Court that IAB Europe could suspend membership in case of breach of the TC String rules and technical requirements by one of its members, which may result in the exclusion of that member from the TCF.

Further, in keeping with earlier CJEU case-law[6], the Court found it irrelevant that IAB Europe does not itself have direct access to the personal data processed by its members.  This does not in and of itself preclude IAB Europe from holding the status of joint controller under GDPR.

However, the Court also reiterated that joint controllership doesn’t automatically extend to subsequent processing by third parties, such as – in this case – website or application providers further processing the TC String following its initial creation, unless the joint controller continues to (jointly) determine the purpose and means of that subsequent processing.  This is in line with the Court’s 2019 Fashion ID judgment.[7]  In addition, the Court opined that the existence of joint controllership “does not necessarily imply equal responsibility” of the various operators engaged in the processing of personal data. The level of responsibility of each individual operator must be assessed in the light of all the relevant circumstances of a particular case, including the extent to which the different operators are involved at different stages of the data processing or to different degrees.  So not all joint controllers are created equal.

Key Takeaways

In our view, the first finding is not groundbreaking.  It largely confirms the Court’s previous case-law establishing that “personal data” must be interpreted broadly under GDPR, meaning the standard for truly “anonymized data” continues to be very high.  It will now be for the Brussels Market Court to determine whether, based on the specific facts of the IAB Europe case, the TC String indeed constitutes personal data.

The second finding may have caught more people off guard.  While it will again be up to the Brussels Market Court to determine whether IAB Europe is actually a joint controller in respect of the personal data alleged to be included in the TC String, the Court’s expansive interpretation of the concept of joint controllership (i.e., where “two or more controllers jointly determine the purposes and means of processing” (Article 26 GDPR)) could have broader ramifications beyond the AdTech industry. 

Organizations who until now have consistently taken the position that they do not qualify as a data controller in respect of data processing activities of their members, users or customers, may need to re-assess that position and, based on the specific factual circumstances relevant to them, consider whether they might in fact be subject to GDPR’s onerous obligations imposed on data controllers.  This may be particularly relevant for standard-setting bodies and industry associations active or established in Europe, potentially hampering their ability to continue developing relevant standards and rules.  Arguably, this could even capture certain providers or deployers of software and other computer systems, including those developing or deploying AI models and systems, in case they would be found to issue “binding technical rules” and “rules setting out in detail the arrangements for storing and disseminating personal data”, and they would actually enforce those rules against third parties using their models and systems to process personal data. 

Even if some solace can be found from a liability perspective in the confirmation by the Court that joint controllership relating to the initial collection of personal data does not automatically extend to the subsequent processing activities carried out by third-parties, and that not all joint controllers are created equal, the compliance burden on “newfound joint controllers” may nevertheless be burdensome because key obligations on lawfulness, transparency, data security and accountability are triggered irrespective of the “degree” of controllership in question.

In our view that would take the concept of “joint controllership” too far beyond its literal meaning and originally intended purpose, but it remains to be seen which other enforcement actions will be taken and which other cases raising similar questions may find their way through the European courts in the coming months and years.


[1]           CJEU, judgment of March 7, 2024, IAB Europe, C-604/22, ECLI:EU:C:2024:214 (https://curia.europa.eu/juris/document/document.jsf?text=&docid=283529&pageIndex=0&doclang=FR&mode=req&dir=&occ=first&part=1&cid=167405).

[2]           For more information on the original case in front of the Belgian DPA, see the DPA’s dedicated landing page: https://www.dataprotectionauthority.be/iab-europe-held-responsible-for-a-mechanism-that-infringes-the-gdpr.

[3]           For more information, see the IAB Europe website: https://iabeurope.eu/.

[4]           CJEU, judgment of 19 October 2016, Breyer, C‑582/14, EU:C:2016:779, paragraphs 41-49 (https://curia.europa.eu/juris/document/document.jsf?text=&docid=184668&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=1303370).

[5]           Recital 26 of GDPR further clarifies that, “to ascertain whether means are reasonably likely to be used to identify the natural person, account should be taken of all objective factors, such as the costs of and the amount of time required for identification, taking into consideration the available technology at the time of the processing and technological developments.”  This will always require a fact-intensive, case-by-case inquiry, but it is now even more clear that “it is not required that all the information enabling the identification of the data subject must be in the hands of one person” (CJEU, IAB Europe judgment, §40).

[6]           CJEU, judgment of July 10, 2018, Jehovan todistajat, C‑25/17, EU:C:2018:551, paragraph 69 (https://curia.europa.eu/juris/document/document.jsf?text=&docid=203822&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=1305431), and CJEU; judgment of June 5, 2018, Wirtschaftsakademie Schleswig-Holstein, C‑210/16, EU:C:2018:388, paragraph 38 (https://curia.europa.eu/juris/document/document.jsf?text=&docid=202543&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=1305548).

[7]           CJEU, judgment of July 29, 2019, Fashion ID, C‑40/17, EU:C:2019:629, paragraph 74 (https://curia.europa.eu/juris/document/document.jsf?text=&docid=216555&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=1305826), as commented on in our earlier blog post here: https://www.clearycyberwatch.com/2019/08/cjeu-judgment-in-the-fashion-id-case-the-role-as-controller-under-eu-data-protection-law-of-the-website-operator-that-features-a-facebook-like-button/; See also the EDPB Guidelines 07/2020 on the concepts of controller and processor in the GDPR (version 2.1, adopted on July 7, 2021), in relation to the concept of “converging decisions”, at paragraphs 54-58 (https://www.edpb.europa.eu/system/files/2023-10/EDPB_guidelines_202007_controllerprocessor_final_en.pdf).
