
Key Takeaways From the EDPB’s Draft Guidelines on Scientific Research

On April 15, 2026, the European Data Protection Board (EDPB) adopted guidelines on the processing of personal data for scientific research purposes.[1] The guidelines aim to clarify GDPR compliance requirements for scientific research involving personal data.

The concepts addressed by the EDPB are of particular relevance to companies active in life sciences, artificial intelligence (AI), and advanced technology R&D.

The guidelines are open for public consultation until June 25, 2026.

The most significant aspect of the guidelines is the EDPB’s clarification of what constitutes “genuine” scientific research. The guidelines set out six key indicative factors to be considered alongside the nature, scope, context, and purposes of the processing. These factors appear to restrict the scope of processing that can be classified as scientific research, meaning that researchers may need to re-evaluate whether their activities genuinely qualify for the GDPR’s more flexible treatment of scientific research.

Six Factor Test to Define “Scientific Research” Under GDPR

The six key indicative factors are as follows:[2]

  1. Methodical and systematic approach: The research activities, including formulation and testing of a hypothesis, follow a methodical and systematic approach in the relevant field, for example in accordance with a comprehensive research plan.
  2. Adherence to ethical standards: The research activities adhere to ethical standards in the relevant field, including respect for human autonomy and consent, transparency, accountability, and (human) oversight.
  3. Verifiability and transparency: The research activities aim to achieve verifiable results, with hypotheses, methods, data and conclusions open to criticism (normally through peer review), and results shared with other parties, for example by publication.
  4. Autonomy and independence: The research activities are conducted autonomously and independently, with the research team having the freedom to define research questions, identify methods, choose scientific theories, and disseminate results. The researchers have academic or scientific qualifications in the relevant field.
  5. Objectives of the research: The research activities aim to contribute to the growth of society’s general knowledge and wellbeing. This does not exclude research that may also further commercial interests, but the EDPB does suggest in one of the examples included in the guidelines that research “solely concerned with furthering […] commercial interests” would not qualify.
  6. Potential to contribute to existing scientific knowledge or apply existing knowledge in novel ways: The research activities have the potential to contribute to existing scientific knowledge or apply existing knowledge in novel ways, and their scientific merits can be subject to assessment, review or approval by independent experts or committees.

If all six factors are met, the activities can be presumed to constitute scientific research. If not, the controller must justify and demonstrate why the activities should nonetheless qualify.

Anonymization and Pseudonymization in the Context of Scientific Research

The remainder of the guidelines address GDPR compliance more generally in the context of scientific research, including with respect to: data protection principles, lawfulness of processing, transparency, data subjects’ rights, attribution of responsibility, and appropriate safeguards.

While these sections largely restate existing principles (albeit with useful clarifications on “broad” and “dynamic” consent, including through specific examples on how organizations can navigate the tension with the principles of specificity and purpose limitation as part of their overall data protection governance structure), the EDPB’s views on data minimization merit highlighting.[3] The EDPB takes the view that, because personal data must be “adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed”,[4] anonymization should be the default approach for scientific research. Once data is truly anonymized, it falls outside the scope of the GDPR entirely, although the anonymization process itself must still comply with GDPR requirements.[5] Where research aims cannot be achieved using anonymized data, personal data should be pseudonymized.[6] Processing data that can directly identify individuals should only occur where “strictly” necessary and proportionate to the research purpose.[7] Controllers will welcome the clarity provided by the guidelines, though ongoing compliance may require updates to internal processes. The full practical implications will become clearer once the dedicated guidance on anonymization and pseudonymization is published later this year.

Data subjects must be transparently informed about whether their data is processed in identifiable or pseudonymized form, and must not be misled into believing that their data is anonymized when it is not.[8]

Other Recent EDPB Updates

In addition to adopting these guidelines, the EDPB established a dedicated “sprint team” to finalize its upcoming and much-anticipated guidelines on anonymization by summer 2026.[9] The questions of when personal data qualifies as “anonymous” under the GDPR, and under what circumstances personal data (including sensitive personal data) can be used to train AI models, are currently also the subject of ongoing negotiations at EU level on the Digital Omnibus Package.[10]

Finally, the EDPB adopted two opinions approving two sets of Europrivacy certification criteria as a European Data Protection Seal, simplifying the data transfer process and enhancing accountability in high-risk sectors. The first approves an updated set of criteria whose scope now includes controllers and processors established outside Europe that are subject to Article 3(2) GDPR.[11] The second recognizes the Europrivacy certification criteria as a European Data Protection Seal that can be used as a transfer mechanism under Articles 42 and 46 GDPR.[12] This will allow data importers outside Europe that are not subject to the GDPR to seek Europrivacy certification for the transferred data they receive.


[1] EDPB Press Release, April 16, 2026, available here.

[2] EDPB Guidelines, section 2.1.

[3] EDPB Guidelines, section 8.3.

[4] GDPR Article 5(1)(c).

[5] EDPB Guidelines, para. 156.

[6] EDPB Guidelines, paras. 157-158.

[7] EDPB Guidelines, para. 159.

[8] EDPB Guidelines, para. 164.

[9] EDPB Press Release, April 16, 2026, available here.

[10] Cleary AI and Technology Insights, “Reset or rollback: Unpacking the EU’s Digital Omnibus Package”, November 21, 2025, available here.

[11] Opinion 14/2026 on the Europrivacy certification criteria regarding their approval by the Board as European Data Protection Seal pursuant to Article 42(5) GDPR, adopted April 15, 2026, available here.

[12] Opinion 15/2026 on the Europrivacy certification criteria regarding their approval by the Board as European Data Protection Seal to be used as tool for transfers pursuant to Articles 42 and 46 GDPR, adopted April 15, 2026, available here.

European-Chinese geopolitical issues drive renewed cyberespionage campaign

1 April 2026 at 10:31

A Chinese cyberespionage group has shifted its gaze back to Europe after years of focusing on other parts of the world, Proofpoint research published Wednesday found.

The surge began in mid-2025, as a bevy of issues bubbled up between China and Europe, the company said. Proofpoint labels the government-linked group TA416, but other companies track it under names such as Twill Typhoon and Mustang Panda.

“This renewed focus most heavily targeted individuals or mailboxes associated with diplomatic missions and delegations to NATO and the EU,” Proofpoint’s Mark Kelly and Georgi Mladenov wrote. “TA416’s return to European government targeting occurred during heightened EU–China tensions over trade, the Russia–Ukraine war, and rare earths exports, and commenced immediately following the 25th EU–China summit.”

Separately, the same group took up targeting the Middle East in March after the start of the conflict in Iran, something it had never been spotted doing before, Proofpoint found.

“This aligns with a trend observed by Proofpoint of some state-aligned threat actors shifting targeting toward Middle Eastern government and diplomatic entities in the aftermath of the war,” the firm said. “This likely reflects an effort to gather regional intelligence on the status, trajectory, and broader geopolitical implications of the conflict.”

TA416 was active in Europe in 2022 and 2023, coinciding with the onset of the Ukraine-Russia war, but stepped away from the continent afterward, according to the researchers. Its focus turned to Southeast Asia, Taiwan and Mongolia for a couple of years.

The group’s focus on Europe through early 2026 used a variety of web-bug and malware delivery methods, including reconnaissance efforts that dangled lures about Europe sending troops to Greenland. It also included phishing emails about humanitarian concerns, interview requests and collaboration proposals, Proofpoint said.

“During this period, TA416 repeatedly altered its initial infection chains while maintaining a consistent goal of loading the group’s customized PlugX backdoor via DLL sideloading triads,” the researchers wrote.

Proofpoint’s is not the only recent report about Chinese cyberespionage groups targeting Europe; another focused on LinkedIn solicitations aimed at NATO and European institutions.

The post European-Chinese geopolitical issues drive renewed cyberespionage campaign appeared first on CyberScoop.

Dems pressure Google, Apple to drop X app as international regulators turn up heat

By: djohnson
9 January 2026 at 14:06

A trio of Senate Democrats are calling on Apple and Google to drop Elon Musk’s X from app stores, as international regulators in Europe and Britain took steps toward investigating the site’s mass generation of nonconsensual “undressed” images of users via its Grok AI tool.

On Friday, Senators Ron Wyden, D-Ore., Ben Ray Luján, D-N.M., and Ed Markey, D-Mass., wrote to Apple’s and Google’s chief executives, asking them to “enforce your app stores’ terms of service against X.”

“X’s generation of these harmful and likely illegal depictions of women and children has shown complete disregard for your stores’ distribution terms,” they wrote.

The Senators quote from Google Play Store’s terms of service stating that apps must “prohibit users from creating, uploading, or distributing content that facilitates the exploitation or abuse of children” and subject them to immediate removal for violations. Apple’s terms allow wide flexibility to take action on apps or content that are “offensive” or “just plain creepy,” something they argued should clearly cover what is happening on X.

“There can be no mistake about X’s knowledge, and, at best, negligent response to these trends,” the lawmakers wrote. 

The lawmakers explicitly compared the lack of action or comment from both companies thus far to the way the stores treated apps meant to track Immigration and Customs Enforcement operations around the country, such as ICEBlock and Red Dot.

“Unlike Grok’s sickening content generation, these apps were not creating or hosting harmful or illegal content, and yet, based entirely on the Administration’s claims that they posed a risk to immigration enforcers, you removed them from your stores,” the Senators noted.

The call comes as international regulators have turned up the heat on X over the scandal, while conflicting reports swirl about the extent to which X has limited Grok’s deepfake functionality after weeks of criticism.

The UK’s Office of Communications, the nation’s top communications regulatory agency, said it had made “urgent” contact with X over the images being generated by users through Grok, and that, based on its response, “we will undertake a swift assessment to determine whether there are potential compliance issues” under the UK Online Safety Act. On Friday, Prime Minister Keir Starmer called the images “unlawful” and “disgusting” and promised that all options, including a potential ban of X, were being considered.

Meanwhile, the European Union has ordered X to preserve all documents related to Grok through 2026, an indication that it could be subject to regulatory or law enforcement investigations, according to Reuters.

As CyberScoop and others have reported, legal experts have said that Musk may be exposing X to broad legal and regulatory risks from states, federal regulators and law enforcement.

There have been conflicting reports that X, which has not responded to inquiries from journalists under Musk’s ownership, may be taking steps to limit Grok’s deepfake functionality for some of its users.

On Friday, Musk posted on X that he was limiting the feature to paid users, prompting a fresh round of outrage from observers, who pointed out that monetizing illegal sexual deepfakes was not a solution to the problem. Prior to that statement, the only public response from Musk addressing the scandal was a post he made with “cry-laughing” emojis in response to a Grok-generated deepfake of himself wearing a bikini.

Musk doesn’t release numbers around paid subscribers, but a TechCrunch analysis indicates the figure could be somewhere between 1.3 million and 3.7 million users, based on revenues reported from in-app purchases.

But even the claim that non-paying users are shut out from making further sexualized deepfakes through Grok may be inaccurate, as users on social media reported that even after the change, they were able to access Grok’s deepfake feature as a free user through X or Grok’s website.


GDPR vs. the hosting defence: How wary should online platforms be of the EU Court of Justice Russmedia judgment?

CJEU ruling heralded as “landmark” GDPR judgment turns on a specific set of facts and requires careful interpretation in the post-DSA regulatory reality.

The judgment of the Court of Justice of the European Union (CJEU) in the Russmedia case is a significant ruling for online platforms. However, caution is needed when drawing inferences from the specific facts and circumstances of that case, which involved a severe breach of privacy, the processing of sensitive personal data, and an operator of an online marketplace that the CJEU deemed a “data controller” in respect of its processing of that sensitive personal data.

Key facts and findings

The case can be traced back to August 2018, when an anonymous third party published a false advertisement on an online marketplace operated by Russmedia Digital.[1] The ad falsely and maliciously presented a woman as offering sexual services and included photographs of the woman and her personal telephone number. When contacted by the woman, Russmedia took down the ad within the hour, but at that point it had already been reproduced on other websites and the damage was done.

On these facts, the Court found that Russmedia, as operator of the online marketplace, should be qualified as a “controller” under GDPR in respect of the processing of the sensitive personal data contained in the ad and that, in that specific capacity, Russmedia should have taken the following actions, in each case “by means of appropriate technical and organisational measures” (within the meaning of GDPR), to prevent the harm caused:

  • Proactively screen ads proposed to be placed on its platform to identify ads that contain sensitive personal data (a.k.a. special categories of personal data within the meaning of Article 9 of GDPR).[2]
  • If an ad containing sensitive data is identified during the screening, perform an identity check – before publishing the ad – to verify if the advertiser is the person whose sensitive data appear in the ad.
  • If the advertiser is not the person whose sensitive data are included, refuse publication unless the advertiser can prove that the relevant person has given his or her explicit consent to the publication of the ad on the online marketplace.[3]
  • Prevent ads containing sensitive personal data from being scraped (copied) from the online marketplace and unlawfully published on other websites.[4]

The Court also held that Russmedia could not rely on the hosting liability safe harbour provisions of the e-Commerce Directive. Russmedia had successfully invoked the safe harbour before the Romanian court. The CJEU disagreed, however, holding that applying the e-Commerce Directive’s liability exemptions in a case where a breach of GDPR was (allegedly) at issue, and where – crucially – the operator in question qualified as a “controller” in relation to the processing of the sensitive personal data concerned, would “interfere with the GDPR regime” (at §131). Therefore, in this specific instance, Russmedia could not invoke the e-Commerce Directive’s hosting liability safe harbour to defend against the claim for breach of its obligations as a controller under the GDPR.

Why the precedential value of the judgment should not be overstated

A number of findings of the Court require a detailed analysis and raise some challenging interpretations of the GDPR and the e-Commerce Directive. For example:

  • The Court adopted a broad interpretation of the concept of “controller” under GDPR and applied it to the very specific set of facts and circumstances of the case. The fact that Russmedia’s general terms and conditions gave it “considerable freedom to exploit the information published on [its] marketplace […] for its own advertising and commercial purposes” (at §67), in combination with the specific architecture of the online marketplace, seem to have been determining factors. In reaching its conclusion, the Court did not clearly differentiate between the roles of the key actors during the different stages of processing of the personal data in question (e.g., the placement of the ad by the third-party advertiser vs. any subsequent processing by the marketplace operator for its own purposes).[5] This stands in stark contrast to a seemingly more measured approach taken by Advocate General (AG) Szpunar in his opinion. The AG opined that the third-party advertiser alone determined the purpose of the ad, since Russmedia had no knowledge of why the advertiser would post the ad. The AG also more clearly distinguished the role of the marketplace operator when processing sensitive personal data contained in ads from its role when processing personal data of advertisers (e.g., when creating or managing their accounts) and, on that basis, concluded that Russmedia qualified as a processor (not a controller) in relation to the processing of sensitive personal data contained in ads posted on the online marketplace.[6]
  • The Court appears to have moved very quickly from qualifying the online marketplace operator as “controller” to subsequently grounding several potentially far-reaching and highly specific ex-ante screening and due diligence obligations for data controllers processing sensitive personal data, in the much more general GDPR principles of accountability, data protection by design and by default, and data security (in particular Articles 5(2), 24, 25 and 32 of GDPR).
  • The exclusion of GDPR breaches from the hosting liability safe harbour is dealt with only briefly – almost in passing (at §§129-136) – and could have benefited from more elaborate analysis, in particular regarding the potential impact of the exclusion to the careful balance struck by the EU legislator in respect of the liability of intermediary service providers under the e-Commerce Directive.[7]

Moreover, the judgment is fundamentally predicated on several highly specific facts, which were highlighted by the Court itself:

  • The Court went out of its way to stress the particular sensitivity of the personal data in question and the severity of the consequences for the data subject (see, for example, at §§47-53 and 90-96). The judgment should be read in a context where the Court had already signalled that it would be a champion of European data protection rights in a world where the harmful effects of online harassment are becoming increasingly severe and visible. The findings of the Court should therefore not necessarily be extrapolated to apply to all types of personal data or all data processing activities subject to GDPR.
  • To come to the conclusion that Russmedia was a “joint controller” in relation to the processing of the sensitive personal data included in the harmful ad in question, the Court analysed in considerable detail the specific manner in which Russmedia operated its online marketplace. Relevant elements taken into account by the Court included – as set out above – the broad rights Russmedia reserved for itself in relation to further processing of personal data included in ads, the specific architecture of the online marketplace, as well as the fact that there appear to have been few constraints on anonymous advertisers placing potentially harmful and false ads on the online marketplace in a way that means injured parties have no recourse to, or way of identifying, such malicious third-party advertisers (see, for example, at §§69-73).
  • The Court was asked to rule on the e-Commerce Directive, which governed the underlying facts back in 2018. The hosting liability safe harbour provisions of the e-Commerce Directive have since been replaced by the Digital Services Act.[8]

The precedential value of the judgment should therefore not be overstated:

  • Other online marketplaces may be operated in a different manner, have a different architecture and content limitations, and may therefore not qualify as “controller” in relation to the processing of sensitive personal data included in ads placed on their platforms by third parties.
  • Most ads will not contain any sensitive personal data, and are therefore much less likely to cause the type of severe harm to data subjects which was at issue here. Those ads would not trigger the same requirements that the Court seems to impose on Russmedia in this specific case.
  • The e-Commerce Directive has been replaced by the DSA. Although the DSA incorporated hosting liability safe harbour provisions that mirror to a large extent the equivalent language in the e-Commerce Directive, there are some important textual differences that may provide scope for broader protection under the DSA. If the same facts as those at issue in this case were to occur today, the analysis under the DSA may be different and more nuanced.[9] Case law on the hosting liability safe harbour (even some of the other recent e-Commerce Directive rulings from the CJEU) appears to be evolving to take into account technological advancements and the practical architectural realities of today’s online marketplaces and content hosting platforms.

Practical takeaways for operators which are nevertheless impacted by the judgment

The findings of the Court were limited to general findings of law, since the judgment was in response to a request for a preliminary ruling from the Romanian court of appeal. It therefore remains to be seen how these findings will be applied by national courts and data protection authorities to specific fact patterns sufficiently similar to the ones at issue in Russmedia.

For example, the Court did not specify how operators of online marketplaces should operationalise the requirements summarised above. Several of those requirements – such as preventing ads from being scraped or pre-screening ads for sensitive personal data before they are published – appear difficult to reconcile with how online marketplaces and the AdTech ecosystem operate in reality, and with what is (and may in the future become) technically feasible at scale.

Moreover, the GDPR neither compels organisations to do the impossible nor requires absolute data protection in any and all circumstances. The GDPR allows due account to be taken of “the state of the art, the cost of implementation and the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing” of personal data (Articles 25 and 32 of GDPR).[10] Accordingly, we expect that a key battleground will remain the issue of what measures are technically feasible and proportionate considering the “state of the art”. The Russmedia judgment still offers considerable leeway on how to ensure GDPR compliance, even for operators whose online platforms may fall within the specific scope of the judgment.


[1] See §§30 and 31 of the Judgment of December 2, 2025, Russmedia Digital and Inform Media Press, Case C-492/23, available here.

[2] The Court came to the unsurprising conclusion that the data in question qualified as special category personal data since they concerned the data subject’s sex life and sexual orientation. The fact that the data was untrue and harmful did not change that conclusion (see Judgment, § 53). There is an active debate, however, on how broadly the concept of special category personal data should be interpreted under the GDPR, including in the context of the preparation of the EU’s proposed Digital Omnibus Package (which we commented on in an earlier blog post “Reset or rollback: Unpacking the EU’s Digital Omnibus Package”).

[3] Or that another exception under Article 9(2) of GDPR is satisfied that can be relied on to justify the publication without consent, which seems rather theoretical in the context of an online marketplace such as the one operated by Russmedia as described in the Judgment.

[4] The Court held that, to this end, the operator “must consider in particular all technical measures available in the current state of technical knowledge that are apt to block the copying and reproduction of online content” (§122).

[5] The Court held that the anonymous third-party advertiser was also a “joint controller”, together with Russmedia (see Judgment, §§54-75), and clarified that “the existence of joint responsibility does not necessarily imply equal responsibility”(§63), leaving it to the national court to determine the exact extent of Russmedia’s responsibility in the case at hand; On earlier CJEU case-law adopting a comparably extensive interpretation of joint controllership, see our earlier blog post “EU Court of Justice confirms earlier case law on broad interpretation of “personal data” and offers extensive interpretation of “joint controllership”, with possible broad ramifications in the AdTech industry and beyond”.

[6] See §111 and following of the AG opinion of February 6, 2025, available here.

[7] For example, even though the Court held that the requirements imposed on Russmedia “cannot, in any event, be classified as […] a general monitoring obligation” prohibited by Article 15 of the e-Commerce Directive, this can certainly be debated.

[8] Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act); In accordance with Article 89 of the Digital Services Act (DSA), references to Articles 12 to 15 of the e-Commerce Directive (Directive 2000/31/EC) are now to be construed as references to Articles 4, 5, 6 and 8 of the DSA.

[9] The AG also hinted at this in §160 of his opinion, by pointing to the textual differences between the e-Commerce Directive and the DSA.

[10] Even the Court admitted, in respect of the anti-scraping measures referenced above, that “the unlawful dissemination of personal data initially published online is [not] sufficient to conclude that the measures adopted by the controller concerned were not appropriate” (at §123).

Dozens of groups call for governments to protect encryption 

By: djohnson
17 November 2025 at 15:56

On Monday, more than 60 digital commerce and trade groups called on governments around the globe to reject efforts or requests to weaken or bypass encryption, saying strongly encrypted communications provide critical protections for user privacy and data security, and underpin the trust behind some of society’s most important interactions.

“Encryption is a vital tool for ensuring that consumers, businesses and governments can confidentially engage online, fostering a secure environment that supports economic growth and cross-border collaboration,” the groups wrote.

The letter, signed by The App Association, the Business Software Alliance, the Information Technology Industry Council, the Surveillance Technology Oversight Project and others, argues that the tradeoffs in privacy and security to all users would outweigh the benefits to law enforcement, stating “any effort to undermine encryption, whether through backdoors, key escrow systems, or technical mandates, undermines that trust.”

While policymakers in the U.S. and other democracies have been debating the question of “lawful access” to encrypted data for decades, the letter comes as countries in Europe and other parts of the world have made moves over the past year to regulate or mandate some form of legalized access for criminal and national security investigations.

This year, Apple removed its end-to-end encrypted Advanced Data Protection feature from the UK, part of a running dispute with British officials over access to encrypted iCloud data for national security investigations. Over the past three decades, the U.S. and governments around the world have come up with a range of technological proposals for gaining access to encrypted communications for law enforcement and national security investigations, from the Clipper Chip to key escrow systems.

In August, Director of National Intelligence Tulsi Gabbard claimed to have persuaded British officials to reverse their position, but the next month Apple reiterated its plans to remove the advanced encryption plan from UK devices, saying it “remains committed to offering our users the highest level of security for their personal data and we are hopeful that we will be able to do so in the future in the United Kingdom.”

“As we have said many times before, we have never built a backdoor or master key to any of our products or services and we never will,” Apple’s statement reads.

Across St. George’s Channel, Ireland’s Minister of Justice Jim O’Callaghan is reportedly working on a proposal that would grant access to encrypted data to An Garda Síochána, the country’s national police and security service.

Details of that proposal have not been publicized, but in a speech in July, O’Callaghan outlined his views on encryption, saying that the right to privacy cannot be allowed to become “sacrosanct” when it comes to law enforcement investigations and that there is “a need to grapple with the question of what data we will permit [police] to access, and what systems, protections and oversights should be in place.”

“None of us would like to imagine living in a surveillance State, with all of our private life – our thoughts, our communications, our interests – being observed and recorded,” O’Callaghan said. “But neither, I think, would we like to imagine people who have taken or plan to take the lives of others continuing to walk free with impunity, as a result of an inability on the part of Gardaí to effectively investigate their crimes.”

Last month, the European Union came close to passing a new regulation, called Chat Control, that would have given governments broad authority to mass scan user devices for Child Sexual Abuse Material (CSAM).

Digital groups said the regulation would mark “the end” of privacy in Europe and threaten journalists, human rights activists, political dissidents, domestic abuse survivors and other victims who rely on the technology for legitimate means. Germany, a critical swing vote, later came out against the proposal, and EU proponents canceled the vote.

The post Dozens of groups call for governments to protect encryption appeared first on CyberScoop.

Data Act FAQs – Key Takeaways for Manufacturers and Data Holders

On 3 February 2025, the European Commission (“EC”) published an updated version of its frequently asked questions (“FAQs”) on the EU Data Act.[1]  The Data Act, which is intended to make data more accessible to users of IoT devices in the EU, entered into force on 11 January 2024 and will become generally applicable as of 12 September 2025.

The FAQs, first published in September 2024, address the key concepts of “connected product” and “related service.” The latest iteration of the FAQs contains incremental updates which provide greater insight into how the EC believes that manufacturers and data holders should interpret their obligations under the Data Act.

Key Takeaways for Manufacturers and Data Holders

  1. “Connected Products” include various smart devices, such as smartphones and TVs.[2]  The FAQs acknowledge the broad definition of connected products under the Data Act and provide examples of devices that fall within this category. In particular, despite ambiguity created by previous iterations of the Data Act, the EC has confirmed its view in the FAQs that devices such as smartphones, smart home devices and TVs are in scope as connected products.
  2. Two conditions must be satisfied for a digital service to constitute a “Related Service.”[3]  It is expressly noted that the following conditions must be satisfied for a digital service to be a related service: (a) there must be a two-way exchange of data between the connected product and the service provider, and (b) the service must affect the connected product’s functions, behaviour, or operation. The FAQs also provide several factors that could help businesses determine whether a digital service is a related service, including user expectations for that product category, replaceability of the digital service, and pre-installation of the digital service on the connected product. Although these factors are not determinative, they may provide helpful guidance to businesses assessing whether their services fall within this definition (for example, if the service can easily be replaced by a third-party alternative, it may not meet the threshold of a related service). Ultimately, the EC has noted that practice and courts’ interpretations will play an essential role in further delineating whether a digital service is a related service – so time will tell.
  3. Manufacturers have some discretion as to whether data will be directly or indirectly accessible.[4]  Importantly, the FAQs suggest that manufacturers/providers have a significant degree of discretion as to whether to design or redesign their connected products or related services to provide direct access to data. The FAQs list certain criteria which can be taken into account when determining whether to design for direct access[5] or indirect access.[6] In this respect, the FAQs note that the wording of Article 3(1) (access by design) leaves flexibility as to whether design changes need to be implemented, and it is acknowledged that data holders may prefer to offer indirect access to the data. It is also noted that the manufacturer may implement a solution that “works best for them” and consider, as part of its assessment, whether direct access is technically possible, the costs of potential technical modifications, and the difficulty of protecting trade secrets or intellectual property or of ensuring the connected product’s security.
  4. Readily available data without disproportionate effort.[7]  The FAQs confirm the position that readily available data is “product data and related service data that a data holder can obtain without disproportionate effort going beyond a simple operation.”  The EC provided some further clarity by highlighting that only data generated or collected after the entry into application of the Data Act (i.e., after 12 September 2025) should be considered “readily available data” as the definition does not include a reference to the time of their generation or collection. However, the FAQs do not provide further clarity on what would constitute “disproportionate effort” – arguably leaving businesses with further discretion to interpret this in the context of their products and services.
  5. Data made available under the Data Act should be ‘easily usable and understandable’ by users and third parties.[8]  The FAQs expressly note that data holders are required to share data of the same quality as they make available to themselves, in order to facilitate the use of the data across the data economy. This indicates that raw and pre-processed data may require some additional investment to be usable. However, the FAQs make clear that there is no requirement for data holders to make substantial investments into such processes. Indeed, it may be the case that where the level of investment into processing the data is substantial, the Chapter II obligations may not apply to that data.
  6. Data generated outside of the EU may be subject to the Data Act.[9]  The EC’s position is that when a connected product is placed on the market in the EU, all the data generated by that connected product both inside and outside the EU will be subject to the Data Act. For example, if a user purchases a smart appliance in the EU and subsequently takes it to the US with them on vacation, any data generated by the use of the appliance in the US would also fall within the scope of the Data Act.
  7. Manufacturers will not be data holders if they do not control access to the data.[10]  It is explained in the FAQs that determining who is the data holder depends on who “controls access to the readily available data”. In particular, the FAQs acknowledge that manufacturers may contract out the role of “data holder” to a third party for all or part of their connected products. This seems to suggest that where the manufacturer does not control access to the readily available data, it will not be a data holder. In addition, a related service provider that is not the manufacturer of the connected product may also be a data holder if it controls access to readily available data that is generated by the related service it provides to the user. The FAQs further confirm that there may be instances where there is no data holder, i.e., in the case of direct access, where only the user has access to data stored directly on the connected product without the involvement of the manufacturer.
  8. Data holders can use non-personal data for any purpose agreed with the user (subject to limited exceptions).[11]  The FAQs reaffirm the position that a data holder can use the non-personal data generated by the user for any purpose, provided that this is agreed with the user.[12]  However, the data holder must not use such data to derive insights about the economic situation, assets or production methods of the user, or use the data in any other manner that could undermine the commercial position of the user. Where data generated by the user includes personal data, data holders should ensure any use of such data is in compliance with the EU GDPR. To ensure compliance with the GDPR, data holders may apply privacy-enhancing technologies (“PETs”); however, the EC’s view is that applying PETs does not necessarily mean that the resulting data will be considered ‘derived’ or ‘inferred’ such that they would fall out-of-scope of the Data Act.
  9. Users may be able to request access to data from previous users of their connected product.[13]  The FAQs note that the Data Act “can be read as giving users the right to access and port readily available data generated by the use of connected objects, including data generated by other users before them.” Subsequent users may therefore have a legitimate interest in such data, for example, in respect of updates or incidents. However, the rights of previous users and other applicable law (e.g., the right to be forgotten under the EU GDPR) must be respected. Moreover, data holders are able to delete certain historical data after a reasonable retention period.[14] 

Although the initial set of FAQs, and the subsequent incremental updates, provide further guidance for businesses whose products or services may fall in scope of the Data Act, there are still areas of uncertainty that are yet to be addressed. As the FAQs are a “living document”, they may continue to be updated as and when the EC deems it necessary. It is also important to note that while the FAQs provide some useful guidance on Data Act interpretation, the Data Act is subject to supplemental domestic implementation and enforcement by national competent authorities of EU member states. Businesses should therefore pay careful attention to guidance published by national authorities in the member states and sectoral areas in which they operate.


[1] See https://digital-strategy.ec.europa.eu/en/library/commission-publishes-frequently-asked-questions-about-data-act.

[2] See Question 7 of the FAQs.

[3] See Question 10 of the FAQs.

[4] See Questions 17 and 22 of the FAQs.

[5] I.e., ‘where relevant and technically feasible’ the user has the technical means to access, stream or download the data without the involvement of the data holder. For further information, see Article 3(1) of the Data Act.

[6] I.e., the connected product or related service is designed in such a way that the user must ask the data holder for access. For further information, see Article 4(1) of the Data Act.

[7] See Question 4 of the FAQs.

[8] See Question 5 of the FAQs.

[9] See Question 9 of the FAQs.

[10] See Question 21 of the FAQs.

[11] See Questions 13 and 29 of the FAQs.

[12] See also Article 4(13) of the Data Act.

[13] See Question 33 of the FAQs.

[14] See Recital 24 of the Data Act.

Cybersecurity Law Enters Into Force

On July 17, 2024, Law No. 90/2024 containing provisions for strengthening national cybersecurity and addressing cybercrime (the “Cybersecurity Law”) entered into force.

The new legislation strengthens national cybersecurity at a time when cyber-attacks have increased significantly.[1]

The Cybersecurity Law:

  1. seeks to strengthen the resilience of (a) public administrations, (b) operators that are subject to the application of the Italian National Cybersecurity Perimeter (“Perimeter”) legislation, (c) operators of essential services and providers of digital services, as defined in Italian Legislative Decree No. 65/2018, which implements Directive (EU) 2016/1148 on security of network and information systems (“NIS 1 Operators”) and (d) operators providing public communications networks or publicly accessible electronic communications services (“Telecommunication Operators”), by establishing detailed rules on public procurement of IT goods and services that are essential for the protection of national strategic interests;
  2. imposes new incident reporting obligations;
  3. increases the role of the National Cybersecurity Agency (the “NCA”);
  4. enhances data security measures by establishing the National Cryptographic Center; and
  5. significantly focuses on the fight against cybercrime by increasing penalties for existing criminal offenses and introducing new criminal offenses in relation to individuals and entities under Italian Legislative Decree No. 231/2001 (“Decree 231”).

The Cybersecurity Law provisions are in addition to the existing Italian cybersecurity regulatory framework, which includes, as mentioned, the Perimeter legislation (Decree Law No. 105/2019),[2]  the Digital Operational Resilience Act (Regulation (EU) 2022/2554, “DORA”), and Italian Legislative Decree No. 65/2018, which implements the NIS 1 Directive.[3]

1. Scope

The Cybersecurity Law imposes obligations on Public Administrations[4] and on in-house companies that provide Public Administrations with: IT services; transportation services; urban, domestic or industrial wastewater collection, disposal or treatment services; and waste management services (“Public Operators”). These in-house companies are included within the scope of the law as they are considered to be critical infrastructure providers, in relation to which cybersecurity vulnerabilities may impact the entire supply chain of goods and services.

In addition, the Cybersecurity Law increases some of the obligations imposed on NIS 1 Operators, Telecommunication Operators and operators included in the Perimeter.

2. Incident reporting obligation

According to Article 1 of the Cybersecurity Law, Public Operators are required to report to the NCA all incidents impacting networks, information systems, and IT services listed in the taxonomy included in the NCA Resolution.[5]

Public Operators must submit an initial report within 24 hours of becoming aware of the incident and a complete report within 72 hours, using the channels available on the NCA website.

Public Operators may also voluntarily report incidents not included in the NCA Resolution taxonomy. These voluntary reports are processed only after mandatory ones to avoid unduly burdening the Italian Computer Security Response Team. Furthermore, submitting a voluntary report shall not impose on the notifying party any obligations beyond those that would have applied had the report not been submitted.[6]

In the case of non-compliance with the reporting obligation, Article 1(5) of the Cybersecurity Law requires the NCA to issue a notice to the Public Operator, informing it that repeated non-compliance over a 5-year period will result in an administrative fine ranging from €25,000 to €125,000. Additionally, the NCA may conduct inspections within 12 months of identifying a delay or omission in compliance with the reporting obligation to verify that the Public Operator has taken steps to enhance resilience against the risk of incidents.

The incident reporting obligation takes effect immediately for central public administrations included in the Italian National Institute of Statistics (“ISTAT”) list, as well as for regions, the autonomous provinces of Trento and Bolzano, and metropolitan cities. For all other Public Operators, this obligation will take effect 180 days after the law enters into force.

Under Article 1 of the Cybersecurity Law, the reporting obligation is extended to more entities than those included in the Perimeter. In addition, the amendment to Article 1(3-bis) of Italian Decree-Law No. 105/2019 (establishing the Perimeter) extends the reporting procedure and timeframes set out in the Cybersecurity Law (initial reporting within 24 hours and complete reporting within 72 hours) to incidents that affect networks, information systems, and IT services other than ICT Assets[7] of entities included in the Perimeter.

The reporting obligation under Article 1 of the Cybersecurity Law does not apply to (i) NIS 1 Operators; (ii) operators included in the Perimeter, in relation to incidents affecting ICT Assets (for which the provisions of the Perimeter legislation remain applicable); (iii) State bodies in charge of public and military security; (iv) the Department of Security Information; and (v) the External and Internal Information and Security Agencies.

3. Addressing cybersecurity vulnerabilities reported by the NCA

The Cybersecurity Law outlines how to handle reports of the NCA addressed to Public Operators, entities included in the Perimeter, and NIS 1 and Telecommunication Operators.

In particular, the NCA may identify specific cybersecurity vulnerabilities that could affect the abovementioned recipients. These entities are required to promptly address the identified vulnerabilities within a maximum of 15 days, unless justified technical or organizational constraints prevent them from doing so immediately or necessitate postponement beyond the specified deadline.

Failure to comply with this provision will result in an administrative fine ranging from €25,000 to €125,000.

4. Contact person and cybersecurity structure

Public Operators must establish a cybersecurity structure and designate a cybersecurity contact person (with specific expertise). This contact person, whose name must be communicated to the NCA, will be the NCA’s contact point for cybersecurity matters.

The obligations introduced for Public Operators are similar to those provided for entities included in the Perimeter. For instance, Public Operators are required to: (i) implement internal information security policies; (ii) maintain an information risk management plan; (iii) set out the roles and responsibilities of the parties involved; (iv) implement actions to enhance information risk management based on NCA guidelines; and (v) continuously monitor security threats and system vulnerabilities to ensure timely security updates when necessary.

5. Enhancing data security measures

Public Operators, as well as operators included in the Perimeter and NIS 1 Operators, must verify that computer and electronic communication programs and applications use cryptographic solutions that comply with the guidelines on encryption and password storage issued by the NCA and the Data Protection Authority. In particular, in order to prevent encrypted data from being accessible to third parties, these entities must also ensure that the applications and programs specified in the regulation are free from known vulnerabilities.

Within the framework of the national cybersecurity strategy, the NCA has an increased role in promoting cryptography. This involves the development of standards, guidelines, and recommendations to strengthen information system security. Furthermore, the NCA conducts evaluations of cryptographic system security and coordinates initiatives aimed at advocating for cryptography as a critical cybersecurity tool.

For this purpose, the Cybersecurity Law provides for the creation of a National Cryptographic Center within the NCA, which operates under the guidelines set out by the NCA’s General Director.

6. Public procurement of ICT goods, systems and services

When procuring certain categories of ICT goods, systems and services for activities involving the protection of strategic national interests, public administrations, public service operators, publicly controlled companies,[8] and entities included in the Perimeter must ensure that the ICT goods and services acquired comply with particular criteria and technical standards, thereby safeguarding the confidentiality, integrity, and availability of processed data. These essential cybersecurity standards will be set out in a DPCM, to be adopted within 120 days of the Cybersecurity Law coming into force.

This new obligation stands alongside the existing requirement for entities included in the Perimeter to carry out an evaluation process through the Centre for National Evaluation and Certification (the “CVCN”) to ensure the security of ICT Assets intended for deployment under the Perimeter, as set out in the DPCM dated June 15, 2021. Accordingly, entities under the Perimeter are required, in addition, to assess compliance with essential cybersecurity standards outlined in the abovementioned DPCM for ICT goods and services that are not subject to CVCN evaluation.

7. Restrictions on personnel recruitment

The Cybersecurity Law introduces several restrictions on the hiring by private entities of individuals who have held specific roles within certain central public administrations; a breach of these restrictions will render the relevant contract null and void (Articles 12 and 13).

For instance, the Cybersecurity Law precludes NCA employees who have attended specific specialized training courses, in the interest and at the expense of the NCA, from taking cybersecurity-related positions with private entities for a period of two years from the last training course.

8. Amendments to the DORA Regulation scope

Lastly, the Cybersecurity Law amends the law implementing the DORA Regulation to bring within its scope, in addition to “financial entities”, financial intermediaries[9] and Poste Italiane S.p.A. in relation to its Bancoposta business.

The objective of this amendment is to ensure a high level of digital operational resilience and to maintain stability across the financial sector. Consequently, in the exercise of the delegated power, the Government will make the appropriate adjustments and additions to the regulations governing these entities to align their operational resilience measures with those outlined in the DORA Regulation. These changes will apply to the activities undertaken by each entity concerned. Additionally, the Bank of Italy will assume supervisory, investigative, and sanctioning responsibilities over these entities.

9. Main amendments to the regulation on cybercrime

The Cybersecurity Law strengthens the fight against cybercrime by introducing significant amendments to both the Italian Criminal Code (the “ICC”) and the Italian Code of Criminal Procedure (the “ICCP”).

In particular, the Cybersecurity Law:

  • Increases criminal penalties for a range of cybercrimes, including the crime of unauthorized access to computer systems and the crime of destruction of computer data, information, and programs;
  • Introduces new aggravating circumstances.  It extends the aggravating circumstance which applies when the crime is committed “by a public official or a person in charge of a public service, through abuse of power or in violation of the duties of his or her position or service, by a person who, also abusively, exercises the profession of private investigator, or by abuse of the position of computer system operator”, to apply to all cybercrimes covered by the Cybersecurity Law.  It introduces a new aggravating circumstance for the crime of fraud in cases where the act is committed remotely by means of computer or telematic tools capable of impeding one’s own or another’s identification.[10] It also increases the penalties provided for the existing aggravating circumstances;
  • Introduces two new mitigating circumstances (Articles 623-quater and 639-ter ICC), applicable to specific cybercrimes,[11] which can reduce penalties by (i) up to one-third if the crime can be considered to be “minor” because of the manner in which it was committed, or if the damage or risk is particularly insignificant;  (ii) from one-half to two-thirds if the offender takes steps to prevent further consequences of the crime. This includes actively assisting the authorities in gathering evidence or recovering the proceeds of the crime or the instruments used to commit the crime;
  • Repeals Article 615-quinquies ICC, which punishes the unlawful possession, distribution and installation of instruments, devices or programs designed to damage or interrupt a computer or telematic system, and replaces it with the new criminal offense outlined in Article 635-quater.1 ICC; [12]
  • Introduces the new crime of cyber-extortion (Article 629(3) ICC), which punishes by imprisonment of 6 to 12 years and a fine of € 5,000 to € 10,000 (penalties that may be increased if certain aggravating circumstances are met)[13] anyone who, by committing or threatening to commit specific cybercrimes,[14] forces another person to do or refrain from doing something in order to obtain an unjust benefit for himself or herself or for others to the detriment of others. For example, the new crime could apply in cases where a person, having hacked into a computer system and manipulated or damaged information, data or programs, demands a ransom for the restoration of the computer system and its data.

In addition, the Cybersecurity Law provides for: (i) the allocation of the preliminary investigation of cybercrimes to the district prosecutor’s office; (ii) the application of a “simplified” system for granting an extension of the preliminary investigation period for cybercrimes;[15] and (iii) the extension of the maximum period for preliminary investigation to two years.

10. Amendments to Decree 231 and next steps for companies

The Cybersecurity Law introduces significant amendments to Decree 231. In particular, the Cybersecurity Law:

  • Increases the penalties for cybercrimes established by Article 24-bis of Decree 231, providing for (i) a maximum fine of € 1,084,300 for the offenses referred to in Article 24-bis(1)  of Decree 231,[16] and (ii) a maximum fine of € 619,600 for the offenses referred to in Article 24-bis(2) [17]  of Decree 231;[18]
  • Expands the list of crimes that may trigger liability for companies and other legal entities under Decree 231, by including the new crime of cyber-extortion (new Article 24-bis(1-bis) of Decree 231), which is subject to the following penalties: (i) a maximum fine of € 1,239,200, and (ii) disqualification penalties set out in Article 9(2) of Decree 231 (i.e., disqualification from conducting business; suspension or revocation of authorizations, licenses or concessions instrumental to the commission of the crime; prohibition from entering into contracts with the public administration; exclusion from grants, loans, contributions and subsidies with the possible revocation of those already granted; and ban on advertising goods and services) for a period of at least two years.

In light of these developments, companies should consider reviewing and updating their policies and procedures to ensure that they are adequate to prevent new offenses that may trigger liability under Decree 231. In particular, companies should consider implementing new and more specific control measures, in addition to those already in place to prevent the commission of cybercrimes (which may already constitute a safeguard, even with respect to the newly introduced crime of cyber-extortion). Measures may include ensuring the proper use of IT tools, maintaining security standards for user identity, data integrity and confidentiality, monitoring employee network usage, and providing targeted information and training to company personnel.

11. Conclusion

The new Cybersecurity Law, while fitting into a complex regulatory framework that will require further changes in the short term (notably, the NIS 2 Directive must be implemented by October 2024), nevertheless represents a concrete response to the sudden and substantial increase in cyber threats. In particular, the expansion of incident reporting requirements to include new stakeholders and the introduction of stricter reporting deadlines for incidents not affecting ICT Assets aim to enhance national cyber resilience and security. This approach ensures that critical infrastructure providers have better control over cybersecurity incidents.

The increased penalties for cybercrimes, the introduction of new criminal offenses, and the developments regarding corporate liability under Decree 231 are also consistent with the above objectives. These measures are intended to tackle the increasing threat of cybercrime, although their effectiveness in practice remains to be seen.


[1] According to the 2024 Report published by the Italian Association for Information Security (“CLUSIT”), in 2023 cyber-attacks increased by 11% globally and by 65% in Italy.

[2] Together with the relevant implementing decrees: Italian President of the Council of Ministers’ Decree (“DPCM”) No. 131 of July 30, 2020; Italian Presidential Decree (“DPR”) No. 54 of February 5, 2021; DPCM No. 81 of April 14, 2021; Italian Legislative Decree No. 82 of June 14, 2021; DPCM of June 15, 2021; DPCM No. 92 of May 18, 2022; and the NCA Resolution of January 3, 2023 (the “NCA Resolution”).

[3] However, the Cybersecurity Law does not specifically refer to Directive (EU) 2022/2555 (the “NIS 2 Directive”), which Member States are required to implement by October 17, 2024.

[4] Specifically, according to the Cybersecurity Law, the following are considered public administrations: central public administrations included in ISTAT annual list of public administrations; regions and autonomous provinces of Trento and Bolzano; metropolitan cities; municipalities with a population of more than 100,000 inhabitants and in any case, regional capitals; urban public transportation companies with a catchment area of not less than 100,000 inhabitants; suburban public transportation companies operating within metropolitan cities; and local health care companies.

[5] See https://www.gazzettaufficiale.it/eli/id/2023/01/10/23A00114/sg.

[6] See Article 18, paragraphs 3, 4 and 5 of Italian Legislative Decree No. 65/2018.

[7] Defined, in accordance with Art. 1, letter m) of DPCM 131/2020, as a “set of networks, information systems and information services, or parts thereof, of any nature, considered unitarily for the purpose of performing essential functions of the State or for the provision of essential services”.

[8] Operators referred to in Article 2(2) of the Digital Administration Code (Italian Legislative Decree No. 82/2005).

[9] Listed in the register provided for in Article 106 of the Consolidated Law on Banking and Credit, referred to in Italian Legislative Decree No. 385/1993.

[10] New paragraph 2-ter of Article 640 ICC.

[11] In particular, Article 623-quater ICC applies to the criminal offenses set out in Articles 615-ter (Unauthorized access to a computer or telematic system), 615-quater (Possession, distribution and unauthorized installation of tools, codes and other means of access to computer or telematic systems), 617-quater (Unlawful interception, obstruction, or disruption of computer or telematic communications), 617-quinquies (Possession, distribution and unauthorized installation of tools and other means to intercept, obstruct or interrupt computer or telematic communications) and 617-sexies ICC (Falsifying, altering or suppressing the content of computer or telematic communications). Article 639-ter ICC instead applies to the criminal offenses set out in Articles 629(3) (new crime of cyber-extortion), 635-ter (Damage to information, data and computer programs of a public nature or interest), 635-quater.1 (Unauthorized possession, distribution, or installation of tools, devices, or programs designed to damage or interfere with a computer or telematic system) and 635-quinquies ICC (Damage to public utility computer or telematic systems).

[12] The new provision addresses the same conduct for which penalties were provided for under former Article 615-quinquies ICC and provides for the same penalties, with the addition of the aggravating circumstances set out in Article 615-ter(2.1) and Article 615-ter(3) ICC.

[13] In particular, a penalty of imprisonment of 8 to 22 years and a fine of € 6,000 to € 18,000 applies if the aggravating circumstances referred to in the paragraph 3 of Article 628 ICC (i.e., the aggravating circumstances provided for the crime of robbery) are met, or where the crime is committed against a person incapacitated by age or infirmity.

[14] That is, those set out in Articles 615-ter, 617-quater, 617-sexies, and 635-bis (Damage to computer information, data and programs), 635-quater (Damage to computer or telematic systems) and 635-quinquies ICC.

[15] In particular, the “simplified” regime is provided for under Article 406(5-bis) ICCP, which provides that the judge shall issue an order within ten days from the submission of the request for extension of the preliminary investigation period by the public prosecutor. This provision, which is reserved for particularly serious crimes, is intended to allow a more timely and effective investigation of the commission of the crime.

[16] That is, the crimes under Articles 615-ter, 617-quater, 617-quinquies, 635-bis, 635-ter, 635-quater and 635-quinquies ICC.

[17] That is, the crimes under Articles 615-quater and 635-quater(1) ICC.

[18] The disqualification penalties provided for these cybercrimes remain unchanged.

EHDS – The EU Parliament formally adopts the Provisional Agreement: Key Takeaways and Next Steps

In our Alert Memorandum of 19 July 2022 (available here), we outlined the European Commission’s (the “Commission”) proposal for a regulation on the “European Health Data Space” (the “Regulation” or the “EHDS”). The proposal, which was published in May 2022, is the first of nine European sector- and domain-specific data spaces set out by the Commission in 2020 in the context of its “European strategy for data”.

The EU is now reportedly aiming to conclude the EHDS dossier and adopt the Regulation before the end of the EU Parliament’s current term (June 2024). To this end, on 15 March 2024, the EU Council and the EU Parliament announced that they had reached a provisional agreement on the text of the Regulation (the text is available here). And on 24 April 2024, the EU Parliament formally adopted the text of the provisional agreement.

Background:

The proposed Regulation is an initiative that attempts to create a “European Health Union” to make it easier to exchange and access health data at EU level. The Regulation builds on other recent EU reforms such as the recently enacted Data Act and the proposed AI Act. It seeks to tackle legacy systemic issues that have hindered lawful access to electronic health data. It promotes the electronic exchange of health data by enhancing individuals’ access to and portability of these data and by enabling innovators and researchers to process these data through reliable and secure mechanisms. It contains rules that govern both primary use (i.e., use of such data in the context of healthcare) and secondary use of health data (e.g. use for non-healthcare purposes such as research, innovation, policy-making, statistics).

Recent Proposals:

On 6 December 2023, the EU Council issued a press release (available here) confirming the agreement on the EU Council’s position and its mandate to start negotiations with the EU Parliament as soon as possible in order to reach a provisional agreement on the proposed Regulation (see the EU Council’s proposed amendments here). Subsequently, on 13 December 2023, the EU Parliament finalised its proposed amendments to the Regulation (see the EU Parliament’s proposed amendments here).

Following the inter-institutional trilogue negotiations between the EU Parliament, the EU Council and the Commission, on 15 March 2024, the EU Council and the EU Parliament issued a press release (available here) confirming that they had reached a provisional agreement on the text of the Regulation. They introduced new rules and also modified or clarified some of the rules that were originally proposed (some of which were outlined in our Alert Memorandum of 19 July 2022).

Some of the highlights from the provisional agreement are as follows:

  • Scope of Prohibited Purposes: The new text expands and clarifies the scope of prohibited purposes for secondary use of health data. For instance, the Regulation now provides that the secondary use of health data to take decisions that will produce economic or social effects should be prohibited – this provides an additional prohibition on top of the original proposal, which intended to prohibit secondary use of health data only where the decisions produced “legal” effects. In addition, the Regulation further includes within the scope of the prohibited purposes: (i) decisions in relation to job offers; (ii) offering less favourable terms in the provision of goods or services; and (iii) decisions regarding conditions for taking out loans or any other discriminatory decisions taken on the basis of health data.
  • Categories of Personal Data subject to Secondary Use: As noted above, electronic health data can be subject to “secondary use”, and health data holders must make certain categories of electronic data available for secondary use. The EU Parliament and the EU Council confirmed in their provisional agreement that Member States will be able to establish trusted data holders that can securely process requests for access to health data in order to reduce the administrative burden. The text includes a number of amendments to the categories of electronic data that can be made available for secondary use.
  • IP and Trade Secrets:
    • The EU Commission’s first draft of the Regulation did not include specific measures to preserve the confidentiality of IP rights and trade secrets; however, the Regulation now includes a set of new provisions on the protection of IP rights and trade secrets (Recital 40c, Article 33a). Accordingly, where health data is protected by IP rights or trade secrets, the Regulation should not be used to reduce or circumvent such protection. The provisions impose, among other things, an obligation on the “health data access bodies”[1] to take all specific measures, including legal, organisational and technical measures, that are necessary to preserve the confidentiality of data entailing IP rights or trade secrets. Such legal, organisational and technical measures could include common electronic health data access contractual arrangements, specific obligations in relation to the rights that fall within the data permit, pre-processing the data to generate derived data that protects a trade secret (but still has utility for the user), or configuration of the secure processing environment so that such data is not accessible by the health data user. If granting access to electronic health data for secondary use would entail a serious risk, which cannot be addressed in a satisfactory manner, of infringing intellectual property rights, trade secrets and/or the regulatory data protection right, the health data access body must refuse access and explain the reason to the health data user (see Article 33a(1)(d) of the Regulation).
    • In addition, the Regulation now imposes additional obligations on health data holders[2] with respect to electronic health data that entail IP rights or trade secrets. For example, the original proposal required a health data holder to make the electronic data it holds available upon request to the health data access body in certain circumstances. The Regulation now requires health data holders, when communicating to the health data access body the dataset descriptions for the datasets they hold or, at the latest, following a request from the health data access body, to inform the health data access body of such IP rights or trade secrets, to indicate which parts of the datasets are concerned, and to justify the specific protection from which the data benefits.
    • The Regulation also requires health data access bodies to apply certain criteria when deciding whether to grant or refuse access to health data. These criteria include whether the request demonstrates sufficient safeguards to protect the health data holder and the natural persons concerned; whether there is a lawful basis under the GDPR in case of access to pseudonymised health data; and whether the requested data is necessary for the purpose described in the access application. In addition, the health data access body must also take certain risks into account when deciding on the request. The health data access body must grant data access where it concludes that the above-mentioned criteria are met and the risks it must take into account are sufficiently mitigated.
  • Transparency: The Regulation now intends to impose an additional obligation on the data holders to provide certain information to natural persons about their processing of personal health data. This information obligation is intended to supplement the transparency obligations that the data holders may have under the GDPR.
  • Right of access to personal electronic health data: The Regulation now adds the individuals’ right to download their electronic health data and specifies that the right of access to personal electronic health data in the context of the EHDS complements the right to data portability under Article 20 of the GDPR (see Recital 11). In this context, it should be noted that the GDPR right to data portability is limited to data processed on the basis of consent or contract – which excludes data processed under other legal bases, such as when the processing is based on law – and only concerns data provided by the data subject to a controller, excluding much inferred or indirect data, such as diagnoses or tests.
  • Right to opt-out and need to obtain consent: New Recital 37c and Article 35f provide patients with a right to opt-out of the processing of all their health data for secondary use, except for purposes of public interest, policy making, statistics and research purposes in the public interest. In addition, individuals shall be provided with sufficient information on their right to opt-out, including on the benefits and drawbacks when exercising this right. In addition, Member States may put in place stricter measures governing access to certain kinds of sensitive data, such as genetic data, for research purposes.
  • Data localisation: Data localisation requirements are imposed in Articles 60a and 60aa. These provisions require that personal electronic health data be stored, for the purposes of primary and secondary use of personal electronic health data, exclusively within the territory of the EU or in a third country, territory or one or more specified sectors within that third country covered by an adequacy decision pursuant to Article 45 of the GDPR. These changes are seemingly intended to address some of the concerns expressed by the European Data Protection Board (the “EDPB”) and the European Data Protection Supervisor (the “EDPS”) in their joint opinion of 12 July 2022. However, in certain ways the provisions seem to go beyond the recommendations of the EDPB/EDPS (for example, with respect to the localisation of data, the EDPB/EDPS opinion actually proposed to require that electronic health data be stored in the EEA, but to allow for transfers under Chapter V of the GDPR, i.e., including, for example, transfers under standard contractual clauses or under the derogations provided for in Article 49 of the GDPR).

Next steps:

The provisional agreement will now have to be endorsed by the EU Council. It has been reported that the aim of the institutions is to conclude the EHDS dossier and adopt the Regulation before the end of the EU Parliament’s term (June 2024).

Once formally adopted and published in the Official Journal of the EU, the EHDS will be directly applicable following a grace-period (currently, two years) after the entry into force of the Regulation (with the exception of certain provisions which will have different application dates).


[1] This is a body that Member States will set up to be responsible for granting access to electronic health data for secondary use.

[2] This means the natural or legal person that has the ability to make available data; however note that negotiations between the EU Parliament, the EU Council and the EU Commission are still ongoing on the definition of “data holders”.

EU Court of Justice confirms earlier case law on broad interpretation of “personal data” and offers extensive interpretation of “joint controllership”, with possible broad ramifications in the AdTech industry and beyond

On March 7, 2024, the Court of Justice of the European Union (the “CJEU”) handed down its judgment in the IAB Europe case, answering a request for a preliminary ruling under Article 267 TFEU from the Brussels Market Court.[1]  The case revolves around IAB Europe’s Transparency and Consent Framework (“TCF”) and has been closely monitored by the AdTech industry ever since the Belgian DPA investigated and subsequently imposed a 250,000 euro fine on IAB Europe for alleged breaches of GDPR and e-Privacy rules back in 2022.[2]

Factual Background

IAB Europe is a European-level standard setting association for the digital marketing and advertising ecosystem.  Back in 2018, when the GDPR became applicable, it designed the TCF as a set of rules and guidelines that addresses challenges posed by GDPR and e-Privacy rules in the context of online advertising auctions (such as real-time bidding).  The goal was to help AdTech companies that do not have any direct interaction with the website user (i.e., any company in the AdTech ecosystem that is not the website publisher, such as ad-networks, ad-exchanges, demand-side platforms) to ensure that the consent that the website publisher obtained (through cookies or similar technologies) is valid under the GDPR (i.e., freely given, specific, informed and unambiguous) and that, therefore, those AdTech companies can rely on that consent to serve ads to those users in compliance with GDPR and e-Privacy rules.

On a technical level, overly simplified, the TCF is used to record consent (or lack thereof) or objections to the reliance on legitimate interests under GDPR among IAB’s members by storing the information on consents and objections in a Transparency and Consent String (the “TC String”).  The TC String is a coded representation (a string of letters and numbers) of a user’s preferences, which is shared with data brokers and advertising platforms participating in the TCF auction protocol who would not otherwise have a way to know whether users have consented or objected to the processing of their personal data.[3]

First Question: Does the TC String constitute Personal Data?

The CJEU now ruled, echoing its earlier decision in Breyer,[4] that the TC String may constitute personal data under the GDPR to the extent those data may, by “reasonable means”, be associated with an identifier such as an IP address, allowing the data subject to be (re-)identified.  The fact that IAB Europe can neither access the data that are processed by its members under its membership rules without an external contribution, nor combine the TC String with other factors itself, did not preclude the TC String from potentially being considered personal data according to the CJEU.[5] 

Second Question: Does IAB Europe act as Data Controller?

Secondly, the Court decided that IAB Europe, as a sectoral organization proposing a framework of rules regarding consent to personal data processing, which contains not only binding technical rules but also rules setting out in detail the arrangements for storing and disseminating personal data, should be deemed a joint controller together with its members if and to the extent it exerts influence over the processing “for its own purposes” and, together with its members, determines the means behind such operations (e.g., through technical standards).  In the IAB Europe case, this concerns in particular the facilitation by IAB of the sale and purchase of advertising space among its members and its enforcement of rules on TC String content and handling.  It also seemed particularly relevant to the Court that IAB Europe could suspend membership in case of breach of the TC String rules and technical requirements by one of its members, which may result in the exclusion of that member from the TCF.

Further, in keeping with earlier CJEU case-law[6], the Court found it irrelevant that IAB Europe does not itself have direct access to the personal data processed by its members.  This does not in and of itself preclude IAB Europe from holding the status of joint controller under GDPR.

However, the Court also reiterated that joint controllership does not automatically extend to subsequent processing by third parties, such as – in this case – website or application providers further processing the TC String following its initial creation, unless the joint controller continues to (jointly) determine the purpose and means of that subsequent processing.  This is in line with the Court’s 2019 Fashion ID judgment.[7]  In addition, the Court opined that the existence of joint controllership “does not necessarily imply equal responsibility” of the various operators engaged in the processing of personal data.  The level of responsibility of each individual operator must be assessed in the light of all the relevant circumstances of the particular case, including the extent to which the different operators are involved at different stages of the data processing or to different degrees.  So not all joint controllers are created equal.

Key Takeaways

In our view, the first finding is not groundbreaking.  It largely confirms the Court’s previous case-law establishing that “personal data” must be interpreted broadly under GDPR, meaning the standard for truly “anonymized data” continues to be very high.  It will now be for the Brussels Market Court to determine whether, based on the specific facts of the IAB Europe case, the TC String indeed constitutes personal data.

The second finding may have caught more people off guard.  While it will again be up to the Brussels Market Court to determine whether IAB Europe is actually a joint controller in respect of the personal data alleged to be included in the TC String, the Court’s expansive interpretation of the concept of joint controllership (i.e., where “two or more controllers jointly determine the purposes and means of processing” (Article 26 GDPR)) could have broader ramifications beyond the AdTech industry. 

Organizations who until now have consistently taken the position that they do not qualify as a data controller in respect of data processing activities of their members, users or customers, may need to re-assess that position and, based on the specific factual circumstances relevant to them, consider whether they might in fact be subject to GDPR’s onerous obligations imposed on data controllers.  This may be particularly relevant for standard-setting bodies and industry associations active or established in Europe, potentially hampering their ability to continue developing relevant standards and rules.  Arguably, this could even capture certain providers or deployers of software and other computer systems, including those developing or deploying AI models and systems, in case they would be found to issue “binding technical rules” and “rules setting out in detail the arrangements for storing and disseminating personal data”, and they would actually enforce those rules against third parties using their models and systems to process personal data. 

Even if some solace can be found from a liability perspective in the confirmation by the Court that joint controllership relating to the initial collection of personal data does not automatically extend to the subsequent processing activities carried out by third parties, and that not all joint controllers are created equal, the compliance burden on “newfound joint controllers” may nevertheless be significant because key obligations on lawfulness, transparency, data security and accountability are triggered irrespective of the “degree” of controllership in question.

In our view that would take the concept of “joint controllership” too far beyond its literal meaning and originally intended purpose, but it remains to be seen which other enforcement actions will be taken and which other cases raising similar questions may find their way through the European courts in the coming months and years.


[1]           CJEU, judgment of March 7, 2024, IAB Europe, C-604/22, ECLI:EU:C:2024:214 (https://curia.europa.eu/juris/document/document.jsf?text=&docid=283529&pageIndex=0&doclang=FR&mode=req&dir=&occ=first&part=1&cid=167405).

[2]           For more information on the original case in front of the Belgian DPA, see the DPA’s dedicated landing page: https://www.dataprotectionauthority.be/iab-europe-held-responsible-for-a-mechanism-that-infringes-the-gdpr.

[3]           For more information, see the IAB Europe website: https://iabeurope.eu/.

[4]           CJEU, judgment of 19 October 2016, Breyer, C‑582/14, EU:C:2016:779, paragraphs 41-49 (https://curia.europa.eu/juris/document/document.jsf?text=&docid=184668&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=1303370).

[5]           Recital 26 of GDPR further clarifies that, “to ascertain whether means are reasonably likely to be used to identify the natural person, account should be taken of all objective factors, such as the costs of and the amount of time required for identification, taking into consideration the available technology at the time of the processing and technological developments.”  This will always require a fact-intensive, case-by-case inquiry, but it is now even more clear that “it is not required that all the information enabling the identification of the data subject must be in the hands of one person” (CJEU, IAB Europe judgment, §40).

[6]           CJEU, judgment of July 10, 2018, Jehovan todistajat, C‑25/17, EU:C:2018:551, paragraph 69 (https://curia.europa.eu/juris/document/document.jsf?text=&docid=203822&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=1305431), and CJEU; judgment of June 5, 2018, Wirtschaftsakademie Schleswig-Holstein, C‑210/16, EU:C:2018:388, paragraph 38 (https://curia.europa.eu/juris/document/document.jsf?text=&docid=202543&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=1305548).

[7]           CJEU, judgment of July 29, 2019, Fashion ID, C‑40/17, EU:C:2019:629, paragraph 74 (https://curia.europa.eu/juris/document/document.jsf?text=&docid=216555&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=1305826), as commented on in our earlier blog post here: https://www.clearycyberwatch.com/2019/08/cjeu-judgment-in-the-fashion-id-case-the-role-as-controller-under-eu-data-protection-law-of-the-website-operator-that-features-a-facebook-like-button/; See also the EDPB Guidelines 07/2020 on the concepts of controller and processor in the GDPR (version 2.1, adopted on July 7, 2021), in relation to the concept of “converging decisions”, at paragraphs 54-58 (https://www.edpb.europa.eu/system/files/2023-10/EDPB_guidelines_202007_controllerprocessor_final_en.pdf).
