Digital Omnibus: EU Commission wants to wreck core GDPR principles
Here is a first overview of the main problems:

(1) A New GDPR Loophole via "Pseudonyms" or "IDs"

The Commission proposes to significantly narrow the definition of "personal data" – which would result in the GDPR no longer applying to many companies in various sectors. For example, sectors that currently operate via "pseudonyms" or random ID numbers, such as data brokers or the advertising industry, would no longer be (fully) covered. This would be done by adding a "subjective approach" to the text of the GDPR.

Instead of an objective definition of personal data (i.e. data that is linked to a directly or indirectly identifiable person), a subjective definition would mean that if a specific company claims that it cannot (yet) identify a person, or does not currently aim to do so, the GDPR ceases to apply. Such a case-by-case decision is inherently more complex and anything but a "simplification". It also means that data may be "personal" or not depending on the internal thinking of a company, or on the circumstances at a given point in time. This can also make cooperation between companies more complex, as some would fall under the GDPR and others would not.

Further, such a "subjective" definition makes it impossible for users or authorities to know whether the GDPR applies in a given case. In practice, this can make the GDPR hardly enforceable, due to endless debates and disagreements about the true intentions and plans of a company.

Max Schrems: "It is like a gun law that only applies to guns when the owner confirms he is able to handle a gun and intends to shoot someone. It is obvious how absurd such subjective definitions are."

(2) Pulling Personal Data from Your Device?

So far, Article 5(3) ePrivacy has protected users against remote access to data stored on "terminal equipment", such as PCs or smartphones.
This is based on the right to protection of communications under Article 7 of the Charter of Fundamental Rights of the EU, and it has made sure that companies cannot "remotely search" devices.

The Commission now adds "white-listed" processing operations for access to terminal equipment, which would include "aggregated statistics" and "security purposes". While the general direction of the changes is understandable, the wording is extremely permissive and would also allow excessive "searches" of user devices for (tiny) security purposes.

(3) AI Training by Meta or Google with Europeans' Personal Data?

When Meta or LinkedIn started using social media data, it was widely unpopular. In a recent study, for example, only 7% of Germans said they want Meta to use their personal data to train AI. Nevertheless, the Commission now wants to allow the use of highly personal data (like the content of 15+ years of a social media profile) for AI training by Big Tech.

Max Schrems: "There is absolutely no public support for Meta or Google to include Europeans' personal data in their algorithms. For years we were told that people should not worry, because our personal data will be used to 'connect' us, or at best be used for targeting some advertisement. Now all your data is shoved into the algorithms of Meta, Google or Amazon. This makes it easier for AI systems to know even the most intimate details – and consequently manipulate people. This primarily benefits the trillion-dollar US industry that builds base models from our personal details."

The European Commission foresees that users can opt out, but companies and users usually don't know whose data is in a training dataset. Even if they know, users would have to opt out thousands of times per year, whenever another company trains an algorithm with their data.

Max Schrems: "The opt-out approach does not work in practice. Companies don't have the contact details of users and users don't know who is training based on their data.
This opt-out approach is the Commission's attempt to place a fig leaf over this obviously unlawful processing activity."

The Commission not only wants to privilege the training of AI systems, but also the "operation" of such systems. This would amount to a "wildcard" where otherwise illegal processing would become legal, just because it is done using AI.

Max Schrems: "Usually, riskier technologies have to meet a higher standard. The Commission proposal now opens the floodgates once AI is used – while traditional data processing would still fall under the current laws. That's insane."

(4) User Rights Cut to Almost Zero – upon German Demand?

Based on a national debate about GDPR access rights being used to prove e.g. non-payment in employment contracts, the German government demanded a massive limitation of these rights – framing such use as "abuse", even though the GDPR already has an "abuse" clause. The Commission has followed that German demand and proposes to limit the use of the data subject access right to "data protection purposes" only.

Conversely, this means that if an employee uses an access request in a labour dispute over unpaid hours – for example, to obtain a record of the hours they have worked – the employer could reject it as "abusive". The same would be true for journalists or researchers. On a broad reading, this could go even further: if a person requests access to their data in order subsequently to delete false credit-ranking data and get a cheaper loan at the bank, such rights may not be exercised purely for a "data protection purpose" but out of economic interest.

This limitation is a clear violation of CJEU case law and of Article 8(2) of the Charter. The right to informational self-determination is explicitly meant to level the information gap between users and the companies that hold the information, as more and more information is hidden on company servers (e.g. a copy of time sheets).
The CJEU has held multiple times that these rights can be exercised for any purpose – including litigation or generating evidence.

Max Schrems: "This change is a clear violation of the Charter and of CJEU case law. It will be used by controllers throughout Europe to further undermine users' rights. The reality is that we don't have widespread abuse of GDPR rights by citizens – we have widespread non-compliance by companies. Cutting back user rights even further shows how detached the Commission is from the daily experience of users."