Privacy advocates are condemning the European Commission's leaked plans to overhaul digital privacy legislation, accusing officials of bypassing proper legislative processes to favor Big Tech interests.
Max Schrems, founder of privacy group Noyb, warned: "One part of the European Commission (EC) seems to try overrunning everyone else in Brussels, disregarding rules on good lawmaking, with potentially terrible results."
He compared the approach to Trump administration tactics, arguing the proposals masquerade as small business relief while actually benefiting tech and advertising giants.
As first reported by MLex, the EC's proposed legislative changes are manifold, and in Noyb's view they would poke so many holes in existing rules as to "make [GDPR] overall unusable for most cases."
The EC is planning to unveil the "Digital Omnibus" package on November 19, introducing amendments to legislation covering AI regulation, cybersecurity, data protection, and privacy.
An overview of the leaked proposals [PDF], shared by Noyb, details the ideas with the greatest potential impact on existing laws and regulations.
One proposed change is an amendment to the GDPR that, the privacy group claims, would introduce a loophole giving companies freer rein to use personal data for their commercial benefit.
The current GDPR stipulates that even if personal data is tied to a pseudonymized user (i.e., "John Doe" is replaced with "User12345"), the data must still be treated as belonging to an identifiable natural person, and data protection rules still apply.
Under the new proposals, this stipulation would no longer be enforced, potentially allowing data controllers to be more lax with protecting users' personal data. "This could apply to almost all online tracking, online advertisement, and most data brokers," Noyb said.
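The mechanics of why pseudonymized data is still personal data can be illustrated with a minimal sketch (the record fields, token format, and lookup table here are illustrative, not drawn from the GDPR text): swapping a name for a token does not sever the link to the person, because whoever holds the mapping can reverse it.

```python
import hashlib

# Hypothetical data controller's lookup table: token -> real name.
# Its existence is what keeps the data re-identifiable.
lookup = {}

def pseudonymize(record):
    """Replace the name with a deterministic token, keeping the mapping."""
    token = "User" + hashlib.sha256(record["name"].encode()).hexdigest()[:5]
    lookup[token] = record["name"]
    return {**record, "name": token}

record = {"name": "John Doe", "pages_visited": 42}
pseudo = pseudonymize(record)

# The token stands in for the name, but the link back remains intact,
# which is why current rules treat this as data about an identifiable person.
assert lookup[pseudo["name"]] == "John Doe"
```

Under the leaked proposals, as Noyb reads them, data in the pseudonymized form above could fall outside full GDPR protection even though the controller can trivially re-identify the subject.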
The EC may also propose a "purposes limitation" on data access rights, hindering an individual's right to access, correct, or delete the data an organization or company has on them.
Noyb's interpretation is that data controllers would have greater powers to reject data access requests. "This means that if an employee uses an access request in a labor dispute over unpaid hours – for example, to obtain a record of the hours they have worked – the employer could reject it as 'abusive.' The same would be true for journalists or researchers."
The proposals would also weaken GDPR's Article 9 protections for sensitive data (sexual orientation, health status, political views), which would apply only when such data is "directly revealed." Companies could infer the same information from other sources without triggering the protections.
Noyb warned this could enable employers to deduce pregnancies and terminate employees before legal protections attach, or discriminate based on inferred sexual orientation.
All of these measures are, in part, being framed by the EC as a means to alleviate the administrative burden placed on small businesses, but Schrems instead labeled this a "side-show to get public support."
Whether these proposals attract the public support the EC will need for them to pass could have consequences for policymaking beyond Europe.
The current US administration has taken a more pro-innovation approach to regulating technology such as AI, but it is not inconceivable that the reception of the EC's proposals later this month could inform similar policy decisions, at least at state level, as European rules have done before.
For example, the GDPR, introduced in 2018, inspired the landmark California Consumer Privacy Act (CCPA), which passed in the same year and became enforceable in 2020.
Big Tech and European companies alike have lobbied the EU to weaken the AI Act since it passed and partially came into force last year.
Core to their arguments is that the regulations are too restrictive on innovation, and the reforms may give AI systems a special exemption, allowing them to process data that would otherwise require a legitimate legal basis.
According to Noyb's interpretation, "this would lead to a grotesque situation: If personal data is processed via a traditional database, Excel sheet or software, a company has to find a legal basis under Article 6(1) GDPR. However, if the same processing is done via an AI system, it can qualify as a 'legitimate interest' under Article 6(1)(f) GDPR."
The org adds: "This would privilege one (risky) technology over all other forms of data processing and be contrary to the 'tech neutral' approach of the GDPR."
The proposals additionally aim to introduce amendments that make it easier for data controllers to comply with data protection laws, while being allowed to use people's data to train their models.
Various protections are outlined in the leaked draft, such as the requirement for data minimization and safeguards to be implemented, although the document does not specify what those safeguards would entail.
Noyb also said certain interpretations of the proposals could allow companies to gather more data from users' personal devices that could then be used to train Big Tech's AI models.
Such data is currently protected by Article 5(3) of the GDPR, which is underpinned by Article 7 of the Charter of Fundamental Rights of the European Union – respect for private and family life, home, and communications.
A legitimate interest protection for gathering data related to "security purposes" and "aggregated information" could be interpreted broadly by AI companies if the EC does not apply strict definitions, potentially leading to excessive searches of data subjects' devices, the privacy campaigners argued. ®
Source: The Register