Technical Victories: Why “Shadow” Work Matters for Your Digital Rights

Our digital world is built on code, and our laws must be just as precise.

My goal as a politician in the European Parliament was to be the voice in the room that understands the technology and fights to make sure it serves us, not the other way around.

Working as a Shadow Rapporteur

When you read about European laws like the AI Act or the Digital Services Act, the headlines usually focus on the broad “what.” But as a Shadow Rapporteur, my job was to focus on the “how.” I was one of the MEPs who sat in the room to negotiate the actual, technical text of the law—the specific articles and definitions that determine whether a law actually works or just creates more red tape.

For me, digital policy isn’t just a general interest; it’s a commitment to protecting the people who build our digital world and the citizens who live in it. Here is a look behind the curtain at the technical victories I’ve fought for.


🛡️ Protecting the “Digital Commons”: Open Source in the CRA & PLD

One of my proudest successes was defending the global developer community. When the Cyber Resilience Act (CRA) was first drafted, there was a massive oversight: individual open-source contributors feared they could be held liable for security flaws in code they donated to the world for free. This would have been a death sentence for innovation.

I stepped in to ensure that:

  • The Definition of Open Source: We secured a clear definition that protects non-commercial software development.
  • Harmonization with the Product Liability Directive (PLD): I fought to mirror these protections in the PLD, ensuring developers aren’t sued for contributing to the “digital commons” that powers everything from your phone to our public infrastructure.

🚫 Banning “Pseudo-Science” in the AI Act

In the JURI (Legal Affairs) Committee, I led the charge against AI technologies that aren’t just invasive—they’re scientifically flawed. While many were happy with a general ban on biometric surveillance, I pushed for more granular prohibitions.

I successfully argued that technologies like emotional recognition (AI claiming to “read” your feelings) and behavioral recognition are often based on “pseudo-science.” By securing bans on these specific uses, we’ve ensured that AI cannot be used to categorize or monitor Europeans based on flawed, biased assumptions about how they look or act.


⚖️ The Digital Services Act (DSA): Moving Beyond Algorithms

My early career was spent on the front lines of the “Cartoon Crisis” and countering Russian disinformation. From 2010 onwards, I saw how platforms could be used to unite opposition and topple dictators, but I later saw how those same platforms spread polarisation and chaos. In the DSA, I turned that experience into law.

I focused on “Responsibility by Design.” Platforms can no longer hide behind “the algorithm.” I fought for amendments that require:

  • Human Oversight: Ensuring that technical tools are backed by human judgment.
  • Staffing Requirements: Forcing platforms to have the actual human resources (and local language expertise) to moderate content effectively and fairly.

🔍 Legislative Deep Dive: The Tracker

Here is a breakdown of the specific legislative “articles” where I have made a direct impact on the final text:

Legislative File | Specific Focus | My Technical Intervention
Cyber Resilience Act (CRA) | Open Source Protection | Secured the exemption for free and open-source software (FOSS) developed outside commercial activity (Art. 2 & Recitals).
Product Liability Directive (PLD) | Developer Safety | Aligned the definition of “software” to ensure individual contributors aren’t held liable for non-commercial code (Art. 4).
AI Act (JURI Committee) | Biometric Bans | Pushed the ban on Emotion Recognition and Biometric Categorization in workplaces and education (Art. 5).
Digital Services Act (DSA) | Platform Responsibility | Strengthened requirements for human-in-the-loop oversight and systemic risk mitigation (Art. 34/35).
Data Act | Consumer Privacy | Introduced amendments to ensure devices can be used anonymously and to prevent profiling via IoT devices.
