What if the next time you boarded a plane, you weren’t just showing your passport but were asked to remove your contact lenses so your retina could be scanned? Would you feel safer, or would you feel exposed?
This is no longer science fiction. In parts of the Middle East, it’s already standard practice. And this tension, between the promise of safety and the risk of intrusion, sits at the core of the European Union’s AI Act. For in-house lawyers, it’s more than a policy debate. It’s a set of choices about compliance, innovation, and trust.
In a recent episode of Notes to My (Legal) Self® AI Insights, Olga Mack and Kassi Burns unpacked how the AI Act treats “unacceptable risk” systems such as social scoring, subliminal manipulation, and biometric surveillance. These categories are not just theoretical; they raise daily questions for in-house counsel navigating the new territory of EU AI Act compliance.
EU AI Act Compliance and the Ambiguity of “Unacceptable Risk”
On paper, the Act bans AI systems that manipulate behavior, score citizens, or deploy biometric recognition without clear limits. Simple enough, until you start asking what those terms actually mean.
Take social scoring. Is it only a government assigning ratings to citizens, as in dystopian fiction? Or could a company’s consumer scoring model, something already common in advertising, fall under the same prohibition?
For in-house counsel, this ambiguity matters. It can slow decisions, deter innovation, and create uncertainty about when a promising product might cross into forbidden territory. Advising in this gray zone requires not just legal knowledge but judgment about ethics, public perception, and future enforcement.
Biometric Risks and EU AI Act Compliance
Voice cloning, facial recognition, fingerprint scans: the possibilities of biometric AI are vast. Law enforcement sees tools to identify criminals. Businesses see smoother security and faster customer experiences.
But the risks are equally stark. Biometric systems can entrench bias, invade privacy, and damage trust if deployed without restraint. The AI Act reflects both the hope and the fear: it allows narrow law enforcement exceptions but tightly restricts commercial use. For in-house lawyers, this means every biometric deployment demands scrutiny, not just for compliance but for reputational safety.
In-House Lawyers Navigating EU AI Act Compliance
What does this mean for corporate counsel? It means the legal department will increasingly be asked to interpret rules that aren’t fully defined. When your product team experiments with personalization, does it risk sliding into social scoring? When your executives ask whether the Act applies today or five years from now, how do you give guidance that balances caution with agility?
These aren’t questions that black-letter law alone can answer. They require strategic thinking about risk, opportunity, and the values your company wants to project in the market.
From Compliance to Leadership
The EU AI Act isn’t just about banning technologies. It’s a signal that AI is no longer unregulated and that ethics is inseparable from business strategy. For in-house lawyers, this creates an opening. You’re not only advisors on what the law says; you’re leaders helping your organizations decide what kind of future they want to build.
That leadership will separate the companies that thrive in the AI era from those that stumble under uncertainty. And it starts with lawyers who are willing to engage, ask the hard questions, and guide their teams toward responsible innovation.
Watch the full conversation here: Notes to My (Legal) Self: Season 7, Episode 2 (ft. Olga Mack and Kassi Burns)
Join the Conversation
At Notes to My (Legal) Self®, we’re dedicated to helping in-house legal professionals develop the skills, insights, and strategies needed to thrive in today’s evolving legal landscape. From leadership development to legal operations optimization and emerging technology, we provide the tools to help you stay ahead.
What’s been your biggest breakthrough moment in your legal career? Let’s talk about it—share your story.