Why an AI Legal Strategy Matters for In-House Counsel

Are you ready to lead your company through the biggest regulatory shift in artificial intelligence to date? President Biden’s AI executive order isn’t just another policy—it’s a defining moment for in-house counsel. To stay ahead, legal leaders must develop a clear and proactive AI legal strategy that addresses new standards around safety, data privacy, and AI governance. So the real question is: is your organization prepared for what’s coming?

To explore the implications of this sweeping directive, we spoke with Steven Kelts, a professor of digital ethics at Princeton. His insights offer a roadmap for how legal teams can turn the executive order into an opportunity for leadership.

Watch the full conversation with Steven Kelts here:

Red Teaming and Risk Assessment in AI Governance

One of the most important elements of the order is its requirement for robust AI safety testing. The directive calls on the National Institute of Standards and Technology (NIST) to create red-teaming protocols—adversarial testing practices long used in cybersecurity, now adapted for AI oversight. According to Kelts, this broadens the scope of risk evaluations and places a high premium on well-documented safety processes.

To meet these expectations, in-house counsel must take the lead in integrating safety assessments into AI product development. An effective AI legal strategy brings legal, compliance, engineering, and product teams together, ensuring the company can withstand scrutiny under evolving regulatory frameworks.

Crafting an AI Legal Strategy That Addresses Vendor and Procurement Risks

Importantly, the executive order doesn’t apply only to federal contractors; it signals a much broader impact across the AI ecosystem. Procurement guidelines will likely extend to vendors and subcontractors, making compliance essential for most businesses using or selling AI.

Given this shift, in-house counsel should proactively revise contract templates, reexamine supplier agreements, and identify areas of legal exposure. Embedding these practices into your AI legal strategy keeps your company competitive while strengthening its legal positioning.

Mitigating Liability for AI-Generated Content

Another pivotal section of the order addresses watermarking and authentication of AI-generated content. Though initially aimed at federal use, the requirement carries industry-wide implications and could redefine liability under Section 230 of the Communications Decency Act. Kelts warns that organizations deploying generative AI tools may increasingly be viewed as content publishers, not merely as platforms.

Legal teams must now consider how to mitigate reputational and regulatory risk. Your AI executive order legal strategy should include content review policies, platform moderation guidelines, and risk assessments tied to user-generated and AI-created outputs.

Privacy, Compliance, and the Future of AI Legal Strategy

The order also tightens restrictions on government use of personal data purchased from brokers, especially when it involves personally identifiable information (PII). This change will likely reshape data broker practices and raise new compliance obligations for companies that rely on third-party data.

In-house counsel must reevaluate how data is collected, used, and stored. An effective AI executive order legal strategy should include updated privacy policies, due diligence checklists for vendors, and revised internal protocols, all aligned with evolving data governance expectations.
As Kelts emphasizes, this executive order is a starting point, not the final word. Further regulation is expected from the FTC, DOJ, HHS, and other agencies. Rather than waiting passively, legal teams can use this moment to lead: engaging in public comment processes, shaping internal policies, and educating executive leadership are all critical steps.

Building a forward-thinking AI executive order legal strategy today will help your organization stay compliant, competitive, and trusted as AI governance continues to evolve. This isn’t just about managing risk—it’s about leading responsibly in the age of intelligent technology.
Watch the full conversation here: Notes to My (Legal) Self: Season 6, Episode 12 (ft. Steven Kelts)
Join the Conversation
At Notes to My (Legal) Self®, we’re dedicated to helping in-house legal professionals develop the skills, insights, and strategies needed to thrive in today’s evolving legal landscape. From leadership development to legal operations optimization and emerging technology, we provide the tools to help you stay ahead.
What’s been your biggest breakthrough moment in your legal career? Let’s talk about it—share your story.