Former Microsoft Legal Chief Joins ServiceNow as President and CLO

ServiceNow has hired Microsoft's former CLO as its president and chief legal officer, putting legal at the business core and AI governance up front. For in-house teams, that means earlier product input and tighter contracts.

ServiceNow Hires Former Microsoft Chief Legal Officer as President and CLO: What It Means for In-House Counsel

Enterprise software company ServiceNow said Monday that it has hired Microsoft's former chief legal officer as its new president and chief legal officer. One move, two messages: legal leadership is moving closer to the business core, and AI governance is now a top-line priority.

This isn't just a title shift. It's a signal. Legal is being asked to own risk, policy, and growth, not just compliance.

Why this matters for legal teams

  • AI risk is now executive-level work. A CLO stepping in as president shows that AI policy, safety, and regulatory exposure sit next to revenue and product decisions.
  • Legal and product integrate earlier. Expect earlier legal involvement in model design, data strategy, and go-to-market, especially for high-risk use cases.
  • Regulatory pressure is rising. Think EU AI Act scoping, NIST AI RMF alignment, privacy-by-design, and transparent model documentation.
  • Contracts will get sharper. Data provenance, IP indemnities, monitoring rights, incident reporting, and model-change notifications will become standard.
  • Board oversight tightens. Governance needs cadence: metrics, incidents, third-party audits, and clear RACI across Legal, Security, and Product.

What to do next (practical steps)

  • Map AI use. Inventory internal and vendor models. Classify by risk. Identify shadow AI in workflows and contracts.
  • Update policies. Set rules for training data, human-in-the-loop, testing, and prohibited uses. Keep them simple and enforceable.
  • Tighten agreements. Add clauses for data sources, IP warranties, indemnity, evaluation rights, audit cooperation, incident SLAs, and export controls.
  • Stand up governance. Create an AI review council with Legal, Security, Privacy, and Product. Document decisions. Track exceptions.
  • Prepare disclosures. Align risk factors and incident reporting with securities, consumer protection, and sector rules.
  • Train the business. Short, role-based modules beat long manuals. Focus on what to use, what to avoid, and who to call.

Implications for deals, hiring, and org design

Expect more GCs and CLOs to take operational roles where AI is core to the product. Legal leaders with product fluency will rise in value. If you lead an in-house team, this is the moment to build credibility beyond "no": own frameworks, speed decisions, and help ship safely.

A quick checklist to run this week

  • List your top 10 AI-involved processes or vendors; mark high-risk items (safety, privacy, IP, discrimination).
  • Add AI-specific terms to your standard MSAs and DPAs.
  • Require model cards or equivalent documentation from key vendors.
  • Define incident thresholds and 24- to 72-hour notification expectations.
  • Set a quarterly AI risk review with the board or risk committee.
  • Launch a 30-minute training for product and procurement teams.

Level up your team

If your legal function is building AI fluency, consider role-based training to speed up policy, contracting, and governance work. See curated learning paths by job: AI courses by job.

