North Carolina's DIT Fast-Tracks AI with New Accelerator, Oversight Teams and Public Training

North Carolina launches a DIT-run AI Accelerator, mandates agency oversight teams, and kicks off public AI literacy efforts. 60-day pilots target efficiency while keeping strong safeguards.

Published on: Sep 20, 2025

North Carolina's AI Push: What It Means for State Agencies

North Carolina is moving to integrate artificial intelligence into state operations, with the Department of Information Technology (DIT) leading the charge. Formed in 2015, DIT centralizes IT services, oversees the 911 Board, and expands broadband access, making it a practical hub for AI adoption across cabinet agencies.

On Sept. 2, Gov. Josh Stein issued an executive order establishing an AI Accelerator at DIT, requiring each cabinet agency to form an AI oversight team, and launching AI literacy efforts for the public. The goal: move from talk to tested pilots that improve service delivery while keeping guardrails intact.

Inside the AI Accelerator

DIT's AI Accelerator is a 60-day, internally focused program where agencies partner with businesses and universities to develop, test, and pilot AI solutions in a safe environment. The department has committed to transparency with an updated list of use cases under evaluation, and it is currently soliciting ideas from across government.

DIT's near-term target is clear: give time back to employees. "The immediate focus is to improve efficiencies so that state employees can get back to working just a 40-hour week," a spokesperson said. The plan is to automate repetitive tasks so staff can focus on work that requires judgment and interaction.

Who's Driving the Work

Teena Piccione, a former Google executive, now leads DIT and has said she intends to "run at the speed of business, not at the speed of government." She appointed I-Sah Hsieh, a longtime SAS leader, as the first Deputy Secretary of Policy & AI to ensure "ethical, transparent and accountable" deployments.

Hsieh's stated priority: build trust to increase adoption. He argues that public service culture (privacy, compliance, and serving everyone) can support quick testing and deployment without compromising safety.

The reality check: government process is slower by design. Even leadership confirmations can lag, underscoring the need for disciplined pilots, clear policies, and measurable outcomes to keep momentum.

Governance Moves: The AI Leadership Council

The executive order created a 24-person AI Leadership Council co-chaired by Piccione and Department of Commerce Secretary Lee Lilley. The council will meet quarterly and includes voices from government, industry, and academia, an intentional mix to balance innovation with accountability.

"No single sector can move AI forward on its own," said Siobahn Day Grady of North Carolina Central University. "Universities provide research and talent, government ensures accountability and trust, and companies deliver innovation and scale."

Early Lessons from a State Pilot

State Treasurer Brad Briner ran a pilot with OpenAI to see how ChatGPT could support staff in the Division of Unclaimed Property and the Division of State and Local Government Finance. Staff reported saving 30-60 minutes per day on routine work by the end of the pilot. "This technology is all about empowering public servants to do an even better job serving our citizens, not about replacing them," Briner said.

The pilot used only publicly available data, honoring a "bright red line" against entering private information into public AI tools. DIT guidance reinforces that anything entered into a public generative AI tool is considered released to the public and may be subject to records requests. The Treasurer's office is evaluating future tools after its free licenses expired and expects to integrate AI into its workflow.

What Agencies Should Do Now

  • Stand up your AI oversight team. Assign a product owner for each pilot with clear objectives, guardrails, and documentation.
  • Identify high-volume, low-risk candidates for automation (summaries, drafting, classification, routing, knowledge retrieval). Start with processes that use public data.
  • Set data rules. Classify data, define what cannot go into public tools, and use approved, secure environments for any sensitive work. Reference frameworks like the NIST AI Risk Management Framework.
  • Adopt a 60-day pilot cadence. Measure time saved, error rates, and user satisfaction. Publish use cases for transparency.
  • Update procurement. Require clear terms on data use, model updates, audit rights, incident reporting, and exit plans to avoid lock-in.
  • Keep a human in the loop for decisions that affect benefits, eligibility, compliance, or public safety. Document overrides and outcomes.
  • Train staff in AI literacy, prompts, review techniques, and data protection, with options tailored to each role.
  • Treat inputs and outputs as records where applicable. Set retention rules and a process for public records requests.
  • Test for bias and accessibility. Require plain language outputs and ADA-compliant formats for public-facing use.
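The "set data rules" step above can be made concrete in code. The sketch below, in Python, shows one minimal way an agency might gate prompts before they reach a public generative AI tool: scan for obviously sensitive patterns and refuse to submit if any are found. The pattern list, function names, and the idea of a regex-based gate are illustrative assumptions, not a state-approved design; a real deployment would enforce the agency's full data-classification policy, not regexes alone.

```python
import re

# Illustrative patterns for data that must never enter a public AI tool.
# A real policy would cover far more categories (regulated, confidential, PII).
BLOCKED_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def check_prompt(text: str) -> list[str]:
    """Return the names of blocked data types detected in the text."""
    return [name for name, pat in BLOCKED_PATTERNS.items() if pat.search(text)]

def submit_to_public_tool(text: str) -> str:
    """Refuse to forward text that trips the data-classification gate.

    Anything that passes should still be logged, since inputs to public
    tools may be treated as public records.
    """
    violations = check_prompt(text)
    if violations:
        raise ValueError(f"Blocked: prompt contains {', '.join(violations)}")
    # ... forward to the approved tool and log the exchange as a record
    return "submitted"
```

A gate like this is cheap to run on every prompt and gives auditors a clear log of what was blocked and why, which supports the records-retention step above.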

Guardrails That Matter

  • Data protection: prohibit entry of private or regulated data into public tools; use state-approved solutions.
  • Accuracy: require source citations, fact checks, and sampling audits to reduce false outputs.
  • Transparency: maintain a public list of pilots and outcomes to build trust.
  • Security: vet vendors for identity controls, logging, and incident response.
  • Change control: manage model updates and version drift; retest before scaling.
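The accuracy guardrail above calls for sampling audits. As a rough sketch of what that might look like, the Python snippet below pulls a seeded random sample of AI outputs for human review and computes the observed error rate. The record layout (an `incorrect` flag set by a human reviewer) is an assumption made for this example, not a prescribed schema.

```python
import random

def sample_for_audit(outputs: list[dict], n: int, seed: int = 0) -> list[dict]:
    """Pick a reproducible random subset of outputs for human review.

    Seeding makes the audit sample repeatable, which matters if the
    audit itself is subject to records requests.
    """
    rng = random.Random(seed)
    return rng.sample(outputs, min(n, len(outputs)))

def error_rate(reviewed: list[dict]) -> float:
    """Fraction of reviewed outputs a human marked as incorrect."""
    if not reviewed:
        return 0.0
    return sum(1 for r in reviewed if r["incorrect"]) / len(reviewed)
```

Tracking this rate per pilot, per model version, gives the change-control guardrail a concrete trigger: if the error rate jumps after a model update, retest before scaling.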

Stay Connected

Track updates and resources from the NC Department of Information Technology as the AI Accelerator matures. The agencies that win here will be the ones that pilot quickly, measure honestly, and communicate clearly with the public.