mAbxience and HP bring AI-driven digital twins to biomanufacturing
mAbxience, a Fresenius company, and HP Inc. have launched a project to apply artificial intelligence to the production of monoclonal antibodies and biosimilars. The core deliverable: a digital twin of the biological process that improves predictability, consistency, and efficiency at scale.
Built on real production data and validated in an industrial setting, the system lets teams simulate, analyze, and tune key stages of cell culture manufacturing. The target is clear: higher yields with less process variability.
Why this matters for IT and engineering teams
This is a practical blueprint for bringing AI into regulated bioprocessing without breaking GMP workflows. Biosimilar licensees and CDMO clients stand to benefit from faster iteration, tighter control, and smarter resource use, all while staying within quality and compliance guardrails.
Leaders at both companies point to impact over hype: technology that helps critical treatments reach more people, faster. The first prototype is live, proving the concept can support more efficient and tightly controlled manufacturing campaigns.
How the digital twin is built
- Data foundation: Historical and in-process data from cell culture runs, with strict lineage and versioning.
- Modeling approach: Advanced neural networks trained on production data to capture process dynamics and sensitivities.
- Simulation layer: What-if analysis across feed strategies, setpoints, and process parameters before changing anything on the floor.
- Optimization loop: Recommendations to increase yield and cut variability, with guardrails based on validated ranges.
- Validation: Industrial validation to ensure performance holds up on real equipment and real batches.
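The simulation and optimization steps above can be sketched in a few lines. Here a toy quadratic surrogate stands in for the trained neural network, and a grid search is deliberately constrained to validated operating ranges, mirroring the guardrail idea. All names, coefficients, and ranges are illustrative assumptions, not details of the actual mAbxience system:

```python
# Hypothetical surrogate: predicts batch titer (g/L) from two setpoints.
# The quadratic shape and its optimum (feed=12 mL/h, temp=36.5 C) are made up
# for illustration; in practice this would be a model trained on run data.
def surrogate_titer(feed_rate_ml_h: float, temp_c: float) -> float:
    return 5.0 - 0.02 * (feed_rate_ml_h - 12.0) ** 2 - 0.5 * (temp_c - 36.5) ** 2

def optimize_within_validated_ranges(feed_range, temp_range, steps=50):
    """Grid search limited to validated ranges -- the 'guardrails' step."""
    best = None
    for i in range(steps + 1):
        feed = feed_range[0] + (feed_range[1] - feed_range[0]) * i / steps
        for j in range(steps + 1):
            temp = temp_range[0] + (temp_range[1] - temp_range[0]) * j / steps
            titer = surrogate_titer(feed, temp)
            if best is None or titer > best[0]:
                best = (titer, feed, temp)
    return best

best_titer, best_feed, best_temp = optimize_within_validated_ranges(
    feed_range=(8.0, 16.0), temp_range=(35.0, 37.5)
)
print(f"recommended setpoints: feed={best_feed:.1f} mL/h, temp={best_temp:.2f} C")
```

The point of the sketch is the shape of the loop, not the model: what-if evaluation happens entirely offline against the surrogate, and only setpoints inside validated ranges can ever be recommended.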
This first project, designed and validated in León, is intended to be replicated across other mAbxience sites and processes as part of a broader HP-mAbxience collaboration.
Architecture notes for implementation
- Pipelines: Stream and batch ETL/ELT from historians, MES, and QC systems. Enforce schema contracts and late-binding joins to preserve flexibility.
- Feature management: Central feature definitions for time-series and event-aligned features (e.g., DO, pH, feed rates, metabolites, viability).
- Model choices: Time-series models (e.g., temporal CNNs or sequence models), surrogate modeling, and constrained optimizers for safe recommendations.
- MLOps: Model registry, CI/CD for models and data pipelines, shadow runs, and staged rollouts. Automate monitoring for drift and stability.
- Interfaces: Read-only simulation sandbox for process engineers; controlled write-backs through validated change controls.
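To make the pipeline notes concrete, here is a minimal sketch of a schema contract plus event-aligned feature extraction: irregular historian readings are snapped onto a batch-relative hourly grid with last observation carried forward. The field names (`t_h`, `do_pct`, `ph`) are illustrative assumptions:

```python
from bisect import bisect_right

# Minimal schema contract: reject records missing required fields.
REQUIRED_FIELDS = {"t_h", "do_pct", "ph"}  # hours into batch, DO %, pH

def check_schema(record: dict) -> dict:
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"schema contract violated, missing: {sorted(missing)}")
    return record

def align_hourly(readings, horizon_h):
    """LOCF-align readings (sorted by t_h) to integer hours 0..horizon_h."""
    times = [r["t_h"] for r in readings]
    grid = []
    for h in range(horizon_h + 1):
        idx = bisect_right(times, h) - 1  # last reading at or before hour h
        if idx < 0:
            grid.append(None)  # no observation yet at this hour
        else:
            r = readings[idx]
            grid.append({"t_h": h, "do_pct": r["do_pct"], "ph": r["ph"]})
    return grid

raw = [check_schema(r) for r in [
    {"t_h": 0.5, "do_pct": 40.0, "ph": 7.1},
    {"t_h": 2.2, "do_pct": 38.5, "ph": 7.0},
]]
aligned = align_hourly(raw, horizon_h=3)
```

Centralizing definitions like `align_hourly` is what makes features reproducible across training, simulation, and monitoring, rather than each consumer re-deriving them slightly differently.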
Quality, safety, and compliance checklist
- Data integrity: Full audit trail, versioned datasets, model cards, and immutable artifacts.
- GxP validation: Risk-based validation plan (requirements, verification, and performance evidence) mapped to SOPs.
- Access control: Role-based access with segregation between development, validation, and production use.
- Electronic records: e-signatures and audit logs aligned with 21 CFR Part 11 expectations.
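The data-integrity items can be illustrated with two small primitives: content-addressing a dataset snapshot (versioned, immutable artifacts) and a hash-chained audit log, where tampering with any earlier entry breaks every later hash. This is a sketch of the idea only, not a validated Part 11 implementation:

```python
import hashlib
import json

def dataset_version(rows) -> str:
    """Deterministic content hash of a dataset snapshot."""
    blob = json.dumps(rows, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

def append_audit(log, actor, action, payload):
    """Append an entry whose hash covers the previous entry's hash."""
    prev = log[-1]["entry_hash"] if log else "0" * 64
    body = {"actor": actor, "action": action, "payload": payload, "prev": prev}
    entry = dict(body)
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)
    return entry

def verify_chain(log) -> bool:
    """Recompute every hash; any edit to a past entry fails verification."""
    prev = "0" * 64
    for e in log:
        body = {k: v for k, v in e.items() if k != "entry_hash"}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if body["prev"] != prev or digest != e["entry_hash"]:
            return False
        prev = e["entry_hash"]
    return True

log = []
v = dataset_version([{"batch": "B001", "titer": 4.8}])  # illustrative record
append_audit(log, "engineer_a", "register_dataset", {"version": v})
append_audit(log, "engineer_b", "train_model", {"dataset": v})
print(verify_chain(log))
```

In a real deployment these primitives sit behind validated tooling with role-based access and e-signatures; the sketch only shows why hash chaining makes the audit trail tamper-evident.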
For background on digital twins, see this overview of the concept: Digital twin. For electronic records and signatures, review the FDA reference: 21 CFR Part 11.
KPIs worth tracking
- Yield and titer per batch
- Batch-to-batch variability
- Cycle time and time-to-release
- Media, energy, and consumable use per kilogram produced
- Deviation rate and rework hours
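Two of these KPIs are cheap to compute from per-batch records: mean titer and batch-to-batch variability expressed as coefficient of variation. The field names and values below are illustrative:

```python
from statistics import mean, stdev

# Hypothetical per-batch release data.
batches = [
    {"id": "B001", "titer_g_l": 4.8},
    {"id": "B002", "titer_g_l": 5.1},
    {"id": "B003", "titer_g_l": 4.9},
    {"id": "B004", "titer_g_l": 5.0},
]

titers = [b["titer_g_l"] for b in batches]
mean_titer = mean(titers)
# Batch-to-batch variability: sample CV as a percentage of the mean.
cv_pct = 100.0 * stdev(titers) / mean_titer

print(f"mean titer: {mean_titer:.2f} g/L, CV: {cv_pct:.1f}%")
```

Tracking CV alongside mean titer matters because an optimization that raises average yield while widening the spread can still increase deviations and rework.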
What's next
This is the first project under a broader HP-mAbxience framework to apply AI across multiple areas (productivity, quality, and sustainability) throughout global operations. With a working prototype and industrial validation in place, the next logical step is scaling the twin to adjacent processes and sites, then closing the loop with decision support under strict change control.