Letter from Cambridge, U.K. – AI Talk, Ceva, and Intelligent Power
Summer in Europe used to be a time for slowing down, but this year the electronics industry kept a hectic pace. From AI chips and technology updates to geopolitical tensions and tariffs, the sector remained in sharp focus. This week, executives gathered in Cambridge, U.K., for Arm’s annual partner meeting, offering a window into the current state of AI and its practical challenges.
One key conversation was with Shankar Krishnamoorthy, Chief Product Development Officer at Synopsys. The company’s recent Ansys acquisition was a major milestone, and Krishnamoorthy emphasized the diplomatic and strategic work it took to close the deal. He also shared his personal commitment to supporting students at IIT Bombay, his alma mater, underscoring the importance of nurturing future talent.
At Arm’s event, strict rules kept details under wraps, but AI clearly dominated discussions, with the focus on how Arm plans to address the opportunities this new AI era presents. A recurring theme at recent conferences has been the strength of the software and support ecosystems built by established players such as Arm and Nvidia; startups aiming to challenge these incumbents face hurdles around scalability and software portability.
One AI chip startup CEO pointed out that many newcomers rely on automatic compilers to map AI models onto their hardware, whereas Nvidia’s CUDA stack rests on handcrafted libraries and kernels backed by a mature ecosystem. That gap is a significant challenge for startups that lack comparable software infrastructure.
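To make the distinction concrete, the sketch below shows the simplest form a hand-written CUDA kernel can take: a naive element-wise vector add. It is purely illustrative and not drawn from any Nvidia library; production kernels in libraries such as cuBLAS and cuDNN layer tiling, shared-memory staging, and architecture-specific tuning on top of this basic pattern, and that accumulated tuning is the ecosystem depth the startup CEO was describing.

```cuda
// Illustrative only: a minimal hand-written CUDA kernel, not taken from any
// Nvidia library. Vendor libraries add years of architecture-specific tuning
// on top of patterns like this.
#include <cstdio>
#include <cuda_runtime.h>

// Naive element-wise vector addition: one thread per output element.
__global__ void vector_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        c[i] = a[i] + b[i];
    }
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    // Unified memory keeps the example short; tuned code usually manages
    // host/device transfers explicitly.
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);

    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vector_add<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);  // expect 3.0

    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

An automatic compiler aims to generate code of this kind directly from a model graph; matching the performance of hand-tuned kernels across many operators, data types, and GPU generations is where much of the difficulty lies.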
Making Energy and Power Management Intelligent
In Cambridge, another noteworthy discussion was with Mahesh Tirupattur, CEO of Analog Bits, about smart energy and power management for AI applications. This year, Analog Bits showcased analog IP on the latest 2-nanometer and 3-nanometer process nodes at major industry events including TSMC’s showcases and DAC 2025.
Delivering analog IP on these advanced process nodes is significant because it helps manage the growing power demands in AI data centers and other high-performance applications. The company’s intelligent power architecture aims to optimize energy usage, which is critical as AI workloads continue to increase.
Ceva’s 20 Billion Devices: A Tipping Point for NPUs and Edge AI
Outside Cambridge, Amir Panush, CEO of Ceva, shared insights on the company reaching a milestone of 20 billion shipped devices powered by Ceva technology. He highlighted that neural processing units (NPUs) are becoming essential for edge AI devices.
According to Panush, the NPU market is at a tipping point for broader adoption. He stressed that AI is not solely about GPUs and that NPUs play a crucial role, especially in edge computing where power efficiency and real-time processing matter.
Ceva's journey reflects the evolution from mobile technology to IoT and now AI. Wireless connectivity remains vital, and NPUs are positioned to support future smart devices. Panush also noted the critical role of software in making AI solutions practical and scalable.