How AI Changes What Creatives Can Actually Do With Their Tools
Nvidia's Jensen Huang outlined a fundamental shift in how creative software works during his appearance at Adobe Summit 2026. The change isn't about new tools. It's about how people access the capabilities already built into them.
For decades, creative software has operated the same way. Users learn menus, build shortcuts, and work within the interface they see. Huang estimates most people use only a fraction of what their tools can do. "I think my entire vocabulary of Photoshop is probably 7% of its capabilities," he said.
AI systems that understand intent can close that gap. Instead of navigating step by step through menus and dialogs, a creator describes what they want and the system executes the task by drawing on the full range of available functions.
The Interface Becomes Conversational
Huang described this as a new user interface: artificial intelligence itself. Agentic systems interpret instructions and handle execution without requiring users to know where every function lives.
"We used to use tools by point and click and loading files and dragging down menus," Huang said. That model is changing.
Applications like Photoshop and Premiere remain in place. But a layer sits in front of them now. Users can still work directly inside the software or work through a system that interprets what they're asking for and handles more of the work.
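The layer Huang describes can be pictured as a simple dispatcher: a plain-language request is matched to functions the application already has, so the user never hunts for them in a menu. The sketch below is a deliberately naive illustration of that idea, with made-up function names; a real agentic system would use a language model rather than keyword lookup.

```python
# Hypothetical sketch of an intent layer sitting in front of an
# application's existing capabilities. The tool functions and the
# keyword matching are illustrative stand-ins, not a real API.

def crop_to_subject(image):
    return f"{image}: cropped to subject"

def remove_background(image):
    return f"{image}: background removed"

# Capabilities already built into the tool, keyed by a phrase a
# user might say. A real system would map intent far more robustly.
TOOLBOX = {
    "crop": crop_to_subject,
    "remove the background": remove_background,
}

def agent(request, image):
    """Match a plain-language request to an existing capability
    and execute it, without the user navigating the interface."""
    for phrase, action in TOOLBOX.items():
        if phrase in request.lower():
            return action(image)
    return f"{image}: no matching capability"

print(agent("Remove the background, please", "photo.png"))
# prints "photo.png: background removed"
```

The point of the sketch is that the application's functions don't change; only the path to them does, which is why users can still work directly in the software when they prefer.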
This has a direct effect on who can use these tools effectively. Functions hidden behind complexity become accessible, and companies report sharp increases in tool usage as a result.
Expertise Shifts From Operation to Direction
This doesn't eliminate the need for skill. It changes where that skill matters. The advantage moves away from knowing how to operate software and toward understanding what to ask for and how to guide the result.
A broader group of people can now produce high-quality work. The barrier to using complex software drops when systems understand intent and carry out tasks automatically.
For teams in creative, marketing, and product development, this creates a different learning curve. It's less about mastering interfaces and more about working with systems that interpret instructions and act on them.
Beyond Creation Into Action
Huang extended the concept beyond generating images and content. He described "vision language action models": systems that take input from the physical world and respond to it directly.
"If you can go from language to images, images to language… why can't we go from language to chunks of actions?" he said.
This approach applies to manufacturing, logistics, and transportation, where understanding real-world conditions is essential. Nvidia is building precise digital representations of physical objects and environments, then combining them with AI to simulate and act on those conditions.
The underlying principle remains consistent: when systems understand intent, more capability becomes accessible. The user's role becomes focused on direction and judgment rather than execution.
Huang presented this as something already being built into tools companies use today, not a distant prospect. That means creatives working now need to adapt to working alongside systems that interpret their intent rather than mastering every function manually.
Learn more about how AI is changing creative workflows through AI Design Courses and AI Agents & Automation training.