NVIDIA GTC Marks Turning Point in Physical AI
NVIDIA GTC marked a turning point in physical AI with the release of the NVIDIA Physical AI Data Factory Blueprint. The company also introduced the NVIDIA Omniverse DSX Blueprint, which addresses the complexities of modern AI factories spanning thermals, power grids, network load, and mechanical systems. Simulation technology helps operators build these factories on time and on budget, optimizing performance and efficiency before any physical installation.
The company stated, “Together with cloud leaders, we’re providing a new kind of agentic engine that transforms compute into the high-quality data required to bring the next generation of autonomous systems and robots to life. In this new era, compute is data.” Robots, vehicles, and factories are scaling from single use cases to sophisticated enterprise workloads across industries, indicating a significant shift in operational paradigms.
New Frontier Models and Agentic Frameworks Drive Physical AI Advancements
At the forefront of this physical AI shift are new frontier models, including NVIDIA Cosmos 3, NVIDIA Isaac GR00T N1.7, and NVIDIA Alpamayo 1.5. These models are central to the development of next-generation autonomous systems. Open-source agentic frameworks such as OpenClaw extend the AI stack to operations, enabling long-running processes that utilize tools, memory, and messaging interfaces to orchestrate workflows, manage data pipelines, and execute tasks autonomously. Peter Steinberger commented, “With NVIDIA and the broader ecosystem, we’re building the claws and guardrails that let anyone create powerful, secure AI assistants.”
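The agentic pattern described above, long-running processes that combine tools, memory, and messaging to orchestrate work, can be sketched in a few lines. This is a hypothetical illustration of the pattern, not OpenClaw's actual API; the `Agent` class, its keyword-based tool dispatch, and the `echo` tool are all invented for the example.

```python
# Hypothetical minimal agentic loop: illustrates the tools + memory +
# orchestration pattern, not any real framework's API.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Agent:
    tools: dict[str, Callable[[str], str]]           # tool name -> callable
    memory: list[str] = field(default_factory=list)  # running context log

    def step(self, task: str) -> str:
        """Pick a tool by simple keyword match, run it, and remember the result."""
        for name, tool in self.tools.items():
            if name in task:
                result = tool(task)
                self.memory.append(f"{name}: {result}")
                return result
        self.memory.append(f"no tool matched: {task}")
        return "no-op"

agent = Agent(tools={"echo": lambda t: t.upper()})
result = agent.step("echo hello")  # dispatches to the echo tool
```

Real agentic frameworks replace the keyword match with a language model choosing among tool schemas, and the memory list with persistent storage, but the loop structure is the same.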
OpenUSD is highlighted as a critical driver for physical AI scalability, providing a common scene-description language. This allows teams to integrate computer-aided design (CAD) data, simulation assets, and real-world telemetry into a shared, physically accurate view of the world. Converting CAD files to OpenUSD is a vital step, transforming engineering data into simulation-ready assets for building, testing, and validating robots in physically accurate virtual environments. The company also noted, “Factories themselves are now robotic systems.”
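To make the "common scene-description language" concrete, the sketch below writes a minimal OpenUSD text file (.usda) by hand. Production pipelines use the OpenUSD (pxr) libraries or NVIDIA's CAD converters rather than string templating; the prim names here are illustrative.

```python
# Sketch of the OpenUSD text format (.usda). Real CAD-to-OpenUSD conversion
# goes through the pxr libraries or dedicated converters; this hand-writes a
# minimal stage only to show what the shared scene description looks like.
from pathlib import Path

def write_minimal_usda(path: str, prim_name: str = "RobotBase") -> str:
    """Write a one-prim USD stage: an Xform holding a unit cube."""
    scene = f"""#usda 1.0
(
    upAxis = "Z"
    metersPerUnit = 1
)

def Xform "{prim_name}"
{{
    def Cube "Body"
    {{
        double size = 1.0
    }}
}}
"""
    Path(path).write_text(scene)
    return scene

text = write_minimal_usda("robot_base.usda")
```

Because the format is plain text with explicit units and an up-axis convention, CAD data, simulation assets, and telemetry from different tools can compose into one physically consistent stage.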
NVIDIA Partners with Robotics Ecosystem to Scale Physical AI
NVIDIA is actively partnering with the global robotics ecosystem, including leading robot brain developers and industrial robot giants, to enhance production-level physical AI. Companies such as ABB Robotics, FANUC, KUKA, and Yaskawa, boasting a combined global install base of over 2 million robots, are utilizing NVIDIA Omniverse libraries and NVIDIA Isaac simulation frameworks. These tools are employed to validate complex robot applications and production lines through physically accurate digital twins. ABB Robotics, FANUC, KUKA, and Yaskawa have also integrated NVIDIA Jetson modules into their controllers.
Robot development begins with the robot brains, and developers like FieldAI and Skild AI are building theirs using NVIDIA Cosmos world models for data generation and Isaac simulation frameworks for policy validation. Generalist AI is also exploring synthetic data generation using NVIDIA Cosmos. This combined approach accelerates robot proficiency across various tasks, from supply chain monitoring to food delivery. The NVIDIA Mega Omniverse Blueprint offers enterprises a reference architecture for designing, testing, and optimizing robot fleets and AI agents within facility digital twins before real-world deployment. KION, in collaboration with Accenture and Siemens, is leveraging this blueprint to construct large-scale warehouse digital twins for training and testing NVIDIA Jetson-based autonomous forklifts destined for GXO.
Cloud Platforms Enable Turnkey Data Production for Physical AI
Microsoft Azure and Nebius are the first cloud platforms to offer the NVIDIA Physical AI Data Factory Blueprint, transforming world-scale compute into turnkey data production engines. The blueprint is built on NVIDIA Cosmos open-world foundation models and the NVIDIA OSMO operator, unifying data curation, augmentation, and evaluation into a single pipeline. This integration enables developers to generate diverse, long-tail datasets from limited real-world inputs. The company emphasized that real-world data does not scale: it is messy and unpredictable, leaving developers with fragmented pipelines for processing, simulating, and evaluating it.
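The unified curate-augment-evaluate pipeline can be sketched as below. This is a conceptual sketch only; the function names and the jitter-based augmentation are invented for illustration and are not the blueprint's actual API.

```python
# Hypothetical sketch of a curate -> augment -> evaluate data pipeline;
# names and logic are illustrative, not the blueprint's real interfaces.
import random

def curate(samples: list[dict]) -> list[dict]:
    """Keep only samples with usable sensor data."""
    return [s for s in samples if s.get("valid", False)]

def augment(samples: list[dict], variants: int = 3) -> list[dict]:
    """Expand each real sample into several perturbed synthetic variants."""
    rng = random.Random(0)  # fixed seed so runs are reproducible
    out = []
    for s in samples:
        for i in range(variants):
            out.append({**s, "jitter": rng.uniform(-0.1, 0.1), "variant": i})
    return out

def evaluate(samples: list[dict]) -> float:
    """Fraction of the dataset that is synthetic (a simple coverage proxy)."""
    synthetic = sum(1 for s in samples if "variant" in s)
    return synthetic / len(samples) if samples else 0.0

raw = [{"valid": True, "scene": "warehouse"}, {"valid": False, "scene": "dock"}]
data = augment(curate(raw))  # one valid sample expands into three variants
```

The point of unifying the stages in one pipeline is exactly what the single call chain shows: a small set of real-world inputs flows through curation and augmentation into a larger, evaluable synthetic dataset without hand-offs between fragmented tools.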
The bottleneck is not solely data, but the entire data factory. Using tools such as the NVIDIA Omniverse Kit software development kit and NVIDIA Isaac Sim, teams can optimize and enrich 3D data for real-time rendering, simulation, and collaborative workflows. The statement “All factories are born in simulation” underscores the importance of virtual environments in the development process. This approach allows for the optimization of performance and efficiency within simulations before any physical infrastructure is deployed.