NVIDIA GTC Kicks Off in San Jose with Focus on Full AI Stack
NVIDIA's GTC (GPU Technology Conference) is underway in San Jose, running March 16-19, with an anticipated 30,000 attendees from 190 countries spread across ten venues. The centerpiece is Jensen Huang's keynote address, expected to cover the complete AI ecosystem: chips, software, models, and applications. The conference features 700 sessions and is being streamed online for virtual participants.
This year's GTC explores a broad spectrum of AI advancements, including physical AI, AI factories, agentic AI, and inference, with sessions on designing and scaling enterprise AI factories for large language models, agentic AI, physical AI, and high-performance computing. Huang's keynote takes place at the SAP Center, the home venue of the San Jose Sharks.
AI Infrastructure Buildout and Future Architectures Under Discussion
Jensen Huang has previously argued that the ongoing AI infrastructure buildout, currently valued in the hundreds of billions of dollars, could eventually reach into the trillions. He emphasized that AI development depends on five critical "layers": energy, chips, infrastructure, models, and applications, all of which must scale simultaneously for widespread adoption. NVIDIA is strategically positioned at the core of this interconnected ecosystem.
Discussions at GTC are expected to touch on NVIDIA's roadmap, from the Blackwell Ultra architecture to the forthcoming Feynman architecture. Attendees anticipate insight into NVIDIA's strategies for handling the significant power demands of the upcoming Rubin chips, as well as their potential adoption by major cloud providers. The Feynman generation may incorporate co-packaged optics to improve data-transfer efficiency, reducing power consumption and enabling larger AI clusters. Analysts are also watching the evolution of NVIDIA's SuperPod architecture, introduced at the Consumer Electronics Show, which integrates chips, networking, storage, and software for managing large-scale AI workloads.
Industry Leaders and Open Models Conversation at GTC
CEOs from prominent AI companies including Perplexity AI, LangChain, Mistral AI, Skild AI, and OpenEvidence are scheduled to participate in pre-event discussions. LangChain CEO Harrison Chase, alongside leaders from Andreessen Horowitz, the Allen Institute for AI, Cursor, and Thinking Machines Lab, will also join Jensen Huang in a conversation on the merits of open models versus frontier closed models.
Experts will also demonstrate practical physical AI development using the NVIDIA Isaac and NVIDIA Omniverse platforms. The conference comes amid significant recent developments for NVIDIA, and analysts note that the framing of data centers as "AI factories," where data and electricity yield tokens and insights, is a key focus of this GTC. This perspective recasts data centers as revenue-generating production facilities, directly linking compute power to earnings.