'We're going to need three computers… one to create the AI… one to simulate the AI… and one to run the AI'

by Pelican Press



Nvidia's Jen-Hsun Huang discussing the future of AI at Siggraph 2024 with Wired. Credit: Nvidia/Siggraph

At this year's Siggraph event, Nvidia's Jen-Hsun Huang sat down with Wired for an hour-long chat about all things Nvidia, RTX, and AI. Among the varied topics touched upon, including a recognition that AI training and inference have huge energy demands, was Huang's assertion that more computers are going to be needed for AI systems in the future: specifically, three times more.

Siggraph is an annual conference normally focused on computer graphics and interactive technology (think AR and VR, that kind of thing), but it was only a matter of time before AI became the main topic of discussion. To that end, Nvidia's CEO was interviewed by Wired's Lauren Goode for an hour-long streamed discussion that covered GPUs, RTX, and ray tracing, but mostly AI.

If you've been keeping up to date with Nvidia's push for generative AI to be everywhere, there's little in the discussion that will really pique your interest. However, at one point, Huang noted that the world of AI is now moving away from its pioneering phase and toward the next one, which Nvidia's CEO called the "enterprise wave."

After that comes the "physical wave," which, according to Huang, is "really, really quite extraordinary." He clarified that statement by saying three computers will be required: one computer to create the AI, another to simulate and refine the AI, and finally a third computer to run the AI itself.

"It's a three computer problem. You know, a three body problem, and it's so incredibly complicated, and we created three computers to do that."

Jen-Hsun is, of course, talking about Nvidia's raft of hardware and software packages: its DGX H100 servers to create the AI, workstations and servers running Omniverse with RTX GPUs to simulate the AI, and Jetson embedded computers to run it.

Siggraph is one of my favourite tech events, and I've been watching presentations and reading research papers presented at the conference for years. I have to say that it's a bit of a shame that Nvidia's fairly blatant sales pitch for its AI systems took center stage this year (Huang's chat with Meta's Mark Zuckerberg was another example of AI promotion with little substance), and although there will still be plenty of discussion about computer graphics, with AI naturally a part of it, Huang didn't say anything that made me think "Wow, this is going to be so cool!"

Are we really going to need three computers? PC gamers certainly won't, and neither will most businesses. Even those looking to deeply integrate AI into their core operations may baulk at the potential cost and complexity of using, and paying for, three tiers of Nvidia's products.

Nvidia is clearly 100% focused on AI now. The days of it being purely a graphics and gaming company are long gone, even though gaming was still a core part of the business when it morphed into a data-processing one. That's not to say PC gamers won't benefit from Nvidia's technological advancements in AI, of course; the likes of RTX and DLSS have arguably been a big step forward in the world of rendering.

And I certainly wouldn't expect the CEO of the world's most successful AI company not to push it at every possible business opportunity, but I think we could all do with a bit of a breather from the relentless push for artificial intelligence festooning every aspect of our computing lives. It does wear a little thin after a while.


