"When you use the model, you save enormous amounts of energy" — NVIDIA CEO says they can't do graphics anymore without AI

Zotac GeForce RTX 4090 AMP Extreme AIRO
The RTX 4090 is insanely powerful, but even this goliath can benefit from AI. (Image credit: Harish Jonnalagadda / Windows Central)

What you need to know

  • In a recent interview, NVIDIA CEO Jensen Huang talked up AI's role in his company's products. 
  • Simply put, Huang says NVIDIA can no longer do graphics without using AI. 
  • By using AI models in tools like DLSS, the process is both faster and more energy efficient. 

AI is everywhere, whether you're a fan of it or not, and in some cases it's in places you've been using and taking for granted, like your PC gaming rig. 

I've written in the past about how I hope that AI could lead to positive developments in gaming, like bringing the movie Free Guy to reality, but there's a very different angle, too. 

In a recent interview (via PC Gamer), NVIDIA CEO Jensen Huang talked about exactly these kinds of uses for AI, and honestly, I hadn't considered some of this stuff. I'm probably not alone, either. 

One section that really stands out is this one: 

We can't do computer graphics anymore without artificial intelligence. We compute one pixel, we infer the other 32. I mean, it's incredible. And so we hallucinate, if you will, the other 32, and it looks temporally stable, it looks photorealistic, and the image quality is incredible, the performance is incredible, the amount of energy we save -- computing one pixel takes a lot of energy. That's computation. Inferencing the other 32 takes very little energy, and you can do it incredibly fast.

Jensen Huang, NVIDIA CEO
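
To put rough numbers on that claim, here's a quick back-of-the-envelope sketch in Python. The upscaling and frame-generation ratios in it, and the little rendered_share helper, are illustrative assumptions rather than NVIDIA's published DLSS figures; the point is how quickly the traditionally rendered share of pixels shrinks once upscaling and frame generation are multiplied together.

    # Back-of-the-envelope sketch: what share of displayed pixels gets rendered
    # the traditional way when AI upscaling is combined with AI frame generation.
    # The ratios below are illustrative assumptions, not NVIDIA's published figures.

    def rendered_share(upscale_per_axis: float, ai_frames_per_rendered_frame: int) -> float:
        """Fraction of final on-screen pixels the GPU computes conventionally."""
        pixels_per_frame = 1.0 / (upscale_per_axis ** 2)            # e.g. 3x per axis -> 1/9 of pixels
        frames_rendered = 1.0 / (1 + ai_frames_per_rendered_frame)  # e.g. 3 AI frames -> 1 in 4 rendered
        return pixels_per_frame * frames_rendered

    # Hypothetical example: 3x-per-axis upscaling plus three AI-generated frames
    # for every conventionally rendered one.
    share = rendered_share(upscale_per_axis=3, ai_frames_per_rendered_frame=3)
    print(f"Roughly 1 in {round(1 / share)} pixels is rendered; the rest are inferred.")

With those hypothetical numbers you land in the same ballpark as Huang's one-pixel-computed, 32-inferred claim, and since inferring a pixel costs far less energy than rendering it, that ratio is where the savings he describes come from.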

AI uses insane amounts of power, but we haven't considered the opposite

It's hard to believe DLSS has been with us for six years already, and it truly changed the game.  (Image credit: Windows Central)

There has been much talk of the biblical amounts of energy consumed by the likes of OpenAI in training and running its AI models. It's generally accepted that this stuff isn't great for the planet without a sustainable way of powering it. 

Today's graphics cards are also thirsty beasts. I'll never forget opening my RTX 4090 review sample and audibly gasping at the adapter cable that came with it that had four 8-pin power connectors on it. Granted, this is the best of the best, still, but it's capable of sucking up a crazy amount of energy. 

What never occurred to me is how AI and DLSS can actually reduce how much energy is consumed while generating the glorious visuals in today's hottest games. DLSS is easily the best implementation of upscaling, and its biggest competitor, AMD's FSR, still uses a compute-based approach rather than AI. 

NVIDIA's GPUs have incredible AI processing power, as has become evident when they're compared to the NPUs on the new breed of Copilot+ PCs. It's both interesting and a little shocking that the CEO of the biggest player in the game has outright said that it's no longer possible to ignore using AI in this way. 

Even if you're not a fan of generative AI or its infusion into search, art, and a multitude of other areas, if you're a gamer, there's a good chance you're already a fan of this. Whenever the RTX 5000 series finally surfaces, it looks like its AI chops are going to be more important than ever before. 


Richard Devine
Managing Editor - Tech, Reviews

Richard Devine is a Managing Editor at Windows Central with over a decade of experience. A former Project Manager and long-term tech addict, he joined Mobile Nations in 2011 and has been found on Android Central and iMore as well as Windows Central. Currently, you'll find him steering the site's coverage of all manner of PC hardware and reviews. Find him on Mastodon at mstdn.social/@richdevine

  • fjtorres5591
    It's "worse".
    Huang is only talking graphics.
    He's not talking coding. Debugging in particular.
    NPC path finding, tactics, dialogue and other player interactions.
    Living and/or Destructible environments.
    In-game physics.
    Game engines in general are likely going to evolve into collections of SLMs.

    The anti-AI luddites are fighting the inevitable.
  • Phyzzi
    There are some AI luddites, but many more people are upset with the "how" of current AI use than with using it in general. AI used to "generate" often has a lot of ethical problems, namely that it tends to use copious amounts of human-generated source material without permission, which it then tends to be able to replicate quite well, with minor alterations, but which it does not and cannot use "for inspiration" in a vaguely human way. It's not a stretch to call this theft, and it also means that these AI models tend not to be good at filling in the gaps of what humans have already done creatively without often irritating or unpleasant consequences to go along with.

    What Jensen is talking about is one of the cases where this kind of slight extrapolation is very useful instead of one where it's borderline criminal. I think we will see many more cases where AI is used, quite effectively, to fill in tedious gaps in the execution of human or even traditional computationally driven creativity. That's AI as a tool to save effort, and not a CEO dream to circumvent hiring real people.
  • fjtorres5591
    On the matter of "human-generated content," the bulk of the legal cases launched are flimsy, based on undocumented assumptions, and totally ignore legal precedents on both sides of the pond.

    The only one that looks to have a leg to stand on is the Getty lawsuit where the infringement of *paywalled* images is the point of contention and they claim the Stable Diffusion output includes at least portions of the Getty watermark.

    Most everything else is heartburn over freely available content that is/was neither paywalled nor fenced off by robots.txt. Well, if you provide content to all comers with no restrictions, you can't after the fact try to impose restrictions you can't even implement. There are no do-overs in the law. And in societies where ethics are an afterthought only invoked when convenient, it is a flimsy challenge to the self-interest of the masses. Legal systems don't generally work retroactively. (Corporate publishers tried that 100+ years ago in the US and it only resulted in the First Sale Doctrine.)

    Note that the NYT lawsuit admits they only fenced off content via robots.txt *after* the chatbots started making money, which suggests the primary interest isn't ethics or principle, but money grabbing. (Which Getty is at least honest about.)

    There is a pervasive theory floating around, particularly in Europe, that just because you built a successful business out of a successful model/set of conditions, you are entitled to profitably exist in perpetuity. That the conditions that allowed you to succeed must be preserved at all costs.

    Isn't that the very essence of Luddism?
    It comes with consequences. Sooner or later the piper demands his due:

    https://www.telegraph.co.uk/business/2024/09/10/eu-elites-are-in-despair-over-europes-economic-death-spiral/
    If you look at the past few decades of "anything-but" media angst (Microsoft, Amazon, Google search, ebooks, SpaceX, etc.), they all boil down to new technologies and business models superseding dated assumptions about the behavior and interests of the masses. (No, people will not willingly pay more for a lesser product/service. Remember Windows N?)

    Time changes things, and the business world, for one, is a Darwinian Red Queen's race. You have to keep up or be left behind, and just because something used to be successful in the past does not entitle anybody to be successful moving forward.

    Without going too far: look at the fading business of cable TV distributors, who for decades refused to provide consumers with à la carte options, only to find consumers abandoning them altogether for the ultimate à la carte distribution system in the form of content-silo paid and ad-supported streaming services. 75% losses in a decade and counting.

    Whining is not a successful business model.
  • sfjuocekr
    No, don't believe this nonsense.

    You don't need AI to destroy image quality even more.

    Most of you self-entitled gamers play games on monitors with sharpness and contrast cranked to the max. Start with the basics first... turn sharpness off and set contrast back to neutral... wow, now you are actually looking at a proper image, and I bet you won't even like it...

    The same shit is true for AI techniques like DLSS: it solves NOTHING of what it was meant to solve... aliasing (super sampling). All it does is add noise, like you did by cranking sharpness and contrast, blur it and then sharpen it...

    It looks like shit in EVERY game, but gamers don't even know what they are looking at in the first place so they shouldn't have an opinion... but they do because it "improves" their FPS?

    No, it does not.

    If I play a game at 1080p with 4x MSAA on a 4K screen, it runs worse than a DLSS 1080p game running at the 4K "performance" setting. It uses more VRAM and it looks like dog shit smeared over my monitor... it just looks bad whenever there is motion.

    Text becomes a blur, lines vanish, details are lost and temporal artifacts are introduced where there shouldn't be in the first place... it literally draws your eyes away from what you need to focus on.

    As a nice example, my nephew keeps turning on TSR because he likes it... it improves his FPS... but every time I show him the game looks like crap he agrees, but then weeks later he has turned it back on! Why? He likes it... he likes looking at bad image quality? No, he has already forgotten what it did and just turned it on because some dog shit guide told him to do so...

    We need to stop pushing bullshit, image destroying "technology" onto players that don't even know what said technology does.

    DLSS makes a 1080p game look worse on a 4K screen by using more resources, so if you had a ridiculous gaming setup that would be fine... but most people out there don't play a 1080p game on a 4090... No, they play it on a 3050 Ti and adding DLSS to a system that is already struggling... is not good advice at all.