• Coelacanth@feddit.nu
    20 hours ago

    Like the other user said, DLSS is literally more power-efficient than native rendering, if power draw is what you care about. You can harp on it all you want for not achieving perfect visual fidelity, especially in modes like Ultra Performance, but efficiency is the whole point of it.
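
    Rough sketch of why, if anyone wants the arithmetic: DLSS renders internally at a fraction of your output resolution and then upscales, so the GPU shades far fewer pixels per frame. The per-axis scale factors below are the commonly cited ones for each mode; treat this as illustrative, since actual power draw depends on a lot more than pixel count.

    ```python
    # Illustrative pixel-count arithmetic for DLSS modes at 4K output.
    # Scale factors are the commonly cited per-axis internal render scales.
    MODES = {
        "Native": 1.0,
        "Quality": 0.667,
        "Balanced": 0.58,
        "Performance": 0.5,
        "Ultra Performance": 0.333,
    }

    out_w, out_h = 3840, 2160  # 4K output resolution

    for mode, scale in MODES.items():
        w, h = int(out_w * scale), int(out_h * scale)
        share = 100 * (w * h) / (out_w * out_h)
        print(f"{mode:>17}: renders {w}x{h} internally ({share:.0f}% of native pixels)")
    ```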

    • NihilsineNefas@slrpnk.net
      19 hours ago

      Oh, I’m not harping on it for visual fidelity. I’m harping on it for being a useless feature that’s become a major part of the generative AI problem with the 50 series, and as a result part of why RAM/GPU/CPU prices are massively overinflated despite barely any increase in actual graphical power over the previous generation.

      • Coelacanth@feddit.nu
        19 hours ago

        You can rage against the machine if that makes you happy, but DLSS is patently not a useless feature. It lets you sacrifice visual fidelity for performance; that’s it. Many people find it useful. Any hardware you buy will be obsolete at some point: you may be able to play new releases at native resolution now, but in a few years your card won’t keep up anymore. Instead of buying a new card, you can keep using your old one and turn on DLSS. That’s useful. DLDSR is also a fantastic use of AI, one that is especially impactful on older games but will make almost any game look better, particularly games that don’t have good native anti-aliasing.

        DLSS is also a very minor part of the AI landscape; in fact, I think the only reason Nvidia hasn’t scrapped selling gaming cards entirely is that they’re part of its “legacy”. If you want to hate on every scrap of AI in existence out of a dogmatic hatred of AI in general, that’s fair enough, but then say so instead of calling a technology useless and inefficient when it’s neither.

        • NihilsineNefas@slrpnk.net
          3 hours ago

          You know what else reduces visual fidelity for more frames, and has worked with every generation of games since the ‘settings’ menu first existed? Running the game on lower graphics settings.

          I call it useless because its only ‘use’ is faking a higher frame rate and letting game developers avoid putting in effort when optimising their games.

          The main gripe I have with it is that Nvidia implemented it and sold it as a new generation of card while barely changing the underlying architecture, pushing a “hey datacenters, buy more cards, they’re designed to work with your AI systems” method of shifting stock, because they knew they could make a quick buck off the AI bubble.

          As a result, the general public gets absolutely shafted with higher prices for essentially the same tech as the previous generation, just with a bit of code in it that plays nicely with the software that’s actively making the world a worse place.

          I don’t have a dogmatic hatred of AI. I very much enjoy the fact that we can now use analytical models to search the absolutely astronomical volume of data coming from observatories and telescopes like Webb and Hubble, without having to put an undergrad through what amounts to mental torture by making them sift through it all by hand.

          What I dislike is that generative AI’s current uses are deepfakes, conspiracy content, short-form video slop, astroturfed political opinions, summaries of wrong answers plastered across every search engine, and some of the worst writing that has ever been put to digital paper, to name a few.