• Alaknár@lemm.ee · ↑2 ↓1 · 6 hours ago

      Because it’s the low-end xx60? Also, apparently Nvidia did some high-tech magic that allows higher-res textures to be handled with less VRAM.

      But, yeah, it’s a 5060. You’re not buying this to play in 4k Ultra.

      • WereCat@lemmy.world · ↑5 · edited · 4 hours ago

        This video clearly shows the NVIDIA magic is pooping in your own pants.

        And the card even struggles at 1080p with 8GB…

    • Avg@lemm.ee · ↑2 · 7 hours ago

      You would need so many channels for that to be viable.

  • WormFood@lemmy.world · ↑54 · 19 hours ago

    it is 2019, the 2060 super has 8gb of vram. it is 2020, the 3060 ti has 8gb of vram. it is 2023, the 4060 ti has 8gb of vram. it is 2025, the 5060 ti has 8gb of vram.

  • Ulrich@feddit.org · ↑76 · edited · 22 hours ago

    tl;dw: some of the testing shows 300-500% improvements on the 16GB model. Some games are completely unplayable on 8GB while delivering an excellent experience on the 16GB.

    It really does seem like Nvidia is intentionally trying to confuse their own customers for some reason.

    • empireOfLove2@lemmy.dbzer0.com · ↑30 · edited · 21 hours ago

      It really does seem like Nvidia is intentionally trying to confuse their own customers for some reason.

      It’s more so for OEM system integrators, who can buy up thousands of these 8GB 5060 Tis and sell them in systems simply as “5060 Tis”, and the average Joe who buys prebuilts won’t know to go looking at the bottom half of the spec sheet to see whether it’s an 8 or a 16.
      As well as, yes, directly scamming consumers, because Jensen needs more leather jackets off the AI craze and couldn’t give a rat’s ass about gamers.

      • MBech@feddit.dk · ↑4 · edited · 20 hours ago

        I agree that they don’t give half a shit about their actual product, but their biggest competitor has never been more competitive, and Nvidia knows it. Pissing off your customer base when you don’t have a monopoly is fucking stupid, and Nvidia and the prebuilt manufacturers know this. It’s business 101.

        There’s gotta be something else. I know businesses aren’t known for making long-term plans, because all that will ever matter to them is short-term profits. But this is just way too stupid to be explained by that alone.

        • empireOfLove2@lemmy.dbzer0.com · ↑4 · 13 hours ago

          There’s gotta be something else.

          That something else is that they don’t need the gamer market. Providing consumer cards is literally an inconvenience for them at this point: they make $2 billion a quarter from gaming cards but $18 billion on datacenter compute, with some insane 76% gross margins on the products they sell there (to continue funding R&D).

    • inclementimmigrant@lemmy.world (OP) · ↑14 · edited · 21 hours ago

      To me it sounds like they’re preying on gamers who aren’t tech-savvy or are desperate. Just a continuation of being anti-consumer and anti-gamer.

      • NekuSoul@lemmy.nekusoul.de · ↑1 · 14 hours ago

        Yup. This is basically aimed at people who only know that integrated GPUs are bad and that they need a dedicated card, so system manufacturers can build a prebuilt that technically checks that box for as little money as possible.

      • Ulrich@feddit.org · ↑3 · 20 hours ago

        Okay, well, that’s the low-hanging fruit, but explain the correlation to me: how does confusing their customers fuel their greed?

          • Ulrich@feddit.org · ↑4 · edited · 20 hours ago

            So their strategy is making and selling shitty cards at high prices? Don’t you think that would just make consumers consider a competing brand in the future?

            • gaael@lemm.ee · ↑5 · 20 hours ago

              For most consumers it might not; the amount of Nvidia advertising and propaganda in games is huge.

            • MBech@feddit.dk · ↑2 · 20 hours ago

              Yeah, I don’t know why buying a shitty product should convince me to throw more money at the company. They don’t have a monopoly, so I would just go to their competitor instead.

    • mindbleach@sh.itjust.works · ↑3 · 21 hours ago

      They had trouble increasing memory even before this AI nonsense. Now they have a perverse incentive to keep it low on affordable cards, to avoid undercutting their own industrial-grade products.

      Which only matters thanks to anticompetitive practices leveraging CUDA’s monopoly. Refusing to give up the fat margins on professional equipment is what killed DEC. They successfully miniaturized their PDP minicomputers while personal computers became serious business, but they refused to let those run existing software. They crippled their own product and the market destroyed them. That can’t happen here, because ATI is not allowed to participate in the inflated market of… linear algebra.

      The flipside is: why the hell doesn’t any game work on eight gigabytes of VRAM? Devs. What are you doing? Does Epic not know how a texture atlas works?

      • schizo@forum.uncomfortable.business · ↑3 · edited · 18 hours ago

        The flipside is: why the hell doesn’t any game work on eight gigabytes of VRAM? Devs. What are you doing? Does Epic not know how a texture atlas works?

        It’s not that they don’t work.

        Basically what you’ll see is kinda like a cache miss, except the stall time to go “oops, don’t have that” and fetch the required bits is very slow, so you can see 8GB cards getting 20 fps and 16GB ones getting 40 or 60, simply because the path to get the missing textures is fucking slow.

        And worse, you’ll get big framerate dips and the game will feel like absolute shit because you keep running into hitches loading textures.

        It’s made worse in games where you can’t reasonably predict what texture you’ll need next (e.g. Fortnite and other online things that are, you know, played by a lot of people), but even in games where you can reasonably guess, you still run into the simple fact that textures in a modern game are higher quality, and thus bigger, than the ones you had 5 years ago, so 8GB in 2019 and 8GB in 2025 are not equivalent.

        It cripples the performance of a GPU that could otherwise perform substantially better, all to save a relatively small amount on the BOM. They’re trash, and should all end up in the trash.
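
        A rough back-of-the-envelope sketch of why that fetch path hurts so much (the bandwidths and spill size below are illustrative assumptions, not measured 5060 Ti specs):

        ```python
        # Illustrative numbers only: assumed bandwidths, not measured specs.
        VRAM_BW_GBPS = 448      # on-card GDDR bandwidth (assumption)
        PCIE_BW_GBPS = 32       # PCIe 5.0 x8 link back to system RAM (assumption)
        FRAME_BUDGET_MS = 16.7  # one frame at 60 fps

        def fetch_ms(megabytes: float, gbps: float) -> float:
            """Milliseconds to move `megabytes` of texture data at `gbps` GB/s."""
            return megabytes / 1024 / gbps * 1000

        spill_mb = 200  # textures that didn't fit in VRAM this frame (assumption)
        print(f"from VRAM: {fetch_ms(spill_mb, VRAM_BW_GBPS):.2f} ms")  # ~0.4 ms
        print(f"over PCIe: {fetch_ms(spill_mb, PCIE_BW_GBPS):.2f} ms")  # ~6 ms of a 16.7 ms budget
        ```

        A few milliseconds of surprise fetches per frame is enough to turn a steady 60 fps into the hitchy 20-40 fps described above.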

        • mindbleach@sh.itjust.works · ↑2 · 16 hours ago

          That’s what I’m on about. We have the technology to avoid going ‘hold up, I gotta get something.’ There’s supposed to be a shitty version that’s always there, in case you have to render it by surprise, and say ‘better luck next frame.’ The most important part is to put roughly the right colors onscreen and move on.

          id Software did this on Xbox 360… loading from a DVD drive. Framerate impact: nil.
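
          A minimal sketch of that “better luck next frame” idea, in the spirit of virtual texturing; the names and structure here are hypothetical, not id’s actual implementation:

          ```python
          # Always keep a tiny fallback resident; never stall the frame on a miss.
          MAX_MIP = 10                    # coarsest mip level (assumption)
          FALLBACK_1PX = (128, 128, 128)  # always-resident flat colour

          resident = {}       # (texture_id, mip) -> data currently in VRAM
          stream_queue = []   # tiles requested but not yet uploaded

          def sample(texture_id: int, wanted_mip: int):
              """Return the best data available *right now*; queue the rest."""
              if (texture_id, wanted_mip) in resident:
                  return resident[(texture_id, wanted_mip)]
              stream_queue.append((texture_id, wanted_mip))   # better luck next frame
              for mip in range(wanted_mip + 1, MAX_MIP + 1):  # coarser mips are tiny
                  if (texture_id, mip) in resident:
                      return resident[(texture_id, mip)]
              return FALLBACK_1PX  # roughly the right colours, and move on
          ```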

          • PenguinTD@lemmy.ca · ↑1 · 12 hours ago

            The virtual texture tech is not almighty, and you can still run into situations where, if the allocation is smaller than you need, you hit page swaps. It acts similarly to a traditional cache miss once you cross a certain threshold, because you can’t keep enough “tiles” in memory. Texture-quality popping and then stuttering are the symptoms, progressing as the allocated VRAM goes from slightly lower than needed to severely insufficient.
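
            A toy illustration of that threshold, using a made-up LRU tile pool (numbers are hypothetical):

            ```python
            from collections import OrderedDict

            def misses_per_frame(pool_tiles: int, working_set: int, frames: int = 10) -> float:
                """Average streaming misses per frame for a toy LRU tile pool."""
                pool, misses = OrderedDict(), 0
                for _ in range(frames):
                    for tile in range(working_set):       # same tiles visible every frame
                        if tile in pool:
                            pool.move_to_end(tile)
                        else:
                            misses += 1
                            pool[tile] = True
                            if len(pool) > pool_tiles:
                                pool.popitem(last=False)  # evict least-recently-used tile
                return misses / frames

            print(misses_per_frame(pool_tiles=1000, working_set=900))   # fits: misses only during warm-up
            print(misses_per_frame(pool_tiles=1000, working_set=1100))  # just over: every tile misses, every frame
            ```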

  • filister@lemmy.world · ↑42 ↓1 · 20 hours ago

    The fact that NVIDIA is not allowing AIBs to send the 8GB card to reviewers is quite telling. They are simply banking on tech-illiterate purchasers and system integrators to sell this variant. That’s another low for NVIDIA, but hardly surprising to anyone.

    Planned obsolescence.

    • HeyJoe@lemm.ee · ↑3 · 16 hours ago

      I agree, but it is still crazy that there are people out there making $500-plus purchases without the smallest bit of research. I really hope this card fails, simply because it deserves to.

  • kugmo@sh.itjust.works · ↑16 ↓9 · edited · 12 hours ago

    On the flip side, every game worth playing uses 2GB of VRAM or less at 1080p.

      • notthebees@reddthat.com · ↑3 · 4 hours ago

        I have a game that eats 11GB of VRAM on low at 1080p (I play it windowed). It suffers from some Unreal Engine shenanigans, and it’s also a few years old.

        • ihatefascist@lemm.ee · ↑1 · 1 hour ago

          UE is probably the worst engine ever made; even games from 20 years ago look better than that blurry mess of an engine. I hope nobody makes games on it anymore. Most of them are also badly optimized; I never understood why people like that engine.

      • amorangi@lemmy.nz · ↑4 · 12 hours ago

        Video editing and AI require as much VRAM as you can get. Not everyone uses the cards just for gaming.
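
        Some very rough numbers on why (illustrative assumptions; weights-only and frames-only, ignoring all overhead):

        ```python
        # Back-of-the-envelope VRAM needs outside gaming; all figures are assumptions.
        def llm_weights_gb(params_billion: float, bytes_per_param: int = 2) -> float:
            """FP16 weight memory for a model, ignoring KV cache and activations."""
            return params_billion * bytes_per_param  # 1e9 params * bytes, reported in GB

        def cached_frames_gb(width: int, height: int, frames: int, bytes_per_px: int = 8) -> float:
            """A pool of 16-bit RGBA frames, as a video editor might keep for scrubbing."""
            return width * height * bytes_per_px * frames / 1e9

        print(f"7B model, FP16 weights: {llm_weights_gb(7):.0f} GB")               # ~14 GB: won't fit in 8
        print(f"60 cached 4K frames:   {cached_frames_gb(3840, 2160, 60):.1f} GB") # ~4 GB before anything else
        ```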

        • Alaknár@lemm.ee · ↑4 ↓2 · 6 hours ago

          Then don’t buy the current-gen low-end card for video editing, mate. Get a previous-gen card with more VRAM, or go AMD.

      • real_squids@sopuli.xyz · ↑3 · edited · 11 hours ago

        Depends on the use case. And even at 1080p there are quite a few games that use 8GB or close to it. Ghostrunner and LOTF (2023) come to mind. Although, tbf, I played the first one on my RX 580 8GB (I think) almost maxed out and it did fine.

        But if you’re buying a card now, especially at today’s new-card prices, you want at least a bit of future-proofing.

    • jnod4@lemmy.ca · ↑1 ↓1 · 12 hours ago

      I don’t even have a GPU, but to be honest I don’t even game anymore because I work more hours than there are in a day.