• brucethemoose@lemmy.world · 4 days ago

    Not as many as you’d think. The 5000 series is not great for AI because the cards have very little VRAM for their price.

    4x3090 or 3060 homelabs are the standard, heh.

      • brucethemoose@lemmy.world · 4 days ago

        Yeah. What does that have to do with home setups? No one is putting an H200 or L40 in their homelab.

          • brucethemoose@lemmy.world · 4 days ago

            It mentions desktop GPUs, which are not part of this market cap survey.

            Basically I don’t see what the server market has to do with desktop dGPU market share. Why did you bring that up?

    • MystikIncarnate@lemmy.ca · 3 days ago

      Who the fuck buys a consumer GPU for AI?

      If you’re not doing it in a home lab, you’ll need more juice than anything an RTX 3000/4000/5000/whatever-000 series card could offer.

      • brucethemoose@lemmy.world · 3 days ago

        “Who the fuck buys a consumer GPU for AI?”

        Plenty. Consumer GPU + CPU offloading is a pretty common way to run MoEs these days, and not everyone will drop $40K just to run DeepSeek in CUDA instead of hitting an API or something.

        I can (just barely) run GLM-4.5 on a single 3090 desktop.
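        As a rough sketch of what that offloading looks like in practice (assuming a llama.cpp setup; the model filename and tensor-name regex here are illustrative, not taken from the thread):

        ```shell
        # Illustrative llama.cpp invocation (filename and regex are assumptions).
        # -ngl 99 asks for all layers on the GPU, then -ot overrides the large
        # MoE expert tensors (the .ffn_*_exps. weights) to stay in system RAM,
        # so the CPU handles the experts while the GPU runs everything else.
        llama-cli -m glm-4.5-q4_k_m.gguf \
          -ngl 99 \
          -ot ".ffn_.*_exps.=CPU" \
          -p "Hello"
        ```

        The point of splitting it this way is that only a few experts activate per token, so the expert weights tolerate slow system RAM far better than the always-hot attention layers do.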

        • MystikIncarnate@lemmy.ca · 3 days ago

          … Yeah, for yourself.

          I’m referring to anyone running an LLM for commercial purposes.

          Y’know, 80% of Nvidia’s business?

          • brucethemoose@lemmy.world · 3 days ago

            I’ve kinda lost this thread, but what does that have to do with consumer GPU market share? The servers are a totally separate category.

            I guess my original point was agreement: the 5000 series is not great for ‘AI’, not as great as everyone makes it out to be, to the point where folks who can’t drop $10K on a GPU are picking up older cards instead. But if you look at download stats for these models, there is real interest in running stuff locally instead of using ChatGPT, just as people are interested in internet-free games, or Lemmy instead of Reddit.