• panda_abyss@lemmy.ca · +2 / −3 · 2 hours ago

        This guy’s videos are super long and he spends half the time hawking crap while going on tangents.

          • rumba@lemmy.zip · +2 / −2 · 2 hours ago

            He’s not everyone’s cup of tea. I’m delighted he’s intent on unearthing things, but watching his stuff is work for me. He’s a bit of a rabble-rouser and doesn’t make much of an attempt to stay balanced on a subject; it’s the ‘calls it as he sees it’ kind of thing that I tend to stay away from. When he starts going off on speculation, you have to keep an eye out for other explanations. I think he needs to add a good dose of Hanlon’s razor to his reporting. It’s great that he unearths things, but I’d rather get his findings filtered through another party that has the time to question them before they’re presented to me.

            And lest I get accused of being an LTT fanboy, they suck in the opposite way. When GN and LTT got in their ‘fight’ I stopped watching them both.

    • MalReynolds@slrpnk.net · +53 · 19 hours ago

      The endgame of RAM/SSDs being pushed out of consumers’ hands by billionaires and corpos is the end of personal computing: ‘You will own nothing and subscribe to everything’. Compute (which this video is about), but also housing (rent), transportation (car leases), and anything else they can get their greedy, rent-seeking, oligarchic, technofeudalist hands on.

    • ryokimball@infosec.pub · +79 · edited · 21 hours ago

      Nvidia is cutting back on the consumer market significantly, primarily selling to commercial entities instead.

      • Wildmimic@anarchist.nexus · +16 · 18 hours ago

        And private equity is pulling the same shit they did with the housing market: buy up global silicon wafer production so costs go up, then rent it back to you while externalizing the costs of the data centers and electricity. This goes hand in hand with Nvidia and the like extracting money from the Pentagon and getting tax breaks, externalizing costs further. This isn’t about building your PC anymore; we’re just the first ones to feel it. Memory is in fucking everything, and it’s a market that’s easily cornered if you have a lot of money to throw around, because production capacity is limited.

      • Scratch@sh.itjust.works · +21 · 21 hours ago

        It’s not a dig against Nvidia, imo. All companies do this if they can. AMD siphoned off chips for enterprise from both its CPU and GPU lines, because the premiums you can charge are so much higher.

        • VieuxQueb@lemmy.ca · +3 · 5 hours ago

          Ford just shut down the new Lightning factory to make batteries for data centers instead. It’s just a great way to make sure prices keep going up: create scarcity and sell at extra markup.

        • Naz@sh.itjust.works · +23 · 21 hours ago

          $3200 for a 5090 or $22,000 for an H200

          The choice is clear, but Nvidia was built by the low-end and enthusiast GPU market…

            • cardfire@sh.itjust.works · +4 · 7 hours ago

              The GTX 750 Ti was an absolute champ even for playing games at 1440p, and I would argue the GTX 1060/1050 were worthy spiritual successors.

              20 years ago you were looking at an MX440, over AGP if you were lucky.

              I think they had many market segments covered, even with their Shield products.

              It’s clear that where they’re going, tech enthusiasts/consumers won’t sustain them, though. They’ll have to overextend themselves in corpo bullshit and either become just another nightmare conglomerate, or die trying.

              • rumba@lemmy.zip · +1 · 2 hours ago

                Bad news! They’re taking it so far the other way it’s even worse!

            • MalReynolds@slrpnk.net · +3 · 17 hours ago

              Maybe that’s just something they say, maybe not. If not, the laws need to change; if so, they need reining in.

              • Scratch@sh.itjust.works · +1 · 2 hours ago

                CEOs are obliged to act in the best interests of the company (as opposed to the shareholders, as I stated previously). 99.99% of the time, that means increasing market share, revenue, and margins while cutting costs as close to the bone as possible.

        • ZkhqrD5o@lemmy.world · +1 / −1 · 6 hours ago

          It doesn’t work that way. AMD is mostly in the customised-hardware business (PlayStation, Steam Deck, etc.) and just sells knockoff products in the GPU space to make some money on the side. You may hate Nvidia for the VRAM gimping, trash drivers, and bad vision for computer graphics, but at least they have a clear vision, even if that vision is blurry, temporally smeared graphics. Nvidia is actually doing something smart: it tells developers “you can save time and money with our stuff”, and once everyone uses it, people are forced to buy Nvidia GPUs because everything runs exclusively on CUDA. This already happened in the productivity and machine-learning spaces. Intel and AMD just follow that vision with the machine-learning infestation of computer-generated graphics. AMD’s past bets were failures, and Intel can’t afford risks right now, so no wonder they’re just doing whatever Nvidia does minus 100€.

          That’s the problem with computing: good luck competing with them. That’s also why EU companies like Infineon, STMicroelectronics, and NXP Semiconductors just don’t. Their pockets are deep, but not deep enough to force their way in. Maybe once the bubble pops and Nvidia goes the way of IBM, maybe then. One can hope.

        • cassandrafatigue@lemmy.dbzer0.com · +7 · 14 hours ago

          That doesn’t really work with fancy computer stuff. The supply lines are too long, and there are maybe a dozen factories on Earth that can make modern high-end chips. Fewer, I think.

        • Ilixtze@lemmy.ml · +11 · edited · 18 hours ago

          This is my sentiment exactly. This is a perfect opportunity for China to step into the market with ARM computers, and at just the right time, when Valve is investing heavily in running Windows games on ARM. Unsurprising for American vendors to step on their dicks and shoot themselves in the foot out of sheer arrogance.

          • MalReynolds@slrpnk.net · +8 · edited · 15 hours ago

            Yes, or better yet, RISC-V; it’s within spitting distance of viability, and China could push it over the line. Pretty sure Valve would get on board; they’re pretty good at porting Proton now. If they can get modern DRAM (e.g. GDDR7 or HBM) fabs going, to the moon (perhaps literally ;). Likely simpler than a CPU for first-gen EUV.

            ETA: Ask and ye shall receive, China’s EUV prototype https://www.heise.de/en/news/Report-China-is-said-to-have-a-functioning-EUV-lithography-system-11121936.html

            It’s on, baby, at least by the 2028 ETA. (I hope sooner; they’re talking about an HBM graphics card next year elsewhere (likely manufactured at TSMC, though), so they know what to do with it once it works. And given the current situation, I expect China to just throw money at the project; it’d be a huge win.)

          • ShadowRam@fedia.io · +1 · 12 minutes ago

            That’s exactly my point?!

            There is currently none, but now they are opening up a void to allow one in.

          • mnemonicmonkeys@sh.itjust.works · +1 · 3 hours ago

            I just switched from an RTX 3070 Ti to an RX 9070 XT and the constant weird graphics glitches stopped. Nvidia drivers have gotten worse than AMD drivers over the years, even on Windows.

          • Sal@lemmy.world · +19 / −2 · 20 hours ago

            AMD is actually better now, considering there’s a very high chance Nvidia is vibe-coding their drivers.

            • Prove_your_argument@piefed.social · +3 · 6 hours ago

              I agree.

              Didn’t stop my brother from buying his wife a 5060 Ti build for SketchUp. I suggested he give her his 4070 Ti and use a 9070 XT himself, because why buy a garbage card, and… nope… Nvidia only. I literally said “You could go with AMD and just give her the 4070 Ti” and he literally replied with “fuck no U_U”

              Eventually Nvidia will make mistakes and probably end up like Intel on the consumer side, but I have a coworker I just helped with a build who refused to even consider AMD, even after the Intel catastrophe; she’s too much of a brand loyalist (not at all a gamer, an IT infrastructure worker).

              People have shit on AMD for so long, and to be fair, for a while AMD really was behind, but it feels like the script has flipped and the people in my circles haven’t realized it. I know the overall consumer market has been moving toward AMD processors, but I just don’t see AMD winning on the graphics front.

              Guessing we’ll see AMD keep Nvidia somewhat grounded on GPU prices, since AMD doesn’t have much market share in the AI/ML space, but they’ll certainly allow for some price inflation, since it inflates their own margins.

            • Leon@pawb.social · +10 · 19 hours ago

              I’m strongly considering selling my 4070 Super and picking up a 9070 XT. I’m sick to death of Nvidia and their garbage. Hope they fucking go bankrupt.

          • BlackLaZoR@fedia.io · +10 · 19 hours ago

            Their ROCm stack for compute isn’t as good as CUDA, but they’re closing the gap year after year. Their open GPU drivers on Linux are far superior to Nvidia’s.

        • frongt@lemmy.zip · +7 · 20 hours ago

          It doesn’t, because there is no excess of parts. Nearly all the chips are already going to Nvidia.

      • ryper@lemmy.ca · +10 · 20 hours ago

        Their justification is basically that the RAM shortage is going to drive up prices and drive down consumer demand, which is probably right. The companies building data centers don’t seem to be as price-sensitive.

      • Leon@pawb.social · +9 · 19 hours ago

        Well, the good thing about that is that when you have nothing to lose, you have everything to gain, and there’s a particular class of people with everything to lose.

        • Strath@lemmy.world · +3 · edited · 14 hours ago

          “Freedom’s just another word for nothing left to lose.” – Janis Joplin (written by Kris Kristofferson)