• definitemaybe@lemmy.ca · 16 hours ago

    AI companies have only pre-bought future years’ memory supply for as long as they can keep making their payments.

    OpenAI is losing about $1M USD every 30 minutes, iirc. How can they afford their $1.4T USD in expansion plans through 2030 while bleeding cash at that rate?
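
    To put that burn in scale, here’s a quick back-of-the-envelope (taking the ~$1M-per-30-minutes figure at face value; the numbers are illustrative, not from any filing):

    ```python
    # Annualize a $1M-per-30-minutes burn rate and compare it to the
    # reported ~$1.4T in expansion commitments through 2030.
    burn_per_30_min = 1_000_000                     # USD, figure cited above
    burn_per_year = burn_per_30_min * 2 * 24 * 365  # ~$17.5B per year
    expansion_commitments = 1.4e12                  # USD

    print(f"Annualized burn: ${burn_per_year / 1e9:.1f}B")
    print(f"Commitments are ~{expansion_commitments / burn_per_year:.0f}x one year of that burn")
    ```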

    A year ago, Altman was saying ChatGPT would never have ads, but they’ve already started adding them.

    They have no “moat”: just about anyone can run an LLM with open models right now, and big players (Meta, Google, and others) are already competing directly.
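
    On the “anyone can run an LLM” point, here’s a minimal sketch using Hugging Face transformers; the model ID is just one example of a small open-weight model, not a recommendation:

    ```python
    # Minimal sketch: run a small open-weight LLM locally.
    # The model ID is an example; any similarly sized open chat model works the same way.
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # small enough for a consumer GPU, or CPU-only
    )

    result = generator(
        "Why do open-weight models undermine a proprietary moat? Answer in one sentence.",
        max_new_tokens=60,
    )
    print(result[0]["generated_text"])
    ```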

    All of these companies are facing massive potential legal judgements, too.

    There’s no realistic path to profitability for OpenAI. OpenAI’s own publications call for the company, by itself, to account for 2% of US GDP by 2030. Insanity.
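
    For a sense of what 2% of US GDP means in dollars (the GDP figure below is my own rough assumption, not something from their publications):

    ```python
    # 2% of US GDP in revenue terms; the GDP number is an assumed ballpark for ~2030.
    us_gdp_2030_estimate = 30e12   # USD, assumption for illustration only
    implied_revenue = 0.02 * us_gdp_2030_estimate
    print(f"Implied annual revenue: ${implied_revenue / 1e9:.0f}B")  # ~$600B per year
    ```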

    So, here’s the point for PC gamers: the AI crash is coming, and hardware prices will come crashing down once datacentres can no longer afford to pay for the chips they ordered.

    (Of course, server parts can’t simply be swapped into consumer machines, but manufacturers will retool to meet demand ASAP, since there will be strong pricing incentives to pivot back to consumer hardware faster than the competition.)

    • MalReynolds@slrpnk.net · 8 hours ago

      They have no “moat”

      Pretty sure one of the goals of this BS is digging a moat by choking off competitors’ access to hardware. Shame that LLMs are actually getting smaller and faster for the same (dubious) effectiveness, e.g. Qwen 3.5. It pulls extra duty as a Ponzi scheme and as a way to potentially force most consumer compute into their datacentres (a much better, if more evil, business plan than AI) for that sweet rent-seeking recurring revenue.