• Grofit@lemmy.world
    3 months ago

    Are you talking specifically about LLMs, or neural-network-style AI in general? Supercomputers have been doing this sort of stuff for decades without much problem, and tbh the main cost for LLMs is in training; inference is pretty computationally cheap.

    • UnderpantsWeevil@lemmy.world
      3 months ago

      Super computers have been doing this sort of stuff for decades without much problem

      Idk if I’d point at a supercomputer system and suggest it was constructed “without much problem”. Cray has significantly lagged the computer market as a whole.

      the main cost for LLMs is in training; inference is pretty computationally cheap

      Again, I would not consider anything in the LLM marketplace particularly cheap. The major providers seem to be losing money rapidly.