• arthurpizza@lemmy.world
    11 months ago

    It’s not like that, though. Newer phones are going to have dedicated hardware for running neural networks, LLMs, and other generative tools locally, and that hardware will let these processes barely sip battery.

    • MenacingPerson@lemm.ee
      11 months ago

      Wrong.

      If that existed, all those AI server farms wouldn’t be so necessary, would they?

      Dedicated hardware for this already exists, and it still isn’t going to fit a sizeable model on a phone any time soon. The models themselves take tens of gigabytes of storage, so you couldn’t fit more than a handful even on 512 GB of internal storage. Phones don’t have anywhere near the RAM these models need, and the dedicated hardware still draws far more power than a tiny phone battery can supply.
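
      A rough sketch of the storage math behind that claim (the parameter counts and precisions below are illustrative assumptions, not measurements of any specific model): full-precision weights are where the “tens of gigabytes” comes from, while aggressive quantization shrinks them.

      ```python
      # Rough memory-footprint math for LLM weights at different precisions.
      # Parameter counts and precisions are illustrative assumptions only.
      BYTES_PER_PARAM = {
          "fp16": 2.0,   # 16-bit weights, typical for data-center serving
          "int8": 1.0,   # 8-bit quantization
          "q4":   0.5,   # ~4-bit quantization, common for on-device use
      }

      def weight_size_gb(n_params_billions: float, precision: str) -> float:
          """Approximate size of the weights alone, in gigabytes."""
          # params (in billions) * bytes per param ~= gigabytes
          return n_params_billions * BYTES_PER_PARAM[precision]

      for params in (7, 13, 70):
          row = ", ".join(
              f"{p}: {weight_size_gb(params, p):.1f} GB" for p in BYTES_PER_PARAM
          )
          print(f"{params}B parameters -> {row}")
      ```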

      • arthurpizza@lemmy.world
        11 months ago

        Those server farms exist because the needs of corporations are just different from the needs of regular users.

        I’m running an 8 GB LLM locally on my PC that performs better than 16 GB models from just a few months ago (a rough sketch of that kind of setup follows below).

        It’s almost as if technology can get better and more efficient over time.
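
        A minimal sketch of that kind of local setup, using the llama-cpp-python bindings with a quantized GGUF file; the model path, context size, and prompt are placeholders, not the exact configuration described above:

        ```python
        # Minimal local-inference sketch with llama-cpp-python.
        # Assumes a ~4-bit quantized GGUF model already downloaded to disk;
        # the filename below is a placeholder, not a specific model.
        from llama_cpp import Llama

        llm = Llama(
            model_path="./models/example-8b-q4_k_m.gguf",  # placeholder path
            n_ctx=4096,        # context window size
            n_gpu_layers=-1,   # offload all layers to a GPU if one is present
        )

        result = llm(
            "Q: Why do quantized models fit on consumer hardware? A:",
            max_tokens=128,
            stop=["Q:"],
        )
        print(result["choices"][0]["text"].strip())
        ```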