• dreugeworst@lemmy.ml · 26 points · edited · 7 days ago

      afaict they’re computers with a GPU that has some hardware dedicated to the kind of matrix multiplication common in inference on current neural networks. Pure marketing BS, because most GPUs come with that these days, and some will still not be powerful enough to be useful
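      (As a rough illustrative sketch of why dedicated matmul hardware matters: a dense matrix multiply of an m×k matrix by a k×n matrix costs about 2·m·k·n floating-point operations. The layer sizes below are hypothetical, just to show the scale.)

      ```python
      # Sketch, not a real benchmark: FLOP count for a dense matrix multiply.
      # Each output element needs k multiplies and k adds, hence the factor of 2.

      def matmul_flops(m: int, k: int, n: int) -> int:
          """Approximate FLOPs for a dense (m x k) @ (k x n) matrix multiply."""
          return 2 * m * k * n

      # Hypothetical transformer-style layer: a (1 x 4096) activation vector
      # multiplied by a (4096 x 4096) weight matrix.
      print(matmul_flops(1, 4096, 4096))  # 33554432 ops for one such multiply
      ```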

      • blarth@thelemmy.club · 2 up, 1 down · 6 days ago

        This comment is the most important one in this thread. Laptops already had GPUs. Does the Copilot button actually result in you conversing with an LLM locally, or is inference done in the cloud? If the latter, it’s even more useless.

      • Gutek8134@lemmy.world · 8 points · 7 days ago

        IDK if the double pun was intended, but FLOPS is a measure of how many (floating-point) operations a computer can perform per second
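        (A minimal sketch of what that measurement means: run a known number of floating-point operations, time them, and divide. This pure-Python loop is a toy estimate, orders of magnitude below what the hardware itself can do, since the interpreter dominates the cost.)

        ```python
        import time

        def measure_flops(n: int = 1_000_000) -> float:
            """Crude FLOPS estimate: time n iterations of a multiply-add loop."""
            x = 1.0000001
            acc = 0.0
            start = time.perf_counter()
            for _ in range(n):
                acc = acc + x * x  # 2 floating-point ops per iteration
            elapsed = time.perf_counter() - start
            return (2 * n) / elapsed  # operations per second

        print(f"{measure_flops():.2e} FLOPS (pure-Python toy estimate)")
        ```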