• bbuez · 1 year ago

    Now THAT is the AI innovation I’m here for

    • Lmaydev@programming.dev · 1 year ago

      LLMs are in a position to make boring NPCs much better.

      Once they can be run locally at a good speed, it'll be a game changer.

      I reckon we’ll start getting AI cards for computers soon.
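      A rough sketch of what that NPC loop could look like locally, here with the llama-cpp-python bindings (the model file and persona are placeholders):

      ```python
      from llama_cpp import Llama

      # Placeholder model path; any chat-tuned GGUF file works here.
      llm = Llama(model_path="models/npc-7b.Q4_K_M.gguf", n_gpu_layers=-1, verbose=False)

      # Keep the NPC in character with a fixed system prompt.
      history = [{
          "role": "system",
          "content": "You are Garrick, a grumpy village blacksmith. "
                     "Reply in one or two short sentences and stay in character.",
      }]

      while True:
          history.append({"role": "user", "content": input("You: ")})
          reply = llm.create_chat_completion(messages=history, max_tokens=80)
          text = reply["choices"][0]["message"]["content"]
          history.append({"role": "assistant", "content": text})
          print("Garrick:", text)
      ```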

      • bbuez · 1 year ago

        We already do! And on the cheap! I have a Coral TPU running presence detection on some security cameras. I'm pretty sure they can run LLMs too, but I haven't looked into it.
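        For reference, the detection side is only a few lines with the pycoral bindings (the model file is one of Coral's stock Edge TPU example models):

        ```python
        from pycoral.utils.edgetpu import make_interpreter
        from pycoral.adapters import common, detect
        from PIL import Image

        # One of Coral's stock detection models, compiled for the Edge TPU.
        interpreter = make_interpreter("ssd_mobilenet_v2_coco_quant_postprocess_edgetpu.tflite")
        interpreter.allocate_tensors()

        # A single camera frame, resized to the model's input size.
        frame = Image.open("frame.jpg").resize(common.input_size(interpreter))
        common.set_input(interpreter, frame)
        interpreter.invoke()

        # Class 0 is "person" in Coral's COCO label file.
        people = [o for o in detect.get_objects(interpreter, score_threshold=0.5) if o.id == 0]
        print(f"{len(people)} person(s) detected")
        ```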

        GPT4ALL runs rather well on a 2060, and I can only imagine it runs a lot better on newer hardware.
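        Getting it going with GPU offload is roughly this via the gpt4all Python bindings (the model name is just one from the GPT4All catalog):

        ```python
        from gpt4all import GPT4All

        # Any model from the GPT4All catalog; this one is just an example.
        model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf", device="gpu")  # "cpu" works too

        with model.chat_session():
            print(model.generate("Give me one idea for a side quest.", max_tokens=120))
        ```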