• Pennomi
    15 points · 19 months ago

    Woah there, I’m not sure I’m ready for that level of commitment.

      • ɐɥO
        8 points · 9 months ago

        You don't need that much power. Something like an RX 6600 XT, RTX 3060, or RX 580 is plenty.

        • @[email protected]
          7 points · edited · 9 months ago

          Is support for AMD cards better these days? Last time I checked, it involved verifying ROCm compatibility, since CUDA is exclusive to Nvidia cards.
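          As an aside, one quick way to check this today: ROCm builds of PyTorch reuse the CUDA device API, so `torch.cuda.is_available()` also returns True on a supported AMD card, and `torch.version.hip` distinguishes a ROCm build from a CUDA one. A minimal sketch (the `gpu_backend_report` helper is made up for illustration; it assumes you may or may not have PyTorch installed):

          ```python
          def gpu_backend_report():
              """Report which GPU backend (if any) this PyTorch build can use."""
              try:
                  import torch
              except ImportError:
                  return "torch not installed"
              if not torch.cuda.is_available():
                  # Covers both "no GPU" and "unsupported GPU" cases
                  return "no GPU backend available"
              # ROCm builds set torch.version.hip; CUDA builds set torch.version.cuda
              if getattr(torch.version, "hip", None):
                  return f"ROCm {torch.version.hip}"
              return f"CUDA {torch.version.cuda}"

          print(gpu_backend_report())
          ```

          On an RX 6600 XT with a ROCm wheel installed this would print something like `ROCm 5.x`, while the same script on an Nvidia box reports the CUDA version instead.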

          • ɐɥO
            6 points · 9 months ago

            GPT4All worked out of the box for me.