• ɐɥO
    8 points · 11 months ago

    You don't need that much power. Something like an RX 6600 XT / RTX 3060 / RX 580 is plenty.

    • @[email protected]
      7 points · edited · 11 months ago

      Is support for AMD cards better these days? Last time I checked, it involved verifying ROCm compatibility, because CUDA works only with Nvidia cards.
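
      For reference, a minimal sketch of that compatibility check, assuming a ROCm build of PyTorch is the backend in question (ROCm reuses the CUDA-style API on AMD cards):

      ```python
      import torch

      # ROCm builds of PyTorch route the HIP backend through the familiar
      # torch.cuda API, so the same calls report whether an AMD card is usable.
      print(torch.cuda.is_available())  # True if a supported GPU is visible
      print(torch.version.hip)          # ROCm/HIP version string on ROCm builds, None on CUDA builds

      if torch.cuda.is_available():
          print(torch.cuda.get_device_name(0))  # marketing name of the detected card
      ```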

      • ɐɥO
        6 points · 11 months ago

        GPT4All worked out of the box for me.
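
        If it helps, a minimal sketch of that "out of the box" usage with the gpt4all Python bindings; the model filename and the device="gpu" argument here are assumptions, so substitute any model from the GPT4All catalog:

        ```python
        from gpt4all import GPT4All

        # Downloads the model on first use; the filename is just an example
        # taken from the GPT4All catalog.
        model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf", device="gpu")

        with model.chat_session():
            # Generate a short reply locally.
            print(model.generate("Why is the sky blue?", max_tokens=256))
        ```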