Hi all, I have transitioned my desktop PC to Linux and am really liking it so far. I recently bought Oblivion Remastered, but it seems like it’s too much for my old 1060 6GB to handle. So now I’m looking at what options are available, and would obviously like to get one that works well with Linux. Since I don’t game as much anymore, I don’t think I can justify spending much more than €300 on it. I haven’t looked at the GPU market for 8 years now, so I don’t know what’s going on. What advice do you people have? I have looked at the 4060, the 7600 and the 7600 XT, but I’m not sure if they are good value; I’m getting mixed info online.

I appreciate any help and advice you people have.

  • bw42@lemmy.world
    1 day ago

For Linux I would recommend AMD or Intel GPUs. They are less hassle to get up and running.

I’m currently running an Intel Arc A770 and it’s been running great. It was a lot more affordable than recent AMD or Nvidia cards.

    • Harisfromcyber@lemmy.world
      1 day ago

      I have personally seen more support for AMD GPUs during my research into Linux compatibility. I would love to hear your take on the Intel side. In your experience, how would you rate your Intel GPU experience with Linux gaming?

      • bw42@lemmy.world
        1 day ago

Getting it up and running was as simple as swapping my AMD RX 580 for the Intel Arc A770. The drivers are open source and built into the kernel and Mesa, so it was picked up and started working without issues.

It’s run every game I have smoothly at max settings. I haven’t had to turn down the graphics settings on a game yet. Though to be fair, the most graphics-intensive game I play is No Man’s Sky.

The only issue I encountered was that when I first got it, I had to tell No Man’s Sky to use the Xe Vulkan driver instead of trying to use the old Intel HD one. I’ve reinstalled No Man’s Sky a few times since then, and newer updates detect the Vulkan support properly.
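If anyone wants to verify which Vulkan driver their card is actually exposing before blaming a game, a quick sanity check (assuming the vulkan-tools package is installed; the package name varies by distro) is:

```shell
# List the Vulkan devices and drivers the loader sees.
# On an Arc card this should report the Mesa ANV driver, not the old Intel HD stack.
vulkaninfo | grep -iE 'driverName|deviceName'
```

If a game still picks the wrong device, most Vulkan setups let you override it with the `VK_ICD_FILENAMES` environment variable pointing at the right ICD JSON.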

I was impressed enough with its performance that I bought a second one to upgrade my wife’s computer. She has been using that system to do modeling in Blender and hasn’t had any issues with it that I’ve heard of.

        I have been quite happy with the Intel Arc card. If they are still making them when I do my next upgrade I will likely get another.

  • zpteixeira@lemm.ee
    1 day ago

I would try to go for a used 6800 XT. I snatched one for 320€ some months ago, and it should be easy to find one for 300€ now. If you want performance for your money and you use Linux, that would be my way to go.

  • anamethatisnt@sopuli.xyz
    1 day ago

    I went for an AMD Radeon RX 6650 XT when I built my machine and I dearly regret not aiming for 16GB memory on my GPU. Even when playing at 1080p I’ve had games fill my GPU memory and then crawl to a halt for a short while until enough memory has been freed.
    AMD Radeon RX 9060 XT is supposed to be released in the middle of May and will come in both 8GB and 16GB flavours. I’d aim for the one with 16GB even if the estimated price point is $330-$380 according to the sites I’ve checked.

    • hellofriend@lemmy.world
      1 day ago

      6650XT gang. Been a good card for me so far. The only game I’ve had crawling to a halt so far is the Oblivion remaster (refunded after realizing it ran like dogshit, not just on my system, but on any system). Was the best option at the time (late 2024) for my money and I needed an upgrade badly. I might upgrade to the 9060 XT for raytracing though since it’s dogass on the 6650XT. That is my only regret so far.

  • INeedMana@lemmy.world
    1 day ago

When catching up with hardware performance for Linux gaming, I always browse Phoronix and try to find a few comparisons from different years to see how a card looked compared to other options. There might be some sleepers no one remembers anymore that you suddenly have the option to buy. Think about which game in the comparisons might have requirements similar to what you want to play, and see how the card/CPU did at the settings you find agreeable/non-agreeable/perfect.

    Don’t go into its forum, though. There be dragons

Maybe that has recently started to change with NVIDIA opening their drivers, but for years we’ve been second-class citizens to them. Personally I say “fuck NVIDIA”.

If you decide to go AMD, definitely explore the landscape of fan controllers. I use CoreCtrl, but maybe you would prefer something else (this is the Arch wiki, but it should be fine for other distros too).
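For what it’s worth, CoreCtrl’s manual fan curves and clock/voltage controls on amdgpu need the driver’s overdrive feature bit enabled. The usual approach is a kernel parameter (sketch below assumes a GRUB-based distro; check your distro’s docs for the equivalent):

```shell
# /etc/default/grub — enable all amdgpu power features so CoreCtrl
# can control fans, clocks and voltages (0xffffffff = all feature bits).
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash amdgpu.ppfeaturemask=0xffffffff"

# Then regenerate the GRUB config and reboot:
#   sudo grub-mkconfig -o /boot/grub/grub.cfg
```

Without it, CoreCtrl still shows sensors, but the fan and clock sliders stay greyed out.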

  • Lembot_0002@lemm.ee
    1 day ago

    linux

Don’t buy Nvidia. It’ll be a pain in the ass, especially if you use something like Debian testing, where the kernel is updated quite frequently.

    AMD would be a good choice.

    • Telorand@reddthat.com
      1 day ago

      Nvidia is barely an inconvenience anymore. I run Bazzite on an old laptop with a GTX 960M, and it’s flawless.

      That said, I agree that AMD is the current best choice.

      • IEatDaGoat@lemm.ee
        16 hours ago

The latest drivers have been annoying tf outta me though. Random screen glitches on my 3080 for weeks.

    • Eager Eagle@lemmy.world
      1 day ago

It depends on the use. AMD for gaming, sure. For machine learning I would go with Nvidia again. I run it on Arch, which updates very often.

      • Lembot_0002@lemm.ee
        link
        fedilink
        English
        arrow-up
        0
        ·
        1 day ago

        €300

I doubt OP is talking about machine learning. And even then, Nvidia’s driver is an additional source of problems.

        • Eager Eagle@lemmy.world
          1 day ago

          you can run local models with 16GB VRAM, but I’m not sure most models will run at all on an AMD card