L4sBot@lemmy.world to Technology@lemmy.world, English · 10 months ago
2023 was the year that GPUs stood still (arstechnica.com)
Summary: A new GPU generation did very little to change the speed you get for your money.
CalcProgrammer1@lemmy.ml · 10 months ago
I've been very happy with my Arc A770; it works great on Linux and performs well for what I paid for it.
barsoap@lemm.ee · 10 months ago
Have you tried ML workloads? Put differently: how is compatibility with software that expects CUDA/ROCm? The A770 is certainly the cheapest way to get 16 GB of VRAM these days.
CalcProgrammer1@lemmy.ml · 10 months ago
No, I don't use any ML tools, or really anything that uses GPU compute at all. I just use it for gaming and other 3D applications.
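As an aside on the compatibility question above: a minimal sketch of how one might probe which GPU compute backend PyTorch can see. This is an assumption-laden illustration, not anything the commenters ran: it assumes a PyTorch install, that ROCm builds report through `torch.cuda` (HIP presents itself as CUDA there), and that an Intel Arc GPU would show up via the `torch.xpu` backend (PyTorch 2.4+ or intel-extension-for-pytorch).

```python
def detect_backend() -> str:
    """Return 'cuda', 'xpu', or 'cpu' depending on what PyTorch detects.

    Degrades gracefully: if PyTorch is not installed at all,
    we simply report 'cpu'.
    """
    try:
        import torch
    except ImportError:
        return "cpu"  # PyTorch not installed; nothing to probe
    if torch.cuda.is_available():
        # ROCm builds of PyTorch also answer here, since HIP is
        # exposed through the torch.cuda namespace.
        return "cuda"
    xpu = getattr(torch, "xpu", None)  # Intel GPU backend, if present
    if xpu is not None and xpu.is_available():
        return "xpu"
    return "cpu"

print(detect_backend())
```

On a machine with a working Arc driver stack this would print `xpu`; on a CUDA or ROCm box, `cuda`; otherwise `cpu`.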