Well I am shocked, SHOCKED I say! Well, not that shocked.

    • overload@sopuli.xyz · 2 months ago

      Absolutely. Truly creative games are made by smaller dev teams that aren’t forcing ray tracing and lifelike graphics. The new Indiana Jones game isn’t a GPU seller, and it’s the only game I’ve personally had poor performance on with my 3070 Ti at 1440p.

  • excral@feddit.org · 1 month ago

    For me it’s the GPU prices, the stagnation of the technology (most performance gains come at the cost of stupid power draw) and, importantly, being fed up with AAA games. Most games I’ve played recently were a couple of years old, indie titles, or a couple-of-years-old indie titles. And I don’t need a high-powered graphics card for that. I’ve been playing far more on my Steam Deck than on my desktop PC, despite the latter having significantly more powerful hardware. You can’t force fun through sheer hardware performance.

  • gravitas_deficiency@sh.itjust.works · 1 month ago

    Nvidia doesn’t really care about the high-end gamer demographic nearly as much as it used to, because that’s no longer its bread and butter. Nvidia’s cash cow at this point is supplying hardware for ML data centers, which is an order of magnitude more lucrative than serving the consumer and enthusiast market.

    So my next card is probably gonna be an RX 9070XT.

    • ameancow@lemmy.world · 1 month ago

      Even the RX 9070 is running around $900 USD; I can’t fathom affording even state-of-the-art gaming from years ago at this point. I’m still using a GTX 1660, playing games from years ago that I never got around to, and having a grand time. Most adults I know are in the same boat: either not even considering upgrading their PC, or playing their kids’ console games.

      Every year we say “Gonna look into upgrading” but every year prices go up and wages stay the same (or disappear entirely as private-equity ravages the business world, digesting every company that isn’t also a private equity predator) and the prices of just living and eating are insane, so at this rate, a lot of us might start reading again.

  • simple@piefed.social · 2 months ago

    Unfortunately, gamers aren’t the real target audience for new GPUs; it’s AI bros. Even if nobody buys a 4090/5090 for gaming, they’re always out of stock because LLM enthusiasts and small companies use them for AI.

  • Dammam No. 7@lemmy.world · 1 month ago

    I just looked up the price and I was like, “Yikes!”. You could get a PS5 Pro with the optional Blu-ray drive, a Steam Deck OLED, and a Nintendo Switch 2, and still have plenty of money left to spend on games.

  • bluesheep@lemm.ee · 2 months ago

    Paying Bills Takes Priority Over Chasing NVIDIA’s RTX 5090

    Yeah no shit, what a weird fucking take

  • ItsMrChristmas@lemmy.zip · 1 month ago

    GPU prices are what drove me back to consoles. It was time to overhaul my PC, as it was getting painfully out of date. The video card alone was gonna be $700. Meanwhile, a whole-ass PS5 that plays the same games was $500.

    It’s been 2 years since, and I don’t regret it. I miss mods, but not nearly as much as I thought. It’s also SOOO nice to play multiplayer games without cheaters everywhere. I actually used to be one of those people who thought controllers gave an unfair advantage, but… you can use a mouse and keyboard on PS5, and guess what? I do just fine! Turns out the problem was never controllers; it was the cheaters.

    But then there’s that. The controller. Oh my lord, it’s so much more comfortable than even the best gaming mouse. I’ve done a complete 180 on this. So many game genres are just so terrible to play with M/KB that I now tell people whining about controller players this:

    Use gaming equipment for gaming and leave office equipment in the office.

  • RedSnt 👓♂️🖥️@feddit.dk · 1 month ago

    It’s like how banks figured out there was more money in catering to the super rich and just shit all over the rest of us peasants: GPU manufacturers that got big because of gamers have now turned their backs on us to cater to the insane “AI” agenda.
    Also, friendly advice: unless you need CUDA cores and have to upgrade, try to avoid Nvidia.

  • Phoenicianpirate@lemm.ee · 2 months ago

    I bought my most expensive dream machine last year (when the RTX 4090 was still the best) and I am proud of it. I hope it’ll be my rig for at least 10 years.

    But it was expensive.

  • JackbyDev@programming.dev · 2 months ago

    Uhhh, I went from a Radeon 1090 (or whatever they’re called, it’s an older numbering scheme from ~2010) to an Nvidia 780 to an Nvidia 3070 Ti. Skipping upgrades is normal. Console gamers effectively do the same. It’s normal to not buy a GPU every year.

    • Nik282000@lemmy.ca · 2 months ago

      Still running a 1080; between Nvidia and Windows 11, I think I’ll stay where I am.

  • chunes@lemmy.world · 2 months ago

    I stopped maintaining a AAA-capable rig in 2016. I’ve been playing indies since and haven’t felt left out whatsoever.

    • MotoAsh@lemmy.world · 2 months ago

      Don’t worry, you haven’t missed anything. Sure, the games are prettier, but most of them are designed and written more poorly than 99% of indie titles…

      • Honytawk@feddit.nl · 1 month ago

        The majority, sure, but there are some gems.

        Baldur’s Gate 3, Clair Obscur: Expedition 33, Doom Eternal, Elden Ring, God of War, … for example.

        You can always wait a couple of years before playing them, but saying they didn’t miss anything is a gross understatement of what they missed.

      • JustEnoughDucks@feddit.nl · 1 month ago

        It’s funny, because often they aren’t prettier. Well-optimized, well-made games from 5 or even 10 years ago often look on par with or better than the majority of AAA slop pushed out now (obviously with exceptions for some really good-looking games like Space Marine and a few others), and yet the disk size is 10x what it was. They’re just unrefined and unoptimized, and try to use computationally expensive filters, lighting, sharpening, and antialiasing to make up for the mediocre quality.

  • arc99@lemmy.world · 2 months ago

    Not surprised. Many of these high-end GPUs are bought not for gaming but for bitcoin mining, and demand has driven prices beyond MSRP in some cases. Stupidly power hungry and overpriced.

    My GPU, an RTX 2060, is getting a little long in the tooth. I’ll hand it off to one of the kids for their PC, but I need to find something that’s a tangible performance improvement without costing eleventy stupid dollars. Nvidia seems to be lying a lot about the performance of the 5060, so I might look at AMD or Intel next time around. I’ll probably need to replace my PSU while I’m at it.

    • DefederateLemmyMl@feddit.nl · 1 month ago

      > bitcoin mining

      That’s a thing of the past; it’s not profitable anymore unless you use ASIC miners. Some people still GPU-mine niche coins, but it’s nowhere near the scale it was during the bitcoin and ethereum craze a few years ago.

      AI is driving up prices — or rather, it’s reducing availability, which then translates into higher prices.

      Another thing is that board manufacturers, distributors and retailers have figured out that they can jack up GPU prices above MSRP and enough suckers will still buy them. They’ll sell less volume but they’ll make more profit per unit.

    • Valmond@lemmy.world · 1 month ago

      My kid got the 2060; I bought an RX 6400. I don’t need the hairy arms any more.

      Then again I have become old and grumpy, playing old games.

        • WhatYouNeed@lemmy.world · 1 month ago

          Hell, I’m still rocking a GTX 950. It runs Left 4 Dead 2 and Team Fortress 2 — what more do I need?

  • Demognomicon@lemmy.world · 1 month ago

    I have a 4090. I don’t see any reason to pay $4K+ for fake frames and a few percent better performance. Maybe post-Trump next gen, and/or if prices become reasonable and cables stop melting.

    • Critical_Thinker@lemm.ee · 1 month ago

      I don’t think the 5090 has been $4k in months, in terms of average sale price. $4k was basically March. $3k is pretty common now as a listed scalper price, and completed sales on fleabay commonly seem to be $2600–2800 now.

      The problem is that $2k was too much to begin with. It should be cheaper, but they’re selling ML cards at such a markup, with truly endless demand currently, that there’s zero reason to put any focus on the gaming segment beyond a token offering that raises their margins. So business-wise they’re doing great, I guess?

      As a 9070 XT and 6800 XT owner, it feels like AMD is practically done with the GPU market. It just sucks for everyone that the GPU monopoly is here, presumably to stay. It feels like backroom deals creating a noncompetitive landscape must be prevalent, and Nvidia’s artificial stranglehold on code compatibility makes the hardware side almost irrelevant.

      • brucethemoose@lemmy.world · 1 month ago

        One issue is that everyone is supply-constrained by TSMC. Even Arc Battlemage is out of stock at MSRP.

        I bet Intel is kicking themselves for using TSMC. It kinda made sense when they decided years ago, but holy heck, they’d be swimming in market share if they’d used their own fabs instead (and kept the bigger die).

        I feel like another issue is… marketing?

        Like, many buyers just impulse-buy, or go with what some shill recommended in a feed. It doesn’t matter how competitive anything is anymore.