I thought that 60 Hz was enough for most games, and that for shooters and other real-time games 120 or 144 was better. However, it reaches a point where the human eye can't notice the difference even if it tried.

Honestly, pushing the framerate too high is just a waste of GPU power and electricity.
A better way to look at this is frame time.

At 60 FPS/Hz, a single frame is displayed for 16.67 ms. At 120 Hz, a single frame is displayed for 8.33 ms. At 240 Hz, a single frame is displayed for 4.17 ms. A difference of >8 ms per frame (60 vs 120) is quite noticeable for many people, and >4 ms (120 vs 240) is as well, but the impact is only half as large. So you get diminishing returns pretty quickly.
Now I’m not sure how noticeable 1000 Hz would be to pretty much anyone as I haven’t seen a 1000 Hz display in action yet, but you can definitely make a case for 240 Hz and beyond.
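For what it's worth, you can put the diminishing returns in numbers, 1000 Hz included. A quick Python sketch (the refresh rates are just example values; frame time is simply 1000 ms divided by the refresh rate):

```python
# Frame time shrinks as 1000 / Hz, so each doubling of refresh
# rate saves half as many milliseconds as the previous doubling.
def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

rates = [60, 120, 240, 480, 1000]  # example refresh rates
for prev, curr in zip(rates, rates[1:]):
    saved = frame_time_ms(prev) - frame_time_ms(curr)
    print(f"{prev:>4} -> {curr:>4} Hz: {saved:.2f} ms less per frame")
```

That prints 8.33 ms saved for 60→120, 4.17 ms for 120→240, 2.08 ms for 240→480, and just 1.08 ms for 480→1000, which is the diminishing-returns curve in a nutshell.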
It's pretty easy to discern refresh rate with the human eye if one tries. Just move your cursor back and forth really quickly. The trail of ghost cursors it leaves behind (which, by the way, exists only in your eye's perception, not on the panel) gives it away: the gap between the ghosts is inversely proportional to the refresh rate, so a higher refresh rate leaves a denser, smoother trail.
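To put rough numbers on it: the display draws one cursor image per refresh, so the gap between ghosts is just cursor speed divided by refresh rate. A minimal Python sketch, assuming an arbitrary 2000 px/s flick speed:

```python
# Gap between successive "ghost" cursor images for a cursor moving
# at a constant speed: one image per refresh, so gap = speed / Hz.
flick_speed_px_per_s = 2000  # assumed example flick speed
for hz in (60, 120, 240):
    gap_px = flick_speed_px_per_s / hz
    print(f"{hz:>3} Hz: ghosts ~{gap_px:.0f} px apart")
```

So at 60 Hz the ghosts sit roughly 33 px apart, while at 240 Hz they pack in at ~8 px, which is why the higher-refresh trail looks smoother.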
Sure, but wasting double or triple the resources for that is not fine. There are very few games where it's even a real gain; outside of those super-competitive titles it's not like it matters.

Yeah, I agree with you, but I was just refuting your claim that it's not perceivable even if you try.
Oh yeah, I've read and heard plenty of people saying that they definitely notice it. I'm lucky enough not to, because most ARPGs don't even hold 60 FPS in intense combat, let alone 120 FPS, on an RTX 3080 lmao.
I was talking more about the jump from 240 and beyond; I find it surprising that people notice the upgrade during intense gaming encounters rather than while calmly checking or testing. I guess there are people who do notice, but again, running games at such a high framerate is very expensive for the GPU and a waste most of the time.
I'm just kinda butthurt that people act like screens below 120 Hz are bad when most games I play hardly run at a smooth 60 FPS. The market will follow, and in a few years we'll hardly have what I consider normal monitors, and the cards will just eat way more electricity for very small gains.
Competitive (professional) gamers?
Seems like there are diminishing returns, but at least some gains are still measurable at 360 Hz.
I thought games were meant to be fun; what's the point of monetising them?
New careers for the new (and current) generation.