It basically just takes your eyes a while to adjust; the same is true for going from 30 fps to 60. If you only ever play at 30, you might not notice the leap when going to 60, but once you start playing at 60 regularly your eyes will adjust, and 30 will start to look choppy. Once that happens the difference becomes easy to point out, and you’ll be able to appreciate the frame-rate increase.
Eventually it’ll become night and day: noticing the difference between frame rates will take zero effort, and that difference will feel like a massive deal. I still only play at 1080p, for example, so I can hold 120-144 fps consistently; that smoothness matters far more to me than a sharper image.
It’s hard to say exactly what it is, but if my monitor ends up set to 60 Hz because some game has weird defaults, I’ll notice that something is off until I change the setting back to 144 Hz. Maybe it’s the monitor itself being tuned for 144 Hz, so the pixels fade a bit before they get refreshed? Or maybe my eyes/brain can just tell the difference after getting used to the higher refresh rate.
I think it looks different when a game is fps-locked to 60 while the monitor stays at 144 Hz than when the monitor itself is set to 60 Hz, which suggests it might be the fading thing (or something similar).
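If you want to put rough numbers on the timing side of it, here’s a quick back-of-the-envelope sketch (assuming a fixed-refresh panel with no G-Sync/FreeSync; the frame and refresh times are just 1000 ms divided by the rate):

```python
# Rough frame-timing sketch: how long each game frame sits on screen
# when a 60 fps game runs on a 60 Hz vs a 144 Hz monitor (fixed refresh, no VRR).

def refresh_interval_ms(hz):
    return 1000.0 / hz

fps = 60
for hz in (60, 144):
    refresh = refresh_interval_ms(hz)      # ~16.7 ms at 60 Hz, ~6.9 ms at 144 Hz
    refreshes_per_frame = hz / fps         # 1.0 at 60 Hz, 2.4 at 144 Hz
    print(f"{hz} Hz panel: refresh every {refresh:.1f} ms, "
          f"{refreshes_per_frame:.1f} refreshes per game frame")

# Because 2.4 isn't a whole number, a 60 fps frame on a 144 Hz panel is shown for
# either 2 refreshes (~13.9 ms) or 3 refreshes (~20.8 ms), alternating, rather than
# a steady 16.7 ms like on a 60 Hz panel. The panel itself still gets re-driven
# every ~6.9 ms either way, which is consistent with the fading idea above.
```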
Though I did notice a big difference in Overwatch when I upgraded from a GPU that would get somewhere in the range of 60-90 fps to one that could consistently stay over 120.
It’s really hard to quantify your own senses, so all I can say for sure is that I definitely notice when my monitor is set to 60 Hz instead of 144 Hz.
sus. If you can’t notice a big difference right away then what difference IS there?