The Asus ROG Swift Pro PG248QP is an extremely fast gaming monitor designed for professional esports gamers. Boasting a massive 540Hz refresh rate, it's a glimpse into...
Why would you ever want a refresh rate that high?
I did my primary coursework on this exact topic in college, so I feel rather aptly in the know here: VDUs above 100Hz have diminishing returns, and anything above 200Hz is completely indistinguishable to the human eye.
Anyone who says otherwise either has superhuman eyesight or is just lying.
Because what you are saying is absolutely not true. Trained eyes and pros can definitely tell the difference between 240Hz and 360Hz. There is a difference; it's pretty marginal, but there is a difference. Yes, this is absolute nonsense for anyone but FPS pros, but there is still a market for it, and it's more alpha-consumer-funded research than anything else.
He did coursework on it though!! OP prolly can't perceive lightning strikes either, since many of them last less than 1 millisecond.
Mainly only useful for professional gamers. Also, there have definitely been people who can distinguish above 200Hz.
It's less about seeing 540 frames per second and more that each frame takes less time to appear on screen, meaning what you're seeing is a more recent representation of the game state. I'm a casual gamer who can't appreciate anything past 144Hz, but that's what I've heard about ultra-high-refresh-rate screens.
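To put rough numbers on that recency point, here is a back-of-the-envelope sketch; the idealized sample-and-hold model and the chosen refresh rates are assumptions, not measurements of any specific monitor:

```python
# Sketch: how stale is the on-screen image at a given refresh rate?
# Assumes an idealized sample-and-hold display with no pixel-response
# or pipeline latency; real monitors add more delay on top of this.

def display_staleness_ms(refresh_hz: float) -> tuple[float, float]:
    frame_time = 1000.0 / refresh_hz   # how long each frame stays on screen
    return frame_time, frame_time / 2  # worst-case and average staleness

for hz in (60, 144, 240, 360, 540):
    worst, avg = display_staleness_ms(hz)
    print(f"{hz:>3} Hz: frame time {worst:5.2f} ms, avg staleness {avg:4.2f} ms")
```

Under those assumptions, going from 144Hz to 540Hz cuts the display's own contribution from roughly 7ms of frame time to under 2ms: small next to human reaction time, but real, and it compounds with the rest of the input chain.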
According to the US Air Force, training allowed people to process images at and above the 220 fps level after starting out only able to process around 40 fps; some pilots even ended the study processing at 300 fps.
Anything above that is probably well past extraneous, though, especially for the average consumer. So a 540Hz monitor is a little bit silly.
PC Gamer did a good deep dive into this.
While there's no set answer and everyone's vision is different, generally there's zero benefit above 200Hz, as you said.
I game on my couch at 120Hz. I somehow keep beating all of these multi-thousand-dollar-setup competitors.
https://youtu.be/nqa7QVwfu7s
I did a blind test between two monitor screens.
Maybe to the eye you can't tell the difference, but whenever I was gaming, I guessed the correct refresh rate 10/10 times.
I tried with my eyes and it was just a 50% guess rate… but whenever I had a controller in my hands, I could instantly tell.
Not sure how I can explain this 🤔
Input lag is how you could tell when gaming. It wasn’t your eyes, but what your brain expected to see.
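For what it's worth, 10 out of 10 in a two-choice blind test is already strong evidence against pure luck. A quick binomial sanity check, illustrative only, using the numbers from the comments above:

```python
# Binomial check on the blind test above: if you were purely guessing
# between the two refresh rates, how likely is getting at least k right?

from math import comb

def p_at_least(k: int, n: int, p: float = 0.5) -> float:
    """Probability of at least k successes in n independent 50/50 guesses."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

print(f"10/10 by luck: {p_at_least(10, 10):.4f}")  # ~0.001, about a 0.1% chance
print(f" 5/10 by luck: {p_at_least(5, 10):.4f}")   # ~0.62, what the 'eyes only' 50% run looks like
```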
I did quite a few years on a 60hz monitor. The change to 144hz was massive. Did quite a few years on that 144hz monitor. The change to 240hz was practically unnoticeable.
One problem a lot of those studies have is that they don't factor in how you can get used to something and then notice a change. I remember that happening with headphones when I was a kid. I thought my good headphones weren't that large of an improvement until I had to borrow somebody else's years later and thought they sounded like absolute junk.
So somebody might not be able to tell much of a difference between 144 and 240 if they’re sitting side by side, but they might be able to tell the difference very easily if they’re used to one and then switch to the other. People are also more sensitive to negative changes, so you’re more likely to notice the difference between the two if you go from 240 to 144 versus 144 to 240.
Somebody below me posted a link to a study showing that humans can detect differences up to about 1000Hz, and most can easily tell the difference between 240 and 360.
Thank you. My background is biology. Frankly, neurons and cells just aren't that fast. You know how your hand can feel cool when you touch a hot plate for a second? It's because it takes something like 200ms for the pain signal to register in your brain. And here these people are acting like a monitor with millisecond-scale refresh intervals is somehow something you can even detect.
I'm a 120 or 144Hz man myself. Seems plenty snappy and smooth. Even 60 is fine for office work. I can for sure sense 30, though. Like when a driver gets updated and something goes wrong and it sets your main display to 24 or 30fps. That shit is obvious.
The way we're able to see high refresh rates isn't based on neuron speed (ignoring input lag stuff). The eyes continuously sample light as it hits them, sending signals. Because there's no global shutter, we're able to perceive a display's refresh as it sends discrete images of a scene.
It's far easier to notice an issue when there is a large change between two frames. The easiest way to reproduce this is in third-person games where you quickly rotate the camera, since the large change between frames is very obvious to the eye. (There's a limit, obviously. I can tell the difference between 120Hz and roughly 240Hz, but I'm fine with 120Hz as I don't tend to rotate the camera fast in games.)
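To put an illustrative number on that "large change between frames" point; the 360°/s flick speed below is an assumption chosen for round numbers, not a measured value:

```python
# Sketch: how far the camera view jumps between consecutive frames
# during a fast rotation. The flick speed is an assumed example value.

FLICK_SPEED_DEG_PER_S = 360.0

for hz in (60, 120, 240, 540):
    step = FLICK_SPEED_DEG_PER_S / hz  # degrees of rotation per frame
    print(f"{hz:>3} Hz: view jumps {step:.2f} degrees per frame")
# 60 Hz -> 6.00-degree jumps, 240 Hz -> 1.50: the bigger the per-frame
# jump, the more the motion reads as discrete steps instead of a sweep.
```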
I mean, my subjective experience is that it's less about distinguishing one frame from another and more a perception of fluidity in motion.
Like I can see a noticeable judder between 60-100Hz and 144Hz. This has been further trained because my personal setup has a cabling issue where EMI from my chair's gas piston disrupts the display cable, which sometimes resets my monitor to 100Hz. It is very annoying because in some of my games I notice a considerable drop in my play performance when this happens, something I never noticed in the past.
So I can definitely see the difference there. Then, moving on from 144Hz to the integrated 240Hz display in my laptop, I can absolutely notice a difference specifically with the edges of objects, which is extra noticeable when moving the mouse around in circles and such.
This also translates to gaming, where I've genuinely gone and checked my settings because something felt different from normal when I get so used to using the external display that I forget the internal one is the higher spec. Not bad, mind you, just different. I'm super vigilant about any change, though, since I've previously had issues that announced themselves as performance changes.
That being said, I don’t think your understanding or coursework is wrong, but just that my subjective experience indicates otherwise and that the research potentially fails to translate well onto an exact Hz scale. My understanding is that the eye doesn’t really work on “frames” and is a bit more “stream of information” about it, meaning that distinguishing change even on minuscule time scales is relatively easy but distinguishing individual objects or the substance of an image flashed at that speed is very hard. If you know what to expect, though, I’d be willing to bet the process becomes significantly easier.
Perhaps there is already a plethora of studies on this exact stuff, but I’d be willing to bet that many existing studies fail to reasonably account for expectation when evaluating comprehension as well.
Frame rate snobs.
Idk, I can't distinguish past 144Hz. Hell, I wouldn't even know if it was 120 or 144.
I've found that beyond 144Hz you need a serious investment to actually get a pixel response that keeps up without ghosting.
OLED helps there.
I can only tell when playing FPS games, going from 60 to 144. After 144 it pseudo-felt better, but really I couldn't tell.
In automotive parlance we call that the Butt Dynamometer. Your ass feels extra speed in the seat, but the numbers don't lie.
My brother has a 240Hz monitor, so I decided to play around with some CS on it at different refresh rates to see if I could notice any difference. And yeah, back to back, going from 144Hz to 240Hz you notice *something*, but it's really hard to say I can make use of that extra information, and each individual step above 144Hz is really hard to notice at all; the jump is only barely noticeable when you go from 144Hz to 240Hz in one step. In fact, taking 10Hz steps upward, it already stops feeling like a big improvement past 90Hz, and it rapidly becomes less noticeable after 100Hz.
It's very possible your brother's 240Hz monitor isn't particularly good. Just because a display can refresh at a certain rate doesn't actually mean it's better than a display that has a lower refresh rate but faster pixel response, etc.
A 144Hz OLED, for example, will be significantly better than a much-higher-refresh-rate LCD, because OLEDs have pixel responses that are sub-0.1ms. This is more than an order of magnitude faster than the fastest LCD monitor.
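A hedged way to see the same point in numbers: what matters is whether a pixel finishes its transition within a single refresh. The LCD response times below are assumed worst-case figures for illustration; only the sub-0.1ms OLED number comes from the comment above.

```python
# Sketch: can the panel complete a pixel transition inside one frame?
# LCD response times here are assumed worst-case values for illustration;
# the ~0.1 ms OLED figure is from the comment above.

panels = {
    "144 Hz OLED": (144, 0.1),
    "240 Hz LCD":  (240, 5.0),  # assumed worst-case response
    "360 Hz LCD":  (360, 5.0),  # assumed worst-case response
}

for name, (hz, response_ms) in panels.items():
    frame_ms = 1000.0 / hz
    share = response_ms / frame_ms
    verdict = "clean frames" if share < 0.5 else "still transitioning into the next frame (ghosting)"
    print(f"{name}: response = {share:.0%} of a {frame_ms:.2f} ms frame -> {verdict}")
```

Under those assumptions, the OLED finishes its transition in about 1% of its frame time, while the faster-refreshing LCDs are still mid-transition when the next frame arrives, which is the ghosting people complain about.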
I think I might slap someone if I ever hear them say anything under 240 is unplayable. And I just know someone will find their way to this comment to argue why that’s not a ridiculous thing to say with some sort of special scenario.
I have a functional neurological disorder and will have a seizure if subjected to a screen that is under 240hz.
The only special scenario I can think of is playing a game at 144Hz while using a monitor that's 240Hz. Going from playing at 280Hz on my monitor to 144Hz makes games visually stutter to a noticeable degree, which makes them frustrating to play when you're used to the super ultra™ smooth feel of 240Hz. Even in that case, which I am familiar with, I wouldn't go so far as to say that anything is unplayable, just noticeably more visually frustrating than usual.
Going from playing a given game on a 240Hz display to a 144Hz display does have a noticeable difference, but nowhere near significant enough that it's unplayable. I could imagine even pro FPS players being able to tolerate it, despite possibly taking a hit to their performance. Point being, I completely agree it's stupid to suggest it's unplayable, though the special scenario in my case, I could imagine, is what most people are experiencing when they say it's unplayable (I hope lol); even then I wouldn't really agree.
90 is definitely some kind of threshold where I notice the performance dropping; as long as the fps lows are at around 95, I'm happy with the performance.
144 to 240+ is extremely noticeable. Can't speak to 240 vs 360 since I don't have a 360Hz monitor.
To spot a sweaty DCS player going Mach 3 over your port view, maybe?
All those idiot pro gamers using 240+, they've been swindled! /s
This YouTuber says it's a really noticeable jump from 240 to 500.
He has tested every single new high-refresh-rate panel and uses scientific methodologies.
Have you actually tried some of these technologies? I own a 240Hz and have never seen above that, so I can't really talk. I can say that motion blur reduction technology influences perceived smoothness a lot, though.
And I’m just sitting here with my old & busted eyes thinking 60Hz is enough…
Don't get me wrong, I can see a difference between my 60Hz sim-racing screens and my 144Hz desktop display (both pushing north of 120fps), but it just doesn't matter when I'm hammering around Spa in a GT3 car or blasting scavs in Night City.
such as?
Watch the video?
Measuring pixel response times, total system latency, and slow-motion footage of different displays.
Extensively. He is sponsored to sell you a product.
You are VERY correct in your deduction that motion blur tech is a far bigger hitter than just upping the refresh rate.
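That matches the usual sample-and-hold blur rule of thumb: perceived blur is roughly eye-tracking speed times how long each frame persists on screen. A sketch follows; the 960 px/s tracking speed is an assumed test value in the spirit of UFO-style motion tests:

```python
# Sketch: perceived motion blur on a sample-and-hold display.
# blur (px) ~= tracking speed (px/s) * persistence (s).
# The tracking speed is an assumed test value.

SPEED_PX_PER_S = 960.0

def blur_px(refresh_hz: float, persistence_fraction: float = 1.0) -> float:
    """persistence_fraction < 1 models backlight strobing / black frame insertion."""
    return SPEED_PX_PER_S * persistence_fraction / refresh_hz

print(f"144 Hz, full persistence: {blur_px(144):.1f} px of blur")
print(f"540 Hz, full persistence: {blur_px(540):.1f} px of blur")
print(f"144 Hz, 25% strobe:       {blur_px(144, 0.25):.1f} px of blur")
# Strobed 144 Hz (~1.7 px) lands in the same ballpark as full-persistence
# 540 Hz (~1.8 px), which is why blur reduction hits so hard.
```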
Didn't people do a lot of research on why 120Hz+ wasn't really necessary? I can absolutely tell the difference between 200Hz and 240Hz.
Absolutely not true. Play a high-speed FPS at extremely high frame rates (think 1000+ fps, with some games running at 2000 or more), and current 240, 280, and 360Hz monitors are not fast enough to make the motion look smooth. Ghosting is still plainly visible. These games will absolutely benefit from 500Hz+, and your eye will be able to tell.
You can't see it, but you can feel it in some games.
First off, that is just not true: the human eye can perceive the difference, but the change may not be noticeable to those who aren't familiar with it. Dropping from 240Hz to 200Hz is absolutely noticeable to competitive FPS players, though it primarily has to do with feeling the response time of your inputs rather than the way it looks.
Pretty much. The theoretical results are about what basically amounts to seeing light pulses. To see the actual difference in a game, you need not only a very good monitor (most high-refresh-rate monitors are trash, with worst-case response times way longer than a frame time) but also a very fast-moving object. Past a certain refresh rate these conditions just become unlikely, so you don't really see a difference unless it's a synthetic test.
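To make the "very fast moving object" condition concrete, a sketch with assumed numbers: a 1920-pixel-wide screen crossed in half a second.

```python
# Sketch: per-frame displacement of a fast-moving object.
# Assumes a 1920 px wide screen crossed in 0.5 s (a very fast pan).

SCREEN_PX = 1920
CROSS_TIME_S = 0.5
speed_px_s = SCREEN_PX / CROSS_TIME_S  # 3840 px/s

for hz in (144, 240, 360, 540):
    print(f"{hz:>3} Hz: object jumps {speed_px_s / hz:5.1f} px between frames")
# 144 Hz -> ~26.7 px steps; 540 Hz -> ~7.1 px. Slower objects shrink these
# numbers fast, which is why the difference mostly shows up in synthetic tests.
```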
Y’ever listen to audiophiles argue over quality? Argue that -yes- they can tell the difference between $10 HDMI cables and $300 HDMI cables. Or how their $5000 “power filter” cleans the dirty pedestrian AC current before it gets to their sacred audio gear? Or that their diamond/iridium/angel-eyelash record needle gives them a FAR better sound reproduction… YOU just can’t hear it.
There will be someone out there that will tell you that yes, they CAN tell the difference between 200hz and 250hz.
But gosh, they’ll never take a double-blind test to prove their golden ears/eyes.
Completely false equivalence btw
I can't comment on $10000 power cleaners, but based on your comment I'm assuming you're not an electrical engineer with a background in amplifiers and high-frequency noise (neither am I).
There’s def a difference between 144hz and 360hz.
It’s 216
Human hubris
Have you written a paper on it? I would like to read more about it.
This is assuming displays don't have any issues such as overshoot, which the majority of 240Hz and 360Hz panels on the market do; a lot of the 240Hz panels (ahem, Samsung) are worse than even budget 144Hz panels.
I think it is well understood now that this is incorrect. I can tell the difference between 360 and 240, and it is even more jarring when going from 360 to 144 or 120.
I have a 360Hz display, and sometimes when I play Halo MCC there's this random thing where it's like my eyes realize the refresh rate and everything looks really strange; the movement and speed look weird. And if I play for a while and then get up and move around, it's like my eyes are still seeing at that refresh rate for a few minutes. Even with this experience I can't go back below 360Hz. There's just something about it I enjoy that I can't even explain.
It's not usually about seeing the actual fps or refreshes; it's usually about how the game reacts to the high FPS and refresh rate.
CSGO behaved particularly differently at 120 vs 300+ fps.
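That kind of fps-dependent behavior usually comes from game code that updates per frame instead of per unit of time. A generic sketch of the failure mode follows; this is NOT CS:GO's actual code, just an illustration of the class of bug:

```python
# Sketch of frame-rate-dependent game behavior (NOT CS:GO's actual code):
# a naive per-frame increment makes movement speed scale with fps,
# while delta-time scaling keeps it constant across frame rates.

def distance_after(fps: int, seconds: float = 1.0, per_frame_bug: bool = True) -> float:
    dt = 1.0 / fps
    pos = 0.0
    for _ in range(round(seconds * fps)):
        if per_frame_bug:
            pos += 1.0          # bug: "+1 per frame" -> faster at higher fps
        else:
            pos += 100.0 * dt   # correct: 100 units/second at any fps
    return pos

print(distance_after(120))                       # 120.0 -- fps leaks into gameplay
print(distance_after(300))                       # 300.0
print(distance_after(120, per_frame_bug=False))  # ~100.0
print(distance_after(300, per_frame_bug=False))  # ~100.0
```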
Okay, but isn't this a case of the classic tale of the mouse sensor that can do 24,000 DPI? People say these mice are good because, even though you won't ever go that high, the fact that the sensor can go that high means it's much more accurate than ones that can only go up to 6,000 or whatever DPI you'll end up using. Doesn't this logic also apply to these displays?