The Asus ROG Swift Pro PG248QP is an extremely fast gaming monitor designed for professional esports gamers. Boasting a massive 540Hz refresh rate, it's a glimpse into...
The human eye will scale to around 1000Hz, albeit with exponentially decreasing differences past 240.
Most trained eyes should still be able to tell 360 from 540 blind quite easily; many people I know have passed a double-blind test for 240 vs 360.
I’m sure that you and your friends think that you can see a difference between 240 and 360, but objectively, no, you all suffer heavily from bias and placebo.
You are wrong. And the same thing was said about 120 vs 240, and surprisingly 60 vs 120.
This is an age-old argument. But those who have used both 120 and 240 will tell you the difference is not just noticeable, but that it affects your hand-eye coordination timing.
Placebo should not be ruled out, but I believe you are the one with the bias here.
Please purchase a dictionary and acquaint yourself with the “O” section.
Do you have the equipment at home to reliably test whether people you know can tell 240 from 360?
Not what you’re looking for, but I remember watching a documentary (decades ago) where they wired people up to EEG machines and then showed them images for increasingly brief instants. After a certain point, like 1/1000th of a second or something, the individual would stop reporting seeing anything, but their brain would still register it. And furthermore, it would register it quicker than in the cases where the person was still able to consciously notice it.
You’re talking to a printer.
You literally only need monitors for a double-blind test, that’s it.
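For what it’s worth, here’s a minimal sketch of what such a trial could look like, assuming two identical monitors labeled A and B and a third person who sets the refresh rates each trial; the labels, trial count, and baseline are made-up example values, not anything from this thread.

```python
# Minimal sketch of a double-blind 240Hz vs 360Hz trial (hypothetical setup:
# two identical monitors "A" and "B"; an assistant sets the rates each trial
# so neither the participant nor the scorer knows which is which).
import random

TRIALS = 20

def run_session(ask_guess):
    """ask_guess(label_a, label_b) returns the label the participant thinks is 360Hz."""
    correct = 0
    for _ in range(TRIALS):
        rates = [240, 360]
        random.shuffle(rates)                  # blind assignment for this trial
        assignment = {"A": rates[0], "B": rates[1]}
        guess = ask_guess("A", "B")            # participant only ever sees labels
        correct += assignment[guess] == 360
    return correct

# Chance is ~10/20; someone who consistently scores well above that across
# sessions is genuinely telling the refresh rates apart, placebo or not.
print(run_session(lambda a, b: random.choice([a, b])), "/", TRIALS, "(random-guesser baseline)")
```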
It’s pretty f*cking funny to read this now, when 15 years ago you were arguing with people on the internet about the superiority of 120Hz monitors and they would tell you that the human eye cannot see any difference above 60Hz.
They just moved the goalposts. Now it’s “the human eye can’t see past 1000Hz.”
we’ll see about that!
Reminds me of people claiming that 30FPS was better than 60 because “the human eye can only see 30FPS”. At least the “30FPS is cinematic” argument made a little sense, despite the fact that it is completely wrong in the context of games.
Human visual processing breaks down at different rates. I had an extremely informative college course on this, for which I took fairly poor notes, but here are the basics.
Object recognition - 24Hz. We actually got a live demo of this one: 24 unrelated images were flashed per second, and students could call out when a duck was visible. About half of us could spot it even when most of the other images were unrecognizable. This rate targets the visual cortex.
Flicker fusion threshold - 60Hz, the rate up to which the eye can perceive that a light is flickering versus solidly lit. It also depends on the display technology. This rate targets the rods/cones of the retina themselves.
Motion recognition - 500-1000Hz. Inside the retina itself, the human eye has special ganglion cells which detect edge patterns traveling between rods/cones. These are clever, hair-trigger comparators that can spot differences in detection smaller than the original signal (still ~60Hz, but unsynchronized), yet this information travels separately from the object-recognition signals to the brain and is never fully re-incorporated. The motion blur/interpolation used in film almost completely blinds these cells, and the absence of these signals is relatively subtle. Unblurred 60Hz film and video games can still trigger the stop-motion effect. These cells need much higher rates to be completely fooled.
Think of it another way: if you have a vertical line moving across the screen, you want at least one frame for each motion cell. If the line jumps pixels, you get something more like motion stripes than a motion image. If you jump multiple pixels per frame, the skipped motion cells report zero motion and the perception of motion breaks down. To complete the illusion you nearly need a new frame for every column of pixels (not every rod/cone cell is connected to a motion cell). This is why perception still subtly changes up to absurdly high frame rates.
https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/
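To put rough numbers on the “new frame for every column of pixels” idea, here’s my own back-of-the-envelope sketch (the 1920 px/s object speed is just an assumed example): on a sample-and-hold display, an edge moving at that speed jumps speed/refresh pixels between frames, which is also roughly how far it smears while each frame is held.

```python
# Back-of-the-envelope sketch: pixel jump per frame (and approximate
# sample-and-hold blur width) for an edge crossing a 1920px-wide panel
# in one second. The speed is an assumed example value, not a measurement.
speed_px_per_s = 1920

for hz in (60, 144, 240, 360, 540, 1000):
    px_per_frame = speed_px_per_s / hz   # also ~the persistence blur width in pixels
    print(f"{hz:>4} Hz: ~{px_per_frame:5.1f} px per frame")
```

Even at 1000Hz the edge still skips about two pixel columns per frame at that speed, which is consistent with the point above (and with the Blur Busters article) that perception keeps changing at absurdly high refresh rates.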
I have to wonder if they have ever even tried higher refresh rates/FPS. The difference is so apparent.