I mean, my subjective experience is that it’s less about distinguishing one frame from another and more about perceiving fluidity of motion.
Like, I can see a noticeable judder between 60-100 Hz and 144 Hz. This has been further trained because my setup has a cabling issue: EMI from my chair’s gas piston disrupts the display cable, which sometimes resets my monitor to 100 Hz. It’s very annoying, because in some of my games I notice a considerable drop in my own play performance when it happens, and I wouldn’t have noticed that in the past.
So I can definitely see the difference there. Then moving from 144 Hz to the integrated 240 Hz display in my laptop, I can absolutely notice a difference, specifically at the edges of objects, which is extra noticeable when moving the mouse around in circles and such.
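For a rough sense of why that 144-to-240 difference still shows up at edges, here’s a quick back-of-the-envelope sketch (the 2000 px/s cursor speed is just an assumed figure for illustration, not a measurement): the jump between successive cursor positions shrinks from roughly 14 px to 8 px per frame, which is exactly the kind of edge/trail difference you notice when circling the mouse.

```python
# Rough illustration: how far a cursor moving at an assumed ~2000 px/s
# jumps between consecutive frames at each refresh rate.
# Larger per-frame jumps read as rougher edges / a more "steppy" trail.
CURSOR_SPEED_PX_PER_S = 2000  # assumed value, purely for illustration

for hz in (60, 100, 144, 240):
    frame_time_ms = 1000 / hz          # time between refreshes, in ms
    px_per_frame = CURSOR_SPEED_PX_PER_S / hz  # distance covered per frame
    print(f"{hz:>3} Hz: {frame_time_ms:5.1f} ms/frame, ~{px_per_frame:5.1f} px jump per frame")
```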
This also translates to gaming, where I’ve genuinely gone and checked my settings because something felt different from normal; I get so used to the external display that I forget the internal one is the higher-spec panel. Not bad, mind you, just different. I’m super vigilant about any change, though, since I’ve previously had issues that first showed up as performance changes.
That being said, I don’t think your understanding or coursework is wrong; it’s just that my subjective experience suggests otherwise, and that the research may not translate cleanly onto an exact Hz scale. My understanding is that the eye doesn’t really work on “frames” and is more of a continuous stream of information, meaning that detecting change even on minuscule time scales is relatively easy, while identifying individual objects or the substance of an image flashed at that speed is very hard. If you know what to expect, though, I’d be willing to bet the task becomes significantly easier.
Perhaps there is already a plethora of studies on exactly this, but I’d also bet that many existing studies fail to reasonably account for expectation when evaluating comprehension.