Why? It's not a Yes/No test -- does the
eye tracking animation produce an Error 404? Or are you using a non-LCD display such as a CRT? Or an older LCD (those will not produce the correct optical illusion on that motion test)? Do you have a
stutter-free web browser? (Aero ON, primary monitor only, VSYNC-supported browser). This test pattern seems to work for roughly 80% of humans; the remaining 20% either have browser problems or have vision behaviour (e.g. different motion perception) that deviates from the norm.
Correct, it's not possible on most IPS displays. But try again on a CRT, and you can.
And on all modern strobe-backlight LCDs. I have an EIZO FG2421, a G-SYNC monitor (NVIDIA sent me one for testing), several LightBoost monitors, and a beta BENQ XL2720Z sitting on my desk (BENQ sent me one to test its strobe mode, as I'm the "LightBoost" guy). So I've seen the major strobe-backlight monitors. All of them pass the TestUFO Panning Map Test, and everyone I've asked to come to the screen says they can read the map labels on these monitors.
3. The "blind test" was set up by a monitor company, AOC, FYI. Hardly scientific. Not even considering it was only 50 people. Hardly a slam dunk.
Fair comment, depending on how "science" is defined nowadays. It is still more scientific than much of the "science" being done today; it's sad how public research has declined. However, that aside, let's touch on a few fundamental concepts, to at least point out this is not just a theory.
With that I disagree -- unless you mean anything less than "five sigma" is a theory, in which case I give up arguing with you.
🙂
It's true that only roughly two sigma of the human population can see the difference with LightBoost in a framerate=stroberate test -- but that's already confirmed to be a majority. The key is a test with controlled variables (framerate locking), not a random game. Very few games can run stutter-free and tearing-free at the framerate=stroberate that LightBoost requires for the ideal-motion scenario, and without that it doesn't look massively better. Even variants (e.g. framerate>stroberate) can look better motion-wise to many.
Depends on many factors: age, room lighting, contacts, etc.
With that I agree.
Everybody is different & every environment is different.
Many factors, even color blindness -- a certain percentage of people (about 8%) are color blind.
And tons of others, as you've pointed out.
Now, motion blur caused by persistence is not a theory.
Search "MPRT response" on Google Scholar for a lot of existing science on this.
And see the papers relating the amount of motion blur to camera shutter speeds.
Each refresh on an LCD is often mostly static -- you've seen the high-speed videos. Your eyes are in a different position at the beginning of a refresh cycle than at the end of it. This causes static frames to be blurred across your retinas. Reducing this motion blur means shortening the persistence (aka reducing sample-and-hold time / frame visibility), which reduces the opportunity for that specific frame to be smeared across your retinas. This is already well-established science by vision researchers, with plenty of peer-reviewed papers.
The photographer's equivalent also exists -- if you pan your camera quickly past scenery, a shutter speed of 1/240sec creates less tracking-based motion blur than a shutter speed of 1/120sec or 1/60sec. There's already a peer-reviewed paper somewhere showing an uncanny correlation between perceived motion blur and frame duration, especially at rates high enough for motion fluidity.
As your eyes track an object moving across a screen at 1000 pixels/second, if there are 60 frames per second and each frame is shown for 1/60sec, then your eyes move 1/60th of the motion distance during each frame (there's the eye saccade effect, which varies from human to human, but at slower speeds eye tracking is quite accurate, with an insignificant error margin). 1/60th of 1000 pixels/second is 16.7 pixels. So that's a linear motion blurring of about 16.7 pixels for each static frame.
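To make that arithmetic concrete, here's a minimal Python sketch (the function and variable names are illustrative, not from any paper), assuming perfectly smooth eye tracking and a frame that stays fully static for its entire visible duration:

```python
# Minimal sketch of the tracking-based blur arithmetic above.
# Assumes perfectly smooth eye tracking (no saccades) and that each
# frame remains fully static for its whole visible duration.

def eye_displacement_px(tracking_speed_px_per_s: float, frame_time_s: float) -> float:
    """Distance the tracking eye moves while one static frame is displayed."""
    return tracking_speed_px_per_s * frame_time_s

# 60 fps sample-and-hold display, 1000 px/s tracking speed:
print(eye_displacement_px(1000, 1 / 60))  # ~16.7 px of blur per frame
```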
Of course, this is just a Coles Notes 101 version of the science papers I've read over the last two years, and already linked to several times in the past.
The existing knowledge is all easily simplified down (for the typical human vision case): for easily calculated persistence (e.g. square-wave persistence, such as ON-OFF, or instant transition to the next frame, with no decay or ghosting effects to muddy up the persistence), 1ms of persistence (static frame time) exactly equals 1 pixel of motion blur at 1000 pixels/second.
It of course makes several assumptions (a quick sketch of the rule follows this list):
-- Square-wave persistence (ON-OFF, or if there is no off period, instant transition to the next frame); phosphor decay and other effects complicate the math
-- Certain modern displays now resemble square-wave persistence (e.g. OLEDs, fast LCDs, some DLPs) more closely than ever before, which makes tracking-based motion blur accurately calculable for motion cases (e.g. framerates high enough to be perceived as motion, without stop-motion feel or edge-strobing effects).
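Here is that rule as a toy Python illustration (hypothetical names; a sketch under the square-wave assumptions above, not a vision-science model):

```python
# Toy illustration of the "1 ms of persistence = 1 px of blur at
# 1000 px/s" rule, valid only under square-wave persistence.

def blur_px(persistence_ms: float, speed_px_per_s: float) -> float:
    """Tracking-based motion blur width for a given static-frame time."""
    return persistence_ms * speed_px_per_s / 1000.0

print(blur_px(1.0, 1000))   # 1.0 px  -- the rule itself
print(blur_px(16.7, 1000))  # 16.7 px -- a full 60 Hz sample-and-hold frame
print(blur_px(2.0, 1000))   # 2.0 px  -- e.g. a hypothetical 2 ms strobe flash
```

Note how a strobe backlight can push persistence well below the refresh period -- that decoupling is the whole point of LightBoost-style strobing.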
Here, the tracking-based display motion blur caused by persistence (static image time) corresponds very accurately (for most human vision) to the well-known photographic camera equivalent -- 1ms of shutter-open time on a panning camera equals motion blur of 1/1000th the distance the camera pans during 1 second (2ms equals twice the blur, 3ms equals three times the blur, etc). On-screen persistence behaves just like a camera shutter in terms of motion blur behaviour, at framerates high enough to be perceived as motion. For flicker-free displays, 120fps has half the motion blur of 60fps, and flicker-free 240fps has half the motion blur of flicker-free 120fps. (We're talking about completely flicker-free displays: no strobing, no phosphor decay, no light modulation.)
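The flicker-free halving falls straight out of the same arithmetic; a quick check (illustrative only), again assuming 1000 px/s tracking:

```python
# Flicker-free (sample-and-hold) display: persistence = 1/framerate,
# so each doubling of framerate halves the tracking-based motion blur.
TRACKING_SPEED = 1000  # px/s, as in the examples above
for fps in (60, 120, 240):
    persistence_ms = 1000.0 / fps
    blur_px = persistence_ms * TRACKING_SPEED / 1000.0
    print(f"{fps} fps: {persistence_ms:.1f} ms persistence -> {blur_px:.1f} px blur")
# Output: 16.7 px at 60 fps, 8.3 px at 120 fps, 4.2 px at 240 fps.
```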
Vision researchers have already confirmed this effect (for typical vision, aka most humans) -- vendors such as vpixx and others sell true-500Hz and true-1000Hz scientific projectors for such experiments. It will be a long time before we are simultaneously blur-free AND flicker-free, since that requires framerateless technology (or ultrahigh framerates to approximate it -- much like a 1/1000sec high-speed camera shutter, or a 1000fps@1000Hz display -- in order to limit motion blurring to 1ms without inserting black periods until the next frame). Strobing is only a band-aid for now (even as I advocate LightBoost heavily). Strobing has its cons: flicker, and the need for a high strobe rate to eliminate that flicker; the requirement of stroberate=framerate for proper motion clarity (otherwise LightBoost motion clarity isn't worthwhile), which is often problematic with today's GPUs; and the unofficial LightBoost brightness/color degradation (finally solved by the EIZO FG2421 and some next-generation strobe backlights).
But fortunately it's at least not impossible anymore, given sufficient money (good GPUs...) and game choice. (e.g. If you visit many forums, you've noticed those "LightBoost is crap below X fps" posts -- in threads I never read before today, but googled and easily found.) As a result, users of ultrapowerful GPUs are more likely to see LightBoost benefits. Controlled tests confirm the effect is maximized at framerate=stroberate (e.g. 120fps panning photo tests). Metaphorically, it's much like a movie camera or video camera opening its shutter only once during each frame -- otherwise the resulting movie would look like crap. As the eye tracks the motion, each frame is presented at the correct time with objects in the correct place: zero stutter, zero motion blur.
What's not well covered/researched is the sudden emergence of really good strobe-backlight displays, including unofficial ones (LightBoost) and official ones (EIZO Turbo240, BENQ Blur Reduction, and the G-SYNC monitors' LightBoost sequel revealed by John Carmack); very little scientific study has been done on those. However, the science of persistence and tracking-based motion blur is long established. The monitor manufacturers are finally starting to pay a bit more attention...
That said, I agree: more scientific research is needed, and I would love to see peer-reviewed papers on the widely confirmed benefits of lower persistence on recent (2013-era) gaming computer monitors.
Either way, it's indisputable that there can be benefits to including optional strobe modes in various monitors, even if it doesn't benefit you. And some gamers may not see/track motion in ways that exercise the ideal-case scenarios that strobing benefits (even if the vast majority of those same gamers easily see it in motion tests).