Where can I view a 1000FPS display?

Discussion in 'General Science & Technology' started by Tronkostroglutegunkr, Oct 16, 2011.

  1. Tronkostroglutegunkr Registered Senior Member

    Messages:
    16
    Modern computer monitors only go up to 120 FPS, but I want to test human time perception, so I'll need something far above that.
     
  3. Believe Happy medium Valued Senior Member

    Messages:
    1,194
    120fps is already well above human perception.

    http://en.wikipedia.org/wiki/Frame_rate
     
  5. Tronkostroglutegunkr Registered Senior Member

    Messages:
    16
    That's talking about continuity of vision, which is about images overlapping. I'm talking about "What's the smallest visual increment a human can perceive."

    I've tested that myself using Flash and video editors. If I run the frame rate at 100 FPS and place a single random frame of something completely different into a video, I can spot it (though there is some tearing of the image, since that approaches the monitor's refresh rate). This shows that a hundredth of a second is an understandable unit of time for the human brain. Note: so long as it is something of normal/high contrast. The refresh rate itself plays a part too, because bright images beat dark images in perception; there is a light/eye element here.

    Since I can still see it with a cap of 1/100th to 1/120th of a second, I need a much higher FPS display to test the limits. I don't actually think it's hard to achieve these high frame rates, since it's digital, but it seems pointless for general use, so as far as I can tell nobody makes them.
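    For anyone repeating this experiment, the timing arithmetic is a one-liner. A hedged Python sketch (the helper name is my own, not from any library) of how long a single inserted frame stays visible at various playback rates:

```python
# How long one inserted frame stays on screen at a given playback rate.
# (Illustrative helper; name is my own invention.)
def frame_duration_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (60, 100, 120, 1000):
    print(f"{fps:>4} FPS -> single frame visible for {frame_duration_ms(fps):.2f} ms")
```

    At 100 FPS the odd frame is on screen for 10 ms; at 1000 FPS, only 1 ms.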
     
    Last edited: Oct 18, 2011
  7. Believe Happy medium Valued Senior Member

    Messages:
    1,194
    So there are some basic problems with this that you're going to have to test first, to even see whether you need a faster monitor or not (you probably don't).

    The first problem is whether your monitor can actually show a 100Hz signal. If your monitor is 120Hz capable, it could be capable in one of two ways:

    1. It is only capable of accepting a 60Hz signal, which it then uses internal software to double, creating one frame in between each pair. This is how pre-3D televisions did it, and it is also how many TVs and computer monitors still do it. The extra frame is added to reduce motion blur. (i.e. it can't accept a 100Hz signal at all)

    2. The unit is actually capable of accepting and displaying a 120Hz signal. Usually this is only the case with 3D monitors. (It can display a 120Hz signal, but will probably have to adjust the frame rate to compensate for only getting a 100Hz signal.)

    Now why is this a problem? Why, it's a little thing called telecine:

    http://en.wikipedia.org/wiki/Telecine

    This basically means that if the frame rate you put forth (100Hz) does not match the display capability of the monitor (60Hz or 120Hz -- I've never seen one that works in between, though some professional setups might), then frames are either added or partially removed to bring it up or down to the correct rate. What this means for your single frame is that it was likely stretched over the course of several frames, making it easier to see.
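    To make the mismatch concrete, here's a rough Python sketch (my own simplification, not how any particular monitor's pulldown actually works) of which source frames end up on which display refreshes:

```python
# Map each display refresh to the source frame on screen at that instant.
# (Simplified nearest-frame model; real telecine/pulldown logic varies.)
def frames_shown(src_fps: int, display_hz: int, refreshes: int):
    return [int(i * src_fps / display_hz) for i in range(refreshes)]

# A 100 fps source on a 60 Hz display: frames 2, 4, 7... never appear,
# while others are held for two refreshes -- stretched, easier to spot.
print(frames_shown(100, 60, 6))   # [0, 1, 3, 5, 6, 8]
# 24 fps film on 60 Hz shows the classic 3:2 pulldown cadence:
print(frames_shown(24, 60, 5))    # [0, 0, 0, 1, 1]
```

    Either way the single inserted frame no longer occupies exactly one refresh, which is the telecine problem described above.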

    The second problem has to do with the monitor itself. If (and I'm assuming it is) the monitor is LCD or LED and you run a single frame through it that contrasts greatly with the frames on either side of it, you will introduce big-time artifacts into the frame which can stay around for several frames, also making it easier to spot.

    To do this right you need to do two things. One, match the frame rate displayed to the frame rate of the monitor exactly. Two, conduct your experiment on either a plasma television or an old-school CRT monitor (a good one, too). Both of these switch very fast and will not introduce extra artifacts into the media (assuming your software doesn't). After you have tried this, you can then say for sure whether you need a faster monitor (you don't). Also, you may want to try it without high contrast (i.e. don't go black to white). Something like just replacing the person talking with someone else who is dressed differently but in the same colors will help the monitor keep up better.

    Also, you can't test it on yourself. If you made it, you know where to look for the frame and you will be anticipating it, making it easier to see. If you want to do it right, you cannot even tell the people you're testing it on that it is there at all. Just ask them at the end if they saw anything funny.
     
  8. MacGyver1968 Fixin' Shit that Ain't Broke Valued Senior Member

    Messages:
    7,028
    Why would anyone build a monitor with 1000fps capability? And even if they did, do you have tens of thousands of dollars to purchase one?
     
  9. Tronkostroglutegunkr Registered Senior Member

    Messages:
    16
    That's informative, thanks.
     
  10. Believe Happy medium Valued Senior Member

    Messages:
    1,194
    You're welcome! I'm just glad that my normally useless knowledge could help someone, at least.


     
  11. BrandeX Registered Senior Member

    Messages:
    9
    What you want is a tachistoscope, and this sort of research has already been done (if I understand your intent).

    Wikipedia: Tachistoscope or similar
     
  12. Trippy ALEA IACTA EST Staff Member

    Messages:
    10,890
    The problem with monitors is the computation required to render each frame.

    This, however, can be avoided.

    Buy yourself a tri-colour LED and connect it up to a cheap controller; if your programming skills are up to par, you could even do it through an I/O port on your computer. Program it so that it flashes 1000 times per second, and for every 999 red flashes there's a green one, or something similar.

    I predict that you won't be able to detect the green flash, but I also predict that as long as they aren't consecutive, as you increase the number of green flashes, that the light from the LED will appear increasingly yellow.
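    If it helps, the color-mixing prediction can be sketched numerically. This is my own toy model (assuming the eye simply time-averages the light linearly), not a claim about real photometry:

```python
# Time-averaged RGB color over one second of red/green flashes.
# (Toy linear-mixing model; names and numbers are illustrative only.)
RED, GREEN = (255, 0, 0), (0, 255, 0)

def averaged_color(green_flashes: int, total_flashes: int = 1000):
    g = green_flashes / total_flashes
    return tuple(round((1 - g) * r + g * grn) for r, grn in zip(RED, GREEN))

print(averaged_color(1))     # 1 green in 1000: mix is still essentially pure red
print(averaged_color(500))   # half green: the average shifts toward yellow
```

    As the green-flash count rises, the averaged color drifts from red toward yellow, matching the prediction above.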
     
  13. mdrejhon Registered Member

    Messages:
    4
    This is an old thread, but it contains information that new research has since made outdated.
    However, if the LED is displaying a dim color and the different color is a bright one, you WILL see the flicker. The human eye has no lower limit to flicker detection, provided the flicker is sufficiently bright to compensate for its shortness.

    For flickers that look instantaneous to the eye (e.g. a photo flash), this is known as the Talbot-Plateau law: a flicker twice as bright looks the same brightness as a flicker twice as long. So a 1 millisecond flash looks as bright as a 1 microsecond flash that's 1000x brighter; both look equally 'instant' to the eye. We're assuming there are enough photons for the light to be detected, of course. Also, with such a short flicker, the extreme brightness typically has no time to oversaturate/overheat the retina, and it still looks the same brightness.
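    The brightness/duration trade described above reduces to simple arithmetic; a toy sketch of my own (real vision science is more involved):

```python
import math

# Toy Talbot-Plateau model: for flashes much shorter than the eye's
# integration window, perceived brightness tracks intensity x duration.
def flash_energy(intensity: float, duration_s: float) -> float:
    return intensity * duration_s

one_ms = flash_energy(1.0, 1e-3)      # baseline 1 ms flash
one_us = flash_energy(1000.0, 1e-6)   # 1000x brighter, 1000x shorter
print(math.isclose(one_ms, one_us))   # both deliver the same light energy
```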

    However, this is actually a totally different matter from the benefits of a 1000fps @ 1000Hz display, which eliminates motion blur and stroboscopic artifacts.

    Recent vision research has definitively shown the earlier claim in this thread -- that 120fps is beyond human perception -- to be incorrect.

    Bumping an old thread, because it has since been determined that human eyes can benefit from a 1000fps @ 1000Hz display, especially if you're trying to invent a display for holodeck/VR uses. For these applications you need to eliminate all side effects of finite-framerate displays: stroboscopic effects, wagon-wheel artifacts, flicker issues, sample-and-hold motion blur (above and beyond natural human limitations), etc. You can solve one or a few of these, but unfortunately it is scientifically impossible to solve all of them simultaneously on a finite-framerate display; at least one of the above artifacts remains noticeable as a dead giveaway. Recently I replied to an engineering question about high-framerate displays, but apparently I can't post links directly as a new poster, so you will have to copy and paste these into Google to find the articles.

    Google searches of industry writings:
    -- Michael Abrash of Valve Software: Down the VR rabbit hole: Fixing judder, mentions 1000Hz display
    -- John Carmack of id Software: QuakeCon keynote talking about motion blur, YouTube, at 5m35s
    -- Why Do Some OLED's Have Motion Blur (has lots of science references)

    Google searches of some of my writings
    -- AVSFORUM article; Why We Need 1000fps @ 1000Hz this century
    -- 1000fps OLED engineering question on Electronics Stack Exchange
    -- TestUFO Animation: Eye Tracking Motion Blur (change top selector to the correct animation "Eye Tracking")

    Thanks,
    Mark Rejhon
    Owner of BlurBusters -- Blog About Eliminating Motion Blur On Displays
     
  14. mdrejhon Registered Member

    Messages:
    4
    Incidentally, NVIDIA just announced G-SYNC, which is a variable refresh-rate technology -- the monitor doesn't refresh at discrete scheduled intervals, but asynchronously, whenever the graphics card finishes rendering a frame! The time between refreshes varies. This is one good small step towards an infinite-frame-rate display.

    Although G-SYNC is currently capped at 144fps, there is no technical reason why the maximum frame-delivery rate of a variable refresh-rate display cannot be raised. We are still unable to simultaneously eliminate motion blur -and- stroboscopic effects on any finite-framerate display.

    Side note -- Vpixx has a 500Hz projector for vision scientists/researchers, and they've demonstrated that humans can still see motion blur (e.g. 2 pixels of blurring during 1000 pixels/sec motion; e.g. fine tiny text in a fast-scrolling marquee) and/or stroboscopic effects (the familiar wagon-wheel effect, or the mouse-dropping effect when waving a mouse in a circle). But it's not possible to eliminate motion blur and stroboscopic effects simultaneously; only one or the other can be eliminated at a time. Artificially adding motion blur to the frames to fix the stroboscopic effect is very problematic for VR goggles that want to match "Holodeck quality" as closely as possible. Externally forced motion blur (from the display, or baked into the frames) is unacceptable for high-end VR, where we want 100% of motion blur to be naturally human, not forced upon the retinas. This means, unfortunately, that 500fps@500Hz isn't enough to create a "Holodeck", because the laws of physics force a blur/strobe tradeoff.

    (various similar peer reviewed papers regarding the common blur/strobe tradeoff are already written elsewhere over the last ten years - unable to link them due to my newuser status)
     
  15. mdrejhon Registered Member

    Messages:
    4
    Since I last posted, I have collaborated with NIST.gov and NOKIA researchers on a peer-reviewed conference paper about display motion-blur testing.

    I also tested a prototype 480Hz monitor, and have since witnessed experimental 1000Hz displays in the laboratory. I have written an article about blurless sample-and-hold:

    Blur Busters Law: The Amazing Journey To Future 1000Hz+ Displays

    The difference is indeed noticeable due to:
    • Motion blur
      1ms of persistence (MPRT) = 1 pixel of motion blur per 1000 pixels/sec
    • Stroboscopic effects
      And related universe of artifacts: phantom array, rainbow artifacts, wagon wheel effect, etc.
    • Persistence of vision effects
      The effect you see in the animation at www.testufo.com/persistence - the same category of phenomena as Nipkow wheels, mechanical POV clocks (bouncing/spinning LED sticks), spinning bike-wheel graphics, etc.
    In the long term, >1000Hz will eventually be necessary (in the decades/centuries to come) for humankind to pass a Holodeck Turing Test (reality indistinguishable from virtual reality): no strobe effects, no additional motion blur forced on you above and beyond natural vision, etc.
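    The MPRT arithmetic in the list above can be sketched in a few lines of Python (helper name is my own):

```python
# Sample-and-hold motion blur: persistence (MPRT) x eye-tracking speed.
def motion_blur_px(persistence_ms: float, speed_px_per_s: float) -> float:
    return speed_px_per_s * persistence_ms / 1000.0

print(motion_blur_px(1.0, 1000))   # 1 ms MPRT at 1000 px/s -> 1.0 px of blur
print(motion_blur_px(8.3, 1000))   # ~60 Hz-class persistence: far more blur
```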

    My article is full of extremely useful images.

    For more info about recent 480Hz and 1000Hz experiments and findings, read here.
     
    Last edited: Jan 2, 2018
  16. mdrejhon Registered Member

    Messages:
    4
    As the resident Refresh Rate Einstein, I'd like to add:

    1. I've collaborated with NVIDIA on Temporally Dense Ray tracing. (See Page 2 for credit)

    2. I've created a new article, Frame Rate Amplification Technologies, about how future GPUs will be able to create high frame rates cheaply with dedicated silicon designed for lagless/flawless equivalents of interpolation. Today, one of these is already in wide use -- Oculus Asynchronous Space Warp version 2.0, which is essentially near-flawless 3D interpolation that uses high-Hz head tracking and the Z-buffer to make it more accurate and less of a black box.

    Either way, it is currently anticipated that future GPUs will be able to generate 1000fps out of just 100fps within ten years, via the brand-new GPU workflows mentioned in the article.

    At 100-200fps, the latency of frame rate amplification can become practically nil, especially with lookbehind-only information feeds (e.g. 1000Hz mouse data, etc) to make it far less blackbox. Zero parallax artifacts, etc.

    Eventually the 3D GPU workflow will slowly morph into a different workflow -- potentially using ray-tracing techniques, or potentially a 3D metaphorical equivalent of the I-frames, B-frames, and P-frames in video codecs, where only one frame per second is fully compressed and everything in between is predicted (differences humans are typically unable to tell).

    Many researchers are working to eliminate the need for a full GPU render of every single frame by creating nearly flawless intermediate frames, producing strobeless motion-blur reduction (no need for strobing / phosphor / black frames / flicker) -- de facto blurless sample-and-hold, emulating analog step-free motion via ultra-high Hz.

    The refresh rate race towards retina refresh rates is a kind of slow Moore's Law, with refresh rates doubling approximately every 10 years(ish) in the near term. Manufacturers need to double the refresh rate (120Hz -> 240Hz -> 480Hz -> 960Hz) to keep the improvements human-visible, and the computer side of things needs to keep up.

    The diminishing-returns curve is very steep, but the weak links are becoming much better understood by researchers. Consider: 60Hz -> 120Hz is an 8.3ms blur improvement (the difference of 1/60 vs 1/120), while 120Hz -> 1000Hz is only a 7.3ms blur improvement (the difference of 1/120 vs 1/1000) -- one has to go very far up the curve and fix all the weak links along the way.

    Assuming GtG is not a limiting factor (0ms GtG), this means MPRT (sample-and-hold) becomes the motion blur limiting factor.
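    The diminishing-returns numbers above fall straight out of persistence arithmetic; a small sketch (my own helper, assuming 0ms GtG so MPRT is the only blur contributor):

```python
# Persistence of one refresh cycle, in milliseconds, assuming 0 ms GtG
# so MPRT (sample-and-hold) is the only motion-blur contributor.
def persistence_ms(hz: float) -> float:
    return 1000.0 / hz

gain_60_to_120 = persistence_ms(60) - persistence_ms(120)
gain_120_to_1000 = persistence_ms(120) - persistence_ms(1000)
print(round(gain_60_to_120, 1), round(gain_120_to_1000, 1))   # 8.3 7.3
```

    So one doubling (60 to 120) buys almost as much as the entire climb from 120 to 1000 -- the diminishing-returns curve in numbers.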




    The higher the resolution, the more visible the limitations of a low refresh rate become. Thus, 4K needs 1000Hz more badly than 480p, because there are more pixels per second to blur over the same angular motion (pixels per radian of movement, pixels per inch of movement, etc).

    One screenwidth per second on 4K is 3840 pixels/sec, which is 3.84 pixels of motion blur even on a 1000Hz display -- so even 1000Hz is not necessarily the final frontier. Some researchers are discovering cheap ways to do ultra-high Hz, much as yesterday's researchers discovered how to make 1080p and 4K cheap, so if it's a small value-add, it will be added.

    For those familiar with variable refresh-rate technologies such as G-SYNC or FreeSync -- ultra-high Hz also behaves like per-pixel variable refresh. VSYNC ON, VSYNC OFF, G-SYNC, FreeSync, Fast Sync, etc. all begin to equalize in appearance and latency the closer you get to retina refresh rates, which makes many sync technologies unnecessary.

    Because of the tiny refresh granularity, five video windows at 24fps, 25fps, 50fps, 59.94fps, and 60fps all look identically smooth on an ultra-high-Hz display: the stutter from quantizing frames to refresh cycles is so tiny as to be invisible. So an ultra-high-Hz display also essentially makes VRR obsolete someday in the future. Ultra-Hz doesn't just benefit ultra-high framerates; it universally smooths out frame rates on a per-pixel basis -- as if each pixel had its own independent refresh rate!
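    The granularity claim can be made concrete with a hedged sketch (my own quantization model): each frame's presentation time snaps to the next display refresh, so the worst-case timing error is one refresh period.

```python
# Worst-case error when a frame's ideal presentation time is quantized
# to the display's refresh grid (simplified model of judder/stutter).
def worst_case_judder_ms(display_hz: float) -> float:
    return 1000.0 / display_hz

print(worst_case_judder_ms(60))     # 24 fps on 60 Hz: up to ~16.7 ms of judder
print(worst_case_judder_ms(1000))   # on 1000 Hz: at most 1 ms of error
```

    At 1000Hz the quantization error never exceeds a millisecond, regardless of the content's frame rate -- which is why every frame rate looks equally smooth.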

    Another great angle for film makers -- it also theoretically lets a movie director choose the preferred motion clarity of their films, even on a per-scene or per-portion-of-scene basis. Want the normal blurry 24fps film look? Done. Want your picture to suddenly go CRT-motion-clarity? Done. Want your picture to flicker like a phosphor tube TV? Done. Want half of your frame to look like projected film (complete with rolling shutter and motion blur) and the other half to look like a CRT tube (zero blur, flicker)? Done. Ultra-high Hz gives directors control over the motion clarity of their scenes -- or even portions of their scenes. Instead of the display limiting you, the motion clarity becomes controllable by the source material! It's a matter of manipulating what each individual refresh cycle looks like (clearer, blurrier, flickering, etc.).

    There are some UltraHFR experiments under way. Past experiments failed human-benefits tests until the weak links were understood (the tips listed at the bottom of the UltraHFR article).

    Even as I collaborate with various parties such as NIST, NVIDIA, etc. -- the myth that ultra-high Hz has no benefits is still pervasive (like yesterday's "humans can't see 30fps vs 60fps" myth)...

    ...It still drops quite a few jaws until I show people in-person demos. I thought I'd add this followup for completeness' sake, because I had linked to this thread from a few places elsewhere. Prominent researchers who understand the weak links have now fully ceased to dismiss ultra-high Hz, and many are studying true ultra-high-Hz displays. It merits close attention as a "technology of the mid 21st century" to watch for.

    Cheers.
    Mark Rejhon
    Founder, Blur Busters / TestUFO
     
    Last edited: Jul 30, 2019
  17. davewhite04 Valued Senior Member

    Messages:
    3,710
    Nowhere.
     
