Talk:Display motion blur

From Wikipedia, the free encyclopedia

Rewrites by Mark Rejhon[edit]

I've stepped up to the plate to reword and rewrite portions of the article. I have worked in the home theater industry for multiple manufacturers [1][2] and, over time, have collected many excellent references that relate to this topic [3]. There was a lot of Europe-specific terminology ("100Hz"), so I've genericized it somewhat. I removed weasel words, phrases, and controversial statements and instead linked to references. I rearranged paragraphs (moved the second and third paragraphs to the "Causes" section), renamed sections to generic titles ("100Hz" renamed to "Motion Interpolation"), added some new references, removed many dead links (with replacements found), and slimmed down the motion interpolation section (there is already a separate article; extra material can be added there instead). I think most will agree that most of my changes are beneficial. If I made errors, please feel free to point them out specifically, line item by line item. Let's proceed cautiously so we don't "throw the baby out with the bath water", so to speak -- some references can be further improved, potential weasel words/phrases pointed out, etc. This isn't all perfect, but it is a big improvement. Mdrejhon (talk) 21:16, 2 October 2012 (UTC)[reply]

Fixed Items[edit]

Wrong article title? [fixed][edit]

I can't help noticing that of the six factors itemized in the opening of this article, only one really has much of a connection to HDTV. LCDs, 3:2 pulldown, telecine lag, and intentional CGI motion blur all predate HDTV. Algr (talk) 07:41, 23 December 2007 (UTC)[reply]

I wouldn't disagree with these points. "HDTV blur" connotes the various artifacts more generally than any other term. Also, this informative page is meant to ultimately bubble up to the top of a Google search for "hdtv blur" and serve as an informative explanation for people who don't understand where the blurring is coming from on their HDTV set.

Many less tech-savvy consumers who are buying high-definition sets to replace their tubes are noticing the blurriness and wondering "why is my HDTV blurry?" Even though the subject matter relates to a multiplicity of technologies and topics, this is what I felt the most common search would be for the average person trying to understand the problem.

There are similar resources that pre-date this wiki page, such as "HDTV lag", which inspired me to create this page. Ahigh (talk) 07:58, 7 January 2008 (UTC)[reply]

Fixed. Moved to "Display motion blur" because that has more Google search results (using a quoted search with the quotation marks, "display motion blur" versus "hdtv blur"). I kept redirects for HDTV blur, LCD blur, and LCD motion blur, which all now point to this article. I added a sentence saying that the introduction of digital technologies (e.g. compression artifact blur) and panel technologies (e.g. LCD) created additional motion blur during the modern HDTV era. Mdrejhon (talk) 21:19, 2 October 2012 (UTC)[reply]

retinal blurring [fixed][edit]

It's not effects that cause retinal blurring. It is sample-and-hold display technology that causes retinal blurring. Keeping sample-and-hold but doubling the update frequency halves the retinal blurring due to eye-tracking of moving objects. It's just that simple. Again, prove to me that you understand retinal blurring, and we can move on. You said Poynton doesn't make claims about a linear relationship between hold time and the amount of retinal blurring due to eye tracking, but he explains it very clearly. And in fact, keeping sample-and-hold while doubling the framerate halves retinal blurring. Since you're the logic nazi, let's assume that I am wrong, and I will demonstrate it's absurd. Retinal blurring is the result of the pixels being on while your eye sweeps past an image. If there isn't a linear relationship between the hold time period and the amount of retinal blur, then the relationship would be nonlinear. So what nonlinear relationship would it be? Of course it's absurd to believe that the reduction in retinal blur would be nonlinear in the hold time; therefore it is proven. Is that enough of a proof for you? I suspect the problem is that you simply don't understand retinal blurring, or you simply refuse to believe it exists somehow. It's a very subtle effect and something that very few people understand. But that doesn't negate its existence, or comprehension of its existence by people who understand it. Ahigh 03:45, 7 November 2007 (UTC)[reply]

Fixed. I've added citations to an additional paper. An excellent reference is page 3 of [4], which explains eye- (fovea-) tracking-based motion blur on sample-and-hold technologies. In addition, I noticed one of the other references already explains this too. So with two references, it now appears to be covered. That said, let's all remain professional (all sides, please). Mdrejhon (talk) 22:10, 2 October 2012 (UTC)[reply]
Strengthened even further. There are now at least four academic citations that explain eye-tracking-based motion blur (retinal blurring), and how it's a completely separate motion-blurring factor from LCD pixel response speed. Having worked in the home theater industry before, I also personally agree with the academic assessments, based in part on some informal tests I've done personally. Mdrejhon (talk) 17:28, 3 October 2012 (UTC)[reply]

To Be Researched For Citations[edit]

Half Motion Blur[edit]

I removed the references to doubling the framerate halving the blur. It was unreferenced and likely the result of the false assumption that the perception of blur *must* be linearly related to the refresh time. Without a reference or expert knowledge that this is the case (it could be, but needs to be tested, not assumed), please don't add this claim again. Logicnazi 21:50, 19 August 2007 (UTC)[reply]

Mathematically, it is correct, at least -- doubling the framerate halves the blur. An electronic camera rig programmed to linearly track the object, running at a slow shutter speed, would record 50% less motion blur at 120Hz, 75% less at 240Hz, and 87.5% less at 480Hz -- because of the distance of the camera tracking movement versus the frame stepping forward (50% smaller steps at 120Hz, 75% smaller steps at 240Hz, etc.). It is easy to come up with the appropriate mathematical equations proving that double framerate = half motion blur (for a precision camera tracking rig running at a slow shutter speed). However, this does not account for eye-tracking inaccuracies and saccades, which vary in accuracy depending on the speed and clarity of the object being tracked. Some sources on the Net say double framerate equals half motion blur.
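
The per-frame arithmetic above can be sanity-checked in a few lines of Python -- a minimal sketch assuming perfect tracking and a full-persistence (sample-and-hold) display; the object speed and function name are illustrative, not from any cited source:

    # Blur extent for a perfectly tracked object on a sample-and-hold
    # display: the distance the object steps forward per refresh.
    # Illustrative sketch only; speed and names are assumptions.

    def blur_width_px(object_speed_px_per_s, refresh_hz):
        return object_speed_px_per_s / refresh_hz

    speed = 960.0  # pixels per second (arbitrary example)
    base = blur_width_px(speed, 60)
    for hz in (60, 120, 240, 480):
        blur = blur_width_px(speed, hz)
        print(f"{hz:3d} Hz: {blur:5.1f} px smear, "
              f"{100 * (1 - blur / base):.1f}% less than 60 Hz")
    # Prints 50% less at 120 Hz, 75% at 240 Hz, 87.5% at 480 Hz,
    # matching the figures in the comment above.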

Understand retinal blurring first, please[edit]

I appreciate the contributions (or deletions, as it were); however, if you read through the references (especially the Poynton reference) and actually understand how retinal blurring works, you'd realize that 120Hz doesn't solve retinal blurring. Strobing does. 120Hz reduces retinal blur by 50% versus 60Hz. That is all.

I've been making video games for over 20 years now, including an Atari Games coin-op arcade racer, San Francisco Rush 2049 (my name is Aaron Hightower). I know what I'm talking about, and I created this page to educate people.

Obviously, it's open to editing from anyone who knows more than me, but you can't say that I don't know what I'm talking about when I say 120Hz cuts retinal blurring in half. It does. I know, because I know how it works, and you clearly don't. User:Ahigh 8:37pm, September 17, 2007. —Preceding signed but undated comment was added at 03:39, 17 September 2007 (UTC)[reply]

Where is a source? Poynton does NOT say what you want. As far as I could see, he makes no claims at all about a quantitative amount by which perceived blurring is reduced. He makes claims about 50% changes in some other things, but NOT about the actual apparent perceptual blur. No one doubts that going to 120Hz from 60Hz reduces retinal blur; the question is whether it reduces it by 50%. Hell, it's not even clear what it *means* to reduce retinal blur by 50%. My best guess at an interpretation is this: if we sat down a bunch of people, showed them a bunch of videos (not telling them which was which), and asked them to estimate the relative amounts of blur, they would say this one has half as much blur as that one. But I didn't see any suggestion of this in Poynton or anywhere else.
Remember, just because it might reduce the effects that *cause* blurring by 50% doesn't mean that it reduces the perceived blur by 50%. For instance, doubling the intensity of a sound wave doesn't double the perceived volume, and the same effect could be at play here. Besides, at some frame rate there is going to be NO perceived blurring, so it can't always be the case that doubling the frame rate halves perceptual blurring. In any case, I'm fine with the article as it stands now (saying it reduces blurring), but I would like you to point out where Poynton contradicts me before putting back the 50% figure. Logicnazi 16:01, 23 September 2007 (UTC)[reply]
Research is definitely needed, or a correct paper found, but the 50% figure is actually mathematically accurate (subject to the limitations of eye-tracking accuracy, which come into play during faster motion). Objective vs. subjective testing is a different matter.

As the eyes track motion on a sample-and-hold display, an object moving at a given speed might move one centimeter between frames at 60Hz. As your eyes track, this leads to a retina-smear of 1 centimeter (relative to the image plane). At 120Hz, an object moving at the same speed at double the frame rate moves half that increment -- half a centimeter -- leading to a retina-smear of 0.5 centimeter (relative to the image plane).

The difference would be visible in photographs, assuming you had an electronic camera-panning unit that moved the camera to track perfectly smoothly on the moving object on the LCD screen, and you used a slow shutter speed (1/10 sec). The 1/10 sec exposure at 60Hz would show double the motion blur of the 1/10 sec exposure at 120Hz, because of the reduced smearing of each stationary frame across the moving eye (moving camera) before the object stepped forward in the next frame. I read online that one manufacturer used a moving mirror to do the equivalent for motion-blur testing of their LCD panels. (I'll try to dig that link up.)

Mathematically, it is very easy to prove the 50% reduction in retina-tracking-based motion blur when doubling the framerate (excluding ALL OTHER factors). What makes this complicated is that there are multiple factors contributing, including the imperfection of human eye tracking and other sources of motion blur (e.g. camera shutter). "Mathematically proven motion blur" is not the same thing as "human perceptual motion blur", but I think this topic needs to be covered. Some additional references could be mined from my reference links list at [5] (my blog), although not all references are relevant to this article. Working together, we can build enough references, mined from elsewhere, to show the mathematically provable relationship between framerates, strobe lengths (CRT vs. sample-and-hold vs. scanning backlight), and HDTV blur. Mdrejhon (talk) 17:42, 2 October 2012 (UTC)[reply]
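
For reference, a worked version of the camera-rig numbers in the comment above, under the same assumptions (perfect smooth pursuit, full-persistence display); the 60 cm/s object speed is chosen only to reproduce the 1 cm figure, not taken from any source:

    # Sketch of the slow-shutter photograph described above: during a
    # 1/10 s exposure, the rig pans with the object while each held
    # frame smears by (speed / refresh rate) across the image plane.
    # Illustrative assumption: object speed of 60 cm/s.

    exposure_s = 0.10
    speed_cm_per_s = 60.0
    for hz in (60, 120):
        frames_captured = int(exposure_s * hz)
        smear_per_frame_cm = speed_cm_per_s / hz
        print(f"{hz} Hz: {frames_captured} frames captured, "
              f"{smear_per_frame_cm} cm smear per frame")
    # 60 Hz: 6 frames, 1.0 cm each; 120 Hz: 12 frames, 0.5 cm each --
    # half the per-frame smear at double the framerate.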

120 Hz adds to lag?[edit]

I was wondering if there was a source that mentioned why 120 Hz technology added to lag. I assume it's simply because it has to process an extra frame, every frame. It doesn't just duplicate the original frame? So a 120 Hz TV would actually be a bad idea when attempting to eliminate lag? A lot of clueless sales people would try to say the opposite. One looking for a TV for video games should stay away from 120 Hz TVs? There is an extraordinary amount of confusion about this, and Wikipedia could clear things up.--SkiDragon 05:37, 14 October 2007 (UTC)[reply]

On a 60Hz signal, the frame period is 16.66ms. Half of that is 8.33ms, which is the 120Hz period. In order to generate the in-between frames, the circuitry needs access to both the previous and the next frame. Having access to the next frame introduces one frame of latency to the signal. This does not include any additional latency due to signal processing; I don't know a typical latency for generating the interpolated imagery itself, but I would not be surprised by very large numbers (for example, 50ms). Most players can notice 200ms of lag or more. When the total lag is less than one frame period, this is the ideal case. One frame of lag is the typical lag inserted by a good LCD. 120Hz sets can theoretically do just as well as typical LCDs, but in reality you will get the typical lag for an average LCD plus the lag associated with the image processing, which will be at least half a frame period of the input source (e.g. 8.3ms). Let me know if this is clear. Ahigh 05:19, 29 October 2007 (UTC)[reply]
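
A small sketch of the latency arithmetic in the comment above. The one-frame look-ahead follows the comment's reasoning; the 8.3ms processing figure is the comment's own lower bound, and the total is illustrative rather than a measurement of any real set:

    # Latency arithmetic for motion interpolation, per the comment
    # above: the interpolator must buffer the NEXT source frame
    # before it can compute in-between frames.  processing_ms is a
    # placeholder assumption, not a measured value.

    def interpolation_added_lag_ms(source_hz=60.0, processing_ms=8.3):
        frame_period_ms = 1000.0 / source_hz   # 16.66 ms at 60 Hz
        look_ahead_ms = frame_period_ms        # wait for the next frame
        return look_ahead_ms + processing_ms

    typical_lcd_lag_ms = 1000.0 / 60.0         # ~1 frame for a good LCD
    total = typical_lcd_lag_ms + interpolation_added_lag_ms()
    print(f"~{total:.1f} ms total input lag in this example")  # ~41.6 ms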

I'd love to clear this up, and this is a wonderful explanation, but it also constitutes original research and/or new information. Cite it somewhere and I love you forever. 146.115.51.64 (talk) 18:35, 3 January 2010 (UTC)[reply]

Having worked in the home theater industry, I can confirm motion interpolation definitely adds to lag, but I agree a citation to papers needs to be added by someone. Mdrejhon (talk) 22:10, 2 October 2012 (UTC)[reply]

plasma 600Hz subfield rate[edit]

LG has a number of displays that refresh at a rate of 600Hz. 600 is evenly divisible by 24, 25, 30, 50, and 60 -- just about any fps. Now, I'm not familiar with the technicalities, so I'm expecting someone to read this and find it interesting enough to update the article. — Preceding unsigned comment added by Laserscout (talkcontribs) 00:10, 13 December 2010 (UTC)[reply]

Agree that some info should be included. However, google "plasma temporal dithering", which complicates this topic somewhat. We need some good citations; somebody (or myself) will need to hunt them down.

The 600Hz subfield rate of plasma (and 720Hz) actually performs multiple refreshes of the same plasma image per input frame. In many cases there's no motion-interpolated information, so it's just simple repeated refreshes of temporally-dithered versions of the same frame. However, many plasmas do add some black period between the temporally-dithered subfields, which improves motion much more than LCD does. The actual subfield refresh is not necessarily 1/600 sec at 600 times per second, and is not based on 600 distinct images per second, so the motion is not as smooth as a theoretical "600Hz motion interpolation". It can be a surge of multiple temporally-dithered refreshes, followed by a black period. If the uninterrupted black period (the period without pixel pulses) is only 50% of a refresh, it'll be similar to an LCD with a strobed backlight that's dark 50% of the time.

Temporal dithering is simply well-calculated pixel pulses (also the source of "noise" in plasmas, especially in dark colors on cheaper plasmas). There's discussion in Internet forums (AVSFORUM, etc.), but forum threads aren't appropriate to link from Wikipedia, as far as I know. I'll try to find some real citations regarding the plasma subfield rate in relationship to motion blur. A good Google search term is "temporal dithering", which is what plasmas do; temporal dithering doesn't necessarily enhance motion at all unless the dithering pattern is intentionally adjusted to do so (e.g. motion interpolation, or black frames). Mdrejhon (talk) 22:14, 2 October 2012 (UTC)[reply]
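
A quick sketch of the divisibility point above: a 600Hz subfield clock maps every common frame rate to a whole number of subfields. As the comment notes, these are repeated/dithered slices of the same frame, not 600 distinct images per second:

    # 600 Hz subfield drive pairs cleanly with common frame rates:
    # each input frame maps to a whole number of subfields.
    SUBFIELD_HZ = 600
    for fps in (24, 25, 30, 50, 60):
        print(f"{fps:2d} fps -> {SUBFIELD_HZ // fps:2d} subfields per frame")
    # 24 -> 25, 25 -> 24, 30 -> 20, 50 -> 12, 60 -> 10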

Comments Beyond Scope Of Article[edit]

50Hz[edit]

Are the problems greater or lesser at 50Hz? (well, obviously there isn't the 3:2 pulldown problem - but what about the other factors?) zoney talk 15:14, 20 September 2007 (UTC)[reply]

I don't have as much experience with 50Hz, but all the issues are the same in general, except, as you mention, the 3:2 issue. There really aren't any significant differences besides that. I do know that the flicker of 50Hz is much more noticeable than 60Hz on a CRT. Another difference is that games can sometimes achieve solid 50Hz performance more easily than solid 60Hz performance, and therefore may actually look smoother on a CRT at 50Hz compared to another CRT at 60Hz.

The strobe effect on LCDs (LED backlighting) can be designed to flicker less than a CRT, trading off a slight bit of retinal blurring. I have yet to do my own personal tests on the Aptura and LED Motion Plus technologies. I do know that Aptura sets are more popular in the 50Hz format. Ahigh 03:45, 7 November 2007 (UTC)[reply]

I'd deem this 50Hz discussion unrelated to the article. There is, however, a combined motion-interpolation benefit and flicker-reduction benefit when using a flickery display technology (e.g. CRT) -- when doing 50Hz->100Hz motion interpolation on a CRT, for example (or an LCD with a strobed backlight). Mdrejhon (talk) 00:02, 3 October 2012 (UTC)[reply]

Needs updating since introduction of LaserVue[edit]

After seeing the LaserVue set, I was disappointed by the extreme loss of viewing angles. I have not yet evaluated the set for motion sharpness, but my personal interest in the set went way down after seeing one first-hand. Ahigh

Viewing angle is also a disadvantage of some LCDs; some are better than others. From my understanding, some laser-based displays could have better viewing angles than others, so I suggest this is an external, separate factor unrelated to display motion blur. I'm still interested in seeing this information added to a different (appropriate) Wikipedia article, which could be wiki-linked from this article. There must already be a Wikipedia article about the viewing angles of various technologies. Mdrejhon (talk) 22:10, 2 October 2012 (UTC)[reply]

what the heck is BenQ on about?[edit]

Making an LCD flicker on purpose = "as stable as a CRT"? Well, that's probably true from a purely objective perspective, but it's a very weaselly way of saying they've made it LESS stable than it already was. I moved from CRT to LCD as soon as I could afford it, thanks to the much greater image stability and reduction of flicker. A slight, probably psychosomatic or biologically-derived bit of blurring is far preferable to the eyestrain and headaches a constant scanning strobe produces.

Save us from insane marketing gibberish, someone... 193.63.174.10 (talk) 10:29, 3 June 2010 (UTC)[reply]

My view is that this is simply an interpretation of "as stable as a CRT" which is potentially confusing -- if BenQ really meant "an LCD with stable and clear motion similar to a CRT", then BenQ is being more accurate, especially if BenQ is using very short strobes for their backlight. Some links to academic papers about strobed/scanning backlights can be found at [6], and some of these references show that shorter strobes lead to less motion blur; this is also taken advantage of by the very newest "960" displays (e.g. Samsung Clear Motion 960, EliteLcdHDTV.com, etc.). Alas, the statement is subject to potential misinterpretation. I've edited the article to focus the references more directly on the academic papers instead. Mdrejhon (talk) 22:19, 2 October 2012 (UTC)[reply]
I'd love an LCD TV (but NOT a computer monitor) that flashed each image for a small fraction of the frame duration. This is what a CRT effectively does, and it results in much less motion blur when the viewer's eyes are following a moving object. This is especially noticeable in video games that run at 60Hz. Since this trades increased eyestrain for reduced motion blur, it's not suitable for computer displays; for that, current LCDs are superior to CRTs. 66.90.216.158 (talk) 00:06, 5 December 2012 (UTC)[reply]
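
A minimal sketch of why short flashes reduce tracking blur, per the strobing discussion above: perceived smear scales with how long each frame stays lit (its persistence), not with the refresh rate alone. The speed and duty cycles are illustrative assumptions:

    # Tracking smear vs. display persistence: smear is roughly the
    # eye's tracking speed multiplied by the time each frame is lit.
    # Illustrative numbers only, not measurements of any product.

    def smear_px(eye_speed_px_per_s, refresh_hz, lit_fraction):
        persistence_s = lit_fraction / refresh_hz  # time frame is visible
        return eye_speed_px_per_s * persistence_s

    speed = 960.0  # px/s of tracked motion (arbitrary example)
    print(smear_px(speed, 60, 1.0))  # full sample-and-hold: 16.0 px
    print(smear_px(speed, 60, 0.5))  # 50% strobe: 8.0 px
    print(smear_px(speed, 60, 0.1))  # short CRT-like flash: 1.6 px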

List of causes is misleading[edit]

Why does the article include causes that are not causes of motion blur, or are not caused by characteristics of the display? Regarding "Resolution resampling", the article even says "not a motion blur", so what is it doing in an article about motion blur? Regarding "Lower camera shutter speeds", the article says that the blur is created during the filming process, not by the display. And are deinterlacing and compression artefacts unique to flat panel displays, as the article seems to suggest, since the introduction specifically refers to those kinds of displays only? — Preceding unsigned comment added by 89.150.29.106 (talk) 14:57, 31 August 2018 (UTC)[reply]

External links modified[edit]

Hello fellow Wikipedians,

I have just modified 8 external links on Display motion blur. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links or the page altogether, please visit this simple FAQ for additional information. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 18 January 2022).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 23:48, 13 December 2016 (UTC)[reply]