
Wikipedia:Reference desk/Archives/Computing/2019 September 30

Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is a transcluded archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


September 30

TV picture quality

I've been noticing this ever since the analog TV era: some programs (generally local TV programming and things like news) look more lifelike than others (including all American TV and movies). I suspect this has something to do with interlacing. I think the effect still exists on a PC screen, but it's less obvious.

Also, and this is something today's TV series avoid completely: when a lit TV screen appears in a "crappy"-quality TV series, it looks as lifelike as the good signal, even though it's usually showing another crappily made TV clip. What causes this? Why does a crappy picture within a crappy picture look lifelike?

Forgot to say: the good-looking content seems to have started around the '80s. Older local TV and movies have the same "style" as American TV.

— Preceding unsigned comment added by 95.168.120.11 (talk) 16:59, 30 September 2019 (UTC)[reply]

If American shows look crappy, and you are not in North America, then it may well be a conversion issue from one digital technology to another. See List of digital television deployments by country.
As for the TV screen displayed in a TV show or movie, they regularly fake those by splicing in a better image, to avoid the unwatchable effects created otherwise (wide white or black bands, etc.). The most obvious example of this is when a text computer screen was shown with something like 5 lines of text, far fewer than real computers of that era displayed. The analog-to-digital transition may have affected this, as now you have four combinations to worry about: analog on analog, analog on digital, digital on digital, and (rarely) digital on analog. In many other cases, rather than deal with this, they just arrange it so the screen isn't visible, as in All in the Family and That '70s Show, and the audience only hears the audio.
Also note that the smaller size of a screen within a show or movie means that the lower resolution of old programs may not be as obvious. For example, a 480i image looks like crap on a full 1080i screen, but if the portion of the 1080 screen used to display it is only 480 lines high, then it looks just as good as the rest of the 1080 screen. SinisterLefty (talk) 21:53, 30 September 2019 (UTC)[reply]
It has nothing to do with resolution or digital; I've seen this effect since the 90s at least, and American TV looked just as un-lifelike when I saw it in the States. Although some shows do look better than others: Oprah, for example, has overly muted colors, but the motion seems more lifelike. 93.138.159.71 (talk) 23:33, 30 September 2019 (UTC)[reply]
Care to tell us which nation's TV you are comparing with the US ? SinisterLefty (talk) 23:45, 30 September 2019 (UTC)[reply]
Half of Europe, really. Germany, Czechia and the Balkans have the "lifelike" shows, the UK and Italy have American-like ones, and France seems 50-50. It's kind of like the difference between playing a video game at 30 fps and 60 fps, but the difference is visible on the same TV sets and goes all the way back to the analog/VHS age, which makes me think it's something about the interlacing. 93.138.159.71 (talk) 23:52, 30 September 2019 (UTC)[reply]
OK, it sounds like you are talking about the frame rate. There are many reasons why that could be low. Interlaced video was specifically designed to double the apparent frame rate by refreshing half the lines at a time, but it can result in motion blur. Slow connections or processors can also result in dropped frames, as can conversion from one video format to another. Non-intuitively, converting between frame rates that are close to each other can be more jarring than converting between very different ones. For example, a 60 fps source displayed at 30 fps means dropping every other frame, which won't look too bad, and going the other way isn't too bad either: just repeat each frame twice or interpolate every other frame. But going from 31 fps to 30 means that once a second you get a noticeable jump. SinisterLefty (talk) 00:07, 1 October 2019 (UTC)[reply]
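To make the frame-dropping arithmetic above concrete, here is a rough Python sketch (purely illustrative; the function names are made up and no real player works exactly this way):

```python
# A rough, illustrative sketch (not from any real player) of dropping frames when
# converting between frame rates: each output frame simply shows the most recent
# source frame for its presentation time.

def source_frame_for(output_index, source_fps, target_fps):
    """Index of the source frame shown at a given output frame."""
    t = output_index / target_fps        # presentation time of this output frame
    return int(t * source_fps)           # most recent source frame at that time

def dropped_per_second(source_fps, target_fps):
    """How many source frames are never shown in one second of output."""
    shown = {source_frame_for(i, source_fps, target_fps) for i in range(target_fps)}
    return source_fps - len(shown)

print(dropped_per_second(60, 30))   # 30 -> every other frame, an even, barely noticeable cadence
print(dropped_per_second(31, 30))   # 1  -> a single skipped frame, i.e. one visible hitch per second
```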

Are you thinking of the difference between film (generally ~24p) and 50i or ~60i video? AFAIK, in the past in the US, most prime-time TV (and of course non-indie movies) was shot on film, generally at ~24 frames a second. It didn't matter that the prime target was ~60i NTSC broadcast; they used Three-two pull down to produce content for broadcast or video. (For DVDs they may have either done that or kept the original ~24p.) By comparison, daytime soap operas and other lower-budget TV were shot on video, generally with 60 interlaced fields per second. This meant the latter had smoother motion, which got associated with low-budget productions. For broadcast and video tapes in PAL regions, the ~24p content would mostly be sped up slightly to work with PAL. For genuine 60i content, well, it gets more complicated and perhaps doesn't matter much for this question.
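As an illustration of that cadence, here is a small Python sketch of three-two pull down (a simplification; real telecine works on the interlaced fields of actual video, and this helper function is hypothetical):

```python
# Illustrative sketch of the three-two ("2:3") pulldown cadence mentioned above:
# four ~24p film frames (A, B, C, D) are spread over ten interlaced fields,
# which is how 24 film frames per second fill the ~60 fields per second of NTSC.

def two_three_pulldown(frames):
    """Expand film frames into (frame, field) pairs using a 2-3-2-3... cadence."""
    fields = []
    for i, frame in enumerate(frames):
        copies = 2 if i % 2 == 0 else 3          # alternate: 2 fields, then 3 fields
        for _ in range(copies):
            parity = len(fields) % 2             # alternate top and bottom fields
            fields.append((frame, "top" if parity == 0 else "bottom"))
    return fields

print(two_three_pulldown(["A", "B", "C", "D"]))
# 10 fields from 4 frames, so 24 frames/s becomes 60 fields/s
```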

For markets outside the US, lower budgets and other factors meant video usage was more common for TV, depending on the circumstances [1]. Of course, for stuff intended for PAL and SECAM countries where 50i was the norm, I believe that if shooting on film for TV you'd generally choose 25p rather than 24p, something that often continues to this day [2].
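For the PAL speedup mentioned above, the arithmetic is simple enough to show directly (an illustrative calculation, assuming the audio is not pitch-corrected):

```python
import math

# Back-of-the-envelope arithmetic for the "sped up slightly to work with PAL"
# point above: ~24p film played back at 25 fps.

film_fps, pal_fps = 24.0, 25.0
speedup = pal_fps / film_fps                 # ~1.0417, i.e. about 4% faster
runtime_120min = 120.0 / speedup             # a 120-minute film runs ~115.2 minutes
pitch_semitones = 12 * math.log2(speedup)    # audio rises ~0.7 of a semitone if left uncorrected

print(f"speedup {speedup:.4f}x, runtime {runtime_120min:.1f} min, pitch +{pitch_semitones:.2f} semitones")
```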

Nowadays almost no one shoots on (analog) video, and film is also getting very uncommon even for theatrical movies [3]. But digital cameras can record in similar ways, although I think few bother to record at 60i or 50i; far more likely they will record at 50p or 60p, and they can often record even higher. However, the preferences and experiences of those involved in the production process, and other factors, mean that in the US most still aim for the 'cinematographic feel' of ~24p, and so that is how scripted shows are shot and edited. (BTW, to be clear, it isn't generally just the choice of frame rate, see e.g. [4] [5]. Also, I should clarify that sometimes scenes may be shot at higher rates, e.g. for slow-motion playback. If you're not so worried about 50p or 25p, 120 fps is a good choice given the options it gives. But those productions are still targeting a 24p final output and processing the footage accordingly, including for non-slow-motion scenes.) The norms elsewhere may vary, probably also following their own histories unless there was a reason for those to change.

With streaming services becoming increasingly popular, a lot of people are nowadays able to receive shows at ~24p if they want to. That said, broadcast TV isn't designed to cope with a stream that randomly changes parameters, so most content will be broadcast as digital 720p60 or 1080i60 in the US and 720p50 or 1080i50 in former PAL and SECAM countries. (1080p50 and 1080p60 were considered to require too much bandwidth at the time the broadcast standards generally used now were designed; this may change in the 4K era.) 720p50/60 is generally accepted as preferable for high-motion content.

There have been some attempts to move to high frame rates for films and other high-end scripted content, but they haven't always been well received; a notable example is The Hobbit (film series), which was only 48p.

Modern TVs, especially those with refresh rates greater than 60 Hz, may also by default use motion interpolation to smooth ~24p content, something often derided by producers and connoisseurs. Computers are able to do the same thing if you're using them for playback, e.g. with the Smooth Video Project [6] or sometimes even GPU driver options with certain players [7].
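For a sense of what motion interpolation does, here is a deliberately crude Python/NumPy sketch that blends neighbouring frames; real TVs and SVP use motion estimation rather than plain blending, so treat this only as an illustration:

```python
import numpy as np

# A crude sketch of motion smoothing: synthesising in-between frames by blending
# neighbours. Real TV interpolation and tools like SVP estimate motion vectors and
# warp along them; plain blending mostly just illustrates the idea (and shows the
# ghosting artifacts that motion estimation tries to avoid).

def blend(frame_a: np.ndarray, frame_b: np.ndarray, t: float) -> np.ndarray:
    """In-between frame at fraction t (0..1) between frame_a and frame_b."""
    mixed = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
    return mixed.astype(frame_a.dtype)

def double_frame_rate(frames):
    """Turn ~24p frames into roughly ~48p by inserting a midpoint between each pair."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(blend(a, b, 0.5))   # the synthetic in-between frame
    out.append(frames[-1])
    return out
```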

While these techniques will often introduce artifacts, my impression is that most of the criticism comes from the smoothing 'ruining' the 'vision' of the director and others involved in the production. See e.g. these links: Soap opera effect (currently a redirect to a specific heading of the earlier article) [8] [9] [10] [11] [12] [13]. For broadcast TV, it's also possible that the broadcaster could have chosen to use motion interpolation for their 720p50 broadcasts, for example. Just a note on my earlier point: the limitations of motion interpolation mean content recorded at 1080p50 is likely to be significantly better than 1080p24 content that has been interpolated.

A loosely related example may be the lack of film grain with digital recordings and its simulation, see e.g. [14]. Personally, while I don't care that much, I would prefer it if everyone would just start aiming for at least 50p rather than sticking with 24p. Also, while I've glossed over the issues caused by 2:3 pull down (or similar techniques), and the issues moving from 50i or 50p content to 60i or 60p (or vice versa), or 25p to 30p, these can have effects some will notice. Even the differences between the minor variations in frame rates (like true 24p vs 23.976p) are not always completely inconsequential. Television standards conversion isn't that great an article but may be an okay starting point if you're interested; I just don't believe it's that relevant to what you're observing, from what you've said.
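To see why the 24p vs 23.976p difference can matter, a quick calculation (23.976p being shorthand for 24000/1001 fps):

```python
from fractions import Fraction

# Why "true 24p vs 23.976p" isn't completely inconsequential: the NTSC-derived
# rate is 24000/1001 fps, so the two clocks slowly drift apart.

true_24 = Fraction(24, 1)
ntsc_24 = Fraction(24000, 1001)                    # = 23.976... fps

drift_per_hour = (true_24 - ntsc_24) * 3600        # frames of drift after one hour
print(float(drift_per_hour))                       # ~86.3 frames, roughly 3.6 seconds of 24p
```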

Another thing I've mostly glossed over (other than the motion-interpolation point and the broadcast stuff) is the connection between what you're playing back and your TV or monitor. While AFAIK most digital TVs in former PAL or SECAM territories have been able to handle the key frame rates with their refresh-rate options for quite a long time, if you just connect your computer to your TV, it's often going to stay at one refresh rate, probably either 1080p50 or 1080p60. If you have 24p content you either need to do the same sort of conversion discussed above or change your TV's refresh rate. But many players won't do the latter, not wanting to interfere with other things or risk issues, and they also may not handle the conversion as well as they could; in particular they often won't do a simple speedup for 1080p50. I think, but don't really know, that Blu-ray players and standalone devices (like Chromecasts, Rokus and Apple TVs) are generally better at changing the refresh rate where possible, although nowadays it shouldn't be that hard to set your computer to do the same, at least with the right video player. (For playback with your browser or the vendor's own app, it may get a little complicated.) I really have no idea how GPUs and monitors with adaptive refresh rates handle this.

In your case, it sounds like you would want to set up SVP or something similar to do motion interpolation, if your system can handle it. That said, the earlier factors may come into play. For 24p content it may be better to play it at 48p or 72p if your TV or monitor can handle it; if not, you may want to see whether you prefer 50p with a speedup or 60p. Using 50p without a speedup is more likely to be suboptimal. Again, for many TVs you can probably switch between them as your content demands if you set up your computer properly. (Look into ReClock if you want to be especially fussy.) For broadcast TV, check and see whether your TV has any motion-interpolation support that is disabled, although you're limited here by the fact that your broadcast is probably going to be 50p or 50i and your TV possibly not capable of much higher (i.e. your broadcaster has already increased the frame rate with a limited form of frame interpolation). Note that if your TV does support motion interpolation, especially if its refresh rate is higher than the maximum input it accepts (e.g. it can do 100/120 Hz but only accepts 60 Hz), then you may want to simply set your computer to output at 24p etc. as needed and rely on your TV's motion interpolation.
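As a rough way to see why 48p or 72p suits 24p content better than 60p or 50p, here is an illustrative Python sketch of how many display refreshes each frame would be held for (a hypothetical helper, not how any actual player or TV decides its cadence):

```python
# If the display refresh rate isn't an integer multiple of the content frame rate,
# some frames are held for more refreshes than others, which shows up as judder.

def refreshes_per_frame(content_fps, refresh_hz, n_frames=12):
    """How many display refreshes each of the first n_frames is held for."""
    counts = []
    for i in range(n_frames):
        start = int(i * refresh_hz / content_fps)
        end = int((i + 1) * refresh_hz / content_fps)
        counts.append(end - start)
    return counts

print(refreshes_per_frame(24, 72))   # [3, 3, 3, ...]  every frame held 3 refreshes: even cadence
print(refreshes_per_frame(24, 60))   # [2, 3, 2, 3, ...]  the familiar 3:2-style judder
print(refreshes_per_frame(24, 50))   # mostly 2s with an occasional 3, unless the film is sped up to 25p
```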

Nil Einne (talk) 16:25, 5 October 2019 (UTC)[reply]