
Wikipedia:Reference desk/Archives/Computing/2015 December 5

Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


December 5


Are those 4K TVs or ultra HD TVs any good?


Would a 4K TV or ultra HD TV have a higher pixel density than TV broadcasts, DVDs, or even Blu-ray movies? If the dpi bottleneck is at the stored image, and not at the display, would it look better than full HD? --Scicurious (talk) 19:20, 5 December 2015 (UTC)[reply]

Obviously the TV isn't going to give you more visual information than is in the signal. One nice thing about them, though, is that they can double as very nice computer monitors. I'm using a 32" HP ENVY Quad-HD, and that way I can compute from my couch without eyestrain and with plenty of screen real estate. I paid around $400 for it almost a year ago; I think by now there are similar 4K monitors that are also affordable.
If you go that way, though, there are a couple of things to look out for. Make sure it accepts a DisplayPort connection (otherwise it's just a TV and may not work well as a monitor).
And whatever you do, don't get a "smart" TV, whether you want to use it as a monitor or not. We have to kill that thing; it's like inviting the manufacturer to keep controlling the device even after you've paid for it. --Trovatore (talk) 19:48, 5 December 2015 (UTC)[reply]
Well... from a purist point of view, a device can display more information than is in the signal - by adding new information, based on some heuristic model! Many modern televisions render more pixels than the source image, and nowadays, in 2015, it's very common for televisions to provide frame-rate doubling and similar technologies. This lets a television display at 4K, or at 120 frames per second, even if the source data does not contain video at that resolution or frame rate. The exact methods for upscaling or frame interpolation vary; we have a section on standard methods in common use. The output on screen can therefore have more frames, more pixels, more "bits", more "entropy" than the input. Whether this gives the output image higher visual quality is an altogether different question. I personally find TV frame-rate doublers very annoying and visually distracting - even the very "good" ones with fancy interpolation algorithms. But good upscaling algorithms can make a huge impact: most viewers probably can't tell the difference between a native-resolution image and one upscaled from 4x or even 16x undersampled data (using modern technology).
In the most trivial case, new pixels can be added by upscaling and interpolating. New frames can be added by the same general method - interpolating in the time domain instead of the spatial domain. If you study modern digital signal processing, you will see that exactly these kinds of algorithms are among the most heavily researched topics - especially as they apply to specific use cases like "4K video." A few years ago, IEEE published A Tour of Modern Image Filtering, which reviewed many of the advances of the preceding decade. We have come a long way from the simple sinc filter; various techniques have blossomed as computers have become faster and more parallelizable, as analytical and subjective measures of quality have been studied, and as commercial applications have proliferated.
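(For illustration only, a minimal Python/NumPy sketch of the two ideas above - spatial interpolation to add pixels and temporal interpolation to add frames. The function names are made up for the example, and real televisions use far more sophisticated, usually motion-compensated, filters.)

```python
import numpy as np

def upscale_nearest(frame, factor=2):
    # Spatial interpolation at its crudest: repeat each pixel
    # `factor` times in both directions (nearest-neighbour upscaling).
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

def interpolate_frames(frame_a, frame_b, t=0.5):
    # Temporal interpolation at its crudest: a weighted blend of two
    # consecutive frames. Motion-compensated interpolators do far better,
    # but the principle - synthesising data that was never in the signal -
    # is the same.
    return (1.0 - t) * frame_a + t * frame_b

# A tiny 2x2 "HD" frame upscaled to 4x4, plus a synthetic in-between frame.
frame1 = np.array([[0.0, 0.5], [0.5, 1.0]])
frame2 = np.array([[0.1, 0.6], [0.6, 0.9]])
print(upscale_nearest(frame1, 2))
print(interpolate_frames(frame1, frame2))
```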
Nimur (talk) 21:17, 5 December 2015 (UTC)[reply]
I do in fact know a little bit about digital signal processing, and yes, my answer was incomplete. Still, you aren't going to get a 4K experience from an HD signal; I'm reasonably confident about that. --Trovatore (talk) 21:21, 5 December 2015 (UTC)[reply]
Well, I won't get a 4K experience by upscaling HD source data; perhaps you won't either... but we are outliers! A very large percentage of the population would experience these videos in the same way - because many people can't even tell HD from SD! You can find lots of statistics to back this up - something like 20% of the population will believe a display is showing HD content, even if the hardware is a standard-definition unit. (Shoot - 40% of the population has myopia and can't focus on the TV without using corrective lenses!) In other words, the quality delta between SD and HD is below the just-noticeable difference for a large segment of the bell-curve of consumers; between HD and 4K, the quality delta is imperceptible to even more users. Nimur (talk) 21:42, 5 December 2015 (UTC)[reply]
OK, you may be right. So I'll fall back to quibbling that my original claim about information is still correct, unless the TV does something non-deterministic (and I doubt that there's any useful application for non-determinism), or unless it actually consumes pre-loaded entropy (which, again, seems unlikely to be useful). --Trovatore (talk) 21:47, 5 December 2015 (UTC)[reply]
Yes, I think you are still correct from a pure information-theory standpoint. Information (and entropy) doesn't come from nothing!
As far as "usefulness" goes - one person's noise is another person's signal! Some algorithms intentionally inject noise into the output - e.g., shaped noise or noise filling - because this can improve subjective perceptions of image or audio quality. In fact, using shaped white noise as the primary input in digital signal synthesis - whether audio or video - is one way to create more psychologically believable output signals. Go figure! The noise usually comes from a pseudorandom noise generator, so it is still deterministic, but it doesn't have to be.
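(As a toy illustration of deliberately injected pseudorandom noise - not the noise-filling scheme of any particular codec - here is dithered quantization in Python. The flat dither used here is the simplest case rather than genuinely "shaped" noise, but it shows the idea: visible banding is traded for benign low-level noise.)

```python
import numpy as np

rng = np.random.default_rng(0)  # pseudorandom, so deterministic for a fixed seed

def quantize(signal, levels=16):
    # Quantize a signal in [0, 1] to `levels` discrete steps.
    return np.round(signal * (levels - 1)) / (levels - 1)

def quantize_dithered(signal, levels=16):
    # Add about one LSB of flat (rectangular) noise before quantizing;
    # the staircase banding of a smooth gradient becomes low-level noise.
    dither = (rng.random(signal.shape) - 0.5) / (levels - 1)
    return quantize(np.clip(signal + dither, 0.0, 1.0), levels)

ramp = np.linspace(0.0, 1.0, 1000)     # a smooth gradient
plain = quantize(ramp)                 # shows staircase "banding"
dithered = quantize_dithered(ramp)     # banding replaced by noise
print("mean |error|, plain quantization:   ", np.abs(plain - ramp).mean())
print("mean |error|, dithered quantization:", np.abs(dithered - ramp).mean())
```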
Here's a nice review paper on one particular image processing algorithm, Total Variation Regularization (surely there are thousands of other great papers on this topic, but this one happens to have lots of nice image examples). Personally, I think Lena looks better before de-noising - but her noise is Gaussian. The subjective trade-offs become very different if the noise spectrum is less random.
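(For the curious, a toy gradient-descent version of total-variation denoising in Python, minimising a data-fidelity term plus a smoothed TV penalty. This is not the algorithm from the linked paper, just the general idea; production implementations in image-processing libraries are considerably more refined.)

```python
import numpy as np

def tv_denoise(noisy, lam=0.1, step=0.1, iters=200, eps=1e-6):
    # Minimise 0.5*||u - noisy||^2 + lam * sum(sqrt(|grad u|^2 + eps))
    # by plain gradient descent - a toy form of TV regularisation.
    u = noisy.copy()
    for _ in range(iters):
        ux = np.diff(u, axis=1, append=u[:, -1:])   # forward differences
        uy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(ux**2 + uy**2 + eps)
        px, py = ux / mag, uy / mag
        # negative divergence of (px, py) gives the gradient of the TV term
        div = (np.diff(px, axis=1, prepend=px[:, :1])
               + np.diff(py, axis=0, prepend=py[:1, :]))
        u -= step * ((u - noisy) - lam * div)
    return u

rng = np.random.default_rng(1)
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 1.0                                 # a simple bright square
noisy = clean + 0.2 * rng.standard_normal(clean.shape)    # add Gaussian noise
denoised = tv_denoise(noisy)
print("mean |error| before:", np.abs(noisy - clean).mean())
print("mean |error| after: ", np.abs(denoised - clean).mean())
```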
The long and short of it is, better-quality televisions may have better algorithms - which can mean fewer visible block artifacts, less color inaccuracy, and so on.
Nimur (talk) 02:09, 6 December 2015 (UTC)[reply]
Try reading the fine print on US TV ads for medication. I can currently read it at 1080p, but some of the companies seem to have gotten wise to that and made the fine print even smaller, so I can't read it even then. (You also need the ability to freeze the screen, which on my TV doesn't work in 1080p mode ... I wonder if big pharma bribed them to make that happen.) StuRat (talk) 21:57, 5 December 2015 (UTC)[reply]
And just because you can hook a PC up to it doesn't mean you will get 4K resolution. The PC has to support it, the cables and connectors have to support it, and the TV has to support it. I've been burned by TVs with a high native resolution that only support a lower resolution when used as a monitor. StuRat (talk) 20:18, 5 December 2015 (UTC)[reply]
I would suggest waiting until more 4K content is available. Hopefully the price of 4K TVs will also be significantly lower by then. Also, if the 4K makers were smart, they would allow you to watch four 1080p shows at once (a 2x2 grid of 1920x1080 exactly fills a 3840x2160 screen), and just turn the sound on for one at a time. Much better than picture-in-picture. StuRat (talk) 20:18, 5 December 2015 (UTC)[reply]
I guess I'll point out that unless you sit unusually close to your TV and/or have unusually good vision, 4K won't look better than HD even with 4K source material (Optimum HDTV viewing distance#Human visual system limitation). -- BenRG (talk) 21:33, 5 December 2015 (UTC)[reply]
Well, it depends on how big your TV is. Dja1979 (talk) 15:46, 6 December 2015 (UTC)[reply]
First, when you say "DPI" (dots per inch) you really mean "PPI" (pixels per inch). DPI is used to describe the quality of prints.
Pixel density (PPI) is a function of the TV hardware. For example, a 30-inch 1080p TV has a higher pixel density than a 40-inch 1080p TV simply because it crams the same number of pixels into a smaller screen. When you're talking about the video content, terms like 1080p and 720p simply refer to the size of the video frame. For example, a 1080p video frame measures 1080 pixels tall by 1920 pixels across.
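(To put numbers on that, a quick Python check - PPI is just the diagonal resolution in pixels divided by the diagonal in inches:)

```python
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch: diagonal pixel count divided by diagonal size.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 30)))   # ~73 PPI: 30-inch 1080p screen
print(round(ppi(1920, 1080, 40)))   # ~55 PPI: 40-inch 1080p screen
print(round(ppi(3840, 2160, 40)))   # ~110 PPI: 40-inch 4K (2160p) screen
```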
An HD TV will automatically up-sample any SD broadcast or DVD content so it fills the screen. However, it won't look any better, because the TV is essentially spreading the same information across more pixels rather than adding detail to the picture. In addition, SD content is typically 480i with a 4:3 aspect ratio, which is much closer to square than the 16:9 screen of a 4K TV. So SD content won't fill the entire screen; because the picture isn't wide enough, it will be shown with black bars on either side (pillarboxing).
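(Putting numbers on the pillarboxing, assuming the TV fits 4:3 content to the screen height without stretching it:)

```python
def pillarbox(screen_w, screen_h, content_aspect):
    # Width of the scaled picture and of each black bar when content is
    # fitted to the screen height with its aspect ratio preserved.
    picture_w = round(screen_h * content_aspect)
    bar_w = (screen_w - picture_w) // 2
    return picture_w, bar_w

# 4:3 SD content on a 3840x2160 (16:9) panel
print(pillarbox(3840, 2160, 4 / 3))   # (2880, 480): 480-pixel bars on each side
```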
Broadcast HD content is typically 1080i or 720p. Until recently, Blu-ray content was limited to 1080p. Recently, some "Ultra HD Blu-ray" content has been released, which is 3840 by 2160. So, if you had a 4K TV, a new Ultra HD Blu-ray player, and an Ultra HD Blu-ray disc, you would be (mostly) taking full advantage of the TV. Most content online still tops out at 1080p, though.
In the future, there will be a lot of UHD content online, on disc, and on TV. However, you'll need a large TV to notice the difference. Or, if you sit really close to a smaller UHD TV, you might be able to tell, too. This chart shows the relationship between size, pixel dimensions, and distance: http://i.rtings.com/images/optimal-viewing-distance-television-graph-size.png.—Best Dog Ever (talk) 04:03, 6 December 2015 (UTC)[reply]
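(The shape of that chart follows from simple trigonometry: with roughly one arcminute of acuity - about 20/20 vision - a pixel stops being resolvable once it subtends less than 1/60 of a degree. A rough Python sketch, assuming 16:9 panels; the exact cut-offs vary with individual eyesight:)

```python
import math

def resolvable_distance_in(diagonal_in, width_px, arcmin=1.0):
    # Rough distance (inches) beyond which an eye with `arcmin` arcminutes
    # of acuity can no longer resolve individual pixels on a 16:9 panel.
    screen_w_in = diagonal_in * 16 / math.hypot(16, 9)    # panel width from diagonal
    pixel_in = screen_w_in / width_px                     # width of one pixel
    return pixel_in / math.tan(math.radians(arcmin / 60.0))

for diag in (40, 55, 65, 100):
    hd = resolvable_distance_in(diag, 1920) / 12.0        # inches -> feet
    uhd = resolvable_distance_in(diag, 3840) / 12.0
    print(f'{diag}" TV: 1080p pixels blur together beyond ~{hd:.1f} ft, '
          f'4K pixels beyond ~{uhd:.1f} ft')
```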
They seem to have missed something in that chart. If you have a 100-inch TV, you can't view it from two feet away. Well, you could view a small portion of the screen from that distance, but the rest would be much farther away. If it were curved into a hemisphere you could get your head a bit closer to the rest of the TV. Still, it would be intensely annoying to be so close to such a large TV. StuRat (talk) 17:04, 6 December 2015 (UTC)[reply]
The chart recommends placing a 100-inch TV at least six feet away if it's an ultra-HD model.—Best Dog Ever (talk) 23:31, 6 December 2015 (UTC)[reply]
But it also says you can buy a 100-inch TV in an even higher resolution than 4K and place it as close as you want. :-) StuRat (talk) 00:41, 7 December 2015 (UTC)[reply]
Another question is whether more resolution is really better, even assuming that you can see the difference. With 1080p I now see lots of beard stubble, skin blemishes, wrinkles, etc., that I didn't see before, but does that make my viewing experience better or worse? At 4K, I will probably be able to see every pore on their faces. Is this an improvement? The one place I do appreciate the higher resolution is when trying to read tiny text, as you sometimes find in the credits of a movie. StuRat (talk) 17:08, 6 December 2015 (UTC)[reply]
It depends on the type of content. If you watch sports, for example, you want as much detail as you can get. With higher detail, you can see whether a player really did foul another player, whether the ball really crossed the line, etc.
Also, I forgot to mention that ultra HD TV supports more colors than current TV standards. Current content has eight bits per channel, but UHD content has ten bits per channel. Different people can see different amounts of color, but for many, colors in UHD content will look more realistic.—Best Dog Ever (talk) 23:31, 6 December 2015 (UTC)[reply]
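(The colour-depth difference is easy to quantify - this ignores the wider colour gamut that usually accompanies UHD content:)

```python
def colours(bits_per_channel, channels=3):
    # Distinct code values per channel, and total combinations across channels.
    levels = 2 ** bits_per_channel
    return levels, levels ** channels

print(colours(8))    # (256, 16777216)    -> about 16.7 million colours
print(colours(10))   # (1024, 1073741824) -> about 1.07 billion colours
```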
For sports, I'm sure they will show a zoom replay that will make it clear. StuRat (talk) 00:43, 7 December 2015 (UTC)[reply]