Can We See 4K/UHD On A Normal Sized Screen? You Betcha!


I have read many articles claiming that quadrupling the pixel count of 1080p with 4K is a subtle improvement, or one that requires a very large screen to see. In the course of my work I see a considerable amount of 4K/UHD source material and display product, and it is obvious to me that doubling the resolution in both the vertical and horizontal directions, done properly, is a big improvement. This article addresses the benefits of additional pixels above 1080p with sample photos, experimental results and theoretical analysis.

Resolution Experiment

Let’s first look at a series of experiments by NHK that compared a plaster bust, a model ship and butterflies to their images on a display.  The results can be found in ITE Technical Report Vol. 35, No. 16, and are summarized in the chart above.  NHK concluded that 310 pixels/degree are needed for an image to reach the limit of human resolution.  If this is true, we should benefit from 4K and even higher resolutions without changing the viewing angles used for comfort reasons.  At the THX recommended 36 degree viewing angle, reaching the 310 pixels/degree limit NHK observed corresponds to roughly an 11K display.  1080p delivers only about 53 pixels/degree, far short of what the tests indicate we can see at THX recommended viewing angles.  My own observation of properly performing 4K displays fed quality 4K sources is that 4K is almost as big an upgrade over 1080p as the move from 480p to 1080p was.
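These figures are easy to check. The sketch below assumes pixels are spread evenly across the viewing angle, a flat-screen simplification, but it matches the numbers quoted above:

```python
THX_ANGLE_DEG = 36  # THX recommended horizontal viewing angle

def pixels_per_degree(horizontal_pixels, angle_deg=THX_ANGLE_DEG):
    # Flat-screen simplification: pixels spread evenly across the angle
    return horizontal_pixels / angle_deg

print(round(pixels_per_degree(1920)))  # 1080p: ~53 pixels/degree
print(round(pixels_per_degree(3840)))  # 4K/UHD: ~107 pixels/degree

# Horizontal pixel count needed to reach NHK's 310 pixels/degree at 36 degrees:
print(310 * THX_ANGLE_DEG)             # 11160, i.e. roughly an "11K" display
```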

Assuming these tests are correct, 4K is useful out to about a 12 degree viewing angle.  That corresponds to about 17 feet for a 50 inch diagonal and 21 feet for a 60 inch diagonal display, far beyond the distances many people suggest for 4K.  By the same criterion, 1080p is limited to about a 6 degree viewing angle, which doubles the distances given above.
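For those who want to verify the geometry, the viewing angles convert to distances as follows (an illustrative sketch assuming a 16:9 screen; the helper function is my own, not from any library):

```python
import math

def max_distance_ft(diagonal_in, angle_deg):
    """Distance (feet) at which a 16:9 screen of the given diagonal (inches)
    subtends the given horizontal viewing angle."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # 16:9 screen width
    distance_in = (width_in / 2) / math.tan(math.radians(angle_deg / 2))
    return distance_in / 12

# 12 degree viewing angle (the 4K limit under the NHK criterion):
print(round(max_distance_ft(50, 12), 1))  # ~17.3 ft for a 50" display
print(round(max_distance_ft(60, 12), 1))  # ~20.7 ft for a 60" display
# 6 degree angle (the 1080p limit) roughly doubles the distance:
print(round(max_distance_ft(50, 6), 1))   # ~34.6 ft for a 50" display
```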

ARRI Film Scanner Results


The ARRI film scanner results above compare 2K, 4K and 10K film scanners.  These scans are magnified from a section of 35mm film.  I have heard in my work with the film industry that 12K is what is needed to replicate the best 35mm film.  It is interesting that the NHK data indicated 11K is needed to hit the limit of human vision for these applications.

ARRI Film Scan Segment


The image above is a segment of 35mm film showing the difference between 2K and 4K. I have also seen “show prints” in Los Angeles, which are prints made directly from the masters rather than several generations of copies removed.  A 35mm show print in pristine condition is an amazingly detailed, high contrast image that still exceeds anything I have seen from 4K, though 4K is getting much closer.  Unfortunately, this medium is extinct with the loss of film in theaters.

How can so many people be so wrong about 4K?  I think it boils down to two false assumptions: first, that human vision is limited to 20/20, and second, that digital media can capture all of the resolution of the imager.  Both are false. Here is a quote from an expert on human vision, August Colenbrander, M.D. (Smith-Kettlewell Eye Research Institute and California Pacific Medical Center): “…emphasizes that, contrary to popular belief, 20/20 is not actually normal or average, let alone perfect, acuity. Snellen, he says, established it as a reference standard. Normal acuity in healthy adults is one or two lines better. Average acuity in a population sample does not drop to the 20/20 level until age 60 or 70. This explains the existence of the two lines smaller than 20/20: 20/15 and 20/10.”

Human Vision

The chart above shows human visual acuity (corrected if necessary) versus age from several different experiments.  It is obvious from the results that 20/20 is not the best most people can do.  The data is not as simple as the chart suggests, as these are trends among scattered results, but it is clear that the average for many people is closer to 20/15.  This clearly challenges the assumption that 20/20 is the limit for the typical person.  In my own case, my vision was about 20/10 when I was younger and has degraded to 20/15; I am still better than 20/20 at 55.

The other important thing to realize is that one of the primary benefits of 4K is that it reduces the image artifacts and softening caused by digitally sampling the analog world.  The governing theory was proposed in 1928 by Nyquist.  The theorem, simply stated, says that when you sample a signal digitally you must sample at a minimum of twice the highest frequency you want to preserve.  This limit is why CDs are sampled at 44.1 kHz in order to pass 20 kHz, the common limit of human hearing.  4K has the bandwidth to allow a camera or film scanner to deliver a true 1080p worth of resolution to your 4K display if everything is done properly.  This digital sampling problem is the reason an animated movie from Pixar looks so much sharper on Blu-ray than the best camera image, which must be filtered to satisfy Nyquist and avoid severe artifacts.  Many 1080p cameras do not conform to Nyquist and have insufficient filtering, which produces many image artifacts.  Computer animation is not subject to the analog sampling problem and need not show sampling artifacts at 1080p, because the image is rendered at 1080p rather than sampled from the analog world.
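The aliasing problem is easy to demonstrate numerically. The sketch below is a simplified illustration, not a model of any real camera: a tone above the Nyquist limit produces exactly the same sample values as a legitimate lower-frequency tone, which is why the signal must be filtered before sampling.

```python
import math

fs = 44100             # CD sampling rate, Hz
f_high = 30000         # tone above the 22050 Hz Nyquist limit
f_alias = fs - f_high  # 14100 Hz: the tone the samples actually describe

# The two tones are indistinguishable once sampled:
for n in range(8):
    s_high  = math.cos(2 * math.pi * f_high  * n / fs)
    s_alias = math.cos(2 * math.pi * f_alias * n / fs)
    assert math.isclose(s_high, s_alias, abs_tol=1e-9)
print("30 kHz aliases to 14.1 kHz when sampled at 44.1 kHz")
```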

Taking these two pieces of information into account, one can calculate the number of pixels per degree required for a person with 20/15 vision, which is relatively common.  20/15 vision corresponds to 0.75 arc minute of resolving power, and a minimum of 2 pixels per resolvable arc minute is required to meet the Nyquist rule for digital sampling.

Pixels/Degree = (60 arc minutes/ degree)(2 pixels/0.75 arc minutes) = 160 pixels/degree  

(based on 20/15 vision & Nyquist sampling requirement)

The NHK data shown above indicates that more than 160 pixels/degree is still a benefit, but 160 pixels/degree is where the chart takes a definite turn in slope, indicating more subtle improvement beyond that point.  If you want to estimate where 4K is worth it based on this limit, halve the distances above: a 50″ 4K display is useful to about 8.6 feet, a 60″ to 10.2 feet, and a 50″ 1080p display to about 17.2 feet.  There are several likely reasons the NHK experiment required more than 160 pixels/degree to hit the limit of human vision, including that people’s vision can be better than 20/15 and that less than optimal filtering of the digital sampling increases the number of pixels required to avoid degradation of the image.  I would use 160 pixels/degree to set the maximum viewing distance for 4K at this time.
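The 160 pixels/degree criterion converts to distances as follows (an illustrative sketch; the small differences from the figures above come from rounding):

```python
import math

PPD = 160  # pixels/degree criterion (20/15 vision + Nyquist, derived above)

def cutoff_distance_ft(diagonal_in, horizontal_pixels, ppd=PPD):
    """Farthest distance (feet) at which a 16:9 display of the given
    diagonal still delivers `ppd` pixels per degree."""
    angle_deg = horizontal_pixels / ppd  # viewing angle that yields `ppd`
    width_in = diagonal_in * 16 / math.hypot(16, 9)
    return (width_in / 2) / math.tan(math.radians(angle_deg / 2)) / 12

print(round(cutoff_distance_ft(50, 3840), 1))  # ~8.5 ft (50" 4K)
print(round(cutoff_distance_ft(60, 3840), 1))  # ~10.3 ft (60" 4K)
print(round(cutoff_distance_ft(50, 1920), 1))  # ~17.3 ft (50" 1080p)
```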

I believe these calculations, along with the samples and test results shown, are strong evidence that screen resolutions greater than 1080p can offer significant improvements in picture quality at distances found in many applications.  Real world experience will depend on source quality, display quality and system setup.  As we push to higher resolutions, more careful attention to detail will be needed to obtain all of the quality improvement available.

Link to a display resolution calculator based on your vision, display size and distance to the screen: https://www.homecinemaguru.com/display-resolution-calculator/.
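For readers who want to run the numbers themselves, here is a minimal sketch of the kind of calculation such a calculator performs (the helper functions are my own illustration, not the actual tool behind the link):

```python
import math

def needed_ppd(acuity_arcmin=0.75):  # 0.75 arc minute = 20/15 vision
    """Pixels/degree a viewer can use: 2 pixels per resolvable arc minute."""
    return 60 * 2 / acuity_arcmin    # 160 for 20/15

def display_ppd(diagonal_in, horizontal_pixels, distance_ft):
    """Pixels/degree a 16:9 display actually delivers at a given distance."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)
    angle_deg = 2 * math.degrees(math.atan(width_in / 2 / (distance_ft * 12)))
    return horizontal_pixels / angle_deg

# A 50" 4K display viewed from 9 feet by a 20/15 viewer:
print(round(needed_ppd()))              # 160 pixels/degree required
print(round(display_ppd(50, 3840, 9)))  # ~168 delivered: just past the cutoff
```

If the delivered figure exceeds the required figure, the viewer is far enough away that extra resolution no longer helps; closer in, more pixels still pay off.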

 

W. Jeff Meier

ISF and THX Certified Home Theater Consultant

10 Comments
Dan

Thank you for the great information.

Interestingly, scientists who study the human visual system report that the retina can detect a single photon, and that the visual cortex will generate a conscious visual impression when signals from 5-9 photons arrive within about a 100 ms window. There’s a lot more to vision than resolving power. Most people are stunned when they experience 4K for the first time. Is that attributable to individual pixels not being consciously resolved at a particular distance? I doubt it.

W. Jeff Meier

As we get higher quality 4K sources that offer the full precision available in the specification (10 bit, 4:4:4) the superiority of 4K over HD is even more obvious.


Bullcrap

What a load of horse manure. The author is clearly not a scientist. Most of his arguments rest on the idea that “most” people have better than 20/20 vision. The 4K resolving charts floating around the Internet are, of course, based on 20/20 vision, which is the established average/normal vision, not 20/10 or 20/15. My own uncorrected vision hasn’t been 20/20 since 1st grade, and given the sheer number of people I see wearing glasses and contacts (let alone those who have had surgery to correct their vision), I find that chart above suspect at best. However, it IS true that people with better than 20/20 vision will see the details of 4K at distances further out than people with 20/20 (no different than someone with 20/60 seeing more detail with corrective glasses on than off). THAT is what accounts for some of the distances mentioned, but it is *completely dependent on the individual’s eyesight*; your eyesight changes over the course of the day, and how much improvement you might see on a small 50″ screen is debatable even with 20/10 vision (it won’t be the full 4K at 18 feet, that is for certain; being able to discern a difference and being able to see every detail are two different things).

At least there is some truth in that part. The real BS factor comes in with the Nyquist theorem and the comparison to digital audio. The Nyquist theorem is about the sampling frequency on the CAPTURE end, not the output (here in pixel counts, not frequencies). Let’s start by stating what Nyquist actually IS rather than quoting mumbo-jumbo black magic to make people think you know what you’re talking about. Nyquist states that to digitally sample a waveform, you need to sample at 2x the highest frequency you are trying to capture in order to preserve all the data needed to recreate that waveform. If you try to capture information outside that spectrum, you get what is called ALIASING DISTORTION, and this is not a trivial thing to have in your playback signal (avoid it at ALL costs is how I’d simplify it). It will sound like GARBAGE in audio, and in video you’d have artifacts that are ONE HELL of a lot worse than some tiny distortion around a pixel. You’d damn well KNOW IT if you had aliasing distortion present, because you would have FALSE INFORMATION present in the signal.

This is because aliasing distortion is caused by a waveform outside the sampling range being folded back by the sampling limit: a frequency above the Nyquist limit (1/2 your sampling rate) reappears as a false lower frequency equal to the sampling rate minus the original frequency. At a 45kHz sampling rate, for example, a 32.5kHz wave folds back to a 12.5kHz wave, audible as a high frequency SINE WAVE from a waveform that shouldn’t even be audible to human ears in the first place (see https://www.youtube.com/watch?v=1HxLBEGzuWE for a demonstration of audible aliasing distortion; notice how the audible frequency shifts in pitch as the sampling rate is changed with the same input). That pitch would not be there if proper filtering were applied within the sampling rate constraints.
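The folding described here can be computed directly. Under ideal sampling with no anti-alias filter, a tone between the Nyquist limit (fs/2) and fs reflects down to fs - f:

```python
def alias_frequency(f, fs):
    """False frequency produced by ideally sampling a tone of frequency f
    at sample rate fs, with no anti-alias filtering."""
    f = f % fs                        # fold into one sampling period
    return fs - f if f > fs / 2 else f

print(alias_frequency(32_500, 45_000))  # 12500: the audible false tone
print(alias_frequency(10_000, 45_000))  # 10000: below Nyquist, unchanged
```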

That brings us to filtering. Many in audio know of the so-called “brick wall” filtering that supposedly makes CD audio sound so inferior to analog, according to the analog zealots (and I have a high-end LP rig at home, so this is by no means some universal belief among those owning quality turntable rigs). It’s true that a brick wall filter used BY ITSELF will affect frequencies below it, because a filter of a given order is a CURVE, not a straight line, mathematically speaking. If that were all that was actually used in practice in the real world, the vinyl zealots would be 100% correct that CD audio has a fatal flaw unless you sample out to some crazy 192kHz or whatever to avoid the problem.

Here’s the kicker, and why the article above is wrong about digital sampling of video. Filtering is a lot more advanced than simple brick wall high order cut-offs. I’m not going to go into too many specifics here, but it’s safe to say that if you look up OVERSAMPLING you’re going to find that you can avoid all the artifacting this article claims happens to video signals by capturing information at a much higher sampling rate on the RECORDING END, *THEN* brick-wall filtering out the frequencies at a much higher frequency and then restoring the signal. Your “curve” dip “ruining” the higher frequencies is suddenly a straight line for all intents and purposes. The best part is you don’t need 192kHz recordings on playback (or even to store them at that rate when sampling). Your ears can only hear up to about 20kHz to begin with (I’ve never heard of anyone hearing beyond maybe 21kHz, and there is no musical information up there you’d want to hear anyway). The reason so many CD recordings sounded bad (and some still sound bad) has 100% to do with two things: the original recording (you can’t fix garbage ‘sound’ no matter how well recorded it is) and ***MASTERING***. Mastering engineers purposely DESTROY things like dynamic range with “compressors,” whether digital or analog in nature, to make things sound LOUDER than they actually were, because LOUDER *SELLS*. People who know the difference hear it for what it is. A lot of older analog compressors generated their own artifacts, and CDs let engineers crank them up that much more without overloading things like the cutting head on analog record equipment, making CDs louder than the music was ever made on LP (and hence why many LPs sound less compressed and more natural than their CD counterparts).

In other words, the REAL problem with CD Audio has NOTHING to do with the format itself or some sampling error thing as many people mistakenly “believe” (the ignorance of digital sampling out there is staggering), but rather with how people in the industry USE IT. This revelation alone should make you already question what this has to do with the claims made in this article here.

It basically comes down to this simple FACT. If a given movie has a resolution limitation compared to, say, a cartoon by Pixar that is “rendered” at a given resolution, the problem is on the RECORDING END, not the playback end! Just as proper filtering removes aliasing distortion in digital audio and results in a near perfect straight line out to the sampling limits (because oversampling solves the ‘curve’ distortion issue of brick wall filtering), so does oversampling solve the problem of capturing video information that might be affected in a similar manner. So even if you capture a movie at 12K on the input side, you can easily scale that down with proper filtering in place to 4K or 1080p, and it will be as sharp as it can possibly be given the original signal limitations. And THAT brings us to all kinds of other factors involved during the capture of a movie, including the lenses used on the camera, the lighting conditions and thus the aperture settings used with the lenses, the sensors used to capture the information, the digital processing used, etc.

Just as garbage audio recorded digitally still sounds like garbage when played back (garbage in = garbage out), so will an old 35mm movie that has deteriorated and is grainy as hell look bad if you just sample it and put it out (it will be grainy and show mold or anything else that might be on the film reel, and that includes old cartoons), which is why movies are “REMASTERED” with digital cleanup, processing, etc. Looking less grainy doesn’t = sharper, either. It may appear that way to your eye (grain looks bad), but often details are being obscured (less sharp). You can see a similar effect by turning up the “sharpness” setting on a TV/projector: while it may appear to sharpen (edges are more noticeable), there are bloom effects obscuring details, not revealing them, which is why anyone who does home theater calibrations will tell you the sharpness needs to be at ZERO on a television set PERIOD.

In short, 1080p reveals what the signal it’s sent has in it. If the signal it gets is compromised (a cable network resamples and sends out the signal at a low, low bit-rate), you will get artifacts from the low bit-rate being used. This is one reason why Blu-ray looks so much better than most cable channels showing the same program. It has far less compression, but even Blu-ray is still compressed (this is more akin to AAC/MP3 vs. the uncompressed audio on a CD or SACD than to sampling constraints). A lot of the 4K out there right now (e.g. Netflix) is SO compressed that in some cases you are getting more artifacts than any improvement in resolution you might be hoping for, raising the question: even when you are sitting close enough to see the full 4K, will it actually LOOK “better” or just DIFFERENT? How sensitive are you to digital artifacts? If the answer is VERY sensitive, you’d probably prefer a clean Blu-ray over a junk Netflix signal, and the Netflix signal is only going to be worse for 4K (heavily compressed as it is to try to make it work for the “majority” of people out there, since not everyone has Google Fiber yet).

Most of this article is pushing the equivalent of fear mongering in politics. The reality is far more subtle and most people are not going to see improvement at 20 feet from a 50″ 4K set over a 1080p version given both are of good quality. At 5 feet? It will be amazing how much better 4k looks and that is precisely the sort of distance people look at 4K sets at places like Best Buy. They take them home and sit 20 feet away on the couch in an average sized room and yes, it still looks “sharp” but you aren’t seeing those mega details at that distance even with a proper 4K picture unless you are using binoculars!

W. Jeff Meier

“Bullcrap” I suggest you become more informed instead of slinging insults and foul language. I am not sure how I am inducing “fear” unless someone is fearful of some display being better than what that person currently owns.

I say in my article that a 50″ display is useful for 4K out to about 8.6 feet and a 60″ to 10.2 feet, not the 20 feet you suggest. It would be nice if you at least got my position straight. Those are maximum distances, not recommendations. I recommend a 36 degree viewing angle for a 16:9 picture, or 45 degrees for a 2.35:1 image, which is much larger than the minimum I suggest.

My statements on human vision are backed by data, not ignorance of the facts. My own vision was 20/10 for many years and is 20/15 now that I am older. My comments relate to the acuity limit of the human eye, with corrected vision if necessary. Here is a quote from an expert on human vision, August Colenbrander, M.D. (Smith-Kettlewell Eye Research Institute and California Pacific Medical Center): “…emphasizes that, contrary to popular belief, 20/20 is not actually normal or average, let alone perfect, acuity. Snellen, he says, established it as a reference standard. Normal acuity in healthy adults is one or two lines better. Average acuity in a population sample does not drop to the 20/20 level until age 60 or 70. This explains the existence of the two lines smaller than 20/20: 20/15 and 20/10.” http://lowvision.preventblindness.org/eye-conditions/how-visual-acuity-is-measured The article quoted in my post references experimental data supporting this viewpoint.

Your assumptions about oversampling video and filtering for Nyquist are also false. Below is a link to a very high quality Sony 1080p camera and its resolution characteristics. The dark curve shown is the overall camera MTF. Oversampling combined with the filters you describe is not commonly found in the video world.

Sony 24P Camera MTF

My article has nothing to do with the quality of various 4K sources. Many of these are poor, but not all. HD started out this way as well. I have seen 4K out of a RED camera on a quality 4K display, and the results are amazing. This technology has room for improvement, but limitations in human vision are not a primary reason to fault it for those who have the vision and the displays to see it.

Anecdotally, I have worked with hundreds of 4K display owners, and all of those who can actually display 4K content (which excludes some of the Sharp brand products) want more of it. The customer satisfaction level with this technology is very high. If your vision is poor and/or your display is too small, it is a pity that you cannot enjoy the benefits that many do and will in the future. I have found this to be true even for those at the limits of the distances I suggest.

I am not suggesting one needs to go out and buy a new 4K display today. The content is limited, and the displays are improving in quality every year. HD has much more content, and video in that format can be very enjoyable. But dismissing 4K as too subtle to see because of vision limitations is simply false for a large segment of the population.

pjm

Well, insults aside, I have to agree 100% with what Bullcrap said about the Nyquist theorem and oversampling. Jeff Meier, you clearly have a poor understanding of signal processing theory – this is not an insult, just an assessment. When you say that you have seen content recorded in 4K and that it looks great on a 4K screen, fine! Of course 4K content needs a 4K screen to be fully revealed (depending on screen size you may need to sit very close to see the detail, though). But that was not the point. The point is that SOURCE 1080p content will not be improved by looking at it on a 4K screen. As Bullcrap said, the processing needs to be done on the capturing side, using filtering and oversampling to remove the sampling artifacts. The best you can do on the viewing side is a good upsampling algorithm to map each 1080p pixel onto a number of 4K pixels. This upsampling is critical for good image quality, but it is a necessity rather than an advantage – if you have 1080p content, a 1080p screen is the best medium for viewing it. Of course you have additional problems like compression and transmission artifacts, but having a 4K screen will not ease those problems. One more thing: if you are watching 1080p content and you sit close enough to see the pixels on your 4K screen, then you are obviously sitting too close. This is a bit like looking at an offset printed image with a magnifying glass – you don’t want to see the individual color dots.
On a side note: oversampling and anti-aliasing are commonly used in the digital video world. Optical anti-aliasing filters are very common (though they may not always be needed), and oversampling is implemented, e.g., when producing 1080p content on a sensor with a higher pixel count. This is commonplace, but a minor point. Processing of the source (or its lack thereof) will not change this simple fact: you cannot recover/improve signal on the viewing side using a screen with a resolution higher than that of the signal.
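The mapping of each 1080p pixel onto 4K pixels can be seen in the simplest possible upscaler, nearest-neighbor 2x duplication, sketched below. Every source pixel becomes a 2x2 block; no detail is added, which is the point. Real scalers use smarter interpolation, but none can recover information the 1080p signal never contained.

```python
def upscale_2x(frame):
    """Nearest-neighbor 2x upscale of a frame given as a list of pixel rows."""
    out = []
    for row in frame:
        doubled = [px for px in row for _ in (0, 1)]  # duplicate each column
        out.append(doubled)
        out.append(list(doubled))                     # duplicate the row too
    return out

frame = [[1, 2],
         [3, 4]]
print(upscale_2x(frame))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```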

W. Jeff Meier

I am not saying 1080p content is improved by a 4K display. This article discusses the relationship between source resolution, viewing distance and vision.
