4K Calculator – Do You Benefit?

As 4K displays come to market, a common online debate has been about the benefits of 4K compared to HD in a normal viewing environment. Many people say that 4K looks amazingly sharp and it’s impossible not to see the difference, while many others say that in a regular living room, with a regular-sized screen, you really won’t see much of a difference.

Update July 3rd, 2013: Read more about HiDPI and 4K display use cases and benefits. My new article compares smartphones, tablets, laptops, monitors, TVs, and projectors to see where high resolution provides the most visible benefit.

Most people who have seen 4K to this point have seen it at trade shows, or at stores with dedicated displays for it, and not at home with normal content. Many of these demos are set up so you can only stand a short distance from the screen, which makes 4K look great, as it really does look amazing from 3′ away. They also use special content to show off 4K, since you can’t go to Amazon or Best Buy and order a 4K movie right now. One company at CES was showing off their 4K TV using a 5 minute demo from Dreamworks, which they told me was hundreds of gigabytes as it was totally uncompressed. That kind of content won’t be available to anyone at home anytime soon, and it makes for a completely unfair comparison between HD and 4K.

There is a chart from Carlton Bale that shows when you might be able to see the difference between 4K and 1080p or 720p, but I decided to make my own 4K calculator that gives you a little more detail. Enter your screen size, your distance from the screen, and your vision, and it will generate some numbers for you (a rough sketch of the math involved appears after the list below). The information it provides is:

  • The Pixels Per Inch for 480p, 1080p, and 4K signals based on the screen size.
  • The maximum resolution that you can discern with your vision, based on your distance from the display.
  • The ideal 4K screen size for your distance, which is the smallest screen at which you can resolve all the pixels.
  • The ideal distance from a 4K display of your specified size, which is how close you need to sit to resolve every pixel.
  • The amount of extra resolution that would be visible on a 4K display compared to a 1080p display, based on your screen size and distance. The maximum is 300% and the minimum is 0%.
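
For the curious, here is a minimal sketch of the one-arcminute-per-pixel (Snellen) assumption these numbers are built on. The function names, rounding, and exact structure are my own illustration rather than the calculator’s actual code, so treat the output as approximate.

```python
import math

ARCMIN_RAD = math.radians(1 / 60)  # one arcminute: the finest detail 20/20 vision is assumed to resolve

def screen_ppi(diagonal_in, h_px=3840, v_px=2160):
    """Pixels per inch of a 16:9 screen with the given diagonal and resolution."""
    width_in = diagonal_in * h_px / math.hypot(h_px, v_px)
    return h_px / width_in

def eye_limit_ppi(distance_in, vision_denom=20):
    """Finest pixel density resolvable at this viewing distance.

    vision_denom is the second number of your acuity (10 for 20/10, 20 for 20/20);
    sharper vision resolves a proportionally smaller angle.
    """
    pixel_angle = ARCMIN_RAD * vision_denom / 20
    return 1 / (distance_in * math.tan(pixel_angle))

def ideal_distance_ft(diagonal_in, h_px, v_px, vision_denom=20):
    """Farthest distance (feet) at which every pixel is still resolvable."""
    pixel_angle = ARCMIN_RAD * vision_denom / 20
    pixel_in = 1 / screen_ppi(diagonal_in, h_px, v_px)
    return pixel_in / math.tan(pixel_angle) / 12

# Rough check against the 50" example described below, assuming 20/20 vision:
print(round(ideal_distance_ft(50, 1280, 720), 1))   # ~9.8 ft: closer than this, 1080p shows more than 720p
print(round(ideal_distance_ft(50, 1920, 1080), 1))  # ~6.5 ft: closer than this, 4K shows more than 1080p
print(round(eye_limit_ppi(120), 1))                 # ~28.6 PPI is all 20/20 vision can use from 10 ft away
```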

I’ve also created a chart, seen below, that gives you at a glance the ideal viewing distance for a 16:9 display based on size and resolution. It assumes 20/20 vision, and the viewing range for each resolution is the range of distances at which you can see more detail than a lower-resolution screen would show, but are still too far away to see the extra detail of a higher-resolution one. So with a 50″ 1080p display, if you are closer than 9’9″ you will see more detail than on a 720p display, but if you are more than 6’6″ away, you couldn’t see any more detail on a 4K display.

[Chart: ideal viewing distance by screen size and resolution]

This calculator does make assumptions about vision and arc minutes, but the people I talked to said this was as good an assumption for human vision as any other. If you think it is off, you can easily adjust the vision to 20/15 or 20/10 to make it more accurate for you. This also lets you calculate for devices like cell phones and tablets, which are often held much closer to your face, and compare them to a 4K display that is much further away. One more assumption is that you have a 16×9 screen, though other screen ratios may be supported in the future.

4K Calculator

[Interactive calculator: enter your Screen Size (inches), Screen Distance, and Vision to see the 4K vs. 1080p improvement (%), the ideal resolution for your setup, the distance at which 1080p benefits begin and the distance at which all 1080p pixels become visible, the ideal 4K distance for your screen size, the ideal 4K screen size for your distance, and the Pixels Per Inch of your screen at 4K, 1080p, and SD.]

Hopefully this will help you determine whether you will see much benefit from 4K in your situation, and make it easier to compare devices like a cell phone or tablet, which you hold very close to your face, with a TV that likely sits across the room from you.


  • Pingback: Review: Seiki SE50UY04 4K Ultra HDTV (it’s $1,400!) | HD Guru

  • azeem

    I’ve stood in front of the Sony 4K display, and have a 60″ Panasonic ST60 at home. At 6 feet, the Sony looks dramatically nicer than my Panny.
    Maybe you can’t see individual pixels, but the overall picture difference was stunning to me.

    • Chris Heinonen

      From the chart, it should look great at 6′. The question is, how many people actually sit 6′ away from a 60″ TV? I don’t know any that do. Surveying my family and friends the other week, most people were 9-10′ away from a 50″ display on average, where the difference wouldn’t really exist.

      • azeem

        I sit about 15 feet away from my 60″, so I may never see any difference, but what I am trying to say is that looking at the 4K’s picture, there was no question that it was much sharper and exhibited dramatically more detail than my HDTV could ever reproduce.

        It would be interesting to see a similar chart comparing what happens on a laptop with a Retina-class display versus a standard display. I’d venture to say you have to move quite a bit away from the keyboard of a standard display before you could not tell the difference between the two.

        • Chris Heinonen

          Right, and the calculator is designed to show that it is sharper, but only when you’re within a certain range of distances. This way you can measure your viewing environment, think about the TV size you might get, and determine whether you’re likely to see a difference or not.

          I’m going to add a follow-up article that takes some common screen sizes (iPad, iPhone, laptop, LCD monitor, TV) and the common resolutions for them, and shows the distances at which you can tell a difference. The iPad/iPhone example is brought up a lot, but with that screen size and resolution, you don’t see pixels until the screen is 10″ or so from your face, which is a far more likely distance for an iPhone than for a 4K HDTV. PPI matters, but as a function of distance more than anything.

      • Mike McG

        Everyone seems to consider only TV size and viewing distance, but misses the fact that average human vision is closer to 20/14, not 20/20. 20/20 is simply the cut-off for when your vision is considered good enough that you don’t need glasses. When you’re talking about 20/14 vision (average), or 20/10 vision (twice as sharp as 20/20), the equation changes. Someone with 20/10 vision can resolve twice the resolution of someone with 20/20 vision, and will benefit more from higher resolutions at the same distance.

    • No

      There is a limit to the angle of view for comfortable viewing. If you have more resolution than what you can see when you match that angle, it’s useless resolution. You get dizzy with some material and want to throw up; that’s the typical response of the viewer.

      Computer displays are different, because we can work on a small part of the screen at a time, so we can put large displays very close.

      The diagonal of the screen is a safe bet for a maximum angle of view: a 50″ screen at 50″ distance, and so on. At that distance, you can watch an action film without vomiting. The same applies to theaters; the good seat is at a distance equal to the diagonal, not up front with the idiots. One can check the THX design guidelines. Theaters want to sell more tickets though, so they will not limit their sales to seats with good video and good audio. They hope you will be fine after 90 or 120 minutes. TV is a different thing though, as people watch TV or home cinema for hours at a time.

      At a good distance, most people can see all the pixels of a 1080p display and a little more than that. Give them 2x or 4x or 8x the resolution and it will not even affect their experience; they will simply not notice it. Meanwhile, production costs, distribution costs, and other costs increase, and the data has to be compressed more. Digital distribution cost is a huge factor for corporations, so they will try to give you 4K without additional cost over 2K, and that will certainly come with more artifacts. Pressed by the cost factor, broadcast 1080p is not as good as it should be today.

      Of course, since there are people who have their heating set to 30°C in the winter and their cooling to 15°C in the summer, we will also have people with 50″ screens sitting at a distance of 20″ so they can “appreciate” all the resolution.

      In the real world, TV or projection screens have to be 100″-120″ to actually make their 1080p visible.

  • Surge

    This is FANTASTIC!!
    It confirms what others have said, on CNET, as well as on Hometheaterreview.com (where the authors claimed NO difference for 1080P content on a JVC 1080P projector vs the Sony 4K projector on a 110″ screen from 8-10′ away).

    For those who claim that the 4K still looks better… that’s probably because it’s brighter, or has better contrast, or better color… You are not comparing apples-to-apples resolution only.

    A 4K projector is 4-5X more expensive than a very good 1080P projector. Unless your screen size is >120″ and you’re sitting 10′ away or closer (approx., better to use the calculator above)… you really won’t see much of a difference in resolution.

    • Mike McG

      Ever notice the authors on CNET are wearing glasses? As a person with better than 20/20 vision (which is the population norm), I see a MASSIVE difference between 4K and 1080P from the same viewing distance. See for yourself: select 20/10 or 20/15 in the calculator.

      • http://referencehometheater.com/ Chris Heinonen

        I’d suggest reading Katzmaier’s take on the recent Panasonic 4K set, where he compares it directly with a 1080p set of the same size, using the same 4K content, and the closest THX recommended viewing angle:

        http://reviews.cnet.com/flat-panel-tvs/panasonic-tc-l65wt600/4505-6482_7-35827200.html

        Not sure why you’re bashing glasses. So we shouldn’t believe people that make sure their vision is corrected to be at least average, if not better?

        • Mike McG

          I’m not bashing people with glasses, I’m just tired of the misleading and inaccurate reviews. I’ve seen the TVs myself, side by side, with the same content vs 1080p, in person, and the difference to me is night and day. If you make the mistaken assumption that 20/20 is “average”, then you’re going to come to false conclusions.

          • Surge

            20/20 IS Average! Research what the term means.

          • Mike McG

            Thanks, I have! You should too! Let me get you started with studies of visual acuity by age.

            http://www.oculist.net/downaton502/prof/ebook/duanes/pages/v5/v5c051.html

            By definition, 20/20 vision is nominal, or “normal”. Check Wikipedia… or anywhere really. This does not imply in any way that it is a statistical average in a given population. In fact, it’s not even a scientifically chosen measure! It just so happens that Snellen, who invented the chart, had an assistant who could read a standard lead-size print from 20 feet. As he was considered to have good eyesight, that became the starting point for the test, but it was later determined (scientifically) that many people see far better than that.

            Today, we know that a normal healthy human eye sees at about 20/12 to 20/16 from childhood through middle age. By the age of 60 or 70, visual acuity drops to 20/20. There are people with better vision, and people with worse.

            Regardless of what “average” vision is, or even what “normal” vision is, there are people who see twice as sharply as those with 20/20 vision. So making generic claims about whether or not the “average” person can see a difference is nonsense. First of all, 20/20 is not the average; thanks to glasses and contacts, about 75% of people of all ages see better than 20/20. And even if 20/20 were average, some people still see twice as sharply! We don’t all have the same vision, we don’t all have the same size TV, and we don’t all sit the same distance away from it. And that’s why the calculator has 3 parameters, not 2.

  • Surge

    How do you translate the findings for a 2.35:1 screen?

    • Chris Heinonen

      You’d need to base it on a 16×9 screen of the same width. So a 109″ 2.40:1 screen is the same width as a 116″ 16:9 screen. I’ll see if I can add an option for this in the future.
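
      A minimal sketch of that width-matching conversion (a hypothetical helper, not the calculator’s own code):

```python
import math

def equivalent_16x9_diagonal(diagonal_in, aspect):
    """Diagonal of the 16:9 screen that has the same width as a screen
    of the given diagonal and aspect ratio (width divided by height)."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    return width_in * math.hypot(16, 9) / 16

print(round(equivalent_16x9_diagonal(109, 2.40)))  # ~115", roughly the 116" figure quoted above
```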

  • No

    The Limits of Human Vision
    Michael F. Deering

    http://www.swift.ac.uk/about/files/vision.pdf?

  • No

    And the THX setup guidelines:

    http://www.thx.com/consumer/home-entertainment/home-theater/hdtv-set-up/

    These are interesting because they are designed for comfortable viewing. You can’t simply change the angles to 2x or 4x (the distance to 1/2 or 1/4) to accommodate 2x or 4x resolution.

    In effect, increasing resolution does not make much sense beyond 2K, because of the comfortable-viewing factor that limits the angle and the fixed resolution of the human eye. I would rather invest in a higher bitrate instead, because that will provide a true advantage: a more natural image with fewer artifacts.

    • Mike McG

      I’m sorry, but you’re wrong. An increase in resolution to match the capabilities of the human eye makes a huge difference. Human visual acuity ranges from about 20/10 in young people, to roughly 20/14 on average, to 20/20 at age 55 as our vision slowly degrades. At 20/10, even a moderate 40″ TV sees a nearly 300% benefit at 4K from 6 ft. At 55″, 4K isn’t enough, as 5K resolution would be ideal. When you’re doing calculations, base them on your vision, but don’t assume 20/20 is average, because it is not.

  • Naveen

    The equation for optimal 20/20 viewing distance in feet (in decimals) is:
    =(SQRT(x^2/(16^2+9^2))*9)/(1080)*60*360/(22/7)/2/12

    * where x = the screen’s diagonal length
    * where the aspect ratio isn’t 16:9, replace 16 & 9 in the equation with the appropriate values
    * 1080 is the vertical resolution used in the equation; change it to the appropriate resolution if required
    * the last part of the equation, 12, converts the final figure into feet. To apply the equation to a smaller device, say a smartphone or a tablet/laptop, omit the 12 and the result will be in inches.
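
    A direct Python transcription of that formula (my own sketch, using math.pi in place of 22/7); for a 50″ 1080p screen it gives roughly 6.5 feet, matching the 6’6″ figure in the chart earlier:

```python
import math

def optimal_distance_ft(diagonal, aspect_w=16, aspect_h=9, vertical_px=1080):
    """Distance in feet at which 20/20 vision just resolves every pixel."""
    height = math.sqrt(diagonal**2 / (aspect_w**2 + aspect_h**2)) * aspect_h
    pixel = height / vertical_px                # physical height of one pixel, in inches
    return pixel * 60 * 360 / math.pi / 2 / 12  # small-angle form of one arcminute per pixel

print(round(optimal_distance_ft(50), 1))  # ~6.5 ft for a 50" 1080p screen
```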

    • Chris Heinonen

      That appears to be the Snellen formula, which is what I use to generate this data. Other people support other formulas, and hopefully I can update the calculator to support multiple formulas in the near future. Thanks.

      • Mark Rejhon

        One interesting consideration is the compression variable. Compression artifacts are much bigger than pixels. On 4K, compression artifacts are one-quarter the size they are on a same-size 1080p display.

        When using more extreme compression ratios (keeping bitrate-per-resolution constant), the viewing-distance benefit actually increases, to the point where people have reported noticeable differences more than 10 feet away from a 4K display.

        With streaming becoming the default choice for 4K movie delivery (instead of physical media), compression artifacts are noticeable from 10 feet away on a 50″ 4K display, and the 4K stream can actually look significantly better (e.g. 1080p @ 4Mbps versus 4K @ 16Mbps, keeping the bitrate per pixel proportional).

        It is an interesting “fudge factor” variable to consider…

        • http://referencehometheater.com/ Chris Heinonen

          You’re treating that higher bitrate as an either/or decision, as if to get it you’d also need to move from 1080p to 4K. As we’ve seen from Netflix and their 4K streams, which are apparently a paltry 3 Mbps, that’s unlikely to happen.

          You could also take an increase in bit rate, and the better compression of HEVC, and apply that to 1080p content. If you’re already at the viewing limits for pixels, you could then have fewer compression artifacts and add in greater bit depth to both luma and chroma content. That greater range would give us fewer banding compression artifacts and be more noticeable than any increase in resolution from a distance.

          • Mark Rejhon

            For better or for worse, it’s disappointingly relevant, since content makers are getting more into streaming and it’s looking less likely that consumers are going to flock to physical 4K formats rather than streaming 4K formats, especially as net connections get faster. As smart people like you and me already know, eventually 4K will be a small premium over 1080p, much like 1080p became a small premium over 720p, and thus the reason to avoid 4K becomes smaller.

            The rationale seems to be:
            – The world is slowly going to streaming;
            – The premium of 4K will eventually become fairly small;
            – Content makers are (theoretically) more likely to give you higher bitrates for 4K than for 1080p, just to push their premium product. Eventually, it could potentially become easier to find 15Mbps+ and 20Mbps+ streams in 4K format than in 1080p format. Depends on which direction the market goes in.
            – At some point in the future, a critical mass occurs and streaming potentially becomes the primary driving force for the sales of 4K TV’s.

            P.S. 3 Mbps? That must be the minimum rate in a congested Internet area; my router’s bitrate meter measured twice that just for my 1080p streams (e.g. for the movie “Hugo”, SuperHD stream, top bitrate). It’s a 35 Mbps VDSL line hosting a Netflix Open Connect appliance locally, so my streams tend to always max out here in Toronto. When I did some bitrate measurements, the movie Hugo measured 6 to 7 megabits per second from my router (probably including a bit of overhead). It looked far better compressed than my television channels. Certainly Blu-ray is superior (I try to get them when I can too), but I’ve been fortunate to be on one of those ‘good Netflix Internet connections’ that consistently delivers the better streams. People buy TVs for getting cable too, yet around here streaming can often be better than television quality. It’s a sad state of affairs, really, as I understand the lovely look of an H.264 stream from Blu-ray at 25 megabits per second.

  • ronniejr

    I have a 2007 model Toshiba. It still has a pretty good picture, but when I went to see the Sony 4K I was blown away. I have 20/15 vision. You have to be right on top of the TV to see any kind of pixels, but I do believe it makes a difference. I have a 1080p 5″ phone and my wife has a 4.7″ 720p phone, and I can tell the difference between the two as far as resolution goes. I don’t care what Steve Jobs said; some people with near-perfect vision can distinguish the resolution even at that high a PPI. I am definitely going to be getting a 4K Sony in the near future. Not just because of the pixel density, but the color and contrast are so vibrant; it all plays into the factors for a person that wants the best possible picture. Although I’ll wait for the next model year before I make a purchase.

    • Mike McG

      I’m with ya ;) I have 20/10 vision, and I still see the pixels on the retina displays. 4K is ideal for me for a 40″ TV. Anything larger, and an even higher resolution would be beneficial.

  • Richard Barg

    Amazingly brilliant. You nailed it.

  • Mike McG

    Thanks for sharing this calculator! It’s great to have a tool that can show what improvement real people will see, vs. all these misinformed articles claiming there is no noticeable difference, written by authors in glasses, assuming 20/20 is average vision (it is not). Average human vision is closer to 20/14, with many people in the 20/12 to 20/10 range. I have 20/10, and your calculator tells me on a 40″ 4K TV at 6ft away, I’ll perceive about a 278% improvement over 1080P, which matches what I’ve seen in person with 4K TVs…the picture is SOOOO much sharper! Now, I just need the prices to drop ;)

    • http://referencehometheater.com/ Chris Heinonen

      Except what you are writing makes no sense. 20/20 vision means that someone sees at 20 feet what an average person sees at 20 feet. If 20/14 was actually “average” then the scale would be reset and that would become 20/20.

      • Mike McG

        Hi Chris, no, that is not what 20/20 means. That’s an inaccurate assumption made by most people when discussing 4K. 20/20 is not a statistical average of human vision (at least not until you look at people about 60+ years old). 20/20 is what they consider “normal” vision, meaning that your vision is considered within the normal range. The population average is closer to 20/14, with some people seeing as well as 20/10. At 20/10, I can resolve the same detail at 40ft as a person with 20/20 could see at 20ft. Which means for me, and many others (under the age of 60), 4K TV is a welcome sight!

        See figure 4 for a number of studies of average vision by age…

        http://www.oculist.net/downaton502/prof/ebook/duanes/pages/v5/v5c051.html

  • Mark Rejhon

    An interesting test for 4K benefit is computer graphics aliasing — http://www.testufo.com/aliasing-visibility

    Surprisingly, this is actually visible 10 feet away from a 50″ 4K display (or a 24″ 1080p monitor, almost the same PPI). The line aliasing/antialiasing artifacts translate into a bead-type artifact that shows up at wide spacing intervals, which then becomes visible from a farther-than-expected viewing distance.

  • Mahmoud El-Darwish

    Correct, but there is a missing parameter: field scanning by the viewer. Since it increases with larger screens, it affects the experience. Note how the benefits of 4K are derived from a larger panorama. This mimics the movie-theater experience of sitting overly close to the screen, which we know to be counterproductive!

  • jason_darrow

    I think most people here are confusing increased resolution with improved color representation and brightness. As for resolution, I have 20/20 and am unable to discern the difference between 720p and 1080p on a 46″ screen at 2 meters distance. I have absolutely no interest in 4k, but am very interested in better and brighter colors.

  • Jens Emil Ravn Nielsen

    I know you posted this a long time ago, but I just want to point out that watching 4k content on a 1080 screen won’t look as nice as 1080 content on a 1080 screen. Just like upscaling 1080 content on a 4k screen won’t look as nice as 4k content on a 4k screen, downscaling/sampling 4k to a 1080 screen just won’t look as nice.