4K Calculator – Do You Benefit?

As 4K displays come to market, a common online debate has been how much benefit 4K offers over HD in a normal viewing environment. Many people say 4K looks amazingly sharp and the difference is impossible to miss, while many others say that in a regular living room, with a regular-sized screen, you really won’t see much of a difference.

Update July 3rd, 2013: Read more about HiDPI and 4K Display use cases and benefits. My new article compares smartphones, tablets, laptops, monitors, TVs, and projectors to see where high resolution provides the most visible benefit.

Update December 1, 2014: Revised the text now that 4K/UHD TVs are available everywhere.

Most people who have seen 4K so far have seen it in stores on dedicated displays, not at home. These are often set up so you can only stand a few feet from the screen, which makes 4K look great, because it really does look amazing from 3′ away. They also use special content to show off 4K, since the only 4K sources right now are Netflix and custom media players from Sony and Samsung. Because Netflix is unreliable on a showroom floor and most companies don’t want to use a media player from a different company, you see custom demo loops that are nothing like what you will watch at home.

Now that you can get a 40″ 4K TV from Samsung for only $600 or a 50″ from Vizio for $1,000, UltraHD TVs are mainstream. Since most manufacturers are making their top-of-the-line sets 4K, those who want all the best features have no choice but to go 4K at this point. Plasma no longer exists, and OLED is still expensive and only available from LG, so you are almost certainly purchasing an LCD. If you’re looking for a TV that isn’t top-of-the-line, will you see a difference with 4K in your house, or should you invest your money in other features, or simply spend less?

Carlton Bale has a chart available that shows when you might be able to see the difference between 4K and 1080p or 720p, but I decided to make my own 4K calculator that gives you a little more detail. Enter your screen size, your distance from the screen, and your vision, and it generates the following numbers:

  • The pixels per inch for 480p, 1080p, and 4K signals at that screen size.
  • The maximum resolution you can discern with your vision at that distance from the display.
  • The ideal 4K screen size for your distance, which is the smallest screen at which you can resolve every pixel.
  • The ideal distance from a 4K display of your specified size, which is how close you need to sit to resolve every pixel.
  • The amount of extra resolution that would be visible on a 4K display compared to a 1080p display, given your screen size and distance. The maximum is 300% and the minimum is 0%.

I’ve also created a chart, seen below, that shows at a glance the ideal viewing distance for a 16:9 display based on size and resolution. It assumes 20/20 vision, and the viewing range for each resolution covers the distances at which you can see more detail than a lower-resolution screen would show, but are not yet close enough to see the extra detail of a higher-resolution one. So with a 50″ 1080p display, if you are closer than 9′9″ you will see more detail than a 720p display could show, but if you are more than 6′6″ away, you couldn’t see any more detail on a 4K display.

[Chart: ideal viewing distance ranges for 16:9 displays by screen size and resolution]

This calculator does make assumptions about vision and arc minutes, but the people I talked to said this was as good an assumption for human vision as anything else. If you think it is off, you can easily adjust the vision to 20/15 or 20/10 to better match your own. It also lets you run the numbers for devices like cell phones and tablets, which are typically held much closer to your face, and compare them to a 4K display sitting much further away. One more assumption is that you have a 16:9 screen, though other aspect ratios may be supported in the future.
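
To make the math concrete, here is a minimal sketch (in Python) of the kind of calculation behind these numbers. It assumes the common Snellen-based rule of thumb that 20/20 vision resolves roughly one arcminute per pixel, scaled for 20/15 or 20/10, and a 16:9 panel; the function names are illustrative and the calculator on this page may round or weight things slightly differently, but the 50″ example below reproduces the 9′9″ and 6′6″ figures from the chart.

```python
import math

def arcmin_per_pixel(snellen_denominator=20):
    """Finest detail resolvable, in arcminutes: 20/20 -> 1.0, 20/15 -> 0.75, 20/10 -> 0.5."""
    return snellen_denominator / 20

def screen_height_in(diagonal_in, aspect_w=16, aspect_h=9):
    """Physical screen height in inches from the diagonal of a 16:9 (or other) panel."""
    return diagonal_in * aspect_h / math.hypot(aspect_w, aspect_h)

def max_useful_distance_ft(diagonal_in, vertical_pixels, snellen_denominator=20):
    """Farthest seating distance, in feet, at which individual pixel rows are still resolvable.
    Sit farther back than this and a higher-resolution panel adds nothing you can see."""
    pixel_in = screen_height_in(diagonal_in) / vertical_pixels
    angle_rad = math.radians(arcmin_per_pixel(snellen_denominator) / 60)
    return pixel_in / math.tan(angle_rad) / 12

def ppi(diagonal_in, h_pixels=3840, v_pixels=2160):
    """Pixels per inch for a panel of the given diagonal and resolution."""
    return math.hypot(h_pixels, v_pixels) / diagonal_in

# The 50" example from the chart, assuming 20/20 vision:
print(round(max_useful_distance_ft(50, 720), 2))   # ~9.75 ft (9'9"): farther than this, 1080p adds nothing over 720p
print(round(max_useful_distance_ft(50, 1080), 2))  # ~6.50 ft (6'6"): farther than this, 4K adds nothing over 1080p
print(round(ppi(50), 1))                           # ~88.1 PPI for a 50" UHD panel
```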

Many reviewers have tried to compare 4K to 1080p to see if they notice a difference. David Katzmaier pulled in a panel, showed them the same content on a 4K stream from Netflix and on 1080p Blu-ray, and none of the panelists could pick out the 4K display. HDTVTest ran a similar 1080p-versus-4K test, but used their own custom content instead of streaming content. In their testing people did notice the difference from a reasonable distance, though unless you are shooting your own 4K content you can’t reproduce this yourself.

I talked to other reviewers who tested projectors and were able to instantly switch between a Sony 4K projector and a JVC X700R on a 120″ screen. They could barely notice the difference in resolution even with content straight from a RED 4K camera. Even when they did notice, they preferred the JVC image because it had better blacks and a better contrast ratio, and the eye notices those more than resolution. With any display, resolution is only a single factor in how good the picture looks. Knowing how much of that increased resolution you might actually see can help you decide which TV will work best for you.

 

[Interactive 4K Calculator: enter your Screen Size (inches), Screen Distance, and Vision to see the 4K vs. 1080p improvement percentage, the ideal resolution for your setup, the distance at which 1080p benefits begin and the distance at which every 1080p pixel is visible, the ideal 4K viewing distance for your screen size, the ideal 4K screen size for your distance, and the pixels per inch of 4K, 1080p, and SD signals at that screen size.]

Hopefully this will help you determine whether you will see much benefit from 4K in your situation, and make it easier to compare devices like a cell phone or tablet that you hold very close to your face with a TV that likely sits across the room from you.


  • Pingback: Review: Seiki SE50UY04 4K Ultra HDTV (it’s $1,400!) | HD Guru

  • azeem

    I’ve stood in front of the Sony 4K display, and have a 60″ Panasonic ST60 at home. At 6 feet, the Sony looks dramatically nicer than my Panny.
    Maybe you can’t see individual pixels, but the overall picture difference was stunning to me.

    • Chris Heinonen

      From the chart, it should look great at 6′. The question is, how many people actually sit 6′ away from a 60″ TV? I don’t know any that do. Surveying my family and friends the other week, most people were 9-10′ away from a 50″ display on average, where the difference wouldn’t really exist.

      • azeem

        I sit about 15 feet away from my 60″, so I may never see any difference, but what I am trying to say is that looking at the picture of the 4K, there was no question that it was much sharper, and exhibited dramatically more detail than my HDTV could ever reproduce.

        It would be interesting to see a similar chart comparing what happens on a laptop with a Retina-class display versus a standard display. I’d venture to say you have to move quite a bit away from the keyboard on a standard display before you could not tell the difference between the two.

        • Chris Heinonen

          Right, and the calculator is designed to show that it is sharper, but only when you’re within a certain range of distances. This way you can measure your viewing environment, think about the TV size you might get, and determine if you’re likely to see a difference or not.

          I’m going to add a follow-up article that takes some common screen sizes (iPad, iPhone, Laptop, LCD Monitor, TV) and the common resolutions for them, and shows the distances at which you can tell a difference. The iPad/iPhone example is brought up a lot, but with that screen size and resolution, you don’t see pixels until the device is 10″ or so from your face, which is a more likely distance for an iPhone than for a 4K HDTV. PPI matters, but as a function of distance more than anything.

      • Mike McG

        Everyone seems to consider only TV size and viewing distance, but misses the fact that average human vision is closer to 20/14, not 20/20. 20/20 is simply the cut-off for when your vision is considered good enough that you don’t need glasses. When you’re talking about 20/14 vision (average), or 20/10 vision (twice as sharp as 20/20), the equation changes. Someone with 20/10 vision can resolve twice the resolution of someone with 20/20 vision, and will benefit more from higher resolutions at the same distance.

    • No

      There is a limit on the angle of view for comfortable viewing. If you have more resolution than what you can see when you match that angle, it’s useless resolution. You get dizzy with some material and want to throw up. That’s the typical response of the viewer.

      Computer displays are different, because we can work on a small part of the screen at a time, so we can put large displays very close.

      The diagonal of the screen is a safe bet for a maximum angle of view: a 50″ screen at 50″ distance, and so on. At that distance, you can watch an action film without vomiting. The same applies to theaters; the good seat is at a distance equal to the diagonal, not up front with the idiots. One can check the THX design guidelines. Theaters want to sell more tickets though, so they will not limit their sales to seats with good video and good audio. They hope you will be fine after 90 or 120 minutes. TV is a different thing, though; people watch TV or home cinema for hours at a time.

      At a good distance, most people can see all the pixels of a 1080p display and a little more than that. Give them 2x or 4x or 8x the resolution and it will not even affect their experience. They will simply not notice it. Meanwhile, production, distribution, and other costs increase, and data has to be compressed more. Digital distribution cost is a huge factor for corporations, so they will try to give you 4K without additional cost over 2K, and that will certainly come with more artifacts. Pressed by the cost factor, broadcast 1080p is not as good as it should be today.

      Of course, since there are people who set their heating to 30C in the winter and their cooling to 15C in the summer, we will also have people with 50″ screens sitting at a distance of 20″ so they can “appreciate” all the resolution.

      In the real world, TV or projection screens have to be 100″-120″ to actually make their 1080p visible.

  • Surge

    This is FANTASTIC!!
    It confirms what others have said, on CNET, as well as on Hometheaterreview.com (where the authors claimed NO difference for 1080P content on a JVC 1080P projector vs the Sony 4K projector on a 110″ screen from 8-10′ away).

    For those who claim that the 4K still looks better… that’s probably because it’s brighter, or has better contrast, or better color… You are not comparing apples-to-apples resolution only.

    A 4K projector is 4-5X more expensive than a very good 1080P projector. Unless your screen size is >120″ and you’re sitting 10′ away or closer (approx., better to use the calculator above)… you really won’t see much of a difference in resolution.

    • Mike McG

      Ever notice the authors on CNET are wearing glasses? As a person with better than 20/20 vision (which is the population norm), I see a MASSIVE difference between 4K and 1080P from the same viewing distance. See for yourself: select 20/10 or 20/15 in the calculator.

      • Chris Heinonen

        I’d suggest reading Katzmaier’s take on the recent Panasonic 4K set, where he compares it directly with a 1080p set of the same size, using the same 4K content, and the closest THX recommended viewing angle:

        http://reviews.cnet.com/flat-panel-tvs/panasonic-tc-l65wt600/4505-6482_7-35827200.html

        Not sure why you’re bashing glasses. So we shouldn’t believe people that make sure their vision is corrected to be at least average, if not better?

        • Mike McG

          I’m not bashing people with glasses, I’m just tired of the misleading and inaccurate reviews. I’ve seen the TVs myself, side by side, with the same content vs 1080p, in person, and the difference to me is night and day. If you make the mistaken assumption that 20/20 is “average”, then you’re going to come to false conclusions.

          • Surge

            20/20 IS Average! Research what the term means.

          • Mike McG

            Thanks, I have! You should too! Let me get you started with studies of visual acuity by age.

            http://www.oculist.net/downaton502/prof/ebook/duanes/pages/v5/v5c051.html

            By definition, 20/20 vision is nominal, or “normal”. Check Wikipedia…or anywhere really. This does not imply in any way that it is a statistical average in a given population. In fact, it’s not even a scientifically chosen measure! It just so happens that Snellen, who invented the chart, had an assistant who could read a standard lead-size print from 20 feet. As he was considered to have good eyesight, that became the starting point for the test, but they later determined (scientifically) that many people see far better than that.

            Today, we know that a normal healthy human eye sees at about 20/12 to 20/16 from childhood through middle age. By the age of 60 or 70, visual acuity drops to 20/20. There are people with better vision, and people with worse.

            Regardless of what “average” vision is, or even what “normal” vision is, there are people who see twice as sharply as those with 20/20 vision. So making generic claims about whether or not the “average” person can see a difference is nonsense. First of all, 20/20 is not the average; thanks to glasses and contacts, about 75% of people of all ages see better than 20/20. And even if 20/20 were average, some people still see twice as sharply! We don’t all have the same vision, we don’t all have the same size TV, and we don’t all sit the same distance away from it. And that’s why the calculator has 3 parameters, not 2.

          • John

            Nope. Norm vs average. Wake up dude.

        • John

          Chris,
          Your article is rubbish.
          Thankfully we rely on science and not your subjective opinion that has no basis or support in science.

          You are assuming all people have 20/20. That is the norm and not the average. The majority of people have better vision than 20/20.
          The average is 20/14, so 4k matters at normal viewing distances.
          What we see is a reconstruction by our mind based on input provided by our eyes. It is not the actual light (photon particle-waves) received by our eyes.
          The human eye can resolve the equivalent of 52 megapixels (assuming a 60% angle of view, 20/20 vision). However, if we consider a 20% off-center view we perceive 1/10 as much detail. At the periphery we detect larger-scale contrast and minimal color. Depending on eyesight we perceive detail more akin to 5-15 megapixels. We also do not remember pixel by pixel, but textures, color and contrast on an image-by-image basis. The memorable image is really one that is prioritized based on interest. (We see faces, look for patterns, often false patterns.) You should consider asymmetry (that is why you need to place your TV lower than your line of sight, in a direction away from your nose). Also consider low light, where you start losing color and perceive monochrome. Also consider gradations, where enlarged detail becomes less visible to the human eye, compared to a camera where it’s the opposite.
          4k (2160p) is merely 8.29 megapixels. 1080p is only 2.07 megapixels. No wonder they are working on 8k already. There is a difference between the content resolution/detail and what your mind perceives.

          People that knock 4k content/panels due to pixel resolution based on distance, or claim 4k is not 4k (because of 3840×2160 vs 4096×2160), are merely projecting their ignorance with confidence.
          3840×2160 scales better from 1080p (1920×1080); we have 16:9 TVs, not the 19:10 of 4096×2160. But you do get the detail of 4096×2160 on your 16:9 screen, and that is why you have black bars at the top and bottom of your screen.

          Wake up Chris, as far as we know space is continuous, not discrete. More pixels is not just better resolution, it’s better color and detail too.

  • Surge

    How do you translate the findings for a 2.35:1 screen?

    • Chris Heinonen

      You’d need to base it on a 16:9 screen size of the same width. So a 109″ 2.40:1 screen is the same width as a 116″ 16:9 screen. I’ll see if I can add an option for this in the future.
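
      A quick sketch of that width-matching conversion, for anyone who wants to run it themselves (this is an illustrative helper, not the calculator’s own code):

      ```python
      import math

      def equivalent_16x9_diagonal(diagonal_in, aspect_ratio):
          """Diagonal of the 16:9 screen whose width matches a wider-ratio screen,
          e.g. a 2.35:1 or 2.40:1 'scope' projection screen."""
          width_in = diagonal_in * aspect_ratio / math.hypot(aspect_ratio, 1)
          return width_in * math.hypot(16, 9) / 16

      print(round(equivalent_16x9_diagonal(109, 2.40)))  # ~115", close to the 116" figure above
      ```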

  • No

    The Limits of Human Vision
    Michael F. Deering

    http://www.swift.ac.uk/about/files/vision.pdf

  • No

    And the THX setup guidelines:

    http://www.thx.com/consumer/home-entertainment/home-theater/hdtv-set-up/

    These are interesting because they are designed for comfortable viewing. You can’t simply change the angles to 2x or 4x (the distance to 1/2 or 1/4) to accommodate 2x or 4x resolution.

    In effect, increasing resolution does not make much sense over 2K because of the comfortable viewing factor that limits the angle, and the fixed resolution of the human eye. I would rather invest in higher bitrate instead, because this will provide a true advantage: a more natural image with fewer artifacts.

    • Mike McG

      I’m sorry, but you’re wrong. An increase in resolution to match the capabilities of the human eye makes a huge difference. Human visual acuity ranges from 20/10 in young people, to about 20/14 on average, to 20/20 by age 55 as our vision slowly degrades. At 20/10, even a moderate 40″ TV sees a nearly 300% benefit at 4K from 6ft. At 55″, 4K isn’t enough, as 5K resolution would be ideal. When you’re doing calculations, base it on your vision, but don’t assume 20/20 is average, because it is not.

  • Naveen

    The equation for optimal 20/20 viewing distance in feet (in decimals) is:
    =(SQRT(x^2/(16^2+9^2))*9)/(1080)*60*360/(22/7)/2/12

    * where x = screen’s diagonal length
    * where aspect ratio isn’t 16:9, change 16 & 9 in the above equation with appropriate aspect ratio
    * 1080 is the resolution used in the above equation; change if required to the appropriate resolution
    * the last part of the equation, ‘12’, converts the final figure into feet. For applying this equation to a smaller device, say a smartphone or a tablet/laptop, omitting this 12 would show the result in inches.

    • Chris Heinonen

      That appears to be the Snellen formula, which is what I use to generate this data. Other people support other formulas, and hopefully I can update the calculator to support multiple formulas in the near future. Thanks.
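
      For reference, here is that spreadsheet formula transcribed into Python next to the small-angle one-arcminute (Snellen) form; both give essentially the same answer, since the 22/7 is just an approximation of π. This is a sketch for checking the math, not the calculator’s actual code:

      ```python
      import math

      def naveen_distance_ft(diagonal_in, lines=1080, aspect_w=16, aspect_h=9):
          """Literal translation of the spreadsheet formula above (22/7 stands in for pi)."""
          height_in = math.sqrt(diagonal_in**2 / (aspect_w**2 + aspect_h**2)) * aspect_h
          return height_in / lines * 60 * 360 / (22 / 7) / 2 / 12

      def snellen_distance_ft(diagonal_in, lines=1080, aspect_w=16, aspect_h=9):
          """Same quantity via the small-angle form: one pixel per arcminute for 20/20 vision."""
          pixel_in = diagonal_in * aspect_h / math.hypot(aspect_w, aspect_h) / lines
          return pixel_in * (180 * 60 / math.pi) / 12

      print(round(naveen_distance_ft(50), 2), round(snellen_distance_ft(50), 2))  # both ~6.5 ft for a 50" 1080p screen
      ```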

      • Mark Rejhon

        One interesting consideration is the compression variable. Compression artifacts are much bigger than pixels. On 4K, compression artifacts are one-quarter the size they are on a same-size 1080p display.

        When using more extreme compression ratios (keeping bitrate-per-resolution constant), the viewing distance benefit actually increases, to the point where people have reported noticeable differences more than 10 feet away from a 4K display.

        With streaming becoming the default choice for 4K movie delivery (instead of physical media), compression artifacts are noticeable from 10 feet away on a 50″ 4K display, and the 4K stream can actually look significantly better (e.g. 1080p @ 4 Mbps versus 4K @ 16 Mbps, keeping the bitrate per pixel proportional).

        It is an interesting “fudge factor” variable to consider…

        • Chris Heinonen

          You’re treating that higher bit-rate as an either/or decision, that to get it you’d need to also move to 4K from 1080p. As we’ve seen from Netflix and their 4K streams, which are apparently a paltry 3 mb/sec, that’s unlikely to happen.

          You could also take an increase in bit rate, and the better compression of HEVC, and apply that to 1080p content. If you’re already at the viewing limits for pixels, you could then have fewer compression artifacts and add in greater bit depth to both luma and chroma content. That greater range would give us fewer banding compression artifacts and be more noticeable than any increase in resolution from a distance.

          • Mark Rejhon

            For better or for worse, it’s disappointingly relevant since content makers are getting more into streaming, and it looks less likely that consumers will flock to physical 4K formats rather than streaming 4K, especially as net connections get faster. As smart people like you and me already know, eventually 4K will be a small premium over 1080p, much like how 1080p became a small premium over 720p, and thus the reason to avoid 4K becomes smaller.

            The rationale seems to be:
            – The world is slowly going to streaming;
            – The premium of 4K will eventually become fairly small;
            – Content makers are (theoretically) more likely to give you higher bitrates for 4K than for 1080p, just to push their premium product. Eventually, it could potentially become easier to find 15Mbps+ and 20Mbps+ streams in 4K format than in 1080p format. Depends on which direction the market goes in.
            – At some point in the future, a critical mass occurs and streaming potentially becomes the primary driving force for the sales of 4K TV’s.

            P.S. 3Mbps? That must be the minimum rate in a congested Internet area. My router’s bitrate meter measured twice that 3 Mbps rate just for my 1080p streams (e.g. for the movie “Hugo”, SuperHD stream, top bitrate). It’s a 35Mbps VDSL line hosting a Netflix Open Connect appliance locally, so my streams tend to always max out here in Toronto. When I did some bitrate measurements, the movie Hugo measured 6 to 7 megabits per second from my router (probably including a bit of overhead). It looked far better compressed than my television channels. Certainly Blu-ray is superior (I try to get them when I can too) but I’ve been fortunate to be on one of those ‘good Netflix Internet connections’ that consistently delivers the better streams. People buy TVs for getting cable too, yet around here, streaming can often be better than television quality. It’s a sad state of affairs, really, as I understand the lovely look of an H.264 stream from Blu-ray at 25 megabits per second.

  • ronniejr

    I have a 2007 model Toshiba. It still has a pretty good picture, but when I went to see the Sony 4k I was blown away. I have 20/15 vision. You have to be right on top of the TV to see any kind of pixels, but I do believe it makes a difference. I have a 1080p 5″ phone and my wife has a 4.7″ 720p phone, and I can tell the difference between the two as far as resolution. I don’t care what Steve Jobs said; some people with near-perfect vision can distinguish the resolution even at that high of a PPI. I am definitely going to be getting a 4k Sony in the near future. Not just because of the pixel density, but the color and contrast are so vibrant; it all plays into the factors for a person that wants the best possible picture. Although I’ll wait for the next model year before I make a purchase.

    • Mike McG

      I’m with ya ;) I have 20/10 vision, and I still see the pixels on the retina displays. 4K is ideal for me for a 40″ TV. Anything larger, and an even higher resolution would be beneficial.

  • Richard Barg

    Amazingly brilliant. You nailed it.

    • John

      Nope, he assumes all people have 20/20. That is the norm and not the average. The majority of people have better vision than 20/20. The average is 20/14, so 4k matters at normal viewing distances.
      What we see is a reconstruction by our mind based on input provided by our eyes. It is not the actual light (photon particle-waves) received by our eyes.
      The human eye can resolve the equivalent of 52 megapixels (assuming a 60% angle of view, 20/20 vision). However, if we consider a 20% off-center view we perceive 1/10 as much detail. At the periphery we detect larger-scale contrast and minimal color. Depending on eyesight we perceive detail more like 5-15 megapixels. We also do not remember pixel by pixel, but memorable textures, color and contrast on an image-by-image basis. The memorable image is really one that is prioritized based on interest. (We see faces, look for patterns, often false patterns.) You should consider asymmetry (that is why you need to place your TV lower than your line of sight, in a direction away from your nose). Also consider low light, where you start losing color and perceive monochrome. Also consider gradations, where enlarged detail becomes less visible to the human eye, compared to a camera where it’s the opposite.
      4k (2160p) is merely 8.29 megapixels. 1080p is only 2.07 megapixels. No wonder they are working on 8k already. There is a difference between the content resolution/detail and what your mind perceives.

      People that knock 4k content/panels due to pixel resolution based on distance, or claim 4k is not 4k (because of 3840×2160 vs 4096×2160), are merely projecting their ignorance with confidence.

      Wake up people, as far as we know space is continuous, not discrete. More pixels is not just better resolution, it’s better color and detail too.

  • Mike McG

    Thanks for sharing this calculator! It’s great to have a tool that can show what improvement real people will see, vs. all these misinformed articles claiming there is no noticeable difference, written by authors in glasses, assuming 20/20 is average vision (it is not). Average human vision is closer to 20/14, with many people in the 20/12 to 20/10 range. I have 20/10, and your calculator tells me on a 40″ 4K TV at 6ft away, I’ll perceive about a 278% improvement over 1080P, which matches what I’ve seen in person with 4K TVs…the picture is SOOOO much sharper! Now, I just need the prices to drop ;)

    • Chris Heinonen

      Except what you are writing makes no sense. 20/20 vision means that someone sees at 20 feet what an average person sees at 20 feet. If 20/14 was actually “average” then the scale would be reset and that would become 20/20.

      • Mike McG

        Hi Chris, no, that is not what 20/20 means. That’s an inaccurate assumption made by most people when discussing 4K. 20/20 is not a statistical average of human vision (at least not until you look at people about 60+ years old). 20/20 is what they consider “normal” vision, meaning that your vision is considered within the normal range. The population average is closer to 20/14, with some people seeing as well as 20/10. At 20/10, I can resolve the same detail at 40ft as a person with 20/20 could see at 20ft. Which means for me, and many others (under the age of 60), 4K TV is a welcome sight!

        See figure 4 for a number of studies of average vision by age…

        http://www.oculist.net/downaton502/prof/ebook/duanes/pages/v5/v5c051.html

  • Mark Rejhon

    An interesting test for 4K benefit is computer graphics aliasing — http://www.testufo.com/aliasing-visibility

    Surprisingly, this is actually visible 10 feet away from a 50″ 4K display (or a 24″ 1080p monitor, which has almost the same ppi). The line aliasing/antialiasing artifacts translate into a bead-type artifact that shows up at wide spacing intervals, which then becomes visible from a farther-than-expected viewing distance.

  • Mahmoud El-Darwish

    Correct, but there is a missing parameter: field scanning by the viewer! Since it increases with larger screens, it affects the experience. Note how the benefits of 4k are derived from a larger panorama. This mimics the movie-theater experience of sitting overly close to the screen, which we know to be counterproductive!

  • jason_darrow

    I think most people here are confusing increased resolution with improved color representation and brightness. As for resolution, I have 20/20 and am unable to discern the difference between 720p and 1080p on a 46″ screen at 2 meters distance. I have absolutely no interest in 4k, but am very interested in better and brighter colors.

    • John

      4k is not only better resolution; the extra pixels and updated standards yield more colors.

      A 4k panel also helps 1080p content look better through upscaling, especially if the content was originally 4k (although it does not have to be).

      1080p TVs use HDMI 1.4, which for 4k is limited to 30 frames per second (or 30Hz) and 10.2Gbps of TMDS throughput.

      Some 4k TVs on HDMI 1.4 only manage 24Hz for 4k content. HDMI 1.4 can deliver 4k content, but it will be at 24/30fps and, with some luck, most likely with 8-bit 4:2:0 color.

      4K panels with HDMI 2.0 support 50/60 fps, with 12-bit 4:2:2 (or 4:4:4, or 4:2:0) color reproduction (chroma subsampling) and 18Gbps of TMDS throughput. The Rec.2020 color space is coming soon, I hope.

      The maximum color depth (bits/px) remains the same on HDMI 1.4 and 2.0 at 48, but the pixel clock rate improves from 340MHz to 600MHz.

      Also look for HDCP 2.2 (the copy protection standard) in your 4k TV to allow content across connected devices. A built-in HEVC decoder should also be present so that you can get Netflix 4k (you need a broadband connection of at least 15 Mbps).

      Check whether the 4K TV supports a gamut that fulfils the Rec.2020 color space.

      Check that your TV is upgradeable, as the HDMI 2.0 standard is still evolving and being adopted differently by different TV manufacturers. HDMI 2.0 supports the Rec. 2020 color space, but that does not imply your TV manufacturer adopted it (a firmware upgrade may add it, rather than new hardware, if the TV already has HDMI 2.0). 4K Blu-ray Disc players may arrive December 2015.

      So, unless you are an Apple fan (who rejects technology standards and uses yesterday’s technology at tomorrow’s prices), there is no reason to claim that 4K TVs have no benefit to the viewer.
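
      As a rough sanity check on those pixel-clock figures: the standard CTA-861 UHD timing places the 3840×2160 active pixels inside a 4400×2250 total raster, so the required pixel clock works out as below. This is a back-of-the-envelope sketch (assuming 8-bit RGB/4:4:4 and those blanking totals), not a full HDMI bandwidth calculation:

      ```python
      def pixel_clock_mhz(h_total, v_total, refresh_hz):
          """Pixel clock for a video timing: total raster (active + blanking) times refresh rate."""
          return h_total * v_total * refresh_hz / 1e6

      print(pixel_clock_mhz(4400, 2250, 30))  # 297.0 MHz: fits under HDMI 1.4's 340 MHz limit, hence 4K at 24/30Hz
      print(pixel_clock_mhz(4400, 2250, 60))  # 594.0 MHz: needs HDMI 2.0's 600 MHz limit for 4K at 50/60Hz
      ```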

  • Jens Emil Ravn Nielsen

    I know you posted this a long time ago, but I just want to point out that watching 4k content on a 1080 screen won’t look as nice as 1080 content on a 1080 screen. Just like upscaling 1080 content on a 4k screen won’t look as nice as 4k content on a 4k screen, downscaling/sampling 4k to a 1080 screen just won’t look as nice.

    • John

      That is incorrect.
      2160p content on a 1080p panel would look better than 1080p content on a 1080p panel.
      1080p content on a 2160p panel would look better than 1080p content on a 1080p panel.
      2160p content on a 2160p panel would look better than 1080p content on a 1080p panel.
      http://www.red.com/learn/red-101/upscaled-1080P-vs-4K

      • Chris Heinonen

        The first assertion is almost certainly wrong, the second is up for debate, and the third is not in dispute.

        Downsampling 2160p to 1080p isn’t going to look better than 1080p because there is a 99% certainty that the scaler inside the TV is worse than the one being used to master the content. Studios are already taking a 4K master and converting it down to 1080p, and when they do that, they have someone doing that job to make sure there aren’t jaggies, moire, and other artifacts when they downsample. If they are there, they can tweak the master to eliminate them. A scaler in a TV can’t do this, and those artifacts are going to be present.

        1080p upsampled to 2160p is going to depend on the scaler. What is going to happen is that it’s going to look smoother and often softer, since keeping it sharp results in it either looking blocky, or adding edge enhancement artifacts. The 1080p content on a 1080p screen might look better because it can do the 1080p master at pixel-for-pixel accuracy. The 2160p upscaled one will be somewhat softer when you look. On many things this will be fine, but hairs and other fine textures will likely not be as detailed anymore.

        Yes, 2160p will look better than 1080p if you’re close enough and the sources are of equal quality. If you’re too far away to see the difference, or the 4K source is heavily compressed as almost all of them currently are, then you won’t see a difference. Resolution is just one factor in image quality, not the end-all, be-all.

        • John

          Sigh.

          You did not read the link I posted, did you? I have not stated claims, I stated facts and provided the source.

          You, however, think that your opinion can graduate to fact merely because you posted it. ***roll eyes***

          1) 2160p content on a 1080p panel would look better than 1080p content on a 1080p panel. Why is this a fact? Because…wait for it…wait for it… wait more… here goes…
          Downscaling somewhat reduces the quality, but the end result is still better than 1080p content due to the bit rate of the 4k content. Not dependent on Chris Heinonen’s subjective opinion.

          For example, 4k 20 Mbps downscaled to 1080p will look better than 1080p 3Mbps.

          2) 1080p content on a 2160p panel would look better than 1080p content on a 1080p panel.

          Go read my link again, see the section UPSCALING TO 4K. Come back when we can deal in fact.

          3) 2160p content on a 2160p panel would look better than 1080p content on a 1080p panel.

          And then you go and offer the old 20/20 viewing distance rubbish (which is based on a 1980s fallacy) to try and say this is not always the case. Pathetic.

          You even contradicted yourself, see:

          >>>Yes, 2160p will look better than 1080p if you’re close enough and the sources are of equal quality.

          Then:

          >>>>Resolution is just a factor in image quality, not the end-all, be-all.

          ——-

          So you say that it depends only on resolving pixels and then say resolution is not the only factor. Have your cake and eat it?

          It’s clear you have no idea what 4K is, nor what HDMI 2.0 (Rec.2020, higher throughput, more frames per second, better sound, more colors, etc.) is. You have no idea that what we see is a reconstruction by our mind based on input provided by our eyes. It is not the actual light (photon particle-waves) received by our eyes. Space is continuous, not discrete pixels. They are already working on 8k TVs, you idiot. If you have some evidence that these engineers and scientists are wasting their time, why not just present it? Why offer your bloody masquerading of ignorance as fact when you have no idea what you are talking about?

          Wake up, Chris. People like you see no use in 4K, yet you admit that you will at some stage buy a 4K panel. Unless you tell me now that you will never, ever buy one, will you? Even when they are cheaper than 1080p and no more 1080p TVs are available, will you still refuse to buy a 4K TV and continue to spout your ignorance that there is no use for 4k unless you sit close to the TV? FFS.
          I want you to answer my question. I guess you will not, because you will be too scared that you will be exposed as a fraud.

          You are merely waiting for the early adopters to buy 4K TVs so that your conservative, uninformed, ignorant mind can adopt it later.
          People like you should be ashamed to call yourselves technology reviewers; you are stifling technological progress. You pass off your ignorance as knowledge, and you do it with confidence.

          Pathetic!

          I want you to answer my questions.

          • Chris Heinonen

            This will be my last reply to this thread as I’m not going to continually engage with name calling trolls. Nor will they be allowed to post. You can disagree, and you can do it in a civil manner.

            Downscaling somewhat reduces the quality, but the end result is still better than a 1080p content due to the bit rate of 4k content.

            If you give one signal a much larger bit-rate then it can look better. However, if you are sending a 4K signal to a 1080p display, you are effectively throwing away 75% of your bitrate. Why? Because that same 4K-to-1080p conversion could have been done beforehand, with a mastering engineer watching to correct the issues I mentioned, and the result distributed as a 1080p master. Now it’s 25% of the bitrate of the 4K signal (assuming we use the same compression technology) and will look identical. If we suddenly have 4K sources with bitrates > 4x what Blu-ray offers, which we do not right now, then this could possibly happen. But there’s still a good chance the TV scaler would introduce other artifacts.

            Go read my link again, see the section UPSCALING TO 4K. Come back when we can deal in fact.

            I’ll just quote the piece for you.

            “Despite these disadvantages, interpolation is the only option when the original content does not have the necessary resolution.”

            “However, attempting to simulate missing pixels comes at a cost; all methods incur some combination of blurring, blocking and halo artifacts”

            “upscaling noticeably reduces the appearance of blockiness and jagged edges, but falls short when it comes to depicting more detail”

            As I said, you’re trading in sharpness here because of the scaling and interpolation used to convert 1080p to 2160p. You might prefer a 1080p display because it wouldn’t show these artifacts, or you might prefer the 2160p display because you like the smoothness of the interpolation. It isn’t clear cut.

            It’s clear you have no idea what 4K is, nor what HDMI 2.0 (Rec.2020, higher throughput, Rec.2020, more frames per second, better sound, more colors, etc) is.

            I’m afraid you are mistaken here, as you mention Rec. 2020 a lot in other posts as well, and HDMI 2.0. So I’ll clear those things up:

            – Rec. 2020 is a future content standard. It includes a larger color gamut, the potential for larger bit-depths and less chroma subsampling, and support for 4K and 8K resolutions.

            – No 4K display on the market today supports Rec. 2020. None. Zero. So bringing in Rec. 2020 as a “benefit” of 4K is pointless because it doesn’t exist in the world. It’s like using Ethernet support as a reason why HDMI is awesome. HDMI has the option for Ethernet support, but since no one uses it, it’s not a practical consideration.

            – No current display is going to get a firmware update to Rec. 2020 like you mentioned. This is because no display out there supports the Rec. 2020 gamut. The display that has the largest gamut that I’ve measured, the HP z27x monitor, has a gamut that is larger than DCI/P3 but does not come close to Rec. 2020 still. This is a $1,500 production monitor used at places like Dreamworks and Pixar. It’s not a consumer display, and a consumer display won’t get there. If you’re buying a 4K display on the promise of Rec. 2020 then that’s a mistake, because any display you are buying now won’t support all of Rec. 2020.

            – If you buy a display or other product now that has an HDMI 2.0 chipset, it is likely missing HDCP 2.2. Current chipsets on the market allow for full HDMI 2.0 bandwidth, or HDCP 2.2, but not both. This is NOT upgradable down the line. The HDCP 2.2 ones lack the bandwidth for an update, and the HDCP can not be added on via firmware.

            People like you see no use in 4K, yet you admit that you will at some stage buy a 4K panel. Unless you tell me now you will never ever ever buy one, will you? Even when they are cheaper than 1080p and no more 1080p TV available, will you still refuse to buy a 4K TV and continue to spout your ignorance that their is no use for 4k unless you sit close to the TV?

            Go back and read everything I’ve written, and articles beyond this. I’ve never said that I see no use in 4K. I’ve said that 4K is simply a single component of a display, and that just because one TV is 4K and another is not doesn’t make the 4K set the better TV. Last year’s plasma TVs were not 4K yet still looked better than any 4K display. Why? Because they had better black levels, viewing angles, and color. Things that you can see on all your content, not just brand new 4K content, and from any distance.

            The next TV I buy will probably be a 4K one, but because I’ll want the best TV and it will happen to be 4K. The best tech demo at CES this year was not the 4K demos but the Dolby HDR demos with larger color gamuts, bigger contrast ratios, and more dynamic range and bit depths. On a 42″ 1080p set you could easily see the difference compared to a standard 42″ 1080p set, even from 15′ away (I tried). Those are all things that we interpret to be more important to an image than resolution.

            People like you should be ashamed to call yourself a technology reviewer, you are stifling technology progress. Your ignorance begets knowledge, but you do it with confidence.

            If we all are so bad at what we do, then it should be really easy for you to do a better job and put us all out of business.

      • Jens Emil Ravn Nielsen

        Your link doesn’t have anything to do with downscaled resolution. At least, I didn’t read anything that mentioned it. I stand by my original comment. By the way, I wasn’t trying to be contentious, and I am still not, but resolution is resolution. I am not talking about color fidelity.

  • Sakanawei l

    I’m sure the writer of this article is one of those many “tech gurus” that back in the day claimed things like “you don’t need an expensive 1080p TV, 720p is just fine because you won’t see any difference between these two resolutions”.

    • Chris Heinonen

      I didn’t claim that back then, because I wasn’t writing about AV then, but I will now. Look at the reviews here for TVs and you’ll find two Vizio 32″ displays from this year. One is a 720p set and one is a 1080p set. Having reviewed and watched both extensively, the 720p set is the one I’d recommend 99% of the time. It has deeper blacks, better contrast ratios, and more accurate colors. Having compared them side-by-side, using a 1080p signal from a Blu-ray player going to both, once you are more than 3-4′ away you cannot see the extra resolution. You can easily see the better blacks and color from further away.

      Resolution is a single factor in an image. You also have to take display size and viewing distance into account. If you’re using the Vizio as a PC monitor from 2′ away, then you’re in the 1% I’d suggest get the 1080p version. The same goes with 4K. For many people, they won’t sit close enough or have a large enough screen to really see the difference. If they can buy a 1080p set for the same price, but with better overall picture quality (color, contrast ratio, viewing angles), then the image will look better.

      • minimalist1969

        Those who argue about TV resolutions remind me of the people who argue about absurd megapixel counts on cameras, or the ones who think that having true “1080p” on a 5″ phone screen actually makes a difference. The notion that higher numbers equal higher quality is something electronics companies have used for years to seduce people into buying next year’s model. It’s easier to advertise “40 MP” or “4K Ultra HD” than to explain to people the complex web of factors that go into real image quality.

  • Pingback: Samsung PN64H5000 Review: The Last Plasma | HD Guru

  • Will

    Standing even 20′ away, the UHD is more immersive than any standard HD TV, with nothing more than a Blu-ray player and a 20BB/sec cable or better. If retail stores have anything the consumer doesn’t, it is for sale. They are not hoarding anything. Getting the consumer ever better stuff is how they stay in business. What is there to hold back? Also, I live in a Google Fiber hood. Google hosts Netflix servers. The 4k stream is great and really beats everything for 4k, if you can have it. I am sure Fios and Uverse are awesome also. There are tons of great holiday deals on UHD, go get some.