4K Calculator – Do You Benefit?

As 4K displays are now coming out, a common debate online has been about the benefits of 4K compared to HD in a normal viewing environment. Many people say that 4K looks amazingly sharp and it’s impossible not to see the difference, while many others say that in a regular living room, with a regular-sized screen, you really won’t see much of a difference.

Update July 3rd, 2013: Read more about HiDPI and 4K Display use cases and benefits. My new article compares smartphones, tablets, laptops, monitors, TVs, and projectors to see where high resolution provides the most visible benefit.

Update December 1, 2014: Revised text as 4K/UHD TVs are now available everywhere.

Most people who have seen 4K to this point have seen it at stores with dedicated displays for it, not at home. These are often set up so you can only stand a few feet from the screen, which makes 4K look great, as it really does look amazing from 3′ away. They also use special content to show off 4K, since the only 4K sources right now are Netflix and custom sources from Sony and Samsung. Because Netflix is unreliable for a showroom floor and most companies don’t want to use a media player from a different company, you see custom demo loops that are nothing like what you will watch at home.

Now that you can get a 40″ 4K TV from Samsung for only $600 or a 50″ from Vizio for $1,000, UltraHD TVs are mainstream. Since most manufacturers are making their top-of-the-line sets 4K, those who want all the best features have no choice but to go with 4K at this point. Plasma no longer exists, and OLED is still expensive and only available from LG, so you are almost certainly purchasing an LCD. If you’re looking for a TV that isn’t top-of-the-line, will you see a difference with 4K in your house, or should you put your money into other features, or just spend less?

Carlton Bale has a chart that shows when you might be able to see the difference between 4K, 1080p, and 720p, but I decided to make my own 4K calculator that gives you a little more detail. Enter your screen size, your distance from the screen, and your vision, and it will generate the following numbers:

  • The pixels per inch for 480p, 1080p, and 4K signals based on the screen size.
  • The maximum resolution that you can discern with your vision from your distance to the display.
  • The ideal 4K screen size for your distance, which is the smallest screen on which you could resolve every pixel.
  • The ideal distance from a 4K display of your specified size, which is how close you need to sit to resolve every pixel.
  • The amount of extra resolution that would be visible on a 4K display compared to a 1080p display, based on your screen size and distance. The maximum is 300%, and the minimum is 0%.

I’ve also created a chart, seen below, that gives you a quick look at the ideal viewing distance for a 16:9 display based on size and resolution. It assumes 20/20 vision, and the viewing range for each resolution is the span of distances at which you can see more detail than the next lower resolution but are not yet close enough to see the extra detail of a higher-resolution screen. So with a 50″ 1080p display, if you are closer than 9′9″ you will see more detail than a 720p display could show, but if you are more than 6′6″ away, you couldn’t see any more detail on a 4K display. A code sketch of the arc-minute math behind these numbers follows the chart.

Ideal Distances Chart
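For anyone who wants to check these numbers, below is a minimal sketch of the arc-minute math the chart and calculator appear to be built on. It assumes a 16:9 panel and the one-arc-minute criterion for 20/20 vision discussed below; the function names and the improvement-percentage model are illustrative, not the calculator’s actual code.

import math

ARCMIN = math.radians(1 / 60)  # one arc minute: the assumed 20/20 acuity limit

def screen_dims(diagonal_in):
    """Width and height, in inches, of a 16:9 screen of the given diagonal."""
    return diagonal_in * 16 / math.hypot(16, 9), diagonal_in * 9 / math.hypot(16, 9)

def ppi(diagonal_in, horizontal_px):
    """Pixels per inch for a given horizontal resolution on this screen."""
    width, _ = screen_dims(diagonal_in)
    return horizontal_px / width

def ideal_distance_in(diagonal_in, vertical_px, acuity=1.0):
    """Farthest distance (inches) at which one pixel still subtends the acuity
    limit; acuity=1.0 is 20/20, 0.75 is 20/15, 0.5 is 20/10."""
    _, height = screen_dims(diagonal_in)
    return (height / vertical_px) / math.tan(ARCMIN * acuity)

def ideal_uhd_diagonal_in(distance_in, acuity=1.0):
    """Smallest 16:9 4K screen whose pixels you can fully resolve from here."""
    height = distance_in * math.tan(ARCMIN * acuity) * 2160
    return height * math.hypot(16, 9) / 9

def uhd_improvement_pct(diagonal_in, distance_in, acuity=1.0):
    """One plausible model of the extra *visible* resolution of 4K over 1080p,
    capped at the 300% a full 4K panel offers (4x the pixels of 1080p)."""
    _, height = screen_dims(diagonal_in)
    lines = height / (distance_in * math.tan(ARCMIN * acuity))
    vis_4k, vis_hd = min(2160.0, lines), min(1080.0, lines)
    return max(0.0, (vis_4k ** 2 - vis_hd ** 2) / vis_hd ** 2 * 100)

# Reproduce the 50-inch row of the chart for 20/20 vision:
print(ideal_distance_in(50, 720) / 12)   # ~9.75 ft: inside 9'9" a 1080p set beats 720p
print(ideal_distance_in(50, 1080) / 12)  # ~6.5 ft: inside 6'6" 4K starts to help
print(round(uhd_improvement_pct(50, 10 * 12)))  # 0 -> no visible 4K benefit from 10 ft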

This calculator does make assumptions about vision and arc minutes, but the people I talked to said this was as good an assumption about human vision as anything else. If you think it is off, you can easily adjust the vision setting to 20/15 or 20/10 to make it more accurate for you. It also lets you run the numbers for devices like cell phones and tablets, which are often held much closer to your face, and compare them to a 4K display that is much further away. One more assumption is that you have a 16:9 screen, though other aspect ratios may be supported in the future.

Many reviewers have tried to compare 4K to 1080p to see if they notice a difference. David Katzmaier pulled in a panel, showed them the same content on a 4K set streaming from Netflix and a 1080p set fed by Blu-ray, and none of the viewers could pick out the 4K display. HDTVTest did a similar test comparing 1080p to 4K, but used their own custom content instead of streaming content. In their testing, people did notice the difference from a reasonable distance, though unless you are shooting your own 4K content you can’t test this yourself.

I talked to other reviewers who tested projectors and could instantly switch between a Sony 4K projector and a JVC X700R on a 120″ screen. They could barely notice the difference in resolution even with content taken directly from a RED 4K camera. When they did notice, they still preferred the JVC image because it had better blacks and a better contrast ratio, and the eye notices those more than resolution. With any display, resolution is only a single factor in how good the picture looks. Knowing how much of that increased resolution you might actually see can help you decide which TV will work best for you.

 

4K Calculator (interactive form; results appear once you enter your numbers)
Inputs: Screen Size (inches), Screen Distance, Vision
Outputs: 4K vs. 1080p improvement %, ideal resolution, the distance at which 1080p benefits begin, the distance at which all 1080p pixels are visible, the ideal 4K distance for your screen size, the ideal 4K screen size for your distance, and the pixels per inch of your screen at 4K, 1080p, and SD.

Hopefully this will help you determine whether you will see much benefit from 4K in your situation, and make it easier to compare devices like a cell phone or tablet that you hold close to your face with a TV that sits across the room from you.


  • Pingback: Review: Seiki SE50UY04 4K Ultra HDTV (it’s $1,400!) | HD Guru

  • azeem

    I’ve stood in front of the Sony 4K display, and I have a 60″ Panasonic ST60 at home. At 6 feet, the Sony looks dramatically nicer than my Panny.
    Maybe you can’t see individual pixels, but the overall picture difference was stunning to me.

    • Chris Heinonen

      From the chart, it should look great at 6′. The question is, how many people actually sit 6′ away from a 60″ TV? I don’t know any who do. When I surveyed my family and friends the other week, most of them sat 9-10′ from a 50″ display on average, where the difference wouldn’t really exist.

      • azeem

        I sit about 15 feet away from my 60″, so I may never see any difference, but what I am trying to say is that looking at the picture on the 4K, there was no question that it was much sharper and exhibited dramatically more detail than my HDTV could ever reproduce.

        It would be interesting to see a similar chart comparing what happens on a laptop with a Retina-class display versus a standard display. I’d venture to say you have to move quite a bit away from the keyboard on a standard display before you could not tell the difference between the two.

        • Chris Heinonen

          Right, and the calculator is designed to show that it is sharper, but only when you’re within a certain range of distances. This way you can measure your viewing environment, think about the TV size you might get, and determine if you’re likely to see a difference or not.

          I’m going to add a follow-up article that takes some common screen sizes (iPad, iPhone, laptop, LCD monitor, TV) and the common resolutions for them, and shows the distances at which you can tell a difference. The iPad/iPhone example is brought up a lot, but with that screen size and resolution, you don’t see pixels unless they are 10″ or so from your face, which is a more likely distance for an iPhone than for a 4K HDTV. PPI matters, but as a function of distance more than anything.

          • JP

            I sit 10 feet from my tv. In your opinion, which Sharp 70 should I purchase between these two models:

            LC70LE660U or

            LC70UD27U? Agonizing, any advice would be greatly welcomed.

          • http://referencehometheater.com/ Chris Heinonen

            Well I can say I wouldn’t get the LC70UD27U. For the same price you can get the 70″ Vizio M-Series which has a better backlight system and better contrast ratios. If you don’t like Vizio, you can get the 65″ Samsung 7100 series for around the same price and have better smart apps and a better picture. If I’m getting a 4K TV in that size, those are the first two I’d look at right now (and two that I’m planning to review soon though in 55″-60″ sizes more likely).

            For a 1080p set the Sharp seems fine, though I’ve not reviewed it so I can’t say much for sure. However, the 4K ones will look better not because of the resolution but because they have dynamic lighting systems that will improve contrast ratios.

          • JP

            I know the Sharps aren’t the best out there, but the price on the 4K 27U is $1,999.00 through a friend who is the GM at a big box retailer, so I thought that would be a no-brainer.

          • JP

            $1,499.00 for the 1080p.

          • http://referencehometheater.com/ Chris Heinonen

            That’s not that much of a savings, and $200 more gets you the Vizio M-Series which has a much better overall picture. Speaking for myself, there’s no way I’d pick the Sharp over the Vizio to save myself 10% on a TV I plan to have for a long time.

      • Mike McG

        Everyone seems to consider only TV size and viewing distance, but misses the fact that average human vision is closer to 20/14, not 20/20. 20/20 is simply the cut-off for when your vision is considered good enough that you don’t need glasses. When you’re talking about 20/14 vision (average), or 20/10 vision (twice as sharp as 20/20), the equation changes. Someone with 20/10 vision can resolve twice the resolution of someone with 20/20 vision, and will benefit more from higher resolutions at the same distance.
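        (To put rough numbers on the vision point: the small sketch below simply scales the one-arc-minute 20/20 criterion by the Snellen ratio, which is presumably how the calculator’s vision setting works; the function names are illustrative.)

        import math

        def acuity_arcmin(snellen_denominator):
            # 20/20 is taken as one arc minute; 20/10 resolves half that angle.
            return snellen_denominator / 20.0

        def full_resolution_distance_ft(diagonal_in, vertical_px, snellen_denominator=20):
            # Farthest distance at which every pixel of a 16:9 panel is still resolvable.
            height_in = diagonal_in * 9 / math.hypot(16, 9)
            angle_rad = math.radians(acuity_arcmin(snellen_denominator) / 60)
            return (height_in / vertical_px) / math.tan(angle_rad) / 12

        print(full_resolution_distance_ft(60, 2160, 20))  # ~3.9 ft on a 60" 4K set with 20/20 vision
        print(full_resolution_distance_ft(60, 2160, 10))  # ~7.8 ft with 20/10 vision: double the distance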

    • No

      There is a limit on the angle of view for comfortable viewing. If you have more resolution than what you can see when you match that angle, it’s useless resolution. You get dizzy with some material and want to throw up. That’s the typical response of the viewer.

      Computer displays are different, because we can work on a small part of the screen at a time, so we can put large displays very close.

      The diagonal of the screen is a safe bet for a maximum angle of view: a 50″ screen at 50″ distance, and so on. At that distance, you can watch an action film without vomiting. The same applies to theaters; the good seat is at a distance equal to the diagonal, not up front with the idiots. One can check the THX design guidelines. Theaters want to sell more tickets though, so they will not limit their sales to seats with good video and good audio. They hope you will be fine after 90 or 120 minutes. TV is a different thing, though; people watch TV or home cinema for hours at a time.

      At a good distance, most people can see all the pixels of a 1080p display and a little more than that. Give them 2x or 4x or 8x the resolution and it will not even affect their experience. They will simply not notice it. Meanwhile, production cost, distribution cost, and other costs are increased, and data has to be compressed more. Digital distribution cost is a huge factor for corporations, so they will try to give you 4K without additional cost over 2K, and that will certainly come with more artifacts. Pressed by the cost factor, broadcast 1080p is not as good as it should be today.

      Of course, since there are people who have their heating set to 30°C in the winter and their cooling to 15°C in the summer, we will also have people with 50″ screens sitting at a distance of 20″ so they can “appreciate” all the resolution.

      In the real world, TV or projection screens have to be 100″-120″ to actually make their 1080p visible.

  • Surge

    This is FANTASTIC!!
    It confirms what others have said, on CNET, as well as on Hometheaterreview.com (where the authors claimed NO difference for 1080P content on a JVC 1080P projector vs the Sony 4K projector on a 110″ screen from 8-10′ away).

    For those who claim that the 4K still looks better… that’s probably because it’s brighter, or has better contrast, or better color… You are not comparing apples-to-apples resolution only.

    A 4K projector is 4-5X more expensive than a very good 1080P projector. Unless your screen size is >120″ and you’re sitting 10′ away or closer (approx., better to use the calculator above)… you really won’t see much of a difference in resolution.

    • Mike McG

      Ever notice the authors on CNET are wearing glasses? As a person with better than 20/20 vision (which is the population norm), I see a MASSIVE difference between 4K and 1080P from the same viewing distance. See for yourself: select 20/10 or 20/15 in the calculator.

      • http://referencehometheater.com/ Chris Heinonen

        I’d suggest reading Katzmaier’s take on the recent Panasonic 4K set, where he compares it directly with a 1080p set of the same size, using the same 4K content, and the closest THX recommended viewing angle:

        http://reviews.cnet.com/flat-panel-tvs/panasonic-tc-l65wt600/4505-6482_7-35827200.html

        Not sure why you’re bashing glasses. So we shouldn’t believe people that make sure their vision is corrected to be at least average, if not better?

        • Mike McG

          I’m not bashing people with glasses, I’m just tired of the misleading and inaccurate reviews. I’ve seen the TVs myself, side by side, with the same content vs 1080p, in person, and the difference to me is night and day. If you make the mistaken assumption that 20/20 is “average”, then you’re going to come to false conclusions.

          • Surge

            20/20 IS Average! Research what the term means.

          • Mike McG

            Thanks, I have! You should too! Let me get you started with studies of visual acuity by age.

            http://www.oculist.net/downaton502/prof/ebook/duanes/pages/v5/v5c051.html

            By definition, 20/20 vision is nominal, or “normal”. Check Wikipedia… or anywhere really. This does not imply in any way that it is a statistical average in a given population. In fact, it’s not even a scientifically chosen measure! It just so happens that Snellen, who invented the chart, had an assistant who could read a standard lead-size print from 20 feet. As he was considered to have good eyesight, that became the starting point for the test, but it was later determined (scientifically) that many people see far better than that.

            Today, we know that a normal healthy human eye sees at about 20/12 to 20/16 from childhood through middle age. By the age of 60 or 70, visual acuity drops to 20/20. There are people with better vision, and people with worse.

            Regardless of what “average” vision is, or even what “normal” vision is, there are people who see twice as sharply as those with 20/20 vision. So making generic claims about whether or not the “average” person can see a difference is nonsense. First of all, 20/20 is not the average: thanks to glasses and contacts, about 75% of people of all ages see better than 20/20. And even if 20/20 were average, some people would still see twice as sharply! We don’t all have the same vision, we don’t all have the same size TV, and we don’t all sit the same distance away from it. And that’s why the calculator has 3 parameters, not 2.

          • John

            Nope. Norm vs average. Wake up dude.

        • John

          Chris,
          Your article is rubbish.
          Thankfully we rely on science and not your subjective opinion that has no basis or support in science.

          You are assuming all people have 20/20. That is the norm and not the average. The majority of people have better vision than 20/20.
          The average is 20/14, so 4K matters at normal viewing distances.
          What we see is a reconstruction by our mind based on input provided by our eyes. It is not the actual light (photon particle-waves) received by our eyes.
          The human eye can resolve the equivalent of 52 megapixels (assuming a 60° angle of view, 20/20 vision). However, if we consider a 20% off-center view we perceive 1/10 as much detail. At the periphery we detect larger-scale contrast and minimal color. Depending on eyesight we perceive detail more akin to 5-15 megapixels. We also do not remember pixel by pixel, but textures, color and contrast on an image-by-image basis. The memorable image is really one that is prioritized based on interest. (We see faces, look for patterns, often false patterns.) You should consider asymmetry (that is why you need to place your TV lower than your line of sight, in a direction away from your nose). Also consider low light, because you start losing color and perceive monochrome. Also consider gradations, where enlarged detail becomes less visible to the human eye, compared to a camera where it’s the opposite.
          4K (2160p) is merely 8.29 megapixels. 1080p is only 2.07 megapixels. No wonder they are working on 8K already. There is a difference between the content resolution/detail and what your mind perceives.

          People that knock 4K content/panels due to pixel resolution based on distance, or because 4K is not 4K (3840×2160 vs 4096×2160), are merely projecting their ignorance with confidence.
          3840×2160 scales better from 1080p (1920×1080); we have 16:9 TVs, not 19:10 for 4096×2160. But you do get the detail of 4096×2160 on your 16:9 screen, and that is why you have a black bar at the top and bottom of your screen.

          Wake up Chris, as far as we know space is continuous, not discrete. More pixels means not just better resolution, it’s better color and detail too.

  • Surge

    How do you translate the findings for a 2.35:1 screen?

    • Chris Heinonen

      You’d need to base it on a 16×9 screen size of the same width. So a 109″ 2.40:1 screen is the same width as a 116″ 16:9 screen. I’ll see if I can try to make an option for this in the future.

  • No

    The Limits of Human Vision
    Michael F. Deering

    http://www.swift.ac.uk/about/files/vision.pdf?

  • No

    And the THX setup guidelines:

    http://www.thx.com/consumer/home-entertainment/home-theater/hdtv-set-up/

    These are interesting because they are designed for comfortable viewing. You can’t simply change the angles to 2x or 4x (the distance to 1/2 or 1/4) to accommodate 2x or 4x resolution.

    In effect, increasing resolution does not make much sense beyond 2K because of the comfortable-viewing factor that limits the angle, and the fixed resolution of the human eye. I would rather invest in higher bitrate instead, because this will provide a true advantage: a more natural image with fewer artifacts.

    • Mike McG

      I’m sorry, but you’re wrong. An increase in resolution to match the capabilities of the human eye makes a huge difference. Human visual capabilities average from 20/10 in young people, to 20/14 on average, to 20/20 at age 55 as our vision slowly degrades. At 20/10, even a moderate 40″ TV sees a nearly 300% benefit at 4K from 6ft. At 55″, 4K isn’t enough, as 5K resolution would be ideal. When you’re doing calculations, base it on your vision, but don’t assume 20/20 is average, because it is not.

    • Tom Ldn

      THX actually recommend a 40 degree viewing angle, for a fully ‘immersive’ experience. This means you have to sit quite close to the TV – most people are more comfortable with a 30 degree viewing angle. If you sit as close as THX recommend, a 4K UHD will make sense, but usually Full HD resolution will be sufficient. You can see a comparison and optimum viewing distances in this infographic http://www.kagoo.co.uk/blog/how-to-find-the-right-size-tv-infographic/

  • Naveen

    The equation for optimal 20/20 viewing distance in feet (in decimals) is:
    =(SQRT(x^2/(16^2+9^2))*9)/(1080)*60*360/(22/7)/2/12

    * where x = screen’s diagonal length
    * where aspect ratio isn’t 16:9, change 16 & 9 in the above equation with appropriate aspect ratio
    * 1080 is the resolution used in the above equation; change if required to the appropriate resolution
    * the last part of the equation, ‘12’, converts the final figure into feet. For applying this equation to a smaller device, say a smartphone or a tablet/laptop, omitting this 12 will show results in inches.
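    Rendered out of spreadsheet syntax, the same calculation might look like the sketch below (the variable names are mine, and math.pi stands in for the 22/7 approximation):

    import math

    def optimal_2020_distance_ft(diagonal_in, vertical_px=1080, aspect=(16, 9)):
        # Distance at which one pixel subtends one arc minute (20/20 vision).
        w, h = aspect
        height_in = math.sqrt(diagonal_in ** 2 / (w ** 2 + h ** 2)) * h  # SQRT(x^2/(16^2+9^2))*9
        pixel_in = height_in / vertical_px                               # /1080
        distance_in = pixel_in * 60 * 360 / (2 * math.pi)                # *60*360/pi/2
        return distance_in / 12                                          # /12 -> feet

    print(optimal_2020_distance_ft(50))        # ~6.5 ft for a 50" 1080p screen, matching the chart
    print(optimal_2020_distance_ft(50, 2160))  # ~3.25 ft for the same screen size at 4K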

    • Chris Heinonen

      That appears to be the Snellen formula, which is what I use to generate this data. Other people support other formulas, and hopefully I can update the calculator to support multiple formulas in the near future. Thanks.

      • Mark Rejhon

        One interesting consideration is the compression variable. Compression artifacts are much bigger than pixels. At 4K, compression artifacts are one-quarter the size they are on a same-size 1080p display.

        When using more extreme compression ratios (keeping bitrate-per-resolution constant), the viewing-distance benefit actually increases, to the point where people have reported noticeable differences more than 10 feet away from a 4K display.

        With streaming becoming the default choice for movie delivery at 4K (instead of physical media), compression artifacts are noticeable from 10 feet away from a 50″ display, and 4K can actually look significantly better (e.g. 1080p @ 4Mbps versus 4K @ 16Mbps, keeping proportional bitrate per pixel).

        It is an interesting “fudge factor” variable to consider…

        • http://referencehometheater.com/ Chris Heinonen

          You’re treating that higher bit-rate as an either/or decision, that to get it you’d need to also move to 4K from 1080p. As we’ve seen from Netflix and their 4K streams, which are apparently a paltry 3 mb/sec, that’s unlikely to happen.

          You could also take an increase in bit rate, and the better compression of HEVC, and apply that to 1080p content. If you’re already at the viewing limits for pixels, you could then have fewer compression artifacts and add in greater bit depth to both luma and chroma content. That greater range would give us fewer banding compression artifacts and be more noticeable than any increase in resolution from a distance.

          • Mark Rejhon

            For better or for worse, it’s disappointingly relevant since content makers are getting more into streaming, and it’s looking less likely that consumers are going to flock to physical 4K formats rather than streaming 4K formats, especially as net connections get faster. As smart people like you and me already know, eventually 4K will be a small premium over 1080p, much like how 1080p became a small premium over 720p, and thus the reason to avoid 4K becomes smaller.

            The rationale seems to be:
            — The world is slowly going to streaming;
            — The premium of 4K will eventually become fairly small;
            — Content makers are (theoretically) more likely to give you higher bitrates for 4K than for 1080p, just to push their premium product. Eventually, it could potentially become easier to find 15Mbps+ and 20Mbps+ streams in 4K format than in 1080p format. Depends on which direction the market goes in.
            — At some point in the future, a critical mass occurs and streaming potentially becomes the primary driving force for the sales of 4K TV’s.

            P.S. 3Mbps? That must be the minimum rate in a congested Internet area. My router’s bitrate meter measured twice that 3 Mbps rate just for my 1080p streams (e.g. for the movie “Hugo”, SuperHD stream, top bitrate). It’s a 35Mbps VDSL line hosting a Netflix Open Connect appliance locally, so my streams tend to always max out here in Toronto. When I did some bitrate measurements, the movie Hugo measured 6 to 7 megabits per second from my router (probably including a bit of overhead). It looked far better compressed than my television channels. Certainly Blu-ray is superior (I try to get them when I can too), but I’ve been fortunate to be on one of those ‘good Netflix Internet connections’ that consistently delivers the better streams. People buy TVs for getting cable too, yet around here streaming can often be better than television quality. It’s a sad state of affairs, really, as I understand the lovely look of an H.264 stream from Blu-ray at 25 megabits per second.

  • ronniejr

    I have a 2007 model Toshiba. It still has a pretty good picture, but when I went to see the Sony 4K I was blown away. I have 20/15 vision. You have to be right on top of the TV to see any kind of pixels, but I do believe it makes a difference. I have a 1080p 5″ phone and my wife has a 4.7″ 720p phone, and I can tell the difference between the two as far as resolution. I don’t care what Steve Jobs said; some people with near-perfect vision can distinguish the resolution even at that high of a PPI. I am definitely going to be getting a 4K Sony in the near future. Not just because of the pixel density, but the color and contrast are so vibrant; it all plays into the factors for a person that wants the best possible picture. Although I’ll wait for the next model year before I make a purchase.

    • Mike McG

      I’m with ya ;) I have 20/10 vision, and I still see the pixels on the retina displays. 4K is ideal for me for a 40″ TV. Anything larger, and an even higher resolution would be beneficial.

  • Richard Barg

    Amazingly brilliant. You nailed it.

    • John

      Nope, he assumes all people have 20/20. That is the norm and not the average. The majority of people have better vision than 20/20. The average is 20/14, so 4K matters at normal viewing distances.
      What we see is a reconstruction by our mind based on input provided by our eyes. It is not the actual light (photon particle-waves) received by our eyes.
      The human eye can resolve the equivalent of 52 megapixels (assuming a 60° angle of view, 20/20 vision). However, if we consider a 20% off-center view we perceive 1/10 as much detail. At the periphery we detect larger-scale contrast and minimal color. Depending on eyesight we perceive detail more like 5-15 megapixels. We also do not remember pixel by pixel, but memorable textures, color and contrast on an image-by-image basis. The memorable image is really one that is prioritized based on interest. (We see faces, look for patterns, often false patterns.) You should consider asymmetry (that is why you need to place your TV lower than your line of sight, in a direction away from your nose). Also consider low light, because you start losing color and perceive monochrome. Also consider gradations, where enlarged detail becomes less visible to the human eye, compared to a camera where it’s the opposite.
      4K (2160p) is merely 8.29 megapixels. 1080p is only 2.07 megapixels. No wonder they are working on 8K already. There is a difference between the content resolution/detail and what your mind perceives.

      People that knock 4K content/panels due to pixel resolution based on distance, or because 4K is not 4K (3840×2160 vs 4096×2160), are merely projecting their ignorance with confidence.

      Wake up people, as far as we know space is continuous, not discrete. More pixels means not just better resolution, it’s better color and detail too.

  • Mike McG

    Thanks for sharing this calculator! It’s great to have a tool that can show what improvement real people will see, vs. all these misinformed articles claiming there is no noticeable difference, written by authors in glasses, assuming 20/20 is average vision (it is not). Average human vision is closer to 20/14, with many people in the 20/12 to 20/10 range. I have 20/10, and your calculator tells me on a 40″ 4K TV at 6ft away, I’ll perceive about a 278% improvement over 1080P, which matches what I’ve seen in person with 4K TVs…the picture is SOOOO much sharper! Now, I just need the prices to drop ;)

    • http://referencehometheater.com/ Chris Heinonen

      Except what you are writing makes no sense. 20/20 vision means that someone sees at 20 feet what an average person sees at 20 feet. If 20/14 was actually “average” then the scale would be reset and that would become 20/20.

      • Mike McG

        Hi Chris, no, that is not what 20/20 means. That’s an inaccurate assumption made by most people when discussing 4K. 20/20 is not a statistical average of human vision (at least not until you look at people about 60+ years old). 20/20 is what they consider “normal” vision, meaning that your vision is considered within the normal range. The population average is closer to 20/14, with some people seeing as well as 20/10. At 20/10, I can resolve the same detail at 40ft as a person with 20/20 could see at 20ft. Which means for me, and many others (under the age of 60), 4K TV is a welcome sight!

        See figure 4 for a number of studies of average vision by age…

        http://www.oculist.net/downaton502/prof/ebook/duanes/pages/v5/v5c051.html

  • Mark Rejhon

    An interesting test for 4K benefit is computer graphics aliasing — http://www.testufo.com/aliasing-visibility

    Surprisingly, this is actually visible 10 feet away from a 50″ 4K display (or a 24″ 1080p monitor — almost the same ppi). The line aliasing/antialiasing artifacts translate to a bead-type artifact that shows up at wide spacing intervals, which then becomes visible from a farther-than-expected viewing distance.

  • Mahmoud El-Darwish

    Correct, but there is a missing parameter: field scanning by the viewer! Since it increases with larger screens, it affects the experience. Note how the benefits of 4K are derived from a larger panorama. This mimics the movie-theater experience of sitting overly close to the screen, which we know to be counterproductive!

  • jason_darrow

    I think most people here are confusing increased resolution with improved color representation and brightness. As for resolution, I have 20/20 and am unable to discern the difference between 720p and 1080p on a 46″ screen at 2 meters distance. I have absolutely no interest in 4k, but am very interested in better and brighter colors.

    • John

      4K is not only better resolution; the extra pixels and updated standards yield more colors.

      A 4K panel also helps 1080p content look better through upscaling, especially if the source was mastered in 4K (although it does not have to be).

      1080p TVs use HDMI 1.4, which offers 10.2 Gbps of TMDS throughput and is limited to 30 frames per second (30Hz) at 4K.

      Some 4K TVs on HDMI 1.4 only do 24Hz for 4K content. HDMI 1.4 can deliver 4K content, but it will be at 24/30fps and, with some luck, most likely with 8-bit 4:2:0 color.

      4K panels with HDMI 2.0 support 50/60 fps, with 12-bit 4:2:2 (or 4:4:4, or 4:2:0) color reproduction (chroma subsampling) and 18 Gbps of TMDS throughput. The Rec. 2020 color space is coming soon, I hope.

      The maximum color depth (bits/px) remains the same on HDMI 1.4 and 2.0 at 48, but the maximum pixel clock rate improves from 340 MHz to 600 MHz.

      Also look for HDCP 2.2 (the copy-protection standard) in your 4K TV to allow content across connected devices. A built-in HEVC decoder should also be present so that you can get Netflix 4K (you need a broadband connection of at least 15 Mbps).

      Check if the 4K TV supports a gamut that fulfils the Rec. 2020 color space.

      Check that your TV is upgradeable, as the HDMI 2.0 standard is still evolving and being adopted differently by different TV manufacturers. HDMI 2.0 supports the Rec. 2020 color space, but that does not imply your TV manufacturer adopted it (a firmware upgrade may add it, rather than new hardware, if the TV already has HDMI 2.0). 4K Blu-ray Disc players may arrive December 2015.

      So, unless you are an Apple fan (who rejects technology standards and uses yesterday’s technology at tomorrow’s prices), there is no reason to claim that 4K TVs have no benefit to the viewer.
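      (As a rough cross-check of the throughput figures above, the sketch below estimates TMDS bandwidth from the pixel clock. The 4400×2250 total UHD timing and the 8b/10b overhead are standard HDMI signalling values I am assuming here, not figures taken from the comment.)

      def tmds_gbps(total_pixels_per_frame, fps, bits_per_channel=8):
          # 3 TMDS data channels, 10 transmitted bits per 8 data bits (8b/10b).
          pixel_clock_hz = total_pixels_per_frame * fps
          return pixel_clock_hz * 3 * 10 * (bits_per_channel / 8) / 1e9

      uhd_total = 4400 * 2250  # 3840x2160 active pixels plus standard blanking
      print(tmds_gbps(uhd_total, 30))  # ~8.9 Gbps: fits HDMI 1.4's 10.2 Gbps / 340 MHz limit
      print(tmds_gbps(uhd_total, 60))  # ~17.8 Gbps: needs HDMI 2.0's 18 Gbps / 600 MHz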

  • Jens Emil Ravn Nielsen

    I know you posted this a long time ago, but I just want to point out that watching 4k content on a 1080 screen won’t look as nice as 1080 content on a 1080 screen. Just like upscaling 1080 content on a 4k screen won’t look as nice as 4k content on a 4k screen, downscaling/sampling 4k to a 1080 screen just won’t look as nice.

    • John

      That is incorrect.
      2160p content on a 1080p panel would look better than 1080p content on a 1080p panel.
      1080p content on a 2160 panel would look better than 1080p content on a 1080p panel.
      2160p content on a 2160p panel would look better than 1080p content on a 1080p panel.
      http://www.red.com/learn/red-101/upscaled-1080P-vs-4K

      • http://referencehometheater.com/ Chris Heinonen

        The first assertion is almost certainly wrong, the second is up for debate, and the third is not up for debate.

        Downsampling 2160p to 1080p isn’t going to look better than 1080p because there is a 99% certainty that the scaler inside the TV is worse than the one being used to master the content. Studios are already taking a 4K master and converting it down to 1080p, and when they do that, they have someone doing that job to make sure there aren’t jaggies, moiré, or other artifacts when they downsample. If those are there, they can tweak the master to eliminate them. A scaler in a TV can’t do this, and those artifacts are going to be present.

        1080p upsampled to 2160p is going to depend on the scaler. What is going to happen is that it’s going to look smoother and often softer, since keeping it sharp results in it either looking blocky, or adding edge enhancement artifacts. The 1080p content on a 1080p screen might look better because it can do the 1080p master at pixel-for-pixel accuracy. The 2160p upscaled one will be somewhat softer when you look. On many things this will be fine, but hairs and other fine textures will likely not be as detailed anymore.

        Yes, 2160p will look better than 1080p if you’re close enough and the sources are of equal quality. If you’re too far away to see the difference, or the 4K source is heavily compressed, as they almost all are currently, then you won’t see a difference. Resolution is just one factor in image quality, not the end-all, be-all.

        • John

          Sigh.

          You did not read the link I posted, did you? I have not stated claims, I stated facts and provided the source.

          You, however, think that your opinion can graduate to fact merely because you posted it. ***roll eyes***

          1) 2160p content on a 1080p panel would look better than 1080p content on a 1080p panel. Why is this a fact? Because…wait for it…wait for it… wait more… here goes ..
          Downscaling somewhat reduces the quality, but the end result is still better than a 1080p content due to the bit rate of 4k content. Not depending on Chris Heinonen’s subjective opinion.

          For example, 4k 20 Mbps downscaled to 1080p will look better than 1080p 3Mbps.

          2) 1080p content on a 2160 panel would look better than 1080p content on a 1080p panel.

          Go read my link again, see the section UPSCALING TO 4K. Come back when we can deal in fact.

          3) 2160p content on a 2160p panel would look better than 1080p content on a 1080p panel.

          And then you go and offer the old 20/20 viewing distance rubbish (which is based on a 1980s fallacy) to try and say this is not always the case. Pathetic.

          You even contradicted yourself, see:

          >>>Yes, 2160p will look better than 1080p if you’re close enough and the sources are of equal quality.

          Then:

          >>>>Resolution is just a factor in image quality, not the end-all, be-all.

          ——-

          So, you say that it depends only on resolving pixels and then say resolution is not the only factor. Have your cake and eat it?

          It’s clear you have no idea what 4K is, nor what HDMI 2.0 (Rec.2020, higher throughput, Rec.2020, more frames per second, better sound, more colors, etc) is. You have no idea that what we see is a reconstruction by our mind based on input provided by our eyes. It is not the actual light (photon particle-waves) received by our eyes. Space is continuous, not discrete pixels. They are already working on 8K TVs, you idiot. If you have some evidence that these engineers and scientists are wasting their time, why not just present it? Why offer your bloody masquerading of ignorance as fact when you have no idea what you are talking about?

          Wake up Chris. People like you see no use in 4K, yet you admit that you will at some stage buy a 4K panel. Unless you tell me now you will never ever ever buy one, will you? Even when they are cheaper than 1080p and no more 1080p TV available, will you still refuse to buy a 4K TV and continue to spout your ignorance that their is no use for 4k unless you sit close to the TV? FFS.
          I want you to answer my question. I guess you will not, because you will be too scared that you will be exposed as a fraud.

          You are merely waiting for the early adopters to buy 4K TVs so that your conservative, uninformed, ignorant mind can adopt it later.
          People like you should be ashamed to call yourself a technology reviewer, you are stifling technology progress. Your ignorance begets knowledge, but you do it with confidence.

          Pathetic!

          I want you to answer my questions.

          • http://referencehometheater.com/ Chris Heinonen

            This will be my last reply to this thread as I’m not going to continually engage with name calling trolls. Nor will they be allowed to post. You can disagree, and you can do it in a civil manner.

            Downscaling somewhat reduces the quality, but the end result is still better than a 1080p content due to the bit rate of 4k content.

            If you give one signal a much larger bit-rate then it can look better. However, if you are sending a 4K signal to a 1080p display, you are effectively throwing away 75% of your bitrate. Why? Because that same 4K-to-1080p conversion could have been done before, with a mastering engineer watching to correct the issues I mentioned, and the result distributed as a 1080p master. Now it’s 25% of the bitrate of the 4K signal (assuming we use the same compression technology) and will look identical. If we suddenly have 4K sources with bitrates > 4x what Blu-ray offers, which we do not right now, then this could possibly happen. But there’s still a good chance the TV scaler would introduce other artifacts.

            Go read my link again, see the section UPSCALING TO 4K. Come back when we can deal in fact.

            I’ll just quote the piece for you.

            “Despite these disadvantages, interpolation is the only option when the original content does not have the necessary resolution.”

            “However, attempting to simulate missing pixels comes at a cost; all methods incur some combination of blurring, blocking and halo artifacts”

            “upscaling noticeably reduces the appearance of blockiness and jagged edges, but falls short when it comes to depicting more detail”

            As I said, you’re trading in sharpness here because of the scaling and interpolation used to convert 1080p to 2160p. You might prefer a 1080p display because it wouldn’t show these artifacts, or you might prefer the 2160p display because you like the smoothness of the interpolation. It isn’t clear cut.

            It’s clear you have no idea what 4K is, nor what HDMI 2.0 (Rec.2020, higher throughput, Rec.2020, more frames per second, better sound, more colors, etc) is.

            I’m afraid you are mistaken here, as you mention Rec. 2020 a lot in other posts as well, and HDMI 2.0. So I’ll clear those things up:

            – Rec. 2020 is a future content standard. It includes a larger color gamut, the potential for larger bit-depths and less chroma subsampling, and support for 4K and 8K resolutions.

            – No 4K display on the market today supports Rec. 2020. None. Zero. So bringing in Rec. 2020 as a “benefit” of 4K is pointless because it doesn’t exist in the world. It’s like using Ethernet support as a reason why HDMI is awesome. HDMI has the option for Ethernet support, but since no one uses it, it’s not a practical consideration.

            – No current display is going to get a firmware update to Rec. 2020 like you mentioned. This is because no display out there supports the Rec. 2020 gamut. The display that has the largest gamut that I’ve measured, the HP z27x monitor, has a gamut that is larger than DCI/P3 but does not come close to Rec. 2020 still. This is a $1,500 production monitor used at places like Dreamworks and Pixar. It’s not a consumer display, and a consumer display won’t get there. If you’re buying a 4K display on the promise of Rec. 2020 then that’s a mistake, because any display you are buying now won’t support all of Rec. 2020.

            – If you buy a display or other product now that has an HDMI 2.0 chipset, it is likely missing HDCP 2.2. Current chipsets on the market allow for full HDMI 2.0 bandwidth, or HDCP 2.2, but not both. This is NOT upgradable down the line. The HDCP 2.2 ones lack the bandwidth for an update, and the HDCP can not be added on via firmware.

            People like you see no use in 4K, yet you admit that you will at some stage buy a 4K panel. Unless you tell me now you will never ever ever buy one, will you? Even when they are cheaper than 1080p and no more 1080p TV available, will you still refuse to buy a 4K TV and continue to spout your ignorance that their is no use for 4k unless you sit close to the TV?

            Go back and read everything I’ve written, and articles beyond this. I’ve never said that I see no use in 4K. I’ve said that 4K is simply a single component of a display, and just because one TV is 4K and another is not doesn’t make the 4K one better. Last year’s plasma TVs were not 4K yet still looked better than any 4K display. Why? Because they had better black levels, viewing angles, and color. Things that you can see on all your content, not just brand new 4K content, and from any distance.

            The next TV I buy will probably be a 4K one, but because I’ll want the best TV and it will happen to be 4K. The best tech demo at CES this year was not the 4K demos but the Dolby HDR demos with larger color gamuts, bigger contrast ratios, and more dynamic range and bit depths. On a 42″ 1080p set you could easily see the difference compared to a standard 42″ 1080p set, even from 15′ away (I tried). Those are all things that we interpret to be more important to an image than resolution.

            People like you should be ashamed to call yourself a technology reviewer, you are stifling technology progress. Your ignorance begets knowledge, but you do it with confidence.

            If we all are so bad at what we do, then it should be really easy for you to do a better job and put us all out of business.

          • Remedy Ailment

            Hmm, I think John’s assertions about downsampling based on higher bitrates are fatally flawed. I also agree that his attitude is terrible; I’m not sure what calling people idiots achieves, maybe he spent $$$$ on 4K. The main problem I have is the assertion that downsampling from a higher resolution is obviously better, because to him it appears a perfect conclusion logically; far from it. Much like those who misbelieve that hi-res audio is better, when sampling from higher bitrates the error margins are much larger and therefore so are the errors. I also think that some 4K sets look way better at distances beyond those listed in the chart, but not because 4K is clearly visible; mainly because the sets and their panels are much better. An objective test between 1080p and 4K with these academic criteria is impossible, but I firmly believe the science here is correct; human eye resolution is proven to reach extinction within these parameters.

          • abbottoklus

            This is the best destruction of an idiot comment I’ve ever seen. Bravo.

          • bruce livingston

            Watching that disrespectful arsehat get schooled is the best thing I’ve seen today. Well done sir.

          • EAK

            Dear poor John,
            Just wanted to let you know 2 points:
            1. As soon as you call your opponent names, e.g., ‘you idiot’, you become one, i.e., an idiot. It’s just a fact. Ask your wife.
            2. The issue is RESOLUTION. It’s a specific anatomo-physiological term, well defined (look it up). NOT ‘looks better’. So, regarding RESOLUTION, everything the author (Chris) wrote in the article AND answered to you is correct.

            Simply hook up a Blu-ray player to an SD TV and see what happens WITH THE RESOLUTION.

      • Jens Emil Ravn Nielsen

        Your link doesn’t have anything to do with downscaled resolution. At least, I didn’t read anything that mentioned it. I stand by my original comment. By the way, I wasn’t trying to be contentious, and I am still not, but resolution is resolution. I am not talking about color fidelity.

  • Sakanawei l

    I’m sure the writer of this article is one of those many “tech gurus” who back in the day claimed things like “you don’t need an expensive 1080p TV, 720p is just fine because you won’t see any difference between these two resolutions”.

    • http://referencehometheater.com/ Chris Heinonen

      I didn’t claim that back then, because I wasn’t writing about AV then, but I will now. Look at the reviews here for TVs and you’ll find two Vizio 32″ displays from this year. One is a 720p set and one is a 1080p set. Having reviewed and watched both extensively, the 720p set is the one I’d recommend 99% of the time. It has deeper blacks, better contrast ratios, and more accurate colors. Having compared them side-by-side, using a 1080p signal from a Blu-ray player going to both, once you are more than 3-4′ away you cannot see the extra resolution. You can easily see the better blacks and color from further away.

      Resolution is a single factor in an image. You also have to take display size and viewing distance into account. If you’re using the Vizio as a PC monitor from 2′ away, then you’re in the 1% I’d suggest get the 1080p version. The same goes with 4K. For many people, they won’t sit close enough or have a large enough screen to really see the difference. If they can buy a 1080p set for the same price, but with better overall picture quality (color, contrast ratio, viewing angles), then the image will look better.

      • minimalist1969

        Those who argue about TV resolutions remind me of the people who argue about absurd megapixel counts on cameras, or the ones who think that having true “1080p” on a 5″ phone screen actually makes a difference. The notion that higher numbers equal higher quality is something electronics companies have used for years to seduce people into buying next year’s model. It’s easier to advertise “40 MP” or “4K Ultra HD” than to explain to people the complex web of factors that go into real image quality.

  • Pingback: Samsung PN64H5000 Review: The Last Plasma | HD Guru

  • Will

    Standing even 20′ away, the UHD is more immersive than any standard HD TV, with nothing more than a Blu-ray player and a 20BB/sec cable or better. If retail stores have anything the consumer doesn’t, it is for sale. They are not hoarding anything. Getting the consumer ever better stuff is how they stay in business. What is there to hold back? Also, I live in a Google Fiber neighborhood. Google hosts Netflix servers. The 4K stream is great and really beats everything for 4K, if you can have it. I am sure FiOS and U-verse are awesome also. There are tons of great holiday deals on UHD, go get some.

  • Underhill

    Remember something else. If you walk into the store and expect to see the best possible 1080p screen next to the flashy new 4k display they are trying to upsell everyone to, you are out of your mind. Most stores will display a 5 year old LCD from the bargain bin next to their newest 4k model to highlight the vast difference.
    I wouldn’t even be surprised if the settings were fiddled with to give the 4k even more advantage.
    There is a difference, but it’s hard to argue with the facts. For most of us, in normal living rooms where people actually live and do things other than watch TV, 4k just isn’t worth it.

  • PSAGuy

    Agree….The techies are out with their logic but the fact remains….I am a professional photographer and absolutely know how to look at an image for flaws and resolution detail etc. The 4K’s seem better to the human brain. Crank all the numbers you want. People come to my home, stand 10-15 feet away from my 60″ 4k TV and stand there with their mouths open !!! They cannot believe it. We tell them nothing about the TV…..until they ask.

    • abbottoklus

      I seriously question this. 10-15 feet away with a 4K screen makes zero difference to my eye, and I’ve yet to encounter someone who can prove otherwise. I honestly think you’re just experiencing a placebo effect; you’re arguing with cold hard science, sir. It also makes a big difference just to have a TV set up properly, which frankly most people don’t know how to do, as well as having a TV which displays great colour and blacks. Like the OP said, there is so much more behind a great display than resolution.

      This is coming from someone in video/TV production working extensively with 1080 and 4K content. Don’t get me wrong, it has great advantages, but much more so in production than in content delivery. As a consumer technology, it is practically negligible. The benefits are far outweighed by the price. Even as a professional user, I have two 24″ 1080p monitors 3′ from my face all day long. I’ve compared the same size in 4K at the same distance: absolutely no difference whatsoever. I didn’t work with them, just previewed, so perhaps in certain workflows they would show some benefits (zooming in on footage, for example, or those times I have to lean in close), but again, the benefits are far outweighed by the price.

  • Chase Payne

    I think many people are being misled on how TVs handle lower resolutions. To make a long story short, a 1080p image theoretically looks exactly the same on a 4K TV. Let me explain to you how it works:

    Pretty much all modern TV’s today feature a technology called upscaling, which means instead of reducing the televisions resolution entirely, it doubles the pixels to fit the same screen space as it would originally. It looks exactly the same as you would be running on that resolution. This doesn’t work on everything, but if you’re just an average American who only watches TV and never plays video games, you won’t notice a difference other than better contrast (see below). That benefit is worth the cost considering 1080p has lasted for over 10 years, and if you bought a 4K TV now it could last even longer.

    The illusion here is that 4K television sets actually have a better contrast ratio, and what happens is the image appears more colorful and detailed because of more accurate lighting. You get the same effect by calibrating your monitor; simply by adjusting the contrast and brightness to be not too bright or dark, you can see up to 30% more detail.

    Now, the big issue here is not everything supports upscaling. For example, video games run on the PS3 do not pixel-double. It actually physically changes the internal resolution on the TV and causes it to stretch the image and look horrible. Again, this is all on the TV. Some TVs report a lower resolution to the video game console, and pixel-double anyway to get the same true image as you would on a native screen. These TVs are considerably more expensive, and they’re not advertised as much. The Sony TVs do this for you, so if you’re a gamer don’t get anything other than Sony.

    • http://referencehometheater.com/ Chris Heinonen

      A bit of this is off, so I’ll quote those bits.

      Pretty much all modern TV’s today feature a technology called upscaling, which means instead of reducing the televisions resolution entirely, it doubles the pixels to fit the same screen space as it would originally. It looks exactly the same as you would be running on that resolution.

      That’s incorrect. First, you’d be quadrupling the pixels and not doubling, but otherwise almost no TVs do that anyway. That would be a dumb scaler which no TV uses now. They use interpolation to attempt to create detail. So if you have an angled line, you’ll see it attempt to better define that angle.

      Some TVs and scalers are better at this than others. Scaling can introduce ringing or halo artifacts if it tries to make the image too sharp, or the result can look soft if it goes too far the other way. But nothing does just a simple 4x pixel multiplication on the input signal unless it’s trying to run in a really low input-lag mode, like gaming, where lag is worse than artifacts.
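      (A tiny illustration of the difference, not what any particular TV’s scaler actually does: pixel repetition keeps edges blocky, while even simple bilinear interpolation smooths them without creating real detail. Real scalers use far more sophisticated methods.)

      import numpy as np

      # A tiny grayscale test pattern stands in for a frame.
      frame = np.array([[0, 255, 0, 255],
                        [255, 0, 255, 0]], dtype=float)

      def pixel_repeat_2x(img):
          # "Dumb" upscale: every source pixel becomes a 2x2 block.
          return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

      def bilinear_2x(img):
          # Simple bilinear upscale: new pixels are weighted averages of
          # neighbours, which smooths edges but cannot invent detail.
          h, w = img.shape
          ys = np.linspace(0, h - 1, 2 * h)
          xs = np.linspace(0, w - 1, 2 * w)
          y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
          y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
          wy, wx = (ys - y0)[:, None], (xs - x0)[None, :]
          top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
          bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
          return top * (1 - wy) + bot * wy

      print(pixel_repeat_2x(frame))       # hard-edged 2x2 blocks
      print(bilinear_2x(frame).round(1))  # softer, interpolated values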

      What makes 4k tv sets look better even though it’s the same image.. is the more accurate contrast and brightness. You can’t achieve the same levels of contrast and brightness on a 1080p TV vs a 4k one, so they will always look better regardless of what people claim.

      What makes any TV look better is brightness and contrast but your 1080p vs 4K statement is 100% false. All the final 1080p plasma displays (Panasonic ST, VT, ZT, and Samsung 8500 models) have much, much better black levels than any 4K LCD can produce today. Their peak light output is lower, but their black levels are so much better their native contrast ratios are far better.

      The best contrast ratios available on any TV today are on LG OLEDs. Those are coming out in 4K models, but right now they are 1080p displays. Contrast ratio is the single most important element to producing a good looking image, and since 1080p sets at this moment have better contrast ratios than 4K TVs, and for far less money, they can produce an image that many will find better.

      • Chase Payne

        I can run 600×480 on my 4K TV, and then run the same image on a CRT screen, which doesn’t suffer from resolution issues like LCDs, and the 4K TV looks better. There’s probably more to it than pixel doubling (it’s called that because it was mainly done for 720p -> 1080p, hence why it’s not called quadrupling).

        4k TV’s are an advantage for video gamers, especially those who use 3d (passive) as it is true 1080p 3d for each eye.

        Some TVs also feature enhanced upscaling like Xreality, which is really good, but it can sometimes backfire and look better turned off.

        I don’t think its fair to compare OLED to it, I only claimed 4k offers better color and contrast compared to a normal 1080p LED TV. OLED is also not related to the display at all, it is simply a change in the backlight much like the old CRT lights on LCD screens.

        In reality, there is no need for OLED unless you want reduced input lag. Many videos today don’t use the full color gambit (255) and run limited, which means anything better than 17 reference black wouldn’t make a difference. There’s also no standard defined for what exactly measures contrast ratio and many TV’s allow you to artificially increase the contrast ratio by increasing the levels of just the blacks and whites while retaining the same brightness and contract and color.

        Just keep in mind they did the same thing when 1080p came out, people were convinced there was no reason to upgrade from 720p and instead you should just get a plasma, but in the end resolution won over the market.

        • http://referencehometheater.com/ Chris Heinonen

          Actually, you just don’t really know what you’re talking about here, as this makes clear.

          I don’t think its fair to compare OLED to it, I only claimed 4k offers better color and contrast compared to a normal 1080p LED TV.

          Your original statement was “You can’t achieve the same levels of contrast and brightness on a 1080p TV vs a 4k one, so they will always look better regardless of what people claim.” with zero caveats about display type.

          OLED is also not related to the display at all, it is simply a change in the backlight much like the old CRT lights on LCD screens.

          This is completely false. LED LCDs and LCDs are displays with the same panel type, LCD, but one uses an LED backlight over a CCFL backlight. OLED is a completely different kind of panel, where the pixel itself gives off the light and there is no backlight. This is why an OLED can produce infinite blacks, since the pixel itself turns off instead of just trying to filter out a backlight.

          In reality, there is no need for OLED unless you want reduced input lag.

          Again, just a complete misunderstanding of what OLED is. Also, input lag is unrelated to panel technology. Lag is related to the processing section of the TV, including scaling, the color management system, and so on.

          Many videos today don’t use the full color gamut (255) and run limited, which means anything better than 17 reference black wouldn’t make a difference.

          Another misunderstanding. Plasma and OLED, when fed a black level of 16 (which is video black, not 17), produce a lower light output level than LEDs do. With the same content, plasma and OLED produce darker, richer blacks than LEDs do, and so they produce better contrast ratios, which is what people notice.
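
          For anyone unfamiliar with video levels, here is a minimal sketch of the limited-range (16-235) to full-range (0-255) mapping being discussed; the function name and the 8-bit assumption are mine, for illustration only.

```python
# Hypothetical illustration: map an 8-bit limited-range ("video") level to full range.
# In limited range, video black is 16 and video white is 235; full range uses 0-255.

def limited_to_full(level: int) -> int:
    """Map a limited-range (16-235) 8-bit level to a full-range (0-255) level."""
    full = (level - 16) * 255 / 219          # 219 = 235 - 16, the limited-range span
    return max(0, min(255, round(full)))     # clamp, since sub-black/super-white levels exist

print(limited_to_full(16))   # 0   -> video black maps to full-range black
print(limited_to_full(235))  # 255 -> video white maps to full-range white
```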

          There’s also no standard defined for what exactly measures contrast ratio, and many TVs allow you to artificially increase the contrast ratio by increasing the levels of just the blacks and whites while retaining the same brightness, contrast, and color.

          Which is why I measure them all the same way. I set a luminance level of 35-40ftL, use 18% APL patterns to defeat LEDs that turn off the entire backlight, and measure 0% and 100% test patterns. It is the closest to real life measurements you can get, and makes it a level playing field.
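
          As a rough illustration of that methodology (the luminance readings below are invented for the example, not measurements of any particular set), the native contrast ratio is simply the white-pattern reading divided by the black-pattern reading:

```python
# Hypothetical example: compute a native (on/off) contrast ratio from two luminance
# readings taken with the same picture settings: a 100% white pattern and a 0% black pattern.

def contrast_ratio(white_nits: float, black_nits: float) -> float:
    """Native contrast ratio: full-white luminance divided by full-black luminance."""
    if black_nits <= 0:
        return float("inf")  # e.g. an OLED pixel that shuts off completely
    return white_nits / black_nits

# Invented readings, for illustration only:
print(f"LCD:    {contrast_ratio(white_nits=120.0, black_nits=0.030):,.0f}:1")  # 4,000:1
print(f"Plasma: {contrast_ratio(white_nits=100.0, black_nits=0.005):,.0f}:1")  # 20,000:1
```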

          Just keep in mind they did the same thing when 1080p came out: people were convinced there was no reason to upgrade from 720p and that you should just get a plasma instead, but in the end resolution won over the market.

          The same thing will happen here. 4K will eventually take over the market because all that anyone will produce are 4K displays as costs decrease. However, if you had bought a nice 720p panel instead of a 1080p panel when they were new, you could have saved money and gotten equal or better contrast ratios. The 720p Kuro panels were better than any of the 1080p panels out at the time in overall image quality. Then you could put away that savings and buy a 1080p display in a few years and get something much better than you could have bought at the time.

          The same is true with 4K. In 3-4 years almost all mainstream TVs of 50″ or larger will probably be 4K. However at that time they’ll also have HDR, P3 color, and other features that are just starting to hit the market in $6,000 displays. You could get a $6,000 one now, which some people will, or you could get a really nice $1,500 TV right now, and then buy a $1,500 TV in a few years and have something better than $3,000 would get you today.

          • Chase Payne

            I don’t know why anyone would buy an OLED TV for better blacks and whites when the artificial effects applied on modern TVs are fairly accurate, for significantly less money.

            Not to mention that OLEDs have a shorter lifespan; they lose brightness much faster than an LCD TV.

            If you buy a 4K TV that’s an LCD, you can expect it to last 15-20 years under moderate use before it reaches half brightness.

            If you buy an OLED TV, it will reach half brightness in about 7 years under the exact same usage. And the big issue is that the blues are notorious for burning out faster, so not all of the lights will age the same… which defeats the purpose of OLED entirely. You want accurate contrast, but after 4-7 years you might as well throw it away, because an LCD that’s 12 years old will look far better.

            How come when you’re at the store, they don’t mention this glaring issue?

            OLED will take over eventually, but it is way too expensive and won’t last as long as an LCD, which can look just as good if you adjust your settings right.

          • http://referencehometheater.com/ Chris Heinonen

            Obviously nothing is going to convince you that OLED is good, even though you don’t seem to understand it. Current OLEDs have a half-brightness rating of 25,000-40,000 hours for the blue subpixel. That’s 17+ years at 4 hours a day of usage, and then it reaches half brightness. The OLED hardware is made to compensate for this as well, so the colors don’t skew.
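
            For anyone who wants to check that arithmetic, converting a rated half-brightness lifetime into years at an assumed 4 hours of viewing per day is straightforward:

```python
# Convert a panel's rated half-brightness lifetime (hours) into years of use.
HOURS_PER_DAY = 4
for rated_hours in (25_000, 40_000):
    years = rated_hours / (HOURS_PER_DAY * 365)
    print(f"{rated_hours:,} hours is about {years:.1f} years at {HOURS_PER_DAY} hours/day")
# 25,000 hours is about 17.1 years; 40,000 hours is about 27.4 years
```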

            You’re really overstating the lag issue. What you’re describing is panel response time, and LCDs have a panel response time of a whole 5 ms now. Every TV has much, much higher lag than this because of video processing. The difference of 4 ms is really inconsequential to anyone in daily life.

            Also, none of the artificial effects applied to LCDs can do what an OLED can do. They improve the LCD, but they can’t beat the OLED in contrast ratios. Believe me, I spent a few hours last month in a room with the best TVs on the market, all calibrated, and you can see the difference in contrast ratio on the OLED. The LCDs look good, but the OLED has them beat there. This is OLED’s huge advantage on the market.

            Where OLED falls short right now is that the sets are curved, and the processing in the current models has some issues with sharpness and dithering. However, virtually every reviewer and videophile I know would still pick one today if they needed a new TV, because their contrast ratios make them look better than anything else out there.

          • Chase Payne

            OLED will be worth it once it gets cheaper; there was a time when LED cost a ton of money, but after mass production it became worth it. There will always be video enthusiasts who actually do want the best picture possible, but the fact of the matter is most people don’t even change the default settings on their own TV, and it would be hard to convince people that staying at the same resolution for better color is worth the price as opposed to 4K, where at a certain distance (<5 feet) you notice a huge difference in picture quality. Some people actually like the extra brightness and washed-out colors and find the more accurate displays too dark.

            If you put a true 65″ 1080p display and a 65″ 4K display next to each other with the same image, at five feet the customer will be able to tell a huge difference. For one, they would actually see the pixels on the 65-inch 1080p display, and they will immediately notice that the 4K TV has better picture quality. The OLED may have better colors, but is it worth it when it’s the same price?

            Maybe in 10 years we will see 4K OLED in the mainstream. But just keep in mind that in the future we won’t even be using TVs anyway, because once we hit 8K resolution you can’t notice the difference unless you are 2 feet away and the screen is massive. Because of that, the future is either projection into the eye (like Google Glass) or virtual reality in order to achieve realistic levels of imagery.

          • http://referencehometheater.com/ Chris Heinonen

            If you put a true 65″ 1080p display and a 65″ 4K display next to each other with the same image, at five feet the customer will be able to tell a huge difference.

            This is the entire reason for the calculator. No one sits 5′ away from a 65″ TV at home, but stores are set up to put you that close. Of course you can see a difference at that distance, but no one will do that in real life.

            However, sit the normal 10-12′ away from an LCD, plasma, and OLED and you’ll be able to tell the difference in black levels no matter what resolution. I have a 65″ 4K TV here right now and I’d never give up my VT60 plasma for it. If I’m 6′ away I can see a bit more detail on the 4K display, but the black levels are so much worse that everything looks better on the plasma.
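
            For anyone who wants to run these numbers themselves, here is a minimal sketch of the kind of acuity math the calculator is built on, assuming 20/20 vision resolves roughly 1 arcminute per pixel and a 16:9 screen; the function name and rounding are mine, not the calculator’s actual code.

```python
import math

def max_resolvable_pixels(diagonal_in: float, distance_in: float) -> float:
    """Approximate horizontal pixel count a 20/20 viewer can resolve,
    assuming about 1 arcminute per pixel and a 16:9 screen."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)            # screen width from the diagonal
    angle_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return angle_deg * 60                                       # 60 arcminutes per degree

# A 65" screen viewed from 6 feet vs. a more typical 10 feet:
for feet in (6, 10):
    pixels = max_resolvable_pixels(65, feet * 12)
    print(f"{feet} ft: ~{pixels:.0f} resolvable pixels (1080p is 1920 wide, 4K is 3840)")
# Roughly 2,600 at 6 ft (some 4K benefit) and 1,600 at 10 ft (below even full 1080p).
```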

            Also, the mainstream isn’t going to switch to using glasses to watch TV. I think the failure of 3D in the home, where you have to wear glasses, is a testament to the fact that people don’t enjoy that.

          • Chase Payne

            You’re right, Chris. OLED is significantly better. I actually sold my 4K TV and got an LG 1080p OLED, and it was miles better than the 4K LCD I had. I also discovered that field of view is a huge problem (our vision is only 20/20 in roughly a 10-degree field of view), so it made no sense to get 4K at all. I appreciate your time pointing out my ignorance, because I have an amazing TV now, and every time someone comes over they think my TV is 4K when it’s actually 1080p =)

  • QUIMICOMORTAL

    I heavily disagree with this calculator; it tells me that there is no difference between Full HD and 4K for a 60-inch screen at 9 feet, and I have seen them in the store and you can really notice a difference (without even having to compare them side by side). I just have not bought one because the content to appreciate it is not really here.

    • abbottoklus

      Store floors are the worst place to evaluate a TV. They tailor-make demo content specifically meant to play on a 4K TV, while nearby HD TVs show lower-quality content, to convince buyers of the 4K advantage. You’re also generally standing much closer to the screen than you will be in your home, unrealistically highlighting the details. They also usually don’t place them side by side, but rather a few feet apart at perpendicular angles. Why? Because you can’t make a true comparison that way. Ask a store rep to play the 4K TV’s content on their best 1080p TV and I bet they’ll have reasons why it can’t be done.

      • QUIMICOMORTAL

        Yes, you are correct; however, I have seen the 4K demos from a greater distance than the calculator says and I can appreciate a better image than the ones I get on any Full HD screen (the one with the rubber ducks is one of my favorites).

    • MarvelFan_1

      Once again, that calculator is weighing distance against screen resolution. It’s hard for people to get their heads around it because they associate one with the other, but TV resolution is not the same as UHD content. In the store you’re watching UHD content played on a 4K TV, and I’m sure it’s beautiful. A more accurate test would be Full HD (1080p) content played on a 4K TV compared to a 1080p TV; there would be hardly any difference, I’d wager.
      A 4K TV can play UHD natively while a 1080p TV has to downscale it, so the advantage would lie with the 4K set, but once again, when you get home, how much of the content you watch, if any, will be from a UHD source?

  • Randy

    Where 4K TVs shine is in watching 3D content. Now you get a full 1080p per eye and you can really tell the difference.

    • http://referencehometheater.com/ Chris Heinonen

      You are already getting 1080p per eye with a 1080p set and active glasses. Only passive glasses give you 540p per eye, but those are gone now, I believe. 3D on 4K can be better because you get to wear passive glasses, so the image is brighter and it is usually easier on the eyes.

      However, on all the TVs I’ve seen this year, or that others have reviewed (Panasonic AX900U, LG 4K OLED, Samsung JS9000), the way that 3D is done is with a polarizer. If you are closer than 10′ to the set, the angle is too sharp for the polarizer to work correctly. If you sit far enough away for it to work, then you can’t see the extra resolution on the 4K set (the larger the TV, the further you have to be for the polarizer angle to be correct).

      So yes, 4K 3D can be better, but you aren’t getting extra resolution, only higher light output and cheaper glasses.

  • William FitzPatrick

    Did you ever stop to think that the content they were playing on the two displays was weighted towards the 4K display? There are things that can be done to images, such as reducing the contrast and other atrocities, that would give an otherwise stunning 1080p panel a bad rep. If they put the same content on both displays, then I can undoubtedly say that they would look the same, unless the 1080p panel was garbage. That brings me to my next point: you could be comparing an absolute crap 1080p display to a 4K IPS display. That brings me to the different types of panels they could be using; a general rule of thumb is that OLED > Plasma > IPS > TN, and there are many other types of panels out there that fall somewhere in that array. Here is another question to raise: what are their cd/m² brightness ratings? Before making an uneducated purchase, do some damn research into what makes one thing better than the other. It is NEVER black and white.

  • MarvelFan_1

    4K content is the key here. There is a distinction that some people don’t grasp when comparing the benefits of a 4K TV’s resolution to the benefits of UHD content.
    A 4K resolution is going to make your TV sharper in terms of pixels, so the benefit of a 4K TV grows with the size of the screen, i.e. the bigger the screen, the more pixels = the better the image. However, UHD content is where the real issue is, because that content is what’s really carrying the quality of the image delivered to your TV screen. The problem, right now and in the future, is that apart from that shiny (made-for-purpose) UHD demo they showed you in the store, when is the next time you’ll get UHD content in a film or program you’re watching at home?

  • Blackbart Jones

    First, there is very little true 4K video to watch on a TV, and the upscaling does not do that much. Also, HDMI ver 1.2 cannot handle the bitrate needed to play a full 4K video at 80 Mbps or higher. That is why it is too early to buy a 4K TV unless it is just a great deal. Sony sells a few movies on a proprietary disc, but those movies are reduced in bitrate. Once the H.265 encoding takes off more, then there will be movies that are small enough to be practical and look good.
    An H.264 2-hour movie will need approx. 100GB; the new H.265 can compress with the same quality and put it on a Blu-ray. Once this starts to happen, then you will see the benefits of a 4K TV.
    When it comes to streaming via cable TV, that will take much longer. Currently, HD movies that should stream around 20 Mbps are done well below 10 Mbps.
    Also note that the demos in the stores are showing true 4K video, something you will not get at home.

    • http://referencehometheater.com/ Chris Heinonen

      Every single UltraHD TV this year, and I believe every one last year, has inputs that are full-bandwidth HDMI 2.0 with HDCP 2.2. That’s enough bitrate to handle UltraHD with full 4:4:4 chroma at 60Hz or even 120Hz. No UltraHD TV shipped with HDMI 1.2 that I’m aware of, as that was long outdated when the first models came out.

      All UltraHD content currently available (Netflix, Amazon) already uses H.265, as the TVs need to have internal HEVC decoders to handle the content today. UltraHD Blu-ray will also use HEVC, and its discs are larger than standard Blu-ray (66GB or 100GB vs. 25/50GB).

      • Blackbart Jones

        2.0a still has the limit of 18 Gbps, which will not give you full 4K, and though it says 4K, you are not getting true 4K from Netflix; they just do not have the bandwidth to push that.

        It will be a while before you get true 4K, and until you get players that support H.265 that don’t just upscale, you won’t.

        • http://referencehometheater.com/ Chris Heinonen

          You’re just wrong on the HDMI 2.0 bandwidth. I laid out the data requirements for raw, uncompressed UHD signals in a piece for HDGuru a couple years ago:

          http://hdguru.com/hdmi-2-0-what-you-need-to-know/

          Or to make the math more plain, we have a limit of 18,000,000,000 bits per second. To calculate the number of bits you need for UltraHD, it would be 3840×2160 (pixels) x 30 bits per pixel (for 10-bit color, which is HDR) x 24 or 60 (for frames per second). That comes out to 5.97 Gb/sec (for 24p) or 14.93 Gb/sec (for 60p). Both of those are less than the 18.0 Gb/sec that HDMI 2.0 allows.
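
          The same back-of-the-envelope math, written out so anyone can rerun it with other frame rates or bit depths (like the figures above, it ignores blanking intervals and link overhead):

```python
# Raw (uncompressed) UltraHD video data rate vs. the HDMI 2.0 limit.
HDMI_2_0_LIMIT_GBPS = 18.0

def raw_video_gbps(width: int, height: int, bits_per_pixel: int, fps: int) -> float:
    """Raw video data rate in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

for fps in (24, 60):
    rate = raw_video_gbps(3840, 2160, 30, fps)   # 30 bpp = 10-bit color x 3 channels
    verdict = "fits within" if rate < HDMI_2_0_LIMIT_GBPS else "exceeds"
    print(f"3840x2160 @ {fps}p, 10-bit: {rate:.2f} Gb/s ({verdict} HDMI 2.0's 18 Gb/s)")
# About 5.97 Gb/s at 24p and 14.93 Gb/s at 60p.
```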

          Define “true 4K”. Netflix streams that are UltraHD are 3840×2160 resolution using HEVC/H.265 encoding, and have a bitrate of 15.6 Mb/sec. Does that look as good as UltraHD Blu-ray will? No, because the disc can offer bitrates of 82, 108, or 128 Mb/sec depending on disc size. It doesn’t mean that the Netflix stream isn’t true 4K, just that it will have more compression artifacts and banding than UltraHD Blu-ray will.