Blu-ray Player Reviews: Subjective vs. Objective

Recently I got involved in a long thread at the AVS Forums over the merits of the Sony BDP-S790 player.  The crux of the argument from the users and fans of the player was that it had a “better” image than the Oppo BDP-93, and a far better image in 3D.  Even though I hadn’t used the BDP-S790 yet, this caught my attention, as the Oppo BDP-93 has been shown to be bit-perfect on its output, across all colorspaces, so it presents the content on the disc exactly as encoded.  If the BDP-S790 looked better, there were only a few possible reasons:

– The S790 has decoding errors, but the people using it like those errors, even if they are technically incorrect
– The S790 has a lot of image adjustment controls, and they like the image those controls produce
– They all just bought the S790 and want to believe it’s better, even without objective proof

I tend to lean towards option #3 as the main reason, but I also thought it a good time to discuss the Blu-ray player reviews that I do for Secrets of Home Theater and High Fidelity, and why we test the way we do.  For almost all of our benchmarks, we look for the most objective data that we can get.  The HDMI Benchmark is 100% objective, using only the information presented by the Quantum Data 882 and analyzing it to show the flaws in the pixels we are reading.  If we expect a green value of 235 and we see a green value of 231, we report that error, and often I’m not even looking at the test patterns on-screen by eye, since the QD882 can give me all the data I need.
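
To make that concrete, here is a minimal sketch of what an objective pixel-error check amounts to.  This is not the actual QD882 analysis tooling, just a hypothetical Python illustration of comparing measured values against the values a test pattern is supposed to contain:

    # Hypothetical sketch, not the actual QD882 analysis code: report every
    # channel where the measured pixel value differs from the expected one.
    def pixel_errors(expected, measured):
        errors = []
        for i, (exp, meas) in enumerate(zip(expected, measured)):
            for channel, e, m in zip("RGB", exp, meas):
                if e != m:
                    errors.append((i, channel, e, m))
        return errors

    # Reference white in video levels is (235, 235, 235); a player that
    # outputs (235, 231, 235) has a measurable green error.
    print(pixel_errors([(235, 235, 235)], [(235, 231, 235)]))
    # -> [(0, 'G', 235, 231)]

Either the output matches the disc or it doesn’t; no opinion is involved at this stage.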

With some of our other Blu-ray and DVD tests, there is a more subjective element: whether a clip locks onto a cadence quickly enough, whether the noise reduction is effective, and how responsive the player is.  I’ve been trying to eliminate as much subjectivity from the testing as possible, but some scores may still be slightly off because of that subjective element.  Perhaps the recent BDP-S590 from Sony is an 8 on the User Responsiveness test and not a 7, but that’s splitting hairs compared to the other testing.

This is also why I dislike testing and commenting on online streaming content with the players.  There are so many variables in play that I can’t be certain whether the player is at fault or something else is.  With my Netflix test clip of The Iron Giant, I noticed on the Sony BDP-S590 that the panning motion was very good compared to most players, but I also noticed that a common artifact I see in the scene was missing.  Did the Sony clean it up?  Did Netflix do a new encode of the movie that eliminated it?  Can I trust this compared to the previous tests?  Or was my network performance just much better that day?  Until Netflix and Amazon and others want to help me create objective tests for this performance, it will remain totally subjective.

This comes back to the BDP-S790 review and subjectivity.  Objectively, if the Sony was performing perfectly, it and the Oppo would be identical, unless you were using the image adjustment options in the Sony.  In that case, you’re now adjusting image features (gamma, sharpness, black level, and so on) to accentuate certain things, but in doing so you are inherently going to harm other parts of the image.  You can adjust the gamma in the player, but doing so forces you to adjust the luminance level, which leads to decreased dynamic range.  Adjusting sharpness will usually add ringing, or obscure fine detail due to how those algorithms work.  There is no free lunch, and when every pixel is already accounted for, you’re going to run into issues when you try to manipulate them.
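
As a rough illustration of why there is no free lunch, here is a hypothetical sketch (not any player’s actual processing) of applying a gamma tweak to 8-bit video-range values and counting how many distinct levels survive the re-quantization:

    # Hypothetical sketch: apply a gamma adjustment to 8-bit video-range
    # values (16-235) and re-quantize.  Rounding back to integers merges
    # some levels, so gradation and effective dynamic range are lost even
    # though "only gamma" was changed.
    def apply_gamma(levels, gamma, black=16, white=235):
        out = []
        for v in levels:
            x = (v - black) / (white - black)               # normalize to 0..1
            x = max(0.0, min(1.0, x)) ** gamma              # gamma adjustment
            out.append(round(black + x * (white - black)))  # back to 8-bit
        return out

    levels = list(range(16, 236))                           # 220 input levels
    for g in (1.0, 1.3):
        print(g, len(set(apply_gamma(levels, g))), "distinct output levels")
    # With any gamma other than 1.0, fewer than 220 levels remain: codes
    # collapse together in the shadows while gaps open up elsewhere.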

Could people like what they see?  Sure, but they are also unlikely to have an ISF or THX calibrated display, as those users would probably want to keep the picture as close to reference standards as possible.  One person reviewing the player at AVS was lamenting the lack of Dynamic modes on the projector he was using to evaluate it, so he seems to prefer an overly blown-out image that is going to lack fine shadow detail and accurate colors.  I do make some commentary on image quality, but often it is of the “I noticed no issues” variety, since a Blu-ray player that performs to reference quality on the bench tests should fly through normal content with no visible issues.  Reference content serves as a sanity check, to make sure we didn’t miss anything, but the crux of the Blu-ray review is the bench tests in most cases.

We all have different preferences.  I like a totally neutral, reference-standard picture, even if that means Lightning McQueen isn’t an insanely bright shade of red compared to using a display’s native gamut.  It lets me know I’m seeing exactly what the director, cinematographer, and everyone else involved in the film intended, and so I want my Blu-ray player to stick to the standards.  If I liked something else, and only reported subjectively about that, anyone looking for a reference image because of my reviews might be incredibly disappointed and no longer trust what I say.  Sticking to objective measures as much as possible, and being subjective only when necessary, lets my readers know what they are getting, and they can make their own subjective adjustments if they desire.


  • Couto27

    When you mention this ”Even though I hadn’t used the BDP-S790 yet” I stopped reading.
    Regards

    • Chris Heinonen

      Finishing the rest of the article might have filled you in, then, since there was no way for that player to be objectively better (you can’t do better than perfect), and it even had a few glaring flaws, but it was subjectively better to someone for certain reasons.  Perhaps you will read the rest and see how it makes sense in context.