Posted Fri Oct 26, 2007 at 01:22 PM PDT by Joshua Zyber
Editor's Note: A long-time movie buff and collector of discs from laserdisc to DVD, Joshua Zyber is a veteran disc reviewer, and an enthusiastic supporter of all things High Definition. In his twice-monthly High-Def Digest column, Josh discusses a broad range of topics of interest to other early adopters.
By Joshua Zyber
One of the inevitable side effects of the High Definition revolution is that the advanced video and audio technology used in the Blu-ray and HD DVD formats tends to bring out the know-it-all tech geek in home theater fans. Sometimes this can be a great benefit, when knowledgeable users band together to analyze specific technical deficiencies and share their feedback with the parties responsible, hopefully leading to future improvements. We've seen some of this at various points during the format war. Early Blu-ray releases such as 'The Fifth Element' exhibited obvious visual deficiencies due to weak source materials and poor digital compression. Likewise, HD DVD catalog titles from Universal have been hit-or-miss in quality, many of them recycled from dated and problematic video masters (like 'In Good Company', with its ghastly edge enhancement artifacts). Reviews published on this site and others were negative, and buyers voiced their displeasure to the studios, eventually resulting in improved mastering on subsequent releases. 'The Fifth Element' was even remastered in significantly better quality as a direct result of owner feedback. That wouldn't have happened had no one spoken up about it.
Generally speaking, the High Definition studios, knowing the intense scrutiny their work is placed under, have maintained a much higher standard of quality on recent releases (with some notable exceptions, of course). Just imagine what might have happened had the public been apathetic and merely accepted whatever shoddy treatment they were handed. In this case, the voice of the people resulted in a better end product for everyone to enjoy.
Unfortunately, the above example is a best-case scenario. On the flip side of that coin, we have countless cases of agenda-driven individuals attempting to use a partial understanding of technical matters as a bludgeon in arguments supposedly "proving" the superiority of one format over the other. Anyone who's spent time browsing home theater discussion forums has suffered through an endless string of debates about how the HD DVD format "sucks" because its discs can only store 30 GB of content, while Blu-ray discs can store up to 50 GB, and therefore must be amazingly superior. Never mind that HD DVD has time and again proven capable of delivering exceptional picture and sound quality, plus copious bonus material, easily equaling even the best available on Blu-ray. At the same time, there are others who point to the occasional Blu-ray encoded with MPEG-2 compression as being "unacceptable", even though MPEG-2 can certainly achieve excellent results when given enough room to breathe (witness 'Black Hawk Down'). To some people, the actual quality presented to them is irrelevant if they don't like the sound of the specs on paper.
This "specs above all else" mentality has reared its ugly head again recently with the release of 'Transformers' on HD DVD, a title that delivers stunning video and audio, as well as a number of innovative interactive features. What could possibly be the problem here? Well, the soundtrack is only encoded in Dolby Digital Plus format, not a lossless codec such as Dolby TrueHD or an uncompressed one like PCM. In his review of the disc for this site, our Peter Bracke gave the DD+ track a perfect "5" for audio quality and said of it that, "Directionality, imaging, accuracy of localized effects, and the sheer depth of the soundfield are all fantastic stuff." Nonetheless, in the minds of many, this disc is a huge failure, and its soundtrack a pathetic disgrace for not including a TrueHD or PCM option.
I should mention at this point that at least one working Hollywood sound mixer has voiced his opinion that, when played back on his professional dubbing stage, well-mastered Dolby Digital Plus soundtracks encoded at the high 1.5 Mb/s bit rate that Paramount uses can be audibly transparent to the studio masters, when tested on movies that he mixed himself and would presumably know better than anyone else. But what use is the informed opinion of an expert in the field when it's easier to just point to the specs list on the back of a disc's packaging to make conclusive statements about matters of quality? In the forum on this site, a number of readers have made proclamations such as, "Compressed audio is just not acceptable these days" and "Whether you can tell the difference or not is irrelevant."
The disc's audio being indistinguishable from its studio master is "irrelevant"? Even with just a Dolby Digital Plus track, the 'Transformers' disc earned the highest score for audio quality that we can give. What more could we demand from it? It's absolutely terrific, but it's just not absolutely terrific enough if the packaging doesn't have a listing for TrueHD or PCM, even though it's likely impossible for human ears to tell the difference? What kind of argument is that?
The lossy compressed audio formats offered by Dolby and DTS use perceptual encoding techniques to filter out data from the studio masters in order to conserve disc space. The intent of perceptual encoding is that the data removed should consist mainly of either frequencies beyond the range of human hearing or frequencies that would normally be masked by other frequencies in the track anyway. With the most heavily compressed formats, including basic Dolby Digital and DTS (the standards on regular DVD), frequencies within the range of hearing are often affected as well, which has resulted in much variability in sound quality. However, Dolby Digital Plus, especially the 1.5 Mb/s variety found on a disc like the 'Transformers' HD DVD, uses much more efficient encoding techniques at a very high bit rate. The people who actually make these movie soundtracks have found it pretty impressive, and yet average home listeners seem to believe with absolute certainty that the home theater speakers in their living rooms would be capable of resolving with precision the mathematical difference between a high bit rate Dolby Digital Plus track and a lossless one, and that their golden audiophile ears would also be capable of discerning it. Personally, I would like to put these people to a properly controlled blind test, where all of the audio levels have been carefully matched to the same volume, and then see how well their hearing fares.
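For anyone curious what such a properly controlled test actually involves, here is a minimal sketch in Python (using numpy) of the two essential ingredients: matching the loudness of the two clips, and hiding which one is playing. Everything in it -- the function names, the generated stand-in signals -- is illustrative, not any lab's real test rig; actual decoded soundtrack audio would take the place of the random arrays.

```python
import numpy as np

def rms(x):
    """Root-mean-square level of a signal."""
    return np.sqrt(np.mean(np.square(x)))

def level_match(a, b):
    """Scale clip b so its RMS level matches clip a's.
    Even a fraction of a decibel of extra loudness biases listeners
    toward the louder clip, so levels must be equalized first."""
    return a, b * (rms(a) / rms(b))

def abx_trial(a, b, rng):
    """One ABX trial: X is secretly either a or b, chosen at random.
    Returns the mystery clip and the correct answer; actually playing
    the audio and collecting the listener's response is left out."""
    answer = rng.choice(["a", "b"])
    x = a if answer == "a" else b
    return x, answer

# Stand-ins for decoded audio from the lossless and DD+ tracks.
rng = np.random.default_rng()
lossless = rng.standard_normal(48_000)
dd_plus = 0.9 * lossless + 0.001 * rng.standard_normal(48_000)

a, b = level_match(lossless, dd_plus)
x, answer = abx_trial(a, b, rng)
```

Run enough trials and chance stops being an excuse: with sixteen trials, a listener needs twelve correct answers before the result beats random guessing at the usual statistical threshold. My suspicion is that very few golden ears would get there.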
I would not claim that all DD+ tracks are flawless or transparent to their masters; it does take some effort to encode them properly. But to dismiss the format out of hand simply because the soundtrack isn't labeled as lossless or uncompressed demonstrates an ignorance of the technology being used. If the audio codec were the only important criterion for sound quality, how could a disc like 'Dinosaur', with a 48 kHz / 24-bit PCM 5.1 track, sound so underwhelming? With specs like those, why isn't that disc a spectacular audio showcase? Somehow I doubt you'll find too many critical listeners who would ever claim that 'Dinosaur' sounds better than 'Transformers', but based on the specs, shouldn't it? Perhaps it's time we all realize that there's more to quality than the specs can tell us.
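A little arithmetic shows what's actually at stake in the specs being compared here. These are illustrative round figures, not numbers taken from either disc:

```python
# Back-of-the-envelope data rates for a 5.1 soundtrack (assumed figures).
SAMPLE_RATE = 48_000   # Hz
BIT_DEPTH = 24         # bits per sample
CHANNELS = 6           # 5.1

pcm_bps = SAMPLE_RATE * BIT_DEPTH * CHANNELS   # uncompressed PCM
ddplus_bps = 1_500_000                         # DD+ at 1.5 Mb/s

print(f"PCM 5.1: {pcm_bps / 1e6:.3f} Mb/s")    # 6.912 Mb/s
print(f"DD+:     {ddplus_bps / 1e6:.3f} Mb/s") # 1.500 Mb/s

# Disc space for a film of roughly 'Transformers' length (~144 minutes):
runtime_s = 144 * 60
print(f"PCM: {pcm_bps * runtime_s / 8 / 1e9:.1f} GB")     # ~7.5 GB
print(f"DD+: {ddplus_bps * runtime_s / 8 / 1e9:.1f} GB")  # ~1.6 GB
```

Even granting that a TrueHD track would pack that PCM data losslessly into something like half the raw size, the perceptual codec still frees up several gigabytes for video and extras. The only question that matters is whether anyone can actually hear what was traded away, and that question is settled by listening, not by the packaging.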
Yet we see the same thinking applied to matters of video. How many more arguments must there be about the different video compression codecs? Proponents on one side proclaim the infallible superiority of VC-1 above all other options, while those opposed insist that VC-1 is garbage and only AVC MPEG-4 is any good. Both camps attempt to prove their point by capturing screen shots on their computers, which they run through Photoshop to crop, zoom, filter, and distort in all manner of convoluted ways in order to locate individual errant pixels, completely invisible to the naked eye in the normal course of movie watching, and heartily declare their victory in the debate.
The truth of the matter is that all video compression codecs have the same purpose: to accurately represent the source using a fraction of the storage space. In the hands of a good operator, both VC-1 and AVC are more than capable of achieving this goal. Even the dated MPEG-2 codec has been known to deliver excellent results (owners of the now-defunct D-Theater tape format sure didn't seem to have any problem with it). There are plenty of examples of "reference quality" transfers using any of the above, from 'King Kong' (VC-1) to 'Final Fantasy' (AVC) to 'Kingdom of Heaven' (MPEG-2). In all cases, the skill of the compressionist and the quality of the work are more important than the codec used to get there.
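To appreciate how much heavy lifting any of these codecs performs, consider a rough calculation for a typical two-hour 1080p film (assumed round figures throughout):

```python
# How much compression does a High-Def feature actually require?
# Illustrative arithmetic for 1080p24 video in 8-bit 4:2:0 form.
WIDTH, HEIGHT, FPS = 1920, 1080, 24
BITS_PER_PIXEL = 12    # 4:2:0 chroma subsampling at 8 bits

raw_bps = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS
runtime_s = 2 * 3600   # a two-hour feature

raw_gb = raw_bps * runtime_s / 8 / 1e9
print(f"Raw video rate:  {raw_bps / 1e6:.0f} Mb/s")  # ~597 Mb/s
print(f"Two-hour total:  {raw_gb:.0f} GB")           # ~537 GB

video_budget_gb = 25   # say, what's left for video on a 30 GB disc
print(f"Compression needed: {raw_gb / video_budget_gb:.0f}:1")  # ~21:1
```

Squeezing more than 500 GB of raw picture into a 25 GB video budget means a roughly twenty-to-one reduction, and whether VC-1, AVC, or MPEG-2 performs that reduction gracefully depends far more on the encoder settings and the person driving them than on the name of the codec.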
That skill is also more important than the bit rate. As far as I'm concerned, Sony's decision to incorporate a bit rate meter in their PS3 Blu-ray player is one of the worst things to have ever happened to the home theater hobby. Because of that one seemingly innocuous and frequently inaccurate data display, now just about anyone, no matter how technologically ignorant, can believe themselves to be an expert in the field of video reproduction, based on nothing more than whether the bit rate meter reads a high number or a low one -- as if that number were even relevant. The whole point of video compression is to squeeze a High Definition picture into as little space as possible. A compressionist who's maintained a high-quality picture with a low bit rate has done an excellent job, but that's a point lost on most consumers, who assume that a good picture needs a high bit rate, regardless of what they actually see on their TV screens. The bit rate alone is a meaningless statistic and says nothing about the quality of the compression work. It is equally possible to create a lousy video image with a high bit rate, or a great image with a low bit rate, depending on the complexity of the content and how well the work is done. I found it extremely amusing to read complaints about the low bit rate used on 'TMNT', a disc with a razor-sharp and amazingly detailed picture that some owners nonetheless decried as "soft" against the evidence of their own eyes, for no reason other than an ill-founded assumption that the picture would have been even sharper if the bit rate meter spiked a little higher. How would they know? Have they compared it against the studio master?
This misconception has reached such heights of absurdity that certain viewers have started petitions demanding that Warner Bros. stop using the same video encodes on HD DVD and Blu-ray, and instead "maximize" the bit rates on their Blu-ray releases if the extra disc space is available. But for what purpose? Video compression doesn't work on a linear scale. Using advanced codecs like VC-1 and AVC, there are diminishing returns above a certain point, and throwing more bits at a picture that doesn't require them accomplishes nothing more than making the meter number go up. As time goes on, compression tools and techniques become more efficient, requiring even less space to achieve visual transparency to the original master. Warner Bros. has many times over demonstrated outstanding results within the 30 GB limit of HD DVD, even on very long films such as 'Troy: Director's Cut', a movie that runs 3 1/2 hours and yet fits comfortably on a 30 GB disc with beautiful picture quality, despite also squeezing in a lossless Dolby TrueHD audio track and a bunch of supplements. So what if the Blu-ray edition has an extra 20 GB of space available? Are we watching the movie or watching the bit rate meter? If there were no bit rate meter, would anyone have a legitimate basis to complain?
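A quick sanity check on the 'Troy' example, again using assumed round numbers rather than the disc's actual allocation:

```python
# Rough average bit rate available on a 30 GB HD DVD for a 3.5-hour film.
disc_bits = 30e9 * 8        # HD-30 capacity in bits (decimal gigabytes)
runtime_s = 3.5 * 3600      # roughly 3 1/2 hours

total_mbps = disc_bits / runtime_s / 1e6
print(f"Average total bit rate: {total_mbps:.1f} Mb/s")  # ~19 Mb/s

# Guess ~4 Mb/s for the TrueHD track plus supplements and overhead:
video_mbps = total_mbps - 4
print(f"Left for the video:     {video_mbps:.1f} Mb/s")  # ~15 Mb/s
```

Something in the neighborhood of 15 Mb/s of well-spent VC-1 sits squarely in the territory where those diminishing returns take over, which is exactly why the extra 20 GB buys so little that the eye can see.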
Back when they were supporting both High-Def formats, Paramount actually did what these users are demanding. They authored every movie separately for HD DVD and Blu-ray, each maximized to its format's potential. And what were the results? The same movie looked identical on the bit-rate-maximized Blu-ray and on the lower-bit-rate HD DVD. Once again, the quality of the compression trumped other considerations regarding tech specs or bit rate.
Don't get me wrong, I'm not trying to imply that all HD DVDs and Blu-rays are perfect now. Video artifacts do occur, and the studios have been known to rest on their laurels and allow shoddy work to slip through. Sometimes disc space really does strain the limits of what a studio wants to include on a High-Def title. It's important to scrutinize their results, lest we return to a state where the original 'Fifth Element' Blu-ray is considered acceptable. But it's equally important to understand what we're actually looking at. Many times, the "artifacts" picked apart by viewers have nothing to do with video compression or encoding whatsoever, but rather are issues found in the source, such as natural film grain, which isn't a flaw at all. Yes, a soft picture can be the result of poor compression or excessive filtering, but it can also be the result of soft focus photography. A heavily-grainy image could be overcompressed, or it could be stylistically intentional. Not every movie is photographed to look exactly the same as every other, and even within a film certain shots or scenes may look different than others. We must understand what a movie is supposed to look like before we can judge how well a video disc reproduces it. Being moderately proficient at manipulating still images in Photoshop does not necessarily qualify someone as an expert in the art of filmmaking.
I'm not suggesting that viewers should relax their standards or accept substandard quality as "good enough" when it's really not, but the technical specs alone simply do not tell the whole story, and over-emphasizing them is a matter of misplaced priorities. We should judge these discs by the actual quality they deliver, not by misleading statistics like the bit rate or the specs listing on the packaging. Surely, that can't be too much to ask.
Discuss this article in our forums, or check out other recent discussions.
Got a question you'd like to see Josh Zyber answer in a future column? Send it to us via our Feedback form.
Joshua Zyber's opinions are his own and do not necessarily reflect those of this site, its owners or employees. To view a complete collection of Josh's commentaries for High-Def Digest, click here.