Is 16/44 good enough?

In the September 2007 issue of the Journal of the Audio Engineering Society (JAES), Brad Meyer and David Moran of the Boston Audio Society report on "a series of double-blind tests comparing the analog output of high-resolution players playing high-resolution recordings with the same signal passed through a 16-bit/44.1-kHz 'bottleneck.'" The authors used the ABX test methodology. The bottom line: no one could reliably tell the difference.
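To make the "bottleneck" idea concrete, here is a rough sketch of what such a requantization stage might look like. The code is my own illustration, not the apparatus from the paper; it reduces a float signal to 16-bit resolution with TPDF dither, the textbook way to decorrelate the quantization error, and then measures the resulting signal-to-noise ratio.

```python
import numpy as np

def bottleneck_16bit(x, dither=True):
    """Requantize a float signal in [-1, 1] to 16-bit resolution
    (optionally with TPDF dither) and return it as float again."""
    step = 1.0 / 32768.0  # one 16-bit LSB
    if dither:
        rng = np.random.default_rng()
        # TPDF dither: sum of two uniforms, peak amplitude +/- 1 LSB
        x = x + (rng.uniform(-0.5, 0.5, x.shape)
                 + rng.uniform(-0.5, 0.5, x.shape)) * step
    q = np.clip(np.round(x / step), -32768, 32767)
    return q * step

# A 1 kHz tone at 44.1 kHz, one second, 0.9 of full scale:
fs = 44100
t = np.arange(fs) / fs
x = 0.9 * np.sin(2 * np.pi * 1000 * t)
y = bottleneck_16bit(x)
err = y - x
snr_db = 10 * np.log10(np.mean(x**2) / np.mean(err**2))
```

With non-subtractive TPDF dither the total error power is step²/4, so the measured SNR lands in the low-90s dB, a few dB below the dither-free 16-bit figure but still far above the noise floor of typical playback chains.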

In another paper, from 2004, Bob Stuart takes a more theoretical approach and examines systems up to 24-bit/96-kHz. He discusses noise-shaped dither as well as 1-bit coding systems (e.g., DSD). It is worth reading if only to get a feel for the breadth and depth of the things that must be considered. Succinct recommendations are hard to find in this paper; however, an earlier 1997 version, published as a convention preprint, recommends 20 bits at 58 kHz, and argues that with proper noise-shaped dither, 14 bits is enough.
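The noise-shaping idea can be sketched with a first-order error-feedback quantizer. This is a minimal illustration of my own, not Stuart's design; real shapers use higher-order, psychoacoustically weighted feedback filters, which is how 14 bits can suffice perceptually.

```python
import numpy as np

def quantize_noise_shaped(x, bits=14):
    """Requantize with TPDF dither and first-order error feedback.
    Feeding back the previous quantization error gives the noise a
    (1 - z^-1) spectral shape, pushing it toward high frequencies
    where the ear is less sensitive."""
    step = 2.0 / (1 << bits)   # one LSB for a [-1, 1] signal
    rng = np.random.default_rng(0)
    out = np.empty_like(x)
    err = 0.0
    for n in range(len(x)):
        v = x[n] - err         # subtract the previous quantization error
        d = (rng.uniform(-0.5, 0.5) + rng.uniform(-0.5, 0.5)) * step
        q = np.round((v + d) / step) * step
        err = q - v            # error (including dither) to feed back
        out[n] = q
    return out
```

Computing the spectrum of the output error shows the noise tilted strongly toward the top of the band, with correspondingly less in the region where hearing is most acute.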

However, my friend Eric is uncomfortable with such extreme uses of noise-shaped dither. While it works in theory, downstream processing (e.g., sample rate conversion) may remove the dither, leaving all the quantization artifacts exposed.
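A quick way to see what is at stake if the dither is lost: without dither, the quantization error of a quiet tone is harmonic distortion correlated with the signal, while TPDF dither turns it into benign, noise-like error. This sketch (my own, purely illustrative) compares the two cases by measuring how concentrated the error spectrum is.

```python
import numpy as np

n = 1 << 14
k = 371                                    # a bin-aligned test frequency
step = 1.0 / 32768.0                       # one 16-bit LSB
x = 4 * step * np.sin(2 * np.pi * k * np.arange(n) / n)  # very quiet tone

rng = np.random.default_rng(1)
tpdf = (rng.uniform(-0.5, 0.5, n) + rng.uniform(-0.5, 0.5, n)) * step

plain = np.round(x / step) * step                # undithered truncation
dithered = np.round((x + tpdf) / step) * step    # TPDF-dithered

def concentration(err):
    """Fraction of error power in the 10 strongest FFT bins:
    high for discrete harmonic distortion, low for noise-like error."""
    p = np.sort(np.abs(np.fft.rfft(err)) ** 2)[::-1]
    return p[:10].sum() / p.sum()

c_plain = concentration(plain - x)     # error piles up in harmonic lines
c_dith = concentration(dithered - x)   # error spread evenly, like noise
```

The undithered error concentrates in a handful of harmonic lines, which is exactly the kind of signal-correlated artifact dither exists to prevent; if later processing strips the dither, that is what is left behind.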

Getting back to bits and Hz, Andy Moorer points out [need citation] that higher sample rates and greater bit depths make most digital signal processing operations easier to implement well, and hence the implementations found in commercial products may sound better.

More or less supporting this is the observation in the first paper (Meyer and Moran) that the high-resolution versions of albums do in fact sound better than their CD counterparts; the authors attribute this to greater care in production and reduced pressure to "sound good" under poor listening conditions, such as on iPods and car stereos.
