Which is more accurate: digital or vinyl?


More accurate, mind you, not better sounding. We've all agreed on that one already, right?

How about more precise?

Any metrics or quantitative facts to support your case are appreciated.
mapman
Yes. Nyquist assumes an analog sample of unlimited resolution, not a 16-bit sample, so its application to digital audio is not exact. Ah, people don't like to talk about this! Or they do, but it just turns into a ridiculous argument. I suggest anyone look into the life of Nyquist:
http://en.wikipedia.org/wiki/Harry_Nyquist

(you will note that Nyquist had no concept of digital audio back when he proposed his sampling theorem)

and

http://en.wikipedia.org/wiki/Nyquist%E2%80%93Shannon_sampling_theorem#The_sampling_process

If you read carefully, you will note that the samples are not defined as '16 bit', instead they are samples of the 'bandwidth-limited' signal, which have an analog value.

Now 16 bits can define a fairly precise value, but that is by no means the same as saying it can define the exact value. Further, the significance of 'bandwidth limited' should not be ignored. Current Redbook specs put the sampling frequency at 44.1kHz; if you think about it, the significance is that anything above about 19-20kHz is ignored. It is not so much that Nyquist is out to lunch as that the Redbook specs are poorly applied.
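To put a number on the "precise but not exact" point, here is a little Python sketch of my own (not from anyone in this thread): a 16-bit sample can land no closer to the true analog value than half a quantization step.

```python
# My own illustration: quantizing an "exact" analog value to 16 bits
# and measuring the rounding error that remains.

full_scale = 1.0                # assume a +/-1.0 V analog range
levels = 2 ** 16                # 65,536 discrete levels in 16 bits
step = 2 * full_scale / levels  # size of one quantization step

analog_value = 0.123456789      # an arbitrary "exact" analog sample
code = round(analog_value / step)          # nearest 16-bit code
reconstructed = code * step                # the value a DAC can actually produce
error = abs(analog_value - reconstructed)  # quantization error, always <= step/2

print(f"step size     : {step:.2e}")
print(f"reconstructed : {reconstructed:.9f}")
print(f"error         : {error:.2e}")
```

The error is tiny, but it is never zero in general — which is exactly the gap between Nyquist's analog samples and 16-bit samples.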

The Redbook specs were created in the late 1970s and early 1980s. Seems to me I heard one of the first CD players about 1981. Back then, the IBM PC was king; a $10 cell phone has *considerably* more computing power! IOW, Redbook was **intentionally** limited in order to cope with the limitations of the hardware of the day. It is quite anachronistic that we still take it seriously today...
If I sit and play an instrument for recording purposes onto analog tape, I will record all that I play. Is this also true for digital recording, or is the device recording parts of the sound I am playing (sampling) and the computer puts it together, sort of like digital morphing of one image into another? If it is the latter, then why call it a sample? You are just asking for trouble and confusion.
"If I sit and play an instrument for recording purposes onto analog tape, I will record all that I play. Is this also true for digital recording, or is the device recording parts of the sound I am playing (sampling) and the computer puts it together, sort of like digital morphing of one image into another? If it is the latter, then why call it a sample? You are just asking for trouble and confusion."

Both are somewhat imperfect reproductions of the original using two different approaches. The question is always "how imperfect?" and how much do whatever the differences are matter? That is true whether the approach is digital or analog. We live in an imperfect world; there is no such thing as a perfect reproduction in almost any case. The 16-bit sample size is perhaps the prime bottleneck of the CD Redbook format, but as Atmasphere noted, 16 bits gets you a lot of resolution, i.e., 2^16 individual levels.
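To put numbers on that resolution point (my own quick sketch, not from the thread): 2^16 levels works out to roughly 96 dB of dynamic range by the usual 20*log10 rule of thumb, about 6 dB per bit.

```python
import math

bits = 16
levels = 2 ** bits                          # 65,536 discrete levels
dynamic_range_db = 20 * math.log10(levels)  # ~6.02 dB per bit

print(levels)                       # 65536
print(round(dynamic_range_db, 1))   # 96.3
```

Whether that last bit or two of resolution is audible is, as noted, the subjective part of the argument.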

I think that bottleneck can be heard in some cases, but not all, and it is very difficult to detect when digital is done right. At least, that is my subjective assessment, having heard both really good analog and really good digital.
Atmasphere,

Just wondering, how good is your hearing at 19-20kHz?

I am 52 years old. I do not hear those frequencies anymore as best I can tell.

When I was a young punk 18 year old budding audiophile, I recall getting up there pretty good with test tones and such.

Maybe that's why good systems sound better than ever to me these days in general?
11-04-11: Hevac1
When recording say a violin, is the first sample taken at the start of a note played and does it also sample at the very end of the note regardless of the samples in between? If it does not then how can digital play back components perform proper decay and bloom of the music played regardless of the sample rate?
There is no synchronization between the start or end of a musical note, or anything else involving the timing of the music, and when the samples are taken. But keep in mind that notes don't start infinitely fast, and don't end with infinite abruptness. The speeds that are involved correspond to the highest frequency components of the note. If all frequency components that are audibly perceptible can be captured with sufficient accuracy (whatever that may mean), then nothing is lost.
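To illustrate that last point with a sketch of my own (not something Al posted): a band-limited tone can be reconstructed at any instant *between* samples via Whittaker-Shannon (sinc) interpolation, no matter where the sample clock happened to fall relative to the note.

```python
import math

fs = 44100.0     # Redbook sample rate (Hz)
f = 1000.0       # a tone comfortably below the Nyquist frequency
n_samples = 2000

# The clock takes samples wherever it falls -- no alignment to the note.
samples = [math.sin(2 * math.pi * f * n / fs) for n in range(n_samples)]

def sinc(x):
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def reconstruct(t):
    """Whittaker-Shannon interpolation at an arbitrary time t (seconds)."""
    return sum(s * sinc(t * fs - n) for n, s in enumerate(samples))

t = 1234.5 / fs                        # a point exactly between two samples
exact = math.sin(2 * math.pi * f * t)  # what the analog waveform does there
approx = reconstruct(t)
print(abs(exact - approx))             # small; shrinks as n_samples grows
```

The residual error here comes only from truncating the (ideally infinite) sum, not from the sampling instants missing the start or end of the note.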
If I sit and play an instrument for recording purposes onto an analog tape I will record all that I play. Is this also true for digital recording or is the device recording parts of the sound (sampling) I am playing and the computer puts it together sort of like digital morphing of one image to another.
It will record (and the digital data will contain) all that you play, but only up to around 20kHz, and with accuracy that is less than perfect in a number of ways (quantization noise reflecting the finite number of bits per sample, frequency response ripple and phase shifts resulting in part from the low pass filtering that must precede the a/d converter to prevent aliasing, etc.).

The d/a conversion process does not, at least conceptually, involve adding information, combining images, interpolating between samples, or anything along those lines. Conceptually, once the digital data for each sample has been converted to a corresponding voltage, it just involves REMOVING (filtering out) ultrasonic (higher than 20kHz) frequency components that are present in combination with the musical information (at frequencies below 20kHz). It is the presence of those ultrasonic spectral components that distinguishes a sampled waveform from a continuous, non-sampled waveform containing the same information.
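A sketch of my own in Python (not anything Al posted) makes those image components visible: a sampled 1kHz tone, zero-stuffed to a 4x higher rate, shows an ultrasonic "image" at 43.1kHz that is just as strong as the wanted tone, and a low-pass filter (standing in conceptually for the reconstruction filter after a DAC) removes it.

```python
import math

fs = 44100.0                  # Redbook sample rate
f = 1000.0                    # audio-band tone
L = 4                         # upsampling factor (for illustration)
fs_hi = fs * L

base = [math.sin(2 * math.pi * f * n / fs) for n in range(4 * 441)]
stuffed = []
for s in base:                # zero-stuffing: keep samples, insert zeros
    stuffed.extend([s] + [0.0] * (L - 1))

def tone_mag(x, freq, rate):
    """Magnitude of one frequency component via a direct DFT sum."""
    re = sum(v * math.cos(2 * math.pi * freq * n / rate) for n, v in enumerate(x))
    im = sum(v * math.sin(2 * math.pi * freq * n / rate) for n, v in enumerate(x))
    return 2 * math.hypot(re, im) / len(x)

audio = tone_mag(stuffed, f, fs_hi)       # the wanted 1 kHz component
image = tone_mag(stuffed, fs - f, fs_hi)  # ultrasonic image at 43.1 kHz
print(audio, image)                       # equal: the image really is there

# Crude windowed-sinc low-pass at ~20 kHz (Hamming window), with gain L
# to restore amplitude; conceptually the reconstruction filter.
taps, cutoff = 101, 20000.0
h = []
for n in range(taps):
    m = n - (taps - 1) / 2
    x = 2 * cutoff / fs_hi
    v = x if m == 0 else math.sin(math.pi * x * m) / (math.pi * m)
    v *= 0.54 - 0.46 * math.cos(2 * math.pi * n / (taps - 1))
    h.append(v)

filtered = [L * sum(h[k] * stuffed[n - k] for k in range(taps)
                    if 0 <= n - k < len(stuffed))
            for n in range(len(stuffed))]
audio_f = tone_mag(filtered, f, fs_hi)
image_f = tone_mag(filtered, fs - f, fs_hi)
print(audio_f, image_f)       # ~1.0 vs. nearly zero: image filtered out
```

Nothing in the audio band was invented or interpolated by the filter; it only took the ultrasonic images away, which is Al's point.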

Your questions are good ones, though, as it's all pretty counter-intuitive.

Good answers from Ralph & Mapman, also.

Regards,
-- Al