I'm not even quite sure how to phrase my question, but here goes...


So my DAC has LEDs for 44, 96, 176 and so forth. I tried to get an understanding of how the different rates affect quality, but quickly got confused by bit depth, FLAC files, Redbook, and other terminology, all of which play into the equation.
Can anyone point me to a dumbed-down, digital-for-dummies kind of resource?
Thank you.
The sample rate means how many times per second a sample is taken. Say you measure your pool's chlorine every day.

So your sample rate is 1/day, or 365/year.

44,100/second is the CD sample rate: 44,100 times a second, a measurement is taken and recorded. Other common rates are 48,000 and 96,000. These are usually expressed in kilohertz, like 96 kHz.

The bit depth determines the precision of each measurement: how finely we can record the signal. More bits, more resolution.
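
To put rough numbers on that, here's a quick Python sketch (the ~6 dB-per-bit figure is the usual rule of thumb for dynamic range, not an exact measurement):

```python
# Rough illustration: bit depth -> number of quantization levels and
# approximate dynamic range (~6.02 dB per bit is the usual rule of thumb).
for bits in (16, 24):
    levels = 2 ** bits                # distinct amplitude steps available
    dynamic_range_db = 6.02 * bits    # approximate dynamic range in dB
    print(f"{bits}-bit: {levels:,} levels, ~{dynamic_range_db:.0f} dB dynamic range")
```

That works out to 65,536 levels (~96 dB) for 16-bit and about 16.8 million levels (~144 dB) for 24-bit.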

So 44/16 (44.1 kHz sample rate, 16-bit depth) refers to the CD standard. I occasionally see online radio stations offer 44/24 or 96/16.
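
And to show how the two numbers combine, here's a back-of-the-envelope sketch of the raw, uncompressed data rates (FLAC would losslessly shrink these to roughly half, but that's a separate topic):

```python
# Back-of-the-envelope raw PCM data rate: sample_rate x bit_depth x channels.
formats = {
    "CD (44.1 kHz / 16-bit)": (44_100, 16),
    "Hi-res (96 kHz / 24-bit)": (96_000, 24),
}
channels = 2  # stereo
for name, (rate_hz, bits) in formats.items():
    bits_per_second = rate_hz * bits * channels
    print(f"{name}: {bits_per_second / 1000:,.0f} kbps raw")
```

That gives about 1,411 kbps for CD and about 4,608 kbps for 96/24.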

Honestly, with modern DACs, 44/16 sounds so good it gets harder to justify more.
Erik,
Thank you for taking the time to give a concise, understandable explanation.
Really like the pool/chlorine analogy!

Hah! I really thought that was my worst analogy ever.