How are bit rates related to sample rates?


Hours of reading haven't cleared up a dilemma: at a 44.1 kHz sampling rate x 2 channels x 16 bits, about 1,400 kbits of information should be produced per second when we play a CD (or about 1,400 kbits of space should be used on the CD for every second of music). However, my iTunes files show that songs use from 128 to 863 kbits/sec. Why? The lower bit rate songs were all downloaded from iTunes (no more of that). But why are songs ripped from CDs showing variations in the bit rate, all under 1,000 kbps? I'm lost.
manorraul
Probably because your iTunes preferences are set to import in a compressed format, such as MP3. In iTunes, go to the Edit menu, then Preferences, then Advanced, then the Importing tab, and set "Import Using" to "WAV Encoder." That will result in uncompressed bit-for-bit importing (aside from some additional bits that will be added as part of the .wav file format "wrapper").

Regards,
-- Al
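
To put numbers on that, the sketch below (Python) works through the arithmetic from the question. The 4-minute track length and the 128/320 kbps comparison rates are illustrative assumptions, not figures reported by iTunes.

# Sketch: uncompressed CD-audio data rate vs. typical compressed rates.
# The constants below are the standard Red Book CD parameters.

SAMPLE_RATE = 44_100      # samples per second, per channel
CHANNELS = 2              # stereo
BITS_PER_SAMPLE = 16

# Uncompressed rate: 44,100 x 2 x 16 = 1,411,200 bits/s (~1,411 kbps).
cd_bps = SAMPLE_RATE * CHANNELS * BITS_PER_SAMPLE
print(f"Uncompressed CD audio: {cd_bps / 1000:.0f} kbps")

# Size of a hypothetical 4-minute track at that rate (8 bits per byte).
track_seconds = 4 * 60
uncompressed_mb = cd_bps * track_seconds / 8 / 1_000_000
print(f"4-minute track, uncompressed: {uncompressed_mb:.1f} MB")

# For comparison, two common lossy bit rates (illustrative values only).
for label, kbps in [("typical download", 128), ("high-bitrate MP3", 320)]:
    mb = kbps * 1000 * track_seconds / 8 / 1_000_000
    print(f"4-minute track at {kbps} kbps ({label}): {mb:.1f} MB")

Anything displaying well below the roughly 1,411 kbps of raw CD audio has been compressed in some way, which is exactly what the import setting above controls.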
I would highly recommend that you use AIFF instead of WAV in the iTunes environment. They are both uncompressed, but AIFF stores track information and album art in the file itself, whereas for WAV that information lives only in iTunes' own library database.

This doesn't make a difference until you have to do things like back up files, move them to another machine, or similar operations. Then you may suddenly find yourself with WAV files that have no artist, album, or track information. I have not been in that situation since I don't use WAV, but I have read plenty of stories from people who have had to re-rip their entire library (for me that would be a disaster: classical music requires manual editing, since there is no consistency in the CDDB database).

You can always ask iTunes to convert your files to WAV (or MP3) in the future if that is needed for some reason, but AIFF is the best way of ensuring that you won't lose anything (metadata or music bits) in the meantime.