Ripping CDs to lossless in iTunes.. HiFi approved?


Hi,

The name says it all.

I want to rip my CDs once, and do it right. I started with EAC, but it's complicated to get it working with Apple Lossless and to get the tags right.

So... I switched over to iTunes directly, ripping CDs to Apple Lossless.

Without getting too "audiophile abstract," is there anything wrong with these files?
goatwuss
If I am not mistaken, if you start out with WAV or AIFF in iTunes, on a Mac or PC, and transmit the file over a network to an AirPort Express or Apple TV, you are unknowingly transmitting Apple Lossless. I assume this is not the case when connecting your computer to a DAC via USB or Toslink, as Apple Lossless is not supported by any DAC I know of. iTunes does this, I assume, because transmitting a file half the size is faster and easier. Maybe that is why I cannot hear the difference when I A/B AIFF and Apple Lossless in my system.

Not to start an argument, but AIFF and WAV are both native CD Red Book formats: AIFF is the Mac and Silicon Graphics native format, WAV is Microsoft's. If WAV doesn't support tags (I'm not sure on this, as I don't use it), you are making a LOT of extra work for yourself.
Does it matter that for FLAC, they can reconstruct a bit-exact copy of the original?
Can the same be said of Apple Lossless? If yes, then they should be 'equal'. And equal to an uncompressed .WAV or other format.
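For what it's worth, bit-exact reconstruction is the defining property of any lossless codec (FLAC and Apple Lossless both claim it), and the property itself is easy to demonstrate in code. Here is a toy sketch in Python using zlib as a stand-in for a real audio codec, with made-up PCM bytes as the "audio": if decompressing doesn't return the exact original bytes, the codec isn't lossless.

```python
import zlib

# Fake one second of 16-bit stereo PCM (44100 frames x 4 bytes) as test data.
pcm = bytes((i * 31) % 256 for i in range(44100 * 4))

compressed = zlib.compress(pcm)         # "encode" losslessly
restored = zlib.decompress(compressed)  # "decode"

print(restored == pcm)             # True: bit-exact reconstruction
print(len(compressed) < len(pcm))  # True: and the file still got smaller
```

The point of the sketch: "smaller file" and "identical bits after decoding" are not in tension for a lossless format, any more than they are for a zip archive.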
Magfan, I think that bit-exact IS important. I know I hear large differences when I turn on error correction while ripping to lossless, but I'm not sure the result is bit-exact. I can tell you that it's certainly almost exact, and I DO hear differences when I turn error correction off.

I wonder if anyone here knows how exact error correction is with lossless.

Dave
Please don't use Apple Lossless--it is by far the worst of the lossless formats and there is a difference. It seems to strangle the life out of my recordings.

However, after retesting on my HD600 headphones and my reference system (dCS Delius + Purcell + B&W N802), I cannot for the life of me tell the difference between a .flac and a .wav file. The same goes for .aiff, which is exactly like .wav except you can tag the files. If you want to save space and can do without iTunes, .flac is the only way to go, in my opinion. Use Exact Audio Copy. If you have the space, in iTunes .aiff is the best format because it is totally uncompressed but the files themselves can be tagged with track and artist information.

Now, with respect to error correction and bit-perfect rips, I am not an expert, but I think that when you talk about "bit-perfect" error correction and "lossless" compression you are talking about two different things.

Bit-perfect and error correction deal with the initial rip of the CD. This is the process by which a program reads and rereads suspicious areas that might contain errors from dust, damage, scratches, etc., until it comes up with the right answer. EAC rereads these areas up to 8 times, I believe. Because error correction tackles the errors one by one, over and over, it is possible that a disc that would generate audio artifacts when played in real time (due to dirt, dust, scratches, and other imperfections) can be perfectly extracted into an audio file. The difference between a corrected rip and a CD played in real time is that with the rip, the computer has time to think things over, if you will.
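The re-read idea above can be sketched in a few lines of Python. To be clear, this is NOT EAC's actual algorithm (secure rippers use drive caching tricks, C2 pointers, and checksums, none of which are modeled here); it is only a toy illustration of how repeated reads plus a per-byte majority vote can recover a clean sector even though every single pass came back with errors. The simulated sector and error pattern are invented for the demo.

```python
from collections import Counter

SECTOR = bytes(range(64))  # pretend this is the true audio data on the disc

def read_sector(pass_no: int) -> bytes:
    """Simulate one pass over a dirty sector: on each pass, a different
    third of the bytes comes back corrupted (a stand-in for dust/scratches)."""
    out = bytearray(SECTOR)
    for i in range(pass_no % 3, len(out), 3):
        out[i] ^= 0xFF  # flip the bits of this byte
    return bytes(out)

def rip_with_rereads(passes: int = 8) -> bytes:
    """Re-read the sector several times and keep the most common value seen
    for each byte position, loosely mimicking how a secure ripper retries
    suspect areas until it settles on an answer."""
    reads = [read_sector(p) for p in range(passes)]
    return bytes(Counter(r[i] for r in reads).most_common(1)[0][0]
                 for i in range(len(SECTOR)))

print(rip_with_rereads() == SECTOR)  # True: every read had errors, the rip has none
```

This is the sense in which a ripper "has time to think things over": a real-time player gets one shot at each sector, while a ripper can keep rereading until the votes converge.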

Now, after the initial rip, regardless of whether you error-corrected or not, the computer generates, at least temporarily, a raw .wav or .aiff file which is then converted into the lossless format. iTunes does this transparently, but if you use a program like EAC, which claims bit-perfect rips, you will see that after the initial file is "perfectly" ripped off the CD, EAC actually launches an external program that converts the .wav file into the format you want.

So really there are two issues. First, what impact do error correction and "bit perfect" rippers have on the uncompressed sound files? Second, what impact does converting to a lossless format have on those files?
Here is another question: if there is a difference in sound quality between raw .wav/.aiff and the lossless formats, is the difference the result of the encoding process or the decoding process?

It seems to me that if "lossless" is actually lossless, then the following MUST be true: a .wav file, converted to a lossless file, and then converted back into a .wav file, should sound identical to a duplicate of the original .wav file that was never compressed. Try it!

If that is true, and these lossless formats do what they are supposed to do, then the only explanation for any difference in sound quality is that at the time of playback, the process of decompressing the lossless format impacts the sound.

It can't just be a question of CPU power, because I am assuming there is some kind of memory buffer. Further, I host my files on two machines--one is a quad core at 2.6GHz with 2GB of RAM and the other is a Core 2 Duo at 3.4GHz--and I can still hear a difference in the Apple Lossless files.