Just back from a mate's house who has recently got a few of the Motown Singles collections. He has them all ripped to Apple Lossless and plays them through iTunes. I noticed something about the bit rates: the lowest was 415 kbps, mostly on spoken-word tracks. However, if you went to the actual file and hovered the mouse over it, the track info would pop up showing Sample Rate 44.1 kHz, Sample Size 16 Bit and Bit Rate 1411 kbps (CD). Why is there a difference between what iTunes is showing and what it says on the file?
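For what it's worth, I think the 1411 figure is just the raw CD stream rate (sample rate × bit depth × channels), while the lower per-track numbers are the compressed Apple Lossless rates. A quick sketch of the arithmetic, assuming that's how iTunes derives it:

```python
# Sketch of where the "1411 kbps (CD)" figure likely comes from
# (my assumption: iTunes reports the uncompressed source rate here,
# not the Apple Lossless file's actual compressed rate).
sample_rate = 44_100   # samples per second (44.1 kHz)
sample_size = 16       # bits per sample
channels = 2           # stereo

cd_bitrate_kbps = sample_rate * sample_size * channels / 1000
print(cd_bitrate_kbps)  # 1411.2, displayed as 1411 kbps
```

If that's right, the 415 kbps track isn't lower quality at all; lossless compression just squeezes simple material (like spoken word) much harder.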
How important are bit rates as an indicator of music quality? Checking my own library, I also have tracks ranging from the 400s up to over a thousand kbps, with everything having been ripped from CD in Apple Lossless.