Some people will tell you there can't possibly be any difference and therefore you're imagining it, but there is a slight difference in what happens when a lossless file is played back compared to an uncompressed one: the lossless file has to be unpacked as it plays. Those who believe a difference can be heard between these two file types (you're certainly not alone) postulate that the increased load on the replay device's processor, while obviously only a fraction of its capability, can lead to increased electrical noise, which would account for the difference in sound quality. Two things to note: 1) I don't think this has ever been proven; 2) I've not seen any other theories to account for the difference.
Luckily for you there's an easy fix, and you don't have to re-rip everything to AIFF. In iTunes (assuming that's what you're using to rip), change your import settings to AIFF, then select all the Apple Lossless tracks in your library (the View Options let you display the file kind to help with this), right-click and choose 'Create AIFF Version'. This creates an AIFF copy of every selected track. You can then delete the ALAC originals.
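If your library lives outside iTunes, the same batch conversion can be done with ffmpeg. This is just a sketch of that alternative route, not the iTunes method above: it assumes your ALAC files carry the usual .m4a extension, and it only builds the commands (you'd pass each one to `subprocess.run` to actually convert):

```python
from pathlib import Path

def aiff_conversion_commands(music_dir):
    """Build one ffmpeg command per .m4a file found under music_dir.

    ALAC rips are typically stored in .m4a containers (an assumption here).
    -c:a pcm_s16be writes big-endian 16-bit PCM, the standard AIFF encoding.
    """
    cmds = []
    for src in sorted(Path(music_dir).rglob("*.m4a")):
        dst = src.with_suffix(".aiff")
        cmds.append(["ffmpeg", "-i", str(src), "-c:a", "pcm_s16be", str(dst)])
    return cmds
```

Because the ALAC source is lossless, the resulting AIFF contains exactly the same audio data either way; only the container and packing change.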
Since the source files are all losslessly compressed, no data was thrown away when they were ripped (it was just packed in the most efficient way, like a zipped document), so the full uncompressed equivalent can be extracted and will be identical to what you'd have got had it been ripped to AIFF in the first place. Job done, paranoia (whether justified or not) gone. This is exactly what I did, for exactly the same reason!
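The "packed like a zip" point can be demonstrated with any general-purpose lossless codec. Here zlib from Python's standard library stands in for ALAC (purely for illustration; the principle is the same):

```python
import zlib

original = bytes(range(256)) * 1000  # stand-in for raw PCM audio samples

packed = zlib.compress(original, 9)    # "rip to lossless": smaller file
unpacked = zlib.decompress(packed)     # "play back / convert to AIFF"

# Bit-for-bit identical: nothing was thrown away, only the packing changed.
assert unpacked == original
print(len(packed) < len(original))  # the packed copy really is smaller
```

A lossy codec (MP3, AAC) would fail that final equality check, which is the whole difference: with lossless formats the round trip is exact, so converting ALAC to AIFF loses nothing.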