There's a delay as soon as there's any difference in cable length, but is it an audible one? Let's say you can hear up to 20kHz (in other words, you're young). To get 180° out of phase (which you'd definitely hear, as the sounds would cancel out) would take half the gap between soundwave peaks, i.e. half the period of a 20kHz tone: 1/40,000th of a second. (I pick the upper end of the frequency spectrum because the lower the frequency, the longer the time needed to get out of phase, and therefore the longer the difference in cable run would have to be.)
If the signal travels at (say) 80% of the speed of light, the question is how long a cable would need to be before it took 1/40,000th of a second to travel it: (299,792,458 m / 40,000) × 0.8. My maths suggests that's just under 6km, and I think you'd have far bigger problems (impedance etc.) trying to run a system over a cable of that length than phase.
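If anyone wants to check the arithmetic, here's a quick Python sketch of the same sums (the 20kHz hearing limit and 80% velocity factor are just the assumptions above, not measured values):

```python
# Back-of-the-envelope check: how much extra cable gives a 180-degree
# phase shift at the top of the hearing range?
C = 299_792_458              # speed of light in a vacuum, m/s
VELOCITY_FACTOR = 0.8        # assumed signal speed in the cable, as a fraction of c
F_MAX = 20_000               # Hz, assumed upper limit of hearing

half_period = 1 / (2 * F_MAX)             # time for a 180-degree shift: 1/40,000 s
signal_speed = C * VELOCITY_FACTOR        # metres per second in the cable
extra_cable = signal_speed * half_period  # metres of extra cable needed

print(f"Half-period at {F_MAX} Hz: {half_period * 1e6:.0f} microseconds")
print(f"Extra cable for 180 degrees: {extra_cable:,.0f} m (~{extra_cable / 1000:.1f} km)")
# -> roughly 5,996 m, i.e. just under 6 km
```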
As to how far out of phase a signal would need to be before it started messing with your stereo image, I have no idea, but I suspect it would be almost impossible to test, as the cable lengths involved would be uneconomic and electrically impractical.
I suspect the differences caused by having your speakers at different distances from you are more noticeable (in theory), as sound travels so much more slowly, but in practice you'd need a ridiculously large room for it to become a real issue in a domestic setting.
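For comparison, the same half-period sum for sound in air (assuming roughly 343 m/s at room temperature) shows just how much more slowly sound moves, and why speaker placement matters far more than cable length, in theory at least:

```python
# Same 1/40,000 s interval, but for sound travelling through air rather
# than a signal travelling down a cable.
SPEED_OF_SOUND = 343             # m/s, approximate value at room temperature
half_period = 1 / (2 * 20_000)   # s, as before

path_difference = SPEED_OF_SOUND * half_period
print(f"Speaker distance difference for 180 degrees at 20kHz: {path_difference * 1000:.1f} mm")
# -> under a centimetre of air, versus nearly 6 km of cable, for the same time difference
```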
I can’t tell you how irritated I’ll be if I’ve c*cked the sums up!