AmigaNut said:
Put it this way: a pair of shoes is stiff and tight when new. After some wear it loosens up and flexes more easily.
Same for speakers: the initial stiffness breaks down with use and the suspension becomes more pliable.
Simples.
Sounds plausible, but in practice run-in happens very quickly:
http://www.audioholics.com/education/loudspeaker-basics/speaker-break-in-fact-or-fiction
"Required break in time for the common spider-diaphragm-surround is typically on the order of 10s of seconds [sic] and is a one-off proposition, not requiring repetition. Once broken in, the driver should measure/perform as do its siblings, within usual unit-to-unit parameter tolerances.
Probably the most common approach used by manufacturers who purposely take the time to break in raw drivers is to apply a sine wave signal, at a frequency equivalent to the unit's free air resonance, delivered at amplitude sufficient to thoroughly stretch out the spider, without damaging the unit, of course.
An alternate approach referred to in the literature is the use of broad band noise. However, this approach is inefficient when compared to the sine-wave-at-free-air-resonance approach.
Break in, however, isn't necessarily a discrete step, purpose built in to the driver or loudspeaker system manufacture process. Does that mean loudspeaker systems produced by a manufacturer that doesn't break in the drivers require breaking in by the consumer? No, not necessarily.
Quite often, spider break in occurs when the driver is tested, before and/or after placement in the cabinet for which it's intended. Driver testing by signal stimulus at some point (or points) in the manufacturing process - if done at levels sufficient to break in the spider - generally makes further break in unnecessary. Hence, a finished system will not - in so far as its drivers are concerned - require further break in by a consumer once taken home from the dealer."
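To make the sine-at-free-air-resonance idea concrete, here's a minimal sketch of generating such a test tone. The 30 Hz resonance, 30-second duration, and 48 kHz sample rate are hypothetical placeholders, not values from the article; a real driver's free-air resonance (Fs) comes from its datasheet or a measurement.

```python
import math

def break_in_tone(fs_resonance_hz=30.0, duration_s=30.0, sample_rate=48000):
    """Generate a full-scale sine wave at the driver's free-air resonance.

    fs_resonance_hz -- hypothetical Fs; use the driver's measured value
    duration_s      -- tens of seconds, per the article's claim about break-in time
    sample_rate     -- playback sample rate in Hz
    Returns a list of float samples in [-1.0, 1.0].
    """
    n_samples = int(duration_s * sample_rate)
    return [
        math.sin(2 * math.pi * fs_resonance_hz * t / sample_rate)
        for t in range(n_samples)
    ]
```

In practice you'd scale the amplitude to a level that fully exercises the spider without exceeding the driver's excursion limits, which is exactly the caveat the article raises.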