You are mistaken.
The colder a CPU runs, the better: conductivity improves as the temperature drops.
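To put rough numbers on that, here is a minimal sketch using the standard linear resistivity approximation R(T) = R0 * (1 + a * (T - T0)); the copper coefficient, the reference resistance, and the temperatures below are assumed illustrative values, not measurements from any particular chip:

```python
# Rough illustration only: linear approximation R(T) = R0 * (1 + tcr * (T - T0))
# for a metallic conductor. The linear model is only a crude guide at cryogenic
# temperatures, but it shows the trend: colder metal, lower resistance.

def resistance(r0: float, tcr: float, t: float, t0: float = 20.0) -> float:
    """Estimate resistance at temperature t (deg C) from reference r0 at t0."""
    return r0 * (1 + tcr * (t - t0))

R0 = 1.0        # ohms at 20 deg C (arbitrary reference value)
TCR_CU = 0.0039 # per deg C, typical temperature coefficient for copper (assumed)

for temp in (20.0, 0.0, -40.0, -196.0):  # room temperature down to LN2 territory
    print(f"{temp:7.1f} C -> {resistance(R0, TCR_CU, temp):.3f} ohm")
```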
To damage it through cold alone, you would have to cool it until it reached a state of matter called a Bose-Einstein condensate, and that is measured in mK (millikelvins) - not achievable outside a lab. That, or be stupid enough to use a compressor and let water condense around the CPU block, which will lead to a short circuit.
Um, no. What you say would be true if they were made in one piece. Since they are not, on some CPUs the cold makes parts contract, extremely small gaps open up between them, resistance increases, and timing errors can occur - which leads to 'cold bugs'. AMD is known to have problems with this; Intel, not so much AFAIK.
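For a sense of scale, here is a back-of-the-envelope sketch of that contraction; the expansion coefficients, package dimension, and temperature swing are assumed, generic values, not data for any specific CPU:

```python
# Minimal sketch (assumed values): differential thermal contraction between a
# silicon die and a copper heat spreader over a large temperature drop. The
# mismatch in how much each part shrinks is what stresses joints and contacts.

ALPHA_SI = 2.6e-6   # 1/K, approximate linear expansion coefficient of silicon
ALPHA_CU = 16.5e-6  # 1/K, approximate linear expansion coefficient of copper

LENGTH_MM = 30.0    # assumed package dimension
DELTA_T = -200.0    # cooling from roughly 20 C down to roughly -180 C

shrink_si = abs(ALPHA_SI * LENGTH_MM * DELTA_T)  # contraction of the die (mm)
shrink_cu = abs(ALPHA_CU * LENGTH_MM * DELTA_T)  # contraction of the spreader (mm)

print(f"silicon contracts {shrink_si * 1000:.1f} um")
print(f"copper  contracts {shrink_cu * 1000:.1f} um")
print(f"mismatch          {(shrink_cu - shrink_si) * 1000:.1f} um")
```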
John Johnson - Master of Synth.Foods-Convoy|049
Hans Adler - Synth Foods escort wing