quote: Let's imagine a TEC that conducts NO heat. If we were to stick this TEC between a CPU and a heatsink, would heat fail to flow from the 'hot' side to the 'cold' side, and thus melt our CPU?
quote: Let's suppose we have a CPU that is water-cooled, with enough water flow that even a high wattage dissipated by the CPU barely raises the water temperature, so the water effectively stays at ambient. But instead of thermal compound, let's stick a piece of cardboard between the CPU and the waterblock. When the CPU is off it sits at ambient. We turn it on, and within a second the CPU generates enough heat to raise its own temperature by 5 degrees, so there is now a 5-degree differential between it and the water. Say a 5-degree differential lets 1 degree's worth of heat per second leak past the cardboard into the water; the leak is proportional to the differential. The CPU has been cooled by 1 degree, so it is 4 degrees above ambient. In the next second the CPU again heats itself by 5 degrees' worth. This time the differential is 9, so 1.8 degrees leak over, leaving it at 7.2. We add 5, and a 12.2-degree differential means 2.44 degrees leak out, leaving 9.76. And so on, until the CPU is 25 degrees hotter than the ambient water and dissipates heat into the water exactly as fast as it generates it.
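The quoted arithmetic converges geometrically, and it's easy to check by iterating it. Here is a minimal sketch in Python, assuming exactly the quote's numbers (1-second steps, 5 degrees of self-heating per second, and a leak of one fifth of the differential per second); these are the thought experiment's figures, not real hardware values:

```python
# A minimal sketch of the cardboard thought experiment, assuming the
# quote's numbers: the CPU heats itself by 5 degrees each second, and
# each second one fifth of the current differential leaks through the
# cardboard into the water (which stays fixed at ambient).

HEAT_PER_SECOND = 5.0      # degrees the CPU adds to itself per second
LEAK_FRACTION = 1.0 / 5.0  # fraction of the differential that leaks per second

diff = 0.0  # CPU temperature above the ambient water, in degrees
for second in range(1, 31):
    diff += HEAT_PER_SECOND       # CPU heats up
    print(f"second {second:2d}: differential = {diff:.2f}")
    diff -= diff * LEAK_FRACTION  # part of the heat leaks into the water

# Prints 5.00, 9.00, 12.20, 14.76, ... converging on 25.00, where the
# leak (25 / 5 = 5 degrees' worth per second) matches the heat generated.
```

The printed sequence reproduces the quote's 5, 9, 12.2, ... and approaches 25 because that is the fixed point of the update d → 0.8·d + 5: the steady state where heat in equals heat out.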
quote: Let's do the same thing with a regular silicon-nanowire TEC with poor heat conductivity. The CPU turns on and rises 5 degrees. 1 degree's worth of heat leaks off the CPU side, so it's again at +4. Now here comes the tricky part, because I don't understand the Seebeck effect: where does that heat leak out to? One of two things could happen.