In addition to what is stated in the main article, one of the comments nails the specifics:
That would make sense if, at stock voltage and frequency, Ivy Bridge draws less power. Based on the numbers in the link, Ivy Bridge has a roughly 30% higher overall junction-to-air thermal resistance: [(100 °C − 20 °C)/(80 °C − 20 °C)] × (231 W/236 W) ≈ 1.30. Other reviews show the Ivy Bridge processors drawing less power at stock frequency and voltage, so that may be offsetting much of the temperature rise caused by the higher package and heatsink-interface thermal resistance under normal conditions.
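The ratio above can be sketched as a quick calculation. This is an illustrative model only; the mapping of 236 W to Ivy Bridge and 231 W to Sandy Bridge is an assumption inferred from the comment's arithmetic, not stated outright in the source:

```python
def thermal_resistance(t_junction_c, t_ambient_c, power_w):
    """Junction-to-air thermal resistance (°C/W): temperature rise per watt."""
    return (t_junction_c - t_ambient_c) / power_w

# Numbers from the comment (assumed: Ivy Bridge reaches ~100 °C at 236 W,
# Sandy Bridge ~80 °C at 231 W, with a 20 °C ambient).
r_ivb = thermal_resistance(100, 20, 236)  # ≈ 0.339 °C/W
r_snb = thermal_resistance(80, 20, 231)   # ≈ 0.260 °C/W

ratio = r_ivb / r_snb  # ≈ 1.30, i.e. roughly 30% higher for Ivy Bridge
print(f"Ivy Bridge R_ja:   {r_ivb:.3f} °C/W")
print(f"Sandy Bridge R_ja: {r_snb:.3f} °C/W")
print(f"Ratio: {ratio:.2f}")
```

Note that this lumps the die, package, interface, and heatsink into a single resistance; it says nothing about where in that stack the extra resistance sits.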
Power dissipation rises steeply with increases in frequency and voltage (dynamic power scales roughly with frequency times voltage squared), and it appears to rise faster on the Ivy Bridge processors. So as power dissipation approaches or exceeds that of the Sandy Bridge processor, much higher temperatures will be measured on the Ivy Bridge because of its higher thermal resistances.
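The steep power growth under overclocking can be illustrated with the standard first-order CMOS model, P ∝ f · V². The base wattage and scaling factors below are hypothetical, chosen only to show why raising frequency and voltage together compounds quickly:

```python
def dynamic_power(base_power_w, f_ratio, v_ratio):
    """First-order CMOS dynamic power model: P scales as f * V^2.
    Ignores leakage, which itself grows with voltage and temperature."""
    return base_power_w * f_ratio * v_ratio ** 2

# Hypothetical chip at 95 W stock: a 30% frequency bump that also
# requires 15% more voltage.
p = dynamic_power(95.0, 1.30, 1.15)
print(f"Overclocked dynamic power: {p:.0f} W")  # ≈ 163 W, ~72% above stock
```

A 30% overclock thus costs far more than 30% extra power, and with a fixed junction-to-air resistance, that extra power translates directly into a larger temperature rise.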
I think this is a non-issue for the average consumer. Overclockers, however, would probably be better off with the Sandy Bridge hardware.