CPU temperature is a value measured as a distance down from tjmax and then converted into degrees by software. Why? Because CPU temperature is measured for CPU safety, not for "how cool it is at idle". The measurement precision drops significantly as the distance from tjmax increases. It also varies not only from one die revision to another, but even within the same stepping. Basically, CPU manufacturers wanted to protect the CPU (distance to tjmax), not implement a thermometer.
In some Intel chips the idle temp measured below ambient; in others, at 50 degC. The reported temperature is just the tjmax distance multiplied by a scale coefficient and subtracted from tjmax, and that scale coefficient is just an average value for the stepping. Load temperatures are a lot more accurate because the tjmax distance is smaller, so the error contributed by the scale coefficient is smaller.
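A minimal sketch of that conversion, assuming a tjmax of 100 degC and an illustrative per-stepping scale coefficient (all names here are made up for illustration, not a real sensor API):

```python
TJMAX_C = 100.0  # assumed tjmax for this part, in degrees C

def displayed_temp(dts_distance, scale=1.0):
    """Convert a raw sensor reading to a displayed temperature.

    dts_distance: raw "degrees below tjmax" reported by the on-die sensor.
    scale: average scale coefficient for the stepping; the larger the
    distance from tjmax, the more this stepping-wide average can
    mis-measure an individual die.
    """
    return TJMAX_C - scale * dts_distance

# Near tjmax (under load) the scale error barely matters:
print(displayed_temp(5, scale=1.1))   # small distance, small error
# Far from tjmax (idle) the same scale error is amplified:
print(displayed_temp(60, scale=1.1))  # large distance, large error
```

Note how a 10% error in the scale coefficient shifts the load reading by only about half a degree, but the idle reading by six degrees.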
So, idle temps are not a very good measure for making a call on AMD or Intel chips. Load temps are.