
CPU vs GPU Temps

Why is it that a CPU begins to 'burn' at about 80 degrees C, whereas my FX5600 has a GPU 'threshold' set to 134 degrees C? They are both made of the same stuff, right?

Electronic Punk

Staff member
Political User
A valid point :)
Could well depend on the sensor itself I guess.

My CPU runs at half the temperature that my GPU runs at (according to statistics)


Debiant by way of Ubuntu
I'll wait for a boffin here, but my hunch is that the actual low-level architecture of the chip is different, such that a GPU is more "rugged" and can withstand more heat. After all, there are similar variations between Intel and AMD CPUs, albeit less marked, where an AMD can run hotter before it starts to fry, I believe (or is it just that they are cranked up and run hotter anyway?).

Anyway, I await an educated opinion to (maybe) bear me out.


Political User
champ2005 said:
Why is it that a CPU begins to 'burn' at about 80 degrees C, whereas my FX5600 has a GPU 'threshold' set to 134 degrees C? They are both made of the same stuff right?
BFG 6800 Ultra OC is factory set to 120C :confused:


F@H - Is it in you?
Staff member
Political User
Modern-day GPUs have as many transistors as a CPU, or more :D

It would seem they are designed to handle a higher heat threshold than CPUs, and have been doing so for a while...

/me shrugs...

I'm sure there are a couple of engineers on these boards who can shed more light...
Running any piece of high-density silicon above 90 C is an invitation for an early death. Failure rate is an exponential function of temperature, and 90 C is the knee of the curve.

If the junction temperatures truly are running at 120 C, don't expect the GPU to last long past the warranty expiration date. But then, early failures of Nvidia GPUs are the reason I switched back to ATI products.

On the other hand, I really doubt they are running the die that hot. Water boils at 100 C. The solder used in electronics starts getting soft at 145 C. Nobody in the business could be that stupid... or could they?
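The "exponential function of temperature" claim above is usually modeled with the Arrhenius acceleration factor. A minimal Python sketch, with the caveat that the 0.7 eV activation energy is a commonly quoted ballpark for silicon wear-out mechanisms, not a figure from any GPU datasheet:

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(t_ref_c, t_hot_c, ea_ev=0.7):
    """How many times faster failures accumulate at t_hot_c than at
    t_ref_c (both in degrees C), per the Arrhenius model.
    ea_ev is the activation energy of the failure mechanism; 0.7 eV
    is an assumed ballpark value, not a measured one."""
    t_ref_k = t_ref_c + 273.15  # convert to absolute temperature
    t_hot_k = t_hot_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV) * (1.0 / t_ref_k - 1.0 / t_hot_k))

# A die held at 120 C versus one held at 90 C:
print(round(acceleration_factor(90, 120), 1))  # roughly 5.5x faster aging
```

So under these assumptions, a part run 30 degrees past the "knee" wears out several times faster, which is why a 120 C junction temperature looks alarming even if the chip functions fine day to day.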
A few reasons I think:

1: The GPU is running less crucial information than a CPU. The CPU obviously has more running at once than a GPU: whatever the GPU needs from it, plus the OS and other things in the background, all at once.

2: Maybe the architecture of the chip on the GPU is not as well designed as on a CPU, so it reaches higher temps, which forces them to use a different manufacturing process so the chips can last longer.

I had another reason, but I can't remember.


Time Dr. Freeman?
I wouldn't necessarily trust that. It's best to have a separate thermal probe that you yourself put on the GPU and monitor the temperature with that instead. A lot of onboard monitoring temps can be very inaccurate.
