I read in a guide with rules for overclocking a graphics card that the temperature/process graph has to be checked, and that this affects performance. Is this true? And to what extent can it affect things?
Yes, the temperature/process graph is important, particularly to keep a check on the temperature. It also depends on the machine's ability to bring the temperature down when it gets very high (from 105 °C upward).
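If you log the graph yourself, you can also check it programmatically rather than by eye. A minimal sketch; the 90 °C warning threshold is my own assumption, not a figure from any guide:

```python
def flag_hot_samples(samples, warn_c=90):
    """Return the (time, temp) samples that exceed a warning threshold.

    samples: list of (seconds, celsius) pairs, e.g. from a logging tool
    such as GPU-Z or nvidia-smi run in a loop.
    """
    return [(t, c) for t, c in samples if c >= warn_c]
```

For example, `flag_hot_samples([(0, 60), (10, 92), (20, 88)])` keeps only the `(10, 92)` sample, so you can spot the spikes without scanning the whole graph.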
So if the graphics card/processor graph should show temps around 60 °C but instead shows values beyond 70 °C, 80 °C, or even 90 °C, does that come down to the cooler's ability to lower the temperature?
Lower temperatures are better for pushing the system via overclocking. Heat only affects performance when the chip gets so hot that the motherboard starts throttling the CPU to cool the system down. On the other hand, you can overclock further if the components run cooler, but that assumes you have already overclocked to the point where heat is the barrier.
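The throttling behavior described above can be sketched as a toy model. The 83 °C throttle point and the 13 MHz steps are hypothetical numbers for illustration; real firmware uses its own fan and clock curves:

```python
def throttled_clock(base_clock_mhz, temp_c, throttle_temp=83, step_mhz=13):
    """Toy model of thermal throttling: full clock below the throttle
    point, then the clock drops in steps the hotter the chip gets."""
    if temp_c < throttle_temp:
        # Below the throttle point the chip runs at its full clock.
        return base_clock_mhz
    # Above it, shed one clock step per few degrees of excess heat,
    # but never below a minimum idle clock.
    steps = (temp_c - throttle_temp) // 3 + 1
    return max(base_clock_mhz - steps * step_mhz, 300)
```

This is why heat only costs performance past a threshold: at 60 °C the model returns the full clock, while at 95 °C it has shed several steps.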
I've been thinking, and maybe you are referring to the loss of power due to reduced energy efficiency at higher temperatures, but that would only matter in very extreme cases in which the power supply could not deliver enough power at its peak.
But that is nothing to worry about: you would run into other problems before reaching that point. The loss of efficiency due to temperature in a computer is marginal unless it is not properly cooled.
How much does temperature influence performance?