So I'm trying to OC my CPU, and based on 3 different stress testers I have found 3 different configurations on the same system that keep the CPU stable and at "safe" temps.
My hardware setup is:
i5 4690k
midrange cooler (but my ambient temps are on the high end)
16 GB DDR3-1600 RAM
GTX 980 Ti
750W PSU
Stress tester 1:
Prime95 v28.7 with Small FFTs
Could only pull a max of 4.2 GHz at 1.04 V before the temps went past what I call "safe".
Stress tester 2:
Prime95 v26.6 with Blend
The highest clock I could pull was 4.5 GHz at 1.20 V, and temps were still under my "safe" mark.
Stress tester 3:
Asus Realbench
Again, the highest I could pull was 4.5 GHz at 1.20 V. I tried 4.6 GHz at 1.22 V, but the temps went slightly past my "safe" mark.
From what I understand, even though all the testers peg the CPU at 100%, they do it differently.
Tester 1 uses only small chunks of data that fit entirely in cache, so the CPU has almost no cache misses and has to run flat out the whole time. Tester 2 uses a mix of small and large data sets, so there are cache misses and the CPU occasionally stalls waiting on memory. Tester 3, I don't know what it works on in terms of data size. It renders an image by tracing how individual rays behave, so I'm guessing it's mixed?
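To show what I mean about working-set size, here's a rough sketch (buffer sizes and iteration counts are my own arbitrary picks, not anything from the stress testers): it does the same total amount of arithmetic over a buffer that fits in cache versus one that has to stream from RAM, and compares sustained throughput. On most machines the cache-resident case keeps the execution units busier, which is the same reason Small FFTs runs hotter.

```python
# Compare sustained arithmetic throughput for a cache-resident working
# set vs. a memory-bound one. Sizes below are illustrative assumptions.
import time
import numpy as np

def sum_throughput(n_elems, total_elems=100_000_000):
    """Sum a buffer of n_elems floats repeatedly until roughly
    total_elems elements have been summed; return elements/second."""
    buf = np.ones(n_elems, dtype=np.float64)
    reps = max(1, total_elems // n_elems)
    t0 = time.perf_counter()
    acc = 0.0
    for _ in range(reps):
        acc += buf.sum()              # identical arithmetic either way
    dt = time.perf_counter() - t0
    return (reps * n_elems) / dt      # elements summed per second

small = sum_throughput(100_000)       # ~800 KB: fits in cache, few misses
large = sum_throughput(20_000_000)    # ~160 MB: streams from RAM each pass
print(f"cache-resident: {small:.3e} elem/s")
print(f"memory-bound:   {large:.3e} elem/s")
```

The gap between the two numbers depends on your cache sizes and memory bandwidth, but the point is that the memory-bound run spends part of its "100% utilization" just waiting on RAM, which is less thermally demanding than the cache-resident run.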
So my question is: which stress test is actually most representative of real applications like games and CAD software?
The reason I'm asking is that if normal programs don't actually keep their working set in cache the way the Small FFT test does, then the data I got from that test is pretty useless for setting my everyday clocks. In that case, I would go with the data from the other tests instead.
Info I found on the cache size thing if you want to read more about it:
http://superuser.com/questions/981466/in-prime95-why-do-small-ffts-generate-the-most-heat-despite-cp...