Wow...just...wow.

Active Member
✔️ HL Verified
💻 Oldtimer
Joined
Nov 6, 2004
Messages
3,055
Best answers
0
Location
Round Rock, TX
If your system is getting on in years, sure. Overclocking is the perfect solution for an aging system to keep up with newer games and applications. I'm only talking about people who do it right out of the box when their system can already handle anything. You're voiding the warranty for what? A subtle performance gain that only the extremely impatient would ever notice? I've always felt that was a huge waste. A Q6600 at stock speed should still handle just about anything like a dream. I only ever see bench whores overclocking new systems.
 
AMXX Coder (:
✔️ HL Verified
Discord Member
Joined
Dec 31, 2008
Messages
55
Best answers
0
An entire book about "overclocking CPUs"?!?!
 
Resting in H.E.L.L
Banned
💻 Oldtimer
Joined
Jun 9, 2009
Messages
1,328
Best answers
0
Location
New England
WHAT? LOL!

A Q6600 at stock should run anything like a dream? Not anything made since 2009, really. A Q6600 at stock is 2.4GHz, and at such a low clock speed you're limiting your GPU, or "bottlenecking" it, and your overall performance is degraded a lot. It's not just benchmark whores.

For example, in PCSX2, the PS2 emulator, at 2.4GHz I get, let's say, 30 FPS in some really nasty areas, but OC'ed to 3.4GHz I got 45 FPS in those same areas, due to my system not bottlenecking. There ARE good reasons for overclocking.
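As a back-of-the-envelope check of those numbers (a hypothetical sketch, assuming the emulator is almost entirely CPU-bound so frame rate scales roughly linearly with clock speed):

```python
# Rough sanity check: if a workload is fully CPU-bound, FPS should
# scale roughly in proportion to the CPU clock.
def scaled_fps(base_fps, base_ghz, oc_ghz):
    """Predict FPS after an overclock, assuming a purely CPU-bound workload."""
    return base_fps * oc_ghz / base_ghz

# 30 FPS at 2.4GHz, overclocked to 3.4GHz:
predicted = scaled_fps(30, 2.4, 3.4)
print(round(predicted, 1))  # 42.5 -- in the same ballpark as the ~45 FPS reported
```

Real emulators aren't perfectly CPU-bound, so the observed gain won't track the clock exactly, but the linear estimate lands close to the figures quoted above.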
 
Active Member
✔️ HL Verified
💻 Oldtimer
Joined
Nov 6, 2004
Messages
3,055
Best answers
0
Location
Round Rock, TX
All the CPU does is feed data to the GPU. That's really not much work. The CPU essentially just sits in the background running the show. It's the intermediate device that keeps all of your other devices happy. The GPU is the workhorse of a gaming system. A high frequency CPU isn't strictly necessary, especially now that pretty much every new game is heavily multithreaded. You can even look back at old games like BioShock, where Irrational gave every major task its own thread in anticipation of future technology (even though current hardware at the time took almost full advantage of it). It takes a pretty weak CPU to bottleneck a GPU.

With emulators, your system is literally emulating the game's native data format, and in such an extreme case, the CPU is the workhorse, not the GPU. This is not a real-world application that can be taken seriously. In most games, you'll notice a pretty much universal maximum of 25% in performance gains, and if your system can already crush it, that's absolutely no reason to void your warranty. That's all I'm trying to say. If your system is aging, good on you, overclock away. You don't have as much to lose.

Even without the performance gained through overclocking, you've still got a fine system that should last years. There's no conceivable reason to overclock until you need to. If you want to, of course there's nothing that can stop you, and there's nothing wrong with wanting to if that's your thing. I just don't see it as being particularly practical, and I'll never overclock a brand new system.
 
King of the Hello Kitty Fanclub
💻 Oldtimer
Joined
Sep 6, 2004
Messages
1,675
Best answers
0
Location
Australia
Kinda agree with Spunky. I bought a Q6600 nearly 3 years ago, and at the time, with 2x 8800GTs in SLI, there was nothing I couldn't handle. Sure, I could have overclocked if I wanted to, but for what would have been a minimal (if at all noticeable) performance increase, it's really not worth the trouble.

I'm already having issues with my PC shutting down randomly (which I believe to be a PSU problem), so overclocking would probably cause me even more grief. I just don't see the point. I'll be rebuilding my system around March next year (on the current one's third anniversary), buy everything fresh, and run my old machine somewhere on the network where it'll hopefully last a long time. I've also had no issue running any modern game at high graphics settings, something I attribute to my SLI GPUs, not my CPU's clock speed.
 
New Member
Joined
Feb 17, 2008
Messages
108
Best answers
0
Anyone calling people who overclock their hardware retards, or saying it makes jack all performance difference, is uninformed or has tunnel vision. Seriously, it's not that hard at all to grasp the concept and learn your way around it.

The only way you can ever blow up your system is by setting stupidly high voltages.

If I were to clock a CPU from 2.4GHz (say a Core 2 E6600, or a Q6600 for that matter) to 3.0GHz on the same voltage it takes to run at 2.4GHz, then I really haven't reduced its lifespan, because it's still running within Intel's specified voltage specs.

The only thing you should have to worry about in overclocking is the trial and error of getting it stable, depending on what speeds you're aiming for.

I'll give you an example of performance scaling with GTX 480 SLI at different CPU clock speeds on my system, using Crysis:

3.33GHz:
Code:
5/09/2010 3:55:00 PM - Vista 64
Beginning Run #1 on Map-island, Demo-benchmark_gpu
DX10 1900x1200, AA=No AA, Vsync=Disabled, 32 bit test, FullScreen
Demo Loops=3, Time Of Day= 9
Global Game Quality: VeryHigh
 ==============================================================
TimeDemo Play Started , (Total Frames: 2000, Recorded Time: 111.86s)
!TimeDemo Run 0 Finished.
    Play Time: 40.97s, Average FPS: 48.82
    Min FPS: 22.83 at frame 150, Max FPS: 76.92 at frame 1631
    Average Tri/Sec: -17108296, Tri/Frame: -350464
    Recorded/Played Tris ratio: -2.62
!TimeDemo Run 1 Finished.
    Play Time: 32.31s, Average FPS: 61.90
    Min FPS: 22.83 at frame 150, Max FPS: 78.64 at frame 1761
    Average Tri/Sec: -20909754, Tri/Frame: -337809
    Recorded/Played Tris ratio: -2.71
!TimeDemo Run 2 Finished.
    Play Time: 32.04s, Average FPS: 62.42
    Min FPS: 22.83 at frame 150, Max FPS: 78.64 at frame 1761
    Average Tri/Sec: -21097890, Tri/Frame: -337982
    Recorded/Played Tris ratio: -2.71
TimeDemo Play Ended, (3 Runs Performed)
==============================================================

Completed All Tests

<><><><><><><><><><><><><>>--SUMMARY--<<><><><><><><><><><><><><>

5/09/2010 3:55:00 PM - Vista 64

Run #1- DX10 1900x1200 AA=No AA, 32 bit test, Quality: VeryHigh ~~ Overall Average FPS: 62.16
3.8GHz:
Code:
5/09/2010 4:33:15 PM - Vista 64
Beginning Run #1 on Map-island, Demo-benchmark_gpu
DX10 1900x1200, AA=No AA, Vsync=Disabled, 32 bit test, FullScreen
Demo Loops=3, Time Of Day= 9
Global Game Quality: VeryHigh
 ==============================================================
TimeDemo Play Started , (Total Frames: 2000, Recorded Time: 111.86s)
!TimeDemo Run 0 Finished.
    Play Time: 36.85s, Average FPS: 54.28
    Min FPS: 32.71 at frame 166, Max FPS: 81.35 at frame 897
    Average Tri/Sec: -19014174, Tri/Frame: -350314
    Recorded/Played Tris ratio: -2.62
!TimeDemo Run 1 Finished.
    Play Time: 28.26s, Average FPS: 70.76
    Min FPS: 32.71 at frame 166, Max FPS: 83.81 at frame 859
    Average Tri/Sec: -23871958, Tri/Frame: -337370
    Recorded/Played Tris ratio: -2.72
!TimeDemo Run 2 Finished.
    Play Time: 28.25s, Average FPS: 70.79
    Min FPS: 32.71 at frame 166, Max FPS: 83.98 at frame 858
    Average Tri/Sec: -23861196, Tri/Frame: -337092
    Recorded/Played Tris ratio: -2.72
TimeDemo Play Ended, (3 Runs Performed)
==============================================================

Completed All Tests

<><><><><><><><><><><><><>>--SUMMARY--<<><><><><><><><><><><><><>

5/09/2010 4:33:15 PM - Vista 64

Run #1- DX10 1900x1200 AA=No AA, 32 bit test, Quality: VeryHigh ~~ Overall Average FPS: 70.775
4.0GHz:
Code:
5/09/2010 4:52:33 PM - Vista 64
Beginning Run #1 on Map-island, Demo-benchmark_gpu
DX10 1900x1200, AA=No AA, Vsync=Disabled, 32 bit test, FullScreen
Demo Loops=3, Time Of Day= 9
Global Game Quality: VeryHigh
 ==============================================================
TimeDemo Play Started , (Total Frames: 2000, Recorded Time: 111.86s)
!TimeDemo Run 0 Finished.
    Play Time: 36.71s, Average FPS: 54.48
    Min FPS: 44.40 at frame 137, Max FPS: 81.53 at frame 957
    Average Tri/Sec: -19078192, Tri/Frame: -350189
    Recorded/Played Tris ratio: -2.62
!TimeDemo Run 1 Finished.
    Play Time: 27.64s, Average FPS: 72.37
    Min FPS: 44.40 at frame 137, Max FPS: 84.42 at frame 888
    Average Tri/Sec: -24389270, Tri/Frame: -337021
    Recorded/Played Tris ratio: -2.72
!TimeDemo Run 2 Finished.
    Play Time: 27.49s, Average FPS: 72.75
    Min FPS: 44.40 at frame 137, Max FPS: 84.42 at frame 888
    Average Tri/Sec: -24567006, Tri/Frame: -337699
    Recorded/Played Tris ratio: -2.71
TimeDemo Play Ended, (3 Runs Performed)
==============================================================

Completed All Tests

<><><><><><><><><><><><><>>--SUMMARY--<<><><><><><><><><><><><><>

5/09/2010 4:52:33 PM - Vista 64

Run #1- DX10 1900x1200 AA=No AA, 32 bit test, Quality: VeryHigh ~~ Overall Average FPS: 72.56
4.2GHz:
Code:
5/09/2010 5:00:33 PM - Vista 64
Beginning Run #1 on Map-island, Demo-benchmark_gpu
DX10 1900x1200, AA=No AA, Vsync=Disabled, 32 bit test, FullScreen
Demo Loops=3, Time Of Day= 9
Global Game Quality: VeryHigh
 ==============================================================
TimeDemo Play Started , (Total Frames: 2000, Recorded Time: 111.86s)
!TimeDemo Run 0 Finished.
    Play Time: 35.05s, Average FPS: 57.06
    Min FPS: 44.70 at frame 182, Max FPS: 82.78 at frame 954
    Average Tri/Sec: -19923628, Tri/Frame: -349177
    Recorded/Played Tris ratio: -2.63
!TimeDemo Run 1 Finished.
    Play Time: 27.10s, Average FPS: 73.79
    Min FPS: 44.70 at frame 182, Max FPS: 84.73 at frame 925
    Average Tri/Sec: -24894958, Tri/Frame: -337364
    Recorded/Played Tris ratio: -2.72
!TimeDemo Run 2 Finished.
    Play Time: 26.99s, Average FPS: 74.10
    Min FPS: 44.70 at frame 182, Max FPS: 84.73 at frame 925
    Average Tri/Sec: -25015340, Tri/Frame: -337586
    Recorded/Played Tris ratio: -2.72
TimeDemo Play Ended, (3 Runs Performed)
==============================================================

Completed All Tests

<><><><><><><><><><><><><>>--SUMMARY--<<><><><><><><><><><><><><>

5/09/2010 5:00:33 PM - Vista 64

Run #1- DX10 1900x1200 AA=No AA, 32 bit test, Quality: VeryHigh ~~ Overall Average FPS: 73.945
As you can clearly see, going from 3.33GHz, which is the 975's stock clock, I gain about 10 FPS on my average by the time I reach 4.2GHz.
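For what it's worth, the scaling in those four summaries can be worked out as percentages (a quick Python sketch; the FPS figures are just the overall averages quoted above):

```python
# Overall average FPS quoted in the four benchmark summaries above.
results = {3.33: 62.16, 3.8: 70.775, 4.0: 72.56, 4.2: 73.945}

base_ghz = 3.33
base_fps = results[base_ghz]
for ghz, fps in sorted(results.items()):
    clock_gain = (ghz / base_ghz - 1) * 100   # % clock increase over stock
    fps_gain = (fps / base_fps - 1) * 100     # % FPS increase over stock
    print(f"{ghz} GHz: +{clock_gain:.0f}% clock -> +{fps_gain:.1f}% FPS")
```

The takeaway from the numbers themselves: a ~26% clock increase buys roughly a 19% FPS increase in this run, with the gains flattening at the top end, which is what you'd expect as the GPU starts to become the limit.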
 
Active Member
✔️ HL Verified
💻 Oldtimer
Joined
Nov 6, 2004
Messages
3,055
Best answers
0
Location
Round Rock, TX
Go back and read my posts again. Also, lol @ SLI GTX 480s. You either have a ton of money or hate the money you have.

Spunky said:
In most games, you'll notice a pretty much universal maximum of 25% in performance gains, and if your system can already crush it, that's absolutely no reason to void your warranty.
My entire point. gg
 
King of the Hello Kitty Fanclub
💻 Oldtimer
Joined
Sep 6, 2004
Messages
1,675
Best answers
0
Location
Australia
If you could legitimately tell the difference between 62fps and 72fps, then you may have a point, but in all honesty it's so negligible it's not even funny.
 
Freelance Mappzor
✔️ HL Verified
🚂 Steam Linked
💻 Oldtimer
Joined
Nov 21, 2003
Messages
17,065
Best answers
0
Location
Stairing at the Abyss
Human eyes cannot tell a difference of 10 FPS above 30 anyhow; heck, you can't tell the difference at all once it goes above 60. I'm with Spunky. If the system works with what you have at manufacturer settings, there is no need to OC it. Especially since it cancels out the warranty on the system. Then you get the following scenario:

"I OC'd my system and it burned due to a manufacturer fault. Sadly they won't replace it because I messed with the settings, even though that wasn't what caused the failure."

Yeah, I'd be real happy with that outcome, especially given that I have to save up for months in order to upgrade the thing in the first place.
 
Resting in H.E.L.L
Banned
💻 Oldtimer
Joined
Jun 9, 2009
Messages
1,328
Best answers
0
Location
New England
Grega? What the **** are you talking about? The human eye doesn't see in FPS, and we can see up to 200 and beyond!

And most parts these days have warranties that cover overclocking, because it's so common.

http://www.100fps.com/how_many_frames_can_humans_see.htm

I can tell the difference between 50 and 60 FPS easily, between 55 & 60 sometimes, and 59 & 60 on a very rare occasion.
 
Active Member
✔️ HL Verified
💻 Oldtimer
Joined
Nov 6, 2004
Messages
3,055
Best answers
0
Location
Round Rock, TX
I can tell the difference between 50 and 60 FPS easily
Alright, I'm with you here.

between 55 & 60 sometimes
Far less likely, but I'll suspend my disbelief.

59 & 60 on a very rare occasion
I really doubt you're a robot.

The human eye doesn't see in FPS, and we can see up to 200 and beyond!
The human brain doesn't process a single, whole image, but it absolutely does process a certain amount of information (light) per second (how much that is I honestly don't know) to a limit (which I also won't presume to know), at least somewhat comparable to a frames-per-second measurement. It's not accurate to say we see X amount of frames-per-second, but it's suitable given the frame of reference most of the people on this forum possess.

It absolutely is acceptable to say the human eye cannot distinguish a difference of less than 5 FPS once you're above 60 FPS. Exactly how accurate that is I won't presume to know, but I'd take an educated guess and say yeah, that sounds about right.

Personally, I'd need to increase my framerate exponentially to notice a significant difference above 60 FPS. Not everyone's eyes are the same, though, but unless you're a robot or have some kind of super brain, I really, really doubt you're able to notice a difference of less than 1 FPS. That's silly.
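One way to ground this argument is in frame times rather than FPS. A small sketch of the per-frame deltas behind the three claims being debated:

```python
# Convert FPS to frame time and show how much longer each frame
# lingers on screen at the lower of two rates.
def frame_time_ms(fps):
    """Milliseconds each frame is displayed at a given frame rate."""
    return 1000.0 / fps

# The three comparisons from the post above.
for low, high in [(50, 60), (55, 60), (59, 60)]:
    delta = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} vs {high} FPS: {delta:.2f} ms extra per frame")
```

50 vs 60 FPS works out to about 3.3 ms of extra display time per frame, while 59 vs 60 FPS is under 0.3 ms, which puts some numbers behind why the last comparison strains belief.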
 
Freelance Mappzor
✔️ HL Verified
🚂 Steam Linked
💻 Oldtimer
Joined
Nov 21, 2003
Messages
17,065
Best answers
0
Location
Stairing at the Abyss
Grega? What the **** are you talking about? The human eye doesn't see in FPS, and we can see up to 200 and beyond!

And most parts these days have warranties that cover overclocking, because it's so common.

http://www.100fps.com/how_many_frames_can_humans_see.htm

I can tell the difference between 50 and 60 FPS easily, between 55 & 60 sometimes, and 59 & 60 on a very rare occasion.

Seeing it is one thing; seeing a difference in motion is another.

If a motion continues steadily at 60 FPS and you see the same motion at 100 FPS you would not notice a difference. On the other hand if it jumped from 60 to 100 you would see a difference.

My whole statement was not about whether you can see the flicker of the screen at 60Hz, but rather whether you see the difference in motion when you are running a game at a constant FPS. Yes, you clearly see the difference below and above 30. But motion above 30 is smooth, and above 60 you need to look really, really hard to actually see the flicker of motion (not of the screen).

The link you posted is about a totally different field of eye perception than what I was getting at.
 
Active Member
💻 Oldtimer
Joined
Nov 6, 2005
Messages
1,037
Best answers
0
On the whole FPS front: when people say FPS above 24/30 is not visible to the eye, they're confusing things. Those frame rates (if steady) are the threshold at which the eye is tricked into seeing a series of stills as fluid motion.

It's the bare minimum, not the maximum at which you can tell a difference.
 
Active Member
✔️ HL Verified
🚂 Steam Linked
💻 Oldtimer
Joined
Sep 23, 2002
Messages
1,876
Best answers
0
Location
Fryslân Boppe! The Netherlands
All the CPU does is feed data to the GPU. That's really not much work. The CPU essentially just sits in the background running the show. It's the intermediate device that keeps all of your other devices happy. The GPU is the workhorse of a gaming system. A high frequency CPU isn't strictly necessary, especially now that pretty much every new game is heavily multithreaded. You can even look back at old games like BioShock, where Irrational gave every major task its own thread in anticipation of future technology (even though current hardware at the time took almost full advantage of it). It takes a pretty weak CPU to bottleneck a GPU.

With emulators, your system is literally emulating the game's native data format, and in such an extreme case, the CPU is the workhorse, not the GPU. This is not a real-world application that can be taken seriously. In most games, you'll notice a pretty much universal maximum of 25% in performance gains, and if your system can already crush it, that's absolutely no reason to void your warranty. That's all I'm trying to say. If your system is aging, good on you, overclock away. You don't have as much to lose.
No, no, no, no. Games are more than just flashy lights and silly pixels.
Games like Supreme Commander and WoW, which rely heavily on AI, and games that have a lot going on at once (like TF2 on a 32-player map), are all very much CPU dependent. It's not a good graphics card alone that makes or breaks a gaming system; it's the combination of hardware: motherboard, CPU, RAM, and graphics card. Hell, even your HDD can influence game performance.

A CPU isn't a manager, more like an assistant manager. He does all the manager's jobs while having his own to do too :p
 
Active Member
✔️ HL Verified
💻 Oldtimer
Joined
Nov 6, 2004
Messages
3,055
Best answers
0
Location
Round Rock, TX
The GPU is inarguably the most significant piece of hardware in a gaming system. Supreme Commander is CPU intensive because of the sheer number of AI scripts running parallel to each other; that's why they recommend dual or quad cores as opposed to a high frequency. In most other instances, the CPU is far less important. You'll notice the biggest difference in performance when scaling down to 1 core, or when running single-threaded games and applications.

All the motherboard does is bridge your components and ensure everything is getting the power it should. RAM speed can mildly affect CPU performance if it's ridiculously slower than your CPU's front side bus, but otherwise all that matters is how much you have.

Your hard drive is actually pretty important, and it's about the only thing you've pointed out that isn't almost completely wrong. A big, slow drive will really hurt your graphics performance, especially in games where textures load on demand and the like. Also, obviously, loading times. That's about it, though. You can keep your drive defragged (to make all of that information as readily available to the system as possible) and be just fine in almost any other situation.

Also, just an FYI: AI in WoW is computed server-side. All the client-side programming does is update their actions and positions so you know what's going on. If AI were computed client-side, you could hack like crazy, and that'd just be silly.
 
