Left 4 Dead Multicore Rendering: Need clarification

New Member · ✔️ HL Verified · 💻 Oldtimer
Joined: May 28, 2006 · Messages: 1,094 · Best answers: 0
I love L4D, even if I don't play it as often as TF2. Playing as the zombies is nothing but fun (on reasonable servers).

However, I've noticed that the game also heats up my computer like nobody's business. Honestly, I can play Crysis with tweaked ultra-high graphics (hacked settings to achieve this) and not get this kind of heat.

Oh, there's no lag. My computer can run current-gen games at their highest settings without render lag, thanks to my 8800GT.

But heat is where I get fearful. My last video card (a 7950 GT) pretty much died after a year of use from heat damage. It got pushed so hard that it reported its auxiliary power had been cut, and it throttled the card's performance down to the point where even playing Half-Life: Source made the thing protest. Hence the new card. I installed SpeedFan so I could monitor the temps and change the fan speeds. Thankfully, NVIDIA nTune also lets me change the card's fan speed. I'm thinking my PC's heat detection is slow to react, because I KNOW the CPU fan speeds up accordingly when it gets TOO hot, but judging by the heat readings, it's too little, too late.

So, whenever I play a game these days, or set something to render in 3ds Max, I always fire up SpeedFan and set the speeds to about 75% of the max. I don't like doing it, because the fans are loud, but it's necessary to make sure nothing gets damaged. My PSU even has a fan-speed switch on the back, but that one seems dodgy. Sometimes it almost feels like turning up the PSU's fan makes the damned thing HOTTER...

But anyway, I know that L4D has a "multicore rendering" feature. Now, I'm using Windows XP Media Center and have a 3.8 GHz Athlon 64 dual-core processor. I know that XP can use both cores for conventional purposes, but that games will typically only use one core.

My understanding was that Windows Vista was the only version that fixed this for game processing, and also the only Windows version that could employ quad cores. My logic is that the XP version of L4D would be unable to truly utilize the "multicore rendering" feature, and yet it's turned on by default anyway. Could it be that having this feature on is forcing the processor to work harder? I've already turned off anti-aliasing and filtering and set the shader detail down. It's better, but not by much. I set the CPU fan to 70–80% out of fear; anything higher and it sounds like an engine or something. It's a scary sound, and I almost become afraid that the fan will fly clear off the processor. Maybe starting out with those settings will keep it cooler than before, but I don't want to play the game at all if it risks melting my CPU onto my motherboard.
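If you want to test whether multicore rendering is the culprit, Source-engine games typically expose it through the `mat_queue_mode` console variable (this assumes L4D follows the usual Source convention; verify in the in-game developer console):

```
// In the developer console (enable it under Options > Keyboard > Advanced):
mat_queue_mode 0    // force synchronous single-threaded rendering
mat_queue_mode -1   // restore the engine's default behavior
```

Toggling it while watching your temps should tell you quickly whether the feature is actually loading the second core.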
 
Live free or die by the sword
Retired Forum Staff · ✔️ HL Verified · 💻 Oldtimer
Joined: Dec 1, 2001 · Messages: 7,416 · Best answers: 0 · Location: North East Pennsylvania
We need some more info here. First, we need the actual temps you're running while playing L4D. Second, I need to know whether you've overclocked any of your hardware. As an aside, XP handles multiple threads for a dual-core processor in a game just fine; the problem was that developers really weren't including support for it until recently. To this day, only a few titles support dual-core setups, and none to my knowledge take advantage of quads, regardless of how Vista distributes the threads.
 
