AMTI Athleon...

Lost in space
Banned
💻 Oldtimer
Joined
Dec 21, 2003
Messages
3,608
Best answers
0
Devion said:
If you care so much about graphics, then why take CrossFire with the X800?
CrossFire (only the X series, not the X1) is limited to 1600x1200 @ 60 Hz. (So there goes your "so I'd rather be capped at 85 and have it look comfortable..." story.)

nVidia doesn't have weird glitches; don't make some bull**** up.

The only thing nVidia lacks is angle-independent anisotropic filtering, which you don't notice while running and shooting through a level, but it can be annoying.

But then again, ATi has texture crawling (in some cases both Radeon and nVidia cards have it, though most of the time it's ATi), which is just as annoying as nVidia's angle-dependent filtering.

Also, ATi has sucky support in Linux and other OSes.

nVidia's driver menu may not look as shiny as ATi's (although there is an alternative skin and layout now), but it's 1000x more efficient IMO.

ATi has HDR+AA and nVidia doesn't, except in Half-Life 2.

The new GeForce series (7) runs quieter and cooler than the Radeon X1 series while delivering the same performance.
Both cards have their own strong and weak points; there isn't a clearly superior GFX maker.
So don't be so ATi-biased, and go pick up an nVidia card sometime. You'll be surprised how well they work, especially the user-friendly driver menu.
Both points are true. ATI's current stock HSFs are screaming loud in comparison to nVidia's coolers, and they're also double-slot (the 7900 GTX is both of these too, but its fan is not as loud, according to people who have owned both cards). The only single-slot solution for an X1900 card atm is the X1900 GT, and hopefully the X1950 HSF will be quiet compared to the X1900 XT/XTX coolers. Also, G80 is expected to still be a 90nm part while ATI is shrinking its chips to 80nm (X1650 and X1950 series).

A picture of the X1950 reference cooler:

(It looks similar to a HIS-designed cooler, if you ask me)
http://theinquirer.net/default.aspx?article=33215 - X1950 has HDCP support.

Also, a few links on AMD's 4x4 platform, a concept to introduce quad-core setups to the desktop/gaming segments of the market. This is what will fight off Conroe until AMD releases the K8L core, and it was just recently demonstrated.
Links:
http://techreport.com/onearticle.x/10428
http://theinquirer.net/default.aspx?article=33268
 
New Member
💻 Oldtimer
Joined
May 14, 2003
Messages
1,929
Best answers
0
|Overlord| said:
Both points are true. ATI's current stock HSFs are screaming loud in comparison to nVidia's coolers, and they're also double-slot (the 7900 GTX is both of these too, but its fan is not as loud, according to people who have owned both cards). The only single-slot solution for an X1900 card atm is the X1900 GT, and hopefully the X1950 HSF will be quiet compared to the X1900 XT/XTX coolers. Also, G80 is expected to still be a 90nm part while ATI is shrinking its chips to 80nm (X1650 and X1950 series).

A picture of the X1950 reference cooler:

(It looks similar to a HIS-designed cooler, if you ask me)
http://theinquirer.net/default.aspx?article=33215 - X1950 has HDCP support.
Overlord :/.
Cucumba said:
Let's not turn this into a fanboy thread.
Thanks.

nVidia loves AMD+ATI
 
New Member
💻 Oldtimer
Joined
Dec 3, 2002
Messages
2,490
Best answers
0
Ah, I forgot why I don't normally talk hardware. Same reason I don't talk music.

Some of the standards for hardware are not really concerns to me (or any sane person). I don't care which card is quieter; my computer is in a relatively soundproof cabinet, and the fans and HDD will always have more noise than the vid card anyway. Coolness and heat are factors of performance and as such aren't a concern to me either--who cares which is cooler, the one that works best is likely to be plenty cool by definition, anyway. If it isn't the coolest, that doesn't matter to me much anyhow because it's not as if I'm a dumbass who is going to get a comp with no fans and massive heat.

Everyone I've known with an ATI card gets way more shelf life out of it than out of a GeForce. It wasn't always like that, but that's how it's been lately in my personal experience. Also, I've gone and set up plenty of vid cards for other people, doing the rounds of refresh rate optimization and graphics tweaking and all that.

Go to tweakguides.com and look up the guides for the Catalyst and ForceWare drivers. With one simple utility for ATI (ATI Tray Tools), you can control hundreds of graphical variables, including those in the control center, which means you don't even need it installed. For nVidia you have to go through three different third-party programs in addition to the normal driver interface, only to find that they add just a handful of extra options, most of which are useless.

I just like having that much more control of my card with Tray Tools. My best bud recently got a new machine, an nForce4 board with a GeForce 7800 GT, and I went through his ForceWare settings and almost burst out laughing at how limited it all was compared to my ATI card's one simple utility. He wasn't able to set his stuff much higher than my 9800 Pro, even with PCI Express and a gig of RAM.

That day pretty much made my decision for me. Things without Truform = legomania, as far as my eyes are concerned.

It comes down to personal preference. I just happen to prefer the card with more flexibility. I consider that user-friendly driver menu to be a complete insult to my intelligence, honestly--just like the Windows 'idiot control panel' that XP defaults to.

I don't care about how it sounds. How it cools. How it is designed. As long as it works great, I couldn't care less about any of that, and that ideology has kept me ahead of the curve for my last three cards. They always vastly outperform my expectations...lasting for years longer than most people's. I've had my 9800 Pro for a while, replacing a 9700 I had that got damaged in '03. Whatever card I buy for this generation will undoubtedly keep me going well past the time the rest of you are forced to upgrade, I'm sure. Especially if you've got one of those cruddy, leaky GeForce SLI setups.
 
G-Bear
✔️ HL Verified
🚂 Steam Linked
Discord Member
Joined
Nov 28, 2002
Messages
764
Best answers
0
SaiyanPrideXIX said:
I just like having that much more control of my card with Tray Tools. My best bud recently got a new machine, an nForce4 board with a GeForce 7800 GT, and I went through his ForceWare settings and almost burst out laughing at how limited it all was compared to my ATI card's one simple utility. He wasn't able to set his stuff much higher than my 9800 Pro, even with PCI Express and a gig of RAM.
Don't blame nVidia's drivers for your own inability to just look.

ATi doesn't have superior (important) features in its drivers that nVidia doesn't have.
 
New Member
💻 Oldtimer
Joined
Dec 3, 2002
Messages
2,490
Best answers
0
ATi doesn't have superior (important) features in its drivers that nVidia doesn't have.
When did they get Truform...?

Either way, like I said before...I guess you never tried to enable triple buffering in Direct3D on an nVidia card.

TweakGuides.com said:
DXTweaker is a small utility which allows users to forcefully enable Triple Buffering in Direct3D, something which is not possible using the normal Triple Buffering setting in the Nvidia Control Panel (See the Triple Buffering setting at the bottom of Page 6). You can download DXTweaker for free from here (mirror here). Note that you will need to have the DirectX9.0c June 05 update or newer installed otherwise you will get errors. You can get the latest DirectX Update here. You will also need .NET Framework 1.1 for DXTweaker to run.
I'd call that kind of important, no? You need to use NVTweak -- http://www.tweakguides.com/images/NVFORCE_8.jpg -- just to get access to some of the menus in the drivers themselves...including things like refresh rate forcing, a basic feature that even SP2 can do on its own (I think).

Then again, if you didn't care about this massive flaw, you would probably either A) have refresh-rate tearing and not notice it (maybe that's what you meant by 'texture slips' earlier), or B) have fixed the problem the way TweakGuides suggests...which involves installing three different things on top of the drivers just to enable the option for this old and relatively straightforward feature. Even then you have to go through an annoying process for EACH INDIVIDUAL GAME you want it enabled in.

In ATI's control center, or even in my custom Tray Tools, I just click one checkbox and it's done.
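
(For anyone who wants to see what that checkbox actually flips: "triple buffering" just means a second back buffer in the swap chain. A minimal Direct3D 9 sketch, not anyone's actual setup code; hWnd is assumed to be a window you've already created.)

[code]
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

// Sketch: create a D3D9 device with triple buffering and v-sync enabled.
// `hWnd` is assumed to be a valid window handle created elsewhere.
IDirect3DDevice9* CreateTripleBufferedDevice(HWND hWnd)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed             = TRUE;
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat     = D3DFMT_UNKNOWN;          // match the desktop format
    pp.BackBufferCount      = 2;                       // 2 back buffers + front buffer = triple buffering
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE; // v-sync on

    IDirect3DDevice9* dev = nullptr;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &dev);
    return dev; // with two back buffers the GPU keeps rendering while one frame waits on the vblank
}
[/code]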

As I said before, after I found this out I said forget about nVidia. You were apparently not aware of this problem, so I'm going to assume you've probably never dealt with it. And that's fine. Keep it at 60 Hz, for all I care. But just remember, I was tricking out vid cards while you were still watching Power Rangers, man. Trust me on this. nVidia used to be the king. I had my GeForce 256 for YEARS and it stayed on top of everything. But not anymore. What I saw on my bud's computer is enough to convince me of that. So it's probably gonna be dual X1900s for me.

Let me know when you get truform.
 
New Member
Retired Forum Staff
✔️ HL Verified
💻 Oldtimer
Joined
Apr 7, 2003
Messages
1,478
Best answers
0
Um.. Pride.. Granted, I haven't been fully following this thread, but why exactly do you keep bringing up "Truform"?

Unless Wikipedia is mistaken,

"Truform is a graphics texture technology created by ATI and employed in DirectX 8 and OpenGL, on both Mac and PC. The technology was first employed on Radeon 8500. However, after the card's release, the much-hyped technology was only used in a few games (such as Bugdom) and fell by the wayside. Typically 3D shapes are composed of large numbers of triangles. The more triangles are used, the finer the detail levels. Truform creates a curved surface between triangle points, so the user experiences curved lines, but the main software still only has to render triangles. It is designed to increase visual quality, without significantly impacting frame rates."

Why exactly would we care about an overlooked and neglected feature from the DX8 days? It seems a bit like an Intel advocate shouting, "We've got MMX! AMD doesn't have MMX!"
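
(For context on how exposed the feature actually was: under Direct3D 9, N-Patch tessellation, the API side of Truform, came down to a single call on the device. A sketch, assuming dev is an already-created device; this isn't from any particular game.)

[code]
#include <d3d9.h>

// Sketch: toggling Truform (N-Patch tessellation) in Direct3D 9.
// `dev` is assumed to be a valid IDirect3DDevice9* created elsewhere.
void DrawWithTruform(IDirect3DDevice9* dev)
{
    D3DCAPS9 caps;
    dev->GetDeviceCaps(&caps);

    if (caps.DevCaps & D3DDEVCAPS_NPATCHES)  // hardware N-Patch support (e.g. Radeon 8500)
    {
        dev->SetNPatchMode(4.0f);            // subdivide each triangle into a curved patch
        // ...DrawPrimitive calls for the meshes you want smoothed...
        dev->SetNPatchMode(0.0f);            // turn it back off for flat geometry
    }
}
[/code]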
 
New Member
💻 Oldtimer
Joined
Dec 3, 2002
Messages
2,490
Best answers
0
That was sort of my point. They never did get it. Granted, it wasn't utilized fully, but it was still the first thing that leapt to mind in defense of things ATI could do that nVidia couldn't.

Other than the glaring "no triple buffering in Direct3D" thing, of course.

I think I may still see bits and pieces of Truform acting on my machine sometimes; a friend and I have identical specs but when we play older games mine always look more rounded (Bullets in Max Payne 2, Missiles in Generals, etc.) It's probably just my imagination, honestly.

Fact remains, it was an old feature no one implemented. It's a similar principle to bump mapping--making something look like a different shape than the computer actually interprets--but for some reason no one thought that'd be a big technology.

Incidentally if I remember correctly I think they got bump mapping later than ATI. Or was it the other way around? I don't really remember, honestly.
 
Lost in space
Banned
💻 Oldtimer
Joined
Dec 21, 2003
Messages
3,608
Best answers
0
SailorAlea said:
Um.. Pride.. Granted, I haven't been fully following this thread, but why exactly do you keep bringing up "Truform"?

Unless Wikipedia is mistaken,

"Truform is a graphics texture technology created by ATI and employed in DirectX 8 and OpenGL, on both Mac and PC. The technology was first employed on Radeon 8500. However, after the card's release, the much-hyped technology was only used in a few games (such as Bugdom) and fell by the wayside. Typically 3D shapes are composed of large numbers of triangles. The more triangles are used, the finer the detail levels. Truform creates a curved surface between triangle points, so the user experiences curved lines, but the main software still only has to render triangles. It is designed to increase visual quality, without significantly impacting frame rates."

Why exactly would we care about an overlooked and neglected feature from the DX8 days? It seems a bit like an Intel advocate shouting, "We've got MMX! AMD doesn't have MMX!"
AMD does have MMX :/.

"Intel doesn't have 3DNow!, only we do!"
Intel practically purchased the rights from AMD to use 64-bit extensions on their chips. It could be a similar story with MMX, but an example like that is really old nowadays.
 
New Member
💻 Oldtimer
Joined
May 14, 2003
Messages
1,929
Best answers
0
|Overlord| said:
Intel practically purchased the rights from AMD to use 64-bit extensions on their chips.
Err...
SaiyanPride said:
Incidentally if I remember correctly I think they got bump mapping later than ATI. Or was it the other way around? I don't really remember, honestly.
I honestly don't know about bump mapping, as that could be done even on a GeForce 2, but nVidia brought programmable shading technology to the consumer level first with the GeForce 3.
 
New Member
💻 Oldtimer
Joined
Dec 3, 2002
Messages
2,490
Best answers
0
That is true, because Doom 3 was first unveiled on a GF3.

Either way, I still prefer ATI. In my experience they're a lot more flexible. The Direct3D/no-triple-buffering thing really bugs me because, as you know, Smith, I once had a lot of gfx problems until you asked me if I had triple buffering enabled, and once I did, it improved my performance vastly across the board. So naturally, when setting up anyone's card, I go the same route and adjust from there. And then I found nVidia cards couldn't do that without a bunch of BS, so that pretty much sealed the deal for me.
 
New Member
💻 Oldtimer
Joined
May 14, 2003
Messages
1,929
Best answers
0
SaiyanPrideXIX said:
Either way, I still prefer ATI. In my experience they're a lot more flexible. The Direct3D/no-triple-buffering thing really bugs me because, as you know, Smith, I once had a lot of gfx problems until you asked me if I had triple buffering enabled, and once I did, it improved my performance vastly across the board. So naturally, when setting up anyone's card, I go the same route and adjust from there. And then I found nVidia cards couldn't do that without a bunch of BS, so that pretty much sealed the deal for me.
I guess I could've told you just to disable vertical sync, heh.
 
New Member
💻 Oldtimer
Joined
Dec 3, 2002
Messages
2,490
Best answers
0
Yeah, but then I get that *** tearing problem. It seems so much more stable with vsync and a good monitor...
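
(The trade-off in Direct3D 9 terms, as a rough sketch: pp is the presentation-parameters struct a game fills in at device creation. The flags are the real API; the preferVsync toggle is just illustration.)

[code]
#include <d3d9.h>

// Sketch: the two presentation modes being argued about here.
void PickPresentationInterval(D3DPRESENT_PARAMETERS& pp, bool preferVsync)
{
    if (preferVsync) {
        // Wait for the vertical blank: no tearing, but with plain double
        // buffering a missed refresh can cut the frame rate in half.
        pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;
    } else {
        // Present as soon as the frame is ready: maximum FPS, but the
        // flip can land mid-scan, which is exactly the tearing above.
        pp.PresentationInterval = D3DPRESENT_INTERVAL_IMMEDIATE;
    }
}
[/code]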
 
G-Bear
✔️ HL Verified
🚂 Steam Linked
Discord Member
Joined
Nov 28, 2002
Messages
764
Best answers
0
SaiyanPrideXIX said:
I'd call that kind of important, no? You need to use NVTweak -- http://www.tweakguides.com/images/NVFORCE_8.jpg -- just to get access to some of the menus in the drivers themselves...including things like refresh rate forcing, a basic feature that even SP2 can do on its own (I think).

Then again, if you didn't care about this massive flaw, you would probably either A) have refresh-rate tearing and not notice it (maybe that's what you meant by 'texture slips' earlier), or B) have fixed the problem the way TweakGuides suggests...which involves installing three different things on top of the drivers just to enable the option for this old and relatively straightforward feature. Even then you have to go through an annoying process for EACH INDIVIDUAL GAME you want it enabled in.

In ATI's control center, or even in my custom Tray Tools, I just click one checkbox and it's done.

As I said before, after I found this out I said forget about nVidia. You were apparently not aware of this problem, so I'm going to assume you've probably never dealt with it. And that's fine. Keep it at 60 Hz, for all I care. But just remember, I was tricking out vid cards while you were still watching Power Rangers, man. Trust me on this. nVidia used to be the king. I had my GeForce 256 for YEARS and it stayed on top of everything. But not anymore. What I saw on my bud's computer is enough to convince me of that. So it's probably gonna be dual X1900s for me.
Seriously, cut the arrogance. I've been watching the video card scene since the Voodoo Banshee, so that last remark is totally out of line.

First of all, you trip over one thing: NO TRIPLE BUFFERING, NOES!1!!
Does that make ATi's drivers superior? No.
And there is also something called V-Sync to prevent tearing, and it's not the texture "problem". The texture thing is because nVidia doesn't support independent Anti-throspicfiltering, but then again ATi has its annoying texture crawling.

What if I made fun of the X800 series for its lack of SM3.0 support?
Then you would come back with the excuse that SM3.0 holds no advantage over SM2.0 (like the Far Cry screenshot).
Now you're basically being beaten to death with ATi's "WE GOT BETTER SM3.0!1!1" (for physics calculations and HDR).
SM3.0 didn't hold any advantage over SM2.0, eh?

I've always been a bit green when it comes to video cards, but now I'm thinking of getting an X1950 XT (soon to be released @ $399), but I won't.
Why?
Because when that video card is 4 years old, I want driver support, not some quick fixes and patches like they're planning for the 9000 series and already did with the older video card series.

And to be truly honest, early video cards like the GeForce 2 MX still kick ass. Who else can say he still runs GTA: San Andreas, plays FIFA 2006, and even CS: Source/DoD: Source on a Radeon 7000?
I can still play all that with my GeForce 2 MX400. I'll even try CoD2 in a few minutes.
 
New Member
💻 Oldtimer
Joined
May 14, 2003
Messages
1,929
Best answers
0
Devion said:
First of all, you trip over one thing: NO TRIPLE BUFFERING, NOES!1!!
Does that make ATi's drivers superior? No.
Yes and no. Yes, in the sense that ATi can do it out of the box. No, in the sense that nVidia can do it too, albeit in a more involved manner.
And there is also something called V-Sync to prevent tearing, and it's not the texture "problem".
I'm quite sure that SaiyanPrideXIX knows this.
The texture thing is because nVidia doesn't support independent Anti-throspicfiltering
I believe the words you're looking for are "anisotropic filtering".
I've always been a bit green when it comes to video cards, but now I'm thinking of getting an X1950 XT (soon to be released @ $399), but I won't.
Why?
Because when that video card is 4 years old, I want driver support, not some quick fixes and patches like they're planning for the 9000 series and already did with the older video card series.
My friend, when that video card is 4 years old, driver support will be the least of your worries.
And to be truly honest, early video cards like the GeForce 2 MX still kick ass. Who else can say he still runs GTA: San Andreas, plays FIFA 2006, and even CS: Source/DoD: Source on a Radeon 7000?
I can still play all that with my GeForce 2 MX400. I'll even try CoD2 in a few minutes.
Cool =). Oh, and I'm not purposely being mean, if you're interpreting it that way. You post a sarcastic/inflammatory post, I post a sarcastic reply.
SaiyanPrideXIX said:
Yeah, but then I get that *** tearing problem. It seems so much more stable with vsync and a good monitor...
Heh, I understand.

Well, anywho, can you both drop it? No more fanboyishness please, as the Cucumba asked. Thanks.

Edit: Actually, after reading the last series of posts, this has already gotten pretty far off-topic. No worries, though; the main topic still got a decent little chat, and a lot of info was brought up. As such, closed.
 
