Need a solid Video Card (got one)

New Member
Joined
May 14, 2003
Messages
904
Best answers
0
Also, if you can find one, grab a Radeon 9550! If it has a 128-bit memory interface, it will wipe the floor with the 64-bit 6200.
Dude... God save your soul... that graphics card is about equivalent to a TNT2 in performance... except the TNT2 had a 128-bit interface!
I love how you're all condemning a card when you probably haven't seen a single review of it. I also love how you're both judging performance on memory bandwidth alone. The 6200A > a Radeon 9550, even if the former has only a "64-bit" memory bus. Please, both of you, understand that as technology advances, more and more optimizations take place that let more be done with less, which in this case is memory bandwidth. The 6200 performs admirably, even with its short leash. That's not to say more isn't better, especially with today's latest offerings (yes, I'm talking to you, 8600GT/S).
... I doubt that. The GeForce 6200 has 4 pipes and, I think, 3 vertex units.
To find the total memory bandwidth, just multiply the memory bus width in bits by the memory speed, for instance 128 × 500 MHz (for the GeForce FX 5200), which equates to 64000, or 6.4 GB/s.
The TNT2 had a 128-bit bus; the M64 had a 64-bit one.
The TNT2 ran at 150 MHz × 128 = 19200, or 1.92 GB/s.
On the other hand...
The Biostar 6200A card has 533 MHz memory,
so 533 × 64 = 34112, or 3.4112 GB/s.
Overclock that and you might be able to hit the 4 GB/s barrier :)
That's a flawed way of determining memory bandwidth. He won't have to break the 4 GB/s barrier, considering the 64-bit 6200A will already have ~4.4 GB/s.
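For anyone following along, here's where those two figures come from, assuming a 64-bit bus and the effective memory clocks quoted in this thread (a quick Python check, not vendor specs):

bits = 64
for mhz in (533, 550):
    print(f"{mhz} MHz -> {bits / 8 * mhz / 1000:.2f} GB/s")
# 533 MHz -> 4.26 GB/s; 550 MHz -> 4.40 GB/s
# the 3.4112 figure above comes from dividing by 10 instead of by 8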

One of the few things I saw and agreed on was to wait for this:
Pemalite said:
That'll eat any 6200 for breakfast.
 
Member
✔️ HL Verified
🚂 Steam Linked
Discord Member
Joined
Jan 7, 2003
Messages
347
Best answers
0
Location
South Australia
That's a flawed way of determining memory bandwidth. He won't have to break the 4 GB/s barrier, considering the 64-bit 6200A will already have ~4.4 GB/s.
http://www.evga.com/products/moreinfo.asp?pn=256-A8-N341-LX&family=17

It shows 3.2 GB/s of memory bandwidth, so my 'flawed' way of finding the memory bandwidth isn't that far off after all.
Memory is usually left pretty open to the manufacturers anyhow; one card might have 2 GB/s of bandwidth, yet another model that seems to be the same may have 4 GB/s. And another thing: some graphics cards might show 8 GB/s of bandwidth on the box when it's actually the PCI-E bandwidth, or the AGP bandwidth added to the memory bandwidth (advertising).

http://www.digital-daily.com/video/nv-gf6200agp-roundup/index02.htm
Note that in Half-Life 2 it almost made it to the top of the stock 6200s with the 64-bit memory bus.

I used to own a 9550, and they are actually rather good overclockers. And I prefer ATI cards; they always seem... more tweakable. My honest opinion, anyway.

The Radeon 9xxx series wasn't bad (minus the 9200/8500 derivatives).
And the 9500 Pro/9600/9700/9800 series can all play Oblivion, and considering these cards came out some years ago... they do rather well.
And I was choosing a card that might be better suited to the games he plays.

I love how you're all condemning a card when you probably haven't seen a single review of it
And yes, I have read reviews; you just weren't around to watch me read 'em ;)

Another minor annoyance that I find with the 6200 series is the lack of colour and Z-compression support in the memory controller. What this means is that anti-aliasing, anisotropic filtering, and larger resolutions take an even larger performance hit. Even if they did improve the anti-aliasing engine, removing the colour and Z-compression support sorta negates those optimizations.
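To put a rough number on that, here's a back-of-the-envelope Python sketch of framebuffer write traffic; the resolution, frame rate, and per-sample byte counts are illustrative assumptions, not measurements from any card in this thread:

width, height, fps = 1024, 768, 60
bytes_per_sample = 4 + 4               # assumed 32-bit colour + 24/8 Z-stencil
for samples in (1, 2, 4):              # no AA, 2x AA, 4x AA
    gb = width * height * bytes_per_sample * samples * fps / 1e9
    print(f"{samples}x AA: ~{gb:.2f} GB/s of framebuffer writes")
# 1x: ~0.38, 2x: ~0.75, 4x: ~1.51 GB/s, before any texture reads,
# and without colour/Z compression none of it can be clawed back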

If you need more performance/benchmark information on the 6200, go here: http://anandtech.com/video/showdoc.aspx?i=2238&p=7
 
New Member
Joined
May 14, 2003
Messages
904
Best answers
0
http://www.evga.com/products/moreinfo.asp?pn=256-A8-N341-LX&family=17

It shows 3.2 GB/s of memory bandwidth, so my 'flawed' way of finding the memory bandwidth isn't that far off after all.
Uh, yeah, it is. You were "determining" the bandwidth of 64-bit at 533 MHz, not 64-bit at 400 MHz, which is what that link shows. You were pretty far off, indeed.
Pemalite said:
Memory is usually left pretty open to the manufacturers anyhow; one card might have 2 GB/s of bandwidth, yet another model that seems to be the same may have 4 GB/s. And another thing: some graphics cards might show 8 GB/s of bandwidth on the box when it's actually the PCI-E bandwidth, or the AGP bandwidth added to the memory bandwidth (advertising).
Generally not, though.
Pemalite said:
http://www.digital-daily.com/video/nv-gf6200agp-roundup/index02.htm
Note that in Half-Life 2 it almost made it to the top of the stock 6200s with the 64-bit memory bus.
I don't know why you quoted that, unless it was to solidify my claim: "The 6200A > a Radeon 9550, even if the former has only a '64-bit' memory bus."
Pemalite said:
Another minor annoyance that I find with the 6200 series is the lack of colour and Z-compression support in the memory controller. What this means is that anti-aliasing, anisotropic filtering, and larger resolutions take an even larger performance hit. Even if they did improve the anti-aliasing engine, removing the colour and Z-compression support sorta negates those optimizations.
It's low-end. It's not meant to do high levels of AA (or any at all). It's meant to render games, not render them at the highest settings.
 
Member
✔️ HL Verified
🚂 Steam Linked
Discord Member
Joined
Jan 7, 2003
Messages
347
Best answers
0
Location
South Australia
I don't know why you quoted that, unless it was to solidify my claim: "The 6200A > a Radeon 9550, even if the former has only a '64-bit' memory bus."
I don't know how you can make that claim. I wouldn't be surprised if a manufacturer produced a 9550 that wiped the 9200s; the benchmarks are way too close to each other to call a victor.
And note that the 9550 can be modded into a 9600 Pro and then overclocked.
You just gotta flash the BIOS... and even if you don't flash the BIOS, the 9550 can reach 9600 speeds rather easily, which will then leave the 6200A in the dust.

It's low-end. It's not meant to do high levels of AA (or any at all). It's meant to render games, not render them at the highest settings.
I don't care what you say, but if I was buying a low-end card, I would like to run AA in Unreal Tournament at least! And 2x AA looks better than none.

Uh, yeah, it is. You were "determining" the bandwidth of 64-bit at 533 MHz, not 64-bit at 400 MHz, which is what that link shows. You were pretty far off, indeed.
I realized the difference between the 533 MHz and the 400 MHz.
If you have a better method, by all means show me.
To me it's enough to roughly estimate the bandwidth, as GPUs have other ways of saving on bandwidth anyway, like S3's texture compression (S3TC).
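For reference, the S3TC (DXT1) saving is easy to work out in Python; these are the standard format figures, not anything specific to the cards in this thread:

block_pixels = 4 * 4                   # DXT1 encodes 4x4 pixel blocks
compressed_bytes = 8                   # each block packs into 64 bits
uncompressed_bytes = block_pixels * 4  # the same block stored as 32-bit RGBA
print(f"{compressed_bytes * 8 / block_pixels} bits/pixel, "
      f"{uncompressed_bytes / compressed_bytes:.0f}:1 vs 32-bit RGBA")
# -> 4.0 bits/pixel, 8:1 (6:1 against 24-bit RGB)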
 
New Member
Joined
May 14, 2003
Messages
904
Best answers
0
I don't know how you can make that claim. I wouldn't be surprised if a manufacturer produced a 9550 that wiped the 9200s
We're discussing the 9550 and the 6200A, not the 9200s.
Pemalite said:
And note that the 9550 can be modded into a 9600 Pro and then overclocked.
You just gotta flash the BIOS... and even if you don't flash the BIOS, the 9550 can reach 9600 speeds rather easily, which will then leave the 6200A in the dust.
Oh I forgot that it was impossible to also overclock the 6200.[/sarcasm]
Pemalite said:
I don't care what you say, but if I was buying a low-end card, I would like to run AA in Unreal Tournament at least! And 2x AA looks better than none.
You can do whatever you want; I'm just telling you what they're for. You can keep your fingers in your ears and ignore the truth all you want, heh.


Pemalite said:
I realized the difference between the 533 MHz and the 400 MHz.
I'm assuming you "realized" it after I showed you?
Pemalite said:
If you have a better method, by all means show me.
Heh. You multiply the memory bus width (in bytes, not bits) by the clock speed. Then you get the accurate rating. So for my 6800, 32 × 700 = 22400, or 22.4 GB/s. THE MORE YOU KNOW.
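That method in a few lines of Python, using only figures already quoted in this thread (a minimal sketch, not vendor spec sheets):

def mem_bandwidth_gb(bus_bits, effective_mhz):
    # bus width in bytes times the effective (DDR) clock, in GB/s
    return bus_bits / 8 * effective_mhz / 1000

print(mem_bandwidth_gb(256, 700))  # 6800: 32 bytes x 700 MHz = 22.4 GB/s
print(mem_bandwidth_gb(64, 400))   # EVGA 6200 from the link above: 3.2 GB/s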
 
Senior Member
💻 Oldtimer
Joined
Oct 21, 2003
Messages
2,706
Best answers
0
I picked the Biostar GeForce 6200A in the end. I just got it yesterday and I was pleasantly surprised. The installation was painless: I just installed the latest ForceWare drivers and it was ready to go. I can play most games very well; HL2, for instance, I can play on the highest settings and get 25-40 fps. This is a shock to me, because the rest of my system isn't amazing at all. I have an AMD Sempron 1.4 GHz processor and 768 MB of RAM.

Not bad at all for a $35 card, here's a screen:



I even get HDR options by default (dunno if that's amazing, but I didn't get that option on my GeForce4 Ti 4200).
 
