ATI or NVIDIA?

  • Thread starter 2ez2KiLL
  • 88 comments
  • 1,998 views
Start > Control Panel > Display > Settings > Advanced > GeForce * > Clock Frequencies

(*name of your Nvidia card)
 
I'm sure it comes with instructions.

Thermal paste? You use paste with heatsinks, not fans.
 
No prob. 👍

Correction: Yes, you can use thermal paste. I didn't realize the fan from FrozenCPU is also a heatsink.
 
I'd say go with ATI. Read some reviews on Tom's Hardware; ATI outperforms Nvidia on most of the tests. And I've used both. Nvidia drivers suck hardcore. The "new" drivers made my geforce crap. My laptop's 9200 32MB outperforms my Geforce 2 MX400 64MB
 
Originally posted by AznFuman
Nvidia drivers suck hardcore.

And ATI's Catalyst drivers are all daisies and sunshine? Yeah, right. Try out the StarStorm Detonators and then come back and tell me who has the best drivers.

The "new" drivers made my geforce crap.

Usually, 'new' drivers make older cards perform worse, since the 'new' drivers are tuned for newer cards.

My laptop's 9200 32MB outperforms my Geforce 2 MX400 64MB

Did you realize that a GeForce 2 MX is just a dressed-up version of the original GeForce GTS? Of course your Radeon 9200 outperforms it: that's three or four years of advancements in technology.
 
haha... well, some things I didn't mention:
GeForce 2 MX400 64MB
I used to get 60-80 fps on a fully loaded CS server
at 1024x768 resolution in OpenGL.
Now I'm down to 640x480 resolution and still only get 40-60 fps.
I lowered every setting to the minimum.


Yeah, I realize the clock speed, type of RAM, and pipeline design are a lot more advanced. But did you know the 9200 is based on the 8500? It's not that much more advanced... maybe 2-3x. But still 32MB, so 2x.

But yeah, go to
www.tomshardware.com
they've got the tests...
 
I suggest you install the older versions of the Detonator drivers. The 50.xx series were designed just for the FX series. For the GeForce 2 MX, you might have to go all the way back to the 20.xx series.
 
I say ATI, not because I have a 9200 128MB, but because in every comparison I see, the ATI beats the Nvidia...
I used to like Nvidia, but after seeing comparison after comparison, I'd take the ATI any day.
 
I bought ATI's 9600XT yesterday and so far I am impressed by it. It outperforms my Ti4200 and I can even play Call of Duty at resolutions higher than 1024 x 768.

If anyone has this card, what settings do you recommend running it at? I have Anti-Aliasing at 2x and Anisotropic Filtering at 8x. I may bump it to 4x/16x to see if it causes a performance hit or not. I usually play at 1024 x 768 but if I can run higher resolutions I will.

Also, does anyone with an ATI card run Overdrive? I'm wondering if that will hurt my card if I overclock it by 50 MHz without a better heatsink/fan for the card.

I really hope this card isn't outdated by the time Half-Life 2 comes out. It's one of the reasons I bought it.

EDIT: I'm planning to run 3dmark2003 to benchmark my card as soon as I finish downloading it. I hate 56k.
 
wow, 20.XX version.. shiet.. haha.. ok i'll try it thanx Viper

2ez2KiLL - rephrase your question please, I don't get exactly what you're asking. If you're wondering whether DDR memory will double your speed, it doesn't.

Factors:
- The way the card is designed (the pathways; the simpler, the better)
- Memory speed of the card
- CPU and GPU speed
- Amount of RAM you have
- The amount of processor availability you have (e.g. is the processor already running at max?)
- The temp of the GPU and CPU has a large impact.


And Matrixhasu77 - if you overclock your card you should use a heatsink WITH a fan, just in case. I'm no expert, but yeah, I'd rather spend another 50 or so dollars on a good heatsink and fan than have the card screw up on me.
 
Originally posted by 2ez2KiLL
hey if my vid card is ddr that means my output is double what i alrdy have on clock and memory?

I already explained it to 2ez2KiLL, but for anyone else who was wondering...

DDR stands for Double Data Rate. For every clock cycle, the memory can transfer two pieces of data: one on the rising edge and one on the falling edge of the clock. For example: if your video card's memory runs at 300 MHz, it can make 600 million transfers per second to the GPU.
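A rough sketch of that arithmetic (the numbers are illustrative, not the specs of any particular card; the 128-bit bus width is an assumption typical of cards from that era):

```python
# DDR effective rate: two transfers per clock cycle (rising + falling edge).
clock_mhz = 300          # actual memory clock
transfers_per_cycle = 2  # DDR transfers data on both clock edges
bus_width_bits = 128     # assumed bus width, common for this era

effective_mhz = clock_mhz * transfers_per_cycle
bandwidth_gb_s = effective_mhz * 1_000_000 * bus_width_bits / 8 / 1_000_000_000

print(effective_mhz)             # 600
print(round(bandwidth_gb_s, 1))  # 9.6
```

So a "300 MHz DDR" card is often advertised as 600 MHz, but the clock itself is still 300 MHz.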
 
Originally posted by Matrixhasu77

If anyone has this card, what settings do you recommend running it at? I have Anti-Aliasing at 2x and Anisotropic Filtering at 8x. I may bump it to 4x/16x to see if it causes a performance hit or not. I usually play at 1024 x 768 but if I can run higher resolutions I will.

Experiment! For example, a higher resolution requires less anti-aliasing, but it could have a negative performance impact. It's a trade-off.

IMO, I would run all games with at least 2x AA and AF. If a game allows more, then bump it up until you are satisfied with the image quality and performance speed.
 
Originally posted by AznFuman
And Matrixhasu77 - if you overclock your card you should use a heatsink WITH a fan, just in case. I'm no expert, but yeah, I'd rather spend another 50 or so dollars on a good heatsink and fan than have the card screw up on me.

I've turned Overdrive on and my card's core clock speed without stress is 526 MHz. That's 27 MHz faster than the stock speed. I have no idea what it is while playing a game. The card is supposed to be able to detect when it's getting too hot and slow itself down. I played CoD at 1280 x 1024 last night at 4x/8x and saw no problems whatsoever.
 
I suggest you read up on some of the posts at guru3d.com. There is no 'correct' or 'best' driver for any card. Some drivers work for a game, some don't. Some give you a performance boost, some improve quality. You need to test out the drivers yourself.

Detonator drivers can be downloaded here: http://download.guru3d.com/detonator/

The current version for the Detonators (ForceWare) is 53.06.

Happy driver testing! :)
 
The 50 series drivers are ok but I'm thinking about going back to 40 series drivers. Specifically either 43.00 or 43.51. I never had any problems with those drivers and they seemed to make my card work great. Not to mention I never had any problems playing games with those drivers.
 
You know, I have a question.

I installed that registry tweak for overclocking and it says my memory is at 334MHz. Is that because of DDR memory?
I've got a GeForce 2 MX200 and I believe the memory is like 16xMHz, so that times 2 is about 334. I just want to make sure that if I up that setting I don't blow anything up.
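If the tool is showing the DDR effective rate, halving it gives the actual clock to compare against the rated spec. A hypothetical sanity check, not tied to any particular tool:

```python
# Some overclocking tools report the DDR "effective" rate (transfers/s)
# rather than the actual memory clock. Halve the reported figure to
# compare it against the card's rated clock before raising anything.
reported_mhz = 334                    # what the registry tweak shows
actual_clock_mhz = reported_mhz / 2

print(actual_clock_mhz)  # 167.0
```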
 
Burnout: Look at page four, I already explained about DDR.

DDR does not magically double your clock speed.
 
DDR does not magically double your clock speed.
All these years I th...

Dude, I know what DDR (Double Data Rate) is. I was just making sure it wasn't defaulting the memory clock to twice what it was rated for. I wasn't asking what DDR was...

Heck, I build computers. :irked:
 
We are gathered here today... for 2ez2KiLL's FX5200. It was a good card; it has come and gone. I overclocked it too much, my exhaust fan failed, and it melted my GPU. Oh well, say hello to my Leadtek FX5950 Ultra :)
 