nVidia Caught Cheating... Again

  • Thread starter Shannon
WE CALL IT optimisation these days, but what do you call it when a firm over-optimises? We used to call it something different back in the days of the NV30, when Nvidia had to do something to make its chip work better and score better in benchmarks, but to our surprise it seems to have done it again. The chaps at 3DCenter, a very talented in-depth site, have spotted, tested and proved that Nvidia is using lower anisotropic filtering quality than any other card available.

The site noticed a texture shimmering problem when using the normal driver settings. The same was true of NV40 cards, but there you could fix the flickering by switching to the high quality driver setting. That doesn't work on G70 based cards, so the crew, well known for their thorough benchmarks, went digging a little deeper into the chip.

It turns out that Nvidia is not doing anisotropic filtering the way it should, and picture quality is what suffers. You will get a shimmering effect on your textures whenever you use a Geforce 7800 GTX, but you won't see it on Radeon cards.

They claim that all NV40 and G70 cards suffer from the same flickering problem and that these cards have "by far worse AF quality". They also say the flickering is the result of general undersampling. It's interesting to note that the older Geforce 5800 Ultra doesn't suffer from this, only the newer 6800 and 7800 based cards.
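
To make the "general undersampling" point a bit more concrete, here is a toy sketch of why taking too few texture samples shows up as shimmering rather than just as blur. It is a deliberately simplified one-dimensional model, not how NV40/G70 hardware actually works: with enough taps across the pixel's footprint the filtered value barely moves as the camera drifts, while an undersampled result flips between extremes from frame to frame, which on screen reads as flicker.

```python
import math

def texel(x):
    """A high-frequency 1-D 'texture': alternating bright and dark stripes."""
    return 1.0 if math.floor(x) % 2 == 0 else 0.0

def filtered(centre, footprint, taps):
    """Average `taps` samples spread evenly over `footprint` texels --
    a stand-in for filtering along the long axis of an anisotropic
    pixel footprint."""
    return sum(texel(centre - footprint / 2 + footprint * (i + 0.5) / taps)
               for i in range(taps)) / taps

footprint = 16.0  # this pixel covers 16 texels along the line of sight
for shift in (0.0, 0.5, 1.0, 1.5):  # the camera drifts across the texture
    full  = filtered(100.0 + shift, footprint, 16)  # enough taps: stays ~0.5
    under = filtered(100.0 + shift, footprint, 4)   # too few taps: jumps about
    print(f"shift {shift:3.1f}: 16 taps -> {full:.2f}, 4 taps -> {under:.2f}")
```

In this sketch the 16-tap result sits at 0.50 whatever the shift, while the 4-tap result snaps between 1.00 and 0.00 as the footprint moves by a single texel - the kind of frame-to-frame change that is perceived as texture shimmer.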

Another German web site, Computerbase, went a step further. It made a custom driver by changing the inf file so that the driver could not recognise the 7800 GTX and apply its optimisations. The card was listed as unknown but worked just fine. When they ran their benchmarks on those drivers, however, they saw a massive performance drop, close to 30 percent, and traced it back to anisotropic filtering. Nvidia has a lot to explain.
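
The article doesn't say exactly which inf entries Computerbase changed, so the following is only a guess at the mechanism: a small Python sketch that strips every reference to an assumed 7800 GTX PCI device ID out of the display driver's inf, so that setup no longer identifies the card by name. The file name (nv4_disp.inf) and device ID (DEV_0091) are my assumptions for illustration, not taken from the article.

```python
# A guess at the Computerbase trick: comment out every inf line that
# mentions the 7800 GTX's PCI device ID, so the installed driver no longer
# knows exactly which card it is driving and (presumably) cannot apply its
# card-specific filtering optimisations.  File name and device ID below
# are assumptions, not taken from the article.
DEVICE_ID = "DEV_0091"  # assumed PCI device ID of the GeForce 7800 GTX

with open("nv4_disp.inf", encoding="latin-1") as src:
    lines = src.readlines()

with open("nv4_disp_unknown.inf", "w", encoding="latin-1") as dst:
    for line in lines:
        # INF comments start with a semicolon
        dst.write("; " + line if DEVICE_ID in line else line)
```

Whether the real modification removed the entry, renamed it, or pointed it at a generic device string, the effect described is the same: the card shows up as unknown and the driver falls back to unoptimised filtering.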

3DCenter's original article is here in English, while the Computerbase article is here, in German. We will ask Nvidia what is going on, but we think there's something up. At least the guys proved it isn't a hardware bug - it's a driver problem only, but performance drops dramatically as soon as you fix it.

- Source: The Inquirer


For the price of these cards, they shouldn't be lowering the image quality. If ATI can achieve around the same framerates (if anyone mentions SLI I'll beat them with a stick :P) with better image quality, nVidia have no excuse. 👎

ATI +1
nVidia -1
 
German Muscle
i have a SLI board with a ATI card. owned! :D
I meant the speed difference between running two nVidia cards in SLI and a single ATI card, in which case the SLI setup will obviously run faster. :rolleyes: But comparing a single GeForce to a single Radeon, the FPS is around the same (I'm sure when ATI bring out their new card it will be just as fast as the 7800 GTX).

So, if ATI can make their cards run around the same speed as nVidia's without lowering image quality, what's nVidia's excuse?
 
Their excuse is "We can't keep up with ATi on a single card basis, so we cheated and made our cards look even more amazing-er than we have already hyped them up to be. We implemented SLi to cover this fact up. So, any questions?"

At this point a hundred fanboys ran to the podium and battered the speaker to death with their SLi motherboards. Amidst the ruckus, a cry of "How do we get our money back you homo?!" was heard.

True story.
 
SLI is simply the best. Anyway, ATI are planning to implement their own version of SLI on their cards, so I think it's hypocritical to claim SLI is an Nvidia point of weakness (POW).
 
German Muscle
i have a SLI board with a ATI card. owned! :D
What's the point of having an SLI board if you only have an ATI card... not two of them?

:confused:

Why not just go buy that second card?
 
But why would you want to have cards from competing companies? That's like teaming up the devil with the archangel... trouble is bound to happen.
 
Ev0
Wasn't ATI caught cheating a while ago, or am I just getting something mixed up in my head?

Yes, ATI was caught too. Just like in politics, only one side seems to be covered.
 
Hmm, so my memory is not playing tricks on me.

Well, I guess this is sort of like F1. It's fairly safe to assume everyone cheats to some extent, but the ones who get away with it are just simply more clever than those who get caught.

But either way, I'm still staying loyal to ATI. They're a Canadian company producing good products, and their cards do run the Source engine faster than nVidia. I don't see myself switching anytime soon.
 
I would like to see comparison photo evidence to see if nVidia is truly cheating again. As of now, I don't see any evidence to claim nVidia is cheating.
 
sting
But why would you want to have cards from competing companies? That's like teaming up the devil with the archangel... trouble is bound to happen.

No it doesn't, actually. I have a friend who has both an X850 XT and a 7800 GT in his PC, and they work very well together. Each has its own advantages and disadvantages, and they work in conjunction to make an almost perfect machine.

He has no trouble at all and can switch between the two for different games and tasks depending on which is best suited for the operation he intends to perform.
 
Ev0
Wasn't ATI caught cheating a while ago, or am I just getting something mixed up in my head?

No, you're quite right... ATI cheats, too. Oh, the horror. *yawn*

These benchmarks are meaningless to me. I don't plan on ever owning any of these cards, since I don't do any PC gaming.
 
Jedi2016
No, you're quite right... ATI cheats, too. Oh, the horror. *yawn*

These benchmarks are meaningless to me. I don't plan on ever owning any of these cards, since I don't do any PC gaming.

I almost feel the same way, except I'll still continue buying Nvidia.
 
Viper Zero
Yes, ATI was caught too. Just like in politics, only one side seems to be covered.
From what I've read, ATI's anisotropic filtering level varies depending on the angle you're viewing the texture at. If you look at a texture head-on at 90 degrees with 16x AF enabled, it will look exactly the same as with no AF, but will still take 16 times as many texture samples. So the ATI card dynamically adjusts the level depending on the angle you're viewing it at. And this can easily be disabled to make the card do full trilinear filtering.
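
Just to illustrate the kind of adaptivity being described here (a toy model, not ATI's actual algorithm - real hardware decides per pixel from the texture-space footprint, not from a single viewing angle): the idea is to spend the maximum number of anisotropic taps only where the footprint is genuinely stretched, and drop back towards plain trilinear where it isn't.

```python
import math

def af_level(viewing_angle_deg, max_level=16):
    """Toy adaptive anisotropic filtering: pick the number of taps from how
    obliquely the surface is viewed.  Head-on (90 degrees) the pixel
    footprint is roughly square and AF adds nothing; nearly edge-on the
    footprint is long and thin and needs many taps."""
    angle = math.radians(max(viewing_angle_deg, 1.0))  # avoid blowing up at 0
    anisotropy = 1.0 / math.sin(angle)                 # footprint elongation
    if anisotropy <= 1.0:
        return 1                                       # plain trilinear is enough
    level = 2 ** math.ceil(math.log2(anisotropy))      # round up to 2x/4x/8x/16x
    return min(level, max_level)

for angle in (90, 60, 30, 10, 4):
    print(f"viewing angle {angle:2d} deg -> {af_level(angle):2d}x AF")
```

Done this way, a wall you look at straight on gets the same image it would with AF forced on, without paying for the extra taps - which is the saving the post describes.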

The fact is, nVidia were caught cheating before and now they've been caught again. I guess there's no such thing as learning from mistakes...

Viper Zero
I would like to see comparison photo evidence to see if nVidia is truly cheating again. As of now, I don't see any evidence to claim nVidia is cheating.
Did you click the link at the bottom?

http://www.3dcenter.org/artikel/g70_flimmern/index2_e.php

There's video evidence there if you wish.
 
Sorry, Shannon. That link was down when I posted, so I didn't see the evidence.

With the evidence, can it really be called cheating? It seems like it's just lower quality, the same as it's always been compared to ATI. Is nVidia doing this on purpose? Can we tell?
 
Flame-returns
I'm confused. So 7800GTX or X850XT?
7800gtx is dx10 ready. x850xt isn't. Just wait for the next ATI card to hit the market and look at the comparisons between that and the 7800gtx. The ATI one should hit the streets sometime before the end of this month.
 
Jedi2016
No, you're quite right... ATI cheats, too. Oh, the horror. *yawn*

These benchmarks are meaningless to me. I don't plan on ever owning any of these cards, since I don't do any PC gaming.

Then you better not buy an Xbox 360, or a Playstation 3 (or a regular Xbox for that matter), as the Xbox 360 contains a custom ATI card, the Playstation 3 contains a card that Nvidia co-developed with Sony, and the original Xbox contains a custom Nvidia GeForce3 / nForce chipset on board.

Does make your life easier though - from now on you'll only have to consider one brand: Nintendo!

But might get difficult if something like GT5 comes out on the PS3. :scared: :lol:
 
Arwin
Then you better not buy an Xbox 360, or a Playstation 3 (or a regular Xbox for that matter), as the Xbox 360 contains a custom ATI card, the Playstation 3 contains a card that Nvidia co-developed with Sony, and the original Xbox contains a custom Nvidia GeForce3 / nForce chipset on board.

Does make your life easier though - from now on you'll only have to consider one brand: Nintendo!

But might get difficult if something like GT5 comes out on the PS3. :scared: :lol:
Nintendo uses an ATI chipset ;)

Incidentally, all 3 console manufacturers are sourcing their cpu's through IBM. A conspiracy of some sorts? I think so.

:P
 
Flame-returns
Because if you have an SLI board you can have an ATI and an NVidia, best of both worlds.
is this true!?


I'm not really computer savvy, but I would like to build my own (an AMD64 3500, to be exact).
I need to talk to someone for some advice; if they can give me good advice, I'll put them in my sig.
Now back to nVidia cheating.
 
svtsnake
is this true!?
To make use of the SLI capability, you'll need a pair of nvidia cards on an Nforce based SLI board or a pair of ATI cards (one of them being a crossfire model) installed onto a motherboard with ATI's chipset.

If you *can* mix ATI and Nvidia, it will just be to have two video cards in one PC - the only real use would be to support more than two monitors. Even then, a matched SLI combo would end up better, if only to avoid driver conflicts...
 
Flame-returns
But I need to decide by the 19th of this month!!!
If the ATI R520 isn't out by then, I'd go with the 7800 GTX just for the sake of being a WGF card (WGF = DX10. It isn't called DirectX anymore people :irked:).
 
emad
To make use of the SLI capability, you'll need a pair of nvidia cards on an Nforce based SLI board or a pair of ATI cards (one of them being a crossfire model) installed onto a motherboard with ATI's chipset.

If you *can* mix ATI and Nvidia, it will be just to have 2 video cards on 1 pc - the only real use would be to support more than 2 monitors. Even then, having an SLI combo would end up better just because of driver conflicts..

You can mix them, running say FS2004 with the ATI card and Far Cry with the GeForce. You can't run the two cards together at the same time, though, and switching between one and the other can be fiddly. Driver conflicts are quite rare, but SLI is still much more convenient :).
 
Ev0
Wasn't ATI caught cheating a while ago, or am I just getting something mixed up in my head?
Yes they were - it was a driver thing.

Arwin
Then you better not buy an Xbox 360, or a Playstation 3 (or a regular Xbox for that matter), as the Xbox 360 contains a custom ATI card, the Playstation 3 contains a card that Nvidia co-developed with Sony, and the original Xbox contains a custom Nvidia GeForce3 / nForce chipset on board.

Does make your life easier though - from now on you'll only have to consider one brand: Nintendo!

But might get difficult if something like GT5 comes out on the PS3. :scared: :lol:
Glad HL2 looks way better on my PC than on the Xbox.
 
Shannon
If the ATI R520 isn't out by then, I'd go with the 7800 GTX just for the sake of being a WGF card (WGF = DX10. It isn't called DirectX anymore people :irked:).
Would I be correct in thinking the 7800GT is also a DX10 card?

Sorry for bringing the thread up again. :nervous:
 
What shimmering? I've got 4 different nVidia cards and 3 Radeons lying around...

I wanted to test this theory, and I still see no quality difference between the two brands. I just see tons better framerates on the nVidia side of the border. I don't see this story holding water.

nVidia cards:
6600gt agp8x
6800ultra agp8x
6800gt pci-x
7800gtx pci-x

Radeon cards:
X800 pro TURBO agp8x
X700 pro pci-x
X850 xt pci-x
 
