Originally posted by canadian_divx
well first off, yeah, I am pissed off that they needed to do that. second, ATI has also done this, but I don't think to the extent that nvidia has. and third, most of the gamers that I know, including myself, don't go by what a benchmark made by a company that has never shipped a real game says; we go by how the games play on it, not what some other program does on it, and as long as the picture quality is there and you get at least 30fps then it is good. but it is very shameful that they needed to resort to this to get people to buy their cards
ati has taken the high road and addressed its optimizations...
they did nothing to change the way 3dmark03 really works... it is the renaming of functions (two of them) so the shaders map better onto their architecture... which is considered a valid optimization... they are not lowering precision or clipping the scene the way nvidia has decided to do... even so, ATI is REMOVING these optimizations...
reading nvidia's response this does not seem to be their intention...
they are flat out blaming Futuremark for singling them out, as well as insinuating that ATI is involved in this fracas... even though it was ET and B3D that came out with the info in the first place...
note: nvidia's PS 2.0 and vertex shader performance is abysmal... I don't really care so much about their scores as the fact that their shader performance is WELL below what I would expect... i.e. it is broken in hardware...
by artificially speeding this up through cheating in the drivers, they imply the product is a GOOD performer in dx9-level shader performance...
that's BS, as we all know now... as a product to run current games, anything above a GF4 Ti 4200 / ATI Radeon 9500 Pro is good enough...
for future games where shader performance is at a premium... things do not look good for nvidia unless they can find a way to get game developers to do some fancy work to cover up the hardware's deficiencies...
yes.. on paper nvidia's FX products (5800/5900) are excellent and appear better than ati's products...
BUT... while ATI's products are designed to the DirectX 9 specs, nvidia's products are a little lacking... there IS a performance hit when dealing with fp16 or fp32 precision... dx9 specifies fp24, which is what ATI's products do ALL the time...
nvidia's FX GPUs natively support integer (FX12, I believe) plus 16-bit and 32-bit floating point... but the drivers generally render scenes using FX12/fp16 RATHER than fp32...
the quality difference between fp16 and fp32 is not that great, though there is a quite noticeable performance hit... however, if the product is performing dx9 operations at fp16 precision, that is not following the dx9 API...
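to make the precision point concrete, here is a quick stdlib-only Python sketch (nothing GPU-specific, just the IEEE half/single formats that Python's `struct` module can round-trip). fp24 has no standard Python type, but ATI's format sits between the two, with a 16-bit mantissa versus fp16's 10 bits, so its error falls well below fp16's:

```python
import struct

def roundtrip(value, fmt):
    """Pack a float at the given precision, then unpack it back as a double."""
    return struct.unpack(fmt, struct.pack(fmt, value))[0]

x = 1.0 / 3.0
fp32 = roundtrip(x, "<f")   # 32-bit single precision (23-bit mantissa)
fp16 = roundtrip(x, "<e")   # 16-bit half precision (10-bit mantissa)

# fp16 keeps only ~3 decimal digits; fp32 keeps ~7
print(f"fp32: {fp32!r}  error: {abs(fp32 - x):.2e}")
print(f"fp16: {fp16!r}  error: {abs(fp16 - x):.2e}")
```

the error gap is a few orders of magnitude... small enough to hide in a single color value, but shader math chains many operations, which is where the fp16 shortcut starts to show.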
it is unfair to ATI, which has decided to follow the standards and does so properly...
perhaps as more FUD comes out about nvidia, people will understand better why I despise their ethics and the 'values' they hold dear...