pixel shader 2.0 discussion

Sazar

Rest In Peace
Joined
12 Apr 2002
Messages
14,905
http://www.digit-life.com/articles2/ps-precision/index.html

feel free to discuss or question the points in this article...

frankly it is a very insightful and educational article concerning what our graphics cards are doing and what exactly precision is... and what fx12 and fp16/24/32 really are all about...

:)

fyi : fx == fixed-point integer

fp == floating point...

the dx9 api asks for fp24 minimum precision, though partial precision in some situations can be fp16...

an example of ps 2.0 can be found in 3dmark03 by looking @ the sky...
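to make the fx12 vs fp16 distinction concrete, here's a rough python sketch... the fp16 part uses real IEEE-754 half precision, but the fx12 format here (12-bit signed fixed point spanning roughly [-2, 2)) is an illustration of the idea, not the exact NV30 register layout:

```python
import struct

def to_fp16(x):
    # round-trip through IEEE-754 half precision -- what dx9
    # 'partial precision' (fp16) does to a value
    return struct.unpack('e', struct.pack('e', x))[0]

def to_fx12(x):
    # hypothetical fx12-style format: 12-bit signed fixed point
    # covering [-2, 2) in 4096 equal steps (illustration only)
    step = 4.0 / 4096
    q = round(x / step) * step
    return max(-2.0, min(2.0 - step, q))

x = 1 / 3
print(x, to_fp16(x), to_fx12(x))
# both formats land near 1/3 but neither hits it exactly... and
# fx12 clamps anything outside its [-2, 2) range entirely, which
# is why fixed-point shading runs out of headroom so fast
```

note how the fixed-point format has uniform steps everywhere while floating point spends its bits near zero... that's the basic trade-off the article walks through :)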
 
It certainly explains why my new 9500 PRO looked better than my old NV30 based card even though the FPS were comparable. I could never pick out a specific detail, but everything looked "richer" on the 9500 PRO. Less interpolation of the graphic details.

Now Madonion/Futuremark having the tizzy fit over the "cheating" makes sense. When the chipmaker degrades graphics to boost his FPS, the benchmark that people use to make buying decisions is not telling the full story.

If I went on straight benchmark score the GF4-4200 looks good enough and is cheaper than the 9500 PRO. When I look at the picture quality too, I have a more enjoyable gaming experience. Going back to NV30 now would be like going back from a GF2 to the Voodoo family.

Now the question is why did Madonion/Futuremark roll over so quickly and drop its protests? Because Nvidia and ATI pay Madonion/Futuremark salaries through the partnership fees? Then they aren't an independent benchmarking company, they're just another set of hired hands in the PR departments.

Seems like the partnership program needs to go, or the card buyers (OEM or personal) need to start ignoring Madonion results. So we're left with buying cards and trying them as the only way to compare. Man, that would drive return rates way up for the second-rate card makers. Maybe it would just be cheaper for the card makers to quit cheating...
 
It was a very informative article. It does a good job of explaining the differences and similarities between the R300 based chipsets and the NV30 based sets.

I think this proves just how revolutionary the R300 GPU really was; it blindsided everyone when it was released, and it's since become the benchmark for DX9.0 compliance.

But like everything there is room for improvement, and I'm sure we'll see fp accuracy go up in the next gen of ATi cards, like it did for nVidia with the NV35.

I'm very happy with the current market; it's keeping the technology moving, and affordable. Maybe not every-six-months affordable, but it's there. That said, having used both Ti4600s and a Radeon 9500 Pro, and now my AIW 9700 Pro, I love ATi's colour....

they are more vibrant and lively; they have more character in the colours.

I'm not saying the nvidia cards are worse, just different....

My one worry is that John Carmack is behind the technology; he's out for his own profit with Doom...

I think as long as the hardware manufacturers give the programmers the tools, they will use the features to their fullest and we'll have some pretty amazing games in the future...
 
Originally posted by LeeJend
It certainly explains why my new 9500 PRO looked better than my old NV30 based card even though the FPS were comparable. I could never pick out a specific detail, but everything looked "richer" on the 9500 PRO. Less interpolation of the graphic details.

Now Madonion/Futuremark having the tizzy fit over the "cheating" makes sense. When the chipmaker degrades graphics to boost his FPS, the benchmark that people use to make buying decisions is not telling the full story.

If I went on straight benchmark score the GF4-4200 looks good enough and is cheaper than the 9500 PRO. When I look at the picture quality too, I have a more enjoyable gaming experience. Going back to NV30 now would be like going back from a GF2 to the Voodoo family.

Now the question is why did Madonion/Futuremark roll over so quickly and drop its protests? Because Nvidia and ATI pay Madonion/Futuremark salaries through the partnership fees? Then they aren't an independent benchmarking company, they're just another set of hired hands in the PR departments.

Seems like the partnership program needs to go, or the card buyers (OEM or personal) need to start ignoring Madonion results. So we're left with buying cards and trying them as the only way to compare. Man, that would drive return rates way up for the second-rate card makers. Maybe it would just be cheaper for the card makers to quit cheating...

basically futuremark is a company that makes a product to benchmark various things to give approximate indications of performance levels :)

concerning 3dmark03.. FM has probably had to back off of the term 'cheat' because of possible legal implications..

this is a business where the image of an IHV to its shareholders is VERY important :) and to an extent... its consumer base...

nvidia is no longer a beta member and therefore is no longer paying fees to FM... keep that in mind...

ATi IS a beta member and IS paying fees... it is a tier I member, meaning they pay a nice chunk of money for membership... but they are not there alone... other major IHVs are present and various websites are also present... OEMs have input... it is not as bad a setup as nvidia makes it out to appear... :)

perhaps if nvidia's hardware were actually fast enough.. this particular IHV would sing the same praises of FM or madonion as they did when 3dmark2001 first came out and they were the only ones with a dx8 card... and hence had the TOP card on the market for a while :)

oh the irony...

the basic concept of 3dmark03 is quite good and it IS indicative.. once the CHEATS and OPTIMIZATIONS are removed...

basically... the gf FX core has problems rendering @ speed... it is an excellent dx8 gen card.. but it does appear to be a tad slow when it comes to true dx9 (ps/vs 2.0) rendering... which is WHY the whole issue of lowered precision comes into play... :)
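to see why dropping precision buys speed at a visible cost, here's a small python sketch... the fp16 emulation uses python's struct module, but the shader math itself (a made-up lighting accumulation loop) is purely for illustration of how rounding after every instruction lets error pile up across a long ps 2.0 style shader:

```python
import struct

def rnd16(x):
    # emulate a pipeline that rounds to IEEE-754 half precision
    # (fp16) after every single operation
    return struct.unpack('e', struct.pack('e', x))[0]

def shade_full(base, lights):
    c = base
    for l in lights:
        c = c + base * l          # keep full precision throughout
    return c

def shade_partial(base, lights):
    c = rnd16(base)
    for l in lights:
        # round after the multiply AND the add, like fp16 registers would
        c = rnd16(c + rnd16(rnd16(base) * rnd16(l)))
    return c

full = shade_full(0.1, [0.01] * 8)
part = shade_partial(0.1, [0.01] * 8)
print(full, part, abs(full - part))
# the longer the instruction chain, the bigger that gap gets...
# and on screen it shows up as the banding/blockiness in the article
```

one rounded op is harmless... dozens of them per pixel per frame is where the IQ difference comes from :)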

personally I have reservations with application detecting... but in some instances (and depending on the optimizations used) I am all for it :) since it would make my gaming experience better...

if there is an 'optimization' that does not have ANY real world application (such as fixed POV image culling a la det 44.03 and a couple of others in 3dmark03) then there is an issue... there is no possible way this 'optimization' can be implemented in a game where the camera is NOT fixed... :)
 
Originally posted by Goatman
It was a very informative article. It does a good job of explaining the differences and similarities between the R300 based chipsets and the NV30 based sets.

I think this just proves just how revolutionary the R300 GPU really was, it blindsided everyone when it was released, and it's since become the benchmark for DX9.0 compliance.

But like everything there is room for improvement, and I'm sure we'll see fp accuracy go up in the next gen of ATi cards, like it did for nVidia with the NV35.

I'm very happy with the current market, it's keeping the technology moving, and affordable. maybe not every six months affordable, but it's there. Now saying that, having used both Ti4600s and Radeon 9500 Pro, and now my AIW 9700 Pro, I love the ATi's colour....

they are more vibrant, and lively, they have more character in the colours.

I'm not saying the nvidia cards are worse, just different....

My one worry is John Carmack is behind the technology, he's out for his own profit with Doom...

I think as long has the hardware manufacturers give the programmers the tools, they will use the features to their fullest and we'll have some pretty amazing games in the future...

fp accuracy did not go up with the nv35 :)

it merely actually arrived @ the level that it was advertised... and you will notice it is fp32 only in certain instances... :)

personally I think the industry should move towards a more standardized approach instead of fragmenting the market into nvidia-powered games only and the rest of the market powered games only... that would be a huge blow to consumers... == console market basically... get some games for one... some for others... but not for both... (not incl cross platform ports)

carmack is not exactly driving things as much as he likes to bandy it around :)

doom III for example for all intents and purposes is a dx7 class game... not dx8.. not dx9... :)

it looks bloody gorgeous.... and has excellent use of lighting and bump mapping... BUT it does have rivals in the engine department...

/me points to half life and its dx9 level shaders... and its splendid Havok physics engine :)

he's done good stuff... but I can't really see carmack being the man driving the industry... perhaps once he was... or was making the suggestions to do so... now however.. there are others who have influence too...
 
...my video card is shiny and the fan spins really fast. :D
 
Originally posted by Sazar
fp accuracy did not go up with the nv35 :)

it merely actually arrived @ the level that it was advertised... and you will notice it is fp32 only in certain instances... :)

personally I think the industry should move towards a more standardized approach instead of fragmenting the market into nvidia-powered games only and the rest of the market powered games only... that would be a huge blow to consumers... == console market basically... get some games for one... some for others... but not for both... (not incl cross platform ports)

carmack is not exactly driving things as much as he likes to bandy it around :)

doom III for example for all intents and purposes is a dx7 class game... not dx8.. not dx9... :)

it looks bloody gorgeous.... and has excellent use of lighting and bump mapping... BUT it does have rivals in the engine department...

/me points to half life and its dx9 level shaders... and its splendid Havok physics engine :)

he's done good stuff... but I can't really see carmack being the man driving the industry... perhaps once he was... or was making the suggestions to do so... now however.. there are others who have influence too...

I totally agree with you there Sazar
 
Well, actually the nv35 chip may beat ATI picture quality, but with 7 weeks on the 9500 PRO without a crash and a price tag of $400 USD on the FX 5900, I'm not in any hurry to find out.

Besides I have delicate ears.

Anybody with a 5900 out there who has used an ATI 9500/9700 card? Let's hear from you on picture quality in games.
 
Originally posted by LeeJend
Well, actually the nv35 chip may beat ATI picture quality, but with 7 weeks on the 9500 PRO without a crash and a price tag of $400 USD on the FX 5900, I'm not in any hurry to find out.

Besides I have delicate ears.

Anybody with a 5900 out there who has used an ATI 9500/9700 card? Let's hear from you on picture quality in games.

you don't have to find out :)

the IQ on the ati cards is better...

the 5900 does not ship with the FlowFX cooler either, so it is not a loud solution..

AA and AF tested... ati wins hands down with AA... AF is more subjective, but the performance hit on the ati cards is much smaller as AF increases than on nvidia's...
 