Why Flame the GeForce FX?

melon

I have noticed that there has been a lot of whining about the GeForce FX (not necessarily here, but all over the internet), especially with regard to preliminary benchmark tests. But, really, how many frames per second do we really need? Film runs at 24 fps, PAL and SECAM at 25 fps, and NTSC at 29.97 fps (about 60 interlaced fields per second, which is roughly what DV aims for). So where is the advantage of climbing up to 200 / 300+ fps?

Looking at some of the FX demos has amazed me, honestly. Can the Radeon 9700 bring that? (it's an honest question....can it? LOL). I almost think there is a point where quality is going to have to take a front seat to raw benchmark speed, if only in the short term.

Anyhow, FYI, I'm not a die-hard ATI or nVidia fan. I had an ATI card, which I loved, and, when it became obsolete, I bought a GeForce4 Ti 4200 (AGP 8X), which I also love. In about three years, I'll likely reevaluate the products out there, and who knows which company I'll go with.

Overall, though, I have yet to see a reason to flame the GeForce FX before it has even come out, or to quibble over a dozen or two fps of difference, when it is already mind-numbingly faster than our eyes can handle. :p

Your thoughts?

Melon
 
Overall I agree with you. Sure, we all want the best we can get, but it's just an ongoing thing - there will always be the next one!! And in the end I think it's what's the best buy for the $ that counts. I think the hype is what made things the way they are, and the FX was not the ATI killer it was hyped to be; the cost is imho just too high for what it can do... I don't have a preference for ATI or Nvidia - I own both and have liked them both for different reasons ...
 
Why flame the FX? Because it's a hair dryer.

Apart from that it's a good card.
 
Originally posted by melon
I have noticed that there has been a lot of whining about the GeForce FX (not necessarily here, but all over the internet), especially with regard to preliminary benchmark tests. But, really, how many frames per second do we really need? Film runs at 24 fps, PAL and SECAM at 25 fps, and NTSC at 29.97 fps (about 60 interlaced fields per second, which is roughly what DV aims for). So where is the advantage of climbing up to 200 / 300+ fps?

Looking at some of the FX demos has amazed me, honestly. Can the Radeon 9700 bring that? (it's an honest question....can it? LOL). I almost think there is a point where quality is going to have to take a front seat to raw benchmark speed, if only in the short term.

Anyhow, FYI, I'm not a die-hard ATI or nVidia fan. I had an ATI card, which I loved, and, when it became obsolete, I bought a GeForce4 Ti 4200 (AGP 8X), which I also love. In about three years, I'll likely reevaluate the products out there, and who knows which company I'll go with.

Overall, though, I have yet to see a reason to flame the GeForce FX before it has even come out, or to quibble over a dozen or two fps of difference, when it is already mind-numbingly faster than our eyes can handle. :p

Your thoughts?

Melon

all the negative feedback on the nv30 card... the gf FX ultra can be attributed to one thing... NVIDIA...

its that simple really...

nvidia made the card out to be something it was not... they used false advertising, promoting a product that is not out even as we speak and does not perform as consumers were promised it would... and now it is not even going to retail in the ULTRA capacity... which has led to a new excuse being created... that it was designed to be a LIMITED QUANTITY product...

go to most websites... in fact go to www.nvnews.net and see the amount of stick the product has taken from its own fanboy website and read the interesting article with all the CEO's quotes and supposed retail dates and specs on the part et al...

concerning your frame rate argument... nvidia has always strived to maintain a hold on gamers' minds that frame rate is king... just observe their marketing strategy over the years and look @ the Image Quality (IQ)... inverse relationship... gamers want as high a frame rate as they can get... even 300 is not enough for some n00bs as you will notice in message boards all over the web...

that has now changed...

now they want a high frame rate with Level Of Detail (LOD) maxed out and AA and FSAA enabled to give maximum eye candy... the 9700pro from ati was the first card on the market able to do this with highly playable frame rates... nvidia had claimed they would be able to do better...

the card's demos are specific to the gf FX... there is no way the video card can actually render the quality in the demos in real time... the performance hit would be ridiculous... you just have to look @ the dx9 demos from ATi to see that... the bear demo was the most impressive for me... perhaps not as sexy as dawn but IMO harder to do as each individual hair on the bear's fur seemed to be alive...

now the PR that nvidia used... for close to 7 months nvidia has been claiming their card is the greatest thing ever and that it would demolish the 9700pro by 30-50% @ least and it would have better IQ... and that people looking to upgrade should WAIT for their product...

the product has now been delayed well over 6 months... and is no faster than the 9700pro and has WORSE IQ than the 9700pro... even nvidia fans can see that and therein lies a lot of the disgruntled mutterings over this product...

it is HOT... it is LOUD... it is HUGE... it is not what was promised and it certainly does not perform to the specs that were promised... beyond that you have driver problems... the ati architecture is simpler than the gf FX architecture and therefore drivers for the gf FX ultra are probably not going to mature as rapidly as people blindly say...

nvidia has a good set of drivers for the gf3/4 because they have essentially been using the same core for the past 2 years... hence it is optimized to ridiculous levels... ati has a new core/tech out on a much more frequent basis meaning driver optimizations are a little behind in that time frame... considering that the nv30 is a NEW core/architecture per nvidia... drivers will not be nearly as polished... though I find it ludicrous that this product has yet to have decent drivers... it's been taped out for ages... AGES...

you may go ahead and call me a fanboy as many relish doing... but I try and stick to the basic arguments and I truly do detest nvidia's pr department... they more than the engineers killed off this product before it hit the shelves...

it is a great card yes... but it costs WAY more than a product that is out already that competes very well @ stock levels (raw benches are close) and owns the nv30 with levels of AA and FSAA turned on...

v/s the ti4600... sure it's a wonderful card and an easy sell... v/s a radeon 9700pro with all the bogus PR crud spewed out for months... I don't think so...
 
Damn Sazar! I'm not sure I could write that much in under 28 mins even if I wanted to. :D
 
The FX is hot, loud and late.

Sazar, why don't you just point people to the other 50 threads where you have said exactly the same thing. It would save you a lot of typing :)
 
I think that the card has a few major drawbacks. The fact that it was released so long after the Radeon 9700pro gave ATi another advantage. Rumor is, I shouldn't say rumor, that the RV350 is being released soon, and is also supposed to make the FX obsolete. The FX also has a smaller memory bandwidth than the 9700pro. I can't really criticize it too much because it still beats the crap out of my MX420, although my card is significantly quieter :p
 
Originally posted by black-syth
I think that the card has a few major drawbacks. The fact that it was released so long after the Radeon 9700pro gave ATi another advantage. Rumor is, I shouldn't say rumor, that the RV350 is being released soon, and is also supposed to make the FX obsolete. The FX also has a smaller memory bandwidth than the 9700pro. I can't really criticize it too much because it still beats the crap out of my MX420, although my card is significantly quieter :p

rv350 is a low end 0.13 micron process product... designed to replace the 9500pro and below... perhaps even the 9700 lineup..

the r350 is the high end part on the proven 0.15 micron process... THIS is the next big product... the big brother so to speak of the 9700pro...
 
I agree with Sazar

I'm pissed 'cause I fell for the "Just wait til it's done, it'll whoop the 9700 silly!" BS. I could have gotten the 9700 much earlier, and gained more enjoyment. But I fell for the usual Nvidia crap. Never again.

In fact, I just bought the 9700 a week ago, and am still kicking myself for waiting so long.

The GFFX is not a slow card by any means. It will play anything out there sufficiently. But that's not the point.

As Sazar pointed out, it's loud, hot, large, and very late. Nvidia were the ones that invented the 6-month product cycle, and now they can't take the heat. In this business, it only takes one cycle to fall behind, but two cycles to take the lead...

It is mostly the letdown caused by the hype and marketing BS that pissed me off. They created this overclocked monstrosity, and it is basically neck-and-neck with the 6-month-old 9700, which, tech-spec-wise, only beats the GFFX in memory bandwidth. And the 9700 is cheaper, easier on your power supply, smaller, looks better, and overall has a much better min-to-max FPS ratio.

Practically all the selling points unique to the GFFX will not be seen for the next year or two (IOW, you're not going to be seeing any butch fairies gracing your games anytime soon :D ). And by the time these features are implemented in games, the GFFX will be obsolete. AFAIK, the rest of the GFFX features can be had on the 9700Pro RIGHT NOW, and again for cheaper.

This is not to say that the FX chipset can't be a hot deal in the future. Nvidia claims they can ramp this chipset up to some very high speeds, a la Pentium 4. I don't see how they're going to do this, considering the card is already hot as hell, but you never know.

So, in a nutshell, it's NV's lying and egotism that turned me off. And it's going to take a lot to win me back.

Sincerely,

A former Nvidiot
 
well nvidia had wanted to release their video card using low-k dielectrics which would have helped a tiny little bit in terms of heat... but more so the crosstalk would have been reduced and perhaps also the power consumption (fewer layers of pcb = lower power consumption)

concerning easily being able to scale the video card... nvidia needs to either move to 256-bit memory or really crank the speed of the memory up FAST...

neither of these is truly feasible simply because of the nature of ddr II @ the moment... it is highly likely we will see the nv35 also using 128bit memory but it will be a better card because yields will be up and more work will have been carried out on the core..

as for cranking up speeds... IMO they would need to move to gddr3 over gddr2 in order to scale rapidly...

for all intents and purposes the 2 memories are almost the same but gddr3 is tweaked to better suit gfx cards and has the ability to hit higher clocks... :)
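side note: the bandwidth math behind the 128bit v/s 256bit argument is just bus width times the double-pumped memory clock. a quick C sketch - the clocks below are my ballpark assumptions for the 5800 ultra and the 9700pro, not official specs:

#include <stdio.h>

/* theoretical peak bandwidth: (bus bits / 8) bytes per transfer,
 * times the effective clock (DDR moves data twice per cycle) */
static double bandwidth_gb_s(int bus_bits, double mem_clock_mhz)
{
    double bytes_per_transfer = bus_bits / 8.0;
    double effective_mhz = mem_clock_mhz * 2.0;   /* DDR */
    return bytes_per_transfer * effective_mhz * 1e6 / 1e9;
}

int main(void)
{
    /* assumed clocks: ~500 MHz DDR-II on the 128bit nv30,
     * ~310 MHz DDR on the 256bit r300 */
    printf("gf FX 5800 ultra (128bit): %.1f GB/s\n", bandwidth_gb_s(128, 500.0));
    printf("radeon 9700pro   (256bit): %.1f GB/s\n", bandwidth_gb_s(256, 310.0));
    return 0;
}

which is why cranking the memory clock is about the only lever left on a 128bit bus...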

nvidia's card does have some nice features but once again... if the engineers who work there are allowed to stay within the physical limitations of the hardware being used instead of modifying things to suit the PR department's claims (ok ok.. so an exaggeration on my part :) ) perhaps we will see a more complete product soon :)
 
resurrecting a thread from the dead :)

why? because of new developments which I would have commented on before but I would have been shot down for being fan-boyish...

:)

hence I waited and now nvidia has confirmed what has been discussed @ b3d all this time... even before the inquirer got its hands on the news snippet...

http://tech-report.com/etc/2003q1/3dmark03-story/index.x?pg=5

NOW... what really is interesting here is the fact that the card is still being touted for its programmability...

IMO the basic flaws of the product are in its architecture... its flexibility and programmability have not been questioned... who knows... maybe there will be more revelations in the future concerning the architecture whereby there will be explanations in detail as to what is going on and what areas the card will actually be powerful enough in and where it will not be able to keep up...
 
Originally posted by melon
But, really, how many frames per second do we really need? Film runs at 24 fps, PAL and SECAM at 25 fps, and NTSC at 29.97 fps (about 60 interlaced fields per second, which is roughly what DV aims for). So where is the advantage of climbing up to 200 / 300+ fps?
turn up the resolution and FSAA, and 300 q3 fps turns into 30 unreal2 fps (rough math below). and people shouldn't buy a video card for the games they play now... they should buy it for the games they will play later.
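a back-of-envelope sketch of that drop - the resolutions and the assumption that fps scales purely with pixel work are mine, and a newer engine's shaders make the hit even bigger:

#include <stdio.h>

/* pixel work per frame - the dominant cost when fill-rate limited */
static double pixel_work(int w, int h, int aa_samples)
{
    return (double)w * h * aa_samples;
}

int main(void)
{
    double base_fps = 300.0;                      /* e.g. q3 at low settings */
    double old_work = pixel_work(800, 600, 1);    /* 800x600, no AA */
    double new_work = pixel_work(1600, 1200, 4);  /* 1600x1200, 4x FSAA */

    /* all else equal, fps scales inversely with pixel work */
    printf("work goes up %.0fx\n", new_work / old_work);
    printf("300 fps becomes ~%.0f fps\n", base_fps * old_work / new_work);
    return 0;
}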

Originally posted by Speed4Ever
I'm pissed 'cause I fell for the "Just wait til it's done, it'll whoop the 9700 silly!" BS. I could have gotten the 9700 much earlier, and gained more enjoyment. But I fell for the usual Nvidia crap. Never again.

Sincerely,

A former Nvidiot
when has nvidia done anything like that in the past? you act like it's something they do with every new card release. this is a first for nvidia as every other core they released before this did nothing short of impress everyone.

it's also a silly reason to immediately turn your back to them and simultaneously put down nvidia fans. we didn't make the card... and my 4200 is still a good buy.
 
You've got four groups of people to cater to in the video card market, and the loudest of the bunch (hobbyists/gamers) aren't happy. Here's how it adds up in my mind:

1.) Mom & Dad users: Don't care about brand or FPS, only want to be able to play Hearts and their golf games.
2.) The multimedia hobbyist/professional: nVidia is moving towards catering to these people with the FX - look towards a driver base optimized for multimedia with the FX. ATI has ruled this market for years with the All-in-Wonder - nVidia is focusing the FX towards this market.
3.) The professional: nVidia hits this market with the Quadro - drafters, 3D artists and others; the FX shouldn't even be considered here.
4.) Gamer/hobbyist: This is where nVidia has made its money. I believe they're trying to cater to this market and the multimedia market with the FX. The gamer/hobbyist is pissed because we were expecting the same performance jump with the FX that we saw from the TnT to the original GeForce. ATI leads this market right now, and with the Radeon 9800 coming next month/April, nVidia's chances of conquering both markets look slim...
 
My Nvidia contact was more than happy to verify this:

Those who are in the know said that Nvidia, which had a stand at IDF this year, were taking people round the back and showing them NV35, which works at 256-bit quite fine, and runs at half the heat of the NV30.

But he wouldn't tell me when it was coming out. No matter what it is, I am buying the fastest (normal not quadro or owt daft like that) card on the market.
 
I just think that GeForce3 Ti500 sounds waayyyyy cooler than GeForceFX... makes me think of some lame TV show or something :p anyway... too bad for all the nvidia fans out there... the crown has been taken by ATi, and Nvidia has to get crackin' within a short period of time to save the precious name they've built up over all those years... or else it's bye-bye nvidia...

/me grabs hold of his nvidia GF4, with hope it may become a collectors item in a few years...lol
 
Originally posted by deadzombie
You've got four groups of people to cater to in the video card market, and the loudest of the bunch (hobbyists/gamers) aren't happy. Here's how it adds up in my mind:

1.) Mom & Dad users: Don't care about brand or FPS, only want to be able to play Hearts and their golf games.
2.) The multimedia hobbyist/professional: nVidia is moving towards catering to these people with the FX - look towards a driver base optimized for multimedia with the FX. ATI has ruled this market for years with the All-in-Wonder - nVidia is focusing the FX towards this market.
3.) The professional: nVidia hits this market with the Quadro - drafters, 3D artists and others; the FX shouldn't even be considered here.
4.) Gamer/hobbyist: This is where nVidia has made its money. I believe they're trying to cater to this market and the multimedia market with the FX. The gamer/hobbyist is pissed because we were expecting the same performance jump with the FX that we saw from the TnT to the original GeForce. ATI leads this market right now, and with the Radeon 9800 coming next month/April, nVidia's chances of conquering both markets look slim...

you raise some interesting points...

group 1 gets whatever is in their system... which explains why intel integrated graphics has such a large market share :)

group 3 has been using quadros and wildcat and matrox cards for a while... but the new firegl series from ati offers almost the same performance for less than half the price... hence game devs MAY start to look @ this product more carefully in the future... if they have not done so already...

group 4 is the group that along with group 2 discusses and dissects the product to the bare bones... these groups have a lot of influence over the purchases of a product... imagine if 1 enthusiast changed the mind of 1 regular user... that's around 400 bucks out of the pocket of the gpu maker... and word of mouth counts...

now I will tackle group 2... nvidia does not have a multimedia solution that is even close to matching the performance of the AIW series... it seems the entire lot of gpu makers have decided not to challenge ati on this front... which is a shame really... ati will develop more techs if it is pushed :) and competition = lower prices... :)

nvidia is at least a year removed from an integrated solution... just consider the board size if they did have something out like that...
 
Originally posted by Electronic Punk
My Nvidia contact was more than happy to verify this:

Those who are in the know said that Nvidia, which had a stand at IDF this year, were taking people round the back and showing them NV35, which works at 256-bit quite fine, and runs at half the heat of the NV30.

But he wouldn't tell me when it was coming out. No matter what it is, I am buying the fastest (normal not quadro or owt daft like that) card on the market.

supposedly the card should debut with the core clocks that were expected for the nv30 to compete with the 9700pro.. ie.. 650-ish mhz...

now the problem is that the move from 128 bit to 256 bit may cause other issues ie transistor counts et al.. and it really has to be seen if nvidia can run their next product @ full speed (without throttling back @ high temps) when it debuts...

I have no doubt in my mind that the nv35 SHOULD take the performance crown when it debuts in the latter half of the year from the r350... but it will then have to face the r400 which appears to be the behemoth of the year...

the 256 bit memory bus should help nvidia... and let's see if they move to a different pipeline/tmu setup or have a completely programmable instead of fixed setup :) that's what really intrigues me... 4x2 is a joke... :) but if they move to their specified goal... whoa nelly :)
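for context on the 4x2 jab: theoretical single-texture fill rate is pipelines times core clock, so a 4-pipe part needs a huge clock just to chase an 8-pipe one. a quick sketch - the configs and clocks below are the commonly reported ones, treat them as assumptions:

#include <stdio.h>

/* single-texture pixel fill: pipes * core clock. the second TMU in a
 * 4x2 layout only helps on multi-textured pixels, so single-texture
 * fill stays pipe-bound */
static double fill_mpixels(int pipes, double core_mhz)
{
    return pipes * core_mhz;
}

int main(void)
{
    printf("nv30 (4x2 @ 500 MHz): %.0f Mpixel/s\n", fill_mpixels(4, 500.0));
    printf("r300 (8x1 @ 325 MHz): %.0f Mpixel/s\n", fill_mpixels(8, 325.0));
    return 0;
}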
 
I can't believe nobody is *****ing about the heat more.

Heat is what killed the t-bred last spring (well, summer), and it's what killed Voodoo. If Nvidia can't come up with an architecture and fab process that keeps the heat reasonable, they are dead against ATI (and AMD against Intel). My last Voodoo card was hot enough to burn my finger on the back side opposite the GPU. I had to put a stick-on heat sink there.

Nvidia chipsets are at thermal limits now. The need for that FX cooling system is a joke. Personally I don't want any more heat or noise out of my system.

What's next, an integrated water cooler for the CPU and video card?
 
Originally posted by LeeJend
I can't believe nobody is *****ing about the heat more.

Heat is what killed the t-bred last spring (well, summer), and it's what killed Voodoo. If Nvidia can't come up with an architecture and fab process that keeps the heat reasonable, they are dead against ATI (and AMD against Intel). My last Voodoo card was hot enough to burn my finger on the back side opposite the GPU. I had to put a stick-on heat sink there.

Nvidia chipsets are at thermal limits now. The need for that FX cooling system is a joke. Personally I don't want any more heat or noise out of my system.

What's next, an integrated water cooler for the CPU and video card?

the reason why no one is complaining about the heat perhaps is because it is well known that the ultra was oc'd to the max... :)

the regular version has a non Flow FX solution... and does not produce nearly as much heat as the 5800 ultra... but it also does not have the performance of its bigger brethren...

btw both the 5800 and 5800 ultra's coolers take up an extra pci slot... unless the new 5800 non-ultra has some 3rd party board partners working on the component :)

 
