Albatron or eVGA or other FX 5900?

G|ass
I'm buying a new GeForce FX 5900 Ultra 256MB video card, and the cheapest ones are manufactured by eVGA and Albatron. Are there any big differences between these 2, or would I be better off paying more for the same card just from a different manufacturer? And if so, which manufacturer should I go for and why? Thanks in advance.
 
Albatron has good cooling, but that's all ...
______________
Brands :

  • Leadtek [ own ]
  • Gainward
  • ASUS
  • Abit

There can be quite a difference between brands on the same product .. different cooling systems - the memory used .. and more ..
 
Originally posted by G|ass
I'm buying a new GeForce FX 5900 Ultra 256MB video card, and the cheapest ones are manufactured by eVGA and Albatron. Are there any big differences between these 2, or would I be better off paying more for the same card just from a different manufacturer? And if so, which manufacturer should I go for and why? Thanks in advance.

are you sure you want to buy a gf FX card right now ?

:)

there should be no difference in quality no matter which 5900 you buy.. it's just that the packaging will be different and perhaps the coolers will be different too...

the extra money spent should not matter @ all... but again... like I said... are you sure you want to buy a gf FX right now ?

dx9 games are coming out and unless the nv40 comes out with a new core that actually handles the dx9 api the way it's supposed to... you are not going to be able to enjoy dx9 games to the extent you can with an r3xx level card... even ones that cost hundreds of dollars less...
 
I am not familiar with this nv40 you speak of. Is there some place I can read more about it? And when will it be out? I am currently using some stone-age 8MB card because my GeForce 4 Ti 4200 card died a couple of days ago.
 
Originally posted by G|ass
I am not familiar with this nv40 you speak of. Is there some place I can read more about it? And when will it be out? I am currently using some stone-age 8MB card because my GeForce 4 Ti 4200 card died a couple of days ago.

nv40 will be released in the first half of next year...

the gf FX lineup has bad dx9 performance and bad default arb2 path performance in openGL... which is why I am not so sure you want to waste close to $450 or so on what is basically a dx8 card..

I would highly suggest buying a radeon 9800pro instead... it will be cheaper and faster and give you better graphics...
 
Yea, I just looked at those real world benchmarks. I'm definitely going with the Radeon 9800 Pro now. The reason I was going to go with the FX 5900 was some benchmarks I saw on tomshardware. I feel really betrayed by nVidia, I always spoke highly of them, but they are straight up lying about this damn card. I'll never buy anything from them without doing painstaking research. I shouldn't have to do painstaking research to make sure I'm not being lied to. *******s.
 
Hmm... I want the best performance available, but only if it's worth the money. Should I be going for the Radeon 9800 Pro Ultimate or the regular Radeon 9800 Pro?
 
Originally posted by G|ass
Hmm... I want the best performance available, but only if it's worth the money. Should I be going for the Radeon 9800 Pro Ultimate or the regular Radeon 9800 Pro?


the ultimate is a version released by Sapphire which is completely silent... it is the PRO version with passive cooling...

you can either buy a 128mb version or a 256mb version...

a few vendors carry the cards... ie

tyan/gigabyte/sapphiretech/ati/visiontek and the like...

the QA on all is excellent since I believe they are all made @ the same fab (the pcbs, that is.. not just the core)

the color and coolers may vary a small amount... but in general the cards are the same...

btw... fyi... lars from THG is slightly biased in his opinions on certain things... much like Kyle from [H]...

best place for information concerning gfx cards == www.beyond3d.com

they even have game developers over there such as tim sweeney...
 
Many thanks for your help, this is some very good info. I'm going to go with the Ultimate, since noise is a big issue with me. Last thing though, are there any 9800 Pro 256MB cards with video in? Not a big deal but it would be nice.
 
Originally posted by G|ass
My god that card is a monster. Does it take 2 or 3 slots?

2 slots... it is actually massive :) but it is completely silent...

check out your mobo's dimensions before using it... but most people have no problems with it :)

all the newer ati cards AFAIK have vivo... the specs should explain it...

you could alternatively get an all in wonder version... which is quite excellent...
 
Actually, after seeing the article on the R360 (9800xt) I think I'll wait for that to come out. It looks like it will even be cheaper. Cool.
 
Originally posted by G|ass
Actually, after seeing the article on the R360 (9800xt) I think I'll wait for that to come out. It looks like it will even be cheaper. Cool.

I think the expected debut of benches is around sept 30th...

if you can hold out... do so :)
 
Originally posted by G|ass
I was looking at http://www.digit-life.com/articles2/radeon/r9800pro-ue.html#p2 and saw this pic of the Ultimate after 10 hours of use, and the heatsink turned RED! Is this something to be concerned about? I leave my PC on 24x7... Would I be better off getting the regular Pro with fan?

heat is not an issue... besides there is a practically silent 92mm fan that can be attached to the whole setup which keeps it even cooler...
 
Damn... You guys are really down on the FX5900 aren't you!:)

It's not all bad! Seriously! I know that the ATI offers certain things that NVidia doesn't and that it conforms more rigorously to the DX9 API than NVidia does, but there are good reasons for that. The ATI renders everything in high detail, all the time. The problem with this is that you are wasting GPU power all over the show. In real-world rendered situations, there is very little around you that you can actually see in high detail. Most things are too far away, obscured in shadow or hidden behind something else. Therefore the NV30 paths try to emulate this. Sure they get it wrong, like with those dodgy 'cheat' drivers they had, but they are getting there.

Don't take my word for it, go look up some interviews with people like John Carmack. He is absolutely convinced that the FX is the better card and that it has far more potential. While talking about the FX5800 and the Radeon 9700 rendering Doom III he said (and this was 6 months ago) that the ATI was rendering it better but was maxing out its power to do so. The NVidia was slightly behind in performance but was not even breaking a sweat. The point here is that once we get a correctly written driver (ATI definitely kick arse on that front) the FX 5900 will fly!

And its pretty damned nippy now!;)
 
Originally posted by StormFront
Damn... You guys are really down on the FX5900 aren't you!:)

It's not all bad! Seriously! I know that the ATI offers certain things that NVidia doesn't and that it conforms more rigorously to the DX9 API than NVidia does, but there are good reasons for that. The ATI renders everything in high detail, all the time. The problem with this is that you are wasting GPU power all over the show. In real-world rendered situations, there is very little around you that you can actually see in high detail. Most things are too far away, obscured in shadow or hidden behind something else. Therefore the NV30 paths try to emulate this. Sure they get it wrong, like with those dodgy 'cheat' drivers they had, but they are getting there.

Don't take my word for it, go look up some interviews with people like John Carmack. He is absolutely convinced that the FX is the better card and that it has far more potential. While talking about the FX5800 and the Radeon 9700 rendering Doom III he said (and this was 6 months ago) that the ATI was rendering it better but was maxing out its power to do so. The NVidia was slightly behind in performance but was not even breaking a sweat. The point here is that once we get a correctly written driver (ATI definitely kick arse on that front) the FX 5900 will fly!

And its pretty damned nippy now!;)

again... that is b$..

the nv3x cards are excellent for dx8 and below... when it comes to dx9 or following the standard arb2 path in openGL.. the nv3x cards' weaknesses show up...

it is not a question of being down on whatever... it is a fact... the architecture is fubar... every single dx9 test has shown this... even a TWIMTBP game like Tomb Raider shows this...

carmack has also said no such thing about the nv3x being the better card... he said that using lower precision and lesser iq... the nv3x cards are slightly faster in doom3... that is when nv3x cards run their own path... when running on the standard arb2 path the performance is 1/2 that of the radeons...

remember this... even when ati renders everything @ fp24 precision... it is running FASTER than the nvidia cards that render things with fp16 precision... dx9 calls for a minimum of fp24... nvidia is dropping back down in all cards except the nv35 to fx12 which is integer.. not floating point...
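to get a rough feel for what dropping precision does... here is a quick python sketch... fp24 has no standard CPU equivalent, so the mantissa widths below just mimic the formats (fp16 keeps ~10 mantissa bits, fp24 ~16), and the input values are made up for illustration...

```python
import math

def quantize(x, mantissa_bits):
    """Round x to a float with the given number of mantissa bits,
    mimicking a lower-precision shader format."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)            # x = m * 2**e, with 0.5 <= |m| < 1
    scale = 2 ** mantissa_bits
    return math.ldexp(round(m * scale) / scale, e)

# hypothetical per-pixel accumulation (values made up for illustration)
vals = [0.123456, 0.654321, 0.111111, 0.222222]

def accumulate(mantissa_bits):
    # quantize every intermediate result, the way fixed-width hardware would
    total = 0.0
    for v in vals:
        total = quantize(total + quantize(v, mantissa_bits), mantissa_bits)
    return total

fp16_like = accumulate(10)   # fp16-style: ~10 mantissa bits
fp24_like = accumulate(16)   # fp24-style: ~16 mantissa bits
exact = sum(vals)

print(exact, fp24_like, fp16_like)
```

the fp16-style total drifts noticeably further from the exact sum than the fp24-style one... and that rounding error compounds with every shader instruction, which is exactly the iq tradeoff being made...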

the architecture with 4 pipelines and 2 tmus on the nv30/35 is well and good in most situations but ati's 8x1 pipeline setup for its high-end cards (and 4x1 setup for the midrange) is faster, as has been shown...
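the raw fillrate arithmetic shows why... pipeline layouts are from the spec sheets, but the core clocks here are from memory, so treat the exact numbers as approximate...

```python
# back-of-the-envelope fillrate math (clock speeds approximate / from memory)
def fillrate(pipes, tmus_per_pipe, core_mhz):
    pixels = pipes * core_mhz                  # Mpixels/s: one pixel per pipe per clock
    texels = pipes * tmus_per_pipe * core_mhz  # Mtexels/s: one texel per tmu per clock
    return pixels, texels

r350 = fillrate(8, 1, 380)  # radeon 9800 pro: 8x1 @ ~380 mhz core
nv35 = fillrate(4, 2, 450)  # fx 5900 ultra:  4x2 @ ~450 mhz core

print("r350 (Mpix/s, Mtex/s):", r350)
print("nv35 (Mpix/s, Mtex/s):", nv35)
```

the 8x1 layout pushes far more single-textured pixels per clock... the 4x2 layout only pulls ahead when every pixel is multitextured, which is not the common case in shader-heavy dx9 workloads...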

heck the 9600pro is faster than the nv35 in pure dx9 situations... this has nothing to do with drivers or whatever else... it is broken hardware as has been stated for a long time already by everyone except nvidia's pr department...

global shader replacement may sound well and good to you storm but consider this.. people are spending $400-500 on the high end cards... do you REALLY expect people who spend that kind of money to settle for anything less than the best ?

like I said... if you want a dx8 card... get nvidia... for dx9 unless nvidia pulls a rabbit out of the hat in 2004 with the nv40... their dx9 and above performance is still going to be fubar... even the nv38 is no more than a refresh of the nv35 (as is the r360 but it will include better shadowing technology for games such as doom3)

btw... occlusion culling already occurs on the high end cards... both nvidia and ati already have algorithms which detect and limit rendering of items which are obscured... this has been happening for awhile... and is done by other makers as well... no power is wasted...
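the idea, stripped down to a toy sketch... this is a simplification for illustration only (real hardware rejects hierarchically per tile, not per pixel like this)... occluded fragments get thrown out before any shading work is spent on them...

```python
# toy early-z rejection: skip shading fragments hidden behind closer ones
# (a simplified sketch of the idea, not the actual hardware algorithm)

def render(fragments, width, height):
    FAR = float("inf")
    zbuffer = [[FAR] * width for _ in range(height)]
    shaded = 0
    for x, y, depth, color in fragments:
        if depth >= zbuffer[y][x]:
            continue              # occluded: rejected before any shading work
        zbuffer[y][x] = depth
        shaded += 1               # only visible fragments pay the shading cost
    return shaded

# two triangles' fragments hit the same pixel; the nearer one is drawn first,
# so the farther one is culled
frags = [(0, 0, 2.0, "blue"), (0, 0, 5.0, "red"), (1, 0, 3.0, "red")]
print(render(frags, 2, 1))  # 2: the fragment at depth 5.0 is rejected
```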

I am going to mention one last thing... concerning carmack... the 9700pro never was running out of power... if anyone knows carmack they know he likes pushing the envelope... he was seeing how far he could go with instructions... and he hit the limit on what the 9700pro could do while testing... it did not run out of power... its registers were full of instructions... the nv30 on the other hand can handle... in principle... more instructions...

here is the problem with that though... it doesn't MATTER how many instructions the card can run... it is how FAST it can run them... eg... arb2 path...

why is it that all these game developers are spending 5-10 times more time developing code for nvidia cards that lowers precision and replaces shaders with lower-level shaders ? it is because the cards CAN'T HANDLE the workload...

I will reiterate... there is nothing that nvidia can do short of a new architecture that will bring them back up... and that is not going to happen this year...

I personally want members of this forum to get the best experience possible for the money they spent... if you feel so inclined to blow your money on an nv3x card @ this time.. feel free... I will not be recommending any nvidia product other than gf4 ti's to anyone anytime soon...
 
Sazar, you the man. I was reading the latest issue of Maximum PC and there they are touting the 5900FX as the new bread of life. Why? Because according to them an overclocked 5900 wins more benchmarks than a 9800. Hmmm, but these cards weren't tested on DX9 games at all. I just can't wait for the new Half-Life benchmark test to see what's reality. You, Sazar, and the constant reading I do have now sealed my decision to buy ATI, and ATI it is.:)
 
Originally posted by Terrahertz
Sazar, you the man. I was reading the latest issue of Maximum PC and there they are touting the 5900FX as the new bread of life. Why? Because according to them an overclocked 5900 wins more benchmarks than a 9800. Hmmm, but these cards weren't tested on DX9 games at all. I just can't wait for the new Half-Life benchmark test to see what's reality. You, Sazar, and the constant reading I do have now sealed my decision to buy ATI, and ATI it is.:)

MAXIMUM PC does not take into account the benchmark hacks nvidia employs, nor nvidia's substitution of bilinear filtering for trilinear filtering...

they have been touting the nv3x series all the while most other technical websites and publications have come out with information that shoots down their logic...

but that's what you get with uninformed reviewers... if you base your decision on an uninformed reviewer's conclusions... it is YOUR money... not that of the publication... that goes down the drain...
 
Noted:

One question. Is it really necessary to buy the 256MB version of the 9800, or will 128MB be just right for what's upcoming?

Forgot to mention: Nvidia and ATI sent them cards that were overclocked, and the 5900 was chosen as the winner for their dream machine.
 