Has nVidia taken the crown??

Originally posted by ElementalDragon
Actually, I think the fan idea is pretty good. If you think about it, why would you want a small fan on a card that fast if it doesn't cool the card down as much as possible? Look at older CPUs: they only really needed a very small heatsink and fan. Now we have processors from every manufacturer with two-inch-deep heatsinks and fairly large fans, and video cards are going the same way. Look at onboard video or very, very old 4 or 8 MB cards that have no fan at all (unless you count a case fan cooling the whole thing). Now 32 MB cards have fans, and they don't always run too great depending on the manufacturer. All I'm really trying to say is: why build that nice a card, put too little cooling on it, ship it, and then have someone's card fry because it got too warm?

you miss the point my friend :)

the ORIGINAL specs for the card had it running @ 400 MHz, 450 MHz MAXIMUM... if you note that the non-ultra 5800 runs @ 400 MHz and uses a basic heatsink you will see the point...

the 5800 ultra is OVERCLOCKED... it's that simple... heck, if I wanted to overclock my 9700pro and stick that big mofo of a cooler on it I could easily scale up to and beyond 400/400 mem and core... not a problem... @ those speeds the gf FX would be a very distant second in most games and the fillrates would be extremely close...

now to the question of the card... it is running a fan that is louder than Deltas... do you know how loud those things are?

scaling to 70 dB? do you realise exactly HOW loud that is?
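
for some scale on what 70 dB means, here is a minimal sketch of the decibel arithmetic in python (the 30 dB figure for a quiet case fan is just an assumed point of comparison, not a measured spec):

code:
# decibels are logarithmic: every +10 dB is 10x the sound power,
# and is usually heard as roughly twice as loud
def power_ratio(db_a, db_b):
    return 10 ** ((db_a - db_b) / 10.0)

quiet_case_fan_db = 30      # assumed figure for a quiet case fan
fx_cooler_db = 70           # the figure being quoted for the gf FX cooler

print(power_ratio(fx_cooler_db, quiet_case_fan_db))      # 10000x the sound power
print(2 ** ((fx_cooler_db - quiet_case_fan_db) / 10.0))  # roughly 16x as loud to the ear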

beyond that, per nvidia the device takes up 2 slots... the agp slot and 1 pci... but in reality you would be hard-pressed to squeeze anything into the next pci slot down either... take a look @ the pics of it sitting inside a computer... since the rear of the card is cooled passively, there is heat coming off it, and a LOT of it...

there is not one intel or amd cooler out there that does this... and why can ati get away with a much smaller heatsink on its card instead of a monster cooler like that? coz of better design and less reliance on the hope that low-k dielectrics would be mature enough to use in this new video card..

ati apparently had a much better understanding of how mature the processes were than nvidia and are currently able to reap technical benefits because of that...

the card and its cooler are a little off if you ask me... the REAL gf FX would not have been able to compete with the 9700pro... so they oc it... call it the ultra... slap a massive cooler on it and boom... there you go :)
 
what? so it would be okay in ten years' time if everyone has a fridge-like compartment attached to each rig? this problem with cooling is really quite interesting.. it kind of indicates that the market is moving faster than the technology. because now you've got nvidia releasing a super graphics card to beat the radeon any which way.. and people actually like that? I understand that people say new processors release more heat.. but the new 0.13 micron tech decreases that.. so it goes to show they are making something that isn't right for its time... slow down, hoes! ati made a cracking card and they don't have a bulky cooling system... which goes to show better quality design..

i hope you people get my point, i don't mean to go a bit off topic.. but that last post [elemental dragon] is what prompted this..
 
Originally posted by Sazar
you miss the point my friend :)...........................

the 5800 ultra is OVERCLOCKED... it's that simple... heck, if I wanted to overclock my 9700pro and stick that big mofo of a cooler on it I could easily scale up to and beyond 400/400 mem and core... not a problem... @ those speeds the gf FX would be a very distant second in most games and the fillrates would be extremely close


let me say this, sazar: if nvidia were to go with 256bit memory instead of 128, ati would be a verrry distant second. the 9700pro's days are numbered like any other video card's. what would u have to say for ati if nvidia did? "umm, oh no, ati is losing now!", just like all u ati freaks screaming now about nvidia losing to the 9700pro. BIG DEAL!

think about 256bit DDRII... ati's 256bit DDR wouldn't hold a candle to 256bit DDRII. so if nvidia decides to give that a whirl, and it proves to make DDR obsolete, ati will be rushing to keep up with nvidia again. ati only sparked motivation for nvidia.
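
just to put rough numbers behind the bus-width argument, here is a minimal back-of-the-envelope sketch in python, assuming the commonly quoted memory clocks (500 MHz DDRII on the 5800 ultra's 128bit bus, roughly 310 MHz DDR on the 9700pro's 256bit bus); the 256bit DDRII line is purely hypothetical:

code:
# theoretical peak memory bandwidth = (bus width in bytes) x (effective data rate)
def bandwidth_gb_s(bus_bits, clock_mhz, transfers_per_clock=2):
    return bus_bits / 8.0 * clock_mhz * transfers_per_clock / 1000.0

print(bandwidth_gb_s(128, 500))   # gf FX 5800 ultra, 128bit DDRII @ 500 MHz: 16.0 GB/s
print(bandwidth_gb_s(256, 310))   # radeon 9700pro, 256bit DDR @ ~310 MHz: ~19.8 GB/s
print(bandwidth_gb_s(256, 500))   # hypothetical 256bit DDRII part: 32.0 GB/s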
 
Originally posted by mikill
let me say this, sazar: if nvidia were to go with 256bit memory instead of 128, ati would be a verrry distant second. the 9700pro's days are numbered like any other video card's. what would u have to say for ati if nvidia did? "umm, oh no, ati is losing now!", just like all u ati freaks screaming now about nvidia losing to the 9700pro. BIG DEAL!

think about 256bit DDRII... ati's 256bit DDR wouldn't hold a candle to 256bit DDRII. so if nvidia decides to give that a whirl, and it proves to make DDR obsolete, ati will be rushing to keep up with nvidia again. ati only sparked motivation for nvidia.

you are obviously missing the big picture :)

nvidia CHOSE their design early on... they DECIDED to go with 128bit memory... heck their CEO said only a week or so ago that ati with 256bit memory would not have an advantage and that 128bit was the way to go... :rolleyes:

ati had the option of going to the 0.13 micron process for their cards... they had the option of going to ddrII... I think you have forgotten that ati DEMONSTRATED a video card with ddrII (a 9700pro, LIVE, on a TECH TV type program)... this was well before nvidia even managed to get their cards working properly, apparently...

the way things have worked out... ati definitely had a better understanding of the current technology on offer than nvidia did... nvidia selected its process/hardware and ended up having to rework it just to keep up with the 9700pro (overclocking it)

also, do you realise that ddrII is not only expensive but also has higher latencies than ddr? that ati is probably going to introduce its r400 card with the gddr3 spec that it helped develop, which not only supports higher speeds than gddrII but is also more optimized for video cards?

nvidia has to not only develop 256bit memory but also solve a lot of the problems that it has right now...

if you are going to say they have the upper hand on the 0.13 micron process... ati taped out its rv350 part in november on the 0.13 micron process...

again it is all about UNDERSTANDING the technology on offer... historically ati has had much better technology than nvidia... it's just been a case of other factors holding it back.. hence they went back to the drawing board and BLAM, we have the r300, which even now is showing it is more than able to hold its own v/s a much higher clocked card..

btw.. go to any nvidia forum and see what the nvidia 'freaks (?)' are saying about the company they support... :) it's not ati 'freaks' that are saying the nv30 is crud... which it is not, btw...

they are just laughing @ nvidia's pr department and the utter BS they have been spewing all this time about their card and the ambiguous nature of their advertising...
 
You also have to remember that the GeForce FX is running DDRII in DDRI mode; they are only using the DDRII for the higher clockspeed.... if ATi were to use DDRII to its full functionality with a 256 bit bus then they'd be much farther ahead....
 
Originally posted by Goatman
You also have to remember that the GeForce FX is running DDRII in DDRI mode; they are only using the DDRII for the higher clockspeed.... if ATi were to use DDRII to its full functionality with a 256 bit bus then they'd be much farther ahead....

???

gddrII is not the same as ddr.... also, the gf FX is using gddrII to its capabilities... I don't understand how you can run gddrII as ddr... elaborate please... btw the clocks work like this: a 250 MHz core clock = 250 x 4 = 1 GHz effective... in ddrII... but with higher latencies...
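
to spell out that arithmetic, a quick sketch of how the numbers relate (the 250 MHz is the dram core clock, not the marketed figure; the 4x comes from ddrII's prefetch of 4):

code:
# how the "1 GHz" ddrII figure breaks down
core_mhz = 250                       # dram core clock
prefetch = 4                         # ddrII fetches 4 words per core clock
io_bus_mhz = core_mhz * 2            # data bus runs at 500 MHz
effective_rate = io_bus_mhz * 2      # 2 transfers per bus clock -> 1000 Mbit/s per pin
print(effective_rate == core_mhz * prefetch)   # True: same 1 GHz figure either way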

ati has chosen not to use ddrII for its current lineup, and likely its r350 part, due to cost reasons...

[edit] sorry if the way I wrote that sounds offensive.. was not intentional.. just do not understand quite what you are saying there... [/edit]
 
Originally posted by mikill
let me say this sazar, if nvidia were to go with the 256bit memory instead of 128, ati would be a verrry distant second. so 9700pro days are numbered like any other video card. what would u have to say for ati if nvidia did? "umm oh no, ati is losing now!". just like all u ati freaks screaming now about nvidia losing to 9700pro. BIG DEAL!

think about 256bit DDRII...ati's 256bit DDR wouldnt hold a candle to 256 DDRII. so if nvidia decides to give that a whirl, and it proves to make DDR obsolete, ati will be playing rush to keep up with nvidia again. ati only sparked motivation for nvidia.

Umm, no.

If the GF FX was blowing every other piece of hardware out of the water by about 20 or so points in the benchmarks and still had a huge fk off fan, then I would still complain. Noise levels are important to me.
 
Originally posted by Geffy
Umm, no.

If the GF FX was blowing every other piece of hardware out of the water by about 20 or so points in the benchmarks and still had a huge fk off fan, then I would still complain. Noise levels are important to me.

I love the way you use BLOW and BIG FAN in the same paragraph :)

just seemed kewl to me lol... and yes I concur..

btw if you do read the reviews on this product from a few sites you will notice that they point out raw speed stopped being important a while back... image quality is what is important.. and the gf FX fails to deliver here...

the overall speeds of the cards in terms of framerates are quite close... one being faster than the other and vice versa in various situations... but the IQ is definitely in ati's favor... and remember it was nvidia that had been promising CINEMATIC QUALITY in their games.. this is not going to happen... their DAWN demo is a one-off thing... nice, and using interesting features... yet highly impractical in the context of gaming...

conclusion... clock per clock ati has the faster card and STILL retains the lead in IQ...
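
to illustrate what 'clock per clock' means here, a tiny sketch with placeholder benchmark scores (only the core clocks are the commonly quoted ones; the fps numbers are made up for the example):

code:
r300_core_mhz, nv30_core_mhz = 325, 500   # 9700pro vs 5800 ultra core clocks
r300_fps, nv30_fps = 100.0, 105.0         # hypothetical scores in some benchmark
print(r300_fps / r300_core_mhz)           # ~0.31 frames per MHz
print(nv30_fps / nv30_core_mhz)           # ~0.21 frames per MHz
# even where the higher-clocked card wins outright, per-clock efficiency favours the r300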
 
I was looking at some reviews of the GF FX and one had the FX 6 points over the Radeon 9700 Pro on something, but for 6 fewer points you get a quieter PC and more money in your pocket
 
actually the difference between DDRII and DDR is that DDRII has a larger prefetch.... and on the FX it uses the same prefetch as standard DDR


Sigh..... Again, GO READ THE WHITE PAPER. OK, apparently you didn't even bother to click the link or go to JEDEC.com to see that you are wrong before posting, so I'll just post the relevant parts of the white paper here along with a link to the white paper on JEDEC's website.

quote:
Survey results from Dram vendors on CAS cycle time
Assumptions: ~2002-2003 time frame
256mbit device
Achieve high yield for x4, x8, x16 devices
Supplier | Cycle time | Max pin freq (prefetch 2) | Max pin freq (prefetch 4) | Max pin freq (prefetch 8)
---------+------------+---------------------------+---------------------------+--------------------------
A        | 7.5 ns     | 266 mbit/sec              | 533 mbit/sec              | 1066 mbit/sec
B        | 6.6 ns     | 303 mbit/sec              | 606 mbit/sec              | 1212 mbit/sec
C        | 6.0 ns     | 333 mbit/sec              | 666 mbit/sec              | 1333 mbit/sec
D        | 6.7 ns     | 298 mbit/sec              | 597 mbit/sec              | 1194 mbit/sec
E        | 6.5 ns     | 308 mbit/sec              | 615 mbit/sec              | 1230 mbit/sec
F        | 6.5 ns     | 308 mbit/sec              | 615 mbit/sec              | 1230 mbit/sec
G        | 6.5 ns     | 308 mbit/sec              | 615 mbit/sec              | 1230 mbit/sec
H        | 6.5 ns     | 308 mbit/sec              | 615 mbit/sec              | 1230 mbit/sec


As you can see from that chart, changing the prefetch from 2 to 4 doubles the bandwidth at any given speed, and doubling the prefetch again from 4 to 8 would also double the bandwidth. So now we know that changing the prefetch to 4 WILL double the bandwidth, and that's all well and good, but how do we know that DDR2 is supposed to use a 4-bit prefetch? Again, it's in the white paper:

quote:
• Dram core utilizes a prefetch of 4
• Decreases frequency of dram core relative to that needed to support increased data rate
• Dram core runs at 1/4 data bus frequency



Also note the part about the core running at 1/4 of the data bus frequency. Think of what we now call 266 MHz DDR or PC2100: we know it doesn't run at 266 MHz - it runs at 133 MHz but writes twice per clock cycle, effectively doubling the bandwidth. So the core runs at 1/2 the data bus frequency; when we increase the prefetch to 4 it again doubles the bandwidth, allowing a 133 MHz core to have an effective data rate of 533 MHz (4x the 133 MHz core).
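
if anyone wants to see where the per-pin numbers in that chart come from, here is a small sketch of the arithmetic (my own reading of it, not something lifted from the white paper): the max per-pin rate is just the prefetch divided by the CAS cycle time.

code:
# per-pin data rate (mbit/sec) = prefetch / CAS cycle time
cycle_times_ns = {"A": 7.5, "B": 6.6, "C": 6.0, "D": 6.7, "E": 6.5}
for supplier, t_ns in sorted(cycle_times_ns.items()):
    rates = [round(prefetch / t_ns * 1000) for prefetch in (2, 4, 8)]
    print(supplier, rates)   # e.g. A -> [267, 533, 1067], within a digit of the chart's rounding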

The quotes are taken from page 6 of the JEDEC white paper on DDR2, which you can see here (it is in pdf format so you will need adobe acrobat or another pdf viewer).

And once again I must say: just because it has twice the bandwidth of DDR DOES NOT make it QDR - QDR writes 4 times per clock cycle, while DDR2 still writes twice per clock cycle. Think of it this way: DDR2 is writing twice per clock cycle, but the chunks of data it's writing are twice the size of the chunks DDR1 would write.

So if you still disagree with this there is no point in posting here, as anyone who understands this stuff or has taken the time to actually read the links I posted now knows that I am right and why. If you disagree with it then just email JEDEC and ask them why they didn't make the DDR2 standard the way you wanted them to make it.

Does anybody else think I should post all of this in the Memory forum in the hopes it will be made a sticky so I won't have to keep having this same discussion?
 
you do realise that jedec APPROVES standards?

those are guidelines so that everyone can go ahead and make their products under those basic setups..

please do understand that I don't post for the sake of making anyone look stupid or whatnot... not my intention, and if it comes across as that... please do not take it as such...

I have said and will continue to say that ddrII has HIGHER latencies than ddr memory... your very own post proves my point...

the memory used in video cards from the ti4600 onwards has a latency less than half the lowest in the list you have from jedec..

I went to the samsung website (the manufacturer of gddr II)

http://www.samsungelectronics.com/s...R-II_SDRAM/128M_bit/K4N26323AE/k4n26323ae.htm

there you go :) thats the product...

my comments on the clock are also correct per what you have posted: "Dram core runs at 1/4 data bus frequency"

therefore for 1 GHz divide by 4 and you have 250 MHz, which is what I posted... note I have NEVER stated that gddrII is qdr... because that is not so...

there are distinct advantages to using gddrII over ddr... that is evident... and you will notice I never mentioned prefetches in my posts... I am not even slamming gddrII, I think it's great... though gddrIII is more optimized, per white paper specs submitted to jedec as well :)

reading the samsung link I noticed that the memory they are using is 128bit there... hence it does raise the question as to WHETHER the gf FX could even have been conceived on a 256bit memory platform...

reading that it seems highly unlikely... @ this point in time... perhaps why ati decided to skip using it... their demoed card probably was a 9500pro type version of the card with half the memory bandwidth...

and I reiterate from my other posts on this card.. it is highly unbalanced and, given the pr around it, highly disappointing...
 
But what I said was the FX is running the DDRII with a prefetch of 2 instead of 4... so they are losing the extra edge it should provide.
 
oops, hit quote instead of edit....

Also the JEDEC thing I posted above was from another forum...
 
Originally posted by Goatman
But what I said was the FX is running the DDRII with a prefetch of 2 instead of 4... so they are losing the extra edge it should provide.

that's fine... there are speed advantages and there are disadvantages in the way ddrII works as well... it's just a necessary evil when weighing the 2 and deciding which direction to go in :)
 
