
HL2 Benchmarks...EVERYWHERE!

#3
Nvidia, however, states that it does not see the point in these benchmarks being released as they were, since it is working very closely with Valve on its Release 50 drivers, and the benchmarks were run on Release 45 drivers. Nvidia also claims that the Release 50 drivers will be the most advanced and best drivers the company has ever released.
 
#4
You mean the beta 50.xx Dets that remove fog in the game to improve the benchmark score, or the 50.xx Dets that are currently getting ripped by many sites, including DriverHeaven, for reducing image quality at the expense of a better score in the new AquaMark benchmark? *cough*cheats*cough*

You're going to be hearing a hell of a lot about that over the next few days, methinks. But just because I'm nice and I love all you guys and gals here, here are the links to not one but two sites talking lovingly about the new 50.xx drivers.... :cool:

Driverheaven

3dGPU

I'd love an Nvidia card right now - wouldn't you.... snigger..:D
 

Sazar

F@H - Is it in you?
Staff member
Political User
#5
Originally posted by ste_w
Nvidia, however, states that it does not see the point in these benchmarks being released as they were, since it is working very closely with Valve on its Release 50 drivers, and the benchmarks were run on Release 45 drivers. Nvidia also claims that the Release 50 drivers will be the most advanced and best drivers the company has ever released.
that is frankly a crock of bull$hit...

nvidia's det 5x.xx drivers do nothing but replace shaders...

you want to see how good they look?

check this out...



give it time to load... and then be shocked @ the total dumbing down of shaders/precision and replacement of shaders nvidia has to employ just to keep up with ati's dx9 hardware...

nvidia is going to be creamed on any of its supposed dx9 hardware regardless of how much it sugarcoats its deficiencies..

like I have been saying for practically a year... nvidia made a boneheaded move with their design and they are going to pay the consequences of it till they come out with a new core... which is not looking like it will be anytime soon... perhaps nv40 will bring fixes... perhaps not... I personally doubt it considering how much the nv3x lineup has cost nvidia...

@ the moment I think nvidia's pr department makes more than their engineers...

my recommendation.. anyone looking to buy a dx9 compliant card... buy ati... nvidia's hardware is not going to catch up anytime soon... if you need a good dx8 card... nvidia is still a good choice... but I'll be damned if I am going to spend 500 dollars on a video card only to watch it being totally CREAMED by a card that only costs 150 bucks...

completely inexcusable...
 
#8
Originally posted by Sazar
that is frankly a crock of bull$hit...

nvidia's det 5x.xx drivers do nothing but replace shaders...

you want to see how good they look?

check this out...





The link doesn't load for me.
 
#9
No driver has ever increased performance by 50-100% (as in the huge gaps between the Radeon and the FX) overall. If they can increase the performance in HL2 with custom shaders, they will also have to write new shaders for every DX9 game that comes out just to keep up. How long can they do that? I.e. Detonator 341.25 by this time next year :D
 

Sazar

F@H - Is it in you?
Staff member
Political User
#11
Originally posted by HandyBuddy
No driver has ever increased performance by 50-100% (as in the huge gaps between the Radeon and the FX) overall. If they can increase the performance in HL2 with custom shaders, they will also have to write new shaders for every DX9 game that comes out just to keep up. How long can they do that? I.e. Detonator 341.25 by this time next year :D
that is not necessary...

all they have to do is start checking for the shaders and then replace them with partial-precision versions... naturally this requires more work from the game devs, and there are only so many delays they can tolerate before they stop doing this 5x/10x (or whatever) extra work on nvidia-specific paths to make the game playable on nvidia hardware...

effectively a simple patch should update the shader replacement routines...

this will mean that a dx9 game will effectively be treated as a dx8/8.1 game by the gf FX hardware... so you will miss out on some effects/visuals... but the game will still play...
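The detect-and-swap scheme described above can be pictured as a lookup keyed on a fingerprint of the shader the game submits. This is a toy sketch only, not actual driver code; every name in it (`register_replacement`, `compile_shader`, the shader strings) is made up for illustration:

```python
import hashlib

# Table mapping fingerprints of known game shaders to cheaper substitutes.
replacements: dict[str, str] = {}

def fingerprint(shader_src: str) -> str:
    """Identify a shader by a hash of its source text."""
    return hashlib.sha1(shader_src.encode("utf-8")).hexdigest()

def register_replacement(original_src: str, cheaper_src: str) -> None:
    """Tell the 'driver' to substitute cheaper_src whenever it sees original_src."""
    replacements[fingerprint(original_src)] = cheaper_src

def compile_shader(shader_src: str) -> str:
    """Driver entry point: silently swap in a replacement if one is registered."""
    return replacements.get(fingerprint(shader_src), shader_src)

# A full-precision DX9-style shader the game submits...
game_shader = "float4 main() : COLOR { return tex2D(s, uv) * light; }"
# ...and a hand-tuned partial-precision version (half instead of float).
cheap_shader = "half4 main() : COLOR { return tex2D(s, uv) * light; }"

register_replacement(game_shader, cheap_shader)
print(compile_shader(game_shader))  # the game gets the half-precision shader back
```

The point of the sketch is why a game patch breaks this: any change to the shader source changes the fingerprint, so the lookup misses and the driver has to re-catalogue the new shaders.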

of particular note... apparently the gf4 ti 4600 runs faster than the 5600 in hl2 benchies... gabe has suggested using the 5600/5200 cards in dx8.1 mode instead of dx9 mode to have a chance of playing the game... this will be true in other games that utilise dx9... such as halo... and it has already been shown in tomb raider...
 

Sazar

F@H - Is it in you?
Staff member
Political User
#12
an interesting note... it would appear as though microsoft has endorsed half life 2's benchmark utility as a true dx9 benchmark...

recall 3dmark03 (yes... the same benchmark that has gone down the toilet since allowing 'optimizations' that discredit what it does completely) never received this endorsement...

curiouser and curiouser this tale becomes...
 
#13
LOL, back when I was looking at new video cards, I felt good about going from NV to ATI because of IQ, physical size, relatively noise free (compared to NV), etc... now it seems there was a REAL good reason for the switch! This non-synthetic DX9 test shows the superior product even though NV has the clock advantage. And this coming from an NV fan since the GF256!

If a new set of drivers fixes the GF DX9 performance to ATI's level, then either NV will be cheating again, or all the drivers they've developed in the past were very poorly written. :)

NV must be feeling a bit small right now.
 

Terrahertz

Extinction Agenda
Political User
#14
Man, I tell ya, Nvidia, you should read Sazar's stuff, he just might save your asses lol. I already preordered HL2 and I'll be damned if I have to run this DX9 masterpiece in DX8 mode lol. Good job Nvidia lol
 

Teddy

Boogie Nights...!
#15
Originally posted by Sazar
you want to see how good they look?

check this out...

Hardly proof...looks like the same pic with the gamma or something reduced.

Where was this pic from? A reliable source or someone with a little too much time on their hands and a copy of PaintShop Pro?
 

Terrahertz

Extinction Agenda
Political User
#16
Originally posted by Teddy
Hardly proof...looks like the same pic with the gamma or something reduced.

Where was this pic from? A reliable source or someone with a little too much time on their hands and a copy of PaintShop Pro?
Alright we can argue the pic but official benchmarks have spoken and someone got their asses kicked.
 

StormFront

Guest
#17
Originally posted by Sazar
that is frankly a crock of bull$hit...

nvidia's det 5x.xx drivers do nothing but replace shaders...

you want to see how good they look?

check this out...



I gotta disagree here. The Nvidia render looks better. The lighting is smoother and more subtle, and you can see a greater level of detail, with fewer edges and less blurring occurring...

Just my 2 cents anyhoo:D
 

StormFront

Guest
#18
Actually, check that. Just realised that this has to be a fake. The two frames are identical in their object and explosion positioning. No way you can render a scene like this (specifically the 'splodes) twice and get everything exactly the same....:confused:
 
#19
Yeah, all these sites/game developers and benchmark creators have decided to get together and gang up on Nvidia to make them look bad.

Or....

And here is an even more interesting cover-up, sorry, news...

EIDOS Interactive, the publisher for Tomb Raider: Angel of Darkness issued a patch a couple of weeks ago for the game which happened to include a way to use the game as a DX9 benchmark. Since it shows NVIDIA hardware performing slower than ATI, EIDOS has pulled it down. Remember, this is a "Way it's meant to be played" game, which means NVIDIA has paid EIDOS marketing money. Keep in mind, that this patch improved performance on both ATI and NVIDIA hardware. Here's a bs statement from EIDOS Europe:

"It has come to the attention of Eidos that an unreleased patch for Tomb Raider: AOD has unfortunately been used as the basis for videocard benchmarking. While Eidos and Core appreciate the need for modern benchmarking software that utilizes the advanced shading capabilities of modern graphics hardware, Tomb Raider: AOD Patch 49 was never intended for public release and is not a basis for valid benchmarking comparisons. Core and Eidos believe that Tomb Raider: AOD performs exceptionally well on NVIDIA hardware." - Paul Baldwin, Eidos
 

StormFront

Guest
#20
Again, more hype. This article, if read correctly, actually does not say anything against NVidia.
Eidos freely admit that this was an internal build patch and not for public release. Therefore it is likely that this benchmark was not designed to test both render pathways. Like it or not, we have to accept that NVidia have chosen to do things differently than how Microshaft have decreed with their precious DirectX (interesting point this: why have we all decided, without thinking about it, that DirectX is the right way to go?? Odd that...:confused: )
 
