HL2 Benchmarks...EVERYWHERE!

Discussion in 'Benchmarks & Performance' started by Petros, Sep 12, 2003.

  1. Petros

    Petros Thief IV

    Messages:
    3,038
    Location:
    Pacific Northwest
  2. TheBlueRaja

    TheBlueRaja BR to Some

    Messages:
    766
    Location:
    Fawkirk!
    NVidia are getting owned on this one...

    :D
     
  3. ste_w

    ste_w Moderator

    Messages:
    756
    Location:
    UK
Nvidia, however, states that they do not see the point in these benchmarks being released as they were, as they are working very closely with Valve on their Rel 50 drivers, and the benchmarks were run on the Rel 45 drivers. Nvidia also claims that the Rel 50 drivers will be the most advanced and best drivers the company has ever released.
     
  4. TheBlueRaja

    TheBlueRaja BR to Some

    Messages:
    766
    Location:
    Fawkirk!
You mean the beta 50.xx Dets that remove fog in the game to improve the benchmark score, or the 50.xx Dets that are currently getting ripped apart by many sites, including DriverHeaven, for reducing image quality in exchange for a better score in the new AquaMark benchmark? *cough*cheats*cough*

You're going to be hearing a hell of a lot about that over the next few days, methinks. But just because I'm nice and I love all you guys and gals here, here are the links to not one but two sites talking lovingly about the new 50.xx drivers... :cool:

    Driverheaven

    3dGPU

I'd love an Nvidia card right now - wouldn't you... snigger... :D
     
  5. Sazar

    Sazar F@H - Is it in you? Staff Member Political User Folding Team

    Messages:
    14,905
    Location:
    Between Austin and Tampa
    that is frankly a crock of bull$hit...

    nvidia's det 5x.xx drivers do nothing but replace shaders...

    you want to see how good they look?

    check this out...

[IMG]

    give it time to load... and then be shocked @ the total dumbing down of shaders/precision and replacement of shaders nvidia has to employ just to keep up with ati's dx9 hardware...

    nvidia is going to be creamed on any of its supposed dx9 hardware regardless of how much it sugarcoats its deficiencies..

like I have been saying for practically a year... nvidia made a boneheaded move with their design and they are going to pay the consequences of it till they come out with a new core... which is not looking like it will be anytime soon... perhaps nv40 will bring fixes... perhaps not... I personally doubt it considering how much the nv3x lineup has cost nvidia...

    @ the moment I think nvidia's pr department makes more than their engineers...

    my recommendation.. anyone looking to buy a dx9 compliant card... buy ati... nvidia's hardware is not going to catch up anytime soon... if you need a good dx8 card... nvidia is still a good choice... but I'll be damned if I am going to spend 500 dollars on a video card only to watch it being totally CREAMED by a card that only costs 150 bucks...

    completely inexcusable...
     
  6. TheBlueRaja

    TheBlueRaja BR to Some

    Messages:
    766
    Location:
    Fawkirk!
    Owned....
     
  7. Goatman

    Goatman Ska Daddy

    Messages:
    676
Hopefully (for the market's sake) nVidia gets its act together, and their next big video card release fixes the DX9 performance
     
  8. mike09

    mike09 Moderator

    Messages:
    531
    Location:
    Washingtonville , New York


The link doesn't load for me.
     
  9. Petros

    Petros Thief IV

    Messages:
    3,038
    Location:
    Pacific Northwest
    No driver has ever increased performance by 50-100% (as in the huge gaps between the Radeon and the FX) overall. If they can increase the performance in HL2 with custom shaders, they will also have to write new shaders for every DX9 game that comes out just to keep up. How long can they do that? I.e. Detonator 341.25 by this time next year :D
     
  10. wyrlwyn

    wyrlwyn Guest

    looks like i'm ready to give gordon another try... now that i don't buy nvidia anymore!
     
  11. Sazar

    Sazar F@H - Is it in you? Staff Member Political User Folding Team

    Messages:
    14,905
    Location:
    Between Austin and Tampa
    that is not necessary...

all they have to do is start checking for the shaders and then replace them with partial-precision versions... naturally this requires more work from the game devs, and there are only so many delays they can tolerate before they stop doing this 5x/10x extra work on nvidia-specific paths to make the game playable on nvidia hardware...

    effectively a simple patch should update the shader replacement routines...

    this will mean that a dx9 game will be treated effectively as a dx8/8.1 game by the gf FX hardware... so you will miss out on some effects/visuals... but the game will still play...

    of particular note... apparently the gf4 ti 4600 works faster than the 5600 in hl2 benchies... gabe has suggested using the 5600/5200 cards in dx8.1 mode instead of dx9 mode to have a chance of playing the game... this will be true in other games that utilise dx9... such as halo... and this has been shown in tomb raider already...
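For context on the "partial precision" replacement described above: full DX9 pixel shaders compute in 32-bit floating point (24-bit on ATI's hardware), while the replacement shaders reportedly drop to 16-bit. A quick Python sketch (purely illustrative, nothing to do with the actual driver code) shows how much accuracy a value loses when it is squeezed into FP16:

```python
import struct

def to_fp16(x: float) -> float:
    """Round-trip a value through IEEE-754 half precision (FP16),
    the 'partial precision' format the replacement shaders
    reportedly use instead of full FP32 math."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

full = 0.123456789          # a typical normalized shader value
half = to_fp16(full)
print(full, half)           # FP16 keeps only ~3 decimal digits
print(abs(full - half))     # this error is what shows up as artifacts
```

Roughly three decimal digits survive the round trip; in a shader that error accumulates across lighting and blending passes, which is where banding and washed-out highlights come from.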
     
  12. Sazar

    Sazar F@H - Is it in you? Staff Member Political User Folding Team

    Messages:
    14,905
    Location:
    Between Austin and Tampa
    an interesting note... it would appear as though microsoft has endorsed half life 2's benchmark utility as a true dx9 benchmark...

    recall 3dmark03 (yes... the same benchmark that has gone down the toilet since allowing 'optimizations' that discredit what it does completely) never received this endorsement...

    curiouser and curiouser this tale becomes...
     
  13. scriptasylum

    scriptasylum Moderator

    Messages:
    832
    Location:
    Des Moines,IA
LOL, back when I was looking at new video cards, I felt good about going from NV to ATI because of the IQ, the physical size, the relatively noise-free operation (compared to NV), etc... now it seems there was a REAL good reason for the switch! This non-synthetic DX9 test shows the superior product even though NV has the clock advantage. And this is coming from an NV fan since the GF256!

    If a new set of drivers fixes the GF DX9 performance to ATI's level, then either NV will be cheating again, or all the drivers they've developed in the past were very poorly written. :)

    NV must be feeling a bit small right now.
     
  14. Terrahertz

    Terrahertz Extinction Agenda Political User Folding Team

    Messages:
    972
    Location:
    New York
Man, I tell ya, Nvidia, you should read Sazar's stuff; he just might save your asses, lol. I already preordered HL2 and I'll be damned if I have to run this DX9 masterpiece in DX8 mode, lol. Good job, Nvidia, lol
     
  15. Teddy

    Teddy Boogie Nights...!

    Messages:
    1,551
    Location:
    London, UK
    Hardly proof...looks like the same pic with the gamma or something reduced.

    Where was this pic from? A reliable source or someone with a little too much time on their hands and a copy of PaintShop Pro?
     
  16. Terrahertz

    Terrahertz Extinction Agenda Political User Folding Team

    Messages:
    972
    Location:
    New York
    Alright we can argue the pic but official benchmarks have spoken and someone got their asses kicked.
     
  17. StormFront

    StormFront Guest

I gotta disagree here. The Nvidia render looks better. The lighting is smoother and more subtle, and you can see a greater level of detail with fewer edges and less blurring occurring...

    Just my 2 cents anyhoo:D
     
  18. StormFront

    StormFront Guest

Actually, check that. Just realised that this has to be a fake. The two frames are identical in their object and explosion positioning. No way you can render an image like this (specifically the 'splodes) twice and get everything exactly the same... :confused:
     
  19. TheBlueRaja

    TheBlueRaja BR to Some

    Messages:
    766
    Location:
    Fawkirk!
Yeah, all these sites/game developers and benchmark creators have decided to get together and gang up on Nvidia to make them look bad.

    Or....

And here are some even more interesting cover-ups, sorry, news...

EIDOS Interactive, the publisher of Tomb Raider: Angel of Darkness, issued a patch a couple of weeks ago for the game which happened to include a way to use the game as a DX9 benchmark. Since it shows NVIDIA hardware performing slower than ATI's, EIDOS has pulled it down. Remember, this is a "Way It's Meant to be Played" game, which means NVIDIA has paid EIDOS marketing money. Keep in mind that this patch improved performance on both ATI and NVIDIA hardware. Here's a BS statement from EIDOS Europe:

    "It has come to the attention of Eidos that an unreleased patch for Tomb Raider: AOD has unfortunately been used as the basis for videocard benchmarking. While Eidos and Core appreciate the need for modern benchmarking software that utilizes the advanced shading capabilities of modern graphics hardware, Tomb Raider: AOD Patch 49 was never intended for public release and is not a basis for valid benchmarking comparisons. Core and Eidos believe that Tomb Raider: AOD performs exceptionally well on NVIDIA hardware." - Paul Baldwin, Eidos
     
  20. StormFront

    StormFront Guest

Again, more hype. This article, if read correctly, actually does not say anything against NVidia.
Eidos freely admit that this was an internal-build patch and not for public release. Therefore it is likely that this benchmark was not designed to test both render pathways. Like it or not, we have to accept that NVidia have chosen to do things differently than how Microshaft has decreed with their precious DirectX (interesting point, this: why have we all decided without thinking about it that DirectX is the right way to go?? Odd, that... :confused: )