
Half-Life 2 To Be Nvidia Exclusive?

Electronic Punk

willalwaysbewithyou
Staff member
Political User
#1
There is quite a heated debate over on Warp2Search about a report from a completely unknown site claiming that Half-Life 2 will only support Nvidia graphics cards.

http://www.warp2search.net/article.php?sid=11581

It sounds quite daft that a GeForce 2 MX would be able to run a game that the ATI Radeon 9700 can't... so I asked my mate at Nvidia about that very topic.

He can confirm that it IS true BUT only under a particular set of circumstances.

Wanna guess what they are? :D
 
#2
I thought DirectX was invented to solve exactly this kind of thing :confused:. If this is true, then I am almost sure there will be a patch released to enable Radeon support.

btw. what are those 'circumstances'?
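
For what it's worth, that is roughly how it does work on Windows: a Direct3D game creates a device through the API and never has to name a vendor. Here is a minimal sketch of vendor-neutral device creation, assuming the DirectX 9 SDK headers and an existing window handle; it is illustrative only, not anything from Valve's code:

```cpp
// Minimal Direct3D 9 device creation: D3DADAPTER_DEFAULT picks whatever
// card is installed (Nvidia, ATI, or anything else) - the game never
// has to ask who made it.
#include <d3d9.h>

IDirect3DDevice9* CreateVendorNeutralDevice(HWND hwnd)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return NULL;

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;   // reuse the current desktop format

    IDirect3DDevice9* device = NULL;
    HRESULT hr = d3d->CreateDevice(D3DADAPTER_DEFAULT,  // whichever adapter is present
                                   D3DDEVTYPE_HAL,      // hardware acceleration
                                   hwnd,
                                   D3DCREATE_HARDWARE_VERTEXPROCESSING,
                                   &pp, &device);
    d3d->Release();
    return SUCCEEDED(hr) ? device : NULL;
}
```

If that call succeeds, the rest of the renderer only ever talks to the IDirect3DDevice9 interface, which is why a Windows title being "Nvidia-only" would have to be a deliberate choice rather than a technical necessity.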
 

OTE

Guest
#3
I doubt this will ever see the light of day. Nvidia may hold more of the market than anyone else at the moment, but the game's creators would lose about 30% of potential players as a result, which isn't financially sound, so it wouldn't make any sense from that point of view. Also, I hadn't heard that Half-Life 2 had even been officially announced yet; even if it has, it will be at least a year before the game is released, maybe more if they play the 'it's done when it's done' card, and anyone could be top dog by the time it ships. Just look at how quickly Nvidia got there.
However, what may very well happen, in my opinion, is that Half-Life 2 could have vendor-specific driver instructions and commands, like games did with Glide for the Voodoo cards when 3dfx was top dog. At the time, running the game on a Voodoo card was the way to do it; Unreal and Unreal Tournament both had Glide-specific code.
 

Electronic Punk

willalwaysbewithyou
Staff member
Political User
#4
Well, DirectX does take care of that.

The situation in which Half-Life 2 will only support Nvidia cards is actually on Linux!

"Maybe for Linux. UT2003 runs only on NVIDIA for Linux because ATI is too buggy and they won't fix their own Linux drivers."

So W2S only has part of the scoop ;)
 
#5
Have you forgotten? It's April :D so this is one of those April Fools' jokes ;) Even if it's true, anyone with a small background in drivers can hack ATI's drivers... it's the same operating system and the same hardware... remember that 3D cards are just accelerators. :)
 

Krux

Nissan Powered
#6
Originally posted by OTE
I doubt this will ever see the light of day. Nvidia may hold more of the market than anyone else at the moment, but the game's creators would lose about 30% of potential players as a result, which isn't financially sound, so it wouldn't make any sense from that point of view. Also, I hadn't heard that Half-Life 2 had even been officially announced yet; even if it has, it will be at least a year before the game is released, maybe more if they play the 'it's done when it's done' card, and anyone could be top dog by the time it ships. Just look at how quickly Nvidia got there.
However, what may very well happen, in my opinion, is that Half-Life 2 could have vendor-specific driver instructions and commands, like games did with Glide for the Voodoo cards when 3dfx was top dog. At the time, running the game on a Voodoo card was the way to do it; Unreal and Unreal Tournament both had Glide-specific code.


But... what if Nvidia paid to cover that 30% loss and told Sierra they couldn't make any patches for ATI? The only way you could play the game would be to buy one of their cards, or have someone hack the game's code to bypass whatever it uses to check for an Nvidia card. But then there would probably be lawsuits, and it might even make it so your CD key doesn't work online. I dunno... :eek:
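
For illustration, a check like that wouldn't take much. Here is a rough sketch of how a game could sniff the card's vendor through the standard OpenGL vendor string (glGetString and GL_VENDOR are real OpenGL; whether any game would actually do this is pure speculation):

```cpp
// Hypothetical vendor sniff. Requires a current OpenGL context; Nvidia's
// drivers report a vendor string containing "NVIDIA", ATI's contain "ATI".
#include <GL/gl.h>
#include <cstring>

bool IsNvidiaCard()
{
    const char* vendor = reinterpret_cast<const char*>(glGetString(GL_VENDOR));
    return vendor != NULL && std::strstr(vendor, "NVIDIA") != NULL;
}
```

Which is also exactly why such a lock would be trivial to defeat: patch out the string compare and the check is gone.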
 

rettahc

Guest
#7
I seriously doubt that Nvidia would pay to keep the game strictly for their cards, simply because they know that someone would hack it to run on ATI cards.
 

toxicity

Guest
#9
lol, had me worried for a sec. Even though I have a GeForce 2, I am getting a Radeon soon.
 

Sazar

F@H - Is it in you?
Staff member
Political User
#10
FYI... Nvidia does NOT own 70% of the GPU/IGP market... they own around 40%, with Intel right behind them and ATI next...

in terms of standalone cards... Nvidia has a larger share, BUT there are a LOT of systems with Intel integrated graphics :)

if you look at the way video cards work... it is easier to program for ATI cards than for Nvidia, simply because ATI follows the specs... Nvidia creates its own specs...

why does John Carmack spend extra time working on an NV30-specific path? because the DEFAULT pathway... ARB2... is slower by a large margin on the NV30 than on the R300... he has to use Nvidia's own path, which means more programming...

btw I highly doubt that news article is true :)

it's been discussed to death at www.nvnews.net... an Nvidia fan site... it just does not seem realistic...
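
That "extra path" business comes down to checking which fragment-program extensions the driver exposes and picking a backend accordingly. A rough sketch in that spirit (this is not Carmack's actual code; GL_ARB_fragment_program and GL_NV_fragment_program are the real extension names from that generation of hardware):

```cpp
// Sketch of render-path selection in the spirit of Doom 3's ARB2/NV30
// backends: use the generic ARB2 path by default, but allow opting into
// the NV-only extension where the generic path is known to be slow.
#include <GL/gl.h>
#include <cstring>

enum RenderPath { PATH_ARB2, PATH_NV30, PATH_FIXED_FUNCTION };

static bool HasExtension(const char* name)
{
    // Simple substring scan of the extension list - good enough for a sketch.
    const char* exts = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return exts != NULL && std::strstr(exts, name) != NULL;
}

RenderPath ChooseRenderPath(bool preferVendorPath)
{
    bool hasArb = HasExtension("GL_ARB_fragment_program"); // generic DX9-class path
    bool hasNv  = HasExtension("GL_NV_fragment_program");  // NV30-specific path

    // The generic ARB2 path runs much slower on NV30 hardware, so a developer
    // can opt into the vendor path - that is the extra programming in question.
    if (hasNv && preferVendorPath) return PATH_NV30;
    if (hasArb)                    return PATH_ARB2;
    return PATH_FIXED_FUNCTION;
}
```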
 

Krux

Nissan Powered
#11
Originally posted by Sazar
FYI... Nvidia does NOT own 70% of the GPU/IGP market... they own around 40%, with Intel right behind them and ATI next...

in terms of standalone cards... Nvidia has a larger share, BUT there are a LOT of systems with Intel integrated graphics :)

if you look at the way video cards work... it is easier to program for ATI cards than for Nvidia, simply because ATI follows the specs... Nvidia creates its own specs...

why does John Carmack spend extra time working on an NV30-specific path? because the DEFAULT pathway... ARB2... is slower by a large margin on the NV30 than on the R300... he has to use Nvidia's own path, which means more programming...

btw I highly doubt that news article is true :)

it's been discussed to death at www.nvnews.net... an Nvidia fan site... it just does not seem realistic...

That's not entirely true, Sazar. The reason the FX was off spec was that when they were working with MS towards the end of the DX9 project, MS told them 'when we are done you will sign over all rights to your hardware design'. I believe Intel was in on this deal too, and when MS tried to pull this on them, they looked at MS and told them to shove it up their ass. I remember reading about this on anandtech.com a few months back... I'm not sure if it was Intel, but there was another company in the deal.
 

Sazar

F@H - Is it in you?
Staff member
Political User
#12
Originally posted by Krux
That's not entirely true, Sazar. The reason the FX was off spec was that when they were working with MS towards the end of the DX9 project, MS told them 'when we are done you will sign over all rights to your hardware design'. I believe Intel was in on this deal too, and when MS tried to pull this on them, they looked at MS and told them to shove it up their ass. I remember reading about this on anandtech.com a few months back... I'm not sure if it was Intel, but there was another company in the deal.
wot?

the FX is not off spec... AFAIK...

sorry... not following what you are saying here...
 

OTE

Guest
#13
I think what he means is that the GFFX doesn't support DX9 exactly to the letter, unlike the ATI card(s).
 

Electronic Punk

willalwaysbewithyou
Staff member
Political User
#14
That's surprising, especially as Nvidia has this written on their drivers page:

Release Highlights:
- The industry’s best Microsoft® DirectX® 9 support


Naturally, I hope I have made it clear that I really am not picking sides in the whole Nvidia/ATI thing.

I may be a tiny bit biased as I have a GF3 atm, but I will be switching to a 9800 Pro at some stage, when my bank manager lets me.
 

Sazar

F@H - Is it in you?
Staff member
Political User
#15
Originally posted by Electronic Punk
That's surprising, especially as Nvidia has this written on their drivers page:

Release Highlights:
- The industry’s best Microsoft® DirectX® 9 support


Naturally, I hope I have made it clear that I really am not picking sides in the whole Nvidia/ATI thing.

I may be a tiny bit biased as I have a GF3 atm, but I will be switching to a 9800 Pro at some stage, when my bank manager lets me.
they go beyond spec on many things... yes... but there are other things, such as displacement mapping, which are not implemented as well as on the ATI cards... remember that the R300 came out 6-7 months before the NV30... but it is all within spec... the NV30 may go beyond spec on many things, but it is barely up to spec on some others...

also, to be in spec a feature has to be done in hardware... emulating it in software doesn't really cut the mustard :)

note also... there are NO WHQL DX9 drivers from Nvidia... hence "best in the industry" is a bit of BS...

lol but I will stick to my original stance that the card is, AFAIK, within spec.. how it performs on the default pathways is obvious :) but it is within spec...
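
The "done in hardware" point is something an application can actually ask about: Direct3D exposes a caps structure describing what the driver claims the card can do. A small sketch using the standard GetDeviceCaps call (the pixel shader 2.0 threshold is just an illustrative DX9-class requirement, not anything a particular game is known to check):

```cpp
// Query the default adapter's reported capabilities and check whether it
// claims DX9-class (pixel shader 2.0) support.
#include <d3d9.h>

bool ReportsPixelShader2_0()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return false;

    D3DCAPS9 caps;
    bool ok = SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))
              && caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);

    d3d->Release();
    return ok;
}
```

Of course, caps only tell you what the driver claims; whether a feature is genuinely wired into the silicon or quietly emulated is exactly what is being argued about here.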
 
