I won't get into the specifics of what FPS the eye can detect again...I've done that before at other forums, and it can give a guy a bigger headache than an hour at a 60Hz refresh rate
The primary thing to consider here is that while viewing a game at 30-40 FPS may be bearable, every game has its up and down dips in FPS. IOW, the downward dips to 10-20 FPS can really kill a game experience. A good example of this is in the latest JK games. There's a bug where, when you enable force feedback and swing your lightsaber, all of a sudden your 80-120 FPS momentarily takes a nosedive to 20-30 FPS. For those who own Jedi Knight: Outcast and/or Jedi Academy, try this out and tell me you don't notice the difference
OK, S1RE, you're saying that those of us with an NVidia card are screwed when it comes to HL2? I'm guessing you're going by the benchmarks and how they said it would be unplayable on lower-end graphics cards.
It depends. If you have an FX5200 or 5600, don't even think about playing in DX9 mode or at high resolutions. If you're talking FX5900, it *might* be playable in DX9, but basically in the 30-40 FPS average range. The Det 5.0 drivers may increase speed, but it has been determined that the most up-to-date betas sacrifice IQ for speed. How noticeable is it? Unknown at this time. But Gabe Newell basically said that the NV code path in the game already sacrifices some IQ to gain some speed, and it still couldn't catch the Radeon 9700 or 9800 cards. In fact, the Rad 9600 Pro was said to be faster than the 5900 Ultra in most benches.
Does this mean that HL2 is unplayable with the 5900? Far from it.
But you'd think that a card that costs 400-500 USD and was marketed as able to excel at DX9 games would perform as well as or better than its ATI counterparts in said games. Alas, it doesn't, even at reduced IQ.
Quite frankly, this would piss me off. But that's just me.
As far as the 5200 goes, don't expect a good experience with HL2, unless of course your expectations are already low (not flamebaiting).
Well, if that's the case, I guess you weren't keeping up with the Half-Life 2 news that's been around for a few months. For one, it will be playable on a low-end system, just not nearly as graphical as on a higher-end system. That's because of the engine they took so long to make, which configures itself to your system. And they said that on a GeForce FX 5200 Ultra it would hit about 10 frames per second. If you think that kind of frame rate would slow the video down that much, then I'm wondering what your view of "slow" is. You're probably thinking of the 60 frames per second it's said to get on the Radeon 9800. Do you think you'd really notice that much of a difference between 60 frames per second and maybe 20 or so? I highly doubt it.
IMHO, the 5200 just sucks for almost any gaming purpose, especially anything recent. Note: that obviously doesn't mean all games. There are some games out there that are relatively easy on the card, but those are the exception to the rule.
The 5200 is essentially a GF2. It replaces the MX series, which has performed relatively sucktacularly in the past.
Oh, and my view of slow has been defined earlier in this post. The difference between 60 FPS and 20 FPS is like night and day. Try out the JK test I outlined earlier and you'll see what I mean.
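To put some rough numbers behind the 60-vs-20 comparison, here's a quick frame-time sketch (just basic arithmetic, not a benchmark): at 60 FPS each frame is on screen for about 16.7 ms, while at 20 FPS it's 50 ms, i.e. every frame hangs around three times as long.

```python
# Frame-time arithmetic: why 60 FPS vs 20 FPS feels like night and day.
def frame_time_ms(fps):
    """Milliseconds each frame stays on screen at a given frame rate."""
    return 1000.0 / fps

for fps in (60, 40, 30, 20, 10):
    print(f"{fps:3d} FPS -> {frame_time_ms(fps):5.1f} ms per frame")
```

Those 10-20 FPS dips mean individual frames lingering 50-100 ms each, which is long enough to feel as stutter even if the average looks OK.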
Of course, preferences are like opinions: everyone's differ. But to say that HL2 would be a good experience on the 5200 for the general public, or especially for hardcore gamers, is really going against the grain.