A computer components & hardware forum. HardwareBanter



This whole HalfLife2 thing is ridiculous



#41 · September 15th 03, 07:46 PM · Lenny


> Yep. MS is now in the final stages of DX-10 anyway. Probably be released
> late this year or early 2004.

Sorry, but you're wrong. DX10 won't be here until the successor to WinXP
(aka "Longhorn") is here, and DX10 hasn't even entered beta stage yet. It'll
be two years or more until it's out.

> Also, there is the update to DX-9b coming soon. A few lines of code could
> make a world of difference for everything.

Any updates done to DX in between major revisions are likely to be bugfixes
only. Certainly no new features on the immediate horizon, and typically a
new DX revision does not mean higher performance, just greater
functionality.


#42 · September 15th 03, 10:26 PM · Lenny


> So how do you rate that "IN GENERAL" optimization?

Look man, there are TWO rendering paths in the game: the Nvidia "mixed"
DirectX 8/9 path, and the general DX path (which supports DX8 OR DX9, but
not both simultaneously).

So IN GENERAL, card X would select the general path, while Nvidia cards
would likely opt for the "mixed" path, unless the users don't like the
reduced quality. However, since Nvidia cooks their drivers, quality's ****ed
on NV30 hardware anyway, so it doesn't really matter. GFFX owners are
taking it up the ass in both cases courtesy of Nvidia's crack driver team
(see Halo, Aquamark etc. screenies for examples of that).
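
In code terms, the path selection boils down to something like this (a toy
Python sketch; the function and path names here are mine, not Valve's):

```python
# Hypothetical sketch of per-card rendering-path selection as described
# above: Nvidia NV30-class cards get the special "mixed" DX8/DX9 path,
# everyone else gets the general path, which runs either full DX9 or
# plain DX8 -- never both at once. All names are invented.

def choose_render_path(vendor: str, supports_dx9: bool) -> str:
    """Return the rendering path a DX8/DX9-era engine might pick."""
    if vendor == "nvidia":
        # Lower-precision "mixed" path to claw back NV30 performance.
        return "mixed"
    # General path: one API level or the other, based on hardware caps.
    return "dx9" if supports_dx9 else "dx8"
```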

> Would you believe Valve if they came one day saying: "Our game is D3D 9,
> unfortunately there's no video card that can display it"? Of course not.

What's this contrived, made-up bogus scenario supposed to mean? If you're
going to try and make a point, aim for one that makes sense too!

> So the general optimization you are talking about is a general one, yes of
> course, but what you see is the one dedicated for one card or the other.
> Right now, it is for ATI.

Sorry, but the general rendering path isn't optimized for ANY card in
particular. ATi cards run the game faster because they're simply BETTER. Why
is that so hard to understand?

> In my opinion, when you design something, one way or the other you end up
> making your tests on *one* device.

It's obvious you do not work in the computer games business.

Trust me, okay? These guys do NOT test/develop on just ONE device. That
would be, like, stupid when there's like a hundred billion different PC
setups out there in the wild.

> B is not a valid argument. HL2 is not available either, afaik.

B is a perfectly valid argument, because unreleased drivers aren't official
and are thus considered to be moving targets. What if NV cooks up a driver
that gives great HL2 performance by sacrificing something else, reaps the
rewards in the form of good press from gaming sites, but then doesn't
release that driver because it just doesn't work very well for anything ELSE
than HL2?

Stuff like this has happened in the past, you know.

Unreleased drivers are like shooting at a moving target. NV isn't
responsible for what an unreleased driver does because it isn't official; it
isn't WHQL certified. For all intents and purposes, it is to be considered
BETA.

> Well, apart from what Valve says, how can you be sure they used D3D9 and
> not some ATI special features?

Duh, because they, like, didn't. There ARE no ATi-specific features!

> The only way to find out is to wait for other D3D9 games.

You're QUITE the Nvidia apologist, aren't you? Dude, there IS other DX9
software out; GFFX sucks pretty much across the board on it, including
Nvidia's own Dawn demo.


#43 · September 16th 03, 04:23 AM · kyork

Found this article today, NVIDIA's response to HL2 and ATI:

http://www.neoseeker.com/news/story/2767/

I gotta say, after hearing rumors that none of today's cards are sufficient
to run the new Doom, who would want to spend $400+ on a vid card which is
already obsolete? Spend ~$200 and enjoy life a while. After Doom is out,
then get a new card for $400+.



"Roger Squires" wrote in message
m...
I think that is your twisted interpretation of things. They have acted

as
if
HL2 is 100% representative of how HL2 will perform, but Doom3 for

example
performs better on GFFX line (when properly downgraded in quality

anyway).

Please remember that Nvidia tried to pull an end-run around ATI with

the
early Doom3 benchmarks, just as ATI has done with HL2. There were many
statements at the time that they had droves of driver people all over the

D3
benchmarking machine, and the whole production was controlled by them,
enough to make Carmack suspicious.

Payback time, baby.

rms




#44 · September 16th 03, 09:54 AM · J.Clarke

On Tue, 16 Sep 2003 11:20:32 +0200, ho alexandre wrote:

> J.Clarke wrote:
> > > william bell did not say that, he said that ATI cards can't handle
> > > applications that do not use a Z-buffer.
> >
> > He did? Where did he say that?
>
> "even ATI Cards have a problem with no Z Buffer or a limited one.."
>
> --
> XandreX
> /I'm that kind of people your parents warned you about/

Yes he did. He did not say anything about applications.

Do you know of any 3D applications which do not use a Z-buffer?

--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
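
For anyone wondering what's actually at stake here: a Z-buffer stores, per
pixel, the depth of the nearest surface drawn so far, so triangles can be
submitted in any order and still occlude each other correctly. A toy sketch
of the idea (plain Python; the tiny 1-D "screen" and all names are invented
for illustration, this is not any real API):

```python
# Minimal depth-buffered drawing: a fragment is kept only if it is
# nearer than whatever is already stored at that pixel.

WIDTH = 4
INF = float("inf")

color = ["black"] * WIDTH   # framebuffer
depth = [INF] * WIDTH       # Z-buffer: nearest depth seen per pixel

def draw(pixel: int, z: float, c: str) -> None:
    """Write fragment c at pixel only if it passes the depth test."""
    if z < depth[pixel]:
        depth[pixel] = z
        color[pixel] = c

# Surfaces submitted in arbitrary order; the Z-buffer sorts it out.
draw(1, 5.0, "red")    # far red fragment lands first
draw(1, 2.0, "blue")   # nearer blue fragment replaces it
draw(1, 9.0, "green")  # farther green fragment is rejected
```

Without that depth test, whatever was drawn last would always win, which is
why practically every 3D application relies on a Z-buffer of some kind.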
#45 · September 16th 03, 10:20 AM · ho alexandre

J.Clarke wrote:

> > william bell did not say that, he said that ATI cards can't handle
> > applications that do not use a Z-buffer.
>
> He did? Where did he say that?

"even ATI Cards have a problem with no Z Buffer or a limited one.."

--
XandreX
/I'm that kind of people your parents warned you about/

#46 · September 16th 03, 04:05 PM · Danny


> It's actually Tomb Raider 6.

Oh well, Tomb Raider 6 then... Thanks for the heads-up, heh. They're all
crap anyway, so who cares? ;-)

#1 was absolute brilliance, imo.
#6 was a shoddy release with appalling controls and dreadful corruption in
places.
Shame, really.


 







Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 HardwareBanter.
The comments are property of their posters.