July 22nd 04, 09:52 PM
Philburg2

Well, from almost every test I've seen, the 6800GT beats the X800 Pro by
several fps, not just a few. The 6800GT also has features like PS3.0 that
will be enabled and optimized in future drivers, so performance will only
go up. I think the 6800GT is the more future-proof card.

"dookie" wrote in message
.com...
Hey y'all,

I'm reentering the gaming world after a long hiatus. How long? I'm
replacing a 2xP2/300, 384mb, Voodoo3, AWE64 rig! I'm going to ask the same
question that everyone is these days, but hopefully a little more
intelligently than "DudE! My mOm say she'll h00k me up with eItheR. Which
iZ da bizzy-b0mB?" I've been reading everything I can, and I have some very
specific questions (the answer to many of which will be "only time will
tell" I suspect). I'd appreciate logical and informed responses (what? On
Usenet?). The email address herein is legit (after you remove the obvious),
if you prefer to stay out of the fray.

The new rig is an Athlon XP 3200+ with 1gb DDR400. This is not up for
debate. The price was *very* right and it's already purchased (~$225 for
CPU, cooler, case, motherboard, 400w power supply, tax and shipping). I'm
not very interested in overclocking anything. The question is which $400
GPU to put in it, the 6800GT or the X800Pro, if I'm planning to have this
box as long as I did my last. Availability is not an issue...I happen to
have both cards right here in front of me (an ATI and a PNY, both still in
cellophane). Yes, I *am* a bitch.

So, with *only* the X800Pro and 6800GT in mind...

Performance:
We've all seen the Doom3 benchmarks. Big whoop...this is not the only game
I'll be playing. On the other hand, a great engine will get a lot of reuse.
Is it realistic to believe that ATI will a) be able to, and b) choose to
fix the OpenGL performance of the X800Pro? Or is it a) crippled by its
12-pipeline architecture and lack of Shader 3.0 support, and/or b) doomed
at birth by the promise of a near-term declocked 16-pipe card (the
so-called X800GT)?
And in the other camp, plenty of benchmarks show the two cards pretty much
neck and neck in DirectX games today, with perhaps a slight advantage to
ATI. Will DirectX 9.0c (and its Shader 3.0 support) change much? How
important is Shader 3.0 support really?

Noise:
Anybody with real-world experience with both? I understand the 6800GT is
loud. I spend my days in climate-controlled server rooms, so a little
machine whirr ain't no big thing. On the other hand, the rig will be left
on pretty much all the time in a very open-architecture house. Will I hear
it in the next room?

Hacks:
Not that I'll be jacking around with my $400 toy any time soon, but it's
widely reported that BIOS flashes are a poor man's upgrade. As I understand
it, the chipsets that don't pass muster to be part of an XT / Ultra PCB are
then tested to lower (i.e. Pro / GT) standards. So the probability of
flashing actually improving anything depends on how 'broken' the individual
GPU is? Furthermore, my X800 is probably not a VIVO version, which I
understand means it is not flashable to an XT regardless? Whereas all GTs
are capable? Has anyone actually performed a flash on either of these
cards?

What else bears consideration? I've got a couple weeks to make a decision,
and I know they're both great cards. I'm not particularly loyal to (or
vengeful against) either manufacturer.

Thanks for any and all input,

Dookie