A computer components & hardware forum. HardwareBanter

6800GT vs. X800Pro...with an eye to the future



 
 
  #1  
Old July 22nd 04, 09:35 PM
dookie
external usenet poster
 
Posts: n/a
Default 6800GT vs. X800Pro...with an eye to the future

Hey y'all,

I'm reentering the gaming world after a long hiatus. How long? I'm
replacing a 2xP2/300, 384MB, Voodoo3, AWE64 rig! I'm going to ask the same
question that everyone is asking these days, but hopefully a little more
intelligently than "DudE! My mOm say she'll h00k me up with eItheR. Which
iZ da bizzy-b0mB?" I've been reading everything I can, and I have some very
specific questions (the answer to many of which will be "only time will
tell" I suspect). I'd appreciate logical and informed responses (what? On
Usenet?). The email address herein is legit (after you remove the obvious),
if you prefer to stay out of the fray.

The new rig is an Athlon XP 3200+ with 1gb DDR400. This is not up for
debate. The price was *very* right and it's already purchased (~$225 for
CPU, cooler, case, motherboard, 400w power supply, tax and shipping). I'm
not very interested in overclocking anything. The question is which $400
GPU to put in it, the 6800GT or the X800Pro, if I'm planning to have this
box as long as I did my last. Availability is not an issue...I happen to
have both cards right here in front of me (an ATI and a PNY, both still in
cellophane). Yes, I *am* a bitch.

So, with *only* the X800Pro and 6800GT in mind...

Performance:
We've all seen the Doom3 benchmarks. Big whoop...this is not the only game
I'll be playing. On the other hand, a great engine will get a lot of reuse.
Is it realistic to believe that ATI will a) be able to, and b) choose to fix
the OpenGL performance of the X800Pro? Or is it a) crippled by its
12-pipeline architecture and lack of Shader 3.0 support, and/or b) doomed at
birth by the promise of a near-term downclocked 16-pipe card (the so-called
X800GT)?
And in the other camp, plenty of benchmarks show the two cards pretty much
neck and neck in DirectX games today, with perhaps a slight advantage to
ATI. Will DirectX 9.0c (and its Shader 3.0 support) change much? How important is
Shader 3.0 support really?

Noise:
Anybody with real world experience with both? I understand the 6800GT is
loud. I spend my days in climate-controlled server rooms, so a little
machine whirr ain't no big thing. On the other hand, the rig will be left
on pretty much all the time in a very open-architecture house. Will I hear
it in the next room?

Hacks:
Not that I'll be jacking around with my $400 toy any time soon, but it's
widely reported that BIOS flashes are a poor man's upgrade. As I understand
it, the chips that don't pass muster to be part of an XT / Ultra PCB are
then tested to lower (i.e., Pro / GT) standards. So the probability of
flashing actually improving anything depends on how 'broken' the individual
GPU is? Furthermore, my X800 is probably not a VIVO version, which I
understand means it is not flashable to an XT regardless? Whereas all GTs
are capable? Has anyone actually performed a flash on either of these
cards?

What else bears consideration? I've got a couple weeks to make a decision
and I know they're both great cards. Nor am I particularly loyal to (or
vengeful against) either manufacturer.

Thanks for any and all input,

Dookie


  #2  
Old July 22nd 04, 09:52 PM
Philburg2

Well, from almost every test I've seen, the 6800GT beats the X800 Pro by a
wide FPS margin, not just a few frames. The 6800GT also has several
features, like PS3.0, that will be enabled and optimized in future drivers,
so performance will only go up. I think the more future-proof card is the
6800GT.

"dookie" wrote in message
.com...
[quoted original post snipped]

  #3  
Old July 22nd 04, 11:11 PM
JB


Performance:
We've all seen the Doom3 benchmarks. Big whoop...this is not the only game
I'll be playing. On the other hand, a great engine will get a lot of reuse.
Is it realistic to believe that ATI will a) be able to, and b) choose to fix
the OpenGL performance of the X800Pro.


The obvious answer is "no". If they could, they would have long ago.
It's not like ATI is just now finding out they do OGL poorly.


Hacks:
Not that I'll be jacking around with my $400 toy any time soon, but it's
widely reported that BIOS flashes are a poor man's upgrade. As I understand
it, the chipsets that don't pass muster to be part of an XT / Ultra PCB are
then tested to lower (ie: Pro / GT) standards. So the probability of
flashing actually improving anything depends on how 'broken' the individual
GPU is?


Don't believe everything you hear. The X800 Pro CANNOT be
turned into an X800 XT by the so-called '16 pipe fix' or BIOS flash. I
tried it on my X800 Pro, so I know what I'm talking about. Aside from
adjusting the clocks with ATITool, however it runs out of the box is the
best it will ever run. Of course, you must use ATITool to adjust the
clocks so the card will run at full speed; they are terribly underclocked
out of the box.

Jeff B

  #4  
Old July 23rd 04, 12:05 AM
Bean


"JB" wrote in message
news:fYWLc.163581$XM6.52882@attbi_s53...

[snip]

Don't believe everything you hear. The x800pro CANNOT be
turned into a x800xt by the so-called '16 pipe fix', or BIOS flash. I
tried it
on my x800pro, so I know what I'm talking about.

Jeff B


Actually, it is known that ATI is in the process of redoing the OpenGL
drivers. Beta testers have said this, and so have people on the Catalyst
team. Ask around the Rage3D forums for more info. Yes, the X800 Pro can be
hacked and then have its BIOS flashed to become an X800 XT, but it won't
work on every card. You were just unlucky. When you hack it, you're
enabling the extra 4 pipelines. Most likely one or more of those extra 4
pipelines was defective, and that is why it didn't work for you. The lucky
ones with 4 working pipelines got their hack to work fine. It's roughly a
50/50 chance that the hack will work.

Bean
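[Editor's note: Bean's "50/50" figure is consistent with a simple independence model — the mod succeeds only if all 4 re-enabled pipelines happen to be defect-free. A quick sketch; the ~16% per-pipe defect rate is a back-of-envelope assumption, not ATI yield data.]

```python
# Hypothetical model of the 16-pipe unlock: each of the 4 disabled
# pipelines is assumed independently defective with probability p_defect,
# and the hack works only if all 4 happen to be good.
def unlock_success_probability(p_defect: float, extra_pipes: int = 4) -> float:
    """P(every re-enabled pipeline is functional) = (1 - p_defect)^extra_pipes."""
    return (1.0 - p_defect) ** extra_pipes

# A ~16% per-pipe defect rate reproduces Bean's rough 50/50 odds:
print(round(unlock_success_probability(0.159), 2))  # -> 0.5
```

Under this model, JB's failed flash and the other posters' successes are both expected outcomes: whether the mod works depends entirely on whether the factory disabled those 4 pipes for cause or just for market segmentation.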


  #5  
Old July 23rd 04, 01:39 AM
Andrew MacPherson

In article fYWLc.163581$XM6.52882@attbi_s53, (JB) wrote:

The x800pro CANNOT be turned into a x800xt by
the so-called '16 pipe fix', or BIOS flash


I think the successes come from x800Pro *VIVO* cards. Mine (a European
brand, Club) has certainly flashed to 16 pipes and benchmarks improved
nicely even though it doesn't quite handle XT core speeds (memory goes all
the way to 600 though without obvious issues... other than heat,
obviously). This probably explains why it has an XT sticker covered over
by a Pro sticker on the heatsink.

I wouldn't want anyone to rely on this though, even if early indications
show a lot of success with VIVO cards. The 9800se VIVO I bought a while
back was supposed to softmod to a 9800Pro and it most certainly didn't.
Its extra pipelines were disabled for a reason... they were totally
f*cked :-)

Anyway, to the original poster... I switched to ATI last year when I
bought a 2nd hand 9700Pro. Now I don't particularly care about getting
every last FPS out of D3, and it's quite possible Nvidia have won the
hardware crown back in this generation. But I was very impressed with the
9700Pro... and all it lacked was some speed when engaging x4FSAA. I bought
the x800 for that reason (ok, and the fact that D3's coming :-) and I'd
have no problem recommending it.

If I wasn't already on board the ATI bus I might well have been swayed by
the Nvidia benchmarks, but I also haven't forgiven them for buying 3dfx
and not using their vastly superior FSAA methods (slow or not). The V5
still has a soft spot in my heart... just next to the empty place in my
wallet ;-)

Andrew McP
  #6  
Old July 23rd 04, 02:18 AM
Dr. Richard Cranium

3dfx!! Right on!! Nvidia sucks for dragging 3dfx through the muck and
mire with false advertising and lawsuits that were intended just to deplete
3dfx's bank accounts so Nvidia could purchase the 3dfx carcass. Gawd, that
really ****es me off. !!!!!

http://www.smokeypoint.com/3dfx.htm

memory flogger:
http://www.smokeypoint.com/flash.htm

a fortiori

** No Fate **

cheers,
dracman
Tomb Raider: Shotgun City
http://www.smokeypoint.com/tomb.htm
http://www.smokeypoint.org/traod/traod.html
http://www.smokeypoint.com/tombraide...r1pictures.htm
http://www.smokeypoint.com/tombraide...1midaspics.htm
http://www.smokeypoint.com/uzi.htm
http://www.smokeypoint.com/tomb2.htm
http://www.smokeypoint.com/medipak.htm

**savegame editors all versions Tomb Raider & TRAOD godmode
http://www.smokeypoint.com/tr2code.htm

http://www.smokeypoint.org/farCry.htm

http://www.smokeypoint.com/My_PC.htm

** Win2k and winXP hi-res with TR1
http://www.smokeypoint.com/glidos.htm

** Tomb Raider 1 add on UB levels
http://www.smokeypoint.com/tomb2.htm#Tova

** GTA III vice City Character MOD
http://www.smokeypoint.com/uzi.htm#gta3



.... so much of me...



"Andrew MacPherson" wrote in message
ddress_disguised...
[quoted text snipped]






  #8  
Old July 23rd 04, 04:00 AM
Minotaur

Bean wrote:

"JB" wrote in message
news:fYWLc.163581$XM6.52882@attbi_s53...

Performance:
We've all seen the Doom3 benchmarks. Big whoop...this is not the only


game

I'll be playing. On the other hand, a great engine will get a lot of


reuse.

Is it realistic to believe that ATI will a) be able to, and b) choose to


fix

the OpenGL performance of the X800Pro.


The obvious answer is "no". If they could, they would have long ago.
It's not like ATI is just now finding out they do OGL poorly.



Noise:
Anybody with real world experience with both? I understand the 6800GT


is

loud. I spend my days in climate-controlled server rooms, so a little
machine whirr ain't no big thing. On the other hand, the rig will be


left

on pretty much all the time in a very open-architecture house. Will I


hear

it in the next room?



Hacks:
Not that I'll be jacking around with my $400 toy any time soon, but it's
widely reported that BIOS flashes are a poor man's upgrade. As I


understand

it, the chipsets that don't pass muster to be part of an XT / Ultra PCB


are

then tested to lower (ie: Pro / GT) standards. So the probability of
flashing actually improving anything depends on how 'broken' the


individual

GPU is?


Don't believe everything you hear. The x800pro CANNOT be
turned into a x800xt by the so-called '16 pipe fix', or BIOS flash. I
tried it
on my x800pro, so I know what I'm talking about. Aside from adjusting
the clocks with ATI tool, however it runs
OOTB is the best it will ever run. Of course you must use ATI tool to
adjust the clocks so the card will run at full speed, they are
terribly underclocked OOTB.

Jeff B



Actually is is known that ATI is in the progress of redoing the Opengl
Dirvers. Betatesters have said this and also people on the Catalyst team.
Ask around the rage3d forums for more info. Yes the x800 pro can be hacked
and then have its bios flashed to be a x800xt, but it wont work on every
card. You where just unclucky. When you hack it your enabling the extra 4
pipelines. Most likley one or more of those extra 4 pipelines was defective,
and that is why it didnt work for you. The lucky ones with the working 4
pipelines got there hack to work fine. Its more of a 50/50 chance that the
hack will work.

Bean



Yes, because it only works on VIVO-equipped cards.
I thought that was common knowledge by now?
  #9  
Old July 23rd 04, 04:59 AM
JB


Yes the x800 pro can be hacked
and then have its bios flashed to be a x800xt, but it wont work on every
card.


Believe what you want, I have proof that the hack doesn't work.

Jeff B

  #10  
Old July 23rd 04, 05:00 AM
JB



Yes, because it only works on VIVO-equipped cards.
I thought that was common knowledge by now?


What do you mean, VIVO-equipped cards? All X800 Pros are the same,
right?

Jeff B

 








Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 HardwareBanter.
The comments are property of their posters.