6800GT vs. X800Pro...with an eye to the future
Hey y'all,
I'm reentering the gaming world after a long hiatus. How long? I'm replacing a 2xP2/300, 384MB, Voodoo3, AWE64 rig! I'm going to ask the same question that everyone is these days, but hopefully a little more intelligently than "DudE! My mOm say she'll h00k me up with eItheR. Which iZ da bizzy-b0mB?" I've been reading everything I can, and I have some very specific questions (the answer to many of which will be "only time will tell", I suspect). I'd appreciate logical and informed responses (what? On Usenet?). The email address herein is legit (after you remove the obvious), if you prefer to stay out of the fray.

The new rig is an Athlon XP 3200+ with 1GB DDR400. This is not up for debate. The price was *very* right and it's already purchased (~$225 for CPU, cooler, case, motherboard, 400W power supply, tax and shipping). I'm not very interested in overclocking anything. The question is which $400 GPU to put in it, the 6800GT or the X800Pro, if I'm planning to have this box as long as I did my last. Availability is not an issue...I happen to have both cards right here in front of me (an ATI and a PNY, both still in cellophane). Yes, I *am* a bitch. So, with *only* the X800Pro and 6800GT in mind...

Performance: We've all seen the Doom3 benchmarks. Big whoop...this is not the only game I'll be playing. On the other hand, a great engine will get a lot of reuse. Is it realistic to believe that ATI will a) be able to, and b) choose to fix the OpenGL performance of the X800Pro? Or is it a) crippled by its 12-pipeline architecture and lack of Shader 3.0 support, and/or b) doomed at birth by the promise of a near-term downclocked 16-pipe card (the so-called X800GT)? And in the other camp, plenty of benchmarks show the two cards pretty much neck and neck in DirectX games today, with perhaps a slight advantage to ATI. Will DirectX 9.0c (and its Shader 3.0 support) change much? How important is Shader 3.0 support, really?

Noise: Anybody with real-world experience with both? I understand the 6800GT is loud. I spend my days in climate-controlled server rooms, so a little machine whirr ain't no big thing. On the other hand, the rig will be left on pretty much all the time in a very open-architecture house. Will I hear it in the next room?

Hacks: Not that I'll be jacking around with my $400 toy any time soon, but it's widely reported that BIOS flashes are a poor man's upgrade. As I understand it, the chips that don't pass muster to be part of an XT / Ultra PCB are then tested to lower (i.e. Pro / GT) standards. So the probability of flashing actually improving anything depends on how 'broken' the individual GPU is? Furthermore, my X800 is probably not a VIVO version, which I understand means it is not flashable to an XT regardless? Whereas all GTs are capable? Has anyone actually performed a flash on either of these cards?

What else bears consideration? I've got a couple weeks to make a decision, and I know they're both great cards. Nor am I particularly loyal to (or vengeful against) either manufacturer. Thanks for any and all input, Dookie |
Well from almost every test I've seen, the 6800GT destroys the x800 pro by
several fps, not just a few. The 6800GT also has several features, like PS3.0, that will be enabled and optimized in future drivers, so performance will only go up. I think the more future-proof card is the 6800GT. |
> Performance: Is it realistic to believe that ATI will a) be able to, and b) choose to fix the OpenGL performance of the X800Pro?

The obvious answer is "no". If they could, they would have long ago. It's not as if ATI is only just finding out that they do OpenGL poorly.

> Hacks: As I understand it, the chips that don't pass muster to be part of an XT / Ultra PCB are then tested to lower (i.e. Pro / GT) standards. So the probability of flashing actually improving anything depends on how 'broken' the individual GPU is?

Don't believe everything you hear. The X800Pro CANNOT be turned into an X800XT by the so-called '16-pipe fix' or a BIOS flash. I tried it on my X800Pro, so I know what I'm talking about. Aside from adjusting the clocks with ATITool, however it runs OOTB is the best it will ever run. You do have to use ATITool to adjust the clocks so the card will run at full speed; they are terribly underclocked OOTB. Jeff B |
"JB" wrote in message news:fYWLc.163581$XM6.52882@attbi_s53... Performance: We've all seen the Doom3 benchmarks. Big whoop...this is not the only game I'll be playing. On the other hand, a great engine will get a lot of reuse. Is it realistic to believe that ATI will a) be able to, and b) choose to fix the OpenGL performance of the X800Pro. The obvious answer is "no". If they could, they would have long ago. It's not like ATI is just now finding out they do OGL poorly. Noise: Anybody with real world experience with both? I understand the 6800GT is loud. I spend my days in climate-controlled server rooms, so a little machine whirr ain't no big thing. On the other hand, the rig will be left on pretty much all the time in a very open-architecture house. Will I hear it in the next room? Hacks: Not that I'll be jacking around with my $400 toy any time soon, but it's widely reported that BIOS flashes are a poor man's upgrade. As I understand it, the chipsets that don't pass muster to be part of an XT / Ultra PCB are then tested to lower (ie: Pro / GT) standards. So the probability of flashing actually improving anything depends on how 'broken' the individual GPU is? Don't believe everything you hear. The x800pro CANNOT be turned into a x800xt by the so-called '16 pipe fix', or BIOS flash. I tried it on my x800pro, so I know what I'm talking about. Aside from adjusting the clocks with ATI tool, however it runs OOTB is the best it will ever run. Of course you must use ATI tool to adjust the clocks so the card will run at full speed, they are terribly underclocked OOTB. Jeff B Actually is is known that ATI is in the progress of redoing the Opengl Dirvers. Betatesters have said this and also people on the Catalyst team. Ask around the rage3d forums for more info. Yes the x800 pro can be hacked and then have its bios flashed to be a x800xt, but it wont work on every card. You where just unclucky. When you hack it your enabling the extra 4 pipelines. 
Most likley one or more of those extra 4 pipelines was defective, and that is why it didnt work for you. The lucky ones with the working 4 pipelines got there hack to work fine. Its more of a 50/50 chance that the hack will work. Bean |
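[For what it's worth, the "50/50" figure quoted above is at least consistent with a simple yield model: the Pro-to-XT mod only succeeds if all four fused-off pipelines happen to be defect-free. A minimal sketch of that arithmetic, where the per-pipeline probability is a made-up illustration, not a measured number:]

```python
# Hypothetical yield model for the X800Pro -> X800XT pipeline unlock.
# Assumption (illustrative only): the mod works only if every one of the
# 4 disabled pipelines is defect-free, each independently with probability p.

def unlock_success_chance(p_pipeline_ok: float, extra_pipelines: int = 4) -> float:
    """Chance that all of the extra pipelines are functional."""
    return p_pipeline_ok ** extra_pipelines

# If each fused-off pipeline has roughly an 84% chance of being good,
# the overall odds of a working 16-pipe unlock come out near 50/50:
print(round(unlock_success_chance(0.84), 2))  # prints 0.5
```

[Under that toy model, even a fairly high per-pipeline yield gives coin-flip odds for the full unlock, which matches the mixed results people report.]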
Bean wrote:
"JB" wrote in message news:fYWLc.163581$XM6.52882@attbi_s53... Performance: We've all seen the Doom3 benchmarks. Big whoop...this is not the only game I'll be playing. On the other hand, a great engine will get a lot of reuse. Is it realistic to believe that ATI will a) be able to, and b) choose to fix the OpenGL performance of the X800Pro. The obvious answer is "no". If they could, they would have long ago. It's not like ATI is just now finding out they do OGL poorly. Noise: Anybody with real world experience with both? I understand the 6800GT is loud. I spend my days in climate-controlled server rooms, so a little machine whirr ain't no big thing. On the other hand, the rig will be left on pretty much all the time in a very open-architecture house. Will I hear it in the next room? Hacks: Not that I'll be jacking around with my $400 toy any time soon, but it's widely reported that BIOS flashes are a poor man's upgrade. As I understand it, the chipsets that don't pass muster to be part of an XT / Ultra PCB are then tested to lower (ie: Pro / GT) standards. So the probability of flashing actually improving anything depends on how 'broken' the individual GPU is? Don't believe everything you hear. The x800pro CANNOT be turned into a x800xt by the so-called '16 pipe fix', or BIOS flash. I tried it on my x800pro, so I know what I'm talking about. Aside from adjusting the clocks with ATI tool, however it runs OOTB is the best it will ever run. Of course you must use ATI tool to adjust the clocks so the card will run at full speed, they are terribly underclocked OOTB. Jeff B Actually is is known that ATI is in the progress of redoing the Opengl Dirvers. Betatesters have said this and also people on the Catalyst team. Ask around the rage3d forums for more info. Yes the x800 pro can be hacked and then have its bios flashed to be a x800xt, but it wont work on every card. You where just unclucky. When you hack it your enabling the extra 4 pipelines. 
Most likley one or more of those extra 4 pipelines was defective, and that is why it didnt work for you. The lucky ones with the working 4 pipelines got there hack to work fine. Its more of a 50/50 chance that the hack will work. Bean Yes because it only works on, VIVO equiped cards. I thought that was common knowledge by now? |
> Yes, the X800Pro can be hacked and then have its BIOS flashed to be an X800XT, but it won't work on every card.

Believe what you want, I have proof that the hack doesn't work. Jeff B |
> Yes, because it only works on VIVO-equipped cards. I thought that was common knowledge by now?

What do you mean, VIVO-equipped cards? All X800Pros are the same, right? Jeff B |
> Actually, it is known that ATI is in the process of redoing the OpenGL drivers. Beta testers have said this, and also people on the Catalyst team.

So what are you saying, Doom3 etc. is about to suddenly run great on ATI hardware? So the benchmarks run by id Software mean nothing?? LOL! If ATI knew how to fix the problem, they would have done so long ago. To think otherwise is to set yourself up for a big disappointment. Jeff B |
Uh, no, you have proof that you couldn't do it, either because of an unlucky card or a lack of knowledge. Mike

"JB" wrote in message news:920Mc.7939$eM2.5675@attbi_s51...

> Believe what you want, I have proof that the hack doesn't work. Jeff B |
Mike P wrote:

> Uh, no, you have proof that you couldn't do it, either because of an unlucky card or a lack of knowledge.

No, you are going by "some guy said". I actually did the mod; you didn't. I have hands-on experience; you have nothing. Therefore, by definition, I'm right and you're wrong. Jeff B |
Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
HardwareBanter.com