Thread Tools | Display Modes |
#31
"Les" wrote in message news:bydfc.647$pL6.459@newsfe1-win... They're still releasing AGP 8x versions alongside PCI Express x16. I read somewhere nVidia is doing something with a bridging device while ATI is making totally separate cards, i.e. the R420 is an AGP 8x part and the R423 is a proper PCI Express x16 card. I cannot for the life of me remember where I read it though, sorry. It *could* have been Anandtech.

Right on ATI's site it says that they are the only company making a "True PCI Express card". It's right on their front page. JLC
#32
Yeah. That's one odd feature I don't get. Those power connectors are tied to the same source. I guess the wires can only hold so much. But what about the traces on the power supply? DaveL

"teqguy" wrote in message ... The two power connectors will eventually come down to one... right now testing is only showing that stability is better achieved using four rails instead of two.
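For what it's worth, a rough back-of-the-envelope shows why one connector may not cut it. All the figures below are assumptions for illustration (typical per-pin Molex ratings and a reported board draw on the order of 110 W), not numbers from this thread:

```python
# Sketch: why one 4-pin Molex connector may not be enough.
# Assumed figures: a Molex pin is often rated around 5 A, and the
# 6800 Ultra reportedly draws on the order of 100+ W.

PIN_CURRENT_LIMIT_A = 5.0   # conservative per-pin rating (assumption)
V12 = 12.0
V5 = 5.0

def connector_capacity_w(pins_12v=1, pins_5v=1, limit_a=PIN_CURRENT_LIMIT_A):
    """Max power one Molex connector can deliver within per-pin limits."""
    return pins_12v * V12 * limit_a + pins_5v * V5 * limit_a

single = connector_capacity_w()   # one connector: one 12 V + one 5 V pin
dual = 2 * single                 # two connectors split the load

card_draw_w = 110.0               # assumed board power
print(f"one connector: {single:.0f} W, two: {dual:.0f} W, card: {card_draw_w} W")
```

On these assumed ratings a single connector tops out around 85 W, below the card's draw, so splitting across two connectors (and two supply rails) keeps each wire and trace inside its margin.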
#33
I'd have to agree. It looks like this guy is trying to masquerade anti-ATI sentiment as nonchalance and "no-brainer" nVidia superiority. Sorry, but your weak psychology is definitely not fooling me.

Wrong. I have a RADEON 9700 PRO in an Athlon 64 3000+ box and it's fast enough for everything. I'm a programmer, so I am very interested in the 3.0 shaders; that is the only reason for me to upgrade at all. That's why I don't have a RADEON 9800 XT: it only offers increased speed, and that is something I am not too thrilled about at this point because the 9700 PRO is very nice already. How is that anti-ATI sentiment?

I repeat: I am interested in programming for the 3.0 shaders. If ATI won't deliver, then I have to revert to NV-based products. The GeForce4 was the last NV card I purchased; it's now in a Pentium 4 2.4 GHz box running Mandrake Linux 10.0 Community, and it works great there. I have nothing against nVidia as such. It's just a fact that until the recent rumors and reviews of the GeForce 6800 on various websites, I didn't even think NV was worth crap against ATI's latest offerings. How is that anti-ATI sentiment?

I'm entitled to be as thrilled about 3.0 shaders as I want, and I can post how thrilled I am if I want. And you can make claims about me, but you don't have to be right; in fact, you are entirely mistaken. Thanks for your time.
#34
use at least a 480W PS. That's going to be a very expensive upgrade for a lot of people.

It sure will.

...the power the new card can deliver. Let's hope that Doom 3 runs great on this card. Of course by the time the game finally comes out this card will probably cost $150. JLC

That's a very good point; I was only speaking for myself. I don't play games much at all. We do some Black Hawk Down and Warcraft III TFT multiplayer a couple of times a week, and for that a pretty old card would suffice. It's work that I need the latest features for, and I won't even be paying for the card myself anyway.
#35
It just occurred to me that it may be the traces on the 6800U that nVidia is worried about. Why not just use wider traces? DaveL

"DaveL" wrote in message ... Yeah. That's one odd feature I don't get. Those power connectors are tied to the same source. I guess the wires can only hold so much. But what about the traces on the power supply? DaveL

"teqguy" wrote in message ... The two power connectors will eventually come down to one... right now testing is only showing that stability is better achieved using four rails instead of two.
#36
On Wed, 14 Apr 2004 11:25:45 -0700, G wrote:
There's more to performance than just gaming. And there's more to PCI Express than just the x16 slot that will be used for graphics cards initially. AGP was a hack, and (as others have said) it hit the wall at "4X". PCI Express is a *VERY* well thought out bus that should be a lot better than PCI, PCI-X, and AGP... not to mention things bolted directly to the northbridge. If it helps games a little in the process, that's just gravy.

I was only talking about what PCI Express will do for graphics, which I think will be very little. Of course it is going to be great for applications such as RAID and Gigabit and 10-Gigabit Ethernet. PCI has served us well, but it's time to move on. Lots of good reasons for PCI Express, just not graphics.

I wish that Aureal were still around. One of the problems with the Vortex 2, and more so with the Vortex 3, was that A3D was very heavy on the PCI bus with all the positional information it had to share with the CPU. PCI Express would have gone really well with the Vortex 3. But those *******s at Craptive sent Aureal under with a wave of malicious litigation, and now the tech is sitting in a vault somewhere. Now we can only dream of what could have been... K
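To put rough numbers on the bus comparison above, here is a sketch of peak theoretical bandwidths. These are the nominal spec figures as generally published; real-world throughput is lower, and the PCI/PCI-X figures are shared across every device on the bus:

```python
# Peak theoretical bandwidth of the buses discussed above (MB/s).
# AGP is effectively half-duplex; PCI Express x16 delivers its
# figure in each direction simultaneously (full-duplex).

buses = {
    "PCI (32-bit/33 MHz)":   133,   # shared by all devices on the bus
    "PCI-X (64-bit/133 MHz)": 1066, # shared
    "AGP 4x":                 1066, # half-duplex
    "AGP 8x":                 2133, # half-duplex
    "PCIe x16":               4000, # per direction, per device
}

for name, bw in sorted(buses.items(), key=lambda kv: kv[1]):
    print(f"{name:24s} {bw:5d} MB/s")
```

The point-to-point, full-duplex design is why PCI Express helps RAID and 10-Gigabit Ethernet so much: unlike PCI/PCI-X, devices stop contending for one shared pipe.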
#37
"JLC" wrote in news:fcjfc.142647$K91.357088@attbi_s02:
It says right in that same article that the new cards will take two slots, but it is possible for vendors to come out with single-slot cards. I find it amazing that it says nVidia recommended that their testers use at least a 480W PS. That's going to be a very expensive upgrade for a lot of people, and a lot of guys who think they have a 480+ PS will find that their cheap PS is not up to the task. So the Ultra is gonna start at $499, plus say another $100 for a quality PS. Wow, $599 just to play games that probably don't need a fraction of the power the new card can deliver. Let's hope that Doom 3 runs great on this card. Of course by the time the game finally comes out this card will probably cost $150. JLC

I _really_ want a new system right now. I mean, I'm running a dual P3-800 with Ti4200 video, and it just doesn't cut it for today's games. But the game I know I want is Doom 3, and who knows when it will be out. When it does come out, it's anyone's guess what will be the best video card for the game. There will be the fastest, then there will be the best price/performance cards, a little slower, a lot cheaper, etc. I'm just going to have to wait until the game comes out if I don't want to spend too much money and want to be really sure, making a decision based on real benchmarks of production code and production hardware. But I hate waiting! My current setup is killing me! I'm sure it's not the Ti4200's fault; it's a great card, I'm just too CPU-limited. But again, it will be interesting to see which CPU/video card combo does Doom 3 the best. More waiting! Argh!
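As a rough illustration of why a nominal "480 W" cheap supply can still fall over, here is a sketch of a whole-system power budget. Every component figure below is an assumption for illustration, not a measurement:

```python
# Sketch: a nominal 480 W supply vs. a realistic load budget.
# Component draws are illustrative assumptions, not measurements.

load_w = {
    "GeForce 6800 Ultra":           110,
    "CPU (high-end P4/Athlon 64)":   90,
    "motherboard + RAM":             50,
    "hard drives (x2)":              25,
    "optical drive":                 20,
    "fans, USB, misc":               25,
}

total = sum(load_w.values())   # full-load budget

# A cheap supply may only sustain ~70% of its label rating, and much
# of that rating may sit on the 3.3/5 V rails rather than 12 V.
label_w = 480
sustained = label_w * 0.70

headroom = sustained - total
print(f"budget {total} W, sustained {sustained:.0f} W, headroom {headroom:.0f} W")
```

On these assumed numbers the "480 W" label shrinks to roughly 336 W of sustainable output against a ~320 W peak load, which is exactly the kind of margin where a cheap supply browns out under load.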
#38
"teqguy" wrote in message ...

The bandwidth of AGP 2X can carry a high-definition signal... so I don't understand how you can expect PCI Express to do it any better.

Nope. AGP's upstream bandwidth is only half-duplex. It's not the bandwidth that's the problem. Here's an article that explains it in detail (with further links to PCI Express info as well): "PCI Express and HD Video: Marriage Made in Heaven?" http://www.extremetech.com/article2/...1533061,00.asp

SCSI only operates at 320MB/s. In a RAID 0 stripe, that's roughly 460MB/s. So again... a lot more bandwidth than required.

That's not the point. SCSI controllers don't sit in the AGP slot. If you're switching to comparing PCI Express with PCI/PCI-X, then you have to talk about total bandwidth in the whole system. Besides, SCSI is up to 640MB/s.

And definitely a lot more expensive than using onboard SCSI.

Being onboard has nothing to do with it either. The onboard controller has to be connected somehow; it's on some bus or another even if it's not sitting in a slot.
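The half-duplex point can be made concrete with a toy timing model. This is a simplified sketch that ignores protocol overhead and latency, and the transfer sizes are arbitrary assumptions:

```python
# Toy model: time to move data down to the card AND read results back.
# AGP must serialize the two transfers (its upstream path is not
# independent); PCIe x16 runs both directions at once (full-duplex).

def agp8x_time(down_mb, up_mb, bw=2133.0):
    # one shared channel: transfers happen one after the other
    return (down_mb + up_mb) / bw

def pcie_x16_time(down_mb, up_mb, bw=4000.0):
    # independent lanes each way: the slower direction dominates
    return max(down_mb / bw, up_mb / bw)

down, up = 512.0, 512.0   # MB of video frames each way (assumption)
print(f"AGP 8x: {agp8x_time(down, up)*1000:.0f} ms, "
      f"PCIe x16: {pcie_x16_time(down, up)*1000:.0f} ms")
```

For a symmetric workload like HD video capture plus display, the readback traffic stalls the downstream traffic on AGP, while on PCI Express the two streams simply overlap, which is the article's "marriage made in heaven" argument.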
#39
"chrisv" wrote in message ...

"teqguy" wrote: The FX series is 28 to 23, ranging from the 5950 to the 5200.

Better late than never, I guess. If you have nothing to contribute, shut up. If you're just going to post drivel, shut up.

What's drivel is your obsessive need to critique everything anyone ever says. Wrong again. If I recall, you are the same chrisv who stated over and over in alt.computer.storage how IBM never had a problem with its last batch of Deathstar hard drives.

Ignore the troll, folks; he knows not what he says.
#40
On Wed, 14 Apr 2004 19:13:30 GMT, "teqguy" wrote:
Most MPEG encoding is processor-dependent... I wish developers would start making applications that let the graphics card do video encoding, instead of dumping the work on the processor.

I'm pretty sure I read somewhere that the (new & improved) Prescott processor has been given a special hard-wired instruction set dedicated to encoding video, so that should speed things up somewhat. I remember reading an article over a year ago in which Intel gave a demo of a future-release CPU apparently running three full-screen HD videos simultaneously, rotating on a 3D cube. The processor prototype was not specified, but it may have been a Tejas, as it was rated at 5GHz. Ricardo Delazy