#11
As far as the reference Nvidia cards go... I'm pretty sure we'll start out with the dustbuster again, at least until someone comes up with a more effective cooling method. That kind of sucks with the K8V, if anyone has noticed where the FireWire connector sits on that motherboard... it might fit perfectly though, you never know before you try.

I heard that NV40 boards will have _two_ power connectors...? When RADEONs came with just one, I thought that was already one too many, LOL, but since it's inside the case, who cares at the end of the day. But two? Huh! 200+ million transistors sure suck some power... but surely a 350 watt supply with only 5 IDE devices connected should be enough? ;-) It would suck to find out suddenly (from the smoke coming out of the PSU) that, oh ****, looks like 450-500 watts was needed after all... I find that amazingly unlikely, but since someone else in this thread was worried about his PSU being sufficient, I had to ask.

NV40 would rock for programming, because it's the only way for quite a while to try out VS 3.0 and PS 3.0, if I'm not mistaken? I read in this NG that ATI won't have these in their new chip. Why the hell not!? Peace.
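A rough back-of-the-envelope budget shows why the 350 W question isn't settled by the graphics card alone. The figures below are purely illustrative guesses for a high-end 2004 box, not measured values:

```python
# Rough PSU budget sketch (all draws in watts are hypothetical, illustrative
# figures for a 2004-era high-end system, not measured values).
components = {
    "CPU": 90,                  # assumed high-end CPU draw
    "motherboard + RAM": 40,
    "graphics card": 110,       # assumed dual-connector NV40-class draw
    "IDE devices (5x)": 5 * 12, # ~12 W each under load
    "fans / misc": 20,
}

total = sum(components.values())
headroom = 350 - total
print(f"Estimated draw: {total} W, headroom on a 350 W PSU: {headroom} W")
```

With these guesses the system lands around 320 W, which is uncomfortably close to a 350 W rating, especially since cheap supplies rarely deliver their label wattage continuously.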
#12
Ah, http://frankenstein.evilgeniuslabs.c...nv40/news.html I see from the pictures (assuming they're not fakes) that the card should fit reasonably into a "single" AGP (8x) slot, more or less... that's nice, but the best part of this debacle is the two DVI ports. That's the part I like most; I'm currently using DVI + DB25 for two TFTs. Looks like a winner to me compared to ATI. Performance alone isn't what turns me on; the RADEON 9700 PRO through RADEON 9800 XT are plenty fast as they come. The enhanced feature set is what turns me on, especially these two:
- 3.0 shaders (vertex samplers will be SO cool)
- 32-bit precision for the whole pipeline, from vertex to output fragment, v. cool
The rest is yada yada yada... but those two features are what 'do it', at least for me from a coder's point of view. Extra performance is so yesterday. ;-)
#13
No, really, I can't believe my eyes: after a two-year trip to the ATI side, I would again consider NV even a candidate, not to mention the #1 choice for a gfx card upgrade. Must suck to base your choices on brand name, aka the Fanboy's Choice. Looking at the offerings, this is a no-brainer for me.
#14
"teqguy" wrote:
NightSky 421 wrote: Regardless of if someone wants the new high-end nVidia or ATI product, I've read that a person better have a monster power supply and excellent case cooling before even considering such cards. I also wonder how loud the fans on these new cards are going to need to be. It'd be interesting to see what they can do with regards to cooling and power consumption on future video cards too - I see this as getting to be more and more of a problem with time. The power consumption should stay below 15v. The Geforce FX does NOT use the 12v rail, for anyone wondering. All 4 pins are connected for potential usage, but the overall consumption never raises above 5.5v so 17v is not neccessary. Surely you can't believe that we can take the advice of someone who thinks that power "consumption" is measured in Volts. What you wrote is complete drivel, sorry. |
#15
"teqguy" wrote:
The best possible optimization that could ever be made, would be to start manufacturing motherboards with sockets for a GPU and either sockets or slots for video memory. This would allow for motherboards to potentially reduce in size, while increasing in performance and upgradability. The price would increase, but it would be worth it. No it wouldn't. |
#16
"Shep©" wrote in message news On Wed, 14 Apr 2004 00:10:26 +0000 As truth resonates honesty K wrote : On Tue, 13 Apr 2004 20:56:48 +0100, Shep© wrote: *PCI Express x16, AGP 8x support Looks like new mother boards required? If there is AGP 8x support, why would you need a new motherboard? K Because it's my understanding that although the new protocol/cards support AGP 8X this is merely a data rate comparison and the new cards will only fit a,"PCI-Express" slot,not an AGP one. http://www.pcstats.com/articleview.cfm?articleID=1087 HTH -- Free Windows/PC help, http://www.geocities.com/sheppola/trouble.html email shepATpartyheld.de Free songs to download and,"BURN" :O) http://www.soundclick.com/bands/8/nomessiahsmusic.htm They still releasing AGP 8x versions along side PCI x16. I read somewhere nvidia is doing something with a bridging device while ATI is making totally seperate cards, ie R420 is agp 8x and R423 is a proper PCI x16 card. I cannot for the life of me remember where I read it though sorry. It *could* have been anandtech |
#17
"NV55" wrote in message m... the following is ALL quote: http://frankenstein.evilgeniuslabs.c...nv40/news.html Tuesday, April 13, 2004 NVIDIA GeForce 6800 GPU family officially announced - Cormac @ 17:00 It's time to officially introduce the new GPU generation from NVIDIA and shed the light on its architecture and features. So, the GeForce 6800 GPU family, codenamed NV40, today officially entered the distribution stage. Initially it will include two chips, GeForce 6800 Ultra and GeForce 6800, with the same architecture. These are the key innovations introduced in NVIDIA's novelties: *16-pipeline superscalar architecture with 6 vertex modules, DDR3 support and *real 32-bit pipelines *PCI Express x16, AGP 8x support *222 million transistors *400MHz core clock *Chips made by IBM *0.13µm process Isn't it time for NVidia to use 0.09um process? How could they put some many features if still using 0.13 um process? |
#18
K wrote in message:
> I have a gut feeling that PCI Express will do very little for performance, just like AGP before it. Nothing can substitute for lots of fast RAM on the video card to avoid shipping textures across to the much slower system RAM. You could have the fastest interface imaginable for your vid card; it would do little to make up for the bottleneck that is your main memory.

But what about things that don't have textures at all? PCI Express is not only bi-directional but full duplex as well. The NV40 might even use this to great effect, with its built-in hardware-accelerated MPEG encoding/decoding plus "HDTV support" (which I assume means it natively supports 1920x1080 and 1280x720 without having to use PowerStrip). The lower-cost version should be sweet for Shuttle-sized media PCs that will finally be able to "tivo" HDTV.

I can also see the 16X slot being used in servers for other things besides graphics. Maybe in a server you'd want your $20k SCSI RAID controller in it. Or in a cluster box, a 10 gigabit NIC. There's more to performance than just gaming, and there's more to PCI Express than just the 16X slot that graphics cards will use initially. AGP was a hack, and (as others have said) it hit the wall at "4X". PCI Express is a *VERY* well thought out bus that should be a lot better than PCI, PCI-X, and AGP, not to mention things bolted directly to the Northbridge. If it helps games a little in the process, that's just gravy.
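The bandwidth gap behind the AGP-vs-PCIe argument above is easy to put in numbers. A quick sketch of the peak theoretical figures (AGP 8x on its 32-bit bus, first-generation PCI Express x16 with 8b/10b encoding):

```python
# Peak theoretical bandwidth, AGP 8x vs first-generation PCI Express x16.

# AGP 8x: 32-bit (4-byte) bus, 66.67 MHz base clock, 8 transfers per clock,
# and only one direction can move data at a time (half duplex).
bus_bytes = 32 // 8
agp8x_mb_s = bus_bytes * 66.67 * 8            # ~2133 MB/s

# PCIe x16 gen 1: 16 lanes at 2.5 GT/s with 8b/10b encoding (8 payload bits
# per 10 transferred) -> 250 MB/s per lane per direction, and both
# directions run at full rate simultaneously (full duplex).
pcie_lane_mb_s = 2.5e9 * 8 / 10 / 8 / 1e6     # 250 MB/s per lane
pcie_x16_per_dir = 16 * pcie_lane_mb_s        # 4000 MB/s each way
pcie_x16_aggregate = 2 * pcie_x16_per_dir     # 8000 MB/s both ways at once

print(f"AGP 8x:   {agp8x_mb_s:.0f} MB/s (half duplex)")
print(f"PCIe x16: {pcie_x16_per_dir:.0f} MB/s per direction, "
      f"{pcie_x16_aggregate:.0f} MB/s aggregate")
```

So the x16 slot is roughly double AGP 8x in one direction, and the full-duplex aggregate is nearly 4x, which is exactly the property the post speculates could help uses like HDTV capture, where data flows both into and out of the card.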
#19
> > The best possible optimization that could ever be made would be to start manufacturing motherboards with sockets for a GPU and either sockets or slots for video memory. This would allow motherboards to potentially shrink in size while increasing in performance and upgradability. The price would increase, but it would be worth it.
> No it wouldn't.

haha! I agree completely. Video cards have reached such complexity that it's doubtful a single company could produce both successfully. Not to mention the question of upgradeability, which is why we have PCI/AGP in the first place. rms
#20
> No, really, I can't believe my eyes: after a two-year trip to the ATI side, I would again consider NV even a candidate, not to mention the #1 choice for a gfx card upgrade. Must suck to base your choices on brand name, aka the Fanboy's Choice. Looking at the offerings, this is a no-brainer for me.

pfft. You don't even know what the ATI offering is yet, much less are you able to buy a 6800 until well into next month. rms