#1
Gigabyte creates dual-GPU graphics card
http://www.tomshardware.com/hardnews...16_115811.html
Chicago (IL) - Gigabyte will announce Friday a graphics card running two graphics processors on one board. According to sources, the SLI card will lift current 3DMark2003 record levels by a significant margin while being priced lower than ATI's and Nvidia's single-GPU high-end cards.

If two graphics cards in one system are too expensive or simply not fast enough, Gigabyte's new 3D1 board may be worth a serious look. Sources told Tom's Hardware Guide that the company is preparing to launch a dual-GPU card Friday, saying that it will "revise the VGA performance ranking".

The card integrates two Nvidia GeForce 6600 GT graphics processors and is the first 6600 GT card on the market to offer a total of 256 MByte of DDR3 memory and a 256-bit memory interface, according to the manufacturer. The card is cooled by two on-board fans.

The 3D1's two processors communicate through Nvidia's SLI interface and achieved 14,293 points in 3DMark2003, sources at Gigabyte said. This would be not only almost twice the performance of a regular 6600 GT card, but also more than ATI's Radeon X850 XT Platinum Edition, which achieved 13,271 points in Gigabyte's test environment, and Nvidia's GeForce 6800 Ultra, which posted 12,680 points.

While Gigabyte claims that the 3D1 will trump the performance of the Radeon X850 XT Platinum Edition and GeForce 6800 Ultra cards, it says that the card will be offered in combination with the GA-K8NXP-SLI mainboard for less money than ATI's and Nvidia's single-GPU graphics cards alone. These high-end cards currently carry suggested retail prices between $500 and $600.

Tom's Hardware Guide's test lab staff will run the 3D1 through its benchmark track as soon as the card becomes available. According to sources, the card will be available in samples at the end of this month and will be sold as a "luxury solution" for gamers by mid-January.
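For what it's worth, the relative standings implied by the scores quoted above are easy to work out. A quick sketch (the figures are Gigabyte's own, not independent results, and the single 6600 GT score is only the estimate implied by the "almost twice" claim):

```python
# 3DMark2003 scores as quoted in the article (Gigabyte's test environment)
scores = {
    "Gigabyte 3D1 (2x 6600 GT)": 14293,
    "ATI Radeon X850 XT Platinum Edition": 13271,
    "Nvidia GeForce 6800 Ultra": 12680,
}

baseline = scores["Gigabyte 3D1 (2x 6600 GT)"]
for card, score in scores.items():
    pct = 100.0 * score / baseline
    print(f"{card}: {score} points ({pct:.1f}% of the 3D1)")

# "Almost twice the performance of a regular 6600 GT" implies a single
# 6600 GT scored somewhere above half the 3D1's result:
implied_single_6600gt = baseline / 2
print(f"Implied single 6600 GT score: > {implied_single_6600gt:.0f} points")
```

By these numbers, the 3D1's claimed lead over the X850 XT PE is about 7-8%, and about 13% over the 6800 Ultra.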
#2
Heh, a card as large as the Voodoo5...
It is true, however, that two 6600GTs in SLI are about as fast as a 6800GT. If the pricing is right (and driver support is there), then this will actually work.

--
"War is the continuation of politics by other means. It can therefore be said that politics is war without bloodshed while war is politics with bloodshed."

"John Eckart" wrote in message ...
> http://www.tomshardware.com/hardnews...16_115811.html
> Chicago (IL) - Gigabyte will announce Friday a graphics card running
> two graphics processors on one board. <snip>
#3
"First of One" wrote in message ...
> Heh, a card as large as the Voodoo5... It is true, however, that two
> 6600GTs in SLI are about as fast as a 6800GT. If the pricing is right
> (and driver support is there), then this will actually work.

Superb idea though, since this alleviates the need for dual-slot PCI-E X16 mobos. Dual 6800 boards can't be far behind.
#4
"Tim" wrote in message ...
> Superb idea though, since this alleviates the need for dual-slot PCI-E
> X16 mobos. Dual 6800 boards can't be far behind.

How many people could afford a dual 6800 card? What would be interesting to know is whether the AGP bridge can work with a dual-GPU card. How many people would upgrade to PCI-e if that's possible?
#5
"John Russell" wrote in message ...
> How many people could afford a dual 6800 card?

At least more than those who could afford dual-slot PCI-e X16 systems. And who's to say where the final prices will fall? NVidia is using SLI technology now to encourage dual 6800 solutions, so they obviously think someone out there has the will and the means to buy them. Personally I think it's crazy to put that much money into a graphics system (for gaming at least), but apparently NVidia believes there's a market for it.

> What would be interesting to know is whether the AGP bridge can work
> with a dual-GPU card. How many people would upgrade to PCI-e if that's
> possible?

I don't think we'll ever see an AGP version, just for that very reason.
#6
"Tim" wrote in message ...
> I don't think we'll ever see an AGP version, just for that very reason.

But this dual-GPU card is not based upon a reference Nvidia design; it appears to be a bit of clever Gigabyte design. Introducing this card has negative sales prospects for Gigabyte's own SLI motherboards, never mind others'. You could argue that Gigabyte might sell more dual-GPU cards to current AGP users than they would lose in PCI-e motherboard sales.
#7
"John Russell" wrote in message ...
> You could argue that Gigabyte might sell more dual-GPU cards to current
> AGP users than they would lose in PCI-e motherboard sales.

...and there is probably more profit made on graphics cards at the moment!
#8
"John Russell" wrote in message ...
> ...and there is probably more profit made on graphics cards at the
> moment!

Good points. I just suspect that the industry heavyweights (Intel, etc.) might be pressuring them into a PCI-e-exclusive version, just to give the standard a stronger foothold in the market. Time will certainly tell, though.
#9
Keep in mind 95% of video cards produced go into OEM systems, most of which nowadays are Intel P4-based with PCIe motherboards, so it does make some business sense.

--
"War is the continuation of politics by other means. It can therefore be said that politics is war without bloodshed while war is politics with bloodshed."

"Tim" wrote in message ...
> Good points. I just suspect that the industry heavyweights (Intel, etc.)
> might be pressuring them into a PCI-e-exclusive version, just to give
> the standard a stronger foothold in the market.
#10
"First of One" wrote in message ...
> Keep in mind 95% of video cards produced go into OEM systems, most of
> which nowadays are Intel P4-based with PCIe motherboards, so it does
> make some business sense.

So is it good business to produce the card, or not to produce it? Even those who built AMD64 systems recently using nForce4 will not have used SLI boards. That means there are a lot of new Intel and AMD PCIe systems out there which don't have motherboard support for SLI. Seems like an excellent market opportunity for the "SLI on a card" approach. After all, how many parents will wake up on Xmas day to find little Johnny upset because his new PC doesn't have SLI?