A computer components & hardware forum. HardwareBanter

My take on Nvidia SLI



 
 
  #1  
Old July 6th 04, 01:13 AM
external usenet poster
 
Posts: n/a
Default My take on Nvidia SLI

7 years ago, 3DFX followed up the Voodoo's success with Voodoo 2 SLI.
One board cost about $500-600 at first and was out of reach for most
gamers. I got hold of a pair of used V2s cheap a few months later
from a distressed owner who was in dire need of some money. That was
the first time I played games in all of 1024x768's glory, and I was
wowed by it. But I sold off one board a few days later because two
boards were still too rich for me.



It wasn't until TNT2 (or TNT1 Ultra) that Nvidia took the speed
crown from V2 SLI. How ironic that Nvidia is now the one bringing
SLI back.



Yet if you compare these two iterations of SLI, you'll see that what
made the V2 SLI a success is not present in Nvidia's case.



Every mobo had two PCI slots back then, but we haven't seen a single
PCI Express board with two graphics slots on the market yet.



The difference between playing at 800x600 and at 1024x768 (V2 SLI) is
huge, but the difference between 1280x1024 and 1600x1200 (Nvidia SLI)
is not. Unless you have a 21" monitor, playing games at 1600x1200
makes the icons too small.
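To put rough numbers on that (my own back-of-the-envelope arithmetic, not from the post itself), the two pixel-count jumps can be compared directly:

```python
# Relative pixel-count increase for the two resolution jumps discussed above.

def pixel_increase(old, new):
    """Return the fractional increase in total pixels between two resolutions."""
    (ow, oh), (nw, nh) = old, new
    return (nw * nh) / (ow * oh) - 1.0

# Voodoo2 SLI era jump: 800x600 -> 1024x768
v2_jump = pixel_increase((800, 600), (1024, 768))      # ~0.64 (64% more pixels)

# Nvidia SLI era jump: 1280x1024 -> 1600x1200
nv_jump = pixel_increase((1280, 1024), (1600, 1200))   # ~0.46 (46% more pixels)

print(f"V2-era jump: {v2_jump:.0%}, Nvidia-era jump: {nv_jump:.0%}")
```

So the older jump really was the bigger one in raw pixels, though not by as wide a margin as the subjective "huge vs. not" impression suggests.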



The V2 was purely a 3D gaming board that couldn't do anything else.
This Nvidia board is a full-fledged board, and Nvidia expects people
to buy an extra board just to play some 3D games at a slightly higher
resolution.



My take is that Nvidia SLI is not gonna fly. People aren't easily
wowed by new boards anymore, unlike 7 years ago when 3Dfx was the
king, Thomas Pabst was a rookie tech reviewer, and Anand la Shrimp
had just celebrated the 2nd anniversary of his website.



Well, those were the days.









  #2  
Old July 6th 04, 01:41 AM
NightSky 421

wrote in message
...

My take is that Nvidia SLI is not gonna fly. People aren't easily
wowed by new boards anymore, unlike 7 years ago when 3Dfx was the
king, Thomas Pabst was a rookie tech reviewer, and Anand la Shrimp
had just celebrated the 2nd anniversary of his website.



Well, those were the days.



Both ATI and nVidia are going to be coming out with SLI solutions, but I
have to agree with you that they aren't likely to catch on like the good
ol' Voodoo2. Price is one factor, but the power consumption involved
and the amount of heat produced by a higher-end system with high-end
video cards and hard drives would also be tremendous. Now that summer has
hit, I've become keenly aware of just how much heat my current Pentium 4
system with its 9800 Pro video card is spitting out, and I sure as heck
don't want any more. My other concern with these new SLI solutions is the
proximity of the cards to each other. From pictures I've seen on the web,
the cards are awfully close to one another, and that is a major concern.

It's really little wonder that the Voodoo2 SLI will undoubtedly do better
than dual 6800 or X800 cards. Back in the days of the Voodoo2, the only
big consideration was the price. Now, in 2004, a lot has changed. And
how much of a limiting factor will the CPU be when people get these new
video cards working together?


  #3  
Old July 6th 04, 02:04 AM
Ben Pope

wrote:
7 years ago, 3DFX followed up the Voodoo's success with Voodoo 2 SLI.
One board cost about $500-600 at first and was out of reach for most
gamers. I got hold of a pair of used V2s cheap a few months later
from a distressed owner who was in dire need of some money. That was
the first time I played games in all of 1024x768's glory, and I was
wowed by it. But I sold off one board a few days later because two
boards were still too rich for me.



It wasn't until TNT2 (or TNT1 Ultra) that Nvidia took the speed
crown from V2 SLI. How ironic that Nvidia is now the one bringing
SLI back.



Yet if you compare these two iterations of SLI, you'll see that what
made the V2 SLI a success is not present in Nvidia's case.



Every mobo had two PCI slots back then, but we haven't seen a single
PCI Express board with two graphics slots on the market yet.


When is the SLI thing due to be released? PCI-express is pretty new. It
will gain popularity.

It should be cheap and easy to put two PCI-E slots in, if there is a demand.

The difference between playing at 800x600 and at 1024x768 (V2 SLI) is
huge, but the difference between 1280x1024 and 1600x1200 (Nvidia SLI)
is not. Unless you have a 21" monitor, playing games at 1600x1200
makes the icons too small.


Works fine on my 19" at 1600x1200@85Hz. :-p

The V2 was purely a 3D gaming board that couldn't do anything else.
This Nvidia board is a full-fledged board, and Nvidia expects people
to buy an extra board just to play some 3D games at a slightly higher
resolution.


You're not paying for the 2D stuff; it comes practically free considering
the rest of the 3D stuff and components on the board. And you're getting,
they reckon, an ~80-90% improvement. Not bad...

My take is that Nvidia SLI is not gonna fly. People aren't easily
wowed by new boards anymore, unlike 7 years ago when 3Dfx was the
king, Thomas Pabst was a rookie tech reviewer, and Anand la Shrimp
had just celebrated the 2nd anniversary of his website.


It should be quick. But the cost will be prohibitive for most.

Ben
--
A7N8X FAQ:
www.ben.pope.name/a7n8x_faq.html
Questions by email will likely be ignored, please use the newsgroups.
I'm not just a number. To many, I'm known as a String...


  #4  
Old July 9th 04, 02:40 PM
John Russell

Nvidia has the rights to the acronym SLI, but don't get confused into
thinking this is Scan Line Interleaving, as it isn't.

From what I've seen, only one card is responsible for generating the image;
the other is just a second GPU which performs tasks allocated by the first
GPU. This is far more efficient than the Voodoo solution, as no graphics
calculations are done twice.

It also means that the cards don't have to match, so another upgrade path
becomes available: you can add a new, faster card as the master and keep
the old one as the slave GPU.



  #5  
Old July 9th 04, 07:08 PM
Ben Pope

John Russell wrote:
Nvidia has the rights to the acronym SLI, but don't get confused into
thinking this is Scan Line Interleaving, as it isn't.

From what I've seen, only one card is responsible for generating the image;
the other is just a second GPU which performs tasks allocated by the first
GPU. This is far more efficient than the Voodoo solution, as no graphics
calculations are done twice.


The Voodoo solution is entirely symmetrical, with each card rendering half
the scan lines, hence double the speed.

nVidia's solution is asymmetric (master/slave), which is LESS efficient: it
will never be able to deliver twice the performance; they reckon up to
about 90%.
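As a toy illustration of the symmetric split (purely hypothetical code, not how any actual driver works), Voodoo2-style scan-line interleaving divides a frame like this:

```python
# Toy model of Voodoo2-style scan-line interleaving: each card renders
# alternate scan lines, so the work splits exactly 50/50 with no duplication.

def interleave_scanlines(height, num_cards=2):
    """Assign each scan line to a card round-robin, Voodoo2 SLI style."""
    assignment = {card: [] for card in range(num_cards)}
    for line in range(height):
        assignment[line % num_cards].append(line)
    return assignment

work = interleave_scanlines(768)
# Each card gets exactly half the lines: a perfectly symmetric split.
print(len(work[0]), len(work[1]))  # 384 384
```

The symmetry is what made the theoretical 2x figure possible; a master/slave scheme, by contrast, always spends some of the master's time coordinating.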

It also means that the cards don't have to match, so another upgrade path
becomes available: you can add a new, faster card as the master and keep
the old one as the slave GPU.


"Secondly you'll need two identical, same brand and type, PCI-E GeForce 6800
graphics cards"
- http://www.hardwareanalysis.com/content/article/1728/

"SLI will only work on two of the same cards."
- http://www.neoseeker.com/Articles/Ha...s/nvsli/2.html

"Additionally, another requirement of SLI is that both cards must come from
the same manufacturer and be based on the same configuration"
- http://www.firingsquad.com/hardware/..._sli/page4.asp

Ben
--
A7N8X FAQ: www.ben.pope.name/a7n8x_faq.html
Questions by email will likely be ignored, please use the newsgroups.
I'm not just a number. To many, I'm known as a String...


  #6  
Old July 9th 04, 07:29 PM
Andrew

On Fri, 9 Jul 2004 19:08:45 +0100, "Ben Pope"
wrote:

The Voodoo solution is entirely symmetrical, with each card performing half
the scan lines, hence double the speed.


V2 SLI wasn't twice the speed of a single card; a 10-50% speed boost was
more usual, plus having the ability to do 1024x768.
--
Andrew. To email unscramble & remove spamtrap.
Help make Usenet a better place: English is read downwards,
please don't top post. Trim messages to quote only relevant text.
Check groups.google.com before asking a question.
  #7  
Old July 9th 04, 07:31 PM
CapFusion

[snip]
"John Russell" wrote in message
...
Nvidia has the rights to the acronym SLI, but don't get confused into
thinking this is Scan Line Interleaving, as it isn't.

[/snip]
According to NVIDIA, SLI = Scalable Link Interface.

[snip]

From what I've seen, only one card is responsible for generating the image;
the other is just a second GPU which performs tasks allocated by the first
GPU. This is far more efficient than the Voodoo solution, as no graphics
calculations are done twice.

[/snip]
Each GPU will start with half the load, and the split will change depending
on the load. Software will determine which portion of the load to
distribute to which GPU.
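A rough sketch of that idea (my own illustration; the real driver logic is not public, and the function and numbers here are invented for the example):

```python
# Toy dynamic split-frame balancer: start at a 50/50 split of the frame,
# then move the split line toward whichever GPU finished faster last frame.

def rebalance(split, time_gpu0, time_gpu1, step=0.02):
    """Shift the screen-split fraction away from the slower GPU."""
    if time_gpu0 > time_gpu1:
        split -= step  # GPU 0 is overloaded: give it less of the frame
    elif time_gpu1 > time_gpu0:
        split += step  # GPU 1 is overloaded: give GPU 0 more of the frame
    return min(max(split, 0.1), 0.9)  # keep the split within sane bounds

split = 0.5  # GPU 0 initially renders the top half of the frame
# GPU 0 keeps taking longer per frame, so its share steadily shrinks.
for _ in range(5):
    split = rebalance(split, time_gpu0=16.0, time_gpu1=12.0)
print(f"{split:.2f}")  # 0.40
```

The point is only that the split is adaptive rather than fixed, which is what distinguishes this scheme from the rigid scan-line interleave of the Voodoo2.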

[snip]
It also means that the cards don't have to match, so another upgrade path
becomes available: you can add a new, faster card as the master and keep
the old one as the slave GPU.

[/snip]
You will need two identical cards. Best is to have the same card from the
same manufacturer.

CapFusion,...


  #8  
Old July 9th 04, 07:45 PM
Ben Pope

Andrew wrote:
On Fri, 9 Jul 2004 19:08:45 +0100, "Ben Pope"
wrote:

The Voodoo solution is entirely symmetrical, with each card performing
half the scan lines, hence double the speed.


V2 SLI wasn't twice the speed of a single card; a 10-50% speed boost was
more usual, plus having the ability to do 1024x768.


Is that from a technology standpoint or is it due to the CPU being the
limiting factor?

"This shows that even a Pentium II 300 is not able feeding the Voodoo2 with
enough data at 640x480. It requires a CPU that's significantly faster than
this one."
- http://graphics.tomshardware.com/gra...204/index.html

What's the fastest CPU anybody has used to bench a Voodoo2 SLI rig?

I had a little look, but haven't found a site that compares a V2 SLI with a
single V2 on a fast (1GHz or so) CPU.

My impression was that at the time, games were fill-rate (or CPU) limited,
and that SLI would double the fill rate.
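That impression can be sanity-checked with rough numbers. The fill rate (~90 Mpixels/s per Voodoo2) and the overdraw factor below are approximate assumptions of mine, used only for illustration:

```python
# Back-of-the-envelope: is a Voodoo2 fill-rate limited at 1024x768?
# Figures are approximate late-90s numbers, used purely for illustration.

def fillrate_limited_fps(fill_mpixels, width, height, overdraw=2.5):
    """Max FPS if fill rate were the only bottleneck."""
    pixels_per_frame = width * height * overdraw
    return fill_mpixels * 1e6 / pixels_per_frame

single = fillrate_limited_fps(90, 1024, 768)   # one Voodoo2, ~90 Mpix/s
sli = fillrate_limited_fps(180, 1024, 768)     # SLI doubles the fill rate

print(f"single: {single:.0f} fps, SLI: {sli:.0f} fps")
# If the CPU can only feed ~40 fps worth of geometry, the SLI rig is
# CPU-limited long before it reaches its doubled fill-rate ceiling.
```

Under these assumptions a lone card tops out in the mid-40s at 1024x768, which is consistent with SLI's benefit showing up mainly as higher playable resolution rather than double the frame rate.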

Ben
--
A7N8X FAQ: www.ben.pope.name/a7n8x_faq.html
Questions by email will likely be ignored, please use the newsgroups.
I'm not just a number. To many, I'm known as a String...


  #9  
Old July 9th 04, 08:59 PM
J. Clarke

CapFusion wrote:

[snip]
"John Russell" wrote in message
...
Nvidia have the right to the acronym SLI but don't get confused into
thinking this is Scan Line Interleaving,as it isn't.

[/snip]
According to NVIDIA for SLI = Scalable Link Interface

[snip]

From what I've seen, only one card is responsible for generating the image;
the other is just a second GPU which performs tasks allocated by the first
GPU. This is far more efficient than the Voodoo solution, as no graphics
calculations are done twice.

[/snip]
Each GPU will start half and will change depending on the load. Software
will determine which load to distribute to which GPU.

[snip]
It also means that the cards don't have to match, so another upgrade path
becomes available: you can add a new, faster card as the master and keep
the old one as the slave GPU.

[/snip]
You will need two identical cards. Best is to have the same card from the
same manufacturer.


Since there are no boards on the market today, or coming in the near
future, with two PCI-Express 16x slots (read the fine print on the
Alienware very carefully, then read the spec sheets on the components they
use, and you'll find that it has one 16x and one 8x), if SLI depends on the
boards being identical then it's not going to work at all for a good long
time.



--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
  #10  
Old July 12th 04, 02:24 AM
NightSky 421

"Ben Pope" wrote in message
...

"This shows that even a Pentium II 300 is not able feeding the Voodoo2 with
enough data at 640x480. It requires a CPU that's significantly faster than
this one."
- http://graphics.tomshardware.com/gra...204/index.html



That's the thing: the CPU on a system would forever be playing catch-up to
a dual video card solution. Unless, of course, you hang onto those video
cards past their "Best Before" date. :-)


 









Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 HardwareBanter.
The comments are property of their posters.