A computer components & hardware forum. HardwareBanter

AIB Companies To Adopt XGI Volari GPUs?



 
 
  #1  
Old September 27th 03, 08:36 AM
graphics processing unit
external usenet poster
 
Posts: n/a
Default AIB Companies To Adopt XGI Volari GPUs?

While not directly related to Nvidia or ATI, the fact that XGI is entering
the consumer graphics industry with its range of Volari GPUs may affect both
of the current leaders. Hopefully in a positive way, for the end user. God
knows we could use some more competition here.

Personally, I am most excited about the Volari V8 Duo - the first *consumer*
graphics card configuration to sport twin Graphics Processing Units.

Now, here's the article on the topic:

http://www.xbitlabs.com/news/video/d...923124528.html
-----------------------------------------------------------------
ASUS, ABIT, Gigabyte, Club3D to Adopt XGI Volari GPUs?

by Anton Shilov
09/23/2003 | 12:46 PM

There are rumours going around the Computex Taipei 2003 exhibition in Taiwan
that a number of graphics card makers are seriously considering
manufacturing graphics cards powered by XGI Volari graphics processors.
The list of companies includes the names of tier-one manufacturers, even
though there are no official comments from any of the firms mentioned.

As we managed to find out, ASUSTeK Computer, ABIT, Gigabyte Technology, CP
Technology and Club3D plan to support XGI in its attempt to successfully
enter the graphics cards market this year by adopting XGI Volari V5 and
V8 GPUs.

Everybody in the graphics cards market is very interested in a third
provider of GPUs, since the fierce competition between today's leaders NVIDIA
and ATI is not only exhausting for the chip companies, but also has a negative
impact on their add-in-board partners. Furthermore, having only two GPU companies
with practically equal resources could give rise to a GPU cartel
that totally controls the graphics processor market. Even though it is
practically impossible for a new player to enter the market, AIB companies
want to give XGI a try. If XGI manages to be competitive, everyone will
benefit.

Note that this information is totally unofficial and no formal decisions
concerning actual graphics cards have been announced yet.



  #2  
Old September 27th 03, 08:48 AM
Thomas
external usenet poster
 
Posts: n/a
Default

graphics processing unit wrote:
Personally, I am most excited about the Volari V8 Duo - first
*consumer* graphics card configuration to sport twin Grahpics
Processing Units.


Hahahahahahaha...
Quite funny.

Don't you remember the ATI Rage Fury MAXX, or the Voodoo 5 5500? Both had twin
GPUs...

If you don't know what you're talking about, stop posting ;-)

Thomas


  #3  
Old September 27th 03, 10:20 AM
graphics processing unit
external usenet poster
 
Posts: n/a
Default


"Thomas" wrote in message
news:chbdb.55519$tK5.6217727@zonnet-reader-1...
graphics processing unit wrote:
Personally, I am most excited about the Volari V8 Duo - first
*consumer* graphics card configuration to sport twin Grahpics
Processing Units.


Hahahahahahaha...
Quite funny.

Don't you remember the ATI Rage Fury MAXX, or the Voodoo 5 5500? Both had twin
GPUs...

If you don't know what you're talking about, stop posting ;-)

Thomas


Bwuhahahahahaha....

I guess you didn't notice that I said
*graphics processing unit* and not graphics accelerator or graphics chip.
Neither the ATI Rage Fury MAXX nor the Voodoo 5 5500 used GPUs with on-chip
geometry processing (T&L).



  #4  
Old September 27th 03, 11:08 AM
Thomas
external usenet poster
 
Posts: n/a
Default

graphics processing unit wrote:
I guess you didn't notice that I said
*graphics processing unit* and not graphics accelerator or graphics
chip. Neither the ATI Rage Fury MAXX nor the Voodoo 5 5500 used GPUs
with on-chip geometry processing (T&L).


The name 'GPU' was simply an invention of NVIDIA. For me, it's just another
name, not another 'thing', hehe. There were many more hardware-related
things added to the 'GPU' that didn't change the name, so for me, it's all
the same thing, from the Hercules chip to the ATI 9800 ;-) Just more
features added, and speed... But at least I see what you mean now ;-)

Thomas


  #5  
Old September 27th 03, 11:27 AM
external usenet poster
 
Posts: n/a
Default

In comp.sys.ibm.pc.hardware.video, graphics processing unit writes:
Personally, I am most excited about the Volari V8 Duo - first *consumer*
graphics card configuration to sport twin Graphics Processing Units.



The Voodoo5 5500 in my machine has got 2 VSA-100 units... If that isn't a GPU, then
what is it? The drivers work properly under any Windows OS (right now
I'm using Windows 2000 Pro)...

There is one thing that nobody will beat soon... ) The Voodoo5 6000... Or,
to put it another way - 4 GPUs on one board...

But, shhhhhh... ) I screwed up one of the chips, so it isn't working properly... ))


And, the ATI Rage Fury MAXX had 2 Rage 128 Pro chips (IIRC)... But, problematic
drivers...


--

Damir Lukic,
a member of hr.comp.hardver FAQ-team
  #7  
Old September 27th 03, 01:06 PM
Bratboy
external usenet poster
 
Posts: n/a
Default

"Thomas" wrote in message
news:chbdb.55519$tK5.6217727@zonnet-reader-1...
graphics processing unit wrote:
Personally, I am most excited about the Volari V8 Duo - first
*consumer* graphics card configuration to sport twin Grahpics
Processing Units.


Hahahahahahaha...
Quite funny.

Don't you remember the ATI Rage Fury MAXX, or the Voodoo 5 5500? Both had twin
GPUs...

If you don't know what you're talking about, stop posting ;-)

Thomas



Well, and not to mention the new 9800 dual-chip cards I read about recently
that someone's making.


  #8  
Old September 27th 03, 02:01 PM
Andy Cunningham
external usenet poster
 
Posts: n/a
Default

Sapphire built a 9800 MAXX with dual 9800 Pros. It didn't work, but until I
see the Volari working I don't think that matters for this comparison :

"graphics processing unit" wrote in message
.com...
While not directly related to Nvidia or ATI, the fact that XGI is entering
the consumer graphics industry with its range of Volari GPUs may affect both
of the current leaders. Hopefully in a positive way, for the end user.
[snip]







  #9  
Old September 27th 03, 10:07 PM
Tony Hill
external usenet poster
 
Posts: n/a
Default

On Sat, 27 Sep 2003 07:36:59 GMT, "graphics processing unit"
wrote:
While not directly related to Nvidia or ATI, the fact that XGI is entering
the consumer graphics industry with its range of Volari GPUs may affect both
of the current leaders. Hopefully in a positive way, for the end user. God
knows we could use some more competition here.


"I'll believe it when I see it". There have been a LOT of graphics
cards that were supposed to be the next big thing to come along. S3
has done it a handful of times (and again just recently with Delta
Chrome), Matrox has done it, BitBoys did it several times without ever
having a product, and now we've got XGI. So far none of these cards
have managed to compete very effectively with the low-end chips from
ATI or nVidia, let alone their high-end stuff.

The real key is in getting decent drivers. This is how nVidia took
over the graphics world, not by their hardware. nVidia managed to get
fast and *stable* drivers out for all of their products while 3dfx and
ATI were floundering with buggy drivers that were missing features and
had either very poor performance or, at best, uneven performance.
ATI has since learned from its mistakes and really improved the
quality of its drivers, but they are about the only one.

Right now there are three players in the graphics market: ATI, nVidia
and Intel (with Intel actually being the largest supplier). Most of
the world's computer users do VERY well with integrated graphics, and
have absolutely ZERO reason to buy an add-in card. That just leaves
an extremely small market at the very high-end and a decent sized but
very low-margin market in the mid range. If XGI wants to succeed,
they need to get a graphics card out for $100 that has stable drivers
and that can match or beat whatever nVidia and ATI are selling for
~$125 at the time (right now that would be the GeForceFX 5600 and the
Radeon 9600).

I ain't holding my breath. I'll be surprised if they ever get stable
drivers, let alone within 6 months of its release. And
that's just talking about Windows drivers, the situation is likely to
be even worse for their Linux drivers if they even bother to make
those at all.

Personally, I am most excited about the Volari V8 Duo - first *consumer*
graphics card configuration to sport twin Grahpics Processing Units.


I'm not. I doubt that it will manage to match a GeForceFX 5600 or ATI
Radeon 9600, yet it will likely cost a LOT more. It all comes back to
drivers, especially for a more complicated design with two graphics
processors.

Besides that, their claim as being the first consumer card with dual
GPUs is REALLY stretching things. They're taking a very narrow view
on just what it means to be a consumer card and what it takes to be
considered a GPU. Marketing at its best/worst here.

-------------
Tony Hill
hilla underscore 20 at yahoo dot ca
  #10  
Old September 28th 03, 02:36 PM
Radeon350
external usenet poster
 
Posts: n/a
Default

Tony Hill wrote in message t.com...
[snip]
Besides that, their claim as being the first consumer card with dual
GPUs is REALLY stretching things. They're taking a very narrow view
on just what it means to be a consumer card and what it takes to be
considered a GPU. Marketing at its best/worst here.



I don't see why it is such a stretch. First of all, there are not many
companies that make consumer GPUs to begin with. They can be counted
on one hand, I believe. And as far as I am aware, none have released a
card with more than one GPU for consumer use. Yes, there are dozens
of cards that use 2 or more GPUs, from a number of companies, for all
kinds of high-end, non-consumer applications. Many of them predate
Nvidia's NV10/GeForce256, which was the first working consumer GPU,
but *certainly* not the first-ever GPU. That is, a chip with T&L
on-chip.

Actually, I don't just use the term 'GPU' as Nvidia uses it. To myself
and to many who use graphics processors, something that takes the
geometry processing load OFF the CPU, putting it on the graphics chip,
that's a 'graphics processor', or graphics processing unit / GPU as
Nvidia coined it. The 3dfx Voodoo chips, including the VSA-100s used in
the Voodoo5 5500 and 6000, did NOT do that at all. Neither did any of the
pre-Radeon ATI chips, including the dual Rage Fury chips on the MAXX
card. And basically neither did any consumer 3D PC chip before the GeForce256. Any
graphics chip that lacks what used to be called 'geometry processing',
or what was commonly called T&L in late 1999 when the GeForce came out,
and is now called vertex shading - if it lacks that, it's usually
considered a 3D accelerator or rasterizer, rather than a complete
'graphics processor' or GPU. At least that is the way I have
understood things for a long time.
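To make the distinction concrete, here is a minimal sketch of the per-vertex "transform and lighting" work being discussed - the math that pre-GeForce cards left to the host CPU on every frame, and that a GPU in this sense performs on-chip. This is an illustration only: the matrix and light values are made up, and real T&L pipelines do far more (clipping, perspective division, multiple lights).

```python
# Per-vertex "transform and lighting" (T&L), sketched in software.
# On a rasterizer-only card (Voodoo, Rage 128), the CPU ran code like
# this for every vertex; a GeForce256-class GPU does it in hardware.

def transform(matrix, v):
    """Multiply a 4x4 row-major matrix by a 4-component vertex."""
    return [sum(matrix[r][c] * v[c] for c in range(4)) for r in range(4)]

def diffuse(normal, light_dir):
    """Simple Lambertian lighting term: clamp(N . L, 0, 1)."""
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, min(1.0, dot))

# An identity transform leaves the vertex unchanged.
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
v = [1.0, 2.0, 3.0, 1.0]
print(transform(identity, v))  # [1.0, 2.0, 3.0, 1.0]

# A surface normal facing straight into the light is fully lit.
print(diffuse([0.0, 0.0, 1.0], [0.0, 0.0, 1.0]))  # 1.0
```

Multiply that by every vertex in a scene, every frame, and it is clear why moving this work onto the graphics chip was treated as a dividing line between a "3D accelerator" and a "GPU".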

On the other hand,
I suppose one can argue that any graphics chip, be it 2D or 3D, is a
'GPU' - anything from a 1990 VGA chip, to a Voodoo1, to the Graphics
Synthesizer in the PS2. However, it is common practice in the graphics
industry to differentiate between a rasterizer and a complete graphics
processor with geometry & lighting (now vertex shading) on board.

So therefore, I do not find XGI's marketing outrageous in
claiming the first dual-GPU card for consumer use. Of
course, they *will* have to bring Volari to market, and it will have to
work. In other words, "believe it when we see it" still applies. But
the specific claim of having the first dual-GPU card is not a stretch
in and of itself, in my book.
 






Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 HardwareBanter.
The comments are property of their posters.