A computer components & hardware forum. HardwareBanter


NV43: AARGH! More misleading nomenclature!



 
 
#1 - Lachoneus, October 11th 04, 03:27 PM

From this article about the upcoming GeForce 6200 series cards:

http://anandtech.com/video/showdoc.aspx?i=2238

"The first thing to notice here is that the 6200 supports either a
64-bit or 128-bit memory bus, and as far as NVIDIA is concerned, they
are not going to be distinguishing cards equipped with either a 64-bit
or 128-bit memory configuration. While NVIDIA insists that they cannot
force their vendor partners to distinguish the two card configurations
apart, we're more inclined to believe that NVIDIA simply would like
all 6200 based cards to be known as a GeForce 6200, regardless of
whether or not they have half the memory bandwidth. NVIDIA makes a
"suggestion" to their card partners that they should add the 64-bit or
128-bit designation somewhere on their boxes, model numbers or
website, but the suggestion goes no further than just being a
suggestion."

WHY?! It's just like the 5200 series today--some are 64-bit, some are
128-bit, and they're generally not distinguished by model name, price,
or anything. This means that some cards are much faster than others
with the same model number, and it's not always possible for you to
know in advance which one you are buying.
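
The bandwidth gap is simple arithmetic: peak memory bandwidth scales linearly with bus width, so a 64-bit card has exactly half the bandwidth of a 128-bit card at the same memory clock. A quick sketch (the 275 MHz DDR clock, i.e. 550 MT/s effective, is an assumed illustrative figure, not a confirmed 6200 spec):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective transfer rate).
# The 550 MT/s figure below is an assumption for illustration only.

def peak_bandwidth_gbs(bus_width_bits: int, effective_mts: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and transfer rate (in MT/s)."""
    return bus_width_bits / 8 * effective_mts * 1e6 / 1e9

print(peak_bandwidth_gbs(128, 550))  # 8.8 GB/s
print(peak_bandwidth_gbs(64, 550))   # 4.4 GB/s -- same name, half the bandwidth
```

Identical core, identical model number, half the number that matters most for fill-rate-bound games.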

Why not just call the 64-bit version the 6200SE and be done with it?
Oh, and REQUIRE card makers to follow this naming scheme--no 64-bit
Radeon 9600 non-Pro types. What does NVIDIA stand to lose by
distinguishing the 64-bit and 128-bit models?
#2 - tq96, October 11th 04, 04:20 PM

And more broken feature promises. Everyone with a 6800 waiting to make
use of the highly touted video processor may have another thing coming:

"The Video Processor (soon to receive a true marketing name) on the NV40
was somewhat broken, although it featured MPEG 2 decode acceleration.
Apparently, support for WMV9 decode acceleration was not up to par with
what NVIDIA had hoped for. As of the publication of this article, NVIDIA
still has not answered our questions of whether or not there is any
hardware encoding acceleration as was originally promised with NV40."

http://anandtech.com/video/showdoc.aspx?i=2238
#3 - Tim, October 11th 04, 05:57 PM


Lachoneus wrote:
> From this article about the upcoming GeForce 6200 series cards:
> [...]
> Why not just call the 64-bit version the 6200SE and be done with it?
> Oh, and REQUIRE card makers to follow this naming scheme--no 64-bit
> Radeon 9600 non-Pro types. What does NVIDIA stand to lose by
> distinguishing the 64-bit and 128-bit models?


Consider that the 6200 is poised to be the next "bargain" card. I think
NVidia is targeting the type of customer who buys products based on price
and marketing alone, without any knowledge of the technology. For this type
of customer, information like the card's memory interface would only hamper
sales. Besides, it can't be an issue if they're never informed about it,
and don't know enough to question it anyway.

I see the 6200 being a big OEM card for PCs sold touting brand name
components.


#4 - deimos, October 11th 04, 06:34 PM

Lachoneus wrote:
> From this article about the upcoming GeForce 6200 series cards:
> [...]
> Why not just call the 64-bit version the 6200SE and be done with it?
> Oh, and REQUIRE card makers to follow this naming scheme--no 64-bit
> Radeon 9600 non-Pro types. What does NVIDIA stand to lose by
> distinguishing the 64-bit and 128-bit models?


This isn't much different from what we have now. There's nothing that
dictates whether a 5200 gets a 64-bit or a 128-bit bus, and both variants
exist. Same with the 5600XT's and 5900XT's: you have to check the specs
to make sure you're getting what you want.

NVIDIA just designs the reference chipset and board, it's up to
manufacturers to implement it. They could (if they wanted to) make
something completely different with the same chip design still. It's
always been this way.
#5 - Lachoneus, October 12th 04, 01:18 AM

deimos wrote:
> NVIDIA just designs the reference chipset and board, it's up to
> manufacturers to implement it. They could (if they wanted to) make
> something completely different with the same chip design still. It's
> always been this way.


When the TNT2 had its memory bus cut in half, it became the TNT2 M64.

When the GeForce2 GTS had its memory bus cut in half, it became the
GeForce2 MX.

Likewise, memory bus width is one of the distinguishing factors between
the GF4 MX420/MX4000 and MX440.

AFAIK, it wasn't until fairly recently (FX 5200) that NVIDIA and/or its
partners decided not to distinguish cards based on memory bus width.
#6 - Lachoneus, October 12th 04, 01:25 AM

Tim wrote:
> Consider that the 6200 is poised to be the next "bargain" card. I think
> NVidia is targeting the type of customer who buys products based on price
> and marketing alone, without any knowledge of the technology. For this type
> of customer, information like the card's memory interface would only hamper
> sales. Besides, it can't be an issue if they're never informed about it,
> and don't know enough to question it anyway.


Exactly. End users don't know what a memory bus is or why a wider one
is better. But Joe Sixpack will see how well Doom runs on his friend's
128-bit GeForce 6200, then buy a 64-bit one because he doesn't know any
better, and find that it doesn't meet his expectations. This is why
it's all the more important for the two versions of the card to be
distinguished. Like giving the 64-bit version the "SE" suffix, or
renaming it to the 6000, or whatever.

> I see the 6200 being a big OEM card for PCs sold touting brand name
> components.


Maybe it's as simple as this: the OEMs want to bundle a cheap, crippled
64-bit-memory video card while giving the impression that it performs as
fast as the non-crippled version. Bundling an "SE" or "MX" card would
make it look like they're cutting corners--so the solution isn't to
bundle a good card, it's to remove the "SE".
#7 - Phil, October 12th 04, 06:37 AM

Tim wrote:
> Consider that the 6200 is poised to be the next "bargain" card. I think
> NVidia is targeting the type of customer who buys products based on price
> and marketing alone, without any knowledge of the technology. For this type
> of customer, information like the card's memory interface would only hamper
> sales. Besides, it can't be an issue if they're never informed about it,
> and don't know enough to question it anyway.
>
> I see the 6200 being a big OEM card for PCs sold touting brand name
> components.



Well, a 64-bit video card to go with the new 64-bit CPU(s) and the new
64-bit OS. Can't beat the performance!

Looks like nVidia is playing with unsuspecting buyers again!
#8 - Tim, October 12th 04, 05:56 PM


Lachoneus wrote:


> This is why it's all the more important for the two versions of the card to
> be distinguished. Like giving the 64-bit version the "SE" suffix, or
> renaming it to the 6000, or whatever.


Absolutely, although NVidia will probably insist on proper labeling only if
this ploy winds up hurting their bottom line.



 



