Nvidia's 9800 GTX and 9800 GX2 seem to be a waste of time & money



 
 
  #1  
Old April 16th 08, 08:21 AM posted to alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.games.action,comp.sys.ibm.pc.hardware.video,comp.arch
NV55

If you have an 8800 GTX, 8800 Ultra or 8800 GT, you might as well save
your money; don't waste it on the 9800 GTX or 9800 GX2. These cards
are more or less what you already have. In some important areas,
really less. Save it for the upcoming GT200-based 9900GTX or 9900GX2,
which should be arriving in Q3 2008.


http://www.techradar.com/news/comput...on-test-318825




Our verdict: Nvidia's 9800 GTX and GX2 on test
Are these cards a technological leap?


The new Nvidia GeForce 9800 range is not as good as we'd hoped




Just over eighteen months ago the much-heralded age of the DirectX10-
capable graphics card dawned, as the supreme G80-powered GeForce
8800GTX dropped into the TechRadar office. Six months later came the
updated 8800 Ultra, a card that has remained Nvidia's top end
offering... until now.

We've had to wait 12 long months for the refresh, during which time we
have been treated to a mass of mid-range cards, including the
excellent 8800GT - Nvidia's first card with a 65nm core.

But still, it's been a long time coming for the 9800 GTX and GX2.

Long time passing

Both new cards are powered by the same 65nm G92, a core that is itself
now six months old, and it's the first time that Nvidia has released a
brand new family of top-end cards based on an old architecture.
Replacing the 8800GTX and Ultra is a necessity as far as furthering
the Nvidia brand is concerned; competition-wise, though, it's less of
an issue. AMD still hasn't managed to create anything that seriously
outperforms these year-old cards, so is the lack of a new core an
acknowledgment that Nvidia only has to turn up at the track to win the
race?

The GTX version of the 9800 is a straight, beefed-up version of the
G92 with higher clock speeds across the board. While it shares its
number of Raster Operators (ROPs) with the 8800GT, it has the old
GTX's full complement of 128 shader units, giving it the necessary
speed boost.
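
To put rough numbers on that boost: theoretical shader throughput
scales with shader count and shader clock. A minimal sketch in Python,
assuming the commonly published shader clocks and Nvidia's
3-FLOPs-per-clock (MADD plus MUL) rating, neither of which comes from
this review:

# Theoretical shader throughput (GFLOPS) = shaders * shader clock (GHz) * 3.
# The 3 FLOPs/clock is Nvidia's MADD+MUL rating; the extra MUL is rarely
# usable in practice. Shader clocks are published specs, assumed here.
def shader_gflops(shaders, shader_ghz):
    return shaders * shader_ghz * 3

print(f"8800 GTX: {shader_gflops(128, 1.350):.0f} GFLOPS")  # ~518
print(f"9800 GTX: {shader_gflops(128, 1.688):.0f} GFLOPS")  # ~648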

The GX2 follows the example of the old 7950GX2, strapping two
G92-stuffed PCBs together. But this time both PCBs face into the same
heatsink and are housed in a vaguely coffin-like surround. The clock
speeds are slightly slower than the GTX's, but a fair bit of
optimising has gone into making this single-card SLI offering an
impressive piece of engineering.

Swiss cheese memory

The first difference you'll notice when comparing the two new cards
with their predecessors is the change in memory capacity. Both the
8800 GTX and Ultra had a 384-bit memory bus with 768MB of GDDR3, while
the 9800s make do with the same 256-bit bus and 512MB of memory found
on the GTS and GT iterations of the G92-based 8800s.

Thanks to its two cores, the GX2 comes out on top in the memory
bandwidth stakes at 128GB/s against the Ultra's 103.7GB/s, but the
9800 GTX lags well behind both of the previous cards. What this means
in real terms is that at higher resolutions, and especially with
full-screen anti-aliasing turned on, the new cards take quite a hit at
exactly the settings we were hoping these big-panel pixel pushers
would excel at.
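
As a rough sketch of where those bandwidth figures come from: peak
bandwidth is simply bus width times effective memory data rate. The
bus widths are from this review; the effective GDDR3 data rates below
are the commonly published board specs, so treat them as assumptions
rather than figures from the article:

# Peak memory bandwidth (GB/s) = bus width in bytes * effective rate (GT/s).
# Bus widths are from the review; the effective GDDR3 data rates are the
# commonly published board specs, quoted here as assumptions.
def bandwidth_gbps(bus_bits, rate_gtps):
    return (bus_bits / 8) * rate_gtps

cards = {
    "8800 Ultra (768MB)": (384, 2.16),
    "9800 GTX (512MB)":   (256, 2.20),
    "9800 GX2 (per GPU)": (256, 2.00),  # two GPUs on one card
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gbps(bus, rate):.1f} GB/s")
# Prints 103.7, 70.4 and 64.0 GB/s; doubling the GX2's per-GPU figure
# gives the 128GB/s combined number quoted above.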

The differences between the GTX and GX2, and indeed the 8800GT, are
slight, the GX2 simply relying on the brute force of its single-card
SLI. Where the difference between the two new G92 parts is most
obvious, though, is in the number of ROPs. The GTX is still hobbling
along with 16, fewer than both the 8800GTX and Ultra at 24, but thanks
to the doubling up, the 9800 GX2 has 32. The difficulty is in knowing
how much benefit the multi-GPU card's extra ROPs give compared with a
single card's 24.
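
One way to gauge what those ROP counts are worth is theoretical pixel
fill rate, ROPs multiplied by core clock. The ROP counts are from this
review; the core clocks are published board specs and the GX2 line
assumes perfect two-GPU scaling, so this is a back-of-envelope sketch,
not a benchmark:

# Theoretical pixel fill rate (Gpixels/s) = ROPs * core clock (GHz).
# ROP counts are from the review; core clocks are published specs
# (assumptions here). The GX2 figure assumes ideal SLI scaling.
def fill_rate(rops, core_ghz):
    return rops * core_ghz

print(f"8800 Ultra: {fill_rate(24, 0.612):.1f} Gpix/s")  # ~14.7
print(f"9800 GTX:   {fill_rate(16, 0.675):.1f} Gpix/s")  # ~10.8
print(f"9800 GX2:   {fill_rate(32, 0.600):.1f} Gpix/s")  # ~19.2 if SLI scales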

Bigger, faster, stronger

So where do we find ourselves with the two new top-end cards? Well,
mostly in the same place we were before, to be honest. There's very
little difference between this new set and the old, with the 9800 GTX
being the biggest disappointment.

It struggles to find any space between itself and the 8800 GTX (which
it's supposed to be replacing), and there's also the fact that you can
still pick up the older card - with the extra memory, bandwidth and
ROPs - for less than £200. In some places you can save yourself around
£50 and come away with an equivalent, and in some cases faster, card.
The march of progress seems to have stomped right past this iteration
of the 9800, and here at TechRadar we might just have to plump for the
original DX10 monster.

As regards the GX2, Nvidia had to go down the multi-GPU route, not
just to prove it could produce a functional version as AMD has, but
also to create a card that it could legitimately call the fastest
graphics card around.

The final verdict...

Still, the memory constraints hold the GX2 back from being the
superlative, stand-out, top-end card du jour. On lower-res panels
without silicon-melting anti-aliasing it speeds ahead of the
competition. Yet with all the bells and whistles cranked up to a
deafening roar, it struggles to break even with the old 8800 Ultra.
Again, if you shop around you can pick up an Ultra for around £350,
and be fairly sure that your card will have drivers mature enough to
cater for whatever you throw down its graphics tubes.

The long and short of it is that if you've got yourself an 8800GTX or
Ultra, and felt that twinge of envy at the announcement of this new
generation of top-end cards, then you can stop worrying. In fact, you
can probably be downright smug, as your slightly geriatric cards are
still more than capable of holding their own against these
youngbloods. 'Til the GT200 comes out, that is...

The full version of this review will appear in PC Format magazine
issue 214 and will go on sale on 4 May.
By Dave James and James Rivington
  #2  
Old April 16th 08, 01:10 PM
Tim O[_2_]

On Wed, 16 Apr 2008 00:21:44 -0700 (PDT), NV55 wrote:

If you have an 8800 GTX, 8800 Ultra or 8800 GT, you might as well save
your money; don't waste it on the 9800 GTX or 9800 GX2. These cards
are more or less what you already have. In some important areas,
really less. Save it for the upcoming GT200-based 9900GTX or 9900GX2,
which should be arriving in Q3 2008.


Save my money for Q3? I bought an 8800GT in December. It'll be long
after Q3 that I upgrade again. I suspect that's true for a lot of
people, since there are probably only going to be two more games that
can even use the 8800's horsepower by then.

Tim
  #3  
Old April 16th 08, 03:23 PM
Augustus

You're preaching to the choir in this forum... for $175 I can add a
second 8800GT OC unit that will outperform any 9800GX2.


  #4  
Old April 16th 08, 03:44 PM
JLC


"Tim O" wrote in message
...
On Wed, 16 Apr 2008 00:21:44 -0700 (PDT), NV55 wrote:

If you have an 8800 GTX, 8800 Ultra or 8800 GT, you might as well save
your money; don't waste it on the 9800 GTX or 9800 GX2. These cards
are more or less what you already have. In some important areas,
really less. Save it for the upcoming GT200-based 9900GTX or 9900GX2,
which should be arriving in Q3 2008.


Save my money for Q3? I bought an 8800GT in December. It'll be long
after Q3 that I upgrade again. I suspect that's true for a lot of
people, since there are probably only going to be two more games that
can even use the 8800's horsepower by then.

Tim


I agree. My 8800GT kicks the crap out of any game I throw at it
(except Crysis, of course). I'm hanging on to this card for a while.
JLC


  #5  
Old April 19th 08, 03:14 AM
Trimble Bracegirdle

"If you have a 8800 GTX, 8800 Ultra or 8800 GT, you might as well save
your money, don't waste it on 9800 GTX or 9800 GX2. These cards are
more or less what you already have. "

Well thanks for that comfort as I'm still trying to feel my 1 year old £300
8800GTX
is / was worth it.
All divides us up on this issue depending on what resolution a Bunny thinks
it needs
must play in.
With my 21" CRT Monitor all is looking good at 1024 x 768 Max everything.
(\__/)
(='.'=)
(")_(") mouse(it all just dots u no ?)


  #6  
Old April 19th 08, 04:39 AM
Augustus

Well, thanks for that comfort, as I'm still trying to feel my
one-year-old £300 8800GTX is/was worth it.
It all divides us on this issue, depending on what resolution a Bunny
thinks it must play in.
With my 21" CRT monitor all is looking good at 1024 x 768, max everything.


You're joking, right? It should be looking just fine at 1600x1200
running everything unless it's Crysis or one or two other titles. And
it should be more than fine at 1280x1024 maxed. Running an 8800GTX at
1024x768 with everything maxed on a 21" CRT? You're using an 8800GTX
768MB card to do what an 8600GT could do at 1024x768.
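
For a sense of scale in this argument, the per-frame pixel workload at
the resolutions being debated differs by more than a factor of two. A
quick back-of-envelope sketch, nothing card-specific:

# Pixels per frame at each resolution, relative to 1024x768.
base = 1024 * 768  # 786,432 pixels
for w, h in [(1024, 768), (1280, 1024), (1600, 1200)]:
    px = w * h
    print(f"{w}x{h}: {px:,} pixels ({px / base:.2f}x the 1024x768 load)")
# 1280x1024 is ~1.67x and 1600x1200 is ~2.44x the pixel work per frame,
# which is why a card coasting at 1024x768 still has headroom to spare.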


  #7  
Old April 19th 08, 11:07 AM
Sorrid User

"Augustus" wrote in message
news:BPdOj.488$XI1.270@edtnps91...
Well, thanks for that comfort, as I'm still trying to feel my
one-year-old £300 8800GTX is/was worth it.
It all divides us on this issue, depending on what resolution a Bunny
thinks it must play in.
With my 21" CRT monitor all is looking good at 1024 x 768, max everything.


You're joking, right? It should be looking just fine at 1600x1200
running everything unless it's Crysis or one or two other titles. And
it should be more than fine at 1280x1024 maxed. Running an 8800GTX at
1024x768 with everything maxed on a 21" CRT? You're using an 8800GTX
768MB card to do what an 8600GT could do at 1024x768.


An 8600GT is not good enough for Assassin's Creed and World in
Conflict, IME, even at 1024x768. Crysis looks and runs well enough on
medium, but those two games look like DX7 titles by the time I get
playable framerates on my 8600GT.

  #8  
Old April 19th 08, 03:39 PM
Augustus

An 8600GT is not good enough for Assassin's Creed and World in
Conflict, IME, even at 1024x768. Crysis looks and runs well enough on
medium, but those two games look like DX7 titles by the time I get
playable framerates on my 8600GT.


I'm aware of that. You missed the point.


  #9  
Old April 20th 08, 01:05 AM
Trimble Bracegirdle

Wise man (?) say:
"You're joking, right? ...... Running an 8800GTX at 1024x768 with
everything maxed on a 21" CRT? You're using an 8800GTX 768MB card to
do what an 8600GT could do at 1024x768."

Exactly... a year ago the 8800GTX (DX10 etc.) looked to make sense,
and those other cheaper, lower-spec 8xxx's were not out.
The CRT won't last forever anyway.
(\__/)
(='.'=)
(")_(") mouse(Yes, we did rather get that purchase wrong, did we not)



  #10  
Old April 20th 08, 03:35 AM
JLC


"Trimble Bracegirdle" wrote in message
...
Wise man (?) say:
"You're joking, right? ...... Running an 8800GTX at 1024x768 with
everything maxed on a 21" CRT? You're using an 8800GTX 768MB card to
do what an 8600GT could do at 1024x768."

Exactly... a year ago the 8800GTX (DX10 etc.) looked to make sense,
and those other cheaper, lower-spec 8xxx's were not out.
The CRT won't last forever anyway.
(\__/)
(='.'=)
(")_(") mouse(Yes, we did rather get that purchase wrong, did we not)


I used to run all my games at 1024x768 until around this time last
year, when I upgraded to an ATI X1900XT. I then discovered 1280x1024
to be a lot better looking on my 19" CRT. I usually run 4x AA and 16x
AF, and the only game that has problems with that is Crysis. Now that
I have an 8800GT I still run my games at 1280x1024, because if I go
any higher on this monitor I have to run at 60Hz, which bugs my eyes.
I run everything at 85Hz. Someday I'll take the plunge and get a nice
big LCD, but for now I'm happy with what I have. JLC


 



