Gigabyte GV-N96TSL-1G



 
 
#1
December 17th 09, 09:05 AM - posted to alt.comp.periphs.mainboard.gigabyte
Bill

I was considering the fanless Gigabyte GV-N96TSL-1G graphics card on a
Gigabyte GA-P55A-UD4P motherboard. This board supports the (future technology)
features USB 3.0 and 6 Gb/s SATA; however, if either of these features is
used, the first PCI Express slot goes from 16X to 8X. My question is
whether this reduction (to 8x) would be expected to affect the graphics
performance of the system. I was thinking that maybe, since it is a
relatively "slow" card, it might not--but I really have no idea. Thank
you for sharing whatever thoughts you may have concerning this.

Bill


#2
December 17th 09, 09:07 AM - posted to alt.comp.periphs.mainboard.gigabyte
Bill


"Bill" wrote in message
...
I was considering the fanless Gigabyte GV-N96TSL-1 graphics card in on a
Gigabyte GA-P55A-UD4P motherboard. This board supports (future technology)
features USB 3.0 and 6.0 GB/s SATA, however if either of these features are
used, the first PCI Express slot goes from 16X to 8X. My question is
whether this reduction (to 8x) would be expected to affect the graphics
performance of the system. I was thinking that maybe since it is a
relatively "slow" card, that it might not--but I really have no idea.
Thank you for sharing whatever thoughts you may have concerning this.

Bill


I should have also mentioned that I was planning to install an Intel Core
i7-860 CPU in the system (with no overclocking).


#3
December 17th 09, 12:23 PM - posted to alt.comp.periphs.mainboard.gigabyte
Paul

Bill wrote:
[original question quoted above, trimmed]


If the slot ran x8 PCI Express Rev 2, that is 8 * 500 MB/sec, or 4 GB/sec.
That is roughly twice what you'd get with AGP 8x.

The older-generation PCI Express Rev 1.1 x16 slot would have given
you 16 * 250 MB/sec, or 4 GB/sec as well. So x8 operation in Rev 2 mode
is still pretty good, and comparable to x16 in Rev 1.1 mode.
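
To make the arithmetic concrete, here is a quick back-of-the-envelope Python
sketch of those per-direction figures (the 250/500 MB/sec per-lane rates are
the usual rounded, post-encoding values, not measured numbers):

# Rough per-direction PCI Express bandwidth arithmetic (rounded,
# post-encoding per-lane rates; one direction only, not aggregate).
PER_LANE_MB_S = {"1.1": 250, "2.0": 500}

def pcie_gb_s(lanes, rev):
    """Approximate one-direction bandwidth in GB/sec."""
    return lanes * PER_LANE_MB_S[rev] / 1000.0

print(pcie_gb_s(8, "2.0"))     # x8,  Rev 2.0 -> 4.0 GB/sec
print(pcie_gb_s(16, "1.1"))    # x16, Rev 1.1 -> 4.0 GB/sec
print(pcie_gb_s(16, "2.0"))    # x16, Rev 2.0 -> 8.0 GB/sec
# AGP 8x tops out around 2.1 GB/sec, so x8 Rev 2.0 is roughly double that.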

If it were to have an impact, which I doubt, it would be at the 5%
to 10% level while gaming.

Tomshardware did some tests years ago, where they used cellophane tape
to insulate various numbers of PCI Express lanes. You can use those
results to understand the shape of the performance curve. The
effects are worst for one particular kind of benchmark, and not
nearly as pronounced in real games.

(SpecViewPerf suffers when PCI Express is slowed down)
http://www.tomshardware.com/reviews/...ing,927-9.html

Such a set of test cases would need to be repeated for the
more powerful processor and GPU combinations available today.
I can't guess at what the performance curve would be. The impact
should be pretty small, but only a real benchmark series, like the
one in the Tomshardware article, would settle it for sure.
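
Short of repeating a benchmark series like that, you can at least confirm
what link width and speed the slot actually negotiates once the USB 3.0 /
SATA 6 Gb/s features are switched on. A minimal sketch, assuming a Linux
machine with the lspci utility available (the "VGA compatible controller"
string and the LnkSta line format are just typical lspci -vv output, not
something verified against this particular board); on Windows, a tool such
as GPU-Z shows the same link status:

# Print the negotiated PCI Express link speed/width for the video card,
# by scraping "LnkSta:" lines from lspci -vv output (full link details
# may require running as root).
import re
import subprocess

out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout

device, is_vga = None, False
for line in out.splitlines():
    if line and not line[0].isspace():        # start of a new device block
        device = line
        is_vga = "VGA compatible controller" in line
    elif is_vga and "LnkSta:" in line:
        # typical form: "LnkSta: Speed 5GT/s, Width x8, ..."
        m = re.search(r"Speed ([^,]+), Width (x\d+)", line)
        if m:
            print(device)
            print("  negotiated:", m.group(1), m.group(2))

If the board really drops the first slot to x8, the Width field should read
x8 instead of x16 once the new features are enabled.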

*******

On the architecture front, the reason your question is intriguing is this:

Why would the "Northbridge" PCI Express interfaces have any
relationship to what is done on the "Southbridge"?

I downloaded the manual for your motherboard, and I do see the section
in question. I don't doubt there is an issue there.

ftp://download.gigabyte.ru/manual/mb...a-ud4(p)_e.pdf

I also have a copy of the P55 ("Southbridge") spec, 322169.pdf, and
what is interesting in there is that the chip seems to have integrated
clock generation. That might not be the only way to do it;
it may be possible to use an external clock generator. My
guess is that they're using the integrated clock generation,
since that saves money.

The P55 spec is 892 pages long, and I'm not going to read the whole
thing. Even if I were paid to do it, there wouldn't be enough hours
in a day to read the whole thing, look for every "*" or "Note" in
the document, and figure out what evil they're up to. I was not
able to find a reference to a register controlling clock generation,
due to the limits of the Adobe Acrobat version 9 PDF reader
(piece of crap). I wish Intel would use an older version of
PDF compatibility, so I could use an older version of Acrobat.

The P55 has two PCI Express Rev 2 compliant clock outputs (150 ps
jitter spec). I can see one output going to PCI Express slot 1.
The second clock output would go to the PCI Express switch chip,
which routes the remaining x8 of bandwidth to either the first
or second video card slot. Maybe the switch chip makes more
outputs? We don't even know what chip is used.

Great: we have PCI Express Rev 2 video slots, and PCI Express Rev 1
Southbridge PCI Express interfaces.

Now, when Gigabyte wants to run the add-on peripheral chips at
PCI Express Rev 2 compliant speeds, it needs low-jitter clocks
for that. Where the hell are those clocks coming from? Perhaps
it is the lack of good-quality clock signals that causes this
limitation and the interaction between the Northbridge (video) and
Southbridge (peripheral) PCI Express interfaces. I doubt
very much that the PCI Express switch chip is being used to
supply both video and peripherals at the same time - the Gigabyte
architecture diagram in the manual seems to rule that out.

Very peculiar... and sucky.

I wonder how much more it would have cost to use an external
clockgen, or whether it is even possible.

I don't know why this interaction exists, but it could be
because of Intel's half-baked built-in clock generator.
Anyone who has worked out clock distribution architectures
on a PCB knows that additional clock outputs are golden,
and allow amazing things to be done. Cheap out on them,
and some poor PCB designer will be sweating gumdrops
trying to make their design work. At the moment, I don't
even know how Gigabyte managed to do what they've done.

There are devices that allow buffering and creation of
more clock signals. But once you use such a device, you
degrade the clock quality. That is why it isn't a trivial
matter to solve.

*******

I think the Tomshardware article shows it takes
a pretty serious degradation of the video slot bandwidth
before it ruins your video performance. In your case,
I wouldn't lose any sleep over it. However, if I bought
a $600 video card and the slot shaved even a few percentage
points off it, I'd be ****ed - because I want to get my
$600 worth of performance.

Paul
#4
December 17th 09, 10:13 PM - posted to alt.comp.periphs.mainboard.gigabyte
Bill


"Paul" wrote in message
...
Bill wrote:
I was considering the fanless Gigabyte GV-N96TSL-1 graphics card in on a
Gigabyte GA-P55A-UD4P motherboard. This board supports (future
technology) features USB 3.0 and 6.0 GB/s SATA, however if either of
these features are used, the first PCI Express slot goes from 16X to 8X.
My question is whether this reduction (to 8x) would be expected to affect
the graphics performance of the system. I was thinking that maybe since
it is a relatively "slow" card, that it might not--but I really have no
idea. Thank you for sharing whatever thoughts you may have concerning
this.

Bill


If the slot ran x8 PCI Express Rev2, that is 8*500MB/sec or 4GB/sec.
That is roughly equivalent to twice what you'd get with AGP 8x.

The older generation PCI Express Rev1.1 x16 slot, would have given
you 16*250MB/sec or 4GB/sec as well. So x8 operation in Rev2 mode,
is still pretty good, and comparable to x16 in Rev1.1 mode.

If it were to have an impact, which I doubt, it would be at the 5%
to 10% level while gaming.

Tomshardware did some tests years ago, where they used cello tape,
and insulated various numbers of PCI Express lanes. You can use those
results, to understand the shape of the performance curve. The
effects are worst for one particular kind of benchmark, and not
nearly as pronounced on real games.

(SpecViewPerf suffers, when PCI E is slowed down)
http://www.tomshardware.com/reviews/...ing,927-9.html

Such a set of test cases, would need to be repeated for the
more powerful processor and GPU combinations available today.
I can't guess at what the performance curve would be. The impact
should be pretty small, but only a real benchmark series, such
as the Tomshardware article, is needed to be sure.

*******

On the architecture front, the reason your question is intriguing, is

Why would "Northbridge" PCI Express interfaces, have any
relationship to what is done on the "Southbridge" ?

I downloaded the manual for your motherboard, and I do see the section
in question. I don't doubt there is an issue there.

ftp://download.gigabyte.ru/manual/mb...a-ud4(p)_e.pdf

I also have a copy of the P55 ("Southbridge") spec 322169.pdf, and
what is interesting in there, is the chip seems to have integrated
clock generation. That might not be the only way to do it.
It may be possible to use an external clock generator. My
guess is, that they're using the integrated clock generation.
That saves money.

The P55 spec is 892 pages long, and I'm not going to read the whole
thing. Even if I was paid to do it, there wouldn't be enough hours
in a day, to read the whole thing, look for every "*" or "Note" in
the document, and figure out what evil they're up to. I was not
able to find a reference to a register controlling clock generation,
due to the limits of the Adobe Acrobat version 9 PDF reader
(piece of crap). I wish Intel would use an older version of
PDF compatibility, so I could use an older version of Acrobat.

The P55 has two PCI Express Rev2 compliant clock outputs (150pS
jitter spec). I can see one output going to PCI Express slot 1.
The second clock output would go to the PCI Express switch chip,
which routes the remaining x8 of bandwidth, to either the first
or second video card slot. Maybe the switch chip makes more
outputs ? We don't even know what chip is used.

Great, we have PCI Express Rev2 video slots, and PCI Express Rev1
Southbridge PCI Express interfaces.

Now, when Gigabyte wants to run the add-on peripheral chips with
PCI Express Rev2 compliant speeds, it needs the low jitter clocks
for that. Where the hell are those clocks coming from ? Perhaps
it is the lack of good quality clock signals, that causes this
limitation, and interaction between Northbridge (Video) and
Southbridge (Peripheral) PCI Express interfaces. I doubt it
very much, that the PCI Express switch chip, is being used to
supply both video and peripherals at the same time - the Gigabyte
architecture diagram in the manual seems to discount that.

Very peculiar... and sucky.

I wonder how much more it would have cost, to use an external
clockgen, or if it is even possible ?

I don't know why this interaction exists, but it could be
because of Intel's half baked built-in clock generator.
Anyone who has worked out clock distribution architectures
on a PCB, knows that additional clock outputs are golden,
and allow amazing things to be done. Cheap out on them,
and some poor PCB designer will be sweating gumdrops,
trying to make their design work. At the moment, I don't
even know how Gigabyte managed to do what they've done.

There are devices, that allow buffering and creation of
more clock signals. But once you use such a device, you
degrade the clock quality. That is why it isn't a trivial
matter to solve.

*******

I think in the Tomshardware article, you can see it takes
a pretty serious degradation of the video slot bandwidth,
before it ruins your video performance. In your case,
I wouldn't lose any sleep over it. However, if I bought
a $600 video card, and it removed even a few percentage
points from it, I'd be ****ed - because I want to get my
$600 worth of performance.

Paul



Thank you for your very detailed reply, and for the link to the Tomshardware
article. As you put it, I'm not going to lose any sleep over the 8x issue.
From what I have read, the reason the board works the way it does is because
of the P55 chip. Evidently there isn't any way to overcome the shortcoming I
mentioned (dropping to 8x) because of that chip--if you want more, you need
to spend a little more and go to the X58 chipset... I don't do any serious
gaming. The occasional adventure game, Google SketchUp. It appears I should
be okay with the components I mentioned. I was sort of waiting to see how the
Intel X25-M SSD / TRIM issues played out, and I haven't heard much
lately--which I guess is a good thing.

Peace,
Bill


 



