Upping core voltage on 9800 Pro/XT



 
 
  #1  
Old July 12th 04, 11:15 PM
Neil
external usenet poster
 
Posts: n/a
Default Upping core voltage on 9800 Pro/XT

My 9800 Pro (with an R360 core but a Pro PCB, which
I believe makes it a BBA) now has an XT BIOS. (If
you're interested, I had more luck in the end with
a BIOS I found for an HIS card, modified with a
128MB limit, than with the modified Club3D BIOS
that many people seem to use. Both BIOSes were
considerably older than the one installed on my
card originally; I wonder what the intervening
updates might have included.) My card nearly runs
at XT clock speeds (412/365) flawlessly, but not
quite. At the moment I have the case open and a
desk fan directed into it. In this fashion,
ATitool tells me I can have stable settings of
about 403/373. At those settings (needless to say)
3DMark03 runs fine. With the default settings
everything is fine apart from a few (very few,
probably fewer than 10 or 20) single-pixel snowy
sparkles that show up in Mother Nature. You might
not even notice them if you weren't looking. [I
have run FarCry (not for very long) at the default
speeds and not noticed anything untoward.]

To be honest, I don't think there's much wrong
with the card at these speeds, but the thing is I
know it's not working perfectly. I shouldn't have
thought it was the GPU temperature causing the
sparkles, since I've fitted the Arctic VGA
Silencer, which claims to operate the GPU at much
lower temperatures than the standard cooler (and
the desk fan's there too). I thought I might
benefit from increasing the voltage to the GPU.

I did try changing the AGP signalling voltage to
1.6V, but that made no difference (I think), and I
can't really see why it should make any difference
to the core. I've googled a little and found the
Rojack Pot articles on volt-modding the Pro and
the XT. I'm pretty certain I need to follow the
instructions in the Pro article, as I just don't
have the components described in the XT article.
I'll almost certainly do it by pencilling my
resistors in. I think the R360 has a default
voltage slightly higher than the R350's. (The
articles mention nearly 1.8V measured on a
standard XT compared with just 1.7V or less on the
standard Pro.) I'm wondering whether the Pro PCB
is modified to provide the higher core voltage
when the R360 core is fitted. I guess I'll find
out when I measure the stock voltage.
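
[In case it helps anyone contemplating the same
mod: my understanding is that the pencil trick
works because the regulator sets the core voltage
with a feedback divider, and graphite drawn over
the feedback-to-ground resistor adds a parallel
resistance that raises the output. The numbers
below are invented to land roughly on the Pro/XT
voltages; I haven't measured my board or
identified the actual controller, so treat it as a
sketch, not gospel:

# Pencil-mod arithmetic, assuming a standard feedback-divider
# regulator: Vout = Vref * (1 + R_top / R_bottom).
# Vref and the resistor values are ILLUSTRATIVE, not measured.

def parallel(r1, r2):
    """Effective resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

v_ref = 0.8        # regulator reference voltage (assumed)
r_top = 1130.0     # output-to-feedback resistor, ohms (assumed)
r_bottom = 1000.0  # feedback-to-ground resistor, ohms (assumed)

print(f"stock:  {v_ref * (1 + r_top / r_bottom):.2f} V")  # ~1.70 V (Pro-ish)

# Graphite over the feedback-to-ground resistor sits in parallel
# with it, lowering its effective value and raising Vout:
r_pencil = 10000.0  # resistance of the graphite trace (varies hugely)
v_mod = v_ref * (1 + r_top / parallel(r_bottom, r_pencil))
print(f"modded: {v_mod:.2f} V")  # ~1.79 V (XT-ish)

The softer the pencil and the more you shade, the
lower the graphite resistance and the higher the
voltage, which is why the articles say to go
gently and re-measure.]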

So my questions are:
Is it likely that increasing the GPU voltage will
rid me of my sparkles at 412MHz?
Is there a sensible way to measure the GPU
temperature?
Any tips on pencilling (beyond the article) from
someone who's done it?

Thanks.

Neil


  #2  
Old July 12th 04, 11:48 PM
pudj
external usenet poster
 
Posts: n/a
Default


"Neil" wrote in message
...
...snip...

Strange, my Sapphire 9800 Pro 128MB (256-bit) runs flawlessly at 411/378 (3 hrs
of 3DMark2001 SE looping) and I haven't added any extra cooling; it's the R350
chip. Is that normal? My defaults are 378/337.5. I don't overclock usually, but
because everyone seems to be talking about it I thought I'd test my card. Is 3
hrs a long enough test? No problems, though: everything ran smooth, no pixel
locks or artifacts. Is my card special, lol?


  #3  
Old July 13th 04, 01:07 AM
Sham B
external usenet poster
 
Posts: n/a
Default

I've fitted the Arctic VGA
Silencer, which claims to operate the GPU at much
lower temperatures than the standard cooler (and
the desk fan's there too). I thought I might
benefit from increasing the voltage to the GPU.

I did try changing the AGP signalling voltage to
1.6V, but that made no difference (I think), and I
can't really see why it should make any difference
to the core.


Um. I have an Arctic on a 9800 Pro and can get comfortably up to 430/370 (max
is 445/380). Although variability is a factor in silicon in general, your
maximum clock still seems low for that cooler. Have you got the high and low
fan settings mixed around? I keep mine on high all the time, btw. Also it might
be worth checking that you are getting good thermal contact.

Upping the voltage makes the card hotter, which more than likely makes any
heating problem worse.

Could it be that you simply have a card with a hot GPU? I have the same problem
with my processor, an XP2800, which simply crashes if the temp goes beyond 60C
for more than a second, so any real overclocking is out. Luck of the draw.

S



  #4  
Old July 13th 04, 05:42 AM
Inglo
external usenet poster
 
Posts: n/a
Default

On 7/12/2004 5:07 PM Sham B brightened our day with:

...snip...



I have an Arctic cooler on my 9800 Pro and the max core overclock I get with
ATITool is ~402. I don't know about ATITool, though; it spots artifacts that I
don't even see.
Before I ever used ATITool I ran my old 9600 Pro at 475 core (400 default) and
never had a problem, but when I tried ATITool with that card it said the max
overclock was ~448.
I'm kind of frustrated; I thought ATITool would clear things up, but it just
makes things more confusing for me.

And how is it possible that ATITool says my memory can handle a max of 372?
I've got regular Hynix memory, and from what I've read people have a hard time
getting that to run at 365 even with RAM sinks on.

What do experienced 9800 overclockers use to judge their max overclock? Is
ATITool too sensitive to artifacts?
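
[For what it's worth, my mental model of what a scanner like ATITool does:
render a known scene over and over and diff each frame against a reference,
pixel by pixel, so even a single wrong pixel out of ~786,000 gets flagged. A
toy sketch of the idea; this is definitely not ATITool's actual code:

import numpy as np

def count_artifacts(reference, frame, tolerance=0):
    """Count pixels in `frame` that deviate from `reference`.

    reference, frame: (H, W, 3) uint8 arrays of the same rendered
    scene; tolerance: per-channel difference allowed before a pixel
    counts as an artifact (0 = any single-bit error is flagged).
    """
    diff = np.abs(frame.astype(np.int16) - reference.astype(np.int16))
    bad = (diff > tolerance).any(axis=2)  # True where any channel is off
    return int(bad.sum())

ref = np.zeros((768, 1024, 3), dtype=np.uint8)  # stand-in rendered frame
test = ref.copy()
test[100, 200, 0] = 255                         # one "snowy sparkle"
print(count_artifacts(ref, test))               # -> 1

With zero tolerance it flags errors the eye has no chance of seeing, which
would explain the gap between its numbers and what we notice in games.]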

--
"You keep using that word. I do not think it means what you think it means."

- Inigo Montoya
Steve [Inglo]
  #5  
Old July 13th 04, 09:13 PM
Neil
external usenet poster
 
Posts: n/a
Default


"pudj" wrote in
message ...

strange my sapphire 9800 pro 128meg(256bit) runs

flawless at
411 378(3 hrs of 3d mark 2001se looping) and i

havent added
any extra cooling its the r350 chip.Is that

normal?

I have read (on the internet, so it must be true)
that the R350 generally overclocks better than the
R360. If your core were an R360 I'd recommend the
BIOS upgrade (given you can run at those speeds).
It certainly affected my benchmark score
significantly. [In 3DMark03: standard score 5550;
upgrade the BIOS only, 5760; up the clock speed to
XT only, 5900; do both, 6200.] I only have one
demanding game, and I never really studied the
frame rate closely enough to see if the upgrade
made any difference there.
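
[Worked out as percentages, since the raw scores
hide it a bit - a trivial check on those numbers:

# Relative gains from the 3DMark03 scores quoted above.
baseline = 5550
for label, score in [("BIOS flash only", 5760),
                     ("XT clocks only", 5900),
                     ("both together", 6200)]:
    print(f"{label}: +{100 * (score - baseline) / baseline:.1f}%")
# BIOS flash only: +3.8%
# XT clocks only: +6.3%
# both together: +11.7%

Interesting that the BIOS flash alone is worth
more than half of what the clock increase gives.]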

My defaults are 378/337.5. I don't overclock
usually, but because everyone seems to be talking
about it I thought I'd test my card. Is 3 hrs a
long enough test? No problems, though: everything
ran smooth, no pixel locks or artifacts. Is my
card special, lol?


I think 3 hrs is enough. I used ATitool and tried
to make it defect-free for about 2 hours or so,
which is where I came up with my 403MHz figure.
You might try ATitool, though if you're happy with
your current results it might put you off a bit:
it spots defects a long time before your eye does,
and your maximum defect-free clock speeds might
come down. (See someone else's post below.)

Neil


  #6  
Old July 13th 04, 09:30 PM
Neil
external usenet poster
 
Posts: n/a
Default


"Inglo"

wrote in message
m...

I have an arctic cooler on my 9800 Pro and the

max
core overclock I get with ATITool is ~402. I

don't know
about ATITool though, it spots artifacts that I

don't even see.

...snip...

What do experienced 9800 overclockers use to

judge their
max overclock? Is ATITool to sensitive to

artifacts?


You don't want to talk to me then: I'm a noob. But
I know what you mean about ATitool picking out
very small artifacts, and I'm undecided whether
that is a good or a bad thing. I test my CPU &
system memory overclock with Prime95 and insist
that it's defect-free; other people just accept
that their machine hardly ever crashes and say
that's OK. On the other hand, I can't see how an
imperceptibly defective GPU overclock could lead
to (say) corrupted data on a hard disc, whereas
I've certainly done that (before I knew better)
with a CPU & system memory overclock.

Neil


  #7  
Old July 13th 04, 09:46 PM
Neil
external usenet poster
 
Posts: n/a
Default

I've fitted the Arctic VGA Silencer, which claims
to operate the GPU at much lower temperatures
than the standard cooler (and the desk fan's
there too). I thought I might benefit from
increasing the voltage to the GPU.


Um. I have an Arctic on a 9800 Pro and can get
comfortably up to 430/370 (max is 445/380).


By ATitool or by eye?

Although variability is a factor in silicon in
general, your maximum clock still seems low for
that cooler. Have you got the high and low fan
settings mixed around? I keep mine on high all
the time, btw.


It's on high.

Also it might be worth checking that you are
getting good thermal contact.


I never removed the shim from the card, but I cut
some slots in the VGA Silencer so it fits over the
outside. I certainly think it's not being held off
the core. I never managed to get the screws to do
up all the way to the rubber washers; I couldn't
get enough purchase on the little screw heads, and
the clip seems too strong to bend. It does seem
firmly attached, though. In your opinion, would
that be significant?

Upping the voltage makes the card hotter, which
more than likely makes any heating problem worse.


I was guessing I didn't have a heat problem, just
an inability-to-handle-the-frequency problem.
That's why I thought a core voltage increase might
help - in a get-the-capacitors-charged-to-a-
threshold-voltage-faster sort of way. But I'm not
sure.
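
[The textbook version of that hand-wave, as far as
I understand it, is the alpha-power law for CMOS
gate delay: the attainable clock rises as the
supply climbs further above the transistor
threshold voltage. Whether it applies cleanly to
the R360 I couldn't say:

f_{\max} \propto \frac{(V_{dd} - V_t)^{\alpha}}{V_{dd}},
\qquad 1 < \alpha \le 2

so a small bump in V_dd buys proportionally more
headroom when V_dd is close to V_t.]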

Could it be that you simply have a card with a hot
GPU? I have the same problem with my processor, an
XP2800, which simply crashes if the temp goes
beyond 60C for more than a second, so any real
overclocking is out. Luck of the draw.

I can't really complain: I have some lucky PC2100
system memory that runs OK at 164MHz with low
latency. (It will pass Memtest at 175MHz if I
increase the latency, but either differences
between the test and WinXP, or the extra stress
the OS puts on other things (processor, chipset,
AGP/PCI, I don't know), mean I can't get a stable
system at that FSB.)

I win some, I lose some. [At least I got an
R360].

Neil


  #8  
Old July 13th 04, 10:55 PM
Neil
external usenet poster
 
Posts: n/a
Default

Is there any correlation between core type, PCB
type and max. core speed?

So far in this thread we have:

Poster  Core  PCB       Max core
Me      R360  Pro PCB   403MHz
Inglo   ?     ?         402MHz
Pudj    R350  Pro PCB?  411MHz (more?)
ShamB   ?     ?         430/445MHz

Not necessarily all done through ATitool.

I'm still curious as to whether the Pro PCB
undervolts the R360 core, leading to it
overclocking poorly. If anyone wants to add to
this list, feel free. I think I'll get my meter
out and look at the core voltage.

Neil

[A curious point: my video card supplier (Scan)
stuck their own warranty-invalidation sticker
right over the place where you might want to
solder or measure voltages to increase the core
voltage. They're not as green as they're cabbage
looking.]


  #9  
Old July 13th 04, 11:34 PM
Sham B
external usenet poster
 
Posts: n/a
Default


By ATitool or by eye?

I use Powerstrip to up the clocks. I don't use ATITool; rather, I slowly set
the OC up over time whilst playing my normal games. (Most of the recent
ones... I find that Lock-on is the best test of the GFX card and CPU, although
it's prolly not a popular game, but I've also recently played BV and KOTOR a
lot.) I run 10MHz down from the max clock that starts causing artifacts (i.e.
noticeable by me when I play). So it's 'by eye'.
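
[Written down as a procedure, it would be something like the sketch below.
shows_artifacts() is a stand-in for hours of actual gameplay, obviously, and
the 440MHz threshold is invented:

def shows_artifacts(mhz):
    """Placeholder for hours of gameplay at `mhz` core clock -- not a
    real API. Pretend sparkles start at 440MHz on this card."""
    return mhz >= 440

def find_daily_clock(default_mhz, step=5, backoff=10, ceiling=500):
    """Raise the core clock in small steps, note the first speed that
    shows artifacts, then settle `backoff` MHz below it."""
    clock = default_mhz
    while clock + step <= ceiling:
        clock += step
        if shows_artifacts(clock):
            return clock - backoff  # first bad clock minus the margin
    return clock                    # never saw artifacts up to ceiling

print(find_daily_clock(378))        # -> 433 with the made-up threshold

The step size sets how precisely you find the edge; the backoff is what keeps
day-to-day variation (hot days, long sessions) from pushing you over it.]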


I was guessing I didn't have a heat problem, just
an inability-to-handle-the-frequency problem.
That's why I thought a core voltage increase might
help - in a get-the-capacitors-charged-to-a-
threshold-voltage-faster sort of way. But I'm not
sure.

Um. Yeah, that might be the case, but the point is that your core seems to be
low on overclockability to start with, and upping the core voltage just
increases the severity of whatever is holding it back... If it's some
transistors that are at the low end of spec, then you are increasing the
chances of burning them out, and if it's overall heating of the GPU, then you
are adding to the problem.

If you must overvolt (and realistically, if I were in your position, I know I
would try it, being a consummate tinkerer), I'd be tempted to suggest starting
with an underclock + overvolt (rather than hitting the GPU with an overclock
and an overvolt without knowing what is really holding it back) and moving
forward from there.
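
[As a sketch, the idea is to separate the two variables: fix a voltage, find
the max stable clock at that voltage, then step the voltage up. test_stable()
is a stand-in for a real stability run, not an actual API:

def max_stable_clock(voltage, clocks, test_stable):
    """Highest clock in `clocks` that passes testing at `voltage`;
    None if even the lowest fails."""
    best = None
    for clock in sorted(clocks):
        if test_stable(voltage, clock):
            best = clock
        else:
            break  # past the edge; higher clocks won't pass either
    return best

# Walk the voltage up gently, mapping the clock ceiling at each step:
# for v in (1.70, 1.75, 1.80):
#     print(v, max_stable_clock(v, range(380, 460, 5), test_stable))

That tells you whether the ceiling actually moves with voltage (frequency
limited) or stays put (something else, e.g. heat, holding it back).]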

S


  #10  
Old July 14th 04, 02:56 PM
Inglo
external usenet poster
 
Posts: n/a
Default

On 7/13/2004 2:55 PM Neil brightened our day with:

Is there any correlation between core type, PCB
type and max. core speed?

...snip...



I have an R350 core on a Pro PCB and used ATITool.
Computer chips are just not all created equal; I think it would be ridiculous
for me to feel disappointed in my overclock results.

I'm getting 50+ fps in FarCry at relatively high settings with the core at 396
and the mem at 366, so I'm pleased enough.
The only reason to go higher is my inveterate tinkering nature.

--
"You keep using that word. I do not think it means what you think it means."

- Inigo Montoya
Steve [Inglo]
 



