HardwareBanter - a computer components & hardware forum.


Page File size - I've been thinking.



 
 
  #31
October 19th 07, 05:59 AM, posted to alt.comp.hardware
Lord Turkey Cough


"startrap" wrote in message
...

Defragging makes a difference if you use programs that access the disk frequently. Reading a contiguous file is faster than reading a fragmented file, although it does depend on the actual size of the file and the degree of fragmentation (e.g. 2 fragments = not a big deal; 50 fragments = bad). A fragmented MFT (Master File Table) is particularly bad for performance.


This is a pretty negligible amount of time. After all, your computer arrives defragged, most of the programs you add to it will be added in the early days, and hence to a 'clean' drive, and hence defragged. After loading, everything is run from memory anyway; it does not matter if the original file was fragmented.

Furthermore, the files associated with a program will be put in the most convenient place on the drive initially; when you then defrag the drive, those files will be scattered pretty randomly over the drive. The defragger cannot know which programs those files are normally accessed by - impossible.


Defragmentation also reduces drive wear, because reading/writing a file contiguously stresses it relatively less (mechanically, that is), since the actuator arm does not have to go nuts trying to pick up fragments from around the platter. Finally, this last point saves battery power on laptops, though it is not a factor for desktops.


The last point I made about files associated with programs nullifies your point. Just think about the wear and tear your background program causes looking for fragmented files too!

The first time I ever defragged my computer, to speed up the start-up time, I timed it to see 'how much faster' it was. If anything it appeared to be slower (honest!). I have not really bothered much with it after that; it seems to make little or no difference. The six hours or so of constant disk activity didn't really endear me to the idea either!

Like virus scanning, it is by and large a waste of time. It never finds anything barring red herrings.



As for automatic defragmentation, it's not that the drive runs all the time during idle... only when necessary, i.e. a few minutes a day. It's hardly a bother.

Lord Turkey Cough;155961 Wrote:

Can't say I bother with defragging at all; never noticed any performance difference after defragging (actually seemed slower), so I just don't bother anymore. Don't like the idea of a background defragger either; I prefer my computer to be silent when idle, and constant disk activity would drive me nuts.



Anyway, all this fragmentation stuff is OT lol. Coming back to the
paging file: is there any demonstrable performance increase in NOT
having a paging file, or is it just a 'feeling' that everything is
faster? (Though that's good enough for me lol) Any hard performance
data? I am really curious about this, but too chicken to actually try it
out on my rig.


Well, I have taken the plunge and tried it out on my 'rig', and whilst I have no data to prove it is better, it certainly feels no worse. I think my machine is quieter, but I can't prove that really; all I can say is that I have 540 meg of free memory, so I can't really see much reason for any disk activity, and indeed there does not appear to be much if any of that, even though I have a few programs running which are connected to advertising stuff etc.


My rig at the moment is a C2D E6550 on a Gigabyte P35-DS3R mobo with 2 GB of RAM, 2x160GB + 1x250GB HDDs, a 7800GT 256MB and XP Pro. It's a decent rig with sufficient RAM, but I still leave the paging file on (in a small, separate partition) because, frankly, I am very scared of crashes in the middle of something important (I use Photoshop and huge DSLR RAW files a lot) and losing my data.


I think mine locked up once early on, but that could have been due to a number of reasons, as it has done that before with paging on. Since then it has been fine, and that's about 5 days now.

Also, before, it could get into a state where it ran incredibly slowly due to constant disk activity, so I had to wait until that stopped; quite frankly it would be quicker and better to reboot!

Personally I would just give it a go, otherwise you will never know. 2 gig is a lot; it's not that long ago I only had 2 gig of drive space! I am sure you have run into problems even with paging on, so what have you got to lose?

If you consider the massive difference between the access times of RAM and the hard drive, then quite frankly it is counterproductive. For example, I sometimes run a statistical program on tens of thousands of poker hand history files. The first run takes ages, as all the files are on disk; after that, when the files are cached in memory, it is much faster, by a factor of at least 20, maybe 50 or 100. So paging, to my mind, is rather pointless; if you get to the stage where you are paging a lot you would probably be better off rebooting!
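That first-run-slow, second-run-fast effect is the OS file cache at work, and it is easy to demonstrate. A minimal Python sketch along these lines would show it; the hands/*.txt pattern is a hypothetical folder of small hand-history files, not anything taken from this thread:

    import glob
    import time

    def read_all(pattern):
        """Read every matching file; return (bytes read, seconds taken)."""
        start = time.time()
        total = 0
        for name in glob.glob(pattern):
            with open(name, "rb") as f:
                total += len(f.read())
        return total, time.time() - start

    size, cold = read_all("hands/*.txt")  # first pass: hits the platter
    size, warm = read_all("hands/*.txt")  # second pass: served from the RAM cache
    print(f"{size} bytes: cold {cold:.2f}s, warm {warm:.2f}s, "
          f"speedup {cold / warm:.0f}x")

On a drive of this era the cold pass is dominated by roughly one seek per small file, which is consistent with the 20-100x gap described above.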

No harm in trying it.







  #32
October 19th 07, 05:17 PM, posted to alt.comp.hardware
startrap


Lord Turkey Cough;157172 Wrote:
"startrap" wrote in message
...

This is a pretty negligible amount of time.


Not for a HDD. Even a few milliseconds is a long time *for the system*. When you consider track seek time, rotational latency, settle time etc. for each fragment that the drive has to pick up sequentially, it can have a substantial impact on performance. But as I mentioned, the degree of fragmentation of the files is key. Apart from the optical drives, the hard drive is the slowest component of the PC because of its mechanical operation, so if it runs even slower due to heavy fragmentation, that's not good.
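To put rough numbers on that, here is a back-of-envelope sketch in Python. The 12 ms per-fragment access time and 45 MB/s transfer rate are assumed round figures (close to the benchmark numbers quoted later in this thread), not measurements:

    # Assumed figures, not measurements: ~12 ms average seek + rotational
    # latency per fragment, ~45 MB/s sustained sequential transfer.
    ACCESS_MS = 12.0
    TRANSFER_MB_S = 45.0

    def read_time_ms(file_mb, fragments):
        """Rough time to read a file split into N fragments."""
        transfer = file_mb / TRANSFER_MB_S * 1000.0  # pure data transfer
        seeks = fragments * ACCESS_MS                # one access per fragment
        return transfer + seeks

    for frags in (1, 2, 50):
        print(f"10 MB file in {frags:2d} fragment(s): "
              f"{read_time_ms(10, frags):4.0f} ms")
    # Roughly 234 ms, 246 ms and 822 ms: two fragments are harmless,
    # fifty are not, which matches the rule of thumb earlier in the thread.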

After all, your computer arrives defragged, most of the programs you add to it will be added in the early days, and hence to a 'clean' drive, and hence defragged. After loading, everything is run from memory anyway; it does not matter if the original file was fragmented.


Not really. I keep adding and deleting programs quite often, and with the size of today's files fragmentation can build up quickly. And fragmentation affects all files, not just 'programs'... modify any file and it may get fragmented, or cause free-space fragmentation if it is flanked by other files.

Er... 'the loading into memory' is what is affected by fragmentation. As is writing to the drive. Once it's in RAM, it shouldn't matter, unless you are *gasp* paging it to the HDD and the page file itself is fragmented.


Furthermore, the files associated with a program will be put in the most convenient place on the drive initially,


Not necessarily. In NTFS, a file gets put into the first bits of free space available, which might or might not be fragmented free space.


when you then defrag the drive, those files will be scattered pretty randomly over the drive.


Not at all. Defragmenters consolidate files and directories.


The defragger cannot
know which programs those files are normally accessed by - impossible.

Actually, they can. At least, most of the new ones offer sequencing and placement options based on a number of file attributes.


The last point I made about files associated with programs nullifies your point. Just think about the wear and tear your background program causes looking for fragmented files too!


Once the files are defragmented, the head can pick them up sequentially, so there is no extra wear and tear. A defragmented drive with well-consolidated free space suffers less fragmentation during future file writes.

And the auto defraggers don't go to work 24x7; as I said, only when necessary, and using the barest minimum of resources. Usually, they would run for a few minutes a day at the most.

Better than the head going crazy *each time* it has to pick up a
fragmented file.


The first time I ever defragged my computer, to speed up the start-up time, I timed it to see 'how much faster' it was. If anything it appeared to be slower (honest!). I have not really bothered much with it after that; it seems to make little or no difference. The six hours or so of constant disk activity didn't really endear me to the idea either!


You are right, that's quite a departure from the norm. It has never been the case in my experience. Usually, manual defragmentation ought to go as follows: [defragmentation of files] -> [boot-time defrag to defrag the MFT, paging file etc.] -> [final file defrag]. Once this is done, you are all set.


Like virus scanning, it is by and large a waste of time. It never finds anything barring red herrings.


Not a waste of time at all, since it is completely automatic in nature. And it is useful for those who use their systems heavily. I game, use Photoshop, and my PC is my main entertainment device in my room, so defragging definitely helps me.

As for AV scans, if your AV setup is good in the first place, no viruses will get through the net; but fragmentation is an inherent trait (er, 'feature', thanks Microsoft!) of the FAT and NTFS file systems. Others such as ext3 don't suffer as much from this.




Well, I have taken the plunge and tried it out on my 'rig', and whilst I have no data to prove it is better, it certainly feels no worse. I think my machine is quieter, but I can't prove that really; all I can say is that I have 540 meg of free memory, so I can't really see much reason for any disk activity, and indeed there does not appear to be much if any of that, even though I have a few programs running which are connected to advertising stuff etc.


If you say there is no drawback or benefit from disabling the paging file, apart from the relative lack of HDD activity, then it does not seem necessary to take the risk. Maybe I can try it out on my office PC, which is, er, 'expendable' and ironically contains no important data.


I think mine locked up once early on, but that could have been due to a number of reasons, as it has done that before with paging on. Since then it has been fine, and that's about 5 days now.

Also, before, it could get into a state where it ran incredibly slowly due to constant disk activity, so I had to wait until that stopped; quite frankly it would be quicker and better to reboot!


That slowdown could have been due to a number of reasons, including fragmentation, a fragmented paging file, background processes/programs accessing the disk, etc.


Personally I would just give it a go, otherwise you will never know. 2 gig is a lot; it's not that long ago I only had 2 gig of drive space! I am sure you have run into problems even with paging on, so what have you got to lose?

Actually, I've never had any problems with the paging file being enabled, since it sits inside its own little partition on the outer edge of the platter. In fact, I can't remember the last time my system BSODed or hard-crashed. It has always run smoothly since I first built it 2 years ago with an A64/1GB RAM as the starting point. I upgraded the system to Intel only recently.


If you consider the massive difference between the access times of RAM and the hard drive, then quite frankly it is counterproductive. For example, I sometimes run a statistical program on tens of thousands of poker hand history files. The first run takes ages, as all the files are on disk; after that, when the files are cached in memory, it is much faster, by a factor of at least 20, maybe 50 or 100. So paging, to my mind, is rather pointless; if you get to the stage where you are paging a lot you would probably be better off rebooting!


You do have a point, that RAM is always much faster than the HDD, but
it still has to get the poker files from the HDD to the RAM, and that's
where the bottleneck comes in. I doubt paging has much to do with it.


No harm in trying it.

I probably will, but on my office PC



  #33
October 20th 07, 03:12 AM, posted to alt.comp.hardware
Lord Turkey Cough


"startrap" wrote in message
...

Lord Turkey Cough;157172 Wrote:
"startrap" wrote in message
...

This is a pretty negligible amount of time.

Not for a HDD. Even a few milliseconds is a long time *for the system*. When you consider track seek time, rotational latency, settle time etc. for each fragment that the drive has to pick up sequentially, it can have a substantial impact on performance. But as I mentioned, the degree of fragmentation of the files is key. Apart from the optical drives, the hard drive is the slowest component of the PC because of its mechanical operation, so if it runs even slower due to heavy fragmentation, that's not good.


Well, it has to do that anyway: even if the whole file is unfragmented, it has to find the file. When a file is written to a fragmented disk, I would imagine it puts it in the places which are quickest to access (seems sensible), so I doubt the access overhead would be that much.


After all, your computer arrives defragged, most of the programs you add to it will be added in the early days, and hence to a 'clean' drive, and hence defragged. After loading, everything is run from memory anyway; it does not matter if the original file was fragmented.

Not really. I keep adding and deleting programs quite often, and with the size of today's files fragmentation can build up quickly. And fragmentation affects all files, not just 'programs'... modify any file and it may get fragmented, or cause free-space fragmentation if it is flanked by other files.


But it's not a great overhead, all things considered.

Er... 'the loading into memory' is what is affected by fragmentation. As is writing to the drive. Once it's in RAM, it shouldn't matter, unless you are *gasp* paging it to the HDD and the page file itself is fragmented.


I don't use a page file anymore. I think it is better to ensure you never need a page file by not overloading your system.


Furthermore, the files associated with a program will be put in the most convenient place on the drive initially,

Not necessarily. In NTFS, a file gets put into the first bits of free space available, which might or might not be fragmented free space.


However, it's likely to be on the same track or the nearest track to the read head, so not too much work. Next time you use that file, the read head is also likely to be in a similar position, unless of course you have defragged, in which case the file will likely be in some random position on the disk.



when you then defrag the drive, those files will be scattered pretty randomly over the drive.

Not at all. Defragmenters consolidate files and directories.


And that is what I am saying could be the cause of the problem. A file will have been moved from what was a convenient place to access into a different place, based upon directory structures. Initially the required files might have been written on the same track; now they will be pretty much scattered randomly all over the drive.


The defragger cannot know which programs those files are normally accessed by - impossible.

Actually, they can. At least, most of the new ones offer sequencing and placement options based on a number of file attributes.


I don't think that will be helpful.



The last point I made about files associated with programs nullifies your point. Just think about the wear and tear your background program causes looking for fragmented files too!

Once the files are defragmented, the head can pick them up sequentially, so there is no extra wear and tear. A defragmented drive with well-consolidated free space suffers less fragmentation during future file writes.


Whilst the files themselves may be defragmented, a set of files used as a functional group is likely scattered all over the drive.

It's a bit like an untidy desk: it may look untidy, but things tend to get grouped together automatically by usage. Everything for a particular function will tend to be grouped together by last usage, which is likely the most convenient grouping for its next usage. When you tidy up that desk, you destroy that 'natural grouping'; things become grouped by criteria unrelated to their most likely usage.

And the auto defraggers don't go to work 24x7; as I said, only when necessary, and using the barest minimum of resources. Usually, they would run for a few minutes a day at the most.

Better than the head going crazy *each time* it has to pick up a fragmented file.


I don't think that would happen; the fragments would initially be written to the most convenient space, and hence be in a convenient place when it comes to reading them again.


The first time I ever defragged my computer, to speed up the start-up time, I timed it to see 'how much faster' it was. If anything it appeared to be slower (honest!). I have not really bothered much with it after that; it seems to make little or no difference. The six hours or so of constant disk activity didn't really endear me to the idea either!

You are right, that's quite a departure from the norm. It has never been the case in my experience. Usually, manual defragmentation ought to go as follows: [defragmentation of files] -> [boot-time defrag to defrag the MFT, paging file etc.] -> [final file defrag]. Once this is done, you are all set.


Well, whatever the case, I don't find fragmentation an issue for me. My disk does not go crazy in general, and if it does, I am pretty sure it is nothing to do with fragmented files; more likely to do with excessive paging. My view is that once it starts trying to use your hard drive as RAM, you may as well give up; the difference in access times is colossal.


Like virus scanning, it is by and large a waste of time. It never finds anything barring red herrings.

Not a waste of time at all, since it is completely automatic in nature. And it is useful for those who use their systems heavily. I game, use Photoshop, and my PC is my main entertainment device in my room, so defragging definitely helps me.


Well, in my experience it makes no noticeable difference. I did it several times on my old system and it seemed exactly the same, if not worse. Even if you defrag individual files, you will often be working with hundreds of small files anyway, which is the same as one file in a hundred fragments. Defragging may well put these 100 files in less convenient places than those which they were initially in, so it's swings and roundabouts. I certainly have no intention whatsoever of defragging any of my drives at the moment. I think it would more likely make things worse than better, and as it is fine at the moment, it is not a risk I am prepared to take.

As for AV scans, if your AV setup is good in the first place, no viruses will get through the net; but fragmentation is an inherent trait (er, 'feature', thanks Microsoft!) of the FAT and NTFS file systems. Others such as ext3 don't suffer as much from this.


Anything Microsoft produces is rubbish; it takes 2 seconds to pop up my volume control, from RAM. No amount of defragging will make a silk purse out of a cow's ear. Enough said.




Well, I have taken the plunge and tried it out on my 'rig', and whilst I have no data to prove it is better, it certainly feels no worse. I think my machine is quieter, but I can't prove that really; all I can say is that I have 540 meg of free memory, so I can't really see much reason for any disk activity, and indeed there does not appear to be much if any of that, even though I have a few programs running which are connected to advertising stuff etc.


If you say there is no drawback or benefit from disabling the paging file, apart from the relative lack of HDD activity, then it does not seem necessary to take the risk. Maybe I can try it out on my office PC, which is, er, 'expendable' and ironically contains no important data.


Well, I was a little worried at first ("Will it crash?" I thought), but it has been fine for about a week now, and considerably quieter I would say. Certainly no noisier.


I think mine locked up once early on, but that could have been due to a number of reasons, as it has done that before with paging on. Since then it has been fine, and that's about 5 days now.

Also, before, it could get into a state where it ran incredibly slowly due to constant disk activity, so I had to wait until that stopped; quite frankly it would be quicker and better to reboot!


That slowdown could have been due to a number of reasons, including fragmentation, a fragmented paging file, background processes/programs accessing the disk, etc.


Personally I would just give it a go, otherwise you will never know. 2 gig is a lot; it's not that long ago I only had 2 gig of drive space! I am sure you have run into problems even with paging on, so what have you got to lose?

Actually, I've never had any problems with the paging file being enabled, since it sits inside its own little partition on the outer edge of the platter. In fact, I can't remember the last time my system BSODed or hard-crashed. It has always run smoothly since I first built it 2 years ago with an A64/1GB RAM as the starting point. I upgraded the system to Intel only recently.


Never had a BSOD on mine yet. It seemed to have locked up a couple of times, but generally I just reboot pretty quickly rather than wait to see if it 'sorts itself out' and then have to reboot anyway. Better to reboot in a couple of minutes than wait 5 hoping it will cure itself!


If you consider the massive difference between the access times of RAM and the hard drive, then quite frankly it is counterproductive. For example, I sometimes run a statistical program on tens of thousands of poker hand history files. The first run takes ages, as all the files are on disk; after that, when the files are cached in memory, it is much faster, by a factor of at least 20, maybe 50 or 100. So paging, to my mind, is rather pointless; if you get to the stage where you are paging a lot you would probably be better off rebooting!

You do have a point that RAM is always much faster than the HDD, but it still has to get the poker files from the HDD to the RAM, and that's where the bottleneck comes in. I doubt paging has much to do with it.



No, it can't really. Actually, another poker site puts all the poker hand histories into one big file, or several big files, and I think this is a much better approach: there is much less disk activity with, say, one 40 meg file than with 40,000 1KB files, and I do mean much less; I would say at least 50 times faster. It was a bit of a pain modifying the program, though, especially as I was not 100% sure of the structure of the history files initially; I am now, so the second program is structured better. I think I would also be better off bunging other sites' files into one big file too. Mind you, the statistics it gathers on a player are not of much use; they don't tell you what cards he holds, and it is easier to guess from how he plays his current hand than from statistics on how he played his previous hands. So counterproductive in a way, but it sharpened up my programming skills.
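For what it's worth, the 'one big file' approach sketches out like this in Python. The file names and the simple (name, offset, length) index are hypothetical, standing in for whatever format the poker site actually uses:

    import glob

    def pack(names, archive):
        """Concatenate many small files into one archive; return an index."""
        index = []
        with open(archive, "wb") as out:
            for name in names:
                with open(name, "rb") as f:
                    data = f.read()
                index.append((name, out.tell(), len(data)))
                out.write(data)
        return index

    def read_member(archive, entry):
        """Pull one original file back out of the archive."""
        _, offset, length = entry
        with open(archive, "rb") as f:
            f.seek(offset)
            return f.read(length)

    # One sequential read of hands.dat then replaces tens of thousands
    # of open/seek/read cycles on tiny files.
    index = pack(sorted(glob.glob("hands/*.txt")), "hands.dat")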



No harm in trying it.

I probably will, but on my office PC.





  #34
October 21st 07, 02:01 AM, posted to alt.comp.hardware,alt.computer
John

Noozer wrote:
"Michael Everson" wrote in message
...

A rule of thumb in Linux is to make your swap file no bigger than the amount of physical memory in your computer.

That is an old wives' tale.

Does it really make sense to have a 128meg swapfile if you only have 128meg, but a 1gig swapfile if you have 1gig?

Let Windows manage the size... If you have lots of RAM you won't be hitting it very often anyhow.




Letting Windows manage the size is a good idea. The only other suggestion I would make is to put the swap/page file in its own partition to prevent fragmentation. An even better solution is to put it in its own partition on a different hard drive from the drive where the OS is located; obviously, the faster the drive the better. If running Windows, turn off indexing and System Restore on a dedicated page file partition, as they serve no purpose there and just slow down access to the page file.

John
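As an aside, the paging-file configuration John describes is stored in the registry, so it can be inspected without opening System Properties. A minimal Python sketch (Windows only; each entry is normally 'path initial_MB maximum_MB', with '0 0' generally indicating a system-managed size):

    import winreg

    # Read the current paging-file configuration from the registry.
    KEY = (r"SYSTEM\CurrentControlSet\Control"
           r"\Session Manager\Memory Management")

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as key:
        paging_files, _ = winreg.QueryValueEx(key, "PagingFiles")

    for entry in paging_files:   # e.g. 'D:\pagefile.sys 1536 3072'
        print(entry)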
  #35
October 22nd 07, 01:59 PM, posted to alt.comp.hardware
startrap


To Lord Turkey Cough:
Dude, no offence, but I don't think you have a clear understanding of fragmentation and defragmentation and how they relate to file reads/writes. When you say that your performance is better with one large 40MB file than with 40,000 1KB files, that's analogous to a defragmented file. Anyway, let me not drag this off-topic conversation any further. Let us agree to disagree.

I disabled the paging file on my office HP desktop, and after running
it for a few hours, I have found zero difference in performance or disk
activity. I don't think disabling the paging file helps in any way
whatsoever in my case, so I re-enabled it and left it on.

As for your case, where loading those poker files from the HDD to the RAM takes a long time, I am curious as to your memory usage before and after loading those files. Have you checked? Also, I am not sure why your OS would page to the drive when initially loading those files into RAM, if you have sufficient RAM in the first place. Methinks you have some other disk, hardware or OS-related problem... run chkdsk on the drive to see if it comes up with errors, and also run a fragmentation analysis and let us know (yeah, I know you don't believe in defragging, but it never hurts to check). Also, if you have an IDE drive, check if it has downgraded itself from UDMA to PIO mode.

You also ought to download and run the freeware program HD Tach (Google for it) to check whether your disk performance is normal.
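Failing that, a crude sequential-read check can be improvised in a few lines of Python. 'bigfile.bin' is a placeholder for any large existing file; note the figure will be flattered if the file is already sitting in the cache, so a fresh boot gives a more honest number:

    import time

    CHUNK = 1024 * 1024  # read in 1 MB chunks

    def throughput_mb_s(path):
        """Time a straight sequential read of one file; return MB/s."""
        start = time.time()
        total = 0
        with open(path, "rb") as f:
            while True:
                block = f.read(CHUNK)
                if not block:
                    break
                total += len(block)
        return total / (1024 * 1024) / (time.time() - start)

    print(f"{throughput_mb_s('bigfile.bin'):.1f} MB/s")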



  #36
October 27th 07, 05:17 AM, posted to alt.comp.hardware,alt.computer
Lord Turkey Cough


"Lord Turkey Cough" wrote in message
...

"GT" wrote in message
...
"Lord Turkey Cough" wrote in message
...

"Michael Everson" wrote in message
...
A rule of thumb in Linux is to make your swap file no bigger than the amount of physical memory in your computer. That is probably a decent guideline to follow for Windows as well. I would set it to 1gig and see if you run into any problems. If you do, then increase it to 2gig, but 1 should be plenty alongside your 1gig of physical RAM.

No, I don't really agree. I think 90% of the time it will just be swapping to disk a load of crap you will never need again, and disk writes are slow. I would say my computer is a lot more responsive since I ditched the page file.

I don't wish to be rude, but the idea that you have over 1 gig of frequently used data is ludicrous - cloud cuckoo land. If you are reading in files that large, you might as well read the original file.

If you read in 100 meg, then you would have to write 100 meg to disk, which is a much slower process than simply reading in the original file.


Contiguous swap file space versus a fragmented original 100MB file - could make loads of difference! I personally run with 1.5GB of RAM and the swapfile disabled, as I work with loads of small files that are read in, compiled, OBJ files created, linked, etc. If the swapfile were busy at the same time, performance would drop.

I would suggest simply running Windows Task Manager. Leave it open for ages on the Performance tab with the update speed (View menu) set to Low. See how much memory you 'peak' at. If you don't get anywhere near full (maybe 75%), then just turn off swapping. But if you run out of RAM, things WILL fail/crash.


Yes, I suppose they would. I think I might have had that once when I opened up a window on an application; sounds plausible. Typically I have around 1/2 gig available. I have just put my machine up to what I would call 'max' usage (4 poker applications, OE, several IE windows and a digital TV application running) and I have 300 meg free. I would not normally run with that kind of load, as it is quite a load on the CPU, especially the TV app. Anyway, I will keep an eye on things in Task Manager and see how I get on. It was fine yesterday and has been fine so far today. Generally I would prefer to run without a pagefile.


OK, 10 days on from the above post and all has been fine: not one crash or lock-up. I can go down to my last 250 meg of RAM sometimes, but there is so much stuff running that I would need a faster CPU before I needed more RAM.


  #37
October 27th 07, 04:02 PM, posted to alt.comp.hardware,alt.computer
kony

On Sat, 27 Oct 2007 04:17:51 GMT, "Lord Turkey Cough" wrote:

OK, 10 days on from the above post and all has been fine: not one crash or lock-up. I can go down to my last 250 meg of RAM sometimes, but there is so much stuff running that I would need a faster CPU before I needed more RAM.



Again I remind you that used memory is not the same as total allocated memory. When you open an app, it reserves a certain amount more than it uses, so when you use that app, if you do something demanding, it will exceed your real-memory expectations.

Maybe your use isn't so demanding and thus you can get away with running without a pagefile, but many can't. Mainly I suggest that if/when you receive an out-of-memory type message, an abrupt app termination, or a bluescreen, the first attempt at resolution should be to re-enable the VM pagefile.

I write this having travelled down that road: things seem to work fine until you run something that needs this VM allocation, and when it can't get it, it crashes.
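A concrete way to watch the headroom kony is describing: Windows exposes the commit limit and current availability through the GlobalMemoryStatusEx API, which a short Python/ctypes sketch can read (Windows only). With the pagefile disabled, the commit limit collapses to roughly the size of physical RAM, so 'commit free' is the number that predicts the failures he mentions:

    import ctypes
    from ctypes import wintypes

    class MEMORYSTATUSEX(ctypes.Structure):
        _fields_ = [
            ("dwLength", wintypes.DWORD),
            ("dwMemoryLoad", wintypes.DWORD),
            ("ullTotalPhys", ctypes.c_ulonglong),
            ("ullAvailPhys", ctypes.c_ulonglong),
            ("ullTotalPageFile", ctypes.c_ulonglong),   # commit limit
            ("ullAvailPageFile", ctypes.c_ulonglong),   # commit headroom
            ("ullTotalVirtual", ctypes.c_ulonglong),
            ("ullAvailVirtual", ctypes.c_ulonglong),
            ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
        ]

    stat = MEMORYSTATUSEX()
    stat.dwLength = ctypes.sizeof(stat)
    ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(stat))

    mb = 1024 * 1024
    print(f"RAM free:     {stat.ullAvailPhys // mb} MB")
    print(f"Commit limit: {stat.ullTotalPageFile // mb} MB")
    print(f"Commit free:  {stat.ullAvailPageFile // mb} MB")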
  #38
October 31st 07, 12:18 AM, posted to alt.comp.hardware
Lord Turkey Cough


"startrap" wrote in message
...

To Lord Turkey Cough:
Dude, no offence, but I don't think you have a clear understanding of fragmentation and defragmentation and how they relate to file reads/writes. When you say that your performance is better with one large 40MB file than with 40,000 1KB files, that's analogous to a defragmented file.


I do understand it; what fragmentation there is is not noticeable. I know if I defrag I won't notice any difference, so why bother?


Anyway, let me not drag this off-topic conversation any further. Let us agree to disagree.

I disabled the paging file on my office HP desktop, and after running it for a few hours, I have found zero difference in performance or disk activity. I don't think disabling the paging file helps in any way whatsoever in my case, so I re-enabled it and left it on.


I think this is because it is not paging anyway; as I say below, it's not used, so it does not matter if it is on or off.


It ran fine until a few days ago, when I got a warning to increase my page file. I just closed down some stuff instead, but I have since set it to a Windows-managed page file. So you will get a warning before any major problem. It does not seem to make much difference either way. I think now it does not matter how I set it, because it is never or rarely used anyway. Basically, once in a month it got low enough to give a warning (I think I had over 100 meg spare at the time).



As for your case, where loading those poker files from the HDD to the
RAM takes a long time, I am curious as to your memory usage before and
after loading those files.


Not sure what you mean. I know all the files fit in memory, because when I run it a second time I do not hear the same disk access, unless I have run something in between which overwrites them. If I do it on a lot of files, there might not be enough memory, so it would have to do the whole lot again. Thus I don't do it on a huge pile of files, as it takes ages.

Have you checked? Also, I am not sure why your OS would page to the drive when initially loading those files into RAM, if you have sufficient RAM in the first place. Methinks you have some other disk, hardware or OS-related problem... run chkdsk on the drive to see if it comes up with errors, and also run a fragmentation analysis and let us know (yeah, I know you don't believe in defragging, but it never hurts to check). Also, if you have an IDE drive, check if it has downgraded itself from UDMA to PIO mode.


I don't think I have a problem with my drive; it works fine, I am sure of that. I have benchmarked it before and it is 'normal' enough. I just did another run, and its figures look OK considering I have a lot of other apps running:

12 ms random access
Drive Index 42 MB/s

I don't really have a problem with my computer's performance generally.

You also ought to download and run the freeware program HD Tach (Google for it) to check whether your disk performance is normal.


I ran the SiSoft benchmark and it looked comparable to similar drives.





 



