I get a MemoryError using <file>.read on an AIX machine with lots of memory. How can I use more?
Feb 20th, 2006 15:40
Brian Duhan, Michael Chermside, Markus Indenbirken, Seth Grimes
Well, just buy more memory! <wink>
The problem here is almost certainly due to your having tried to read
the entire file into memory at once. Most likely, the solution is for
you to read and process it bit by bit, never keeping the entire thing in
memory at once. If you think that might work for you, keep reading. (If
not, another option might be to use a memory-mapped file. See the
documentation on the `mmap` module.)
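A minimal mmap sketch, assuming the same readme.txt used below (note
that the default Unix mmap flags want the file opened for update, hence
the 'r+'):
>>> import mmap, os
>>> f = file('readme.txt', 'r+')
>>> m = mmap.mmap(f.fileno(), os.path.getsize('readme.txt'))
>>> m[0:100]     # slices like a string, but pages are read on demand
>>> m.close()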
If you have a file object, f:
>>> f = file('readme.txt')
there are several ways you can try to read the contents. If you want to
read the entire contents into a string, it works like this:
>>> wholeFile = f.read()
But, as we said above, that may be too big. You can supply a maximum
size if you like:
>>> first1K = f.read(1024)
and if you do this in a loop you can keep reading through the file in
chunks. When you reach the end of the file, you will get a chunk smaller
than 1024 bytes, and then an empty string (size 0) once you are actually
at EOF; a complete loop is sketched below.
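Here is what such a loop looks like in full. process() is just a
stand-in for whatever you do with each chunk, not a real function:
>>> f = file('bigfile.dat', 'rb')
>>> while 1:
...     chunk = f.read(1024)
...     if not chunk:        # empty string means we hit EOF
...         break
...     process(chunk)       # your own per-chunk handler
...
>>> f.close()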
If you have a text file, you can (and probably should) read it in line
by line. Calling the readlines method:
>>> allLines = f.readlines()
will return all of the lines as one big list... useful sometimes, but
it still keeps everything in memory at once. Instead, try using:
>>> for line in f.xreadlines():
...     process(line)
The difference is that the lines are read in one-by-one on demand.
But the ***BEST*** way to do line-by-line processing requires Python 2.2
or higher. It's fast, and it's really easy to type:
>>> for line in f:
...     process(line)
If your problem wasn't solved by the Python-level fixes above, you can
edit (as root) /etc/security/limits to allow the user running the script
more system resources. This is valid for AIX 4 and AIX 5.2 (the only
versions I have access to). AIX's 'limits' will also get in the way if
you try to write a file bigger than 1GB or so.
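If you go that route, a stanza in /etc/security/limits along these
lines should do it. This is only a sketch from memory of the stock AIX
format: 'yourname' is a placeholder for the actual login, the values
are counted in 512-byte blocks, -1 means unlimited, and the change
takes effect at the user's next login.
yourname:
        fsize = -1
        data = -1
        rss = -1
You can check which limits are actually in effect from Python itself,
using the standard resource module:
>>> import resource
>>> resource.getrlimit(resource.RLIMIT_DATA)     # (soft, hard), in bytes
>>> resource.getrlimit(resource.RLIMIT_FSIZE)    # biggest file you may write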