Memory Problems
Ben Rubinstein
benr_mc at cogapp.com
Thu Dec 13 11:25:01 EST 2001
on 13/12/01 12:22 PM, Jack Rarick at rarickj at btathletics.com wrote:
> Using several different Macs - all running Sys 9.1. Using put
> URL("Binfile .." etc to copy large files (20 to 70 meg in size). Using a
> repeat loop to copy sets of up to 20 of these files. Works fine (1 or 2
> file copies) and then hangs - throwing the error "Low memory." All of
> these Macs have at least 128 meg of RAM and the memory on MetaCard has
> been set all the way from 20 meg to 65 meg -> Same results.
Using the 'put URL "binfile:..." into URL "binfile:..."' technique to copy
files actually means that MC is reading the entire contents of the file into
memory, and then writing it out again (unless Scott's introduced a damn
clever optimisation). Hence it's not surprising that very large files run
into memory limits, regardless of any actual bugs.
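For the record, that one-go form (with placeholder variable names of my own) amounts to something like:

  -- the whole file is read into memory, then written out again in one go
  put URL ("binfile:" & srcFile) into URL ("binfile:" & dstFile)

so copying a 70 meg file needs at least 70 meg free in MC's memory partition while the copy is in progress.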
The pure MetaCard alternative is to write a file copy routine that uses the
open file/read from file/write to file/close file commands to copy the data
a chunk at a time: choose a buffer size (eg 250K) and use 'read from file
srcFile for 250000' followed by 'write it to file dstFile' in a loop until
it is empty - something along the lines of the sketch below.
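A minimal sketch of such a handler (the handler and variable names are mine, and 250000 bytes is just an example buffer size):

  on chunkedCopy srcFile, dstFile
    open file srcFile for binary read
    open file dstFile for binary write
    repeat
      -- reads at most 250000 bytes into the special variable 'it'
      read from file srcFile for 250000
      if it is empty then exit repeat
      write it to file dstFile
    end repeat
    close file srcFile
    close file dstFile
  end chunkedCopy

You'd call it as 'chunkedCopy tSource, tDest' from inside the existing repeat loop; since only one buffer's worth of data is in memory at any time, the size of the source file no longer matters.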
If you really only care about Macs, it would probably be a lot easier (and
I'd guess quicker) to use AppleScript to get the Finder to do the copy for
you; ditto on Windows, where you could hand the job to the shell. Or of
course you could define a 'copyfile' routine that used AppleScript on MacOS,
the shell on Windows, and fell back on reading the data into MetaCard and
writing it out again on any other platform.
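Roughly, such a routine might look like the sketch below. The handler name is mine, the Finder AppleScript and the DOS copy command are from memory rather than tested, and each branch expects paths in its platform's native form (colon-delimited for the Finder, backslashes for the Windows shell), so treat it as a starting point only:

  on copyFile pSrcFile, pDstFolder
    -- copies pSrcFile into pDstFolder, keeping the file's name
    if the platform is "MacOS" then
      -- ask the Finder to do it; paths must be HFS style ("HD:Folder:File")
      put "tell application " & quote & "Finder" & quote into tScript
      put " to duplicate file " & quote & pSrcFile & quote after tScript
      put " to folder " & quote & pDstFolder & quote after tScript
      do tScript as AppleScript
    else if the platform is "Win32" then
      -- let the command shell do the copy; paths use backslashes
      get shell("copy " & quote & pSrcFile & quote && quote & pDstFolder & quote)
    else
      -- any other platform: fall back on the chunked read/write above
      set the itemDelimiter to "/"
      chunkedCopy pSrcFile, pDstFolder & "/" & last item of pSrcFile
    end if
  end copyFile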
Ben Rubinstein | Email: benr_mc at cogapp.com
Cognitive Applications Ltd | Phone: +44 (0)1273-821600
http://www.cogapp.com | Fax : +44 (0)1273-728866