Memory Problem?

Peter Reid preid at
Sun Feb 4 06:28:03 EST 2007

Thanks David & Mark.

I think I'll try the buffered binary writing, as suggested by Mark,
to see how that works out.  I've seen how fast Rev can do this kind
of thing before, but not for such large files.  These files are
Retrospect backup catalogues that I'm copying from the primary
back-up area into another area for subsequent copying to tape.  In
total, I have to copy about 85GB (in about 54 files) from one
partition to another.  Once in the 2nd partition, Retrospect itself
copies them to an Ultrium tape drive for off-site storage.

Thanks again, I'll report back on my progress with the buffered  
binary approach.



On 3 Feb 2007, at 5:57 pm, David Bovill wrote:

> I don't think this will be a memory problem - more likely an IAC-type
> problem, in that revCopyFile uses AppleScript (and the equivalent on
> Windows). If the delay between starting the event and completing it
> is very large, and in the meantime you have issued a queue of events,
> I guess things are getting clogged. I think the way around it is to
> figure out a way of monitoring when the copy has completed, and only
> then issuing the next revCopyFile command. However, I am not sure how
> you would do this - one thing you could also try is to make a zip,
> issue one copy, then unzip. I'd love to know how you get on, as it is
> a situation that does come up from time to time.
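
David's idea - monitor the destination until the copy has finished, and only then issue the next revCopyFile - could be sketched by polling the destination file's size until it matches the source. A rough illustration in Python (the size-match heuristic, poll interval, and timeout are my assumptions, not anything from the thread):

```python
import os
import time

def wait_for_copy(src, dst, poll=0.5, timeout=600):
    """Poll until dst exists and matches src's size.

    A crude "copy finished" heuristic: once the destination file
    reaches the source's size, assume the copy is complete.
    Returns True on success, False if the timeout expires.
    """
    target = os.path.getsize(src)
    deadline = time.time() + timeout
    while time.time() < deadline:
        if os.path.exists(dst) and os.path.getsize(dst) == target:
            return True
        time.sleep(poll)  # avoid busy-waiting between checks
    return False
```

A same-size destination does not guarantee identical contents, so a more careful version might also compare checksums once the sizes match.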

On 3 Feb 2007, at 6:14 pm, Mark Schonewille wrote:

> Since Revolution uses AppleScript, a much better way to do this
> task is to open each file for binary read, open a destination for
> binary write, and use a repeat loop to read and write small chunks,
> something like 200K. When done, close both files and continue with
> the next. You will be surprised at the speed, and if you include
> a wait command with messages in the repeat loop, you still retain
> control over the GUI to show e.g. a progress bar.
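
Mark's chunked read/write loop maps onto the same pattern in any language. Here is a minimal sketch in Python for illustration only (in Revolution the native equivalents would be `open file ... for binary read/write`, `read from file ... for 200000`, and `write ... to file`; the chunk size follows Mark's ~200K suggestion):

```python
# Chunked binary copy: read and write ~200K at a time, so the whole
# file never has to sit in memory at once.
CHUNK = 200 * 1024  # roughly the 200K chunk Mark suggests

def copy_chunked(src_path, dst_path, chunk=CHUNK):
    """Copy src_path to dst_path in small chunks; return bytes copied."""
    copied = 0
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            data = src.read(chunk)
            if not data:  # empty read means end of file
                break
            dst.write(data)
            copied += len(data)
            # In Revolution you would put "wait 0 with messages" here
            # so the GUI (e.g. a progress bar) stays responsive.
    return copied
```

Because each pass through the loop handles a fixed-size chunk, memory use stays flat no matter how large the catalogue file is.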

More information about the Use-livecode mailing list