Memory Problem?

Peter Reid preid at
Tue Feb 6 17:24:05 EST 2007

I've now managed to use buffered file copying for the problem I  
mentioned previously.  Here is my file copying handler:

on copyAfile  sourceFile, destFolder, fCreatorType
   constant cBffrSize = 10485760 -- copy in 10Mbyte chunks
   set the itemDelimiter to "/"
   put destFolder & last item of sourceFile into destFile

   -- set the Mac OS filetype & create destination file:
   set the fileType to fCreatorType
   open file destFile for binary write
   -- open source file:
   open file sourceFile for binary read

   -- copy the file data:
   put empty into dataBffr
   put false into gotEOF
   repeat until gotEOF
     read from file sourceFile for cBffrSize chars
     put the result is "eof" into gotEOF
     put it into dataBffr
     if dataBffr is not empty then
       write dataBffr to file destFile
     end if
   end repeat

   close file destFile
   close file sourceFile

   -- copy Mac OS resource fork info:
   put getResources(sourceFile) into resourceList
    set the itemDelimiter to comma
   repeat for each line i in resourceList
     put item 1 of i into resType
     put item 2 of i into resID
      put copyResource(sourceFile, destFile, resType, resID) into junk
   end repeat
   put empty into junk
end copyAfile
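
A call might look like this (the paths and the creator/type string are made up for illustration; the fileType property expects the four-character creator code followed by the four-character file type, and the handler concatenates destFolder directly with the file name, so destFolder should end with a "/"):

  -- hypothetical example call; "ttxtTEXT" = creator "ttxt", type "TEXT"
  copyAfile "/Volumes/Backup1/catalog.rbc", "/Volumes/Backup2/", "ttxtTEXT"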

The remaining problem I have is that the copied file has the current  
time & date, NOT the same time & date as the original file.  If I had  
used revCopyFile, the time & date would have been preserved.

Can anyone suggest how I can change the (creation or modified) time &  
date on a file so the copies are the same as the originals?
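
One approach (my own suggestion, not from the thread) is to shell out to the Unix touch utility, whose -r flag copies the access and modification times from a reference file.  Note that touch does not change the Mac OS creation date; for that you would need something like the Developer Tools' SetFile -d.  A minimal sketch:

  -- sketch: copy the modified time of srcFile onto dstFile via the shell
  -- Mac OS X / Unix only; the handler name is mine, not a built-in
  on copyModDate srcFile, dstFile
     get shell("touch -r " & quote & srcFile & quote && quote & dstFile & quote)
  end copyModDate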

On 4 Feb 2007, at 11:28 am, Peter Reid wrote:

> Thanks David & Mark.
> I think I'll try the buffered binary writing, as suggested by Mark,  
> to see how that works out.  I've seen how fast Rev can do this kind  
> of thing before, but not for such large files.  These files are   
> Retrospect backup catalogues that I'm copying from the primary  
> backup area into another area for subsequent copying to tape.  In  
> total, I have to copy about 85Gb (in about 54 files) from one  
> partition to another.  Once in the 2nd partition, Retrospect itself  
> copies them to an Ultrium tape drive for off-site storage.
> Thanks again, I'll report back on my progress with the buffered  
> binary approach.
> Regards,
> Peter
> On 3 Feb 2007, at 5:57 pm, David Bovill wrote:
>> I don't think this will be a memory problem - more likely an IAC type
>> problem in that revCopyFile uses AppleScript and the equivalent on  
>> windows.
>> If the delay between starting the event and completing it is very  
>> large and
>> in the meantime you have issued a queue of events - I guess things are
>> getting clogged. I think the way around it is to figure out a way of
>> monitoring when the copy has completed and only then issuing the next
>> revCopyFile command. However I am not sure how you would do this -  
>> one thing
>> that you could try as well is to make a zip, issue one copy then  
>> unzip?
>> I'd love to know how you get on as it is a situation that does  
>> come up from
>> time to time?
> On 3 Feb 2007, at 6:14 pm, Mark Schonewille wrote:
>> Since Revolution uses AppleScript, a much better way to do this  
>> task is to open each file for binary read, open a destination for  
>> binary write, and use a repeat loop to read and write small  
>> chunks, something like 200K. When done, close both files and  
>> continue with the next. You will be surprised at the speed, and  
>> if you include a wait command with messages in the repeat loop,  
>> you still have control over the GUI to show e.g. a progress bar.
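
Mark's loop can be sketched like this (the handler name and the ~200K chunk size are illustrative, mirroring the copyAfile handler above):

  on bufferedCopy srcFile, dstFile
     constant kChunk = 204800 -- ~200K chunks, as Mark suggests
     open file srcFile for binary read
     open file dstFile for binary write
     repeat
        read from file srcFile for kChunk chars
        if it is not empty then write it to file dstFile
        if the result is "eof" then exit repeat
        wait 0 milliseconds with messages -- keep the GUI responsive, e.g. a progress bar
     end repeat
     close file srcFile
     close file dstFile
  end bufferedCopy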

More information about the use-livecode mailing list