60 hours divided by 60 is 2 minutes?

MultiCopy Rotterdam-Zuid info at multicopy.org
Mon Oct 28 03:14:02 EST 2002

Hi all,

I have a 5 MB file with about 550,000 lines that needs to be processed by a
script. A simple script that deletes a line if the previous line has the
same contents. That takes more than 60 hours to complete. So I thought I'd
divide the file into smaller files, each about one sixtieth of the total
number of lines. But instead of the expected hour of processing time per
file, each file took only 2 minutes to complete.
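For concreteness, the task as described (delete a line when the previous line has the same contents) can be sketched as a single linear pass; this is an illustrative Python sketch, not the original script, and the function name is made up:

```python
def dedupe_consecutive(lines):
    """Keep each line only if it differs from the immediately preceding line."""
    out = []
    prev = None  # sentinel: no previous line yet
    for line in lines:
        if line != prev:
            out.append(line)
        prev = line
    return out

# Example: consecutive duplicates are dropped, non-adjacent repeats survive.
print(dedupe_consecutive(["a", "a", "b", "b", "a"]))
```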

I understand processes are faster with less data in memory, but I never
would have thought the difference would be this big.

Any thoughts on how this is possible, and what we can learn from it when
writing programs?


More information about the Use-livecode mailing list