3 megs of text

David Vaughan drvaughan55 at mac.com
Mon Mar 4 18:27:01 EST 2002


Ralf

It appears you are treating the lines in pairs and are using "repeat 
with i = 1 to zillions". This will be insufferably slow for large data 
volumes, as you have discovered: every chunk expression like "line i of 
tData" makes the engine count lines from the top of the variable again, 
so the loop as a whole does quadratic work.
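
For illustration, the slow shape looks something like this (tData and 
the loop body are stand-ins for whatever your script actually does):

  repeat with i = 1 to the number of lines of tData
    -- each "line i" access re-scans tData from its first character,
    -- which is what makes the whole loop quadratic
    put line i of tData into currentLine
    -- ... process currentLine ...
  end repeat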

You can use the _vastly_ faster "repeat for each" with a switch (a 
simple toggle variable) to gather line pairs before processing each 
pair, as sketched below. Try it on your 480 KB file and you ought to be 
measuring the time in tens of seconds rather than tens of minutes.
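
A minimal sketch of the toggle idea (the variable names and the 
processPair handler are mine, purely for illustration):

  put empty into firstOfPair
  repeat for each line currentLine in tData
    if firstOfPair is empty then
      -- first line of a pair: just remember it
      put currentLine into firstOfPair
    else
      -- second line of the pair: handle both, then reset the toggle
      processPair firstOfPair, currentLine -- hypothetical handler
      put empty into firstOfPair
    end if
  end repeat

(The toggle relies on real data lines never being empty; if they can 
be, use a true/false flag instead.)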

If you are processing overlapping pairs rather than disjoint ones (that 
is, lines 1,2 then 2,3 rather than 1,2 then 3,4), it is even easier, as 
no switch is needed: just keep the last line read in a prevLine 
variable and copy currentLine into it at the end of each pass. Since 
your rule compares line i with line i+1, that overlapping shape is the 
one you want; see the sketch after this paragraph.
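
Putting it together with your three steps, something like the sketch 
below. I am assuming the items are comma-delimited, that item 3 is the 
bare text_A / text_B / text_rubbish token, and that gTime is already a 
plain number of seconds (real clock times would first need "convert ... 
to seconds"); every name other than spentTime and dateList is mine:

  put 0 into spentTime
  put empty into dateList
  put empty into prevLine
  repeat for each line currentLine in tData
    -- step 1: skip the rubbish lines entirely
    if item 3 of currentLine is "text_rubbish" then next repeat
    -- step 2: a text_A line followed directly by a text_B line
    if item 3 of prevLine is "text_A" and item 3 of currentLine is "text_B" then
      add (item 2 of currentLine) - (item 2 of prevLine) to spentTime
      -- step 3: collect the date of the matching line
      put item 1 of currentLine & return after dateList
    end if
    put currentLine into prevLine -- slide the one-line window along
  end repeat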

There is also a filter command, but it is inclusive rather than 
exclusive (that is, I don't think you can say "all lines without 
text_rubbish"), so let's see how the repeat approach goes first.
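
That said, if item 3 only ever takes those three values, an inclusive 
filter could still strip the rubbish in one line; whether the wildcard 
character class below works in your engine version is an assumption on 
my part:

  -- keep only lines whose third item is text_A or text_B
  filter tData with "*text_[AB]*"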

regards
David

On Tuesday, March 5, 2002, at 05:37, Ralf Könner wrote:

> Dear List,
>
> Maybe some of you have the right solution for this task:
>
> I have about 3 MB of text data in one single logfile which I need to
> parse and do some date and time maths on. I don't want to go through
> every single line of it - it would simply take far too much time.
>
> Each line has 3 items: gDate,gTime,"text_A" or "text_B" or "text_rubbish".
>
> 1.) I need to get rid of every line that contains "text_rubbish" (item 3).
> 2.) If "text_A" in line i is followed by "text_B" in line i+1, then add
> (gTime of line i+1) minus (gTime of line i) to spentTime, and
> 3.) put gDate into dateList.
>
> A 480 KB file took my G3/233 about 18 minutes to do this using "normal"
> repeat loops with one if-then-else statement inside.
>
> Any ideas to speed this up?
>
> Thank you very much for any help and best regards,
>
> Ralf
>



