Processing Large Amounts Of Data

Warren Kuhl warrenkuhl at gmail.com
Wed Jul 29 16:30:19 EDT 2009


Richard,

Thanks...that really helps!  I will give it a try.

Warren


On 7/29/09, Richard Gaskin <ambassador at fourthworld.com> wrote:
>
> Warren Kuhl wrote:
>
>> I have a variable that has approx 100,000 records (loaded from a text file).
>> I need to read through each record and extract items from each record.
>>
>> I am currently using:
>>   repeat with x = 1 to tRecordCount
>>     put item 2 of line x of tData into tItem2
>>     put item 6 of line x of tData into tItem6
>>     ...process data
>>   end repeat
>>
>> I find this very slow to process.  Is there a faster way to process the
>> data?
>>
>
> The "repeat with..." construct will be slow when using the iteration
> variable to access the data, because each time through the loop it needs to
> count from 1 to x to find the line in question.
>
> You should see at least an order of magnitude performance gain using
> "repeat for each...", e.g.:
>
>  repeat for each line tLine in tData
>   put item 2 of tLine into tItem2
>   put item 6 of tLine into tItem6
>   ...process data
>  end repeat
>
> With "repeat for each", the engine parses the data as it goes, putting each
> line as it finds it into tLine, and keep track of where it is so the next
> time through the loop it just picks up where it left off. Much, much faster.
>
> --
>  Richard Gaskin
>  Fourth World
>  Revolution training and consulting: http://www.fourthworld.com
>  Webzine for Rev developers: http://www.revjournal.com
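
A minimal, self-contained sketch of the "repeat for each" pattern described
above. The handler name, file path, and item positions are illustrative
assumptions, and the "put ... after" idiom for collecting output is a common
companion optimization rather than part of the original thread:

  function processRecords pFilePath
    local tData, tLine, tItem2, tItem6, tResults
    -- load the whole file into memory in one read (the path is a placeholder)
    put URL ("file:" & pFilePath) into tData
    -- the engine walks tData once, handing each line to tLine in turn
    repeat for each line tLine in tData
      put item 2 of tLine into tItem2
      put item 6 of tLine into tItem6
      -- ...process tItem2 and tItem6 here...
      -- appending with "after" avoids re-copying the growing result buffer
      put tItem2 & comma & tItem6 & return after tResults
    end repeat
    if tResults is not empty then delete the last char of tResults
    return tResults
  end processRecords

A call such as put processRecords("records.txt") into tOutput would then leave
one processed line per input record in tOutput.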


