surprising filter benchmarks

Richard Gaskin ambassador at fourthworld.com
Tue Jul 12 16:26:19 EDT 2005


I figured the filter command would carry at least some overhead for its 
convenience, but I had no idea how much!

I wrote the test below to compare it with walking through a list line by 
line, and the results were surprising:

on mouseUp
   put fwdbCurTableData() into s -- gets 10,800 lines of tab-delimited data
   --
   -- Method 1: filter command
   --
   -- wildcard pattern: item 1 contains "a", item 2 contains "r",
   -- item 3 is exactly "r"; the remaining five items can be anything
   put format("*a*\t*r*\tr\t*\t*\t*\t*\t*") into tFilter
   put s into result1
   put the millisecs into t
   filter result1 with tFilter
   put the millisecs - t into t1
   --
   --
   -- Method 2: repeat for each
   --	
   set the itemdel to tab
   put the millisecs into t
   repeat for each line tLine in s
     if item 1 of tLine contains "a" \
         AND item 2 of tLine contains "r" \
         AND item 3 of tLine is "r" then
       put tLine&cr  after result2
     end if
   end repeat
   delete last char of result2
   put the millisecs - t into t2
   --
   put result1 into fld "result"
   put result2 into fld "result2"
   --
   put "Filter: "&t1 &cr& "Repeat: "&t2
end mouseUp
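
(An aside on Method 2: "repeat for each line" is already the fast form of 
that loop. The indexed form sketched below would have made the loop look 
much worse, since "line i of s" re-scans the variable from the top on 
every pass; it's shown only for contrast and isn't part of the timing 
above.)

   -- indexed loop, for contrast only -- avoid on large lists
   repeat with i = 1 to the number of lines of s
      put line i of s into tLine -- re-scans s from the start each time
      if item 1 of tLine contains "a" \
          AND item 2 of tLine contains "r" \
          AND item 3 of tLine is "r" then
        put tLine & cr after tSlowResult
      end if
   end repeat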



Results (in milliseconds) -
    Filter: 745
    Repeat: 40

Did I miss something, or am I just seeing the penalty for the filter 
command's generalization?
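
If anyone wants a third data point, here's an untested sketch of a 
per-line regex pass via matchText; the pattern is only my guess at a 
rough regex equivalent of the wildcard expression (it pins just the 
first three items rather than all eight), so adjust to taste:

   -- Method 3 (sketch): regex test per line via matchText
   -- the "\t" sequences pass through to the regex engine,
   -- which reads them as tab characters
   put "^[^\t]*a[^\t]*\t[^\t]*r[^\t]*\tr\t" into tRegex
   put the millisecs into t
   repeat for each line tLine in s
      if matchText(tLine, tRegex) then put tLine & cr after result3
   end repeat
   put the millisecs - t into t3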

--
  Richard Gaskin
  Managing Editor, revJournal
  _______________________________________________________
  Rev tips, tutorials and more: http://www.revJournal.com


