Finding duplicates in a list

Ian Wood revlist at azurevision.co.uk
Wed Jan 9 06:47:57 EST 2008


Faaaaaantastic!

I see similar speedups, and the entire 40k list goes through in 2s.

Thanks a lot,

Ian

On 9 Jan 2008, at 11:33, Bill Marriott wrote:

> on mouseUp
>
>  put url "file:D:/desktop/dupeslist.txt" into tList
>  set the itemDelimiter to tab
>  put the milliseconds into tt
>
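>  -- first pass: store each line under its own content as the array key,
>  -- with n recording its position, so duplicate lines collect under one key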
>  put 0 into n
>  repeat for each line tCheck in tList
>    add 1 to n
>    put n & tab & tCheck & return after tCheckArray[tCheck]
>  end repeat
>
>  put empty into tListResult
>
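>  -- second pass: any key that collected more than one line is a duplicate;
>  -- report the first occurrence's position, each duplicate's position, and the content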
>  repeat for each key theKey in tCheckArray
>    if the number of lines in tCheckArray[theKey] > 1 then
>      repeat with i = 2 to the number of lines in tCheckArray[theKey]
>        put item 1 of line 1 of tCheckArray[theKey] & tab & \
>            item 1 of line i of tCheckArray[theKey] & tab & \
>            theKey & return after tListResult
>      end repeat
>    end if
>  end repeat
>
>  put the milliseconds - tt & return & "number of files:" && \
>      the number of lines in tList & return & return & tListResult
> end mouseUp
>
>
> 64 milliseconds on my computer, versus 5023 for yours.
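The speed-up comes from keying the array on the line content itself: each array lookup is a single hash operation, so the list is walked once rather than being re-searched for every line. A stripped-down sketch of the same idea that only counts occurrences of each whole line (the file path mirrors the one above; the handler and variable names are placeholders):

on mouseUp
   put url "file:D:/desktop/dupeslist.txt" into tList

   -- one pass: count how many times each line appears
   repeat for each line tLine in tList
      add 1 to tCounts[tLine]
   end repeat

   -- report every line that appears more than once
   repeat for each key tKey in tCounts
      if tCounts[tKey] > 1 then
         put tCounts[tKey] && "copies of" && tKey & return after tDupes
      end if
   end repeat

   put tDupes
end mouseUp

Bill's version above goes a step further and also records where the first occurrence and each duplicate sit in the original list.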



