Optimize This!

Jan Schenkel janschenkel at yahoo.com
Mon May 1 04:20:15 EDT 2006


--- Todd Geist <tg.lists at geistinteractive.com> wrote:
> Hello Everyone,
> 
> I had the need to search large binary files for a
> string. The files could be over a gigabyte in size,
> so I decided not to load the whole file into RAM but
> digest it in chunks instead. This is the routine I
> came up with; it seems to work very quickly, but I am
> wondering if some of you might be able to speed it up.
> 
> [snip]
> 
> what do you think?  Did I miss some obvious easier
> way?
> 
> Thanks
> 
> Todd
> 

Hi Todd,

Have you looked at the 'seek' command? It lets you
jump straight to a given position in the file, and
then you can use 'read' to grab just the chunk you're
after, with no repeat loop needed.
--
open file tFilePath for binary read -- binary mode: no line-ending translation
seek to 30000 in file tFilePath
read from file tFilePath for 500 chars
put it into tData -- 'read' places the data in the 'it' variable
close file tFilePath
--
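One caution with chunked searching: a match that straddles a chunk boundary will be missed unless consecutive chunks overlap by at least the length of the search string minus one. Below is a minimal sketch of that idea using 'seek'; the handler name, variable names, and the 100000-byte chunk size are all placeholders, not anything from Todd's routine.
--
function findInFile tFilePath, tNeedle
   local tChunk, tPos, tBase, tChunkSize
   put 100000 into tChunkSize
   put 0 into tBase
   open file tFilePath for binary read
   repeat
      seek to tBase in file tFilePath
      -- read one chunk plus an overlap of length(tNeedle) - 1
      read from file tFilePath for tChunkSize + length(tNeedle) - 1 chars
      put it into tChunk
      put offset(tNeedle, tChunk) into tPos
      if tPos > 0 then
         close file tFilePath
         return tBase + tPos -- 1-based position of the match in the file
      end if
      if length(tChunk) < tChunkSize then exit repeat -- reached end of file
      add tChunkSize to tBase
   end repeat
   close file tFilePath
   return 0 -- not found
end findInFile
--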

Hope this helps,

Jan Schenkel.

Quartam Reports for Revolution
<http://www.quartam.com>

=====
"As we grow older, we grow both wiser and more foolish at the same time."  (La Rochefoucauld)




More information about the use-livecode mailing list