Lowering high CPU rates?
Phil Davis
revdev at pdslabs.net
Sun Mar 16 03:05:27 EDT 2008
Hi David,
One thing you can do is break the file processing into small batches of
records, with generous time gaps between the processing cycles. It will
obviously make your entire process take longer, but it will reduce the
load on the CPU. It will also reduce the demand for memory, which by
itself can make a large difference in one's computing experience.
Here's one way to make that happen. The magic ingredients are:
- using open / read / close file to bring a small amount of data into
memory at a time
- putting huge (1 sec) gaps between tiny 100-records-at-a-time
processing cycles
-- all in the script of a button
local vMyFileIn, vMyFileOut

on mouseUp -- start everything
   answer file "Select a file to process:"
   if it = empty then exit to top
   put it into vMyFileIn
   ask file "Save output as:"
   if it = empty then exit to top
   put it into vMyFileOut
   open file vMyFileIn for read
   open file vMyFileOut for write
   send "processFile" to me in 1 second
end mouseUp
on processFile
   read from file vMyFileIn for 100 lines -- puts the lines into "it"
   -- process whatever was read, even the last (possibly partial) chunk
   if it is not empty then
      put withoutZeroInItemTwo(it) into tCleanRecords
      write tCleanRecords to file vMyFileOut
   end if
   if the result is "eof" then -- the last chunk has been read
      send "endProcess" to me in 1 second
   else
      send "processFile" to me in 1 second
   end if
end processFile
function withoutZeroInItemTwo pRecords
   -- you know what goes here!
end withoutZeroInItemTwo
on endProcess
   close file vMyFileIn
   close file vMyFileOut
   answer "File processing is completed."
end endProcess
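For completeness, here is one possible body for withoutZeroInItemTwo.
It assumes comma-delimited records where a "0" in item 2 marks a record
to drop - adjust the itemDelimiter and the test to fit your data:

function withoutZeroInItemTwo pRecords
   local tKeep
   set the itemDelimiter to comma
   repeat for each line tRecord in pRecords
      -- keep only records whose second item isn't zero
      if item 2 of tRecord is not "0" then
         put tRecord & return after tKeep
      end if
   end repeat
   return tKeep
end withoutZeroInItemTwo

Using "repeat for each line" rather than "repeat with i = 1 to ..."
keeps this fast even on large chunks, since it avoids re-scanning the
variable for each line.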
HTH - there may be errors in my code - I haven't tested it...
Phil Davis
David Coker wrote:
> Since folks here on the list have been so awesome in helping me around
> the few trouble spots with my project, I'm just about to the point
> where I can start adding in the final error checking routines and
> working towards a beta stage... I thought I'd first touch base with
> you good folks to see if there is a work around for the final
> troubling aspect of this project.
>
> We have some pretty hefty hardware on our office machines so I haven't
> noticed many problems with the data I've thrown at this program while
> testing on those, but when working on my development machine (an older
> laptop with very modest specs), the CPU is running at 100% for what
> seems to be long periods of time. Unfortunately, during those times
> it's almost impossible to do any other work.
>
> Speed is a relative issue with what I'm doing with this program and
> not exactly the most important factor *and* there are certain portions
> of the processing where I do not want to sacrifice the visibility of
> the work being done. (Most of that aspect is done in the final stage
> of processing and doesn't seem to be too much of a problem.)
>
> Thanks to the advice I've already received, most of the "heavy
> lifting" is being done behind the scene in variables rather than in
> text fields, the screen is being locked during times that seem
> appropriate and I've spread around a few "wait 1 with messages" to
> help keep the program responsive to user input. Furthermore, I only
> see a real problem when working with data exceeding 10-15,000 records
> on a given run.
>
> With that said...
>
> Assuming it is possible under the circumstances, how might I go about
> cutting back the sometimes extended periods of time when the CPU is
> under full load, so that users can continue working on other tasks?
>
> Any advice at all is greatly appreciated.
>
> David
>
--
Phil Davis
PDS Labs
Professional Software Development
http://pdslabs.net
More information about the use-livecode mailing list