[Q] send in time and response times from apps

Rob Cozens rcozens at pon.net
Mon Mar 18 12:29:01 EST 2002


>does the performance of an application suffer if you use a low time interval
>for the "send" command?
>
>examples:
>send "cowSpeak" to me in 60 seconds
>send "cowSpeak" to me in 1 second
>
>obviously the updating is more fluid with the 1-second interval, but is it
>harmful to the overall responsiveness of the application?

IMFO, it really depends on what else is going on with your application.

As I noted in "Re: Time Field (Please Help)", either example has the 
potential to start a continuous stream of send commands that will 
persist until the runtime engine quits.
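
To make that concrete, here is the sort of self-rescheduling handler 
both examples imply (the handler body is just a placeholder; only the 
send line matters):

on cowSpeak
  put the long time into field "clock" -- or whatever cowSpeak actually does
  send "cowSpeak" to me in 1 second    -- each run queues the next
end cowSpeak

Once started, every invocation queues the next one, so the loop keeps 
running until something cancels the pending message or the engine quits.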

If your application is "compute bound", i.e. it's doing lots of number 
crunching and/or image manipulation, or it's pushing large amounts of 
data, then such loops may slow processing.  OTOH, if your application 
is basically presenting a single screen of data to the user and 
waiting for him/her to type something or click on something, either 
send command should have no practical effect.  However, I would still 
script a mechanism to ensure the loop stops when the user leaves the 
card.
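
One way to do that (an untested sketch; if memory serves, each line of 
the pendingMessages is "id, time, message name, target") is to cancel 
the queued message in closeCard:

on closeCard
  -- cancel any queued cowSpeak so the loop dies when the user leaves the card
  repeat for each line tMsg in the pendingMessages
    if item 3 of tMsg is "cowSpeak" then cancel item 1 of tMsg
  end repeat
end closeCard

A script-local flag that cowSpeak checks before re-sending itself would 
work just as well.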

BTW, in my experience I would not consider one second a "low" time 
interval in an environment where the CPU is dealing in milliseconds. 
One tick (1/60 of a second) would be a low time interval, and would 
probably put the app into suspended animation.

And while we're on the subject, am I correct that a "send ... in" 
message cannot guarantee precise timing?  My presumption is that no 
message can be delivered while a handler (or some code segment?) is 
running.  Does a CPU-intensive handler effectively block the delivery 
of queued messages until it completes?
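
For what it's worth, a quick way to watch the drift for yourself 
(variable and field names are only illustrative):

local sLastTime

on cowSpeak
  if sLastTime is not empty then
    -- report how long it actually took since the last call
    put the milliseconds - sLastTime && "ms since last cowSpeak" & return after field "log"
  end if
  put the milliseconds into sLastTime
  send "cowSpeak" to me in 1 second
end cowSpeak

Run a long repeat loop in another handler while this is going and, as I 
understand it, the reported interval will stretch well past 1000 ms, 
since pending messages are only delivered when no handler is running.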
-- 

Rob Cozens
CCW, Serendipity Software Company
http://www.oenolog.com/who.htm

"And I, which was two fooles, do so grow three;
Who are a little wise, the best fooles bee."

from "The Triple Foole" by John Donne (1572-1631)


