How to sync multiple functions / routines in parallel

Glen Bojsza gbojsza at gmail.com
Sat Apr 23 17:11:51 EDT 2011


Good thoughts on approaches... I am starting down the rabbit hole!

thanks

On Sat, Apr 23, 2011 at 1:26 PM, Mike Bonner <bonnmike at gmail.com> wrote:

> There is probably a way to do what you want directly with a single select,
> but I'm a beginner-to-middling of all trades, master of none. I have another
> silly idea at the bottom, but here's more on the current questions first.
>
> I think that as long as you put a semicolon between queries, and don't put a
> trailing semicolon, you might be able to send all 10 at once. I don't know
> whether it would be any faster this way; that would require testing.
>
> Pretty sure revDataFromQuery is blocking. So to test, you might time 10
> individual revDataFromQuery calls in a row versus 1 call with all 10 requests
> included, and see if there's a gain. The responses are still going to be
> sequential in this case.
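>
> A rough timing sketch of that comparison (untested; tDatabaseID and
> tQueryList are placeholders for your open connection and your ten SELECT
> statements, and whether the combined form works at all depends on the
> driver):
>
> command timeQueries tDatabaseID, tQueryList
>    -- time the ten statements run one at a time
>    put the milliseconds into tStart
>    repeat for each line tQuery in tQueryList
>       put revDataFromQuery(tab, return, tDatabaseID, tQuery) into tData
>    end repeat
>    put the milliseconds - tStart into tIndividualTime
>    -- time the same statements joined with semicolons, no trailing one
>    replace return with ";" in tQueryList
>    put the milliseconds into tStart
>    put revDataFromQuery(tab, return, tDatabaseID, tQueryList) into tData
>    put the milliseconds - tStart into tCombinedTime
>    answer "Individual:" && tIndividualTime && "ms vs combined:" && tCombinedTime && "ms"
> end timeQueries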
>
> It's starting to sound to me like the issue is that you need to grab all the
> data, fast, and process all of it within your 10 second time frame, and you
> can't be processing while you're running a query, so any query time is
> basically lost processing time.
>
> If this is the case, and you MUST hit that 10 second mark or less, you may
> need to benchmark the individual parts and then see which you can optimize
> the most. My guess is the queries themselves are very, very fast, so there
> might be more gain to be had elsewhere.
>
> If you're not using the internal sqlite driver, the open process option
> might work best for this. I've never really done much with process control
> though, so I'm not much help there.
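>
> In case it helps, a minimal open process sketch (untested; "queryHelper" is a
> hypothetical command-line helper that prints its result and quits):
>
> command runHelperQuery
>    open process "queryHelper" for text read
>    read from process "queryHelper" until empty -- read everything the helper writes
>    put it into tResult
>    close process "queryHelper"
>    -- hand tResult to whatever does your processing
> end runHelperQuery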
>
>
> The silly idea (that is way beyond me) is this. If, as you say, you have 10
> queries to run, create a query engine whose only job is to perform a single
> query. It will need some way of interprocess communication so it can talk to
> the controller; if that's over the network, looking at chatrev might give
> some ideas. Build this app.
>
> Build a controller whose job is to process the data from the query engines.
> Set up its communication so it can receive data from the query agents. To
> launch a query agent, copy the agent itself to a temp location with a unique
> name designating the query to perform (you could use the filename, the
> seconds, or whatever you wish inside the query agent to determine the agent
> id). A rough sketch of that launch step follows below.
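>
> A minimal sketch of launching agents that way (untested; the "QueryAgent"
> path and the agent count are made up for illustration):
>
> command launchAgents pAgentPath, pNumAgents
>    repeat with i = 1 to pNumAgents
>       -- copy the standalone to a unique temp name so each agent knows its id
>       put the tempName & "_agent" & i into tAgentCopy
>       put URL ("binfile:" & pAgentPath) into URL ("binfile:" & tAgentCopy)
>       -- note: on OS X / Linux the copy may need its executable bit restored
>       open process tAgentCopy for neither -- start it without read/write pipes
>    end repeat
> end launchAgents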
>
> My vision is this: the controller is started, it launches and logs in 10
> unique agents, the agents are then told to start querying, and each completed
> query is sent back to the controller for processing. This way each agent runs
> in its own sandbox (they can't just be substacks, they must be fully built
> standalones), each process can be doing its own thing, and the controller
> does the processing. Since these are only selects, it should be safe to do
> this with sqlite, I think.
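>
> For the interprocess communication piece, a minimal socket sketch of the
> controller side (untested; port 8010 and the handler names are arbitrary
> choices, and processYourDataHere is a hypothetical handler that does the
> real work):
>
> command startController
>    accept connections on port 8010 with message "agentConnected"
> end startController
>
> on agentConnected pSocket
>    -- ask for one line of results from this agent, delivered via callback
>    read from socket pSocket until return with message "agentDataArrived"
> end agentConnected
>
> on agentDataArrived pSocket, pData
>    processYourDataHere pData
>    -- keep listening for the agent's next result
>    read from socket pSocket until return with message "agentDataArrived"
> end agentDataArrived
>
> -- each agent would then do something along the lines of:
> --   open socket to "127.0.0.1:8010" with message "connectedToController"
> --   write tQueryResult & return to socket "127.0.0.1:8010"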
>
> If you're using a db server that can be accessed from a web server, it's
> easier: only the controller needs to be created, plus simple query scripts on
> a local web server that can access the database. Untested, but hopefully you
> get the idea.
>
> controller starts
>
> -- sends the requests
> command sendRequests
>    global numOfRequests, requestCount -- globals to track whether a batch of requests is complete
>    put 10 into numOfRequests -- set this to how many query requests there will be
>    put 1 into requestCount -- set the counter to the first query
>    load URL "http://yourfirst.unique.query.url" with message "loadComplete" -- start loading queries
>    load URL "http://yournext.unique.query.url" with message "loadComplete" -- ...however many queries you need
> end sendRequests
>
> command loadComplete tUrl
>    global numOfRequests, requestCount -- same globals
>    put URL tUrl into tData -- tUrl is passed in by the engine during the callback
>    processYourDataHere tData -- of course, this is where you process
>    unload tUrl -- unload the unique url so it will reload fresh data next time
>    if requestCount = numOfRequests then -- check if all queries in the current batch are complete
>       put 1 into requestCount -- reset the counter
>       sendRequests -- and start a new batch
>    else
>       add 1 to requestCount -- if more need to complete, increment the counter
>    end if
> end loadComplete
>
> Untried and it needs cleanup, but it's an option. And web servers are really
> good at handling multiple requests.
>
> Be aware that if you're accessing large amounts of data, each cached page
> takes up memory, and when you pull the data from the cached url, that takes
> up memory too. It's possible to bring a machine to a grinding halt if you're
> not careful. As long as you aren't storing cumulative data in your stack
> (i.e., you're processing it and writing it to a file), you're probably fine.
> If, however, you um... don't *cough*, the server will most likely be happy
> and fine, but the app and system go boom. Not that I've done this.


