Large Recordsets - An unbuffered approach?

Lynch, Jonathan bnz2 at cdc.gov
Wed Dec 7 17:30:19 EST 2005


This is probably an ignorant suggestion, but...

Rather than returning the entire record set, can you just return a list
of the record IDs from your query? 60 million record IDs should not
overwhelm your memory.

Then you can go down the list: for each record (or a predetermined
batch of records), load the content, perform your analysis, and discard
the data before moving on.
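Something along these lines would do it with Revolution's database
library - a rough sketch only, with invented table and column names
("records", "id", "content") standing in for your real schema:

on processByID pConnectionID
   -- assumes pConnectionID is an already-open MySQL connection
   local tCursorID, tIDList, tContent
   -- first pass: pull back just the IDs, which fit in memory
   put revQueryDatabase(pConnectionID, "SELECT id FROM records") into tCursorID
   repeat until revQueryIsAtEnd(tCursorID)
      put revDatabaseColumnNamed(tCursorID, "id") & return after tIDList
      get revMoveToNextRecord(tCursorID)
   end repeat
   revCloseCursor tCursorID
   -- second pass: load, analyse and discard one record at a time
   repeat for each line tID in tIDList
      put revDataFromQuery(tab, return, pConnectionID, \
            "SELECT content FROM records WHERE id = " & tID) into tContent
      -- perform the analysis on tContent here, then move on
   end repeat
end processByID

Fetching the content in batches of IDs (WHERE id IN (...)) would cut
the per-row query overhead, but one record at a time keeps it simple.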



-----Original Message-----
From: use-revolution-bounces at lists.runrev.com
[mailto:use-revolution-bounces at lists.runrev.com] On Behalf Of Ruslan
Zasukhin
Sent: Wednesday, December 07, 2005 4:56 PM
To: use-revolution
Subject: Re: Large Recordsets - An unbuffered approach?

On 12/7/05 11:34 PM, "Adam" <friendlygeek at gmail.com> wrote:

> Hi All:
> 
> My project works. Sort of. ;)  Thanks again to those who have been
> helping me out.  I have done in two days what took me weeks to
> formulate elsewhere.  Best of all it has a GUI. :D
> 
> Like I said, it sort of works... I can connect to the database, and I
> can retrieve records, but this is the source of my problem.  It would
> appear to me that an attempt is made to retrieve all the records
> matching the SQL criteria specified in the query before operating on
> the contents of the recordset.  With a small recordset this approach
> is fine and dandy.  But in my case, my criteria return 60-million-plus
> records.
> 
> Previously, I had used PHP's mysql_unbuffered_query to fetch one
> record from the database at a time, which I could then process and
> discard before moving to the next record.  I only had to execute one
> query to accomplish this, but I would have to loop through a while
> statement to the effect of:
> 
> Roughly, in PHP:
> --
> 
> $result = mysql_unbuffered_query($sql);
> while ($row = mysql_fetch_assoc($result)) {
>     // process the current record here, then discard it and move on
> }
> 
> This worked very fast, as it didn't have to move the entire result
> set into memory.  (Which my laptop ran out of this afternoon.) ;)
> 
> An aside:  Is it possible to have multiple threads, whereby an SQL
> query such as the one above can run while still allowing access to
> the program interface?

Hi Adam,

You have met two problems:

* the result set is too big
* MySQL has only client-side cursors, so it tries to load the whole
  result into the RAM of the client machine.

You can try to use LIMIT to fetch the result in chunks, but often this
is not what you want.
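For example, chunking with LIMIT might look like this in Revolution - a
rough sketch only, with invented table and column names ("records",
"id", "content"); keying each chunk on the last id seen avoids the cost
of a growing OFFSET:

on processInChunks pConnectionID
   -- assumes id is an indexed numeric key and that content contains
   -- no tab or return characters
   local tLastID, tChunk
   set the itemDelimiter to tab
   put 0 into tLastID
   repeat
      put revDataFromQuery(tab, return, pConnectionID, \
            "SELECT id, content FROM records WHERE id > " & tLastID & \
            " ORDER BY id LIMIT 10000") into tChunk
      if tChunk is empty then exit repeat
      repeat for each line tRow in tChunk
         put item 1 of tRow into tLastID -- remember how far we got
         -- analyse item 2 of tRow here, then let it go
      end repeat
   end repeat
end processInChunks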


-- 
Best regards,

Ruslan Zasukhin
VP Engineering and New Technology
Paradigma Software, Inc

Valentina - Joining Worlds of Information
http://www.paradigmasoft.com

[I feel the need: the need for speed]


_______________________________________________
use-revolution mailing list
use-revolution at lists.runrev.com
Please visit this url to subscribe, unsubscribe and manage your
subscription preferences:
http://lists.runrev.com/mailman/listinfo/use-revolution



