Large Recordsets - An unbuffered approach?
sunshine at public.kherson.ua
Wed Dec 7 16:56:24 EST 2005
On 12/7/05 11:34 PM, "Adam" <friendlygeek at gmail.com> wrote:
> Hi All:
> My project works. Sort of. ;) Thanks again to those who have been
> helping me out. I have done in two days what took me weeks to formulate
> elsewhere. Best of all, it has a GUI. :D
> Like I said, it sort of works... I can connect to the database and I can
> retrieve records, but this is the source of my problem. It would appear
> to me that an attempt is made to retrieve all the records matching the
> SQL criteria specified in the query before operating on the contents of
> the recordset. With a small recordset this approach is fine and dandy.
> But in my case, my criteria return 60 million-plus records.
> Previously, I had used PHP's mysql_unbuffered_query to fetch one record
> from the database at a time, which I could then process and discard
> before moving to the next record. I only had to execute one query to
> accomplish this, but I would have to loop through a while statement to
> the effect of:
> While lRecordCount <= lNoOfRecords - 1
>     Fetch the current record
>     Process the current record
>     Increment lRecordCount
> End While
> This worked very fast, as it didn't have to move the entire result set
> into memory (which my laptop ran out of this afternoon). ;)
> An aside: is it possible to have multiple threads, whereby an SQL
> query such as the one above can run while still allowing access
> to the program interface?
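For context, the unbuffered technique you describe looks roughly like
this in PHP. This is only a sketch: the table name, column names, and
connection details are invented for illustration.

<?php
// mysql_unbuffered_query() streams rows from the server as they are
// fetched, instead of buffering the whole result set in client RAM.
$link = mysql_connect('localhost', 'user', 'password');
mysql_select_db('mydb', $link);

$result = mysql_unbuffered_query(
    "SELECT id, payload FROM records", $link);
while ($row = mysql_fetch_assoc($result)) {
    // process the current record, then move on; only one row
    // needs to be held in memory at a time
}
mysql_free_result($result);
mysql_close($link);
?>

One caveat of this approach: until every row of the result has been
fetched, no other query can be issued on the same connection.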
You have met two problems:
* the result set is too big
* MySQL has only client-side cursors, so it tries to load the entire
  result set into the RAM of the client computer
You can try to use LIMIT, but often this is not what you want.
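A minimal sketch of the LIMIT approach, again in PHP with the same
invented table: fetch the rows in fixed-size chunks so that only one
chunk sits in client RAM at a time.

<?php
// Page through a huge result set in fixed-size chunks using LIMIT.
$link = mysql_connect('localhost', 'user', 'password');
mysql_select_db('mydb', $link);

$chunkSize = 10000;
$offset = 0;
do {
    $result = mysql_query(
        "SELECT id, payload FROM records ORDER BY id " .
        "LIMIT $offset, $chunkSize", $link);
    $rows = mysql_num_rows($result);
    while ($row = mysql_fetch_assoc($result)) {
        // process one record, then discard it
    }
    mysql_free_result($result);
    $offset += $chunkSize;
} while ($rows == $chunkSize);
mysql_close($link);
?>

Note that large OFFSET values get slower as the scan proceeds, because
the server still has to skip over all the earlier rows; on a huge table,
seeking by key (WHERE id > last-seen-id ... ORDER BY id LIMIT chunk)
scales better.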
VP Engineering and New Technology
Paradigma Software, Inc
Valentina - Joining Worlds of Information
[I feel the need: the need for speed]