Custom Property Data Limit?
bdrunrev at gmail.com
Fri Dec 2 06:01:14 EST 2011
Some years back I was working with a few joined tables in a commercially
available relational database. Making certain queries across these joined
tables became unusably slow as the size of the tables increased (I can't
remember the numbers, but certainly in the millions, possibly billions).
As the individual data items were small (mostly latitudes and longitudes),
I put all the data into custom properties, and queried those instead. The
responses were instantaneous.
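The approach above can be sketched in LiveCode. This is a minimal sketch, not the original code; the property name "cCoords" and the button "DataCache" are assumptions:

```
-- Load coordinate data (one "lat,lon" pair per line) into a custom
-- property once, then query it by line instead of via SQL joins.
-- "cCoords" and button "DataCache" are hypothetical names.
on loadCoords pData
   set the cCoords of button "DataCache" to pData
end loadCoords

function coordAt pLineNumber
   -- chunk access into a custom property, no database round-trip
   return line pLineNumber of the cCoords of button "DataCache"
end coordAt
```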
If I had to look up a lot of data and get fast responses, I would consider
doing it in custom properties rather than any database. In fact, in my
tests, custom properties (and arrays) were faster than using in-memory
databases.
I came across a series of benchmarks a few years ago which compared the
performance of most of the major databases versus PHP's own hash tables (I did
a quick google but couldn't find them right now). The PHP hash tables won
hands down (by roughly an order of magnitude). The whole "prevayler"
hypothesis is basically founded on this difference in speed. Whilst many
enterprisey people were (apparently ideologically) opposed to the idea of
principally working on data in memory, I seem to remember that this kind of
thing has reappeared recently with projects such as redis (and if memory
serves me right, some corporate entity like Oracle has just come out with
something similar).
If it is a question of data analysis rather than persistence per se, I'd do
some testing with arrays/custom properties and go from there.
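One quick way to run such a test is to time a lookup with the milliseconds. A sketch only; the line number, property, and button names are hypothetical:

```
-- Rough benchmark: time a single line lookup into a custom property.
-- "cCoords" and button "DataCache" are hypothetical; adjust to your stack.
on mouseUp
   local tStart, tResult
   put the milliseconds into tStart
   put line 28455999 of the cCoords of button "DataCache" into tResult
   answer "Lookup took" && (the milliseconds - tStart) && "ms:" && tResult
end mouseUp
```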
You wouldn't necessarily need to display 1 billion rows in a datagrid. You
would extract the subset you need and then display them in the datagrid.
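For example, the filter command can pull out just the matching lines and hand only those to the DataGrid. This is a sketch under assumptions: the group name, property name, and match pattern are mine, not from the original post:

```
-- Extract only the rows of interest before display, rather than
-- loading a billion rows into the grid. Names here are hypothetical.
on showSubset pPattern
   local tData
   put the cCoords of button "DataCache" into tData
   filter tData with pPattern  -- wildcard pattern, e.g. "40.7*"
   set the dgText of group "ResultsGrid" to tData
end showSubset
```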
On Fri, Dec 2, 2011 at 12:42 AM, <dunbarx at aol.com> wrote:
> Just for the hell of it, I made a custom property of a button with one
> billion chars, 100,000,000 lines of ten chars. If I ask it for line, say,
> 28455999, it gives it instantly.
> Craig Newman