What is LC's internal text format?
bobsneidar at iotecdigital.com
Tue Nov 20 13:55:05 EST 2018
> On Nov 20, 2018, at 10:24 , Ben Rubinstein via use-livecode <use-livecode at lists.runrev.com> wrote:
> This isn't about strongly typed variables though, but about when (correct) conversion is possible.
> LC throws an error if you implicitly ask it to convert the wrong kind of string to a number - for example, add 45 to "horse". (Obviously multiplication is fine: the answer would be "45 horses".)
> LC throws an error if you implicitly ask it to convert the wrong kind of string or number to a colour: try setting the backcolor of a control to "horse".
> LC throws an error if asked to convert a number, or the wrong kind of string, to a boolean: try setting the hilite of a button to 45.
> In all these cases, LC knows it cannot do the right thing, so it throws an error to tell you so, rather than guessing, for example, what the truth value of "45" is.
> I'm just suggesting that it cannot know how to correctly convert binary data into a string - so it should throw an error rather than possibly (probably?) do the wrong thing.
Too many assumptions about the "string" would be necessary here. What if I wanted to write a utility that displayed a disk sector in ASCII, the way the old Mac OS DiskEdit used to do? Certainly, much of the data would be impossible to format, but some might be discernible. My point is that there is nothing intrinsic about binary data that can positively identify it as the type of string or data you are expecting, whereas with typed data there is. So when it comes to binary data, it seems to me better to assume nothing about whether or not the data is valid.
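To illustrate the DiskEdit-style view described above, here is a minimal Python sketch (the function name, 16-byte row width, and dot placeholder are my own choices, not DiskEdit's actual layout): printable bytes are shown as ASCII, everything else as a dot, with no assumption about what the bytes "really" are.

```python
def ascii_view(data, width=16):
    """Render raw bytes the way a disk editor shows a sector."""
    lines = []
    for offset in range(0, len(data), width):
        chunk = data[offset:offset + width]
        # Hex column: every byte rendered, typed or not.
        hex_part = " ".join(f"{b:02x}" for b in chunk)
        # ASCII column: printable bytes shown, the rest replaced by "."
        text_part = "".join(chr(b) if 32 <= b < 127 else "." for b in chunk)
        lines.append(f"{offset:08x}  {hex_part:<{width * 3}} {text_part}")
    return "\n".join(lines)

# Some of this invented sample data is discernible as text, some is not:
sector = b"\x00\x01BOOT\x00\xff\xfeHello, world\x00\x00"
print(ascii_view(sector))
```

The tool makes no judgment about validity; it simply offers two interpretations of the same bytes and lets the reader decide which parts are meaningful.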
Again, I am not terribly versed in data processing at this level, but it seems to me that referring to binary data as "typed" data is a bit of a misnomer. ALL data is, in the end, stored as binary. Typing is a predefined way to structure how the data is stored, partly for efficiency and partly to preclude the hazards of processing the wrong kind of data at the machine level.
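The point that the bytes themselves carry no type can be shown concretely. Here is a small Python sketch using the standard `struct` module: the same four bytes decode to completely different values depending on the type the reader imposes on them.

```python
import struct

raw = b"\x00\x00\x80\x3f"  # four bytes with no intrinsic type

# Impose two different "typings" on the identical storage:
as_int = struct.unpack("<i", raw)[0]    # read as little-endian 32-bit integer
as_float = struct.unpack("<f", raw)[0]  # read as little-endian 32-bit float

print(as_int)    # 1065353216
print(as_float)  # 1.0
```

Nothing in the data says which reading is "correct"; the type lives in the program's declaration, not in the bytes.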
Imagine if, every time a microprocessor executed an addition, it had to check whether the binary values actually represented integers. The processing overhead would be prohibitive. So this checking is enforced by the compiler instead, hence data typing. What the binary data represents only matters to the higher-level language.
In the case of LC and other untyped languages, this is determined at run time rather than at compile time. That is really the big difference. In C++, typing prevents me as a developer from trying to add 45 to "horses" before I compile the entire application; in that kind of environment, compiling a large application can (or used to) take a really long time, and debugging to find out where you went wrong could be immensely tedious. Since LC "compiles" scripts as it goes, typing variables isn't really necessary anymore, and that frees us, the developers, to think about the flow of the application rather than get bogged down in the minutiae of programming faux pas.
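A Python analogy for the run-time-versus-compile-time distinction (Python, like LC, is dynamically typed; the function and values here are invented for illustration): the bad addition is perfectly legal to define, and the error only surfaces on the call that actually executes it.

```python
def add_caption(count, noun):
    # No declared types: any mismatch surfaces only when this line runs.
    return count + noun

# Defining the function raised nothing; the mistake appears at run time:
try:
    add_caption(45, "horses")
except TypeError as err:
    print("runtime type error:", err)

# The very same function works when the caller passes compatible values:
print(add_caption(45, 5))  # 50
```

A statically typed language would reject the first call before the program ever ran; here the check is deferred until execution, which is the trade-off described above.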
I freely admit though that when it comes to this subject matter, better minds than I are more suited to the discussion.