What is LC's internal text format?
mark at livecode.com
Tue Nov 13 05:52:52 EST 2018
On 2018-11-13 11:06, Geoff Canyon via use-livecode wrote:
> I don't *think* I'm confusing binary string/data with binary numbers --
> was just trying to illustrate that when a Latin Small Letter A (U+0061)
> gets encoded, somewhere there is stored (four bytes, one of which is) a
> byte 97, i.e. the bit sequence 1100001, unless computers don't work
> that way anymore.
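The encoding Geoff describes is easy to verify. Here is a small sketch in Python (used purely as a neutral illustration of the byte layout, not of LC's engine): encoding "a" as UTF-32 little-endian yields four bytes, one of which is 97 (binary 1100001).

```python
# Encode U+0061 ("a") as UTF-32 little-endian: four bytes total.
data = "a".encode("utf-32-le")
assert data == b"\x61\x00\x00\x00"     # one byte is 0x61 = 97, rest are zero
assert data[0] == 97
assert format(97, "b") == "1100001"    # the bit sequence Geoff mentions
```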
Yes - a byte is not a number, a char is not a number, and a bit sequence
is not a number.
Chars have never been numbers in LC - when LC sees a char, it sees a
string, and so when such a thing is used in number context it converts
it to the number it *looks* like - i.e. "1" -> 1, but "a" -> error in
number context (bearing in mind that the code for "1" is not 1).
i.e. numToChar(charToNum("1")) + 0 -> 1
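The same "looks like a number" rule can be sketched in Python (an analogy only - `ord`/`chr` stand in for charToNum/numToChar, and `int()` stands in for LC's number-context conversion): the round-trip preserves the character, and it is the *text* "1" that converts to 1, not the code point 49.

```python
# Python analogue of numToChar(charToNum("1")) + 0 -> 1
assert ord("1") == 49                 # the code for "1" is not 1
assert chr(ord("1")) == "1"           # round-trip gives the char back
assert int(chr(ord("1"))) + 0 == 1    # number context uses what it *looks* like
```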
The same is true for 'byte' in LC7+ (indeed, prior to that, 'byte' was a
synonym for 'char').
> What I now see is tripping me up is the implicit cast to a character
> that charToNum supports, without the corresponding cast being
> supported in numToChar -- i.e. this fails:
> put textEncode("a","UTF-32") into X;put numtochar(byte 1 of X)
Right, so that shouldn't work - byte 1 of X here is <97> (a byte). Bytes
get converted to native chars in string context, so
numToChar(byte 1 of X) -> numToChar(<97> as char) -> numToChar("a"),
and "a" is not a number.
You'd get exactly the same result if you did put numToChar(char 1 of
"a").
As I said, bytes are not numbers, just as chars are not numbers - bytes
do implicitly convert to (native) chars though - so when you use a
binary string in number context, it gets treated as a (native) string.

Put another way, just as the code for a char is not used in conversion
in number context, the code of a byte is not used either.
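That byte-to-native-char conversion can also be sketched in Python (again only an analogy for LC's behaviour, with Latin-1 standing in for LC's native encoding): the byte <97> becomes the char "a", and in number context only the char's *textual* appearance matters, so it errors.

```python
# A byte implicitly becomes a (native) char; its code is not used
# in number context - only what the char looks like.
b = bytes([97])                  # the byte <97>
c = b.decode("latin-1")          # byte -> native char
assert c == "a"
try:
    int(c)                       # "a" in number context
except ValueError:
    pass                         # -> error, as in LC
else:
    raise AssertionError("expected a conversion error")
```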
Mark Waddingham ~ mark at livecode.com ~ http://www.livecode.com/
LiveCode: Everyone can create apps