kee at kagi.com
Mon Aug 15 17:32:13 EDT 2011
In my perfect programming world ...
I'd want every place characters are displayed or entered to treat them as Unicode characters, represented as UTF-8 bytes.
If the display version has "割劥" I'd want the language to recognize those as two characters and as 6 bytes.
I want UTF-8 instead of UTF-16 because UTF-8 is the same byte stream regardless of processor endianness and, more importantly, the web overwhelmingly uses UTF-8.
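A quick sketch of the two points above (in Python rather than LiveCode, purely as an illustration): the two CJK characters count as 2 characters but 6 UTF-8 bytes, and UTF-16 output depends on byte order while UTF-8 does not.

```python
s = "割劥"

# Character count vs. UTF-8 byte count:
print(len(s))                  # 2 characters
print(len(s.encode("utf-8")))  # 6 bytes (3 per BMP CJK character)

# UTF-16 byte streams differ between little- and big-endian;
# the UTF-8 byte stream is the same everywhere.
print(s.encode("utf-16-le") == s.encode("utf-16-be"))  # False
```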
Is this crazy talk or would this be your ideal programming system for unicode?