Transcript and Dot Notation - the really "embarrassing" concepts IMHO

jbv jbv.silences at club-internet.fr
Sun Feb 26 10:12:21 EST 2006


If you guys will allow me to squeeze a few words into this (hot) thread,
I'd like to say that the initial post by Dan Shafer contains a couple of
sentences that I would qualify as, ahem, "embarrassing", especially
given Dan's huge contribution to the xTalk world for many years...

Actually it reminds me of another "embarrassing" discussion that took
place on this very list a few months ago, in which Dan wrote that
goal-oriented interfaces were a huge step forward in UI design, even if
the rules were set by Microsoft...

Here's an example of what I find "embarrassing":
"I hope it *does* in fact adopt dot notation so that all of us who
have trained our brains to think in those terms when we create and
program with objects will be comfortable doing so"

the embarrassing part being, IMHO: "all of us who have trained our brains
to think in those terms"

I have a background in ergonomics and psychology, and it has been an
established fact for decades that newbies and experienced programmers approach
and memorize algorithms in completely different ways (I have already discussed
this a few times on this list over the past years, so please check the archives
for more info).

But at the same time, AFAIR, xTalk (and specifically HC) was designed to
allow newbies, and also people with no background as programmers but with enough
intellectual skill and rigour (and mostly no time and/or no desire to learn a
cumbersome language & notation), to develop sophisticated projects by themselves.
IOW, the design of xTalk & HC was to "train" computers to think the human
way...
And it seems to me that now Dan is promoting the other way around: adopting
dot notation because so far all OO languages use dot notation (mainly because they
have been created by professional programmers for professional programmers, except
perhaps for Smalltalk)... It really sounds like a big step backwards, back to a
pre-HC and pre-xTalk era... I learned assembler in 1976-78, and believe me, I do
know what it means to be "forced" to think in the same way as the computer...

Almost any algorithm can be described in a natural (verbose) language. If one says
that OO concepts benefit from dot notation (a non-natural language), does that imply
that the associated OO concepts are non-natural by nature? Frankly I don't think so:
AFAIR, OO languages like Smalltalk were created to promote more "natural"
concepts (sorry guys if I use such approximate words & concepts, but I really lack
the time to develop this)... Anyway, I guess you see the point I'm trying to make,
and the paradox I'm pointing at...
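
To make the paradox concrete, here's a minimal sketch: the same property
access written first in actual Transcript syntax, then in a hypothetical
dot-notation rendering of the kind most OO languages use (the dot-notation
lines are illustrative only, not real Transcript):

  -- xTalk / Transcript style: reads like an English sentence
  set the backgroundColor of button "OK" to "blue"
  put the width of field "Name" into tWidth

  -- hypothetical dot-notation equivalent (illustrative only)
  button("OK").backgroundColor = "blue"
  tWidth = field("Name").width

Both forms express exactly the same OO idea; the concept is no less
"natural" for being written in the first form.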

Last but not least: as for the claim that dot notation means a better implementation
and better performance, with computing power doubling every 18 months, I really wonder
how long this will remain an issue...

JB
