Simulating touch messages on desktop - what do the smart kids do?
Ben Rubinstein
benr_mc at cogapp.com
Mon Feb 13 14:04:46 EST 2012
My view is that development in LiveCode is so rapid largely because it
avoids the code/compile/test cycle (or at least makes the middle bit
unnoticeably fast). When applied to mobile development, I think that's huge
- we can do most of our work in a desktop context (that is, in the IDE), only
occasionally having to build out to simulator or device to test. As soon as I
run up against something where a change can only be tested on the device or
simulator, my rate of progress drops to a crawl.
Unfortunately, the desktop engines, and hence the IDE, don't support touch
messages. (And even though many Mac and Windows machines now have
multi-touch input devices, this is complicated by the fact that on these
machines, touches don't have a location on screen. So I don't expect this to
change soon.)
So when I write code responding to touchStart/End/Move/Release, I find I'm
thrown back to the old world for testing, and I don't like it.
I tried simply adding mouseDown/Up/Move/Release handlers that called the
corresponding touch handlers, but that didn't seem to be a complete solution
(not exactly sure why not, but I did this to a stack that was working on iOS,
and it went awry...). And in any case, that would help for the simple case of
a single touch at a time, but not for the more interesting ones involving
multi-touch gestures, or overlapping touches.
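For reference, the kind of naive forwarding I mean is roughly the sketch
below - a single simulated touch only, with kFakeTouchID being just an
arbitrary constant I made up to stand in for a touch ID, and "the
environment" used to guess whether we're on a real device:

```
-- Sketch only: forward mouse messages to the corresponding touch
-- handlers when running on desktop. kFakeTouchID is an arbitrary
-- made-up ID for the one simulated touch.
constant kFakeTouchID = 1

on mouseDown
   if the environment is not "mobile" then
      touchStart kFakeTouchID
   else
      pass mouseDown
   end if
end mouseDown

on mouseMove pX, pY
   if the environment is not "mobile" then
      touchMove kFakeTouchID, pX, pY
   else
      pass mouseMove
   end if
end mouseMove

on mouseUp
   if the environment is not "mobile" then
      touchEnd kFakeTouchID
   else
      pass mouseUp
   end if
end mouseUp

on mouseRelease
   if the environment is not "mobile" then
      touchRelease kFakeTouchID
   else
      pass mouseRelease
   end if
end mouseRelease
```

As noted above, this sort of thing didn't behave identically to real
touches for me, and it obviously can't exercise multi-touch or
overlapping-touch code paths at all.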
So my question: what do the smart kids do to speed up their development in
this case?
TIA,
Ben