AIR for mobile’s weird touch implementation

Let me begin by saying that after playing with Adobe AIR for mobile for the last couple of weeks, I have been really pleasantly surprised by its general performance. I have done quite a bit of mobile development of late on both iOS and Android, and AIR might actually be a contender for my next project.

That said, I am kinda surprised by the touch event implementation. I am hoping someone will correct me if I am missing something, but here is what I am seeing so far.

In AIR, you can now set a multitouch input mode to either intercept raw touch events (MultitouchInputMode.TOUCH_POINT) or have touches mimic mouse events, with the runtime dispatching separate gesture events when gestures are performed (MultitouchInputMode.GESTURE). Note: once a gesture begins, you stop getting mouse events until the gesture has completed.
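For reference, here is roughly what choosing between the two modes looks like (a minimal sketch; the handler names are my own):

import flash.ui.Multitouch;
import flash.ui.MultitouchInputMode;
import flash.events.TouchEvent;
import flash.events.TransformGestureEvent;

// Option 1: raw touch points; you see every finger, but get no gestures.
Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;
stage.addEventListener(TouchEvent.TOUCH_BEGIN, onTouchBegin);
stage.addEventListener(TouchEvent.TOUCH_MOVE, onTouchMove);
stage.addEventListener(TouchEvent.TOUCH_END, onTouchEnd);

// Option 2: gestures (plus mouse-style events); no raw touches.
// You cannot enable both, so this stays commented out here.
// Multitouch.inputMode = MultitouchInputMode.GESTURE;
// stage.addEventListener(TransformGestureEvent.GESTURE_ZOOM, onZoom);
// stage.addEventListener(TransformGestureEvent.GESTURE_PAN, onPan);

function onTouchBegin(e:TouchEvent):void {
    trace("touch down", e.touchPointID, e.stageX, e.stageY);
}
function onTouchMove(e:TouchEvent):void {
    trace("touch move", e.touchPointID, e.stageX, e.stageY);
}
function onTouchEnd(e:TouchEvent):void {
    trace("touch up", e.touchPointID);
}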

My first problem is that you can pick one or the other, but not both. The former gives you raw touch events, so you can see when more than one touch is on the stage, but then you have to write your own code to define what a gesture is. The latter gives you gestures, but you will never know when more than one finger is on the stage. And what if you want to track touch points independently until a gesture begins? You are on your own there.

Additionally, there is very little positional information on the gesture event. There is a localX and a localY, which I presume mark the point between the two touches, but they barely change as the gesture progresses (I tried reading these values while panning halfway into a pinch, and the change in the values was nowhere near how much my fingers had moved). They seem to be the x and y coordinates of where the gesture began, and they do not update as the gesture events fire.
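Here is the kind of instrumentation I was using to watch those values (a sketch; onZoom is my own name). From my reading of the docs, the per-event movement is meant to show up in the scaleX/scaleY deltas rather than in localX/localY:

import flash.ui.Multitouch;
import flash.ui.MultitouchInputMode;
import flash.events.TransformGestureEvent;

Multitouch.inputMode = MultitouchInputMode.GESTURE;
stage.addEventListener(TransformGestureEvent.GESTURE_ZOOM, onZoom);

function onZoom(e:TransformGestureEvent):void {
    // In my tests, these stay pinned near where the gesture began
    trace("local:", e.localX, e.localY);
    // Incremental scale change since the previous zoom event
    trace("scale:", e.scaleX, e.scaleY);
    // GesturePhase.BEGIN / UPDATE / END
    trace("phase:", e.phase);
}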

Also, gestures like zoom and pan work independently but not together. If you are zooming (pinching) into an image with two fingers and then start moving both fingers in one direction without changing the distance between them (a pan), you don’t get pan gesture events. This is unlike the behavior of most apps that allow both zoom and pan.
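As far as I can tell, the only way around this is to drop to TOUCH_POINT mode and derive both transforms yourself: scale from the change in distance between the two touch points, and pan from the change in their midpoint. A minimal sketch, assuming this runs in the main document class with "image" some DisplayObject child:

import flash.events.TouchEvent;
import flash.geom.Point;
import flash.ui.Multitouch;
import flash.ui.MultitouchInputMode;
import flash.utils.Dictionary;

Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;

var points:Dictionary = new Dictionary();
var pointCount:int = 0;
var lastDistance:Number = -1; // baseline pinch distance; -1 means "not set"
var lastMid:Point;            // baseline midpoint of the two fingers

function onTouchBegin(e:TouchEvent):void {
    points[e.touchPointID] = new Point(e.stageX, e.stageY);
    pointCount++;
    lastDistance = -1; // reset the baseline whenever the fingers change
}

function onTouchMove(e:TouchEvent):void {
    var p:Point = points[e.touchPointID];
    if (p == null) return;
    p.x = e.stageX;
    p.y = e.stageY;
    if (pointCount != 2) return;

    // Collect the two active points, then compute distance and midpoint.
    var pair:Array = [];
    for each (var pt:Point in points) pair.push(pt);
    var dist:Number = Point.distance(pair[0], pair[1]);
    var mid:Point = Point.interpolate(pair[0], pair[1], 0.5);

    if (lastDistance > 0) {
        var scale:Number = dist / lastDistance;
        image.scaleX *= scale;        // zoom...
        image.scaleY *= scale;
        image.x += mid.x - lastMid.x; // ...and pan, in the same move
        image.y += mid.y - lastMid.y;
    }
    lastDistance = dist;
    lastMid = mid;
}

function onTouchEnd(e:TouchEvent):void {
    delete points[e.touchPointID];
    pointCount--;
    lastDistance = -1;
}

stage.addEventListener(TouchEvent.TOUCH_BEGIN, onTouchBegin);
stage.addEventListener(TouchEvent.TOUCH_MOVE, onTouchMove);
stage.addEventListener(TouchEvent.TOUCH_END, onTouchEnd);

A real implementation would also scale around the pinch midpoint rather than the image’s registration point, but it shows the idea.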

At this point I started looking at reading raw touch point data, and here is another implementation gap. AIR requires the developer to keep track of every touch point (identified by its touchPointID). There is no data model you can query on the runtime that gives you an array of current touch points. This is irritating and smells of a poor API design decision. Compare this with iOS’s touch API, where I get the set of all touch objects every time touches begin or change:


- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
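In AIR, that bookkeeping is entirely on you. Something like this (a sketch, with names of my own invention) is the minimum before you can even ask “which touches are down right now?”:

import flash.events.TouchEvent;
import flash.geom.Point;
import flash.utils.Dictionary;

// A queryable registry of active touches, roughly what iOS
// hands you for free as an NSSet.
var activeTouches:Dictionary = new Dictionary();

function currentTouchPoints():Vector.<Point> {
    var list:Vector.<Point> = new Vector.<Point>();
    for each (var p:Point in activeTouches) list.push(p);
    return list;
}

stage.addEventListener(TouchEvent.TOUCH_BEGIN, function(e:TouchEvent):void {
    activeTouches[e.touchPointID] = new Point(e.stageX, e.stageY);
});
stage.addEventListener(TouchEvent.TOUCH_MOVE, function(e:TouchEvent):void {
    var p:Point = activeTouches[e.touchPointID];
    if (p != null) { p.x = e.stageX; p.y = e.stageY; }
});
stage.addEventListener(TouchEvent.TOUCH_END, function(e:TouchEvent):void {
    delete activeTouches[e.touchPointID];
});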

Anyway, that’s as far as I have gotten. Maybe I am missing something, but it seems that if you are building something significant with AIR for mobile, you might need a custom, non-Adobe gesture library.

Anyone know of a good one?

Author: Arpit Mathur

Arpit Mathur is a Principal Engineer at Comcast Labs, where he is currently working on a variety of topics including Machine Learning, Affective Computing, and Blockchain applications. Arpit has also worked extensively on Android and iOS applications and Virtual Reality apps, as well as with web technologies like JavaScript, HTML, and Ruby on Rails. He also spent a couple of years in the User Experience team as a Creative Technologist.
