OK, the title sounds a bit fancier than the post actually is. This is not about what future phones might look like, but rather about something related to touch-based UIs (or, to use the fancy term, UX :)) that has bothered me for a while now.

Preamble: To be honest, I haven’t had much opportunity to use various touch devices apart from my Symbian S60v5 phone, and I’ve played a bit with Android devices. With that said, I really have no clue whether the ideas presented here are already implemented somewhere, but I haven’t seen them.

Current gestures

This part is a bit ranty, so feel free to jump down to the next section. :)

When touch-based devices went mainstream, the most touted thing was how natural the UI was. Palm WebOS closes apps when you flick them away (IIRC), iOS uses pinch zooming, etc.

While I have nothing against these concepts (they work quite well), let us consider how natural they actually are.

  • Get a pen and a notebook. When you finish writing, do you drag the notebook down from your desk? No, you close it and put it away.
  • Open your photo album and take one photo out of it. Put it on the table. Put your index finger in the centre and your thumb somewhere else on the photo. Now try to zoom in by stretching it.

So, please stop using the word ‘natural’ and replace it with ‘intuitive’ (which is also debatable, but at least can’t be considered plain wrong).

Guided gestures

So, gestures are useful for making some tasks quicker. The thing I want to be able to do faster is to choose a contact and call or text them without needing to ‘click’ more than once. The gestures could be ‘drag a contact up to call’ and ‘drag a contact down to send an SMS’. But that is not really intuitive.

Enter guides:

Mobile Concept 1 - Contact dashboard

The picture mostly says it all, but here’s a short synopsis - you have your favourite contacts placed on a dashboard of sorts. Every contact gets a size proportional to how often you contact them. Other contacts are automatically added if you have missed calls or unread messages from them.
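The sizing rule above could be implemented in many ways; here is a minimal sketch of one option, a linear interpolation between a minimum and maximum tile size. The function name, the pixel sizes, and the linear scaling are all my assumptions for illustration, not what the post’s QML code does.

```python
# Hypothetical sketch: give each favourite contact a tile whose size
# grows with how often you contact them. Sizes and scaling are made up.

MIN_SIZE, MAX_SIZE = 64, 160  # tile side lengths in pixels (assumed)

def tile_size(interactions: int, max_interactions: int) -> int:
    """Linearly map interaction count onto [MIN_SIZE, MAX_SIZE]."""
    if max_interactions <= 0:
        return MIN_SIZE  # no data yet - every contact gets the base size
    ratio = interactions / max_interactions
    return round(MIN_SIZE + ratio * (MAX_SIZE - MIN_SIZE))
```

The contact you talk to most ends up at `MAX_SIZE`, someone you never contact at `MIN_SIZE`, and everyone else in between.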

When you touch the screen, the guides appear - in this case there are four actions - call (up), SMS (down), info (right), more (left). If you want to send an SMS, just move your finger down and release it.
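The gesture recognition this needs is simple - on release, pick whichever axis dominated the drag. A minimal sketch, assuming a dominant-axis rule and a small dead zone so a plain tap doesn’t trigger anything; the function name and threshold are hypothetical, not from the actual implementation:

```python
# Hypothetical sketch: map a drag vector to one of the four guide
# actions (call = up, sms = down, info = right, more = left).

DEAD_ZONE = 20  # pixels; shorter drags count as a tap (assumed value)

def resolve_gesture(dx, dy):
    """Return the action for a drag, or None for a plain tap.

    dx, dy are in screen coordinates, with y growing downwards.
    """
    if max(abs(dx), abs(dy)) < DEAD_ZONE:
        return None  # just a tap - only the guides were shown
    if abs(dy) >= abs(dx):                 # vertical drag dominates
        return "call" if dy < 0 else "sms"
    return "info" if dx > 0 else "more"    # horizontal drag
```

Diagonal drags resolve to the nearer axis, which matches how the on-screen guides are laid out.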

Mobile Concept 2 - Message List

Now, the same for the message list. Dragging up/down behaves as expected - the list scrolls. But when you drag left or right, you get two commonly used actions - ‘delete’ and ‘forward’ (replying is embedded at the bottom of the window - not shown in the picture).

When you touch one of the messages, you get the icon guides. As you drag the item towards one of the sides, the icon for that action becomes less transparent, and so does the tooltip - a visual indication of how far you still need to drag the item to perform the desired action.
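That fade-in feedback boils down to mapping the drag distance onto an opacity. A minimal sketch, assuming a linear ramp up to an activation threshold; the threshold value and names are made up for illustration (in QML this would typically be a property binding rather than a function):

```python
# Hypothetical sketch: the 'delete'/'forward' icon fades in as the
# message item is dragged sideways. Threshold is an assumed value.

ACTIVATION_DISTANCE = 120.0  # horizontal drag (px) that triggers the action

def icon_opacity(dx):
    """Opacity of the action icon for a horizontal drag of dx pixels.

    0.0 = fully transparent (no drag), 1.0 = fully opaque (action fires
    on release).
    """
    return min(abs(dx) / ACTIVATION_DISTANCE, 1.0)
```

Clamping at 1.0 means the icon stops changing once the drag is far enough, which doubles as the "you can release now" signal.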


Just to note that these are not mockups - the entire UI is implemented in QML. It is not yet connected to real data, but the widgets are as real as they get. :)

You can support my work on , or you can get my book Functional Programming in C++ at if you're into that sort of thing.