Yesterday David Arno tweeted this:
“Use proper affordances — buttons should look like buttons” Why? How often do ppl press physical buttons compared with hitting touch areas? (source).

David hits the nail on the head. A touch area should look like a touch area – the term “button”, carried over from the tactile world to the touchscreen, is redundant. It’s redundant because the tactile button itself is nearly extinct. Sure, there are still physical buttons on kitchen appliances and the like, but a great many of today’s buttons live on digital interfaces.

And just think of the generation growing up today. None of them will make the connection between tactile buttons and touch areas on a touchscreen. To them, a touch area is simply a touch area. They’ve built a different context. I think anyone who has watched a small child pinch-zoom and swipe on an iPhone (my oldest did this at 18 months, much to my surprise) would agree that this is a new type of affordance to them. We wouldn’t dream of comparing that gesture to sliding doors, which is probably the closest tactile equivalent if we were to follow the “button – touchscreen” analogy.

What I’m trying to say is: sometimes we should stop and cast a critical eye over the theory and guidelines we base (interface) design on, and instead think about the new contexts in which our interfaces are actually used.


3 Responses

  1. My point is, Rasmus: a touch point in an interface doesn’t HAVE TO look like a “real” button for people to tap it. If we stick to conventions of proximity, alignment, and so on, people will know it’s a touch point – even if it isn’t shaped like an elevator button :)
