Develop the taxonomy further in order to describe and relate tangible UI experiences

As I read through the paper, the UI that came to mind was the Wii U. Since I have not personally interacted with many tangible UIs, it is hard to recall and relate to examples when I try to apply the framework the paper provides. Still, I really like the paper's simple, two-axis taxonomy as a way to classify, understand, and survey the field of tangible UIs.

The Wii U offers interactive gaming experiences in which the controller can act as many different objects once it is paired with sensors that track the user's movement and the controller's position: it can become a tennis racket, a table tennis paddle, or a baseball bat. In the paper's terms, the controller can stand in for multiple objects across different settings (a noun metaphor), and the player's motions simulate the real-world action of using those objects (a verb metaphor), so the Wii U offers both noun and verb metaphors. On the embodiment axis, the Wii U qualifies as distant embodiment, since it resembles the paper's example of a controller acting on a television.

The framework is a useful starting point for analyzing the interactions tangible UIs offer, but it does not cover the nature of the interaction in much depth, such as input intensity or the different kinds of output responses. It focuses on the relationship between the user and the UI and says less about the experience itself. If I were to modify the taxonomy, I would work on ways to categorize and relate different types of input and output so that they fit within the metaphor/embodiment framework.
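To make that proposed extension concrete, here is a minimal sketch (in Python, with names I made up for illustration) that encodes the paper's two axes and adds an input/output profile on top of them. The metaphor and embodiment values follow the taxonomy as I understand it; the input-intensity and output-modality categories are purely my own guesses, not anything from the paper.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Metaphor(Enum):
    """Metaphor axis: how strongly the TUI resembles a real-world object or action."""
    NONE = auto()
    NOUN = auto()           # the device stands in for an object
    VERB = auto()           # the gesture mimics the real-world action
    NOUN_AND_VERB = auto()  # both at once
    FULL = auto()           # the device effectively is the thing it represents


class Embodiment(Enum):
    """Embodiment axis: how close the output is to the manipulated object."""
    FULL = auto()           # output comes from the object itself
    NEARBY = auto()         # output appears next to the object
    ENVIRONMENTAL = auto()  # output surrounds the user (ambient light, sound)
    DISTANT = auto()        # output is elsewhere, e.g. on a television


# --- Proposed extension: describe the interaction itself, not just the relationship ---

class InputIntensity(Enum):
    SUBTLE = auto()    # small wrist motions, button presses
    MODERATE = auto()  # arm swings
    VIGOROUS = auto()  # whole-body movement


class OutputModality(Enum):
    VISUAL = auto()
    AUDIO = auto()
    HAPTIC = auto()    # e.g. controller rumble


@dataclass
class TangibleUIProfile:
    name: str
    metaphor: Metaphor
    embodiment: Embodiment
    input_intensity: InputIntensity
    output_modalities: tuple[OutputModality, ...]


# The controller-as-tennis-racket example, classified under the extended scheme.
wii_tennis = TangibleUIProfile(
    name="Wii U controller as tennis racket",
    metaphor=Metaphor.NOUN_AND_VERB,  # acts as a racket (noun) and swings like one (verb)
    embodiment=Embodiment.DISTANT,    # feedback appears on the television
    input_intensity=InputIntensity.MODERATE,
    output_modalities=(OutputModality.VISUAL, OutputModality.AUDIO, OutputModality.HAPTIC),
)

print(wii_tennis)
```

The point of the sketch is that the two existing axes slot in unchanged, and the extra fields simply sit alongside them, so the extension relates different input/output types without disturbing the metaphor/embodiment classification.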
