Microsoft Surface (PixelSense)

When it first came out in 2008, the Microsoft Surface table (now known as PixelSense) brought a revolution to the world of touch devices, introducing a multitude of new interactions that people had only dreamed about.

The Microsoft Surface table is essentially a huge screen, set into a table, that is packed with sensors. Each pixel in the screen acts as a sensor capturing basic information about object proximity, which lets the device detect shapes. It can distinguish among a variety of shapes, both regular and irregular, and it supports gestures both on and above the screen.
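To make the idea of "every pixel is a proximity sensor" concrete, here is a minimal sketch (not Microsoft's actual pipeline, and purely my own illustration) of how a grid of per-pixel proximity readings could be turned into detected shapes: threshold the readings, then group touching pixels into blobs.

```python
# Hypothetical illustration: detect object "shapes" from a frame of
# per-pixel proximity readings by thresholding and flood-filling.

from collections import deque

def detect_shapes(proximity, threshold=0.5):
    """Return a list of blobs; each blob is a set of (row, col) pixels
    whose proximity reading exceeds `threshold`."""
    rows, cols = len(proximity), len(proximity[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if proximity[r][c] >= threshold and not seen[r][c]:
                # Flood-fill the connected region around this pixel.
                blob, queue = set(), deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    blob.add((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and proximity[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                blobs.append(blob)
    return blobs

# A toy 5x5 frame with two objects resting on the screen.
frame = [
    [0.9, 0.9, 0.0, 0.0, 0.0],
    [0.9, 0.9, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.7, 0.7],
    [0.0, 0.0, 0.0, 0.7, 0.7],
    [0.0, 0.0, 0.0, 0.0, 0.0],
]
print(len(detect_shapes(frame)))  # -> 2 shapes detected
```

The real device does far more than this (it classifies regular and irregular shapes and tracks gestures), but the sketch captures the basic step from raw per-pixel proximity data to recognised objects.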

The Microsoft Surface table offered a degree of pleasurability that was not common in interactive devices at the time. Affective design and computing, technologies of connected presence, and context-aware computing, terms that started becoming trends around 2004 [1], were all reflected in this device. I have attached an activity-theory analysis of sending files via Bluetooth using the Surface. The process is very intuitive: the two people who want to exchange files place their phones on the Surface, which reacts by displaying their files in an aesthetically pleasing layout on the screen. The sender can then tap a file and drag it onto the other person's phone to transfer it. This gesture feels so natural and human-like that it makes me want to use the device again, even though I could transfer my files without the Surface.
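As a rough way of modelling that interaction flow (place two phones, drag a file from one onto the other), here is a hedged sketch. It is not the Surface SDK; the names Phone, SurfaceSession, and send_via_bluetooth are hypothetical stand-ins I use only to spell out the sequence of events described above.

```python
# Hypothetical model of the drag-to-transfer interaction on the Surface.

from dataclasses import dataclass, field

@dataclass
class Phone:
    owner: str
    files: list = field(default_factory=list)

    def send_via_bluetooth(self, filename, target):
        # Stand-in for the real transfer; here we just hand the name over.
        target.files.append(filename)
        print(f"{self.owner} -> {target.owner}: {filename}")

class SurfaceSession:
    def __init__(self):
        self.phones = []           # phones currently recognised on the table

    def place_phone(self, phone):
        # The Surface detects the device and fans its files out on the screen.
        self.phones.append(phone)

    def drag_file(self, filename, sender, receiver):
        # The drag gesture from a file thumbnail onto the other phone
        # is what triggers the transfer.
        if sender in self.phones and receiver in self.phones and filename in sender.files:
            sender.send_via_bluetooth(filename, receiver)

session = SurfaceSession()
alice, bob = Phone("Alice", ["holiday.jpg"]), Phone("Bob")
session.place_phone(alice)
session.place_phone(bob)
session.drag_file("holiday.jpg", alice, bob)   # Alice -> Bob: holiday.jpg
```

What makes the real interaction compelling is precisely that none of this machinery is visible to the user: the drag gesture alone carries the whole intent.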

In conclusion, I would say that the tangible user interface (TUI) the Surface offered me has left a mark in my memory as a remarkably intuitive one.

[1] Kaptelinin, V., & Nardi, B. A. (2006). Acting with technology: Activity theory and interaction design. Cambridge, MA: MIT Press.