HDC 10.03 - Gestural Interfaces - Discussion Forum

I’m pretty sure these are called implicit interactions (at least that’s what I’ve been calling them).

Every day, we as humans communicate in person, and not just through talking. Gestures, expressions, and tone all complement what we say in a conversation by conveying emotion. The same idea applies to objects: the act of doing one thing can imply something else. The example of moving chairs on or off the tabletop is a great one. When we look into a restaurant and see the chairs up on the tables, we know the place is closed. It's implied! These sorts of interactions are already part of our lives, and the more integrated and connected everything becomes, the more prevalent these seemingly obvious interactions will be.

For example, if your bed were hooked up to your floor and your clock, you could build an entire wake-up system. The floor would have a touch sensor that has to be stepped on to turn off your alarm, and that same action would prepare your bathroom by heating up the shower and setting up whatever else you do in the morning. Since most people have a morning ritual, this "smart house" concept could work… who knows.
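
Just to make that concrete, here's a rough sketch in Python of how that single step-on-the-floor gesture could fan out into the rest of the routine. The device classes here (Alarm, Shower, FloorSensor) are made-up stand-ins for illustration, not any real smart-home API.

```python
# Hypothetical sketch of the wake-up idea above; the device classes are
# stand-ins, not a real smart-home library.

class Alarm:
    def turn_off(self):
        print("alarm: off")

class Shower:
    def preheat(self, temp_c):
        print(f"shower: preheating to {temp_c} C")

class FloorSensor:
    """Fires a callback when the floor is stepped on (the implicit gesture)."""
    def __init__(self, on_step):
        self.on_step = on_step

    def step(self):
        # In real life this would come from a pressure sensor in the floor.
        self.on_step()

def wake_up(alarm, shower):
    # Stepping on the floor is the explicit act of silencing the alarm...
    alarm.turn_off()
    # ...and it implicitly kicks off the rest of the morning ritual.
    shower.preheat(temp_c=40)

if __name__ == "__main__":
    alarm, shower = Alarm(), Shower()
    sensor = FloorSensor(on_step=lambda: wake_up(alarm, shower))
    sensor.step()  # simulate getting out of bed
```

The point of the sketch is just that the person never "asks" for a warm shower; getting out of bed implies it, the same way chairs on tables imply a closed restaurant.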