Interactive Technology in Public – Mood Floor

Mood Floor is an interactive project created by Mazi Tradonski and Michael Abramovich. It was designed to pick up different kinds of body language and reflect them with lights projected on the floor. Here's the video showing how it works.

An aggressive or angry mood is expressed by fast, sharp movements of the lights, while a calm, relaxed mood is expressed by slow movement. The lights move around the user, creating a fun interaction on the street.

In my opinion, this is a very successful design because it doesn't take much thinking to understand what the project is about or how to interact with it. When people see lights moving around their feet on the floor, they immediately start to interact with them, because that's exactly how people react to real physical objects moving around their feet.

For me, the best design for interaction blends digital objects into our real physical life. Suppose a person walking down the street suddenly sees a few cockroaches running towards him; his two likely reactions are to either step on the cockroaches or run away from them. So when unpleasant things are projected on the floor, people react the same way, because it's instinct. The same goes for pleasant objects: if we see a nice grass field ahead, we want to step or walk on it. So when pleasant lights are projected near our feet, we play with them.

To sum up, this is a very simple design, but it follows human instinct, which is why users need very little introduction to enjoy the interaction when they encounter this project.



Physical Computing Final Project

Physical Computing Final Project Schedule

11/16  6-8 servo biped robot assembled with movement

11/23  6-8 servo biped robot – interactive with music

11/30   18 servo fully built and functional with hands/head movement

12/07   improving interactivity

12/14   Facial expression/aesthetics





Physical Computing Midterm Project

For the midterm project, our initial idea was to create a little game using proximity sensors. There is one sensor on each side, left and right, and each sensor detects the player's position. The program generates a random value to decide whether the player should go left or right at each step. To win the game, the player must choose the correct side every time to reach the end of the path.



(This is the design for the sensors)

Since the sensor only works well against a flat black surface, I programmed it to take multiple readings, discard the zero readings, and return the mean value to get an accurate distance. The second program I wrote detects whether the player has taken a step. It stores an initial reading of the player's position, then keeps comparing that initial reading with the player's current position; once the difference is greater than 25 cm, it returns true to indicate the player has taken a step.

After the first two programs were written, we decided to add guidance to tell the player which side is the right way. We set up a target with a pressure sensor behind it; when the player hits the target, it triggers the sensor and the program sends out a signal showing whether the player should go left or right.


(This is the target we made for the player to shoot at)

To put all the programs together into the game, I set each program to run within its own time interval: once a program runs out of time or receives the right signal, the next program starts running.

After we put the game together, we decided to add lighting, so Richard started writing the different light effects for different parts of the game: a left and right sweep as guidance to indicate which way the player should go, a countdown for shooting the target, a red light to indicate the wrong way, a green light to indicate the right way, and a rainbow effect when the player has won the game.

However, since the game and the lighting were written for different boards with different logic, we couldn't combine them. The game program runs in a for loop that checks both timing and a Boolean value; it stops when the Boolean changes or when the time runs out. The lighting program, however, runs in a for loop that only stops when all of its statements have finished. When the two were combined, the lighting loop blocked and interfered with the timing of the game, which kept the game from running smoothly.

To sum up the project: both programs run perfectly on their own, but we didn't have time to figure out how to combine them into the game.


What is Interaction?

Interaction, or interactivity, can be a complicated term that takes a long time to explain; however, a picture from Bret Victor's post "A Brief Rant on the Future of Interaction Design" explains it perfectly.

In the picture we see one of the simplest tools in the world, a hammer. Yet the design captures all the essentials of an interactive instrument: it has a handle that is easy for a human hand to grip, it has a hard, heavy head for driving the nail, and by feeling the resistance from the nail, we can tell how deep it sits in the wall.

So why is interaction so important to us? As Chris Crawford wrote in his book "The Art of Interactive Design", "The greatest movie in the world can lose our attention to the sound of munching popcorn; our involvement with a great book will surrender itself to a buzzing fly." Interactivity always involves us more and demands greater attention than passive observation. Movies and music are great examples of digital technology that generate excitement and emotion for us, but they will never be as involving as video games, since we have no interaction with them.

The next important question is what makes a good interaction. In my opinion, the best interaction requires no instruction: the design is based entirely on human intuition, which develops through our interaction with the world around us. One of the best examples I can think of is the hand gesture for scrolling pages on a touch screen. It requires no instruction at all, since it works exactly the way we would scroll real paper up and down.