Musical Sandbox

Members: Andrew Chong, Owen Hsiao, Vivian Liu

Our final project will be a music sandbox that allows users to engage with music haptically and visually. This is a scaled-down, more interactive version of the installation chamber we had in mind for our midterm project.

Hands can make music and visuals by playing in the sand.

The interaction would be to have people stick their hands in the box and play around with the sand. Thus, the box would become an instrument that transduces hand motion into visuals (Processing, LED lights) and melody changes.

Only one wall will be augmented with a Processing display. The other two will be laser-cut with negative-space silhouettes. LED lights will shine through these holes and change color. The point of the visuals is to invite people to engage with the box and to make the results of their interaction more visible.
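
To make the light behavior concrete, here is a minimal Arduino sketch of the kind of mapping we have in mind: one motion reading steering the color of an RGB LED behind a silhouette. The pin choices and the analog motion sensor on A0 are assumptions for the sketch, not our final wiring.

// Assumed wiring for this sketch: common-cathode RGB LED on PWM pins 9/10/11, motion sensor on A0
int redPin = 9;
int greenPin = 10;
int bluePin = 11;
int motionPin = A0;

void setup() {
  pinMode(redPin, OUTPUT);
  pinMode(greenPin, OUTPUT);
  pinMode(bluePin, OUTPUT);
}

void loop() {
  int motion = analogRead(motionPin);        // 0 - 1023
  int level = map(motion, 0, 1023, 0, 255);  // scale the reading to the PWM range
  analogWrite(redPin, level);                // more motion -> warmer light
  analogWrite(bluePin, 255 - level);         // less motion -> cooler light
  analogWrite(greenPin, 30);                 // a constant touch of green to soften the mix
  delay(20);
}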

This is the side view. The silhouettes represent negative space through which the LED-colored light would flow. Perhaps they won't be negative space but could instead be made of some semi-opaque material, inspired by Cesar's talk today.

To track the motion, we will be using the same motion sensors we mentioned in our previous presentation. They will be mounted on each of the three non-Processing walls.

Within the box, besides sand, there will also be pegs; these scatter the sand when it is thrown down, allowing for a larger range of motion.

This is a view of the box ignoring the Processing wall and the side walls.

The changes in music will be simple and will most likely be changes in speed. For example, if there is more motion at one wall, the melody will be played at a faster tempo (less time between each note).
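
The sound itself will eventually come from Pure Data, but a minimal Arduino sketch shows the kind of mapping we mean, assuming a generic analog motion reading on A0 and a piezo speaker on pin 8 (both placeholders):

int motionPin = A0;   // assumed analog motion reading
int speakerPin = 8;   // assumed piezo speaker pin

// a placeholder five-note phrase (C4 D4 E4 F4 G4, in Hz)
int notes[] = {262, 294, 330, 349, 392};

void setup() {
}

void loop() {
  int motion = analogRead(motionPin);       // 0 - 1023
  int gap = map(motion, 0, 1023, 400, 50);  // more motion -> less time between notes
  for (int i = 0; i < 5; i++) {
    tone(speakerPin, notes[i], 100);        // play each note for 100 ms
    delay(100 + gap);                       // the gap sets the tempo
  }
}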

Our game plan is to first iterate with a cardboard box and make sure that motion can alter the music. We'll be exploring Pure Data and linking things through Firmata. After getting that done (our MVP), we will work on the bells and whistles of visual output. Our thinking is that the final product will be a laser-cut box that we can set on the floor in the center of the exhibition.

[RR06] VR

I experienced two VR games: the balloon-popping one and the North Face exploration one.

What I enjoyed most, and what surprised me most about VR, was the illusion of tactility. During the balloon-popping game, I could actually feel tension coming from the bow, and it made me feel like I was really exercising a bit as I fended off the little helmet people. Three senses were engaged, and that was incredibly immersive; I completely forgot that I was in the basement of South Hall.

Another part I really enjoyed about VR was how it genuinely made me feel scared. After I tried out Deep Blue for a quick minute, I backed out, because I didn’t want to imagine what might be behind the dark depths, even though I knew that those were mere voxels.

There were certain bugs in the system that I found interesting. For example, you could find a spot in the North Face where you were hovering over the land instead of being planted right on it. It was an interesting, almost frightening sensation. Though pure VR enthusiasts would consider it a design flaw, I like that it gave us this affordance to experience something we would never be able to experience in real life.

One design flaw that came up was the difficulty of navigating the user interface of the balloon-popping game. We didn't know how to properly teleport to the right place to start the game. There was no natural interaction; we only figured it out through trial and error with the controller's input sequences. This is something that should definitely be remedied with more user testing.

Baby Steps

Description
My servo crawls with the help of a putty foot. The ball of its foot propels it forward when I rapidly rotate the pot by a small angle. The putty holds the "leg" together but does not constrict the range of motion. The servo can only be rotated by a small angle, because otherwise it will go backwards.
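
The code below just follows the pot, so the gait lives in my wrist: a quick, small-angle kick forward and a slow return. If the same gait were automated with the Servo library, a minimal sketch might look like this (pin 7 and the 80-100 degree range are assumptions):

#include <Servo.h>

Servo leg;  // the servo with the putty foot

void setup() {
  leg.attach(7);  // assumed servo pin
}

void loop() {
  leg.write(100);  // quick small-angle kick forward
  delay(80);
  for (int pos = 100; pos >= 80; pos--) {  // slow return so the foot doesn't drag backwards
    leg.write(pos);
    delay(30);
  }
}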

Materials
Drill, Putty, Metal Hex, Tape, Toothpick
Clothespin
Servo, Servo Container, Potentiometer
Breadboard, Arduino, Jumper Cables

Code


/*
* Servo with Potentiometer control
* Theory and Practice of Tangible User Interfaces
* October 11 2007
*/

int servoPin = 7; // Control pin for servo motor
int potPin = 0; // select the input pin for the potentiometer

int pulseWidth = 0; // Amount to pulse the servo
long lastPulse = 0; // the time in millisecs of the last pulse
int refreshTime = 20; // the time in millisecs needed in between pulses
int val; // variable used to store data from potentiometer

int minPulse = 500; // minimum pulse width

void setup() {
pinMode(servoPin, OUTPUT); // Set servo pin as an output pin
pulseWidth = minPulse; // Set the motor position to the minimum
Serial.begin(9600); // connect to the serial port
Serial.println("servo_serial_better ready");
}

void loop() {
val = analogRead(potPin); // read the value from the sensor, between 0 - 1024

if (val > 0 && val <= 999 ) {
pulseWidth = val*2 + minPulse; // convert angle to microseconds
Serial.print("moving servo to ");
Serial.println(pulseWidth,DEC);
}
updateServo(); // update servo position
}

// called every loop().
void updateServo() {
// pulse the servo again if the refresh time (20 ms) has passed:
if (millis() - lastPulse >= refreshTime) {
digitalWrite(servoPin, HIGH); // Turn the motor on
delayMicroseconds(pulseWidth); // Length of the pulse sets the motor position
digitalWrite(servoPin, LOW); // Turn the motor off
lastPulse = millis(); // save the time of the last pulse
}
}



[Lab06] This one’s a kicker.

Description

For this lab, I wanted to extend the range of motion from the DC motor to a pinwheel, and then try to move water with that. Thus, I made a dolphin that can splash when the potentiometer supplies a rotation with high enough energy. Otherwise, it just makes a fitting high-pitched squeal (due to the motor being restricted in its range of motion). I also created an accompanying sketch in Processing that draws waves.

Materials

  • Dental Floss Cap
  • Deconstructed Pinwheel
  • Dolphin with Magnetic Bottom
  • Arduino, Breadboard, Jumper Cables
  • Potentiometer
  • Resistors, Transistors, Diodes

Code

 

/*
* one pot fades one motor
* modified version of AnalogInput
* by DojoDave <http://www.0j0.org>
* http://www.arduino.cc/en/Tutorial/AnalogInput
* Modified again by dave
*/

int potPin = 0; // select the input pin for the potentiometer
int motorPin = 9; // select the pin for the Motor
int val = 0; // variable to store the value coming from the sensor
void setup() {
Serial.begin(9600);
}
void loop() {
val = analogRead(potPin); // read the value from the sensor, between 0 - 1024
Serial.println(val);
analogWrite(motorPin, val/4); // analogWrite can be between 0-255
}



Processing (building off of SineWave example code)

String buf="";
int cr = 13;
int lf = 10;
int serialVal = 0;
int shift = 0;
int xspacing = 16; // How far apart should each horizontal location be spaced
int w; // Width of entire wave

float theta = 0.0; // Start angle at 0
float amplitude = 75.0; // Height of wave
float period = 500.0; // How many pixels before the wave repeats
float dx; // Value for incrementing X, a function of period and xspacing
float[] yvalues; // Using an array to store height values for the wave
float alpha=0;
boolean forward;

void setup() {
size(1500, 800);
w = width;
dx = (TWO_PI / period) * xspacing;
yvalues = new float[w/xspacing];

}

void draw() {
background(0, 125,200);
calcWave();
for (int i = 0; i < 360; i+=10) {
rotate(i%36);
renderWave(alpha);

}
}

void calcWave() {
// Increment theta (try different values for 'angular velocity' here
theta += 0.03;

alpha = ((alpha +1) %255);

// For every x value, calculate a y value with sine function
float x = theta;
for (int i = 0; i < yvalues.length; i++) {
yvalues[i] = sin(x)*amplitude;
x+=dx;
}
}

void renderWave(float alpha) {
stroke(50, alpha ,200);
noFill();
// A simple way to draw the wave with an ellipse at each location
for (int x = 0; x < yvalues.length; x++) {
ellipse(x*xspacing, height/2+yvalues[x], 3, 3);
}
}

waves

Thoughtless Acts

One Pot Dish

img_0392

In this photo, out of laziness, I repurposed my rice cooker to also steam my vegetables and then repurposed the rice cooker container as a bowl. A simple design fix would be to attach another layer for vegetables to keep the flavors from mixing. I’m sure this design solution already exists somewhere in the product space.

Hanging on a Hanger

img_0431

The diameter of my rail could not fit my drying rack's hook, so I hooked it onto another hanger. A simple design fix here would be a hook that is not as rigid; that way it could flexibly wrap around rails. Otherwise, it could be designed so that the user could set the hook diameter themselves within a given range.

Amazon Desktop

img_0434

I have been making full use of my Amazon shipments by using the boxes as both carriers and tables. My desk is quite small, so when I make things, I use the bottom of a box. This provides an adequate second table so I can separate my book work from my maker work.

img_0435

However, when the project is done, I simply invert my table space and get back a box to carry my project in. For this example, I can’t think of a better solution, because I can’t imagine another feasibly designed portable desk that could shapeshift into a container. Sometimes simplicity works, despite the best design intentions?

[Lab05] Music Box ♪೭੧(❛▿❛✿)੭೨♪

Description

When I was little, I loved music boxes, because I thought they were a toy that offered some interaction. Twist the top, and then let the melody play. There was both a visual and an audio output. The ballerina would twist in pirouettes to the tune of the Nutcracker. In this lab, I decided to create my own using a Little Mermaid Theme.

The dolphin can bob back and forth a bit, but when pressed, the melody speeds up. If the melody is too distracting, someone can twist the potentiometer to turn down the sound. There is a mix of red and blue light in the bottom layer of the box that blinks to the melody.

Materials

  • Arduino, Breadboard, Jumper Wires
  • Potentiometer, Force Sensitive Resistor, LEDs
  • Daiso Dolphin (Token), Faux shards (Reflecting diffused light)
  • Sponge (Diffuser)
  • Silk Tape

Code

    int potPin = A0;
    int ledPin = 13;
    int fsrPin = A1;
    int ledPin2 = 12;
    int speakerOut = 7;
    byte names[] = {'c', 'd', 'e', 'f', 'g', 'a', 'b', 'C'};
    int tones[] = {1915, 1700, 1519, 1432, 1275, 1136, 1014, 956};
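    // melody[] stores the tune as character pairs: a duration digit followed by a note letter ('p' = pause)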
    byte melody[] = "2d1p2e1p2f1p5f5p2e1p2f1p2g1p5g5p1p2d1p2e1p2f1p5f8p2e1p2f1p2e1p2f1p2g1p2g1p5g5p2e1p2f1p5g1p2f1p2g1p5f2d1p2f1p2g1p2a1p2a1p4g8p";
    // count length: 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0
    // 10 20 30
    int count = 0;
    int count2 = 0;
    int count3 = 0;
    int MAX_COUNT =62;
    int statePin = LOW;

    void setup() {
    pinMode(ledPin, OUTPUT);
    pinMode(potPin, INPUT);
    pinMode(ledPin2, OUTPUT);
    pinMode(speakerOut, OUTPUT);
    Serial.begin(9600); // loop() prints the pot reading over serial
    }

    void loop() {
    digitalWrite(speakerOut, LOW);
    for (count = 0; count < MAX_COUNT; count++) {
    statePin = !statePin;
    digitalWrite(ledPin, statePin);
    for (count3 = 0; count3 <= (melody[count*2] - 48) * 30; count3++) {
    for (count2=0;count2<8;count2++) {
    if (names[count2] == melody[count*2 + 1]) {
    digitalWrite(speakerOut,HIGH);
    delayMicroseconds(tones[count2]);
    digitalWrite(ledPin,HIGH);
    digitalWrite(ledPin2, HIGH);
    //delayMicroseconds(analogRead(potPin));
    digitalWrite(speakerOut, LOW);
    delayMicroseconds(tones[count2]);
    digitalWrite(ledPin,LOW);
    digitalWrite(ledPin2, LOW);
    //delayMicroseconds(analogRead(potPin));
    }
    if (melody[count*2 + 1] == 'p') {
    // make a pause of a certain size
    digitalWrite(speakerOut, 0);
    Serial.println(analogRead(potPin));
    delayMicroseconds(analogRead(fsrPin));
    }
    }
    }
    }
    }

    [RR05] Roombas

    A few years back, my family adopted a second pet: a Roomba. This pet was going to take on the quotidian task of cleaning for us, whirring around the room to the commands of some invisible algorithm. With just a touch of a button, it made the familiar action of sweeping and vacuuming hands-free and completely unfamiliar.

    I experienced Roomba in its highly polished, end product form, but I wonder how the designers iterated to create that circular animated bot. I wonder what their prototypes could have looked like, what generators (bio-inspiration perhaps?) led to their design direction, and how they utilized the principles of experience design to inform their next decisions, especially considering the fact that the point of a Roomba is to not disturb you. Did they insert their half-finished Roombas into their own homes and role-play a normal Sunday in, observing how it interacted with them?

    With the Roomba, we generally presume that it cleans every spot or that it has traversed most of the room. However, often when it hits a corner and blindly continues to collide against things, we just shake our heads and concede that our technology isn't smart enough. Doesn't that just show our gullible acceptance of the claims of consumer products? For all I know, the Roomba could only clean 40% of my floor and qualify as a "cargo cult" prototype, but I would never know, because I'm not the designer, and I could never peek inside the black box.

    At any rate, the Roomba shifted my perspective, because for the first time, I saw a robot easing my household's workload. Also, though it is just a modular element of our home, it introduced me to the concept and possibilities of a "smart home".

    roomba

    [RR04] Taxonomy of Ambient Media + User

    I think the taxonomy of ambient media should also consider the user's point of view. While I do find the current archetypes and vocabulary very comprehensive, they all focus on the system. However, the system is defined for the user to interface with. Therefore, I think a dimension could be added to the taxonomy: what happens after the user has shifted their attention to their "ex-periphery"? How much attention does each notification demand? How does the system want the user to react? What is the goal of the system? What is the system's level of immersion?

    Attention and interaction–this concept builds on the paper’s notion of notification level but is distinct in that it focuses on how the user deals with the notifications. Can a user simply turn their head to ignore the information, like the user can for the Dangling String? Or do they have to be more involved and give the system input? This is a measure of how interactive an ambient media system is. Does it allow the user to sit back and be passive before and after notification, just taking in the soft stream of information, or does it want the user to be more active after the periphery comes into focus?

    Another aspect of ambient media to consider could be the level of immersion. When we assess these types of systems, we should also consider how easy it is for a person to extricate himself or herself from the system. In the case of all the examples mentioned in the taxonomy paper (Digital Family Portrait, Apple Dashboard, etc.) this would be easy: just look or tap away. After all, the focus of ambient media is to allow the user to naturally shift their attention. However, in previous reading examples such as the ambientROOM, the user is physically in the system. They have to physically leave the room to escape the patter of digital rain (their choice of representation) and the information it carries.

    [Lab04] Roses are red, I am blue because midterm season.

    Description

    In this lab, I fit the force sensitive resistor between two airpacks. Upon compression, a rose is generated in Processing. This rose is created from arcs against a gradient background.

    Materials

    • Arduino, LED, jumper cables
    • Processing
    • Force sensitive resistor
    • Air packs


    import processing.serial.*;

    String portname = "/dev/cu.usbmodem1421";
    Serial port;
    String buf="";
    int cr = 13;
    int lf = 10;
    int serialVal = 0;
    int shift = 0;

    void setup() {
    size(800,800,P3D);
    port = new Serial(this, portname, 9600);

    noStroke();
    colorMode(RGB, 400);
    for (int i = 200; i < 400; i++) {
    for (int j = 200; j<400; j++) {
    stroke(i,j,0);
    point(i,j);
    }
    }
    }
    //}
    //
    void draw() {
    noStroke();
    colorMode(RGB, 400);
    for (int i = 0; i < 800; i++) {
    for (int j = 0; j<800; j++) {
    stroke(i,j,0);
    point(i,j);
    }
    }
    stroke(255,0,255);
    noFill();
    //ellipse(150,150,serialVal, serialVal);
    genRose((int) (serialVal*0.7),0, 350, 350);
    genRose((int) (serialVal*0.5), (int) HALF_PI, 350, 350);;
    if (serialVal > 400) {
    genRose((int) (serialVal*0.5) ,0, 600, 600);
    genRose((int) (serialVal*0.3), (int) HALF_PI, 600, 600);;
    if (serialVal>500) {
    genRose((int) (serialVal*0.5) ,0, 200, 200);
    genRose((int) (serialVal*0.3), (int) HALF_PI, 200, 200);;

    genRose((int) (serialVal*0.3) ,0, 700, 700);
    genRose((int) (serialVal*0.1), (int) HALF_PI, 700, 700);;
    if (serialVal > 600) {
    genRose((int) (serialVal*0.3) ,0, 200, 600);
    genRose((int) (serialVal*0.1), (int) HALF_PI, 200, 600);;
    genRose((int) (serialVal*0.3) ,0, 600, 200);
    if (serialVal > 700) {
    genRose((int) (serialVal*0.1), (int) HALF_PI, 600, 200);;
    genRose((int) (serialVal*0.3) ,0, 500, 500);
    genRose((int) (serialVal*0.1), (int) HALF_PI, 500, 500);;
    }
    }
    }
    }

    }
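
    // genRose draws a rose as layered arcs of decreasing size in shades of red, offset in angle by 'shift'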

    void genRose(int serialVal, int shift, int x, int y) {
    fill(255,0,0);
    arc(x,y,serialVal*0.9, serialVal, 0+shift, HALF_PI+shift);
    arc(x,y,serialVal*1.1, serialVal*1.1, HALF_PI+shift, PI+shift);

    fill(200,0,0);
    arc(x,y,serialVal*0.9, serialVal, PI+shift, PI+HALF_PI+shift);
    arc(x,y,serialVal*1.1, serialVal*1.1, PI+HALF_PI+shift, PI+PI+shift);
    fill(175,0,0);
    arc(x,y,serialVal/2, (serialVal/2)*0.8, 0+shift,PI+shift);
    arc(x,y,serialVal/2, (serialVal/2) *0.8, (HALF_PI * 3)+shift,(PI * 2) +shift);
    fill(150,0,0);
    arc(x,y,(serialVal/4)*0.9, serialVal/4, 0+shift,HALF_PI+shift);
    arc(x,y,(serialVal/4) *0.9, serialVal/4, PI+shift,PI+HALF_PI+shift);
    fill(175,0,0);
    arc(350,350,serialVal/8, (serialVal/8) *0.9, PI+HALF_PI+shift,PI+PI+shift);
    arc(350,350,serialVal/8, (serialVal/8) *0.9, HALF_PI+shift,PI+shift);

    }

    void serialEvent(Serial p) {
    int c = port.read();
    if (c != lf && c != cr) {
    buf += char(c);
    } if (c==lf) {
    serialVal = int(buf);
    println("val="+serialVal);
    buf= "";
    }
    }

    Picture of the airpacks.
    img_0420

    Now I have created a garden of roses. Using a photocell would have made more sense because sunshine correlates to flowers. But also, if I attach this FSR to a water container like my water filter (will not be bringing this to class tomorrow), I can map it to the idea of watering plants.

    image1-1

    [RR03] Dance Dance Revolution

    As Fishkin introduced his taxonomy of TUIs, Dance Dance Revolution was a UI that came to mind. As input, players stomp their feet on pressure-sensitive mats to the rhythm of the surrounding music. As output, the computer calls out commentary and generates a color bar that visualizes your performance.

    I personally have always found the embodiment a bit lacking, because dance is reduced to patterns of four directional arrows. In some recent versions, they have also added more affordances, such as remote controls that can be used to incorporate hand movements into choreography. The metaphor of DDR is one of verb: the mat translates synchronized feet and hand motions into a graded dance.

    I support the taxonomy. When applied to something like DDR, it creates a neat conceptual bin for the UI to fall into. This conceptual bin is demarcated by concepts such as “metaphor” and “nearby”. It’s impressive that he casts this wide net over just about every human-computer interaction (Urp and greeting cards being under one umbrella) and still manages to define a vocabulary for them all. Holmquist’s description of tokens, containers, and tools is also helpful, because he further refines what it means to be a TUI artifact: is the input generic, or is it more contextual? (DDR, I’ve determined, is a token.) I also appreciated how he clarified the distinction between phicons and tokens and containers.