Final Project – Interactive light display – Sarah Wolf, Daniel Scott, Elena Brown, Matthew Schneider

17 Dec


 

Our team designed an interactive light screen intended as a piece of installation art that reacts to human hand and wrist movement detected by a Kinect. The detected motion would trigger Arduino code controlling a series of motors within the display; the motors drive shutters that allow the light behind the screen to be emitted for the audience to view.

The display itself was a 4’-tall by 5’-wide, five-sided wooden box with the back left open for the motor/Arduino/computer control system. The face of the display had 96 laser-cut 2.5”-diameter circles, behind which were shutters weighted down by washers and attached by a cable system to four servo motors mounted at the top of the display. By using only four motors, the aggregate motion was designed to control three “zones” on the display, so that when a person’s motion was detected, each zone would open and close in sequence based on the hand and wrist movement picked up by the Kinect.

We wrote code for both the Kinect (using Firefly) and the Arduino, but found that the Firefly code confused and overloaded our computer systems, so we fell back on the Arduino code alone. Unfortunately, without the ability to detect motion with the Kinect, we needed to simulate motion detection by using a potentiometer to drive the display. This code worked; however, several of the cable systems failed because the washers attached to the shutters weighed more than the motors and dowels in the cable system could handle, and the dowels broke under the load.

If we moved forward with this project, we would develop a more robust framework for the cable system and purchase stronger servo motors to drive the display. We would also like to smooth out the motion of the shutters as the motors drive them, creating a more seamless wave of motion across the display, and to fix the coding and wiring bugs so that we could use the Kinect with the system. Additionally, we would like the code to be reprogrammable to fit the needs of the location in which the piece is installed, so that the pattern of visible light could form an image.

This is the Arduino code we used during our presentation (using a potentiometer):

 

#include <Servo.h>

Servo myservo1;  // first servo
Servo myservo2;  // second servo
Servo myservo3;  // third servo
Servo myservo4;  // fourth servo

int potpin = 0;  // analog pin used to connect the potentiometer
int val;         // variable to read the value from the analog pin

void setup() {
  myservo1.attach(9);  // attaches servo 1 to pin 9
  myservo2.attach(6);  // attaches servo 2 to pin 6
  myservo3.attach(5);  // attaches servo 3 to pin 5
  myservo4.attach(3);  // attaches servo 4 to pin 3
}

void loop() {
  val = analogRead(potpin);         // read the potentiometer (0 to 1023)
  val = map(val, 0, 1023, 0, 179);  // scale to a servo angle (0 to 179)
  myservo1.write(180 - val);        // servo 1 is mirrored
  delay(15);                        // wait for the servo to get there

  val = analogRead(potpin);
  val = map(val, 0, 1023, 0, 179);
  myservo2.write(val);
  delay(15);

  val = analogRead(potpin);
  val = map(val, 0, 1023, 0, 179);
  myservo3.write(180 - val);        // servo 3 is mirrored
  delay(15);

  val = analogRead(potpin);
  val = map(val, 0, 1023, 0, 179);
  myservo4.write(val);
  delay(15);
}
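If we were to implement the smoothing described above, a minimal sketch (not code we ran) might ease each shutter toward its target angle instead of snapping. This version drives a single servo on pin 9; the easing factor of 0.05 is an assumption to tune:

#include <Servo.h>

Servo myservo1;        // same pin assignment as the sketch above
int potpin = 0;        // potentiometer on analog pin 0
float current = 90.0;  // current servo angle, starts centered

void setup() {
  myservo1.attach(9);
}

void loop() {
  int target = map(analogRead(potpin), 0, 1023, 0, 179);
  // move a fraction of the remaining distance each cycle,
  // so the shutter eases into position instead of jumping
  current += (target - current) * 0.05;
  myservo1.write((int)current);
  delay(15);
}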
 

Final Project – Interactive Blinds – Arthur Dunne and Alex Duden

16 Dec

Images: sine wave pattern; light sensor; Grasshopper screenshot; interactive blinds

We designed a system of interactive blinds that reacted to light sensors, to an OSC Touch app on the iPhone, and to a sine wave animation we created in Grasshopper. A projector was also included in the display to provide color; it was added purely for aesthetics. We started by building a frame out of two-by-fours, which held everything together. Next, we constructed the blinds out of thin balsa wood and attached wooden dowels to the ends of each blind so they could pivot through 180 degrees. We borrowed a pre-wired rail of servo motors from Justin, connected it to an Arduino, and used it to rotate the wooden blinds. All in all, we were able to build this on a low budget, because we already had the light sensors, Grasshopper, breadboard, computer, projector, Arduino, and iPhone. The only costs were the OSC Touch app for $5 and $15 worth of balsa wood.

After everything was wired to the Arduino, we uploaded the Firefly Firmata to the board. We used Firefly to link the Arduino to Grasshopper and created all of our variations through this visual programming software. It took a little while to troubleshoot the OSC Touch, because it had to be on the right channel and share the same IP address as the computer, but the result was great. The OSC Touch lets the user control the blinds’ position from an iPhone, which could be useful in any kind of building or house.

Next, we experimented with the light sensors and were able to get the blinds to open when light was shining on the sensor and close when it was dark. Our intention was to simulate the sun’s rays hitting the window of a house during the day, with the blinds responding by opening. This would allow a room to use the sun as a source of heating and lighting during the day and trap the heat in as the sun sets.
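Our implementation ran through Firefly and Grasshopper rather than hand-written Arduino code, but the light-sensor behavior could be sketched directly on the Arduino roughly like this (the pin numbers and light threshold are assumptions):

#include <Servo.h>

Servo blind;               // one blind servo (hypothetical pin)
const int sensorPin = A0;  // photoresistor voltage divider on A0
const int threshold = 500; // light/dark cutoff; tune for the room

void setup() {
  blind.attach(9);
}

void loop() {
  if (analogRead(sensorPin) > threshold) {
    blind.write(90);   // bright: open the blind
  } else {
    blind.write(0);    // dark: close the blind
  }
  delay(100);
}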

Finally, we created sine wave animations in Grasshopper and connected each of them to a separate servo motor. Each animation had a different speed and degree of rotation, so together they generated a gradual wave across the blinds. Like the projector, which cast different colors onto the blinds for each application, this was a purely aesthetic display.
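As a rough standalone sketch of the same idea (our version was built in Grasshopper, so the pins, amplitude, and phase offset here are all assumptions), phase-shifted sine waves driving each servo would look like this:

#include <Servo.h>

Servo blinds[6];                           // six blinds on the servo rail
const int pins[6] = {3, 5, 6, 9, 10, 11};  // hypothetical pin assignments

void setup() {
  for (int i = 0; i < 6; i++) blinds[i].attach(pins[i]);
}

void loop() {
  float t = millis() / 1000.0;             // seconds since start
  for (int i = 0; i < 6; i++) {
    // each blind lags its neighbor by a fixed phase offset,
    // producing a wave that travels across the row
    float angle = 90 + 60 * sin(t * 2.0 + i * 0.8);
    blinds[i].write((int)angle);
  }
  delay(15);
}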

Final Project Documentation – Due Thursday 12/19/2013

16 Dec

1. Write a 2-3 paragraph summary of your final project.
2. Post photos/video and code to the website using the following format: “Final Project – Title – Student Name(s)”

Final Project Proposal – Interactive LED Tent – Josiah Song & Alexa Zyllo

5 Dec

Proposal: For our final project, we will create an interactive installation in the form of a three-dimensional space. Specifically, the project is intended to be a cube-shaped, tent-like structure large enough for a person to sit inside. From the inside, the walls can be stretched outwards and pressed upon, which will trigger a response from 8 capacitive sensors (2 on each side) on the exterior of the cube based on the distance between hand and sensor. Based on the sensor readings, LED strips lining the top and bottom of the cube will change color and brightness. The color changes will be connected to MIDI software and will alter the sounds we install, creating an interactive space with a soothing ambience.

The code that runs the installation will consist of a loop that takes input readings from the capacitive sensors and continuously alters the color and brightness of the LEDs. Then, based on the current LED settings, different sounds associated with different color values will play.
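A minimal sketch of that loop might look like the following, shown for a single sensor and one pair of color channels; the pin assignments, the analog stand-in for the capacitive reading, and the note mapping are all assumptions:

const int sensorPin = A0;   // stand-in for one capacitive sensor reading
const int redPin = 9;       // PWM pins driving the LED strip's color mix
const int bluePin = 10;
int lastNote = -1;

void setup() {
  pinMode(redPin, OUTPUT);
  pinMode(bluePin, OUTPUT);
  Serial.begin(31250);      // standard MIDI baud rate
}

void loop() {
  int reading = analogRead(sensorPin);        // closer hand -> higher value (assumed)
  int level = map(reading, 0, 1023, 0, 255);
  analogWrite(redPin, level);                 // shift the color mix with proximity
  analogWrite(bluePin, 255 - level);

  int note = map(reading, 0, 1023, 48, 72);   // pick a note from a two-octave range
  if (note != lastNote) {
    if (lastNote >= 0) {
      Serial.write(0x80); Serial.write(lastNote); Serial.write(0);  // note off
    }
    Serial.write(0x90); Serial.write(note); Serial.write(100);      // note on
    lastNote = note;
  }
  delay(50);
}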

The frame of the cube will be made out of PVC and measure approximately 4 ft. in length on each side. The walls will be made from a translucent fabric, most likely spandex, so that the LEDs can play off of the walls as they change color and brightness. Given cost and time constraints, it is more realistic to create a small tent for one seated person than one that fits multiple people with standing room.


Materials: PVC pipes (50 ft.), PVC connectors (8), spandex (9 yards), LED strips (8), capacitive sensors (8), Arduino board, MIDI software, wires

Schedule:

Dec 3-5: Initial sketches/designs, begin coding, gather materials

Dec 6-9: Construction of project, continue code development

Dec 10-11: Complete construction and install sound features, test code, final troubleshooting

Dec 12: Present project

Final Project Proposal – RFID Tag Wine Rack – Casey Henrichs & Tyler Cook

3 Dec

Final Project Description 

For the final project we hope to use RFID technology to trigger a series of events. The idea is an interactive wine rack that could be sited in the Rapson courtyard for events. Each wine bottle in the rack would have an RFID tag attached. These tags will be programmed to hold information pertaining either to the individual wines or to possible food pairings. The food-pairings idea may be more practical, as the tags could then be recycled among similar (but not identical) wines. The information would then be passed to Processing and displayed on a separate screen, which allows modifications to text size and color.
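On the Arduino side, the tag-reading step could be sketched like this, assuming the Parallax reader’s serial protocol (2400 baud, a 0x0A start byte, ten ID characters, and a 0x0D stop byte); the pins are hypothetical, and the ID is simply forwarded over USB for Processing to match against its wine/pairing table:

#include <SoftwareSerial.h>

SoftwareSerial rfid(2, 3);   // reader output on pin 2; TX unused
const int enablePin = 4;     // reader /ENABLE, active low

void setup() {
  Serial.begin(9600);        // forward tag IDs to Processing over USB
  rfid.begin(2400);
  pinMode(enablePin, OUTPUT);
  digitalWrite(enablePin, LOW);   // keep the reader active
}

void loop() {
  if (rfid.available() && rfid.read() == 0x0A) {  // start of a tag read
    char id[11];
    for (int i = 0; i < 10; i++) {
      while (!rfid.available()) {}                // wait for each ID byte
      id[i] = rfid.read();
    }
    id[10] = '\0';
    Serial.println(id);      // Processing looks up the wine/pairing text
  }
}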

In addition to the wine information, we hope to use the RFID tags to trigger a reaction from LEDs placed within the rack itself. We hope to have a loop playing on the lights and then, when a bottle is removed, have the lights in that tile go dark, or vice versa. Ideally the loop would play as a slow fade, with the LEDs at full intensity when an RFID tag is read.
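The fade-and-interrupt behavior for one tile might be sketched like this; tagJustRead() is a hypothetical stand-in for the serial check above, and the pin and timing values are assumptions:

const int ledPin = 9;          // PWM pin driving one tile's LED strip

bool tagJustRead() {
  // hypothetical stand-in: the real sketch would check the RFID
  // serial stream shown above and return true on a fresh read
  return false;
}

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  if (tagJustRead()) {
    analogWrite(ledPin, 255);   // full intensity while a tag is present
    delay(3000);                // hold before returning to the fade
  } else {
    // triangle-wave fade: ramp brightness up, then back down, over 4 seconds
    float phase = (millis() % 4000) / 4000.0;
    int level = phase < 0.5 ? phase * 2 * 255 : (1.0 - phase) * 2 * 255;
    analogWrite(ledPin, level);
    delay(20);
  }
}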

The display screen and the RFID tag reader will need to be encased/framed to make the product presentable for the final review. Wood encasements look to be the most economical option.

Project Renderings/Sketches/Diagrams

Tyler-Cook_Diagram-1

Bill of Materials

RFID reader (Parallax): 1: purchased

RFID tags: 6: purchased

LED strips: 12+: borrow from Justin

Arduino Board: 2: purchased

Display screen: 1: borrow from Justin

Wood to frame 15″ screen: scrap from workshop

Wood to create RFID reader mount: scrap from workshop

Project Schedule

Dec 3: Complete code writing for the wine info portion of the RFID tags. Discuss ways to attach RFID tags to bottles. Start work in Processing.

Dec 5: Determine a way to use an RFID tag to command the LEDs. Complete the Processing parameters for text appearance.

Dec 10: Begin mounting LEDs in wine rack tiles. Decide on a final design for wooden encasements for both the display screen and RFID reader.

Dec 12: Complete project siting. Complete wood encasements for components.

Final Project Proposals – Interactive Blinds – Arthur Dunne and Alex Duden

2 Dec

 

1.  

Our final project proposal is to create a system of interactive blinds that react to light, a potentiometer, and a smartphone controller app. We plan on constructing the blinds out of wood and building a frame out of two-by-fours to support them. We were lucky enough to borrow a pre-wired rail of servo motors from Justin, which we will use to rotate the wooden blinds. In addition, we already have the light sensors, switch, potentiometer, breadboard, Arduino, and smartphone, so this project will be cost efficient.

Our process will include designing the wooden panels, constructing the frame with an electrical box on the side, assembling the components, writing the code, wiring everything together, and programming the smartphone with a wireless controller application.

The purpose of this design is to make a functional interactive system that a building can use as an energy-saving and light-filtering product.

 

2.

3. Materials

1 – 6’ 2×4

6 – ⅛” or ¼” panels for the blinds

6 – servo motors

2 – light sensors

1 – push button

1 – potentiometer

1 – breadboard

1 – Arduino

1 – smartphone

x – connecting wires

 

4. Dec 3 – 5: Arthur and Alex

-Fabricate the frame and the 6 blinds. – 2 hours

-Set up the servo motors, sensors, breadboard, and Arduino – 2 hours

 

Dec 5 – 10: Arthur and Alex

-Properly wire all devices – 3 hours

-Code the design – 3 hours

-Test blinds – 1 hour

 

Final Project Proposal – Kinetic Architecture – Garrett Jensen & Olivia Coughlin

2 Dec

Description:

Our project will use kinetic architecture, applying the idea of a roof that catches rainwater for other uses, such as toilet water or backup water storage. The project will be a simple structure, an 8 in x 8 in box with a roof slope peaking at 3 inches. We will place a moisture sensor (a moisture-sensitive resistor) on one panel of the roof to represent the structure being hit by rain. When the sensor is activated, a servo motor connected by four pulleys, one at each corner, will pull metal rods in each corner upward, sloping the roof downward so the water collects in a container used to supply toilets or various other water-related appliances in the household.
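A minimal sketch of the trigger logic might look like this, assuming the moisture sensor reads as an analog value; the pins, threshold, and servo angles are all assumptions:

#include <Servo.h>

Servo pulleyServo;              // one servo drives all four pulley lines
const int moisturePin = A0;     // moisture sensor as an analog input
const int wetThreshold = 400;   // tune against the actual sensor

void setup() {
  pulleyServo.attach(9);
  pulleyServo.write(0);         // roof flat to start
}

void loop() {
  if (analogRead(moisturePin) > wetThreshold) {
    pulleyServo.write(120);     // "rain": raise the corner rods, slope the roof
  } else {
    pulleyServo.write(0);       // dry: lower the roof back flat
  }
  delay(500);
}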


 

Materials:

Laptop, Arduino, servo motor, moisture sensor, breadboard, wires, Arduino/Processing software

3 MDF facades, 4 metal roof panels, 1 water catching contraption, 4 retractable metal rods, string for pulleys

Schedule:

Dec 3 – 4

Finalize details and materials needed

Buy materials and begin constructing structure

Dec 5-10

Writing code, testing, tweaking, troubleshooting

Complete final structure