Archive | December, 2013

Final Project – Garrett Jensen & Olivia Coughlin

22 Dec

 

 

 


After much deliberation and several design phases, Garrett and I pulled together something that was both challenging and enjoyable to build. Beginning with a range of sustainable designs for residential homes, we finally landed on a design for opening and closing window treatments. The drapes went through multiple iterations, from a precedent made of string, to elastic fabric, to rubber bands, to cloth bands, and finally back to elastic fabric.

One large piece of spandex covered three floor-to-ceiling windows, with four horizontal slits across each window pane. Two servo motors hidden within the window mullions allow for discreet, hands-free drape movement. We wrote code for the dual servo motors to sweep 180 degrees simultaneously in a loop. Thread runs from each slit through small eyelets along the ceiling and back down to the servo motors, so the slits in the fabric open and close depending on the position of the servos.

The servo motors run two loops per day, rotating 180 degrees and then back 180 degrees. This motion is synchronized with the time of day so that the drapes are fully open when the sun is at high noon, admitting less direct sunlight and more ambient light.

Another idea we considered was making drape movement depend on the twist of a potentiometer, so that the openness of the drapes would be user-controlled and interactive. This would require only minor reconstruction and a small code change.

We designed this mechanical circuit for the sustainability it can offer. By controlling the daylighting in a home, it could be an energy-efficient product that takes advantage of natural light at the right times. Though the final product could be tweaked to be more functional and aesthetic, it is a good starting point for what we were envisioning.

Code:

#include <Servo.h>

Servo myservo;  // create servo object to control the first servo
Servo myservo2; // second servo object
int potpin = 0; // analog pin used to connect the potentiometer
int val;        // variable to read the value from the analog pin
int pos = 0;    // current servo position for the sweep loop
int buttonPin = 2;
int buttonState = 0;

void setup() {
  pinMode(buttonPin, INPUT);
  myservo.attach(9);  // attaches the servo on pin 9 to the servo object
  myservo2.attach(10);
}

void loop() {
  buttonState = digitalRead(buttonPin);
  if (buttonState == 1) {
    val = analogRead(potpin);        // reads the potentiometer (value between 0 and 1023)
    val = map(val, 0, 1023, 0, 179); // scale it to use it with the servo (value between 0 and 179)
    myservo.write(val);              // sets the servo position according to the scaled value
    myservo2.write(180 - val);       // mirror the second servo
    delay(15);
  }
  else {
    for (pos = 0; pos < 180; pos += 1) { // goes from 0 degrees to 180 degrees in steps of 1 degree
      myservo.write(pos);  // tell servo to go to position in variable 'pos'
      myservo2.write(pos);
      delay(20);           // waits 20ms for the servos to reach the position
    }
    for (pos = 180; pos >= 1; pos -= 1) { // goes from 180 degrees back to 0 degrees
      myservo.write(pos);
      myservo2.write(pos);
      delay(20);
    }
  }
}


Final Project – Interactive Box Array – Jake Brzezicki

21 Dec


I designed an interactive space: a system of boxes in a vertical array. I originally wanted to design it so that when a hand or other object came into close proximity to the boxes, they would move out of the way via servo motors and tied string. This would have been done by using the Leap Motion controller to track the hand and, when it got close, activate the motors and move the boxes. Unfortunately, I was running low on time, so I was only able to make the boxes move from the motors, which I hooked up to a potentiometer.

The main process focused on building a base structure for the boxes to hang from and from which the motors would pull. The wood frame is constructed from 1.5 x 1.5 inch wood rods, with loop screws that the string was fed through. I built it at a 4 ft scale, though I should have gone smaller; if it were fully successful, I could imagine it as a large interactive installation somewhere that people would walk through and have some fun with. For the movement I used a basic servo motor code, with the Arduino and breadboard connected to the motors and the potentiometer, plus a counterweight to keep the string taut. The boxes were able to move, not as much as I wanted, but it was a start. It was suggested that I use 2 potentiometers to lessen the load on one, so I tried to program that, but it didn't work. Fortunately, the single potentiometer was able to handle the 5 motors. For future iterations I would try to design the array better, work on the code, and try to incorporate the Leap Motion aspect.

Code:

#include <Servo.h>

Servo microservo;

Servo microservo2;
Servo microservo3;
Servo microservo4;
Servo microservo5;
Servo microservo6;
int pot = 0;  // analog pin for the potentiometer
int pott = 0; // second potentiometer pin (currently the same pin, so both servo groups follow one pot)
int val;
int vall;

void setup() {
  microservo.attach(1);
  microservo2.attach(2);
  microservo3.attach(3);
  microservo4.attach(5);
  microservo5.attach(6);
  microservo6.attach(7);
}

void loop() {
  val = analogRead(pot);  // read the potentiometer (0-1023)
  vall = analogRead(pott);
  val = map(val, 0, 1023, 0, 179);  // scale to the servo range
  vall = map(vall, 0, 1023, 0, 179);
  microservo.write(val);
  microservo2.write(val);
  microservo3.write(val);
  microservo4.write(vall);
  microservo5.write(vall);
  microservo6.write(vall);
  delay(5);
}

Final Project – Interactive LED Tent – Alexa Zyllo, Josiah Song

21 Dec


Our project was to build an interactive LED tent that explored how we can change the experience of a space using the things we learned over the course of the class. The tent is made up of two materials: 1/2″ PVC piping and a Spandex/Rayon fabric blend. Each pipe is 53″ long and was cut with a saw and sanded to remove any sharp edges left behind. The pipes were then joined with 3-way pipe connectors to form a cube (53″ x 53″ x 53″). After the cube was formed, the cloth was cut into a rectangle (60″ x 212″) and sewn into a closed loop. A square piece of cloth (53″ x 53″) was then sewn onto the loop to form an almost completely closed box. To ensure the cloth would stretch correctly, little pockets were sewn into the four bottom sides (this is where the tubing went through). When the cube was completely constructed, a zipper was added to one side to allow people to enter and exit the cube.

After the construction of the tent, we strung up a string of 45 LEDs that went around the top, down one corner, and continued all the way around the bottom and connected to a power box. This in turn was connected to an Arduino and DMX shield, as well as four flex sensors that were hooked up to the tent (one on each side of the tent, attached at the top and center). The way that the tent worked was based on mapped analog value readings from the flex sensors. They were attached in such a way that when one pushed on the sides of the tent from the inside, the bottom tip of the sensor would bend outward up to about 45° and readings would depend on the degree of bending. The change in readings would correlate to changes in the color of the LEDs.

Each LED’s color is determined by a combination of a red, green, and blue value (RGB). We randomly assigned each sensor to alternating R, G, or B values along the string of LEDs. 45 LEDs meant 135 RGB values, and with up to 3 different sensors affecting one LED, the change in colors was quite drastic throughout the entire tent. The DMX shield was used to control the LEDs from within the Arduino code. Originally we were coding for the brightness of the LEDs to change as well, but due to time constraints we did not have a chance to try alternate coding.

With more time and money, next steps would include recreating the tent at a larger size with more sensors and LEDs. The coding would be refined to allow for smoother changes in the LEDs, and we would determine better ways to organize the RGB value changes so that human interaction with the tent is more intriguing. We also had an early goal of incorporating ambient music tones to create a more ethereal effect inside the tent, which we would definitely explore given more time.

Here is the initial code that we first came up with. Each flex sensor has to be on its own channel and control an LED through its readings. In the final code, this was simply extrapolated to include multiple RGB values and incorporated the DMX shield.

int flex0 = A0;
int flex1 = A1;
int flex2 = A2;
int flex3 = A3;
int led1 = 9;
int led2 = 6;
int led3 = 5;
int led4 = 3;

void setup(){
  Serial.begin(9600);
  pinMode(led1, OUTPUT);
  pinMode(led2, OUTPUT);
  pinMode(led3, OUTPUT);
  pinMode(led4, OUTPUT);
}

void loop(){
  int flexSensorReading0 = analogRead(flex0);
  //Serial.println(flexSensorReading0);
  int flexvalue0 = map(flexSensorReading0, 149, 312, 0, 255);
  //Serial.println(flexvalue0);
  delay(250);
 
  int flexSensorReading1 = analogRead(flex1);
  //Serial.println(flexSensorReading1);
  int flexvalue1 = map(flexSensorReading1, 149, 312, 0, 255);
  //Serial.println(flexvalue1);
  delay(250);
 
  int flexSensorReading2 = analogRead(flex2);
  //Serial.println(flexSensorReading2);
  int flexvalue2 = map(flexSensorReading2, 149, 312, 0, 255);
  //Serial.println(flexvalue2);
  delay(250);
 
  int flexSensorReading3 = analogRead(flex3);
  //Serial.println(flexSensorReading3);
  int flexvalue3 = map(flexSensorReading3, 149, 312, 0, 255);
  //Serial.println(flexvalue3);
  delay(250);
 
  analogWrite(led1, flexvalue0);
  analogWrite(led2, flexvalue1);
  analogWrite(led3, flexvalue2);
  analogWrite(led4, flexvalue3);
 
  delay(30);
}


Lost Arduino Kit?

20 Dec


Did anyone forget their Arduino kit in the BDA studio? Message me if this is yours!

Final Project – Soundbox of Light – Charlene Kulesa

19 Dec


My project is an interactive installation that tracks light intensity and correlates that data to a specific musical note. Movement is a secondary, indirect component that affects the sound by changing the light intensity read by the photoresistor sensors. The base of the “soundbox of light” is a repurposed turntable: the record is raised above the turntable arm and has 4 cutout sections that allow ambient light to pass through to the photoresistor sensor attached to the arm. The code is written so that if the light intensity is greater than 500 and the push button for the turntable is on, it plays note 20 (see the MIDI note chart for the specific musical note). The cutout sections are all equidistant, so when the turntable is turned on the notes play as a rhythmic background beat.

The laser harp portion was added to the end of the turntable; the frame that houses the 3 lasers has a hinge that allows it either to lie flat on top of the turntable or to stand upright when being played. The initial intent was to have a tilt switch turn the lasers on and off, but due to inconsistent readings from the tilt switch I had to fall back on a push button. Photoresistor sensors were placed at the base of the frame, aligned with each laser at the top of the frame.

The laser harp code uses both the direct light from the laser and the ambient light of the room to determine the musical note that is played. A light intensity of less than 900 means there is a break between a laser and its corresponding photoresistor sensor; this is the baseline that determines whether a note should be played. The actual note and pitch is based on the light intensity, which increases and decreases as the user’s hand moves farther from or closer to the sensor. The pitch of the note depends on the ambient light of the room, so the sound can vary from room to room even though the code is exactly the same. To hard-code the exact note played at a given hand height over the laser, one could add a range detector and map a specific note to the hand location. For this project I preferred the element of ambient light as an uncontrolled variable that shapes the sound, but I also see the benefit of having more control over the note being played regardless of the ambient light.

Here is the code:

#include <SoftwareSerial.h>

// Constant variables:
int LEDpin = 13; // set LED to pin 13
int laserAPin = 10; // set laserA to pin 10
int laserBPin = 11; // set laserB to pin 11
int laserCPin = 12; // set laserC to pin 12
int buttonTt = 5; // set turntable button to pin 5
int buttonL = 4; // set laser button to pin 4

// Variables for turntable:
int valTt = 0; // set turntable val to 0
int old_valTt= 0; // set turntable old val to 0
int stateTt =0; // set turntable state to 0

// Variables for lasers:
int valL = 0; // set laser val to 0
int old_valL= 0; // set laser old val to 0
int stateL =0; // set laser state to 0

// Variables for notes:
byte noteA = 0; // The MIDI note value to be played for laserA
byte noteB = 0; // The MIDI note value to be played for laserB
byte noteC = 0; // The MIDI note value to be played for laserC
byte noteTt = 0; // The MIDI note value to be played for Turntable

// Variables for analog values:
int AnalogAValue = 0; // value from analog laserA photosensor
int AnalogBValue = 0; // value from analog laserB photosensor
int AnalogCValue = 0; // value from analog laserC photsensor
int AnalogTtValue = 0; // value from analog turntable sensor

// Software serial:
SoftwareSerial midiSerial(2, 3); // digital pins that we’ll use for soft serial RX & TX

void setup()
{
// Set the states of the I/O pins:
pinMode(LEDpin, OUTPUT);
pinMode (laserAPin, OUTPUT);
pinMode (laserBPin, OUTPUT);
pinMode (laserCPin, OUTPUT);
pinMode (buttonTt, INPUT);
pinMode (buttonL, INPUT);

// Set MIDI baud rate:
Serial.begin(9600);
midiSerial.begin(31250);

}

void loop()
{
// Set logic for turning on and off turntable via button:
valTt = digitalRead(buttonTt); // reads input on turntable button
if ((valTt == HIGH) && (old_valTt == LOW)) // check if val has transitioned
{stateTt = 1 - stateTt; // change state from on to off or vice-versa
delay (10); } // set delay

old_valTt = valTt; // val is now old val, store it

if (stateTt == 1) // check to see if state of turntable button is 1 (on)
{digitalWrite (LEDpin, HIGH); } // if true turn on LED
else { digitalWrite (LEDpin, LOW); } // if false turn off LED

// Set logic for turning on and off lasers via button:
valL = digitalRead(buttonL); // reads input on laser button
if ((valL == HIGH) && (old_valL == LOW)) // check if val has transitioned
{stateL = 1 - stateL; // change state from on to off or vice-versa
delay (10);} // set delay

old_valL = valL; // val is now old val, store it

if (stateL == 1) // check to see if state of laser button is 1 (on)
{
digitalWrite (laserAPin, HIGH); // if true turn on lasers A, B and C
digitalWrite (laserBPin, HIGH);
digitalWrite (laserCPin, HIGH);
}
else // if false turn off lasers A, B and C
{
digitalWrite (laserAPin, LOW);
digitalWrite (laserBPin, LOW);
digitalWrite (laserCPin, LOW);
}

// Set up analog read values:
AnalogAValue = analogRead(1);
AnalogBValue = analogRead(2);
AnalogCValue = analogRead(3);
AnalogTtValue = analogRead(5);

// Define note values:
noteA = (AnalogAValue/21); // A top range is approx 790; this equation equals note 36 (C)
noteB = ((AnalogBValue/15)-1); // B top range is approx 810; this equation equals note 51 ( D#)
noteC = (AnalogCValue/12); // C top range is approx 825; this equation equals note 67 (G)
noteTt = (AnalogTtValue/23);

// Logic that determines when and what note is played:
if ((AnalogAValue < 900) && (stateL == HIGH)) // if laserA sensor value is less than 900
{noteOn(0x90, noteA, 0x35);} // then play noteA at medium velocity

if ((AnalogBValue < 900) && (stateL == HIGH)) // if laserB sensor value is less than 900
{noteOn(0x90, noteB, 0x35);} // then play noteB at medium velocity

if ((AnalogCValue < 900) && (stateL == HIGH)) // if laserC sensor value is less than 900
{noteOn(0x90, noteC, 0x35);} // then play noteC at medium velocity

if ((AnalogTtValue > 500) && (stateTt == HIGH)) // if turntable sensor is above 500 and turntable is on
{noteOn(0x90, 0x20, 0x45);} // play the fixed turntable note

delay (250); // set delay value
}

// plays a MIDI note. Doesn't check to see that
// cmd is greater than 127, or that data values are less than 127:
void noteOn(byte cmd, byte data1, byte data2)
{
midiSerial.write(cmd);
midiSerial.write(data1);
midiSerial.write(data2);

//prints the values in the serial monitor
Serial.print("A:");
Serial.print(AnalogAValue);
Serial.print(", B:");
Serial.print(AnalogBValue);
Serial.print(", C:");
Serial.print(AnalogCValue);
Serial.print(", Tt:");
Serial.print(AnalogTtValue);
Serial.print(", cmd: ");
Serial.print(cmd);
Serial.print(", data1: ");
Serial.print(data1);
Serial.print(", data2: ");
Serial.println(data2);
}


Final Project – Wine Wall – Tyler Cook & Casey Henrichs

19 Dec


Our project used RFID technology to interact with a wine rack. There were two levels of interaction: the LEDs and the displayed text. Ideally, we planned to tag each bottle of wine with an RFID tag and then have those tags display a text when scanned pertaining to that type of wine. The information was a broad description of a Riesling or Shiraz wine, covering the grapes, history, and food pairings. This allows the same tag to be reapplied to another bottle without it being the exact brand of wine.

The ceramic tiles which held the wine bottles were then fitted with LEDs. The idea developed to a point where scanning a tag would interrupt the LED loop and turn certain tiles' LEDs off. The idea behind this was that empty tiles would stay dark until a new bottle replaced the old one.

For the actual project, budget constraints led to a few alterations in the design. Each bottle of wine contained an RFID tag, but due to the financial restraints of the project we were only able to display the piece with 5 wine bottles. These 5 tiles had their LEDs wired separately to transistors on the Arduino board to stop the current when that particular tag was scanned. Unfortunately, not all of the transistors were compatible, and only 2 were able to be displayed with the correct changes in lighting.

Another issue we faced was getting the desired text to be read through Processing. Processing would not display character strings received from Arduino. Our way around this was to type the phrases directly into Processing or into Firefly. This was not ideal, as it did not sync to the scanning of the tags; the information instead appeared with a click of the mouse. The other problem with Firefly is that the tool used to display text only allows three inputs. This meant that the final project had two wine bottle tiles whose LEDs turned off when their RFID tags were scanned and three that projected a text when scanned.

Here is the code we used:

// RFID reader for Arduino
// Wiring version by BARRAGAN
// Modified for Arduino by djmatic

int val = 0;
char code[10];
int bytesread = 0;

void setup() {

Serial.begin(2400); // RFID reader SOUT pin connected to Serial RX pin at 2400bps
pinMode(2,OUTPUT); // Set digital pin 2 as OUTPUT to connect it to the RFID /ENABLE pin
//pinMode(5,OUTPUT); //Send to second arduino
digitalWrite(2, LOW); // Activate the RFID reader
digitalWrite(5,HIGH); //LEDs on in tiles
digitalWrite(3,HIGH);
digitalWrite(6,HIGH);
digitalWrite(7,HIGH);
digitalWrite(9,HIGH);
digitalWrite(11,LOW);
digitalWrite(12,LOW);
digitalWrite(13,LOW);
}

void loop() {

if(Serial.available() > 0) { // if data available from reader
if((val = Serial.read()) == 10) { // check for header
bytesread = 0;
while(bytesread < 10) { // read 10 digit code
if(Serial.available() > 0) {
val = Serial.read();
val = Serial.read();
if((val == 10)||(val == 13)) { // if header or stop bytes before the 10 digit reading
break; // stop reading

}
code[bytesread] = val; // add the digit
bytesread++; // ready to read next digit
}
}
if(bytesread == 10) { // if 10 digit read is complete
//Serial.print("TAG code is: "); // possibly a good TAG
Serial.println(val); // print the TAG code
}
if (val==65){
Serial.println("Shiraz originated in the Rhône region of France.");
Serial.println("Wines made from Syrah are often powerfully flavoured and full-bodied. While there is a wide range of aromas, two that are often noticed in Shiraz wines are that of blackberry and pepper.");
Serial.println("Shiraz is a heavy red with big-bodied flavor and therefore pairs well with meats such as beef and lamb.");
digitalWrite(5,LOW); // LEDs off when scanned
digitalWrite(11,HIGH);
//delay(500);
//digitalWrite(5,LOW);
}
if(val==53){
Serial.println("Malbec originated in the Bordeaux region of France and has spread to South America.");
Serial.println("Wines made from Malbec grapes have hints of ripe fruits and earthy greens.");
Serial.println("Malbec is a medium bodied red wine pairing well with red meats and dishes of Mexican, Cajun, or Italian cuisine.");
digitalWrite(6,LOW);
digitalWrite(12,HIGH);

}
if(val==48){
Serial.println("Rosé wine originated in the Provence region of France which was settled by ancient Greeks around 600 BC.");
Serial.println("Wines of this sort rely on the skins of the grapes to color the wine but remove them before they add a harshness to the flavor.");
Serial.println("Since rosés are made with a variety of grapes the flavor ranges from light and fruity to a darker heartier wine.");
Serial.println("Due to its wide variety of blends a rosé pairs well with an endless possibility of foods.");
digitalWrite(3,LOW);
}
if(val==50){
Serial.println("Sauvignon blanc is a green grape that originated in the Bordeaux region of France.");
Serial.println("The grapes were named after the French word for wild, sauvage, and produce wines with flavors ranging from zesty lime to a flowery peach.");
Serial.println("This herbaceous wine pairs well with simple greens and herbs.");
digitalWrite(7,LOW);
digitalWrite(13,HIGH);
}
if(val==52){
Serial.println("Riesling originated in the Rhine region of Germany.");
Serial.println("Riesling wines tend to exhibit notes of apple and tree fruit.");
Serial.println("Noble rot is a term used when Riesling grapes were harvested after a harsh freeze creating a much sweeter wine.");
Serial.println("Sweet Riesling wines pair well with dessert as they are semi-carbonated like dessert champagnes.");
digitalWrite(9,LOW);
}

bytesread = 0;
digitalWrite(2, HIGH); // deactivate the RFID reader for a moment so it will not flood
delay(1500); // wait for a bit
digitalWrite(2, LOW); // Activate the RFID reader
}
}
}

// extra stuff
// digitalWrite(2, HIGH); // deactivate RFID reader


Final Project – Shadowbox (aka Mystery Box) – Kevin Desautels, Dennis MacAvaney, Kierra Thomas

19 Dec


Shadowbox is an installation that manipulates shadows. It works best in a dark room with a single light source. The goal of this project was to produce a fun, interactive piece that is both reactive and somewhat unpredictable (to add interest).
The box contains an Arduino, a webcam, 2 potentiometers, and has a frosted plastic screen on which to cast shadows. The frosted plastic aids the Processing script in blob detection by filtering out color. One potentiometer controls the color of the projection, and one controls the shakiness of the edges drawn. The Arduino relays the raw readings to a laptop, which runs a Processing script. The script captures video frames from the webcam, attempts to draw the edges of the shadows it views from the inside of the plastic screen, and then draws lines from the center of each blob to its outer edge (to act as a fill).
Because of the roughness of the blobs detected by the processing code, the projected image is abstract and interacts somewhat unpredictably. Shadowbox is a good proof of the concept, but could use streamlining and more accurate interpretations of the shadows. This would help to produce a more consistently reactive and abstract result. More controls could be added to give the user more options.

ARDUINO CODE:

int colorPot=0;//determines color in Processing (float Color)
int weirdPot=0;//determines shaky-ness of edges (float Weird)

void setup(){
Serial.begin(9600);
}

void loop(){
colorPot=analogRead(A0);
weirdPot=(analogRead(A1)+9000);//adds 9000 to variable to differentiate from colorPot
colorPot=map(colorPot,510,1023,0,510);
Serial.println (colorPot);
delay(50);//for stability
Serial.println (weirdPot);
delay(50);
}

PROCESSING CODE:

import processing.video.*;
import blobDetection.*;
import processing.serial.*;

Capture cam;
BlobDetection theBlobDetection;
PImage img;
boolean newFrame=false;
Serial myPort;
float Temp=0;
float Color=0;
float Weird=0;

// ================================================== setup()
void setup()
{
println(Serial.list());//Select serial port
myPort = new Serial(this, Serial.list()[3], 9600);
size(640, 480);
String[] cameras = Capture.list(); //Select camera
if (cameras.length == 0) {
println("There are no cameras available for capture.");
exit();
} else {
println("Available cameras:");
for (int i = 0; i < cameras.length; i++) {
println(cameras[i]);
}
cam = new Capture(this, cameras[3]); //iSight
//cam = new Capture(this, cameras[15]); //webcam
cam.start();
}
// BlobDetection
// img which will be sent to detection (a smaller copy of the cam frame);
img = new PImage(80,60);
theBlobDetection = new BlobDetection(img.width, img.height);
theBlobDetection.setPosDiscrimination(false);//"true" makes blobs favor bright areas, mostly works
theBlobDetection.setThreshold(0.8f); //detection threshold
}

// ================================================== captureEvent()
void captureEvent(Capture cam)
{
cam.read();
newFrame = true;
}

// ================================================== draw()
void draw()
{
if (newFrame)
{
newFrame=false;
background(0, 255); //background color
img.copy(cam, 0, 0, cam.width, cam.height,
0, 0, img.width, img.height);
fastblur(img, 2);
theBlobDetection.computeBlobs(img.pixels);
drawBlobsAndEdges(false,true);
}
}

// ================================================== drawBlobsAndEdges()
void drawBlobsAndEdges(boolean drawBlobs, boolean drawEdges)
{
noFill();
Blob b;
EdgeVertex eA,eB,vA,vB;

Temp=0;
String inString=myPort.readString();//Read from serial port

if(inString!=null){
inString = trim(inString);
Temp = float(inString);

if(Temp<9000){//filter out colorPot value, set = float Color
Color=Temp;
}else{//filter out weirdPot value, set = float Weird
Weird=(Temp-9000);
Weird=map(Weird,550,1023,0,.02);
if(Weird<0){
Weird=0;}
println(Weird);
}
}else{
Temp=0;
}

for (int n=0 ; n<theBlobDetection.getBlobNb() ; n++)//blob-scanning
{
b=theBlobDetection.getBlob(n);
if (b != null)
{
if (drawEdges)//Edges
{
//====================================color scrolling
float R=0;
float G=0;
float B=0;
if(Color<255){
R=(255-Color);
}else{
R=0;}
if(Color>255){
B=(Color-255);
}else{
B=0;}
G=(255-R-B); //R+G+B = 255

strokeWeight(5);
stroke(R,G,B);
for (int m=0;m<b.getEdgeNb();m++)
{
eA = b.getEdgeVertexA(m);
eB = b.getEdgeVertexB(m);
if (Weird > 0){
eB.x += random(-Weird,Weird);//adds random value to edge endpoint based on float Weird
eB.y += random(-Weird,Weird);
}
if (eA !=null && eB !=null){
line(
eA.x*width, eA.y*height,
eB.x*width, eB.y*height);
line(
b.x*width, b.y*height,
eB.x*width, eB.y*height);
}
}
}//end edge drawing
}
}//end blob-scanning loop
}
// The following code is unnecessary, but adds stability to edges
// Super Fast Blur v1.1
// by Mario Klingemann
//
// ==================================================
void fastblur(PImage img,int radius)
{
if (radius<1){
return;
}
int w=img.width;
int h=img.height;
int wm=w-1;
int hm=h-1;
int wh=w*h;
int div=radius+radius+1;
int r[]=new int[wh];
int g[]=new int[wh];
int b[]=new int[wh];
int rsum,gsum,bsum,x,y,i,p,p1,p2,yp,yi,yw;
int vmin[] = new int[max(w,h)];
int vmax[] = new int[max(w,h)];
int[] pix=img.pixels;
int dv[]=new int[256*div];
for (i=0;i<256*div;i++){
dv[i]=(i/div);
}

yw=yi=0;

for (y=0;y<h;y++){
rsum=gsum=bsum=0;
for(i=-radius;i>16;
gsum+=(p & 0x00ff00)>>8;
bsum+= p & 0x0000ff;
}
for (x=0;x<w;x++){
r[yi]=dv[rsum];
g[yi]=dv[gsum];
b[yi]=dv[bsum];
if(y==0){
vmin[x]=min(x+radius+1,wm);
vmax[x]=max(x-radius,0);
}
p1=pix[yw+vmin[x]];
p2=pix[yw+vmax[x]];
rsum+=((p1 & 0xff0000)-(p2 & 0xff0000))>>16;
gsum+=((p1 & 0x00ff00)-(p2 & 0x00ff00))>>8;
bsum+= (p1 & 0x0000ff)-(p2 & 0x0000ff);
yi++;
}
yw+=w;
}

for (x=0;x<w;x++){
rsum=gsum=bsum=0;
yp=-radius*w;
for(i=-radius;i<=radius;i++){
yi=max(0,yp)+x;
rsum+=r[yi];
gsum+=g[yi];
bsum+=b[yi];
yp+=w;
}
yi=x;
for (y=0;y<h;y++){
pix[yi]=0xff000000 | (dv[rsum]<<16) | (dv[gsum]<<8) | dv[bsum];
if(x==0){
vmin[y]=min(y+radius+1,hm)*w;
vmax[y]=max(y-radius,0)*w;
}
p1=x+vmin[y];
p2=x+vmax[y];

rsum+=r[p1]-r[p2];
gsum+=g[p1]-g[p2];
bsum+=b[p1]-b[p2];

yi+=w;
}
}
}