Vase – Love vs Hate, Baltimore (2014-11-25)


Well, Bullshit

Shapeways removed ceramics as one of their materials with no warning. They are introducing porcelain, which actually sounds like it’ll be better in most respects, but there is no ETA for when one can order something printed in porcelain.

Seems rather shortsighted to remove ceramic without having a replacement ready to go; it also throws a wrench into my short-term plans.


Tweet Data Captured

I captured 48 total hours of tweets in Baltimore, from Mon Nov 24 09:49:16 EST 2014 to Wed Nov 26 09:55:56 EST 2014.

I will likely just use the data from 2014/11/25 00:00:00 to 2014/11/25 23:59:59 so that I can visualize exactly one full day in Baltimore. Right now I’m pretty settled on using “love” vs “hate” since those words are short, common, diametric, and cover a major category of tweets (praise or complaints).
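To pull just that window out of the capture, the timestamps in the hour files can be parsed and checked against the target day. A minimal sketch in plain Java, assuming timestamps in the same format the capture script writes (the class and method names here are mine):

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;
import java.util.TimeZone;

public class DayFilter {
    // Timestamp format as it appears in the captured files,
    // e.g. "Tue Nov 25 09:49:16 EST 2014"
    static final SimpleDateFormat FMT =
        new SimpleDateFormat("EEE MMM dd HH:mm:ss zzz yyyy", Locale.US);

    // True if the timestamp falls on Nov 25, 2014 (Eastern time)
    static boolean isTargetDay(String timestamp) {
        try {
            Date d = FMT.parse(timestamp);
            SimpleDateFormat day = new SimpleDateFormat("yyyy/MM/dd", Locale.US);
            day.setTimeZone(TimeZone.getTimeZone("America/New_York"));
            return day.format(d).equals("2014/11/25");
        } catch (java.text.ParseException e) {
            return false; // malformed line: skip it
        }
    }

    public static void main(String[] args) {
        System.out.println(isTargetDay("Tue Nov 25 00:00:00 EST 2014")); // true
        System.out.println(isTargetDay("Wed Nov 26 09:55:56 EST 2014")); // false
    }
}
```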

After doing some very basic ratio analysis on the data, and seeing how the ratio varies from one hour to the next, there are a few hours that would not print well as a ceramic 3D print: there would be just too much overhang. Adding support material probably wouldn’t work either, because it wouldn’t be removable the way it would be on a plastic print, since everything gets coated in glaze and fired in a kiln. So, I think making each segment a two-hour timespan would give a better result for the medium, as well as cutting the total number of segments in half, allowing a less frantic, smoother shape over the total height allowed by Shapeways. Here’s a very rough mockup of what this would look like:

Next up: Rhino/Grasshopper to convert the data into an actual model.
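The two-hour binning is simple to sketch before moving into Grasshopper. Here’s a rough Java version with made-up hourly counts (the real numbers come from the captured tweets); it collapses 24 hourly love/hate counts into 12 two-hour segments and prints a ratio per segment:

```java
public class SegmentRatios {
    // Hypothetical hourly counts for one day (placeholders, not the real data)
    static int[] love = { 3, 2, 1, 1, 0, 1, 4, 6, 8, 9, 7, 5,
                          6, 7, 8, 9, 10, 12, 11, 9, 8, 7, 5, 4 };
    static int[] hate = { 1, 1, 2, 0, 1, 0, 2, 3, 4, 3, 2, 4,
                          3, 2, 3, 4, 5, 4, 3, 4, 3, 2, 2, 1 };

    // Collapse 24 hourly counts into 12 two-hour segments
    static int[] toSegments(int[] hourly) {
        int[] segments = new int[hourly.length / 2];
        for (int i = 0; i < segments.length; i++) {
            segments[i] = hourly[2 * i] + hourly[2 * i + 1];
        }
        return segments;
    }

    public static void main(String[] args) {
        int[] l = toSegments(love);
        int[] h = toSegments(hate);
        for (int i = 0; i < l.length; i++) {
            double ratio = (h[i] == 0) ? l[i] : (double) l[i] / h[i];
            System.out.printf("Segment %02d: love=%d hate=%d ratio=%.2f%n",
                              i, l[i], h[i], ratio);
        }
    }
}
```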


Nov. 25, 2014

We cut out all of the dots we’ll be using from the .25″ lollipop pink acrylic from Acrylite.

Everything cut.

Our cut settings were the following, for future reference:

  • Power: 60
  • Speed: 100
  • Auto Hz
  • Passes: 3
  • Air: On
  • IPC: Accuracy, HQ Checked

We also cut some very small (.0625″ radius) spacer pegs in the 1/8″ plexi, but they were just a tad too short to allow enough space between the two panes of plexiglass for the LEDs to fit. So, we’ll have to cut pegs out of the .25″ pink stuff we have left, and we can double the radius of each peg (to .125″).

Little plexiglass spacer pegs.

Tested the acrylic solvent cement with the pegs, and it works extremely well: bonds quickly and bonds clearly.

Cement and light test with pegs inserted. No shadows.

Also worked on the Processing script which will take input from our little USB camera and detect motion. Currently the feed is divided into three sections, and each section will fire a function to trigger some LED dynamism. We still need to create the function that makes the Arduino PWM the LED strips, but the motion detection stuff (the hard part) is done; all that’s left is some fairly straightforward math.
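The section-based motion detection can be sketched in plain Java (Processing’s video library handles the actual camera capture; the frame size and threshold below are made up). Each frame is compared pixel-by-pixel against the previous one, and changed pixels are tallied per vertical third of the frame:

```java
public class MotionZones {
    // Count pixels whose brightness changed by more than `threshold`
    // in each of three vertical sections of the frame.
    static int[] diffByThirds(int[] prev, int[] curr,
                              int width, int height, int threshold) {
        int[] changed = new int[3];
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int i = y * width + x;
                if (Math.abs(curr[i] - prev[i]) > threshold) {
                    changed[x * 3 / width]++; // which third this pixel falls in
                }
            }
        }
        return changed;
    }

    public static void main(String[] args) {
        int w = 6, h = 2;
        int[] prev = new int[w * h]; // all-black previous frame
        int[] curr = new int[w * h];
        curr[0] = 255; // a change in the left third
        curr[5] = 255; // a change in the right third
        int[] zones = diffByThirds(prev, curr, w, h, 50);
        System.out.println(zones[0] + " " + zones[1] + " " + zones[2]); // 1 0 1
    }
}
```

Whichever section crosses some changed-pixel count would then trigger the corresponding LED function.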


Data Driven Vase

After trying to figure out a way to capture Twitter data both by location and by keyword, here’s what I’ve discovered:

1) Twitter’s advanced search allows you to search for tweets based on location, with a specific keyword, and between certain past dates. But, it only returns the “top” tweets. In other words, it only returns a very small subset of everything.

2) The ScraperWiki website can no longer scrape Twitter data.

3) The twitter4j library for Processing cannot filter live streaming tweets by both location and by keyword at the same time. Additionally, if one searches by keyword they get just a small sample of all tweets around the world w/ that keyword, and very rarely will a tweet in your desired geolocation be found.

So, the best way to get the information I require is to capture all the tweets I can for my location, and then process and search for my keywords after the fact.

Here’s my Processing code that will run non-stop collecting tweets. Each hour the script will save the current file, and then start a new text file to save tweets to so I’m not potentially dealing with a massive text document in the end.

import twitter4j.conf.*;
import twitter4j.internal.async.*;
import twitter4j.internal.logging.*;
import twitter4j.json.*;
import twitter4j.internal.util.*;
import twitter4j.auth.*;
import twitter4j.api.*;
import twitter4j.util.*;
import twitter4j.internal.http.*;
import twitter4j.*;
import twitter4j.internal.json.*;
import java.util.List;
import java.util.Map;
import java.util.*;

/*
 Developed by: Michael Zick Doherty
 Adapted to capture twitter data, for a
 specific geographical location, by Alex Jacque.
*/

PrintWriter output; // var to hold reference to text file
int textUpdateDelay = 60*60*1000; // every hour, (minutes * seconds * milliseconds)
int lastUpdate; // when we last updated
int hour = 0; // starting hour number

///////////////////////////// Config your setup here! ////////////////////////////

// This is where you enter your Oauth info
static String OAuthConsumerKey = "blFY1vgdJGR8LRqx6LTYpQ";
static String OAuthConsumerSecret = "P3SfVF8EbOouVS7kAQjXydLbfvjZbhLsuqMKYToUg";
// This is where you enter your Access Token info
static String AccessToken = "63581578-HGdSyHfp7kNppdFdMOM3E57Gm4zgiroPq419u4pZQ";
static String AccessTokenSecret = "f8OZ2dv74tXhWbx2r9X57ZunTDpyzzHAa7AMmQPsmo";

// if you enter keywords here it will filter, otherwise it will sample
String keywords[] = {""};

///////////////////////////// End Variable Config ////////////////////////////

TwitterStream twitter = new TwitterStreamFactory().getInstance();
PImage img;
boolean imageLoaded;

void setup() {
  double lat = 39.2905807; // baltimore, md
  double longitude = -76.6092606; // baltimore, md
  double lat1 = lat - .08;
  double longitude1 = longitude - .1;
  double lat2 = lat + .08;
  double longitude2 = longitude + .1;

  double[][] geoloc = {{longitude1, lat1}, {longitude2, lat2}}; // bounding box

  output = createWriter("hour"+hour+".txt");

  connectTwitter();
  twitter.addListener(listener);
  //if (keywords.length==0) twitter.sample();
  twitter.filter(new FilterQuery().locations(geoloc));
}

void draw() {
  if (imageLoaded) image(img, width/2, height/2);
}

// Initial connection
void connectTwitter() {
  twitter.setOAuthConsumer(OAuthConsumerKey, OAuthConsumerSecret);
  AccessToken accessToken = loadAccessToken();
  twitter.setOAuthAccessToken(accessToken);
}

// Loading up the access token
private static AccessToken loadAccessToken() {
  return new AccessToken(AccessToken, AccessTokenSecret);
}

// This listens for new tweets
StatusListener listener = new StatusListener() {
  public void onStatus(Status status) {
    // Every hour, close the current file and start a new one
    if (millis()-lastUpdate >= textUpdateDelay) {
      output.flush(); // flush anything that's still not written
      output.close(); // close the output
      println("new file");
      hour = hour+1; // increment our hour integer
      output = createWriter("hour"+hour+".txt");
      lastUpdate = millis(); // reset last update
    }

    String screenName = status.getUser().getScreenName(); // tweet owner's screen name
    Date createdAt = status.getCreatedAt();
    String text = status.getText();
    GeoLocation coords = status.getGeoLocation();

    // Write the captured fields, separated by the delimiter
    // the PHP parser splits on
    output.println(screenName);
    output.println(createdAt);
    output.println(text);
    output.println(coords);
    output.println("**********");
    output.flush(); // Writes the remaining data to the file
  }

  public void onDeletionNotice(StatusDeletionNotice statusDeletionNotice) {
    // System.out.println("Got a status deletion notice id:" + statusDeletionNotice.getStatusId());
  }

  public void onTrackLimitationNotice(int numberOfLimitedStatuses) {
    // System.out.println("Got track limitation notice:" + numberOfLimitedStatuses);
  }

  public void onScrubGeo(long userId, long upToStatusId) {
    // System.out.println("Got scrub_geo event userId:" + userId + " upToStatusId:" + upToStatusId);
  }

  public void onException(Exception ex) {
    ex.printStackTrace();
  }
};

void keyPressed() {
  if (key == 'k' || key == 'K') {
    output.flush(); // Writes the remaining data to the file
    output.close(); // Finishes the file
    exit(); // Stops the program
  }
}

The type of result I get from this looks like:

Sat Nov 22 16:00:07 EST 2014
If only you could wear your pjs to work LIFE WOULD BE AMAZING
GeoLocation{latitude=XX.XXXXXX, longitude=-YY.YYYYYY}
Sat Nov 22 16:00:08 EST 2014
getting my new phone on tuesday .
GeoLocation{latitude=XX.XXXXXX, longitude=-YY.YYYYYY}
Sat Nov 22 16:00:09 EST 2014
Brewer isn't a fit for tech make a change.
GeoLocation{latitude=XX.XXXXXX, longitude=-YY.YYYYYY}

Then, I run this through a PHP script to break the text file into an array of tweets and then further break each tweet into an array with indexes for screenName, date, text, and coordinates. From there, I can search for a specific term and have the number of matching tweets returned. But, I need to be careful in how I search, because if I just search for say “sad” it will return a match even if the actual word used in the tweet is “sadistic.”

 $result = array();
 $file = explode("**********", file_get_contents("hour0.txt"));
 foreach ( $file as $content ) {
   $result[] = array_filter(array_map("trim", explode("\n", $content)));
 }

 $termCount = 0;
 $searchTerm = " love ";
 foreach ( $result as $tweet ) {
   if (isset($tweet[3]) && strpos($tweet[3], $searchTerm) !== FALSE) {
     $termCount++;
   }
 }
 print $termCount;
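Padding the search term with spaces avoids the “sadistic” problem, but it misses tweets where the word starts or ends the text, or sits next to punctuation. A word-boundary regex handles those cases too; a sketch in Java (the same idea ports directly to PHP’s preg_match):

```java
import java.util.regex.Pattern;

public class WordMatch {
    // Match "love" only as a whole word, case-insensitively;
    // \b matches the boundary between word and non-word characters
    static final Pattern LOVE =
        Pattern.compile("\\blove\\b", Pattern.CASE_INSENSITIVE);

    static boolean containsWord(String text) {
        return LOVE.matcher(text).find();
    }

    public static void main(String[] args) {
        System.out.println(containsWord("Love this city!"));  // true: start of text, punctuation after
        System.out.println(containsWord("a lovely evening")); // false: "lovely" is not "love"
    }
}
```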

Next step, run the script for 24 hours on my computer in my studio. Then, parametric modeling in Rhino/Grasshopper.


Towards a Built Object

This last week we made a bit of progress in the march toward our final installation, but really, not as much progress as we would have liked.

We started by constructing our space in Rhino to the actual dimensions of the window we’ll be using. We placed our type lockup in the space to size, and then dropped in our dots at the scale that looks most appropriate.

Because we’re attaching the LED strips to the outside of these plexiglass circles, and because LED strips come in 1-inch segments, it only made sense that our circles would have circumferences equal to whole numbers of inches. So we created new circles at each location to an exact circumference measurement, rounding to the nearest inch from what we had placed in our original dots file, which Rhino makes so very easy.

Total circumference of the circles: 105″
Total LED count: 315
Total LED length on hand (of one °K value): 196″
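The rounding step works out like this. A rough Java sketch with hypothetical radii (the real ones come from the dots file), snapping each circle’s circumference to the nearest whole inch so the 1-inch LED segments fit evenly, at 3 LEDs per inch:

```java
public class CircleSizing {
    static final double LEDS_PER_INCH = 3.0;

    // Round a circle's circumference to the nearest whole inch and return
    // the radius that produces it (circumference = 2 * pi * r)
    static double snapRadius(double radiusInches) {
        long snapped = Math.round(2 * Math.PI * radiusInches);
        return snapped / (2 * Math.PI);
    }

    public static void main(String[] args) {
        double[] placed = { 1.4, 2.1, 3.3 }; // hypothetical radii from the dots file
        long totalInches = 0;
        for (double r : placed) {
            double snapped = snapRadius(r);
            long inches = Math.round(2 * Math.PI * snapped);
            totalInches += inches;
            System.out.printf("r=%.3f\" -> %d\" of LEDs (%d LEDs)%n",
                              snapped, inches, (long) (inches * LEDS_PER_INCH));
        }
        System.out.println("Total LED count: " + (long) (totalInches * LEDS_PER_INCH));
    }
}
```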

Circumference measurements.

3D perspective of our final environment.

Here’s a short video of moving around that 3D space.

We had thought we’d need a whole bunch of plexiglass for this, but it turns out that all of these dots fit really well within an 18″×24″ sheet of plexiglass (under half of the space, actually).

Additionally, we spent about 2 hours with Ryan figuring out the proper laser cutter power settings. The laser cutter seems a bit inconsistent from what we’re hearing, but we finally settled on the following for 1/8″ plexiglass:

  • Power: 65
  • Speed: 100
  • HZ: Auto
  • Air On
  • IPC: Accuracy
  • HQ On

Are those the settings that will work every time, or even work with our final plexiglass material (which will actually be 1/4″)? Not a chance.

What’s next on our list? Well, we ordered a sample of the plexiglass we think we want to use; if that tests all right, we’ll order the full needed quantity. We’ll also need to source a webcam, but that should be pretty easy and cheap. Finally, we need to start figuring out our circuits in relation to the Arduino. Perhaps today we start the electrical work.

We do need to figure out how these letters will finally be output: either CNC the foam and hand-finish each letter (since the CNC won’t give us perfect results), or see what Ryan finds out about the about-to-expire material for the Objet printer (in which case we can print the facets and laser-cut shapes to glue them to).


3D Printing and LEDs

This past class we got a tutorial on 3D printing, and we were able to print an individual letter from our faceted model of our type lockup. Results were mixed.

Just beginning.

Further along printing.

Finished 3D test print still on the print bed.

The points of the facets were rather rough. It seemed the extruder tip was not hot enough, so the filament didn’t adhere to the previous layer very well while the extruder made small circles, and the just-extruded bits tended to gather.

It was brought to our attention that we may find better success, and certainly much faster results, if we were to mill our letters out of a dense foam on the CNC mill (foam we can easily and very cheaply acquire from Home Depot or Ace Hardware). The level of detail we could achieve on the mill looks to be pretty fine from examples we were shown, and we could possibly then also mill letters for EACH of the various marquee walls and not just this one window display in Fox. Exciting!

Additionally, we wired up an LED strip to an LED driver and a dimmer. Results were good and it looks like we can get a good range of brightness levels from full-power down to the lowest power level before cutoff.

Cool white LEDs. ~16 feet with 3 LEDs per inch segment.

For plexiglass, we found a pink semi-opaque plexi sample that would be perfect for our final construction. We just need to order a batch now, though first we need to figure out exactly how large these circles need to be.

We found a near perfect pink plexiglass to use.

Rough sketch of what our circuit to control the LEDs through the arduino will look like.

MJE3055T - Transistor that should handle the power needed to supply two small strips of LEDs.
