Snoopi update: visualizing sniffed data

The main focus this week was the data visualization portion of the project, since this week was our ICM final presentation.

Before the presentation, I had wanted to try the program/visualization in Washington Square Park. Surya (who made N.S.Heyy) had mentioned that it’s easier and faster to pick up router information in open public areas, where every device is trying to connect to a wifi spot. I ended up going to a Starbucks, because it’s closer and, more importantly, warmer. I was almost immediately able to see that there were a few NYU students, and I was able to sniff a “jackpot”. This is the term we serious hackers use to describe people/devices with multiple legible data points, aka this is what I heard Surya say once. Simply put, with “jackpots” we can potentially look up the sniffed data to find out specific, detailed information about the owner of the device. This is when things can get creepier.

Below is a video of the sniffed result from Starbucks. Skip through the video to see the result over time and the interaction. Code for the project is here.

Sniffed visualization from Jiashan Wu on Vimeo.

Breaking it down: the visualization is made up of three sections

The left section is where all the sniffed devices are visualized. This is the most dynamic section.

  • Each dot is a device
  • The radius from a dot to the dog is the estimated distance between the device and wherever the program is running (my computer in this case)
  • The size of the dot corresponds to the amount of router information that was sniffed from that device. The more information, the bigger the dot.
  • The dots turn green when another device that had been connected to the same router shows up. The two are connected by a green line and move closer to each other, while each maintains its radial distance.
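The post doesn’t spell out how signal power becomes a radius on screen; a common way to do it is the log-distance path-loss model. Here is a minimal sketch in plain Java, where the constants (reference power, path-loss exponent) and all names are assumptions for illustration, not values from the actual project:

```java
// Sketch of one way to turn sniffed signal power (RSSI) into an
// on-screen radius. Constants are illustrative assumptions.
public class SignalDistance {
    // Log-distance path-loss model: rssi = txPower - 10 * n * log10(d),
    // solved for d (meters). txPower = expected RSSI at 1 meter.
    static double estimateDistance(double rssi, double txPower, double n) {
        return Math.pow(10.0, (txPower - rssi) / (10.0 * n));
    }

    // Map the estimated distance onto a pixel radius, clamped to the canvas.
    static float toRadius(double meters, double maxMeters, float maxPixels) {
        double clamped = Math.min(meters, maxMeters);
        return (float) (clamped / maxMeters * maxPixels);
    }
}
```

Since the model is rough indoors, the result is best treated as a relative ordering of devices rather than a true distance.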

The top right section is where information of each device shows up when a dot is clicked. This section allows the viewer to zoom in on the data.

  • “Sniffed” shows when information was last sniffed from the device
  • “Device from” shows the company that made the device
  • The rest of the section lists out all the unique router names sniffed from the device

The bottom right section is where the aggregate router names are displayed. This section provides a quick overview of the data.

  • When a router name is sniffed multiple times from different devices, its text turns green, and its text size grows with the number of times the name is sniffed
  • After every 20 router names, the first (earliest) router name is removed from the list, unless it’s green.
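One reading of the pruning rule above can be sketched in plain Java (standing in for the Processing sketch); here “green” is assumed to mean a name sniffed more than once, per the first bullet, and the names are illustrative:

```java
import java.util.*;

// Sketch of the aggregate router-name list: names arrive in sniffing
// order; once the list exceeds 20 entries, the earliest non-green
// (count < 2) name is dropped. Illustrative, not the project's code.
public class RouterList {
    final List<String> names = new ArrayList<>();        // display order
    final Map<String, Integer> counts = new HashMap<>(); // sniff counts

    void add(String router) {
        counts.merge(router, 1, Integer::sum);
        if (!names.contains(router)) names.add(router);
        if (names.size() > 20) {
            // walk from the front, removing the first non-green entry
            for (Iterator<String> it = names.iterator(); it.hasNext(); ) {
                if (counts.get(it.next()) < 2) { it.remove(); break; }
            }
        }
    }
}
```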

Stages of discovery: data structure

Wireframe: Setup the data structure. Getting the data to display in real time.


This was where I spent the most time. It took me a while to wrap my mind around using objects in hashmaps. Once I recognized that a new object is created each time the draw loop passes through the if statement, and that the new object is then added to the hashmap, I was able to get past it.
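The object-in-hashmap pattern described above (create the object the first time a MAC address appears, otherwise update the existing one) looks roughly like this in plain Java; field and method names are illustrative, not the project’s actual code:

```java
import java.util.*;

// Sketch of the object-in-HashMap pattern: each sniffed line either
// creates a new Device for an unseen MAC address or updates the one
// already stored. Names here are illustrative assumptions.
public class DeviceMap {
    static class Device {
        final String mac;
        final Set<String> routers = new HashSet<>(); // unique router names
        long lastSniffed;                            // last time seen
        Device(String mac) { this.mac = mac; }
    }

    final HashMap<String, Device> devices = new HashMap<>();

    // Called once per sniffed line, e.g. from Processing's draw() loop.
    void update(String mac, String router, long timestamp) {
        Device d = devices.get(mac);
        if (d == null) {               // first time we see this MAC
            d = new Device(mac);
            devices.put(mac, d);
        }
        d.routers.add(router);
        d.lastSniffed = timestamp;
    }
}
```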

I also used:

  • Table: to load the vendor information
  • StringDict: to store the parsed vendor information, used to match against MAC addresses to determine the device maker
  • IntDict: to save all the different router names sniffed from the devices, along with the count of how many times each router name was sniffed
  • ArrayLists: to store all the keys in the hashmap, and to store all the routers sniffed from a device in its object

I would like to dive deeper into data structures and get more comfortable with using them in combination.
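For the device-maker match, the usual approach is to look up the first three bytes of the MAC address (the OUI) in the vendor table. A minimal sketch, with made-up table entries; the actual project loads the real table from a file via Table and StringDict:

```java
import java.util.*;

// Sketch of matching a MAC address to its maker: the first three bytes
// (the OUI, "XX:XX:XX") index into a vendor dictionary. The entries
// below are invented examples, not real OUI assignments.
public class VendorLookup {
    final Map<String, String> ouiToVendor = new HashMap<>();

    VendorLookup() {
        ouiToVendor.put("AA:BB:CC", "ExampleCorp"); // hypothetical entry
        ouiToVendor.put("DD:EE:FF", "DemoDevices"); // hypothetical entry
    }

    String vendorFor(String mac) {
        String oui = mac.toUpperCase().substring(0, 8); // "XX:XX:XX" prefix
        return ouiToVendor.getOrDefault(oui, "Unknown");
    }
}
```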

Stages of discovery: data visualization

Data visualization rough mockup


Mockup with light background. Too slick looking.


This was another challenge. First there was the organization of the data: what should be displayed, and how. Then there was the animation.

With the organization, I had decided on the overall layout early on, but thinking through the details of each section was still tough. For instance, since the only spatial information I had to go on was the devices’ signal power, it didn’t really matter whether a dot was to the left or right of the dog. So it was hard to make clear that the line connecting a dot to the dog relates to distance. The background circles and the measurement key were added in the hope of clarifying this.
Then it was a matter of visual design. I tried mocking up a light background. It looked nice and slick, but definitely did not scream “creepy”. So I changed it back to dark. I’m still not very happy with the visual design and animation of the data visualization. I’m not a fan of special effects and would like to keep each design decision purposeful, but I think I should be able to find a way to make it more engaging while keeping the simplicity.

In our first ICM user testing, a few people mentioned that the P Comp side of the project (the doggie wearable sniffer) and the data visualization didn’t seem well connected. From this feedback came the idea of a dog house where the information would be stored. It was also clear that the dog had to be present in the visualization to tie it back to the wearable device, and that the on-screen name had to be clearly both dog-related and data/spying-related. Luckily this name exists in English, hence Snoopi.

In both ICM testings/presentations, people in the class had questioned: why does this matter? How does this data reveal relevant information about someone? Why should I care that I’m leaking breadcrumbs of data from my phone? I’m happy that people are asking these questions; it’s kind of the point of the project. I don’t know if all of this matters. It probably depends on who uses this data and for what. What’s interesting is that we are at a time when none of this is defined yet. The norms of our attitudes towards data and data collection are still being formed. Maybe years down the line, we’ll see our data as a form of currency whose ownership should be clearly defined. I have no idea. But I’m hoping to be part of this conversation through Snoopi.

More to do!

  • Update some of the design and animation details of the data visualization
  • Complete the wearable device!
  • Bring in the dog!

Our ICM class was awesome. People disconnected their devices from NYU wifi so we could track them and demo the visualization in real time. I’m gonna miss this class. Here’s a video of the demo in class.

doc5 from Jiashan Wu on Vimeo.

The game of driving a straight line

Following up on testing all the sensors, our team custom-made a steering wheel that controls a game developed by Ayanna. The player drives a car and tries to stay on the road. That’s it! The constraints of the project were that we had to spend less than four dollars on materials and that we could use only the sensors provided by Kaho. We created the wheel out of two plastic plates, with two tilt sensors, a sonar sensor, and the Flora. Two holes were cut out of the plate to ensure the player would hold it the right way, so that we could get the best readings from the tilt sensors.

I like to think of our game as the Flappy Bird of hardware games. It’s deceptively simple, but nearly impossible to win, because the tilt sensors work more like switches. Even though we were able to get increasing and decreasing values from them, they essentially produce binary results – on or off. That makes driving in a straight line extremely difficult with our wheel. Once the player makes their first turn, it’s impossible to go back to just driving straight, because the wheel is always creating a left or right turn on the screen. One of our testers even flipped the car over while trying to steer it back to the middle of the road! What I learned from this project is that simple, impossible games are pretty addictive. Even after flipping his car over, the player still wanted to try again. The stupidly simple goal of the game makes it so frustrating that players can’t beat it, so they keep playing. (insert evil laugh)
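In code terms, the binary tilt readings reduce to something like this hypothetical sketch (not the actual game logic): any tilt at all commands a full turn, so the car’s heading can never settle on straight.

```java
// Why driving straight is nearly impossible: the tilt sensors act as
// switches, so the wheel always commands a full left or full right
// turn whenever it is tilted. Names and values are illustrative.
public class Steering {
    // -1 = full left turn, 0 = straight, +1 = full right turn
    static int steer(boolean leftTilt, boolean rightTilt) {
        if (leftTilt && !rightTilt) return -1;
        if (rightTilt && !leftTilt) return 1;
        return 0; // level (or both sensors tripped): no turn
    }
}
```

With analog steering the return value would scale with tilt angle; here it jumps straight to full lock, which is the whole (evil) fun of it.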

IMG 3987 from Jiashan Wu on Vimeo.

IMG 3988 from Jiashan Wu on Vimeo.

Tilt sensors between the plates.



We also tried using the Arduino, because we were having issues reading the sonar values with the Flora.


We went back to the Flora because the Uno didn’t support keypress events.


Completed steering wheel!

Snoopi: The Data Sniffing Dog

Snoopi, aka Prints, in his hoodie.


Snoopi, aka Prints, will “sniff” out data from people around him through a wearable hoodie controller and display the information on an LED screen attached to the hoodie. This real-time data, along with the collected data, would also be visualized in the “dog house” on a nearby computer.

As a group, we have thought a lot about the way our data is collected on an everyday basis, with and without our knowledge. We are interested in exploring this topic and spotlighting it for people who may be unaware. Culturally, we are at a place where the ramifications of having this data shouted out over wireless networks have largely remained unseen. With Snoopi we want to start a conversation connecting simple, benevolent acts (such as petting a dog) with data collection and profiling. Also, we like puns.

System diagram


Sniffing Program

We are using N.S.Heyy, created by Surya Mattu, to pick up wifi signals from nearby mobile devices. This allows us to capture unique MAC addresses, their corresponding wifi router connection history, and their distance from the program (through signal power).

N.S.Heyy sniffs out WiFi probing signals from devices. Screenshot of the interface.


Snoopi Hoodie Display

Prints will be wearing a hoodie controller with an LED screen attached to the back to display scrolling router history information. The LED screen would be driven either by an Arduino connected over Bluetooth through a Bluefruit module, or by a mini computer such as a Raspberry Pi or an Android device. The LED screen will display the information when the hood is up and nothing when the hood is down. This would be done with conductive material sewn onto the hoodie to create an on/off switch.

Hoodie diagram


Data Visualization (dog house)

The data visualization is a Processing sketch that would run on a nearby computer. It would display the people/devices around Prints in real time. Viewers would be able to click on each device to reveal its router history. The dominant router names sniffed in the area would also be displayed in the corner of the visualization.

Dog house data visualization sketch


Data visualization rough mockup


Wireframe: Setup the data structure. Getting the data to display in real time.



  • Sniffing program: the program is running smoothly and updates a spreadsheet that can be loaded line by line into the Processing sketch
  • Hoodie: we have the Arduino set up with the Bluefruit and the LED screen. We can send messages from the Arduino IDE to be displayed on the LED screen. This runs fully off of a 9V battery
  • Hoodie: we have a black hoodie and have prototyped it with Prints!
  • Data vis: we have the data structure drawn out and have started on a rough version of the layout
  • Prints: we are training Prints to get into a bag so he can also sneak in like a bandit

Testing LED screen with Arduino and Bluefruit. It works!

To dos

  • Ordering materials
  • Hoodie: prototyping the on/off switch
  • Hoodie: getting the Processing sketch to send strings to Arduino IDE and the Arduino IDE to send strings to the LED display
  • Hoodie: trying out mini computer option
  • Data vis: finalizing design and completing code
  • Testing: testing it on Prints around people!

General Questions

  • How revealing is the data visualization? With a few devices we were able to pick up lots of router history, and were even able to map the router names out on Google Maps to get a pretty clear idea of who a person is and where he has been. But this level of detail is rare, and even in those rare cases, the visualization itself wouldn’t reveal much without manually Googling router names.
  • Will people be able to see the LED screen?

Technical Questions

  • Where should we place the program? The ideal would be on Prints, so that he would be the center point from which the data is collected and distances are measured.
  • Alternatively, we can also leave the program on the computer, where the data visualization would be displayed, and send the router names to the LED screen on Prints to be displayed. How would we get the Processing sketch to send strings to the Arduino IDE that would in turn send the strings to the Arduino and LED screen?
  • How to get real-time data/string information to upload and display from Processing to Arduino and then to the LED display on Prints?
  • Can N.S.Heyy operate on the Raspberry Pi?

[Bill of Materials here]