AOAC final idea: augmenting social interactions

Fish and Carrier Frequencies

What do we experience when we are in the physical presence of other people? Do we experience joy or fear? Can we augment this experience by altering what we see and hear?

These are some of the questions I want to explore with my AOAC final project. Jason and I will be working together to create an “augmented social” experience using BLE connections between smartphones, and between smartphones and beacons. We have two ideas; both involve multiple participants wearing Cardboard to augment their sight and headphones to augment their hearing.

Fish: The experience we create would simulate a school of fish interacting with each other underwater. Each participant would represent a fish in the experience.

  • Each person/fish triggers a different oscillator that has a unique sound.
  • As one person/fish gets closer to another person/fish, they would hear the other’s sound and corresponding visuals.
  • The visuals they see in Cardboard would be a filtered view from the camera that changes based on the closeness of other people/fish.
  • Each person/fish can also speak into their microphone to add more audio cues to the whole group’s experience.
  • The direction the person/fish is facing can also change the audio or visual experience.
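The closeness-to-sound mapping above can be sketched as a pure function. The names, range, and linear rolloff here are my assumptions, not a spec:

```javascript
// Map the distance between two fish to a gain (0..1) for the other
// fish's oscillator: silent beyond maxDistance, full volume when the
// fish are touching. Linear rolloff is an assumption; an exponential
// curve might feel more natural underwater.
function gainForDistance(distance, maxDistance = 10) {
  if (distance >= maxDistance) return 0;
  if (distance <= 0) return 1;
  return 1 - distance / maxDistance;
}
```

The same value could drive the visual filter's intensity, so sound and image fade in together as two fish approach.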

Carrier Frequencies: Experiencing other people only through their devices’ frequencies. The details of this idea are similar to the Fish idea, but it conceptually focuses on the connection between our devices rather than our actual social interactions.

  • Every participant emits a “carrier” frequency, which has parameters like amplitude and filter frequency.
  • Parameters are modulated by proximity to other people. So other people act as “modulators.”
  • The frequencies are both sonified and visualized.
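The carrier/modulator relationship above could be sketched as a pure function; the parameter ranges and weighting are assumptions for illustration:

```javascript
// Each participant's carrier has a base amplitude and filter cutoff.
// Nearby participants act as "modulators": the closer a modulator,
// the more it pushes both parameters. Influence is summed across
// modulators and clamped to 1.
function modulateCarrier(carrier, modulatorDistances, range = 10) {
  let influence = 0;
  for (const d of modulatorDistances) {
    influence += Math.max(0, 1 - d / range); // 0..1 per modulator
  }
  influence = Math.min(influence, 1);
  return {
    amplitude: carrier.amplitude * (0.5 + 0.5 * influence),
    filterFrequency: carrier.filterFrequency * (1 + influence),
  };
}
```

A participant standing alone hears their carrier at half volume with a closed filter; as others approach, it gets louder and brighter.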

Getting technical

To measure the physical relationship between each person/device, we will use BLE connection between devices or connection between devices and iBeacons placed around the room.
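BLE doesn’t report distance directly; what the device actually sees is signal strength (RSSI), which ranging libraries typically convert to an estimated distance with a log-distance path-loss model. A rough sketch, where txPower (the calibrated RSSI at 1 meter) and the path-loss exponent n are assumptions that need per-room calibration:

```javascript
// Estimate distance in meters from an RSSI reading.
// txPower: calibrated RSSI at 1 m (a property of the beacon).
// n: path-loss exponent, ~2 in free space, higher indoors.
function estimateDistance(rssi, txPower = -59, n = 2) {
  return Math.pow(10, (txPower - rssi) / (10 * n));
}
```

For example, a reading equal to txPower maps to roughly 1 meter, and a reading 20 dB weaker maps to roughly 10 meters.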

We’ve tested with iBeacons. The distance measurement is not very accurate, but the audio produced from the frequencies is pretty cool.
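A lot of that inaccuracy comes from RSSI jitter, so it may help to smooth the raw readings before converting them to distance. One simple option, sketched here, is an exponential moving average:

```javascript
// Returns a function that smooths successive RSSI readings with an
// exponential moving average. Smaller alpha = smoother but laggier.
function makeRssiSmoother(alpha = 0.2) {
  let smoothed = null;
  return function (rssi) {
    smoothed = smoothed === null ? rssi : alpha * rssi + (1 - alpha) * smoothed;
    return smoothed;
  };
}
```

Each beacon would get its own smoother, and the smoothed value feeds the distance estimate instead of the raw reading.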

I’m also trying to connect devices directly over BLE. There are two PhoneGap plugins (this and this) that should let an iPhone act as a peripheral, but I’m not having much luck with either. There are definitely more options in Objective-C; I’ll have to look into this more.


We received a lot of great feedback from the class after we presented these ideas. Here are some of the notes we collected.

[Six screenshots of the feedback notes from class.]


AOAC midterm ideas

Idea 1: Update!

If at a location and/or time, send or receive an update. Users can customize these updates like with IFTTT.
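The trigger logic could be as small as a predicate over the current context. The field names here are assumptions for the sketch, not a design:

```javascript
// IFTTT-style check: a rule can require a place, a time window, or
// both; an update fires only when every condition set on the rule
// matches the current context.
function shouldFire(rule, context) {
  const placeOk = !rule.place || rule.place === context.place;
  const timeOk =
    !rule.window ||
    (context.hour >= rule.window.start && context.hour < rule.window.end);
  return placeOk && timeOk;
}
```

Customizing an update then just means editing the rule object, much like filling in an IFTTT recipe.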

Idea 2: Musical routes

An app that lets users create musical routes: a designated song plays when a subscriber is in a specific location.
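A minimal sketch of the playback rule, assuming each route is a list of geofenced waypoints and using the haversine formula for distance (the waypoint shape is my assumption):

```javascript
// Great-circle distance in meters between two { lat, lon } points.
function haversineMeters(a, b) {
  const R = 6371000; // mean Earth radius in meters
  const toRad = (deg) => (deg * Math.PI) / 180;
  const dLat = toRad(b.lat - a.lat);
  const dLon = toRad(b.lon - a.lon);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

// A route is a list of { lat, lon, radius, song } waypoints; play the
// song of the first waypoint whose geofence contains the subscriber.
function songForLocation(route, position) {
  const hit = route.find((w) => haversineMeters(w, position) <= w.radius);
  return hit ? hit.song : null;
}
```

The app would poll the subscriber's position and switch tracks whenever songForLocation returns something new.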

Background and questions

I realized that my main fascination with the “contextualize” idea is rooted in the idea of content made just for mobile. This means content that’s made with real world context in mind, context such as geolocation, time, weather, or how fast the user is moving. What would a platform for creating this kind of content look like? Would the content be informative or entertaining or both? How long can this content be? Can it be shared on other platforms?

This also reminded me of my first assignment at ITP last semester: designing an experience for a classmate in Nancy’s Applications class. Every week a group of students drew from a hat filled with items, each tied to a location in NY. We were to visit that location and follow the directions we drew, if there were any, then design our own experience for the next week’s group. Nancy tracked our experiences on a map on our Applications blog. It was a great way to get us off the floor, and for students new to NY to explore the city. What if anyone could design an experience on their phone, with content linked to specific locations and/or times, and others could subscribe to it?

How do I use my phone when out?

  • Moving: playing music, a podcast, or an audiobook.
  • Not moving (waiting or on the subway): reading the Kindle app, Twitter, or Feedly, or listening to an audiobook.
  • Navigating.
  • Communicating.
  • Getting information linked to a specific location (e.g. a bus schedule).

Get Info Here and there

Continuing to explore my “contextualize” idea, I’m hardcoding the first event in the app: when I’m at the M54 bus stop, send me the schedule for the next bus.



The app should allow me to:

  • update the location where I want to receive the schedule by typing an address or dragging a map
  • turn the time option off
  • turn the event on and off
  • view the event from the time list and on the map
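The event itself can be a plain object that covers the on/off toggle, plus a JSON round-trip so it survives an app restart in local storage (e.g. via localStorage.setItem). The shape is an assumption for this sketch:

```javascript
// A single context event: a name, a target location, and whether it
// is currently active.
function makeEvent(name, location, enabled = true) {
  return { name, location, enabled };
}

// JSON round-trip for persistence, e.g.
// localStorage.setItem("events", serializeEvents(events)) and
// deserializeEvents(localStorage.getItem("events")).
function serializeEvents(events) {
  return JSON.stringify(events);
}

function deserializeEvents(json) {
  return json ? JSON.parse(json) : [];
}
```

Turning an event on and off is then just flipping the enabled flag and re-saving the list.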

My goal for this week is to get the following to work:

  • background geolocation plugin
  • MTA API
  • saving location in local storage
  • simple UI