Sensory Storytelling Final

My final project for Sensory Storytelling uses a real-time feed of search terms on Twitter to drive a particle system in Unity.

In the project I use a Node.js server to pull tweets from the Twitter streaming API. I am pulling and parsing three search terms: “migrant crisis,” “presidential election,” and “poverty.” These are issues in the public conversation that have been really frustrating me lately.
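As a sketch of the parsing step, a small helper (hypothetical, not the post's actual code) can check an incoming tweet's text against the three tracked terms:

```javascript
// Hypothetical helper: given a tweet's text, return which of the three
// tracked search terms it mentions (case-insensitive).
const SEARCH_TERMS = ['migrant crisis', 'presidential election', 'poverty'];

function matchTerms(text, terms = SEARCH_TERMS) {
  const lower = text.toLowerCase();
  return terms.filter(term => lower.includes(term));
}

console.log(matchTerms('New report on poverty in the US'));  // → [ 'poverty' ]
```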

My goal with this project was to take these terms and use them to create something seemingly unrelated and beautiful. In doing so I’m attempting to offer a different perspective on frustrating things.

On another level, it’s interesting that, in reality, Twitter users control the development of this Unity session, but they aren’t aware of it; and even if they were, each would likely feel that, despite their individual contribution, they couldn’t affect the outcome on their own.

I originally started developing this idea when I was playing with the Twitter API for my mapping class. (See more on hooking up Twitter to Unity here.) There is something really impressive about seeing so much data come through in real time.

This Vimeo video (and this post) shows the first time I was able to hook up a socket and the Twitter feed to Unity.

The project uses the following libraries:

  • Node.js
  • Twitter (npm)
  • Socket library for Unity

I used Node.js as my server. The Twitter streaming API runs on Node, and the results are sent over a socket. I then use the socket library in Unity to pull in the Twitter search results.

The final Unity project uses three particle systems of similar sizes with different colors. Each system is tied to one Twitter keyword and bursts each time a matching tweet comes in. Right now the graphics are very simple, but as I get more comfortable with Unity, it would be interesting to work more with the graphics and find a way to make them more representational of the input.
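The routing on the Unity side amounts to a lookup from keyword to particle system. Sketched here in JavaScript for consistency with the server code (the real version lives in a Unity C# script, and the term-to-index mapping is my own assumption):

```javascript
// Hypothetical routing: each tracked term owns one particle system,
// identified here by an index. An incoming tweet event triggers a
// burst on every system whose term the tweet matched.
const SYSTEM_FOR_TERM = {
  'migrant crisis': 0,
  'presidential election': 1,
  'poverty': 2,
};

function systemsToBurst(event) {
  return event.terms
    .filter(term => term in SYSTEM_FOR_TERM)
    .map(term => SYSTEM_FOR_TERM[term]);
}

console.log(systemsToBurst({ terms: ['poverty', 'migrant crisis'] }));
// → [ 2, 0 ]
```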

This video shows the final Unity program running next to the console.