My final project for Sensory Storytelling uses a real-time feed of Twitter search terms to drive a particle system in Unity.
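As a rough sketch of how live search-term data could drive a particle system, the function below maps per-term mention counts to emission rates. The names (`termsToParticleParams`, `emissionRate`) and the proportional mapping are my own illustrative assumptions, not details from the project:

```javascript
// Hypothetical mapping from live search-term counts to particle parameters.
// termCounts: { term: mentionCount }, maxRate: cap on Unity's emission rate.
function termsToParticleParams(termCounts, maxRate) {
  const total = Object.values(termCounts).reduce((a, b) => a + b, 0);
  return Object.entries(termCounts).map(([term, count]) => ({
    term,
    // More mentions -> proportionally higher emission rate, capped at maxRate.
    emissionRate: total === 0 ? 0 : Math.min(maxRate, (count / total) * maxRate),
  }));
}
```

On the Unity side, each `emissionRate` value could then be applied to a `ParticleSystem`'s emission module for that term's emitter.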
Recently I used the Twitter streaming API for the first time. I was really impressed watching the flood of live data come in, and I’ve been thinking of different applications for it. I also recently learned to hook Socket.io up to Unity, so I’m thinking of having Twitter play a game or create something in Unity.
This week in Sensory Driven Storytelling we are exploring connected spaces by taking input from non-standard sources (Kinect, Leap, VR, Arduino, Node.js, web browser, Ableton MIDI, etc.). I chose to use Node and Socket.io to send data into Unity.
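A minimal sketch of the Node side of such a pipeline: shaping an incoming piece of data into a JSON message that a Socket.io server could emit to a Unity client. The event name `'tweet'` and the field names are assumptions for illustration, not from the post:

```javascript
// Build the JSON payload a Node server might send to Unity over Socket.io.
// Field names ('term', 'text', 'ts') are illustrative assumptions.
function makeTweetMessage(term, text) {
  return JSON.stringify({ term, text, ts: Date.now() });
}

// In an actual server (assuming the socket.io npm package):
//   io.on('connection', (socket) => {
//     socket.emit('tweet', makeTweetMessage('unity', 'hello world'));
//   });
```

Unity would then parse the JSON string on receipt and use the fields to update the scene.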
This week in Sensory-driven Storytelling we are applying and exploring what we’ve learned in Unity. The assignment is to submit an experiment that uses traditional input (mouse, keyboard) to affect the narrative of your scene, using any of the techniques we’ve learned. Consider how this input might be mapped differently in the future.
One of my favorite unexpected places to find a story is Yogi tea labels.