I am exploring how the body can be better represented virtually in real time. I'm interested in live events and in working with the physical body. Current live virtual representations focus on the face, hands and voice, leaving the user mostly disembodied. Our bodies are an integral part of our experiences and of being present in the moment. I want to explore how we can create a more truthful representation of ourselves in virtual space, and what might be gained by doing so.
My current plan is to continue developing Kinectron, an application and API that streams Kinect V2 data—RGB, depth, infrared and skeleton data—over a peer-to-peer (WebRTC) connection. I am currently weighing two paths. The first is to create a 3D hologram using an array of two or more Kinects. The second is to use Kinect skeleton data to puppeteer avatars in three.js. I'll decide my direction based on the research I do in the upcoming weeks.
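As a rough illustration of the avatar-puppeteering path, the sketch below maps one frame of Kinect skeleton joints into a scene-space lookup table that a three.js rig could consume. This is a minimal sketch, not Kinectron's actual pipeline: the joint field names (`name`, `cameraX`, `cameraY`, `cameraZ`, in meters) are assumptions about the skeleton payload, and the mirroring/axis conventions are one plausible choice, not the only one.

```javascript
// Hypothetical sketch of Kinect-skeleton-to-avatar mapping.
// Assumed joint shape: { name, cameraX, cameraY, cameraZ } in meters,
// with Kinect camera space having +Z pointing away from the sensor.

// Convert a single joint from Kinect camera space to a right-handed
// scene space like three.js uses: mirror X so the avatar moves like a
// reflection of the user, and negate Z so +Z points toward the viewer.
function jointToScene(joint, scale = 1) {
  return {
    x: -joint.cameraX * scale, // mirror left/right
    y: joint.cameraY * scale,
    z: -joint.cameraZ * scale, // flip depth axis for the scene camera
  };
}

// Map a whole skeleton frame (an array of named joints) to a table
// keyed by joint name, e.g. to drive the positions of avatar bones.
function skeletonToScene(joints, scale = 1) {
  const out = {};
  for (const joint of joints) {
    out[joint.name] = jointToScene(joint, scale);
  }
  return out;
}
```

In a real setup, each incoming skeleton frame from the stream would be run through a mapping like this on every animation tick, and the resulting positions (or rotations derived from pairs of joints) applied to the avatar's bones.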