Team Brown Update 11/9

This week I had office hours with Todd to talk about the technical logistics of the project. Jarrett and I also met with Kyla Ernst-Alper, who is interested in helping us develop the project.

Notes from Meeting with Todd

Todd gave me some great references, including:

  • a project that uses mesh emitters
  • a project with amazing realtime projection mapping
He showed me an Unreal project that adds ribbon particles to different animation sequences.

I believe that he will demo this in class tomorrow. Todd suggested that we get Kyla moving with this type of particle as soon as possible to begin to get a feel for things. I’m hoping to do this on Sunday at 3pm.

Todd and I also discussed ways that we can “archive” previous sessions in order to keep the rendering load manageable. We looked at the possibility of using Scene Capture 2D, which would let us record a real-time 2D view of the 3D scene and turn it into a video or image texture that could be placed on an object, manipulated, and/or archived. We could simply put these captures on cubes, for example. Or we could put them on destructible meshes and use a radial force (?) to break them up and push them away.
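For reference, a very rough sketch of how that archiving idea might wire together in Unreal C++. The component classes (USceneCaptureComponent2D, URadialForceComponent, UTextureRenderTarget2D) are standard engine API, but the actor itself and how we connect the pieces is my own untested guess at our use case, not something Todd showed me:

```cpp
// ArchiveActor.h -- hypothetical sketch of an "archived session" actor.
// Assumes a standard UE4 C++ project; this will not run outside the engine.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/SceneCaptureComponent2D.h"
#include "Components/StaticMeshComponent.h"
#include "PhysicsEngine/RadialForceComponent.h"
#include "Engine/TextureRenderTarget2D.h"
#include "ArchiveActor.generated.h"

UCLASS()
class AArchiveActor : public AActor
{
    GENERATED_BODY()

public:
    AArchiveActor()
    {
        // Captures a 2D view of the 3D scene into a render target each frame.
        Capture = CreateDefaultSubobject<USceneCaptureComponent2D>(TEXT("Capture"));
        // The cube (or destructible mesh) that displays the archived view,
        // via a material that samples the render target.
        Cube = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Cube"));
        // Radial force to break up and push away an archived mesh.
        RadialForce = CreateDefaultSubobject<URadialForceComponent>(TEXT("RadialForce"));
    }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        // RenderTarget would be created/assigned in the editor; the capture
        // component writes the live scene view into it.
        Capture->TextureTarget = RenderTarget;
    }

    // Called when we want to "shatter" an archived session.
    void Shatter() { RadialForce->FireImpulse(); }

    UPROPERTY(EditAnywhere)
    UTextureRenderTarget2D* RenderTarget = nullptr;

    UPROPERTY(VisibleAnywhere)
    USceneCaptureComponent2D* Capture;

    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* Cube;

    UPROPERTY(VisibleAnywhere)
    URadialForceComponent* RadialForce;
};
```

In practice we might do all of this in Blueprints instead; the point is just that the capture, the display mesh, and the radial force can live on one actor per archived session.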

He recommended the following places to get going with particles:

  • Content Examples — Effects — particles
  • Landscape Mountains — bird mesh emitters
  • Shooter Game — car mesh emitters

Todd also let me know that we only have one screen output to a projector in our current scheme; it will show the scene from inside the Vive. We will likely want to use the open source software Mediate (?) to create the multiplayer game.

Notes from Meeting with Kyla

Kyla, Jarrett and I looked at Synchronous Objects as a reference for movement. Based on this idea, Kyla had a lot of interesting thoughts on the way that two different performers could work with each other in virtual space with rule-based improv.

She also brought up the physical limitations of the person in the Vive. The headset is wired, which will limit mobility, and people who aren’t used to being in a headset may not be comfortable moving around much in physical space. These are really important points that we’ll have to test.

She also let us know that she did mocap once before, about five years ago, and the system had some problems reading her extreme joint flexibility. I’m looking forward to seeing how the OptiTrack handles her body.

Cover image: Screenshot from this week’s inspiration, Forms by Quayola.