Categories

Realtime 3D and theatre

Real-time 3D-CGI technology in the Performing Arts came to the attention of a wider audience as early as 1989 through the work of choreographer Merce Cunningham, who used an application called Lifeforms from Credo Software Products. A wide range of software and hardware applications have since been developed, often by the artists themselves. Applying interactive real-time 3D-CGI technology in the Performing Arts requires several tools: software for the creation of the virtual space and objects, hardware to capture analogue input such as motion, and engines for the real-time calculation of the output, which is mostly projected imagery.

Most artists are not able to create their virtual worlds and objects by programming in C++ and working directly with OpenGL code. Software like 3D Studio Max and MotionBuilder provides a user-friendly GUI for content creation and offers ways to connect to motion capture data. Many different motion capture systems are now available, ranging from inertial, mechanical and magnetic systems to optical ones with passive or active markers. A software package called Virtools uses a node-based GUI to ‘program’ behaviour for real-time interaction with the 3D CG environment, an approach sketched below.
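To make the node-based idea concrete, here is a toy dataflow graph in plain Python: each node is a function and the ‘wires’ pipe one node’s output into the next, which is roughly what tools like Virtools let artists wire up visually. All node names are invented for illustration; this is a sketch of the concept, not of any real tool’s API.

```python
# Toy dataflow graph in the spirit of node-based tools such as Virtools:
# each node is a function and edges feed outputs into inputs.
# Entirely illustrative; real tools provide this wiring through a GUI.

def sensor_node():
    """Stand-in for a motion capture input (a fixed value here)."""
    return 0.5

def scale_node(value, factor=2.0):
    """Scale the incoming signal."""
    return value * factor

def rotate_object_node(angle):
    """Stand-in for an engine call that rotates a 3D object."""
    print(f"rotating object by {angle:.2f} radians")

# The "patch": a chain of nodes evaluated once per frame.
graph = [sensor_node, scale_node, rotate_object_node]

def tick(nodes):
    value = None
    for node in nodes:
        value = node() if value is None else node(value)

tick(graph)  # one frame of the behaviour graph
```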

For many artists and educational institutes these technologies are simply too expensive. The software is also mostly constrained by copyright and commercial interests, making it almost impossible to change or extend the hardware and software systems. Open source 3D engines like OGRE and 3D content creation software like Blender are becoming serious competitors to commercial products. Constant improvements in markerless optical motion capture algorithms are making costly capture hardware increasingly obsolete; a minimal example follows below. Artists and educational institutes can thus invest in more complex systems, creating a wider range of possibilities and an increasing sense of creative and physical freedom.
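As a minimal sketch of markerless optical capture with nothing but a webcam, the following Python snippet uses the open source OpenCV library to detect motion by frame differencing and report a crude performer position. It is a toy under stated assumptions, not one of the sophisticated algorithms referred to above; the threshold value in particular is arbitrary.

```python
# Minimal markerless motion capture sketch using OpenCV and a webcam.
# Frame differencing only: a stand-in for the far more sophisticated
# markerless algorithms referred to in the text.
import cv2

cap = cv2.VideoCapture(0)          # default webcam
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Pixels that changed between frames are treated as "motion".
    diff = cv2.absdiff(gray, prev)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)  # 25: arbitrary
    # The centroid of the motion mask gives a crude performer position.
    m = cv2.moments(mask)
    if m["m00"] > 0:
        x, y = m["m10"] / m["m00"], m["m01"] / m["m00"]
        print(f"motion centroid: ({x:.0f}, {y:.0f})")
    prev = gray
    cv2.imshow("motion", mask)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```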

The Internet has enabled the technology to be globally connected and shared. Real-time 3D CGI was introduced to the Web as early as the mid-nineties through a standard called VRML. Online multi-user real-time 3D worlds like Alpha World became platforms for many cultural exchanges, including the Performing Arts and Theatre. A well-known, and slightly hyped, recent example is of course Second Life, where, among the more banal activities, artists are creating experimental spaces, objects and performances.

Alongside the Internet and other media, the growing importance of ‘user generated content’ has also had its effect on the game industry. Recent real-time 3D games ship with a so-called level editor and multi-user capabilities. Users can now create their own interactive virtual environments as new levels (maps) or complete modifications (mods) and share them in real time with others. A powerful and versatile example is the Source game engine from Valve Software, well known from its Half-Life game series, which contains the Havok physics engine, a powerful rendering engine, and elaborate editors for level editing and character building. 3D game engines have also been used since the mid-nineties to make animated films, a practice known as Machinima. The potential of 3D game engines for the Performing Arts, however, lies in their capability to respond to live input in real time, enabling the performer to interact with the virtual world and its objects, as the sketch below illustrates.
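What ‘responding to live input’ amounts to technically is getting performer data into the engine’s update loop. The sketch below shows one hypothetical way to do this: streaming positions over a plain UDP socket to a listener that would have to exist on the engine side (for example a script or plugin). The address, port, message format and update rate are all assumptions made for illustration.

```python
# Hypothetical bridge: stream live performer positions to a game engine
# over UDP. The engine side (a plugin or script reading this socket)
# is assumed, as are the address, port and message layout.
import socket
import time

ENGINE_ADDR = ("127.0.0.1", 9000)   # assumed engine-side listener

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_position(x: float, y: float, z: float) -> None:
    # One plain-text packet per update; a real system might use OSC.
    sock.sendto(f"performer {x:.3f} {y:.3f} {z:.3f}".encode(), ENGINE_ADDR)

# Stand-in for a motion capture feed: sweep the performer across the stage.
for step in range(100):
    send_position(step * 0.1, 0.0, 1.7)
    time.sleep(1 / 30)              # roughly 30 updates per second
```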

To interact with the virtual world there must be a technological bridge between the physical space of the performer and the virtual space inside the computer. These bridges consist of hardware and software components. As mentioned before, optical markerless motion capture technology is becoming increasingly powerful and accessible through low-cost webcam hardware and open source software like EyesWeb. MaxMSP, the commercial counterpart of the open source software Pure Data, was developed especially to connect many input and output sources, including audio and video streams. A similar interface approach designed specifically for the Performing Arts is Isadora, developed by Troika Ranch, who use and develop the software for their own performances and installations. Interface boards like the Arduino or I-PAC make it easier to build a bridge between hardware and software, and low-cost input devices like Nintendo’s Wiimote open up a huge range of interaction possibilities for the Performing Arts and educational institutes. A simple example of such a bridge follows below.
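As a concrete sketch of such a bridge, the following Python snippet reads sensor values from an Arduino over a serial port and forwards them as OSC messages to, say, Pure Data, MaxMSP or Isadora. It assumes the third-party pyserial and python-osc libraries; the port name, baud rate and OSC address are placeholders to adapt to a specific setup.

```python
# Sketch of a physical-to-virtual bridge: Arduino sensor values in,
# OSC messages out (to e.g. Pure Data, MaxMSP or Isadora).
# Assumes the pyserial and python-osc libraries; the port name and the
# OSC address "/performer/sensor" are placeholders.
import serial
from pythonosc.udp_client import SimpleUDPClient

arduino = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)  # adjust port
osc = SimpleUDPClient("127.0.0.1", 8000)                  # OSC receiver

while True:
    line = arduino.readline().decode(errors="ignore").strip()
    if not line:
        continue
    try:
        value = int(line)            # Arduino prints one reading per line
    except ValueError:
        continue
    # Scale the 10-bit analogue reading to 0.0-1.0 and forward it.
    osc.send_message("/performer/sensor", value / 1023.0)
```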

To conclude this brief and general overview of existing technologies related to theatrical 3D scene modelling and authoring tools, one major problem becomes apparent: how do we use and connect all these technologies in such a way that the system can actively participate in and contribute to the rehearsal phase of a theatre-making process? Although common protocols like MIDI and OSC can be used to create bridges between all components, a more general understanding and a standardised architectural model are needed for a modular and flexible system that can dynamically connect all of the aforementioned technologies and possibilities, especially for the Performing Arts. The sketch below gives a first impression of what such a central connecting hub could look like.
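To give a first impression of such a connecting architecture, the sketch below uses the python-osc library as a tiny central hub: OSC messages arriving from any capture component are matched on their address pattern and fanned out to all subscribed output engines. The addresses and port numbers are invented for illustration; a real standardised model would define them.

```python
# Sketch of a minimal modular hub: any component (mocap, Wiimote,
# Arduino) sends OSC in; the hub fans messages out to subscribed
# engines. Addresses and ports are invented for illustration.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

# Output engines subscribed to performer data (assumed ports).
engines = [
    SimpleUDPClient("127.0.0.1", 9001),  # e.g. a 3D engine
    SimpleUDPClient("127.0.0.1", 9002),  # e.g. a sound patch
]

def forward(address, *args):
    # Relay every /performer/* message unchanged to all outputs.
    for engine in engines:
        engine.send_message(address, list(args))

dispatcher = Dispatcher()
dispatcher.map("/performer/*", forward)   # wildcard address matching

server = BlockingOSCUDPServer(("127.0.0.1", 8000), dispatcher)
server.serve_forever()                    # run until interrupted
```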