Stereoscopic 3D Virtual Sets and Stages:

One of the ongoing projects at the 3D Lab, twofour54, is R&D into stereoscopic 3D virtual sets and production workflows.

While virtual sets have been around for some time now, true stereoscopic 3D workflows pose a few interesting challenges. Systems such as WASP 3D have announced stereoscopic 3D support, and we are sure that other high-end systems such as ORAD and VizRT support stereo workflows as well, but what needs to be investigated is whether they can truly key live green screen talent and composite it at the proper Z-depth location in a stereoscopic scene.

Challenges faced in Stereoscopic 3D Virtual Sets today:

  • None of the big players in virtual sets, with the exception of ORAD, have specifically mentioned the capability to composite live stereoscopic actors/talent.
  • How do existing realtime camera tracking systems perform in scenes where the actor’s feet make contact with the ground? (It is not as easy to cheat depth as it is in 2D.)
  • Importing a live stereoscopic green screen camera feed with associated alpha channels for the left and right eyes, for realtime compositing in Z-space (akin to NewTek’s LiveMatte). A rough sketch of the parallax geometry behind “proper Z-depth” follows this list.
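
To make that last point concrete, here is a minimal sketch in plain Python (illustrative only, not any vendor’s API). It assumes a parallel camera rig converged by horizontal image shift, with interaxial separation b, focal length f in pixels, and convergence distance C. A keyed talent layer only “sits” at a given depth if its rendered parallax matches the parallax already baked into the captured left/right feeds.

    # Minimal stereo parallax sketch -- illustrative only, not a vendor API.
    # Assumes a parallel rig converged via horizontal image shift.

    def screen_parallax(z, interaxial_b, focal_px, convergence_c):
        """Horizontal parallax in pixels of a point at depth z (metres).

        Zero at the convergence distance; positive values read as behind
        the screen plane, negative values as in front of it.
        """
        return focal_px * interaxial_b * (1.0 / convergence_c - 1.0 / z)

    # Example: 65 mm interaxial, ~1800 px focal length, screen plane at 3 m.
    for z in (2.0, 3.0, 5.0, 10.0):
        print(z, screen_parallax(z, 0.065, 1800.0, 3.0))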

Intermediate Solutions at the 3D Lab, twofour54:

Why the big deal about live chroma-keyed actors at the proper Z-depth? After all, a simple solution would be to use a hardware (or software) green screen keyer and feed the virtual set as the background, with the keyed-out live talent as the foreground. However, this approach is too simplistic: the talent always sits on top of everything, and there is no way for them to have “simulated” interaction with CGI content.

Also, placing live talent at the proper Z-depth, and CGI elements or props “in front” of the live talent with proper occlusion, makes for more sophisticated and “real” looking virtual sets.
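
As a rough illustration (assumed numpy arrays, not any broadcast pipeline), the per-pixel logic is simply a depth test folded into a standard “over” composite: wherever a CGI prop is nearer than the talent billboard, the prop wins.

    import numpy as np

    # Depth-aware 'over' composite -- a sketch, not a production keyer chain.
    # cg_rgb / cg_depth: the virtual set render and its depth buffer;
    # talent_rgb / talent_alpha: the keyed live feed; the talent is treated
    # as a flat billboard at a single depth value.

    def composite_with_occlusion(cg_rgb, cg_depth, talent_rgb, talent_alpha,
                                 talent_depth):
        visible = talent_depth < cg_depth          # billboard nearer than CGI?
        a = (talent_alpha * visible)[..., None]    # zero alpha where props occlude
        return talent_rgb * a + cg_rgb * (1.0 - a)

The naive “keyer as foreground” setup is the degenerate case where the talent’s depth is zero everywhere, so no prop can ever pass in front.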

At the 3D Lab, we are investigating budget solutions such as the excellent iClone 4 from Reallusion.

(see the clip in 3D on YouTube here)

As can be seen in the video above, with the live talent properly placed (as keyed footage mapped onto a billboard 3D object), we are at liberty to do the following:

1) Position the talent easily in Z-space (a short sketch of the scale compensation this involves follows this list).

2) Place additional 3D props in front of the live talent with proper occlusion control.

3) If we so desire, “simulate” interaction between live talent and the props in the Virtual Set.
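
Point 1 hides a detail worth spelling out: under a pinhole-camera assumption, apparent size falls off as 1/Z, so pushing the billboard back without rescaling would shrink the talent on screen. A tiny sketch (plain Python, not iClone’s interface) of the compensation:

    # Keep the talent's on-screen size constant while moving the billboard in Z.
    # Pinhole assumption: projected size is proportional to 1/Z, so the quad
    # must grow by z_new / z_reference when pushed from z_reference to z_new.

    def billboard_scale(z_new, z_reference, base_scale=1.0):
        return base_scale * (z_new / z_reference)

    # Footage framed as if the talent stood 3 m from camera, billboard at 5 m:
    print(billboard_scale(5.0, 3.0))   # ~1.67x larger quad, same screen size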

Point 3 above, simulated interaction, needs to be carefully choreographed if it is to look convincing: for instance, having the live talent reach out and “turn on” a TV.

Fortunately, iClone 4 is for the most part a realtime WYSIWYG environment, running a realtime render engine based on DirectX. This allows for true realtime lighting, texturing, and animation of the virtual set props and camera.

This realtime capability lets us take a PC running the iClone software directly onto the shooting floor and produce a green screen composite of the live talent overlaid on the virtual set. The composited feed can then be displayed on a studio monitor, giving the live talent visual feedback on their “simulated” interaction. This allows for proper eye-line and gesture control.

At this point, because we cannot yet do “live matting” with proper Z-depth directly in the software, we use it only as a guide. The talent can see when they walk into a prop, and thereby learn the geography of the scene and how to interact with it.

Once the scene is recorded, we key the footage in compositing software such as NUKE or After Effects, then map the resulting alpha matte and keyed footage onto a “billboard” object in iClone and position it in true stereoscopic 3D space.
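
For readers unfamiliar with what that alpha matte represents, here is a deliberately toy green-difference key in numpy (the real key is of course graded in NUKE or After Effects, as above); frame is assumed to be a float RGB array in [0, 1]:

    import numpy as np

    # Toy green-difference key -- illustration only; real keying is done in
    # NUKE / After Effects as described above.

    def toy_green_key(frame, strength=2.0):
        r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
        spill = g - np.maximum(r, b)                # how "green" each pixel is
        alpha = np.clip(1.0 - strength * spill, 0.0, 1.0)
        despilled = frame.copy()
        despilled[..., 1] = np.minimum(g, np.maximum(r, b))  # crude spill kill
        return despilled * alpha[..., None], alpha  # premultiplied RGB + matte

The returned matte is what gets mapped, together with the keyed RGB, onto the billboard object.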

The 3D Lab, twofour54… seeding ideas in S3D:

Stereoscopic 3D workflows are new even to the most established heavyweights in the media production industry, hardware and software manufacturers alike.

Today, NewTek’s TriCaster has the capability to become a stereoscopic 3D production OB truck on a desk. Yet, so far, it isn’t one.

WASP 3D has stereoscopic 3D support, but we have not yet investigated whether it can composite live talent with an alpha matte in realtime and at the proper Z-depth.

VizRT also has stereoscopic 3D support, yet an interesting bit in this article says, quote: “The installation was done without the special stereo cameras normally needed for 3D demonstrations. Vizrt’s stereoscopic rendering ensured that the on-camera talent remained properly situated within the 3D scene”

Could this mean that the live talent was simply a 2D realtime feed, mapped as a layer in 3D space?
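
The distinction is easy to state in code. In this hypothetical sketch (invented names, no real engine’s API), a “2D feed in 3D space” binds the same plate for both eye passes: the talent then sits at the billboard’s depth but is internally flat (“cardboard”). A true stereo key binds the left-eye plate for the left render pass and the right-eye plate for the right pass, preserving depth within the talent’s own body.

    from dataclasses import dataclass

    # Hypothetical billboard -- illustrative names, not any engine's API.
    @dataclass
    class TalentBillboard:
        mono_plate: object = None    # single keyed 2D feed
        left_plate: object = None    # keyed left-eye feed
        right_plate: object = None   # keyed right-eye feed

        def plate_for_eye(self, eye):
            if self.left_plate is not None and self.right_plate is not None:
                return self.left_plate if eye == "left" else self.right_plate
            return self.mono_plate   # 2D fallback: both eyes see the same image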

It is worth investigating a cost-effective solution for virtual sets in stereoscopic 3D. Software such as iClone 4 and the fantastic 3D authoring platform Unity3D has the power to outperform far more expensive systems. Combining such powerful software platforms with intuitive authoring interfaces and instant realtime feedback is where creativity can meet technology.

At the 3D Lab, the mission is to bring creative people, programmers, and hardware and software developers together in a collaborative environment to advance the art of stereoscopic 3D production.