Realtime Stereoscopic 3D Green Screen Keying on Location

Realtime Chroma Keying in Stereoscopic 3D:

Here’s the dilemma: productions are now going stereoscopic 3D (the word “stereoscopic” should suffice, but we add “3D” to prevent ambiguity), and there is a need to preview a live chroma or green screen “key” in 3D, because compositing and match moving in S3D is a whole different beast, more challenging than the same work in 2D.

Simple “foot contact” of actors with the ground gets even more complicated to composite in S3D than in 2D. A bad S3D composite shows people floating off the surface of the ground. In 2D, without stereoscopic depth cues, such problems are rare and easy to cheat.

Blurring the Line Between Post Production and Production

The practice of compositing used to be the exclusive domain of “post production”. In a typical movie, compositing is done in post, and it still is today. But high-performance computing, coupled with falling hardware prices, is making it very feasible to bring desktop-replacement laptops on location for instant visual feedback.

Industry-standard 3D software such as Maya, compositing software such as Nuke and After Effects, and realtime NLE software that can assemble and cut “dailies” in full stereoscopic 4K as soon as they are shot can now sit on a portable laptop and give the director and crew instant visual feedback.

The Decade of Realtime 3D Moviemaking:

We at RealVision believe this will be the decade of realtime 3D moviemaking: 3D cameras becoming more self-contained, doing away with cumbersome beam-splitter 3D rigs, and lens metadata and camera orientation info driving CG cameras for rendering in realtime. The holy grail of hybrid filmmaking!

But before we get ahead of ourselves, let’s start with right now, 2011, and what we could ask existing software developers to include in upcoming versions of their products…

Where’s the Live Green Screen Keyer in NUKE?

Nuke is the industry-leading compositing software for movies. On a recent shoot I explored the option of feeding Nuke, running on a suitably high-powered laptop or desktop, a live stereoscopic 3D camera signal. The 3D camera would be aimed at a green screen stage set; that would allow the DP to light the green screen stage, and we could be sure of the quality of the key IN stereo 3D, and well BEFORE it went to the post production studio.

Well… that was the idea. But there is no live camera input node in Nuke that I know of. Granted, it takes huge computing power to “render” final film-quality output, so compositing software is not realtime at this time. But with the CPU and GPU compute power available today, I’d like to request that the very talented programmers of this software look into such an option in upcoming releases. Of course, a third-party plugin node could do the job if developers look into it.

It would be a much-needed and revenue-generating plugin if implemented. (If anyone knows of an existing method of getting a realtime live feed into Nuke, I’d be happy to learn about it.)
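
To make the request concrete: a hypothetical live input node boils down to a capture loop that hands each frame to the downstream node graph. Here is a minimal sketch in Python with OpenCV, assuming a camera enumerated as device 0; nothing in it is Nuke API, which, as noted, has no such node as of this writing.

```python
# A minimal sketch of what a live input "node" does: pull frames from a
# capture device and pass each one through a processing graph.
# Assumes OpenCV (cv2) and a camera on device index 0.
import cv2

def process(frame):
    # Stand-in for the downstream node graph (keyer, comp, viewer)
    return frame

cap = cv2.VideoCapture(0)  # device index is an assumption
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("live node output", process(frame))
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```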

Where’s the Live Green Screen Keyer in After Effects?

The next choice would be Adobe’s excellent CS5 suite. However, neither Adobe Premiere nor After Effects has any facility to bring in a live camera feed either!

All is not lost, though. Here’s a solution that RealVision came across:

  1. It’s best to work with a desktop for ease of use in this scenario, as there are a few HD-SDI or HDMI dual grabber cards available for getting the two camera feeds into the computer; the Blackmagic Intensity and AJA Kona are two options.
  2. In keeping with my preference for shrinking technology, I still prefer to use a high-end laptop with the Blackmagic Design Intensity Shuttle over USB 3.0 as a video grabber.
  3. The all-important ingredient in the mix: the LensFeed plugin for After Effects.
  4. Rent a 3D multiplexing unit such as Inition’s StereoBrain, which will take two HD-SDI inputs and mux them side by side to give a single HDMI 3D output.
  5. Choice of laptop: I’ve heard that the Blackmagic Intensity Shuttle works well with the HP Envy 15. The new Sony F series 3D-enabled VAIO also has USB 3.0 and a slew of other must-haves, such as an Nvidia CUDA-enabled GPU that lets Adobe CS5’s amazing Mercury Playback Engine take advantage of GPU acceleration.
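
Whichever grabber and laptop you choose, it is worth verifying that the driver actually delivers frames to third-party software before buying (see also the update at the end of this post). A quick sanity check, sketched in Python with OpenCV; the device indices are assumptions:

```python
# Probe the first few capture device indices and report whether each
# one actually delivers frames. A driver can appear in an application's
# device list yet still produce only black frames.
import cv2

for index in range(4):
    cap = cv2.VideoCapture(index)
    ok, frame = cap.read()
    if ok:
        h, w = frame.shape[:2]
        print(f"device {index}: delivering {w}x{h} frames")
    else:
        print(f"device {index}: no frames (listed but not previewing?)")
    cap.release()
```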

Realtime Stereoscopic 3D Green Screen Keying Methodology in After Effects:

  1. Download the trial of (or buy) the LensFeed plugin for After Effects.
  2. Connect the Intensity Shuttle grabber to a suitably qualified laptop. (We are not grabbing actual video, merely previewing the signal, so it may work with ExpressCard-to-USB 3.0 adapters, but this needs to be tried on a laptop before buying.)
  3. Set up a 1920×1080 After Effects composition.
  4. Create two After Effects layers and drop in the stereoscopic background “assets”. Squeeze them side by side in the composition window to fit (960×1080 each). In the example I was working on, this was a CG-rendered “catwalk” for a 3D fashion show; the models were being filmed live on green screen and would be keyed over the CG stereo 3D catwalk.
  5. Create a solid layer and drop the LensFeed plugin on it. This should show the live 3D camera feed (muxed side by side if using an external mux unit), or:
  6. Create a second solid layer and drop another instance of the LensFeed plugin on it. (There are five instances of this plugin, meant for capture cards that have more than one input.)
  7. If using step 6 above, arrange the two live camera feeds side by side as in step 4.
  8. Drop an instance of the Keylight effect on the solid layer(s) and key the green out. (A rough software equivalent of this whole pipeline is sketched below.)
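
For readers without After Effects handy, here is a rough software equivalent of steps 3 to 8, sketched in Python with OpenCV. The capture device index, the side-by-side-muxed 1920×1080 feed, and the backplate filenames “left.png”/“right.png” are all assumptions, and the crude HSV threshold is only a stand-in for a proper keyer such as Keylight:

```python
# Rough software equivalent of the After Effects setup above: squeeze a
# stereo backplate side by side (steps 3-4), pull in a side-by-side
# 3D feed (step 5), key the green out and comp (step 8).
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # side-by-side-muxed feed; index is assumed

# Stereo backplates, 960x1080 per eye in a 1920x1080 comp
left = cv2.resize(cv2.imread("left.png"), (960, 1080))
right = cv2.resize(cv2.imread("right.png"), (960, 1080))
backplate = np.hstack([left, right])

while True:
    ok, feed = cap.read()
    if not ok:
        break
    feed = cv2.resize(feed, (1920, 1080))
    # Crude green screen matte; Keylight/Primatte do far more
    hsv = cv2.cvtColor(feed, cv2.COLOR_BGR2HSV)
    screen = cv2.inRange(hsv, (35, 80, 80), (85, 255, 255))
    # Where the matte is green, show the backplate; elsewhere the feed
    comp = np.where(screen[..., None] > 0, backplate, feed)
    cv2.imshow("stereo comp preview (side by side)", comp)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```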

This allows the DP to adjust lighting for the green screen stage, and gives the director (and talent), the “video village”, or the client a live preview of the final composite. Of course we would be “eyeballing” everything at this point, but it would aid in taking measurements for later integration with CG or compositing cameras.

Realtime Stereo 3D Keying, the way VJs have been doing it…

It helps to have been doing live stereoscopic 3D for over a decade. I used to VJ at live events and toured with Bacardi and some big-name music acts. Stereoscopic 3D events encouraged out-of-the-box thinking: mixing live 3D camera feeds with pre-“canned” S3D visuals was common even in 2001.

There is a very powerful node-based compositor, Visual Jockey, optimized for realtime visuals, that can still be used today for our purpose. The software itself has not been updated to take advantage of GPU processing, but its ease of use and powerful features make up for that. Best of all, it’s FREE.

The screenshots above, with the webcam, show the interface. It can easily be set up for stereoscopic 3D keying (the screenshot above shows a lo-fi version of the setup, for illustrative purposes, using a webcam during the rehearsal for the 3D fashion show project). Stereoscopic webcams such as the Minoru, and/or stereoscopic DirectShow drivers, can be used with a camera of any resolution, limited only by the quality of the video capture board on the PC or laptop.
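
For anyone without a hardware mux unit, the side-by-side multiplexing itself can be approximated in software. A small sketch, again in Python with OpenCV, assuming two webcams (for example the two eyes of a Minoru) on device indices 0 and 1:

```python
# Software side-by-side mux: grab two cameras, squeeze each eye to half
# width, and join them into one stereo frame, much as a mux unit would.
import cv2
import numpy as np

left_cam = cv2.VideoCapture(0)   # device indices are assumptions
right_cam = cv2.VideoCapture(1)

while True:
    ok_l, left = left_cam.read()
    ok_r, right = right_cam.read()
    if not (ok_l and ok_r):
        break
    h, w = left.shape[:2]
    # Half-width per eye keeps the pair inside a normal frame
    sbs = np.hstack([cv2.resize(left, (w // 2, h)),
                     cv2.resize(right, (w // 2, h))])
    cv2.imshow("side-by-side stereo", sbs)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

left_cam.release()
right_cam.release()
cv2.destroyAllWindows()
```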

3D Vision Mixers:

Finally, for the sake of completeness, it is worth mentioning that there is always the option of using an (HD) vision mixer with a chroma keyer, such as the cost-effective Panasonic AG-HMX100. I have not investigated this particular mixer first hand, so I do not know whether it includes a keyer, but chances are it does.

The idea behind using a keying facility directly within the software, versus an external green screen keyer, is to minimize cost and avoid any drop in image quality.

Food for Thought… Idea Seeding and RealVision 3D Labs

Software such as Maya can also be taken to movie sets and locations to preview or “eyeball” final composites.

If there were an option for live video as a background (currently all 3D software offers AVI files, image sequences, or still images as background references), basic matchmoving setups could easily be done. This would of course only work for locked-off live cameras, but we are looking at scene setup only in this article, not full-fledged “Simulcam” setups.

In the short term, a fiducial marker tracking node in Nuke would lend itself well to realtime matchmoving too. Augmented reality moviemaking, anyone?
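
To illustrate what such a node could do, here is a sketch of fiducial tracking using OpenCV’s ArUco module rather than anything in Nuke; it assumes opencv-contrib-python 4.7 or later and a camera on device 0. The per-frame marker corners are exactly what a locked-off matchmove or AR composite would consume:

```python
# Detect ArUco fiducial markers in a live feed and overlay the tracks.
# With a calibrated camera and known marker size, solvePnP on these
# corners would recover a camera pose to drive a CG camera.
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary,
                                   cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)  # device index is an assumption
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is not None:
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow("fiducial tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```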

Hopefully we have seeded some ideas for software plugin developers, 3D hardware manufacturers, and 3D moviemakers to work on. RealVision’s 3D Labs, which run as permanent and touring installations, encourage the advancement of stereoscopic 3D technology and the spread of 3D moviemaking best practices. We have Labs running in Asia and are keen to look at other regions.

Let’s end with a quick video of the rehearsal day of the 3D Fashion shoot:

(Watch with annotations on YouTube, here.)

** Update: Sept 1st, 2011 **

I tried the Blackmagic Intensity Shuttle with the new Sony VAIO F series 3D laptop. Not much success, unfortunately. I can manage to get a full HD feed using Blackmagic’s own video capture app, but I can’t get a feed to show up in most other software that supports live capture/preview (for example, Adobe OnLocation won’t show it, the AMCap utility won’t, and neither will VLC or other DirectShow-capable applications; the driver shows up in the drop-down list, but I only get a black screen no matter what settings I choose).

Maybe people will have better luck with other laptops and USB 3.0.

** Update: Oct 6th, 2011 **

Blackmagic Design support replied with the following: “At the moment Adobe On-Location is not supported by DeckLink devices although it may be in a future release. I have brought this to the attention of our QA department and the product managers.

Regarding the setup described on your webpage ( http://realvision.ae/blog/2011/05/realtime-stereoscopic-3d-green-screen-keying-on-location/ ) please note that USB 3.0 Express Cards are not supported and will not work. This is because they provide insufficient bandwidth for the Shuttle or other DeckLink USB 3.0 devices. DeckLink devices (including the Intensity series) do not have a low bandwidth preview functionality, but rather only support full bandwidth capture or playback. The preview functionality on Media Express for example obtains the full video frame from the capture device using the DeckLink API, but only displays every second line.”