Film/Game Convergence: The Future of Cinematography


‘Let there be light,’ said the CGI God, and there was light… and God Rays.

We were out in the desert, on barren land, and the Director's wish was that it be transformed into a green oasis, a tropical paradise… for the next few days.

And so our demigods went to work in their digital sandboxes, while we sipped cold coffee and ate donuts in the video village.
Then… one of the CGI Gods populated the land with Dirrogates: digital people in his own likeness.

Welcome to Visual Storytelling in real-time: the future of film-making is upon us.

Although the linear world of film-making has been steadily adapting to technology, it has not been adapting fast enough. Why? I'll be blunt about this:

Film-makers hate change. Not all of them, I’ll be the first to say, but many of the old dogs do. These are the film-makers who are still rebelling against digital cameras, who still hold out against Stereoscopic 3D as a medium of visual storytelling, who seek comfort behind the artifact cloud cover that 24 frames per second offers; who feel that 48 frames per second is like cheap video.

Meanwhile… the new generation of cinema audiences, accustomed to nothing less than 120 frames per second on their gaming rigs, has been quietly creating a revolution (perhaps even unknown to them). That revolution in visual storytelling is called the “cut-scene” in game parlance.

Cut-scenes have been a staple of the modern gaming world, and are made possible by technology that, dollar for dollar, is vastly superior to what Avatar had. Everything from virtual cinematography to detailed CGI world creation already existed, and is today produced on budgets that would give the bean counters and suits in tent-pole Hollywood studios orgasms. Yet such technology and the operators behind it have never been acknowledged in the same way that their counterparts in the film world are.

So how can all this voodoo technology be brought to the film set? Enter the game engine, or to be more specific, something called CineBox.

All output in the video above is in real-time, from a single modern gaming PC. That’s right… in case you missed it, ALL of the video was generated in real-time from a single PC that can sit on a desk. The “engine” behind it is CryEngine 3, which is what powers the CineBox.

Before naysayers comment that this all looks game-like and does not have a “cinematic” feel to it, do remember that the whole “set” can be rendered at higher res via traditional render engines and/or in near-realtime on GPU-powered renderers for further iterations in quality if needed. The sync and industry-standard input-output capability of digital assets will be its strength.

The CineBox is a dedicated offering aimed at cinematography and film-making, so it will have tools and functions that film professionals are familiar with. I will not go through the list of these, as this article aims to add food for thought on what can or should be added to this enigma of modern movie-making technology.

  • Dolby Atmos support?
Imagine going out on location with the CineBox and setting up a “digital location”. The actual location can be an empty lot far from the city, and the “world” is either pre-loaded or can be built from scratch, with final touches and tweaks on the day itself (move a tree here… raise a rock there… put in some particle smoke with DoF blur in the distance, etc.). Now, CryEngine also has the ability to create atmosphere, both visual and aural of course, with so-called ‘entities’. Everything is previewable in real-time.
7.1 Dolby is not enough. Dolby Atmos is where it’s at these days.
If sound entities (on-location foley) that are assigned with meticulous effort by the environment designer (a.k.a. the CGI God) can be recorded as a Dolby Atmos track, such an option in the CineBox could be invaluable both for dailies review and as a guide layer for final audio production.
Should the CineBox be used in episodic production of broadcast TV programmes, a live Dolby Atmos / 7.1 track could be created and transmitted to a base station for further sweetening and relay to homes.
  • CineBox, meet Scenect:
Faro’s Scenect could allow for rapid building of digital “assets” while on location, for smaller objects and for the mass replication of flora or man-made objects. If there is robust integration of either third-party software such as Scenect, or an in-house solution from Crytek for the Microsoft Kinect, asset scanning becomes just one of the uses. Lower-budget productions could benefit from using low-cost Kinect sensors to create a full performance-capture stage, as the video above shows.
At a recent RealVision workshop, I demonstrated such a solution (mocap) live on-stage. I also did a demo of CryEngine, though this was not connected to any mocap solution.
  • HFR Storytelling in Real-time Stereoscopic 3D:
Game engines thrive at frame rates far beyond 24 fps. The norm for the average gamer is 60 fps, already surpassing the 48 fps that The Hobbit was shot at. It’s not uncommon to see gaming rigs that do 100 fps and above when running on beefy Nvidia and ATI GPUs. Further enhancements from CUDA break even more ground.
After watching The Hobbit in HFR 3D at 48 fps, I speculated that its very stylized world could certainly be rendered in near-realtime with future iterations of the CineBox. After all, 48 fps did bring out detailing in the sets, costumes, and actors’ faces that made them resemble a game-like world. This world was received well by the young generation of “transmedia” cinema audiences, and most objections came from older eyeballs used to the slower 24 fps cinema feel.
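The frame-rate arithmetic here is worth making explicit: stereoscopic 3D needs two eye views per frame, so the render budget per view shrinks fast as the frame rate climbs. A quick illustrative calculation (the helper function is my own, not a CineBox API):

```python
def frame_budget_ms(fps: float, stereo: bool = False) -> float:
    """Milliseconds available to render each frame.

    Stereoscopic 3D requires two eye views per frame, halving
    the time available per rendered view.
    """
    budget = 1000.0 / fps
    return budget / 2.0 if stereo else budget

# 24 fps mono cinema:                   ~41.7 ms per frame
# 48 fps stereo (The Hobbit's HFR 3D):  ~10.4 ms per eye view
# 60 fps stereo gaming:                  ~8.3 ms per eye view
```

In other words, an engine compositing HFR stereo in real-time gets roughly a quarter of the time per view that a 24 fps mono pipeline enjoys, which is exactly the regime game engines are already tuned for.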
  • Support for live Camera input for Green Screen and Virtual Camera, a la Previzion
If there is support to bring in a live camera feed and there is a built in chroma keyer, then all sorts of possibilities start to surface, the most advanced of which being:
  • Tracked Camera for Virtual Cinematography; the (in)famous Avatar “Simulcam”
  • METADATA: Matching of real-world lens metadata from “intelligent” lenses, such as the Cooke i-Series, to match DoF etc. between the live-action camera and the CGI one, including accurate lens-distortion mapping.
  • Stereoscopic 3D Compositing in Realtime: Matching of interaxial for live stereoscopic 3D compositing of live-action and CGI 3D rigs. If the keyframable CineBox stereo camera module is driven by a tracked 3Ality 3D rig, or a Meduza, or indeed a PACE Cameron rig… then realtime, or at the very least hi-fidelity near-realtime, stereoscopic 3D productions can be realized with ease and freedom of movement.
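The lens-metadata matching above is tractable because depth of field follows from exactly the values an intelligent lens reports: focal length, aperture, and focus distance. A minimal sketch using the standard thin-lens hyperfocal formulas (the function name and the Super 35 circle-of-confusion default are my own illustrative choices, not anything from the CineBox or Cooke protocols):

```python
def dof_limits(focal_mm: float, f_stop: float, focus_m: float,
               coc_mm: float = 0.030) -> tuple:
    """Near/far depth-of-field limits (metres) from lens metadata.

    focal_mm: focal length reported by the lens (mm)
    f_stop:   aperture (f- or T-number)
    focus_m:  focus distance (metres)
    coc_mm:   circle of confusion for the sensor format (mm);
              0.030 mm is a commonly quoted Super 35 value.
    """
    f = focal_mm / 1000.0   # focal length in metres
    c = coc_mm / 1000.0     # circle of confusion in metres
    hyperfocal = f * f / (f_stop * c) + f
    near = hyperfocal * focus_m / (hyperfocal + (focus_m - f))
    if hyperfocal > (focus_m - f):
        far = hyperfocal * focus_m / (hyperfocal - (focus_m - f))
    else:
        far = float("inf")  # beyond hyperfocal: everything to infinity
    return near, far

# e.g. a 50 mm lens at T2.8 focused at 3 m:
# near ≈ 2.73 m, far ≈ 3.33 m
near, far = dof_limits(50, 2.8, 3.0)
```

Feed the same three numbers from the live camera into the CGI camera and the two depth-of-field envelopes line up; distortion mapping is the harder, per-lens-profile half of the problem.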
  • Real-time Node rendering CineBox for Large Worlds
Laser-scanned buildings and even whole neighborhood blocks are now commonplace in large-budget Hollywood productions. A detailed point cloud needs massive compute power to render. CryEngine excels at what it does, and if there is support for daisy-chaining a few CineBoxes together, large neighborhoods with realtime animated atmosphere, flora and fauna can be realized… in stereoscopic 3D, at High Frame Rates (HFR).
  • Real World Synced Weather:
CryEngine has a powerful and advanced TOD (time of day) editor. There is every reason to believe that this will make its way into the CineBox (I have not yet got my hands on a beta version of the CineBox).
Now imagine if the TOD editor module and a “weather” system could actually pull data such as wind direction, temperature and weather conditions from real-world sensors, or a realtime data source.
If this could be done, then the CineBox, used as an on-location set-extension generator, could have details such as leaves blowing in the correct direction.
See the video above at around the 0:42 mark for a feel of what I’m aiming for.
Also: No possible errors in the night sky, such as the one Titanic suffered from.
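Syncing real weather to the engine mostly comes down to unit and convention conversion. One wrinkle: meteorological sensors report the compass bearing the wind comes FROM, while an engine wants the vector the wind blows TOWARD. A small sketch of that conversion (the function and its engine-space axes are hypothetical, not a CryEngine API):

```python
import math

def wind_to_engine_vector(direction_deg: float, speed_ms: float) -> tuple:
    """Convert meteorological wind data to an engine-space vector.

    direction_deg: bearing the wind comes FROM, in compass degrees
                   (meteorological convention), e.g. 270 = from the west.
    speed_ms:      wind speed in metres per second.
    Returns an (x, y) vector the wind blows TOWARD,
    with +x = east and +y = north (illustrative axes).
    """
    # The wind blows toward the opposite compass bearing.
    toward = math.radians((direction_deg + 180.0) % 360.0)
    return (speed_ms * math.sin(toward), speed_ms * math.cos(toward))

# e.g. a westerly wind (from 270°) at 5 m/s blows toward the east:
# (x, y) ≈ (5.0, 0.0)
vx, vy = wind_to_engine_vector(270.0, 5.0)
```

A vector like this could then drive whatever wind or vegetation-bending parameters the engine exposes, so the digital leaves move the same way as the real ones just outside the green screen.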
The video above rounds up what the CineBox is all about; worth mentioning is the segment at around the 0:20 mark, showing the rapid and intuitive “world creation” that CGI Gods are known for.
The latest demo of what CineBox can do is showcased in the video below:

Audiences watching the finished film would never know that the tropical paradise never existed in the barren desert of the real world where we filmed…

Author’s note: All images and products mentioned are copyright their respective owners. The above suggestions for the Crytek CineBox are my own, and I am not affiliated with any of the companies mentioned, nor should this article be taken as an endorsement by these companies.

I also use Lumion at my master-classes to show on-location set-extension creation, due to its very intuitive interface. I had contacted the makers of this fine software, but they did say that architects and arch-viz are their focus for now. I do hope that sometime soon they will look at the film-making market with Lumion 3.

Dir·ro·gate: A portmanteau of Digital + Surrogate. Borrowed from the hard science novel “Memories with Maya”.


  • I’ve never heard of Dolby Atmos. Makes sense, though. CGI Gods. Intriguing! Just last night, I did a quick search for physics engines in use in Second Life and Miku Miku Dance. The two that I wanted were Havok (SL) and Bullet (MMD). I also learned that Blender has a game engine! Probably not as powerful as CryEngine, but that’s beside the point.

    I’m of the opinion that if Google had access to Hollywood’s film library, they could do an awesome job of timelining locations to their maps and earth projects. It would take worldbuilding to a new level.

    • Dirrogate

      completely agree with you.

      Won’t be long now before Google Earth becomes the de facto… “Second Life” engine to run our own avatars (Dirrogates).

      On a related note: with enough compute power, a future version of CryEngine (CineBox) might be able to run a copy of Google Earth, integrated with location-accurate, high-res laser-scanned landmarks (think ancient Rome)… giving film-makers the power to “enter” ancient Rome.