Cinematography and Gaming Singularity: The emergence of HFR storytelling
Film/Game Convergence: The Future of Cinematography
‘Let there be light,’ said the CGI-God, and there was light… and God Rays.
We were out in the desert; barren land… and the Director’s wish was that it be transformed into a green oasis; a tropical paradise… for the next few days.
And so our demigods went to work in their digital sandboxes, while we sipped cold coffee and ate donuts in the video village.
Then… one of the CGI-Gods populated the land with Dirrogates: digital people in his own likeness.
Welcome to Visual Storytelling in real-time — the future of film-making is upon us.
Although the linear world of film-making has been steadily adapting to technology, it’s not been doing it fast enough. Why? I’ll be blunt about this, and it may hurt some egos:
Film-makers hate change. Not all of them, I’ll be the first to say, but many of the old dogs do; the ones who cannot, or will not, be taught new tricks, as the saying goes. These are the film-makers who are still rebelling against digital cameras, who still hold out against Stereoscopic 3D as a medium of visual storytelling, who seek comfort behind the artifact cloud cover that 24 frames per second offers, who feel that 48 frames per second looks like cheap video, and who will not entertain the voices of young blood when it comes to technology integration on the film set.
Meanwhile… the new generation of cinema audiences, those accustomed to nothing less than 120 frames per second on their gaming rigs, has been quietly creating a revolution (perhaps unknown even to them). That revolution in visual storytelling is called the “Cut-scene” in game parlance.
Cut-scenes have been a staple of the modern gaming world, made possible by technology that, dollar for dollar, is vastly superior to what Avatar had. Everything from Virtual Cinematography to detailed CGI world creation already exists, and is today produced on budgets that would give the bean counters and suits at tent-pole Hollywood Studios orgasms. Yet this technology, and the operators behind it, have never been acknowledged in the same way as their counterparts in the film world.
So how can all this voodoo technology be brought to the film set? Enter the Game Engine or, to be more specific, something called CineBox:
All output in the video above is in real-time, from a single modern gaming PC. That’s right… in case you missed it, ALL of the video was generated in real-time on a single PC that can sit on a desk. The engine behind it is CryEngine 3, which is what powers the CineBox.
Before naysayers comment that this all looks game-like and does not have a “cinematic” feel to it, do remember that the whole “set” can be re-rendered at higher resolution via traditional render engines and/or in near real-time on GPU-powered renderers, for further iterations in quality if needed. The sync and industry-standard input/output capability for digital assets will be its strength.
The CineBox is a dedicated offering aimed at Cinematography and film-making, so it will have tools and functions that film professionals are familiar with. I will not go through the list of these, as this article aims to add food for thought on what can and should be added to this enigma of modern movie-making technology.
Why CryTek should call me:
- Dolby Atmos support
- CineBox, meet Scenect
- HFR Storytelling in real-time Stereoscopic 3D
- Support for live camera input for green screen and Virtual Camera, à la Previzion
- Tracked camera for Virtual Cinematography; the (in)famous Avatar “Simulcam”
- METADATA: matching of real-world lens metadata from “intelligent” lenses, such as the Cooke i-Series, between the live-action camera and the CGI one, to match DoF, etc., including accurate lens-distortion mapping
- Stereoscopic 3D compositing in real-time: matching of interaxial for live stereoscopic 3D compositing of live-action and CGI 3D rigs. If the keyframable CineBox stereo camera module is driven by a tracked 3Ality 3D rig, a Meduza, or indeed a PACE Cameron rig, then real-time, or at the very least high-fidelity near-real-time, Stereoscopic 3D productions can be realized with ease and freedom of movement
- Real-time node rendering of CineBox for large worlds
- Real-world synced weather
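To make the lens-metadata bullet concrete: the near and far limits of acceptable focus can be derived from the focal length, stop, and focus distance an “intelligent” lens reports, using the standard thin-lens hyperfocal formulas, and then used to drive a CGI camera’s DoF. A minimal Python sketch, not any actual CineBox API; the 0.030 mm circle of confusion is an assumed Super 35-ish value.

```python
# Depth-of-field matching sketch (illustrative names, not a CineBox API).
# Given lens metadata (focal length, stop, focus distance), compute the
# near/far limits of acceptable focus so a CGI camera can match the
# live-action lens. Standard thin-lens hyperfocal formulas.

def dof_limits(focal_mm, f_stop, focus_mm, coc_mm=0.030):
    """Return (near_mm, far_mm) limits of acceptable sharpness.

    coc_mm is the circle of confusion; 0.030 mm is an assumed
    Super 35-ish value -- adjust it for the actual sensor.
    """
    hyperfocal = focal_mm ** 2 / (f_stop * coc_mm) + focal_mm
    near = focus_mm * (hyperfocal - focal_mm) / (hyperfocal + focus_mm - 2 * focal_mm)
    if focus_mm >= hyperfocal:
        far = float("inf")  # everything beyond the near limit stays sharp
    else:
        far = focus_mm * (hyperfocal - focal_mm) / (hyperfocal - focus_mm)
    return near, far

# Example: a 50 mm lens at T2.8, focused at 3 m
near, far = dof_limits(50.0, 2.8, 3000.0)  # near ≈ 2729 mm, far ≈ 3330 mm
```

In a live pipeline these two values would be refreshed per frame from the lens data stream and pushed to the virtual camera, so rack focuses on set are mirrored in the CGI world.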
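The interaxial-matching bullet above boils down to two operations: scale the physical rig’s stereo parameters into CGI world units, and check that on-sensor parallax behaves the same in both cameras (zero at the convergence plane, positive behind it). A hedged sketch; `StereoParams`, `match_cg_stereo`, and `world_scale` are hypothetical names for illustration, not a CineBox or CryEngine API.

```python
# Sketch of syncing a CGI stereo camera to a tracked physical 3D rig.
# Assumes the rig reports interaxial and convergence distance (in mm)
# per frame, and that the CGI world uses one uniform scale factor
# (world units per real-world mm). All names here are illustrative.

from dataclasses import dataclass

@dataclass
class StereoParams:
    interaxial: float   # separation between left/right cameras
    convergence: float  # distance to the zero-parallax plane

def match_cg_stereo(rig: StereoParams, world_scale: float) -> StereoParams:
    """Scale the rig's stereo values into CGI world units so the
    live-action and CGI parallax line up in the composite."""
    return StereoParams(rig.interaxial * world_scale,
                        rig.convergence * world_scale)

def sensor_disparity(ia_mm, focal_mm, conv_mm, depth_mm):
    """On-sensor horizontal disparity (mm) of a point at depth_mm for a
    converged stereo pair: zero at the convergence plane, positive
    (behind-screen) beyond it, negative in front of it."""
    return ia_mm * focal_mm * (1.0 / conv_mm - 1.0 / depth_mm)
```

Driving `StereoParams` from the rig’s encoder data every frame is what would let the keyframable CineBox stereo module shadow a 3Ality- or PACE-style rig in real time.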
Audiences watching the finished film would never know that the tropical paradise never existed in the barren desert of the real world where we filmed…
Recently, scientists have suggested that we may really be living in a simulation after all. The Mayans stopped counting time not because they predicted the end of the world in 2012, but perhaps because they saw 2013 heralding the dawn of a new era; an era that sees the building blocks fall into place for a journey toward eventual… ‘Singularity’.
A bit far-fetched? Possibly. But this is the kind of food for thought that CGI-Gods thrive on when pushing the boundaries of their creativity, their art, and technology.
Dir·ro·gate: a portmanteau of Digital + Surrogate. Borrowed from the hard science novel “Memories with Maya” (the world, augmented by digital replicas of people; living… and dead).
Author’s note: All images and products mentioned are copyright of their respective owners. The above suggestions for the CryTek CineBox are my own; I am not affiliated with any of the companies mentioned, nor should this article be taken as an endorsement by these companies.
I also use Lumion at my master-classes to show on-location set-extension creation, due to its very intuitive interface. I had contacted the makers of this fine software, but they said that architects and arch-viz are their focus for now. I do hope that at some point soon they will look at the film-making market with Lumion 3.