Storytelling in VR:

They say you shouldn’t hijack the head-tracking data stream of the Oculus Rift; visuals should not be separated from the human vestibular system… but rules were meant to be broken. Why? Because there’s so much more to VR than gaming.

This is not to say games aren’t becoming movies! I found myself strangely immersed in Naughty Dog’s "The Last of Us", more than in any tent-pole movie I’ve seen in the past few months. Such is the power of CG movies, uncanny valley be damned.

Defining a language for Storytelling in Virtual Reality:

You know how it all began oh so long ago (OK, four years ago), when the language of film-making was being defined and re-written for S3D? Well, it’s time for a re-write again. Immersive 360 film-making is set to explode, geared (at least at the start) for an audience of teens to mid-forties, and telling stories in this medium is quite a different skill-set to master.

Citizen Kane, although a 2D film, gave modern 3D film-makers enough clues back in the day on how to use the medium of S3D effectively… but no one really had the patience to listen. Lighting, depth of field and, yes, even hijacking the head-tracking stream can work when creating movies on a 360 canvas.

When I started investigating this exciting medium a few months ago, alarm bells would go off when I asked on Oculus Rift and game-engine forums about intercepting the head-tracking and orientation info of these devices, but that’s because, so far, only games have been designed for VR. It is becoming evident that, beyond the gimmicky interactive look-around voyeuristic possibilities offered by the medium, serious Directors and storytellers will look at retaining control of the "frame" if they are to be enticed into creating movies in Virtual Reality.

So what could an immersive 360 Director’s tool-box look like?

  • Lighting – With the audience tempted to look all around a scene, a Director and VR DoP can use the age-old technique of spot-lighting areas of importance (see the lighting sketch after this list).
  • 360 Positional Sound – Wait until Dolby Atmos gets interested. Chances are an Atmos SDK might already be in the works to create sound-scapes that can aid in directing an audience’s attention.
  • Depth of Field – The pet peeve of Stereoscopic 3D film-making unless done correctly (bokeh). This technique is worth exploring in an immersive 360 environment to guide audience attention. At least it won’t be the lead-by-the-nose experience it sometimes becomes when abused by inexperienced DPs and Directors on 2D films.
  • Limiting the Horizontal FoV – There is no rule per se that every scene should feature a full wrap-around 360 view for the audience to explore. The horizontal field of view can be restricted for certain shots. This is a creative call, and is what will contribute to the flavor of the overall movie experience being crafted by the film-maker.
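Taking the lighting bullet above as an example, here is a minimal Unity C# sketch of what authored emphasis could look like. Everything here (the `SpotlightEmphasis` behaviour, its field names and values) is my own illustration, not part of any shipping VR toolkit: it cross-fades a "hero" spotlight up over the area of importance while pulling the fill lights down when the shot begins.

```csharp
using UnityEngine;
using System.Collections;

// Minimal sketch: cross-fade a "hero" spotlight up and the fill lights down
// so the audience's eye is drawn to the area of importance when the shot starts.
public class SpotlightEmphasis : MonoBehaviour
{
    public Light heroSpot;          // spotlight aimed at the area of importance
    public Light[] fillLights;      // the rest of the scene's lighting
    public float fadeSeconds = 2f;
    public float heroIntensity = 3f;
    public float fillIntensity = 0.2f;

    void OnEnable()
    {
        StartCoroutine(FadeLights());
    }

    IEnumerator FadeLights()
    {
        float t = 0f;
        float heroStart = heroSpot.intensity;
        float[] fillStart = new float[fillLights.Length];
        for (int i = 0; i < fillLights.Length; i++)
            fillStart[i] = fillLights[i].intensity;

        while (t < fadeSeconds)
        {
            t += Time.deltaTime;
            float k = Mathf.Clamp01(t / fadeSeconds);
            heroSpot.intensity = Mathf.Lerp(heroStart, heroIntensity, k);
            for (int i = 0; i < fillLights.Length; i++)
                fillLights[i].intensity = Mathf.Lerp(fillStart[i], fillIntensity, k);
            yield return null;
        }
    }
}
```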

Advanced Tools for Immersive 360 Storytelling:

The short demo scene above is from MAYA, a mixed-media Motion Comic I’m working on for the Oculus Rift and other VR devices, including cell-phone VR such as Google’s Cardboard, the Durovis Dive and others that allow almost any smartphone, Android or iOS, to play back VR movies and experiences.
The demo clip is a straight video grab of a wearer viewing a "page" of the motion comic via the Oculus Rift. It features both scene cuts and subtle interactivity.

Interactivity: In the first scene after the title, the girl stands at the window; that’s what the Director intends the audience to see, and the rest of the room has subdued lighting. That is… unless the wearer turns their head around, which triggers the bedlamps to increase in intensity.
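For those curious how little code that interactivity needs in Unity, below is a minimal C# sketch of the idea, not the actual MAYA scripts. It assumes the Rift (or any HMD) is driving the main camera’s rotation; the lamp, angle and fade values are placeholders.

```csharp
using UnityEngine;

// Minimal sketch: when the wearer turns away from the director's intended
// view direction (the window), fade the bedlamps up; fade them back down
// when the gaze returns. Assumes the HMD drives Camera.main's rotation.
public class GazeReactiveLamps : MonoBehaviour
{
    public Transform window;            // the intended focal point of the shot
    public Light[] bedLamps;
    public float triggerAngle = 60f;    // degrees away from the window
    public float litIntensity = 2f;
    public float dimIntensity = 0.1f;
    public float fadeSpeed = 1.5f;

    void Update()
    {
        Transform head = Camera.main.transform;
        Vector3 toWindow = window.position - head.position;
        float angle = Vector3.Angle(head.forward, toWindow);

        float target = angle > triggerAngle ? litIntensity : dimIntensity;
        foreach (Light lamp in bedLamps)
            lamp.intensity = Mathf.MoveTowards(lamp.intensity, target,
                                               fadeSpeed * Time.deltaTime);
    }
}
```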

Head-Tracking Hijack: The next scene shows the girl framed on the bed. This cut will happen irrespective of where the wearer of the Rift is looking. Yes, it is a forced cut, and it will put the scene bang center.
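In Unity terms, the "hijack" doesn’t have to mean writing into the SDK’s tracking stream at all. A simpler route, sketched below with hypothetical names rather than the actual MAYA scripts, is to counter-rotate the camera rig’s parent at the instant of the cut so the new shot lands dead ahead of wherever the wearer happens to be looking:

```csharp
using UnityEngine;

// Minimal sketch of a "forced cut": rather than overwriting the head-tracker,
// yaw the camera rig's parent so the next shot's point of interest sits
// directly in front of wherever the wearer is currently looking.
// Assumes the HMD-driven camera is a child of cameraRig and sits near its pivot.
public class ForcedCut : MonoBehaviour
{
    public Transform cameraRig;     // parent of the head-tracked camera
    public Transform shotTarget;    // what the next shot should frame (e.g. the bed)

    public void Cut()
    {
        Transform head = Camera.main.transform;

        // Flatten both directions onto the horizontal plane so only yaw changes.
        Vector3 gaze = head.forward;
        gaze.y = 0f;
        gaze.Normalize();
        Vector3 toTarget = shotTarget.position - head.position;
        toTarget.y = 0f;
        toTarget.Normalize();

        // Rotate the rig so the current gaze direction lands on the target;
        // the head-tracking data itself is never touched.
        cameraRig.rotation = Quaternion.FromToRotation(gaze, toTarget) * cameraRig.rotation;
    }
}
```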

The important point to be aware of is this: the same rules of S3D storytelling apply, and mis-matched depth splicing across a cut should be avoided.

(image credit: RoadtoVR)

Green-Screening the Crew Out:

This idea came to me when I glanced at the image of what I later realized was a paratrooper in the movie. I initially thought they had covered crew/equipment in green for later keying/wire removal. While I have not looked into the actual feasibility of stereoscopically replacing a background plate after removing green-screen-clad crew or equipment, I am confident it should be possible, even when dealing with de-warping and stitching the 360 image.

Compositing in 360:

Below is an interactive 360 "cubic" panorama, converted from an equirectangular image to the cube faces that form the panorama. (Click and drag, or, if browsing this page on an Android/iOS device, the gyro will work.)

Images of the six individual cube faces are here:

Again, there’s every reason to believe a competent NUKE programmer-artist could write the warping matrix to composite elements directly onto a spherical or equirectangular sequence of images or video.
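For what it’s worth, the heart of that warp is just the latitude/longitude mapping between a 3D direction and an equirectangular pixel; the rest is filtering and per-eye handling on top. A minimal, compositor-agnostic sketch in C# (the helper names are my own):

```csharp
using System;

// Minimal sketch of the equirectangular (lat-long) mapping that any 360
// compositing warp is built on: pixel <-> unit direction vector.
static class Equirect
{
    // (u, v) in [0,1]x[0,1]; u wraps around in longitude, v spans pole to pole.
    public static (double x, double y, double z) UvToDirection(double u, double v)
    {
        double lon = (u - 0.5) * 2.0 * Math.PI;   // -pi .. +pi
        double lat = (0.5 - v) * Math.PI;         // +pi/2 (top) .. -pi/2 (bottom)
        double cosLat = Math.Cos(lat);
        return (cosLat * Math.Sin(lon), Math.Sin(lat), cosLat * Math.Cos(lon));
    }

    // Inverse: where does a world-space direction land on the lat-long plate?
    public static (double u, double v) DirectionToUv(double x, double y, double z)
    {
        double lon = Math.Atan2(x, z);
        double lat = Math.Asin(y / Math.Sqrt(x * x + y * y + z * z));
        return (lon / (2.0 * Math.PI) + 0.5, 0.5 - lat / Math.PI);
    }

    static void Main()
    {
        // A point straight ahead (+Z) should land at the centre of the plate.
        var (u, v) = DirectionToUv(0, 0, 1);
        Console.WriteLine($"u={u:F3}, v={v:F3}");   // u=0.500, v=0.500
    }
}
```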

Jaunt VR, the people behind The Mission VR, are certainly proficient enough in the algorithm and CUDA coding department to pull it off.


Virtual Reality Film-making Gear:

Capture: The current camera system used by Jaunt VR is said to be a rig of 14 GoPro cameras in a stereoscopic config, capturing a wrap-around view of a scene in S3D. Now, I’ll admit I’m sitting on the fence about actual "scanline"-level CMOS sync of GoPros in a stereo config, much less seven pairs of them… yet I’ll give the rig and the experts behind it the benefit of the doubt. It’s possible that sync drift is negligible with today’s oscillators/controllers.

Edit: I’ve been told JauntVR aren’t capturing pano stereo video in the traditional manner. Instead, every point in space is seen by a minimum of three cameras in the rig, so a virtual stereo rig can be created, which then produces a final stereo pair using "computational photography". This is said to give rounded stereo across the fish-eye field of view and to get rid of what I call 'fish-eye shear' at the periphery of view. For narrative filmmaking, I’m seriously looking at "graduated stereo fall-off" (SFO) as a viable tool for directing audience attention.
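To be clear, "graduated stereo fall-off" is my own term, not something Jaunt have announced. For a real-time rendered scene, as opposed to captured footage, the idea could be sketched in Unity C# as below, assuming an explicit two-camera stereo rig rather than the SDK’s built-in one; every name and value here is illustrative:

```csharp
using UnityEngine;

// Sketch of "graduated stereo fall-off": shrink the interaxial separation of a
// two-camera stereo rig as the wearer's gaze drifts away from the direction the
// director cares about, flattening the periphery. Attach to the head/rig
// transform whose forward vector is the gaze direction.
public class StereoFallOff : MonoBehaviour
{
    public Transform leftEye;
    public Transform rightEye;
    public Transform heroDirection;         // empty object aimed at the shot's focus
    public float fullSeparation = 0.064f;   // metres, roughly average IPD
    public float minSeparation = 0.0f;      // fully flat at the extreme periphery
    public float fallOffStartAngle = 30f;   // degrees: full stereo inside this cone
    public float fallOffEndAngle = 90f;     // degrees: flat beyond this

    void LateUpdate()
    {
        float angle = Vector3.Angle(transform.forward, heroDirection.forward);
        float k = Mathf.InverseLerp(fallOffStartAngle, fallOffEndAngle, angle);
        float separation = Mathf.Lerp(fullSeparation, minSeparation, k);

        leftEye.localPosition  = new Vector3(-separation * 0.5f, 0f, 0f);
        rightEye.localPosition = new Vector3( separation * 0.5f, 0f, 0f);
    }
}
```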

Stereo 3D Conversion Houses: 360 film-making might actually put the spotlight on 2D-to-3D conversion: a much-needed service for film-makers and a new business avenue for conversion studios to explore! Compositing in stereo 3D over the stitched 360 image is only a few NUKE nodes away for competent conversion houses.


For MAYA, a 360 Motion novel, I’ve chosen the excellent Cinema Director from Cinema-Suite. It’s the closest one can get to an NLE system for an otherwise scripting-heavy game engine such as Unity. There is no straightforward way to create immersive 360 video in any of the popular video-editing solutions; doing anything creative with Unity or other game engines takes a bit of scripting in JavaScript or C#.

I had to customize the software to a certain extent to get the tools I needed to do those cuts for the Oculus Rift.
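To give an idea of what that customization boils down to, here is a stripped-down sketch of a scripted cut list that fires events at NLE-style timecodes. It is purely illustrative: neither Cinema Director’s API nor the actual MAYA scripts.

```csharp
using UnityEngine;
using UnityEngine.Events;
using System.Collections.Generic;

// Illustrative only: a bare-bones "cut list" that fires events at timecodes,
// the way an NLE timeline would. Each event can trigger a forced cut,
// a lighting change, a sound cue, etc. (hook these up in the Inspector).
public class CutList : MonoBehaviour
{
    [System.Serializable]
    public class Cut
    {
        public float timeSeconds;   // when this cut fires
        public UnityEvent onCut;    // what it does
    }

    public List<Cut> cuts = new List<Cut>();

    float elapsed;
    int next;

    void Update()
    {
        elapsed += Time.deltaTime;
        while (next < cuts.Count && elapsed >= cuts[next].timeSeconds)
        {
            cuts[next].onCut.Invoke();
            next++;
        }
    }
}
```

Hooking something like the ForcedCut.Cut() method sketched earlier into one of these events is how a scene change can land bang center regardless of where the wearer is looking.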

Most NLE systems were late to catch the Stereoscopic 3D train, and the same will probably be the case for 360 VR support. Toolkits will need to be written to control different aspects of a VR experience. So what would a time-line look like for an immersive 360 VR film? Take a look at a screenshot from MAYA, below.


Immersive VR eye-wear:

Credit for the device that started the VR revival goes to the Oculus Rift. The recent acquisition of Oculus by Facebook for a staggering $2 billion should prove that VR is here to stay. Equally important are initiatives and products coming from Sony with its Morpheus eyewear, Samsung’s own venture, and a special nod to long-time VR headwear experts TDVision and their soon-to-be-released ImmersiON VR eyewear.

Smartphones can easily be converted into immersive 360 video-playback hardware. The Durovis Dive and even Google’s "Cardboard" show how low-cost this can be.

Immersive 360 Film-making: Content

The hardware and software are becoming available faster than the talent to produce immersive 360 content. Are audiences ready?

Gamers have always wanted beefier gaming rigs (and laptops), and the same holds true for their screens. With VR devices, it’s like having an IMAX® screen strapped to their faces. Yes, they are ready, and they’d like to watch their movies the same way. But VR is not limited to gamers. The luxury of an immersive large-screen environment, and the privacy and intimacy it offers, cannot be discounted. A long-haul flight is but one venue that comes to mind where VR eye-wear would be in high demand…

Education is another. Already, 360 documentaries, complete with Sir David Attenborough’s voice, are being readied for when these devices go mainstream…
As a film-maker, will your storytelling skills evolve for the next generation of audiences?

**Edit 11th August 2014**

Depth of Field – Bokeh-ing the background:

On the Oculus Rift forum, as well as in the comments section of this article, the question of bokeh came up, so to clarify with an example:

Depth of field (a practice I do not normally condone and label lazy filmmaking; see the Citizen Kane argument above) can actually be used to great effect if the background is completely bokeh-ed out, i.e. there must be no ambiguity that invites the eyes/brain to attempt to fuse semi-blurred imagery, which would otherwise cause eye strain and headaches in stereoscopic 3D viewing, and even more so in a stereoscopic VR environment.

To show an example of what I mean, below is the same interactive panorama with only the Director’s framing and the ground in focus… exactly as the Director might intend this scene to be viewed. The example below is obviously a quick Photoshop lens-filter job for illustration purposes; a proper artist would spend time crafting a proper bokeh treatment to apply to scenes. Use your mouse to simulate head turning when wearing the Oculus Rift.
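In the Unity build of such a panorama, the simplest way I can think of to get this all-or-nothing blur is not a real-time depth-of-field effect at all, but swapping the panorama sphere’s texture between a sharp plate and a pre-blurred (in this case Photoshopped) plate per shot. A minimal sketch, with hypothetical field names:

```csharp
using UnityEngine;

// Minimal sketch: per shot, swap the panorama sphere's texture between a sharp
// plate and a pre-blurred ("bokeh-ed") plate, so the background is either fully
// crisp or fully defocused -- never the half-blur that fights stereo fusion.
public class PanoramaBokehSwap : MonoBehaviour
{
    public Renderer panoramaSphere;     // inside-out sphere carrying the equirect plate
    public Texture sharpPlate;
    public Texture bokehPlate;          // pre-blurred offline (Photoshop/NUKE)

    public void UseBokeh(bool enabled)
    {
        panoramaSphere.material.mainTexture = enabled ? bokehPlate : sharpPlate;
    }
}
```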

Part 2: The Language of narrative storytelling in VR is now online.



  • Great innovative idea… One day, feature films may also be made in this manner. Wishing you all success with your MAYA 360 Motion novel. I hope in the near future, consumer models of immersive 360 video playback hardware will be available at more affordable prices, so that people like me will be able to enjoy the experience.

    • clydeD

      Dear Ramachandra,
      Thank you for the kind words.

      Jaunt VR are spearheading this initiative and props to them for venturing into the medium.
      >> I hope in the near future, consumer models of immersive 360 video playback hardware will be available at more affordable prices..

      Google’s Cardboard is a great way to whet anyone’s appetite for VR. There’s also the Durovis Dive and others, and there’s every reason to believe the consumer version of the Rift will be cost-effective for mass adoption.

      Kind Regards.

  • clydeD

    Over on Reddit’s Oculus board, I was asked to expand on shooting a stereo pano and the MAYA motion comic project, and as the above article has had visits from two famous "comics" studios (hint: it’s not DC Comics!), I think the answer posted below might be of value, FWIW.
    The more motion comics created for the Rift, the better…

    —copy paste—
    Yeah, it’s just a demo scene, not a final version; I will have to hire artists to work on it. I wanted to know what goes into the production from start to finish, so I’m investing time in learning as much as I can.
    Shooting stereo panos for this particular project would go like this:

    1) For scenes where there is control of the environment: two shots (left/right) done with a parabolic mirror, or a conventional GoPro 360 rig, shot cha-cha style.

    2) Where there’s no control of the environment: two pairs of stereo cameras (LANC-synced) with wide-angle lenses giving a wide HFOV. This will then be stitched to a second shot of the background (the other part of the hemisphere) with the two pairs of stereo cams rotated, and because there’s no way to guarantee the two shots will be in sync, the "back" hemisphere will be "bokeh"-ed out, thereby suggesting to the Rift wearer where he/she should be looking.

    3) For yet other scenes, it will be a conventional 2D pano shot, then converted to a cubic map… then a 2D-to-3D pass done to give 'depth' to elements in the scene… think pedestrians on one plane, vehicles and trees on another, and the far background on another…

    I can do this *only* because this is a motion comic, and not a full-fledged action movie.

    Yet, you might want to check out what JauntVR are doing with their 14-camera GoPro rig. I’ve written about them (and so have others). Here:

    Kind Regards.

  • Pingback: The language of visual storytelling in 360 virtual reality « TheVirtualRealist

  • John A. Rupkalvis

    Excellent commentary. Like all new innovative approaches, 360 degree VR will, I am sure, meet with resistance from those directors who are used to the status quo and are afraid to try anything new.

    Actually, it is not that new. Shakespeare’s Globe theatre was actually a “theater-in-the-round” with a circular stage surrounded by the audience, and many of the challenges of 360 degree imaging are similar, including composition and direction of the audience’s (or single viewer’s) attention.

    Although I agree that bokeh can be used, I believe that its use should be limited, as it can often be a distraction. Like zooming, fisheye, telephoto, etc., it is most effective if limited to “special occasions” when it really suits the dramatic considerations of the story.

    There are other ways that the audience’s attention may be directed. The classical theater methods are still valid today. Certainly, moving the players to more-or-less prominent positions is one way. Perhaps the most effective way is through the use of dynamic lighting, wherein the brightness and position of the lights, as well as the area (spotlighting vs floodlighting) can be used very effectively, whether in a live-action theater-in-the-round play or in a 360 degree VR viewer. Even the sound system can be used to augment or enhance this purpose. Remember, “the eyes follow the ears”. You hear a sound, and you look in that direction.

    I am sure that my own experience with theater-in-the-round, as well as the Stereorama immersion theater, will lend itself very well to 360 degree VR applications. These are areas that I have researched for many years. I designed, built, demonstrated, and published one of my VR architecture systems in 1972. If anyone has a specific interest, my e-address is: .

    • clydeD

      Dear John,

      Valuable insights from you as always. The Globe theater is an apt analogy for VR.

      I’ve updated the article to further illustrate my point about how a Director can use bokeh to maintain control of a "frame". For sure, this effect is not to be abused; it should be used sparingly, when appropriate, as another tool in the 360 narrative toolkit.

      >>Perhaps the most effective way is through the use of dynamic lighting, wherein the brightness and position of the lights, as well as the area (spotlighting vs floodlighting) can be used very effectively, whether in a live-action theater-in-the-round play or in a 360 degree VR viewer…

      Yes, I agree fully on this, hence the Citizen Kane reference at the start of the article and the use of lighting as one of the most potent tools a 360 Director can wield to assert a degree of control over the flow of the narrative in a 360 movie.

      Kind Regards.

  • Pingback: The Language of Visual Storytelling in 360 Virtual Reality. | Knowledge Base and stereoscopic 3D Blog |

  • Pingback: Frontloading Oculus Rift Data | YouRift

  • Mr. Omer

    Dear Clyde,

    First of all, I am truly glad to have read such an amazing article, since I too am a movie lover and a film student at Met Film School.

    I’ve been trying to define the new rules of film-making for 360 storytelling. A week ago, the company Inition gave me the opportunity to express myself on that very same topic, as I’ve been researching it and ended up writing an entire essay on the subject. As a true believer in VR film-making, and passionate about it, I am genuinely happy to see someone else working on a very similar project.

    I’m only 22, so with all due respect, if it’s not too much to ask, I would appreciate the chance to talk with you, exchange ideas, and perhaps get some feedback on my essay, which I named Virtual Reality Cinematography – The Rise of a New Type of Film-making.

    Many thanks for such work,

    I am looking forward to hearing from you, Clyde.


    Mr. Omer

    • Dirrogate

      Hello Omer.
      Inition has always been a leader in recognizing trends in AR, s3D and VR. There was no better platform for you to share your thoughts on VR filmmaking. I’d be happy to exchange a few notes via email. Send me an email using the “contact us” menu at the top of this page, and I’ll email back. Kind Regards.

    • Aimons

      Hey Omer, I’d like to talk with you about VR Cinematography.

  • lanxinger

    Hi Clyde,

    Thank you for the cutting-edge article. Lots of interesting possibilities and even more open questions. Samsung just announced their version of the Rift, made together with Oculus for their Note 4 phone, so this market just got even more interesting, with a big player entering it this early on.

    From the technical perspective, I too am very curious about how they achieve proper sync between 14 GoPro cameras. I’ve heard that even with the new sync cable for the Hero 3+, the cameras still go out of sync at times, and that is with only a pair. I would be very curious to see some of their footage.

    • >>From the technical perspective, I too am very curious about how they achieve proper sync between 14 GoPro cameras.
      I should have updated the article: Jaunt aren’t using a typical 360 rig the way pano footage is usually shot and stitched. Instead, they say that every point in space is seen by at least 3 of the cams in the rig, and their algorithm, or "computational photography", then creates perfect spheres for the left and right eyes. This also does away with the edge shearing seen at the periphery of typical fish-eye footage, but more importantly, it is claimed that via their method perfect stereo is visible at all points in the "frame". (If normal extreme fish-eyes are used on a stereo rig, we lose stereo at the periphery due to the nature of the lenses.)

      I’m certainly looking forward to seeing sample footage myself to confirm the value of this approach.
      Kind Regards.
      (P.S. Thanks for the kind words!)

  • Wow!! What a technology!
