xR for Producers and Directors

Ashraf Nehru
9 min read · Oct 27, 2020


How to eXtend Reality without losing your mind

Katy Perry’s American Idol performance of ‘Daisies’, made by XR Studios

Executive Summary

xR (sometimes “eXtended Reality” or “virtual production”) uses LED screens to place performers and props in a fluid three-dimensional environment, captured in-camera without the need for costly post-production visual effects.

xR is “hot” right now in live, film, and television production. Especially during the COVID-19 crisis, it seems like everyone and their pet alligator wants to produce and direct an xR show. However, xR requires a new level of collaboration between previously separate stage departments (lighting, camera, and screens), and this has the potential to seriously trip up productions if not properly understood.

If you’re a producer or director who wants to deliver xR shows on time, on budget, and without insane levels of stress, this article is for you. It explains what xR is, why it’s hot at the moment, how it works (easy on the science bit), and (crucially) what it means for your production process and how you organise your team.

Oh, and best of all, it’s short. And there’s a picture of Baby Yoda at the end.

What is xR?

If a picture is worth a thousand words, sometimes a video is worth a million words; this is one of those times.

You’ll be familiar with the age-old technique called Green Screen: film actors in front of large green screens, and substitute fake backgrounds through the magic of computers:

The Hobbit, made by Weta Digital

The trouble with Green Screen is that actors can’t see what they’re acting against; they get tinged with green, which has to be fixed in post; and you as the director can’t see the final shot until much later, so the creative process is stilted. If you thought The Hobbit was just okay, this might be why.

The next evolutionary step, Green Screen Replacement (GSR), solved these problems by projecting a background (either still or video) onto a screen behind the performers and props. This has three main advantages: first, the performers can see the environment around them, so they’re more comfortable and you get a more natural performance; second, the performers and props (particularly diffuse or reflective surfaces) are lit naturally by the backdrop (and by additional lighting driven by the video content); and third, you as the director can see the final shot in-camera immediately, rather than having to wait for a slow and expensive post-production effects process.

GSR’s first ‘big’ outing was on Oblivion, where the cloudscapes around Tom Cruise’s floating bachelor-pad were projected onto screens around the set, reflecting nicely off all the glass bubbles and chrome highlights:

Oblivion, made by Digital Domain, Pixomondo, and Lux Machina

GSR is great for background plates that are far away or soft-focused, like cloudscapes or landscapes. But because the backgrounds are ‘static’, we can’t change their viewing angle when the camera moves or rotates. So for instance, if we had a smaller, closer object in the scene (like another aircraft coming in to land, say), moving the camera would break the illusion and we’d realise it was just a flat shape.
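
If you want to see the parallax problem in numbers, here’s a toy Python sketch. It’s a simplified pinhole-camera model I’m using purely for illustration (not code from any real tracking or rendering system): move the camera one metre sideways, and a nearby object shifts across the image a hundred times more than a distant one. A flat video plate shifts everything by the same amount, which is exactly what gives the game away.

```python
def project(point, camera_x, focal_length=1.0):
    """Toy pinhole projection: 3D point -> horizontal image coordinate."""
    x, y, z = point
    return focal_length * (x - camera_x) / z

near_ship = (0.0, 0.0, 5.0)    # an aircraft 5 m from the camera
far_cloud = (0.0, 0.0, 500.0)  # a cloudbank 500 m away

# Move the camera 1 m to the right and compare how far each point
# appears to move on screen. The differential shift is parallax.
for label, point in [("near ship", near_ship), ("far cloud", far_cloud)]:
    shift = abs(project(point, 1.0) - project(point, 0.0))
    print(f"{label}: apparent shift = {shift:.4f}")
# near ship: apparent shift = 0.2000
# far cloud: apparent shift = 0.0020
```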

And this is the problem that xR solves.

For xR, we create a three-dimensional model of the background scene, and re-draw it continuously from the viewpoint of the camera we’re filming with. So we can move the camera freely and ‘look around corners’, which wouldn’t be possible with flat video.

This means xR can convincingly simulate a much wider variety of scenes, as in the example below:

‘Quite Brilliant’ spot, made by Satore Studio

Why is xR hot right now?

all the cool kids want to be Doja Cat (made by XR Studios)

xR is definitely having a ‘moment’; as with all overnight success, it’s been years in the making.

Productions have always wanted to avoid having to shoot outdoors or build expensive and complex sets, and xR lets you do that more often. In particular, it lets you build a small stage and make it look like you built a large one (through an xR effect called virtual set extension), which is particularly useful in the current COVID-19 environment.

However, it’s really only recently that all the various technologies that xR relies on (fast computers, accessible game engines, camera trackers, and so on) have become widely available. And it’s this accessibility, coupled with COVID-driven necessity, that’s driving the current explosion of interest in xR as a technique.

It’s unclear whether xR is a passing fad — with all such techniques, there’s usually an initial flurry of hype and over-use, before the industry settles down and builds it into standard practice in a more measured way. For the purposes of the following discussion, I’m going to assume that xR is going to be around for a while, and will at some point be considered normal and boring.

How does xR work?

Tom Cruise standing on a box

Like everything in the rich history of in-camera trickery, xR relies on technology.

In xR, we use a camera tracker device mounted on each camera (common options are systems made by MoSys, NCAM, or Stype) that figures out where the camera is and where it’s pointing (its position, angle, and field of view).

We then use a game engine (common options are Notch, Unreal, and Unity) to quickly draw the background scene from that point of view, and then render the resulting image onto the screens on stage. If we get everything right, when we look down the barrel of the camera, we experience the illusion that the performers and props are actually “in” the virtual scene. Kinda sorta.

To make this work in real time, we need to repeat this process every frame; and since we’re usually working at anywhere from 24 to 60 frames per second, that leaves as little as 16 milliseconds per frame to track the camera, render the scene, and display it, which is why we need powerful computers. But basically that’s all there is to it.
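
For the technically curious, that per-frame loop can be sketched in a few lines of Python. Everything here (tracker, engine, led_wall, and their methods) is a hypothetical stand-in I’ve invented for illustration, not any real vendor’s API; it’s just the shape of the loop.

```python
import time

FPS = 60
FRAME_BUDGET = 1.0 / FPS  # roughly 16.7 ms per frame at 60 fps

def xr_render_loop(tracker, engine, led_wall):
    """Sketch of the per-frame xR loop. All three arguments are
    illustrative stand-ins, not a real tracking or engine API."""
    while True:
        start = time.monotonic()

        # 1. Ask the tracker for the physical camera's current pose.
        pose = tracker.get_pose()  # position, angle, field of view

        # 2. Match the virtual camera to the physical one.
        engine.set_virtual_camera(pose.position, pose.angle, pose.field_of_view)

        # 3. Render the 3D scene from that viewpoint and push it to
        #    the LED screens behind the performers.
        led_wall.display(engine.render_frame())

        # 4. Everything above has to fit inside the frame budget, or
        #    the background visibly lags behind the camera move.
        elapsed = time.monotonic() - start
        if elapsed > FRAME_BUDGET:
            print(f"frame overran: {elapsed * 1000:.1f} ms")
        else:
            time.sleep(FRAME_BUDGET - elapsed)
```

(In a real system the loop is synchronised, or ‘genlocked’, to the camera’s shutter rather than free-running, but the basic shape is the same.)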

Easy, right? Weeeellll…

Why is xR ‘difficult’?

There are two parts to this answer.

The first part is simply that xR is complex new technology, and complex new technology can be … challenging.

Instead of painting or filming backdrops, artists have to model them in 3D on computers. Granted, with post-production VFX they’d have to do that anyway; but for xR, the models also have to be simple enough to render in real time inside a game engine, and that’s still something of a black art. Done poorly, you end up with backgrounds that look like old-school computer games; it’s hard to do it well enough to fool the viewer into thinking they’re looking at a “real” scene, assuming that’s your intention.

To light the backdrops, we can’t use real lights; instead, we have to create virtual lights inside the game engine. We also need to light the performers with physical lights that match the backdrop lighting, and deal with shadows cast by real objects onto virtual ones, and by virtual ones onto real ones. So we need to somehow ‘connect’ the virtual lights to real ones, and be able to drive the real lights from the background scene. And so on, and so on.
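
To give a flavour of what ‘connecting’ virtual lights to real ones involves, here’s a hedged Python sketch. Every name in it (virtual_light, fixture, dmx) is a hypothetical stand-in; on a real production this bridge is typically a media server or lighting console speaking a lighting protocol such as DMX, Art-Net, or sACN to the physical fixtures.

```python
def sync_light(virtual_light, fixture, dmx):
    """Copy one virtual light's colour and intensity onto a physical
    fixture. All three arguments are hypothetical stand-ins for the
    engine's light object, a fixture profile, and a DMX output."""
    r, g, b = virtual_light.colour       # engine-side values, 0.0-1.0
    intensity = virtual_light.intensity  # 0.0-1.0

    # DMX channels are 8-bit, so rescale the engine's values to 0-255.
    dmx.set_channel(fixture.universe, fixture.dimmer_channel, round(intensity * 255))
    dmx.set_channel(fixture.universe, fixture.red_channel, round(r * 255))
    dmx.set_channel(fixture.universe, fixture.green_channel, round(g * 255))
    dmx.set_channel(fixture.universe, fixture.blue_channel, round(b * 255))

# Called once per frame for every virtual/physical light pair, so the
# real fixtures follow whatever the background scene is doing.
```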

Suffice it to say that there are lots of fun and complex problems to solve to create a convincing xR illusion. Fun for engineers, anyway; not so much fun when they bite you in the ass on set.

xR toolsets are evolving quickly, practitioners are learning as they go, and the industry is inventing the field ‘on the fly’. In the context of real productions, with real budgets, deadlines and consequences, this makes xR riskier (although it’s getting better all the time). Hardware and software failures, as well as inadequate training, can derail technical production and lead to stress, overtime, and lateness.

Luckily, there’s a strategy to deal with this: don’t assume that everything will “just work”. Instead, make time and space to test systems and workflows before load-in, and don’t rely on things you haven’t tested; and if you can’t test exhaustively (which is often the case), build sensible contingencies and backup plans into your budgets and schedules (tip: listen to your friendly technicians; they don’t want to be on set any longer than they have to). Resist the pressure to compress production schedules. Yes, this makes things more expensive; consider it an investment in the future, if you like. And if the cost is a problem, find existing xR stages and expertise you can hire instead of building your own.

The second, and less obvious, part of the answer is that xR demands a different kind of team collaboration on set.

Go xR Team !

we’ve been used to working like this …

Before xR, the three main show departments (camera, lights and screens) didn’t really have to interact that much, and they liked it that way. Each department stayed in their lane, communicated mainly with the director, developed their own self-sufficient work cultures, and evolved snide/affectionate terms for each other like “vidiots” and “lampies”. Although not especially friendly, this system was simple, and it worked.

… but xR needs a more collaborative approach

The unavoidable fact about xR, unfortunately, is that it requires us to connect and synchronise cameras, lights and screens in ways that weren’t necessary before. The old “stay in your lane” approach just doesn’t work in this scenario. Each command from the director requires all three departments to work together to achieve it. And when technical issues emerge, they have the potential to devolve into an interdepartmental blame game that (aside from being infuriating) doesn’t actually fix the issue.

(we want to avoid this situation)

It’s therefore critical to assemble those previously separate departments, and get them communicating, collaborating and planning as a team, both before and during production. That way, all departments can collaborate efficiently to diagnose and fix technical issues before they turn into full-blown dumpster fires.

The xR Checklist

Since this article promised to be practical, here’s a handy checklist to run through when planning your next xR production. And since most humans who aren’t producers or directors can only hold seven items in their working memory, I’ve kept it short.

  • do your department heads (DOP, lighting, screens, content) all understand that they are working on an xR project?
  • have your department heads received training on xR workflows?
  • is there a clear list of xR capabilities you expect to see during the shoot? (e.g. multi-camera switching, use of a specific engine or content, physical size of stage, etc.)
  • have your department heads met together to plan how they will tackle pre-production and load-in tasks?
  • have you scheduled a pre-load-in system test to check the capabilities you expect during the shoot?
  • does your creative team understand the kind of compromises they may have to make to get things working in real time?
  • do your departments have a workaround or backup plan if a specific feature fails?

Summary

afraid do not be; great powers, will you wield (made by ILM with assistance from Lux Machina)

xR isn’t really super-difficult — it’s just new, and it requires creators to think and work differently than they’re used to. What you get back for your effort is access to a whole new approach to creating content, one that hasn’t yet been mined to death and that presents huge opportunities for innovation and for finding a new creative language. What it’s mainly missing right now is your voice.

Come on in, the water’s fine!


Disclosure: I’m a founder of disguise, a hardware/software company which has a mission of trying to help these techniques achieve some kind of maturity.



Written by Ashraf Nehru

I once made the mistake of letting other people use my software; the result was www.disguise.one. Now I’m trying to figure out how to fix what’s really broken.