Prime time is back!

A BTS look at Amazon MGM Studios

You’ve had to tackle a warzone at work, you’ve ninja’d your way back home through peak-hour traffic and you’ve dodged that friend’s house party because he lives in an insufferable zip code. All so you can spend time alone with The Boys.

Our unwavering confidence in our OTT platforms to “just work” and entertain and befriend us in our downtime is thanks to some serious behind-the-scenes tech. Tech that Stuff (India) was privy to on an exclusive tour of Amazon’s Stage 15, part of the iconic MGM Studios lot before the merger that created Amazon MGM Studios. Established in 2020, Stage 15 is ground zero for the pioneering Virtual Production stage powered by AWS (Amazon Web Services).

Spread across 34,000 square feet, the stage’s absolute star is the gigantic LED volume wall, which uses Unreal Engine to create any scene and lighting a director wants. Composed of 3,000 LED panels and 100 motion-capture cameras, this cutting-edge virtual production system aims to cut costs and let filmmakers visualise their shots in unprecedented detail. Of course, the most obvious byproduct of the technique is the environment it creates for the actors themselves. Being virtually immersed in the location or scene opens up the potential for greater expression and believability. The real glow the LEDs cast on the actors’ faces and costumes also eliminates the dreaded colour spill of traditional green-screen set-ups.

After a brief intro, we were allowed to get into the thick of things, right in the middle of the volume wall. Only when you’re at its centre do you realise its true scale: 80ft in diameter and a towering 26ft in height, it completely engulfs you in whatever world or background is being conjured. Any of the panels can be dropped out to mount additional rigs anywhere in the wall, and physical objects can even be mapped so they have a “virtual” consequence on the volume wall. Wave a flashlight at the wall, for instance, and the motion-capture cameras will track its movement and create a “digital” spotlight that accurately follows it.

Shooting on the virtual production stage is just part of the process, though, and this is where the AWS camera-to-cloud workflow comes into play. With near real-time asset sharing and review, graphics, editing and colour grading can happen on the fly. More than 500 partners across different parts of the workflow chain use AWS, making it one of the biggest cloud ecosystems in the world! To put things into perspective, Jurassic Park had “only” 83 CGI shots back in the day, whereas the Prime Video original series The Rings of Power has 9,164 CGI shots, all thanks to the AWS workflow! Modern filmmaking is often a collaboration between multiple VFX, production and editing teams spread across the world, and the AVPS-developed asset management system lets stakeholders catalogue, search, preview and repurpose production assets. Unsurprisingly, this greatly reduces the lag in moving files from set to editorial to VFX and onwards to post-production.
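To give a flavour of what “catalogue, search, preview and repurpose” means in practice, here is a deliberately tiny sketch. Every name, class and tag below is invented for illustration and has nothing to do with the actual AVPS system: the idea is simply that assets registered with searchable tags can be found and reused by any team, rather than recreated from scratch.

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    """One production asset (fields are hypothetical)."""
    name: str
    kind: str                        # e.g. "plate", "vfx", "grade"
    tags: set = field(default_factory=set)

class AssetCatalogue:
    """Toy catalogue: register assets and search them by tag."""
    def __init__(self):
        self._assets = []

    def add(self, asset):
        self._assets.append(asset)

    def search(self, tag):
        # Return every asset carrying the requested tag.
        return [a for a in self._assets if tag in a.tags]

cat = AssetCatalogue()
cat.add(Asset("volume_bg_forest_v3", "plate", {"forest", "day"}))
cat.add(Asset("creature_flyby_comp", "vfx", {"forest", "creature"}))
print([a.name for a in cat.search("forest")])  # both assets match
```

A real system would of course layer storage, previews and permissions on top, but the searchable-metadata core is the part that saves teams from shipping hard drives around.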

BA Winston - VP of Technology

The technologies that we are building are applicable globally. Some of the technology, specifically around how we adapt to varying bandwidth conditions, we do adapt for specific countries as well, and it’s not just India with fluctuating bandwidth. It’s very important for us to ensure that even in those countries with poor bandwidth we deliver a good experience. So, for example, when we did cricket, I believe we probably offered the best experience for customers ever, because we spent a lot of time optimising our bitrates specifically for India. We have AI-based resolution where we are able to do upconversion on the device side, even with lower bandwidth. So we can send lower bitrates, and even lower-resolution video, to the device, and the device does the upconversion. That way, you can adapt even more to varying bandwidth conditions.


Of course, the less tangible but equally critical byproduct of the AVPS LED volume wall and the AWS-powered workflow is the impact on a unit’s carbon footprint: fewer trucks carting equipment to multiple locations, fewer professionals flying all over the world and fewer resources used overall. On the other end of the spectrum, OTT is our window to all the processes that lead to the final product being delivered to millions of devices. Over 8,000 files are created to stream a single movie on Prime Video, and AI is used in the encoding process to ensure each frame has the right colour and contrast. AI is also used to upscale low-res frames to look high-res via generative processes, a capability that is being rolled out in phases.

Elsewhere, AI has permeated the backend of Prime Video’s vast labyrinth of a distribution network to ensure that bitrates are perfectly judged for varying network speeds. The long chain from studio to encoders, media packagers, content delivery networks, ISPs, home Wi-Fi and finally the device has innumerable variables that need to be tamed via complex algorithms, all of which get a helping hand from AWS’ AI implementation. Not too far into the future, a typical nine-month production pipeline could be condensed into nine weeks using AI, ML and Neural Radiance Fields, the tech that can reconstruct 3D scenes from 2D assets, all carried out with the help of AWS. The scalability, security and global infrastructure it offers are proving to be a seismic shift in filmmaking budgets and workflows.
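The “perfectly judged bitrates” idea can be illustrated with a toy adaptive-streaming sketch. Everything here is an assumption for illustration, not Prime Video’s actual ladder or algorithm: the player simply picks the highest-quality rung whose bitrate fits within a safety margin of the measured bandwidth, which is the basic principle behind adaptive bitrate streaming.

```python
# Hypothetical bitrate ladder: (resolution, bitrate in kbps), best first.
LADDER = [
    ("2160p", 15000),
    ("1080p", 5800),
    ("720p", 3000),
    ("480p", 1500),
    ("240p", 400),
]

def pick_rung(measured_kbps, safety=0.8):
    """Pick the best rung that fits within a fraction of measured bandwidth."""
    budget = measured_kbps * safety
    for res, kbps in LADDER:
        if kbps <= budget:
            return res, kbps
    return LADDER[-1]  # bandwidth too low for any rung: fall back to lowest

print(pick_rung(4000))   # a 4 Mbps connection gets the 720p rung
```

Real players re-measure bandwidth continuously and switch rungs mid-stream; the device-side AI upconversion BA Winston describes lets them pick an even lower rung than this logic would and still present a sharp picture.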

Adam Gray - VP of Product at Prime Video

We just launched something called explicit language preference. If you choose a certain language for the UI, then that’s also the one we lean into for localisation of content. But that’s not true for a lot of customers in India, as they will choose English or any of the other multiple options. It basically enables customers to select the languages they would like to stream in and change their experience. We’re doing more and more to implicitly figure that out as well, but just giving customers the ability to directly change that has had a really big impact. India is one where we have a dedicated team that just works on that experience, because there are so many options.

As viewers and consumers, all we have to do is await the next big release, settle into the sofa and press play. That’s exactly why a look behind our OLED screens and onto the scene of filming was such an eye-opening experience, literally.