Virtual production may be the toast of Hollywood, but as demand for blockbuster-caliber visuals crosses over into proAV, the approach is gaining popularity for hybrid live-streamed concerts, product launches, and more. Unlike film or episodic shoots, live event productions don’t afford second chances, which is where Immersive Design Studios’ CANVAS platform serves companies that have adopted the technique. It combines real-time game engine and cloud technology with video capture, playback tools, and artificial intelligence (AI) neural networks to deliver memorable, immersive live event activations. Immersive Design Studios works closely with clients to support every production and broadcast or stream it live. To achieve a uniform look for broadcasts comprising footage shot on different cameras, the team deploys AJA ColorBox with Assimilate Live Looks for live grading, most recently for WORRE Studios.

Commenting on the project, Immersive Design Studios Co-Founder and CEO Thomas Soetens explained, “We’re constantly brainstorming ideas and testing them across client projects. When we started thinking about WORRE Studios and began LUT (lookup table) work, we realized we could use our cinema cameras with AJA ColorBox and Assimilate Live Looks to push the loads. We ordered multiple ColorBoxes and were using them with Live Looks in under a week.”

Establishing the right pipeline

The workflow was relatively straightforward to implement, according to Soetens. WORRE Studios – a CANVAS-powered facility that’s been tapped by Fortune 500 companies, celebrities, and more for live hybrid in-person and virtual events – features a circular LED stage. The venue comprises four massive LED walls – each 60 feet wide and 14.5 feet high – totaling a combined resolution of 38K. It accommodates an in-person audience of up to 350 people, as well as the capacity to reach 500,000 virtual attendees. CANVAS outputs the experiences onto the LED walls, but most often, everything that happens in-studio is also broadcast or live streamed to remote audiences.

Ensuring a Hollywood-caliber viewing experience on a production like this meant that anywhere there was an interaction between the LED wall, speaker, and virtual production background, the look had to be uniform. Soetens and team leaned on LUTs as a solution because they support camera calibrations that go beyond adjusting the white point and evenly balancing colors. They output feeds from ten different cameras through a video mixer and into AJA ColorBox units, which helped them unify the look and feel of the color to a prespecified LUT the team wanted applied to the broadcast. The resulting effect made the remote audience feel as though they were inside a movie.
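The core operation a 3D LUT performs is generic: each RGB value indexes into a lattice of precomputed output colors, with trilinear interpolation between the eight nearest lattice points. The sketch below is a minimal, from-scratch illustration of that math, not AJA's or Assimilate's implementation.

```python
import numpy as np

def apply_3d_lut(rgb, lut):
    """Apply a 3D LUT (shape n x n x n x 3, values in [0, 1]) to one RGB
    pixel (values in [0, 1]) using trilinear interpolation."""
    n = lut.shape[0]
    # Scale the pixel into LUT index space
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo  # fractional position within the lattice cell
    out = np.zeros(3)
    # Blend the 8 surrounding lattice points by their trilinear weights
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                idx = (hi[0] if dr else lo[0],
                       hi[1] if dg else lo[1],
                       hi[2] if db else lo[2])
                out += w * lut[idx]
    return out
```

An identity LUT (each lattice point mapping to its own coordinate) leaves any pixel unchanged, which makes a convenient sanity check; a grading LUT simply stores different output colors at the lattice points.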

“Our ColorBoxes sat between every camera and the switcher in a very simple way, introducing essentially zero latency and making them quite handy kit,” Soetens shared. “We just had to configure the devices and talk to them over the network, and we got incredibly powerful outcomes; they’re so small and portable, which was great because we could place them anywhere and access them remotely via the browser-based web UI. This approach gave us far greater control over the outcome.”
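Talking to grading hardware over the network, as described above, typically means scripting against the device's HTTP interface. The endpoint path (`/api/v1/lut`), the `slot` parameter, and the JSON payload below are hypothetical placeholders for illustration, not the documented AJA ColorBox API; the actual REST reference for the device should be consulted. The sketch only shows the general shape of building such a request with the Python standard library.

```python
import json
from urllib import request

def build_lut_request(host, slot, lut_name):
    """Build an HTTP request that would select a LUT on a hypothetical
    grading device reachable at `host`.

    NOTE: '/api/v1/lut/<slot>' and the 'name' field are illustrative
    placeholders, not the real AJA ColorBox endpoints."""
    url = f"http://{host}/api/v1/lut/{slot}"
    payload = json.dumps({"name": lut_name}).encode("utf-8")
    return request.Request(url, data=payload,
                           headers={"Content-Type": "application/json"},
                           method="PUT")

# A request like this would then be sent with urllib.request.urlopen(req)
req = build_lut_request("192.168.1.50", 1, "show_look_v2")
```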

Overcoming barriers

Establishing a uniform look when working with a broad range of cameras, however, can be challenging, as each camera has unique attributes. Recognizing that learning multiple camera models and brands, and their different LUTs and calibration approaches, just wouldn’t be efficient, Soetens and team saw ColorBox and Assimilate Live Looks as a centralized solution that would allow them to unify and color grade footage from multiple camera types with low latency. “We were able to swap the look and feel between shots and make every camera source ultimately look the same,” he explained. “It allowed us to deploy multiple LUTs on multiple ColorBoxes, test them, and bring them together seamlessly for a cohesive program output.”
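Unifying several cameras this way amounts to composing transforms: a per-camera LUT that normalizes each source, followed by one shared show-look LUT, can be baked into a single per-camera LUT. The per-channel 1D sketch below, with made-up function names, illustrates that composition step under those assumptions; production pipelines would do the same with 3D LUTs.

```python
import numpy as np

def apply_1d_lut(values, lut):
    """Per-channel 1D LUT lookup with linear interpolation between entries."""
    xs = np.linspace(0.0, 1.0, len(lut))
    return np.interp(values, xs, lut)

def compose_luts(camera_lut, show_lut, size=33):
    """Bake 'camera -> neutral' followed by 'neutral -> show look' into one
    LUT of `size` entries, so a single pass grades that camera's feed."""
    xs = np.linspace(0.0, 1.0, size)
    return apply_1d_lut(apply_1d_lut(xs, camera_lut), show_lut)
```

Because the composition is precomputed once per camera, every source can be normalized and graded to the same target look without adding a second real-time processing stage.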

A commitment to quality

In addition to using AJA ColorBox, Immersive Design Studios leverages many other AJA solutions across projects, from KONA I/O cards to FS frame synchronizers, Mini-Converters, and, more recently, BRIDGE NDI 3G. “BRIDGE NDI 3G is a symbiotic fit for a lot of our clients,” Soetens expressed. “Especially considering CANVAS can ingest hundreds of NDI streams; most of our clients are using it as a set-and-forget solution for SDI to NDI encoding/decoding needs. We trust AJA as a brand, so we implement it when we need trustworthy conversion.”

He concluded, “AJA ColorBox and BRIDGE NDI 3G are reliable and do what they are supposed to without introducing any unnecessary complexities, and that’s the magic of AJA; the products are always lean, very focused, and withstand the test of time. We choose AJA in synergy with the vision of the CANVAS platform technology: reliably enable frictionless execution that maximizes artistic decisions and the audience experience.”

Gabriel Mays

AbelCine Director of Rental
As Director of Rental, Gabriel Mays oversees growth of Rental by nurturing business relationships and unifying the customer experience nationwide. In his previous role as the LA Rental Manager, Gabriel worked with producers, cinematographers, and assistant camera operators to provide the highest level of customer service possible. He also ensured that all their equipment needs were met on budget. Before coming to AbelCine in 2015, Gabriel worked as a Communications Instructor and Electronic Media Engineer at a private university, teaching various courses in electronic media, including film, broadcasting, and digital arts. Gabriel is an award-winning cinematographer, director, producer, and editor known for his innovative research and implementation of new film techniques. He was also one of the first to implement HDSLR cameras into filmmaking, and has been the cinematographer on dozens of independent films, documentaries, and music videos.