Powerful tech is breaking boundaries in VFX film production

Presented by Supermicro and AMD


Production 2.0 is pushing the frontiers of what’s possible in visual storytelling, creativity and film production. In this VB Spotlight, leaders from Supermicro and AMD talk about the new technologies utterly transforming how the entertainment industry works.

Watch free on-demand now.


Before the dawn of computer graphics in film, production consisted of three stages: pre-production, production, and then post-production, where visual effects were incorporated. Now, with increased compute capabilities, production has become a single iterative process that is far more efficient and more cost-effective. Plus, new tools that can take advantage of these computational resources, such as AMD Ryzen Threadripper and EPYC CPUs along with software like 3ds Max and Houdini, are making far more sophisticated motion capture and visual simulations possible.

This technology is unlocking some extraordinary visual storytelling, says James Knight, Global Media & Entertainment/VFX Director at AMD.

“Good content is all about suspension of disbelief — as years go on, audiences expect more realism,” Knight said. “When you watch a piece of content, you want to be in it for an hour or 90 minutes, and that’s your reality. Good visual effects add to the story, and add to the deception that what you’re seeing is real within that storytelling.”

And modern-day GPUs allow for real-time rendering, and an explosion in the possibilities of iterative virtual production. Now creators and editors can create and use new assets at any point in the production pipeline, as well as easily make adjustments on the fly, in real time, on set and off.

“Virtual production and the real-time render — they’ve changed everything,” said Erik Grundstrom, Director, FAE, Supermicro. “Advances in technology have been able to allow us to have more realistic visuals, faster rendering times, more complex effects, increased detail and resolution. We’re headed to 8K.”

The hardware under the hood

Five years ago, CPUs were built with 10 or 12 cores, 16 at the high end, with relatively low clock speeds and very basic inter-process communication. But modern processors have become orders of magnitude more powerful, Grundstrom said.

“Today we have these massive math monsters,” he explained. “The innovation has been significant. These types of things were unheard of five years ago. When you have a ton of cores at a ton of frequency, it really has changed our capabilities as far as time to completion and multi-tenancy workstations and storage, all across the board.”

He points to AMD’s 4th generation EPYC processors, with up to 96 CPU cores, and Ryzen Threadripper PRO, with 64 cores and clock speeds that stay above 3GHz at full load. This kind of power makes it possible to run virtual machines as virtual workstations in studio production, with true multitenancy and full 3D acceleration, backed by all-flash storage that lets files be saved and transferred faster and more efficiently.
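As a rough illustration of why that jump in core count matters for render throughput, here is a back-of-the-envelope sketch. The frame count and per-frame render time below are hypothetical assumptions for illustration, not benchmarks of any EPYC or Threadripper PRO part, and the math assumes ideal linear scaling across cores.

```python
# Back-of-the-envelope sketch: how CPU core count changes wall-clock time for a
# CPU-bound render queue. All numbers are hypothetical illustrations, not
# benchmarks of any real part; ideal (linear) scaling is assumed.

def render_queue_hours(num_frames: int, minutes_per_frame: float, cores: int) -> float:
    """Ideal wall-clock hours if frames are spread evenly across all cores."""
    total_core_minutes = num_frames * minutes_per_frame
    return total_core_minutes / cores / 60.0

frames = 2_000             # hypothetical frame count for a sequence
minutes_per_frame = 15.0   # hypothetical single-core render time per frame

for cores in (16, 64, 96):
    hours = render_queue_hours(frames, minutes_per_frame, cores)
    print(f"{cores:>3} cores: ~{hours:.1f} hours")
# Prints roughly: 16 cores ~31.2 h, 64 cores ~7.8 h, 96 cores ~5.2 h
```

Real pipelines never scale perfectly, but the gap between a 16-core and a 96-core node is what lets artists iterate more times before the same deadline.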

“The uptick in cores and threads has generated a revisiting of how studios and how projects look at their pipelines,” Knight added. “It turns out CPUs have been the holdback. As virtual production, real-time visualization, and specialized VFX become more ubiquitous, having increased lane capability, having the ability to plug more things into a system, has had a huge effect on production.”

The impact on virtual production

When a visual effects studio or a production company is awarded a project in film and TV production, it often needs to staff up as quickly as possible. With compute capabilities available in data centers around the world, artists can be brought on from anywhere, so a show can be staffed incredibly quickly and the workflow is far more efficient.

“Artists can spend more time with their art because of the increased compute capabilities,” Knight said. “They can make more mistakes within the same deadlines. It translates to the audience in better storytelling.”

And this is power that all remote collaborators have access to, wherever they are, up to hundreds of them. Multitenancy also means a single workstation can now be shared among many users, Grundstrom said.

“Now they have, from any computer, from any interface they like, all of the horsepower that’s traditionally behind a tower that sits on their desk in an office,” he explained. “You can be literally anywhere in the world, and as long as you have a decent enough internet connection, you could be on your laptop in a cafe somewhere and have access to a full 3D accelerated workstation with all the resources you would have if you had a box on your desk.”
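To make the multitenancy idea concrete, here is a minimal sketch of carving one many-core host into identical virtual workstation seats for remote artists. The host specs, the per-artist allocation, and the Host/Seat names are assumptions for illustration, not a real provisioning API.

```python
# Minimal sketch of workstation multitenancy: divide one many-core host into
# identical virtual workstation seats for remote artists. Host specs, seat
# sizes, and these class names are illustrative assumptions, not a real API.

from dataclasses import dataclass

@dataclass
class Host:
    cores: int
    ram_gb: int

@dataclass
class Seat:
    cores: int
    ram_gb: int

def max_seats(host: Host, seat: Seat) -> int:
    """Identical seats that fit, limited by whichever resource runs out first."""
    return min(host.cores // seat.cores, host.ram_gb // seat.ram_gb)

host = Host(cores=96, ram_gb=768)    # hypothetical high-core-count node
artist = Seat(cores=8, ram_gb=64)    # hypothetical per-artist virtual workstation

print(max_seats(host, artist))       # -> 12 remote artist seats on this one host
```

In practice, GPU partitioning and storage bandwidth also bound the number of tenants, but the arithmetic above captures why one dense host can replace a room full of towers.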

The democratization of creative technology

Faster CPUs, better chips and more powerful tools aren’t just for the big tier-one studios, Knight said; smaller studios around the world, where tax incentives support production, can also benefit from this technology.

“This technology that we’ve worked on together is for everybody,” he said. “It doesn’t matter if you’re working on an independent film or a major feature film, a live TV broadcast or a sports show. This is for everyone.”

Plus, he added, feature film and TV work is largely credited with driving the innovation in computer graphics that then trickles down to other verticals.

“Media and entertainment is a great area to battle-test new technologies that will end up having an effect across all verticals,” Knight explained. “And through relationships with companies across industries, Supermicro has found new ways to push the boundaries of this tech. We have a feedback loop with our partners and our customers. That helps future generations of our technology. That’s how we push the boundaries.”

For a deeper dive into how technology is pushing an evolution in visual storytelling, why tech innovation in the entertainment space is a barometer for innovation across industries and more, don’t miss this VB Spotlight.

Watch free on-demand now!

Agenda

  • Virtualization and collaboration, production workflows, resource utilization, and the limitations of physical sets and locations
  • Advances in rendering and storage speeds, complex visual effects, speeding up production timelines and time to market
  • A look at the way real-time rendering engines can stretch the boundaries of filmmaking
  • And more

Presenters

  • James Knight, Global Media & Entertainment/VFX Director, AMD
  • Erik Grundstrom, Director, FAE, Supermicro
  • Dean Takahashi, Lead Writer, GamesBeat (moderator)
