The world has changed; there is no arguing that. The question now is: how will this change affect us in the audiovisual sector? What will emerge from this new situation?

In this article I want to review everything we have learned over these last months, in which we have worked hard internally to evaluate the situation, to improve all those aspects and processes that the daily whirlwind never left us time for, and to align ourselves with the new normal that is forcing the audiovisual sector to reinvent itself.

For some years now, thanks largely to the hardware advances driven by companies such as Nvidia and AMD, the audiovisual sector has been undergoing a real revolution, one that, taking advantage of the pause imposed by the pandemic, has reached its peak in recent months.

The power of new graphics cards, together with software from companies such as Disguise, 10bit FX Limited, or Epic Games, is turning the dream of those of us who work in this field into a reality: working on our projects in real time, the famous WYSIWYG (What You See Is What You Get), while significantly reducing rendering times.

Obviously, this does not mean that the pipeline of audiovisual projects has changed overnight and that we can forget about the quality that render engines such as Octane or Redshift give us. On the contrary, each of these engines has its role, and as always, the requirements of the final product will determine which tools to use.

But these new tools, such as Unreal Engine or Notch, mean that we can now offer new services focused on creating virtual worlds and embedding characters in them: services that range from hybrid or virtual events to recordings for film productions, series, or video games.

After this “brief” introduction, I want to outline the different areas we have focused on during these months, which have become services we already offer to our clients.

Virtual Production or xR

Virtual Production and xR are very broad terms covering new approaches to every aspect of audiovisual production. Put simply, it is a type of production in which the physical world is combined with virtual reality, augmented reality, and CGI, making it possible to see our scenes unfold in real time as they are composed and captured on set.

In short, it is about creating virtual sets in which we can embed people or objects using LED screens, leaving behind the green screen and its high post-production costs. This technological solution has been made possible mainly by Epic Games, through its Unreal Engine game engine.

On the other hand, the term xR, or extended reality, has been popularised by Disguise. xR is their technological solution for building a workflow between camera tracking, indispensable for this type of project, and real-time content creation, in a relatively “easy and simple” way.

Finally, we cannot forget content generation itself, since for all this to happen our environments need to be rendered in real time.

Thanks to Unreal Engine or Notch, we synchronize the physical camera being tracked on set with the 3D camera inside our virtual world. That meeting of virtual and real worlds happens on the LED screens, which display both the in-camera content and the augmented layers using tools such as Unreal Engine’s nDisplay or Disguise’s xR solution.
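The core of that camera sync is simple in principle: each frame, the tracking system reports the physical camera’s position and orientation (plus lens data), and that pose is applied to the virtual camera before the engine renders the frame. The sketch below is only an illustration of that loop in Python, assuming a made-up pose format; real trackers such as Mo-Sys or Stype stream this data over their own protocols, and inside Unreal Engine it typically arrives through plugins such as Live Link rather than hand-written code.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class TrackedPose:
    """One sample from the camera tracking system (hypothetical format)."""
    position: np.ndarray   # metres, in stage space
    pan: float             # degrees, rotation around the vertical axis
    tilt: float            # degrees, rotation around the horizontal axis
    roll: float            # degrees, rotation around the lens axis

def rotation_matrix(pan: float, tilt: float, roll: float) -> np.ndarray:
    """3x3 rotation from pan/tilt/roll in degrees.
    Axis conventions here are arbitrary; every tracker/engine pair has its own."""
    p, t, r = np.radians([pan, tilt, roll])
    rz = np.array([[np.cos(p), -np.sin(p), 0],
                   [np.sin(p),  np.cos(p), 0],
                   [0,          0,         1]])
    rx = np.array([[1, 0,          0],
                   [0, np.cos(t), -np.sin(t)],
                   [0, np.sin(t),  np.cos(t)]])
    ry = np.array([[ np.cos(r), 0, np.sin(r)],
                   [ 0,         1, 0],
                   [-np.sin(r), 0, np.cos(r)]])
    return rz @ rx @ ry

def camera_to_world(pose: TrackedPose) -> np.ndarray:
    """4x4 transform placing the virtual camera exactly where the real one is."""
    m = np.eye(4)
    m[:3, :3] = rotation_matrix(pose.pan, pose.tilt, pose.roll)
    m[:3, 3] = pose.position
    return m

def sync_frame(read_pose, apply_to_virtual_camera):
    """Per-frame loop: read the latest tracked pose and hand the resulting
    transform to the render engine's virtual camera."""
    pose = read_pose()
    apply_to_virtual_camera(camera_to_world(pose))
```

In a real setup, the lens data (zoom, focus, distortion) has to travel with the pose, since the rendered background only lines up if the virtual lens matches the physical one.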

The main advantages of this process are:

  • An alternative to the traditional chroma-key system
  • Real light and shadow on our actors or objects, with no green-screen spill
  • Real reflections on our speaker or object
  • Reduced need for rotoscoping or keying in post-production
  • Actors and presenters can see and interact with the content directly

To develop this type of production we need a tracking system. Currently we are working with the Mo-Sys solution, but there are many others, such as Stype. The whole process can also be done with HTC Vive tracking systems, which provide a more economical solution.

At Framemov we are currently working on several projects that apply this technology, and they are becoming the main services we offer.

xR Services

Virtual events

Recordings in which we embed presenters and speakers in a 3D environment, recreating live events just as before the pandemic. We include LED screens that can display any type of content, and we build lighting rigs that can be controlled by lighting technicians. The final recording can be broadcast live via streaming, delivering a complete final product that brings a high degree of innovation to the event. We virtualize speakers or objects through augmented reality elements that interact with the speaker in real time. In addition, we use BlackTrax motion-tracking systems that allow us to track every movement of the person on set.

Hybrid events

A hybrid event is a trade show, conference, unconference, seminar, workshop, or other meeting that combines a “live” in-person event with a “virtual” online component. With the growing popularity and cost-effectiveness of virtual events, hybrid events have become a popular way of increasing participation in traditional events at a relatively low cost. They also enable participation by people who might be unable to attend physically due to travel or time-zone constraints, or who wish to reduce the event’s carbon footprint. With augmented reality, we can add more interest to the event or focus attention on specific themes, data, or any kind of 3D object.

Film recordings, television production and cinema

Using the virtual production system, we create virtual environments that can be modified in real time according to the needs of the production: lighting changes, set rotations, and CGI effects applied live. The result is a live visual-effects set that interacts directly with the people or objects in the scene.

Music livestreams

In July 2020, we held an electronic music festival at Framemov focused on finding a way forward for venues, festivals, and musical and visual artists. We built a virtual stage on which DJs and VJs performed. The entire stage could be modified live through Unreal Engine, as if it were a real stage. On the screens, the VJs could control the visuals by connecting Resolume, MadMapper, or VDMX. The lighting was controlled via DMX. You can see the result below.

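On the lighting side, the article only mentions “DMX”; a common way to drive DMX fixtures from software is Art-Net, which wraps one DMX universe in a UDP packet. The snippet below is a generic illustration of that idea, with made-up fixture channels and a made-up node address, not a description of the setup used at the festival.

```python
import socket

def artdmx_packet(universe: int, channels: bytes, sequence: int = 0) -> bytes:
    """Build an ArtDMX packet carrying one DMX universe (up to 512 channels)."""
    data = channels.ljust(2, b"\x00")[:512]      # payload must be 2-512 bytes...
    if len(data) % 2:                            # ...and an even length
        data += b"\x00"
    packet = bytearray(b"Art-Net\x00")           # protocol ID
    packet += (0x5000).to_bytes(2, "little")     # OpCode: ArtDMX
    packet += (14).to_bytes(2, "big")            # protocol version 14
    packet += bytes([sequence, 0])               # sequence, physical port
    packet += bytes([universe & 0xFF, (universe >> 8) & 0x7F])  # SubUni, Net
    packet += len(data).to_bytes(2, "big")       # DMX data length
    packet += data
    return bytes(packet)

# Hypothetical example: set the first three channels of universe 0
# (say, the RGB of one fixture) to full red and send it to an Art-Net node.
dmx = bytearray(512)
dmx[0:3] = bytes([255, 0, 0])

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(artdmx_packet(0, bytes(dmx)), ("192.168.1.50", 6454))  # 6454 = Art-Net port
```

In practice a lighting console or media server sends these packets many times per second, so the same mechanism that drives the physical fixtures can also feed the lighting of the virtual stage.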

Cinematics for video games

Thanks to Unreal Engine, we have started to create cinematics for video game companies. For these, we handle the entire process of recording actors and characters using motion-capture (mocap) suits from companies such as Xsens, Perception Neuron, or Rokoko. Combined with the tracking system, we can shoot these 3D characters in real time as if we were on a physical set, seeing the final result live.
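Conceptually, what a mocap suit delivers each frame is a rotation for every joint of a skeleton; posing the character is then a forward-kinematics pass down the joint hierarchy. The sketch below shows that idea in Python with a toy skeleton and made-up rotation data; in production this step is handled inside the engine or by the suit vendor’s plugin, not by hand-written code.

```python
import numpy as np

# A toy skeleton: each joint has a parent index (-1 = root) and a fixed
# offset from its parent in the bind pose (metres). Purely illustrative.
PARENTS = [-1, 0, 1, 2]                      # hips -> spine -> neck -> head
OFFSETS = np.array([[0.0, 1.0, 0.0],         # hips above the floor
                    [0.0, 0.5, 0.0],
                    [0.0, 0.4, 0.0],
                    [0.0, 0.2, 0.0]])

def pose_skeleton(local_rotations):
    """Forward kinematics: turn per-joint local rotations (3x3 matrices,
    one per joint, e.g. streamed from a mocap suit) into world positions."""
    n = len(PARENTS)
    world_rot = [np.eye(3)] * n
    world_pos = np.zeros((n, 3))
    for j in range(n):
        p = PARENTS[j]
        if p == -1:
            world_rot[j] = local_rotations[j]
            world_pos[j] = OFFSETS[j]
        else:
            world_rot[j] = world_rot[p] @ local_rotations[j]
            world_pos[j] = world_pos[p] + world_rot[p] @ OFFSETS[j]
    return world_pos

# One "frame" of data: bend the spine joint 30 degrees forward.
angle = np.radians(30)
bend = np.array([[1, 0, 0],
                 [0, np.cos(angle), -np.sin(angle)],
                 [0, np.sin(angle),  np.cos(angle)]])
frame = [np.eye(3), bend, np.eye(3), np.eye(3)]
print(pose_skeleton(frame))   # world position of each joint for this frame
```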

By Jorge Escobar, CEO & Creative Director at Framemov