What’s it like working backstage at a New Year’s Eve concert by a star performer known for his world-record audiences in the millions?
What if that concert takes place during a pandemic lockdown, when the audiences have to stay home?
For the crew that pulled the Welcome to the Other Side virtual reality concert together, it felt nearly as frenetic as actually being there in person – and at times, even more exhilarating. What’s more, the millions in the (virtual) audience felt it too.
AR/VR consultant Antony Vitillo journaled the entire experience of putting on Jean-Michel Jarre’s Welcome to the Other Side (WTTOS), which featured the international electronic music star performing in a VR replica of the Notre-Dame Cathedral to celebrate New Year’s Eve with audiences around the world.
Racking up more than 75 million views, the show was clearly an international success. For a glimpse of the steps Antony and the team took using Unity to pull the show together, keep reading.
Three months is a tight timeline for any stage event – considering lighting design, sound design, concert visuals, pyrotechnics, and logistics – and it is especially rushed for a show that requires software development. But with New Year’s Eve on the horizon, the team initiated the project in September and began active work in early October.
When one of the collaborators, lighting choreographer Jvan Morandi, was brought into the project, he remarked, “This is scheduled for Summer 2021, right?” When he discovered that it was set for the end of 2020, he told Antony, “You guys are totally crazy.”
Stage by stage
The team – gathered by VR experience company VrOOm and led by its CEO Louis Cacciuttolo – had prior experience surviving rushed productions, and knew first-hand the value of building in a design phase. In this phase, performer Jean-Michel Jarre and the various experience artists determined the overall look and feel of the visual experience.
Once the design direction was decided, the team experimented within the selected event technology to understand what would be doable in the live experience.
“We chose VRChat as a platform because it was the one we were the most familiar with, having made all our biggest projects with it,” Antony says. “It is one of the most versatile social VR technologies, and also it let us use Unity, which we all knew very well.”
Modeling the scene
Then the team started assembling the scene in Unity using a model of the Notre-Dame Cathedral they found online. They worked with Lapo Germasi and Victor Pukhov of Manifattura Italiana Design (MID) to simplify it and optimize it for use in VRChat.
Creating the animations
Vincent Masson used Cinema 4D to produce the 3D animations that the audience could see all around them, while Jvan Morandi worked on the spotlight animations. Both worked on the 2D video mapping that was projected on the walls of the cathedral.
Another agency, SoWhen?, took care of creating a beautiful avatar of Jean-Michel Jarre using Maya. This was not easy, says Antony: an artist’s public image matters greatly, so the team had to create an avatar that was at once appealing to VR users and true to JMJ’s well-known style.
“Since we were using Unity, we used very standard asset types for all of these features: FBX models for the 3D animations, the avatars, and the spotlights; Light objects for the spotlight lights; and mp4 videos for the video mapping, projected with proper UVs on the walls of the cathedral,” says Antony.
The avatar was animated using the real movements of the body of Jean-Michel Jarre. A mocap suit compatible with SteamVR was employed to ensure accuracy of the musician’s physical moves, along with tracking gloves to reproduce his intricate hand movements when playing the electronic instruments.
Assembling the scene
Once all the material was ready, it was time to assemble it in Unity, polish everything, and optimize it to perform well in VRChat. Jean-Michel Jarre entered the scene in VR to explore the complete environment and ensure it harmonized with his artistic vision, then Germasi made a final polishing pass with Unity’s Post-processing effects to balance the fidelity of the Cathedral with the visual effects inside it.
Automating the animation
To make sure all the elements of the scene (video projections, colored spotlights, props) performed their actions in sync with the beat of every song in the show, Antony used Unity to time and trigger every action. Copying timecodes and programming each triggered event by hand would have been tedious, says Antony, but in Unity he could script the process: an editor script read the massive CSV of all the timecodes for the animations and lighting triggers, then generated all the required timed behaviours for each song on the fly, just by selecting a menu item in the Editor. This made the operation far more manageable.
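The article doesn’t show the actual editor script or CSV layout, but the core idea – parse a cue sheet of timecodes and generate a sorted schedule of triggered events – can be sketched as follows. The column names (`timecode`, `target`, `action`) and the sample data are hypothetical, purely for illustration; the real script ran inside the Unity Editor and created timed behaviours rather than printing a schedule.

```python
import csv
import io

def load_cues(csv_text):
    """Parse a timecode CSV into a sorted list of (seconds, target, action) cues.

    Assumes "MM:SS.s" timecodes and the hypothetical columns
    timecode/target/action -- the real WTTOS CSV layout isn't public.
    """
    cues = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        minutes, seconds = row["timecode"].split(":")
        cues.append((int(minutes) * 60 + float(seconds),
                     row["target"], row["action"]))
    cues.sort(key=lambda cue: cue[0])  # trigger order = playback order
    return cues

# Illustrative cue sheet; in the real project each cue would become a
# timed behaviour (spotlight change, video trigger, prop action) in Unity.
sample = """timecode,target,action
01:30.5,spotlight_03,color_red
00:12.0,video_wall,play_intro
"""

for t, target, action in load_cues(sample):
    print(f"{t:7.1f}s  {target} -> {action}")
```

The payoff of this approach is that re-exporting the CSV after a change to the show regenerates every trigger in one pass, instead of hand-editing dozens of timed events per song.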
Optimizing for social VR
According to Antony, VRChat not only demands a smooth frame rate (“you don’t want people to see the concert at 20 fps,” he says), but also requires optimizing for file size. “If the overall scene weighs more than 300 MB, your users will take too much time to download it, and sometimes things can also crash, so people won’t have a good experience.” The team at MID focused on optimizing the geometry of the Cathedral and played carefully with the size of the textures.
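Texture resolution is usually the biggest lever against a download-size ceiling like the ~300 MB Antony describes, because a texture’s footprint grows with the square of its resolution. A minimal back-of-the-envelope sketch (the compression rates and sizes below are general GPU-texture figures, not numbers from the WTTOS project):

```python
def texture_mb(size_px, bits_per_pixel=8, mipmaps=True):
    """Approximate GPU footprint of a square texture, in MB.

    bits_per_pixel depends on the compression format: block-compressed
    formats like BC3/DXT5 use ~8 bits per pixel, BC1/DXT1 ~4.
    A full mipmap chain adds roughly a third on top.
    """
    size_bytes = size_px * size_px * bits_per_pixel / 8
    if mipmaps:
        size_bytes *= 4 / 3
    return size_bytes / (1024 * 1024)

# Halving a texture's resolution cuts its footprint to a quarter:
for size in (4096, 2048, 1024):
    print(f"{size}x{size} BC3: {texture_mb(size):.1f} MB")
```

This is why “playing carefully with the size of the textures” pays off so quickly: dropping a handful of 4K wall textures to 2K reclaims tens of megabytes with a comparatively small visual cost.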
Working closely with Lapo Germasi and Victor Pukhov from MID, Antony made optimizations throughout and added the virtual systems and interactions that would emulate a live, immersive event.
- All the timing: The spotlights, the video mapping, and every other action had to be triggered at the right moment, in sync with each song, which required some creative solutions inside VRChat.
- All the interactions: Interactions are necessary for user engagement, especially on a social platform. To enhance the sense of presence in VR, the team also added effects that viewers could activate while watching the concert, such as the ability to “pop” other audience members into colored stars.
- All the elements needed to shoot the video: This being a 360-degree experience, the team didn’t want to provide a flat video from a single point of view. “We actually created a complex camera system inside VRChat so that teammates Georgy Molodotsov and Maud Clavier could also switch to a ‘multi-camera’ version of the concert for live streaming,” Antony says. They used an open source project to initialize the camera system, VRCLens to add a depth-of-field effect for handheld shots, and a 360-degree video recorder avatar by VoxelKei to shoot the after-party video.
- The after-party experience was designed by Russian XR artist Denis Semionov, who used Quill to sculpt the scenery and to develop a simplified version of the Notre-Dame model in the event a lightweight rendition was needed.
- MID optimized this model as well; once Antony and the team added the after-party interactions (for instance, the ability to board a Zeppelin for a flyover), the experience was complete.
The importance of VR
At one point during the design stage, some questioned whether the team should abandon the VR experience entirely and produce the show as a linear live-streamed video. But Jean-Michel, Antony, and other team members made the case for VR: the interactive experience, even when viewed on a flat computer screen, would make people feel as though they were really inside Notre-Dame, surrounded by other people on New Year’s Eve.
Ultimately, Jean-Michel trusted the team, and the program went on to accrue more than 75 million views – in the live VR experience itself and on the many other platforms that streamed the video experience. “We made history, performing a huge concert completely within virtual reality, but for people both inside and outside virtual reality,” says Antony. “You can only make something incredible if you are willing to take the risks to do it.”
Get the full story of the making of Welcome to the Other Side from a studio that developed the event. Peek behind the experience here.