Virtual reality and immersive concerts: An introductory primer
Guide by Jeremie Joubert
For the music industry, perhaps one silver lining of the COVID-19 pandemic is that many artists are making some of the most visually interesting creative work of their careers, especially around livestreaming.
With so much digital media vying for fans’ attention, artists and their teams have been experimenting with different techniques to make their online performances more visually captivating, often taking cues from the gaming and film industries. Some artists have deliberately compared their online concerts to “TV specials” in terms of production quality and intended viewing experience. Others have used game creation systems like Unity, Unreal Engine, and Dreams to give their virtual stages and music videos new levels of immersion. In the process, major gaming companies like Epic Games and Roblox, as well as more recent virtual- and augmented-reality ventures like Anything World, Jadu, and Supersphere, are emerging as essential partners to a music industry that previously would have siloed them off or deprioritized them as “separate” forms of entertainment.
Amidst this heightened activity, we thought it would be a good time to write an introductory primer to virtual reality and immersive media in online concerts. We want to highlight not just any typical Twitch stream or ticketed livestream concert, but rather those that have made use of cutting-edge technology, including VR, 3D design, and real-time motion capture, to deliver wholly new, imagined experiences for music fans.
In addition, we don’t just want to highlight basic definitions, market stats, or key companies around music and immersive media. We also want to present a clear, systems-level framework for understanding the shape that this rapidly-expanding landscape is taking today, observing years’ worth of experimentation both inside and outside the music industry. How do expectations line up with reality when it comes to the fan experience for immersive online music events? What is the tech and partnership stack behind some of today’s biggest VR concerts? And — most importantly for the future of touring — how, if at all, do any of these experiences make money?
Paradoxically, we’ve found that even though virtual reality and other immersive media formats are moving faster than ever in the music industry in terms of behind-the-scenes technical development, they seem to be staying in the same place in terms of actual revenue and consumer adoption.
✯ ✯ ✯
Definitions
For this article, we’ll borrow key definitions from the official glossary for Unity:
Virtual reality (VR) = Any experience consisting of “computer-generated stereo visuals which entirely surround the user, entirely replacing the real-world environment around them.” A first-person, “viewer-centric perspective” is crucial so that the user can interact in real-time with their environment (e.g., a performance designed for a third-person, bird’s-eye view would not be suitable for VR).
Because total immersion is crucial, VR headsets like the Oculus Quest, PlayStation VR, and HTC Vive, plus mobile VR add-ons like Google Cardboard and Samsung Gear VR, are the only ways to consume VR content as it is meant to be experienced.
360 video = Video footage in which “a view in multiple directions is recorded simultaneously,” using either a single omnidirectional camera or several separate cameras arranged in a spherical formation around the subject. Crucially, 360 videos can be interactive or non-interactive and do not have to be 100% immersive, which differentiates them from VR.
The Unity glossary also includes a definition of augmented reality, and examples abound of one-off AR partnerships in the music industry, including Snap’s lyrical Lens, Lego’s Vidiyo music-making app with Universal Music Group, and Live Nation’s AR offerings for music festivals. That said, for the sake of scope, we’ll focus the following discussion on VR and 360 videos.
Market reality check
While all these different kinds of realities fall on a spectrum, their market sizes are notably different.
Technically, anyone with an Internet connection and a web browser can enjoy 360 videos, and anyone with access to a camera (via a computer or smartphone) can be a potential AR consumer. That puts the total addressable audience for 360 videos and AR at around 4 billion people, or roughly 50% of the world’s population.
The market for VR software and hardware pales in comparison. Various reports have pegged total VR headset sales (not including mobile headsets like Google Cardboard) at around 4.9 million units in 2020, a 15% drop year-over-year. According to ARTillery Intelligence, the total installed (i.e., active) VR user base globally is only around 14 million people, or roughly 0.3% of the total addressable audience for AR and 360 videos.
Industry commentators have pointed to expensive price points (the Oculus Quest 2, considered the cheapest premium VR headset, goes for $299), a bulky user experience, and a rapid hardware development cycle as the main reasons why most consumers are still hesitant to purchase a headset, even as the underlying technology improves. As Ben Kuchera put it in Polygon: “The VR revolution has been 5 minutes away for 8 years.”
That said, we will not dismiss VR’s future in the music business, for two reasons. Firstly, those who already own a VR headset have proven that they are willing to spend a lot of money on premium entertainment. According to ARTillery Intelligence, revenue from VR content and software in 2020 was around $1.6 billion, implying average VR content spending of over $114 per active user.
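As a back-of-the-envelope check, the figures cited above can be tied together with simple arithmetic. The inputs below are the rounded estimates from this section, not exact market data:

```python
# Back-of-the-envelope market math using the approximate figures
# cited above (all inputs are rounded estimates, not exact data).

ADDRESSABLE_AR_360 = 4_000_000_000       # ~4B people with a browser/camera
VR_INSTALLED_BASE = 14_000_000           # ~14M active VR users (ARTillery)
VR_CONTENT_REVENUE_2020 = 1_600_000_000  # ~$1.6B in VR content/software

# Active VR users as a share of the AR/360 addressable audience
vr_share = VR_INSTALLED_BASE / ADDRESSABLE_AR_360
print(f"VR installed base: {vr_share:.2%} of the AR/360 audience")  # 0.35%

# Implied average content spend per active VR user
arpu = VR_CONTENT_REVENUE_2020 / VR_INSTALLED_BASE
print(f"Implied VR content spend per active user: ${arpu:.0f}")  # $114
```

Small as the headset audience is, that ~$114 of implied annual content spend per active user is what makes VR owners an attractive premium niche.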
Secondly, because VR still has a long way to go before rivaling AR and 360 videos in terms of total addressable audience, we are not advocating simply for a VR-or-nothing approach; instead, industry executives could look to VR as just one component of a broader media strategy, as we’ll discuss towards the end of this piece.
✯ ✯ ✯
Introducing our Interaction vs. Immersion framework
From our earlier definitions, two key concepts seem to underpin most virtual experiences: Interactivity and immersion.
We define interactivity as the user’s or viewer’s ability to decide how they progress through a given environment. Let’s take a virtual concert as a hypothetical example. A base level of interaction would allow viewers to move around the venue and change the angle or vantage point from which they view a given experience; they might also be able to engage with the performer by tipping them, shouting them out in a live chat, or voting on a setlist in real time. On a deeper level, viewers could feel present via a first- or third-person digital avatar that can grab digital objects (e.g., empty glassware) or chat and dance with fellow concertgoers, also represented as avatars.
Immersion refers to the depth of the environment that surrounds the fan. For a virtual concert, to what extent is the concert environment replacing the viewer’s real-world environment? If the experience is genuine VR and 100% immersive, is the viewer watching the performance from a virtual GA pit or balcony (as is the case with Venues, Oculus’ official live entertainment app), or can they find themselves in a front-row seat or literally in the middle of the show (as was the case with Travis Scott’s Fortnite event)?
We looked at nearly a dozen virtual concerts from the last year and assigned each one an interaction and immersion score. From this research, visualized in the chart below, four distinct categories emerged:
Visualizing key case studies from more tech-forward virtual concerts in this way allows us to clarify which shows are truly VR, versus which ones are perhaps “immersive” but ultimately passive viewing experiences.
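The two-axis framework behind the chart can be sketched in code. The scores, threshold, and example placements below are hypothetical placeholders for illustration, not the actual values behind our chart:

```python
from dataclasses import dataclass

# A minimal sketch of the interaction-vs-immersion framework.
# The 0-10 scale, the midpoint threshold, and the quadrant mapping
# are illustrative assumptions, not the actual chart data.

@dataclass
class VirtualConcert:
    name: str
    interaction: float  # 0 = passive viewing, 10 = full user agency
    immersion: float    # 0 = flat 2D stream, 10 = fully surrounds the viewer

def categorize(show: VirtualConcert, threshold: float = 5.0) -> str:
    hi_int = show.interaction >= threshold
    hi_imm = show.immersion >= threshold
    if hi_int and hi_imm:
        return '"True" VR concert'
    if hi_imm:
        return "360-video concert"
    if hi_int:
        return "In-game concert"
    return "2D streaming of a 3D concert"

# Hypothetical placements on the two axes
shows = [
    VirtualConcert("Animated 2D livestream", interaction=3, immersion=2),
    VirtualConcert("360-degree concert film", interaction=3, immersion=7),
    VirtualConcert("Battle-royale in-game show", interaction=8, immersion=4),
    VirtualConcert("Headset-native VR festival", interaction=8, immersion=9),
]
for s in shows:
    print(f"{s.name}: {categorize(s)}")
```

The point of the sketch is the quadrant logic: only experiences scoring high on both axes qualify as “true” VR, which is exactly the distinction the chart draws.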
Let’s now take a closer look at each of these categories:
2D streaming of 3D concerts: Maximum scale, minimum immersion
In this first category, we include virtual events that appear to be made in VR but are not immersive and offer little to no interactivity. These events are often marketed as “virtual-reality events,” but usually can’t be joined through actual VR gear; instead, they can only be viewed in 2D form via a computer or mobile device.
A few examples from the last year include Wave’s animated concerts with John Legend and The Weeknd (June and August 2020, respectively) and the virtual edition of Tomorrowland, titled “Around The World” (July 2020).
Wave built its early reputation on its now-shuttered social VR app, and has brought a similarly visual-first mindset to its recent shows with stars like Legend and The Weeknd, which make use of the Unity game engine and full-body motion capture suits from Xsens to render avataristic versions of performers in real-time. Most of these shows have also been what Wave’s founder and CEO Adam Arrigo calls “gamified,” in the sense of allowing fans to interact with artists and vote on upcoming songs or stage designs via live text chat during a stream.
Importantly, though, these shows have been able to reach hundreds of thousands of people at once only because they were made available solely on 2D platforms like YouTube, Facebook, and Twitch, not on VR devices. Hence, the level of interaction allowed for the viewer is not that different from what any typical Twitch stream can offer; the only difference is the technology used to visualize the results of such interaction.
[Screenshot of The Weeknd’s concert with Wave, which aired on TikTok in August 2020.]
The tech stack and distribution strategy behind Tomorrowland Around The World were similar. The video content itself was all pre-recorded days or weeks ahead of time using green screens, then overlaid with CGI elements using Unreal Engine and camera-tracking technology from stYpe. While these elements were visually striking, the festival was ultimately broadcast live for passive consumption, like an online movie premiere, on the festival’s virtual venue grounds. The only sense of interactivity from the fan’s perspective was navigating different festival stages by clicking on a map. This led 52% of the festival’s virtual audience to feel that there was not enough interaction with other attendees — a feature that is baked into the IRL festival experience.
[Screenshot of the virtual stage for Tomorrowland Around The World, which was created using Unreal Engine but only available for viewing in 2D form.]
The Tomorrowland team had made clear that their digital events would be accessible to fans without any VR gear and best watched “on a big screen like a computer or television.” Financially, this turned out to be a better call, as it allowed the online event to sell nearly 140,000 tickets and attract about one million total viewers. Even still, the festival was unable to turn a profit, which speaks to the immense difficulty of building a business around music livestreams, especially for events that aim for higher production quality or focus on distribution platforms like Facebook and YouTube, where fans are not used to paying for content.
Perhaps financial viability for this model of virtual concerts will come from balancing several different kinds of monetization at scale (e.g. merch bundles), or from extending the shelf life of a livestream with a broader waterfall release strategy. For instance, Tomorrowland re-released on-demand versions of some of its artists’ live sets from Around The World exclusively on Apple Music. Jean-Michel Jarre made the audio of his recent VR concert Welcome To The Other Side available on multiple streaming platforms. The Weeknd released two on-demand clips from his Wave show as alternate music videos (for “Save Your Tears” and “Blinding Lights”), which have collectively received over 21 million views to date.
360-video concerts: A gift for fans, but falling short of what VR has to offer
Another category of virtual performance is the broadcasting of concerts filmed with 360-degree cameras.
Usually, viewers of 360-video concerts can modify the camera angle from which they watch the show, either by turning their head or pressing buttons on a controller. If they’re lucky, they can also move around the stage, getting close to the artist or the other performers. But the footage for these experiences is usually prerecorded, making them ultimately non-social experiences, with no ability for viewers to interact with each other or with the artist in real time.
For this reason, many online concert experiences and apps that are billed as “virtual reality” are actually just 360 videos that happen to be available only on VR headsets. Sony’s partnerships with Joshua Bell and Tom Grennan, which had the artists perform while being filmed with omnidirectional cameras for PlayStation VR, fall under this category.
[Screenshot of the Joshua Bell VR Experience, which is available as a free download on Sony’s PlayStation VR.]
In some cases, 360-degree concerts billed as “VR” don’t even require a VR headset to watch. For instance, in April 2020 the rock band Halestorm aired a “Virtual Reality Concert” from their tour archives on their YouTube channel. Unless you owned a Google Cardboard setup, the only way you could view the footage was as a 360 video on your 2D device.
Similarly, despite its name, the concert streaming app MelodyVR releases all of its show footage both for VR headsets and for its iOS and Android apps — making it possible for users to view MelodyVR shows in a 360-video format on their mobile devices without being 100% immersed.
[Screenshot of a 360 video that The Head and the Heart filmed at The Sylvee in Madison, WI for MelodyVR.]
Unsurprisingly, monetization of 360 videos is also a significant challenge. MelodyVR, perhaps the only company trying to build a more extensive service around curating and aggregating 360 concert videos, relies on a mostly transaction-based business model, allowing fans to purchase entire shows or even individual songs for a flat fee. But in the first half of 2020, the company reported only around £190,000 in revenue against £11.6 million in operating expenses.
Sony’s PSVR projects with Bell and Grennan — and similar VR projects at other larger tech companies, such as that between Microsoft and Childish Gambino — were likely structured as brand partnerships with flat fees for the artists involved, rather than as long-term revenue opportunities. Even if tech companies might be fronting production costs as a way to drum up demand for the platform, that alone is hardly a clear path to scaling this kind of experience to millions of artists and fans.
In-game concerts: New fantasy worlds for music
Let’s move further on the immersion and interactivity scales, looking at live music experiences online that take place natively within video games.
Notably, the concept of interactive online concerts in virtual, animated environments is nothing new: It dates back at least 15 years to Second Life, the online virtual world that dozens of artists and labels used to build their venues and connect with fans in avatar form. Travis Scott’s show in Fortnite, which took place in April 2020, merely reworked this tradition for the modern gaming industry.
Scott’s performance was designed to happen within the game; the rapper’s avatar was turned into a larger-than-life giant towering over Fortnite’s Sweaty Sands Beach. While the entire show was prescripted, offering no genuine two-way interaction between Scott and his fans in real time, the “venue” was the entire game environment itself, which by nature relies on user agency to function. Players could run around Scott’s character and control the vantage points from which they viewed the show simply by moving their avatars around, just as they would during a typical Battle Royale round.
[Screenshot of Travis Scott’s show in Fortnite in April 2020.]
More recent music performances in Fortnite from the likes of Diplo and J Balvin were not nearly as immersive or interactive. These performances were simply 2D videos streamed through a screen insert in a dedicated, non-combat area of Fortnite called Party Royale. While fans had to run to Party Royale’s Main Stage to catch the concert at the allotted time, they could not run around 3D versions of the artists’ characters, nor change their camera angle on the show as they could with Scott’s performance. This approach doesn’t take advantage of Fortnite’s unique capabilities as a fully immersive venue; in fact, it ultimately feels like a slightly flashier version of watching a YouTube video. (There’s probably a reason why Party Royale hasn’t hosted any new concerts in this format since October 2020.)
[Screenshot of fans watching Steve Aoki’s Party Royale set in Fortnite in May 2020.]
For artists like Scott who can partner with Epic Games and Fortnite on more integrated in-game event experiences, the return on investment is impressive — spanning a Coachella-sized performance fee, in-game merch sales, and a significant follow-on bump in music streams and sales. Other gaming properties like Grand Theft Auto V are also developing in-game venues with regular “residencies” from real-world artists, creating more opportunities for artists to have a similar cultural impact.
But due to the production and engineering costs involved, this model is hardly scalable across more than a handful of artists every year, making it far from a likely contender for the “future” of online concerts as a whole. Perhaps a more accessible, diversified model for in-game concerts will come from social gaming platforms like Roblox — which is partnering with labels like Warner Music and Sony Music on bespoke music events (as we saw with recent shows from Ava Max and Lil Nas X), but also allows anyone to build their own gaming experience.
“True” VR concerts: 100% immersive performances in 100% virtual venues
To reiterate, while media coverage may have suggested otherwise, none of the examples mentioned above are “true” VR experiences. So, which ones are? We’re looking for events that meet all three core requirements for VR, as per our earlier definition: 1) Visuals that surround the user, 2) a first-person, viewer-centric perspective, and 3) real-time user interaction with the virtual environment.
One example is Oculus’s live entertainment app Venues, which will feature upcoming shows from the likes of Sofi Tukker and Big Freedia. While audience members still cannot interact with performers in real time during Venues events (as the videos are recorded in advance), they can interact with other audience members and travel together from one “show” to another in a more social manner, with the ability to gather in groups in a central “lobby” area. The latest beta version of Venues also allows viewers to watch a given show from multiple vantage points in the audience, e.g., choosing between mezzanine or box seats.
Oculus has recently partnered with the VR live-events company Supersphere to produce higher-quality shows for the Venues app with artists like Steve Aoki and Major Lazer — using stage and lighting designs that these artists would have used on their IRL tours.
[Screenshot of Supersphere’s venue template for concerts in Oculus Venues. Source: 60 in 6 on Quibi]
On a larger festival scale, the virtual Lost Horizon Festival — created by the team behind Glastonbury’s Shangri-La area — hosted performances throughout the second half of 2020 in the VR app Sansar, in partnership with the VR-focused live-events production company VRJAM. The first festival took place over two days in July 2020 and boasted four simultaneous music stages featuring the likes of Fatboy Slim and Peggy Gou, alongside dozens of films and visual artworks on exhibition for festival-goers to experience outside the concerts themselves. Virtual festival-goers were greeted by “hosts” (i.e., humans in avatar form) at the gate, and could socialize and dance with each other.
In December 2020, Lost Horizon continued the festivities with its “December Series” of featured performances from the likes of Nicole Moudaber, Pan Pot, Krust and Infected Mushroom, alongside several underground electronic artists. The visualized concert setup itself replicated much of an actual festival set: The artist perched onstage, the crowd dancing in the pit, a stage barrier the only separation between the performer and the front row. Hence, while Lost Horizon was technically viewable via Sansar’s web and mobile apps as well, the experience was clearly designed with a fully immersive, interactive and social VR setup in mind.
[Screenshot of virtual festival-goers having fun at Lost Horizon. Source: VRScout]
Once again, the underlying business models for “true” VR concerts are nebulous at best. To date, all of Supersphere’s shows in Oculus Venues have generated precisely $0 in ticketing revenue (audience reservations were free) and had no brand sponsors, aside from perhaps Facebook itself. Lost Horizon technically charged $10 per ticket for its December series on the official Sansar website, but they also offered unlimited free premium passes to HTC Vive device owners. We have yet to find a successful case study of a VR-native, interactive concert that has broken even thanks to ticket sales or direct fan support.
This also means the “only” barrier to entry for these shows is owning a VR headset, which will set fans back at least $300, all of which goes to the hardware manufacturer instead of the artist or concert producer. VR event promoters also have to balance an inevitable divide between the “haves” and the “have-nots”: offering a fantastic, immersive experience for the small minority of fans who own a headset, and a flatter, 2D rendering for everyone else that might not be as compelling.
That said, because they offer the highest levels of immersion, interactivity and intimacy, “true” VR shows may actually present the clearest path to a multimodal approach to monetization in the longer term. As we established earlier, VR users, while small in number, spend more on entertainment; this suggests that the price of true VR concerts could inch closer to that of an IRL concert (outside of music, premium VR experiences are going for as much as £32 a ticket). Sansar also launched a native tipping feature within Lost Horizon, allowing fans to tip artists anywhere from $0.01 to $50 directly, which will likely become table stakes for true VR concert experiences moving forward.
✯ ✯ ✯
Conclusion: The VR gambit
The above discussion suggests that if we’re honest with ourselves, VR still has a long way to go before it can power a new generation of online concerts and music experiences.
Put simply: Immersion is expensive, and its business model is unproven. While the technology behind VR concerts can be groundbreaking, we have yet to find any approach that makes these experiences scalable and repeatable in a sustainable way. VR headsets are still too expensive for the average consumer, while mere 360 videos fall short of the unique capabilities that VR as a format has to offer.
Where do we go from here? As we previously mentioned, the vast majority of artists cannot, and should not, go “all-in” on VR and immersive media as their only performance outlet. Instead, the technology can comprise one component of a broader media strategy.
Right now, the value of a VR concert for the artist is much more about landing headlines in the right media publications than about making any actual money. This isn’t inherently bad, as a VR event can help artists build their brands by reaching early-adopter and tech-enthusiast demographics. Once touring resumes, a VR gig could also become an additional ticketed stop on a tour, perhaps in conjunction with a 2D livestreaming approach to ensure broader reach. Events and venues that have managed to build a distinctive brand experience can capitalize on this strength by creating virtual worlds within VR apps, hosting shows and audiences as they do IRL.
With this in mind, there needs to be a clearer path in the music industry for independent and emerging artists to experiment more with VR and other immersive media formats.
Unsurprisingly, many of the most exciting initiatives on this front are led by independent artists themselves. The UK-based band Miro Shot — which performed a “virtual worlds tour” in summer 2020 across multiple VR apps such as Sinespace and AltspaceVR — launched their own startup Overview Ark to help other artists and venues do the same. Vaporwave artist George Clanton recently launched an online initiative called Virtual Utopia, which offers tutorials on 3D design tools like Unity, Cinema 4D, and Blender, as well as a matchmaking and microgrant program for musicians and 3D artists in partnership with Red Bull.
Sansar offers resources for DIY creators to host virtual experiences in pre-designed worlds, using 3D elements that can be purchased from their store. Other platforms and game studios would do well to follow a similar strategy, opening up and clarifying the required steps for up-and-coming artists to access performance opportunities in their respective spaces. If the music industry historically thrives on discovery and diversity, where tomorrow’s star is today’s developing artist, that should be reflected in the emerging technologies we adopt as our “future.”
Ultimately, we take an optimistic view of VR’s place in the music industry moving forward. If the technology finally hits the mainstream, it might just allow live music to do online what it already does best IRL: Make people dream.
✯ ✯ ✯
Jeremie Joubert is the founder of FAN LABS, a strategy and digital consultancy for the music industry. He previously wrote for Water & Music about how to build a genuinely fan-centric online music experience. You can catch him on Twitter or LinkedIn.
Cherie Hu contributed additional reporting and editing to this guide.
Read more of our coverage on emerging tech and startups in the music industry: