Immersive VR at Glastonbury Festival with Lost Horizon
by Coral Manton, Naomi Smyth, Ana Levordashka
Creative Computing Research Group - Coral Manton
The Bath Spa University Creative Computing Research Group has been working with Lost Horizon on their project to create a new digital twin venue for sharing live performance in online virtual spaces. Working with Naomi Smyth, we have been developing new tools to bring digital and physical audiences together and to recreate, in a virtual venue, the emotional connection people get from being together and dancing together at gigs. This year we brought the project to a bespoke VR venue, built around a recycled shipping container, to test with people in the Nomad area of Shangri-La at the 2022 Glastonbury Festival - the first in-person Glastonbury after the pandemic.
The VR venue we developed for the festival was built using a 3D model of Lost Horizon in Bristol. We used head and hand tracking, alongside VFX designed to emphasise user movement, to create a sense of dancing together for the people in the VR venue. Seeing and hearing people recognising and connecting with friends through movement in VR was special. Alongside this, we brought the people dancing IRL outside the venue into the VR world using a bespoke AI people-tracking system built around an OpenCV depth-sensing camera and a detection model originally trained on a retail-surveillance dataset, which we repurposed for this project. Inspired by the ethos of Shangri-La - a space of community, protest and looking for alternative ways of being - we created an ‘AI Hall of Mirrors’ that made predictions about the future based on computer vision ‘seeing’ and analysing festival-goers, as a way to strike up conversations about our relationship with AI and surveillance in public spaces.
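To make that pipeline a little more concrete, here is a minimal sketch of how people in front of a camera can be detected and streamed into a VR scene as floating orbs. It assumes OpenCV’s off-the-shelf HOG person detector and the python-osc library for sending messages to the machine running the VR venue; the camera, the retail-surveillance model and the message format actually used at Glastonbury are not reproduced here.

```python
# Minimal sketch of a people-tracking pipeline: detect people in a camera feed
# and send normalised positions to the VR engine over the local network.
# Detector, addresses and message format are illustrative assumptions.
import cv2
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical address/port of the machine rendering the VR venue.
osc = SimpleUDPClient("192.168.0.10", 9000)

# Off-the-shelf HOG person detector bundled with OpenCV.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture(0)  # camera index 0; a depth camera would use its own SDK
while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
    for i, (x, y, bw, bh) in enumerate(boxes):
        # Normalise the centre of each detected person to 0..1 so the VR scene
        # can place a floating orb at the corresponding position.
        cx, cy = (x + bw / 2) / w, (y + bh / 2) / h
        osc.send_message("/dancer", [i, float(cx), float(cy)])
cap.release()
```

In the VR engine, each /dancer message simply moves an orb to the normalised position; a depth-sensing camera would additionally supply a distance estimate so the orbs can be placed at a believable depth in the scene.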
AI Hall of Mirrors at Shangri-La Glastonbury 2022, Image Credit - Dave Webb
As a group we love to experiment with new technology, but for us the best thing about the Shangri-La experience was talking to people, sharing new approaches, and imagining alternative ways of living together with technology in the future. Through taking part in curated talks on the Nomad stage and talking to other creators and people coming to the venue, we had countless thought-provoking and inspiring discussions about our relationships with technology, including surveillance, the potential of online virtual spaces, the importance of community ownership, open data, education, climate change and inclusive online spaces. We are looking forward to developing the project and working with this amazing community again soon.
The Creative Computing team was Coral Manton, J Ponte, Samuel Sturtivant and Dave Webb. The venue coordinator was Naomi Smyth, a theatre maker and current PhD researcher at Bath Spa University.
Venue co-ordination and production - Naomi Smyth
I’ve written, designed, directed and performed immersive, interactive and/or participatory performances at UK festivals and theatres since 2007. I’ve done this with Kaye Dunnings at Shangri-La every Glastonbury since 2009.
Over the pandemic, the long separation of audience and performer had a profound effect on all the performing arts. Perhaps none more so than immersive and participatory theatre, a form that has often relied on everybody moving and interacting at close quarters, touching the same objects, sharing food and drink, shouting and laughing.
My PhD research with Bristol+Bath Creative R+D began in 2019 and was originally intended to explore how immersive theatre artists were incorporating emerging technologies into their in-person productions. Instead, COVID forced me to turn towards what was unfolding as immersive and participatory performance makers began to use remote-access technologies in place of in-person interaction.
Live testing of the VR work at Shangri-La Glastonbury 2022, Image by Dave Webb.
Many immersive performance artists, including myself, were drawn to social VR. VR can give an embodied ‘feel’ of place and social context that is imperfect, but can be more powerful than a two-dimensional medium like Zoom.
The Shangri-La creative directors I have worked with for over a decade mirrored the shift in my research by creating Lost Horizon in the existing social VR platform Sansar. In their virtual festival site and nightclub, international audiences moved and communicated using avatars, nonverbal movement and dialogue with others in VR. Lost Horizon have since developed their own app to a level where it can host volumetric video of performers, but the number of avatars and inputs it can host is limited.
As venues open back up, it would be a shame to lose a whole section of the audience: people who attended shows remotely that they would or could never have travelled to physically. Our R&D goal at Lost Horizon is to create a hybrid venue that can harness the live presence of the crowd, performers and space and bring that to people attending remotely in VR or in a 2D web browser. This must be done in a way that feels authentic but doesn’t overburden processors and servers: very high data flows limit access to those with the most expensive technology at home. This is the access problem with Sansar and the reason for the R&D.
Our installation at Shangri-La was a showcase of our progress and plans towards this goal. We called our venue ‘Party Nerds’ and originally intended to programme various social and creative VR experiences. However, there was a site-wide dearth of internet, and despite having a connection installed we could barely perform software updates over it, let alone run social VR online. Having predicted this, I worked with the Creative Computing Research Group, who designed an elegant ‘backup’ solution that ran everything over a Local Area Network.
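As an illustration of what a LAN-only fallback can look like (a hedged sketch under assumed names, ports and update rates, not the venue’s actual implementation), each machine on the local network can broadcast its own headset pose and listen for everyone else’s, with no internet connection involved:

```python
# Minimal sketch of sharing live state over a Local Area Network instead of the
# internet: every machine broadcasts its own avatar pose to all peers on the LAN.
import json
import socket
import time

PORT = 5005            # hypothetical port shared by every machine in the venue
MY_ID = "headset-1"    # illustrative identifier for this machine

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)  # allow LAN broadcast
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))  # listen for the peers' broadcasts on the same port
sock.setblocking(False)

def read_local_pose():
    # Placeholder: in practice this would come from the headset's tracking API.
    return {"id": MY_ID, "pos": [0.0, 1.6, 0.0], "t": time.time()}

while True:
    # Broadcast our own pose to everyone on the local subnet.
    sock.sendto(json.dumps(read_local_pose()).encode(), ("255.255.255.255", PORT))
    # Drain any poses broadcast by other machines since the last tick.
    try:
        while True:
            data, _addr = sock.recvfrom(4096)
            peer = json.loads(data)
            if peer["id"] != MY_ID:  # ignore our own echoed broadcast
                print(f"peer {peer['id']} at {peer['pos']}")  # update their avatar here
    except BlockingIOError:
        pass
    time.sleep(1 / 30)  # roughly 30 updates per second
```

Because everything stays on a local switch inside the container, the experience keeps working even when the festival’s internet connection cannot sustain a software update, at the cost of being limited to the people physically present on that network.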
We delivered the experience Coral describes above as the best and most stable demonstration of combining the physical location-based ‘liveness’ of the people moving in front of the venue with the digital ‘liveness’ of the people inside in headsets. It was a strong proof of concept that demonstrated the possibilities quickly and clearly to the audience. We gave them something fun and unique to play with, while the AI installation on the outside contextualised the techno-utopianism VR can inspire with valid concerns about data surveillance, bias and social and political control.
I can’t wait to do more with this fantastic team, and hopefully will be able to create and perform some shows next time.
Audience Experience of the Lost Horizon Virtual Twin Venue - Ana Levordashka
The University of Bath Human-Computer Interaction lab (CREATE) assisted the Bath Spa University Creative Computing Research Group in showcasing the Lost Horizon digital twin venue at Glastonbury Festival, guiding attendees through the virtual reality experience, conducting observations, and collecting informal feedback from participants and the production team.
Using tethered HTC Vive headsets, participants spent around 10 minutes exploring the virtual Lost Horizon venue—a darkly lit space in which participants appeared as floating headsets, their hand movements leaving a lingering luminescent trace. Also visible in the venue were floating orbs representing people standing outside the container, tracked in real time with an object-recognition camera.
VR stream of the Lost Horizon Venue in Bristol, Glastonbury 2022, Image Credit - Ana Levordashka
As psychologists, we were interested in understanding whether and how extended reality (XR) technologies, such as VR, can produce emotionally rich and psychologically immersive shared experiences, as well as in assessing the challenges, such as motion sickness and busy surroundings, of using VR at public live events.
Overall satisfaction with the experience and the concept of the Lost Horizon virtual twin was remarkably high. Participants enjoyed exploring the space, dancing, and interacting with other visitors. Hybridity—the fact that the venue also exists physically—produced strong positive reactions, especially in people who had been to Bristol or had enjoyed Shangri-La, the Glastonbury area created by the team behind Lost Horizon. Hybridity also helped alleviate concerns, raised occasionally, that the Metaverse is isolating and devoid of reality. Over four days of showcasing, we observed no adverse reactions to VR.
Participants experienced high levels of immersion, becoming unaware of a particularly busy external environment. For some, immersion was instantaneous, and they began exploring the environment with little further guidance. For others, it helped to suggest looking around, taking a couple of physical steps, squatting, or glancing at their virtual hands. The most engaging element appeared to be interactivity—the luminescent trace following hand movements. Participants often expressed interest in having more objects to interact with. Some struggled to see the floating headsets and to recognise them as other people, suggesting the need for further research into effective body representation.
Showcasing the Lost Horizon hybrid venue was an important experiment in creating psychologically meaningful mixed-reality experiences. We saw clear interest in hybridity—using VR to access familiar or existing places, rather than ones developed in and for VR. That visitors with prior associations with the physical location responded most strongly suggests that hybridity could be key to engaging virtual audiences.
Access the programme of Lost Horizon events on their website.