XR Stories Residency Diary: Real Time Stories
Introduction
Olivia Kitteridge is a storyteller and founder of Real Time Stories. Olivia and writer Joe Willis are developing their own interactive branching narrative in Virtual Reality (VR), called ‘Tech Support’.
The piece is a satirical comedy set in a world where AI assistants receive therapy to help them perform better. The experience explores how we treat and interact with technology.
The original plan for the residency was to explore different technologies (including VR headsets, mocap suits, LiDAR cameras, and audio recording equipment) to decide which kit would be best suited to produce a short taster experience of Tech Support.
Actor Dylan Silke-Browne will join the project team to play the central character in the piece, Sammy – an AI mentor who guides the audience through the experience.
Pre-Residency
Olivia: I usually work as a developer, programmer or 3D artist on other people’s projects. But Tech Support is completely mine and Joe’s conception. This year I was just like, “let’s just make a prototype”.
Joe’s a film director and screenwriter – we met at an ‘Introduction to Tech’ event run by GeoStories and Vanitas Arts. Joe’ll be here Tuesday – Thursday. Dylan – our actor – is coming in on Thursday (and potentially Friday) to do some mocap acting and record some lines.
We’re coming into this and already there are ideas, and that’s brilliant. That alone is a big difference. There are limits to what I can do at home, so it’s exciting to try out the new tech.
Day One
Olivia: Today we’re going over Research and Development. We’ve discussed how a live version of Tech Support would work.
We’re always exploring different versions – what could we make with £100, £1000, £10,000? Finding the point where we can support ourselves and the project, but not be stressed out.
Today has been about narrowing down options. Tomorrow, when Joe’s here, we’ll be refining the story and looking at what we might change. I’ve started putting together a script.
We’re discussing what could be best for the project – although we’ve come in with an idea and an early VR demo, we wanted to be open. VR might not even be the right mode for it. I’m going to think about creating a script for a live version that’s more of a performance piece. We went over some tech too, including some of the haptics – looking at what’s available, what we can try, what’s online. We’ve also been looking at MetaHumans.
Tomorrow, we’ll start on the narrative – the story and script. Then on Wednesday, we’re planning to work on the gameplay, so that might be testing haptics and hand-tracking. If we make enough progress tomorrow, maybe we start looking at that gameplay a bit early and maximise the time.
Day Two
Olivia and Joe: Now we’re moving towards a version of Tech Support that is a live performance. It’s a change, but one we’ve always had in mind. A year ago, when we started applying for funding, that live element was integral. We backed away from it because elements like venue costs are astronomical for a project like this. But we’re taking advantage of the fact that we’ve got access to this technology during the residency. We have the chance to explore and, in the future, be able to show funders that we’ve done this.
So now there are two sides to Tech Support, but both are exciting. The Unity project that we were working on before this week is more of a game – whereas this new version is more of a performance. They’re almost sister experiences. The Unity VR version gives us the chance to engage more people, but there are elements that we can add to a live version that will make it a more immersive experience. It’s a bit like what Pilot Theatre have done with Monoliths. They have their version that you can view through the headset, but they also have the interactive, more experiential version. Right now we’re questioning how that live element affects the existing story – particularly a pivotal reveal that comes late in the story. Joe has built this incredible diagram of the branching plot today.
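For anyone wondering how a diagram like Joe’s translates into something playable, a branching plot is essentially a small graph of story beats and choices. Here’s a minimal, purely illustrative sketch in Python – the node names and lines are placeholders, not the actual Tech Support script:

```python
# A hypothetical sketch of how a branching plot can be stored: a graph of
# story beats, each with the choices that lead onward. Names and lines
# below are placeholders, not the actual script.
from dataclasses import dataclass, field

@dataclass
class StoryNode:
    line: str                                             # what the character says
    choices: dict[str, str] = field(default_factory=dict) # user reply -> next node id

STORY = {
    "intro": StoryNode("Hi, I'm Sammy! Shall we run through the basics?",
                       {"yes": "basics", "no": "skip"}),
    "basics": StoryNode("Great, let's start with your hands..."),
    "skip": StoryNode("Confident! Straight to the deep end, then."),
}

def advance(node_id: str, reply: str) -> str:
    """Follow one branch; stay on the current node if the reply matches nothing."""
    return STORY[node_id].choices.get(reply, node_id)

# e.g. advance("intro", "yes") -> "basics"
```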
We’re hoping Dylan can come in tomorrow, to test the motion capture. Our plan is to use mocap to record a tutorial. If we have time after that, we’ll record one of the later chapters too. We can then use that live recording for models and assets within the game version too.
Tomorrow should be more tech-focused. We’re using Unreal Engine for the live experience, and we need to make sure we’ll be able to play audio within the experience. Then we’re hoping we can do a run through on Thursday.
Day Three
Olivia, Joe and Dylan: Today has been a lot of art, and a lot of tech. We’ve been going over how we want Sammy, our tutorial character, to look. XR Stories set up an avatar for Dylan, which we’re currently connecting. Olivia has been building the background for our tutorial scene – this futuristic office.
We’re glad we’ve done the suit testing today, rather than waiting for tomorrow. Once we’ve seen how the MetaHuman moves, it’s just about dropping it into the tutorial scene, putting it all in a headset and giving it a go.
We’re looking at how to do facial capture outside of the lab. Olivia had seen a video about someone who records facial capture using a free phone app and Blender. We’ve tried the app ourselves, and it looks pretty cool! Dylan’s wearing a Rokoko head rig that the phone can be attached to, with the phone providing the facial capture. The plan for tomorrow is to get some recorded footage and audio. We’ll take those recordings, Olivia can put them in Blender, and we’ll see how it looks.
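As a rough sketch of that workflow: phone apps like this typically export a CSV of per-frame blendshape weights, which a few lines of Blender Python can turn into shape-key keyframes. The file path, mesh name and column names below are all hypothetical – we’re not naming the exact app or its export format:

```python
# A minimal sketch of bringing phone-recorded facial capture into Blender,
# assuming the app exports a CSV of per-frame blendshape weights (as
# ARKit-based apps commonly do). The path, mesh name and column names are
# hypothetical placeholders.
import csv
import bpy

CSV_PATH = "/path/to/face_capture.csv"  # hypothetical export from the phone app
mesh = bpy.data.objects["SammyHead"]    # hypothetical mesh with matching shape keys
shape_keys = mesh.data.shape_keys.key_blocks

with open(CSV_PATH, newline="") as f:
    reader = csv.DictReader(f)
    for frame, row in enumerate(reader, start=1):
        for name, value in row.items():
            key = shape_keys.get(name)  # e.g. "jawOpen", "eyeBlinkLeft"
            if key is None:
                continue                # skip timecode or unmatched columns
            key.value = float(value)
            key.keyframe_insert("value", frame=frame)
```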
It’s Dylan’s first time doing mocap. The Rokoko rig squeezes his skull a little, but other than that it’s been a lot of fun.
Joe has also found another opportunity for future support later in the year. It’s the Immersive Fiction Lab – a talent development programme. The work we’re doing this week would give us something to show them, to help our application. The specific remit is ‘creatives exploring humour in XR’. We were like “Oh my God, this is our avenue!” – it couldn’t be more perfect timing.
This residency is what we were always looking for – something that gave us the space to focus on the concept. It’s a project we really love, but we needed this time to work on it and try different things. We’ve had times where we would talk for an hour or two on the phone, and go over it. But it’s been nice to have a full week dedicated to it. We feel like we’re on holiday, but it’s the most productive holiday possible. It also helps you to back yourself. When you’re applying for a million and one funds, you naturally get pulled in different directions trying to fit the remit of the funds. So, without meaning to, you start diluting the original vision. This week has allowed us to reestablish the core tenets – getting back to that idea of a live element.
Day Four
Olivia, Joe and Dylan: Today we’ve been blocking out the physical scene for the recording: marking where Dylan should stand, where the desk should be, etc. We’re also looking at the design of one of our other characters, Titan, for the chapter that follows the tutorial. Joe spent the morning redesigning some of the copy, bringing it into line with how the story has grown and developed.
We came into this residency thinking we would try out new technology, and see if there were assets we could make here for the Unity game. Then afterwards we would find time to look at those assets and build the game. But we’re glad we’ve been able to complete this rough version of the live performance instead. It’s better that we’ve focused on this one idea and will have a video – something we can show. If we’d known that was the direction things would go in, Olivia probably would have built the office background in Unreal before we started.
Our application to the Immersive Fiction Lab – with its focus on humour in XR – has got us looking at the comedy of the piece. It’s something that’s been core to the story from the start. A lot of it is in the tone of what we’re doing, and the lens through which we’re looking at everything – the question is how to represent that in the pitch. Olivia has been adding jokes into the visual design while she’s working on that office background. It’s vital to the project, but it can still be quite hard to put across.
We’re about to play through a scene. Dylan will be acting out his parts, and Olivia responding as the user within the headset. We had a runthrough yesterday, but now we want to record it. Then we can add in the audio, and use that footage to showcase the project. Hopefully we can get the whole tutorial scene filmed today, and we’ll maybe have time to rehearse the next chapter with Titan, which we can record tomorrow.
It really has become like theatre in a way – but the tech makes it a lot cooler! Olivia isn’t an actor here in the sense that she has lines to prepare or a script to follow – in fact, it’s important for it not to feel like she’s rehearsing. She’ll be playing the user, who should be new to this, so she should probably make some mistakes. We don’t want it to sound like she knows exactly what she’s supposed to say at each point.
[Photo: Olivia and Joe direct the recording.]
Day Five
Olivia and Dylan: We didn’t quite get round to recording the tutorial on our last day, and then cancelled trains meant we couldn’t make it over for our final day. So we’re coming back to the lab two weeks later to get those recordings finished.
Which we have! We’ve got two takes of both scenes – the tutorial, and Titan’s chapter. We ended up not changing Titan’s look too much from Sammy’s – we’ve just changed the colour.
We decided to change how we were recording – Olivia is no longer in the headset. Now we’re just recording the on-screen animation, and the audio from our own external mics. Our next step is compiling it all – XR Stories have been putting it all together.
Each scene has been recorded as one continuous take. Later we’ll record video of someone in a headset looking around, and Olivia will edit that in with the animation so it looks as though they’re reacting to the scene. Then we’ll probably compile all the scenes into one proof-of-concept video. We’ve moved from theatre into film!
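For the curious, that overlay step can be sketched with a tool like ffmpeg. Everything below is hypothetical – the file names are placeholders, and the real edit will happen in whatever editing software Olivia prefers – but it shows the picture-in-picture idea:

```python
# A hypothetical sketch of overlaying footage of a headset-wearing viewer
# onto the recorded animation as a picture-in-picture, via ffmpeg.
# File names are placeholders, not the project's actual assets.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "tutorial_animation.mp4",  # screen recording of the scene (placeholder)
    "-i", "headset_viewer.mp4",      # footage of the user reacting (placeholder)
    "-filter_complex",
    # shrink the viewer footage to quarter width, pin it bottom-right
    "[1:v]scale=iw/4:-1[pip];[0:v][pip]overlay=W-w-20:H-h-20",
    "-c:a", "copy",
    "proof_of_concept.mp4",
], check=True)
```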
After the residency, Olivia will go back to working on the Unity version of this. We have enough of that version’s framework in place to create a demo. We’ll apply to the Immersive Fiction Lab and, if we get in, we’re hoping to have that Unity demo ready in time for the Lab’s start in November. With two versions we have options – maybe one version will stay as a demo, or maybe we’ll develop them both.
Published on 15 October 2024
Filed under: R&D Projects, XR Stories