Hyper Reality Experience is based in Leeds, UK.
The Void in London is a pretty good example too, but it doesn’t have hyperreality in the name, so it loses the contest to be my preferred example.
Why is this hyperreality gaming?
The experiences hosted here qualify as hyperreality games because they combine four specific elements/technologies in unison:
- Motion capture – The player’s movements are captured in real time.
- Virtual Reality – Headsets ensure the player is completely removed from the real world.
- Physical Objects – Such as environmental obstructions or props, which can be felt and interacted with.
- Digital Environment – A digitised creation where all of these elements are brought together.
Each of these components has unique immersive properties of its own that it excels at. Each also has flaws that limit how effective those properties can be.
Benefits and limitations
Motion capture can transpose your performance and actions onto a digital character, but you’re essentially a puppeteer. You’re watching the performance before you, manipulating the character without ever truly being the character.
Virtual Reality can immerse you in an environment that appears real, but restrictions on movement, interaction and feedback mean the simulation is largely surface level. It might look like reality, but it doesn’t feel like it.
The weight and feel of physical objects stimulate senses in ways that most games cannot. But you need to imagine they’re something else for the fantasy to work. The simulation relies largely on your own imagination.
Digital environments allow artists to create their own version of reality, down to the last pixel. They can provide seemingly limitless interactivity, but that interactivity is facilitated largely through clicks on a mouse, keyboard or controller.
I’m simplifying an awful lot to get the point across quickly, but you get the idea. As far as immersion and a sense of presence are concerned, there are too many gaps in each of the above components for them to be fully effective in isolation.
When brought together, however, they each compensate for one another’s shortcomings. It’s a complementary experience where, with the right craftsmanship, the bleed between what’s real and what is not can be diminished.
Hyperreality games at Staffordshire University
Richard Harper and I have been planning an overhaul of the motion capture stage at Staffordshire University for the past couple of years. In April 2019, I was given the go-ahead to purchase the specified equipment. In early August of the same year, we got it installed.
These words likely mean nothing to any but the geekiest of people. I’m therefore writing up a more detailed, accessible entry about the new hardware, software and its supported features. I’ll share this when it’s done. Which will hopefully be soon. I emphasise ‘hopefully’.
One of the biggest advantages this system has brought with it is the ability to stream motion capture data straight into Unreal Engine 4.
Because of this, we can now experiment with digital environments which incorporate a performer/player’s full range of motion in real-time. We can also introduce physical objects into the equation. Virtual Reality, too.
Essentially, we can start building hyperreality games.
So we are.
Motion Capture into Unreal Engine 4
Technical setup and streaming
It’s a Tuesday afternoon. A team comprising myself, Rich, Tom Vine and Conor-Jack Molloy has just completed the setup of our new motion capture stage.
We had a day’s training on the new toys yesterday, software mainly. The original software package we used was Vicon Blade. This did allow us to stream an actor’s motion into Unreal Engine 4, but only via Motion Builder.
Performance, therefore, wasn’t great.
Unreal Engine 4
We’re now running Shogun Live, which enables motion captured subjects (actors and/or props) to bypass Motion Builder and stream directly into Unreal Engine 4.
Performance is as smooth as Fallout 76’s launch, except the exact opposite. I mean, the fact that it works at all makes it the polar opposite of Fallout 76 by default.
Once the stage has been calibrated and there’s a subject being captured, all we need to do is activate Vicon’s plugin in Unreal Engine 4.
Refreshingly simple, quickly sorted.
We want to see this in action, so Conor suits up while Tom chucks together a really basic environment in Unreal Engine 4. We’ve only got an hour or so left of the day at this point, so this is just going to be a quick test.
Tom sets up a ton of physics objects to spawn and drop into a rectangular arena, the same size as the available floor space on the stage.
Watching these objects bounce off Conor as he moves around the space gives us a tremendous amount of delight. We each reflect on the sorry state our lives must be in for this to be so entertaining.
Rather than dwell on that, Tom adds a ton more physics and we all smile again.
Conor-Jack, our natural born mocap performer
Before we’re able to incorporate some physical props, we remember we have homes to go to. It’s been a long day and I fancy some halloumi.
In fairness though, we’re in a good position to start actually trying some game design out. The performance of the motion capture stage is miles ahead of where it was and it barely took any time to get the data streamed into Unreal Engine 4. The speed with which we’re able to start working with the tools should allow us to prototype ideas very quickly.
So in the next entry, I’ll be detailing the creation of our very first hyperreality game. This will be from design document to actually playing the game in VR.
If you’ve any thoughts on what you’d like to see experimented with in hyperreality games, please let me know. Human interaction is something I’m lacking.