Building hyperreality games – Introduction

Posted on 2 September, 2019

Alright nerd boy, what the eff is hyperreality, anyway?

It’s like reality, but with speed holes. They make it go faster.

Broadly speaking, hyperreality refers to the application of a medium (or several media) where the result is indistinguishable from reality, despite being a simulation.

There are probably simpler ways of expressing that, but I couldn’t find one that didn’t just sound like I was plagiarising Wikipedia. Y’all know how seriously I take myself.

The work of artist Diego Fazio encapsulates this principle better than my shoehorning of vaguely associated terms ever could:

Artwork by Diego Fazio.

I discovered Diego Fazio’s work when doing some basic research into examples of hyperreality. His work is boss, so please do check it out: https://www.deviantart.com/diegokoi/gallery

You could be forgiven for mistaking the image on the left for a photograph. Because it looks real.

This difficulty in consciously identifying where actual reality ends and simulated reality begins is at the crux of what defines hyperreality.

But what does it mean in the context of games? I don’t really know, but this journal series is going to document some experiments in finding out.

Because I’m a nerd boy. Obviously.


Hyperreality games – are they a thing yet?

Personally, I would say no, hyperreality games aren’t really a thing yet.

At least, I can’t say I’ve seen considerable, long-term discussion on the topic among gaming enthusiasts. The term is certainly nowhere near the broader consumer consciousness in the same way Virtual or Augmented Reality are.

I don’t think there is even a single, universally agreed definition of what makes a game ‘hyperreal’ yet.

I could be wrong. But for my money, Hyper Reality Experience in Leeds is the best example:

Hyper Reality Experience is based in Leeds, UK.

The Void in London is a pretty good example too. But it doesn’t have hyperreality in the name. So it loses the contest to be my preferred example.

Why is this hyperreality gaming?

The experiences hosted here qualify as hyperreality games, because they implement four very specific elements/technologies in unison:

  • Motion capture – The player’s movements are captured in real-time.
  • Virtual Reality – Headsets ensure the player is completely removed from the real world.
  • Physical Objects – Such as environmental obstructions or props, which can be felt and interacted with.
  • Digital Environment – A digitised creation where all of these elements are brought together.

Each of these components has unique immersive properties that it excels at. Each also has flaws that limit how effective those properties can be.

Benefits and limitations

Motion capture can transpose your performance and actions onto a digital character, but you’re essentially a puppeteer. You’re watching the performance before you, manipulating the character without ever truly being the character.

Virtual Reality can immerse you in an environment that appears real, but restrictions on movement, interaction and feedback mean the simulation is largely surface level. It might look like reality, but it doesn’t feel like it.

The weight and feel of physical objects stimulate senses in ways that most games cannot. But you need to imagine they’re something else for the fantasy to work. The simulation is reliant largely on your own imagination.

Sadly, we’re not quite here yet.

Digital environments allow artists to create their own version of reality, down to the last pixel. They can provide seemingly limitless interactivity, but that interactivity is facilitated largely through clicks on a mouse, keyboard or controller.

I’m simplifying an awful lot to get the point across quickly, but you get the idea. As far as immersion and a sense of presence are concerned, there are too many gaps in each of the above components for them to be fully effective in isolation.

When brought together, however, they each compensate for one another’s shortcomings. It’s a complementary experience where, with the right craftsmanship, the bleed between what’s real and what is not can be diminished.

Hyperreality games at Staffordshire University

Richard Harper and I have been planning an overhaul of the motion capture stage at Staffordshire University for the past couple of years. In April 2019, I was given the go-ahead to purchase the specified equipment. In early August of the same year, we got it installed.

Our motion capture stage now boasts twenty-four Vicon cameras, consisting of eight Vicon Vantage and sixteen Vicon Vero.

These words likely mean nothing to any but the geekiest of people. I’m therefore writing up a more detailed, accessible entry about the new hardware, software and its supported features. I’ll share this when it’s done. Which will hopefully be soon. I emphasise ‘hopefully’.

Such wonderful, wonderful toys.

One of the single biggest advantages this system has brought with it is the ability to stream motion capture data straight into Unreal Engine 4.

Because of this, we can now experiment with digital environments which incorporate a performer/player’s full range of motion in real-time. We can also introduce physical objects into the equation. Virtual Reality, too.
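To make that concrete, here’s a minimal sketch of how a tracked physical prop could drive its digital twin, using UE4’s Live Link C++ API (which, as far as I can tell, is how Vicon’s plugin surfaces streamed data). To be clear, this is my own illustration rather than anyone’s production code: the subject name ‘PropRigidBody01’ and the MirrorTrackedProp function are made up, and the interface shown is the circa-4.23 Live Link one.

```cpp
#include "GameFramework/Actor.h"
#include "ILiveLinkClient.h"
#include "Features/IModularFeatures.h"
#include "Roles/LiveLinkTransformRole.h"
#include "Roles/LiveLinkTransformTypes.h"

// Minimal sketch: call from an actor's Tick so the digital twin follows
// a tracked physical prop. "PropRigidBody01" is a hypothetical subject
// name; use whatever Shogun Live is actually streaming.
void MirrorTrackedProp(AActor* DigitalTwin)
{
    IModularFeatures& Features = IModularFeatures::Get();
    if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        return; // Live Link plugin isn't enabled
    }

    ILiveLinkClient& Client =
        Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    // Ask Live Link for the latest transform of the tracked prop.
    FLiveLinkSubjectFrameData Frame;
    if (Client.EvaluateFrame_AnyThread(FLiveLinkSubjectName(TEXT("PropRigidBody01")),
                                       ULiveLinkTransformRole::StaticClass(), Frame))
    {
        if (const FLiveLinkTransformFrameData* Data =
                Frame.FrameData.Cast<FLiveLinkTransformFrameData>())
        {
            // The real prop moves; its digital counterpart follows.
            DigitalTwin->SetActorTransform(Data->Transform);
        }
    }
}
```

In practice you’d more likely wire this up through Live Link’s component or an Animation Blueprint than raw C++, but the principle is the same: the stage tracks the object, the engine re-dresses it.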

Essentially, we can start building hyperreality games.

So we are.

Motion Capture into Unreal Engine 4

Technical setup and streaming

It’s a Tuesday afternoon. A team comprising myself, Rich, Tom Vine and Conor-Jack Molloy has just completed the setup of our new motion capture stage.

We had a day’s training on the new toys yesterday, software mainly. The original software package we used was Vicon Blade. This did allow us to stream an actor’s motion into Unreal Engine 4, but only via MotionBuilder.

Performance, therefore, wasn’t great.

The obligatory ‘work in progress’ shots: motion capture and Unreal Engine 4.

We’re now running Shogun Live, which enables motion-captured subjects (actors and/or props) to bypass MotionBuilder and stream directly into Unreal Engine 4.

Performance is as smooth as Fallout 76’s launch, except the exact opposite. I mean, the fact that it works makes it the polar opposite of Fallout 76 by default.

Once the stage has been calibrated and there’s a subject being captured, all we need to do is activate Vicon’s plugin in Unreal Engine 4.

Refreshingly simple, quickly sorted.
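If you’d rather sanity-check the stream from code than from the editor UI, something like this will list what the engine can see. Again, a sketch against the circa-4.23 Live Link interface, and LogLiveLinkSubjects is my own made-up helper:

```cpp
#include "ILiveLinkClient.h"
#include "Features/IModularFeatures.h"

// Sketch: log every subject the Live Link client can currently see, to
// confirm Shogun Live's stream is actually arriving in the engine.
void LogLiveLinkSubjects()
{
    IModularFeatures& Features = IModularFeatures::Get();
    if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        UE_LOG(LogTemp, Warning, TEXT("Live Link plugin isn't enabled"));
        return;
    }

    ILiveLinkClient& Client =
        Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    // Exclude disabled and virtual subjects; we only want live ones.
    for (const FLiveLinkSubjectKey& Key : Client.GetSubjects(false, false))
    {
        UE_LOG(LogTemp, Log, TEXT("Streaming subject: %s"),
               *Key.SubjectName.Name.ToString());
    }
}
```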

Conor-Jack Molloy, ‘bossing’ our new motion capture suits. Photo by Tom Vine.

We want to see this in action, so Conor suits up while Tom chucks together a really basic environment in Unreal Engine 4. We’ve only got an hour or so left of the day at this point, so this is just going to be a quick test.

Tom sets up a ton of physics objects to spawn and drop into a rectangular arena, the same size as the available floor space on the stage.
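For anyone wanting to recreate the chaos, the gist of a setup like Tom’s looks something like this. A rough sketch rather than his actual code: the spawn volume bounds and the CubeMesh parameter are placeholders you’d match to your own stage and assets.

```cpp
#include "Engine/World.h"
#include "Engine/StaticMeshActor.h"

// Rough sketch: spawn one physics-simulating mesh at a random point above
// the arena. Call it on a timer to rain objects down on the performer.
// The bounds below are placeholders; match them to your stage's floor space.
void SpawnPhysicsObject(UWorld* World, UStaticMesh* CubeMesh)
{
    // A box hovering above the capture volume to spawn from (centimetres).
    const FBox SpawnVolume(FVector(-200.f, -200.f, 400.f),
                           FVector(200.f, 200.f, 600.f));
    const FVector SpawnPoint = FMath::RandPointInBox(SpawnVolume);

    AStaticMeshActor* Actor =
        World->SpawnActor<AStaticMeshActor>(SpawnPoint, FRotator::ZeroRotator);
    if (Actor)
    {
        UStaticMeshComponent* Mesh = Actor->GetStaticMeshComponent();
        Mesh->SetMobility(EComponentMobility::Movable); // static by default
        Mesh->SetStaticMesh(CubeMesh);
        Mesh->SetSimulatePhysics(true); // let it drop and bounce off Conor
    }
}
```

Hook that up to a looping timer and you’ve got yourself an object rainstorm.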

Watching these objects bounce off Conor as he moves around the space gives us a tremendous amount of delight. We each reflect on the sorry state our lives must be in for this to be so entertaining.

Rather than dwell on that, Tom adds a ton more physics and we all smile again.

Conor-Jack, our natural born mocap performer.

Conclusion

Before we’re able to incorporate some physical props, we remember we have homes to go to. It’s been a long day and I fancy some halloumi.

In fairness though, we’re in a good position to start actually trying some game design out. The performance of the motion capture stage is miles ahead of where it was and it barely took any time to get the data streamed into Unreal Engine 4. The speed with which we’re able to start working with the tools should allow us to prototype ideas very quickly.

So in the next entry, I’ll be detailing the creation of our very first hyperreality game, from design document to actually playing it in VR.

If you’ve any thoughts on what you’d like to see experimented with in hyperreality games, please let me know. Human interaction is something I’m lacking.

Cheers!
