Videogames, more often than not, are about granting us incredible abilities – impossible powers that allow us to explore and master our environments with preternatural ease. Press Ctrl to enable bullet time; hold the right mouse button to teleport; push down the right stick to enable echolocation and see the world the way a bat might.
Only, that last one is possible: echolocation is a real navigation technique used by blind people to locate objects based on the echoes they produce.
“They can train themselves to use how the world sounds around them to actually see,” says writer and producer Amanda Gardner.
Working from home with husband and BioShock lead designer Bill, along with a remote team of veteran contractors, Gardner is making Perception – a haunted house horror game with a blind protagonist. It’s a project that comes with unique demands. Not least among them: how do you visualise an environment that isn’t, in the traditional sense, seen?
Normal mapping a superpower
Echolocation stimulates the same area of the brain used for normal sight – but, since sighted Perception players are using an ability they don’t possess themselves, the Gardners needed to find a way to represent sound waves visually.
“They could be smoky, they could look like repulsing water,” Amanda points out. “So how did we want to do it?”
Here’s how: when echolocation isn’t in effect, Perception’s map is entirely black. It’s only when an object makes a noise that a light is spawned, and a number of post-processing effects come into play.
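That spawn-and-fade approach can be sketched in miniature. This is not The Deep End Games’ actual UE4 code – names, radii and lifetimes below are invented for illustration – but it captures the described behaviour: the world is black until a noise spawns a temporary light, which then decays back to darkness.

```python
import math

class EcholocationLight:
    """A temporary light spawned when an object makes a noise.

    Hypothetical sketch: the real game uses Unreal Engine 4 lights
    plus post-processing; here a light simply fades to black over
    its lifetime, so the map returns to darkness.
    """
    def __init__(self, position, radius=5.0, lifetime=2.0):
        self.position = position
        self.radius = radius
        self.lifetime = lifetime
        self.age = 0.0

    def tick(self, dt):
        """Advance the light's age by dt seconds."""
        self.age += dt

    @property
    def intensity(self):
        # Linear fade from full brightness down to zero.
        return max(0.0, 1.0 - self.age / self.lifetime)

def brightness_at(lights, point):
    """Scene brightness at a point: black unless a live light reaches it."""
    total = 0.0
    for light in lights:
        dist = math.dist(light.position, point)
        if dist < light.radius:
            falloff = 1.0 - dist / light.radius
            total += light.intensity * falloff
    return min(total, 1.0)
```

A cane tap at the player’s feet would spawn one of these lights; tick the simulation forward and the revealed geometry sinks back into black.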
It’s an incredibly unusual application of Unreal Engine 4’s lighting system – which is built to recreate, to an extraordinary degree of accuracy, the scattering, reflections and shadows of the world as sighted people know it.
“I do think there’s a tendency for developers to push the realism more than the surreal or noir,” notes Bill.
When working at Irrational on SWAT 4, Bill would place the lights in a level as close to objects as he could to highlight normal mapping – the texturing technique that was then proving groundbreaking for faking bumps, dents and irregularities in surfaces that weren’t truly 3D.
“It definitely gave it a more surreal look, but I loved the way it gave [environments] a distinct appearance,” Bill remembers.
More than 10 years later, Perception’s objects are almost entirely normal mapped. The great, yawning Echo Bluff estate in which the game takes place is silhouetted in blues and whites that reveal heard objects in all their detail and texture, if only for a second. The wintry colours have a handy side effect, too, backing up Perception’s eerie atmosphere with a ghostly palette.
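The trick Bill describes – flat geometry made to look bumpy by per-texel normals – comes down to simple diffuse shading. The sketch below is a toy illustration, not engine code: it shades a flat quad with Lambert’s cosine term, so a tilted normal stored in the map darkens a texel even though the underlying mesh never changes.

```python
import math

def normalize(v):
    """Scale a vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def lambert(normal, to_light):
    """Diffuse lighting term: how directly the texel faces the light."""
    n = normalize(normal)
    l = normalize(to_light)
    return max(0.0, sum(a * b for a, b in zip(n, l)))

def shade(normal_map, to_light):
    """Shade each texel of a flat quad using its per-texel mapped normal.

    The mesh itself is flat; all the apparent bumps come from the
    normals stored in the map.
    """
    return [round(lambert(n, to_light), 3) for n in normal_map]
```

Placing the light close to the surface, as Bill did on SWAT 4, exaggerates the angle between light and normal from texel to texel, which is what makes the mapped detail pop.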
Alone in the dark
Typically, echolocators make sharp clicking noises with their mouths, which bounce back from nearby objects. But The Deep End Games decided that clicking wouldn’t give the player enough visual feedback, and so settled on a cane – which protagonist Cassie taps against the floor to reveal the world around her.
“It’s actually really, really cool, because as she walks through the house all the surfaces make different sounds,” says Amanda.
The Deep End Games rely on a mixture of systemic sounds, like those produced by Cassie’s cane and footsteps, and specially-placed ones – a heater leaking steam, for instance, or the billowing curtains of an open window.
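That split – systemic sounds driven by player actions versus hand-placed ambient emitters – could be modelled like this. The surface-to-cue table and class names are assumptions for the sketch, not taken from the game.

```python
from dataclasses import dataclass

@dataclass
class SoundEvent:
    """A sound that will spawn an echolocation visual at a position."""
    position: tuple
    cue: str

class SoundMixer:
    """Hypothetical sketch of Perception's two kinds of sound source:
    systemic ones triggered by the player (cane taps vary by surface)
    and specially-placed ambient emitters, like a hissing heater.
    """
    # Assumed surface-to-cue mapping, invented for illustration.
    SURFACE_CUES = {"wood": "tap_wood", "tile": "tap_tile", "carpet": "tap_soft"}

    def __init__(self, placed_emitters):
        # placed_emitters: list of (position, cue) pairs set by a designer.
        self.placed = placed_emitters

    def cane_tap(self, position, surface):
        """Systemic sound: the cue depends on what Cassie taps."""
        cue = self.SURFACE_CUES.get(surface, "tap_default")
        return SoundEvent(position, cue)

    def ambient_events(self):
        """Hand-placed sounds that fire regardless of the player."""
        return [SoundEvent(pos, cue) for pos, cue in self.placed]
```

Each returned event would feed the lighting step above it in the pipeline: every sound, systemic or placed, becomes a brief flash of visible geometry.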
“We’re not doing a hardcore simulation,” notes Bill. “We’re doing our best to capture the feeling of echolocation.”
Some of those handcrafted sounds – and their attendant visuals – are designed to help the player get their bearings as they become accustomed to a new way of viewing the world.
“We always forget how difficult navigation is in 3D,” says Bill. “If you look at Bioshock, we added that Crazy Taxi guide arrow relatively late. It’s difficult for developers to understand that people don’t always have the best spatial relations, and that their environments are incredibly complex.”
Though Echo Bluff is nowhere near as complex as Rapture, it’s partly designed to disorient. There’s the reliance on echolocation, obviously – and then there are the times that the geography and architecture of the house change. Horror game, remember?
To help, Perception highlights ‘memories’ – landmark objects you’ve discovered before, like chimneys or staircases, which appear in green when you echolocate them again. Beyond that, the game offers a ‘sixth sense’ that lights up the next object you’re after, giving you an objective if not always a clear route to it.
The Deep End Games are simulating the innate sense of direction we all have, but can’t call upon when exploring an environment from behind a screen.
“It’s a way of guiding people without handholding,” says Bill.
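Both aids reduce to a little bookkeeping. The sketch below is an assumed reading of the described behaviour, with invented names: landmarks seen before are tinted green on a repeat echolocation, and the sixth sense simply surfaces the current objective without plotting a route.

```python
WHITE, GREEN = "white", "green"

class Navigator:
    """Hypothetical sketch of Perception's two navigation aids:
    'memories' tint previously-discovered landmarks green, and the
    'sixth sense' names the next objective without routing to it.
    """
    def __init__(self, objectives):
        self.seen = set()              # landmarks already echolocated
        self.objectives = list(objectives)

    def echolocate(self, landmark):
        """Return the colour to render this landmark in, then remember it."""
        colour = GREEN if landmark in self.seen else WHITE
        self.seen.add(landmark)
        return colour

    def sixth_sense(self):
        """The next goal to light up - an objective, not a route."""
        return self.objectives[0] if self.objectives else None

    def complete_objective(self):
        if self.objectives:
            self.objectives.pop(0)
```

The point of the design is visible in the code: the player is told *what* to find, never *how* to get there – guidance without handholding.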
Perception’s developers might have created a lot of their lighting and navigation problems for themselves. But they’ve also found solutions, and the result is a game unmistakeable for any other. Where most horror tends to teach phobias, Perception players might learn a little empathy – a first-hand understanding of the way others see the world.
In this sponsored series, we’re looking at how game developers are taking advantage of Unreal Engine 4 to create a new generation of PC games. With thanks to Epic Games and The Deep End Games.