Heads-Up Display

In Event[0], there are four types of HUD elements: the gauges, the compass, the inventory, and brief descriptions of objects in the environment.

My task was to design these HUD elements together with the art director and make sure they were easy to understand and to use.

 
 
 

Mood board

Together with the art director, we determined the general direction of the HUD in the game. This was the starting point for my work. The HUD had to be futuristic, but functional. 

Pinterest mood boards were used to narrow down the selection of styles. I first chose HUD examples that were “futuristic,” then rejected the ones that weren't “functional.”

After several iterations with the art director, we decided the HUD would be white, use thin lines, and rely on blur effects.

 

 

In-game augmented reality (AR) gauges

The gauges appear when the player is in space. Controls change as well: left click is used to move forward and right click to stabilize.

Initially, there was one gauge for the oxygen level and another for the jetpack heat. The oxygen gauge was a classic indicator: when it reached 0, the player died. If the heat gauge reached its maximum, the player's jetpack was overheating; in that case, the player couldn't accelerate and had to wait for the gauge to return to its minimum. To avoid that, the player had to stabilize from time to time.
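As a rough illustration of that original rule, here is a minimal sketch of an overheating jetpack gauge. It is not code from Event[0]; all names and the heating/cooling rates are assumptions chosen for the example.

```python
# Illustrative sketch only: approximates the heat rule described above.
# MAX_HEAT, HEAT_PER_SECOND, and COOL_PER_SECOND are assumed values,
# not tuning data from the actual game.

MAX_HEAT = 100.0
HEAT_PER_SECOND = 25.0   # heat gained per second while accelerating (assumed)
COOL_PER_SECOND = 40.0   # heat lost per second while stabilizing (assumed)

class JetpackHeat:
    def __init__(self):
        self.heat = 0.0
        self.overheated = False

    def update(self, dt, accelerating, stabilizing):
        if accelerating and not self.overheated:
            self.heat = min(MAX_HEAT, self.heat + HEAT_PER_SECOND * dt)
        elif stabilizing:
            self.heat = max(0.0, self.heat - COOL_PER_SECOND * dt)

        # Once the gauge maxes out, acceleration stays locked until the
        # gauge has returned to its minimum, as described above.
        if self.heat >= MAX_HEAT:
            self.overheated = True
        elif self.heat <= 0.0:
            self.overheated = False

    def can_accelerate(self):
        return not self.overheated
```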

After many playtests, I recommended transforming the heat gauge because the players didn't understand it and didn't make the connection between this indicator and the stabilization control. As a consequence, they experienced difficulties getting around in space and positioning the avatar in front of terminals.

In the end, we replaced the heat gauge with a more understandable speed gauge.

 

In-game AR compass

In space, the player may have to perform many tasks with a limited oxygen supply. For example, hacking all the antenna terminals.

I observed a lack of guidance there: players were lost and frustrated. Finding the terminals was a real challenge even though we didn't want it to be one. To fix this, we made a compass that would guide the player directly to the terminals.
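For illustration, here is a minimal sketch of how a HUD compass can point toward the nearest terminal. This is not the game's implementation; the 2D vectors, the terminal list, and the mapping to a screen offset are assumptions made to keep the example short.

```python
# Illustrative sketch, assuming 2D positions and a horizontal compass strip.
# All names (player_pos, player_facing, terminals, compass_width) are
# hypothetical, not taken from the Event[0] codebase.

import math

def signed_angle(from_dir, to_dir):
    """Signed angle in radians from one 2D direction to another."""
    return math.atan2(
        from_dir[0] * to_dir[1] - from_dir[1] * to_dir[0],
        from_dir[0] * to_dir[0] + from_dir[1] * to_dir[1],
    )

def compass_offset(player_pos, player_facing, terminals, compass_width):
    """Horizontal marker offset on the compass strip for the nearest terminal."""
    if not terminals:
        return None
    nearest = min(
        terminals,
        key=lambda t: (t[0] - player_pos[0]) ** 2 + (t[1] - player_pos[1]) ** 2,
    )
    to_target = (nearest[0] - player_pos[0], nearest[1] - player_pos[1])
    angle = signed_angle(player_facing, to_target)     # -pi .. pi
    return (angle / math.pi) * (compass_width / 2.0)   # map to a screen offset
```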

Once again, multiple iterations allowed us to reach this result.

 

In-game AR descriptions

In Event[0], you can't interact with objects. During playtests, I observed a lot of frustration about this lack of interaction. Moreover, I saw that players rarely talked to the AI about the environment. The reason is simple: in video games, if an object isn't interactive, the player assumes it isn't useful for completing their objective. However, even if an object doesn't play a role in achieving a task, its presence is crucial for the atmosphere. Each object in the game tells a story, and we wanted the player to discover it.

We decided to add augmented reality descriptions that would encourage the player to care about the environment and allow them to piece more of the story together. However, it was important not to impose these descriptions on players who didn't want to explore the environment.

A few playtest sessions allowed us to improve the way the AR descriptions work. When the player aims at an object, the crosshair changes into a timer (circular gauge). If they keep pointing at the object, its description appears. If not, the timer gauge goes away.
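As a rough sketch of that dwell-timer behaviour, assuming a per-frame update loop: the circular gauge fills while the player keeps aiming at the same object, and the description appears once it completes. The class name and the 1.5-second hold time are assumptions, not values from the game.

```python
# Illustrative sketch only, not the shipped implementation.

DWELL_TIME = 1.5  # seconds the player must keep aiming (assumed value)

class ArDescriptionTimer:
    def __init__(self):
        self.target = None     # object currently under the crosshair
        self.progress = 0.0    # 0..1, drives the circular gauge fill
        self.visible = False   # whether the AR description is shown

    def update(self, dt, aimed_object):
        if aimed_object is None or aimed_object is not self.target:
            # Looking away (or at a new object) resets the gauge
            # and hides the current description.
            self.target = aimed_object
            self.progress = 0.0
            self.visible = False
        if self.target is not None:
            self.progress = min(1.0, self.progress + dt / DWELL_TIME)
            if self.progress >= 1.0:
                self.visible = True
```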