Looking back at the many games I’ve reviewed shows a very diverse range of gameplay experiences, including FPS, puzzle, arcade, simulation, target throwing/shooting, tower defense and others.
It’s great to see that AR allows game developers to deliver many types of gameplay experiences. Some of those are similar to standard mobile games; others are only possible with AR technologies like ARKit and ARCore.
Look me in the Eyes!
While testing many of those games, I found that I immensely enjoy playing games where the characters observe me, where the in-game virtual character feels “aware” of my presence.
It becomes even more satisfying when a virtual character looks at you or moves toward you as you change your physical position in real-world space. This gives players the feeling that the virtual character is aware of their presence, making the entire gameplay scene mixed not just visually, but also socially.
This kind of character interaction with the player is quite common in first-person shooters, where the enemy characters’ rotation and movement are designed to lock onto the camera’s location, which is where the player is standing.
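As a rough illustration of that lock-on behavior, the yaw needed to turn an enemy toward the camera can be computed from the two world-space positions. This is a minimal Python sketch of the math only; the function name and the (x, y, z) tuple convention are my own, not from ARKit, ARCore, or any particular engine:

```python
import math

def yaw_toward(enemy_pos, camera_pos):
    """Yaw angle (radians, around the vertical Y axis) that turns an
    enemy at enemy_pos to face the camera at camera_pos.
    Positions are (x, y, z) tuples in world space; Y is up."""
    dx = camera_pos[0] - enemy_pos[0]
    dz = camera_pos[2] - enemy_pos[2]
    # atan2(dx, dz): zero yaw means "facing +Z", turning right is positive.
    return math.atan2(dx, dz)

# An enemy at the origin with the camera one meter ahead on +Z needs no
# rotation; a camera one meter to the side on +X needs a quarter turn.
```

In an engine you would normally hand this job to a built-in look-at constraint (e.g. SceneKit’s SCNLookAtConstraint) and re-evaluate it every frame as the player walks around.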
As common as it is in FPS augmented reality games, this technique can play a leading role in, or enhance, other types of games. Virtual pet simulators in AR, for example, are a great demonstration of how giving a virtual character awareness of the player can strengthen the emotional bond the user has with it. The reaction is much like the difference between looking at a person from the side and looking straight into their eyes. It can be further enhanced by integrating physical touch, letting you interact with the virtual character using on-screen touch gestures. In the future, hand detection using computer vision algorithms, as well as sensing gloves, will further enhance that experience. This will let us actually touch the virtual world through force feedback, giving physical properties to virtual objects and narrowing the gap between what’s virtual and what’s real.
That virtual character’s “awareness” of the user, alongside direct character interaction, is what made a game like AR Dragon such an entertaining experience.
So there you have it, you’ve just met a new friend!
I think this image, taken using the app Follow Me Dragon, clearly illustrates my point. Just look at it for a few seconds, please.
Yes, that dragon on the right is obviously virtual, but you cannot ignore the existence of an emotional connection with the virtual character, much like the one you’d have with a real one, more or less.
One of the early AR games that employed this so beautifully was ARrived, an augmented reality god game. Although I didn’t enjoy the game itself as much as I expected, that aspect of virtual character awareness was implemented almost perfectly, in my opinion.
Having the virtual character make this kind of direct eye contact and track the user’s location dynamically can enhance many different types of games. Depending on the virtual character’s behavior and look, you can trigger different types of emotions in the user, elevating the gameplay experience severalfold. The user feels like an indispensable part of the mixed reality world you’ve created, not just an observer. It depends, of course, on the type of game you are developing, but even in an introductory tutorial, this kind of virtual character interaction can enhance the gameplay experience.
Adding voices, sound effects and music can further enhance that experience. This is one of the reasons why ghosthuntAR: Survival AR had such a big impact on me. I felt on edge and scared at times. It was me against “them”; they knew where I was, and I needed to look out for myself. Their direct, evil, creepy stare made me want to shoot them hard to drive them away. I felt satisfaction in every single shot. Seeing them blown away in a carnage of colorful explosions made it all worthwhile.
Responding to Player Interactions
By the way, it is not strictly necessary to make the virtual character look the user directly in the eyes. It can be an action the virtual character responds to, which can further strengthen the bond between the player and the virtual entity. For example, playing ball with the dragon in AR Dragon: the player feels that the virtual character responds to their actions.
This was perfectly illustrated in a very simple augmented reality app called Take that, Elf!. I sat down and asked myself several times: why did I enjoy that app so much? What made it different from other AR apps?
In the end, I realized it’s the emotional bond, created through the virtual character’s awareness of the player and its responses to the player’s direct and indirect interactions, alongside beautifully made animations, character voice, unique reactions and facial animations, that made it such an unforgettable AR experience.
Having a direct influence on the virtual character’s behavioral responses was one of the main reasons I enjoyed it so much.
This is also one of the reasons why a non-AR mobile game like “Kick the Buddy” is beloved by so many players.
Notice how the character almost always maintains eye contact with the player. The game itself is designed around direct or indirect interaction with the character.
How to employ it in your game?
You can enhance your AR game by employing the same character animation and interaction principles that I’ve mentioned.
In AR, you can create a stronger emotional response because you can produce reactions that are impossible in standard mobile games, like having a character follow the player or look at them from below or from above.
Some things that you might consider adding to your game:
- Maintaining direct, ongoing eye contact with the player’s (device) location
- Making the character move or fly toward the user’s location in 3D space
- Adding voice responses to interactions with the character (e.g. you tickle it and it laughs) or to actions the player takes in its surroundings (e.g. you throw a ball far from the character and it responds: “Why did you throw that ball so far? Now go get it yourself.”)
- Making the virtual character grabbable and movable where applicable
- Applying facial and body animations that reflect the character’s current emotional state
These are a few things you can add to your character to give players the sense that the in-game virtual characters are aware of their presence.
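Several of the items above boil down to picking a reaction based on the player’s proximity and input. A hedged Python sketch of that decision logic; the function name, the distance thresholds, and the reaction labels are all illustrative placeholders of mine, not any engine’s API:

```python
def character_reaction(distance_m, touched):
    """Pick a reaction for the virtual character from the player's
    distance (meters, camera to character) and whether the player is
    currently touching it via an on-screen gesture.
    Thresholds are illustrative, not tuned values from any game."""
    if touched:
        return "laugh"             # direct touch, e.g. tickling, wins
    if distance_m < 0.5:
        return "look_up"           # player is leaning in very close
    if distance_m < 3.0:
        return "make_eye_contact"  # track the camera with head and eyes
    return "walk_toward_player"    # close the gap, then re-evaluate
```

In practice you would run something like this every frame (or on a timer), feeding it the distance between the AR camera and the character’s anchor, and map each label onto an animation and a voice line.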
Some developers overlook this feature, and I think that’s a big miss. If you are planning to develop any type of game, see whether you can employ it. It doesn’t necessarily have to be a game where the character continuously interacts with the player. For example, in a game like AR Football, where the player must shoot a ball into the net, if the player misses you can make the character look into the player’s eyes and say: “Wow, that was one of your worst shots…[laughs]”, instead of making the character stare flatly ahead as if it lives in its own domain.
If you develop a jumping game like Jump AR, make the character face the user (head only) and say: “Hey, you’re doing great”, instead of showing an on-screen text notification.
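A head-only glance like that can be approximated by clamping how far the head may turn away from the body’s facing, so the character looks over at the player without pivoting its whole body. A minimal Python sketch; the function name and the 70-degree limit are assumptions of mine, not values from any engine:

```python
import math

def clamped_head_yaw(desired_yaw, body_yaw, max_turn=math.radians(70)):
    """Head-only look-at: turn the head toward desired_yaw (radians),
    but never more than max_turn away from the body's facing."""
    # Wrap the head/body offset into (-pi, pi] before clamping, so a
    # target slightly "behind" wraps to a small negative angle instead
    # of a huge positive one.
    offset = (desired_yaw - body_yaw + math.pi) % (2 * math.pi) - math.pi
    offset = max(-max_turn, min(max_turn, offset))
    return body_yaw + offset
```

The same clamp works per-axis for pitch, and most engines let you feed the result into a head bone or a look-at constraint with an influence limit.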
So many games use such adorable characters, but they look and feel so lifeless and separated from the AR experience.
As I said, these recommendations won’t fit every type of game, but they can definitely enhance many kinds of augmented reality games. I’m sure you’ll find ways to incorporate them into some of your games in exciting and entertaining ways.
I can’t do what I do without your support. If you find these articles useful, please share them with others. You can also support me via Patreon; there’s a link to my account on my Twitter page, in the profile and in the pinned tweet. Your help will be greatly appreciated. Thank you so much, and see you in the next article.
Part 9 coming soon, stay tuned!