Designing Great AR Games: Tips & Suggestions – Part 2

This is the second part of the article series “Designing Great AR Games: Tips & Suggestions”. If you haven’t read the first part, I highly recommend doing so, although the tips and suggestions are listed in no particular order.

In this second part, I’ll continue where I stopped last time and share more tips and suggestions on how to make your AR game look and feel great. Keep in mind that not all of these suggestions may apply to your own game, nor is it necessary to employ all of them. That said, I’m sure you’ll find at least some of them useful for your next project, or at least draw some inspiration from them. OK, without further ado, let’s begin.

Improving Depth Perception

Playing Kings of Pool AR mode vs AI robot with Cherry Blossom views

Depth perception is a big subject, and a great deal of academic research has been done on it.

This is something that had a strong impact on me when I first experienced AR. I remember toying around with some sticker apps, and none of them were really exciting. Because I was not encouraged to physically move within the environment, those stickers, aside from the fact that they were indeed flat (I checked), looked like floating images on my screen. There were no depth cues that led me to perceive the virtual content as part of the real-world scene, so the AR experience was very poor.

Even with static virtual objects, you can still create a good perception of depth, for example by animating the objects as they move through 3D space.

There are things you can do to enhance the perception of depth and make the AR experience more exciting and immersive, including:

  • Casting shadows from virtual objects onto other virtual objects, as well as onto horizontal and vertical surfaces detected by the AR framework (example 1).
  • Animating virtual objects moving through 3D space. As the objects change size with distance, their movement effectively draws perspective lines for the viewer. This is especially important when 3D objects move in mid-air with no drop-shadow depth cue to help the user understand their location in real-world space. Having an anchor point that the objects move in relation to can also help (example 1).
  • Encouraging physical movement, which gives the user the opportunity to view the 3D scene from different angles and better perceive its 3D structure (example 1, example 2, example 3, example 4). In fact, optical-illusion puzzle AR games such as Mazelith and AMON were built on that exact idea.
  • Giving the user an option to move or rotate the 3D virtual object as part of the game controls (example 1). This works well for seated games or games where you don’t expect the player to move around much.
  • Using 3D models instead of 2D models to give objects dimensionality and volume. I was amazed at how good simple, well-designed primitive 3D shapes look in AR compared to more complex shapes. I then realized it’s due to their well-defined light and shadow areas, whose high contrast helps the viewer clearly perceive the shapes’ dimensionality (example 1, example 2, example 3, flat look: example 4).
  • Creating positional separation between your game objects. If your game is designed to be played in a seated position, it’s good to design it to make use of the z-axis. However, because the effect depends on the viewing angle, it’s better to create volume by designing the game to use a large 3D area, rather than confining play to a single axis with little use of the other dimensions. Furthermore, animating an object moving past anchored/stationary objects along the z-axis can help the player better perceive depth, especially when shadows can be observed (example 1).
  • Avoiding a flat top-down view. A top-down view in a game that doesn’t require movement can impair the AR experience, as the 3D scene can look flat; it will feel like a standard non-AR game. You can improve this by adding tall 3D objects and encouraging movement (Bad: example 1, Good: example 2). Tall buildings with well-defined gameplay-area lines give a good depth cue as the perspective changes and the buildings are viewed against the back line and against other real-life objects in the scene. Such a game also encourages physical movement around the gameplay area.
  • Using occlusion if your AR framework supports it. This is an essential feature that I’m sure will soon be part of every AR framework; as of the time of writing, it’s not well supported in most of them.
  • Letting the real-world scene be seen. I’ve seen games that either play in mid-air with the game elements too close to the camera or draw a very large virtual surface. I sometimes ask myself why the developer even bothered making it an AR game with that type of design (example 1). It’s an augmented reality game; don’t forget it. What’s the point of making an AR game if, in the way the game is designed to be played, the user can’t see the real environment?
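Several of these cues come down to simple projective geometry: under a pinhole camera model, an object’s on-screen size shrinks in proportion to its distance from the camera, which is what lets animated motion through the scene read as depth. Here is a minimal illustrative sketch in plain Python (not tied to any AR framework; the function name and the focal-length parameter are my own for illustration):

```python
def apparent_size(real_size: float, distance: float, focal_length: float = 1.0) -> float:
    """Projected (on-screen) size of an object under a pinhole camera model.

    Apparent size is inversely proportional to distance, which is the
    perspective cue a viewer uses to judge depth as an object moves.
    """
    if distance <= 0:
        raise ValueError("object must be in front of the camera")
    return real_size * focal_length / distance

# An object animated from 1 m away to 4 m away shrinks to a quarter
# of its on-screen size, a strong monocular depth cue.
near = apparent_size(0.5, 1.0)   # 0.5
far = apparent_size(0.5, 4.0)    # 0.125
print(near / far)                # 4.0
```

This is also why mid-air motion with no shadow is ambiguous: size change alone can mean either “moving away” or “shrinking in place”, so pairing it with a shadow or an anchor resolves the ambiguity.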

These are just a few of the things that can enhance depth perception, based on my observations while playing many AR games and apps. I’ve devoted a big chunk of this article to the topic (although I could write about it even more) because it’s something I see many developers getting wrong. I think some simply try to port their existing game ideas to AR. To really make a great AR game, you need to design it from the ground up to match the medium.

I’ve played many augmented reality games that were simply ported to AR, and I found that I actually preferred playing the non-AR version.

“Physicalization” & Realism

I really don’t know what else to call it, so forgive me if the term isn’t clear at first glance. Under it, I include physical interaction with virtual objects in the scene, as well as making virtual elements in the game feel more lively and organic.

I tried a few AR games and apps that wowed me and made me think about why exactly I enjoyed them so much.

The first one was a Merge Cube app called TH!INGS, especially this part of the app. The second one is an educational app called FishingGO. The third one is InstaSaber, and the fourth one is Take that Elf!.

Of course, there are many other apps worth mentioning, but those four help me explain my point.

The first thing that made me enjoy Take that Elf! and FishingGO is their realism and their lifelike, organic look and feel. Playing Take that Elf! gave me the feeling that there was actually another (almost) real creature there with me in the garden. I also really enjoyed poking him with my finger, which felt like I was actually physically interacting with him.

The high-realism character blended really well with the real-world environment, making the entire experience feel a bit surreal yet very exciting. I just can’t forget this app because of how impactful it was.

Most of the games I’ve played look digitized, like computer games, you know what I mean. I wish more games made use of lifelike characters like the one in Take that Elf!. It doesn’t have to be a human-like character, just something with organic behavior that feels less computer-generated.

The second thing that made an impact on me was the ability to feel like I was physically interacting with virtual objects in the game. I think this is one of the reasons I loved the Merge Cube so much: I felt like I was holding the virtual content in my hand and physically interacting with it.

This is why I enjoyed the pull-to-shoot mechanic in Smash Tanks!, snatching puzzle pieces into place in AMON and PuzzlAR: World Tour, and pushing and pulling Jenga pieces in Wobbly Stack AR. It all felt good because I could, in a way, physically interact with the virtual content, even though in most cases the interaction was indirect.

These two things had a fundamental impact on me. So I hope this inspires you to think about games where the player doesn’t just press buttons to remote-control characters in the scene, but uses familiar on-screen gestures (or even hand recognition) to feel like they are physically interacting with the virtual objects.
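Under the hood, this kind of direct manipulation usually means turning a 2D touch into a 3D position by intersecting the camera ray for that touch with a detected real-world surface, which is the geometry behind the “hit test” feature most AR frameworks expose. Here is a minimal sketch of that intersection in plain Python (the function names and the specific scene coordinates are my own illustrative assumptions, not any framework’s API):

```python
def dot(a, b):
    """Dot product of two 3D vectors given as tuples."""
    return sum(x * y for x, y in zip(a, b))

def ray_plane_hit(ray_origin, ray_dir, plane_point, plane_normal):
    """Intersect a camera ray (from an unprojected screen touch) with a
    detected plane. Returns the 3D hit point, or None if the ray is
    parallel to the plane or the plane lies behind the camera.

    Feeding this point back into the object's position each frame is
    what lets a drag gesture feel like touching the object itself.
    """
    denom = dot(ray_dir, plane_normal)
    if abs(denom) < 1e-9:      # ray runs parallel to the plane
        return None
    diff = tuple(p - o for p, o in zip(plane_point, ray_origin))
    t = dot(diff, plane_normal) / denom
    if t < 0:                  # intersection is behind the camera
        return None
    return tuple(o + t * d for o, d in zip(ray_origin, ray_dir))

# Camera held 1.2 m above a detected floor plane, user taps straight down:
hit = ray_plane_hit((0, 1.2, 0), (0, -1, 0), (0, 0, 0), (0, 1, 0))
print(hit)  # (0.0, 0.0, 0.0)
```

The same routine works for vertical planes by swapping in the wall’s point and normal, so one small piece of math covers dragging pieces across a table or pinning them to a wall.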

Regarding realism, I know that high-quality models, especially animated ones, are very expensive, and that cost, alongside performance factors, is probably one of the reasons they are not widely used in AR games. This is something I hope more developers will consider. Even a game like Jump AR could be more entertaining had it had a character like the one in Take that Elf!.

To be continued…

I discuss only two topics in this part because I wanted to make sure they are covered in depth and understood the way I intend, and I hope I managed to do that well.

Please don’t forget to follow my Facebook page, my Twitter page (I’m very active there), and my YouTube channel. Thank you.

The third part will come soon, so stay tuned!