
Designing Great Mixed Reality Games – Part 3

When we design games, knowing what input methods are available for the player to interact with the game can dictate how we design certain parts of it, and how we make sure the control scheme is applied in a way that is the most comfortable and accessible to the players who are going to play our game.

In this Part 3 of my “Designing Great Mixed Reality Games” series, we’ll take a brief look at the available input methods, then talk about the 6DoF Control, its pros and cons, and how to make sure we design our game with that controller in mind. After that, I’ll give you some examples of creating a control scheme for two AR games that I’ve already reviewed on my site, and we’ll see how we can make them work well with that new controller.

Various Input Methods

The good news is that the Magic Leap One offers a variety of input methods for the user to interact with virtual content in your game. These include:

  • The 6DoF Control
  • External Bluetooth Keyboard
  • Headpose
  • Eye Gaze
  • Eight pre-defined hand gestures
  • Voice
  • Mobile app (a Control alternative that mimics the Controller digitally)

Magic Leap recommends making the 6DoF Control (position and rotation) your primary input method. This control provides a comfortable, reliable, responsive and accurate way to interact with virtual content in mixed reality. It has a touchpad that supports touch interaction and also provides tactile feedback (10 different haptics) when manipulating objects. You can comfortably manipulate objects close to or far away from your location using a control ray, a visible laser-like beam which makes it easy to target objects from a distance. The Control also includes a bumper and a trigger button, as well as a back/home button.
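
To make this more concrete, here is a minimal sketch of how a game might read the Control’s 6DoF pose and buttons each frame and turn them into a targeting ray. The ControlState struct and helper names are my own assumptions for illustration, not the actual Magic Leap SDK.

```cpp
// Minimal sketch (not the Magic Leap SDK): a hypothetical per-frame read of the
// Control's 6DoF pose and buttons, used to aim a selection ray into the scene.
#include <array>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Hypothetical snapshot of the Control's state for one frame.
struct ControlState {
    Vec3  position;             // 3DoF position of the Control
    Vec3  forward;              // unit vector derived from its 3DoF orientation
    bool  triggerPressed;       // analog trigger, simplified to a bool here
    bool  bumperPressed;
    std::array<float, 2> touch; // touchpad coordinates in [-1, 1] x [-1, 1]
};

// The "laser beam" ray used for targeting distant objects.
struct Ray { Vec3 origin; Vec3 direction; };

Ray MakeControlRay(const ControlState& c) {
    // Start the beam a few centimeters in front of the Control
    // (~2 inches, about 0.05 m, per the recommendation mentioned later).
    const float kOffset = 0.05f;
    Ray ray;
    ray.origin = { c.position.x + c.forward.x * kOffset,
                   c.position.y + c.forward.y * kOffset,
                   c.position.z + c.forward.z * kOffset };
    ray.direction = c.forward;
    return ray;
}

int main() {
    ControlState state{{0, 1.2f, 0}, {0, 0, -1}, true, false, {0.0f, 0.8f}};
    Ray beam = MakeControlRay(state);
    std::printf("beam origin z=%.2f, trigger=%d\n", beam.origin.z, state.triggerPressed);
}
```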

Touchscreen Interactions are No More

If you come from mobile AR development, you have probably noticed that the fine touch gestures we are so used to on mobile phones do not exist on the Magic Leap One.

Screenshot from the AR game The Birdcage.

On-screen touch gestures allowed developers to create games that require fine control. You could tap on items that were very small in size, and you always had the option to get closer to the content for even more precise control.

For example, when I played the AR game The Birdcage, I needed to get very close to the content, like the levers, to be able to interact with them (see the screenshot above).

The Magic Leap One has a limitation: content nearer than 37 cm (14.57 inches) is clipped (hidden). This means that I cannot get too close to objects in the game.
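
A minimal sketch of how to respect that limit: the 0.37 m figure comes from the limitation above, while the safety margin, the Vec3 math and the helper name are my own assumptions for illustration.

```cpp
// Minimal sketch: keep interactive content outside the near-clip distance so
// it never gets hidden. Only the 0.37 m value comes from the limitation above.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

constexpr float kNearClipMeters = 0.37f; // content closer than this is clipped
constexpr float kSafetyMargin   = 0.10f; // assumed extra breathing room

// Push a desired spawn position away from the user if it is too close.
Vec3 ClampToComfortDistance(const Vec3& head, const Vec3& desired) {
    Vec3 offset{desired.x - head.x, desired.y - head.y, desired.z - head.z};
    float dist = std::sqrt(offset.x * offset.x + offset.y * offset.y + offset.z * offset.z);
    float minDist = kNearClipMeters + kSafetyMargin;
    if (dist >= minDist || dist == 0.0f) return desired;
    float scale = minDist / dist;
    return {head.x + offset.x * scale, head.y + offset.y * scale, head.z + offset.z * scale};
}

int main() {
    Vec3 head{0, 0, 0};
    Vec3 tooClose{0, 0, -0.20f};            // 20 cm away: would be clipped
    Vec3 fixedPos = ClampToComfortDistance(head, tooClose);
    std::printf("placed at %.2f m from the user\n", -fixedPos.z);
}
```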

Knowing this means you should design the game with interactions that take that limitation into account. Furthermore, unlike mobile AR, you no longer have the comfortable touch gestures you were used to before, which means users will need to adapt to a new control scheme, and the controls need to be designed well for the user to feel in control of how he or she interacts with the game. You do have the option to use hand gestures, but Magic Leap recommends against relying on gestures for lengthy use, as they require users to raise their arms into the sensor’s detection area; this puts physical effort on the arms and can be tiring over longer periods of time.

As for the touchpad: the Control’s touchpad is quite small and designed to be used with your thumb. There is no second Control for the other hand; in that sense, the Control is like a mouse for a PC. The touchpad doesn’t support multi-hand or multi-finger gestures, but it can recognize touches, circular thumb gestures (clockwise and counterclockwise), swipes in different directions (up, down, left, right), and it can detect which area of the touchpad is being pressed with pressure (e.g. the bottom, top, right or left edges).
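
Here is a minimal sketch of what classifying those touchpad inputs could look like. The coordinate convention, thresholds and gesture names are all assumptions for illustration, not the actual SDK.

```cpp
// Minimal sketch: classifying raw touchpad samples into the kinds of gestures
// described above (edge presses and directional swipes). Coordinates are
// assumed to be in [-1, 1] on both axes, with +y toward the top edge.
#include <cmath>
#include <cstdio>

enum class PadGesture { None, PressTop, PressBottom, PressLeft, PressRight,
                        SwipeUp, SwipeDown, SwipeLeft, SwipeRight };

struct TouchSample { float x, y; float force; }; // one touchpad reading

// Pressure press on an edge: high force near one of the four edges.
PadGesture ClassifyPress(const TouchSample& t) {
    const float kForceThreshold = 0.6f;
    const float kEdge = 0.5f;
    if (t.force < kForceThreshold) return PadGesture::None;
    if (t.y >  kEdge) return PadGesture::PressTop;
    if (t.y < -kEdge) return PadGesture::PressBottom;
    if (t.x >  kEdge) return PadGesture::PressRight;
    if (t.x < -kEdge) return PadGesture::PressLeft;
    return PadGesture::None;
}

// Swipe: compare where the touch started with where it ended.
PadGesture ClassifySwipe(const TouchSample& start, const TouchSample& end) {
    const float kMinTravel = 0.4f;
    float dx = end.x - start.x, dy = end.y - start.y;
    if (std::fabs(dx) < kMinTravel && std::fabs(dy) < kMinTravel) return PadGesture::None;
    if (std::fabs(dx) > std::fabs(dy))
        return dx > 0 ? PadGesture::SwipeRight : PadGesture::SwipeLeft;
    return dy > 0 ? PadGesture::SwipeUp : PadGesture::SwipeDown;
}

int main() {
    TouchSample down{0.1f, -0.9f, 0.8f};  // hard press near the bottom edge
    TouchSample up{0.8f, -0.85f, 0.2f};   // thumb ended far to the right
    std::printf("press=%d swipe=%d\n",
                static_cast<int>(ClassifyPress(down)),
                static_cast<int>(ClassifySwipe(down, up)));
}
```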

The Controller is therefore less accurate than handheld AR touch input, but it is more comfortable and offers a good degree of flexibility for manipulating objects.

For example (based on the examples in the official documentation, under the Design Manipulation section; a code sketch follows the list):

  • Rotate using a circular thumb gesture on the touchpad
  • Scale by pressing left or right edges
  • Push or Pull by pressing the top or bottom edges
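
A rough sketch of that mapping, with an illustrative gesture enum and object struct rather than any official API, might look like this:

```cpp
// Minimal sketch of the mapping above: circular thumb gestures rotate the
// selected object, left/right edge presses scale it, and top/bottom edge
// presses push or pull it along the Control's beam.
#include <cstdio>

enum class Gesture { RadialClockwise, RadialCounterClockwise,
                     PressLeft, PressRight, PressTop, PressBottom };

struct Manipulated {
    float yawDegrees = 0.0f;   // rotation around the vertical axis
    float scale      = 1.0f;   // uniform scale
    float distance   = 1.5f;   // meters along the Control's beam
};

void ApplyGesture(Manipulated& obj, Gesture g, float dt) {
    const float kRotateSpeed = 90.0f; // degrees per second (assumed)
    const float kScaleSpeed  = 0.5f;  // scale change per second (assumed)
    const float kPushSpeed   = 0.75f; // meters per second (assumed)
    switch (g) {
        case Gesture::RadialClockwise:        obj.yawDegrees += kRotateSpeed * dt; break;
        case Gesture::RadialCounterClockwise: obj.yawDegrees -= kRotateSpeed * dt; break;
        case Gesture::PressRight:             obj.scale      += kScaleSpeed * dt;  break;
        case Gesture::PressLeft:              obj.scale      -= kScaleSpeed * dt;  break;
        case Gesture::PressTop:               obj.distance   += kPushSpeed * dt;   break; // push away
        case Gesture::PressBottom:            obj.distance   -= kPushSpeed * dt;   break; // pull closer
    }
    if (obj.scale < 0.1f) obj.scale = 0.1f; // never let the object collapse
}

int main() {
    Manipulated cage;
    ApplyGesture(cage, Gesture::RadialClockwise, 0.5f);
    ApplyGesture(cage, Gesture::PressTop, 0.5f);
    std::printf("yaw=%.1f scale=%.2f distance=%.2f\n", cage.yawDegrees, cage.scale, cage.distance);
}
```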

Making interactions that do not require very finely tuned movement is not complicated; however, it’s important to understand that you are now manipulating objects in 3D space. The 6DoF Control was actually designed to make that type of content manipulation easier, as it can detect both position and orientation in 3D space.

Making fine-tuned controls, if your game needs them, can be tricky. There are a few ways to tackle this (a sketch of the last idea follows the list):

  • Combine the Control input method with another input method, like headpose or eye gaze
  • Use hand gestures where appropriate
  • Make objects larger so they are easier to manipulate or allow the user to scale an object or the entire game space
  • Provide a zoom-overlay (not recommended)
  • Create a large secondary custom handle for fine control
  • Base the movement on rotation rather than on limited straight movement
  • Use tapping to provide gradual movement
  • Use a combined control scheme, like holding the trigger to increase the sensitivity of the main control you are using, swiping for example
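
Here is a minimal sketch of that last trigger-modifier idea, where I have interpreted “increase the sensitivity” as switching into a finer, more precise swipe mode while the trigger is held. The values and names are placeholders, not anything official.

```cpp
// Minimal sketch: holding the trigger switches touchpad swipes into a
// fine-control mode by scaling their sensitivity.
#include <cstdio>

struct SwipeInput {
    float deltaX;        // touchpad movement this frame, in [-1, 1] units
    bool  triggerHeld;   // modifier button
};

// Convert a raw swipe into object movement, coarser or finer depending on
// whether the trigger is held.
float SwipeToMovement(const SwipeInput& in) {
    const float kCoarseMetersPerUnit = 0.40f; // assumed base sensitivity
    const float kFineFactor          = 0.25f; // 4x more precise while held
    float sensitivity = kCoarseMetersPerUnit * (in.triggerHeld ? kFineFactor : 1.0f);
    return in.deltaX * sensitivity;
}

int main() {
    SwipeInput coarse{0.5f, false};
    SwipeInput fine{0.5f, true};
    std::printf("coarse move: %.3f m, fine move: %.3f m\n",
                SwipeToMovement(coarse), SwipeToMovement(fine));
}
```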

You can define the control scheme as you see fit for your game. The important thing is to keep the control scheme simple.

Let’s take the AR game The Birdcage and see how we can apply a control scheme that works well for that particular game using the 6DoF Control.

The gesture mapping isn’t a good fit for interacting with the cage in this game, but the 6DoF Control gives us good flexibility. Here is a control scheme I think will work well for this game (a rough sketch of the logic follows the list):

  • We use a straight pointer/laser beam (following Magic Leap’s recommendation), making the beam start approx. 2 inches in front of the Control and extend up to the digital object it points to.
  • The Bumper is only used for getting in and out of the cage control mode. When the mode is on, the cage is selected, so there is no need to make it targetable. Swipe left on the touchpad to rotate the cage. Tap the Bumper again to exit the cage control mode.
  • Any actionable object that is part of the puzzle is made targetable. When the user points the 6DoF Control at it (we can even combine this with eye gaze; headpose is not accurate enough for this due to the relatively small size of the game level, i.e. the cage), he or she can use the touchpad’s rotate, swipe and pressure-tap gestures to interact with those elements. This is a puzzle game, and highlighting actionable objects could ruin the experience for the player, so we don’t use highlighting.
  • Anything that needs to be dragged and dropped onto the right panel (or into a container, if mixed reality ends up handling this differently, and I believe it will) is done with the Trigger button: click and hold to “Hold”, release the button to “Drop”.
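
Here is a rough sketch of that scheme as a small state machine. All of the types and names are illustrative, not from the actual game or any SDK.

```cpp
// Rough sketch of the Birdcage scheme above: the Bumper toggles "cage mode",
// swipes rotate the cage while that mode is on, and the Trigger implements
// hold/drop on whatever the beam is targeting.
#include <cstdio>

struct FrameInput {
    bool  bumperTapped;
    bool  triggerHeld;
    float swipeX;          // horizontal touchpad swipe this frame
    int   targetedObject;  // id of the object under the beam, -1 if none
};

class BirdcageControls {
public:
    void Update(const FrameInput& in) {
        if (in.bumperTapped) cageMode_ = !cageMode_;   // enter/exit cage mode

        if (cageMode_) {
            cageYaw_ += in.swipeX * 45.0f;             // swipe rotates the whole cage
            return;                                    // puzzle objects ignored in this mode
        }

        if (in.triggerHeld && heldObject_ == -1 && in.targetedObject != -1)
            heldObject_ = in.targetedObject;           // click & hold = "Hold"
        else if (!in.triggerHeld && heldObject_ != -1)
            heldObject_ = -1;                          // release = "Drop"
    }

    bool InCageMode() const { return cageMode_; }
    int  HeldObject() const { return heldObject_; }

private:
    bool  cageMode_   = false;
    float cageYaw_    = 0.0f;
    int   heldObject_ = -1;
};

int main() {
    BirdcageControls controls;
    controls.Update({true, false, 0.0f, -1});   // tap bumper: enter cage mode
    controls.Update({false, false, 0.3f, -1});  // swipe: rotate the cage
    controls.Update({true, false, 0.0f, -1});   // tap bumper again: exit
    controls.Update({false, true, 0.0f, 7});    // hold trigger on object 7
    std::printf("cage mode=%d held=%d\n", controls.InCageMode(), controls.HeldObject());
}
```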

This is a quick example in a nutshell. If I spent more time on it, I would probably find a more optimized control scheme, but then I would need to go back to the game and make sure I remember all the interactions.

It’s important to design the game in a way that the user won’t need to make very small movements with the Control, as this can become annoying and frustrating for the player. For some games, this is not a problem, for example a Jenga game.

A Jenga game is actually very easy to create a Control scheme for, because the main interaction is just pulling or pushing blocks.

Screenshot from the AR game Wobbly Stack AR.

Each block is targetable. You can use eye gaze to fine-tune the selection, but I think just using the Control is good enough for this type of game. We use the Control’s trigger button to select and deselect a block: press & hold to lock it, move the controller in any direction to push or pull it, and release the trigger to let it go (or to drop the block, if you intend to place it on top of the stack).

Targeting small objects or fast-moving objects is much harder. I recommend putting a limit on how far away an object can go, so it won’t appear too small and become difficult for the user to interact with. The same goes for speed (unless speed is how you increase the difficulty in your particular game): try to make objects move relatively slowly and at a predictable pace and direction.
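
As a minimal sketch of both limits, here is one way to clamp an object’s distance from the player and its speed. The specific numbers are my own placeholders, not recommendations from Magic Leap.

```cpp
// Minimal sketch: cap how far a targetable object may drift from the player
// and cap its speed so targeting stays comfortable and predictable.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

constexpr float kMaxTargetDistance = 4.0f;  // meters from the player (assumed)
constexpr float kMaxSpeed          = 1.0f;  // meters per second (assumed)

float Length(const Vec3& v) { return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z); }
Vec3  Scale(const Vec3& v, float s) { return {v.x * s, v.y * s, v.z * s}; }

// Clamp an object's velocity and keep it within targeting range of the player.
void ConstrainObject(const Vec3& player, Vec3& position, Vec3& velocity) {
    float speed = Length(velocity);
    if (speed > kMaxSpeed) velocity = Scale(velocity, kMaxSpeed / speed);

    Vec3 offset{position.x - player.x, position.y - player.y, position.z - player.z};
    float dist = Length(offset);
    if (dist > kMaxTargetDistance) {
        offset = Scale(offset, kMaxTargetDistance / dist);
        position = {player.x + offset.x, player.y + offset.y, player.z + offset.z};
    }
}

int main() {
    Vec3 player{0, 0, 0};
    Vec3 pos{0, 0, -6.0f};   // too far away to target comfortably
    Vec3 vel{0, 0, -3.0f};   // moving too fast
    ConstrainObject(player, pos, vel);
    std::printf("distance=%.1f m, speed=%.1f m/s\n", Length(pos), Length(vel));
}
```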

Summary

Magic Leap One developers and users will need to adapt to a new control scheme. As a developer, try to keep things simple and follow the official recommendations. Make sure the game controls are easy to understand, preferably without simultaneous button and touchpad interaction.

You can design the app so it is optimized for the primary control. By doing so, you will be able to deliver a more comfortable, fun and immersive experience to your users.

Although the range of available inputs is broad, it is recommended to develop your game with the primary control (the 6DoF Control) in mind and use the other input methods as complementary or alternative to your main one. Giving users the option to remap the controls is something some of them will appreciate, but it’s totally optional.

I’ve just touched on the basics, but this guide should give you a good start and an overall understanding of how to make the shift from mobile AR to wearable mixed reality, in this case the Magic Leap One, with a focus on the primary input method, the 6DoF Control.

As always, I highly recommend that every developer visit the Creator portal on magicleap.com and read the official documentation to build good foundations.

If you find this guide useful, please don’t forget to share it. This will motivate me to write more guides in the near future. Don’t forget to check out Part 1 and Part 2 as well. Thanks.