Many user input options

Magic Leap One has a large variety of User Input options

After watching Magic Leap’s livestream at Unite Berlin 2018, I was shocked to see how many user inputs are available for developers to use in their apps.

Developers will be able to access a wide variety of user inputs, including ones originating from the user’s hands (fingers, wrists, thumbs), eye state (fixation, dwelling, saccades, eye-rolling, etc.) and eye focus, arms, emotional reaction (based on eye analysis), head pose, body posture (crawling, standing, sitting), and voice. Each of those options provides a wide range of customization and/or fine-tuning options that allow even better accuracy, as well as the ability to combine several user inputs together. There is also the ability to identify the transition between two or more poses.
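To make that range more concrete, here is a rough sketch in Python of what a per-frame snapshot of those input channels might look like. The names and structure are my own illustration of the idea, not the actual Magic Leap SDK:

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional, List, Tuple

class BodyPosture(Enum):
    CRAWLING = auto()
    SITTING = auto()
    STANDING = auto()

class EyeState(Enum):
    FIXATION = auto()
    DWELLING = auto()
    SACCADE = auto()

@dataclass
class HandState:
    # Simplified: a few tracked points in headset space, plus a recognized gesture.
    finger_tips: List[Tuple[float, float, float]]
    thumb_tip: Tuple[float, float, float]
    wrist: Tuple[float, float, float]
    gesture: Optional[str] = None          # e.g. "point", "fist", "open-hand"

@dataclass
class InputSnapshot:
    """One frame's worth of user input, gathered from the various channels."""
    left_hand: Optional[HandState]
    right_hand: Optional[HandState]
    eye_state: EyeState
    gaze_direction: Tuple[float, float, float]   # unit vector the eyes look along
    head_pose: Tuple[tuple, tuple]                # (position, orientation quaternion)
    body_posture: BodyPosture
    voice_command: Optional[str] = None
    controller_pose: Optional[tuple] = None       # 6DoF controller, if tracked
```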

Let’s not forget the Magic Leap One motion controller itself, and the ability to build interfaces and interactions that combine the 6DoF motion controller with the other inputs, whether voice or non-verbal.

Having such a wide variety of user input options is great. It can help developers design experiences without being bound by the limitations of a controller. The controller still has some advantages of its own when used in Mixed Reality: it can operate without needing the headset’s sensors to detect it (e.g. outside the sensors’ detection range, like behind your back), it’s more accurate for apps that require a high degree of accuracy, and of course it has haptic feedback, which gives a better feeling when interacting with user interfaces.

This can be really intimidating at first. However, one of the main advantages is that simply knowing these user input facilities exist can inspire you to create new interactions that weren’t possible until now. I have so many ideas popping into my head just from learning that these types of user inputs exist.

That being said, you don’t need to get lost in it. You can think up great mixed reality experiences for Magic Leap One knowing that, most probably, the Magic Leap One will be able to deliver most of the user interactions you need.

I do think it’s important to understand all the options available. As developers and users, our minds are somewhat fixed on the interactions we already know from other technologies. We need to think outside the box and, in this case, develop for a platform that is something new to us. Many developers have had experience with augmented reality frameworks like ARCore and ARKit, but with Magic Leap One you have access to so much more data. Understanding the available user controls is only one part; there are lots of other considerations and data that you need to be aware of.

One of the complexities of designing user interfaces in Mixed Reality is understanding the user’s intent from those available controls, especially the ones that use pose and motion data. The good news is that Magic Leap will simplify some of these tasks, so you’ll be able to easily detect, for example, which object the user intended to interact with based on a combination of different user inputs: where the user is looking, what they are focusing on, where they are pointing their finger, and so on.
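As a back-of-the-envelope illustration (my own sketch, not how the Magic Leap SDK actually resolves intent), you could score candidate objects by how closely they line up with both the gaze ray and the pointing-finger ray, and pick the best match:

```python
import math

def angle_to(origin, direction, target):
    """Angle (radians) between a ray's direction and the vector from its origin to a target point."""
    to_target = tuple(t - o for t, o in zip(target, origin))
    dot = sum(d * t for d, t in zip(direction, to_target))
    norm = math.sqrt(sum(d * d for d in direction)) * math.sqrt(sum(t * t for t in to_target))
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def pick_intended_object(gaze_origin, gaze_dir, finger_origin, finger_dir, objects,
                         gaze_weight=0.6, finger_weight=0.4):
    """Return the (name, center) pair that best agrees with both the gaze and the pointing ray.

    The weights express how much each input channel should count; all of this is
    illustrative, not an SDK API.
    """
    def score(center):
        return (gaze_weight * angle_to(gaze_origin, gaze_dir, center) +
                finger_weight * angle_to(finger_origin, finger_dir, center))

    return min(objects, key=lambda obj: score(obj[1]))

# Example: the user looks and points roughly toward the lamp, not the door.
objects = [("lamp", (1.0, 1.5, 2.0)), ("door", (-2.0, 1.0, 3.0))]
print(pick_intended_object((0, 1.6, 0), (0.45, -0.05, 0.9),
                           (0.2, 1.2, 0), (0.4, 0.1, 0.9), objects))
```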

It’s easier, of course, when you just have a cursor on a 2D display or buttons to interact with the app, but in Mixed Reality things are completely different.

Having such a large variety of user input options will certainly make things more challenging in certain apps, but at the same time more exciting. Some developers might focus on a small range of user inputs, while others will want to take advantage of as many of them as possible in order to come up with a really impactful and immersive mixed reality experience. In some ways, I think that once a great experience that makes good use of the various available user inputs is introduced and becomes successful, other developers will be encouraged to do the same.

Just imagine a user playing a mixed reality game for the first time on the Magic Leap One. Think how amazing it will be when gazing at a certain point, performing a complex hand gesture, having a surprised look on your face, or changing your head orientation affects the gameplay itself. Most of those user controls are tied to physical movements you make with parts of your body. Now your body becomes a controller!

This also means that developers should carefully plan the implications of certain user controls, and the number of user controls used, both in terms of usability and complexity. Sometimes it might be better to develop apps and games that take advantage of only a small part of the available user inputs, or to gradually add more as the user learns the app or progresses within the game.
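One simple way to structure that gradual roll-out (again just an illustrative sketch, with names I’ve made up rather than anything from the SDK) is to gate input channels behind the player’s progression and ignore the rest:

```python
# Input channels unlocked at each stage of the game; earlier stages stay available.
UNLOCK_SCHEDULE = {
    1: {"controller"},                 # start with the familiar controller only
    2: {"gaze"},                       # then add eye gaze for selection
    3: {"hand_gestures"},              # then hand gestures
    4: {"voice", "head_pose"},         # finally voice commands and head pose
}

def enabled_inputs(player_level):
    """Union of everything unlocked at or below the player's current level."""
    channels = set()
    for level, new_channels in UNLOCK_SCHEDULE.items():
        if level <= player_level:
            channels |= new_channels
    return channels

def filter_inputs(raw_inputs, player_level):
    """Drop input events from channels the player hasn't been introduced to yet."""
    allowed = enabled_inputs(player_level)
    return [event for event in raw_inputs if event["channel"] in allowed]

# Example: a level-2 player's gesture events are ignored; gaze and controller pass through.
events = [{"channel": "gaze", "action": "fixate"},
          {"channel": "hand_gestures", "action": "pinch"},
          {"channel": "controller", "action": "trigger"}]
print(filter_inputs(events, player_level=2))
```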

The bottom line is: if you are a developer who plans to build apps and games for Magic Leap One, make sure you are aware of all the available user controls, both for inspiration, for creating more immersive MR apps, and for gaining a competitive advantage.