Magic Leap has a very inviting, enthusiastic, and active community (I’m a part of it). I regularly check out the videos that developers are sharing and take part in active online communities, discussing the technology and what they’re working on with other devs.
It starts with developers testing out the bundled applications, and many Magic Leap developers have now moved on to the next stage: creating apps themselves. Many of those apps are prototypes, with developers experimenting with input methods that are completely new to some, especially those who haven’t developed for other mixed reality platforms like the HoloLens. Those who come from HoloLens development will feel very much at home, but there are still new things to learn.
One of my favorites is an app made by Lucas Rizzotto called “AR Superpower”. It demonstrates the hand-gesture input method. One of the reasons I am excited about it is that we have finally reached a place where our hands are free, and we can actually use them to create cool interactions.
Here is the video posted by Lucas Rizzotto. You can probably see where this is going: a spellcasting game, scene staging with visual effects, cool social interactions, and so on.
As developers continue to explore this uncharted territory, new ideas will come up, ideas for creating new types of unique experiences, whether it’s apps, games, or novel user interface implementations.
Another cool demo was done by James Ashley, who shows how eye tracking can be used to create some really nice interactions. He called it “Magic Leap Heat Vision Demo”, but I see X-Men’s Cyclops laser eyes and a cool game that could be created with it. This demo shows how the eye-gaze user input of the Magic Leap One can be used to create unique experiences. You can see each eye being tracked individually.
Usually, eye-gaze input isn’t recommended, at least not for long interactions, as it can lead to eye strain, especially for fine interactions. But it can be used for certain interactions in an app, or combined with other input methods for better accuracy. In some apps it can even be used throughout, especially when interacting with user interfaces where you need to know the user’s intent. A user can look at a certain object, but headpose alone won’t tell you which object the user intends to interact with; eye-gaze can tell you that.
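To make the intent-disambiguation idea concrete, here is a minimal sketch (plain Python, not the Magic Leap API; `pick_target`, the object list, and the vectors are all hypothetical) of selecting the object whose direction deviates least from the gaze ray. Two objects can sit equally "in front" of the headpose, yet the gaze ray still singles one out:

```python
import math

def pick_target(origin, gaze_dir, objects):
    """Return the object whose direction from `origin` is closest
    in angle to the gaze ray `gaze_dir` (a unit vector).
    This is an illustrative helper, not a real Magic Leap call."""
    def angle_to(pos):
        # direction from the gaze origin to the object's center
        d = [p - o for p, o in zip(pos, origin)]
        norm = math.sqrt(sum(c * c for c in d))
        # angle between the gaze direction and that object direction
        cos = sum(g * c for g, c in zip(gaze_dir, d)) / norm
        return math.acos(max(-1.0, min(1.0, cos)))
    return min(objects, key=lambda obj: angle_to(obj["position"]))

# Two objects at the same forward distance: headpose-forward alone
# can't disambiguate them, but the gaze ray can.
objects = [
    {"name": "left_cube",  "position": (-0.5, 0.0, 2.0)},
    {"name": "right_cube", "position": ( 0.5, 0.0, 2.0)},
]
gaze = (0.24, 0.0, 0.97)  # gaze tilted slightly to the right
print(pick_target((0.0, 0.0, 0.0), gaze, objects)["name"])  # → right_cube
```

In a real app the origin and direction would come from the headset’s eye-tracking data each frame, and you would typically smooth the gaze signal over a few frames before selecting, since raw gaze jitters.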
By the way, when you see the lasers cutting off, it’s because the user blinked, as mentioned in the description of the video.
Overall, it’s exciting to see so many developers already putting a great deal of effort into getting familiar with this emerging technology and helping push it forward. Now that the technology is out there, it’s all about creativity, and watching what amazing things creators are able to make with it.
By the way, if you want to be a part of a great Magic Leap community, connect with Noah; he will introduce you to this amazing community. Great guy!
Oh, before I forget, here is a gift for you.