I want to share with you a video I found on YouTube, made by Avaer Kazmer, an engineer who shares his hands-on experience with the Magic Leap One, and does so better than any media outlet out there. First, let’s check out the video.
Avaer first demonstrates the Magic Leap One meshing process, and he shows it in greater detail than any other place I’ve seen. I assume this was recorded using the built-in recording functionality described in this tweet by Magic Leap.
At minute 2:45 you can see one of the key differences between Augmented Reality and Mixed Reality: occlusion, the ability to mask virtual objects behind real physical objects in the scene. The Tilt Brush-like painting app’s UI is partially hidden behind the wooden logs in the ceiling as the user moves it around his room. It isn’t perfect, as you can see, and the accuracy depends on how accurately the real physical object is meshed by the Magic Leap One meshing engine.
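To illustrate the idea (a toy model, not Magic Leap’s actual rendering pipeline), occlusion boils down to a per-pixel depth comparison between the meshed real-world surface and the virtual content:

```python
def composite_pixel(virtual_color, virtual_depth, mesh_depth, real_color):
    """Toy per-pixel occlusion test: draw the virtual object only where it
    is closer to the viewer than the meshed real-world surface."""
    if virtual_depth is not None and virtual_depth < mesh_depth:
        return virtual_color  # virtual object is in front of the real surface
    return real_color         # the real world (e.g. a ceiling log) occludes it
```

If the mesh only roughly approximates the log’s shape, some pixels get the wrong depth and show the virtual UI where the log should be, which is exactly the kind of imperfection visible in the video.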
Using the 6DoF Controller
Using that same app, Avaer draws a sword and can actually attach it to the 6DoF controller, controlling the sword with his own hand, moving it from side to side, and so on.
Once the meshing process is done, any app you use will be aware of the mesh and can make use of it. In the paint app, for example, we can see the user pick up blocks; once he releases them, they drop onto the floor surface. This is similar to the surface detection capability we’ve seen in ARKit and ARCore. However, the big difference is that Magic Leap can create a more complex mesh structure that represents many object and surface types in your environment, not just flat horizontal and vertical surfaces. In ARKit 2.0, the engine can detect angular shapes, but it still doesn’t come close to full environment meshing.
For example, if you have a chair in the room, you can push a cube below it and you won’t see it, thanks to the masking (occlusion) process. I’m sure that in the future we will be able to detect objects’ materials and apply pre-defined physics attributes to the polygons that represent different real-world objects. This way, if you take a virtual ball with certain physical properties, say those of a standard basketball, and throw it on your bed, it won’t bounce back up, because the bed sheet will absorb some of the kinetic energy. Virtual objects would then behave like real objects when interacting with real-world objects. This isn’t available right now, but we might see it in the future.
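As a purely hypothetical sketch of that idea (no such API exists today, to my knowledge), each meshed surface could carry a restitution coefficient, so the same virtual ball bounces differently on a hard floor and on a bed:

```python
# Hypothetical material table: restitution is the fraction of impact
# speed the ball keeps after bouncing off that surface.
MATERIALS = {
    "hardwood_floor": 0.8,
    "bed_sheet": 0.1,  # the sheet absorbs most of the kinetic energy
}

def bounce_speed(impact_speed, surface_material):
    """Speed of a virtual ball immediately after hitting the surface."""
    return impact_speed * MATERIALS[surface_material]
```

The physics engine would pick the coefficient from whatever material the meshing engine assigned to the polygons under the ball.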
Limited Field of View
The Magic Leap One has an approximately 50-degree diagonal field of view. This means you can see some of the virtual content cropped as you use an app: you see the entire real scene, but virtual content is rendered only within part of it.
At around minute 3:55 you can actually see the content being cut off. I am not sure whether this is because of the limited field of view, but it seems like it. I think the image is a composite of the front camera feed and the virtual content drawn on the display.
At minute 4:37 and in other sections of the video, you can see that the virtual content appears semi-transparent, so you can actually see the real-world scene behind it. The virtual content does appear quite opaque, but with very slight transparency. I don’t know if a fully opaque option exists. From what I’ve heard from those who tried it, this is simply how the Magic Leap One renders virtual content: slightly semi-transparent. It makes the virtual content appear a bit less realistic.
That partial transparency is more prominent against a strong backlight (check out minute 4:44, when the fish moves near the window). Overall, though, Avaer Kazmer reports good visual fidelity, with the content mixing with the real world in a believable way.
The Importance of Audio
Avaer Kazmer mentioned the importance of spatial audio, especially in a mixed reality experience. Because virtual content is positioned around you in real-world space, it’s important to associate each sound with the exact location of its virtual object. This, of course, leads to better realism and immersion. It’s no different from Virtual Reality in that regard, and maybe even more important.
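As a rough illustration of what a spatializer does (a deliberately simplified stereo model, not Magic Leap’s actual audio engine), the per-ear gains can be derived from the virtual object’s position relative to the listener:

```python
import math

def stereo_gains(x, z, ref_distance=1.0):
    """Simplified spatialization: inverse-distance attenuation plus a
    constant-power pan. x is meters to the listener's right, z is forward."""
    distance = math.hypot(x, z)
    attenuation = ref_distance / max(distance, ref_distance)
    azimuth = math.atan2(x, z)               # 0 = straight ahead
    pan = (azimuth / math.pi + 1.0) / 2.0    # map [-pi, pi] to [0, 1]
    left = attenuation * math.cos(pan * math.pi / 2.0)
    right = attenuation * math.sin(pan * math.pi / 2.0)
    return left, right
```

A source directly ahead yields equal gains in both ears, while a source off to the right makes the right channel louder, so the virtual object sounds like it really sits where you see it.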
At minute 5:56 you can see Avaer using the controller to build dominoes on the floor. You can see the controller and the laser pointer that shows where it’s aimed. Judging by these frames and other interactions, the controller seems responsive and very accurate, with no annoying lag, but again, I would need to try it myself for a better assessment.
Magic Leap One Jittering
If you look at around minute 6:03 you can see some jittering in the position of the domino pieces (I’m not talking about occlusion misses). Overall, though, the experience shown in that demo seems to work really well, with little jittering. I’m sure this will improve in future versions, as it has with current AR frameworks.
Hand Gestures are Amazing
At minute 8:20 you can see a demonstration of how hand gestures work. This is such a useful feature for social interaction that I’m sure it will be used in many social apps, as it is in virtual reality apps.
This is one aspect where AR/MR glasses have a great advantage over mobile-device-based AR solutions: your hands are free, or at least one of them is if you’re holding the controller. With mobile-device-based AR, in most cases you must hold the device, leaving your hands unavailable for other interactions, let alone hand gestures.
Now imagine playing a spell-casting game in Mixed Reality. That would be so cool, and it’s definitely a type of game that’s perfect for the Magic Leap One but not for handheld AR. There seems to be a slight lag in reading the hand gestures, but maybe that’s down to this particular app, so it might be better than what we’ve seen in the social app demo that comes with the Magic Leap One.
Magic Leap One has two runtime environments, one of which enables multitasking, allowing one or more applications to run at the same time. Multitasking is important for Mixed Reality, especially for business use, when you want to do several things at once, somewhat like having multiple monitors in your office or home to improve productivity. For certain workflows it’s even essential.
Lumin Runtime apps (as opposed to Lumin SDK Immersive apps) support cooperative multitasking, so you can build apps that interface with third-party apps that expose certain APIs, or with other apps from the same developer.
Magic Leap One App Store
The Magic Leap One app store is called “Magic Leap World”. It seems to be the equivalent of the Apple App Store or Google Play Store. You can browse the available apps, but it appears to be at a very early stage, lacking the polished look and feel of a fully featured store. Users can see a live preview of an app, which should give a better idea of how it looks when used on the device. This is a useful idea considering the medium these apps run on, which is Mixed Reality. A static image just won’t convey how an app is used, and if you’re already browsing the store through the Magic Leap One glasses, why not give you a glimpse of what the app actually looks like in 3D? Cool stuff.
Continuous Meshing Process
At minute 12:40 you can see the Tonandi app. Avaer mentioned that it was designed this way not just to be a cool app, but because it also performs continuous meshing in the background.
Now, this reminds me of the initial surface scanning process in some ARKit apps, where the developer places a creature or object that the user needs to search for or follow, just to make the ARKit scanning process more user-friendly and fun. So this is a technique Magic Leap One developers can use to provide a seamless room-scanning procedure instead of explicitly asking users to move around and scan the room before using their app. Smart!
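A minimal sketch of that trick (with hypothetical helper names, not a real Magic Leap or ARKit API): steer the creature toward whichever region of the room is least meshed, so chasing it naturally completes the scan:

```python
def next_creature_target(mesh_coverage):
    """mesh_coverage maps a room region to the fraction of it already meshed
    (0.0 = unscanned, 1.0 = fully scanned). Send the creature to the gap."""
    return min(mesh_coverage, key=mesh_coverage.get)
```

Each time the user catches up with the creature, the app re-queries coverage and picks the next least-scanned region, until the whole room is meshed.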
Lack of Hand Haptic Feedback
When using hand gestures, you don’t get haptic feedback, which is quite obvious considering you aren’t using any type of controller, just your bare hands. This is something I think needs to be addressed at some point. I guess people will come up with controllers or wearable gadgets for the Magic Leap One that allow seamless hand gesture support while also enabling haptic feedback, just as in virtual reality. There are already some solutions for that, and maybe some of them will find their way to the Magic Leap One as well.
To be honest, this video was better than any of the official promotional content or coverage posted on leading media websites. It really opened my mind to some amazing things that can be developed for this platform. Watch the video until the very end and I’m sure you’ll be inspired by it.
This video got me more excited about the Magic Leap One than I was before seeing it, which is why I decided to write this long article and share my insights. Now I’m in an even worse situation, really wanting to get a Magic Leap One. Dang it 🙂