
ARKit 2.0 Realistic Image-based Lighting for Virtual Objects

ARKit 2.0 also introduced an interesting feature that allows light from the real-world surroundings to affect augmented objects within the AR virtual environment.

You can find more information about this feature in the ARKit documentation under AREnvironmentProbeAnchor. As explained there, ARKit automatically generates environment textures during the AR session from camera imagery, allowing SceneKit or a custom rendering engine to provide realistic image-based lighting for virtual objects.
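For reference, in ARKit 2.0 this is opt-in on the session configuration. Here is a minimal sketch of how you might enable it (the view controller and outlet name are my own placeholders, not from Apple's docs):

```swift
import UIKit
import ARKit

class ARViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!  // assumed Storyboard outlet

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        let configuration = ARWorldTrackingConfiguration()
        // Ask ARKit to generate environment probes automatically
        // from the camera feed; SceneKit then uses the resulting
        // cube maps for image-based lighting on PBR materials.
        configuration.environmentTexturing = .automatic

        sceneView.session.run(configuration)
    }
}
```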

I remember using this kind of environment reflection map when I designed scenes in 3D Studio Max many years back. But this is completely new tech: it takes images of the real-world scene and recreates image-based lighting for virtual objects that are not part of that scene. This makes virtual objects blend more seamlessly with the environment, because they inherit some of the physical properties of real-world materials.

It seems that a new Unity plugin already supports it. Dan Miller, a game developer, posted a short video showcasing it.

Tim Field, founder of Abound Labs, shared a tweet explaining how this feature works. ARKit 2.0 generates environment cube map textures using the rear-facing camera while the AR session is taking place. It then builds a 360-degree panorama image that is then used as image-based lighting for the virtual objects in the scene.
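ARKit also supports placing those probes yourself, and the captured cube map is exposed on the anchor as a Metal texture, which is handy if you are writing a custom renderer. A rough sketch, assuming a session already configured with manual environment texturing:

```swift
import ARKit

// Requires: configuration.environmentTexturing = .manual
func addEnvironmentProbe(to session: ARSession) {
    // Place a probe 0.5 m in front of the world origin, covering a
    // 2 m cube; ARKit fills in its cube map over time from the
    // rear camera as the user looks around.
    var transform = matrix_identity_float4x4
    transform.columns.3.z = -0.5

    let probeAnchor = AREnvironmentProbeAnchor(
        transform: transform,
        extent: simd_float3(2, 2, 2)
    )
    session.add(anchor: probeAnchor)
}

// Later, e.g. in the session delegate's didUpdate callback, the
// captured cube map is available as a Metal texture:
func inspectProbe(_ anchor: ARAnchor) {
    if let probe = anchor as? AREnvironmentProbeAnchor,
       let cubeMap = probe.environmentTexture {
        print("Probe cube map: \(cubeMap.width)x\(cubeMap.height)")
    }
}
```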

As you can see, the results are nothing less than awe-inspiring. Any app that renders a reflective object can benefit from it. Just thinking about this, I recall two car-showroom apps that could definitely benefit: BMW i Visualizer and the W Motors app.

Beautiful 3D car render

When reviewing those two apps, I remember paying attention to the reflections and noticing that the reflected image didn't match the place I was in. It still looked pretty convincing; not everyone pays close attention to these fine details.

Having realistic image-based lighting for virtual objects results in a higher degree of realism and more seamless blending of virtual objects into a real-world scene, at least for objects that are meant to look hyper-realistic in augmented reality.
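To actually see those reflections, the virtual object needs a physically based material. Something like this illustrative chrome-like SceneKit material (the values are my own, chosen to make the effect obvious) would pick up the generated environment lighting:

```swift
import SceneKit

// A shiny, metallic sphere that reflects the captured environment.
// A fully metallic, low-roughness surface shows the effect best.
let sphere = SCNSphere(radius: 0.1)
let material = SCNMaterial()
material.lightingModel = .physicallyBased
material.metalness.contents = 1.0   // fully metallic
material.roughness.contents = 0.05  // near-mirror finish
sphere.materials = [material]

let node = SCNNode(geometry: sphere)
```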

By the way, real-time reflections aren't brand new in AR. I saw it done using Vuforia and Unity a year ago, and this video was made close to four years back.

So if it was possible to achieve this such a long time ago, how come we are only getting it now in ARKit? Anyway, I am trying to learn more about the technical aspects and understand how developers achieved it using Unity and other frameworks.

If you have some insight about that, please share it in the comment section below; I would really love to read your opinions.