Today I downloaded the app ‘Silver Clouds’ by Matthew Swenson. Silver Clouds is inspired by Andy Warhol’s interdisciplinary installation of the same name. The original installation was the fruit of a collaboration between Warhol and engineer Billy Klüver, who founded the now-legendary Experiments in Art and Technology (E.A.T.), which explored the synthesis of art and science.
Before continuing with the app, I went to warhol.org to check out Andy Warhol’s original Silver Clouds installation at The Andy Warhol Museum, 117 Sandusky Street, Pittsburgh, PA 15212. I actually found a video of it here; check it out.
Andy Warhol was an American artist, director, and producer who was a leading figure in the visual art movement known as pop art (via Wikipedia).
After browsing through the website, I went back to the app. Silver Clouds is a simple augmented reality app where you place floating reflective silver balloons around you, watch their reflections change as you move, and see the balloons interact when they intersect.
Here is the app in action!
The beauty of this app, for me, is the reflection-mapping technique used to create a dynamic reflection that looks very realistic. This is apparently done by building a texture from the camera’s video stream. The reflection changed as I moved around the virtual balloons, changed the lighting, or moved my hand or other objects in the real-world scene.
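I don’t know how this app implements its reflections internally, but the classic technique for camera-driven reflections is spherical environment mapping: each reflection direction on the balloon’s surface is mapped to a coordinate in a 2D texture (here, presumably a frame from the camera). A minimal sketch of the standard sphere-mapping formula (the same one OpenGL’s `GL_SPHERE_MAP` uses), with all names my own:

```python
import math

def sphere_map_uv(reflection):
    """Map a 3D reflection vector (in eye space) to (u, v) texture
    coordinates using the classic sphere-mapping formula.
    The texture here would be a frame captured from the camera."""
    rx, ry, rz = reflection
    # Standard sphere-map denominator: 2 * |(rx, ry, rz + 1)|
    m = 2.0 * math.sqrt(rx * rx + ry * ry + (rz + 1.0) ** 2)
    u = rx / m + 0.5
    v = ry / m + 0.5
    return u, v

# A reflection pointing straight back at the viewer samples
# the center of the camera texture:
print(sphere_map_uv((0.0, 0.0, 1.0)))  # → (0.5, 0.5)
```

Because the texture comes from the live camera feed, the reflection updates whenever the scene changes, which matches the dynamic behavior I observed.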
Although it seems real, it does behave unnaturally. I moved closer to a balloon and noticed that when I rotated the iPad (and thus the camera), the reflected image rotated as well; but once you know how the “trick” is done, that behavior is not unexpected. Of course, the reflections can’t show the scene behind you, because that part is never captured by the camera.
So the front side of the balloon (the one facing you) should reflect the scene behind you, but in fact it reflects the scene in front of you. This is obviously not a physically accurate reflection, but it is better than a reflection map built from a predefined image that has nothing to do with the scene and doesn’t react to dynamic changes within it, which this technique does.
The AR Experience
I enjoyed this AR experience, especially with the relaxing and enchanting soundtrack that runs in the background. However, collisions between the camera and the balloons didn’t work that well. I tried bumping into the balloons, and they moved only slightly or were pushed together as a group. This is probably because the app doesn’t require a surface scan, so it has no anchor point from which to compute relative movement on collision.
I noticed that the balloons did rotate based on changes in the image stream. The app probably tracks changes between frames and moves the balloons based on the movement pattern found by the image analysis. For example, if movement is detected in the top-left corner and a balloon is located there, you see that balloon react. Of course, if you place many balloons, collisions take effect and a moving balloon will push nearby balloons as well.
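This is speculation on my part, but the simplest version of “track changes in the image stream and react locally” is to split each frame into a grid and flag cells whose pixels changed significantly since the previous frame. A toy sketch (all names and thresholds are my own, not the app’s):

```python
def motion_regions(prev, curr, grid=2, threshold=10):
    """Split two same-sized grayscale frames (2D lists of pixel
    values) into a grid and report which cells changed, as
    (row, col) pairs. A cell counts as 'changed' when its mean
    absolute pixel difference exceeds `threshold`. A balloon
    rendered over a changed cell could then be nudged."""
    h, w = len(prev), len(prev[0])
    cell_h, cell_w = h // grid, w // grid
    changed = []
    for gy in range(grid):
        for gx in range(grid):
            total = count = 0
            for y in range(gy * cell_h, (gy + 1) * cell_h):
                for x in range(gx * cell_w, (gx + 1) * cell_w):
                    total += abs(curr[y][x] - prev[y][x])
                    count += 1
            if total / count > threshold:
                changed.append((gy, gx))
    return changed
```

For example, waving a hand only in the top-left quadrant of the frame would flag cell `(0, 0)`, and only balloons positioned over that region would react.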
The developer used several techniques to make this type of augmented reality experience as realistic as possible under the current technology limitations, and did a very good job with them. If you don’t dig into the internals of this app and over-analyze it, it will look super realistic to you. It’s as if those virtual objects really do exist in real-world space and are affected by changes happening in the real-world surroundings.
It would be great if the balloons would react when I actually touch each one of them with my hand 🙂
So how was it done?
I don’t know which exact libraries were used to create this app, but I came across a library on GitHub called ARCameraLighting. It’s for Unity: it captures the camera video frame and uses it for spherical environment mapping, and it supports ARKit 1.5 and ARCore 1.2. The results are not perfect, but very convincing.
Regarding the second technique, the one used to detect movement in the scene: it could be image detection or 2D/3D object tracking, but I haven’t dived deep enough to find related libraries for it.
In ARKit 2.0 (iOS 12.0) we have 3D object detection, which means this technology can help create even more realistic results. To make it work in more situations (in many cases it’s unreliable without any z-axis object movement information), image analysis is a good choice, I believe. If you combine several technologies, you can obviously produce results that look and feel much more realistic.
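One cheap way to recover a z-axis signal from plain image analysis, without 3D object detection, follows from the pinhole-camera model: an object’s apparent size on screen is inversely proportional to its distance. A hypothetical sketch (my own names, not anything from the app or ARKit):

```python
def z_motion_from_scale(size_prev, size_curr, distance_prev):
    """Under a pinhole-camera model, apparent size is inversely
    proportional to distance. Given an object's apparent size (in
    pixels) in two frames and a previous distance estimate, return
    the new distance and the signed z movement (negative means the
    object moved toward the camera)."""
    distance_curr = distance_prev * size_prev / size_curr
    return distance_curr, distance_curr - distance_prev

# An object that doubles in apparent size has halved its distance:
print(z_motion_from_scale(50, 100, 2.0))  # → (1.0, -1.0)
```

Combining a 2D movement vector from image tracking with a scale-based z estimate like this would give a rough 3D movement vector for a tracked object.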
I did a little test. I placed balloons in front of a rotating fan. I noticed that when the fan rotated left, the balloons moved left, and when it rotated right, the balloons moved right (it wasn’t consistent, though). That leads me to think that the app uses an image-tracking algorithm that can output a 2D or 3D movement vector. For example, if there is a ball and it keeps getting larger, that might translate to movement along the z-axis. Again, I am not 100% sure; maybe you can shed some light on how it was done.
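To make the “2D movement vector” idea concrete: the textbook way to extract one from two frames is block matching, i.e. trying every small shift and keeping the one that best aligns the frames. A brute-force sketch (my own toy code, not the app’s algorithm; real trackers use optical flow or feature matching):

```python
def estimate_shift(prev, curr, max_shift=2):
    """Estimate the dominant 2D motion between two small grayscale
    frames (2D lists of pixel values) by exhaustive block matching:
    try every (dx, dy) within ±max_shift and keep the shift that
    minimizes the mean absolute difference over the overlap."""
    h, w = len(prev), len(prev[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = n = 0
            # Compare only the region where both frames overlap
            # under this shift: curr[y+dy][x+dx] vs prev[y][x].
            for y in range(max(0, -dy), min(h, h - dy)):
                for x in range(max(0, -dx), min(w, w - dx)):
                    err += abs(curr[y + dy][x + dx] - prev[y][x])
                    n += 1
            if n and err / n < best_err:
                best_err, best = err / n, (dx, dy)
    return best  # (dx, dy): how far the scene moved right/down
```

A bright spot that moves one pixel to the right between frames yields `(1, 0)`, which an app could use to nudge the balloons in that direction.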
If you are a developer, I’m sure you’ll find those resources quickly, but this should help some developers get started.
Silver Clouds is a unique app, unlike anything I’ve seen before. It’s great to see what developers can do when they capitalize on advanced image analysis technologies to deliver exciting AR experiences.
The virtual objects felt very real (to an extent, of course), and the experience was exciting and very convincing. You can also record videos using the built-in camera feature. If you have a kid, I recommend placing some balloons and letting him or her run through them to see how they react.
Silver Clouds is available for free on the App Store, and I highly recommend that everyone try it out, regular users and developers alike. It can open your mind to new creative ideas.
Download Silver Clouds from the App Store here (for iPhone and iPad).