An experiment in augmenting augmented reality. What happens when physical reality is aware of augmented reality? When devices share their augmented reality space?
In this experiment the Cozmo robot drives along a track drawn in augmented reality. The iPhone and iPad are registered to the same augmented reality space; when you draw on one, you see it on the other.
I wanted to experiment with creating a shared reality, drawing on my experience building Dance with flARmingos, working on the Holographic Transmission, and building other shared experiences.
While working on Dance with flARmingos, it struck me how much more immersive and engaging augmented reality is when you share the same AR with others. It's a way to create a shared reality.
One of the key lessons from testing Holographic Transmission was how much more effective the instructor was when both student and instructor were looking at the same augmented reality.
Creating shared realities with augmented reality helps break through the isolation and egocentric nature of so many digital experiences. Niantic's success with Ingress and Pokémon Go illustrates this: much of the appeal of those games is that others share the same "augmentation" you do.
With ARKit and ARCore it is now easier to calibrate multiple devices to the same space. It's not seamless yet, but it's only a matter of time before advances in machine learning and computer vision make it easier.
Ben Purdy and I collaborated to build this experiment. It is built on the same Unity framework we created for Dance with flARmingos. Devices that want to share a space register to the same physical location; in this case, you place a virtual Cozmo model over the physical one. The devices use Unity's networking features to connect to a server that manages and coordinates the shared environment, and the server remote-controls the Cozmo through the Cozmo SDK.
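The registration step can be sketched with a little coordinate math. This is a minimal, hypothetical illustration (not the project's actual code): each device records where it placed the virtual Cozmo in its own AR frame, and that anchor pose defines a transform between the device's local frame and the shared frame. A point drawn on one device is mapped into the shared frame, sent over the network, and mapped back into the other device's frame. Simplified here to 2D (position plus yaw):

```python
import math

def local_to_shared(point, anchor_pos, anchor_yaw):
    """Map a 2D point from a device's local AR frame into the shared
    frame defined by the anchor (the virtual Cozmo placed over the
    physical robot). anchor_pos and anchor_yaw are the anchor's pose
    as seen by that device. (Hypothetical helper for illustration.)"""
    dx = point[0] - anchor_pos[0]
    dy = point[1] - anchor_pos[1]
    c, s = math.cos(-anchor_yaw), math.sin(-anchor_yaw)
    return (c * dx - s * dy, s * dx + c * dy)

def shared_to_local(point, anchor_pos, anchor_yaw):
    """Inverse mapping: place a shared-frame point into a device's
    local AR frame using that device's anchor pose."""
    c, s = math.cos(anchor_yaw), math.sin(anchor_yaw)
    x = c * point[0] - s * point[1] + anchor_pos[0]
    y = s * point[0] + c * point[1] + anchor_pos[1]
    return (x, y)

# A track point drawn on the iPhone, expressed in its local frame:
iphone_anchor = ((1.0, 2.0), 0.5)   # where the iPhone sees the Cozmo
ipad_anchor = ((3.0, -1.0), -1.0)   # where the iPad sees the Cozmo

drawn_point = (1.5, 2.5)
shared = local_to_shared(drawn_point, *iphone_anchor)
on_ipad = shared_to_local(shared, *ipad_anchor)
```

In the full system the same idea applies in 3D with full rotations, and the server broadcasts the shared-frame coordinates to every connected device and to the robot.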