I recently had a friend who is a professional artist ask me to explain the tech behind the PokemonGo phenomenon and how he can use it in his work. I'm pasting a bit of my response to him for those who are curious. Augmented Reality (AR) is a fascinating technology, one that's actually been around for a while – but PokemonGo has brought it into the mainstream.
What is augmented reality?
A way to superimpose something created over the “real” world. It adds to reality, rather than replaces it. Some call it a mashup – a combination of two or more functional elements + data to create something new and different.
It is usually visual, but can be aural or even tactile. Currently phones are used as the interface (you look at the world through the phone screen to see the overlay), but in the near future eyeglasses, car windshields, etc. will be able to do it as well, so the interaction isn't so artificial (or hazardous).
The difference between augmented reality and virtual reality is that AR continues to let you interact with the real world, while VR immerses you in a totally different one.
Here’s a good video that explains it (from 2007!)
How does it work?
There are different ways it can be triggered:
- Using your phone's sensors: this is what Pokemon Go uses; it "places" objects at geo coordinates and people "find" them. The sensors it can use include digital compasses, accelerometers, GPS, radar, etc. to sense objects in the real world and draw the virtual data in relation to them. These are most often used in vehicles and mobile phones.
- Using a predetermined visual trigger: this is where the phone recognizes the trigger and superimposes the virtual content over it. Sometimes it uses bar codes printed on objects. To do this the phone needs to analyze the video feed from a camera. It then uses the marker as a reference point onto which to draw the virtual data over the real objects. A good example is this:
- Markerless: this is a more complex variant of the simpler bar code system, where the computer is programmed to detect markers that are not as specific. This could potentially be any feature in the real world, as long as it is visually identifiable enough (street signs are one example). Detecting these features is harder, but the result can be more robust.
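To make the sensor-based approach concrete, here's a rough Python sketch of the core math (all the numbers and names are mine, not from any real app): given the user's GPS position, the compass heading, and the geo coordinates an object was "placed" at, figure out whether the object is in the camera's field of view and where on the screen to draw it.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees (0 = north)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

def screen_x(user_lat, user_lon, heading, obj_lat, obj_lon,
             fov_deg=60, screen_width=1080):
    """Horizontal pixel position of a geo-anchored object on the phone screen,
    or None if it falls outside the camera's field of view."""
    # Relative bearing, folded into the range -180..180
    rel = (bearing_deg(user_lat, user_lon, obj_lat, obj_lon) - heading + 180) % 360 - 180
    if abs(rel) > fov_deg / 2:
        return None  # behind you or off to the side
    return round((rel / fov_deg + 0.5) * screen_width)
```

This relative-bearing trick is essentially what lets an app like Pokemon Go decide whether a "placed" creature should appear on your screen as you turn around.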
The content that you're overlaying doesn't necessarily have to be 3D content: it could trigger a video, for example. This was used in an ad campaign for the Stella Artois Beerfinder App (see examples below).
How to create it?
You need to create an app. The predominant tool is a game engine called Unity 3D (where you script in C#), plus an AR framework to run on it – which seems to predominantly be Qualcomm's Vuforia (the alternative, with different strengths, is Layar, although it's not mentioned much). The third competitor – Metaio – was recently bought by Apple and doesn't seem to be publicly available anymore.
Some good AR examples:
Here are six different examples of different AR experiences.
1. Fixed Experience: Desktop Computer – DeBeers
This AR application helps people shop by visualizing an item "on" at home, without having to go to a store.
This example works by printing out a sheet of paper with a graphic on it. This graphic is called a "marker". The shopper then holds up the paper so the computer's camera can see it.
Her browser reads the image from the camera, identifies the marker, and checks a database for the correct animation. The browser then overlays the animation on top of the graphic, and she can move it around and spin it like a 3D object by moving the paper or using the keyboard.
De Beers used this in their "forevermark" campaign (link here).
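Once the marker is spotted, the browser's job is pretty mundane: look up which animation belongs to that marker and work out where and how big to draw it. A minimal Python sketch – the marker IDs, asset names, and corner format here are all hypothetical, and real systems use a computer-vision library to find the corners in the video feed:

```python
# Hypothetical database mapping marker IDs to 3D assets
ANIMATIONS = {7: "forevermark_ring.glb"}

def overlay_for_marker(marker_id, corners):
    """corners: four (x, y) pixel positions of the detected marker.
    Returns which asset to draw, where, and at what scale."""
    asset = ANIMATIONS.get(marker_id)
    if asset is None:
        return None  # unknown marker, nothing to draw
    xs = [c[0] for c in corners]
    ys = [c[1] for c in corners]
    center = (sum(xs) / 4, sum(ys) / 4)   # draw the animation here
    scale = max(xs) - min(xs)             # marker looks bigger when closer
    return {"asset": asset, "center": center, "scale": scale}
```

Because the overlay is recomputed every frame from the marker's corners, tilting or moving the paper moves the virtual object with it.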
2. Mobile Experience: Stella Artois
This AR application helps people find nearby restaurants and bars that serve the product they're looking for.
This example works by downloading the Stella Artois Beerfinder App to a smartphone. As someone holds up the phone, it accesses the phone's camera and GPS/maps functions, as well as mobile data. As he turns around, he can see the streets in front of him through the phone screen.
The app connects to a database, shares the user's location (GPS coordinates and the direction the camera is facing), and gets back location data in response (addresses/storefront info for places selling the beer). As he looks through the screen, the app overlays an icon on restaurants and bars that sell Stella Artois Beer.
It's a lot faster than googling "who serves Stella Artois near me" and hoping to find an answer.
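The "which bars are near me" part is ordinary geo math: measure the distance from the phone to each venue and keep the close ones. A small sketch (the venue list is made up – a real app would query Stella's server for it):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS points, in metres."""
    R = 6371000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

# Hypothetical venue database: (name, lat, lon)
VENUES = [
    ("Cafe Bruxelles", 40.7130, -74.0062),
    ("Far Away Bar",   41.0000, -75.0000),
]

def venues_in_range(lat, lon, radius_m=500):
    """Names of venues within radius_m metres of the user."""
    return [name for name, vlat, vlon in VENUES
            if haversine_m(lat, lon, vlat, vlon) <= radius_m]
```

Combine this with the bearing math from the sensor-trigger example above and you know both which venues to show and where on the screen to pin their icons.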
3. Mobile Experience: Pokemon GO
PokemonGo is currently all the rage. It's not the first AR game, but it is the most successful to date. It's the ultimate scavenger hunt! – and it's inadvertently getting people into shape lol.
This is how it works.
A woman downloads the Pokemon GO app and creates an account. The game accesses her camera, GPS, and mobile data; as she plays the game, she "finds" Pokemon tracks superimposed on real-world locations – and follows them.
To catch a Pokemon she has to hold up her phone so the camera can show where the Pokemon is; now the app is capturing camera data as well as GPS and the direction the camera is pointing in. It "shows" a Pokemon character standing on a park path or on a bench in a museum. She then captures the Pokemon by "throwing" a ball at it on the screen and earns points, which she shares with her team (other players aligned with the same team).
4. Mobile Experience: Zombie Run (Audio Only)
This AR experience uses only audio to superimpose a game onto the real world.
The Zombie Run app connects to mobile data, then uses the phone's GPS and accelerometer data to play sounds to the person playing. When he turns on the game – which is audio-only – and starts running, he is given both directions (thanks to GPS) and running-speed feedback. The directions he hears push him to "find" caches of supplies.
At set intervals the game plays sounds announcing that the player is being chased by zombies, which he can outrun if he speeds up.
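The chase mechanic presumably boils down to comparing the player's GPS-derived speed against the zombies'. A toy sketch – the speeds, thresholds, and sound-file names are all invented:

```python
# Toy version of an audio-only chase: pick a sound cue based on
# how the player's speed compares to the (invented) zombie speed.
def chase_update(player_speed_mps, zombie_speed_mps=3.0):
    """Decide which audio cue to play during a zombie chase."""
    if player_speed_mps > zombie_speed_mps:
        return "groans_fading.ogg"    # pulling away from the horde
    elif player_speed_mps > zombie_speed_mps * 0.8:
        return "groans_close.ogg"     # barely keeping ahead
    else:
        return "caught.ogg"           # whatever grisly thing that is
```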
I don't actually know what happens when they catch you, but I suspect it sounds grisly.
5. Mobile Experience: Glassholes
AR that helps recognize and identify people, overlaying information about them on the "real" world with the help of a Heads Up Display (HUD) (a la Google Glass, or other tech-enabled eyeglasses).
Google Glass has a built-in camera, mobile data connectivity, and a transparent display in the field of view.
As a person walks through a crowd or a party, the Glass captures pictures of people's faces, processes them instantly with facial recognition, compares them against data sources such as LinkedIn, and then overlays data for the wearer: the person's name, the company they work for, and other useful stats pulled from various social media and other sources.
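The matching step in systems like this is typically a nearest-neighbour search over face "embeddings" (number vectors a recognition model produces for each face). A toy Python sketch – the embeddings, names, and threshold here are entirely made up, and real embeddings have hundreds of dimensions:

```python
import math

# Hypothetical database of face embeddings -> profile info.
# A real system would get embeddings from a face-recognition model.
KNOWN = {
    "Ada Lovelace": ([0.9, 0.1, 0.4], "Analytical Engines Inc."),
    "Alan Turing":  ([0.1, 0.8, 0.5], "Bletchley Park"),
}

def cosine(a, b):
    """Cosine similarity between two vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def identify(embedding, threshold=0.95):
    """Return (name, company) of the best match, or None if nothing is close."""
    best_name, best_score = None, 0.0
    for name, (vec, _) in KNOWN.items():
        score = cosine(embedding, vec)
        if score > best_score:
            best_name, best_score = name, score
    if best_score < threshold:
        return None  # stranger: don't overlay anything
    return (best_name, KNOWN[best_name][1])
```

The threshold matters: set it too low and the HUD confidently mislabels strangers, too high and it recognizes nobody.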
6. Mobile Experience: Word Lens (now integrated with Google Translate)
This is a very simple AR application, but a great one: triggered by signage, it overlays a translation in your preferred language over the sign through your phone screen (although I'm waiting for it to be integrated into your car windshield, so when driving in a foreign country you will "see" the signs in your preferred language).
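Conceptually it's a three-step pipeline: OCR finds text boxes in the camera frame, a translator swaps the words, and the app paints the translations back into the same boxes. A toy sketch of the middle step, assuming the OCR has already run (the tiny dictionary stands in for a real translation model):

```python
# Minimal stand-in for a translation service: Spanish -> English
DICT_ES_EN = {"salida": "exit", "alto": "stop", "peligro": "danger"}

def overlay_translations(detections, lookup=DICT_ES_EN):
    """detections: list of (text, (x, y, w, h)) boxes from the camera OCR.
    Returns the translated text to draw back into each box."""
    out = []
    for text, box in detections:
        translated = lookup.get(text.lower(), text)  # leave unknown words as-is
        out.append((translated, box))
    return out
```

Keeping the same bounding box for the translated text is what makes the effect convincing: the new words appear exactly where the old ones were.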
So, that's a basic introduction to Augmented Reality. Any other great examples I've missed? Tell me in the comments!