By: Morgane Santos
This is part of a series about designing for Augmented Reality. If you want to brush up on AR terms and concepts, we recommend starting here.
Imagine walking through a city and getting POI content or directions in real time, oriented to your field of view (like the Mapbox Vision SDK), or hiking to a viewpoint and learning more about what’s in front of you — which mountain is in the distance, or where your house is from where you’re standing. This is world-scale AR, one of the most exciting applications of AR, in which your current environment is the backdrop to an experience based on your specific location.
Since Locate, I’ve heard lots of ideas from developers, but there aren’t many resources yet breaking down the basic building blocks of world-scale AR. Building a world-scale AR app currently requires a bit of math and some custom functions. We’ve put together an open source project for you to play with, but we also wanted to briefly walk through how each component works.
World-scale AR relies on a few things:
• a user’s current location (latitude, longitude, and potentially altitude as well)
• a user’s heading, or where they’re facing
• a set of POI coordinates
• a little bit of math to tie it all together
I mentioned math. If you’re a little rusty on vectors and matrices, don’t worry! The overall gist is that we take two locations (say, a user’s location and a destination) and use them to figure out the azimuth: how many degrees clockwise from north the user has to turn to face the destination.
Note that azimuth is always positive, so if something is directly west of me, the azimuth is 270º as opposed to -90º.
To calculate the azimuth in Swift, add this function to your project:
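The sample code from the original post isn’t reproduced here, but the calculation is the standard great-circle forward bearing. A minimal sketch, assuming the inputs are `CLLocationCoordinate2D` values (the function name and parameter labels are illustrative, not necessarily those used in the sample project):

```swift
import Foundation
import CoreLocation

// Computes the azimuth (forward bearing) from the user's location to a
// destination, in degrees clockwise from north, normalized to 0º–360º.
func azimuth(from user: CLLocationCoordinate2D,
             to destination: CLLocationCoordinate2D) -> Double {
    // Convert degrees to radians for the trig functions.
    let lat1 = user.latitude * .pi / 180
    let lon1 = user.longitude * .pi / 180
    let lat2 = destination.latitude * .pi / 180
    let lon2 = destination.longitude * .pi / 180

    let dLon = lon2 - lon1
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)

    // atan2 yields -180º…180º; shift into the always-positive 0º…360º range,
    // so a destination due west comes out as 270º rather than -90º.
    let bearing = atan2(y, x) * 180 / .pi
    return (bearing + 360).truncatingRemainder(dividingBy: 360)
}
```

For example, a destination directly west of the user returns 270º, matching the convention described above.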
Getting the azimuth is the first step in displaying the POIs. For example, we might want to create UI labels in the AR experience with the POI names. We use matrix transformations to make that happen. There’s some more math involved here, but basically, we take the azimuth and use it to change the position and rotation of our POI label accordingly. In this sample code, we position the label 10 meters in front of us (z = -10), then rotate it according to the azimuth.
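The rotate-then-translate step might look something like the following sketch, assuming an ARKit session using the gravity-and-heading world alignment (so -z points north and +x points east); the function name and parameters are hypothetical:

```swift
import SceneKit

// Places a POI label node 10 meters in front of the world origin, then
// rotates it around the up axis by the azimuth so it sits along the
// real-world bearing to the POI.
func positionLabel(_ labelNode: SCNNode, azimuthRadians: Float) {
    // Start 10 meters straight ahead: in ARKit, -z is "in front of" the camera.
    var translation = matrix_identity_float4x4
    translation.columns.3.z = -10

    // Azimuth is clockwise from north, so rotate by the negative angle
    // around the y (up) axis.
    let rotation = simd_float4x4(simd_quatf(angle: -azimuthRadians,
                                            axis: simd_float3(0, 1, 0)))

    // Apply rotation after translation so the label orbits the user.
    labelNode.simdTransform = simd_mul(rotation, translation)
}
```

With an azimuth of 0 the label ends up at (0, 0, -10), due north; with an azimuth of 90º it ends up at (10, 0, 0), due east.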
Note that some of these functions are custom extensions of the matrix_float4x4 type in Swift. To extend this type, add the extension to the bottom of your view controller file, after the view controller code.
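Such an extension could look like the sketch below; the method names are illustrative and not necessarily the ones used in the sample project:

```swift
import simd

extension matrix_float4x4 {
    // Returns a copy of the matrix translated by (x, y, z) in its own
    // coordinate space.
    func translated(x: Float, y: Float, z: Float) -> matrix_float4x4 {
        var result = self
        result.columns.3 = columns.0 * x + columns.1 * y + columns.2 * z + columns.3
        return result
    }

    // Returns a copy of the matrix rotated by `radians` around the
    // y (up) axis — handy for applying an azimuth.
    func rotatedAroundY(radians: Float) -> matrix_float4x4 {
        let rotation = simd_float4x4(simd_quatf(angle: radians,
                                                axis: simd_float3(0, 1, 0)))
        return simd_mul(self, rotation)
    }
}
```

Keeping these as extensions means any transform in the app, including the camera’s, can be shifted and rotated with the same two calls.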
If you’re ready to start playing around with this, check out our open source repo showcasing a world-scale experience that shows the locations of various national parks and forests in the United States. It includes transitions between 2D maps and AR, as well as guidance on how the math works, a 3D horizon, and 3D UI.
World-scale AR directions + 3D horizon for orientation. Now I know that Yellowstone is thattaway 🧭 Made with @Mapbox iOS + SceneKit SDKs #builtwithmapbox #ar https://t.co/dA8nigMrGC
World-scale AR opens up a world of possibilities. Whatever you end up building, we’d love to see it. Tweet about it with the hashtag #builtwithmapbox to show off your work!
Building a world-scale AR experience in iOS was originally published in Points of interest on Medium, where people are continuing the conversation by highlighting and responding to this story.