
Using maps and location services with ARKit


By: Jesse Bounds

Since ARKit’s release, it’s been exciting to see what developers can build with our maps, data layers, and location APIs. To help more people get started, I’ve created a small native iOS demo application and a tutorial to go with it. Take a look and see what you can build.

Setting up

If you have an iOS 11 beta running on an iPhone 6s or iPad Pro (or newer) device, download the demo application. In the project root directory, set your Mapbox Access Token in the mapbox_access_token file, open MapboxDirectionsARKitDemo.xcworkspace with Xcode 9 Beta 5, and set your Apple developer signing credentials.

We will be working with the demo app I used to make this “PAC-MAN”-style navigation demo at night.

We’ll walk through this demo to understand how to:

- Add a Mapbox map for simple scene calibration
- Calibrate the ARKit scene to true north
- Make a call to the Mapbox Directions API
- Process the directions results
- Display the directions results in AR

That’s a lot of layers, so let’s break it down one by one!

1 | Adding a map for simple scene calibration

Expanding on the iOS Maps SDK example for adding a simple map, I add an MGLMapView in Interface Builder to the bottom of the demo application. In code, I set a nice-looking custom style in configureMapboxMapView() and declare how I’d like Directions API results to be visualized in mapView:didFinishLoadingStyle().

Since I also configure the map to followWithHeading, when I start the app it automatically homes in on my current location, as best it can, using iOS location services. Often my current location isn’t determined accurately enough for an AR demo that relies on sub-meter precision, so I use the crosshairs in the map view to tell the demo app exactly where I’m located. In this example, location services placed me in the correct building in downtown San Francisco, and I use the crosshairs to pin down my exact location.
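In rough outline, the map setup looks something like this. This is a minimal sketch rather than the demo’s exact code; the style URL, layer identifiers, and the in-code map view creation are placeholders standing in for what the demo does via Interface Builder.

import UIKit
import Mapbox

class ViewController: UIViewController, MGLMapViewDelegate {

    var mapView: MGLMapView!

    override func viewDidLoad() {
        super.viewDidLoad()
        configureMapboxMapView()
    }

    // Configure the map view with a custom style and heading-based tracking.
    // (The demo adds the map view in Interface Builder; it is created in code
    // here only to keep the sketch self-contained.)
    func configureMapboxMapView() {
        mapView = MGLMapView(frame: view.bounds,
                             styleURL: URL(string: "mapbox://styles/mapbox/dark-v9"))
        mapView.delegate = self
        mapView.userTrackingMode = .followWithHeading // center on and rotate with the user
        view.addSubview(mapView)
    }

    // Once the style has loaded, add a source and layer that will visualize
    // the Directions API response.
    func mapView(_ mapView: MGLMapView, didFinishLoading style: MGLStyle) {
        let routeSource = MGLShapeSource(identifier: "route", shape: nil, options: nil)
        style.addSource(routeSource)
        style.addLayer(MGLLineStyleLayer(identifier: "route-line", source: routeSource))
    }
}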

2 | Calibrating the scene to true north

As Apple points out in their documentation, “world tracking is an inexact science.” This is also true for world calibration when an ARKit session starts. To begin, we need to determine where true north is, which is tricky on a mobile device. The ARKit+CoreLocation project is home to some great early work in this area. Check out the comments there and experiment with calibrating the ARKit session correctly when you use the app.

For this demo, I’ve simply added a configuration setting at the top of ViewController.swift:
var automaticallyFindTrueNorth = true
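Conceptually, a flag like this just decides how the ARKit world is aligned when the session starts. Here is a minimal sketch, assuming the standard ARKit worldAlignment options; it is illustrative rather than the demo’s calibration code.

import ARKit

// .gravityAndHeading asks ARKit to align the scene's -Z axis with true north
// using the device compass; .gravity leaves heading alignment to manual
// calibration (the ARKit+CoreLocation style of approach).
func runARSession(on sceneView: ARSCNView, automaticallyFindTrueNorth: Bool) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.worldAlignment = automaticallyFindTrueNorth ? .gravityAndHeading : .gravity
    sceneView.session.run(configuration)
}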

3 | Making a call to the Mapbox Directions API

Next, I wire up a long press gesture recognizer to capture a location on the map when a user presses and holds. The location of this gesture is translated into a geographic coordinate that is used, along with the map view’s current center location (ideally aligned perfectly with the crosshairs and your current location), as input to the Mapbox Directions API request in ViewController.queryDirections().
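In outline, that flow looks roughly like the sketch below, written as a method on the view controller. It uses the MapboxDirections Swift API of the time, so exact signatures may differ slightly by version, and visualize(route:) is a hypothetical hand-off to the plotting code rather than a method from the demo.

import UIKit
import Mapbox
import MapboxDirections

// Long press on the map -> destination coordinate -> walking directions
// from the map's center (the crosshairs) to that destination.
@objc func didLongPress(_ gesture: UILongPressGestureRecognizer) {
    guard gesture.state == .began else { return }

    let point = gesture.location(in: mapView)
    let destination = mapView.convert(point, toCoordinateFrom: mapView)

    let options = RouteOptions(waypoints: [
        Waypoint(coordinate: mapView.centerCoordinate),
        Waypoint(coordinate: destination)
    ], profileIdentifier: .walking)
    options.includesSteps = true

    Directions.shared.calculate(options) { (waypoints, routes, error) in
        guard error == nil, let route = routes?.first else { return }
        // Hand the route off to be plotted on the map and in AR (hypothetical helper).
        self.visualize(route: route)
    }
}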

4 | Processing the directions results

Once I have the directions result, I pluck out the steps and plot them on the map view and as nodes in an ARSCNView. However, directions steps can sometimes be relatively few and far between, even for pedestrian directions. To help with the visualization, I can add a line or a series of points between the steps.

In this app, I use Turf-swift’s Turf.coordinate(at:fromStartOf:) method to calculate points every 5 meters along the route. When plotted in the map view and in AR along with the key steps, the result is an easier-to-follow experience for the user of the demo app. Just like PAC-MAN pellets!
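A sketch of that densification step is below. It assumes the Route object exposes its coordinates and total distance (as MapboxDirections routes do) and that the Turf method returns an optional coordinate; the function name and 5-meter default are just for illustration.

import CoreLocation
import MapboxDirections
import Turf

// Walk along the route geometry and emit a coordinate every `spacing` meters,
// so the AR scene gets a steady trail of "pellets" between maneuver points.
func pelletCoordinates(along route: Route, spacing: CLLocationDistance = 5) -> [CLLocationCoordinate2D] {
    guard let coordinates = route.coordinates else { return [] }

    var pellets: [CLLocationCoordinate2D] = []
    var distance: CLLocationDistance = 0
    while distance < route.distance {
        if let coordinate = Turf.coordinate(at: distance, fromStartOf: coordinates) {
            pellets.append(coordinate)
        }
        distance += spacing
    }
    return pellets
}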

5 | Displaying the directions results in AR

The demo app hints at a new API we’ll have out soon that will combine map view-style annotation management with AR on iOS, Unity, and other platforms.

For now, MapboxARAnnotationManager.addARAnnotation(startLocation:endLocation:calloutString:) is a starting point that allows individual ARAnchor objects to be placed at geographic points, using the current location as a reference point. This can be combined with SceneKit for essentially limitless customization of how those anchors are visualized. MapboxARAnnotationManager contains a small collection of utilities that perform a set of matrix transforms that work well for city-scale demos.
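For example, an ARSCNViewDelegate can hand SceneKit a custom node for each anchor. This is not the demo’s rendering code, just a minimal sketch of where that customization hooks in.

import UIKit
import ARKit
import SceneKit

// ARSCNViewDelegate: provide SceneKit geometry for each directions anchor.
// Here, every anchor becomes a small glowing yellow sphere, PAC-MAN pellet style.
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    let pellet = SCNSphere(radius: 0.2)
    pellet.firstMaterial?.diffuse.contents = UIColor.yellow
    pellet.firstMaterial?.emission.contents = UIColor.yellow
    return SCNNode(geometry: pellet)
}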

We are just getting started!

I’m looking forward to iterating on these ideas and helping you build amazing experiences that are integrated into our world.

We will be publishing new libraries to help you combine relevant Mapbox services like maps and directions within AR applications. In particular, datasets like terrain meshes, building footprints, and heights will unlock exciting possibilities in AR.

Imagine objects that are aware of the terrain and physical obstructions around them.

We’re looking forward to working with the community to help solve orientation and tracking problems in world space, and to including those solutions in our ARKit related libraries.

In the meantime, follow along for more tutorials and examples. Building something yourself? Share it with us on Twitter using the hashtag #BuiltWithMapbox.

Jesse Bounds


Using maps and location services with ARKit was originally published in Points of interest on Medium.

