By: Tory Smith
There’s an awful lot going on in this five-second video clip from yesterday’s test drive of the Vision SDK: we’re enhancing the depth and feel of our augmented reality navigation. Whether you are building for a smartphone, a connected dashcam, or an automotive head-up display, lining up projected 3D features while weaving through traffic is a tough challenge. Getting AR navigation to feel like reality requires seamlessly fusing our map data from the cloud with the live sensor data on the client: GPS, the inertial measurement unit (IMU), and the camera.
In designing a novel navigation experience, customization is crucial. The Vision SDK invites developers to get creative with turn-by-turn directions in 3D. Opacity, thickness, and color of the AR driving line are all adjustable today. In this example, we’re also switching from a solid line to a tread-mark pattern. By estimating the vehicle’s velocity from the device’s GPS and IMU, we can animate the chevrons of the tread mark so that they stay in place as we drive over them.
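For a concrete picture of that animation, here is a minimal sketch in Kotlin of one way it could work; this is not the Vision SDK API, and all names are hypothetical. The idea is to accumulate the distance driven from the fused speed estimate and scroll the tread-mark texture by the same amount, wrapped to one chevron period.

```kotlin
// Minimal sketch (not the Vision SDK API; all names are hypothetical) of one way
// to keep tread-mark chevrons pinned to the road: scroll the pattern opposite to
// the vehicle's motion using a fused GPS/IMU speed estimate.
class TreadMarkAnimator(private val chevronSpacingMeters: Float) {
    private var distanceTraveledMeters = 0f

    /** Advance by one frame; returns the texture offset as a fraction of one chevron. */
    fun update(speedMetersPerSec: Float, dtSeconds: Float): Float {
        distanceTraveledMeters += speedMetersPerSec * dtSeconds
        // Wrapping the distance driven to one chevron period makes each chevron
        // appear to stay in place on the road as the car passes over it.
        return (distanceTraveledMeters % chevronSpacingMeters) / chevronSpacingMeters
    }
}

fun main() {
    val animator = TreadMarkAnimator(chevronSpacingMeters = 2.0f)
    // ~54 km/h at 60 frames per second.
    repeat(3) { println(animator.update(15f, 1f / 60f)) }
}
```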
To create 3D features that look as realistic as possible, we need to draw on the camera frame with a perspective that closely matches the horizon and vanishing point of the roadway as viewed from your car. No one wants to go through a calibration procedure every time they bring up their navigation screen, so the Vision SDK uses scene segmentation to calibrate automatically. We use segmentation of key features in the environment, such as curbs and lane markings, to find the horizon and vanishing point. This allows us to project features precisely without relying on any feedback from the user.
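To give a rough sense of the geometry (illustrative Kotlin only, not Vision SDK code): if the segmented lane markings are fit as straight lines in image space, their intersection approximates the vanishing point, and its vertical offset from the principal point gives the camera’s pitch relative to the road.

```kotlin
import kotlin.math.atan

// Illustrative geometry only, not the Vision SDK implementation. Lane markings
// segmented out of the camera frame are fit as image-space lines y = m * x + b;
// their intersection approximates the vanishing point of the roadway.
data class ImageLine(val slope: Double, val intercept: Double)

/** Intersect two lane-boundary lines; returns null if they are parallel. */
fun vanishingPoint(left: ImageLine, right: ImageLine): Pair<Double, Double>? {
    if (left.slope == right.slope) return null
    val x = (right.intercept - left.intercept) / (left.slope - right.slope)
    return x to (left.slope * x + left.intercept)
}

/** Approximate camera pitch (radians) from the vanishing point's height in the image. */
fun pitchFromVanishingPoint(vpY: Double, principalPointY: Double, focalLengthPx: Double): Double =
    atan((principalPointY - vpY) / focalLengthPx)

fun main() {
    val vp = vanishingPoint(ImageLine(-0.6, 900.0), ImageLine(0.5, 80.0))
    println(vp) // image coordinates of the estimated vanishing point
    vp?.let { println(pitchFromVanishingPoint(it.second, principalPointY = 540.0, focalLengthPx = 1000.0)) }
}
```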
We also use segmentation to intelligently alter the appearance of the AR line. In this example, you can see two ways we’re dynamically occluding portions of the tread marks. First, the interior dashboard and the hood of the car have been masked out. Second, we’ve cut off the end of the path where it meets the car in front of us. Look closely and you’ll see the shadows in the scene fall across the tread marks as well; AR features that dynamically interact with the real world enhance the feel of driving over an illuminated carpet.
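Here is one simplified way to picture the occlusion step, sketched under assumed pixel formats and class labels rather than the SDK’s actual rendering pipeline: the AR overlay is composited only where the segmentation labels a pixel as road surface, so the hood, dashboard, and the car ahead naturally hide it.

```kotlin
// Conceptual sketch, not the Vision SDK pipeline: per-pixel masking of the AR
// overlay with a semantic segmentation map. The pixel classes and packed-ARGB
// representation are assumptions for illustration.
enum class SegClass { ROAD, VEHICLE, HOOD, OTHER }

fun compositeFrame(
    cameraFrame: IntArray,        // packed ARGB camera pixels
    pathOverlay: IntArray,        // packed ARGB pixels of the rendered AR line
    segmentation: Array<SegClass> // per-pixel class from the segmentation model
): IntArray = IntArray(cameraFrame.size) { i ->
    val overlayAlpha = pathOverlay[i] ushr 24
    // Keep the AR line only on drivable road surface; everywhere else the camera
    // pixel wins, so the hood and lead vehicles occlude the path.
    if (segmentation[i] == SegClass.ROAD && overlayAlpha != 0) pathOverlay[i] else cameraFrame[i]
}
```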
Finally, we’ve improved the fit of the AR path to the road geometry using lane detection: knowing where the left and right lane boundaries are lets us center the AR line more precisely as we drive. This is particularly helpful where the GPS signal is poor, such as in tunnels and urban canyons.
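One simple way to model that correction (hypothetical names, not the SDK API) is to blend the lateral position implied by the map-matched GPS route toward the midpoint of the detected lane boundaries, weighted by the lane detector’s confidence:

```kotlin
// Hedged sketch, not the Vision SDK API. All quantities are lateral positions in
// the vehicle frame, in meters (positive to the left of the camera).
fun centeredLateralOffset(
    routeOffsetFromGps: Double,  // path centerline implied by map data + GPS/IMU
    leftBoundary: Double,        // detected left lane boundary
    rightBoundary: Double,       // detected right lane boundary
    laneConfidence: Double       // lane detector confidence in [0, 1]
): Double {
    val laneCenter = (leftBoundary + rightBoundary) / 2.0
    // Blend toward the detected lane center in proportion to confidence, so a weak
    // GPS fix (tunnel, urban canyon) is corrected without discarding the route.
    return routeOffsetFromGps * (1.0 - laneConfidence) + laneCenter * laneConfidence
}

fun main() {
    // Example: GPS drift puts the path 0.9 m left of center, but the detected lane
    // boundaries (1.8 m to the left, 1.6 m to the right) put the true center near 0.1 m.
    println(centeredLateralOffset(0.9, 1.8, -1.6, 0.85))
}
```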
Ready to put your imagination behind the wheel? Download the Vision SDK and start building today.