
Just In: Vision SDK comes to Android


By: Tory Smith

The Mapbox Vision SDK is now available on both major mobile platforms.

Last month, we launched the beta of the Vision SDK, which can turn any connected camera into a second set of eyes for your car. By running our neural networks live, directly on the device, we are redefining how machines and humans alike interact with the driving environment.

With our initial iOS release, we gave mobile developers the tools to bring visual context into the navigation experience. Today, we are excited to announce beta availability of the Vision SDK for Android. With the Vision SDK, classification, semantic segmentation, object detection, and augmented reality can now run on the billions of devices in users’ pockets.
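To give a feel for the integration, here is a minimal Kotlin sketch of the create/start/stop lifecycle an Android app would drive. The VisionManager calls mirror the beta documentation, but treat the exact names, signatures, and package paths as assumptions and check the current reference before building against them.

```kotlin
import androidx.appcompat.app.AppCompatActivity
// Exact package path may differ between beta releases.
import com.mapbox.vision.VisionManager

class DrivingActivity : AppCompatActivity() {

    override fun onStart() {
        super.onStart()
        VisionManager.create()  // set up the camera and the on-device ML pipeline
        VisionManager.start()   // begin running the neural networks on live frames
    }

    override fun onStop() {
        VisionManager.stop()    // halt inference and release the camera
        VisionManager.destroy()
        super.onStop()
    }
}
```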

Lane detection determines our home lane (in blue), total lanes, and direction of travel

In addition to Android support, we have added some exciting new features. For example, our teaser applications for iOS and Android now support native AR navigation, so developers can experience it out of the box. We have also added lane detection: the Vision SDK determines the number of lanes of traffic as well as your home lane while driving. This allows developers to provide more context-aware turn-by-turn directions that account for when lane changes are necessary.
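As a sketch of how an app might consume that output, the snippet below models a road description carrying the lane count and home lane index. The types here are hypothetical stand-ins for whatever the beta ultimately exposes; the point is the idea of reacting to lane updates to time a lane-change prompt.

```kotlin
// Hypothetical stand-ins for the lane detection output; the shipped
// SDK types will differ in name and detail.
data class Lane(val isForwardDirection: Boolean)
data class RoadDescription(val lanes: List<Lane>, val currentLaneIndex: Int)

// React to a refreshed road description, e.g. to prompt a lane change
// ahead of an upcoming maneuver.
fun onRoadDescriptionUpdated(road: RoadDescription, targetLaneIndex: Int) {
    val totalLanes = road.lanes.size      // lanes detected across the roadway
    val homeLane = road.currentLaneIndex  // the lane the vehicle occupies
    if (homeLane != targetLaneIndex) {
        println("Change from lane $homeLane to lane $targetLaneIndex ($totalLanes lanes detected)")
    }
}
```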

The Mapbox Vision SDK is ready for developers. Access the beta.

Travelers, daily commuters, and even professional drivers face the challenge of navigating complex environments when they’re behind the wheel. The Vision SDK offers developers the ability to marry augmented reality with a semantic understanding of the road scene, adding crucial context to a heads-up navigation experience.

Developers can also use the Vision SDK to create novel navigation features that improve the driving experience, including lane-level navigation, speed limit notifications, illumination of passenger pick-up and drop-off locations, and alerts of traffic incidents.
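A speed limit notification, for instance, falls out of sign classification plus the vehicle’s own speed. The sketch below uses hypothetical types to show the comparison; the real SDK’s classification output will look different.

```kotlin
// Hypothetical classification result for a recognized road sign.
enum class SignType { SPEED_LIMIT, STOP, YIELD }
data class Sign(val type: SignType, val value: Float)

// Alert when the vehicle exceeds the most recently classified limit.
fun checkSpeedLimit(currentSpeedKmh: Float, lastSign: Sign?) {
    if (lastSign?.type == SignType.SPEED_LIMIT && currentSpeedKmh > lastSign.value) {
        println("Speeding: posted limit is ${lastSign.value.toInt()} km/h")
    }
}
```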

Real-time neural networks for object detection (left) and semantic segmentation (right)

Today, the smartphone is the most ubiquitous platform in the world for communication, computation, and sensing. With the addition of Android support, a plethora of new devices, including tablets, dash cams, and other embedded systems, can join the Vision ecosystem. The novelty of the Vision SDK is not just that we are applying machine learning to navigation and mapping. Rather, by releasing SDKs that let developers run neural networks on commodity hardware, we are democratizing tools for interpreting the driving environment in real time. We are thrilled to see what developers will create with these new possibilities.

Open the future with the Mapbox Vision SDK. Get access and start building.


Just In: Vision SDK comes to Android was originally published in Points of interest on Medium.

