Reimagined driving experiences and real-time mapping with Vision
By: Tory Smith
Sign up for Uber Visualization Night in San Francisco: this Wednesday, March 27th, we’re presenting on the Vision SDK. I’ll be walking through our Vision pipeline, from training the neural networks, to running them on edge devices, to powering live map updates.
We’ll talk about how working with live data is crucial for keeping an increasingly fast-moving world connected and oriented, from our roots working with GPS probe data to today, and how distributed cameras add new dimensions to the living map. The combination of new technologies in AI and AR enables exciting possibilities for enriching the navigation experience and for improving mapping infrastructure for all.
The Vision SDK classifies road boundaries, lane markings, curbs, crosswalks, traffic signs, and more. Real-time interpretation of the road helps drivers in the moment, and the same detections feed live map updates so the map always reflects current conditions. Looking forward to seeing you on Wednesday night!
Our public beta of the Vision SDK is coming soon! At launch, any developer with a free mapbox.com account will be able to start building Vision applications that can run on billions of connected devices. Visit www.mapbox.com/vision to sign up and get notified on launch day!
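To make the classification-to-map-update flow a bit more concrete, here is a minimal sketch of how an application might consume per-frame detections and batch the high-confidence ones as live map-update candidates. Everything below (RoadFeature, VisionListener, MapUpdateQueue, the confidence threshold) is a hypothetical illustration, not the actual Vision SDK API.

```kotlin
// Hypothetical sketch: consuming per-frame road classifications from a
// Vision-style pipeline and queuing them as live map-update candidates.
// These types are illustrative stand-ins, not the real Vision SDK surface.

enum class FeatureKind { ROAD_BOUNDARY, LANE_MARKING, CURB, CROSSWALK, TRAFFIC_SIGN }

data class RoadFeature(
    val kind: FeatureKind,
    val confidence: Double,   // classifier confidence in [0, 1]
    val latitude: Double,     // geo-position estimated from GPS + camera pose
    val longitude: Double
)

// Callback an SDK of this shape would invoke once per processed camera frame.
interface VisionListener {
    fun onFeaturesDetected(features: List<RoadFeature>)
}

// Buffers high-confidence detections so they can later be uploaded and
// reconciled against the basemap (the "live map update" side of the pipeline).
class MapUpdateQueue(private val minConfidence: Double = 0.8) : VisionListener {
    private val pending = mutableListOf<RoadFeature>()

    override fun onFeaturesDetected(features: List<RoadFeature>) {
        // Keep only detections the on-device model is confident about;
        // lower-confidence ones can still drive in-car display, just not map edits.
        pending += features.filter { it.confidence >= minConfidence }
    }

    // Hand off the current batch for upload and reset the buffer.
    fun drain(): List<RoadFeature> {
        val batch = pending.toList()
        pending.clear()
        return batch
    }
}

fun main() {
    val queue = MapUpdateQueue()

    // Simulate one frame's worth of detections coming off the edge device.
    queue.onFeaturesDetected(
        listOf(
            RoadFeature(FeatureKind.TRAFFIC_SIGN, 0.94, 37.7749, -122.4194),
            RoadFeature(FeatureKind.LANE_MARKING, 0.55, 37.7749, -122.4195)
        )
    )

    // Only the high-confidence sign survives into the map-update batch.
    println(queue.drain())
}
```

In a real integration the drained batch would presumably be tied to the device’s GPS trace and uploaded for reconciliation with the map; the confidence threshold here just stands in for that kind of quality gate.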