
Sprint 5G Maps Launch at Mobile World Congress


By: Marc Prioleau

Location data is now embedded directly into the 5G spectrum, with any device running on Sprint’s Curiosity IoT network getting direct access to live maps updated on the edge. Today at Mobile World Congress we’re announcing our partnership with Sprint. By deploying AI that can sense and understand changes on the edge, we can provide situational updates instantaneously to support mobility and smart city applications built on Sprint’s new Curiosity IoT Network.

We’re showing the millisecond performance of maps running on Sprint’s new Curiosity IoT Network with 5G at the Ericsson Experience in Hall 2 at the Fira Gran Via at MWC.

Our maps running on Sprint’s mobile edge computing in the Curiosity IoT Network Cores connect to the Vision SDK — launched with our partner ARM — to detect, categorize, and incorporate changes with super low latency. We deploy to Sprint’s edge locations using Atlas, securely delivering our full map stack and APIs from mapbox.com to run offline. Deployment at the network edge means our map APIs and the neural networks used for analyzing imagery from front-facing cameras run on Sprint’s endpoints, bringing live location data within 10 ms of any user on the network.
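
To make the deployment model concrete, here is a minimal sketch of what fetching map data from an edge-hosted Atlas deployment could look like; the hostname, tileset path, tile coordinates, and token are placeholders, not actual Sprint or Mapbox endpoints.

```python
# Minimal sketch: pointing a map client at an edge-hosted Atlas deployment.
# The hostname, tileset path, and token below are illustrative placeholders.
import time
import requests

EDGE_ATLAS_HOST = "https://atlas.edge.example-curiosity.net"  # hypothetical edge endpoint
ACCESS_TOKEN = "YOUR_ATLAS_TOKEN"                             # issued by the Atlas deployment

def fetch_tile(z: int, x: int, y: int) -> bytes:
    """Fetch a vector tile from the edge-hosted map API and report round-trip time."""
    url = f"{EDGE_ATLAS_HOST}/v4/mapbox.mapbox-streets-v8/{z}/{x}/{y}.mvt"
    start = time.perf_counter()
    resp = requests.get(url, params={"access_token": ACCESS_TOKEN}, timeout=1.0)
    resp.raise_for_status()
    latency_ms = (time.perf_counter() - start) * 1000
    print(f"tile {z}/{x}/{y}: {len(resp.content)} bytes in {latency_ms:.1f} ms")
    return resp.content

if __name__ == "__main__":
    fetch_tile(14, 8290, 6117)  # example tile roughly over Barcelona
```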

The sub-10 ms latency now possible with 5G is critical for the autonomous vehicles and robots currently being tested across US cities. Sprint’s mapping platform allows smart machines not only to know what is along their path but also to react to changes in that path on the spot. The traditional model of updating maps days or weeks after a change won’t work. Maps don’t just need to be faster; they need to be “live”, with changes detected by sensors built into the map in real time. That requires new thinking about maps as well as a new kind of network.

That rethinking involves repositioning our AI closer to the edge. The Vision SDK, now in private beta, runs neural networks for object detection and segmentation directly on the mobile device, understanding the roadside environment practically at the side of the road.

Vision SDK object detection (left) and semantic segmentation (right) running on a mobile device
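
The Vision SDK’s own interfaces aren’t shown here, but as a rough sketch of the on-device pattern it follows, a frame-by-frame detection loop might look like the following; the model, labels, and frame source are hypothetical stand-ins, not the SDK’s API.

```python
# Rough illustration of the on-device pattern described above: a lightweight
# detector runs on every camera frame, entirely locally. This is NOT the Vision
# SDK's actual API; the model, labels, and frame source are hypothetical stand-ins.
from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class Detection:
    label: str          # e.g. "speed_limit_sign", "vehicle", "lane_marking"
    confidence: float   # model score in [0, 1]
    box: tuple          # (x_min, y_min, x_max, y_max) in pixel coordinates

def detect_roadside_features(frame) -> List[Detection]:
    """Stand-in for an on-device neural network (e.g. a quantized mobile model)."""
    # A real implementation would run inference on the frame here.
    return [Detection("speed_limit_sign", 0.93, (120, 40, 180, 110))]

def process_camera_stream(frames: Iterable) -> None:
    for frame in frames:
        for d in detect_roadside_features(frame):
            # Everything above runs on the phone or embedded device; no network
            # round trip is needed to recognize standard roadside features.
            print(f"{d.label} ({d.confidence:.2f}) at {d.box}")
```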

We can now take the next step, using Sprint’s mobile edge computing cores to run deeper AI that requires more costly compute resources while still benefiting from super low latency: the Vision SDK now connects directly to Sprint’s IoT Network in less than 10 ms. We still use the cloud for the bigger deep learning tasks and for training algorithms, but we have fundamentally shifted the parts that affect real-time driving to the edge. Here’s a deeper look at how it works:

Step 1: 100% on the edge. Developers running the Vision SDK for advanced driver navigation and safety features on mobile devices continue to process everything locally, running neural networks that identify a wide range of standard roadside features, including signs, speed limits, other vehicles, and lane geometry. While the Vision SDK continues to run 100% on the edge, today we are launching the Curiosity plugin for all developers running the Vision SDK on the Sprint Curiosity IoT platform. This new AI plugin collects anomalies above our Vision detection threshold for processing on the Curiosity Core.
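
As a rough sketch of the anomaly-collection idea (the class list, threshold, and function names are illustrative, not the actual Curiosity plugin configuration):

```python
# Sketch of the anomaly-collection idea in Step 1: detections the on-device model
# recognizes confidently stay local, while unknown or low-confidence observations
# are queued for the Curiosity Core. Class names and the threshold are illustrative.
KNOWN_CLASSES = {"speed_limit_sign", "vehicle", "lane_marking", "traffic_light"}
CONFIDENCE_THRESHOLD = 0.6  # below this, treat the observation as an anomaly

def is_anomaly(label: str, confidence: float) -> bool:
    return label not in KNOWN_CLASSES or confidence < CONFIDENCE_THRESHOLD

def collect_anomalies(detections: list, anomaly_queue: list) -> None:
    """Queue anomalous detections for upload to the nearest Curiosity Core."""
    for det in detections:
        if is_anomaly(det["label"], det["confidence"]):
            anomaly_queue.append(det)
```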

Step 2: Direct connect to the Curiosity cores in 10 ms. Once the Vision SDK detects an anomaly, the cameras capture images and video of it and connect directly to the closest Curiosity Core running Mapbox’s network edge software. This software includes advanced AI with neural networks similar to those in the Vision SDK, but it uses more compute resources to process imagery and produce more complex detections. That includes interpreting unclassified signs, where we can run Optical Character Recognition (OCR) to read the signage, and classifying traffic incidents or construction from different angles. These advanced detections are made locally on the Curiosity Cores, which are typically within 50 miles of the devices, reducing data transfer times. Anomalies can be captured, interpreted, and returned to the vehicle in less than a second.
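
On the core side, the heavier processing might look roughly like the following sketch; pytesseract stands in for whatever OCR model actually runs on the Curiosity Cores, and the function and field names are illustrative.

```python
# Sketch of the heavier processing in Step 2: an edge-core handler receives imagery
# of an anomaly and runs models too costly for the device, such as OCR on an
# unclassified sign. pytesseract is a stand-in for the actual OCR on the cores.
import time
from PIL import Image
import pytesseract

def interpret_unclassified_sign(image_path: str) -> dict:
    """Run OCR on a captured sign image and return a small, vehicle-ready result."""
    start = time.perf_counter()
    text = pytesseract.image_to_string(Image.open(image_path)).strip()
    return {
        "type": "unclassified_sign",
        "ocr_text": text,
        "processing_ms": round((time.perf_counter() - start) * 1000, 1),
    }
```

The returned payload is small enough to send straight back to the vehicle over the low-latency link, keeping the full round trip within the sub-second budget described above.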

Step 3: Learnings from deep AI in the core are streamed back to the cloud. All learning from these anomalies is captured and pushed back to our global cloud to update our neural networks across the world with fresh insights. This lets our cloud continue to collect imagery detections, improving the training of the neural networks that run on the edge. Images can be transferred off-peak and processed on our Amazon Web Services cloud resources.
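
A minimal sketch of that feedback loop, assuming anomaly imagery lands in an S3 bucket used for training (the bucket name and off-peak window are illustrative assumptions):

```python
# Sketch of the Step 3 feedback loop: anomaly imagery is batched on the core and
# pushed to the cloud during an off-peak window so it can feed neural-network
# training. The bucket name and the off-peak hours are illustrative.
import datetime
import os
import boto3

TRAINING_BUCKET = "example-live-map-training-data"  # hypothetical S3 bucket

def is_off_peak(now: datetime.datetime) -> bool:
    """Treat 01:00-05:00 local time as the off-peak transfer window."""
    return 1 <= now.hour < 5

def upload_anomaly_batch(image_paths: list) -> None:
    if not is_off_peak(datetime.datetime.now()):
        return  # hold the batch until the off-peak window opens
    s3 = boto3.client("s3")
    for path in image_paths:
        key = f"anomalies/{datetime.date.today().isoformat()}/{os.path.basename(path)}"
        s3.upload_file(path, TRAINING_BUCKET, key)
```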

This is a redistribution of map building, designed to produce the live maps needed to support autonomous vehicles and other mobility initiatives. Allocating tasks closer to the edge is a game changer. We get the speed and efficiency of our neural networks running directly on mobile devices with low-cost, low-power chips, but we can also connect directly to the Curiosity Cores when we need more thorough processing. These Sprint Cores natively running Mapbox will be rolled out across the US this year, accelerating the speed at which our live maps can learn and adapt to the changing world.

Developers using Sprint do not need to wait for local 5G roll-outs. These benefits will be available to Curiosity developers, enabling applications ranging from mobility and navigation to on-demand services and Smart City applications that help communities make better use of scarce resources. As Curiosity transitions to 5G installations, the increased bandwidth will only accelerate the ability to refresh the map every second.

In today’s cities, road features aren’t static. Curiosity IoT is an architecture for remote sensors interacting with the network and each other with super high bandwidth, extremely low latency, and precise location. Machines and humans alike can have fresh, precise maps all the time, without waiting for a survey van to drive by. That live picture of the world will benefit not only application developers building commercial uses but will also have an amazing impact on communities that can now plan and act based on live data.

Please email press@mapbox.com for onsite press inquiries or to schedule meetings. Our CEO Eric Gundersen and I will be at the Sprint media announcement Monday in Barcelona to discuss 5G maps, the implications of location data now embedded directly into the spectrum, and our partnership with Sprint.

Marc Prioleau



