By: Langston Smith
I’ve been brainstorming Mapbox/API.AI integrations and had voice-control on my mind. Google Assistant can dictate travel times, but could it change the actual map? What would this mean for accessibility? Could a “god mode” be created where users edit a map in real time using just their voice? The answer is… it’s totally possible and it’s ridiculously cool, which is why we made this tutorial:
The Mapbox Maps SDK for Android offers runtime styling, which enables you to dynamically change the look and feel of your map in real time. Our maps are composed of various layers, and it’s these layers that you’ll be adjusting with your voice in this project.
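To give you a feel for what runtime styling looks like in code before the voice piece comes in, here’s a minimal sketch that recolors the water layer once the map is ready. It assumes the Maps SDK version used in this project (where layers are fetched straight from the MapboxMap object) and a mapView field from your activity’s layout:

import android.graphics.Color;
import com.mapbox.mapboxsdk.maps.MapboxMap;
import com.mapbox.mapboxsdk.maps.OnMapReadyCallback;
import com.mapbox.mapboxsdk.style.layers.FillLayer;
import static com.mapbox.mapboxsdk.style.layers.PropertyFactory.fillColor;

mapView.getMapAsync(new OnMapReadyCallback() {
    @Override
    public void onMapReady(MapboxMap mapboxMap) {
        // Grab the "water" layer from the loaded style and recolor it on the fly.
        FillLayer waterLayer = mapboxMap.getLayerAs("water");
        if (waterLayer != null) {
            // No style reload needed; the map repaints immediately.
            waterLayer.setProperties(fillColor(Color.parseColor("#7c3aed")));
        }
    }
});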
Despite API.AI’s straightforward docs, it can take a quick minute to understand how API.AI’s entities, intents, actions, and parameters all fit with Mapbox’s runtime styling. I’ll walk you through each step of the process below. All of the code for this project is in the GitHub repo; view the installation instructions in the README to get started.
Entities
In the case of this project, entities are the individual items that will be changed in real time (“runtime”). These are the actual words and verbal commands that you’ll be feeding to the API.AI service:
- Camera change (“tilt”, “zoom to”, “spin”, etc.)
- Colors (red, black, purple, etc.)
- Map layer names (water, park, place island, bridge rail, etc.)
- PropertyFactory adjustments for Mapbox runtime styling (circle radius, fill color, text padding, etc.)
- Visibility (“hide”, “erase”, “show”, etc.)
Go to a single entity page in the API.AI console and click on “Switch to raw mode”:
Then click on the CSV tab to show the CSV field:
Here’s where you’ll paste the various CSV text (found in the list of gists below) into each respective entity’s CSV box. For example, you’ll see that the CSV text above matches what’s found in the “raw visibility csv” link below…
The layers listed in the raw layer csv file are based on the Mapbox Streets default style. If you’re using a different default Mapbox style or your own custom style, run the following code in onMapReady() to print a list of the style’s layer IDs to the Android Studio Logcat console.
for (Layer layer : mapboxMap.getLayers()) {
    // Print each layer's ID so you know exactly which names to put in the layer entity CSV.
    Log.d("Activity", layer.getId());
}
Intents
For this project, think of intents as the various high-level reasons why you’d be making a command. I added the big four to my intent list as you can see in the screenshot below:
“Adjust layer properties” intent:
- Adjust color of the water layer
- Adjust opacity of the park layer
- Adjust size and orientation of street labels’ text
“Adjust layer visibility” intent:
- Hide the hospital layer
- Delete water
“Change camera properties” intent:
- Rotate map 20 degrees
- Tilt the map 5 degrees
- Zoom the map to level 12
“Move map to location” intent (specific addresses, cities, and countries are all options):
- Find Boston
- Move map to Shanghai, China
- Go to 1600 Pennsylvania Avenue, US
- Visit 85 2nd Street, San Francisco, California
Let’s put this into practice with an example where we delete all of the parks on the map via voice command, as seen in the video. Remember, parks are deleted from the map by deleting the map’s park layer.
You will need to “train” API.AI by adding sample phrases that would fall under the “Adjust layer visibility” intent. That’s what you see below with “hide cemetery”, “remove sand”, and so on. These are sample voice commands that I typed into the “Add user expression” field in order to make API.AI smarter.
API.AI automatically recognizes “cemetery”, “sand”, “building”, and “water” as part of the @layer entity with the yellow highlight. The @visibility entity has two entries, each with many synonyms, which is why “hide” and “remove” are identified as part of @visibility with the orange highlight.
Here’s a test run of the “Delete park” command within the API.AI console. As you’ll see below, the parameter and values are included in the API.AI response by default.
Clicking on “SHOW JSON” will display the specific JSON response (seen below) that API.AI sends.
Because you get a JSON response, you can then parse through it to get the layer name and visibility action. In the case above, it’s “park” and “hide” respectively. Then with if/else Java statements, you can use the Mapbox Maps SDK runtime styling to get the designated layer and change the layer’s visibility.
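Here’s a hedged sketch of that handling code. The JSON path (result.parameters) and the parameter names (“layer” and “visibility”) are assumptions based on the console response described above, and mapboxMap is assumed to be a field on your activity; the sample project’s code in the repo is the real reference.

import org.json.JSONException;
import org.json.JSONObject;
import com.mapbox.mapboxsdk.style.layers.Layer;
import static com.mapbox.mapboxsdk.style.layers.Property.NONE;
import static com.mapbox.mapboxsdk.style.layers.Property.VISIBLE;
import static com.mapbox.mapboxsdk.style.layers.PropertyFactory.visibility;

private void handleVisibilityCommand(String apiAiResponseJson) throws JSONException {
    // Pull the parsed parameters out of the API.AI response.
    JSONObject parameters = new JSONObject(apiAiResponseJson)
        .getJSONObject("result")
        .getJSONObject("parameters");

    String layerName = parameters.getString("layer");             // e.g. "park"
    String visibilityAction = parameters.getString("visibility"); // e.g. "hide"

    Layer layer = mapboxMap.getLayer(layerName);
    if (layer == null) {
        return; // The current style has no layer with that ID.
    }

    // Use runtime styling to flip the layer's visibility based on the spoken command.
    if (visibilityAction.equals("hide") || visibilityAction.equals("erase")) {
        layer.setProperties(visibility(NONE));
    } else if (visibilityAction.equals("show")) {
        layer.setProperties(visibility(VISIBLE));
    }
}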
This trio of the API.AI response, if/else/switch statements, and runtime styling is what powers this voice control demo project. The example above is for deleting a layer, but it’s not too different to change the water layer to purple, tilt the map by 30 degrees, or move the map to Mexico City.
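For instance, a camera command like “tilt the map 30 degrees” ends in a camera update instead of a layer property change. A rough sketch, assuming the tilt value has already been parsed out of the response the same way:

import com.mapbox.mapboxsdk.camera.CameraPosition;
import com.mapbox.mapboxsdk.camera.CameraUpdateFactory;

private void tiltMap(double tiltDegrees) {
    // Keep the current camera position and only change its tilt.
    CameraPosition tilted = new CameraPosition.Builder(mapboxMap.getCameraPosition())
        .tilt(tiltDegrees)
        .build();
    mapboxMap.animateCamera(CameraUpdateFactory.newCameraPosition(tilted));
}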
Yes, it might seem like a lot, but I think things will click once you explore the API.AI console and dive into the sample project’s code.
You can learn more about our mobile SDKs here. Don’t forget to tweet a video of your finished project @Mapbox using the hashtag #BuiltWithMapbox. We love featuring cool projects!