Building a Google Action Using API.AI

Josh Feinberg
Published in Dev Tutorials · 4 min read · May 11, 2017


After buying my Google Home at launch, I knew instantly that I wanted to build conversation actions for it. Unfortunately, the functionality was very limited in the beginning, and the best I could do was integrate my smart plug with IFTTT to make this:

Luckily, we now have a lot more support: along with the Actions API, Google offers a service called API.AI, which makes it much easier to build conversation actions. Today we’re going to build a simple conversation to learn when the bus is coming.

First, let’s create our API.AI account and start our first “agent”. I named my agent “BusTracker”.

Setup for our agent

After that we are brought to a screen with the default “intents”. Intents are how we build the interaction between what a user says and how the system responds. When we create a new agent we are provided the “Default Welcome Intent”, which is called when the agent is started, and the “Default Fallback Intent”, which is used when the system doesn’t understand the user’s input.
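As a rough mental model, you can think of the agent as a router from user phrases to intents, with the fallback intent catching everything else. This sketch is purely illustrative (the intent name and phrases are my own assumptions, and API.AI does the real matching with machine learning, not substring checks):

```python
# Toy sketch of an agent routing user input to intents.
# The "Lookup Bus" intent and its phrases are hypothetical examples.
INTENTS = {
    "Lookup Bus": ["where is the bus", "when is the bus coming"],
}

def match_intent(user_input):
    """Return the matching intent's name, else the fallback intent."""
    text = user_input.lower()
    for name, phrases in INTENTS.items():
        if any(phrase in text for phrase in phrases):
            return name
    return "Default Fallback Intent"

print(match_intent("Where is the bus right now?"))  # Lookup Bus
print(match_intent("Tell me a joke"))               # Default Fallback Intent
```

The key takeaway is that every user utterance ends up in exactly one intent, which is why the fallback intent exists: it guarantees the agent always has something to say.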

Let’s click on the “Default Welcome Intent” to edit it and give our agent some default start statements:

Default responses

Now let’s enable our Google Assistant integration. The great part about using API.AI is that we get a lot more than just Google Assistant: there are also integrations for Facebook Messenger, Slack, and more. For now, let’s focus on the “Actions on Google” integration. Go ahead and enable it and set up your invocation name (I chose “Chicago Bus Tracker”). Don’t worry about the project ID for now, and pick whichever voice option you want. Authorize the integration, and then you will have the option to use the web simulator by hitting the “Preview” button.

Our first test run

Very basic but we’re off to a good start!

The next step is to add our own intent so our user can actually answer this question. First, though, we need to create a new “entity”.

An entity is essentially a variable type. It is what allows API.AI to know when the user is saying something that we care about for our API. For example, when the user says “Where is the 146 southbound bus?”, we want to recognize “southbound” as the bus direction. We need to create an entity for the direction of the bus.
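Conceptually, an entity is a set of canonical values, each with synonyms the user might say instead. Here is a small sketch of that idea; the specific values and synonyms are assumptions for illustration (define whatever matches your transit system):

```python
# Sketch of a "direction" entity: canonical values mapped to synonyms.
DIRECTION_ENTITY = {
    "southbound": ["southbound", "south", "going south"],
    "northbound": ["northbound", "north", "going north"],
}

def resolve_direction(spoken):
    """Map a spoken phrase to the entity's canonical value, or None."""
    spoken = spoken.lower().strip()
    for canonical, synonyms in DIRECTION_ENTITY.items():
        if spoken in synonyms:
            return canonical
    return None

print(resolve_direction("going south"))  # southbound
print(resolve_direction("sideways"))     # None
```

This normalization is the point of entities: no matter how the rider phrases the direction, our intent receives one canonical value it can act on.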

The new entity

Great!

Next up, let’s create a new intent so that we can accept some user input. We will call this intent “Lookup Bus”.

The first step is to create our user input. Start off with “Where is the 1 southbound bus?”. Now we have to teach API.AI that we don’t actually want to match the literal string “1”, so highlight that text and a popup appears with a list of options: our “entities”. Using the filter, select “@sys.number-integer”. Now our intent has a variable that we can use to handle the responses. Let’s do this again for the word “southbound”, this time using our entity “@direction”. You can add any additional phrases you want for kicking off this intent.
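The annotations we just made teach API.AI to pull structured parameters out of free-form speech. A minimal sketch of what that extraction produces (again an illustration of the idea, not API.AI’s actual ML-based parser):

```python
# Sketch of parameter extraction from an annotated training phrase:
# pull an integer (@sys.number-integer) and a direction (@direction).
DIRECTIONS = {"northbound", "southbound", "eastbound", "westbound"}

def extract_params(utterance):
    """Return a dict of the recognized parameters in an utterance."""
    params = {}
    for word in utterance.lower().strip("?!.").split():
        if word.isdigit():
            params["busnumber"] = int(word)
        elif word in DIRECTIONS:
            params["direction"] = word
    return params

print(extract_params("Where is the 146 southbound bus?"))
# {'busnumber': 146, 'direction': 'southbound'}
```

That parameter dict is what the rest of the intent (prompts, responses, and later our webhook) gets to work with.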

Now in our action section you can see we have our variables. I chose to rename my variable for the number to “busnumber” to help better describe it. I also made both variables required and updated the prompt to be something more human readable.

Finally, we get to our response section. For this first part of the tutorial we are just going to have our Google Home echo what we told it. Here we reference our variables with the $ sign, setting the text response to “You have entered in $busnumber $direction”.
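The “$busnumber $direction” template behaves like simple string substitution against the extracted parameters. A sketch of the same idea:

```python
# Sketch of $-style response templating: replace $name tokens with
# parameter values (longest names first, so $busnumber isn't clobbered
# by a shorter parameter name that happens to be a prefix of it).
def render_response(template, params):
    for name in sorted(params, key=len, reverse=True):
        template = template.replace("$" + name, str(params[name]))
    return template

params = {"busnumber": 146, "direction": "southbound"}
print(render_response("You have entered in $busnumber $direction", params))
# You have entered in 146 southbound
```

Nothing more is happening in this first version of the agent: parse the parameters, splice them back into a sentence, and speak it.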

One last step before we can get this going. We should be good citizens on the platform: by default, the microphone will stay on until we tell it to stop. So for now, under “Actions on Google”, check the box that says this intent will end the conversation.

We did it! We should now have an intent that looks similar to this:

Our first intent!

Now let’s test it out back in the web simulator. Go back to our integrations page and click on Settings, reauthorize if needed, then hit preview.

We Did It!

We now have a conversation action that we can properly parse using API.AI. And the best part is, this will work on your Google Home as well!

That’s it for part 1! Next up we’ll add in some hooks to actually hit the CTA bus tracker.

Edit: Part 2 is now available here!

Edit 2: Part 3 where we get into permissions is also up here!

