1. Introduction
In this lab you will build a simple action using Dialogflow and learn how to integrate it with the Google Assistant.
The exercises are ordered to reflect a common cloud developer experience:
- Create a Dialogflow v2 agent
- Create entities
- Create intents
- Set up a webhook with Google Cloud Functions
- Use the knowledge base to import FAQs
- Test the chatbot
- Enable the Google Assistant integration
What you will build
We will build a Google Assistant app for the Women in Voice meetup group. You will be able to ask when the next meetup is, request article and book tips, or ask general questions about the meetup group.
What you'll learn
- How to create a chatbot with Dialogflow v2
- How to create linear conversation with Dialogflow
- How to make use of entities
- How to make use of the knowledge base
- How to set up webhook fulfillments with Dialogflow and Google Cloud Functions
- How to bring your application to the Google Assistant with Actions on Google
Prerequisites
- You will need a Google Identity / Gmail address to create a Dialogflow agent.
- We will provide you with GCP credits for using Cloud Functions.
- You will need to access this public Google Sheet for getting the agent data.
- Open it in a new tab: https://docs.google.com/spreadsheets/d/1UWx3WYVCrqz0D4uJ_pO56WeqEPa9rQDG1cfc_H11kgY/edit?usp=sharing
- Basic knowledge of JavaScript is not required, but can be handy in case you want to change the webhook fulfillment code.
2. Getting set up
Enable Web & App Activity
- Make sure Web & App Activity is enabled for the Google account you will use.
Create a Dialogflow agent
- In the left bar, right under the logo, select "Create New Agent". In case you have existing agents, click the dropdown first.
- Specify an agent name (use your own name):
yourname-wiv
- As the default language choose: English - en.
- As the default time zone, choose the time zone that's the closest to you.
- Do not select Mega Agent. (With this feature you can create an overarching agent, which can orchestrate between "sub" agents. We do not need this now.)
- Click Create
Configure Dialogflow
- Click on the gear icon, in the left menu, next to your project name.
- Enter the following agent description: Women in Voice agent
- Scroll down to Beta Features and flip the switch to enable beta features.
- Scroll down to Log Settings and flip both switches, to log interactions in Dialogflow and to log all interactions in Google Cloud Stackdriver. We will need this later in case we want to debug our action.
- Click Save
- Click Done
Configure Actions on Google
- Click the Google Assistant link under See how it works in Google Assistant in the right-hand panel.
This will open: http://console.actions.google.com
NOTE: Make sure you are logged in with the same Google account as in Dialogflow.
When you are new to Actions on Google, you will need to go through this form first:
- Open your action in the simulator by clicking on the project name.
- Select Develop in the menu bar
- Uncheck Match user's language setting, so the text-to-speech synthesizer won't be overruled by the Assistant's default language.
- Click Save
- Select Test in the menu bar
- Make sure the simulator is set to English and click Talk to my test app
The action will greet you with the Dialogflow Default Welcome Intent response. That means the integration with Actions on Google works!
Configure Google Cloud
For this tutorial you will need a GCP account with a billing account. If you don't have one yet, you can create one with these steps.
Normally a billing account requires a payment method such as a credit card. For this workshop, we can make use of workshop credits, which let you skip this step.
- Navigate to this URL and login
- Click: Click here and access your credits
- Click Accept & Continue
You are all set. You've created a billing account with $25 in credits, which should be more than enough to use Cloud Functions for a long time.
Enable Google Sheets API
If your agent needs more than static intent responses (for example to fetch data from a web service, database or Sheet), you need to use fulfillment to connect your web service to your agent. Connecting your web service allows you to take actions based on user expressions and send dynamic responses back to the user.
For example, if a user wants to receive a blog or book tip, your web service can check in your database and respond to the user with an article to read.
In this tutorial we won't make use of a database, instead we will make use of a Google Sheet. Once the sheet gets updated, the Google Assistant action will be updated as well. Neat!
- Open this Google Sheet in a new browser tab, if you haven't done so already:
- https://docs.google.com/spreadsheets/d/1UWx3WYVCrqz0D4uJ_pO56WeqEPa9rQDG1cfc_H11kgY/edit#gid=1240329448
- IMPORTANT: Make a copy of this sheet. Click File > Make a Copy
- Once the sheet has been copied, click Share
- We will need to give the Dialogflow Service Account edit rights. To do this, open Dialogflow > Settings (cog wheel).
- Scroll down to Google Project
- Copy the service account (email) address. It should look something like this: dialogflow-<someid>@<my-gcp-project>.iam.gserviceaccount.com
- Paste this service account in the Share popup of Google Sheets, and give it Edit rights.
- Next we will need to remember the Sheet ID that we are currently working in.
The Sheets URL will look something like this:
https://docs.google.com/spreadsheets/d/1fPd8b_z19U7ZzAaY327QhYoogn6q8c1rpGSNF8KIR_o/edit#gid=1240329448
But we are only interested in the Sheet id, which is the part between:
https://docs.google.com/spreadsheets/d/ and /edit#gid=1240329448 (without the slashes).
So it will look something like this: 1fPd8b_z19U7ZzAaY327QhYoogn6q8c1rpGSNF8KIR_o
Write this Sheet ID down, or copy it to Notepad. In the webhook steps we will use this again.
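If you ever want to grab the ID programmatically instead of by hand, here is a small Node.js sketch (the URL below is just the example from above; use your own):
// Extract the Sheet ID: the part of the URL between /d/ and the next slash.
const url = 'https://docs.google.com/spreadsheets/d/1fPd8b_z19U7ZzAaY327QhYoogn6q8c1rpGSNF8KIR_o/edit#gid=1240329448';
const sheetId = url.split('/d/')[1].split('/')[0];
console.log(sheetId); // 1fPd8b_z19U7ZzAaY327QhYoogn6q8c1rpGSNF8KIR_o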
- Open http://console.cloud.google.com in another browser tab. (In case you have more Google Cloud projects, make sure the new Dialogflow project, yourname-wiv, is active.)
- In the search bar, search for: Google Sheets API
- Click this result, and click the Enable button at the top.
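If you prefer the command line, the same API can also be enabled with the gcloud CLI (assuming the Cloud SDK is installed and your new project is selected):
gcloud services enable sheets.googleapis.com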
3. Custom Entities
Entities are objects your app or device takes actions on. Think of them as parameters or variables. In our action we will ask:
"I want a reading tip about chatbots" / "I want a reading tip about voice"
Whether you say Chatbots, Voice or Both, the value will be captured by a custom entity and used as a parameter in the request to a web service.
Here's more information on Dialogflow Entities.
Creating the tech Entity
- Click in the Dialogflow Console on the menu item: Entities
- Click Create Entity
- Entity name (make sure it's all lowercase):
tech
- Specify the options with their synonyms. (You can tab through the interface.)
Chatbots - Chatbots, Chat, Web
Voice - Voice, Voicebots, Voice Assistants
Both - Both, All
- Switch to Raw Edit mode by clicking on the menu button next to the blue Save button.
- Notice that you could have entered all the entities in CSV format as well. This can be handy when you have a lot of entities that need to be created.
"Chatbots","Chatbots","Chat","Web"
"Voice","Voice","Voicebots","Voice Assistants"
"Both","Both","All"
- Hit Save
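For completeness: entities don't have to be created by hand in the console. If you would rather script this (not needed for this lab), a sketch using the Dialogflow Node.js client library (@google-cloud/dialogflow, with credentials for your agent's GCP project) could look roughly like this:
// Sketch: create the 'tech' entity type through the Dialogflow API (not required for this lab).
const dialogflow = require('@google-cloud/dialogflow');

async function createTechEntity(projectId) {
  const client = new dialogflow.EntityTypesClient();
  const [entityType] = await client.createEntityType({
    // Resource name of the agent: projects/<project-id>/agent
    parent: `projects/${projectId}/agent`,
    entityType: {
      displayName: 'tech',
      kind: 'KIND_MAP', // map entries: a reference value plus synonyms
      entities: [
        {value: 'Chatbots', synonyms: ['Chatbots', 'Chat', 'Web']},
        {value: 'Voice', synonyms: ['Voice', 'Voicebots', 'Voice Assistants']},
        {value: 'Both', synonyms: ['Both', 'All']},
      ],
    },
  });
  console.log(`Created entity type: ${entityType.name}`);
}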
4. Intents
Dialogflow uses intents to categorize a user's intentions. Intents have Training Phrases, which are examples of what a user might say to your agent.
For instance, a user who wants to know when the next event is might ask:
"When is the next meetup?"
When a user writes or says something, referred to as a user expression, Dialogflow matches the user expression to the best intent in your agent. Matching an intent is also known as intent classification.
Here's more information on Dialogflow Intents.
Modifying the Default Welcome Intent
When you create a new Dialogflow agent, two default intents are created automatically. The Default Welcome Intent is the first flow you get to when you start a conversation with the agent. The Default Fallback Intent is the flow you'll get once the agent can't understand you or cannot match an intent with what you just said.
- Click Intents > Default Welcome Intent
In the case of the Google Assistant, it will auto-start with the Default Welcome Intent. This is because Dialogflow is listening to the Welcome event. However, you can also invoke the intent by saying one of the entered training phrases.
Here's the welcome message for the Default Welcome Intent:
User | Agent |
"Ok Google, talk to <yourname>-WIV" | "Hey there, I'm Anna, the virtual agent of Women in Voice." "You can ask me for information about meetups, Women in Voice or a reading tip." "What would you like to know?" |
- Scroll down to Responses.
- Clear all Text Responses.
- In the default tab, create the following 3 responses. (Click Add Responses > Text or SSML Response, for each new line:)
- Hey there, I'm Anna, the virtual agent of Women in Voice.
- You can ask me for information about meetups, Women in Voice or a reading tip.
- What would you like to know?
The configuration should be similar to this screenshot.
- The previous output is used for text chatbots; we can modify the output a bit specifically for the Google Assistant. We will use SSML (Speech Synthesis Markup Language) to build pauses into our sentences. Click the Google Assistant tab.
- Do not enable the Default toggle, as we won't re-use the chatbot message.
- Click Add Responses > Simple Response
- Add the following text version:
Hey there, I'm Anna, the virtual agent of Women in Voice.
You can ask me for information about meetups, Women in Voice or a reading tip. What would you like to know?
- Then click Customize audio output
- And add the following SSML version:
<speak><p><s>Hey there, I'm Anna, the virtual agent of Women in Voice.</s><s>
You can ask me for information about meetups, Women in Voice or a reading tip.</s></p><break time="500ms"/><p><s>
What would you like to know?</s></p></speak>
The configuration should be similar to this screenshot.
- Click Save
Here you can find more information about SSML for Actions on Google.
- Let's test this intent. First we can test it in the Dialogflow Simulator.
Type: Hello. It should return the welcome message you configured.
- Now, switch back to the Actions on Google console.
(You might want to keep this in another tab.)
Click: "Talk to my test app." And listen to the new welcome message.
Modifying the Default Fallback Intent
- Click Intents > Default Fallback Intent
- Scroll down to Responses.
- Clear all Text Responses.
- In the default tab, create the following responses, each on a new line, so it alternates between these options:
Sorry, can you repeat this?
I didn't understand you. You can ask me questions about Women in Voice, a book or article tip or when the next meetup will be.
- Click Save
Note: when you don't enter a Google Assistant-specific response, the default text response is used.
Create the Stop Intent
- Click on the Intents menu item.
- Click Create Intent
- Enter the Intent Name:
Stop Intent
- Click Add Training phrases
No
That's it
Bye
I don't want that
Goodbye
It's ok for now
Quit
I want to stop
Close this
End the conversation
- Scroll down to Responses > Add Response
- Add the following text options:
Alright! Hopefully we will see you at one of our meetups!
No problem. See you at one of our meetups!
- Flip the switch: Set this intent as the end of conversation. This makes sure that once this intent is matched, the Google Assistant action is closed.
- Click Save.
Create the Meetup Intent
The Meetup Intent will contain this part of the conversation:
User | Agent |
"When is the next meetup?" | "The next meetup will be <date> at <time> in <location>. The topic will be <topic>. And the speakers are: <speakers>. You can register via our newsletter." |
- Click on the Intents menu item.
- Click Create Intent
- Enter the Intent Name:
Meetup Intent
(Make sure you use a capital M and a capital I. If you spell the intent differently, the back-end service won't work!)
- Click Add Training phrases
When is the next meetup?
Do you have any events?
Which events are in the planning?
Are there meetup events soon?
I would love to attend a meetup
Can I join a virtual meetup?
When will you get together?
Can I join?
What does your calendar look like?
- Click Fulfillment > Enable Fulfillment
- Flip the Enable Webhook call for this intent switch.
- Hit Save
Create the Tip Intent
The Tip Intent will contain this part of the conversation:
User | Agent |
"I want a reading tip." | "Do you want to read more about Chatbots, Voice or Both?" |
"Voice" | "Alright, here's the tip of the day! The <type> <title> of <author>. Do you want another book or article tip? Also, I can tell you more about meetups or what we do. How can I help?" |
- Click on the Intents menu item again.
- Click Create Intent
- Enter the Intent Name:
Tip Intent
(Make sure you use a capital T and a capital I. If you spell the intent differently, the back-end service won't work!)
- Click Add Training phrases and add the following:
Can I get a tip for an article?
I would like to receive a reading tip
Any book tips?
What's nice to read?
I want to learn more about Chatbots, what should I read?
What are nice blogs?
Do you have book suggestions?
I want to receive information about Both
Can I have Chatbots reading tip
I would like to read more about Voice
Voice please
Both are okay.
Reading tip
Tip
Blog
Article
Book
Book suggestions
Yes
Yeah
Another tip
Yes one more
- Scroll down to Action and parameters
- Mark tech as required
- Click Define Prompt and enter:
Do you want to read more about Chatbots, Voice or Both?
- Click Fulfillment > Enable Fulfillment
This time we are not hardcoding a response; the response will come from the cloud function.
- Flip the Enable Webhook call for this intent switch.
- Hit Save
5. Knowledge Connectors
Knowledge connectors complement defined intents. They parse knowledge documents (for example, FAQs or articles from CSV files, online websites or even PDF files) to find automated responses. To configure them, you define one or more knowledge bases, which are collections of knowledge documents.
Read more about Knowledge Connectors.
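An FAQ knowledge document in CSV form is simply a list of question/answer rows, which is what the FAQ tab of the workshop sheet contains. A minimal illustrative example (the second row is hypothetical, based on the answers used later in this lab):
"What's the url for your website?","You can find us at www.womeninvoice.com."
"How can I register for a meetup?","You can register via our newsletter."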
Let's try this out.
- Select the en tag, to select the English language in the top menu.
- Select Knowledge (beta) in the menu.
- Click the right blue button: Create Knowledge Base
- Enter Women in Voice as the Knowledge Base name and hit Save.
- Click the Create the first one link
- This will open a window.
Use the following config:
Document Name: Women in Voice FAQ Sheet
Knowledge Type: FAQ
Mime Type: CSV
- We will need the data from the sheet. Make sure the data sheet is open, and select the FAQ tab.
- Select File > Download > CSV
- Back in Dialogflow, Click Upload File from Computer and select the CSV file you have downloaded. Click Create
A knowledge base has been created:
- Click Add Response
Create the following answer and hit Save:
$Knowledge.Answer[1]
- Click View Detail
This will display all the FAQs you have implemented in Dialogflow.
That's easy!
Note that you could also point to an online HTML page with FAQs to import them into your agent. It's even possible to upload a PDF with a block of text, and Dialogflow will come up with the questions itself.
- Click on Knowledge (beta) in the Dialogflow menu to go back to all the Knowledge base connectors.
- It's possible to adjust the strength of the Knowledge Base. This is useful when your FAQs are winning from (or losing to) your own intents too often. Since we don't have many intents, let's make our Knowledge Base a bit stronger: change the scale to -0.2. After dragging the slider, the value is saved automatically.
FAQs should be seen as 'extras' to add to your agent, next to your intent flows. Knowledge Base FAQs don't train the model, so asking a question in a completely different way might not get a match, because they don't make use of Natural Language Understanding (machine learning models). This is why it's sometimes worth converting your FAQs to intents.
6. Webhook Fulfillment
Create a Google Cloud Function
- Navigate to http://console.cloud.google.com in another browser tab.
- Select in the left menu Cloud Functions
- Click Create Function
- Specify the following configuration:
- Name:
dialogflow
- Memory allocated: 256MiB
- Trigger: HTTP
- Copy the URL to your clipboard.
- Select Inline Editor
- Runtime: NodeJS 8
- Function to execute:
dialogflow
- Make sure the authentication checkbox (Allow unauthenticated invocations) is checked, so Dialogflow can call the function.
- Here's the contents for package.json. Copy and paste this in the package.json tab of the editor.
This piece of code loads the required npm libraries into Google Cloud:
{
"name": "dialogflow",
"description": "Cloud Functions",
"engines": {
"node": "8"
},
"dependencies": {
"request": "^2.85.0",
"request-promise": "^4.2.5",
"dialogflow-fulfillment": "^0.6.1",
"actions-on-google": "^2.2.0",
"googleapis": "^48.0.0",
"moment": "^2.24.0"
},
"devDependencies": {
"eslint": "^5.12.0",
"eslint-plugin-promise": "^4.0.1",
"ngrok": "^3.2.7"
},
"private": true
}
- Here's the contents for the index.js. Copy and paste this in the index.js tab of the editor.
This piece of code integrates with the googleapis library to fetch data from a Google Sheet. It makes use of the actions-on-google library to display cards on a Google Assistant device, the dialogflow-fulfillment library to route Dialogflow intents to handlers, and the moment library to handle date and time objects.
/* jshint esversion: 8 */
'use strict';
process.env.DEBUG = 'dialogflow:debug';
const ACCOUNTS_SHEET_ID = '1UWx3WYVCrqz0D4uJ_pO56WeqEPa9rQDG1cfc_H11kgY';
const {
BasicCard,
Button,
} = require('actions-on-google');
const {google} = require('googleapis');
const moment = require('moment');
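// Note: the locale is set to Dutch ('nl'), so spoken dates come out like '29 april'; change to 'en' for English month names.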
moment.locale('nl');
const { WebhookClient } = require('dialogflow-fulfillment');
var books;
var meetups;
const SHEETS_SCOPE = 'https://www.googleapis.com/auth/spreadsheets.readonly';
/**
* Authenticates the Sheets API client for read-only access.
*
* @return {Object} sheets client
*/
async function getSheetsClient() {
// Uses the function's default service account credentials with read-only access to Sheets.
const auth = await google.auth.getClient({
scopes: [SHEETS_SCOPE],
});
return google.sheets({version: 'v4', auth});
}
/**
* Return a natural spoken date
* @param {string} date in 'YYYY-MM-DD' format
* @returns {string}
*/
var getSpokenDate = function(date){
let datetime = moment(date, 'YYYY-MM-DD');
return `${datetime.format('D MMMM')}`;
};
/* When the tipIntent Intent gets invoked. */
function tipIntent(agent) {
var par = agent.parameters.tech;
var selection = [];
//console.log(par);
//console.log(books);
for(var i = 0; i<books.length; i++){
if(books[i][2].toLowerCase() == par.toLowerCase()) {
selection.push(books[i]);
}
}
var random = Math.floor(Math.random() * selection.length);
var booktip = selection[random];
//console.log(selection[random]);
let spokenText = `<p><s>Alright, here's the tip of the day!</s></p><p>The ${booktip[6]} ${booktip[0]} of ${booktip[1]}.</p>`;
let writtenText = `Alright, here's the tip of the day! The ${booktip[6]} ${booktip[0]} of ${booktip[1]}.`;
//console.log(booktip[8]);
if (agent.requestSource === agent.ACTIONS_ON_GOOGLE) {
let conv = agent.conv();
conv.ask(`<speak>${spokenText}</speak>`);
conv.ask(new BasicCard({
title: `Tip of the day!`,
subtitle: `${par}`,
text: `The ${booktip[6]} ${booktip[0]} of ${booktip[1]}.`,
buttons: new Button({
title: 'Read',
url: `${booktip[8]}`,
})
}));
conv.ask(`<speak><p><s>Do you want another book or article tip? Also, I can tell you more about meetups or what we do. How can I help?</s></p></speak>`);
// Add Actions on Google library responses to your agent's response
agent.add(conv);
} else {
agent.add(writtenText + ' Do you want another book or article tip? Also, I can tell you more about meetups or what we do. How can I help?');
}
}
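/* When the Meetup Intent gets invoked: look up the first meetup in the sheet that is not in the past. */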
function meetupIntent(agent) {
let conv = agent.conv();
let record;
console.log(meetups);
for(var i = 0; i<meetups.length; i++){
let d = moment(meetups[i][0], 'YYYY-MM-DD');
let today = moment(new Date());
if(moment(d).isSameOrAfter(today)) {
// the i event is not in the past
record = meetups[i];
console.log(record);
break;
}
}
let date = getSpokenDate(record[0]);
let spokenText1 = `The next meetup will be ${date} at ${record[1]} in ${record[3]}.`;
let spokenText2 = `The topic will be <emphasis level="moderate">${record[2]}.</emphasis>`;
let spokenText3 = `You can register via our newsletter.`;
let writtenText = `${spokenText1} The topic will be ${record[2]}. ${spokenText3}`;
if (agent.requestSource === agent.ACTIONS_ON_GOOGLE) {
conv.ask(`<speak>${spokenText1} ${spokenText2} ${spokenText3}</speak>`);
conv.ask(new BasicCard({
title: `Meetup`,
subtitle: `${record[2]}`,
text: `${record[0]} ${record[1]} - ${record[3]}`,
buttons: new Button({
title: 'Register',
url: `http://www.meetup.com`
})
}));
conv.ask('<speak><p><s>Is there anything else I can help you with?</s></p></speak>');
agent.add(conv);
} else {
agent.add(`${writtenText} Is there anything else I can help you with?`);
}
}
exports.dialogflow = async (request, response) => {
var agent = new WebhookClient({ request, response });
console.log('Dialogflow Request headers: ' + JSON.stringify(request.headers));
console.log('Dialogflow Request body: ' + JSON.stringify(request.body));
const client = await getSheetsClient();
const allBooks = await client.spreadsheets.values.get({
spreadsheetId: ACCOUNTS_SHEET_ID,
range: 'Books&Blogs!A:I',
});
const allEvents = await client.spreadsheets.values.get({
spreadsheetId: ACCOUNTS_SHEET_ID,
range: 'Meetups!A:D',
});
books = allBooks.data.values;
meetups = allEvents.data.values;
books.shift();
meetups.shift();
var intentMap = new Map();
intentMap.set('Tip Intent', tipIntent);
intentMap.set('Meetup Intent', meetupIntent);
agent.handleRequest(intentMap);
};
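For reference, the column indices used in the handlers above imply the following sheet layout; this is inferred from the code, so verify it against the copy of the sheet you made:
// Books&Blogs tab (range A:I), per row:
//   [0] title, [1] author, [2] tech category (Chatbots/Voice/Both), [6] type (Article/Book), [8] link
// Meetups tab (range A:D), per row:
//   [0] date in YYYY-MM-DD format, [1] time, [2] topic, [3] location
// The first row of each range is assumed to be a header row and is removed with shift().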
- Click the Environment variables, networking, timeouts and more link
- Select the Dialogflow Integrations service account.
(By default it uses the App Engine default service account, but it should be the Dialogflow service account that you shared your Google Sheet with in the earlier steps of this tutorial.)
- Before we deploy the cloud function, we will change one line of code in the index.js tab, the line that sets the sheet ID:
const ACCOUNTS_SHEET_ID = '1Yo_E8KONgSiUm00ZmTOqtjXCwULmc2JuI3sjxRyvrkE';
In one of the first steps, we wrote your own Sheet ID down in Notepad. Copy and paste that ID into your code here.
- Now we are ready. Click the Create button. It will take a moment, because it's deploying your serverless function.
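As an alternative to the inline editor, the same function could be deployed from your own machine with the gcloud CLI. A sketch, assuming index.js and package.json are in the current directory (note that the Node.js 8 runtime may no longer be offered, so adjust the runtime flag if needed):
gcloud functions deploy dialogflow \
  --runtime nodejs8 \
  --trigger-http \
  --allow-unauthenticated \
  --memory 256MB \
  --entry-point dialogflow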
Enable fulfillments in Dialogflow
- Switch back to Dialogflow
- Click Fulfillment in the main menu
- Enable the Webhook switch.
- Enter the URL of the cloud function which was copied to the clipboard.
For example: https://us-central1-leeboonstra-wiv-uhtefa.cloudfunctions.net/dialogflow
- Click Save.
- Let's test the webhook to see if the code works, by trying the flows directly in the Dialogflow simulator.
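For reference, this is roughly what Dialogflow v2 sends to the webhook when the Tip Intent is matched, trimmed to the fields the fulfillment code actually uses (values are illustrative):
{
  "responseId": "...",
  "queryResult": {
    "queryText": "I would like to read more about Voice",
    "parameters": { "tech": "Voice" },
    "intent": { "displayName": "Tip Intent" }
  },
  "originalDetectIntentRequest": { "source": "google" }
}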
7. Actions on Google
Actions on Google is a development platform for the Google Assistant. It allows the third-party development of "actions"—applets for the Google Assistant that provide extended functionality.
You invoke a Google Action by asking Google to open or talk to an app.
This will open your action, change the voice, and leave the 'native' Google Assistant scope. This means that everything you ask from this point on needs to be handled by your own action. Within your own action you can't suddenly ask the Google Assistant for, say, weather information; you would have to leave (close) the scope of your current action (your app) first.
Testing your action in the Google Assistant simulator
Let's test the following conversation:
User | Google Assistant |
"Hey Google, talk to my test app." | "Getting the test version of my app."" |
User | Your-Name-WIV |
- | "Hey there, I'm Anna, the virtual agent of Women in Voice. You can ask me for information about meetups, Women in Voice or a reading tip. What would you like to know?" |
Can I have a tip for a blog? | "Do you want to read more about Chatbots, Voice or Both?" |
Chatbots | "Alright, here's the tip of the day! The Article Chatbots And The Challenge Of Invisible Affordances of Cobus Greyling.Do you want another book or article tip? Also, I can tell you more about meetups or what we do. How can I help?" |
When is the next meetup? | "The next meetup will be 29 april at 6:00 PM in a virtual Hangout. The topic will beOnline Workshop: Create Google Actions with Dialogflow!.You can register via our newsletter.Is there anything else I can help you with?" |
What's the url for your website? | "You can find us at www.womeninvoice.com. Is there something else I can do for you?" |
No that's it, thanks | "Alright! Hopefully we will see you at one of our meetups!" |
- Switch back to the Google Assistant simulator
Open: https://console.actions.google.com
- Click on the microphone icon and ask the following:
Talk to my test app.
- Now let's ask:
Can I have a tip for a blog?
This should return:
"Do you want to read more about Chatbots, Voice or Both?"
Chatbots
"Alright, here's the tip of the day! The Article Chatbots And The Challenge Of Invisible Affordances of Cobus Greyling.
Do you want another book or article tip? Also, I can tell you more about meetups or what we do. How can I help?"
- Let's try a different version of the same question:
"Yes, I want to read more about Voice"
"Alright, here's the tip of the day! The Book Designing Voice User Interfaces: Principles of Conversational Experiences. ... of Cathy Pearl.
Do you want another book or article tip? Also, I can tell you more about meetups or what we do. How can I help?"
Notice that you have never used this training phrase in Dialogflow before. It just matched the right intent.
Also notice that you didn't get a followup question, because you provided enough information for Dialogflow to continue.
- Continue the conversation with the following phrases:
What's the URL for your website
Bye
Errors? Check the logs!
Every time you use console.log() in your Cloud Function code, data is written to your GCP logs (Stackdriver). You can access these logs by opening Cloud Console > Logging.
In the first dropdown, you can select Cloud Function > dialogflow to filter for your logs.
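You can also read the same logs from the command line with the gcloud CLI (assuming the Cloud SDK is installed and your project is set):
gcloud functions logs read dialogflow --limit 50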
8. Congratulations
You have created your first Google Assistant action with Dialogflow, well done!
As you might have noticed, your action was running in test mode, which is tied to your Google Account. If you log in on your Nest device, or on the Google Assistant app on your iOS or Android phone, with the same account, you can test your action there as well.
This is a workshop demo, but when you build applications for the Google Assistant for real, you can submit your Action for approval. Read this guide for more information.
What we've covered
- How to create a chatbot with Dialogflow v2
- How to create custom entities with Dialogflow
- How to create linear conversation with Dialogflow
- How to set up webhook fulfillments with Dialogflow and Google Cloud Functions
- How to bring your application to the Google Assistant with Actions on Google
What's next?
Enjoyed this codelab? Have a look at these other great labs!
- Build a TV guide action with Dialogflow and Actions on Google
- Build Actions for the Google Assistant with Dialogflow (level 1)
- Build Actions for the Google Assistant with Dialogflow (level 2)
- Build Actions for the Google Assistant with Dialogflow (level 3)
- Understanding fulfillment by integrating Dialogflow with Google Calendar
- Integrate Google Cloud Vision API with Dialogflow