Analytics with Firebase (Realtime Database) in Voiceflow

In this tutorial, we will use a Firebase Realtime database to store some usage statistics from a Voiceflow skill.

The project is simple: for each user, we want to record the number of sessions, the date, the start time and the end time. As a bonus, we will also log whether the user is running the skill on a device with or without a display. All of this will be linked to each user through the unique identifier stored in the user_id variable.

1 - CREATE THE FIREBASE PROJECT

Let’s start by creating a new project on https://firebase.google.com/

On this page, click on Go to console at the top right.

On the next screen, click on Add project

Then enter the project name you want and hit Continue.

Finally, disable Google Analytics and hit Create project

Wait until the end of the project creation step


Then click on Continue

2 - CREATE A USER

The project is created but we now need to add a user to be able to authenticate later. To do this, simply click on Authentication in the left panel.

On the main page, click on Set up sign-in method.

On the next screen, click on the little pen on the right on the Email/Password row.

Allow users to sign up using their email address and password by enabling this option. Hit Save to finish.

Back to the previous page, click on the Users section

You can now add a user, which is what we will do by clicking on Add user.

Add an email (no need to put a valid email here) and a password. Before clicking on Add user, remember to write this information down somewhere as we will need it later.

Last step here, move your mouse over the user UID and click on Copy UID

3 - CREATE AND CONFIGURE THE REALTIME DATABASE

Let’s move on to the creation of our database. In the left panel, click on Database, then, on the page that opens, scroll down until you see the option to create a Realtime Database and click on Create database.

On the window that appears, simply click on Enable.

Well done, your database is created. Last step, we will modify the rules to allow access only to the user we created previously.
To do this, click on the Rules tab.

We will now replace the default code with this one (remember to replace the UID Jg4CmL3zexeLCsC7R5ym6PhW5nP2 with the one corresponding to the user you created previously).

{
  "rules": {
    ".write": "'Jg4CmL3zexeLCsC7R5ym6PhW5nP2' === auth.uid",
    ".read": "'Jg4CmL3zexeLCsC7R5ym6PhW5nP2' === auth.uid"
  }
}

Your Rules tab should now look like the code above.

Click on Publish to validate the changes.

Now, we’re going to need two more things: the URL of your database and the Web API key that will let us authenticate during the first API call.

For the database URL, simply copy and save it from the Database page.
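It should look something like https://your-project-id.firebaseio.com/ (the project ID here is just a placeholder).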

And for the Web API key, go to your project’s settings

Then on the main page, copy and save the Web API key

That’s it for the database part; now let’s move on to our Voiceflow project.

4 - INTEGRATE YOUR REALTIME DATABASE INTO VOICEFLOW

To continue this tutorial, import the Firebase // Realtime Database project into your board and open it.

In this project, we will log for each user: the ID, the date and timestamp, the start and end times, the duration of the session, the number of sessions already completed, whether the device has a screen, and the user’s timezone.

Of course, you can modify this project as you wish to make it fit your needs.

Let’s start by quickly walking through this project.
The first block of this project is a Code block that lets us know whether the device has a screen and also records the time the skill was launched.
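Here is a minimal sketch of what that Code block can contain (an assumption, not necessarily the exact code shipped with the template):

// Does the device support APL, i.e. does it have a display?
hasDisplay = voiceflow.capabilities && ('Alexa.Presentation.APL' in voiceflow.capabilities);

// Record the launch time in milliseconds so the session duration can be computed later
a_time_start = Date.now();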


The next part is only there for the example; it adds a delay so that the start and end times are different.


Finally, the part that interests us most here: the Analytics flow.


In the first Set block, you can enter all the information you saved while creating the Realtime Database.

database_url will be the… database URL (keep the trailing slash)
a_user is the email
a_password is the password
a_key is the Web API key

Here is an example of what this Set block can look like with the information from this tutorial (the values below are placeholders, use your own):
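database_url = https://my-analytics-project.firebaseio.com/
a_user = analytics@example.com
a_password = the password chosen in step 2
a_key = the Web API key copied in step 3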

I won’t go into detail on the next block because it reuses the flow from the tutorial on the user timezone available here.

The Making Data Code block calculates the session time from the start and end times. We also use it to clean up the user ID a little so that it can be used as the name of the record in our database.
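As a rough sketch, assuming the start time was stored in milliseconds in the first Code block (variable names such as a_user_id are illustrative), it can do something like this:

// Session duration in seconds, from the recorded start time and the current time
a_time_end = Date.now();
a_session_time = Math.round((a_time_end - a_time_start) / 1000);

// Firebase record names cannot contain '.', '#', '$', '[' or ']',
// so strip those characters from the Alexa user ID
a_user_id = user_id.replace(/[.#$\[\]]/g, '');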


The authentication part is handled by an Integration block that makes an API call and retrieves a token using the credentials of the user we created earlier.
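Concretely, this is a call to the Firebase Auth REST API. The request looks roughly like this (the exact endpoint configured in the Integration block may differ):

POST https://identitytoolkit.googleapis.com/v1/accounts:signInWithPassword?key={a_key}

{
    "email": "{a_user}",
    "password": "{a_password}",
    "returnSecureToken": true
}

The response contains an idToken field, which is the token used for the database request.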

Finally, the last block updates the user’s information with a PATCH request. If the user ID does not exist, a new record is created; otherwise, it is simply updated.

In the request parameters we send the user’s token and in the body, all the information we want to add to the user’s record in JSON format.

{
    "userID": "{user_id}",
    "timestamp": {timestamp},
    "date": "{a_date}",
    "time_start": "{a_time_start}",
    "time_end": "{a_time_end}",
    "session_time": "{a_session_time}",
    "hasDisplay": {hasDisplay},
    "sessions": {sessions},
    "userTimezone": "{a_user_timezone}"
}

The advantage of this format is that you can add and delete information very easily.
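Put together, the call made by this last block follows the Realtime Database REST API and looks roughly like this (the names of the variables holding the cleaned user ID and the token are illustrative):

PATCH {database_url}{a_user_id}.json?auth={id_token}

with the JSON above as the request body.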

That covers the project walkthrough. Note that the Analytics flow is also present in the Stop flow, so even if the user stops the skill, the data is still saved.

Ideally, keep this Analytics flow at the very end of the skill’s execution, after the last Speak block. This way you have the most up-to-date information and, more importantly, if something goes wrong you don’t risk breaking the skill and giving your users a bad experience.

5 - CONCLUSION

To test the project, you must use the Alexa Developer Console (ADC) simulator or a device.

After the first launch, you should see your first record in the database.

Congratulations, you have completed this tutorial!

See you soon for a new tutorial :+1:


Hi, great tutorial! One question I have: are the code snippets accessible somewhere? The reason I ask is that some of the code is an image and it got truncated, so I can’t see the entire code. Many thanks!

Sure, you can import the project from the link in the tutorial, from the marketplace or right here: https://creator.voiceflow.com/dashboard?import=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJwcm9qZWN0SWQiOjQ1OTI4LCJwcm9qZWN0TmFtZSI6IkZpcmViYXNlIC8vIFJlYWx0aW1lIERhdGFiYXNlIiwiaWF0IjoxNTY4ODQxNjgyfQ.HQisLD2sTyeqE_a79cUeFTk4DMvIzdDjdNzjjFisbX8

Ah, thanks. My bad, for some reason I missed the link in the tutorial. Curious about the marketplace though, I didn’t know there was one. How do I access it?

@goldzulu You can find it here: https://airtable.com/tblpYysnQuzqzmL0f/viwOWjYA2irqHRr17?blocks=hide

hasDisplay = voiceflow.capabilities && ('Alexa.Presentation.APL' in voiceflow.capabilities);

Does this still work now?

Sorry, I just misunderstood… a Display block is required to get hasDisplay.