Analytics is an essential part of a bot project: it gives you an overview of the performance KPIs of your AI assistant and a condensed understanding of how your clients and users interact with the bot. It shows you which intents, data entries, and channels are doing well. To make use of it, you need to understand how these counts come together and how to interpret the figures. This is a guide to understanding Mercury.ai analytics.
This total is derived from interactions with the assistant. Interactions that count towards the total are: events that trigger the Opt-in message (a free text input, opening the chat on Facebook, or a start intent in web chats), accepting or declining the Opt-in, the trigger message after Opt-in approval, free text inputs, and messages triggered by button clicks (either quick reply or postback buttons).
The total number of messages that were interpreted with a confidence of at least 0.5 and answered with the respective bot intent configured in the Creator.
Input that the AI assistant answered with a fallback message because the interpretation confidence was too low (below 0.5).
A relative figure describing the ratio of interpreted messages to total messages. The closer it is to 100%, the higher the level of interpretation.
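The relationship between interpreted messages, fallbacks, and the interpretation rate can be sketched roughly as follows. This is an illustrative example only; the 0.5 threshold comes from the text above, while the function and field names are hypothetical and not part of Mercury.ai's API:

```python
# Illustrative sketch: how interpreted messages, fallback messages and
# the interpretation rate relate. Field names are hypothetical.

CONFIDENCE_THRESHOLD = 0.5  # messages at or above this count as interpreted


def interpretation_stats(messages):
    """messages: list of dicts, each with a 'confidence' score for a free text input."""
    interpreted = sum(1 for m in messages if m["confidence"] >= CONFIDENCE_THRESHOLD)
    fallback = len(messages) - interpreted  # answered with a fallback message
    rate = interpreted / len(messages) * 100 if messages else 0.0
    return interpreted, fallback, rate


msgs = [{"confidence": 0.9}, {"confidence": 0.3},
        {"confidence": 0.7}, {"confidence": 0.5}]
print(interpretation_stats(msgs))  # (3, 1, 75.0)
```

A rate of 75% here means three out of four inputs were confidently understood; the fourth received a fallback reply.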
This figure answers the question: how many interactions did the average user have with the AI assistant?
The duration of a conversation is measured as the time span between the user's first and last interaction with the assistant. A conversation counts as finished when the user has had no interaction with the bot within the last hour.
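The splitting of a user's interactions into conversations with the one-hour inactivity rule could be sketched like this. The function name and data shape are illustrative assumptions, not Mercury.ai's actual implementation:

```python
from datetime import datetime, timedelta

INACTIVITY_CUTOFF = timedelta(hours=1)  # gap after which a conversation counts as finished


def split_conversations(timestamps):
    """Group a user's sorted interaction timestamps into conversations.

    A new conversation starts whenever the gap since the previous
    interaction exceeds one hour; each conversation's duration is the
    span between its first and last interaction.
    """
    conversations = []
    current = [timestamps[0]]
    for t in timestamps[1:]:
        if t - current[-1] > INACTIVITY_CUTOFF:
            conversations.append(current)
            current = [t]
        else:
            current.append(t)
    conversations.append(current)
    return [(c[0], c[-1], c[-1] - c[0]) for c in conversations]


ts = [datetime(2023, 1, 1, 9, 0), datetime(2023, 1, 1, 9, 10),
      datetime(2023, 1, 1, 12, 0)]  # the >1h gap starts a second conversation
for start, end, duration in split_conversations(ts):
    print(start.time(), end.time(), duration)
```

Here the first conversation lasts ten minutes, and the interaction at 12:00 opens a new conversation because more than an hour has passed.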
The total user count is the sum of the three user groups: active new users, returning users, and bouncers. Each group is explained below.
Active new users are users who have interacted with the bot more than once. If the Opt-in message is activated, a user who accepted the Opt-in counts as an active user. For bots with deactivated Opt-in, users who sent more than one message (the start intent also counts as a message, see above) are active users.
A user counts as returning when they have traceably interacted with the bot before and have come back to the conversation. Note: returning users can view their chat history.
This group consists of users who had only one interaction with the bot.
Bots with activated Opt-in: users who ignore the Opt-in message are counted as bouncers.
Bots with deactivated Opt-in: users who had only one interaction with the bot (e.g. the startIntent message).
The share of bouncers relative to the total user count.
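For a bot with deactivated Opt-in, the split into active users and bouncers and the resulting bounce rate can be sketched as below (the Opt-in case needs Opt-in events and is omitted here; all names are illustrative):

```python
# Illustrative sketch for a bot with deactivated Opt-in:
# users with exactly one interaction are bouncers, the rest are active.

def bounce_stats(interaction_counts):
    """interaction_counts: {user_id: number of interactions}.

    Returns (active users, bouncers, bounce rate in percent).
    """
    bouncers = sum(1 for n in interaction_counts.values() if n == 1)
    active = len(interaction_counts) - bouncers
    total = len(interaction_counts)
    bounce_rate = bouncers / total * 100 if total else 0.0
    return active, bouncers, bounce_rate


counts = {"u1": 5, "u2": 1, "u3": 3, "u4": 1}
print(bounce_stats(counts))  # (2, 2, 50.0)
```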
What do the numbers under the bold numbers mean?
The upper of the two figures displays the statistics for the last day of the selected period, so you see a live status for that very day, including the current day if you've selected today as the last day of the period. The lower figure shows the comparison between the last day of the period and the day before it.
Note: if the last day of your period is today, keep in mind that the current day only includes statistics up to the moment you view them, whereas the values for yesterday cover the whole day. So if you compare at 8 o'clock in the morning, today's numbers may not look as good as yesterday's count.
Is anybody not counted as a user?
Users who declined the Opt-in message are not counted as users, but their interactions (the Opt-in message trigger and the reply to it) are counted in the message section.
Retention gives you the chance to look at the statistics for recurring users. It is an analysis specifically for the comparison between one-time active users and returning users, showing the respective percentage in each field.
The rows correspond to the data points of your selected period: each day is a single item if you've selected Day as the resolution, and each week is listed separately if you've chosen Weeks.
Horizontally you see how often users returned: 0 means a user approached the AI assistant only once, and every following number indicates how often a single user returned. Each cell shows the percentage of users that returned as often as the number of the respective column indicates.
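One row of such a retention table can be computed as in the sketch below: for the users first seen in a given period, count how many returned 0, 1, 2, ... times and convert the counts to percentages. The function name and data shape are illustrative assumptions:

```python
from collections import Counter


def retention_row(return_counts):
    """return_counts: for each user first seen in this period, how often
    they came back (0 = never returned).

    Returns {number of returns: percentage of this period's users}.
    """
    total = len(return_counts)
    dist = Counter(return_counts)
    return {k: round(dist[k] / total * 100, 1) for k in sorted(dist)}


# Six users first seen in one week: three never returned, two returned
# once, one returned twice.
print(retention_row([0, 0, 0, 1, 1, 2]))  # {0: 50.0, 1: 33.3, 2: 16.7}
```

Reading the result: 50% of that week's new users never came back, 33.3% returned once, and 16.7% returned twice.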
Here you can break down the activity of your AI assistant by the channel in which the interaction took place. If it is connected to several channel types (e.g. Facebook, Web, and Alexa), you see each channel's share as a percentage of the overall interactions.
Description coming soon...
Next, we move on to the analysis at intent level. Here you can dive deep into analyzing your bot from its most basic element by looking at the figures of each intent individually. You can see how many times a certain intent has been triggered, on exactly which days it was triggered often, and when it was only seldom activated by users. The second part of the intents section shows you which intents are often the last interaction point between your users and the AI assistant.
Let's start with the intents chart. When you first navigate to the analytics page, the diagram is empty. You need to select the intents you are interested in via the add/remove intents to display option, where you can choose from a list of intents. Note that the list does not show all of your bot's intents; instead, it lists all intents that were triggered at least once during your selected period. After you've checked the boxes of the respective intents, simply click apply and your selection is confirmed and saved.
Now the diagram is generated. In it you can view the development of each intent separately over the selected period. Each data point represents the count for a single day or week, respectively. When you hover over a certain point of a line, you get information about the performance of that intent on this day/week.
If you want to know which intent corresponds to which colour, have a look at the legend. Clicking the colour or name of an intent in the legend removes that intent from the graph; clicking it again makes it visible again.
Beneath the graph you find each intent's total count for the period, i.e. the sum of the individual numbers in the graph, so you can see how often an intent was triggered during the selected period. Note that the total count keeps showing the complete list of selected intents even when you hide an intent in the graph; only when you completely remove the intent via the add/remove intents to display option is it also removed from the total count.
If you want to view different categories of intents in different graphs, you can do so via the Add intents chart option. This creates another intent chart that works the same way as described above.
Last intents before inactivity
This displays the very last intents before users stopped interacting with the AI assistant. First we need to define when a conversation becomes inactive: whenever a user does not respond to the bot within one hour, the conversation counts as inactive. What you see here is the number of conversations to which this definition applied.
The end point of a conversation is tracked as the last user intent triggered before inactivity occurred, so you can see which intent was the last one in each conversation.
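Counting last intents before inactivity could look roughly like this sketch, which applies the one-hour rule from above to each conversation's last user message. The function name and data shape are hypothetical, not Mercury.ai's implementation:

```python
from collections import Counter
from datetime import datetime, timedelta

INACTIVITY_CUTOFF = timedelta(hours=1)


def last_intents_before_inactivity(conversations, now):
    """conversations: list of (timestamp of last user message, last intent name).

    A conversation counts as inactive when the user has not responded
    within one hour; for each inactive conversation we record the
    intent that was triggered last.
    """
    counter = Counter()
    for last_ts, last_intent in conversations:
        if now - last_ts > INACTIVITY_CUTOFF:
            counter[last_intent] += 1
    return counter


now = datetime(2023, 1, 1, 12, 0)
convs = [(datetime(2023, 1, 1, 10, 0), "pricing"),
         (datetime(2023, 1, 1, 11, 45), "greeting"),  # still active, not counted
         (datetime(2023, 1, 1, 9, 0), "pricing")]
print(last_intents_before_inactivity(convs, now))  # Counter({'pricing': 2})
```

In this toy data, "pricing" was twice the last intent before users went silent, which might prompt a closer look at how that intent's reply is designed.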
The last intent display is crucial information for optimizing your AI assistant, because this statistic gives you the tools to find out whether a certain intent is not optimally designed. In other words, it is another way to learn how to create the best user experience and to trace and counteract performance issues of a bot.
The last section of analytics shows you the performance of your content items for each content object separately. The display of your content is therefore split in two dimensions: you see a list of your content objects, each of which is in turn split into a list of its individual items.
Let me explain with an example. Assume you run an online clothing shop with different categories such as shoes, jeans, and jackets. Each category is organized as a separate content object in your database, and each will appear as a separate content object in Analytics.
In your bot, you will then have three Content Object Search games with which users can search the database for each content object type. When users find and look at a single item in your database, it appears in analytics. You thus learn that in the category "Jackets", users looked at the item "Green Label Jacket" a certain number of times during the selected period.
Here you see exactly which kinds of products are getting the most attention and which products are the biggest hit in your shop. You get an important indicator, at a very basic level, of which products do well and which products are perhaps often looked at but rarely bought (based on the selling data in your shop). This granular look at the performance of each item lets you optimize your products in a very detailed way.
This is the most important feature for evaluating your bot: not just at a macroscopic level, but down to the elemental perspective of which parts of your bot are running well and where it is necessary to keep optimizing.