What is this feature and why should you use it?
The Talkative GenAI Chatbot is a conversational agent designed to answer questions using a language model (LLM) combined with your own dataset. This chatbot uses the power of artificial intelligence to provide accurate, relevant, and context-aware responses to user queries.
Who can use this feature?
| Licence | Available | Role | Available |
| --- | --- | --- | --- |
| Teams Licence | ✓ | Agent | ╳ |
| Business Licence | ✓ | Supervisor | ╳ |
| Enterprise Licence | ✓ | Account Holder | ✓ |
Table of Contents
- How the AI Knowledgebase Works
- How to create a knowledgebase [10 minutes]
- How to add a knowledgebase to your chatbot [10 minutes]
- Frequently Asked Questions (FAQs)
- Additional Data information
What will you learn from this article?
- How it works
- Creating knowledgebases
- Using the knowledgebase in the chatbot
How the AI Knowledgebase Works
Talkative's chatbot supports integration with an AI knowledgebase. The key concept of the generative AI chatbot is using a Large Language Model (e.g. OpenAI) to respond to a customer's question, based on a pre-approved dataset (e.g. your website).
The interaction flow works in the following order:
1: Start an interaction
- A customer starts a chat interaction with your chatbot, and sends in a message.
2: The AI looks for data
- The chatbot flow uses an AI Knowledgebase lookup to answer that particular question.
- If relevant data is found, the message will be sent to the LLM.
3: The AI attempts to respond
- The LLM is prompted to respond as if it were a customer service agent, answering only based on the dataset you've provided.
4: If the AI can't answer...
- If the LLM cannot answer the question, it will return a system message to your chatbot, from where you can make the next action in the chatbot flow (i.e. route to an agent).
5: If the AI successfully answers...
- If the LLM can answer the question, it will send the response back to the Talkative system, where it is inserted into the chatbot response.
No data is stored with the LLM, and no identifying metadata is sent to it: all pre-chat form data, company name, queue name, agent name, etc. are stripped out before the request is sent over. The LLM is only aware of the dataset provided, the prompt, and the conversation thread, including the message it needs to respond to.
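The five-step flow above can be sketched in a few lines of Python. This is a hedged illustration only: the function names (`strip_metadata`, `knowledgebase_lookup`, `respond`) and the `DATA_NOT_MATCHED` system message are invented for this sketch and are not part of the Talkative product.

```python
# Hypothetical sketch of the interaction flow described above.
# All names here are illustrative, not Talkative's actual code.

def strip_metadata(interaction: dict) -> list:
    """Keep only the conversation thread; drop pre-chat form data,
    company/queue/agent names and other identifying metadata."""
    return interaction["messages"]

def knowledgebase_lookup(question: str, kb: dict):
    """Return relevant knowledgebase entries, or None if nothing matches."""
    hits = [text for key, text in kb.items() if key in question.lower()]
    return hits or None

def respond(interaction: dict, kb: dict) -> str:
    question = interaction["messages"][-1]
    context = knowledgebase_lookup(question, kb)
    if context is None:
        # Step 4: system message back to the chatbot, which can then
        # take the next action in the flow (e.g. route to an agent).
        return "DATA_NOT_MATCHED"
    thread = strip_metadata(interaction)
    # Step 3/5: in the real product, the thread plus the matched KB
    # context would be sent to the LLM, and its answer inserted into
    # the chatbot response.
    return f"Answer based on: {context[0]}"
```

Note that the metadata stripping happens before anything leaves the Talkative system, which is why the LLM only ever sees the dataset, the prompt, and the thread.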
How to create a knowledgebase [10 minutes]
This section will cover how to build a knowledgebase for your chatbot.
Step 1: Log into Talkative
Step 2: Access the Knowledgebase settings
- Click the 'Settings' tab on the left-hand side
- Then click 'AI Knowledge Bases'
Step 3: Select what knowledgebase type you want to create
- At the top of the page, click the 'Create AI Knowledge Base' button
- Knowledgebases can contain:
- Website URLs (current limit is 1,000 URLs) - For a webpage-based knowledgebase, you will submit a list of public URLs that contain your training data. The knowledgebase will automatically update itself at specified intervals with any changes made to the selected URLs
- Files (PDF, JSON, TXT or CSV)
- Free text
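If you are uploading files, the FAQ section below notes that FAQ-style data tends to give the best results. As a hypothetical illustration (the questions and answers here are invented), a JSON file you upload could look like this:

```json
[
  {
    "question": "What are your delivery times?",
    "answer": "Standard delivery takes 3-5 working days."
  },
  {
    "question": "How do I return an item?",
    "answer": "You can return any item within 30 days via our returns portal."
  }
]
```

The same question-and-answer structure works equally well in TXT or CSV form; the important thing is that each answer sits close to the question it resolves.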
Step 4: Setting up a knowledgebase
- Set a label for the knowledgebase so that you can refer back to it later, should you need to make any updates or changes
- Then decide if you wish to scan your entire website to generate the training data, or specify certain URLs
- To scan your entire website, click the 'Scan website for URLs' button. Then enter your URL on the next page
- To scan only specific URLs, paste your URLs into the 'Knowledge Base URLs' textbox. You can add more URLs by clicking the button.
- Then specify how often website content should be re-downloaded to update the training data.
Please note that refreshing your knowledgebase more frequently may incur a higher cost.
- You can also add files: click or drag your TXT, PDF, JSON or CSV file into the upload box. File-based knowledgebases are limited to 20MB per file.
- You can also add any free text as applicable.
- Then click 'Create Knowledge Base'
- Once the knowledgebase has been created successfully, you will receive an email confirming the upload has completed.
Step 5: Advanced Knowledgebase settings (optional)
- Once you have set up your knowledgebase, you can configure advanced settings such as custom prompts.
- To access these settings, navigate to the list of AI knowledgebases in the settings
- Click the knowledgebase you want to edit
- At the bottom of the page, an 'Advanced options' tab will appear. Click this to display the advanced settings
- Here you will find a selection of different settings; we suggest leaving these unchanged where possible.
- Please read this guide to manage custom prompts: https://support.gettalkative.com/support/solutions/articles/201000066331-genai-chatbot-responses-custom-prompts
If you would like to change these settings, please reach out to the Talkative team who will be able to advise further
How to add a knowledgebase to your chatbot [10 minutes]
This section will cover how to add a knowledgebase lookup to your chatbot.
Step 1: Log into Talkative
Step 2: Decide the approach
- Decide where in the chatbot flow you want to use the AI lookup. There are a few approaches:
  - a. Having a lookup in a specific flow.
  - b. Using GenAI lookups in the fallback.
  - c. Using a GenAI response as your initial response to the customer.
We'd advise thorough testing of your chatbot if you are using option (c) in particular.
Step 3a: A lookup in a specific flow
- In this case, we select the exit approach of the node to be "Look up previous message in AI knowledge base"
- So your flow will look like this:
- Let's preview this to see how it works. First, the chatbot asks the user to input their question:
- Once the question has been received, the chatbot will do an AI lookup, querying the question against the KB specified.
- In this case you can see the data was matched, and we include the response here:
Step 3b: Using GenAI lookups in the fallback
- In this case we're using the AI lookup as the fallback node.
- Here we're initially asking the customer to go down our path of suggestion chips, where we've got scripted responses. But if the customer veers off our decision tree, they will hit the fallback node.
- In this initial conversation, we go down the "expected" path we want the user to take.
- But in this second conversation, our question is not covered by our pre-written responses.
- The fallback node is triggered because of this. And the fallback node triggers the AI Knowledgebase lookup.
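The fallback routing in Step 3b can be sketched as follows. This is a hedged illustration of the decision logic, not Talkative's implementation; the function names and the scripted responses are invented for the example.

```python
# Hypothetical sketch of the fallback approach described above:
# try the scripted decision tree first, and only fall back to the
# AI knowledgebase lookup when the message doesn't match a flow.

SCRIPTED_RESPONSES = {
    "opening hours": "We're open 9am-5pm, Monday to Friday.",
    "returns": "You can return any item within 30 days.",
}

def match_scripted_flow(message: str):
    """Return a scripted response if the message matches a suggestion chip."""
    for keyword, response in SCRIPTED_RESPONSES.items():
        if keyword in message.lower():
            return response
    return None

def handle_message(message: str, ai_lookup) -> str:
    scripted = match_scripted_flow(message)
    if scripted is not None:
        return scripted           # the "expected" decision-tree path
    return ai_lookup(message)     # fallback node -> AI knowledgebase lookup
```

In the first conversation from the article, the customer's message matches a chip and gets the scripted answer; in the second, nothing matches, so the fallback node hands the message to the AI lookup instead.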
Step 3c: Using GenAI response as your initial response to the customer
- Here the initial message is set to go to the AI Knowledge lookup:
Frequently Asked Questions (FAQs)
- Question 1: Will this answer questions that are not relevant to my business?
- Answer: No, the AI will only respond if the answer to the question is within your dataset/knowledgebase.
- Question 2: My knowledgebase hasn't uploaded.
- Answer: If your knowledgebase is within the data size limit and you still have issues, please contact Talkative.
- Question 3: Does it matter how my data is structured?
- Answer: LLMs work well with unstructured data, but FAQ formats tend to work best. The higher the quality of your data, the better your results will be.
- Question 4: How many URLs can I use?
- Answer: Website-based knowledgebases are currently limited to 1,000 URLs.
- Question 5: What benefits can I expect to see from this?
- Answer: This depends on the sort of customer questions you receive. The main benefit is that the AI is better at responding to questions than a traditional chatbot, allowing it to answer a higher percentage of questions without human assistance.
- Question 6: How well does it work on complex websites (e.g. pages with dropdowns)?
- Answer: The KB import tool works well on most websites; however, there will inevitably be a small percentage of websites it doesn't work well on. Please contact Talkative if you require support with importing.
- Question 7: What's the difference between fallback and data not matched nodes?
- Answer: Fallbacks are your global Talkative chatbot node for when a user's message input is not matched to a flow. The "item/data not matched" nodes are for when the AI lookup tool was used, but the answer was not found in the dataset/KB.
Additional Data information
Only the message transcript is sent to the Large Language Model, along with the knowledgebase contents. Pre-chat form data and other interaction metadata are not used.
You are able to select which LLM you wish to use. The data processing depends on the model you are using.
OpenAI: OpenAI does not train its models on your data. Data is stored for 30 days. Please contact us if you require a different data processing duration. Please familiarise yourself with OpenAI's privacy policies: https://openai.com/policies/api-data-usage-policies
AWS Bedrock (Llama, Anthropic models): AWS does not share your data with the model providers, and Bedrock doesn't store or log your prompts and completions. https://docs.aws.amazon.com/bedrock/latest/userguide/data-protection.html