Applications of GenAI
Within the Talkative platform, GenAI is used in two primary areas:
Knowledgebase-Driven Use Cases:
Examples include chatbot responses and copilot functionality. Here, you can choose your AI model.

General Use Cases:
Examples include interaction summaries. These do not currently offer model choice; region-specific open-weights models (AWS-hosted LLAMA) are used instead.
Below is a summary of the GenAI features and whether you can choose your model:
| GenAI Feature | Choose Your Model? |
| --- | --- |
| GenAI Chatbot | Yes |
| Copilot – Autocomplete | Yes |
| Copilot – Suggested Responses | Yes |
| Copilot – Navi | Yes |
| Agent Message Rephrase | No |
| Summaries and Insights | No |
| AI Phrase Matching | No |
| AI Agent Training | No |
GenAI Architecture
1. Knowledgebase-Driven Use Cases
Setup
Upload Knowledgebase:
Upload your knowledgebase source material to Talkative.

Storage:
The source material is stored as a text file within Talkative's regional AWS S3 bucket.

Large Datasets:
For larger datasets, AWS Cohere is used to generate a vector store of your knowledgebase data.

Model Selection:
Choose your preferred LLM for runtime.

Configuration:
Enable the feature by configuring the appropriate settings (e.g., Chatbot or Copilot configuration).
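The large-dataset step above (chunking the knowledgebase and indexing it as a vector store) can be sketched roughly as follows. This is an illustrative Python sketch, not Talkative's code: the `embed()` function is a toy bag-of-words stand-in for the AWS Cohere embedding model, and the chunk size is an assumption.

```python
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding" -- a stand-in for AWS Cohere,
    # which would return a dense vector in a real system.
    return Counter(text.lower().split())

def chunk(text: str, size: int = 40) -> list[str]:
    # Split the knowledgebase text into fixed-size word chunks
    # (chunk size is an illustrative assumption).
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def build_vector_store(kb_text: str) -> list[tuple[str, Counter]]:
    # Index each chunk by its embedding, ready for similarity search.
    return [(c, embed(c)) for c in chunk(kb_text)]

kb = "Refunds are issued within 14 days of purchase. " * 30
store = build_vector_store(kb)
print(len(store))  # number of indexed chunks
```

In production the embeddings would be persisted alongside the S3-stored source text so retrieval does not re-embed the knowledgebase on every request.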
Runtime
Prompt Generation:
The customer's message, conversation transcript, and relevant knowledgebase data are combined into a prompt sent to the selected LLM.

Embedding Model:
AWS Cohere is used as the embedding model.

Response Integration:
The response from the LLM is integrated into the system's reply (whether for direct customer interaction or within a copilot message).
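The runtime flow above amounts to retrieval plus prompt assembly. The sketch below is hypothetical: a toy cosine similarity over word counts stands in for Cohere embedding comparisons, and the prompt template is an illustrative assumption, not Talkative's actual prompt.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy stand-in for the AWS Cohere embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, store: list[tuple[str, Counter]], k: int = 2) -> list[str]:
    # Return the k knowledgebase chunks most similar to the query.
    q = embed(query)
    ranked = sorted(store, key=lambda item: cosine(q, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

def build_prompt(message: str, transcript: str, kb_chunks: list[str]) -> str:
    # Hypothetical template combining KB context, transcript, and the new message.
    context = "\n".join(kb_chunks)
    return (
        f"Knowledgebase context:\n{context}\n\n"
        f"Conversation so far:\n{transcript}\n\n"
        f"Customer: {message}\nAgent:"
    )

store = [(t, embed(t)) for t in [
    "Refunds are issued within 14 days of purchase.",
    "Live chat is available Monday to Friday.",
]]
prompt = build_prompt(
    "How long do refunds take?",
    "Customer: Hi\nAgent: Hello, how can I help?",
    retrieve("How long do refunds take?", store, k=1),
)
```

The resulting `prompt` string is what gets sent to the selected LLM; its completion is then merged into the chatbot reply or copilot suggestion.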
2. General Use Cases
Setup
Feature Enablement:
Activate the feature within Talkative's settings.

Model Limitation:
Currently, there is no option to choose your LLM for these features. All general use cases are powered by AWS LLAMA.
Runtime
Processing:
Depending on the feature, the text is sent to one of the LLMs (AWS LLAMA, AWS Anthropic, OpenAI, or Google).

Response Integration:
The LLM's response is used to generate interaction summaries, rephrase messages, or perform other functions.
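Since general use cases need no knowledgebase retrieval, the runtime step reduces to building a feature-specific prompt around the interaction text. The sketch below is illustrative only: the feature names and instruction strings are hypothetical, not Talkative's actual prompts.

```python
# Hypothetical per-feature instructions (illustrative, not Talkative's prompts).
TASK_INSTRUCTIONS = {
    "summary": "Summarise the following customer interaction in two sentences.",
    "rephrase": "Rephrase the agent's draft message to be clear and polite.",
}

def build_task_prompt(feature: str, text: str) -> str:
    # Look up the instruction for the requested feature and prepend it
    # to the interaction text; the result would be sent to the LLM.
    if feature not in TASK_INSTRUCTIONS:
        raise ValueError(f"Unknown GenAI feature: {feature}")
    return f"{TASK_INSTRUCTIONS[feature]}\n\n{text}"

p = build_task_prompt(
    "summary",
    "Customer asked about delivery times; agent confirmed 3-5 working days.",
)
```

The LLM's completion of such a prompt is then surfaced directly as the summary or rephrased message, with no model-selection step.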
Data Processing and Privacy
AWS Bedrock (LLAMA, Anthropic Models):
AWS does not share your data with model providers. Additionally, Bedrock does not store or log your prompts or completions.
Learn more: AWS Bedrock Data Protection

OpenAI:
OpenAI does not use your data to train its models. Data is stored for 30 days. Please contact us if you require a different data retention period.
Learn more: OpenAI API Data Usage Policies