Posted On: Mar 1, 2024
We are excited to announce the general availability of the QnAIntent in Amazon Lex, enabling developers to securely connect foundation models (FMs) to company data for Retrieval Augmented Generation (RAG). Introduced in preview at re:Invent in November 2023, the QnAIntent leverages enterprise data and foundation models on Amazon Bedrock to generate relevant, accurate, and contextual responses. The QnAIntent can be used with new or existing Lex bots to automate frequently asked questions (FAQs) over text and voice channels, such as Amazon Connect.
The QnAIntent helps bot developers automate responses to customer questions and avoid unnecessary transfers to human representatives. Developers no longer need to create variations of intents, sample utterances, slots, and prompts to predict and handle a wide range of FAQs. By simply connecting the new QnAIntent to company knowledge sources, a bot can immediately handle questions using the allowed content, such as "What documents do I need to submit for an accident claim?" The QnAIntent currently supports Knowledge Bases for Amazon Bedrock, Amazon OpenSearch, and Amazon Kendra. Developers can also choose between a generative response summary and an exact response match, giving them control over the bot's responses. The QnAIntent is now generally available in English in the US East (N. Virginia) and US West (Oregon) Regions. To learn more, visit the Amazon Lex documentation page.
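As a rough illustration of connecting the QnAIntent to a knowledge source programmatically, the sketch below assembles a CreateIntent request for the Amazon Lex V2 models API (the `lexv2-models` boto3 client) that attaches the built-in `AMAZON.QnAIntent` to a bot locale and points it at a Knowledge Base for Amazon Bedrock. The bot ID, locale, and ARNs are placeholders, and the exact `qnAIntentConfiguration` field names are assumptions that should be verified against the current boto3 and Lex V2 API documentation.

```python
def build_qna_intent_request(bot_id, bot_version, locale_id, kb_arn, model_arn):
    """Assemble a CreateIntent request that adds the built-in QnAIntent
    to a Lex V2 bot locale, backed by a Knowledge Base for Amazon Bedrock.

    Field names under qnAIntentConfiguration are based on the Lex V2
    CreateIntent API and should be checked against current documentation.
    """
    return {
        "intentName": "FAQIntent",
        # Built-in intent signature for the QnAIntent
        "parentIntentSignature": "AMAZON.QnAIntent",
        "botId": bot_id,
        "botVersion": bot_version,
        "localeId": locale_id,
        "qnAIntentConfiguration": {
            "dataSourceConfiguration": {
                # Alternatively, a kendraConfiguration or
                # opensearchConfiguration block can be supplied here.
                "bedrockKnowledgeStoreConfiguration": {
                    "bedrockKnowledgeBaseArn": kb_arn,
                },
            },
            # Foundation model used to generate the response summary
            "bedrockModelConfiguration": {"modelArn": model_arn},
        },
    }


if __name__ == "__main__":
    import boto3  # requires valid AWS credentials to actually call the API

    request = build_qna_intent_request(
        bot_id="BOT_ID",            # placeholder
        bot_version="DRAFT",
        locale_id="en_US",
        kb_arn="arn:aws:bedrock:us-east-1:123456789012:knowledge-base/KB_ID",
        model_arn="arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2",
    )
    client = boto3.client("lexv2-models", region_name="us-east-1")
    client.create_intent(**request)
```

After the intent is created, building the bot locale makes the QnAIntent available to answer questions from the connected knowledge source; no sample utterances or slots need to be defined for it.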