Girish Bhatia

Generative AI Serverless - RAG using Bedrock Knowledge base, & Single document chat via AWS Console!

Has Generative AI captured your imagination the way it has captured mine?

Generative AI is indeed fascinating! The advancements in foundation models have opened up incredible possibilities. Who would have imagined that technology would evolve to the point where you can generate content summaries from transcripts, have chatbots that can answer questions on any subject without requiring any coding on your part, or even create custom images based solely on your imagination by simply providing a prompt to a Generative AI service and foundation model? It's truly remarkable to witness the power and potential of Generative AI unfold.

'Chat with your document' is the latest Generative AI feature Amazon has added to its already feature-rich set of GenAI, Knowledge Base, and RAG capabilities.

RAG, which stands for Retrieval Augmented Generation, is becoming increasingly popular in the world of Generative AI. It allows organizations to overcome the limitations of LLMs by grounding responses in their own contextual data.

Amazon Bedrock is a fully managed service that offers a choice of many foundation models, such as Anthropic Claude, AI21 Jurassic-2, Stability AI, Amazon Titan, and others.

I will use the recently released Anthropic Claude 3 Sonnet foundation model and invoke it via the Bedrock Knowledge Base in the AWS Console. As of May 2024, this is the only model AWS supports for the single-document knowledge base, or 'Chat with your document', function.

There are many use cases where the generative AI 'chat with your document' function can help increase productivity. A few examples: a technical support team extracting information from a user manual to quickly resolve customer questions, HR answering questions based on policy documents, a developer looking up a specific function in technical documentation, or a call center team quickly addressing customer inquiries by chatting with product documentation.

Let's look at our use cases:

• MyBankGB, a fictitious bank, offers various credit cards to consumers. The document "MyBankGB Credit Card Offerings.pdf" contains detailed information about the features of the credit cards offered by the bank.

• MyBankGB is interested in implementing a Generative AI solution using the "Chat with your document" function of Amazon Bedrock Knowledge Base. This solution will enable the call center team to quickly access information about the card features and efficiently address customer inquiries.

• Since this is a proof of concept, an API-based solution is not required. Instead, selected call center team members will be given access to the Bedrock Knowledge Base 'chat with your document' function via the AWS Console.

Here is the architecture diagram for our use case.

[Architecture diagram]

Let's see the steps to create a single-document knowledge base in Bedrock and start consuming it using the AWS Console.

Review AWS Bedrock 'Chat with your document'

Chat with your document is a new feature. You can use it via the AWS Console, or you can use the SDK to invoke it via Bedrock, Lambda, and an API.

[Screenshot: 'Chat with your document' in the Bedrock console]
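For reference, if the team later moves beyond the console, the same single-document chat is exposed through the RetrieveAndGenerate API with an EXTERNAL_SOURCES configuration. Here is a minimal boto3 sketch, assuming credentials with Bedrock access; the region, bucket name, and object key are hypothetical placeholders.

```python
import boto3

# Minimal sketch of single-document chat via the SDK
# (RetrieveAndGenerate with an EXTERNAL_SOURCES configuration).
# Region, bucket, and object key below are hypothetical placeholders.
client = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = client.retrieve_and_generate(
    input={"text": "Which MyBankGB credit card offers travel rewards?"},
    retrieveAndGenerateConfiguration={
        "type": "EXTERNAL_SOURCES",
        "externalSourcesConfiguration": {
            "modelArn": (
                "arn:aws:bedrock:us-east-1::foundation-model/"
                "anthropic.claude-3-sonnet-20240229-v1:0"
            ),
            "sources": [
                {
                    "sourceType": "S3",
                    "s3Location": {"uri": "s3://mybankgb-docs/credit-card-offerings.pdf"},
                }
            ],
        },
    },
)

# The generated answer, grounded in the referenced document.
print(response["output"]["text"])
```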

For Data, you can upload a file from your computer, or you can provide the ARN of a file stored in an S3 bucket.

[Screenshot: data source selection, upload a file or point to S3]
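The SDK mirrors both console options: the S3 source shown in the earlier sketch, and a byte-content source for files uploaded from your machine. A small sketch of the latter, assuming a hypothetical local PDF named credit-card-offerings.pdf:

```python
# Sketch of the "upload a file from your computer" path via the SDK:
# pass the document bytes directly instead of an S3 location.
# "credit-card-offerings.pdf" is a hypothetical local file name.
with open("credit-card-offerings.pdf", "rb") as f:
    doc_bytes = f.read()

byte_source = {
    "sourceType": "BYTE_CONTENT",
    "byteContent": {
        "identifier": "credit-card-offerings.pdf",
        "contentType": "application/pdf",
        "data": doc_bytes,
    },
}
# Use byte_source in place of the S3 entry in the "sources" list above.
```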

Select the model. Anthropic Claude 3 Sonnet is the only supported model as of May 2024.

Request Model Access

Before you can use the model, you must request access to it from the Model access page in the Bedrock console.
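If you want to confirm from code which Anthropic models are visible to your account in a region, a quick boto3 check looks roughly like this. Note that listing a model does not grant access; access is still requested from the console.

```python
import boto3

# Sketch: list the Anthropic foundation models visible in this region.
# Listing a model does not grant access; request access in the console first.
bedrock = boto3.client("bedrock", region_name="us-east-1")
result = bedrock.list_foundation_models(byProvider="Anthropic")
for model in result["modelSummaries"]:
    print(model["modelId"])
```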

Chat with your document using AWS Console

Let's chat with the document and review the responses.

[Screenshot: prompt entered in the chat window]

Review the response

[Screenshot: model response]

Let's review more prompts and responses!

[Screenshots: additional prompts and responses]

As you can see above, all answers are grounded in the context of the document uploaded to the S3 bucket. This is where RAG makes generative AI responses more accurate and reliable and helps control hallucinations.
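When the same question is asked through the SDK, the response also includes citations pointing back to the excerpts of the uploaded document that grounded the answer, which makes this behavior easy to verify programmatically. A sketch, reusing the response object from the earlier RetrieveAndGenerate call:

```python
# Sketch: inspect the grounded answer and its citations, reusing the
# "response" object from the earlier retrieve_and_generate call.
print(response["output"]["text"])

for citation in response.get("citations", []):
    for ref in citation.get("retrievedReferences", []):
        # Each reference is an excerpt of the uploaded document
        # that the answer was grounded on.
        print(ref["content"]["text"])
```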

With these steps, a serverless GenAI RAG solution for a single-document knowledge base has been successfully implemented using the Amazon Bedrock Console. Selected call center team members who have access to the AWS Console and the Anthropic Claude 3 Sonnet model can use the 'chat with your document' function and then provide feedback on whether an automated solution using APIs should be developed.

As GenAI solutions keep improving, they will change how we work and bring real benefits to many industries. This workshop shows how powerful AI can be in solving real-world problems and creating new opportunities for innovation.

Thanks for reading!

Click here to get to the YouTube video for this solution.

https://www.youtube.com/watch?v=ErzmSHRY3pY

π’’π’Ύπ“‡π’Ύπ“ˆπ’½ ℬ𝒽𝒢𝓉𝒾𝒢
𝘈𝘞𝘚 𝘊𝘦𝘳𝘡π˜ͺ𝘧π˜ͺ𝘦π˜₯ 𝘚𝘰𝘭𝘢𝘡π˜ͺ𝘰𝘯 𝘈𝘳𝘀𝘩π˜ͺ𝘡𝘦𝘀𝘡 & π˜‹π˜¦π˜·π˜¦π˜­π˜°π˜±π˜¦π˜³ 𝘈𝘴𝘴𝘰𝘀π˜ͺ𝘒𝘡𝘦
𝘊𝘭𝘰𝘢π˜₯ π˜›π˜¦π˜€π˜©π˜―π˜°π˜­π˜°π˜¨π˜Ί 𝘌𝘯𝘡𝘩𝘢𝘴π˜ͺ𝘒𝘴𝘡
