Amazon Bedrock offers a feature called Knowledge Bases, which lets you connect Large Language Models (LLMs) to additional documents, such as PDF, Markdown, HTML, Microsoft Word, and Excel files. This technique is known across the industry as Retrieval-Augmented Generation (RAG). Once you connect a model to your document store in Amazon S3, you can prompt the model to answer questions about, or otherwise consume, the data stored inside those documents. For this feature to work, you must connect Amazon Bedrock to a supported vector database. At the time of recording, Bedrock supports five vector storage engines: Amazon OpenSearch, Amazon Aurora PostgreSQL, Pinecone, Redis Enterprise Cloud, and MongoDB Atlas. In this video, Trevor Sullivan (Solutions Architect, StratusGrid) walks through setting up MongoDB Atlas as a vector storage engine and connecting Amazon Bedrock Knowledge Bases to it.
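Once a knowledge base is set up, querying it from code is typically done through the `retrieve_and_generate` operation of the `bedrock-agent-runtime` client in boto3. The sketch below assembles the request payload for that call; the knowledge base ID, model ARN, and question are placeholders, not values from this video, and the actual AWS call is shown in comments since it requires configured credentials.

```python
def build_rag_request(question: str, kb_id: str, model_arn: str) -> dict:
    """Assemble the parameters for a retrieve_and_generate() call,
    which retrieves relevant document chunks from the knowledge base
    and generates an answer with the specified foundation model."""
    return {
        "input": {"text": question},
        "retrieveAndGenerateConfiguration": {
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": kb_id,    # placeholder knowledge base ID
                "modelArn": model_arn,       # placeholder model ARN
            },
        },
    }

request = build_rag_request(
    "What does the uploaded report say about revenue?",
    kb_id="EXAMPLEKBID",
    model_arn="arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2",
)

# With AWS credentials configured, the call would look like:
#   import boto3
#   client = boto3.client("bedrock-agent-runtime")
#   response = client.retrieve_and_generate(**request)
#   print(response["output"]["text"])
```

The same payload shape works regardless of which of the supported vector stores (including MongoDB Atlas) backs the knowledge base; the vector store is configured on the knowledge base itself, not in each query.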
Learn more at https://stratusgrid.com/