Generative AI for SAP VI. Consume Amazon Bedrock through BTP AI Core

What are we doing today?

Leverage a generalized approach to applying Generative AI 🤗 to business challenges with SAP BTP. In this sample, we will deploy an SAP Cloud Application Programming (CAP) application that interacts with Amazon Bedrock and performs a RAG-based API search over data federated through SAP Datasphere.

What are we trying to solve?

Large Language Models are pre-trained on generic, historical data. Still, they can be helped to call external APIs or to query our own company data, leaving the models to act as the “language translators”: they work out what the user wants and how the information should be given back to the user.

One of the biggest problems with LLMs is that they want as much data as possible but do not consider the roles and authorizations the user holds at the moment of the query. To address this, we will leverage the SAP Identity Provisioning service on BTP to manage the identity lifecycle across the SAP and non-SAP data used for model prompting.
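As a rough sketch of what that looks like in the CAP service layer, the handler below restricts the retrieval scope to what the caller’s role collections allow before anything is sent to the model. The service name, the roles, and the helpers `retrieveContext` and `answerWithBedrock` are assumptions for illustration, not a fixed API.

```typescript
// srv/assistant-service.ts -- hypothetical CAP service; role and scope names are placeholders
import cds from '@sap/cds'

// Hypothetical helpers, sketched further below in this post
declare function retrieveContext(question: string, scopes: string[]): Promise<string>
declare function answerWithBedrock(question: string, context: string): Promise<string>

export default class AssistantService extends cds.ApplicationService {
  async init() {
    this.on('ask', async (req) => {
      // Roles arrive via XSUAA role collections, provisioned with SAP Identity Provisioning
      const scopes: string[] = []
      if (req.user.is('SalesViewer')) scopes.push('SalesOrders')
      if (req.user.is('HRViewer')) scopes.push('EmployeeData')
      if (scopes.length === 0) return req.reject(403, 'No data scopes assigned to this user')

      // The retrieval (RAG) step only runs over the scopes the caller is authorized for
      const context = await retrieveContext(req.data.question, scopes)
      return answerWithBedrock(req.data.question, context)
    })
    return super.init()
  }
}
```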

Components in use

1⃣ SAP AI Core is part of the services portfolio that SAP BTP provides. It handles the execution and operations of the AI assets and integrates with our other SAP solutions. In our approach, we use SAP AI Core to act as a proxy to access the external Amazon Bedrock service and to expose a destination that can be consumed by our application.
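A minimal sketch of how the CAP application could reach that proxy, assuming a remote service named `AICORE_PROXY` is configured with `kind: 'rest'` and a BTP destination pointing at the AI Core deployment; the inference path and payload below are placeholders that depend on your serving scenario, not a fixed AI Core contract.

```typescript
// Hypothetical call through the SAP AI Core proxy, consumed as a BTP destination.
// Assumes cds.requires.AICORE_PROXY = { kind: 'rest', credentials: { destination: '...' } }
import cds from '@sap/cds'

export async function invokeThroughAICore(prompt: string): Promise<string> {
  const aiCore = await cds.connect.to('AICORE_PROXY')

  // Deployment path and payload shape depend on the serving template you deploy;
  // the values below are illustrative placeholders.
  const response = await aiCore.send({
    method: 'POST',
    path: '/v2/inference/deployments/<deployment-id>/invoke',
    data: { prompt, max_tokens: 512 }
  })
  return response.completion
}
```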

2⃣ Amazon Bedrock is a fully managed service from AWS that makes foundation models from Amazon and third-party model providers accessible through an API. It’s cool because we do not need to manage model deployment or scalability; we just choose the model and interact with it through the API. Bedrock provides Agents 🤖 that invoke APIs dynamically to execute tasks. In this case, the agents will call SAP BTP data services such as SAP Datasphere for federated queries or SAP HANA Cloud for graph queries over a data lake. Agents also bring native support for RAG, extending the power of foundation models (FMs) with SAP proprietary data.
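To make the agent part concrete, here is a hedged sketch of invoking an Agent for Amazon Bedrock with the AWS SDK for JavaScript v3. The agent and alias IDs are placeholders, and the agent is assumed to be configured with action groups that call the SAP BTP data services mentioned above.

```typescript
// Hypothetical invocation of an Agent for Amazon Bedrock (AWS SDK for JavaScript v3)
import {
  BedrockAgentRuntimeClient,
  InvokeAgentCommand
} from '@aws-sdk/client-bedrock-agent-runtime'

const client = new BedrockAgentRuntimeClient({ region: 'us-east-1' })

export async function askAgent(question: string, sessionId: string): Promise<string> {
  const response = await client.send(new InvokeAgentCommand({
    agentId: '<agent-id>',            // placeholder
    agentAliasId: '<agent-alias-id>', // placeholder
    sessionId,
    inputText: question
  }))

  // The agent streams its answer back as chunks of bytes
  let answer = ''
  for await (const event of response.completion ?? []) {
    if (event.chunk?.bytes) answer += new TextDecoder().decode(event.chunk.bytes)
  }
  return answer
}
```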

3⃣ CAP Application. SAP recommends using the Cloud Application Programming (CAP) model as the entity layer of the application, and SAPUI5 if a user interface is required, to quickly build and deploy a front-end application.
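From the UI5 side, the front end only needs to call the CAP service’s OData endpoint. A minimal sketch, assuming the unbound `ask` action on the hypothetical AssistantService from the earlier snippet:

```typescript
// Hypothetical front-end call to the CAP service's OData V4 action
async function askAssistant(question: string): Promise<string> {
  const res = await fetch('/odata/v4/assistant/ask', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ question })
  })
  if (!res.ok) throw new Error(`Request failed: ${res.status}`)
  const { value } = await res.json() // OData wraps primitive action results in { value }
  return value
}
```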

4⃣ Pinecone Vector DB, used if the Agents for Amazon Bedrock need to connect to the company’s data sources, which have been transformed into embeddings (numerical representations). A vector database feeds a model with information in a form it understands.
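Here is a hedged sketch of that vector store side using the Pinecone TypeScript client. The index name and the idea of tagging each chunk with a `scope` (so the role filter from earlier can be applied at query time) are assumptions for this example.

```typescript
// Hypothetical Pinecone usage: upsert document embeddings, then query by similarity
import { Pinecone } from '@pinecone-database/pinecone'

const pc = new Pinecone({ apiKey: process.env.PINECONE_API_KEY! })
const index = pc.index('sap-business-docs') // placeholder index name

export async function upsertChunk(id: string, embedding: number[], scope: string, text: string) {
  await index.upsert([{ id, values: embedding, metadata: { scope, text } }])
}

export async function similarDocs(queryEmbedding: number[], scopes: string[]) {
  // Restrict the similarity search to the scopes the calling user is allowed to see
  const result = await index.query({
    vector: queryEmbedding,
    topK: 5,
    includeMetadata: true,
    filter: { scope: { $in: scopes } }
  })
  return result.matches ?? []
}
```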

5⃣ SAP Datasphere simplifies complex business data landscapes by unifying all data systems into a central layer. It offers a variety of ways to collect or connect with data, regardless of where the data resides, and integrates with SAP applications and non-SAP systems to federate or replicate data across your most crucial data solutions, from lakehouses and data warehouses to data lakes and data governance platforms.
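On the Datasphere side, one common pattern is to expose a view for consumption and read it over the OData-based consumption API. The tenant host, space, and asset names below are placeholders, and the URL pattern is an assumption based on the relational consumption endpoint of your own tenant.

```typescript
// Hypothetical read of a view exposed through the Datasphere consumption API
export async function readFederatedSales(token: string): Promise<unknown[]> {
  // Host, space, and asset are placeholders for whatever your Datasphere space exposes
  const url =
    'https://<tenant>.eu10.hcs.cloud.sap/api/v1/dwc/consumption/relational/<space>/<asset>/<Entity>' +
    "?$top=100&$filter=Region eq 'EMEA'"
  const res = await fetch(url, { headers: { Authorization: `Bearer ${token}` } })
  if (!res.ok) throw new Error(`Datasphere request failed: ${res.status}`)
  const { value } = await res.json() // OData wraps result sets in { value: [...] }
  return value
}
```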
