May Developer Challenge – SAP AI Services – Week 3

Welcome to week 3 of the AI Developer Challenge! I am so excited about every single submission! Now I don’t know about you, but in my family we love avocado! Especially my youngest could eat it all the time. Let’s use the Data Attribute Recommendation service to build an avocado price prediction engine! With Data Attribute Recommendation you can train regression as well as classification models, and you can use SAP AI Launchpad, Postman, Swagger, or the AI API Python SDK to implement your use case! You specify a model template or a business blueprint to determine which kind of algorithm is used.

For this challenge we will use the regression model template to implement an avocado price predictor! If you do not like avocado, feel free to come up with another use case and search for a fitting dataset (e.g. here).

This week we start by preparing and uploading our dataset schema and our dataset! For that we will be using the AI API, a standardized way to interact with different AI runtimes and services such as SAP AI Core or Data Attribute Recommendation.

  1. Go to your BTP trial account and create a Data Attribute Recommendation (DAR) service instance (you can use the booster): https://developers.sap.com/tutorials/cp-aibus-dar-service-instance.html
  2. Download this dataset
  3. Download the Postman collection for the Data Attribute Recommendation service and import the folder into Postman
  4. Go to the Environments tab on the left and add the information from your DAR service key from step 1:
    1. authUrl = the url from the uaa section of your service key (.json file)
    2. url = the url at the root of your service key
  5. Make sure the correct environment is selected, then go back to the Collections tab and create a token with the Get XSUAA OAuth Token GET request in the Authorization folder. If everything is set up correctly, it will take the url, username, and password from the environment variables you configured in step 4 and assign the token to the correct environment variable for further use. Repeat this step whenever the token expires. (A minimal Python sketch of this token request is shown after this list.)
  6. Create a dataset_schema.json file describing the feature and label columns of your dataset (see the example schema after this list)
  7. Go back to the Environments tab and assign values for folder (e.g. avocado_price_predictor), datasetSchemaFileRegression (e.g. dataset_schema.json), and datasetFileRegression (e.g. avocado_data.csv)
  8. Upload your dataset_schema.json with the PUT Upload Dataset Schema Regression request (a scripted version of both uploads is sketched after this list)
  9. Upload the dataset from step 2 with the PUT Upload Dataset Regression request
  10. Create a dataset schema artifact with the POST Create Dataset Schema Artifact Regression request (the required values were assigned automatically to your environment; an artifact request is sketched after this list)
  11. Create a dataset artifact with the POST Create Dataset Artifact Regression request (the required values were assigned automatically to your environment)
  12. Learn more about the AI API and its underlying concepts such as artifacts
  13. Your submission is again a screenshot of the last POST request and the correct returned result
  14. Stay tuned for next week to train your model!
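
If you would rather script the calls than click through Postman, here is a minimal Python sketch of the token request from step 5, using the requests library. The placeholder values are assumptions; take the real ones from the uaa section of your service key:

```python
import requests

# Placeholders: copy these values from the uaa section of your DAR service key.
AUTH_URL = "<authUrl from the uaa section>"
CLIENT_ID = "<clientid from the uaa section>"
CLIENT_SECRET = "<clientsecret from the uaa section>"

# The "Get XSUAA OAuth Token" request is a client-credentials grant.
response = requests.post(
    f"{AUTH_URL}/oauth/token",
    data={"grant_type": "client_credentials"},
    auth=(CLIENT_ID, CLIENT_SECRET),
)
response.raise_for_status()
token = response.json()["access_token"]
print(token[:40], "...")  # the bearer token all later requests need
```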
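
For step 6, here is a minimal schema sketch for the regression template. The column names assume the well-known avocado price dataset, and the field layout follows the DAR tutorials, so adjust both to the CSV you actually use:

```python
import json

# Assumed columns from the avocado dataset: categorical and numeric
# features, plus AveragePrice as the numeric target for regression.
dataset_schema = {
    "features": [
        {"label": "region", "type": "CATEGORY"},
        {"label": "type", "type": "CATEGORY"},
        {"label": "year", "type": "NUMBER"},
        {"label": "Total Volume", "type": "NUMBER"},
    ],
    "labels": [
        {"label": "AveragePrice", "type": "NUMBER"},  # the value to predict
    ],
    "name": "avocado-regression-schema",
}

with open("dataset_schema.json", "w") as f:
    json.dump(dataset_schema, f, indent=2)
```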
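
Steps 8 and 9 are both plain PUT uploads. The Python sketch below mirrors that pattern; the upload path is an assumption modeled on the collection’s requests, so copy the exact path from Postman if yours differs:

```python
import requests

SERVICE_URL = "<url from your service key>"   # placeholder
TOKEN = "<token from the token request>"      # placeholder
FOLDER = "avocado_price_predictor"            # the folder name from step 7

# Upload the schema file and the CSV as raw request bodies.
# The path below is an assumption; take the real one from the Postman requests.
for filename, content_type in [
    ("dataset_schema.json", "application/json"),
    ("avocado_data.csv", "text/csv"),
]:
    with open(filename, "rb") as f:
        r = requests.put(
            f"{SERVICE_URL}/v2/lm/dataset/files/{FOLDER}/{filename}",
            headers={
                "Authorization": f"Bearer {TOKEN}",
                "Content-Type": content_type,
            },
            data=f,
        )
    r.raise_for_status()
    print(filename, "->", r.status_code)
```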
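
Steps 10 and 11 register the uploaded files as artifacts through the AI API. The sketch below shows the dataset artifact; the schema artifact follows the same pattern with a different url. The scenarioId and the ai:// url are placeholders, since the Postman collection filled the real values into your environment automatically:

```python
import requests

SERVICE_URL = "<url from your service key>"  # placeholder
TOKEN = "<token from the token request>"     # placeholder

# Generic AI API artifact creation; kind "dataset" points the service
# at the file uploaded earlier. Replace the placeholders with the
# values from your Postman environment.
payload = {
    "name": "avocado-dataset",
    "kind": "dataset",
    "url": "ai://<storage>/avocado_price_predictor/avocado_data.csv",
    "scenarioId": "<scenario id from the Postman environment>",
}

r = requests.post(
    f"{SERVICE_URL}/v2/lm/artifacts",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "AI-Resource-Group": "default",  # assumption: the default resource group
    },
    json=payload,
)
r.raise_for_status()
print(r.json())  # the response you screenshot for your submission
```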

 Week 1 challenge

Week 2 challenge
