Amazon AWS S3 Storage Connectivity – From Data Services

Starting from Version 4.2 SP7

In this day and age of cloud computing, it is not uncommon for companies to move from in-house (on-premises) storage and infrastructure to cloud computing (IaaS, Infrastructure as a Service) in order to store and share files without having to maintain those systems within the company’s own infrastructure.

Cost savings, elasticity of storage, and disaster recovery are just a few reasons to consider cloud infrastructure as a winning strategy.


However, there is often a need to combine data from the cloud with data from on-premises systems for analytics and other business purposes. This scenario should help you understand how SAP Data Services can seamlessly connect to cloud storage systems like Amazon AWS S3 (Simple Storage Service) and integrate cloud data with on-premises data for multiple purposes, including Business Intelligence.

A new feature in SAP Data Services, beginning with version 4.2 SP7, is the ability to create a File Location and configure it to connect to Amazon AWS S3 cloud storage. This connection lets us read and write files in this external cloud storage (known as S3 buckets).
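Conceptually, every object in S3 lives in a bucket and is addressed by a key; many tools express this as an `s3://bucket/key` URI. As a minimal, illustrative sketch (plain Python, not part of Data Services), splitting such a URI looks like this:

```python
from urllib.parse import urlparse

def parse_s3_uri(uri):
    """Split an s3://bucket/key URI into (bucket, key)."""
    parts = urlparse(uri)
    if parts.scheme != "s3":
        raise ValueError("not an s3:// URI: " + uri)
    # netloc holds the bucket name; the path (minus the leading slash) is the object key
    return parts.netloc, parts.path.lstrip("/")

print(parse_s3_uri("s3://my-bucket/reports/daily.csv"))
# -> ('my-bucket', 'reports/daily.csv')
```

The bucket name and object key here are made up for illustration; Data Services itself only asks you for the bucket and file name in its own dialogs.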

Let’s take a deeper look into the configuration of this File Location to connect to AWS S3 Storage and to read and write files from/to that location.

The first step toward setting up the File Location in SAP Data Services is to obtain the following details from your Amazon AWS S3 storage system:

  1. Access Key ID
  2. Secret Access Key

Towards the end of this article, you will find detailed steps for obtaining this information, if you do not already have these keys.

Once you have the necessary S3 storage connectivity details, log on to the SAP Data Services Designer tool.
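Before typing the keys into Designer, a quick format sanity check can catch copy-paste mistakes. This is a rough sketch only (plain Python, using the typical IAM key formats; it does not verify the keys against AWS):

```python
import re

def looks_like_iam_credentials(access_key_id, secret_access_key):
    """Rough format check only -- does NOT verify the keys with AWS.
    IAM access key IDs are typically 20 upper-case alphanumerics starting
    with 'AKIA'; secret keys are typically 40 base64-style characters."""
    id_ok = re.fullmatch(r"AKIA[0-9A-Z]{16}", access_key_id) is not None
    secret_ok = re.fullmatch(r"[0-9A-Za-z/+=]{40}", secret_access_key) is not None
    return id_ok and secret_ok

print(looks_like_iam_credentials(
    "AKIAIOSFODNN7EXAMPLE",                       # AWS documentation sample ID
    "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"))  # sample secret key, not a real one
# -> True
```

A passing check only means the strings are plausibly shaped; the real test is the connection itself in Designer.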

Creation of new File Location

Browse to the Formats tab of the Local Object Library. Right-click File Location and select New, as shown in the screenshot below:

Now, in the “Edit File Location” window, choose the “Amazon S3 Cloud Storage” option from the Protocol dropdown. Then provide the Access Key ID and Secret Access Key that were provided to you by your AWS administrator.

Note: Make sure you choose the same “Region” below as is selected in the AWS S3 bucket properties.

Creating a new File Format

Once the File Location is created, create a new file format, choosing the File Location you just created as its “Location”, as shown below:

Next, select the file format you just created:

Continue to follow the same steps for creating a new file format as you generally would.

Tip: If the file format is elaborate, you can temporarily copy the file to your local drive, create the file format based on that local copy, and then change the “Location” to the AWS S3 file location once it is all done, as shown below:

Now, you can use this file format to either read from or write to the files in the S3 Storage.
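Under the hood, the Access Key ID identifies the caller while the Secret Access Key signs every S3 request using AWS Signature Version 4, whose signing key is derived through a chain of HMAC-SHA256 steps. Data Services handles all of this internally; the sketch below only illustrates the documented derivation, using the sample secret key from this article:

```python
import hashlib
import hmac

def sigv4_signing_key(secret_key, date_stamp, region, service="s3"):
    """Derive the AWS Signature Version 4 signing key: an HMAC-SHA256
    chain over the date, region, service, and the literal 'aws4_request'."""
    def _hmac(key, msg):
        return hmac.new(key, msg.encode("utf-8"), hashlib.sha256).digest()
    k_date = _hmac(("AWS4" + secret_key).encode("utf-8"), date_stamp)
    k_region = _hmac(k_date, region)
    k_service = _hmac(k_region, service)
    return _hmac(k_service, "aws4_request")

key = sigv4_signing_key("wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
                        "20180827", "us-east-1")
print(len(key))  # HMAC-SHA256 output is always 32 bytes
```

Because the region is baked into the signing key, this is another reason the “Region” setting in the File Location must match the bucket.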


Getting your Access Key ID and Secret Key from Amazon AWS account

This section gives you details about getting the AWS S3 access keys. You can create the access keys using the AWS Management Console. It is recommended to use IAM access keys instead of AWS root account access keys. To create these keys, you need permission to perform the necessary IAM actions.

  1. Open the IAM console.
  2. In the navigation pane, choose Users.
  3. Choose your IAM user name (not the checkbox).
  4. Choose the Security Credentials tab and then choose Create Access Key.
  5. To see your access key, choose Show User Security Credentials. Your credentials will look something like this:
    • Access Key ID: AKIAIOSFODNN7EXAMPLE
    • Secret Access Key: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
  6. Choose Download Credentials, and store the keys in a secure location.

Your secret key will no longer be available through the AWS Management Console; you will have the only copy, so please keep it safe.
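Since you hold the only copy of the secret key, avoid hard-coding it in scripts or job artifacts; a common pattern is to read credentials from environment variables instead (the AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY names below are the conventional AWS ones, and the values shown are the documentation samples, not real keys):

```python
import os

def load_aws_credentials():
    """Read credentials from the conventional AWS environment variables,
    failing loudly if either one is missing."""
    try:
        return (os.environ["AWS_ACCESS_KEY_ID"],
                os.environ["AWS_SECRET_ACCESS_KEY"])
    except KeyError as missing:
        raise RuntimeError(f"credential not set in environment: {missing}")

# Illustration only -- in practice these would be set outside the script:
os.environ.setdefault("AWS_ACCESS_KEY_ID", "AKIAIOSFODNN7EXAMPLE")
os.environ.setdefault("AWS_SECRET_ACCESS_KEY",
                      "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY")
print(load_aws_credentials()[0])
# -> AKIAIOSFODNN7EXAMPLE
```

Inside Data Services itself, the keys live in the File Location definition, so the same principle applies: restrict who can view or export that object.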


By creating a File Location for Amazon S3 buckets, ETL developers can seamlessly read data from and write data to the Amazon S3 cloud location.
Whether this is for ingesting third-party data into your on-premises data warehouse, exposing internal feeds to cloud BI tools like SAP Analytics Cloud, or simply running an “off-site” backup task, being able to bridge the worlds of cloud storage and internal processes is a huge bonus for our data orchestration needs.


If this blog didn’t quite answer your question or you have other Business Intelligence questions or concerns, please feel free to contact us.


Published August 27th, 2018 | Data Services, Kingfisher

About the Author:

Shaheen has more than 13 years of experience with a strong background in BI Analytics, Data Warehousing, and ETL Architecture. She enjoys working with clients finding solutions to their BI reporting problems. She has helped many healthcare clients with data cleansing projects, along with designing data marts and establishing BI reporting infrastructure. When she’s not cooking delicious food or enjoying hikes in the mountains, you can reach her at