Motive


While developing applications on SAP Cloud Platform, we often need to store files. In the SCP Neo environment, a Document Service is available for this.

However, in the Cloud Foundry environment, the same Document Service is not available.

Instead, a different service called Object Store is available. This service is specific to the underlying IaaS.

For example, this is S3 (on AWS), Cloud Storage (on GCP), or Blob Storage (on Azure).

In this blog post, we will focus on S3 storage and how to develop a simple Java app to read and store files.

 

Account Setup


Before proceeding further, make sure that the objectstore service is available and that quota is assigned to your subaccount.

An instance of the Object Store service must be created. This can be done in one of the following ways:

Option 1: Create an instance of this service directly from the cockpit. Simply select the default options.

Option 2: Add this service / resource as a dependency in the MTA deployment descriptor (mta.yaml) file. In this case the service instance will be created automatically when the application is deployed.

Recommendation: Option 2 is recommended.
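
For reference, the instance from Option 1 can also be created from the command line with the Cloud Foundry CLI. This is a minimal sketch, assuming the service name objectstore, the plan s3-standard and the instance name demo_s3 that are used later in this post:

cf create-service objectstore s3-standard demo_s3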

 

Understanding S3 service in AWS


This blog post will not explain the AWS S3 service itself. Please read up on AWS S3 as a prerequisite.

It is beneficial to understand the concept of a bucket and how buckets are used.

For the lazy developers who don't like reading long documentation:

  • An AWS bucket is like a big folder where you can store your files.

  • A bucket (folder) will be provided for your application, where you can organize your own files.

  • Anyone with the access keys can read / write files in this bucket (folder).


 

Project MTA


Coming back to the real stuff: implementation in the project.

First, we have to define the objectstore service as a requirement / dependency in the mta.yaml file, like this:
ID: ObjectStore_Demo
_schema-version: '2.1'
description: Object Store Demo
version: 0.0.1
modules:
  - name: ObjectStore
    type: java
    path: ObjectStore
    parameters:
      memory: 1024M
    provides:
      - name: ObjectStore_api
        properties:
          url: '${default-url}'
    requires:
      - name: demo_s3
resources:
  - name: demo_s3
    type: objectstore
    parameters:
      service: objectstore
      service-plan: s3-standard
      service-name: demo_s3

When this app is deployed, a service instance named demo_s3 will be created and the application will be bound to it automatically.

 

Service Instance and Bucket Keys


When the application is deployed, the following happens:

  • Service instance is created

  • Application binding is done

  • System creates a bucket and access keys for your application
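
For illustration, the generated bucket name and access keys show up in the application's VCAP_SERVICES environment variable (you can inspect it with cf env <app-name>). The values below are placeholders, but the field names match what the Java code later in this post reads:

{
  "objectstore": [
    {
      "name": "demo_s3",
      "label": "objectstore",
      "plan": "s3-standard",
      "credentials": {
        "bucket": "<generated-bucket-name>",
        "access_key_id": "<generated-access-key-id>",
        "secret_access_key": "<generated-secret-key>",
        "region": "<aws-region>"
      }
    }
  ]
}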




 

Java App


AWS provides SDKs for several programming languages. For now, we focus only on the Java SDK.

There are two versions of the SDK available at the moment:

  • Version 1.x

  • Version 2.x


For this PoC we used version 1.x, as this is the older version and a lot of community support is available. It was also easier to implement.

 

POM Dependencies


Add the following dependencies to the pom.xml:
<!-- The BOM import belongs in the <dependencyManagement> section -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-bom</artifactId>
            <version>1.11.600</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

<!-- The S3 module goes in the regular <dependencies> section -->
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-s3</artifactId>
    <version>1.11.600</version>
</dependency>

 

Initialize the S3 Client


In Java, we need the bucket name and the access keys. These can be read from the VCAP_SERVICES environment variable.
// Assumes instance fields declared elsewhere in the class:
//   private JSONObject s3_credentials;
//   private AmazonS3 s3;

// Read the bound service credentials from the VCAP_SERVICES environment variable
JSONObject vcap = new JSONObject(System.getenv("VCAP_SERVICES"));
s3_credentials = vcap.getJSONArray("objectstore").getJSONObject(0).getJSONObject("credentials");

String access_id = s3_credentials.getString("access_key_id");
String secret_key = s3_credentials.getString("secret_access_key");
String region = s3_credentials.getString("region");

// Build the S3 client using the static credentials and region from the binding
BasicAWSCredentials awsCreds = new BasicAWSCredentials(access_id, secret_key);

s3 = AmazonS3ClientBuilder.standard()
        .withCredentials(new AWSStaticCredentialsProvider(awsCreds))
        .withRegion(region)
        .build();

 

 

Reading the File List


Note that you can read files only from your own bucket.
// Collect key, owner and size for every object in the application's bucket
JSONArray file_list = new JSONArray();
JSONObject file;

// The bucket name is part of the service binding credentials
String bucket_name = s3_credentials.getString("bucket");

ListObjectsV2Result result = s3.listObjectsV2(bucket_name);
List<S3ObjectSummary> objects = result.getObjectSummaries();
for (S3ObjectSummary o : objects) {
    file = new JSONObject();
    file.put("key", o.getKey());
    file.put("owner", o.getOwner());
    file.put("size", o.getSize());
    file_list.put(file);
}

return file_list;

 

Write Files to Bucket


// Store a simple text object in the application's bucket
String file_name = "my_file_name";
String file_content = "my_file_content";
String bucket_name = s3_credentials.getString("bucket");
s3.putObject(bucket_name, file_name, file_content);
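
The putObject overload above is convenient for plain strings. For binary content (for example an uploaded document or image), a stream-based variant can be used instead. This is a minimal sketch; the byte array, key name and content type are illustrative, and it needs java.io.ByteArrayInputStream, java.nio.charset.StandardCharsets and the com.amazonaws.services.s3.model classes on the import list:

// Illustrative sketch: upload binary content with explicit metadata
byte[] bytes = "my_binary_content".getBytes(StandardCharsets.UTF_8);

ObjectMetadata meta = new ObjectMetadata();
meta.setContentLength(bytes.length);
meta.setContentType("application/octet-stream");

s3.putObject(new PutObjectRequest(
        bucket_name,                      // bucket from the service binding
        "my_binary_file",                 // object key (illustrative)
        new ByteArrayInputStream(bytes),  // content as a stream
        meta));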

 

Read File Content


String bucket_name = s3_credentials.getString("bucket");
S3Object o = s3.getObject(bucket_name, file_name);

// Read the object content into a string
// (note: line breaks are not preserved by this simple approach)
StringBuilder sbuilder = new StringBuilder();
try (BufferedReader reader = new BufferedReader(new InputStreamReader(o.getObjectContent()))) {
    String line;
    while ((line = reader.readLine()) != null) {
        sbuilder.append(line);
    }
}

JSONObject result = new JSONObject();
result.put("file_content", sbuilder.toString());
result.put("file_meta", o.getObjectMetadata().getRawMetadata());

return result;
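
Reading the content into a string is fine for text files. For binary or large objects, the SDK can also stream an object straight to a local file. A minimal sketch, with an illustrative target path:

// Illustrative sketch: download an object directly to a local file
// (requires com.amazonaws.services.s3.model.GetObjectRequest and java.io.File)
s3.getObject(
        new GetObjectRequest(bucket_name, file_name),
        new File("/tmp/" + file_name));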

 

Build and Deploy


Your application is now ready to build and deploy. Deploy it to the cloud platform and test it.
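
If you build and deploy from the command line (rather than from the Web IDE), a typical sequence looks like the sketch below. This assumes the Cloud MTA Build Tool (mbt) and the Cloud Foundry MultiApps CLI plugin are installed; the archive name is derived from the ID and version in the mta.yaml above:

mbt build
cf deploy mta_archives/ObjectStore_Demo_0.0.1.mtar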

 

Full Working Example


The complete development example can be seen in the video:



 

Conclusion


Integrating and using AWS S3 as the object store service is really easy.

In this blog post we learned the following:

  • How to add the object store as a requirement in the MTA file

  • How to read the bucket credentials and access the bucket

  • How to perform read and write operations on an S3 bucket


The object store service can be used wherever applications have to store and manage files (or raw data), for example in scenarios where an application has to store documents, images, receipts, etc.

In my next blog post, I plan to demonstrate how the same can be achieved when underlying IaaS is GCP.