Performing Basic Operations On An AWS S3 Bucket In Spring Boot Java

Posted By : Abhilasha Saxena | 29-Oct-2021


In this blog, we are going to learn about the basic operations that we can perform on an AWS S3 bucket, both with and without AWS credentials, in a Spring Boot Java application.
First, create a Spring Boot Maven project
(you can generate one from Spring Initializr)
and add the AWS dependency below to your pom.xml file:

<dependency>
   <groupId>com.amazonaws</groupId>
   <artifactId>aws-java-sdk-s3</artifactId>
   <version>1.11.954</version>
</dependency>

If you want to expose these operations through a REST API, create a REST controller and
a function in your service layer that returns an AmazonS3 object so that you can re-use it.

Note: for this you'll need the credentials of an AWS account.

You can set ACCESS_KEY_ID, SECRET_ACCESS_KEY, and STORAGE_REGION
in your application.properties file, or you can hard-code the values
as per your requirement.
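For example, assuming hypothetical property names (pick whatever keys match your own configuration class), the application.properties entries could look like this, with placeholder values:

```properties
# Placeholder values -- replace with your own credentials and region.
# The property names here are our own choice, not fixed by AWS.
aws.access-key-id=AKIAXXXXXXXXXXXXXXXX
aws.secret-access-key=your-secret-access-key
aws.storage-region=us-east-1
```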

The below code authenticates the user's AWS account and returns an AmazonS3 object:

public AmazonS3 getAmazonS3Object() {
    BasicAWSCredentials credentials = new BasicAWSCredentials(aws.ACCESS_KEY_ID, aws.SECRET_ACCESS_KEY);
    return AmazonS3ClientBuilder.standard()
            .withRegion(aws.STORAGE_REGION)
            .withCredentials(new AWSStaticCredentialsProvider(credentials))
            .build();
}

This object exposes multiple methods which we can use to get access
to bucket files, folders, etc., and perform various operations.

Create an object of the AmazonS3 class to access the methods for creating,
fetching, and deleting buckets and the files inside them:

AmazonS3 s3 = getAmazonS3Object();

You can create a bucket with the line below:

s3.createBucket(bucketName);
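Before calling createBucket, keep in mind that bucket names are globally unique across all of S3 and must be 3 to 63 characters of lowercase letters, digits, dots, and hyphens. As a rough standalone sketch (the class and method names here are our own, not part of the AWS SDK), such a check could look like:

```java
public class BucketNameCheck {

    // Rough sketch of the S3 bucket-naming rules: 3-63 characters,
    // lowercase letters, digits, dots and hyphens, starting and
    // ending with a letter or digit.
    static boolean isValidBucketName(String name) {
        return name != null
                && name.length() >= 3 && name.length() <= 63
                && name.matches("[a-z0-9][a-z0-9.-]*[a-z0-9]");
    }

    public static void main(String[] args) {
        System.out.println(isValidBucketName("my-demo-bucket")); // true
        System.out.println(isValidBucketName("My_Bucket"));      // false
    }
}
```

Validating the name up front gives a clearer error than waiting for the AmazonS3Exception the service would throw.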

If you want to make a REST API that creates a bucket on the server, then use the code below:

public Bucket createBucket(String bucketName) {
    AmazonS3 s3 = getAmazonS3Object();
    Bucket b = null;
    try {
        b = s3.createBucket(bucketName);
    } catch (AmazonS3Exception e) {
        e.printStackTrace();
    }
    return b;
}

If you want to add a check for an existing bucket with the same name, you can use the
doesBucketExistV2 method, passing bucketName as a parameter. It returns a boolean:
true if a bucket with that name already exists, and false otherwise.

Update your code with the code mentioned below:

public Bucket createBucket(String bucketName) {
    AmazonS3 s3 = getAmazonS3Object();
    Bucket b = null;
    if (s3.doesBucketExistV2(bucketName)) {
        // getBucket(...) looks up the existing Bucket (helper not shown here)
        b = getBucket(bucketName);
    } else {
        try {
            b = s3.createBucket(bucketName);
        } catch (AmazonS3Exception e) {
            e.printStackTrace();
        }
    }
    return b;
}

You can also perform the other operations mentioned below, and
convert each of them into an individual function:

To write or upload a file into the bucket, simply use:

s3.putObject(new PutObjectRequest(bucketName, fileName, file));

To delete a file from the bucket, simply use the line below (note that S3 has no real directories; to delete a "folder" you must delete every object key under that prefix):

s3.deleteObject(bucketName, path);

To delete a bucket (S3 will only delete a bucket once it is empty), simply use:

s3.deleteBucket(bucketName);

There are multiple methods to fetch a file object from AmazonS3, but here we use the getObject method, which takes a GetObjectRequest built from the bucket name and the name of the file we want to read and copy:

public void copyFileFromS3Bucket(AmazonS3 amazonS3, String bucketName, String fileName) throws IOException {
    // try-with-resources closes both streams; we read in chunks instead of
    // relying on stream.available(), which may not report the full length
    try (S3ObjectInputStream stream = amazonS3.getObject(new GetObjectRequest(bucketName, fileName)).getObjectContent();
         OutputStream outStream = new FileOutputStream(new File(fileName))) {
        byte[] buffer = new byte[4096];
        int bytesRead;
        while ((bytesRead = stream.read(buffer)) != -1) {
            outStream.write(buffer, 0, bytesRead);
        }
    }
}

We can use the lines below to call the methods defined above and perform the action (the second argument is the bucket name and the third is the name of the file to copy):

AmazonS3 s3 = getAmazonS3Object();
copyFileFromS3Bucket(s3, aws.STORAGE_BUCKET_NAME, aws.STORAGE_FILE_NAME);

Suppose we don't want to share the account credentials with others but still want a file to be readable. We can then grant public access to the file inside the bucket and make it accessible with the help of a URL.
The URL would look like the below:

http://[BucketName].s3.amazonaws.com/[FileName]
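As a small illustration (the class and method names here are our own, not part of the AWS SDK), the URL above can be assembled from the bucket and file names like this:

```java
public class S3UrlBuilder {

    // Hypothetical helper: assembles the public object URL in the
    // http://[BucketName].s3.amazonaws.com/[FileName] form shown above.
    static String publicUrl(String bucketName, String fileName) {
        return "http://" + bucketName + ".s3.amazonaws.com/" + fileName;
    }

    public static void main(String[] args) {
        System.out.println(publicUrl("my-demo-bucket", "report.pdf"));
        // prints http://my-demo-bucket.s3.amazonaws.com/report.pdf
    }
}
```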

For reading or copying a publicly accessible file we don't need any AWS account credentials like the ones above,

so we can simply copy the file via its URL using the code below:

public void copyFileFromLink() throws IOException {
    URL url = new URL(aws.URL_LINK);
    // try-with-resources closes both streams even if an exception is thrown
    try (BufferedInputStream in = new BufferedInputStream(url.openStream());
         FileOutputStream fileOutputStream = new FileOutputStream(getFileName(url))) {
        byte[] dataBuffer = new byte[1024];
        int bytesRead;
        while ((bytesRead = in.read(dataBuffer, 0, 1024)) != -1) {
            fileOutputStream.write(dataBuffer, 0, bytesRead);
        }
    }
}
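The getFileName helper used above is not shown in this post; one possible implementation (an assumption on our part) simply takes the last segment of the URL path:

```java
import java.net.URL;

public class FileNameHelper {

    // Possible getFileName implementation (assumption -- the original
    // helper is not shown): return the last segment of the URL path.
    static String getFileName(URL url) {
        String path = url.getPath();
        return path.substring(path.lastIndexOf('/') + 1);
    }

    public static void main(String[] args) throws Exception {
        URL url = new URL("http://my-demo-bucket.s3.amazonaws.com/docs/report.pdf");
        System.out.println(getFileName(url));
        // prints report.pdf
    }
}
```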

We, at Oodles, provide end-to-end ERP software development services to solve complex business problems. Our custom ERP software solutions enable cross-industry businesses to streamline their processes and achieve higher levels of customer satisfaction.