AWS Lambda is a serverless computing platform from Amazon Web Services (AWS) that runs code without provisioning or managing servers. In this tutorial we will learn how to create an example application using Spring Boot and AWS.


This article will show you how to use Spring Boot to create an AWS Lambda function with an S3 trigger.

AWS’s Simple Storage Service, or Amazon S3, is a cloud object storage service. It’s a web-based cloud storage solution for storing data items in a bucket structure that’s scalable and fast. While following this guide, make sure you have an S3 bucket set up to upload items. If you don’t have one, make one now.

In this example, we’ll upload a basic text file and observe how the Lambda function is invoked immediately.

A Spring Boot project in Java will be used to generate the Lambda function. We’ll create a basic handler method in the Spring Boot project to accept upload events from S3. In the Handler function, you may build your business logic to handle the S3Event. We will just access the file and print its contents in this tutorial.

Let’s get this party started. The first step is to create a Spring Boot project and populate the pom.xml file with the appropriate Maven dependencies.


In addition to the ‘aws-lambda-java-events’ and ‘aws-lambda-java-core’ dependencies, we must also include the ‘aws-java-sdk-s3’ dependency to access the objects in S3.

```xml
<properties>
    <java.version>1.8</java.version>
    <wrapper.version>1.0.17.RELEASE</wrapper.version>
    <spring-cloud.version>Hoxton.SR6</spring-cloud.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.cloud</groupId>
        <artifactId>spring-cloud-function-adapter-aws</artifactId>
    </dependency>
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>aws-lambda-java-events</artifactId>
        <version>2.0.2</version>
    </dependency>
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>aws-lambda-java-core</artifactId>
        <version>1.1.0</version>
    </dependency>
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>aws-java-sdk-s3</artifactId>
        <version>1.11.271</version>
    </dependency>
    <dependency>
        <groupId>org.projectlombok</groupId>
        <artifactId>lombok</artifactId>
        <optional>true</optional>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
        <exclusions>
            <exclusion>
                <groupId>org.junit.vintage</groupId>
                <artifactId>junit-vintage-engine</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
</dependencies>

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-dependencies</artifactId>
            <version>${spring-cloud.version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

<build>
    <finalName>springboot-aws-lambda</finalName>
    <plugins>
        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
            <dependencies>
                <dependency>
                    <groupId>org.springframework.boot.experimental</groupId>
                    <artifactId>spring-boot-thin-layout</artifactId>
                    <version>${wrapper.version}</version>
                </dependency>
            </dependencies>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.2.4</version>
            <configuration>
                <createDependencyReducedPom>false</createDependencyReducedPom>
                <shadedArtifactAttached>true</shadedArtifactAttached>
                <shadedClassifierName>aws</shadedClassifierName>
            </configuration>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-deploy-plugin</artifactId>
            <configuration>
                <skip>true</skip>
            </configuration>
        </plugin>
    </plugins>
</build>
```

The second step is to create the handler class and implement the handler method. Make a class called S3EventHandler that implements the RequestHandler<S3Event, String> interface. In this class, implement the handleRequest method, which accepts an S3Event and a Context as input and returns a String.

In this class we’ll also use AmazonS3ClientBuilder and BasicAWSCredentials to construct an AmazonS3 client from an AWS access key and secret key. If you don’t already have an access key and secret key for accessing S3 objects programmatically, create them now.

The handler function receives an S3Event containing one or more records. Each record refers to an object that was uploaded to the S3 bucket. We can inspect the full structure of the S3Event using s3Event.toJson(). As demonstrated in the code, extract the relevant information from the S3Event, such as the bucket name and file name. We can then use the getObject() method of the AmazonS3 client to fetch the S3 object (a text file in our case) and print its contents from the returned InputStream.
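One detail worth noting (a general S3 behavior, not shown in the tutorial's code): the object key in an S3 event notification arrives URL-encoded, so a file name containing spaces or special characters must be decoded before being passed to getObject(). A minimal sketch of such a helper, with an assumed class name:

```java
import java.io.UnsupportedEncodingException;
import java.net.URLDecoder;

public class S3KeyDecoder {
    // S3 event notifications URL-encode the object key ("my file.txt" arrives
    // as "my+file.txt"). Decode it before calling s3client.getObject(), or the
    // lookup may fail with a NoSuchKey error for such files.
    public static String decodeKey(String rawKey) {
        try {
            return URLDecoder.decode(rawKey, "UTF-8");
        } catch (UnsupportedEncodingException e) {
            throw new IllegalStateException("UTF-8 is always supported", e);
        }
    }

    public static void main(String[] args) {
        System.out.println(decodeKey("reports/my+file.txt"));
        System.out.println(decodeKey("data/plain.txt"));
    }
}
```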

Paste the following code into the handler class file.

```java
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.util.StreamUtils;

import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class S3EventHandler implements RequestHandler<S3Event, String> {

    private static final Logger log = LoggerFactory.getLogger(S3EventHandler.class);

    // Credentials and region are read from the Lambda environment variables
    private final String accessKeyId = System.getenv("accessKeyId");
    private final String secretAccessKey = System.getenv("secretAccessKey");
    private final String region = System.getenv("region");

    private final BasicAWSCredentials basicAWSCredentials =
            new BasicAWSCredentials(accessKeyId, secretAccessKey);

    private final AmazonS3 s3client = AmazonS3ClientBuilder.standard()
            .withRegion(Regions.fromName(region))
            .withCredentials(new AWSStaticCredentialsProvider(basicAWSCredentials))
            .build();

    @Override
    public String handleRequest(S3Event s3Event, Context context) {
        log.info("Lambda function is invoked: Processing the uploads........." + s3Event.toJson());

        // Extract the bucket name and the uploaded object's key from the first record
        String bucketName = s3Event.getRecords().get(0).getS3().getBucket().getName();
        String fileName = s3Event.getRecords().get(0).getS3().getObject().getKey();
        log.info("File - " + fileName + " uploaded into " + bucketName
                + " bucket at " + s3Event.getRecords().get(0).getEventTime());

        // Fetch the object and print its contents
        try (InputStream is = s3client.getObject(bucketName, fileName).getObjectContent()) {
            log.info("File Contents: " + StreamUtils.copyToString(is, StandardCharsets.UTF_8));
        } catch (IOException e) {
            e.printStackTrace();
            return "Error reading contents of the file";
        }
        return null;
    }
}
```

After that, run the mvn package command to produce a jar file, then use that jar to create a Lambda function. Check out this guide to learn how to accomplish it:

Create and deploy a serverless Spring Boot web application with AWS Lambda.

After creating the Lambda function, add the access key, secret key, and region values as environment variables under the Configuration tab, as shown below, and read them in the project using System.getenv().
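Reading environment variables can be made a little more defensive with a small helper that supplies a fallback when a variable is not set. This is a minimal sketch; the helper name and the "us-east-1" fallback are illustrative assumptions, while "region" and "accessKeyId" are the variable names used in this tutorial:

```java
public class EnvConfig {
    // Read a Lambda environment variable, falling back to a default when it is
    // missing or empty, so a misconfigured function fails in a predictable way.
    static String envOrDefault(String name, String fallback) {
        String value = System.getenv(name);
        return (value == null || value.isEmpty()) ? fallback : value;
    }

    public static void main(String[] args) {
        // Neither variable is set outside Lambda, so the fallbacks are returned here.
        System.out.println(envOrDefault("region", "us-east-1"));
        System.out.println(envOrDefault("accessKeyId", "MISSING"));
    }
}
```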


Adding an S3 Trigger to a Lambda Function

The next step is to give this Lambda function a trigger. Click Add Trigger and select S3 as the trigger source. Choose the name of the bucket you just made from the drop-down menu. Select the event type ‘All object create events’, which covers PUT, POST, and COPY operations.


To make the Lambda function handle only certain events, we can also specify a prefix (the name/path of the uploaded object) and a suffix (the object type, e.g., .jpg, .txt, .png).
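S3 applies this prefix/suffix filtering server-side before invoking the function, so no filtering code is needed in the handler. The matching semantics are essentially a plain starts-with/ends-with check, which this small illustrative sketch (hypothetical class and values) demonstrates:

```java
public class TriggerFilter {
    // Illustrates S3's prefix/suffix filter semantics: an object key fires the
    // trigger only if it starts with the prefix AND ends with the suffix.
    static boolean wouldTrigger(String key, String prefix, String suffix) {
        return key.startsWith(prefix) && key.endsWith(suffix);
    }

    public static void main(String[] args) {
        // Hypothetical filter: only .txt files under the uploads/ prefix match.
        System.out.println(wouldTrigger("uploads/test.txt", "uploads/", ".txt"));
        System.out.println(wouldTrigger("images/photo.jpg", "uploads/", ".txt"));
    }
}
```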

To complete inserting the trigger, click Add.

Go to the S3 service and upload a test.txt file to the bucket. The following is the content of the file that I uploaded.


Once the upload completes, the Lambda function should be invoked. To verify this, navigate to the Lambda function’s Monitor tab and choose ‘View logs in CloudWatch.’ Then click on the newly created stream in the Log streams section to check the logs.



In the logs you can see the line ‘Lambda function is invoked: Processing the uploads.........’. Clicking on it shows the whole JSON describing the S3Event. On the next line you can also see the statement ‘File – test.txt uploaded into s3trigger-bucket bucket at 2021-11-06T11:03:04.731Z’.


You can also see the file contents printed if you scroll down to the bottom, as illustrated in the image below.


Every time a text file is uploaded into the bucket, the Lambda function is called in a similar fashion.

The complete code for this tutorial can be found on GitHub.

This concludes the tutorial. I hope this has been of use to you. Good luck with your studies!


