Learn Amazon Web Services (AWS) Platform-Part 2

Hello everyone, this is part 2 of my series on the Amazon Web Services (AWS) platform. In the previous part, we discussed EC2, Lightsail, and Lambda. This part will focus on Elastic Beanstalk and AWS Batch.

So let’s get started!!!

Elastic Beanstalk:

You can think of Elastic Beanstalk as a “Platform as a Service” offering from Amazon for web application development. It makes the job of a web or application developer very easy: in a matter of a few clicks, you have your environment up and running. You just need to upload your code, and the platform is provisioned according to your application’s requirements.

Java, .NET, PHP, Node.js, Python, Ruby, Go, and Docker are the web or application platforms your code can be developed on.

It doesn’t stop there: Elastic Beanstalk not only provides a platform for your application but also scales it automatically according to your application’s requirements. You still have the flexibility to choose the EC2 instance type used for the web application deployment.

With Elastic Beanstalk you can also monitor the health of your applications.
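To give a feel for how little code is needed, here is a minimal sketch of a file you could upload for the Python platform. By default the Python platform looks for a WSGI callable named application in a file called application.py; the file name and greeting text here are only illustrative.

```python
# application.py - a minimal WSGI app for the Elastic Beanstalk Python platform
def application(environ, start_response):
    # Elastic Beanstalk's Python platform serves the callable named "application" by default
    body = b"Hello from Elastic Beanstalk!"  # illustrative response text
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ])
    return [body]
```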

Below are the steps to create an environment with Elastic Beanstalk:

  • Log in to the AWS Console and, from the Services tab, click Elastic Beanstalk. It is part of the Compute group.
  • Click Create application.
  • Enter the application name and select the language platform. I selected Python, and the recommended Python version was chosen automatically; this can be customized.

  • You can now upload your code; for testing, I selected the option to use the sample code.

  • The Configure more options section gives you the flexibility to customize the environment, such as configuring load balancing, automatic updates, and monitoring, but these options are not covered under the free tier.
  • Click Create application; environment provisioning will start in the background.

  • After about 5 minutes you will have your environment up and running in the AWS cloud. The same steps can also be scripted, as shown in the sketch below.
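If you prefer to script the console steps above, here is a minimal sketch using boto3 (the AWS SDK for Python). The region, application name, and environment name are assumptions for illustration; the exact solution stack names available to your account can be listed with list_available_solution_stacks().

```python
import boto3

# Region, application name and environment name below are assumptions for illustration.
eb = boto3.client("elasticbeanstalk", region_name="us-east-1")

# Create the application (the console's "Create application" step)
eb.create_application(ApplicationName="my-python-app")

# Look up an available Python solution stack; exact stack names change over time
stacks = eb.list_available_solution_stacks()["SolutionStacks"]
python_stack = next(s for s in stacks if "Python" in s)

# Create the environment; provisioning then runs in the background, just as in the console
eb.create_environment(
    ApplicationName="my-python-app",
    EnvironmentName="my-python-env",
    SolutionStackName=python_stack,
)
```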



AWS Batch:

 

AWS Batch is a service provided by Amazon that executes batch computing jobs. Resources are allocated depending on the job requirements; EC2 On-Demand and Spot Instances are used for resource allocation. Batch computing jobs run asynchronously across multiple compute instances.

AWS Batch comprises the following four components:

  • Jobs
  • Job definition
  • Job queues
  • Compute Environment


Jobs: The unit of work submitted to AWS Batch; this can be a PowerShell script, an executable, a batch file, or a Docker container image.

Job definition: Specifies how your jobs will be executed, including the CPU requirement, the memory requirement, and the IAM role required.
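As a sketch of what a job definition looks like through the API, a container-based definition can be registered with boto3 as below. The definition name, container image, resource values, and IAM role ARN are all assumptions for illustration.

```python
import boto3

batch = boto3.client("batch", region_name="us-east-1")  # region is an assumption

# Register a container-based job definition: image, vCPUs, memory, command and IAM role
batch.register_job_definition(
    jobDefinitionName="thumbnail-job-def",  # illustrative name
    type="container",
    containerProperties={
        "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/thumbnailer:latest",  # assumed image
        "vcpus": 1,
        "memory": 512,
        "command": ["python", "make_thumbnail.py"],  # assumed entry point
        "jobRoleArn": "arn:aws:iam::123456789012:role/batch-job-role",  # assumed IAM role
    },
)
```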

Job queue: The list of jobs waiting to be executed; you can set the priority when there are multiple jobs.

Compute Environment: Describes the kind of compute instances that will be used to execute the jobs. It can be fully managed by AWS, or you can manage the resources yourself.
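Putting the pieces together, the sketch below attaches a job queue to an existing compute environment and submits a job against the definition registered above. The queue name, job name, and compute environment ARN are assumptions for illustration.

```python
import boto3

batch = boto3.client("batch", region_name="us-east-1")  # region is an assumption

# Attach a job queue (priority 1) to an existing, assumed compute environment
batch.create_job_queue(
    jobQueueName="thumbnail-queue",
    state="ENABLED",
    priority=1,
    computeEnvironmentOrder=[{
        "order": 1,
        "computeEnvironment": "arn:aws:batch:us-east-1:123456789012:compute-environment/my-env",
    }],
)

# Submit a job to that queue using the job definition registered earlier
response = batch.submit_job(
    jobName="thumbnail-run-1",
    jobQueue="thumbnail-queue",
    jobDefinition="thumbnail-job-def",
)
print("Submitted job:", response["jobId"])
```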

 

Use case: To better understand AWS Batch processing, let’s look at the example below.

 

Objective: Move .jpg files from one S3 bucket to another S3 bucket as thumbnails using AWS Batch processing.

 

Steps:

1. An image file is uploaded to the S3 bucket that serves as the source storage for batch processing.

2. An event notification is sent from the source S3 bucket to Amazon SQS (Simple Queue Service) with the object details.

3. The Amazon ECS tasks poll the SQS queue for any jobs available for processing

4. If a job is available, the task picks it up, in this case a .jpg file, converts it into a thumbnail, and moves it to the target S3 bucket (a sketch of this worker logic follows).
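To make steps 3 and 4 concrete, here is a minimal sketch of the worker logic an ECS task could run: it polls the SQS queue, downloads the .jpg from the source bucket, creates a thumbnail with Pillow, and uploads it to the target bucket. The queue URL, bucket name, thumbnail size, and file paths are assumptions for illustration.

```python
import json
import boto3
from PIL import Image  # Pillow, used here to create the thumbnail

sqs = boto3.client("sqs")
s3 = boto3.client("s3")

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/image-jobs"  # assumed queue
TARGET_BUCKET = "my-thumbnail-bucket"                                      # assumed bucket

def poll_once():
    # Step 3: poll the queue for pending S3 event notifications
    resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=1, WaitTimeSeconds=10)
    for msg in resp.get("Messages", []):
        event = json.loads(msg["Body"])
        record = event["Records"][0]
        source_bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Step 4: download the .jpg, shrink it, and upload the thumbnail to the target bucket
        s3.download_file(source_bucket, key, "/tmp/source.jpg")
        image = Image.open("/tmp/source.jpg")
        image.thumbnail((128, 128))
        image.save("/tmp/thumb.jpg")
        s3.upload_file("/tmp/thumb.jpg", TARGET_BUCKET, f"thumbnails/{key}")

        # Delete the message so the job is not processed twice
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```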

