Without Kubernetes, how can we manage and orchestrate a cluster of containers in AWS?
Monday, June 6, 2022
AWS Elastic Beanstalk: A quicker way to move applications to the AWS Cloud
AWS Elastic Beanstalk:
Do you want to deploy your applications (Java, .NET, Node.js) to the AWS cloud easily? Then one of the simplest ways is Elastic Beanstalk.
Just upload your application artifact (for example, a JAR file for a Java application) and create an application. Beanstalk automatically provisions all the required resources, deploys the application, and provides a URL to access it. Deployment is very fast because you do not have to define the EC2, scaling, or load-balancing configuration yourself, and you still have full control over the created resources such as the EC2 instances.
Deploy via AWS Console:
Pre-requisites:
- Have your application built and ready. In my case it is a simple Spring Boot application with REST API endpoints. Do not set server.port (for example to 8080) in the application configuration; leave it at the default.
- Test your application locally using java -jar app.jar and verify that it works (see the quick check after this list).
- Have your AWS account and credentials ready, and log in to the AWS Console.
- Check that you have a VPC available; if you do not, create a default VPC.
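A quick local check might look like the following. This is only a sketch: it assumes a Maven build and that the packaged jar ends up under target/, and the endpoint path is a placeholder for one of your own REST endpoints.

    # Build the jar (use ./gradlew build for a Gradle project)
    mvn clean package

    # Run it on the default port 8080, then call one of your endpoints from a second terminal
    java -jar target/app.jar
    curl http://localhost:8080/api/v1/hello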
Steps to Deploy:
- Log in to the AWS Console, search for Elastic Beanstalk, and open it.
- Click the Create environment button.
- Choose Web server environment, since our app is a web API, and click the Select button.
- On the application details page:
- Fill in Application name: for example, java-balaji-api
- Fill in Environment name: for example, java-balaji-api-dev
- Provide a description
- Choose Platform: in our case it is a Java Spring Boot based API, so select Java
- Choose Platform branch: my app is built with OpenJDK 11, so I selected "Corretto 11 running on 64bit Amazon Linux 2"
- Platform version: I chose the default (Recommended)
- In the Application code section, select Upload your code
- Fill in the version label, choose Local file, and upload the jar from your local folder
- Click Configure more options
- Click Edit on the Software card
- Under Environment properties, add a new variable SERVER_PORT with the value 5000 (the Beanstalk Java platform's nginx reverse proxy forwards incoming traffic to port 5000 by default, so this makes Spring Boot listen on the port the proxy expects)
- Click Save
- Click Create application
Test from Postman:
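Once the environment turns healthy, Beanstalk shows the environment URL at the top of the environment page. You can call your API against that URL from Postman, or from the command line with curl; the URL and endpoint path below are placeholders for your own:

    curl http://java-balaji-api-dev.<region>.elasticbeanstalk.com/api/v1/hello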
Terminate and Remove:
If you want to terminate the environment, select it and choose Terminate environment. This removes only the environment's provisioned resources, such as the EC2 instance; the application and its versions stored in S3 remain. To remove those as well, select the application and click the "Delete" button.
Thursday, June 2, 2022
Terraform : Infrastructure as Code : Quick Example
Here we will cover:
- Install Terraform on Windows Machine - Manual
- Install Terraform on AWS EC2 - Manual
- Install Docker on AWS EC2 using Terraform Script (see the sketch after this list)
- Install Jenkins on AWS EC2 using Terraform Script
- Install AWS Components and provisioning using Terraform Script
- Upload the Terraform script (code) into GitHub
- terraform.tfstate - state tracking, security, and access provisioning
- Terraform Registry and Providers
- Change Infrastructure using Script
- Destroy Infrastructure using Script - terraform destroy
- Terraform Cloud
Terraform Cloud allows teams to easily version, audit, and collaborate on infrastructure changes. It also securely stores variables, including API tokens and access keys, and provides a safe, stable environment for long-running Terraform processes.
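To make the list above concrete, here is a minimal sketch of the "Install Docker on AWS EC2 using Terraform Script" item. It assumes the default VPC, a t2.micro instance, and an Amazon Linux 2 AMI; the AMI ID, region, and tags are only illustrative, so adjust them for your account:

    terraform {
      required_providers {
        aws = {
          source  = "hashicorp/aws"
          version = "~> 4.0"
        }
      }
    }

    provider "aws" {
      region = "us-east-1" # adjust to your region
    }

    resource "aws_instance" "docker_host" {
      ami           = "ami-0c02fb55956c7d316" # example Amazon Linux 2 AMI; look up the current one for your region
      instance_type = "t2.micro"

      # Install and start Docker at first boot
      user_data = <<-EOF
                  #!/bin/bash
                  yum update -y
                  amazon-linux-extras install docker -y
                  systemctl enable --now docker
                  usermod -aG docker ec2-user
                  EOF

      tags = {
        Name = "terraform-docker-demo"
      }
    }

Run terraform init, terraform plan, and terraform apply to provision the instance, and terraform destroy to tear it down. After apply, the terraform.tfstate file records what was created; treat it as sensitive and keep it in a secure remote backend rather than committing it to GitHub.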
For more hands-on experience with the Terraform configuration language, resource provisioning, or importing existing infrastructure, review the tutorials below.
Configuration Language - Get more familiar with variables, outputs, dependencies, meta-arguments, and other language features to write more sophisticated Terraform configurations.
Modules - Organize and re-use Terraform configuration with modules.
Provision - Use Packer or Cloud-init to automatically provision SSH keys and a web server onto a Linux VM created by Terraform in AWS.
Import - Import existing infrastructure into Terraform.
DevOps and DevSecOps
Do we need a repository for artifacts?
Do we need a repository only for dependencies and not for applications?
What is Infrastructure as Code, and how does Terraform help?
DevOps :
Culture :
Development, Infra, IT, Business, and Testing teams work as a single unit
Best Practices
Developers are involved in IT Operations
IT Operations staff are involved in Development
Version Control for Code, Infra, Repo
Adopt new changes quickly across Code, IT, Process, Infra
Everything Automated - Code, CI, CD, Testing, Reports
Containerized Tools and Environment - DEV-PROD parity
Tools
Source control - Code Repo, DockerHub, Access provisioning
CI/CD - Jenkins Docker Images, Security, Access provisioning
Testing
Code Coverage Tools
Configuration Management and Tools
Binary Management/ Artifactory
Monitoring
Security
Collaboration
Wednesday, June 1, 2022
Spring Boot, Jenkins, Docker, Kubernetes on AWS EKS
Here we will:
- Simple Spring Webflux based API
- Containerize the application using Docker
- Create CI Jenkins Pipeline
- Build
- Run Unit Tests
- Run Jacoco Reports
- Create Docker Image and Push to DockerHub
- Create CD Pipeline
- Login to AWS
- Pull Docker Image from Docker Hub and create container
- Deploy in AWS EKS
- Configure the CI pipeline to be triggered only when code is checked in to any feature/* branch
- Configure the CI pipeline to be triggered only when a PR is raised
- Configure the CD pipeline to be triggered only when a PR is merged
1. Simple Spring Webflux based API
- Please check the README.md for how to build locally.
- Use Beer-Service.postman_collection.json to test the application with Postman.
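For orientation, a WebFlux endpoint in such a service can be as small as the following. This is a hypothetical handler for illustration, not the actual beer-service code; it only requires the spring-boot-starter-webflux dependency:

    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RestController;
    import reactor.core.publisher.Mono;

    @RestController
    @RequestMapping("/api/v1/beer")
    public class BeerController {

        // Returns a single beer reactively; a real service would look it up in a repository
        @GetMapping("/{id}")
        public Mono<String> getBeerById(@PathVariable String id) {
            return Mono.just("Beer with id " + id);
        }
    }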
2. Containerize the application using Docker
In the Dockerfile we use the image adoptopenjdk/openjdk11, copy the built jar file into the image as beerOrderService.jar, expose the port the application listens on, and define the command that runs the application once the container starts.
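A sketch of such a Dockerfile, assuming the jar is built into target/ and the application listens on port 8080 (adjust the paths and port to your build):

    FROM adoptopenjdk/openjdk11
    # Copy the built jar into the image under a fixed name
    COPY target/*.jar beerOrderService.jar
    # Document the port the application listens on
    EXPOSE 8080
    # Start the application when the container starts
    ENTRYPOINT ["java", "-jar", "/beerOrderService.jar"]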
3. Create CI Jenkins Pipeline
- Build
- Run Unit Tests
- Run Jacoco Reports
- Create Docker Image and Push to DockerHub
- Login to AWS
- Pull Docker Image from Docker Hub and create container
- Deploy in AWS EKS
Configure the CI pipeline to be triggered only when a PR is raised.
Configure the CD pipeline to be triggered only when a PR is merged.
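A declarative Jenkinsfile sketch for the CI part could look like the following. It assumes a Maven build with the Jacoco plugin configured, a DockerHub credential stored in Jenkins under the id dockerhub-creds, and an illustrative image name; all of these are assumptions, not the actual pipeline of this project:

    pipeline {
        agent any

        environment {
            IMAGE = "yourdockerhubuser/beer-order-service:${env.BUILD_NUMBER}" // illustrative image name
        }

        stages {
            stage('Build') {
                steps {
                    sh 'mvn -B clean package -DskipTests'
                }
            }
            stage('Unit Tests & Jacoco') {
                steps {
                    sh 'mvn -B test jacoco:report'
                }
            }
            stage('Docker Build & Push') {
                steps {
                    withCredentials([usernamePassword(credentialsId: 'dockerhub-creds',
                                                      usernameVariable: 'DOCKER_USER',
                                                      passwordVariable: 'DOCKER_PASS')]) {
                        sh 'docker build -t $IMAGE .'
                        sh 'echo $DOCKER_PASS | docker login -u $DOCKER_USER --password-stdin'
                        sh 'docker push $IMAGE'
                    }
                }
            }
        }
    }

The CD pipeline would then typically log in to AWS, run aws eks update-kubeconfig for the target cluster, and apply the Kubernetes manifests with kubectl; the feature/* and PR triggers are usually handled with a multibranch pipeline and GitHub webhooks.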
Azure DevOps vs Jenkins
- Group Tasks – Azure DevOps lets you combine a sequence of tasks already defined in a pipeline into a single reusable task group. Jenkins has no direct equivalent, so shared steps tend to be maintained ad hoc by individual users, which can lead to tracking and accountability problems.
- YAML Interface – With YAML in Azure Pipelines you can configure the CI/CD pipeline as code (see the sketch after this list), whereas Jenkins does not have a YAML interface.
- Platform, language, and cloud – Azure Pipelines can build many application types, including Node.js, Android, iOS, Java, and Python, and deploy them on-premises or to AWS, Azure, or GCP. With Jenkins you get scripted pipelines that must be programmed in Groovy.
- Analytics – Azure Pipelines provides run analytics such as pass rate and duration at the end of each run. Jenkins does not provide built-in analytics.
- Plugins and Tasks – Azure DevOps extensions can be downloaded from its marketplace, while Jenkins offers a very wide range of plugins to choose from.
- Integration – Azure Pipelines integrates easily with Microsoft products but requires configuration changes to integrate with non-Microsoft products. Jenkins, on the other hand, can easily be modified and extended.
- Easy Support – Since Jenkins is open source, it has huge support from the community.
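For illustration, a minimal azure-pipelines.yml of the kind referred to in the YAML point above might look like this (the trigger branch, hosted image, and build command are placeholders):

    trigger:
      - main

    pool:
      vmImage: 'ubuntu-latest'

    steps:
      - script: mvn -B clean package
        displayName: 'Build and test'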
Who wins the battle?
The answer boils down to the team and the project you work on. While Jenkins is more flexible for creating and deploying complex workflows, Azure DevOps is quicker to adopt. In many cases organizations use both tools, and Azure Pipelines supports integration with Jenkins.