Saturday, June 18, 2022

Using Java for Date, Time, Duration

We have been using the packages and classes below for date and time handling for a long time. But since the Java 8 release there is big relief for developers: they can do date and time manipulation with far less code.

We will compare them; a short code sketch follows the list below.

java.util.* - The old date and time package (Date, Calendar, etc.).

java.sql.* - Extends the classes from the java.util.* package (for example, java.sql.Date extends java.util.Date).

java.time.* - The new Java 8 package with lots of convenient features.

Joda-Time API - A third-party library that filled the gaps in the java.util packages.
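
As a minimal sketch (my own, not from the original post), here is the same job, adding ten days to today and measuring a duration, in both the old and the new API:

import java.time.Duration;
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.util.Calendar;

public class DateTimeComparison {
    public static void main(String[] args) {
        // Old way: java.util.Calendar is mutable and verbose
        Calendar cal = Calendar.getInstance();
        cal.add(Calendar.DAY_OF_MONTH, 10);
        System.out.println("Old API: " + cal.getTime());

        // New way: java.time types are immutable and fluent
        LocalDate inTenDays = LocalDate.now().plusDays(10);
        System.out.println("New API: " + inTenDays);

        // Duration measures the time between two points on the timeline
        LocalDateTime start = LocalDateTime.now();
        LocalDateTime end = start.plusHours(2).plusMinutes(30);
        System.out.println("Duration: " + Duration.between(start, end)); // PT2H30M
    }
}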

Monday, June 6, 2022

Spring Boot, Jenkins, Docker on AWS ECS

Without Kubernetes, how can we manage and orchestrate a cluster of containers in AWS?

AWS Beanstalk: Quicker way to move applications to AWS Cloud

AWS Beanstalk:

Do you want to deploy your applications (Java, .NET, Node.js) to the AWS cloud easily? Then the best way is to go with Beanstalk.

Just upload your application (a JAR file for a Java application) and create the application. Beanstalk automatically provisions all required resources, deploys them, and provides the URL to access the application. Deployment is very fast, without the struggle of defining your EC2, scaling, and load-balancing configuration, and you still have full control over the created resources such as EC2 instances.

Deploy via AWS Console: 

Pre-requisites: 

  1. Have your application built and ready. In my case it is a simple Spring Boot application with REST API endpoints (a minimal sketch follows this list). Please do not set server.port=8080; leave the port at its default.
  2. Test your application locally using java -jar app.jar and confirm it works fine.
  3. Have your AWS account and credentials ready, and log in to the AWS Console.
  4. Check that you have a VPC ready; if you do not, please create a default VPC.
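
For reference, here is a minimal sketch of such an application; the class name and the /hello endpoint are illustrative placeholders, not from the actual project:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

// Minimal Spring Boot app with one REST endpoint; the port is left at the
// default (8080) so Beanstalk's SERVER_PORT property can override it later.
@SpringBootApplication
@RestController
public class DemoApplication {

    @GetMapping("/hello")
    public String hello() {
        return "Hello from Beanstalk";
    }

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}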

Steps to Deploy:

  1. Log in to the AWS Console, search for Elastic Beanstalk, and click it.
  2. Click the "Create environment" button.
  3. Choose "Web server environment", as our app is a web API, and click the Select button.
  4. In the application details page:
    1. Fill in Application Name, e.g., java-balaji-api
    2. Fill in Environment Name, e.g., java-balaji-api-dev
    3. Provide a description
    4. Choose Platform; in our case it is a Java Spring Boot based API, so select Java
    5. Choose Platform Branch: my app is built with OpenJDK 11, so I selected "Corretto 11 running on 64bit Amazon Linux 2"
    6. Platform Version: I chose the default (Recommended)
    7. Application Code tab: select "Upload your code"
    8. Fill in the version, choose "Local file", and upload the JAR from its local folder location
  5. Configure More Options
    1. Click Software and then the Edit button
    2. Under Environment properties, add a new variable SERVER_PORT with value 5000 (Beanstalk's nginx proxy forwards requests to port 5000 by default, and Spring Boot picks up the SERVER_PORT environment variable as server.port)
    3. Then click the Save button
  6. Then click "Create application".

It will provision EC2 for compute and S3 for the code (JAR file), then deploy and start the application and provide the URL.

Test from Postman: 

From Postman, replace localhost:8080 with the URL above and test your application.
In my case:
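
If you prefer code to Postman, the same smoke test can be written with Java 11's built-in HttpClient; the URL and the /hello path below are placeholders for your own environment's values:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SmokeTest {
    public static void main(String[] args) throws Exception {
        // Replace with the URL Beanstalk generated for your environment
        String url = "http://java-balaji-api-dev.us-east-1.elasticbeanstalk.com/hello";
        HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}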



Terminate and Remove: 

If you want to terminate the application, choose it and terminate. This terminates only the EC2 instance. To remove your application from S3 as well, click the application and then the "Delete" button.


Thursday, June 2, 2022

Vault: Another product from HashiCorp for Secrets Management across Infrastructure and Applications

Terraform: Infrastructure as Code: Quick Example

Here we will see:

  • Install Terraform on Windows Machine - Manual


  • Install Terraform on AWS EC2 - Manual

  • Install Docker on AWS EC2 using Terraform Script

  • Install Jenkins on AWS EC2 using Terraform Script

  • Install AWS Components and provisioning using Terraform Script

  • Upload the Terraform script (code) into GitHub

  • terraform.tfstate - Track, Security, Access Provisioning 

  • Terraform Registry and Providers

  • Change Infrastructure using Script

  • Destroy Infrastructure using Script - terraform destroy 

  • Terraform Cloud

Terraform Cloud allows teams to easily version, audit, and collaborate on infrastructure changes. It also securely stores variables, including API tokens and access keys, and provides a safe, stable environment for long-running Terraform processes.


For more hands-on experience with the Terraform configuration language, resource provisioning, or importing existing infrastructure, review the tutorials below.

  • Configuration Language - Get more familiar with variables, outputs, dependencies, meta-arguments, and other language features to write more sophisticated Terraform configurations.

  • Modules - Organize and re-use Terraform configuration with modules.

  • Provision - Use Packer or Cloud-init to automatically provision SSH keys and a web server onto a Linux VM created by Terraform in AWS.

  • Import - Import existing infrastructure into Terraform.



DevOps and DevSecOps

Do we need a repository for artifacts?

Do we need a repository only for dependencies and not for applications?

What is Infra as Code, and how does Terraform help?

DevOps :

Culture : 

     The development, infra, IT, business, and testing teams work as a single unit

Best Practices 

    Developers get involved in IT operations

    IT operations get involved in development

    Version control for code, infra, and repos

    Adapt to new changes quickly across code, IT, process, and infra

    Everything automated - code, CI, CD, testing, reports

    Containerized tools and environments - DEV-PROD parity

Tools

    Source control - Code Repo, DockerHub, Access provisioning

    CI/CD - Jenkins Docker Images, Security, Access provisioning

    Testing 

    Code Coverage Tools 

    Configuration Management and Tools

    Binary Management / Artifactory

    Monitoring 

    Security

    Collaboration



    

Wednesday, June 1, 2022

Spring Boot, Jenkins, Docker, Kubernetes on AWS EKS

Here we will:

  1. Simple Spring Webflux based API
  2. Containerize the application using Docker
  3. Create CI Jenkins Pipeline 
    1. Build
    2. Run Unit Tests
    3. Run Jacoco Reports
    4. Create Docker Image and Push to DockerHub
  4. Create CD Pipeline
    1. Login to AWS
    2. Pull Docker Image from Docker Hub and create container
    3. Deploy in AWS EKS
  5. Configure the CI pipeline to be triggered only when code is checked in to any feature/* branch
  6. Configure the CI pipeline to be triggered only when a PR is raised
  7. Configure the CD pipeline to be triggered only when a PR is merged

1. Simple Spring Webflux based API

For the sample API, please check out the code from

  1. Please check the README.md for how to build locally.
  2. Use Beer-Service.postman_collection.json for testing the application with Postman (a rough sketch of the API shape follows below).
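
For orientation, a minimal sketch of what one such WebFlux endpoint might look like; the controller and route below are illustrative and may differ from the actual repository:

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

// Illustrative reactive controller: Flux streams zero..N items
// without blocking, which is the core difference from Spring MVC.
@RestController
public class BeerController {

    @GetMapping("/api/v1/beers")
    public Flux<String> listBeers() {
        return Flux.just("Pale Ale", "Stout", "Lager");
    }
}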

2. Containerize the application using Docker

    Add a file named "Dockerfile" in the application folder location.
    Add the configuration below.


FROM adoptopenjdk/openjdk11
COPY build/libs/*SNAPSHOT.jar beerOrderService.jar
EXPOSE 8080
CMD ["java", "-jar", "beerOrderService.jar"]

    

In the file above we use the base image adoptopenjdk/openjdk11.
We copy the built JAR file into the image as beerOrderService.jar.
We expose the port on which the application needs to be reached.
The CMD line is the command that runs the application once the container starts.


3. Create CI Jenkins Pipeline 

  1. Build
  2. Run Unit Tests
  3. Run Jacoco Reports
  4. Create Docker Image and Push to DockerHub

Create Jenkins 


4. Create CD Pipeline

  1. Login to AWS
  2. Pull the Docker image from Docker Hub and create a container
  3. Deploy in AWS EKS

5. Configure the CI pipeline to be triggered only when code is checked in to any feature/* branch
6. Configure the CI pipeline to be triggered only when a PR is raised
7. Configure the CD pipeline to be triggered only when a PR is merged




    

Azure DevOps vs Jenkins

 


  • Group Tasks – Azure allows you to combine a sequence of tasks already defined in a pipeline into a single task, whereas in Jenkins this is generally done by a single user, which leads to tracking and accountability problems.
  • YAML Interface – With YAML in Azure Pipelines, you can configure your CI/CD pipeline as code, whereas Jenkins doesn't have a YAML interface.
  • Platform, language, and cloud – In Azure Pipelines, you can build various applications, including Node.js, Android, iOS, Java, Python, and many more, and then deploy to on-premise, AWS, Azure, or GCP. With Jenkins, you get scripted pipelines that must be programmed in Groovy.
  • Analytics – Azure Pipelines provides analytics at the end of a run with two parameters: rate and duration of the run. Jenkins doesn't provide any analytics.
  • Plugins and Tasks – Azure's built-in plugins and extensions can be downloaded from the Azure DevOps Marketplace. Jenkins has a wide range of plugins to choose from.
  • Integration – Azure Pipelines integrates easily with Microsoft products but requires configuration changes to integrate with non-Microsoft ones. Jenkins, on the other hand, can easily be modified and extended.
  • Easy Support – Since Jenkins is open source, there is huge support from agile teams.

Who wins the battle?

The battle boils down to the team or the project you work on. While Jenkins is more flexible for creating and deploying complex workflows, Azure DevOps is faster to adopt. In most cases organizations use both tools, and in such cases Azure Pipelines supports integration with Jenkins.



Concerns with Jenkins: 

  • Workarounds needed for basic requirements (12)
  • Groovy with cumbersome syntax (9)
  • Plugin compatibility issues (7)
  • Lack of support (6)
  • Limited abilities with declarative pipelines (6)
  • No YAML syntax (4)
  • Too tied to plugin versions (3)


Sample Jenkinsfile:

pipeline {
    agent any   // run on any available agent so the sh steps have a node
    stages {
        stage('Build') {
            steps {
                sh 'npm install'
                sh 'npm run build'
            }
        }
        stage('Test') {
            steps {
                sh 'npm test'
            }
        }
    }
}


azure-pipelines.yml:

jobs:
- job: Build
  steps:
  - script: npm install
  - script: npm run build
- job: Test
  steps:
  - script: npm test


If we containerize our applications:

Jenkinsfile:

pipeline {
    agent none
    stages {
        stage('Build') {
            agent {
                docker {
                    image 'ubuntu:trusty'
                    args '-v $HOME:/build -w /build'
                }
            }
            steps {
                sh 'make'
            }
        }
        stage('Test') {
            agent {
                docker {
                    image 'ubuntu:xenial'
                    args '-v $HOME:/build -w /build'
                }
            }
            steps {
                sh 'make test'
            }
        }
    }
}

azure-pipelines.yml:

resources:
  containers:
  - container: trusty
    image: ubuntu:trusty
  - container: xenial
    image: ubuntu:xenial

jobs:
- job: build
  container: trusty
  steps:
  - script: make
- job: test
  dependsOn: build
  container: xenial
  steps:
  - script: make test


Post-build actions - Jenkinsfile (the post section goes inside the pipeline block):

post {
    always {
        echo "The build has finished"
    }
    success {
        echo "The build succeeded"
    }
    failure {
        echo "The build failed"
    }
}
azure-pipelines.yml:

jobs:
- job: always
  steps:
  - script: echo "The build has finished"
  condition: always()
- job: success
  steps:
  - script: echo "The build succeeded"
  condition: succeeded()
- job: failed
  steps:
  - script: echo "The build failed"
  condition: failed()


Source: https://blog.opstree.com/2021/04/13/jenkins-vs-azure-devops/

Jenkins CI Pipeline for a Spring Boot Application: Pipeline Script vs Pipeline Script from SCM

     Here we will create CI and CD pipelines to build and deploy the application using Jenkins pipelines.

    We can create them in two ways:

    1. Pipeline Script - the pipeline definition is typed directly into the Jenkins job configuration.

    2. Pipeline Script from SCM - Jenkins reads a Jenkinsfile from the source repository, so the pipeline is versioned along with the code.