Friday, September 16, 2022

Basic Questions From Java and MySQL

 


MySQL

What is an INDEX?

Which INDEX types do you use most, and how do you create an INDEX?

What are pessimistic and optimistic locking?


Java

What are StringBuffer and StringBuilder, and how do they differ?


Friday, July 1, 2022

SAGA Pattern: Design, Implementation Path

Choreography or Orchestration

Steps (Microservices and Persistent Layer)

Events 

States

Bounded Contexts

Actors

Compensating Transactions

Eventual Completion (Happy Path/ Unhappy Path)



Pivot Transaction - Once it commits, the following transactions must run to completion. For example, once payment is done, seat allocation must be completed.

Retryable Transactions - The transactions that follow the pivot transaction. Once seat allocation is done, the next ones are processed (and retried until they succeed).

Identify the list of idempotent transactions.


  • Compensating transactions must be idempotent and commutative, and they cannot abort (they must be retried indefinitely or resolved through manual intervention when necessary).
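As a rough illustration of the idempotency requirement, here is a minimal Java sketch (class and method names are hypothetical) of a compensating "refund payment" step that can safely be re-delivered and retried:

import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

public class PaymentCompensation {

    // Remembers which saga instances have already been compensated.
    private final Set<String> compensatedSagas = ConcurrentHashMap.newKeySet();

    // Invoked when a step after the pivot transaction fails and payment must be undone.
    // Safe to call repeatedly with the same sagaId (idempotent), and it never aborts:
    // the caller keeps retrying until it succeeds or escalates to manual intervention.
    public void compensate(String sagaId, String paymentId) {
        if (!compensatedSagas.add(sagaId)) {
            return; // already refunded for this saga instance - redelivery is a no-op
        }
        refund(paymentId);
    }

    private void refund(String paymentId) {
        // Call the payment provider / mark the refund in the database here.
        System.out.println("Refund issued for payment " + paymentId);
    }
}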


Saturday, June 18, 2022

Using Java for Date, Time, Duration

We have been using the packages and classes below for date and time handling for a long time. Since the Java 8 release there is a big relief for developers: date and time manipulation now takes much less code.

We will compare them; a short sketch follows the list below.

java.util.* - The old date and time package.

java.sql.* - Extends the date/time classes of java.util (for example java.sql.Date, java.sql.Timestamp).

java.time.* - The new Java 8 package with a lot of convenient features.

Joda-Time API - A third-party library that filled the gaps in the java.util date/time classes.
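A minimal side-by-side sketch of the old and new styles (illustrative only):

import java.time.Duration;
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.Period;
import java.util.Calendar;

public class DateTimeComparison {
    public static void main(String[] args) {
        // Old style: add 10 days with java.util.Calendar (mutable and verbose)
        Calendar calendar = Calendar.getInstance();
        calendar.add(Calendar.DAY_OF_MONTH, 10);
        System.out.println(calendar.getTime());

        // Java 8 style: immutable, fluent java.time API
        LocalDate inTenDays = LocalDate.now().plusDays(10);
        Period age = Period.between(LocalDate.of(1990, 1, 1), LocalDate.now());
        Duration twoHours = Duration.ofHours(2);

        System.out.println(inTenDays);
        System.out.println(age.getYears() + " years");
        System.out.println(LocalDateTime.now().plus(twoHours));
    }
}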

Monday, June 6, 2022

Spring Boot, Jenkins, Docker on AWS ECS

Without Kubernetes, how can we manage and orchestrate a cluster of containers in AWS? That is where AWS ECS comes in.

AWS Beanstalk: Quicker way to move applications to AWS Cloud

AWS Elastic Beanstalk:

Do you want to deploy your applications (Java, .NET, Node.js) in the AWS cloud easily? Then the best way to go is Beanstalk.

Just upload your application artifact (a JAR file for a Java application) and create the application. Beanstalk automatically provisions all required resources, deploys the application, and provides the URL to access it. Deployment is very fast: you do not have to struggle with defining your EC2, scaling, and load-balancing configuration yourself, and you still have full control over the created resources such as the EC2 instances.

Deploy via AWS Console: 

Pre-requisites: 

  1. Have your application built and ready. In my case it is a simple Spring Boot application with REST API endpoints (a minimal sketch follows this list). Please do not set server.port=8080 explicitly; leave it at the default.
  2. Test your application locally with java -jar app.jar and confirm it works.
  3. Have your AWS account and credentials ready, and log in to the AWS Console.
  4. Check that you have a VPC ready; if you do not, create a default VPC.
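For reference, a minimal Spring Boot REST API of the kind deployed here might look like the sketch below (class and endpoint names are illustrative, not taken from the actual application):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
public class DemoApiApplication {

    public static void main(String[] args) {
        SpringApplication.run(DemoApiApplication.class, args);
    }

    // server.port stays at the default 8080 locally; on Beanstalk it is
    // overridden through the SERVER_PORT environment property set later.
    @GetMapping("/hello")
    public String hello() {
        return "Hello from Elastic Beanstalk";
    }
}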

Steps to Deploy:

  1. Log in to the AWS Console, search for Elastic Beanstalk, and open it.
  2. Click the Create environment button.
  3. Choose Web server environment, as our app is a web API, and click the Select button.
  4. In the application details page:
    1. Fill in the Application Name, for example java-balaji-api.
    2. Fill in the Environment Name, for example java-balaji-api-dev.
    3. Provide a description.
    4. Choose the Platform; in our case it is a Java Spring Boot based API, so select Java.
    5. Choose the Platform Branch; my app is built with OpenJDK 11, so I selected "Corretto 11 running on 64bit Amazon Linux 2".
    6. Platform Version: I chose the default (Recommended).
    7. In the Application code section, select "Upload your code".
    8. Fill in the version label, choose Local file, and upload the JAR from its local folder location.
  5. Configure more options:
    1. Click Software and then the Edit button.
    2. In Environment properties, add a new property with name SERVER_PORT and value 5000 (the Beanstalk Java platform forwards traffic to port 5000 by default).
    3. Then click the Save button.
  6. Then click Create application.

It will provision EC2 for compute and S3 for the code (the JAR file), then deploy and start the application and provide the URL.

Test from Postman: 

From Postman, replace localhost:8080 with the URL above and test your application.



Terminate and Remove: 

If you want to terminate the application, select the environment and choose Terminate. This terminates only the EC2 instance. To remove your application from S3 as well, open the application and click the "Delete" button.


Thursday, June 2, 2022

Vault : Another product from HashiCorp for Secrets Management across Infrastructure and Applications

Terraform : Infrastructure as Code : Quick Example

Here we will see:

  • Install Terraform on a Windows machine - manual

  • Install Terraform on AWS EC2 - manual

  • Install Docker on AWS EC2 using a Terraform script

  • Install Jenkins on AWS EC2 using a Terraform script

  • Install and provision AWS components using a Terraform script

  • Upload the Terraform script (code) into GitHub

  • terraform.tfstate - tracking, security, access provisioning

  • Terraform Registry and providers

  • Change infrastructure using the script

  • Destroy infrastructure using the script - terraform destroy

  • Terraform Cloud - allows teams to easily version, audit, and collaborate on infrastructure changes. It also securely stores variables, including API tokens and access keys, and provides a safe, stable environment for long-running Terraform processes.


For more hands-on experience with the Terraform configuration language, resource provisioning, or importing existing infrastructure, review the tutorials below.

  • Configuration Language - Get more familiar with variables, outputs, dependencies, meta-arguments, and other language features to write more sophisticated Terraform configurations.

  • Modules - Organize and re-use Terraform configuration with modules.

  • Provision - Use Packer or Cloud-init to automatically provision SSH keys and a web server onto a Linux VM created by Terraform in AWS.

  • Import - Import existing infrastructure into Terraform.



DevOps and DevSecOps

Do we need a repository for artifacts?

Do we need a repository only for dependencies, and not for applications?

What is Infrastructure as Code, and how does Terraform help?

DevOps :

Culture : 

     The development, infra, IT, business, and testing teams work as a single unit.

Best Practices 

    Developers are involved in IT operations

    IT operations staff are involved in development

    Version control for code, infra, and repos

    Adapt to new changes quickly across code, IT, process, and infra

    Everything automated - code, CI, CD, testing, reports

    Containerized tools and environments - dev/prod parity

Tools

    Source control - Code Repo, DockerHub, Access provisioning

    CI/CD - Jenkins Docker Images, Security, Access provisioning

    Testing 

    Code Coverage Tools 

    Configuration Management and Tools

    Binary Management/ Artifactory

    Monitoring 

    Security

    Collaboration



    

Wednesday, June 1, 2022

Spring Boot, Jenkins, Docker, Kubernetes on AWS EKS

Here we will

  1. Simple Spring Webflux based API
  2. Containerize the application using Docker
  3. Create CI Jenkins Pipeline 
    1. Build
    2. Run Unit Tests
    3. Run Jacoco Reports
    4. Create Docker Image and Push to DockerHub
  4. Create CD Pipeline
    1. Login to AWS
    2. Pull Docker Image from Docker Hub and create container
    3. Deploy in AWS EKS
  5. Configure the CI pipeline to be triggered only when code is checked in to any feature/* branch
  6. Configure the CI pipeline to be triggered only when a PR is raised
  7. Configure the CD pipeline to be triggered only when a PR is merged

1. Simple Spring Webflux based API

For the sample API, please check out the code from the repository.

  1. Please check the README.md for how to build it locally.
  2. Use Beer-Service.postman_collection.json to test the application with Postman.

2. Containerize the application using Docker

    Add a file named "Dockerfile" in the application folder location.
    Add the configuration below:


FROM adoptopenjdk/openjdk11
COPY build/libs/*SNAPSHOT.jar beerOrderService.jar
EXPOSE 8080
CMD ["java", "-jar", "beerOrderService.jar"]

    

In the file above we use the base image adoptopenjdk/openjdk11.
We copy the built JAR file as beerOrderService.jar.
We expose the port the application listens on.
The CMD line is the command that runs the application once the container has started.


3. Create CI Jenkins Pipeline 

  1. Build
  2. Run Unit Tests
  3. Run Jacoco Reports
  4. Create Docker Image and Push to DockerHub

Create Jenkins 


Create CD Pipeline
  1. Login to AWS
  2. Pull Docker Image from Docker Hub and create container
  3. Deploy in AWS EKS
Configure the CI pipeline to be triggered only when code is checked in to any feature/* branch
Configure the CI pipeline to be triggered only when a PR is raised
Configure the CD pipeline to be triggered only when a PR is merged




    

Azure DevOps vs Jenkins

 

Azure DevOps vs Jenkins

  • Group Tasks – Azure DevOps lets you group a sequence of tasks already defined in a pipeline into a single task, whereas in Jenkins this is generally done by a single user, which leads to tracking and accountability problems.
  • YAML Interface – With YAML in Azure Pipelines, you can configure CI/CD pipeline as code, whereas Jenkins doesn’t have a YAML interface.
  • Platform, language, and cloud – In Azure Pipelines, you can build and deploy various applications, including Node.js, Android, iOS, Java, Python, and many more, and then deploy to on-premises, AWS, Azure, or GCP. With Jenkins, you get scripted pipelines that must be programmed in Groovy.
  • Analytics in Azure Pipelines is provided at the end with two parameters – rate and duration of the run. Jenkins doesn’t provide any analytics.
  • Plugins and Tasks – The built-in plugins and extensions can be downloaded from Azure DevOps marketplace. Jenkins has a wide range of plugins to choose from.
  • Integration of Azure Pipelines with Microsoft is easy, but requires configuration changes to integrate with non-Microsoft products. Jenkins, on the other hand, can easily be modified and extended.
  • Easy Support – Since Jenkins is open source, there is huge support from agile teams.

Who wins the battle?

The battle boils down to the team or the project you work on. While Jenkins is more flexible for creating and deploying complex workflows, Azure DevOps is faster to adopt. In most cases organizations use both tools, and in such cases Azure Pipelines supports integration with Jenkins.



Concerns with Jenkins: 

  • Workarounds needed for basic requirements (12)
  • Groovy with cumbersome syntax (9)
  • Plugins compatibility issues (7)
  • Lack of support (6)
  • Limited abilities with declarative pipelines (6)
  • No YAML syntax (4)
  • Too tied to plugins versions (3)


Sample Jenkinsfile:

pipeline {
    agent none
    stages {
        stage('Build') {
            steps {
                sh 'npm install'
                sh 'npm run build'
            }
        }
        stage('Test') {
            steps {
                sh 'npm test'
            }
        }
    }
}


Azure-pipeline.yaml:

jobs:
- job: Build
  steps:
  - script: npm install
  - script: npm run build
- job: Test
  steps:
  - script: npm test


If we containerize our applications:
Jenkinsfile:

pipeline {
    agent none
    stages {
        stage('Build') {
            agent {
                docker {
                    image 'ubuntu:trusty'
                    args '-v $HOME:/build -w /build'
                }
            }
            steps {
                sh 'make'
            }
        }
        stage('Test') {
            agent {
                docker {
                    image 'ubuntu:xenial'
                    args '-v $HOME:/build -w /build'
                }
            }
            steps {
                sh 'make test'
            }
        }
    }
}

Azure-pipeline.yaml:

resources:
  containers:
  - container: trusty
    image: ubuntu:trusty
  - container: xenial
    image: ubuntu:xenial

jobs:
- job: build
  container: trusty
  steps:
  - script: make
- job: test
  dependsOn: build
  container: xenial
  steps:
  - script: make test


After completion - Jenkinsfile:

post {
    always {
        echo "The build has finished"
    }
    success {
        echo "The build succeeded"
    }
    failure {
        echo "The build failed"
    }
}

Azure-pipeline.yaml:

jobs:
- job: always
  steps:
  - script: echo "The build has finished"
  condition: always()
- job: success
  steps:
  - script: echo "The build succeeded"
  condition: succeeded()
- job: failed
  steps:
  - script: echo "The build failed"
  condition: failed()


Reference: https://blog.opstree.com/2021/04/13/jenkins-vs-azure-devops/

Jenkins CI Pipeline for SpringBoot Application : Pipeline Script VS Pipeline Script from SCM

Here we will create CI and CD pipelines to build and deploy the application using Jenkins pipelines.

We can create them in two ways:

    1. Pipeline Script 

    2. Pipeline Script from SCM

    Saturday, May 28, 2022

    Docker : Docker, DockerCompose and DockerRun and params

    Docker : Plugins comparison : palantir VS bmuschko

     https://plugins.gradle.org/search?term=com.palantir.docker


    https://palantir.github.io/

    https://tomgregory.com/bmuschko-docker-gradle-plugin-review/


    https://tomgregory.com/automating-docker-builds-with-gradle/



    Thursday, May 26, 2022

    RestControllerAdvice VS ControllerAdvice VS ExceptionHandler - Interrupt the StackTraces, LOG.ERROR

@ExceptionHandler can be used at the local level or at the global level. Local level means using this annotation within the controller itself to handle exceptions within that controller only. All errors thrown by that controller would be caught by that @ExceptionHandler. But this also means that if a similar exception occurs in a different controller, you would have to rewrite the corresponding handling code locally in that controller as well.

To avoid repeating this style of exception handling per controller, we can write the @ExceptionHandler at the global level with the help of another annotation, @ControllerAdvice.

@ControllerAdvice is not specific to exception handling; it is also used for handling property, validation, or formatter bindings at the global level. In the context of exception handling, @ControllerAdvice is just another way of handling exceptions globally using the @ExceptionHandler annotation.

Now coming to HandlerExceptionResolver: this is a lower-level interface. Spring provides two implementations of it:

    • ResponseStatusExceptionResolver - supports the @ResponseStatus annotation
    • ExceptionHandlerExceptionResolver - supports the @ExceptionHandler annotation

Example: when you want to handle exceptions and choose an exception-handling strategy, you need to decide between local and global exception handling via the annotations, how to provide the HTTP status codes, how to wrap the result in a ResponseEntity, and how you want to redirect to handler pages and carry data via flash attributes or GET params. Or you may skip the annotations, use SimpleMappingExceptionResolver, and map specific exceptions to error-handler page URLs.

Here we are not considering the lower-level HandlerExceptionResolver at this stage, since we are dealing with its higher-level implementations and building the strategy based on these options.

With the above context, to answer the query: @ControllerAdvice was not introduced solely for exception handling; it is a mechanism you can leverage to handle exceptions globally using @ExceptionHandler. HandlerExceptionResolver is an interface whose implementations support the @ResponseStatus and @ExceptionHandler annotations. You typically only need it directly when you want to handle exceptions arising from the MVC machinery itself, for example issues related to an incorrect @RequestMapping that would never be caught by a controller because the request does not even reach it in the first place; in that case an implementation of HandlerExceptionResolver is useful.
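As a minimal sketch of the global approach described above (class names and exception choices are illustrative), a @RestControllerAdvice might look like this:

import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.bind.annotation.RestControllerAdvice;

// Global exception handling: applies to all controllers instead of
// repeating @ExceptionHandler methods inside each one.
@RestControllerAdvice
public class GlobalExceptionHandler {

    @ExceptionHandler(IllegalArgumentException.class)
    public ResponseEntity<String> handleBadRequest(IllegalArgumentException ex) {
        // Log once here (LOG.error) instead of letting the stack trace bubble up.
        return ResponseEntity.status(HttpStatus.BAD_REQUEST).body(ex.getMessage());
    }

    @ExceptionHandler(Exception.class)
    public ResponseEntity<String> handleUnexpected(Exception ex) {
        return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
                .body("Unexpected error");
    }
}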

    Order Service API - Simple Microservice for Order Creation

    Git Repo:

    https://github.com/balajimathu/beer-order-service





Create the APIs below using Spring WebFlux.

1. Order Service - will invoke the APIs below; use spring-boot-starter-validation for request validation (see the DTO sketch after this list).

2. Inventory Service - search and confirm availability; if not available, the Order Service returns a message.

3. Price Service - search and get the latest price; store actual prices in the DB; implement Spring Data to apply configurable seasonal offers.

4. Shipment Service - on successful creation, send mail; use @Email validation.

5. Payment Service - insert data into the DB and return a transaction id.

6. Implement @ControllerAdvice - reactive.
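A minimal sketch of the request validation mentioned in items 1 and 4, with a purely hypothetical DTO:

import javax.validation.constraints.Email;   // javax.* for Spring Boot 2.x; jakarta.* for Boot 3
import javax.validation.constraints.Min;
import javax.validation.constraints.NotBlank;

// Hypothetical request body validated by spring-boot-starter-validation
// when annotated with @Valid in the controller method.
public class OrderRequest {

    @NotBlank
    private String beerName;

    @Min(1)
    private int quantity;

    @Email // used later by the Shipment Service notification
    private String customerEmail;

    // getters and setters omitted
}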


Implement a CommandLineRunner to check the availability of the dependency APIs on startup; a rough sketch follows.
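A rough sketch of such a startup check (service URLs and class name are placeholders, not from the actual project):

import org.springframework.boot.CommandLineRunner;
import org.springframework.stereotype.Component;
import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Mono;

@Component
public class DependencyApiCheck implements CommandLineRunner {

    private final WebClient webClient = WebClient.create();

    @Override
    public void run(String... args) {
        String[] dependencies = {
                "http://inventory-service:8080/actuator/health",
                "http://price-service:8080/actuator/health"
        };
        for (String url : dependencies) {
            webClient.get().uri(url)
                    .retrieve()
                    .toBodilessEntity()
                    .doOnSuccess(response -> System.out.println("UP: " + url))
                    .doOnError(error -> System.err.println("DOWN: " + url + " - " + error.getMessage()))
                    .onErrorResume(error -> Mono.empty())
                    .block(); // blocking is acceptable during startup, not on a request path
        }
    }
}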

    Database:

    Postgres


    Platform:

    Create Docker Images for REST APIs and Postgres DB

    Run Jenkins on AWS EC2

    CI and CD Pipelines

Deploy all of these in separate containers.

Authentication - no auth as of now.




    Wednesday, May 25, 2022

    Redis : Spring Data Redis

    Spring Data Reactive : R2DBC Entity Callback AfterConvertCallback

     

    R2DBC Entity Callback – AfterConvert:

Let's implement the other requirement: apply a seasonal global discount to all products when we select records from the DB. Here we do not want to touch the database; we still want to keep the original price as it is. We just want to adjust the price once it has been retrieved from the DB.

The AfterConvertCallback hook would be a good choice here.

If we also register other callbacks, such as a BeforeConvertCallback, our implementation should implement Ordered as well, to return the order in which it should be executed.
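A minimal sketch of such a callback, assuming a hypothetical Product entity with a mutable price field:

import org.reactivestreams.Publisher;
import org.springframework.data.r2dbc.mapping.event.AfterConvertCallback;
import org.springframework.data.relational.core.sql.SqlIdentifier;
import org.springframework.stereotype.Component;
import reactor.core.publisher.Mono;

// Hypothetical entity, trimmed to the field the callback touches.
class Product {
    private double price;
    public double getPrice() { return price; }
    public void setPrice(double price) { this.price = price; }
}

@Component
public class ProductDiscountCallback implements AfterConvertCallback<Product> {

    private static final double SEASONAL_DISCOUNT = 0.10; // 10% off, applied at read time only

    @Override
    public Publisher<Product> onAfterConvert(Product product, SqlIdentifier table) {
        // Adjust the mapped object after it is read; the value stored in the DB stays untouched.
        product.setPrice(product.getPrice() * (1 - SEASONAL_DISCOUNT));
        return Mono.just(product);
    }
}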



    R2DBC Entity Callback – BeforeSave:

Sometimes the actual table has more columns than the entity object contains - for example fields like created_by and created_date. But we might still want to populate these columns when saving. In this case BeforeConvertCallback will not help much, but we can achieve it with BeforeSaveCallback.
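A rough sketch, again assuming the hypothetical Product entity from the previous sketch; exact imports can differ slightly between Spring Data R2DBC versions:

import org.reactivestreams.Publisher;
import org.springframework.data.r2dbc.mapping.OutboundRow;
import org.springframework.data.r2dbc.mapping.event.BeforeSaveCallback;
import org.springframework.data.relational.core.sql.SqlIdentifier;
import org.springframework.r2dbc.core.Parameter;
import org.springframework.stereotype.Component;
import reactor.core.publisher.Mono;

// Uses the Product entity defined in the previous sketch.
@Component
public class ProductAuditCallback implements BeforeSaveCallback<Product> {

    @Override
    public Publisher<Product> onBeforeSave(Product entity, OutboundRow row, SqlIdentifier table) {
        // OutboundRow holds the column values about to be written, so we can add
        // columns (e.g. created_by) that exist in the table but not in the entity.
        row.put(SqlIdentifier.unquoted("created_by"), Parameter.from("system"));
        return Mono.just(entity);
    }
}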