
Auto deployment to AWS EBS using Bitbucket Pipelines

Want to be free from manually building and deploying your projects?

How convenient would it be for developers and DevOps engineers if their projects were automatically built and deployed to a server (AWS Elastic Beanstalk, "EBS" below) on a single click, or whenever a new commit lands on a branch? Yes, it is possible to build your project automatically and deploy it to EBS via Bitbucket Pipelines. Let’s have a look at that.

Enable pipelines for repository

First of all, we have to enable Pipelines for the Bitbucket repository; it is disabled by default for every repository. We can enable it from the Bitbucket repository settings --> Pipeline settings

Choose a language template

Pipelines are configured via the bitbucket-pipelines.yml file. Pick a language template for the pipeline you want to write. Here we select Java (Maven) to deploy our Java project. You will get a predefined Maven-based pipeline template with a default Java version. We then have to configure custom pipeline steps for our project. If you want a different JDK version for building your project, you can customize it by updating the Maven Docker image tag to one with the desired jdk-version suffix.

 Example: maven:3.6.1-jdk-8 
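With that image tag, a minimal bitbucket-pipelines.yml might look like the sketch below (the step name is illustrative):

```yaml
# Minimal sketch: pin the build container to Maven 3.6.1 with JDK 8
image: maven:3.6.1-jdk-8

pipelines:
  default:
    - step:
        name: Build
        caches:
          - maven
        script:
          - mvn -B clean install   # -B = non-interactive (batch) mode
```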

Let’s build a project with pipelines configuration

To build a Maven project via Bitbucket Pipelines, we have to run the basic Maven commands to clean, build, and update the project. Additionally, we can also perform operations such as creating the build, customizing the artifact, and copying environment-specific properties files for different environments. Let's have a look at how.

Clone dependent project
  • Let’s say you have a requirement to clone another repository in order to build your project with Maven. In that case, you have to clone that repository into the Maven Docker container via SSH from within the pipeline
    • To clone another repository into your Maven container, you have to configure SSH credentials for it
    • Configure SSH credentials for cloning the repository
      • Say I have two repositories, A and B, and building repository A requires the dependent repository B. Before building A, I need to clone B into the container
      • Open repository A --> Settings --> PIPELINES --> SSH keys --> Generate SSH key
      • Open repository B --> Settings --> Access keys --> Add the SSH key generated in repository A
      • Add the command below to your pipeline script to clone repository B into A's build container
         - git clone -b branch-name git@bitbucket.org:team/reponame.git (copy the clone URL after “bitbucket.org/”) 
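In the pipeline, the clone simply becomes the first script command of the build step. A sketch, with placeholder branch and repository names:

```yaml
# Illustrative step: clone the dependent repository B, then build A.
# "team/repo-b" and "master" are placeholders for your own values.
pipelines:
  default:
    - step:
        name: Build with dependency
        script:
          - git clone -b master git@bitbucket.org:team/repo-b.git
          - mvn -B clean install
```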
Change the properties file per environment at build time
  • Whenever we have more than one deployment environment, each with its own properties file, the steps below make this scenario easier to handle
    • Create the properties file that your project configuration refers to
    • Create a separate properties file for each environment
    • Copy the environment-specific file (for the environment you are building the project for) over the general properties file at the path given in your project configuration
    • Example: if you are building the project for the production environment, copy the production-specific file over the properties file before building the artifact
       - cp application.properties.prod application.properties 
  • Build project

Note: you must run the command in the directory where your pom.xml resides

    • You can simply build your project with Maven commands, for example
       - mvn clean install 
  • Create an artifact of the build
    • To deploy your build, you have to generate a downloadable/deployable artifact
    • Pipelines provide the functionality to declare your build output as an artifact
    • You have to follow the below configuration for the same

      After these steps complete, you will have a deployable artifact which you can download and deploy yourself, or deploy to AWS EBS

      artifacts:
         - target/app.war
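Putting the steps above together, the whole build step might look like this sketch (file and step names follow the examples in this walkthrough):

```yaml
# Sketch of the full build step: swap in the production properties file,
# build with Maven, and publish the WAR as a downloadable artifact
image: maven:3.6.1-jdk-8

pipelines:
  default:
    - step:
        name: Build artifact
        script:
          - cp application.properties.prod application.properties
          - mvn -B clean install
        artifacts:
          - target/app.war
```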

Deploy the artifact to AWS EBS through Bitbucket Pipelines

By default, Bitbucket Pipelines provides the functionality to deploy artifacts to AWS with only minor configuration.

I would suggest putting the script that deploys the build to AWS in a separate step that is triggered manually.


  • Internally, your build is first uploaded to an S3 bucket, so you have to create a bucket for this purpose
  • You can add the below configuration to define the manually triggered step

Whenever you run this deployment step, Pipelines automatically downloads the artifact created in the first step. You just need to add the configuration below for deployment

 - step:
    trigger: manual
    script:
      - pipe: atlassian/aws-elasticbeanstalk-deploy:0.2.1
        variables:
           AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
           AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
           AWS_DEFAULT_REGION: $AWS_REGION
           APPLICATION_NAME: $APPLICATION_NAME
           LOCAL_PATH: "local path where your build resides (EX. target)"
           S3_BUCKET: $BUCKET_NAME
           ENVIRONMENT_NAME: $EBS_ENVIRONMENT_NAME
           ZIP_FILE: "zip file path from the Docker home (EX. target/app.war)"
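For context, here is a sketch of how the build and deploy steps might fit together in one bitbucket-pipelines.yml (the variable names follow the snippet above and are assumed to be defined as repository variables):

```yaml
image: maven:3.6.1-jdk-8

pipelines:
  default:
    # Step 1: build the project and publish the WAR as an artifact
    - step:
        name: Build
        script:
          - mvn -B clean install
        artifacts:
          - target/app.war
    # Step 2: deploy to Elastic Beanstalk, run only when triggered manually
    - step:
        name: Deploy to EBS
        trigger: manual
        script:
          - pipe: atlassian/aws-elasticbeanstalk-deploy:0.2.1
            variables:
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              AWS_DEFAULT_REGION: $AWS_REGION
              APPLICATION_NAME: $APPLICATION_NAME
              ENVIRONMENT_NAME: $EBS_ENVIRONMENT_NAME
              S3_BUCKET: $BUCKET_NAME
              ZIP_FILE: "target/app.war"
```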

Manually trigger the pipeline for a branch

Bitbucket also provides the functionality to run a pipeline manually. Follow the steps below for the same

  • Go to the bitbucket repository
  • Look for the branch for which you have configured the pipeline --> click the dots (right-hand side of the branch) --> Run pipeline for a branch

  • Select the branch for which you want to run the pipeline

Conclusion

   It's great for developers and DevOps engineers to be freed from repetitive build and deployment work. With this setup, they only need a few clicks to build the project and deploy the code to EBS.