AWS CodeCommit is a secure, highly scalable, managed source control service that hosts private Git repositories. It makes it easy for teams to securely collaborate on code with contributions encrypted in transit and at rest. CodeCommit eliminates the need for you to manage your own source control system or worry about scaling its infrastructure. You can use CodeCommit to store anything from code to binaries. It supports the standard functionality of Git, so it works seamlessly with your existing Git-based tools.



  • An Amazon Linux EC2 instance
  • An IAM user with access to the AWS CodeCommit service


Launch an Amazon Linux EC2 instance. You can launch it without a key pair, since we will use the EC2 Instance Connect option; also make sure public IP assignment is enabled.

Right-click the instance and click Connect. This takes you to the EC2 Instance Connect option; click Connect.

It should launch the instance console in a new browser tab.


  • Select the newly created instance.
  • Click Connect at the top of the screen.
  • Leave the tab as EC2 Instance Connect and click Connect.
  • In the terminal, run the following command:
  • sudo yum update -y
  • Install Git:
  • sudo yum install git -y
  • Return to the AWS Management Console.
  • Navigate to IAM > Users.
  • Click on cloud_user.
  • Click Add permissions > Attach existing policies directly.
  • Type “AWSCodeCommit” in the filter search box.
  • Select the AWSCodeCommitFullAccess policy.
  • In the AWS Management Console, with cloud_user opened, click the Security credentials tab.
  • Under Access keys, click Create access key.
  • Click Download .csv file.
  • Click Close.
  • Under HTTPS Git credentials for AWS CodeCommit, click Generate.
  • Click Download credentials.

Click Close.


Log in to the server you created and run aws configure, providing the access key ID and secret access key from the CSV file downloaded under the Create access key option.

Once we can successfully connect using aws configure, we can use the CLI command below to create the repository.

aws codecommit create-repository --repository-name RepoFromCLI --repository-description "My demonstration repository"
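The create-repository call prints the new repository's metadata as JSON; the cloneUrlHttp value in it is the one copied later when cloning. An abbreviated, illustrative response (all values here are placeholders, not real account data):

```json
{
    "repositoryMetadata": {
        "repositoryName": "RepoFromCLI",
        "repositoryDescription": "My demonstration repository",
        "cloneUrlHttp": "https://git-codecommit.us-east-1.amazonaws.com/v1/repos/RepoFromCLI",
        "cloneUrlSsh": "ssh://git-codecommit.us-east-1.amazonaws.com/v1/repos/RepoFromCLI",
        "Arn": "arn:aws:codecommit:us-east-1:111111111111:RepoFromCLI"
    }
}
```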

  • In the AWS Management Console, open CodeCommit and refresh the screen.
  • Click on the RepoFromCLI repository link to open it.
  • Click Add file.
  • Click Upload file.
  • Click Choose file.
  • Select any small file from your machine you don’t mind uploading.
  • Enter your name under Author name and your email under Email address.
  • Click Commit changes.
  • Return to your terminal window.
  • Copy the URL next to “cloneUrlHttp”:, and be careful not to copy the quotation marks.
  • Run the following command to clone the repository to the server:
  • git clone [paste the URL you just copied]
  • Enter the Git credentials username from the file downloaded earlier for the username prompt.
  • Enter the Git credentials password from the file downloaded earlier for the password prompt.
  • Check the clone:
  • Go into the repo directory: cd RepoFromCLI
  • Look for files in the repo: ls
  • Run the following command to create a local text file:
    vim test.txt
  • Hit i to enter insert mode and type the text:
  • This is just a test of working with CodeCommit from the CLI
  • Hit the Escape key, type :wq!, and press Enter.
  • Add the file to Git:
  • git add test.txt
  • Commit the file to the local repository:
  • git commit -m "added test.txt"
  • Verify the file was committed:
  • git log
  • Push the change to the CodeCommit repository:
    git push -u origin main
  • Enter the Git credentials username from the file downloaded earlier for the username prompt.
  • Enter the Git credentials password from the file downloaded earlier for the password prompt.
  • Refresh the AWS Management Console view of the CodeCommit repository for RepoFromCLI and verify the new file was uploaded.
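The clone/commit/push flow above can be rehearsed locally. In this sketch a local bare repository stands in for the CodeCommit remote (with CodeCommit you would paste the cloneUrlHttp URL instead and answer the Git credential prompts); the paths and user identity are assumptions for illustration:

```shell
# A local bare repository stands in for the CodeCommit remote
git init --bare /tmp/RepoFromCLI.git
git clone /tmp/RepoFromCLI.git /tmp/RepoFromCLI   # with CodeCommit: paste the cloneUrlHttp URL here
cd /tmp/RepoFromCLI
git checkout -b main                              # ensure the branch is named main, matching the push
git config user.name "cloud_user"                 # stands in for the Author name step
git config user.email "cloud_user@example.com"    # stands in for the Email address step

# Create, stage, commit, and push the test file
echo "This is just a test of working with CodeCommit from the CLI" > test.txt
git add test.txt
git commit -m "added test.txt"
git log --oneline
git push -u origin main
```

With a real CodeCommit remote, the clone and push steps prompt for the Git credentials downloaded earlier.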

CI and CD Using Jenkins and Azure DevOps

A Node.js application example where CI is done by Jenkins and CD is done by Azure DevOps. The build is generated by Jenkins, and the artifacts are handed to Azure DevOps for deployment.


Create an Azure DevOps Organization

  • In the top left, click on the hamburger menu, and select All services.
  • In the All services search box, type “azure devops”.
  • Select Azure DevOps organizations.
  • Click the link for My Azure DevOps Organizations.
  • Choose your region, and click Continue.
  • Click Create new Organization.
  • Keep clicking Continue until you reach the page to set up your project.
  • In Project Name, name your project MyFirstProject.
  • Click Create Project.

Create the Jenkins CI Build

  • Copy the forked page’s URL (it will contain your GitHub username), return to your Jenkins browser tab, and paste the URL into the Repository URL field.
  • In Branch Specifier, enter */jenkins.
  • Under Build, click Add build step, and select Execute shell from the dropdown menu.
  • In the Command field, enter:
    npm install
    zip -r node_modules package.json server.js
  • Under Post-build Actions, click Add post-build action, and select Archive the artifacts from the dropdown menu.
  • In Files to archive, enter
  • Click Save.
  • On the left-hand menu, select Build now.
  • In the lower left, click on your build, which should appear as #1.
  • On the left-hand menu, click Console Output.
  • Go back to your browser tab with the MyFirstProject page in Azure DevOps, and click Project Settings in the lower left-hand corner.
  • On the left side, under Pipelines, click Service connections.
  • Click Create service connection.

 Create Azure DevOps CD Pipeline

  • On the New service connection menu, select Azure Resource Manager. Then, click Next.
  • Select Service principal (manual), and then click Next.
  • Return to your main Azure portal browser tab, and open Cloud Shell by clicking on the icon immediately to the right of the search bar (looks like a caret (>) inside of a square).
  • Select PowerShell.
  • Click Show advanced settings.
  • For Storage account, click Use existing.
  • For Cloud Shell region, use the region where your cloud resources are deployed.
  • To check this, click the hamburger menu on the top left of the Azure portal.
  • Select All resources.
  • Review the Location column to see the region where your resources are deployed. Use this for the region in the Cloud Shell.
  • Under File share, select Create new, and type fileshare.
  • Click Create storage.
  • Create Azure Service Connection
  • Once you are connected to the Cloud Shell, get the subscription ID:
  • Get-AzSubscription
  • Copy the subscription ID under Id.
  • Return to your browser tab with the Project Settings Azure DevOps page, and paste the subscription ID into the field under Subscription Id (in the New Azure service connection pane).
  • Back in PowerShell, copy the subscription name under Name.
  • In the Service connection name field, name the service connection SP for service principal.
  • Click Verify and save.

Create Jenkins Service Connection

  • In the top right-hand corner, click New service connection.
  • Select Jenkins, and click Next.
  • Paste the root Jenkins URL you copied and saved to a text file at the beginning of the lab into the Service URL field.
  • Click the checkbox for Accept untrusted SSL certificates.
  • For Username, enter cloud_user, and type in the password you created while setting up the Jenkins installation.
  • Click Verify to make sure verification succeeded.
  • Under Service Connection Name, enter Jenkins.
  • Click Verify and save

Create the Azure DevOps CD pipeline

  • On the left-side menu, select Pipelines.
  • Select Releases.
  • Click New pipeline.
  • Under Select a template, click Empty job, and then Apply.
  • Click Add an artifact.
  • In the Add an artifact pane, click 5 more artifact types.
  • Select Jenkins from the artifact types.
  • In the Service connection dropdown menu, select Jenkins.
  • In the Jenkins Job dropdown, select nodejs.
  • Click Add.
  • Under Stage 1, click the 1 job, 0 task link.
  • In the search bar of the Add tasks pane, type Azure web.
  • Select Azure Web App, and click Add.
  • Click the Azure Web App Deploy tile, and set the following values:
  • Azure subscription: SP
  • App type: Web App on Linux
  • App name: Select the only app name that’s available
  • Click Save near the top right.
  • Click OK.
  • Near the top right, next to Save, click Create release.
  • Under Artifacts, select the only available version.
  • Click Create.
  • In the green bar on the top, click the link for Release-1.
  • Under the Stage 1 box, click Logs to view the progress.
  • Wait until the Azure web application has successfully deployed. This may take several minutes.
  • Return to your main Azure portal tab, and click on the hamburger menu in the upper left-hand corner.
  • Select App Services.
  • Select the application created.
  • Under Essentials, click the URL to open the Node.js web application.





We are looking for a Node.js Developer responsible for managing the interchange of data between the server and the users. Your primary focus will be the development of all server-side logic, definition and maintenance of the central database, and ensuring high performance and responsiveness to requests from the front-end.



Job Description

● Experience in designing and building large-scale, high-availability, secure web applications and RESTful APIs using Node.js in an agile methodology.
● Knowledge of hosting database instances on cloud platforms (e.g., AWS, Azure, GCP)
● Good hands-on experience with Node.js frameworks like Express
● Strong proficiency with JavaScript, Node.js, and ES6 (ECMAScript 2015)
● Knowledge and understanding of Event Loop architecture.
● Knowledge and understanding of promises and await.
● Integration and design of data storage solutions (RDBMS, NoSQL: MS SQL, MySQL, MongoDB)
● Implementation of security such as OAuth 2.0, JWT, and data protection.
● Knowledge of caching mechanisms.
● Knowledge and understanding of Request and Axios; proficiency in RESTful APIs.
● Experience working with Google web services, Docker, AWS Lambda (or equivalent), and
Serverless capabilities
● Experience with unit testing libraries, e.g., Chai, Mocha
● Knowledge of data structures and Algorithms, Node Global variables and In-built libraries
● Understanding the nature of asynchronous programming and its quirks and Workarounds
● Design and implementation of low-latency, high-availability, and performant applications.
● Experience in version control tool Git


● Expert knowledge of NodeJS, ExpressJS, and MongoDB
● MS SQL Server 2008/2012/2014 Database development and Administration experience required
● Knowledge of SQL Server.
● Knowledge of the Node.js Sequelize package.
● Exceptional coding skills in JavaScript with a thorough focus on optimization
● Hands-on experience with scalable, high-traffic applications
● Creating secure RESTful web services in XML and JSON

Usage of Shared Libraries in Jenkins

Shared Pipeline Libraries

Code runs in the Groovy sandbox. The sandbox keeps watch on all the functions called in the library; if there is anything nefarious in the code, Jenkins will raise an alert, and an admin has to approve whether it can be executed or not.

It can also be defined at the job level in a multibranch pipeline job configuration.

Amazon Machine Images (AMIs) are used to build instances. They store snapshots of EBS volumes, permissions, and a block device mapping which determines how the instance OS controls the attached volumes.

AMIs can be shared, free or paid and can be copied across AWS Regions.


Global shared libraries are trusted: they run without 'sandbox' restrictions and may use @Grab. We will look into the following four things to get your hands dirty with the shared library.

  1. Create a Shared Library Structure
  2. Create Custom Shared Library Code
  3. Configure Shared Library In Jenkins Configuration
  4. Create Declarative Pipeline as Code With Shared Library.

Let's look at each one in detail.

Create a Shared Library Structure

Note: In this guide, we will be concentrating only on the vars folder for creating your first shared library. src and resources will be covered in the advanced shared library guide.

A Jenkins shared library has the following structure. You can get the basic structure and code used in this article from GitHub.


All the files under vars are global functions and variables. The file name is the function name. We will be using the filename in our declarative pipeline.

Create Custom Shared Library Code

In this section, we will create the shared library code for Git Checkout functionality.

Generate Pipeline Syntax Using Snippet Generator:

You can create the code snippets to be used in shared library functions with the Pipeline Syntax Generator available in Jenkins. This makes life easier when creating custom library DSL. All supported pipeline functionality can be generated from the snippet generator.

You can access the syntax generator from your Jenkins instance at the /pipeline-syntax/ path.

Here is the screenshot which shows creating a git checkout pipeline snippet using the pipeline syntax generator.


Number 5 in the screenshot shows the generated snippet. Here is the properly formatted checkout snippet.

checkout([
    $class: 'GitSCM',
    branches: [[name: '*/master']],
    doGenerateSubmoduleConfigurations: false,
    extensions: [],
    submoduleCfg: [],
    userRemoteConfigs: [[url: '']]
])

Create a Shared Library For Git Checkout

Let's convert the checkout snippet we generated in the above step to a shared library.

Create a file named gitCheckout.groovy under vars folder.

Here is our Git Checkout shared library code. We have removed all the empty checkout parameters which got generated by default.

def call(Map stageParams) {
    checkout([
        $class: 'GitSCM',
        branches: [[name: stageParams.branch]],
        userRemoteConfigs: [[url: stageParams.url]]
    ])
}

Here is the code explanation,

  1. def call(Map stageParams) – A simple call function which accepts a Map as an argument. From the pipeline stage, we will pass multiple arguments which get passed as a map to the shared library.

  2. stageParams.branch – it's the branch parameter which comes from the pipeline stage, and we use stageParams to access that variable in the shared library.

Commit the changes and push them to your repository.

Add the GitHub Shared Library Repo to Jenkins

Now that we have a basic Git checkout library ready, let's add it to the Jenkins configuration.

Step 1: Go to Manage Jenkins -> Configure System

Step 2: Find the Global Pipeline Libraries section and add your repo details and configuration.


Use Checkout Library in Declarative Pipeline

We always call the library using the filename under vars. In this case, gitCheckout is the filename created under vars. Here is how we call the gitCheckout library from the pipeline or Jenkinsfile:

As you can see, we are passing the branch and url parameters to the gitCheckout function. Here is the full declarative pipeline code.


stage('Git Checkout') {
    gitCheckout(
        branch: "master",
        url: ""
    )
}

@Library('jenkins-library@master') _


pipeline {
    agent any
    stages {
        stage('Git Checkout') {
            steps {
                gitCheckout(
                    branch: "main",
                    url: ""
                )
            }
        }
    }
}

Like gitCheckout, you can create all your pipeline steps as shared library functions, so you don't have to repeat common functionality in all your pipelines.

Job Opening: Atlassian Product Specialist



Atlassian product specialist

Job Description

• Delivering best quality software with best software practices as per Bosch standards

• Build, manage and mentor the development team.

• Interface with our GB users and partners to understand requirements, set priorities and communicate direction and progress

• Provide technical leadership for every aspect of software

• Manage the agile development process and methodology to deliver value to customers

• Help develop long-term development and business technology strategies

• Feature development by following SCRUM methodology – estimation, implementation, testing and release

• Supporting the L3 team if there are any tickets which require development attention


• Min. 8+ years of experience in Software development using Java/J2EE Technologies

• Strong hands-on object-oriented programming experience in Java 8.

• Java/J2EE certification and Scrum knowledge are a must

• Working knowledge of Atlassian suites. Training and certification in the Atlassian Confluence product are required

• Experience building web applications using Java/J2EE, Spring, Spring Boot & Hibernate

• Experience in jQuery, HTML, CSS & any one of the JavaScript frameworks (React/Angular)

• REST API development & API documentation. Using Swagger will be a plus.

• Experience in using collaborative platforms such as Jira and Bitbucket.

• TDD approach with JUnit & PowerMock/Mockito frameworks.

• Experience in troubleshooting production issues using thread dumps, heap dumps, log analysis tools like Splunk.

Job Opening: Network and Security Engineer


Position:

Network & Security Engineer

Job Description

  • Graduate or postgraduate
  • Should have good knowledge of network & security; cloud & data center experience will be preferred.
  • Candidate should know networking, BGP, switches, routers, firewalls, vFirewalls, etc.
  • Candidate should have worked on Fortinet, Sophos, SonicWall, Cisco, etc.
  • Should have hands-on experience in network & security


Work Location

  • Noida (residing in and around Noida only)



Roles Summary and Role Description

Looking for a DevOps Engineer with Core Java and 5 to 8 years of professional experience, with proven experience in CI/CD tools (preferably Jenkins), build tools like Maven and Ant, and configuration automation tools like Chef, Ansible, etc. As a DevOps Engineer, you will be supporting jobs in lower environments, configuring new Jenkins jobs, writing scripts to automate configuration, and configuring jobs on cloud platforms like AWS and Azure.

Core/Must Have Skills

  • Expertise in using Ansible as a deployment tool
  • Expertise in scripting: YAML, PowerShell, Batch
  • Expertise in automating build and deployment of different applications: Java, Angular, and .NET based applications
  • Good hands-on experience writing pipeline-as-code scripts in Bamboo and Jenkins
  • Good exposure to source code management tools like Git
  • Exposure to working with cloud-based CI/CD solutions on either Azure or AWS

Work Location

Reporting and project location will be Noida, Uttar Pradesh, but the candidate can work remotely.




AWS CodeArtifact is a fully managed artifact repository service that makes it easy for organizations of any size to securely store, publish, and share software packages used in their software development process. CodeArtifact can be configured to automatically fetch software packages and dependencies from public artifact repositories so developers have access to the latest versions. CodeArtifact works with commonly used package managers and build tools like Maven, Gradle, npm, yarn, twine, pip, and NuGet making it easy to integrate into existing development workflows.


Note: all the commands mentioned in this article will be executed from CloudShell, so we should launch CloudShell as a user that has the required rights on the CodeArtifact service.


Creation of a Domain

An AWS CodeArtifact domain is a CodeArtifact resource. CodeArtifact domains make it easier to manage multiple repositories across an organization. We can use a domain to apply permissions across multiple repositories owned by different AWS accounts. Repositories are aggregated into a higher-level entity known as a domain; all package assets and metadata are stored in the domain, but they are consumed through repositories.

Command to create a CodeArtifact domain:

aws codeartifact create-domain --domain my-domain

Creating a Repository Under the Domain


  • Create a domain
  • Create a repository under the above domain
  • Create an upstream repository for our repository
  • Add an external connection to the npm public repository
  • Use the CLI to install an npm package


1) Create a domain:
aws codeartifact create-domain --domain my-domain

2) Create a repository in your domain:
aws codeartifact create-repository --domain my-domain --repository my-repo

3) Create an upstream repository for your my-repo repository:
aws codeartifact create-repository --domain my-domain --repository npm-store

4) Add an external connection to the npm public repository to your npm-store repository:
aws codeartifact associate-external-connection --domain my-domain --repository npm-store --external-connection "public:npmjs"

5) Associate the npm-store repository as an upstream repository to the my-repo repository:
aws codeartifact update-repository --repository my-repo --domain my-domain --upstreams repositoryName=npm-store

6) Configure the npm package manager with your my-repo repository (this fetches an authorization token from CodeArtifact using your AWS credentials):
aws codeartifact login --tool npm --repository my-repo --domain my-domain

7) Use the npm CLI to install an npm package. For example, to install the popular npm package express, use the following command. If we don't specify a version, this command installs the latest version available in the external repo:
npm install express

(express is a Node.js web application framework used to develop web and mobile applications)

8) View the package you just installed in your my-repo repository:
aws codeartifact list-packages --domain my-domain --repository my-repo

Once all the above commands complete successfully, we can check the console; our newly created domain and repositories should be present there.




About AWS CodeStar

AWS CodeStar enables you to quickly develop, build, and deploy applications on AWS. AWS CodeStar provides a unified user interface, enabling you to easily manage your software development activities in one place. With AWS CodeStar, you can set up your entire continuous delivery toolchain in minutes, allowing you to start releasing code faster. AWS CodeStar makes it easy for your whole team to work together securely, allowing you to easily manage access and add owners, contributors, and viewers to your projects. Each AWS CodeStar project comes with a project management dashboard, including an integrated issue tracking capability powered by Atlassian JIRA Software.


Click the AWS key pair option and generate a new key pair, which we will use in our configuration of AWS CodeStar.


From the console, open AWS CodeStar and click the Create project option.

On the next screen, click Create service role.

After choosing the service role, click the Templates option, choose HTML5, and click Next.

On the next screen, give the required details about the CodeStar project setup.

  • Project name
  • Project repository (AWS CodeCommit)
  • EC2 Configuration details
    • Instance type
    • VPC
    • Subnet details
    • Key pair name

Once all project details are provided, click Next, and then click the Create project option.

The first AWS CodeStar project creation takes some 5-10 minutes, so we have to wait until the project creation completes. Once the project is complete, it should show Project provisioned.

Under the provisioned AWS CodeStar project, we should see the project tabs.


Click the Pipeline tab; it should show the first pipeline deployed with success status for Source, Build, and Deploy.

At the top, click the View application option.

It should take us to the newly deployed HTML application.


Go to the CodeCommit repository under AWS CodeStar, change line number 61 to "Testing Sample Commit for Auto build and Deploy", and commit the changes.

Once the new code is deployed automatically, we can browse the application and see the changes.

Taking a Scheduled Backup of MySQL Using Jenkins and a PowerShell Script

Jenkins pipeline as code to take a database backup of MySQL at a scheduled time


Go to the developer settings and generate a personal access token.

Settings >> Developer settings >> Personal access token


Go to credential manager under jenkins

Under the credential manager, create a token-based credential named git-cred-token; under Password, put the token we just generated, and click Save.


Now, in Jenkins, choose New Item, and then choose Pipeline as the project type.

On the pipeline configuration page, under the Pipeline tab, choose the options as below:

SCM: Git
Repository URL: the devops81/dbbackupscript repository on GitHub (you can fork this repository)

Credentials: choose the git-cred-token credential which we created above

Click Save; it should complete the job creation.

Come back to the project page and click Build Now; it should ask for a parameter: the name of the database whose backup is to be taken.

Choose the database whose backup is to be taken and click the Build Now option; internally, it should call the Jenkinsfile in the GitHub repository.

Once the pipeline succeeds, the build should show as successful.

The database backup should be created at the designated location.
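The backup step inside such a pipeline typically shells out to mysqldump. A minimal sketch of that step, assuming the database name arrives as the job parameter and credentials come from the MySQL client configuration; the database name and backup location here are hypothetical, and the actual script lives in the devops81/dbbackupscript repository:

```shell
DB_NAME="mydb"                      # hypothetical: normally the Jenkins job parameter
BACKUP_DIR="/tmp/mysql-backups"     # hypothetical designated backup location
TIMESTAMP="$(date +%Y%m%d_%H%M%S)"
BACKUP_FILE="$BACKUP_DIR/${DB_NAME}_${TIMESTAMP}.sql"

mkdir -p "$BACKUP_DIR"
# mysqldump writes the schema and data as SQL; skipped here if the client is absent
if command -v mysqldump >/dev/null; then
    mysqldump "$DB_NAME" > "$BACKUP_FILE"
fi
```

In the pipeline, the timestamped filename keeps each scheduled run's backup distinct.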