INFRASTRUCTURE

Bad experience with Jenkins? Give BitBucket Pipelines a try!

I’ve made several attempts to introduce Continuous Integration with Jenkins into my development workflow. Unfortunately, it never worked as expected and turned out to be more of a time-consuming burden than a helpful tool for increasing quality and productivity.

Important note: Everything I am sharing here is my personal opinion. Although Jenkins wasn’t the best tool in my case, I would still use it if I had very specific requirements for building, testing and deploying an application; in that respect, Jenkins is the most flexible option. My point here is to show how to implement CI in the easiest possible way.

What I dislike about using Jenkins as a CI tool

It can be quite expensive to run for a small project

I am not talking only about server cost, which can be significant if you need several services to run your integration tests. As with everything you manage yourself, you have to take care of regular updates (I would highly recommend updating Jenkins frequently; one of my instances got hacked through a security vulnerability), debug and fix the environment when something crashes, and keep an eye on disk space, because running many builds a day can generate a lot of output data.

Builds are dependent on the environment

I’ve spent long hours trying to find out why my tests pass locally but not on the Jenkins host. This is especially common if you don’t use any form of virtualization for the build environment, such as VMs, Docker containers or LXC containers. In my latest attempt I provisioned LXC containers with Ansible and ran the tests in them, but the builds took too much time and were still susceptible to disk, network and other host issues.

Creating the first working configuration takes a lot of time

If you are new to the Jenkins ecosystem, you need to learn how to install the right plugins, what certain configuration options mean, and how to set up a secure authorization policy so your builds are not publicly accessible. It takes hours, but the only thing you really need is to run a simple command like:

python manage.py test

with all dependencies and services being available in the build environment.

BitBucket Pipelines

BitBucket Pipelines is a new tool available in BitBucket which can run your builds in Docker containers with little configuration and at little cost. The simplest build configuration for running Python tests would be:

image: python:3.5.1
pipelines:
  default:
    - step:
        script:
          - pip install -r requirements.txt
          - python test.py

which you put inside a bitbucket-pipelines.yml file. That’s all you need to add CI to your project; it takes a few minutes, not hours as with Jenkins. You can use any Docker image which is available in a public or private registry.
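
Pulling an image from a private registry is possible by expanding the image key with credentials. A minimal sketch, assuming the expanded image syntax from the Pipelines documentation (the registry host, image name and variable names below are hypothetical):

# Hypothetical private image; the credentials are read from
# environment variables defined in the repository settings.
image:
  name: registry.example.com/my-team/ci-base:latest
  username: $DOCKER_USERNAME
  password: $DOCKER_PASSWORD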

It’s not a silver bullet though; there are some limitations:

  • There are resource limits: 4GB of memory, 2 hours of execution time, 5GB of disk space, and you can run only 3 additional Docker containers with other services like a cache or a database (see the sketch after this list)
  • It can take some time before your build starts executing (you can often see a Waiting for a build agent message for a few minutes)
  • Currently you cannot build your own image within the pipeline; you have to use an image already published in a Docker registry
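
The additional service containers can be declared directly in bitbucket-pipelines.yml. A minimal sketch, assuming a PostgreSQL database as the extra service (the service name and image are just an example):

image: python:3.5.1
definitions:
  services:
    postgres:
      image: postgres:9.5  # example service; counts towards the 3-container limit
pipelines:
  default:
    - step:
        services:
          - postgres  # the database is reachable from the build step
        script:
          - pip install -r requirements.txt
          - python test.py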

Taking into account how easy it is to create the first configuration, I think it is a great alternative to Jenkins for small teams and projects that cannot afford to manage their own CI/CD infrastructure.

How to use it for Continuous Delivery?

It is perhaps an unusual project to apply CI/CD to, but I thought BitBucket Pipelines would be great for publishing new articles on this blog, because I am using a repository anyway (the blog is powered by Jekyll). The build process consists of:

  • building HTML files from the Jekyll sources
  • publishing the new files to my hosting account (over FTP)

For sending all the files over FTP I decided to use ncftp, which has more useful features than the standard ftp command.
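
The ncftp package also ships ncftpput, which can do the same upload as a single non-interactive command; a sketch (the remote target directory / is illustrative):

# One-shot recursive upload of the generated site
# ("/" is the illustrative remote directory on the hosting account).
ncftpput -R -u "$FTP_USER" -p "$FTP_PASSWORD" "$FTP_HOST" / _site/*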

My build configuration looks as follows:

image: ruby:2.3.0
pipelines:
  default:
    - step:
        script:
          - apt-get update && apt-get install -y ncftp
          - bundle install
          - gem install jekyll bundler
          - bundle exec jekyll build

  branches:
    master:
      - step:
          script:
            - apt-get update && apt-get install -y ncftp
            - bundle install
            - gem install jekyll bundler
            - bundle exec jekyll build
            - sh deploy.sh

With BitBucket Pipelines we can deploy easily: just by pushing code to a defined branch (or one matching a pattern like feature/*), an additional command is executed which publishes the code. In my case it is a simple bash script which copies all the generated files to the remote location over an FTP connection:

#!/bin/bash
set -e  # abort the deployment if any command fails

# Upload the generated site recursively over FTP; the credentials
# come from environment variables defined in the repository settings.
ncftp -u "$FTP_USER" -p "$FTP_PASSWORD" "$FTP_HOST" << EOF
put -R _site/*
bye
EOF

All the environment variables ($FTP_USER, $FTP_PASSWORD, $FTP_HOST) can be defined in the repository settings, so the FTP credentials never have to be committed to the repository.

Tomasz Chojna

Software Developer @ Grand Parade (William Hill)