Monday 13 January 2014

How I use Jenkins to test & build django-writingfield.

Jenkins is an automated continuous integration server and I’ve been using it for about nine months. Just like learning to use Vim, learning to use Jenkins has made me better at what I do. Recently I decided it was time to apply my Jenkins skills & knowledge to automating and managing the release process for django-writingfield.

Build Pipeline

Jenkins out of the box wasn’t that useful to me; with the addition of a few plugins, however, it became very, very useful. The most important plugin for me is the Build Pipeline plugin, which allows you to chain jobs together into a “build pipeline”. Here is the pipeline I have for django-writingfield –

Little boxes on the hillside, all made of ticky tacky

The above may look super complicated, but really it’s not. Each build is assigned a number ( the top row is #50 if you have eagle eyes ) and represented horizontally. Each box in the horizontal row is one “job”. Green boxes mean the job executed successfully, red boxes indicate the job failed and blue boxes mean the job didn’t run ( which could be for many reasons, but in this case it was because I altered the order of the jobs in the pipeline ).

Step By Step

Step 1

Internal Release

This step builds writingfield into a PyPI package and uploads it to my private PyPI server. The resulting package is used at every step along the build process, but I also get a win because if it fails to build for any reason, the entire pipeline will fail. Here’s the code that gets executed to run the job –

# creates, populates and activates the virtualenv
./bin/venv.sh
. ./venv/bin/activate

# get version from the writingfield package
VERSION=`python -c "from writingfield import get_version; print get_version()"`

# remove the current release, but carry on even if the file doesn't exist yet.
rm /home/jamiecurle/pypi/packages/django-writingfield-$VERSION.tar.gz || true

# now upload the internal release
python setup.py sdist upload -r internal
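
For the -r internal flag to resolve, distutils needs a matching repository section in ~/.pypirc on the Jenkins machine. A minimal sketch ( the repository URL and credentials below are placeholders, not my real server details ) –

# Sketch: a ~/.pypirc with an "internal" repository section.
# One-off setup on the Jenkins machine, not part of the job itself.
# The URL, username and password are placeholders.
cat > ~/.pypirc <<'EOF'
[distutils]
index-servers =
    internal

[internal]
repository = https://pip.curle.io/
username = jenkins
password = changeme
EOF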

Steps 2, 3 & 4

Build Django versions

These three steps each install a copy of writingfield into a virtualenv with a specific version of Django and then run the Python tests against it ( currently I test against Django 1.4.10, 1.5.5 and 1.6.1 ). I have close to 100% Python test coverage with writingfield, but that’s mainly because it’s a very small code base and there isn’t a whole lot to test ( from a Python perspective ), given the front-end nature of the app.

In this step I also install & uninstall the package created from the first step to ensure that it installs and uninstalls without problems.

Here’s the code –

# venv
./bin/venv.sh
. ./venv/bin/activate

# django install - the version changes depending on the job
pip install https://pip.curle.io/packages/django-1.4.10.tar.gz

# test
python bin/runtests.py

# get version
VERSION=`python -c "from writingfield import get_version; print get_version()"`

# install and then uninstall
pip install "https://pip.curle.io/packages/django-writingfield-$VERSION.tar.gz"
yes | pip uninstall django-writingfield
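
Because the three jobs differ only in the Django version they install, the script could be shared between them, with the version supplied as a Jenkins build parameter. A sketch, assuming a parameter named DJANGO_VERSION ( my jobs don’t currently define one ) –

# Sketch: one shared script, with the Django version injected per job.
# DJANGO_VERSION is a hypothetical Jenkins build parameter.
DJANGO_VERSION=${DJANGO_VERSION:-1.6.1}
pip install "https://pip.curle.io/packages/django-$DJANGO_VERSION.tar.gz"
python bin/runtests.py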

How I could improve these tests

I plan to introduce testing for the JS that powers writingfield ( as improving my JS is very much one of my immediate missions in life ), so I’d almost certainly test using a JS library such as Jasmine or QUnit.
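
When I get around to it, the Jenkins job might look something like this sketch; PhantomJS, the qunit-phantomjs-runner package and the tests/index.html page are all assumptions, as nothing like this exists in the repo yet –

# Sketch: run QUnit specs headlessly via PhantomJS.
# The runner package and the test page are hypothetical.
phantomjs ~/node_modules/qunit-phantomjs-runner/runner.js tests/index.html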

As well as testing the code with unit tests I’d also like to incorporate headless testing of the UI via Robot Framework or something similar. This would give me the chance to ensure that the app works as it is intended to. Unit tests are all very well, but the closer the tests are to the real world, the better.

I’d also like to implement a sample Django project and run the common management commands against it to ensure that nothing I’m doing interferes with or breaks anything for the end user.
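
As a first pass, that job might look something like this ( a sketch; “sampleproject” is a hypothetical name and doesn’t exist yet ) –

# Sketch: smoke-test writingfield inside a throwaway Django project.
django-admin.py startproject sampleproject
cd sampleproject
# add writingfield to INSTALLED_APPS ( defined as a tuple in settings.py )
echo "INSTALLED_APPS += ('writingfield',)" >> sampleproject/settings.py
# run the common management commands ( assumes Django 1.6's default
# sqlite settings; older versions need DATABASES filling in first )
python manage.py validate
python manage.py syncdb --noinput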

Step 5

Documentation is one of the cornerstones of success for any software project, and this step builds the documentation to ensure that there are no errors. The documentation for django-writingfield is hosted with Read The Docs ( RTD ) and isn’t huge; in fact it’s one humble page, but that doesn’t mean it should be neglected. Here’s the build code –

# create, populate then activate the venv
./bin/venv.sh
. ./venv/bin/activate

# install our development requirements
pip install -r etc/pip/dev.txt

# get into the docs directory and build it.
cd docs
make html

By ensuring that the documentation can be built, I save myself having to check over on RTD to make sure it was built correctly; I can have confidence that it was.
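
If I wanted to be stricter, Sphinx’s -W flag promotes warnings to errors, and the Makefile that Sphinx generates passes extra flags through via SPHINXOPTS, so failing the build on warnings would be a one-line change ( a sketch, not what the job currently does ) –

# Sketch: fail the docs build on Sphinx warnings as well as errors.
# SPHINXOPTS is the pass-through variable in the Sphinx-generated Makefile.
make html SPHINXOPTS="-W"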

Step 6

If I have a favourite step then it’s this one, because this step gathers stats about the quality of the various parts of the code and allows me to track them over time. I use a few Jenkins plugins to make this step work.

I use the Cobertura plugin to display coverage reports, and the Violations plugin to gather and report on PEP8 violations and JSHint output ( JavaScript syntax errors and warnings ). I’ll not go into the configuration of these things here because they’re mostly straightforward ( though JSHint did give me some jip ).
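
One detail worth noting: the Cobertura plugin reads an XML report rather than coverage.py’s native data file. Assuming bin/runtests.py runs the suite under coverage.py, a line like this is needed somewhere in the build step to produce that report ( a sketch, not my exact setup ) –

# Sketch: emit the XML report that the Cobertura plugin picks up.
# Assumes the tests have already run under coverage.py.
coverage xml -o coverage.xml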

I also run the coverage details through the very nice coveralls.io service so that I have a badge generated for the README.md that shows the current code coverage on the project. Whilst this is partly driven by vanity, I do think that badges can serve as useful traits on a successful Django project.

Here’s the code that powers this part of the build –

# do the venv
./bin/venv.sh
. ./venv/bin/activate

# run Django tests to generate coverage
pip install Django==1.6.1
python bin/runtests.py

# now do coveralls submission
git checkout develop
git pull origin develop
COVERALLS_REPO_TOKEN=SECRET coveralls

# pep8 checks
pep8 writingfield setup.py > pep8.txt || true

# jshint checks
~/node_modules/jshint/bin/jshint writingfield/static/writingfield/writingfield.js --reporter=jslint > jshint.xml || true

The more eagle-eyed amongst you may notice that I’m currently ORing the jshint and pep8 checks with true so that they can never fail the build. I’m doing this because I’m in two minds about whether to fail a build because of pep8 or JSHint errors. Today I don’t want to; tomorrow I might.
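
If I do change my mind, making the build fail on style problems is just a matter of dropping the || true, since both tools exit non-zero when they find violations and Jenkins’ default shell ( sh -xe ) aborts on the first failing command –

# Sketch: the strict variant; a non-zero exit from either check fails the job.
pep8 writingfield setup.py > pep8.txt
~/node_modules/jshint/bin/jshint writingfield/static/writingfield/writingfield.js --reporter=jslint > jshint.xml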

Once this part of the build has run I get access to a number of graphs and reports.

Here’s the summary of coverage:

The summary of the PEP8 & JSHint checks ( you can tell I only implemented JSHint checks from build #65 onwards ):

Finally, here is a sample from the Cobertura reports:

These reports make me a better developer because they serve as a map showing me where to go in order to make the code more robust. They also help ensure that quality is always improving, or at least maintaining the status quo. Should any aspect of these stats begin to slip, I have immediate visual cues that I need to step up on the QA front.

Managing the quality of the code in this way without any doubt helps ensure that only quality code makes it into a final build.

Step 7

This step is a manual step that is not triggered automatically ( clicking the little clock icon in the bottom right triggers the build ). I have this set up as a manual job so that I control when a build gets pushed out as a final release. The code that powers this step is pretty boilerplate –

# The virtualenv 
./bin/venv.sh
. ./venv/bin/activate

# now build the python package
python setup.py sdist upload

No great surprises here, but it’s one less thing I need to care about and certainly the kind of repetitive task that is best delegated out to Jenkins.

Step 8

This is the final step and is mostly concerned with the git repository. It merges the current develop branch into master with --no-ff and then tags the repo so that a new release shows up on the GitHub releases page.

# merge develop into master
git checkout develop
git pull origin develop

git checkout master
git pull origin master
git merge develop --no-ff
git push origin master

# now prune stale local tags, then attempt to create and push the new one.
# Failure ( e.g. the tag already exists ) is cool.
VERSION=`python -c "from writingfield import get_version; print get_version()"`
git fetch origin --prune --tags
git tag -a $VERSION -m "The $VERSION release"
git push --tags origin

This last step isn’t really necessary from the perspective of releasing the package, but by having Jenkins perform it at the end of each successful build I get to enforce repetition and consistency in the workflow. No matter what the context, repetition and consistency are keys to unlocking advancement.

Conclusion

So far this pipeline has greatly reduced the number of things that I need to think about when pushing out a new release. It helps enforce consistency because each new version of the package is the result of following established steps. Quality is improved because the app is being tested from lots of directions, and should any test fail, the entire pipeline fails. Finally, the repetition of packaging, tagging and merging is handled automatically, which means that I can focus on the good stuff: playing with code and trying out new things, always in the pursuit of learning.