Experience Adopting TDD on my Team

A year ago, our team started the transition to Agile.
We made several drastic changes to the way we did things, such as introducing...

  • A complete functional test automation suite
  • Co-located, cross-discipline scrum teams
  • Unit testing and TDD
  • 3-week iterations

I think overall we've made an effective transition.  One of the most challenging things to get to stick has been unit testing and TDD.  I'd like to use this entry to list out what we did, where we are at, and discuss some lessons learned.  Hopefully this will be of some use to others.

Things we did to Transition

  • Pulled a couple of developers pre-release to spend time exploring unit test tooling and strategy.
  • Gave two education sessions: one presented what TDD and unit testing are, along with an overview of the tooling we intended to use for the release; the other was a workshop where developers worked through a sample development task in a test-driven way (see the sketch after this list).
  • Integrated unit test build and execution into the nightly build.
  • Promoted a way for developers to easily execute tests within their development environments.
  • Set up a metrics build to track how individual teams, as well as the team as a whole, were doing at implementing Agile.
  • Posted lots of examples and design patterns specific to our technologies on our project wiki, many adapted from the book Working Effectively with Legacy Code by Michael Feathers.
  • Identified subject matter experts on the team who could provide individual help to developers struggling with implementation.
  • Made unit test coverage of new and changed code part of our "done" checklist for sprint exit.
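
As a flavor of what the workshop covered, here is a minimal sketch of the test-first rhythm we walked developers through, written against JUnit 4.  The PriceCalculator class and its discount rule are invented for illustration, not taken from our codebase; the point is the order of the steps - a failing test first, then just enough production code to make it pass, then refactor with the tests as a safety net.

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    // Step 1 (red): describe the behavior we want in a test before the code exists.
    public class PriceCalculatorTest {

        @Test
        public void appliesTenPercentDiscountToLargeOrders() {
            PriceCalculator calculator = new PriceCalculator();
            assertEquals(90.00, calculator.discountedPrice(100.00), 0.001);
        }

        @Test
        public void leavesSmallOrdersUnchanged() {
            PriceCalculator calculator = new PriceCalculator();
            assertEquals(50.00, calculator.discountedPrice(50.00), 0.001);
        }
    }

    // Step 2 (green): write just enough production code to make the tests pass.
    // (Shown in one listing for brevity; in practice this lives in its own file.)
    class PriceCalculator {
        double discountedPrice(double price) {
            return price >= 100.00 ? price * 0.90 : price;
        }
    }

The cycle then repeats: a new failing test for each new bit of behavior, just enough code to pass it, then a refactoring pass while everything is green.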

Technologies We Chose

  • JUnit with EasyMock, primarily (see the sketch after this list)
  • eclEmma for IDE code coverage
  • Cobertura for build coverage (Cobertura fit into our metrics analysis tooling, plus I think it had nicer reports out of the box)
  • xRadar for end-of-release retrospective analysis (did quality indicators in our codebase improve over the course of the release?).  We found the tool less useful during construction (too much information, not team-specific)
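
To give a flavor of the JUnit-plus-EasyMock combination, here is a minimal sketch of a test that mocks out a collaborator.  AccountRepository and AccountService are hypothetical names invented for this example, not classes from our codebase.

    import static org.easymock.EasyMock.createMock;
    import static org.easymock.EasyMock.expect;
    import static org.easymock.EasyMock.replay;
    import static org.easymock.EasyMock.verify;
    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class AccountServiceTest {

        // Hypothetical collaborator the class under test depends on.
        public interface AccountRepository {
            double balanceFor(String accountId);
        }

        // Hypothetical class under test; it takes its dependency through the
        // constructor, which is what makes it easy to hand it a mock.
        static class AccountService {
            private final AccountRepository repository;

            AccountService(AccountRepository repository) {
                this.repository = repository;
            }

            double balanceWithInterest(String accountId, double rate) {
                return repository.balanceFor(accountId) * (1 + rate);
            }
        }

        @Test
        public void addsInterestToTheRepositoryBalance() {
            AccountRepository repository = createMock(AccountRepository.class);
            expect(repository.balanceFor("acct-1")).andReturn(200.00);
            replay(repository);

            AccountService service = new AccountService(repository);
            assertEquals(210.00, service.balanceWithInterest("acct-1", 0.05), 0.001);

            verify(repository);
        }
    }

Mocking the repository lets the test pin down the service's behavior without standing up a real data source, which is a big part of what keeps a suite of this size fast.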

Where we are today

  • There are developers on our team doing TDD and unit testing really well.  There are also developers on our team not doing it at all.
  • Unit tests are still built and run nightly.  Broken tests break the build and are immediately fixed.
  • We currently have a suite of 1796 unit tests that takes 24 seconds to run in my IDE.

What we could've done better

  • Provided more workshop-like classes, and/or had the TDD/UT subject matter experts follow up with teams more often.
  • Been more strict about UT coverage in sprint exit reviews.  We had a way of showing how teams did with unit tests on new and changed code, but we let some teams "slide" through exit reviews without adding much coverage to their new code.  This was primarily because we were asking them to take on so many new things that we wanted to focus on holding teams to a few of them, chiefly producing quality functional test automation and coverage.
  • Celebrated successes more (e.g. throwing a party at the 1000th unit test milestone)

What we did that was good

  • Tied unit tests into our build.  This forces developers to keep them up to date.
  • Provided design patterns and help - this got some developers past the block of "you can't unit test this technology" (see the sketch after this list).
  • IDE integration - tests are easy to add and run, with no setup.
  • Created a tool that shows test coverage per scrum team.
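
As an example of the kind of pattern that gets developers past that block, here is a minimal sketch of the extract-and-override technique from Feathers' book, with invented class names: the call that is awkward in a unit test is pulled behind a protected method (a seam) that a testing subclass can override.

    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    public class OrderProcessorTest {

        // Hypothetical production class whose confirmation step is awkward to
        // exercise in a unit test (imagine it talks to a real mail gateway).
        static class OrderProcessor {
            void process(String orderId) {
                // ... business logic ...
                sendConfirmation(orderId);
            }

            // The seam: the awkward call is extracted into a protected method
            // so a test can substitute it.
            protected void sendConfirmation(String orderId) {
                // In production this would call the mail gateway.
            }
        }

        // Testing subclass overrides the seam instead of touching the real dependency.
        static class TestableOrderProcessor extends OrderProcessor {
            boolean confirmationRequested;

            @Override
            protected void sendConfirmation(String orderId) {
                confirmationRequested = true;  // record the call instead of sending mail
            }
        }

        @Test
        public void processRequestsAConfirmationForTheOrder() {
            TestableOrderProcessor processor = new TestableOrderProcessor();
            processor.process("order-42");
            assertTrue(processor.confirmationRequested);
        }
    }

The same move works for most hard-wired dependencies: wrap the call in an overridable method, and substitute it in the test.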

What's next?

  • Some metrics analysis on what we did as a team this release.  Pull some key trend graphs to show the team the effects the unit tests had on our code base - reduced complexity, decoupling, etc.
  • Get team feedback on TDD/UT in a release retrospective
  • Next release, do some additional brown bags on UT and TDD strategies
  • Create some more visual representations of how we are doing during construction.
  • Stress unit testing more in our team-wide "done" criteria for sprints