After attending a number of conferences and events, and conducting numerous interviews, I'm starting to hear the same things again and again. Since Dan North challenged all my assumptions at QCon, I'm reluctant to outright ridicule them, but I will put forward my personal opinion.
Note: these are things I have heard from multiple sources, so with any luck I am not breaking the sanctity of the confessional interview.
I've never pair programmed, but I've frequently worked with a partner on critical production problems
I find this fascinating. If there's one thing that needs to be fixed as quickly, as correctly, and as efficiently as possible, it's a production issue. And when there is one, "everyone" knows that two heads are better than one, even The Business.
If this is the case, why is it so hard to sell pair programming as the default state of affairs?
Is it because creating new features is seen as just typing, where the bottleneck is access to the physical keyboard? Is it because fixing defects when the pressure isn't on is suddenly easier for one person on their own without help?
This state of affairs is interesting to me as it implies that when the proverbial hits the fan, the instinctive thing to do is to work collaboratively. Why don't we do it more often?
We use Test Driven Development to get coverage
Seems weird to me to write your tests first to get coverage. If unit test coverage is your most important metric (and other people have covered why this might not be the case), I'm not sure why you would write your tests first. Seems to me that you'd get better coverage writing the tests after the code. That way you can be sure you've covered every eventuality.
To me, the statement implies two assumptions which I would challenge:
- The primary value of writing your tests first is to meet your coverage requirements
- Coverage is a meaningful metric
TDD/BDD has a number of benefits (...and now I'm reluctant to list them here in case people repeat them back to me in an interview). Good coverage will probably be a side effect of being forced to write your tests first, but I'm not convinced that's the best thing that will come out of using TDD.
I only test first when I know what I want to code
I've overheard people saying that they test first when they know what the code is going to look like. So you dive straight into the code when you don't know what you're doing???
Of course there is a place for this - spikes, prototyping, getting a feel for a new library, so on and so forth. But I feel that for most code you write in your day job, you probably have a business requirement and possibly (probably?) a less firm idea of how you're going to code it. To me, this translates into writing the test first (documenting what you want to deliver, which you already know) and then making it pass (writing the code, which is the bit you might not know).
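As a purely illustrative sketch of that order of events (the language and the hypothetical `calculate_discount` function are my own, not anything from an interview), the test is written first and records the business requirement; the production code comes afterwards and only has to make the test pass:

```python
import unittest

# Written first: the tests state the requirement we already know
# ("orders over 100 get a 10% discount"), before any implementation exists.
class TestDiscount(unittest.TestCase):
    def test_orders_over_100_get_ten_percent_off(self):
        self.assertEqual(calculate_discount(200.0), 20.0)

    def test_small_orders_get_no_discount(self):
        self.assertEqual(calculate_discount(50.0), 0.0)

# Written second: just enough code to make the tests above pass -
# the part we may not have known how to write when we started.
def calculate_discount(order_total):
    return order_total * 0.10 if order_total > 100 else 0.0

if __name__ == "__main__":
    unittest.main()
```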
Mostly because I'm still interviewing candidates, and I don't want to give away the correct answers....