As you may have seen, over the last year (especially the second half), the Thunderbird team have been looking at improving interactions with external contributors to help both Thunderbird development and the contributors. Most of these thoughts have been on Dan’s blog.
Over the last few months of last year, one of my tasks was to work with the various Thunderbird reviewers to clear down the backlog of reviews in the Thunderbird product. We almost managed it towards the end of last year, and we're now in the position of having no outstanding reviews over a month old (and had we not just had the Christmas holidays, the list might be even shorter). ui-reviews are almost in the same position, with the oldest being a month and two days old.
Some of the old reviews were obsolete, some needed more work, and a few were reviewed and pushed through because the original contributor was no longer available.
The fact that we're now down to all reviews being less than a month old is a great improvement over the 50-odd outstanding reviews at the start of last year. Here's a graph showing the number of bugs with open review requests over time in the Thunderbird product since June 2009:
This all means that we can now start this year from a better baseline, and we'll be able to easily identify reviews that are taking a long time and see if we can move them forward more quickly (e.g. by re-assigning, reminding, or helping). I feel this will help our contributors by giving them timely reviews, or at least feedback as to when reviews are likely to happen.
I'm also going to be looking at the MailNews Core product reviews at the start of this year; at the time of writing, there are about 50 outstanding reviews, some going back to 2004. It is quite possible that some of these are obsolete, but I'd like to look through them and try to make sure we land what is sensible to land.
The other thing I've been pondering over the last few months is review times. Back around Thunderbird 3 we knew our contributors were not happy with the length of time it took to get a review, so I'm now looking at getting some actual figures to quantify that and to set future targets.
Over Christmas I had some spare time, and rather than work directly on Thunderbird (because that's definitely work), I chose to investigate bzapi and bztools (because it's something new and not what I currently do as work). As a result, I came out with a script that gives a very sketchy estimate of average review times across a product (or a particular search).
I say sketchy because it doesn't take into account multiple review requests for the same attachment (it takes the time from the first review being requested to the first review being granted), and it doesn't deal with cancellations or other flag changes.
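The heuristic above can be sketched in a few lines of Python. This is not the actual script, just a minimal illustration of the approach, assuming the flag history for an attachment has already been fetched (e.g. via bzapi) and flattened into (timestamp, flag) pairs:

```python
from datetime import datetime

def review_wait(events):
    """Estimate the review wait for one attachment.

    `events` is a list of (timestamp, flag) tuples such as
    ("2009-06-01T10:00:00", "review?"), in any order.
    Returns the timedelta from the first review request to the
    first review granted, or None if either is missing.
    Repeated requests and cancellations are deliberately ignored,
    matching the sketchy estimate described above.
    """
    fmt = "%Y-%m-%dT%H:%M:%S"
    requests = sorted(datetime.strptime(t, fmt)
                      for t, flag in events if flag == "review?")
    grants = sorted(datetime.strptime(t, fmt)
                    for t, flag in events if flag == "review+")
    if not requests or not grants:
        return None
    return grants[0] - requests[0]
```

Averaging the non-None results over all attachments in a product then gives figures like the ones below.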
So across all the bugs in the following products in bugzilla.mozilla.org, here’s the summary of average review times (rounded up a bit):
- Thunderbird: 6 days 3 hours
- MailNews Core: 11 days
- Toolkit: 6 days 2 hours
These figures reflect the entire set of bugs in each product from the beginning of Bugzilla until now, i.e. many bugs, years and people, some of whom no longer contribute, so it is not going to be very useful to draw conclusions from them. But I thought folks might be interested in the overall numbers.
I'm hoping to extend the scripts with some better data gathering and probably insertion into a database (unless someone has done this already!). Then I hope to analyse items like review times for reviews requested in a particular year, and average review time compared to attachment size.
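To give an idea of what the database stage might look like, here's a minimal sketch using Python's built-in sqlite3 module. The table layout and the rows are entirely hypothetical, not the real script's schema or data; the point is just that once review records are in a database, per-year breakdowns become a single query:

```python
import sqlite3

# Hypothetical schema: one row per reviewed attachment, storing the
# date the review was requested and the wait in hours.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reviews (requested TEXT, wait_hours REAL)")
conn.executemany(
    "INSERT INTO reviews VALUES (?, ?)",
    [("2008-03-10", 300.0),   # made-up example rows
     ("2008-11-02", 120.0),
     ("2009-07-15", 96.0)],
)

# Average review wait, grouped by the year the review was requested.
for year, avg in conn.execute(
    "SELECT strftime('%Y', requested) AS y, AVG(wait_hours) "
    "FROM reviews GROUP BY y ORDER BY y"
):
    print(year, round(avg, 1))   # prints: 2008 210.0 / 2009 96.0
```

Attachment size could be handled the same way, with an extra column and a GROUP BY on size buckets.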
As the script is currently very crude, I won't publish it until I'm nearer the database stage – what I have is fairly easy to work out if you really want to.