The external behavior of a component will need to change over time
Here's a real story. I was once involved with a very small project; let's call it the PQR project: three people working part time for three weeks, putting together an HTTP server, a web client, and a GUI client, all very simple. During these three weeks we were also learning some new technologies, so actual coding time (of all developers combined) was about 20-25 days.
During these three weeks, two important changes were applied to PQR's specification:
- The technology with which the GUI client was implemented had to be changed. Instead of implementing it over Tech.A we had to switch to Tech.B.
- The initial specs defined the data that should be persisted by the server. As we were playing with intermediate versions of the project, we came to realize that a decent user experience required persisting additional information.
The main point of this post is not whether or how we managed to support these changes. The point is that even in small projects, specs are not stable. We were not able to define the project's goals for a three-week period in a project that is as simple as industrial projects get. Of course, if the project were more complicated (more people, wider scope), the instability would likely have been even higher.
This example indicates that a "fire and forget" style of development (AKA "divide and conquer"), where one breaks the desired functionality into a few large pieces, assigns each piece to a programmer, and then lets each programmer work on his task in isolation from his peers until an "integration" milestone approaches, is broken.
First, external forces will change the specs, thus affecting the assigned tasks. In the PQR project, the change in client technology was due to some external factors (business/marketing constraints). Even though the initial specs were examined and audited by several layers of approvals, no one had predicted this change.
Second, feedback from working early versions of the product (even with partial functionality) will change our understanding of the product and its desired capabilities. In PQR, the change regarding which-information-should-be-persisted was driven by experimentation with early versions.
Had we taken a fire-and-forget approach, our ability to respond to the first change would have been very limited, as every team member would have been in the middle of his large task. Also, by the time a first working version was available, very little time would have been left to implement significant changes.
Bottom line: Software is unstable. Breaking the effort into tiny tasks with frequent integration points (I am speaking of a granularity of hours, not weeks) is an excellent way to cope with this inherent instability.
As much as I agree with you, saying your anecdotal example indicates the "divide and conquer" approach is broken seems a bit far-fetched. Someone else might have a situation where it works. You know, some other smaller project maybe?
Like I said, I agree with you. But it's a bit premature to dismiss something so general based on a single anecdote.
Anonymous
September 2, 2010 at 10:46 AM
It would be very convenient for project managers if specs never changed, so they often think something is deeply wrong when they do. But the reality is that they *always* change. I've been programming for almost 20 years, and I've never seen a stable set of specs.
Any development methodology that refuses to take change as a given is bound to disappoint. That's what I found so refreshing about agile methodologies: rather than fight change, they embrace it.
virbots
August 26, 2011 at 8:03 AM