River or Lake? The Water Theory of Software

Summary:

Heraclitus once said, "It is impossible to step into the same river twice." This is true for software, too. Software is constantly changing, and there are several theories on how these changes are introduced into production. Linda Hayes describes some of the theories and offers ways to navigate the seas of change.



Everyone knows—usually from painful experience—that software is constantly changing. But how those changes make it into production can vary widely. In some cases, changes are introduced into production as a constant flow, at any time of the day or night. In other cases, changes are strictly versioned and released into production at predefined intervals. It's similar to the difference between a river and a lake: A river is constantly flowing into the sea, but a lake gathers water until the dam is opened.

Which of these describes your environment, and what does it mean to testing?

Whitewater
The river theory is usually applied to critical internal applications that run on mainframes or other centralized platforms. The criticality of these applications demands rapid responsiveness, and their centrality means changes do not have to be propagated to multiple platforms. You might think that this approach would be used only for serious errors that must be fixed immediately, but I have found that it is also used for enhancements in the name of time to market.

Whitewater rafters know that running the same stretch of rapids doesn't mean you won't encounter new obstacles. Likewise, the challenge for testers is obvious: If a change can be introduced at any moment, how can you rely on the predictability you need to verify expected results? How do you know that a test that ran successfully in the morning will get the same result in the afternoon? How can you perform regression testing when there is no known, stable set of functionality?

The benefit to a business is agility. Customer needs or market drivers can induce an almost instant response. The shorter the cycle time, the faster the time to market, and defenders of this practice pride themselves on their flexibility and responsiveness. But, like whitewater rafting, this approach is both exciting and dangerous. Rolling in one change can cause breakage elsewhere, and the rapid fix for that breakage can cause even more issues. In fact, I know of several companies where the cascading effect of errors created by changes left production fundamentally unstable for months, wreaking havoc throughout the enterprise. In one case, it actually resulted in federal regulatory oversight for more than a year when customers complained that their account balances were not accurate. For this reason, testers must learn to navigate these waters and anticipate potential changes.

Lake Placid
The lake theory predominates in commercial and/or distributed applications. Because changes must be widely disseminated, releases are scheduled at regular intervals. Naturally, software vendors have to follow this practice because their customers won't tolerate constant updates, and installing internal applications with desktop components on hundreds (if not thousands) of machines is too cumbersome to do constantly. In both cases, the logistics discourage continuous churning. So, at specified intervals, the dam is opened and the changes are released.

The testing benefits are self-evident: By batching functionality into a single release, functional and regression testing are simplified because there is a measure of stability. The downside is that the volume of testing is greater, since there are more changes per release, but this is offset by the ability to perform regression testing to ensure the changes aren't having unintended effects.

For a business, the lake theory introduces both structure and stricture: structure because projects require more planning and coordination, and stricture because response times are usually longer. Of course, a critical error can still breach the dam and be rushed into production, but the very stability created by this approach reduces the probability of errors reaching production in the first place.

Locks and Levees
As usual, there is a middle ground. Just as locks can be used to gradually step a ship from one body of water to another, and a levee can be built to keep a river within its banks, so can the river and lake theories be reconciled. On the surface the idea is simple: Adopt the lake concept, but open the dam at shorter, faster intervals.

Years ago, annual releases were not uncommon, but today the intervals are much shorter. Quarterly or even monthly schedules are not unusual, and in "Web World," weekly or even daily releases occur. The ideal is not necessarily a fixed amount of time, but rather whatever amount of time is needed to assure the quality of what makes it into production. If a full regression test takes two weeks, then daily releases are likely to introduce issues.
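To make that tradeoff concrete, here is a minimal Python sketch of my own (the cadences are hypothetical; the two-week regression suite echoes the example above) that checks whether a full regression pass fits inside a given release interval:

# Sketch: does a release cadence leave room for a full regression pass?
# All numbers below are illustrative, not prescriptive.

def full_regression_fits(release_interval_days: float, regression_days: float) -> bool:
    """Return True if a complete regression run finishes before the dam opens again."""
    return regression_days <= release_interval_days

cadences = {"daily": 1, "weekly": 7, "monthly": 30, "quarterly": 90}
regression_days = 14  # e.g., a full regression test that takes two weeks

for name, interval in cadences.items():
    verdict = ("full regression fits" if full_regression_fits(interval, regression_days)
               else "must shrink the suite, automate, or lengthen the interval")
    print(f"{name:>9} release ({interval:>2} days): {verdict}")

If the suite cannot finish before the next release, something has to give: a smaller regression suite, more automation, or a longer interval.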

Now, some might argue that there is no difference between a daily release and an anytime one, but I say there is. At least with a daily release there is a 24-hour window in which the stability you need for test predictability exists; if changes are completely unscheduled, there is none. The key is whether you know the state of the system at any given time. Knowing the water level and reading the currents is important.

Is your company shooting the rapids or minding the dam? How does that affect the way you work?
