From: zhms()
Compiled: (2000-10-26 13:10:20), board mail
The Emergence of "Light" Development Methodologies
by Edward Yourdon
Today's IT professionals are caught in a conundrum: on the
one hand, they know from painful experience that
developing complex, high-quality information systems with
a seat-of-the-pants "hacking" approach is risky. On the
other hand, they know that the formal, disciplined
software engineering approaches typically associated with
ISO-9000 and SEI-CMM are so bureaucratic and
time-consuming that they make it impossible to meet
increasingly aggressive schedules in today's competitive,
"Internet time" environment.
We can characterize the formal, disciplined software
engineering methodologies as "heavy" - not only in terms
of the weight of the paper documentation they produce, but
also in terms of the degree of management effort, QA
reviews, and rigid procedures that the developers are
expected to follow. By contrast, such development
approaches as RAD ("rapid application development") and
prototyping could be characterized as "light" - not only
because they tend to produce a minimal amount of paper
documentation, but also because they minimize the degree
of managerial effort. Unfortunately, many RAD projects of
the 1990s were so "light" in their methodological approach
that the methodology was almost non-existent; in
retrospect, such projects often degenerated into hacking
exercises, with virtually no documentation at all.
Clearly, a balance is needed between the extreme of no
methodology, and the other extreme of methodological
overkill. Perhaps the most popular "balanced" methodology
today is "XP", as explained by Kent Beck in his book,
"eXtreme Programming eXplained". Another is the SCRUM
methodology developed by Ken Schwaber. The philosophies
associated with these "light" methodologies are familiar
to most software engineering veterans; indeed Peter
DeGrace and Leslie Hulet Stahl summarized them a decade
ago in "Wicked Problems, Righteous Solutions: A Catalog
of Modern Software Engineering Paradigms"
(Prentice Hall, 1990).
Light methodologies represent a conscious risk-reward
approach to investing time, money, and resources in the
various activities associated with development. For
example, how much is too much requirements analysis, and
how much is too little? A "requirements-light" approach
might consist of documenting each of the several hundred
requirements associated with a development project in a
single, succinct sentence. A "requirements-medium"
approach might consist of a paragraph of narrative text
for each requirement. And a "requirements-heavy" approach
might require detailed UML models, data-element
definitions, and formal descriptions of the "methods"
associated with each object.
The choice between a requirements-light and requirements-
heavy approach is likely to be significantly influenced
by corporate "time to market" pressure. Similarly, the
rate of employee turnover is a factor: one of the
justifications for a formal software development process
is that detailed documentation of the requirements,
design, and code reduces the chaos if key developers quit
in the middle of a project. And while the systems of
the 70s and 80s might have been expected to have a
productive lifetime of a decade or two, perhaps the dot-
com enterprise is willing to make a formal commitment that
its e-business applications will only last a year before
being scrapped and completely rewritten. If that's the
case, and if the next-generation application is expected
to be radically different than the current one, does it
really make sense to follow a requirements-heavy approach
just because it's a prerequisite for achieving SEI-CMM
level 3?
Similarly, how much formality and rigor are appropriate
for design and testing? How much is appropriate when it
comes to time reporting, progress reporting, status
meetings, and the other familiar activities associated
with managing a project - especially for projects that
last only a week? These questions have always been
relevant, but the answers we've traditionally accepted as
corporate strategies need to be re-examined at least every
few years, because the cost-benefit parameters change as
business conditions change, as technology changes, and as
our software developers change.
The light methodologies also re-examine the assumptions
that we've historically made about investing resources in
requirements analysis, and the assumptions we've made about
investing resources in process improvement. In 1981, Barry
Boehm's "Software Engineering Economics" provided the
stunning revelation that if we made a mistake during the
systems analysis phase of a project, it was ten times
cheaper to identify that problem while the project was
still in the analysis phase, rather than allowing it to go
undetected until design. But Boehm made a fundamental
assumption that may not be true in this first decade of
the new millennium: the order-of-magnitude escalation
figures are only relevant if it's somehow possible to
identify the defect during the life-cycle phase when it
occurs. In today's environment, there are many business
situations where that premise simply isn't true. When
Bill Gates sat down to formulate the requirements for
Internet Explorer, perhaps he said "Make it just like
Netscape Navigator, only better." And perhaps
Netscape's Marc Andreessen thought, "I'll make Navigator
just like Mosaic, only better." But what was Tim Berners-
Lee thinking when he created the initial version of the
World Wide Web and the first crude version of a browser?
What sense would it possibly have made for him to write
detailed requirements?
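To make the escalation arithmetic concrete, here is a
minimal sketch in Python, assuming a constant ten-fold
cost increase for each life-cycle phase that a defect
goes undetected; the phase list, base cost, and multiplier
are illustrative assumptions rather than figures taken
from Boehm's book.

    # Sketch of the cost-escalation argument: a defect grows
    # roughly 10x more expensive to fix for each phase it goes
    # undetected. Phase names, base cost, and the 10x factor
    # are illustrative assumptions, not Boehm's actual figures.
    PHASES = ["analysis", "design", "code", "test", "operation"]

    def fix_cost(introduced, detected, base_cost=100.0, factor=10.0):
        """Estimated cost to fix a defect introduced in one phase
        but not detected until a later one."""
        gap = PHASES.index(detected) - PHASES.index(introduced)
        return base_cost * factor ** gap

    print(fix_cost("analysis", "analysis"))  # 100.0
    print(fix_cost("analysis", "design"))    # 1000.0 - ten times as costly

On this arithmetic, the value of early detection rests
entirely on the premise that a defect can be spotted in
the phase where it is introduced - precisely the premise
that the browser examples above call into question.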
Similarly, the heavy methodology advocates argue that if a
defect is found during development, it should not be
blamed on the individual who introduced the defect; instead,
it should lead to a re-examination of the process that
allowed that defect to occur. But again, there is a
fundamental assumption: the only reason it's worth
investing the resources to identify the flaw in a process
is that we intend to use the same process again - that is,
our next project will be similar enough to the last one
that we would naturally reuse the process. But now things
are changing so fast that there's
no assurance that project N+1 will have any similarity to
project N; thus, yesterday's process might have to be
changed substantially for tomorrow. So perhaps it's
not worth the effort to fix anything other than major
flaws in the process, because the details are only going
to be useful for a single project.
Of course, there are circumstances where we can continue
to depend on the old, fundamental principles of software
engineering, which may justify a heavy methodological
approach. But we should ask ourselves whether the
assumptions behind those principles are still valid. For
many of today's projects, the underlying assumptions need
to be changed, and light methodologies are the most cost-
effective approach.
--
I am the flame, beyond all doubt.
※ Source: Moonlight Software Station http://www.moon-soft.com [FROM: 202.105.67.15]