Still no silver bullet?

In his 1986 article No Silver Bullet—Essence and Accidents of Software Engineering, Fred Brooks suggests that there’ll never be a single tool, technique, or fad that realises an order-of-magnitude improvement in software engineering productivity. His reason is simple: if there were, it would be because current practices make software engineering ten times more onerous than they need to be, and there’s no evidence that this is the case. Instead, software engineering is complex because it provides complex solutions to complex problems, and that complexity can’t be removed without failing to solve the complex problem.

Unfortunately, the “hopes for the silver” that he described as not being silver bullets in the 1980s are still sold as silver bullets.

  • Ada and other high-level language advances. “Ada will not prove to be the silver bullet that slays the software productivity monster. It is, after all, just another high-level language, and the big payoff from such languages came from the first transition, up from the accidental complexities of the machine into the more abstract statement of step-by-step solutions.” Why, then, do we still have a Cambrian explosion of new programming languages, and evangelism strike forces pooh-poohing all software that wasn’t written in the new hotness? On the plus side, Brooks identifies that “switching to [Ada will be seen to have] occasioned training programmers in modern software design techniques”. Is that happening in strike force land?
  • Object-oriented programming. “Such advances can do no more than to remove all the accidental difficulties from the expression of the design. The complexity of the design itself is essential; and such attacks make no change whatever in that.” The same ought to go for the recent resurgence in functional programming as a silver bullet idea: unless our programs were 10x as complex as they need to be, applying new design constraints yields equally complex programs, specified in a different way.
  • Artificial intelligence. “The hard thing about building software is deciding what to say, not saying it. No facilitation of expression can give more than marginal gains.” This is still true.
  • Expert systems. “The most powerful contribution of expert systems will surely be to put at the service of the inexperienced programmer the experience and accumulated wisdom of the best programmers. This is no small contribution.” This didn’t happen, and expert systems are no longer pursued. Perhaps this silver bullet has been dissolved.
  • “Automatic” programming. “It is hard to see how such techniques generalize to the wider world of the ordinary software system, where cases with such neat properties [as ready characterisation by few parameters, many known methods of solution, and existing extensive analysis leading to rules-based techniques for selecting solutions] are the exception. It is hard even to imagine how this breakthrough in generalization could conceivably occur.”
  • Graphical programming. “Software is very difficult to visualize. Whether we diagram control flow, variable scope nesting, variable cross-references, data flow, hierarchical data structures, or whatever, we feel only one dimension of the intricately interlocked software elephant.” And yet visual “no-code solutions” proliferate.
  • Program verification. “The hardest part of the software task is arriving at a complete and consistent specification, and much of the essence of building a program is in fact the debugging of the specification.” Indeed program verification is applied more widely now, but few even among its adherents would call it a silver bullet.
  • Environments and tools. “By its very nature, the return from now on must be marginal.” And yet software developers flock to favoured IDEs like gnus to watering holes.
  • Workstations. “More powerful workstations we surely welcome. Magical enhancements from them we cannot expect.” This seems to have held; remember that at the time Rational was a developer workstation company, who then moved into methodologies.

Meanwhile, of his “promising attacks on the conceptual essence”, all have accelerated in adoption since his time.

  • Buy versus build. Thanks to free software, we now have don’t-buy versus build.
  • Requirements refinement and rapid prototyping. We went through Rapid Application Development, and now have lean startup and minimum viable products.
  • Incremental development—grow, not build, software. This has been huge. Even the most staid of enterprises pay at least some lip service to an Agile-style methodology, and can validate their ideas in a month where they used to wait multiple years.
  • Great designers. Again, thanks to free software, a lot more software is developed out in the open, so we can crib designs that work and avoid those that don’t. Whether or not we do is a different matter; I think Brooks’s conclusions on this point, which conclude the whole paper, are still valid today.

About Graham

I make it faster and easier for you to create high-quality code.
This entry was posted in design, software-engineering.

3 Responses to Still no silver bullet?

  1. Peter Steiner says:

    Recently found a possible silver bullet for refactoring code bases: https://github.com/openrewrite/rewrite

    If it fits your use-case then it really saves 10x time.

  2. I think you are underselling OO a bit by just lumping it in with the rest:

    In NSB, Brooks wrote of OOP: “Many students of the art hold out more hope for object-oriented programming than for any of the other technical fads of the day. I am among them.”

    In the 20 year retrospective on NSB, he stated: “Of the candidates enumerated in “NSB”, object-oriented programming has made the biggest change, and it is a real attack on the inherent complexity itself.”

    So unlike the other approaches, OO is not just tackling accidental complexity.

    https://dl.acm.org/doi/pdf/10.1145/1297846.1297973

  3. Graham says:

    That’s fair, and if true then it puts a greater onus on proponents of other fad paradigms to either:

    1. Demonstrate that OO introduces its own accidental complexity that the other paradigm removes without replacing; or
    2. Demonstrate that there’s still a heap of untackled essential complexity in OO approaches.

    I’m more optimistic that the first has potential (there’s a lot of bad OO out there, because it’s so unevenly understood and applied) than the second. OO adopts the problem domain directly into the solution, so you can’t really find any more direct approach to addressing the essential complexity. Meanwhile other paradigms like FP first say “let’s pretend that this problem system is composed of pure functions” which seems additive, rather than reductive.
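That contrast can be made concrete with a minimal sketch (a hypothetical bank-account domain, not anything from the discussion above): the OO style carries the domain concept directly as an object whose methods change its state, while the FP style re-expresses the same concept as immutable values and pure functions, where “change” means producing a new value.

```python
from dataclasses import dataclass, replace

# OO style: the domain concept is an object; deposit mutates it in place.
class Account:
    def __init__(self, balance: int):
        self.balance = balance

    def deposit(self, amount: int) -> None:
        self.balance += amount

# FP style: the same concept as an immutable value plus a pure function;
# depositing returns a new state rather than modifying the old one.
@dataclass(frozen=True)
class AccountState:
    balance: int

def deposit(state: AccountState, amount: int) -> AccountState:
    return replace(state, balance=state.balance + amount)

acct = Account(100)
acct.deposit(50)          # acct.balance is now 150

before = AccountState(100)
after = deposit(before, 50)   # after.balance is 150; before is unchanged
```

Whether the pure-function reframing is “additive” or merely a different expression of the same essential complexity is exactly the point under dispute.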
