The Northern Spy — If All You Have Is A Hammer

May 2015

Why is it
that so many software projects fail? Here in the frozen north, we routinely see one government and enterprise IT project after another delivered late and not working, to the opprobrium of all the putative users. Government catastrophes run into the tens of millions. Notable failures in the United States include the IRS ($8B), FAA ($2.5B), FBI ($500M), McDonald’s ($170M), and Denver airport ($560M) projects, to select a very few.
And yet, though there is no single silver bullet to kill the vampire sucking the life blood out of software development, the principles of sound software engineering production and management have been well known for decades. We know how to solicit input from potential users, design, prototype, choose development tools, factor, modularize, develop in well-constructed teams, use code repositories, do code review, perform both automated and in situ tests, and do user training and maintenance. We’ve been teaching these principles since at least the early eighties.
So again the Spy asks “why?” From this seat, the painful answers are simple enough, and the situation is a fine example of:
1. The Spy’s third law.
The practice of theory never matches the theory of practice.
The industry pays lip service to the theory that tells us how to do it slow, clean, and right, but most shops practice the quick and dirty that they know gets it wrong almost all the time. Far too much software development is little more than an extended hack, sometimes producing something that may walk with a limp but will never run well or for long, is unmaintainable and unmanageable, and cannot be scaled up while remaining functional.
2.  The Spy’s fifth law.
Every sufficiently large monolithic project eventually becomes unusable, unmanageable, non-maintainable, and non-upgradable.
This has many corollaries (see the Northern Spy site), not the least of which is that the more money thrown at a badly planned project, the worse it inevitably gets. What used to take weeks to go horribly, terribly wrong (giving some chance to fix it) can now cost billions in nanoseconds.
3. Abysmal management practices
A Provincial government here in the land of disappointed Canuck hockey hopes and igloo housing–a government that shall go unnamed, but isn’t far away–recently and most reluctantly admitted that the school management software it mandated Province-wide eight years ago ($97M) needed replacement (read: was a catastrophe–grade entry could take hours because the system was so slow) and announced a competition to replace it. Does anyone know what criteria were used to choose the winner? Has anyone consulted the users? Can anyone believe the new 12-year contract will produce better results? Ditto the new health care management system. Early reports have it that things once taking three or four clicks to achieve now take twenty or more–not a good sign. Dilbert’s managers are alive and well and live everywhere.
4. Poorly chosen software tools.
Tools for software development have always lagged behind hardware capabilities, for several reasons, not the least of which are that tool creation is difficult and installed base is an anchor. Management dictates the use of, and/or developers adopt, development tools for the same reasons they buy hardware–emotional attachment, the enthusiasm of friends and colleagues, perceived coolness factors, installed base, and hackability quotient–not necessarily quality (why else so many PCs?). Sound software engineering principles such as safety, security, reliability, and appropriateness are seldom factors. If all you have are software hammers, momentum and the casting of every problem into a nail become the reigning paradigms.
Yet, the heady days of the sixties through early eighties saw robust experimentation with programming language constructs designed to improve the software development regimen. For a time, we became discriminating about our tools, and were rewarded by the production of both special- and general-purpose languages designed to promote sound practice. One of these appeared in the early 1980s.

Modula-2
was created by the famous language designer Niklaus Wirth of ETH Zurich in the late 1970s to overcome the deficiencies of extant notations, correct what he believed were errors in his earlier ALGOL W and Pascal, and demonstrate that a restricted group of intensely thoughtful and practical designers could design far better than the large committee that put together the extensive but unwieldy Ada for the United States DoD.
Oh, Ada was a great state of the art theoretical toolbox, but like all horses designed by a committee, had overly many humps (such as operator overloading) and it took years to produce compliant compilers. Moreover, the DoD connection led many to boycott it. Wirth’s Modula-2 was leaner, meaner, easy to learn and use, and produced better, faster, more reliable code than anything else on the market at the time. It still does. In fact, the Spy knows of software engineers who design and develop first in Modula-2, then re-do their work in C++, the latter solely to satisfy managers who know no better. That reminds him of the days when managers mandated software flow charts they scarcely understood themselves, so the market produced code analysis tools to generate flow charts after the fact.
Modula-2’s design actively promoted the principles of sound software engineering: planning projects in carefully factored modules that encourage information hiding, with only the interfaces made public (thus promoting the use of Abstract Data Types and re-usable software without the overhead of OO); type safety; marking programs at the outset when they contain inherently unsafe constructs; and generally making a simple but powerful set of tools easier to teach, learn, use, and develop with. The code is readable and maintainable.
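The information-hiding idea can be sketched even in C, though less cleanly than in Modula-2, where a DEFINITION MODULE exports an opaque type and an IMPLEMENTATION MODULE hides its representation. The stack module below is purely illustrative (none of these names come from any real library); the comments mark which part would be the definition and which the implementation:

```c
/* A hypothetical C analog of a Modula-2 opaque type: clients see only
 * the interface; the data representation stays hidden.                */
#include <stdlib.h>

/* -- interface (in Modula-2, the DEFINITION MODULE) --                */
typedef struct Stack Stack;          /* opaque: no fields visible here */
Stack *Stack_create(void);
void   Stack_push(Stack *s, int v);  /* sketch: no overflow checking   */
int    Stack_pop(Stack *s);          /* precondition: stack not empty  */
int    Stack_isEmpty(const Stack *s);
void   Stack_destroy(Stack *s);

/* -- implementation (the IMPLEMENTATION MODULE) --                    */
struct Stack { int items[64]; int top; };

Stack *Stack_create(void) {
    Stack *s = malloc(sizeof *s);
    if (s != NULL) s->top = 0;
    return s;
}
void Stack_push(Stack *s, int v)    { s->items[s->top++] = v; }
int  Stack_pop(Stack *s)            { return s->items[--s->top]; }
int  Stack_isEmpty(const Stack *s)  { return s->top == 0; }
void Stack_destroy(Stack *s)        { free(s); }
```

Because callers can touch the Stack only through these five procedures, the representation can later be replaced (say, by a dynamic array) without breaking a single client, which is exactly the re-usability the paragraph above describes.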
Why didn’t Modula-2 take over the world? It might have, had it been called Pascal-2. But theory vs. practice intervened again. Wirth’s definition, published in a series of editions of Programming in Modula-2, contained just enough vagueness, especially in the libraries (e.g. a very simple I/O library that was only suggested, not prescribed), and enough variation between editions, that compiler manufacturers’ products were all mutually incompatible, especially for I/O. Indeed, the best an early edition of the Spy’s text on the subject could do was suggest three broad models for I/O that students could adapt for learning particular products (using from one to five library imports). Code portability was out of the question.
Consequently the British Standards Institution (BSI) proposed to the International Organization for Standardization (ISO–not an acronym, but from the Greek isos, “equal”) that it lead a standards effort, so that manufacturers of compilers would have more specific and consistent targets. A number of nations, including Canada and the United States, agreed, joined the effort, and sent delegations to meetings starting at Nottingham in 1987 and continuing at other locations in Europe, North America, and New Zealand in subsequent years.
The Spy was Canada’s delegate, attending all but the New Zealand meeting (and hosting one at TWU), and wrote portions of, and edited, the entire standards suite. In the course of time, JTC1/SC22/WG13 did eventually produce an official standard that was accepted by most countries and adopted by compiler manufacturers.
But the process involved eleven years of arguing over the semantics of export from local modules, the complexity of the layered I/O library, and the practicality of the various delegates’ preferred additional features, such as exceptions, termination code, object-oriented facilities, generics, and the inclusion of COMPLEX and BCD ADTs.
Take OO facilities, for instance. What is needed for functional OO is well known, but there are many binary decisions around the design of an OO language (arity, traceability, object protection, and so on), and if one examines any three languages, two will do it one way and the third the other, with no clear majority. The committee being unable to decide either, Modula-2 ended up with both traced and untraced objects. It also ended up with a complex exceptions/termination system, partly in the language and partly in libraries.
By the time it was done, and the results had become ISO standards 10514 (a 705 page exhaustive base standard written in VDM-SL), 10514-2, and 10514-3 (smaller Generics and OO subsidiary standards written in plain language, the former by the Spy), the world was weary of waiting and had moved on. Developers had begun using other tools they imagined were good enough, and had become unwilling to better their choices. Type safety, modularization of code, and readability were all abandoned, with C++ and similar tools enthusiastically employed building software on quicksand. Apple’s Objective-C was somewhat more reliable, but only because it was tightly controlled and restricted to their ecosystem.
Oh, one can do a lot with these and other hard-to-learn, easy-to-write, hard-to-read, -debug, and -maintain languages, and many people have. Some are hackers’ delights. Some, like PHP, are so unsafe that web hosts routinely disable many of their functions, or the hosts’ servers wouldn’t function at all. Others, like MS VBA, are so poorly designed, so non-orthogonal, and so badly documented that they are canonical textbook examples of how not to produce a language (even though many of us have to use it).
Simply put, bad tools and worse practice don’t scale up for size and volume any better than traffic roundabouts. Large projects that were badly conceived, designed, and managed and that then sustain continuous product use repeatedly demonstrate the weakness of common notations.
For instance, the larger the project, the more difficult it is to keep track of memory management, hence the more likely programs are to crash with out-of-memory errors because the programmer has forgotten to dispose of dynamic real estate. Since much software runs more or less continually over long periods of time, a single such error guarantees crashes, and is indeed the source of most such. Given the cryptic nature of the notations, and the poorly designed and documented software many create with them, finding such bugs is a herculean task (an apt comparison, considering what must be swept from the average software stable).
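The failure mode is easy to sketch in C. The counters below are crude bookkeeping added purely for illustration (no real allocator exposes them this way): a long-running service that forgets one disposal per request leaks without bound, while the paired version stays flat.

```c
/* A minimal sketch of the leak described above: a server loop that
 * allocates per request.  All names here are illustrative.            */
#include <stdlib.h>

static long live_blocks = 0;    /* crude leak bookkeeping, sketch only */

static void *acquire(size_t n) { live_blocks++; return malloc(n); }
static void  release(void *p)  { live_blocks--; free(p); }

/* Leaky version: each "request" forgets to dispose of its buffer, so
 * memory use grows with every iteration until the program dies.       */
void serve_requests_leaky(int n) {
    for (int i = 0; i < n; i++) {
        char *buf = acquire(1024);
        (void)buf;              /* ... use buf, then forget to free it */
    }
}

/* Correct version: every acquisition is paired with a release, so a
 * continuously running process stays at a constant footprint.         */
void serve_requests_fixed(int n) {
    for (int i = 0; i < n; i++) {
        char *buf = acquire(1024);
        (void)buf;              /* ... use buf ...                     */
        release(buf);
    }
}

long leaked_blocks(void) { return live_blocks; }
```

At a thousand requests the leak is invisible; at a billion it is a crash, which is why such bugs surface only after deployment, exactly as the paragraph above laments.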
Sun’s Java attempted to address this by having only traceable objects, but the language is compiled to bytecode for an interpreted virtual machine, which inhibits performance. Moreover, it has not been maintained or standardized, and for years had significant differences between platforms that militated against portability, so that it may now be considered a dead language.
The bottom line: software needs for the real world of the twenty-first century are too complex to handle with current tools and practice. Students easily discern this. In the Spy’s language theory course, they come up with the same lists of criticisms, the same laments over obviously poor tools, and the same wish lists for something better year after year. Apple recently attempted to address some of these issues with Swift, and that notation must catch on in the Apple ecosystem (Cupertino will require it), but despite some improvements in these directions, and the “wow” factor of the playground, it is still cryptic, lacking in safety features, has an unfinished air, and encourages hacking code rather than intentional discipline–a case in point being the undisciplined way in which operators can be invented and/or overloaded.

In 2008, the Spy commenced
working with telecommunications engineer and project manager Benjamin Kowarsch (then of France and Switzerland, recently once again of Japan) on a revision of Modula-2 with the modest initial goal of reviving and updating a tool that was already superior to what most people currently employed. The initial design was completed in 2010, so it received the moniker Revision-10.
However, life intervened, and the project languished for a time. When we returned to it a few years later, our frustration level with the available tools had grown and our goals therefore became far more robust.
– return to and stay with the simplicity of Wirth’s original design;
– be informed by, but only minimally adopt any of the ISO work;
– tear down, scrutinize, and justify every language feature and design decision;
– build sound practice into the language and library;
– produce a robust tool whose code is readable, maintainable, and scalable;
– add only necessary modern features to reflect best practice;
– remove little used or obsolete syntax and features;
– blur the lines between built-in and library ADTs by treating both identically in programs;
– build a library system that could be leveraged for code reusability and productivity;
– create an easy interfacing system to libraries in other languages, including C++, Fortran, and Swift/Objective-C;
– restore the SE culture of safety, security, reliability;
– improve the way SE is practiced and taught.
The design was finished in early 2015, and is now available on the Bitbucket repository, and some material (outdated at this writing) on a Wiki. Participation is welcomed. Some library details remain to be perfected, but finishing the report/description for publication by Springer-Verlag (Spy mostly responsible) and producing a reference compiler (Benjamin largely responsible) now have priority, and some of this latter work will inevitably dictate fine details around the file handling modules, which are a little rough and incomplete (though the top or user I/O layer is quite polished).
Thus far, only the base language has been completed. Object orientation will be added in phase two. Lest this sound disappointing to some, note that the Spy at least believes OO is overrated, at least in the sense that it is often used for things much more easily achieved using generics and operator binding. Remember that the earliest iterations of OO in the 60s and 70s were abandoned because undisciplined programmers too often created unnecessarily complex and unnavigable hierarchies of objects that were impossible to debug.
The Spy also notes that our orientation has progressed over the years from hardware, to process (algorithms), to abstract data types, to object types. Excepting the last, our understanding of most of these is relatively robust. But the problems facing us now are those of dealing with large data mines and enormous numbers of documents. We need new programming paradigms centred around these. (Document orientation?)
We believe Modula-2 R10 is an important milestone in language theory and practice, and that its ideas will be widely adopted in other notations, whether it becomes ubiquitous or not. It is time to lay aside or radically redesign C++ and like tools. Swift-2 may have possibilities when Apple finishes it. These outdated and unreliable languages had a good run for hacking together projects that were small by today’s standards, but something better is necessary to robustly handle the really big ones, and patching atop existing languages has never worked well.

In the months to come
this space will describe Modula-2 R10 under the following headings:
– the type structure;
– why some old language features were dropped;
– the rationale for adding certain new ideas;
– the control mechanisms;
– abstraction;
– the standard library facilities;
– the use of Wirthian macros to simplify I/O and achieve simple and very low overhead operator and reserved word binding;
– the use of simple generics (like those of the Spy’s ISO Modula-2 design) to help leverage libraries for robust and safe design, and to promote code re-usability;
– the use of library blueprints to guide the correct use of operator and reserved word binding (tidbit: you can bind to FOR, allowing a library to design, say, tree traversals that can be invoked in code using FOR node IN theTree DO….)
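That FOR binding lets a library own its traversal order while client code supplies only the per-node action. C has no such binding mechanism; the nearest sketch is a callback-driven traversal, shown below with purely illustrative names (this is an analogy, not Modula-2 R10 code):

```c
/* A C approximation of a library-defined FOR traversal: the library
 * fixes the visiting order; the client supplies only the loop body.   */
#include <stddef.h>

typedef struct Node { int value; struct Node *left, *right; } Node;
typedef void (*Visitor)(int value, void *context);

/* The library owns the traversal order (in-order here); the visitor
 * plays the role the FOR body would play in Modula-2 R10.             */
void Tree_forEach(const Node *t, Visitor visit, void *context) {
    if (t == NULL) return;
    Tree_forEach(t->left, visit, context);
    visit(t->value, context);
    Tree_forEach(t->right, visit, context);
}

/* Example client "loop body": accumulate the sum of values visited.   */
static void sum_visitor(int v, void *ctx) { *(int *)ctx += v; }

int Tree_sum(const Node *t) {
    int total = 0;
    Tree_forEach(t, sum_visitor, &total);
    return total;
}
```

The Modula-2 R10 version is cleaner because FOR node IN theTree DO reads as an ordinary loop, with no function pointers or context arguments exposed to the client.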
We hope to persuade Apple and others to open their ecosystems to become multi-language environments, and have little doubt about what discerning programmers will choose, given the choice. Enjoy the series.

–The Northern Spy
URLs for Rick Sutcliffe’s Arjay Enterprises:
The Northern Spy Home Page: http://www.TheNorthernSpy.com

General URLs for Rick Sutcliffe’s Books:
Author Site: http://www.arjay.ca

Publisher’s Site: http://www.writers-exchange.com/Richard-Sutcliffe.html

The Fourth Civilization–Ethics, Society, and Technology (4th 2003 ed.): http://www.arjay.bc.ca/EthTech/Text/index.html

Sites for Modula-2 resources
Modula-2 FAQ and ISO-based introductory text: http://www.modula-2.com
R10 Repository and source code: https://bitbucket.org/trijezdci/m2r10/src
More links, Wiki: http://www.modula-2.net
p1 ISO Modula-2 for the Mac: http://modula2.awiedemann.de/


About the Author


Opinions expressed here are entirely the author’s own, and no endorsement is implied by any community or organization to which he may be attached. Rick Sutcliffe (a.k.a. The Northern Spy) is professor of Computing Science and Mathematics at Canada’s Trinity Western University. He has been involved as a member of or consultant to the boards of several community organizations, and participated in developing industry standards at the national and international level. He is a co-author of the Modula-2 programming language R10 dialect. He is a long-time technology author and has written two textbooks and nine alternate history SF novels, one named best ePublished SF novel for 2003. His columns have appeared in numerous magazines and newspapers (paper and online), and he’s a regular speaker at churches, schools, academic meetings, and conferences. He and his wife Joyce have lived in the Aldergrove/Bradner area of BC since 1972.