OS X Ramblings
The Spy has mentioned problems and concerns with OS X in this space several times. Applications quit suddenly, permissions get unglued, and updaters fail to solve the problem.
Recently the Applications folder on a new 1G TiBook got a mind of its own: any attempt to move an application into it or into a subfolder caused the Finder to quit and restart, leaving the attempted action unperformed (a self-taught folder action, as it were). No problem with text and other files, just applications. Moreover, the Finder's Get Info panel could not access the permissions without triggering the same effect.
Went to the install disk, repaired the permissions, and ran Disk First Aid, DiskWarrior, and TechTool, all to no avail. I also applied the 10.2.4 update, but got nowhere. The only fix was an archive-and-install from the 10.2.3 disks. The latter was also the only way to solve a classic-application quitting bug on a 1G desktop that had been upgraded successively (but not entirely successfully) from 10.1 through 10.2.3.
Conclusion: Apple’s update process is not catching all the files it should, either when the updater is constructed or when it is applied. Use only the archive-and-install method.
Still, with 10.2.2, Apple has passed a kind of divide in day-to-day usability and stability. For the first time, the Spy is willing to heartily recommend switching from 9.2. Now, if only NisusWriter for OS X were available. Patience. Patience.
The Is Technology Better Department
When your friendly mail carrier wanders along the street multitasking between chatting up the customers, delivering paper mail and avoiding neighbourhood dogs, (s)he can easily handle items with the addressee’s name incorrectly spelled, house number digits transposed, or an incorrect postal (zip) code.
By contrast, our decidedly unfriendly eMail carriers return even slightly misdirected messages with a stern, automatically generated failure missive, whose content and utility vary with the robot that refused delivery. One mail carrier uses fuzzy logic to forgive mistakes; the other is far too much a slave to DWIS (do what I said) rather than DWIM (do what I mean). Hmm.
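The difference can be sketched in a few lines. The matcher below tolerates small spelling errors via edit distance, the way the human letter carrier does; the names and the tolerance threshold are invented for the illustration and belong to no real mail system.

```python
# Illustrative only: a DWIM-style address matcher that, like the human
# letter carrier, forgives small spelling mistakes. A DWIS-style robot
# would demand an exact match (tolerance of zero) and bounce the rest.

def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def dwim_deliver(addressee, mailbox_names, tolerance=2):
    """Deliver to the closest mailbox within tolerance, else refuse (None)."""
    best = min(mailbox_names,
               key=lambda name: edit_distance(addressee.lower(), name.lower()))
    if edit_distance(addressee.lower(), best.lower()) <= tolerance:
        return best
    return None
```

With a tolerance of two, "Jon Smyth" still reaches John Smith's box; set the tolerance to zero and the matcher behaves like our unforgiving eMail robots.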
On a similar note, the Spy observes that a particularly nasty spammer has taken to using an Arjay Enterprises domain (4civ.com) as a return address. Naturally, a large percentage of these messages are refused and returned to the putative sender. But this means my mailbox has become choked with thousands of these bounces, and my company’s reputation is damaged. Sending a complaint to the domain responsible (eNews.com.tw) resulted in even more such returned mail, this time using the return address from which we mailed the complaint. Hmmm.
In common with many others, I use mail filters to reduce the amount of spam, then occasionally check the directory where the trash ends up to see if anything has been misdirected there. Unfortunately, in order to catch a reasonable proportion of the spam (perhaps half of the hundred or so a day in that category), we find legitimate mail being trashed as well, the latest casualties being two applicants for jobs, a communication from a lawyer, and a newsletter we really want to arrive. All contained “bad” phrases in innocuous contexts. Again, hmmmm.
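The failure mode is easy to demonstrate. The sketch below shows why naive phrase matching trashes legitimate mail; the phrase list and messages are invented and are not this column's actual filter.

```python
# Illustrative only: a naive keyword spam filter. The phrase list and
# messages below are invented for this sketch.

BAD_PHRASES = ["free offer", "no experience necessary", "act now"]

def is_spam(message):
    """Flag any message containing a 'bad' phrase, regardless of context."""
    text = message.lower()
    return any(phrase in text for phrase in BAD_PHRASES)

# Real spam is caught...
junk = "ACT NOW for this FREE OFFER!!!"
# ...but so is a genuine job application that happens to quote the ad.
applicant = ("I am applying for the junior post; your ad said "
             "no experience necessary, and that describes me exactly.")
```

Both messages trip the filter, which is precisely how job applicants, lawyers, and wanted newsletters end up in the trash.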
Besides showing that spammers utterly lack morality, politeness, respect, civility, and similar basic characteristics of supposedly civilized people, these problems illustrate that technology intended as an improvement can easily have side effects that make life worse. In a sense, technology can be its own disease.
In my Software Engineering course, I worry aloud to students that failure to follow a rigid discipline of code planning, review, testing, and maintenance leads to projects that either fail during development, or collapse under their own unmaintainability in production (incidentally wasting billions in client funds). For sufficiently large projects, one can only make statistical estimates of the number of bugs.
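One standard way to make such a statistical estimate, offered here purely as an illustration and not as anything from the course notes, is capture-recapture (the Lincoln-Petersen estimator): have two reviewers inspect the same code independently and see how much their bug lists overlap. The counts below are invented.

```python
# Illustrative only: estimating the total bug population statistically
# via capture-recapture (Lincoln-Petersen). The counts are invented;
# the technique, not the numbers, is the point.

def estimate_total_bugs(found_by_a, found_by_b, found_by_both):
    """Two reviewers inspect the same code independently.
    Little overlap in their catches implies many bugs remain unseen."""
    if found_by_both == 0:
        raise ValueError("no overlap: the estimate is unbounded")
    return found_by_a * found_by_b / found_by_both

# Reviewer A finds 25 bugs, B finds 30, and 15 are common to both:
# estimated total = 25 * 30 / 15 = 50. The two reviews turned up
# 25 + 30 - 15 = 40 distinct bugs, so roughly 10 likely remain.
```

The less the two bug lists overlap, the larger the estimated iceberg below the waterline, which is exactly the statistical flavour of claim one can make about a sufficiently large project.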
Sutcliffe’s saying (unless someone else has already named it) is: Any sufficiently complex technology is unmanageable. It can be neither understood, effectively modified, nor maintained. It is doomed to collapse. Certain hundred-million-line-plus programs whose names I will not mention have probably reached this exalted status. Has the Internet? Has eMail? Is there a fix?
The Mad Scientist Scenario
Say the word “crooked” and most people, rightly or wrongly, will add “politician”, “lawyer”, or “businessman.” (Why not businesswoman?) Those are bad enough stereotypes, but that of us scientists is worse. Mary Shelley’s Dr. Frankenstein and his brutish home-made monster gave rise to a whole genre of horror for a reason. It touched a deep-seated fear that we might auto-immolate (whether vertently or inadvertently), our very technological successes becoming the instrument of self-inflicted demise. This is an important concern, for high technology inevitably places in the hands of a large number of individuals the power to utterly destroy civilization (to the point of sterilizing the planet permanently).
This month’s Analog editorial wonders further if this provides the answer to the Fermi paradox. (i.e. We haven’t been contacted by aliens because there’s no one out there, and that’s because all sufficiently advanced civilizations eventually destroy themselves.) You see, a madman bent on genocide or other deadly mischief need no longer persuade a nation by demagoguery to raise an army to do his will. All he needs is to corrupt a single knowledgeable scientist.
This fact takes the “mad scientist” scenario from the realm of junk pulp SF to the front burner of real modern concerns, for there is plenty of evidence that dictators in several regimes have already done just this, and merely require sufficient time to instantiate any number of potentially lethal schemes.
Closer to home (for our profession at least), were, say, any of the top fifty net cops to go bad, that one person could put an end to the network as we know it. Indeed, only a combination of skill and loyalty on the one hand, and incompetence on the other, has kept the net going as long as it has. Without much more attention to security, from the code planning stage through to the deployment and maintenance of our computing infrastructure and applications, we may soon have to learn what life was like before the Internet came into being. Hands up, now. How many remember?
This is a good motivator for writing fiction. One can speculate about such matters more subtly, at the same time exploring options that might prevent a technological meltdown singularity. Thus my Hibernia (where the Irish divide their time between ruling the world and fighting among themselves), offers these (admittedly band-aid) solutions:
1. there is an almost universally pervasive common moral code, as Christianity once was on much of Tirdia (our earth),
2. some technologies (weapons that kill at a distance, human genetic engineering) are banned altogether,
3. because humankind is fallen in all the alternate universes, there will always be cheaters whose activities need to be policed, lest they unleash the forces in Pandora’s Box,
4. the police are themselves subject to the authority of the royal court,
5. when that in turn becomes corrupt, life itself, never mind civilization, hangs by a thread. Only the king stands in the gap against chaos, and he’s been deposed.
Throw in the use of high technology for a couple of centuries longer than us, saints and sinners, good and bad hackers, romance, coming of age angst, and some other entertainment hooks to draw people into the ideas, and you have some idea what The Timestream novels are all about. I mention this because Writers Exchange ePublishing is about to release The Friends, which is Volume Two of The Interregnum, and a sequel to The Peace. Initially available only as eBooks, both should be in dead tree format as well within a few weeks more. Volume three, The Exile, is not far behind. See the links below.
The Language Zoo Revisited
On an only slightly related point, I note that there’s getting to be much more activity on the programming language front these days, with new experimental notations joining the parade to the post with each passing month. Are we heading back to the seventies, which were characterized by the availability of hundreds of programming notations, most of which never gained enough of a following to make their mark? And aren’t C++ and Java good enough for everything?
Yes, and no, in that order.
A cardinal rule of software engineering is There is no silver bullet. That doesn’t stop would-be Lone Ranger language designers from returning to their mines periodically in the attempt to fashion the perfect notation. And they should, for the incompleteness principle tells us that no technology can ever be known to be the best available.
For its part, C++ is an exceedingly poor teaching language. It is cryptic, arcane, and non-orthogonal. It is well suited to the maintenance of high priests, hackers, and their ilk, but suited neither to teaching nor to the kind of discipline upon which successful multi-hundred-million-LOC programs are built.
OTOH, Java is well designed, minimal, mostly orthogonal, and easier to learn, but suffers from a complex library, a few badly implemented or missing features of critical importance, and from its owner having poisoned the standards well. There will never be effective outside input or an effective Java standard, and cross-platform stability continues to be problematic. “Write once, run anywhere” remains an elusive goal.
The ISO standards committee SC22/WG13 did a remarkable job of giving Niklaus Wirth’s minimalist Modula-2 a well-defined semantics and of adding modern features like Generics and OO without the result becoming overly complex, but they were too late to deflect the fad wagon (at least in North America), so the best of the modern languages has been relegated to niches. Wirth’s later Oberon fragmented into so many localized efforts that it became impossible to say what the language/OS was any more, and consequently, it is unlikely to be used outside a few research universities.
Meanwhile, the web has imposed its own programming demands, largely in the areas of cross-platform compatibility (where Java was supposed to shine) and string handling (where it decidedly does not). As a result, various notations for scripting, string handling, and data stream management have come to the fore in the last ten years, though none of these pretends to be a general purpose language.
Is the Spy dreaming, or can we do better? Is it science fiction to hope all our collective talent could be put to use in making a notational tool that would have
1. the versatility of a modern general purpose notation,
2. the clarity, simplicity, and readability of Modula-2,
3. the string handling of the best scripting languages,
4. generics at least as (ahem) simple as, and even more elegant than, those of ISO Modula-2,
5. a simple and useful OO model (this has proven an elusive goal),
6. compilation to the JVM or an extension thereof (a UVM?),
7. a degree of interoperability with existing libraries, even those coded in other notations, and
8. standardization across platforms via the ISO process?
As part of my next sabbatical from the university teaching job that puts ground meat on my family’s table, I plan to undertake a feasibility study on the matter. Anyone out there interested in contributing?
Certainly all these co-processes are connected. If they aren’t, the next generation (if any) will write our epitaph: They grew too soon old und too late schmardt. We need to start asking and answering some of the tough professional questions about work quality. We need to do the same for our technological goals. Do we have any? Do we care who uses technology and how? Or are we even asking ethical questions any more? The old saw that science and technology were morally neutral was a dangerous illusion. It’s time we did more, better.
Memo to self: focus on one topic next month. Multitasking is hard.
Second memo to self: find a topic for next month.
–The Northern Spy