Essays on the near future by software wizards Dan Allen, Frank Alviani, Elon Gasper, Ted Johnson, Scott Knaster and Leonard Rosenthal.
The Macintosh is now a mature product.
As of January 1989 Macintosh is five years old. It has come a long way: from 128 KB of RAM to 8 MB, from 400 KB floppies to 1.4 MB, from no hard disk to gigabytes if needed, and from two applications (good old MacWrite and MacPaint) to thousands.
The average Macintosh user today falls into one of two categories: The Man On The Street (TMOTS) and The Man In The Office (TMITO). Those of the first category bought a Mac for its simplicity and ease of use, while those of the second category are much more sophisticated users, usually from the Fortune 500 or from a university. Most early Macintosh buyers were in the TMOTS category, while buyers of Mac IIs are largely business-related. College students fall somewhere in between.
There are probably a million Macintosh Plus-like machines around, and they are owned largely by members of category one: TMOTS. They want good, reliable versions of the old standbys: word processors, spreadsheets, databases, and perhaps some graphics. Their tasks are fairly standardized and include writing letters, creating flyers, keeping simple lists, and perhaps some accounting. These buyers — historically the large majority of Macintosh owners — are lucky if they have the 128 KB ROMs, one megabyte of RAM, and a 20 megabyte hard disk. They do not have accelerators, modems, large displays, or color monitors, and they do not use MultiFinder. The challenge for software writers for this audience is to produce more powerful versions of these basic applications while maintaining a high degree of simplicity and ease of use. In addition, these applications must be reasonable in size — say less than 300 KB on disk — and run in less than 300 KB of RAM. Unfortunately, few of today’s popular applications achieve this goal. They require what I term Category Two hardware. Let’s face it: The Man On The Street does not have an extra $1,200 to spend on RAM in order to use FullWrite. Instead she uses — and is very happy with — WriteNow, which will run in 128 KB.
Category Two users have lots of RAM and enjoy using MultiFinder. They have large, fast hard disks, color monitors, full-page displays, and they frequently connect up to other computers through modems and local area networks (LANs). This growing category consists mainly of Mac II buyers from the Fortune 500. These “power users” can afford to pay $10,000 for a computer because they use it 8 hours a day for their livelihood. This market segment is wide open to bringing mainframe applications down to the desktop, but simplicity of use must not be sacrificed. Areas that are good candidates for the Mac II platform include animation, signal processing, CAD, simulation, and multimedia. Each of these has in the past required highly specialized hardware, which makes them natural applications for the Mac II, especially when combined with custom NuBus cards.
The challenges for today’s developers consist of increasing the power of applications while maintaining simplicity. For Category One users, the additional challenge is to keep the application small in its memory requirements. For Category Two users, the challenge is to branch out into whole new types of programs that have in the past only been found on mainframes.
— Dan Allen
Dan Allen is a Software Explorer at Apple, where he has done programming, testing, teaching, documenting, benchmarking, and debugging on many different projects, including MacApp, HFS, HD-20, Mac Plus, MacsBug, and many parts of MPW, including the MPW Shell. He currently works with Bill Atkinson on HyperCard.
The ugly duckling has turned into a swan at last
The Macintosh-style graphical human interface stands vindicated by science and popularity. It is praised by its former detractors, admired by those who ridiculed it, and shamelessly imitated by a host that includes Big Blue itself. Many early Mac enthusiasts are ready to smugly pat each other on the back while wistfully reminiscing about how they were graphical interface before graphical was cool.
Is the evolution of the human interface all over? Or does another cycle emerge after this, with the current Mac graphical interface destined to become a dinosaur in turn? If so, where are the little mammals and when will we see their day?
The history of the human interface is one of increasingly complex programming creating systems that are more intuitive to use because they better fit innate human abilities. Our familiar Macintosh software conventions are the best known and now most standard embodiment of the latest generation in this evolution. They employ three main factors: icons, manipulation, and the desktop metaphor. Icons are more intuitive representations that replace the more distant abstraction of alphanumeric symbolism. Manipulation, through objects and pointing devices, lets us select, move, and handle simulated things. Finally, the metaphor of the desktop provides the context in which their power becomes practical for the use of software tools.
Future developments are definitely going to involve further refinements of the graphics metaphor. Witness Open Look from Sun, Intuition on the Amiga, and NeXT Step. However, these are just variations on a theme, not a new beginning.
That beginning is now upon us in a transition to a new generation of human interface. It will be a multimedia one, distinguished by having an even wider bandwidth of communication between person and machine. There are two primary means by which this enhancement will occur: sound and animation.
The use of sound brings another dimension to the user experience. This is not a reference to novelties like digitized startup-sound INITs and the Sonic Finder, but rather to tools where sound makes an inherent contribution to form and function. HyperCard is the first significant instance of a whole new class of tools to create this next level of solutions to the user interface problem.
In terms of complexity, the ultimate sound form for humans to decode and create is speech. Speech output and input are a logical next step in user interface. Hardware manufacturers are already preparing for it, as with the microphone built into the NeXT computer.
The other important area in the nascent new generation of human interface is animation. The power of animation as a fully realized component of a multimedia user interface will lie not with just prerecorded movies, but with random-access animation. This means real computer-generated motion pictures with interactive functionality under precise control of the computer and variable by it in real time. Consider, for example, the simulation of the human form, modeled by the computer in response to its needs for effective communication with us.
The multimedia synergy of sound and animation has potential barely explored. Voice and vision can be combined and synchronized in many ways. One particularly powerful combination is of two of the most advanced instances: synchronization of speech with facial animation to create “Max Headroom” type synthetic actors. These agents can then function as knowledge navigators, as demonstrated by Apple’s visionary John Sculley in his videos. The Macintosh-generated “Albert” character created by Harry Anderson’s StakDek company and now appearing on the Disney TV series “The Absent-Minded Professor” is another example of this fascinating possibility.
We Mac users have a special opportunity to be more than passive spectators as this new horizon comes into view. With the tools already at our command, our participation can be at the level of changing the world. For in the end, the user interface of the future will be created by us. Let’s not rest on our laurels over the acceptance of the Macintosh-style graphical interface. The victory of its vision and our faith is just another stage in the inevitable evolution of the human interface.
— Elon Gasper
Elon Gasper is the president of Bright Technology Inc. He is a specialist in interactive graphics and real-time programming for small computers. Elon previously was software development manager for a UCLA audiovisual research department and has also taught in the California State University System.
Five years have passed since the Macintosh burst onto the personal computer scene, and the future is finally starting to arrive. We immediately saw that the windowing, graphic interface was the most effective (and fun!) approach available for real human beings in most situations, and knew in a flash that the future of computing for the next 20 or more years was in front of us. Obviously, in a few years everybody would be taking a similar tack…
While it took longer than some of us thought, that flash of illumination is proving to have been more accurate than the traditionalists would ever have granted. And now that windows and mice and everything nice are sweeping the world, we have to prepare to play a different role: no longer “the brave hero braced against the icy blasts of conformist disapproval”, but good programmers who happen to have more experience in one style of interface. In short, the monopoly is rapidly going to disappear.
In the next three years, I expect MS-Windows/Presentation Manager to spread widely in the IBM world (MS-Windows for right now, due to the resource requirements for PM, with OS/2 finally becoming fairly visible after 1990). In the workstation market, X-Windows is enjoying considerable support from a wide variety of vendors who are normally intense rivals, and interfaces based on it will probably become quite widespread. The Atari and Amiga clans have had windowing interfaces for several years, and even lesser known systems are benefiting from the WIMP (Windows, Icons, Menus, and Pop-ups) approach.
The result is that while Macintosh programmers have a deep well of knowledge to draw from — making us highly desirable properties in the less-experienced MS-DOS markets — we can no longer depend on the uniqueness of the Mac interface to “protect” us. We must study the techniques and assumptions of the competition, and truly learn from them, if we are to remain competitive and possibly even expand our current horizons.
Complexity is another great challenge we face. While many great programs have been created by individual craftsmen and small teams in the past — and will be in the future — the unending feature wars that seem to make up a good deal of our industry’s marketing realities virtually guarantee that programs and staff will only get larger as time passes. It is difficult indeed to evolve from rogue programmers into genuine teams — but absolutely vital that we do so! A team is more than just several programmers with a manager trying to balance impossible deadlines against providing enough new features to keep sales (and the paychecks) coming in — it is a group of people who are each aware of what the others are doing, where they are going, and what they can contribute to each other. A mere group of programmers is a millipede stepping on its own feet; a team will actually make progress. Learning to deal with code reviews, source management systems, and programs that consist of 100,000+ lines of code is part of our future — a future that can be satisfying if we are careful.
Perhaps the greatest challenge in the near future is to keep our souls. It is flattering to be a hot property in a new world; it is tempting to design programs to meet the magazines’ check-list reviews to get a good score and sell lots of copies. But many of us jumped onto the Macintosh tumbrel because we thought it was a better way to go; we didn’t know if it was destined for the palace or the guillotine, but a chance to make computers work naturally for people, to do things in a fun and non-intimidating way made the career risk worthwhile. We can’t let the grind of bug-splatting and feature-splicing take the fun out of it all, for the user and (even more importantly) us! Keep hacking! Do insanely great things sometimes for their own sake, even if marketing says they aren’t important!
— Frank Alviani
Frank Alviani started programming in high school in 1967 on an IBM 1620. His first Mac was a 128K model, but he now enjoys the luxury of a MacII at Odesta (creators of Double Helix II), where he makes tools and adds features. Outside interests include his family, blacksmithing, and remodeling his home.
It’s June, 1989.
There’s a big party taking place, a homecoming party for the just-graduated Benjamin Braddock. Curiously, the party’s guests are his parents’ friends, not his. Ben is bombarded with advice, good wishes, kisses and hugs. He’s feeling disoriented. In the living room, he is accosted by Mr. McGuire. We join in progress:
Mr. McGuire: Ben.
Ben: Mr. McGuire.
Mr. McGuire: Ben.
Ben: Mr. McGuire.
Mr. McGuire: Come with me for a minute. I want to talk to you.
(Mr. McGuire puts his arm around Ben’s shoulder in a fatherly fashion and walks him out to a private spot next to the swimming pool.)
Mr. McGuire: I just want to say one word to you — just one word.
Ben: Yes, sir?
Mr. McGuire: Are you listening?
Ben: Yes, I am.
(Mr. McGuire looks directly at Ben.)
Mr. McGuire: Objects.
(There is a long pause as Ben ponders this.)
Ben: Exactly how do you mean?
Mr. McGuire: There’s a great future in objects. Think about it. Will you think about it?
Ben: Yes, I will.
Mr. McGuire: Shhh. ‘Nuff said. That’s a deal.
Will this scene really be happening next year? I hope so. It should (except that you don’t have to graduate from college to heed this advice). Object-oriented programming has been one of our industry’s Next Big Things for a long time, right up there with the Year of the Network and voice recognition. But object-oriented programming is really happening now, finally, in a big way. If you want to have a good, long life as a programmer, one smart thing you can do right now is to immerse yourself deeply in object-oriented programming.
Aha! I hear you out there, you skeptical engineers, asking “Why? What’s the big deal? Why can’t I just keep writing software with procedure-oriented systems?” Well, I’m here to tell you briefly about some of the great and fun benefits of object-oriented programming, including at least one benefit that you’ve never heard before.
Object-oriented programming supports the creation of user-defined types (usually called classes) that specify not only an object’s state (like fields in a C struct or Pascal record), but also how to make it do something (usually called methods). This makes objects into black boxes that already know how to do things.
This is kind of like the real world: your clock radio has some fields that show state (the time, the alarm time, the station, the volume), and it has methods (turn on, turn off, change the station, make the display dim). By putting the state information and the methods together in the same place, objects reduce the number of concepts you have to worry about.
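The clock radio analogy maps directly onto a class definition. A minimal sketch follows, written in modern Python purely for illustration (the period-appropriate choice would have been MacApp's Object Pascal); all names are invented:

```python
class ClockRadio:
    """State (fields) and behavior (methods) bundled in one black box."""

    def __init__(self):
        # Fields that show state, as on the real appliance.
        self.time = "12:00"
        self.alarm_time = "07:00"
        self.station = 91.5
        self.volume = 3
        self.powered = False

    # Methods: the only way the outside world operates the radio.
    def turn_on(self):
        self.powered = True

    def turn_off(self):
        self.powered = False

    def change_station(self, frequency):
        self.station = frequency

# The caller never touches the internals directly; the object
# "already knows how to do things."
radio = ClockRadio()
radio.turn_on()
radio.change_station(104.3)
```

The point is not the trivial code but the packaging: anyone using `ClockRadio` needs to know its buttons, not its wiring.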
The Macintosh Toolbox knows about some general details of windows, controls, and menus, but their specific appearance is defined by definition procedures, or defprocs. In this way, parts of the Toolbox can deal with classes of components that have well-defined interfaces but very different details. Object-oriented systems use this handy feature extensively.
Nobody writes a Macintosh program from scratch. Everybody starts with a shell, or another application, and then adds wondrous new functionality. Object-oriented systems are designed for this kind of sharing. Classes of objects are defined as refinements of existing classes. All the behavior of the existing class is inherited; then, the programmer specifies what’s different about the new class.
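That "start from something that works, then specify only what's different" pattern is exactly what inheritance formalizes. A hypothetical sketch, again in modern Python with invented names:

```python
class Document:
    """An existing, working class: the 'shell' everyone starts from."""

    def __init__(self, title):
        self.title = title

    def describe(self):
        return f"Document: {self.title}"

    def save(self):
        return f"saving {self.title} as plain data"


class DrawingDocument(Document):
    """A refinement: inherits describe() unchanged and overrides
    only the one behavior that differs."""

    def save(self):
        return f"saving {self.title} as a drawing"
```

`DrawingDocument` is a complete, usable class even though it defines a single method; everything else comes along for free from `Document`.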
Now, for the best reason to find out about object-oriented programming, and the one that nobody’s ever talked about before: the industry’s leaders are finally turned on to it, and there’s no looking back. MacApp and its descendants are going to be vitally important in the future Macintosh world. More and more people are deciding that object-oriented programming is the best solution available for the kinds of computers and interfaces that the world is making.
It’s a good idea not to think of this object-oriented stuff as the easy way out. Object-oriented programming does not make Macintosh programming (or anything else) trivial. It does make it more manageable and more reasonable, and it often makes it more fun, too.
So put it on your list of things to do. Get MacApp. Try it. Have fun with it. The train’s a-comin’, and you can climb aboard, watch it go by, or get run over. And cow catchers leave nasty scars.
(Special thanks to David Goldsmith and to the Voyager Company’s Criterion Edition laser disc of “The Graduate”.)
— Scott Knaster
Scott Knaster is the author of How to Write Macintosh Software (Hayden Books, 1986) and Macintosh Programming Secrets (Addison-Wesley, 1987), the only Macintosh technical book with a comic strip. He was Technical Support Manager for Apple from 1984 to 1987 and was a co-founder of software publisher Acius, Inc. He is now a Writer-Engineer at Apple.
In January of 1984, Apple Computer, Inc. released the Macintosh and a new metaphor for personal computing — the “Desktop Metaphor”. This metaphor was simple — your computer’s desktop should emulate your personal desktop. There you could store your files, keep your tools for crafting these files, and throw things away when they were no longer needed. But Apple forgot one thing: on your personal desktop usually sat a telephone that allowed you to communicate with the rest of the world, and that telephone was missing from the world of the computer desktop…
Attempts have been made ever since to bring that telephone to your desktop. We were given the software that would store away our large address books and, when needed, actually dial our telephone for us. Then came the generation of the “Terminal Emulators” that would allow us to communicate with the older generation of computers in their outdated textual modes. Now we begin to see the advent of Graphical Terminal Programs that promise to bring the Macintosh Interface to these antiquated textual systems — but is this really where we want to go?
A few years ago I had the pleasure to be part of a group whose job it was to find ways to integrate the Cray Supercomputer with the Apple Macintosh in the most transparent way possible. Initially there were simply the scripts and macros for the telecommunications programs of the day, but then came the idea — “The Cray Finder”. It was the communications interface of the future, bringing communications to the desktop where it belonged. Unfortunately, it is still just an idea, but one whose time has come.
My vision of future telecommunications revolves around the concept that telecommunications should be transparent to the user. It should be as intuitive as dragging a folder into the trashcan to throw it away or, even more appropriately, addressing and stamping a letter to go out US Mail.
Let’s say you have just written the “Great Short Story” and you need to get it off to your editor before the deadline. Well, on your desktop is a folder which you have specified as belonging to your editor. You simply drag the document (or a copy of the document) into this folder and, poof, it gets mailed, because you specified the mail address of the recipient when you created the folder. The system knows how to connect with the other side and transmit the document. And this does not have to be just E-Mail — why couldn’t it be sent as a fax? The user should not be concerned about how to get the data to its destination — it should just get it there!
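The drop-to-send folder can be sketched in a few lines. This is a hypothetical illustration in modern Python (all names invented; a real implementation would hook into the Finder and a communications layer); it only shows the key design idea, that the folder rather than the user knows the recipient and the transport:

```python
class MailFolder:
    """A folder that remembers, from its creation, where its
    contents should go and how they should get there."""

    def __init__(self, recipient, transport):
        self.recipient = recipient
        self.transport = transport  # e.g. "email" or "fax"

    def drop(self, document):
        # The user just drags and drops; delivery details are
        # the folder's problem, not the user's.
        return f"sent '{document}' to {self.recipient} via {self.transport}"

# Set up once, when the folder is created...
editor = MailFolder("editor@publisher", "email")
# ...then every drop is transparent.
result = editor.drop("Great Short Story")
```

Swapping `"email"` for `"fax"` changes nothing about how the user works, which is the whole point of the metaphor.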
That’s fine for mail, but what about BBSs and services like GEnie and CompuServe? The metaphor should still hold true. To find out what new information is on GEnie, you open up your GEnie folder, which might have subfolders called Mail and Macintosh RoundTable. To read your new mail, or your new messages, you simply open up the appropriate folder and you would be presented with the documents (and possibly even more subfolders) for you to act on as you see fit. Here again, you previously specified some parameters for the folder when it was created, such as how often to check for new mail and messages. Since the system is responsible for dealing with the data transmission process, it is possible that faxes and files might even be waiting for you in your mail folder.
To bring telecommunications on the Macintosh to this level, there are really two problems that need to be solved — one by Apple system software and one by some good Mac programmers.
The first thing that needs to happen is for Apple to improve its support for both asynchronous and AppleTalk communications. On the hardware level, this includes support for multi-port serial boards for the Mac SE and Mac II, as well as internal modems. On the software level, this would provide programmers with a common interface to both serial and AppleTalk communications. This would also make data transfer via protocols such as TCP/IP and X.25 transparent to the programmer. She could write for one common interface which would work no matter what method of transfer the user chose for her data.
It would appear that Apple is about to take steps in this direction with the much rumored Communications Toolbox. According to the rumors that appeared in many periodicals, this Toolbox would function much like the current OS Toolbox by providing sets of “managers”, each one responsible for a related set of communication tasks. For example, one might be responsible for routing data to the appropriate location via either serial or AppleTalk lines, while another might be responsible for doing different types of file transfers. This is just what the doctor ordered. It’s still in its infancy, but so is my dream.
Once the drudgery of data communications has been made much simpler, the task of making communications transparent to the user can be handled considerably more easily and in much less time.
The communications software would require not only a very intuitive user interface for the user to specify the parameters for each folder’s communication needs, but also a very powerful scripting language to carry out its tasks properly. A scheduler and usage log would also be nice touches, so the user can have even more control over his communications and later be able to see exactly what transpired.
Although my vision of future telecommunications revolves predominantly around mail or mail-like features, it is my feeling that this will be the largest usage for telecommunications in the future. Such things as terminal emulation, so that users can communicate with older equipment, would still be available through a special mode of the communications software, but its primary function would be data exchange via mail-like means.
— Leonard Rosenthal
Leonard Rosenthal has been programming computers since he was first introduced to them 9 years ago. Since 1984 he has written both shareware and commercial applications for the Mac, including, most recently, ∑Edit and Aldus' Persuasion. He is currently employed by Software Ventures in Berkeley as one of the programmers for MicroPhone.
Developers familiar with the concept of object-oriented programming know that it has the potential to significantly alter the way software is developed.
And at Aldus, we believe that object-oriented programming will bring about fundamental changes in the packaged software industry — that it will open new markets for both large and small software developers in the years to come. I’d like to explore the opportunities that we believe will be created for “component” developers.
Object-oriented programming is based on the principles of encapsulation and reusability. The principle of encapsulation states that objects, or components, are complete in and of themselves, and that their interaction with other objects is isolated and clearly defined. The principle of reusability states that encapsulated objects can and should be reused, with or without modification, to build more complex objects and applications. Such encapsulated, reusable objects have been called “software ICs,” since, like hardware integrated circuits, they have clearly defined inputs and outputs and are expected, by their very design, to be used in conjunction with other ICs to create complex systems.
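The software-IC idea can be sketched as components behind a fixed interface, assembled by an integrator that never looks inside them. A hypothetical illustration in modern Python (all names invented, chosen to echo the text-filter components discussed below):

```python
class TextFilter:
    """The 'pinout' of the IC: one clearly defined input and output."""

    def run(self, text):
        raise NotImplementedError


class TrimFilter(TextFilter):
    """An encapsulated component, complete in and of itself."""

    def run(self, text):
        return text.strip()


class UppercaseFilter(TextFilter):
    """Another component with the same interface."""

    def run(self, text):
        return text.upper()


class Pipeline:
    """The 'board-level integrator': an application assembled
    from components it did not implement and cannot see into."""

    def __init__(self, filters):
        self.filters = filters

    def run(self, text):
        for component in self.filters:
            text = component.run(text)
        return text


output = Pipeline([TrimFilter(), UppercaseFilter()]).run("  hello  ")
```

`Pipeline` depends only on the `run` interface, so a third party could ship a new filter class and the application would accept it unmodified, which is the reusability half of the bargain.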
As object-oriented programming techniques gain widespread use on the Macintosh — as we believe they will — a new class of component suppliers will evolve to provide the software ICs from which tomorrow’s great applications will be built. At Aldus, we haven’t fully assessed the impact of such a development on our business, but we do know it will be profound. We don’t believe, for example, that we’ll continue to create entire applications from scratch as we did with Aldus PageMaker®. Nor do we believe we’ll continue to license and market completed applications developed by others. Instead, we’ll buy or license components, add custom components of our own, and then market and support the completed application.
Aldus has already begun to push ahead in this new direction. Aldus PageMaker 3.0, for example, contains text import and export filters that are components of code developed independently of the program. These code resources are installed directly into the application file with a utility like the Font/DA Mover. Aldus® FreeHand™ 2.0 contains a similar mechanism for installing special text effects. Future Aldus versions and products will expand on these ideas, so many opportunities exist for independent developers to create additional product filters and features in this manner. And as we adopt more object-oriented programming techniques, we expect to see an explosion in the opportunities for independent developers to provide components for Aldus products.
But there are many other issues to explore about the opportunities for software developers in the new world of software ICs. Who, for example, will become the $200-a-pop CPU suppliers, and who will supply the dime-a-dozen memory chips? Will application vendors like Aldus become simply board-level integrators? Will we specify and build custom ICs? Or will we specify the ICs and turn to outside subcontractors to provide the chips? Will we license these custom software ICs to others? These and many other questions remain to be answered. But we’re convinced that fundamental changes will sweep the packaged software industry in the 1990s. And we intend to be a leader in the revolution.
— Ted Johnson, Aldus Fellow
Ted Johnson is an Aldus Fellow who has worked at the Seattle software company since May of 1985. He led development of the PC version of PageMaker, then was in charge of version 3.0 for both the PC and Mac. Recently, he headed the engineering side of Persuasion’s rollout. His current interests are in object-oriented programming, multimedia integration, and electronic publishing.