by Tom Jennings
Revised 27 July 1999
Input/output, I/O, is the klunky and crude write-off phrase for the only part of symbolic machinery that's of any use or interest: the place where the alien innards tie to the world, mainly to people.
The interface between people and their machines is pretty much the limiting factor in making symbolic machinery useful. It's historically been a bottleneck and a weak point, not because it's difficult (which it is) but because there's a cultural failure-of-will to spend any money or effort on it. Displays are still considered an afterthought on computers today; almost universally computers are sold as "monitor extra" (Apple, with its much better attitude towards these things, is the notable exception).
Personally I'd find it much better to have an absolutely fantastic set of interfaces (flat visual, limited-depth 3D, binaural or 3D sound, tablets, trackpads, styli, etc.) and who cares about the CPU. All the interface software sucks, because interfaces are made second place to CPUs. Few of us really need screaming CPU cycles; what we need are functions to get mundane and creative work done, however they might be built.
Typical "user interfaces" crapify otherwise useful machinery. We're not yet collectively sophisticated enough to will and implement useful or interesting ways to use computers. Human hearing is fantastically sensitive and subtle, with amazing directionality and phase sensitivity; and we've had the computing hardware and at least part of the knowledge to exploit it usefully and interestingly. But it's still relegated to laboratories; not because it can't be done, or would consume extrordinary resources; but because it's not "economically feasible", meaning that our culture hasn't found it "useful" enough in a short-term way to exert the effort it takes to MAKE it "economically feasible". (When a thing is deemed worthy, it gets the money; transistors and silicon was no accidental discovery, large sums were spent on pure research for decades to eliminate bulky components and component interconnections; because it met other cultural needs, such as defense.)
Anyways, I'm just ranting; I only needed HTML to contain the sole [original] entry in the I/O page here, which not coincidentally is a good example of what has been done to meet a need for people to deal with large amounts of information, both text and image, in a real-time environment. It's also, amusingly, equally awful; the whole thing is fascinating in many ways. This is ARTOC, an "automated" military strategic information center from computing's Bronze Age. Who else could spend the money, besides the military?
Article excerpted from ELECTRONIC INFORMATION DISPLAY SYSTEMS, Spartan Books, Inc., Washington, D.C., 1963; edited by James H. Howard, Rear Admiral, U.S.N. (Ret.). Reprinted without permission.
ARTOC was developed for the U.S. Army, around 1961, by Aeronutronic, a division of Ford Motor Company. By today's standards it provides improbably sophisticated functionality with unimaginably crude hardware. I admit at least half of my interest in this is perverse, in that I love multi-discipline machinery, but the functions provided -- let's not ask how robust this system actually was in the field -- are pretty damn amazing. And consider that the displays were done optically; the quality of the displayed images was probably a damn sight better than that of the average large display today (though of course ARTOC was severely limited in the data it could present).
But while you're cringing and guffawing over the hardware ("hundred million bit disk file... each housed in a 2-1/2 ton utility truck" -- a hundred million bits is only about twelve megabytes), the data coordination, presentation and redundancy are pretty damn cool.
Please note, some of the photos in the original book were not very good; the scans are not the problem.