Close to the Machine




In "Close to the Machine" you tell the tale of a boss at a small company whose new computer system lets him monitor the work of a loyal secretary -- something he'd never before thought necessary. You call that "the system infecting the user." But a lot of people view the computer as a neutral tool.

Tools are not neutral. The computer is not a neutral tool. A hammer may or may not be a neutral tool: you know, smash a skull or build a house. But we use a hammer to be a hammer. A computer is a general-purpose machine with which we engage to do some of our deepest thinking and analyzing. This tool brings with it assumptions about structuredness, about defined interfaces being better. Computers abhor error.

I hate the new word processors that want to tell you, as you're typing, that you made a mistake. I have to turn off all that crap. It's like, shut up -- I'm thinking now. I will worry about that sort of error later. I'm a human being. I can still read this, even though it's wrong. You stupid machine, the fact that you can't is irrelevant to me. Abhorring error is not necessarily positive.

It's good to forgive error.

And we learn through error. We're sense-making creatures who make sense out of chaos, out of error. We zoom around in a sea of half-understood and half-known things. And so it affects us to have more and more of our life involved with very authoritarian, error-unforgiving tools. I think people who work around computers get more and more impatient. Programmers go into meetings and they hate meetings. If someone meanders around and doesn't get to the point, they'll say, what's your point!?! I think the more time you spend around computers, the more you get impatient with other people, impatient with their errors, you get impatient with your own errors.

Machines are also wonderful. I enjoy sitting there for hours, and there's a reason it's so deeply engaging. It's very satisfying to us to have this thing cycling back at you and paying attention to you, not interrupting you. But it's not an unalloyed good -- that's my point. It's changing our way of life deeply, like the automobile did, and in ways we don't yet understand.

When people talk about computers, they fall into two groups: the true believers -- you know, technology will save us, new human beings are being created. For me, as an ex-Communist, let me tell you, when people start talking about a new human being, I get really scared. Please! We're having trouble enough with this one! And on the other hand, other people think computers are horrible and are ruining our lives. I'm somewhere in the middle. I mean, we can't live without this any more. Try to imagine modern banking. Try to imagine your life in the developed world without computers. Not possible.



There's a lot of romanticism, in places like Wired magazine, about digital technology evolving its own messy, chaotic systems. In "Close to the Machine," you lean more to the view that computer systems are rigid and pristine.

From the standpoint of what one engineer can handle, yes. You can't program outside of the box, and things that happen that were not anticipated by the programmer are called design flaws or bugs. A bug is something the programmer was supposed to handle but didn't; a design flaw is something the programmer never even thought about.

So it's a very beautifully structured world. In computing, when something works, engineers talk about elegant software. So in one sense we're talking about something very structured and reductive. And in another sense we're not: we're talking about elegance, a notion of beauty.

What makes a piece of software code elegant?

I'll try to speak by analogy. Physicists right now are not happy about their model of the world because it seems too complicated, there are too many exceptions. Part of the notion of elegance is that it's compact. And that out of something very simple a great deal of complexity can grow -- that's why the notion of fractals is very appealing. You take a very, very simple idea and it enables tremendous complexity to happen.
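A minimal sketch of that notion in C++ (the rendering parameters here are arbitrary, chosen only for illustration): the Mandelbrot set, where the one-line rule z = z*z + c, iterated over the complex plane, grows endless complexity.

    // One line of arithmetic, iterated, and tremendous complexity comes out of it.
    #include <complex>
    #include <cstdio>

    int main() {
        for (int row = 0; row < 22; ++row) {            // arbitrary text-mode "canvas"
            for (int col = 0; col < 78; ++col) {
                std::complex<double> c(-2.0 + col * 0.0385, -1.1 + row * 0.1);
                std::complex<double> z = 0.0;
                int i = 0;
                while (std::abs(z) < 2.0 && i < 50) {   // the entire rule of the game
                    z = z * z + c;
                    ++i;
                }
                std::putchar(i == 50 ? '#' : ' ');      // bounded: inside the set
            }
            std::putchar('\n');
        }
        return 0;
    }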

So from the standpoint of a small group of engineers, you're striving for something that's structured and lovely in its structuredness. I don't want to make too much of this, because with most engineers there's a great deal of ego, you want to write the most lines of code, more than anybody else, there's a kind of macho.

Yet the more elegant program does the same thing in fewer lines.

When you're around really serious professional programmers, this code jockey stuff really falls away, and there is a recognition that the best programmers spend a lot of time thinking first, and working out the algorithms on paper or in their heads, at a white board, walking. You dream about it, you work it out -- you don't just sit there and pump out code. I've worked with a lot of people who pumped out code, and it's frightening. Two weeks later, you ask them about it, and it's like it never happened to them.

So the motive of a true program is a certain compact beauty and elegance and structuredness. But the reality of programming is that programs get old and they accumulate code over the years -- that's the only word I can use to describe it, they accumulate modifications. So old programs, after they've been in use 10 or 15 years, no one person understands them. And there is a kind of madness in dealing with this.

And with the new systems we're creating, even the ones that are running now, there's a tremendous amount of complexity. Right now, if you talk to people who try to run real-world systems, it is a struggle against entropy. They're always coming apart. And people don't really talk about that part much. The real-world experience of system managers is a kind of permanent state of emergency. Whereas programmers are kind of detached for a time and go into this floating space, networking people live in this perpetual now.



It's the world of pagers.

They've got things that buzz them. They're ready to be gone in a minute. To try to keep the network running is very much to be at the edge of that entropy where things really don't want to keep running. There's nothing smooth, there's nothing elegant. It's ugly, full of patches, let's try this, let's try that -- very tinker-y.



That's such a different picture from the rosy vision of the Net as this indestructible, perfectly designed organism.

One of the things I'm really glad about, with the success of the Web, is that more people are now being exposed to the true reality of working on a network. Things get slowed up. It doesn't answer you; it's down. How often can't you get to your e-mail?



I couldn't send e-mail to you to tell you I'd be late for this interview.

Some little server out there, your SMTP server, it's busy, it's gone, it's not happy. Exactly. And this is a permanent state of affairs. I don't see anything sinister or horrible about this, but it is more and more complex.

The main reason I wrote "Close to the Machine" was not just to talk about my life. Of course, everyone just wants to talk about themselves. But I have this feeling that embedded in this technology is an implicit way of life. And that we programmers who are creating it are embedding our way of being in it.

Look at the obvious: groupware. That doesn't mean software that helps people get together and have a meeting. It means, help people have access to each other when they're far apart, so they don't have to get into a room, so they don't have to have a meeting. They don't have to speak directly. That is a programmer's idea of heaven. You can have a machine interface that takes care of human interaction. And a defined interface. Programmers like defined interfaces between things. Software, by its nature, creates defined interfaces. It homogenizes, of necessity. And some of that's just plain damn useful. I'm happy when I go to the bank and stick a card in and they give me a wad of money. That's great.



For someone who is so immersed in the world of technology, you take an unusually critical view of the Net.

The Net is represented as this very democratic tool because everyone's a potential publisher. And to an extent that's true. However, it is not so easy to put a Web site up. Technically, it's getting more and more difficult, depending on what you want to do. It's not as if the average person wants to put up a Web site.

The main thing that I notice is the distinction between something like a word processor and a spreadsheet, and a Web browser. From the user's point of view there's a completely different existential stance. The spreadsheet and the word processor are pure context -- they just provide a structure in which human beings can express their knowledge. And it's presumed that the information resides in the person. These are tools that help you express, analyze and explore very complex things -- things that you are presumed already to know. The spreadsheet can be very simple, where essentially you're just typing things in and it helps you to format them in columns. Or it can be a tool for really fantastically complicated analysis. You can grow with it, your information grows with it -- it's the ideal human-computer tool.

With a Web browser, this situation is completely reversed. The Web is all content with very limited context. With the Web, all the information is on the system somewhere -- it's not even on your computer. It's out there -- it belongs to the system. More and more now even the programs don't reside with you -- that's the notion of the thin client, the NetPC and Java.

So the whole sphere of control has shifted from the human being, the individual sitting there trying to figure out something, to using stuff that the system owns and looking for things that are on the system. Everyone starts at the same level and pretty much stays there in a permanent state of babyhood. Click. Forward. Back. Unless you get into publishing, which is a huge leap that most people won't make.

But the Net is not one central system, it's a million systems. That creates a lot of the confusion -- but the advantage is that there's such a vast and diverse variety of material available.

My criticism, I suppose, is not of the Net but of the browser as an interface, as a human tool. I'm looking at it as a piece of software that I have to use. This is the only way I can interact with all this stuff. Some of what's out there may be good, some of it may not be. But I don't have the tools to analyze it. I can print it -- that's it. I can search for occurrences of a certain word. I can form a link to it. I can go forward and back. Am I missing anything here?



What do you mean by "analyzing" a Web page?

At this point, when you put something up on the Web, you don't have to say who put it up there, you don't have to say where it really lives, the author could be anyone. Which is supposedly its freedom. But as a user, I'm essentially in a position where everyone can represent themselves to me however they wish. I don't know who I'm talking to. I don't know if this is something that will lead to interesting conversation and worthwhile information -- or if it's a loony toon and a waste of my time.

I'm not a big control freak, I don't really know who would administer this or how it would be. But I would just like to see that a Web page had certain parameters that are required: where it is and whose it is. I would like to have some way in which I could have some notion of who I'm talking to. A digital signature on the other end.

That's feasible today.

Yeah, but the ethos of the Net is that everything should be free, everyone should do whatever they want -- you're creating this marketplace of ideas I can pick and choose in. But if I don't have the tools to pick and choose and I don't know who I'm talking to, essentially I'm walking into a room and I have blindfolds on.

The political ethos of the Net, its extreme libertarianism -- that's another thing that comes out of the programming social world. You know, whoever's the most technically able can do whatever they want. It's really not "everyone can do whatever they want"; it's that the more technically able you are, the more you should be able to do. And that's the way it is online to me. It is a kind of meritocracy in a very narrow sense.

How did you first become a programmer?

I've had one foot in the world that speaks English and one foot in the world of technology almost my whole life. I majored in English, minored in biology. The minute I got out of college, I worked as a videographer. A side note: Those were the days when we thought that giving everybody a Portapak -- a portable video machine -- would change the world. So I'd seen one technical revolution. And it made me very skeptical about the idea that everyone having a PC would change the world.

I was doing some animation stuff, and I saw people doing computer-aided animation and found it fascinating. I asked them how they did it, and they said, well, do you know Fortran? I wasn't thinking of becoming a professional programmer. This was 1971, 1972. The state of the art was primitive, and I didn't become a programmer at that point. I did photography for a living; I was a media technician. I moved out here [to the Bay Area], pumped gas, answered telephones for a living -- I mean, talk about professions that have been technologically made obsolete. I liked being a switchboard operator, it passed the time very nicely.

That's a different way of being close to a machine.

The machine is very simple -- it's the people who don't cooperate. Anyway, I did socially useful media for women's groups, women's radio programs, photography shows. I came of age doing media at a time when we thought it should be embedded in social action. And then I got more involved in political work, in lesbian politics and women's politics, and then eventually got tired of the splits -- people were always dividing. So I had some friends who had joined a Communist formation, and it was the time to sort of put up or shut up. I joined up. Of course, I did technical and media stuff for them -- I was responsible for their graphic-arts darkroom and laying out their newspaper. The inevitable part of my life is to be involved with machines.



In "Close to the Machine" you talk about the parallels between being a programmer and being a Communist.

It's a very mechanistic way of thinking, very intolerant of error -- and when things got confusing we tried to move closer to the machine, we tried to block out notions of human complexity. You tried to turn yourself into this machine. We were supposed to be proud of being cogs, and really suppressing and banishing all that messy, wet chemical life that we're a part of.

Now, only a cadre was supposed to go through this; the rest of humanity wasn't. Eventually, you realize: if the world is being remade by these people who've suppressed all these other parts of themselves, when they're done with all their decades and decades of struggle, will they remember how to be a complicated human being? That can happen to you if you do programming. So I quit.

I actually went through a very serious and very damaging expulsion. And I became a professional programmer because of that expulsion. I say in the book, I was promoted very rapidly because my employer was amazed at my ability to work hundreds of hours a week without complaining. But I had been rather damaged by that year. I spent months just sitting there working symbolic logic proofs. That was all I could do. It was the only way I could calm myself down and try to get my brain back.

Programming was both a symptom of how crazy I was and also a great solace. So then I got a job, and I was promoted in a minute, and they wanted to make me a product manager, I was a product manager for a minute, then I said no, I can't stand it, I want to program. I was a programmer, then I was put in charge of designing a new system, then they wanted to make me a manager again.

This has been the history of my life. Eventually I became a consultant, because I don't want to manage programmers.



To what extent are programming languages actually languages? Can you look at someone's code and tell what kind of person wrote this?

I can tell what kind of programmer they were, but not what type of person they were. Code is not expressive in that way. It doesn't allow for enough variation. It must conform to very strict rules. But programmers have styles, they definitely have styles. Some people write very compact code. Compact and elegant. Also, does one comment the code, and how generous are those comments? You can get a sense of someone's generosity. Are they writing the code with the knowledge that someone else has to come by here, or not? Good code is written with the idea that I'll be long gone and five years from now it, or some remnant, will still be running, and I don't want someone just hacking it to pieces. You sort of protect your code, by leaving clear comments.
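A sketch of that kind of generosity, in C++ (the ledger and all its names are invented for illustration): comments written for whoever comes by five years on, saying why, not just what.

    // Recompute the balance from the ledger every time rather than caching it.
    // A cached running total went stale in subtle ways when entries were
    // backdated, and correctness matters more than speed here. If this ever
    // becomes a bottleneck, memoize per-day subtotals; don't revive the cache.
    #include <vector>

    struct Entry { double amount; };   // one ledger line (invented for the example)

    double Balance(const std::vector<Entry>& ledger) {
        double total = 0.0;
        for (const Entry& e : ledger)
            total += e.amount;
        return total;
    }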

Another rule of thumb is that all programmers hate whoever came before them. You can't help it. There's a real distinction in programming between new development people and people who work on other people's code. I had this space of five years where I did only new development for systems that had no users. Programming heaven! But very few programmers get that. It's a privilege. The places you have to express yourself are in the algorithm design, if you're in new development -- and that is an art. But not many people are programming at that stage.

So if you ask me, it's not a language. We can use English to invent poetry, to try to express things that are very hard to express. In programming you really can't. Finally, a computer program has only one meaning: what it does. It isn't a text for an academic to read. Its entire meaning is its function.


SALON | Oct. 9, 1997

The dumbing-down of programming



PART ONE:
REBELLING AGAINST MICROSOFT, "MY COMPUTER" AND EASY-TO-USE WIZARDS, AN ENGINEER REDISCOVERS THE JOYS OF DIFFICULT COMPUTING.

BY ELLEN ULLMAN
Last month I committed an act of technical rebellion: I bought one operating system instead of another. On the surface, this may not seem like much, since an operating system is something that can seem inevitable. It's there when you get your machine, some software from Microsoft, an ur-condition that can be upgraded but not undone. Yet the world is filled with operating systems, it turns out. And since I've always felt that a computer system is a significant statement about our relationship to the world -- how we organize our understanding of it, how we want to interact with what we know, how we wish to project the whole notion of intelligence -- I suddenly did not feel like giving in to the inevitable.

My intention had been to buy an upgrade to Windows NT Server, which was a completely sensible thing for me to be doing. A nice, clean, up-to-date system for an extra machine was the idea, somewhere to install my clients' software; a reasonable, professional choice in a world where Microsoft platforms are everywhere. But somehow I left the store carrying a boxed copy of Slackware Linux. Linux: home-brewed, hobbyist, group-hacked. A UNIX-like operating system created in 1991 by Linus Torvalds, then passed around from hand to hand like so much anti-Soviet samizdat. Noncommercial, sold on the cheap mainly for the cost of the documentation, impracticable except perhaps for the thrill of actually looking at the source code, and utterly useless to my life as a software engineering consultant.

But buying Linux was no mistake. For the mere act of installing the system -- stripping down the machine to its components, then rebuilding its capabilities one by one -- led me to think about what has happened to the profession of programming, and to consider how the notion of technical expertise has changed. I began to wonder about the wages, both personal and social, of spending so much time with a machine that has slowly absorbed into itself as many complications as possible, so as to present us with a façade that says everything can and should be "easy."

* * *


I began by ridding my system of Microsoft. I came of technical age with UNIX, where I learned with power-greedy pleasure that you could kill a system right out from under yourself with a single command. It's almost the first thing anyone teaches you: Run as the root user from the root directory, type in rm -rf *, and, at the stroke of the ENTER key, gone are all the files and directories. Recursively, each directory deleting itself once its files have been deleted, right down to the very directory from which you entered the command: the snake swallowing its tail. Just the knowledge that one might do such great destruction is heady. It is the technical equivalent of suicide, yet UNIX lets you do it anyhow. UNIX always presumes you know what you're doing. You're the human being, after all, and it is a mere operating system. Maybe you want to kill off your system.
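The same recursion, sketched in modern C++ (std::filesystem, C++17, long after the era of this essay; the target path is hypothetical): remove_all deletes a directory's contents depth-first, then the directory itself -- exactly the shape of rm -rf.

    #include <cstdint>
    #include <filesystem>
    #include <iostream>

    int main() {
        std::error_code ec;
        // Depth-first: files go first, then each directory once it is empty.
        std::uintmax_t n = std::filesystem::remove_all("/tmp/sandbox", ec);
        if (ec)
            std::cerr << "failed: " << ec.message() << '\n';
        else
            std::cout << "removed " << n << " files and directories\n";
        return 0;
    }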

But Microsoft was determined to protect me from myself. Consumer-oriented, idiot-proofed, covered by its pretty skin of icons and dialog boxes, Windows refused to let me harm it. I had long ago lost my original start-up disk, the system was too fritzed to make a new one and now it turned away my subterfuges of DOS installation diskette, boot disks from other machines, later versions of utilities. Can't reformat active drive. Wrong version detected. Setup designed for systems without an operating system; operating system detected; upgrade version required. A cascade of error messages, warnings, beeps; a sort of sound and light show -- the Wizard of Oz lighting spectacular fireworks to keep me from flinging back the curtain to see the short fat bald man.

For Microsoft's self-protective skin is really only a show, a lure to the determined engineer, a challenge to see if you're clever enough to rip the covers off. The more it resisted me, the more I knew I would enjoy the pleasure of deleting it.

Two hours later, I was stripping down the system. Layer by layer it fell away. Off came Windows NT 3.51; off came a wayward co-installation of Windows 95 where it overlaid DOS. I said goodbye to video and sound; goodbye wallpaper; goodbye fonts and colors and styles; goodbye windows and icons and menus and buttons and dialogs. All the lovely graphical skins turned to so much bitwise detritus. It had the feel of Keir Dullea turning off the keys to HAL's memory core in the film "2001," each keyturn removing a "higher" function, HAL's voice all the while descending into mawkish, babyish pleading. Except that I had the sense that I was performing an exactly opposite process: I was making my system not dumber but smarter. For now everything on the system would be something put there by me, and in the end the system itself would be cleaner, clearer, more knowable -- everything I associate with the idea of "intelligent."

What I had now was a bare machine, just the hardware and its built-in logic. No more Microsoft muddle of operating systems. It was like hosing down your car after washing it: the same feeling of virtuous exertion, the pleasure of the sparkling clean machine you've just rubbed all over. Yours. Known down to the crevices. Then, just to see what would happen, I turned on the computer. It powered up as usual, gave two long beeps, then put up a message in large letters on the screen:

NO ROM BASIC

What? Had I somehow killed off my read-only memory? It doesn't matter that you tell yourself you're an engineer and game for whatever happens. There is still a moment of panic when things seem to go horribly wrong. I stared at the message for a while, then calmed down: It had to be related to not having an operating system. What else did I think could happen but something weird?

But what something weird was this exactly? I searched the Net, found hundreds of HOW-TO FAQs about installing Linux, thousands about uninstalling operating systems -- endless pages of obscure factoids, strange procedures, good and bad advice. I followed trails of links that led to interesting bits of information, currently useless to me. Long trails that ended in dead ends, missing pages, junk. Then, sometime about 1 in the morning, in a FAQ about Enhanced IDE, was the answer:

8.1. Why do I get NO ROM BASIC, SYSTEM HALTED?

This should get a prize for the PC compatible's most obscure error message. It usually means you haven't made the primary partition bootable ...

The earliest true-blue PCs had a BASIC interpreter built in, just like many other home computers those days. Even today, the Master Boot Record (MBR) code on your hard disk jumps to the BASIC ROM if it doesn't find any active partitions. Needless to say, there's no such thing as a BASIC ROM in today's compatibles....

I had not seen a PC with built-in BASIC in some 16 years, yet here it still was, vestigial trace of the interpreter, something still remembering a time when the machine could be used to interpret and execute my entries as lines in a BASIC program. The least and smallest thing the machine could do in the absence of all else, its one last imperative: No operating system! Look for BASIC! It was like happening upon some primitive survival response, a low-level bit of hard wiring, like the mysterious built-in knowledge that lets a blind little mouseling, newborn and helpless, find its way to the teat.

This discovery of the trace of BASIC was somehow thrilling -- an ancient pot shard found by mistake in the rubble of an excavation. Now I returned to the FAQs, lost myself in digging, passed another hour in a delirium of trivia. Hex loading addresses for devices. Mysteries of the BIOS old and new. Motherboards certified by the company that had written my BIOS and motherboards that were not. I learned that my motherboard was an orphan. It was made by a Taiwanese company no longer in business; its BIOS had been left to languish, supported by no one. And one moment after midnight on Dec. 31, 1999, it would reset my system clock to ... 1980? What? Why 1980 and not zero? Then I remembered: Jan. 1, 1980, is the DOS epoch, the moment where the IBM PC's calendar begins counting. 1980 was Year One in desktop time.

The computer was suddenly revealed as palimpsest. The machine that is everywhere hailed as the very incarnation of the new had revealed itself to be not so new after all, but a series of skins, layer on layer, winding around the messy, evolving idea of the computing machine. Under Windows was DOS; under DOS, BASIC; and under them both the date of its origins recorded like a birth memory. Here was the very opposite of the authoritative, all-knowing system with its pretty screenful of icons. Here was the antidote to Microsoft's many protections. The mere impulse toward Linux had led me into an act of desktop archaeology. And down under all those piles of stuff, the secret was written: We build our computers the way we build our cities -- over time, without a plan, on top of ruins.

- - - - - - - - - - - -
"My Computer" -- the infantilizing baby names of the Windows world

My Computer. This is the face offered to the world by the other machines in the office. My Computer. I've always hated this icon -- its insulting, infantilizing tone. Even if you change the name, the damage is done: It's how you've been encouraged to think of the system. My Computer. My Documents. Baby names. My world, mine, mine, mine. Network Neighborhood, just like Mister Rogers'.

On one side of me was the Linux machine, which I'd managed to get booted from a floppy. It sat there at a login prompt, plain characters on a black-and-white screen. On the other side was a Windows NT system, colored little icons on a soothing green background, a screenful of programming tools: Microsoft Visual C++, Symantec Visual Cafe, Symantec Visual Page, Totally Hip WebPaint, Sybase PowerBuilder, Microsoft Access, Microsoft Visual Basic -- tools for everything from ad hoc Web-page design to corporate development to system engineering. NT is my development platform, the place where I'm supposed to write serious code. But sitting between my two machines -- baby-faced NT and no-nonsense Linux -- I couldn't help thinking about all the layers I had just peeled off the Linux box, and I began to wonder what the user-friendly NT system was protecting me from.

Developers get the benefit of visual layout without the hassle of having to remember HTML code.
-- Reviewers' guide to Microsoft J++

Templates, Wizards and JavaBeans Libraries Make Development Fast


-- Box for Symantec's Visual Cafe for Java

Simplify application and applet development with numerous wizards


-- Ad for Borland's JBuilder in the Programmer's Paradise catalog

Thanks to IntelliSense, the Table Wizard designs the structure of your business and personal databases for you.


-- Box for Microsoft Access

Developers will benefit by being able to create DHTML components without having to manually code, or even learn, the markup language.


-- Review of J++ 6.0 in PC Week, March 16, 1998.

Has custom controls for all the major Internet protocols (Windows Sockets, FTP, Telnet, Firewall, Socks 5.0, SMTP, POP, MIME, NNTP, Rcommands, HTTP, etc.). And you know what? You really don't need to understand any of them to include the functionality they offer in your program.


-- Ad for Visual Internet Toolkit from the Distinct Corp. in the Components Paradise catalog

My programming tools were full of wizards. Little dialog boxes waiting for me to click "Next" and "Next" and "Finish." Click and drag and shazzam! -- thousands of lines of working code. No need to get into the "hassle" of remembering the language. No need to even learn it. It is a powerful siren-song lure: You can make your program do all these wonderful and complicated things, and you don't really need to understand.

In six clicks of a wizard, the Microsoft C++ AppWizard steps me through the creation of an application skeleton. The application will have a multidocument interface, database support from SQL Server, OLE compound document support as both server and container, docking toolbars, a status line, printer and print-preview dialogs, 3-D controls, messaging API and Windows sockets support; and, when my clicks are complete, it will immediately compile, build and execute. Up pops a parent and child window, already furnished with window controls, default menus, icons and dialogs for printing, finding, cutting and pasting, saving and so forth. The process takes three minutes.
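The heart of what the wizard emits, reduced to a sketch (the class names are invented; the real output spreads this across dozens of files, and it builds only under Visual C++ with the MFC libraries): an application object whose InitInstance makes a frame window, shows it and declares success.

    #include <afxwin.h>   // MFC core: CWinApp, CFrameWnd

    class CSketchFrame : public CFrameWnd {
    public:
        CSketchFrame() {
            // Create the main window; MFC fills in default styles and sizing.
            Create(NULL, _T("Generated Skeleton"));
        }
    };

    class CSketchApp : public CWinApp {
    public:
        virtual BOOL InitInstance() {
            // The boilerplate every generated application begins with.
            m_pMainWnd = new CSketchFrame;
            m_pMainWnd->ShowWindow(m_nCmdShow);
            m_pMainWnd->UpdateWindow();
            return TRUE;
        }
    };

    CSketchApp theApp;   // MFC's global application object; no main() in sight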

Of course, I could look at the code that the Wizard has generated. Of course, I could read carefully through the 36 generated C++ class definitions. Ideally, I would not only read the code but also understand all the calls on the operating system and all the references to the library of standard Windows objects called the Microsoft Foundation Classes. Most of all, I would study them until I knew in great detail the complexities of servers and containers, OLE objects, interaction with relational databases, connections to a remote data source and the intricacies of messaging -- all the functionality AppWizard has just slurped into my program, none of it trivial.

But everything in the environment urges me not to. What the tool encourages me to do now is find the TODO comments in the generated code, then do a little filling in -- constructors and initializations. Then I am to start clicking and dragging controls onto the generated windows -- all the prefabricated text boxes and list boxes and combo boxes and whatnot. Then I will write a little code that hangs off each control.
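What that fill-in step looks like, sketched (every name here -- CSketchDlg, IDD_SKETCH, IDC_LOOKUP, DoLookup -- is invented, and the dialog template lives in the wizard's resource script, so the fragment compiles only inside such a generated project):

    class CSketchDlg : public CDialog {
    public:
        CSketchDlg() : CDialog(IDD_SKETCH) {}
        afx_msg void OnLookup();        // handler hung off one prefab button
        DECLARE_MESSAGE_MAP()
    private:
        CString m_query, m_results;     // bound to the prefab text boxes
    };

    BEGIN_MESSAGE_MAP(CSketchDlg, CDialog)
        ON_BN_CLICKED(IDC_LOOKUP, OnLookup)   // the wizard's wiring
    END_MESSAGE_MAP()

    void CSketchDlg::OnLookup()
    {
        // TODO: Add your control notification handler code here
        UpdateData(TRUE);                 // pull values out of the controls
        m_results = DoLookup(m_query);    // the few lines that are actually mine
        UpdateData(FALSE);                // push the answer back into them
    }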

In this programming world, the writing of my code has moved away from being the central task to become a set of appendages to the entire Microsoft system structure. I'm a scrivener here, a filler-in of forms, a setter of properties. Why study all that other stuff, since it already works anyway? Since my deadline is pressing. Since the marketplace is not interested in programs that do not work well in the entire Microsoft structure, which AppWizard has so conveniently prebuilt for me.

This not-knowing is a seduction. I feel myself drifting up, away from the core of what I've known programming to be: text that talks to the system and its other software, talk that depends on knowing the system as deeply as possible. These icons and wizards, these prebuilt components that look like little pictures, are obscuring the view that what lies under all these cascading windows is only text talking to machine, and underneath it all is something still looking for a BASIC interpreter. But the view the wizards offer is pleasant and easy. The temptation never to know what underlies that ease is overwhelming. It is like the relaxing passivity of television, the calming blankness when a theater goes dark: It is the sweet allure of using.

My programming tools have become like My Computer. The same impulse that went into the Windows 95 user interface -- the desire to encapsulate complexity behind a simplified set of visual representations, the desire to make me resist opening that capsule -- is now in the tools I use to write programs for the system. What started out as the annoying, cloying face of a consumer-oriented system for a naive user has somehow found its way into C++. Dumbing-down is trickling down. Not content with infantilizing the end user, the purveyors of point-and-click seem determined to infantilize the programmer as well.

But what if you're an experienced engineer? What if you've already learned the technology contained in the tool, and you're ready to stop worrying about it? Maybe letting the wizard do the work isn't a loss of knowledge but simply a form of storage: the tool as convenient information repository.

(To be continued.)
SALON | May 12, 1998

- - - - - - - - - - - -

Go on to Part Two of "The Dumbing Down of Programming," where Ellen Ullman explores why wizards aren't merely a helpful convenience -- and how, when programmers come to rely too much upon "easy" tools, knowledge can disappear into code.

- - - - - - - - - - - -


Ellen Ullman is a software engineer. She is the author of "Close to the Machine: Technophilia and its Discontents."

The dumbing-down of programming



PART TWO:
RETURNING TO THE SOURCE. ONCE KNOWLEDGE DISAPPEARS INTO CODE, HOW DO WE RETRIEVE IT?

BY ELLEN ULLMAN



I used to pass by a large computer system with the feeling that it represented the summed-up knowledge of human beings. It reassured me to think of all those programs as a kind of library in which our understanding of the world was recorded in intricate and exquisite detail. I managed to hold onto this comforting belief even in the face of 20 years in the programming business, where I learned from the beginning what a hard time we programmers have in maintaining our own code, let alone understanding programs written and modified over years by untold numbers of other programmers. Programmers come and go; the core group that once understood the issues has written its code and moved on; new programmers have come, left their bit of understanding in the code and moved on in turn. Eventually, no one individual or group knows the full range of the problem behind the program, the solutions we chose, the ones we rejected and why.

Over time, the only representation of the original knowledge becomes the code itself, which by now is something we can run but not exactly understand. It has become a process, something we can operate but no longer rethink deeply. Even if you have the source code in front of you, there are limits to what a human reader can absorb from thousands of lines of text designed primarily to function, not to convey meaning. When knowledge passes into code, it changes state; like water turned to ice, it becomes a new thing, with new properties. We use it; but in a human sense we no longer know it.

The Year 2000 problem is an example on a vast scale of knowledge disappearing into code. And the soon-to-fail national air-traffic control system is but one stark instance of how computerized expertise can be lost. In March, the New York Times reported that IBM had told the Federal Aviation Administration that, come the millennium, the existing system would stop functioning reliably. IBM's advice was to completely replace the system because, they said, there was "no one left who understands the inner workings of the host computer."

No one left who understands. Air-traffic control systems, bookkeeping, drafting, circuit design, spelling, differential equations, assembly lines, ordering systems, network object communications, rocket launchers, atom-bomb silos, electric generators, operating systems, fuel injectors, CAT scans, air conditioners -- an exploding list of subjects, objects and processes rushing into code, which eventually will be left running without anyone left who understands them. A world full of things like mainframe computers, which we can use or throw away, with little choice in between. A world floating atop a sea of programs we've come to rely on but no longer truly understand or control. Code and forget; code and forget: programming as a collective exercise in incremental forgetting.

* * *


Every visual programming tool, every wizard, says to the programmer: No need for you to know this. What reassures the programmer -- what lulls an otherwise intelligent, knowledge-seeking individual into giving up the desire to know -- is the suggestion that the wizard is only taking care of things that are repetitive or boring. These are only tedious and mundane tasks, says the wizard, from which I will free you for better things. Why reinvent the wheel? Why should anyone ever again write code to put up a window or a menu? Use me and you will be more productive.

Productivity has always been the justification for the prepackaging of programming knowledge. But it is worth asking about the sort of productivity gains that come from the simplifications of click-and-drag. I once worked on a project in which a software product originally written for UNIX was being redesigned and implemented on Windows NT. Most of the programming team consisted of programmers who had great facility with Windows, Microsoft Visual C++ and the Foundation Classes. In no time at all, it seemed, they had generated many screenfuls of windows and toolbars and dialogs, all with connections to networks and data sources, thousands and thousands of lines of code. But when the inevitable difficulties of debugging came, they seemed at sea. In the face of the usual weird and unexplainable outcomes, they stood a bit agog. It was left to the UNIX-trained programmers to fix things. The UNIX team members were accustomed to having to know. Their view of programming as language-as-text gave them the patience to look slowly through the code. In the end, the overall "productivity" of the system, the fact that it came into being at all, was the handiwork not of tools that sought to make programming seem easy, but of engineers who had no fear of "hard."

And as prebuilt components accomplish larger and larger tasks, it is no longer only a question of putting up a window or a text box, but of an entire technical viewpoint encapsulated in a tool or component. No matter if, like Microsoft's definition of a software object, that viewpoint is haphazardly designed, verbose, buggy. The tool makes it look clean; the wizard hides bad engineering as well as complexity.

In the pretty, visual programming world, both the vendor and programmer can get lazy. The vendor doesn't have to work as hard at producing and committing itself to well-designed programming interfaces. And the programmer can stop thinking about the fundamentals of the system. We programmers can lie back and inherit the vendor's assumptions. We accept the structure of the universe implicit in the tool. We become dependent on the vendor. We let knowledge about difficulty and complexity come to reside not in us, but in the program we use to write programs.

No wizard can possibly banish all the difficulties, of course. Programming is still a tinkery art. The technical environment has become very complex -- we expect bits of programs running anywhere to communicate with bits of programs running anywhere else -- and it is impossible for any one individual to have deep and detailed knowledge about every niche. So a certain degree of specialization has always been needed. A certain amount of complexity-hiding is useful and inevitable.

Yet, when we allow complexity to be hidden and handled for us, we should at least notice what we're giving up. We risk becoming users of components, handlers of black boxes that don't open or don't seem worth opening. We risk becoming like auto mechanics: people who can't really fix things, who can only swap components. It's possible to let technology absorb what we know and then re-express it in intricate mechanisms -- parts and circuit boards and software objects -- mechanisms we can use but do not understand in crucial ways. This not-knowing is fine while everything works as we expected. But when something breaks or goes wrong or needs fundamental change, what will we do but stand a bit helpless in the face of our own creations?

- - - - - - - - - - - -
An epiphany on unscrewing the computer box: Why engineers flock to Linux


