Monday, May 24, 2010

Whatever Happened to Editors at Publishing Houses?

Doesn't anybody edit books before they're published anymore?

I've been spending a few weeks polishing up my technical skills, and so I've been reading a series of books on a particular subject. One of the books is A Baker's Dozen: Real Analog Solutions for Digital Designers. The book is a beautiful compilation of knowledge, wisdom and tips on the subject of analog circuit design. It's written by one of today's leading experts on the subject: Bonnie Baker, currently vice president or director of something important at Microchip. Baker is great, and I have a soft spot in my engineering heart for Microchip, and so I'm enjoying reading the book and digesting its contents.

There's just one problem with the book. It is very poorly edited.

I'm only into the third chapter, but it seems like I've had to add a handwritten correction to every other page of the book. Only one correction so far is a technical error (she got the two's complement notation for -2 wrong). All the rest are errors involving:

  • spelling errors
  • typographical errors
  • homophones and homonyms
  • word usage and sentence structure problems
  • grammar mistakes
  • awkward phrasing or misuse of common idioms
  • subject-verb disagreement ("is/are", for example)
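
(For the curious, the two's-complement slip is easy to check for yourself: to negate a number, invert the bits of its magnitude and add one. Here's a quick sketch in Python -- my own illustration, not anything from the book -- assuming an 8-bit word:

```python
def twos_complement(value, bits=8):
    """Return the unsigned bit pattern representing `value`
    in two's complement, using the given word width."""
    # Masking with (2**bits - 1) keeps only the low `bits` bits,
    # which is exactly the two's-complement encoding of `value`.
    return value & ((1 << bits) - 1)

pattern = twos_complement(-2)        # -2 in 8 bits
print(format(pattern, "08b"))        # 11111110
print(hex(pattern))                  # 0xfe
```

Invert 00000010, get 11111101; add one, get 11111110. Not hard -- which is why an editor, or at least a technical reviewer, should have caught it.)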

This is not the only technical book I've encountered with an editing problem. It seems like most, if not all, of today's technical books suffer from it. Sometimes they're total disasters, like those written by Myke Predko (don't take my word for it; go read the reviews), but more often they're books like this one, where the technical content is (mostly) accurate, but the writing appears ... um ... "sloppy" isn't fair to the writer ... more like poorly edited.

It simply looks as though the editor wasn't doing his/her job. If I had to guess at what happened, I'd say that the publisher assigned a non-technical editor to Baker's book, and the editor was so bamboozled by the technical content that she completely forgot about her own ability to manage the basic mechanics of writing.

It's also possible that Baker told her publisher she wanted to edit the book herself. In a world of word processors and automated spell checkers, I feel that this is becoming increasingly common, especially among engineers who think that with the right manual or tutorial and a week to study it, they can become an expert on any subject. (This is the "any engineer can become a ..." syndrome. You read it here first, folks.)

But folkloric wisdom points out the problems with being your own expert: The taunt "Physician, heal thyself" shows up in the Bible, and a more modern proverb asserts that "Any lawyer who chooses to represent himself has a fool for a client." Authors shouldn't be their own editors.

Or maybe Baker's publisher, tight on cash (or just plain tight), had laid off too many editors and assigned this book and five others to a junior editor or intern with a one-week deadline. The junior editor had to let something slide, and this was it.

Or maybe Baker's editor was a raging incompetent, whose own literary skills are not very far above those required for modern high-school newspapers and yearbooks.

Good editors, like good schoolteachers, are worth their weight in gold. They're an overworked and underpaid bunch, and they may feel like we expect too much of them, but a good editor can make the difference between a "good read" and a waste of paper. My editors have always done a fantastic job, making a huge difference in the finished product.

You can't just jack into a port on the matrix and download the skills necessary to become an editor. It doesn't work like that. Every author should have a competent professional editor. Lacking a competent professional editor, every technical author should have an author whose writing they admire, and who is not a member of their technical profession, to go through their manuscript and fix their writing.

(If you want a more mainstream example of the difference a good editor can make, then look at the quality of language in Tom Clancy's The Hunt for Red October and compare it with the quality of language in the Tom Clancy's Op Center books. Clancy didn't write the Op Center books; he just lent them his name. He should have lent them his editors. The Op Center series is overburdened with horrendous language errors, which repeatedly bring the story to a screeching halt and ultimately cause the reader to throw down the book in frustration.)

I would further suggest that every technical author should ask a colleague whom they view as a competitor, or with whom they share a mild animosity, to review their manuscript for technical errors. Who is better qualified to find errors in your work than someone who doesn't like you in the first place? Asking esteemed colleagues and best friends to check your work is great, but your greatest asset as an author will be someone who will root like a truffle hound for your mistakes.

It helps to have a good example. In this exercise in personal improvement, I'm saving the best book for last. Horowitz and Hill's The Art of Electronics is not only a technical treasure and a bible among electrical engineers, but it's also beautifully written, an example of how well English prose can be turned to instruct in even the most technical subjects. It's always worth reading and rereading, not just for the technical education, but for the exquisite turns of phrase.

Thursday, May 6, 2010

What was once a "program" is now an "app"

Observe, with me, the interesting progression of terminology in the software industry.

(For those who are not computerly inclined, here's a quick definition of basic terms. Way back in the dawn of digital time, computers were composed of hardware and software. Hardware is all the tangible, physical parts of a computer: the metal case, the printed circuit board and all the little components on it, even the wires that connect everything together. Even soft plastic parts are still tangible, so we still call them "hard" ware.

Software is all the intangible stuff that makes computers run -- the programs. They're not tangible or physical, so they can't really be called "hard" ware. But since they're as essential to the computer's functioning as the hardware, they had to be called something -- hence, "soft" ware.)

Now consider all the synonyms we've accumulated for "software":
  • program
  • software program
  • firmware
  • code
  • suite
  • application
  • app

Back when computers were brand new, they used to run programs. A program was a series of instructions that the computer followed to complete a task. The earliest computers had to be programmed, or fed instructions, by flipping a series of switches on the front panel. (After flipping the switches hundreds of times just to enter one program, and still getting it wrong, engineers decided they needed to invent a different way to program a computer -- in a hurry. That's how we ended up with mass storage devices.)

Even in the early days of home computers, people called them programs. One program fit on one floppy disk, and to use the computer, users ran the program.

As computing tasks got more complicated, programmers started including other things on the disk: data files, libraries, configuration files, and additional programs to be run automatically by the main program. Since users weren't just running a single program anymore, programs began to be marketed as software -- an interesting, and not at all inappropriate, reuse of the generic term.

I still crack up when I see a marketing department or a magazine writer refer to these things as software programs. It's a hilarious redundancy.

Some software is critical enough to the computer's operation that it's stored in a memory chip inside the computer. Because this software is stored in hardware, it's neither hard nor soft, so the pros call it firmware.

We've come a long way from the days of flipping switches. Today you can use one programming tool on one machine to write programs in many different programming languages, for many different systems. This has led programmers to start referring to the stuff they produce as code. There's source code, object code, assembly code, byte code, high-level code, low-level code, compiled code, interpreted code, Mac code, Linux code, Windows code, ... and it's all just programs and programming.

Somewhere along the way, software vendors began to combine separate programs into agglomerations that were (supposed to be) more powerful than the individual programs. These were referred to as suites. For a while, everyone was selling an office suite. Adobe has a lock on the best graphics suite in the world. Other companies sell animation suites, CAD suites, audio/video production suites, gaming suites, and in a beautiful recursion, software development suites.

Also somewhere along the way, someone said, "We're not selling programs, we're selling applications." After that, whatever we had been calling these things became known as an application -- still just a collection of programs and related software items.

With the introduction of very smart handheld devices such as the iPhone, someone said, "Since these devices are small, the term for the software they run should also be small." So an application became an app, as in "There's an app for that."

Ah, but the term app got away from its creators. First it spread to non-iPhone handhelds, and then to non-handheld hardware, and finally to browsers. Now any software that runs on any kind of computing device is an "app".

But when you distill it down to its essential parts, it's still just a program.