I said in the first part of this series that one of the books I wanted to talk about was written in 1974. My colleague Dennis Schafroth guessed that it might be Kernighan and Ritchie’s classic, The C Programming Language, but the first edition of that book actually came out in 1978. But I awarded half a point anyway, because the book I had in mind (A) was co-written by Brian W. “Water buffalo” Kernighan, and (B) had a second edition in 1978, the same year as K&R. It is Kernighan and Plauger’s The Elements of Programming Style (amazon.com, amazon.co.uk).
(This isn’t a cover image from Amazon, it’s a scan of my personal copy, because I wanted you to see how well-thumbed it is.)
What can a book from 1974 possibly have to teach us 36 years later? Especially when all its example code is in FORTRAN(!) and PL/1(!!)? A lot, as it turns out. For one thing, it contains (on page 10) perhaps the single wisest thing that has ever been said about programming:
“Everyone knows that debugging is twice as hard as writing a program in the first place. So if you’re as clever as you can be when you write it, how will you ever debug it?”
It’s a short book: at only 168 smallish pages, it’s less than a fifth as long as the bloated 938-page tome I have about XSLT, and about one third as long as Fowler et al.’s Refactoring, which is perhaps its spiritual heir. It’s arranged as eight chapters: Introduction, Expressions, Control Structure, Program Structure, Input and Output, Common Blunders, Efficiency and Instrumentation, and Documentation; each chapter demonstrates ten or so rules (which I will list below).
So far, so didactic: anyone can guess from this that EoPS is a useful book; what’s not so obvious is that it’s very funny. I’ll let K&P explain their approach (from the preface to the First Edition on page xi):
This book is a study of a large number of “real” programs, each of which provides one or more lessons in style. We discuss the shortcomings of each example, rewrite it in a better way, then draw a general rule from the specific case. […] all of the programs we use are taken from programming textbooks. Thus we do not set up artificial problems to illustrate our points — we use finished products, written and published by experienced programmers.
This in itself is pretty hilarious: EoPS consists entirely of the mistakes made by people confident enough to publish their own programs as examples of good style. What lifts it to the realm of laugh-out-loudfulness is the very dry style: K&P graciously abstain from going to town on the deficiencies of the programs they study, but their minutely detailed dissections speak volumes, and seem (unless I am imagining it) to convey an undertone of profound disdain. For example, consider these observations on a program to calculate the area under a curve:
“With all the extraneous assignments removed, it is easier to see the underlying structure. It is also easy to see that the indentations reflect little of what is going on. But what is the purpose of the variable I? It is laboriously kept equal to J so that OUT can be called at the end of the last iteration. Clearly I is not needed, for J could be used for the test. But the test is not needed; OUT could be called just after the inner DO loop has terminated. But OUT need not be called at all, for its code could just as well appear in the one place it is invoked. The structure simplifies remarkably.”
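To give a flavour of the kind of transformation they perform, here is a made-up before-and-after of my own in C (the book’s example is in FORTRAN, and area_before/area_after are invented names, so treat this purely as an illustration of the pattern K&P describe):

```c
#include <stdio.h>

/* Hypothetical "before": the variable i is laboriously kept equal to j,
   and out() exists only to be called once, at the end of the last
   iteration (it stands in for the book's OUT routine). */
static double out(double area) { return area; }

static double area_before(double (*f)(double), double a, double b, int n)
{
    double h = (b - a) / n, area = 0.0;
    int i = 0;
    for (int j = 1; j <= n; j++) {
        area += h * (f(a + (j - 1) * h) + f(a + j * h)) / 2.0;
        i = j;                    /* kept equal to j for no good reason */
        if (i == n)
            area = out(area);     /* could just as well appear inline */
    }
    return area;
}

/* "After": the structure simplifies remarkably. */
static double area_after(double (*f)(double), double a, double b, int n)
{
    double h = (b - a) / n, area = 0.0;
    for (int j = 1; j <= n; j++)
        area += h * (f(a + (j - 1) * h) + f(a + j * h)) / 2.0;
    return area;
}

static double square(double x) { return x * x; }

int main(void)
{
    printf("%f %f\n", area_before(square, 0, 1, 1000),
                      area_after(square, 0, 1, 1000));
    return 0;
}
```

The point is not the numerical method; it is that once the redundant bookkeeping goes, the loop says exactly what it does.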
As another example, check out this section on commenting:
In other sections, fairly significant programs like a maze solver are taken apart, subjected to dispassionate but merciless criticism, and put together again shorter, clearer, more correct and more functional than before. In short, Kernighan and Plauger don’t just explain how to program well, they show us how it’s done.
I’m not going to claim that the book hasn’t aged. As you can see from the extract above, the typography looks very primitive (it was done on an early version of troff), and rules such as “Avoid the Fortran arithmetic IF” and “Initialize constants with DATA statements or INITIAL attributes; initialize variables with executable code” just don’t apply any more in the post-FORTRAN era. Another rule, “Write first in an easy-to-understand pseudo-language; then translate into whatever language you have to use”, is also not applicable now that languages like Python and Ruby can read and execute the equivalent of the old pseudo-code. This in itself demonstrates the value of one of K&P’s more enduring rules: “Let the machine do the dirty work”. Yes indeed: translating pseudo-code into executable code should not be left to humans.
But most of the rules are timeless, and remain as true and important today in 2010 as they were in 1974. The first rule after the introduction remains one of my favourites: “Say what you mean, simply and directly”. (You may not believe it, reading this blog, but I try to apply this to my prose writing as well as my programming.) Others that we should all try to live by: “Each module should do one thing well”; “Let the data structure the program”; “Make it right before you make it faster” and “Keep it right when you make it faster”.
So I keep coming back to EoPS (I am re-reading it as I write this) because it’s short, it’s easy reading, it’s funny, and much of its advice is timeless. In a way, you could say its age is even a plus-point, because it makes it obvious which of the rules are of their time and which are fundamental — whereas, for example, everyone knows that Design Patterns contains a mix of genuine insight and mere patches for Java’s lack of expressive power, but it’s not yet clear which patterns fall into which categories. Give it another twenty years, and we should be in a position to figure that out.
Appendix: “summary of rules” from EoPS
Abstracted from the appendix SUMMARY OF RULES in The Elements of Programming Style (Second Edition) by Brian W. Kernighan and P. J. Plauger, pub. McGraw-Hill, ISBN 0-07-034207-5.
This summary is designed to give a quick review of the points we covered in the book. Remember as you read the rules that they were presented in connection with one or more examples — go back and reread the pertinent section if a rule doesn’t call them to mind.
To paraphrase an observation in The Elements of Style, rules of programming style, like those of English, are sometimes broken, even by the best writers. When a rule is broken, however, you will usually find in the program some compensating merit, attained at the cost of the violation. Unless you’re certain of doing as well, you will probably do best to follow the rules.
INTRODUCTION
- Write clearly — don’t be too clever.
EXPRESSIONS
- Say what you mean, simply and directly.
- Use library functions.
- Avoid temporary variables.
- Write clearly — don’t sacrifice clarity for “efficiency”.
- Let the machine do the dirty work.
- Replace repetitive expressions by calls to a common function.
- Parenthesize to avoid ambiguity.
- Choose variable names that won’t be confused.
- Avoid the Fortran arithmetic IF.
- Avoid unnecessary branches.
- Use the good features of a language; avoid the bad ones.
- Don’t use conditional branches as a substitute for a logical expression.
- Use the “telephone test” for readability.
CONTROL STRUCTURE
- Use DO-END and indenting to delimit groups of statements.
- Use IF-ELSE to emphasize that only one of two actions is to be performed.
- Use DO and DO-WHILE to emphasize the presence of loops.
- Make your programs read from top to bottom.
- Use IF … ELSE IF … ELSE IF … ELSE … to implement multi-way branches.
- Use the fundamental control flow structures.
- Write first in an easy-to-understand pseudo-language; then translate into whatever language you have to use.
- Avoid THEN-IF and null ELSE.
- Avoid ELSE GOTO and ELSE RETURN.
- Follow each decision as closely as possible with its associated action.
- Use data arrays to avoid repetitive control sequences.
- Choose a data representation that makes your program simple.
- Don’t stop with your first draft.
PROGRAM STRUCTURE
- Modularize. Use subroutines.
- Make the coupling between modules visible.
- Each module should do one thing well.
- Make sure every module hides something.
- Let the data structure the program.
- Don’t patch bad code — rewrite it.
- Write and test a big program in small pieces.
- Use recursive procedures for recursively-defined data structures.
INPUT AND OUTPUT
- Test input for validity and plausibility.
- Make sure input cannot violate the limits of your program.
- Terminate input by end-of-file or marker, not by count.
- Identify bad input; recover if possible.
- Treat end of file conditions in a uniform manner.
- Make input easy to prepare and output self-explanatory.
- Use uniform input formats.
- Make input easy to proofread.
- Use free-form input when possible.
- Use self-identifying input. Allow defaults. Echo both on output.
- Localize input and output in subroutines.
COMMON BLUNDERS
- Make sure all variables are initialized before use.
- Don’t stop at one bug.
- Use debugging compilers.
- Initialize constants with DATA statements or INITIAL attributes; initialize variables with executable code.
- Watch out for off-by-one errors.
- Take care to branch the right way on equality.
- Avoid multiple exits from loops.
- Make sure your code “does nothing” gracefully.
- Test programs at their boundary values.
- Program defensively.
- 10.0 times 0.1 is hardly ever 1.0.
- Don’t compare floating point numbers just for equality.
EFFICIENCY AND INSTRUMENTATION
- Make it right before you make it faster.
- Keep it right when you make it faster.
- Make it clear before you make it faster.
- Don’t sacrifice clarity for small gains in “efficiency”.
- Let your compiler do the simple optimizations.
- Don’t strain to re-use code; reorganize instead.
- Make sure special cases are truly special.
- Keep it simple to make it faster.
- Don’t diddle code to make it faster — find a better algorithm.
- Instrument your programs. Measure before making “efficiency” changes.
DOCUMENTATION
- Make sure comments and code agree.
- Don’t just echo the code with comments — make every comment count.
- Don’t comment bad code — rewrite it.
- Use variable names that mean something.
- Use statement labels that mean something.
- Format a program to help the reader understand it.
- Indent to show the logical structure of your program.
- Document your data layouts.
- Don’t over-comment.
It truly is a fantastic book, the first one I read on software engineering that I remember by name. Highly recommended.
Hands down fantastic.
I want to go buy this book now.
Harold, I can’t tell you how pleased it makes me to read that! I sometimes think I am the only person who even remembers this book, let alone loves it.
Pingback: [Link] Programming Books, part 2: The Elements of Programming Style « jkwiens.com
What does Sushi have to do with this?
Ollie,
Sushi has to do with EVERYTHING.
Don’t have the book, but am curious what is a “telephone test”?
I remember reading the book (translated to Russian) in high school, about 10 years after its first edition, more than a quarter century ago. It’s a timeless classic. Thanks for reminding; brings good memories.
Randomdeterminism, I agree that the “telephone test” is one of the less enlightening rules in the EoPS appendix. The idea there was, if you read your code out loud to someone over a telephone, would they be able to type it in correctly from your dictation? If so, that’s good evidence that you’ve kept it simple enough.
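To make that concrete with an example of my own devising (nothing from the book, and the function names are invented), compare these two C versions of the same calculation and imagine dictating each down a phone line:

```c
#include <math.h>
#include <stdio.h>

/* Painful to dictate: every parenthesis and repetition has to be spoken. */
double root_terse(double a, double b, double c)
{
    return (-b + sqrt((b * b) - (4.0 * (a * c)))) / (2.0 * a);
}

/* Easier to read aloud: the intermediate quantity has a name. */
double root_spoken(double a, double b, double c)
{
    double discriminant = b * b - 4.0 * a * c;
    return (-b + sqrt(discriminant)) / (2.0 * a);
}

int main(void)
{
    /* x*x - 3x + 2 has roots 1 and 2; both versions return the larger root. */
    printf("%f %f\n", root_terse(1, -3, 2), root_spoken(1, -3, 2));
    return 0;
}
```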
“Code Complete 2” by Steve McConnell is a similarly important work.
Being a vegetarian, I can live without all the Sushi. However, it IS colorful! And I have to confess that when I am staying on the beach in Mexico somewhere beyond the back-end of civilization, fish is one of the few fresh staple foods (as well as tortillas and beans) available. So, believing in the philosophy that you should eat to live, in that situation I have no problem with scarfing down a nice Huachinango A La Diabla (Red Snapper w/ really hot chile sauce), or whatever else the local fishermen brought in that day.
Rubberman, I will try to remember to include some vegetarian sushi in the next-but-two article. (The next one and one after that are already written and cued up.)
Impressive article, thanks. I also found your “Whatever happened to programming” quite interesting. I still, however, haven’t found the answer to The Riddle of Reuse that I’m seeking, so I’ll consult you, the oracle. :) Right now, we are “excellent masters of our libraries, not terrible servants”, flipping your phrase on its head. This by definition means the libraries are under constant renovation, which creates our conundrum, but first a little context — sorry for the length of this “comment”.
I have followed the K&P philosophy for the last 20 years or so. In the nineties I switched jobs a few times, and somehow I stumbled into running my own software company. It’s been a fantastic experience. No more meetings. Just code. We also write automated tests, which are code, but not heavily promoted by K&P, Knuth, or even Paul Graham, which has always surprised me. In any event, we now maintain a handful of applications, some of them quite large. We have an open source framework built up over the last 10 years with a very high function-to-code ratio. All of this backed up by hundreds of tests.
Two of us write 95% of the code. We have tried to hire people, some of them have stayed five years, but they simply couldn’t understand how to reuse what we are continuously evolving. We follow K&P’s dictum: Don’t strain to re-use code; reorganize instead. Today we’d say, “refactor regularly”, but the meaning is the same.
When the two of us want to write a new app, it’s a few lines of code. The rule we have is: If you are writing more than a few lines of code, you are doing the wrong thing.
There is also the other problem of programming languages. In 1999, when we chose Perl, it was the only dynamic language with enough stability to support our first application (partnership tax accounting and groupware). We are not in Paul Graham’s class, or we might have chosen Lisp, and rolled even more code ourselves. Rather we said CPAN had some good stuff, mod_perl was rock solid, so rock ‘n roll, and don’t think about the consequences too much. After all, we’ll probably sell the business for $1B, and be off to greener pastures. Not!
Since we use Perl, new programmers can write code as they like (TMTOWTDI) and often get the application working well enough. However, after a couple of years, the defects start becoming overwhelming to manage. Then either my partner or I have to step in, create appropriate tests, shrink the code (often quite extensively), stabilize the application, and then add the new code to satisfy the new business requirements which forced the defects to the surface. It usually takes a couple of days, no more. Newbies simply can’t approach our more complex applications.
So that’s the Riddle of Reuse I’ve been trying to solve for the last few years. Many of the problem domains we deal with are complex in their own right, and most of the applications have been operating continuously for well over five years. When we throw our framework on top of the natural complexity of the applications themselves, it’s simply too much for even experienced programmers to handle. Perhaps there are programmers out there who can handle the complexity. I haven’t met them, or the ones I know are uninterested in solving the Riddle of Reuse. Perl itself is a barrier, and then there’s our “nearly-right-but-not-quite libraries” which we get to “make right”, not work around. That leads to more reuse, less copy-and-paste, but more complexity.
Any suggestions as to how we might solve The Riddle, Mike?
Thanks,
Rob
Good enough article & book to agree with most of it, but let me suggest one of K&P’s guidelines I frequently, remorselessly, violate:
Avoid temporary variables.
Not so much: I often have variables that are set once, then used (and discarded), just so that if I break in a debugger, the value of the expression or function call is displayed without my having to ask.
Of course, I try to make routines / functions that are short enough to stay within 7 +/- 2 variables floating around (locals in scope) in most cases.
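A trivial C sketch of the habit I mean (made up on the spot, not from anybody’s book): the named intermediate costs nothing once the optimizer has been at it, but it gives the debugger something to show when I break on the return.

```c
#include <stdio.h>
#include <string.h>

/* A throwaway temporary purely so a breakpoint on the return line
   can display the computed value without evaluating anything by hand. */
size_t visible_length(const char *s)
{
    size_t len = strlen(s);   /* break here: 'len' is right there to inspect */
    return len;
}

int main(void)
{
    printf("%zu\n", visible_length("hello"));
    return 0;
}
```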
I really hate the thousand line monstrosities with 100 to 150 active variables (!) floating around that some of my coworkers (various companies) seem to conjure up, which always seem to be chock full of bugs, strangely enough.
Speaking of debug-ability, I remember a discussion with one coworker about using functions to identify data flow (“see, these inputs are used to create this/these output(s)”). This was a strange new concept, apparently.
Well, that’s enough digression for now.
As a retired programmer of 32 years, I came from the era of 1974 and I can tell you exactly what “The Elements of Programming Style” has to teach us today. ABSOLUTELY NOTHING!
That era was, we now know, the closing years of a scientific priesthood which had us kneeling in reverence in sealed rooms before the mighty IBM 360, and later, sacred minicomputers. The priesthood surrounded their domain with important and dense treatises and continued ridicule of the desktop computer claiming, at first, that a C compiler could never run on one, and that the elaborately developed machinations of their if-then-elses, procedure calls and functions, were the be all and end all of all computing. Co-routines anyone?
Back in the real world, of course, things developed quickly. The priesthood stood indignant against newcomers unblessed by Gries, Kernighan or that paragon of arrogant elitism, Edsger Dijkstra – who would, or so he claimed, reduce all of programming to a series of logical equations deduced from the very pre and post conditions of the computation and whose book, “A Discipline of Programming” proved to be universally praised and widely studied and adopted, NOT.
Real programmers did indeed use some things from the elitists but they also added a great deal, used tools in different ways, and, eventually, the personal computer revolution brought computation to the hands of the masses, leaving the elites to blather on and on about their “contributions” while easily forgetting or ignoring their contempt and disregard for programmers who rose through the ranks, compromised, used what tools were available and succeeded. Elitist tools like “Smalltalk”, killed by excessive Xeroxian paternalism, were rarely, if ever, included.
I’ve found that the very first thing to be learned from “expert” books such as that one on “style” or anything else, is to completely disregard its admonitions and seek one’s own style, no matter if it leads to Dijkstra’s despised APL, to Emacs, to Prolog, Lisp or C++.
The greatest hacker of all, nature, does not use any such “guidelines” and here we are. Why should we?
Roboprog, I agree that “avoid temporary variables” is one of the rules that has aged less well. It comes from K&P’s analysis of programs where the excessive use of temporaries was one of the clues that the flow of the program was unnecessarily complex — not something that we have to worry about at the detail level any more, since we’re all using WHILE loops and suchlike all the time instead of GOTO mazes.
Personally, I’ve been working on specification-based languages for some time now. My opinion is that programming languages per se are a broken paradigm. There will never be enough programmers with the domain knowledge required to meet society’s greater and greater dependence upon computing systems. The only hope, IMO, are specification languages. To wit, tell the computer what you want it to do, not how to do it. A domain expert can do that for their needs, without needing to know some obscure programming language syntax and then convert that domain expertise into a properly functional program or software system.
James Pannozzi, thanks for an alternative viewpoint. I like it when people agree with me, but it would be boring if everyone did it :-)
As a point of information, though, I think you gravely mischaracterise Kernighan, Plauger, and their Unix buddies with talk of “the closing years of a scientific priesthood which had us kneeling in reverence in sealed rooms before the mighty IBM 360 [… and which] surrounded their domain with important and dense treatises and continued ridicule of the desktop computer”. This sounds more like a description of the very status quo that Unix was a reaction against: by design it ran on small computers (and was mocked by some for that very reason); it always emphasised the computer as servant rather than master or object of worship; and the Unix manuals, at least before the BSD era, were famed for being terse and practical rather than dense and theoretical. No doubt the attitudes you described existed, and were maybe even prevalent, in 1974; but you’re looking in the wrong place if you blame the likes of Kernighan and Plauger for them.
Also: “… that paragon of arrogant elitism, Edsger Dijkstra – who would, or so he claimed, reduce all of programming to a series of logical equations deduced from the very pre and post conditions of the computation and whose book, ‘A Discipline of Programming’ proved to be universally praised and widely studied and adopted, NOT.” Rail all you want against Dijkstra, the fact of the matter is that we all now use his “structured programming” all the time (except when programming in assembly). It’s an idea that has stood the test of time so successfully that we don’t even think about it any more. (I bet many recent CS graduates wouldn’t even know what the term “structured programming” refers to.)
And finally …
“I’ve found that the very first thing to be learned from ‘expert’ books such as that one on ‘style’ or anything else, is to completely disregard its admonitions.”
It seems strange to refuse to stand on the shoulders of giants; and I really don’t see why you would disregard admonitions such as “Write clearly — don’t be too clever” and “Watch out for off-by-one errors”. Still, if it works for you.
“Rail all you want against Dijkstra, the fact of the matter is that we all now use his “structured programming” all the time (except when programming in assembly)”
We use it only because it is easy to teach, like cooking from a cookbook. Structured programming is the cause of the BSOD and many other stateful bugs. It has taken me about 10 years to unlearn many of the lessons perpetrated on the computing profession by Dijkstra. Gotos are not harmful; we use them all the time in the form of “exceptions”, “break”, etc. Probably the worst thing we have suffered from Dijkstra, though, is his dogmatic hatred of testing. The only reason to be against tests is if you think you are perfect, and that’s the ultimate problem with Dijkstra: he thought he was perfect.
Those rolls look oh so devilishly tasty. Breadcrumbs coating. Yum. Here in Brazil, sushi joints almost always use a kinda-sorta pancake dough coating for the Hot Philadelphias (raw salmon + cream cheese + some herbs), or Hot Whatever they make. Some places do use the schnitzel-like coating, though.
No, wait, I meant to write something about programming here. Gah. Er. Um. Ah. I wonder if the authors of that book ever thought of writing an updated version? (In which, for example, temporary variables, like Native Americans, are no longer considered evil?)
Oh, and I was hooked to this blog as of 20 minutes ago, thanks to the “Whatever happened to programming” post. It really struck a chord with me. I got to do all that fun stuff (RS-232 modules for CP/M, a multiuser MS-DOS clone, TRS CoCo ROM BASIC extensions, SMTP and FTP implementations for credit-card swipe machines, and even weirder things) that seems to have vanished from the professional world of programming.
I leave with a question: Lewis, Jenson, or Don’t-Give-a-Damn?
Cheers,
Juan
Rio de Janeiro, Brazil
P.S.: My name shall not be construed as evidence that Brazilians speak Spanish. They speak Portuguese.
Rob, we may be at cross-purposes here. The heart of “structured programming” was expressing algorithms in terms of constructs that had inherent semantics: if-then-else blocks, while loops, encapsulated procedures/functions, and so on. Not building these, and evil mutant hybrid offspring forms, out of GOTOs. And we all do that now, all the time, not because it’s easier to teach but because it’s easier to write programs that way, and easier to maintain programs that have been written that way.
You say that we still use GOTO in the form of exceptions and BREAK; but they are not GOTO, or at least, not the form of it that structured programming was about weaning us off. They are specific, constrained forms of flow transfer, and they are much, much easier to reason about than old-fashioned non-local GOTO with an arbitrary label. (At least, BREAK is much, much easier to reason about; exceptions are merely much easier.)
Let me be clear, by the way, that I am by no means a no-GOTOs-ever zealot — in fact, one of the things I don’t like about my current favourite language, Ruby, is that it lacks GOTO. Sometimes — not often, but sometimes — GOTO is just the most economical and natural way to express an algorithm, and I resent a language designer who arrogantly takes it upon himself to remove that option from me for what amounts to religious reasons. The point is not Never Use GOTO; it’s that when we do use GOTO now, we’re doing so carefully and intentionally, because that is what’s demanded; not as a substitute for semantically coherent control constructs.
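To illustrate the distinction I’m drawing, here is a throwaway C sketch of my own (the function names are invented): the first search leaves the loop with the constrained break, the second does the same job with a label and goto. In this tiny case they read about the same; the difference is that break can only ever take you to one place, whereas the label could have been anywhere at all.

```c
#include <stdio.h>

/* Constrained flow transfer: break can only leave the enclosing loop. */
static int find_break(const int *a, int n, int target)
{
    int found = -1;
    for (int i = 0; i < n; i++) {
        if (a[i] == target) {
            found = i;
            break;
        }
    }
    return found;
}

/* The same search with goto: harmless here, but nothing stops the label
   from being anywhere, which is what makes it harder to reason about. */
static int find_goto(const int *a, int n, int target)
{
    int i;
    for (i = 0; i < n; i++)
        if (a[i] == target)
            goto done;
    i = -1;
done:
    return i;
}

int main(void)
{
    int a[] = { 3, 1, 4, 1, 5 };
    printf("%d %d\n", find_break(a, 5, 4), find_goto(a, 5, 4));
    return 0;
}
```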
So, yeah, I think Dijkstra won that one, emphatically.
(But … being against tests? I can’t even.)
I am embarrassed to admit that I don’t understand the Lewis-Jenson question. Thanks for your remarks on the sushi, though — that is a side of this blog that I want to develop more.
Mike, I think people like the “idea” of Dijkstra without accepting that we don’t actually program like he would have wanted us to. You write: “the code becomes runnable, then it does something useful, it passes tests, and then — yes! — it’s not just an idea any more, but an actual program.”
Dijkstra wrote: “You must make the program in such a way that you can give convincing argument for its correctness.” AND
“That program testing does not provide such a convincing case is well-known.” [EWD1036, 1988] This view is 100% at odds with what you wrote.
The heart of structured programming is imho the “single exit”. The concept of modularization came about in the 1950s, if not earlier.
In EWD215 [1969], he says that “our intellectual powers are rather geared to master static relations”, which is why it is difficult to teach dynamic programming. Imperative programming, of which structured programming is a subclass, is easier to teach, because it is easy to follow one statement after another.
Dijkstra spells out the problem with dynamic programming very well, which leads us to the problem with DSLs, functional programming, and to my Riddle of Reuse. While it is easy to write imperative code, it quickly becomes unmanageable for any problem of significant complexity, which is where Greenspun’s Tenth Rule comes in: Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp.
For most of his life, Dijkstra thought Lisp was an abomination. In EWD1284 (1999), he concedes he “was very slow on appreciating LISP’s merits”. The problem is that any Dijkstra followers have to be even slower to appreciate Lisp’s merits, which is why languages that emulate some of Lisp’s features, such as closures and continuations, are only now becoming popular. If you had been taught Lisp instead of Pascal/C/C++/Java, you wouldn’t be complaining about having to use C++/Java, and instead be programming in something far superior to Ruby. Alas, we suffer from the legacy of structured programming so that we are only now entering a Lisp renaissance.
Well, Rob, I guess that “the idea of Dijkstra” is indeed what I am talking about — the idea of what I guess you’d call “weak” structured programming (which by the way is much more about single-entry than single-exit). While I agree that other and stricter aspects of EWD’s approach haven’t taken hold (and I also agree with you that they shouldn’t have!), that doesn’t negate for me the enormous and pervasive influence he’s had on how imperative programming is done.
The effect all this might also have had on delaying the recognition of functional programming’s merits is a whole nother matter, and not one that had occurred to me before. Still, if Lisp really is everything that its fans claim, then I think it needs a rather more compelling excuse than Dijkstra’s opposition for why, 52 years after its invention, it is still struggling for acceptance. Everyone agrees that Lisp brought very important concepts to the table, and modern languages are much better for them; but Lisp for actually doing work in? I’ve not used it enough to bring a technical critique, but I just have to have reservations about a language that can live on the fringe for more than half a century. I’d be interested to know why you think that is?
I think Lisp suffers at the hands of cognitive dissonance and weak abstract thinking. “On Lisp” by Paul Graham is an extremely well-written book, and it’s free. Have you read it? Probably not. If you program Java for a living, you aren’t going to read a book on Lisp. It would challenge your world view too much. That’s cognitive dissonance. You are going to read a book like Code Complete, which operates from your well-understood imperative coordinate system.
Abstract thinking is hard, really hard. Our brains have not evolved to support it very well. Try visualizing a four dimensional toroid. I can’t do it. It’s too abstract for me. However, after years and years of practice, I can now visualize software declaratively and often definitionally.
Lisp requires a mental shift that most programmers are unwilling to make. It’s much easier to go to languages like Ruby, Perl, Python, etc., because they are rooted in imperative programming, but allow you to program functionally. That’s how I came upon Lisp: by using Perl’s Lispish features one by one.
The vast majority of Perl code is imperative, of course. When I wrote my Extreme Programming with Perl book, I took some code on CPAN as an example, and converted it to a stateless/functional style. It’s shorter, and imho, more readable. Judge it yourself: http://www.extremeperl.org/bk/refactoring
Like Dijkstra, programmers will gravitate to programming in Lisp, even if it doesn’t look like Lisp. It will take a few more decades, but that’s the nature of evolution: it takes a “long” time. The moving parts of programming were supplied to us over fifty years ago, we’ve just been rearranging the bits so that our minds can grasp them.
Rob Nagler wrote:
I’m quoting that paragraph in full because it’s an excellent example of what I think is one of the reasons Lisp has made relatively little headway in the last 52 years. Its advocates have this tendency to assume everyone else is stupid — that if they don’t use Lisp it must be because they are not clever enough for it.
Let me note in passing that even when this is true, it’s not a strategy that’s likely to win converts.
In point of fact I do not program Java for a living, and a quick browse of this blog’s earlier articles will show that I am shifting from Perl towards Ruby in part because it is better for expressing functional-programming ideas. I am not blind to the value of FP; it’s because of that that I am so intrigued by Lisp’s chronic failure to conquer the world. So I think we need a better explanation than “you’re all just too dumb to get it”.
BTW, your refactoring chapter is a nice and helpful worked example of how to do this stuff; but I don’t see that it has much to do with Lisp.
I guess the answer is “don’t give a damn”, then. :)
http://en.wikipedia.org/wiki/Lewis_Hamilton
http://en.wikipedia.org/wiki/Jenson_Button
In no way did I mean to say you were stupid.
I apologize for the informal “you”: s/you/one/g. And, I probably should have said, “which operates from the perspective of classical computer science teaching.” I’m a programmer, not a writer. :-(
I’ll try a different approach on explaining why Lisp is not in widespread use.
Darwin’s famous book is now 150 years old. Yet, even today, people have a difficult time with the concept of billions of years. Most people who “believe” in evolution have difficulty thinking in this time frame. Moreover, most people who believe in evolution still believe in a god as defined by Genesis. Richard Dawkins, Sam Harris, Christopher Hitchens, etc. have written quite eloquent texts pointing out this contradiction, but they have failed to change the way most people think about an omniscient deity.
The problems of evolution and Lisp are really one and the same. We were taught to design our programs Top Down, as god supposedly did, and the reality is that we don’t really have a clue how to design programs, and only after a period of time does the “best” design emerge.
We don’t program in Lisp, because as Dijkstra said, “while now, 40 years later, functional programming is still considered in many CS departments as something much too fancy, too sophisticated to be taught to undergraduates.” [EWD1284, 1999] It is the CS professors who are either stupid or think we are too stupid to understand functional programming. Professors are the ministers of the Church of Structured Programming. If Lisp is “right”, then all those professors are “wrong”, and what we were taught in school is “wrong”. However, that’s not going to happen, and we’ll just have to meander our way towards Lisp in the same way we are meandering our way towards understanding evolution.
Thanks for the compliment on my refactoring chapter. The technique used by the _list() method was accepting a function as data. Lisp, invented in 1958, is based on the lambda calculus (1936), which is where other programming languages got the idea of functions as data. Therefore, any time you use a function pointer, you are using a fundamental Lisp construct. The difference is that with Lisp that function pointer points to a list, which can be inspected. In C, for example, all you can do is look at the machine code, not the original C instructions. I don’t know of any other language which lets you introspect as much as you can in Lisp.
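To put that in C terms with a throwaway example of my own (nothing to do with our Perl code): handing a comparator to the standard qsort is exactly the “function as data” idiom, except that the pointer you pass is opaque; there is no list behind it to inspect at run time.

```c
#include <stdio.h>
#include <stdlib.h>

/* A comparison function passed as data to qsort: the caller supplies
   behaviour, but the function's own source cannot be examined at run time. */
static int compare_ints(const void *a, const void *b)
{
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

int main(void)
{
    int values[] = { 42, 7, 19, 3 };
    qsort(values, 4, sizeof values[0], compare_ints);
    for (int i = 0; i < 4; i++)
        printf("%d ", values[i]);
    printf("\n");
    return 0;
}
```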
Pingback: Where Dijkstra went wrong: the value of BASIC as a first programming language « The Reinvigorated Programmer
Funny you should mention this book.
I have a copy my father bought the year I was born…
While the Fortran and PL/1 were a bit of a turn off at first, I keep referring back to it and it keeps informing my own style.
An oldie but a goodie.
@Rob Nagler
I don’t know where you were taught but I was taught functional programming on my CS degree. And lots and lots of CS degrees do the same.
I think functional programming is having a resurgence but I don’t think LISP is about to become mainstream. And that’s because:
1.) Syntax matters. Oh yes it does :-)
2.) Types matter. And LISP predates the invention of the vast bulk of their theory.
3.) Macros, while powerful, are a technique of last resort. Most things – even a lot of generic programming – can be done with more constrained tools (ones that didn’t exist in the ’50s – e.g., combinator libraries).
4.) The ideas where LISP is a clear win – garbage collection, read-evaluate-print-loop, dynamic compilation, easy construction of DSLs, anonymous functions and closures, and so on – tend to be adopted by other programming languages.
Pingback: Programming Style | Superposition Kitty
Pingback: Programming Books, part 3: Programming the Commodore 64 « The Reinvigorated Programmer
Pingback: So what actually is my favourite programming language? « The Reinvigorated Programmer
Pingback: The hacker, the architect and the superhero: three completely different ways to be an excellent programmer « The Reinvigorated Programmer
Gareth, you missed:
5) Because Lisp is a family of languages rather than a single language, and because the only attempt to unify the variants turned into a gargantuan monster that makes C++ look trim (Common Lisp, especially CLOS), no single implementation has ever reached critical mass: and most programs are not portable between implementations, not least because every implementation even of Common Lisp has a completely different FFI for calling into languages like C where pretty much all the libraries live. Lisp has the same virtual machine trap as Java does, but fewer libraries.
I do wonder if Lisp would be alive at all now if it wasn’t for Emacs. Every single Lisp hacker I know of who learned the language after the 80s learned it first so they could extend Emacs or XEmacs. (Under the circumstances it’s a shame that Emacs Lisp is such a horrible dialect.)
Pingback: Programming Books, part 4: The C Programming Language « The Reinvigorated Programmer
You might also like Kernighan and Pike’s 1999 /The Practice of Programming/, which updates the rules, includes test automation, and presents examples in modern languages.
http://cm.bell-labs.com/cm/cs/tpop/index.html
Jason, I do own The Practice of Programming, and I did find it helpful to read — I’ll probably re-read it pretty soon. But somehow it didn’t quite, for me, have the classic, concentrated feel. I’d put it high in the second rank of programming books.
I haven’t read The Elements of Programming Style, but it sounds as if the authors were purposely emulating another indispensable manual, Strunk and White’s The Elements of Style.
http://en.wikipedia.org/wiki/The_Elements_of_Style
Pingback: Frameworks and leaky abstractions « The Reinvigorated Programmer
@ Valda Redfern
Actually, I believe that the emulation of Strunk & White was deliberate, and a sort of kudos to them.
Pingback: Are you one of the 10% of programmers who can write a binary search? « The Reinvigorated Programmer
Pingback: Binary search redux (part 1) « The Reinvigorated Programmer
Pingback: Writing correct code, part 1: invariants (binary search part 4a) « The Reinvigorated Programmer
Here’s a 2009 presentation by Kernighan entitled “Elements of Programming Style”. It covers ideas from the book and includes (some) updated examples:
http://video.ias.edu/PiTP2009-Kernighan
Pingback: Infovore » Links for April 28th
Pingback: The difference between imperative and functional programming « The Reinvigorated Programmer
Pingback: What does it take to test a sorting routine? « The Reinvigorated Programmer
Pingback: Programming Books, part 5: Programming Pearls | The Reinvigorated Programmer
Pingback: The Elements of Programming Style | markjeee.com
Pingback: More thoughts… | Scali's OpenBlog™
Pingback: Debugging | 42 IT Solutions
Pingback: Summary of rules from “Elements of Programming Style,” 1974 | Beyond the Beyond | Wired
Pingback: Understanding this Brian Kernighan quote | Coding and Programing
Your comments on Kernighan remind me of another book by Kernighan. I found a mimeographed book from 1974 on the net a couple of years ago – it contained all the Fortran code (and C code too if you want it) for Kernighan’s RATFOR.
I don’t doubt the quality of the code, but I found it unreadable. However, consciously or unconsciously, I may have copied his ideas in 1975, in writing a much simpler program (to handle land survey data). I wrote a scanner (tokeniser) which did much less than Kernighan’s, but was only a few lines long – even so, it was at the limits of readability for me.
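To give an idea of the scale I mean, here is roughly the level of ambition of that old scanner, re-imagined just now as a few lines of C purely for illustration (the original was not in C and is long gone):

```c
#include <ctype.h>
#include <stdio.h>

/* A minimal scanner: reads standard input and reports each token as a
   NUMBER or a WORD, skipping everything else. */
int main(void)
{
    int c = getchar();
    while (c != EOF) {
        if (isdigit(c)) {
            printf("NUMBER: ");
            while (c != EOF && isdigit(c)) { putchar(c); c = getchar(); }
            putchar('\n');
        } else if (isalpha(c)) {
            printf("WORD: ");
            while (c != EOF && isalpha(c)) { putchar(c); c = getchar(); }
            putchar('\n');
        } else {
            c = getchar();   /* skip separators and punctuation */
        }
    }
    return 0;
}
```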
Richard Mullins
Funny to think that the RATFOR compiler is out there somewhere. You will be aware of course that another classic Kernighan-co-authored book — K & Plauger’s Software Tools — has all its code in RATFOR. Probably the only use that language has ever had outside of Bell Labs!