Sudoscience

Firing from the hip

Table of Contents

1. Introduction
1.1. Using our words
1.2. Art? Science? Both? Neither?
2. Seeming
3. Being
4. Coda
5. Sources

Is software development an art? Does it want to be? One trivial but telling point in favor of an affirmative answer is the crude and lazy habit of calling previously existing (and often related) software "prior art." Lazy because it lends an unwarranted significance to the computing world's pattern of making minor alterations to tools and software and releasing them as if they were new. Basically equivalent products proliferate, each addressing gripes or perceived flaws about another. Crude because it has the mystifying properties of jargon while pretending to be clear in its intentions. "My work is art!", it proclaims, by dint of existing in a one-way flow of derivation.

Salesmen like the late Steve Jobs are much quoted as voicing cynical bromides such as, "Good artists copy. Great artists steal." This by way of smuggling a virtuous creative ethic into corporate espionage and the quotidian travails of office labor. Let us consider some more catholic, and less insipidly moralizing, definitions of art and artists. Donald Knuth, eminent computer scientist, takes a historical approach. He eschews the contemporary practice of using "art" to mean only "fine art"; instead, he prefers art to mean the application of principles discovered by science. Moreover, art is practical, dynamic, and intuition-based. Science, traditionally, is none of these things: it represents the product of reason and knowledge, attained sometimes through action and experimentation, but just as well through reflection.

Here are two excerpts from his paper, Computer Programming as an Art:

Meanwhile we have actually succeeded in making our discipline a science, and in a remarkably simple way: merely by deciding to call it "computer science."

As civilization and learning developed, the words [science and art] took on more and more independent meanings, "science" being used to stand for knowledge, and "art" for the application of knowledge.

1.2. Art? Science? Both? Neither?

The important conclusion to be drawn from these references, to my mind, is that naming, contrary to many understandings, is largely a matter of preference, even whim. Software development wants to be scientific because it is downstream of several more-or-less scientific disciplines; but it also wants to be scientific because things like correctness and performance matter (for some definitions of correctness and performance). But it is an art because what else is software development if not the application of knowledge gained from research in computer science?

Can something be both science and art? Is architecture an example, or would one be compelled to distinguish between structural mechanics1 and architecture? Where must the division be placed? More of this anon. For the sake of balance, let us take a look at another, entirely different, definition of art. This one is offered by Nietzsche in his Gay Science: "Art is the good will toward appearance."

Good will? Appearance? The way I interpret this typically ambiguous pronouncement by one-third of the Terrible Three—the so-called "Masters of Suspicion," Freud, Nietzsche, and Marx—is that "good will" is a generosity of spirit, an openness to being wrong, to suffering indecision, uncertainty, and anxiety in the face of unreason; in short, a consummately unscientific outlook. In this light, the sense of "appearance" becomes a bit clearer: it inverts the scholastic Latin phrase esse quam videri, "to be, rather than to seem."

2. Seeming

If we can successfully make something a science by merely naming it one, can we achieve the same result vis-à-vis art? Knuth probably thinks so: the title of his compendious and still-ongoing series of computer science textbooks is The Art of Computer Programming. The differences to be identified between computer programming and software development will have to wait for another blog post. For now, I will hand-wave away these differences.2

As for Nietzsche's proposition: this one presents, not surprisingly, a little more trouble. Incidentally, Nietzsche thought that naming was the truest indicator of originality, for it served to point out things that people had never before considered. His opinion of science, suffice it to say, was considerably less sanguine.

My own sense is that software development is inherently a discipline of appearances. In part, this is due to the overwhelming and inescapable burden of history3: consider, for example, the basically ossified nature of the client-server model, which permeates all aspects of modern operating systems, networking, and even end-user interaction4. Whatever idealistic quality we would like to pinpoint in the way we design our software, we are beholden to the way things appear to be; in the case of some novel interface that promises to liberate us from the [insert your programmer's conceptual hobbyhorse here], we are faced with a unity-as-resistance that defies attempts to break it down and understand it. We are hamstrung by old habits, user requirements, software design patterns, and numerous other limitations. Some things have a false bottom that refuses to be broken through.

3. Being

For better or worse, the industry, or else various influential people in the industry, strongly resist(s) this notion. They want their work to be rational, considered, free of emotional or political motivations, and easily explainable along these lines to others, preferably non-technical people. Here's a demonstrative snippet from Nathan Marz's keynote at ReClojure 2025: "If you're not in a constant state of annoyance, you're doing it wrong." The remark is intended to be glib, the mark of a veteran expressing his disdain for the vicissitudes of his chosen profession. But it reveals something else, too: the characteristic affect-ladenness of software development. With the best of intentions we mistake this sentiment for a sign of circumspection or the wisdom of experience. What it entails is a culture of nay-saying, a kind of ascetic system of gurudom, in which habits and practices are enforced by a top-down expertise that does not necessarily have anything more than a contingent bearing on reality. What might work for one guru, or expert, or Software Architect, could very well be anathema to another. What appear to be the insignificant quarrels or internecine squabbles of a knowledgeable few, affecting only their own discussions, are in fact a perilously slippery slope of discourse, one that widens the gap between the plebeian masses writing the software and the people designing it.

4. Coda

I used to be a professional software developer. I quit because the work no longer interested me as it once did. More noteworthy to me were some of the ideas and points I have raised and belabored in the course of this post. That they lie outside the realm of a programmer's or developer's day-to-day activities seems to me an obvious fact. That being said, this post will continue, in numberless installments, while I attempt to satisfactorily—that is, for my own satisfaction—address these issues, among other things.

5. Sources

Dr. Manhattan denying that sources exist.

Footnotes:

1. This probably isn't the right term. Corrections welcome.

2. Sorry.

3. The exercise in subjectivity par excellence.

4. Example: Fonts on Linux. Investigate at your own peril.