Parkinson’s Law of Triviality

Over the past ten days or so I have found myself making constant references to Parkinson’s Law of Triviality in order to explain certain dilatory behaviors. If, like me, you do not have a copy of the book, Poul-Henning Kamp has written an excellent write-up on the concept, hosted at bikeshed.com:

Parkinson shows how you can go in to the board of directors and get approval for building a multi-million or even billion dollar atomic power plant, but if you want to build a bike shed you will be tangled up in endless discussions.

Parkinson explains that this is because an atomic plant is so vast, so expensive and so complicated that people cannot grasp it, and rather than try, they fall back on the assumption that somebody else checked all the details before it got this far. Richard P. Feynman gives a couple of interesting, and very much to the point, examples relating to Los Alamos in his books.

A bike shed on the other hand. Anyone can build one of those over a weekend, and still have time to watch the game on TV. So no matter how well prepared, no matter how reasonable you are with your proposal, somebody will seize the chance to show that he is doing his job, that he is paying attention, that he is *here*.

100,000 disclaimers do not make an opinion personal

“In dealing with customers and outsiders, remember that you represent the company, ostensibly with full responsibility and authority.

You may be only a few months out of college, but most outsiders will regard you as a legal, financial, and technical agent of your company in all transactions, so be careful of your commitments.”

[via “The Unwritten Laws of Engineering”, 1944]

Using Mathematics as an argument

I just came out of a meeting where the following phrase was spoken (and the meeting’s context does not really matter):

– Mathematics has spoken. You can never ever have everything as a variable. You have to have constants.

This was used as a math-therefore-I-am-right-full-stop argument. Never, ever invoke Mathematics, Science or any other full-stop argument in a room filled with 60+ people holding Mathematics, Engineering and Computer Science degrees and expect to be taken seriously. Interpret the fact that you were not countered as politeness instead.

reboot to fix

Apart from its historical value, “The Hacker Crackdown” is full of gems, like:

Starting over from scratch will generally rid the switch of any software problems that may have developed in the course of running the system. Bugs that arise will be simply wiped out by this process. It is a clever idea. This process of automatically re-booting from scratch is known as the “normal fault recovery routine”.

So you see this was not Bill’s idea in the first place.

Personally I hate reboot-to-fix. Rebooting ought to be a last-resort solution: it not only sweeps the problem under the carpet, but also deprives one of the opportunity (and sometimes the data) to find out what the cause of the problem is. It is performed under pressure, in a hurry, and usually with no data at hand to replicate the problem and study it in a test environment with some peace of mind. “Make the damn thing work first; find out what happened later! We’re losing money!” Downtime is not an option, and so routers and servers get reloaded… I will not sit in an ivory tower pointing fingers, though, for I have practiced this “problem solving” technique a number of times myself.

Reboot-to-fix comes with a price: while at times it seems like a time saver, it only pushes the manifestation of the problem forward to a later (and usually more inconvenient) time. And then it stops looking like a time saver. If doing the same thing over and over while expecting different results can be considered a sign of insanity, reboot-to-fix is just another symptom of it.

Update: Some 12 days later, Paul Venezia wrote “When in doubt, reboot? Not Unix boxes”. Cool stuff!

Blast from the past: Bruce Sterling on Cyberspace

With everybody and his dog prepending cyber- to almost everything, declaring cyberspace a new warfare domain (the first four being land, sea, air and space), and abusing and overusing terms like cyberwar, cyberdefense and cyber-infrastructure, it is a good idea to return to the basics, like the very definition of cyberspace. Luckily, in his introduction to the 1992 “Hacker Crackdown”, Bruce Sterling came to our assistance long before a definition for the wider public was needed:

A science fiction writer coined the useful term “cyberspace” in 1982, but the territory in question, the electronic frontier, is about a hundred and thirty years old. Cyberspace is the “place” where a telephone conversation appears to occur. Not inside your actual phone, the plastic device on your desk. Not inside the other person’s phone, in some other city. THE PLACE BETWEEN the phones. The indefinite place OUT THERE, where the two of you, two human beings, actually meet and communicate.

Although it is not exactly “real,” “cyberspace” is a genuine place. Things happen there that have very genuine consequences. This “place” is not “real,” but it is serious, it is earnest. Tens of thousands of people have dedicated their lives to it, to the public service of public communication by wire and electronics.

Even if you have no interest in reading about Operation Sundevil, the introduction of the book is a very informative essay on cyberspace that stands on its own.

Read Next: Proposal for cyber war rules of engagement.

The Social Organization of the Computer Underground

I think I read the text version of “The Social Organization of the Computer Underground” sometime between 1993 and 1995. Recently I found out that the author has written an anniversary edition with a new introduction to the text (plus PDF and ePub versions).

While the information in the text is dated (it was published in 1989), it is still a useful read for those who wish to understand a little more deeply what went on (and some of what still goes on) in the Digital Underground. Even better, the introduction offers a methodology for doing this the right way. I still consider it mandatory reading. My favorite part of the text is how the following typology from Best and Luckenbill’s 1982 “Organizing Deviance” is used to describe the Computer Underground:

Form of Organization   Mutual Association   Mutual Participation   Division of Labor   Extended Organization
Loners                 no                   no                     no                  no
Colleagues             yes                  no                     no                  no
Peers                  yes                  yes                    no                  no
Mobs                   yes                  yes                    yes                 no
Formal Organizations   yes                  yes                    yes                 yes
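What makes the typology elegant is that it is cumulative: each form adds exactly one property on top of the previous one, like a Guttman scale. That structure can be sketched in a few lines; this is purely my own illustration, not part of Meyer’s or Best and Luckenbill’s work:

```python
# Best and Luckenbill's typology as a cumulative scale: each form of
# organization adds one property to the one before it. A hypothetical
# classifier for illustration only.
PROPERTIES = [
    "mutual association",
    "mutual participation",
    "division of labor",
    "extended organization",
]
FORMS = ["Loners", "Colleagues", "Peers", "Mobs", "Formal Organizations"]

def classify(present):
    """Map a set of present properties to a form of organization,
    assuming the cumulative ordering of the table above."""
    count = 0
    for prop in PROPERTIES:
        if prop in present:
            count += 1
        else:
            break  # the scale is cumulative: stop at the first gap
    return FORMS[count]

print(classify(set()))                   # → Loners
print(classify({"mutual association"}))  # → Colleagues
print(classify(set(PROPERTIES)))         # → Formal Organizations
```

A group missing an earlier property simply never reaches the later rungs, which is exactly what the rows of yes/no above encode.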

I think that those who read the text will agree that the above typology most probably still stands today. Formal organizations, for example, do not appear in Meyer’s study; these days, however, almost every nation is investing in building a cyberwarfare capability (and that is not an “overground” operation).

It is a pity, I think, that such a work cannot be repeated today. If it could, it would provide us with some glimpse into modern cybercrime networks and even espionage ones (industrial or national). But then again, one can hope that there exists a sociologist who will prove me wrong.

PS: Revisiting the text I was reminded of the Cu Digest to which I was a subscriber for quite some time.

Update: Reading the description of the Anonymous group behind the HBGary hacks, I appreciate the above table even more:

“Anonymous is a diverse bunch: though they tend to be younger rather than older, their age group spans decades. Some may still be in school, but many others are gainfully employed office-workers, software developers, or IT support technicians, among other things. With that diversity in age and experience comes a diversity of expertise and ability.”