Why Reliability Metrics?
There's a new law being considered in the EU that would require software companies to pay for damages caused by bugs.
A comment about halfway down the slashdot discussion of it recommends requiring specific certifications for coders working on specific kinds of projects, just like engineers, doctors, and lawyers.
My initial reaction was "That's dumb." But that's because I was thinking of the way our current certification system works. You cram some obscure material about details that you'll probably never actually need to know (and, when you do, you're just going to use google to find them anyway), take some computerized test, and pass or fail.
How could that ever possibly show that you can create reliable software?
But the idea dovetailed with a conversation I had last night with a former boss. He was comparing me with a guy I replaced, who took around 20 tries to get a login screen to work. We were both happy to agree that it doesn't matter how fast and pretty it is, or how many extra bells and whistles it has; if the core functionality doesn't work, you wasted your time creating it.
I'm tempted to veer off and talk about how we've gotten used to exactly that, but I'll leave it for some other day.
My question is: how can a developer prove that he's capable of writing reliable software? For most of this article, that's all I'm going to discuss. I'm ignoring very important things like maintainability, performance, scalability, etc. I'm focused on "Does the software do what the end-user wants?"
The Existing Situation (as I see it)
An organization can point to existing software to demonstrate its abilities. A developer could show a portfolio of existing work, but that really doesn't prove anything. There's nothing to say he/she actually wrote any of it, or even did the googling to find existing code him/herself.
Languages and methodologies don't really matter either. I once worked with a very old-school C programmer named Mike. We were using C#, but he tried to avoid anything more modern than the concepts in K&R C. He copied and pasted all over the place because, he claimed, he didn't want to take the "extra time" it would cost him to create a base class and refactor the duplication into that. Personally, I think he just didn't understand inheritance.
He wrote extremely reliable code (even though it was a nightmare for anyone else to try to change).
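Just to illustrate what I mean (a made-up sketch, not Mike's actual code, and none of these class names come from the real project): the base-class refactoring he refused to do amounts to pulling the shared plumbing up into one place and letting each screen override only what's different about it.

```csharp
// Hypothetical sketch of refactoring copy-pasted screens into a base class.
using System;

abstract class ScreenBase
{
    // The shared save/validate plumbing lives in exactly one place...
    public void Save()
    {
        if (!Validate())
            throw new InvalidOperationException("Validation failed.");
        Persist();
    }

    protected virtual bool Validate() => true;

    // ...and each screen supplies only what's actually different about it.
    protected abstract void Persist();
}

class LoginScreen : ScreenBase
{
    protected override bool Validate() => true; // e.g., check the user name/password format
    protected override void Persist() => Console.WriteLine("Saving login settings...");
}

class ProfileScreen : ScreenBase
{
    protected override void Persist() => Console.WriteLine("Saving profile changes...");
}
```

Copy-and-paste gives you two (or ten) slightly different versions of that Save method to keep in sync by hand; inheritance gives you one.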
It's mathematically possible to prove that a piece of code is correct. It might even approach something that could be done by mere mortals, if functional languages ever become mainstream. For now, it's just mental masturbation for academics.
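For the curious, here's roughly what that looks like in a proof assistant like Lean: a toy function plus a machine-checked theorem that it meets its spec. (The function and theorem are my own made-up example, and a real system would be orders of magnitude harder, which is exactly the point.)

```lean
-- Toy sketch: a trivial function and a machine-checked proof about it.
def double (n : Nat) : Nat := n + n

-- The "specification": double n always equals 2 * n.
theorem double_spec (n : Nat) : double n = 2 * n := by
  unfold double
  omega  -- linear-arithmetic decision procedure closes the goal
```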
NASA has its reliability procedures neatly codified. I've read that they spend more time working on the process of creating code than they actually do creating it. Maybe we could get them to make those processes public and certify developers in them. That seems to be the sort of thing the slashdot suggestion was aimed at: crashing planes, power plants blowing up, etc. NASA's methodology would kill most software companies, because it takes too long, but we aren't talking about most software companies, are we? The EU legislation seems (to me) to be aimed squarely at Microsoft, hoping to bleed them for losses caused by things like bugs in Excel.
Maybe the government could force that sort of thing on publicly traded companies, like an extension of SOX. But SOX requirements are already ridiculously expensive and painful. Besides, I haven't noticed them being all that effective.
Certification, re-thought
So, what about that original idea? Certify developers the same way we do doctors and lawyers. Someone who's already certified sponsors a developer, who then goes through essay exams and oral boards, and actually designs and writes some code.
Of course, any code examples would be too small to prove the developer could actually write, say, a reliable distributed transactional enterprise-level system. And just because developers know they should write unit tests and actually test their own changes doesn't really mean they'll do so...or be allowed the time to do so by their managers.
And it would skew the results toward people who think the same way. I suspect the ranks of the certified would come to be dominated by people who favor their own pet methodology (like Mike, or maybe the people who are always chasing the latest automated testing methodology du jour...you know who I mean).
There doesn't seem to be much reason for this sort of thing in the open source communities, but there are possibilities there, too. If all (or even most) of the Open Office developers got this sort of certification, that would be even more reason for governments to ditch MS Office.
Then there's the question of programming language. Most developers aren't polyglots. When I started programming, I latched onto the advice to "learn one new language every year." Like Java, the .Net environment is exploding so quickly that it's now tougher to keep up with the changes in each new release than it was to learn those new languages. Besides, many have a prejudice against time spent working in other languages, as if it doesn't count toward experience in whichever language their shop uses. (I'm not saying a guy who started with C++ then switched to other languages for several years can jump right back into doing C++, but it isn't that hard to pick back up...that's exactly what I did for my current job. It took about a month to knock the rust off, but those years spent working in other languages have been extremely valuable).
Most developers could read most code in most languages, even if they don't grasp all the subtleties. Many won't even make that much effort. (I remember trying to show another developer some Python. He almost gave up in disgust because he didn't know what to open the sample with--how do programmers stay employed when they don't understand that almost all code is written in a text editor?--and then he completely dismissed the language because I hadn't used Hungarian notation, and he couldn't tell what type everything was.) Anyway, if you're familiar with one language derived from Algol, you can probably pick up the basic gist, even if the coder did use variable names that offend your personal sensibilities. But if you start getting into more obscure languages, all bets are off--even though a lot of those languages were designed for reliability.
From that perspective, maybe the certifiers shouldn't look at the code at all. Just say "make a web service that fulfills these requirements." When it's delivered, change the requirements drastically. Then a third time. You know, just like what happens in real life.
Maybe that's too drastic, but they were talking about something like becoming a doctor or lawyer.
If we really are going that far, maybe potential Certified Developers should also be required to serve an internship. But what about Mike?
Then there's education. I still run across rants in blogs about whippersnappers who call themselves software engineers, even though they have no degree. Hey, that was my title at a couple of jobs. Don't blame me.
Some of the best programmers I know (any way you choose to measure it) have no college degree. Relatively few have a CS degree...and that's usually because they learned to write software outside the ivory towers. A degree shows that you can put up with bureaucratic nonsense, cram for tests, and are willing to delay gratification (or just didn't have a clue what you wanted to "be" when you grew up), and that you might have had some exposure to writing code. Those are all useful skills in the industry, but they aren't really an indication that you have the mentality to write reliable software.
Other Considerations
Like I said at the beginning, all this has been focused strictly on reliability. But it's been my experience that reliability goes hand-in-hand with maintainability, and that goes hand-in-hand with the way someone structures their code. Unevenly indented blocks don't necessarily mean the coder's careless, but if he didn't even take that much pride in his work...it's become a big red flag for me.
And what about security? Mike's code did what it was supposed to, but it was incredibly insecure. I showed him (and our boss) how easy it was to hack, but they dismissed it because our end-users weren't smart enough to figure that out. Sigh.
I read another recent slashdot article (which I'm too lazy to find the link for) about our air traffic control systems. Since they're now connected to the Internet (whose brilliant idea was that?), they're getting hacked left and right. If you're worried about a plane's computers locking up in mid-flight, it seems like something that could tell planes to fly into each other would also be on the agenda. I seem to recall talking with one of the developers who was involved in that fiasco, and they used C#...but that's just vaguely recalled hearsay, so don't hold me to it. If what I recall is correct, though, he'd be right up front as one of the people getting certified as a Reliable Software Engineer.
Besides, what about actually understanding performance? I've had jobs where the boss' reaction to any suggestion about optimization was "We'll just add another server." But, for the most part, performance matters. If your website crumples when it starts getting 10 requests per second, it turns out that it really wasn't all that reliable to start with.
Then there's that whole maintainability thing I've been ignoring. A given piece of software might be "Reliable" in its current incarnation, but requirements always change eventually.
As I understand it, the functional requirements for the air traffic control computers didn't actually change. But the software was written in Lisp, for ancient Lisp machines that were wearing out. The requirement change was "It has to run on modern hardware." I guess whoever made the decisions didn't trust any modern Lisp implementations (I'd love to hear from anyone who has the inside scoop on this), so they did a complete re-write in a more industry-standard language.
That sort of thinking was what killed Netscape.
So we're back to some sort of board of certified Engineers who certify Engineer wanna-be's through some sort of rigorous process.
The Problem
There are two major problems with that (at least that I see just now). The first is that software development is still more art than science. Much "Enterprise" software is developed by people with questionable skills following some sort of magical Process that lets the business types plug code monkeys into slots like factory drones on an assembly line. It's the equivalent of McDonald's making its employees do everything by The Manual.
Many of those people may very well be entitled to the title "Software Engineer," but that seems (to me) to be missing the entire point. Come to think of it, real engineers, lawyers, and doctors are also as much artists as scientists.
The second major problem is that these things tend to become cliques designed to exclude people with opposing views and methodologies. Many states (well, Texas, at least) started requiring lawyers to get licenses because they didn't like certain lawyers' political leanings. The AMA makes it really hard to become an MD, at least partially because that keeps competition low and rates high. They may have originally had some justification for filtering out the quacks, but it turns out that MDs really don't know as much as many of us would like to believe; even the practice of bleeding, which so many of us laugh at now, was good medicine in certain cases. (I highly recommend Survival of the Sickest by Dr. Sharon Moalem for an interesting, readable, and amusing take on why medical "science" really isn't all that advanced.)
It seems a lot more cut and dried for real engineers. Either you understand the math and physics behind what holds up a suspension bridge, or you don't. Then again, I know a carpenter who designed his own shop because he didn't want to pay an architect. When he finally pushed his plans through whichever committee it was, they reluctantly approved--but they required him to drive each nail by hand, instead of using a nail gun. Some sort of revenge because he just had the experience and eye to do things the Architect's Union (or whoever) spent years in school learning how to approximate?
Craft vs Art and Science
Anyway. Software "engineering" isn't an old enough discipline to be anywhere close to a "real" engineering discipline. As I understand it, real engineers know that, if they build things in a certain way, those things will behave in specific, predictable ways. Tall buildings are designed to sway in the wind, with massive counterweights to reduce the sway so the building can remain upright. Dams are designed to get stronger as more water builds up behind them. The multigrade oil in your car engine is formulated so it still flows when the engine is cold but doesn't thin out too much as it warms up.
This sort of thing is starting to emerge as more and more of us turn to "The Cloud." But we're a long way from being able to dissect a section of code and analyze its O(whatever) factor mathematically, much less calculate how any given change impacts the actual user requirements (assuming the requirements gatherers were even remotely competent). Automated testing helps with that last part, but it still feels more like piling up extra sticks in front of the dam instead of just designing it right in the first place.
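To be clear about the kind of dissection I mean, here's a made-up toy example (nothing from any real codebase): the same question answered two ways, one O(n^2) and one O(n), which is about as far as most of us ever get toward "engineering" analysis.

```csharp
// Hypothetical example of the kind of complexity analysis I mean.
using System.Collections.Generic;

static class DuplicateCheck
{
    // Nested loops: roughly n * n comparisons, so O(n^2) time.
    public static bool HasDuplicateSlow(IReadOnlyList<int> items)
    {
        for (int i = 0; i < items.Count; i++)
            for (int j = i + 1; j < items.Count; j++)
                if (items[i] == items[j])
                    return true;
        return false;
    }

    // One pass with a hash set: O(n) expected time, at the cost of O(n) extra memory.
    public static bool HasDuplicateFast(IEnumerable<int> items)
    {
        var seen = new HashSet<int>();
        foreach (var item in items)
            if (!seen.Add(item)) // Add returns false when the item was already present
                return true;
        return false;
    }
}
```

That's easy on a ten-line toy; doing it across a real system, and then tying it back to what the end-user actually asked for, is the part we can't do yet.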
Ironically enough, software development was arguably approaching the status of a real engineering discipline back in the stone ages, when people were still crafting hand-optimized assembler routines for certain pieces of code because hardware was slow enough to justify that level of optimization. Even then, the architects/engineers pretty much never factored in major systems failures, like "What happens if someone unplugs the power cord in the middle of this transaction?" or "What if the user pulls the disk out of the drive while I'm writing to it?"
These days, we don't have to worry about that sort of thing. Whichever black box we're using throws an exception, and we let the user know that something went wrong. Or the database server rolls the transaction back and forgets it ever happened, like it was some girl who puked all over your car and passed out while you were taking her back to your place from the bar.
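As a concrete (and completely made-up) sketch of what "letting the black box deal with it" looks like in C#, with placeholder connection string, table, and parameter names: wrap the work in a transaction, and if anything throws partway through, the server rolls the whole thing back.

```csharp
// Hypothetical sketch: lean on the database's transaction rollback instead of
// engineering our own failure handling. All names here are placeholders.
using System;
using System.Data.SqlClient;

static class TransferExample
{
    public static void Transfer(string connectionString, int fromId, int toId, decimal amount)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var tx = conn.BeginTransaction())
            {
                try
                {
                    Debit(conn, tx, fromId, amount);
                    Credit(conn, tx, toId, amount);
                    tx.Commit();    // only now does any of it become permanent
                }
                catch
                {
                    tx.Rollback();  // the server forgets the whole thing ever happened
                    throw;          // ...and we tell the user something went wrong
                }
            }
        }
    }

    private static void Debit(SqlConnection conn, SqlTransaction tx, int id, decimal amount) =>
        Run(conn, tx, "UPDATE Accounts SET Balance = Balance - @amt WHERE Id = @id", id, amount);

    private static void Credit(SqlConnection conn, SqlTransaction tx, int id, decimal amount) =>
        Run(conn, tx, "UPDATE Accounts SET Balance = Balance + @amt WHERE Id = @id", id, amount);

    private static void Run(SqlConnection conn, SqlTransaction tx, string sql, int id, decimal amount)
    {
        using (var cmd = new SqlCommand(sql, conn, tx))
        {
            cmd.Parameters.AddWithValue("@amt", amount);
            cmd.Parameters.AddWithValue("@id", id);
            cmd.ExecuteNonQuery();
        }
    }
}
```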
These days, most of us leave the "engineering" mainly up to Microsoft (or Facebook, Amazon, or wherever). Let them deal with that stuff while we get on with actually doing our jobs. Like quilters piecing together a fabric (they even use that term for Azure), we pick the pieces we want to use and stitch them together. We aren't even close to the engineering level.
Back Full Circle
So what about those people who are working at that level? Should they get some sort of certification?
They probably all have tons of certifications already, along with Master's degrees or PhDs from prestigious universities. What more does anyone want?
On the other hand, software development is sort of the epitome of the scientific method: "If I do this, I think that will happen. Let's see..." Experience develops it into something that resembles engineering--"When I do this, I know that will happen"--even if it's very far from being mathematically rigorous (don't get me started on math). But there's also a very strong touch of art: "Well, ***. This didn't work. What happens if I do that?"
I dunno. Usually theoretical posts like this help me iron out my thoughts, but this one has left me with more questions than answers. I'd love opinions from anyone who actually bothered to read this whole thing.