I got into a small debate about software development with someone recently, in the comments section of a previous blog post.
In the course of the debate I thought of an analogy to support part of my argument, but I think it has broader applicability, which prompted this post.
I have been talking to a lot of people lately about “Software Engineering” and debating with people that I know, and some that I don’t, about what it takes to establish a profession, and an engineering discipline.
I perceive a reasonably broad consensus, amongst people that we may consider thought-leaders in our industry, some of whom I am happy to call friends, about what “good” software development looks like. I also perceive a level of dismay in that group about much common practice.
So what are these disciplines and where is the consensus?
I perceive a broad agreement that waterfall-style thinking, although still very common in practice, is a busted idea. The data is in: it just doesn’t produce great software!
Software development is a learning process, from beginning to end. So we must work to establish effective, high-quality, fast feedback loops in order to maximise our opportunities to learn. That means working iteratively, as well as lots of other things.
We are not good at predicting the future and so we must be experimental, we must be sceptical of our ideas and find ways to evaluate them quickly and effectively. We need to be more data-driven, measuring rather than guessing.
Automated testing provides a substrate that helps us to achieve many of these goals. Taking a test-driven approach to development enhances the degree to which we can carry out these fast, cheap experiments in the design, and implementation, of our code.
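To make the idea concrete, here is a minimal, hypothetical sketch of that test-first rhythm: the test is written first, as a small executable experiment describing the behaviour we want, and the production code is then written to make it pass. The names (`total_price`, `test_total_price`) are invented purely for illustration; this is a sketch of the discipline, not a prescription for any particular tool.

```python
# A tiny, hypothetical example of the test-first rhythm:
#   1. Write a failing test describing the behaviour we want.
#   2. Write the simplest code that makes it pass.
#   3. Refactor, with the test as a safety net.

def total_price(quantities_and_prices):
    """Sum quantity * unit price over a list of (quantity, price) pairs."""
    return sum(qty * price for qty, price in quantities_and_prices)

def test_total_price():
    # These assertions were (in the story of this sketch) written first;
    # total_price was then written to satisfy them.
    assert total_price([]) == 0
    assert total_price([(2, 3.0), (1, 4.5)]) == 10.5

test_total_price()
print("all tests passed")
```

Each test run is a fast, cheap experiment: a falsifiable statement about the code that we can evaluate in milliseconds, rather than a guess we defend in a meeting.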
If I am to be intellectually honest in my convictions, then all that I have just said about the development of code is also true about the creation and evolution of our approach to development. We should be data-driven, empirical, experimental in our approach to improving development process.
On the “data-driven” front we are making some progress. The excellent work done by my friends at DORA has raised the bar on measurement of process and practice in our industry. Their new book Accelerate explains the science behind their measurements. As a result, for the first time, we have data that says things like “Your company makes more money if you do x”, where ‘x’ is some of the practices above.
The DORA folk have a model that predicts success (or failure) of your development approach. All of this is based on a peer-reviewed approach to data collection and analysis.
We can interpret these perceptions in several ways. Perhaps I am wrong and merely echoing the contents of my own filter-bubble (probably to some extent!). Most of the “thought leaders” that I am thinking of are old hands, a polite euphemism meaning that my social group is getting on a bit. Maybe these are the rants of old men and women (though most are men, which is another problem for our industry, sadly).
A more positive interpretation, and one that I am going to assume for the rest of this post, is that this represents something more. Perhaps we are beginning to perceive the need to grow-up, a little, as an industry?
My own, primary, interest in this is around the engineering disciplines that I think that we should try to establish as a norm for software developers who consider themselves professionals. I would like us to have a more precise definition of what “Software Engineering” means. It would need to rule some things out, as well as define some things that we should always do.
Others are interested more in the “Profession” side of things. I have recently seen a rise in people discussing ideas like “ethics” in software development. Bob Martin has a couple of interesting talks on this, and closely related, topics. He makes good points about the explosive growth of our industry and the consequent dilution of expertise. He estimates that the average level of experience, amongst software developers, is just 5 years. As a result we, as an industry, are very bad at learning from the mistakes of the past.
I have been careful in my choice of words here. Currently we are not a “Profession”; we are a “Trade”. The difference between the two is that a “profession” demands qualifications as a barrier to entry, and has rules to expel people who don’t conform to its agreed, established norms. By these defining characteristics we don’t qualify as a profession.
You can’t practice law or medicine without the appropriate qualifications. In our industry, if you can pass the interview, you can take part. If I can convince an interviewer that I am competent, over a small number of hours during the course of an interview, I could go and write software that controls an aeroplane, a medical scanner or a nuclear power plant. An individual company may have rules that demand a specific degree, or other qualification, but our “trade” does not.
If you are a surgeon and you decide that washing your hands between operations is a waste of your valuable time, then once people notice the increased death-rate at your hands, you will be “struck off” and never allowed to practice surgery again, anywhere.
There can be no profession without professional discipline.
In 1847 Ignaz Semmelweis made an important discovery:
“The introduction of anaesthetics encouraged more surgery, which inadvertently caused more, dangerous, post-operative infections in patients. The concept of infection was unknown until relatively modern times. The first progress in combating infection was made in 1847 by the Hungarian doctor Ignaz Semmelweis who noticed that medical students fresh from the dissecting room were causing excess maternal death compared to midwives. Semmelweis, despite ridicule and opposition, introduced compulsory hand-washing for everyone entering the maternal wards and was rewarded with a plunge in maternal and foetal deaths, however the Royal Society dismissed his advice.” (Wikipedia https://en.wikipedia.org/wiki/History_of_surgery)
This resonates with me. I advocate for some specific practices around software development. These practices work together, in sometimes subtle ways. I believe that the combination of these practices provides a framework, a structure, a disciplined approach to software development that has the hallmarks of a genuine “engineering discipline”.
I believe that, like “washing your hands” as a surgeon, some of these disciplines are so important that they should become norms for our industry. I don’t doubt that you can write software without fast feedback, without automated tests, without an experimental approach, without collaborative teams, with big up-front designs and with a 12-month plan. A positive outcome, though, is much less certain. Just because some surgeons had patients who survived, despite their lack of hygiene, doesn’t mean that hygiene isn’t a better approach.
These days, nobody can consider themselves a surgeon if they ignore the disciplines of their profession. I believe that one day, one way or another, we will, of necessity, adopt a similar approach.
If we are to establish ourselves as a profession, rather than as a trade, we will need to do something like this. Software is important in the world. It is the revolutionary force behind our civilisation at the moment. I foresee three futures for our industry.
1. We do nothing. At some point, something REALLY bad happens. Some software kills LOTS of people, or maybe destabilises our political, economic or social institutions. Regulators will regulate and effectively close us down, because they will get it wrong. (It has taken us decades to understand what works and what doesn’t, and we are supposed to be the experts!)
2. We start trying to define what it means to be a “Software Professional” in the true sense of the words. Something bad happens, but the regulators work with us to beef-up our profession, because they can see that we have been trying to apply some “duty of care”.
3. The AI Singularity happens and our Silicon overlords take the task of writing software out of our hands.
Ignoring 3 for now…
Scenarios 1 and 2 are both problematic.
I fear that we will continue with 1. The short-term economic imperative will continue to drive us, for a while, until the population at large realise just how important software has become. At which point there will be repercussions as they react to the lack of a sufficient duty-of-care in many instances. The VW emissions scandal is an early warning of this kind of societal reaction, I think.
Scenario 2 is problematic for different reasons. I think that it is the more sensible strategy, but it demands that we change our industry and allow it to progress from trade to profession. Daunting! At which point, if we succeeded, I would be expelled for not having any relevant qualifications. This is a big challenge, and not just for me personally ;-). Our industry is still growing explosively, and educational establishments are not really delivering people with the skills to be “professional” in the sense that I mean. Many universities (maybe even most) still teach waterfall development practices, for goodness’ sake!
My own experience of hiring and training young people into our industry suggests that there is relatively little advantage in hiring Computer Science graduates over most other graduates. We pretty much had to start from scratch with their brain-washing, errrr “on-the-job training”, in both cases. It is easy, even common, to graduate from a CS course and not be able to program, let alone program well. Physics, and other hard-science, graduates have a better understanding of experimental discipline and scientific rigour. The main problem with physicists (and most CS graduates) is getting them to realise that “yes, programming is actually quite difficult to do well” and the techniques that work for a few lines of private code don’t scale well.
There is still much debate to be had. Despite the fairly broad consensus that I perceive on what it means to apply “engineering thinking” in software, I still regularly get people arguing against the practices that I recommend. If I am honest, most of these arguments are ones that I have heard many times. Often these arguments are based on dogma rather than measurement or evidence. If we are to be more scientific, and apply more engineering discipline to our work, we cannot base our decisions on mere anecdote. That is not how science and engineering work!
I am not arrogant enough to assume that I have all of the answers. However, I confess that I am hubristic enough to believe that the people expressing “ridicule and opposition” on the basis of dogma or mere anecdote don’t have a strong case. Mentally I dismiss those arguments as analogous to the surgeons who wouldn’t “wash their hands”.
If you want to change my mind, change it with data, change it with evidence.
I think that we are in the same state as surgeons in the 1850s. Today, there is no reputable surgeon in the world who does not wash their hands before surgery. This discipline wasn’t always obvious, though. I believe that we have identified a number of practices that are, for software development, the equivalent of “washing your hands” for surgeons. I spend a lot of my time describing these, despite (occasional) “ridicule and opposition” 😉
In both cases, existing practitioners, who don’t “wash their hands”, claim that this is unnecessary and a waste of time. I think that the data, and, I hope one day, history, is on my side.