Software “killing you softly”?

What is the problem with software? The Boeing 737 MAX disasters unfortunately show once again how important good software is, particularly in the control systems of devices and transport systems. For software in these kinds of critical systems, two questions are crucial:

1/ Is it clear in all situations who is in charge of controlling the system?

2/ Is the end user or supervisor sufficiently involved in the design and programming of the system?

Who controls the aircraft? Regarding the first question: as far as can be seen from the reporting, the pilots realized that something was wrong but had insufficient means to intervene. The question is of course whether intervention should be possible at all. If the design reasoning is that the automation will always do better than the pilot, in every case including this one, then that can be consciously built into the system. But the crucial point is this: it must at least be clear to the pilot himself whether, in urgent cases, he should and can take over the controls, in which case he must also have the right means to do so, OR whether his role is only supporting, so that he cannot control or influence the system. Most people will find the latter a creepy thought, but the fact is that most planes already fly almost 100% on autopilot. That is not the point either. What matters is whether the pilot has the option to intervene in an emergency. This point is so crucial that one could even imagine airlines making clear to passengers in advance whether, in a specific aircraft, the final responsibility lies with the software or with the pilot.

Do not allow hybrid situations. After all, it is an essential point of confidence in a flight whether, in an emergency, you would rather have a pilot of flesh and blood decide, or the software. As far as can be seen from the reporting, it was not clear in the Boeing 737 MAX who was ultimately "at the controls": there seems to have been a hybrid situation in which the pilot could intervene "a little", but not completely or sufficiently to fly the aircraft himself. Such a hybrid situation should never occur. In all cases it must be clear who ultimately controls the aircraft: the pilot or the software. There should never be any uncertainty about this, and certainly not for the pilot himself. Exactly the same applies to self-driving cars, trains and other autonomous systems.
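The "no hybrid situations" principle can be made concrete in code. The sketch below is purely illustrative (all names are hypothetical, not from any real avionics system): control authority is always exactly one party, a takeover is a single explicit act, and inputs from the party not in control are rejected outright rather than partially blended in.

```python
from enum import Enum
from typing import Optional


class Authority(Enum):
    PILOT = "pilot"
    AUTOMATION = "automation"


class FlightControl:
    """Illustrative sketch: authority is always exactly one party, never a mix."""

    def __init__(self) -> None:
        self.authority = Authority.AUTOMATION
        self.log: list[str] = []

    def pilot_takeover(self) -> None:
        # One unconditional action hands full control to the pilot.
        # There is no state in which both parties steer "a little".
        self.authority = Authority.PILOT
        self.log.append("pilot assumed full control")

    def command(self, source: Authority, input_deg: float) -> Optional[float]:
        # Inputs from whoever is NOT in control are ignored entirely,
        # not blended -- so it is always unambiguous who is steering.
        if source is not self.authority:
            return None
        return input_deg
```

Before a takeover, a pilot input is rejected; after `pilot_takeover()`, automation inputs are rejected instead. The design choice being illustrated is that the system state itself answers the question "who is at the controls?" at every moment.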

Software construction is partly a creative process. Regarding the second question: in many IT organizations, software builders themselves decide what the system does and does not do. Of course there is a lot of collegial consultation, much is learned from past mistakes, and in all cases they try to build a solid system. But if the question arises where the ultimate responsibility lies for the system performing action X in case A and action Y in case B, in many cases that is not clear. In fact, software construction is in part a "creative process" of individual programmers, and despite extensive testing it is not always easy to predict what the software will do in extreme cases or after prolonged use. With regard to accountability, reference is usually made to architecture groups that created the design or the design principles. But solid registrations of designs do not normally exist, and smaller or later updates in particular are rarely well documented. None of this is acceptable for end users. Much more transparency is needed in software construction about why systems use which algorithms. Much more input is needed on functional requirements from end users or regulators. Much more attention must be paid to whether the software has been developed correctly and whether everything possible has been done to prevent unexpected errors, as is usual in engineering, in medicine and in so many other classical professions.
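What a "solid registration of designs" could look like can be sketched in a few lines. This is a hypothetical minimal structure, not an existing standard or tool: every rule the system follows gets a traceable record with a named responsible owner, and later updates reference what they replace instead of silently overwriting it.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class DesignDecision:
    """One traceable record: which rule the system follows, and who decided it."""

    identifier: str
    rule: str             # e.g. "in case A, perform action X"
    rationale: str        # why this rule exists
    responsible: str      # a named owner, not an anonymous group
    decided_on: date
    supersedes: str = ""  # links a later update back to the decision it replaces


registry: dict[str, DesignDecision] = {}


def register(decision: DesignDecision) -> None:
    # History is append-only: a new decision never overwrites an old one,
    # so every update remains documented and accountable.
    if decision.identifier in registry:
        raise ValueError(f"{decision.identifier} already registered")
    registry[decision.identifier] = decision
```

The point of the sketch is the discipline, not the code: if such records existed and were required, the question "who decided that the system performs action X in case A?" would always have an answer.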

Design principles must be transparent and public. Design principles, whether for planes, cars or nuclear power plants, should be crystal clear for important systems that carry a social risk. End users or supervisory authorities must have control over crucial design principles, so that, for example in an aircraft, it is always clear who bears responsibility and makes decisions in which cases. But must we bother the citizen and the government with all these things? Hell yes. A good example is the Steam Service in the Netherlands (now privatized at Lloyd's). That body establishes requirements that any system "under pressure" must meet, in order to protect society, and it also investigates existing installations to ensure compliance with the rules. Comparable requirements and associated supervision exist for a number of socially important areas such as medical interventions, environmental effects, dike designs, bridge building, etc.

Independent supervision of software construction is required. But these types of control mechanisms do not exist for software systems in general. There are no formal regulatory bodies that ensure software does what it is supposed to do and is built as it should be: not in civil aviation, nor for cars, nuclear power plants, factories, etc. And that while the entire society now runs on software. Of course all companies have IT audit committees. But anyone who knows how little impact these committees have in practice, and how little knowledge they have about IT, knows that there is no serious supervision that the software will function as it is supposed to.
One thing is crystal clear: software in vital systems such as airplanes, cars or power plants is too important to be left to the, usually commercial, builders or owners of the system. Just as guidelines, Environmental Impact Reports, social principles, standards and controls are required in other areas, such principles and controls must also be established for software in vital systems. At the moment, in most cases, nothing has been arranged at all. This situation should not be allowed to continue.
