Opinion: What you don't see, you don't care about, but you should


Mark Ford - Technology Blogger

In the following, references to “software” also include “firmware” and the like.

From heart pumps to self-driving cars, the robots have long since arrived. Behind every movable part lies a monster: the infinite possibilities of software. Each application is built upon thousands of lines of computer code, assembled layer by layer into an ever-growing stack of reusable objects. And like a tower of cards or a hairline crack, unchecked problems can quickly lead to complete structural collapse.

Computer hackers understand those building blocks, and they seek out those hairline cracks. It may take longer, but even without the blueprints, the source code, the weaknesses can still be found and exploited. The dangers to human life from accidental or malicious intent are well recorded. And history demonstrates that security through obscurity buys little or no time.

When it comes to software, one of my greatest worries is the almost complete lack of accountability for the software inside medical devices (or indeed any logic-controlled mechanical device). Barnaby Jack, for example, demonstrated the wireless hacking of insulin pumps (search Wikipedia), showing they could be made to deliver fatal doses of insulin.

We have standards and regulations for the things in front of us, the physical and the real. We have health and safety rules for cars, and strict controls over food and drugs. We are entitled to know the mechanics of a car and the ingredients in our food and drink. To ensure justice is fair, we have open courts and transparency in law. But when it comes to the computer code in everyday consumer goods, there is almost no regulation. The code silently serves us, playing a critical role in our lives, yet it is largely unseen and unaccountable. And the stack of cards grows wider and taller.

As consumers we put all our trust in the hands of just a few system designers and computer programmers. These experts may be certified and have many years of experience, but they are essentially self-regulated.

Within any discipline, peer review plays a vital role. With proprietary software, peer review may be limited, as competition in the marketplace often encourages secrecy and discourages external audits. Software developers understand how complex, and equally how fragile, computer code can be. They also have a great fear of liability, which is why we always see them wash their hands of it in the software licence.

When software is open, its strengths and weaknesses are visible to everyone. That greater scrutiny brings a hard core of like-minded people together. But much more importantly, open source software crosses borders, industries and political agendas.

Open source is not a magic bullet for safety and privacy, but in today's increasingly complex world it gives us something we desperately need: transparency and trust.

When it comes to medical devices, or indeed any logic-controlled mechanical device, I believe all the software should be open. I am not suggesting open source is a complete solution, but, on balance, I feel we have a right to know the inner workings, and that knowing is better than trying to deal with the unknown. Today it's self-driving cars; tomorrow it's lasers on your retinas for personal identification and full-on virtual reality... don't let the science blind you!

Proposed rules for medical devices and devices with logic-controlled mechanical movement:

  1. If an author chooses to stay proprietary then, in the event of an error, they must accept much greater liability. For proprietary software, an “as is” software licence should not be broadly recognised; significant errors must be treated as “faulty goods”.
  2. Conversely, open source software should be more broadly allowed to be “as is”, provided the author(s) made reasonable modifications when notified of significant errors. By opening up the software, the author(s) are showing good intent.
  3. The author(s) may sign the software, but not lock it against end-user modification; where the software is user-modified, the original author is no longer liable.
  4. A significant error would be any output or movement that is not reasonably expected in relation to the task at hand.
  5. All versions of software must be archived and their signatures published. In the event of a legal case, the prosecutor should be able to reconstruct the software and verify its signature. Failure to enable this reconstruction would lead to heavy fines.
  6. When someone discovers a flaw, they would ideally (but not necessarily) report the issue(s) to the developers first, especially where the error is significant. The author has a reasonable right to correct errors within a reasonable amount of time. Criminal exploits, especially where the error is significant, should attract greater legal consequences.
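Rule 5 above is cheap to implement in practice. As an illustration only (the rule does not prescribe any particular algorithm, and a real scheme would use a proper cryptographic signature rather than a bare hash), here is a minimal Python sketch in which a published SHA-256 digest stands in for the “signature”, letting anyone who rebuilds a release check it against the archive:

```python
import hashlib

def release_fingerprint(data: bytes) -> str:
    """Return the hex SHA-256 digest of a software release artifact."""
    return hashlib.sha256(data).hexdigest()

def matches_published(data: bytes, published_digest: str) -> bool:
    """Check a rebuilt release against its archived, published digest."""
    return release_fingerprint(data) == published_digest.lower()

# The vendor publishes the digest at release time...
firmware_v1 = b"example device firmware v1.0"   # hypothetical artifact
published = release_fingerprint(firmware_v1)

# ...and a court-appointed expert later rebuilds the software and verifies it.
assert matches_published(firmware_v1, published)
assert not matches_published(b"tampered build", published)
```

If the archived sources cannot be rebuilt into bytes matching the published digest, that itself is the evidence of non-compliance the rule relies on.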

The following compromises are negotiable:

  1. A “sunrise” period, where software can be kept closed for an initial period of, say, 12 months. This would give the author(s) time to audit, correct and maintain. The sunrise period would apply only to the initial release, not to subsequent versions of the software.
  2. Reduced liability with the passage of time where no significant errors are found (after, say, five years?).

For further information, please see the following video... it's a little long but largely relevant:

Feedback welcome in the comments.