(This page is part of the Family Guide to Digital Freedom, 2007 edition. Please read the introduction to learn more about the Guide, especially if you mean to comment on this page. Thanks!)

What do a UK Prime Minister, a US warship and a fighter plane have in common? They were all put in danger, or at least into quite embarrassing situations, by poor design, use or understanding of software, or of the policies that should regulate its use.

In 2003, the British Government published online an official dossier on Iraq’s security and intelligence organizations. Most of that dossier had simply been copied from three different articles: after a quick, very basic analysis of the file, a security consultant was able to find out who had worked on the document.
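How can a file betray its authors? Modern word-processor files quietly carry metadata (author names, revision history) alongside the visible text. The sketch below is a hypothetical illustration, not the consultant’s actual method: it builds a tiny stand-in for a modern Word .docx file (which is really a ZIP archive containing XML), with invented names, and shows that anyone who receives the file can read the metadata back out. The 2003 dossier used an older Word format, but the principle is the same.

```python
import io
import zipfile

# A minimal stand-in for a .docx file: a ZIP archive whose
# docProps/core.xml carries author metadata, as real Word files do.
# The names below are invented for the example.
core_xml = (
    '<?xml version="1.0"?>'
    '<cp:coreProperties '
    'xmlns:cp="http://schemas.openxmlformats.org/'
    'package/2006/metadata/core-properties" '
    'xmlns:dc="http://purl.org/dc/elements/1.1/">'
    '<dc:creator>J. Smith</dc:creator>'
    '<cp:lastModifiedBy>A. Jones</cp:lastModifiedBy>'
    '</cp:coreProperties>'
)
buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w") as doc:
    doc.writestr("docProps/core.xml", core_xml)      # the hidden metadata
    doc.writestr("word/document.xml", "<w:document/>")  # the visible text

# Anyone who receives the file can read the metadata back out:
with zipfile.ZipFile(buffer) as doc:
    metadata = doc.read("docProps/core.xml").decode()
print("J. Smith" in metadata)  # prints True: the name travels with the file
```

Publishing the document therefore publishes the names, unless someone deliberately strips the metadata first.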

On June 16, 2006, negotiations between the United States and England over a very advanced military plane, the Joint Strike Fighter, reached an impasse: England, being a sovereign State, obviously wants to be able to maintain its military airplanes without relying on any foreign contractor. One of the necessary conditions for this is unlimited access to the source code of the software that controls all the vital functions of the plane, from flight control to communications.

Consequently, in December 2006 British Ministers were urged to start searching for alternatives to the 140 billion pound project unless the United States “agreed within weeks to share sensitive technology”. Such worries are not just theoretical. In 1998 the USS Yorktown was left “dead in the water” for more than two hours because its computers attempted to divide by zero and crashed. It took two days of pierside maintenance to fix the problem.
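This failure mode is easy to reproduce in any programming language. The hypothetical Python fragment below (the Yorktown ran its own custom software, but the principle is identical) shows how a single zero typed into a data field can halt a program that never checks for it:

```python
def fuel_rate(fuel_used, hours_elapsed):
    # No check on hours_elapsed: a zero entered by an operator
    # stops the whole calculation dead.
    return fuel_used / hours_elapsed

try:
    fuel_rate(500.0, 0)       # one bad input...
except ZeroDivisionError:
    print("system down")      # ...and the computation halts
```

In a well-designed system the bad input would be rejected or the error contained; on the Yorktown, one such error cascaded through the network.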

Some of the problems mentioned in this chapter come from relying on what specialists call “security through obscurity”, an approach whose validity is very limited today. Security through obscurity is when a company designs a weak (security-wise) product for a sensitive application and then keeps the design secret to hide its flaws and limits, while marketing it as unbreakable just because of that secrecy. Conceptually, this is like fitting your house with a cardboard door, placing a big plant in front of it, and then feeling safe because thieves and other crooks “won’t be able to find where the door is, ha-ha!”

The truth is that, no matter how many extremely competent engineers a company employs to develop and maintain the product full time, there will always be a million times as many programmers around the Internet ready to break the code, and quickly. It might happen by sheer numbers alone, like the proverbial million monkeys hammering away at keyboards who eventually produce, by pure chance, a Shakespeare sonnet.

Truly open formats and software could be of great help in all these cases, especially where national security is concerned. The doors of bank vaults are not made of tempered steel because nobody knows what steel is. They are made of such materials precisely because every expert knows their composition and can consequently confirm that it is the best possible one for the job.

For the same reasons, security software developed in the open would have a much better chance of resisting faults and intrusion attempts: were such weaknesses present, any expert could find and denounce them. Software chosen in this way would also have the extra advantage of being legally supportable by many different (local) software companies, giving the Government more negotiating power when choosing a supplier, more opportunities to create local jobs and fewer ways to waste your taxes.

Of course, the effectiveness of such solutions would be limited without correctly formulated laws and procurement contracts, together with, as desperate an effort as it may seem, proper basic ICT training for all government officials and Members of Parliament.