Software is a difficult concept for humans to understand because it isn’t physical. We are physical beings whose intuition and judgement are built entirely from the input of our senses. Inherently, we struggle with things we cannot experience physically.
Engineers have known for a very long time that mechanical devices eventually wear out. Entropy is a very physical thing, and humans intuitively understood that nothing lasts forever long before the second law of thermodynamics was formulated. But when software was introduced, it took engineers and their managers some time to learn that software wears out too. This shocked them, because software cannot be physically damaged and can be copied infinitely. Only after painful experience did the industry learn that software cannot be maintained and modified forever.
Similarly, humans understand that physical systems have limits: most people grasp that an ever-growing building would eventually collapse under its own weight. Yet we have greater trouble, I think, recognizing that our increasingly automated society could eventually suffer the same fate.
How many pieces of software does your life depend on today? Did a piece of software wake you up this morning? Is it counting your steps and checking your health? What about managing your car’s internal operations? Was the energy delivered to your home and office controlled by software? If you took a child to the hospital, how many different pieces of software written by how many different engineers were necessary for her treatment?
Automation is being planned for our cars, our farming, and a number of other industries. The world’s mightiest militaries can no longer function without digital drones and instantaneous communication between the President and his forces on the ground.
Consider, then, that this past week my Windows 10 laptop started behaving strangely. It became sluggish and unresponsive, and I had to restart the machine. Within twenty-four hours, it was having the same problems.
It took me many hours of research and a few special-purpose, engineer-level utilities to determine that I had a “memory leak” in my network device driver. Checking further, I found that an update was available; once I installed it, the problem disappeared.
Out of the seven billion people on this planet, how many have the technical training to do what I did? Out of the 6.4 billion things that are expected to be connected to the Internet this year (30% more than last year), how many owners and operators would be able to solve a problem like this?
Certainly the challenges of human specialization are not new. Most people living in advanced, industrialized nations cannot fix their own cars, or even repair their own houses. Almost none of us grow our own food.
But automation introduces an entirely new problem. The machines are not only difficult to keep working, they can actually be turned against us. Every single piece of software running anywhere in the world is a potential slave to a bad guy’s control. Hackers use “bugs” in software to turn the software against its benign purpose and into a weapon. As the complexity and quantity of software increase, so does the number of potential opportunities for malicious attack.
What kind of solutions do we turn to? Ironically, we largely turn to increased automation. We need better firewalls, better anti-virus tools, better intrusion detection, and so on. For the level of complexity in our digital world, there is simply no other option.
How far can automation go? How many pieces of software can our lives depend on? At what point does the system collapse under its own weight?
I don’t know. There certainly are some brilliant researchers investigating designs that are mathematically proven to be bug-free within a reasonable range of assumptions. Perhaps we can find new structures that can support a much heavier automation load.
But we cannot fool ourselves into believing that automation has no weight and that it has room for unlimited growth.