18 Dec

Value through Reality

We’ve been chasing a mirage.

For over fifty years, humanity has been building one of its greatest achievements in modern computing, and yet we’re surrounded by technology we don’t fully understand.  We store our memories in it, move our money with it, and even trust it to keep us safe while we’re flying thousands of feet above the ground.  But based on what we’ve seen so far, has it earned this level of confidence?

We’re constantly told that computers keep getting faster, yet my parents still need to upgrade every five years just to read the same e-mail and surf the Internet.  Despite regular reports of major organizations around the world being crippled by network breaches and identity theft, everyone seems determined to put more and more of our personal information online.  Even the simplest tasks we perform at home or work can be blocked entirely by flaws we all politely refer to as bugs.

Would we accept these limitations anywhere else?  Does anyone regularly replace their refrigerator to get a new display on the ice dispenser or to upgrade the finish on the doors?  If one out of every thirty-five banks in the United States were robbed on a regular basis, would anyone choose to keep anything of value there?  And yet, with computing, all the conventional pillars of common sense are ignored and we keep accepting something less than continual improvement from our technology.

Consider the even more subtle effects.  Of the fifty people you know on Facebook, can you say who your best friend really is anymore?  What does it mean to be liked today, when the cost of offering it is negligible and it can be taken away with a single keypress?  How much of your online activity is being quietly recorded, only to be used against you during your next job interview or credit application?  All of this uncertainty can be summed up in a single question.

What exactly is truly real today?   

There’s a disturbing irony in the fact that computing requires an incredible command of science and the natural world in order to function, and yet it increasingly produces unreal and illusory effects – almost like black magic.  Think about how ancient people regarded magic:

  • It was something unpredictable and couldn’t be fully understood.  
  • It could be a blessing or a curse, improving your life or destroying it.
  • It could only be used by a select few, with no rhyme or reason to what qualified them to wield its great power.

Does this sound familiar?   

Now we’re hearing that every single device and machine in the world will be connected through the Internet and controlled with Artificial Intelligence (AI).  I don’t know about you, but from where I’m standing that could be filed under the category of man’s worst idea ever.  I don’t even want to think about what happens when some hacker in Ukraine finds it amusing to turn my furnace off while I’m away for Christmas, only to flood my house when the pipes burst.  What will the smart machines do to prevent that disaster, I wonder?  And that’s a minor case, turning a single switch on or off.  Think about the potential for failure when bits of the same software are used in self-driving cars, power plants or nuclear silos.

The problem, as I see it, is a philosophical one. 

 

The Flaw with Virtual 

Words matter.  They are tools for communicating ideas, and the choice of which ones to use is critical for understanding.  The words house and horse have entirely different meanings and yet they differ by only a single letter.  One letter of one word can change the meaning of a whole page of text.  When we use words as metaphors for complex ideas, their importance only increases.

Consider the term software.  We all use it, yet what does it really mean?  The intent is to describe instructions that operate a computer and can be easily changed to produce different behavior.  This concept allows us to significantly improve our lives over what we experienced with fixed-function devices years ago.  But the word software can also imply something pliable or weak, lacking predictability.  While it is great to be able to download a different version of a mobile game, it is far less desirable to lose certainty about the code moving money between checking accounts.

With words, you also have to consider your audience.  Almost like the way a virus spreads, words move from person to person, each one forming opinions about their meaning.  So while a consumer may have opinions on the term software regarding how it functions, the developers writing the software form opinions too, from a perspective that benefits their own position.  Programmers have long found software enticing because of how easily it can be adapted to simplify their own lives.  They build abstractions like garbage collection, multitasking or interpreted programming languages with the best of intentions for improving efficiency, but those very things make it softer and more unpredictable as a result.
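
To make that last point concrete, here is a small sketch in Python (my own illustration, not a rigorous benchmark) that times a burst of small allocations.  On most runs the single slowest allocation is dramatically slower than the typical one, because the garbage collector chooses its own moment to pause the program:

    import gc
    import time

    def slowest_allocation(n=200_000):
        # Allocate many small container objects and record the single
        # slowest allocation observed.  Spikes usually come from the
        # cyclic garbage collector interrupting the loop on its own schedule.
        worst = 0.0
        keep = []
        for _ in range(n):
            start = time.perf_counter()
            keep.append({"payload": list(range(20))})
            worst = max(worst, time.perf_counter() - start)
        return worst

    print("collector on :", slowest_allocation())
    gc.disable()   # with the collector off, the spikes tend to flatten out
    print("collector off:", slowest_allocation())
    gc.enable()

None of this makes garbage collection bad; it simply shows how an abstraction built for the programmer’s convenience quietly removes the user’s ability to predict what the machine will do next.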

The problem is that many of the words in computing are not grounded in any single reality that we can all experience.  Engineers can build a skyscraper with nearly perfect safety and reliability because its behavior is governed by the physical laws of nature.  Every single person on earth can objectively experience the force of gravity or the tensile strength of steel.  Whether we want to believe in a different reality or not, there is no disputing these facts.  On the other hand, when you look at computing, how often are quantitative questions answered with ‘it depends’?  How much memory does my e-mail application require?  When will this database reorganization complete?  Where is my personal information stored?  Can my computer run this game?
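
Even the first of those questions has no single honest answer.  The following Python sketch (illustrative only) asks ‘how much memory does this list use?’ in three different ways and gets three different numbers on the same machine:

    import sys
    import tracemalloc

    tracemalloc.start()
    data = [str(i) for i in range(100_000)]

    # Three reasonable answers to the same question.
    shallow = sys.getsizeof(data)                         # the list object alone
    deep = shallow + sum(sys.getsizeof(s) for s in data)  # the list plus its strings
    traced, _peak = tracemalloc.get_traced_memory()       # what the allocator recorded

    print(f"shallow size   : {shallow:,} bytes")
    print(f"deep size      : {deep:,} bytes")
    print(f"allocator view : {traced:,} bytes")

If a one-line list can’t give a straight answer, an entire e-mail application certainly won’t.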

If you go back far enough, software originally had a very close relationship with the real world and the physics of electricity.  The first computers were programmed by moving physical switches or plugging wires into different ports.  There were few abstractions, and anyone using these machines had to appreciate how their physical movements influenced the program.  As time went on, computers remained reasonably predictable.  Assembly language programs on home computers from even thirty years ago had to be carefully designed and used the entire machine during their execution.  We knew exactly how big they were and could even estimate how fast they would run.  There was no disputing individual clock cycles or absolute byte counts.  As best as I can identify it, the change happened sometime in the 1980s, when we started talking more about all things virtual.

As we became enamored with concepts of virtual reality, the very natural desire to experience strange new worlds began to pollute our understanding of how computers should be programmed.  The same people who eventually read and enjoyed Snow Crash wanted to write programs or hack computers from inside a simulated reality.  While this can all seem like a productive form of experimentation, the problem is that without a firm position on where fundamental concepts should lead, little ideas take root and slowly form larger philosophies over time.  Since we’ve never really taken the time to choose a clear philosophy of computing, we got the mongrel one that comes from every conceivable opinion summed together.

 

Embrace Reality

There is no reason computing can’t share the same benefits enjoyed by traditional engineering disciplines, but its creators need to recalibrate their expectations.  The responsibility for making this change lies squarely with chip designers, electrical engineers and software developers.  We need to begin with basic principles and then build our work into a true science of software and its associated engineering.

As I mentioned above, architectural engineering is only possible because the fundamental science never changes.  New ideas can expand on old ones, but can never truly replace prior laws.  Software development requires this degree of constancy if we are ever to build systems for truly transformative purposes.  In other words, I’m suggesting that computing requires universal, standardized units of processing, storage, interconnectivity and time.  In the same way that a helium atom will always have two protons, or light in a vacuum will always travel at 299,792,458 meters per second, reliable computing requires units and behavior that are consistent across all platforms.  While it is true that all platforms are not created equal, it is essential that we begin considering what a fundamental, common science of computing would look like and how it could accommodate the decentralized nature of its ongoing development.  To that end, my suggestion is that we begin by considering a very simple origin principle:

The measure of value in any computing environment is directly proportional to the degree to which it reflects and influences physical reality.

Very intentionally, a lot in this sentence has been left unsaid.  It offers no description of how we could achieve this measurement, or even whether this value is something that could be easily traded for money.  But if you acknowledge the problems I’ve described above, then it is paramount that we remain mindful of the path we’re currently on.  If we don’t always build toward a real computing experience, we risk only greater catastrophes in the future as our lives become even more digitized.