Moore's Law And The Software Industry
April 19, 2015 marks the 50th anniversary of the publication of Moore's Law, which has enormously benefited not only hardware but also the software industry. On April 19, 1965, Dr. Gordon Moore introduced a fundamental way to view growth in technology, later labeled "Moore's Law" and generally stated as the doubling of the number of device components on a silicon chip every two years. Most conversations on the subject center on the rapid changes in silicon manufacturing technologies and the associated economies of scale, which have simultaneously increased the performance and decreased the cost of modern computers (smartphones, tablets, laptops, desktops, servers and high-performance computers).
But the benefits go well beyond getting a better laptop at a lower price. The software industry has benefited enormously from this progression of cheaper, smaller and more accessible hardware.
In the thirty-year period from 1980 to 2010, the relative performance of the microprocessors used in personal computers increased roughly 10,000-fold. Over the same timeframe, the cost of a basic personal computer dropped from approximately $4,300 (in 2015 dollars) to about $200 today.
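The doubling rule can be turned into simple arithmetic as a sanity check: doubling every two years over thirty years predicts 2^15 ≈ 32,768x growth, so the ~10,000x performance figure implies a slightly slower effective doubling period. A minimal sketch (the helper names are my own; only the 30-year and 10,000x figures come from the text above):

```python
import math

def growth(years, doubling_period=2.0):
    """Predicted growth factor under a fixed doubling period (in years)."""
    return 2 ** (years / doubling_period)

def implied_doubling_period(years, growth_factor):
    """Doubling period implied by an observed growth factor."""
    return years / math.log2(growth_factor)

# Moore's Law at face value: double every 2 years for 30 years.
print(round(growth(30)))  # 32768

# The ~10,000x performance gain from 1980 to 2010 implies a doubling
# period of roughly 2.26 years.
print(round(implied_doubling_period(30, 10_000), 2))  # 2.26
```

In other words, performance tracked the component-count prediction remarkably closely over three decades.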
In essence, individual processor instructions that were once measured in multiples of microseconds are now measured in fractions of nanoseconds, and are significantly more affordable.
Of course, the improvements to your basic software workstation also include more and faster RAM (from 16KB in 1981 to a typical 4GB today), local storage (from 160KB to 500GB today) and display systems (320 x 200 x 16 colors to HD+ resolutions supporting millions of colors); as well as the added advantage of an always-on, high-speed, network connection to the Internet.
During the past fifty years the software industry and its related technologies have changed dramatically. The variety and depth of software development tools, languages, libraries, operating systems and execution platforms today reaches well beyond the future that many software developers imagined fifty years ago.
The software "tools of the trade" have continuously evolved to take full advantage of the ever-increasing performance and capacity of each generation of software workstation. Something as simple as your code editor has evolved alongside those hardware improvements, from the 1970s through today:
simple line-mode text editor (e.g., ed)
multi-line character-mode text editor (e.g., vi)
extensible character-mode editor (e.g., Gnu Emacs)
language-aware GUI code editor (e.g., TextMate and Sublime Text)
full IDE (e.g., Visual Studio, Xcode, Eclipse et al)
There's overlap in the list above, and many other good examples, but the pattern is clear: an austerity of resources demands basic solutions (ed and vi), while an increasing abundance of performance and capacity yields new productivity-enhancing features such as syntax highlighting, extensible scripting languages and fully integrated development environments (IDEs), which go well beyond a code editor by including source-level debuggers, intelligent API hinters (aka IntelliSense), device simulators, etc.
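To see why such features needed spare cycles, consider that even syntax highlighting is, at its core, tokenization plus markup applied on every keystroke. A minimal regex-based sketch (the grammar and function names are invented for illustration; real editors use full lexers):

```python
import re

# A deliberately tiny token grammar; real highlighters recognize far more.
TOKEN_RULES = [
    ("comment", r"#.*"),
    ("string",  r"\"[^\"]*\""),
    ("keyword", r"\b(?:def|return|if|else|for|while)\b"),
    ("number",  r"\b\d+\b"),
]

def highlight(line):
    """Wrap recognized tokens in <span class=...> tags, left to right."""
    pattern = "|".join(f"(?P<{name}>{rx})" for name, rx in TOKEN_RULES)

    def wrap(match):
        kind = match.lastgroup
        return f'<span class="{kind}">{match.group()}</span>'

    return re.sub(pattern, wrap, line)

print(highlight('def f(x): return x + 1  # add one'))
```

Re-running a pass like this over an entire buffer after every edit was a non-starter on a 1970s terminal, and an afterthought on modern hardware.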
According to Paul Fischer, a technical consulting engineer for mobile and embedded development tools at Intel, one of the more interesting software trends enabled by the increased hardware capacity and performance attributed to Moore's Law is the ability to build applications using languages that are increasingly abstract relative to the underlying hardware: languages that have moved ever further from the constraints of the processor's instruction set.
When hardware resources are scarce, the languages of choice are thin: it is relatively easy to relate the language constructs to the underlying machine operations, like a thin veneer over the processor. The most obvious illustration of this idea is assembly language, which has a nearly one-to-one correspondence with the machine code that runs directly on the processor. In practice this also means languages that are compiled directly into native machine code (such as Fortran and C/C++).
Increased hardware performance and capacity enabled the development of bytecode or p-code engines that interpret instructions compiled for a virtual processor. This virtual processor can be tailored to the needs of the language, rather than the other way around. Bytecode engines translate virtual processor instructions into real processor instructions at application runtime. Apply a few more doses of Moore’s Law and you can add a just-in-time (JIT) compiler to the virtual processor’s runtime engine, to further optimize the execution process. Pascal, Java, C# and Ruby are examples of languages that utilize this mechanism, and flourished due to the hardware advances accorded to Moore’s Law.
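To make the virtual-processor idea concrete, here is a minimal sketch of a stack-based bytecode engine. The instruction set is invented for illustration; real engines such as the JVM or CPython's interpreter are vastly richer, but the dispatch loop below captures the essential mechanism:

```python
# Opcodes for a made-up virtual processor (illustrative only).
PUSH, ADD, MUL, PRINT = range(4)

def run(bytecode):
    """Dispatch loop: each virtual instruction expands to a few real ones."""
    stack = []
    pc = 0  # program counter for the virtual processor
    while pc < len(bytecode):
        op = bytecode[pc]
        pc += 1
        if op == PUSH:
            stack.append(bytecode[pc])  # operand follows the opcode
            pc += 1
        elif op == ADD:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == MUL:
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == PRINT:
            print(stack.pop())
    return stack

# (2 + 3) * 4 compiled for the virtual processor:
run([PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, PRINT])  # prints 20
```

Each trip through the loop costs many real instructions per virtual instruction, which is exactly the overhead a JIT compiler eliminates by translating hot bytecode sequences into native machine code at runtime.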
Today we are able to build large and viable applications using fully interpreted languages, such as PHP, JavaScript and HTML5 (JS/HTML/CSS). The underlying interpreters for these languages require even more resources, provided as a courtesy of the ever improving hardware platforms, and move the developer even further from an execution model dictated by the real processor. The interpreted nature of these languages has also changed the coding process, providing developers with the means to quickly experiment and try new things.
One of the long-standing goals of the software community has been the reuse of existing software modules. Initially that meant reusing standard libraries (either in source or binary form) and small command-line utilities (e.g., when used with scripts). The acceptance of open-source software and an amazing array of JavaScript frameworks and "micro-libraries" are some good examples. But the ultimate reuse story is that of full-scale virtual machines.
Today's network application servers (web servers, database servers, etc.) typically run in a dedicated virtual machine hosted in an operating system (OS) that maintains the direct interface with the real hardware. This arrangement allows for easier migration of virtual machines to new hardware platforms, optimization of hardware compute and storage resources, tuning of an OS and application configuration to a specific task, and minimized software compatibility issues between multiple applications and the underlying operating system.
This is the ultimate software reuse strategy: an entire OS is combined with application-specific software and middleware to implement a highly functional dedicated system in the form of a purpose-built virtual machine. Today it is possible to run virtual machines at near-native speeds on something as inexpensive as an everyday desktop machine.
The Internet of Things (IoT) also represents an intersection of embedded systems with Internet connectivity. Just as standard computing platforms have benefited over the past fifty years from faster, smaller and cheaper electronic hardware, so has the world of embedded systems. What’s new for embedded systems is the availability of small, low-cost, low-power devices that include the hardware and software needed for full-scale Internet connectivity.
This is largely being driven by the development of System on Chip (SoC) silicon for smartphones. Mobile platform SoCs represent an opportunity to build embedded systems on standard OS and hardware components in a way that is analogous to the evolution of the personal computer ecosystems of the past 35+ years.
The Android Open Source Project (AOSP) is turning into an interesting platform on which to build applications on mobile silicon. Two highly visible projects that are leveraging the AOSP platform are Firefox OS and the Ubuntu phone system. In addition, the Yocto Project provides tools and templates to make it easier to build a custom Linux OS image for your IoT device.
SoC platforms represent a new and exciting avenue for the application of Moore’s Law. As high-volume mobile platforms (phones and tablets) evolve, they will leave in their wake inexpensive silicon devices containing an array of useful functions and features. A new suite of software tools and applications is bound to follow, pushing software developers onto embedded platforms and into applications they might never have imagined.