It's easy to forget that the empires built by Steve Jobs, Bill Gates, and their peers were raised on foundations laid more than a century ago. On the day Apple's co-founder would have turned 61, let's take a moment to appreciate the pioneers who came before.
A difference engine built to Babbage's original designs by the Science Museum in London
The modern concept of a programmable computer originated with British mathematician and inventor Charles Babbage in the early-to-mid 1800s. Babbage first envisioned what he called a "difference engine," a concept he later expanded into an "analytical engine" that could handle arithmetic, branching, and loops.
The analytical engine is the great-great-great-great-grandfather of the machines we think of today as computers. Everything from a Mac to an iPhone can trace its history to the analytical engine.
Around the same time that Babbage was developing the analytical engine, the world found its first programmer: Ada Lovelace, who corresponded with Babbage and is widely credited with writing the first algorithm intended to be carried out by a machine.
Mechanical computing plodded along over the following decades -- as engineers and mathematicians like Herman Hollerith (whose tabulating company would later merge with others to become IBM), Bertrand Russell, and Raytheon co-founder Vannevar Bush developed many foundational principles of modern computer science -- until things began to get interesting in the 1930s.
Bill Hewlett and David Packard helped create Silicon Valley and stoke the imagination of a young Steve Jobs.
The mid-to-late '30s saw the fruits of the partnership between Bill Hewlett and David Packard (in which a young Steve Jobs would later find his calling) come to life, along with the nearly simultaneous invention of relay-based computers by George Stibitz at Bell Labs and by German engineer Konrad Zuse.
World War II brought the next breakthroughs, with the later-ostracized Alan Turing leading the development of the Bombe to break German Enigma machine codes. The same British codebreaking effort at Bletchley Park went on to produce Colossus, which is largely credited as the first programmable, electronic, digital computer.
Computers leapt further into the digital age with ENIAC at the University of Pennsylvania. Developed by John Mauchly and J. Presper Eckert, ENIAC was the first general-purpose electronic computer -- essentially the first version of what we know as a computer today.
Hardware continued to evolve, but it took Douglas Engelbart's 1968 Mother of All Demos to show what a computer could really do.
During that 90-minute presentation, Engelbart showed off the components that would form the foundation for the next 50 years of computing. Application windows, the mouse, hypertext, word processors -- all of it can be traced back to that single demo.
From there, the path is well-trodden: Jobs, Gates, and their deputies -- Steve Wozniak, Andy Hertzfeld, Paul Allen, Rod Holt, Bill Fernandez, and others -- forged ahead and created the computing world that we know now.
This isn't to say that the contributions of Jobs and Gates should be discounted; they were the right men in the right place at the right time, ready to push humanity down the path toward personal computing. Without them, the world would undoubtedly be a different place today.
Unfortunately, the inescapable march of history means that those who came before are often forgotten. Jobs himself was fond of quoting programming pioneer Alan Kay, using Kay's insightful remark that "people who are really serious about software should make their own hardware" to frame Apple's ever-growing vertical integration.
In that spirit, let's take Jobs's birthday to remember those who made it possible for him to make his own dent in the universe.