Nov 26 2014
I have succumbed to the world of books-on-tape and been using the 90 mile/2x daily commute from Cape Cod to Burlington to do more than listen to NPR and curse my hollow, rat-on-a-wheel existence. I’ve been a fan since my commute to and from Manhattan for Eastman Advisors, but perhaps Gibbon’s The Decline and Fall of the Roman Empire wasn’t the wisest first choice for an automotive literary experience. In 2001, when I briefly did the same commute to McKinsey & Co’s short-lived TomorrowLab, I listened to lectures from The Teaching Company, thinking I was being super efficient and relentlessly self-improving like Dr. Evil’s father the baker who claimed to have invented the question mark.
Audible is definitely making the commute a lot more enjoyable. I’ve tried dictating into a voice recorder and plugging the results into Nuance’s speech-recognition software, but I can’t compose via dictation and feel like an utter asshole in bumper-to-bumper traffic, holding a little red Sony recorder under my chin and pretending I am composing literary genius (note to self: search to see if any significant piece of literature has ever been dictated).
So far this fall I’ve listened to The Map Thief; Peter Thiel’s Zero to One; and Chris Anderson’s Free, along with a steady diet of podcasts, usually Drupal and open source related. I was always impressed by the anecdote that George Gilder, the Forbes columnist and newsletter writer, wrecked a couple of cars out in the Berkshires because he was fond of driving around listening to technical lectures and proceedings from the IEEE and other deep-geek gatherings. The story, unconfirmed, was that he put a car or two in the ditch because he’d get so wrapped up in talks about erbium-doped fiber amplifiers.
Now I’m into Walter Isaacson’s breezy history of the computer, Internet, and digital revolution and I’m liking it, even if everything is old news because I’ve worked in the industry since the early 1980s and have read pretty much all the histories and biographies of the computer age. Walter’s biography of Steve Jobs was a huge bestseller and I enjoyed it very much, as it taught me two things about Jobs which I did not know before: Jobs didn’t flush toilets and he banned PowerPoint. In The Innovators, Isaacson does a fine job of keeping the history from falling into the rat hole of theoretical science, injects some good human drama and tales of eccentricity, and connects it all together in a way that a Millennial obsessed with hooking up on Tinder and managing their social networks might actually pause and pay some respect to the geniuses (mostly men, mostly working during the Great Depression) who invented the vacuum-tube-powered 50-ton monster computers that got things rolling.
He begins with the tale of Ada Lovelace, daughter of the poet Lord Byron, a passionate mathematician who is regarded as the first computer programmer because of her work with Charles Babbage during his development of the Analytical Engine, his proposed general-purpose mechanical computer, in the first half of the 19th century. Then a leap to the second half of the century, and the mechanization of the US Census by Herman Hollerith, whose Tabulating Machine Company later became part of IBM (thus cutting the analysis of the census from an eight-year manual process to just one); and then to 1937 — the year it all came together in the US, England, and Germany for a bunch of unconnected inventors and scientists who looked at the technology available to them and managed, through a combination of vacuum tubes, hardwired circuits, electromechanical relays, “memories” made out of rotating tin cans, and lots of scrounging, to independently invent variations on what are now regarded as the first working computers.
There’s something about listening to the excitement caused by the Mother of All Demos (Douglas Engelbart’s demonstration of the first mouse, graphical user interface, and network in 1968), the founding of Intel by Bob Noyce, Gordon Moore, and Andy Grove, the impact the MITS Altair had on the Bay Area’s hacker/maker subculture, the development of the Internet’s protocols out of ARPANET… all familiar stories, but very chilling when told through an Android phone mounted on the dashboard of a car, a pocket computer with more storage and power than ever could have been conceived of, and yet…
That future was always in the minds of people like Vannevar Bush — the man who forged the collaboration between military, academic, and industrial research during World War II and was the “Scientist-in-Chief,” advising presidents from FDR through Eisenhower: he described a personal computer he called the “Memex” in a famous essay published by The Atlantic, “As We May Think.” Alan Kay at Xerox PARC had his vision of the Dynabook in the early 70s. Ada Lovelace speculated in the 1840s that someday there would be machines that could help create art.
The theme that fascinates me — a theme emerging from listening to Chris Anderson, Peter Thiel, and Isaacson — is that while the greatest invention of the Information Age is debatable, the most intangible one is the way “innovation” is defined and happens. I personally detest the way “innovation” is tossed around by buzzwordists, along with “impactful” and “pivot,” like verbal styrofoam peanuts; but a tangible definition, and the set of conditions conducive to it, is coming together in my mind. Hence, Churbuck’s Theory of Innovation:
- Those who talk about innovation generally don’t understand it.
- Innovation is not a synonym for creativity or discovery — creativity and inspiration are required, but the words are not synonyms. Galileo didn’t innovate the heliocentric view of the universe — he gathered evidence through a telescope that the planets orbit the sun (and pissed off the Church in the process).
- Innovation, strictly defined in my mind, is the commercialization of invention. Bear with me, for the distinctions are fine. “Silicon doped with impurities will become a semiconductor” — that’s a discovery, a new truth. Putting a logic circuit on a base of doped silicon by printing a pattern of conductive lines is an invention, and can be patented. Realizing you can cram all the logic functions required by a computer’s central processing unit onto a single chip, and then selling the hell out of them (as Intel did), is, I argue, an innovation. Science leads to discoveries. Innovation applies those discoveries to products or processes.
- Innovation is an “aha” moment to be sure, but it usually doesn’t strike a lone genius in a garage; it happens within a group of collaborators working in the right combination of environment and management structure. Flat organizations based on meritocracy are far more conducive to innovation than old command-and-control hierarchies. Open sharing of inventions and discoveries is the fuel for innovation — innovation is derived and borrowed from many sources, and crushed by patents and secrecy.
I highly recommend the Thiel book for his view of what it takes to build a company, as well as Isaacson’s, which is as much about breakthroughs in management and culture as anything else. Flat organizations, rewarding individual contributors with a piece of the action, and leadership that makes decisions and shepherds projects toward clear goals while deflecting distractions are as important as any factor, technical or creative. Isaacson provides an excellent example: Texas Instruments took Bell Labs’ invention of the transistor and, with the Regency Division of I.D.E.A., produced the first mass-market transistor radio — reducing a bulky tube device that sat on a table to a form factor the size of a stack of index cards. As luck would have it, the first transistor radio coincided with the emergence of rock-and-roll, and for the first time teenagers could listen, in privacy, to the music their parents hated on the living room Philco. The Regency TR-1 was not so much a new invention as the innovation of a new “use case”: it introduced the transistor into the public consciousness, and the result was as profound as (if not more so than) the iPod 50 years later.
It’s a long way of saying that it’s good to stop and take stock of the invention that surrounds us, and to realize that in a very short span of time — 60 years, essentially — we’ve gone from “Shake, Rattle and Roll” in our pockets to The Decline and Fall of the Roman Empire on our dashboards.