Subtitled “How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution,” this lengthy (480-page) survey of how computers and the connected world came to be is impressive. And the first of its kind, as far as I’m aware.
It begins at, well, the beginning of it all: the early 19th century, with Charles Babbage and his Difference Engine and Ada Lovelace’s ruminations on the concepts of programming – loops, logic, subroutines/libraries – which would not be realized until the mid-20th century. It continues on a chronological path, sometimes forking to show how multiple groups were working – often without each other’s knowledge – to crack the same problem. The integrated circuit, for example, was developed by two teams almost simultaneously, yet only one team leader received a Nobel Prize for the discovery, simply because the other had died (there are no posthumous Nobel Prizes).
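A quick aside of my own, not the book’s: it’s striking how directly Lovelace’s concepts map onto what any programmer writes today. A toy Python sketch, purely illustrative:

    def square(n: int) -> int:
        """A reusable subroutine - Lovelace imagined libraries of such routines."""
        return n * n

    # A loop over a sequence, with a conditional branch supplying the "logic."
    for i in range(1, 6):
        if i % 2 == 0:
            print(i, "squared is", square(i))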
Isaacson does a good job of knowing when to dwell on a subject and when to simply note a milestone and briefly outline its significance. There is a lot of history – and a lot of characters – to cover, and the book doesn’t go astray too often (one nit to pick: Isaacson spends a couple of pages on Ada Lovelace’s father, the poet Lord Byron. Why? Yes, he was a famous writer and womanizer, which would be germane in a book about the lives of poets, but here?).
In tracing the history of the digital revolution, Isaacson comes to two main conclusions:
1) Progress was not spurred by solitary geniuses, but by collaborative groups:
Math and physics are often powered by the keen insights of gifted individuals – think Newton or Einstein. But the digital revolution almost always needed teams to overcome wide-ranging problems; the transistor and the integrated circuit, for instance, demanded both electrical and materials expertise. No one person could have tackled them alone.
And that leads to another truism Isaacson uncovers: these groups, often duos, worked most effectively when one member was the technician and the other the visionary/salesman. Think Steve Jobs and Steve Wozniak at Apple, or Paul Allen and Bill Gates at Microsoft: one knew how to sell products, the other knew how to build them.
2) The digital revolution divides roughly into two eras: hardware first, then software:
Back in the day, the nerds knew about memory addresses and how to read the colored bands on a resistor to figure out its value. Today, few know – and fewer care – exactly how a computer or router works. Hardware is for huge companies with billion-dollar factories, not for people geeking out in their parents’ basement.
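For anyone who never did that particular kind of geeking out: the band colors are a simple positional code – two digit bands and a multiplier band. A minimal Python sketch of my own (not from the book), decoding a standard four-band resistor:

    # Standard EIA color code: each color is a digit, and on the third band
    # that digit doubles as a power-of-ten multiplier.
    COLOR_DIGITS = {
        "black": 0, "brown": 1, "red": 2, "orange": 3, "yellow": 4,
        "green": 5, "blue": 6, "violet": 7, "grey": 8, "white": 9,
    }

    def decode_resistor(band1: str, band2: str, multiplier: str) -> float:
        """Resistance in ohms for a four-band resistor (tolerance band ignored)."""
        value = COLOR_DIGITS[band1] * 10 + COLOR_DIGITS[band2]
        return value * 10 ** COLOR_DIGITS[multiplier]

    # yellow-violet-red: 47 x 10^2 = 4,700 ohms (a 4.7k resistor)
    print(decode_resistor("yellow", "violet", "red"))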
The transition to software began around the rise of Microsoft – the difficult, time-consuming tasks of building computers and connecting them (locally with Ethernet, remotely via the internet) were essentially done. Now the race was on to devise the tools to use that hardware and networking.
And software development could still be done by individuals or small groups.
VisiCalc – the vision of one man – became the first “killer app.”
Others followed, but software, much like hardware before it, eventually grew too large for individuals or small groups. I can’t even imagine how many lines of code were in the early versions of Adobe Photoshop!
As computers and their underlying software became more flexible yet more complex, networks once again allowed groups – now often far-flung ones – to collaborate, both through small fixes to big projects (Linux and other open-source software) and through content collaboration. For the latter, Isaacson ignores Facebook, Twitter and Reddit and instead focuses on Wikipedia. He ties Wikipedia’s “anyone can edit” structure back to the homebrew clubs of the past, where hardware and software were freely shared.
Interesting – and accurate – take.
Isaacson is a good writer, and he makes smart choices about what to include, what to gloss over and what to ignore. It’s a fine overview of almost 200 years of digital development that doesn’t require the reader to really understand the tech it describes.