
Walter Isaacson’s technology arc I | Where does biology stand now?

Updated: Mar 11, 2022


Thoughts from reading three books: The Innovators, Steve Jobs and The Code Breaker


The parallel between digital and biological technology

Walter Isaacson is a biographer par excellence: he picks subjects of great significance and relevance to us, asks the right questions, parses a vast number of sources and, with great literary flair, weaves them into narratives so elegant that anyone he writes about is fortunate to live forever in those words.


Isaacson is highly accomplished, and among his many roles he is a professor of history. Historians have a firm grasp of time at the scale of decades or more, and unlike most of us with a narrower perception of time, they see today and tomorrow as recurrences of what has happened in the past. To them, it seems, history is now repeating itself.


Isaacson’s work on Steve Jobs and The Innovators sharpened his awareness of the trends that led to today’s digital age (defined here as the era in which the general public has personal computers, user-friendly operating systems, the internet, websites and search engines). It is in this light of history that he penned the story of Jennifer Doudna and molecular biology in The Code Breaker. In the later part of the book, he offers one of his rare personal asides, which may interest (aspiring) biologists:


If I had to do it all over again—pay attention, you students reading this—I would have focused far more on the life sciences, especially if I was coming of age in the twenty-first century. People of my generation became fascinated by personal computers and the web. We made sure our kids learned how to code. Now we will have to make sure they understand the code of life.

Isaacson, Walter. The Code Breaker: Jennifer Doudna, Gene Editing, and the Future of the Human Race (p. 478). Simon & Schuster. Kindle Edition.


Isaacson’s optimism for genome editing is heartening. I surmise that his positivity about CRISPR-Cas technology, which is still in its infancy, stems from a strong belief that we will one day tame the genetic code as we did the binary code. Isaacson traced the development of the digital revolution and deemed it suitable to start the narrative in the early 1800s (it certainly goes further back, for example to Boolean algebra as an extension of Aristotle’s logic, but one must start somewhere not too long-winded to write) and proceeded to tell the story over the next century and a half, from Ada Lovelace’s haunting vision of the modern computer in 1843 to the von Neumann architecture, transistors, microchips, the internet, personal computers, operating systems, the web, search engines, blogs and Wikipedia in 2001. Some remnants of this history swirled in his head as he later wrote the biography of molecular biology and Jennifer Doudna, which led him to this thought:


While attending the 2019 CRISPR Conference in Quebec, I am struck by the realization that biology has become the new tech. The meeting has the same vibe as those of the Homebrew Computer Club and the West Coast Computer Faire in the late 1970s, except that the young innovators are buzzing about genetic code rather than computer code. The atmosphere is charged with the catalytic combination of competition and cooperation reminiscent of when Bill Gates and Steve Jobs frequented the early personal computer shows, except this time the rock stars are Jennifer Doudna and Feng Zhang. The biotech nerds, I realize, are no longer the outsiders. The CRISPR revolution and coronavirus crisis have turned them into the cool kids on the edge, just as happened to the awkward pioneers who once populated the cyber-frontier. As I wandered around reporting dispatches from the front lines of their revolution, I noticed that even as they pursue their new discoveries, they feel tugged, sooner than the digital techies did, to engage in a moral reckoning about the new age they are creating.

Isaacson, Walter. The Code Breaker: Jennifer Doudna, Gene Editing, and the Future of the Human Race (p. 373). Simon & Schuster. Kindle Edition.


At least in some ways, Isaacson believes that the biological era will parallel the digital revolution. In 2019, he saw biology at an epoch akin to 1975, when personal computers were on the brink of being pushed to the masses, followed nearly two decades later by the World Wide Web (1991). It is doubtlessly difficult to predict the future, but the best way to do so is to make the future. And CRISPR-Cas technology is certainly moving in this direction, overcoming technical difficulties to make the system more reliable (less off-target editing of the DNA), more viable (in-vivo delivery) and safer (lower immunogenicity), especially in the endeavour to treat human diseases. Outside of therapeutics, genome editing tools can be used in a myriad of meaningful ways, such as in fast diagnostic devices, increasing plant disease resistance and making gluten-free wheat. It is too useful a tool not to develop, and so it shall be.


Throughout the book, Isaacson may be feverish in conveying that the biological era is coming, but he never equated its pace with that of the digital revolution. The reason is not difficult to see. Back in the heyday of the technological boom, successful garage projects were cheered on, and public participation in the homebrew hobby community and open-source projects (think Wikipedia and the GNU/Linux operating system) was encouraged ('Given enough eyeballs, all bugs are shallow') or at least not frowned upon. Most important of all, many aspects of digital tech were accessible, and there was room to wiggle in with self-taught expertise and make something worthwhile for others.


These characteristics that spurred on the digital revolution are not seen in genome editing. First, biology is a conservative field in which even someone with a bachelor's degree in biology is seen as insufficiently trained to be a scientist. This attitude presents a barrier against public science. Garage projects are currently ill-advised because any genome editing of yourself or others is a medical procedure that should not be treated lightly (you might be charged). Genome editing of the organisms around you might sound fun (a really cool premise shown in Netflix's Biohackers), but the concerns around GMOs make the regulations tricky to follow. If you ask for forgiveness rather than permission, I hope you are not in jail first.


Besides, biological engineering is an arduous process, unlike building a garage computer from scratch. This fact seems to be dawning on Elon Musk, who pushed his Neuralink employees the way he pushed his Tesla and SpaceX engineers. One of the Neuralink scientists, Tim Hanson, left the company saying, "Basic science is basically slow". In biology, if you deal with complex organisms, you need a lot of patience and perseverance. This snail's pace is a killer, and if we needed nerds to start the digital revolution, then we now need nerds with no soul.


This means that biology relies on governmental or institutional support, which aligns its pace with that of World War II-era computing, when computing machines were built by governments, supported by trained academics from universities such as John von Neumann and Alan Turing, to crack German codes. Development is then restricted to expert participation only, but perhaps today we have more biologists, more labs, more global demand and thus more capital behind the promises of genome editing, so we might proceed more quickly.


In short, we are in the explosive period of 1975, but from here on we will probably evolve faster than in 1940 and slower than in 1975. Let's do some back-of-the-envelope calculations. If every step of development takes 25 percent longer than it did in the digital revolution, then reaching the equivalent of the golden age of mass commercialization before 2000 (25 years after 1975) would take us at most 25 x 1.25 = 31.25 years from now. We are talking about having Google and Wikipedia, not the big-data age we are currently in, which makes the future incredibly exciting: what will genome editing bring us in 50 years? Unless developments in physics can make us time travellers, we can only wait and see.
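To make the arithmetic concrete, here is a minimal sketch in Python of the same back-of-the-envelope projection. The 2019 start year, the 1.25 slowdown factor and the choice of milestones are my own assumptions for illustration, not Isaacson's.

# Back-of-the-envelope projection of biology-era milestones (assumptions, not Isaacson's):
# - 2019 is assumed to be biology's analogue of 1975 (personal computers reach hobbyists)
# - every digital-era interval is assumed to take 1.25 times as long in the biology era

DIGITAL_START = 1975   # Altair 8800 advertised to hobbyists
BIO_START = 2019       # assumed biological analogue of 1975
SLOWDOWN = 1.25        # assumed: each step takes 25 percent longer

def project(digital_year: int) -> float:
    """Map a digital-era milestone year onto the assumed biology-era timeline."""
    return BIO_START + (digital_year - DIGITAL_START) * SLOWDOWN

milestones = {1991: "World Wide Web", 1998: "Google", 2001: "Wikipedia"}
for year, name in milestones.items():
    print(f"{name} ({year}) -> biology-era analogue around {project(year):.0f}")

# Scaling the full 1975-2000 span by the same factor gives the 25 x 1.25 = 31.25 years quoted above.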



PS: Here is some trivia for the nerds out there; below is a brief outline of the digital revolution's history, only if you're interested:


1843: Ada Lovelace envisioned the general-purpose computer, described the first computer algorithm (complete with loops and recursion), and dreamt of code libraries.


1890: Herman Hollerith's punch-card machines tabulated large amounts of information, including population statistics. Without the machines, tabulating the 1890 census would have taken eight years; with them, it was done in a year.


1937: Claude Shannon's master's thesis showed how electric circuits could implement Boolean logic. This was a monumental leap because 'true', 'false', 'and', 'or' and 'not' make up a large part of our logical reasoning. Once these are built into electric circuits, it becomes possible to assemble incredibly complex logic architectures out of these tiny blocks of Boolean logic, as the small sketch below illustrates.
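As an aside, here is a tiny illustration in Python (code rather than circuitry, purely for intuition) of how the basic Boolean blocks compose into something more complex: a half adder, the seed of binary arithmetic.

# Composing the basic Boolean blocks (AND, OR, NOT) into a half adder.
# Shannon's insight was that such compositions can be realized directly as electric circuits.

def AND(a: bool, b: bool) -> bool:
    return a and b

def OR(a: bool, b: bool) -> bool:
    return a or b

def NOT(a: bool) -> bool:
    return not a

def XOR(a: bool, b: bool) -> bool:
    # exclusive-or, built only from the three basic blocks
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a: bool, b: bool):
    """Add two bits: returns (sum, carry)."""
    return XOR(a, b), AND(a, b)

for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(f"{int(a)} + {int(b)} = sum {int(s)}, carry {int(c)}")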


1943: Colossus was built to crack the German Lorenz Cipher “Tunny” during World War II. This was not the same as Alan Turing’s Bombe machine, codenamed Victory, which was installed in 1940. The Colossus was programmed by switches and plugs instead of the electronically stored programmes modern computers use today.


1945: The first published report describing the von Neumann architecture, which underlies modern stored-program computers, appeared.


1947: Transistors were invented. Transistors act as small ON/OFF electric switches or amplifiers. Multiple transistors can work in unison to make Boolean logic gates.


1952: The first computer compiler was written, and MANIAC, a von Neumann-architecture computer, was built.


1956: William Shockley’s company, Shockley Semiconductor Laboratory, was founded in Palo Alto. He attempted to create a four-layer silicon diode that supposedly worked better than transistors. This marked the beginning of Silicon Valley.


1957: Fortran programming language appeared.


1959: Fairchild Semiconductor invented a microchip capable of scalable production. A microchip contains many transistors and is a foundational step towards the microprocessor, which functions as the central processing unit (CPU) of a computer.


1963: The mouse was invented.


1969: The ARPANET was built. It pioneered packet switching and later adopted the Internet protocol suite, the two foundational technologies of the modern internet.

In the same year, B, the forerunner of the C programming language, appeared.


1971: Email and the Intel 4004 microprocessor were invented.


1972: C programming language appeared.


1973: Xerox PARC made the Alto, the first computer with a graphical user interface, but it was too expensive for the public and was mainly bought by organizations such as universities. Xerox PARC also developed Ethernet, which differs from the modern internet in that it connects computers only within a small geographical area.


1975: A very exciting year. The first commercially successful personal computer, the Altair 8800, was advertised in hobbyist magazines. The unassembled kit cost USD 439.

In the same year, Microsoft was founded; the Apple I went on sale the following year.


1980: Microsoft agreed to write the operating system for IBM's personal computer.

C++ (C with classes) programming language appeared.


1984: Richard Stallman quit his job to develop GNU, the free operating system. Apple Macintosh 128K was released.


1985: Microsoft Windows was released.


1990: Python programming language appeared.


1991: GNU lacked a kernel, the core of an operating system that lets the software and hardware talk to each other. Linus Torvalds wrote and released the open-source Linux kernel, which made the open-source GNU/Linux operating system complete.

Also, in the same year, the World Wide Web was introduced.


1993: R programming language, written by and for statisticians, appeared.


1995: Java and JavaScript programming languages appeared.


1998: Google launched!


1999: Blogger launched.


2001: We have Wikipedia now.
