Picture the world as it was in 1983, the year BioTechniques launched. Ronald Reagan was in the White House. Americans thrilled to the adventures of Luke Skywalker and Darth Vader in “Return of the Jedi” and took their first steps into the digital age with the then two-year-old IBM PC.
On the scientific front, DNA sequencing was still relatively new. The polymerase chain reaction didn't yet exist—it was still just a brainstorm that inventor Kary Mullis had on a California highway, one that wouldn't see print until 1985. Meanwhile, cell biologists were capturing ever more beautiful images with their microscopes, but their efforts were constrained by the then-insurmountable diffraction limit.
Not surprisingly, life science today bears little resemblance to the way it was practiced three decades ago. Whole-genome sequencing is now routine, as is PCR. And microscopists have breached the once inviolable diffraction limit, taking pictures of cellular structures with unprecedented resolution.
To gain some perspective on the past 30 years in life science technology development, we spoke with key innovators in the fields of DNA sequencing, PCR, and microscopy. Their insights are both interesting and informative, a history lesson from the developers who changed the way we work in the lab.

Sequencing: Moving Up By Scaling Down
In 1983 George Church was nearing the end of his graduate work at Harvard University. He had matriculated in 1977, and that same year joined the lab of DNA sequencing pioneer (and future Nobel laureate) Walter Gilbert.
From the beginning, Church was focused on genome sequencing. In 1974, while still a Duke University undergraduate, he manually keyed in and folded the structure of every then-known tRNA sequence. “I started hoping that every person could afford to see their own DNA sequence,” he writes on his website. He graduated in 1984; his thesis described “the first methods for direct genome sequencing, molecular multiplexing [and] barcoding.”
That year, Church participated in a meeting at Alta, Utah, that would launch the Human Genome Project (HGP). At the time, the only technology capable of decoding the human genome was Sanger dideoxy sequencing, and as the HGP took shape, some argued that the way forward was to ramp up existing technology. But Church argued for a different solution.
“I felt that $3 billion was an unacceptable amount of money for Sanger-based sequencing.” He would spend much of the next two decades trying to find a less costly alternative. And he wasn't the only one.
Jonathan Rothberg ran his first sequencing reaction as an undergrad at Carnegie Mellon University in 1983. He entered Yale University as a doctoral student two years later, and by the time he graduated in 1991 had managed to completely sequence a neuronal gene called slit—9000 bases decoded in six years.
Despite the slow pace, it was apparent even then that the burgeoning field of genomics promised to transform medical science, and in 1991 Rothberg founded CuraGen, a biotech firm devoted to mining the human genome for drug targets and insights into complex diseases. By 1999, CuraGen was a $5 billion, publicly traded company, and Rothberg was its CEO. “I thought I was on top of the world.”
Then his son, Noah, was born and had difficulty breathing. The “consensus” human genome that was being assembled at the time was of little immediate value for Noah. “I realized that I wasn't as interested in the human genome; I was really interested in my son Noah's genome. And that for me was my first introduction to personal genomics.”
Up until then, genomics had taken what Rothberg calls a “Henry Ford approach” to sequencing, “where you just set up an existing technology, but you set it up as an assembly line.” That strategy, implemented at places like the Broad Institute in Cambridge, Massachusetts, and the Sanger Institute in the UK, is difficult to scale to the point where sequencing an individual genome is financially and temporally practical.
So Rothberg, like Church, was looking for a paradigm shift, which he found on the cover of a computer magazine that happened to be on his desk. That cover described a new Intel Pentium chip. “We've been doing it wrong,” he thought. “It's not about Henry Ford, it's about [Intel co-founder] Gordon Moore.”