Where has the time gone? It seems like only yesterday we were in the early days of 2013, and now the year is drawing to a close. Traditionally, December editorials are about looking back at the year that was and forward to the year that will be. Some years it is easy to identify a “hot topic” while in other years it takes a little digging and thinking to uncover important trends. Turns out, 2013 was easy.
While 2013 definitely saw a variety of interesting new methods published in BioTechniques and other journals, developments in DNA sequencing methodologies stood out from the pack. For years, sequencing has been the domain of large centers, core facilities, and researchers with deep pockets. Instrument and reagent costs were high, and output generally exceeded the needs of individual researchers. But all this might be changing, as DNA sequencing appears to be reaching a tipping point: costs are down, while innovative methods and techniques aimed at maximizing output and expanding applications for “lower-throughput” users abound.
In January, Lazinski and Camilli reported on a modified DNA cloning and library construction approach for next-generation sequencing (NGS). Modifications and enhancements for library construction have been appearing more often in the literature. For me, the impact of such articles is two-fold: first, these methods serve (in most instances) to speed up and/or lower the cost of the upstream steps in DNA sequencing, enhancing possibilities for smaller labs. Second, they serve as foundations for educating new users on different approaches through comparisons with existing methods. These are the next steps toward wider adoption: decreasing costs, increasing speed, simplifying methodologies, and inspiring future users. The end result is a deeper understanding and, eventually, a better informed use of NGS.
In February, Okoniewski et al. presented a technique for localizing large genomic deletions using the Pacific Biosciences and Illumina NGS platforms. Copy number variation (CNV) analysis is a growing area of interest for geneticists as it is becoming clear that these genomic modifications can play critical roles in some human diseases. However, localizing the breakpoints where a deletion or duplication has occurred can be a challenge. Applications such as those presented by Okoniewski and colleagues pave the way for others to use NGS platforms for more than standard whole genome or whole exome sequencing studies. And these new applications are creating unique opportunities, another trend of sequencing development in 2013. In the long run, it will be applications, maybe even more so than education and cost reduction, that will lead to the greater use of NGS by all researchers.
Localizing deletions wasn't the only targeted sequencing application we saw during 2013; in one case, targeting itself was the target. In June, Li et al. demonstrated a new methodology to capture protein-coding genes across highly divergent species. The technique adds to a growing toolkit being developed for evolutionary biologists and other life scientists interested in studying gene families from species where little or no reference sequence is available.
August brought two more novel approaches for massively parallel sequencing: a high-plex PCR method for sequencing large numbers of amplified products and a new assessment tool for quantification and size characterization of sequencing libraries. These articles, by Nguyen-Dumont et al. and Laurie et al., respectively, further demonstrate the growing interest among researchers in developing new tools and techniques to enhance NGS adoption.
In the end, I suspect 2013 will be remembered for the new methods, techniques, and applications that are finally taking advantage of the maturing NGS platforms currently available—the starting point for a democratization of the technology. But with new systems and approaches set to debut in the coming months, the full impact that massively parallel sequencing will have on biological research remains to be seen.