Indel detection by amplicon analysis: an alternative sequencing method

Written by Eric Paul Bennett

Eric Paul Bennett, an associate professor at The University of Copenhagen (Denmark) in the Department of Odontology, discusses the development of the Indel Detection by Amplicon Analysis technique as an alternative to next-generation sequencing.

In the last decade, Bennett has focused his efforts on ways of establishing precision-engineered isogenic cells lacking individual “glycogenes”. Using ZFNs, TALENs and CRISPR/Cas9, these efforts have, among others, led to the establishment of novel methodologies that improve both the targeting and the indel detection efficacy of precise gene editing methods. The methods and principles developed show great translational potential in the biotech and therapeutic genome targeting space.

Why did you develop the Indel Detection by Amplicon Analysis (IDAA) method?

That’s a good question. Around 2012-2013, our group was in need of a high throughput indel detection method, but there was not much available. The only option we had at that time was an enzymatic cleavage assay, known as the Surveyor assay or T7E1 assay. We would also use restriction fragment length polymorphism, which requires a restriction site at the target site, or Sanger sequencing. So, we needed a high throughput method for screening the hundreds of knockout clones we were trying to establish at the time, targeting 200 different human genes of interest.

I realized that the single-base discrimination power of a DNA Sanger sequencing instrument would be of great value in the gene editing space. At the time, nobody realized that the predominant indel outcome of CRISPR-Cas is a single-base insertion, which only added to the power of IDAA, in addition to its low cost, limited effort and short turnaround time (within a day). So, the short answer is that there was an unmet need when we established the method 10 years ago.

Many people will go to next-generation sequencing (NGS) applications to deconvolute the indels that are formed. However, NGS is quite laborious, and the costs can be a limitation for smaller academic groups. I think IDAA is a convenient, low-cost alternative to NGS.

How would you suggest a researcher go about deciding on an appropriate indel detection method for the experiment?

It all depends on the application. For example, if the application is gene editing, there are different methods available, IDAA being just one. There are TIDE/ICE, Sanger sequencing, NGS, enzymatic cleavage assays and digital PCR, and they all have pros and cons. When making a knockout cell line, for instance, low cost and high throughput become of key interest to the researcher. In that case, IDAA could be the perfect choice; but if the indel sequence itself is important, IDAA will not reveal it, as the method only measures indel size, and a sequencing approach, either NGS or Sanger-based, needs to be considered.
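The size-based readout described above can be illustrated with a minimal sketch: IDAA infers each indel by comparing the length of an amplicon fragment peak (from capillary electrophoresis) against the wild-type amplicon length. All values below (amplicon length, peak sizes, peak areas) are hypothetical, not real instrument output.

```python
WT_AMPLICON_BP = 250  # hypothetical wild-type amplicon length

def indel_profile(peaks, wt_len=WT_AMPLICON_BP):
    """peaks: list of (fragment_length_bp, peak_area) tuples.
    Returns {indel_size: frequency}, where +1 means a 1-bp insertion
    and negative values denote deletions."""
    total = sum(area for _, area in peaks)
    profile = {}
    for length, area in peaks:
        indel = length - wt_len  # size shift relative to wild type
        profile[indel] = profile.get(indel, 0) + area / total
    return profile

# Illustrative outcome dominated by a 1-bp insertion, the predominant
# CRISPR-Cas indel mentioned in the interview
peaks = [(250, 400), (251, 500), (243, 100)]
print(indel_profile(peaks))
# → {0: 0.4, 1: 0.5, -7: 0.1}
```

Note that, exactly as stated above, this readout gives indel sizes and frequencies only; the underlying sequence of each indel is not recoverable from fragment length alone.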

Why are standardized methods needed and how could this be achieved?

Currently, every genome editing group has its own way of quantifying and validating indels at the on-target and, importantly, the off-target sites. This means that comparison of indel profiles across labs and groups is difficult because the methods and workflows differ. What is needed are standardized reagents against which an indel identification workflow can be calibrated, and this is being investigated by the National Institute of Standards and Technology (NIST) Genome Editing Consortium (MD, USA). NIST intends to establish physical DNA standard samples possessing different indels at different frequencies, allowing researchers to calibrate their indel identification workflows against these known standards. These are currently available from NIST at distributor status and, eventually, the aim is for the reagents to be available to the wider scientific community. I think these physical standards will be a good way of enabling greater across-the-board transparency about the performance of all the different methods available for indel identification.
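The calibration idea described above can be sketched in a few lines: a workflow is run on a reference sample with certified indel frequencies, and the measured frequencies are compared against the known values to estimate per-indel bias. All frequencies below are hypothetical placeholders, not actual NIST reference values.

```python
def calibration_report(known, measured):
    """known/measured: {indel_size: frequency} dictionaries.
    Returns per-indel bias (measured minus known), a simple
    accuracy check of the indel identification workflow."""
    return {indel: round(measured.get(indel, 0.0) - freq, 3)
            for indel, freq in known.items()}

known    = {1: 0.50, -2: 0.30, 0: 0.20}  # certified reference frequencies (hypothetical)
measured = {1: 0.46, -2: 0.33, 0: 0.21}  # frequencies reported by our workflow

print(calibration_report(known, measured))
# → {1: -0.04, -2: 0.03, 0: 0.01}
```

A shared physical standard makes this kind of report directly comparable across labs, which is exactly the transparency the consortium is aiming for.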


Can labs currently compare their CRISPR experiments, and what steps should be taken to enable comparison in these studies?

Experiments can vary with regard to both efficiency and specificity, which depend on several parameters, including the nuclease and the delivery modality. Potential off-target and unintended on-target outcomes also impact the experimental outcome. For CRISPR experiments to be comparable, one has to focus on both the on-target and off-target outcomes. With regard to the latter, we need to expand the toolbox of methods for looking at off-target and unintended on-target events. Current PCR-based methods are limited in their analytical window, covering only a few hundred base pairs around the edited site, and can potentially miss larger events that occur at the target site after editing.

What steps need to be taken to ensure that in vivo gene editing is safe and that its end results are reproducible?

I think it’s very important that we identify the landscape of unintended outcomes, especially when we go in vivo and toward therapeutic applications of gene editing. A specific concern here relates to delivery and specificity, which is currently a limitation for in vivo gene editing for therapeutic purposes. Also, current editing-based therapeutic strategies are mainly targeted to the liver, so we need to develop novel delivery modalities that ensure specific tissue targeting and enable the establishment of curative treatments for extrahepatic tissues.

How has it felt to watch CRISPR grow since you first began using it, and what potential are you most excited about it having in the future?

Having been in the scientific field for quite some years, I experienced the era of PCR, cDNA cloning and automated sequencing in the ’90s, which led to the mapping of the human genome, and I feel some of the same excitement when we talk about CRISPR today as an enabling technology.


 The opinions expressed are those of the interviewee and do not necessarily reflect the views of BioTechniques or Future Science Group.