qPCR Efficiency Calculations
 
Kristie Nybo, Ph.D.
BioTechniques, Vol. 51, No. 6, December 2011, pp. 401–402

This month's question from the Molecular Biology Forums (online at molecularbiology.forums.biotechniques.com) comes from the “Real-Time qPCR/qRT-PCR Methods” section. Entries have been edited for concision and clarity. Mentions of specific products and manufacturers have been retained from the original posts, but do not represent endorsements by, or the opinions of, BioTechniques.

Molecular Biology Techniques Q&A

Why do I calculate efficiencies higher than 100% for my multiplex qPCR reaction? (Thread 31310)

Q I am running a multiplex qPCR reaction using VIC and FAM probes for two different targets. I ran a 2-fold dilution series, starting at 32 ng/µl down to 0.25 ng/µl. The slopes from my regression of Ct over log DNA amount were about -1.45 and -1.40 for the two amplicons, indicating PCR efficiencies >300%. I am puzzled because I am using a previously published protocol where the authors claimed efficiencies of 90%.

What does it mean when PCR efficiencies are higher than 100%? What am I doing wrong and how can I fix it?

A This can happen when PCR inhibitors are present in your template. The inhibitors are diluted out along with the template, so the more dilute samples amplify disproportionately well, which artificially raises the apparent efficiency.
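As a rough numerical sketch of this effect (hypothetical Ct values and a made-up inhibitor model, in Python with NumPy), an inhibitor carried over in proportion to the template delays the concentrated wells most, flattening the standard-curve slope and inflating the apparent efficiency:

```python
import numpy as np

# Hypothetical 2-fold dilution series, 32 ng/µl down to 0.25 ng/µl.
amounts = np.array([32, 16, 8, 4, 2, 1, 0.5, 0.25])
ideal_cts = 20.0 - np.log2(amounts / 32)  # perfect doubling: slope -3.32

# Made-up inhibitor model: carried over with the template, it delays
# Ct in proportion to input amount, so concentrated wells suffer most.
observed_cts = ideal_cts + 0.08 * amounts

slope, _ = np.polyfit(np.log10(amounts), observed_cts, 1)
print(f"slope = {slope:.2f}, apparent efficiency = {10**(-1/slope) - 1:.0%}")
# slope = -2.30, apparent efficiency = 172%
```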

A Another possibility is that non-specific products are being amplified in the dilute samples, lowering the Cts. You can check for this by running your PCR products on a gel.

You can also look at different parts of the curve to see whether efficiencies are normal at either extreme of your dilution series or in the middle. You may find that your data fall within a good range of the curve.

Q I excluded DNA samples at the extreme ends of the series (32 and 0.25 ng/µl) and re-calculated the efficiencies, but it didn't change anything. It doesn't seem to matter which samples I exclude or include. I still get roughly the same efficiencies.

Also, when I use the software program LinRegPCR to calculate efficiencies within each individual well, I get average efficiencies of ∼84% and 94% for my two respective amplicons. I am not sure which calculations I should trust. I think this suggests there is something wrong with my dilution series.

If I change the DNA sample and TE buffer I'm using for the dilutions, will that solve the inhibitor problem? Or will I at least see believable reaction efficiencies?

A You are simply adding too much template. With too much template, your standard curves will be almost the inverse of what you want. You should never use more than 2 ng/µl in your qPCR reactions. Try your 1:2 serial dilutions downward from 2 ng/µl and it will work.

Q If I exclude the wells with concentrations higher than 4 ng/µl and only keep the 0.25, 0.5, 1, and 2 ng/µl dilutions, I still get the same slopes. I am not sure that the concentration is causing the problem.

A Why are you concerned about 84% and 94% efficiencies? The accepted range for qPCR is generally 80 to 110%.

When you use all of your dilutions for the calculations, you get greater than 100% efficiency, correct? But then you get 84 and 94%, respectively, after you eliminate the extremely concentrated points. The difference between an expected 90% and observed 84% and 94% is negligible in qPCR, given that different labs and different hands have subtle nuances in sample preparation methods.

For example, if your lab's efficiency for target A is 94% and for target B is 84% and the other lab sees 90% for both targets, then go ahead and just report it that way. As long as you are using the exact same master mix and the exact same machine and the exact same primer and probe supplier, the differences are not going to be a problem.

A I had a different interpretation of the question. It says, “when I use the software program LinRegPCR to calculate efficiencies within each individual well, I get average efficiencies of ∼84% and 94%.” I think this means that the software calculates an individual efficiency for each well based on the amplification curve. Is that right?

I wonder why that method isn't used more often. My system doesn't even include it. Intuitively I like it, but I haven't researched it.

A I also think that the LinReg approach of calculating efficiency on a sample-by-sample basis for each amplification curve is the best way to go. From what I understand, deciding where background starts for each curve is a confounding issue, but the Cy0 method of Sisti and Guescini should be able to get around that. It would be great for all machines to be capable of doing that, and I think they are moving in that direction.

But back to the question of efficiencies that are greater than 100%: I have seen those kinds of efficiencies before with SYBR Green assays of several targets and also pre-formulated AB assays for several targets. With the AB assays that returned efficiencies higher than 100%, we later found a machine programming error. But for the SYBR Green assay with greater than 100% efficiency for several targets, I still have no explanation. That was the first time that particular operator performed the procedure, so it may be that the serial dilutions were not set up correctly.

If you try serial 1:4 or 1:5 dilutions, you may get better slopes.

Or, is it possible that you are dealing with arithmetic addition to your signal due to an overabundance of degraded or truncated DNA in your samples?

Are you certain there is nothing else being amplified by your primers or probes for each target?

Are your samples of the exact same variety as in the publication you got your qPCR approach from?

Q The LinReg approach calculates efficiencies within each well using the amplification curve. That approach gives me the average efficiencies I want, albeit with variation between wells. I am running 24 wells in total, with 8 dilutions (32 ng to 0.25 ng) done in triplicate. So when using LinReg, I get 24 efficiency calculations each for the repeat PCRs and for the single-copy gene PCRs.
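For readers unfamiliar with the per-well approach, here is a minimal sketch of the underlying idea (this is not the actual LinRegPCR algorithm, which also handles baseline subtraction and window selection automatically; the fluorescence readings below are hypothetical): fit a line to log fluorescence versus cycle number over the exponential phase, take the antilog of the slope as the per-cycle amplification factor, and subtract 1 to get the efficiency.

```python
import numpy as np

def per_well_efficiency(cycles, fluorescence):
    """Estimate amplification efficiency from one well's curve.

    Fits log10(fluorescence) vs. cycle over the supplied
    exponential-phase points; the per-cycle amplification
    factor is 10**slope, and efficiency = factor - 1.
    """
    slope, _ = np.polyfit(cycles, np.log10(fluorescence), 1)
    return 10**slope - 1

# Hypothetical exponential-phase readings (baseline-subtracted),
# increasing ~1.94-fold per cycle, i.e. ~94% efficiency.
cycles = np.array([18, 19, 20, 21, 22])
fluor = np.array([0.10, 0.194, 0.376, 0.730, 1.416])
print(f"Efficiency: {per_well_efficiency(cycles, fluor):.0%}")
# Efficiency: 94%
```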

The other approach I tried involves regressing the Ct values of the dilution series over the log amount of DNA. The slope of the regression is used to calculate the efficiency with the formula efficiency = (10^(-1/slope) - 1) × 100%. Using the latter approach, I get one efficiency calculation for the repeat PCR and one for the single-copy gene PCR, and both are in the range of 300-400%.
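As a concrete sketch of this standard-curve calculation (hypothetical Ct values, in Python with NumPy):

```python
import numpy as np

def standard_curve_efficiency(amounts_ng, cts):
    """Efficiency from a dilution series: regress Ct on
    log10(input amount), then efficiency = 10**(-1/slope) - 1."""
    slope, _ = np.polyfit(np.log10(amounts_ng), cts, 1)
    return 10 ** (-1.0 / slope) - 1

# Hypothetical 2-fold series, 32 ng down to 0.25 ng, amplifying at
# ~1.95-fold per cycle, i.e. Ct rises ~1.04 cycles per dilution step.
amounts = np.array([32, 16, 8, 4, 2, 1, 0.5, 0.25])
cts = 20.0 - np.log10(amounts / 32) / np.log10(1.95)
print(f"Efficiency: {standard_curve_efficiency(amounts, cts):.0%}")
# Efficiency: 95%
```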

I am going to get a new DNA sample and redo the experiment using alternative dilutions, as you suggest, and I'll also run a few gels to see if there is non-specific amplification.

A I think many people first optimize PCR primers with SYBR Green before using the probes so they can see a melt curve at the end of cycling. Running your PCR products on a gel is better for discriminating different products, but it is more time consuming.

Q I figured out what I was doing wrong: I was using the wrong base in the back-transformation formula for calculating the PCR efficiency. The formula 10^(-1/slope) requires the DNA amount to be log-transformed to base 10, but I was regressing Ct over DNA amount on the natural log scale. So it was just a calculation error after all.
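This also explains the specific numbers reported above: regressing Ct against the natural log of the DNA amount shrinks the slope by a factor of ln(10) ≈ 2.303, so a true base-10 slope near -3.32 (about 100% efficiency) shows up as roughly -1.44, right where the observed -1.45 and -1.40 fall. A quick check of the arithmetic (hypothetical Ct values for a perfectly doubling reaction, in Python with NumPy):

```python
import numpy as np

amounts = np.array([32, 16, 8, 4, 2, 1, 0.5, 0.25])
# Perfect doubling: Ct rises exactly one cycle per 2-fold dilution.
cts = 20.0 - np.log2(amounts / 32)

# Same data, regressed against log10 vs. natural log of the amount.
for log_fn, name in [(np.log10, "log10"), (np.log, "ln")]:
    slope, _ = np.polyfit(log_fn(amounts), cts, 1)
    efficiency = 10 ** (-1.0 / slope) - 1
    print(f"{name:>5}: slope = {slope:5.2f}, efficiency = {efficiency:.0%}")
# log10: slope = -3.32, efficiency = 100%
#    ln: slope = -1.44, efficiency = 393%
```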