High-Content Screening
Lynne Lederman is a freelance medical writer based in Mamaroneck, NY.
BioTechniques, Vol. 43, No. 1, July 2007, pp. 25–27

Understanding the Cell

High-content screening, based on automated imaging of phenotypic differences between living cells, can be applied to small-molecule drug and inhibitor discovery and to pathway elucidation. It relies primarily on automated digital microscopy and flow cytometry coupled with computer-based data analysis and storage. The technical and bioinformatics requirements remain challenging, but solutions are in development.

Happier Cells

Robin A. Felder, Professor and Associate Director of Clinical Chemistry and Toxicology, and Director, Medical Automation Research Center, University of Virginia, Charlottesville, VA, is looking at how biotechnology can move the cell culture world from the two-dimensional (2-D) paradigm to three-dimensional (3-D) culture and better phenotypes. To do this, his group has created 3-D matrices that give cells better access to nutrients. They are developing media that are more biocompatible with the needs of regenerative medicine than culture media of the past and that will eliminate the lot-to-lot variation associated with serum or similar non-defined additives. “One goal is to create a pipettable 3-D system to move beyond having to remove cells from a surface by banging on them, adding trypsin, reducing the temperature, or other things that make cells unhappy. The ultimate endpoint is to aspirate and dispense, and we have achieved that goal by creating a levitatable 3-D microcarrier of the correct size.”

These microcarriers are optically clear, nonfluorescent, manufacturable by a consistent process, and can be levitated in an externally applied field—all qualities that should allow them to be used in processes that could be applied to drug development or production of cells or tissues for in vivo use. Felder doesn't think there will be a limit on the types of cells that can be grown this way. His group has grown primary human cells, human cell lines, cancer cells, and standard research cell lines.

In a project supported by the National Institutes of Health, his group has grown 90 human lines derived from renal proximal tubules from individuals with and without high blood pressure to identify cellular defects associated with hypertension. “This is decades of work,” he says. “We should be able to expand and make these cells consistently available, and that should be true of all human cells.”

In addition to using the 3-D culture system for basic cellular research, Felder notes that it may facilitate cell-based production, allowing one to go from the freezer to assay in the future, rather than having to thaw small amounts of cells, expand them in plates, detach them in some manner, then further expand them to the numbers needed. He sees this as eliminating one of the last big bottlenecks in drug discovery. It should also improve high-content screening, bioproduction, and regenerative medicine, as well as basic research and allow the use of primary cells instead of models. “The field is just about to catch fire,” he says. “An individual researcher who might have spent a lifetime to get a favorite cell type to behave will be able to get it right out of the freezer and use it.”


“The ultimate goal of high-content screening is fully understanding cellular function in a systems biology context,” says J. Paul Robinson, SVM Professor of Cytomics, Director, Purdue University Cytometry Laboratories, West Lafayette, IN, and President, International Society for Analytical Cytology. His group added imaging to their flow cytometry research a decade ago. “You cannot survive without being multi-disciplined,” he says. “We want to understand how cells operate individually, in teams, in organs, in the body. We are taking a cytomics approach. At the end of the day, we are looking for biomarkers to predict functional changes. That's the Holy Grail. If you knew what to look for and could identify it before it was too late, for example, in cancer or other diseases, you could fix it,” he observes.


Ideally, one would like to identify what is causing a change before it is non-rectifiable. Robinson notes that although one might not ultimately be able to make a permanent change to cure a disease, it might be possible to use drugs or behavior modification to improve life expectancy and quality of life, giving the example of elevated blood pressure—one target of his research. Moving into the high-content area is possible, he believes, because there are tools to operate at very high speeds, to automate, replicate, and reduce variability; also because it is becoming easier to look at more parameters simultaneously.


One problem Robinson identifies is that currently no two instrument manufacturers have the same standards and calibrations. He compares the current state of high-content screening with flow cytometry over 20 years ago. As that technology expanded and grew, he notes, the quality of the equipment and the data improved. Right now, however, he describes high-content screening as “every person for [his or her self]” in terms of standards and data analysis. “We don't need to change the hardware, but the material out of the equipment ought to be comparable,” he says. “Manufacturers shouldn't lock away the algorithms and not tell people what they are doing. Market forces will drive the technology,” Robinson concludes.

At the Core

David E. Solow-Cordero is the Director of the High-Throughput Bioscience Center in the Department of Chemical and Systems Biology at Stanford University School of Medicine, Stanford, CA. His facility supports whatever projects the faculty brings to it, typically including small interfering RNA (siRNA) screening, as well as small-molecule high-throughput screens. In this field, he sees a flow of personnel from industry back to academia. “We're taking over what startup biotech companies did before,” he says. “It gives a first leg up on drug discovery, but academia is also very interested in understanding pathways.”

“For us, the most exciting thing is being able to do whole-genome RNAi screening,” he says. “Most of these screens involve high-content analysis. These are mega experiments with results that give a lot of information.” What is lacking is a comprehensive way of tying all the information together (e.g., relating high-content imaging of cells of similar phenotype to potentially linked cell signaling pathways that are revealed in gene knockout assays). “In principle, all the bits and pieces are there, including the information and the tools; the question is how to put them together.”

The biggest issue for Solow is the total amount of data generated and the tremendous amount of storage it requires. A single RNAi whole-genome knockout or similar small-molecule experiment can generate 5–10 terabytes of high-content data. “We are unable to store and have it handy and make it publicly available,” he says. “It's three orders of magnitude more than the 10 gigabytes of data we can handle and make publicly available at this time.” One way of getting around this for now, Solow notes, is to capture data from microplate cytometers as fluorescence intensity and location rather than as images, which reduces the data volume to roughly a fiftieth. This yields less primary data, with fewer details and lower resolution, but it is accumulated faster. Another way to overcome the problem, he notes, is to “wait for the computer guys to catch up. One of the best things the human genome project did was to advance computing and to get computation into biology.”
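The scale of the mismatch Solow describes can be sketched with back-of-envelope arithmetic. Every number below (camera resolution, channel and field counts, plate count, storage budget) is an illustrative assumption, not a figure from the article; the point is only that plausible imaging parameters land in the multi-terabyte range he cites, roughly three orders of magnitude beyond a 10-gigabyte budget.

```python
# Back-of-envelope data volumes for a whole-genome imaging screen.
# All parameters are illustrative assumptions, not figures from the article.

BYTES_PER_PIXEL = 2              # 16-bit camera
PIXELS_PER_FIELD = 2048 * 2048   # one field of view
CHANNELS = 4                     # e.g., four fluorescent labels
FIELDS_PER_WELL = 9
WELLS_PER_PLATE = 384
PLATES = 60                      # roughly one gene per well for a whole genome

bytes_per_well = BYTES_PER_PIXEL * PIXELS_PER_FIELD * CHANNELS * FIELDS_PER_WELL
raw_bytes = bytes_per_well * WELLS_PER_PLATE * PLATES
print(f"raw image data: {raw_bytes / 1e12:.1f} TB")        # ~7 TB

BUDGET_BYTES = 10e9              # the ~10 GB a lab can serve publicly
print(f"over budget by ~{raw_bytes / BUDGET_BYTES:.0f}x")  # ~3 orders of magnitude
```

With these assumptions the raw images come to about 7 TB, nearly 700 times the assumed 10 GB budget, which is consistent with the "three orders of magnitude" gap quoted above.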

Another issue Solow identifies is the integration of robotics and how long it currently takes to image a plate: without robotics, a high-throughput screen of several hundred plates could take months.

Charles Karan, who is the Manager of the High-Throughput Screening Room at The Rockefeller University in New York City, points out that major reasons many institutions have core facilities for high-throughput screening are the cost of the equipment and that the process is not a single science, but rather relies on many competencies, including molecular biology, organic chemistry, genetics, informatics, and robotics. He hasn't seen the flow of expertise from industry to academia, but is seeing more individuals come out of degree programs with a good understanding of assay development and miniaturization. At The Rockefeller University, most researchers use Karan's facility to dissect some aspect of the biology of a system they may have been studying their entire academic lives. They may be able to use the effects of small molecules as an alternative to traditional genetic tools to understand their particular system as a whole.

Karan agrees with Solow that they generate so much data that they sometimes don't know what to do with it. He feels that the technology may be ahead of itself in some areas. For example, in high-throughput screening, the development of small wells was ahead of development of equipment ideal for the handling of small volumes of liquids. In the same way, analytic software development lags behind the data collection. “The more tools you have,” he concludes, “the better off you are. In the end, it's all about answering the questions you ask.”
