Automated pipetting

As with any analytical platform, manual pipetting is best suited to handling a few samples at a time and quickly becomes error-prone as throughput escalates. Automated liquid handling is the preferred solution for high-sample-volume applications such as convicted offender databanking. Robotic technologies proven in genome sequencing and clinical environments can be readily implemented in forensic laboratories, and their application offers benefits in process precision and reproducibility, in sample tracking confidence through the elimination of error-prone manual tasks, in the reliability of computer-programmed movements, and in sample throughput.

Sample tracking confidence is an important aspect of data quality that automated pipetting can improve compared with manual processes. High-density assay plates are required for high-throughput applications involving the low reaction volumes of PCR-based assays, an environment where the high positional accuracy and reliability of automated pipetters are considered a prerequisite for secure sample tracking. Most robotic instruments can integrate barcode scanning devices to record the location and identity of all barcoded containers on the instrument work surface, ensuring correct container addressing by pipetting or transport heads. A powerful feature available through this enhancement is operator-independent sample and reagent tracking. Many of these robotic systems offer logging capabilities that support quality assurance and chain-of-custody objectives by recording the pipetting steps associated with specimen processing.
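As a minimal sketch, and using hypothetical field names and file formats rather than any particular vendor's software, the following Python fragment illustrates the kind of per-transfer log record that pairs barcode-scanned container identities with operator, instrument and volume information in support of quality assurance and chain-of-custody review.

    from dataclasses import dataclass, asdict
    from datetime import datetime, timezone
    import json

    @dataclass
    class PipettingLogEntry:
        instrument_id: str        # liquid handler that performed the transfer
        operator_id: str          # analyst who started the run
        source_barcode: str       # barcode scanned on the source container
        source_well: str          # e.g. "A1" on a 96- or 384-well plate
        dest_barcode: str         # barcode scanned on the destination plate
        dest_well: str
        volume_ul: float          # transferred volume in microlitres
        timestamp: str            # UTC time of the transfer

    def log_transfer(entry: PipettingLogEntry, logfile: str = "pipetting_log.jsonl") -> None:
        """Append one transfer record to a run log for QA and chain-of-custody review."""
        with open(logfile, "a") as fh:
            fh.write(json.dumps(asdict(entry)) + "\n")

    log_transfer(PipettingLogEntry(
        instrument_id="LH-01", operator_id="OP-117",
        source_barcode="TUBE-0048812", source_well="A1",
        dest_barcode="PLATE-000233", dest_well="C7",
        volume_ul=5.0, timestamp=datetime.now(timezone.utc).isoformat(),
    ))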

More sophisticated implementations support communication with external systems such as LIS applications (Scholl et al., 1998; Frégeau et al., 2003; Leclair et al., 2004b; Scholl, 2004; Leclair and Scholl, 2005). Logic can be introduced to permit the LIS to control many aspects of sample flow management. Client-based applications can be designed to initiate a transaction containing information about specimens, reagents, operators, instruments, etc. prior to initiating a processing step. Once the LIS has confirmed that sequential processing steps have been completed in order, that reagents are appropriate and have passed quality control tests, that operators and equipment are validated, and other important conditions, a transaction is returned to the client application to signal that processing can proceed. A final transaction is sent back by the client application to the database to record that the process is complete. Since queries to the database can be incorporated at the outset of various processing steps, the LIS can track the progress of every sample, prevent inappropriate processing and alert operators to detected incongruities.
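This exchange can be sketched as follows. The function and field names are hypothetical, since the cited LIS implementations are not publicly specified, but the sequence mirrors the transactions described above: a pre-process authorization request, a go / no-go reply from the LIS, and a final completion record.

    def request_authorization(lis, step_name, specimens, reagent_lots, operator, instrument):
        """Pre-process transaction: ask the LIS whether this processing step may proceed."""
        return lis.authorize({
            "step": step_name,              # e.g. "DNA_EXTRACTION"
            "specimens": specimens,         # barcodes of the specimens in the batch
            "reagent_lots": reagent_lots,   # lots must be appropriate and have passed QC
            "operator": operator,           # must hold a current validation for this step
            "instrument": instrument,       # must be validated and within calibration
        })

    def run_step(lis, step_name, specimens, reagent_lots, operator, instrument, execute):
        """Run one processing step under LIS control."""
        reply = request_authorization(lis, step_name, specimens, reagent_lots, operator, instrument)
        if not reply["approved"]:
            # The LIS detected an incongruity: an out-of-sequence step, a failed reagent lot, etc.
            raise RuntimeError(f"{step_name} refused: {reply['reason']}")
        execute()                                       # perform the wet-chemistry work
        lis.record_completion(step_name, specimens)     # final transaction closes the step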

A further level of integration brings the control of robotic instrumentation under the management of LIS applications through worklists - large yet simple text files containing the pipetting commands required to execute a routine, written in a language supported by the robot's control software (Leclair et al., 2004b; Leclair and Scholl, 2005). Pipetting and sample flow management logic can be combined to transform isolated instruments into components of fully integrated processing platforms. If the intuitive approaches of trained forensic analysts can be understood in sufficient detail to permit their description in a rules-based system, then the system can dynamically alter default processing schemes when certain pre- or mid-process conditions are met. This type of customization can extend to every aspect of automated pipetting, such as changing source and/or destination containers (i.e. cherry-picking), altering transfer volumes and changing liquid pipetting specifications. Sample flow management logic can also manage re-processing queues, regrouping samples that share the same point of reintroduction into the processing scheme or similar modified pipetting schemes, and building batches optimized for pipetting efficiency. An integrated capability promotes higher first-pass processing success by reducing wasteful analytical attempts under non-ideal conditions, which may prove mission-critical for many scarce casework samples. Such a system gives the processing platform a degree of flexibility exceeding what can normally be achieved manually.
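A simplified example of LIS-side worklist generation is sketched below. The semicolon-delimited command format and file name are illustrative only - real worklists must follow the syntax of the specific robot's control software - and the volume rule stands in for the kind of rules-based transfer-volume adjustment described above.

    def transfer_volume_ul(dna_conc_ng_per_ul, target_ng=1.0, min_ul=1.0, max_ul=20.0):
        """Rules-based volume adjustment: aim for a fixed DNA input, clamped to a pipettable range."""
        if dna_conc_ng_per_ul <= 0:
            return max_ul                                 # undetectable DNA: transfer the maximum
        return min(max(target_ng / dna_conc_ng_per_ul, min_ul), max_ul)

    def write_worklist(samples, path="pcr_setup_worklist.txt"):
        """samples: dicts with source plate/well, destination plate/well and quantification result."""
        with open(path, "w") as fh:
            for s in samples:
                vol = transfer_volume_ul(s["conc_ng_per_ul"])
                fh.write(f"A;{s['src_plate']};{s['src_well']};{vol:.1f}\n")   # aspirate command
                fh.write(f"D;{s['dst_plate']};{s['dst_well']};{vol:.1f}\n")   # dispense command
                fh.write("W\n")                                               # tip wash / change

    write_worklist([
        {"src_plate": "EXTRACT-07", "src_well": "A1",
         "dst_plate": "PCR-12", "dst_well": "A1", "conc_ng_per_ul": 0.05},
        {"src_plate": "EXTRACT-07", "src_well": "B1",
         "dst_plate": "PCR-12", "dst_well": "B1", "conc_ng_per_ul": 2.40},
    ])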

Convicted offender databanking was the first forensic process to employ extensive automation. These specimens are collected under controlled conditions similar to those used in clinical genetic testing, making them ideal specimens for genetic analysis. As most of the wet chemistry is similar from one forensic application to the next, the experience gained from the automated processing of large numbers of convicted offender specimens has been leveraged into other areas of forensic processing. However, variation in sample input attributes greatly increases the processing contingencies that must be handled and thereby increases the design complexity of an automated processing system. In that respect, casework and MFI samples, being collected at crime or disaster scenes, often present compounded problems linked to substantial variation in substrate, cell/tissue type and the quality/quantity of recoverable biological material, which calls for chemistries and robotic pipetting schemes that support a wider range of input material quality and quantity. The increase in precision and reproducibility afforded by automated pipetting devices provides a consistency that may improve the quality of STR data generated from compromised samples.

Casework samples also present additional intricacies at the evidence screening step: many samples must be localized and cut away from larger pieces of evidence, and all samples must be assessed to confirm the human origin of the recovered material, the body fluid involved, their suitability for ensuing DNA extraction and genetic analysis, and the amount of material to be processed in order to meet the sample input range specifications of the processing platform. A substantial, largely manual front-end processing step is therefore necessary to qualify samples for further processing and direct them to an appropriate wet chemistry protocol. Throughput improvements for this front-end step may come from improved stain visualization technologies, streamlined presumptive tests and clerical technologies (e.g. LIS-supported voice recording, speech-to-text software, digital photography and tactile computer screens) that facilitate and expedite accurate information capture.
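As an illustration only, the screening information to be captured at this front-end step might be modelled along the following lines; the record structure and field names are hypothetical and would in practice be dictated by the LIS in use.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class ScreeningRecord:
        exhibit_id: str                           # parent piece of evidence
        cutting_id: str                           # sub-sample removed for DNA processing
        presumptive_tests: dict = field(default_factory=dict)   # e.g. {"Kastle-Meyer": "positive"}
        human_origin_confirmed: bool = False
        body_fluid: Optional[str] = None          # "blood", "semen", "saliva", ...
        estimated_material: Optional[str] = None  # e.g. "2 x 2 mm bloodstain"
        photo_refs: list = field(default_factory=list)           # digital photograph identifiers
        routing: Optional[str] = None             # wet chemistry protocol assigned after assessment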

In summary, substantial improvements in throughput can be readily realized through the implementation of LIS-supported automation of pre-data analysis steps. Development activities should focus on enhancing efficiency through aids to evidence screening and on the deeper integration of automated liquid handlers with the LIS. Continued progress in these areas is essential if reductions in casework backlogs are to parallel those achieved for convicted offender samples.
