The Molecular Time Line

As the 21st century begins, the clinical diagnostic laboratory is entering a new phase. Biotechnology, in all its forms, is the fastest-growing discipline in the modern clinical laboratory. From the original experiments of Gregor Mendel in 1865 describing hybridization in plants to the Human Genome Project, molecular biology presents the laboratory professional with new challenges. These challenges are not limited to the laboratory professional and questions of science; the ethical, legal, and commercial questions are every bit as daunting as the scientific issues.

Molecular biology appears to be a relatively recent discipline, but it is possible to point to events in the distant past that can be considered precursors of the current science. The Assyrians and Babylonians, between 5000 and 2000 BCE, recognized the existence of gender in the date palm tree and undertook artificial pollination (13). It can be surmised that the purpose was to enhance those traits that were considered desirable. Between the 6th and 4th centuries BCE, the Greeks discussed inherited traits and the relative influence of inheritance versus environment with respect to birth defects (13). Arguments were put forth as to how humans developed and what was the source of the traits so easily recognized between parent and offspring.

Progress was slow until Maupertuis described the first inherited genetic disorder through four generations of a family in 1752 (13). Joseph Adams published on the hereditary properties of disease, the delayed expression of hereditary diseases, and environmental exposure as a trigger for disease (13). This was followed by Schleiden and Schwann's proposal that nucleated cells were the fundamental units of life and Virchow's theory that cells could arise only from the division of existing cells (13). In 1859, Darwin published On the Origin of Species, a significant work that nevertheless lacked a genetic theory of inheritance (13).

Most agree that molecular genetics began in 1865, when Gregor Mendel presented his work with peas to explain hereditary traits (14). As important as this work is now viewed to be, it must be remembered that it went essentially unrecognized for 35 years. For the remainder of the 1800s, there were many discoveries and theories of cell physiology and structure. Some of the current terms in molecular biology and genetics, such as nucleic acid, chromosome, and mitosis, come from this era.

The first half of the 20th century was fueled by Mendel's work. Traits in plants and animals were re-evaluated in light of the Mendelian ratios. Remarkable studies by remarkable scientists contributed a large volume of basic science to this burgeoning discipline. Such findings as the pairing of chromosomes (1902), the determination of gender by the X and Y chromosomes (1905), the physical location of genes on chromosomes (1910), the first gene map (1913), and the isolation and purification of DNA (1935) were obligatory precursors to the Human Genome Project (13,14).

There were other events during this period that have an interesting relationship with modern molecular biology. First, the stress of every first-year genetics student can be traced to 1910, at Columbia University, when T. H. Morgan determined the sex-linked character of some traits in Drosophila melanogaster, a choice of model organism that allowed the study of many generations in a short time span. This became important because Hermann Muller used the fruit fly model to demonstrate how ionizing radiation accelerated the formation of genetic mutations, work that would matter to the Department of Energy several years later. Second, X-ray crystallography studies of DNA and proteins led to the coining of the term molecular biology, the implication being that there had been a movement from the relatively gross studies of biology to a more refined study at the molecular level. Third, the concept of one gene, one enzyme was proposed by George Beadle and Edward Tatum (15). This theory unified several of the discrete pieces of information that had preceded it and also unified genetics and biochemistry (15); it is regarded as the first major discovery of molecular biology (15).

Until 1950, information was being gathered, unified, and converging toward the point that is now called molecular biology. However, as the computer scientists say, there is another thread, one more closely related to the practice of current molecular biology. In the early part of the 20th century, Archibald Garrod published his findings on the hereditary nature of alkaptonuria (15). The significant issue here was that he was able to show it to be a metabolic, or chemical, disorder (15). Of the cases he studied, all were the product of a union between first cousins, and Garrod was able to reason that this particular mating practice allowed a recessive character to surface, as predicted by a Mendelian distribution (15). This finding was followed a few years later by his book Inborn Errors of Metabolism. The exact chemistry of the involved metabolic pathways would not be known for several years, but this is the first instance of laboratory work yielding a diagnosis, much like today's practice.

A similar discovery took place in 1934, when Asbjørn Følling related mental retardation to the metabolic disorder phenylketonuria (PKU), an inborn error in the metabolism of phenylalanine. The error arises from a mutant gene affecting the synthesis of the enzyme phenylalanine hydroxylase (13). This represents another application of molecular biology: a genetic disease is detected very early in life, and a treatment regimen can be started to prevent its disastrous sequelae.

In 1949, Linus Pauling and his research group published an article in the journal Science that described sickle cell anemia as a molecular disease (15). This is the first description of a medical disease on a molecular basis attributed to a mutant gene. Pauling followed this with a description of the α-helix structure of proteins (14,15). This elucidation of the three-dimensional structure of proteins was a remarkable feat; yet Pauling did not discover the structure of DNA.

These efforts continued in the 1950s with the establishment of the number of human chromosomes at 46 and the discovery of the chromosome abnormalities Down syndrome (trisomy 21), Turner syndrome (45,X), and Klinefelter syndrome (47,XXY) (13). This was an impressive start for the discipline of cytogenetics. Although these advances are some of the earliest representations of how medical molecular biology is practiced, the fundamental science was advancing at a remarkable rate. Chargaff determined that the ratio of adenine to thymine, and of guanine to cytosine, was always 1:1, an important breakthrough for determining the structure of DNA (13,14). Combining this with the X-ray crystallography work of Rosalind Franklin and Maurice Wilkins, which showed an orderly helix of multiple polynucleotide chains, James Watson and Francis Crick were able to propose a structure for DNA (16).

Many pieces of the puzzle were being put in place by the extraordinary work of many groups. The gross structure and many technical details of the components of DNA were becoming available to the research world, but one very important question remained: how was this package of information translated into actual proteins? Marshall Nirenberg and Heinrich Matthaei conducted a series of experiments with synthetic RNAs to see whether they would direct protein synthesis. From this series came the "poly U" discovery that UUU was the code for phenylalanine (16). Along with significant contributions from Gobind Khorana (polynucleotide synthesis), Philip Leder (tRNA binding to ribosomes to determine the code), and many others, this opened the way for the remainder of the code to be broken.

As important as the fundamental discoveries were, equally important developments were made in methods and technologies. This period saw the isolation and characterization of reverse transcriptase, DNA ligase, and restriction enzymes; methods for staining chromosomes; the Southern blot assay for DNA fragments; an approach to determining the nucleotide sequence of DNA; phage and plasmid development; and the polymerase chain reaction. The polymerase chain reaction was, in a sense, an integration of all that had gone before: this relatively simple procedure made it possible to characterize DNA even when the source material was of limited quantity. It was an active time that made all of the tools for cloning available.

Cloning, as a process, was accomplished before the word was coined. In 1952, Robert Briggs and Thomas King cloned frogs using a nuclear transfer procedure that was later improved and replicated by John Gurdon at Cambridge (17). There were other experiments with the fruit fly and bacteria, but in the early 1970s, Paul Berg created the first recombinant DNA, and Stan Cohen, Herb Boyer, and colleagues created the first recombinant DNA organisms (16). They had successfully amplified toad DNA in Escherichia coli. This was the beginning of genetic engineering, and it opened the discussion of the social impact of this new science.

The ability to alter genes and to amplify them in another species was a source of concern to say the least. Could these new "agents" cause deadly diseases that would ravage mankind?

This was truly uncharted territory, and for the next few years, the scientific community discussed what, if anything, should be done to control the science. An international conference in February 1975, held in Pacific Grove, CA (Asilomar Conference), generated a set of provisional recommendations that were later used by the National Institutes of Health (NIH) to formulate a set of mandatory guidelines. All NIH-funded programs were obligated to follow these guidelines (16). Other agencies around the world soon adopted similar restraints on recombinant DNA research.

An unusual situation was now developing. The potential of cloning was not lost on anyone, and each experiment received wide exposure in both the popular press and scientific journals. Enterprising individuals therefore saw the commercial possibilities, and the early biotech companies were born. Genentech, Cetus, Genex, Biogen, and Amgen were some of the early entries into genetic engineering (16). In 1980, it became legal to patent genetically engineered organisms. This encouraged the pharmaceutical and research companies to pursue protein hormones, drugs, and specific links between genetic abnormalities and diseases. Somatostatin was the first genetically engineered hormone; it was followed the next year by insulin and shortly thereafter by erythropoietin. It was clear that this was the new approach to the production of pharmaceuticals.

The cloning experiments that were conjecture in the 1950s and 1960s became reality in the 1980s. Nuclear transplant cloning of mammals was accomplished in mice, sheep, cattle, pigs, goats, and rabbits (17). These experiments and others paved the way for the cloning of Dolly, the first mammal cloned from an adult cell. Dolly was born in July of 1996 at the Roslin Institute in Scotland (17). The theories had been turned into practice, and the pharmaceutical industry, as well as all of biological science, would embark on an exciting and daunting future.

One of the greatest success stories in molecular biology has to be the Human Genome Project. Some time after World War II, the Atomic Energy Commission (AEC), later known as the Department of Energy (DOE), became intensely interested in studying the health effects of ionizing radiation (18). The AEC had been consumed with the Manhattan Project and the creation of the atomic bomb, and one can reasonably assume that the interest in ionizing radiation was piqued by those events. The AEC was the largest funder of genetic research in the United States, and by the 1980s the DOE was supporting research on the health impact of non-nuclear sources of energy as well (18).

In this environment, Charles DeLisi began to muse about mapping the human genome. The obstacles were tremendous. Initially, it was thought that the process required to accomplish such a feat was impossible; the techniques to make it practical did not yet exist. Major agencies like the NIH were not interested, and neither were many of the ranking scientists throughout the country. Yet the idea persisted, and as technologies were developed and computer power improved, interest began to grow. In 1987, the DOE announced the formation of the Human Genome Project Initiative, which would order and sequence the human genome (18). In science, as in all of life, nothing becomes more interesting than when someone else has an interest. This announcement by the DOE stirred the interest of the NIH or, more specifically, its director, James Wyngaarden (18). The scientific community was still divided because of the perception that the money used for this project would jeopardize all other funded research; the cost was estimated in the billions of dollars. The National Academy of Sciences wrote a report for the NIH that supported the genome project, and the funding war was on. The DOE and NIH cosponsored the project in the early days, but funding gradually shifted to the NIH (18).

The official announcement of the Human Genome Project was made in 1990, with an expected duration of 15 years and an expected cost of 3 billion dollars. The original goals were to generate a high-resolution genetic and physical map of the human genome; to determine the complete DNA sequence in humans and other organisms; to develop the technology to store, analyze, and interpret the data; and to assess the ethical, legal, and social implications of genomics. It is interesting to note that the ethical, legal, and social issues were funded from the same source as the scientific project. Remarkable progress and cooperation, as evidenced by the sharing of sequence data within 24 hours, was a hallmark of the project. That is not to say there was no competition from the commercial side: Celera Corporation was created to compete with the NIH project, setting as its goal a 3-year timetable at a significantly reduced cost.

President Clinton announced the completion of the first draft of the Human Genome Project in June 2000. It was a joint presentation, with both the commercial company Celera and the International Human Genome Consortium represented. At the time of the announcement, the project had cost about 300 million dollars and was several years ahead of schedule (13). It was truly a first draft in that the entire genome had not been mapped, but by 2001, about half was finished and available in the public database (13). Fifty years after the announcement of the structure of DNA, the Human Genome Project had accomplished virtually all of its goals.

From this brief overview, it can be said that molecular biology has developed, more than any other science, through the cooperative effort of many diverse disciplines. There was the mathematical approach of Mendel, employing algebraic logic (19). The isolation of protein-free nucleic acid by Richard Altmann and the experiments of Fred Griffith and Oswald Avery converting nonvirulent bacteria to virulent ones are significant developmental landmarks (20). The application of the principles of theoretical chemistry by Max Delbrück and Erwin Schrödinger allowed others to proceed using the principles of physical science (19). James Watson and Francis Crick elucidated the mathematically satisfying structure of DNA (21). There were the developments in computer science that made possible the software and hardware for data storage and analysis. Even the legal issues of patenting have to be considered as contributing to the development of this science.

For the first time in the history of the diagnostic laboratory, molecular biology is extending the range of information available. Until now, the laboratory has been descriptive in nature: it could measure events currently under way by evaluating the chemistry, hematology, or anatomical pathology. Molecular biology allows the laboratory to be predictive in nature; statements can now be made about events that might occur in the future. This is different from an elevated blood glucose value, from which a diagnosis of diabetes can be made; the new technology returns results indicating that a patient might be at risk for a disease. Because this technology can detect carriers of a mutation and predict risk for disease development, ethical considerations and genetic counseling have become an inseparable part of the laboratory procedure. Preventive medicine, therefore, will benefit from this new technology. In cases of a family history that suggests high risk for a particular disease, a laboratory test might indicate that there is no risk to a specific family member; if there is significant risk, then medical care might be able to intervene at a much earlier stage. This has significant financial benefits for those who must control the costs of healthcare.

The pharmaceutical industry will benefit greatly from the genetic engineering approach to drug production, and even to synthesizing drugs specific to an individual patient. Perhaps that synthesis will take the form of modifying genes in animals so that the animal synthesizes the human product, greatly reducing the need for organic synthesis.
