Chemistry as a science was developing rapidly in the 18th and early 19th centuries. Diabetes was the subject of intensive investigation by medical chemists. Physicians such as John Rollo and William Cruikshank demonstrated the presence or absence of sugar in the urine of diabetics, depending on the state of the patient's disease (5). They advocated monitoring a patient's status by testing for sugar in the urine (5). The conventional wisdom of the time was that most practicing physicians were not sophisticated enough to perform this type of analysis routinely (5). Nevertheless, the concept of monitoring through repeated testing was being considered.
In 1827, Richard Bright published his Reports of Medical Cases, in which he described albumin in the urine of patients with dropsy, the collection of serous fluid in body cavities or cellular tissue. Bright was aware of the anatomical lesions associated with this condition, and he made the link between the chemical and the anatomic findings (5,7). The condition was later named Bright's disease in his honor (7). This undoubtedly gave considerable credence to the analysis of specimens to diagnose disease. The test was simple: heat the urine and observe the white precipitate that formed. The presence of kidney disease could be predicted from the outcome of the test. The test was not perfect, however, and as more people performed it, conflicting results were obtained. Some "albuminous" material did not precipitate unless chemicals were added, and some normal people had precipitates in their urine (5).
These and other arguments were responsible for the slow acceptance of Bright's albumin procedure and its use in diagnostic medicine. Nevertheless, the principle of laboratory testing, and the additional insight it gave the physician, was slowly taking hold. Progress was on the steep early portion of a geometric curve, and by the end of the first half of the 19th century, greater utilization and faster development of laboratory diagnostic medicine would be commonplace.
A major driving force for this increased activity was the work of Gabriel Andral. Recognizing the value of chemical analysis of body fluids, he argued effectively for increased research by the medical community (5,6). He believed in the chemical and microscopic examination of the liquid as well as the solid components of the body (5,6). His work, Pathological Hematology, examined blood chemically, microscopically, and visually in both healthy and diseased populations (6). He measured or calculated the major components of blood and was able to demonstrate a decreased red cell mass in anemia and a decreased blood albumin in albuminuria (6). The success of Andral's work, coupled with his enthusiasm, led other workers to pursue similar investigations. The studies of sugar in the blood of diabetics and of uric acid in the blood of those who suffered from gout are prime examples of this effort (5).
The hematology specialties also benefited from Andral's work. The invention of counting chambers for red cell quantitation and their use in diagnosing anemia were discussed in medical publications (8). All of this work in the chemical analysis of fluids and the microscopic analysis of blood was facilitated because bloodletting was still an acceptable form of therapy. This was a Catch-22 situation: as more information was gained about diseases like anemia through these new methodologies, it became apparent that bloodletting was not appropriate therapy, and this seemingly inexhaustible source of study material would become less available.
The last half of the 19th century was a prolific time for the development of laboratory methods. So many methods began to appear that the practicing physician had difficulty choosing which assays were reliable and which were not (8). Time and equipment were also becoming impediments to the routine use of these new tests. Physicians complained that the time required to conduct these analyses and the expertise involved in their performance exceeded their capabilities (8). The problem would get worse before it got better. Chemistry was becoming a tool used by all the medical disciplines. Dr. Paul Ehrlich used chemical dyes to test urine in the diagnosis of typhoid fever and aniline dyes to distinguish different types of leukocytes (7,8). The study of gastric disturbances was accomplished by chemical analysis of stomach contents removed by gastric tube (8).
The medical literature was replete with discussions of new methods and of the failures and shortcomings of older ones. If technological advances were the only requirement for legitimacy, then laboratory medicine was an established discipline. However, true acceptance could only be established through academic credentials. Throughout the 19th century, there was a parallel development of the academic and political status of diagnostic laboratories. Initially, medical schools did not have the status of universities, and it was well into the 19th century before they became equivalent to colleges (9). Around the same time, medical chemistry was being split from chemistry proper, which was either general or organic chemistry (9). The latter subjects were considered to be more pure science and, therefore, more prestigious. The result was that physiological chemistry was the domain of physiology, with no academic standing of its own (9). There were notable exceptions, one being the University of Tübingen, where the first chair of physiological chemistry was established in 1845 (9). The first chairman was Julius Schlossberger, who was responsible for all the chemistry teaching in the medical faculty (9). He was followed by Felix Hoppe-Seyler, under whom the chair was transferred to the philosophical faculty (9). The chair survived into the 20th century, primarily because of the stature of its chairmen, all of whom were both organic and physiologic chemists (9). Similar academic structures were not as successful at other universities. It was not until the position was established in the medical schools of the United States that physiological chemistry became secure.
The first laboratory of physiological chemistry in the United States was established in 1874 at the Sheffield Scientific School of Yale University under the direction of Russell H. Chittenden (10). This was rapidly followed by similar laboratories at other major universities. Faculty members were expected to teach and conduct research, and professional positions were awarded on the academic credentials of the applicant (10). These facilities became a permanent part of universities and, later, medical schools. The premedical and medical training of physicians came to include laboratory training in the biological and chemical sciences.
Another development in the late 1800s was the appearance of hospitals in the United States. By midcentury, these hospitals were designing laboratory space for purposes of urinalysis (7). Resources were being made available for laboratory work, of which the overwhelming volume was urinalysis (7). There is some indication that urinalysis was routine in at least one prestigious Eastern hospital (7). At the end of the century, the hospital laboratory had been joined by the ward laboratory (8), a smaller version of the main laboratory. The rationale was that a small laboratory space near patients would reduce the time required to get results and could be staffed by physicians and house staff. In practice, both types of laboratory grew, each requiring more resources to keep it operational.
Two physicians added considerable credence to the concept of a professionally staffed, hospital-based laboratory. In a 1908 lecture, Otto Folin proposed that laboratories should be hospital based and staffed by professional physiological chemists (11). William Osler judged the laboratory to be indispensable to the clinical physician. When physicians as influential and respected as these became proponents of laboratory testing, the position of the laboratory was permanently secured (8).
With the continued acceptance of the laboratory, and with hospitals considering it integral to their service, a subtle change began to take place. Once the laboratory became established, work was generated. A review of several hospitals' records from the beginning of the 20th century indicates that urine testing was being done on most patients even when there was no indication for the test (7). These urine tests were being refined by investigators who now held positions in the laboratory. New tests were being added for blood and other body fluids. Many of the famous names associated with clinical laboratory medicine, such as Folin, Benedict, Garrod, Koch, Van Slyke, and Ehrlich, among others, came from this era. It was a very productive time for research and development, and many of these findings were transferred to the diagnostic service laboratory.
Urine and diabetes have been studied for centuries in the hope of providing better care. Records at the Pennsylvania Hospital show that urine sugar measurements were used to monitor the therapy of a diabetic woman (7). The end point was a negative urine sugar finding, so urine was analyzed every day. This was one of the first records of using laboratory tests to monitor treatment. After insulin was discovered, monitoring sugar became even more important because of the difficulty in controlling insulin therapy. Insulin preparations were of different purities, and external factors such as exercise and diet made insulin dosing very difficult. Blood sugar analysis was possible but difficult, so regular urine sugar measurement became routine (7).
Methods continued to be developed and clinical applications were tested. Each test or procedure found its way into the service offerings of the clinical laboratory. It was apparent that at least two problems continued to plague the laboratory. First, the volume of work was increasing, and projections indicated that the trend would continue. Second, laboratory tests were difficult and tedious to perform and, as a result, showed significant variability from imprecision. A partial solution to these issues came from Dr. Leonard Skeggs, whose design of a continuous-flow analyzer was the first practical automated unit for the laboratory (12). These first designs were essentially mechanical duplicators of hand procedures, but they enabled the laboratory to increase throughput and improve the precision of analysis. Laboratory automation has since undergone several generations of change; automated instruments are now found in all sections of the laboratory and encompass a wide spectrum of methodologies.