The whooping cough epidemic: is the ‘new’ vaccine to blame?

Pertussis (whooping cough) has been a significant cause of morbidity and mortality in young children since the first epidemic was described in 1578. Currently, in the West, even when infants suffering from the disease are hospitalized and appropriately treated, around 1% still die; in less developed countries the mortality rate in infants is as high as 4%. However, following the isolation of the causative organism Bordetella pertussis over a century ago, years of research and development resulted in the introduction of an effective vaccine in the 1940s.
The whole-cell vaccine used heat-killed bacteria combined with diphtheria and tetanus toxoids to give the classical DPT vaccine, usually given to infants three times during their first year of life with two further booster doses during childhood. The advent of this vaccine did not prevent the three-to-five-year pertussis epidemic cycle, but it elicited a strong immune response and the total number of cases plummeted in immunized populations. There were some common side effects, including swelling, mild fever and pain, but these were trivial compared with the high risk of children contracting pertussis if they were not immunized. Sadly, though, very dubious research linked cases of SIDS and encephalopathy with use of the whole-cell pertussis vaccine, and the popular press eagerly disseminated this dangerously misleading information. Parents began to exercise their so-called ‘freedom of choice’ based on a dearth of unbiased information and stopped having their children immunized, so in the 1990s a new acellular vaccine (DPaT) with fewer side effects gradually replaced the classical DPT.
Now cases of pertussis have more than tripled in the last five years in much of the world, and the resulting whooping cough epidemic is the worst for 50 years. While it is possible that a more virulent strain of the bacterium has evolved, the most likely explanation is that the ‘new’ vaccine is not as effective as its predecessor. Indeed, a recent robust study from Australia compared the incidence of pertussis in 40,694 children who were immunized in 1998 with either DPT or DPaT (both vaccines were still in use at that time). Significantly higher rates of pertussis were found in the children who had received the latter vaccine.
The suggested solution to the pertussis epidemic is to extend immunization programmes to cover pregnant women as well as all those who come into contact with young infants. Wouldn’t reintroducing the old vaccine be simpler?


Diagnosis of blood-borne parasitic infections: an overview

Methods for the diagnosis of blood-borne parasitic infections have stagnated in the last 20–30 years. Recently, however, there has been a tremendous research effort to develop newer diagnostic methods based on serological, molecular, and proteomic approaches. This article examines the various diagnostic tools that are being used in clinical laboratories, optimized in reference laboratories, and employed in mass screening programmes.

by A. Ricciardi and Dr M. Ndao

Blood-borne protozoans are the causative agents of some of the world’s most devastating and prevalent parasitic infections. This group of pathogens includes members of the Trypanosoma, Leishmania, Plasmodium, Toxoplasma, and Babesia genera. Most of these infections, with the exception of toxoplasmosis and babesiosis, have traditionally been described as tropical or subtropical. However, increases in international travel and immigration have made some of these tropical diseases realities in developed countries as well. In addition, infection via contaminated blood (transfusions and organ transplants) has become a major problem. Clearly, the transmission of blood-borne protozoans knows no borders, and the actual number of cases is underestimated. Quick diagnosis has always been a priority in order to determine the appropriate treatment and prevent fatalities. In addition, now more than ever, advances in diagnostics can help prevent transmission and provide active surveillance. Currently, diagnostic and reference laboratories use an array of techniques including microscopy, serological assays, and molecular assays. Here, the advantages and disadvantages of these methods are discussed.

Toxoplasmosis
Toxoplasmosis, caused by Toxoplasma gondii, has a worldwide distribution. In immunocompetent individuals, more than 80% of primary Toxoplasma infections are asymptomatic [1]. Toxoplasmosis becomes a problem when an individual is immunocompromised or during pregnancy. Diagnosis of toxoplasmosis varies according to the immune status of the patient.

Diagnosis of immunocompetent individuals relies on serology. Early antibody responses can be detected via methods such as the dye test, immunofluorescent assay, and agglutination test, whereas later IgG titres are detected by enzyme-linked immunosorbent assay (ELISA). For many years, the Sabin–Feldman dye test was the gold standard diagnostic technique due to its sensitivity and specificity. In recent years, few laboratories have continued to use this method, focusing instead on newer techniques such as indirect immunofluorescent antibody tests, hemagglutination tests, capture ELISAs, and immunosorbent agglutination assays (ISAGAs). Serological assays lack the capacity to differentiate between recent and older infections; IgM levels can persist for over two years [2]. In order to determine whether an infection is recent, an avidity ELISA is performed. This assay measures IgG avidity and is based on the concept that, as the immune response progresses, an immunoglobulin’s affinity for a specific antigen increases [3].
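
As a worked illustration of the avidity concept, the sketch below computes the commonly used avidity index: the fraction of ELISA signal retained after a chaotropic (urea) wash. The formula is standard, but the cut-offs shown are illustrative assumptions, not validated clinical thresholds.

```python
# Minimal sketch of an avidity index calculation for an avidity ELISA.
# Cut-offs are hypothetical; a validated assay defines its own.

def avidity_index(od_with_urea: float, od_without_urea: float) -> float:
    """Avidity index (%) = signal retained after the urea wash."""
    if od_without_urea <= 0:
        raise ValueError("untreated-well OD must be positive")
    return 100.0 * od_with_urea / od_without_urea

def interpret(index: float) -> str:
    # Hypothetical cut-offs: high avidity points to an infection acquired
    # months earlier; low avidity is compatible with a recent infection.
    if index >= 60:
        return "high avidity: older infection likely"
    if index <= 40:
        return "low avidity: recent infection possible"
    return "borderline: retest or confirm"

print(interpret(avidity_index(od_with_urea=0.9, od_without_urea=1.2)))  # 75% -> high
```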

Diagnosis of Toxoplasma infection during pregnancy is crucial in order to prevent congenital toxoplasmosis. Prenatal diagnosis involves performing real-time polymerase chain reaction (PCR) on amniotic fluid. The PCRs used often target the B1 gene of the parasite [1]. Upon delivery, PCR is performed on either the placenta or the cord blood serum in order to detect parasites. ISAGAs are also often performed. If the tests are positive, cord blood samples taken at one week of life are sent to a reference laboratory [1]. Follow-up serology is again performed at one month and then every two to three months. There have been recent advances in the postnatal diagnosis of toxoplasmosis. An ELISA that measures interferon-gamma levels upon stimulation of whole blood cells with Toxoplasma crude antigens has been developed; this method has proven to be both sensitive and specific [4].

In the case of immunocompromised patients, a quick diagnosis is essential because the infection can be fatal. Diagnosis relies on detecting parasites either by PCR or microscopy. Microscopic examination of Giemsa-stained tissues or smears is the quickest and least expensive method for diagnosing toxoplasmosis. However, poor sensitivity is the major pitfall of this method. PCR can also be performed on blood or cerebrospinal fluid (CSF) samples in order to detect parasite DNA. However, the degree of sensitivity attained by these PCRs is questionable and requires further investigation [1].

Leishmaniasis

Protozoans of the Leishmania genus are transmitted to humans via sand fly bites. Visceral leishmaniasis (VL), which is a lethal infection if left untreated, can also be transmitted by blood transfusions, organ transplants, and sharing of needles among intravenous drug users.

Direct parasitological methods, such as microscopy and culture, are the gold standard for diagnosing VL. These methods have high specificity but varying sensitivity. Direct detection of parasites is performed by microscopic examination of aspirates from spleen, bone marrow, or lymph nodes [5]. Using spleen samples increases sensitivity, but the procedure to obtain the aspirates carries a risk of internal bleeding. Parasite culturing from aspirates is widely used by reference laboratories.

Extensive research on the development of Leishmania serological assays has uncovered a myriad of candidate diagnostic antigens. The most promising antigens were the kinesin-related proteins; from this group, rK39 was the most extensively tested antigen [6–8]. The rK39 antigen has been used to develop an immunochromatographic strip test (ICT)-based rapid diagnostic test, which is advantageous for mass screening in endemic areas. This test requires a drop of peripheral blood and can be completed in approximately fifteen minutes [7]. Although the rK39 ICT rapid test was quite successful in Asia, it was often unable to detect Leishmania infections in African patients [5]. Additionally, rapid diagnostic tests still need standardization in order to become regular practice in clinical laboratories.

PCR is the main molecular tool for Leishmania diagnosis due to its high sensitivity and reliability. Commonly used PCR target sequences include ribosomal RNA genes, kinetoplast DNA, mini-exon derived RNA, and internal transcribed spacer regions [5]. Quantitative PCR is useful because it allows for the quantification of parasites as well as species typing. Furthermore, this technique can be used to monitor treatment efficacy. Unfortunately, equipment requirements as well as the high cost limit the use of PCR for mass screening purposes in the field. The introduction of loop-mediated isothermal amplification (LAMP) could facilitate the use of molecular techniques for diagnostics. LAMP is highly specific, carried out under isothermal conditions, quick, and requires less complicated equipment [5]. Moreover, reagents can be kept at room temperature, and there are no post-PCR steps. Assessment of drug treatment can also be carried out through the use of nucleic acid sequence based amplification (NASBA), which amplifies RNA sequences under isothermal conditions. Coupled to oligochromatography, NASBA can be used to monitor the progression from active disease to cure [9].

Chagas Disease (American Trypanosomiasis)

Chagas disease is the result of infection with the blood-borne protozoan Trypanosoma cruzi. The parasite is transmitted by the triatomine bug. The second most important mode of transmission is via contaminated blood (transfusions and organ transplants), and congenital transmission also occurs.

During the acute stage of Chagas disease, parasites can be observed in the blood. For this reason, diagnosis is carried out by direct microscopic examination of Giemsa-stained thin and thick blood smears [10]. Parasites may also be detected through the use of hemocultures. In Chagas-endemic areas, xenodiagnosis may be performed; this method involves allowing a naïve triatomine bug to take a blood-meal from the patient, and then analysing the bug for the presence of trypanosomes. It is believed that, with continued research, molecular methods will eventually replace indirect diagnostic techniques such as blood cultures and xenodiagnosis [10]. However, molecular tests need to be standardized for routine clinical practice.

During the chronic stage of Chagas disease, diagnosis relies on serology; however, these tests often yield results that are difficult to interpret [10]. Commonly used, standardized serological assays include indirect immunofluorescence (IIF), indirect hemagglutination (IHA), and ELISA. IIF and IHA are commonly used due to their good sensitivity; however, their results are operator-dependent, and there is a lack of studies analysing their reproducibility [10]. Currently, immunoblot and radioimmunoprecipitation assays are in the process of being standardized; both showed promise in early studies. A great deal of work is also focused on the development and standardization of molecular methods such as PCR, which could be useful in monitoring the chronic phase, reactivation, and treatment response.

As previously mentioned, disease transmission can also occur from mother to child, leading to congenital Chagas disease. Screening of neonates can be performed via direct methods, such as microscopy, or by PCR using venous or cord blood samples from the newborn. These tests have very high sensitivity when performed during the first month of life [10]. Serological analysis may also be performed.

Sleeping Sickness (African Trypanosomiasis)

Trypanosoma brucei is the causative agent of African trypanosomiasis, and it is transmitted via the bite of the tsetse fly. During the first stage of the disease, parasites can be found circulating in the peripheral blood. The second stage is marked by parasites crossing the blood-brain barrier and infecting the central nervous system (CNS). The parasitic subspecies dictates geographic distribution, prognosis, and diagnosis.

T. b. gambiense causes West African trypanosomiasis, which is a slow-progressing disease characterized by low parasite loads [11]. Definitive diagnosis is carried out by microscopic examination of blood, lymph node aspirate, or CSF for the presence of parasites. In the field, the card agglutination test for trypanosomiasis (CATT/T. b. gambiense) has been widely used since its development in 1978 [12]. Whole blood is used, and the assay directly detects T. b. gambiense-specific antibodies. CATT/T. b. gambiense is cheap, quick, and highly sensitive. However, the test can give rise to false positives in individuals who are co-infected with malaria [12]. Although CATT/T. b. gambiense is the most sensitive, similar tests such as micro-CATT and LATEX/T. b. gambiense can also be used. If these assays generate positive results, they need to be confirmed by microscopy or molecular methods.

T. b. rhodesiense causes East African trypanosomiasis, which progresses quickly and is characterized by high parasite loads [11]. For this subspecies, there is no diagnostic equivalent to the CATT/T. b. gambiense. However, diagnosis by microscopic observation of thick and thin smears is simple due to the elevated parasite load associated with T. b. rhodesiense.

Microscopy is the most practical technique for use in rural areas. However, microscopy requires adequately qualified personnel in order to prevent misdiagnosis. Molecular methods would substantially improve the diagnosis of African trypanosomiasis. PCR techniques have been developed to screen the CSF of patients. The discovery of the SRA gene in T. b. rhodesiense has proven to be a breakthrough for the promotion of PCR techniques; reactions targeting this gene have the potential to identify a single trypanosome [11]. Fluorescence in-situ hybridization in combination with peptide nucleic acid probes targeting ribosomal RNA has also been introduced. However, these diagnostic tools are new and require further optimization. Extensive research is focused on standardizing molecular techniques and rendering them more accessible. The use of LAMP is a step forward in improving molecular approaches [11].

Future research needs to focus on the improvement of molecular diagnostic techniques. Currently, second-stage infections are diagnosed by microscopic observation of CSF. Research is being conducted to test various cytokines and antibodies as biomarkers for CNS infection [11].

Malaria
Malaria is the most important parasitic infection in the world due to its high mortality. The causative agents, parasites of the Plasmodium genus, are transmitted by Anopheles mosquitoes. Quick diagnosis is essential in order to determine the appropriate treatment as well as to prevent further transmission.

Microscopy is the gold standard for laboratory diagnosis. This method involves detecting parasites in Giemsa-stained thick and thin blood smears. However, microscopic results are operator-dependent, causing the sensitivity to vary. A great deal of effort has been focused on developing rapid diagnostic tests (RDTs) which can be used in the field. These tests can supplement microscopy, but they cannot yet replace it. Current RDTs are serology based and use three different Plasmodium antigens: Plasmodium histidine-rich protein, Plasmodium lactate dehydrogenase, or Plasmodium aldolase [13]. These tests are quick, easy to perform, and require minimal patient sample. However, they are not specific for species such as P. malariae, P. ovale, and P. knowlesi. Furthermore, false positives may be observed due to cross-reactions in patients with Schistosoma mekongi infection or rheumatoid factor [14]. In addition, the tests detect P. falciparum infections from South America inefficiently, as strains that do not produce the common histidine-rich proteins circulate there [15].

Currently, there are no commercially available molecular assays for malaria. Although some reference and government laboratories have developed their own molecular assays, their availability is limited. LAMP is currently in the spotlight. Poon et al. developed a LAMP test which detected a target sequence in the P. falciparum 18S ribosomal RNA gene [16]; they stated that the price of this test was one tenth that of a conventional PCR. Recently, LAMP was further simplified in the form of a card test, used in combination with DNA filter paper and melting curve analysis; this system was shown to be highly specific and sensitive [17]. Improvement of the LAMP technique should be geared towards the development of rapid diagnostic tests which could potentially be used in the field.

Babesiosis
Babesiosis is caused by parasites belonging to the Babesia genus that are spread by certain ticks commonly found in North America. The parasites infect red blood cells (RBCs) and consequently cause hemolytic anemia. The disease can be fatal in splenectomized patients, immunocompromised individuals, and the elderly. Diagnosis is complicated by the symptoms’ resemblance to those of other tick-borne illnesses.

The gold standard of babesiosis diagnosis relies on detecting the parasites in the patient’s RBCs. This is achieved by microscopic observation of thick and thin blood smears. Babesia infections can easily be mistaken for P. falciparum infections [18]. Additionally, false negatives are common in immunocompetent individuals, whose parasitemia can be lower than 1% [18]. Samples are often sent to reference laboratories in order to confirm ambiguous results. IIF assays are used to detect anti-babesial IgM and IgG [18]; they are sensitive, specific, and reliable. ELISAs and immunoblots, although not standardized, can be performed to confirm the IIF results. However, compared with IIF, Babesia-detecting ELISAs require higher concentrations of antigen and have varying sensitivity [18]. Future research on babesiosis diagnosis is aimed at developing multiplex PCR assays that will be able to detect several tick-borne infections. PCR assays have the potential to yield positive results from 100 µl blood samples containing as few as three parasites, demonstrating the incredible advantage that molecular techniques could contribute to the diagnosis of this parasitic disease [18].

Proteomics
Dr Momar Ndao’s laboratory focuses on the improvement and advancement of diagnosis. Through our work, we hope to encourage the development of proteomic strategies for the diagnosis of parasitic infections. Mass-spectrometry platforms are the future of proteomics, and they can be used to identify biomarkers from biological fluids. Techniques that can be used to analyse protein expression include matrix-assisted laser desorption ionization time-of-flight mass-spectrometry (MALDI-TOF MS), surface-enhanced laser desorption ionization time-of-flight mass-spectrometry (SELDI-TOF MS), liquid chromatography combined with mass-spectrometry, isotope-coded affinity tags, and isobaric tags for relative and absolute quantification [19]. When SELDI is used, samples are spotted directly onto chemically active ProteinChip Array surfaces, which can be chosen based on specific chemical and biological properties. With MALDI, samples are mixed with the matrix component prior to loading on a chip. These proteomic platforms can be useful in identifying biomarkers that are indicative of a specific pathophysiological state. Currently, members of our laboratory are using both SELDI and MALDI techniques extensively to identify biomarkers of blood-borne parasites.

Summary
Quick and correct diagnosis of parasitic infections is crucial to avoid deaths and further disease transmission. Diagnostic methods include parasitological techniques, such as microscopy and culturing, serological assays, and molecular tests [Table 1]. Although several serological and molecular diagnostic tools are being tested and used by certain reference laboratories, results are always confirmed by microscopy, which remains the gold standard. Many newer assays have not yet been standardized, forcing diagnosticians to rely on microscopic observations. Unfortunately, the evolution of diagnosis in the field of parasitology has been slow. Fortunately, in recent years, several groups have focused their research on the improvement of diagnostics. Current research emphasizes the development and optimization of molecular techniques such as PCR and LAMP. Additional work must concentrate on rendering molecular diagnostics more accessible. Although relatively new at the moment, proteomic platforms seem to be the future of diagnosis. These new techniques can identify biomarkers which can categorize susceptible individuals, distinguish between the different stages of an infection, and monitor whether treatments lead to cure. Diagnostic research has progressed considerably; however, there is still a great deal of work to be done and improvements to be made. In order to improve the diagnosis of blood-borne parasitic infections, research combined with communication is the answer.

References
1. Robert-Gangneux F and Darde ML. Clin Microbiol Rev 2012; 25: 264–96.
2. Gras L, et al. Epidemiol Infect 2004; 132: 541–8.
3. Lefevre-Pettazzoni M, et al. Clin Vaccine Immunol 2007; 14: 239–43.
4. Chapey E, et al. J Clin Microbiol 2010; 48: 41–5.
5. Srividya G, et al. Parasitol Res 2012; 110: 1065–78.
6. Badaro R, et al. J Infect Dis 1996; 173: 758–61.
7. Chappuis F, et al. Trop Med Int Health 2006; 11: 31–40.
8. Singh S, et al. Clin Diagn Lab Immunol 2002; 9: 568–72.
9. Saad AA, et al. PLoS Negl Trop Dis 2010; 4: e776.
10. Lescure FX, et al. Lancet Infect Dis 2010; 10: 556–70.
11. Welburn SC, et al. Adv Parasitol 2012; 79: 299–337.
12. Magnus E, et al. Ann Soc Belg Med Trop 1978; 58: 169–76.
13. Wilson ML. Clin Infect Dis 2012; 54: 1637–41.
14. Leshem E, et al. J Clin Microbiol 2011; 49: 2331–2.
15. Gamboa D, et al. PLoS One 2010; 5: e8091.
16. Poon LL, et al. Clin Chem 2006; 52: 303–6.
17. Yamamura M, et al. Jpn J Infect Dis 2009; 62: 20–5.
18. Hunfeld KP, et al. Int J Parasitol 2008; 38: 1219–37.
19. Ndao M. Interdiscip Perspect Infect Dis 2009; 2009: 278246.

The authors
Alessandra Ricciardi, BSc
National Reference Centre for Parasitology, Research Institute of the McGill University Health Center, Montreal, Canada

Momar Ndao, DVM, MSc, PhD
National Reference Centre for Parasitology at the Montreal General Hospital, Montreal, Quebec, Canada
E-mail: momar.ndao@mcgill.ca


Molecular diagnosis and sub-speciation of cutaneous leishmaniasis

Diagnosing cutaneous leishmaniasis histologically depends on the identification of amastigotes, which may be inconclusive and can lead to missed or incorrect diagnoses. In this article, we describe a rapid molecular method for Leishmania species identification and differentiation using DNA extracted from formalin-fixed paraffin-embedded (FFPE) skin tissue biopsies.

by L. Yehia and Dr I. Khalifeh

Clinical background
Cutaneous leishmaniasis is a chronic disease, caused by Leishmania protozoan parasites, that is on the increase in endemic and non-endemic regions because of environmental changes triggered by humans [1, 2]. It is most prevalent in the Middle East and North Africa. With changes in vector (sandfly) habitat and increased travel among populations, the incidence of leishmaniasis is showing a clear increase [3].

There are more than 20 strains of Leishmania that are pathogenic to humans [4], and this variety is partially responsible for the clinical diversity of the disease. The diagnosis of cutaneous leishmaniasis rests on the pathological identification of the amastigotes, which may be inconclusive [5], depending on the strain type, host response and disease stage. Accurate microscopic diagnosis is essential to permit appropriate targeted therapy [6].

Clinically, cutaneous leishmaniasis may be asymptomatic and self-limiting. However, cases progressing to mutilating ulceration and disfiguring scarring have also been reported [7]. As the disease progresses, the number of amastigotes decreases to the point where none can be detected microscopically. The absence of amastigotes is a common problem encountered in up to 47% of cases [8]. In such instances, the diagnosis of cutaneous leishmaniasis must not be excluded [4].

Materials and methods
Skin biopsies embedded in FFPE tissue blocks were collected for 122 patients diagnosed clinically with cutaneous leishmaniasis. Cases included in the study were restricted to cutaneous lesions of patients who had not received treatment prior to biopsy. Cases with visceral or mucocutaneous involvement, or with material insufficient for PCR or histopathological examination, were excluded. Clinical information pertaining to the lesion was also collected, including number, duration, location and dermatologic appearance. In addition, the patient’s age, gender and country of residence were tabulated.

Cases were classified according to the modified Ridley’s parasitic index, a traditionally used pathological scoring system based on microscopic analysis of hematoxylin and eosin stained slides. DNA was then extracted from FFPE tissue blocks of each patient. Polymerase chain reaction (PCR) was performed using Leishmania-specific ribosomal internal transcribed spacer 1 (ITS1-PCR). Nested ITS1-PCR was performed on cases negative for conventional ITS1-PCR. ITS1-PCR amplicons were then digested with HaeIII for subsequent restriction fragment length polymorphism (RFLP) subspeciation.

Results
Of the 122 skin biopsies, microscopic evaluation of stained slides identified 54 cases (44.3%) labeled as histologically negative (with no unequivocal amastigotes detected). Of these negative cases, 9 (17%) were shave biopsies and 45 (83%) were punch biopsies.

The DNA extracted from the FFPE tissue blocks of all cases ranged from 4 to 1672 ng/μl (mean=213 ng/μl, SD=289 ng/μl). The oldest blocks were 19 years old, whereas the newest were less than 1 year old. The mean yield of DNA extracted from blocks dating back to 1992 was 166 ng/μl (SD=128 ng/μl), whereas that for specimens from 2010 was 272 ng/μl (SD=161 ng/μl), indicating that a good quantity of DNA can be extracted from well-preserved archival FFPE tissues, even old ones.

ITS1-PCR was performed on DNA extracted from all cases. Initially, and regardless of the histopathological analysis, 55 (45%) cases were positive, showing a band of between 300 and 350 base pairs indicative of Leishmania on agarose gel electrophoresis; the remaining 67 (55%) were negative (Fig. 1A, B). The negative cases were subjected to nested ITS1-PCR, and all of them proved positive for Leishmania (Fig. 1C).

Comparing the resultant ITS1-PCR bands with the DNA pattern of normal skin tissue, we identified 54 cases – shown as negative by histopathology according to Ridley’s parasitic index – that amplified DNA with Leishmania-specific primers by conventional or nested ITS1-PCR and failed to show the normal skin profile seen in the negative controls. RFLP analysis identified the L. tropica subspecies in all cases, as indicated by the presence of 200 and 60 base pair restriction fragments (Fig. 2) [9].
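
To make the RFLP subspeciation step concrete, the sketch below matches observed HaeIII fragment sizes against a reference table. Only the L. tropica pattern (200 + 60 bp) comes from this study; the size tolerance, and the idea of extending the table with other species’ patterns, are assumptions for illustration that a real assay would take from a validated reference.

```python
# Minimal sketch of HaeIII RFLP subspeciation of the ITS1 amplicon.

FRAGMENT_PATTERNS = {
    frozenset({200, 60}): "Leishmania tropica",  # pattern reported here [9]
    # validated patterns for other species would be added here
}

def subspeciate(observed_bp, tolerance=10):
    """Match observed fragment sizes (bp) against known HaeIII patterns."""
    for pattern, species in FRAGMENT_PATTERNS.items():
        if len(pattern) == len(observed_bp) and all(
            any(abs(obs - ref) <= tolerance for obs in observed_bp)
            for ref in pattern
        ):
            return species
    return "unresolved: confirm by sequencing"

print(subspeciate([202, 58]))  # -> Leishmania tropica
```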

Clinical and diagnostic significance
Cutaneous leishmaniasis is endemic in many regions of the world. With the ease of modern travel, human and animal reservoirs of Leishmania parasites have become established in regions that previously were not known to harbour the sandfly vector because of habitat incompatibility. Thus, novel endemic areas have emerged across the world, and a high index of suspicion becomes crucial for early diagnosis and control of leishmaniasis. With the advent of molecular diagnostic techniques, with their high sensitivity and specificity, it has become easier to detect and control many infectious diseases, including leishmaniasis, as shown in this and other studies.

Traditionally, direct detection of parasites is performed by microscopic examination of clinical specimens or by cultivation, but either approach may be diagnostically problematic [1, 4, 10]. Cultures may take long periods, possibly weeks, for sufficient parasites to grow for species characterization. In addition, success in microscopic identification of amastigotes in stained preparations varies depending on the number of parasites present and/or the experience of the person examining the slide [11]. This is mainly because all Leishmania species are morphologically similar and may present with a variable number of amastigotes. As the disease progresses, the number of amastigotes decreases to the point where none can be detected histopathologically.

Despite these drawbacks, microscopic identification and parasite cultivation are still the primary diagnostic tools used in most regions where leishmaniasis is endemic, even though accurate and rapid species identification is not possible using either technique. In the last decade, polymerase chain reaction (PCR) analysis has been successfully introduced and has proven to be the most sensitive molecular tool for the direct detection and characterization of Leishmania species in clinical samples [1, 5, 12].

Accurate Leishmania species identification and subspeciation in clinical specimens is now possible by subjecting the extracted DNA to PCR, followed by enzymatic digestion to identify restriction fragments indicative of the subspecies. Such amplification using Leishmania-specific primers allows the indirect yet conclusive detection of amastigotes when they are present in a given clinical specimen. A highly sensitive method is especially valuable in chronic cases, where the parasitic index is low and potentially undetectable by conventional microscopy.

Conclusion
This study successfully identified L. tropica in 54 histologically negative skin biopsies from patients clinically suspected of having cutaneous leishmaniasis. The importance of this result is manifested in the need for diagnostic tools that are sensitive, specific, rapid and capable of identifying all clinically significant Leishmania species from FFPE tissue blocks (Fig. 3).

Therefore, ITS1-PCR carried out on DNA extracted from FFPE tissue specimens, followed by HaeIII RFLP analysis, is a valuable method for the rapid and reliable diagnosis of cutaneous leishmaniasis. In chronic cases where the parasite load is low, or when insufficient tissue is available, nested ITS1-PCR can be performed to increase sensitivity. Further advantages of this method are the possibility of using different biological specimens and the ability to detect both Old World and New World leishmaniasis.

The work summarized here was first published as Yehia L. et al., 2012 [13].

References
1. Schonian G, et al. Diagn Microbiol Infect Dis 2003; 47: 349.
2. Goto H, Lindoso JA. Expert Rev Anti Infect Ther 2010; 8: 419.
3. Scarisbrick JJ, et al. Travel Med Infect Dis 2006; 4: 14.
4. Ameen M. Clin Exp Dermatol 2010; 35: 699.
5. Singh S, et al. Expert Rev Mol Diagn 2005; 5: 251.
6. Salman SM, et al. Clin Dermatol 1999; 17: 291.
7. David CV, Craft N. Dermatol Ther 2009; 22: 491.
8. Safaei A, et al. Dermatology 2002; 205: 18.
9. Kazemi-Rad E. Iran J Public Health 2008; 37: 54.
10. Farah FS, et al. Arch Dermatol 1971; 103: 467.
11. Bensoussan E, et al. J Clin Microbiol 2006; 44: 1435.
12. Schonian G, et al. Trends Parasitol 2008; 24: 135.
13. Yehia L, et al. J Cutan Pathol 2012; 39: 347–355.

The authors
Lamis Yehia, BSc
Biomedical Sciences Training Program, Case Western Reserve University in Cleveland, Ohio, USA

Ibrahim Khalifeh, MD
Department of Pathology and Laboratory Medicine, American University of Beirut, Beirut, Lebanon

E-mail: ik08@aub.edu.lb


Real-time RT-PCR is the gold standard for Norovirus laboratory diagnosis

Noroviruses are the most common cause of viral gastroenteritis in humans. In recent years diagnostic methods for Noroviruses, especially real-time reverse transcription-polymerase chain reaction (RT-PCR) for the detection of Norovirus RNA, have improved and become more widely available.

by Dr Christoph Metzger-Boddien

Noroviruses are transmitted by fecally contaminated food or water, by person-to-person contact, and via aerosolization of the virus and subsequent contamination of surfaces. They are the most common cause of viral gastroenteritis in humans [15]. Symptoms include nausea, vomiting, diarrhea, and stomach cramping. Additional symptoms are fever, chills, headache, muscle aches and a general sense of tiredness. The onset of symptoms can be rapid, and an infected person may feel sick after a very short period of time. In most people the illness lasts for about one or two days. People with Norovirus illness are contagious from the onset of symptoms until at least three days after recovery, and some may be contagious for even longer. Noroviruses are highly contagious: the estimated infectious dose is as low as 18 viral particles, and approximately 5 billion infectious doses can be present in each gram of feces during peak shedding [16]. Infection can be more severe in young children and elderly people. Dehydration can occur rapidly and may require medical treatment or hospitalization [10].

Sporadic disease
In recent years diagnostic methods for Noroviruses, especially real-time reverse transcription-polymerase chain reaction (RT-PCR) for the detection of Norovirus RNA, have improved and become more widely available. Subsequently, it became obvious that Noroviruses are the leading cause of sporadic gastroenteritis in all age groups. In Germany, since the implementation of the notification requirement according to §§6 and 7 of the infection protection act (Infektionsschutzgesetz, IfSG), a rise in reported cases has been observed, with a seasonal accumulation during the winter months from October to March (2001: 9,223 cases; 2004: 64,973 cases; 2007: 201,242 cases; 2008: 212,769 cases; source: Robert Koch Institute, RKI, Berlin), but a high estimated number of cases still go unreported.

Outbreaks
Noroviruses are the predominant cause of gastroenteritis outbreaks worldwide. Data from the United States and European countries show that Norovirus is responsible for approximately 50% of all reported gastroenteritis outbreaks (range: 36%–59%) [12]. Periodic increases in Norovirus outbreaks are associated with the emergence of new GII.4 strains. These emergent GII.4 strains rapidly replace the existing strains predominating in circulation and sometimes cause seasons with high Norovirus activity, as in 2002–2003 and 2006–2007 [17, 20]. Genetic drift successfully promotes the re-emergence of GII.4 variants in the population [13]. Because the virus can be transmitted by food, water and contaminated environmental surfaces as well as directly from person to person, and because there is no long-lasting immunity to Noroviruses, outbreaks can occur in a variety of institutional settings (e.g. nursing homes, hospitals, and schools) and affect people of all ages. Multiple routes of transmission can occur within an outbreak; for example, point-source outbreaks from a food exposure often result in secondary person-to-person spread within an institution or community [4]. Of the 1,518 Norovirus outbreaks in the USA during 2010–2011 that were laboratory confirmed by the CDC, 59% were from long-term care facilities (889 outbreaks); 8% were from restaurants (123 outbreaks); 7% were from parties and events (99 outbreaks); 4% were from hospitals (65 outbreaks); 4% were from schools (64 outbreaks); 4% were from cruise ships (55 outbreaks); and 14% were from other and unknown settings (223 outbreaks) [10].
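
As a quick sanity check of the breakdown above, the short sketch below recomputes the category shares from the quoted counts; because each percentage is rounded independently, an individual share may differ from the prose by a point.

```python
# Recompute the outbreak-setting shares from the CDC counts cited above.

outbreaks = {
    "long-term care facilities": 889,
    "restaurants": 123,
    "parties & events": 99,
    "hospitals": 65,
    "schools": 64,
    "cruise ships": 55,
    "other/unknown": 223,
}

total = sum(outbreaks.values())
assert total == 1518  # matches the total given in the text

for setting, n in outbreaks.items():
    print(f"{setting}: {n} ({100 * n / total:.0f}%)")
```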

Foods that are commonly involved in outbreaks of Norovirus infection include leafy greens, fresh fruits, and shellfish. However, any food that is served raw or is handled after cooking can become contaminated.

In Germany, according to data published by the RKI, the number of Norovirus outbreaks has increased by 20% between 2009 and 2010. Recently, the RKI published the final report of a huge outbreak of acute gastroenteritis in five Eastern German federal states. The source of the outbreak was a batch of deep-frozen strawberries. In total, over 11,000 cases of disease occurred. It was Germany’s largest foodborne outbreak of gastroenteritis, with several hundred institutions affected. In a considerable proportion of tested patients, Noroviruses were found [4].

Analysis of outbreak costs
In fact, Norovirus-associated disease has a huge socio-economic impact. A study by Johnston et al. (2007) showed the costs of an outbreak, including the estimated loss of revenue because of unit closures, sick leave and cleaning expenses [7]. Because of the high contagiousness of Noroviruses, early diagnosis, enabling appropriate hygiene interventions to be set up promptly, is the most useful measure. In 2004, Lopman et al. showed that diagnosing the first case within three days instead of four reduces the duration of an outbreak by seven days [5, 8].

Diagnostic methods
The clinical specimens used for Norovirus diagnosis are in most cases stool and vomit samples. No cell culture method is available for the isolation of Noroviruses from clinical specimens. Therefore, the majority of clinical virology laboratories perform RT-PCR assays for Norovirus detection. Additionally, for the preliminary identification of Norovirus as the cause of gastroenteritis outbreaks, enzyme immunoassays (EIAs) and rapid tests are available. However, these kits are not recommended for individual diagnosis.

Real-time RT-PCR assays
The region between ORF1 and ORF2 is the most conserved region of the Norovirus genome, with a high level of nucleotide sequence identity across strains within a genogroup [6]. This region is ideal for designing broadly reactive primers and probes for real-time RT-PCR (RT-qPCR) assays, both for high-throughput screening in clinical diagnostic laboratories and for the detection of Norovirus RNA in environmental samples (e.g. food and water).

The quality of real-time RT-PCR results depends on the quality of template RNA extraction from clinical and environmental samples. The implementation of extraction controls in commercial RT-PCR duplex assays (e.g. Control-RNA in the MutaREX Norovirus Kit, Immundiagnostik AG, Bensheim, Germany) minimizes the risk of false negative results due to inhibition or partial inhibition of the reverse transcription step and/or the PCR, and due to processing errors during the extraction of RNA. The control RNA is added to a sample before RNA extraction with a commercial kit (e.g. High Pure Viral RNA Kit, Roche Diagnostics GmbH, Mannheim, Germany; or intron viral gene spin, gerbion, Kornwestheim, Germany) and its recovery is subsequently measured in the duplex real-time RT-PCR. The latest generation of commercially available Norovirus real-time RT-PCR kits is extremely sensitive and specific [18]. Such tests have therefore become the gold standard for Norovirus laboratory diagnosis in the past few years.
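
The value of the spiked-in extraction control lies in how results are interpreted. Below is a minimal sketch of that interpretation logic, assuming hypothetical Ct cut-offs; a real kit defines its own validated thresholds in the package insert.

```python
# Sketch of result interpretation for a duplex real-time RT-PCR
# with an internal extraction control (Ct values; None = no signal).

TARGET_CUTOFF = 40.0   # hypothetical: Ct above this counts as "no signal"
CONTROL_CUTOFF = 35.0  # hypothetical: control must amplify by this Ct

def interpret(target_ct, control_ct):
    target_pos = target_ct is not None and target_ct <= TARGET_CUTOFF
    control_ok = control_ct is not None and control_ct <= CONTROL_CUTOFF
    if target_pos:
        # A strong target can out-compete the control, so the control
        # is not required once the target itself amplifies.
        return "Norovirus RNA detected"
    if control_ok:
        return "Norovirus RNA not detected"
    # Neither target nor control amplified: extraction failure or
    # inhibition; the sample must not be reported as negative.
    return "invalid: repeat extraction / check for inhibition"

print(interpret(target_ct=None, control_ct=None))  # -> invalid
```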

Enzyme immunoassays
For the detection of Norovirus antigen in clinical samples, rapid assays (e.g. EIAs) offer an alternative to real-time RT-PCR. However, the development of a broadly reactive EIA for Noroviruses has been challenging because of the number of antigenically distinct Norovirus strains and the high viral load required for a positive signal in these assays. Commercial kits include pools of cross-reactive monoclonal and polyclonal antibodies. In evaluation studies, the sensitivity of these kits ranged from 36% to 80%, and their specificity from 47% to 100%, compared with real-time RT-PCR [1, 2, 3, 9, 11, 14, 19].

Summary
Norovirus real-time RT-PCR kits offer sensitive, specific, fast and cost-effective diagnosis; results can be generated within one hour. However, only real-time RT-PCR kits containing a control RNA, used as an extraction control for process monitoring, produce reliable results. RNA extraction from clinical specimens and the reverse transcription of RNA to cDNA are the most critical steps in Norovirus RT-PCR procedures. Errors in sample preparation and/or the RT reaction can lead to false negative results in conventional as well as real-time RT-PCRs when internal controls (RNA or DNA) are only added to the PCR master-mix. Laboratories performing in-house RT-PCR for Noroviruses should critically evaluate their tests against these quality standards. Because of the modest performance of Norovirus enzyme immunoassays, particularly their poor sensitivity, they are not recommended for the clinical diagnosis of Norovirus infection in sporadic cases of gastroenteritis, and negative EIA results have to be confirmed by real-time RT-PCR in outbreaks as well as in sporadic cases.

References
1. Burton-MacLeod JA, et al. J Clin Microbiol 2004;42:2587–95.
2. de Bruin E, et al. J Virol Methods 2006;137:259–64.
3. Dimitriadis A, et al. Eur J Clin Microbiol Infect Dis 2005;24:615–8.
4. Großer Gastroenteritis-Ausbruch durch eine Charge mit Noroviren kontaminierter Tiefkühlerdbeeren in Betreuungseinrichtungen und Schulen in Ostdeutschland, 09-10/2012. Epidemiologisches Bulletin Nr. 41/2012:414–7, 15 October 2012.
5. Hansen S, et al. J Hosp Infect 2007;65:348–53.
6. Hoehne M, et al. BMC Infect Dis 2006;6:69.
7. Johnston CP, et al. Clin Infect Dis 2007;45:534–40.
8. Lopman BA, et al. Emerg Infect Dis 2004;10:1827–34.
9. Morillo SG, et al. J Virol Methods 2011;173:13–6.
10. Norovirus. Centers for Disease Control and Prevention, CDC 24/7; 12 April 2012.
11. Okitsu-Negishi S, et al. J Clin Microbiol 2006;44:3784–6.
12. Patel MM, et al. J Clin Virol 2009;44:1–8.
13. Reuter G, et al. J Clin Virol 2008;42:135–40.
14. Richards AF, et al. J Clin Virol 2003;26:109–15.
15. Said MA, et al. Clin Infect Dis 2008;47:1202–8.
16. Teunis PF, et al. J Med Virol 2008;80:1468–76.
17. Vega E, et al. Emerg Infect Dis 2011;17:1389–95.
18. Vennema H, et al. QCMD Norovirus 2011 EQA Programme Final Report; December 2011.
19. Wilhelmi de Cal I, et al. Clin Microbiol Infect 2007;13:341–3.
20. Yen C, et al. Clin Infect Dis 2011;53:568–71.

The author
Christoph Metzger-Boddien, PhD
gerbion GmbH & Co. KG
Remsstr. 1, D-70806 Kornwestheim, Germany

Quality control: the emergence of risk-based analysis

One of the fastest-paced developments in clinical laboratories has been in the area of quality control (QC) systems. The driver has been the increase in the performance and sophistication of QC software, which has progressively tightened benchmarks for acceptable standards. On the plus side, improved QC systems clearly help a laboratory to serve the needs of patients more efficiently. Less clear is the impact of the latest, paradigm-shifting QC guideline, known as EP-23; it is so far restricted to the US (where it originates), but is likely to have a major impact on Europe.

Quality control in a laboratory concerns the routine operational and technical activities used to verify that a particular test is conducted correctly. The main aims of QC software are to ensure the validity of both test methodology and results, to define and set acceptable standard deviations (SDs), and to correct errors if they occur (ideally before they do so), or to flag them as such.
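
As an illustration of the kind of SD-based control rules such software applies, here is a minimal sketch of two classic Westgard-style rules; the rules themselves are standard, but the implementation and the example numbers are our own.

```python
# Sketch of two Westgard-style control rules (1_3s and 2_2s).

def z_scores(results, mean, sd):
    """Express each control result as a deviation from the target mean, in SDs."""
    return [(x - mean) / sd for x in results]

def violates_1_3s(z):
    # 1_3s: any single control observation beyond +/-3 SD -> reject the run.
    return any(abs(v) > 3 for v in z)

def violates_2_2s(z):
    # 2_2s: two consecutive observations beyond +/-2 SD on the same side.
    return any(a > 2 and b > 2 or a < -2 and b < -2 for a, b in zip(z, z[1:]))

controls = [101.0, 104.8, 105.2]           # hypothetical control results
z = z_scores(controls, mean=100.0, sd=2.0) # target mean/SD of the QC material
print(violates_1_3s(z), violates_2_2s(z))  # False True -> flag the run
```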

There is a wide variety of software for laboratory QC. Market leaders such as Westgard and Bio-Rad supply end-to-end solutions. Other vendors provide application specific software, for example Hematronix’s Real-Time and Quantimetrix’s Quantrol Online for monitoring performance against peers, Boston Biomedica’s AccuChart for infectious disease testing, etc.

Staff challenges
The capability of the staff who run laboratory tests is a major issue, but this has long been attended to – in terms of accreditation of study programmes and training courses as well as requirements for continuing education to stay abreast of developments in the field. In the US, the Clinical Laboratory Improvement Amendments Act (1988) legally requires laboratory staff to be up to the mark.

On the other hand, staffing has recently begun posing another set of problems, because many of the laboratory personnel who ushered in the IT era have begun to retire. In spite of high levels of unemployment, finding an adequately qualified pool of new recruits is proving to be a major problem in the US [1].

As with any other core systems software, QC has been in perpetual evolution – a result of ever-changing regulations and market forces. Even the most intuitive and adaptive software requires people, experienced people, to tweak and adapt the programs in order to get them to work well and deliver the best results within a particular environment. QC is no exception.

The need for qualified personnel is set to increase dramatically as US laboratories shift away from the current system of equivalent QC to risk-based analysis, which rests on a more scientifically rigorous methodology. The US has decided to abandon equivalent QC completely in favour of the new risk-based system, known as EP-23.

Equivalent versus risk-based QC
Standard operating procedures (and inbuilt IT system capabilities) for equivalent QC usually entailed running controls just once a month. In the case of an aberration, the entire month’s load of patients (or over 8% of the annual total) needed to be recalled, samples retaken and tests rerun. Scheduling the re-tests alongside a current batch almost invariably led to capacity bottlenecks, which could then spill over into subsequent months. Risk-based analysis is meant to do away with such contingencies.

Nevertheless, risk-based analysis also means more complex software and more human intervention. It requires identifying potential error sources in a test or device, and implementing (external or ‘wet’) controls to reduce the risk. Meanwhile, the pathways to implement EP-23 remain somewhat nebulous. Proponents of risk-based analysis acknowledge its complexity, but argue that the costs of error under equivalent QC far outweigh the costs of that complexity, not only in terms of re-running tests but also in the case of a wrong diagnosis.

EP-23 will drive need for skilled lab staff
Clearly, EP-23 will rely heavily on experienced laboratory personnel. The Clinical and Laboratory Standards Institute (CLSI), the US professional body mandated with establishing EP-23, notes [2]:

“The decision of how the laboratory performs its risk assessment to develop a quality control plan (QCP) will be up to the laboratory director. Some tests analysed on the same analyser may have risks of error so similar that they can be grouped on the same QCP, with only minor additions or deletions for individual tests, while other tests on the same analyser may have significantly greater, or lesser, risks and need a completely different approach to a QCP.” It also acknowledges that there “is no specific format that is required for the presentation of a QCP.”

In an official presentation on EP-23 by the Centers for Disease Control [3], CLSI goes on to add: “Labs will receive guidance to enable them to develop effective, cost-efficient QC protocols that will ensure appropriate application of local regulatory requirements based on the technologies selected by the lab and reflective of the lab’s unique environmental aspects. Labs will receive guidance to develop QC processes and procedures to reduce negative impact of test system’s limitation, while considering laboratory environmental/operator factors like personnel competency, temperature, storage conditions, clinical use of test results, etc.”

In such a scenario, a looming shortage of qualified personnel would hardly help.

Large laboratories clearly have an edge in being ready for a shift to EP-23, since they can afford to recruit specialist consultants to manage the changeover. For their smaller counterparts, the outlook is likely to be very different.

Europe and EP-23
The impact of EP-23 on Europe remains to be seen. At present, the EU has made no official comment, in spite of the issues that could inevitably arise, for example within the framework of the International Conference on Harmonisation (ICH).

Part of the reason for this nonchalance may simply be that there is no similar European laboratory QC standard. Indeed, several EU countries have their own national systems covering QC in laboratories – for example Belgium’s Directive pratique pour la mise en place d’un système qualité dans les laboratoires agréés dans le cadre de l’INAMI, France’s Guide de bonne exécution des analyses de biologie clinique, and Britain’s CPA Manual for Laboratory Accreditation.

For its part, European standard EN 45001, currently recommended for laboratories, is far broader in scope than EP-23. It covers not only QC but also technical competence, human resources, organizational structure, document management and much more. It is based on the international ISO Guide 25, which is currently under revision and is due to replace EN 45001.

US proponents of globalizing EP-23 note that it, too, draws its inspiration from an accepted ISO standard, ISO 14971. Between the sweeping generalities of EN 45001 and the differing national systems in place for laboratory QC, it may be hard to argue against EP-23 as a good path forward for Europe too.

References
1. http://www.healthcareitnews.com/news/lab-staff-shortages-call-better-point-care-diagnostics
2. http://www.clsi.org/Content/NavigationMenu/Education/EP23QA/EP23_Q_A.htm
3. http://wwwn.cdc.gov/cliac/pdf/Addenda/cliac0908/Addendum%20N.pdf


New technology allows previously esoteric testing to be performed in a core laboratory

FilmArray is a highly automated small instrument capable of detecting infectious agents using PCR technology. Due to its simplicity, the tests can be performed in a rapid-response core laboratory by general medical technologists. This operational model has reduced turn-around-time and thus improved patient care.

by Dr M. Xu, Dr X. Qin, Dr M. L. Astion and Dr J. C. Rutledge

Acute respiratory infection and the importance of early diagnosis
Acute respiratory infection is one of the major causes of outpatient visits and hospitalization in young children and older patients with chronic respiratory diseases. Most acute respiratory infections are caused by viral agents, whereas bacterial infections occur much less frequently. Occasionally, patients with viral infection, but without a definitive diagnosis, are given antibiotics unnecessarily. Viral respiratory infection in immunocompromised patients has significant morbidity and mortality implications, and early initiation of appropriate antiviral therapy can be life-saving. In addition, isolation of patients with viral respiratory infection plays a critical role in infection prevention. Therefore, laboratory tests providing accurate and timely determination of the infectious agents associated with respiratory diseases are crucial in clinical practice.

Methods of diagnosis of acute respiratory infection
Many diagnostic tests for respiratory viral infection are available. Point-of-care tests for detecting viral antigens have the shortest turn-around-time, usually just a few minutes. These rapid antigen tests are available for only a limited number of viruses, such as influenza A (Flu A), influenza B (Flu B) and respiratory syncytial virus (RSV), and their sensitivity is low, ranging from 20% to 80%, with a generally acceptable specificity if the tests are used during the respiratory virus season [1]. Direct fluorescence assay (DFA) has higher sensitivity (~80%) than rapid antigen tests and a reasonable turn-around-time (TAT) of a few hours [1]. However, these are complex assays requiring specialized and experienced technologists. Viral culture has long been considered the gold standard for the detection of respiratory viral infection, with the shortcoming of requiring days for definitive identification of the viral etiology.

In the past few years, several molecular tests have been developed to detect viral RNA or DNA using the polymerase chain reaction (PCR) method. One study compared the rapid antigen test, DFA, and viral culture with RT-PCR in the detection of influenza A H1N1 2009, and found sensitivities of only 18%, 39% and 46%, respectively [2]; the specificity of all methods, at over 90%, was not significantly lower than that of real-time PCR. The authors recommended that all DFA-negative results should be tested with real-time PCR. Although most molecular tests using PCR technology show high sensitivity and specificity, they are technically complex, time consuming, and require specialized medical technologists to perform. This type of molecular assay is usually available only in large reference laboratories or medical centres with specialized microbiology, virology, or molecular laboratories. These specialized laboratories usually do not operate during evening and night shifts and perform these tests in batches; therefore, the TAT for most molecular testing is relatively long, ranging from 6 to 24 hours.

Emerging new technology
FilmArray (BioFire, previously named Idaho Technologies; Salt Lake City, UT) is a newly developed small desk-top, single-specimen-flow instrument with a fully automated process for the detection of respiratory infectious agents by real-time PCR technology [3]. The respiratory panel performed on FilmArray is able to detect 17 viral agents, including adenovirus, coronavirus HKU1, coronavirus NL63, coronavirus 229E, coronavirus OC43, human metapneumovirus, rhinovirus/enterovirus, Flu A, Flu A H1, Flu A H1 2009, Flu A H3, Flu B, parainfluenza 1, 2, 3 and 4, and RSV, plus Bordetella pertussis, Chlamydophila pneumoniae, and Mycoplasma pneumoniae, from respiratory specimens. The test requires only 5 minutes of hands-on technologist time and 65 minutes of total analyser time. The testing pouch contains all the reagents for nucleic acid extraction, reverse transcription, and two steps of PCR amplification. The built-in software automatically analyses the specific melting curves of the PCR products and reports the results as positive or negative for specific infectious agents. General medical technologists with proper training are able to perform the test without any difficulty. Several comparative studies between FilmArray and other molecular tests for respiratory viral agents have shown comparable results for the detection of respiratory infectious agents [4–6].

Impact on TAT and patient care
Our rapid-response core laboratory (Core Lab) is staffed by approximately 35 full-time employees (FTEs). It provides tests of general chemistry, hematology, coagulation, urinalysis, blood gas, limited therapeutic drug monitoring, and a few rapid manual tests such as monospot, pregnancy test, and sickle screen. Our Core Lab also went through a major process improvement using the Toyota production system to streamline the testing workflow, with testing designed on a lean, single-piece-flow principle without batching [7]. Using these principles we eliminated STAT testing. All the tests performed in the Core Lab are standardized to meet a TAT of 1 hour, where TAT is defined as the time from sample receipt in the laboratory to the time the result is verified in the laboratory information system. To provide 24-hour per day, 7-day per week (24/7) service to our emergency department (ED) and urgent care centre, we implemented the FilmArray respiratory panel in the Core Lab [8]. Prior to implementing FilmArray testing, we sent our respiratory samples to a regional reference laboratory, which performed viral testing on-site using the DFA method. During the first 4 months of testing using FilmArray, we tested twice as many samples as in the same period of the previous year. The average TAT was reduced from 7 hours the previous year using DFA to 1.6 hours using FilmArray. With FilmArray, 82% of the tests were completed within 2 hours and 95% within 3 hours; previously, with DFA, none of the tests was completed within 2 hours and only 2% were completed within 3 hours. In addition, FilmArray detected 17 viral agents, whereas DFA detected only 8. The additional viral agents detected by FilmArray include 4 types of coronavirus, 3 additional types of Flu A, parainfluenza 4, and rhinovirus/enterovirus. Although no specific treatments exist for some of these viral agents, such as coronaviruses, parainfluenza virus and rhinovirus, detecting them allowed physicians to make a specific diagnosis, which gave patients reassurance and prevented further costly diagnostic work-up and unnecessary use of antibiotics.
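
For clarity, the sketch below shows how a TAT metric of this kind can be computed from receipt and verification timestamps; the sample data are invented purely for illustration.

```python
# TAT = time from sample receipt in the laboratory to result
# verification in the LIS, summarized as mean and the fraction
# of tests completed within 2 and 3 hours.

from datetime import datetime

samples = [
    # (received in lab, result verified in LIS) - hypothetical values
    (datetime(2013, 1, 7, 8, 5), datetime(2013, 1, 7, 9, 40)),
    (datetime(2013, 1, 7, 9, 12), datetime(2013, 1, 7, 10, 55)),
    (datetime(2013, 1, 7, 11, 0), datetime(2013, 1, 7, 14, 30)),
]

tat_hours = [(done - received).total_seconds() / 3600 for received, done in samples]

mean_tat = sum(tat_hours) / len(tat_hours)
within_2h = sum(t <= 2 for t in tat_hours) / len(tat_hours)
within_3h = sum(t <= 3 for t in tat_hours) / len(tat_hours)

print(f"mean TAT {mean_tat:.1f} h; {within_2h:.0%} within 2 h; {within_3h:.0%} within 3 h")
```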

After implementing the FilmArray respiratory panel, we also examined the effect of the shortened TAT on patients admitted to the ED. Current guidelines for treating patients positive for Flu A or Flu B with oseltamivir recommend administering the medication within 48 hours of symptom onset. We found that, owing to the fast TAT of respiratory viral testing, more than 80% of patients admitted to the ED were given the medication or a prescription in the ED or within 3 hours of discharge. This would have been impossible previously with DFA testing at the reference laboratory, with its average test TAT of 7 hours.

Finally, an additional clinical benefit of early detection of infectious agents is the ability to cohort patients effectively for appropriate isolation. As part of our hospital infection prevention policy, patients admitted with respiratory symptoms undergo FilmArray respiratory viral screening at no charge. Early and appropriate isolation of patients with respiratory symptoms has a potential positive impact on infection prevention and overall cost savings for both patients and hospitals. One example concerns two patients with respiratory symptoms who were scheduled for surgery: their respiratory viral testing results were negative for influenza virus, eliminating the need for strict isolation procedures, such as masks for staff and negative pressure in the operating room, that would otherwise have been required in the absence of test results.

Financial considerations
Although the price of the FilmArray respiratory viral panel is slightly higher than that of other conventional PCR methods, the labour saving due to its simplicity is substantial and offsets the supply costs. In addition, the sample requirement for the FilmArray test is a nasal swab rather than the nasal wash that was the sample of choice for the DFA respiratory viral assay. A nasal swab is much easier for nursing staff to collect; moreover, a nasal wash creates an aerosol that mandates room cleaning and a 30-minute room closure before the next use. The resulting saving in room time for a busy ED is difficult to calculate but significant. One report examined the financial consequences of reducing ED boarding (the length of time a patient stays in the ED) and found that a 1-hour reduction in boarding time would have yielded $9693 (~£6058) to $13,298 (~£8311) of additional daily revenue [9].
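
To put the cited figure in perspective, a back-of-the-envelope annualization (our own extrapolation, not a result from reference [9]) is straightforward:

    # Reported additional daily revenue from a 1-hour reduction in ED boarding [9].
    # Annualizing assumes the effect holds every day, a deliberate simplification.
    low_daily, high_daily = 9693, 13298   # US dollars per day
    print(f"~${low_daily * 365:,} to ~${high_daily * 365:,} per year")
    # prints: ~$3,537,945 to ~$4,853,770 per year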

Future trends
The simplicity of the FilmArray assay gives it the potential to spread into small general laboratories. BioFire Diagnostics Inc. is currently developing gastrointestinal, blood culture identification, and sepsis panels using FilmArray technology. The major current drawback of FilmArray is its restriction to single-sample throughput; improvements providing higher throughput will expand its utility in high-volume clinical laboratories.

In summary, owing to its simplicity and clinical utility, FilmArray is the first multiplex molecular test to enter the general clinical laboratory rather than remaining in a specialized laboratory, marking a new era in laboratory medicine. FilmArray significantly improves the diagnosis and care of patients with respiratory infections. Overall, new and emerging technologies like FilmArray will allow more infectious agents to be detected earlier and more accurately by instruments situated in general core laboratories operating 24/7, rather than in specialized laboratories, thereby speeding results.

References
1. Takahashi H, Otsuka Y, Patterson BK. Diagnostic tests for influenza and other respiratory viruses: determining performance specifications based on clinical setting. J Infect Chemother 2010; 16: 155–61.
2. Ganzenmueller T, Kluba J, Hilfrich B et al. Comparison of the performance of direct fluorescent antibody staining, a point-of-care rapid antigen test and virus isolation with that of RT-PCR for the detection of novel 2009 influenza A (H1N1) virus in respiratory specimens. J Med Microbiol 2010; 59: 713–7.
3. Poritz MA, Blaschke AJ, Byington CL et al. FilmArray, an automated nested multiplex PCR system for multi-pathogen detection: development and application to respiratory tract infection. PLoS One 2011; 6: e26047.
4. Loeffelholz MJ, Pong DL, Pyles RB et al. Comparison of the FilmArray Respiratory Panel and Prodesse real-time PCR assays for detection of respiratory pathogens. J Clin Microbiol 2011; 49: 4083–8.
5. Rand KH, Rampersaud H, Houck HJ. Comparison of two multiplex methods for detection of respiratory viruses: FilmArray RP and xTAG RVP. J Clin Microbiol 2011; 49: 2449–53.
6. Pierce VM, Elkan M, Leet M et al. Comparison of the Idaho Technology FilmArray system to real-time PCR for detection of respiratory pathogens in children. J Clin Microbiol 2012; 50: 364–71.
7. Rutledge J, Xu M, Simpson J. Application of the Toyota Production System improves core laboratory operations. Am J Clin Pathol 2010; 133: 24–31.
8. Xu M, Qin X, Astion ML et al. Implementation of FilmArray respiratory viral panel in a core laboratory improves testing turn-around-time and patient care. Am J Clin Pathol 2013; in press.
9. Pines JM, Batt RJ, Hilton JA et al. The financial consequences of lost demand and reducing boarding in hospital emergency departments. Ann Emerg Med 2011; 58: 331–40.

The authors
Min Xu, MD, PhD
Xuan Qin, PhD
Michael L. Astion, MD, PhD
Joe C. Rutledge, MD
Department of Laboratories, Seattle Children’s Hospital,
4800 Sand Point Way NE, A6901
Seattle, WA 98105, USA
E-mail: min.xu@seattlechildrens.org


ESBL NDP and Carba NP tests: novel techniques for rapid detection of multidrug-resistant bacteria

Two novel biochemical tests, the ESBL NDP and Carba NP tests, have recently been developed for the early detection of ESBL or carbapenemase production in Enterobacteriaceae. These tests are rapid, sensitive, specific and cost-effective, and their implementation in clinical microbiology laboratories may significantly improve the management and outcome of patients.

by Dr L. Dortet, Dr L. Poirel and Prof. P. Nordmann

Multidrug resistance is now emerging worldwide at an alarming rate among Gram-negative bacteria, causing both community-acquired and nosocomial infections [1–3]. One of the most important emerging resistance traits in Enterobacteriaceae is acquired resistance to broad-spectrum β-lactams, mainly associated with the production of clavulanic acid-inhibited extended-spectrum β-lactamases (ESBLs) [4, 5]. An ESBL is a β-lactamase that confers reduced susceptibility, i.e. resistance, to the oxyimino-cephalosporins (e.g. cefotaxime, ceftriaxone, ceftazidime) and monobactams (e.g. aztreonam). The hydrolytic activity of ESBLs can be inhibited by several β-lactamase inhibitors such as clavulanic acid and tazobactam. Notably, ESBLs usually do not hydrolyse cephamycins (e.g. cefoxitin and cefotetan) or carbapenems (imipenem, meropenem). In the context of the worldwide spread of multidrug resistance, ESBL producers, mostly Escherichia coli and Klebsiella pneumoniae, are a source not only of hospital-acquired but also of community-acquired infections [4–6]. Consequently, the last line of therapy, the carbapenems, is now frequently needed to treat severe infections. However, carbapenem-non-susceptible Enterobacteriaceae producing carbapenem-hydrolysing enzymes termed carbapenemases have been reported increasingly [1, 7, 8], leaving us with almost no effective molecules.

Thus, the early detection of ESBL and carbapenemase producers in clinical microbiology is now of utmost importance for determining appropriate therapeutic regimens and implementing infection control measures.

Recently, we have developed two novel tests for the rapid identification of (i) ESBL-producing Enterobacteriaceae (the ESBL NDP test) [9] and (ii) carbapenemase-producing Enterobacteriaceae and Pseudomonas spp. (the Carba NP test) [10–12]. We discuss here the clinical value of these tests.

Detection of ESBLs: Place of the ESBL NDP test in the diagnostic armamentarium
Current techniques for detecting ESBL producers are based on determining susceptibility to expanded-spectrum cephalosporins followed by demonstrating inhibition of ESBL activity, mostly by clavulanic acid or tazobactam [13]. The double-disk synergy test, the ESBL “E-test” and the combined disk method have been proposed for this purpose. All these techniques identify synergy between an expanded-spectrum cephalosporin (ESC) and a β-lactamase inhibitor (i.e. clavulanic acid or tazobactam) after 18–24 h of growth on Mueller-Hinton agar.

This synergy is visualized by (i) a champagne-cork-shaped (“bouchon de champagne”) enhancement of the inhibition zone between the ESC- and clavulanate-containing disks in the double-disk synergy test; (ii) a difference in minimal inhibitory concentration (MIC) of more than three doubling dilutions between the ESC alone and the ESC-clavulanate combination in the ESBL “E-test”; and (iii) an increase in inhibition zone diameter of more than 5 mm between a disk containing the ESC alone and a combined disk containing the same ESC plus clavulanate in the combined disk method.
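
These three readouts are simple decision rules. The sketch below is our own illustrative encoding (function names are ours, and the “more than three dilutions” criterion is expressed as a greater than eight-fold MIC ratio):

    def double_disk_positive(champagne_cork_zone):
        # Double-disk synergy test: positive if a champagne-cork-shaped
        # enhancement of the inhibition zone appears between the disks.
        return champagne_cork_zone

    def etest_positive(mic_esc, mic_esc_clav):
        # ESBL E-test: positive if the MIC of the ESC alone exceeds that of
        # the ESC-clavulanate combination by more than three doubling
        # dilutions, i.e. a ratio greater than 2**3 = 8.
        return mic_esc / mic_esc_clav > 8

    def combined_disk_positive(zone_esc_mm, zone_esc_clav_mm):
        # Combined disk method: positive if the zone around the ESC plus
        # clavulanate disk is more than 5 mm larger than around the ESC disk.
        return zone_esc_clav_mm - zone_esc_mm > 5

    print(etest_positive(32, 0.25))        # True: 128-fold MIC reduction
    print(combined_disk_positive(14, 22))  # True: 8 mm zone enhancement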

The sensitivities and specificities of the double-disk synergy test and of the E-test are good, ranging from 80 to 95% [13]. However, owing to the large diversity of ESBLs [6], which do not hydrolyse all ESCs equally, several of these molecules (cefotaxime, ceftazidime and cefepime) should be tested in combination with clavulanate. Based on the same principle, automated systems for bacterial identification and susceptibility testing are also used to detect ESBL-producing organisms. Their performance varies with the species investigated, with much higher sensitivity (80–99%) than specificity (50–80%) [13]. However, these methods mostly require overnight growth after isolation of the bacteria, meaning that 24–72 h can elapse between isolation and detection of ESBL production.

Molecular methods (PCR, hybridization, sequencing) targeting ESBL genes have been developed as an alternative. Although classical PCR and DNA arrays necessitate isolation of the bacteria from the clinical sample, real-time PCR-based techniques may be performed directly on clinical samples, reducing the detection delay. However, these molecular techniques remain costly and require a degree of expertise not available in non-specialized laboratories. Additionally, they detect only known genes. They are therefore usually not performed in routine laboratories but are restricted to epidemiological purposes.

Recently, a rapid and cost-effective biochemical test, the ESBL NDP test, was developed for the detection of ESBL producers [9]. The test identifies hydrolysis of the β-lactam ring of a cephalosporin (cefotaxime), which generates a carboxyl group and consequently acidifies the medium [Figure 1A]; the resulting acidity is revealed by the colour change of a pH indicator. The test can be performed either in a 96-well microtiter plate or in a single tube [Figure 1B]. Inhibition of ESBL activity is evidenced by adding tazobactam to a complementary well [Figure 1]. The ESBL NDP test may be performed on isolated colonies or directly from clinical samples. When performed on bacterial colonies, its overall sensitivity and specificity are 92.6% and 100%, respectively. The test readily differentiates ESBL producers from strains that are resistant to expanded-spectrum cephalosporins through other mechanisms, and from strains that are likely to be susceptible to expanded-spectrum cephalosporins. Sensitivity is 100% when the ESBL is of the CTX-M type; of note, CTX-M ESBLs have spread worldwide and have become the predominant ESBL type [14]. The ESBL NDP test also shows excellent sensitivity (100%) and specificity (100%) when performed directly from blood cultures, in which case it detects ESBL producers ~48 h earlier than the previously mentioned techniques. Additionally, the test may be performed directly on colonies grown on the selective media used for screening colonized patients, giving a time gain of at least 24 h in identifying carriers of ESBL producers and consequently allowing faster implementation of adequate hygiene measures to prevent nosocomial outbreaks [2, 5].
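
The readout itself is a two-well colour comparison. A minimal sketch of the interpretation logic, assuming a pH indicator that turns from red to yellow on acidification (the colours and function name here are illustrative, not a specification of the published protocol):

    def esbl_ndp_interpret(cefotaxime_well, cefotaxime_tazobactam_well):
        # 'red' = no hydrolysis; 'yellow' = acidification, i.e. cefotaxime
        # hydrolysis. ESBL activity is hydrolysis that tazobactam inhibits.
        hydrolysis = cefotaxime_well == "yellow"
        inhibited = cefotaxime_tazobactam_well == "red"
        if hydrolysis and inhibited:
            return "ESBL producer"
        if hydrolysis:
            return "hydrolysis not inhibited: non-ESBL mechanism likely"
        return "no cefotaxime hydrolysis detected"

    print(esbl_ndp_interpret("yellow", "red"))  # ESBL producer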

Detection of carbapenemases: Place of the Carba NP test in the diagnostic armamentarium
In Enterobacteriaceae, carbapenem resistance may result either from the combination of decreased outer-membrane permeability with overexpression of β-lactamases lacking carbapenemase activity, or from the expression of carbapenemases [7]. The spread of carbapenemase producers is an important clinical issue since carbapenemases confer resistance to most β-lactams. A variety of carbapenemases have been reported, such as the Ambler class A carbapenemases of the KPC type, the metallo-β-lactamases (Ambler class B) of the VIM, IMP and NDM types, and the Ambler class D carbapenemases of the OXA-48 type [7]. Detection of carbapenemase producers is also a major issue since they usually carry many other non-β-lactam resistance determinants, giving rise to multidrug- or even pandrug-resistant isolates [1, 3].

Potential carbapenemase producers are currently screened first by susceptibility testing based on carbapenem breakpoint values. Additional non-molecular techniques have been proposed for the in vitro identification of carbapenemase production. One commonly used technique is the modified Hodge test (MHT), which has been used for years. Although the addition of zinc to the culture medium was recently shown to increase the sensitivity of this test [in particular for metallo-β-lactamase (MBL) producers], the MHT remains time-consuming (at least 24 h) and may lack specificity (frequent false-positive results with Enterobacter spp. overexpressing their chromosomal cephalosporinase, and false-negative results with many NDM producers). Other detection methods based on the inhibitory properties of several molecules also exist, either for KPC (e.g. boronic acid, clavulanic acid) or MBL (e.g. EDTA, dipicolinic acid) producers, thus allowing discrimination between the diverse types of carbapenemases. All these methods are time-consuming, since they require isolation of the bacteria from the infected sample followed by at least an additional 24 h to perform the inhibitor-based technique. Several molecular methods, such as simplex and multiplex PCR, DNA hybridization and sequencing, are also commonly used for identifying carbapenemase genes in research laboratories and reference centres. Recently, a real-time PCR technique has been used to detect KPC producers directly from blood cultures. Although interesting, this molecular approach is costly and requires expertise in molecular techniques.

A rapid and cost-effective biochemical test, the Carba NP test, was recently developed to detect carbapenemase production from isolated colonies [12]. Its principle is the same as that of the ESBL NDP test, but with imipenem as the substrate (Figure 2A). The Carba NP test differentiates carbapenemase producers (100% sensitivity and 100% specificity) from strains that are carbapenem resistant through non-carbapenemase-mediated mechanisms (Figure 2B), such as an outer-membrane permeability defect combined with overproduction of a cephalosporinase and/or ESBL, and from carbapenem-susceptible strains expressing a broad-spectrum β-lactamase without carbapenemase activity (ESBLs, plasmid- and chromosome-encoded cephalosporinases). Interpretable positive results are always obtained in less than 1 h of total time, which is unique and makes it possible to implement rapid containment measures to limit the spread of carbapenemase producers. The Carba NP test may be performed on colonies recovered from an antibiogram (time gain of at least 24 h) or from the selective media used for screening carriers (time gain of at least 48 h). It detects carbapenemase producers not only in Enterobacteriaceae [11, 12] but also in Pseudomonas spp. [10]. Additionally, the Carba NP test has been evaluated for the detection of carbapenemase-producing Enterobacteriaceae directly from positive blood cultures [15], where it shows 97.9% sensitivity and 100% specificity. Applied routinely in clinical laboratories, this technique may guide first-line therapy for patients with sepsis and therefore significantly improve patient outcomes, particularly in areas where carbapenemase producers are highly prevalent (such as Greece, Italy, Turkey, Israel and India). Moreover, unlike molecular techniques, the Carba NP test detects carbapenemase production regardless of whether the corresponding gene is known. Consequently, it is a useful tool for detecting new carbapenemases that might eventually disseminate further, as recently shown with the NDM-1 carbapenemase [8].
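
For reference, the performance figures quoted throughout this article reduce to simple proportions. As a worked example, with hypothetical counts chosen only to reproduce the blood-culture figures above (not the study data):

    def sensitivity(true_pos, false_neg):
        # proportion of true carbapenemase producers correctly detected
        return 100 * true_pos / (true_pos + false_neg)

    def specificity(true_neg, false_pos):
        # proportion of non-producers correctly reported negative
        return 100 * true_neg / (true_neg + false_pos)

    print(f"{sensitivity(47, 1):.1f}%")   # 97.9%
    print(f"{specificity(50, 0):.1f}%")   # 100.0%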

Conclusion
The ESBL NDP and Carba NP tests are rapid, sensitive, specific and cost-effective biochemical tests for the early detection of the most important emerging resistance traits, namely ESBL and carbapenemase production in Enterobacteriaceae. Implementing such tests in strategies for the detection of multidrug-resistant bacteria may significantly improve the management and outcome of colonized and infected patients. Antibiotic stewardship would also be improved, reducing the selective pressure that plays a crucial role in the emergence and spread of multidrug-resistant bacteria.

Abbreviations
IMP, imipenemase; KPC, Klebsiella pneumoniae carbapenemase; NDM, New Delhi metallo-β-lactamase; VIM, Verona integron-encoded metallo-β-lactamase; OXA, oxacillinase

References
1. Schwaber MJ, Carmeli Y. JAMA 2008; 300: 2911–2913.
2. Spellberg B, Blaser M, et al. Clin Infect Dis 2011; 52(Suppl 5): S397–428.
3. Walsh TR, Toleman MA. J Antimicrob Chemother 2011; 67: 1–3.
4. Coque TM, Baquero F, Canton R. Euro Surveill 2008; 13.
5. Pitout JD, Laupland KB. Lancet Infect Dis 2008; 8: 159–166.
6. Poirel L, Bonnin RA, Nordmann P. Infect Genet Evol 2012; 12: 883–893.
7. Nordmann P, Dortet L, Poirel L. Trends Mol Med 2012; 18: 263–272.
8. Nordmann P, Poirel L, et al. Trends Microbiol 2011; 19: 588–595.
9. Nordmann P, Dortet L, Poirel L. J Clin Microbiol 2012; 50: 3016–3022.
10. Dortet L, Poirel L, Nordmann P. J Clin Microbiol 2012; 50: 3773–3776.
11. Dortet L, et al. Antimicrob Agents Chemother 2012; 56: 6437–6440.
12. Nordmann P, Poirel L, Dortet L. Emerg Infect Dis 2012; 18: 1503–1507.
13. Drieux L, Brossier F, et al. Clin Microbiol Infect 2008; 14(Suppl 1): 90–103.
14. Livermore DM, Canton R, et al. J Antimicrob Chemother 2007; 59: 165–174.
15. Dortet L, Bréchard L, et al. J Antimicrob Chemother (submitted 2012).

The authors
Laurent Dortet, PhD, PharmD, Laurent Poirel, PhD and
Patrice Nordmann, PhD, MD

Service de Bactériologie-Virologie, INSERM U914 “Emerging Resistance to Antibiotics”, Hôpital de Bicêtre, Assistance Publique/Hôpitaux de Paris, Faculté de Médecine Paris Sud, Le Kremlin-Bicêtre, France

E-mail: patrice.nordmann@bct.aphp.fr


Shedding light on obesity and vitamin D status

The worldwide prevalence of obesity (defined as a BMI greater than 30 kg/m²) has more than doubled in the last thirty years, largely as a result of lifestyle changes that leave many people with a greater energy intake than expenditure. According to a study recently published in The Lancet, with the exception of populations in sub-Saharan Africa, over-eating is now a more serious health risk than eating too little: globally, three million people per year die as a result of obesity, three times the number who die from malnutrition.
There is also an increasing prevalence of vitamin D insufficiency and deficiency, particularly in developed countries. For example, according to a recent article in the British Medical Journal, more than half the adult population in the UK is vitamin D insufficient and 16% are severely deficient in winter and spring; alarmingly, up to a quarter of UK children are vitamin D deficient. This growing public health problem is also largely the result of lifestyle changes. Endogenous synthesis through exposure of the skin to sunlight is the major source of vitamin D, and dietary sources are limited. Today’s children and adolescents tend to spend less time outside than previous generations did, and the message that over-exposure to sunlight increases the risk of melanoma has led to general over-cautiousness. While the role of vitamin D in regulating calcium and phosphorus and in the mineralization of bone has long been established, more recent work has linked vitamin D deficiency to a range of conditions including adverse pregnancy outcomes, diabetes, cancer, cardiovascular disease and autoimmune diseases. Previous observational studies also noted a link between obesity and vitamin D deficiency, but studies of vitamin D supplementation and weight loss yielded inconsistent results; it was not known which condition was the cause and which the effect.
Now a meta-analysis involving over 42,000 people of European ancestry from six countries has been published. Twelve SNPs related to BMI and four SNPs associated with vitamin D levels were analysed in normal-weight, overweight and obese subjects. Because genotype is fixed at conception and cannot be altered by body weight, this design can reveal the direction of causality: BMI-related SNPs predicted vitamin D levels, but vitamin D-related SNPs did not predict BMI. The results therefore indicate that a higher BMI leads to lower vitamin D levels, probably because this fat-soluble vitamin is sequestered in adipose tissue, whereas higher vitamin D levels have no effect on obesity.
It would thus be prudent to monitor vitamin D status in obese subjects and give supplementation if needed, but encouraging lifestyle changes that incorporate regular outdoor exercise would kill two birds with one stone and benefit all of us!