Known cancer-causing pathogens: the tip of the iceberg?

At the end of the 19th century the Scottish pathologist William Russell published an article in The Lancet titled ‘The Parasite of Cancer’ [The Lancet 1899; 3984: 1138-1141]. The response by the physicians of the day ranged from scepticism to sheer disbelief, largely because cancer was considered to be a non-communicable disease. With the exception of a few eccentric scientists, this incredulity persisted into the latter half of the 20th century, when the discovery of the Epstein-Barr virus in Burkitt lymphoma cells in 1964 led to the recognition that there may be a few cancer-causing viruses. However, over the last fifty years there has been a steady increase in the number of infectious agents found to cause cancer, and in May this year a relevant article was published in The Lancet Oncology. The authors used data from 2008 and found that of the 12.7 million new cancer cases occurring in that year, around two million were caused by infectious agents.
The cancer-causing microbes discovered so far include viruses, bacteria and parasites. Currently hepatitis B and C viruses, which can cause hepatocellular carcinoma, human papillomaviruses, strains of which are the aetiological agents of cervical cancer, and the bacterium Helicobacter pylori, which can cause gastric cancer, are considered to account for more than 80% of the cancers caused by infectious agents. Other carcinogenic viruses discovered so far include human T-lymphotropic virus, Kaposi’s sarcoma-associated herpes virus and Merkel cell polyomavirus. Well-established bacterial associations with cancer include Salmonella typhi with gallbladder/hepatobiliary carcinoma and Chlamydia pneumoniae with lung carcinoma. In addition there is evidence for an association between Streptococcus bovis and colorectal cancer. There are also some established relationships between parasitic infections and cancer. The association between long-term infection with the parasitic fluke Schistosoma haematobium and bladder cancer is well documented, as is that between long-term infection with the Far Eastern liver fluke Opisthorchis viverrini and cholangiocarcinoma. Some evidence also links the common protozoan parasites Cryptosporidium parvum to gastrointestinal cancer and Toxoplasma gondii to the development of brain tumours.
All these organisms are able to evade the host’s immune system and establish persistent infections of many years’ duration, ultimately initiating abnormal cell growth followed by tumour development. So couldn’t efforts to lighten the global burden of cancer put more emphasis on timely diagnosis of these infections followed by suitable therapy or, even better, on the development and widespread use of effective vaccines? And whilst there isn’t a single ‘parasite of cancer’, it is likely that there are many other cancer-causing pathogens to uncover!


Recent advances in PCR-based infection diagnosis in patients with suspected sepsis

Early detection to enable timely therapeutic intervention is crucial for improved outcome in patients with sepsis, but diagnosis is difficult, as the clinical signs associated with the condition also occur in patients with a systemic inflammatory response syndrome (SIRS) of non-infectious origin (‘sterile SIRS’). This article discusses current and emerging PCR-based technologies for the diagnosis of sepsis.

by Dr Satyanarayana Maddi, Dr Paul Dark and Dr Geoffrey Warhurst

Sepsis and issues for its early diagnosis
Sepsis is the clinical syndrome resulting from the host’s response to infection and represents a major international healthcare problem, being a leading cause of mortality and morbidity as well as a massive burden on resources [1]. The clinical signs associated with sepsis, such as changes in respiration, pulse, temperature and circulating immune cell counts, are non-specific and commonly seen in patients with a systemic inflammatory response syndrome (SIRS) arising from other insults, such as tissue injury, where there is no infective cause. Early identification of sepsis and the ability to differentiate it from sterile SIRS is therefore an important diagnostic goal in international medical practice. Evidence suggests that giving the most appropriate antimicrobial therapy at the earliest opportunity to patients with severe forms of sepsis saves more lives than any other medical intervention [1]. The Surviving Sepsis Campaign, which promotes early goal-directed management of sepsis, recommends initiation of antimicrobial therapy within one hour of clinical suspicion of sepsis [1]. Ideally, this requires rapid confirmation that infection is present and identification of the organism(s) involved. The guidelines advocate taking a whole blood sample and, where possible, other supporting clinical samples for microbiological culture prior to antibiotic administration. The problem facing clinicians is that blood cultures routinely take two to three days to confirm the presence of pathogens in the bloodstream (‘pathogenaemia’) and up to five days either to rule it out or to obtain a complete profile of the pathogen, including its antibiotic susceptibility/resistance pattern. Also, since viable organisms are needed for culture, the tests can be compromised if the patient has received antimicrobial therapy prior to sampling, which is common in this clinical field.

In the face of this lack of time-critical information on the infection status of the patient, coupled with the knowledge that delaying antimicrobial therapy will impair the survival chances of those patients who do have infection, current opinion favours the early use of broad-spectrum, high-potency antibiotics, narrowing to organism-specific therapy when microbiological evidence becomes available [1, 2]. This ‘safety first’ approach is currently the best available but does have negative consequences, particularly in terms of the overuse of antibiotics. The widespread use of broad-spectrum antibiotics is implicated in the emergence of antibiotic-resistant pathogens and increasing rates of infection with Clostridium difficile and fungi. In addition, many patients who will subsequently be shown to have had no infection are exposed to unnecessary treatment with powerful and potentially toxic drugs.

Application of PCR to diagnosis of pathogenaemia in suspected sepsis
While microbiological culture is likely to remain the gold standard for infection diagnosis, there is growing interest in the potential of PCR technology to provide early, time-critical information based on the detection and recognition of bacterial or fungal pathogen DNA in blood [1, 2]. Platforms based on real-time PCR have proved to be the most effective in this field, allowing continuous monitoring of amplicon production with either fluorescent dyes that bind non-specifically to double-stranded DNA or fluorescently labelled probes that bind to specific sequences. In real-time PCR, the whole process of amplification, product detection and analysis is achieved in a single reaction vessel. Furthermore, several sequence-specific probes with different fluorescent reporters can be added to the reaction, allowing simultaneous determination of multiple products. This process is therefore ideally suited to sepsis diagnosis, in which a variety of pathogen species could be involved. In terms of its application to infection diagnosis in blood (and other clinical samples), PCR offers a number of potential advantages: results are available in a matter of hours rather than days, the extreme sensitivity facilitates detection of even minute amounts of pathogen DNA in clinical samples, and the test is not significantly affected by prior administration of antibiotics.
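
To give a schematic sense of how a real-time PCR instrument flags amplification, the following minimal Python sketch determines a cycle threshold (Ct) as the first cycle at which fluorescence exceeds the baseline mean plus ten standard deviations. The data and threshold rule are invented for illustration; vendor software uses more sophisticated baselining.

    import statistics

    def cycle_threshold(fluorescence, baseline_cycles=15):
        """Return the first cycle whose signal exceeds baseline mean + 10 SD,
        or None if the curve never crosses (no amplification detected)."""
        baseline = fluorescence[:baseline_cycles]
        threshold = statistics.mean(baseline) + 10 * statistics.stdev(baseline)
        for cycle, signal in enumerate(fluorescence, start=1):
            if signal > threshold:
                return cycle
        return None

    # Hypothetical 40-cycle run: flat baseline, then exponential amplification.
    curve = [1.0 + 0.01 * i for i in range(20)] + [1.2 * 1.5 ** i for i in range(20)]
    print(cycle_threshold(curve))   # prints 22: amplicon detected

The lower the Ct, the more target DNA was present at the start, which is why minute amounts of pathogen DNA can still be detected, merely at a later cycle.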

Two basic approaches to assay design have been used: either specific primers that detect a particular organism or, more commonly, universal primers that bind to conserved sequences in bacterial but not human DNA and can detect a broad range of organisms [1]. The latter approach is well suited to sepsis, which can be caused by a variety of pathogen species. For bacteria, the most favourable targets are sequences in the 16S and 23S rRNA genes, which are ubiquitous in bacteria and therefore lend themselves to universal detection of bacterial pathogens. More recently, the gene sequence between the 16S and 23S regions, the so-called internal transcribed spacer (ITS), has been targeted because it contains additional hypervariable regions that allow even better discrimination between bacterial species. Fungal pathogens can be detected by targeting analogous regions in the fungal genome [1].

Following PCR of these regions, the pathogen species present can be identified by (a) specific binding of fluorescent hybridisation probes to the amplified target, (b) sequencing of the amplified DNA, (c) hybridisation to microarrays or (d) melting temperature profiling of the amplified products.

Commercial PCR platforms for bloodstream infection diagnosis
Based on these approaches, a number of commercial systems are now available for detection of bacterial/fungal DNA in blood. LightCycler SeptiFast, the first real-time PCR system to receive a European CE mark (2006) for use in diagnosis of bloodstream infection, is manufactured by Roche Diagnostics (Basel, Switzerland) [Figure 1]. SeptiFast is a multiplex assay for detection and identification of a defined panel of 25 bacterial and fungal pathogens known to cause the majority of bloodstream infections in critical care. The assay can be completed in 6-8 hours and has a reported sensitivity of between 3 and 30 colony forming units (CFU) per mL of blood. SeptiFast is currently the most studied commercial PCR-based test for sepsis-associated bloodstream infection, with numerous clinical validity studies published to date. At the time of writing, the authors’ laboratory is hosting the first independent multicentre systematic validity study comparing SeptiFast with culture for the diagnosis of suspected healthcare-associated bloodstream infection [2]. Based on the results of this study, independent recommendations will be made to the UK’s Department of Health as to whether this real-time PCR technology has sufficient clinical diagnostic accuracy to move forward to efficacy testing during the provision of routine clinical care.

SepsiTest (Molzym GmbH & Co. KG, Bremen, Germany) [Figure 2], which was awarded a CE mark in 2008, uses universal primers to detect bacterial or fungal DNA in blood and other clinical samples but relies on post-test sequencing of the products for subsequent species identification [1, 3]. Studies evaluating the use of SepsiTest in a clinical setting are beginning to appear in the literature [3]. A third CE-marked commercial platform, the VYOO PCR identification test (SIRS-Lab GmbH, Germany), is a semi-automated method combining broad-range PCR with multiplex detection and microarray hybridisation. In addition to detecting 34 bacterial and six fungal species covering 99% of sepsis-associated pathogens, it also detects five resistance markers: mecA for methicillin-resistant Staphylococcus aureus; vanA and vanB for vancomycin resistance in enterococci; and blaCTX-M15 and blaSHV for extended-spectrum β-lactamases in Gram-negative bacilli. To date no published clinical validity studies of this product are available.

Future/emerging approaches and technologies
High-resolution melting analysis (HRMA) is a post-amplification method of analysing PCR products that does not require multiple expensive fluorescent probes, relying instead solely on intercalating dye chemistry. Using universal primers, the target regions are amplified and a high-resolution melting curve analysis is then performed. Advances in instrumentation that can delineate minute shifts in melting temperature allow species to be identified from the melting profiles of their amplicons [4]. HRMA is quick and cost-effective; it is still at the developmental stage, but the future looks encouraging.
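
To give a flavour of how species calling from melting profiles can work, here is a minimal sketch. The reference melting temperatures and the tolerance are invented values; real HRMA software compares whole normalised melt curves rather than a single Tm.

    # Hypothetical reference melting temperatures (degrees C) for ITS amplicons.
    REFERENCE_TM = {
        "Escherichia coli": 87.4,
        "Staphylococcus aureus": 84.9,
        "Pseudomonas aeruginosa": 89.1,
    }

    def call_species(observed_tm, tolerance=0.3):
        """Return the reference organism whose Tm is closest to the observed
        value, provided the shift is within the instrument's resolution."""
        species, ref_tm = min(REFERENCE_TM.items(),
                              key=lambda item: abs(item[1] - observed_tm))
        return species if abs(ref_tm - observed_tm) <= tolerance else "no call"

    print(call_species(85.1))   # Staphylococcus aureus
    print(call_species(86.2))   # no call: shift too large for a confident match

The narrower the tolerance, the fewer false species calls, at the cost of more ‘no call’ results; this trade-off is exactly why high instrument resolution matters.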

Other approaches at various stages of development for the diagnosis of sepsis include FilmArray (Idaho Technology Inc., USA) [5] [Figure 3] and ‘lab-on-a-chip’ systems using microfluidic technology, such as the TaqMan low-density array (TLDA), which overcome limitations of conventional multiplex PCR assays, namely the narrow range of probe regions needed for multiplexing and the inability of the PCR instrument to detect more than six fluorophores simultaneously [6].

Conclusions and future
Widespread adoption of these PCR systems in healthcare will not occur until clinical effectiveness has been proven. No adequately powered systematic clinical effectiveness studies have been performed to date in the field of sepsis, resulting in an absence of data to support optimal pricing of the available technologies alongside health service adoption. There is clearly an unmet need in the field of sepsis diagnostics, but a more coordinated approach to health technology assessment and adoption in this field is urgently required to help patients benefit from the elegant technologies currently available and from those under development.

References
1. Dark PM, Dean P, Warhurst G. Bench-to-bedside review: the promise of rapid infection diagnosis during sepsis using polymerase chain reaction-based pathogen detection. Crit Care 2009;13(4):217.
2. Dark P, Dunn G, Chadwick P, Young D, Bentley A, Carlson G et al. The clinical diagnostic accuracy of rapid detection of healthcare-associated bloodstream infection in intensive care using multipathogen real-time PCR technology. BMJ Open 2011;1(1):e000181.
3. Kühn C, Disqué C, Mühl H, Orszag P, Stiesch M, Haverich A. Evaluation of commercial universal rRNA gene PCR plus sequencing tests for identification of bacteria and fungi associated with infectious endocarditis. J Clin Microbiol 2011;49(8):2919-2923.
4. Ozbak H, Dark P, Maddi S, Chadwick P, Warhurst G. Combined molecular gram typing and high-resolution melting analysis for rapid identification of a syndromic panel of bacteria responsible for sepsis-associated bloodstream infection. J Mol Diagn 2012;14(2):176-184.
5. Caliendo AM. Multiplex PCR and emerging technologies for the detection of respiratory pathogens. Clin Infect Dis 2011;52(Suppl 4):S326-330.
6. Kodani M, Yang G, Conklin LM, Travis TC, Whitney CG, Anderson LJ et al. Application of TaqMan low-density arrays for simultaneous detection of multiple respiratory pathogens. J Clin Microbiol 2011;49(6):2175-2182.

The authors
Satyanarayana Maddi, Paul Dark, Geoffrey Warhurst
Infection Inflammation Injury Research Group (3IRG)
Salford Royal NHS Foundation Trust and School of Translational Medicine,
The University of Manchester, UK.


Early detection of acute kidney injury in sepsis: how about NGAL?

Sepsis frequently results in acute kidney injury (AKI). Although AKI markedly contributes to mortality in sepsis, its diagnosis is frequently delayed due to limitations of current biomarkers of renal impairment. Neutrophil gelatinase-associated lipocalin (NGAL) has been demonstrated to be a biomarker of early AKI. This review analyses the potential use of NGAL in sepsis.

by Dr W. Huber, Dr B. Saugel, Dr R. M Schmid and Dr A. Wacker-Gussmann

Pathophysiology, definition and epidemiology of sepsis
Sepsis is a clinical syndrome characterised by a systemic inflammatory response to infection [1-2]. The incidence of sepsis has increased by a factor of four within the last three decades, with an estimated incidence of 650,000 cases per year in the USA. SIRS (systemic inflammatory response syndrome) describes a similar inflammatory reaction to non-infectious aetiologies such as polytrauma, acute pancreatitis and burns. Apart from their different aetiologies, sepsis and SIRS share a common definition requiring two or more of four criteria of systemic inflammation: fever/hypothermia; tachycardia >90/min; tachypnoea >20/min or paCO2 <32 mmHg; and leukocytosis (>12 G/L) or leukopenia (<4 G/L) [Table 1] [1-2].

The pathophysiology of sepsis is mainly attributed to an imbalanced and generalised release of pro-inflammatory mediators resulting in impaired circulation, tissue injury and organ failures, up to multiple organ dysfunction syndrome (MODS). Despite strong evidence for the therapeutic efficacy of early causative therapy (treatment of the infection source), antibiotics and several supportive strategies, mortality from severe sepsis and septic shock has remained at 20-50% in recent sepsis trials [1-2]. There is an ongoing debate on the benefits of supportive strategies such as hydrocortisone, intensified insulin therapy and immunomodulation. However, there is strong consensus about the paramount importance of early sensitive diagnosis and staging of sepsis (severe sepsis and septic shock) in order to initiate appropriate monitoring and therapy as early as possible. In general, patients with severe sepsis will require intensive care and haemodynamic monitoring to optimise circulation.

Diagnoses of severe sepsis and septic shock are mainly based on the evidence of organ failure and emphasise the impact of circulatory failure. The impact of different organ failures on the outcome of ICU patients is substantiated by numerous studies [1-5]. Interestingly, renal and liver failure were among the organ failures with the most pronounced impact on outcome in several studies [3-5]. At first glance, this might be surprising. However, circulatory and respiratory failure can be easily detected at early stages of severe sepsis, and symptomatic therapy of these organ failures is the main target of intensive care. By contrast, renal and liver failure remain underrated, 'late-stage-diagnosed' losses of organ function in the development of MODS.

Difficulties in the early detection of renal and hepatic failure by traditional markers have probably also resulted in their under-representation in scoring systems: regarding renal failure, APACHE-II and the SOFA score are mainly based on absolute serum creatinine values. However, the use of serum creatinine as a marker in these scores, and particularly as an early marker of septic renal failure, is limited by a number of drawbacks: serum levels of creatinine are dependent on age, gender, muscle mass and race. Furthermore, in the case of impaired glomerular filtration, serum creatinine levels can be lowered by tubular secretion, which contributes to the phenomenon of the 'creatinine-blind range' of renal failure: the glomerular filtration rate (GFR) can decrease to about 50% with serum creatinine levels staying within the normal range. The numerous formulae for GFR estimation slightly mitigate this drawback. However, GFR formulae are neither part of sepsis definitions nor are they included in the SAPS-II, SOFA and APACHE-II scores.
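
As an aside, the two-or-more-of-four consensus rule quoted above is simple enough to express as a short check. The following Python sketch is purely illustrative; the field names and unit conventions are assumptions, and clinical staging naturally involves more than counting criteria.

    def sirs_criteria_met(temp_c, heart_rate, resp_rate, paco2_mmhg, wbc_g_per_l):
        """Count how many of the four SIRS criteria are fulfilled;
        two or more suggest SIRS (sepsis if infection is also present)."""
        criteria = [
            temp_c > 38.0 or temp_c < 36.0,          # fever or hypothermia
            heart_rate > 90,                          # tachycardia
            resp_rate > 20 or paco2_mmhg < 32,        # tachypnoea or hypocapnia
            wbc_g_per_l > 12 or wbc_g_per_l < 4,      # leukocytosis or leukopenia
        ]
        return sum(criteria)

    # Hypothetical patient: febrile and tachycardic, normal respiration and WBC.
    n = sirs_criteria_met(38.6, 104, 16, 40, 9.5)
    print(n, "criteria met ->", "SIRS" if n >= 2 else "no SIRS")
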
Even the more recent Acute Kidney Injury Network (AKIN) definition of AKI rejected GFR, which had been included in the earlier RIFLE classification (RIFLE: Risk, Injury, Failure, Loss, End-Stage Renal Disease). RIFLE and AKIN, as well as the new KDIGO definition (KDIGO: Kidney Disease: Improving Global Outcomes), are mainly based on changes in serum creatinine compared to baseline values, which are 'known or presumed to have occurred within the prior seven days' [6]. Comparison with a baseline value that is not known in a substantial percentage of patients remains a major problem of these definitions. In general, their usefulness is substantiated as consensus definitions for acute changes in renal function within 2-7 days after the first measurement of serum creatinine rather than as highly sensitive tools for early AKI. This also relates to the fact that increased serum creatinine on ICU admission of a septic patient might result from constant chronic renal impairment as well as from acute renal failure in a patient with previously normal renal function. Both acute and chronic renal impairment have been demonstrated to significantly influence outcome, albeit to a different degree, with patients with AKI more frequently requiring mechanical ventilation [4, 7]. In the context of sepsis, specification of renal impairment is particularly important: acute septic renal impairment results in a markedly worse prognosis, classification as severe sepsis and intensified monitoring in an ICU. By contrast, stable chronic renal impairment in a patient just fulfilling two of the four sepsis criteria would be a minor risk factor contributing to outcome, similar to older age. Being a marker of function rather than of injury remains the major drawback of serum creatinine for the differentiation of renal impairment.

Approaches to early detection of AKI
Systematic efforts have therefore been made to characterise markers of early renal injury. Using several established animal models of acute renal injury (e.g. ischaemia, nephrotoxic medication including contrast medium), up-regulation of a number of candidate genes has been demonstrated as a short-term reaction to experimental acute renal injury [8]. Among those up-regulated genes and a number of other biomarkers, NGAL, kidney injury molecule-1 (KIM-1), interleukin-18 (IL-18) and cystatin C have been most intensively studied. Cystatin C has characteristics most similar to creatinine: this marker is a cysteine proteinase inhibitor synthesised in all nucleated cells and freely filtered by the glomerulus. Its major advantages over serum creatinine are that cystatin C is not secreted by the tubule and is not affected by age, gender, muscle mass or race. However, since increased levels of cystatin C result from accumulation due to decreased glomerular filtration, cystatin C remains a marker of declining renal function rather than a biomarker of early kidney injury. Several studies suggest slightly earlier (perhaps within 24 h) detection of AKI compared to serum creatinine.

Another ‘candidate molecule’ for early detection of AKI is IL-18, a pro-inflammatory cytokine that is induced in the proximal tubule and detected in urine after AKI. In clinical settings, an increase in urinary levels within 6 h and peak values within 12 h have been demonstrated in cardiopulmonary bypass patients in whom AKI was evident by serum creatinine only after 48 h.

KIM-1 is a transmembrane protein that is markedly over-expressed in the proximal tubule after ischaemic or toxic AKI. A number of clinical studies suggest earlier detection of AKI by KIM-1 compared to serum creatinine: for example, urinary KIM-1 levels were elevated 12 h after paediatric cardiac surgery, and KIM-1 predicted renal replacement therapy (RRT) and mortality in AKI.

NGAL
The most promising biomarker for early acute kidney injury at present is NGAL, the product of one of seven genes found to be markedly up-regulated in an ischaemia-reperfusion mouse model [8]. NGAL is a 178-amino-acid polypeptide expressed by neutrophils and by various epithelial cells including those of the proximal tubule. NGAL has several physiological functions, including bacteriostatic (depriving bacteria of the iron essential for their growth), antioxidant (preventing free, reactive iron from producing oxygen free radicals) and growth-factor properties (regulating cell proliferation, apoptosis and differentiation). Furthermore, there is a possible rescue role in other epithelia (breast, uterus), and NGAL is also overexpressed in some epithelial tumours.

Regarding its potential clinical use, NGAL has been validated as an early biomarker of AKI induced by cisplatin, contrast media and cardiac surgery, as well as a screening marker for patients at risk in the emergency department (ED) and ICU. In these settings, urinary and plasma levels of NGAL at 2-12 h were significant predictors of AKI as defined by later increases in serum creatinine within 24-48 h. Depending on setting and methodology, the best predictive capabilities of NGAL were found for cut-off values between 50 and 150 µg/L. In a large ED study of 635 patients, urinary NGAL levels clearly differentiated between acute (markedly elevated NGAL) and chronic (not elevated NGAL) renal impairment, whereas there was substantial overlap of serum creatinine values between the two groups [9].

This ability to discriminate between acute and chronic renal impairment might be particularly useful in patients with sepsis [Figures 1 and 2]. With assessment of renal function in ICU patients based on serum creatinine, normal creatinine levels might be ‘false negative’ and may only increase after 48 h. On the other hand, increased values of creatinine can result from stable chronic renal impairment. Misinterpretation of these values – ‘false positive’ for septic renal failure – might result in inappropriate allocation of resources: for example, efficacy of most of the supportive measures in sepsis has been demonstrated predominantly for patients at high risk and with severe sepsis, whereas side effects might outweigh the benefits in patients with less pronounced sepsis.

A potential role for NGAL in sepsis has been suggested in several clinical studies. In 143 paediatric ICU patients, Wheeler et al. demonstrated that septic shock, but not SIRS, resulted in a significant elevation of NGAL compared to controls [10]. Furthermore, NGAL on admission was significantly higher in children developing AKI within seven days after admission compared to children without AKI. Serum levels of NGAL and creatinine did not correlate on day one after admission.

A study in 971 ICU patients investigated the predictive capabilities of nine biomarkers measured on admission for severe sepsis within 72 h. The best predictive capabilities were found for NGAL, whereas D-dimer, BNP and CRP were of limited use. A score based on NGAL, IL-1-receptor-antagonist and protein C levels significantly distinguished four groups of patients: those developing no sepsis, severe sepsis, septic shock and death [11]. The area under the curve for the score derived from these three biomarkers was 0.80 for severe sepsis, 0.77 for septic shock and 0.79 for death.
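
For readers less familiar with such figures, the area under the receiver-operating characteristic curve can be obtained directly from the rank statistic. The sketch below uses the generic Mann-Whitney formulation with invented scores; it is not the analysis method of the cited study.

    def auc(scores_positive, scores_negative):
        """AUC = probability that a randomly chosen patient with the outcome
        scores higher than one without it (ties count half)."""
        wins = sum((p > n) + 0.5 * (p == n)
                   for p in scores_positive for n in scores_negative)
        return wins / (len(scores_positive) * len(scores_negative))

    # Hypothetical biomarker scores for patients with/without severe sepsis.
    with_outcome = [0.9, 0.8, 0.75, 0.6]
    without_outcome = [0.7, 0.5, 0.4, 0.3, 0.2]
    print(round(auc(with_outcome, without_outcome), 2))   # 0.95

An AUC of 0.5 corresponds to a useless marker, 1.0 to perfect discrimination, so values around 0.8 indicate good but imperfect separation.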

Another recent study found significantly elevated plasma NGAL levels within 4 hours after admission in septic as well as non-septic patients with AKI according to RIFLE criteria, compared to patients without AKI [12]. Increases in NGAL were even more pronounced in septic compared to non-septic AKI patients. Similarly, urinary NGAL levels were higher in septic than in non-septic patients without AKI [13], suggesting that NGAL cut-off values for predicting AKI in septic patients might need to be higher than in non-septic patients.

In summary, clinical applications of NGAL in sepsis comprise early detection of AKI in patients with normal serum creatinine (NGAL+, crea-) compared to patients without renal impairment (NGAL-, crea-). Furthermore, NGAL might be useful to distinguish patients with stable chronic renal impairment (NGAL-, crea+) from patients with ongoing or ‘acute on chronic’ renal injury (NGAL+, crea+) [Figure 1].
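
Expressed schematically, the quadrant logic of Figure 1 can be written as a simple decision rule. In the sketch below the NGAL cut-off is an illustrative value from the 50-150 µg/L range reported above and the creatinine cut-off is a generic upper reference limit; neither is a validated decision threshold.

    def classify_renal_status(ngal_ug_per_l, creatinine_umol_per_l,
                              ngal_cutoff=150.0, crea_cutoff=110.0):
        """Map the NGAL/creatinine quadrants of Figure 1 to an interpretation."""
        ngal_pos = ngal_ug_per_l > ngal_cutoff
        crea_pos = creatinine_umol_per_l > crea_cutoff
        if ngal_pos and not crea_pos:
            return "early AKI (creatinine still normal)"
        if ngal_pos and crea_pos:
            return "ongoing or 'acute on chronic' renal injury"
        if not ngal_pos and crea_pos:
            return "stable chronic renal impairment"
        return "no renal impairment"

    print(classify_renal_status(320.0, 85.0))   # early AKI (creatinine still normal)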

With regard to sepsis, more sensitive detection of AKI (NGAL+, crea-) would result in staging as ‘severe sepsis’ instead of sepsis in patients without other organ failures [Figure 2]. Early detection of AKI might help to allocate additional causative (antimicrobial therapy, intervention) and supportive measures for sepsis, as well as specific measures to prevent further renal damage. Such measures include intensified haemodynamic monitoring to optimise fluid load, avoidance of further nephrotoxic medications and procedures (e.g. contrast media application) or at least prophylactic approaches such as hydration or administration of theophylline or acetylcysteine [14]. Allocation of these resources according to significant predictors, and avoidance of further renal impairment, carries a high potential for cost-effectiveness, as emphasised by a number of studies.

Further studies are required to validate whether early determination of NGAL improves diagnosis and outcome in septic and non-septic patients at risk of AKI. Future studies should also investigate whether including NGAL in scoring systems (APACHE-II, SOFA, SAPS-II) improves their predictive capabilities.

References
1. Dellinger RP et al. Crit Care Med 2008;36:296-327 [published correction appears in Crit Care Med 2008;36:1394-1396].
2. German Sepsis Society. German Interdisciplinary Association of Intensive Care and Emergency Medicine. Prevention, diagnosis, therapy and follow-up care of sepsis: 1st revision of S-2k guidelines of the German Sepsis Society (Deutsche Sepsis-Gesellschaft e.V. (DSG)) and the German Interdisciplinary Association of Intensive Care and Emergency Medicine (Deutsche Interdisziplinäre Vereinigung für Intensiv- und Notfallmedizin (DIVI)). Ger Med Sci. 2010 Jun 28;8:Doc14.
3. Chertow GM et al. J Am Soc Nephrol 2005 Nov;16(11):3365-70.
4. Metnitz PG et al. Crit Care Med 2002 Sep;30(9):2051-8.
5. Kramer L et al. Crit Care Med 2007 Apr;35(4):1099-104.
6. KDIGO Clinical Practice Guideline Acute Kidney Injury. Kidney International Supplements 2012; 2: 1-138.
7. Walcher A et al. Ren Fail 2011;33(10):935-42.
8. Mishra J et al. J Am Soc Nephrol 2003 Oct;14(10):2534-43.
9. Nickolas TL et al. Ann Intern Med 2008 Jun 3;148(11):810-9.
10. Wheeler DS et al. Crit Care Med 2008 Apr;36(4):1297-303.
11. Shapiro NI et al. Ann Emerg Med 2010 Jul;56(1):52-59.e1.
12. Lentini P et al. Crit Care Res Pract 2012;2012:856401. Epub 2012 Feb 14.
13. De Geus HR et al. Am J Respir Crit Care Med 2011 Apr 1;183(7):907-914. Epub 2010 Oct 8.
14. Huber W et al. Radiology 2006; 239(3):793-804.

The authors
Wolfgang Huber MD, Bernd Saugel MD, Roland M Schmid MD
II. Medizinische Klinik und Poliklinik, Klinikum rechts der Isar der Technischen Universität München, Ismaningerstr. 22, D-81675 München, Germany
and
Annette Wacker-Gussmann MD
Universitätsklinik Tübingen, Kinderheilkunde und Jugendmedizin, Abteilung für Neonatologie, Calwerstr. 7, D-72076 Tübingen, Germany

Correspondence to Wolfgang Huber
e-mail: wolfgang.huber@lrz.tu-muenchen.de; Tel: +49 (0)89 4140-5478


Scientific literature review: sepsis

There are a huge number of peer-reviewed papers covering sepsis, and it is frequently difficult for healthcare professionals to keep up with the literature. As a special service to our readers, CLI presents a few key abstracts from the clinical and scientific literature chosen by our editorial board as being particularly worthy of attention.

Predictors of survival in sepsis: what is the best inflammatory marker to measure?

Lichtenstern C et al. Curr Opin Infect Dis. 2012 Jun;25(3):328-36.

Beyond the widely used acute-phase proteins C-reactive protein (CRP) and procalcitonin (PCT) in sepsis management, many new molecules deriving from the different organs and cells affected have been studied, reflecting the systemic nature of sepsis. Cytokines, coagulation factors/characteristics, vasoactive hormones and several others have recently proved to be relevant in the sepsis syndrome and probably useful for outcome prediction. However, single time point measurements may be less predictive than consideration of the time-dependent course of parameters. Many biomarkers display relevant correlation with the clinical outcome of patients with severe sepsis and septic shock, and consideration of their time courses may be more reliable than absolute levels. Clinical decisions should not be based on biomarkers alone; organ dysfunction, for example, should also be taken into account.

Cytokine profiles of preterm neonates with fungal and bacterial sepsis

Sood BG et al. Pediatr Res. 2012 May 4.

Information on cytokine profiles in fungal sepsis (FS), an important cause of mortality in extremely low birthweight (ELBW) infants, is lacking. The authors hypothesised that cytokine profiles in the first 21 days of life in ELBW infants with FS differ from those with bacterial sepsis (BS) or no sepsis (NS). In a secondary analysis of the NICHD Cytokine study, three groups were defined: FS (≥1 episode of FS), BS (≥1 episode of BS without FS) and NS. The association between sepsis group and 11 cytokines assayed in dried blood spots obtained on days 0-1, 3±1, 7±2, 14±3 and 21±3 was explored. Of 1066 infants, 89 had FS and 368 had BS. Compared to BS, FS was more likely to be associated with lower birthweight, vaginal delivery, patent ductus arteriosus, postnatal steroids, multiple central lines, longer respiratory support and hospital stay, and higher mortality (p<0.05). Analyses controlling for covariates showed significant group differences over time for IFN-γ, IL-10, IL-18, TGF-β and TNF-α (p<0.05). These differences, which may have implications for diagnosis and treatment, require validation in rigorously designed prospective studies.

Prognostic value of proadrenomedullin in severe sepsis and septic shock patients with community-acquired pneumonia

Suberviola B et al. Swiss Med Wkly. 2012 Mar 19;142:w13542.

Midregional proadrenomedullin (proADM) is a novel biomarker with potential prognostic utility in patients with community-acquired pneumonia (CAP). The aim of this study was to investigate the value of proADM levels for severity assessment and outcome prediction in severe sepsis and septic shock due to CAP. The prospective observational study included 49 patients admitted to the ICU with both a clinical and a radiological diagnosis of pneumonia and fulfilling criteria for severe sepsis or septic shock. The prognostic accuracy of proADM levels was compared with those of the pneumonia severity index (PSI), procalcitonin (PCT) and C-reactive protein (CRP). ICU mortality was 24.5% and hospital mortality 34.7%. In all cases proADM values at ICU admission were pathological (considering normal proADM levels <4 nmol/L). ProADM consistently rose as PSI class advanced from II to V (p = 0.02). Median proADM levels were higher (p <0.01) in hospital non-survivors (5.0 [1.9-10.1] nmol/L) than in survivors (1.7 [1.3-3.1] nmol/L). These differences were also significant with respect to ICU mortality. The receiver-operating characteristic curve for proADM yielded an AUC of 0.72, better than the AUCs for PCT and CRP (0.40 and 0.44 respectively) and similar to that of the PSI (0.74). In this study MR-proADM levels correlated with increasing severity of illness and death. High MR-proADM levels thus offer additional risk stratification in high-risk CAP patients.


Direct thrombin inhibitor assays

We have investigated the effects of three direct thrombin inhibitors (DTIs), namely lepirudin, argatroban and bivalirudin, on routine and dedicated assays.
We found the routine tests unable to discriminate between concentrations of the different DTIs. The dedicated Hemoclot assay showed identical linear increases for all three DTIs.
We conclude that a dedicated calibrated assay based on a diluted thrombin time (Hemoclot) appears to be the most suitable assay for monitoring purposes.

by Dr Joyce Curvers, Dr Volkher Scharnhorst and Dr Daan van de Kerkhof

Clinical background
The use of direct thrombin inhibitors (DTIs) for prophylactic or therapeutic anticoagulation is increasing due to their predictable bioavailability, short half-life and limited interaction with other medication [1-5]. The current idea is that the newer anticoagulants should not require laboratory monitoring because of these advantages. However, although monitoring of anticoagulant therapy may not be required for ‘standard’ patients, patients with an increased bleeding risk, specific co-medication (such as amiodarone or bridging therapy with coumarins), or abnormal body mass or water homeostasis (e.g. neonates, pregnancy, the obese, the elderly, renal insufficiency, oedema, cardiac disease) may still require occasional blood analysis. In addition, when the compliance or effectiveness of the anticoagulants is in doubt, measurement of the coagulation status can be crucial for the correct treatment of a patient. Since DTIs interfere with the central clotting enzyme thrombin, almost every coagulation assay is affected by their presence in blood. This also applies to routinely used assays such as the aPTT or PT (and INR) [6].

To date, there is no consensus on how orally or intravenously administered DTIs should be monitored, and specifically which assay should ideally be used [6, 7]. We therefore performed an in vitro study investigating the effect of increasing concentrations of three DTIs (lepirudin, bivalirudin and argatroban) in six plasma pools on the aPTT, PT and TT and on dedicated DTI assays (Hemoclot from Hyphen BioMed and Ecarin Clotting Time from Stago), using a coagulation analyser (STA-R Evolution, Roche).

Materials and methods
Six different pools (N>20 samples per pool) were collected from residual plasma from patients with aPTT and PT values within reference limits (normal aPTT and PT values were taken to indicate that these patients were not receiving anticoagulant medication).

Argatroban (Arganova, Mitsubishi Pharma, lot PF41977, 100 mg/mL) and lepirudin (Refludan, Pharmion, lot 24661611L, 50 mg) were provided by the local hospital pharmacy. Bivalirudin (Angiox/Angiomax, lot 1574697, 250 mg) was a kind gift from The Medicines Company. All DTIs were diluted with saline (0.9% NaCl) to 5 g/L. These stock solutions were spiked into the pooled plasmas (N=6) to reach final concentrations of 1, 2, 3, 4 and 5 mg/L; the currently advised therapeutic level of DTIs is 2 mg/L (according to the package leaflets). The plasma pools with each concentration of each DTI were frozen in triplicate at <-70°C until the time of measurement. Clotting times in the aPTT, prothrombin time (PT) and thrombin time (TT), as well as in the dedicated assays Hemoclot (a diluted TT) and the Ecarin Clotting Time (ECT), were recorded.
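
As a simple check of the spiking arithmetic, the sketch below computes the stock volume needed per plasma volume. The volumes are assumed for illustration; the authors' exact pipetting scheme is not stated.

    def spike_volume_ul(stock_g_per_l, target_mg_per_l, plasma_volume_ml):
        """Volume of DTI stock (in microlitres) to add to a plasma volume to
        reach the target final concentration, ignoring the small dilution
        introduced by the spike itself."""
        stock_mg_per_l = stock_g_per_l * 1000.0
        return target_mg_per_l * plasma_volume_ml / stock_mg_per_l * 1000.0

    # 5 g/L stock spiked into 1 mL plasma pool for each target level:
    for target in (1, 2, 3, 4, 5):                      # mg/L
        print(target, "mg/L ->", spike_volume_ul(5.0, target, 1.0), "uL")
    # The 2 mg/L therapeutic level needs only 0.4 uL of stock per mL of plasma.

Results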
For all the thrombin inhibitors investigated here, the fold increase compared to no DTI in the six pools, measured in the routine tests (aPTT, PT and thrombin time), is shown in Figure 1. The aPTT shows a non-linear concentration-response relationship, with a more gradual increase at higher DTI concentrations resulting in limited sensitivity of the assay in this range. The concentration-response relationship for the PT was linear but with different sensitivities for the different DTIs. Sensitivity was especially low for bivalirudin and lepirudin, with a maximum 2- and 3-fold increase respectively in PT clotting time at 5 mg/L. The thrombin time also showed a linear concentration-response relationship, with a steep increase in clotting time as a function of concentration, especially for lepirudin, exceeding the maximum installed measuring range (i.e. 240 sec) of the STA-R Evolution.

Figure 2 shows the data for the dedicated thrombin inhibitor tests. Results similar to those for the PT were observed for the ECT, also with respect to the differences between the direct thrombin inhibitors: lepirudin showed an increase in ratio of up to 5-fold the baseline value in the ECT. The increase in the Hemoclot assay was linear for all DTIs, with a similar increase as a function of concentration for each inhibitor.
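
A linearity claim of this kind can be checked in a few lines. The sketch below uses invented ratio data loosely approximating Figure 2; it fits a least-squares line and reports the slope and residual scatter.

    import numpy as np

    conc = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])          # DTI, mg/L
    ratio = np.array([1.00, 1.85, 2.70, 3.60, 4.45, 5.30])   # Hemoclot fold increase

    slope, intercept = np.polyfit(conc, ratio, 1)
    residuals = ratio - (slope * conc + intercept)
    print(f"slope {slope:.2f} per mg/L, intercept {intercept:.2f}, "
          f"max residual {abs(residuals).max():.3f}")
    # A near-zero maximum residual supports a linear concentration-response.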

Conclusion
In conclusion, dedicated DTI assays overcome the drawbacks of routine assays such as the PT, aPTT or TT, whose ability to discriminate between different concentrations is insufficient; this suggests that monitoring DTIs using the aPTT is obsolete. We have shown that dose-response curves of DTIs in dedicated assays such as the Hemoclot and ECT are acceptable. Moreover, these assays can be applied in a routine setting, have short turnaround times and can be used to distinguish inappropriate from appropriate dosing without the need for reanalysis after dilution. Given that a calibrator is included in the assay kit and that the test gives similar results for the different DTI formulations, the Hemoclot assay appeared in this study to be the most suitable assay for monitoring purposes. As more new oral thrombin inhibitors such as dabigatran etexilate find their way into routine practice, dedicated assays may aid the clinician in better decision-making concerning anticoagulant therapy, especially in those groups of patients in need of monitoring. However, further research is needed to properly determine therapeutic and prophylactic concentration ranges using calibrated dedicated DTI assays.

Current status
The administration of oral and intravenous direct thrombin inhibitors is increasing as more applications become available. The pharmaceutical companies pay little attention to the fact that, in certain situations, an indication of the plasma concentration is warranted.

We are currently validating a calibrated assay based on a diluted thrombin time for use in our laboratory (and clinic), as are several other laboratories nation-wide.

Future prospects
Up to now, little is known about the interference of other anticoagulants combined with DTIs (e.g. during bridging therapy) and the effects on the different dedicated assays. Future research will show the value of the different DTI assays in monitoring patients in order to distinguish proper dosing from underdosing or overdosing.

Moreover, standardisation and calibration of present and new dedicated assays for the measurement of DTIs is a major issue of concern. We are therefore currently conducting research comparing coagulation assay results with the actual concentrations of the different DTIs as measured by LC-MS/MS.

Notification
Part of this publication is included in a manuscript that will be published in the American Journal of Clinical Pathology.

References
1. Di Nisio M, Middeldorp S, Buller HR. Direct thrombin inhibitors. N Engl J Med 2005; 353: 1028-1040.
2. Stone GW, Witzenbichler B, Guagliumi G et al. HORIZONS-AMI Trial Investigators. Bivalirudin during primary PCI in acute myocardial infarction. N Engl J Med 2008; 358: 2218-2230.
3. Mehran R, Lansky AJ, Witzenbichler B et al. HORIZONS-AMI Trial Investigators. Bivalirudin in patients undergoing primary angioplasty for acute myocardial infarction (HORIZONS-AMI): 1-year results of a randomised controlled trial. Lancet 2009; 374: 1149-1159.
4. Connolly SJ, Ezekowitz MD, Yusuf S et al. RE-LY steering committee and investigators Dabigatran versus warfarin in patients with atrial fibrillation. N Engl J Med 2009; 361: 1139-1151. Erratum in: N Engl J Med 2010 Nov 4;363(19):1877
5. Schulman S, Kearon C, Kakkar AK et al. for the RE-COVER study group. Dabigatran versus warfarin in the treatment of acute venous thromboembolism. N Engl J Med 2009; 361: 2342-2352.
6. Gosselin RC, Dager WE, King JH et al. Effect of direct thrombin inhibitors, bivalirudin, lepirudin and argatroban, on prothrombin time and INR values. Am J Clin Pathol 2004; 121: 593-599.
7. Van Ryn J, Stangier J, Haertter S et al. Dabigatran etexilate – a novel, reversible, oral direct thrombin inhibitor: interpretation of coagulation assays and reversal of anticoagulant activity. Thromb Haemost 2010; 103: 1116-1127.

The authors
Joyce Curvers PhD, Volkher Scharnhorst PhD and Daan van de Kerkhof PhD
Clinical Laboratory
Catharina Hospital Eindhoven
Eindhoven
The Netherlands

The case for better blood management

Although significant progress has been made to improve blood safety and efficiency, laboratories still face workforce challenges, and the lack of global standards and of better quality controls in blood management and haemovigilance poses a threat to patient safety. To help address these challenges, a symposium was hosted at the 2011 American Association of Blood Banks (AABB) Annual Meeting with key opinion leaders from the transfusion medicine departments of the Cleveland Clinic, Children’s Hospital Los Angeles and USC Medical Center.
The attending experts broadly agreed that the key to safe, efficient blood management boils down to the ‘5Rs’: ensuring the Right patient gets the Right donor unit at the Right time, the Right way and for the Right reason. Presentations at the symposium discussed how the transfusion laboratory can deliver on the 5Rs through improvements in blood management and haemovigilance practices.

by Scott Saccal

Improving blood safety
Laboratories strive to protect patients’ health and deliver safe blood and blood components to the right person at the right time, but they are under constant pressure to do more with less – including fewer skilled laboratory technicians and scarcer financial resources. To help meet these demands, blood bank laboratories are increasingly employing automation. For example, many blood banks are standardising across instrument platforms and implementing testing technologies such as column agglutination technology (CAT). These testing methods are easier to use and help reduce the opportunity for error and variation among both technologists and tests, because they provide stable and clear endpoints that deliver accurate, objective and consistent results.

Automation helps minimise the labour-intensive, time-consuming manual tests that require specialised skills and significant experience to master, such as patient and unit typing, antibody screening and cross-matching. Automation also increases the capacity of technologists so they can focus on priorities such as time-sensitive emergency situations, difficult patient workups and quality-improvement processes. For example, computerised physician order entry has been found to reduce errors related to labour-intensive tasks by 50 percent, and errors overall by 80 percent [1, 2]. Ultimately, automated testing can increase a lab’s capacity, potentially allowing it to serve more patients while helping it operate more efficiently.

New approaches to blood management and haemovigilance
Quality controls help ensure that, from the moment a donation is made, the right blood and blood products are delivered to the right patient at the right time, in the right way and for the right reason, protecting the patient at the end of the bloodline. While great strides have been made to implement quality controls for the highest level of patient safety, much work remains to be done.

Across the globe, a total of 106 countries have national guidelines on blood management, yet no universal safety standards exist. Further, broken system links and human errors due to distraction, fatigue and inattention account for approximately 70 percent of laboratory errors, and these can have catastrophic consequences such as the inappropriate administration of blood and/or adverse reactions [3]. Giving the wrong donor unit or giving an inappropriate transfusion can lead to serious complications, disease transmission and even fatal haemolytic reactions [4].

To combat the high incidence of laboratory errors, many hospitals and clinics are appointing Transfusion Safety Officers (TSOs) to oversee work outside of the laboratory and improve patient safety during transfusions [5]. The increase in demand for blood and blood components suggests that additional measures will be needed to promote transfusion safety across departments, oversee institution-wide haemovigilance and error and accident reporting, provide education on transfusion reactions, implement guidelines, perform safety training and identify new technology for enhanced safety.

Additionally, hospitals and healthcare institutions around the world are developing ways to meet new standards – both by instituting their own rigorous policies, and by understanding and implementing the guidelines of organisations that oversee the safety of transfusions. The UK’s Serious Hazards of Transfusion (SHOT) programme and the U.S. Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network (NHSN) offer voluntary reporting structures that create a reliable source of information for the medical and scientific community about blood transfusion issues, including warning facilities about adverse events that could be systemic.

Industry groups are also striving to improve patient care and safety while maximising healthcare system efficiencies. For example, AABB collaborates with the U.S. Department of Health and Human Services on biovigilance activities, including programmes directed at a variety of domains such as donor haemovigilance and transfusion recipient haemovigilance. Through the collaboration, the organisations are gathering and analysing data to help identify trends and establish best practices for safer, more efficient transfusions and transplants [6]. Similarly, the International Society of Blood Transfusion (ISBT) and the European Haemovigilance Network (EHN) established a working group in 2004 focused on creating a common set of definitions for issues in the field, which would enable global benchmarking and is intended ultimately to increase the safety of blood donors and recipients around the world [7].

Shared commitment to patient safety
Protecting the precious life of a patient who will receive a unit of blood remains the focus of blood bankers everywhere. As the pressures and demands rise, labs are finding new ways to be efficient and haemovigilant, while never losing sight of the real person at the end of the bloodline. Despite the continued shortage of highly skilled technologists and scientists entering the laboratory science workforce, blood bankers are utilising automation and best practices to improve transfusion testing, and implementing new approaches to blood management and haemovigilance to deliver on the 5Rs of blood safety. Protecting the safety of patients through efficient blood management and haemovigilance is a commitment all of us share as part of the transfusion medicine community.

References
1. Bates DW et al. Effect of computerized order entry and a team intervention on prevention of serious medication errors. Journal of the American Medical Association 1998; 280: 1211-1212
2. Bates DW et al. The impact of computerized order entry on medication error prevention. Journal of American Medical Informatics Association 1999; 6: 313-332.
3. Kaplan HS. Getting the right blood to the right patient: the contribution of near-miss event reporting and barrier analysis. Transfusion Clinique et Biologique 2005; 2: 380-384.
4. Shulman IA. ‘Assuring that the right patient gets the right donor unit, at the right time, the right way, and for the right reason.’ Slide 6. AABB Annual Meeting, San Diego, CA. Ortho Clinical Diagnostics Blood Management Symposium, October 24, 2011.
5. Davey RJ. ‘The safety of the blood supply.’ Food and Drug Administration Division of Blood Applications webinar. Available at: http://www.fda.gov/downloads/AboutFDA/Transparency/Basics/UCM245738.pdf. Accessed October 20, 2011.
6. Biovigilance – AABB program — http://www.aabb.org/programs/biovigilance/Pages/default.aspx
7. Haemovigilance – ISBT — http://www.isbtweb.org/fileadmin/user_upload/WP_on_Haemovigilance/ISBT_StandardSurveillanceDOCO_2008__3_.pdf

The author
Scott Saccal
Worldwide Marketing Director
Transfusion Medicine
Ortho Clinical Diagnostics