Is Simplified Acute Physiology Score 3 better than APACHE II to predict mortality in transplanted critical patients?

Critical Care Volume 13 Suppl 1, 2009

29th International Symposium on Intensive Care and Emergency Medicine

Brussels, Belgium, 24-27 March 2009

Published online: 13 March 2009

These abstracts are available online at http://ccforum.com/supplements/13/S1 © 2009 BioMed Central Ltd

Time course of dyspnea evolution in the emergency department: results from the URGENT dyspnea survey

M Tavares1, P Pang2, S Laribi3, A Mebazaa3, M Gheorghiade2

1Hospital de Santo Antonio-CHP, Porto, Portugal; 2Northwestern University, Chicago, IL, USA; 3Hopital Lariboisiere, Paris, France Critical Care 2009, 13(Suppl 1):P1 (doi: 10.1186/cc7165)

Introduction There is considerable uncertainty about the reproducibility of the various instruments used to measure dyspnea, their ability to reflect changes in symptoms, whether they accurately reflect the patient's experience, and whether the evolution of dyspnea is similar between acute heart failure syndrome patients and nonacute heart failure syndrome patients. URGENT was a prospective multicenter trial designed to address these issues.

Methods Patients were interviewed within 1 hour of first physician evaluation, in the emergency department or acute care setting. Dyspnea was assessed by the patient using both a five-point Likert scale and a 10-point visual analog scale (VAS) in the sitting (60°) position and then, if dyspnea had not been rated severe or very severe while sitting, in the supine (20°) position (this sitting versus decubitus comparison is referred to below as the PDA test).

Results Very good agreements were found between the five-point Likert and VAS at baseline (0.891, P <0.0001) and between changes (from baseline to hour 6) in the five-point Likert and in VAS (0.800, P<0.0001) in acute heart failure (AHF) patients. Lower agreements were found when changes from baseline to H6 measured by Likert or VAS were compared with the seven-point comparative Likert (0.512 and 0.500 respectively) in AHF patients. The worse the dyspnea at admission, the greater the amplitude of improvement in the first 6 hours; this relationship is stronger when dyspnea is measured with VAS (Spearman's rho coefficient = 0.672) than with the five-point Likert (0.272) (both P <0.0001) in AHF patients. By the five-point Likert, only nine patients (3% (1% to 5%)) reported an improvement in their dyspnea, 177 (51% (46% to 57%)) had no change, and 159 (46% (41% to 52%)) reported worse dyspnea supine compared with sitting up in AHF patients. The PDA test with VAS was markedly different between AHF and non-AHF patients.
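
The agreement figures reported above can be illustrated in outline with a few lines of analysis code. The specific agreement coefficient and software used are not stated, so the sketch below simply applies Spearman's rank correlation to invented Likert/VAS pairs to show the shape of such a calculation; none of the values come from the URGENT dataset.

```python
# Minimal sketch (not the URGENT analysis code): pairing a five-point Likert dyspnea
# rating with a 10-point VAS score and checking their agreement with Spearman's rank
# correlation. All data below are invented for illustration only.
import numpy as np
from scipy import stats

likert_baseline = np.array([5, 4, 4, 3, 5, 2, 4, 3, 5, 4])   # 1 = none ... 5 = very severe
vas_baseline = np.array([9.1, 7.8, 8.0, 5.5, 9.6, 3.2, 7.5, 6.0, 9.9, 8.2])  # 0-10 scale

rho, p = stats.spearmanr(likert_baseline, vas_baseline)
print(f"baseline agreement: rho = {rho:.3f}, P = {p:.4g}")

# Hypothetical hour-6 values, to compare the change measured on each scale
likert_h6 = np.array([3, 2, 3, 2, 3, 1, 2, 2, 4, 3])
vas_h6 = np.array([5.0, 3.1, 4.8, 2.9, 5.5, 1.0, 3.0, 2.7, 7.2, 4.4])
rho_chg, p_chg = stats.spearmanr(likert_baseline - likert_h6, vas_baseline - vas_h6)
print(f"agreement between changes: rho = {rho_chg:.3f}, P = {p_chg:.4g}")
```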

Conclusions Both clinical tools, the five-point Likert scale and the VAS, showed very good agreement at baseline and between changes from baseline to the assessments performed 6 hours later in AHF patients. The PDA test with VAS was markedly different between AHF and non-AHF patients. Dyspnea improved within 6 hours in more than three-quarters of the patients regardless of the tool used to measure the change in dyspnea. The greater the dyspnea at admission, the greater the amplitude of improvement in the first 6 hours.

Endotracheal intubation in the emergency room: a multicenter questionnaire

SB Bélorgey1, L Montésino2

1CHSF Gilles de Corbeil, Corbeil Essonnes, France; 2CHG, Longjumeau, France

Critical Care 2009, 13(Suppl 1):P2 (doi: 10.1186/cc7166)

Introduction Endotracheal intubation (ETI) is a procedure on which the patient's life depends and that demands solid experience. A preliminary prospective study in one hospital had shown that emergency physicians (EPs) rarely performed ETI. Do the EPs in Ile de France (Paris region) have sufficient experience and regular training to perform this procedure safely in the emergency room (ER)?

Methods We conducted a descriptive telephone-based questionnaire study to assess EPs' endotracheal intubation practice across all ERs in Ile de France public hospitals. A questionnaire was completed by the investigator during a 10-minute telephone call with at least one EP in each ER. The structure of the hospitals, the number of ETIs performed, the devices and personnel available and the existence of protocols were recorded. The EPs' usual practice of sedation and intubation, their training and their proposals for change were also noted.

Results The study covered all 64 public hospitals of Ile de France. Fifty-six hospitals had an ICU and 37 a mobile ICU. We questioned 96 EPs, that is, 10% of the EPs in our region; all 96 EPs called responded. Ninety percent of these physicians were certified emergency physicians (CAMU). The median number of ETIs reported was 24.5/year per ER. Thirty-eight percent of EPs had performed fewer than five ETIs during the past 2 years. The reported success rate was 85%. In 94% of ERs, metallic blades and an Eschmann-type introducer (mandrin) were available, and about two nurses could help during the procedure. The most frequently cited predictive criteria for difficult ETI were: short neck, obesity, small mouth opening and otorhinolaryngology disease or a previous history of cervical radiotherapy. Seventy-six percent of EPs followed the recommendations for preoxygenation and 91% performed rapid sequence induction. The vast majority (76%) of ERs did not have standardized procedures for airway management. Theoretical training had been acquired by 46% of EPs through the CAMU, and practical training had occurred in the operating room for 71%. Among the EPs interviewed, 87% believed that they should remain the principal actor for ETI, although as many as 89% considered themselves insufficiently trained in ETI management and only 41% pursued continuing medical education on that theme. Seventy-seven percent proposed spending time in the operating room to improve their practice of ETI.

Conclusions ETI is rarely performed in the ER. It should be part of the EP curriculum, and written procedures should be established.

Radiological validation of endotracheal tube insertion depth in prehospital emergency patients

DM Maybauer1, MO Maybauer1, H Wolff2, E Pfenninger2, W Geisser3

1The University of Texas Medical Branch and Shriners Burns Hospital, Galveston, TX, USA; 2University Hospital, Ulm, Germany; 3County Clinics, Dillingen, Germany Critical Care 2009, 13(Suppl 1):P3 (doi: 10.1186/cc7167)

Introduction Incorrect positioning of the endotracheal tube (ETT) within the airway after emergent intubation can result in serious complications. Accidental mainstem bronchus intubation is associated with contralateral atelectasis, tension pneumothorax, hypotension, and decreased survival. Conversely, failure to place the tube several centimeters beyond the vocal cords may result in inadvertent extubation, aspiration, pneumonia, or laryngeal spasm [1]. The aim of this study was to investigate the occurrence of ETT malpositioning after emergency intubation in the out-of-hospital setting.

Methods A retrospective study of a 5-year period, using records of 1,081 patients admitted to the trauma emergency room (ER) at a university hospital. Within 30 minutes after admission, a chest X-ray or whole-body CT scan was routinely performed in intubated patients to determine the tube tip-carina relationship.

Results Sixteen out of 1,081 patients died immediately after admission to the trauma ER and were not further radiologically assessed. Of the surviving 1,065 patients, 346 (32.5%) were female and 719 (67.5%) male. In the group of 488 intubated patients, 346 (70.9%) were correctly intubated and 89 (18.2%) were not correctly intubated; of the latter, 64 patients (14.7%) had a tube tip-carina distance <2 cm and 25 patients (5.7%) were endobronchially intubated. Chest X-ray scans were not available for 53 patients (10.9%). Detailed data on ETT placement were available in 435 patients: 346 (79.5%) with correct ETT placement and 89 (20.5%) with incorrect ETT placement. None of the patients had an esophageal or pharyngeal intubation (0%). Of the 435 patients, 324 had been intubated preclinically on scene; 254 (78.4%) were correctly intubated and 70 (21.6%) were not correctly intubated.

Conclusions This study clearly shows that ETT misplacement in emergency patients is still a serious problem with an incidence of 21.6% in our study, of which 5.7% were endobronchially intubated. We conclude that the skill level of the operator may be key in determining efficacy of endotracheal intubation. Based on our findings, all efforts should be made to verify the tube position with immediate radiographic confirmation after admission to the ER.

Reference

1. Owen RL, Cheney FW: Endobronchial intubation: a preventable complication. Anesthesiology 1987, 67:255-257.

Does bedside chest ultrasound in the ICU improve early diagnosis and quick resolution of pleural effusion?

L Tutino1, R Spina2, A Di Filippo1, S Batacchi2, G Cianchi2, A Peris2

1University of Florence, Italy; 2Careggi Teaching Hospital, Florence, Italy

Critical Care 2009, 13(Suppl 1):P4 (doi: 10.1186/cc7168)

Introduction A bedside chest ultrasound (bCUS) programme performed by intensivists after 18 months of training was introduced on a regular basis in a 10-bed emergency ICU from April to November 2008, in order to check its effectiveness in the early diagnosis and treatment of pleural effusion (PE). The results have been compared with those of a control group from a period when bCUS was not part of the ICU procedures.

Methods The procedure was performed within the first 24 hours after admission. All of the 92 patients were examined supine, with the probe perpendicular to the chest wall, using all of the intercostal spaces as the acoustic window. With this technique, once we identified the lung's base, we looked for signs of PE according to the following criteria: (a) a space between the two pleural layers; (b) variation in the interpleural distance of this space during breathing. For each patient the following data were collected: age, sex, weight, height, SAPS II, number of chest drains, number of ultrasound scans performed, number of significant PEs (at least 2 cm of width between the two pleurae), amount of ultrasound-guided drainage actually performed within the first 24 hours, and timing of resolution of PE. Data were compared with those of a group of patients admitted to the ICU from January to March 2008, when bCUS was not part of the daily procedures and only chest CT and X-ray scans were used as evidence of PE. We considered P <0.05 statistically significant.

Results A total of 103 bCUS were performed on the first day in the control group, against the 1 2 bCUS performed in the study group. A total of 59 PEs for which drainage proved to be useful were found. Twenty-seven pleural drainages were performed within the first 24 hours. We have no evidence of complications. All of the cases positive for PE were successfully treated. All drainage was performed within the first 24 hours or at least within the first 48 hours.

Conclusions In the study group, 45% of drainages were performed within the first 24 hours, compared with 25.9% in the control group, thanks to the skill of the intensivists. As far as PE is concerned, the introduction of bCUS performed by intensivists into the daily ICU routine increases early diagnosis and treatment. However, the increase in the number of first-day treatments was not significant, since this procedure is still turning from a purely diagnostic approach into an operative one; this will necessarily take time.

Portable chest radiography in mechanically ventilated ICU patients: does synchronizing with end-inspiration improve the quality of films?

A Cheng1, K Tang1, H Yip1, W Kwan1, Y Lee2, A Lee2, K Lee2, K Wong1, C Gomersall2

1North District Hospital, Sheung Shui, Hong Kong; 2Prince of Wales Hospital, The Chinese University of Hong Kong, Shatin, Hong Kong

Critical Care 2009, 13(Suppl 1):P5 (doi: 10.1186/cc7169)

Introduction The quality of portable chest radiographs (CXRs) taken in the ICU is inferior to films taken in the radiology department. The inability of sedated, ventilated patients to hold their breath during CXR will also affect the degree of lung inflation and contribute to lack of correlation between serial CXR changes and clinical status [1]. We studied the effect on CXR quality by manually synchronizing the ventilator to end-inspiration in mechanically ventilated ICU patients.

Methods A pair of CXRs was taken after recruiting intubated, ventilated patients within 24 hours of emergency ICU admission. Intubated post-elective surgical patients were excluded due to the high likelihood of normal lungs. The control film was taken in the usual way, at a random phase of the ventilator cycle. For the synchronized film, the investigator wore a lead apron and dosimeter, stood 1 to 1.5 meters away from the patient, and pressed the inspiratory hold button. The sequence of the paired films was computer-randomized. The ventilator model, settings, patient position and portable X-ray machine settings were kept constant between films. Patients served as their own controls. Films were independently scored (1 = not clear/poorly inflated, 5 = very clear/well inflated) by two specialist radiologists based on five criteria: (i) clarity of lines and tubes, (ii) definition of pulmonary vasculature, (iii) visibility of mediastinum, (iv) definition of the diaphragm and (v) degree of lung inflation. Linear regression, taking the two radiologists' scores for each patient into account, was used to examine whether there were any differences in the criteria ratings between random and synchronized films. Radiologists and statistician were blinded.
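
As an illustration of the film-quality comparison described in the Methods, the sketch below fits a simple linear model of quality score on film type and rater using hypothetical data. It is only an approximation: the exact regression specification, the dataset and any handling of within-patient correlation are not given, and all variable names and values are invented.

```python
# Illustrative sketch only: each hypothetical patient contributes a random-phase and a
# synchronized film, each scored by two radiologists; an ordinary least squares model
# with film type and rater as predictors approximates (but is not) the published analysis.
import pandas as pd
import statsmodels.formula.api as smf

rows = []
for pt in range(1, 6):                                  # 5 hypothetical patients
    for film in ("random", "synchronized"):
        for rater in ("A", "B"):
            base = 3.0 if film == "random" else 4.0     # synchronized films score higher
            score = base + (0.2 if rater == "B" else 0.0) + 0.5 * (pt % 2)
            rows.append({"patient": pt, "film": film, "rater": rater,
                         "inflation_score": score})
df = pd.DataFrame(rows)

model = smf.ols("inflation_score ~ C(film) + C(rater)", data=df).fit()
print(model.params)      # coefficient for film[T.synchronized] = mean score difference
print(model.pvalues)
```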

Results We recruited 110 patients; there were no complications from the breath-hold maneuver. Dosimeter readings were negligible. Synchronized films had higher total scores and higher mean scores for criteria (ii) to (v), with statistically significant P values: total score, P <0.001; criterion (ii), P = 0.001; (iii), P <0.001; (iv), P <0.01; and (v), P <0.001.

Conclusions Synchronizing the CXR to end-inspiration improves the quality of the film and is safe.

Reference

1. Langevin PB, Hellein V, Harms SM, Tharp WK, Cheung-Seekit C, Lampotang S: Synchronization of radiograph film exposure with the inspiratory pause. Effect on the appearance of bedside chest radiographs in mechanically ventilated patients. Am J Respir Crit Care Med 1999, 160:2067-2071.

Quality assurance report on the use of continuous positive airway pressure and end-tidal carbon dioxide during respiratory distress in field emergency care

D Lain1, S Bourn2

Oridion Capnography, Needham, MA, USA; 2AMR, Aurora, IL, USA Critical Care 2009, 13(Suppl 1):P6 (doi: 10.1186/cc7170)

Introduction The use of continuous positive airway pressure (CPAP) is beneficial in the hospital and home care environment. It is used to support ventilation during neurological disease, ventilatory defects, congestive heart failure and obstructive sleep apnea. Field emergency medicine has inherent complications for the delivery and monitoring of patients receiving CPAP. We completed an internal quality audit to determine whether CPAP had benefit and whether capnography could be comfortably used in parallel with a CPAP device to monitor ventilation.

Methods Data were collected on patients with respiratory distress, pre-CPAP and post-CPAP. Patients were monitored with capnography and pulse oximetry. Emergency Medical Services and Emergency Department staff evaluated acceptance and ease of use of the equipment. A one-tailed paired test and descriptive statistics were completed.

Results Eighteen respiratory distress patients received CPAP: eight female, nine male, and one patient had missing data (the sex entry was blank). Mean age was 79 years. Statistical significance was set at P <0.05. There was no significant difference in heart rate: mean pre-CPAP = 116, mean post-CPAP = 114, P = 0.19. There was a significant improvement in arterial oxygen saturation percentage: pre-CPAP = 81.5, mean post-CPAP = 90.2, P = 0.0003. There was a significant improvement in end-tidal carbon dioxide (etCO2), a ventilation parameter: mean pre-CPAP = 40.1, mean post-CPAP = 35.1, P = 0.038. There was a significant decrease in respiratory rate: pre-CPAP = 36, post-CPAP = 33, P = 0.031. Using a Borg scale for severity of respiratory distress, there was a significant improvement after CPAP: pre-CPAP = 8.06, post-CPAP = 5.7, P = 0.0001. The emergency medical technicians found the devices (CPAP, mask, and etCO2) easy to use, and 16 patients ranked them comfortable. Two patients were uncomfortable with CPAP. The most comfortable score was 10; the population scored the overall comfort of CPAP with etCO2 at 7.8.

Conclusions CPAP in field emergency medicine can be easily applied, is well tolerated, and its results can be monitored by capnography. Capnographic measurements indicated improved ventilation by a decrease in carbon dioxide. CPAP and etCO2 monitoring can be used in field emergencies to support and monitor ventilation during respiratory distress.
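
The pre/post comparisons above rest on a one-tailed paired test. A minimal sketch of that calculation is shown below using invented oxygen saturation values (the study's raw data are not available); the `alternative` argument of `scipy.stats.ttest_rel` requires SciPy 1.6 or later.

```python
# Sketch only: one-tailed paired t-test on hypothetical pre/post-CPAP SpO2 values.
import numpy as np
from scipy import stats

spo2_pre = np.array([78, 82, 80, 85, 79, 84, 81, 83])    # hypothetical pre-CPAP SpO2 (%)
spo2_post = np.array([88, 90, 89, 92, 87, 91, 90, 93])   # hypothetical post-CPAP SpO2 (%)

# One-sided test that SpO2 improves after CPAP (SciPy >= 1.6 for `alternative`)
t, p = stats.ttest_rel(spo2_post, spo2_pre, alternative="greater")
print(f"mean pre = {spo2_pre.mean():.1f}, mean post = {spo2_post.mean():.1f}, "
      f"t = {t:.2f}, one-tailed P = {p:.4f}")
```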

Cardiogenic oscillations extracted from spontaneous breathing airway pressure and flow signal are related to chest wall motility and continuous positive airway pressure

S Schumann1, F Messmer1, M Lichtwarck-Aschoff2, C Haberthuer3, K Moeller4, J Guttmann1

1University Medical Center, Freiburg, Germany; 2University Hospital, Uppsala, Sweden; 3Kantonsspital, Luzern, Switzerland; 4University of Applied Sciences, Furtwangen, Germany

Critical Care 2009, 13(Suppl 1):P7 (doi: 10.1186/cc7171)

Introduction During mechanical ventilation, signal pulses within pressure and flow curves can be observed that are related to the activity of the beating heart. From a signal-processing view, these cardiogenic oscillations (COS) can be understood as repeated pulses that are transferred via the lungs and airways to the airway opening. It was demonstrated earlier that COS, obtained during breath-holding maneuvers, were influenced by changes in the mechanical properties of the thorax [1]. We hypothesized that these COS can be extracted from the airway pressure and flow signal during spontaneous breathing. Furthermore, we hypothesized that these isolated signals contain information about the mechanical properties of the respiratory system.

Methods Fifteen healthy volunteers breathed spontaneously against continuous positive airway pressure (CPAP) of 0 to 9 cmH2O with the thorax either unrestricted or bound, limiting motility to 90% of the normal circumference. Airway pressure and flow as well as an electrocardiogram were recorded at a sample frequency of 200 Hz. To isolate the signals that are related to the activity of the heart, pressure and flow data were aligned in time to the R-wave of the QRS complex and averaged.
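
A minimal sketch of the extraction step described in the Methods is given below: windows of the airway pressure signal are aligned to each R-wave and ensemble-averaged so that the breathing component largely cancels and the cardiogenic oscillation remains. The signal, R-wave times, heart and breathing rates and window length are all synthetic assumptions; this is not the authors' implementation.

```python
# Sketch: R-wave-aligned ensemble averaging of a synthetic airway pressure signal.
import numpy as np

fs = 200                                                 # sampling frequency (Hz), as in the abstract
t = np.arange(0, 60, 1 / fs)                             # 60 s of synthetic recording
paw = 5 + 2.0 * np.sin(2 * np.pi * 0.25 * t)             # breathing component, 15 breaths/min
paw += 0.2 * np.sin(2 * np.pi * 1.25 * t)                # small cardiogenic component, 75 beats/min
r_wave_idx = np.arange(0, len(t), int(fs / 1.25))        # idealized R-wave sample indices

win = int(0.8 * fs)                                      # one cardiac cycle per window
segments = [paw[i:i + win] for i in r_wave_idx if i + win <= len(paw)]
cos_pressure = np.mean(np.vstack(segments), axis=0)      # ensemble-averaged COS waveform

print(f"extracted COS pressure amplitude ~ "
      f"{cos_pressure.max() - cos_pressure.min():.2f} cmH2O")
```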

Results Highly characteristic pressure and flow oscillations could be extracted from the spontaneous breathing signals. Both signals were closely related (r² = 0.97 ± 0.02, each P <0.0001). The pressure amplitude of the COS was influenced by CPAP (P = 0.049) and thorax motility (P <0.001); in contrast, the flow amplitude was influenced only by thorax motility (P <0.001).

Conclusions COS can be extracted from the airway pressure and flow signal during spontaneous breathing. They contain information about the mechanical properties of the respiratory system. After further investigation, our new method may allow an estimation of the compliance of the respiratory system during spontaneous breathing.

Reference

1. Bijaoui E, Baconnier PF, Bates JH: Mechanical output impedance of the lung determined from cardiogenic oscillations. J Appl Physiol 2001, 91:859-865.

Noninvasive positive pressure ventilation in infants with respiratory failure

I Lazar, Y Cavari, S Sofer

Soroka Medical Center and Faculty of Health Sciences, Ben-Gurion University of the Negev, Beer Sheva, Israel

Critical Care 2009, 13(Suppl 1):P8 (doi: 10.1186/cc7172)

Introduction Respiratory failure is a common indication for admission to a pediatric intensive care unit (PICU). Tracheal intubation and invasive ventilation carry some risk and can contribute to morbidity and possibly mortality. Noninvasive positive pressure ventilation (NIPPV) is a mode in which ventilation is applied without tracheal intubation, via nasal prongs or a face mask. We hypothesized that using NIPPV in infants with impending respiratory failure may improve their outcome.

Methods In this prospective study, we enrolled infants admitted to the PICU with impending respiratory failure. NIPPV was delivered using silicone INCA nasal prongs (Ackrad Labs, Trumbull, CT, USA) connected to an INFANT STAR 950 standard ventilator. The NIPPV mode of support (continuous positive airway pressure vs. synchronized nasal intermittent positive pressure ventilation) was set to meet patient needs. Vital signs, ventilator settings and laboratory results were recorded electronically. The primary outcome was prevention of invasive ventilation. Secondary outcomes were identification of a physiological marker of NIPPV failure and complications of NIPPV.

Results Between December 2007 and November 2008, 18 patients (15 infants, three enrolled twice) were eligible to receive NIPPV based on the attending physician's decision. Median age was 3 months (range 15 days to 17 months). Eleven patients had apnea and seven patients had respiratory distress. Among them, five patients had bronchiolitis, two had central hypotonia, two had pertussis and six patients had other miscellaneous respiratory conditions needing support. Fourteen infants (78%) improved and were weaned off NIPPV, while four (22%) infants had to be intubated. The mean length of NIPPV was 47 hours. No complication secondary to NIPPV was recorded. No single vital sign or laboratory test predicted whether a patient would improve or fail NIPPV, although improvement, when it occurred, was noticed within hours of starting NIPPV. All four failures were due to continuing apnea.

Conclusions Our study shows that NIPPV is a safe and successful method of supporting infants with impending respiratory failure. The lack of a physiological or biological marker that could distinguish so-called responders from nonresponders to NIPPV, together with the high rate of success in preventing the need for invasive ventilation, supports our recommendation to trial NIPPV in any infant with impending respiratory failure.

Functional respiratory effects of noninvasive ventilation in acute hypercapnic patients with chronic obstructive pulmonary disease

AG Graziani1, B Carenzi1, F Morgagni1, B Pratico2, P Casalini1, GF Stefanini1

1Ospedale per Gli Infermi, Faenza, Italy; 2Ospedale Bufalini, Cesena, Italy

Critical Care 2009, 13(Suppl 1):P9 (doi: 10.1186/cc7173)

Introduction Noninvasive ventilation (NIV) is useful for improving oxygenation, decreasing hypercapnia, and avoiding invasive mechanical ventilation in patients with chronic obstructive pulmonary disease (COPD) exacerbations and acute respiratory failure [1,2]. During hospitalization, the treatment intensity is quite variable depending on the patient's clinical condition. Often, after the acute phase, patients need only periods of NIV followed by oxygen therapy. NIV reduces the work of breathing and probably improves respiratory muscular strength even after ventilation is ended [3]. The aim of this study was to evaluate this effect in COPD patients submitted to NIV.

Methods We studied 10 patients (seven male, three female; age 74.2 ± 5.3 years) admitted to our hospital with a COPD exacerbation and acute respiratory failure. After a period of clinical stabilization, patients underwent spirometry and arterial blood gas analysis as baseline examinations. For each patient, NIV was applied for 2 hours and then spirometry and arterial blood gas analysis were repeated. Results are presented as mean ± SD. Data were evaluated by paired t test and P <0.05 was taken as statistically significant.

Results At the end of NIV, the forced vital capacity (FVC) was significantly increased and the forced expiratory volume in the first second (FEV1)/FVC ratio significantly decreased compared with the basal values: FVC 47.9 ± 11.0% basal, 55.3 ± 15.6% after NIV, P = 0.02; FEV1/FVC 68.5 ± 17.1 basal, 59.3 ± 15.2 after NIV, P = 0.007. Compared with the basal value, arterial carbon dioxide (pCO2) was significantly decreased: 68.0 ± 2.5 mmHg (basal), 55.3 ± 4.5 mmHg (after NIV), P <0.02; and arterial oxygen was increased: 50.0 ± 14.2 mmHg (basal), 56.3 ± 15.3 mmHg (after NIV).

Conclusions The results of this study demonstrate that, in COPD exacerbation, NIV produces a decrease in pulmonary hyperinflation. This effect could improve the clinical status and the work of breathing.

References

1. West JB: COPD. In Pulmonary Physiology and Pathophysiology. Baltimore, MD: Lippincott Williams & Wilkins; 2001:3353.

2. Brochard L: Mechanical ventilation: invasive versus non invasive. Eur Respir J 2003, 22(Suppl 47):31s-37s.

3. Lightowler JV, et al.: Non-invasive positive pressure ventilation to treat respiratory failure resulting from exacerbation of COPD disease: Cochrane systematic review and metaanalysis. BMJ 2003, 326:185-187.

Clinical application of noninvasive ventilation in acute respiratory failure in a general ICU

T Correa, L Morais, P Sanches, H Feitosa, C Azevedo, T Figueiredo, C Taniguchi, R Caserta, C Barbas

Hospital Israelita Albert Einstein, Sao Paulo, Brazil Critical Care 2009, 13(Suppl 1):P10 (doi: 10.1186/cc7174)

Introduction Noninvasive ventilation (NIV) is an option in patients with acute respiratory failure [1-3]. However, NIV complications and failure are not yet completely understood [1,2]. Our objective was to evaluate the indications, complications and failure of NIV in an adult general ICU.

Methods From 1 August to 27 October we analyzed prospectively all patients admitted to a 40-bed clinical-surgical ICU of a tertiary care hospital. From those patients, we included the ones who received NIV (total face mask coupled to a BIPAP Vision® or Synchrony®) and evaluated the indications, causes of failure and the complications of this ventilatory support.

Results During the study period, 465 patients were admitted to the ICU; 111 patients (23.9%) received NIV. The main indications for NIV were: hypoxemic respiratory failure in 22 patients (19.8%), respiratory infection in 19 (17.1%), acute COPD in 14 (12.6%), as part of a weaning strategy in 16 (14.4%), cardiogenic pulmonary edema in 15 (13.5%), ALI/ARDS in nine (8.1%), palliative care in six (5.4%), neuromuscular disease in one (0.9%) and other indications in nine patients (8.1%). NIV did not avoid intubation in 31 patients (27.9%). The main reasons for failure were: progressive acute respiratory failure in 23 patients (71.9%) and neurological deterioration in five patients (15.6%). NIV was used after extubation in 16 patients, and five of them (31%) required reintubation. The only complication observed was gastric insufflation in six patients (5.4%).

Conclusions NIV is frequently used in a general ICU, and the main indication is acute hypoxemic respiratory failure. The NIV failure rate was substantial but similar to that reported in the medical literature.

References

1. Demoule A, et al.: Benefits and risk of success or failure of noninvasive ventilation. Intensive Care Med 2006, 32:1756-1765.

2. Celikel T, et al.: Comparison of noninvasive positive pressure ventilation with standard medical therapy in hypercapnic acute respiratory failure. Chest 1998, 114:1636-1642.

3. Wysocki M, et al.: Noninvasive pressure support ventilation in patients with acute respiratory failure. A randomized comparison with conventional therapy. Chest 1995, 107: 761-768.

Relative effectiveness of two nasal continuous positive airway pressure devices in VLBW infants: first report from a multicenter, randomized, controlled trial

K Bober1, J Swietlinski2, J Zejda1, K Kornacka3, D Pawlik4, J Behrendt1, E Gajewska5, M Czyzewska5, P Korbal6, J Witalis7, W Walas8, A Turzanska9, M Wilinska10, G Zielinski11, B Czeszynska12, T Bachman13

1Medical University of Silesia, Katowice, Poland; 2The Children's Memorial Health Institute, Warsaw, Poland; 3Medical University Warsaw, Poland; 4Medical College Jagiellonian University of Cracow, Poland; 5Medical University Wroclaw, Poland; 6Regional Hospital Bydgoszcz, Poland; 7Regional Hospital Rzeszow, Poland; 8Regional Hospital Opole, Poland; 9Center of Medical Postgraduate Education Warsaw, Poland; 10Regional Hospital Lomza, Poland; 11Regional Hospital Czestochowa, Poland; 12Pomeranian Medical University Szczecin, Poland; 13California State University, San Bernardino, CA, USA

Critical Care 2009, 13(Suppl 1):P11 (doi: 10.1186/cc7175)

Introduction Nasal continuous positive airway pressure (nCPAP) is accepted as an effective and relatively complication-free method of respiratory support for premature infants [1,2]. We intended to compare the effectiveness of two nCPAP devices/approaches (Hudson prongs/bubble (H) and Infant Flow (IF)) in different groups of very low birth weight infants in a large trial.

Methods Infants weighing 750 to 1,500 g, of gestational age <32 weeks, were enrolled from April 2006 to July 2008 at 12 centers. The newborns, categorized into three study groups, were randomly assigned to one of the nCPAP devices in the first 6 hours of life. Study group A (n = 119) were neonates placed immediately on nCPAP. Group B (n = 157) were placed on nCPAP after receiving surfactant. Group C (n = 56) were treated with conventional ventilation, and nCPAP was used as the method of weaning from mechanical ventilation.

Results There were no statistically significant differences between the two devices with regard to treatment success, pneumothorax or bronchopulmonary dysplasia. The incidence of severe nasal complications was lower in the infants treated with Infant Flow: 0.8% (IF) vs. 6.6% (H) (P = 0.01) in group A; 0.6% (IF) vs. 5.1% (H) (P = 0.01) in group B; and 0% (IF) vs. 5.3% (H) (P = 0.1) in group C. The incidence of necrotizing enterocolitis was also lower in the group A infants treated with Infant Flow: 2.5% (IF) vs. 8.3% (H) (P = 0.03).
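
The between-device comparisons above are proportions in two arms, for which a two-by-two test such as Fisher's exact test is the natural sketch. The counts below are invented purely to illustrate the calculation; the abstract reports percentages and P values only, and the exact test actually used is not stated.

```python
# Illustrative two-by-two comparison of severe nasal complications, Infant Flow vs
# Hudson prongs. Counts are hypothetical, not reconstructed from the study.
from scipy import stats

#                     [complication, no complication]
infant_flow_arm = [1, 119]
hudson_arm = [8, 112]

odds_ratio, p = stats.fisher_exact([infant_flow_arm, hudson_arm])
print(f"Fisher exact test: OR = {odds_ratio:.2f}, P = {p:.4f}")
```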

Conclusions (1) The two nCPAP methods are comparable with regard to the incidence of pulmonary complications and primary effectiveness. (2) The Infant Flow method resulted in fewer severe nasal complications.

References

1. Morley CJ, Davis PG, Doyle LW, Brion LP, Hascoet JM, Carlin JB, COIN Trial Investigators: Nasal CPAP or intubation at birth for very preterm infants. N Engl J Med 2008, 358:700-708.

2. Swietlinski J, Bober K, Gajewska E, Helwich E, Lauterbach R, Manowska M, et al., Polish Noninvasive Respiratory Support Program Study Group: Introduction of Infant Flow nasal continuous airway pressure as the standard of practice in Poland: the initial 2-year experience. Pediatr Crit Care Med 2007, 8:109-114.

Contribution of noninvasive ventilation in the precocious extubation in the medical ICU

B Charra, A Hachimi, A Benslama, S Motaouakkil

Hôpital Ibn Rochd, Casablanca, Morocco

Critical Care 2009, 13(Suppl 1):P12 (doi: 10.1186/cc7176)

Introduction During the past decade, noninvasive ventilation (NIV) has established itself as an alternative to endotracheal intubation. Several recent studies suggest that this technique could also be beneficial for early extubation. The purpose of our survey was to evaluate the role of NIV in the early extubation of ventilated patients.

Methods A prospective, randomized and controlled survey was conducted in a medical ICU over 6 months (June to December 2007). After 48 hours of mechanical ventilation (MV), if the patient had no fever, no neurological anomalies, no hemodynamic instability and SaO2 >90% with 40% FiO2, a spontaneous breathing trial (T-piece) was performed. If after the T-piece test the clinical status, blood gases and hemodynamic data were good, the patient was extubated. If these criteria were not fulfilled, MV was continued and a daily assessment performed. On the other hand, if the patient was anxious or agitated, with polypnea >35/minute, PaO2 <50 mmHg under 40% FiO2, heart rate >145/minute, systolic arterial pressure >170 mmHg or <70 mmHg, or arrhythmia, the patient was randomized to one of two protocols: in the first, the patient was extubated and NIV (pressure support with positive end-expiratory pressure) via a facial mask was applied; in the second, classic weaning was performed with pressure support ventilation. Quantitative variables are expressed as the mean or median ± standard deviation, and qualitative variables as percentages. Univariate analysis was based on the chi-squared test or Fisher test for qualitative variables and the Student t test for quantitative ones. P <0.05 was considered significant. The statistical analysis was performed with SPSS 11.0 for Windows.

Results Twenty-four patients (13 men and 11 women) were enrolled, 12 in each group (NIV group and control group). The mean age was 42 ± 2 years. The length of hospitalization was shorter in the NIV group than in the control group (P = 0.001). Weaning from MV was achieved earlier in the NIV group than in the control group (P = 0.001). Nosocomial pneumonia also occurred less often in the NIV group than in the control group (P = 0.04). No death was recorded.

Conclusions NIV appears to shorten the duration of MV and the ICU length of stay when used to support early extubation.

Survey of current intubation practices in Polish neonatal and pediatric ICUs

J Swietlinski1, J Zejda2, G Brozek2, E Musialik-Swietlinska2, M Migdal1, K Bober2

1The Children's Memorial Health Institute, Warsaw, Poland;

2Medical University of Silesia, Katowice, Poland

Critical Care 2009, 13(Suppl 1):P13 (doi: 10.1186/cc7177)

Introduction Guidelines pertaining to the details of intubation practices in neonates and children are not well established. We sought to describe the current practices with regard to the intubation of newborns and children.

Methods The study was performed in 2007. Anonymous questionnaires were sent to all Polish neonatal (n = 418) and pediatric (n = 45) ICUs. The overall response rate was 65%. The response rate for level III neonatal care units was 86.7% and for pediatric ICUs (PICUs) was 65.1%. Responders were asked to provide information regarding the frequency of specific practices. Data analysis (the difference between neonatal units and PICUs) was performed using procedures available in SAS software.

Results Seventy-four percent of neonatal units and 89.3% of PICUs have a policy for elective intubation. Only some of the units have a written policy (from 48% in level III neonatal ICUs to 19.4% in level I neonatal units). Unplanned extubation was reported as an important problem in 3.0% of neonatal units and 7.1% of PICUs (P = 0.05). A written protocol for difficult airway intubation was available in 48.1% of PICUs and 3.0% of neonatal units (P = 0.0001). In total, 92.2% of PICUs have regular sedation practices for elective intubation; only 44.6% of neonatal units have such a policy (P = 0.0001). Numerous combinations of sedatives and muscle relaxants were reported. However, the most common policy in neonatal units (69.0%) was a single dose of midazolam, whereas a combination of thiopental or midazolam, a muscle relaxant and/or atropine was used frequently (79.4%) in PICUs. Only 74% of PICUs and 37.5% of neonatal ICUs have a policy for elective extubation (P = 0.0001).

Conclusions When compared with similar studies, a lower proportion of Polish neonatal and pediatric ICUs have a written policy for elective intubation. Only a minority of PICUs fail to provide any sedation prior to elective intubation, but more than one-half of neonatal units have no such policy despite strong evidence of its physiologic and practical benefits; this phenomenon has also been reported by others. The lack of written guidelines for the extubation procedure is another target for a future educational programme.

Incidence of post-intubation hemodynamic instability associated with emergent endotracheal intubations: a systematic review

R Green1, B Hutton2, L McIntyre2, D Fergusson2

1Dalhousie University, Halifax, NS, Canada; 2Ottawa Health

Research Institute, Ottawa, ON, Canada

Critical Care 2009, 13(Suppl 1):P14 (doi: 10.1186/cc7178)

Introduction The development of hemodynamic instability following emergent endotracheal intubation and the initiation of positive pressure ventilation is a potentially life-threatening adverse event. Unfortunately, the incidence of post-intubation hemodynamic instability (PIHI) is relatively unknown. The objective of this study is to estimate the risk of PIHI in adult patients who require emergent intubation and to identify factors contributing to the likelihood of this adverse event.

Methods This is a systematic review of published adult, in-hospital studies of emergent endotracheal intubation. A systematic search of Medline (1950 to November 2008) and relevant bibliographies was completed. No restrictions were placed on the language of publication, patient diagnosis, indication for intubation, or intubation method employed. One author independently reviewed all citations, and two authors reviewed all candidate articles during the process of final selection. Data were independently retrieved on a standardized data abstraction form by two authors. Random-effects meta-analysis was used to estimate the pooled prevalence of PIHI across studies.

Results A total of 22 relevant studies were identified and included in our analysis. One randomized controlled trial and 21 observational studies met the eligibility criteria. Sample sizes ranged from 33 to 2,833 patients (median, 214). The prevalence of post-intubation hypotension ranged from 0% to 39%, with a random-effects pooled estimate of 8.5% (95% CI, 4.8% to 14.5%). Studies that defined PIHI with a temporal relationship between blood pressure reduction and intubation had a PIHI prevalence of 13.9% (95% CI, 8.8% to 21.2%) compared with a prevalence of 5.0% (95% CI, 1.6% to 15.0%) in studies that did not. Heterogeneity between studies limits conclusions on the effect of indication for intubation, intubator experience, medications utilized to facilitate intubation, and management strategies used for PIHI.

Conclusions Post-intubation hemodynamic instability occurs commonly after emergent intubations. Efforts are required to identify risk factors, and potential preventative and therapeutic interventions, for PIHI.
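
The pooled prevalence above comes from a random-effects meta-analysis. As a hedged illustration of how such a pooled estimate and its confidence interval can be obtained, the sketch below applies a DerSimonian-Laird model to logit-transformed proportions from invented study counts; the review's actual data, transformation and software are not stated here.

```python
# Sketch of a random-effects pooled prevalence (DerSimonian-Laird, logit scale).
# Study counts are hypothetical placeholders, not the studies in this review.
import numpy as np

events = np.array([5, 20, 3, 40, 12])          # hypothetical PIHI events per study
totals = np.array([60, 200, 33, 500, 150])     # hypothetical intubated patients per study

p = events / totals
logit = np.log(p / (1 - p))
var = 1 / events + 1 / (totals - events)        # approximate variance of each logit

w = 1 / var                                     # fixed-effect weights
fixed = np.sum(w * logit) / np.sum(w)
Q = np.sum(w * (logit - fixed) ** 2)            # heterogeneity statistic
tau2 = max(0.0, (Q - (len(p) - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

w_re = 1 / (var + tau2)                         # random-effects weights
pooled_logit = np.sum(w_re * logit) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
lo, hi = pooled_logit - 1.96 * se, pooled_logit + 1.96 * se

def inv_logit(x):
    return np.exp(x) / (1 + np.exp(x))

print(f"pooled prevalence = {inv_logit(pooled_logit):.3f} "
      f"(95% CI {inv_logit(lo):.3f} to {inv_logit(hi):.3f})")
```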

A novel method to develop an elastic, thin-walled, leak-proof, inflatable tracheal tube cuff

M Cressoni1, A Zanella2, M Epp3, I Corti3, V Hoffmann3, P Cadringher1, T Kolobow3

1Policlinico IRCCS, Milano, Italy; 2Universita' Milano-Bicocca, Monza, Italy; 3National Institutes of Health, Bethesda, MD, USA Critical Care 2009, 13(Suppl 1):P15 (doi: 10.1186/cc7179)

Introduction Commercially available endotracheal tube (ETT) cuffs are made of PVC polymer that shows little stretch upon inflation, resulting in a need for a cuff diameter larger than the trachea. Such cuffs form folds, which become a ready passageway for bacteria-colonized subglottic secretions. We developed an elastic and smooth balloon with no folds upon inflation.

Methods In vitro study: six different ETTs with the newly designed cuff were tested for leakage in a 20 mm internal diameter acrylic tube at 20 cmH2O (1.96 kPa) inflation pressure, pouring 20 ml of methylene-blue colored water into the acrylic tracheal tube above the cuff to visualize any leak. We observed the ETT cuff for 24 hours for possible leakage.

In vivo study. Four Yukatan minipigs were intubated with the new ETT cuff and mechanically ventilated for ~40 hours. Methylene-blue colored water was poured into the subglottic space through a line attached to the ETT. We looked for blue discoloration of mucus retrieved during the tracheal suction at 5, 10 and 30 minutes following instillation. At autopsy, tracheal mucosa was examined for possible cuff-related damage.

Results The Lycra cuff was devoid of folds upon inflation and touched the wall of the mock trachea. There was no leakage of methylene-blue colored water. In contrast, the average leakage across the Mallinckrodt Hi-Lo cuff was 1,182.2 ± 1,321.0 ml/hour (P <0.0001 vs. the Lycra prototype), and the average leakage across the Microcuff was 12.2 ± 3.6 ml/hour (P <0.0001 vs. the Lycra prototype, and P <0.01 vs. the Microcuff).

We observed no methylene blue in the tracheal secretions. The pig trachea appeared normal at autopsy, with no signs of erosion.

Conclusions We showed that a new concept, a smooth ultrathin elastic tracheal tube cuff, can perform better than the cuffs of presently available commercial tracheal tubes.

Abstract withdrawn

Influence of tracheostomy on duration of weaning from mechanical ventilation

S Dubrov, F Glumcher

National O.O. Bogomolets Medical University, Kiev, Ukraine Critical Care 2009, 13(Suppl 1):P17 (doi: 10.1186/cc7181)

Introduction Tracheostomy has several advantages over orotracheal intubation, such as greater patient comfort, more effective aspiration of airway secretions, lower airway resistance and the possibility of oral nutrition. It also provides more secure maintenance of airway patency and may shorten the duration of respiratory support and ICU stay and decrease the risk of pneumonia. This study was designed to compare the duration of weaning in patients with tracheostomy versus patients weaned while orotracheally intubated.

Methods For this observational prospective cohort study, multiple trauma patients requiring more than 72 hours of mechanical ventilation were prospectively selected. A group of patients who underwent early tracheostomy before weaning attempts (ET) was compared with a group of patients who underwent weaning attempts while orotracheally intubated; in the latter group, tracheostomy was performed only after several unsuccessful weaning attempts lasting more than 7 days (forced tracheostomy, FT). The general patient state was assessed with the APACHE II score, traumatic injuries with the injury severity score, and the level of consciousness with the GCS.

Results Seventy-one patients meeting the inclusion criteria were included in our trial. The ET group comprised 41 patients who underwent tracheostomy 2 to 5 days after the start of mechanical ventilation. The FT group consisted of 30 patients, and tracheostomy was performed in 60% of them. The two groups were statistically comparable. The median duration of mechanical ventilation was shorter in the ET group (256 vs. 314 hours, P = 0.047). The median duration of weaning was also shorter in the ET group than in the FT group (61 vs. 103 hours, P = 0.008). In total, 34.1% of patients in the ET group developed pneumonia, versus 50% in the FT group (P = 0.032). The mortality rate was almost the same in both groups (P = 0.944).

Conclusions The results of this study show that early tracheostomy, when performed in patients with severe combined trauma, reduces the duration of weaning and decreases the frequency of complications such as ventilator-associated pneumonia. The timing of tracheostomy did not affect the mortality rate in the investigated groups of patients.

Safety of percutaneous tracheotomy in patients with cricoid cartilage not identified: report of 122 cases

T Adanir, A Sencan, M Aksun, A Atay, G Aran, N Karahan

Ataturk Training and Research Hospital, Izmir, Turkey Critical Care 2009, 13(Suppl 1):P18 (doi: 10.1186/cc7182)

Introduction Percutaneous tracheotomy (PT) is one of the most frequent procedures carried out in critically ill patients. However, it is considered contraindicated in patients younger than 16 years of age, in cases of known or anticipated difficult endotracheal intubation, in the presence of infection of the surgical area, and in cases in which the cricoid cartilage cannot be identified. In this retrospective study, we evaluated the safety of performing PT with blunt dilatational exploration using a right-angle clamp in patients in whom the cricoid cartilage could not be identified.

Methods We retrospectively reviewed the data of 122 PTs performed between January 2006 and October 2008 in patients in whom the cricoid cartilage could not be identified because of obesity or postural deformity. The data obtained from the charts included age, sex, timing of PT, duration of the procedure, minor and major complications, and mortality. All of the procedures were performed at the bedside in the ICU. After positioning the patient as optimally as possible, local anesthetic infiltration was applied 1 to 2 cm superior to the jugular notch of the manubrium sterni (according to the structure of the patient's neck). After incision (~2 cm) of the skin and subcutaneous tissue, all layers of subcutaneous tissue were passed through with a right-angle clamp until the trachea could be felt with a finger (blunt dilatational exploration). The cricoid cartilage was then directly palpated with the fingertip, and the tracheotomy (Griggs technique) was performed between the first and second tracheal cartilages below the cricoid cartilage.

Results The patients were mechanically ventilated for an average of 12.9 ± 2.6 days. They were 57 ± 14 (26 to 86) years old; 64 were female and 58 were male. The duration of the technique was 2.5 to 5 minutes. There was no death or cardiac arrest related to tracheotomy. A total of 113 PTs (92.6%) were documented as uncomplicated. There was no technically difficult procedure, and none of the procedures was converted to a surgical approach. However, major hemorrhage developed during the first 24 hours in eight patients, and in one patient pneumomediastinum was detected 48 hours after the procedure. The overall complication rate was 7.4%.

Conclusions PT performed with a right-angle clamp appears to be safe and can be performed in patients in whom the cricoid cartilage cannot be identified.

Percutaneous dilational tracheostomy: early and late complications

RM Corso, E Fabbri, M Terzitta, P Gudenzi, J Chanis, M Baccanelli, G Gambale

GB Morgagni Hospital, Forli, Italy

Critical Care 2009, 13(Suppl 1):P19 (doi: 10.1186/cc7183)

Introduction Percutaneous dilational tracheostomy (PDT) is a common procedure in ICU patients. In this study we evaluated perioperative complications. Moreover we looked for late complications by telephone interview together with a clinical evaluation in the suspected cases.

Methods We included 170 consecutive patients admitted to the GB Morgagni Hospital ICU between June 2005 and June 2007 who underwent PDT. Demographic data, clinical data, severity scores (SAPS II), data about the tracheostomy technique and major tracheostomy complications were collected. We used the Ciaglia technique with endoscopic guidance throughout the procedure. Twelve months after discharge, we traced and interviewed our patients about possible late complications connected with the tracheostomy. Symptomatic patients were referred to the ENT specialist for fiberoptic laryngoscopy.

Results PDT was performed in 170 patients as a routine procedure by intensivists. The main primary indications for PDT were weaning failure (29%) and neurological dysfunction (71%). One hundred and five patients were male and 65 female, with a mean age of 68 ± 15 years. The mean SAPS II was 53 ± 10 points. The intubated time before PDT was 5 ± 2 days and the time in the ICU after PDT was 14 ± 8 days. The ICU mortality was 16%. Placement was successful in all cases. The total incidence of major complications was 1.18%: one simple pneumothorax successfully treated with chest tube insertion, and one early (after 72 hours) cannula displacement that evolved to cardiorespiratory arrest and death. We traced 38 patients alive 12 months after discharge; 22 patients answered the telephone interview. None complained of respiratory symptoms. Four patients described symptoms that were considered worth further examination and were invited to an ENT control. In two patients, swallowing uncoordination was found. In another two patients, a 20% tracheal stenosis was found; the stenosis was, however, asymptomatic.

Conclusions In our experience, PDT had an overall low rate of major complications (1.18%). Only one patient had a severe early complication, and we did not find any severe late complications. In selected patients, PDT with endoscopic guidance guarantees a high safety standard [1,2].

References

1. Gambale G, Cancellieri F, Baldini U, Vacchi Suzzi M, Baroncini S, Ferrari F, Petrini F: Ciaglia percutaneous dilational tracheostomy. Early and late complications and follow-up. Minerva Anestesiol 2003, 69:825-833.

2. Christenson TE, Artz GJ, Goldhammer JE, Spiegel JR, Boon MS: Tracheal stenosis after placement of percutaneous dilational tracheotomy. Laryngoscope 2008, 118:222-227.

Tracheostomy in the ICU: an analysis of 443 procedures

A Marban, J Lopez

University Hospital La Paz, Madrid, Spain

Critical Care 2009, 13(Suppl 1):P20 (doi: 10.1186/cc7184)

Introduction The aim of this study is to analyse our experience with tracheostomies performed in the critical care unit of a tertiary university hospital.

Methods A retrospective review of the clinical records of patients who underwent this procedure over a 7-year period.

Results From January 2001 to December 2007, 6,333 patients were admitted to our unit; 1,528 needed mechanical ventilation (MV) for more than 48 hours and 443 underwent tracheostomy. The median age was 56 years (14 to 88 years); 66% were male. The median APACHE II score was 20 (4 to 44). The main diagnoses were polytrauma including head injury in 24.2%, other structural neurological diseases in 21%, and prolonged weaning of various aetiologies in 35%. The percutaneous dilational technique was used in the majority of cases (90%). The mean duration of MV prior to tracheostomy was 13.8 days (SD = 6.4). The overall complication rate was 6%. Intraprocedural complications were atelectasis (0.4%) and bleeding (2%); two of the patients needed surgical control or transfusion (0.4%). Two stoma infections developed in the open tracheostomy group. The most frequent complication was tracheal stenosis, encountered in 15 patients (3%). The ICU mortality was 20.7%. Of the 351 patients discharged from the ICU, 45.8% were decannulated prior to discharge from the ICU and 31% on the ward; 23% could not be decannulated at any time. Ward mortality was 5% in the group of patients decannulated in the ICU, 10% in the patients decannulated on the ward and 37% in those who failed decannulation, for a total of 50 deaths before hospital discharge (11%). The main diagnoses of the patients who died on the ward were: residual encephalopathy in 62% (postanoxic, posttraumatic or other causes), severe chronic respiratory failure in 10%, spinal cord injury in 6%, and neuromuscular disease in 4%.

Conclusions We had a low rate of early complications, similar to other series, with no procedure-related deaths [1]. Our main complication was airway stenosis. As in other studies, patients who needed a tracheostomy belonged to a group of patients with high severity and mortality. Some of them do not recover a sufficient neurological and functional status to be decannulated, and these patients have a high ward mortality.

Reference

1. Díaz-Regañón G, Miñambres E, Ruiz A, González-Herrera S, Holanda-Peña M, López-Espadas F: Safety and complications of percutaneous tracheostomy in a cohort of 800 mixed ICU patients. Anaesthesia 2008, 63:1198-1203.

Percutaneous dilational tracheostomy in neurointensive care patients

M Ramamurthy, P Nair

Walton Neurosciences Centre, Liverpool, UK

Critical Care 2009, 13(Suppl 1):P21 (doi: 10.1186/cc7185)

Introduction Neurointensive care patients often require elective tracheostomy for prolonged ventilatory support, for control of intracranial pressure as sedation is weaned, and for impaired pharyngeal and laryngeal reflexes. The possibility of raised intracranial pressure, worsened by patient positioning and intraprocedural occult hypercarbia, makes it a higher risk procedure [1]. There is little information on the timing of percutaneous dilational tracheostomy (PDT) or periprocedural complications in neurointensive care patients.

Methods Out of 80 patients who underwent PDT over a period of 1 year, information was obtained and analysed on 52 patients. Baseline demographic information collected included the date of admission, date of PDT, level of the operator, supervision, and periprocedural complications. We also looked at the use of post-procedure chest radiography (CXR). Analysis was then carried out to determine the timing of PDT, the incidence of complications and the use of CXR.

Results Fifty-two patients were included, median age 56 years (range 20 to 79 years). The procedure was carried out either by two trainees with a consultant supervising (40%) or by a consultant and a trainee (57%). Two patients who had a difficult anatomical approach had the procedure done by two consultants. The timing of PDT ranged from 1 day to 22 days with a mean of 7.69 days and SD of 4.29 days. There were only three reported complications (5%), none of them major or involving raised intracranial pressure. CXR was requested in 68% of cases; of the 35 patients who did have CXR, only 51% had recorded reports in the notes.

Conclusions In spite of recommendations that a CXR is not required following uncomplicated PDT, most operators still request one, a habit that leads to unnecessary patient and staff exposure to radiation. The majority of the PDTs were performed by trainees in our unit, and the low complication rate suggests that the technique is safe and easy to perform. PDT in neurointensive care patients carries a higher risk, but with proper patient selection and senior input the procedure is as safe as in general intensive care patients.

References

1. Reilly PM, Anderson HL 3rd, Sing RF, Schwab CW, Bartlett RH: Occult hypercarbia. An unrecognized phenomenon during percutaneous endoscopic tracheostomy. Chest 1995, 107:1760-1763.

Service evaluation of complications following tracheostomy insertion in ICU patients

AJ Glossop, TC Meekings, SJ Webber

Sheffield Teaching Hospitals NHS Foundation Trust, Sheffield, UK Critical Care 2009, 13(Suppl 1):P22 (doi: 10.1186/cc7186)

Introduction We prospectively studied the tracheostomy complication rate in ICU patients (four ICUs, 36 beds) over a 6-month period. Quoted complication rates following tracheostomy vary widely [1,2].

Methods We evaluated all tracheostomies sited in ICU patients in our trust between June and November 2008. Complications on insertion, whilst cannulated and post (tracheal) decannulation were recorded. Patients were followed up until 4 weeks post decannulation, hospital discharge or death.

Results Sixty-four patients underwent tracheostomy (58 percutaneous, six surgical). The mean time with tracheostomy was 24.8 days. Twenty-six insertion complications occurred in 20 (31%) of 64 patients. Fifty-five patients received follow-up (four transferred to another hospital, one died, four lost to follow-up before the first visit). Eighteen complications occurred whilst cannulated in 12 (22%) of 55 patients. The numbers (%) of insertion complications were: all, 26 (41%); major, five (8%) - major bleed, four (6%); posterior tracheal wall injury, one (2%); minor - minor bleed, 12 (19%); abandonment/conversion to surgical procedure, four (6%); tracheal cartilage fracture, three (5%); other, two (3%). The numbers (%) of complications whilst cannulated were: all, 18 (33%); major, seven (13%) - tracheostomy tube blockage/displacement, four (7%); loss of airway with severe hypoxia, three (5%); minor - prolonged bleeding, two (4%); local infection, one (2%); surgical revision, two (4%); other, six (11%). Post decannulation, 38 patients were followed up. There were no major complications. The numbers (%) of post-decannulation complications were: all, 28 (74%); difficulty swallowing, eight (21%); regurgitation of liquid, eight (21%); voice change, six (16%); hoarseness, two (5%); regurgitation of solids, two (5%); altered cough, one (3%); abnormal breathing, one (3%). One patient had complications lasting >30 days post decannulation. Overall patient outcomes 30 days post decannulation (excluding 15 patients transferred to other hospitals) were: 59% discharged from hospital, 29% dead, 12% inpatient decannulated.

Conclusions Although the tracheostomy complication rate in our trust was 41% at insertion (8% major), 33% whilst cannulated (13% major) and 74% post decannulation, only one post-decannulation complication persisted beyond 30 days and none was major. Seventy-one percent of patients undergoing tracheostomy survived to 30 days post decannulation.

References

1. Silvester W, Goldsmith D, Uchino S, Bellomo R, Knight S, Seevanayagam S, et al.: Percutaneous versus surgical tracheostomy: a randomized controlled study with long-term follow-up. Crit Care Med 2006, 34:2145-2152.

2. Díaz-Regañón G, Miñambres E, Ruiz A, González-Herrera S, Holanda-Peña M, López-Espadas F: Safety and complications of percutaneous tracheostomy in a cohort of 800 mixed ICU patients. Anaesthesia 2008, 63:1198-1203.

Risk factors for unplanned extubation in critically ill patients

RI De Groot, LP Aarts, MS Arbous

Leiden University Medical Center, Leiden, the Netherlands Critical Care 2009, 13(Suppl 1):P23 (doi: 10.1186/cc7187)

Introduction Unplanned extubation (UE) is a frequent complication in ICU patients associated with increased morbidity, mortality, duration of mechanical ventilation, ICU stay and hospital stay. Although UE has been studied, relatively little is known about its incidence, determinants and outcome. The aim of the study was to assess the incidence and determinants of UE in a tertiary-care ICU. Methods From 1 December 2005 to 1 June 2008 a prospective case-control study was undertaken. Cases were consecutive adult patients in a 29-bed medical, surgical, neurosurgical, and thoracic-surgical ICU who experienced an UE. UE was defined as premature removal of the tube by the patient. For each case, four controls were randomly selected. Controls were mechanically ventilated patients who did not experience an UE at the time a case occurred. Demographics and clinical characteristics were obtained from the electronic medical records. Wilcoxon rank-sum and chi-squared tests were used as appropriate. To determine independent risk factors for UE, univariate logistic regression was used. Determinants significant in the univariate analysis were included in the multivariate logistic regression. This model was tested for clinically relevant interactions between determinants. Results In the study period, 74 UEs occurred and 296 controls were collected. The incidence of UE was 2.1% of mechanically ventilated patients and 0.4% per ventilation day. Cases and controls did not differ significantly with respect to age, type of admittance or diagnosis category. Forty-seven percent of the cases had to be reintubated; 77% did not experience another complication. Cases had a lower median length of intubation (5 vs. 7 days, P = 0.069), lower ICU mortality (18 vs. 27%, P = 0.096) and lower hospital mortality (19 vs. 34%, P = 0.028). Significant predictors of UE in the multivariate analysis were admittance to the thoracic surgery unit (OR = 2.63, 95% CI = 1.06 to 6.53, P = 0.037) and a Ramsay sedation score of 1 (OR = 30.57, 95% CI = 3.18 to 294.2, P = 0.003), 2 (OR = 25.47, 95% CI = 3.00 to 217.0, P = 0.003), and 3 (OR = 7.02, 95% CI = 0.78 to 63.01, P = 0.082) compared with the most sedated score. Protective factors were female gender (OR = 0.55, 95% CI = 0.26 to 1.19, P = 0.131) and use of midazolam at the time of UE (OR = 0.44, 95% CI = 0.19 to 0.99, P = 0.048).

Conclusions Although UE is generally regarded as a complication, we could not demonstrate an association with increased morbidity or mortality.
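The two-step modelling strategy described in the Methods above (univariate screening of candidate determinants, then a multivariate logistic regression restricted to the significant ones) can be illustrated with a short sketch. This is not the authors' code: the data file, column names and the binary coding of predictors are hypothetical placeholders, and the categorical dummy coding that the Ramsay score would need in practice is omitted for brevity.

```python
# Illustrative sketch (not the authors' analysis code) of univariate screening
# followed by multivariate logistic regression for unplanned extubation (UE).
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("ue_case_control.csv")        # hypothetical file: one row per case or control
outcome = "unplanned_extubation"               # 1 = case (UE), 0 = matched control
candidates = ["thoracic_surgery_unit", "ramsay_score_low", "female", "midazolam"]

# Step 1: univariate logistic regression to screen candidate determinants
selected = []
for var in candidates:
    fit = sm.Logit(df[outcome], sm.add_constant(df[[var]])).fit(disp=0)
    if fit.pvalues[var] < 0.05:
        selected.append(var)

# Step 2: multivariate model restricted to the univariately significant determinants
multi = sm.Logit(df[outcome], sm.add_constant(df[selected])).fit(disp=0)

# Odds ratios with 95% confidence intervals (exponentiated coefficients)
ors = np.exp(multi.params).rename("OR")
ci = np.exp(multi.conf_int()).rename(columns={0: "CI 2.5%", 1: "CI 97.5%"})
print(pd.concat([ors, ci], axis=1))
```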

Protocol-driven weaning from mechanical ventilation: a study into adherence and outcomes

R Rahman West, M Saidi, D Dawson

St Georges Hospital, London, UK

Critical Care 2009, 13(Suppl 1):P24 (doi: 10.1186/cc7188)

Introduction Numerous studies have shown that ventilator weaning protocols are likely to reduce the duration of mechanical ventilation and ICU stay. In 2001 a taskforce of pulmonary and critical care experts developed guidelines for weaning and discontinuation of mechanical ventilation that recommended the development and implementation of respiratory weaning protocols for nonphysician healthcare professionals in the ICU [1]. Our weaning protocol was established in 2005 based on clinical evidence and best-practice recommendations at the time. The primary aim of this audit was to ascertain the extent of protocol adherence in our unit. The secondary aims included the correlation of protocol-driven weaning with outcome, as defined by successful extubation and reduction in the length of mechanical ventilation. Methods A prospective study of all patients who received mechanical ventilation over a 1-month period, excluding patients who had tracheostomy insertion and patients who had their treatment withdrawn. We looked at the rate of compliance with our weaning protocol, the reason for noncompliance and outcome. Results Fifty-two patients were included, 18 with protocol-driven and 34 with nonprotocol-driven weaning. The most common reason for nonprotocol weaning was clinical decision (40.4%). In total, 19.2% of patients had a spontaneous breathing trial that differed from the protocol and were counted as nonprotocol. There were no differences in the rate of successful extubation between patients weaned by protocol versus nonprotocol, 94.4% vs. 79.4% respectively (Fisher's exact test P = 0.236). Duration of ventilation was also similar in the protocol and nonprotocol groups, mean ± SEM = 84.7 ± 16.6 vs. 76.4 ± 12.8 hours (unpaired t test P = 0.699). The overall success rate of extubation was 86.5%. Conclusions Our compliance rate is 34.6%, and protocol-driven weaning did not improve outcome in our unit. However, this could be due to the small sample size, the timing of the study and a nondiscriminatory protocol. Reference

1. MacIntyre NR, et al.: Evidence-based guidelines for weaning and discontinuing ventilatory support: a collective task force facilitated by the American College of Chest Physicians; the American Association for Respiratory Care; and the American College of Critical Care Medicine. Chest 2001, 120(6 Suppl):375S-395S.

SmartCare is faster than paper-protocol weaning

R Kleijn, M Van Spreuwel-Verheijen, B Kalkman, P Tangkau, L Dawson, S Sleeswijk Visser, I Meynaar

Reinier de Graaf Gasthuis, Delft, the Netherlands Critical Care 2009, 13(Suppl 1):P25 (doi: 10.1186/cc7189)

Introduction We compared a computer-driven weaning protocol (SmartCare on Evita XL; Dräger, Lübeck, Germany) with our paper-based nurse-driven weaning protocol.

Methods The ICU is a 10-bed intensivist-led unit in a 500-bed teaching hospital. We compared our paper-based nurse-driven weaning protocol [1] with SmartCare in a prospective cohort study. All consecutive patients receiving mechanical ventilation between May and October 2008 and fulfilling the inclusion criteria were included. Patients were included if the intensivist on his twice-daily rounds considered the patient ready for withdrawal of the ventilator and if patients were on pressure support with no more than 50% oxygen and no more than 8 mbar positive end-expiratory pressure, had no fever, had a normal pH, were arousable and had no more than 5 μg/kg/min dopamine. Patients were excluded if they were ready for immediate extubation or if they had a tracheostomy. For the first 3 months of the study, patients were allocated to the paper-based nurse-driven weaning protocol, and for the last 3 months to SmartCare.

Results The results are presented in Tables 1 and 2. Normally distributed data are presented as the mean and SD, nonparametric data as the median and interquartile range. Thirty-two patients were enrolled in the study. Baseline characteristics were similar in the two groups. The main result was that the median weaning time was significantly shorter in the SmartCare group than in the paper-protocol group: 2.6 hours vs. 4.5 hours.

Table 1 (abstract P25) Baseline characteristics

                   Paper (n = 17)     SmartCare (n = 17)   P value
Age                70.5 (9.4)         62.4 (13.8)          0.07 (t test)
APACHE II score    13.7 (5.2)         13.5 (4.4)           0.90 (t test)
Days ventilated    3 (2 to 4)         3 (2 to 4)           0.97 (Mann-Whitney U test)

Table 2 (abstract P25) Results

                   Paper (n = 15)     SmartCare (n = 17)   P value
Wean time (hours)  4.5 (2.8 to 21.3)  2.6 (1.8 to 3.6)     0.007 (Mann-Whitney U test)
Reintubation       0/15 (0%)          2/17 (12%)           0.49 (Fisher exact test)
28-day mortality   1/15 (7%)          0/17 (0%)            0.47 (Fisher exact test)

Conclusions SmartCare reduces the weaning duration when compared with a paper-based nurse-driven weaning protocol. Reference

1. Wulff A, Kalkman B, Orsini M, Van der Hoeven M, Van der Velden J, Tangkau P, et al.: The effect of a protocol on the duration of weaning. Intensive Care Med 2004, 30(Suppl 1):S21.

Respiratory muscle oxygen saturation during weaning

X Borrat, J Mercadal, S Benito, R Adalia, E Zavala, J Tercero

Hospital Clinic Barcelona, Spain

Critical Care 2009, 13(Suppl 1):P26 (doi: 10.1186/cc7190)

Introduction Unnecessary prolongation of mechanical ventilation is related to increased morbidity. Conversely, premature discontinuation of mechanical ventilation requiring reintubation is also related to a poor prognosis. High respiratory rate, cardiac load and neuromuscular dysfunction are known factors related to weaning failure. Tissue oxygen saturation (StO2) obtained by near-infrared spectroscopy (NIRS) reflects the balance between oxygen delivery and consumption at the muscle level. StO2 evolution during weaning may have a role in assessing respiratory muscle performance and help predict patient readiness to be weaned. The study objective was to describe respiratory muscle StO2 during a T-tube test.

Methods A patient with mild head injury and pulmonary contusion was submitted to a T-tube trial after obtaining stability on day 5. The NIRS signal from the serratus anterior muscle was acquired by placing an Inspectra device probe on the skin surface of the muscle [1]. Simultaneously, the respiratory rate, heart rate, mean arterial pressure and arterial oxygen saturation were recorded.

Figure 1 (abstract P26). StO2 evolution during the T-tube trial. HR, heart rate; RR, respiratory rate; MAP, mean arterial pressure.

Results After 5 minutes the patient failed his first T-tube trial, showing profuse sweating, accessory muscle recruitment and increasing respiratory rate and mean arterial pressure (Figure 1). The StO2 signal decreased during ventilatory failure until the patient was back on ventilatory support.

Conclusions NIRS was sensitive to respiratory muscle fatigue, but further research is in progress to assess its predictive capability. Reference

1. Moalla W, et al.: Respiratory muscle deoxygenation and ventilatory threshold assessments using near infrared spectroscopy in children. Int J Sports Med 2005, 26:576-582.

Early and continuous weaning from mechanical ventilation without formal protocols in a university hospital

K Lam, J Walker

Royal Liverpool University Hospital, Liverpool, UK Critical Care 2009, 13(Suppl 1):P27 (doi: 10.1186/cc7191)

Introduction There is evidence that formal weaning protocols can reduce the duration of mechanical ventilation (MV) and complications of prolonged unnecessary ventilation [1]. In our ICU we do not employ a formal protocol, but have a standard practice where patients with spontaneous respiratory efforts on pressure-controlled ventilation (PCV) have a trial of assisted spontaneous breathing (ASB). Patients on ASB with pressure support <10 cmH2O will then have a trial of unassisted continuous positive airway pressure (CPAP) with high-flow oxygen. Methods A retrospective audit was conducted on 47 patients in a 13-bed general medical and surgical ICU of a university hospital. The length of time between first spontaneous breaths while on PCV and a trial of ASB was recorded. The length of time from achieving pressure support <10 cmH2O on ASB to a trial of unassisted CPAP was recorded. A retrograde step was defined as going back on to PCV from ASB, or to ASB from unassisted CPAP. If the trial of weaning was repeated within 12 hours of a retrograde step, this was noted as a continuous attempt at weaning. Similar data were collected for subsequent retrograde steps.

Results The median duration of MV from initiation to discontinuation was 85 hours (range 1 to 345 hours), with a median time of 37 hours (range 1 to 245 hours) on PCV and 32 hours (range 0 to 264 hours) on ASB. The median time from recorded spontaneous breaths on PCV to ASB was 1 hour (range 0 to 34 hours). Twenty-nine (62%) patients on PCV progressed on ASB without any retrograde steps. The median time from having pressure support <10 cmH2O on ASB to unassisted CPAP was 2 hours (range 0 to 41 hours). Thirty-one (66%) patients on ASB progressed on CPAP without any retrograde steps. Forty-one patients (87.2%) had a first attempt to wean from PCV to ASB and 40 patients (85%) from ASB to CPAP within 10 hours of eligibility. The maximum delay in initiating first attempts was 34 hours to ASB and 41 hours to CPAP. Reasons for retrograde steps included respiratory instability (n = 19), signs of poor tolerance or haemodynamic instability on ASB/unassisted CPAP (n = 18) and interventions, for example bronchoscopy, imaging, theatre, and so forth (n = 13).

Conclusions The median duration on MV in our unit compares favourably with a large randomised controlled trial with a similar patient population [1]. Most patients had their first attempt to wean within 10 hours from eligibility. A significant number of patients may have weaned more quickly if a formal protocol had been in place. Reference

1. Marelich GP, Murin S, Battistella F, Inciardi J, Vierra T, Roby M: Protocol weaning of mechanical ventilation in medical and surgical patients by respiratory care practitioners and nurses: effect on weaning time and incidence of ventilator-associated pneumonia. Chest 2000, 118:459-467.

Ventilator dependency among morbidly obese in the ICU

CL Jessen, KM Larsen

Aarhus University Hospital, Aarhus C, Denmark

Critical Care 2009, 13(Suppl 1):P28 (doi: 10.1186/cc7192)

Introduction The purpose of this study was to evaluate the dependency on mechanical ventilation among morbidly obese patients (MOP), defined by BMI >40 kg/m2, admitted to our ICU. Because of reduced functional residual capacity, increased risk of atelectasis, increased work of breathing and decreased compliance of the lungs and chest wall [1], MOP are expected to have a high dependency on mechanical ventilation. Early tracheotomy has a beneficial outcome in a medical population of patients admitted to the ICU [2], and one might assume benefits of early tracheotomy in MOP because they are at high risk of pulmonary complications. This remains a subject of debate, however, as one study has shown morbid obesity to be associated with an increased risk of complications [3]. Methods All MOP admitted for more than 24 hours to a 12-bed mixed ICU at a Danish university hospital in 2007 and 2008 were retrospectively included. The ICU stay was registered, as well as airway management, length of mechanical ventilation and time to tracheotomy after intubation.

Results Twenty-one morbidly obese patients were admitted. Fifteen patients (71.4%) needed mechanical ventilation. Three of these patients had a period of noninvasive ventilation. The median duration of ventilation was 13 days (range 4 to 71 days) and median length of stay was 16 days (range 4 to 71 days). Eleven patients were tracheotomised after a median 7 days (range 1 to 11 days). Six patients had no need for mechanical ventilation. Their median length of stay was 3 days (range 1 to 12 days). There was no difference in age and BMI between the two groups. Female/male ratio was 8/7 in the ventilated group versus 5/1 in the nonventilated group. Surgical/medical ratio was 11/4 in the ventilated group versus 6/0 in the nonventilated group. Only one patient died in the ICU.

Conclusions A high proportion of MOP admitted to our ICU needed mechanical ventilation (71.4%) and a very high proportion was tracheotomised. Further studies are needed to evaluate the beneficial effects of early tracheotomy in this patient group. References

1. Marik P, et al.: The obese patient in the ICU. Chest 1998, 113:492-498.

2. Rumbak MJ, et al.: A prospective, randomised study comparing early percutaneous dilational tracheotomy to prolonged translaryngeal intubation (delayed tracheotomy) in critically ill medical patients. Crit Care Med 2004, 32:1689-1694.

3. El Solh AA, Jaafar W: A comparative study of the complications of surgical tracheostomy in morbidly obese critically ill patients. Crit Care 2007, 11:R3.

Comparison of a novel humidifier with two conventional humidifiers during high-frequency oscillatory ventilation

J Davies1, N Tiffin2, N MacIntyre1

1Duke University Medical Center, Durham, NC, USA; 2Hydrate, Inc., Midlothian, VA, USA

Critical Care 2009, 13(Suppl 1):P29 (doi: 10.1186/cc7193)

Introduction During high-frequency oscillatory ventilation (HFOV), drying of the airways and mucous plug formation can be complications associated with inadequate humidification. This study compares water vapor delivery of a standard passover humidifier and a conchatherm humidifier typically used during HFOV with a novel humidifier that employs the principle of capillary force vaporization.

Methods The Sensormedics 3100B oscillatory ventilator (Cardinal Health, Yorba Linda, CA, USA) was connected to a test lung at the following settings: mean airway pressure, 30 cmH2O; power, 6.0; inspiratory time percentage, 33%; frequency, 6 Hz; and bias flow, 30 l/min on room air. The 3100B was run at the above settings first using five different MR850 passover humidifiers (Fisher & Paykel, Auckland, New Zealand), followed by five different Hydrate OMNIs (Hydrate Inc., Midlothian, VA, USA) and then five different ConchaTherm Neptunes (Teleflex Medical, Research Triangle Park, NC, USA). The gas temperature and relative humidity were recorded continuously using an electronic hygrometer/thermometer (SHT75; Sensirion, Staefa, Switzerland) in two circuit configurations: (1) between the test lung/patient wye (triangular plastic connector that connects the inspiratory and expiratory limbs of the circuit with the patient endotracheal tube) and (2) distal to a condensation tube placed between the test lung/patient wye. The condensation tube served to approximate the upper airways in our lung model and to collect the condensate. Water condensate was collected over a 30-minute test period for each run. All humidifiers were set to 37°C. Results See Table 1.

Table 1 (abstract P29)

           Wye temperature (°C)   Wye absolute humidity (mg/l)   Distal absolute humidity (mg/l)
MR850      34.4 ± 0.9             33.7 ± 1.3                     17.2 ± 6.1
OMNI       34.4 ± 1.3             37.2 ± 2.6                     26.8 ± 5.0
Neptune    32.0 ± 0.9             30.8 ± 1.7                     17.7 ± 1.0

Data presented as mean ± SD.

Conclusions In this model, the Hydrate OMNI provided the highest absolute humidity during HFOV. The difference was amplified at the end of the condensation tube. Further study on humidification during HFOV is warranted.
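For readers wanting to relate the hygrometer readings to the absolute humidity values in Table 1, a generic conversion from temperature and relative humidity to absolute humidity is sketched below. The Magnus approximation used here is an assumption made for illustration; the abstract does not state how the sensor output was converted, so this is not the study's exact method.

```python
import math

def absolute_humidity_mg_per_l(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate absolute humidity (mg of water per litre of gas) from temperature
    and relative humidity, using the Magnus formula for saturation vapour pressure.
    Generic conversion for illustration, not the sensor's exact algorithm."""
    # Saturation vapour pressure in hPa (Magnus approximation)
    p_sat = 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))
    p_vap = p_sat * rel_humidity_pct / 100.0            # partial pressure of water vapour
    # Ideal-gas conversion: AH [g/m^3] = 216.7 * p_vap[hPa] / T[K]; 1 g/m^3 equals 1 mg/l
    return 216.7 * p_vap / (273.15 + temp_c)

# Example: fully saturated gas at 34.4 degrees C (cf. Table 1) holds roughly 38 mg/l,
# so a reading of 33.7 mg/l at the wye corresponds to about 88% relative humidity.
print(round(absolute_humidity_mg_per_l(34.4, 100.0), 1))
```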

A new in vitro model for force measurements at the isolated entire rat diaphragm

C Armbruster, K Gamerdinger, M Schneider, S Schumann, H Priebe, J Guttmann

University Medical Center, Freiburg, Germany

Critical Care 2009, 13(Suppl 1):P30 (doi: 10.1186/cc7194)

Introduction Several diseases of different origin, as well as prolonged mechanical ventilation, result in diaphragmatic dysfunction and atrophy. To date, most in vitro experiments on the mechanics of the diaphragm muscle have been performed on isolated muscle strips. Since muscle strips cannot reproduce the in vivo position and curvature of the diaphragm, a model of the whole diaphragm appears necessary. Here we present a new in vitro model of an isolated whole rat diaphragm inside a bioreactor [1].

Methods The bioreactor consists of a pressure chamber to which a highly flexible membrane is attached. A rat diaphragm is fixed on this membrane. By application of a gas volume into the pressure chamber, a defined deflection of the diaphragm is achieved. The highly flexible membrane adapts to the imposed shape, thus allowing the diaphragm to take up its in vivo profile. Electrical stimulation results in a contraction of the diaphragm, and these diaphragm twitches generate pressure pulses inside the pressure chamber. Rat diaphragms were excised rapidly and kept at room temperature in Krebs-Henseleit solution bubbled with oxygen. By application of one of two different initial pressures (P1: 7 mbar; P2: 12 mbar) the diaphragms were set to a defined degree of deflection. Using a platinum wire electrode, the diaphragms were electrically stimulated with 6 V impulses. The stimulation train duration was set to 500 ms or 1 second. The pulse duration was set to 50 ms at a frequency of 50 Hz. The pressure pulses resulting from muscle contraction were measured inside the pressure chamber.

Results An increase in the initial pressure led to an increase in the pressure generated by muscle contraction. A longer muscle regeneration time led to an increase in the diaphragmatic twitch-induced pressure.

Conclusions Our new in vitro model of an isolated whole rat diaphragm allows mechanical and physiological investigations on the entire diaphragm. A dependency of the pressure developed by the diaphragm on its deflection state could be demonstrated. Furthermore, we found effects of diaphragm muscle fatigue. Our new model could be used in numerous types of investigation, such as identifying factors that trigger diaphragmatic dysfunction or respiratory muscle fatigue. Reference

1. Schumann S, Stahl CA, Möller K, Schneider M, Metzke R, Wall WA, Priebe HJ, Guttmann J: Contact-free determination of material characteristics using a newly developed pressure-operated strain-applying bioreactor. J Biomed Mater Res B Appl Biomater 2008, 86B:483-492.

Adaptive support ventilation prevents ventilator-induced diaphragmatic dysfunction: an in vivo piglet study

B Jung1, N Rossel1, C Le Goff1, N Claveiras1, M Wysocki2, S Matecki1, G Chanques1, S Jaber1

1Saint-Eloi Hospital, Montpellier, France; 2Hamilton Medical, Rhazuns, Switzerland

Critical Care 2009, 13(Suppl 1):P31 (doi: 10.1186/cc7195)

Introduction Mechanical ventilation is a lifesaving supportive therapy for patients with acute respiratory failure. However, prolonged mechanical ventilation results in the complete absence of neural activation and mechanical activity of the diaphragm and has been shown to induce ventilator-induced diaphragmatic dysfunction (VIDD) [1]. A few studies have shown that maintaining spontaneous ventilation could prevent VIDD in animal models [2]. Adaptive support ventilation (ASV) is an automatic ventilation mode that allows pressure-controlled breaths which active patients are able to trigger. The aim of our study was to compare the effects of ASV and controlled ventilation (CV) without paralysing agents on diaphragmatic contractile properties in an in vivo piglet study. Methods Two groups of six anesthetized piglets were ventilated over a 72-hour period. Piglets in the CV group (n = 6) were ventilated without any spontaneous ventilation and piglets in the ASV group (n = 6) were ventilated with the possibility of triggering spontaneous ventilation. The main endpoint was the transdiaphragmatic pressure (Pdi) (after bilateral, supramaximal, transjugular stimulation of the two phrenic nerves), which represents the in vivo contractile function of the diaphragm. A force-frequency curve was drawn after stimulation of the phrenic nerves from 20 to 120 Hz.

Results The piglets in the ASV group maintained spontaneous ventilation during 80% of the study period, compared with less than 1% in the CV group. At 72 hours, Pdi was decreased by 35% in the CV group, whereas it was not modified in the ASV group (Figure 1).

Conclusions Spontaneous breathing with ASV prevents VIDD in comparison with totally CV in an in vivo healthy piglet model.

Figure 1 (abstract P31). Pdi evolution during the study. P <0.05, CV vs. ASV group at 48 hours.

References

1. Sassoon CS: Ventilator-associated diaphragmatic dysfunction. Am J Respir Crit Care Med 2002, 166:1017-1018.

2. Futier E, Constantin JM, Combaret L, Mosoni L, Roszyk L, Sapin V, et al.: Pressure support ventilation attenuates ventilator-induced protein modifications in the diaphragm. Crit Care 2008, 12:R116.

Characterization of the mechanical ventilator adjustment process

H Al-Otaibi, J Hardman, R Mahajan

The University of Nottingham, UK

Critical Care 2009, 13(Suppl 1):P32 (doi: 10.1186/cc7196)

Introduction The aim of the present study was to gain insight into the mechanical ventilator adjustment process, particularly the events that prompt clinicians to adjust the ventilator, the indications and considerations involved, and the methods used to assess the success of each adjustment episode.

Methods A prospective, observational, noninterventional study was conducted in a 24-bed adult, medical and surgical intensive care unit at a regional hospital. Patient demographics, ventilator adjustment episodes, and clinical decisions related to these adjustments were collected. Clinicians were asked to complete a ventilator flowsheet before and after any ventilator adjustment episode. Simultaneously, they were asked to complete an open-ended questionnaire related to the process of ventilator adjustment episode.

Results A total of 168 ventilator adjustment episodes derived from 26 mechanically ventilated patients were evaluated. Among these episodes, the ventilator mode was adjusted 33 times (20%), the minute volume 37 times (22%), the positive end-expiratory pressure 19 times (11%), the pressure support 21 times (12%), and the fraction of inspired oxygen 61 times (36%). Triggers that prompted mechanical ventilator adjustment were categorized as: routine ventilator checks (35%), routine arterial blood gases (ABG) (20%), weaning trials (13%), and calls from nurses (9%). The most common indications to adjust the ventilator were hyperoxygenation (28%) and weaning trials (26%). Peripheral oxygen saturation, ABG, and level of consciousness were the variables most commonly considered during ventilator adjustment. In 42% of the total adjustment episodes, clinicians did not consider any technical, physiological or psychological variable while adjusting ventilator parameters. Clinicians based their decisions to adjust ventilator settings on clinical experience (72%), trial and error (15%), protocols (11%), and scientific equations (2%). Clinicians assessed their decisions via ABG results (47%), peripheral oxygen saturation (23%), and general patient assessment (9%).

Conclusions The process of ventilator adjustment is mainly stimulated by routine ventilator checks and ABG. ABG and weaning trials are the most common indications. Most ventilator adjustment decisions are based on clinical experience, evaluated via ABG and peripheral oxygen saturation.


Impact of a dedicated ventilatory support team on how mechanical ventilation is employed in a tertiary-care hospital

F Saddy, A Thompson, R Serafim, F Gago, N Charris, J Pantoja

Copa D'Or, Rio de Janeiro, Brazil

Critical Care 2009, 13(Suppl 1):P33 (doi: 10.1186/cc7197)

Introduction Advances in our knowledge of the pathophysiology of respiratory failure have forced major revisions of our approach to ventilatory support. We describe how mechanical ventilation is employed in four different ICUs (surgical, clinical, cardiac and neurological) of a tertiary-care Brazilian hospital where a ventilatory support team composed of intensivists is responsible for daily follow-up.

Methods A prospective observational study enrolled all invasively mechanically ventilated patients admitted to the four ICUs from May 2004 through June 2008. Daily recorded data included: demographics, diagnosis, modes of ventilation, tidal volume/kg (Vt), positive end-expiratory pressure (PEEP) level, peak inspiratory pressure, plateau pressure (Pplat), recruitment maneuvers, use of sedation and neuromuscular blocking agents (NBA), tracheotomy, barotrauma, ventilation days, and length of stay (LOS) in the ICU. Results are expressed as the mean ± SD and percentage. Differences were assessed by one-way ANOVA followed by the Tukey test. P <0.05 was considered significant. Results A total of 1,715 patients were studied. The predominant diagnoses varied according to each ICU's characteristics. Ventilatory data are depicted in Table 1. Recruitment maneuvers were used in less than 2% of patients. The most frequent ventilatory mode was spontaneous (P <0.05). The incidence of barotrauma was similar among ICUs and was less than 0.63% (P >0.05). Intravenous sedation was administered for no more than 40% of the time on mechanical ventilation. NBA were used in no more than 0.25% of patients. LOS and ventilation days differed among ICUs (P <0.05).

Table 1 (abstract P33) Ventilatory data

                Surgical     Clinical     Neuro        Cardiac
Vt (ml/kg)      6.6 ± 2      6.6 ± 1.7    6.7 ± 2.1    6.8 ± 1.5
PEEP (cmH2O)    8.9 ± 3.2    8.6 ± 2.7    8.3 ± 2.3    8.7 ± 2.1
Pplat (cmH2O)   22 ± 6.3     22 ± 5.9     22.7 ± 4.4   21.3 ± 4

Conclusions Daily interaction between the ventilatory support team and the ICU practitioners ensured homogeneous and up-to-date ventilatory support for the patients in the different ICUs.

Hyperoxia in mechanically ventilated patients

S Bolton, E Pugh, A Hay, S McKechnie

Royal Infirmary Edinburgh, UK

Critical Care 2009, 13(Suppl 1):P34 (doi: 10.1186/cc7198)

Introduction Supplemental oxygen is part of the supportive treatment of hypoxaemic respiratory failure. Prolonged exposure to high fractions of inspired oxygen (FiO2) has been shown to be injurious to the lung in vitro [1]. The optimal PaO2 in critical illness has not been established, but the clinical impression is that patients are frequently exposed to a higher FiO2 than is necessary. The aim of this study was to investigate current oxygen treatment in patients undergoing prolonged mechanical ventilation and determine whether this resulted in exposure to a higher FiO2 than that required to adequately oxygenate haemoglobin (SaO2 >95%). Methods Indices of oxygenation were collected for 30 patients requiring mechanical ventilation >2 days. Data were collected 4-hourly for 48 hours. The PaO2:FiO2 ratio was determined for each data point and used to model the theoretical FiO2 required to maintain an SaO2 of 95% (PaO2 = 10.7 kPa). Results The results are shown in Figure 1. Data are expressed as medians (IQR). The median observed FiO2 was 0.4 throughout the study whereas the median FiO2, calculated to maintain an SaO2 of 95%, was 0.3.

Conclusions Our data demonstrate that critically ill patients may be exposed to a higher FiO2 than that required to maintain adequate oxygenation. Further, these results highlight an area of ICU care that has received little study, with no published clinical trials examining the effect of FiO2 on outcome. Reference

1. McKechnie S: The effect of hyperoxia on alveolar epithelial injury and repair. J Intensive Care Soc 2008, 9:94.
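The modelling step described in the Methods above (back-calculating the FiO2 theoretically needed for a PaO2 of 10.7 kPa from each observed PaO2:FiO2 ratio) can be sketched as follows. The assumption that the PaO2:FiO2 ratio stays constant when FiO2 changes is a simplification inherent in this approach, and the function name and clamping limits are illustrative only.

```python
def required_fio2(pao2_kpa: float, fio2_set: float, target_pao2_kpa: float = 10.7) -> float:
    """Estimate the FiO2 theoretically needed to reach a target PaO2, assuming the
    PaO2:FiO2 ratio observed at the current settings stays constant (a simplification;
    the true PaO2-FiO2 relationship is not strictly linear)."""
    pf_ratio = pao2_kpa / fio2_set               # PaO2:FiO2 at the observed data point
    fio2_needed = target_pao2_kpa / pf_ratio
    return max(0.21, min(1.0, fio2_needed))      # clamp to the physically possible range

# Example: a patient ventilated on FiO2 0.4 with a PaO2 of 14 kPa would, under this
# model, need only about 0.31 to keep PaO2 at 10.7 kPa (SaO2 around 95%).
print(round(required_fio2(pao2_kpa=14.0, fio2_set=0.4), 2))
```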

Are chest X-rays necessary after chest tube insertion in trauma emergencies?

MS Moeng

Johannesburg Hospital, Houton, Johannesburg, South Africa Critical Care 2009, 13(Suppl 1):P35 (doi: 10.1186/cc7199)

Introduction Johannesburg Hospital is a level I trauma center in Gauteng. We routinely do chest X-rays (CXRs) in indicated cases, if feasible, and prefer to perform a CXR after intercostal drain (ICD) insertion to maintain quality and direct further care [1]. Methods Data were collected prospectively on patients whose injuries required insertion of an ICD over a period of 8 months, from 1 August 2006 to 30 March 2007. A questionnaire was developed that included the patients' demographics, mechanism of injury, reason for ICD insertion, findings of both the initial and post-ICD insertion CXR, change in management and any acute complications noted.

Results One hundred and forty patients were identified for the study; 129 (92.1%) were males and 11 (7.9%) were females. The average age was 32 (range 11 to 64) years. Eighty-four (60%) patients sustained stab wounds and 20 (14.3%) had gunshot wounds. The remaining 36 (25.7%) sustained blunt injury. One hundred and four (74.3%) patients had an ICD inserted for both radiological and clinical findings, while 19 (13.6%) patients had drains inserted due to radiological findings and 14 (10%) patients on clinical grounds only. Three (2.1%) patients had drain insertion after CT scan findings.

In patients who had an initial CXR, clinical and/or radiological findings confirmed haemopneumothoraces in 47 (36.7%) patients, pneumothoraces in 49 (38.3%), haemothoraces in 26 (20.3%) and significant surgical emphysema with fractures in four (3.1%); two (1.6%) patients had no abnormalities on CXR. No acute complications of chest tube insertion were noted.

In 89 (63.6%) patients, the post-ICD CXR showed good position of the drain with improvement in pathology, in 31 (22.1%) patients an inadequate ICD position was noted, in 17 (12.1%) patients significant retained haemothoraces were shown and in three (2.1%) patients poor lung expansion was detected. The post-ICD CXR contributed to change in management in 29 (20.7%) of the cases. Twenty-two (15.7%) patients required a change in position of the tube, six (4.3%) had surgery performed, and one (0.7%) patient had their conservative treatment escalated. Four (2.9%) patients should have had their tubes adjusted. Conclusions Routine usage of post-ICD CXR contributes to a change in management in one out of five trauma patients. Reference

1. Huber-Wagner S, Korner M, Ehrt A, Kay MV, Pfeifer KJ, Mutschler W, Kanz KG: Emergency chest tube placement in trauma care - which approach is preferable? Resuscitation 2007, 72:226-233.

Bedside chest ultrasound reduces the rate of chest X-ray and CT examinations

R Spina1, L Tutino2, A Di Filippo2, L Perretta1, A Cecchi1, A Peris1

1Careggi Teaching Hospital, Florence, Italy; 2University of Florence, Italy

Critical Care 2009, 13(Suppl 1):P36 (doi: 10.1186/cc7200)

Introduction A bedside chest ultrasound (bCUS) programme performed by intensivists after 18 months of ultrasound training was introduced in the ICU routine between April and November 2008 in order to evaluate its effects on the number of chest X-ray (CXR) and CT scans. The setting was a 10-bed emergency ICU. Methods From April to November 2008 every patient has undergone a bCUS within the first 48 hours since admittance, then between the fourth and the sixth days of their stay. All of the 92 patients were examined supine, with a convex probe perpendicular to the chest wall, using all the intercostal spaces as the acoustic window. From the lung base, every intercostal space has been examined, looking for a pleural effusion, attaining to the following criteria: a space between the visceral and parietal pleura, movement of that space in agreement with the respiratory pattern. As for the volume esteem of the effusion, the distance (mm) between the posterior part of the lung and the posterior chest wall was measured. For each patient the following data were collected: age, sex, weight, height, SAPS II, number of CXR and CT scans done. Data were compared with those of a group of patients admitted to the ICU from January to March 2008, when bCUS was not part of the daily procedures. P <0.05 was considered statistically significant.

Results The two groups of patients were statistically homogeneous. Comparing the control group with the study group, CXRs were reduced by 22.63% (from 433 to 335; P =not significant) and CT scans were reduced by 42.36% (from 144 to 83; P <0.05).

Conclusions Daily routine bCUS is useful for reducing conventional radiology requests. The decrease in CT scans is statistically significant. As far as CXR requests are concerned, the decrease was not significant, since the CXR is relatively easy to perform and is mandatory after radiopaque device positioning; such examinations accounted for 34% of the total CXRs performed.

Combination of variability with pressure support ventilation enhances lung protection and function in experimental acute lung injury

M Gama de Abreu1, PM Spieth1, AR Carvalho1, P Pelosi2, T Koch1

University Clinic Carl Gustav Carus, Dresden, Germany;

2University of Insubria, Varese, Italy

Critical Care 2009, 13(Suppl 1):P37 (doi: 10.1186/cc7201)

Introduction Protective ventilation with low tidal volumes became the standard of care in acute lung injury (ALI). Although spontaneous breathing activity may be beneficial even in early ALI, protective ventilation is usually performed as controlled ventilation. Theoretically, pressure support ventilation (PSV) may be advantageous over pressure-controlled ventilation (PCV), particularly if combined with variability in the form of noisy PSV [1,2]. Methods Two protocols (A and B) were performed in pigs (20 to 30 kg). Animals were anesthetized, intubated, mechanically ventilated and ALI was induced by surfactant depletion. In protocol A, animals (n = 12) were randomly assigned to a sequence of PCV, PSV and noisy PSV, being ventilated for 1 hour in each mode (crossover design). The distribution of lung aeration and perfusion were determined. In protocol B, animals (n = 24) were randomly assigned to 6 hours of mechanical ventilation with PCV or PSV or noisy PSV, and lungs were extracted for quantification of inflammation and lung damage. In both protocols, gas exchange and respiratory parameters were determined. Statistical analysis was performed with univariate and multivariate tests, as appropriate. Significance was accepted at P <0.05. Results Compared with PCV, arterial oxygenation, intrapulmonary shunt, mean airway pressure and elastance of the respiratory system were improved by PSV and further improved by noisy PSV. Also, noisy PSV reduced the work of breathing and respiratory drive compared with PSV. Lung damage and inflammation were reduced by assisted mechanical ventilation, but comparable between noisy PSV and PSV. The distribution of lung aeration did not differ among the three modes, but PSV and noisy PSV were associated with more redistribution of pulmonary blood flow towards better aerated ventral areas compared with PCV. Conclusions The combination of variability with PSV enhances lung protection and function in experimental ALI. The main mechanism of improvement of lung function by assisted mechanical ventilation using pressure support is not recruitment of dorsal areas, but rather redistribution of pulmonary blood flow to ventral zones. References

1. Gama de Abreu M, Spieth PM, Pelosi P, Carvalho AR, Walter C, Schreiber-Ferstl A, et al.: Noisy pressure support ventilation: a pilot study on a new assisted ventilation mode in experimental lung injury. Crit Care Med 2008, 36:818-827.

2. Spieth et al.: Am J Respir Crit Care Med, in press.

Measurement of the forearm to warrant low-tidal-volume ventilation in the acute respiratory distress syndrome

M Moller, J Volz, J Neuzner

Klinikum Kassel, Germany

Critical Care 2009, 13(Suppl 1):P38 (doi: 10.1186/cc7202)

Introduction Acute respiratory distress syndrome (ARDS) is a life-threatening situation in patients on the ICU. Most patients have to be ventilated mechanically to provide adequate oxygenation. Reduction of tidal volumes to as low as 6 ml/kg predicted body weight has been convincingly shown to reduce mortality in ARDS in the ARDSnet trial [1] and is now recommended in treating such patients. In the ARDSnet trial, body weight was calculated with a formula incorporating body height [1]. We suggest that in most ICU patients the correct height is not known or is at best estimated, but very seldom correctly measured. We therefore searched for an easily obtainable and reproducible body mark with which to predict body height. Anthropological and forensic data have shown a close correlation between ulna length and body size. We prospectively measured body height and right and left ulna length in ventilated ICU patients. Methods In a 2-month period, 39 men and 42 women from four ICUs in a teaching hospital were included consecutively (mean age 62 years, range 23 to 87 years). Body height was measured following a standardized protocol; ulna length was measured from the edge of the olecranon to the caput ulnae. Results Eighty-one patients were included. Thirty-nine men: height 145 to 199 cm (mean 178.5 cm), right ulna 25 to 32 cm (mean 28.3 cm), left ulna 24.5 to 32 cm (mean 28.3 cm). Forty-two women: height 142 to 185 cm (mean 165.7 cm), right ulna 21 to 29 cm (mean 25.0 cm), left ulna 21 to 28 cm (mean 24.9 cm). Regression analyses were performed in SAS 9.1 and showed a significant correlation between body height (men and women) and ulna length. Regression analyses for men: body height in cm = 3.9314 x (right ulna in cm) + 67.059 cm (r2 = 53.25%; SD = 6.79) and body height in cm = 3.9786 x (left ulna in cm) + 65.824 cm (r2 = 52.50%; SD = 6.84). Regression analyses for women: body height in cm = 4.88 x (right ulna in cm) + 43.76 cm (r2 = 55.53%; SD = 6.61) and body height in cm = 5.41 x (left ulna in cm) + 30.95 cm (r2 = 60.67%; SD = 6.21). Conclusions Ulna length is an easily obtainable estimate of total body height. It may aid in implementing low-tidal-volume ventilation. The anthropological data correlate reasonably with a contemporary ventilated ICU population. Whether ulna-derived estimates of body height improve adherence to established guidelines remains to be studied. Reference

1. Acute Respiratory Distress Syndrome Network: Ventilation with lower tidal volumes as compared with traditional tidal volumes for acute lung injury and the acute respiratory distress syndrome. N Engl J Med 2000, 342:1301-1308.
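A worked example of how the reported regression equations could feed into low-tidal-volume ventilation is sketched below. The predicted-body-weight formula is the commonly cited height-based ARDSnet formula, which is assumed (but not stated in the abstract) to be the one the authors refer to; the patient values are illustrative.

```python
def height_from_ulna_cm(ulna_right_cm: float, male: bool) -> float:
    """Body height estimated from right ulna length, using the regression
    equations reported in the Results above."""
    if male:
        return 3.9314 * ulna_right_cm + 67.059
    return 4.88 * ulna_right_cm + 43.76

def predicted_body_weight_kg(height_cm: float, male: bool) -> float:
    """Height-based predicted body weight (assumed here to be the ARDSnet formula
    the abstract refers to): 50 (men) or 45.5 (women) + 0.91 * (height - 152.4)."""
    base = 50.0 if male else 45.5
    return base + 0.91 * (height_cm - 152.4)

# Worked example: a man with a 28.3 cm right ulna (the cohort mean)
height = height_from_ulna_cm(28.3, male=True)       # about 178.3 cm
pbw = predicted_body_weight_kg(height, male=True)   # about 73.6 kg
vt_6ml = 6.0 * pbw                                  # low-tidal-volume target in ml
print(f"height {height:.1f} cm, PBW {pbw:.1f} kg, Vt(6 ml/kg) {vt_6ml:.0f} ml")
```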

Effect of pleural effusion on gas exchange and response to positive end-expiratory pressure in acute lung injury/acute respiratory distress syndrome patients

D Chiumello, C Mietto, V Berto, M Cressoni, M Lazzerini, L Gattinoni

Fondazione IRCCS, Ospedale Maggiore Policlinico, Milan, Italy Critical Care 2009, 13(Suppl 1):P39 (doi: 10.1186/cc7203)

Introduction Pleural effusion is a common finding in acute lung injury/acute respiratory distress syndrome (ALI/ARDS) patients.

However, the effect of pleural effusion on gas exchange, respiratory mechanics and response to positive end-expiratory pressure (PEEP) in ALI/ARDS patients, during mechanical ventilation, has never been prospectively studied.

Methods Patients with a diagnosis of ALI/ARDS, who underwent a CT scan at 5 cmH2O PEEP for clinical reasons, were included in the study. Lung and pleural effusion were outlined separately; total lung weight and pleural effusion volume were computed with dedicated software. A PEEP test at 5 and 15 cmH2O with constant minute ventilation was performed. Exclusion criteria were: age <18 years, hemodynamic instability, chronic obstructive pulmonary disease and evidence of barotrauma. Results We enrolled 11 ALI/ARDS patients (10 male). The mean clinical characteristics on admission to the ICU were: age 67.5 ± 9.9 years, BMI 24.4 ± 2.1 kg/m2, PaO2/FiO2 203.1 ± 56.1 mmHg, PEEP 9.1 ± 2.8 cmH2O and pH 7.3294 ± 0.048. The volume of pleural effusion was not significantly related to the change in PaCO2 (r2 = 0.068; P = 0.437), PaO2 (r2 = 0.015; P = 0.722) or dead space (VD/VT) (r2 = 0.019; P = 0.682) going from PEEP 15 to PEEP 5 cmH2O.

Conclusions Pleural effusion does not seem to influence the gas-exchange response to PEEP.

CT-scan lung morphology predicts the response to a recruitment maneuver in acute respiratory distress syndrome patients

J Constantin1, S Grasso2, JJ Rouby3, E Futier1, B Gallix4, B Jung4, JE Baazin1, S Jaber4

1Hotel-Dieu Hospital, Clermont-Ferrand, France; 2Ospedale Policlinico, Bari, Italy; 3Pitie Salpetriere Hospital, Paris, France; 4Saint-Eloi Hospital, Montpellier, France Critical Care 2009, 13(Suppl 1):P40 (doi: 10.1186/cc7204)

Introduction CT-scan lung morphology (lobar or nonlobar) is the main determinant of the positive end-expiratory pressure (PEEP) response. The distribution of gas and tissue probably influences the response to a recruitment maneuver (RM), but to date there is no proof. The aim of this study was to assess RM-induced changes in lung morphology and gas exchange during and after a RM in acute respiratory distress syndrome (ARDS) patients. Methods Nineteen patients with ARDS were included in the study. Patients were ventilated with volume control ventilation (Galileo; Hamilton Medical, Bonaduz, Switzerland) with Vt = 6 ml/kg (ideal body weight) and the respiratory rate set to keep PaCO2 <55 mmHg without intrinsic PEEP. After a first CT scan in zero end-expiratory pressure conditions, a pressure-volume curve was performed and the PEEP was set above the lower inflection point. After a stabilisation period, a second CT scan (at PEEP) was performed. Then a RM was performed, using continuous positive airway pressure of 40 cmH2O for 40 seconds. At the end of the RM (between 35 and 39 s) a third CT scan was performed. Five minutes after the RM, a fourth CT scan was performed at the same level of PEEP. Blood gas analysis was sampled at each step of the study. CT-scan analysis was performed using specific volumetric software (IRMA).

Results Ten men and nine women, 63 ± 11 years old, were included in the study. SAPS II was 44 ± 8. Nine presented nonlobar CT-scan attenuation and eight a focal loss of aeration. All patients presented early ARDS (the interval between ARDS diagnosis and study inclusion was 18 ± 11 hours). Setting PEEP 2 cmH2O above the lower inflection point increased PaO2/FiO2 from 167 ± 110 to 205 ± 72 mmHg (P <0.001), to 265 ± 80 mmHg after the RM (P <0.005 vs. PEEP) and to 273 ± 96 mmHg 5 minutes after the RM. The RM-induced recruited volume at the same level of PEEP was 8 ± 45 ml in the case of lobar CT-scan attenuation versus 96 ± 63 ml in nonlobar CT-scan attenuation (P <0.005). The overinflated lung volume after the RM was 121 ± 170 ml in the case of lobar attenuation versus 18 ± 22 ml in the case of diffuse attenuation (P <0.005). During the RM, the overinflated lung volume was over 50% in the case of lobar CT-scan attenuation.

Conclusions As with PEEP, lung morphology predicts the response to a RM. In the case of lobar loss of aeration, the RM induced more overinflation than recruitment and should be avoided.
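The abstract quantifies recruited and overinflated volumes with dedicated volumetric software (IRMA); a generic sketch of how such quantities can be derived from voxel densities is given below, using the Hounsfield-unit compartments conventional in the ARDS CT literature. The thresholds and the operational definition of recruitment used here (decrease in nonaerated volume between two scans at the same airway pressure) are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def aeration_volumes_ml(hu: np.ndarray, voxel_ml: float) -> dict:
    """Classify lung voxels by aeration using conventional Hounsfield-unit bands
    and return the volume of each compartment in ml (generic thresholds; the
    software used by the authors may differ)."""
    return {
        "overinflated":     np.count_nonzero((hu >= -1000) & (hu < -900)) * voxel_ml,
        "normally_aerated": np.count_nonzero((hu >= -900) & (hu < -500)) * voxel_ml,
        "poorly_aerated":   np.count_nonzero((hu >= -500) & (hu < -100)) * voxel_ml,
        "nonaerated":       np.count_nonzero((hu >= -100) & (hu <= 100)) * voxel_ml,
    }

def recruited_volume_ml(hu_before: np.ndarray, hu_after: np.ndarray, voxel_ml: float) -> float:
    """One common operational definition of recruitment: the decrease in nonaerated
    lung volume between two scans acquired at the same PEEP (before vs. after RM)."""
    before = aeration_volumes_ml(hu_before, voxel_ml)
    after = aeration_volumes_ml(hu_after, voxel_ml)
    return before["nonaerated"] - after["nonaerated"]
```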

Alveolar wall disruption and lung inflammation associated with positive end-expiratory pressure and recruitment maneuver in pigs

A Ambrosio1, DT Fantoni1, CK Marumo2, D Otsuki2, C Gutierres3, Q Lu3, J Noel-Morgan2, JJ Rouby3, JO Auler Jr2

1Faculdade de Medicina Veterinâria e Zootecnia da Universidade de Sao Paulo, Brazil; 2Faculdade de Medicina da Universidade de Sao Paulo, Brazil; 3La Pitiè-Salpêtrière Hospital, Assistance Publique Hôpitaux de Paris, University Pierre et Marie Curie, Paris, France

Critical Care 2009, 13(Suppl 1):P41 (doi: 10.1186/cc7205)

Introduction Different levels of positive end-expiratory pressure (PEEP), associated or not with an alveolar recruitment maneuver (ARM), may have a significant impact on ventilator-induced lung injury [1], but this issue has not been well addressed in a model of lung injury using inhaled hydrochloric acid. We aimed to evaluate the effects of PEEP and ARM on respiratory mechanics and lung tissue in an inhaled hydrochloric acid acute lung injury (ALI) model. Methods Thirty-two pigs (22.9 ± 2.8 kg) were randomly allocated into one of four groups (I - Control-PEEP, II - Control-ARM, III - ALI-PEEP, IV - ALI-ARM). ALI was induced by intratracheal instillation of hydrochloric acid. PEEP values were progressively increased and then decreased through 5, 10, 15 and 20 cmH2O in all groups. Three alveolar recruitment maneuvers of 40 cmH2O for 20 seconds were applied to the assigned groups at each interval of PEEP level increase or decrease. Histological analysis was performed to evaluate the presence of inflammatory infiltrates, alveolar wall thickening, atelectasis, hemorrhage, alveolar edema and alveolar disruption. Histomorphometric analyses of the lungs were also performed to determine alveolar dimensions. Results Inflammation and alveolar disruption were statistically greater in group II when compared with group I. There were no statistical differences between groups III and IV with respect to hemorrhage, alveolar edema, inflammation and alveolar disruption. Mean alveolar area and mean alveolar intercept were higher in group IV when compared with groups I and II (P <0.05). The mean alveolar area was significantly smaller in the diaphragmatic lobes when compared with the middle and upper pulmonary lobes, but no statistical difference was found among groups. Conclusions The association of PEEP with ARM promoted a greater degree of alveolar disruption.

Acknowledgement Grants from LIM08-Anesthesia; FAPESP 02/08621-1. Reference

1. Halter JM, Steinberg JM, Schiller HJ, DaSilva M, Gatto LA, Landas S, Nieman GF: Positive end-expiratory pressure after a recruitment maneuver prevents both alveolar collapse and recruitment/derecruitment. Am J Respir Crit Care Med 2003, 167:1620-1626.

Ventilation with high positive end-expiratory pressure improves oxygenation after cardiac surgery independently of the mode of ventilation and of the use of nitric oxide

L Hajjar, F Galas, N Rossati, A Leme, R Kalil Filho, J Auler

Heart Institute, Sao Paulo, Brazil

Critical Care 2009, 13(Suppl 1):P42 (doi: 10.1186/cc7206)

Introduction Postoperative pulmonary dysfunction in patients undergoing cardiac surgery with cardiopulmonary bypass is a significant clinical problem and has long been recognized. Postoperative hypoxemia carries high morbidity, leading to prolonged postoperative recovery and hospital stays. We hypothesized that adding high positive end-expiratory pressure (PEEP) would be effective for the treatment of postoperative hypoxemia after cardiac surgery with cardiopulmonary bypass, independently of the mode of ventilation or the use of nitric oxide.

Methods During 2 years, 210 patients undergoing on-pump coronary artery bypass surgery with a diagnosis of hypoxemia (PaO2/FiO2 <200) were randomized into six groups after operation (35 in each group): pressure-controlled ventilation (PCV) with inhaled nitric oxide and ideal PEEP (group 1), PCV without inhaled nitric oxide and ideal PEEP (group 2), PCV without inhaled nitric oxide and PEEP 5 cmH2O (group 3), volume-controlled ventilation (VCV) with inhaled nitric oxide and ideal PEEP (group 4), VCV without inhaled nitric oxide and ideal PEEP (group 5) and VCV without inhaled nitric oxide and PEEP 5 cmH2O (group 6). Arterial and mixed venous blood were drawn and analyzed before the interventions and 1, 2, 4 and 6 hours after the interventions in the ICU to determine the PaO2/FiO2 ratio. Hemodynamic measurements were analyzed. The time to extubation was compared among groups according to the ICU weaning protocol.

Results There was no significant difference among the groups regarding hemodynamic measurements (mean arterial blood pressure, heart rate, central venous pressure and SVO2). Oxygenation was higher in the high-PEEP groups (groups 1, 2, 4 and 5) than in the PEEP 5 groups (groups 3 and 6) during the mechanical ventilation period (P <0.01). Also, the time to extubation was significantly shorter in the high-PEEP groups than in the PEEP 5 groups (220 min vs. 428 min, P <0.03).

Conclusions Ventilation with high PEEP after cardiac surgery is associated with improved oxygenation and a shorter duration of mechanical ventilation, independently of the mode of ventilation and of the use of nitric oxide. References

1. Celebi S, Koner O, Menda F, Korkut K, Suzer K, Cakar N: The pulmonary and hemodynamic effects of two different recruitment maneuvers after cardiac surgery. Anesth Analg 2007, 104:384-390.

2. Rouby JJ, Ferrari F, Bouhemad B, Lu Q: Positive end-expiratory pressure in acute respiratory distress syndrome: should the 'open lung strategy' be replaced by a 'protective lung strategy'? Crit Care 2007, 11:180.

Control system for automated titration of positive end-expiratory pressure and tidal volume using dynamic nonlinear compliance as the setpoint

S Lozano-Zahonero1, A Wahl2, D Gottlieb2, J Arntz1, S Schumann2, J Guttmann2, K Moller1

1Furtwangen University, Villingen-Schwenningen, Germany;

2University Hospital Freiburg, Germany

Critical Care 2009, 13(Suppl 1):P43 (doi: 10.1186/cc7207)

Introduction An automated respiratory mechanics control was developed to individually adapt the energy transfer from the ventilator to the respiratory system. The controller titrates the positive end-expiratory pressure (PEEP) and tidal volume (VT) to ventilate the lung at its maximal compliance in order to avoid excessive lung overinflation as well as underinflation. Methods The mechanics controller consists of a software program to set PEEP and VT and a user interface to observe the compliance and the controller state. The program has the following structure: (1) Dynamic compliance is calculated breath by breath using the implemented slice function [1]. This function divides VT into six consecutive volume slices of equal size. (2) For each volume slice, one value of dynamic compliance (CSLICE) is determined by least-squares fit using the linear resistance and compliance model within each slice [2]. The six CSLICE values are plotted over the corresponding volume, giving the compliance-volume curve. (3) The shape-compliance function of the controller identifies one out of six shape categories [3]. (4) The PEEP and VT-change function calculates the PEEP and VT titration depending on the shape category and sends a command to the ventilator for setting the new PEEP and VT automatically.

Results The system was tested with previously recorded patient data (McRem) [4]. The compliance controller retrospectively analysed the respiratory data and determined the shape category depending on the course of CSLICE. For shapes representing an intratidal increase of CSLICE, the controller increased the PEEP. A reduction of PEEP occurred when CSLICE decreased intratidally. PEEP was maintained when CSLICE was maximal and constant. Furthermore, for hybrid shape categories (one part in the linear region and one part in the increasing and/or decreasing region) the VT was reduced.

Conclusions The automated respiratory mechanics control system titrates PEEP and VT automatically until intratidal compliance reaches its maximal value within an appropriate VT. References

1. Schumann S, et al.: Modellierung und Bestimmung der nichtlinear volumenabhängigen Compliance der Lunge. In Dreiländertagung der Deutschen, Österreichischen und Schweizerischen Gesellschaften für Biomedizinische Technik, Zürich; Proceedings V118; 2006.

2. Guttmann J, et al.: Determination of volume-dependent respiratory system mechanics in mechanically ventilated patients using the new SLICE method. Technol Health Care 1994, 2:175-191.

3. Mols G, et al.: Volume-dependent compliance in ARDS: proposal of a new diagnostic concept. Intensive Care Med 1999, 25:1084-1091.

4. Stahl CA, et al.: Dynamic versus static respiratory mechanics in acute lung injury and acute respiratory distress syndrome. Crit Care Med 2006, 34:2090-2098.
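The core of the SLICE computation described in steps (1) and (2) of the Methods above can be sketched as follows: the tidal volume is divided into six equal volume slices and, within each slice, the linear single-compartment equation Paw = V/C + R·flow + P0 is fitted by least squares, the fitted elastance giving CSLICE for that slice. The shape classification and the PEEP/VT titration rules of steps (3) and (4) are not reproduced; array names and fitting details are illustrative only.

```python
import numpy as np

def slice_compliance(p_aw: np.ndarray, flow: np.ndarray, volume: np.ndarray,
                     n_slices: int = 6) -> list:
    """Intratidal compliance per volume slice (SLICE-style analysis of one inspiration).
    Within each of n_slices equal-volume segments, the single-compartment model
    p_aw = V/C + R*flow + P0 is fitted by least squares; 1/(fitted elastance) is
    that slice's compliance C_SLICE."""
    edges = np.linspace(volume.min(), volume.max(), n_slices + 1)
    compliances = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        idx = (volume >= lo) & (volume <= hi)
        # Design matrix: columns for volume (1/C), flow (R) and a constant offset (P0)
        A = np.column_stack([volume[idx], flow[idx], np.ones(int(idx.sum()))])
        coeffs, *_ = np.linalg.lstsq(A, p_aw[idx], rcond=None)
        elastance = coeffs[0]
        compliances.append(1.0 / elastance)
    return compliances

# Per the controller rules described above, an intratidal increase of C_SLICE over the
# slices would prompt a PEEP increase, and an intratidal decrease a PEEP reduction.
```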


Effect of positive end-expiratory pressure on regional ventilation monitored by electrical impedance tomography in mechanically ventilated ICU patients

D Gommers, IG Bikker, J Bakker

Erasmus Medical Center, Rotterdam, the Netherlands Critical Care 2009, 13(Suppl 1):P44 (doi: 10.1186/cc7208)

Introduction The optimal positive end-expiratory pressure (PEEP) is a balance between the prevention of overdistention in the nondependent part and alveolar collapse in the dependent part. Electrical impedance tomography (EIT) has been introduced to monitor regional change of ventilation at the bedside. We evaluated the effect of changes in PEEP on regional ventilation in mechanically ventilated patients with or without lung disorders.

Methods Functional EIT (fEIT) images were obtained in 14 patients on pressure-controlled ventilation with constant driving pressure at four PEEP levels (15, 10, 5 and 0 cmH2O). fEIT images made before each reduction in PEEP were subtracted from those recorded after each PEEP step to evaluate the regional increase/decrease in tidal impedance in each EIT pixel. Results The response of regional tidal impedance to PEEP showed a significant difference from 15 to 10 cmH2O (P = 0.002) and from 10 to 5 cmH2O (P = 0.001) between patients with and without lung disorders (Figure 1). During the decrease in PEEP from 15 to 10 cmH2O, tidal impedance increased in the ventral parts in both groups, but decreased markedly in the dorsal parts in the patients with lung disorders. From PEEP 10 to 5 cmH2O, tidal impedance increased in the ventral parts and decreased in the dorsal parts in patients without lung disorders, whereas in patients with lung disorders tidal impedance decreased in both regions. Lowering the PEEP from 5 to 0 cmH2O decreased tidal impedance in both regions in both groups.

Conclusions During a decremental PEEP trial, EIT can visualize improvement or loss of ventilation in dependent and nondependent parts, indicating lung collapse or decreased overdistention.
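The subtraction step described in the Methods above (fEIT images recorded before each PEEP reduction subtracted from those recorded after it, read pixel by pixel as regional gain or loss of ventilation) can be sketched as below. Splitting the image into ventral and dorsal halves by row index is an assumption for illustration and depends on how the electrode belt and image are oriented.

```python
import numpy as np

def regional_ventilation_change(feit_before: np.ndarray, feit_after: np.ndarray):
    """Pixel-wise change in tidal impedance between two fEIT images (e.g. PEEP 15
    vs. PEEP 10 cmH2O). Returns the difference image plus the summed change in the
    ventral (nondependent) and dorsal (dependent) halves of a supine patient."""
    diff = feit_after - feit_before                 # >0: regional gain, <0: regional loss
    rows = diff.shape[0]
    ventral = float(diff[: rows // 2, :].sum())     # assumes the ventral half is the top rows
    dorsal = float(diff[rows // 2 :, :].sum())
    return diff, ventral, dorsal

# Example with random 32x32 arrays standing in for real fEIT frames:
rng = np.random.default_rng(0)
_, ventral, dorsal = regional_ventilation_change(rng.random((32, 32)), rng.random((32, 32)))
print(ventral, dorsal)
```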

Measurement of the end-expiratory lung volume without interruption of mechanical ventilation in pediatric patients

IG Bikker, T Scohy, D Gommers

Erasmus Medical Center, Rotterdam, the Netherlands Critical Care 2009, 13(Suppl 1):P45 (doi: 10.1186/cc7209)

Introduction Monitoring the end-expiratory lung volume (EELV) is a valuable tool to optimize respiratory settings that could be of particular importance in mechanically ventilated pediatric patients. We evaluated the feasibility and precision of an ICU ventilator with an inbuilt nitrogen washout/washin technique in mechanically ventilated pediatric patients.

Methods Duplicate EELV measurements were performed in 26 patients between 5 and 30 kg after cardiac surgery. All measurements were taken during pressure-controlled ventilation at 0 cmH2O positive end-expiratory pressure (PEEP). Results Linear regression between duplicate measurements was excellent (R2 = 0.99). Also, there was good agreement between duplicate measurements, bias -1.0% (-1.7 ml) ± 5.7% (15.5 ml) (Figure 1). The mean EELV was 18.9 ± 4.4 ml/kg at 0 cmH2O PEEP. The EELV correlated significantly with age (P <0.001, r = 0.92, R2 = 0.79), body weight (P <0.001, r = 0.90, R2 = 0.77) and height (P <0.001, r = 0.90, R2 = 0.79). Conclusions This ICU ventilator with an inbuilt nitrogen washout/washin EELV technique can measure EELV with precision, and can easily be used for mechanically ventilated pediatric patients.
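The agreement statistics reported above (bias expressed both in ml and as a percentage, with its spread) can be reproduced with a few lines of code. The sketch below uses hypothetical duplicate measurements, not the study data, and simply computes the Bland-Altman style bias and standard deviation of the paired differences.

```python
import numpy as np

# Hypothetical duplicate EELV measurements (ml); not the study data.
eelv_1 = np.array([310.0, 250.0, 420.0, 180.0, 390.0])
eelv_2 = np.array([305.0, 255.0, 410.0, 182.0, 395.0])

diff = eelv_2 - eelv_1                    # difference per measurement pair
mean = (eelv_1 + eelv_2) / 2              # pairwise mean
bias_ml = diff.mean()                     # systematic difference (ml)
sd_ml = diff.std(ddof=1)                  # spread of differences (ml)
bias_pct = (100 * diff / mean).mean()     # bias as a percentage of the mean
sd_pct = (100 * diff / mean).std(ddof=1)

print(f"bias {bias_pct:.1f}% ({bias_ml:.1f} ml) ± {sd_pct:.1f}% ({sd_ml:.1f} ml)")
```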

Success of recruitment maneuvers during pneumoperitoneum is dependent on the intraabdominal pressure

H Runck, S Schumann, J Haberstroh, J Guttmann

University Medical Center Freiburg, Germany

Critical Care 2009, 13(Suppl 1):P46 (doi: 10.1186/cc7210)

Introduction Intraabdominal hypertension is a condition affecting respiratory mechanics in many situations such as laparoscopy or morbid obesity. We investigated the effects of different intraabdominal pressures (IAP) at different positive end-expiratory pressure (PEEP) levels on nonlinear respiratory mechanics and the effects on hemodynamics and oxygenation in a rat model. The application of a pneumoperitoneum served as a model for elevated IAP. Methods A helium pneumoperitoneum was established in 20 anesthetized female Wistar rats that were randomly allocated to one of four PEEP levels (0, 3, 6 and 9 mbar). IAP of 9, 12, 15 and 18 mbar was applied in a random sequence in each rat, followed by respiratory mechanics measurement and blood gas analysis. From the low flow maneuvers' pressure-volume loops, the lower inflection point and the mathematical turning point within the expiratory limb were detected and compared with the IAP applied during the measurement. For investigation of the lung's intratidal compliance, 10 consecutive breaths before and after low flow maneuvers were analyzed. Intratidal nonlinear compliance was calculated using a modified SLICE method [1]. Results A higher IAP led to a decreased steepness of quasi-static pressure-volume loops resulting in a right shift of the expiratory lower inflection point. The pressure at the mathematical turning point of the expiratory limb correlated well with the IAP (r = 0.97, P <0.001). Intratidal compliance decreased with increasing IAP before and after the low flow maneuvers. After execution of the low flow maneuvers the compliance increased. Relative compliance gain caused by the low flow maneuvers was dependent on PEEP, IAP and slice (all P <0.001). The peak inspiratory pressure increased with increasing IAP and was significantly smaller after recruitment maneuvers (P <0.001). The peak inspiratory pressure was dependent on PEEP and IAP (P <0.001). PaO2 was dependent on PEEP (P = 0.008) but not on IAP (P = 0.153). Arterial pressure was not dependent on PEEP (P = 0.068) or IAP (P = 0.292). Conclusions Intraabdominal hypertension alters respiratory mechanics. IAP has no effect on oxygenation as long as PEEP is applied. The examination of nonlinearity holds important information for the evaluation of respiratory mechanics. The success of recruitment maneuvers on healthy lungs depends strongly on the IAP. Reference

1. Guttmann J, et al.: Determination of volume-dependent respiratory system mechanics in mechanically ventilated patients using the new SLICE method. Technol Health Care 1994, 2:175-191.

Correlation between lung sound distribution and functional residual capacity: preliminary findings

S Lev, YG Glickman, IK Kagan, JC Cohen, PS Singer

RMC, Petach Tiqva, Israel

Critical Care 2009, 13(Suppl 1):P47 (doi: 10.1186/cc7211)

Introduction Vibration response imaging (VRI) is a bedside lung sound monitoring system. The VRI measurement has been proven sensitive to changes in ventilator settings, including changes in the mode of mechanical ventilation or in positive end-expiratory pressure (PEEP). In the present study, we correlate lung sound distribution with functional residual capacity (FRC). Methods Thirty-nine lung sound measurements were performed on seven mechanically ventilated critically ill patients at different levels of PEEP before and after a recruitment maneuver. The FRC was obtained for each measurement. Lung sound distribution was monitored using the sound distribution index (SDI), an index revealing the lung sound distribution at peak inspiration and computed from the percentages of lung sound distribution in four lung segments (right lower (RL), left lower (LL), right upper (RU), left upper (LU)) as SDI = 100 - abs(RL + LL - RU - LU) / 2 - abs(RU + LU - RL - LL) / 2 (range 0% to 100%). P values were obtained using the Wilcoxon two-sample test. Results Significantly increased mean (± SD) SDI was registered in cases with higher FRC. In two out of five measurements with FRC above 2.9 l, a low SDI was obtained (~70%). Conclusions Lung SDI significantly increased with FRC. At very high volumes, a decreased SDI may indicate possible hyperinflation.
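Taking the SDI formula exactly as printed in the Methods above, the index can be computed from the four regional percentages with a short function. The regional values in the example below are hypothetical.

```python
def sdi(rl, ll, ru, lu):
    """Sound distribution index from regional lung-sound percentages
    (right lower, left lower, right upper, left upper), per the formula above."""
    return 100 - abs(rl + ll - ru - lu) / 2 - abs(ru + lu - rl - ll) / 2

# Hypothetical distribution: 35% RL, 30% LL, 20% RU, 15% LU.
print(sdi(35, 30, 20, 15))  # 70.0 -> lower values indicate a less even distribution
```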

Clinical utility of functional residual capacity measurement based on a modified nitrogen breath washout technique

F Turani1, R Barchetta1, F Mounajergi1, A Marinelli2, F Brunetti2, R Scaini1, C Di Corato1, M Falco1

1European Hospital, Rome, Italy; 2Aurelia Hospital, Rome, Italy Critical Care 2009, 13(Suppl 1):P48 (doi: 10.1186/cc7212)

Introduction Improvement of oxygenation during acute lung injury (ALI) and acute respiratory distress syndrome (ARDS) requires a high level of positive end-expiratory pressure (PEEP) to recruit nonaerated lung zones and decrease pulmonary shunt. However, monitoring of alveolar recruitment at the bedside is difficult, as neither the PaO2/FiO2 ratio, thoracopulmonary compliance nor the pressure curve is an index of alveolar recruitment and avoidance of lung hyperinflation. Monitoring of the functional residual capacity (FRC) at the bedside may be useful to monitor lung recruitment directly and to optimize the PEEP level [1]. The aims of this study are to evaluate the FRC by a modified nitrogen multiple washout technique (NMBW) in ALI/ARDS patients, and to set PEEP levels based on the FRC values.

Methods Twenty patients with ALI/ARDS were enrolled in the study. All patients were ventilated in pressure-controlled ventilation with an Engström Carestation ventilator (GE Healthcare, Helsinki, Finland) in accordance with the ARDSnet guidelines. FRC measurement was carried out with the COVX module integrated within the ventilator (GE Healthcare) by an NMBW technique. Every patient had a basal FRC measurement and then three measurements at PEEP 15/10/5 cmH2O during a derecruiting maneuver. After all measurements, PEEP was set to the level at which the FRC started to decrease. At basal time (T0) and after setting the best PEEP (T1), the PaO2/FiO2 ratio and static compliance were measured too. All data are reported as the mean ± SD. A t test was used to compare changes over time. Results Table 1 presents the main results of the study.

Table 1 (abstract P48)

Parameter T0 T1

FRC (ml) 2,330 ± 400 2,933 ± 300*

PaO2/FiO2 164 ± 74 251 ± 107*

Compliance (ml/cmH2O) 38 ± 12 49 ± 15*

*P <0.05 T1 vs. T0.
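The PEEP-selection rule described in the Methods (PEEP set according to where FRC starts to fall during the decremental trial) can be made explicit in a few lines. The sketch below uses hypothetical FRC readings and represents only one possible reading of that rule: it keeps the lowest PEEP level at which FRC has not yet started to decrease.

```python
def select_peep(trial):
    """trial: list of (peep_cmH2O, frc_ml) pairs from a decremental PEEP trial,
    ordered from highest to lowest PEEP. Returns the last PEEP before FRC falls."""
    best_peep, prev_frc = trial[0]
    for peep, frc in trial[1:]:
        if frc < prev_frc:          # FRC starts to decrease below this level
            break
        best_peep, prev_frc = peep, frc
    return best_peep

# Hypothetical decremental trial: FRC preserved at 15 and 10, drops at 5 cmH2O.
print(select_peep([(15, 2930), (10, 2950), (5, 2400)]))  # -> 10
```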

Conclusions FRC measurement by the NMBW technique integrated in the ventilator is useful to assess functional lung impairment at the bedside. Setting PEEP based on FRC measurements may improve lung recruitment and oxygenation, but anatomical studies (CT scan) are also warranted. Reference

1. Lambermont B, et al.: Comparison of functional residual capacity and static compliance of the respiratory system during a PEEP ramp procedure in an experimental model of acute respiratory distress syndrome. Crit Care 2008, 12:R91.

Pulmonary permeability index predicts progression to acute lung injury in patients with increased risk

CR Phillips, K Bacon, J Pinney, A Nielsen, JL LeTourneau

OHSU, Portland, OR, USA

Critical Care 2009, 13(Suppl 1):P49 (doi: 10.1186/cc7213)

Introduction Early identification of progression to acute lung injury (ALI) in patients at risk may change therapy and potentially improve outcome. Central to the pathogenesis of ALI is pulmonary microvascular injury and increased permeability resulting in pulmonary edema. We proposed that the pulmonary vascular permeability index (PVPI), defined as extravascular lung water (EVLW) (ml) divided by pulmonary blood volume (PBV) (ml), reflects the severity of this injury and predicts progression to ALI in patients at risk. Methods The PVPI was measured prospectively in 27 patients who either were at increased risk of developing ALI (n = 17) or had ALI on presentation (n = 10) for the first 5 days after admission to the ICU.

Results Ten out of 17 patients at risk for ALI progressed to it. The mean (± SEM) PVPI on day 1 was lower in patients who did not develop ALI vs. those that did (1.4 ± 0.1 vs. 2.6 ± 0.4, P = 0.01) in the 17 patients who did not have ALI on presentation (Figure 1). There was no difference in PVPI for those that developed ALI vs. those that had it on presentation (2.6 ± 0.4 vs. 2.7 ± 0.3, P = 0.5). A cutoff PVPI value of 1.9 or less discriminated those that would not develop ALI from those who did or who had it on presentation with a sensitivity and specificity of 100% and 85%, respectively (Figure 2).
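Since PVPI is simply the ratio of extravascular lung water to pulmonary blood volume, the reported cutoff can be applied directly. The sketch below uses hypothetical patient values; the cutoff of 1.9 is taken from the Results, and the code is purely illustrative, not a clinical tool.

```python
def pvpi(evlw_ml, pbv_ml):
    """Pulmonary vascular permeability index = EVLW / PBV (both in ml)."""
    return evlw_ml / pbv_ml

def at_risk_of_ali(evlw_ml, pbv_ml, cutoff=1.9):
    """Flag patients above the 1.9 cutoff reported for discriminating
    progression to ALI (illustrative use only)."""
    return pvpi(evlw_ml, pbv_ml) > cutoff

# Hypothetical patient: EVLW 780 ml, PBV 300 ml -> PVPI 2.6, above the cutoff.
print(pvpi(780, 300), at_risk_of_ali(780, 300))  # 2.6 True
```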

Figure 1 (abstract P49) PVPI by group: No ALI, Got ALI, Had ALI.

Figure 2 (abstract P49) PVPI by group: No ALI, Got ALI, Had ALI.

Conclusions Increased PVPI is a feature of early ALI and predicts progression to ALI in patients at increased risk. Early identification of patients with elevated PVPI and who are at risk to develop ALI may lead to consideration of early initiation of lung protective ventilator strategies.

Pulmonary electrical impedance tomography changes in a model of hemorrhagic shock with endotoxemia and resuscitation

J Noel-Morgan1, D Fantoni2, D Otsuki1, JO Auler Jr1

1Faculdade de Medicina da Universidade de Sao Paulo, Brazil;

2Faculdade de Medicina Veterinaria e Zootecnia da Universidade de Sao Paulo, Brazil

Critical Care 2009, 13(Suppl 1):P50 (doi: 10.1186/cc7214)

Introduction Electrical impedance tomography (EIT) is a promising bedside device with the potential to assess changes in regional ventilation and lung blood flow [1]. The purpose of our study was to monitor lung images and changes in impedance by EIT in a model of hemorrhagic shock with endotoxemia followed by fluid resuscitation.

Methods Twelve anesthetized, mechanically ventilated, supine pigs were submitted to hemorrhagic shock (50% blood volume) and endotoxin infusion. Animals were randomly allocated to control (n = 6) or a treatment group with lactated Ringer's (n = 6). The mean arterial pressure (MAP), central venous pressure (CVP), blood gas, extravascular lung water index (EVLWI), intrathoracic blood volume index (ITBVI), lung compliance and pulmonary EIT (Dräger, Germany) were measured before shock (Tbasal), 60 minutes after hemorrhagic shock (Tshock) and hourly in the treatment period (T1, T2 and T3). Statistical analysis was based on one-way ANOVA (P <0.05).

Figure 1 (abstract P50) Functional EIT images along timepoints.

Results At Tshock there was a significant decrease in MAP, CVP, SvO2, lung compliance, cardiac index, EVLWI and ITBVI and an increase in lactate (P <0.05). Fifty percent of control animals died between T2 and T3. In treated animals, at T3 the EVLWI reached values near those of Tbasal whereas ITBVI remained below baseline (P <0.05) and above Tshock (P <0.05). There was a noticeable change in functional EIT images (Figure 1) and significant differences in impedance over time. Significant global impedance change occurred at T2 relative to Tshock (P <0.05). Most of the EIT changes were attributable to the ventral lobes (local1), which showed significant differences at T2 and T3 relative to Tbasal (P <0.05) and at T1, T2 and T3 relative to Tshock (P <0.05).

Conclusions Pulmonary impedance changes induced by the proposed model of shock and resuscitation were monitored successfully with the EIT device. Changes were suggestive of alterations in regional ventilation and ventilation-perfusion mismatch.

Acknowledgement Grants from FAPESP 08/50063-0, 08/50062-4 and LIM08/FMUSP.

Reference

1. Putensen C, et al.: Curr Opin Crit Care 2007, 13:344-350.

Determination of lung area in electrical impedance tomography images

Z Zhao1, K Möller2, D Steinmann1, J Guttmann1

1University Medical Center, Freiburg, Germany; 2Furtwangen University, Villingen-Schwenningen, Germany

Critical Care 2009, 13(Suppl 1):P51 (doi: 10.1186/cc7215)

Introduction Electrical impedance tomography (EIT) has the potential for bedside monitoring of regional lung function. Evaluation of EIT imaging requires the identification of the lung area in the images. A functional EIT-based method (fEIT) has been proposed to identify the lung area in EIT images for patients with healthy lungs [1,2]. However, in patients with certain lung diseases the fEIT method fails to include lung regions in which the ventilation change is low. In addition, the identified lung regions may include the cardiac-related area. An accurate method to estimate the lung area is therefore lacking. The aim of this study was to develop an improved method for lung area estimation in EIT images (LAE), which is suitable for both healthy subjects and patients with serious pulmonary diseases.

Methods In our LAE method, the lung area as determined by fEIT is mirrored and the cardiac-related area, which is distinguished in the frequency domain, is subtracted. Forty-nine mechanically ventilated patients were investigated (test group: 39 patients, thoracic surgery; control group: 10 patients, orthopedic surgery without pulmonary disease). An EIT video sequence of 5 minutes duration comprising about 60 breathing cycles from every participant was recorded and subsequently analyzed. Statistical analysis was performed by one-way ANOVA. P <0.01 was considered statistically significant. Data are presented as means and standard deviations.
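A minimal sketch of the LAE idea described in the Methods, assuming per-pixel EIT time series sampled at a known rate: pixels dominated by ventilation-frequency activity form the fEIT lung mask, that mask is mirrored left-right, and pixels dominated by heart-frequency activity are removed. Function names, frequency bands and thresholds are hypothetical; this is not the authors' implementation.

```python
import numpy as np

def lae_mask(pixels, fs, resp_band=(0.1, 0.6), card_band=(0.8, 2.5), thresh=0.5):
    """pixels: array (ny, nx, nt) of EIT pixel time series; fs: sampling rate (Hz).
    Returns a boolean lung mask: mirrored ventilation-dominated pixels
    minus cardiac-dominated pixels (frequency-domain separation)."""
    ny, nx, nt = pixels.shape
    freqs = np.fft.rfftfreq(nt, d=1.0 / fs)
    power = np.abs(np.fft.rfft(pixels, axis=-1)) ** 2

    def band_power(lo, hi):
        sel = (freqs >= lo) & (freqs <= hi)
        return power[..., sel].sum(axis=-1)

    resp = band_power(*resp_band)
    card = band_power(*card_band)

    fEIT = resp > thresh * resp.max()    # ventilation-dominated pixels (fEIT step)
    mirrored = fEIT | fEIT[:, ::-1]      # mirror the lung area left-right
    cardiac = card > resp                # cardiac-dominated pixels
    return mirrored & ~cardiac           # subtract the cardiac-related area
```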

Results It is assumed that the fraction of the thorax occupied by the lungs is roughly the same across individuals, regardless of the state of the lung. The lung area determined with fEIT was S_C,fEIT = 361 ± 35.1 in the control group and S_T,fEIT = 299 ± 60.8 in the test group (P <0.01). In contrast, the areas estimated with the LAE method were S_C,LAE = 353 ± 27.2 in the control group and S_T,LAE = 353 ± 61.1 in the test group (P = 0.41). Conclusions The result demonstrates that the novel LAE method can better identify the lung region in EIT images, from which the analysis of regional lung ventilation will benefit. Further validation will be pursued by comparing the results with anatomical computed tomography images of the chest morphology. References

1. Hahn G, et al.: Physiol Meas 1996, 17(Suppl 4A):A159-A166.

2. Frerichs I, et al.: Intensive Care Med 1998, 24:829-836.

Development of a system for in vivo optical alveolar elastometry

D Schwenninger1, K Möller2, S Schumann1, J Guttmann1

1University Hospital of Freiburg, Germany; 2Furtwangen University, VS-Schwenningen, Germany

Critical Care 2009, 13(Suppl 1):P52 (doi: 10.1186/cc7216)

Introduction Micromechanical properties of alveolar walls are needed for finite element modeling of the lung, which in turn may be used to guide mechanical ventilation therapy. The aim of this project is to develop an endomicroscopic device [1] to measure the local mechanics of alveolar walls in vitro and in vivo under varying conditions. Here we report on the development and the evaluation of the endoscopic system, which is, in a first step, performed in an artificial environment (bioreactor) with known mechanical properties.

Methods A system of two concentric trocars was built to adjust and monitor the local pressure by means of a flushing fluid in the endoscopic field of view. The fluid is pumped through the double trocar system surrounding the endoscope. By adjusting the flow rate of the fluid, the mean pressure P2 in the endoscopic field of view can be kept constant. The alveolar pressure is changed by adjusting the airway pressure P1. The transpulmonary pressure (Pt) for the observed subpleural alveoli is thus Pt = P1 - P2 (Figure 1b). Pt is varied by applying different continuous positive airway pressure values to animal-model airways. The mechanical reaction of the observed alveoli is thereby recorded by video endoscopy. Assuming that the recorded outlines reflect changes in the diameter of an observed alveolus allows calculation of mechanical properties such as the stress-strain relationship of the alveolar wall. For evaluation, the endoscopic system is applied to an artificial membrane with known mechanical properties in a bioreactor. A pattern of particles on the membrane allows quantification of the three-dimensional deformation under pressure changes. Mechanical membrane properties can be determined by the relation between membrane deformation and transmembrane pressure Pm = Pa - Pb (Figure 1a).
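The pressure relations given in the Methods (Pt = P1 - P2 for the alveoli, Pm = Pa - Pb for the bioreactor membrane) and the diameter-based readout can be illustrated with a short sketch. The CPAP steps and diameters below are hypothetical; the sketch only computes transpulmonary pressure and the corresponding strain, leaving the conversion of strain to wall stress to whatever material model is assumed.

```python
def transpulmonary_pressure(p_airway, p_field_of_view):
    """Pt = P1 - P2: airway pressure minus the flushing-fluid pressure
    kept constant in the endoscopic field of view (cmH2O)."""
    return p_airway - p_field_of_view

def strain(diameter, reference_diameter):
    """Engineering strain of the observed alveolar outline relative to the
    diameter at the reference (lowest) pressure step."""
    return (diameter - reference_diameter) / reference_diameter

# Hypothetical CPAP steps (P1), constant field-of-view pressure P2 = 3 cmH2O,
# and measured alveolar diameters in micrometres.
p1_steps = [5, 10, 15]
diameters = [80.0, 86.0, 90.0]
curve = [(transpulmonary_pressure(p1, 3), strain(d, diameters[0]))
         for p1, d in zip(p1_steps, diameters)]
print(curve)  # [(2, 0.0), (7, 0.075), (12, 0.125)]
```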

Figure 1 (abstract P52)


Results Preliminary results were obtained in the bioreactor with a polymer membrane (polydimethylsiloxane) of 100 μm thickness. Graphite particles were used to produce a particle pattern. Deformation due to different Pt was observed and recorded successfully.

Conclusions The in vivo estimation of micromechanical properties such as the stress-strain relationship of elastic walls is feasible. For that, the developed system has to be technically optimized. Reference

1. Stahl CA, et al.: J Biomech 2006, 39:598.

Effects of inhaled iloprost on acute respiratory distress syndrome in prone and supine positions

E Senturk1, N Cakar1, P Ergin Ozcan1, A Basel1, T Sengul1, L Telci1, F Esen1, M Winterhalter2

1University of Istanbul, Turkey; 2Hannover Medical School, Hannover, Germany

Critical Care 2009, 13(Suppl 1):P53 (doi: 10.1186/cc7217)

Introduction In several studies it has been shown that inhaled pulmonary vasodilators (NO and iloprost) can decrease the pulmonary hypertension and also improve the oxygenation during acute respiratory distress syndrome (ARDS) [1]. We investigated the effects of prone and supine positioning on the effects of inhaled iloprost in an animal-ARDS model.

Methods After approval of the animal ethics committee, 10 pigs were anesthetized and intubated. Invasive systemic and pulmonary arterial catheterizations were performed (T1). ARDS was induced in all animals with the infusion of oleic acid (0.15 to 0.30 ml/kg). The study design is shown in Figure 1. Hemodynamic, respiratory and ventilation parameters were measured and arterial and mixed venous blood samples were drawn, and data were recorded at T1 to T6. Pigs were ventilated in volume-controlled ventilation mode with FiO2 100%, with 4 cmH2O positive end-expiratory pressure (PEEP) in the beginning and with 8 cmH2O PEEP after induction of ARDS. Statistical analysis was performed with Student's t test, repeated-measures ANOVA (with Tukey as the post-hoc test) and paired t tests. P <0.05 was considered significant. Results There was no significant difference between the sequences. Iloprost decreased the mean pulmonary arterial pressure significantly in both supine (37 vs. 31 mmHg) and prone (38 vs. 29 mmHg) positions, but there was no significant difference between the two positions. Prone position was associated with an improvement in oxygenation compared with supine position both with and without iloprost application. There was no spillover effect of iloprost. Conclusions Iloprost decreased pulmonary arterial pressures in both positions. On the other hand, the prone position improved oxygenation. The decrease in pulmonary arterial pressures and the improvement in oxygenation were greater in the prone position with iloprost; however, these differences were not statistically significant. Reference

1. Zwissler B, et al.: Inhaled prostacyclin (PGI2) versus inhaled nitric oxide in adult respiratory distress syndrome. Am J Respir Crit Care Med 1996, 154:1671-1677.

Figure 1 (abstract P53) Study design: oleic acid infusion (0.20 ml/kg); iloprost (220 ng/kg); supine (Sup-Ilop) and prone (Pron-Ilop) phases; PEEP increased from 4 cmH2O to 8 cmH2O. Ilop, iloprost; Pron, prone; Sup, supine.

Decreased vascular endothelial growth factor expression in lung tissue during acute respiratory distress syndrome

L Azamfirei, S Gurzu, S Copotoiu, I Jung, R Copotoiu, K Branzaniuc, R Solomon

University of Medicine and Pharmacy, Targu Mures, Romania Critical Care 2009, 13(Suppl 1):P54 (doi: 10.1186/cc7218)

Introduction Endothelial injury is an important prognostic factor in acute respiratory distress syndrome (ARDS) [1,2]. Vascular endothelial growth factor (VEGF) plays a critical role in endothelial destruction and angiogenesis [3]. The expression of VEGF in ARDS varies, depending on epithelial and endothelial damage [4,5]. The objective of this study was to investigate the expression of VEGF in lung tissue from ARDS patients.

Methods Lung specimens were obtained by autopsy from 10 patients with severe ARDS and were compared with a control group of 10 autopsied non-ARDS patients. All lung samples were stained for standard histopathological analysis and for immunohistochemical analysis using a specific mouse monoclonal antibody.

Results Compared with expression in non-ARDS control individuals, pulmonary expression of VEGF was significantly decreased (P <0.001) in ARDS patients. Alveolar macrophages were similarly immunopositive in both groups. No differences were noted with regard to the individual patient's characteristics (age, gender, period of ARDS condition, number of ICU days). Conclusions A decrease in alveolar type II cellularity, due to apoptosis, has been observed during ARDS that may reduce the production of VEGF in the alveolar space and may participate in the decrease in lung perfusion.

Acknowledgement Research Grant No. 136/IDEI from the National Authority of Scientific Research, Romania. References

1. Medford ARL, Millar AB: Vascular endothelial growth factor (VEGF) in acute lung injury (ALI) and acute respiratory distress syndrome (ARDS): paradox or paradigm? Thorax 2006, 61:621.

2. Dvorak HF: Angiogenesis: update 2005. J Thromb Haemost 2005, 3:1835-1842.

3. Gerber HP, Wu X, Yu L, Wiesmann C, et al.: Mice expressing a humanized form of VEGF-A may provide insights into the safety and efficacy of anti-VEGF antibodies. Proc Natl Acad Sci U S A 2007, 104:3478-3483.

4. Fehrenbach H: Development of the pulmonary surfactant system. Pneumologie 2007, 61:488.

5. Shibuya M: Differential roles of vascular endothelial growth factor receptor-1 and receptor-2 in angiogenesis. J Biochem Mol Biol 2006, 39:469-478.

The selective α7 nicotinic acetylcholine receptor agonist GTS-21 attenuates ventilator-induced inflammation and lung injury

M Kox, JC Pompe, M Vaneker, LM Heunks, JG Van der Hoeven, CW Hoedemaekers, P Pickkers

Radboud University Nijmegen Medical Centre, Nijmegen, the Netherlands

Critical Care 2009, 13(Suppl 1):P55 (doi: 10.1186/cc7219)

Introduction Mechanical ventilation (MV) induces an inflammatory response that contributes to lung injury such as in acute lung injury or acute respiratory distress syndrome. The efferent vagus nerve can limit the inflammatory response via the α7 nicotinic acetylcholine receptor (α7nAChR), the so-called cholinergic anti-inflammatory pathway. The aim of this study was to evaluate the effect of the selective α7nAChR agonist GTS-21 on pulmonary and systemic inflammation and lung injury induced by MV using clinically relevant ventilator settings.

Methods C57BL6 mice (n = 40) were intraperitoneally injected with 8 mg/kg GTS-21 or placebo, after which they were mechanically ventilated for 4 hours (tidal volume 8 ml/kg; positive end-expiratory pressure 1.5 cmH2O; FiO2 0.45). Untreated, not mechanically ventilated mice were used as controls. Arterial blood gases were obtained at the end of the experiment and TNFα, IL-6, IL-1α, IL-1β, keratinocyte-derived cytokine (IL-8 homologue) and IL-10 were determined in plasma and lung homogenates. Lung TNFα and IL-10 mRNA expression was measured using quantitative PCR.

Results In GTS-21-treated mice, the alveolar-arterial gradient after MV was significantly reduced compared with placebo (14.0 ± 0.76 vs. 16.2 ± 0.59 kPa; P = 0.03). MV resulted in an increase of all cytokines in plasma and lung compared with control mice. TNFα was significantly lower in plasma of GTS-21-treated animals compared with placebo (196.2 ± 50.8 vs. 331.9 ± 31.9 pg/ml; P = 0.04). Similarly, in lung homogenates a distinct trend was observed towards lower TNFα levels in GTS-21-treated mice (53.9 ± 12.5 vs. 79.1 ± 5.6 pg/mg protein; P = 0.06). IL-10 levels were unaffected by GTS-21. MV strongly increased TNFα mRNA expression in lungs of placebo animals (21-fold compared with controls); this was significantly lower in GTS-21-treated mice (11-fold compared with controls; P = 0.02). IL-10 mRNA expression was similar in GTS-21-treated and placebo animals. Conclusions MV with clinically relevant ventilator settings results in activation of the immune system. GTS-21 inhibits proinflammatory cytokine production while not affecting the anti-inflammatory cytokine IL-10. The reduced alveolar-arterial gradient in GTS-21-treated animals indicates attenuation of lung injury. In conclusion, limiting the inflammatory response appears to reduce lung injury, and therefore the cholinergic anti-inflammatory pathway may represent new treatment options for MV-induced lung injury.

Usefulness of soluble E-selectin in the clinicopathologic assessment of acute lung injury/acute respiratory distress syndrome

Y Kakihana, C Kuroki, H Murayama, T Ohryorji, N Kiyonaga, S Tashiro, T Imabayashi, T Yasuda, Y Kanmura, T Moriyama, A Matsunaga

Kagoshima University Hospital, Kagoshima, Japan Critical Care 2009, 13(Suppl 1):P56 (doi: 10.1186/cc7220)

Introduction A retrospective observational study was conducted to evaluate whether the plasma level of soluble E-selectin [1] might be a specific pathologic marker of acute lung injury/acute respiratory distress syndrome (ALI/ARDS).

Methods The data of 52 critically ill patients admitted to the ICU with systemic inflammatory response syndrome and initiated on mechanical ventilation were retrospectively evaluated. Results The plasma levels of soluble E-selectin determined within 24 hours of admission were significantly correlated with the Sequential Organ Failure Assessment scores determined within 24 hours of admission. Furthermore, the scores for both respiratory failure (evaluated by the PaO2/FiO2 ratio) and liver dysfunction (evaluated by the serum bilirubin value) in the Sequential Organ Failure Assessment scoring system were significantly correlated with plasma levels of soluble E-selectin. In relation to respiratory failure, the plasma level of soluble E-selectin was higher in patients with ALI/ARDS than in those without (Figure 1), and receiver operating characteristic analysis revealed that this parameter might be a specific marker of ALI/ARDS (Figure 2). Conclusions Soluble E-selectin might be a specific and useful marker for the clinicopathologic assessment of ALI/ARDS in critically ill patients with systemic inflammatory response syndrome. However, further investigation is clearly needed to determine whether soluble E-selectin can indeed predict the development of ALI/ARDS. Reference

1. Okajima K, Harada N, Sakurai G, et al.: Rapid assay for plasma soluble E-selectin predicts the development of acute respiratory distress syndrome in patients with systemic inflammatory response syndrome. Transl Res 2006, 148:295-300.

Figure 1 (abstract P56) Soluble E-selectin values in patients with non-ALI/ARDS (n = 21), ALI (n = 13) or ARDS (n = 18). P = 0.0003.

Prognostic value of epithelial neutrophil activating peptide 78 and monokine induced by IFNy in bronchoalveolar lavage fluid in critically ill pediatric patients with acute respiratory distress syndrome

T Shahin, H Ibrahim

Ain Shams University, Cairo, Egypt

Critical Care 2009, 13(Suppl 1):P57 (doi: 10.1186/cc7221)

Introduction Diffuse alveolar damage (DAD) is the histopathological hallmark of acute respiratory distress syndrome (ARDS). DAD is due, in part, to dysregulated angiogenesis, with overexpression of angiogenic and downregulation of angiostatic chemokines. The objective of this study was to measure the levels of epithelial neutrophil activating peptide 78 (ENA-78) as an angiogenic chemokine and monokine induced by IFNγ (MIG) as an angiostatic chemokine in bronchoalveolar lavage fluid of critically ill children with ARDS to determine the prognostic value of the ENA-78/MIG ratio in assessing illness severity and patient outcome. Methods The study included 35 mechanically ventilated pediatric patients with ARDS and 28 mechanically ventilated patients due to nonpulmonary causes as a control group with mean age 16.5 ± 10.2 and 29.1 ± 8.1 months, respectively. They were subjected to fiberoptic bronchoscopic examination and bronchoalveolar lavage was done to estimate the ENA-78 and MIG levels using the ELISA technique.

Results Both ENA-78 and MIG levels were found to be significantly higher among patients versus controls (P <0.01) (390.3 ± 159.7 vs. 80.3 ± 17.6 pg/ml and 235.6 ± 92 vs. 91.0 ± 18.9 pg/ml, respectively). ENA-78 was found to be significantly higher among nonsurvivors, while MIG was significantly lower among nonsurvivors in comparison with the survivors (P <0.01) (455.3 ± 142 vs. 277.8 ± 29.1 pg/ml and 186.3 ± 43 vs. 359.0 ± 56.4 pg/ml, respectively). Applying a receiver operating characteristic curve indicated that the ENA-78/MIG ratio was the best discriminator between survivors and nonsurvivors at a cutoff value of 1.02 with 100% sensitivity and specificity.

Conclusions Imbalance between angiogenic and angiostatic chemokines is the determining factor of ARDS patients' outcome. Future studies to investigate therapeutic modalities to enhance angiostatic factors or inhibit angiogenesis to delay diffuse alveolar damage and improve the outcome are recommended.

Cortisol-binding globulin cleavage at sites of inflammation in critically ill patients

M Williams, A Zhou, C Summers, D Halsall, D Menon

University of Cambridge, UK

Critical Care 2009, 13(Suppl 1):P58 (doi: 10.1186/cc7222)

Introduction In vitro studies have shown that cortisol-binding globulin (CBG) can be cleaved by neutrophil elastase with a resulting reduction in affinity for cortisol [1]. Local elastase production by activated neutrophils may trigger cortisol release by CBG cleavage, providing a potential mechanism for targeted delivery of corticosteroids to inflamed tissues. We looked for evidence of this process at sites of inflammation in critically ill patients. Methods Sera and bronchoalveolar lavage (BAL) fluid were collected from six mechanically ventilated patients with a clinical diagnosis of ventilator-associated pneumonia or acute respiratory distress syndrome (ARDS). These samples, along with sera from six healthy controls, were subjected to gel electrophoresis (SDS-PAGE) and immunoblotting for CBG. Densitometry was used to quantify the proportion of cleaved CBG. Results CBG cleavage product was seen within the BAL but not the sera of patients with acute inflammatory lung disease. Figure 1 shows a typical western blot comparing control sera (C1 and C2), septic sera (S1 and S2) and BAL from the same septic patients (B1 and B2, respectively). The mean percentage of CBG in the cleaved form was 63.3% ± 21.8 in BAL compared with 0.94% ± 1.73 in sera from the same patients and 12.5% ± 11.4 from healthy controls (see Figure 2; all comparisons statistically significant). Conclusions This study provides the first in vivo support for a mechanism of enhanced cortisol delivery to sites of inflammation that involves local CBG cleavage.

Acknowledgement Study supported by the Intensive Care

Society (UK).

Reference

1. Pemberton PA, et al.: Nature 1988, 336:257-258.

Figure 1 (abstract P58) SDS-PAGE and immunoblotting of CBG: molecular weight markers (MW; 75 and 50 kD), pure CBG with and without elastase, control sera (C1, C2), septic sera (S1, S2) and BAL samples (B1, B2). Arrows mark the place of the relevant molecular weight markers.

Figure 2 (abstract P58) Cleaved CBG as a percentage of total CBG in control sera (C), septic sera (S) and BAL. *P <0.05, **P <0.01.

Vascular connexins have differing responses to tumor necrosis factor

Y Ouellette

Mayo Clinic, Rochester, MN, USA

Critical Care 2009, 13(Suppl 1):P59 (doi: 10.1186/cc7223)

Introduction Cell-to-cell communication via gap junctions has been implicated in the control of vascular tone and may be altered in sepsis. Endothelial cells express connexin (Cx) 37, Cx40 and Cx43 and cytokines may modulate their function in sepsis resulting in altered gap junctional intercellular communication. Our hypothesis is that tumor necrosis factor (TNF) will decrease gap-junction-dependent cell-to-cell communication of vascular connexin. Methods Transformed HeLa cells expressing vascular Cx37, Cx40 or Cx43 were used in these experiments. HeLa cells were treated with TNF (20 ng/ml) for up to 2 hours. In dye-transfer experiments, carboxyfluroscein (HeLaCx40 and HeLaCx43) or Alexa Fluor-480 (HeLaCx37) was injected into one cell for 10 seconds and cell transfer was allowed to proceed for 10 minutes and the number of labeled cells counted. Cell lysates were prepared in Triton X-100 lysate buffer and detergent-soluble fractions collected. Cx37, Cx40 and Cx43 were detected by western blot.

Results After 1 hour, TNF treatment resulted in near total loss of dye-coupling in HeLaCx37 and HeLaCx43 (P <0.02, n = 14 to 16) and remained constant up to 2 hours. Dye coupling in HeLaCx40 cells remained unchanged after 1 hour and decreased after 2 hours (P <0.05, n = 10). Western blots indicated that TNF treatment did not affect detergent solubility of Cx40 and Cx43. However, TNF caused a significant increase in detergent solubility of Cx37. Conclusions These results suggest that inflammatory mediators affect connexins differently. The loss of Cx37 function may be due to the loss of detergent resistance, suggesting internalization of Cx37 in response to TNF.

Acknowledgement Transformed HeLa cells were a gift from Dr K. Willecke.

Prehospital intubation for out-of-hospital cardiac arrest

D Young1, RM Lyon1, J Ferris2, DW McKeown1, A Oglesby1

1Royal Infirmary of Edinburgh, UK; 2Ninewells, Dundee, UK Critical Care 2009, 13(Suppl 1):P60 (doi: 10.1186/cc7224)

Introduction The most appropriate advanced airway intervention in out-of-hospital cardiac arrest (OHCA) remains unproven. Trained ambulance personnel may attempt endotracheal intubation in the field for OHCA patients. This study aims to review prehospital airway management in OHCA.

Methods Observational, retrospective case review over a 4-year period. All cases of OHCA brought to the Emergency Department of the Royal Infirmary of Edinburgh, Scotland were identified. Patient demographics, the airway management technique and documented complications were recorded. The primary endpoint measure was survival to hospital admission. Results In total, 794 OHCA cases were identified. The aetiology of cardiac arrest was medical in 95.2%, traumatic in 3.9% and unrecorded in 0.9% of cases. Prehospital endotracheal intubation was attempted in 628 (79%) cases and was successful in 573 (91.2%) cases. A significant complication (multiple attempts, displaced endotracheal tube or oesophageal intubation) occurred in 55 (8.8%) cases. In total, 165 (20.8%) patients survived to hospital admission, of whom 110 (17.5%) had undergone prehospital intubation. Fifty-five (33.1%) patients who did not undergo prehospital intubation survived to hospital admission. Conclusions Prehospital endotracheal intubation for out-of-hospital cardiac arrest is associated with significant complications. For ambulance crews not routinely undertaking endotracheal intubation, a supraglottic airway device may be more appropriate. Reliable methods of confirming endotracheal intubation in the field should be utilised.

Focused echocardiography and capnography during resuscitation from pulseless electrical activity after out-of-hospital cardiac arrest

G Prosen1, S Grmec1, D Kupnik1, M Krizmaric1, J Zavrsnik1, R Gazmuri2

1Centre for Emergency Medicine, Maribor, Slovenia; 2Rosalind

Franklin University, Chicago, IL, USA

Critical Care 2009, 13(Suppl 1):P61 (doi: 10.1186/cc7225)

Introduction The study evaluated the ability of focused echocardiography (F-ECHO) and capnography to differentiate between pulseless electrical activity (PEA) and pseudo-PEA during resuscitation from out-of-hospital cardiac arrest (OHCA). Methods Patients in PEA, with stable values of end-tidal PCO2 (PETCO2) during compression pauses, underwent subxiphoid F-ECHO examinations during pauses for carotid pulse evaluation, assessing for the presence or absence of cardiac kinetic activity (synchronous myocardial wall and valvular movement). Patients with stable PETCO2 during the time of F-ECHO examinations who had cardiac kinetic activity underwent a compression pause of 15 seconds during which an additional 20 IU bolus of vasopressin and a 0.9% NaCl bolus were given. If pulselessness persisted after the 15-second pause, compressions were resumed. This group was designated the F-ECHO group (November 2007 to October 2008, n = 16) and was compared with a NON-ECHO group (November 2005 to October 2007, n = 48) who also had PEA with stable PETCO2 and were managed according to 2005 European Resuscitation Council guidelines without F-ECHO. Results There were no statistically significant differences between groups with regard to sex, suspected cause of arrest, initial cardiac rhythm, witnessed arrest, time elapsed before initiation of cardiopulmonary resuscitation (CPR), and bystander CPR. Primary outcome (ICU admission: 88% vs. 50%; adjusted values: OR = 22.4, 95% CI = 4.2 to 86.9, P <0.001) and secondary outcome (return of spontaneous circulation (carotid pulses palpable): 94% vs. 54%, adjusted: OR = 28.4, 95% CI = 3.9 to 96.1, P <0.001; 24-hour survival: 81% vs. 41%; adjusted: OR = 19.8, 95% CI = 3.1 to 72.6, P <0.001; hospital survival: 56% vs. 15%; adjusted values: OR = 31.4, 95% CI = 2.9 to 85.7, P <0.001; neurological outcome - CPC 1 (good cerebral performance) to CPC 2 (moderate cerebral disability): 50% vs. 8%; adjusted values: OR = 36.4, 95% CI = 4.8 to 115.4, P <0.001) were significantly better in the F-ECHO group. In the NON-ECHO group, significantly higher doses of epinephrine were needed (P = 0.009) and CPR lasted longer (P = 0.003). Conclusions F-ECHO and capnography during PEA in OHCA facilitate return of spontaneous circulation, ICU admission, 24-hour survival, and hospital survival. This effect was attributed to the ability to distinguish between PEA and pseudo-PEA and to institute appropriate treatment during CPR. The confirmation of these results in a large study is warranted.

Patients with an automatic external defibrillator applied by a bystander in a public setting have a strikingly higher frequency of ventricular tachycardia/ventricular fibrillation than observed cardiac arrests in the home

ML Weisfeldt1, C Sitlani2, T Rea2, D Atkins3, T Aufderheide4,

S Brooks5, B Bigham5, C Foerster5, R Gray5, P Moran6,

J Ornato7, J Powell2, L Van Ottingham2, LJ Morrison5

1Johns Hopkins University, Baltimore, MD, USA; 2University of Washington, Seattle, WA, USA; 3University of Iowa, Iowa City, IA, USA; 4Medical College of Wisconsin, Milwaukee, WI, USA;

5St Michael's Hospital, Keenan Research Centre, Li Ka Shing Knowledge Institute, Toronto, ON, Canada; 6Durham Regional Base Hospital, ON, Canada; 7Medical College of Virginia, Richmond, VA, USA

Critical Care 2009, 13(Suppl 1):P62 (doi: 10.1186/cc7226)

Introduction The overall incidence of ventricular tachycardia/ ventricular fibrillation (VT/VF) as the first recorded electrical rhythm in out-of-hospital cardiac arrest has declined from ~70% to ~25% over the past 30 years. This change has been attributed to primary and secondary prevention of cardiovascular disease and VT/VF. We evaluated whether the incidence of VT/VF as first recorded rhythm differed by location among bystander-applied automatic external defibrillator (AED) patients and emergency medical services (EMS)-witnessed cardiac arrests.

Methods A prospective cohort study of nontraumatic cardiac arrest from December 2005 to April 2007 in the Resuscitation Outcomes Consortium database from 10 US and Canadian sites. The incidence of an initial shockable rhythm on AED or documented VT/VF was compared among bystander-applied AED patients and EMS-witnessed arrests in public versus private settings. Results The first rhythm was known in 13,235 out of 14,059 (94%) adult EMS-treated cardiac arrests. Of the 13,235 with known rhythms, 3,436 (26%) had VT/VF. Among 1,115 EMS-witnessed arrests, 61/161 (38%) had VT/VF in public settings and 224/954 (23%) in private settings. Similarly, for bystander AED applied in the private setting, 39/114 (34%) were shocked. But, in contrast, 125/159 (79%) (P <0.001 vs. all other) were shocked by the AED in the public setting. Witnessed arrests in both the private setting (vs. public) and in EMS-witnessed cases (vs. bystander AED applied) were more likely to occur in older subjects and females. After adjusting for age and gender via logistic regression models, a significant difference in the odds of having a shockable rhythm in a public versus private location of arrest remained in EMS-witnessed arrests (P <0.005). The difference also remained in bystander AED-applied arrests (P <0.001) after adjusting for age, gender, and bystander-witnessed status.

Conclusions The incidence of VT/VF is far greater in the public setting particularly for bystander-witnessed AED-applied arrests. Patients in the private home setting, even for EMS-observed arrests, are far less likely to benefit from AED application than bystander-witnessed patients in the public setting. Cardiopulmonary resuscitation strategies may need to be tailored by arrest location.

Survival unchanged 5 months after implementing the 2005 American Heart Association cardiopulmonary resuscitation and emergency cardiac care guidelines for out-of-hospital cardiac arrest

B Bigham1, K Koprowicz2, A Kiss1, P Dorian1, S Emerson2, C Zhan1, T Rea2, TP Aufderheide3, J Powell2, S Cheskes1, D Davis4, J Stouffer5, J Perry6, LJ Morrison7

1University of Toronto, ON, Canada; 2University of Washington, Seattle, WA, USA; 3Medical College of Wisconsin, Milwaukee, WI, USA; 4University of California San Diego, CA, USA; 5Gresham Fire & Emergency Services, Gresham, OR, USA; 6University of Ottawa, ON, Canada; 7St Michael's Hospital, Keenan Research Centre, Li Ka Shing Knowledge Institute, Toronto, ON, Canada Critical Care 2009, 13(Suppl 1):P63 (doi: 10.1186/cc7227)

Introduction To improve survival from out-of-hospital cardiac arrest, the American Heart Association released guidelines in 2005. We examined the effect of these guidelines on survival in the Resuscitation Outcomes Consortium (ROC) Epistry - Cardiac Arrest. We hypothesized that survival would increase after guideline implementation.

Methods One hundred and seventy-four emergency medical service (EMS) agencies from eight out of 10 ROC sites were surveyed to determine the 2005 American Heart Association guideline implementation, or crossover, date. Two sites with 2005 compatible treatment algorithms prior to guideline release were not included. Patients with out-of-hospital cardiac arrest secondary to a noncardiac cause, EMS-witnessed events, patients <18 years old, and patients with do-not-resuscitate orders were excluded. A linear mixed-effects model was applied for survival controlling for time and agency. The crossover date was added to the model to determine the effect of the 2005 guidelines. Results Of 174 agencies, 83 contributed cases to both cohorts during the 18-month period between 1 December 2005 and 31 May 2007. Of 7,403 cases, 4,897 occurred during the 13-month (median) interval before crossover and 2,506 occurred in the 5-month (median) interval after crossover. The overall survival rate was 5.9%. Our model estimated an overall increase in survival over time (monthly OR = 1.02, 95% CI = 0.99 to 1.04, P = 0.23), a decrease in survival at crossover (OR = 0.92, 95% CI = 0.66 to 1.26, P = 0.59), and a further increase in survival over time after crossover (monthly OR = 1.005, 95% CI = 0.96 to 1.05, P = 0.84).

Conclusions The present study found a trend towards increased survival over time and no statistically significant effect of the 2005 guidelines early after implementation. This observed increase in survival over time may be attributed to the Hawthorne effect or participation in the ROC or improved quality assurance. A delay in knowledge and skill acquisition amongst EMS providers and the need to rechoreograph their cardiac arrest treatment may explain why no significant increase in survival was observed after implementation. EMS providers may require more time to gain proficiency in the guideline changes before the full potential of the guidelines can be realized. Further longitudinal study is needed to determine the full impact of the guidelines on survival.

Outcomes following admission to ICU post cardiac arrest E Casey, B Marsh

Mater Misericordiae University Hospital, Dublin, Ireland Critical Care 2009, 13(Suppl 1):P64 (doi: 10.1186/cc7228)

Introduction ICU admission post cardiac arrest accounts for 6% of admissions to the ICU [1]. ICU survival post cardiac arrest ranges from 25% to 35% [2]. We reviewed the records of both out-of-hospital and inhospital cardiac arrest admissions to our ICU to audit their outcomes, the primary outcome variable being survival to ICU and hospital discharge. Secondary objectives were to determine the length of stay in the ICU and hospital of both survivors and nonsurvivors.

Methods We performed a retrospective review of all admissions to our ICU post cardiac arrest between January 2003 and December 2006. Our data were sourced from the ICU access database, ICU discharge summary and individual chart review. We recorded demographics and data regarding each arrest. Results One hundred and forty-seven patients were admitted to our ICU during the 4-year period. The mean age was 59 years, ranging from 16 to 88 years. Out-of-hospital cardiac arrest accounted for 51% (n = 75) of cases, inhospital cardiac arrest for 49% (n = 72). Asystole was the first identifiable rhythm in 39%, of which 21% survived to hospital discharge, 42% of whom had a poor neurological outcome. Ventricular fibrillation/ventricular tachycardia accounted for 32% of cases, of which 39% survived, all of whom had a good neurological outcome. Pulseless electrical activity accounted for 29% of cases, of which 25% survived to hospital discharge, 10% of whom had a poor neurological outcome. Overall survival was 27%, of which 15% had a poor neurological outcome. The mean ICU length of stay was 9.2 days for survivors and 6.8 days for nonsurvivors.

Conclusions The high prevalence of asystole in both groups is not in keeping with previous audit series [3] in which ventricular fibrillation/ventricular tachycardia is the predominant arrest rhythm and may reflect a delayed response time. Our survival figures are comparable with international data [3], which are limited. The higher male to female ratio is consistent with previous audit series [4], possibly reflecting the higher incidence of ischaemic heart disease in males. References

1. Nolan JP, et al.: A secondary analysis of the ICNARC case mix programme database. J Intensive Care Soc 2007, 8:38.

2. Denton R, Thomas AN: Cardiopulmonary resuscitation: a retrospective review. Anaesthesia 1997, 52:324-327.

3. Laver S, Farrow C, Turner D, Nolan J: Mode of death after admission to our intensive care unit following cardiac arrest. Intensive Care Med 2004, 30:2126-2128.

4. Roberts H, Smithies M: Outcomes after ICU admission following out of hospital cardiac arrest in a UK teaching hospital. Crit Care 2008, 12(Suppl 2):P366.

Clinical outcome in patients who experienced inhospital cardiac arrest by underlying disease

Y Shin, C Lim, Y Koh, S Hong

Asan Medical Center, Seoul, Republic of Korea

Critical Care 2009, 13(Suppl 1):P65 (doi: 10.1186/cc7229)

Introduction Some clinical diagnoses such as sepsis, renal failure, metastatic cancer, house-bound lifestyle, and stroke are associated with worse prognosis after inhospital cardiac arrest (IHCA) [1-3]. There are, however, few reports on chronic liver disease or on Asian populations, and the incidence of IHCA is rarely reported in the literature. The aim of this study was to evaluate the hospital mortality of adult patients who had experienced IHCA by underlying disease, and the incidence of IHCA. Methods Between March and October 2008, 69 patients who experienced IHCA were prospectively enrolled in the study. There were 64,345 total admissions to the hospital in this period. Patients who had cardiac arrests in the ICU, emergency room, and operating room were excluded from this study. Hospital mortality was compared between group A (chronic liver disease, 10 patients; cancer, 24 patients) and group B (chronic lung disease, heart failure, chronic renal disease and diabetes mellitus).

Results The incidence of IHCA was 1.07 events per 1,000 hospital admissions. Of the 69 enrolled patients, 34 were assigned to group A and 35 to group B. The mean patient age of group B was higher than that of group A (66.7 vs. 56.5 years, P = 0.003). There was no difference in return of spontaneous circulation (ROSC) sustained for more than 20 minutes between the two groups (group A, 53% vs. group B, 63%, P = 0.40). Hospital mortality by underlying disease was: chronic liver disease, 90%; cancer, 88%; chronic lung disease, 20%; and heart failure, 54%. The hospital mortality was higher in group A than in group B (88.2% vs. 45.7%, P <0.001). The hospital mortality was higher during the night compared with during the day (80% vs. 50%, P = 0.01) and ROSC was higher during the day than during the night (80% vs. 41%, P = 0.001).

Conclusions In this study, the chronic liver disease and cancer group had poor prognosis compared with the other underlying diseases such as chronic lung disease or heart failure. References

1. Sandroni C, et al.: Intensive Care Med 2007, 33:237-245.

2. Peberdy MA, et al.: Resuscitation 2003, 58:297-308.

3. Cooper S, et al.: Resuscitation 1997, 35:17-22.

Good outcome in octogenarians after ventricular fibrillation out-of-hospital cardiac arrest

M Busch, E Soreide

Stavanger University Hospital, Stavanger, Norway Critical Care 2009, 13(Suppl 1):P66 (doi: 10.1186/cc7230)

Introduction Mild induced hypothermia (MIH) in comatose survivors of out-of-hospital cardiac arrest (OHCA) is highly recommended [1,2]. Still, there is great uncertainty when it comes to age limits for the therapy as the initial randomised studies excluded the majority of patients over 75 years of age. Methods We retrospectively studied 115 OHCA survivors that were treated with MIH in our ICU from 2002 to 2008 with regard to cerebral performance category (CPC) at hospital discharge. Inclusion criteria were the same as in the Hypothermia after Cardiac Arrest Study Group except for the age limits. Bad outcome was defined as severe disability (CPC3), vegetative state (CPC4) and death (CPC5).

Results Bad outcome was significantly more frequent in patients older than 60 years (chi-square P = 0.003), but even in patients older than 80 years we had good outcome in more than 50% of cases. The neurological outcome according to the different age groups is displayed in Figure 1.

Conclusions Although age seems to influence outcome, we found a surprisingly high incidence of good outcome even in the oldest comatose survivors after ventricular fibrillation OHCA treated with MIH. Hence, our data do not support a limitation of neurointensive care based on age alone.

Figure 1 (abstract P66)

Outcome according to age group after ventricular fibrillation OHCA and MIH (n = 115).


References

1. Bernard SA, Gray TW, Buist MD, Jones BM, Silvester W, Gutteridge G, et al.: Treatment of comatose survivors of out-of-hospital cardiac arrest with induced hypothermia. N Engl J Med 2002, 346:557-563.

2. Hypothermia after Cardiac Arrest Study Group: Mild therapeutic hypothermia to improve the neurologic outcome after cardiac arrest. N Engl J Med 2002, 346:549-556.

What is full recovery? Reconsidering outcome of cardiopulmonary cerebral resuscitation beyond Glasgow-Pittsburgh Cerebral Performance Categories 1

T Abe1, S Izumitani1, T Sano1, Y Nagemine1, S Ishimatsu1, Y Tokuda2

1St Luke's International Hospital, Tokyo, Japan; 2St Luke's Life Science Institute, Tokyo, Japan

Critical Care 2009, 13(Suppl 1):P67 (doi: 10.1186/cc7231)

Introduction Cerebral performance of survivors from cardiopulmonary arrest is usually based on the Glasgow-Pittsburgh Cerebral Performance Categories (GP-CPC), and favorable neurological outcomes are GP-CPC 1 (good cerebral performance) and GP-CPC 2 (moderate cerebral disability). In particular, GP-CPC 1 is called full recovery. However, do patients categorized as GP-CPC 1 function as they did before? We evaluated higher brain function among patients classified as GP-CPC 1 and analyzed the factors influencing it.

Methods A retrospective, observational, cohort study was conducted on consecutive patients (age >18 years) who were survivors from cardiopulmonary arrest hospitalized through the Emergency Department between 1 October 2006 and 31 March 2008 in an urban teaching hospital of Japan. There were 428 patients with cardiopulmonary arrest (CPA), and 136 patients (32%) were admitted to our hospital. Fifteen patients were categorized as GP-CPC 1 and their higher brain functions were evaluated by physical therapists 1 month after CPA. Data were based on the Utstein style and collected on age, gender, location of cardiac arrest, witnessed status, bystander cardiopulmonary resuscitation (CPR), defibrillator and cardioversion (DC) or automated external defibrillator (AED) use, return of spontaneous circulation prior to Emergency Department arrival, estimated time of cardiac arrest, cardiac cause, and therapeutic mild hypothermia. Results Among 15 eligible patients, eight cases without higher brain dysfunction and seven cases with higher brain dysfunction were analyzed. In patients without higher brain dysfunction: bystander CPR, eight cases (100%); witnessed, eight cases (100%); and DC (AED) use, eight cases (100%). In patients with higher brain dysfunction: bystander CPR, three cases (43%); witnessed, five cases (71%); and AED (DC) use, six cases (86%). Patients without higher brain dysfunction were more likely to have received bystander CPR than those with higher brain dysfunction (Fisher's exact test P = 0.026).

Conclusions One-half of patients who were generally called full recovery could not work as they did before because they had higher brain dysfunction after resuscitation. It is important to analyze factors that influence the recovery to higher brain function in patients with GP-CPC 1 to improve the outcome of cardiopulmonary cerebral resuscitation.

Intracranial pressure monitoring during hypothermia after cardiopulmonary resuscitation

E Isotani, Y Otomo, K Ohno

Tokyo Medical and Dental University, Tokyo, Japan Critical Care 2009, 13(Suppl 1):P68 (doi: 10.1186/cc7232)

Introduction Two randomized clinical trials showed that induced hypothermia improved outcomes in adults with coma after resuscitation from ventricular fibrillation [1,2]. In this study, we present the usefulness of intracranial pressure (ICP) monitoring to predict the patient's outcome after induced hypothermia. Methods Hypothermia (34°C, 48 hours) was induced in patients after the recovery of spontaneous circulation. The indication was as follows: motor response of the Glasgow coma scale <5, at least one of the brain stem reflexes intact and age 15 to 80 years. ICP and oxygen saturation of the jugular vein (SjO2) monitoring were performed during hypothermia.

Figure 1 (abstract P68)

ICP monitoring during hypothermia.

Figure 2 (abstract P68)

SjO2 during hypothermia monitoring (mRS 0-2 vs. mRS 3-5; temperature axis 32 to 38°C).

Results Hypothermia was induced in 23 patients (55%) after resuscitation. The outcome of nine patients (39%) was modified Rankin Scale 0 to 2, 11 patients (48%) were 3 to 5 and three patients (13%) were dead. ICP during hypothermia did not increase beyond 10 mmHg in neurologically good outcome patients (Figure 1). SjO2 tended to be <80% in neurologically good outcome patients (Figure 2).

Conclusions ICP monitoring during hypothermia was extremely useful to predict the patient's outcome. SjO2 monitoring during hypothermia was also predictive of the patient's outcome. References

1. Bernard SA, et al.: N Engl J Med 2002, 346:557-563.

2. Dixon SR, et al.: J Am Coll Cardiol 2002, 40:1928-1934.

Serum neuron-specific enolase as early predictor of outcome after inhospital cardiac arrest

G Abdel Naby

Alex Faculty of Medicine, Alex, Egypt

Critical Care 2009, 13(Suppl 1):P69 (doi: 10.1186/cc7233)

Introduction Cardiac arrest is a medical emergency that, in certain groups of patients, is potentially reversible if treated early enough. Outcome after cardiac arrest is mostly determined by the degree of hypoxic brain damage. Patients recovering from cardiopulmonary resuscitation are at great risk of subsequent death or severe neurological damage, including a persistent vegetative state. Early definition of the prognosis for these patients has ethical and economic implications; it is estimated that the cost of caring for severely brain-damaged survivors runs into billions of dollars each year. Methods The main purpose of this study was to investigate the value of serum neuron-specific enolase (NSE) in predicting outcomes in patients early after inhospital cardiac arrest. The study was carried out on 30 patients who had an inhospital cardiac arrest during their admission to the critical care unit, for whom cardiopulmonary resuscitation was performed according to the protocol of the European Resuscitation Guidelines, who survived for at least 12 hours after the event and for whom informed consent was obtained. We excluded patients with neoplastic diseases known to increase NSE levels (small cell lung cancer, melanoma, and malignancy of the kidney and testicles), stroke (ischemic and/or hemorrhagic) or traumatic brain injury.

Results The NSE level at 24 hours in patients with good recovery ranged between 0.80 ng/ml and 10.20 ng/ml with a mean of 5.60 ± 3.26 ng/ml, and in those with severe disability it ranged between 18.30 ng/ml and 20.50 ng/ml with a mean of 19.40 ± 1.56 ng/ml. Vegetative patients had a higher NSE level than the others, with a mean of 64.38 ± 26.16 ng/ml, and patients who died had a mean NSE level of 22.61 ± 9.93 ng/ml. The difference was statistically significant, showing a higher NSE level in patients with a more severe hypoxic brain insult after arrest who died or remained in a persistent vegetative state (P <0.001). Conclusions The present study demonstrated that the NSE value at 24 hours after return of spontaneous circulation was one of the best predictors of neurological outcome: NSE levels tended to increase in patients with a bad neurological outcome and to decrease in those with a good neurological outcome. If NSE concentrations increase by >15 ng/ml, the prognosis tends to be bad.

Controlled reperfusion prevents neurologic injury after global brain ischemia in a novel ischemic brain model

Y Ko, B Allen, Z Tan, S Sakhai, G Buckberg

David Geffen School of Medicine at UCLA, Los Angeles, CA, USA Critical Care 2009, 13(Suppl 1):P70 (doi: 10.1186/cc7234)

Introduction Recent investigations have revealed that the state of post-ischemic brain recirculation is of major importance in recovery from sudden death, and we developed an isolated global ischemic brain model that excludes the confounding variables of bypass, donor blood, and whole body damage to investigate the strategy of controlled reperfusion. This study examines primary damage to cerebral function and tissue with uncontrolled brain reperfusion following 30 minutes of brain ischemia, and tests whether controlled brain reperfusion can attenuate the damage. Methods Sixteen pigs underwent 30 minutes of global brain ischemia by clamping the major neck vessels via a small suprasternal incision. Seven pigs then received uncontrolled reperfusion with normal blood, while the other nine pigs received controlled reperfusion by infusing a modified (leukodepleted, hypocalcemic, hyperosmolar, alkalotic, normoglycemic, antioxidant-enriched) warm blood solution into both carotid arteries for 20 minutes. Six pigs underwent a sham operation. Brain oxygen uptake and venous conjugated dienes (CD) were measured during reperfusion. The neurologic deficit score (NDS) (0 = normal, 500 = brain death) was determined 24 hours after ischemia, and brain water content and cerebral infarction by 2,3,5-triphenyl tetrazolium chloride staining were assessed post mortem. Results Sham pigs were neurologically normal at 24 hours. Uncontrolled reperfusion resulted in two early deaths with brain herniation, multiple seizures, low brain oxygen uptake*, high CD levels (1.64 ± 0.03 A233 nm)*, and high NDS (244 ± 19 in survivors)*, indicating marked functional neurologic impairment. Postmortem analysis showed marked brain edema (84.3 ± 0.6%)* and extensive brain infarcts*. In contrast, pigs receiving controlled reperfusion had high brain oxygen uptake, low CD levels (1.3 ± 0.07 A233 nm), no seizures, and low NDS (41 ± 11), with all pigs being able to sit and eat, and three showing complete recovery. Brain edema was reduced (81.8 ± 0.5%), and 2,3,5-triphenyl tetrazolium chloride staining showed no infarction. *P <0.05 uncontrolled vs. controlled, mean ± SEM. Conclusions Our novel ischemic brain model provides an effective tool to study brain ischemia. More importantly, these data indicate that controlled reperfusion attenuates reperfusion injury in the brain, as has been shown when it is applied in other organs, and introduces the potential of using controlled reperfusion as a new treatment for sudden death and stroke. Reference

1. Schaller B, et al.: J Cereb Blood Flow Metab 2004, 24:351-371.

Effectiveness of an underbody forced warm-air blanket in preventing postoperative hypothermia after coronary artery bypass graft surgery with normothermic cardiopulmonary bypass

JE Teodorczyk, JH Heijmans, WN Van Mook, DC Bergmans, PM Roekaerts

Maastricht University Medical Centre, Maastricht, the Netherlands Critical Care 2009, 13(Suppl 1):P71 (doi: 10.1186/cc7235)

Introduction Perioperative hypothermia in coronary artery bypass graft (CABG) is associated with adverse outcomes [1,2]. An underbody forced-air warming blanket was developed for use in cardiac surgery. The primary aim of this investigation was to study whether this blanket could prevent postoperative hypothermia in routine CABG.

Methods Sixty low-risk patients who underwent elective CABG were assigned to an intervention group that received the full underbody forced warm-air system (n = 30) or a control group that received standard thermal care (n = 30). Routine heat-conservation methods were applied in both groups, including draping of the patient, fluid warming and normothermic cardiopulmonary bypass (CPB) at a core temperature of ~36.5°C. The forced warm-air system was set at 43°C from the end of perfusion until departure from the operating room (OR). Bladder temperature was measured at: T1, end of perfusion; T2, departure from the OR; T3, arrival in the ICU; T4, 1 hour after arrival in the ICU; and T5, 3 hours after arrival in the ICU.

Results The number of patients arriving in the ICU with a bladder temperature >36°C was significantly higher in the intervention group than in the control group: 27 patients (90%) vs. 14 patients (46.7%), respectively (P <0.001). Initial temperatures (mean ± SD) at T1 were similar in both groups: 36.7°C ± 0.3°C vs. 36.5°C ± 0.2°C, respectively (P = 0.091). At time points T2, T3 and T4, the core temperature was significantly lower in the control group than in the intervention group: T2, 36.0°C ± 0.3°C vs. 36.5°C ± 0.3°C (P <0.001); T3, 35.9°C ± 0.4°C vs. 36.2°C ± 0.3°C (P <0.001); and T4, 36.0°C ± 0.6°C vs. 36.4°C ± 0.5°C (P = 0.026). At T5, 3 hours after arrival in the ICU, both groups had similar bladder temperatures (37.3°C ± 0.6°C vs. 37.2°C ± 0.7°C; P = 0.568). The temperature drop from the end of CPB to arrival in the ICU was significantly smaller in the intervention group than in the control group (0.4°C ± 0.3°C vs. 0.6°C ± 0.4°C; P = 0.027).

Conclusions The present study shows that additional warmth management with a full underbody forced warm-air system, applied in the OR to patients undergoing normothermic CABG, prevents hypothermia with its deleterious effects in the early postoperative phase. References

1. Insler SR, et al.: Anesth Analg 2008, 106:746-750.
2. Sessler DI, et al.: Anesthesiology 2001, 95:531-543.

Good outcome in noncoronary out-of-hospital cardiac arrest treated with mild induced hypothermia

M Busch

Stavanger University Hospital, Stavanger, Norway Critical Care 2009, 13(Suppl 1):P72 (doi: 10.1186/cc7236)

Introduction Mild induced hypothermia (MIH) has become a standard of care in comatose survivors of out-of-hospital cardiac arrest (OHCA) [1,2]. Even though the initial randomised trials excluded victims with noncoronary causes of OHCA, MIH may still be useful to attenuate ischemic brain damage in this group of patients. Methods We retrospectively studied 172 coronary and 32 noncoronary OHCA survivors who were treated with MIH in our ICU from 2002 to 2008 with regard to cerebral performance category (CPC) at hospital discharge. Bad outcome was defined as severe disability (CPC 3), vegetative state (CPC 4) and death (CPC 5).

Results Bad outcome was significantly more frequent in patients with noncoronary cause of OHCA (chi-square P <0.0001). The subgroups of noncoronary cardiac arrest differed substantially with regard to outcome. The outcome after coronary and noncoronary OHCA treated with MIH is displayed in Figure 1. Conclusions No randomised controlled clinical trial supports the use of MIH in noncoronary OHCA. Although noncoronary OHCA seems to influence outcome negatively, further studies are warranted to examine the potential benefit of MIH in this category of patients. References

1. Bernard SA, Gray TW, Buist MD, Jones BM, Silvester W, Gut-teridge G, et al.: Treatment of comatose survivors of out-of-hospital cardiac arrest with induced hypothermia. N Engl J Med 2002, 346:557-563.

2. Hypothermia after Cardiac Arrest Study Group: Mild therapeutic hypothermia to improve the neurologic outcome after cardiac arrest. N Engl J Med 2002, 346:549-556.

Figure 1 (abstract P72)

Nasopharyngeal cooling during resuscitation: randomized study

F Taccone1, F Eichwede2, D Desruelles3, D De Longueville4, HJ Busch5, D Barbut6

1Erasme Hospital, Brussels, Belgium; 2Medizinisches Zentrum Kreis Aachen, Würselen, Germany; 3UZ Gasthuisberg Leuven, Belgium; 4CHU Saint Pierre, Brussels, Belgium; 5Universitätsklinikum Freiburg, Germany; 6BeneChill, San Diego, CA, USA

Critical Care 2009, 13(Suppl 1):P73 (doi: 10.1186/cc7237)

Introduction Nasopharyngeal cooling during cardiopulmonary resuscitation has been shown to ease the resuscitation effort and to improve the resuscitation rate, survival and neurologic outcome in porcine models of both prolonged ventricular fibrillation and pulseless electrical activity arrest. The aim of this study was to determine whether nasopharyngeal cooling initiated during resuscitation improves the resuscitation rate (return of spontaneous circulation (ROSC)), survival and neurologic outcome. Methods The study is ongoing. Cooling was performed using a novel device (RhinoChill; BeneChill, Inc., San Diego, CA, USA) that sprays a volatile coolant into the nasal cavity. Patients were randomized to nasopharyngeal cooling during resuscitation or no cooling in the field, followed by cooling for all patients in hospital. All patients with a witnessed arrest and a downtime of less than 20 minutes deemed eligible for resuscitation were included. Nasopharyngeal cooling was initiated either before or after defibrillation and was continued until systemic cooling could be initiated. Patients who had already achieved ROSC were excluded. Resuscitation was continued until ROSC was achieved or for 30 minutes. Results Five patients were randomized to treatment and six were controls. ROSC was achieved in five out of five (100%) treated patients but in only three out of six (50%) controls. All five (100%) treated patients survived to hospital admission as compared with one out of six (16.7%) controls. At 24 hours, three out of five (60%) treated patients were alive as compared with none of the controls. The first treated patient who completed the 1-week evaluation was neurologically intact.

Conclusions Nasopharyngeal cooling initiated during resuscitation may improve the ROSC rate and survival to 24 hours. The impact of this treatment on long-term survival and neurologic outcome remains to be determined.

Introduction of an external noninvasive cooling device for effective implementation of Intensive Care Society standards post cardiac arrest

J McGrath, K Williams, D Howell, J Down

University College Hospital London, UK

Critical Care 2009, 13(Suppl 1):P74 (doi: 10.1186/cc7238)

Introduction Despite cardiac arrests accounting for 5.8% of admissions to intensive care, there is significant variation in the management and outcome of these patients in different units [1]. Randomised controlled trials have demonstrated that active cooling to 32 to 34°C for 12 to 24 hours after return of spontaneous circulation (ROSC) significantly improves the outcome of patients who have an out-of-hospital ventricular fibrillation arrest [2]. However, a range of cooling techniques is employed, and no trials have demonstrated that any particular system is superior [3].

Methods All adult patients admitted to critical care following out-of-hospital cardiac arrest with Glasgow coma scale <9 after ROSC were included. The initial audit included nine patients admitted between August 2006 and March 2007 who were cooled primarily with standard methods such as ice packs/cooling pads. Performance was re-audited between August 2007 and December 2007 following the introduction of CritiCool, and included nine patients. Data were collected using an audit form, a computerised ICU patient database (CIMS) and clinical notes. We assessed neurological outcome at ICU discharge by calculating the cerebral performance category (CPC). The CPC is scored on a scale (0 to 5) where higher scores indicate worse functional impairment.

Results Following introduction of the CritiCool there was improvement in the speed of cooling to the target temperature range (patients effectively cooled within 4 hours: 33% vs. 57%). There was also improved ability to maintain patients within the target temperature range over the 24-hour period (57% vs. 70%). There was also a trend towards improvement in the mean CPC score at ICU discharge from 3.4 to 2.3.

Conclusions We have shown an improvement in the speed of cooling and target temperature maintenance. Our study also shows that the introduction of CritiCool correlated with an improvement in CPC. We suggest that more widespread use of noninvasive cooling devices may improve implementation of standards, avoid risks associated with invasive cooling devices and potentially improve neurological outcome. References

1. Nolan JP: Intensive care society guidelines (draft); 2008.

2. Nolan JP, Morley PT, et al.: Therapeutic hypothermia after cardiac arrest: an advisory statement by the advanced life support task force of the International Liaison Committee on Resuscitation. Circulation 2003, 108:118-121.

3. Hay AW, Wann GS, Bell K, et al.: Therapeutic hypothermia in comatose patients after out-of-hospital cardiac arrest. Anaesthesia 2008, 63:15-19.

Therapeutic hypothermia for postcardiac arrest patients: physicians are warming up to the idea

B Bigham1, K Dainty2, D Scales3, LJ Morrison4, S Brooks4

1Institute of Medical Sciences, University of Toronto, ON, Canada;

2Centre for Health Services Sciences, Sunnybrook Health Sciences Centre, Toronto, ON, Canada; 3Interdepartmental Division of Critical Care, University of Toronto, ON, Canada; 4St Michael's Hospital, Keenan Research Centre, Li Ka Shing Knowledge Institute, Toronto, ON, Canada Critical Care 2009, 13(Suppl 1):P75 (doi: 10.1186/cc7239)

Introduction Therapeutic hypothermia (TH) improves survival and neurologic recovery in resuscitated cardiac arrest patients. However, three published surveys with low response rates (<20%) reported most physicians have never used TH. We sought to evaluate current physician use, and barriers to use, of TH in Canada.

Methods We developed a web-based questionnaire asking physicians to self-report their experience with TH using the Pathman framework of changing physician behavior. We surveyed all members of the Canadian Association of Emergency Physicians and the Canadian Critical Care Forum using the Dillman survey method between 19 March and 21 May 2008. Adjusted odds ratios were generated from a multiple logistic regression model that included all reported predictor variables.

Results We surveyed 1,266 physicians, and 37% responded. Most (78%) respondents were emergency physicians, 54% worked at academic/tertiary-care hospitals, and 62% treated more than 10 arrests annually. Almost all respondents were aware of TH (99%) and agreed that TH was beneficial (91%), but only two-thirds (68%) had used TH in clinical practice. Only one-half (50%) reported using standardized cooling protocols. Critical care physicians were more likely to use TH than emergency physicians (93% vs. 61%, OR = 6.3, 95% CI = 2.5 to 16.0) and physicians who worked at a facility with a cooling protocol were more likely to use TH than their colleagues at other facilities (86% vs. 43%, OR= 5.6, 95% CI = 3.1 to 10.0). Physicians <10 years post residency were more likely to have used TH (73% vs. 63%, OR = 2.0, 95% CI = 1.2 to 3.3) as were physicians who treated >10 cardiac arrests annually (78% vs. 53%, OR = 2.6, 95% CI = 1.6 to 4.1). The use of TH was similar comparing academic/tertiary-care hospitals with community hospitals (OR = 1.6, 95% CI = 0.92 to 2.88). Common barriers reported by physicians included: lack of awareness (31%), perceptions of futility or poor prognosis (25%), too much work required to cool (20%) and staffing shortages (20%).
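For illustration only, the minimal Python sketch below computes a crude odds ratio with a Woolf 95% confidence interval from a 2 x 2 table; the counts are hypothetical values chosen to roughly mirror the reported proportions, not the survey data, and the adjusted odds ratios above additionally require a multiple logistic regression over all predictors.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Woolf 95% CI for a 2x2 table:
       a = exposed & outcome,   b = exposed & no outcome,
       c = unexposed & outcome, d = unexposed & no outcome."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Hypothetical counts only: 40 of 43 critical care physicians vs.
# 200 of 330 emergency physicians reporting TH use (~93% vs. ~61%).
print(odds_ratio_ci(a=40, b=3, c=200, d=130))
```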

Conclusions Self-reported adoption of TH is higher than previously reported at 68% among Canadian emergency and critical care physicians. Adoption is more likely by critical care physicians, those with protocols, and those treating >10 arrests annually. Several barriers were reported; overcoming these may improve the adoption of TH.

Comparison between clinical tests and the Cerebral State Index for brain death determination

M Mahjoubifard, S Borjian Boroojeny, F Nikbakht

Zahedan University of Medical Sciences, Zahedan, Iran Critical Care 2009, 13(Suppl 1):P76 (doi: 10.1186/cc7240)

Introduction Diagnosis of brain death is very important. Confirmatory tests have been used to corroborate brain death, but they are expensive, not available in all ICUs and in some instances have no correlation with clinical tests [1]. The Cerebral State Index (CSI) (Danmeter, Odense, Denmark) is a portable apparatus designed to determine the depth of anesthesia based on brain wave recording and analysis [2]. Methods This is a study of 65 head-injured patients and 72 alert head-injured patients. Eighteen to 24 hours after confirmation of brain death by clinical tests applied by a neurologist, the CSI was recorded according to the manufacturer's instructions. The CSI was recorded if the electromyography was zero on the screen and the quality index was above 70%.

Results Using a CSI score <3 to detect brain death and a CSI score >3 to identify non-brain-dead patients, the CSI distinguished 100% of brain-dead patients from non-brain-dead patients. Furthermore, using burst suppression (BS%) >75% to detect brain death and BS% between 0% and 75% to identify non-brain-dead patients, the CSI again distinguished 100% of brain-dead patients from non-brain-dead patients. Conclusions MRI, CT and electroencephalography are time consuming and expensive, and require specialized personnel. In a previous study the Bispectral index scale (BIS) was used for confirmation of brain death [3], but it was not successful because the range of EEG filtration of the BIS is wider than that of the CSI and the calculation methods are different; moreover, BS% was not considered in addition to the BIS. In the present study all of the clinically diagnosed brain-dead patients had CSI = 3 and BS% = 75%, and all of the clinically non-brain-dead patients had CSI >3 and BS% <75%, so the sensitivity and specificity of the CSI for the detection of brain death were 100%.
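A minimal Python sketch of how the 100% sensitivity and specificity follow from the threshold-based classification; it assumes, for illustration, that the 65 head-injured patients constitute the brain-dead group and that every patient was classified exactly as reported.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Assumed counts: all 65 brain-dead patients had CSI <3 (true positives)
# and all 72 alert patients had CSI >3 (true negatives).
sens, spec = sensitivity_specificity(tp=65, fn=0, tn=72, fp=0)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```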

References

1. Hammer MD, Crippen D: Brain death and withdrawal of support. Surg Clin N Am 2006, 86:1541-1551.

2. Jensen EW, Litvan H, Revuelts M, Radriguez BE, Caminal P, Martinez P, et al.: Cerebral state index during propofol anesthesia: a comparison with the bispectral index and the A-line ARX index. Anesthesiology 2006, 105:28-36.

3. Escudero D, Otero J, Muniz G, Gonzalo JA, Calleja C, Gonzalez A, et al.: The Bispectral index scale: its use in the detection of brain death. Transplant Proc 2005, 37:3661-3663.

Effects of mechanical ventilation on Cushing's triad

G Nimmo, A Howie, I Grant

Western General Hospital, Edinburgh, UK

Critical Care 2009, 13(Suppl 1):P77 (doi: 10.1186/cc7241)

Introduction Cushing's triad of bradycardia, hypertension and abnormal breathing due to critically raised intracranial pressure was described in 1903 [1] and is generally accepted clinically. We recognised that our patients undergoing uncal herniation and brain stem compression (whilst intubated and ventilated) did not appear to exhibit this classic response.

Methods We identified suitable patients from the Scottish Intensive Care Society Wardwatcher audit database for our ICU admissions from August 1999 to February 2007. We used the search terms subarachnoid haemorrhage and traumatic brain injury, both linked to brain stem death testing. From the resultant 175 patients, a consecutive sample of 50 patients was chosen for chart review. There were 30 males, mean age 40.5 years, and 20 females, mean age 50 years. Diagnoses were subarachnoid haemorrhage n = 27, stroke n = 6, traumatic brain injury n = 5, subdural haematoma n = 3, intracerebral haemorrhage n = 6, other n = 3. The time of coning was identified by the development of fixed dilated pupils and loss of the cough reflex. Haemodynamics were recorded 1 hour before coning occurred, at the time of coning and 1 hour later. In 43 ventilated patients a hypertensive surge was documented. One other patient underwent the blood pressure surge breathing spontaneously and was not intubated. In six patients the haemodynamic surge had not been documented by the bedside nurse. It was evident from chart review that coning had occurred in the hour between haemodynamic recordings. Results In 44 patients there was a significant rise in blood pressure at the time of confirmation of uncal herniation with a subsequent fall. This was accompanied by a major tachycardia in 38 patients and a normal heart rate in five patients. Only one patient exhibited a bradycardia and had been extubated as part of withdrawal of therapy.

Conclusions We have demonstrated that the haemodynamic response to critically raised intracranial pressure in a ventilated patient is not a bradycardia. We have used an approach akin to that of Cushing but all of the six patients in Cushing's original paper were breathing spontaneously. All but one of our patients was ventilated. We suggest that by avoiding hypoxaemia and hypercapnic acidosis the cardiac response is altered. It is important for clinicians to be aware of this to aid recognition of coning for diagnostic and prognostic reasons. Reference

1. Cushing H: The blood pressure reaction of acute cerebral compression, illustrated by cases of intracranial hemorrhage. Am J Sci 1903, 125:1017-1044.

Losing potential organ donors in critical care units: data from the Donor Action Database

L Roels1, C Spaight1, J Smits2, B Cohen1

1Donor Action Foundation, Linden, Belgium; 2Eurotransplant International Foundation, Leiden, the Netherlands Critical Care 2009, 13(Suppl 1):P78 (doi: 10.1186/cc7242)

Introduction The aim was to analyze heart-beating organ donation patterns in four countries using the Donor Action (DA) Program [1] nationally, to identify bottlenecks in their donation process and suggest areas for improvement.

Methods A retrospective medical record review (MRR) of all critical care deaths between January 2006 and December 2007 (n = 18,118) from 381 critical care units in 166 hospitals in Belgium, Finland, France and Switzerland was made using the DA's Diagnostic Review process. The upper age limit for medical suitability was 75 years. Data were entered into the DA System Database for analysis.

Results Of 6,561 patients (36.2% of all records) with no absolute contraindications to donation, 2,973 (45.3%) met preconditions for brain death (BD) diagnosis, 2,063 had signs of severe brain damage and 1,891 met criteria for formal BD diagnosis (= potential donors). Belgium had the highest proportion of patients with a formal BD diagnosis (75.7%) and Switzerland the lowest (57.4%, P <0.0001). Although donor identification rates were highest in France (93.6%) and lowest in Finland (47.7%, P <0.0001), Finland excelled in donor referral (93.9% of identified cases) vs. only 63.8% in Switzerland (P <0.0001), and in family approach rates (92.7%) vs. only 70.2% in France (P <0.0001). Consent rates as a percentage of families approached were highest in Belgium and Finland (89.5%) and lowest in France (65.7%, P <0.0001). Conversion rates (actual donors as a percentage of potential donors) were higher in France (43.1%) and Belgium (42.9%) and significantly lower in Finland (34.9%) and Switzerland (33.5%) (P = 0.0187). Only Belgium had a nonheart-beating donation policy during the study period, resulting in 11.2% more donors added to the country's donor pool.

Conclusions The DA MRR proved to be an excellent tool to identify areas for improvement within specific steps of the donation pathway, such as donor identification, BD diagnosis, donor referral, family approach and obtaining consent. Moreover, the DA MRR has been shown to be applicable in different countries and environments and should be considered a unique tool for comparing countries' donation performance. Reference

1. Donor Action Program [www.donoraction.org]

Critical care staff attitudes to organ donation impact on national donation rates: data from the Donor Action Database

L Roels1, C Spaight1, J Smits2, B Cohen1

1Donor Action Foundation, Linden, Belgium; 2Eurotransplant, Leiden, the Netherlands

Critical Care 2009, 13(Suppl 1):P79 (doi: 10.1186/cc7243)

Introduction The aim was to investigate whether critical care (CC) staff attitudes to organ donation, the concept of brain death (BD) and self-reported skills in donation-related tasks impact on national donation rates.

Methods Donor Action [1] Hospital Attitude Survey data were collected from 19,537 CC staff (MDs: 3,422, nurses: 13,977, others: 2,138) in 11 countries (Australia, Belgium, Croatia, Finland, France, Israel, Italy, Japan, Norway, Poland, Switzerland) between November 2006 and October 2008. Medical and nursing staff attitudes from each country were correlated with each country's donors per million population rate in 2007 (Council of Europe data). Data examined included average support for donation, respondents' willingness to donate their own, their children's and relatives' organs, acceptance of the BD concept and reported confidence levels with donation-related tasks. Results National donation rates significantly correlated with respondents' attitudes to donation (R = 0.783, P = 0.0029), acceptance of the BD concept (R = 0.651, P = 0.0279), explaining organ donation to the family (R = 0.544, P = 0.0845), introducing organ donation to the family (R = 0.0182, P = 0.0182) and obtaining consent for donation (R = 0.628, P = 0.0368). Conclusions Average CC staff attitudes and donation-related skills clearly influence their assessment and are a strong predictor of a country's donation rate, as demonstrated in this study. Measures to improve donation rates should focus on guidance and education of CC staff so as to ensure that these practitioners have sufficient knowledge of and confidence with donation-related issues. Reference

1. Wight C, Cohen B, Roels L, Miranda B: Donor Action: a quality assurance program for intensive care units that increases organ donation. J Intensive Care Med 2000, 15:104-114.
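The country-level associations reported in this abstract are simple Pearson correlations across the 11 participating countries; a minimal sketch (Python with SciPy) on illustrative, non-study values:

```python
from scipy.stats import pearsonr

# Illustrative country-level data (not the study values): mean staff support
# for donation (score) and donors per million population in 2007.
attitude_score = [7.1, 8.2, 6.5, 7.8, 8.9, 6.9, 7.4, 8.5, 6.2, 7.0, 8.0]
donors_pmp = [14, 23, 11, 19, 26, 13, 16, 25, 10, 15, 21]

r, p = pearsonr(attitude_score, donors_pmp)
print(f"R = {r:.3f}, P = {p:.4f}")
```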

Effect of endotracheal suctioning on intracranial pressure in severe head-injured patients

S Gholamzadeh, M Javadi

Shiraz Medical University, Shiraz, Iran

Critical Care 2009, 13(Suppl 1):P80 (doi: 10.1186/cc7244)

Introduction Endotracheal suctioning (ETS) is a routine nursing procedure used to decrease pulmonary complications; however, in severe head-injured patients it can result in a sudden increase in intracranial pressure (ICP) and may put the patient at risk for further cerebral damage [1-3]. The purpose of this study was to examine the effect of ETS on ICP in severe head-injured patients. Methods Twenty-one patients with acute severe head injury (Glasgow coma score <8, range 4 to 8) were studied. Each subject received four passes of insertion of a standardized suction catheter, with the application of negative pressure limited to 10 to 15 seconds in each pass. The ETS procedure consisted of administration of 16 breaths at 135% of the patient's tidal volume and 100% FIO2 before and after suctioning with a standardized catheter (16 French), with a suction duration between 10 and 15 seconds. A repeated-measures ANOVA model was used to examine the changes in mean ICP 1 min before and during the first, second, third and fourth passes of catheter insertion. Results ICP significantly increased during suctioning (P <0.001). The change in ICP was significantly greater in the fourth pass of catheter insertion than in the other passes.
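For illustration, the repeated-measures analysis can be sketched as follows (Python with pandas and statsmodels); the column names and simulated ICP values are assumptions, not the study data.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Simulated long-format data: one ICP value per patient and suction pass
# (values are illustrative only, not the study measurements).
rng = np.random.default_rng(3)
records = []
for patient in range(21):
    baseline = rng.normal(15, 3)
    for suction_pass in range(1, 5):
        records.append({
            "patient": patient,
            "suction_pass": suction_pass,
            "icp": baseline + 2 * suction_pass + rng.normal(0, 1),
        })
df = pd.DataFrame(records)

# One-way repeated-measures ANOVA: within-subject factor = suction pass.
print(AnovaRM(data=df, depvar="icp", subject="patient",
              within=["suction_pass"]).fit())
```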

Conclusions Changes in ICP induced by ETS in severe head-injured patients are significant. Suction passes should be limited to two to three per procedure. Repeated suctioning may increase ICP.

References

1. Kerr ME, Rudy EB, et al.: Effect of short-duration hyperventilation during endotracheal suctioning on intracranial pressure in severe head-injured adults. Nurs Res 1997, 46:195-201.

2. Gemma M, Tommasino C, et al.: Intracranial effects of endotracheal suctioning in the acute phase of head injury. J Neurosurg Anesthesiol 2002, 14:50-54.

3. Rudy EB, Turner BS, Baun M, et al.: Endotracheal suctioning in adults with head injury. Heart Lung 1991, 20:667-674.


Elevated plasma ammonia concentration in patients with traumatic hemorrhage

A Hagiwara, T Sakamoto

National Defense Medical College, Tokorozawa, Japan Critical Care 2009, 13(Suppl 1):P81 (doi: 10.1186/cc7245)

Introduction The blood ammonia concentration has been reported to be elevated by hemorrhagic shock in animal studies, but this finding has been reported in only one clinical study in a small number of selected surgical patients. We therefore conducted a study to determine whether plasma ammonia is elevated in trauma patients at hospital admission and can be used as a predictive factor for serious hemorrhage.

Methods The subjects were consecutive trauma patients admitted to our level 1 trauma center between November 2006 and November 2007. Blood was sampled on admission from patients who met the inclusion criteria to determine plasma ammonia and arterial lactate concentrations. Patients requiring blood transfusion or intervention for bleeding within 24 hours were classified into a blood transfusion and intervention group (BTI group). Patients not requiring transfusion or intervention were classified into a non-BTI group. Logistic regression analysis was performed using the requirement for BTI as the dependent variable and ammonia concentration, lactate concentration, shock index on admission, systolic blood pressure on admission, and ISS as independent variables.

Results The subjects were 148 trauma patients. The mean age was 44.8 ± 20.8 years, and the mean injury severity score was 17.7 ± 13.2. There was a significant correlation between ammonia and lactate concentrations (r = 0.46, P <0.001). Intervention for arterial bleeding was required in 16 patients. Blood transfusion was required in 17 patients. The BTI group consisted of 21 patients. The BTI group had more hemodynamic instability and significantly higher ammonia and lactate concentrations than the non-BTI group. Logistic regression analysis showed that only plasma ammonia was a significant independent variable for BTI (P = 0.001). The odds ratio of requiring blood transfusion and/or intervention for arterial bleeding was 18.2 when ammonia was >80 µg/dl.

Conclusions An elevated ammonia concentration on admission can serve as a predictive factor for traumatic hemorrhage requiring treatment.

Regulatory T cells are persistently enriched in the peripheral blood of head-injury patients

EJ Galtrey, AL Cox, DA Chatfield, AJ Coles, DK Menon

University of Cambridge, UK

Critical Care 2009, 13(Suppl 1):P82 (doi: 10.1186/cc7246)

Introduction We have previously shown that a significant proportion of patients have increased numbers of lymphocytes proliferating in response to myelin basic protein 10 days post traumatic brain injury (TBI); associated with a trend for more favourable outcome [1]. This suggests that the adaptive immune response may have a protective role following TBI, as demonstrated in animal models [2]. In this study, we further explored this process by examining the effect of TBI on regulatory T cell (Treg) numbers, and determining whether a change in Treg frequency was long lived.

Figure 1 (abstract P82)

Treg frequency post TBI.

Methods We recruited 12 patients with severe or moderate TBI who required protocol-driven therapy to maintain cerebral perfusion and intracranial pressure. Blood samples were taken within 72 hours and 10 days post TBI. A separate group of patients was studied 6 months post TBI. Age-matched and sex-matched healthy volunteers were used as controls. Peripheral blood mononuclear cells (PBMCs) were isolated. Flow cytometry was used to identify Tregs according to the surface expression of markers (CD4/CD25high/CD45ROhigh/CD127low). Results When compared with controls, patients at 10 days and at 6 months post TBI showed a greater proportion of PBMCs with a Treg phenotype (Mann-Whitney U test; P <0.01 and P <0.05, respectively). The proportion of Treg cells at 72 hours was higher than controls, but this difference did not reach significance. See Figure 1.

Conclusions We have shown that TBI upregulates CD4+ T cells with a regulatory phenotype. This effect is sustained for at least 6 months, and may modulate protective autoimmunity. Better understanding of the functional capacity and specificity of lymphocytes identified as Tregs may provide therapeutic targets. References

1. Cox AL, et al.: J Neuroimmunol 2006, 174:180-186.

2. Hauben E, et al.: Lancet 2000, 354:286-287.

Alteration of cardiopulmonary function after severe head injury

E Isotani, Y Otomo, K Ohno

Tokyo Medical and Dental University, Tokyo, Japan Critical Care 2009, 13(Suppl 1):P83 (doi: 10.1186/cc7247)

Introduction It is very hard to achieve optimal water balance in severe head injury (SHI) patients (Glasgow coma score <8). Cardiopulmonary complications are common after SHI: neurogenic pulmonary edema, cardiac failure, and so on [1-4]. In this study we present the alteration of cardiopulmonary function on pulse contour analysis calibrated by transpulmonary thermodilution (PiCCO-plus) monitoring after SHI.

Methods Plasma catecholamines, natriuretic polypeptides, thrombomodulin and D-dimer of nine patients were measured immediately after SHI. The cardiopulmonary functions of nine consecutive patients were monitored by PiCCO-plus daily during a week after SHI.

Results Noradrenalin, dopamine and brain natriuretic peptide concentrations were significantly high during the entire study period. Significantly higher elevations of plasma thrombomodulin and D-dimer concentrations were also observed after SHI. The intrathoracic blood volume was maintained in spite of systemic hypovolemia, and this fluid redistribution caused hydrostatic fluid retention in lung tissues on PiCCO-plus monitoring after SHI. See Figures 1 and 2.

Figure 1 (abstract P83)

Volume management of SHI patients on PiCCO-plus monitoring. CVP, central venous pressure; SVV, stroke volume variation; PPV, pulse pressure variation.

Figure 2 (abstract P83)

Lung water content of SHI patients on PiCCO-plus. ELWI, extravascular lung water index; PVPI, pulmonary vascular permeability index; EVLW, extravascular lung water; PBV, pulmonary blood volume.

Conclusions Persistent catecholamine release and the different sensitivity of blood vessels to catecholamine cause the blood volume redistribution: systemic hypovolemia and hydrostatic pulmonary edema. The excess cardiac preload due to catecholamine release leads to brain natriuretic peptide release resulting in natriuresis. References

1. Isotani E, et al.: Stroke 1994, 25:2198-2203.

2. Isotani E, et al.: J Cardiovasc Pharmacol 1996, 28:639-644.

3. Kubota Y, et al.: Vascul Pharmacol 2007, 47:90-98.

4. Mizuno Y, et al.: Vascul Pharmacol 2008, 48:21-31.

Abstract withdrawn

The RESCUEicp decompressive craniectomy trial

P Hutchinson, I Timofeev, S Grainger, E Corteen

University of Cambridge, UK

Critical Care 2009, 13(Suppl 1):P85 (doi: 10.1186/cc7249)


Introduction The RESCUEicp study (Randomised Evaluation of Surgery with Craniectomy for Uncontrollable Elevation of Intracranial Pressure) aims to provide Class 1 randomised evidence as to whether decompressive craniectomy is effective for the management of patients with raised and refractory intracranial pressure (ICP) following traumatic brain injury. Methods An international multicentre randomised trial comparing decompressive craniectomy with medical management. Patients (n = 50 for the pilot phase, n = 600 for the main study) with traumatic brain injury and raised ICP (>25 mmHg) refractory to initial treatment measures are eligible for the study. Patients are randomised to one of two arms: continuation of optimal medical management (including barbiturates) versus surgery (decompressive craniectomy). The inclusion criteria are: traumatic brain injury, age 10 to 65 years, and an abnormal CT scan. The exclusion criteria are: bilateral fixed and dilated pupils, bleeding diathesis, and devastating injury not expected to survive 24 hours. Outcome is assessed using the extended Glasgow Outcome Score and the SF-36 quality of life questionnaire at 6 months, 1 year and 2 years. See Figure 1.

Figure 1 (abstract P85)

Results Over 170 patients have been recruited to date. At present we have achieved 97% data collection and 94% follow up. The study is ongoing.

Conclusions Randomising patients with traumatic brain injury to decompressive craniectomy versus optimal medical management is feasible. Whether the operation is effective and safe remains to be seen. We would welcome more centers' participation. References

1. Timofeev I, Hutchinson PJ: Outcome after surgical decompression of severe traumatic brain injury. Injury 2006, 37: 1125-1132.

2. Winter CD, et al.: The role of decompressive craniectomy in the management of traumatic brain injury: a critical review. J Clin Neurosci 2005, 12:619-623.

Drift analysis of a novel device for measurement of regional cerebral blood flow

S Wolf1, L Schürer1, P Horn2, C Lumenta1

1Krankenhaus Bogenhausen, München, Germany; 2Charite, Berlin, Germany

Critical Care 2009, 13(Suppl 1):P86 (doi: 10.1186/cc7250)

Introduction Bedside measurement of regional cerebral blood flow (rCBF) is a promising technology to detect vasospasm after aneurysmal subarachnoid hemorrhage [1]. The Bowman® rCBF monitor facilitates continuous and quantitative assessment of brain tissue perfusion in ml/100 g tissue/minute with a thermodiffusion microprobe. The probe is self-calibrating at user-definable time intervals from 1 to 120 minutes. To estimate the necessary recalibration frequency, we looked at the measurement drift between two calibration points.

Methods In 32 patients with aneurysmal subarachnoid hemorrhage, a Bowman® rCBF probe was implanted in the vascular territory of the aneurysm-harboring vessel. According to previous experience, we performed automatic recalibration of the rCBF device every 30 minutes, thus yielding measurement periods of 25 minutes. CBF was recorded once per second. Data were averaged over 1 minute and all measurement cycles were pooled. Results The mean analyzed monitoring time per patient was 6.9 days. The mean rCBF was 24.4 ml/100 g/min at the beginning and 25.6 ml/100 g/min at the end of the measurement cycles between two recalibration points, representing a mean drift of 1.15 ml/100 g/min per 25-minute measurement period (P <0.001). This drift was heterogeneous across patients (range -3.67 to 12.0 ml/100 g/min). Patients 24 to 32, who were monitored more recently, showed a significant upward drift of 7.57 ml/100 g/min, whereas data from patients 1 to 23 had a downward drift of -0.67 ml/100 g/min. The explanation for this phenomenon is a revision of the monitor software by the manufacturer that was performed after patient 23. This finding was verified with external data.
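For illustration, the per-cycle drift described above can be computed as in the following Python/NumPy sketch, which averages a simulated 1 Hz rCBF signal over 1-minute bins within one 25-minute measurement cycle and takes the difference between the last and first bin; the simulated values are not patient data.

```python
import numpy as np

def cycle_drift(rcbf_1hz):
    """Drift over one 25-minute measurement cycle sampled at 1 Hz:
    mean of the last 1-minute bin minus mean of the first 1-minute bin."""
    cycle = np.asarray(rcbf_1hz)[:25 * 60]            # one cycle, 1 sample/s
    per_minute = cycle.reshape(25, 60).mean(axis=1)   # 1-minute averages
    return per_minute[-1] - per_minute[0]

# Simulated example: a cycle starting around 24 ml/100 g/min with a small
# upward trend plus noise (illustrative values only).
rng = np.random.default_rng(0)
t = np.arange(25 * 60)
signal = 24 + 0.001 * t + rng.normal(0, 1, t.size)
print(f"drift over this cycle: {cycle_drift(signal):.2f} ml/100 g/min")
```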

Conclusions The current implementation of the Bowman® rCBF monitor shows a severe upward measurement drift that is clinically relevant. At present, the only solution is to perform recalibrations as frequently as possible. Reference

1. Vajkoczy P, et al.: Regional cerebral blood flow monitoring in the diagnosis of delayed ischemia following aneurysmal subarachnoid hemorrhage. J Neurosurg 2003, 98: 1227-1234.

Continuous monitoring of carbon dioxide reactivity in traumatic brain injury

G De La Cerda, V Verma

Royal London Hospital, London, UK

Critical Care 2009, 13(Suppl 1):P87 (doi: 10.1186/cc7251)

Introduction The objective is to study the relationship between end-tidal carbon dioxide (EtCO2) and cerebrovascular pressure reactivity in traumatic brain injury (TBI). Cerebrovascular pressure reactivity is the ability of cerebral vessels to respond to changes in transmural pressure. A cerebrovascular pressure reactivity index (PRx) can be determined as the moving correlation coefficient between the mean intracranial pressure (ICP) and mean arterial blood pressure. A negative or zero value reflects a normally reactive vascular bed whereas positive values reflect passive, non-reactive vessels. This index correlates significantly with outcome after TBI [1]. Carbon dioxide affects the cerebral vascular response. Our aim is to identify the optimal EtCO2, at which PRx reaches its lowest value.

Methods A prospective observational study of 20 patients with TBI at the Royal London Hospital. All patients were managed according to the local guidelines for the management of TBI. PRx was determined by ICM+ (Cambridge University) software by calculating the correlation coefficient between ICP and arterial blood pressure. A total of 965 hours of data were recorded, including the mean arterial pressure, ICP, cerebral perfusion pressure, PRx and EtCO2.
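For illustration, a PRx-like index can be computed as a moving Pearson correlation between ICP and arterial blood pressure, as in the Python/NumPy sketch below; the window length, sampling interval and simulated signals are assumptions for demonstration and do not reflect the ICM+ configuration used in the study.

```python
import numpy as np

def prx(icp, abp, window=30):
    """PRx-like index: moving Pearson correlation between ICP and
    arterial blood pressure over `window` consecutive samples."""
    icp, abp = np.asarray(icp, float), np.asarray(abp, float)
    values = []
    for i in range(len(icp) - window + 1):
        values.append(np.corrcoef(icp[i:i + window], abp[i:i + window])[0, 1])
    return np.array(values)

# Illustrative use with simulated time-averaged samples (values are made up).
rng = np.random.default_rng(1)
abp = 90 + rng.normal(0, 5, 300)
icp_reactive = 15 + rng.normal(0, 2, 300)                      # ICP independent of ABP
icp_passive = 10 + 0.2 * (abp - 90) + rng.normal(0, 1, 300)    # ICP follows ABP
print(prx(icp_reactive, abp).mean())  # near zero: preserved reactivity
print(prx(icp_passive, abp).mean())   # positive: impaired reactivity
```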

Results We plotted EtCO2 against PRx to identify the optimal EtCO2 and the range of carbon dioxide reactivity. The graph of PRx compared with EtCO2 indicated a U-shaped curve, suggesting that too low or too high an EtCO2 was associated with a disturbance in pressure reactivity (Figure 1). We found this pattern in 66% of the patients. In the other patients (34%) there was no such correlation, so it was not possible to identify the optimal EtCO2, probably indicating loss of carbon dioxide reactivity.

Figure 1 (abstract P87)

Conclusions PRx allows the determination of the carbon dioxide level at which cerebrovascular pressure reactivity reaches its optimal value in individual patients. That would allow a dynamic approach to the carbon dioxide target in TBI. Reference

1. Zweifel C, et al.: Continuous monitoring of cerebrovascular pressure reactivity in patients with head injury. Neurosurg Focus 2008, 25:E2.

Determining neurological prognosis in patients with severe traumatic brain injury: a survey of Canadian intensivists

A Turgeon1, F Lauzier1, K Burns2, D Fergusson3, M Meade4, D Zygun5, D Scales2, R Zarychanski3, L Moore1, S Kanji3, L McIntyre3, J Pagliarello3, P Hébert3, for the Canadian Critical Care Trials Group

1Université Laval, Quebec, QC, Canada; 2University of Toronto, ON, Canada; 3University of Ottawa, ON, Canada; 4McMaster University, Hamilton, ON, Canada; 5University of Calgary, AB, Canada

Critical Care 2009, 13(Suppl 1):P88 (doi: 10.1186/cc7252)

Introduction Current prognostic information following severe traumatic brain injury (TBI) is of limited clinical utility. We hypothesized that wide practice variation exists in determining prognosis in this population. We conducted a survey of Canadian intensivists to better understand prognosis determination and decisions regarding level of care following severe TBI. Methods Survey items were generated to assess the perceived utilization and utility of different tests for prognosis determination, and the perception of prognosis and decision on the level of care. We used direct questions and scenario-based questions (five-point Likert scales). We pretested the questionnaire to assess its clinical sensibility, and conducted test-retest reliability. Canadian intensivists were identified at all level I and level II trauma centers. The survey was administered electronically. Nonrespondents were sent a paper questionnaire.

Results The response rate was 73% (180/215). Most respondents worked in teaching hospitals (95%), mixed medical/surgical ICUs (87%) with more than 40 severe TBI cases/year (65%). Poor neurological prognosis at 1 year was defined as Glasgow outcome scale 1, 2 or 3 for 69% of respondents. More than 60% considered that accurate prognosis determination would be most helpful within 7 days following severe TBI. Most respondents cited monitoring (>70%), clinical examination (>85%) and CT scan (>90%) as the most frequently used sources of information for evaluating prognosis, as opposed to MRI (30%) or somatosensory evoked potentials (15%). When asked if a 25-year-old male with severe TBI had a poor neurological prognosis at 1 year, 40% of respondents disagreed/strongly disagreed, 30% had no opinion and 30% agreed/strongly agreed. When asked how comfortable they would be recommending withdrawal of life-support measures, 82% reported being uncomfortable/very uncomfortable, 9% had no opinion and 9% were comfortable/very comfortable.

Conclusions We observed significant variation in perceptions of neurological prognosis and in decisions regarding level of care among Canadian intensivists. Considering the importance of such prognostic information in clinical practice, a better understanding of prognosis determination in TBI is warranted.

Severe brain trauma management analysis using a high-rate recording tool: better definition allows better analysis of practice

H Mehdaoui1, L Allart1, R Valentino1, I Elzein1, C Meunier1, B Sarrazin1, C Vilhelm2, D Zitouni2, P Ravaux2

1Fort de France University Hospital, Fort De France, Martinique; 2Lille 2 University, Lille, France

Critical Care 2009, 13(Suppl 1):P89 (doi: 10.1186/cc7253)

Introduction Review of practice is a way to enhance quality of care. We developed a high-rate recording tool able to store the data of critical care patients [1]. We recorded 15 severely brain-injured patients and analyzed our team's practice against commonly accepted recommendations.

Methods Fifteen patients were recorded at a rate of one value every 2 seconds over a total of 750 hours. We analyzed the data to identify episodes of cerebral hypoperfusion (CHP) and intracranial hypertension (ICHT) lasting 5 minutes or longer. The episodes were electronically detected in the signal files and manually validated using software allowing signal graph visualization and scrolling.
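A minimal Python sketch of the episode detection step: it flags runs of at least 5 minutes in a signal sampled every 2 seconds. The ICP threshold of 20 mmHg in the example is an assumption for illustration, not necessarily the definition applied by the authors.

```python
def detect_episodes(values, condition, sample_period_s=2, min_duration_s=300):
    """Return (start_index, end_index) pairs of runs where condition(value)
    holds continuously for at least min_duration_s seconds."""
    min_samples = min_duration_s // sample_period_s
    episodes, start = [], None
    for i, v in enumerate(values):
        if condition(v):
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_samples:
                episodes.append((start, i))
            start = None
    if start is not None and len(values) - start >= min_samples:
        episodes.append((start, len(values)))
    return episodes

# Illustrative use: intracranial hypertension defined here as ICP >20 mmHg
# (assumed threshold) in a signal recorded every 2 seconds.
icp = [12] * 200 + [25] * 200 + [14] * 100   # 400 s above threshold
print(detect_episodes(icp, lambda v: v > 20))  # -> [(200, 400)]
```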

Results Two hundred and forty-one episodes were detected: 135 episodes of ICHT and 106 episodes of CHP. ICHT and CHP episodes occurred together in 84 cases (34%). CHP episodes were shorter than ICHT episodes (P <0.02). Medical reactions were observed in 128 cases (53%) and more often concerned episodes lasting more than 30 minutes (n = 88, 59%) than shorter episodes (n = 61, 41%). Electronic analysis of CHP showed that reactions were mostly aimed at increasing the mean arterial pressure (47%) rather than at lowering intracranial pressure (35%). The adequacy of medical decisions for mean arterial pressure and intracranial pressure management was 45% and 55%, respectively, judged against the recommendations for severe head trauma management.

Conclusions Current monitoring of severely injured patients misses short episodes. Computers could help to detect such episodes better using adequate algorithms. This method could compensate for the limits of human performance usually observed in complex management protocols and could help to improve decision-making if implemented at the bedside. Reference

1. Allart L, Vilhelm C, Mehdaoui H, et al.: An architecture for online comparison and validation of processing methods and computerized guidelines in intensive care units. Comput Methods Programs Biomed 2008, 93:93-103.

Comparison of a new brain tissue oxygenation measuring probe with the established standard

S Wolf1, L Schürer1, P Horn2, C Lumenta1

1Krankenhaus Bogenhausen, München, Germany; 2Charite, Berlin, Germany

Critical Care 2009, 13(Suppl 1):P90 (doi: 10.1186/cc7254)

Introduction Besides intracranial pressure (ICP) monitoring, brain tissue oxygenation (pbtO2) monitoring with the Licox system (Integra Neuroscience, Germany) is on the verge of clinical routine in acute brain injury. Recently, a new pbtO2 probe by a different manufacturer (Raumedic AG, Germany) was introduced into the market. As this new probe facilitates measurement of ICP as well, its use would reduce invasiveness of multimodal neuromonitoring. Therefore, we investigated the agreement of pbtO2 values of both probes in patients with aneurysmal subarachnoid hemorrhage necessitating ICP and pbtO2 monitoring.

Methods Eight patients with pbtO2 monitoring probes of both types implanted side by side in the same vascular territory were investigated. Multimodal monitoring data were stored online with dedicated software. Data were analyzed using the method proposed by Bland and Altman [1].
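The agreement analysis follows the Bland-Altman method; the Python/NumPy sketch below shows how the bias and limits of agreement reported in the Results are computed, using made-up paired readings rather than the study data (the abstract quotes two standard deviations, whereas the sketch uses the conventional 1.96 SD limits).

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman bias and 95% limits of agreement for paired readings."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Made-up paired pbtO2 readings (mmHg) from two probes, for illustration only.
licox = [22, 18, 30, 25, 15, 28, 20, 24]
raumedic = [24, 15, 35, 22, 18, 25, 21, 27]
bias, limits = bland_altman(licox, raumedic)
print(f"bias = {bias:.2f} mmHg, limits of agreement = {limits}")
```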

Results The mean measurement time per patient was 8.6 days. All data pooled, the mean bias was -0.66 mmHg. The precision range (two standard deviations of the bias) was -32.9 to 32.6 mmHg. The Licox probe showed a tendency for higher values at high pbtO2, while the Raumedic probe showed higher values at low pbtO2. Analysis of single patients revealed no discernible pattern in the relationship of measurement values of both probes. Three of the new probes ceased to function prematurely. Conclusions Our data suggest that measurements of both pbtO2 probes cannot be interchanged. No easy algorithm for conversion of measurement data from one system to the other is available. More rigorous bench testing is necessary before implementation of the new system in the clinical routine. Reference

1. Bland JM, Altman DG: Statistical methods for assessing agreement between two methods of clinical measurement. Lancet 1986, 1:307-310.

The brain is a major source of S100 increase in porcine endotoxemic shock

M Lipcsey1, M Olovsson1, E Larsson1, R Einarsson2, GA Qadhr1, J Sjolin1, A Larsson1

1Uppsala University, Uppsala, Sweden; 2Fujirebio Diagnostics, Gothenburg, Sweden

Critical Care 2009, 13(Suppl 1):P91 (doi: 10.1186/cc7255)

Introduction Cerebral dysfunction frequently complicates septic shock. A marker of cerebral dysfunction could be of significant value in managing sedated septic patients. Plasma S100 (S100B) proteins increase in sepsis. S100B is present in the brain, but also in other tissues. To date, the source of this protein has not been investigated in sepsis. The aim of this study is to determine whether the brain is the source of S100B in an experimental sepsis model.

Figure 1 (abstract P91)

Brain sinoarterial S100B concentration differences over time. Mean/SE. *P <0.05.

Methods Twenty-seven pigs were anesthetized and randomized to either infusion of endotoxin at the rate of 1 µg/kg/hour (n = 19) or saline (n = 8). Catheters were inserted into a cervical artery and the superior sagittal sinus. Blood samples were collected from both sites and physiological data were registered before the start of the endotoxin infusion and hourly. After 6 hours, the animals were terminated and brain tissue samples were taken from the left hemisphere. S100B in plasma was measured by ELISA. Brain tissue samples were stained with biotinylated S100B antibodies. Results S100B levels increased in plasma and expression of S100B in cerebral tissue was higher in endotoxemic animals compared with controls. Statistically higher sinus versus arterial S100B concentration was only found at 2 hours in the endotoxemic animals (Figure 1).

Conclusions Although other sources exist, the brain is a major source of S100B in endotoxemia, making it a potential marker of cerebral dysfunction in septic shock.

Anemia is associated with brain tissue hypoxia and metabolic crisis after severe brain injury

P Kurtz, M Schmidt, J Claassen, E Carrera, L Fernandez, N Badjatia, S Mayer, K Lee

Columbia University Medical Center, New York, USA Critical Care 2009, 13(Suppl 1):P92 (doi: 10.1186/cc7256)

Introduction After severe brain injury, anemia can adversely affect cerebral oxygen delivery and brain tissue oxygen (PbtO2). However, it is unclear whether low hemoglobin (Hb) contributes to brain tissue hypoxia or compromises oxidative metabolism. Methods We studied 28 consecutive patients with severe brain injury (15 with subarachnoid hemorrhage, 8 with intracerebral hemorrhage, and 5 with traumatic brain injury) who underwent multimodality intracranial pressure, PbtO2 and microdialysis monitoring. The relationship between Hb levels and the risk of brain tissue hypoxia (BTH), defined as PbtO2 <15 mmHg, and metabolic crisis (MC), defined as lactate/pyruvate ratio >40 and glucose <0.7 mmol/l, was analyzed with general linear models of logistic function for dichotomized outcomes utilizing generalized estimating equations for model estimation.

Results The mean age was 53 (20 to 83) years, 18 (64%) patients were female, median Glasgow coma scale was 6 (IQR 4 to 8) and 32% were dead at discharge. A total of 3,209 hours of monitoring and 297 Hb measurements were collected and analyzed.

Figure 1 (abstract P92)

Relative frequency of metabolic crisis (lactate/pyruvate ratio >40 and glucose <0.7 mmol/l) across hemoglobin ranges (<8, 8.1 to 9, 9.1 to 10 and >10 mg/dl).

Figure 2 (abstract P92)

Relative frequency of brain tissue hypoxia (PbtO2 <15 mmHg) across hemoglobin ranges (<8, 8.1 to 9, 9.1 to 10 and >10 mg/dl).

The mean Hb was 10.1 ± 1.5 mg/dl. Hb values were categorized into four ranges: <8 mg/dl, 8.1 to 9 mg/dl, 9.1 to 10 mg/dl and >10 mg/dl. The range with the lowest frequency of MC and BTH (9.1 to 10 mg/dl) was defined as the reference. For every reduction in Hb range below the reference, there was an increased risk of MC (adjOR = 1.7 (95% CI = 1 to 2.9), P = 0.048 for 8.1 to 9 mg/dl; and adjOR = 4.2 (1.6 to 11.4), P <0.01 for <8 mg/dl) (Figure 1). A reduction in Hb below 8 mg/dl was also associated with an increased risk of BTH (adjOR = 2.5 (1 to 6.5), P = 0.04) (Figure 2). The number of monitored hours per day spent in MC, but not with BTH, was associated with mortality at discharge (adjOR = 1.6 (1.1 to 2.4), P = 0.02).
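For illustration, the modelling step reported above can be sketched as follows (Python with pandas and statsmodels): hourly observations are binned into the four Hb ranges and a binomial GEE with an exchangeable working correlation, clustered by patient, yields odds ratios relative to the 9.1 to 10 mg/dl reference. The data frame, column names and simulated values are assumptions; the actual dataset and adjustment covariates are not reproduced here.

```python
import numpy as np
import pandas as pd
from statsmodels.genmod.generalized_estimating_equations import GEE
from statsmodels.genmod.families import Binomial
from statsmodels.genmod.cov_struct import Exchangeable

# Simulated stand-in for the study data: one row per monitored hour with
# columns 'patient', 'hb' (mg/dl) and 'bth' (1 if PbtO2 <15 mmHg that hour).
rng = np.random.default_rng(2)
df = pd.DataFrame({
    "patient": np.repeat(np.arange(28), 100),
    "hb": rng.normal(10.1, 1.5, 2800),
})
df["bth"] = (rng.random(2800) < np.clip(0.4 - 0.03 * df["hb"], 0.01, 0.9)).astype(int)

# Bin Hb into the four ranges used in the abstract; reference = 9.1-10 mg/dl.
bins = [-np.inf, 8, 9, 10, np.inf]
labels = ["<=8", "8.1-9", "9.1-10", ">10"]
df["hb_cat"] = pd.cut(df["hb"], bins=bins, labels=labels).astype(str)

model = GEE.from_formula(
    "bth ~ C(hb_cat, Treatment(reference='9.1-10'))",
    groups="patient", data=df,
    family=Binomial(), cov_struct=Exchangeable(),
)
result = model.fit()
print(np.exp(result.params))   # exponentiated coefficients: ORs vs. the reference range
```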

Conclusions Mild reductions in blood Hb below 9 mg/dl are associated with an increased risk of MC and BTH after severe brain injury. MC had a more pronounced relationship with anemia and was associated with a greater risk of death.

Nonconvulsive seizures and renal failure after intracerebral hemorrhage

P Kurtz, L Fernandez, D Chong, L Hirsch, J Radhakrishnan, M Schmidt, K Lee, N Badjatia, S Mayer, J Claassen

Columbia University Medical Center, New York, USA Critical Care 2009, 13(Suppl 1):P93 (doi: 10.1186/cc7257)

Introduction Nonconvulsive seizures (NCSZ) and periodic epileptiform discharges (PEDs) are common and associated with poor outcome after intracerebral hemorrhage (ICH). Our objective is to describe the frequency of renal, liver, metabolic and thrombotic dysfunction after ICH and its association with non-convulsive status epilepticus (NCSE), NCSZ and PEDs. Methods We retrospectively identified all patients with spontaneous ICH who underwent cEEG monitoring between 1998 and 2006. Data assessed included admission creatinine and maximum values during the first 14 days for creatinine and bilirubin, and minimum values for bicarbonate, base excess and platelets. Acute renal failure (ARF) was defined as an increase in creatinine >50% from baseline, and severity was assessed by the RIFLE criteria. Other continuous variables were dichotomized using the mean value as the cutoff point to define the presence of liver dysfunction, acidosis and thrombocytopenia. ICH characteristics were based on CT scans. Univariate logistic regression was conducted to identify associations between predictors and NCSE, NCSZ and PEDs. Significant (P <0.25) and clinically relevant variables were

then included in a multivariable logistic regression model to identify independent associations.
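The two-step model building described above (univariate screening at P <0.25, then a multivariable model) can be sketched as follows; the data file and variable names are hypothetical, not taken from the study database.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical per-patient data set with a binary outcome (e.g. NCSZ) and
# candidate predictors coded as numeric/binary columns.
df = pd.read_csv("ich_ceeg_cohort.csv")
outcome = "ncsz"
candidates = ["arf", "liver_dysfunction", "acidosis", "thrombocytopenia",
              "age", "male", "coma_on_admission", "ich_volume_increase"]

# Step 1: univariate screening, keeping predictors with P < 0.25.
selected = []
for var in candidates:
    X = sm.add_constant(df[[var]].astype(float))
    if sm.Logit(df[outcome], X).fit(disp=0).pvalues[var] < 0.25:
        selected.append(var)

# Step 2: multivariable model on the screened predictors.
X = sm.add_constant(df[selected].astype(float))
print(sm.Logit(df[outcome], X).fit(disp=0).summary())
```

In practice, clinically relevant variables would be forced into the multivariable model even if they failed the screening threshold, as the abstract notes.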

Results A total of 102 patients were studied. The mean age was 62 ± 17 years and 55 (56%) were male. Twenty-six (26%) patients were comatose on admission and 15 (17%) had an increase in ICH volume >30%. Seven (6%) patients developed NCSE, 18 (18%) NCSZ and 17 presented PEDs. ARF developed in 30 (29%) patients, liver dysfunction in 30 (29%), metabolic dysfunction in 48 (47%) and thrombocytopenia in 55 (54%). Patients with NCSZ and PEDs more frequently developed ARF (56% vs. 24%, P = 0.01 and 47% vs. 26%, P = 0.08, respectively). After adjusting for age, gender, coma at admission and increase in ICH volume, patients with ARF were six times more likely to develop NCSZ (OR = 5.9, 95% CI = 1.4 to 24.6, P = 0.01) as compared with those without ARF. Similarly, after adjusting for age, gender and coma/stupor at admission, each one-point increase in the RIFLE criteria doubled the odds of developing PEDs (OR = 2.0, 95% CI = 1.1 to 3.6, P = 0.02). No significant associations were found for NCSE.

Conclusions ICH location within 1 mm from the cortex, an increase in ICH volume, coma or stupor at admission and the presence of ARF were independently associated with the development of NCSZ and PEDs after ICH.

S100 beta is a marker of brain insult caused by systemic inflammatory response syndrome

S Belyshev, A Levit, N Davydova

Regional Hospital, Ekaterinburg, Russian Federation Critical Care 2009, 13(Suppl 1):P94 (doi: 10.1186/cc7258)

Introduction Central nervous system (CNS) failure is included, along with failure of other systems, in septic multiorgan failure; however, in our opinion, it is the least studied. A rise in the S100β level has previously been described in septic encephalopathy patients, but at the same time sites of local ischemia or bleeding were found on computed tomography of the brain [1]. The purpose of the present study is to determine whether it is a functional or an organic CNS insult that defines septic encephalopathy, and to study the dependence of septic encephalopathy on the severity of sepsis. Methods The prospective study involved 28 patients meeting the ACCP/SCCM sepsis criteria. The primary site of infection excluded the CNS, as confirmed by computed tomography. After approval of the hospital ethics committee, we investigated patients without pre-existing neurological disorders. The patients were divided into two groups. The first group (n = 12) included patients with various degrees of impairment of consciousness (<15 points on the Glasgow coma scale). The second group (n = 16) included patients without any disturbance of consciousness. There were no differences between the groups in age, gender, severity of sepsis, cortisone level, lactate, or arterial and jugular bulb blood saturation.

Results The S100β level in group 1 was significantly higher than in group 2, exceeding the normal rate on average by 2.6 times (in group 2 the S100β level did not exceed the normal level of 0.15 units/ml). Venous blood saturation did not differ significantly between the two groups and was within the normal range. However, jugular bulb blood saturation in group 1 was 57.64 ± 13.3%, which is below normal. Venous blood lactate was elevated in group 2, while group 1 showed normal values in both venous and jugular blood (jugular lactate = 1.76). The 28-day mortality was significantly higher in group 1. The S100β level correlated with mortality.

Conclusions CNS failure in septic encephalopathy is organic in nature and is related to nonfocal CNS cell necrosis. The results obtained suggest that the S100β level may be considered a marker of unfavorable outcome in patients with severe sepsis. Reference

1. Nguyen DN, et al.: Crit Care Med 2006, 34:1967-1974.

Cerebral oxygenation monitoring in critical care patients with traumatic brain injury

T Tokutomi, T Miyagi, H Katsuki, Y Takeuchi, M Shigemori

Kurume University School of Medicine, Kurume, Japan Critical Care 2009, 13(Suppl 1):P95 (doi: 10.1186/cc7259)

Introduction One of the most controversial areas of traumatic brain injury (TBI) critical care is the management of cerebral perfusion pressure (CPP). Since optimal CPP levels depend on whether cerebral autoregulation is preserved, these levels must be determined for individual cases. The aim of this study was to investigate the role of jugular venous saturation (SjO2) and brain tissue oxygen tension (PbrO2) monitoring in addition to CPP and intracranial pressure (ICP) monitoring in the acute management of patients with TBI. Methods Thirty-six severe TBI patients (ages 16 to 69 years, GCS score 4 to 7) admitted to our neurosurgical critical care unit were evaluated. ICP (Camino), CPP, and SjO2 (Abbott) were continuously monitored in the 36 patients, and PbrO2 (LICOX) was continuously monitored in the last 10 patients. The treatment goal was aimed at keeping ICP <20 mmHg and CPP >60 mmHg. Patients with good recovery or moderate disability on the Glasgow Outcome Scale at 6 months after injury were regarded as having favorable outcomes, and those with severe disability, vegetative state, or death were regarded as having unfavorable outcomes. Results Thirteen patients had favorable outcomes. PbrO2 values showed a positive correlation with CPP (r = 0.380, P <0.0001). SjO2 tended to be abnormally high when the simultaneous PbrO2 values were <15 or >45 mmHg. High SjO2 combined with low CPP or low PbrO2 was associated with unfavorable outcome. The occurrence rate of CPP above 60 mmHg during the first 5 days of monitoring was 89.8% of the measurements in favorable outcome patients vs. 72.7% of those in unfavorable outcome patients (likelihood ratio = 1.2). These proportions were altered to 70.3% vs. 46.8% (likelihood ratio = 1.5) when the simultaneous SjO2 values were normal, and to 85.8% vs. 51.2% (likelihood ratio = 1.7) when the simultaneous PbrO2 values were more than 20 mmHg. Conclusions The present data emphasize the clinical significance of brain oxygen monitoring in TBI and provide evidence for high SjO2 caused by cerebral hypoperfusion. SjO2 and PbrO2 measurements in combination with ICP and CPP should help in improving

the accuracy of monitoring intracranial pathophysiology and response to treatment.

Addition of new criteria to the Sequential Organ Failure Assessment for the patients with subarachnoid hemorrhage

S Macedo, B Oliveira, D Lima, A Da Silva, N Bastos, R De Carvalho

Hospital Sao José do Avai, Itaperuna, Brazil

Critical Care 2009, 13(Suppl 1):P96 (doi: 10.1186/cc7260)

Introduction The purpose of the treatment of subarachnoid hemorrhage (SAH) is to prevent or reverse ischemic disabilities through hemodynamic therapy (4H therapy) [1]. We added some criteria to the index to assess patients with SAH. Methods Informed consent was obtained for each patient/family. The APACHE II score, weekly Sequential Organ Failure Assessment score, serum glucose, lactate, calcium, sodium and magnesium, axillary temperature and hourly diuresis were recorded as additional criteria alongside the Sequential Organ Failure Assessment prognostic index. The study enrolled 91 patients diagnosed with SAH, confirmed by CT, in the ICU. The approach to these new criteria was the same in all patients. The patients were divided into two groups according to their course in the ICU: Group I - patients who had a satisfactory course (discharged from the ICU), and Group II - patients who progressed to death.

Results Among 91 patients, Group I had 55 patients (60.4%) and the 36 remaining patients (39.6%) were classified as Group II. See Table 1.

Conclusions Regarding the criteria used to assess patients with SAH, we concluded that the only criterion showing statistical significance in the prediction of death was serum sodium (P = 0.002). Reference

1. Naval NS, et al.: Controversies in the management of aneurysmal subarachnoid hemorrhage. Crit Care Med 2006, 34:511-524.

Epidemiological analysis of patients with cerebral aneurysm submitted for an embolization at São José do Avai Hospital

S Macedo, G Alves, T Souza, C Siqueira, S Siqueira, A Siqueira, L Oliveira

Hospital Sao José do Avai, Itaperuna, Brazil

Critical Care 2009, 13(Suppl 1):P97 (doi: 10.1186/cc7261)

Introduction Cerebral aneurysms affect between 1% and 5% of the adult population and are responsible for significant rates of morbidity and mortality. The treatment of intracranial aneurysms has evolved substantially since the introduction of endovascular

Table 1 (abstract P96)

Results of the criteria examined in the study obtained in both groups

Criterion | Group I (n = 55): frequency (95% CI), OR | Group II (n = 36): frequency (95% CI), OR | P value
Arterial lactate | 50.9 (37.1 to 64.6), 1.0 | 50.0 (32.9 to 67.1), 0.96 | 0.5
Serum calcium | 49.1 (35.4 to 62.9), 1.0 | 55.6 (38.1 to 72.1), 1.3 | 0.3
Serum glucose | 23.6 (13.2 to 37.0), 1.0 | 36.1 (20.8 to 53.8), 1.8 | 0.1
Serum magnesium | 10.9 (4.1 to 22.2), 1.0 | 0 (0 to 9.7), 0 | 0.04
Serum sodium | 23.6 (13.2 to 37.0), 1.0 | 55.6 (38.1 to 72.1), 4.0 | 0.002
Hourly diuresis | 78.2 (65.0 to 88.2), 1.0 | 72.2 (54.8 to 85.8), 0.7 | 0.3
Axillary temperature | 16.4 (7.8 to 28.8), 1.0 | 30.6 (16.3 to 48.1), 2.2 | 0.06
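As a worked example, the serum sodium row of Table 1 (abstract P96) can be reproduced approximately from the group counts; the use of Fisher's exact test here is an assumption, since the abstract does not state which test was applied.

```python
from math import exp, log, sqrt
from scipy.stats import fisher_exact  # the exact test is an assumption

# Serum sodium criterion, counts reconstructed from Table 1:
# Group II (death): 55.6% of 36 = 20 positive; Group I: 23.6% of 55 = 13 positive.
a, b = 20, 36 - 20   # Group II: positive, negative
c, d = 13, 55 - 13   # Group I: positive, negative

or_ = (a * d) / (b * c)                      # odds ratio, ~4.0 as in the table
se = sqrt(1/a + 1/b + 1/c + 1/d)             # Wald SE of log(OR)
lo, hi = exp(log(or_) - 1.96*se), exp(log(or_) + 1.96*se)
_, p = fisher_exact([[a, b], [c, d]])
print(f"OR = {or_:.1f}, 95% CI {lo:.1f} to {hi:.1f}, P = {p:.3f}")
```

The computed odds ratio matches the tabulated 4.0, and the P value is of the same magnitude as the tabulated 0.002.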

neurosurgery by Guglielmi detachable coils in the 1990s. Embolization overtook clipping as the initial method in many centers, including in Brazil, because of the safety and feasibility of the method. Methods This retrospective cohort study analyses the clinical and epidemiological variables. It was conducted from a database of patients submitted for embolization in the neurosurgery department of São José do Avai Hospital (Itaperuna, RJ, Brazil) in the period December 2006 to May 2007.

Results We studied 510 patients submitted for embolization: 406 females (79.6%) and 104 males (20.4%). The average age of patients was 50.9 years (OR = 14). The total of patients studied that required hospitalization in the ICU was 176 (34.4%), staying on average 6.1 days (OR = 5.0). Hunt-Hess scale prevalence: 1 - 57.0%, 2 - 27.2%, 3 - 11.8%, 4 - 3.5%, 5 - 0.5%; and Fisher (tomography) scale: 1 - 41.8%, 2 - 24.0%, 3 - 23.5%, 4 - 10.6%. We found risk factors involved in cerebral vascular accident. Conclusions We note the predominance of females in the occurrence of aneurysmal cerebral vascular accident. The average age of patients was 50.9 years. Systemic hypertension and smoking showed a strong association with the presence of intracranial aneurysms. The arteries of the anterior circulation had the highest incidence of aneurysms. The majority of patients did not require hospitalization in the ICU. The mortality rate of cerebral vascular accident in patients treated with endovascular coiling was 11.3%.

Seizures and organ dysfunction after subarachnoid hemorrhage

P Kurtz, L Fernandez, D Chong, L Hirsch, J Radhakrishnan, M Schmidt, K Lee, N Badjatia, S Mayer, J Claassen

Columbia University Medical Center, New York, USA Critical Care 2009, 13(Suppl 1):P98 (doi: 10.1186/cc7262)

Introduction Nonconvulsive status epilepticus (NCSE), non-convulsive seizures (NCSZ) and periodic epileptiform discharges (PEDs) are common and associated with poor outcome after subarachnoid hemorrhage (SAH). The objective of this study is to describe the frequency of renal, liver, thrombotic and metabolic dysfunction after SAH and their association with NCSE, NCSZ and PEDs.

Methods We studied all patients with SAH who underwent cEEG monitoring from 1997 to 2004. Data assessed included admission creatinine, maximum values during the first 14 days for creatinine and total bilirubin, and minimum bicarbonate, base excess and platelets. Renal failure was defined either as a maximum creatinine >1.1 mg/dl or as an increase in creatinine >50% from baseline. Severity of renal dysfunction was assessed by the RIFLE criteria. The other continuous variables were dichotomized using the mean value as the cutoff point that defined liver dysfunction, metabolic acidosis and thrombocytopenia. Univariate logistic regression was conducted to identify associations between predictor variables with NCSE, NCSZ and PEDs. Significant (P <0.25) and clinically meaningful variables were then included in a multivariable logistic regression model to identify independent associations. Results A total of 116 patients were studied. The mean age was 58 ± 16 years and 80 (69%) were female. Eighty-eight percent of the patients had a Hunt-Hess grade at admission of 3 or worse. Twelve patients (10%) developed NCSE, 17 (15%) NCSZ and 26 (22%) PEDs. Acute renal failure (ARF) developed in 18 (15%) patients, liver dysfunction in 24 (21%), low bicarbonate in 37 (32%), low base excess in 72 (62%) and thrombocytopenia in 55 (47%). After adjusting for age, gender and Hunt-Hess grade, patients with severe metabolic acidosis were six times more likely to develop NCSZ as compared with those without acidosis (OR =

5.9, 95% CI = 1.2 to 27.8, P = 0.03). Similarly, after adjusting for age, gender and Hunt-Hess grade, patients with renal dysfunction were six times more likely to develop NCSE as compared with those without ARF (OR = 5.8, 95% CI = 1.5 to 21.8, P = 0.01). No association was found with PEDs.

Conclusions Organ dysfunction is common after SAH. Metabolic acidosis and renal dysfunction were independently associated with the development of NCSZ and NCSE, respectively.

Alteration of cardiopulmonary function after subarachnoid hemorrhage

E Isotani, Y Otomo, K Ohno

Tokyo Medical and Dental University, Tokyo, Japan Critical Care 2009, 13(Suppl 1):P99 (doi: 10.1186/cc7263)

Introduction Volume management is crucial in intensive care; however, in some patients it is very hard to achieve optimal water balance. The subarachnoid hemorrhage (SAH) patient is a representative example [1-3]. Cardiopulmonary complications are common after SAH: neurogenic pulmonary edema, cardiac failure, and so on. In this study we will present the time course of catecholamines and natriuretic polypeptides and fluid redistribution on pulse contour analysis calibrated by transpulmonary thermodilution PiCCO-plus monitoring after SAH.

Methods Plasma catecholamines and natriuretic polypeptides of 54 consecutive patients were measured every other day during 2 weeks after SAH. The cardiopulmonary functions of 37 consecutive patients were monitored by PiCCO-plus daily during 2 weeks after SAH.

Results Noradrenalin, dopamine and brain natriuretic polypeptide concentrations were significantly elevated during the entire study period. Intrathoracic blood volume was maintained in spite of systemic hypovolemia, and this fluid redistribution caused hydrostatic fluid retention in lung tissues on PiCCO-plus monitoring after SAH. See Figures 1 and 2.

Conclusions Persistent catecholamine release and the different sensitivity of blood vessels to catecholamine cause the blood volume redistribution: systemic hypovolemia and hydrostatic pulmonary edema. The excess cardiac preload due to catecholamine release leads to BNP release resulting in natriuresis. References

1. Isotani E, et al.: Alterations in plasma concentrations of natriuretic peptides and antidiuretic hormone after subarachnoid hemorrhage. Stroke 1994, 25:2198-2203.

2. Isotani E, et al.: Impaired endothelium-dependent relaxation in rabbit pulmonary artery after subarachnoid hemorrhage. J Cardiovasc Pharmacol 1996, 28:639-644.

3. Kubota Y, et al.: Alterations of intracellular calcium concentration and nitric oxide generation in pulmonary artery endothelium after subarachnoid hemorrhage of the rabbit. Vasc Pharmacol 2007, 47:90-98.

Transfusion increases infection without affecting neurologic outcome in spontaneous subarachnoid hemorrhage

K Matsushima, A Eastman, S Shafi, A Burris, T Tyner, H Frankel

University of Texas Southwestern Medical Center, Dallas, TX, USA Critical Care 2009, 13(Suppl 1):P100 (doi: 10.1186/cc7264)

Introduction Liberal use of packed red blood cell (PRBC) transfusion to a predefined threshold has been shown to worsen

Figure 1 (abstract P99)

Plasma catecholamine and natriuretic peptide concentrations after SAH. NOR, noradrenalin; DOA, dopamine.

Figure 2 (abstract P99)

PiCCO-plus monitoring after SAH. ELWI, extravascular lung water index; PVPI, pulmonary vascular permeability index; EVLW, extravascular lung water; PBV, pulmonary blood volume.

the outcome of ICU patients. However, in an effort to improve neurologic outcomes of patients with nontraumatic subarachnoid hemorrhage (SAH), transfusions are still used frequently to maintain hemoglobin of 10 g/dl. We hypothesized that PRBC transfusion in patients with SAH would worsen their outcomes. Methods We conducted a 19-month retrospective study of 84 patients with nontraumatic SAH in an intensivist-run, high-volume, academic ICU. Patients who received at least 1 unit PRBC transfusion during their hospital stay (n = 42, median 3 units) were compared with those who did not (n = 42). Outcomes of interest were vasospasm (defined both clinically and by transcranial Doppler velocities), 28-day mortality, poor neurologic outcome (defined as modified Rankin score of >4), and occurrence of nosocomial infections (defined by National Healthcare Safety Network criteria). Associations of PRBC transfusions with these outcomes were measured using univariate and multivariate analysis that adjusted for age, gender, ethnicity, comorbidities, neurologic status upon presentation, and procedures.

Results Patients with and without PRBC transfusions were similar in age, gender, ethnicity, comorbidities, and neurologic status upon presentation. There was no difference in the incidence of vasospasm with transfusions (52% vs. 37%, P = 0.12) or mortality (12% vs. 17%, P = 0.53). PRBC transfusions were associated with nosocomial infections (69% vs. 33%, P = 0.001) and poor neurologic outcome (69% vs. 48%, P = 0.046) in univariate analysis. After adjustment for potential confounders, PRBC transfusion was an independent predictor of nosocomial infections (OR = 3.7, 95% CI = 1.2 to 11.6, P = 0.03) but not poor neurologic outcome (OR = 1.9, 95% CI = 0.4 to 10.6, P = 0.45). Conclusions Use of PRBC transfusion in ICU patients with SAH does not improve neurologic outcome but increases risk of nosocomial infections. Hence, restricted use of PRBC may be justified in this segment of the ICU population, as well.

Emergency surgical management for prevention of symptomatic vasospasm and normal pressure hydrocephalus after subarachnoid hemorrhage due to ruptured cerebral aneurysm

K Ishii, M Fujiki, H Kobayashi

Oita University School of Medicine, Yufu, Japan

Critical Care 2009, 13(Suppl 1):P101 (doi: 10.1186/cc7265)

Introduction Much research has addressed the prevention or treatment of symptomatic vasospasm and normal pressure hydrocephalus after subarachnoid hemorrhage due to a ruptured cerebral aneurysm. However, this important issue is still unresolved. In this preliminary report, we introduce our surgical strategy and techniques for subarachnoid hemorrhage due to a ruptured cerebral aneurysm and discuss the surgical prevention of symptomatic vasospasm and normal pressure hydrocephalus.

Methods The subjects consisted of 19 consecutive patients with subarachnoid hemorrhage due to a ruptured cerebral aneurysm who were surgically treated in the acute stage between 2006 and 2007. All aneurysms were located in the anterior circulation. Hunt & Kosnik classifications were 1 to 3, WFNS grades were 1 or 2, and the Fisher group was 2 or 3. We performed neck clipping of the aneurysm through the lateral supraorbital or pterional approach. We opened the cisterns surrounding the circle of Willis and removed as much of the hematoma as we could. The lamina terminalis was routinely opened during the operation. We did not insert any drainage tube postoperatively.

Results Symptomatic vasospasms were noted in four patients. However, the symptoms were not severe and were transient in all patients. There were no patients with normal pressure hydrocephalus. The clinical outcomes were good or excellent. There were almost no perioperative complications due to surgical procedures.

Conclusions These surgical techniques might contribute to preventing symptomatic vasospasm and normal pressure hydrocephalus after subarachnoid hemorrhage due to a ruptured cerebral aneurysm. An accumulation of cases is necessary.

Diagnosis and treatment of subarachnoid hemorrhage-induced vasospasm

C Frosini, A Amadori, L Bucciardini, I Bacci, E Gandini, M Marinoni, S Mangiafico, P Innocenti

AOU Careggi, Florence, Italy

Critical Care 2009, 13(Suppl 1):P102 (doi: 10.1186/cc7266)

Introduction Vasospasm occurs in up to 70% of subarachnoid hemorrhages (SAHs). Transcranial Doppler (TCD) and cerebral angiography (AGF) are used to monitor and guide its treatment; however, no systematic approach to interpret their values exists yet. The aim of our study was to evaluate the correspondence between sonographic and angiographic findings and the efficacy of the endovascular treatment.

Methods One hundred and one patients were admitted to our neurological ICU with SAH from June 2006 to September 2008. All of them were examined with daily TCD. Mean flow velocity in the middle cerebral artery >200 cm/s or an increase >50 cm/s/day or a mean flow velocity in the other arteries more than two times the normal values were interpreted as indicative of severe vasospasm. If a second TCD showed high blood velocities, an AGF was performed. If the vasospasm was confirmed, the patient was considered eligible for endovascular treatment. Ten patients developed severe vasospasm and were analyzed in this study. Results All patients presented poor clinical condition at admission (Hunt Hess 4 ± 0.9, Fisher grade 3.7 ± 0.4). Sixty percent developed intracranial hypertension and 30% needed barbiturate coma and decompressive craniectomy. Severe vasospasm occurred 9 days after SAH (8.5 ± 4.4). In 100% there was correspondence between sonographic and angiographic diagnosis of vasospasm. Eighteen endovascular treatments were performed (70% either intra-arterial nimodipine or transluminal balloon angioplasty, 30% intra-arterial nimodipine). Angiographical and clinical improvement was obtained in 80% of endovascular procedures, the sonographical efficacy in 75% of them. Nine patients developed recurrent vasospasm after the endovascular therapy. Each patient received 1.8 ± 1.03 treatments. Two patients with refractory vasospasm died in the neurological ICU for intracranial hypertension; the other eight patients had good recovery at discharge (Glasgow coma scale 12 ± 2.3). At the 6-month follow-up all patients had a favorable Glasgow outcome score (3.9 ± 1.59).

Conclusions Our study confirmed the excellent correlation between TCD and angiography in patients with middle cerebral artery flow velocity >200 cm/s. TCD can be considered a very useful bedside tool for early detection of vasospasm. The endovascular management of vasospasm appeared to be safe and effective. The efficacy of combining pharmacological and mechanical approaches in improving outcomes after SAH and in reducing the frequency of secondary neurologic deficits was demonstrated.

Effects of simvastatin in prevention of vasospasm in nontraumatic subarachnoid hemorrhage: preliminary data

S Macedo, Y Bello, A Silva, C Siqueira, S Siqueira, L Brito

Hospital Sao José do Avai, Itaperuna, Brazil

Critical Care 2009, 13(Suppl 1):P103 (doi: 10.1186/cc7267)

Introduction Vasospasm is the main cause of death and cognitive deficits in patients with subarachnoid hemorrhage after rupture of an aneurysm (aSAH). Some trials have shown that statins in the acute phase of aSAH reduce the incidence, morbidity and mortality of cerebral vasospasm [1]. The purpose of this study is to evaluate the potential of simvastatin (SVT) as prevention against vasospasm. Methods We performed a prospective, randomized, nonblinded study between January and December 2008: the treatment group received 80 mg SVT (at night) for 21 days, started within the first 72 hours of the onset of bleeding, and a control group did not receive SVT. Informed consent was obtained for all patients. CT scans were performed as a control and another CT scan was performed in patients with altered neurological signs. In the presence of changes suggestive of vasospasm or a correlation between clinical condition and CT findings, the patients were taken for cerebral arteriography followed by an angioplasty procedure if necessary. Liver and renal function and LDL cholesterol were evaluated weekly, and total creatine kinase was evaluated every 3 days. Exclusion criteria: liver and renal disease, pregnancy, elevation of serum transaminases (three times the normal value), creatinine >2.5, rhabdomyolysis or total creatine kinase >1,000 U/l. Results We excluded two patients with bleeding for more than 72 hours. There was no significant change in the levels of total creatine kinase, or in renal or liver function. We included 21 patients, 11 in the SVT group and 10 in the control group. Mortality was eight patients (38%): six patients in the control group and two in the SVT group. Vasospasm was confirmed by cerebral arteriography in four patients in the control group and one patient in the SVT group. All the patients who died showed Fisher scale IV.

Conclusions SVT at a dose of 80 mg was effective in reducing the mortality (18.1% against 60%) compared with the group that did not use SVT, and also decreased the incidence of cerebral vasospasm despite the APACHE II score being higher in the group that used SVT (14.3 vs. 10.7). Less morbidity occurred in the SVT group, with an average Glasgow outcome score of 3.25 vs. 2.1. Reference

1. Lynch JR, Wang H, et al.: Simvastatin reduces vasospasm after aneurysmal subarachnoid hemorrhage: results of a pilot randomized clinical trial. Stroke 2005, 36:2024-2026.

Magnesium use on prophylaxis of vasospasm morbidity and the mortality rate in subarachnoid hemorrhage

S Macedo, R Nuss, G Lubanco, R Lovatti, S Pereira, G Lima, C Siqueira, S Siqueira, D Lima, R Torres

Hospital Sao José do Avai, Itaperuna, Brazil

Critical Care 2009, 13(Suppl 1):P104 (doi: 10.1186/cc7268)

Introduction We proposed this study to address two endpoints: the clinical incidence of vasospasm morbidity, confirmed by CT, and the 28-day mortality of subarachnoid hemorrhage (SAH) patients [1]. It compares a group of patients who received magnesium (Mg) (intervention, Group 1) with those who did not (control, Group 2).

Methods After institutional approval and informed consent, a prospective, randomized, nonblinded study was carried out between February and November 2008. The main goal of the study was to achieve a serum Mg concentration of 2.5 to 3.5 mg/dl, using a 2% Mg solution (saline solution 5% 400 ml + MgSO4 10% 100 ml/24 hours), during the first 14 days after aneurysm rupture. Admission criteria: patients diagnosed with SAH and AT <96 hours. Exclusion criteria: patients with SAH and AT >96 hours. Results In a preliminary evaluation we analysed a total of 56 patients (n = 26 in Group 1 and n = 30 in Group 2) (Tables 1 and 2). Main results: Group 1 - vasospasm frequency 26.9% (n = 7) and mortality 19.2% (n = 5) at 28 days; Group 2 - vasospasm frequency 46.7% (n = 14) and mortality 33.3% (n = 10) at 28 days. Conclusions According to these outcomes, we can conclude that Group 1 had greater protection against vasospasm and a lower mortality than Group 2. The P values were not significant owing to the still small number of patients. Reference

1. Schmid-Elsaesser R, et al.: Intravenous magnesium versus nimodipine in the treatment of patients with aneurysmal subarachnoid hemorrhage: a randomized study. Neurosurgery 2006, 58:1054-1065.

Outcomes of ventilator-associated pneumonia in aneurysmal subarachnoid hemorrhage patients

R Lenhardt, O Akca

University of Louisville, KY, USA

Critical Care 2009, 13(Suppl 1):P105 (doi: 10.1186/cc7269)

Introduction Subarachnoid hemorrhage (SAH) from rupture of cerebral aneurysms is associated with significant mortality and morbidity. About 10% to 25% die before reaching hospital, and of

Table 1 (abstract P104)

Group | Average age | APACHE II score | Average Mg level
Group I (n = 26) | 52.3 | 8.2 | 2.32
Group II (n = 30) | 50.3 | 15.6 | 1.9

Table 2 (abstract P104)

Outcome | Group I (n = 26): frequency (% (n)), 95% CI, odds ratio | Group II (n = 30): frequency (% (n)), 95% CI, odds ratio | P value
Vasospasm | 26.9 (7), 11.6 to 47.8, 0.4 | 46.7 (14), 28.3 to 65.7, 1 | 0.1
Mortality in 28 days | 19.2 (5), 6.6 to 39.4, 0.5 | 33.3 (10), 17.3 to 52.8, 1 | 0.2

those who survive about 40% to 50% develop significant neurological deficits [1]. Ventilator-associated pneumonia (VAP) is defined as pneumonia occurring more than 48 hours after initiation of mechanical ventilation. About 20% of post-aneurysmal SAH patients are reported to experience VAP [2]. In this trial, we aimed to report the short-term outcomes of VAP. We performed a surveillance analysis on aneurysmal SAH patients who required mechanical ventilation for more than 48 hours. Methods After obtaining approval from the Human Studies Committee of the University of Louisville to retrospectively analyze the prospectively collected patient data, we reviewed the electronic records of our aneurysmal SAH patients admitted between 2004 and 2007. VAP was diagnosed and confirmed by the Clinical Pulmonary Infection Score supported with culture results on days 0 and 3. We analyzed host-specific, disease-specific and care-related risk factors. Categorical variables were compared with the chi-square test, and continuous data were analyzed with the unpaired t and Kruskal-Wallis tests.
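The group comparisons named above could be run along these lines with SciPy; the array contents and the grouping are placeholders, not study data.

```python
import numpy as np
from scipy.stats import chi2_contingency, kruskal, ttest_ind

def compare_vap_groups(counts_2x2, continuous_vap, continuous_no_vap):
    """Chi-square for a 2x2 table of categorical counts (e.g. VAP vs. sepsis),
    plus unpaired t and Kruskal-Wallis tests for a continuous variable
    (e.g. days of mechanical ventilation) between VAP and non-VAP patients."""
    _, p_chi2, _, _ = chi2_contingency(np.asarray(counts_2x2))
    _, p_t = ttest_ind(continuous_vap, continuous_no_vap)
    _, p_kw = kruskal(continuous_vap, continuous_no_vap)
    return {"chi2_p": p_chi2, "t_test_p": p_t, "kruskal_p": p_kw}
```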

Results Of 86 aneurysmal SAH patients admitted to the ICU, 45 (52%) needed to be ventilated for more than 48 hours, and 16 of them developed VAP (19%). More than 80% of patients with SAH required either a surgical or a vascular procedure. The majority of VAP episodes were late-onset pneumonias (88%). The duration of mechanical ventilation was longer in the VAP patients. About 20% of VAP group patients were also diagnosed with sepsis. However, the duration of ICU stay was influenced neither by VAP nor by sepsis. Four patients who did not experience any VAP were diagnosed with stroke during their ICU stay. All-cause mortality was not higher in patients with VAP.

Conclusions In this preliminary report of a prospective cohort trial, it appeared that VAP did not contribute to additional morbidity or mortality. The majority of VAP occurred late in the course of ventilation. This supports the theory that VAP occurs primarily due to the disease itself and that detailed and prolonged care is required for the management of aneurysmal SAH patients. References

1. Schievink WI, et al.: Neurology 1995, 45:871-874.

2. Frontera JA, et al.: Neurosurgery 2008, 62:80-87.

Outcomes from subarachnoid haemorrhage

I Whitehead, N Azam, S Bonner, J Wright

James Cook University Hospital, Middlesbrough, UK Critical Care 2009, 13(Suppl 1):P106 (doi: 10.1186/cc7270)

Introduction A retrospective assessment of the outcome of patients with poor-grade (World Federation of Neurosurgeons Grades 4 and 5) subarachnoid haemorrhage (SAH) at a regional neurosurgical centre. Previous studies have shown aggressive treatment of patients with a poor clinical grade of SAH is warranted as grading can improve following resuscitation and drainage of cerebrospinal fluid [1,2]. Many units still withdraw treatment if these patients do not improve neurologically in the first 48 to 72 hours [3].

Methods A retrospective analysis of the notes of 116 patients who were recorded as having a diagnosis of SAH was performed between October 2002 and January 2006. Patients were excluded if they were World Federation of Neurosurgeons Grade 1, 2 or 3, had a traumatic SAH or had been incorrectly classified as having SAH. Results Of 116 patients identified with a diagnosis of SAH, 12 patients were excluded as they had a traumatic SAH, one patient had a dissection, one patient a subdural haemorrhage and one patient a basal ganglia haematoma. Thirteen patients had a Glasgow coma score >12. Eighty-eight patients were correctly

deemed poor grade. All poor-grade patients were admitted to the intensive therapy unit. Of these 88 patients, 34 (38.6%) survived. Twenty-four (70%) of the survivors were discharged home, eight (24%) to a care home, and two (6%) remained in hospital. Seventeen (50%) had a Glasgow Outcome Score of 4 or 5. Conclusions Outcomes from poor-grade SAH at James Cook University Hospital compare favourably with published data. Review of the literature shows a wide variation in outcome, between 3.2% and 42%. It is our hypothesis that more aggressive critical care management combined with early intervention should prompt a reevaluation of treatment plans in those with a historically perceived poor outcome. References

1. Bailes JE, et al.: Management morbidity and mortality of poor-grade aneurysm patients. J Neurosurg 1990, 72:559-566.

2. Hutchinson PJ, et al.: Outcome from poor grade aneurysmal subarachnoid haemorrhage - which poor grade subarachnoid haemorrhage patients benefit from aneurysm clipping? Br J Neurosurg 2000, 14:105-109.

3. Wilby MJ, et al.: Cost effective outcome for treating poor-grade subarachnoid haemorrhage. Stroke 2003, 34:2508-2511.

Full Outline of Unresponsiveness compared with Glasgow coma scale assessment and outcome prediction in coma

D Ledoux1, M Bruno2, S Jonlet1, P Choi1, C Schnakers2, F Damas1, B Lambermont1, P Damas1, S Laureys2

1Liege University Hospital, Liège, Belgium; 2Coma Science Group - University of Liege, Belgium

Critical Care 2009, 13(Suppl 1):P107 (doi: 10.1186/cc7271)

Introduction The most widely adopted scale to assess consciousness in severely brain-damaged patients is the Glasgow coma scale (GCS) [1]. Its major shortcomings are the failure to assess the verbal component in intubated patients and the inability to test brainstem reflexes and breathing patterns. In 2005, Wijdicks and colleagues proposed a new coma scale, the Full Outline of Unresponsiveness (FOUR) scale [2], which consists of four components (eye, motor, brainstem, and respiration), each with a maximal score of 4. Our objective was to validate the French version of the new FOUR coma scale in a general ICU and to assess its predictive value as compared with the GCS. Methods We performed FOUR and GCS evaluations in randomized order in 176 acutely brain-injured patients (days from insult to randomization <1 month). We assessed the association between GCS and FOUR scores using the Spearman correlation coefficient. A logistic regression analysis adjusted for age and etiology of coma was performed to assess the link between the studied scores and the outcome based on the Glasgow outcome scale 3 months after injury (n = 63).
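A sketch of how the correlation and the adjusted outcome model described above could be computed is given below; the array names, the numeric coding of etiology and the dichotomisation of the Glasgow outcome scale are assumptions for illustration.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import spearmanr

def compare_scales(four, gcs, poor_outcome, age, etiology_code):
    """Spearman correlation between FOUR and GCS totals, plus a logistic
    model of poor 3-month outcome adjusted for age and coma etiology."""
    rho, p = spearmanr(four, gcs)
    X = sm.add_constant(np.column_stack([four, age, etiology_code]).astype(float))
    fit = sm.Logit(np.asarray(poor_outcome), X).fit(disp=0)
    return {"spearman_rho": rho, "spearman_p": p,
            "odds_ratios": np.exp(fit.params)}  # OR per one-point change
```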

Results The GCS and FOUR showed a significant correlation (r = 0.807). The GCS verbal component was scored 1 in 146 patients; among these, 131 were intubated. The FOUR total scores (corrected for age) showed superior outcome prediction at 3 months (OR = 0.83; 95% CI = 0.70 to 0.98, P = 0.03) as compared with GCS total scores (OR = 0.85; 95% CI = 0.70 to 1.03, P = 0.09). Conclusions The FOUR scale does not need a verbal response, thus allowing complete testing in intubated patients (in our sample 90% of patients showing a GCS V1 score were intubated). Most importantly, the FOUR scale demonstrated a better discrimination between the good (recovery of independent living) and poor neurological status at 3 months as compared with the GCS.

References

1. Teasdale G, Jennett B: Assessment of coma and impaired consciousness. A practical scale. Lancet 1974, 2:81-84.

2. Wijdicks EF, Bamlet WR, Maramattom BV, Manno EM, McClelland RL: Validation of a new coma scale: the FOUR score. Ann Neurol 2005, 58:585-593.

Intravenous immunoglobulins versus plasma exchange in the treatment of Guillain-Barré syndrome

B Charra, A Hachimi, A Benslama, S Motaouakkil

Hôpital Ibn Rochd, Casablanca, Morocco

Critical Care 2009, 13(Suppl 1):P108 (doi: 10.1186/cc7272)

Introduction Guillain-Barré syndrome (GBS) affects one to four per 100,000 people annually. Intravenous immunoglobulins (IVIg) and plasma exchange (PE) are the main treatments for this disease. The purpose of this study is to compare the efficacy of IVIg versus PE in the treatment of GBS in a medical ICU.

Methods This is a prospective, single-center, nonrandomized study, conducted in the medical ICU of the Ibn Rochd university hospital of Casablanca over 5 years (2002 to 2006). We included all patients with GBS who required mechanical ventilation (MV). An electromyogram was performed in all patients and showed axonal demyelination. We defined two groups: group 1 (treated with IVIg: 0.4 g/kg/day for 5 days) and group 2 (treated with PE: four PEs over 10 to 14 days). We collected demographic characteristics, clinical and therapeutic aspects and outcome. We mainly evaluated the onset of motor recovery and weaning from MV. The quantitative variables are expressed as the mean or median ± standard deviation, and the qualitative variables as percentages. The univariate analysis was based on the Pearson chi-square test or the Fisher test for the qualitative variables and the Student t test for the quantitative ones. P <0.05 was considered significant. The statistical analysis was performed with SPSS 11.0 for Windows.

Results Forty-one patients (21 in group 1 and 20 in group 2) were enrolled. The mean age was 37 ± 9 years, with a male predominance (75.4%). The mean length of hospitalization was 32 days, and was shorter in group 1 than in group 2 (P = 0.02). Weaning from MV occurred earlier in group 1 than in group 2 (P = 0.01), as did the onset of motor recovery (P = 0.001).

Conclusions Although the results in the literature are not conclusive, our work - whose most important limitation is the absence of randomization - shows a meaningful difference in MV weaning in favour of the group receiving IVIg compared with the PE group. These encouraging results merit confirmation by controlled, randomized studies.

Determinants of critical illness polyneuropathy in the case of long-term ICU treatment

A Klimasauskas1, I Sereike1, G Kekstas1, A Klimasauskiene2, J Ivaskevicius1

1Vilnius University, Vilnius, Lithuania; 2Vilnius University Hospital, Vilnius, Lithuania

Critical Care 2009, 13(Suppl 1):P109 (doi: 10.1186/cc7273)

Introduction Neuromuscular weakness is a condition which is often found during long-term ICU treatment. Critical illness polyneuropathy (CIP) is the main reason for neuromuscular weakness.

Diagnosis of CIP is difficult in the ICU setting. The causes of, and predisposing factors for, CIP have not been fully studied. The aim of this study was to analyze whether sedation, duration of ventilation, and duration of ICU stay are good determinants of CIP. Methods A prospective investigation of patients treated in the ICU for longer than 7 days during a 6-month period (1 May 2008 to 31 October 2008) was performed. All ICU survivors were included in the study. The APACHE II scores, first ICU day SOFA scores, duration of sedation, amount of sedation, and duration of ventilation were calculated. Electroneuromyography was performed in every patient. Data of patients with (CIP group) and without (control group) CIP were compared.

Results Thirty-seven patients were included in the study. In 16 cases CIP was diagnosed (43.2% of patients). There was no age difference between the two groups (55.37 ± 16.5 years in the CIP group; 51.86 ± 17.92 in the control group; P = 0.55). The APACHE II score in the CIP group was higher than in the control group - 20.31 ± 7 vs. 15.8 ± 5.89 (P = 0.04). The admission-day SOFA score in the CIP group was also higher than in the control group - 7.87 ± 4.05 vs. 5.09 ± 2.52 (P = 0.01). Duration of ICU stay was 20.37 ± 15.54 days in the CIP group and 15.19 ± 10.95 in the control group (P = 0.24). Duration of sedation in the CIP group was 118.68 ± 219.59 hours and in the control group was 78.9 ± 94.22 hours (P = 0.45). The sedation volume in the CIP group was 5,386.8 ± 18,112.04 mg and in the control group was 892.25 ± 1,440.97 mg (P = 0.26). There was no difference in duration of ventilation between the groups (330.62 ± 376.28 hours in the CIP group and 159.92 ± 273.27 hours in the control group; P = 0.11). Conclusions CIP is a frequent complication of long-term ICU treatment (43.2% in our ICU). According to our results, a correlation exists between the APACHE II score, admission-day SOFA score and development of CIP in cases of long (>7 days) ICU treatment. No correlation was found between duration of sedation and ventilation, volume of sedation, or duration of ICU treatment and the development of CIP. Larger studies are needed to establish the determinants of CIP.

Neurally adjusted ventilatory assistance in patients with critical illness polyneuromyopathy

D Tuchscherer1, W Z'Graggen1, A Brunello1, C Passath1, C Sinderby2, J Takala1, SM Jakob1, L Brander1

1University Hospital Bern, Switzerland; 2St Michael's Hospital, Toronto, ON, Canada

Critical Care 2009, 13(Suppl 1):P110 (doi: 10.1186/cc7274)

Introduction Neurally adjusted ventilatory assistance (NAVA) delivers pressure (Paw) in proportion to the electrical activity of the diaphragm (EAdi). It is not known whether EAdi adequately reflects the respiratory drive in patients with critical illness polyneuromyopathy (CIPM) and whether it would be sufficient to deliver assistance using NAVA.

Methods Fifteen invasively ventilated patients (median (quartiles): 66 (59; 73) years old, APACHE II score 19 (17; 24)) with electrophysiologically documented CIPM were studied. A level of adequate unloading (NAVAal) was identified daily based on the characteristic response in Paw and tidal volume (Vt) to stepwise increasing NAVA (titration), as previously described [1]. NAVAal was used for a maximum of 72 hours.

Results NAVAal was implemented in 13 patients for 54 (40; 61) hours. Three patients were liberated from mechanical ventilation during the study. NAVA could not be used in two patients (diaphragmatic myoclonus; excessive respiratory drive). At NAVAal, peak inspiratory EAdi was reduced by 30 (25; 40)% compared

Table 1 (abstract P110)

Day | PaO2/FiO2 | PaCO2 (mmHg) | Mean inspiratory EAdi (μV) | Vt (ml/kg predicted body weight) | Respiratory rate (breaths/min)
NAVAal day 1 | 227 (158; 286) | 36 (30; 45) | 5.6 (3.4; 7.8) | 6.7 (5.8; 8.3) | 30 (23; 34)
NAVAal day 3 | 263 (212; 380)* | 42 (39; 45)* | 5.2 (2.9; 6.6) | 6.5 (5.4; 7.6) | 27 (22; 35)

Data presented as median (quartiles). *P <0.05 vs. day 1.

with the lowest NAVA level used during the titration. The breathing pattern, heart rate, and mean arterial pressure remained stable during NAVA. See Table 1.

Conclusions EAdi was sufficient to use NAVA in most of our patients with moderate to severe CIPM. Implementation of a titrated NAVA level for up to 72 hours resulted in low Vt, improved oxygenation over time, and stable cardiorespiratory function. Acknowledgement Supported by the Swiss National Science Foundation 3200B0-113478/1. Reference

1. Brander L, et al.: Titration and implementation of neurally adjusted ventilatory assist in critically ill patients. Chest 2008 [Epub ahead of print].

Assessment of muscle membrane properties using muscle velocity recovery cycles in patients with critical illness polyneuromyopathy

WJ Z'Graggen1, L Brander2, D Tuchscherer2, A Brunello2, C Passath 2, J Takala2, SM Jakob2, H Bostock3

1Bern University Hospital and University of Bern, Switzerland;

2Bern University Hospital and University Hospital, Bern, Switzerland; 3Institute of Neurology, University College London, UK Critical Care 2009, 13(Suppl 1):P111 (doi: 10.1186/cc7275)

Introduction Muscle weakness and atrophy due to critical illness polyneuromyopathy (CIPM) is common in long-stay intensive care patients. Recent nerve excitability studies suggest that the recovery cycle after a single supramaximal stimulus provides useful information about axonal membrane potential and ion channel function in neuropathies. We previously found that critical illness polyneuropathy is associated with nerve depolarization, and that this depolarization is strongly correlated with serum potassium in patients with renal failure [1]. We have adapted this method to human muscle fibres, by measuring the changes in conduction velocity of muscle fibres [2]. The muscle relative refractory period (RRP) increases and supernormality (SN) decreases in ischaemia, suggesting that these measures may be indicators of membrane potential also in the muscle [2]. The aim of this study was to evaluate muscle RRP and SN in patients with CIPM. Methods Nine patients (age 44 to 73 years) with electrophysiologically proven CIPM were studied on two occasions within 1 week. Multifibre responses to direct muscle stimulation through needle electrodes were recorded from the brachioradialis, and the latency changes measured as conditioning stimuli were applied at interstimulus intervals of 2 to 1,000 ms. RRP and SN were compared with an age-matched control group. Results In patients with CIPM, muscle RRP was abnormally prolonged (6.25 ± 2.74 ms (mean ± SD) vs. 3.27 ± 0.45 ms in healthy subjects; P = 0.0015) and supernormality was reduced (3.6 ± 3.1% vs. 9.3 ± 3.4%; P = 0.013). Moreover, during renal failure (8/18 measurements), muscle supernormality correlated strongly with serum potassium (R = 0.95, P = 0.0004). Conclusions Muscle fibres are depolarized in CIPM. If, as we have previously suggested, nerve membrane depolarization is an

important cause of neuropathy in critical illness, it seems likely that muscle membrane depolarization may be an important cause of myopathy. Serum potassium is an important factor for muscle membrane depolarization in patients with renal failure. References

1. Z'Graggen WJ, et al.: Nerve excitability changes in critical illness polyneuropathy. Brain 2006, 129:2461-2470.

2. Z'Graggen WJ, et al.: Velocity recovery cycles of human muscle action potentials and their sensitivity to ischemia.

Muscle Nerve 2009, in press.

Risk factors for developing hypoglycemia in neurocritical care patients

F Van Iersel1, C Tiemessen2, A Slooter2, C Hoedemaekers1, J Van der Hoeven1

1Radboud University Nijmegen Medical Centre, Nijmegen, the Netherlands; 2University Medical Centre, Utrecht, the Netherlands Critical Care 2009, 13(Suppl 1):P112 (doi: 10.1186/cc7276)

Introduction The use of intensive insulin therapy (IIT) in neurocritical care patients is controversial: IIT failed to improve mortality and morbidity in brain-injured patients in a number of trials. Hypoglycemia increases the risk of death and may contribute to the development of secondary brain injury in these patients. The aim of this study was to identify risk factors for hypoglycemia in critically ill neurological patients.

Methods We performed a retrospective nested case-control study in the ICU of a tertiary-care teaching hospital in neurological ICU patients admitted between January 2007 and July 2007. Neurological ICU patients were defined as patients admitted primarily to the ICU as a result of their neurological illness. Patients were considered as cases if at least one episode of hypoglycemia (defined as a glucose level <3.0 mmol/l (<54 mg/dl)) occurred while admitted to the ICU. Only the first hypoglycemic event (index moment) of a patient was used to match with a control patient. Control patients were randomly selected from the same population, admitted for at least the same duration until the index moment, without previous hypoglycemic events. A number of potential risk factors for the development of hypoglycemia were predefined based on the literature in ICU patients. All variables were analyzed with univariate and multivariate regression analysis, correcting for age, gender and APACHE II score. Results Of the 127 neurological ICU patients, 35 developed hypoglycemia (27.6%). Mean arterial pressure (OR = 0.93; 95% CI = 0.87 to 0.99 per increase of 1 mmHg), having a Sepsis-related Organ Failure Assessment score for hemodynamic instability >1 (OR = 5.53; 95% CI = 1.23 to 24.81), dosage of norepinephrine (OR = 8.39; 95% CI = 1.38 to 51.12 per increase of 1 mg/hour), creatinine clearance (OR = 0.98; 95% CI = 0.96 to 1.00 per increase of 1 ml/min) and gastric residual volume without adjusting the insulin dosage (OR = 11.66; 95% CI = 1.24 to 109.20) were independently associated with a risk for developing hypoglycemia.

Conclusions Hypoglycemia occurs in a significant proportion of neurological ICU patients. We suggest more frequent control of blood glucose values, especially in patients suffering from hemodynamic instability, renal failure and gastroparesis.

Relationship between effective osmolality changes and neurological status during treatment for severe paediatric diabetic ketoacidosis

S Tibby, A Durward, L Ferguson, H Bangalore, I Murdoch

Evelina Children's Hospital, Guy's & St Thomas' NHS Trust, London, UK

Critical Care 2009, 13(Suppl 1):P113 (doi: 10.1186/cc7277)

Introduction Cerebral oedema is a common, life-threatening complication of paediatric diabetic ketoacidosis (DKA). Several risk factors at presentation are known, including urea and pCO2. It has recently been suggested that changes in effective osmolality during treatment may also convey risk, but this has not been confirmed due to a lack of prospective data.

Methods A prospective observational study over 24 hours of 50 children admitted to a regional ICU with severe DKA (median pH at presentation 6.90). Development of cerebral oedema was defined clinically as a deterioration in Glasgow coma score of >2 points during the 24 hours after presentation (equating to a Glasgow coma scale <8 to 10). Changes in effective osmolality, uncorrected and corrected sodium were modelled using linear (random coefficients) mixed models, with adjustment for baseline covariates of urea and pCO2. Results Consistent with previous studies, baseline urea was higher and pCO2 lower in patients who subsequently showed neurological deterioration. All three osmolality variables (effective osmolality, uncorrected and corrected sodium) were similar at baseline between the two groups. The best-fitting model utilized corrected sodium (lowest Akaike information criterion). This showed a significant interaction effect (P = 0.01), in that corrected sodium increased over time at a rate of 0.35 mmol/l/hour in patients who did not develop neurological symptoms, but did not change in those who did.
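For reference, the quantities modelled above are commonly computed with formulas like the following; the abstract does not state the exact definitions used, so the constants here are assumptions based on common practice in DKA studies.

```python
def effective_osmolality(na_mmol_l: float, glucose_mmol_l: float) -> float:
    """Effective osmolality (mOsm/kg) by one commonly used definition,
    excluding urea because it is osmotically ineffective."""
    return 2 * na_mmol_l + glucose_mmol_l

def corrected_sodium(na_mmol_l: float, glucose_mmol_l: float) -> float:
    """Sodium corrected for hyperglycaemia, using a widely quoted factor of
    ~2.4 mmol/l per 5.6 mmol/l (100 mg/dl) of glucose above normal; the
    authors' exact correction factor is not given in the abstract."""
    return na_mmol_l + 2.4 * (glucose_mmol_l - 5.6) / 5.6
```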

Conclusions Lack of change in corrected sodium is associated with neurological deterioration in the treatment of severe DKA in children. This variable may be useful if incorporated into treatment guidelines.

Continuous monitoring of blood parameters in intensive care patients

A Weinstein1, O Herzenstein1, E Gabis1, I Kagan2, P Singer2

1Orsense Ltd, Nes Ziona, Israel; 2Rabin Medical Center, Petah Tikva, Israel

Critical Care 2009, 13(Suppl 1):P114 (doi: 10.1186/cc7278)

Introduction Monitoring of blood parameters, such as glucose, hemoglobin and oxygen saturation, is essential in critically ill patients. The current invasive methods are not frequent enough for efficient tight glycemic control, and result in a high rate of hypoglycemia. In addition, there is a growing need for a continuous hemoglobin measurement in postoperative care units and ICUs. The purpose of this study is to evaluate the feasibility of the fully noninvasive blood monitor (NBM device; OrSense Ltd, Nes Ziona, Israel) for continuous monitoring of glucose, hemoglobin and oxygen saturation in critically ill patients.

Methods The study was conducted on 14 patients (seven female, seven male, ages 34 to 92 years) in the ICU of the Rabin Medical Center, upon receipt of informed consent. The NBM probe was

placed on patients' thumbs, where it performed measurements for up to 24 hours, with readings every 10 minutes. Patient compliance was good and no adverse effects were identified. The results obtained from the NBM device were compared with blood samples taken through an arterial line every 30 to 60 minutes and were analyzed with a blood gas machine (ABL 700; Radiometer, Copenhagen, Denmark).

Results A total of 208 paired data points were obtained in the trial. At each point, an algorithm based on a uniform model (with personal glucose calibration) was used to calculate the three blood parameters. The reference glucose range was 62 to 369 mg/dl. The median relative absolute difference was 7.3%, and a Clarke error grid analysis showed that 95.2% of the measurements fell within zones A (74.5%) and B (20.7%). The reference range of the hemoglobin was 7 to 14.5 g/dl and the median absolute error obtained was 1 g/dl. Oxygen saturation levels were tracked simultaneously with a mean error of 2.5%.
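The headline accuracy metric quoted above, the median relative absolute difference between device and reference glucose, can be computed as follows; the array names are illustrative.

```python
import numpy as np

def median_relative_absolute_difference(device, reference):
    """Median relative absolute difference (%) between paired device and
    reference glucose values."""
    device = np.asarray(device, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return float(np.median(np.abs(device - reference) / reference) * 100)
```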

Conclusions The present study indicates the potential use of the noninvasive NBM for continual, accurate, safe, and easy-to-use multiparameter monitoring in critically ill patients. The device holds the promise of improving patient care and survival, as well as reducing staff workload.

Morning glucose correlates poorly with daily mean glucose in cardiac surgical ICU patients: impact of ultradian variation of glucose

D Hagg, S Smith, K Oveson, M Slater, H Song, A Ahmann

OHSU, Portland, OR, USA

Critical Care 2009, 13(Suppl 1):P115 (doi: 10.1186/cc7279)

Introduction Treatment of hyperglycemia with an insulin infusion protocol (IIP) has improved outcomes in ICU and cardiac surgery patients [1]. Many studies have reported morning glucose values as representative of whole-day glucose control. We recently demonstrated that in a mixed ICU population this assumption does not hold [2]. Since it has been proposed that glucose control may be especially important in cardiac surgery ICU patients, we asked whether morning glucose accurately represented whole-day glucose and determined whether glucose varied over the course of the day in this patient group.

Methods A prospective, observational single-center study in the cardiac surgical ICU of a university tertiary-care hospital. All glucose measurements in cardiac surgical ICU patients receiving an IIP targeting blood glucose 80 to 110 mg/dl were recorded from 1 June 2006 to 28 February 2007.
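A sketch of the comparison reported in the Results below (the 06:00 value versus the mean of the remaining values of the same day, with Bland-Altman limits of agreement) is given here; the pairing of values per patient-day is assumed to have been done upstream.

```python
import numpy as np

def morning_vs_rest_of_day(morning, rest_of_day_mean):
    """Correlation and Bland-Altman agreement between the 06:00 glucose and
    the mean of the remaining glucose values of the same patient-day."""
    morning = np.asarray(morning, dtype=float)
    rest = np.asarray(rest_of_day_mean, dtype=float)
    r = np.corrcoef(morning, rest)[0, 1]
    diff = morning - rest
    bias, sd = diff.mean(), diff.std(ddof=1)
    return {"r2": r ** 2, "bias": bias,
            "limits_of_agreement": (bias - 1.96 * sd, bias + 1.96 * sd)}
```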

Results We recorded 12,109 glucose measurements from 125 patients on IIPs. The glucose measurements were widely distributed with 1%, 7%, 47%, 33%, 9% and 2% in the ranges <60, 61 to 79, 80 to 110, 111 to 150, 151 to 200 and >200 mg/dl, respectively. After 8 hours of IIP, the proportion of values in the target range increased to 49%. The 06:00-hour glucose values were lower than other values (mean ± SD: 105 ± 24 mg/dl (n = 482) vs. 113 ± 33 mg/dl (n = 10,688); P <0.0001). The 06:00 values were poorly correlated with the average glucose recorded for the remainder of the day (r² = 0.029), which was confirmed by Bland-Altman analysis. The time-averaged glucose data exhibited ultradian variation, peaking at 10:00 and 19:00 hours. The insulin dose varied with a similar pattern.

Conclusions In cardiothoracic ICU patients receiving an IIP, glucose exhibited an ultradian pattern with peaks at 10:00 and 19:00 hours, and was lower in the early morning than during the remainder of the day. Consideration of this ultradian variation may avoid hypoglycemic and hyperglycemic episodes and so facilitate better glucose control with an IIP. Studies targeting control of hyperglycemia should report mean blood glucose values for the entire day rather than early morning values.

References

1. Dellinger RP, et al.: Intensive Care Med 2004, 30:536-555.

2. Smith SM, et al.: Diabetes Care 2007, 30:2503-2505.
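The mismatch between the 06:00 value and the rest-of-day mean reported in the Results was confirmed by Bland-Altman analysis. A minimal sketch of Bland-Altman bias and 95% limits of agreement, assuming hypothetical paired per-patient values rather than the study data:

    import numpy as np

    def bland_altman(a, b):
        """Return the mean bias and 95% limits of agreement between two paired measurements."""
        a, b = np.asarray(a, float), np.asarray(b, float)
        diff = a - b                   # per-patient difference
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)
        return bias, bias - half_width, bias + half_width

    # Hypothetical per-patient values (mg/dl): 06:00 glucose vs. mean of the remaining day
    morning = [98, 120, 105, 140, 92]
    rest_of_day = [115, 118, 130, 152, 101]

    bias, lower, upper = bland_altman(morning, rest_of_day)
    print(f"bias = {bias:.1f} mg/dl, limits of agreement = ({lower:.1f}, {upper:.1f}) mg/dl")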

Hypoglycaemia is associated with a higher mortality in critically ill patients

A Jose Pereira1, A Biasi Cavalcanti1, F Pereira Almeida1, T Correa1, J Telles2, M Lobato1, D Nishimura1, R Da Hora Passos1, E Silva1

1Hospital Israelita Albert Einstein, Sao Paulo, Brazil; 2Hospital Portugues, Salvador, Brazil

Critical Care 2009, 13(Suppl 1):P116 (doi: 10.1186/cc7280)

Introduction Hypoglycaemia, a common complication of strict glucose control in critically ill patients, has controversial effects on mortality. The hyperglycaemic index (HGI) takes into account the unequal time distribution of blood glucose sampling and is a better predictor of death than other methods to quantify hyperglycaemia [1]. By analogy with the HGI, we defined the hypoglycaemic index (HGI-60) as the area above the glucose curve and below 60 mg/dl divided by the length of ICU stay. The objective of the study was to evaluate the effects of hypoglycaemia on inhospital mortality in critical care patients using this new method for quantification of hypoglycaemia.

Methods A retrospective study performed in four mixed ICUs. From 2004 through 2006, patients treated with continuous insulin therapy were included. Admission type, sex, age, the occurrence of hypoglycaemia <60 mg/dl, APACHE II score and outcome (hospital death) were recorded. The HGI-60 was calculated after simple interpolation of all glucose values measured during the ICU stay. The relations between independent variables and hospital mortality were determined by logistic regression analysis.

Results One hundred and ninety-six patients were included. The mean age was 60.5 years and 56% were male. The mean APACHE II score was 22.8 and the hospital mortality was 51.5%. The median HGI-60 was 0.02 mg/dl/hour (interquartile range 0 to 0.13) in survivors versus 0.07 mg/dl/hour (interquartile range 0 to 0.23) in nonsurvivors (P = 0.01). Logistic regression analysis showed that the HGI-60 was associated with higher hospital mortality independently of the APACHE II score (OR = 7.2; 95% CI = 1.6 to 32.1; P = 0.009). The HGI-60 had a higher area under the receiver operating characteristic curve (0.70) than the occurrence of hypoglycaemia (0.67).

Conclusions Hypoglycaemia during the ICU stay is a marker of increased mortality. The hypoglycaemic index (HGI-60) is a better predictor of mortality than the occurrence of hypoglycaemia.

Reference

1. Vogelzang M, et al.: Hyperglycaemic index as a tool to assess glucose control: a retrospective study. Crit Care 2004, 8:R122-R127.
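The HGI-60 defined in the Methods above is the area between the 60 mg/dl line and the interpolated glucose curve, wherever glucose falls below that line, divided by the length of ICU stay. A minimal sketch of this calculation using the trapezoidal rule, assuming hypothetical timestamped glucose values; the names and the time base are illustrative, not the authors' code:

    import numpy as np

    def hypoglycaemic_index(times_h, glucose_mgdl, threshold=60.0):
        """Area of the interpolated glucose curve below `threshold`, divided by the length of stay.
        Approximation: segments crossing the threshold are not split at the exact crossing point."""
        t = np.asarray(times_h, float)
        g = np.asarray(glucose_mgdl, float)
        below = np.clip(threshold - g, 0.0, None)        # depth below threshold, zero when at or above it
        area = np.sum((below[1:] + below[:-1]) / 2.0 * np.diff(t))  # trapezoidal rule
        return area / (t[-1] - t[0])                     # time-averaged depth below the threshold

    # Hypothetical ICU stay: hours since admission and measured glucose (mg/dl)
    hours = [0, 4, 8, 12, 24, 48]
    glucose = [110, 72, 55, 64, 90, 120]
    print(f"HGI-60 ≈ {hypoglycaemic_index(hours, glucose):.3f}")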

Insulin pseudo-resistance from adsorption to burettes

R Nagappan, S Lingam, C Corallo, N Warrior

Box Hill Hospital, Melbourne, Australia

Critical Care 2009, 13(Suppl 1):P117 (doi: 10.1186/cc7281)

Introduction A postoperative diabetic patient, stabilized on 96 units/day Actrapid by syringe pump, when changed to a burette

Table 1 (abstract P117)

Time (min) Syringe concentration (units/100 ml) Burette concentration (units/100 ml)

120 67.2 52.4

240 68.1 44

360 63.3 41.2

540 52.9 18.8

administration set required 1,200 units/day to achieve glycaemic control. The problem resolved when reverting to a syringe. An ex vivo experiment confirmed variable adsorption to syringes and burettes.

Methods Postulating insulin adsorption to the inline burette of the infusion set as the cause of this excessively high insulin requirement, we confirmed this by an in vitro experiment comparing (a) a Terumo® syringe with B-Braun extension tubing and (b) a B-Braun Dosifix® inline burette system. Neutral insulin (Actrapid) at a concentration of 1 unit in 1 ml isotonic sodium chloride was run at 10 units/hour. The insulin concentration at the point of exit from each infusion system was analysed periodically (Table 1).

Results Immediately after priming, there was approximately a 56% loss of insulin from the solution being delivered from the burette system, compared with 37% from the syringe system. This syringe-burette variability persisted throughout our study. At 9 hours, there had been an 81% loss of insulin from the burette system compared with 47% from the syringe. A second similar experiment also confirmed this observation.

Conclusions Data on stability of insulin solutions and infusion systems are sparse. The insulin concentration is affected more when administered via inline burette infusion sets compared with syringe pump infusion lines [1]. When managing hyperglycaemia, apart from endogenous causes and true insulin resistance as reasons for escalating insulin requirements, insulin adsorption to the infusion systems should also be considered. More studies are required to elucidate this further.

Reference

1. Corallo C, et al.: Aust J Hosp Pharm 1995, 25:129-135.

High blood glucose variability in acute phase is one of the most important risk factors relating to the outcome in acutely ill severe patients with glucose intolerance

M Hoshino1, Y Haraguchi2, I Mizushima3, M Sakai4, S Kajiwara1, M Takagi1

1Shisei Hospital, Saitama, Japan; 2National Hospital Disaster Medical Center, Tokyo, Japan; 3Nippon Engineering College, Tokyo, Japan; 4Tokyo Women's Medical University Hospital, Tokyo, Japan

Critical Care 2009, 13(Suppl 1):P118 (doi: 10.1186/cc7282)

Introduction High blood glucose (BG) variability is considered to affect the prognosis of acutely ill patients. However, the significance of BG variability is not clearly elucidated. We investigated the significance of BG variability with a bedside-type artificial pancreas (AP).

Methods Strict BG control was performed by an AP, the STG22. Patients were evaluated at early (E) phase and late (L) phase (1 week after E phase). The number of patients was 83, among which 67 patients with daily mean BG (BGm) below 200 mg/dl were selected, because patients with BGm above 200 mg/dl had extremely high mortality, or 56%. They were classified into two groups, a group with high BG variability and a group without, based on the daily standard deviation of BG (BGsd). The two groups were compared regarding the following items: (1) mortality; (2) BGm, daily maximal and minimal BG (BGmax, BGmin), and daily BG difference (BGd); (3) demographic data; (4) administered glucose and insulin; and (5) degree of organ dysfunction and SOFA score.

Results (1) In the E phase, patients with BGsd above 14 mg/dl (group H, n = 26, mean BGsd 25 ± 11) had significantly higher mortality (46% vs. 17%) as compared with those with BGsd below 14 mg/dl (group N, n = 41, mean BGsd 9 ± 3). In the L phase, mortality was not significantly different between the groups with higher BGsd and those with lower BGsd at any point. (2) There was no significant difference in BGm between group H and group N (173 ± 24 vs. 173 ± 15). (3) BGmax and BGd in group H were significantly higher than those in group N (228 ± 42 vs. 191 ± 18, and 93 ± 39 vs. 37 ± 14, respectively). BGmin in group H was significantly lower than that in group N (132 ± 28 vs. 154 ± 16). (4) There was no significant difference in the other abovementioned items between group H and group N.

Conclusions High BG variability in the acute phase proved to be one of the most important risk factors for mortality. Therefore, strict BG control focusing on stabilization of BG variability, especially in the acute phase, was considered one of the important therapies. The AP was most reliable and therefore useful for clarifying the significance of strict BG control, and strict BG control using an AP should be considered in order to obtain a better outcome.

Effect of increasing intravenous glucose load in the presence of normoglycemia on outcome and metabolism in critically ill rabbits

S Derde, I Vanhorebeek, E Ververs, V Darras, E Van Herck, G Van den Berghe

Catholic University Leuven, Belgium

Critical Care 2009, 13(Suppl 1):P119 (doi: 10.1186/cc7283)

Introduction Endocrine disturbances during critical illness lead to a feeding-resistant wasting syndrome, characterised by profound protein breakdown, promoting delayed recovery and poor outcome. Parenteral nutrition failed to counteract the hypercatabolic state, possibly due to aggravation of the detrimental hyperglycemic response to critical illness. In our rabbit model of prolonged critical illness we investigated the impact of varying intravenous glucose load, while maintaining normoglycemia, on mortality, organ damage, and catabolism/anabolism.

Methods Critically ill rabbits were randomised into a fasting group, a standard parenteral nutrition group, and two groups receiving either an intermediate or high additional amount of intravenous glucose within the physiological range, all maintained normoglycemic with insulin. These normoglycemic groups were compared with a hyperglycemic group (similar high glucose load as the last normoglycemic group) and with healthy rabbits. Protein and lipid load was equal for all fed groups.

Results Varying intravenous glucose load did not affect the mortality or organ damage, provided normoglycemia was maintained. Fasted critically ill rabbits lost weight, which was attenuated by increasing intravenous glucose load. As compared with healthy rabbits, mRNA expression of several components of the ubiquitin-proteasome pathway was elevated in skeletal muscle of fasted critically ill rabbits, which was counteracted by intravenous feeding. Except in the normoglycemic group with intermediate glucose load, circulating insulin-like growth factor 1 and thyroid hormone levels decreased in all groups, most pronounced in hyperglycemic rabbits.

Conclusions Provided normoglycemia is maintained, increasing intravenous glucose within the physiological range is safe for organ function and survival of critically ill rabbits and reduces catabolism compared with fasting.

Reduction of glucose and insulin concentrations during in vitro incubation of whole blood

S Beitland, H Opdahl, T Aspelin, L Saetre, T Lyberg

Ullevaal University Hospital, Oslo, Norway

Critical Care 2009, 13(Suppl 1):P120 (doi: 10.1186/cc7284)

Introduction Incubation of whole blood has been used in numerous in vitro investigations. The purpose of the present study was to test the hypothesis that glucose and insulin concentrations declined during incubation.

Methods Six young, healthy and fasting males donated blood. Aliquots of heparin-anticoagulated whole blood were supplemented with different quantities of insulin (Actrapid; Novo Nordisk, Bagsvaerd, Denmark) and bacterial endotoxin (Escherichia coli lipopolysaccharide serotype 026:B6; Difco Laboratories, Detroit, MI, USA). Aliquots were incubated for 6 hours at 37°C in an atmosphere of humidified 5% CO2 and 95% air. Concentrations of blood glucose were measured every hour, whereas insulin was measured at baseline and 6 hours. The Wilcoxon signed-rank test was used to compare medians.

Results The glucose concentration in aliquots to which insulin 30 nmol/l and lipopolysaccharide 1,000 ng/ml were added decreased linearly and approached zero after 6 hours (median 0.6 vs. 4.75 mmol/l, P = 0.027; presented as a line chart of medians with range in Figure 1). The insulin content in blood samples without any additions was reduced by more than 50% during incubation (median 22.5 vs. 48.0 pmol/l, P = 0.028; depicted as box plots with median lines, 25th to 75th percentile boxes and 10th to 90th percentile error bars in Figure 1). Similar glucose and insulin measurements were performed during incubation of other selected aliquots. The reductions of glucose and insulin content were analogous to the results above; the magnitude of reduction seemed to be independent of the addition of insulin and/or lipopolysaccharide.

Figure 1 (abstract P120)

Glucose and insulin concentrations during incubation.

Conclusions The concentration of glucose was reduced to almost zero during 6 hours of incubation; the decline was probably due to glycolysis in blood cells rather than the effects of insulin. The resulting hypoglycaemia may affect cellular functions, and addition of glucose should therefore be performed during in vitro incubations. The decline in insulin content was unexpected, as insulin is mostly degraded in the liver and kidney, but to some extent also in blood cells and extracellularly.

Surprising result after evaluating a nurse-driven guideline on blood glucose management

R Schnabel, J Zwaveling, D Bergmans

Maastricht University Medical Center, Maastricht, the Netherlands Critical Care 2009, 13(Suppl 1):P121 (doi: 10.1186/cc7285)

Introduction Whether blood glucose management is a cornerstone in reducing ICU mortality is still under debate. Nevertheless, conscientious glucose regulation by intensive insulin therapy has become an indicator in the evaluation of the medical quality of ICUs in the Netherlands. In recent years, however, there have been several reports of increased hypoglycemic events and clinical harm due to stringent glucose regulation [1]. The general ICU in our hospital consists of two units with a comparable patient population with separate nursing staff but the same medical staff. Both units work with the same nurse-driven guideline for blood glucose management, providing some liberty in the decision-making process. The frequency of glucose measurement with a point-of-care device is left to the nurses' discretion. A glucose level of 4.5 to 7 mmol/l was set as the target value on both wards.

Methods After 2 years the total number of tests done on each ward, the achieved average glucose level, and the number of hypoglycemic events (glucose <2.5 mmol/l) and hyperglycemic events (glucose >8 mmol/l) according to our hospital laboratory database were evaluated.

Results Unit 1 had a total of 700 patients with 5,717 ICU-days and performed 27,218 tests, corresponding to an average of 4.8 tests/patient/day. Unit 1 achieved an average glucose level of 6.9 mmol/l with 0.6% hypoglycemia and 32% hyperglycemia. Unit 2 had a total of 504 patients with 5,384 ICU-days and performed 39,400 tests, corresponding to an average of 7.3 tests/patient/day. Unit 2 achieved an average glucose level of 6.4 mmol/l with 0.9% hypoglycemia and 20% hyperglycemia.

Conclusions Using a nurse-driven protocol, both units achieved comparable and satisfying average glucose levels with acceptable numbers of hypoglycemic events. There was a striking difference in the number of tests performed between the two wards depending on interpretation of the same guideline.

Reference

1. Wiener R, et al.: Benefits and risks of tight glucose control in critically ill adults: a meta-analysis. JAMA 2008, 300: 933-943.

Evaluation of nursing perceptions about three insulin protocols for blood glucose control in critical care

T Correa, F Pereira de Almeida, A Biasi Cavalcanti, A Jose Pereira, E Silva

Hospital Israelita Albert Einstein, Sao Paulo, Brazil Critical Care 2009, 13(Suppl 1):P122 (doi: 10.1186/cc7286)

Introduction To implement a tight glycemic control protocol in the ICU, it is essential to obtain active nurse involvement [1,2]. Our objective was to evaluate nurses' perceptions about three different blood glucose control protocols for critically ill patients.

Methods As part of a randomized control trial comparing three blood glucose control protocols in ICU patients, we issued a questionnaire to all nurses who participated in the study to evaluate their perceptions of protocol efficacy, benefits, safety, risks and feasibility, and to ask which protocol they would like to see adopted in their ICU. The randomized control trial arms were: a computer-assisted insulin protocol (CAIP) with continuous insulin infusion to maintain blood glucose between 100 and 130 mg/dl; a Leuven protocol with insulin infusion to maintain blood glucose between 80 and 110 mg/dl; and conventional treatment with subcutaneous insulin if glucose >150 mg/dl.

Results Sixty nurses answered the questionnaires. The CAIP was considered the most efficient by 57% of the nurses. About 58% of the nurses evaluated its performance as good or very good, compared with 22% for the Leuven protocol (P <0.001) and 40% for conventional treatment (P = 0.08). The CAIP was considered easier to use than the Leuven protocol (P <0.001) and as easy as conventional treatment (P = 0.78). Fifty-six percent of the nurses chose the CAIP as the protocol they would like to be adopted in their institution.

Conclusions The CAIP was more efficacious, safer and easier to use than the Leuven protocol. Compared with conventional treatment, the feasibility and safety of the CAIP were considered similar. Most nurses chose the CAIP as the protocol they would like to be adopted in their ICU.

References

1. Preston S, et al.: Introducing intensive insulin therapy: the nursing perspective. Nurs Crit Care 2006, 11:75-79.

2. Aragon D: Evaluation of nursing work effort and perceptions about blood glucose testing in tight glycemic control. Am J Crit Care 2006, 15:370-377.

Impact of a simple computer alert on the quality of tight glycemic control

E Vander Stichele, W De Becker, P Wouters, D Cottem, G Van den Berghe, G Meyfroidt

UZ Leuven, Belgium

Critical Care 2009, 13(Suppl 1):P123 (doi: 10.1186/cc7587)

Introduction Tight glycemic control (TGC) in the ICU is difficult, and is associated with an increased risk of hypoglycemia [1]. We use a nurse-wise insulin titration protocol for TGC in our ICU. The purpose of this study was to examine the impact of a simple computer alert on the quality of TGC.

Methods An alert was created with the EventManager® of MetaVision®. The nurses received a pop-up message on the bedside workstation, with a simple suggestion for the timing of the next measurement and a nonspecific instruction to check caloric intake and insulin dose, at the following blood glucose (BG) thresholds: BG >180 mg/dl, >110 mg/dl, <80 mg/dl, <60 mg/dl. When BG was <40 mg/dl, an alert was sent to all workstations, and to both doctors and nurses. The alert was implemented on 1 August 2007. We performed an observational cohort study, including all adults (>18 years) who were in our ICU between 31 January and 31 July 2007 (control group, n = 731), and between 31 August 2007 and 6 February 2008 (alert group, n = 654). StatView® was used for statistical analysis.

Results The mean BG per patient, the glycemic penalty index (GPI) [2] and the hyperglycaemic index (HGI) [3] were significantly lower after implementation of the alert.

Table 1 (abstract P123)

Control Alert P value

Mean BG 116 ± 17 114 ± 17 0.003

GPI 22 ± 11 21 ± 11 0.037

HGI (mg/dl) 14 ± 16 12 ± 13 0.005

Samples/patient 58 ± 93 57 ± 92 0.680

Patients, BG <40 mg/dl 6.6% (48) 4.0% (26) 0.032

Data presented as the mean ± SD or percentage (count).

There were fewer patients in the alert group who experienced at least one episode of BG <40 mg/dl. The number of BG samples drawn per patient was similar in both groups (Table 1).

Conclusions Even in an environment where TGC is performed well, a simple computer alert can further improve BG level control and reduce the risk of hypoglycemia, without increasing the BG sample rate.

References

1. Van den Berghe G, et al.: Intensive insulin therapy in critically ill patients. N Engl J Med 2001, 345:1359-1367.

2. Van Herpe T, et al.: Glycemic penalty index for adequately assessing and comparing different blood glucose control algorithms. Crit Care 2008, 12:R24.

3. Vogelzang M, et al.: Hyperglycaemic index as a tool to assess glucose control: a retrospective study. Crit Care 2004, 8:R122-R127.
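The alert described in the Methods of this abstract is essentially a mapping from blood glucose thresholds to a message and a set of recipients, plus a suggested time for the next measurement. A minimal sketch of such threshold logic; the recheck intervals and message wording are hypothetical placeholders, as the abstract does not state the settings configured in the MetaVision EventManager rule:

    def glucose_alert(bg_mgdl):
        """Map a blood glucose value (mg/dl) to an alert, mirroring the thresholds in the Methods.
        Recheck intervals and wording are illustrative placeholders, not the study's settings."""
        if bg_mgdl < 40:
            return {"recipients": "all workstations, doctors and nurses", "recheck_min": 15,
                    "message": "severe hypoglycaemia: check caloric intake and insulin dose"}
        if bg_mgdl < 60:
            return {"recipients": "bedside nurse", "recheck_min": 30,
                    "message": "hypoglycaemia: check caloric intake and insulin dose"}
        if bg_mgdl < 80:
            return {"recipients": "bedside nurse", "recheck_min": 60,
                    "message": "below target: check caloric intake and insulin dose"}
        if bg_mgdl > 180:
            return {"recipients": "bedside nurse", "recheck_min": 60,
                    "message": "marked hyperglycaemia: check caloric intake and insulin dose"}
        if bg_mgdl > 110:
            return {"recipients": "bedside nurse", "recheck_min": 120,
                    "message": "above target: check caloric intake and insulin dose"}
        return None  # within 80 to 110 mg/dl: no alert

    print(glucose_alert(35))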

Which variables affect strict glycaemic control with intensive insulin therapy in postoperative/post-traumatic critically ill patients?

M Weiss1, M Kron2, B Hay2, M Taenzer1, M Huber-Lang1, P Radermacher1, M Georgieff1

1University Hospital, Ulm, Germany; 2Institute of Biometrics, Ulm, Germany

Critical Care 2009, 13(Suppl 1):P124 (doi: 10.1186/cc7288)

Introduction This study was performed to determine the effect of variables supposed to affect optimal blood glucose concentrations between 80 and 150 mg/dl in postoperative/post-traumatic patients.

Methods From January 2007 to December 2007, 826 postoperative/post-traumatic critically ill patients admitted to a university adult ICU performing intensive insulin therapy were surveyed daily using computer assistance with respect to minimal and maximal daily blood glucose concentrations and insulin therapy. The variables age, sex, sepsis, neurosurgical patient, steroids, adrenaline and/or noradrenaline infusion rate, acute renal failure, liver function assessed by the Model for End-stage Liver Disease score [1], organ dysfunctions reflected by the Sequential Organ Failure Assessment score [2], and severity of disease by the Simplified Acute Physiology Score II [3] were monitored.

Results Seven hundred and sixty-four patients with an ICU stay >48 hours were eligible for evaluation. In multiple logistic regression with backward elimination to determine the most relevant parameters, sepsis (OR = 1.2, with corresponding 95% CI = 1.1 to 1.4), neurosurgical patients (OR = 1.6, CI = 1.3 to 1.9), steroids (OR = 1.5, CI = 1.2 to 1.9), noradrenaline infusion (OR = 1.4, 95% CI = 1.2 to 1.6), and age (per year) (OR = 1.02, CI = 1.01 to 1.02) were associated with an increased risk of not lying within the optimal blood glucose range of 80 to 150 mg/dl (P <0.01).

Conclusions Sepsis, neurosurgery, steroids, catecholamine infusions and age may be associated with increased risk for difficult blood glucose control in postoperative/post-traumatic patients.

References

1. Kamath PS, Wiesner RH, Malinchoc M, et al.: A model to predict survival in patients with end-stage liver disease. Hepatology 2001, 33:464-470.

2. Vincent JL, de Mendonca A, Cantraine F, et al.: Use of the SOFA score to assess the incidence of organ dysfunction/failure in intensive care units: results of a multicenter, prospective study. Working group on 'sepsis-related problems' of the European Society of Intensive Care Medicine. Crit Care Med 1998, 26:1793-1800.

3. Le Gall JR, Lemeshow S, Saulnier F: A new Simplified Acute Physiology Score (SAPS II) based on a European/North American multicenter study. JAMA 1993, 270:2957-2963.
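The backward elimination described in the Methods above can be implemented by repeatedly refitting the logistic model and dropping the least significant predictor. A minimal sketch using statsmodels on simulated data; the variables shown are only a subset of those listed in the Methods, and this is not the authors' analysis code:

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    def backward_elimination(X, y, alpha=0.05):
        """Fit a logistic regression and iteratively drop the predictor with the
        largest p-value until all remaining predictors satisfy p < alpha."""
        cols = list(X.columns)
        while cols:
            model = sm.Logit(y, sm.add_constant(X[cols])).fit(disp=0)
            pvals = model.pvalues.drop("const")
            worst = pvals.idxmax()
            if pvals[worst] < alpha:
                return model
            cols.remove(worst)
        return None

    # Simulated data: outcome = 1 if a patient-day fell outside the 80 to 150 mg/dl target
    rng = np.random.default_rng(0)
    n = 300
    X = pd.DataFrame({
        "sepsis": rng.integers(0, 2, n),
        "steroids": rng.integers(0, 2, n),
        "age": rng.normal(60, 15, n),
    })
    logit = -2 + 0.4 * X["sepsis"] + 0.3 * X["steroids"] + 0.02 * (X["age"] - 60)
    y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    final = backward_elimination(X, y)
    if final is not None:
        print(np.exp(final.params.drop("const")))  # odds ratios of the retained predictors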

Comparison of conventional measures of glucose control versus the area under the curve from a continuous glucose monitoring device in critical care patients

K Fitousis, M Sirimaturos, S Mannan, D Hamilton, S Hendricks, M Liebl

The Methodist Hospital, Houston, TX, USA

Critical Care 2009, 13(Suppl 1):P125 (doi: 10.1186/cc7289)

Introduction In critical illness, hyperglycemia is a frequent complication resulting from metabolic and hormonal changes. Recent findings suggest increased glycemic variability may confer a strong independent risk of mortality in the critically ill [1]. Conventional measurements of glucose control include morning glucose, mean daily glucose, percentage of glucose readings in the goal range, and the hyperglycemic index (HGI) [2]. The objective of this study was to determine the most appropriate method for assessing glycemic control in cardiovascular (CV) surgery patients.

Methods Data were obtained from a continuous glucose monitoring system (Medtronic CGMS® System Gold™; Medtronic Diabetes, Northridge, CA, USA) database containing continuous glucose monitoring system, fingerstick, and morning laboratory blood glucose values from adult CV ICU patients. A total of 23 patients contributed to the dataset. The area under the curve every 24 hours was calculated to determine the HGI. The mean and median daily glucose, percentage of glucose readings within and above the goal range (80 to 115 mg/dl) and the mean morning glucose were calculated. Statistical analysis was performed utilizing Spearman's rho correlation to determine which measurement of glucose control correlates best with the HGI. The HGI served as the comparator, as it is a validated and comprehensive method of assessing glucose control over time.

Results See Table 1.

Table 1 (abstract P125)

                     Mean glucose     Median glucose   % glucose 80 to 115 mg/dl   % glucose >115 mg/dl   Morning glucose
Spearman's rho (P)   0.735 (<0.001)   0.519 (<0.01)    -0.578 (<0.001)             0.494 (<0.01)          0.196 (0.26)

Conclusions Mean glucose was the most reflective and practical method for determining glycemic control in critically ill CV surgery patients. The morning glucose correlated the least with the HGI, demonstrating that morning glucose may not be the most appropriate method to define glycemic control.

References

1. Krinsley JS: Crit Care Med 2008, 36:3008-3013.

2. Vogelzang M, et al.: Crit Care 2004, 8:R122-R127.
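Each conventional measure in Table 1 was compared with the HGI using Spearman's rank correlation. A minimal sketch of that comparison with scipy, assuming hypothetical per-patient summary values in place of the study data:

    from scipy.stats import spearmanr

    # Hypothetical per-patient summaries (the study derived these from continuous monitoring data)
    hgi = [4.2, 18.0, 9.5, 1.1, 27.3, 12.8]           # hyperglycemic index
    mean_glucose = [112, 161, 134, 98, 178, 149]      # mean daily glucose (mg/dl)
    morning_glucose = [105, 128, 140, 96, 131, 120]   # mean morning glucose (mg/dl)

    for label, metric in [("mean glucose", mean_glucose), ("morning glucose", morning_glucose)]:
        rho, p = spearmanr(metric, hgi)
        print(f"{label}: Spearman rho = {rho:.3f} (P = {p:.3f})")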

Exogenous glucagon-like peptide 1: a potentially novel therapy for the management of hyperglycaemia in the critically ill

A Deane1, R Fraser1, M Horowitz1, C Burgstad2, L Besanko2, M Finnis2, A Zaknic2, M Summers2, M Chapman2

1University of Adelaide, Australia; 2Royal Adelaide Hospital, Adelaide, Australia

Critical Care 2009, 13(Suppl 1):P126 (doi: 10.1186/cc7290)

Introduction The purpose of this study was to determine the effects of exogenous glucagon-like peptide 1 (GLP-1) on the glycaemic response to enteral nutrition in nondiabetic, critically ill patients. Exogenous GLP-1 lowers blood glucose concentrations in both healthy humans and patients with type 2 diabetes, via suppression of glucagon, stimulation of insulin secretion and slowing gastric emptying. As the humoral effects are glucose dependent, the use of GLP-1 is not associated with hypoglycaemia. The effects of GLP-1 on glycaemia in critical illness have hitherto not been evaluated.

Methods Seven nondiabetic, critically ill patients (four males, three females; age 58 ± 6 years) received, on two separate days, intravenous GLP-1 (1.2 pmol/kg/min) or placebo between t = 0 and 270 minutes, in randomised double-blind fashion. Between t = 30 and 270 minutes a liquid nutrient was infused intraduodenally at a rate of 1.5 kcal/min. Blood glucose, plasma insulin and glucagon concentrations were measured. Data are presented as the mean ± SEM. Statistical analyses were performed using the paired t test or repeated-measures ANOVA.

Results Compared with placebo, GLP-1 decreased peak glucose concentrations (10.1 ± 0.7 mmol/l vs. 12.7 ± 1.1 mmol/l; P = 0.01) and markedly attenuated the overall glycaemic response to enteral nutrition (blood glucose area under the curve 30 to 270 min, 2,077 ± 145 mmol/l/240 min vs. 2,568 ± 208 mmol/l/240 min; P <0.02) (Figure 1). GLP-1 caused a transient, but nonsustained, suppression of plasma glucagon concentrations (t = 30 min, 90 ± 12 pmol/ml vs. 104 ± 10 pmol/ml; P <0.01) and tended to increase the plasma insulin/blood glucose ratio (between t = 0 and 270 min, mean Δinsulin/glucose ratio 5.0 ± 2.0 mU/mmol vs. 2.5 ± 0.9 mU/mmol; P = 0.12).

Figure 1 (abstract P126). Glycaemic response to enteral nutrition is markedly attenuated (***P <0.02); post-pyloric liquid nutrient infused from t = 30 to 270 minutes.

Conclusions Acute, exogenous GLP-1 infusion markedly attenuates the glycaemic response to enteral nutrition in the critically ill. Exogenous GLP-1 represents a potentially novel therapy for the management of hyperglycaemia in the critically ill.

Tight glucose control: is there any influence on outcome? A retrospective cohort study

A Stoszkova, P Dostal, V Cerny

University Hospital Hradec Kralove, Czech Republic

Critical Care 2009, 13(Suppl 1):P127 (doi: 10.1186/cc7291)

Introduction Recent recommendations on tight glucose control in general critically ill patients questioned its effect on patient outcome [1,2]. The aim of our work was to determine whether the implementation of tight glucose control into our practice improved our patients' outcome.

Methods In this retrospective cohort study we used baseline data from our intensive care register. An intravenous insulin protocol to maintain tight glucose control was implemented in our practice on 6 June 2003. In the study we enrolled patients admitted to our six-bed multidisciplinary ICU over a 2-year period before (years 2001 and 2002 - group with usual glucose control) and a 2-year period after (years 2004 and 2005 - group with tight glucose control) the introduction of tight glucose control. In total 231 adult mechanically ventilated patients, admitted primarily or within the first 24 hours to our ICU, were included. We recorded the length of ICU stay, length of artificial ventilation, cost, APACHE II and SOFA scores, and examined the incidence of nosocomial infections and mortality in both groups of critically ill patients.

Results One hundred and fifteen patients in the group with usual glucose control and 116 patients in the group with tight glucose control were analyzed; no significant difference in severity of disease between the groups (APACHE II score 23.7 vs. 24.14, P = 0.765 and SOFA score 7.9 vs. 7.8, P = 0.743) was detected. Tight glucose control was associated with a significant reduction of nosocomial pneumonia (20% vs. 11%, OR = 0.34, 95% CI = 0.19 to 0.86). There was no significant difference in hospital mortality, length of stay, artificial ventilation and cost.

Conclusions Tight glucose control was associated with a significant reduction of nosocomial pneumonia, but not with a reduction of hospital mortality, length of stay, artificial ventilation and hospital cost.

References

1. Wiener RS, et al.: Benefits and risks of tight glucose control in critically ill adults. JAMA 2008, 300:933-944.

2. Preiser JC, et al.: Clinical experience with tight glucose control by intensive insulin therapy. Crit Care Med 2007, 35(9 Suppl):S503-S507.

Should we treat children with hyperglycaemia with insulin after cardiac surgery?

H Zwart, A Struijs, R Van Thiel, A Bogers, J Verhoeven, K Joosten

Erasmus MC, Rotterdam, the Netherlands

Critical Care 2009, 13(Suppl 1):P128 (doi: 10.1186/cc7292)

Introduction Critically ill infants and children often develop hyperglycaemia. In adults it is associated with worsened outcome.

Table 1 (abstract P128)

             Low intake                          High intake                         Inadequate intake
             Highest glucose  Duration of        Highest glucose  Duration of        Highest glucose  Duration of
             (mmol/l)         hyperglycaemia     (mmol/l)         hyperglycaemia     (mmol/l)         hyperglycaemia
                              (hours)                             (hours)                             (hours)
Insulin      10.7             7.3                13.4             6.6                10.3             8.0
No insulin   7.5              2.7                10.3             3.1                8.1              4.8

No studies have so far investigated the feasibility and outcome of a standardized insulin/glucose protocol in children with congenital heart disease after cardiac surgery.

Methods We prospectively studied children with congenital heart disease after cardiac surgery, for glucose intake, concomitant blood glucose values and results of insulin treatment. Results are expressed as the median and range, P <0.05 considered statistically significant.

Results Eighty-nine children were evaluated (male 56.2%), age 0.9 years (1 day to 17.9 years), length of ICU stay 22 hours (2.5 to 28.0 hours). All children survived. Fifty children were treated with and 39 without insulin. Overall, the first blood glucose on admission to the ICU was 5.7 mmol/l (3.1 to 21.6 mmol/l), after 6 hours it was 8.2 mmol/l (4.7 to 23.0 mmol/l), the highest blood glucose was 9.9 mmol/l (3.4 to 23.7 mmol/l), and 77.5% were hyperglycaemic during admission (>8.0 mmol/l). Fifty out of the 70 children with hyperglycaemia were treated with insulin. Time to reach normoglycaemia in insulin-treated children was 6.4 hours (0.3 to 17.2 hours), total length of insulin treatment was 11.9 hours (2.3 to 23.0 hours) and the length of hyperglycaemia was 6.7 hours (1.2 to 17.0 hours) (Table 1). The duration of hyperglycaemia of treated and untreated children was not significantly different. Hypoglycaemia (<4.0 mmol/l) occurred in 10 children (11.2%); none of them had severe hypoglycaemia (<2.2 mmol/l).

Conclusions This study shows that a majority (77.5%) of children admitted after cardiac surgery develop hyperglycaemia, and 72.5% of these were treated with insulin. The duration of hyperglycaemia was not different between children with or without insulin treatment. It can be questioned whether beneficial effects of insulin therapy can be expected with the short duration of insulin treatment (11.9 hours) observed in this study.

Effect of glucose-insulin-potassium infusion on mortality in critically ill patients: a systematic review and metaanalysis

M Puskarich1, A Jones1, J Kline1, M Runyon1, S Trzeciak2

1Carolinas Medical Center, Charlotte, NC, USA; 2Cooper University Hospital, Camden, NJ, USA

Critical Care 2009, 13(Suppl 1):P129 (doi: 10.1186/cc7293)

Introduction Fifty years' worth of published evidence has suggested benefits of glucose-insulin-potassium (GIK) infusion in critically ill patients. We sought to measure the treatment effect of GIK infusion on mortality in critically ill patients.

Methods We conducted a systematic review of the Cochrane Library, MEDLINE, EMBASE, CINAHL, conference proceedings, clinical practice guidelines, and other sources using a comprehensive strategy. We identified randomized controlled trials comparing GIK treatment with standard care or placebo in critically ill adult patients. The primary outcome variable was mortality. Two authors independently extracted data and assessed study quality using standardized instruments; consensus was reached by conference. Preplanned subgroup analysis included studies of high-quality methodology, septic shock or other circulatory shock populations. We used the chi-square test and the proportion of total variation in study estimates that is due to heterogeneity (I²) to assess for statistical heterogeneity (P <0.10, I² >25%). The primary analysis was based on the random effects model to produce pooled ORs with 95% CIs.

Results The search yielded 1,720 potential publications; 23 studies were included in the final analysis, providing a sample of 22,525 patients. Included studies only contained populations of acute myocardial infarction and cardiovascular surgery patients. The combined results demonstrate no statistically significant heterogeneity (P = 0.57, I² = 0%) and no effect on mortality (OR = 1.02; 95% CI = 0.93 to 1.11) with GIK treatment. Among the high-quality studies (n = 4) there was no effect on mortality (OR = 1.04; 95% CI = 0.95 to 1.14). No experimental studies of shock or sepsis populations were identified.

Conclusions This meta-analysis found that there is no mortality benefit to GIK infusion in critically ill patients; however, study populations were limited to acute myocardial infarction and cardiovascular surgery patients. No studies were identified utilizing GIK in patients with septic shock or other forms of circulatory shock, providing an absence of evidence regarding the effect of GIK as a therapy in patients with shock.
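Heterogeneity in the Methods above was assessed with Cochran's chi-square (Q) test and the I² statistic, the proportion of total variation across studies attributable to heterogeneity rather than chance. For k pooled studies, I² is conventionally obtained from Q and its degrees of freedom as:

\[ I^{2} = \max\!\left(0,\; \frac{Q - (k - 1)}{Q}\right) \times 100\% \]

An I² of 0%, as reported in the Results, therefore corresponds to Q not exceeding its degrees of freedom.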

Incidence and risk factors of hypertriglyceridemia in the ICU

MR Shams1, N Tavassoli2, N Tavassoli2, H Plicaud1, M Genestal1

1Purpan University Hospital, Toulouse, France; 2University of Paul-Sabatier, Toulouse, France

Critical Care 2009, 13(Suppl 1):P130 (doi: 10.1186/cc7294)

Introduction A linear correlation between serum triglyceride and risk of intensive care mortality has been identified. Various clinical and metabolic situations such as sepsis, renal failure, hepatic failure and administration of certain drugs have been associated with hypertriglyceridemia (HTG). It is well known that HTG is associated with acute pancreatitis, and the development of systemic inflammatory response syndrome or the onset of organ system dysfunction are the complications that characterise severe acute pancreatitis. The aim of this study was to determine the incidence of HTG in patients receiving artificial nutrition in an ICU, to describe their characteristics and to establish the relevance of risk factors associated with HTG.

Methods A prospective, observational, cohort study performed in a general ICU of Toulouse University Hospital, Toulouse, France from 1 May 2007 to 31 July 2008. We included consecutive intensive care patients with an initial serum triglyceride level less than 3 mmol/l who received at least 7 days of artificial nutrition and with at least three biochemical serum triglyceride analyses with a 1-week interval. The chi-square test and a multivariate logistic regression model were used for statistical analysis. A total of 17 clinical factors were studied as independent variables.

Results A total of 107 patients were included in the study. Administration of lipid was 0.83 ± 0.36 g/kg/day. The incidence of HTG was calculated as 17.9% per year. Multivariate analysis identified three independent risk factors for HTG: age (P = 0.02; adjusted β-coefficient = -0.83), insulin dosage (P = 0.01; adjusted β-coefficient = 0.83), and hepatic failure (P = 0.04; adjusted β-coefficient = 1.27).

Conclusions The present study showed that ICU-admitted patients receiving artificial nutrition are prone to develop HTG. Hepatic failure and the insulin infusion rate were the most important risk factors for HTG. Age had a protective effect. Our results raise three important matters: (1) serum triglyceride measurement is necessary in seriously ill patients receiving artificial nutrition; (2) elevated triglycerides reflect the degree of insulin resistance and severity of critical illness; and (3) the hypertriglyceridemic profile of young patients admitted to the ICU is very important to consider.

Abstract withdrawn

Should perioperative immune-modulating nutrition therapy be the standard of care? A systematic review

LB Weitzel1, R Dhaliwal2, J Drover2, G Schiel1, D Heyland2, W Mayles1, P Wischmeyer1

1University of Colorado, Aurora, CO, USA; 2Queens University, Kingston, ON, Canada

Critical Care 2009, 13(Suppl 1):P132 (doi: 10.1186/cc7296)

Introduction Major surgery carries a significant risk of postoperative infections, such as surgical site infections. An estimated 500,000 surgical site infections occur annually at a cost of more than $1 billion/year in the US alone. Surgical trauma leads to an initial excessive inflammatory response, together with an almost immediate and dramatic depression of cell-mediated immunity. This immunosuppression may be due to a significant decrease in plasma arginine levels observed following surgery. This arginine deficiency can severely impair T-cell proliferation and key T-cell receptor function. Perioperative arginine administration can prevent arginine deficiency and restore cellular immunity. The purpose of this meta-analysis was to examine the relationship between immune-modulating enteral nutrition therapy (IMENT) containing arginine and infectious complications, length of stay, and mortality rates in surgical patients.

Methods All prospective randomized controlled trials of arginine-containing IMENT versus standard enteral nutrition in surgical patients conducted from 1990 to 2008 were identified from multiple databases. Studies included in the analysis evaluated infectious complication, length of stay, and/or mortality rates. Methodological quality of individual studies was scored and necessary data were abstracted in duplicate and independently.

Results Thirty randomized trials with a total of 2,789 patients compared the use of arginine-containing IMENT with standard enteral nutrition in surgical patients. Arginine-containing IMENT significantly decreased infectious complications (relative risk = 0.58, 95% CI = 0.48 to 0.69, P <0.00001) and overall length of stay (weighted mean difference = -2.09, 95% CI = -3.20 to -0.97, P = 0.0002) versus standard enteral nutrition. As expected in a low-mortality surgical population, however, no effect was observed on mortality (relative risk = 1.06, 95% CI = 0.60 to 1.80, P = 0.84).

Conclusions The cumulative results show that arginine-containing IMENT significantly reduces overall infections and length of stay in surgical patients. Based on this evidence, arginine-containing IMENT could soon become the standard of care in the surgical patient. This large treatment effect demands definitive evaluation in a large multicenter trial.

Arginine metabolism in a small animal model of sepsis and after hemihepatectomy

B Van der Hoven1, T Teerlink2, S De Jong2, P Van Leeuwen2, J Bakker1, D Gommers1

1Erasmus University Medical Center, Rotterdam, the Netherlands; 2VU University Medical Center, Amsterdam, the Netherlands

Critical Care 2009, 13(Suppl 1):P133 (doi: 10.1186/cc7297)

Introduction Asymmetric dimethylarginine (ADMA) is an inhibitor of the arginine-NO pathway. ADMA accumulates when degradation in the liver by dimethylarginine dimethylaminohydrolase is impaired. In theory, plasma citrulline, formed when arginine is converted by NO synthase and when ADMA is metabolized, would be lowered, and ornithine, formed by the degradation of arginine in the urea cycle, would potentially be elevated when ADMA accumulates, as in sepsis and in liver failure [1].

Methods Fourteen male Wistar rats were randomly allocated to lipopolysaccharide (LPS) or hemihepatectomy (HH). Plasma levels of arginine, ADMA, citrulline and ornithine were measured before and 120 minutes after 5 mg/kg LPS and HH, respectively.

Results See Table 1.

Conclusions Plasma levels of arginine and derivatives should not be interpreted as a reflection of metabolism at the tissue level. In HH, the elevated ADMA levels suggest dimethylarginine dimethylaminohydrolase activity depends on the liver tissue mass.

Reference

1. Wu G, et al.: Arginine metabolism: nitric oxide and beyond. Biochem J 1998, 336:1-17.

Table 1 (abstract P133)

Results

Parameter (μmol/l)   LPS                                HH
                     t = 0            t = 120           t = 0            t = 120
Arginine             151.3 ± 28.7     99.1 ± 27.9*      138.0 ± 43.7     221.4 ± 57.1*
ADMA                 1.14 ± 0.19      1.32 ± 0.24       0.59 ± 0.06      1.48 ± 0.27*
Citrulline           107.2 ± 19.3     129.1 ± 20.3*     83.5 ± 7.36      118.4 ± 18.5*
Ornithine            99.8 ± 29.2      129.6 ± 39.1*     66.4 ± 19.2      170.7 ± 50.6*

*Significant difference.

Ginger extract dietary supplementation effects on delayed gastric emptying and ventilator-associated pneumonia in adult respiratory distress syndrome patients

M Mokhtari, Z Shariatpanahi

SBU Medical Sciences, Tehran, Iran

Critical Care 2009, 13(Suppl 1):P134 (doi: 10.1186/cc7298)

Introduction Delayed gastric emptying is one of the major reasons for enteral feeding intolerance in ICU patients [1-3]. We studied the effect of adding ginger to the diet on the development of ventilator-associated pneumonia (VAP) in adult respiratory distress syndrome (ARDS) patients in the ICU [2].

Methods In a prospective, randomized, placebo-controlled fashion, 32 mechanically ventilated ARDS patients who were fed enterally were studied. Patients were randomized into two groups; one group had ginger added and the other had placebo added to their enteral feeding. The amount of feeding tolerated in the first 48 hours, the amount of feeding tolerated during the entire study period, development of VAP, the number of ICU-free days, the number of ventilator-free days and mortality were evaluated.

Results Enteral feeding tolerated in the first 48 hours of the study was significantly higher in patients with the ginger-supplemented diet (51% vs. 57%, P <0.005). However, this was not different during the entire study period (92% vs. 93%, P = 0.42). VAP was seen in 6.3% of the patients in the ginger group and in 31.3% of the control group, which was not statistically significant (P = 0.08). The ICU mortality of 15.6% was similar in the two groups. The median number of ventilator-free days of 10 versus 7 days and ICU-free days of 7 versus 4 days were significantly higher in the ginger extract group, with respective P values of 0.02 and 0.04.

Conclusions Supplementing the diet with ginger extract in ARDS patients reduces the risk of delayed gastric emptying, helps reduce the duration of mechanical ventilation and increases ICU-free days.

References

1. Berne DJ, et al.: Erythromycin reduces delayed gastric emptying in critically ill traumatic patients: a randomized, controlled trial. J Trauma 2002, 53:422-425.

2. Yavagal DR, Karnad DR, Oak JL: Metoclopramide for preventing pneumonia in critically ill patients receiving enteral tube feeding: a randomized controlled trial. Crit Care Med 2000, 28:1408-1411.

3. Hoffman T: Ginger: an ancient remedy and modern miracle drug. J Environ Sci Health B 2008, 43:127-133.

Immunostimulated enterocytes activate extracellular arginase I which competes with enterocyte inducible nitric oxide synthase for arginine during inflammation

K Miki, R Delude, M Killeen

University of Pittsburgh Medical Center, Pittsburgh, PA, USA Critical Care 2009, 13(Suppl 1):P135 (doi: 10.1186/cc7299)

Introduction We hypothesized that constitutively expressed arginase 1 (Arg-1) may be released from hepatocytes to the circulatory compartment to play the role of extinguisher at the front and to serve a protective function by modulating inflammation. We used a cell culture model of epithelial barrier dysfunction to determine whether liver cytosolic proteins could decrease NO• production and preserve enterocyte paracellular barrier function on that basis.

Methods We exposed immunostimulated Caco-2BBe enterocyte-like cells to human liver cytosol (LC). Cytomix (IFNγ, TNFα, and IL-1β) was used to stimulate the cells. Arginase activity in cell supernatants and murine serum was measured by following the generation of citrulline and urea from arginine. Tissue lysates and conditioned media were untreated or treated with the enzyme inhibitors (S)-(2-boronoethyl)-L-cysteine-HCl (BEC) and L-N(6)-(1-iminoethyl)lysine (L-NIL), which inhibit arginases and inducible nitric oxide synthase (iNOS), respectively.

Results Cytomix increased paracellular permeability, and induced the expression of iNOS and release of NO•. LC protein (400 μg/ml) applied to the basal but not apical compartment preserved barrier function and completely blocked the release of NO•, but only slightly decreased the magnitude of iNOS protein expression, in a dose-dependent and time-dependent manner. Ultrafiltration and ultracentrifugation demonstrated that microsomal Arg-1 prepared from LC decreased iNOS-dependent NO• production. BEC and anti-Arg-1 antibody inhibited the NO•-blocking ability of LC. Surprisingly, LC arginase activity required activation by a cell-derived factor and its release could be blocked by treating cells with L-NIL. Increased consumption of arginine by activated LC Arg-1 led to decreased iNOS dimerization, which decreased NO• production. In the serum from endotoxemic mice, arginase activity was significantly increased. Furthermore, iNOS existed in ileal mucosa predominantly in the inactive monomeric form at 18 hours after lipopolysaccharide injection, consistent with decreased iNOS activity in the absence of arginase.

Conclusions Arg-1 is one such liver-derived protein that increases in serum during endotoxemia and other inflammatory states. Modulation of mucosal iNOS activity following activation of circulating Arg-1 may be protective because it would be expected to decrease epithelial barrier dysfunction as a result of decreased NO• production.

Role of parenteral glutamine supplementation on patient outcome in the surgical ICU

R Aboelmagd, K Moez, W Salem

National Cancer Institute, Cairo, Egypt

Critical Care 2009, 13(Suppl 1):P136 (doi: 10.1186/cc7300)

Introduction During the excessive organ/tissue demand for glutamine (Gln) in episodes of stress following major surgery, endogenous Gln production may not be sufficient to meet the increased requirements. The aim of the study was to evaluate the effect of parenteral Gln supplementation on the outcome of surgical cancer patients after major surgery [1].

Methods The study was performed on 40 adult patients admitted to the surgical ICU at the National Cancer Institute requiring total parenteral nutrition (TPN) for at least 5 days. Patients were assigned to two groups: a control group of 20 patients who received nutritional support as per the usual protocol, and a Gln group of 20 patients who received nutritional support as per the usual protocol plus Dipeptiven 300 mg/day for 7 days. Standard vitamins, trace elements, electrolytes and insulin therapy were supplied. The rate of infections, ICU and total hospital lengths of stay, severe hyperglycemia and days of mechanical ventilation were recorded.

Results There were no differences between groups according to the demographic data, metabolic and nutritional parameters and the numbers of diabetic patients. There was also no difference in the duration of TPN. The incidences of pneumonia, surgical wound infection, sepsis, and urinary or intravenous catheter infection were significantly lower in the Gln group compared with the control group. The days of mechanical ventilation among ventilated patients and the ICU length of stay were significantly lower in the Gln group compared with the control group. The nitrogen balance was more negative in control patients than in Gln-supplemented patients (the difference was not significant). Hyperglycemia episodes were significantly fewer and more easily controlled in the Gln group. Total plasma amino acid concentrations and the Gln plasma level increased by about 40% in patients receiving Gln supplementation. The number of adverse events per patient was significantly lower in the Gln group (2.1 vs. 2.9, P <0.01).

Conclusions Parenteral Gln-supplemented TPN reduces the clinical complications of surgical patients, mainly through a lower incidence of pneumonia and better metabolic tolerance. This forms a strong rationale for the use of Gln-supplemented regimens for surgical ICU patients.

Reference

1. De Sousa DA, Greene LJ: Intestinal permeability and systemic infections in critically ill patients: effect of glutamine. Crit Care Med 2005, 33:1125-1135.

Plasma citrulline kinetics and prognostic value in the critically ill patient

G Piton1, B Cypriani1, E Monnet1, JC Navellou1, C Manzon1, O Barbot1, F Carbonnel2, G Capellier1

1Hôpital Jean Minjoz, Besançon, France; 2Hôpital de Bicetre, Kremlin-Bicetre, France

Critical Care 2009, 13(Suppl 1):P137 (doi: 10.1186/cc7301)

Introduction Multiple organ failure (MOF) is a frequent cause of death in the critically ill patient. The gut could be the cornerstone of MOF, the first step being early splanchnic ischemia, inducing the loss of barrier function, systemic infections and MOF [1]. Plasma citrulline (normal 20 to 60 μmol/l) reflects small bowel mass and is decreased in various small bowel diseases [2]. The objectives of the study were to study plasma citrulline kinetics and their prognostic value in adults hospitalized in the ICU.

Methods A prospective monocentric observational study, including adults consecutively admitted to the ICU without small bowel disease and without chronic renal failure. We studied plasma citrulline at onset, hour 12, hour 24, day 2, and day 7, together with clinical, biological, prognostic and therapeutic parameters. Univariate analysis of plasma citrulline (0 to 10 μmol/l, 11 to 20 μmol/l, and >20 μmol/l) against the other variables was performed, followed by multivariate analysis.

Results Sixty-seven patients were included, mean age 60 years, organ dysfunctions and/or infections model ODIN score of 2.4, IGS2 score (Simplified Acute Physiology Score) of 50, and a 28-day mortality of 34%. During the first day, mean plasma citrulline decreased from 18.8 to 13.5 μmol/l, and it decreased more in nonsurvivors than in survivors among patients without acute renal failure (37% vs. 18%). A lower plasma citrulline at hour 24 was associated with higher plasma C-reactive protein, nosocomial infection rate, and 28-day mortality (P = 0.006, P = 0.03 and P = 0.03). A lower plasma citrulline at day 2 was associated with a higher use of catecholamines, and a lower use of enteral feeding (P = 0.02 and P = 0.01). In multivariate analysis, plasma citrulline at hour 24 and ODIN score at admission >3 were associated with 28-day mortality (P = 0.04 and P = 0.04).

Conclusions A lower plasma citrulline at hour 24 was associated with higher plasma C-reactive protein, nosocomial infection rate, and 28-day mortality. A lower plasma citrulline at day 2 was associated with a higher use of catecholamines, and a lower use of enteral feeding. Such results could reflect the systemic consequences of acute intestinal failure in patients hospitalized in the ICU.

References

1. Deitch EA: Multiple organ failure. Pathophysiology and potential future therapy. Ann Surg 1992, 216:117-134.

2. Crenn P, Messing B, Cynober L: Citrulline as a biomarker of intestinal failure due to enterocyte mass reduction. Clin Nutr 2008, 27:328-339.

Modulation of lipid utilisation by parenteral administration of a fish-oil-enriched new lipid formula (SMOFlipid®) in surgical ICU patients: comparison with a lipid emulsion based on olive and soybean oil

S Piper1, T Schollhorn1, I Schade2, R Beschmann2, K Rohm2

1Hospital of Frankenthal, Germany; 2Klinikum Ludwigshafen, Germany

Critical Care 2009, 13(Suppl 1):P138 (doi: 10.1186/cc7302)

Introduction Within the frame of postoperative total parenteral nutrition, a frequent consequence of postaggression metabolism is an inhibition of lipoprotein lipase leading to hypertriglyceridemia [1]. The aim of this study was to investigate whether the administration of a fish-oil-containing lipid emulsion (SMOF) compared with a lipid emulsion based on olive and soybean oil led to a better utilisation of the lipids.

Methods A prospective randomised study. After approval of the ethical committee, 44 postoperative surgical patients with an indication for parenteral nutrition therapy were included in the study. Nonprotein calories were given as 60% glucose and 40% lipid emulsion. The total energy intake per day was 25 kcal/kg body weight. The sedation regimen was standardised (midazolam and fentanyl); propofol (a lipid emulsion) was avoided. Patients were divided into two groups: group A (n = 22) received SMOF (SMOFlipid® 20%), and group B (n = 22) received an emulsion based on olive and soybean oil (ClinOleic 20%). Lipid emulsions were administered continuously for 5 days. Triglyceride (TG) levels were measured before the start of infusion (d0), and at day 1 (d1), day 2 (d2), and day 5 (d5) after the start of infusion. A pathological TG level was defined as 300 mg/dl and the significance level at P <0.05.

Results There were no significant differences in TG levels at baseline (d0: group A: 119 ± 35 vs. group B: 120 ± 45 mg/dl; P = 0.87), whereas at d2 (group A: 151 ± 52 vs. group B: 202 ± 108 mg/dl; P <0.03) and at d5 (group A: 163 ± 72 vs. group B: 233 ± 94 mg/dl; P <0.01) the TG levels in the SMOF group were significantly lower than in the control group. At d5 the incidence of pathological TG levels was significantly lower in patients receiving SMOF (0%) compared with the control group (31.8%).

Conclusions The administration of a fish-oil-containing lipid emulsion within a parenteral nutrition regimen led at d2 and d5 of the nutrition regimen to significantly reduced TG levels compared with a lipid emulsion based on olive and soybean oil, indicating a better utilisation of the administered TGs.

Reference

1. Piper SN, et al.: Eur J Anaesthesiol 2008, 25:557-565.

Gastrointestinal function in critically ill trauma patients using motility capsule technology

S Rauch1, K Krueger2, N Roewer1

1University of Wuerzburg, Germany; 2University of Louisville, KY, USA Critical Care 2009, 13(Suppl 1):P139 (doi: 10.1186/cc7303)

Introduction The aim of this study was to investigate the gastric emptying time and small bowel transit time, using a novel wireless motility capsule, in trauma patients with intracranial hemorrhage. We hypothesized that gastric emptying and small bowel transit are delayed.

Methods We recruited eight trauma patients with intracranial hemorrhage (six male/two female, mean age 40 years, APACHE III score 41 ± 7, Glasgow coma scale 8 ± 2) who were intubated, mechanically ventilated, sedated, and older than 18 years in this prospective, controlled, Institutional Review Board-approved trial. The historical control group consisted of 81 healthy volunteers studied in a separate trial (Protocol #122205: Assessment of whole gut transit time using the SmartPill capsule: a multicenter study). A pH, pressure and temperature sensing capsule (SmartPill™; SmartPill Inc., Buffalo, NY, USA) was positioned with a capsule delivery device (AdvanCE™; US Endoscopy, Mentor, OH, USA) into the patient's stomach. The data were transmitted to a recorder attached to the patient's abdomen. The data were analyzed by two independent observers.

Results There was a significant difference (P = 0.004) in the gastric emptying time for ICU patients, 28.8 ± 31.3 hours (mean ± SD), and healthy volunteers, 3.3 ± 1.1 hours. There was no significant difference between the small bowel transit times in ICU patients, 7.1 ± 3.6 hours, and 4.1 ± 1.6 hours in healthy volunteers. There was no difference in sedation and analgesia consumption between the ICU patients. None of the patients received any proton pump inhibitor or prokinetic medication. Conclusions Gastric emptying is significantly delayed in major trauma patients; however, small bowel transit times are similar to those in healthy volunteers.

Is intragastric administration of enteral nutrition safe in acute severe pancreatitis?

I Grigoras, D Rusu, O Chelarescu, N Andrioaie, A Nistor

'Gr.T. Popa' University of Medicine and Pharmacy Iasi, Emergency University Hospital 'Sf Spiridon' Iasi, Romania

Critical Care 2009, 13(Suppl 1):P140 (doi: 10.1186/cc7304)

Introduction Enteral nutrition is the standard of nutritional support in acute severe pancreatitis. Nutrients are routinely delivered below the Treitz angle either by jejunostomy or by an endoscopically placed nasojejunal tube. In recent years the safety of intragastric delivery was under scrutiny [1]. Our study aimed to evaluate the characteristics of intragastric nutrition and its safety in acute severe pancreatitis.

Methods The retrospective study included all patients with acute severe pancreatitis (admission APACHE II score >12) admitted to an emergency university hospital during a 3-year period (2005 to 2007). Nutritional support was assessed as type, route and timing. Collected data were age, admission and highest severity scores, intraabdominal pressure, antibiotic use, surgery, ICU and hospital lengths of stay, and outcome. The safety of intragastric nutrition was assessed as the outcome.

Results Forty-two patients were enrolled. Enteral nutrition was used in 25 patients (59.5%). The majority (20 patients, 80%) received supplemental parenteral nutrition for at least several days, until the caloric needs could be met by the enteral route. Route of administration: intragastric, 16 patients (only oral intake, nine patients; nasogastric tube, five patients; combined oral and jejunostomy, two patients) and only jejunostomy, nine patients. No patient had a nasojejunal tube. Intragastric nutrition started on hospital day 4.4 (mean value; range 2 to 10 days). There were no statistically significant differences between patients with intragastric versus jejunal nutrition concerning the demographics and severity scores on admission. The patients with intragastric nutrition compared with the jejunostomy group had a significantly lower rate of surgery (31.2% vs. 100%, P <0.01), antibiotic use (81.2% vs. 100%, P = 0.05) and lower mortality rate (25% vs. 55.5%, P = 0.01).

Conclusions Despite the classical presumption that gastric nutrition may worsen the evolution of acute severe pancreatitis, our study shows that intragastric administration does not increase mortality. Moreover, compared with jejunostomy, it is associated with improved outcome, a lower rate of surgical interventions, and less antibiotic use. Reference

1. Eatock FC, Chong P, Menezes N, et al.: A randomized study of early nasogastric versus nasojejunal feeding in severe acute pancreatitis. Am J Gastroenterol 2005, 100:432-439.

Critical illness gastrointestinal hypomobility disorder and success of enteral erythromycin

N Shaikh, Y Hanssens, M Kettern

Hamad Medical Corporation, Doha, Qatar

Critical Care 2009, 13(Suppl 1):P141 (doi: 10.1186/cc7305)

Introduction Erythromycin is the most potent prokinetic drug available. It is commonly used intravenously. The aim of our study was to determine the lowest effective dose of enteral erythromycin, to assess its tolerability, and to compare the dose with the severity of the disease and outcome.

Methods All patients admitted to the trauma ICU between January 2004 and January 2008 who developed feeding intolerance (residual volume >500 ml/24 hours) were included. The starting dose was 125 mg erythromycin twice daily; if there was no response, the dose was increased up to a maximum of 1 g twice daily. Data were entered in SPSS software. The chi-square test and one-way ANOVA with post hoc analysis were used. P <0.05 was considered significant. Results One hundred and seven patients were included, with 85% being male, average age 41 ± 18 years. The majority of patients (54%) suffered from traumatic brain injury, 74% were ventilated, 51% were on inotropes, 55% on three opioids, 84% on enteral feeding. The Sequential Organ Failure Assessment score was significantly higher and the Glasgow coma scale was significantly lower in patients receiving 1 g erythromycin (P <0.05). See Table 1. Conclusions A low dose of enteral erythromycin 125 mg twice daily is potentially effective in patients on intravenous metoclopramide who have enteral feeding intolerance. Combining prokinetic therapy and keeping the administration of erythromycin as short as possible can prevent the development of bacterial resistance.

Table 1 (abstract P141)

Response rate and outcome related to enteral erythromycin dose

Dosage (mg, twice daily)   Percentage of patients   Response rate (%)   Average days   Outcome (%)
125                        45                       100                 4              71
250                        37                       97.5                9              55
500                        13                       85.6                12             43
1,000                      5                        40                  11             0

A randomised prospective trial to compare the efficacy of bolus versus continuous nasogastric feeding in paediatric intensive care

P Kamath, J Longden, C Stack, A Mayer

Sheffield Children's Hospital, Sheffield, UK

Critical Care 2009, 13(Suppl 1):P142 (doi: 10.1186/cc7306)

Introduction Failure to establish early nasogastric (NG) feeding is common in paediatric intensive care (PIC) and influences the outcome. Decreased gastrointestinal motility is multifactorial in origin. NG feeds may be administered continuously or by intermittent bolus; we hypothesise that bolus enteral feeds are more physiological when compared with continuous NG feeds. We aimed to compare bolus versus continuous NG feeding in PIC. The outcome measures were the time to achieve maximal nutritional requirement by volume of feed and to identify the frequency of adverse events.

Methods Following ethical approval and informed consent, eligible admissions to a tertiary PIC from April 2006 to February 2008 were prospectively randomised and enrolled into the study. Gastrostomy feeding, use of motility drugs, gastroesophageal reflux and gastrointestinal surgery were the exclusion criteria. Patients were randomised to receive either 3-hourly bolus feeds or continuous feeding over 21 hours per day, for a 48-hour study period. NG tubes were aspirated 3-hourly and the gastric residual volume was recorded. The protocol was designed to give equal feed volume in a 24-hour period. Intolerance was defined as gastric residual volume more than 125% of feed administered. Data are expressed as the median (interquartile range). Mann-Whitney's test and Fisher's exact test were used to test for associations.

Results Seventy-six patients were enrolled (1.2:1, male:female), median age 11.6 (2.8 to 49.0) months and weight 10 (4.5 to 16.2) kg. Bolus group subjects achieved maximal feed potential by 21 hours (95% CI = 18.6 to 25), whereas the continuous group required 27 hours (95% CI = 19.5 to 30, P = 0.035). The demographic characteristics and sedation, muscle relaxant or inotrope usage were comparable between groups. There was no statistically significant difference in the incidence of adverse events (bolus n = 3 and continuous n = 4, P = 0.71).

Conclusions This is the first study comparing enteral feeding techniques in PIC patients. In PIC, bolus feeding may be more beneficial than continuous feeding in achieving maximal nutritional requirements earlier. The risk of aspiration or vomiting is low with both feeding techniques.

Nutrition therapy in the critical care setting: what is best achievable practice? An international quality improvement project

NE Jones, R Dhaliwal, X Jiang, DK Heyland

Queen's University, Kingston, ON, Canada

Critical Care 2009, 13(Suppl 1):P143 (doi: 10.1186/cc7307)

Introduction The purpose of this study was to describe current nutrition practices in ICUs and to determine the best achievable practice relative to the Critical Care Nutrition Clinical Practice Guidelines (CPGs).

Methods We conducted an international, prospective, observational, cohort study. In January 2007, each ICU recorded data on nutrition practices on a consecutive cohort of 20 mechanically ventilated adult patients that stayed in the ICU for at least 72 hours. Data were collected from the time of admission to the ICU to ICU discharge, or for a maximum of 12 days. Relative to the CPGs, we report average, best, and worst site performance on key nutrition practices.

Results A total of 158 ICUs from 20 countries participated, and each enrolled an average of 18.6 patients for a total of 2,946 patients. Adherence to CPG recommendations was high for some recommendations; namely, use of enteral nutrition (EN) in preference to parenteral nutrition (PN) (site average 61.7% (range 1 to 97.3%) of patients received EN alone), glycemic control (site average 7.5 (range 3.5 to 10.4) mmol/l), lack of utilization of arginine-enriched enteral formulas (site average 3.5% (range 0 to 92.3%) of patients on EN), delivery of hypocaloric PN (site average 16.8 (range 2.7 to 35.5) kcal/kg), and the presence of a feeding protocol (79.7% of ICUs). However, significant practice gaps were identified for other recommendations. The average time to start of EN was 46.5 hours (range 8.2 to 149.1 hours). The average use of motility agents and small bowel feeding in patients with high gastric residual volumes was 58.7% (range 0 to 100%) and 14.7% (range 0 to 100%), respectively. There was poor compliance with recommendations for the use of enteral formulas enriched with fish oils, glutamine supplementation, timing of supplemental PN, and avoidance of soy-bean-oil-based parenteral lipids. Average nutritional adequacy was 59% (range 20.5 to 94.7%) for energy and 60.3% (range 18.6 to 153.5%) for protein. Conclusions Large gaps exist between the evidence-based recommendations and actual practice in ICUs, and consequently nutrition therapy is suboptimal. We have identified best achievable practice that can serve as targets for future quality improvement initiatives.

Feeding enterally the hemodynamically unstable critically ill patient: experience from a multicentre trial (the REDOXS© study)

R Dhaliwal, J Drover, J Muscedere, X Jiang, DK Heyland

Queen's University, Kingston, ON, Canada

Critical Care 2009, 13(Suppl 1):P144 (doi: 10.1186/cc7308)

Introduction The delivery of adequate enteral nutrition (EN) in critically ill patients with shock is problematic. The purpose of this study is to describe how EN is delivered in patients with shock from the REDOXS© study, a multicentre, randomized controlled trial of pharmaconutrition.

Methods In 20 centres in Canada and Europe, we randomized mechanically ventilated adults with two or more organ failures to one of four groups: (1) glutamine, (2) antioxidants, (3) glutamine plus antioxidants, (4) placebo. EN and parenteral nutrition were initiated and maintained independently as per the Canadian Clinical Practice Guidelines. Shock was defined as the presence of hypoperfusion requiring vasopressors for at least 2 hours. Daily data including the timing of EN, volumes received, incidence of high gastric residual volumes (hGRVs), use and timing of motility agents and small bowel feeding were collected. Results From May 2007 to July 2008, 159 patients with shock were randomized, 122 (77%) received EN only, and 13 (8%) patients received EN in combination with parenteral nutrition. EN was started 20.2 hours (median, range 0 to 204.8 hours) after ICU admission. The mean duration of EN was 9.2 days (median 6.8, range 0.1 to 30 days) and the mean volume of EN received was 67% (range 2.5 to 199%) of that prescribed. In total 73/135 (55%) had hGRVs >250 ml, and in these patients motility agents and small bowel feeding were used in 78.1% and 41.1% of patients, respectively. Motility agents were started before onset of hGRVs in 21% patients, on the same day as hGRVs in 40% S59

patients and on average 1.6 ± 0.9 days after hGRVs in 39% patients. The percentage mean prescribed volume of EN received 24 hours before and after the start of motility agents was 35.1% vs. 55.9% (P = 0.009). Small bowel feeding was started before onset of hGRVs in 20% patients, on the same day as hGRVs in 14% patients and was started on average 4.7 ± 3.5 (mean, SD) days after hGRVs in 67% patients. The percentage mean prescribed volume of EN received 24 hours before and after the start of small bowel feeding was 54.2% vs. 67.3% (P = 0.36). Conclusions In critically ill patients with shock, EN can be provided in the early phases of acute illness to the majority of patients. The delivery of EN in this population can be maximized by better adoption of motility agents and small bowel feeding.

Molecular adsorbent recirculating system: a clinical experience in acute or acute on chronic liver failure (133 sessions)

I Cardeau, L Lavayssiere, MB Nogier, O Cointault, L Rostaing

CHU Rangueil, Toulouse, France

Critical Care 2009, 13(Suppl 1):P145 (doi: 10.1186/cc7309)

Introduction The molecular adsorbent recirculating system (MARS; Teraklin Industry, Rostock, Germany) is an extracorporeal liver support method for acute liver failure that uses albumin-enriched dialysate to remove albumin-bound toxins.

Methods Between 2004 and 2007, we performed 133 MARS treatments in 46 patients (Sequential Organ Failure Assessment score 11.8 ± 4.5). Indications for MARS included 16 cases of fulminant hepatic failure, 21 cases of acute failure on cirrhosis, one case of intractable pruritus, and eight cases of moderate or severe acute liver failure (Bernuau criteria). Among all these patients, 16 had acute renal failure (hepatorenal syndrome). It was a retrospective study. All data were recorded before (T0) and at the end of MARS treatment (T). Results Among 16 fulminant hepatic failure patients, we observed a significant decrease of encephalopathy (P = 0.04). On the other hand we did not observe significant improvement of hemodynamic parameters (norepinephrine dose, mean arterial pressure), metabolic parameters (pH, lactate) or hepatic tests (aspartate aminotransferase, alanine aminotransferase, γ-glutamyltransferase, prothrombin time, factor V). In this subgroup, hospital mortality was 31% at day 28. In the group treated for acute on chronic liver failure, the results did not show a difference between the grade of encephalopathy before and after the session. We only observed a trend towards improvement in hemodynamic and biologic parameters, but this was not statistically significant except for cholestasis parameters (γ-glutamyltransferase P = 0.022, bilirubin P = 0.002). Hospital mortality was 71% at day 28. Among the patients with hepatorenal syndrome, 62.5% were anuric. We observed a significant increase of urine output (P <0.01). We did not observe any significant adverse event.

Conclusions Our results confirmed that nonbiologic hepatic support by MARS was safe. The results were disappointing above all in cirrhosis patients. Nevertheless, results in hepatorenal syndrome were encouraging.

Comparison of two indwelling bowel catheters on economic impact by number of bedding and dressing changes per day

Bowel Management Research Group, EC Konz

Hollister Incorporated, Libertyville, IL, USA Critical Care 2009, 13(Suppl 1):P146 (doi: 10.1186/cc7310)

Introduction Fecal incontinence is prevalent in patients in the acute/ICU setting [1,2]. The primary objective of this study was to assess and compare the economic impact on fecal containment with use of catheter A or catheter B at 12 sites (A, seven sites; B, five sites) in the acute/ICU setting. Catheter A is Zassi Bowel Management System (Hollister Inc.) and catheter B is Flexi-Seal Fecal Management System (ConvaTec, Inc.). Methods An analysis of 146 patients (A, 76 patients; B, 70 patients) on the number of bedding and dressing change visits per patient-day (frequency of nursing visits per day spent changing bedding/dressings due to fecal contamination) can be used as an indirect economic measure of catheter leakage and containment. Routine daily bedding/dressing changes were not included, only catheter-related bedding/dressing changes were recorded. Results A nearly 30% reduction (1.20 vs. 1.71) in the rate of bedding/dressing changes per patient-day were observed for catheter A compared with catheter B (P = 0.0035). For catheter A sites, 735 bedding/dressing change visits occurred over 612 patient-days; and for catheter B sites, 705 bedding/dressing change visits occurred over 413 patient-days. Although nonsignificant, lower observed rates of leakage (A, 1.1; B, 1.4), repositions due to leakage (A, 0.25; B, 0.39), and devices expelled (A,

0.02. B, 0.07) may have contributed to the significant reduction in bedding/dressing changes associated with the use of catheter A compared with catheter B.
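As a quick check of the arithmetic behind these rates, the minimal Python sketch below recomputes the bedding/dressing changes per patient-day for each catheter and the resulting relative reduction; it uses only the visit and patient-day counts reported above.

```python
# Recompute change rates per patient-day from the reported visit counts
visits_a, patient_days_a = 735, 612   # catheter A sites
visits_b, patient_days_b = 705, 413   # catheter B sites

rate_a = visits_a / patient_days_a    # ~1.20 changes per patient-day
rate_b = visits_b / patient_days_b    # ~1.71 changes per patient-day
reduction = 1 - rate_a / rate_b       # ~0.30, i.e. a nearly 30% reduction

print(round(rate_a, 2), round(rate_b, 2), f"{reduction:.0%}")
```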

Conclusions The use of indwelling bowel management systems to divert, collect, and contain liquid stools may provide an economic advantage in an acute/ICU setting for patients with fecal incontinence. These results suggest that catheter A may have a greater economic value compared with catheter B by decreasing the number of nursing visits per patient-day. References

1. Junkin J, Selekof J: Prevalence of incontinence and associated skin injury in the acute care patient. J Wound Ostomy Continence Nurs 2007, 34:260-269.

2. Bliss DZ, Johnson S, Savik K, Clabots CR, Gerding DN: Fecal incontinence in hospitalized patients who are acutely ill. Nurs Res 2000, 49:101-108.

Alterations of the immune system in acute pancreatitis and systemic inflammatory response syndrome

V Mylona1, I Vaki2, K Lymberopoulou1, A Marioli1, A Georgopoulou1, M Lada1, E Giamarellos-Bourboulis2, G Koratzanis1

1Sismanogelion General Hospital, Athens, Greece; 2ATTIKON University Hospital, Athens, Greece

Critical Care 2009, 13(Suppl 1):P147 (doi: 10.1186/cc7311)

Introduction A considerable body of evidence indicates the contribution of the immune system to the mechanisms leading to acute pancreatitis [1]. Data focusing on the early course of events in the clinical setting are lacking. The aim of the present study was to provide a comprehensive evaluation of peripheral blood monocytes and subpopulations of lymphocytes in patients with acute pancreatitis and systemic inflammatory response syndrome. Methods Forty-three subjects were enrolled in the study: 33 patients with acute pancreatitis and systemic inflammatory response syndrome, and 10 healthy subjects. Peripheral blood immune cells were studied on days 1 and 4 by flow cytometry.

Results Percentages of natural killer (NK) cells and of apoptosis of CD4 lymphocytes upon diagnosis are presented in Table 1. Changes of NK cells and of monocyte apoptosis are presented in Table 2.

Table 1 (abstract P147)

NK cells and CD4-lymphocyte apoptosis

Median (%) Acute pancreatitis Healthy controls P value

CD(16 + 56)+/CD3- 15.09 6.50 0.036

ANNEXIN+/CD4+ 7.16 4.63 NS

Table 2 (abstract P147)

Alterations over follow-up

Day 1 Day 4 P value

CD(16 + 56)+/CD3- 43.67 52.16 0.036

ANNEXIN+/CD14+ 66.37 40.39 0.021

Conclusions Results indicate an early and significant response of NK cells and of CD4 apoptosis in the initial events of acute pancreatitis. Reference

1. Dabrowski A, et al.: Monocyte subsets and natural killer cells in acute pancreatitis. Pancreatology 2008, 8:126-134.

Hungarian perioperative selenium survey in patients with oesophageal cancer

T Leiner1, A Mikor1, A Csomos2, T Vegh3, B Fulesdi3, M Nemeth1, Z Molnar1

1University of Pecs, Hungary; 2Semmelweis University, Budapest, Hungary; 3University of Debrecen, Hungary

Critical Care 2009, 13(Suppl 1):P148 (doi: 10.1186/cc7312)

Introduction Selenium is one of the most investigated trace elements and an important link in the antioxidant system [1]. It is known that selenium levels are lower than normal at the time of admittance to the ICU in critically ill patients [2]. Several cohort surveys highlight the role of selenium deficiency in the carcinogenesis of oesophageal cancer [3]. Mortality and morbidity data after oesophagectomy may vary, but remain considerably high [4]. Methods In three Hungarian university centres, 36 patients with oesophageal cancer who were operated on and admitted to the ICU (OG-group), and 96 healthy volunteers (C-group) were recruited. In the OG-group, full blood selenium levels were measured preoperatively (t0) and on the first (t1) and second (t2) postoperative days. Selenium levels were measured by atomic absorption spectrometry in the laboratories of Byosin Arzneimittel GmbH (Fellbach, Germany), blinded to the patients' condition or group assignment. All data are presented as the mean ± SD. To test for normal distribution the Kolmogorov-Smirnov test was used. For statistical analysis the independent-samples t test and ANOVA were used as appropriate, with the Statistical Program for Social Sciences (SPSS® version 15.0) software for Windows. Statistical significance was considered at P <0.05.

Results There was a significant difference in the full blood selenium levels between the controls and preoperative samples (t0) of the OG-group (123.86 ± 19.14 μg/l vs. 98.36 ± 19.02 μg/l; P <0.001). In the OG-group selenium levels decreased significantly during the study period; t0, 98.36 ± 19.02 μg/l; t1, 86.92 ± 17.04 μg/l; t2, 81.44 ± 18.31 μg/l; P = 0.001. Conclusions This study has shown significantly lower selenium levels in OG-patients as compared with controls and a significant decrease in the postoperative period. Whether this has any influence on outcome requires further investigation. References

1. Papp LV, et al.: From selenium to selenoproteins: synthesis, identity, and their role in human health. Antioxid Redox Signal 2007, 9:775-806.

2. Forceville X, et al.: Selenium, systemic immune response syndrome, sepsis, and outcome in critically ill patients. Crit Care Med 1998, 26:1536-1544.

3. Lu H, et al.: Dietary mineral and trace element intake and squamous cell carcinoma of the esophagus in a Chinese population. Nutr Cancer 2006, 55:63-70.

4. Rodgers M, et al.: Case volume as a predictor of inpatient mortality after esophagectomy. Arch Surg 2007, 142:829-839.

Selenium in critically ill children with cardiac dysfunction

M Abd Elmonim, TA Abd Elgawd, A Abd Elkareem, T Zidan

Ain Shams University, Cairo, Egypt

Critical Care 2009, 13(Suppl 1):P149 (doi: 10.1186/cc7313)

Introduction Selenium (Se) and selenocysteine residues are essential for the activity of glutathione peroxidase enzyme (GPX). GPX plays an important role in antioxidant defense. Se deficiency is reported in critically ill patients due to deficient dietetic intake, unsupplemented parenteral nutrition, catabolic state and increased losses. The objective of the study was to study the Se and GPX status in critically ill children with cardiac dysfunction and the effects of Se supplementation.

Methods Thirty-five critically ill cardiac children (mean age: 2.82 ± 3.41 years) with different cardiac disorders (15 with myocarditis, 15 with cardiomyopathy and five with rheumatic heart disease) were investigated for blood Se and GPX levels and echocardiographic parameters at admission and after 3 days of parenteral Se supplementation (initial dose of 2 μg/kg/day on the first day followed by 1 μg/kg/day). Fifteen healthy children were included as controls.

Results At admission, Se and GPX levels were significantly decreased in patients (6.61 ± 1.16 μg/l and 10.8 ± 1.14 U/l) compared with controls (16.06 ± 2.08 μg/l and 20.64 ± 2.13 U/l) (P <0.001). Se levels did not differ between the studied cardiac diseases (P >0.05). Se levels correlated positively with corresponding ejection fraction values (EF) (r = 0.57 and P <0.05) and fractional shortening values (FS) (r = 0.45 and P <0.05), and negatively with left ventricular end-diastolic diameter (r = -0.50 and P <0.05). After Se supplementation, Se and GPX were raised (16.53 ± 2.25 μg/l and 19.71 ± 2.63 U/l). Clinical examination revealed that orthopnea improved in 88.6% of cases and dysrhythmia disappeared in 65.7%. Echocardiography showed that EF and FS were also significantly improved (admission values for EF and FS were 34.6 ± 7.4 and 16.8 ± 4.1, compared with 52.6 ± 11.4 and 27.7 ± 6.8 after Se supplementation; P <0.05). Conclusions Se deficiency is one of the mechanisms of worsening cardiac dysfunction in critically ill patients regardless of the underlying cardiac etiology. Se supplementation can reverse such a mechanism and improve cardiac performance.

Effect of high-dose selenium substitution on selected laboratory parameters and prognosis in critically ill patients

H Brodska, A Kazda, J Valenta, J Hendl

University Hospital, Prague, Czech Republic

Critical Care 2009, 13(Suppl 1):P150 (doi: 10.1186/cc7314)

Introduction High doses of selenium (Se) may improve the condition of ICU patients with respect to its involvement in antioxidative protection and other functions. In previous work we concluded that the decrease in 28-day mortality was not significant. In the present study procalcitonin (PCT), C-reactive protein (CRP) and the Sequential Organ Failure Assessment (SOFA) score, as markers of the severity of the actual status, were monitored and evaluated in relation to the plasma Se concentration.

Methods One hundred and forty patients were randomized into groups A and B. Group A received standard Se substitution: 30 to 75 μg/day sodium selenite intravenously; group B received Se substitution according to a protocol: 1,000 μg on day 1, followed by 500 μg/day on days 2 to 14. These groups were divided into four subgroups: systemic inflammatory response syndrome, sepsis, severe sepsis and septic shock. Plasma levels of Se, CRP and PCT were examined. The SOFA score and 28-day mortality were evaluated as clinical markers. Relations between parameters were evaluated statistically. In both groups, transitions among subgroups during treatment were evaluated. All patients monitored fewer than three times were excluded.

Results Negative correlations between Se and PCT, Se and SOFA, and Se and CRP were found (Table 1). Correlation coefficients (r) were statistically significantly higher in group B. Evaluation of transitions among subgroups during treatment showed a difference in the septic shock subgroup between groups A and B: 28% in group A (14 examinations) and 44% in group B (16 examinations) moved into a less severe stage of sepsis.

Table 1 (abstract P150)

Negative correlations

Parameter Group A Group B Significance

Se x PCT r = -0.24 r = -0.29 P < 0.05

Se x CRP r = -0.41 r = -0.53 P < 0.05

Se x SOFA r = -0.29 r = -0.03 NS

Conclusions There were more significant negative correlations between plasma Se and parameters of inflammation in the group of patients supplemented with high doses of Se. In accordance with this, transition from septic shock to a less severe stage of sepsis was more frequent. Even though a decrease in mortality was again not found, the presented results may indicate a positive metabolic influence in critical illness.

Dynamic generation of physiological model systems

J Kretschmer1, A Wahl2, J Guttmann2, K Moller1

1Furtwangen University, Villingen-Schwenningen, Germany;

2University Hospital Freiburg, Germany

Critical Care 2009, 13(Suppl 1):P151 (doi: 10.1186/cc7315)

Introduction Mathematical models are widely used to simulate physiological processes in the human body and can be exploited for diagnostic purposes or the automation of therapeutic measures [1]. Usually these models focus on one single aspect of human physiology. Complex models with interaction between different physiological processes usually do not consist of interchangeable submodels. We therefore designed versatile software based on Matlab with dynamically exchangeable subsystems within the three model families of respiratory mechanics, gas exchange and cardiovascular dynamics.

Methods For each submodel the parameters have been extracted from the corresponding literature. Common interfaces were defined for each model family based on these parameters to ensure interchangeability within the same model family. For simulation of human gas exchange we used a two-compartment model with oxygen and carbon dioxide dissociation curves. The model family of cardiovascular dynamics consisted of a single-compartment model and a six-compartment model including the response to pleural pressure. The respiratory mechanics models were a first-order resistance-compliance model and a second-order resistance-compliance model. Model parameters were fitted to human test data or to data taken from the literature. Simulation is executed using a dedicated caller program that combines the selected submodels and executes them in sequence at each time step. Submodel selection and basic parameter specification can be done via a graphical user interface. The software was tested with different model combinations. Each combination was supplied with alterations in ventilation frequency and positive end-expiratory pressure.
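The architecture described in these Methods (a common interface per model family, interchangeable submodels, and a caller program that executes the selected submodels at each time step) can be illustrated with a minimal sketch. The authors' software is Matlab based; the Python below is illustrative only, and all class names, parameter values and the crude series combination used for the second-order model are assumptions, not taken from the original implementation.

```python
from abc import ABC, abstractmethod

# Hypothetical common interface for one model family (respiratory mechanics):
# every submodel exposes the same step() signature so it can be swapped freely.
class RespiratoryMechanics(ABC):
    @abstractmethod
    def step(self, flow: float, dt: float) -> float:
        """Advance the model by dt seconds for a given airway flow [l/s];
        return airway pressure [cmH2O]."""

class FirstOrderRC(RespiratoryMechanics):
    """Single resistance-compliance (RC) model: P = R*flow + V/C + PEEP."""
    def __init__(self, R=10.0, C=0.05, peep=5.0):
        self.R, self.C, self.peep = R, C, peep
        self.volume = 0.0
    def step(self, flow, dt):
        self.volume += flow * dt
        return self.R * flow + self.volume / self.C + self.peep

class SecondOrderRC(RespiratoryMechanics):
    """Two RC compartments in series as a crude stand-in for a second-order model."""
    def __init__(self, R1=7.0, C1=0.03, R2=5.0, C2=0.02, peep=5.0):
        self.c1 = FirstOrderRC(R1, C1, peep=0.0)
        self.c2 = FirstOrderRC(R2, C2, peep=0.0)
        self.peep = peep
    def step(self, flow, dt):
        # crude series combination: compartment pressures add for the same inflow
        return self.c1.step(flow, dt) + self.c2.step(flow, dt) + self.peep

def run_simulation(mechanics: RespiratoryMechanics, flow_profile, dt=0.01):
    """Caller program: executes the selected submodel at each time step."""
    return [mechanics.step(q, dt) for q in flow_profile]

if __name__ == "__main__":
    # constant inspiratory flow of 0.5 l/s for 1 s, illustrative only
    flow = [0.5] * 100
    for model in (FirstOrderRC(), SecondOrderRC()):
        pressures = run_simulation(model, flow)
        print(type(model).__name__, "peak pressure:", round(max(pressures), 1))
```

Because both submodels satisfy the same interface, the caller does not need to know which one was selected, which is the property that makes the subsystems dynamically exchangeable.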

Results Simulation based on parameters from the literature with the variations described above showed plausible results. Alterations in ventilation frequency indicated a response time consistent with data acquired by Jensen and colleagues [2]. Conclusions The developed software is able to simulate different combinations of submodels at variable complexity. Simulation results are consistent with experimental data. Interaction between submodels can be seen in the simulation output. References

1. Lozano S, Moller K, Brendle A, et al.: AUTOPILOT-BT: a system for knowledge and model based mechanical ventilation. Technol Health Care 2008, 16:1-11.

2. Jensen MC, Lozano S, Gottlieb D, et al.: An evaluation of end-tidal CO2 change following alteration in ventilation frequency [abstract]. In MBEC Antwerpen Conference; 2008.

Intraaortic balloon pumping: why should we hurry up?

A Macas, A Mundinaite, G Baksyte

Kaunas University of Medicine, Kaunas, Lithuania

Critical Care 2009, 13(Suppl 1):P152 (doi: 10.1186/cc7316)

Introduction Application of the intraaortic balloon pump (IABP) for patients with acute myocardial infarction complicated by cardiogenic shock is undisputed, but the optimal time of IABP initiation is questionable. The goal of the study was to evaluate the influence of the IABP initiation time and of the dopamine dose on patient hemodynamic data and inhospital mortality.

Methods Sixty-two consecutive acute myocardial infarction patients managed with IABP were included in the study. The initiation time of IABP and administered dopamine doses were compared. Two subgroups of patients were separated: those receiving less than 10 μg/kg/minute dopamine, and others receiving 10 μg/kg/minute dopamine or more. Standard hemodynamic indices of cardiac output (CO), cardiac index (CI), stroke volume (SV), stroke index (SI), pulmonary capillary wedge pressure (PCWP), mean pulmonary artery pressure (PAmean), cardiac power (CP = CO x MAP / 451, where MAP is mean arterial pressure) and cardiac power index (CPI = CI x MAP / 451) were measured.
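As a worked illustration of the cardiac power formulas given above, the following minimal Python sketch computes CP and CPI from CO, CI and MAP. The input values (including a MAP of 85 mmHg) are illustrative only and are not data from the study.

```python
def cardiac_power(co_l_min: float, map_mmhg: float) -> float:
    """Cardiac power in watts: CP = CO [l/min] x MAP [mmHg] / 451."""
    return co_l_min * map_mmhg / 451.0

def cardiac_power_index(ci_l_min_m2: float, map_mmhg: float) -> float:
    """Cardiac power index in W/m2: CPI = CI [l/min/m2] x MAP [mmHg] / 451."""
    return ci_l_min_m2 * map_mmhg / 451.0

# Illustrative values only: CO 3.58 l/min, CI 1.9 l/min/m2, MAP 85 mmHg
print(round(cardiac_power(3.58, 85), 2))        # ~0.67 W
print(round(cardiac_power_index(1.9, 85), 2))   # ~0.36 W/m2
```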

Results CO in the first group was 3.58 ± 1.12 l/minute, in comparison with 2.54 ± 1.08 l/minute (P <0.0001) in the second group. CI correspondingly was 1.9 ± 0.6 l/minute/m2 and 1.38 ± 0.61 l/minute/m2 (P = 0.001), SV was 43.8 ± 26.14 ml and 30.01 ± 13.75 ml (P = 0.012), SI was 23.93 ± 15.58 ml/m2 and 16.28 ± 7.75 ml/m2 (P = 0.017), CP was 0.68 ± 0.25 W and 0.4 ± 0.23 W (P <0.0001), CPI was 0.36 ± 0.14 W/m2 and 0.22 ± 0.13 W/m2 (P <0.0001), PCWP was 21.32 ± 6.04 mmHg and 24.16 ± 7.42 mmHg (P = 0.104), PAmean was 28.74 ± 8.53 mmHg and 30.23 ± 7.05 mmHg (P = 0.47). In the second subgroup, hemodynamic indices (CO, CI, SV, SI, CP and CPI) were statistically significantly lower (P <0.05) in comparison with the data of patients receiving less than 10 μg/kg/minute, while PCWP and CVP values differed insignificantly. Inhospital mortality was higher among patients receiving 10 μg/kg/minute dopamine or more. Among 31 patients receiving less than 10 μg/kg/minute dopamine, 16 (51.6%) died, while among the 31 receiving 10 μg/kg/minute or more, 27 (87.1%) died. This was a significant difference according to the Pearson χ2 criterion (χ2 = 9.182, P = 0.002).

Conclusions Initiation of aortic counterpulsation should be considered as soon as possible, while the patient with acute myocardial infarction is treated with low doses of vasopressors.

Study of risk factors for female patients with acute myocardial infarction

H Fu1, Y Zhao2

1China Rehabilitation Research Center, Beijing, China; 2Gerontic Cardiovascular Disease Institution, Beijing, China

Critical Care 2009, 13(Suppl 1):P153 (doi: 10.1186/cc7317)

Introduction Acute myocardial infarction (AMI) is one of the most common cardiovascular emergencies. Female patients have different features from male patients. The objective of our study was to analyze risk factors for female patients with AMI. Methods Five hundred and eighty female patients were compared with 2,058 male patients for age, occupation, positive family history, type 2 diabetes mellitus, hypertension, and hyperlipemia. Then, according to age, patients with AMI were divided into four groups: <55 years, 55 to 64 years, 65 to 74 years, and >75 years. Risk factors were compared between female patients and male patients in each group.

Results Compared with male patients, female patients were older (P <0.01); the morbidity of type 2 diabetes mellitus and hypertension and the rate of physical labour were significantly higher in female patients (P <0.01, P <0.01, P <0.01); the morbidity of hyperlipemia and the rate of mental labour and positive family history were significantly lower in female patients (P <0.01, P <0.01, P <0.05). In the <55 years group, the morbidity of type 2 diabetes mellitus and hypertension and the rate of physical labour were significantly higher in female patients (P <0.05, P <0.01, P <0.01); the rate of mental labour was significantly lower in female patients (P <0.01). In the 55 to 64 years group, the morbidity of type 2 diabetes mellitus and hypertension and the rate of physical labour were significantly higher in female patients (P <0.01, P <0.05, P <0.01); the rate of mental labour was significantly lower in female patients (P <0.01). In the 65 to 74 years group, the morbidity of type 2 diabetes mellitus and hypertension and the rate of physical labour were significantly higher in female patients (P <0.05, P <0.01, P <0.01); the rate of mental labour was significantly lower in female patients (P <0.01). In the >75 years group, the rate of physical labour was significantly higher in female patients (P <0.01); the rate of mental labour was significantly lower in female patients (P <0.01). Conclusions Compared with male patients, female patients with AMI were older, and type 2 diabetes mellitus and hypertension probably played more important roles in female patients. The higher rate of physical labour and lower rate of mental labour in female patients probably contribute to protecting them from AMI.

Influence of arterial pressure on tissue perfusion in septic shock

E Tishkov, O Bukaev

Moscow State Medical University, Moscow, Russian Federation Critical Care 2009, 13(Suppl 1):P154 (doi: 10.1186/cc7318)

Introduction The aim of this study was to measure the effects of increasing mean arterial pressure (MAP) on systemic oxygen metabolism and regional tissue perfusion in septic shock. Methods Twenty patients with the diagnosis of septic shock who required pressor agents to maintain a MAP >65 mmHg after fluid resuscitation to a pulmonary artery occlusion pressure (PAOP) >12 mmHg were included. Norepinephrine was titrated to MAPs of 65 mmHg, 75 mmHg and 85 mmHg in 20 patients with septic shock.

Results At each level of MAP, hemodynamic parameters (heart rate, PAOP, cardiac index, left ventricular stroke work index, and systemic vascular resistance index), metabolic parameters (oxygen delivery, oxygen consumption, blood lactate), and regional perfusion parameters (gastric mucosal PCO2, skin capillary blood flow and red blood cell velocity, urine output) were measured. Increasing the MAP from 65 to 85 mmHg with norepinephrine resulted in increases in cardiac index from 3.5 ± 0.4 l/min/m2 to 5.0 ± 0.5 l/min/m2 (P < 0.03). Blood lactate was 3.5 ± 0.8 mEq/l at a MAP of 65 mmHg and 3.0 ± 0.8 mEq/l at 85 mmHg (P = NS). The gradient between arterial PCO2 and gastric intramucosal PCO2 was 11 ± 3 mmHg (1.5 ± 0.3 kPa) at a MAP of 65 mmHg and 15 ± 3 mmHg at 85 mmHg (2.0 ± 0.3 kPa, P = NS). Urine output at 65 mmHg was 40 ± 10 ml/hour and was 45 ± 12 ml/hour at 85 mmHg (P = NS). As the MAP was raised, there were no significant changes in skin capillary blood flow or red blood cell velocity.

Conclusions Increasing the MAP from 65 mmHg to 85 mmHg with norepinephrine does not significantly affect systemic oxygen metabolism, skin microcirculatory blood flow, urine output, or splanchnic perfusion.

Echocardiographic assessment of the effects of acute left ventricular pacing on patients with severe congestive heart failure and narrow QRS duration

K Hussein, H Elaassar, D Ragab, H Elattroush, R Soliman, H Khaled

Cairo University, Cairo, Egypt

Critical Care 2009, 13(Suppl 1):P155 (doi: 10.1186/cc7319)

Introduction More than 20% of patients with congestive heart failure (CHF) exhibit one form or another of mechanical dyssynchrony, intraventricular conduction impairment, or bundle branch block. The concept of dual-chamber pacing in refractory heart failure was introduced, to be followed later by the technique of biventricular pacing to restore cardiac synchrony in the failing heart. The aim of the present study was to address the issue of applying the technique of left ventricular (LV) pacing to that substrate of heart failure patients with a narrow rather than wide QRS complex, and with LV rather than biventricular pacing in order to permit the use of an ordinary dual-chamber pacemaker. Methods We conducted an acute study on 20 patients (15 male, five female; mean age 43 years); all had CHF (12 ischemic and eight idiopathic) with normal QRS duration on ECG. All patients were under maximal tolerated doses of antifailure treatment. All patients were subjected to M-mode and two-dimensional echocardiography to measure: left ventricular end-diastolic dimension (LVEDD), left ventricular end-systolic dimension (LVESD), fractional shortening (FS), ejection fraction (EF), mitral regurgitation area and cardiac output before and 15 minutes after LV pacing. All patients were subjected to temporary dual-chamber right atrial and LV pacing; the LV lead was passed retrogradely via the transaortic route. The pulmonary capillary wedge pressure (PCWP) was measured using a trilumen, balloon-tipped thermodilution Swan-Ganz catheter. Patients were divided into group I (PCWP >15 mmHg, 10 patients) and group II (PCWP <15 mmHg, 10 patients).

Results Echocardiographic measurements after pacing in group I showed significantly lower LVEDD (5.12 vs. 6.53 cm, P <0.004), lower LVESD (4.01 vs. 4.65 cm, P <0.034), smaller mitral regurgitation area (9.7 vs. 13.4 cm2, P <0.005), higher FS (18.9 vs. 17, P <0.04) and higher EF (37.9 vs. 35.5%, P <0.02). In contrast, following pacing in group II, the hemodynamics were not significantly different from pre-pacing values. Conclusions Single LV (rather than biventricular) pacing could achieve remarkable beneficial hemodynamic effects in patients with CHF even with a normal QRS, but only in the substrate of patients with a high PCWP. Although this is an acute study, our findings open the scope for widespread application of the concept of multisite pacing.

Left ventricular pacing alone improves haemodynamic variables

D Ragab

Cairo University, Cairo, Egypt

Critical Care 2009, 13(Suppl 1):P156 (doi: 10.1186/cc7320)

Introduction Despite advances in drug treatment, congestive heart failure (CHF) remains a major healthcare problem associated with a poor quality of life and a high mortality rate. During the past decade, cardiac resynchronisation therapy (CRT) using biventricular (BIV) pacing emerged as a promising technique improving the quality of life, exercise tolerance and mortality in patients with severe CHF. Left univentricular (LV) pacing is able to achieve the same mechanical synchronisation as BIV pacing in experimental studies and in humans, resulting in significant improvement in functional class, quality of life and exercise tolerance to the same extent as that observed with BIV stimulation. The aim was to study the acute hemodynamic effects of LV pacing in the category of patients with severe CHF and QRS duration <130 ms, and to determine whether a pulmonary capillary wedge pressure (PCWP) >15 mmHg could have an impact on these changes in hemodynamics.

Methods We conducted an acute study on 20 patients (15 male, five female; mean age 43 years); all had CHF (12 ischemic and eight idiopathic) with a QRS duration <130 ms. All patients had an ejection fraction <40%. Group I comprised 10 patients with PCWP >15 mmHg, and group II comprised 10 patients with PCWP <15 mmHg. Haemodynamics were measured using a Swan-Ganz catheter at baseline and during LV VDD pacing. All patients were under maximal tolerated doses of antifailure treatment. Results After LV pacing, group I showed a significant decrease in right atrial pressure (P = 0.001), and PCWP also significantly decreased (P = 0.0001), while the cardiac output significantly increased (P = 0.04). In group II none of these hemodynamic parameters showed any significant improvement. The improvement of hemodynamics in group I occurred despite the greater increase in QRS duration after LV pacing when compared with group II (24 vs. 21%, respectively).

Conclusions LV pacing acutely benefits CHF patients with PCWP >15 mmHg irrespective of the QRS duration. Although this is an acute study, our findings open the scope for revising the current wide QRS duration as an indication for CRT and to consider the less expensive LV rather than BIV pacing when a patient is considered a candidate for resynchronization.

Preoperative left atrial dysfunction and new-onset atrial fibrillation in cardiac surgery patients

M Brouard, JJ Jimenez, J Iribarren, L Lorente, R Perez, L Lorenzo, S Palmero, L Raja, N Perez, R Martinez, ML Mora

Hospital Universitario de Canarias, La Laguna SC, Tenerife, Spain Critical Care 2009, 13(Suppl 1):P157 (doi: 10.1186/cc7321)

Introduction Postoperative atrial fibrillation is one of the most frequent complications after cardiopulmonary bypass (CPB). The aim of the present study was to investigate the correlation between preoperative left atrial dysfunction assessed by tissue Doppler and postoperative new-onset atrial fibrillation (NOAF) after coronary artery bypass grafting (CABG).

Methods Preoperative transthoracic echocardiography Doppler was performed on elective cardiac surgery patients. Left atrial function was evaluated with tissue Doppler imaging of the mitral annulus.

Results We studied 92 patients, 73 (79%) males and 19 (21%) females, mean age 67 ± 10 years, in preoperative sinus rhythm who underwent elective CABG surgery under CPB. Nineteen patients (20.6%) developed NOAF at 34 ± 12 postoperative hours. Patients with NOAF were older (71 ± 7 vs. 66 ± 10 years, P = 0.034), had a larger left atrial diameter (LAD), lower peak atrial systolic mitral annular Doppler velocity (A'm) and higher E'/A' ratio in a bivariate analysis. Stepwise logistic regression analysis showed that LAD (OR = 2.23, 95% CI = 1.05 to 4.76; P = 0.033) and lower A'm (OR = 0.70, 95% CI = 0.55 to 0.99; P = 0.034) were independently associated with postoperative NOAF. Conclusions A preoperative left atrial dysfunction assessed by tissue Doppler imaging may identify the patients at risk of postoperative NOAF.

Right ventricular involvement in Takotsubo cardiomyopathy

MM Moeller, J Voelz, C Lenz, J Wicke, R Gradaus, J Neuzner

Klinikum Kassel, Germany

Critical Care 2009, 13(Suppl 1):P158 (doi: 10.1186/cc7322)

Introduction Takotsubo syndrome (TS) is characterized by a transient apical ballooning of the left ventricle, reversible ST-T-segment abnormalities and mildly elevated troponin without coronary artery stenosis, mimicking myocardial infarction (MI). Originally described as a disturbance of the left ventricle, an involvement of the apex of the right ventricle (RV) has recently been recognized, affecting approximately one-quarter of cases. Methods TS was diagnosed in 10 patients (nine female, one male; mean age 67 years) who displayed typical signs of acute myocardial ischemia, showed typical ECG changes and slightly elevated troponin I (0.8 ng/ml; n ± 3.5 ng/ml; normal <0.3 ng/ml). Echocardiography was obtained on admission in all patients. TS was presumed by identification of the typical LV apical ballooning configuration (echo positive). In 10 patients left heart catheterization was then performed immediately.

Results All patients displayed LV apical ballooning on LV angiography without coronary artery lesions. Nine patients were echo positive (90%; sensitivity 91%, specificity 100%); in the one echo-negative patient the LV angiogram showed only a very small apical ballooning area. In all of the nine patients classified as echo positive, involvement of the RV apex was identified as well. Additional hypokinesia of the middle part of the RV was seen in two patients; however, the basal RV wall segments were never compromised. One patient showed dynamic LV outflow tract obstruction.

Conclusions RV involvement was a common feature in TS, involving 90% of patients. In all cases, the apical portion of the RV was compromised, the basal segments were never affected. In most patients the typical wall motion disturbance was readily seen on transthoracic echocardiogram. RV involvement was identified in all echo-positive patients. Probably, identification of RV apical ballooning may aid in differentiating TS from MI.

Possible cause of bradycardia developing due to α2-sympathomimetic infusion

K Popugaev, I Savin, A Goriachev, A Oshorov, A Troitskiy, P Kalinin

Neurosurgical Research Institute N.N. Burdenko, Moscow, Russian Federation

Critical Care 2009, 13(Suppl 1):P159 (doi: 10.1186/cc7323)

Introduction Bradycardia is described as a side effect of α2-sympathomimetic infusion. It is assumed to be a dose-dependent phenomenon. Cholinergic antagonists or β-sympathomimetics are recommended for the correction of bradycardia. We propose that the severity of bradycardia associated with α2-sympathomimetic infusion depends on the level of hypothyroidism. The aim of this report is to provide evidence for our point of view. Methods Fifteen patients after sellar region tumor surgery were included in the study. Resistant arterial hypotension developed in the early postoperative period in all patients and they were therefore monitored with a Swan-Ganz catheter. The hormonal profile (triiodothyronine, thyroxine, free triiodothyronine, free thyroxine, and cortisol) was investigated daily during the whole period of hemodynamic monitoring.

Results The cause of resistant arterial hypotension in all cases was decreased vascular tone (systemic vascular resistance index = 1,503 ± 624 dyn·s·cm-5·m2), and so phenylephrine was the α2-sympathomimetic drug of choice. Administration of phenylephrine started with a mean dose of 2.9 μg/kg/minute, and the maximal mean dose during the period of arterial hypotension was 5 μg/kg/minute. Bradycardia developed in four patients during infusion of phenylephrine. All these patients had clinical signs of hypothyroidism (hypothermia, dynamic ileus, etc.) and decreased levels of T3, T4, free T3 and free T4. In order to correct bradycardia, the infusion of phenylephrine was combined with cholinergic antagonists and β-sympathomimetics. Simultaneously, the administered dose of thyroid hormones was increased.

Heart rates returned to the normal range when clinical signs of hypothyroidism were corrected and the level of thyroid hormones tended to rise. In 11 patients without clinical signs of hypothyroidism and with normal levels of thyroid hormones in plasma, bradycardia never developed even when they received high doses of phenylephrine (5.5 μg/kg/min).

Conclusions Bradycardia associated with α2-sympathomimetic infusion is a consequence of hypothyroidism in patients after sellar region tumor surgery. This phenomenon is not dose dependent. Patients who develop bradycardia after beginning α2-sympathomimetic infusion need to be screened for hypothyroidism. If hypothyroidism is confirmed, therapy with L-thyroxine should be administered immediately. In patients already receiving thyroid hormones, their doses need to be increased.

Automatic real-time detection of myocardial ischemia by epicardial accelerometer

P Halvorsen1, E Remme1, A Espinoza1, L Hoff2, H Skulstad1, T Edvardsen1, E Fosse1

1Rikshospitalet University Hospital, Oslo, Norway; 2Vestfold University College, Tønsberg, Norway

Critical Care 2009, 13(Suppl 1):P160 (doi: 10.1186/cc7324)

Introduction Epicardial accelerometers have been shown to detect myocardial ischemia with high sensitivity [1]. In this combined experimental and clinical study using an epicardial accelerometer, we aimed to test two methods for real-time automated detection of myocardial ischemia.

Methods One accelerometer (5 x 5 x 2 mm) was sutured in the perfusion area of the left anterior descending artery (LAD). Epicardial acceleration was recorded simultaneously with the ECG, and the ECG QRS complex was automatically detected for timing of systole. From the epicardial acceleration signal, circumferential peak velocity and displacement were automatically calculated within a time interval of 150 ms after peak R on the ECG. Experimental model: in 10 open-chest pigs, regional left ventricular function was reduced by temporary LAD occlusion and global myocardial function was changed by esmolol infusion. The myocardial circumferential strain measured by echocardiography was used to confirm ischemia. Clinical model: the accelerometer methods were tested in seven patients receiving coronary artery bypass grafting. The LAD was occluded for 3 minutes before grafting and the accelerometer measurements were compared with hemodynamics, ECG ST-segment analysis and strain by transoesophageal echocardiography.
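One plausible way to derive systolic velocity and displacement from the acceleration signal in the 150 ms window after the R peak is by numerical integration, as in the minimal Python sketch below. The exact signal processing used by the authors (filtering, drift correction, choice of axes) is not described in the abstract, so the function name, parameters and the synthetic input are illustrative assumptions only.

```python
import numpy as np

def systolic_velocity_displacement(acc, fs, r_peak_idx, window_s=0.150):
    """Integrate an epicardial acceleration trace (m/s^2, sampled at fs Hz)
    over a window after the R peak to obtain peak systolic velocity (m/s)
    and end-of-window displacement (mm). Filtering/drift correction omitted."""
    n = int(window_s * fs)
    seg = np.asarray(acc[r_peak_idx:r_peak_idx + n], dtype=float)
    dt = 1.0 / fs
    velocity = np.cumsum(seg) * dt              # first integration: velocity
    displacement = np.cumsum(velocity) * dt     # second integration: displacement
    return velocity.max(), displacement[-1] * 1000.0  # m/s, mm

# Synthetic example: a 10 Hz sinusoidal test acceleration after the R peak
fs = 1000
t = np.arange(0, 0.150, 1 / fs)
acc = 2.0 * np.sin(2 * np.pi * 10 * t)          # m/s^2, illustrative only
peak_v, disp_mm = systolic_velocity_displacement(acc, fs, r_peak_idx=0)
print(round(peak_v, 3), "m/s", round(disp_mm, 2), "mm")
```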

Results Accelerometer systolic displacement was superior to systolic velocity for detecting ischemia by automated analysis. Accelerometer systolic displacement demonstrated dyskinesia during LAD occlusion in pigs (11.5 ± 2.3 to -1.2 ± 2.8 mm, P <0.01), while hypokinesia was found in patients (12.8 ± 8.1 to 3.5 ± 4.4 mm, P <0.01). Ischemia was confirmed by strain echocardiography in both models (P <0.01). No significant changes in hemodynamics or in the ECG ST segment were seen during LAD occlusion in patients. In the experimental model, esmolol infusion induced fewer changes in the automated accelerometer measurements than LAD occlusion (P <0.01) and ischemia was detected with a sensitivity of 100% and specificity of 95 to 100%. Conclusions Sensitive real-time detection of myocardial ischemia was feasible by the use of automated analysis of continuous epicardial accelerometer signals. This technique may improve real-time detection of ischemia during and after cardiac surgery. Reference

1. Halvorsen PS, et al.: Br J Anaesth 2009, 102:29-37.

Prognosis of acute myocardial infarction outcomes using evaluation of cardiac power (product of cardiac output and mean arterial pressure)

A Macas, A Krisciukaitis, V Saferis, G Baksyte, A Mundinaite, V Semenaite

Kaunas University of Medicine, Kaunas, Lithuania

Critical Care 2009, 13(Suppl 1):P161 (doi: 10.1186/cc7325)

Introduction Insufficient reliability and specificity of cardiac output (CO) as a widely used parameter for prognosis of acute myocardial infarction (AMI) outcomes led to investigations and a search for new methods and parameters. Cardiac power (CP) (a parameter proportional to the product of CO and mean arterial pressure) was introduced after studies mainly performed using the invasive intermittent thermodilution (ITD) technique. The aim of this study was to investigate the reliability and specificity of the new parameter mainly by means of noninvasive techniques such as impedance cardiography (ICG).

Methods CO and CP were evaluated by both ITD and ICG methods in patients with AMI, admitted within 12 hours from the onset of pain. CP was evaluated using the suggested formula: CP = CO x MAP / 451, where MAP = mean arterial pressure. During the period of 2004 to 2008, 289 (196 men and 93 women) patients were investigated. The standard eight-electrode ICG registration was used. The optimal binning method using the minimal description length principle was used to predict outcomes after AMI: inhospital mortality, survival after 6 months and survival after 12 months.
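To illustrate how a single cut point on a parameter such as CP can be turned into an outcome prediction with an associated sensitivity and specificity, the minimal Python sketch below classifies an event whenever the value falls below the cut point and compares the result with the observed outcomes. The threshold direction (lower cardiac power predicting a worse outcome) and the example data are assumptions for illustration; they are neither the study data nor the authors' optimal-binning implementation.

```python
def sensitivity_specificity(values, outcomes, cut_point):
    """Classify 'event' when the parameter falls below the cut point
    (assumed direction: lower cardiac power -> worse outcome) and
    compare against observed outcomes (True = event occurred)."""
    tp = sum(1 for v, o in zip(values, outcomes) if v < cut_point and o)
    fn = sum(1 for v, o in zip(values, outcomes) if v >= cut_point and o)
    tn = sum(1 for v, o in zip(values, outcomes) if v >= cut_point and not o)
    fp = sum(1 for v, o in zip(values, outcomes) if v < cut_point and not o)
    sens = tp / (tp + fn) if tp + fn else float("nan")
    spec = tn / (tn + fp) if tn + fp else float("nan")
    return sens, spec

# Made-up example: cardiac power on day 1 (W) and inhospital death (True/False)
cp_day1 = [0.45, 0.55, 0.60, 0.70, 0.80, 0.95, 1.10]
died = [True, True, False, False, False, False, False]
print(sensitivity_specificity(cp_day1, died, cut_point=0.65))  # (1.0, 0.8)
```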

Results CP evaluated on the first day was found to be the only valuable prognostic parameter using the model entropy method in the group of patients in whom noninvasive evaluation of CO was used. Inhospital mortality was predicted with a single cut point of 0.65 (sensitivity 100%, specificity 92.2%), while 12-month survival was predicted with a single cut point of 0.90 (sensitivity 88.9%, specificity 73.0%). Only prediction of inhospital mortality was possible in the group of patients in whom CO was evaluated using the ITD technique. The most significant criteria using the minimized entropy model were CP evaluated on the third day (single cut point 0.79, sensitivity 84.6%, specificity 100.0%) and CO also evaluated on the third day (single cut point 4.00, sensitivity 84.6%, specificity 100.0%).

Conclusions Cardiac power is a reliable predictor of inhospital mortality and survival within the first year after acute myocardial infarction. It can be evaluated using ITD and, with sufficient accuracy, by means of the noninvasive ICG method as well.

Left ventricular torsion analysis using echocardiographic speckle tracking in a canine model of dyssynchrony and cardiac resynchronization therapy predicts global cardiac performance

B Lamia, M Tanabe, HK Kim, J Gorcsan, MR Pinsky

University of Pittsburgh Medical Center, Pittsburgh, PA, USA Critical Care 2009, 13(Suppl 1):P162 (doi: 10.1186/cc7326)

Introduction Left ventricular (LV) torsion is a primary mechanism used to eject blood during systole. We hypothesized that LV torsion is impaired during dyssynchronous contractions and restored with cardiac resynchronization therapy (CRT) in proportion to the degree that global LV performance improves. Methods Seven anesthetized open-chest dogs had high fidelity pressure and conductance volume catheters to assess LV

Figure 1 (abstract P162)

performance. Basal and apical Grayscale echo images were recorded. Right atrial (RA) pacing served as the control. Right ventricular (RV) outflow tract pacing created a left bundle branch block (LBBB) to simulate dyssynchrony. Simultaneous RV-LV free wall and RV-LV apex pacing were modeled CRT (CRTfw and CRTa). Torsion was estimated as the difference between apical and basal rotation in degrees.

Results Torsion during RA pacing was 7.0 ± 3.6°. RV pacing decreased torsion (5.1 ± 3.6°, P <0.05 vs. control), and reduced the stroke volume (SV), stroke work (SW), and dP/dtmax compared with RA (21 ± 5 vs. 17 ± 5 ml, 252 ± 61 vs. 151 ± 64 mJ, and 2,063 ± 456 vs. 1,603 ± 424 mmHg/s, P <0.05). CRTa improved torsion, SV, SW and dP/dtmax compared with RV pacing (7.7 ± 4.7°, 23 ± 3 ml, 240 ± 50 mJ and 1,947 ± 647 mmHg/s, respectively, P <0.05), whereas CRTfw did not (5.1 ± 3.6°, 18 ± 5 ml, 175 ± 48 mJ and 1,699 ± 432 mmHg/s, respectively, P <0.05) (Figure 1). Changes in torsion compared with RA covaried with changes in SW during RV, CRTa and CRTfw. Conclusions LV torsion, as quantified by speckle tracking echocardiography in an acute canine model, was reduced by dyssynchronous contraction and restored by CRT in proportion to the degree to which global measures of LV performance also improved. Thus, torsion and global LV performance are linked during synchronous and dyssynchronous contractions and CRT.

Cardiac diseases during pregnancy and the puerperium: 13 years in the ICU

K Baccar, N Baffoun, C Kaddour

National Institute of Neurology, Tunis, Tunisia

Critical Care 2009, 13(Suppl 1):P163 (doi: 10.1186/cc7327)

Introduction Pregnancy, labour and the postpartum period constitute major stresses on the cardiovascular system. Patients with heart disease may decompensate due to the physiologic changes that occur during pregnancy and may develop cardiac events in association with obstetric events.

Methods A retrospective collection of data for all obstetric patients admitted to our ICU over a 13-year period (September 1995 to September 2008). Data collected included the antepartum or peripartum heart status (abnormal findings on physical examination, ECG and echocardiography), reason for admission, ICU length of stay, prognostic scores (APACHE II, Obstetrical SAPS) and the outcome.

Results During 13 years there were 42 obstetric admissions to our ICU. The reason for admission was pulmonary edema in 16 (38.1%) cases, hemorrhagic shock in 11 (26.2%), cardiogenic shock in 8 (19%), pulmonary emboli in 4 (9.5%), stroke in 2 (4.8%) and recent myocardial infarction in 1 (2.4%) case.

Admission was unplanned for 36 (85.7%) parturients following emergency caesarean section. The heart status was severe aortic or mitral stenosis in 21 (50%) cases, mitral or aortic regurgitation with heart failure in 5 (11.9%), a mechanical prosthetic valve requiring anticoagulation in 8 (19%), myocardial infarction in 1 (2.4%) and peripartum cardiomyopathy (PPCM) in 7 (16.6%). The median duration of stay was 4 days (range 1 to 11 days), the median Obstetrical SAPS was 19.3 (range 10 to 32) and the APACHE II score was 13 (range 8 to 32). There were five (11.9%) maternal deaths: two due to PPCM, one to cerebral death and two to prosthetic valve thrombosis.

Conclusions The majority of these parturients were unbooked for antenatal care; this leads us to acknowledge the existence of risk factors related to pregnancy. Maternal functional class is an important predictor of outcome, and a high index of suspicion for cardiac disease is essential to identify risk [1]; once these patients are referred for a cardiologic opinion, the cardiologist needs to develop a systematic approach to their evaluation. Ideally these considerations should be commenced during prepregnancy consultations and continued throughout pregnancy and the postpartum period [1]. Reference

1. Siu SC, Colman JM: Cardiovascular problems and pregnancy: an approach to management. Cleve Clin J Med 2004, 71:977-985.

Endothelial nitric oxide synthase deficiency and inducible nitric oxide synthase inhibition in the setting of septic cardiomyopathy

A Van de Sandt1, R Windler1, A Gödecke2, J Ohlig1, S Becher1, E Van Faassen3, T Rassaf1, J Schrader2, M Kelm1, M Merx1

1Med. Clinic I, University Hospital RWTH Aachen, Germany;

2University of Düsseldorf, Germany; 3Debye Institute of Nanomaterials Science, Utrecht, the Netherlands Critical Care 2009, 13(Suppl 1):P164 (doi: 10.1186/cc7328)

Introduction Nitric oxide (NO) plays a central role in the pathogenesis of septic cardiomyopathy. However, the relative contribution of inducible nitric oxide synthase (iNOS) and endothelial nitric oxide synthase (eNOS) remains unclear. The aim of this study was to elucidate the influence of eNOS and iNOS on cardiac function, NO production rate and survival in the clinically relevant polymicrobial cecum ligation and puncture (CLP) model of sepsis. Methods B6/c57 wildtype (WT) and eNOS-/- mice were rendered septic by CLP or were sham operated. Immediately afterwards, the selective iNOS inhibitor 1400W (6.6 mg/kg body weight intraperitoneally and subcutaneously) or carrier was applied. At 12 hours after sepsis induction, heart function was assessed by pressure-volume loops (1.4 Fr Millar catheter). NOx levels and endogenous NO production (blood plasma/heart tissue) were measured via gas-phase chemiluminescence detection, high-performance liquid chromatography and electron paramagnetic resonance spectroscopy. To evaluate apoptosis and inflammation, quantitative immunochemistry was applied. iNOS expression was analyzed via RT-PCR.

Results Cardiac function was significantly impaired solely in septic WT mice, with diminished left ventricular developed pressure/dPdtmax (dPdtmax: WT CLP = 10,981 ± 1,100 mmHg/s vs. WT sham = 13,408 ± 827 mmHg/s; P <0.01) and increased left ventricular volumes. Inhibition of iNOS in septic WT mice resulted in ameliorated cardiovascular impairment, whereas no signs of septic cardiomyopathy were observed in septic eNOS-/-. In contrast to septic eNOS-/- mice, septic WT mice developed a significant increase in NO3-, NO2- and bioactive NO levels. eNOS deficiency was associated with diminished apoptosis and modified inflammation. In eNOS-/- mice, a decreased iNOS expression was observed compared with septic WT mice. Both genetic eNOS deficiency and pharmacologic iNOS inhibition were associated with a significant survival benefit. While eNOS-/- mice survived longest, additional iNOS inhibition in the latter diminished this benefit significantly.

Conclusions In this clinically relevant model of sepsis, eNOS constitutes an important factor mediating septic cardiomyopathy. To what extent the diminished inflammation, apoptosis, NO production rate, iNOS expression and prolonged survival in eNOS-/- mice contribute to the observed benefits remains to be clarified in future studies.

Use of levosimendan in myocardial dysfunction due to sepsis

J Vaitsis1, H Michalopoulou1, C Thomopoulos2, S Massias2, P Stamatis1

1'Metaxa' Hospital, Athens, Greece; 2'Elena Venizelou' Hospital, Athens, Greece

Critical Care 2009, 13(Suppl 1):P165 (doi: 10.1186/cc7329)

Introduction Myocardial dysfunction observed within the context of sepsis is partially due to desensitization of the cardiac muscle to vasoactive agents. Levosimendan, a calcium sensitizer and a K+-ATP channel opener, plays a significant role in the treatment of myocardial depression although its use in sepsis has not yet been established.

Methods We studied 42 patients (66.1 ± 7.54 years, 24 male) who met the criteria for sepsis (infection by Gram-negative bacteria and at least two SIRS criteria) and displayed severe heart dysfunction (cardiac index (CI) <2.2, ejection fraction (EF) <35%). The APACHE II score was 21.2 ± 4.9. In addition to their standard treatment, patients were randomized to receive levosimendan (0.1 μg/kg/minute as a 24-hour infusion, without loading dose) (group A, n = 23) or dobutamine (5 to 10 μg/kg/minute as a 24-hour infusion) (group B, n = 19). Noradrenaline was used to preserve the mean arterial pressure above 65 mmHg. The primary endpoint was mortality at 7 and 30 days.

Results Mortality at 7 and 30 days was 30% and 60% in group A versus 36% and 68% in group B (P = 0.05 and P = 0.03, respectively). The CI and EF increased significantly in group A (ΔCI: 1.79 ± 0.16 vs. 1.4 ± 0.12, P = 0.027; ΔEF: 4.8 ± 0.2% vs. 3.5 ± 0.7%, P = 0.04), and SvO2 (P = 0.012) and mean arterial pressure (P = 0.035) also increased significantly. Conclusions Levosimendan compared with dobutamine improves the hemodynamic profile of septic patients with myocardial dysfunction; and while there is evidence that it reduces mortality, further studies are needed to verify this.

Haemodynamic effects of levosimendan following cardiac surgery

A Slezina, E Strike, M Bekers-Anchipolovskis

Pauls Stradins Clinical University Hospital, Riga, Latvia Critical Care 2009, 13(Suppl 1):P166 (doi: 10.1186/cc7330)

Introduction Myocardial contractile function following cardiac surgery often requires inotropic support. Traditionally, phosphodiesterase inhibitors and catecholamines are used [1,2]. Levosimendan is a novel inotropic agent, a calcium sensitizer, that enhances myocardial contractility without increasing intracellular calcium and myocardial oxygen demand [3]. The aim of our study was to evaluate the circulation following levosimendan infusion, to find out the efficiency of tissue oxygen supply and to evaluate the dosage rates of other inotropes during levosimendan infusion. Methods Thirteen patients with acute heart failure following elective cardiac surgery under cardiopulmonary bypass treated with levosimendan and other inotropes were enrolled. Haemodynamic parameters (mean arterial pressure, central venous pressure, mean pulmonary artery pressure, cardiac index), PaO2, SvO2, plasma lactate, cardiac troponin I and the rates of other inotropes were obtained at baseline, 30 minutes, 1 hour, 6 hours and 24 hours after the start of levosimendan infusion. Levosimendan was administered at a rate of 0.1 μg/kg/minute (no bolus) with a total 24-hour dose of 12.5 mg for each patient. Other inotropic agents used were epinephrine (range 0.03 to 0.15 μg/kg/min), norepinephrine (range 0.03 to 0.18 μg/kg/min), dobutamine (range 3.5 to 7 μg/kg/min) and corotrope (range 0.3 to 0.6 μg/kg/min). Results Thirteen patients (nine female, four male) were investigated. Forty-six per cent of patients had New York Heart Association class III to IV congestive heart failure (ejection fraction <40%). Six patients died within 24 hours and were excluded. Maximal changes occurred 24 hours after levosimendan infusion. Mean arterial pressure increased, while the mean pulmonary artery pressure and central venous pressure decreased. The cardiac index did not change considerably. Oxidative stress markers improved. Mean infusion rates of epinephrine, dobutamine and corotrope decreased, while the mean norepinephrine rate did not change.

Conclusions Levosimendan improves the circulation by means of its positive inotropic effect, increases mean arterial pressure, and decreases pulmonary pressure. Levosimendan could be a drug of choice in myocardial hypoxia, particularly when combined with right heart failure. References

1. Hardy JF, et al.: J Cardiothorac Vasc Anesth 1993, 7(Suppl 2):33-39.

2. Orime Y, et al.: Jpn Circ J 1999, 63:117-122.

3. Raja SG, et al.: Ann Thorac Surg 2006, 81:1536-1546.

Effects of levosimendan started 24 hours before cardiopulmonary bypass in patients admitted to the ICU with an ejection fraction lower than 30%

J Duchateau1, H Vanden Eede2

1ZNA Middelheim, Antwerp, Belgium; 2HAGA Hospital, The Hague, the Netherlands

Critical Care 2009, 13(Suppl 1):P167 (doi: 10.1186/cc7331)

Introduction Patients with poor left ventricular function require inotropic drug support after cardiopulmonary bypass (CPB). The use of levosimendan in these patients is associated with better postoperative function [1]. A recent study has attributed this phenomenon to preconditioning effects of levosimendan [2]. However, another recent study showed no benefit in starting levosimendan before CPB [3]. We hypothesized that admitting the patient to intensive care 24 hours before CPB and starting levosimendan preoperatively could reduce postoperative cardiac damage.

Methods Fifty patients with an ejection fraction less than 30% scheduled for elective cardiac surgery with CPB received anesthesia with propofol, cisatracurium and sufentanil. The patients were randomly assigned to two protocols. Group A: levosimendan 0.1 μg/kg/minute started before CPB after induction + dobutamine 5 μg/kg/minute started after release of the aortic cross-clamp (Aox). Group B: levosimendan 0.1 μg/kg/minute started on the ICU 24 hours before CPB + dobutamine 5 μg/kg/minute started after release of the Aox. Data analysis: SV, dobutamine and noradrenaline time and dose, cardiac enzymes, global hemodynamic variables, ICU and hospital length of stay. Statistical significance was accepted at P <0.05.

Results There was no difference in postoperative stroke volume and troponin I. The incidence of postoperative atrial fibrillation in both groups was very low. There was a significant difference in ICU length of stay but not in hospital length of stay. Conclusions In cardiac surgery patients with a low preoperative ejection fraction there is no difference in cardiac function and postoperative troponin I levels between the two different levosimendan treatment modalities. The incidence of postoperative atrial fibrillation is low when levosimendan is used perioperatively. There is a significant reduction in the ICU length of stay, but not in the hospital length of stay. References

1. De Hert SG: Anesth Analg 2007, 104:766-773.

2. Tritapepe L: Br J Anaesth 2006, 96:694-700.

3. De Hert SG, et al.: J Cardiothorac Vasc Anesth 2008, 22: 699-705.

Utilization of levosimendan in the cardiac ICU: case series

D Filipescu, M Luchian, A Prodea, O Gheanu, A Calugareanu, S Marin, H Moldovan, A Iosifescu, O Chioncel

IBCV, Bucharest, Romania

Critical Care 2009, 13(Suppl 1):P168 (doi: 10.1186/cc7332)

Introduction Levosimendan, a novel calcium sensitizer, has been shown to improve hemodynamic function in patients with acute heart failure [1]. The aim of the study was to assess the hemodynamic effects of levosimendan as a rescue medication in addition to conventional therapy in patients with low cardiac output (LCO) after cardiac surgery or myocardial infarction. Methods Forty-one patients with LCO admitted to the cardiac ICU between June 2004 and November 2008 were included in this observational hemodynamic study. Thirty-four patients were admitted after open heart surgery and seven patients for ischemic acute heart failure. Levosimendan (0.1 μg/kg/min for 24 hours, without bolus) was added to conventional inotropes and/or intra-aortic balloon pump (IABP) support. The measured parameters were: cardiac output/index, pulmonary artery occlusion pressure (PAOP), left ventricular ejection fraction (LVEF), and mixed venous oxygen saturation (SvO2). Baseline data were collected before levosimendan administration and the following datasets were obtained at 6, 24 and 48 hours. Length of stay (LOS) in the cardiac ICU and inhospital mortality were also registered. Data were expressed as the mean ± SD. Fisher's exact test and a nonpaired t test were used when appropriate. P <0.05 was considered significant.

Results All patients were on dobutamine and epinephrine. IABP support was used in 28 (68%) of cases. The addition of levosimendan significantly improved the cardiac index (from 2.27 ± 0.9 l/min/m2 to 3.05 ± 0.9 l/min/m2 at 6 hours (P <0.01), 2.93 ± 0.8 l/min/m2 at 24 hours (P <0.02) and 2.93 ± 1.0 l/min/m2 at 48 hours (P <0.05)). LVEF increased by 19.8% (from 30 ± 8% to 37 ± 9% at 48 hours (P <0.05)). SvO2 improved significantly from 57 ± 13% to 69 ± 8% at 48 hours (P <0.01). PAOP decreased significantly only in the first 6 hours, from 19 ± 5 mmHg to 15 ± 4 mmHg (P <0.05). The mean LOS in the cardiac ICU was 14 ± 13 days. Inhospital mortality was 24.4%.

Conclusions In our case series, addition of levosimendan following ineffective conventional therapy resulted in substantial hemodynamic improvement. These preliminary results support the use of levosimendan in patients with LCO as a rescue medication with favorable short-term effects. Reference

1. Raja SG, et al.: Ann Thorac Surg 2006, 81:1536-1546.

Effects of levosimendan and inhaled nitric oxide on microcirculation in septic shock

A Morelli1, A Donati2, C Ertmer3, S Rehberg3, B Bollen Pinto3, A Orecchioni1, H Van Aken3, P Pelaia2, P Pietropaoli1, M Westphal3

1University of Rome, Italy; 2Marche Polytechnique University, Ancona, Italy; 3University of Muenster, Germany

Critical Care 2009, 13(Suppl 1):P169 (doi: 10.1186/cc7333)

Introduction Microvascular resuscitation is a crucial therapeutic goal in sepsis. The current study was performed to test the hypothesis that a combination of levosimendan and inhaled nitric oxide (INO) may improve microvascular perfusion in septic shock. Methods After initial hemodynamic stabilization (mean arterial pressure between 65 and 75 mmHg; mixed venous oxygen saturation >65%), seven patients with catecholamine-dependent septic shock received intravenous levosimendan 0.2 μg/kg/minute for 24 hours. At the end of the first 24 hours of the study period, inhaled nitric oxide (35 ppm) was added for another 12 hours. Sublingual microvascular perfusion was analyzed using the sidestream dark field method. The total vessel density (mm/mm2), perfused vessel density (mm/mm2), De Backer score (1/mm), microcirculatory flow index of small vessels (MFIs) and microcirculatory flow index of medium vessels (MFIm) were obtained at baseline and after 24 and 36 hours.
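
For readers less familiar with these sidestream dark field variables, the short sketch below illustrates how two of them are commonly derived; the segment lengths, image area and quadrant scores are hypothetical and this is not the analysis software used in the study.

def perfused_vessel_density(perfused_vessel_length_mm, image_area_mm2):
    # Perfused vessel density (mm/mm2): total length of perfused small vessels per image area.
    return perfused_vessel_length_mm / image_area_mm2

def microcirculatory_flow_index(quadrant_scores):
    # MFI: each image quadrant is commonly scored 0 (no flow) to 3 (continuous flow) and the scores are averaged.
    return sum(quadrant_scores) / len(quadrant_scores)

print(perfused_vessel_density(13.4, 1.0))         # 13.4 mm/mm2
print(microcirculatory_flow_index([3, 3, 2, 3]))  # 2.75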

Results Levosimendan significantly (P <0.05 vs. baseline) increased perfused vessel density from 11.3 mm/mm2 (10.7; 12.6) to 14.8 mm/mm2 (13.7; 16.1) and MFIs from 2 (1.9; 2.2) to 3 (2.8; 3). Addition of INO further increased MFIm from 2.6 (2.5; 2.8) to 3 (3; 3). Data are presented as median (25%; 75% range). No statistically significant differences were found in any of the other investigated parameters.

Conclusions The combination of levosimendan and INO may improve microvascular perfusion in septic shock.

Effects of levosimendan on renal function in septic shock: a case-control study

A Morelli1, C Ertmer2, S Rehberg2, A Orecchioni1, N Cannuovacciuolo1, B Bollen Pinto2, M Lange2, H Van Aken2, A Donati3, P Pietropaoli1, M Westphal2

1University of Rome, Italy; 2University of Muenster, Germany;

3Marche Polytechnique University, Ancona, Italy

Critical Care 2009, 13(Suppl 1):P170 (doi: 10.1186/cc7334)

Introduction Nonhemodynamic mechanisms of cell injury might play a role in the loss of glomerular filtration rate during sepsis [1]. We hypothesized that levosimendan may positively affect renal function by a combination of systemic and regional hemodynamic, anti-inflammatory and anti-apoptotic effects. We therefore performed a case-control study to investigate the effects of levosimendan on creatinine clearance in patients with septic shock. Methods Ninety-nine septic shock patients received levosimendan (0.2 μg/kg/min for 24 hours) within the first 36 hours following onset of septic shock. For each study patient, a control subject with septic shock from an institutional database was matched for Simplified Acute Physiology Score II, baseline creatinine concentration, delay from shock onset, age, and gender. Serum creatinine concentrations were analyzed just before the start of the 24-hour period of levosimendan infusion (baseline) and 96 hours after levosimendan had been initiated. The glomerular filtration rate was estimated by applying the Cockcroft-Gault formula.
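
As a reminder of the estimation step mentioned above, the Cockcroft-Gault creatinine clearance can be computed as follows; the patient values in the example are hypothetical and only illustrate the formula.

def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
    # Cockcroft-Gault estimate of creatinine clearance (ml/min):
    # CrCl = (140 - age) x weight / (72 x serum creatinine), multiplied by 0.85 for women.
    crcl = (140 - age_years) * weight_kg / (72.0 * serum_creatinine_mg_dl)
    return 0.85 * crcl if female else crcl

# Hypothetical patient: a 60-year-old, 80 kg man with a serum creatinine of 2.2 mg/dl
print(round(cockcroft_gault(60, 80, 2.2, female=False), 1))  # ~40.4 ml/min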

Results Compared with the control cohort, levosimendan significantly increased the glomerular filtration rate after 96 hours (62 ± 46 vs. 50 ± 33 ml/min, P < 0.05). In addition, the maximum serum creatinine concentration was lower in the levosimendan group (2.2 ± 1.3 vs. 2.6 ± 2 mg/dl, P <0.05 vs. control) during the 96-hour study period.

Conclusions The present data suggest that levosimendan may improve renal function in patients with septic shock. Reference

1. Wan L, et al.: Pathophysiology of septic acute kidney injury: what do we really know? Crit Care Med 2008, 36: S198-S203.

Levosimendan versus dobutamine in septic shock

JA Alhashemi, Q Alotaibi

King Abdulaziz University, Jeddah, Saudi Arabia

Critical Care 2009, 13(Suppl 1):P171 (doi: 10.1186/cc7335)

Introduction Levosimendan is a calcium sensitizer that increases cardiac contractility without increasing intracellular calcium levels. Its efficacy has been demonstrated in acute decompensated heart failure but has not been evaluated in severe sepsis/septic shock. We hypothesized that levosimendan increases the cardiac index similarly to dobutamine in patients with severe sepsis/septic shock. Methods In a randomized, open-label trial, 42 patients admitted to the ICU with severe sepsis/septic shock were randomized to receive either levosimendan (group L) or dobutamine (group D) as part of an early goal-directed therapy protocol [1]. Study drugs were titrated incrementally to an ScvO2 >70% or to a maximum dose, whichever was achieved first, and were continued for a total of 24 hours only. Group L received levosimendan 0.05 μg/kg/minute intravenously, increased by 0.05 μg/kg/minute every 30 minutes (maximum 0.2 μg/kg/min). Group D received dobutamine 5 μg/kg/minute intravenously, increased by 5 μg/kg/minute every 30 minutes (maximum 20 μg/kg/min). Rescue therapy consisted of dobutamine 10 μg/kg/minute intravenously titrated to ScvO2 >70% or a maximum of 20 μg/kg/minute, whichever was achieved first. Hypotension (mean arterial pressure (MAP) <65 mmHg) was treated with norepinephrine infusion, titrated to a MAP >65 mmHg. ScvO2 was recorded hourly. Cardiac output was measured continuously using the FloTrac™ device (Edwards Lifesciences, Irvine, CA, USA). Continuous data were analyzed using repeated-measures ANOVA, and proportions were compared using Fisher's exact test. Results are presented as the mean ± SD unless otherwise indicated. Significance was defined as P <0.05.
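
The stepwise titration described above amounts to a simple dosing rule; the sketch below, with hypothetical ScvO2 readings, only illustrates that rule and is not the study protocol itself.

def titrate(start_dose, step, max_dose, scvo2_readings, interval_min=30):
    # Increase the dose every 30 minutes until ScvO2 exceeds 70% or the maximum dose is reached.
    dose, schedule = start_dose, [(0, start_dose)]
    for i, scvo2 in enumerate(scvo2_readings, start=1):
        if scvo2 > 70 or dose >= max_dose:
            break
        dose = round(min(dose + step, max_dose), 2)
        schedule.append((i * interval_min, dose))
    return schedule

# Hypothetical ScvO2 readings (%) at 30-minute intervals
print(titrate(0.05, 0.05, 0.20, [62, 66, 69, 72]))  # levosimendan arm: [(0, 0.05), (30, 0.1), (60, 0.15), (90, 0.2)]
print(titrate(5, 5, 20, [62, 66, 69, 72]))          # dobutamine arm: [(0, 5), (30, 10), (60, 15), (90, 20)]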

Results APACHE II scores were 21 ± 7 versus 27 ± 7 (P = 0.02), and the ICU mortality was 10 (48%) versus 13 (62%) (P = 0.35) for groups L and D, respectively. The cardiac index was lower in group L compared with group D (estimated marginal mean ± SEM: 2.8 ± 0.1 vs. 3.2 ± 0.1, respectively, P <0.01) but the ScvO2 changes over time did not differ between groups (P >0.5). Norepinephrine was administered to 17 (81%) patients in group L and 21 (100%) in group D (P = 0.04).

Conclusions Dobutamine increased the cardiac index more than did levosimendan, although there was no difference between the two drugs with regard to their effects on ScvO2. Reference

1. Rivers E, et al.: N Engl J Med 2001, 345:1368-1377.

Levosimendan therapy does not improve survival of post-resuscitation cardiogenic shock patients

P Soos1, D Becker1, G Barczi1, G Szabo1, E Zima2, G Fulop1, L Geller1, A Apor1, B Merkely1

1Heart Center, Budapest, Hungary; 2Semmelweis University, Budapest, Hungary

Critical Care 2009, 13(Suppl 1):P172 (doi: 10.1186/cc7336)

Introduction The calcium sensitizer levosimendan enhances myocardial contractility, which could be advantageous in patients with myocardial ischemia requiring inotropic support. Methods During 3 years, 3,852 patients with high-risk acute coronary syndrome (ACS) underwent percutaneous coronary intervention in our department. In 106 cases ACS was complicated by cardiogenic shock (mean age: 68.6 ± 1.2 years); moreover, in 26 cases patients had to be resuscitated (cardiopulmonary resuscitation (CPR)). Short-term and long-term effects of levosimendan on cardiac function and on survival of cardiogenic shock and post-CPR patients were analyzed. Levosimendan was administered in 39 of 106 cases as add-on therapy for patients with impaired left ventricular function, extensive wall motion abnormality and high blood cardiac enzyme concentrations. Levosimendan therapy was started in most cases on the second or third day and applied for 6 hours as a continuous infusion (0.1 μg/kg/min). The mean time spent in the primary cardiac care center (inhospital time) was 6.0 ± 0.4 days, and the whole follow-up was 204.6 ± 29.9 days long.

Results In the post-CPR patient group there was no significant difference in survival according to levosimendan treatment during short-term follow-up (36.5% vs. 40.0%, P = 0.790) and during long-term follow-up (15.6% vs. 15.0%, P = 0.754). On the other hand, for nonresuscitated patients the survival rates were significantly higher in the levosimendan-treated patient group during short-term follow-up (84.4% vs. 57.9%, P <0.001) and during long-term follow-up (47.3% vs. 23.0%, P <0.001). The time interval between the onset of myocardial infarction and percutaneous coronary intervention did not influence the effect of levosimendan on short-term and long-term mortality.

Conclusions In summary, levosimendan may improve cardiac function and decrease short-term and even long-term mortality in cardiogenic shock patients independently of the time interval from myocardial infarction. This positive effect of levosimendan may be abolished in post-CPR patients.

Study of hemodynamics in patients treated with landiolol in the ICU

T Imabayashi, H Murayama, C Kuroki, N Kiyonaga, T Oryouji, S Tashiro, T Yasuda, Y Kakihana, A Matunaga, Y Kanmura

Kagoshima University Hospital, Kagoshima, Japan

Critical Care 2009, 13(Suppl 1):P173 (doi: 10.1186/cc7337)

Introduction We have seen patients whose cardiac function was maintained or deteriorated during treatment with a short-acting β-blocker, landiolol. We therefore investigated the effects of landiolol on cardiac function in these patients.

Methods From January through December 2007, 21 patients were selected if they had tachycardia (heart rate (HR) >100 beats/min) and they were treated with continuous infusion of landiolol. Hemodynamics were recorded using pulmonary artery catheters at four time points (before administration of landiolol, 1 hour after beginning of administration, immediately before discontinuation of administration, and 1 hour after discontinuation of administration). Infusion of landiolol was discontinued if the HR fell below 90 beats/min or the cardiac index (CI) fell below 2.0 l/min/m2. The paired Student t test was used to compare the differences. P <0.05 was statistically significant.

Results Between the point before administration of landiolol (2.4 ± 1.8 (mean ± SD) μg/kg/min) and 1 hour after the beginning of administration, although the HR decreased from 137 ± 20 beats/min to 109 ± 20 beats/min (P <0.01), the stroke index (SI) increased from 20.4 ± 8.1 ml/beat/m2 to 22.5 ± 7.0 ml/beat/m2, and thus the CI was maintained (2.76 ± 1.05 l/min/m2 and 2.43 ± 0.78 l/min/m2). Between the point before discontinuation of administration (1.4 ± 1.1 μg/kg/min) and 1 hour after discontinuation, the HR was not increased significantly (93 ± 18 beats/min and 100 ± 16 beats/min); however, the SI increased from 21.5 ± 6.7 ml/beat/m2 to 25.7 ± 5.4 ml/beat/m2 (P <0.05), and therefore the CI increased from 1.93 ± 0.54 l/min/m2 to 2.50 ± 0.40 l/min/m2 (P <0.01). Infusion of landiolol was discontinued because the HR fell below 90 beats/min in 12 patients and the CI fell below 2.0 l/min/m2 in nine cases. There were no significant differences in the catecholamine index, pulmonary artery pressure, and central venous pressure. Conclusions We investigated hemodynamics in 21 patients who received landiolol and recognized its rate-controlling effect and the potential for depression of the SI and the CI; thus, when using landiolol, attention must be paid to the HR and cardiac function [1].

1. Goto K, Shingu C, Miyamoto S, et al.: The effect of landiolol on hemodynamics and left ventricular function in patients with coronary artery disease. J Clin Anesth 2003, 19:523-529.

Effects of intravenous diltiazem on a porcine model of endotoxin-induced pulmonary hypertension

M Kyparissa, V Grosomanidis, K Kotzampassi, K Karakoulas, A Kolettas, B Fyntanidou, C Skourtis

University of Thessaloniki Medical School, Thessaloniki, Greece Critical Care 2009, 13(Suppl 1):P174 (doi: 10.1186/cc7338)

Introduction Pulmonary hypertension (PAH) is a life-threatening disease commonly seen in ICU septic patients. PAH is characterized by alterations in the pulmonary circulation leading to right ventricular failure and is associated with poor outcome in such patients. Various agents such as nitric oxide, prostaglandins and phosphodiesterase inhibitors have been used for its treatment. Drugs that induce pulmonary vasodilatation, increase contractility and maintain a stable haemodynamic profile seem an attractive treatment option in acute PAH patients. However, there is a lack of evidence-based guidelines in the current medical literature. Although oral diltiazem has been shown to improve haemodynamics in chronic PAH patients, it has not been used for the treatment of acute PAH. The aim of the present study was to evaluate the effects of intravenous diltiazem in a porcine model of acute PAH during sepsis. Methods PAH was induced in 16 anaesthetized, mechanically ventilated pigs (25 kg) by intravenous infusion of 0.5 mg/kg LPS (Escherichia coli O111:B4) over a period of 30 minutes. After LPS administration, animals were randomly divided into two groups. Group A received intravenous diltiazem 280 to 400 μg/kg, and Group B received an equal volume of placebo (normal saline) and served as the control. Haemodynamic measurements, including parameters of systemic and pulmonary circulation, were performed before and after LPS administration and every 20 minutes for 2 hours. Results After LPS infusion, both systolic pulmonary pressure (PAPs) and diastolic pulmonary pressure (PAPd) exhibited a statistically significant increase (PAPs from 18 ± 1 to 48 ± 7 mmHg, P <0.01, and PAPd from 7.7 ± 0.7 to 27 ± 4 mmHg, P <0.01) and remained elevated over time. Diltiazem administration reduced both PAPs and PAPd significantly in relation to placebo (P <0.001), but not to baseline levels. The heart rate, cardiac output and systemic haemodynamics exhibited no difference between groups throughout the time period.

Conclusions Intravenous diltiazem was associated with a reduction of pulmonary pressure without any systemic hypotension in a porcine model of endotoxin-induced acute PAH. This may represent a significant advance in the treatment of acute PAH. Potential clinical implications merit further study.

Acute liver failure induced by intravenous amiodarone in the cardiac care unit: retrospective study of 3 years

E Zima, V Szabo, I Osztheimer, T Barany, D Becker, L Molnar, P Soos, L Geller, B Merkely

Semmelweis University, Budapest, Hungary

Critical Care 2009, 13(Suppl 1):P175 (doi: 10.1186/cc7339)

Introduction Amiodarone is the antiarrhythmic drug of choice in the treatment of patients suffering from acute tachyarrhythmias and haemodynamic instability due to impaired cardiac function. The intravenous form of amiodarone hydrochloride (IvAm) has individual antiarrhythmic and rate-controlling efficacy, and the dosage is empirical. The main adverse effects are hypotension, severe bradycardia, asystole, acute heart failure, and impaired liver function. Acute liver failure (ALF) is a known but rare complication of IvAm that is reversible in most cases by stopping the infusion. The few papers in the literature suppose that ALF may be caused by polysorbate 80, the vehicle of IvAm. Oral administration does not seem to have such an adverse effect; therefore, IvAm can be changed to the oral form in such cases.

Methods Our aim was to investigate retrospectively the incidence of ALF and the relation between IvAm and ALF in cardiac patients. The history, treatment sheets and laboratory parameters of 11,722 patients treated in the Heart Center between 2005 and 2007 were analyzed. Patients were considered severe ALF patients if transaminase levels exceeded 80 x the upper limit of normal (ULN) during their stay in our clinic. This cutoff point was chosen to differentiate ALF patients from heart failure and myocardial infarction patients with elevated transaminase levels.
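
The screening criterion described above is a threshold on transaminase levels expressed as multiples of the ULN; the sketch below uses hypothetical ULN values (40 U/l) rather than the study laboratory's own limits.

def uln_multiple(value_u_l, uln_u_l):
    # Express a transaminase value as a multiple of the upper limit of normal (ULN).
    return value_u_l / uln_u_l

def severe_alf(ast_u_l, alt_u_l, ast_uln=40.0, alt_uln=40.0, threshold=80.0):
    # Screening rule as described above: severe ALF if either transaminase exceeds 80 x ULN.
    return uln_multiple(ast_u_l, ast_uln) > threshold or uln_multiple(alt_u_l, alt_uln) > threshold

print(severe_alf(ast_u_l=15000, alt_u_l=6000))  # True: AST here is 375 x ULN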

Results According to the enzyme levels, 55 patients suffered from severe ALF during the 3 years; 26 of them had received IvAm treatment. On the basis of the treatment sheets, the start and cessation of IvAm treatment, the status of acute myocardial infarction and heart failure, and the transaminase kinetics, eight patients had ALF induced by IvAm. The indication for amiodarone was atrial fibrillation (n = 6) and ventricular tachycardia. Average multiples of the ULN were 379 ± 190 for aspartate aminotransferase, 191 ± 87 for alanine aminotransferase and 57 ± 22 for lactate dehydrogenase. The time from the start of IvAm to detection of ALF was 17 ± 4.6 hours. One-quarter of these patients died in ALF. Liver enzymes decreased to 10 x ULN within 2.5 ± 0.6 days. Conclusions ALF is a rare but potentially life-threatening adverse effect of IvAm. The authors suggest monitoring liver enzymes from the start of IvAm treatment. Rapid elevation in liver enzyme levels indicates a hepatotoxic effect of IvAm. In these cases the immediate cessation of IvAm administration and the start of intensive care is life-saving.

Prevention of postoperative arrhythmias after pulmonary resection: celiprolol versus magnesium

S Ouerghi1, K Moncer2, N Frikha2, M Ben Ammar2, M Mebazaa2

1Hospital A Mami, Ariana, Tunisia; 2Mongi Slim Hospital, La Marsa, Tunisia

Critical Care 2009, 13(Suppl 1):P176 (doi: 10.1186/cc7340)

Introduction The incidence of arrhythmia after lobectomy is between 10% and 20% and approaches 40% after pneumonectomy. Postoperative arrhythmia is associated with higher morbidity and mortality [1]. The objective of this study was to compare the effect of an oral β-blocker (celiprolol) and intravenous magnesium (Mg) on the frequency of supraventricular arrhythmia (SVA) after pulmonary resection.

Methods Twenty-six patients undergoing pneumonectomy or bilobectomy were randomised to receive either celiprolol (group 1, n = 13) or intravenous Mg (group 2, n = 13). Patients were excluded if they had a history of congestive heart failure, second-degree or greater heart block, a history of SVA, or were receiving an oral β-blocker, diltiazem, or verapamil. Patients in group 1 received oral celiprolol (100 mg every 8 hours) starting before the operation and continued for 10 days postoperatively. Group 2 received 2 g Mg intravenously at the time of the thoracotomy, at 6 hours and then every day for 3 days. All patients were followed clinically with at least four daily ECGs for up to 10 days. Statistical analyses were performed using the statistical package SPSS version 11.0. P <0.05 was considered significant. Results The mean age of the 26 patients was 53 ± 14 years. SVA developed in 42.3% of the patients and atrial fibrillation (AF) in 19.2%. The mean heart rate was 82 ± 10 beats/minute in group 1 versus 92 ± 10 beats/minute in group 2. The incidence of SVA was significantly higher in group 2 (30.4% vs. 11.4%, P = 0.047). However, the incidence of AF was similar in the two groups (group 1: 3.8% vs. group 2: 15.2%, P = 0.135). The peak occurrence of SVA was on postoperative day 2. Right pneumonectomy and intrapericardial resection were not associated with increased development of postoperative SVA. No serious adverse effects caused by celiprolol or magnesium were seen. Conclusions Terzi and colleagues [2] demonstrated a significant reduction in the incidence of AF using Mg. In our study, Mg had no role in the prophylaxis of SVA in lung surgery. Perioperative celiprolol can reduce the frequency of SVA without serious side effects. In fact, increased sympathetic activity is one of the predominant factors in the cause of this complication. References

1. Amar D: Perioperative atrial tachyarrhythmias. Anesthesiology 2002, 97:1618-1623.

2. Terzi A, et al.: Prevention of atrial tachyarrhythmias after non-cardiac thoracic surgery by infusion of magnesium sulfate. Thorac Cardiovasc Surg 1996, 44:300-303.

Low-dose dopamine is not useful in kidney transplantation

M Ciapetti, S Di Valvasone, M Bonizzoli, V Marcellino, A Di Filippo, A Peris

Careggi Teaching Hospital, Florence, Italy

Critical Care 2009, 13(Suppl 1):P177 (doi: 10.1186/cc7341)

Introduction In kidney transplantation, low-dose dopamine (LDD) (0.5 to 2.5 μg/kg/min) is used to increase urine output and to prevent arterial vasospasm and acute tubular necrosis, but there is some controversy about its use [1-4]. The aim of the study was to evaluate the effectiveness of LDD in the early post-transplant period.

Methods Fifty kidney transplant recipients admitted to the ICU in the early post-transplant period were allocated to two groups: A (n = 20) treated and B (n = 30) not treated with LDD. All patients underwent postoperative intensive monitoring, with blood sampling and assessment of kidney function at 0, 6 and 12 hours in the ICU. Postoperative therapy was the same for all patients, except for LDD. The intravascular volume was kept effective by maintaining central venous pressure >5 mmHg and ScvO2 >70%. We collected donor and transplant kidney parameters (age, sex, cause of death, ischaemia time), recipient parameters (age, sex, weight, height, BMI, duration of dialysis, end-stage renal disease, Simplified Acute Physiology Score II), intraoperative parameters (metabolic, respiratory and hemodynamic), hemodynamic and kidney function parameters in the ICU (heart rate, mean arterial pressure, central venous pressure, ScvO2, lactate, diuresis, blood urea nitrogen, creatinine, fluid balance), and outcome parameters (ICU length of stay, postoperative complications, acute rejection at 28 days, mortality at 6 months).

Results There were no significant differences in donor and graft data, recipient data, intraoperative data, hemodynamic and kidney function data or outcomes between the two groups. The significant differences between the two groups of patients were: the Simplified Acute Physiology Score II was higher in group B (A: 24.4 ± 12.9; B: 31.2 ± 9.4; P <0.05); heart rate was higher in group A at each observation time (at 6 hours A: 94.2 ± 17.03; B: 85.6 ± 12.2; at 12 hours A: 92.7 ± 15.5; B: 82.6 ± 15.2; P <0.05); and ICU length of stay was shorter in group B (A: 28.9 ± 17.3; B: 20 ± 7.2; P <0.05). Conclusions LDD in kidney transplantation does not improve kidney function during the postoperative period, nor short-term or medium-term outcome, but it is associated with a higher heart rate and a longer ICU length of stay. References

1. Dalton R, et al.: Transplantation 2005, 79:1561-1567.

2. Donmez A, et al.: Transplant Proc 1999, 31:3305-3306.

3. Flancbaum L, et al.: Clin Transplant 1998, 12:256-259.

4. Spicer ST, et al.: Clin Transplant 1999, 13:479-483.

Low-dose nitroglycerin improves microcirculation in hospitalized patients with acute heart failure

CA Den Uil1, WK Lagrand2, PE Spronk3, M Van der Ent1, L Jewbali1, JJ Brugts1, C Ince1, ML Simoons1

1Erasmus MC, Rotterdam, the Netherlands; 2LUMC, Leiden, the Netherlands; 3Gelre Hospitals, Apeldoorn, the Netherlands Critical Care 2009, 13(Suppl 1):P178 (doi: 10.1186/cc7342)

Introduction Impaired tissue perfusion is often observed in patients with acute heart failure. We tested whether low-dose nitroglycerin (NTG) improves microcirculatory blood flow in patients admitted for acute heart failure.

Methods In 20 acute heart failure patients, NTG was given as an intravenous infusion at a fixed dose of 33 μg/minute. Using sidestream dark field imaging, sublingual microvascular perfusion was evaluated before (T0, average of two baseline measurements) and 15 minutes after initiation of NTG (T1). In a subgroup of seven patients, NTG was stopped for 20 minutes, whereupon sidestream dark field measurements were repeated. Capillaries were defined as microvessels with a diameter of <20 μm. The perfused capillary density (PCD) was determined as the parameter of tissue perfusion. Values are expressed as the median and interquartile range (P25; P75).

Results The median age of the subjects was 60 (52; 73) years and 65% were male. Patients were in a stable condition before starting NTG. NTG decreased the central venous pressure (17 (13; 19) mmHg at T0 vs. 16 (13; 17) mmHg at T1, P = 0.03) and pulmonary capillary wedge pressure (23 (18; 31) mmHg at T0 vs. 19 (16; 25) mmHg at T1, P = 0.03). NTG increased PCD (10.7 (9.9; 12.5) mm/mm2 at T0 vs. 12.4 (11.4; 13.6) mm/mm2 at T1, P = 0.01). After cessation of NTG, PCD returned to baseline values (P = 0.04).

Conclusions Low-dose NTG significantly reduces cardiac filling pressures and improves microvascular perfusion in patients admitted for acute heart failure.

Effects of intravenous nitroglycerin and noradrenaline on gastric microvascular perfusion in an experimental model of gastric tube reconstruction

M Buise1, D Gommers2, J De Jonge2, M Van Genderen2, J Bakker2, J Van Bommel2

1Catharina Hospital, Eindhoven, the Netherlands; 2Erasmus Medical Center, Rotterdam, the Netherlands

Critical Care 2009, 13(Suppl 1):P179 (doi: 10.1186/cc7343)

Introduction Esophagectomy with gastric tube reconstruction is the surgical treatment for cancer of the esophagus. The perfusion of the distal part of the gastric tube depends exclusively on the microcirculation, making it susceptible to hypoperfusion and ischemia. It is unknown whether an increased perfusion pressure can exert a beneficial effect on gastric tissue perfusion. Methods For this purpose we performed a gastric tube reconstruction in 12 pigs, mean bodyweight 32 ± 1 kg (mean ± SE). Besides systemic hemodynamic parameters, the gastric microvascular blood flow (MBF) was assessed on the pylorus, corpus and fundus with laser Doppler flowmetry, and gastric microvascular HbO2 saturation (mHbSO2) and hemoglobin concentration (mHbcon) were assessed with spectrophotometry. Animals were evenly randomized over two groups: in both groups the mean arterial pressure was increased from 50 to 110 mmHg with infusion of noradrenaline; however, in the nitroglycerin (NTG) group the central venous pressure was maintained below 10 mmHg throughout the entire experiment with intravenous NTG. In this way, we aimed to increase the perfusion pressure gradient over the gastric tube tissue. Results Central venous and pulmonary capillary wedge pressures were lower in the NTG group. Although the systemic circulation tended to be more dynamic in the NTG group at baseline, all systemic hemodynamic parameters were similar in both groups throughout the experiment. Baseline MBF was overall higher in the NTG group. Following surgery, the MBF decreased severely in both groups, especially in the upper gastric tube. At higher mean arterial pressure (MAP), MBF tended to be higher than at the lowest MAP levels. Overall, MBF was higher in the NTG group. mHbcon levels increased significantly with the initial decreases in flow and remained lower in the NTG group. mHbSO2 values were not different between groups and did not change at different MAP levels.

Conclusions In our experimental model, tissue perfusion is severely compromised following formation of the gastric tube; this effect is aggravated by systemic hypotension independent from cardiac output. Venous congestion might contribute to this effect and can be prevented with continuous intravenous administration of NTG. Clinical studies will have to demonstrate an effect on anastomotic healing and outcome.

Norepinephrine: more than blood pressure cosmetics?

L Hiltebrand, S Brandt, O Kimberger, E Koepfli

University Hospital Bern, Switzerland

Critical Care 2009, 13(Suppl 1):P180 (doi: 10.1186/cc7344)

Introduction Norepinephrine (NE) is used to increase blood pressure (mean arterial pressure (MAP)) if hypotension arises. Increasing the perfusion pressure may increase blood flow in regions at risk. But vasoconstriction might worsen microcirculatory flow. We investigated the effects of NE on systemic, splanchnic and microcirculatory (microvascular blood flow (MBF)) blood flow in hypotensive pigs after major surgery.

Methods Twenty-seven pigs (30 ± 3 kg) were anesthetized, ventilated and underwent laparotomy. They were randomized to one of the following treatments: Group Low received 3 ml/kg/hour Ringer's lactate (RL) throughout the study. Group H received hydroxyethyl starch (130/0.4) to maintain SvO2 >60%. Group NE received 3 ml/kg/hour RL and NE to increase blood pressure to 65 and to 75 mmHg to match the MAP of Group H. Systemic, splanchnic, MBF blood flow and intestinal tissue oxygen tension were measured.

Results Baseline MAP was similar in all groups. To increase MAP to 65 and 75 mmHg, 0.035 and 0.12 μg/kg/minute NE were needed, respectively. The effects of NE on blood pressure, systemic, regional and MBF blood flow are shown in Figures 1 and 2. Conclusions NE increased MAP efficiently but had no beneficial effects on regional or MBF in the splanchnic region. This suggests that administration of vasopressors such as NE could be an unsafe way to maintain MAP because it may leave intestinal hypoperfusion undetected.

Figure 1 (abstract P180): Target MAP 65 mmHg (mean ± SD).

Figure 2 (abstract P180): Target MAP 75 mmHg (mean ± SD).

Vasopressin in pediatric vasodilatory shock: a multicentred randomized controlled trial

K Choong1, D Bohn2, D Fraser3

1McMaster University, Hamilton, ON, Canada; 2Hospital for Sick Children, Toronto, ON, Canada; 3Children's Hospital of Western Ontario, London, ON, Canada

Critical Care 2009, 13(Suppl 1):P181 (doi: 10.1186/cc7345)

Introduction Vasopressin has been suggested as a useful vasoactive agent in the support of vasodilatory shock in adults; however, its effect in children is unclear. We hypothesized that low-dose vasopressin in this population, administered as an adjunctive vasoactive agent, would lead to more rapid reversal of vasodilatory shock when compared with placebo. Methods In this multicenter, double-blind trial, children with clinical evidence of vasodilatory shock were randomized to receive either low-dose vasopressin (0.0005 to 0.002 U/kg/min) or placebo in addition to open-label vasoactive agents. Vasoactive infusions were titrated according to established guidelines to maintain target mean arterial pressure and adequate perfusion. The primary outcome was the time to vasoactive-free hemodynamic stability. Secondary outcomes included mortality, organ-failure-free days, length of stay, and adverse and serious adverse events. Results Sixty-five out of 69 children (94%) who were randomized received the study drug (33 vasopressin, 32 placebo) and were included in the analysis. There was no significant difference between the vasopressin and placebo groups in the time to vasoactive-free hemodynamic stability (49.7 vs. 47.1 hours, respectively, P = 0.85). There were 10 deaths (30%) in the vasopressin group and five deaths (15.6%) in the placebo group (relative risk = 1.94; 95% CI = 0.75 to 5.05, P = 0.24). There were no significant differences between the two groups with respect to organ-failure-free days (22 vs. 25.5 days, P = 0.11), ventilator-free days (16.5 vs. 23 days, P = 0.15), length of pediatric critical care unit stay (8 vs. 8.5 days, P = 0.93), or the adverse and serious adverse event rate ratios (12.0%, 95% CI = -2.6 to 26.7, P = 0.15; and 3.2%, 95% CI = -13.7 to 7.8, P = 0.55, respectively). Conclusions In this multicenter, randomized, placebo-controlled trial in pediatric patients with vasodilatory shock, low-dose vasopressin did not demonstrate any beneficial effect and showed a concerning trend towards higher mortality.

Arginine vasopressin increases plasma levels of von Willebrand factor in sheep

S Rehberg1, R Laporte2, P Enkhbaatar1, E La2, K Wisniewski2, L Traber1, P Riviere2, DL Traber1

1University of Texas Medical Branch, Galveston, TX, USA; 2Ferring, San Diego, CA, USA

Critical Care 2009, 13(Suppl 1):P182 (doi: 10.1186/cc7346)

Introduction The V1a/V2 receptor dual-agonist arginine vasopressin (AVP) is increasingly used in catecholamine-resistant septic shock [1]. While V1a receptor stimulation results in vasoconstriction, V2 receptor stimulation promotes coagulation, at least in part through an increase in plasma von Willebrand factor antigen (vWF:Ag) activity. We hypothesized that, at an intravenous infusion rate representative of the requirements for the treatment of sepsis-induced vasodilatory hypotension in sheep [2], AVP provides procoagulant activities. We tested this hypothesis by measuring vWF:Ag activity in unanesthetized healthy sheep during administration of AVP in comparison with the vWF:Ag activity increase induced by the selective V2 receptor agonist desmopressin (dDAVP).

Methods After two to four measurements of vWF:Ag activity and hemoglobin (Hb) over a 1-hour baseline (BL) period, 13 female sheep were randomly administered one of two treatments: an intravenous bolus of dDAVP (1 nmol/kg; n = 7) or a 2-hour intravenous infusion of AVP (3 pmol/kg/min; n = 6). vWF:Ag activity and Hb were measured every 30 minutes over 2 hours from the time of dDAVP administration or the initiation of AVP administration. For each sheep, vWF:Ag activity was corrected for plasma volume by calculating the vWF:Ag activity/Hb ratio and was expressed as percentage of the mean BL value. Data are expressed as the mean ± SEM.
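
The plasma-volume correction described above reduces to a haemoglobin-normalised ratio expressed relative to baseline; the values in the sketch below are hypothetical and only illustrate the calculation.

def vwf_hb_ratio_percent_of_baseline(vwf_ag, hb, baseline_vwf_ag, baseline_hb):
    # Correct vWF:Ag activity for plasma volume by dividing by haemoglobin,
    # then express the ratio as a percentage of the mean baseline ratio.
    return 100.0 * (vwf_ag / hb) / (baseline_vwf_ag / baseline_hb)

# Hypothetical example: vWF:Ag rises by 30% while haemoconcentration raises Hb by 5%
print(round(vwf_hb_ratio_percent_of_baseline(130, 10.5, 100, 10.0), 1))  # 123.8 (% of baseline)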

Results Following dDAVP bolus injection and during AVP infusion, the vWF:Ag activity/Hb ratio increased to a maximum of 135 ± 4% (n = 7) and 135 ± 6% (n = 4) of mean BL value, respectively (P < 0.001 and P = 0.002 vs. BL, respectively). The vWF:Ag activity/Hb ratio did not increase beyond the maximal fluctuation range of BL measurements in two out of the six sheep treated with AVP (maximum increases of 96% and 101%). Conclusions At an intravenous infusion rate representative of the requirements for the treatment of sepsis-induced vasodilatory hypotension in sheep [2], the V1a/V2 receptor dual-agonist AVP increased vWF:Ag activity to the same extent as the selective V2 receptor agonist dDAVP. Because of its V2 receptor agonist activity, the use of AVP may potentially amplify the microcirculation impairment caused by sepsis-induced coagulopathy. References

1. Russell JA, et al.: Vasopressin in septic shock. Crit Care Med 2007, 35:609-615.

2. Traber D: Selective V1a receptor agonists in experimental septic shock [abstract]. Crit Care 2007, 11(Suppl 4):P51.

Selective V1-agonism and selective V2-antagonism are superior to arginine vasopressin in ovine septic shock

S Rehberg1, C Ertmer1, BB Pinto1, DL Traber2, H Van Aken1, F Su3, JL Vincent3, M Westphal1

1University of Muenster, Germany; 2University of Texas Medical Branch, Galveston, TX, USA; 3Erasme Hospital, Université Libre de Bruxelles, Brussels, Belgium

Critical Care 2009, 13(Suppl 1):P183 (doi: 10.1186/cc7347)

Introduction The present study was designed as a prospective, randomized, laboratory experiment to compare the effects of a selective V1-agonist (Phe2-Orn8-vasotocin), a selective V2-antagonist ((propionyl1-d-Tyr(Et)2-Val4-Abu6-Arg8,9)-vasopressin) and arginine vasopressin (AVP), when given as first-line therapy, on hemodynamics, metabolic changes, mesenteric blood flow (Qma) and mortality using a clinically relevant model of septic shock [1]. Methods Fecal peritonitis was induced in 21 anesthetized, invasively monitored, mechanically ventilated sheep. A combination of crystalloids and 6% hydroxyethyl starch was titrated to maintain a constant hematocrit. Following the shock time (ST), defined as a mean arterial pressure (MAP) <60 mmHg, sheep were randomly assigned to receive either a continuous infusion of 1 μg/kg/hour V2-antagonist, 0.05 μg/kg/minute V1-agonist or 0.5 mU/kg/minute AVP (n = 7 each). Norepinephrine was titrated up to a maximum of 1 μg/kg/minute to maintain MAP at 70 ± 5 mmHg in all groups. Data are expressed as the mean ± SEM.

Results No significant differences between groups were detected at baseline and ST. The cardiac index and urine flow were similar between groups. The V1-agonist led to a higher Qma than both other groups from 5 to 8 hours after ST (P <0.05 each). Compared with AVP, selective V1-agonism stabilized MAP more effectively and allowed a reduction in cumulative norepinephrine requirements from 4 to 8 hours. The V1-agonist and the V2-antagonist did not differ in these variables. V2-antagonism attenuated the decrease in base excess compared with both other groups and was associated with increased fluid requirements compared with V1-agonism (18 ± 1 vs. 14 ± 1 ml/kg/hour). Compared with AVP, the V1-agonist and the V2-antagonist reduced arterial lactate levels (3.5 ± 0.5 and 4.1 ± 0.3 mmol/l vs. 5.5 ± 0.2 mmol/l, P <0.02 each) and prolonged survival (13 ± 1 hours and 14 ± 1 hours vs. 10 ± 1 hours; P <0.01 each). Conclusions Whereas V2-antagonism reduced metabolic acidosis, V1-agonism stabilized hemodynamics more effectively compared with AVP. Because of the prolonged survival time, selective V1-agonism and V2-antagonism might be superior to AVP infusion in septic shock. Future studies are warranted to investigate the combination of these two therapeutic strategies. Reference

1. Su F, et al.: Fluid resuscitation in severe sepsis and septic shock: albumin, hydroxyethyl starch, gelatin or Ringer's lactate - does it really make a difference? Shock 2007, 27: 520-526.

Adjunct terlipressin effect on vital organ perfusion during advanced life support in a porcine model of ventricular fibrillation

A Truhlar, V Cerny, Z Turek, D Kodejskova, J Suchankova

Charles University in Prague, University Hospital Hradec Kralove, Czech Republic

Critical Care 2009, 13(Suppl 1):P184 (doi: 10.1186/cc7348)

Introduction Drug administration is an integral part of advanced life support (ALS). The coronary perfusion pressure (CorPP) during cardiopulmonary resuscitation (CPR) predicts the probability of return of spontaneous circulation, while the cerebral perfusion pressure (CPP) affects brain damage. Guidelines for cardiac arrest treatment recommend giving adrenaline (ADR) although there is no evidence showing long-term benefit from any medication. Vasopressin was probably the most effective alternative studied in both experimental and clinical trials. As vasopressin has never been available in Europe, we evaluated the effect of its synthetic analogue, terlipressin (TER), on vital organ perfusion in an animal model of ventricular fibrillation (VF) (Terlipressin in Cardiac Arrest (TERCA) study). The use of TER in VF has never been studied before.

Methods A prospective, experimental study in 14 domestic pigs (30 to 35 kg) randomly assigned into two groups: A (ADR + TER; n = 7) and B (ADR + NaCl; n = 7). CorPP and CPP were calculated from right atrial, aortic and intracerebral pressures. VF was induced using an intracardiac pacing electrode. After 5 minutes of untreated arrest, chest compressions were started using the AutoPulse system (Zoll Circulation, Sunnyvale, CA, USA), simulating hands-only CPR. At 15 minutes after the onset of VF, ALS was started for the next 45 minutes. ADR 30 μg/kg and TER 30 μg/kg were administered intravenously 19 minutes after induction of VF in group A, while ADR and NaCl were given in group B. Equal doses of ADR were then repeated every 3 minutes in both groups. The design of the study reflected the real-life time intervals achieved in clinical trials. The primary endpoint was to compare CorPP and CPP between groups A and B. Data were analysed using SigmaStat. P <0.05 was considered statistically significant.
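
The abstract does not state the exact formulas used to derive CorPP and CPP from the measured pressures; the sketch below uses the definitions commonly applied during CPR (aortic minus right atrial pressure in the decompression phase, and mean arterial minus intracranial pressure) and should be read as an assumption rather than the authors' method. The example pressures are hypothetical.

def coronary_perfusion_pressure(aortic_diastolic_mmhg, right_atrial_diastolic_mmhg):
    # Common CPR definition (assumed here): aortic minus right atrial pressure in the decompression phase.
    return aortic_diastolic_mmhg - right_atrial_diastolic_mmhg

def cerebral_perfusion_pressure(mean_arterial_mmhg, intracranial_mmhg):
    # Cerebral perfusion pressure as mean arterial minus intracranial pressure.
    return mean_arterial_mmhg - intracranial_mmhg

# Hypothetical pressures giving values in the range reported for group A at 35 minutes
print(coronary_perfusion_pressure(32, 20))  # 12 mmHg
print(cerebral_perfusion_pressure(45, 22))  # 23 mmHg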

Results CorPP (mean ± SD) measured 35, 40 and 45 minutes after the onset of VF was 12 ± 4, 13 ± 4 and 11 ± 6 mmHg in group A (with adjunct TER), and 6 ± 4, 2 ± 7 and 1 ± 5 mmHg in control group B (P <0.05, P <0.01 and P <0.01). CPP at the same times was 23 ± 7, 24 ± 8 and 20 ± 7 mmHg in group A, and 13 ± 7, 8 ± 6 and 6 ± 5 mmHg in control group B (P <0.05, P <0.01 and P <0.01).

Conclusions Our results suggest a significant increase of CorPP and CPP after early adjunct treatment of VF with TER added to standard doses of ADR during prolonged CPR in pigs. Acknowledgement Supported by research project MZO 00179906.

Vasopressin versus norepinephrine infusion in patients with vasoplegic shock after cardiac surgery

L Hajjar, A Roquim, R Kalil Filho, F Galas, T Ticom, J Auler

Heart Institute, Sao Paulo, Brazil

Critical Care 2009, 13(Suppl 1):P185 (doi: 10.1186/cc7349)

Introduction Vasopressin is commonly used as an adjunct to catecholamines to support blood pressure in refractory septic shock. Its effect on vasoplegic shock after cardiac surgery is unknown. We hypothesized that low-dose vasopressin, as compared with norepinephrine, would decrease the length of stay in the ICU in patients undergoing on-pump cardiac surgery. Methods We enrolled patients who presented with vasoplegic shock within the first 24 hours after arrival in the ICU following cardiac surgery. During 6 months, out of 458 patients undergoing cardiac surgery, 82 developed vasoplegic shock and participated in the study. We randomly assigned patients to receive norepinephrine (initial doses of 5 mg) or vasopressin (0.01 to 0.04 U/min). All vasopressor infusions were titrated and tapered according to protocols to maintain a target blood pressure. The primary endpoint was the length of ICU stay. Secondary endpoints were mortality, the time requiring vasopressors, the incidence of organ dysfunction and adverse effects.

Results A total of 82 patients were included in the analysis - 42 patients received norepinephrine and 40 patients received vasopressin. There was a significant difference between the norepinephrine and vasopressin groups in the length of ICU stay (4.2 days vs. 7.3 days, P <0.005). Also, the vasopressin group of patients presented a lower incidence of renal failure compared with the norepinephrine group (4.5% vs. 8.9%, P <0.001). There was no significant difference in the overall rates of adverse events, rates of mortality and timing of vasopressor therapy. Conclusions Low-dose vasopressin reduced the length of ICU stay and the incidence of renal failure in patients with vasoplegic shock after cardiac surgery compared with norepinephrine. Also, the vasopressin group did not present more adverse events. References

1. Dunser MW, et al.: Circulation 2003, 107:2313-2319.

2. Luckner G, et al.: Crit Care Med 2007, 35:2280-2285.

Arterial catheter-related infection according to the catheter site

L Lorente, S Palmero, J Iribarren, J Jiménez, C Garcia, R Galván, J Castedo, J Martínez, M Brouard, M Martín, M Mora

Hospital Universitario de Canarias, La Laguna SC, Tenerife, Spain Critical Care 2009, 13(Suppl 1):P186 (doi: 10.1186/cc7350)

Introduction Although there are many studies on arterial catheter-related infection (ACRI), there are scarce data on such infection according to the catheter access site. Which particular arterial catheter site is associated with a higher risk of infection remains controversial. The guidelines of the Centers for Disease Control and Prevention make no recommendation about which site or sites minimize the risk of catheter-related infection. In previous studies, we have found a higher incidence of ACRI in arterial femoral than in radial access sites. In the present study, we increased the number of arterial catheters in order to increase the probability of finding significant differences in the incidence of ACRI between other arterial accesses. The objective of this study was to analyze the incidence of ACRI according to different access sites. Methods We performed a prospective observational study of all consecutive patients admitted to our 24-bed medical and surgical ICU of a 650-bed university hospital during 6 years (1 May 2000 to 30 April 2006). ACRI included catheter-related local infection and catheter-related bloodstream infection.

Results A total of 1,085 arterial femoral catheters were inserted during 6,497 days, 2,088 radial during 12,007 days, 174 dorsalis pedis catheters during 1,050 days and 141 brachial during 852 days. We detected 33 cases of ACRI (11 with bacteremia and 22 with local infection) in femoral catheters, 12 cases of ACRI (three with bacteremia and nine with local infection) in radial catheters, zero in dorsalis pedis catheters, and zero in brachial catheters. The ACRI incidence per 1,000 arterial catheter days was significantly higher for femoral (5.08) than for radial (1.76) access (OR = 5.1, 95% CI = 2.56 to 10.81; P <0.001), dorsalis pedis (0) access (OR = 7.6; 95% CI = 1.37 to infinite; P = 0.01) and brachial (0) access (OR = 6.2, 95% CI = 1.11 to infinite; P = 0.03). We did not find significant differences in the ACRI incidence per 1,000 arterial catheter days between radial and dorsalis pedis (OR = 1.5; 95% CI = 0.24 to infinite; P = 0.73); and between radial and brachial access (OR = 1.2; 95% CI = 0.20 to infinite; P = 0.88).
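The incidence densities above follow from a simple episodes-per-catheter-day calculation. A minimal sketch is given below, reproducing the femoral figure (33 ACRI episodes over 6,497 catheter-days); the helper function and its name are illustrative only, not part of the study's analysis.

```python
# Worked example of the incidence-density calculation used above, shown for the
# femoral site (33 ACRI episodes over 6,497 catheter-days).
def incidence_per_1000(episodes, catheter_days):
    """Incidence density = episodes / catheter-days x 1,000."""
    return 1000.0 * episodes / catheter_days

print(f"femoral: {incidence_per_1000(33, 6497):.2f} ACRI per 1,000 catheter-days")  # ~5.08
```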

Conclusions Our results suggest that femoral arterial access should be avoided in order to minimize the risk of arterial catheter-related infection.

Inline filtration reduces the incidence of systemic inflammatory response syndrome in critically ill children

T Jack, M Boehne, BE Brent, A Wessel, M Sasse

Medical School Hannover, Germany

Critical Care 2009, 13(Suppl 1):P187 (doi: 10.1186/cc7351)

Introduction Particulate contamination of infusion solution implies a potential health risk for intensive care patients with a background of debilitation and impaired host responses. Particles have been shown to induce thrombogenesis, deterioration of microcirculation and modulation of immunoresponse. We assessed the effect of inline filtration on the reduction of major complications in critically ill children (ClinicalTrials.gov ID NCT00209768). Methods In a randomised, prospective trial, paediatric patients admitted to the interdisciplinary pediatric ICU of a tertiary university

hospital were assigned to either a control or an interventional group, the latter receiving inline filtration (infusion filter Pall ELD96LLCE/ NOE96E, Braun Intrapur Lipid/Intrapur Neonat Lipid) throughout infusion therapy. Prior to this study, the infusion regimen was optimised to prevent precipitation and incompatibilities of solutions and drugs. Primary objectives were a reduction in the incidence of sepsis, thrombosis, systemic inflammatory response syndrome (SIRS) or organ failure (liver, lung, kidney, circulation).

Results Interim analysis of 398 children (171 female, 227 male, mean age 72 months) revealed a heterogeneous background of underlying diagnoses and a comparable distribution of patients between the control group (208 patients) and the inline filtration group (190 patients). First analyses demonstrated a significant reduction in the incidence of SIRS in the interventional group (95% CI = 68 to 112; P < 0.035).

Conclusions The occurrence of sepsis, SIRS, thrombosis or organ failure often complicates the course of disease in critically ill patients. Inline filtration was most effective in reducing the incidence of SIRS. Additional analyses are expected to confirm these preliminary results and to further define the influence of inline filtration on other complications.

Central venous catheter infection: incidence and risk factors

G Lane, F Bell, M Bellamy

University of Leeds, UK

Critical Care 2009, 13(Suppl 1):P188 (doi: 10.1186/cc7352)

Introduction We performed a retrospective analysis of central venous catheter (CVC) infection over a 12-month period in a tertiary referral teaching hospital general ICU to establish incidence of infection and clinical factors associated with CVC infection.

Methods We reviewed data from all patients admitted to the ICU over a 12-month period. CVC tips from surviving patients were cultured. Patient demographic data, diagnoses and therapeutic interventions from our audit database were entered into a spreadsheet. Blood pressure, pulse, temperature and ventilatory data were collected from ICU charts. Haematological and biochemical variables were accessed from the hospital results server. Other potential risk factors entered included the use of steroids, chemotherapy, diabetes, renal replacement therapy, tracheostomy or arterial catheter. Data were anonymised, and nested logic arrays were used for processing and error checking. Risk factor analysis was performed by logistic regression against the presence of a positive CVC culture.

Results Of 865 patients, 191 had no CVC. Data were inadequate in a further 109 patients. In the remaining 565 patients, 836 CVCs were inserted. The median patient age was 61 years (IQR 49 to 71 years) and 51.6% were male. In total, 19.5% of CVCs were two-lumen vascular access catheters for dialysis and 80.5% were four-lumen CVCs for monitoring and drug infusion. A total of 637 central lines were inserted via the internal jugular route, 106 femoral and 71 subclavian. The mean number of CVCs per person was 1.47 (median 1, range 1 to 15). One hundred and nineteen catheters (14.2%) became infected. The median duration of CVC use was 4 days (IQR 2 to 8 days). Logistic regression correctly predicted CVC infection on 85.9% of occasions, with a specificity of 97.4% but a relatively poor sensitivity of 18.6%. Independently predictive factors included subclavian placement (OR = 0.35), standard central line (vs. dialysis catheter, OR = 0.75), days of catheterisation (OR = 1.31), peak temperature (OR = 1.42) and peak C-reactive protein (OR = 1.002).

Conclusions In this study, subclavian placement was associated with a threefold reduction in the risk of infection. The odds of infection increased by 31% per day of catheterisation, and infection was associated with pyrexia and a raised C-reactive protein. Interestingly, disease factors such as diabetes, chemotherapy and steroids were not significant predictors. These data call into question the common practice of preferentially placing CVCs in the internal jugular vein.
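To illustrate what a per-day odds ratio of 1.31 means in practice, the short sketch below compounds it over the reported range of catheterisation durations. Only the 1.31 multiplier comes from the regression above; the day-zero odds are an arbitrary assumption for illustration, not a figure from the study.

```python
# Illustration of how a per-day odds ratio of 1.31 compounds with the duration of
# catheterisation. The baseline (day-0) infection odds are an arbitrary assumption.
OR_PER_DAY = 1.31
baseline_odds = 0.05                      # assumed day-0 odds of infection (illustrative)

for days in (2, 4, 8):                    # median CVC duration was 4 days (IQR 2 to 8)
    odds = baseline_odds * OR_PER_DAY ** days
    probability = odds / (1 + odds)
    print(f"{days} days: odds multiplied by {OR_PER_DAY ** days:.2f}, "
          f"infection probability ~{probability:.0%}")
```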

Impact of bloodstream infections on ICU mortality

M Michalia, M Kompoti, G Kallitsi, I Vassiliadis, M Charitidi, P Clouva-Molyvdas

Thriassio General Hospital of Eleusis, Athens, Greece Critical Care 2009, 13(Suppl 1):P189 (doi: 10.1186/cc7353)

Introduction Previous studies have investigated the impact of different types of bloodstream infection (BSI) (primary, secondary, catheter-related) on the outcome of critically ill patients, including only the first BSI episode in the analysis [1-3]. Our study aimed to evaluate the impact of different BSI types on ICU mortality, including all episodes of each BSI type in the analysis.

Methods All patients admitted to the ICU during a 46-month period were prospectively followed. Data recorded included: demographics, medical history, admission category (medical, elective surgical, emergency surgical, trauma), APACHE II score at admission to the ICU, BSI episodes, isolated pathogens, continuous renal replacement therapy (CRRT) implementation, blood product transfusions, ICU length of stay (LOS) and ICU outcome. BSIs were defined as primary (PBSI), secondary, catheter-related or mixed, based on standard criteria. Data were analyzed with logistic regression, with the statistical significance level set at P < 0.05.

Results Four hundred and twenty-six consecutive patients (295 males, 131 females) were included in the analysis. Age (mean ± SD) was 52.5 ± 19.4 years, APACHE II score at admission to the ICU 18.3 ± 6.6. The BSI incidence density was 26.3 episodes per 1,000 patient-days. ICU LOS was 21.6 ± 20.6 days. ICU mortality rate was 17.8% (95% CI = 14.3 to 21.6). In univariable analysis, the APACHE II score, age, admission category, CRRT implementation, PBSI episodes, and packed red blood cell (pRBC) units transfused during the ICU LOS were significantly associated with ICU mortality. In multivariable analysis, age (OR = 1.2, P = 0.003), APACHE II score (OR = 1.1, P <0.001), infection at admission (OR = 2.9, P = 0.007), CRRT implementation (OR = 9.3, P <0.001), PBSI episodes (OR per episode = 2.7, P = 0.002) and pRBC transfusions (OR = 1.1, P = 0.003) were independently associated with ICU mortality. None of the other BSI types showed association with ICU mortality.
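For readers unfamiliar with this kind of analysis, the following is a minimal synthetic sketch of a multivariable logistic regression of the type reported above (ICU death modelled on age, APACHE II score and primary BSI episodes). The data frame is invented, the covariate set is reduced, and statsmodels and pandas are assumed to be available; it is not the study's data or model.

```python
# Synthetic sketch of a multivariable logistic regression; exponentiated coefficients
# are the adjusted odds ratios (per year of age, per APACHE II point, per PBSI episode).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "age": rng.normal(52, 19, n).clip(18, 90),
    "apache2": rng.normal(18, 7, n).clip(0, 45),
    "pbsi_episodes": rng.poisson(0.3, n),
})
# Invented outcome whose log-odds increase with each predictor (illustration only).
log_odds = -5.5 + 0.03 * df["age"] + 0.10 * df["apache2"] + 0.9 * df["pbsi_episodes"]
df["icu_death"] = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

X = sm.add_constant(df[["age", "apache2", "pbsi_episodes"]])
fit = sm.Logit(df["icu_death"], X).fit(disp=False)
print(np.exp(fit.params).round(2))   # adjusted odds ratios per unit increase
```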

Conclusions In our patient sample, PBSI episodes during the ICU LOS were independently associated with ICU mortality, each PBSI episode conferring a 2.7-fold increase in the odds of ICU death after adjustment for potential confounders. The other BSI types (secondary, catheter-related) showed no association with ICU mortality.

References

1. Renaud B, Brun-Buisson C: Outcomes of primary and catheter-related bacteremia. A cohort and case-control study in critically ill patients. Am J Respir Crit Care Med 2001, 163:1584-1590.

2. DiGiovine B, Chenoweth C, Watts C, Higgins M: The attributable mortality and costs of primary nosocomial bloodstream infections in the intensive care unit. Am J Respir Crit Care Med 1999, 160:976-981.

3. Garrouste-Orgeas M, Timsit JF, Tafflet M, Misset B, Zahar J-R, Soufir L: Excess risk of death from intensive care unit-acquired bloodstream infections: a reappraisal. Clin Infect Dis 2006, 42:1118-1126.

Reduction of the catheter-related bloodstream infections in critically ill patients

R Peredo, C Sabatier, A Villagrá, D Suarez, J González, C Hernandez, F Pérez, J Valles

Hospital Parc Taulí, Sabadell, Spain

Critical Care 2009, 13(Suppl 1):P190 (doi: 10.1186/cc7354)

Introduction The objective of our study was to determine the utility of a multiple system intervention to reduce catheter-related bloodstream infections (CR-BSI) in an ICU.

Methods We carried out a prospective cohort study in a medical and surgical ICU. We determined the rate of CR-BSI per 1,000 catheter-days during the application of an evidence-based intervention used to decrease CR-BSI in 2007 (March to December) and compared it with the rate during the same period in 2006, in which we applied only conventional measures of prevention. During the intervention period we applied five measures: educational sessions on how to insert and maintain central catheters, skin cleansing with chlorhexidine, completion of a checklist during insertion of the catheter, use of the subclavian vein as the preferred site with avoidance of the femoral site if possible, and removal of unnecessary catheters. CR-BSI was defined as the recovery of the same organism (same species, same antibiotic susceptibility profile) from the catheter tip and blood cultures.

Results During the control and intervention periods we registered 4,289 versus 4,174 patient-days and 3,572 versus 3,296 catheter-days, respectively. During the intervention period eight CR-BSI were diagnosed, compared with 24 CR-BSI in the control period. The mean incidence rate of CR-BSI was 6.7/1,000 catheter-days in the control period and 2.4/1,000 catheter-days in the intervention period (RR = 0.3; 95% CI = 0.1 to 0.7; P = 0.03). A nursing intervention during completion of the checklist was required in 17.7% of the insertions. The catheter utilisation ratio was 81.5% during the control period and 80.6% during the intervention period, with no significant difference between periods.

Conclusions The implementation of a multiple system intervention with evidence-based measures significantly reduced CR-BSI in our ICU.
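As a back-of-the-envelope check of these figures, the sketch below recomputes the two incidence rates and their ratio, with an approximate 95% CI for the rate ratio from the usual Poisson formula. The abstract does not state how its interval was derived, so this is an illustrative recalculation rather than the authors' method.

```python
# CR-BSI rates per 1,000 catheter-days in each period, the rate ratio, and an
# approximate 95% CI (SE of log RR ~ sqrt(1/a + 1/b) for Poisson counts a and b).
import math

a, days_control = 24, 3572        # control period episodes and catheter-days
b, days_interv = 8, 3296          # intervention period episodes and catheter-days

rate_c = 1000 * a / days_control          # ~6.7 per 1,000 catheter-days
rate_i = 1000 * b / days_interv           # ~2.4 per 1,000 catheter-days
rr = rate_i / rate_c                      # ~0.36, in line with the reported RR of 0.3
se = math.sqrt(1 / a + 1 / b)
ci = (rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se))
print(f"rates {rate_c:.1f} vs {rate_i:.1f}; RR {rr:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")
```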

Definition of catheter-related bloodstream infection as a quality improvement measure in intensive care

C Meadows, B Creagh-Brown, T Nia, K Bonnici, S Finney

Royal Brompton Hospital, London, UK

Critical Care 2009, 13(Suppl 1):P191 (doi: 10.1186/cc7355)

Introduction Catheter-related bloodstream infection (CRBSI) accounts for 10% to 20% of hospital-acquired infections in the UK and is associated with both increased ICU stay and mortality. Rates of CRBSI may be modified by clinical care during insertion and utilisation of central venous catheters (CVCs) [1]. As such, the incidence of CRBSI has been proposed as a quality indicator. There is currently no consensus regarding the definition of CRBSI. Methods We applied two internationally recognised guidelines [1,2] to evaluate the CRBSI prevalence in a prospective study of CVC replacements over 3 months on an adult cardiothoracic ICU. We utilised these criteria for confirmed CRBSI but considered

additionally CVC colonisation and possible CRBSI. CVC tips were sent for semiquantitative culture, and contemporaneous blood was drawn for qualitative culture from both the CVC and a peripheral vein. Possible CRBSI was defined as positive CVC blood culture and tip in the absence of a peripheral blood culture; colonised CVC was defined as a positive tip and negative blood culture. Results A total of 33 CVCs were replaced for: clinical suspicion of a CRBSI (64%); duration of use (24%); incidental positive blood culture (9%); and clinical evidence of catheter site infection (3%). Ninety-one per cent of CVC tips were sent for culture. Fifty-eight per cent of cases had paired blood cultures sent. The incidence of antimicrobial therapy at the time of blood sampling was 85%. Only a single CVC replacement (3%) was associated with a confirmed CRBSI. By contrast, the incidence of possible CRBSI was greater (45%).

Conclusions Clinical suspicion of CRBSI was associated with infrequent microbiological confirmation. The impact of antibiotic therapy on diagnostic sensitivity is unknown. The incidence of possible CRBSI was much greater; we believe this entity encompasses unknown proportions of both colonised CVCs and CRBSI. The widely used definitions of CRBSI probably therefore underestimate the true incidence: this limits their utility to describe a parameter that may benchmark different ICUs and drive quality improvement processes within an ICU. We propose that both confirmed and possible CRBSI rates should be recorded as (different) surrogates for the true rate. Finally, this illustrates the shortcomings of assessing outcomes (CRBSI rate). Measuring adherence to quality control processes (care bundles) may be a better comparator. References

1. Pronovost P, et al.: An intervention to decrease catheter-related bloodstream infections in the ICU. N Engl J Med 2006, 355:2725-2732.

2. Pratt RJ, et al.: Epic2: national evidence-based guidelines for preventing healthcare-associated infections in NHS hospitals in England. J Hosp Infect 2007, 65S:S1-S64.

Comparative study between conventional and antiseptic impregnated central venous catheters

S Macedo, J Molina Filho, G Lima, L Brito

Hospital Sao José do Avai, Itaperuna, Brazil

Critical Care 2009, 13(Suppl 1):P192 (doi: 10.1186/cc7356)

Introduction Central venous catheters (CVCs) are very useful in the management of patients hospitalized in the ICU, but are not devoid of complications. Among the complications related to the indwelling time of CVCs, infection stands out; it may increase morbidity, mortality, costs and length of stay in the ICU. The purpose of this study was to compare the duration of use of standard CVCs with that of CVCs impregnated with antiseptic (silver sulfadiazine and chlorhexidine).

Methods A prospective, randomized, alternating, nonblinded study. Central venous access was obtained, alternating the type of CVC used in each patient. For each patient we recorded sex, age, APACHE II score, Glasgow coma score, puncture site, reason for catheter withdrawal and type of catheter used. The catheter tip was cultured (qualitative). Two groups were formed: group I (41 patients, 54 punctures) received standard CVCs, and group II (38 patients, 54 punctures) received impregnated CVCs.

Results See Table 1. We included 62 patients (48.38% female). We studied 108 periods of catheterization, of which 54 were with standard CVCs and 54 were with impregnated CVCs.

Table 1 (abstract P192). Multivariate analysis

Variable | Standard CVC, value (95% CI) | Impregnated CVC, value (95% CI) | P value (OR*)
Age (average) | 52.42 | 47.43 | 0.17
Sex (female) | 50% (36.1 to 63.9) | 54.7% (40.4 to 68.4) | 0.3
Subclavian vein | 46.3% (32.6 to 60.4) | 66.0% (51.7 to 78.5) | 0.03 (2.3)*
Glasgow coma score <8 | 59.3% (45.0 to 72.4) | 56.6% (42.3 to 70.2) | 0.5 (0.9)
APACHE II score (mean) | 17.97 | 19.63 | 0.21
Site infection | 77.8% (64.4 to 88.0) | 49.1% (35.1 to 63.1) | 0.002 (0.3)*
Infection length | 10.86 | 15.43 | 0.005*
Positive cultures | 18.5% (9.3 to 31.4) | 15.1% (6.7 to 27.6) | 0.41 (1.3)

The average length of stay was higher for impregnated CVCs (14.11 days) than for standard CVCs (10.7 days). Excluding deaths in both groups, the catheter length of stay was 10.86 days in group I compared with 15.43 days in group II. Adding all periods of catheterization for each group, group I totalled 578 days and group II 762 days; the total duration in group II was 31.84% higher than in group I. Regarding the reason for withdrawal of the CVC, suspected infection predominated, accounting for 77.8% of withdrawals of standard CVCs and 49.1% of withdrawals of impregnated CVCs. The catheter tip culture was positive on 10 occasions (18.5%) with standard CVCs, against eight occasions (15.1%) in the impregnated group. Most patients had a Glasgow coma score <9. The average APACHE II score was 17.97 in group I patients, compared with 19.63 in group II patients. The predominant puncture site in this study was the subclavian vein (56.48%), and catheters at this site remained in place longer than at the other sites (jugular and femoral veins). When only group II (impregnated) was considered, however, catheters located in the jugular vein remained in place longer. The impregnated catheter cost 40% more than the conventional one.

Conclusions The length of stay with impregnated CVCs was longer (15.43 days) than with standard CVCs (10.86 days), and the colonization rate was higher with standard CVCs. Patients who require a CVC for long periods benefit from impregnated CVCs, which allow a longer period of use and a lower colonization rate, avoiding the complications associated with repeated punctures and prolonged catheterization. In view of these clinical benefits, the advantages of antiseptic-impregnated catheters compensated for their 40% higher initial cost.

Impact of early catheter removal during treatment of invasive candidiasis: analysis from two phase 3 micafungin trials

M Nucci1, R Betts2, B Dupont3, C Wu4, D Buell4, L Kovanda4, O Lortholary3

1Hospital Universitário Clementino Fraga Filho, Rio de Janeiro, Brazil; 2University of Rochester, NY, USA; 3Hôpital Necker, Paris, France; 4Astellas Pharma Inc., Deerfield, IL, USA Critical Care 2009, 13(Suppl 1):P193 (doi: 10.1186/cc7357)

Introduction Prompt central venous catheter (CVC) removal is often recommended in candidemic patients [1,2]. To investigate

Figure 1 (abstract P193)

further, we evaluated relationships between CVC status and 28-day and 42-day survival after treatment initiation, overall treatment success, and time to mycological eradication.

Methods Data from two randomized, double-blind, phase 3 micafungin trials evaluating treatment of candidemia and invasive candidiasis (n = 842) were analysed. Univariate analysis was performed to evaluate: associations of baseline CVC removal within 24 hours (CVC24) and 48 hours (CVC48) versus non-removal with each treatment outcome; and associations of potential confounding factors with treatment outcomes. A final multivariate analysis was performed on those associations with P < 0.10.

Results Univariate analysis revealed that CVC24 was associated with 28-day survival (P = 0.05) and 42-day survival (P = 0.05), and CVC48 with 28-day survival (P = 0.01), 42-day survival (P = 0.01) and overall treatment success (P = 0.02). Neither CVC24 nor CVC48 was associated with time to mycological eradication. After controlling for confounding variables such as baseline Candida species, APACHE II score and underlying disease, multivariate analysis revealed that neither CVC24 nor CVC48 was a statistically significant prognostic indicator for 28-day or 42-day survival (Figure 1). In addition, CVC48 was not associated with overall treatment success.

Conclusions Early CVC removal (within 48 hours of treatment start) had no statistically significant effect on treatment success or survival during treatment of invasive candidiasis. References

1. Pappas PG, et al.: Guidelines for treatment of candidiasis. Clin Infect Dis 2004, 38:161-189.

2. Nucci M, et al.: Should vascular catheters be removed from all patients with candidemia? An evidence-based review. Clin Infect Dis 2002, 34:591-599.

Chlorhexidine, octenidine or povidone iodine for catheter-related infections: a randomised controlled trial

A Bilir, B Yelken, A Erkan

Osmangazi University Medical Faculty, Eskisehir, Turkey Critical Care 2009, 13(Suppl 1):P194 (doi: 10.1186/cc7358)

Introduction Protection of the catheter site by antimicrobial ointments is one of the most important factors in the prevention of infection [1,2]. Povidone iodine, chlorhexidine gluconate and octenidine hydrochloride are the most commonly used agents for dressings. The purpose of this study was to compare the effects of povidone iodine, chlorhexidine gluconate and octenidine hydrochloride in preventing catheter-related infections.

Methods The study was performed in an adult ICU. Fifty-seven patients who had arterial and central venous catheters were eligible for inclusion in the study. Patients were randomized to receive 4% chlorhexidine gluconate (group I, n = 19), 10% povidone iodine (group II, n = 19) or octenidine hydrochloride (group III, n = 19) for cutaneous antisepsis.

Results The clinical characteristics of the patients and the risk factors for infection were similar between the groups. There was a statistically significant difference between groups in catheter-related sepsis and colonization (P < 0.001). The documented catheter-related sepsis rate was 10.5% in both the povidone iodine and octenidine hydrochloride groups. The catheter-related colonization rate was 26.3% in the povidone iodine group and 21.5% in the octenidine hydrochloride group. In the chlorhexidine group there was no catheter-related sepsis or colonization.

Conclusions Chlorhexidine is an effective disinfectant agent. The use of 4% chlorhexidine rather than 10% povidone iodine or octenidine hydrochloride for cutaneous disinfection before insertion of an intravascular device and for postinsertion site care can substantially reduce the incidence of catheter-related infection.

References

1. Chaiyakunapruk N, et al.: Chlorhexidine compared with povidone-iodine solution for vascular catheter-site care: a meta-analysis. Ann Intern Med 2002, 136:792-801.

2. Krau SD. Review: chlorhexidine gluconate is more effective than povidone-iodine for preventing vascular catheter related bloodstream infection. Evid Based Nurs 2003, 6:18.

Review of central-line-related sepsis in neurointensive care patients

E Doorley, P Nair

Walton Neurological Centre, Liverpool, UK

Critical Care 2009, 13(Suppl 1):P195 (doi: 10.1186/cc7359)

Introduction Central venous catheterisation is a core component of intensive care management. Despite advances in medicine, central venous catheters are associated with infection rates ranging between 2.5 and 70% [1,2]. There is, however, little isolated information regarding rates of central line sepsis in neurointensive patients. Microbiology results from the 3 months preceding this study identified six line-related infections. Following implementation of a new evidence-based medicine technique for central lines, this study set out to review central line sepsis in patients at Walton Neurological ITU.

Methods Eighty-seven patients were identified by the study criteria as suitable for inclusion, of whom information on 65 patients was obtained. Baseline demographic information was collected on patient sex, age, catheter insertion site and procedural complications. Analysis was then carried out to review the rates of line sepsis in this neurologically compromised group of patients, along with identifying any obvious trends in central venous insertion.

Results A total of 69.2% (n = 45) of the cohort of 65 patients had a central line in situ. Of these, 47% (n = 29) were femoral, 24% (n = 15) were internal jugular and 23% (n = 14) were subclavian, whilst 6% (n = 4) were unknown (site not mentioned in the records). In total, 4.8% (n = 3) had reported line infections: two were coagulase-negative staphylococci grown from femoral lines, while one patient was reported to have grown Candida albicans from their internal jugular central line site.

Conclusions Despite best-practice evidence favouring cervical lines, the study found that femoral line insertion was more appropriate in this subset of patients with unstable head injuries. In addition, this study highlighted a 50% reduction in central line sepsis in the 3 months following implementation of our central line insertion and maintenance policies. Furthermore, early resiting of femoral lines on day 3 resulted in a reduction in contaminated central lines.

References

1. Jones H: Focus on venous catheter related sepsis. Anaesthesia [http://www.frca.co.uk/article.aspx?articleid=100923]

2. McKinley S, Mackenzie A, Ward R, Penford J: Incidence and predictors of central venous catheter related infection in intensive care patients. Anaesth Intensive Care 1999, 27:164-169.

Central venous catheterization: a randomized comparison between external and internal jugular access

F Campos, A Jose Pereira, T Correa, A Biasi Cavalcanti, R Da Hora Passos, M Beller Ferri, C Alves Rosa, A Capone Neto

Hospital Israelita Albert Einstein, Sao Paulo, Brazil Critical Care 2009, 13(Suppl 1):P196 (doi: 10.1186/cc7360)

Introduction Central venous catheterization is a routine procedure in intensive care, and internal jugular access (IJA) is often used due to its high success rates. However, complications can occur in up to 4.2% of internal jugular punctures and it is contraindicated in the presence of coagulopathy. The external jugular access (EJA) is underused, has low complication rates and is successful in up to 90% of cases. So far, there has been no randomized, controlled trial comparing both accesses. The objective of this study was to determine the success and early complication rates of internal and external jugular vein access [1].

Methods A prospective, randomized study performed in two adult general ICUs. Inclusion criteria were all patients who needed central venous catheterization and had a visible external jugular vein and no contraindication to IJA. All included venous catheterizations were performed by first-year and second-year critical care residents, supervised by a staff physician. Admission type, APACHE II score and outcomes were recorded.

Results Sixty-nine patients were included. The mean APACHE II score was 22.8 (SD = 6.4). Thirty-three patients (47.8%) were randomized to EJA and 36 (52.2%) to IJA. The percentage of success was 72.7% with EJA and 88.9% with IJA (risk ratio = 0.82; 95% CI = 0.64 to 1.04; P = 0.09). Complications occurred in 2/33 (6%) EJA patients and in 1/33 (3%) IJA patients (risk ratio = 2.1; 95% CI = 0.2 to 22.6; P = 0.28). Besides central venous catheterization failure, the only complications were carotid puncture (one patient in IJA) and external hematoma (two patients in EJA).

Conclusions External jugular venous access is a good alternative to internal jugular catheterization and is associated with few and minor complications. Our results show a lower but not significantly different probability of success with the EJA. Considering that the procedure was done by physicians not familiar with the technique, however, we did not find definite evidence to indicate that IJA is superior to EJA.

Reference

1. Blitt CD, et al.: Cardiovascular catheterization via the external jugular vein. A technique employing the 'J' wire. JAMA 1974, 229:817-818.

Ultrasound-guided positioning of totally implantable access port systems: a single-center experience

C Chelazzi, C Innocenti, C Pelagatti, AR De Gaudio

University of Florence, Italy

Critical Care 2009, 13(Suppl 1):P197 (doi: 10.1186/cc7361)

Introduction Totally implantable access ports (TIAPs) are extensively used for long-term intravenous access. They can be associated with early and late complications, which are related to the site of insertion and tip positioning [1]. Although fluoroscopic guidance is the gold standard, ultrasound can be a suitable and easier tool to guide venipuncture and catheter positioning, provided a postprocedural chest X-ray is taken to rule out tip malposition [2]. Our aim was to assess the safety of this technique in terms of the best site of insertion, tip position and rate of complications.

Methods A total of 360 TIAPs were implanted between December 2007 and September 2008 in a dedicated surgical room at the ICU of the University of Florence, Italy. Insertion was performed by trained intensivists using an ultrasound-guided technique. A chest X-ray scan was performed after the procedure to rule out catheter tip malposition and pneumothorax. Early and late complications were noted and recorded. Fisher's exact test (significance at P < 0.05) was used to test for an association between the site of insertion and either tip position or occurrence of complications.

Results Of the 360 TIAPs implanted, tip malposition occurred in seven catheters (1.9%). The site of insertion that yielded the best rate of correct placement was the right internal jugular vein (P < 0.05). Complications occurred in 17 of 360 procedures (4.7%). Early complications included five arterial punctures and one pneumothorax, while late complications included four TIAP displacements, one pocket infection, three catheter-related infections and three thromboses. An association was found between the occurrence of late complications and left-sided insertion sites (P < 0.05).

Conclusions Ultrasound-guided venipuncture for TIAP implantation was safely conducted in the majority of patients. The right internal jugular vein was the best insertion site in terms of the best rate of correct catheter tip position and lower incidence of complications. References

1. Ignatov A, Hoffman O, Smith B, Fahlke J, Peters B, Bischoff J, Costa SD: An 11-year retrospective study of totally implanted central venous access ports: complications and patient satisfaction. Eur J Surg Oncol 2008, in press. [Epub ahead of print]

2. Brooks AJ, Alfredson M, Pettigrew B, Morris DL: Ultrasound-guided insertion of subclavian venous access ports. Ann R Coll Surg Engl 2005, 1:25-27.

Ultrasound-guided catheterization of the subclavian vein: a prospective comparison with the landmark technique in ICU patients

Y Alic, A Torgay, A Pirat

Baskent University School of Medicine, Ankara, Turkey Critical Care 2009, 13(Suppl 1):P198 (doi: 10.1186/cc7362)

Introduction Ultrasound (US)-guided internal jugular vein catheterization has been recommended to increase the procedural success rate and to enhance patient safety. However, there are few data on the potential advantages of the use of US guidance for subclavian vein (SV) catheterization. The aim of this study was to evaluate whether US-guided catheterization of SV improves the procedural success rate of the traditional landmark method in ICU patients.

Methods Ethics Committee approval and written informed consent from all patients or their next of kin were obtained. We prospectively and randomly evaluated a US-guided method in 35 patients undergoing SV catheterization (Group US) and compared the results with 35 patients in whom a landmark-guided technique was used (Group LM). All procedures were performed by the same physician, who was experienced in both techniques. The catheterization success rate at the first attempt, the overall catheterization success rate, the number of attempts, the time to catheterization, and catheterization-related mechanical complications were recorded. Bedside chest X-ray scans were used to evaluate the position of the catheter tip, pneumothorax, and hemothorax.

Results The groups were similar in terms of physical characteristics, systemic disease, and risk factors for difficult SV catheterization. The catheterization success rate at the first attempt (Group LM 63% and Group US 63%, P = 1.00) and the overall success rate (Group LM 94% and Group US 89%, P = 0.67) were similar in both groups. The number of attempts for SV catheterization did not differ between the groups (Group LM 1.6 ± 1.0 attempts and Group US 1.7 ± 1.1 attempts, P = 0.61). Three patients in Group LM developed six mechanical complications while four patients in Group US had four such complications (P > 0.05 for all). The time to catheterization was significantly longer in Group US than in Group LM (Group LM 178 ± 128 s vs. Group US 230 ± 127 s, P = 0.008).

Conclusions Compared with the landmark technique, real-time two-dimensional US did not increase the overall or first attempt success rate in subclavian vein catheterization in ICU patients. The time to catheterization was significantly longer with real-time two-dimensional US guidance than with landmark guidance.

Perioperative fluid administration in pancreatic surgery: comparison of three regimens

A Martini, N Menestrina, D Simion, L Filetici, V Schweiger, L Gottin

University Hospital, Verona, Italy

Critical Care 2009, 13(Suppl 1):P199 (doi: 10.1186/cc7363)

Introduction Perioperative fluid administration represents an important issue in perioperative medicine, because an incorrect strategy is associated with increased morbidity and mortality. The aim of this study was to compare three fluid administration regimens in patients who have undergone pancreatic surgery. Methods A randomized prospective trial. Fifty-nine patients, American Society of Anesthesiologists class 1 to 3, were assigned to one of three perioperative fluid regimens (PFRs). Interventions:

general balanced anesthesia; PFR1, liberal (21 patients): colloids and crystalloids (in a 1:3 ratio) at 12 ml/kg/hour; PFR2, restricted (18 patients): colloids 4 ml/kg/hour; and PFR3, goal targeted (20 patients): colloid infusion targeted to achieve stroke volume variation (SVV) <13%. Hemodynamic monitoring was performed using the Vigileo/FloTrac system (cardiac output (CO) and SVV). Recorded outcome variables were hospital length of stay, start of enteral nutrition, bowel movement, blood transfusion, and perioperative complications.

Table 1 (abstract P199). Main results

Variable | Liberal (PFR1) | Goal (PFR3) | Restricted (PFR2)
Age (years) | 60 | 55 | 60
Surgery duration (hours) | 6 | 6 | 5.5
Colloids (ml) | 1,000 | 500 | 1,100
Crystalloids (ml) | 4,600 | 2,200 | 1,450
Enteral nutrition (days from surgery) | 4 | 4 | 3
Bowel movement (days from surgery) | 3 | 4 | 3
Hospital length of stay (days) | 11.5 | 11.5 | 9

Results Data regarding significant differences are presented in Table 1. Hemodynamic monitoring showed a higher variability of CO and SVV in PFR1. Major postoperative complications were also more frequent in PFR1. Fistulas occurred in eight cases in PFR1 and in three and four cases in PFR2 and PFR3, respectively (P < 0.05).

Conclusions Our study is still ongoing; however, the interim analysis suggests that restricted or goal-targeted perioperative fluid administration provides more stable hemodynamics and a reduction in major abdominal complications.

B-type natriuretic peptide, corrected flow time and central venous pressure as predictors of fluid responsiveness in septic shock

D Sturgess1, R Pascoe2, G Scalia2, B Venkatesh1

1University of Queensland, Brisbane, Australia; 2The Wesley Hospital, Brisbane, Australia

Critical Care 2009, 13(Suppl 1):P200 (doi: 10.1186/cc7364)

Introduction The plasma B-type natriuretic peptide concentration (BNP) appears not to predict fluid responsiveness in septic shock but no account has been made for the potential influence of cardiac rhythm [1]. Also, no comparison has been made between BNP and other clinical guides to fluid therapy, such as the Doppler aortic flow time corrected for heart rate (FTc) or central venous pressure (CVP). The aim of this preliminary study was to compare BNP, FTc and CVP as predictors of fluid responsiveness in septic shock patients without cardiac dysrhythmia.

Methods A prospective study of 10 consecutive adult septic shock patients (in sinus rhythm; 60% mechanically ventilated) treated with intravenous fluid challenge (4% albumin 250 ml over 15 min) in an Australian tertiary ICU. Results presented as the mean ± SD.

Results The APACHE II score was 21.8 ± 12.7. Haemodynamic assessment incorporating transcutaneous aortic Doppler (USCOM®) occurred before and 5 minutes after fluid challenge. Concurrent with the initial assessment, blood samples were collected for BNP assay (ADVIA Centaur®). Four patients demonstrated an increase in stroke volume >15% (responders). Three of the responders had

an elevated baseline BNP (>144 pg/ml). The percentage change in stroke volume correlated with baseline FTc (r = -0.81; P = 0.004) but not with BNP (r = -0.3; P = 0.4) or CVP (r = -0.4; P = 0.2).

Conclusions Our data confirm that neither BNP nor CVP appear to predict fluid responsiveness. Furthermore, elevated BNP should not be viewed as a contraindication to fluid challenge in septic shock, as it does not exclude favourable haemodynamic response. Transcutaneous FTc offers promise as a predictor of fluid responsiveness and should be evaluated further. Reference

1. Pirracchio R, et al.: Impaired plasma B-type natriuretic peptide clearance in human septic shock. Crit Care Med 2008, 36:2542-2546.

Intraoperative fluid optimization using stroke volume variation in high-risk surgical patients: preliminary results of a randomized prospective single-center study

J Benes, I Chytra, P Altmann, M Hluchy, E Kasal, R Sviták, R Pradl, M Stepán

University Hospital, Plzen, Czech Republic

Critical Care 2009, 13(Suppl 1):P201 (doi: 10.1186/cc7365)

Introduction Stroke volume variation (SVV) is a good and easily obtained predictor of fluid responsiveness that can be used to guide fluid therapy in mechanically ventilated patients. During major abdominal surgery in patients with compromised cardiovascular reserves, inappropriate fluid management may result in occult organ hypoperfusion or in fluid overload and increased postoperative morbidity. The aim of our study was to evaluate the influence of SVV-guided fluid optimization on organ functions and postoperative morbidity and mortality in high-risk patients undergoing major abdominal surgery.

Methods Patients undergoing elective intraabdominal vascular and nonvascular surgery were randomly assigned to a control group with routine intraoperative care or an SVV group with fluid management guided by SVV derived from the Vigileo/FloTrac system. The intervention target was to maintain the SVV index below 10% with colloid boluses of 3 ml/kg. Postoperative ICU care was the same for both groups. Demographic parameters, comorbidities, surgical procedures performed, mortality and ICU and hospital lengths of stay were assessed. Laboratory parameters of organ hypoperfusion in the perioperative period (pH, base excess, serum lactate) and the number of infectious and organ complications on day 7 and day 30 after operation were evaluated. The Mann-Whitney, unpaired t and chi-squared tests were used as appropriate; P < 0.05 was considered statistically significant. The study was approved by the local hospital ethics committee.

Results A total of 80 patients were enrolled and randomized to the SVV (n = 40) and control (n = 40) groups. No significant differences between the groups in the assessed parameters were found except for a difference in arterial pH (7.37 ± 0.05 vs. 7.35 ± 0.05; P = 0.04), serum lactate concentration at the end of the operation (median (IQR): 1.5 (1.2 to 1.9) mmol/l vs. 2.2 (1.39 to 2.35) mmol/l; P = 0.03) and a trend to a lower rate of complications on day 30 in the SVV group (11 patients (39%) vs. 20 patients (57%); P = 0.06).

Conclusions Fluid optimization guided by SVV during major abdominal surgery decreases blood lactate at the end of operation and may be associated with a trend for a lower rate of postoperative organ complications.

Acknowledgement Supported by the research grant MSM0021620819.

Impedance cardiography in the estimation of hemodynamic and fluid status of coma patients during continuous venovenous hemodiafiltration

A De Nicola, MJ Sucre

San Leonardo Hospital, Castellammare di Stabia, Italy Critical Care 2009, 13(Suppl 1):P202 (doi: 10.1186/cc7366)

Introduction Most ICU patients on continuous venovenous hemodiafiltration (CVVHDF) are in multisystem failure and require extensive monitoring [1]. Impedance cardiography (ICG) technology provides a measurement of fluid status using the thoracic fluid content (TFC), along with cardiac output (CO), cardiac index (CI) and systemic vascular resistance index (SVRI). NICCOMO® (Medis, Germany), a noninvasive ICG device that provides trustworthy measures, could be a complementary monitor for CVVHDF, supporting fluid balance and helping avoid hemodynamic instability [2].

Methods The study was an analysis of coma patients with acute renal failure undergoing CVVHDF (Equasmart®; Hemodec, Italy). By means of the NICCOMO®, the TFC, CI, CO, mean arterial pressure and SVRI were recorded continuously. Using the Pearson method, the percentage variation in each of the parameters during CVVHDF treatment was correlated with the amount of fluid removed (FR), normalized to body weight.

Results Ten patients were studied (six men and four women); the mean age was 53.1 ± 15.2 years. A total of 16.6 l of fluid was removed during CVVHDF (830 ml/day over 20 treatment days). The median FR per day was 1,837 ml and the median hourly FR rate was 252 ml. TFC diminished in all patients at the end of CVVHDF treatment (average reduction 14.8 ± 9/kΩ), while all other hemodynamic parameters showed both increases and decreases. We found that the percentage TFC changes were closely and inversely related to those of FR (r = -0.68, P < 0.001); the other hemodynamic parameters showed a moderate correlation with FR. The ICG device was helpful in promptly identifying hemodynamic instability in one patient and preventing it.

Conclusions TFC is a reliable and noninvasive method for evaluating the quantity of FR during CVVHDF. This parameter changed consistently with fluid subtraction and TFC measurements can guide the extent of FR. This compact ICG device provides safe and accurate readings and seems to be one of the best options for evaluation of basic hemodynamic parameters and TFC during hemodiafiltration. References

1. Vincent JL, et al.: Rev Med Brux 2008, 29(1 Suppl):S9-S13.

2. Wynne L, et al.: J Surg Res 2006, 133:55-60.

Pulse and systolic pressure variation assessment in partially assisted ventilatory support

P Formenti, M Zaniboni, M Umbrello, A Galimberti, R Pinciroli, P Morelli, L Bolgiaghi, G Iapichino

Istituto di Anestesia e Rianimazione, Milan, Italy

Critical Care 2009, 13(Suppl 1):P203 (doi: 10.1186/cc7367)

Introduction The use of pulse pressure variation (PPV) and systolic pressure variation (SPV) is possible during controlled mechanical ventilation (MV) [1]. Even in acute respiratory failure, controlled MV tends to be replaced by assisted ventilatory support, which may generate a tidal volume (Tv) inadequate to change the pulmonary venous flow and swing in pleural pressure [2]. This

makes the use of dynamic indices unreliable. Our hypothesis was that, during a pressure support ventilation (PSV) approach, a few imposed breaths (flow-triggered synchronized intermittent mandatory ventilation (SIMV)) could allow the monitoring of PPV and SPV. We therefore tested whether PPV and SPV during PSV + SIMV could be as accurate as in controlled MV.

Methods A prospective case-control study. Thirty patients who met the criteria for weaning from controlled MV were included. PPV and SPV were measured first during 20 minutes in PSV with three flow-triggered SIMV breaths per minute (10 ml/kg, duration 5 s, inspiration to expiration ratio 1:3) (T1), and then during three consecutive breaths in controlled MV (respiratory rate 12/min, duration 5 s, inspiration to expiration ratio 1:3, Tv 10 ml/kg, positive end-expiratory pressure and FiO2 as in PSV) (T2). Throughout the 20 minutes of data collection, saline infusions were kept constant (3 ml/hour) without any fluid loading. Correlation and Bland-Altman analyses were used to compare the respective values of PPV and SPV in the two modes of ventilation.

Results Significant correlations were found between the dynamic indices in SIMV during pressure support ventilation and those in controlled MV. The mean differences between the two measurements were: PPV 0.6 ± 2.8% (limits of agreement: -5.0 to 6.2), SPV 0.5 ± 2.3 mmHg (limits of agreement: -4.0 to 5.1).

Conclusions PPV and SPV measured during SIMV agreed with the findings in controlled MV. Dynamic indices can be accurately monitored in patients on assisted ventilatory support by adding a sufficiently large imposed SIMV breath [3].

References

1. De Backer D, et al.: Can one predict fluid responsiveness in spontaneously breathing patients? Intensive Care Med 2007, 33:1111-1113.

2. Heenen S, et al.: How can the response to volume expansion in patients with spontaneous respiratory movements be predicted? Crit Care 2006, 10:R102.

3. Zaniboni M, et al.: Pulse and systolic pressure variation assessment in partially assisted ventilatory support. J Clin Monit Comput 2008, 22:355-359.

Does the pleth variability index improve fluid management during major abdominal surgery?

P Forget, F Lois, M De Kock

St-Luc Hospital, Université Catholique de Louvain, Brussels, Belgium

Critical Care 2009, 13(Suppl 1):P204 (doi: 10.1186/cc7368)

Introduction Dynamic parameters predict fluid responsiveness and improve fluid management during surgery. We intend to demonstrate that the noninvasive pleth variability index (PVI) guides peroperative fluid management and optimizes the circulatory status.

Methods Patients scheduled for major abdominal surgery were randomized into two groups comparing peroperative PVI-directed fluid management (group P) with standard care (control, group C). Protocol: in group P, induction of general anesthesia was followed by 500 ml crystalloids and then crystalloids at 2 ml/kg/hour; 250 ml colloids were infused if the PVI was >13% for more than 5 minutes; if required, vasoactive support was introduced after the PVI had been lowered to <10%. In group C, 500 ml crystalloids were followed by fluids at the discretion of the anesthesiologist.

Results Eighty-two patients completed the protocol. No difference was detected in preoperative characteristics, type of surgery or anesthesia. Peroperative and postoperative (24-hour) crystalloid infusions were significantly different. Lactate levels were significantly lower in group P, and the peroperative and postoperative volumes infused in group P were lower (Figure 1).

Figure 1 (abstract P204)

Variable | Group C | Group P | P value
Perop. fluids (ml): crystalloids | 1,815 (±756) | 1,363 (±561) | <0.01
Perop. fluids (ml): colloids | 1,003 (±70©) | 890 (±574) | 0.43
Postop. fluids, 24 hours (ml): crystalloids | 3,516 (±1,618) | 3,107 (±1,099) | <0.01
Postop. fluids, 24 hours (ml): colloids | 358 (±456) | 268 (±448) | 0.43
Lactate (mmol/l): perop. (max) | 1.6 (±1.6) | 1.2 (±0.7) | <0.05
Lactate (mmol/l): postop. (24 hours) | 1.8 (±1.8) | 1.4 (±0.6) | <0.05
Lactate (mmol/l): postop. (40 hours) | 1.4 (±0.4) | 1.2 (±0.3) | <0.05
Perop. hypotension | 88% | 54% | 0.17
Perop. oliguria | 42% | 32% | 0.34
Serum creatinine (mg/dl): preop. | 1.0 (±0.3) | 1.0 (±0.2) | 0.91
Serum creatinine (mg/dl): postop. (48 hours) | 1.1 (±0.7) | 0.9 (±0.3) | 0.11

Conclusions The PVI improves peroperative fluid management in abdominal surgery. The reduced mean volume infused, together with the lower lactate levels, suggests that the PVI allows tailored fluid administration.
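The group P protocol amounts to a simple threshold rule: a colloid bolus when the PVI stays above 13% for 5 minutes, and vasoactive support considered only once the PVI is below 10%. Below is a minimal sketch of that decision logic, assuming a once-per-minute PVI sample and hypothetical function and class names; it is an illustration of the described protocol, not the authors' implementation.

```python
# Hypothetical sketch of PVI-guided peroperative fluid logic based on the thresholds
# reported in abstract P204. Sampling rate, names and structure are assumptions.
from dataclasses import dataclass

PVI_BOLUS_THRESHOLD = 13.0       # % - persistent values above this trigger a colloid bolus
PVI_VASOACTIVE_THRESHOLD = 10.0  # % - below this, hypotension is treated with vasoactive drugs
PERSISTENCE_MINUTES = 5          # threshold must persist this long before acting

@dataclass
class Recommendation:
    action: str
    detail: str

def recommend(pvi_history_pct, hypotensive):
    """Return a fluid-management recommendation from the recent PVI trend.

    pvi_history_pct: PVI readings (%) over the last minutes, one per minute (assumed).
    hypotensive:     True if mean arterial pressure is below the clinical target.
    """
    recent = pvi_history_pct[-PERSISTENCE_MINUTES:]
    if len(recent) >= PERSISTENCE_MINUTES and all(v > PVI_BOLUS_THRESHOLD for v in recent):
        return Recommendation("colloid_bolus", "Give 250 ml colloid (PVI > 13% for > 5 min)")
    if hypotensive and recent and recent[-1] < PVI_VASOACTIVE_THRESHOLD:
        return Recommendation("vasoactive_support", "PVI < 10%: unlikely fluid responsive")
    return Recommendation("maintain", "Continue 2 ml/kg/hour crystalloid background")

# Example: persistently high PVI in a normotensive patient triggers a bolus.
print(recommend([15.2, 14.8, 16.0, 14.1, 13.9], hypotensive=False).detail)
```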

Plethysmography variability index: a new fluid responsiveness parameter

M Feissel1, R Kalakhy1, J Badie1, G Robles1, J Faller1, JL Teboul2

1CHBM, Belfort, France; 2APHP, Le Kremlin Bicetre, France Critical Care 2009, 13(Suppl 1):P205 (doi: 10.1186/cc7369)

Introduction New predictors of fluid responsiveness have been obtained from plethysmographic waveforms displayed on pulse oximeters. However, they require recordings on a PC and offline, operator-dependent analysis. A new parameter called the plethysmography variability index (PVI) has been proposed by a pulse oximetry manufacturer for the purpose of assessing fluid responsiveness. Its advantage is that it can be automatically calculated and displayed on the screen of the pulse oximetry monitor. The aim of the study was to test the accuracy of this parameter to predict fluid responsiveness in critically ill patients.

Methods Inclusion criteria were septic shock patients fully adapted to their respirator and in sinus rhythm. Methods involved simultaneous recording of the following tracings: invasive blood pressure, plethysmographic pulse oximeter waveform (Philips™), ECG, airway pressure and the digital values displayed on the device (Masimo™). Echocardiography was used to calculate the velocity-time integral (VTI). We infused fluid (500 ml saline) in patients with pulse pressure variation (ΔPP) >15% and performed passive leg raising (PLR) in patients with ΔPP <15%. We compared the PVI with ΔPP and with the variability of the pulse oximeter wave amplitude (ΔPPleth) and sought the best threshold PVI value that predicted ΔPP >15%. Patients who increased their VTI by more than 15% in response to fluid or to PLR were defined as responders. The significance of the PVI threshold to distinguish between responders and nonresponders was examined.

Results In the first step 25 patients were enrolled. Fifty paired values were analysed. The r2 coefficients between ΔPP-PVI, ΔPPleth-PVI and ΔPP-ΔPPleth were 0.81, 0.79 and 0.74, respectively. A threshold PVI value of 20 identified patients with ΔPP >15% with a sensitivity of 84% and specificity of 90%. In a second step 18 other patients were enrolled. All patients with PVI >20 (n = 8) were fluid responders and 10 patients with PVI <20 were PLR nonresponders.
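For reference, ΔPP is usually computed over one respiratory cycle as the difference between the maximal and minimal beat-to-beat pulse pressures divided by their mean. The short sketch below applies that commonly used formula to invented example pressures; the numbers are not study data and the 15% decision rule simply mirrors the protocol described above.

```python
# Illustrative calculation of respiratory pulse pressure variation (deltaPP) with the
# commonly used formula. The beat-by-beat pressures are made-up example values.
import numpy as np

def delta_pp(systolic, diastolic):
    """deltaPP (%) over one respiratory cycle from beat-to-beat arterial pressures."""
    pp = np.asarray(systolic, dtype=float) - np.asarray(diastolic, dtype=float)
    pp_max, pp_min = pp.max(), pp.min()
    return 100.0 * (pp_max - pp_min) / ((pp_max + pp_min) / 2.0)

# Beats spanning one ventilator cycle (mmHg); invented numbers for illustration.
sys_p = [118, 124, 130, 126, 119, 115]
dia_p = [62, 63, 65, 64, 62, 61]
dpp = delta_pp(sys_p, dia_p)
print(f"deltaPP = {dpp:.1f}% -> {'fluid challenge' if dpp > 15 else 'passive leg raising'}")
```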

Conclusions The PVI automatically obtained from a pulse oximetry device seems an accurate index of fluid responsiveness. The threshold value of 20 distinguished responders from nonresponders with good sensitivity and specificity.

Calibration of pulse contour continuous cardiac output analysis

L Weng, B Du, XY Hu, JM Peng

Peking Union Medical College Hospital, Beijing, China Critical Care 2009, 13(Suppl 1):P206 (doi: 10.1186/cc7370)

Introduction We evaluated the effect of the calibration interval on the reliability of pulse contour cardiac output (COpc) measurement [1].

Methods Eleven patients were investigated for over 10 hours using two COpc monitors simultaneously. One COpc (COpcCAL) monitor was calibrated hourly, while the other (COpcNOCAL) was calibrated once initially without any further calibration. COpcCAL was compared with COpcNOCAL.

Results A total of 116 pairs of cardiac output measurements were obtained. After 3 hours, the correlation between COpcCAL and COpcNOCAL was r2 = 0.85, P < 0.0001, and the bias ± SD was -0.43 ± 0.87 l/minute; after 6 hours, r2 = 0.68, P = 0.0064, bias ± SD -0.83 ± 1.39 l/minute; after 10 hours, r2 = 0.70, P = 0.0026, bias ± SD -0.81 ± 1.21 l/minute. See Figures 1 and 2.

Conclusions The calibration interval has no effect on the reliability of the COpc measurement.

Reference

1. Godje O, et al.: Reliability of a new algorithm for continuous cardiac output determination by pulse-contour analysis during hemodynamic instability. Crit Care Med 2002, 30:52-58.

Is the pulse pressure variation a good predictor of fluid responsiveness in mechanically ventilated patients with low tidal volume?

C Costa, S Vieira, G Friedman, L Fialkow

Hospital de Clinicas de Porto Alegre, Brazil

Critical Care 2009, 13(Suppl 1):P207 (doi: 10.1186/cc7371)

Introduction Dynamic preload indicators are superior to static indicators for predicting fluid responsiveness [1-3]. The aim of this study is to evaluate the influence of a low tidal volume on the capacity of pulse pressure variation (PPV) to predict fluid responsiveness.

Methods A cross-sectional, interventional study that included 30 critically ill patients with acute circulatory failure, sedated and mechanically ventilated with a tidal volume of 6 to 7 ml/kg. Mechanical ventilatory measurements, including positive end-expiratory pressure, plateau and peak pressures and static compliance, and hemodynamic measurements, including PPV, heart rate, mean systemic and pulmonary arterial pressures, central venous pressure, pulmonary capillary wedge pressure and cardiac index, were obtained before and after a fluid challenge performed with 1,000 ml crystalloids or 500 ml colloids. Fluid responsiveness was defined as an increase in cardiac index of at least 15%.

Results Thirty patients were enrolled: aged 56 ± 16.8 years, APACHE score = 28 ± 8, male = 15; 19 patients with septic shock, one patient with sepsis, five patients in the postoperative period of liver transplantation, three patients with acute pancreatitis, one patient with cardiogenic shock and one patient in the postoperative period of aortic

Figure 1 (abstract P206)

Linear regression between COpcCAL and COpcNOCAL.

Figure 2 (abstract P206)

Bland-Altman plot of COpcCAL and COpcNOCAL. Solid line, bias; dashed lines, ± 2SD.

aneurysm surgery. Before fluid challenge: total positive end-expiratory pressure = 9 ± 3.7 cmH2O, static compliance = 34.3 ± 16.3 ml/cmH2O, pulmonary capillary wedge pressure = 13.8 ± 5 mmHg, central venous pressure = 11.6 ± 5 mmHg. Fourteen patients were fluid responders (Figure 1). The best threshold value of PPV was 10% (receiver operating characteristic curve area = 0.7, 95% CI = 0.51 to 0.9), with a sensitivity of 50%, specificity of 94%, a

positive predictive value of 88%, a negative predictive value of 68%, a positive likelihood ratio of 8.0 and a negative likelihood ratio of 0.53.
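The likelihood ratios above follow directly from the sensitivity and specificity. As a worked check, the sketch below reconstructs a plausible 2 x 2 table from the reported percentages (14 responders, 16 nonresponders); the individual counts are back-calculated assumptions consistent with those percentages, not data taken from the study.

```python
# Test characteristics of the PPV >= 10% cut-off from an assumed 2x2 table.
tp, fn = 7, 7    # responders with PPV >= 10% / < 10%  (assumed counts)
fp, tn = 1, 15   # nonresponders with PPV >= 10% / < 10%  (assumed counts)

sensitivity = tp / (tp + fn)               # 0.50
specificity = tn / (tn + fp)               # ~0.94
ppv = tp / (tp + fp)                       # ~0.88 (positive predictive value)
npv = tn / (tn + fn)                       # ~0.68 (negative predictive value)
lr_pos = sensitivity / (1 - specificity)   # ~8.0
lr_neg = (1 - sensitivity) / specificity   # ~0.53

print(f"Se {sensitivity:.0%}, Sp {specificity:.0%}, PPV {ppv:.0%}, NPV {npv:.0%}, "
      f"LR+ {lr_pos:.1f}, LR- {lr_neg:.2f}")
```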

Conclusions The baseline PPV is a good predictor of fluid responsiveness in mechanically ventilated patients with low tidal volume. The threshold value of 10% was associated with a significant increase in cardiac index after volume expansion.

Figure 1 (abstract P207). Relationship between the PPV and the cardiac index variation.

References

1. Teboul JL: Relation between respiratory changes in arterial pulse pressure and fluid responsiveness in septic patients with acute circulatory failure. Am J Respir Crit Care Med 2000, 162:134-138.

2. De Backer D: Pulse pressure variations to predict fluid responsiveness: influence of tidal volume. Intensive Care Med 2005, 31:517-523.

3. Chung-Chi H: Prediction of fluid responsiveness in acute respiratory distress syndrome patients ventilated with low tidal volume and high positive end-expiratory pressure. Crit Care Med 2008, 36:2810.

Proving the effectiveness of three dynamic indices to predict fluid responsiveness in septic mechanically ventilated patients

P Wacharasint, A Lertamornpong, A Wathanathum, A Wongsa

Phramongkutklao Hospital, Bangkok, Thailand

Critical Care 2009, 13(Suppl 1):P208 (doi: 10.1186/cc7372)

Introduction Assessment of fluid responsiveness is still a cornerstone of managing patients with severe sepsis and septic shock. Recently, new technologies based on cardiopulmonary interaction have been developed to improve the accuracy of bedside prediction of fluid responsiveness [1,2]. We evaluated the effectiveness and accuracy of three dynamic indices currently available on intensive care monitoring devices, namely pulse pressure variation (PPV) [3], stroke volume variation (SVV) and pulse oximetry plethysmographic waveform variation (POPV), in septic mechanically ventilated patients [4].

Methods A prospective clinical trial was conducted in 20 septic patients 18 years of age and older who had invasive blood pressure monitoring with an intraarterial cannula. PPV, SVV and POPV (%) were calculated and compared with the percentage cardiac index (CI) change. Patients with a CI increase induced by volume expansion >15% were classified as responders, and <15% as nonresponders. A parametric paired t test was used to compare hemodynamic parameters at baseline and after volume expansion. Student's t test was used to compare hemodynamic parameters in the responders and nonresponders groups. Receiver operating characteristic curves were used to evaluate the

predictive value of various indices on fluid responsiveness. P <0.05 was considered significant.

Results PPV showed the strongest correlation with the volume expansion-induced change in CI (r2 = 0.794), followed by SVV (r2 = 0.667) and POPV (r2 = 0.633). The areas under the receiver operating characteristic curves were 0.96 for PPV (P < 0.001), 0.92 for SVV (P = 0.001) and 0.85 for POPV (P = 0.008). POPV exceeding 14% (sensitivity 72%, specificity 90%), SVV exceeding 11% (sensitivity 90%, specificity 92%) and PPV exceeding 12% (sensitivity 84%, specificity 96%) allowed detection of fluid responsiveness.
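The areas under the curve quoted above can be reproduced mechanically from index values and responder labels. The sketch below computes an AUC with the rank (Mann-Whitney) formulation on invented example data, chosen so that the result is close to the 0.96 reported for PPV; it is not the study's dataset.

```python
# Area under the ROC curve via the rank (Mann-Whitney) formulation.
import numpy as np

def roc_auc(scores, labels):
    """AUC = probability that a randomly chosen responder has a higher index value
    than a randomly chosen nonresponder (ties counted as 0.5)."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Invented PPV (%) values: responders (label 1) tend to have larger respiratory variation.
ppv_values = [18, 15, 13, 16, 10, 9, 7, 11, 6, 8]
responder  = [1,  1,  1,  1,  1,  0, 0, 0,  0, 0]
print(f"AUC = {roc_auc(ppv_values, responder):.2f}")   # 0.96 for this example
```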

Conclusions In septic mechanically ventilated patients, PPV is the most effective dynamic parameter for predicting fluid responsiveness; PPV, SVV and POPV are all well correlated with the percentage change of CI.
References

1. Michard F, Teboul JL: Predicting fluid responsiveness in ICU patients: a critical analysis of the evidence. Chest 2002, 121:2000-2008.

2. Bendjelid K, Romand JA: Fluid responsiveness in mechanically ventilated patients: a review of indices used in intensive care. Intensive Care Med 2003, 29:352-360.

3. Michard F, et al.: Relation between respiratory changes in arterial pulse pressure and fluid responsiveness in septic patients with acute circulatory failure. Am J Respir Crit Care Med 2000, 162:134-138.

4. Feissel M, et al.: Plethysmographic dynamic indices predict fluid responsiveness in septic ventilated patients. Intensive Care Med 2007, 33:993-999.

Cross-comparison of the trending accuracy of continuous cardiac output measurement devices in postoperation patients

HK Kim, M Hadian, D Severyn, MR Pinsky

University of Pittsburgh Medical Center, Pittsburgh, PA, USA Critical Care 2009, 13(Suppl 1):P209 (doi: 10.1186/cc7373)

Introduction Arterial pulse analysis estimates of cardiac output (CO) are less invasive than pulmonary artery catheter (PAC)-derived ones. Although PAC bolus thermodilution CO (COtd) values are the reference CO values for most clinical studies, there is no defined gold standard. We therefore compared the co-variance of three commercially available arterial pulse analysis devices (FloTrac, LiDCO, PiCCO) and PAC with COtd and continuous CO (CCO).

Methods Seventeen postoperative cardiac surgery patients were studied for the first 4 hours post ICU admission. CO was measured simultaneously by FloTrac, LiDCO and PiCCO using the same arterial waveform. LiDCO and PiCCO were calibrated to the first COtd measured on ICU admission. Values of CO were compared before and after specific therapeutic interventions (volume, vasoactive or inotropic infusions). Absolute values for CO across all devices were compared by linear regression and Bland-Altman analysis. Dynamic changes for CO across all devices were compared by moment analysis.

Results In 17 patients, 72 paired simultaneous CO measurements were collected. By linear regression analysis, all devices correlated well with each other for absolute CO: r = 0.87 (PAC-LiDCO), 0.70 (PAC-PiCCO), 0.51 (PAC-FloTrac), 0.80 (LiDCO-PiCCO), 0.65 (LiDCO-FloTrac) and 0.50 (PiCCO-FloTrac), with P <0.001 in all pairs. By Bland-Altman analysis, the mean bias between CO measurements by each paired device was -0.18 l/min (PAC-LiDCO), 0.30 l/min (PAC-PiCCO), -0.43 l/min (PAC-FloTrac), 0.12 l/min (LiDCO-PiCCO), -0.63 l/min (LiDCO-FloTrac) and -0.73 l/min (PiCCO-FloTrac), with precision (1.96 SD, 95% CI) of ±1.56, ±2.26, ±3.37, ±1.98, ±2.97 and ±3.45 l/min, respectively. By Pearson moment analysis, the dynamic change for CO was well correlated between PAC-LiDCO, PAC-PiCCO, LiDCO-PiCCO and LiDCO-FloTrac (r = 0.78, 0.56, 0.65 and 0.55, respectively). However, FloTrac showed poor correlation with PAC and PiCCO (r = 0.28 and 0.33, respectively).
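As a side note on the Bland-Altman figures quoted above, the bias, 95% limits of agreement and percentage error can be reproduced from paired measurements as sketched below (hypothetical CO values, not the study data; the percentage error follows the common 1.96 SD of bias / mean CO convention).

```python
# Minimal sketch of a Bland-Altman comparison of two CO methods (hypothetical data).
import numpy as np

def bland_altman(co_a: np.ndarray, co_b: np.ndarray):
    diff = co_a - co_b                                 # paired differences
    bias = diff.mean()                                 # mean bias
    sd = diff.std(ddof=1)
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)         # 95% limits of agreement
    pct_error = 100.0 * 1.96 * sd / ((co_a + co_b) / 2.0).mean()
    return bias, loa, pct_error

pac   = np.array([4.1, 5.2, 3.8, 6.0, 4.7, 5.5])       # l/min, device A
lidco = np.array([4.3, 5.0, 4.1, 5.7, 4.9, 5.8])       # l/min, device B
bias, loa, pe = bland_altman(pac, lidco)
print(f"bias {bias:.2f} l/min, LoA {loa[0]:.2f} to {loa[1]:.2f} l/min, error {pe:.0f}%")
```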

Conclusions In postoperative cardiac surgery patients, the absolute CO values from the arterial pulse analysis devices showed good correlation with each other and with PAC COtd and CCO. However, the dynamic changes of CO were less well correlated between devices than the absolute values. Because each arterial pulse analysis device showed different correlations with PAC CO, CO data from one monitoring system should be used only with caution alongside data from another to drive resuscitation protocols.

Comparison of three noninvasive cardiac output monitors in patients undergoing therapeutic hypothermia after cardiac arrest

GM Haslam, FE Kelly, WA English, A D'Agapeyeff, AJ Padkin, TM Cook, JP Nolan

Royal United Hospital, Bath, UK

Critical Care 2009, 13(Suppl 1):P210 (doi: 10.1186/cc7374)

Introduction We planned to observe the performance of the LiDCOplus (LiDCO Systems, UK), PiCCO (Pulsion Medical Systems, Germany) and NICO (Novametrix-Respironics, USA) continuous cardiac output monitors during therapeutic hypothermia and rewarming post cardiac arrest. PiCCO calibration failure during hypothermia has been reported [1,2]. Use of the LiDCOplus and the NICO monitor during therapeutic hypothermia has not been reported. The LiDCOplus pulse power analysis is calibrated using a bolus of lithium chloride instead of transpulmonary thermodilution. Function of the NICO is based on the Fick principle and on partial rebreathing of carbon dioxide. LiDCOplus and NICO performance should not be affected by hypothermia.

Figure 1 (abstract P210). Cardiac output trends for the PiCCO, LiDCO and NICO monitors against time (hours) in individual patients.

Methods Local research ethics committee approval was obtained. Patients admitted to our ICU for therapeutic hypothermia after cardiac arrest were recruited to the study following assent from the next of kin. Hypothermia was induced with 2 l crystalloid at 4°C and maintained using an Alsius Coolguard Icy Catheter (Alsius Corp., USA) inserted via the femoral vein. A 5 F PiCCO catheter was inserted in the femoral artery on the opposite side and connected to the PiCCO and LiDCOplus monitors. A NICO system was attached to the patient's tracheal tube. Systems were calibrated and recalibrated according to the manufacturer's instructions (PiCCO 8 hourly, LiDCOplus 24 hourly). The bladder temperature was recorded and maintained at 33°C for 24 hours. The temperature, heart rate, blood pressure, and cardiac output on all three monitors were recorded hourly. After 24 hours, patients were rewarmed at 0.25°C/hour to a target of 36.5°C.

Results Six patients were recruited over 8 months. All patients were at the target temperature of 33°C when cardiac output was first recorded. No problems were encountered calibrating the PiCCO, LiDCOplus or NICO monitors at temperatures as low as 33°C. All three monitors were observed to trend together (Figure 1). Statistical analysis of these trends will be available at the time of presentation.

Conclusions LiDCOplus, PiCCO and NICO perform comparably in the temperature range of 33 to 36.5°C.

References

1. Ong T, Gillies MA, Bellomo R: Failure of continuous cardiac output measurement using the PiCCO device during induced hypothermia: a case report. Crit Care Resusc 2004, 6:99-101.

2. Sami A, Sami A, Rochdil N, Hatem K, Salah BL: PiCCO monitoring accuracy in low body temperature. Am J Emerg Med 2007, 25:845-846.

Head-up tilt and passive leg raising in healthy volunteers as a preclinical model for preload-induced stroke volume modification

A Lima, E Klijn, J Bakker, C Ince, J Van Bommel

Erasmus Medical Center, Rotterdam, the Netherlands Critical Care 2009, 13(Suppl 1):P211 (doi: 10.1186/cc7375)

Introduction In clinical practice, fluid responsiveness is tested by inducing changes in the stroke volume (SV), as measured with invasive monitoring techniques in often time-consuming procedures. To be able to investigate the effects of changes in SV on other hemodynamic parameters, there is a need for a preclinical model that can be translated to clinical practice. For this purpose we studied the effect of the head-up tilt (HUT) test and passive leg raising (PLR) test on SV in five healthy volunteers.

Methods The tilt table test consisted of 3 minutes of supine rest followed by a HUT of 70° on a manually operated tilt table and ending with 3 minutes of supine rest. The PLR test consisted of 3 minutes of rest in a semirecumbent position of 30°, followed by 3 minutes of PLR (lower limbs elevated at 30° and trunk in supine position) and ending with 3 minutes of rest in a semirecumbent position. Three HUTs and two PLR tests were performed. The SV was measured continuously and noninvasively using a NICOM (Cheetah Medical, Tel Aviv, Israel), based on chest bioreactance, and a Finometer (TNO Biomedical Instrumentation, Amsterdam, the Netherlands), based on the Modelflow method. The Finometer also measured the mean arterial pressure.


Figure 1 (abstract P211)


Results Both HUT and PLR induced significant changes in SV (Figure 1). In addition, the cardiac output changed significantly, except during HUT when measured with the Finometer. Although both devices showed a similar response to the postural changes, the baseline values and magnitude of these responses were not identical. The mean arterial pressure remained constant during these manoeuvres. The heart rate significantly increased during HUT but not during PLR.

Conclusions The combination of noninvasive monitoring techniques with preload-induced SV modifications in healthy volunteers provides an excellent preclinical model for the study of fluid responsiveness. Although these hemodynamic monitors might provide different results, this model should be very suitable to assess the effect of SV variations on, for instance, parameters of peripheral perfusion.

Global end-diastolic volume as a predictor of the need for massive transfusion in multiple-trauma patients with hemorrhagic shock

N Saito, Y Sakamoto, K Mashiko

Chiba Hokusou Hospital, Nippon Medical School, Chiba, Japan Critical Care 2009, 13(Suppl 1):P212 (doi: 10.1186/cc7376)

Introduction The PiCCO system enables hemodynamic evaluation and monitoring by two different approaches: the transpulmonary thermal dilution technique and pulse contour analysis. Optimal monitoring of cardiac preload is of paramount importance for the hemodynamic management of multiple-trauma patients with hemorrhagic shock. There have been only a few studies on the use of the PiCCO system (Pulsion, Germany) for hemodynamic monitoring in multiple-trauma patients with hemorrhagic shock. We hypothesized that performing an adequate cardiac preload evaluation with the PiCCO system would make it possible to predict latent hemorrhagic progression.

Methods Data from 53 consecutive multiple-trauma patients (age 51 ± 17.8 years, injury severity score 30 ± 12.9) with hemorrhagic shock at the scene of the injury or in the emergency room between June 2007 and November 2008 were analyzed. All patients underwent a hemodynamic evaluation with the PiCCO system. We divided the patients into two groups according to whether they received a massive transfusion (MT) (>2,000 ml of packed red blood cells transfused within 24 hours of admission) and compared their PiCCO data on admission to the ICU: cardiac output (CO), systemic vascular resistance (SVR), indexed global end-diastolic volume (GEDVi), and indexed extravascular lung water. The chi-square test and paired t test were used to perform the statistical analysis.

Table 1 (abstract P212)

                     MT group       Non-MT group    P value
CO (l/min)           4.88 ± 2.1     6.38 ± 2.8      0.034
SVR (dyn x s/cm5)    1,169 ± 364    1,452 ± 601     0.046
GEDVi (ml/m2)        554 ± 158      713 ± 238       0.007

Results Twenty-seven patients required massive transfusion. Mortality was higher in the MT group (P = 0.05), and the CO, GEDVi and ITBVi values were significantly lower in the MT group (Table 1).

Conclusions The parameters measured with the PiCCO system enabled evaluation of the correct cardiac preload in the multiple-trauma patients with hemorrhagic shock. GEDVi was useful as a predictor of the need for MT.

Should pulse pressure variation be indexed to tidal volume?

S Vistisen, J Koefoed-Nielsen, A Larsson

Aalborg Hospital/Aarhus University Hospitals, Aarhus C, Denmark Critical Care 2009, 13(Suppl 1):P213 (doi: 10.1186/cc7377)

Introduction Pulse pressure variation (PPV) for prediction of fluid responsiveness depends on the tidal volume (VT), and using VTs of the recommended 6 ml/kg makes PPV unreliable [1]. So far, nobody has suggested how to handle the VT when interpreting PPV or other dynamic parameters, but we hypothesise that PPV is proportional to VT and thus that PPV should be indexed to VT. The aim was to investigate how three VT levels affected PPV at four different intravascular volumes.

Methods The study was approved by the national animal ethical committee. Eight anesthetised and ventilated pigs (23 to 27 kg) were bled 25% of the blood volume (hypovolemia). PPV was measured at ventilation with VTs of 6, 9 and 12 ml/kg (VT6, VT9 and VT12, respectively). Thereafter, depleted blood was replaced with voluven (normovolemia) and PPV was again measured at the three VT levels and subsequently at the +25% and +50% hypervolemic levels also generated with voluven infusion. PPV values were log-transformed and compared with a paired t test. Comparisons were made at each intravascular volume level between VT6 and VT9, VT9 and VT12, and VT6 and VT12. Because three comparisons were performed, P <0.05/3 = 0.017 was considered significant.

Results All comparisons for PPV at different VTs were significantly different (P <0.001); see Table 1 for factorial increases in PPV. At -25% hypovolemia, PPV did not fully double with doubling of VT, whereas PPV slightly more than doubled at normovolemia and hypervolemia.

Table 1 (abstract P213) Comparisons of PPV at different VTs

           Factorial increase in PPV
Volume     VT6-VT9         VT9-VT12        VT6-VT12
-25%       1.28 to 1.34    1.17 to 1.28    1.51 to 1.70
0%         1.53 to 2.00    1.24 to 1.60    2.17 to 2.80
+25%       1.47 to 1.87    1.26 to 1.50    1.88 to 2.77
+50%       1.50 to 1.71    1.30 to 1.61    2.09 to 2.56

Data presented as the 95% CI.
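One simple form of the indexation hypothesised by the authors (an assumption for illustration, not a validated formula) is to divide PPV by the tidal volume in ml/kg, so that near-proportional readings obtained at different VT settings collapse onto a similar value:

```python
# Minimal sketch: PPV indexed to tidal volume (hypothetical readings).
def ppv_indexed_to_vt(ppv_percent: float, vt_ml_per_kg: float) -> float:
    return ppv_percent / vt_ml_per_kg

print(ppv_indexed_to_vt(8.0, 6.0))    # VT 6 ml/kg  -> ~1.3 %/(ml/kg)
print(ppv_indexed_to_vt(16.0, 12.0))  # VT 12 ml/kg -> ~1.3 %/(ml/kg)
```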

Conclusions Our study in pigs showed that PPV strongly depends on the VT and that PPV is nearly proportional to the VT, indicating that the clinical value of PPV may be improved by VT indexation.
Reference

1. De Backer D, et al.: Pulse pressure variations to predict fluid responsiveness: influence of tidal volume. Intensive Care Med 2005, 31:517-523.

Fluid responsiveness in patients following major surgery

M Tuccillo1, M Cecconi2, N Al-Subaie2, M Hamilton2, R Grounds2, G Della Rocca1, A Rhodes2

1Azienda Ospedaliera Universitaria S. Maria della Misericordia,

Udine, Italy; 2St George's Hospital, London, UK

Critical Care 2009, 13(Suppl 1):P214 (doi: 10.1186/cc7378)

Introduction The aim of the study was to see how many patients were fluid responders on arrival in the ICU and to evaluate the performance of dynamic preload indices in predicting fluid responsiveness in fully sedated and mechanically ventilated patients. The following indices were studied: pulse pressure variation (PPV), stroke volume variation (SVV) and systolic pressure variation (SPV).

Methods Patients were monitored with the LiDCO™plus and received a fluid challenge (250 ml boluses of colloid in 5 min) to ascertain fluid responsiveness. Patients in whom a fluid challenge produced a 10% increase in stroke volume were considered fluid responders, and the fluid challenge was then repeated according to the unit protocol [1]. In fully mechanically ventilated patients, the PPV, SVV and SPV were recorded and receiver operating characteristic (ROC) analysis was performed.
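The ROC analysis of a dynamic index against the responder/nonresponder classification can be sketched as below (hypothetical SVV values and outcomes, not the study data; scikit-learn is assumed, and the cutoff is chosen here by the Youden index).

```python
# Minimal sketch of ROC analysis for a dynamic preload index (hypothetical data).
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

responder = np.array([1, 1, 0, 1, 0, 0, 1, 0, 0, 1])  # 1 = >10% SV increase after fluid
svv = np.array([12.0, 10.5, 6.0, 14.0, 7.5, 5.0, 11.0, 11.5, 9.0, 13.0])  # %

auc = roc_auc_score(responder, svv)
fpr, tpr, thresholds = roc_curve(responder, svv)
best = int(np.argmax(tpr - fpr))  # Youden's J statistic
print(f"AUC = {auc:.2f}; cutoff ~ {thresholds[best]:.1f}% "
      f"(sensitivity {tpr[best]:.0%}, specificity {1 - fpr[best]:.0%})")
```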

Results Thirty-three patients (mean age 64 years; SD ± 13.80; mean BMI 27.34; 16 female and 17 male) were included; 23 patients were on spontaneous ventilation and 10 were fully mechanically ventilated with a mean tidal volume of 8 ml/kg and positive end-expiratory pressure of 5 cmH2O. Thirteen patients (40%) were responders, of which five were mechanically ventilated. Areas under the ROC curve (AUC) for dynamic predictors of fluid responsiveness were examined for ventilated patients: for PPV, AUC = 0.65 with SD ± 0.14, P = 0.28, showing a sensitivity of 67% and a specificity of 79% for a cutoff value of 14%; for SVV, AUC = 0.73 with SD ± 0.14, P = 0.1, showing a sensitivity of 67% and a specificity of 74% for a cutoff value of 9%; and for SPV, AUC = 0.74 with SD ± 0.12, P = 0.09, showing a sensitivity of 67% and a specificity of 86% for a cutoff value of 8%.

Conclusions A high percentage of patients (40%) were fluid responsive on arrival in the ICU following major surgery. The PPV, SVV and SPV have the potential to predict fluid responsiveness in mechanically ventilated patients.
Reference

1. Pearse RM, et al.: Early goal-directed therapy after major surgery reduces complications and duration of hospital stay. A randomised, controlled trial. Crit Care 2005, 9:R687-R693.

Alternative echographic assessment of inferior vena cava diameter variation in mechanically ventilated patients

T Leclerc1, N Libert1, JP Tourtier1, A Bannay2, S De Rudnicki1

1HIA Val-de-Grace, Paris, France; 2CHU Nancy, France Critical Care 2009, 13(Suppl 1):P215 (doi: 10.1186/cc7379)

Introduction The echographic assessment of inferior vena cava (IVC) diameter (IVCD) variation allows for fluid-responsiveness prediction in ventilated septic patients [1,2].

Figure 1 (abstract P215)

Its applicability in surgical patients is not established, partly due to the difficulty of acquiring a subxiphoidian (SX) view after abdominal surgery. The transhepatic (TH) view of the IVC could provide an interesting alternative.

Methods In this prospective, randomized, crossover, pilot study, IVCD variation was assessed in consecutive mechanically ventilated (volume control) ICU patients, with SX and TH views, in random order by one operator. M-mode images were acquired at 100 mm/s, 2 cm below the junction of the IVC with the right atrium. The minimal (Dmin) and maximal (Dmax) IVCD values were later measured on pooled images. IVCD variation [2], defined as dIVC = (Dmax - Dmin) / ((Dmax + Dmin) / 2), was compared between both views (when available): dIVCsx and dIVCth for subxiphoidian and transhepatic views, respectively.
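For clarity, the two indices compared in this abstract can be written out directly from the definition given above and the one commonly used in the cited literature (a minimal sketch with hypothetical M-mode diameters, not the authors' measurement software).

```python
# Minimal sketch: IVC diameter variation and distensibility index (hypothetical values).
def divc(d_max: float, d_min: float) -> float:
    """dIVC = (Dmax - Dmin) / ((Dmax + Dmin) / 2), in percent (definition used above)."""
    return 100.0 * (d_max - d_min) / ((d_max + d_min) / 2.0)

def distensibility_index(d_max: float, d_min: float) -> float:
    """(Dmax - Dmin) / Dmin, in percent, as commonly defined in reference [1]."""
    return 100.0 * (d_max - d_min) / d_min

d_max_mm, d_min_mm = 21.0, 18.0  # hypothetical M-mode measurements
print(f"dIVC = {divc(d_max_mm, d_min_mm):.1f}%, "
      f"distensibility index = {distensibility_index(d_max_mm, d_min_mm):.1f}%")
```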

Results Twenty-eight patients were included, 19 medical and nine surgical. The TH view was obtained in all patients, the SX view in only 22 of them, with failures in four surgical and two morbidly obese medical patients. Linear regression showed a strong correlation between dIVCth and dIVCsx (n = 22, r2 = 0.98, P <0.001). The agreement was satisfactory using Bland-Altman analysis (Figure 1): bias was 0.26%, limits of agreement were -8.4 to +7.9%. Results were similar for the IVC distensibility index [1].

Conclusions Our results suggest that, when the SX view is unavailable, it can be replaced by the TH view for the echographic assessment of IVCD variation. Further study is ongoing to formally establish its validity as a predictor of fluid responsiveness, especially in surgical ICU patients.
References

1. Barbier C, et al.: Intensive Care Med 2004, 30:1740-1746.

2. Feissel M, et al.: Intensive Care Med 2004, 30:1834-1837.

Global end-diastolic volume and global end-diastolic volume index are age dependent in awake, spontaneous-breathing patients after elective craniotomy

A Rieß, S Wolf, JF Landscheidt, CB Lumenta, P Friederich, L Schürer

Klinikum Bogenhausen, München, Germany

Critical Care 2009, 13(Suppl 1):P216 (doi: 10.1186/cc7380)

Introduction Estimation of cardiac preload is an important prerequisite for adequate volume resuscitation. The global end-diastolic volume (GEDV) and the global end-diastolic volume index (GEDVI) are surrogate parameters for preload and have been hypothesized to be age dependent [1]. The current study was performed to assess the influence of age on the preload parameters GEDV and GEDVI prospectively.

GEDV and GEDVI in age groups

Age group              GEDV (ml)                 GEDVI (ml/m2)
21 to 40 years old     1,154 (1,442 to 980)      612 (674 to 548)
41 to 50 years old     1,156 (1,295 to 975)      598 (680 to 557)
51 to 60 years old     1,183 (1,352 to 1,074)    652 (752 to 597)
61 to 70 years old     1,462 (1,630 to 1,182)    756 (875 to 655)
71 to 83 years old     1,337 (1,567 to 1,157)    734 (781 to 634)

Data presented as median (75th to 25th quartile).

Methods One hundred and one patients (41 male, 60 female) scheduled for brain tumour surgery were investigated using the PiCCOplus device (Pulsion Medical Systems AG, Munich, Germany). On postoperative day 1, the cardiac output, GEDV and GEDVI were determined by transpulmonary thermodilution before discharge from the ICU. The influence of predefined age groups (21 to 40 years old, n = 12; 41 to 50 years old, n = 24; 51 to 60 years old, n = 20; 61 to 70 years old, n = 26; 71 to 83 years old, n = 19) was tested using a Kruskal-Wallis test. The level of significance was 5%.
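The age-group comparison described above is a standard Kruskal-Wallis test; a minimal sketch with hypothetical GEDVI values (not the study data) is shown below using SciPy.

```python
# Minimal sketch: Kruskal-Wallis test of GEDVI across age groups (hypothetical data).
from scipy.stats import kruskal

gedvi_21_40 = [590, 615, 640, 600, 570]   # ml/m2
gedvi_41_50 = [580, 610, 630, 595, 605]
gedvi_61_70 = [720, 760, 700, 780, 740]

statistic, p_value = kruskal(gedvi_21_40, gedvi_41_50, gedvi_61_70)
print(f"H = {statistic:.2f}, P = {p_value:.4f}")  # P < 0.05 suggests an age effect
```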

Results Age significantly influenced the GEDV (P = 0.0024) as well as the GEDVI (P = 0.0007). Cardiac output (P = 0.3555), mean arterial pressure (P = 0.0764) and systemic vascular resistance (P = 0.1446) were not dependent on age.

Conclusions The volumetric parameter GEDV is dependent on age in haemodynamically healthy and spontaneously breathing patients. Indexing to body surface area does not remove age dependence. Targeting volume resuscitation using fixed ranges of the GEDVI acquired by transpulmonary thermodilution without reference to the patient's age seems not to be appropriate.
Reference

1. Wolf S, et al.: ITBV and GEDV but not EVLW acquired by transpulmonary thermodilution is age dependent in a series of neurosurgical patients [poster]. Intensive Care Med 2007, 33(S2):P0273.

Recalibration of pulse contour cardiac output using the PiCCO-2 device: when to perform the next thermodilution?

W Huber, S Mair, T Oelfin, B Saugel, V Phillip, H Einwaechter, R Schmid

Klinikum rechts der Isar der Technischen Universität München, Germany

Critical Care 2009, 13(Suppl 1):P217 (doi: 10.1186/cc7381)

Introduction Haemodynamic monitoring is a cornerstone of intensive care. In addition to parameters of preload and afterload, the assessment of cardiac output (CO) is of paramount importance. Thermodilution (TD) using the pulmonary artery catheter and transpulmonary TD using the PiCCO device are the gold standards for CO determination. After calibration by TD, the PiCCO device is able to assess CO using pulse contour (PC) analysis. Despite an overall good correlation of PC-CO and TD-CO in several studies, the manufacturer suggests recalibration by TD after 8 hours. Little is known about the long-term accuracy of PC-CO. Therefore the aim of our prospective study was to investigate the long-term accuracy of PiCCO-2-derived PC-CO in the daily ICU routine.

Methods In 10 consecutive patients (five male, five female, age 65 ± 15 years) the PC-CO was recorded immediately before recalibration by TD-CO. One hundred and ninety-four measurements with a mean time-lag between two measurements of TD-CO of 663.5 ± 371 min (100 to 2,700) were recorded. Mechanical ventilation, catecholamine therapy and arrhythmia were present during 60 (31%), 132 (68%) and 154 (79%) of the measurements, respectively.

Results The 194 pairs of PC-CO and TD-CO showed a highly significant correlation (P <0.001; r = 0.875). There was no significant difference between PC-CO and TD-CO (4.1 ± 1.6 vs. 4.07 ± 1.4 l/min/m2). Analysis according to Bland-Altman demonstrated a mean bias of -0.036 ± 0.778 l/min/m2 (lower and upper limits of agreement -1.56 and 1.49 l/min/m2; percentage error of 38%). The difference between PC-CO and TD-CO was not correlated with the time-lag to the last calibration (P = 0.257, r = -0.083 for the uncorrected difference; P = 0.067, r = 0.134 for the absolute values of the difference). Further analysis demonstrated that the absolute value of the differences correlated with TD-CO (P = 0.02, r = 0.226). Subgroup analysis of 160 measurements with TD-CO <5.5 l/min/m2 demonstrated an improved bias of 0.085 ± 0.53 l/min/m2 (lower/upper limits of agreement: -0.98 and 1.12 l/min/m2) and a percentage error of 28%.

Conclusions PiCCO-2-derived PC-CO and TD-CO are highly significantly correlated. Accuracy is not influenced by the time-lag to the last calibration. Similar to previous data, PC-CO might overestimate very high CO, which usually does not influence clinical practice. Recalibration should be considered in patients with markedly increased PC-CO.

Transpulmonary lithium dilution technique: time to recalibration and calibration drift

R Stumpfle, M Cecconi, D Dawson, M Hamilton, RM Grounds, A Rhodes

St George's Hospital, London, UK

Critical Care 2009, 13(Suppl 1):P218 (doi: 10.1186/cc7382)

Introduction We have previously demonstrated that an average of at least two curves is necessary to improve the calibration of the lithium dilution technique of the LiDCO™plus. The precision of the new calibration process is able to detect a least significant change (LSC) of 17% [1]. The primary aim of this study was to evaluate the drift after an initial calibration with two lithium dilution curves. The second aim of the study was to evaluate the relationship between the magnitude of the time to recalibration and the magnitude of the drift.

Methods Patients requiring monitoring with the LiDCO™plus received an initial calibration plus a second calibration when clinically indicated. Data were downloaded from devices and analysed using the LiDCO™viewPRO v.1 program. Absent, abandoned or rejected calibration curves were excluded. Calibration factors from the first and second calibrations were compared. All recalibrations in which the drift was higher than the LSC (17%) were considered useful calibrations. Regression analysis for the time to recalibration and drift was performed. Receiver operating characteristic curve analysis was performed for the time free of calibration and the usefulness of calibrations (drift >17%).
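The drift classification described above reduces to comparing the relative change between two calibration factors with the 17% least significant change; a minimal sketch of this logic (an assumption for illustration, not the LiDCO software) is given below.

```python
# Minimal sketch: classify a recalibration as useful if the drift exceeds the LSC.
LSC_PERCENT = 17.0  # least significant change of the two-curve calibration [1]

def drift_percent(cal_factor_1: float, cal_factor_2: float) -> float:
    return 100.0 * abs(cal_factor_2 - cal_factor_1) / cal_factor_1

def recalibration_useful(cal_factor_1: float, cal_factor_2: float) -> bool:
    return drift_percent(cal_factor_1, cal_factor_2) > LSC_PERCENT

print(recalibration_useful(1.00, 1.10))  # False: 10% drift, within the LSC
print(recalibration_useful(1.00, 1.25))  # True: 25% drift, recalibration useful
```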

Results A total of 45 files from 45 patients with two calibration points were identified. Patient mean age was 62 (39 to 88) years, mean height 1.69 (1.40 to 1.94) metres and mean weight 70.9 (40 to 155) kg. Time to recalibration varied from 5 to 39 hours. In 23 (51%) patients the drift was lower than 17% (unnecessary calibration). In 22 (49%) patients the drift was higher than 17% (necessary calibration).

Regression analysis did not show any significant relationship between the time to recalibration and drift. This was confirmed by receiver operating characteristic curve analysis.

Conclusions The newly implemented calibration process (using the average of two curves) does not show any time-dependent drift. Despite this, 49% of the recalibrations were necessary (drift >17%). Further data analysis is necessary to identify when to recalibrate.
Reference

1. Cecconi M, Dawson D, Grounds R, Rhodes A: Lithium dilution cardiac output measurement in the critically ill patient: determination of precision of the technique. Online First [Epub ahead of print].

Individualized intraoperative patient optimization using uncalibrated arterial pressure waveform analysis in high-risk patients undergoing major abdominal surgery

J Mayer, J Boldt, R Beschmann, A Stephan, S Suttner

Klinikum Ludwigshafen, Germany

Critical Care 2009, 13(Suppl 1):P219 (doi: 10.1186/cc7383)

Introduction Established methods to optimize cardiac function and fluid balance based on flow-related variables are invasive or require considerable attentiveness. The purpose of this study was to guide intraoperative fluid and catecholamine therapy in patients with pre-existing cardiac disease undergoing major abdominal surgery using a recently introduced less-invasive device without the need for manual calibration (FloTrac/Vigileo™) and to determine possible improvement in outcome by means of N-terminal pro-brain natriuretic peptide (NT-proBNP) plasma levels and the duration of hospital stay.

Methods Forty American Society of Anesthesiologists III patients scheduled for elective major abdominal surgery with pre-existing cardiac disease (coronary artery disease, myocardial infarction, cardiac surgery, heart failure, cardiomyopathy) were studied. Patients were randomly allocated into a standard care group and an intervention group, where a stroke volume variation (SVV) and cardiac index (CI)-based protocol for volume and catecholamine therapy was implemented until the end of surgery. In brief, CI >2.5 l/min/m2 was aimed for, with a SVV threshold value for fluid challenge of 12%. After the baseline (skin incision), haemodynamic data and plasma NT-proBNP levels were obtained after 180 minutes, at the end of surgery, 5 hours post surgery, and on postoperative days 1 and 2, and the ICU and hospital stays were recorded.

Results Demographic data and Physiological and Operative Severity Score for the Enumeration of Mortality and Morbidity (POSSUM) physiology and operative scores were comparable between the groups. The intervention group received significantly more colloid volume replacement and more dobutamine; crystalloid volume replacement and norepinephrine consumption did not differ. Plasma NT-proBNP levels were significantly higher in the standard care group on postoperative days 1 and 2 (832 ± 675 vs. 1,633 ± 690 pg/ml and 1,097 ± 827 vs. 2,085 ± 871 pg/ml). The mean hospital stay was reduced in the intervention group (14.8 ± 4.7 days) versus 20.6 ± 8.1 days in the standard care group (P = 0.009), whereas the ICU stay did not differ significantly.

Conclusions The use of uncalibrated arterial waveform analysis (FloTrac/Vigileo™) for intraoperative patient optimization in patients with pre-existing cardiac disease undergoing major abdominal surgery is associated with a reduction of hospital stay and lower plasma NT-proBNP levels.
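The intervention protocol summarised above (CI target >2.5 l/min/m2, fluid challenge when SVV exceeds 12%) can be expressed as a simple decision rule; the sketch below is an assumed rendering for illustration, not the published protocol text.

```python
# Minimal sketch: SVV/CI-based intraoperative decision rule (assumed logic).
def gdt_action(cardiac_index: float, svv_percent: float) -> str:
    """CI in l/min/m2, SVV in percent."""
    if svv_percent > 12.0:
        return "fluid challenge"                       # preload-dependent
    if cardiac_index <= 2.5:
        return "consider inotrope (e.g. dobutamine)"   # low flow despite low SVV
    return "no intervention"

print(gdt_action(2.2, 15.0))  # fluid challenge
print(gdt_action(2.2, 8.0))   # consider inotrope
print(gdt_action(3.0, 8.0))   # no intervention
```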

Evaluation of the effectiveness of different volume replacement therapies in postoperative hypovolemic patients using the PiCCO monitoring system

T Gondos1, Z Marjanek2, Z Ulakcsai3, Z Szabo4, L Bogar3, M Karolyi5, B Gartner6, J Futo7

1Semmelweis University, Budapest, Hungary; 2JOK, Vâc, Hungary; 3PTE, Pécs, Hungary; 4Uzsoki Hospital, Budapest, Hungary; 5Kenézy Hospital, Debrecen, Hungary; 6Petz Hospital, Gyor, Hungary; 7St Imre Hospital, Budapest, Hungary Critical Care 2009, 13(Suppl 1):P220 (doi: 10.1186/cc7384)

Introduction We have no exact information about the hemo-dynamically relevant intravascular volume effect of different volume replacements. The aim of the study was to describe the changes in intravascular volume status 120 minutes after the start of the infusions using a transpulmonary thermodilution technique (PiCCO system; Pulsion).

Methods The prospective, randomized, multicentre study of 11 ICUs involved 200 mixed postoperative hypovolemic patients in Hungary between 2005 and 2008. Ten millilitres per kilogram of lactated Ringer (LR) or 4% gelatin (GEL) or 6% hydroxyethyl-starch 130/0.4 (HES), or 5% human albumin (HA) was infused to 50 patients in each group over 30 minutes. Hemodynamic measurements were taken at baseline, 30, 45, 60, 90 and 120 minutes after the start of infusion.

Results There were no significant differences in basic demographic and laboratory data among the four groups. Differences between baseline and 120-minute values of the same solution were significant only for the colloids: mean arterial pressure (MAP): HES 9%, HA 6%; central venous pressure (CVP): GEL 14%, HES 28%, HA 19%; cardiac index (CI): GEL 15%, HES 29%, HA 20%; global end-diastolic volume index (GEDVI): GEL 6%, HES 17%, HA 11%; stroke volume variation (SVV): GEL -12%, HES -15%, HA -25%; oxygen delivery index (DO2l): GEL 10%, HES 15%, HA 12%; central venous oxygen saturation (ScvO2): GEL 5%, HES 6%, HA 3%. Significant intergroup differences at 120 minutes: MAP: HES-LR; CVP: HES-LR, HA-LR; CI: GEL-LR, HES-LR, HA-LR; GEDVI: GEL-LR, HES-LR, HA-LR, HES-GEL; SVV: HES-LR, HA-LR; DO2I: GEL-LR, HES-LR, HA-LR; ScvO2: GEL-LR, HES-LR, HA-LR. The ratio of patients with normalized baseline hypovolemia parameters at 120 minutes: CVP: LR 0%, GEL 27%, HES 47%, HA 20%; CI: LR 0%, GEL 50%, HES 64%, HA 50%; GEDVI: LR 0%, GEL 25%, HES 68%, HA 44%; SVV: LR 3%, GEL 20%, HES 11%, HA 25%.

Conclusions In postoperative hypovolemic patients, 10 ml/kg LR solution has no demonstrable hemodynamic effect 120 minutes after the start of infusion. The colloid solutions have slightly different hemodynamic effects. Ten millilitres per kilogram of colloid can improve the hemodynamic parameters, but at the end of the second hour the improvement is at most 30% relative to baseline.

Coupling of cardiac index and global end-diastolic volume index: is it mathematical or something else?

T Gondos1, Z Marjanek2, G Halasz3

1Semmelweis University, Budapest, Hungary; 2JOK, Vác, Hungary; 3BME, Budapest, Hungary

Critical Care 2009, 13(Suppl 1):P221 (doi: 10.1186/cc7385)

Introduction Our goal was to analyse the relationship between the cardiac index (CI) and the global end-diastolic volume index

(GEDVI), a volumetric preload parameter measured by the single transpulmonary indicator dilution technique, in a mixed population of intensive care patients. We hypothesized a close mathematical connection underlying observed clinical changes.

Methods An observational study (OTKA T 046538) in the medical-surgical ICU of a teaching hospital. Hemodynamic data from 32 patients (altogether 122 datasets) were included in the analysis using the PiCCO system (Pulsion, Germany). The CI-GEDVI relationship was investigated using a regression analysis between the main components of the equation: GEDVI = CI x (MTt - DSt) (MTt = mean transit time, DSt = downslope time). To illustrate purely mathematical relationships, the theoretical correlation lines were calculated and compared with the measured data. To demonstrate the complex relationship among all three parameters, a three-dimensional (3D) presentation was applied.

Results The 3D presentation resulted in a good fit of the measured values onto the theoretical surface (r = 0.98). In the GEDVI-CI and GEDVI-(MTt-DSt) aspects of the 3D surface there were regions where the correlation was weak (r = 0.35 and 0.34). However, classifying the data according to the ranges of the third parameter, a positive linear regression was observable in each range with high correlation coefficients. In the CI-(MTt-DSt) aspect of the 3D surface the regression was better (r = 0.76), supporting the role of hydrodynamic rules, and the correlation curve crossed the theoretical iso-GEDVI lines, suggesting the effect of the Frank-Starling mechanism.

Conclusions Our study has demonstrated that there are three mechanisms working at the same time in the relationship between GEDVI and CI. The basic mathematical coupling is modified by hydrodynamic rules and the final relationship is adjusted by physiological factors. The complex analysis revealed that CI and GEDVI are related parameters even in clinical situations. The physiological influences can modify this relationship significantly, however, especially in low ranges of CI. When evaluating the GEDVI we have to consider the CI values, because the intravascular volume status depends on the relationship of these two parameters (high GEDVI with low CI - volume overload; the same GEDVI with high CI - normovolaemia).
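The coupling equation quoted in the Methods can be made concrete with explicit units; the sketch below (hypothetical transit times, an illustration of the mathematical relationship rather than the PiCCO algorithm itself) converts CI to ml/s/m2 before multiplying by the time difference.

```python
# Minimal sketch: GEDVI = CI x (MTt - DSt), with units handled explicitly.
def gedvi_ml_per_m2(ci_l_min_m2: float, mtt_s: float, dst_s: float) -> float:
    ci_ml_s_m2 = ci_l_min_m2 * 1000.0 / 60.0   # l/min/m2 -> ml/s/m2
    return ci_ml_s_m2 * (mtt_s - dst_s)

# Hypothetical values: CI 3.5 l/min/m2, mean transit time 22 s, downslope time 10 s
print(f"GEDVI = {gedvi_ml_per_m2(3.5, 22.0, 10.0):.0f} ml/m2")  # ~700 ml/m2
```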

Comparison of transpulmonary thermodilution measurements of global end-diastolic volume index, extravascular lung water index and central venous pressure with radiographic estimation of these parameters using computed tomography

W Huber, B Saugel, RM Schmid

Klinikum rechts der Isar der Technischen Universität München, Germany

Critical Care 2009, 13(Suppl 1):P222 (doi: 10.1186/cc7386)

Introduction In critical illness, optimization of the fluid status is of central importance. This study aims to investigate whether radiographic estimation of the global end-diastolic volume index (GEDI), the extravascular lung water index (ELWI) and the central venous pressure (CVP) using computed tomography (CT) is able to contribute to an early, noninvasive evaluation of fluid status.

Methods Thirty-two CT scans from 26 ICU patients were analysed. The GEDI, ELWI and CVP were estimated from CT and compared with transpulmonary thermodilution using the PiCCO performed within at most 6 hours (mean 2.25 hours) before or after CT.

Results The diagnostic accuracy of CT-based estimation of the volume status (GEDI <680, 680 to 800, >800 ml/m2) was 25%. Sensitivity and specificity at diagnosis of hypovolemia were 0% and 100%, respectively. The positive predictive value (PPV) for hypovolemia was 0%. The negative predictive value (NPV) was 74%. In prediction of hypervolemia, radiographic estimation had a sensitivity of 89% and a specificity of 62% (PPV 36%, NPV 96%). CT-estimated and PiCCO-assessed GEDI values were significantly different (P <0.005; overestimation of CT-estimated GEDI in 91%). CT-based estimation of ELWI (<7 or >7 ml/kg) had a diagnostic accuracy of 72%. The sensitivity for the prediction of elevated ELWI was 92% (specificity only 13%, PPV 76%, NPV 33%). The ELWI estimated using CT and the PiCCO-assessed ELWI were significantly different (P = 0.029; underestimation of CT-estimated ELWI in only 6%). In prediction of CVP (1 to 9 or >9 mmHg) the estimation using CT had an accuracy of only 53%. Sensitivity for prediction of hypervolemia (CVP >9 mmHg) was only 48% (specificity 80%, PPV 93%, NPV 22%).

Conclusions Estimation of hemodynamic parameters using CT is difficult. Estimation of the GEDI is not accurate, sensitive or specific for prediction of hypovolemia. For prediction of hypervolemia, radiographic estimation of the GEDI is suitable. At assessment of elevated ELWI, the radiologist overestimates ELWI values (high sensitivity of 92%, poor specificity, accuracy 72%). At estimation of CVP, the radiographic estimation is not sufficiently accurate, sensitive or specific. Values for the GEDI and ELWI assessed using CT or PiCCO are significantly different. The diagnostic sensitivity/specificity of radiographic estimation can probably be improved by interdisciplinary training.

Correlation of clinical evaluation and invasive monitoring evaluation in critically ill patients

V Wongsrichanalai, K Piyavechwiratana, W Tiyanont

Phramongkutklao Hospital, Bangkok, Thailand

Critical Care 2009, 13(Suppl 1):P223 (doi: 10.1186/cc7387)

Introduction Shock is a critical condition. The knowledge and skills of the physician can improve the outcome of these patients. We therefore studied the factors that affect physician knowledge and skill.

Methods We enrolled 12 shocked patients admitted to the medical ICU; their symptoms were evaluated by the patient-care team to define the type of shock [1,2]. Venous catheterization (central venous pressure) and arterial catheterization (A-line) were performed to obtain invasive monitoring data [3]. Clinical evaluation data and invasive monitoring data were then collected for analysis [4].

Results The 12 shock patients, seven men and five women, were classified into four groups: hypovolemic, cardiogenic, obstructive and distributive/septic shock in four, two, one and five cases, respectively. The type of shock was assessed by 38 volunteer physicians, 27 men and 11 women. All physicians were in training: 28 were in the residency program (first, second and third years: 12, eight and eight physicians, respectively) and 10 were in the fellowship training program (first and second years equally). Physicians in training identified the type of shock correctly by clinical evaluation in 65.7% of cases (residents vs. fellows 64.29% vs. 80%, P <0.05), and fellows identified the type of shock significantly better than residents (P <0.05). Male physicians identified the type of shock significantly better than female physicians (70.37% vs. 54.54%, P <0.05). In the combined analysis, clinical evaluation factors such as jugular venous pressure, capillary refill time and fine lung crepitations correlated significantly with invasive monitoring factors.

Conclusions Physician experience is important for clinical evaluation. Clinical evaluation can be used to assess shocked patients nearly as well as invasive monitoring, and it can decrease procedure complications and cost. Gender is an interesting factor that affected evaluation ability; it should be studied in larger numbers and in different populations to confirm its statistical significance.

References

1. Piyavechwiratana K: The Best Care of Shock: Diagnosis and Goal Setting, Best Practices in Critical Care. Pramongkutklao Hospital: 147-152.

2. Holmes CL, Walley KR: The evaluation and management of shock. Clin Chest Med 2003, 24:775-789.

3. Cheatham ML: Hemodynamic Calculations I. 2002 [www.surgicalcriticalcare.net/Lectures/PDF/hemodynamic calculations I.pdf]

4. Kress JP, et al.: Clinical examination reliably detects intrinsic positive end-expiratory pressure in critically ill, mechanically ventilated patients. Am J Respir Crit Care Med 1999, 159:290-294.

Influence of prone positioning on measurement of extravascular lung water and pulmonary vascular permeability indexes by transpulmonary thermodilution

SG Sakka, U Bruecken, U Gloeckner, F Wappler

Medical Center Cologne-Merheim, Cologne, Germany Critical Care 2009, 13(Suppl 1):P224 (doi: 10.1186/cc7388)

Introduction The transpulmonary thermodilution technique enables measurement of the extravascular lung water index (EVLWI), which has been described in experimental and clinical studies to be accurate when compared with the reference method (that is, the transpulmonary double indicator technique) [1,2]. In this study, the influence of prone positioning on the reliability of the measurement of the EVLWI and the pulmonary vascular permeability index (PVPI) - as a measure of capillary function - by transpulmonary thermodilution was assessed.

Methods We prospectively studied 12 consecutive critically ill patients (eight male, four female, age 20 to 64 years) receiving mechanical ventilation, who for clinical indication due to severe chest trauma or acute respiratory distress syndrome underwent modified prone positioning and extended hemodynamic monitoring by the transpulmonary thermodilution technique (PiCCO®; Pulsion Medical Systems AG). In all patients, an arterial thermistor catheter (A. femoralis) was placed and connected to a monitor (PiCCO®plus, version 7.0 nonUS). Measurements of cardiac output, intrathoracic blood volume (ITBV), EVLWI and PVPI (five central venous bolus injections, 15 ml NaCl, <8°C) were performed 10 minutes before and after turning the patients into a modified prone position (135°) from a supine position. No changes in respirator settings and fluid status were made. The statistical analysis was performed by linear regression and according to Bland-Altman.

Results The range for the EVLWI was 5.0 to 21.0 and 5.0 to 22.7 ml/kg at the two time points. Linear regression analysis revealed r = 0.95 (mean bias 0.5 ml/kg, one standard deviation 1.4 ml/kg). Furthermore, the PVPI ranged from 1.9 to 5.3 before and 1.7 to 5.8 after prone positioning. The correlation coefficient was r = 0.96 (mean bias 0.04, one standard deviation 0.35). The linear regression analysis with respect to the ITBV at the two time points showed r = 0.96.

Conclusions Our results suggest that measurement of extravascular lung water and pulmonary vascular permeability indexes by the transpulmonary thermodilution technique is not influenced by prone positioning.
References

1. Neumann P: Extravascular lung water and intrathoracic blood volume: double versus single indicator dilution technique. Intensive Care Med 1999, 25:216-219.

2. Sakka SG, et al.: Assessment of cardiac preload and extravascular lung water by single transpulmonary thermodilution. Intensive Care Med 2000, 26:180-187.

Cardiac output by thermodilution and the MostCare system in patients on intra-aortic balloon pump

A Faltoni1, F Franchi1, E Bigio1, M Romano2, P Giomarelli1, B Biagioli1, S Scolletta1

1University of Siena, Italy; 2University of Florence, Firenze, Italy Critical Care 2009, 13(Suppl 1):P225 (doi: 10.1186/cc7389)

Introduction The intra-aortic balloon pump (IABP) is a device used to help the heart pump. During IABP assistance, significant changes in the arterial pressure waveform occur. The so-called pulse contour methods (PCMs) use proper algorithms to analyse the arterial pressure waveform and obtain the cardiac output (CO) [1]. As a consequence, changes in pressure waveform in patients on IABP may affect the reliability of different PCMs. The aim of this study was to investigate the reliability of a new PCM, the MostCare powered by the pressure recording analytical method (PRAM) (Vytech Health, Laboratoires Pharmaceutiques Vygon, Ecouen, France), by comparing its CO values (PRAM-CO) with those determined by bolus thermodilution (ThD-CO) during aortic counterpulsation.

Methods Eight patients requiring hemodynamic support with an IABP after coronary surgery were studied. A thermodilution pulmonary artery catheter was inserted. To compute the CO from the analysis of the radial artery pressure waveform, the MostCare was connected via a cable to the monitoring system (Hewlett Packard, Andover, MA, USA). The comparisons of CO values by the two techniques were carried out at four different IABP settings: 1:1, 1:2, 1:3, and 1:4. Pearson's correlation and Bland-Altman analysis were applied. Percentages of error were also calculated.

Results A total of 64 ThD-CO and PRAM-CO measurements were evaluated. ThD-CO values ranged from 2.4 to 6.8 l/min; PRAM-CO values ranged from 2.6 to 6.2 l/min. Mean ThD-CO values ranged from 4.0 ± 0.8 to 4.3 ± 0.9 l/min. Mean PRAM-CO values ranged from 3.9 ± 0.7 to 4.2 ± 0.8 l/min. At each IABP setting: (1) correlations between the two methods were 0.90, 0.87, 0.88, and 0.86; (2) mean biases were 0.11, 0.05, 0.10, and 0.12; and (3) percentages of error were 19%, 20%, 19%, and 18%.

Conclusions Bolus thermodilution and MostCare showed good agreement at each step of the study. Although the IABP altered the arterial pressure wave profile, this did not seem to affect the capability of the MostCare to adequately estimate the CO. This new device seemed to perform suitably in patients assisted with the IABP.
Reference

1. Romano SM, Pistolesi M: Assessment of cardiac output from systemic arterial pressure in humans. Crit Care Med 2002, 30:1834-1841.

Validation of the extravascular lung water by single transpulmonary thermodilution in the clinical setting

T Tagami1, S Kushimoto2, T Masuno2, R Tosa1, K Yonezawa1, H Hirama1, Y Imazu3, K Matsuda4, Y Yamamoto2, M Kawai2, H Yokota2

1Aizu Chuo Hospital, Aizuwakamatsu, Fukushima, Japan; 2Nippon Medical School, Tokyo, Japan; 3Saiseikai Chuo Hospital, Tokyo, Japan; 4Yamanashi Chuo Hospital, Koufu, Japan Critical Care 2009, 13(Suppl 1):P226 (doi: 10.1186/cc7390)

Introduction While it is known that extravascular lung water (EVLW) estimated by transpulmonary single thermodilution correlates closely with gravimetric measurements of lungs in experimental animal models, the correlation in human beings, especially in the clinical setting, is still unclear.

Figure 1 (abstract P226). EVLW and postmortem lung weight: premortem EVLW (ml) plotted against postmortem lung weight (g); regression y = 0.586x - 98.94, R2 = 0.908.

The aim of our study was to validate EVLW measured by the PiCCO system in the clinical setting.

Methods We retrospectively analyzed the data of all 23 cadavers whose EVLW was measured using the single indicator transpulmonary thermodilution system (PiCCO; Pulsion Medical Systems, Germany) at four teaching hospitals from July 2004 to November 2008. We evaluated the relationship between EVLW, the postmortem lung weight and the amount of pleural effusion.

Results Although the amount of pleural effusion ranged between 10 ml and 1,600 ml, we found a very close correlation between premortem transpulmonary measurements of EVLW and postmortem lung weight (Figure 1; n = 23, R = 0.954, R2 = 0.91).

Conclusions Measurement of the EVLW using the PiCCO system is very closely correlated with gravimetric measurement of lung weight, independent of the amount of pleural fluid.
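The regression summarised in Figure 1 (y = 0.586x - 98.94, R2 = 0.908) is an ordinary least-squares fit of premortem EVLW against postmortem lung weight; a minimal sketch with hypothetical paired values (not the study data) is shown below.

```python
# Minimal sketch: least-squares fit of EVLW against postmortem lung weight (hypothetical data).
import numpy as np

lung_weight = np.array([800, 1200, 1600, 2100, 2600, 3100])  # g
evlw = np.array([380, 600, 850, 1120, 1420, 1700])           # ml

slope, intercept = np.polyfit(lung_weight, evlw, 1)
r = np.corrcoef(lung_weight, evlw)[0, 1]
print(f"EVLW = {slope:.3f} x lung weight + {intercept:.1f}; R2 = {r**2:.3f}")
```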

Relation between mixed venous and central venous saturation in sepsis: influence of source of sepsis

PA Van Beest1, M Koopmans2, J Van Ingen3, H Groen1, EC Boerma2, MA Kuiper2

1UMCG, Groningen, the Netherlands; 2MCL, Leeuwarden, the Netherlands; 3Martini Hospital, Groningen, the Netherlands Critical Care 2009, 13(Suppl 1):P227 (doi: 10.1186/cc7391)

Introduction There remains controversy concerning the use of central venous oxygen saturation (ScvO2) or mixed venous oxygen saturation (SvO2) as a marker for resuscitation, including their interchangeability [1,2]. We tested the hypothesis that in sepsis ScvO2 does not reliably predict SvO2, independent of the sepsis origin.

Methods We determined ScvO2 and SvO2 in a group of patients with sepsis in 6-hour intervals during the first 24 hours after acute admission.

Results Data of 29 septic patients were collected: 10 patients with abdominal sepsis and 19 patients with other sources of sepsis (nonabdominal group) (Table 1). Univariate analysis of the total population and of both groups separately did not show any parameter (cardiac output, cardiac index, dopamine, norepinephrine, arterial saturation, hemoglobin, hematocrit and lactate) affecting Δ(ScvO2 - SvO2). See Figure 1.

Table 1 (abstract P227) Baseline characteristics and delta values

                      Abdominal    Nonabdominal    P value
Age (years)           63 ± 8       66 ± 14         0.37
APACHE II score       23 ± 6       29 ± 8          0.02
ScvO2 (%)             74 ± 5       71 ± 10         0.51
SvO2 (%)              77 ± 8       69 ± 12         0.10
Lactate (mmol/l)      2.6 ± 1      2 ± 1.5         0.25
Δ(ScvO2 - SvO2)       0.5 ± 3      2.3 ± 6         0.45

Figure 1 (abstract P227)

Conclusions We conclude that in sepsis ScvO2 does not reliably predict SvO2, independent of the sepsis origin. The difference between ScvO2 and SvO2 in sepsis appears not to be a fixed one. Also, this difference seems independent of several hemodynamic parameters.
References

1. Dueck MH, Klimek M, Appenrodt S, Weigand C, Boerner U: Trends but not individual values of central venous oxygen saturation agree with mixed venous oxygen saturation during varying hemodynamic conditions. Anesthesiology 2005; 103:249-257.

2. Varpula M, Karlsson S, Ruokonen E, Pettila V: Mixed venous oxygen saturation cannot be estimated by central venous oxygen saturation in septic shock. Intensive Care Med 2006, 32:1336-1343.

Intraoperative central venous oxygen saturation and patient outcome in patients undergoing major abdominal surgery

R Rahman West, N Al-Subaie, A Addei, R Hagger, M Hamilton, M Grounds, A Rhodes

St George's Hospital, London, UK

Critical Care 2009, 13(Suppl 1):P228 (doi: 10.1186/cc7392)

Introduction Low central venous saturation (ScvO2) in the postoperative period is associated with poor outcome. We examined whether intraoperative ScvO2 is a predictor of outcome based on the postoperative morbidity survey (POMS) and length of hospital stay. The POMS is a method for describing complications associated with major surgery [1]. It comprises a nine-point survey and provides a generic measure of short-term postoperative outcome.

Methods Prospective ScvO2 values were collected from patients undergoing major abdominal surgery. This was done by intermittent sampling of blood obtained from the distal lumen of a central venous catheter placed in the jugular vein. The POMS was used to assess outcome on postoperative days 1, 3, 5, 8, 15 and 21 if applicable. To accommodate local practice, however, we excluded urinary catheter, pain and mobility from the POMS as these items were not a true reflection of organ dysfunction in our study population. We evaluated the other individual postsurgical complications outlined in the POMS to establish whether the intraoperative median, minimum and ΔScvO2 (end of surgery value minus start of surgery value) are related to outcome.

Results Fifty-two patients (average age 67.8, SD 12.0) were included. The Portsmouth Physiological and Operative Severity Score for the Enumeration of Mortality and Morbidity revealed a predicted morbidity of 54.2% and mortality of 3.3%. The commonest complications were gastrointestinal and infection (58.7% and 19.2%, respectively). We found no correlation of median, minimum or ΔScvO2 with postoperative complications or length of hospital stay. However, ΔScvO2 for patients who exhibited a postoperative complication on day 3 was significantly different from that of patients who did not (P = 0.035), but the actual difference was only 2%. Minimum ScvO2 correlated weakly with high arterial lactate measured at the end of the procedure (Spearman r = 0.32, P = 0.025).

Conclusions Intraoperative central venous saturation in patients undergoing major abdominal surgery is not related to postoperative outcome or length of hospital stay.
Reference

1. Bennett-Guerrero E, et al.: The use of a postoperative morbidity survey to evaluate patients with prolonged hospitalization after routine, moderate-risk, elective surgery. Anesth Analg 1999, 89:514-519.

Bioelectrical impedance analysis in ICU patients

L Steinhilp1, F Bubser1, S Wiesener1, C Spies1, M Boschmann2, T Schütz1, S Weber-Carstens1

1Charite, Berlin, Germany; 2Max-Delbrueck-Centre, Berlin, Germany

Critical Care 2009, 13(Suppl 1):P229 (doi: 10.1186/cc7393)

Introduction Bioelectrical impedance analysis (BIA) is a widely used method for calculating body compartments in healthy subjects and chronically ill patients. The resistance (R) - which is correlated with total body water - and the reactance (Xc) - which depends on the capacitance of cell membranes - are measured by BIA. The phase angle α, a mathematical relation between R and Xc, has proved to be a prognostic marker in several severe diseases [1]. We aimed to monitor body composition during the clinical course of ICU patients and to investigate the impact of illness severity on BIA.
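The phase angle is the only derived quantity in this abstract; it follows directly from R and Xc as sketched below (the standard BIA relationship with hypothetical 50 kHz readings, not the analyzer's firmware).

```python
# Minimal sketch: phase angle alpha from resistance R and reactance Xc at 50 kHz.
import math

def phase_angle_deg(resistance_ohm: float, reactance_ohm: float) -> float:
    """alpha = arctan(Xc / R), expressed in degrees."""
    return math.degrees(math.atan(reactance_ohm / resistance_ohm))

print(f"alpha = {phase_angle_deg(520.0, 48.0):.1f} degrees")  # hypothetical R and Xc, ~5.3 degrees
```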

Methods In this observational study we performed BIA testing at a frequency of 50 kHz unilaterally between the wrist and ankle with a body impedance analyzer. BIA of ICU patients took place at three different measuring points: M1, M2 and M3 (days 2 to 3, days 10 to 11 and days 14 to 15 after admission). Severity of illness was monitored by the Simplified Acute Physiology Score (SAPS) II. The BIA results of each patient were matched with a standard collective according to sex, age, and body mass index. Spearman's rho (rs) was calculated to analyze the correlation between α and the SAPS II. Nonparametric analysis of longitudinal data was carried out to analyze two groups differentiated by SAPS II (Group 1: <45 points, Group 2: >45 points).

Results Forty-eight patients received one measurement at M1, and 28 patients were measured at all three measuring points. BIA of our patients differed considerably from the normal collective. There were significant differences between the patients grouped by primary disease (abdominal cancer patients, nontumor patients, neurosurgical patients, multiple trauma and acute respiratory distress syndrome). α was correlated with the SAPS II on admission at all three measuring points (M1: n = 48, rs = -0.296, P <0.05; M2: n = 28, rs = -0.368, P <0.005; M3: n = 28, rs = -0.525, P <0.005). While we did not find a significant change of R in the subsequent measurements, there was a decrease of Xc (P <0.05). This was observed especially in the more seriously ill patients (Group 2) (P <0.05).

Conclusions BIA helps to estimate body composition at ICU admission. It is associated with primary disease and severity of illness. During the clinical course we observed a decrease of Xc, indicating loss of body cell mass.
Reference

1. Kyle UG, et al.: ESPEN Guidelines, bioelectrical impedance analysis - part I: review of principles and methods. Clin Nutr 2004, 23:1430-1453.

Estimated respiratory quotient and venoarterial pCO2 difference are outcome markers in patients with left ventricular dysfunction submitted to coronary artery bypass surgery

L Hajjar, T Yamaguti, F Galas, R Kalil Filho, M Piccioni, J Auler

Heart Institute, Sao Paulo, Brazil

Critical Care 2009, 13(Suppl 1):P230 (doi: 10.1186/cc7394)

Introduction Experimental and clinical research has successfully evaluated the performance of the respiratory quotient as a useful marker of anaerobic metabolism in shock from different causes. The aim of the present study was to evaluate the estimated respiratory quotient and venoarterial pCO2 difference as suitable anaerobic metabolism signs and outcome markers in patients with left ventricular dysfunction undergoing coronary artery bypass surgery.

Methods A prospective study including 87 patients with left ventricular dysfunction undergoing on-pump coronary artery bypass surgery was performed from January 2006 to January 2008. Hemodynamic and metabolic parameters were obtained at five moments: after anesthesia induction and initiation of mechanical ventilation (T0), at the end of surgery (T1), at admission to the postsurgical ICU (T2), and 6 hours (T3) and 12 hours (T4) after ICU admission. The venoarterial carbon dioxide tension difference (ΔpCO2) and the estimated respiratory quotient (eRQ) were also calculated. Postoperative outcomes were compared regarding clinical events and mortality.

Results In patients with unfavorable postoperative evolution, both the eRQ and the venoarterial ΔpCO2 were significantly higher at T4 than in patients with favorable evolution (1.94 ± 0.9 vs. 1.43 ± 0.65, P <0.05; 8.07 ± 3.24 vs. 5.66 ± 2.78, P <0.05). Patients with unfavorable evolution also had significantly higher arterial lactate concentrations 6 hours after ICU admission (4.30 ± 2.47 vs. 2.72 ± 1.48, P <0.05). Stepwise logistic regression showed that higher age, a higher estimated respiratory quotient at T4, and higher lactate at T3 were all independently associated with unfavorable postoperative evolution (OR = 1.12; OR = 3.45; OR = 1.46).

Conclusions In a population of patients with left ventricular dysfunction submitted to coronary artery bypass surgery, age, postoperative arterial lactate, and the eRQ and venoarterial ΔpCO2 12 hours after ICU admission are independent predictors of unfavorable outcome.
References

1. Van der Linden P, et al.: Anesth Analg 1995, 80:269-275.

2. Zhang H, et al.: Am Rev Respir Dis 1993, 148:867-871.

Clinical value of noninvasive pulmonary artery systolic pressure estimates in our patients with pulmonary artery hypertension

T Subic, I Drinovec, F Sifrer

University Clinic of Respiratory and Allergic Diseases, Golnik, Slovenia

Critical Care 2009, 13(Suppl 1):P231 (doi: 10.1186/cc7395)

Introduction Invasive and noninvasive measurements of pulmonary artery pressures are used to diagnose and follow up patients with pulmonary artery hypertension (PAH) [1-3]. In our hospital all patients with PAH are diagnosed and first treated in the ICU. Besides right heart catheterization (RHC), a noninvasive estimate of pulmonary artery systolic pressure (PASP) with transthoracic echocardiography (TTE) is performed. As RHC is an invasive procedure, it is not suitable for repeated measurements. We compared invasively and noninvasively measured PASP values to examine their correlation, so that TTE could be used for ambulatory follow-up and assessment of treatment success in our patients. Methods In a retrospective analysis we examined the correlation between pulmonary artery pressures estimated by TTE versus RHC in our patients diagnosed with PAH from 2003 to 2007. The data were collected from 36 patients with PAH of different

cause (chronic pulmonary thromboembolism, 12 patients; connective tissue disease, 10 patients; idiopathic pulmonary artery hypertension, eight patients; other, six patients). Bivariate correlations were used for statistical analysis. Results The median time interval between methods was 2 (1 to 30) days. The median PASP by RHC was 74.8 (30 to 122) mmHg and by TTE was 62.3 (41 to 110) mmHg (r = 0.65, P <0.0001). In 29 (80.6%) patients the difference between the two methods was >20 mmHg, and in 10 (27.8%) patients it was <10 mmHg. We did not find a difference between the two methods regarding the cause of PAH.

Conclusions In our group of patients we found a weak correlation between invasively and noninvasively estimated PASP. However, there was a trend towards good correlation in a small proportion of our patients. We conclude that when the first measurements by TTE and RHC correlate well, TTE can be used for future evaluation of disease progress and treatment response. When the measurements of the two methods differ, potential reversible causes (different examiner, different conditions, too wide a time interval) have to be excluded and serial invasive measurements may be needed. References

1. Chemla D, et al.: Chest 2004, 126:1313-1317.

2. Friedberg MK, et al.: J Am Soc Echocardiogr 2006, 19:559-562.

3. Syyed R, et al.: Chest 2008, 133:633-639.

Goal-directed fluid therapy based on the continuous left ventricle end-diastolic volume improves acute mesenteric ischemia/reperfusion injury in rats

Y Villiger, J Hoda-Jourdan, M Licker, D Morel

University Geneva Hospital, Geneva, Switzerland

Critical Care 2009, 13(Suppl 1):P232 (doi: 10.1186/cc7396)

Introduction Fluid infusion is an essential part of proper medical and surgical management. The target hemodynamic parameters for goal-directed fluid therapy remain controversial, with concerns regarding the risk of overhydration. We used a splanchnic ischemia-reperfusion (I/R) model with a high mortality rate to test whether fluid infusion aimed to target constant left ventricular end-diastolic volume (LVEDV) would ameliorate physiologic and biologic parameters in this situation.

Figure 1 (abstract P232)

Methods Two groups of adult rats were subjected to 90 minutes of mesenteric ischemia followed by 150 minutes of reperfusion, with one group (I/R+Vol) receiving fluids (one-half glucose 5% and one-half Isohes®) to maintain the LVEDV at baseline levels during reperfusion, and the other group (I/R) receiving no extra fluids. A sham group (instrumented without I/R) served as controls. Results In spite of a persisting acidosis, the LVEDV-directed fluid therapy was able to prevent the rapid fatal outcome, without evidence of overhydration (Figure 1), as demonstrated by the absence of an increased lung wet/dry weight ratio. Conclusions Early LVEDV-directed fluid therapy markedly improves the outcome from mesenteric I/R injury.

Utility of brain natriuretic peptide as a marker of early cardiac risk in patients undergoing elective abdominal aortic aneurysm repair

L Vetrugno, F Bassi, L Cereatti, S Tomasino, E Di Luca, F Giordano

University Hospital, Udine, Italy

Critical Care 2009, 13(Suppl 1):P233 (doi: 10.1186/cc7397)

Introduction Patients undergoing elective abdominal aortic aneurysm repair are at increased risk of perioperative mortality and morbidity. The majority of deaths in this setting are related to cardiac complications. Brain natriuretic peptide (BNP) has recently emerged as a predictive marker in this context [1]. The current study tests this hypothesis.

Methods We studied 34 patients. After a preoperative interview by a senior anesthetist who assessed cardiovascular status following the recommendations of the American College of Cardiology/American Heart Association, BNP levels were obtained just before surgery and were blinded to the anesthetist. Myocardial injury was defined as a cardiac troponin concentration >1.5 ng/ml. Statistical analysis was performed using Student's t test.

Results We found cardiac troponin >1.5 ng/ml, indicating myocardial injury, in five patients. The preoperative BNP value was 379 (SD ± 282) pg/ml in patients with myocardial injury and 88.7 (SD ± 61) pg/ml in those without events. The two-tailed P value was 0.08.

Conclusions Preoperative BNP levels are higher in patients who develop myocardial injury; however, the difference did not reach statistical significance. This could be explained by intraoperative surgical complications (technical difficulties, blood loss, acute hypotension, longer cross-clamp time) affecting the outcome of patients with low BNP.

Reference

1. Cuthbertson BH, et al.: Utility of B-type natriuretic peptide in predicting perioperative cardiac events in patients undergoing major non-cardiac surgery. Br J Anaesth 2007, 99:170-176.

Accuracy of central venous oxygen saturation with a fiberoptic catheter

D Chiumello, V Berto, C Mietto, M Botticelli, M Chierichetti, F Tallarini

Fondazione IRCCS, Ospedale Maggiore Policlinico, Milan, Italy Critical Care 2009, 13(Suppl 1):P234 (doi: 10.1186/cc7398)

Introduction Central venous oxygen saturation (ScvO2) can reflect the overall balance between systemic oxygen delivery and consumption. Several recent studies have reported the importance of ScvO2 monitoring in critically ill patients. Recently, ScvO2 monitoring with fiberoptic catheters has become available. The aim of this study was to evaluate the correlation between the ScvO2 values obtained by a fiberoptic catheter (CeVOX; Seda, Milan, Italy) and those measured with a CO-oximeter (GEM 4000; Instrumentation Laboratory, Milan, Italy).

Methods After in vivo calibration of the fiberoptic catheter, blood samples were collected at intervals of no more than 12 hours. Twenty-nine critically ill patients with different aetiologies (septic shock, acute lung injury/acute respiratory distress syndrome, pneumonia, pancreatitis, trauma), 21 male, with a mean age of 59.8 ± 17.7 years, mean weight 78.6 ± 10.0 kg and mean Simplified Acute Physiology Score of 39 ± 14, were enrolled. Results One hundred and nineteen samples were collected. The ScvO2 measured with the fiberoptic catheter showed a weak correlation (r2 = 0.46) with the CO-oximeter. The mean bias (average difference between catheter readings and CO-oximeter values), precision (standard deviation of the bias) and limits of agreement (bias ± 2 SD of bias) were -0.17, 4.7 and -0.17 ± 9.4, respectively (Figure 1).
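As a worked illustration of how the bias, precision and limits of agreement quoted above are obtained from paired readings, here is a minimal Bland-Altman sketch; the paired values are hypothetical, not the study data:

import numpy as np

def bland_altman(method_a: np.ndarray, method_b: np.ndarray):
    """Bias (mean difference), precision (SD of differences) and limits of
    agreement (bias +/- 2 SD), the agreement statistics reported in the abstract."""
    diff = method_a - method_b
    bias = diff.mean()
    precision = diff.std(ddof=1)
    return bias, precision, (bias - 2 * precision, bias + 2 * precision)

# Hypothetical paired ScvO2 readings (%), fiberoptic catheter vs. CO-oximeter
catheter = np.array([72.0, 68.5, 75.0, 70.2, 66.8])
co_oximeter = np.array([73.1, 67.0, 74.2, 71.5, 68.0])
print(bland_altman(catheter, co_oximeter))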

Conclusions Considering the absolute values, the fiberoptic catheter showed poor accuracy after 12 hours of use without any further calibration. To improve clinical management we suggest performing in vivo calibration more frequently.

Figure 1 (abstract P234)

Plot of the mean ScvO2 values versus their difference by the Bland-Altman method.

Venous oxygen saturation and lactate gradient from the superior vena cava to the pulmonary artery in ICU patients with septic shock

P Kopterides1, I Mavrou1, E Kostadima2, E Zakynthinos2, M Lignos1, G Kontopithari1, E Papadomichelakis1, M Theodorakopoulou1, I Tsangaris1, G Dimopoulos1, I Dimopoulou1, S Orfanos1, S Bonovas3, A Armaganidis1

1 'Attiko' University Hospital, Athens, Greece; 2University Hospital of Larissa, Greece; 3Center for Diseases Control and Prevention, Athens, Greece

Critical Care 2009, 13(Suppl 1):P235 (doi: 10.1186/cc7399)

Introduction Central venous oxygen saturation (ScvO2) is considered comparable with mixed venous oxygen saturation (SvO2) in the initial resuscitation phase of septic shock [1]. Our aim was to assess their agreement in septic shock in the ICU setting and the effect of a potential difference in a computed parameter, namely oxygen consumption. In addition, we sought for a central venous to pulmonary artery (PA) lactate gradient. Methods We enrolled 37 patients with septic shock who were receiving noradrenaline infusions and whose attending physicians had placed a PA catheter for fluid management. Blood samples were drawn in succession from the superior vena cava (CV), right atrium (RA), right ventricle and PA. Hemodynamic and treatment parameters were monitored and data were compared by correlation and Bland-Altman analysis.

Results SvO2 was lower than ScvO2 (70.2 ± 11.4% vs. 78.6 ± 10.2%; P <0.001), with a bias of -8.45% and 95% limits of agreement ranging from -20.23 to 3.33%. This difference correlated significantly with the noradrenaline infusion rate and with the oxygen consumption and extraction ratio. Because of these lower SvO2 values, oxygen consumption computed from the oxygen saturation of pulmonary artery blood was higher than that computed from the oxygen saturation of central venous blood (P <0.001), with a bias of 104.97 ml/min and 95% limits of agreement from -4.12 to 214.07 ml/min. Finally, the lactate concentration was higher in the CV and RA than in the PA (2.42 ± 3.15 and 2.35 ± 3.16 vs. 2.17 ± 3.19 mmol/l, P <0.01 for both comparisons).
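The computed oxygen consumption discussed here follows the reverse Fick principle; a minimal sketch (dissolved oxygen omitted, all values hypothetical) shows why substituting the higher ScvO2 for SvO2 yields a lower computed oxygen consumption:

def o2_content(hb_g_dl: float, sat_frac: float) -> float:
    """Oxygen content (ml O2/dl blood); dissolved fraction omitted for brevity."""
    return 1.34 * hb_g_dl * sat_frac

def fick_vo2(cardiac_output_l_min: float, hb: float, sao2: float, venous_sat: float) -> float:
    """Oxygen consumption (ml/min) by the reverse Fick principle; passing SvO2 or
    ScvO2 as venous_sat reproduces the two computed values compared in the abstract."""
    ca_vo2 = o2_content(hb, sao2) - o2_content(hb, venous_sat)  # ml O2/dl
    return cardiac_output_l_min * ca_vo2 * 10                   # dl -> litre factor

# Hypothetical patient: cardiac output 6 l/min, Hb 10 g/dl, SaO2 97%
print(round(fick_vo2(6.0, 10.0, 0.97, 0.702), 0))  # using SvO2
print(round(fick_vo2(6.0, 10.0, 0.97, 0.786), 0))  # using ScvO2 -> lower computed VO2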

Conclusions Our data suggest that ScvO2 and SvO2 are not equivalent in ICU patients with septic shock. Additionally, the substitution of ScvO2 for SvO2 in the calculation of oxygen consumption produces unacceptably large errors. Finally, the decrease in lactate between RA and PA may support the hypothesis that the mixing of RA and coronary sinus blood is at least partially responsible for the difference between ScvO2 and SvO2. Reference

1. Marx G, Reinhart K: Venous oximetry. Curr Opin Crit Care 2006, 12:263-268.

Use of tissue oxygenation saturation in association with skin temperature as an indicator of the peripheral tissue perfusion in critically ill patients

A Lima, C Ince, J Bakker

Erasmus MC University Medical Centre Rotterdam, the Netherlands

Critical Care 2009, 13(Suppl 1):P236 (doi: 10.1186/cc7400)

Introduction Studies have suggested that tissue oxygenation saturation (StO2) values are insensitive in assessing peripheral perfusion. StO2 measurements may be more correctly interpreted if measured in association with the forearm-to-fingertip skin-temperature gradient (Tskin-diff). A Tskin-diff threshold of 0°C has been shown to reflect vasoconstriction. We aimed to propose a different approach to the interpretation of StO2 by adding Tskin-diff monitoring and to characterize the pattern of StO2 dynamics in patients with peripheral vasoconstriction and vasodilation. We hypothesized that monitoring StO2 with Tskin-diff can more adequately predict ICU complications than StO2 itself.

Methods StO2 was continuously monitored on the thenar with an InSpectra Model 325 probe (Hutchinson Technology Inc., Hutchinson, MN, USA). The Tskin-diff was obtained from two skin probes (Hewlett Packard 21078A; Philips Medical Systems, Eindhoven, the Netherlands) attached to the index finger and to the radial side of the forearm. To describe the effect of variations in skin temperature on StO2, we compared StO2 in survivors and nonsurvivors stratified by the condition of the peripheral circulation (vasoconstriction, Tskin-diff >0; vasodilation, Tskin-diff <0). The first measurement was registered within 24 hours and then every 24 hours until day 3. Differences between group means were tested by the Mann-Whitney U test. P <0.05 was considered statistically significant.

Results We prospectively studied 41 consecutive critically ill patients (survivors = 29; nonsurvivors = 12): age: 49 ± 16 years; 20 septic shock, 14 nonseptic shock, seven other. No differences in StO2 were seen between survivors and nonsurvivors (day 1: 73 ± 9 vs. 78 ± 10; day 2: 74 ± 11 vs. 75 ± 11; day 3: 76 ± 10 vs. 77 ± 9). In survivors, StO2 values were significantly lower in peripheral vasoconstriction than in vasodilation (day 1: 69 ± 8 vs. 76 ± 8; day 2: 68 ± 13 vs. 78 ± 7; day 3: 71 ± 10 vs. 80 ± 9; P <0.05). In nonsurvivors, this association was seen only on day 1 (71 ± 8 vs. 86 ± 4; P <0.05). Compared with survivors on day 3, nonsurvivors had lower StO2 values in peripheral vasodilation (69 ± 6 vs. 80 ± 9, P = 0.02) and higher StO2 values in peripheral vasoconstriction (83 ± 7 vs. 71 ± 10, P = 0.02). Conclusions Dissociation between StO2 and skin temperature was seen more often in nonsurvivors. StO2 measured in association with skin temperature can more adequately predict ICU death than StO2 itself.

Effects of peripheral vasodilation induced by regional anaesthesia blocks on resting tissue oxygenation values

A Lima, E Galvin, J Van Bommel, J Bakker

Erasmus MC University Medical Centre Rotterdam, the Netherlands

Critical Care 2009, 13(Suppl 1):P237 (doi: 10.1186/cc7401)

Introduction Near-infrared spectroscopy is a technique for continuous, noninvasive, bedside monitoring of tissue oxygenation (StO2). The nature of the relationship between the kinetics of StO2 and changes in peripheral circulation has not been investigated. After successful regional anaesthesia blocks, local vasodilation and increased local blood flow occur as a result of blockade of sympathetic nerve fibers. We therefore studied the effects of peripheral vasodilation induced by regional anaesthesia blocks on resting StO2 values.

Methods We recruited healthy adult patients (n = 8) scheduled for elective upper limb surgery under axillary sympathetic blocks. StO2 was continuously monitored over the thenar of the blocked arm using an InSpectra Model 325 probe (Hutchinson Technology Inc., Hutchinson, MN, USA) from the beginning of the local anesthetic injection (T0) until 30 minutes had elapsed (T30). The contralateral arm was used as a control. Differences between

Figure 1 (abstract P237)

Effect of peripheral blood flow variations in StO2 during regional block.

group means were tested by the Wilcoxon signed-rank test. P <0.05 was considered statistically significant.

Results StO2 values in the blocked arm were significantly higher in all patients after the anaesthesia (Figure 1). T30 versus T0, 94 ± 2 versus 82 ± 3; P = 0.002. StO2 did not increase in the control arm.

Conclusions Peripheral vasodilation increases StO2 in normal conditions.

Effect of anemia on tissue oxygenation saturation and the tissue deoxygenation rate during ischemia

M Meznar, R Pareznik, G Voga

SB Celje, Slovenia

Critical Care 2009, 13(Suppl 1):P238 (doi: 10.1186/cc7402)

Introduction The hypothesis was that anemia, independently of hemodynamic stability, affects tissue oxygenation saturation (StO2) and the deoxygenation rate during stagnant ischemia. The blood hemoglobin concentration is a determinant of oxygen delivery. In anemic patients, oxygen delivery decreases and oxygen extraction is increased. This leads to decreased venous hemoglobin saturation and a lower tissue oxygen saturation. The rate of tissue deoxygenation during ischemia is dependent on oxygen consumption and on the amount of oxygen available in the tissue [1].

Methods In a prospective observational study we included 340 patients in the medical emergency room. On admission, StO2 and the tissue deoxygenation rate during ischemia were measured by near-infrared spectroscopy. Patients were divided into four groups according to hemoglobin concentration and hemodynamic (HD) stability: Group 1 (nonanemic, HD-stable patients), Group 2 (anemic, HD-stable patients), Group 3 (nonanemic, HD-unstable patients), Group 4 (anemic, HD-unstable patients). Differences in StO2 and the rate of tissue deoxygenation were analyzed.

Results Anemic groups had a significantly lower hemoglobin concentration than nonanemic groups (76.5 ± 15 vs. 138 ± 16 g/l, P <0.001). HD-unstable groups had significantly higher lactate levels than HD-stable groups (3.9 ± 1.6 vs. 1.6 ± 0.6 mmol/l, P <0.001). The results are presented in Table 1. StO2 in Groups 2 and 3 was comparable, while the difference in the deoxygenation rate between these groups was significant (P = 0.007).

Conclusions Anemia significantly affects StO2 and the deoxygenation rate. It probably contributes to lowering StO2 to the same extent as HD instability. In contrast to HD instability, anemia causes more rapid deoxygenation during ischemia. Reference

1. Creteur J: Muscle StO2 in critically ill patients. Curr Opin Crit Care 2008, 14:361-366.

Near-infrared spectroscopy monitoring tissue oxygen saturation after cardiac surgery

R Kopp, S Rex, K Dommann, G Schälte, G Dohmen, G Marx, R Rossaint

University Hospital Aachen, Germany

Critical Care 2009, 13(Suppl 1):P239 (doi: 10.1186/cc7403)

Introduction The aim of this study was to compare near-infrared spectroscopy with global parameters of tissue oxygenation after cardiac surgery, such as cardiac output (CI), mixed venous oxygen saturation (SvO2) or lactate concentration. After cardiac surgery, the circulating blood volume and cardiac function are regularly reduced in the ICU [1]. This results in reduced microperfusion and peripheral vasoconstriction. The noninvasive InSpectra StO2 monitor (Hutchinson Technology Inc., Hutchinson, MN, USA) measures the oxygen saturation in the microcirculation of the thenar muscle [2].

Methods Forty patients after cardiac surgery were monitored in the ICU with a StO2 monitor and a Swan-Ganz catheter measuring CI and SvO2. Additionally, intermittent lactate and blood gas analyses were performed. ANOVA was used for statistical analysis. Results The mean EuroSCORE of the patients was 6.5 ± 3.7; the operations comprised 24 aortocoronary bypass, five heart valve, three ascending aorta and eight combined cardiac procedures. After admission

Figure 1 (abstract P239)

Course of StO2, cardiac output, SvO2 and lactate concentration (* P <0.05 versus after induction of anaesthesia).

Table 1 (abstract P238)

Group 1 (n = 251): StO2 80.5 ± 7.8%, deoxygenation rate 16.5 ± 6.7%/min
Group 2 (n = 30): StO2 76 ± 7.7%, deoxygenation rate 19.6 ± 6.6%/min (P = 0.003 and P = 0.017 vs. Group 1)
Group 3 (n = 48): StO2 76.5 ± 11.9%, deoxygenation rate 14.8 ± 7.4%/min
Group 4 (n = 10): StO2 68.2 ± 9.6%, deoxygenation rate 16.7 ± 5.7%/min (P = 0.04 and NS vs. Group 3)

StO2, CI and SvO2 were significantly reduced (Figure 1). The lactate concentration and noradrenaline dose were increased (P <0.05). On day 1 the noradrenaline dose dropped after fluid substitution, but only the StO2 value recovered. Conclusions After heart surgery StO2 indicated the reduction as well as the recovery of microcirculation early, whereas the lactate concentration and SvO2 seemed to demonstrate a delayed response especially of recovery. Management of postoperative fluid and catecholamine therapy by StO2 to optimize microcirculation should be the subject of further studies. References

1. Tschaikowsky K, et al.: Changes in circulating blood volume after cardiac surgery measured by a novel method using hydroxyethyl starch. Crit Care Med 2000, 28:336-334.

2. Myers DE, et al.: Noninvasive method for measuring local hemoglobin oxygen saturation in tissue using wide gap second derivative near-infrared spectroscopy. J Biomed Opt 2005, 10:034017.


Comparison of muscle tissue oxygenation response curves to two time-based vascular occlusion tests: evidence of diminishing returns?

SJ Thomson, N Al-Subaie, M Hamilton, ML Cowan, S Musa, M Grounds, TM Rahman

St George's Hospital NHS Trust, London, UK

Critical Care 2009, 13(Suppl 1):P240 (doi: 10.1186/cc7404)

Introduction Dynamic testing of muscle tissue oxygenation (StO2) with near-infrared spectroscopy and a vascular occlusion test (VOT) has been used to study pathophysiological states, but there is a paucity of data for standardised techniques in normal subjects. A 3-minute VOT is frequently described. We have collected StO2 data for this technique and compared them with a shorter 2-minute test.

Methods Twenty-five subjects were studied using an InSpectra 650 monitor and a 15 mm probe. The VOT was applied at systolic blood pressure + 50 mmHg on opposite arms for 3-minute and 2-minute time periods. StO2 data are reported for baseline, downslope (DS) and upslope (US) %/minute, overshoot %, area over the ischaemic curve (AOC Isch) and area under the recovery curve (AUC Rec) %/minute, and recovery time (Rec Time) in minutes. Differences were analysed using paired t tests. Individual differences were then examined in relation to a predicted magnitude of response based on the ischaemic load and occlusion time.

Results Three-minute VOT versus 2 minutes: AOC Isch (-52.99 vs. -25.79%/min, P <0.001), overshoot (14.96 vs. 13.2%, P = 0.004), AUC Rec (18.89 vs. 15.33%/min, P = 0.01), Rec Time (2.37 vs. 2.05 min, P = 0.004). DS (-11.18 vs. -12.35%/min, P = 0.08) and US (212.99 vs. 201.24%/min, P = 0.365) were not significantly different. The 2-minute ischaemic load was 0.5 times that of 3 minutes (AOC Isch -25.79/-52.99). The ratios of response parameters were compared with this value using one-sample t tests and were significantly greater: overshoot (0.89 ± 0.16, P <0.001), AUC Rec (0.86 ± 0.35, P <0.001), Rec Time (0.88 ± 0.21, P <0.001).

Conclusions StO2 VOT is a promising tool that is likely to advance our knowledge of microvascular function under ischaemic stress. To standardise data interpretation it is imperative that similar techniques are used. We have reported data for two reproducible VOTs in a healthy control group. Understandably, there are differences in hyperaemic response parameters that are dependent on the ischaemic time; however, the relationship is not linear. Our data suggest that although the 2-minute VOT represents a halving of the ischaemic load versus 3 minutes, subjects demonstrate almost 90% comparable hyperaemic responses. This suggests a diminishing return in hyperaemia with longer occlusive time. Further work is required to evaluate the efficacy and practicality of different VOTs.
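For readers unfamiliar with these trace-derived quantities, the following sketch illustrates one plausible way of extracting the named VOT parameters from a sampled StO2 trace; the exact algorithms used with the InSpectra device are not described in the abstract, so the specific choices here (linear fit for the downslope, baseline-referenced areas) are assumptions for illustration only:

import numpy as np

def vot_parameters(t_min: np.ndarray, sto2: np.ndarray, occl_start: float, occl_end: float):
    """Illustrative extraction of VOT-derived StO2 parameters from a sampled trace."""
    baseline = sto2[t_min < occl_start].mean()
    isch = (t_min >= occl_start) & (t_min <= occl_end)
    rec = t_min > occl_end

    ds = np.polyfit(t_min[isch], sto2[isch], 1)[0]                 # downslope, %/min
    aoc_isch = np.trapz(sto2[isch] - baseline, t_min[isch])        # area over ischaemic curve
    us = np.max(np.diff(sto2[rec]) / np.diff(t_min[rec]))          # steepest upslope, %/min
    overshoot = sto2[rec].max() - baseline                         # hyperaemic overshoot, %
    auc_rec = np.trapz(np.clip(sto2[rec] - baseline, 0, None), t_min[rec])
    back = t_min[rec][sto2[rec] >= baseline]                       # recovery time to baseline
    rec_time = back[0] - occl_end if back.size else float("nan")
    return dict(baseline=baseline, DS=ds, US=us, AOC_Isch=aoc_isch,
                overshoot=overshoot, AUC_Rec=auc_rec, Rec_Time=rec_time)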

Adaptability of muscle tissue oxygenation to repeated vascular occlusion

SJ Thomson, N Al-Subaie, M Hamilton, ML Cowan, S Musa, RM Grounds, TM Rahman

St George's Hospital NHS Trust, London, UK

Critical Care 2009, 13(Suppl 1):P241 (doi: 10.1186/cc7405)

Introduction Dynamic testing of muscle tissue oxygenation (StO2) with near-infrared spectroscopy and a vascular occlusion test (VOT) is a developing area of research in critical care. The VOT is often performed multiple times during analysis. We report the impact of repeated VOT on StO2 parameters.

Methods Twenty-five healthy subjects were studied with the InSpectra 650 monitor and a 15 mm probe. VOT was set at systolic blood pressure + 50 mmHg for 3 minutes in triplicate. Each VOT was separated by a 5-minute rest period. StO2 data are reported for baseline%, downslope (DS) and upslope (US)%/min, overshoot%, area over the ischaemic curve (AOC Isch) and area under the recovery curve (AUC Rec)%/min and recovery time (Rec Time) in minutes. Data were compared by ANOVA. Results See Table 1.

Conclusions It is important to understand the impact of serial VOT. We have observed sequential change in rates of StO2 decline during ischaemia (reduced DS gradients, total drop and AOC Isch). Although the overshoot was unchanged, the AUC Rec and Rec time demonstrated increases approaching significance. These patterns imply a preconditioned adaptation to repeated VOT and an exaggerated hyperaemic response. Further studies are required to determine acceptable rest periods between VOTs to ameliorate these changes.

Table 1 (abstract P241)

Population StO2 means (±SD) for consecutive VOT and significance

StO2 curve parameter: Baseline (%) | Total drop (%) | DS (%/min) | US (%/min) | AOC Isch (%/min) | Overshoot (%) | AUC Rec (%/min) | Rec Time (min)

VOT 1 78.7 (±4.2) 33.4 (±7) -11.2 (±2.5) 213 (±55.8) -53 (±11.5) 15 (±3.8) 18.9 (±6.6) 2.4 (±0.4)

VOT 2 78.3 (±4.2) 29.8 (±6.2) -10 (±2.3) 240 (±58.8) -48.2 (±10.7) 14.9 (±3.7) 21.7 (±7.5) 2.6 (±0.4)

VOT 3 77.9 (±4) 28.2 (±5.5) -9.4 (±1.8) 220 (±45.3) -44.5 (±8.7) 15.3 (±3.5) 23.4 (±7.3) 2.6 (±0.5)

P value 0.79 0.01 0.02 0.19 0.02 0.92 0.09 0.09

Microcirculatory, leucocyte/endothelium interaction and survival time effects of dobutamine in nonhypotensive endotoxemia

A Santos, E Furtado, N Villela, E Bouskela

State University of Rio de Janeiro, Brazil

Critical Care 2009, 13(Suppl 1):P242 (doi: 10.1186/cc7406)

Introduction The microcirculatory, leucocyte/endothelium interaction and survival time effects of dobutamine at a dose of 5 μg/kg/minute, with or without volume resuscitation, were studied in the hamster window chamber model during resuscitation from nonhypotensive endotoxemia [1,2].

Methods Awake hamsters were submitted to endotoxemia with intravenous injection of lipopolysaccharide (LPS) at a dose of 2 mg/kg. Three hours after LPS injection they were divided into four groups: LPS (n = 6): received no treatment; VR (n = 6): resuscitated with 40 ml/kg body weight of NaCl 0.9% in 1 hour followed by 20 ml/kg body weight during the remaining 4 hours; Dobuta (n = 6): received dobutamine infusion at 5 μg/kg/minute for 3 hours; Dobuta/VR (n = 6): resuscitated with 40 ml/kg body weight of NaCl 0.9% in the first hour followed by 20 ml/kg body weight during the remaining 2 hours, the latter combined with dobutamine at 5 μg/kg/minute. The groups were compared with Control (n = 7): no treatment. Arteriolar and venular diameters, functional capillary density (FCD), leukocyte rolling and adhesion, and 72-hour survival time were evaluated. Results Dobuta had a lower arteriolar diameter than Control (51 ± 10 and 114 ± 10% from baseline). LPS and Dobuta had lower FCD than Control and baseline values (18 ± 15; 16 ± 18; and 88 ± 6% from baseline, in LPS, Dobuta and Control, respectively). VR and VR/Dobuta restored FCD from baseline (382 ± 19 and 476 ± 30% from baseline in VR and VR/Dobuta, respectively). FCD in VR and VR/Dobuta was lower than Control. LPS and Dobuta had higher leucocyte adhesion than Control (42.2 ± 10; 32.2 ± 31; and 4.0 ± 7.1 leucocytes/mm2 in LPS, Dobuta and Control groups, respectively). There was no significant difference in survival time between VR and Control, and VR/Dobuta and Control. Survival time was significantly lower in LPS and Dobuta than Control. Conclusions Dobutamine, with or without volume resuscitation, did not improve microcirculatory parameters, leucocyte adhesion or survival time during resuscitation from nonhypotensive endotoxemia, while volume resuscitation restored microcirculatory parameters and improved survival time. References

1. Secchi A, et al.: Dobutamine maintains intestinal villus blood flow during normotensive endotoxemia: an intravital microscopic study in the rat. J Crit Care 1997, 12:137-141.

2. Hiltebrand LB, et al.: Effects of dopamine, dobutamine, and dopexamine on microcirculatory blood flow in the gastrointestinal tract during sepsis and anesthesia. Anesthesiology 2004, 100:1188-1197.

Microcirculatory, leukocyte/endothelium interaction and survival time effects of recombinant C-reactive protein in nonhypotensive endotoxemia in hamsters

M Silva, A Santos, E Furtado, N Villela, E Bouskela

State University of Rio de Janeiro, Brazil

Critical Care 2009, 13(Suppl 1):P243 (doi: 10.1186/cc7407)

Introduction The microcirculatory, leukocyte/endothelium interaction and 7-day survival time effects of human recombinant C-reactive protein (rCRP), with or without volume resuscitation, were studied in the hamster window chamber model during resuscitation from nonhypotensive endotoxemia [1,2].

Methods Awake hamsters subjected to endotoxemia with intravenous injection of lipopolysaccharide (LPS) (2 mg/kg) were divided, 1 hour after LPS injection, into four groups: LPS (n = 6): no treatment; VR (n = 6): received 40 ml/kg body weight of NaCl 0.9% in 1 hour followed by 20 ml/kg body weight during the remaining 4 hours; rCRP (n = 6): received rCRP infusion at 24 μg/kg/hour for 5 hours; VR/rCRP (n = 6): received 40 ml/kg body weight of NaCl 0.9% in the first hour followed by 20 ml/kg body weight during the remaining 4 hours, combined in the last period with rCRP infusion at 24 μg/kg/hour. Groups were compared with Control (n = 6): no LPS. Arteriolar and venular diameters, functional capillary density (FCD), venular leukocyte rolling and adhesion, and 7-day survival time were evaluated.

Results LPS reduced FCD, which remained lower than Control and baseline even after 24 hours (13 ± 11 and 100 ± 7% from baseline in LPS and Control, respectively). Volume resuscitation and rCRP restored FCD to baseline levels but it was lower than Control after 24 hours (36 ± 25 and 32 ± 20% from baseline in VR and rCRP, respectively). VR/rCRP restored FCD, and it was not different from Control (72 ± 26% from baseline in VR/rCRP). Arteriolar and venular diameters were not different among groups. LPS increased the number of leukocytes sticking to the venular wall (41.2 ± 13 and 1.1 ± 1 leucocytes/mm2 in LPS and Control, respectively). rCRP and VR/rCRP significantly reduced venular leukocyte adhesion (12.6 ± 7.1 and 11.8 ± 9.5 leucocytes/mm2 in rCRP and VR/rCRP, respectively). There was no difference in rolling leukocytes among groups. The survival curves of rCRP, VR and VR/rCRP were not significantly different from Control.

Conclusions rCRP, with or without volume resuscitation, improved tissue perfusion, reduced the number of sticking leukocytes and increased the survival time during endotoxemia in the hamster model. References

1. Joyce DE, et al.: Leukocyte and endothelial cell interactions in sepsis: relevance of the protein C pathway. Crit Care Med 2004, 32(5 Suppl):S280-S286.

2. Iba T, et al.: Activated protein C improves the visceral microcirculation by attenuating the leukocyte-endothelial interaction in a rat lipopolysaccharide model. Crit Care Med 2005, 33:368-372.

Assessment of tissue oxygen saturation during a vascular occlusion test using near-infrared spectroscopy: role of the probe spacing and measurement site studied in healthy volunteers

R Bezemer1, A Lima2, E Klijn2, J Bakker2, C Ince1

1Academic Medical Center, Amsterdam, the Netherlands; 2Erasmus MC, Rotterdam, the Netherlands

Critical Care 2009, 13(Suppl 1):P244 (doi: 10.1186/cc7408)

Introduction To assess potential metabolic and microcirculatory alterations in critically ill patients, near-infrared spectroscopy (NIRS) has been used, in combination with a vascular occlusion test (VOT), for the noninvasive measurement of tissue oxygen saturation (StO2), oxygen consumption, and microvascular reperfusion and reactivity. However, the methodologies for assessing StO2 are very inconsistent in the literature and results vary from study to study. In this study we investigated the effects of the

Figure 1 (abstract P244)

Mean ± SD StO2 values in the thenar during a 3-minute VOT, measured with 15 mm and 25 mm probes. BSLN, baseline.

Figure 2 (abstract P244)

Mean ± SD StO2 values in the forearm during a 3-minute VOT. BSLN, baseline.

probe spacing and measurement site, using both a 15 mm and a 25 mm probe spacing on the thenar and the forearm, in healthy volunteers.

Methods StO2 was noninvasively measured in eight healthy volunteers during 3-minute VOTs using two InSpectra Tissue Spectrometers equipped with either a 15 mm or a 25 mm probe. VOT-derived StO2 traces were analyzed for baseline, ischemic, reperfusion, and hyperemic parameters.

Results Although not apparent at baseline, the probe spacing and measurement site significantly influenced VOT-derived StO2 parameters (Figures 1 and 2). StO2 parameters in the hyperemic phase of the VOT were shown to significantly correlate to the minimum StO2 value after 3 minutes of ischemia. Conclusions The present study showed that NIRS measurements in combination with a VOT are measurement site and probe dependent. Additionally, this study indicated that reactive hyperemia depends on the extent of ischemic hit and supports the use of a target StO2 over the use of a fixed time of occlusion.

Effects of perfusion pressure on microcirculatory perfusion of gastric tubes in pigs

E Klijn, S Niehof, J De Jonge, J Bakker, C Ince, J Van Bommel

Erasmus Medical Center, Rotterdam, the Netherlands Critical Care 2009, 13(Suppl 1):P245 (doi: 10.1186/cc7409)

Introduction It is known that formation of a gastric tube after esophagectomy impairs microcirculatory perfusion in the anastomotic region. The impaired microcirculatory perfusion is caused by poor arterial inflow and insufficient venous drainage, due to ligation of three of the four gastric arteries. We therefore hypothesized that improving venous drainage by giving nitroglycerin (NTG) and titrating norepinephrine (NE) to increasing levels of mean arterial pressure (MAP) would improve microcirculatory perfusion of the distal part of the gastric tube.

Methods Five pigs, bodyweight 30.5 ± 0.5 kg (mean ± SD), were anaesthetized and instrumented for continuous hemodynamic monitoring. A median laparotomy was performed and a gastric tube was reconstructed. Laser speckle imaging was used to measure the microcirculatory blood flow (MBF) in the gastric tube. Laser speckle imaging is a laser-based technique that measures MBF in a macroscopic field at a video frame rate. NTG was titrated in order to achieve a central venous pressure below 10 mmHg. MBF was measured at the base and at the distal site of the gastric tube while MAP was increased stepwise from 50 to 110 mmHg. For this purpose the MAP was decreased with infusion of propofol and increased with infusion of NE. For each step we expressed flow at the distal site as a percentage of flow at the base of the gastric tube (fraction of flow).

Results Hemodynamic values: MAP was increased in steps of 10 mmHg. The heart rate and cardiac output remained constant at around 123±28bpm and 4.2 ± 1.0 l/min, respectively. Central venous pressure and pulmonary artery occlusion pressure did not change throughout the experiment. MBF results: at MAP below 70 mmHg the fraction of flow at the distal part was 23.49 ± 3.47%. This increased significantly to 31.12 ± 9.15% at MAP between 70 and 90 mmHg. The fraction of flow for MAP above 90 mmHg was 29.42 ± 6.84%. See Figure 1. Conclusions Hypotension impairs flow in the distal part compared with the base of the gastric tube. Normotension significantly improves flow in this region. MBF in the distal part does not benefit from increasing MAP to supranormal values.

Figure 1 (abstract P245)

Lack of autoregulatory blood flow escape in the skin after infusion of therapeutic levels of noradrenaline through a microdialysis system in healthy volunteers

A Samuelsson, S Farnebo, E Zettersten, C Andersson, F Sjoberg

University Hospital, Linkoping, Sweden

Critical Care 2009, 13(Suppl 1):P246 (doi: 10.1186/cc7410)

Introduction In burn-injured patients we have previously shown that skin has a decreased autoregulatory blood flow adjustment. Lack of such a function in skin would make uncontrolled use of vasoconstrictors detrimental to this organ. If ischemia-like conditions are produced in the skin, their use could enhance the inflammatory response and increase the risk of organ dysfunction. We exposed local areas of the skin in healthy volunteers to therapeutic levels of noradrenaline (NA) by intradermal infusion through microdialysis (MD) catheters, to investigate whether autoregulatory escape is present in the skin during strong α-adrenergic stimulation.

Methods Five subjects received three MD catheters (CMA 70) intradermally in the lower arm. Catheters were perfused with Ringer solution containing 20 mmol/l urea for a 90-minute stabilizing period, for skin blood flow determination (urea clearance). Thereafter NA 0.5 or 5 μg/ml was perfused (2 μl/min) for 60 minutes, followed by a buffer washout period. After this, 0.5 mg/ml nitroglycerine was perfused (2 μl/min) for 60 minutes. Samples were collected every 10 minutes and analysed for urea, lactate, pyruvate and glucose. Results During perfusion with NA 5 and 0.5 μg/ml, skin blood flow decreased significantly, as indicated by changes in urea clearance, increased lactate/pyruvate quotients and corresponding rapid decreases in tissue glucose levels. These changes further increased during the perfusion of pure buffer. Normalization was first noted during infusion of nitroglycerine.

Conclusions Both doses of NA, previously claimed to be physiological, induced a reproducible and severe vasoconstriction, as indicated by effects on urea clearance, increases in lactate/pyruvate quotients and decreases in tissue glucose. These data suggest that skin lacks an autoregulatory escape function and that ischemia was induced in the skin of healthy volunteers by the NA infusion. The ischemic situation deteriorated further until active vasodilatation was started. In this skin vascular model in healthy volunteers, NA shows negative effects that may also be important in the critical care setting. Further studies are needed to validate these findings.

Alterations of tissue-dependent microcirculation in patients after successful resuscitation

H Busch, S Rahner, C Bode, T Schwab

University Hospital, Freiburg, Germany

Critical Care 2009, 13(Suppl 1):P247 (doi: 10.1186/cc7411)

Introduction The crucial role of the microcirculation in improved neurological outcome in patients after successful resuscitation has been discussed for many years. New noninvasive imaging techniques enable the visualization and analysis of the microcirculation in vivo. Our study utilised an orthogonal polarization spectral (OPS) noninvasive imaging technique to test microcirculation in patients after successful resuscitation. Methods Between February and November 2008, 20 successfully resuscitated patients were investigated in the medical intensive

care department of Albert Ludwigs University, Freiburg. OPS measurements were performed during mild therapeutic hypothermia and after rewarming. These results were compared with those of a control group consisting of 15 healthy volunteers. OPS measurements were performed using the MicroScan from MicroVision Medical. The recorded sublingual microcirculation images were analysed using the AVA software, with a focus on the microcirculation parameters total vessel density (TVD, mm/mm2) and perfused vessel density (PVD, mm/mm2). Results Patients after successful resuscitation showed a significant decrease in the tissue-dependent microcirculation compared with control for both TVD (5.33 ± 1.45 vs. 6.87 ± 1.07; P = 0.002) and PVD (3.48 ± 1.59 vs. 6.69 ± 1.03; P = 0.0002). Patients after successful resuscitation showed, during therapeutic hypothermia, a significantly increased microcirculation (TVD and PVD) compared with the rewarmed phase (TVD: 6.41 ± 1.53 vs. 5.33 ± 1.45; P = 0.028; and PVD: 4.96 ± 2.3 vs. 3.48 ± 1.59; P = 0.023). These effects were independent of the use of catecholamines.

Conclusions We demonstrated significant alterations of the tissue-dependent microcirculation in patients after successful resuscitation. Patients in the post-resuscitation phase showed, during therapeutic hypothermia, an increased microcirculation, as reflected by TVD and PVD, compared with the microcirculation after rewarming, independent of noradrenaline or dobutamine use.

Microcirculatory changes caused by magnesium sulphate infusion in severe sepsis and septic shock

A Pranskunas, V Pilvinis

Kaunas University of Medicine Hospital, Kaunas, Lithuania Critical Care 2009, 13(Suppl 1):P248 (doi: 10.1186/cc7412)

Introduction Microcirculatory dysfunction is a key element in the pathogenesis of severe sepsis and septic shock and is related to endothelial dysfunction. Studies in vivo have shown that infusion of magnesium sulphate increased endothelium-dependent vasodilatation in healthy people and patients with cardiac disorders, but the effect on the septic patient's vessels, especially small ones, is unknown. We hypothesized that magnesium sulphate infusion can improve microcirculation in patients with severe sepsis and septic shock.

Methods Six septic patients (mean age 56 ± 16 years), who had already been fluid resuscitated, underwent magnesium sulphate infusion of 2 g over 60 minutes, with additional volume loading and use of norepinephrine if required. The sublingual microcirculation was evaluated using sidestream dark-field videomicroscopy (MicroScan®; MicroVision Medical). Each patient's microcirculation was evaluated by examining three to six different sublingual areas (10 to 20 seconds/image). In all patients, measurements were obtained at baseline and after 60 minutes. Images were analyzed by semiquantitative scores of flow (mean flow index; proportion of perfused vessels) and density (total vascular density; perfused vascular density) of small vessels (<20 μm). Data are presented as median values (percentiles 25 to 75).
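The flow and density scores named here follow the semiquantitative conventions generally used for sublingual videomicroscopy; the simplified sketch below (hypothetical per-vessel scores, not the actual analysis workflow, and with the mean flow index reduced to a simple average rather than the quadrant-based consensus score) illustrates how such indices are derived:

from statistics import mean

def microcirculation_scores(vessel_flow_scores, image_area_mm2):
    """Simplified small-vessel perfusion indices from per-vessel flow scores
    (0 = absent, 1 = intermittent, 2 = sluggish, 3 = continuous flow).
    A vessel scoring >= 2 is counted as perfused."""
    perfused = [s for s in vessel_flow_scores if s >= 2]
    return {
        "MFI": mean(vessel_flow_scores),                           # mean flow index (simplified)
        "PPV%": 100.0 * len(perfused) / len(vessel_flow_scores),   # proportion of perfused vessels
        "TVD": len(vessel_flow_scores) / image_area_mm2,           # total vascular density (n/mm^2)
        "PVD": len(perfused) / image_area_mm2,                     # perfused vascular density (n/mm^2)
    }

# Hypothetical flow scores for one sublingual field of ~0.9 mm^2
print(microcirculation_scores([3, 3, 2, 1, 0, 3, 2, 2], image_area_mm2=0.9))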

Results Data of this study have shown that perfused vascular density increased from 9.7 (4.9 to 13.0) n/mm (at baseline) to 12.1 (10.1 to 13.6) n/mm (at 60 min), proportion of perfused vessels increased from 70 (40 to 83)% to 77 (67 to 85)% but without statistical significance, P = 0.068.

Conclusions Magnesium sulphate has a tendency to improve microcirculation in severe sepsis and septic shock patients, but further studies are needed to obtain more detailed results.

Sepsis induces an early impairment of endothelial glycocalyx in glomerular capillaries

L Vitali, V Selmi, A Tani, M Margheri, M Miranda, C Innocenti, R De Gaudio, C Adembri

University of Florence, Italy

Critical Care 2009, 13(Suppl 1):P249 (doi: 10.1186/cc7413)

Introduction The increase in endothelial capillary permeability represents one of the early and significant hallmarks of sepsis. Inflammation-induced damage to the endothelial glycocalyx has been identified as the mechanism involved in this increase in permeability in the setting of ischemia-reperfusion [1]. To date no data are available on glycocalyx damage in sepsis. The aim of this study was to evaluate whether the sepsis-associated increase in permeability is due to glycocalyx alteration.

Methods To induce sepsis, cecal ligation and puncture (CLP), a clinically relevant model of polymicrobial infection that mimics human sepsis, was used [2]. Nine rats underwent CLP; two rats, receiving laparotomy only, served as controls. Rats were then transcardially perfused with PBS and PBS + paraformaldehyde 4% at three different time points (3, 7 and 15 hours after CLP). Kidneys were collected and processed for light and confocal microscopy examination. Alterations in the glycocalyx were studied using digoxigenin-labelled Maackia amurensis agglutinin (MMA) lectin (to identify sialic acids, a large family of nine-carbon carboxylated sugars present on the cell surface) and a mouse monoclonal antibody against Syndecan-1 (an integral membrane proteoglycan).

Results In control rats, the luminal surface of the glomerular endothelium appeared intensely stained for MMA lectin and Syndecan-1, indicating the presence of a normal glycocalyx. From its early phase (3 hours), sepsis induced a significant alteration in the glomerular endothelial glycocalyx, which worsened at later time points (7 and 15 hours). At 15 hours, disruption of the glomerular architecture was also present.

Conclusions Alterations in the endothelial glycocalyx may represent the first step of sepsis-related changes in permeability. It is likely that treatments aimed at preserving the glycocalyx may prevent not only the increase in permeability but possibly also glomerular disruption. References

1. van den Berg, Vink H, Spaan JA: The endothelial glycocalyx protects against myocardial edema. Circ Res 2003, 92: 592-594.

2. Buras JA, Holzmann B, Sitkovsky M: Animal models of sepsis: setting the stage. Nat Rev Drug Discov 2005, 10: 854-865.

Electron microscopic renal tubular injury score in ovine endotoxemia

C Ertmer1, S Rehberg1, A Morelli2, M Lange1, G Kohler1, B Bollen Pinto1, H Van Aken1, M Westphal1

1 University of Muenster, Germany; 2University of Rome 'La Sapienza', Rome, Italy

Critical Care 2009, 13(Suppl 1):P250 (doi: 10.1186/cc7414)

Introduction Sepsis-associated renal failure is a common complication of septic shock. The current study investigates whether an innovative score of electron microscopic tubular injury (EMTI) is correlated with plasma creatinine concentration in ovine endotoxemic shock.

Figure 1 (abstract P250)

Maximum creatinine [mg/dl]

Methods Twenty-nine awake healthy ewes received a continuous endotoxin infusion until the mean arterial pressure (MAP) fell below 65 mmHg. Thereafter, sheep were optimally volume-resuscitated and received norepinephrine to establish a MAP of 70 ± 5 mmHg. After 12 hours animals were anesthetized and killed. Kidney tissue samples (10 per animal) were analyzed with standard transmission electron microscopy techniques. The EMTI score (0 to 12) was determined as the sum of (1) vacuolar cell degeneration (0 to 3), (2) basal membrane dissociation (0 to 3), (3) epithelial cell injury (0 to 3), and (4) luminal obstruction (0 to 3). The EMTI score was averaged as the mean value of the 10 kidney samples of each animal and was correlated with maximum plasma creatinine concentrations by Pearson product moment correlation. Results The EMTI score (median 5.9; 25% to 75% range 4.1 to 6.9) significantly correlated with the plasma creatinine concentration (1.19; 1.06 to 1.34) with a correlation coefficient of R = 0.371 (P = 0.0476; Figure 1).
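A minimal sketch of the scoring arithmetic described above (the subscore values and the data are hypothetical; only the structure of the calculation is taken from the abstract):

from statistics import mean
from math import sqrt

def emti_score(vacuolar, basal_membrane, epithelial, luminal):
    """EMTI score (0-12) for one kidney sample: sum of four subscores, each 0-3."""
    parts = (vacuolar, basal_membrane, epithelial, luminal)
    assert all(0 <= p <= 3 for p in parts)
    return sum(parts)

def pearson_r(x, y):
    """Pearson product-moment correlation, as used for EMTI vs. maximum creatinine."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))

# Hypothetical example: each animal's score is the mean of its sample scores
samples_animal_1 = [(2, 1, 2, 1), (1, 1, 2, 0), (2, 2, 1, 1)]  # truncated to 3 samples
animal_score = mean(emti_score(*s) for s in samples_animal_1)
print(animal_score)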

Conclusions The EMTI score is useful to quantify alterations of renal tubular integrity and thereby reflects changes in renal function in ovine endotoxemia. Scientific use of the EMTI score may provide further insight into the pathophysiology of sepsis-associated renal failure.

Inflammation-induced renal injury subsides when endotoxin tolerance develops in humans as measured by urine proteomics

A Draisma1, S Heemskerk2, M Bouw1, C Laarakkers1, JG Van der Hoeven1, R Masereeuw2, P Pickkers1

1Radboud University Medical Center, Nijmegen, the Netherlands;

2Nijmegen Centre for Molecular Life Sciences, Nijmegen, the Netherlands

Critical Care 2009, 13(Suppl 1):P251 (doi: 10.1186/cc7415)

Introduction Inflammatory mediators such as cytokines are increasingly recognized to play a prominent role in the development of renal injury during sepsis. Because of the associated high mortality rates, early detection of renal injury is of the utmost importance, and the discovery of renal biomarkers seems promising in achieving this goal.

Methods We induced endotoxin tolerance to accomplish an attenuated proinflammatory state by intravenous injection of

2 ng/kg/day lipopolysaccharide (LPS) on 5 consecutive days in five healthy male volunteers. Renal function was monitored and urinary proteome research was performed before and after LPS administrations on days 1 and 5.

Results Repeated LPS administrations induced an increase in serum creatinine of 11 ± 3% (P = 0.002) and a decrease in glomerular filtration rate of 33 ± 7% (P = 0.02) on day 3, which was associated with the appearance of 15 peak intensities and an increase in β2-microglobulin levels (P = 0.04) 6 hours after the first LPS administration. Four of the 15 peak intensities correlated with serum creatinine levels; namely, 3,950 (r = 0.91, r2 = 0.84, P = 0.03), 4,445 (r = 0.97, r2 = 0.94, P = 0.01), 6,723 (r = 0.94, r2 = 0.88, P = 0.02) and 7,735 mass per charge (r = 0.87, r2 = 0.75, P = 0.05). During day 5 of the repeated LPS administrations, endotoxin tolerance developed and renal function was restored, reflected by serum creatinine levels of 70 ± 6 μmol/l (P = 0.2, day 1 compared with day 5), by attenuated peak intensities on the urinary proteome profiles (P <0.0001 for all 15 measured peak intensities) and by β2-microglobulin levels comparable with baseline on the first day (P = 0.35). Conclusions Renal injury occurs on day 3 during repeated endotoxemia, as adequately predicted by urinary proteome research on day 1. A significant correlation was found between four markers and the extent of renal injury, which may act as potential new biomarkers for renal injury at an early stage of inflammation. The inflammation-induced renal injury subsided when LPS tolerance developed after 5 consecutive days of LPS administrations.

Urinary glutathione S-transferase as an early marker for acute kidney injury in patients admitted to intensive care with sepsis

C Walshe, F Odejayi, S Ng, B Marsh

Mater Misericordiae University Hospital, Dublin, Ireland Critical Care 2009, 13(Suppl 1):P252 (doi: 10.1186/cc7416)

Introduction Acute kidney injury (AKI) is common in patients admitted to intensive care. Diagnosis of AKI relies on serum creatinine and urine flow. These have the disadvantages of low specificity and sensitivity and a slow rate of change. Renal damage results in the release of tubular enzymes into the urine. Measurement of urinary alpha glutathione S-transferase (αGST) and pi glutathione S-transferase (πGST) may indicate AKI more acutely and accurately than current methods of diagnosis.

Methods Urine was collected 4 hourly over 48 hours from patients with a diagnosis of sepsis. Urine was frozen, and urinary πGST and αGST were measured. Fluid and vasopressor management was recorded, but managed independently. Serum creatinine was measured at 0, 24 and 48 hours. AKI was diagnosed using AKI Network criteria [1].

Results We present the first 35 patients recruited; 20 were male, 15 female. Median patient age was 53 years. Median APACHE II score was 13. Median ICU length of stay was 9 days. ICU mortality was 14%, hospital mortality 23%. AKI was diagnosed in 26% of patients. Statistical significance was tested by the Wilcoxon signed-rank test. Although the median πGST at 0 hours was elevated (11.8 μg/l (non-AKI) versus 22 μg/l (AKI)), this was not statistically significant between the two groups, P = 0.985. πGST did not demonstrate an increased urinary level in AKI versus non-AKI (median values 0.89 μg/l vs. 3.4 μg/l at 0 hours). See Figure 1.

Figure 1 (abstract P252)

Conclusions A trend towards early expression of πGST was identifiable in this study. This may indicate early detection of AKI, which may help guide therapeutic interventions. πGST does not seem to be released as a biomarker using this sepsis model, suggesting a more specific distal tubular injury. Further work is required to determine levels of πGST in nonstressed kidneys.
Reference

1. Mehta RL, et al.: Acute Kidney Injury Network: report of an initiative to improve outcomes in acute kidney injury. Crit Care 2007, 11:R31.

Plasma neutrophil gelatinase-associated lipocalin is an early marker of acute kidney injury in critically ill patients: a prospective study

J Constantin1, E Futier1, L Roszyk2, S Perbet1, V Sapin2, A Lautrette2, B Souweine2, JE Bazin1

1Hotel-Dieu Hospital, Clermont-Ferrand, France; 2University Hospital of Clermont-Ferrand, France

Critical Care 2009, 13(Suppl 1):P253 (doi: 10.1186/cc7417)


Introduction Serum creatinine is a late marker of acute kidney injury (AKI). Plasma neutrophil gelatinase-associated lipocalin (pNGAL) is an early biomarker of AKI after cardiac surgery. The purpose of this study was to assess the ability of pNGAL to predict AKI in ICU patients. Methods All patients admitted to three ICUs of the same institution were enrolled in this prospective observational study. pNGAL was analyzed at ICU admission. RIFLE criteria were calculated at admission and for each day during the first week. Patients were classified according to RIFLE criteria (RIFLE 0 or 1). Four groups were identified: RIFLE 0-0, 1-1, 1-0 and 0-1. Results During this 1-month period, 88 patients were included in the study. Thirty-six patients had RIFLE 0-0 with a mean pNGAL of 98 ± 60 nmol/l. Twenty-two patients had RIFLE 1-1 with a mean pNGAL of 516 ± 221 nmol/l. Twenty patients had no AKI at admission but developed AKI at 48 hours (24 to 96 hours) (RIFLE 0-1); in this case the mean pNGAL was 342 ± 183 nmol/l. Ten patients had RIFLE 1-0 and the mean pNGAL was 169 ± 100 nmol/l. With a cutoff value of 155 nmol/l, the sensitivity and specificity to predict AKI were respectively 82% and 97%. Seven patients needed extrarenal therapy, and all of them had pNGAL >155 nmol/l. The patients with pNGAL <155 nmol/l had more shock, more sepsis and a higher Simplified Acute Physiology Score II score.
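The sensitivity and specificity quoted for the 155 nmol/l cutoff are straightforward to derive from paired admission values and AKI outcomes; a minimal sketch with hypothetical data:

def cutoff_performance(values, has_aki, cutoff=155.0):
    """Sensitivity and specificity of a biomarker cutoff (here pNGAL, nmol/l) for AKI."""
    tp = sum(v >= cutoff and aki for v, aki in zip(values, has_aki))
    fn = sum(v < cutoff and aki for v, aki in zip(values, has_aki))
    tn = sum(v < cutoff and not aki for v, aki in zip(values, has_aki))
    fp = sum(v >= cutoff and not aki for v, aki in zip(values, has_aki))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical admission pNGAL values (nmol/l) and AKI outcomes, not the study data
pngal = [98, 510, 340, 120, 150, 90, 480, 170]
aki = [False, True, True, False, True, False, True, False]
print(cutoff_performance(pngal, aki))  # (sensitivity, specificity)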

Conclusions pNGAL at ICU admission is an early biomarker of AKI in a setting in which the timing of kidney injury is unknown. pNGAL increases 48 hours before RIFLE criteria are met.

Plasma cholinesterase activity in patients during peritoneal dialysis

L Popovic, A Pavicic, J Slavicek, J Kern

Childrens Hospital Zagreb, Croatia

Critical Care 2009, 13(Suppl 1):P254 (doi: 10.1186/cc7418)

Introduction Literature data suggest that the release of nitric oxide (NO) by endothelial NO synthase contributes to functional alterations of the peritoneal membrane induced by acute peritonitis. Experimental animal studies show that induced peritonitis was characterized by structural changes in the peritoneal membrane and increased permeability for urea and glucose, as well as increased protein loss in dialysate [1]. Plasma cholinesterase (PChE, EC 3.1.1.8) is an enzyme synthesized in the liver, with a molecular mass from 345,000 to 371,000 Da. The aim of this study was to evaluate the functional characteristics of the peritoneum by determining PChE transfer from blood to dialysis fluid in peritonitis-free patients and in patients with peritonitis. Methods We measured PChE activity in plasma and peritoneal dialysate in adult peritonitis-free patients (n = 20) and in patients with peritonitis (n = 12). For determination of PChE activity, venous blood samples were collected and stored at -20°C until analysed. PChE activity was determined by the spectrophotometric method of Ellman and colleagues [2], using butyrylthiocholine as the substrate (Sigma Chemical Co., St Louis, MO, USA). Statistical analysis was performed by the Mann-Whitney U test and the Wilcoxon matched-pairs test.

Results In both groups of examined patients we detected significant passage of PChE from blood to dialysis fluid (P = 0.000089 for peritonitis-free patients and P = 0.002218 for patients with peritonitis). The degree of transperitoneal PChE passage was not statistically different in patients with peritonitis compared with peritonitis-free patients (P = 0.119438). Conclusions In all patients we detected significant transperitoneal passage of PChE during peritoneal dialysis. During human peritoneal dialysis there is no change in peritoneal permeability for macromolecules such as the PChE protein complex in patients with peritonitis. References

1. Jie NI, et al.: J Am Soc Nephrol 2003, 14:3205-3216.

2. Ellman GL, et al.: Biochem Pharmacol 1961, 7:88-95.

Early predictive value of neutrophil gelatinase-associated lipocalin in adult ICU patients with acute kidney injury

H De Geus, J Le Noble, F Zijlstra, C Ince, J Bakker

Erasmus University Medical Center, Rotterdam, the Netherlands Critical Care 2009, 13(Suppl 1):P255 (doi: 10.1186/cc7419)

Introduction We investigated the early predictive value of plasma and urine neutrophil gelatinase-associated lipocalin (NGAL) compared with serum creatinine in patients with acute kidney injury (AKI) in a heterogeneous adult ICU population. Previous studies have shown an excellent predictive value of urine NGAL for AKI in a pediatric ICU setting [1,2].

Methods A prospective cohort study was conducted, including 63 patients. After ICU admission, plasma and urine samples were collected at eight time points. NGAL measurements were performed using an ELISA. The difference between the increase in NGAL and the increase in serum creatinine was calculated in relation to the time point at which maximum AKI was reached (T = 0). AKI was defined according to Acute Kidney Injury Network definitions.

Figure 1 (abstract P255)

Increase in urine NGAL and serum creatinine AKI III.

Figure 2 (abstract P255)

Increase difference between urine NGAL and serum creatinine AKI I

Results Mean age (±SD) in years and APACHE II (±SD) scores for AKI I (n = 22), II (n = 21) and III patients (n = 20) were respectively (56 ± 18, 21 ± 7), (58 ± 19, 25 ± 8) and (63 ± 13, 27 ± 7). In AKI III patients, the difference between the increase in plasma NGAL and the increase in serum creatinine expressed as a percentage (95% CI) at 72, 48 and 24 hours prior to maximum AKI was 203% (-169 to 574), -10% (-221 to 830) and -25% (-205 to 155). For urine NGAL and serum creatinine the difference was respectively 584% (-603 to 1,770), 756% (12 to 1,499) and 726% (240 to 1,212). Figures 1 and 2 depict the increase in urine NGAL and serum creatinine separately and their difference in relation to T = 0. There was no positive difference in patients with AKI stages I and II for urine or plasma at any time point prior to maximum AKI.

Conclusions NGAL is, compared with serum creatinine, an earlier predictor of AKI III, in a heterogeneous adult ICU population. Urine NGAL provides a larger and more sustained window of early prediction than plasma NGAL in patients with developing AKI III. The use of urine NGAL as an early predictive biomarker for AKI III is therefore promising in an adult heterogeneous ICU setting.

References

1. Mishra J, et al.: NGAL as a biomarker for acute renal injury after cardiac surgery. Lancet 2005, 365:1231-1238.

2. Zappitelli M, et al.: Urine NGAL as an early marker of acute kidney injury in critically ill children: a prospective cohort study. Crit Care 2007, 11:R84.

Novel time series analysis approach for prediction of dialysis in critically ill patients using echo-state networks

T Verplancke1, S Van Looy2, K Steurbaut2, D Benoit1, F De Turck2, G De Moor1, J Decruyenaere1

1Ghent University Hospital, Ghent, Belgium; 2Ghent University, Ghent, Belgium

Critical Care 2009, 13(Suppl 1):P256 (doi: 10.1186/cc7420)

Introduction Echo-state networks are part of a group of reservoir computing methods and are basically a form of recurrent artificial neural networks. These methods can perform classification tasks on time-series data. The recurrent artificial neural network of an echo-state network has an echo-state characteristic. This echo state functions as a fading memory: samples that have been introduced into the network in a further past are faded away. The echo-state approach for the training of recurrent neural networks was first described by Jaeger [1]. In clinical medicine, until this moment, no original research articles have been published to examine the use of echo-state networks.
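To make the reservoir-computing idea concrete, the following minimal Python sketch (illustrative only; it uses synthetic inputs and is not the network or data of this study) builds a small echo-state reservoir with fading memory and trains only a linear readout:

```python
# Minimal echo-state network sketch (illustrative only; synthetic data, not the
# study's network): a fixed random reservoir with fading memory and a trained
# linear readout, as in reservoir computing.
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_inputs, n_reservoir=100, spectral_radius=0.9):
    W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
    W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))  # echo-state scaling
    return W_in, W

def final_state(series, W_in, W, leak=0.3):
    """Run the reservoir over one time series and return its last state."""
    x = np.zeros(W.shape[0])
    for u in series:                      # u = input vector at one time step
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
    return x

# Synthetic example: 200 "patients", 3 daily (diuresis, creatinine) pairs each.
X = rng.normal(size=(200, 3, 2))
y = (X[:, :, 1].mean(axis=1) > 0).astype(float)   # toy binary outcome

W_in, W = make_reservoir(n_inputs=2)
states = np.array([final_state(s, W_in, W) for s in X])

# Only the linear readout is trained, here by ridge regression.
ridge = 1e-2
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(states.shape[1]),
                        states.T @ y)
pred = (states @ W_out > 0.5).astype(float)
print("training accuracy:", (pred == y).mean())
```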

Methods The present study examines the possibility of using an echo-state network for prediction of dialysis in the ICU. Therefore, diuresis values and creatinine levels of the first 3 days after ICU admission were collected from 830 patients admitted to the ICU between 31 May 2003 and 17 November 2007. The outcome parameter was the performance by the echo-state network in predicting the need for dialysis between day 5 and day 10 of ICU admission. Patients with an ICU length of stay <10 days or patients that received dialysis in the first 5 days of ICU admission were excluded. Performance by the echo-state network was then compared by means of the area under the receiver operating characteristic curve (AUC) with results obtained by two other time-series analysis methods by means of a support vector machine and a naive Bayes algorithm.

Results The AUCs in the three developed echo-state networks were 0.822, 0.818, and 0.817. There were no statistically significant differences at the 0.05 level with the results obtained by the support vector machine and the naive Bayes algorithms. Conclusions This proof of concept study is the first to evaluate the performance of echo-state networks in predicting the need for dialysis in an ICU population. The AUCs of the echo-state networks were good and comparable with the performance of other classification algorithms. Reference

1. Jaeger H: The 'Echo State' Approach to Analysing and Training Recurrent Neural Networks. Technical Report GMD 148. German National Research Institute for Computer Science; 2001.

Contrast-induced nephropathy: a prospective analysis of long-term outcome and persistence of renal impairment

W Huber, E Wohlleb, C Schilling, B Saugel, V Phillip, R Schmid

Klinikum rechts der Isar der Technischen Universität München, Germany

Critical Care 2009, 13(Suppl 1):P257 (doi: 10.1186/cc7421)

Introduction Despite several prophylactic approaches such as acetylcysteine, theophylline and sodium-bicarbonate, contrast-induced nephropathy (CIN) remains a clinical problem. Regarding the large number of studies on CIN, little is known about long-term follow up of patients with CIN. The aim of our study was therefore to analyse the outcome of patients with CIN included in eight prospective studies on CIN conducted by our group, including more than 1,200 patients.

Methods We performed a systematic analysis of patients with CIN (defined as an increase in serum creatinine of >0.5 mg/dl and/or >25% within 48 hours after contrast medium (CM)) using chart review and telephone calls to patients and to the physicians involved in therapy after CIN, up to 1 year or up to the death of the patient. The composite primary endpoint was: requirement of renal replacement therapy (RRT), death, or a persistent increase in creatinine >0.3 mg/dl as compared with the baseline value. Further endpoints: time course of creatinine up to 1 year after CM.
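A hedged sketch of how the CIN definition above could be applied to paired creatinine values (illustrative only; not the authors' analysis code):

```python
# Hedged sketch (not the authors' code): applying the CIN definition above,
# i.e. an increase in serum creatinine of >0.5 mg/dl and/or >25% within
# 48 hours after contrast medium.
def is_cin(baseline_cr_mg_dl: float, cr_within_48h_mg_dl: float) -> bool:
    rise = cr_within_48h_mg_dl - baseline_cr_mg_dl
    return rise > 0.5 or rise > 0.25 * baseline_cr_mg_dl

print(is_cin(1.8, 2.4))  # True: rise of 0.6 mg/dl (33%)
print(is_cin(1.0, 1.2))  # False: rise of 0.2 mg/dl (20%)
```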

Results Among 85 cases with CIN, complete datasets sufficient for analysis of the above-mentioned endpoints could be recorded in 55 patients. There were 39 male and 16 female patients; age 85.5 ± 13.1 years; amount of CM 274 ± 181 ml; intravenous CM application in 12 patients and intraarterial CM in 43 patients. The 28-day mortality was 8/55 (15%). At least four patients (7.3%) were treated with RRT. A total of 24/55 patients (44%) fulfilled the criteria of the composite endpoint (RRT or death or persistent increase in creatinine >0.3 mg/dl). Compared with baseline creatinine (1.8 ± 1.2 mg/dl), creatinine levels after 24 hours (2.3 ± 1.2 mg/dl; P <0.001), 48 hours (2.4 ± 1.4 mg/dl; P <0.001) and 1 week (2.5 ± 1.9 mg/dl; P = 0.033) were significantly elevated. In 37 of the surviving patients, creatinine after 1 year was not significantly higher than before CM (1.65 ± 1.05 vs. 1.46 ± 0.76 mg/dl; P = 0.1). However, patients who died or were on RRT were not included in this comparison. In two of our eight studies CIN was significantly associated with mortality.

Conclusions Our data indicate that CIN is associated with significant mortality, requirement of RRT and/or persistent renal impairment in nearly one-half of the patients. Since these data were collected in patients included in studies aimed at prophylaxis of CIN, the risk might be even higher in clinical routine. Prophylaxis of CIN should therefore be a major issue in patients at risk.

Outcome of ICU patients requiring dialysis in an African institution

M Mer1, L Ezekiel1, S Naicker1, G Richards1, J Levin2

1University of the Witwatersrand, Johannesburg, South Africa;

2Medical Research Council of South Africa, Pretoria, South Africa Critical Care 2009, 13(Suppl 1):P258 (doi: 10.1186/cc7422)

Introduction Acute renal failure (ARF) is a common problem in ICUs and is associated with a high mortality rate [1,2]. Early and aggressive management of renal dysfunction through intermittent or continuous renal replacement therapy (RRT) is now common practice. The objective of this study was to review the outcome of patients who received RRT in the multidisciplinary ICU of the Johannesburg Hospital, South Africa.

Methods A review of ICU records between January 2003 and December 2004 of all patients requiring RRT in the ICU was performed. Demographic data, reason for ICU admission, presence of comorbidity, APACHE II score, the modality and duration of dialysis, as well as the outcome in the ICU were documented. This study was approved by the Human Research Ethics Committee of the University of the Witwatersrand, and statistical analysis carried out in conjunction with the Biostatistics Unit of the Medical Research Council of South Africa using the Stata release 8.0 statistical package.

Results One hundred and fifty-six patients out of 2,200 admissions (January 2003 to December 2004) were initiated on RRT. One hundred and three patients were treated with intermittent haemodialysis (IHD), 47 of whom died (45.6%). Twenty-three patients underwent continuous venovenous haemodialysis, 20 (87%) of whom died. Twenty-two patients underwent both IHD and continuous venovenous haemodialysis, 15 (68.2%) of whom died. In eight patients there was no record of the mode of dialysis administered. Multivariate logistic regression suggested that the main factor associated with mortality was dialysis. Omitting the mode of dialysis, the presence of sepsis and the use of inotropes were independent risk factors associated with mortality.

Conclusions Continuous RRT allows renal support in patients who would be unable to sustain IHD. The presence of sepsis and the use of inotropic support, rather than the mode of dialysis, are predictive of the outcome of ARF in the ICU. Of note, the overall mortality rate for ARF in this study appears to be lower than previously reported.

References

1. de Mendonica A, Vincent JL, Suter PM, et al.: Acute renal failure in the ICU: risk factors and outcome evaluation by SOFA score. Intensive Care Med 2000, 26:915-921.

2. Mehta RL: ARF in the ICU: lessons from the PICARD study. In ASN Proceedings 2004.

Early goal-directed therapy of septic patients coming from the ward does not protect the kidneys

HD Kiers1, A Litchfield2, S Reynolds2, D Griesdale2, RT Gibney3, P Pickkers1, D Chittock2, DD Sweet2

1Radboud University Nijmegen Medical Centre, Nijmegen, the Netherlands; 2Vancouver General Hospital, University of British Columbia, Vancouver, BC, Canada; 3University of Alberta, Edmonton, AB, Canada

Critical Care 2009, 13(Suppl 1):P259 (doi: 10.1186/cc7423)

Introduction Patients with severe sepsis or septic shock admitted to the emergency room appear to benefit from early goal-directed therapy, while this is unknown for patients admitted to the ICU from the ward.

Methods We documented achievement of mean arterial pressure >65 mmHg, central venous pressure >8 mmHg and central venous oxygen saturation >70% within the first 12 hours after ICU consultation. Creatinine in the week prior to ICU admission, creatinine at ICU consultation and peak creatinine in the 2 weeks after ICU admission were recorded.

Results Eighty-five patients were included, of whom 40% achieved all goals within 6 hours (early achievement), whereas 33% did not but had all of them documented within the first 12 hours (no achievement); 27% had incomplete documentation of goals. Forty-four out of 85 patients (52%) developed acute kidney injury (AKI) according to the RIFLE criteria. Patients with incomplete documentation, early or no achievement of goals were comparable at baseline, in APACHE II score and in interventions, except for more use of inotropes in the no achievement group. The course of creatinine was similar regardless of achievement of goals (baseline - ICU consultation - peak, presented as median (IQR)): early achievement group, 90 (61 to 119) - 118 (86 to 204) - 148 (86 to 249); no achievement group, 76 (56 to 98) - 121 (78 to 159) - 151 (88 to 205); incomplete documentation group, 81 (56 to 109) - 95 (65 to 175) - 114 (77 to 223). The patients that developed AKI had a significant increase in creatinine from baseline to the time of ICU consultation (baseline - ICU consultation - peak, presented as median (IQR)): no AKI, 75 (59 to 114) - 87 (64 to 123) - 92 (72 to 129); AKI, 84 (61 to 105), P = 0.638 - 145 (102 to 205), P <0.001 - 196 (153 to 276), P <0.001. The development of AKI was independent of achievement of the physiologic goals of resuscitation.

Conclusions In septic patients admitted from the ward there is no association between the development of AKI and the timeliness of achievement of physiologic resuscitation goals. In view of the higher creatinine values measured at ICU admission in the patients that eventually suffer from AKI, it appears that the insult has occurred prior to the time of ICU consultation and that physiologic resuscitation does not reverse it.

Is goal-directed therapy useful in kidney transplantation?

M Ciapetti, S Di Valvasone, M Bonizzoli, A Di Filippo, A Peris

Careggi Teaching Hospital, Florence, Italy

Critical Care 2009, 13(Suppl 1):P260 (doi: 10.1186/cc7424)

Introduction Patients with end-stage renal disease (ESRD) arrive at kidney transplantation surgery with widely varying volemic status. According to some studies [1,2], the risk of acute tubular necrosis can be reduced by maintaining a correct intravascular volume before graft reperfusion. The aim of this study was to evaluate goal-directed therapy in the early postoperative period in kidney transplantation.

Methods We observed 50 kidney transplant recipients divided into two groups: 38 patients (Group A) underwent central venous oxygen saturation (ScVO2) monitoring, and 12 patients (Group B) were controls. Continuous central venous pressure (CVP) and ScVO2 were monitored in the ICU [3]. In Group A the volemic status was optimized by keeping CVP >5 mmHg and ScVO2 >70%. We collected donor and transplanted kidney parameters (age, sex, cause of death, ischaemia time), recipient parameters (age, sex, weight, height, BMI, duration of dialysis, ESRD, Simplified Acute Physiology Score II), intraoperative parameters (metabolic, respiratory and hemodynamic), hemodynamic and kidney function parameters in the ICU (heart rate, mean arterial pressure, CVP, ScVO2, lactate, diuresis, blood urea nitrogen (BUN), creatinine, fluid balance), and outcome parameters (ICU length of stay, acute rejection at 28 days, mortality at 6 months).

Results At each observation ScVO2 was >70% in all patients of Group A. Diuresis was higher in Group A (at 6 hours, Group A: 1,082.6 ± 1,000.7; Group B: 757.2 ± 462.7; at 12 hours, Group A: 1,020.5 ± 921.5; Group B: 835.4 ± 517.8). The heart rate at 0 hours was higher in Group B (Group A: 85.2 ± 13.2; Group B: 97.7 ± 27.7; P <0.05). Creatinine (at 0 hours, Group A: 6.9 ± 2; Group B: 8.2 ± 3; at 12 hours, Group A: 7.2 ± 2.5; Group B: 9.3 ± 0.07) and BUN (at 0 hours, Group A: 0.9 ± 0.4; Group B: 1.0 ± 0.3; at 12 hours, Group A: 1 ± 0.4; Group B: 1.1 ± 0.3) were higher in Group B. The creatinine (Group A: -0.2 ± 1.2; Group B: 0.4 ± 0.7) and BUN (Group A: 0.06 ± 0.1; Group B: 0.1 ± 0.04) reduction (12 to 0 hours) was higher in Group A. Diuretic stimulation was reduced in 10 patients of Group A and none of Group B (P <0.05), and was interrupted in eight patients of Group A and two of Group B.

Conclusions Postoperative intensive monitoring and optimization of intravascular volume by CVP and ScVO2 allow fast recovery of kidney function in transplant recipients, thereby reducing diuretic stimulation and creatinine and BUN values. References

1. De Gasperi A, et al.: Transplant Proc 2006, 38:807-809.

2. Hadimioglu N, et al.: Transplant Proc 2006, 38:440-442.

3. Rivers EP, et al.: Curr Opin Crit Care 2001, 7:204-211.

Clinical characteristics and outcomes of critically ill adults with septic acute kidney injury in a general hospital in Singapore

J Koh, J Vijo Poulose, V Poulose

Changi General Hospital, Singapore

Critical Care 2009, 13(Suppl 1):P261 (doi: 10.1186/cc7425)

Introduction The aim of this study was to determine the clinical characteristics and outcomes of critically ill adults with septic acute kidney injury (AKI) stratified according to the AKI staging. The Acute Kidney Injury Network (AKIN) definition for AKI had been shown to predict important clinical outcomes such as hospital mortality [1]. Sepsis is the most common cause of AKI resulting in worse clinical outcomes when compared with other causes [2]. Methods An observational study conducted in a medical ICU of a general hospital in Singapore over a 6-month period. Patients who were admitted to the ICU with a diagnosis of sepsis and AKI (as defined by the AKIN criteria) were prospectively enrolled. The clinical characteristics and outcomes were determined and stratified according to the AKIN criteria.

Results A total of 71 consecutive septic patients (60.6% Chinese, 32.4% Malays, 2.8% Indians) were enrolled. The mean age was 63.7 years with a male predominance of 67.6%. The median Simplified Acute Physiology Score (SAPS) II score was 54. The majority (60.6%) was AKI III, with 22.9% in stage II and 16.9% in stage I. Overall hospital mortality was 39.4%. Patients who met septic AKI III had significantly higher mortality compared with AKI I and II (55.8% vs. 16.7% and 12.5%, respectively, P <0.001). There was a significant difference in the mean SAPS II score between the dead and alive patients (70 vs. 46, P <0.001). Multiple logistic regression analysis showed that AKI III (OR = 5.75, 95% CI = 1.2 to 25.5) and SAPS II score >65 (OR = 15.6, 95% CI = 3.5 to 68.2) were found to be independent predictors of hospital mortality.

Conclusions In septic patients, AKI III appeared to be a strong predictor of hospital mortality. This finding is similar to a previous study [3], which also showed that in patients with AKI only AKI III was an independent risk factor for hospital death. References

1. Barrantes F, et al.: Acute kidney injury criteria predict outcomes of critically ill patients. Crit Care Med 2008, 36: 1397-1403.

2. Bagshaw SM, et al.: Septic acute kidney injury in critically ill patients: clinical characteristics and outcomes. Clin J Am Soc Nephrol 2007, 2:431-439.

3. Ostermann M, et al.: Correlation between the AKI classification and outcome. Crit Care 2008, 12:R144.

Incidence and outcome of acute renal failure necessitating renal replacement therapy after trauma

S Beitland1, H Moen1, I Os2

1Ullevaal University Hospital, Oslo, Norway; 2University of Oslo, Norway

Critical Care 2009, 13(Suppl 1):P262 (doi: 10.1186/cc7426)

Introduction Acute renal failure (ARF) requiring renal replacement therapy (RRT) is uncommon in trauma patients. The aim of this study was to assess incidence and outcome in this patient group. Methods Adult trauma patients with ARF treated with RRT at Ullevaal University Hospital between 1 January 1996 and 31 December 2007 were retrospectively reviewed. The hospital is the

regional trauma referral centre for approximately 1.93 million adult (>18 years) persons. Individuals were identified and data were collected using several institutional registries in addition to the Norwegian renal registry. Patients were grouped according to presence of rhabdomyolysis based on peak serum creatine kinase levels exceeding 10,000 U/l or not. Categorical data were compared employing the two-sided Pearson chi-square test, whereas continuous data were compared utilizing the two-tailed Mann-Whitney U test.

Results There were 78,345 hospital admissions due to trauma during the study period; 42 of these patients underwent RRT for ARF (after excluding one person due to low age, three with nonrenal indications for RRT and another three with chronic renal failure). The incidence rate of post-traumatic ARF requiring RRT was 0.54‰, and was higher in males than in females (33.72 vs. 6.05 per million persons, P <0.01). The patients' median (range) age was 46.4 (18 to 84) years, and 85.7% were males. Mortality rates were 23.8% (ICU/hospital), 35.7% (3 months) and 40.5% (1 year). Renal recovery, defined as independence from RRT, occurred in all survivors after 3 months and 1 year. Rhabdomyolysis was present in 18 persons (42.9%) and was registered only in males. Trauma patients with rhabdomyolysis were significantly younger (33.0 vs. 57.0 years, P = 0.01), needed RRT earlier (3.0 vs. 6.5 days, P = 0.02) and had lower 3-month (16.7 vs. 50.0%, P = 0.03) and 1-year (22.2 vs. 54.2%, P = 0.04) mortality rates compared with nonrhabdomyolytic persons.

Conclusions Post-traumatic ARF requiring RRT is rare and mainly affects males of young age. There is still a considerable mortality rate in these patients, and it seems that rhabdomyolytic persons have a more favourable outcome compared with those without rhabdomyolysis. Among survivors, recovery of renal function is often seen, and few become dependent on chronic RRT.

Early acute kidney injury in Northern Ireland ICUs

E Borthwick1, S Harris2, C Welch3, A Maxwell1, DF McAuley4, P Glover4, D Harrison3, K Rowan3

1Regional Nephrology Unit, Belfast, UK; 2UCL, London, UK;

3Intensive Care National Audit and Research Centre, London, UK;

4Royal Victoria Hospital, Belfast, UK

Critical Care 2009, 13(Suppl 1):P263 (doi: 10.1186/cc7427)

Introduction There are limited data about the epidemiology of acute kidney injury (AKI) in critically ill patients in Northern Ireland. The aim of this study was to examine AKI within 24 hours of ICU admission and its relation to outcomes (ICU mortality and hospital mortality).

Methods A secondary analysis of prospectively collected data in the Intensive Care National Audit and Research Centre Case Mix Programme Database. The Case Mix Programme Database was interrogated and data extracted from 22,313 admissions to eight ICUs from 1999 to 2007. The presence of AKI was assessed within the first 24 hours after admission (early AKI) and classified according to the RIFLE criteria. Trends over time were described for the RIFLE categories, and outcomes of admissions in each category were summarised. Where available, information on the use of renal replacement therapy during the ICU stay was analysed.

Results Trends in early AKI changed little over time: 35.5% of patients sustained AKI (risk 13.5%, injury 11.1%, failure 9.8%, end-stage 1.2%) and 9% of patients received renal replacement therapy. Outcomes are presented in Table 1.

Table 1 (abstract P263)

Outcomes

                     No AKI (n = 14,385)   Risk (n = 3,020)   Injury (n = 2,446)   Failure (n = 2,183)   End-stage (n = 259)
ICU mortality        1,350 (9.4)           563 (18.6)         834 (33.8)           909 (41.6)            48 (18.5)
Hospital mortality   2,258 (16.8)          900 (31.9)         1,092 (47.2)         1,137 (55.9)          82 (32.7)

Data presented as n (%).

Conclusions For the first time we have established the incidence of early AKI using the RIFLE criteria in Northern Ireland ICUs. This will inform renal service development in ICUs. The incidence of severe AKI is high relative to the rest of the UK [1] although it is low compared with the US [2]. Mortality increases with severity of renal injury.

References

1. Kolhe et al.: Crit Care 2008, 12(Suppl 1):S2.

2. Hoste et al.: Crit Care 2006, 10:R73.

RIFLE classification can predict hospital mortality of critically ill patients

HY Xu1, JM Peng1, ZR Mao2, L Weng1, XY Hu1, B Du1

1Peking Union Medical College Hospital, Beijing, China; 2First Affiliated Hospital of Henan College of Traditional Chinese Medicine, Henan, China

Critical Care 2009, 13(Suppl 1):P264 (doi: 10.1186/cc7428)

Introduction The Acute Dialysis Quality Initiative group has proposed the RIFLE (Risk-Injury-Failure-Loss-End-stage renal disease) classification to assess acute kidney injury (AKI). We sought to evaluate the incidence of AKI in critically ill patients according to the RIFLE classification and the correlation between RIFLE class and hospital mortality.
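For illustration, the serum creatinine arm of the RIFLE classification can be sketched as follows (a simplified, hedged example; the urine output criteria and the Loss and End-stage categories are omitted):

```python
# Hedged sketch of the serum creatinine arm of the RIFLE classification
# (Risk = 1.5x, Injury = 2x, Failure = 3x baseline creatinine, or creatinine
# >=4 mg/dl with an acute rise >=0.5 mg/dl). Urine output criteria and the
# Loss/End-stage categories are omitted for brevity.
def rifle_creatinine(baseline_mg_dl: float, current_mg_dl: float) -> str:
    ratio = current_mg_dl / baseline_mg_dl
    if ratio >= 3 or (current_mg_dl >= 4.0 and current_mg_dl - baseline_mg_dl >= 0.5):
        return "Failure"
    if ratio >= 2:
        return "Injury"
    if ratio >= 1.5:
        return "Risk"
    return "No AKI"

print(rifle_creatinine(1.0, 1.6))  # Risk
print(rifle_creatinine(1.0, 3.2))  # Failure
```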

Methods We performed a retrospective cohort study applying the RIFLE classification on 1,138 patients admitted to the ICU during a 2-year period.

Results According to the RIFLE classification, 376 patients (33%) had AKI during their ICU stay. When assessing the maximum RIFLE class, 209 (18.3%) patients were classified as Risk, 63 (5.5%) as Injury and 104 (6.3%) as Failure. Female sex (OR = 1.53; 95% CI = 1.18 to 1.98; P = 0.001), nonsurgical admission (OR = 1.32; 95% CI = 1.01 to 1.70; P = 0.039), APACHE II score on admission above 25 (OR = 1.99; 95% CI = 1.06 to 3.76; P = 0.033), sepsis on admission (OR = 2.20; 95% CI = 1.01 to 4.79; P = 0.046) and chronic organ dysfunction (OR = 1.65; 95% CI = 1.09 to 2.50; P = 0.017) were independent risk factors for AKI. Patients with progressively higher RIFLE classes had increased hospital mortality, with Risk 13.9%, Injury 22.2% and Failure 47%, as compared with 7.9% (P <0.001) among patients without AKI. Furthermore, the RIFLE class Failure was an independent predictor of 3-month hospital mortality (OR = 2.37; 95% CI = 1.23 to 4.54; P = 0.009) in addition to organ failure on admission (OR = 4.60; 95% CI = 2.31 to 9.18; P <0.001), use of vasopressors (OR = 2.34; 95% CI = 1.09 to 5.00; P = 0.028), and APACHE II score on admission (OR = 1.160; 95% CI = 1.11 to 1.21; P <0.001).

Conclusions Patients with increasing RIFLE class had significantly elevated hospital mortality. Maximum RIFLE class Failure was independently associated with 3-month hospital mortality. References

1. Bellomo R, et al.: Acute renal failure-definitions, outcome measures, animal models, fluid therapy and information technology needs: the Second International Consensus Conference of the Acute Dialysis Quality Initiative (ADQI) Group. Crit Care 2004, 8:R204-R212.

2. Hoste EA, et al.: RIFLE criteria for acute kidney injury are associated with hospital mortality in critically ill patients: a cohort analysis. Crit Care 2006, 10:R73-R83.

Assessment of acute kidney injury with modified RIFLE criteria in critically ill pediatric burn patients

T Palmieri, A Lavrentieva, D Greenhalgh

Shriners Hospital for Children, Sacramento, CA, USA Critical Care 2009, 13(Suppl 1):P265 (doi: 10.1186/cc7429)

Introduction The objective of the present study was to evaluate the incidence, risk factors and outcome associated with acute kidney injury (AKI) as defined by the modified pediatric version of the RIFLE criteria (pRIFLE) in children with severe burn injury.

Methods This retrospective, descriptive cohort study included 123 patients admitted for more than 24 hours to a pediatric burn ICU from 2006 to 2008. Burn injury severity was estimated using the total body surface area burned (TBSA%), and severity of illness was estimated using the Pediatric Risk of Mortality (PRISM) score. The pRIFLE criteria were applied and the patients were assigned to the appropriate pRIFLE stratum (Risk, Injury, Failure) if they fulfilled the estimated creatinine clearance criteria, the urine output criteria, or both.

Results The incidence of AKI was 40.7%; maximum pRIFLE class Risk, class Injury and class Failure occurred in 50%, 36% and 18%, respectively. Patients with maximum pRIFLE class Risk, Injury and Failure had ICU mortality rates of 0%, 5.6% and 57.1%, respectively, compared with 1.4% for patients without AKI. We observed statistically significant differences between the patients with AKI and those without AKI in the following parameters: TBSA (41.2 ± 17.7% vs. 24.2 ± 14.6%, P <0.001), admission PRISM (8.6 ± 6.4 vs. 4.8 ± 3.4, P <0.01), number of surgical procedures (3.7 ± 2.9 vs. 1.5 ± 1.5, P <0.001), occurrence of abdominal compartment syndrome (18% vs. 0%, P <0.001), length of mechanical ventilation (22.3 ± 27.6 days vs. 7.1 ± 11.4 days, P <0.001) and length of ICU stay (37 ± 30.1 days vs. 14.6 ± 13.9 days, P <0.001). Logistic regression analysis indicated that the PRISM score (OR = 1.1, 95% CI = 1.0 to 1.2; P = 0.05) and TBSA (OR = 1.06, 95% CI = 1.0 to 1.1; P <0.001) were independent risk factors for AKI in pediatric burn patients.

Conclusions AKI estimated by the pRIFLE criteria occurs in major pediatric burns, and class Failure was associated with increased mortality. Patients with AKI had higher admission burn and illness severity, an increased incidence of abdominal compartment syndrome, more operations, and an increased duration of mechanical ventilation and length of ICU stay. AKI is a marker of increased resource utilization and risk for adverse outcomes after burn injury in children.

Extracorporeal renal replacement therapy in patients with heparin-induced thrombocytopenia

M Bekers-Anchipolovskis, V Liguts, E Strike, N Porite, V Harlamovs, L Semcenko, M Daukste

P. Stradina University Hospital, Riga, Latvia

Critical Care 2009, 13(Suppl 1):P266 (doi: 10.1186/cc7430)

Introduction The aim was to evaluate the efficacy of renal replacement therapy (RRT) in patients with heparin-induced thrombocytopenia (HIT), using the alternative anticoagulant bivalirudin. A decreased platelet count (PC) is often seen after cardiopulmonary bypass, with a PC reduction of about 30% to 50% that correlates with the duration of cardiopulmonary bypass. However, only a few of these patients will develop HIT. Acute renal failure (ARF), followed by extracorporeal RRT, was seen in 90% of the patients with a critical PC reduction found in the postoperative ICU. ARF has a multifactorial etiology: baseline disorders (congestive heart failure, acute heart failure), hypoxemia, drugs used during surgery, and so on. The significance of postoperative thrombocytopenia, including HIT, makes it important to differentiate its causes and to select appropriate patient care options, including the use of alternative anticoagulants, namely bivalirudin. Bivalirudin (Angiomax™; The Medicines Company, Cambridge, MA, USA) is a synthetic hirudin analog which reversibly binds thrombin. Bivalirudin is eliminated by enzymatic proteolysis (80%) and renal clearance (20%). The serum half-life is 25 minutes in patients with normal renal function, and longer in those with renal dysfunction (3.5 hours).

Methods A retrospective study in HIT patients (clinical scoring and laboratory assays) who underwent RRT.

Results Ten patients were studied, of whom four had a low PC, thrombotic complications, RRT and a prolonged stay in the ICU (17 days on average). Two of the patients had positive tests (ranging from 0.45 to 1.62) on postoperative days 6 to 8 and 11 to 12. All of the patients had their anticoagulant switched from heparin to bivalirudin, despite implanted circulatory assist devices and RRT. Coagulation system status was evaluated by the activated partial thromboplastin time, with a target value of twice the upper normal range. The bivalirudin infusion dose ranged from 50 µg/kg/hour to 100 µg/kg/hour, depending on the activated partial thromboplastin time value. Bivalirudin was found to be at least as safe and effective an anticoagulant as heparin.

Conclusions HIT patients must be switched from heparin to an alternative anticoagulant. Bivalirudin is a safe and effective anticoagulant for extracorporeal RRT; however, the dose should be adjusted individually. Bivalirudin has a prolonged effect in ARF patients and has no antagonist.

Coagulopathy of liver disease does not increase the filter life during continuous renal replacement therapy

B Agarwal, S Shaw, M Hari

Royal Free Hospital, London, UK

Critical Care 2009, 13(Suppl 1):P267 (doi: 10.1186/cc7431)

Introduction Clotting of haemofiltration circuits is a limiting factor in achieving efficient continuous renal replacement therapy (CRRT), yet systemic anticoagulation risks haemorrhage. Some patients, such as those with liver failure, are traditionally managed with no or minimal anticoagulation, because of abnormal clotting tests and therefore an increased perceived risk of bleeding [1].

Methods We retrospectively reviewed the CRRT circuit life in three groups of liver failure patients (acute liver failure (ALF), acute on chronic liver failure (ACLF) and postelective liver transplantation (LTx)), with two control groups (systemic sepsis (SS) and haematological malignancy (Haem)), admitted to the Royal Free Hospital ICU - a tertiary referral centre for liver disease and transplantation - between 2003 and 2007. Ten consecutive patients in each of the five groups were included in the study if they had renal failure and required continuous haemofiltration (CRRT) for more than 48 hours.

Results The mean CRRT circuit life was significantly greater in the Haem group compared with the others: 28.5 ± 25.7 hours, versus 11 ± 10.5 ALF, 11.6 ± 6.6 ACLF, 7.4 ± 5.1 LTx and 9.9 ± 5.9 SS, P <0.05. The Haem group required the fewest new CRRT circuits within 48 hours (2.7 ± 1.5 versus 4.3 ± 1.3 ALF, 4.2 ± 2.1 ACLF, 5.3 ± 1.5 LTx and 4.6 ± 1.5 SS, P <0.05) and the fewest blood transfusions (1.4 ± 1.3 versus 4.8 ± 4.2 ALF, 4.2 ± 4.1 ACLF, 2.2 ± 2.1 LTx and 3.2 ± 1.2 SS). Transmembrane pressures were higher in those CRRT circuits that clotted due to the filter, compared with other causes such as access dysfunction (123 ± 74 vs. 71.8 ± 29.3 mmHg, P = 0.009). In those patients in whom anticoagulation was started because of repeated filter clotting, the CRRT circuit life improved from 5.6 ± 3.4 to 19 ± 12.7 hours, P <0.01.

Conclusions Despite abnormal standard laboratory coagulation tests and thrombocytopenia, CRRT circuits clot frequently in liver failure patients. Anticoagulation did improve CRRT circuit survival without an obvious increase in bleeding or blood transfusion requirement. Anticoagulation should therefore be considered in these patients in cases of repeated circuit clotting. Reference

1. Davenport A: CRRT in the management of patients with liver disease. Semin Dial 1996, 9:78-84.

Optimal dose of renal replacement therapy in acute kidney injury: a meta-analysis

RJ Van Wert, DC Scales, JO Friedrich, R Wald, NK Adhikari

University of Toronto, ON, Canada

Critical Care 2009, 13(Suppl 1):P268 (doi: 10.1186/cc7432)

Introduction Acute kidney injury (AKI) requiring renal replacement therapy (RRT) increases mortality in the critically ill patient. Our objective was to systematically review randomized controlled trials (RCTs) examining the effect of the RRT dose on mortality.

Methods In duplicate, we searched Medline, EMBASE and the Cochrane Central Register of Controlled Trials from inception to December 2008, using terms for all RRT modalities, AKI, and RCTs. Included RCTs compared different doses of RRT within a given modality (effluent rate in continuous RRT (CRRT), number of sessions per week in intermittent hemodialysis (IHD)) in patients with AKI, and reported mortality or dialysis dependence. We excluded RCTs that evaluated very high volume hemofiltration (continuous effluent rates >60 ml/kg/hour). Risk ratios (RRs) with 95% confidence intervals for mortality and dialysis dependence among survivors were calculated using random-effects models (RevMan 5). We investigated subgroup effects by modality (CRRT vs. IHD) and patient group (septic vs. nonseptic).

Results Of 2,913 citations, seven RCTs (n = 2,255) met the inclusion criteria. Four RCTs used CRRT (35 to 48 vs. 20 ml/kg/hour); two RCTs used IHD (daily vs. alternate day); and the largest and only multicentre RCT (n = 1,124) integrated CRRT and IHD in high-dose and standard-dose arms. RCTs were generally of good quality. Meta-analysis did not demonstrate reduced mortality with high-dose RRT (RR = 0.86, 0.70 to 1.06). The effect was similar in patients (1) treated with CRRT (RR = 0.87, 0.71 to 1.06, five trials, n = 1,552) and IHD (RR = 0.95, 0.59 to 1.51, three trials, n = 703), and (2) with sepsis (RR = 1.08, 0.96 to 1.22, n = 896) and without sepsis (RR = 0.88, 0.65 to 1.20, n = 959) in the four trials with these data. High-dose RRT did not decrease dialysis dependence among all survivors (RR = 1.09, 0.83 to 1.43, five trials, n = 774) or in the subgroup treated only with CRRT (RR = 1.27, 0.62 to 2.59, three trials, n = 233). We estimate that if the ongoing multicentre RENAL RCT (planned n = 1,500, ClinicalTrials.gov NCT00221013) of high-dose versus standard-dose CRRT demonstrates the expected absolute mortality risk reduction (from 60% to 52%), the pooled mortality benefit would just reach statistical significance.

Conclusions Current evidence does not demonstrate reduced mortality with high-dose RRT for AKI, even in the subgroup of patients with sepsis-associated AKI. Forthcoming results of the RENAL trial may contribute to more definitive results.
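The random-effects pooling underlying such risk ratios can be illustrated with a brief DerSimonian-Laird sketch in Python (hypothetical trial counts; not the RevMan 5 analysis or the data of this review):

```python
# Hedged sketch of DerSimonian-Laird random-effects pooling of risk ratios
# (hypothetical 2x2 trial counts, not the trials analysed in this abstract).
import math

# (events_high, n_high, events_std, n_std) for three hypothetical trials
trials = [(45, 100, 55, 100), (120, 300, 130, 300), (80, 200, 95, 200)]

logs, variances = [], []
for a, n1, c, n2 in trials:
    rr = (a / n1) / (c / n2)
    var = 1/a - 1/n1 + 1/c - 1/n2              # variance of log RR
    logs.append(math.log(rr)); variances.append(var)

w = [1/v for v in variances]                    # fixed-effect weights
fixed = sum(wi*yi for wi, yi in zip(w, logs)) / sum(w)
q = sum(wi*(yi - fixed)**2 for wi, yi in zip(w, logs))
tau2 = max(0.0, (q - (len(trials)-1)) / (sum(w) - sum(wi**2 for wi in w)/sum(w)))

w_re = [1/(v + tau2) for v in variances]        # random-effects weights
pooled = sum(wi*yi for wi, yi in zip(w_re, logs)) / sum(w_re)
se = math.sqrt(1/sum(w_re))
print(f"pooled RR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled-1.96*se):.2f} to {math.exp(pooled+1.96*se):.2f})")
```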

Removal of inflammatory mediators by continuous renal replacement therapy in severe sepsis

E Tomas1, E Lafuente1, B Vera1, M Fernandes1, J Silva1, F Santos1, F Moura1, P Santos1, R Lopes1, P Calder2

1Centro Hospitalar Tamega e Sousa, Penafiel, Portugal; 2Institute Human Nutrition, Southampton, UK

Critical Care 2009, 13(Suppl 1):P269 (doi: 10.1186/cc7433)

Introduction The aim of this study was to evaluate the efficacy of continuous renal replacement therapy (CRRT) in systemic inflammatory mediator removal in a group of patients with severe sepsis/septic shock and related renal failure.

Methods We conducted a prospective study, approved by the ethics committee, enrolling 11 patients with severe sepsis/septic shock under CRRT. We measured the cytokines using an immunoassay method. The cytokine measurement was performed every morning until the end of renal support. We analyzed the following data: age, severity scores (Simplified Acute Physiology Score (SAPS) II, Sequential Organ Failure Assessment (SOFA)), and cytokine (TNF, IL-1β, IL-6 and IL-10) levels in pre-filter, post-filter and ultrafiltrate samples. We correlated these data with filter duration, considering 24 hours as the maximum theoretical limit for effective cytokine removal.

Results From the 11 patients enrolled, we collected 420 valid samples to measure the cytokine levels. The mean age and SAPS II were respectively 66 ± 10.7 years and 46 ± 27.1.

Figure 1 (abstract P269)

Total cytokine levels (n = 420).

               Pre-filter         Post-filter        Ultrafiltrate
TNF (pg/ml)    6.26 ± 12.5        6.48 ± 12.1        3.25 ± 1.14
IL-6 (pg/ml)   6,977 ± 16,048     9,203.9 ± 20,115   204 ± 351
IL-1β (pg/ml)  16.94 ± 84.96      16.36 ± 82.55      1.31 ± 3.8
IL-10 (pg/ml)  109.1 ± 336        104.8 ± 330.8      11.8 ± 61.1

Figure 2 (abstract P269)

Removal of inflammatory mediators according to filter duration.

The mean total SOFA scores at admission and at day 3 were 11.2 ± 4 and 12.7 ± 4 respectively, with a ΔSOFA of +1.3 points (P >0.05). The mean ICU hospitalization time was 25.9 ± 33.5 days. A total of 876 hours of dialytic therapy was performed; 33 AN69 membranes were used, for a mean time of 26.5 ± 25.6 hours. The mean total cytokine levels are presented in Figure 1 and the levels according to filter duration in Figure 2. The ICU mortality was 54.6%.

Conclusions The present study does not allow us to define a specific pattern for the removal of inflammatory mediators by CRRT. We observed a trend towards cytokine reduction depending on filter duration, with higher removal efficacy up to 24 hours. The cytokine levels found in the effluent do not reflect the absolute reduction found in the post-filter samples, and this effect could be due to adsorption, in accordance with other studies [1,2]. References

1. Shoji H: Extracorporeal endotoxin removal for the treatment of sepsis: endotoxin adsorption cartridge (toray-myxin). Ther Apher Dial 2003, 7:108-114.

2. Kellum J, Song M, Venkataraman R: Hemoadsorption removes tumor necrosis factor, interleukin-6, and interleukin-10, reduces nuclear factor-κB DNA binding, and improves short-term survival in lethal endotoxemia. Crit Care Med 2004, 32:801-805.

Blood epuration of middle molecules in continuous venovenous hemodiafiltration with regional citrate anticoagulation versus systemic heparinization

MB Nogier, L Lavayssiere, O Cointault, M Abbal, N Kamar, B Periquet, L Rostaing, D Durand

CHU Rangueil, Toulouse, France

Critical Care 2009, 13(Suppl 1):P270 (doi: 10.1186/cc7434)

Introduction Adequate anticoagulation is a precondition for effective blood epuration. In a prospective crossover trial, we compared the removal of middle molecules during continuous renal replacement therapy by two methods that used either trisodium citrate ACD (BRAUN Laboratories) or heparin as the anticoagulant.

Methods Fourteen critically ill patients were treated by continuous venovenous hemodiafiltration. The dialysate and hemofiltration flows were 1,500 ml/hour each. All patients received two sessions of 36 hours successively. Patients 1 to 7 received systemic heparinization first, then regional anticoagulation by citrate. Conversely, Patients 8 to 14 received citrate and then heparin. Every 12 hours during each session of renal replacement (T0, T12, T24 and T36), blood epuration of small molecules (urea, creatinine) and middle molecules (β2-microglobulin, retinol binding protein) was evaluated from blood and effluent samples.

Results At all time points, there was no significant difference in creatinine and urea clearances between citrate and heparin treatment, except at T12, where creatinine clearance was significantly higher when using citrate than when using heparin (38.5 ± 4.0 ml/min vs. 34 ± 4.1 ml/min, P = 0.02). We did not observe a significant decline of urea and creatinine clearance during the session. The retinol binding protein blood and effluent levels were low and stable during the session. Between T0 and T12, results showed a significant improvement of β2-microglobulin clearance with either citrate or heparin. After T12, β2-microglobulin clearance decreased with citrate treatment and stayed stable with heparin. Clearance seemed to be higher with citrate than with heparin, however, with a significant difference only at T12 (20.1 ± 4.0 vs. 16.3 ± 3.1 ml/min, P = 0.02). During citrate treatment we recorded significant metabolic alkalosis (P = 0.0025) but without hypernatremia.

Conclusions Regional citrate anticoagulation in continuous venovenous hemodiafiltration is an efficient and safe method of anticoagulation. Its impact upon blood epuration of small or middle molecules remains to be established.

Utilization of slow low-efficiency dialysis may help to optimize the need for continuous renal replacement therapy in Indian ICUs

A Majumdar, S Basu, M Bhattacharya, M Kharbanda, P Sinha, S Todi

AMRI Hospitals, Kolkata, India

Critical Care 2009, 13(Suppl 1):P271 (doi: 10.1186/cc7435)

Introduction The aim was to study the practice pattern of using the modern modalities of renal replacement therapy (RRT) (slow low-efficiency dialysis (SLED) and continuous renal replacement therapy (CRRT)) in hemodynamically unstable critically ill patients in an Indian ICU.

Methods A retrospective observational study of hemodynamically unstable patients with acute kidney injury (AKI) who needed RRT in the ICUs of a tertiary-care hospital. All patients who underwent SLED and/or CRRT from September 2005 to April 2008 were included in the analysis. To maintain a mean arterial pressure (MAP) >70 mmHg, patients who required noradrenaline >0.5 µg/kg/minute were treated with CRRT whereas those requiring <0.5 µg/kg/minute received SLED. Depending on haemodynamic stability, patients were switched from CRRT to SLED, or vice versa.

Results From September 2005 to April 2008, 214 haemodynamically unstable AKI patients, deemed unfit for intermittent haemodialysis, underwent SLED/CRRT (continuous venovenous hemofiltration (CVVH)/continuous venovenous hemodiafiltration (CVVHDF)). Ten patients were switched to SLED after a median 48 hours of CRRT. See Figures 1 and 2.

Conclusions In our ICU, the need for RRT in hemodynamically unstable patients with AKI was significantly higher in the medical patients, the commonest cause of AKI being sepsis.

Figure 1 (abstract P271)

                                          SLED      CRRT            P value
No. of patients                           162       52
Median age, years                         66        -
Male:female ratio (%)                     61:39     66:34
Clinical classification
  Medical                                 87%       81%
  Surgical                                10.5%     15%
  Gynaecological                          1.5%      2.5%
  Paediatric                              1.0%      2.5%
Incidence of sepsis                       54%       57%
Average MAP (mmHg) at the start of RRT    85        73
Median APACHE II score                    26        23              0.59
Hospital mortality (%)                    52%       78%
Median hospital length of stay, days      12        -
Cost/day (Rupees)                         4,000     14,000-21,000
Cost/day (Euros)                          64        223-382

Patients who were equally sick (comparable APACHE II scores) could be effectively dialysed by SLED, as compared with CRRT. Hemodynamic stability was maintained in the patients on SLED, as none needed a switchover to CRRT. The patients undergoing SLED were dialysed during the daytime by the haemodialysis nurse, eliminating the need for a specialist nurse at night. After a median 48 hours of CRRT, it was possible to switch to SLED, resulting in optimal utilization of resources. SLED was much cheaper than CRRT. In a country like India, where there are often economic constraints, the judicious use of SLED will help us optimize the need for CRRT.

Observational cohort of renal replacement therapy patients at a district general hospital ICU: case mix and outcomes

C Hayes-Bradley, S Caddel, J Paddle

Royal Cornwall Hospital, Truro, UK

Critical Care 2009, 13(Suppl 1):P272 (doi: 10.1186/cc7436)

Introduction Previous studies have shown the mortality of ICU patients requiring renal replacement therapy (RRT) to be high at 62.8% [1], and underpredicted by APACHE II scoring [2]. The ICNARC Case Mix Programme gives mortality of 59.5% for 2003 to 2004. We aimed to review our patients to see how we compare. Methods We prospectively collected data on all RRT episodes on the ICU from 2005 to 2007: patient age, APACHE II score in the first 24 hours, primary indication for RRT, RIFLE classification at start of RRT, ICU and hospital length of stay (LOS), mortality at 30, 60, and 90 days, and recovery of renal function at 30 days. Results We admitted 1,557 patients over the 3-year study period, of which 18% were elective. A total of 210 patients received RRT (data were available on 208). The median age was 66 years, 56% were male, and mean APACHE II score was 25.4. Our hospital mortality for all patients receiving RRT was 50.5%. One hundred and seventy-six patients had RRT for acute kidney injury with a hospital mortality of 51.1% (23 class Risk patients, 70 class Injury, 83 class Failure). No statistical difference in hospital mortality, ICU LOS, or renal function recovery existed by RIFLE class. By APACHE II score, the standardised mortality ratio was 0.98 (1.1 in 2005, 0.97 in 2006, and 0.86 in 2007). Main indications for filtration were: acidaemia 51%, oliguria 9%, uraemia 13%, fluid overload 7%, sepsis 6%, and hyperkalaemia <1%. The average ICU LOS was 9 days for hospital survivors (IQR 5 to 21 days) and 4 days in nonsurvivors (IQR 2 to 8 days). Only three hospital survivors were not alive at 90 days. Three out of 82 followed-up patients still required dialysis at 30 days.

Conclusions Our hospital mortality compares favourably with other published work [1,3]. We found the APACHE II score to predict mortality accurately, in contrast to published work showing underprediction [2]. This may represent a better outcome in our cohort. We were unable to demonstrate a correlation between the RIFLE score at initiation of RRT and hospital mortality. This could be due to small numbers, or to an equivalence of outcome for the RIFLE classes once RRT is established. References

1. Metnitz PGH, et al.: Crit Care Med 2002, 30:2051-2058.

2. Kolhe NV, et al.: Crit Care 2008, 12(Suppl 1):S2.

3. Noble J, et al.: Anaesthesia 2001, 56:124-129.

Figure 2 (abstract P271)

                      SLED     CRRT (CVVH)   CRRT (CVVHDF)
No. of patients       162      34            18
Total no. of hours    3,394    1,848         1,344
Average hours         20.9     54.3          74.7

Effects of continuous hemofiltration on organ perfusion, energy metabolism, oxidative stress, endothelial dysfunction and inflammation

R Sykora, J Chvojka, A Krouzecky, J Radej, V Varnerova, T Karvunidis, I Novak, M Matejovic

Charles University, Medical School and Teaching Hospital, Plzen, Czech Republic

Critical Care 2009, 13(Suppl 1):P273 (doi: 10.1186/cc7437)

Introduction Continuous renal replacement therapies (CRRT) are widely used for treatment of acute kidney injury in critically ill patients. Little attention has been paid to the potential adverse effects of CRRT related to extracorporeal circuit bioincompatibility. Limited available evidence suggests that intermittent dialysis may compromise hepatosplanchnic perfusion in acute renal failure patients. By contrast, there are no data on the bio(in)compatibility indices in patients treated by CRRT. To investigate this issue, we utilized a long-term porcine model allowing a broad insight into organ hemodynamic, microvascular, metabolic and other pathways not accessible in human medicine.

Methods Measurements were performed in 11 healthy instrumented animals. After baseline measurements, animals were randomized to receive either no treatment (n = 6) or continuous venovenous hemofiltration with a polysulphone membrane (CVVH, n = 5). Further data were collected at 6 and 10 hours after randomization. At each time point the following data were collected: (1) organ perfusion (hepatosplanchnic and renal blood flows), (2) microcirculation (ileal mucosal and renal cortex laser Doppler flowmetry and sidestream darkfield imaging), (3) energy balance (arterial and regional pH, lactate/pyruvate and ketone body ratios), (4) oxidative/nitrosative stress (thiobarbituric acid reactive species, nitrates + nitrites), (5) inflammation (TNFα, IL-6) and (6) endothelial dysfunction (von Willebrand factor, asymmetric dimethylarginine).

Results Hemofiltration affected neither regional organ blood flows nor ileal mucosal and renal cortex microvascular perfusion. No changes in liver and kidney oxygen exchange and energy balance were detected during the 10-hour treatment. Similarly, CVVH did not interfere with surrogate markers of inflammation, oxidative/nitrosative stress and endothelial dysfunction.

Conclusions In our model, CVVH utilizing a modern biocompatible polysulphone membrane did not induce any significant deleterious effects in the various biological systems studied. The importance of our findings lies in the fact that any changes in the studied parameters, if observed in future studies, cannot be attributed to the effects of putative extracorporeal circuit bioincompatibility. Finally, our data support the postulated concept of good biological tolerance of CRRT.

Acknowledgement Supported by MSM 0021620819 - Replacement of and support to some vital organs.

Influence of continuous venovenous hemofiltration on transpulmonary thermodilution-derived parameters

V Neirynck, A Willems, D Peeters, N Van Regenmortel, I De laet, K Schoonheydt, H Dits, M Malbrain

ZNA Stuivenberg, Antwerp, Belgium

Critical Care 2009, 13(Suppl 1):P274 (doi: 10.1186/cc7438)

Introduction We studied the effects of continuous venovenous hemofiltration (CVVH) on transpulmonary hemodynamic parameters in nine ventilated patients [1,2].

Methods Altogether, 32 calibrations were performed with and without CVVH treatment. For each calibration three consecutive injections of 20 ml cold saline were given via the central venous line (CVL), giving a total of 186 thermodilutions.

Results Patient age was 72.3 ± 14 years, BMI 24.7 ± 3.8, SAPS II 56.2 ± 15.7. Regardless of the catheter position, CVVH increased the extravascular lung water index (EVLWi) from 11.3 ± 5 to 12.4 ± 6.3 (P = NS), while the cardiac index (CI) and global end-diastolic volume index (GEDVi) decreased from 5.1 ± 1.8 to 4.2 ± 1.3 (P = 0.03) and from 1,036 ± 298 to 885 ± 185 (P = 0.02), respectively. The results of a subanalysis comparing the correct catheter position (CVL placed in the jugular or subclavian vein and dialysis catheter placed femorally) and a faulty position (dialysis catheter positioned between the thermodilution injection and detection sites) are summarized in Table 1. In two patients the catheters were exchanged during the stay from the faulty to the correct position, and this resulted in a significant decrease in all parameters: CI dropped from 6.1 ± 0.9 to 5 ± 0.3 (P = 0.014), GEDVi from 1,253 ± 165 to 829 ± 161 (P = 0.001) and EVLWi from 16.1 ± 7.1 to 8.7 ± 1.2 (P = 0.03).

Conclusions In critically ill patients treated with CVVH, the hemodynamic parameters obtained by PiCCO transpulmonary thermodilution can be influenced: EVLWi increases while CI and GEDVi drop. We hypothesize that this may be due to the position of the CVL and dialysis catheters. References

1. Martinez-Simon A: Crit Care 2006, 10:410.

2. Sakka S, et al.: Anesth Analg 2007, 105:1079-1082.

Antibiotic dosing regimens for septic patients receiving continuous venovenous haemofiltration: do current studies supply sufficient data?

A Li, C Gomersall, G Choi, Q Tian, G Joynt, J Lipman

The Chinese University of Hong Kong, NT, Hong Kong Critical Care 2009, 13(Suppl 1):P275 (doi: 10.1186/cc7439)

Introduction The aim of this study was to establish the minimum dataset that needs to be specified when presenting pharmacokinetic data for critically ill patients with acute renal failure, and to review the current literature to establish whether this minimum dataset is indeed reported.

Table 1 (abstract P274)

Effect of CVVH on hemodynamic parameters in correct and faulty catheter positions

                 CVVH correct (n = 11)   No CVVH correct (n = 11)   P value   CVVH faulty (n = 21)   No CVVH faulty (n = 21)   P value
CI (l/min/m2)    4.1 ± 0.9               4.5 ± 0.8                  NS        4.3 ± 1.5              5.4 ± 2                   0.04
GEDVi (ml/m2)    774.8 ± 83.8            810.6 ± 124.1              NS        942.1 ± 198.4          1,155.4 ± 294.6           0.009
EVLWi (ml/kg)    7.8 ± 2.2               7.8 ± 1.5                  NS        14.8 ± 6.4             13.1 ± 5.3                NS

Figure 1 (abstract P275)

Reporting of required parameters for dose regimen calculation in the literature (percentage of studies with each specified parameter). CRRT, continuous renal replacement therapy; UF, ultrafiltrate; SA, surface area; Cl tot, total clearance; Sc/Sd, sieving or saturation coefficient; Vd, volume of distribution; Cl CRRT, continuous renal replacement therapy clearance; PB, protein binding.

Antibiotic dosing for septic patients with acute renal failure receiving continuous renal replacement therapy is complicated, and failure to correctly dose may result in either drug toxicity, or treatment failure and development of resistance [1].

Methods A dataset was established of the minimal number of parameters that need to be reported when calculating a drug-dosing regimen. Patient demographics and markers of severity were added to allow for patient population comparisons. A Medline search of the relevant literature was performed, producing 76 studies from which completeness of the dataset was examined.

Results None of the studies analysed presented the full dataset that we established as necessary (Figure 1). Of concern, basic pharmacokinetic parameters such as the volume of distribution and clearance were absent in a significant number of studies, effectively precluding calculation of a meaningful dosing regimen.

Conclusions A large proportion of current studies do not report key information necessary to devise a rational dosing regimen for patients with acute renal failure receiving continuous renal replacement therapy. We have presented a set of criteria we believe are necessary to calculate an antibiotic-dosing regimen for these patients and hope this will be a useful guide when reporting future pharmacokinetic data. Reference
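To illustrate why these parameters matter, the following hedged sketch derives loading and maintenance doses for a hypothetical drug from standard pharmacokinetic relations (CRRT clearance approximated as sieving coefficient multiplied by effluent rate); it is not a dosing recommendation and is not part of the original abstract.

```python
# Hedged pharmacokinetic sketch (hypothetical drug and values; not a dosing
# recommendation): why Vd, non-CRRT clearance and the sieving coefficient are
# needed to derive a regimen during continuous haemofiltration.
def loading_dose(target_css_mg_per_l: float, vd_l: float) -> float:
    return target_css_mg_per_l * vd_l                      # mg

def maintenance_dose(target_css_mg_per_l: float, cl_non_crrt_l_per_h: float,
                     sieving_coefficient: float, effluent_rate_l_per_h: float,
                     interval_h: float) -> float:
    cl_crrt = sieving_coefficient * effluent_rate_l_per_h  # CRRT drug clearance
    cl_total = cl_non_crrt_l_per_h + cl_crrt
    return target_css_mg_per_l * cl_total * interval_h     # mg per dosing interval

# Illustrative numbers only: Css 8 mg/l, Vd 25 l, Sc 0.8, effluent 2 l/hour.
print(loading_dose(8.0, 25.0), "mg loading dose")                                # 200.0
print(round(maintenance_dose(8.0, 1.5, 0.8, 2.0, 12), 1), "mg every 12 hours")   # 297.6
```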

1. Roberts JA, et al.: Crit Care Med 2008, 36:2433-2440.

Comparing adsorption of gentamicin by polyacrylonitrile and polyamide hemofiltration filter in an in vitro continuous venovenous hemofiltration model

P Lam, Q Tian, M Ip, C Gomersall

Chinese University of Hong Kong, NT, Hong Kong Critical Care 2009, 13(Suppl 1):P276 (doi: 10.1186/cc7440)

Introduction As a high proportion of patients who require continuous renal replacement therapy will also be receiving antibiotics, the issue of whether significant amounts of antibiotic are adsorbed by the haemofilter is relevant to critically ill patients. The aim of the study was to determine the time course of adsorption of gentamicin to a polyacrylonitrile (PAN) filter and a polyamide filter, respectively.

Methods A unit of expired whole blood was mixed with heparinized lactated Ringer's solution to make up a total volume of 1,000 ml. Five hundred milliliters of this blood-crystalloid mixture was transferred to a glass chamber, where it was agitated and heated. After the equilibration period, 20% of a standard dose of the drug was infused into the mixing chamber. The blood-crystalloid mixture was then circulated through an in vitro continuous venovenous hemofiltration model. The ultrafiltrate was returned to the mixing chamber and no replacement fluid was infused; as a result, any decrease in drug concentration could only occur due to adsorption. Samples were taken from the mixing chamber for measurement of the drug concentration. At 90 minutes, the remaining 500 ml of blood-crystalloid mixture was added to the mixing chamber, and samples were taken again after another hour. If the fall in drug concentration was less than that predicted from the dilution effect following the increase in blood-crystalloid volume, this indicated reversibility of adsorption. A second dose of drug was then added and samples were taken afterwards. Adsorption of antibiotic before and after the second dose was compared. Two types of filter, namely a 0.6 m2 PAN hemofilter (100; Hospal) and a 0.6 m2 polyamide hemofilter (Hemofilter 6S; Gambro), were used in the study, and four batches of tests were repeated for each hemofilter.
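The reversibility check described in the Methods relies on comparing the measured concentration with the value expected from dilution alone. A minimal sketch of that prediction, with purely illustrative numbers, is given below.

# Minimal sketch of the dilution check described above (illustrative values only):
# after adding drug-free volume, the concentration expected from dilution alone is
# C_pred = C_before * V_before / (V_before + V_added). If the measured concentration
# stays above C_pred, drug has been released from the filter (reversible adsorption).

def predicted_after_dilution(c_before: float, v_before_ml: float, v_added_ml: float) -> float:
    """Concentration predicted from dilution alone, assuming the added volume is drug free."""
    return c_before * v_before_ml / (v_before_ml + v_added_ml)

# Hypothetical example: 6.0 mg/L in 500 ml before the second 500 ml is added.
c_pred = predicted_after_dilution(6.0, 500.0, 500.0)   # -> 3.0 mg/L
print("Predicted concentration after dilution (mg/L):", c_pred)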

Results Drug adsorption by PAN hemofilters was significant, whereas drug adsorption by polyamide hemofilters was much lower than that by PAN hemofilters. Gentamicin adsorption by PAN hemofilters was complete by 30 minutes and was irreversible.

Conclusions The adsorption properties of gentamicin by the two hemofilters were markedly different.


Slow continuous ultrafiltration: how much fluid must be taken out?

G Guiotto, S Gligorova, F Paladino, F Schiraldi, S Verde

San Paolo Hospital, Napoli, Italy

Critical Care 2009, 13(Suppl 1):P277 (doi: 10.1186/cc7441)

Introduction Ultrafiltration (UF) is effective and safe in treating volume-overloaded patients with acute decompensated heart failure (ADHF) and diuretic resistance [1]. Accurate determination of the amount of fluid to be removed and maintenance of the circulating blood volume are critically important [2]. The inferior vena cava diameter (IVCD) and its collapsibility index (IVCCI) are compromised in ADHF patients due to high right atrial pressure [3]. We hypothesized that monitoring of IVCCI could be used to optimize fluid removal rate during UF.

Methods Twenty patients (nine male, 11 female; age 76 ± 4 years, New York Heart Association classes III to IV) admitted to our medical ICU for ADHF were treated with UF (Aquadex System 100; CHF Solutions, Minneapolis, MN, USA). The heart rate (HR), mean arterial pressure (MAP) and IVCD with M-mode subcostal echocardiography during spontaneous breathing were evaluated before UF (T0), at 12 hours (T1) and at 24 hours (T2). The IVCCI was calculated as follows: [(IVCDmax - IVCDmin) / IVCDmax] x 100.

Results The mean UF time was 25.5 ± 5 hours with a mean ultrafiltration rate of 259 ml/hour and a total ultrafiltrate production of 6.6 ± 2 l. Differences between the T0 and T2 parameters are presented in Table 1. Hypotension was observed only in those patients (2/20) who reached IVCCI >35%. In all the other patients a significant increase in IVCCI was obtained without any hemodynamic instability.
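The collapsibility index quoted in the Methods is a simple ratio; a minimal sketch of the calculation, with illustrative diameters rather than study data, is shown below.

# Minimal sketch of the IVCCI calculation quoted in the Methods
# (illustrative diameters only, not study data).

def ivcci_percent(ivcd_max_mm: float, ivcd_min_mm: float) -> float:
    """Inferior vena cava collapsibility index: [(IVCDmax - IVCDmin) / IVCDmax] x 100."""
    return (ivcd_max_mm - ivcd_min_mm) / ivcd_max_mm * 100.0

# Example: maximum diameter 30 mm, minimum diameter 28 mm -> IVCCI of about 6.7%.
print(round(ivcci_percent(30.0, 28.0), 1))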

Table 1 (abstract P277)


T0 T2 P value

MAP (mmHg) 85 ± 9 86 ± 12 NS

HR (bpm) 85 ± 6 85 ± 14 NS

IVCD (mm) 30.2 ± 6.3 27.3 ± 5.9 <0.05

IVCCI (%) 6.5 ± 1.7 33.3 ± 4.6 <0.001

Conclusions Inferior vena cava ultrasound is a rapid, simple and noninvasive means for bedside monitoring of the intravascular volume during UF. References

1. Costanzo RM, et al.: Ultrafiltration versus intravenous diuretics for patients hospitalized for acute decompensated heart failure. J Am Coll Cardiol 2007, 49:675-683.

2. Ronco C, et al.: Extracorporeal ultrafiltration for the treatment of overhydration and congestive heart failure. Cardiology 2001, 96:155-168.

3. Blehar DJ, et al.: Identification of congestive heart failure via respiratory variation of inferior vena cava diameter. Am J Emerg Med 2009, 27:71-75.

Plasmatic cytokines and intermittent hemodialysis with polymethylmethacrylate membrane in septic shock patients

N Mayeur, L Lavayssiere, MB Nogier, O Cointault, O Fourcade, L Rostaing

CHU Rangueil, Toulouse, France

Critical Care 2009, 13(Suppl 1):P278 (doi: 10.1186/cc7442)

Introduction Sepsis is mediated by many biologically active inflammatory mediators, including interleukins. IL-6, IL-8, and IL-10

Figure 1 (abstract P278)

Relative IL-8 concentration versus baseline. *P <0.05.

Figure 2 (abstract P278)

Relative IL-10 concentration versus baseline. *P <0.05.

are correlated with increased mortality in septic shock acute renal failure (ARF) [1]. ARF treatment requires renal replacement therapy (RRT). The plasma cytokine level during and after hemodialysis (HD) in septic ARF is only partially described [2]. Polymethylmethacrylate (PMMA) hemodialyser membranes have a high adsorptive capacity [3]. In this prospective observational trial, we studied the plasma levels of IL-6, IL-8 and IL-10 during and after the first HD session with a PMMA membrane in septic shock patients with ARF.

Methods Inclusion criteria: patients with septic shock <24 hours as defined by the American College of Chest Physicians/Society of Critical Care Medicine and requiring RRT (Injury in the RIFLE criteria). The PMMA hemodialyser membrane was the Filtrizer BK-1,6 F (Toray Industries, Tokyo, Japan). Data and blood samples were collected at: start of HD (D0), every hour during HD (D1; D2), at the end of HD (endD); and 30, 60, 90, 120 and 180 minutes after HD (postD0.5; postD1; postD1.5; postD2; postD3, respectively). Solid-phase ELISA was used for cytokine measurements. Statistical analysis was by the Kruskal-Wallis nonparametric test.

Results Ten patients were included. At D0: Sequential Organ Failure Assessment score 14.6 ± 0.8 and IGS 2 (Simplified Acute Physiology Score II) 79.11 ± 4.73. At D0, IL-6, IL-8 and IL-10

concentration values were 767 ± 191.2, 724.4 ± 191.7 and 168.5 ± 50.44 pg/ml, respectively. Relative serum IL-8 and IL-10 concentrations versus D0 are shown in Figures 1 and 2 (mean ± SEM). The urea reduction between D0 and endD was 48.5%. The norepinephrine rate and mean arterial pressure did not change between D0 and endD (0.65 ± 0.12 vs. 0.57 ± 0.12 µg/kg/min, and 76.40 ± 4.554 vs. 83.60 ± 4.349 mmHg, respectively; P = NS).

Conclusions PMMA membranes showed transient efficiency in IL-8 and IL-10 elimination, possibly owing to membrane saturation. The IL-6 concentration was not modified. Three hours after HD, the IL-8 and IL-10 concentrations were back to baseline. This rapid return could be explained by a plasma rebound and must be kept in mind, as this rebound could be deleterious at this stage of sepsis.

References

1. Oberholzer A, et al.: Plasma cytokine measurements augment prognostic scores as indicators of outcome in patients with severe sepsis. Shock 2005, 23:488-493.

2. Haase M, et al.: Hemodialysis membrane with a high-molecular-weight cutoff and cytokine levels in sepsis complicated by acute renal failure: a phase 1 randomized trial. Am J Kidney Dis 2007, 50:296-304.

3. Hirasawa H, et al.: Continuous hemofiltration with cytokine-adsorbing hemofilter in the treatment of severe sepsis and septic shock. Contrib Nephrol 2007, 156:365-370.
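The Results above normalise cytokine concentrations to the pre-dialysis baseline and report the urea reduction between D0 and the end of dialysis. A minimal sketch of these two calculations is given below; the input values are hypothetical and chosen only to show the arithmetic.

# Minimal sketch (hypothetical values) of the two normalisations used above:
# relative cytokine concentration versus the pre-dialysis baseline (D0 = 100%)
# and the urea reduction ratio between D0 and the end of the session.

def relative_to_baseline_percent(value: float, baseline: float) -> float:
    """Concentration expressed as a percentage of the D0 baseline."""
    return value / baseline * 100.0

def urea_reduction_percent(urea_pre: float, urea_post: float) -> float:
    """Urea reduction ratio: (pre - post) / pre x 100."""
    return (urea_pre - urea_post) / urea_pre * 100.0

# Hypothetical example: IL-8 falls from 724 to 430 pg/ml during the session,
# urea falls from 20 to 10.3 mmol/l (a reduction of 48.5%).
print(round(relative_to_baseline_percent(430.0, 724.0), 1), "% of baseline IL-8")
print(round(urea_reduction_percent(20.0, 10.3), 1), "% urea reduction")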

Immunoadsorption in dilated cardiomyopathy

Y Wakabayashi1, A Baba1, M Akaishi1, T Yoshikawa2, T Monkawa2

1Kitasato Institute Hospital, Tokyo, Japan; 2Keio University School of Medicine, Tokyo, Japan

Critical Care 2009, 13(Suppl 1):P279 (doi: 10.1186/cc7443)

Introduction Removal of cardiodepressant autoantibodies by immunoadsorption (IA) has been reported to induce early haemodynamic improvement in patients with dilated cardiomyopathy (DCM). The Immusorba TR-350 (Asahikasei-Kuraray Medical Co. Ltd, Japan) (TR) is an IA column currently used for myasthenia gravis or Guillain-Barre syndrome. This column, in which tryptophan is immobilized as a ligand, has high affinity for IgG subclass 3; with this property, immunoglobulin substitution is not usually required after IA treatment. Since cardiodepressant antibodies belong to IgG subclass 3, we investigated the effect of IA using this column on cardiac function in patients with DCM.

Methods Seventeen DCM patients (left ventricular ejection fraction (LVEF) <30%) participated in the study. IA was conducted every other day, three to five times. Blood was drawn at a rate of 80 to 100 ml/minute by direct venipuncture or from a blood access catheter and was first passed through the plasma-separating column. The separated plasma was then passed through the TR at a rate of 20 ml/minute. A total of 1,500 to 2,000 ml plasma was processed per session. Either heparin or nafamostat mesylate was used as the anticoagulant. The LVEF was measured by quantitative gated single photon emission computed tomography. The cardiodepressant antibodies were assayed ex vivo [1]. The β1-adrenergic and muscarinic M2-acetylcholine receptor antibodies were measured by ELISA.

Results After the three to five sessions of IA treatment, cardiodepressant antibodies were almost completely cleared from the circulation. Three months after the IA sessions, the LVEF increased significantly from 18.7 ± 2.3 to 23.2 ± 2.6% (P <0.05). The average increase was 33.1 ± 12.6% from baseline. Limited to the 10 patients who initially had a high titer of

cardiodepressant antibodies, the increase in LVEF was greater (54.5 ± 16.8% from baseline). The cardiodepressant antibodies were no longer detected at this time point, except in one patient. All patients tolerated IA without any complication, and no cardiac event or death occurred over the 3-month period.

Conclusions IA treatment with the TR column may improve left ventricular function in patients with DCM, particularly in those with a high titer of cardiodepressant antibodies. Further long-term follow-up is, however, required before confirming its efficacy.

Reference

1. Baba A: Autoantigen estimation and simple screening assay against cardiodepressant autoantibodies in patients with dilated cardiomyopathy. Ther Apher Dial 2008, 12:109-116.

Lipopolysaccharide adsorber in abdominal septic shock

T Ala-Kokko, J Koskenkari, J Laurila

Oulu University Hospital, Oulu, Finland

Critical Care 2009, 13(Suppl 1):P280 (doi: 10.1186/cc7444)

Introduction Polymyxin-B hemoperfusion has been shown to lower mortality in sepsis [1]. The effects of a new endotoxin adsorber (Alteco LPS Adsorber; Alteco Medical AB, Lund, Sweden) on the length of noradrenaline (NA) treatment and lipopolysaccharide blood levels in abdominal septic shock were evaluated.

Methods Following consent, a 2-hour hemoperfusion with the LPS adsorber was begun in five patients [2]. Sepsis guidelines were followed [3]. Two historical controls per case were selected.

Results The mean total duration of NA infusion was 46 hours shorter in the adsorber group compared with the control group (95% CI = -104 hours to 12 hours, P = 0.165) (Table 1). The average length of NA infusion was 17.4 ± 6.8 hours (5.8 to 23.8 hours) following the start of adsorption treatment. The level of LPS decreased in all but one study patient, and all were without NA at 24 hours. The mean Sequential Organ Failure Assessment decrease was 3.4 ± 1.7 from baseline to 24 hours post treatment. The average length of hospital stay was 3.4 days shorter in the adsorber group (95% CI of the difference, -21.7 to 14.8 days, P = 0.881). All study patients were alive on day 28 and one control died in the hospital.

Conclusions Single 2-hour LPS hemoperfusion was associated with a rapid decrease in NA dose, reversal of septic shock, and decrease in organ dysfunctions and LPS concentrations. The total duration of NA infusion and hospital stay were shorter compared with historical controls, but the difference was not statistically significant in this small study. References

1. Cruz D, et al.: Effectiveness of polymyxin B-immobilized fiber column in sepsis: a systematic review. Crit Care 2007, 11:R47.

Table 1 (abstract P280)

Clinical characteristics of the study patients and the controls

Patients (n = 5) Controls (n = 10) P value

Age 64 ± 16 65 ± 13 NS

Male/female 2/3 4/6 NS

Simplified Acute Physiology Score II 38 ± 11 46 ± 11 NS

Sequential Organ Failure Assessment 9.6 ± 2 7.7 ± 2 NS

ICU length of stay 6.2 ± 3 6.4 ± 3 NS

Hospital length of stay 23 ± 13 27 ± 17 NS

NA infusion (hours) 34 ± 12 81 ± 77 NS

Data presented as mean ± SD.

2. Levy MM, et al.: 2001 SCCM/ESICM/ACCP/ATS/SIS International Sepsis Definitions Conference. Crit Care Med 2001, 31:1250-1256.

3. Dellinger RP, et al.: Surviving Sepsis Campaign: international guidelines for management of severe sepsis and septic shock: 2008. Crit Care Med 2008, 36:296-327.

Experience with the use of selective sorbents in complex intensive care of sepsis in patients after cardiac surgery

M Yaroustovsky, M Abramyan, Z Popok, E Nazarova, D Popov, O Stupchenko, M Plushch

Centre Cardiovascular Surgery, Moscow, Russian Federation Critical Care 2009, 13(Suppl 1):P281 (doi: 10.1186/cc7445)

Introduction The aim of the study was to evaluate the first experience with endotoxin adsorption in the complex intensive care of critically ill patients with sepsis after heart surgery.

Methods Eleven adult patients were studied and divided into two groups: patients undergoing Alteco LPS adsorption procedures (n = 6), and patients undergoing hemoperfusion using Toraymyxin columns (n = 5). Intensive therapy of sepsis in both groups included antibacterial therapy, hemodynamic and respiratory support, prevention of thromboembolism, stress ulcer prophylaxis, nutritive support, and the studied extracorporeal techniques for endotoxin removal (two procedures for every patient).

Results Endotoxin adsorption procedures were started on average at day 8 after surgery in patients with systemic inflammatory response syndrome and confirmed Gram-negative infection. As a result of the procedures, hemodynamic indices improved in both groups while the dose of inotropic support decreased. We also noted an improvement in oxygenating lung function. Positive dynamics of procalcitonin, endotoxin and inflammatory mediator concentrations were noted. A favorable influence of the studied procedures on the course of the infective process was confirmed by the dynamics of the leukocyte count and a tendency towards normalization of body temperature. Blood cultures taken several days after the procedures were negative. There was no deterioration of hemodynamic indices during or after the procedures. No cases of thrombosis of the extracorporeal circuit were noted. Data are presented in Figure 1 as mean ± SEM.

Conclusions Both studied procedures are safe. Clinical studies should be continued to define the place of endotoxin adsorption in complex treatment of critically ill patients with sepsis.

Figure 1 (abstract P281)

Parameters: Alteco LPS adsorption (before / after); DHP-PMX (before / after)

Heart rate (bpm): 96.3 ± 14.7 / 100.8 ± 17.5; 79.2 ± 17.6 / 87.0 ± 12.8

MAP (mmHg): 74.5 ± 9.7 / 90.8 ± 16.1; 73.2 ± 14.5 / 87.2 ± 10.2

Body temperature (°C): 38.0 ± 0.8 / 37.3 ± 0.5; 37.9 ± 0.7 / 36.7 ± 0.5

PaO2/FiO2: 206 ± 65.7 / 259.3 ± 49; 231.7 ± 34.2 / 242.2 ± 22.9

WBC (10^9/l): 17.8 ± 6.1 / 12.4 ± 3.5; 35.5 ± 6.3 / 15.4 ± 5.4

PLT (10^9/l): 97.3 ± 39.6 / 108.2 ± 44.7; 134.2 ± 85.4 / 142.0 ± 74.1

Endotoxin (EU/ml): 1.22 ± 0.57 / 0.29 ± 0.15; 1.44 ± 1.18 / 0.18 ± 0.36

PCT (ng/ml): 15.27 ± 13.4 / 2.15 ± 1.22; 36.0 ± 70.3 / 21.0 ± 18.2

DHP-PMX, hemoperfusion with the use of Toraymyxin columns; MAP, mean arterial pressure; WBC, white blood cells; PLT, platelets; PCT, procalcitonin.

Best choice of acute blood purification therapy based on the severity score and blood lactic acid values in septic shock patients

Y Sakamoto1, K Mashiko1, H Matsumoto1, Y Hara1, N Kutsukata1, N Saito1, H Yokota2

1Chiba Hokusou Hospital, Nippon Medical School, Chiba, Japan;

2Nippon Medical School, Tokyo, Japan

Critical Care 2009, 13(Suppl 1):P282 (doi: 10.1186/cc7446)

Introduction Septic shock is a condition associated with diffuse coagulopathy and multiple organ failure, and frequently leads to death. Direct hemoperfusion using a polymyxin B-immobilized fiber column (DHP-PMX) has been used for the treatment of septic shock [1]. Other forms of acute blood purification therapy also exist, but the optimal column for continuous venovenous hemodiafiltration (CVVHDF) remains controversial.

Methods We treated 88 septic shock patients by DHP-PMX. The patients were divided into two groups based on survival outcome and on the improvement in circulatory dynamics immediately after DHP-PMX (Group A: increase of systolic blood pressure (SBP) by more than 30 mmHg (44 cases); Group B: increase of SBP by 30 mmHg or less (44 cases)). In another analysis, conducted to determine the best choice of acute blood purification therapy after DHP-PMX, the patients were divided into three groups: those undergoing CVVHDF using a polymethylmethacrylate membrane hemofilter (PMMA) (28 cases), those undergoing CVVHDF using a polyacrylonitrile membrane hemofilter (26 cases), and those receiving no CVVHDF after DHP-PMX (34 cases).

Results There were 48 survivors and 40 nonsurvivors. The overall survival rate was 54.5% (a good outcome judging from the APACHE II score), and outcome was significantly related to the APACHE II score, the Sepsis-Related Organ Failure Assessment score and the blood lactic acid value before treatment (P <0.0001). The rate of improvement in blood pressure (an increase by more than 30 mmHg) was 50.0%, and the blood lactic acid level was significantly lower in Group A. In the second analysis, only the PMMA CVVHDF group showed a better outcome (survival rate of 78.6%) compared with the other groups (P = 0.0190). In addition, only the PMMA CVVHDF group showed significant improvement in blood lactic acid on day 3 (P = 0.0011).

Conclusions Our study suggests that DHP-PMX treatment was effective in the early phase of septic shock, before a critical increase in blood lactic acid levels. The optimal column for CVVHDF following DHP-PMX treatment, as determined by improvement in blood lactic acid levels, is the PMMA column.

Reference

1. Cruz DN, Perazella MA, Bellomo R, et al.: Effectiveness of polymyxin B-immobilized fiber column in sepsis: a systematic review. Crit Care 2007, 11:R47.

Clinical effects of polymyxin B immobilised fiber with direct hemoperfusion in patients with severe sepsis or septic shock caused by intra-abdominal infections

T Ikeda1, K Ikeda1, H Taniuchi1, S Suda1, Y Takahashi2

1Tokyo Medical University, Hachioji Medical Center, Tokyo, Japan;

2Sannoudai Hospital, Ibaragi, Japan

Critical Care 2009, 13(Suppl 1):P283 (doi: 10.1186/cc7447)

Introduction The endotoxin adsorption method polymyxin B immobilised fiber with direct hemoperfusion (PMX-DHP) has been used for treatment of patients with severe sepsis and septic shock primarily caused by Gram-negative infections in Japan [1]. One

hundred and twenty-six septic patients who had severe sepsis or septic shock due to intra-abdominal infections were treated with PMX-DHP.

Methods These patients were separated into two groups: those who survived for at least 28 days after the start of PMX-DHP therapy (survival: 85 cases) and those who did not (nonsurvival: 41 cases). Background factors and inflammatory mediators were examined in each group. The effect of PMX-DHP was assessed using changes in clinical parameters (heart rate, systolic arterial pressure, mean arterial pressure, PaO2/FiO2) and various cytokines (TNFα, IL-6, IL-8, IL-1ra, PAI-1). Sepsis was diagnosed according to the criteria of the American College of Chest Physicians/Critical Care Medicine Consensus Conference Committee.

Results A total of 91.7% of survival cases were treated primarily with a surgical procedure. Among the demographic data, only Goris's multiple organ failure score showed a significant difference between the groups (survival cases: 5.8 ± 2.8, nonsurvival cases: 8.0 ± 2.6, P <0.05). Procalcitonin (PCT) before PMX-DHP in all patients was 59.1 ± 97.2 ng/ml and tended to decrease, to 54.7 ± 81.7 ng/ml after PMX-DHP. PCT was 67.3 ± 109.6 ng/ml before PMX-DHP and significantly decreased to 54.7 ± 82.1 ng/ml immediately after PMX-DHP in the survival group, but it did not change significantly in the nonsurvival group. There was a significant correlation between endotoxin and PCT (r = 0.527, P <0.001).

Conclusions Our results may suggest that PMX-DHP for patients with severe sepsis or septic shock caused by intra-abdominal infections can improve hemodynamic changes and pulmonary oxygenation, and can also reduce systemic inflammatory cytokines and serum PCT in the survival group.

Reference

1. Ikeda T, Ikeda K, Taniuchi H, et al.: Clinical evaluation of PMX-DHP for hypercytokinemia caused by septic multiple organ failure. Ther Apher Dial 2004, 8:293-298.

Improvement of haemodynamic and respiratory parameters during coupled plasma filtration and adsorption correlates with the clearance of inflammatory mediators

F Turani1,2, G Lanini1, C Alessandrini1, F Paoletti1, M Falco1, GV Stazzi1, GD Tebala2

1European Hospital, Rome, Italy; 2Aurelia Hospital, Rome, Italy Critical Care 2009, 13(Suppl 1):P284 (doi: 10.1186/cc7448)

Introduction Sepsis is the leading cause of mortality in intensive care, but data on new therapies are inconclusive. Coupled plasma filtration adsorption (CPFA) is a new extracorporeal technology for septic shock and may improve haemodynamics, respiratory function and mortality. The aim of this study was to evaluate the haemodynamic response, respiratory function and the reduction of inflammatory markers during CPFA.

Methods Eighteen septic patients were enrolled in this study within 8 hours of the diagnosis of sepsis. Every patient had three CPFA treatments of 8 hours with Q blood = 200 ml/hour, Q ultrafiltration = 30 ml/kg/hour and Q plasma = 20% of Q blood. At T0 (basal), T1 (after the first cycle), T2 (after the second cycle), T3 (after the third cycle) and T4 (after 72 hours) we evaluated haemodynamic parameters, norepinephrine dosage, the PaO2/FiO2 ratio, plasma IL-6 and procalcitonin. All data are expressed as the mean ± SD. ANOVA was used to compare changes over the times of study. P <0.05 was considered statistically significant.

Results Table 1 presents the main results of the study.

Conclusions In contrast with a recent experimental study in septic pigs [1], data from this study confirm that CPFA improves

Table 1 (abstract P284)

T0 T4 P value

Mean arterial pressure (mmHg) 67 ± 5 83 ± 7 <0.01

Norepinephrine (µg/kg/min) 0.23 ± 0.1 0.04 ± 0.08 <0.001

PaO2/FiO2 195 ± 15 268 ± 25 <0.05

IL-6 (pg/ml) 409 ± 25 114 ± 110 0.10

Procalcitonin (pg/ml) 40 ± 15 5 ± 2 <0.001

haemodynamics during septic shock. This improvement may be related to the reduction of IL-6 and, above all, of procalcitonin. Procalcitonin clearance during CPFA may have a role in the improvement of shock-related vasoparalysis, as calcitonin receptor family complexes have recently been implicated in the pathogenesis of sepsis.

Reference

1. Sykora R, et al.: Coupled plasma filtration adsorption in peritonitis induced septic shock. Shock 2008 [Epub ahead of print]
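For readers wanting to translate the CPFA prescription in the Methods (Q ultrafiltration = 30 ml/kg/hour, Q plasma = 20% of Q blood) into per-patient settings, a minimal sketch follows. The patient weight and blood flow are illustrative assumptions; the flows are simply the quoted proportions applied to those inputs.

# Minimal sketch of the CPFA prescription arithmetic quoted in the Methods
# (Q ultrafiltration = 30 ml/kg/hour, Q plasma = 20% of Q blood); the patient
# weight and blood flow below are illustrative assumptions, not study data.

def cpfa_settings(weight_kg: float, q_blood_ml_per_h: float):
    """Return (ultrafiltration rate in ml/h, plasma flow in ml/h) for one patient."""
    q_uf = 30.0 * weight_kg             # 30 ml/kg/hour
    q_plasma = 0.20 * q_blood_ml_per_h  # 20% of blood flow
    return q_uf, q_plasma

q_uf, q_plasma = cpfa_settings(weight_kg=75.0, q_blood_ml_per_h=200.0)
print("Ultrafiltration rate (ml/h):", q_uf)   # 2250 ml/h for a 75 kg patient
print("Plasma flow (ml/h):", q_plasma)        # 40 ml/h at the quoted blood flow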

Improvement of the APACHE II score in the early phase using CTR-001 direct hemoperfusion in patients with severe sepsis and septic shock

Y Suzuki, N Sato, M Kojika, T Kikkawa, T Shouzushima, S Endo

Iwate Medical University, Morioka, Japan

Critical Care 2009, 13(Suppl 1):P285 (doi: 10.1186/cc7449)

Introduction We reported the clinical efficacy of a newly developed cytokine adsorption column (CTR-001; Kaneka Co., Osaka, Japan) for septic patients at a previous meeting. In particular, an increase in blood pressure was remarkable during use of CTR-001 in septic shock patients. In this study, we investigated its clinical efficacy with respect to improvement in the APACHE II and SOFA scores.

Methods A prospective, randomized, controlled clinical trial was performed. The newly developed column contains microporous cellulose beads with a hexadecyl alkyl chain as the ligand. Eighteen patients with early septic shock or septic organ dysfunction were enrolled. Nine of the 18 were randomized to direct hemoperfusion (DHP). All patients received supportive intensive care, and those randomized to DHP received direct hemoperfusion for 4 hours, at least twice and up to 14 times over 14 days. We measured the plasma concentrations of IL-6, IL-8, IL-1β and TNFα. The APACHE II score and SOFA score were evaluated for each patient on the 1st, 7th, 14th and 28th days after starting treatment, in the morning before treatment.

Results The decrease in APACHE II score from the pretreatment level at the 7th day was significantly larger in the CTR-001 treatment group than in the control group (P = 0.0189; Mann-Whitney test). On the other hand, there was no significant change in the SOFA score at the 7th day. Adsorption column-related serious adverse events were not observed in the DHP group. The plasma concentrations of IL-6 and IL-8 decreased significantly from the pretreatment level in the DHP group (P = 0.0464 and P = 0.0464, respectively; Wilcoxon test).

Conclusions The newly developed direct hemoperfusion column improved septic shock better than ordinary supportive intensive care alone. This new cytokine-removal column may play an important role in the treatment of patients with septic shock.

Beneficial effects of early hemoperfusion with a polymyxin B fibre column on septic shock

N Takeyama, H Noguchi, K Morino, T Obata, T Sakamoto, F Tamai, H Ishikura, Y Kase, M Kobayashi, Y Takahashi

Japan Sepsis Study Group, Nagakute-cho, Japan

Critical Care 2009, 13(Suppl 1):P286 (doi: 10.1186/cc7450)

Introduction The aim was to verify the hypothesis that extracorporeal therapy with a polymyxin B (PMX) fibre column may prevent septic shock-induced organ dysfunction and that early treatment of septic shock with PMX hemoperfusion may improve patient outcome.

Methods One hundred and sixteen patients with septic shock who were admitted to the ICU of 35 hospitals were enrolled in this study from April 2006 through March 2008. PMX treatment was performed immediately after septic shock was diagnosed. All patients were followed up for 28 days after enrollment in the study, and 28-day mortality was assessed. Arachidonylethanolamide, 2-arachidonoylglycerol, IL-6, lipopolysaccharide, lactic acid, and the Sepsis-related Organ Failure Assessment score were determined before PMX treatment and at 24, 72, and 168 hours after the treatment.

Results At the end of PMX treatment, the mean arterial pressure and plasma HCO3- were significantly increased (P <0.01), and the plasma level of lactate was significantly decreased (P <0.05). Seven days after treatment, the PaO2/FiO2 ratio was significantly increased (P <0.05), and the plasma levels of creatinine and arachidonylethanolamide were significantly decreased (P <0.05). Of the 57 surviving patients, 24 were treated with PMX within 6 hours after the diagnosis of septic shock (early group) and 33 were not treated within 6 hours (late group). There was no significant difference between the early and late groups in 28-day mortality (41.5% and 37.7%, respectively; P = NS), whereas serum creatinine and the PaO2/FiO2 ratio were significantly improved (P <0.05). The ICU stay was shorter in the early group than in the late group (10.6 vs. 16.4 days; P = 0.088).

Conclusions Hemoperfusion with PMX was a safe and effective treatment for improvement of hypotension and hypoperfusion in septic shock patients. PMX treatment within 6 hours after diagnosis of septic shock would be beneficial with respect to oxygenation and renal function.

References

1. Vincent JL, et al.: Shock 2005, 23:400-405.

2. Shoji H: Ther Apher Dial 2003, 7:108-114.

3. Nakamura T, et al.: ASAIO J 2004, 50:563-567.

4. Enomoto N, et al.: Respirology 2008, 13:452-460.

5. Kase Y, et al.: Ther Apher Dial 2008, 12:372-378.

Observational study for direct hemoperfusion therapy with polymyxin-B immobilized fibers (PMX-DHP) in patients with septic shock in Japan: PMX-DHP study group

Y Kase1, T Obata1, Y Takahashi2

1Jikei University School of Medicine, Tokyo, Japan; 2Showa University School of Medicine, Tokyo, Japan

Critical Care 2009, 13(Suppl 1):P287 (doi: 10.1186/cc7451)

Introduction Direct hemoperfusion therapy with polymyxin-B immobilized fibers (PMX-DHP) has been widely applied as the therapeutic method for the patients with septic shock in Japan since 1994. Optimal usage of PMX-DHP for the patients with

septic shock remains a matter of debate, mainly because of a lack of adequately designed clinical trials.

Methods The observational study, carried out in 21 hospitals in Japan, was designed to investigate the effects of PMX-DHP in patients with septic shock. A total of 36 patients with septic shock were enrolled between April 2004 and January 2005.

Results Forty-four percent (16/36) had acute peritonitis, which required laparotomy and lavage of the peritoneum before PMX-DHP. In the other 56% (20/36), the site of infection could not be removed (for example, urinary tract infection, respiratory infection or infection of unknown source). In-hospital death rates were 6% (1/16) in patients with acute peritonitis and 55% (11/20) in those without acute peritonitis (P = 0.0446). PMX-DHP was started early, within 24 hours of the onset of septic shock, in 61% (22/36). Among those 22 patients, the rate of achieving a mean arterial pressure of more than 65 mmHg within 6 hours of the onset of septic shock was 63% (14/22), and 18% (4/22) achieved 65 mmHg within 24 hours.

Conclusions In this group of septic patients, the better survival of patients with acute peritonitis was associated with PMX-DHP. Early initiation of PMX-DHP also contributed to recovery of an adequate arterial blood pressure.

Investigation of type II phospholipase A2 values and eicosanoid values during polymyxin-B-immobilized fiber direct hemoperfusion of septic shock patients

T Shouzushima, Y Suzuki, G Takahashi, S Shibata, N Sato, S Endo

Iwate Medical University, Morioka, Japan

Critical Care 2009, 13(Suppl 1):P288 (doi: 10.1186/cc7452)

Introduction Type II phospholipase A2 (type II PLA2) is a rate-limiting enzyme in the eicosanoid cascade. We investigated type II PLA2 and eicosanoids during polymyxin-B-immobilized fiber direct hemoperfusion (PMX-DHP) of septic shock patients. Methods A highly sensitive nephelometry method was used to measure endotoxin. The 11 patients all had positive endotoxin levels (>1.1 ng/ml). Their mean age was 69 years (range: 43 to 85 years). The mean APACHE II score was 32 points, and the mean SOFA score was 14 points. PMX-DHP was performed twice in every patient.

Results The endotoxin level of every patient converted to negative after two PMX-DHP sessions, and the patients rapidly recovered from shock. The patients' TNFα values decreased significantly (from 296 to 133 pg/ml) in response to the two PMX-DHP sessions, and their LTB4 values (from 114 to 77 pg/ml), PGF1α values (from 32 to 19 pg/ml) and TXB2 values (from 79 to 57 pg/ml) also all decreased significantly. The 28-day mortality rate was 9%, and the 180-day mortality rate was 18%.

Conclusions The results suggested that type II PLA2 produced in response to stimulation (for example, endotoxins or proinflam-matory cytokines) may further activate the arachidonic acid cascade, induce the production of various lipid mediators, and be involved in the formation of various pathological conditions. It seemed that production of these humoral factors decreased as a result of suppression of the inflammatory response by performing PMX-DHP and that their suppression was linked to the improvement in pathology.

Lipid mediator adsorption with dialyser membranes in patients with septic shock

Y Kase1, Y Sakamoto2, T Obata1

1Jikei University School of Medicine, Tokyo, Japan; 2Nippon Medical School, Chiba, Japan

Critical Care 2009, 13(Suppl 1):P289 (doi: 10.1186/cc7453)

Introduction Several studies have shown the effectiveness of a nonrenal indication of continuous renal replacement therapy (CRRT) or of endotoxin-adsorbing fibers (direct hemoperfusion therapy with polymyxin-B immobilized fibers (PMX-DHP)) in improving unstable cardiovascular status in patients with septic shock. This nonrenal benefit of CRRT might result from the adsorption of various bioactive lipid mediators by the dialyser membrane. PMX-DHP is also known to adsorb various bioactive mediators in addition to endotoxin. In this investigation, these additional benefits of CRRT and PMX-DHP were assessed in patients with septic shock.

Methods Polymethylmethacrylate (PMMA), polysulfone, polyacrylonitrile and polymyxin-B immobilized fiber (PMXBIF) membranes were investigated after use in patients with septic shock. The adsorbed bioactive lipid mediators, mainly arachidonylethanolamide and 2-arachidonylglycerol, in these fibers were measured with gas chromatography/mass spectrometry/selected-ion monitoring using the isotope dilution method.

Results Bioactive lipid mediators, such as 2-arachidonylglycerol and arachidonylethanolamide, were adsorbed most by PMXBIF and then by PMMA, while polysulfone and polyacrylonitrile adsorbed relatively low amounts of lipid mediators.

Conclusions The amount of bioactive lipid mediators adsorbed by PMXBIF and PMMA cannot be disregarded. It is necessary to take this result into consideration when selecting the dialyser membrane for nonrenal indications of CRRT in patients with septic shock.

Microbiologic contamination of ultrasound transducers utilized by anesthesiologists in the operating room and ICU

F Lytle, B Knoll, T Comfere

Mayo Clinic, Rochester, MN, USA

Critical Care 2009, 13(Suppl 1):P290 (doi: 10.1186/cc7454)

Introduction Ultrasound is increasingly used to facilitate central venous catheter and regional anesthetic block placement [1,2]. Bacterial colonization of ultrasound probes has been demonstrated and the potential for cross-contamination between patients exists [3]. There are few, if any, studies investigating this for procedures performed by anesthesiologists, in the operating room and ICU. Methods Following Institutional Review Board approval, 18 ultrasound probes utilized by anesthesiologists were sampled after

1 week of average usage during 2 months in 2008. Standard microbiologic techniques were used [4]. Data were recorded, stored securely, and analyzed using appropriate statistics.

Results Sixty-nine samples were obtained. Forty-nine percent of samples showed bacterial colonization. Coagulase-negative staphylococcus was identified in 42.6%. The incidence of Staphylococcus aureus (1.4%) and Gram-negative bacteria (4.4%) was low.

Conclusions Ultrasound probes utilized in busy operating rooms and ICUs at a tertiary-care facility are a potential source for contamination and cross-contamination. Further studies of ultrasound use, probe contamination with the potential to serve as a vector for pathogens, and cleaning protocols are indicated.

References

1. Bodenham AR: Crit Care 2006, 10:175.

2. Marhofer P, et al.: Anesth Analg 2007, 104:1265-1269.

3. Mullaney PJ, et al.: Clin Radiol 2007, 62:694-698.

4. Murray PR: Manual of Clinical Microbiology. 7th edition. Washington, DC: ASM Press; 1999.

Assessment of the pathogenic microorganisms and resistance patterns in patients with device-associated infections in the ICU

H Pampal, A Ozon, S Bilgin, E Ozer

Mesa Hospital, Ankara, Turkey

Critical Care 2009, 13(Suppl 1):P291 (doi: 10.1186/cc7455)

Introduction Although several precautions are taken and studies using national guidelines are conducted to prevent device-associated infections (DAI), they remain a major problem with high morbidity and mortality rates, related to the frequent and long-term use of invasive devices [1]. The aim of the study was to assess the incidence of DAI and to determine the microorganisms responsible and the resistance patterns of the pathogens.

Methods All microbiological tests of DAI patients in the ICU between January 2007 and November 2008 were analyzed. All microorganisms isolated from patients were recorded and their resistance patterns were also ascertained.

Results Sixty-nine patients were diagnosed with DAI using the criteria of the Centers for Disease Control. Ventilator-associated pneumonia (VAP) was the most frequent, with an incidence of 49.3% (n = 35), followed by catheter-associated urinary tract infection (CR-UTI) (n = 20) and central venous catheter-associated bloodstream infection (CVC-BSI) (n = 15), with incidences of 28.9% and 21.8%, respectively. The most frequent pathogens in VAP cases were Pseudomonas aeruginosa in 57% of cases (80% of which were resistant to fluoroquinolones), Enterobacteriaceae species in 22% of cases (50% of which were resistant to ceftriaxone) and Staphylococcus aureus in 8% of cases (100% of which were methicillin resistant). The most frequent pathogens in CR-UTI cases were Enterobacteriaceae in 60% of cases (70% of which were resistant to ceftriaxone), enterococci species in 25% of cases (none of which were resistant to vancomycin) and P. aeruginosa in 15% of cases (100% of which were resistant to fluoroquinolones). The most frequent pathogens in CVC-BSI cases were coagulase-negative staphylococci in 66% of cases (75% of which were methicillin resistant), S. aureus in 20% of cases (85% of which were resistant to methicillin) and Enterobacteriaceae species in 14% of cases (30% of which were resistant to ceftriaxone).

Conclusions DAI increase mortality and morbidity rates in the ICU. It is therefore important to know the likely pathogenic microorganisms and their resistance patterns for successful selection of empiric antibiotic treatment.

Reference

1. Rosenthal VD, et al.: Device-associated nosocomial infections in 55 intensive care units of 8 developing countries.

Ann Intern Med 2006, 145:582-591.

Does improved oral hygiene alone prevent ventilator-associated pneumonia?

S Bleakley1, G Lavery1, D Trainor1, I Thompson2, E Smyth2

1Belfast HSC Trust & Faculty of Life and Health Sciences, University of Ulster, Belfast, UK; 2Belfast HSC Trust, Belfast, UK Critical Care 2009, 13(Suppl 1):P292 (doi: 10.1186/cc7456)

Introduction Up to 20% of patients receiving mechanical ventilation for >48 hours will develop ventilator-associated pneumonia (VAP) [1]. Dental plaque and oropharyngeal (OP) secretions of intubated patients often contain organisms capable of causing VAP [2].

Methods An intervention to improve oral hygiene (8-hourly OP cleaning/suctioning, toothbrushing and instillation of chlorhexidine gel) was commenced in one zone of an adult general ICU. Nursing staff in the rest of the ICU delivered standard oral hygiene (concurrent control). No other specific VAP prevention strategies were used in any patient. All patients on mechanical ventilation were reviewed daily for VAP using National Nosocomial Infections Surveillance criteria [3].

Results In a 4-month period, 71 patients were admitted to the intervention zone (Group A) and 189 patients were admitted to control beds (Group B). Table 1 summarises the results with values for age, APACHE II score and length of ICU stay shown as medians and interquartile ranges.

Conclusions Use of an intervention to improve oral hygiene was associated with a reduction in the incidence of VAP. References

1. Safdar N, et al.: Clinical and economic consequences of ventilator associated pneumonia: a systematic review. Crit Care Med 2005, 33:2184-2193.

2. Fourrier F, et al.: Colonisation of dental plaque: a source of nosocomial infections in intensive care unit patients. Crit Care Med 1998, 26:301-308.

3. Jarvis WR: Benchmarking for prevention: the Centers of Disease Control and Prevention's National Nosocomial Infections Surveillance (NNIS) system experience. Infection 2003, Suppl 2:44-48.

Effects of body orientation on the development of ventilator-associated pneumonia in mechanically ventilated swine

M Cressoni1, A Zanella2, M Epp3, V Hoffmann3, M Stilyanou3, T Kolobow3

1Policlinico IRCCS, Milano, Italy; 2Universita Milano-Bicocca, Monza, Italy; 3National Institutes of Health, Bethesda, MD, USA Critical Care 2009, 13(Suppl 1):P293 (doi: 10.1186/cc7457)

Introduction Ventilator-associated pneumonia (VAP) is a frequent nosocomial infection with an average incidence of 20% of ICU patients undergoing mechanical ventilation. Elevation of the head of the bed to >30° (semirecumbent position) is a recommended strategy to reduce gastric reflux, and subsequent aspiration of

colonized gastric contents. However, the efficacy of this strategy to prevent VAP remains controversial. We studied the relationship between gravity and VAP in a swine model, an omnivore with gastrointestinal physiology similar to that of humans.

Methods Twenty-six female Yucatan minipigs were randomized into four groups: (A) eight pigs were mechanically ventilated with the trachea oriented approximately 45° above horizontal for 72 hours. In the remaining three groups (B to D) the head of the bed was oriented 10° below horizontal (Trendelenburg position): (B) six pigs were mechanically ventilated for 72 hours; (C) six pigs were mechanically ventilated for 72 hours with enteral feeding; and (D) six pigs were mechanically ventilated for 168 hours with enteral feeding. At the end of the study period, pigs were electively sacrificed and quantitative lung microbiological cultures performed.

Results All eight pigs kept in the semirecumbent position developed pneumonia and respiratory failure (PaO2/FiO2 = 132 ± 139 mmHg vs. 479 ± 42 mmHg, P <0.0001), with a median of 5.5 lobes out of six colonized. Sixteen pigs kept in the Trendelenburg position had sterile lungs, and two pigs ventilated in the Trendelenburg position for 7 days developed a low level of colonization. Body orientation was the only significant predictor of lung colonization and pneumonia (P <0.001).

Conclusions The semirecumbent position was uniformly associated with lung colonization and respiratory failure by 72 hours. In contrast, positioning of the trachea and the endotracheal tube below the horizontal prevented the development of VAP.

Gastroesophageal reflux in mechanically ventilated pediatric patients and its relation to ventilator-acquired pneumonia

T Shahin, M El-Hodhod, H Ibrahim

Faculty of Medicine, Ain Shams University, Cairo, Egypt Critical Care 2009, 13(Suppl 1):P294 (doi: 10.1186/cc7458)

Introduction The objective was to determine the frequency of gastroesophageal reflux (GER) in mechanically ventilated pediatric patients and its role as a risk factor for ventilator-acquired pneumonia (VAP), which may be enhanced among these patients.

Methods The study was conducted in the pediatric ICU of Ain Shams University Hospital on 24 mechanically ventilated patients (16 VAP patients and eight without VAP as controls, with mean ages of 16.6 ± 20.5 and 18.6 ± 22.4 months, respectively). Esophageal 24-hour pH-metry, alongside clinical and laboratory evaluation of the underlying problem and the severity of the patient's condition, was carried out.

Results All VAP patients had GER (50% alkaline reflux, 12.5% acidic reflux and 37.5% combined reflux) compared with 75% of non-VAP patients (100% alkaline reflux). The total reflux time was significantly longer among VAP (50 minutes) versus non-VAP (3 minutes) patients. There was a significant increase in acidic reflux parameters among nonsurvivors versus survivors (P <0.001).

Conclusions GER is a constant finding in our mechanically ventilated pediatric patients, with alkaline reflux being more common than acidic reflux. Both acidic reflux and alkaline reflux

Table 1 (abstract P292)

Group A: age 51 (30 to 64) years; length of stay 7 (4 to 11) days; admission APACHE II score 16 (13 to 18); 457 total ventilator-days; 2 VAPs; 4.37 VAPs/1,000 ventilator-days

Group B: age 56 (29 to 69) years; length of stay 8 (4 to 13) days; admission APACHE II score 18 (14 to 20); 1,584 total ventilator-days; 12 VAPs; 7.57 VAPs/1,000 ventilator-days

were found to be associated with the development of VAP, and the total reflux time was found to be a reliable parameter for identifying VAP patients. However, acidic reflux was found to be related to high mortality.

Incidence of ventilator-associated pneumonia in patients undergoing elective tube exchange to LoTrach endotracheal tubes

A Fletcher, J Carter, M Blunt, P Young

Queen Elizabeth Hospital, Kings Lynn, UK

Critical Care 2009, 13(Suppl 1):P295 (doi: 10.1186/cc7459)

Introduction The objective was to study a cohort of general ICU patients electively reintubated with the LoTrach tracheal tube (tube exchange) to determine safety and to audit postprocedural ventilator-associated pneumonia (VAP). Emergency reintubation following elective or unplanned extubation is a known risk factor for subsequent VAP [1]; however, there are no published data on the effects of protocol-based elective tube exchange. The LoTrach tube prevents the pulmonary aspiration that occurs with conventional cuffs [2] and allows subglottic secretion management, thereby directly influencing two key steps in the pathogenesis of VAP.

Methods Sequential patients (53 patients in 14 months) receiving the LoTrach tracheal tube and cuff pressure controller were studied. Patients either underwent an elective tube exchange (using a bougie, preprocedural preoxygenation, gastric tube aspiration, muscle relaxation and direct laryngoscopy to clear upper airway secretions) or were primary LoTrach intubations on the ICU. Three clinicians independently examined the critical care electronic patient record. VAP was identified by a fall in the PaO2/FiO2 ratio >25% or a clinical pulmonary infection score >5, together with a positive qualitative tracheal aspirate. The international consensus criteria for VAP diagnosis were also used, as was the institution of antimicrobial therapy (if this was triggered by a clinical suspicion of VAP, then for the purposes of this study VAP was diagnosed).
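The screening definition in the Methods combines an oxygenation criterion or a clinical pulmonary infection score with a positive qualitative tracheal aspirate. A minimal sketch of that rule, as we read it, is given below; the grouping of the conditions and the example values are our assumptions, not the authors' algorithm.

# Minimal sketch of the screening rule described in the Methods, read as:
# (PaO2/FiO2 fall > 25% OR clinical pulmonary infection score > 5) AND a positive
# qualitative tracheal aspirate. The grouping and example values are our assumptions.

def vap_suspected(pf_baseline: float, pf_current: float, cpis: int,
                  positive_aspirate: bool) -> bool:
    """Return True if this screening definition of VAP is met."""
    pf_fall_percent = (pf_baseline - pf_current) / pf_baseline * 100.0
    return (pf_fall_percent > 25.0 or cpis > 5) and positive_aspirate

# Hypothetical patient: PaO2/FiO2 falls from 300 to 210 (a 30% fall), CPIS 4,
# positive tracheal aspirate -> flagged for review.
print(vap_suspected(300.0, 210.0, cpis=4, positive_aspirate=True))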

Results Forty-four (83%) patients underwent elective tube exchange. No complications were noted associated with the procedure. There were no episodes of VAP while the LoTrach was in situ. On an intention-to-treat basis there was a 1.8% VAP rate because one patient who required emergency reintubation following elective extubation received a conventional tube and developed VAP 2 days later. No other patients had antimicrobials begun for chest infections.

Conclusions There were no complications associated with elective tube exchange and no subsequent cases of VAP in this cohort of patients who were reintubated with the LoTrach tube. Elective tracheal tube exchange can be safely performed in general ICU patients. References

1. Torres A, et al.: Re-intubation increases the risk of nosocomial pneumonia in patients needing mechanical ventilation. Am J Respir Crit Care Med 1995, 152:137-141.

2. Young PJ, et al.: A low-volume, low-pressure tracheal tube cuff reduces pulmonary aspiration. Crit Care Med 2006, 34:632-639.

Reduction in ventilator-associated pneumonia following the introduction of subglottic suction endotracheal tubes

P Morgan, A Guyot, S Ranjan, M Eaton, M Carraretto, M Scott

Royal Surrey County Hospital, Guildford, UK

Critical Care 2009, 13(Suppl 1):P296 (doi: 10.1186/cc7460)

Introduction Aspiration of subglottic secretions is recommended by the American Thoracic Society for the prevention of ventilator-associated pneumonia (VAP) [1], but has a poor implementation rate in some countries [2]. VAP is the most frequent infection in ventilated patients, occurring in up to 27% of cases [1].

Methods We studied 993 patients from September 2005, over a 39-month period, following the introduction of our ventilation care bundle, which includes the 30° head-up position, deep vein thrombosis and peptic ulcer prophylaxis, sedation holds and the use of chlorhexidine for mouth care. VAP is difficult to diagnose and definitions vary; our diagnoses were made prospectively by a single consultant microbiologist using the validated Clinical Pulmonary Infection Score. After 18 months we introduced the HiLo Evac/Lanz Mallinckrodt endotracheal tubes, with suction applied using a 10 ml syringe 2-hourly, as our study intervention.

Results The benchmark for European ICUs is a rate of 5% to 15% (HELICS; Hospital In Europe Link for Infection Control through Surveillance) and in America is 3 to 20 cases per 1,000 ventilator-days (NNIS; National Nosocomial Infections Surveillance System). For the 18 months following the implementation of care bundles, our VAP rate was 5.39% (25/463 patients) with an incidence of 14 cases per 1,000 ventilator-days. Following the study intervention the VAP rate fell to 1.5% (8/530 patients) and the incidence to 4.17 cases per 1,000 ventilator-days. Our cohort of patients over the period had no difference in APACHE II scores. The difference between the groups following the introduction of subglottic secretion suctioning reached statistical significance, with a relative risk of 3.57 (95% CI = 1.63 to 7.85, P <0.001).

Conclusions Our study confirms the benefit of subglottic secretion clearance in reducing the occurrence of VAP.

References

1. American Thoracic Society: Am J Respir Crit Care Med 2005, 171:388-416.

2. Sierra R, et al.: Chest 2005, 128:1667-1673.
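The relative risk and confidence interval reported in the Results can be reproduced from the raw counts (25/463 vs. 8/530) with the usual log-scale approximation; the sketch below does so, although it may not be the exact method used by the authors.

import math

# Minimal sketch reproducing the reported relative risk and 95% CI (25/463 vs. 8/530)
# with the usual log-scale (Wald-type) approximation; this may not be the exact
# method the authors used, but it returns RR ~ 3.57 with a CI of roughly 1.63 to 7.86.

def relative_risk_ci(events_1: int, n_1: int, events_2: int, n_2: int, z: float = 1.96):
    """Relative risk of group 1 versus group 2 with a Wald-type CI on the log scale."""
    risk_1, risk_2 = events_1 / n_1, events_2 / n_2
    rr = risk_1 / risk_2
    se_log_rr = math.sqrt(1 / events_1 - 1 / n_1 + 1 / events_2 - 1 / n_2)
    lower = math.exp(math.log(rr) - z * se_log_rr)
    upper = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lower, upper

rr, lower, upper = relative_risk_ci(25, 463, 8, 530)
print(f"RR = {rr:.2f}, 95% CI {lower:.2f} to {upper:.2f}")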

Ventilator-associated pneumonia in a Greek ICU: prevalence and etiology

M Katsiari, E Apostolakou, C Nikolaou, E Pagouni, F Tsimpoukas, E Mainas, E Kounougeri, M Laskou, A Maguina

Konstantopoulion General Hospital, Nea Ionia, Athens, Greece Critical Care 2009, 13(Suppl 1):P297 (doi: 10.1186/cc7461)

Introduction Nosocomial pneumonia is the leading cause of death from hospital-acquired infections. Ventilator-associated pneumonia (VAP) is the most frequent ICU-acquired infection in mechanically ventilated patients. The purpose of our study is to assess the prevalence and the etiologic pathogens of VAP in our ICU, as well as its impact on morbidity and mortality.

Methods A prospective observational study in a multidisciplinary eight-bed ICU. During an 18-month period, 160 consecutive patients with a length of stay (LOS) >48 hours were enrolled in the study. Data were collected using specially designed software and included age, gender, APACHE II score on admission, days on mechanical ventilation, LOS and ICU outcome. Patients were stratified into two groups: Group A included patients who did not

develop VAP (n = 144), and Group B included patients in whom the VAP diagnosis was confirmed (n = 16). In Group B, isolated pathogens and their in vitro susceptibilities were also recorded. VAP diagnosis was established on the basis of clinical criteria and positive quantitative cultures of bronchial aspirates (>10^5 colony-forming units/ml). Data were analyzed using Student's t test and the Mann-Whitney rank-sum test.

Results Age (64 ± 18 vs. 70 ± 13 years, P = 0.257) and APACHE II score on admission (18.9 ± 7.6 vs. 18.8 ± 9.7, P = 0.965) were similar in both groups. The calculated VAP incidence was 5.87/1,000 ventilator-days (four early-onset and 12 late-onset VAP). The duration of mechanical ventilation (17 ± 22 vs. 30 ± 12 days, P <0.001) and LOS (19 ± 22 vs. 34 ± 15 days, P <0.001) were significantly longer in Group B. Isolated pathogens included: Acinetobacter baumannii (10 patients), Klebsiella pneumoniae (four patients), Pseudomonas aeruginosa (two patients) and other Gram-negative bacteria (one patient). Overall antibiotic resistance was 88% to carbapenems, 82% to aminoglycosides, 94% to aztreonam, 82% to piperacillin/tazobactam and 6% to colimycin. ICU mortality was considerably higher in Group B (24.3% vs. 43.8%).

Conclusions All VAP cases in our ICU were caused by Gram-negative multidrug-resistant bacteria. Colimycin seems to be our major therapeutic weapon against these strains. VAP prolongs the duration of mechanical ventilation and LOS, and contributes to higher mortality rates in ICU patients requiring mechanical ventilation.

Ventilator-associated pneumonia after procedures in cardiac surgery

J Wojkowska-Mach1, M Baran2, R Drwila2, E Foryciarz2, D Romaniszyn1, PB Heczko1

1Jagiellonian University Medical School, Krakow, Poland; 2John Paul II Regional Teaching Hospital, Krakow, Poland Critical Care 2009, 13(Suppl 1):P298 (doi: 10.1186/cc7462)

Introduction Surveillance of hospital-acquired infections in intensive care wards is a very important but time-consuming process requiring involvement of all medical personnel [1]. This means published data on infections after cardiac surgery are rather limited [2]. The aim of this study was to analyze the epidemiology and etiology of ventilator-associated pneumonia (VAP) following coronary surgery in ICU patients.

Methods The surveillance was based on the active method. Cases of infections were detected by the hospital Infection Control Team in cooperation with the unit personnel in accordance with Centers for Disease Control definitions during 2007. Data on surgical site infections (SSI; 33 cases) served as the background for validation of VAP surveillance.

Results Altogether 53 VAP cases after 2,170 surgical events were detected. The ventilator utilization ratio was 0.52. The total cumulative VAP incidence rate was 2.2% and the ventilator-associated hospital-acquired pneumonia rate was 18.3/1,000 ventilator-days; mortality was 1.9%. The total cumulative incidence SSI rate was 1.4%. Etiological factors of VAP were Gram-negative bacilli (Pseudomonas aeruginosa - 10.4%, Escherichia coli -12.5%, Klebsiella pneumoniae - 16.7%) and Candida albicans. Conclusions In the analyzed setting, in which surveillance of SSI has been run since 2002, detected SSI incidence rates are similar to those reported in the National Nosocomial Infections Surveillance and Krankenhaus Infectionen Surveillance System programs [3,4]. However, obtained data on the epidemiology of VAP are different. Also, there are differences in both the epidemiology and microbiology of VAP in this hospital and results reported from other cardiac surgery wards. This indicates a

necessity for introducing effective surveillance of hospital-acquired pneumonia after cardiac surgery procedures in the ICU. Acknowledgement Partially financed by K/ZDS/000649. References

1. Eggiman P, Hugonnet S, Sax H, et al.: VAP: caveats for benchmarking. Intensive Care Med 2003, 29:2086.

2. Dupont H, Montravers P, Gauzit R, et al.: Outcome of postoperative pneumonia in the Eole study. Intensive Care Med 2003, 29:179.

3. Edwards JR, Peterson KD, Andrus ML: National Healthcare Safety Network report. Am J Infect Control 2007, 35:290.

4. Finkelstein R, Rabino G, Mashiah T, et al.: Surgical site infection rates following cardiac surgery: the impact of a 6-year infection control program. Am J Infect Control 2005, 33:450.
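The surveillance measures quoted above (ventilator utilization ratio, cumulative incidence and the rate per 1,000 ventilator-days) follow standard NNIS-style formulas. The sketch below shows the formulas only; the denominators used as inputs are hypothetical, since the abstract does not report patient-days or ventilator-days.

# Minimal sketch of the standard device-associated surveillance formulas referred
# to above. The denominators below (patient-days, ventilator-days) are hypothetical;
# only the formulas themselves are taken from standard NNIS-style usage.

def device_utilization_ratio(device_days: float, patient_days: float) -> float:
    """Device (ventilator) utilization ratio = device-days / patient-days."""
    return device_days / patient_days

def rate_per_1000_device_days(cases: int, device_days: float) -> float:
    """Device-associated infection rate per 1,000 device-days."""
    return cases / device_days * 1000.0

def cumulative_incidence_percent(cases: int, patients: int) -> float:
    """Cumulative incidence = cases per 100 patients (or procedures)."""
    return cases / patients * 100.0

# Hypothetical example: 53 VAP cases, 2,900 ventilator-days, 5,600 patient-days, 2,170 procedures.
print(round(device_utilization_ratio(2900, 5600), 2))
print(round(rate_per_1000_device_days(53, 2900), 1))
print(round(cumulative_incidence_percent(53, 2170), 1))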

Epidemiology of ventilator-associated pneumonia in ICU patients

A Vakalos, K Kolesidis, G Tsigaras

Xanthi General Hospital, Xanthi, Greece

Critical Care 2009, 13(Suppl 1):P299 (doi: 10.1186/cc7463)

Introduction Ventilator-associated pneumonia (VAP) is the most frequent ICU-acquired infection among patients receiving mechanical ventilation. The aim of our study was to examine the incidence and mortality rate of VAP and its impact on prolonging ICU stay in our mixed medical-surgical ICU.

Methods During a 32-month period, from November 2005 to August 2008, 127 patients were admitted to our ICU; 114 of them received mechanical ventilation and were included retrospectively in our study. The total number of mechanical ventilation days was 1,330. The patients were divided into two groups: Group A included 28 patients (24.5% of the total) with VAP (five early-onset and 23 late-onset), and Group B included 86 patients without VAP.

Results The incidence of VAP was 24.5% in patients receiving mechanical ventilation, or 21 episodes per 1,000 days of mechanical ventilation. We detected no statistically significant difference between the two groups in age (mean ± SD: 64.62 ± 15.9 vs. 64.53 ± 16.9, P = 0.97) or APACHE II score (mean ± SD: 19.96 ± 6.8 vs. 19.33 ± 7.79, P = 0.7). We detected a statistically significant difference between the two groups in the duration of ICU stay (days, mean ± SD: 33.75 ± 20 vs. 10.27 ± 11.29, P <0.0001). We detected a difference in the mortality rate (32.14% vs. 22.09%, respectively), although it was not statistically significant (P = 0.31, OR = 1.67).

Conclusions The incidence of VAP in our study is similar to that in other studies, in which it varies from 8% to 28%. VAP prolongs the duration of ICU stay, while the mortality attributable to VAP is still debated. Nevertheless, we have to improve our clinical approach in order to better recognize the risk factors and to develop a more effective prevention program.

Clara cell protein in bronchoalveolar lavage fluid: a predictor of ventilator-associated pneumonia?

M Vanspauwen1, C Linssen1, C Bruggeman1, J Jacobs2, M Drent1, D Bergmans1, W Van Mook1

1Maastricht University Medical Center, Maastricht, the Netherlands;

2Prins Leopold Institute on Tropical Medicine, Antwerp, Belgium Critical Care 2009, 13(Suppl 1):P300 (doi: 10.1186/cc7464)

Introduction Clara cell protein 10 (CC-10) is a low-molecular-weight protein secreted in large quantities by nonciliated Clara

cells. Differences in CC-10 concentrations have been demonstrated in several inflammatory lung diseases (for example, bronchial asthma and chronic eosinophilic pneumonia). Moreover, there is evidence that in infectious pulmonary diseases the type of microorganism (for example, Pseudomonas aeruginosa) influences CC-10 activity. In this study we evaluated the presence of CC-10 in bronchoalveolar lavage (BAL) fluid as a potential marker for ventilator-associated pneumonia (VAP) in critically ill patients with a suspicion of VAP.

Methods Between January 2003 and December 2007 all consecutive BAL fluid samples from critically ill patients in the ICU of the Maastricht University Medical Center clinically suspected of VAP were included. Patients were divided into two groups: microbiologically confirmed VAP (VAP group) and microbiologically unconfirmed VAP (non-VAP group). VAP was microbiologically confirmed if BAL fluid cultures yielded >10^4 colony-forming units/ml and/or microscopic analysis revealed >2% intracellular organisms. The CC-10 concentration was measured with a commercially available ELISA, and retrospective analysis was performed. Areas under the curve of receiver operating characteristic curves were calculated for CC-10 concentrations.

Results A total of 196 BAL fluid samples were included from 196 patients (123 men, 73 women). Seventy-nine out of 196 episodes of suspected VAP (40.3%) were microbiologically confirmed. The median CC-10 concentration in the VAP group was 3,019 ng/ml (range: 282 to 65,546) versus 2,504 ng/ml (range: 62 to 30,240) in the non-VAP group (P = 0.06), with an area under the curve of 0.586. In addition, CC-10 concentrations were not significantly different between the non-VAP group and the VAP group caused by a specific organism (for example, P. aeruginosa) (P = 0.386).

Conclusions The CC-10 concentration in BAL fluid is not a useful marker to differentiate between VAP and non-VAP.
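The area under the receiver operating characteristic curve reported above can be obtained for a single marker by pairwise comparison of the two groups, which is equivalent to scaling the Mann-Whitney U statistic. A minimal sketch with hypothetical CC-10 values (not the study data) is shown below.

# Minimal sketch (illustrative values, not study data): the area under the ROC curve
# for a single marker can be computed from two groups by pairwise comparison,
# which is equivalent to the Mann-Whitney U statistic divided by the number of pairs.

def roc_auc(case_values, control_values):
    """Probability that a randomly chosen case value exceeds a control value (ties count 0.5)."""
    wins = 0.0
    for c in case_values:
        for k in control_values:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(case_values) * len(control_values))

# Hypothetical CC-10 concentrations (ng/ml) in VAP and non-VAP episodes.
vap = [3019, 2800, 4100, 900, 5200]
non_vap = [2504, 1800, 3100, 700, 2300]
print(round(roc_auc(vap, non_vap), 3))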

Presence of human metapneumovirus in bronchoalveolar lavage fluid of critically ill patients

C Linssen, M Vanspauwen, C Bruggeman, E Wouters, D Bergmans, W Van Mook

Maastricht University Medical Center, Maastricht, the Netherlands Critical Care 2009, 13(Suppl 1):P301 (doi: 10.1186/cc7465)

Introduction Human metapneumovirus (hMPV) is a paramyxovirus that has been shown to cause respiratory infections in children, the elderly and immunocompromised patients. In this study we retrospectively evaluated the presence of hMPV in bronchoalveolar lavage fluid (BALF) of critically ill patients.

Methods Between January 2005 and December 2007 all consecutive BALF samples from critically ill patients in the ICU of the Maastricht University Medical Center clinically suspected of hospital-acquired pneumonia were included. BALF work-up included: differential cell count, quantitative bacterial culture, PCR for relevant respiratory pathogens and additional stains and cultures guided by clinical suspicion. All samples were analysed retrospectively by quantitative real-time PCR targeting the nucleoprotein gene of hMPV.

Results A total of 144 BALF samples were included from 121 patients in the ICU (75 men and 46 women). RNA of hMPV was detected in two out of 144 BALF samples (1.4%) of two patients. The first hMPV-positive patient had an underlying hematological malignancy (multiple myeloma). Already being treated with broad-spectrum antibiotics, the patient developed signs of respiratory distress. Five days after hospital admission, the patient was admitted to the ICU because of respiratory insufficiency. A BAL performed the same day yielded no causative microorganism at

that time. The second hMPV-positive patient had been admitted to the hospital with a community-acquired pneumonia for which no causative organism was identified. Five days after hospital admission, progressive respiratory insufficiency necessitated ICU admission. Twelve days after ICU admittance a bronchoalveolar lavage was performed, which yielded Candida albicans at 10^2 colony-forming units/ml. In both patients no causative microorganism for pneumonia could be identified during hospitalisation, and therefore we speculate that hMPV may at least have contributed to or perhaps even caused the respiratory deterioration under antibiotic therapy.

Conclusions In critically ill patients 1.4% of collected BALF samples revealed the presence of hMPV RNA. Therefore, hMPV may play a part in respiratory complications in ICU patients. An additional study is necessary to investigate the extent to which hMPV contributes to respiratory failure in patients admitted to the ICU.

Trend analysis of antibiotic resistance and minimum inhibitory concentration distribution from 1997 to 2007 among Pseudomonas aeruginosa, Escherichia coli and Klebsiella pneumoniae in European ICUs

H Hanberger, H Gill, G Fransson, L Nilsson, P Turner

Antibiotic Research Unit, Linkoping, Sweden

Critical Care 2009, 13(Suppl 1):P302 (doi: 10.1186/cc7466)

Introduction The aim of this study was to analyse trends in antibiotic susceptibility and minimum inhibitory concentration (MIC) distribution in European ICUs based on data from the Mystic surveillance study initiated in 1997 [1].

Methods Twenty-eight ICUs in Belgium, Croatia, Finland, Germany, Spain, Sweden and the UK contributed nonrepeat isolates of Pseudomonas aeruginosa (n = 3,241), Escherichia coli (n = 3,306), and Klebsiella pneumoniae (n = 1,660). Only nonrepeat isolates taken on clinical indication were included. Two models were used for trend analysis. Model A: changes in susceptibility (S), intermediate (I) and resistance (R) using the EUCAST breakpoints [2], described by the logistic regression model p_R = 1 / (1 + exp(-(a_R + b_R x year))). Model B: changes in MIC distribution, log2(MIC) = a_M + b_M x year. The parameters b_R and b_M describe the time dependence, and the hypotheses b_R = 0 and b_M = 0 can be tested. Year was used as the independent variable in both models.
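The two trend models above can be reproduced with standard regression tools. The sketch below fits both models to a small set of made-up isolate records (calendar year, I+R flag, MIC); it assumes numpy and statsmodels are available and is only meant to illustrate the form of the models, not the study's data or results.

import numpy as np
import statsmodels.api as sm

# Hypothetical isolate-level records: sampling year, I+R flag, MIC (mg/l)
year = np.array([1997, 1998, 1999, 2000, 2001, 2002, 2003, 2005, 2006, 2007])
resistant = np.array([0, 0, 0, 0, 1, 0, 1, 1, 1, 1])
mic = np.array([2, 2, 4, 2, 8, 4, 8, 16, 16, 32])

X = sm.add_constant(year - year.min())   # centre the year for numerical stability

# Model A: logistic regression of the probability of I+R on year
model_a = sm.Logit(resistant, X).fit(disp=0)
# Model B: linear regression of log2(MIC) on year
model_b = sm.OLS(np.log2(mic), X).fit()

# b_R and b_M are the slopes; testing b = 0 asks whether resistance/MIC drifts over time
print("b_R =", round(model_a.params[1], 3), "P =", round(model_a.pvalues[1], 3))
print("b_M =", round(model_b.params[1], 3), "P =", round(model_b.pvalues[1], 3))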

Results Resistance (I+R%) rates in 2007 and trend analysis ((+ increase, - decrease), (P values for changes in susceptibility/MIC distribution from 1997 to 2007)): P. aeruginosa: imipenem 43% ((+/+), (<0.001/<0.001)) (Figures 1 and 2), meropenem 27% ((+/+), (0.01/0.04)), ceftazidime 32% ((+/+), (0.035/<0.001)), ciprofloxacin 32% ((-/-), (0.39/0.79)), gentamicin 22% ((-/-), (0.002/0.019)); E. coli: imipenem 0.6% ((-/-), (<0.013/<0.001)), meropenem 0% ((+/-), (0.66/<0.001)), cefotaxime 9% ((-/+), (0.61/0.17)), ciprofloxacin 25% ((+/+), (<0.001/0.07)), gentamicin 14% ((+/+), (0.14/0.44)); and K. pneumoniae: imipenem 0% ((-/-), (<0.001/<0.001)), meropenem 0.5% ((-/-), (0.89/<0.001)), cefotaxime 13% ((-/-), (<0.001/0.3)), ciprofloxacin 17% ((+/-), (0.017/0.78)), gentamicin 21% ((+/-), (0.16/0.61)). Conclusions Significantly (P <0.05) increasing resistance rates (I+R%) and increasing MICs (log2 MIC) were found for P. aeruginosa to carbapenems and ceftazidime. For other species-antibiotic combinations there were either low (<1%) resistance rates, or the two trend models did not demonstrate the same significant changes. This study showed that trend analysis based on MIC distributions was not more sensitive than trend analysis based on changes in susceptibility (I+R). High (>27%) resistance rates were seen for all tested drugs against P. aeruginosa, while low (<1%) resistance rates were only shown for carbapenems against E. coli and K. pneumoniae.

Figure 1 (abstract P302) P. aeruginosa and imipenem, 7 countries. Model A: changes in susceptibility.

Figure 2 (abstract P302) P. aeruginosa and imipenem, 7 countries. Model B: changes in MIC distribution.

References

1. Jones RN, Masterton R: Determining the value of antimicrobial surveillance programs. Diagn Microbiol Infect Dis 2001, 41:171-175.

2. EUCAST [www.eucast.org]

Antimicrobial resistance pattern of Pseudomonas aeruginosa in clinical isolates from ICU patients

E Antypa, A Koteli, K Kontopoulou, A Kiparissi, E Antoniadou

G. Gennimatas Thessaloniki General Hospital, Thessaloniki, Greece

Critical Care 2009, 13(Suppl 1):P303 (doi: 10.1186/cc7467)

Introduction Pseudomonas aeruginosa remains one of the most important pathogens in the nosocomial setting, where it is a common causative agent of bacteremia [1,2]. The aim of this study was to evaluate the antimicrobial resistance of Pseudomonas spp. strains isolated from inpatients hospitalized in the ICU of our hospital throughout a 3-year period.

Methods A total of 175 clinical isolates of Pseudomonas spp. collected from January 2006 to December 2008 were investigated

in this study. Each isolate was obtained from a different patient. Identification and routine antibiograms of the isolates were carried out using the Vitek 2 automated system (bioMérieux, Marcy l'Étoile, France). The minimum inhibitory concentrations of imipenem, meropenem and colistin were also determined by the agar dilution method according to Clinical and Laboratory Standards Institute guidelines.

Results The isolates included in our study originated from blood cultures 70 (40%), urine 53 (30.3%), bronchial aspirates 23 (13.1%), central venous catheters 18 (10.3%) and others 11 (6.3%). The resistance rates changed from 2006 to 2008 as follows: amikacin: 72 to 83%, aztreonam: 90 to 95%, ceftazidime: 81 to 89%, ciprofloxacin: 79 to 85%, colistin: 0 to 0%, meropenem: 72 to 83%, imipenem: 72 to 83%, netilmicin: 72 to 83%, piperacillin/tazobactam: 71 to 80%, tobramycin: 72 to 83%, ticarcillin/clavulanic acid: 71 to 80%.

Conclusions Pseudomonas spp. infections are particularly serious for intubated ICU patients, with 40 to 50% mortality rates. In our hospital, the percentage of multiresistant Pseudomonas spp. isolates has increased dramatically over the past 3 years. The majority of isolates were resistant to 15 or more antibiotics. What is most worrying is the prevalence of a multiresistant phenotype that was only sensitive to colistin. The emergence and rapid spread of multidrug-resistant isolates of Pseudomonas spp. are of great concern worldwide. It is necessary to limit the overuse of antibiotics and to implement a new antibiotic policy. References

1. Tan TY, Ng LSY, Kwang LL: Evaluation of disc susceptibility tests performed directly from positive blood cultures. J Clin Pathol 2008, 61:343-346.

2. Giamarellos-Bourboulis EJ, Grecka P, Giamarellou H: Comparative in vitro interactions of ceftazidime, meropenem, and imipenem with amikacin on multiresistant Pseudomonas aeruginosa. Diagn Microbiol Infect Dis 1997, 29:81-86.

Bacteriological profile and antibiotic resistance of bacteria isolates in a burn department

L Thabet, K Bousselmi, H Oueslati, A Ghanem, S Ben Redjeb, A Messadi

Traumatology and Burn Center, Ben Arous, Tunisia Critical Care 2009, 13(Suppl 1):P304 (doi: 10.1186/cc7468)

Introduction Nosocomial infections remain the main cause of morbidity and mortality in burn patients. Ongoing surveillance of infections in burned patients is essential to detect changes in epidemiology and to guide better empirical antibiotic therapy and infection control policies. The aim of this study was to analyze the bacterial flora and the antibiotic resistance of isolates in a burn department during a 2-year period.

Methods From 1 January 2005 to 31 December 2006, 1,268 strains were isolated from different specimens. Antimicrobial susceptibility testing was carried out by the disk diffusion method as recommended by the French Society of Microbiology. All data were stored in a laboratory database using WHONET 5.3 software. Duplicate isolates, defined as the same bacterial species for the same patient with the same antimicrobial susceptibility profile, were excluded. Results The most frequently identified species were Staphylococcus aureus (19.8%), Pseudomonas aeruginosa (15.8%), Acinetobacter baumannii (11.8%), and Providencia stuartii (9.5%). The rate of methicillin-resistant S. aureus (MRSA) was 68.1%. All isolates were fully susceptible to glycopeptides. P. aeruginosa resistance was 35.6% and 35.4%, respectively, for ceftazidime and imipenem. Concerning A. baumannii, 98.7% of

strains were resistant to ceftazidime, 59.5% to imipenem and 87.5% to ciprofloxacin. In total, 77.3% of P. stuartii isolates were resistant to ceftazidime. The frequencies of resistance to ceftazidime, ofloxacin and amikacin of Klebsiella pneumoniae were 60.9%, 25.4% and 47.1%, respectively. The survey of resistance showed a global decrease in 2006 versus 2005. The rate of MRSA was 61% in 2005 versus 51.6% in 2006. Resistance to ceftazidime in P. aeruginosa was 80.6% in 2005 versus 26.9% in 2006. Imipenem resistance also decreased in 2006 in A. baumannii and P. aeruginosa. The improvement in hygiene, particularly hand washing after the introduction of hydro-alcoholic solutions, and the collaboration between microbiologists and clinicians could explain the decrease in resistance observed in the burn department.

Conclusions Compared with previous years, S. aureus is still the commonest pathogen in the burn department. The incidence of antimicrobial resistance decreased during 2006 after a peak of multiresistance during 2005. The prevention measures taken were effective and should be reinforced.

Antibiotic resistance of Staphylococcus aureus from ICUs in the Netherlands, 1996 to 2006

M Rijnders1, RH Deurenberg1, M Boumans1, M Hoogkamp-Korstanje2, P Beisser1, EE Stobberingh1

1University Hospital Maastricht, the Netherlands; 2SWAB, Utrecht, the Netherlands

Critical Care 2009, 13(Suppl 1):P305 (doi: 10.1186/cc7469)

Introduction Staphylococcus aureus is a potentially pathogenic microorganism and a causative agent of ~25% of infections in intensive care patients. Optimal empiric therapy may reduce morbidity and mortality. Therefore, it is essential to provide the clinician with resistance data on the locally circulating strains and patients' pathogens. A national surveillance program of the Dutch Antibiotic Resistance Surveillance Group was started in the Netherlands in 1996 to gain insight into the emergence of microbial resistance at local, regional and national levels. This study describes the resistance development of S. aureus from ICUs of 14 large referral hospitals all over the Netherlands over a 10-year period from 1996 to 2006. Methods In the first 6 months of each year, the participating hospitals collected clinical isolates from, among others, blood and respiratory samples. In total 943 isolates were collected: 250 from three hospitals in the north, 187 from two in the east, 229 from five in the west and 280 from four in the south. The susceptibility to relevant antibiotics was determined by broth microdilution according to the Clinical and Laboratory Standards Institute guidelines. Results Resistance to penicillin fluctuated over time at ~75%; seven methicillin-resistant S. aureus were isolated (0.7%). Resistance to clarithromycin increased to 10% in 2003, but decreased in 2006 to 6%, the level before 2003. Resistance to clindamycin fluctuated over time from 4 to 8%. Doxycycline resistance varied between 2 and 10%. No resistance to vancomycin, teicoplanin and linezolid was demonstrated. Resistance to gentamicin and rifampicin was sporadically found. The prevalence of fluoroquinolone resistance was between 0 and 4% until 2002. In 2003, a peak in the prevalence of fluoroquinolone resistance (ciprofloxacin 14% and moxifloxacin 8%) was observed. Resistance to ciprofloxacin remained at this high level until 2005 and decreased in 2006. Resistance to moxifloxacin decreased immediately. Regional differences were observed for ciprofloxacin, with the highest resistance in the western and southern parts, and for doxycycline, with the lowest resistance rate in the northern part.

Conclusions During the 10-year study period only an increase in resistance to ciprofloxacin was observed. The data presented still justify the empiric choice of flucloxacillin (with rifampicin or gentamicin depending on the indication) in case of an infection probably caused by S. aureus in ICU patients.

Clinical and microbiological efficacy of continuous versus intermittent administration of vancomycin in critical care patients

M Stepan, I Chytra, P Pelnar, T Bergerova, E Kasal, A Zidkova, R Pradl

University Hospital, Plzen, Czech Republic

Critical Care 2009, 13(Suppl 1):P306 (doi: 10.1186/cc7470)

Introduction Vancomycin is known to induce a postantibiotic effect, but some data suggest that its bactericidal activity is time dependent. At present, the optimal dosing regimen for administration of vancomycin remains unknown. The aim of this open prospective randomized study was to compare the clinical and microbiological efficacy of continuous infusion versus intermittent administration of vancomycin in critically ill patients.

Methods Patients admitted to the interdisciplinary ICU with an infection for which vancomycin was indicated and a predicted duration of treatment of at least 4 days were randomized to receive either a 15 mg/kg intravenous loading dose of vancomycin followed by a daily 15 mg/kg continuous infusion (continuous group) or intermittent administration of 15 mg/kg vancomycin intravenously every 12 hours (intermittent group). Antibiotic therapy was stopped when the clinical state improved and laboratory signs indicated subsidence of infection. Failure of antimicrobial therapy was defined as persistence or progression of signs and symptoms of infection, development of new clinical findings consistent with active infection, or death from infection. The age, APACHE II score, type of infection, length of ICU stay, length of mechanical ventilation, mortality, clinical and microbiological failure, length of vancomycin therapy and total dose of vancomycin were assessed. The Sequential Organ Failure Assessment score, white blood count, C-reactive protein and renal function at the beginning and at the end of therapy were evaluated. The Mann-Whitney test, unpaired t test and chi-squared test were used as appropriate; P <0.05 was considered statistically significant. Results A total of 65 patients were enrolled and randomized to the continuous (n = 33) and intermittent (n = 32) groups. No significant differences between the groups were found in any of the assessed parameters. Clinical failure of therapy occurred in 12 (36%) versus eight (25%) patients in the continuous versus intermittent groups, and microbiological failure in seven (21%) versus eight (25%) patients.

Conclusions Continuous infusion and intermittent administration of vancomycin in critically ill patients provided equivalent clinical and microbiological outcome.

Acknowledgement Supported by research grant MSM0021620819.

Over-increased creatinine renal clearance in septic patients and implications for vancomycin optimization

J Baptista, P Casanova, P Martins, J Pimentel

Coimbra University Hospitals, Coimbra, Portugal

Critical Care 2009, 13(Suppl 1):P307 (doi: 10.1186/cc7471)

Introduction The hyperdynamic stage occurring in sepsis may be responsible for an increase in renal blood flow that may lead to

increased elimination of some drugs, namely vancomycin. The aim of this study was to evaluate the effects of a 24-hour creatinine clearance (CrCL) higher than 130 ml/min/1.73 m2 on vancomycin serum levels, and to identify an accurate marker of this condition. Methods This study was carried out in a multipurpose ICU, on 120 critically ill septic patients, 36% with septic shock (42 patients), 87 men (72.5%), average age 56 ± 21 years, average APACHE II score and Simplified Acute Physiology Score (SAPS) II of 17.2 and 42.2, respectively. We studied 120 consecutive vancomycin treatments (continuous infusion) and we assessed therapeutic levels (20 to 30 μg/ml) on days 1, 2 and 3 (V1, V2 and V3). Patients were divided into two groups according to CrCL: G1 <130 ml/min/1.73 m2 (n = 77); G2 >130 ml/min/1.73 m2 (n = 43). Both groups had similar vancomycin dosage on day 0 (47.7/46.6 mg/kg; P = 0.26).

Results Average age, APACHE II score and SAPS II were 62.6/44 years, 18.8/14.2 and 45/36.9, respectively, for G1 and G2 (P <0.05). Average CrCL in G1: 76.7 ml/min/1.73 m2; average CrCL in G2: 176 ml/min/1.73 m2 (P <0.05). Serum proteins and albumin were, respectively, 5.26/5.64 g/dl and 2.84/3.17 g/dl (P <0.05), and urinary creatinine (UCr), urinary urea and 24-hour urine output were 34.6/71 mg/dl, 437/625 mg/dl and 2,450/2,846 ml, respectively (P <0.05). V1, V2 and V3 were 20/14.3, 24.3/17.6 and 26.5/20.3 μg/ml, respectively, for G1 and G2 (P <0.05 on each day). The correlations between V1 and CrCL were -0.47 (G1) and -0.33 (G2), both with P <0.05. The areas under the curve (AUC) of the receiver operating characteristic curves of UCr, urinary urea and blood urea nitrogen (BUN) as markers of high clearance status (>130) were 0.85 (95% CI = 0.77 to 0.91), 0.72 (95% CI = 0.63 to 0.80) and 0.66 (95% CI = 0.55 to 0.73), respectively, and this area was maximal for a subgroup of patients (76 patients) without shock (AUC: 0.9; 95% CI = 0.81 to 0.96 for UCr). Best cutoff point for UCr: >58 mg/dl; for BUN: <17 mg/dl. Conclusions Over-increased CrCL identifies a subgroup of younger patients, with lower severity scores and a high incidence of subtherapeutic vancomycin concentrations on the first 3 days, especially on day 1 (39/43 patients; 90.7%). A UCr concentration above 58 mg/dl can be a sensitive (70%) and specific (87%) marker of this condition. The addition of BUN <17 mg/dl as a second marker increases specificity to 97.4%.
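The diagnostic performance quoted for the urinary creatinine cut-off (sensitivity 70%, specificity 87% for UCr >58 mg/dl) follows the usual 2 x 2 classification logic. The sketch below shows how such figures are derived from paired marker values and clearance labels; the arrays are hypothetical placeholders, not study data.

import numpy as np

# Hypothetical urinary creatinine values (mg/dl) and augmented-clearance labels
ucr_mg_dl = np.array([30, 45, 60, 72, 80, 40, 55, 90, 35, 65])
high_clearance = np.array([0, 0, 1, 1, 1, 0, 1, 1, 0, 1])  # 1 = CrCL > 130 ml/min/1.73 m2

predicted = ucr_mg_dl > 58          # apply the cut-off reported above

tp = np.sum(predicted & (high_clearance == 1))
fn = np.sum(~predicted & (high_clearance == 1))
tn = np.sum(~predicted & (high_clearance == 0))
fp = np.sum(predicted & (high_clearance == 0))

print("sensitivity =", tp / (tp + fn), "specificity =", tn / (tn + fp))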

Vancomycin-resistant enterococci colonisation in the ICU and control measures

K Katircioglu, M Ozkalkanli, S Gul Yurtsever, D Sanli, H Erten, V Duzenli, S Savaci

Izmir Ataturk Training and Research Hospital, Izmir, Turkey Critical Care 2009, 13(Suppl 1):P308 (doi: 10.1186/cc7472)

Introduction Vancomycin-resistant enterococci (VRE) have been recognised as microorganisms capable of causing epidemics in critically ill patients [1]. The present report describes an outbreak involving VRE colonisation in our ICU.

Methods After several weeks of severe nursing shortage, when often there were only three nurses for 10 critically ill patients, the first VRE was isolated from one of the two blood cultures of a trauma patient. Rectal swabs of all patients and of the ICU staff were collected. Because no patient had been discharged to the ward during this period, no further analysis was required.

Results Enterococcus casseliflavus was isolated from six out of the 10 patients' rectal swabs. Once the outbreak was identified, all patients were placed under strict contact isolation and cohorted, and barrier precautions were instituted. Four patients who had

negative rectal swabs were isolated in another ICU. Five of the patients did not demonstrate signs of infection and were considered colonised. ICU staff had negative rectal swabs. Two of the patients who had positive rectal swabs died because of their underlying primary disease (acute myocardial infarction, acute respiratory distress syndrome due to aspiration pneumonia). The first patient, who had a positive blood culture, was treated with intravenous linezolid and discharged home after 14 days. Another patient, who had sepsis due to necrotising fasciitis, was discharged to the ward after obtaining three negative rectal swabs. The other patient, who had non-Hodgkin lymphoma and pneumonia, was discharged to the ward after obtaining three negative rectal swabs. The last patient was isolated in another room, followed up for 4 weeks and discharged to the ward after obtaining three negative rectal swabs.

Conclusions The outbreak was controlled by continuous implementation of the infection control programme. A long ICU stay, hemodialysis and nursing shortage are risk factors for VRE acquisition. Transmission of VRE can be facilitated by the hands of the staff. We conclude that this outbreak may have been due to the shortage of nursing staff during the summer. Reference

1. Peta M, et al.: Outbreak of vancomycin-resistant Enterococcus spp. in an Italian general intensive care unit. Clin Microbiol Infect 2006, 12:163-169.

Activities of vancomycin, teicoplanin and linezolid against bacteraemic methicillin-resistant Staphylococcus aureus strains in Gauteng, South Africa

ME Botha1, J Coetzee1, C Feldman2, GA Richards2, AJ Brink1

1Ampath National Laboratory Services, Johannesburg, South Africa; 2Johannesburg Hospital and University of the Witwatersrand, Johannesburg, South Africa

Critical Care 2009, 13(Suppl 1):P309 (doi: 10.1186/cc7473)

Introduction This study aims to describe the vancomycin, teicoplanin and linezolid susceptibility patterns of methicillin-resistant Staphylococcus aureus (MRSA) blood culture isolates from patients in the private sector in Gauteng, South Africa. Screening tests for heterogeneous glycopeptide intermediate S. aureus (hGISA) strains were also performed. MRSA isolates with vancomycin minimum inhibitory concentrations (MICs) of 1 to 2 mg/l are associated with worse clinical outcomes [1]. Furthermore, hGISA infections are associated with clinical failure of glycopeptide therapy [2].

Methods MICs for vancomycin, teicoplanin and linezolid were performed on 50 randomly collected MRSA strains from blood cultures according to Clinical Laboratory Standards Institute guidelines. Screening for hGISA was performed using the Etest (AB BIODISK) Macromethod as well as the new Etest Glycopeptide Resistance Detection.

Table 1 (abstract P309) Susceptibility patterns of MRSA isolates (n = 50)

                 Vancomycin (mg/l)   Teicoplanin (mg/l)   Linezolid (mg/l)
MIC50            1.5                 2                    1.5
MIC90            2                   3                    2
Breakpoint       S <2                S <8                 S <4

Results The results of susceptibility patterns are depicted in Table 1. Fifty percent (25/50) and 20% (10/50) of the strains screened positive for hGISA using the Etest Macromethod and Etest Glycopeptide Resistance Detection, respectively. Conclusions We recommend that clinically relevant MRSA isolates be reported with MICs for vancomycin, teicoplanin and linezolid, and that glycopeptide treatment failure warrants further testing of the MRSA isolate to detect possible hGISA. Ideally hGISA should be confirmed by population analysis profile testing, which was not available to us for this study. References

1. Soriano A, et al.: Influence of vancomycin MIC on the treatment of MRSA bacteremia. Clin Infect Dis 2007, 46:193-200.

2. Charles PGP, et al.: Clinical features associated with bacteremia due to heterogeneous vancomycin-intermediate Staphylococcus aureus. Clin Infect Dis 2004, 38:448-451.

Telavancin for the treatment of hospital-acquired pneumonia in severely ill and older patients: the ATTAIN studies

E Rubinstein1, GR Corey2, HW Boucher3, MS Niederman4, A Shorr5, A Torres6, SL Barriere7, HD Friedland7

1University of Manitoba, Winnipeg, MB, Canada; 2Duke University Medical Center, Durham, NC, USA; 3Tufts Medical Center, Boston, MA, USA; 4SUNY, Stony Brook, NY, USA; 5Georgetown University, Washington, DC, USA; 6Hospital Clinic de Barcelona, Barcelona, Spain; 7Theravance, Inc., South San Francisco, CA, USA Critical Care 2009, 13(Suppl 1):P310 (doi: 10.1186/cc7474)

Introduction Telavancin (TLV) is an investigational lipoglycopeptide with activity against Gram-positive pathogens. The Assessment of Telavancin for Treatment of Hospital-acquired Pneumonia (ATTAIN) programme studied TLV for the treatment of hospital-acquired pneumonia (HAP). This analysis compared the clinical cure rates achieved with TLV or vancomycin (VAN) for severely ill and older patients.

Methods ATTAIN 1 and ATTAIN 2 were methodologically identical, randomised, double-blind, phase 3 studies. Adult patients with pneumonia acquired after 48 hours in an inpatient acute-care or chronic-care facility, or acquired within 7 days after being discharged following >3 days of hospital stay were randomised to TLV 10 mg/kg intravenously every 24 hours or VAN 1 g intravenously every 12 hours for 7 to 21 days. A test-of-cure (TOC) visit was conducted 7 to 14 days after end-of-study treatment. Compliant patients who had a clinical response of either cure or failure at TOC were considered clinically evaluable (CE). Results Pooled clinical cure rates at TOC for several clinically relevant subgroups including the elderly as well as patients with severe HAP at baseline are presented in Figure 1. The percentage of patients reporting at least one treatment-emergent adverse

Figure 1 (abstract P310) Clinical cure rates at TOC (pooled CE population)

Subgroup          TLV, % (n/N)    VAN, % (n/N)    Difference (95% CI)a
APACHE II >20b    68 (36/53)      57 (33/58)      9.0 (-9.4, 27.5)
ALI or ARDS       84 (26/31)      69 (11/16)      16.1 (-9.7, 41.2)
Bacteremia        85 (17/20)      69 (18/26)      16.2 (-11.5, 36.5)
Age >65 years     81 (125/155)    76 (146/192)    4.7 (-4.0, 13.4)

aTLV - VAN; bcalculated by imputing zeroes for missing components. ALI, acute lung injury; APACHE, Acute Physiology and Chronic Health Evaluation; ARDS, acute respiratory distress syndrome.

event was comparable between the TLV and VAN groups (80% vs. 79%, respectively).

Conclusions Although not statistically significant, TLV achieved numerically higher cure rates than VAN for treatment of HAP in severely ill and older patients with comparable treatment-emergent adverse events.

Piperacillin/tazobactam administered by continuous or intermittent infusion for the treatment of nosocomial pneumonia

L Lorente, S Palmero, J Jiménez, J Iribarren, R Galván, J Martínez, C García, J Castedo, M Brouard, M Martín, M Mora

Hospital Universitario de Canarias, La Laguna SC, Tenerife, Spain Critical Care 2009, 13(Suppl 1):P311 (doi: 10.1186/cc7475)

Introduction Betalactam efficacy is determined by the duration of time that concentrations remain above the minimum inhibitory concentration (MIC). Some studies have found that the administration of betalactams by continuous infusion maintains constant concentrations above the MIC of susceptible organisms over the course of therapy, but limited data exist on the clinical efficacy of betalactams administered by continuous infusion. The purpose of this study was to evaluate the clinical efficacy of piperacillin/tazobactam administered by continuous infusion (CI) or by intermittent infusion (II) for the treatment of ventilator-associated pneumonia (VAP) caused by Gram-negative bacilli. Methods A retrospective cohort study (1 June 2002 to 31 December 2007) of patients with VAP caused by Gram-negative bacilli who received initial empiric antibiotic therapy with piperacillin/tazobactam. We analyzed two contemporary cohorts: one received piperacillin/tazobactam by CI (a loading dose of piperacillin/tazobactam 4/0.5 g over 30 minutes, followed by 4/0.5 g infused over 360 minutes every 6 hours) and the other by II (4/0.5 g over 30 minutes every 6 hours). The administration of piperacillin/tazobactam by CI or II was prescribed according to the physician's discretion.
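To illustrate the pharmacodynamic rationale described in the introduction, the sketch below simulates a simple one-compartment model and compares the fraction of the day spent above a given MIC for intermittent bolus dosing versus a continuous infusion of the same daily dose. All parameter values are hypothetical and are not taken from this study.

import numpy as np

CL, V = 10.0, 25.0           # assumed clearance (l/h) and volume of distribution (l)
k = CL / V                   # first-order elimination rate constant (1/h)
mic = 32.0                   # assumed MIC of the target organism (mg/l)
daily_dose = 16000.0         # same total daily dose (mg) in both regimens
t = np.linspace(0, 24, 2401) # 24-hour time grid (h)

# Intermittent regimen: one-quarter of the daily dose as a bolus every 6 hours
c_int = np.zeros_like(t)
for t_dose in (0, 6, 12, 18):
    c_int += np.where(t >= t_dose,
                      (daily_dose / 4 / V) * np.exp(-k * (t - t_dose)), 0.0)

# Continuous regimen: the same daily dose as a constant-rate infusion
c_cont = (daily_dose / 24.0 / CL) * (1 - np.exp(-k * t))

for name, c in (("intermittent", c_int), ("continuous", c_cont)):
    print(f"{name}: %T>MIC = {100 * np.mean(c > mic):.0f}%")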

Results No significant differences were found between the two groups of patients (37 with CI and 46 with II) in baseline data. Logistic regression analysis showed a higher probability of clinical cure of VAP with CI than with II (33/37 (89.2%) vs. 26/46 (56.5%); OR = 7.4; 95% CI = 1.96 to 37.42; P = 0.001). Logistic regression analysis showed a higher probability of clinical cure of VAP with CI than with II when the microorganism causing VAP had a MIC of 8 μg/ml (8/9 (88.9%) vs. 6/15 (40.0%); OR = 10.8; 95% CI = 1.01 to 588.24; P = 0.049) and a MIC of 16 μg/ml (7/8 (87.5%) vs. 1/6 (16.7%); OR = 22.9; 95% CI = 1.19 to 1,880.78; P = 0.03), but not when it had a MIC of 4 μg/ml (18/20 (90.0%) vs. 19/25 (76.0%); OR = 2.8; 95% CI = 0.42 to 31.67; P = 0.41). Conclusions Administration of piperacillin/tazobactam by continuous infusion may have greater clinical efficacy than administration by intermittent infusion for the treatment of nosocomial pneumonia.

Efficiency of rifampicin-miconazole-impregnated catheters in the femoral venous site

L Lorente, M Lecuona, J Iribarren, J Jiménez, C García, R Galván, J Castedo, J Martínez, M Mora, A Sierra

Hospital Universitario de Canarias, La Laguna SC, Tenerife, Spain Critical Care 2009, 13(Suppl 1):P312 (doi: 10.1186/cc7476)

Introduction The guidelines of the Centers for Disease Control and Prevention do not recommend the use of an antimicrobial or

antiseptic-impregnated catheter for short-term catheterization and recommend avoiding femoral access to reduce the risk of central venous catheter-related bacteremia (CVC-RB). The objective of this study was to determine the incidence of CVC-RB with rifampicin-miconazole-impregnated catheters (RM-C) and standard catheters (S-C) in femoral venous access; and the cost of CVC-RB with both types of catheter.

Methods A cohort study conducted in a 24-bed polyvalent medical-surgical ICU of a university hospital. We included patients admitted to the ICU from 1 June 2006 to 30 September 2007 who underwent femoral venous catheterization. The cost of CVC-RB included the cost of antimicrobial agents and of the diagnostic methods, but other costs were not included (such as prolongation of ICU stay, mechanical ventilation, renal replacement therapy, etc.). Results We inserted 73 RM-C during 634 catheter-days and 111 S-C during 927 catheter-days. We diagnosed eight CVC-RB in the S-C group and none in the RM-C group. The incidence of CVC-RB with RM-C was significantly lower than with S-C (0 vs. 8.62 per 1,000 catheter-days, OR = 0.13, 95% CI = 0.00 to 0.86; P = 0.03). The total cost in the RM-C group was €7,300 (73 central venous catheters (CVC) at €100 per CVC, and no CVC-RB), the mean cost per catheter was €100 (€7,300/73 CVC), and the mean cost per catheter-day was €11.51 (€7,300/634 CVC-days). The total cost in the S-C group was €15,330 (111 CVC at €30 per CVC = €3,330, plus €12,000 for the eight CVC-RB), the mean cost per catheter was €138 (€15,330/111 CVC), and the mean cost per catheter-day was €16.54 (€15,330/927 CVC-days).
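The cost comparison above is plain arithmetic on the abstract's own figures; the sketch below simply reproduces it so the per-catheter and per-catheter-day values can be checked. The function name is ours, and the €1,500 per bacteremia episode is implied by the reported totals (€12,000 for eight CVC-RB), not stated separately.

def group_costs(n_catheters, cost_per_catheter_eur, n_bacteremias,
                cost_per_bacteremia_eur, catheter_days):
    # Total cost, mean cost per catheter and mean cost per catheter-day
    total = n_catheters * cost_per_catheter_eur + n_bacteremias * cost_per_bacteremia_eur
    return total, total / n_catheters, total / catheter_days

# Figures reported above: impregnated (RM-C) versus standard (S-C) catheters
print(group_costs(73, 100, 0, 1500, 634))   # RM-C: (7300, 100.0, ~11.51)
print(group_costs(111, 30, 8, 1500, 927))   # S-C:  (15330, ~138.1, ~16.54)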

Conclusions RM-C could decrease the incidence of femoral venous catheter-related bacteremia and are more efficient than standard catheters.

Micafungin for the treatment of pediatric invasive fungal infections

A Arrieta1, NL Seibel2, TJ Walsh2, L Arnold3, AH Groll4

1Children's Hospital, Orange, CA, USA; 2National Institutes of

Health, Bethesda, MD, USA; 3Astellas Pharma, Deerfield, IL, USA;

4Children's Hospital, Muenster, Germany Critical Care 2009, 13(Suppl 1):P313 (doi: 10.1186/cc7477)

Introduction Pediatric patients in the ICU are highly vulnerable to invasive fungal infections [1]; Candida has been reported to be the third most common pathogen in pediatric ICU wards [2]. However, information on new antifungal therapies in children remains limited. Micafungin (MICA) is a once-daily antifungal agent of the echinocandin class.

Methods A retrospective review of MICA pediatric data from six trials.

Results MICA was received by 296 children for invasive candidiasis, refractory invasive candidiasis, refractory invasive aspergillosis, prophylaxis in hematopoietic stem cell transplantation (HSCT) patients, or to assess pharmacokinetics (PK). Most patients aged <1 year were premature (38/66), whereas most children aged >1 year were HSCT recipients or undergoing another therapy for a hematological malignancy (181/230). Maximum daily dose was similar for patients aged <1, 1 to 4, 5 to 8, 9 to 12 and 13 to 15 years, with medians of 2.0, 1.5, 1.5, 1.9 and 1.5 mg/kg, respectively. Treatment success rates are shown in Figure 1. MICA showed linear PK, with a higher clearance in neonates than in older children and adults. Common adverse events (incidence >2%) were transient increases in transaminases, hypokalemia, hyperbilirubinemia and increased alkaline phosphatase.

Figure 1 (abstract P313) Treatment response, n/N (%), by age group (years)

Therapeutic category (treatment, N)          0 to 2         3 to 7         8 to 12        13 to <16      Overall
Phase III
  Invasive candidiasis or candidemia
    Micafungin (N = 52)                      24/30 (80.0)   5/7 (71.4)     3/8 (37.5)     4/7 (57.1)     36/52 (69.2)
    Liposomal amphotericin B (N = 54)        31/40 (77.5)   6/9 (66.7)     1/3 (33.3)     2/2 (100)      40/54 (74.1)
  Prophylaxis of fungal infections
    Micafungin (N = 39)                      6/7 (85.7)     10/14 (71.4)   7/12 (58.3)    4/6 (66.7)     27/39 (69.2)
    Fluconazole (N = 45)                     6/13 (46.2)    4/7 (57.1)     10/19 (52.6)   4/6 (66.7)     24/45 (53.3)
Phase II
  Invasive aspergillosis
    Micafungin* (N = 58)                     2/5 (40.0)     7/16 (43.8)    5/18 (27.8)    12/19 (63.2)   26/58 (44.8)
  Invasive candidiasis or candidemia
    Micafungin (N = 53)                      19/28 (67.9)   7/11 (63.6)    8/9 (88.9)     4/5 (80.0)     38/53 (71.7)
Phase I
  PK, safety and tolerability
    Micafungin (N = 69)                      4/5 (80.0)     23/30 (76.7)   13/22 (59.1)   9/12 (75.0)    49/69 (71.0)

*Either alone or in combination.

Conclusions MICA is a safe and efficacious agent for the treatment and prophylaxis of pediatric invasive fungal infections in the ICU setting. References

1. Filioti J, et al.: Intensive Care Med 2007, 33:1272-1283.

2. Raymond J, et al.: Infect Control Hosp Epidemiol 2000, 21:260-263.

Malignant Boutonneuse fever with multiple organ failure: a three-case series

B Oliveira, AP Lacerda, Z Costa e Silva, C França

Hospital de Santa Maria Lisboa, Portugal

Critical Care 2009, 13(Suppl 1):P314 (doi: 10.1186/cc7478)

Introduction Rickettsia conorii infection, the agent of Boutonneuse fever, usually presents in a form considered benign, with serious complications in less than 10% of patients. Its mechanism of infection consists of vascular endothelial invasion by the microorganism and subsequent vasculitis and tissue necrosis. The process is usually localized or limited to the skin but can progress to malignant, severe systemic involvement [1,2]. Methods A series of three cases of patients with multiple organ failure admitted to an ICU with a confirmed diagnosis of Boutonneuse fever is presented (positive serology or tissue PCR). Results Common to all the patients was the rapid development of the disease (average: 3 days) from the initial complaints of fever after a bite by an unidentified agent and development of a generalized rash, to the diagnosis of multiple organ failure (average Simplified Acute Physiology Score II: 59; average Sequential Organ Failure Assessment score on admission: 12.6). All developed respiratory failure requiring invasive mechanical ventilation, haematological failure with haemolytic anaemia and acute renal failure suggestive of serious widespread vasculitis. One of the cases developed lethal refractory septic shock within 6 hours of admission. Tetracycline antibiotic therapy was started early in all patients, based on clinical and epidemiological data, since initial microbiological results were negative. Conclusions This series of cases illustrates the most severe form of Boutonneuse fever, usually associated with comorbidities such as diabetes, malignant disease, immunodeficiency or delay in the diagnosis and appropriate antibiotic therapy, which was not the case in these patients.

References

1. Aharonowitz G, Koton S, Segal S, et al.: Epidemiological characteristics of spotted fever in Israel over 26 years. Clin Infect Dis 1999, 29:1321-1322.

2. Anton E, Font B, Munoz T, et al.: Clinical and laboratory characteristics of 144 patients with Mediterranean spotted fever. Eur J Clin Microbiol Infect Dis 2003, 22:126-128.

Impact of microbiology and antimicrobial treatment on mortality in septic shock

G Dabar1, G Jamaleddine2, P Yazbeck1, M Waked3

1Hotel Dieu de France Hospital, Beirut, Lebanon; 2American University Hospital, Beirut, Lebanon; 3Saint George Hospital, Beirut, Lebanon

Critical Care 2009, 13(Suppl 1):P315 (doi: 10.1186/cc7479)

Introduction Infectious characteristics, the infection source, and appropriate antibiotics are analyzed in relation to mortality in a cohort of patients with severe sepsis and septic shock. Methods An observational and prospective cohort study over

1 year of all patients admitted to one of the adult ICUs of the three

major university hospitals in Beirut. Demographics, appropriate ICU therapies within 48 hours of admission, appropriate antibiotic prescription within 48 hours of admission, and antibiotic duration and number were registered. Infections were defined as definite or suspected. The infectious source, microbial agents and antibiograms were identified. Data were collected only during the ICU stay, and outcome was recorded at hospital discharge (alive or dead).

Results One hundred and twenty-seven patients with an average age of 65 years were enrolled; 52% of admissions were from the emergency room and 83% were medical patients. Fifty-eight percent of the patients had severe sepsis within 48 hours of admission. The average APACHE II and Sequential Organ Failure Assessment scores were 21 and 7, respectively. Immunodeficiency was present in 30%. Overall mortality was 45%. Definite infection was diagnosed in 61% of the patients and bacteremia in 31.5%. Appropriate antibiotic prescription was significantly higher in patients with less severe sepsis compared with severe sepsis (>1 organ failure). Mortality was significantly higher in patients prescribed a higher number of antibiotics, while the length of antibiotic prescription was not different between survivors and nonsurvivors. Escherichia coli (n = 14) was the most common isolate in blood cultures, followed by Candida (n = 5). Fungal pathogens were significantly more associated with nosocomial infections and with mortality. Multiresistant Pseudomonas was significantly more associated with mortality compared with extended-spectrum beta-lactamase producers or methicillin-resistant Staphylococcus aureus. A positive urine culture was protective against a higher organ failure score. In multivariate analysis, Gram-negative bacteria, a nosocomial source and immunodeficiency were all associated with an increased risk of death. Conclusions Inappropriate antibiotic prescription is associated with more severe sepsis. Mortality is higher in nosocomial sepsis. Candida is a serious nosocomial agent in severe sepsis and is associated with increased mortality, as is multiresistant Pseudomonas aeruginosa.

Do changes in ICU flora and their antibiotic susceptibility parallel changes in infection control and antibiotic use?

I Grigoras, C Caramidaru, D Rusu, L Cotarlet, O Chelarescu

'Gr.T. Popa' University of Medicine and Pharmacy Iasi, Emergency

University Hospital 'Sf Spiridon' Iasi, Romania

Critical Care 2009, 13(Suppl 1):P316 (doi: 10.1186/cc7480)

Introduction The usefulness of microbiological surveillance in ICU patients is under debate [1,2]. It can signal infection/colonization sites, most frequent germs and their antibiotic susceptibility. Our study aimed to investigate changes in the microorganism rate and their antibiotic susceptibility in ICU patients along with efforts to change antibiotic practice and infection control policies in the ICU. Methods A prospective study was performed in a 19-bed mixed ICU in an emergency university hospital. Blood, catheters, respiratory secretions, surgical wound secretions and urine were cultured any time in the case of infection suspicion or every 4 to 5 days in the case of patients with ICU length of stay >5 days. Cultures were counted no matter the infection or colonization status. Antibiotic use was recorded as daily drug doses/100 beds (DDDs). The study included two time periods: January to June 2007 (P1 - 839 admitted patients) and January to June 2008 (P2 -779 admitted patients). Comparison between the two periods included overall and site-specific rates of positive cultures, patterns of germs, antibiotic susceptibility and antibiotic usage. Results The overall number of positive cultures decreased by 31.8% (P <0.01) and per patient by 15% (P <0.01) in P2 versus P1. Germ patterns changed little over time, most frequently found being Staphylococcus aureus, Acinetobacter spp, coagulase-negative staphylococci and Pseudomonas spp. According to the site, coagulase-negative staphylococci isolated from the bloodstream decreased from 42% to 26% (P = 0.01), but Acinetobacter spp. in the respiratory tract increased from 34.5% to 43%, concurrently with a reduction in Klebsiella spp from 12.7% to 3%. Antibiotic use decreased by one-half in P2 versus P1 (68.7 DDDs vs. 113.8 DDDs, P = 0.01). Ciprofloxacin dropped from first (P1) to seventh (P2) position (75% decrease) and resulted in a decrease in ciprofloxacin resistance of S. aureus from 94% to 46% (P <0.01).

Conclusions Changes in infection control policies and antibiotic use resulted in a decreased number of positive cultures in ICU patients and a decline in antibiotic resistance in some strains. References

1. Jones ME, et al.: Emerging resistance among bacterial pathogens in the intensive care unit. Ann Clin Microbiol Antimicrob 2004, 3:14.

2. EARSS Annual Report 2005 [http://www.rivm.nl/earss/]

Prehospital administered intravenous antimicrobial protocol for septic shock: a prospective randomized clinical trial

D Chamberlain

Flinders University, Adelaide, Australia

Critical Care 2009, 13(Suppl 1):P317 (doi: 10.1186/cc7481)

Introduction Appropriate intravenous, broad-spectrum empiric antimicrobial therapy should be initiated as rapidly as possible for suspected severe infections in the presence of hypotension. Timing and delay of initial administration of effective antimicrobial therapy is an important predictor of survival [1]. It is hypothesized that appropriate prehospital-initiated intravenous, broad-spectrum

empiric antimicrobial therapy will reduce the delay of administration and will reduce mortality [2].

Methods One hundred and ninety-eight patients meeting the criteria for septic shock on initial prehospital clinical presentation were randomized to receive broad-spectrum intravenous antimicrobial therapy and fluid per a guided protocol, or to receive intravenous fluid only.

Results Out of 198 septic shock patients, 99 received prehospital antimicrobial therapy. Blood cultures were taken prior to administration. Both groups were comparable in all aspects. There were 83 male and 26 female patients in the test group, and 79 male and 20 female patients in the control group (P = 0.021). Mean age was 67.9 ± 10.5 years in the test group and 63.8 ± 11.0 years in the control group (P = 0.186). The difference in APACHE II score between the test group and the control group was not statistically significant (P = 0.661). In the test group, 28% of patients had community-acquired pneumonia, compared with 53% in the control group (P = 0.083). The mean intensive care stay in the test group was 6.8 ± 2.1 days and in the control group it was 11.2 ± 5.2 days (P = 0.001). The time to first antimicrobial administration after emergency department admission in the control group was 3.4 ± 2.6 hours (P = 0.02). The 28-day mortality rate was significantly reduced to 42.4% (test group) compared with 56.7% in the control group (P = 0.049, OR = 0.56; 95% CI = 0.32 to 1.00). Conclusions The adjuvant treatment of patients with guided prehospital-initiated broad-spectrum antimicrobial therapy with intravenous fluid reduces the delay in antimicrobial administration and significantly reduces the 28-day mortality rate in patients with septic shock. References

1. Kumar A, et al.: Duration of hypotension before initiation of effective antimicrobial therapy is the critical determinant of survival in human septic shock. Crit Care Med 2006, 34:1589-1596.

2. Miner JR, et al.: Presentation, time to antibiotics, and mortality of patients with bacterial meningitis at an urban county medical center. J Emerg Med 2001, 21:387-392.

Clinical experience with the use of colistin for the treatment of multiresistant Acinetobacter baumannii nosocomial infection

L Safi, A Elwali, K Aboulalaa, A Baite, M Leklite

Military Mohammed V Hospital, Rabat, Morocco

Critical Care 2009, 13(Suppl 1):P318 (doi: 10.1186/cc7482)

Introduction Intravenous and aerosolized colistin are being used increasingly in the critical care setting for treating patients with nosocomial infections due to multidrug-resistant Gram-negative bacteria. The objective of this study was to report our experience of colistin use in multiresistant Acinetobacter baumannii infection. Methods A retrospective review of the charts of patients admitted to the surgical intensive care unit between September 2007 and September 2008. Infected patients with multiresistant Acinetobacter treated with intravenous and aerosolized colistin were reviewed.

Results Fifteen patients were identified, 13 men and two women. The patients ranged in age from 21 to 65 years. The patients were admitted to intensive care for polytrauma (10 cases), chest trauma (three cases) and cerebrovascular accident (two cases). The main infections were ventilator-associated nosocomial pneumonia (100%), surgical site infection (6.6%), primary bacteremia (13%), catheter infection (20%), and meningitis (13%). The bacteria responsible were A. baumannii (100%), Pseudomonas aeruginosa (46%), Staphylococcus aureus (26%), Klebsiella pneumoniae (6%), Enterobacter cloacae (6%), and Morganella morganii (6%). All patients had normal renal function at the onset of antibiotic therapy. Five patients received colistin monotherapy, six patients received combination therapy of colistin with imipenem or a third-generation cephalosporin, and four patients received colistin with teicoplanin. Colistin was used at 4 mg/kg/day administered intravenously, plus aerosolized colistin at 2 mg/kg, for 8 to 16 days. A favourable response was observed in 12 cases. Overall, three patients died. Colistin induced reversible nephrotoxicity in one case and a reversible neuropsychiatric event was observed in one case. Bacterial eradication was confirmed in 14 patients (93%). Conclusions Garnacho-Montero and colleagues reported that intravenous colistin was as effective as imipenem [1]. Our findings with colistin can be considered encouraging in comparison with previous experience. Reference

1. Garnacho-Montero J, et al.: Treatment of multidrug-resistant Acinetobacter baumannii ventilator-associated pneumonia (VAP) with intravenous colistin: a comparison with imipenem-susceptible VAP. Clin Infect Dis 2003, 36:1111-1118.

Evaluation of gentamicin first-dose pharmacokinetics in septic critically ill patients: pilot study

JC Gonçalves Pereira1, A Martins2, P Povoa1

1Hospital S. Francisco Xavier, Lisboa, Portugal; 2Hospital S. José, Lisboa, Portugal

Critical Care 2009, 13(Suppl 1):P319 (doi: 10.1186/cc7483)

Introduction Aminoglycosides, especially gentamicin, are extensively used for treatment of severe infections. A dose of 7 mg/kg is recommended to achieve a high peak serum concentration. However, its accumulation is associated with adverse reactions, notably renal injury. Knowledge of gentamicin pharmacokinetics in critically ill patients is therefore crucial to therapeutic success and prevention of toxicity.

Methods Patients' demographic and clinical data were collected. We studied gentamicin pharmacokinetic data from patients treated between January 2006 and June 2008 in two ICUs. Patients were eligible if the gentamicin dose was selected to achieve high serum levels (target concentration of 16 to 24 μg/ml) and if pharmacokinetic data, specifically peak and trough levels, were measured after the first dose. The Sawchuk and Zaske one-compartment pharmacokinetic model [1] was used to calculate the gentamicin volume of distribution (Vd), maximum concentration (Cmax) and clearance.
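As an illustration only, the sketch below applies standard one-compartment, intermittent-infusion equations in the spirit of the Sawchuk and Zaske approach cited above to a single hypothetical first dose; the dose, infusion time and measured levels are invented, and the exact equations used in the study may differ.

import math

dose_mg = 480.0    # e.g. 7 mg/kg in a ~69 kg patient (hypothetical)
t_inf_h = 0.5      # infusion duration (h)
c_peak = 18.0      # level drawn at the end of the infusion (mg/l)
c_trough = 1.5     # level drawn 11.5 h after the peak (mg/l)
dt_h = 11.5        # time between the peak and trough samples (h)

k = math.log(c_peak / c_trough) / dt_h                  # elimination rate constant (1/h)
r0 = dose_mg / t_inf_h                                  # infusion rate (mg/h)
vd = r0 * (1 - math.exp(-k * t_inf_h)) / (k * c_peak)   # volume of distribution (l)
cl = k * vd                                             # clearance (l/h)

print(f"k = {k:.3f} /h, Vd = {vd:.1f} l (~{vd / 69:.2f} l/kg), CL = {cl:.1f} l/h")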

Results We studied 32 patients with a mean age of 63 years (24 men). Mean Sequential Organ Failure Assessment score was 7.65 and the mean Charlson comorbidities score was 3.54. The median Vd was 0.41 l/kg (IQR 0.36 to 0.46 l/kg). Only four patients had a Vd equal to or less than 0.26 l/kg, the value usually found in the healthy population. We found no correlation between Vd and age, Charlson comorbidities score, Sequential Organ Failure Assessment score, or creatinine serum level (r2 = 0.016, 0.18, 0.037, and 0.067, respectively). The presence of mechanical ventilation and septic shock had no influence on Vd (P = 0.59 and P = 0.14, respectively). Women had a significantly higher mean Vd/kg (0.51 vs. 0.39 l/kg, P = 0.002) and, therefore, lower Cmax (15.2 vs. 18.5 μg/ml, P = 0.016). In a logistic regression model, only sex (female: OR = 0.032; 95% CI = 0.03 to 0.387) and the dose/kg (per mg/kg: OR = 3.21; 95% CI = 1.17 to 8.79) were significantly associated with the achievement of Cmax above 16 μg/ml.

Conclusions The gentamicin Vd cannot be predicted by age, the presence of renal failure or any studied comorbidities. Therefore, in order to obtain adequate therapeutic levels as soon as possible, a high first gentamicin dose (that is, at least 7 mg/kg) should be given to all patients. In women, even higher doses may be needed. Reference

1. Sawchuk R, Zaske D: Pharmacokinetics of dosing regimens which utilize multiple intravenous infusions. J Pharmacokinet Biopharm 1976, 4:183-195.

Does early appropriate antibiotic therapy improve the outcome of severe sepsis or septic shock?

N Saito, Y Sakamoto, K Mashiko

Chiba Hokusou Hospital, Nippon Medical School, Chiba, Japan Critical Care 2009, 13(Suppl 1):P320 (doi: 10.1186/cc7484)

Introduction The Surviving Sepsis Campaign guidelines 2008 [1] recommend starting appropriate antibiotic therapy within 1 hour after making the diagnosis of septic shock. In the ICU of our emergency center, we perform empiric antibiotic therapy for septic shock patients within 6 hours after admission. Methods Cases of severe sepsis or septic shock diagnosed and treated in the ICU of our emergency center for more than 48 hours between January 2005 and September 2008 were retrospectively analyzed. The cases were divided into a survival group and a nonsurvival group, and were compared in relation to primary diagnosis, clinical findings, and type of pathogen. The chi-square test and paired t test were used to perform the statistical analysis. Results There were 107 cases, 24 cases of severe sepsis and 83 cases of septic shock, and 73 of the patients were male. The mean and standard deviation of the patients' age was 66.4 ± 15 years. The severity of illness according to the APACHE II score was 22.3 ± 8, and the Sequential Organ Failure Assessment score was 9.1 ± 4. The causes of the sepsis were pneumonia (51.4%), peritonitis (13.1%), and soft tissue infection (13.1%). Mortality was 26.2%. There were 79 cases in the survival group and 28 in the nonsurvival group. The two groups are presented in Table 1. We performed a multivariate regression analysis to identify prognostic factors, and the only independent prognostic factors were age (OR = 0.955 (P = 0.022; 95% CI = 0.91 to 0.99)), acute respiratory distress syndrome (ARDS) (OR = 5.789 (P = 0.002; 95% CI = 1.91 to 17.4)) and base deficit (OR = 1.113 (P = 0.008; 95% CI = 1.02 to 1.2)). Early appropriate antibiotic therapy (EAAT) was not correlated with survival.

Table 1 (abstract P320)

                                             Survival    Nonsurvival    P value
Early appropriate antibiotic therapy (%)     67          60.7           0.645
Gram-positive coccus (%)                     39.2        53.5           0.297
Gram-negative rod (%)                        29.1        39.2           0.351

Conclusions EAAT did not affect the outcome. The prognostic factors for severe sepsis and septic shock identified in this study were age, base deficit and ARDS. References

1. Dellinger RP, Levy MM, Carlet JM, et al.: Surviving sepsis campaign: International guidelines for management of severe sepsis and septic shock. Crit Care Med 2008, 36:296-327.

2. Kumar A, Roberts D, Wood KE, et al.: Duration of hypotension before initiation of effective antimicrobial therapy is the critical determinant of survival in human septic shock. Crit Care Med 2006, 34:1589-1596.

Strategy to reduce antibiotic prescription in cases of airway aspiration

H Bagnulo, M Godino, N Carambula

Hospital Maciel, Montevideo, Uruguay

Critical Care 2009, 13(Suppl 1):P321 (doi: 10.1186/cc7485)

Introduction Airway aspiration (AWA) of gastric content is a very frequent complication in patients presenting to emergency rooms after incidents linked to a depressed level of consciousness. The routine use of antibiotics in this situation exerts a high selection pressure for multiresistant microorganisms. However, how often AWA is the cause of early ventilator-associated pneumonia is not known. Also, a gold standard for the diagnostic workup is needed. Our aim was to determine how frequently AWA leads to infectious aspiration pneumonia confirmed by clinical course and microbiological samples, and to propose a methodological approach for ruling out antibiotic usage. Methods Over a 2-year period, in 82 patients admitted to our ICU, AWA was confirmed by the direct observation of gastric content when orotracheal intubation was performed by a trained physician due to a Glasgow Coma Scale score below 12. Usual diagnosis: acute brain injury, 52 patients (73%); Simplified Acute Physiology Score II 37 ± 11; age, 44 ± 15 years; all were mechanically ventilated, average 6 ± 2 days; ICU stay ± 10 days; mortality 21%. Rectal fever, leukocytosis, thoracic radiology and PaO2/FiO2 were recorded on a daily basis. The Clinical Pulmonary Infection Score (CPIS) and semiquantitative tracheal aspirates (SQTA) were performed twice: in the first 48 hours and between the third and fifth days. Results Out of 82 patients, 23 (28%) developed clinically and microbiologically confirmed pneumonia. Fever and leukocytosis showed no significant differences in patients with or without pneumonia during the first 5 days. Also, the PaO2/FiO2 index was not different. As for radiology, when unilateral focal condensation was present, pneumonia was confirmed later (relative risk = 3.3, 95% CI = 1.7 to 6.4). CPIS in the first 48 hours showed a negative predictive value for pneumonia of 89%, and SQTA with no microorganism growth a negative predictive value of 96%. In our patient group, 42 (51%) had CPIS <6 and no SQTA growth; in them no antibiotic usage is recommended. On the contrary, out of 20 patients who had CPIS >6 and positive SQTA, 16 (75%) developed pneumonia within the first 5 days of the ICU stay; this group warrants antibiotics. CPIS >6 without pneumonia, probably due to lung inflammation, was observed in 10 out of 28 patients (36%). Conclusions (1) In our ICU population, pneumonia develops in only 28% of those presenting with AWA. (2) Through CPIS <6 and negative SQTA performed in the first 48 hours we could identify that in more than one-half of AWA patients antibiotics are not needed. (3) CPIS is not a reliable early indicator of pneumonia in patients with AWA.

Effect of intravenous administration of freeze-dried sulfonated human normal immunoglobulin for septic patients

Y Deguchi1, T Nakagawa1, H Suga1, M Nishina1, T Sato1, K Okajima2, N Harada2

1Tokyo Women's Medical University, Tokyo, Japan; 2Nagoya City University, Nagoya, Japan

Critical Care 2009, 13(Suppl 1):P322 (doi: 10.1186/cc7486)

Introduction The proinflammatory cytokine TNF plays a critical role in the development of severe sepsis when it is excessively released.

Our animal models recently revealed that intravenous administration of freeze-dried sulfonated human normal immunoglobulin (IVIG) exerts anti-inflammatory effects by enhancing insulin-like growth factor 1 (IGF-1) production through stimulation of sensory nerves, which in turn inhibits the production of TNF. In this study, we examined whether IVIG enhances IGF-1 production in septic patients.

Methods Fourteen septic patients with a high level of soluble E-selectin, transported to our hospital from April to November 2007, were studied. They were divided into two groups: an IVIG group (G-group, 5 g/day for 3 days, nine cases, 67.2 ± 8.64 years old) and a nonadministered group (control group, C-group, five cases, 69 ± 12.9 years old).

Results In the G-group, the values of IGF-1 on the fifth day (162.22 ± 52.91 ng/ml) and seventh day (150.44 ± 29.06 ng/ml) were significantly higher than that on day 0 (88.59 ± 29.79 ng/ml, P <0.05). Furthermore, the value of IGF-1 in the G-group (162.22 ± 52.91 ng/ml) was significantly higher than that in the C-group (68.80 ± 24.04 ng/ml) on the fifth day (P <0.05).

Conclusions We showed that IVIG enhances IGF-1 production in septic patients. We will next compare the anti-inflammatory effect and mechanism of sulfonated immunoglobulin with those of intact immunoglobulin. References

1. Okajima K, Harada N, Suga H, Nakagawa T: Rapid assay for plasma soluble E-selectin predicts the development of acute respiratory distress syndrome in patients with systemic inflammatory response syndrome. Transl Res 2006, 148:295-300.

2. Harada N, Okajima K, Kurihara H, Nakagata N: Antithrombin prevents reperfusion-induced hepatic apoptosis by enhancing insulin-like growth factor-I production in mice. Crit Care Med 2008, 36:971-974.

Levels of soluble fibrin in severe septic patients given intravenous immunoglobulin

T Kobayashi1, T Nishiura2, H Suga2, T Nakagawa2

Tokyo Women's Medical University Medical Center East, Tokyo, Japan

Critical Care 2009, 13(Suppl 1):P323 (doi: 10.1186/cc7487)

Introduction Severe sepsis is known to cause multiple organ failure and high mortality. Intravenous immunoglobulin (IVIG) is used for severe sepsis patients, but its mechanism is not clearly known. One possibility, however, is that IVIG prevents organ injury by suppressing inflammation. Soluble fibrin (SF) is an intermediate between fibrinogen and fibrin oligomer and is better than fibrinogen as a material for thrombin. IVIG is considered one of the effective factors for repair of the damaged vascular endothelium. In this study we used SF as a marker of severity of illness in patients with or without IVIG and compared the levels of SF in the two groups.

Methods Nineteen severe septic patients were divided into two groups. The control group (n = 9) did not receive IVIG and the other group (n = 10) received IVIG (5 g/day for 3 days). Serum SF was measured on days 0, 1, 3, 5 and 7, and the results were analyzed.

Results The levels of SF were: control group 4.77 ± 9.44 (day 0), 3.13 ± 2.53 (day 1), 3.33 ± 2.75 (day 3), 2.53 ± 2.58 (day 5), 2.58 ± 3.75 (day 7); IVIG group 7.71 ± 8.28 (day 0), 12.26 ± 16.75 (day 1), 14.34 ± 25.56 (day 3), 8.06 ± 9.08 (day 5), 11.95 ± 12.71 (day 7). There was a significant difference in the levels of SF between the groups with and without IVIG at days 1, 5 and 7.

Conclusions Monocytes have the Fcγ receptor, and stimulation of monocyte Fcγ receptors by IgG is thought to play a role in increasing SF. In the absence of severe DIC, increased fibrin is used as a matrix in the repair of damaged vascular endothelium and prevents the invasion of germs. Increased fibrin is therefore thought to represent appropriate stimulation of coagulation and one of the reactions of biomechanical defense. We consider SF an important factor for damaged cells and a useful marker of severity in severe sepsis, and we consider IVIG effective for severe sepsis. Reference

1. Echtenacher B, et al.: Infect Immun 2001, 69:3550-3555.

Efficacy of corticosteroids on survival in patients with sepsis and septic shock: meta-analysis

N Uchiyama, N Nishimura, T Jinta, R Suda, S Yamao, S Gotoh, H Horinouchi, Y Tomishima, R Sugiura, N Chohnabayashi

St Luke's International Hospital, Tokyo, Japan

Critical Care 2009, 13(Suppl 1):P324 (doi: 10.1186/cc7488)

Introduction Previous studies have recommended the use of low-dose corticosteroids in patients with septic shock. In particular, response to corticotropin testing has been recognized as a prognostic factor in critically ill patients, especially in those with no response to the corticotropin test. A recent large randomized controlled trial of low-dose corticosteroids in severe sepsis and septic shock, however, showed no benefit on overall survival, including in patients with no response to corticotropin. Recently, recommendations for the diagnosis and management of corticosteroid insufficiency in critically ill adult patients were published as consensus statements from an international task force by the American College of Critical Care Medicine [1]. However, some studies were not included in these statements.

Methods We conducted a systematic search of EMBASE and MEDLINE (through August 2008) for double-blind randomized clinical trials that evaluated the effect of corticosteroids on mortality in patients with severe sepsis and septic shock. Study selection criteria were all trials before August 2008 in which participants were randomized to corticosteroids or placebo and in which mortality was reported.

Results Data from 17 randomized, controlled trials with a total of 3,638 participants were analyzed. Corticosteroid use was not associated with a risk reduction in overall mortality (pooled risk ratio = 0.99 (95% CI = 0.90 to 1.09), P = 0.823). Low-dose corticosteroids (150 to 300 mg/day) did not show benefit on all-cause mortality in patients with severe sepsis and septic shock (risk ratio = 0.92 (95% CI = 0.79 to 1.06)). Furthermore, low-dose corticosteroids in patients with severe sepsis and septic shock who did not respond to the corticotropin test showed no benefit on 28-day mortality (risk ratio = 0.91 (95% CI = 0.76 to 1.09)). Corticosteroid use was not associated with increased complications, such as gastrointestinal bleeding or increased infections.

Conclusions This meta-analysis indicated that administration of low-dose corticosteroids was not beneficial on overall mortality in patients with severe sepsis or septic shock. Reference

1. Marik PE, et al.: Recommendations for the diagnosis and management of corticosteroid insufficiency in critically ill adult patients: consensus statements from an international task force by the American College of Critical Care Medicine. Crit Care Med 2008, 36:1937-1949.
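The pooled risk ratios quoted in the meta-analysis above are summary estimates combined across trials. The abstract does not state the pooling model used; the sketch below shows a generic fixed-effect, inverse-variance pooling of log risk ratios, with invented per-trial counts used purely for illustration.

```python
# Generic fixed-effect, inverse-variance pooled risk ratio.
# Per-trial counts are invented for illustration; the pooling model is an assumption.
import math

# (deaths_treated, n_treated, deaths_control, n_control) per trial
trials = [(30, 100, 33, 100), (55, 150, 60, 150), (12, 80, 11, 80)]

num = den = 0.0
for a, n1, c, n2 in trials:
    log_rr = math.log((a / n1) / (c / n2))
    var = 1 / a - 1 / n1 + 1 / c - 1 / n2   # variance of the log risk ratio
    w = 1 / var                              # inverse-variance weight
    num += w * log_rr
    den += w

pooled = num / den
se = math.sqrt(1 / den)
print(f"pooled RR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96 * se):.2f} to {math.exp(pooled + 1.96 * se):.2f})")
```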

Effect of hydrocortisone therapy on outcome and the incidence of infection in patients with septic shock

H Hayami, O Yamaguchi, H Yamada, S Nagai, S Ohama, A Sakurai, Y Sugawara

Yokohama City University Medical Center, Yokohama, Japan Critical Care 2009, 13(Suppl 1):P325 (doi: 10.1186/cc7489)

Introduction In the CORTICUS study, although the median time to shock reversal was shorter in the hydrocortisone (HC) group, no improvement in outcome was demonstrated. This has been attributed to an increased incidence of superinfection, including new episodes of sepsis or septic shock, but the explanation remains unclear and the study is still debated. We examined the outcome of HC therapy and its effect on later infectious complications, especially fungal infection.

Methods A retrospective cohort study of 96 adult patients (>18 years old) with septic shock treated in a single university general ICU. Initially, fluid therapy, inotropic support, transfusion and management of infection were carried out according to the Surviving Sepsis Campaign guidelines, and we assessed adrenal function by an adrenocorticotropic hormone stimulation test (250 µg). In the earlier period (April 2004 to March 2007), 200 mg HC was administered to nonresponders. Later (April 2007 onwards), HC was given only to patients who were poorly responsive to fluid resuscitation and vasopressor therapy. We assessed the 28-day and 90-day survival rates, the hospital discharge rate, the incidence of new septic shock, and the positive rates of endotoxin and β-D-glucan.

Results Of 96 patients, 38 were determined to be responders and 58 nonresponders. The Sequential Organ Failure Assessment score was higher in nonresponders (11 ± 3.3 vs. 13.6 ± 3.7, P = 0.0014). Of the 58 nonresponders, HC was used in 40 patients (HC group). The 28-day survival rate was higher in the HC group (61% vs. 44% in non-HC, P = 0.49), but the 90-day survival rate decreased to 33% in HC and 28% in non-HC (P = 0.47), and the discharge rate was 25% in HC versus 28% in non-HC (P = 0.84). The incidence of new septic shock was 20% in HC and 11% in non-HC. The positive rate of endotoxin did not increase after treatment in either group. The mean concentration of β-D-glucan did not increase in non-HC (59 ± 99 to 48 ± 44 pg/ml), but it significantly increased in HC (50 ± 70 to 69 ± 186 pg/ml) (P = 0.049).

Conclusions The 28-day survival rate was higher in the HC group than in the non-HC group, as reported in the French study, but no beneficial effect was seen on longer term outcome, consistent with the CORTICUS study. Moreover, HC therapy may carry a risk of increasing fungal infection.

Effects of methylprednisolone on sepsis-induced blood-brain barrier permeability and reflex responsiveness in rats

F Esen1, T Erdem1, A Ogan2, E Senturk1, P Ergin Ozcan1, M Kaya1, N Cakar1, L Telci1

1University of Istanbul, Istanbul, Turkey; 2International Hospital, Istanbul, Turkey

Critical Care 2009, 13(Suppl 1):P326 (doi: 10.1186/cc7490)

Introduction The purpose of this study was to assess neurological function and examine blood-brain barrier (BBB) permeability changes in septic encephalopathy after cecal ligation and perforation (CLP). The protective effect of high-dose methylprednisolone administration on BBB derangement and neurological dysfunction was also investigated.

Methods An experimental prospective randomized study was performed on 42 adult male Sprague-Dawley rats. Sepsis was induced through CLP. A bolus 30 mg/kg methylprednisolone administration followed by intravenous injection of 5.4 mg/kg/hour for 8 hours was given immediately after CLP in septic and sham-operated rats. Control groups for both septic and sham-operated rats were injected with equal volumes of saline. Results Neurological function was assessed at 8, 12, and 24 hours after sepsis induction. Those rats surviving for 24 hours post procedure were anesthetized and decapitated for investigation of BBB integrity using a spectrophotometric assay of Evans blue dye extravasations. A significant decrease in neurological function and a significant increase in BBB permeability were observed with the induction of sepsis compared with the sham-operated controls. Administration of methylprednisolone caused a significantly improved reflex responsiveness and decreased brain tissue Evans blue dye content in septic rats.

Conclusions The present investigation shows that methylprednisolone administered immediately after sepsis induction via CLP is associated with better neurologic scores, which might be related to its positive effects on sepsis-induced BBB permeability changes.

Systemic administration of E-selectin-directed dexamethasone liposomes attenuates pulmonary inflammation in a mouse model of ventilator-induced lung injury

MA Hegeman1, PM Cobelens1, MP Hennus1, NJ Jansen1, MJ Schultz2, JA Kamps3, G Molema3, AJ Van Vught1, CJ Heijnen1

1UMC Utrecht, the Netherlands; 2AMC Amsterdam, the Netherlands; 3UMC Groningen, the Netherlands

Critical Care 2009, 13(Suppl 1):P327 (doi: 10.1186/cc7491)

Introduction Mechanical ventilation (MV) may evoke damage to healthy lungs leading to ventilator-induced lung injury (VILI). We hypothesized that downregulation of pulmonary inflammation may attenuate VILI. The present study investigated whether lung inflammation and injury can be reduced by dexamethasone (Dex). To prevent the systemic negative side effects of free Dex, we targeted Dex to the activated endothelium of inflamed lungs by delivering anti-E-selectin Dex liposomes.

Methods Mice were tracheotomized and ventilated in the pressure control mode for 5 hours with 50% oxygen, 2 cmH2O positive end-expiratory pressure and 12 cmH2O peak inspiratory pressure (lung protective, LP-MV) or 20 cmH2O peak inspiratory pressure (lung injurious, LI-MV). Nonventilated mice were used as controls. Free Dex, anti-E-selectin or isotype IgG Dex liposomes were given systemically at initiation of MV. After 5 hours, lung injury was determined by arterial oxygen levels, tissue edema and bronchoalveolar lavage protein levels. Pulmonary inflammation was assessed by myeloperoxidase activity and IL-1β, keratinocyte-derived chemokine and E-selectin mRNA.

Results LI-MV decreased oxygenation levels, increased tissue edema and increased bronchoalveolar lavage protein levels as compared with LP-MV, indicating that lung function deteriorated with LI-MV. Both MV strategies enhanced pulmonary inflammation. Anti-E-selectin Dex liposomes and free Dex, but also IgG Dex liposomes, reduced ventilator-induced lung inflammation. However, Dex treatment did not diminish lung injury.

Conclusions Systemic administration of targeted and free Dex attenuates VILI-associated pulmonary inflammation, but not lung injury.

Relative adrenal insufficiency in cardiopulmonary bypass surgery patients: impact on the postoperative hemodynamic status

J Jimenez, J Iribarren, M Brouard, L Lorenzo, L Lorente, R Perez, N Perez, L Raja, R Martinez, M Luisa

Hospital Universitario de Canarias, La Laguna SC, Tenerife, Spain Critical Care 2009, 13(Suppl 1):P328 (doi: 10.1186/cc7492)

Table 1 (abstract P329)

                                    Septic group with CIRCI   Septic group without CIRCI   Nonseptic group
                                    (n = 37)                  (n = 29)                     (n = 20)
Total adrenal gland volume (cm³)    16.4 ± 4.4                16.1 ± 4.8                   5.5 ± 2.6*

Response to the ACTH test did not influence the adrenal volume. *P <0.05.

Introduction The objective of this study was to determine the incidence and risk factors for relative adrenal insufficiency (RAI) in cardiopulmonary bypass (CPB) patients and the impact on postoperative hemodynamic status.

Methods A prospective cohort study was performed on elective CPB patients in a 24-bed ICU of a tertiary university hospital. RAI was defined as a rise in serum cortisol <9 µg/dl after the administration of 250 µg cosyntropin. Plasma cortisol levels were measured preoperatively, immediately before and 30, 60 and 90 minutes after the administration of cosyntropin (250 µg), and at 24 postoperative hours.

Results We included 120 of 137 consecutive patients; 17 met exclusion criteria (eight off-pump, two surgical emergencies, two with endocarditis, five corticoid dependency). We studied 84 (70%) males and 36 (30%) females, with a mean age of 67 ± 12 years. The incidence of RAI (Δcortisol <9 µg/dl) was 77.5%. Etomidate was the only independent risk factor associated with RAI (OR = 8.51, 95% CI = 3.09 to 23.42). RAI patients had higher vasopressor requirements immediately after surgery (P = 0.04) and at 4 postoperative hours (P = 0.01). Pretest and post-test plasma cortisol levels were inversely associated with the maximum norepinephrine dose in the same time periods (ρ = -0.22, P = 0.02; ρ = -0.18, P = 0.05; ρ = -0.21, P = 0.02; and ρ = -0.22, P = 0.02, respectively).

Conclusions RAI and lower cortisol levels in CPB patients induce postoperative vasopressor dependency. Use of etomidate in these patients is a modifiable risk factor for the development of RAI and should be avoided.
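The RAI definition above is a simple threshold on the cortisol rise after cosyntropin. A minimal sketch of that rule follows; the function name and example values are illustrative, and taking the peak post-stimulation value is an assumed convention rather than a detail stated in the abstract.

```python
# Sketch of the study's RAI definition: a rise in serum cortisol of less than
# 9 µg/dl after 250 µg cosyntropin. Names and example values are illustrative.
def has_rai(baseline_cortisol, post_stim_cortisols, threshold=9.0):
    """post_stim_cortisols: cortisol values (µg/dl) at 30/60/90 min after cosyntropin."""
    delta = max(post_stim_cortisols) - baseline_cortisol   # peak rise (assumed convention)
    return delta < threshold

print(has_rai(14.0, [18.5, 20.0, 19.2]))   # rise of 6.0 µg/dl -> True (RAI)
print(has_rai(10.0, [25.0, 28.0, 26.5]))   # rise of 18.0 µg/dl -> False
```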

Adrenal gland evaluation in septic shock patients: preliminary results of the first CT-scan study

B Jung, S Nougaret-Jung, G Chanques, S Aufort, N Claveiras, N Rossel, B Gallix, S Jaber

Saint-Eloi Hospital, Montpellier, France

Critical Care 2009, 13(Suppl 1):P329 (doi: 10.1186/cc7493)

Introduction Although critical-illness-related corticosteroid insufficiency (CIRCI) is common in septic shock patients, its diagnosis and management remain controversial [1,2]. The recent consensus conference on CIRCI reported that baseline total serum cortisol and the value after 250 µg synthetic ACTH are insufficient to evaluate the adrenal response to sepsis [1]. To our knowledge, no CT-scan study has evaluated adrenal gland imaging during septic shock. The aim of our study was to describe the adrenal gland volume evaluated by CT scan during septic shock.

Methods A single-center study during 1 year. Two groups of patients who underwent an abdominal CT scan were studied. Patients who presented with septic shock, an ACTH test and an abdominal CT scan in the 72 hours before or after the onset of shock were included in the septic shock group (shock+). Patients who underwent an abdominal CT scan but did not present any shock were included in the nonshock group (shock-). CIRCI was defined as a delta serum total cortisol of less than 9 µg/dl after an ACTH test of 250 µg. The main endpoint was the total adrenal gland volume evaluated by CT scan with semiautomatic software (Myriane Intrasense, Montpellier, France).

Results In this preliminary analysis we compared 66 patients in septic shock with 20 patients without any shock. In the septic shock group, the Simplified Acute Physiology Score II was 46 (40 to 56) and ICU mortality was 36%. The total adrenal gland volume is presented in Table 1. The ACTH test was not evaluable in four patients in the septic shock group, leading us to analyse 62 patients.

Conclusions Our preliminary results show for the first time that in septic shock the total adrenal gland volume is three to four times higher than in nonseptic patients. The diagnostic and prognostic significance of this increased volume should be further evaluated. References

1. Marik PE, et al.: Recommendations for the diagnosis and management of corticosteroid insufficiency in critically ill adult patients: consensus statements from an international task force by the American College of Critical Care Medicine. Crit Care Med 2008, 36:1937-1949.

2. Sprung CL, et al.: Hydrocortisone therapy for patients with septic shock. N Engl J Med 2008, 358:111-124.

Effects of systemic steroid in patients with severe community-acquired pneumonia requiring mechanical ventilation

G Chon1, C Lim2, Y Koh2, S Hong2

1Chungju Hospital, Konkuk University, Chungju, Republic of Korea; 2Asan Medical Center, Seoul, Republic of Korea

Critical Care 2009, 13(Suppl 1):P330 (doi: 10.1186/cc7494)

Introduction The effects of systemic steroids on mortality in patients with severe community-acquired pneumonia (CAP) requiring mechanical ventilation in a medical ICU (MICU) remain unclear [1-4]. The aim of this study was to evaluate whether systemic steroids improve survival in patients with severe CAP requiring mechanical ventilation in the MICU.

Methods A retrospective, observational study of 88 patients with severe CAP requiring mechanical ventilation in the MICU of the Asan Medical Center, Ulsan University, Seoul, South Korea. We collected information on clinical and laboratory data and on 28-day and 3-month survival from electronic medical records.

Results From January 2005 to November 2006 we included 88 patients with severe CAP requiring mechanical ventilation in the MICU. Clinical baseline characteristics, APACHE II score and Sequential Organ Failure Assessment score were similar between the steroid group and the nonsteroid group. Steroids were used in most patients with acute respiratory distress syndrome (21/23) and shock (57/75) complicating severe CAP. In multivariate analysis, a longer hospital stay (OR = 1.162; 95% CI = 1.055 to 1.279), a shorter ICU stay (0.824; 0.719 to 0.944), and an improvement in the Sequential Organ Failure Assessment score from day 1 to day 7 (1.447; 1.103 to 1.898) were associated with increased 3-month survival. Systemic steroids, however, did not improve 28-day or 3-month survival.

Conclusions Systemic steroids did not improve survival in patients with severe CAP requiring mechanical ventilation in the MICU. References

1. Garcia-Vidal C, et al.: Effects of systemic steroids in patients with severe community-acquired pneumonia. Eur Respir J 2007, 30:951-956.

2. Confalonieri M, et al.: Hydrocortisone infusion for severe community-acquired pneumonia. Am J Respir Crit Care Med 2005, 171:242-248.

3. Christ-Crain M, et al.: Free and total cortisol levels as predictors of severity and outcome in community-acquired pneumonia. Am J Respir Crit Care Med 2007, 176:913-920.

4. Franchimont D, et al.: Glucocorticoids and inflammation revisited. Neuroimmunomodulation 2003, 10:247-260.

Statin therapy in patients admitted to hospital with presumed infection

P Kruger1, M Harward1, J Helyar1, B Venkatesh2, M Jones2

1Princess Alexandra Hospital, Brisbane, Australia; 2University of Queensland, Brisbane, Australia

Critical Care 2009, 13(Suppl 1):P331 (doi: 10.1186/cc7495)

Introduction Several studies, largely retrospective, have suggested improved outcomes and a reduction in inflammatory response in patients who develop infection whilst on statin therapy. This study reports outcomes for patients on established statin therapy admitted to hospital with presumed infection compared with those patients not on a statin.

Methods The study was approved by the Princess Alexandra Hospital Research Ethics Committee and conducted from May 2006 to October 2008 as part of a randomised controlled study investigating continuing established statin therapy. A daily computer-generated report identified all patients admitted to hospital with a potential diagnosis of infection.

Results From a total of 2,291 patients screened, 2,239 were considered to have presumed infection. Of these, 2,161 were aged >30 years and were included in the final analysis (statin users = 633, no statin use = 1,528). The statin group was significantly older, with a mean age (± SD) of 69 ± 12 versus 61 ± 16 years (P = 0.0001), and included more men (62.1% vs. 55.2%, P = 0.004). The source of presumed infection was not statistically different between the groups (overall chi-square P = 0.16). A presumed respiratory source was the most common in both groups (statin 35.5% vs. no statin 37.1%), followed by skin (19.0% vs. 20.3%), gastrointestinal (17.8% vs. 14.1%), urosepsis (13.0% vs. 11.7%) and other sources (14.7% vs. 16.8%). Two or more SIRS criteria were seen in 65.4% of the statin group and 67.8% of the no statin group (P = 0.30). Hospital mortality was similar in both groups (7.4% statin group and 6.5% no statin group, P = 0.64), as was the rate of ICU admission (17% statin group and 16% no statin group; P = 0.63). Hospital length of stay was not significantly different between the groups (11.4 ± 15.4 days in the statin group and 12.5 ± 26.2 days in the no statin group, P = 0.10).

Conclusions In a heterogeneous cohort of patients admitted to hospital with presumed infection, prior statin therapy was not associated with an improved outcome or reduced systemic inflammatory response. The age and gender differences observed are in keeping with previous literature and support the assertion that statin users represent a different patient group to those not using statin therapy.
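The between-group comparison of infection sources above relies on an overall chi-square test on a contingency table. The sketch below illustrates that test; the counts are approximate figures back-calculated from the reported percentages, used only to show the calculation, not the study's raw data.

```python
# Sketch of an overall chi-square comparison of infection-source distributions
# between statin users and non-users. Counts are approximate/hypothetical.
from scipy.stats import chi2_contingency

# rows: statin users (n = 633), no-statin (n = 1,528)
# columns: respiratory, skin, gastrointestinal, urosepsis, other
table = [
    [225, 120, 113, 82, 93],
    [567, 310, 215, 179, 257],
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, dof = {dof}, P = {p:.2f}")
```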

Evolution of inflammation in non-ICU patients with infections: pilot prospective cohort study

A Donnelly1, NK Adhikari2, R Pinto2, Z Salih1, C McKenzie1, M Terblanche1

1St Thomas's Hospital, London, UK; 2Sunnybrook Health Sciences Centre, Toronto, ON, Canada

Critical Care 2009, 13(Suppl 1):P332 (doi: 10.1186/cc7496)

Introduction Statins may prevent organ dysfunction in patients with infections, but the optimal time for this therapy is unknown. Our objective was to determine the evolution of inflammation in patients treated for infection on general wards and the impact of previous statin use.

Methods We performed a single-centre prospective cohort study (April to September 2008) in unselected patients admitted to medical wards with infection, collecting data on demographics, comorbidities, and statin use before admission; and 10-day follow-up data on intensive care or high-dependency unit (ICU/HDU) admission and death, systemic inflammatory response syndrome (SIRS) criteria and organ dysfunction (using a modified Sequential Organ Failure Assessment (SOFA) score), and infection markers (C-reactive protein (CRP), white blood cells (WBC)). Evolution of organ dysfunction, SIRS, WBC and CRP were analysed descriptively; continuous data are expressed as the mean (SD) or median (Q1 to Q3). The effect of statins was explored in regression models accounting for within-patient correlation, with P <0.05 taken as statistically significant.

Results Two hundred and nine patients were admitted with infections (lung 51.0%, urinary 34.2%, skin/soft tissue 18.5%, other 5.2%; >1 infection/patient possible): age 63.8 years (20.7), 49.8% male, Charlson score 2 (1 to 3), previous statin users 27.8%, WBC 15.6 (12.0) x 10⁹/l, CRP 105 (113) mg/l. On admission, 88.9% had >1 SIRS criterion (median 2 (1 to 3)) and 72.3% had a modified SOFA score >1 (median 1 (0 to 2)), with no differences between the statin and non-statin groups. CRP, WBC, and the proportion of patients with >1 SIRS criterion and modified SOFA score >1 decreased over time (P <0.0001), but generalized linear mixed models showed no effect of statins (P = 0.98, 0.51, 0.55, and 0.25) when adjusted for time, age, sex, and Charlson score. By day 10, seven patients were admitted to the ICU/HDU, four patients had died, and 64 patients had >1 day with a higher modified SOFA score versus admission. Overall, 35.9% of patients developed this combined outcome (statin (44.8%) vs. non-statin (32.5%); OR = 1.66, 95% CI = 2.85 to 3.25 after adjustment for age, sex, Charlson score).

Conclusions Ward patients with infection often develop some organ dysfunction, but the risk of death/higher care is low. Trials of statins to prevent such clinically important outcomes would need to be large.

Heart rate variability in Egyptian children with acute rheumatic fever

MM Farid, M Abd Elmonim, M Elganzoury, M Abou Elmaaty, O Youssef

Faculty of Medicine - Ain Shams University, Cairo, Egypt Critical Care 2009, 13(Suppl 1):P333 (doi: 10.1186/cc7497)

Introduction Autonomic dysfunction in relation to cardiovascular system morbidity and mortality has been reported. The objective of this study was to assess cardiac autonomic balance and heart rate variability (HRV) indices in acute rheumatic fever.

Methods A prospective study was conducted on three groups. Group I included 20 patients with acute rheumatic carditis, 10 in heart failure (Group Ia) and 10 compensated (Group Ib). Group II included 20 patients with no cardiac involvement (10 with chorea (Group IIa) and 10 with arthritis (Group IIb)). Group III included 20 healthy controls. Plasma norepinephrine (NE) assay, echocardiography, cardiovascular autonomic function tests (5AFTs) and 24-hour ambulatory ECG (Holter) monitoring for arrhythmias and HRV were performed. Studied HRV variables included the standard deviation of all normal RR intervals (SDNN) and the percentage of differences between adjacent normal RR intervals that are greater than 50 ms (PNN50).

Table 1 (abstract P333)

                     Group Ia (n = 10)   Group Ib (n = 10)   Group IIa (n = 10)   Group IIb (n = 10)   Group III (n = 20)
5AFTs score          6.7 (0.95)          3.2 (0.92)          0.6 (0.5)            0.6 (0.5)            0.6 (0.5)
Plasma NE (pg/ml)    3,061 (1,008)       1,638 (1,129)       455 (111)            438 (113)            423.5 (115)
SVPBs                13 (8.9)            8.6 (5.9)           0.9 (0.7)            1 (0.67)             0.8 (0.77)
VPBs                 50.7 (26.4)         13.6 (5.48)         0.7 (0.82)           0.7 (0.8)            0.8 (0.83)

Data presented as mean (SD).

Results Group I had significantly higher values for 5AFTs score, NE, supraventricular premature beats (SVPBs) and ventricular premature beats (VPBs) (P <0.001). Measured variables were more affected in Group Ia than in Group Ib (Table 1), and in patients with New York Heart Association class IV heart failure compared with class III.

Conclusions Acute rheumatic carditis is associated with significant autonomic dysfunction. HRV indices can be used for risk stratification in those patients.
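The two HRV indices used above, SDNN and PNN50, are straightforward to compute from a series of normal RR intervals. A minimal sketch follows, with an invented RR series for illustration; it is not the study's analysis pipeline.

```python
# Sketch of SDNN and pNN50 from a list of normal RR intervals in milliseconds.
# The RR series is invented for illustration.
import statistics

def sdnn(rr_ms):
    """Standard deviation of all normal RR intervals (ms)."""
    return statistics.stdev(rr_ms)

def pnn50(rr_ms):
    """Percentage of successive RR interval differences greater than 50 ms."""
    diffs = [abs(b - a) for a, b in zip(rr_ms, rr_ms[1:])]
    return 100 * sum(d > 50 for d in diffs) / len(diffs)

rr = [812, 790, 845, 870, 798, 760, 915, 880, 842, 805]
print(f"SDNN = {sdnn(rr):.1f} ms, pNN50 = {pnn50(rr):.1f}%")
```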

Recombinant human activated protein C reduces cardiac 3-nitrotyrosine and malondialdehyde levels in ovine acute respiratory distress syndrome and septic shock

MO Maybauer, DM Maybauer, JF Fraser, L Kiss, C Szabo, LD Traber, M Westphal, S Rehberg, P Enkhbaatar, DS Prough, DN Herndon, DL Traber

The University of Texas Medical Branch and Shriners Burns Hospital, Galveston, TX, USA

Critical Care 2009, 13(Suppl 1):P334 (doi: 10.1186/cc7498)

Introduction We have recently shown that recombinant human activated protein C (rhAPC) improves pulmonary function [1] and cardiac performance [2] in ovine acute respiratory distress syndrome and sepsis. Peroxynitrite (ONOO-) is known to inactivate adrenoreceptors in sepsis, and its formation can be detected through 3-nitrotyrosine (3-NT). rhAPC has been shown to reduce lung 3-NT levels in sepsis [1]. We therefore hypothesized that rhAPC might also reduce cardiac 3-NT levels, and studied cellular enzymes involved in the ONOO- pathway in cardiac tissue.

Methods Fifteen sheep (33 to 38 kg) were operatively prepared for chronic study and randomly allocated to the sham (uninjured, untreated), control (injured) or treatment (rhAPC) group (n = 5 each). After a tracheotomy, acute lung injury was produced in the control and rhAPC groups by insufflation of cotton smoke, followed by instillation of Pseudomonas aeruginosa bacteria into the lungs according to an established protocol [1]. The sheep were studied for 24 hours in the awake state and ventilated with FiO2 1.0. In the treatment group, rhAPC (24 µg/kg/hour) was intravenously administered, beginning 1 hour post injury. Heart tissue 3-NT, myeloperoxidase (MPO), and malondialdehyde (MDA) contents were measured (ELISA) after 24 hours. Data are presented as the mean ± SEM; *significance, P <0.05.

Results After 24 hours, 3-NT levels (nM/ml/mg) were 19 ± 3 in sham and were significantly increased in the control group (101 ± 11*). The rhAPC group (22 ± 18*) showed significantly lower 3-NT tissue levels than controls. MDA levels (µM/ml/mg) were 48 ± 5 in sham and were significantly increased in the control group (85 ± 3*). The rhAPC group (41 ± 3*) showed significantly lower MDA tissue levels than controls. The MPO activity (mU/mg) showed no differences between groups: sham (180 ± 11), control (200 ± 15), and rhAPC (210 ± 20), respectively.

Conclusions rhAPC has no influence on cardiac MPO levels, but significantly reduced heart tissue 3-NT and MDA levels in ovine acute respiratory distress syndrome and septic shock, thereby improving cardiac performance. These findings may lead to further investigations of rhAPC and cardiovascular function. References

1. Maybauer MO, et al.: Recombinant human activated protein C improves pulmonary function in ovine acute lung injury resulting from smoke inhalation and sepsis. Crit Care Med 2006, 34:2432-2438.

2. Maybauer MO, et al.: Recombinant human activated protein C improves cardiac performance in ovine septic shock following acute lung injury. Anesthesiology 2005, 103:A243.

Altered plasma proteome during an early phase of an experimental model of peritonitis-induced sepsis

T Karvunidis1, V Thongboonkerd2, W Chiangjong2, J Mares1, Z Tuma1, J Moravec1, S Sinchaikul3, S Chen3, K Opatrny Jr1, M Matejovic1

1University Hospital and Charles University, Pilsen, Czech Republic; 2Siriraj Hospital, Mahidol University, Bangkok, Thailand; 3Academia Sinica, Taipei, Taiwan

Critical Care 2009, 13(Suppl 1):P335 (doi: 10.1186/cc7499)

Introduction The pathophysiology and molecular mechanisms involved in sepsis are complex and poorly understood. We performed a proteomics study to characterize early host responses to sepsis as determined by altered plasma proteome in a porcine model of peritonitis-induced sepsis.

Methods In seven instrumented and mechanically ventilated pigs, sepsis was induced by inoculating autologous faeces. Haemodynamics, oxygen exchange, inflammatory responses, oxidative and nitrosative stress, and other laboratory parameters were monitored. Plasma samples were obtained before and 12 hours after the induction of hyperdynamic sepsis. Plasma proteins were resolved by two-dimensional electrophoresis. Proteins that changed in abundance were identified by mass spectrometry (MS, quadrupole TOF/TOF MS and MS/MS).

Results From approximately 1,500 protein spots in each gel, the levels of 47 protein spots were significantly altered in the septic plasma samples compared with the corresponding baseline samples. MS identified 35 protein forms representing 22 unique proteins whose plasma levels were increased, whereas 12 forms of eight unique proteins were significantly decreased during sepsis.

Conclusions Using a proteomics approach, we identified a set of plasma proteins with significantly altered levels during an early phase of sepsis in a porcine model of peritonitis-induced sepsis. Most of these altered proteins have important roles in the inflammatory response. Some findings are novel, and exploring their roles may lead to the identification of new therapeutic targets. Acknowledgement Supported by MSM 0021620819 - Replacement of and support to some vital organs.

Increased epithelial apoptosis and decreased oxidant injury in silymarin-treated rats with sepsis-induced acute lung injury

S Canikli, N Bayraktar, S Turkoglu, O Ozen, M Unlukaplan, A Pirat

Baskent University School of Medicine, Ankara, Turkey Critical Care 2009, 13(Suppl 1):P336 (doi: 10.1186/cc7500)

Introduction Studies have demonstrated that silymarin (milk thistle) has cytoprotective effects and induces apoptosis. We hypothesized that silymarin decreases sepsis-induced acute lung injury (ALI) in a cecal ligation and puncture (CLP) rat model through its anti-inflammatory and antioxidant effects.

Methods Forty-eight rats were randomized to sham (n = 16), control (n = 16), and silymarin (n = 16) groups. ALI was induced with CLP in the control and silymarin groups. Animals in the silymarin group received silymarin 50 mg/kg/day for 3 days before the experiment and 2 days afterward. Serum and bronchoalveolar lavage fluid TNFα, IL-1β, and IL-6; lung tissue malondialdehyde and glutathione levels; lung histopathologic examination; and lung wet-to-dry (w/d) weight ratio measurements were used to compare and evaluate the severity of lung injury between the groups. Apoptosis was quantitated using paraffin sections of lung by fluorescent terminal deoxynucleotidyltransferase-mediated dUTP nick end-labeling. Survival analyses were also performed.

Results Mortality rates for the silymarin and control groups were 37.5% and 87.5%, respectively (log-rank P = 0.0503). Compared with the silymarin group, the control group exhibited significantly more severe lung injury, as indicated by higher mean values for serum and bronchoalveolar lavage fluid TNFα (P = 0.01 for both), IL-1β (P < 0.028 for both), and IL-6 (P < 0.01 for both); neutrophil infiltration of the lungs (P = 0.003); pulmonary edema (P = 0.001); total lung histopathologic injury score (P < 0.001); w/d ratio (P = 0.019); and lung tissue malondialdehyde (P = 0.011) levels. Lung tissue glutathione levels were significantly higher in the silymarin group than in the control group (P = 0.001). Compared with the control group, induction of apoptosis was increased in the silymarin group (P = 0.003).

Figure 1 (abstract P337). Comparison of mean blood viscosity, across shear rates of 1 to 1,000/second, of the groups Lap (sham-operated group), CLP (cecal ligation-puncture group) and Nat + CLP (nattokinase-supplemented cecal ligation-puncture group).

Conclusions Silymarin reduces the severity of sepsis-induced ALI and may also improve survival in a CLP rat model. These beneficial effects of this agent are probably due to its inhibitory effects on the inflammatory process and oxidative injury and its proapoptotic effect.
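The survival comparison above uses a log-rank test. The sketch below shows how such a comparison can be run with the lifelines package; the survival times and censoring flags are invented, and only the group sizes and death counts (6/16 vs. 14/16) mirror the abstract.

```python
# Sketch of a log-rank comparison of two survival curves using lifelines.
# Follow-up times are invented; death counts match the reported 37.5% vs. 87.5%.
from lifelines.statistics import logrank_test

silymarin_days = [7] * 10 + [3, 4, 4, 5, 5, 6]
silymarin_dead = [0] * 10 + [1] * 6            # 6/16 deaths
control_days   = [7, 7] + [2, 2, 3, 3, 3, 4, 4, 4, 5, 5, 5, 6, 6, 6]
control_dead   = [0] * 2 + [1] * 14            # 14/16 deaths

result = logrank_test(silymarin_days, control_days,
                      event_observed_A=silymarin_dead,
                      event_observed_B=control_dead)
print(f"log-rank P = {result.p_value:.4f}")
```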

Effect of nattokinase supplementation on plasma fibrinogen levels, whole blood viscosity and mortality in experimental sepsis in rats

M Cengiz1, M Yilmaz1, A Ramazanoglu1, S Ozdem1, H Meiselman2, O Baskurt1

1Akdeniz University, Antalya, Turkey; 2Keck School of Medicine, Los Angeles, CA, USA

Critical Care 2009, 13(Suppl 1):P337 (doi: 10.1186/cc7501)

Introduction Nattokinase is a serine protease derived from fermentation of boiled soybeans that has a potent fibrinolytic effect [1]. The aim of this experimental study was to investigate the effects of nattokinase on plasma fibrinogen levels, whole blood viscosity and mortality in rats with sepsis.

Methods Fifty adult female Wistar rats were used in the study. Nattokinase (6 mg/day) was given via the intragastric route for 7 days. Sepsis was induced by cecal ligation-puncture. Plasma fibrinogen levels, whole blood viscosity and survival were analyzed.

Results Mean plasma fibrinogen levels of rats that received nattokinase prior to cecal ligation-puncture were lower than in rats that did not receive nattokinase and in the sham-operated group, but the differences were not statistically significant. Mean blood viscosity, measured with the Rheolog™ scanning capillary viscometer, was lower in the nattokinase-supplemented cecal ligation-puncture group at a shear rate of 1/second (P <0.05) (Figure 1). Nattokinase supplementation did not significantly influence survival rates or survival times after cecal ligation-puncture.

Conclusions The results of our in vivo study were not sufficient to prove a fibrinogenolytic effect of nattokinase but confirmed the results of a previous in vitro study investigating the effect of nattokinase on blood viscosity. More in vivo and in vitro studies with large study populations are required to investigate the use of nattokinase to prevent and treat sepsis-related disseminated intravascular coagulation. Reference

1. Sumi H, et al.: A novel fibrinolytic enzyme (nattokinase) in the vegetable cheese Natto; a typical and popular soybean food in the Japanese diet. Experientia 1987, 3:1110-1111.

Physicians' perceptions of current management strategies for Gram-negative pneumonia: a multinational study

A Frank

Bayer Schering Pharma AG, Berlin, Germany

Critical Care 2009, 13(Suppl 1):P338 (doi: 10.1186/cc7502)

Introduction Hospital-acquired pneumonia and its most serious manifestation, ventilator-associated pneumonia, are associated with high rates of mortality and account for 25% of all infections in the ICU [1]. Pneumonia in mechanically ventilated (MV) patients is often complicated by the involvement of multidrug-resistant (MDR) Gram-negative bacteria [2]. This multinational study explored physicians' prescribing behaviour when treating Gram-negative pneumonia in MV patients.

Methods Online interviews were conducted with 510 critical/intensive care, infectious disease and pulmonary/respiratory specialists in the USA (n = 130), Germany (n = 100), Mexico (n = 100), Spain (n = 80) and Japan (n = 100). Participants were practicing physicians (3 to 31 years in practice), were involved in the management of MV patients and were familiar with treatment strategies for pneumonia.

Results The involvement of Gram-negative (vs. Gram-positive) bacteria in pneumonia in MV patients was perceived to carry greater mortality risks. Most physicians (63 to 77% in all countries except Japan; 41% in Japan) were extremely concerned about the impact of increasing antibiotic resistance in Gram-negative species on treatment outcomes. Excessive antibiotic use and failure to effectively de-escalate antibiotic therapy were perceived as key contributors to resistance development. Achieving rapid cure (78%) and minimizing the duration of MV (76%) were cited as the most important aims of treatment. Key considerations for antibiotic selection were activity against Gram-negative bacteria, including MDR strains, and the relative extent of lung tissue penetration. Over 80% of respondents saw aerosolized antibiotics as a valuable potential addition to the current treatment armamentarium (no antibiotic aerosols are currently licensed for pneumonia), particularly for elderly patients and those with respiratory distress, who are at risk of the worst outcomes.

Conclusions Gram-negative pneumonia in MV patients is a serious complication in ICUs. Antibiotic prescribing for Gram-negative pneumonia among respondents is influenced by the risk of MDR bacterial involvement. Aerosolized antibiotics are seen as a potentially valuable adjunct to systemic therapies for treating Gram-negative pneumonia in MV patients. References

1. American Thoracic Society/Infectious Diseases Society of America: Guidelines for the management of adults with hospital-acquired, ventilator-associated, and healthcare-associated pneumonia. Am J Respir Crit Care Med 2005, 171:388-416.

2. Parker CM, et al.: Ventilator-associated pneumonia caused by multidrug-resistant organisms or Pseudomonas aeruginosa: prevalence, incidence, risk factors, and outcomes. J Crit Care 2008, 23:18-26.

Pilot proforma to aid the diagnosis of sepsis in burns patients

T Evans, J McLennan

Morriston Hospital, Swansea, UK

Critical Care 2009, 13(Suppl 1):P339 (doi: 10.1186/cc7503)

Introduction The aim was to produce and pilot a data collection proforma aiding the diagnosis of sepsis in the burns population. It should be relevant, reliable and lead to reproducible results. We also reviewed the data collected to assess the relevance of any findings. In June 2007, the American Burn Association Consensus Conference [1] produced standardized definitions in an attempt to aid clarification of sepsis in those already fulfilling the diagnosis of a systemic inflammatory response syndrome (SIRS). This would allow more consistent diagnosis and greater accuracy in further trials in this field. The intention is that the proforma will be used by the INTERBURNS research group for a large prospective study.

Methods This was a retrospective case series. Patients included were those admitted to a regional burns unit with more than 20% burns during December 2007, numbering four. These patients were matched for age, sex, and mechanism of injury. The initial proforma was produced on the basis of the American Burn Association recommendations [1]. Data were collected and reviewed. This revealed potential confounding factors, so the proforma was modified and the data recollected. A daily diagnosis of SIRS and sepsis was made from the information collected.

Methods This was a retrospective case series. Patients included were those admitted to a regional burns unit with more than 20% burns during December 2007, numbering four. These patients were matched for age, sex, and mechanism of injury. The initial proforma was produced on the basis of the American Burn Association recommendations [1]. Data were collected and reviewed. This revealed potential confounding factors, so the proforma was modified and data recollected. A daily diagnosis of SIRS and sepsis was made from the information collected. Results Data were collected at 12 p.m. daily initially. It was noted that haemofiltration and noradrenaline were routinely used. Dressing changes occurred in the morning, leading to increased analgesia, sedation and the release of inflammatory mediators confounding physiological parameters measured. Therefore the proforma was modified and the data recollected at 6 a.m. daily, thus achieving reliable and reproducible data. Despite the small numbers, the data indicated a potential association between an increase in noradrenaline dose of 25% and either instigation of a treatment modality or a diagnosis of SIRS not previously identified, or both.

Conclusions The proforma is reproducible and gives relevant, reliable data for further analysis. The results have highlighted the need for further research to clarify any association between noradrenaline requirements and the diagnosis of sepsis. This proforma and its subsequent utilisation by the INTERBURNS research group should improve our ability to diagnose sepsis and instigate early treatment in this patient population. Reference

1. Greenhalgh DG, et al.: American Burn Association consensus conference to define sepsis and infection in burns. J Burn Care Res 2007, 28:776-790.

Surviving sepsis campaign guidelines for severe sepsis and septic shock: implementation and outcome of a 3-year follow up

A Castellanos Ortega1, B Suberviola1, LA Garcia Astudillo1, MS Holanda1, MA Hernandez1, F Ortiz Melon1, R Tejido1, FJ Llorca2, B Fernandez Miret1

1University Hospital Marques de Valdecilla, Santander, Spain; 2University of Cantabria, Santander, Spain

Critical Care 2009, 13(Suppl 1):P340 (doi: 10.1186/cc7504)

Introduction The purpose of the study was to describe the effectiveness of the Surviving Sepsis Campaign bundles with regard to both implementation and outcome in patients with septic shock.

Methods A single-center prospective observational study of patients admitted to the medical-surgical ICU fulfilling criteria for the international sepsis definitions. After a widespread 2-month educational program, implementation of the Surviving Sepsis Campaign Resuscitation Bundles (RB) and Management Bundles (MB) was accomplished. A reinforcement educational program was performed in October 2007. Patients were recruited from September 2005 to August 2008.

Results We analyzed 384 episodes of septic shock. The mean age was 64.5 ± 15 years, APACHE II score 23.2 ± 7.2, Sequential Organ Failure Assessment score 9.5 ± 3, and global hospital mortality 37.5%. The rate of compliance with the RB was 35.4%. There were significant differences in mortality between the compliant (C) and noncompliant (NC) groups despite similar characteristics and severity of septic shock. The mortality rate was 42.5% in the NC group and 23.6% in the C group. The compliance rate with the MB was only 10%, and there were no differences in mortality between the C and NC groups (41% vs. 37%). When the influence of age, severity, emergency department origin, and ICU admission delay was controlled by multivariate analysis, compliance with the RB was independently associated with survival (OR = 0.39, 95% CI = 0.22 to 0.70, P <0.01). Compliance rates with the RB during three consecutive 12-month periods were 34%, 23% and 45.4%, respectively (P <0.01); in-hospital mortality rates in those periods were 37%, 47% and 31%, respectively (P = 0.03). Compliance with the MB decreased from 20% (first period) to 3% (third period).

Conclusions Implementation of RB was associated with decreased mortality in patients with septic shock. The compliance rate with MB was poor and had no impact on survival. Acknowledgements Supported by IFIMAV Expte. PRF/07/04 and the Instituto de Salud Carlos III. Expte. PI070723.
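The adjusted odds ratio for bundle compliance above comes from a multivariate logistic model. The sketch below shows the general form of such an adjustment with statsmodels; the data frame is synthetic and the covariates are reduced to two for brevity, so it is an illustration of the technique rather than the study's actual model.

```python
# Sketch: adjusted odds ratios from a logistic regression of hospital death on
# bundle compliance, age and APACHE II. Data are synthetic, for illustration only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 384
df = pd.DataFrame({
    "compliant": rng.integers(0, 2, n),
    "age": rng.normal(64.5, 15, n),
    "apache2": rng.normal(23, 7, n),
})
# simulate mortality with a protective effect of compliance
logit_p = -4 + 0.02 * df.age + 0.1 * df.apache2 - 0.9 * df.compliant
df["died"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(df[["compliant", "age", "apache2"]])
fit = sm.Logit(df["died"], X).fit(disp=0)
print(np.exp(fit.params))   # odds ratios; OR < 1 for 'compliant' means lower mortality
```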

Sepsis bundles: just think about it?

A Cardinale, L Giunta, C Di Maria, C Pellegrini, P De Luca, G Di Salvio, T Russo, P Masturzo, E De Blasio

Hospital 'G. Rummo', Benevento, Italy

Critical Care 2009, 13(Suppl 1):P341 (doi: 10.1186/cc7505)

Introduction The aim of the study is to verify the impact of the implementation of the diagnostic and therapeutic bundles suggested by the Surviving Sepsis Campaign (SSC) [1] on the outcome of patients with severe sepsis and septic shock.

Methods A retrospective analysis of the outcome of severe sepsis and septic shock patients before, during and after the implementation of the bundles according to the SSC in an eight-bed polyvalent ICU. We evaluated ICU and hospital mortality, length of stay, the level of compliance with the bundles and its impact on mortality. Statistical analysis was performed using the chi-square test.

Figure 1 (abstract P341)

Severe sepsis and septic shock hospital mortality.

Results A total of 127 patients were enrolled from 2005 onwards (38 patients in 2005, 43 in 2006 and 46 in 2007). We observed a reduction in overall ICU and hospital mortality from 73% to 43%, which was not statistically significant, although the difference between observed and predicted mortality showed a statistically significant improvement (P = 0.027) (Figure 1). In contrast, the length of stay [2] was 18 days in 2005, 23 days in 2006 and 25 days in 2007. Compliance was 59% in 2006 and 55% in 2007 for the resuscitation bundles, and 58% in 2006 and 78% in 2007 for the management bundles. Mortality was higher (70%) in the group with an overall compliance score <6 compared with 40% for those with a score >7.

Conclusions Even though the main difficulty was reaching the bundle targets in a timely fashion, a protocolized approach to severe sepsis and septic shock seems to contribute to the reduction of mortality in our population of patients. References

1. Dellinger RP: Surviving Sepsis Campaign: international guidelines for management of severe sepsis and septic shock 2008. Crit Care Med 2008, 36:296-327.

2. Ferrer R: Improvement in process of care and outcome after a multicenter severe sepsis educational program in Spain. JAMA 2008, 299:2294-2303.

Compliance with sepsis resuscitation but not management bundles improves the survival of surgical patients with septic shock

B Suberviola1, A Castellanos-Ortega1, LA Garcia Astudillo1, MS Holanda1, C Gonzalez Mansilla1, FJ Llorca2, B Fernandez Miret1, F Ortiz Melon1, M Hernandez1

1University Hospital Marques de Valdecilla, Santander, Spain; 2University of Cantabria, Santander, Spain

Critical Care 2009, 13(Suppl 1):P342 (doi: 10.1186/cc7506)

Introduction The purpose of this study was to describe the effectiveness of the Surviving Sepsis Campaign bundles with regard to both implementation and outcome in surgical patients with septic shock.

Methods A single centre prospective observational study of surgical patients admitted to the ICU from September 2005 to August 2008 fulfilling criteria for the international sepsis definitions.

Results We prospectively analyzed 149 surgical patients with septic shock. The mean age was 69 ± 13 years, APACHE II score 23 ± 7, Sequential Organ Failure Assessment 8 ± 2. The mortality rate was 34.2% in the ICU and 42.3% in the hospital. There were no significant differences in the characteristics and severity of septic shock between the compliant (C) and noncompliant (NC) groups. We found differences in ICU and hospital mortality between the C and NC groups for two Resuscitation Bundles (RB): central venous oxygen saturation (ScvO2) >70% (25% vs. 43%, P = 0.01 and 32% vs. 52%, P = 0.02) and source control (14% vs. 56%, P <0.001 and 24% vs. 63%, P <0.001, respectively). The compliance rate with all RB was 27%, and there were significant differences in mortality between the C and NC groups (17% vs. 40%, P = 0.01 and 35% vs. 49%, P = 0.009 in the ICU and hospital, respectively). There were no significant differences in ICU or hospital mortality with Management Bundle compliance. In the multivariate analysis, source control, ScvO2 >70%, compliance with all RB, mechanical ventilation, APACHE II and Sequential Organ Failure Assessment were independently associated with mortality. When the influence of age and severity was controlled by logistic regression, source control was independently associated with survival (OR = 0.17, 95% CI = 0.05 to 0.55, P = 0.003).

Conclusions Implementation of the RB was associated with decreased mortality in surgical patients with septic shock. Among all sepsis bundle elements, source control was the only independent predictor of survival. Compliance with the Management Bundles had no impact on survival.

Standard operating procedure in patients with severe sepsis and septic shock

T Schwab, A Schmitz, S Richter, C Bode, H Busch

University Hospital, Freiburg, Germany

Critical Care 2009, 13(Suppl 1):P343 (doi: 10.1186/cc7507)

Introduction Patients with severe sepsis and septic shock still have a high mortality rate, despite improvements in intensive care therapy. In the present study we assessed the impact on outcome of a standard operating algorithm based on international treatment recommendations for patients with severe sepsis and/or septic shock (for example, volume resuscitation, hemodynamic control, glycemic control, substitution of selenium and/or hydrocortisone and the use of recombinant human activated protein C (rhAPC)).

Methods A retrospective analysis of 144 patients admitted to our medical ICU in 2004 and 2006. In 2004, before implementation of the standard operating procedure (SOP), 74 patients fulfilling criteria for the diagnosis of severe sepsis and/or septic shock were analysed and compared with 70 patients treated after implementation of the evidence-based SOP in 2006.

Results The two groups did not differ in initial APACHE II score or clinical baseline characteristics. With implementation of the SOP, the volume given in the first 6 hours (1,506 vs. 2,154 ml, P <0.05) and in the first day (4,005 vs. 6,122 ml, P <0.001) significantly increased. Furthermore, treatment with hydrocortisone, selenium and insulin increased significantly after implementation of the SOP. Catecholamines and rhAPC were unaffected. Mortality in the patient group without the SOP was 57%, and in the intervention group with the SOP it was 38.5% (P <0.05).

Conclusions With the implementation of a standard algorithm, volume therapy and adjunctive sepsis therapies were applied more frequently than in patients treated without the SOP, and this was associated with a lower mortality rate. The realisation of an evidence-based SOP in daily practice for patients with severe sepsis and/or septic shock might be effective, as shown by the change in treatment practice and the reduction in mortality.

Epidemiology and outcome of patients with severe sepsis in six Spanish ICUs

L Lorente1, M Martín2, C Díaz3, L Labarta4, J Ferreres5, J Solé-Violán6, J Borreguero1, Y Barrios1

1Hospital Universitario de Canarias, La Laguna SC, Tenerife, Spain; 2Hospital Universitario Nuestra Señora de Candelaria, S/C de Tenerife, Spain; 3Hospital Insular, Las Palmas de Gran Canaria, Spain; 4Hospital San Jorge, Huesca, Spain; 5Hospital Clínico Universitario de Valencia, Spain; 6Hospital Universitario de Gran Canaria Dr. Negrín, Las Palmas de Gran Canaria, Spain Critical Care 2009, 13(Suppl 1):P344 (doi: 10.1186/cc7508)

Introduction The objective of this study was to describe the epidemiology, consumption of resources and outcome of patients with severe sepsis.

Methods A prospective, observational and multicenter study performed in six Spanish ICUs. Only patients with severe sepsis were included.

Results A total of 122 patients with severe sepsis were included (female 33.1%; mean age 58.01 ± 15.40 years; mean APACHE II score at ICU admission 21.10 ± 8.89). The source of the infection was respiratory in 56.2%, abdominal in 19.2% and other foci in 24.6%. Sepsis-related Organ Failure Assessment (SOFA) scores at the time of diagnosis of severe sepsis were the following: global 10.21 ± 3.75, respiratory 2.66 ± 1.08, haematological 0.75 ± 1.10, hepatic 0.70 ± 1.01, cardiovascular 3.54 ± 1.22, neurological 0.96 ± 1.50, and renal 1.46 ± 1.61. Rates of organ failure were the following: respiratory 93.8%, haematological 38.5%, hepatic 37.7%, cardiovascular 90.8%, neurological 35.4% and renal 55.4%. The rate of use of medical resources was as follows: adrenergic agents 90.8%, mechanical ventilation 86.2%, extrarenal depuration (renal replacement therapy) 20% and recombinant activated protein C 15.4%. Mean length of stay in the ICU was 16.88 ± 18.98 days. The mortality rate in the ICU was 43.44%. We found that lactic acid serum levels (OR = 1.26, 95% CI = 1.11 to 1.43, P <0.001), SOFA score (OR = 1.24, 95% CI = 1.12 to 1.37, P <0.001) and plasminogen activator inhibitor-1 (PAI-1) plasma levels (OR = 1.02, 95% CI = 1.01 to 1.03, P = 0.001) at the time of diagnosis were predictors of mortality.

Conclusions Severe sepsis is an important cause of mortality and entails considerable use of resources. Lactic acid serum levels, SOFA score and PAI-1 plasma levels at diagnosis were found to be predictors of mortality.

Epidemiology of severe sepsis in India

S Chatterjee1, S Todi1, S Sahu2, M Bhattacharyya1

1AMRI Hospitals, Kolkata, India; 2Kalinga Hospital, Cuttack, India Critical Care 2009, 13(Suppl 1):P345 (doi: 10.1186/cc7509)

Introduction A multicentre, prospective, observational study was conducted in four intensive therapy units (ITUs) in India from June 2006 to September 2008 to determine the incidence and outcome of severe sepsis among adult patients.

Methods All patients admitted to the ITU were screened daily for SIRS, organ dysfunction and severe sepsis as defined by the ACCP and SCCM. Patients with severe sepsis were further studied. Results A total of 4,183 ITU admissions were studied. SIRS with organ dysfunction was found in 1,286 (30.74%) patients, of which 688 (53.50%) were due to severe sepsis. The incidence of severe sepsis was 16.45% of all admissions. The mean age of the study population was 56.72 years (SD = 18.20), of which 62.63% were male. The median APACHE II score was 19 (IQR 18 to 20) with predominant (90.93%) medical admission. The ITU mortality of all admissions was 17.70% and that of severe sepsis was 46.30%. The hospital mortality and 28-day mortality of severe sepsis were 53.39% and 55.05%, respectively. The standardized mortality ratio of severe sepsis patients was 2.20. The median duration of stay in the ITU for the severe sepsis cohort who survived was 4 days (IQR

4 to 5). The number of episodes where infection was the primary reason for admission to the ITU was 98.11%. Culture positivity was found in 44.48%. The lung was the predominant source of sepsis (35.90%). Gram-negative organisms were responsible for 57.86% of cases and Gram-positive organisms for 16.63%. The rest were parasitic, viral and fungal infections.

Conclusions Severe sepsis was common in Indian ITUs. The ITU mortality was higher compared with the western literature. Gram-positive infections were less common, although the incidence of parasitic and viral infection was higher than in the West.
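As a point of reference, the standardized mortality ratio quoted above is simply observed deaths divided by the deaths expected from a severity score's predicted risks. The short sketch below uses hypothetical per-patient predicted probabilities and a hypothetical death count, since these are not given in the abstract.

```python
# Standardized mortality ratio (SMR) = observed deaths / expected deaths,
# where expected deaths = sum of the score-predicted death probabilities.
# The predicted risks and the death count below are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(1)
predicted_risk = rng.uniform(0.05, 0.60, size=688)  # e.g. APACHE II-derived
observed_deaths = 367                                # hypothetical count

expected_deaths = predicted_risk.sum()
smr = observed_deaths / expected_deaths
print(f"SMR = {smr:.2f}")
```

An SMR above 1 indicates more deaths than the severity score predicts for that case mix.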

Improving identification of severe sepsis by junior doctors: an observational study

M Slattery, D Hepburn, R Jagadeeswaran, P Temblett

Morriston Hospital, Swansea, UK

Critical Care 2009, 13(Suppl 1):P346 (doi: 10.1186/cc7510)

Introduction Sepsis is a significant cause of morbidity and mortality worldwide. Early recognition and treatment are of paramount importance in reducing mortality; this has been highlighted by the Surviving Sepsis Campaign (SSC) [1]. A consensus exists that Time Zero is the first instance at which severe sepsis is present. We undertook a study of newly qualified doctors in a university teaching hospital to evaluate how effective they were at recognising severe sepsis, and whether a simple intervention could improve their ability to perform this task.

Methods The case file of a patient with systemic inflammatory response syndrome leading to severe sepsis was identified and anonymised. Time Zero was identified by the investigators based on the SSC criteria. The case file was distributed to a sample of junior doctors over a 2-month period. They were asked to study the retrospective case record and fill in a questionnaire identifying when they felt Time Zero occurred. These data were collected and then the doctors were given an educational tool about the SSC guidelines and the definition of Time Zero. They were then asked to reappraise the case and reassess the Time Zero point based on their new knowledge.

Results Thirty junior doctors participated; all had less than 4 months of postgraduate experience. Time Zero was correctly identified on the first attempt by 17% (n = 5) of the juniors. Time Zero estimates ranged from -70 to +920 minutes relative to the actual value. After implementation of the teaching tool, 100% of the doctors correctly identified Time Zero according to the SSC criteria. Conclusions These results show that there can be significant discrepancies in the accuracy of identification of Time Zero, which in turn could have significant implications for the diagnosis and timely inception of treatment in severe sepsis. A simple intervention such as an educational tool based on current SSC guidelines can improve the accuracy and speed of diagnosis, and this may have an effect on morbidity and mortality.

Reference

1. Dellinger RP, et al: Surviving Sepsis Campaign: international guidelines for management of severe sepsis and septic shock: 2008. Crit Care Med 2008, 36:296-327.

Prevalence of ICU infection in South Africa and accuracy of treating physician diagnosis and treatment

S Bhagwanjee, J Scribante, F Paruk, G Richards

University of the Witwatersrand, Johannesburg, South Africa Critical Care 2009, 13(Suppl 1):P347 (doi: 10.1186/cc7511)

Introduction ICU infection is an important cause of morbidity and mortality. Equally, it is important to establish to what extent physicians are capable of making accurate diagnoses and implementing correct treatment for sepsis. A national prevalence of ICU infection study was conducted in South African ICUs to evaluate these two issues.

Methods Approval to conduct the study was obtained from all appropriate authorities. A 1-day prevalence of sepsis study was undertaken on 16 August 2005. A proportional probability sample was determined from all ICUs in the country. Attending physicians were asked to indicate their diagnosis and treatment on the day of

the study. Relevant clinical data were collected for each patient and reviewed by two independent intensivists. Results All units chosen agreed to participate. The 28-day hospital mortality was 58 of 248 patients (23%). The prevalence of each category of the SIRS [1] criteria and associated percentage mortality respectively was nil 59 (8%), SIRS 120 (9%), sepsis 40 (28%), severe sepsis 13 (35%) and septic shock 16 (41%). One hundred and ninety-six patients were deemed to have one of the sepsis diagnoses by the primary physician, an overdiagnosis of 51%. Antibiotic prescription and associated mortality percentage respectively were: appropriate 82 (12%) and inappropriate 100 (28%). Antibiotic therapy was changed correctly after microbiology data in 21 of 88 patients (24%). Duration of therapy was correct in 51 of 183 patients (28%).

Conclusions The prevalence of sepsis and the associated mortality is similar to that described in other studies [1]. Physicians tend to overdiagnose sepsis. Antibiotic prescribing practices were incorrect with respect to selection, modification after microbiology data and duration of treatment. A major deficiency in diagnostic and therapeutic ability among physicians was identified in this study. Appropriate steps must be taken to remedy this deficiency. Reference

1. Dombrovskiy VY, et al.: Crit Care Med 2007, 35:1244-1250.

Defining severe community-acquired pneumonia: a significant barrier to improving patient outcomes

MM Balp, C Naujoks

Novartis Pharma AG, Basel, Switzerland

Critical Care 2009, 13(Suppl 1):P348 (doi: 10.1186/cc7512)

Introduction Severe community-acquired pneumonia (sCAP) is an exaggerated inflammatory and coagulation response to infection. sCAP has high rates of mortality and necessitates treatment modified from that for mild/moderate community-acquired pneumonia (CAP). Despite the grave nature of this condition, it is poorly characterised. A systematic review was performed to gauge current data and identify unmet needs associated with sCAP. Methods MEDLINE was searched for English-language papers concerning sCAP published after 1998.

Results There are a number of published indices for diagnosing sCAP but these are difficult to use in the clinical setting. In the literature, sCAP is defined as CAP requiring admission to an ICU or as CAP that results in death. Current International Statistical Classification of Diseases and diagnosis-related group codes do not specify whether pneumonia is community acquired or hospital acquired, or indicate the severity of the infection. Estimates of the prevalence of sCAP in patients with CAP range from 6.6 to 16.0% [1,2]. Mortality rates for patients with sCAP range from 10 to 55% [3,4]. The discrepancy in these rates emphasises the variation in hospital practices due to the lack of an objective definition of sCAP. The absence of a clear definition for sCAP could result in inappropriate treatment of this life-threatening condition, increasing mortality rates. There are scant data concerning the costs of treating sCAP; increased expenditure for patients with sCAP versus those with CAP results from ICU treatment, increased length of hospital stay, mechanical ventilation, vasopressor use and rehabilitation costs. It is expected that the clinical and cost benefits of new therapies will be more easily recognised if sCAP is consistently defined. Defining sCAP appropriately requires a focused international initiative and collaboration between clinicians and payers.

Conclusions There is a major unmet need for a meaningful definition of sCAP. Poor characterisation of sCAP has resulted in

variable reports of its prevalence and may result in inappropriate treatment leading to increased mortality in this patient population. The results of this systematic review form the basis for developing a new treatment pathway to improve outcomes for patients with sCAP. References

1. Buising KL, et al.: A prospective comparison of severity scores for identifying patients with severe community-acquired pneumonia: reconsidering what is meant by severe pneumonia. Thorax 2006, 61:419-424.

2. Ewig S, et al.: Severe community-acquired pneumonia: assessment of severity criteria. Am J Respir Crit Care Med 1998, 158:1102-1108.

3. El-Solh AA, et al.: Etiology of severe pneumonia in the very elderly. Am J Respir Crit Care Med 2001, 163:645-651.

4. Roson B, et al.: Etiology, reasons for hospitalization, risk classes, and outcomes of community-acquired pneumonia in patients hospitalized on the basis of conventional admission criteria. Clin Infect Dis 2001, 33:158-165.

Outcome of severe sepsis in the ICU is independent of haemoglobin levels

J Wood, D Pandit

William Harvey Hospital, Ashford, UK

Critical Care 2009, 13(Suppl 1):P349 (doi: 10.1186/cc7513)

Introduction In this observational cohort study we attempted to identify a haemoglobin (Hb) target that favours survival in severe sepsis patients. The optimum level of Hb to influence outcome in severe sepsis is yet to be determined. Although the analysis of severe sepsis patients in the Canadian Transfusion Requirements in Critical Care study [1] did not find any mortality benefit with a Hb level of 10 g/dl or above, the early goal-directed therapy trial [2] showed significant mortality benefit by achieving a haematocrit >30%, equivalent to a Hb value of 10 g/dl, during the early treatment period. This pilot study was therefore undertaken to look for any relationship between Hb levels and mortality in severe sepsis patients admitted to the ICU. Methods All patients 16 years or older with severe sepsis or septic shock who stayed longer than 24 hours between July 2006 and June 2007 were retrospectively included in the study. Patient demographics, Hb levels measured by the blood gas analyser during their period of severe sepsis and outcome data were collected. Binary logistic regression analysis was performed with ICU survival and 28-day survival as the dependent variables. Results Of the 62 patients enrolled in the study, the average age was 66.8 years (16 to 87 years), with mean admission APACHE II score and Simplified Acute Physiology Score II being 18 (8 to 30) and 58 (17 to 78), respectively. Their ICU and 28-day mortality rates were 35.5% and 41.9%, respectively. On analysis no significant relationship was found between average Hb, Hb variation, minimum Hb, maximum Hb, and number of units transfused with ICU or 28-day survival. In addition, no significant relationship was found between Hb falling below thresholds of 7, 8, 9, or 10 g/dl, or variation from an Hb of 7, 8, 9, 10 g/dl during severe sepsis and the ICU survival or 28-day survival. Conclusions While a haematocrit of 30% represents a physiological optimum between viscosity and oxygen carriage, the equivalent Hb from this pilot study shows no advantage in the outcome of severe sepsis. This study suggests that any effect Hb levels may have on the outcome from severe sepsis is likely to be small.

References

1. Hebert P, et al.: A multicentre, randomized, controlled clinical trial of transfusion requirements in critical care. N Engl J Med 1999, 340:409-417.

2. Rivers E, et al.: Early goal directed therapy in the treatment of severe sepsis and septic shock. N Engl J Med 2001, 345:1368-1377.
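A brief sketch, with simulated values only, of the kind of binary logistic regression described in the Methods above: ICU survival regressed on summary Hb variables and an indicator for Hb falling below a chosen threshold (the variable names are hypothetical).

```python
# Hypothetical illustration: binary logistic regression of ICU survival on
# Hb summaries and a below-threshold indicator. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 62
df = pd.DataFrame({
    "mean_hb": rng.normal(9.5, 1.5, n),        # g/dl, hypothetical
    "min_hb": rng.normal(8.0, 1.2, n),         # g/dl, hypothetical
    "units_transfused": rng.poisson(1.5, n),
    "icu_survival": rng.binomial(1, 0.65, n),  # outcome, hypothetical
})
df["below_8"] = (df["min_hb"] < 8).astype(int)  # Hb < 8 g/dl indicator

fit = smf.logit("icu_survival ~ mean_hb + below_8 + units_transfused",
                data=df).fit(disp=False)
print(np.exp(fit.params))  # odds ratios for each term (and the intercept)
```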

Abstract withdrawn

TNFβ+250 polymorphism and hyperdynamic state in cardiac surgery with extracorporeal circulation

J Iribarren, JJ Jimenez, M Brouard, L Lorente, R Perez, L Lorenzo, L Raja, S Palmero, N Perez, R Martinez, M Mora

Hospital Universitario de Canarias, La Laguna SC, Tenerife, Spain Critical Care 2009, 13(Suppl 1):P351 (doi: 10.1186/cc7515)

Introduction We have investigated genetic and clinical factors associated with the hyperdynamic state (HS) after heart surgery with extracorporeal circulation (ECC).

Methods We performed a prospective cohort study of consecutive patients who underwent elective heart surgery with ECC. The HS was defined as hyperthermia (>38°C), cardiac index (CI) >3.5 l/min/m2 and systemic vascular resistance index (SVRI) <1,600 dyn x s/cm5 x m2. The study included demographic variables, gene polymorphisms (A/G of TNFβ+250, G/A-1082 of IL-10, and the polymorphism of the IL-1 receptor antagonist), comorbidity, type of surgery, serum levels of IL-6, and postoperative course. We used the Pearson chi-square test or Fisher exact test, and the Student t test for univariate analysis, with forward stepwise logistic regression for multivariate adjustment.

Results Eighty patients were studied, of whom 22 (27.5%) developed HS. The presence of the G allele of the TNFβ+250 polymorphism was associated with an increased incidence of HS (68% vs. 37%; P = 0.011). In the multivariate analysis, a longer duration of ECC and the presence of the G allele were associated with the development of HS.

Conclusions The G allele of the TNFβ+250 polymorphism and prolonged extracorporeal circuit times may favour the development of a hyperdynamic state after heart surgery with ECC.
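The univariate association reported above (G-allele carriage vs. development of the hyperdynamic state) is a 2 x 2 comparison of the kind sketched below; the cell counts are hypothetical placeholders, since the full contingency table is not given in the abstract.

```python
# 2 x 2 association between allele carriage and the hyperdynamic state (HS).
# Counts are hypothetical placeholders, not the study data.
import numpy as np
from scipy.stats import chi2_contingency, fisher_exact

#                 HS    no HS
table = np.array([[15,     7],    # G-allele carriers (hypothetical)
                  [ 7,    51]])   # non-carriers (hypothetical)

chi2, p_chi2, dof, expected = chi2_contingency(table)
odds_ratio, p_fisher = fisher_exact(table)
print(f"chi-square P = {p_chi2:.3f}, Fisher exact P = {p_fisher:.3f}")
```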

Role of 4G/5G plasminogen activator inhibitor-1 (PAI-1) gene polymorphism and the PAI-1 plasma levels in the outcome of patients with severe sepsis

L Lorente1, M Martín2, L Labarta3, C Díaz4, J Solé-Violán5, J Blanquer6, J Borreguero1, Y Barrios1

1Hospital Universitario de Canarias, La Laguna SC, Tenerife, Spain; 2Hospital Universitario Nuestra Señora de Candelaria, S/C de Tenerife, Spain; 3Hospital San Jorge, Huesca, Spain; 4Hospital Insular, Las Palmas de Gran Canaria, Spain; 5Hospital Universitario de Gran Canaria Dr. Negrín, Las Palmas de Gran Canaria, Spain; 6Hospital Clínico Universitario de Valencia, Spain Critical Care 2009, 13(Suppl 1):P352 (doi: 10.1186/cc7516)

Introduction The objective of this study was to evaluate the effect of the 4G/5G plasminogen activator inhibitor-1 (PAI-1) gene polymorphism on the plasma PAI-1 levels and outcome in critically ill patients with severe sepsis.

Methods A prospective, observational and multicenter study carried out in six Spanish ICUs. Only patients with severe sepsis were included. Epidemiological data, severity scores, site of infection and outcome of sepsis were recorded. Measurement of PAI-1 plasma levels (at diagnosis, and at 72 hours and 7 days after diagnosis of severe sepsis) and DNA genotyping of PAI-1 were carried out. Results A total of 122 patients with severe sepsis were included. Nonsurvivor patients (n = 53) exhibited higher PAI-1 plasma concentrations than the survivor patients (n = 69) at the time of diagnosis of severe sepsis (60.17 ± 30.34 vs. 42.51 ± 28.85; P = 0.001), at 72 hours (34.74 ± 20.79 vs. 17.91 ± 12.85; P <0.001), and at 7 days (35.93 ± 21.16 vs. 24.77 ± 15.85; P = 0.07). PAI-1 plasma levels at diagnosis were found to be a predictor of mortality (OR = 1.02, 95% CI = 1.01 to 1.03, P = 0.001). We did not find significant differences between the 4G/5G PAI-1 gene polymorphisms in PAI-1 plasma levels (49.20 ± 26.15 with the 4G/4G genotype, 49.39 ± 30.63 with 4G/5G, and 48.06 ± 32.26 with 5G/5G) or in mortality (10/23 (41%) with 4G/4G, 25/60 (40%) with 4G/5G and 18/39 (47%) with 5G/5G).

Conclusions PAI-1 plasma levels at the time of diagnosis are markers of an unfavourable prognosis in patients with severe sepsis. We have not, however, found any correlation between the 4G/5G PAI-1 gene polymorphism, plasma levels of PAI-1 or mortality of patients with severe sepsis.

Role of monocyte apoptosis for the final outcome of the septic host with peritonitis

A Kotsaki1, I Tzepi1, P Carrer1, K Louis1, G Zografos2, E Giamarellos-Bourboulis1

1Attikon University Hospital, Athens, Greece; 2Hippokrateion General Hospital, Athens, Greece

Critical Care 2009, 13(Suppl 1):P353 (doi: 10.1186/cc7517)

Introduction Previous studies have shown that monocyte apoptosis is crucial for the final outcome of sepsis in the setting of acute pyelonephritis [1]. Here, the significance of apoptosis in acute intra-abdominal infection was studied.

Methods Acute peritonitis was induced after cecal ligation and puncture in 17 male New Zealand rabbits. Four sham-operated rabbits were also studied. Peripheral blood mononuclear cells were isolated after Ficoll gradient centrifugation; monocytes were further isolated after plastic adherence. Apoptosis of monocytes was estimated after staining positive for annexin V and negative for propidium iodide by flow cytometric analysis. Results Monocyte apoptosis is presented in Table 1. A rate of apoptosis greater than 40% was found in six animals after 24 hours; it was lower than 40% for 11 animals. The median survival of the former was 15.2 days and of the latter 4.2 days (log-rank: 5.49, P = 0.019).

Conclusions Sepsis arising in the field of peritonitis is accompanied by inhibition of apoptosis of monocytes. The rate of apoptosis 24 hours after sepsis induction is a determinant of the survival of the septic host.

Table 1 (abstract P353)

Apoptosis of monocytes over follow-up

Median (%) 1.5 hours 4 hours 24 hours 48 hours

Sham 21.4 45.1 16.3 27.9

Peritonitis 18.5 10.9* 21.9* 13.3

*P < 0.05 between groups.

Reference

1. Antonopoulou A, et al.: Clin Exp Immunol 2007, 149:103-108.
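The survival comparison above (median 15.2 vs. 4.2 days, log-rank P = 0.019) is a standard two-group log-rank test; a sketch using the lifelines package follows, with hypothetical animal-level survival times, as the individual durations are not reported in the abstract.

```python
# Two-group log-rank test of survival by 24-hour monocyte apoptosis rate.
# The survival times below are hypothetical, not the study data.
import numpy as np
from lifelines.statistics import logrank_test

# Days of survival after sepsis induction (hypothetical values)
high_apoptosis = np.array([12.0, 15.0, 16.5, 14.0, 18.0, 15.5])  # >40% at 24 h
low_apoptosis = np.array([3.0, 4.5, 5.0, 2.5, 4.0, 6.0, 3.5, 4.2, 5.5, 3.8, 4.6])

result = logrank_test(high_apoptosis, low_apoptosis,
                      event_observed_A=np.ones_like(high_apoptosis),
                      event_observed_B=np.ones_like(low_apoptosis))
print(f"log-rank statistic = {result.test_statistic:.2f}, "
      f"P = {result.p_value:.3f}")
```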

Genomic variations within matrix metalloproteinase-9 and severe sepsis

Q Chen, Y Jin, X Fang

First Affiliated Hospital, Zhejiang University, Hangzhou, China Critical Care 2009, 13(Suppl 1):P354 (doi: 10.1186/cc7518)

Introduction Sepsis is a multiple-gene disease resulting from the interaction of environmental and genetic components. Matrix metalloproteinase-9 (MMP-9) has been demonstrated to play an important role in organ dysfunction and outcome of sepsis [1-3]. However, the genetic predisposition conferred by MMP-9 in sepsis remains unknown. Methods Seven common SNPs within the functional regions of the MMP-9 gene (rs17576, rs2274756, rs2250889, rs9509, rs3918240, rs3918241 and rs3918242) were investigated in 192 patients with severe sepsis and 262 healthy controls. Meanwhile, the plasma levels of MMP-9 were measured via ELISA. Results The genotype distributions and allelic frequencies of the above seven SNPs were not significantly different between patients with severe sepsis and controls, nor between surviving and nonsurviving patients with severe sepsis (all P >0.05). Haplotypes GGCTTTC, AGGTCTC, GGCCTTC, GACTTAT and AGCCCTC were the five most common haplotypes. The distribution of the haplotypes was also comparable among the defined groups. The median plasma level of MMP-9 was 37.66 ng/ml in 32 patients within the first 24 hours following the diagnosis of severe sepsis, and 30.15 ng/ml in 19 healthy controls. Compared with surviving patients with severe sepsis (median 37.66 ng/ml, n = 16) and healthy controls, the concentrations of MMP-9 showed an increasing trend in nonsurviving patients with severe sepsis (median 36.06 ng/ml, n = 16).

Conclusions The present findings suggest that common polymorphisms within the functional regions of the MMP-9 gene may not play a major role in the predisposition to severe sepsis in the Chinese Han cohort. The plasma levels of MMP-9 may be associated with the outcome of severe sepsis. Further studies in large samples are warranted. References

1. Renckens R, et al.: Matrix metalloproteinase-9 deficiency impairs host defense against abdominal sepsis. J Immunol 2006, 176:3735-3741.

2. Steinberg J, et al.: Metalloproteinase inhibition reduces lung injury and improves survival after cecal ligation and puncture in rats. J Surg Res 2003, 111:185-195.

3. Lalu MM, et al.: Matrix metalloproteinase inhibitors attenuate endotoxemia induced cardiac dysfunction: a potential role for MMP-9. Mol Cell Biochem 2003, 251:61-66.

Association of IL-10 promoter polymorphism -1082 G/A with adverse outcome in severe sepsis and septic shock

O Sabelnikovs1, L Nikitina-Zake2, J Zhuravlova3, E Sama3, I Vanags1

1Riga Stradins University, Riga, Latvia; 2Latvian Biomedical Research and Study Centre, Riga, Latvia; 3P. Stradins Clinical University Hospital, Riga, Latvia

Critical Care 2009, 13(Suppl 1):P355 (doi: 10.1186/cc7519)

Introduction IL-10 is an anti-inflammatory cytokine with pleiotropic effect in immunoregulation and inflammation. Several recent

papers reported association of the IL-10 -1082G allele with adverse outcome in sepsis [1,2]. The study objective was to investigate whether IL-10 promoter polymorphism -1082 G/A is associated with adverse outcome in severe sepsis and septic shock patients.

Methods The study was conducted in the mixed medical-surgical adult ICU of P. Stradins Clinical University Hospital in Riga in 2007 and 2008. A total of 103 critically ill patients who met the proposed severe sepsis and septic shock criteria were included. The IL-10 -1082 polymorphism was genotyped by sequencing. All patients were followed up throughout their stay in the ICU to clinical outcome. The frequency distributions of the genotypes in the subgroups were compared using the Pearson chi-square test. P <0.05 was considered to indicate statistical significance. Results Of the 103 patients observed, 44 (43%) had an adverse outcome in the ICU. In the survival group 11 (19%) had G/G at position -1082 of the IL-10 gene, 25 (42%) were heterozygous G/A, and 23 (39%) were homozygous A/A. In the nonsurvival group five (11%) had G/G, 30 (68%) had A/G, and nine (21%) had the A/A genotype. The polymorphism frequencies between survivors and nonsurvivors differed significantly (P = 0.034, χ2 = 6.8). The relative risk for adverse outcome in IL-10 -1082 alternative G allele carriers (G/G and G/A genotypes) was 1.6 (95% CI = 0.91 to 2.83).

Conclusions We found an association of the IL-10 promoter polymorphism -1082 G/A with clinical outcome in severe sepsis and septic shock patients; carriage of the alternative -1082 G allele seems to be associated with a greater risk of adverse outcome in the studied patient population. References

1. Gallagher PM, et al.: Association of IL-10 polymorphism with severity of illness in community acquired pneumonia. Thorax 2003, 58:154-156.

2. Stanilov SA, et al.: Interleukin-10-1082 promoter polymorphism in association with cytokine production and sepsis susceptibility. Intensive Care Med 2006, 32:260-266.
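The relative risk with 95% CI reported in the conclusions above is conventionally computed from a 2 x 2 table of exposure (G-allele carriage) against outcome; the sketch below shows the standard log-scale Wald interval with purely hypothetical counts, so its output is not intended to match the study's figure of 1.6 (0.91 to 2.83).

```python
# Relative risk (RR) of adverse outcome for exposed vs. unexposed subjects,
# with a 95% CI on the log scale. Counts are hypothetical placeholders.
import math

a, b = 24, 26   # exposed (e.g. G-allele carriers): adverse, no adverse outcome
c, d = 10, 25   # unexposed (e.g. A/A homozygotes): adverse, no adverse outcome

risk_exposed = a / (a + b)
risk_unexposed = c / (c + d)
rr = risk_exposed / risk_unexposed

se_log_rr = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```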

Activated protein C restores kidney function in endotoxin-induced acute renal failure in the rat

E Almac1, T Johannes2, E Mik3, M Legrand4, K Unertl2, C Ince1

1Academisch Medisch Centrum, Amsterdam, the Netherlands;

2University Hospital Tuebingen, Germany; 3Erasmus Medical Center, Rotterdam, the Netherlands; 4Lariboisiere Hospital, University of Paris, France

Critical Care 2009, 13(Suppl 1):P356 (doi: 10.1186/cc7520)

Introduction Activated protein C (APC) has been shown to have beneficial effects on the inflammatory process and coagulation during sepsis. Inflammation and coagulopathy impair the microvasculature and therefore disturb oxygen transport to the tissue. The hypothesis of our study was that APC treatment improves renal microvascular oxygenation and kidney function in endotoxin-induced acute renal failure in the rat.

Methods In 18 anesthetized and ventilated (FiO2 0.4) male Wistar rats, the arterial blood pressure and renal blood flow were recorded. The renal microvascular PO2 was continuously measured by the phosphorescence lifetime technique. All animals received a lipopolysaccharide (LPS) bolus (10 mg/kg) to induce endotoxemic shock. All rats received fluid resuscitation (hydroxyethyl starch 130 kDa) 1 hour after LPS application. In one group of animals, APC (drotrecogin alfa, Xigris®; Lilly) was continuously infused at 10 μg/kg/hour. Another group received a continuous infusion of 100 μg/kg/hour APC.

Figure 1 (abstract P356). Mean arterial pressure (MAP), renal blood flow (RBF), renal vascular resistance (RVR), creatinine clearance (CLcrea), cortical microvascular PO2 and medullary microvascular PO2 in the three groups at baseline (T0), 1 hour after the LPS bolus (t1) and after fluid resuscitation at 3 hours (t2). FR = fluid resuscitation.

Results Data are presented in Figure 1.

Conclusions APC at a concentration of 100 μg/kg/hour significantly restored kidney function compared with standard fluid resuscitation during endotoxemia. This application best improved the mean arterial pressure. APC had no beneficial effects, however, on the average renal microvascular PO2.

Angiopoietin-2 is a mediator involved in the advent of hypotension following endotoxin shock

PD Carrer1, V Grosomanidis2, B Fyntanidou2, S Panidis2, K Kotzampassi2, EJ Giamarellos-Bourboulis1

1University of Athens Medical School, Athens, Greece; 2University of Thessaloniki Medical School, Thessaloniki, Greece Critical Care 2009, 13(Suppl 1):P357 (doi: 10.1186/cc7521)

Introduction Angiopoietin-2 (Ang-2) is a mediator produced by endothelial cells [1]. It is unclear whether its production in the intubated septic host is due to an effect of anesthetic medication or to the sepsis process. The present study attempted to provide an answer.

Methods In 10 mechanically ventilated pigs, anesthesia was maintained by sevoflurane inhalation. Shock was then induced by the intravenous administration of 25 mg/kg lipopolysaccharide (LPS) of Escherichia coli O111:B4 in eight pigs; two were controls. Blood was collected by a Swan-Ganz catheter and by a peripheral vein. Ang-2 was estimated by an enzyme immunoassay. Results The median Ang-2 of the peripheral circulation among controls and pigs in shock was 1,403.2 and 2,390.6 pg/ml at baseline (P = NS), respectively; 914.2 and 2,192.5 pg/ml at 1 hour (P = 0.048), respectively; and 2,013 and 1,738.7 pg/ml at 2 hours (P = NS). Respective values of the lung circulation were 995.8 and 1,738.7 pg/ml at baseline (P = NS); 831.3 and 1,663.3 pg/ml at 1 hour (P = NS); and 943.8 and 1,757.8 pg/ml at 2 hours (P = NS).

Conclusions Ang-2 is a mediator involved in the advent of shock and is not influenced by the administration of sevoflurane.

Reference

1. Giamarellos-Bourboulis EJ, et al.: Kinetics of angiopoietin-2 in serum of multi-trauma patients: correlation with patient severity. Cytokine 2008, 44:310-313.

Inhibition of the lectin-like oxidized low-density lipoprotein receptor-1 improves intestinal microcirculation in experimental endotoxaemia

C Lehmann1, D Pavlovic2, S Wilk2, C Thaumuller2, M Otto2, M Wendt2, S Felix2, M Landsberger2

1Dalhousie University, Halifax, NS, Canada; 2Ernst Moritz Arndt University, Greifswald, Germany

Critical Care 2009, 13(Suppl 1):P358 (doi: 10.1186/cc7522)

Introduction Lectin-like oxidized low-density lipoprotein receptor-1 (LOX-1) is a major endothelial receptor for oxidized low-density lipoprotein [1]. Its expression is induced by pro-atherogenic stimuli as well as by inflammatory cytokines. LOX-1 acts also as an adhesion molecule involved in leukocyte recruitment. The systemic leukocyte activation in sepsis represents a crucial factor in the impairment of the microcirculation of different tissues, causing multiple organ failure and death. The aim of our experimental study was therefore to evaluate the effects of LOX-1 inhibition on the endotoxin-induced leukocyte adherence within the intestinal microcirculation using intravital microscopy.

Methods Group 1 (n = 10 Lewis rats) remained untreated and served as the control group. In Group 2 (n = 10) endotoxemia was induced by intravenous administration of 2 mg/kg lipopolysaccharide (LPS). In Group 3 (n = 10) endotoxemic animals were treated with 10 mg/kg anti-LOX-1 IgG. Endotoxemic animals of Group 4 (n = 10) were treated with a nonspecific IgG. Following 2 hours of endotoxin challenge or placebo administration, the intestinal microcirculation was evaluated using intravital microscopy. LOX-1 expression was quantified by western blot and reverse-transcription PCR.

Results LOX-1 inhibition significantly reduced leukocyte adherence in the submucosal venules of the intestinal wall (P <0.05). Functional capillary density of the intestinal muscular layers as well as of the mucosa increased following administration of the LOX-1 antibody in comparison with untreated LPS animals (P <0.05). At the mRNA level, LOX-1 expression was significantly increased in the untreated LPS group (P <0.05), whereas animals given the LOX-1 antibody showed a significant reduction of expression (P <0.05).

Conclusions LOX-1 antibody administration in experimental endotoxaemia significantly reduced leukocyte adherence and increased microvascular perfusion within the intestinal microcirculation. Inhibition of the lectin-like oxidized low-density lipoprotein receptor-1 may represent an attractive target for the modulation of endotoxin-induced impairment of the microcirculation in sepsis. Reference

1. Honjo M, et al.: Lectin-like oxidized LDL receptor-1 is a cell-adhesion molecule involved in endotoxin-induced inflammation. Proc Natl Acad Sci U S A 2003, 100:1274-1279.

Effect of the a7nAChR agonist GTS-21 on inflammation during human endotoxemia

J Pompe, M Kox, C Hoedemaekers, P Pickkers, A Van Vugt, J Van der Hoeven

Radboud University Nijmegen Medical Centre, Nijmegen, the Netherlands

Critical Care 2009, 13(Suppl 1):P359 (doi: 10.1186/cc7523)

Introduction Activation of the cholinergic anti-inflammatory pathway via vagus nerve stimulation or a7nAChR agonists improves outcome in animal models of endotoxemia, sepsis and experimental arthritis. This vagal anti-inflammatory pathway is mediated by the nicotinergic a7nACh receptor, which can be selectively stimulated by GTS-21. Up to now, the anti-inflammatory effects of oral administration of GTS-21 in humans in vivo have not been investigated. The aim of this study was to investigate the anti-inflammatory effects of oral administration of GTS-21 on the inflammatory response in the human endotoxemia model. Methods We performed a double-blind placebo-controlled randomized study in 12 healthy, nonsmoking male volunteers (18 to 28 years) during experimental endotoxemia. Subjects received 150 mg GTS-21 or placebo orally three times daily for 3 days before lipopolysaccharide (LPS) injection and on the day of the experiment, the last dose 1 hour before LPS administration (t = -1). One hour after the last dose of GTS-21 or placebo, LPS derived from Escherichia coli O:113 was injected (2 ng/kg intravenously). Results The main study endpoint was the concentration of circulating cytokines after LPS in the absence and presence of GTS-21. The effects of GTS-21 on TNFa and IL-10 release are shown in Figures 1 and 2. There was a trend towards a decrease in TNFa levels and an increase in IL-10 levels after GTS-21 administration. A similar trend was observed in levels of other proinflammatory and anti-inflammatory cytokines.

Figures 1 and 2 (abstract P359). Effects of GTS-21 on TNFa and IL-10 release during human endotoxemia.

Conclusions GTS-21 suppresses TNFa and stimulates IL-10 release during human endotoxemia in healthy human volunteers, resulting in a shift towards a more anti-inflammatory pattern. This effect may have potential for in vivo modulation of the innate immune response.

Inflammatory response of critically ill and haemodialysis patients after whole blood stimulation by cell wall components of Gram-positive bacteria and fungi

E Tsigou1, S Aloizos2, G Papatheodorou2, G Ganiatsos2, A Kotsovili3, A Tsakris3, G Baltopoulos1

1KAT Hospital, Athens, Greece; 2401 GAHA, Athens, Greece; 3Athens University School of Medicine, Athens, Greece Critical Care 2009, 13(Suppl 1):P360 (doi: 10.1186/cc7524)

Introduction The purpose of our study was to determine the acute inflammatory response of critically ill (ICU) and haemodialysis (HD) patients and to compare the cytokine profiles induced by different classes of pathogens in these two patient populations vulnerable to infections.

Methods We studied production of IL-6 and TNFa in response to stimulation of whole blood with 1 μg lipoteichoic acid (LTA) and 1 μg mannan for 6 hours in ICU patients with various medical causes of admission and in HD patients just before dialysis. Results Blood samples were taken from 11 ICU patients (eight males; age 63.83 ± 10.73 years; APACHE II score 21.12 ± 3.45) and from nine HD patients (six males; age 65.11 ± 10.37 years). Results are presented in Table 1. Between the two patient groups, no differences were found concerning the intensity of the inflammatory response.

Conclusions The baseline concentration of IL-6 was higher in ICU patients. After ex vivo challenge with LTA and mannan, no differences in the intensity of the inflammatory response were detected between the two groups. LTA is a much more potent immunostimulator than mannan in both ICU and HD patients.

Asymmetric and symmetric dimethylarginines: metabolism and role in severe sepsis

M Umbrello, F Colombo Pavini, L Bolgiaghi, E Carloni, F Rapido, M Gomarasca, E D'Angelo, G Iapichino

Istituto di Anestesia e Rianimazione, Milan, Italy

Critical Care 2009, 13(Suppl 1):P361 (doi: 10.1186/cc7525)

Introduction Asymmetric dimethylarginines (ADMA) and symmetric dimethylarginines (SDMA) are markers of protein breakdown; both compete with arginine for cellular transport and are excreted in urine. In addition, ADMA is a nonselective inhibitor of NO synthase, and is also metabolized by dimethylarginine-dimethylaminohydrolase (DDAH), a specific hydrolase whose activity in stress is controversial [1]. While the ADMA increase is

associated with adverse events in many critical conditions, little attention has been focused on the role of SDMA [2]. Methods In three Italian university ICUs we measured plasma ADMA, SDMA, their ratio (a marker of ADMA catabolism and a rough indicator of DDAH activity), arginine, IL-6, TNFa, and C-reactive protein (CRP) levels at days 1, 3, 6, 9, 12 and at discharge in 72 consecutive patients with severe sepsis/septic shock. Results Basal glycaemia, creatinine, IL-6, TNFa, CRP, ADMA and SDMA were higher than normal; arginine was normal. ADMA was related to the total Sequential Organ Failure Assessment score and to arginine, and inversely related to IL-6 and CRP; SDMA was related to the Simplified Acute Physiology Score II, the daily, worst-day and total Sequential Organ Failure Assessment scores, blood urea, creatinine, and arginine. The ADMA/SDMA ratio was inversely related to IL-6. In 58 patients discharged alive, creatinine, IL-6 and CRP decreased over time, ADMA increased, SDMA remained stable, and the ADMA/SDMA ratio increased. In 14 patients who died in the ICU, creatinine, IL-6, TNFa, CRP, and ADMA did not vary, SDMA significantly increased, and the ADMA/SDMA ratio variation was not significant. In both groups, data from the last ICU day confirmed these trends. SDMA but not ADMA was associated with ICU mortality. Conclusions In severe sepsis SDMA is a more robust predictor of organ failure/mortality than ADMA. The stress reaction seems to activate ADMA catabolism, while in survivors, when inflammation subsides, this catabolism seems to be reduced. References

1. Zoccali C, et al.: Asymmetric dimethyl-arginine (ADMA) response to inflammation in acute infections. Nephrol Dial Transplant 2007, 22:801-806.

2. O'Dwyer MJ, et al.: Septic shock is correlated with asymmetrical dimethyl arginine levels, which may be influenced by a polymorphism in the dimethylarginine dimethy-laminohydrolase II gene: a prospective observational study. Crit Care 2006, 10:R139.

IL-2 modulates IFNy mRNA gene expression in cultured peripheral blood mononuclear cells from septic patients

M White1, R Grealy1, D Doherty2, D Kelleher2, R McManus2, T Ryan1

1St James Hospital, Dublin, Ireland; 2Institute of Molecular

Medicine, Trinity College, Dublin, Ireland

Critical Care 2009, 13(Suppl 1):P362 (doi: 10.1186/cc7526)

Introduction IL-2 activates numerous key cells in the immune system [1]. We have previously demonstrated that IFNy mRNA gene expression in peripheral blood mononuclear cells (PBMC) is downregulated in patients with sepsis compared with healthy controls [2]. Here we investigated IFNy gene expression in cultured PBMC from healthy controls and patients with severe sepsis and examined whether exogenous IL-2 can influence IFNy mRNA expression.

Methods PBMC were isolated from five healthy controls and five patients with severe sepsis and were cultured in 24-well plates.

Table 1 (abstract P360)

Cytokine concentrations before and after LTA and mannan challenge

ICU IL-6 ICU TNFa HD IL-6 HD TNFa

Baseline (pg/ml) 363.2 ± 363.1 85.75 ± 56.05 88.78 ± 152.6 34.75 ± 27.17

LTA challenge (pg/ml) 2,128 ± 94.86, P <0.0001 4,836 ± 2,607, P <0.0001 2,057 ± 120.7, P <0.0001 4,175 ± 2,180, P = 0.0035

Mannan challenge (pg/ml) 812.1 ± 562.0, P = 0.0377 168.9 ± 119.9, P = 0.0904 170.9 ± 158, P = 0.2787 56.75 ± 42.01, P = 0.3687

Cells were incubated in medium alone or stimulated with either 1 μg/ml lipopolysaccharide (LPS) for 24 hours (activate monocytes and B cells), 3 μg/ml anti-CD3 (mAb) for 4 hours (activate T cells), or 10 ng/ml phorbol myristate acetate (PMA) + 1 μM ionomycin for 4 hours (activate all cells). Each experiment was performed in the absence and presence of 50 U recombinant IL-2 (rIL-2). Total RNA was isolated and reverse transcribed. IFNy mRNA gene expression was quantified with quantitative RT-PCR. Results were analysed with ANOVA and the t test where appropriate. Results IFNy mRNA production is lower in PBMC of patients with sepsis compared with healthy controls (P = 0.03) but similar when measured in the presence of rIL-2. With T-cell stimulation, IFNy mRNA was greater in PBMC from septic patients compared with controls (P = 0.02) but similar in the presence of rIL-2. With LPS stimulation of monocytes and B cells, IFNy mRNA was lower in PBMC from septic patients compared with controls (P = 0.02), but similar in the presence of rIL-2. With PMA stimulation, IFNy mRNA was similar in PBMC from septic patients and controls in the presence and absence of rIL-2.

Conclusions IFNy mRNA production is reduced in PBMC in sepsis but can be stimulated ex vivo to produce normal IFNy levels. In the absence of rIL-2, IFNy mRNA production is inducible in T cells but not in monocytes and B cells. IL-2 modulation of IFNy mRNA production appears to be stimulus dependent, and may contribute to the host defence mechanism. References

1. Burkett P, et al.: Diverse functions of IL-2, IL-15, and IL-7 in lymphoid homeostasis. Annu Rev Immunol 2006, 24:657-679.

2. O'Dwyer MJ, et al.: The occurrence of severe sepsis and septic shock are related to distinct patterns of cytokine gene expression. Shock 2006, 26:544-550.
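The abstract states that IFNy mRNA was quantified by quantitative RT-PCR but does not say how expression was normalised; a widely used approach is the 2^-ΔΔCt method, sketched below with entirely hypothetical Ct values and a hypothetical housekeeping gene, purely to illustrate how relative expression values of this kind are commonly derived.

```python
# 2^-(delta-delta Ct) relative expression, a common qRT-PCR normalisation.
# The Ct values and the housekeeping gene below are hypothetical.

# Mean threshold cycles (Ct): lower Ct = more transcript
ct_ifng_control, ct_house_control = 27.5, 18.0   # healthy control PBMC
ct_ifng_sepsis, ct_house_sepsis = 30.2, 18.3     # septic patient PBMC

delta_control = ct_ifng_control - ct_house_control  # normalise to housekeeping
delta_sepsis = ct_ifng_sepsis - ct_house_sepsis
delta_delta = delta_sepsis - delta_control

fold_change = 2 ** (-delta_delta)  # sepsis expression relative to control
print(f"IFN-gamma mRNA in sepsis = {fold_change:.2f}-fold of control")
```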

Complement activity patterns in patients with sepsis after human purified C1-esterase inhibitor infusion

A Igonin, N Lazareva, L Dolzhenkova

I.M. Sechenov Medical Academy, Moscow, Russian Federation Critical Care 2009, 13(Suppl 1):P363 (doi: 10.1186/cc7527)

Introduction Our purpose was to assess the effects of supraphysiologic dosages of human purified C1-esterase inhibitor (C1INH) infusion on complement-dependent pathways of the systemic inflammatory response in patients with sepsis.

Figure 1 (abstract P363). C1INH activity over time in the Bicizar (q) and control (c) groups: analysis in quartiles.

Methods In an open-label prospective controlled study, human purified C1INH (Bicizar; BioGenius LLC, Russia) was administered at a total dosage of 12,000 U to 20 sepsis patients. Patients (n = 22) who did not receive a C1INH infusion were enrolled as controls. C3 and C4 complement subunits, IL-6, C-reactive protein and procalcitonin results were analyzed in quartiles according to baseline C1INH activity.

Results C1INH activity differed between quartiles at entry in both groups (P <0.03). C1INH infusion resulted in elevation of C1INH activity in the treatment arm (Figure 1). The most significant difference was observed between the upper quartiles of the groups (P <0.01). The degree of the C1INH activity shift had a negative association with the baseline results (r = -0.635, P <0.01). C3 (P <0.01) and C4 (P <0.05) levels displayed a more rapid and pronounced rise after C1INH infusion in patients with higher baseline C1INH activity. The C-reactive protein concentration had already dropped significantly by the third day of the study in patients who received C1INH (130 mg/l (28 to 293)) in comparison with the control group (185 mg/l (72 to 343); P = 0.022). The same trend characterized the IL-6 level. Patients with elevated procalcitonin values at the onset of sepsis had a more significant C1INH increase (r = 0.635, P <0.05).

Conclusions Downregulation of complement activity with C1INH might help to contain severe systemic inflammation. Presumably, the response to C1INH infusion was related to its baseline activity.
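The quartile-based analysis and the correlations reported above (for example r = -0.635 between the shift in C1INH activity and baseline activity) can be sketched as below with pandas and SciPy; all values are simulated placeholders.

```python
# Quartile binning of baseline C1INH activity and a Spearman correlation
# between baseline activity and its post-infusion change. Data are simulated.
import numpy as np
import pandas as pd
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
baseline = rng.normal(70, 20, 20)                    # % activity, hypothetical
shift = 60 - 0.5 * baseline + rng.normal(0, 10, 20)  # change after C1INH, hypothetical

df = pd.DataFrame({"baseline": baseline, "shift": shift})
df["quartile"] = pd.qcut(df["baseline"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

print(df.groupby("quartile", observed=True)["shift"].median())  # shift by quartile
rho, p = spearmanr(df["baseline"], df["shift"])
print(f"Spearman rho = {rho:.2f}, P = {p:.3f}")
```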

Detrimental hemodynamic and inflammatory effects of microparticles

S Mortaza, P Asfar

CHU Angers, France

Critical Care 2009, 13(Suppl 1):P364 (doi: 10.1186/cc7528)

Introduction Microparticles (MPs) are membrane vesicles with procoagulant and proinflammatory properties released during cell activation and might be potentially involved in the pathophysiology of septic shock [1,2]. The present study was designed to assess the effects of MPs from septic origin on systemic hemodynamics as well as on the inflammatory, oxidative and nitrosative stresses. Methods We designed a prospective, randomized, controlled experimental study with repeated measurements. Forty healthy rats were randomly allocated to three groups: 10 animals inoculated with MPs isolated from control rats (cMPs), 15 animals inoculated

with MPs isolated from sham rats (shMPs) and 15 animals inoculated with MPs isolated from rats with peritonitis (sMPs). Rats were anesthetized, mechanically ventilated and were infused with the same amount of cMPs or shMPs or sMPs. We measured the heart rate, mean arterial pressure, carotid artery blood flow and portal vein blood flow. Hemodynamic parameters were recorded during 7 hours, and then animals were sacrificed. The aorta and heart were harvested for further in vitro tissue analyses. Results (1) The cellular origin (phenotype) but not the circulating concentration of MPs was different in septic rats, characterized by a significant increase in leukocyte-derived MPs. (2) sMPs but not cMPs or shMPs decreased the mean arterial pressure without any effect on carotid artery and portal vein blood flows. (3) Rats inoculated with sMPs exhibited an increase in superoxide ion production and NF-kB activity, overexpression of inducible NO synthase with subsequent NO overproduction and decrease in endothelial NO synthase activation.

Conclusions Rats with sepsis induced by peritonitis exhibited a specific phenotype of MPs. Inoculation of sMPs into healthy rats reproduced the hemodynamic and inflammatory patterns of sepsis, associated with oxidative and nitrosative stresses. References

1. Morel O, et al.: Cellular microparticles, a disseminated storage pool of bioactive vascular effectors. Curr Opin Hematol 2004, 11:156-164.

2. Martinez MC, et al.: Shed membrane microparticles from circulating and vascular cells in regulating vascular function. Am J Physiol Heart Circ Physiol 2005, 288:1004-1009.

Hydrogen sulfide: anti-inflammatory and cytoprotective effects

C Szabo

Ikaria Inc., Seattle, WA, USA

Critical Care 2009, 13(Suppl 1):P365 (doi: 10.1186/cc7529)

Introduction Pharmacological actions of gaseous biological mediator hydrogen sulfide (H2S) include vasodilatation, inhibition of mitochondrial respiration as well as induction of suspended animation-like state [1,2]. Its beneficial cellular actions include cytoprotection and anti-inflammatory effects [2]. Methods Rodent models of ischemia and reperfusion of the heart, murine and ovine models of acute respiratory distress syndrome, a canine model of cardiopulmonary bypass, porcine models of myocardial infarction and thoracoabdominal aortic aneurysm surgery were used. H2S was administered in an iso-osmolar, pH-neutral intravenous formulation (IK-1001).

Results In a mouse model of myocardial infarction, a significant protection by IK-1001 was seen in terms of reduction of myocardial infarct size. These effects were accompanied by reduction in myocardial IL-1 levels and reduction in neutrophil infiltration [3]. The cardiac protection of H2S was confirmed in a porcine model of myocardial ischemia [4] and in a canine model of cardiopulmonary bypass surgery. IK-1001 was also protective in murine and ovine models of acute respiratory distress syndrome, where improvement in survival and pulmonary function was accompanied by a reduction in oxidant stress and suppression of the production of IL-6 and inhibition of the expression of inducible nitric oxide synthase [5]. In models of thoracoabdominal aneurysm surgery, reperfusion was accompanied with significant increases in the production of IL-1 and IL-6, which were reduced by IK-1001. IK-1001 also improved renal function, provided hemodynamic stabilization and attenuated oxidative DNA damage [6].

Conclusions H2S thus exerts organ-protective and antiinflammatory effects in various animal models of critical illness. The mechanisms may involve metabolic effects leading to the induction of hypothermia, antioxidant mechanisms, modulation of gene expression, activation of KATP channels, and inhibition of inflammatory cell activation. References

1. Blackstone E, et al.: H2S induces a suspended animationlike state in mice. Science 2005, 308:518.

2. Szabo C: Hydrogen sulphide and its therapeutic potential. Nat Rev Drug Discov 2007, 6:917-935.

3. Elrod JW, et al.: Hydrogen sulfide attenuates myocardial ischemia-reperfusion injury by preservation of mitochondrial function. Proc Natl Acad Sci U S A 2007, 104:15560-15565.

4. Sodha NR, et al.: The effects of therapeutic sulfide on myocardial apoptosis in response to ischemia-reperfusion injury. Eur J Cardiothorac Surg 2008, 33:906-913.

5. Esechie A, et al.: Protective effect of hydrogen sulfide in a murine model of acute lung injury induced by combined burn and smoke inhalation. Clin Sci (Lond) 2008, 115:91-97.

6. Simon F, et al.: Hemodynamic and metabolic effects of hydrogen sulfide during porcine ischemia/reperfusion injury. Shock 2008, 30:359-364.

Dimethyl sulphoxide administration decreases renal ischemic-reperfusion injury

W Hoyos, I Medina, R López, J Ramos, Z Garcia, S López

Universidad 'Dr. José Matías Delgado', Santa Tecla, La Libertad, El Salvador

Critical Care 2009, 13(Suppl 1):P366 (doi: 10.1186/cc7530)

Introduction Kidney ischemia is one of the mechanisms of acute renal failure, and it is known that free oxygen radicals play an important role in this type of injury. We tested the hypothesis that dimethyl sulphoxide (DMSO), a free radical scavenger, has a protective role in a renal ischemia-reperfusion animal model [1]. Methods Renal ischemia was induced in 45 male New Zealand rabbits. The subjects were divided into three groups based on ischemia duration: 30, 60 or 90 minutes. Each group was divided into three subgroups: (a) DMSO previous to ischemia, (b) DMSO after ischemia, (c) control group. All subjects were given 6 hours of reperfusion. Three blood samples were taken, at the baseline, ischemic and reperfusion phases. Each sample was tested for serum creatinine, blood urea nitrogen and urea. After reperfusion, bilateral nephrectomy was performed on each subject before euthanasia. A pathological analysis evaluated tubular and basement membrane changes. The level of injury was scaled in three stages: mild, moderate and severe.

Results The histological analysis showed severe damage in 33% of the control group, compared with 0% in both treatment groups (chi-square P = 0.00) (Table 1). Blood chemistry analysis in the control group at 60 and 90 minutes of ischemia showed higher

Table 1 (abstract P366)

Proportion estimate of histopathological findings (n = 45)

            Control (%)   DMSO pre (%)   DMSO post (%)
Mild             46            40             60
Moderate         20            60             40
Severe           33             0              0

values than in both treatment groups. Creatinine values were analyzed as proportion estimates, showing that 26% of subjects in the control group had an improvement when the ischemic phase was compared with the reperfusion phase. The treatment groups showed that 46% and 60% of the subjects improved their creatinine values when DMSO was administered pre ischemia and post ischemia, respectively.

Conclusions The animal model showed an increasing trend of all blood chemistry parameters evaluated in the control group. DMSO applied as prophylaxis or as treatment post ischemia was associated with diminished deterioration of renal function. Histological analysis revealed the absence of severe lesions when DMSO was administered. Reference

1. Kolb KH, et al.: Absorption, distribution and elimination of labeled dimethyl sulfoxide in man and animals. Ann NY Acad Sci 1967, 141:85-95.

Beneficial effects of the heme oxygenase-1/carbon monoxide system

N Takeyama, S Takaki, Y Kajita, T Yabuki, H Noguchi, Y Miki, Y Inoue, T Nakagawa, H Noguchi

Aichi Medical University, Aichi, Japan

Critical Care 2009, 13(Suppl 1):P367 (doi: 10.1186/cc7531)

Introduction It has been reported that the blood level of carboxyhemoglobin (CO-Hb) is increased in critically ill patients such as those with sepsis or multiple trauma [1,2]. Because carbon monoxide (CO) is one of the metabolites of heme catabolism, it has been suggested that there may be increased breakdown of heme in these patients. Heme oxygenase (HO) is the enzyme involved in the rate-limiting step for catabolism of heme-containing proteins. It was recently reported that HO-1 acts as a potent anti-inflammatory agent and antioxidant through its products [3]. HO-1 may therefore be an inducible defense against cellular stress that occurs during the inflammatory process [1,4]. Against this background, we decided to evaluate the relation among the blood level of CO, HO-1 expression by monocytes, oxidative stress, and the outcome of sepsis.

Methods Thirty patients who fulfilled the criteria for severe sepsis or septic shock and 17 other patients without sepsis during their stay in the ICU were studied. HO-1 expression by monocytes, arterial CO, oxidative stress, and cytokines were measured. Results Arterial blood levels of CO, cytokines, as well as monocyte HO-1 expression were higher in septic shock patients than in nonseptic patients. Increased HO-1 expression was significantly correlated with the arterial CO concentration and oxidative stress. There was a positive correlation between survival and higher HO-1 expression or CO level.

Conclusions We found that the increase of endogenous CO production in sepsis mainly reflects increased heme turnover secondary to upregulation of HO-1, which is partially in response to systemic oxidative stress. A strong correlation between the blood CO level and survival supports the beneficial effect of HO-1 upregulation and increased CO production in patients with sepsis. References

1. Hoetzel A, et al.: Carbon monoxide in sepsis. Antioxid Redox Signal 2007, 9:2013-2026.

2. Bauer M, et al.: The heme oxygenase-carbon monoxide system: regulation and role in stress response and organ failure. Intensive Care Med 2008, 34:640-648.

3. Foresti R, et al.: Use of carbon monoxide as a therapeutic agent: promises and challenges. Intensive Care Med 2008, 34:649-658.

4. Scott JR, et al.: Restoring homeostasis: is heme oxyge-nase-1 ready for the clinic? Trends Pharmacol Sci 2007, 28:200-205.

Assessment of IL-18 values in septic acute lung injury/acute respiratory distress syndrome patients

T Kikkawa, Y Suzuki, H Makabe, S Shibata, G Takahashi, N Matsumoto, N Sato, S Endo

Iwate Medical University, Morioka, Japan

Critical Care 2009, 13(Suppl 1):P368 (doi: 10.1186/cc7532)

Introduction IL-18 has been reported to be involved in organ injury. We investigated the IL-18 values of septic acute lung injury (ALI) and acute respiratory distress syndrome (ARDS) patients. Methods The subjects were 38 patients during the 3-year period from 2004 to 2007 from whom it was possible to collect a blood specimen within approximately 6 hours of the onset of septic ALI or ARDS. Their mean age was 67 years, and their mean APACHE II score was 29. Their Sequential Organ Failure Assessment score was 13, and their mean PaO2/FiO2 ratio was 170. The PaO2/FiO2 ratio was 246 in the ALI group and 135 in the ARDS group. There were four cases (10.5%) in the 28-day mortality group, and six cases (15.8%) in the 90-day mortality group. Results The value of IL-18 in the nonsurviving group was significantly higher than in the surviving group (survivors 1,649 ± 1,056 pg/ml vs. nonsurvivors 4,523 ± 2,798 pg/ml; P <0.05), and in the ARDS group it was also significantly higher than in the ALI group (2,467 ± 1,880 pg/ml vs. 1,314 ± 800 pg/ml; P <0.05).

Conclusions These results suggest that IL-18 may play a major role in the progression of ARDS as the respiratory component of multiple organ failure.
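The group comparisons above (nonsurvivors vs. survivors, ARDS vs. ALI) are unpaired two-group tests of IL-18 levels; the abstract does not name the test used, so the sketch below assumes Welch's t test and uses hypothetical values.

```python
# Unpaired two-group comparison of IL-18 levels (Welch's t test assumed;
# the abstract does not specify the test). Values are hypothetical.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(4)
il18_ards = rng.normal(2467, 1880, 25).clip(min=50)  # pg/ml, hypothetical
il18_ali = rng.normal(1314, 800, 13).clip(min=50)    # pg/ml, hypothetical

t, p = ttest_ind(il18_ards, il18_ali, equal_var=False)  # Welch's t test
print(f"t = {t:.2f}, P = {p:.4f}")
```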

Cytochrome C is not released in the heart during sepsis-induced myocardial depression

L Smeding, TA Van Veelen, WJ Van der Laarse, RR Lamberts, J Groeneveld, FB Plotz, MC Kneyber

Institute for Cardiovascular Research VU, VU University Medical Center, Amsterdam, the Netherlands

Critical Care 2009, 13(Suppl 1):P369 (doi: 10.1186/cc7533)

Introduction Sepsis is often accompanied by myocardial depression; one of the possible mechanisms includes mitochondrial injury. We hypothesized that opening of the mitochondrial permeability transition pore (mPTP) may occur during sepsis-induced myocardial depression. We investigated the opening of the mPTP in septic hearts by measuring cytochrome C release into the cytosol.

Methods Sepsis was induced in rats by intraperitoneal injection of 7.5 mg/kg lipopolysaccharide. After 4 hours, hearts were excised and mounted in a Langendorff setup to study myocardial contractility ex vivo. Subsequently, hearts were frozen and 5 μm cryostat sections were made. Sections were stained for cytochrome C using immunohistochemistry. Healthy rats served as controls. Results Septic animals showed decreased contractility (P <0.005) and lower developed pressure (P <0.001) when compared with healthy controls. Immunohistochemistry revealed no release of cytochrome C in healthy or septic hearts. Conclusions Cytochrome C is not released during sepsis-induced myocardial depression. These results indicate that the mitochondrial permeability transition pore may not be involved in the development of myocardial dysfunction during sepsis.

Strain-specific and pathogen-specific physiologic and genomic differences in murine inflammatory cardiac dysfunction

G Ackland1, R Agrawal2, C Hou3, A Patterson2

1 University College London, UK; 2Stanford University, Stanford, CA, USA; 3Washington University School of Medicine, St Louis, MO, USA

Critical Care 2009, 13(Suppl 1):P370 (doi: 10.1186/cc7534)

Introduction Comparing genomic changes in mouse strains that demonstrate different physiologic responses to pathologic insults is a novel approach to elucidating potential mechanisms. Our hypothesis was that murine strains exhibit different cardiac/genomic responses to specific pathogens.

Methods Cardiac performance (the end-systolic pressure-volume relationship (ESPVR) and end-diastolic pressure-volume relationship (EDPVR)) was compared in B6, C57 and FVB mice (pressure-volume loops, Millar catheter; isoflurane anesthesia) 4 hours after intraperitoneal zymosan (ZYM) or endotoxin (LPS). Gene expression profiles unique to mouse strain/baseline/treatment were created using two-way ANOVA and two-fold filtering. Results ESPVR improved in B6/C57 mice after ZYM (Figure 1). Diastolic compromise (Figure 2) occurred in FVB mice following ZYM but in B6 mice after LPS. Genomic analyses within strains revealed pathogen-specific differences: for example, ZYM-treated FVB mice (diastolic impairment) demonstrated downregulation of key cell cycle, vascular endothelial growth factor and L-type calcium channel genes, with upregulation of T-cell receptor and src-family kinase genes (exaggerated inflammatory response). Conclusions Comparative genomic analyses provide new insights into septic cardiac pathophysiology.

Figures 1 and 2 (abstract P370). ESPVR and EDPVR (RVUs, relative volume units) at baseline and after LPS or zymosan in B6, C57 and FVB mice.

Acknowledgement Supported in part by the National Heart, Lung and Blood Institute (Patterson), and the Intensive Care Society (Ackland).
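As an illustration of the two-fold filtering step mentioned in the Methods of abstract P370, the sketch below applies a fold-change filter to synthetic expression data. The array counts, data and threshold are invented for demonstration; this is not the authors' pipeline, and only the fold-change step (not the two-way ANOVA) is shown.

    # Illustrative sketch only: select genes whose mean expression changes at
    # least two-fold between baseline and treatment within one strain.
    import numpy as np

    rng = np.random.default_rng(0)
    n_genes = 1000
    baseline = rng.lognormal(mean=5.0, sigma=1.0, size=(4, n_genes))  # 4 baseline arrays (synthetic)
    treated = rng.lognormal(mean=5.0, sigma=1.0, size=(4, n_genes))   # 4 treated arrays (synthetic)

    # log2 fold change of the per-gene means; |log2 FC| >= 1 corresponds to a two-fold change
    log2_fc = np.log2(treated.mean(axis=0)) - np.log2(baseline.mean(axis=0))
    two_fold = np.abs(log2_fc) >= 1.0

    print(f"{two_fold.sum()} of {n_genes} genes pass the two-fold filter")

In the study this filter was combined with a two-way ANOVA across strain and treatment to define the strain/treatment-specific profiles.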

Diastolic but not systolic dysfunction is associated with troponin and N-terminal pro-brain natriuretic peptide elevation in sepsis

G Landesberg, Y Meroz, S Goodman, D Levin

Hadassah University Hospital, Jerusalem, Israel

Critical Care 2009, 13(Suppl 1):P371 (doi: 10.1186/cc7535)

Introduction Cardiac dysfunction is one of the key features of sepsis and septic shock, yet its mechanism is poorly understood. We aimed to investigate the pathophysiology of cardiac dysfunction in sepsis by integrating echocardiographic findings with biochemical and inflammatory markers.

Methods Over 14 months, 127 consecutive septic/systemic inflammatory response syndrome (SIRS) patients in our ICU were enrolled. All patients underwent transthoracic echocardiography with measurements of systolic and diastolic function, and blood samples were collected and serum separated for measurement of biomarkers. All clinical parameters were collected from the ICU charts on the days of the echocardiography studies and blood sampling. Outcome measures were ICU survival, in-hospital survival and survival up to 2 years. Patients with segmental wall motion abnormality (indicating myocardial infarction or regional ischemia) and/or significant mitral or aortic disease were excluded from the analyses.

Results Of 86 patients without regional myocardial dysfunction and/or significant valvular disease, 36 (42%) died during follow-up, almost all within the first 6 months. Thirty-one (36%) patients had positive blood cultures; they were more tachycardic and hypotensive and had a shorter E-wave deceleration time than SIRS (culture-negative) patients (P = 0.024). The echocardiographic measurements most predictive of mortality by Cox survival analysis were the E-wave/Em ratio (Exp(B) = 1.12, P = 0.006) and the pressure gradient over the tricuspid valve (Exp(B) = 1.04, P < 0.0001). Among the biomarkers, N-terminal pro-brain natriuretic peptide (NT-proBNP) and IL-18 were the strongest predictors of mortality (P = 0.004 and P < 0.001). Troponin and NT-proBNP correlated best with a higher E-wave/Em ratio, lower Em and Sm waves, and the cytokines TNFα and IL-8. Left ventricular end-diastolic volume, left ventricular end-systolic volume and left ventricular ejection fraction did not predict survival and did not correlate with troponin or NT-proBNP elevation.
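As an illustration of the type of Cox survival analysis reported above (hazard ratios for the E-wave/Em ratio and the tricuspid gradient), the following sketch uses the lifelines package on a small synthetic data set. The column names, values and choice of package are assumptions for demonstration, not the study's data or software.

    # Minimal Cox proportional hazards sketch with two echocardiographic covariates.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "followup_days": [30, 180, 365, 90, 540, 720, 60, 400, 200, 610],
        "died":          [1,   1,   0,  1,   0,   1,  0,   0,   1,   0],
        "E_over_Em":     [16.2, 14.8, 9.1, 12.5, 11.4, 10.2, 13.0, 8.4, 9.8, 10.9],
        "TR_gradient":   [42, 38, 25, 45, 30, 28, 33, 22, 36, 26],  # mmHg
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="followup_days", event_col="died")
    # exp(coef) is the hazard ratio per unit increase of each covariate
    print(cph.summary[["exp(coef)", "p"]])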

Conclusions After exclusion of all patients with coronary artery disease and/or significant valvular dysfunction, there is still a significant incidence of troponin and NT-proBNP elevation in septic/SIRS patients. Echocardiographic features most significantly associated with troponin and NT-proBNP elevation were measures of diastolic dysfunction (high E-wave/Em ratio, low Sm and Em). Measures of systolic dysfunction did not correlate with troponin or NT-proBNP elevation.

L-Threonine treatment enhances heat shock protein 25 expression and prevents apoptosis in heat-stressed intestinal epithelial-18 cells

CR Hamiel, A Kallweit, R Beck, K Queensland, PE Wischmeyer

University of Colorado, Aurora, CO, USA

Critical Care 2009, 13(Suppl 1):P372 (doi: 10.1186/cc7536)

Introduction Osmotically acting amino acids, such as glutamine, can be cytoprotective following injury in vitro and in vivo. As threonine (THR) can also induce cell-swelling, the aim of this study was to investigate the potential for THR to induce cellular protection in intestinal epithelial-18 cells and to elucidate the mechanism by which it may work.

Methods Cells were treated for 15 minutes with increasing doses of THR up to 20 mM, with or without subsequent heat stress (HS) injury. Cell survival was evaluated by MTS assay 24 hours after lethal HS (44°C for 50 min). All HS groups were normalized to their non-HS controls. Western blot analysis was used to determine active caspase-3 (an indicator of apoptosis) and heat shock protein 25 (HSP25) expression/cellular localization in cells subjected to nonlethal HS (43°C for 45 min). Enhanced nuclear translocation of HSP25 has been linked to decreased apoptosis. Microscopy was used to visualize cell size and morphology, since cytoplasmic HSP25 has been shown to enhance actin stabilization during cellular stress.

Results THR increased cell survival in a dose-dependent manner (P = 0.008 vs. HS controls (CT), n = 3). A control amino acid cocktail (20 mM valine, alanine and phenylalanine) failed to provide protection from lethal heat stress. Active caspase-3 was highest in HS cells and decreased with THR (P = 0.0006 vs. HS CT). HSP25 was predominantly cytoplasmic in non-HS cells and increased in a dose-dependent manner with THR (P = 0.0006 vs. CT). HS caused nuclear translocation of HSP25, and this effect was increased even further with THR treatment (P < 0.05 vs. HS CT). Microscopy showed preserved cell size and structural integrity of the actin cytoskeleton in HS cells treated with THR. Cell size decreased during HS by 40 ± 5% (P = 0.003 vs. non-HS controls). These effects were completely attenuated with THR treatment (P = 0.00001 vs. HS CT).

Conclusions THR protected cells from lethal HS by decreasing apoptosis. THR can induce HSP25 in CT cells and enhance its nuclear translocation in HS cells. THR treatment preserved the structural integrity of the actin cytoskeleton and prevented cellular crenation during HS. It is possible that THR's mechanism of cellular protection involves cytoskeletal stabilization and decreased apoptosis mediated by HSP25.

Repeated measurements of N-terminal pro-brain natriuretic peptide enable dynamic risk stratification in critically ill patients

B Meyer1, M Hulsmann1, P Wexberg1, M Nikfardjam1, G Strunk2, T Szekeres1, G Gouya1, R Pacher1, G Heinz1

Medical University of Vienna, Austria; 2University of Economics and Business Administration, Vienna, Austria

Critical Care 2009, 13(Suppl 1):P373 (doi: 10.1186/cc7537)

Introduction Risk stratification is a major problem in the care of critically ill patients. To date, no prognostic marker has gained widespread acceptance for ongoing risk stratification. In the present study we aimed to determine whether N-terminal pro-brain natriuretic peptide (NT-pro-BNP) serves as a marker for dynamic risk stratification.

Methods This prospective observational study was performed in the ICU of the Department of Cardiology/Medical University of Vienna between August 2004 and June 2007. Adult patients with a length of ICU stay >48 hours were included. In addition to routine clinical and laboratory assessment, blood samples for determination of NT-pro-BNP were obtained in all patients on admission (NT-pro-BNP-0h) and after 48 hours (NT-pro-BNP-48h). NT-pro-BNP plasma levels were assessed by use of commercially available kits.

Results Of 286 patients included (196 male (68.5%), age 64 ± 14 years), there were 226 ICU survivors (79%). ICU survivors had significantly lower NT-pro-BNP-0h and NT-pro-BNP-48h levels than ICU nonsurvivors (7,063 ± 9,183 vs. 15,254 ± 12,850 pg/ml, P < 0.0001, for NT-pro-BNP-0h; and 8,304 ± 9,147 vs. 17,302 ± 12,687 pg/ml, P < 0.0001, for NT-pro-BNP-48h). There was no statistically significant difference in the change of NT-pro-BNP levels between ICU survivors and ICU nonsurvivors (ΔNT-pro-BNP 1,240 ± 7,814 vs. 2,047 ± 11,081 pg/ml, P = 0.624), but significantly more ICU survivors had a decrease in NT-pro-BNP within 48 hours (37% vs. 33%, P < 0.0001). In Cox regression models, NT-pro-BNP-0h, NT-pro-BNP-48h and increase/decrease of NT-pro-BNP were independent predictors of ICU mortality within 28 days, with NT-pro-BNP-48h being the most potent parameter (NT-pro-BNP-0h Wald 11.289, P = 0.001; NT-pro-BNP-48h Wald 17.630, P < 0.001; increase/decrease Wald 4.992, P = 0.025). The area under the receiver operating characteristic curve for prediction of ICU survival was 0.714 (P < 0.0001) for NT-pro-BNP-0h, 0.713 (P < 0.0001) for NT-pro-BNP-48h and 0.489 (P = 0.800) for ΔNT-pro-BNP.

Conclusions NT-pro-BNP reflects not only the severity of disease on ICU admission but, more importantly, its change over time monitors the severity of disease during the ICU stay.
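The receiver operating characteristic analysis reported above can be reproduced in outline as follows. The NT-pro-BNP values and outcomes below are synthetic, so the resulting AUC is illustrative only and will not match the published 0.714.

    # Sketch of an ROC AUC calculation for admission NT-pro-BNP vs. ICU death.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    died = rng.integers(0, 2, size=200)  # 1 = ICU nonsurvivor (synthetic labels)
    # Synthetic NT-pro-BNP: nonsurvivors drawn from a higher distribution
    ntprobnp_0h = np.where(died == 1,
                           rng.normal(15000, 9000, 200),
                           rng.normal(7000, 6000, 200)).clip(min=50)

    auc_0h = roc_auc_score(died, ntprobnp_0h)
    print(f"AUC for admission NT-pro-BNP: {auc_0h:.3f}")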

CT-pro-endothelin-1 and prognosis in critically ill patients with respiratory failure

B Meyer1, P Wexberg1, M Nikfardjam1, G Heinz1, N Morgenthaler2, A Bergmann2, J Struck2, R Pacher1, M Hulsmann1

Medical University of Vienna, Austria; 2BRAHMS AG, Hennigsdorf, Germany

Critical Care 2009, 13(Suppl 1):P374 (doi: 10.1186/cc7538)

Introduction Endothelin-1 is known to be elevated in patients with various pulmonary and nonpulmonary diagnoses. CT-pro-endothelin-1 (CT-pro-ET-1) is a stable fragment of the endothelin-1 precursor. In the present study we tested whether CT-pro-ET-1 is elevated in critically ill patients admitted to the ICU with respiratory failure, and whether an elevation of CT-pro-ET-1 predicts an adverse outcome.

Methods In this prospective observational study we included 78 patients with documented respiratory failure on ICU admission and 266 patients with various other diagnoses and without respiratory failure. Blood samples for determination of CT-pro-ET-1 were obtained in all patients on ICU admission. CT-pro-ET-1 was determined by use of a new sandwich immunoassay.

Results Respiratory failure was attributed to a primary pulmonary cause in 66 patients: chronic obstructive pulmonary disease (n = 15), pulmonary hypertension (n = 7), pneumonia (n = 17), acute respiratory distress (n = 3), pulmonary embolism (n = 6), postoperative respiratory failure (n = 12) and various/mixed causes (n = 6). A total of 12 patients had respiratory failure because of primary cardiogenic edema. Patients presenting with primary pulmonary failure on ICU admission had significantly higher CT-pro-ET-1 levels compared with patients with a diagnosis of cardiogenic pulmonary edema and patients without respiratory failure (193 ± 117 vs. 160 ± 67 vs. 148 ± 94 pmol/l, P = 0.007). In patients with primary pulmonary failure and in patients with cardiogenic edema, there was no statistically significant difference in CT-pro-ET-1 levels between ICU survivors and ICU nonsurvivors (195 ± 155 vs. 191 ± 127 pmol/l, P = 0.908 and 160 ± 75 vs. 164 ± 5 pmol/l, P = 0.940). In contrast, in the mixed cohort of critically ill patients without respiratory failure, CT-pro-ET-1 levels were significantly lower in ICU survivors than in ICU nonsurvivors (140 ± 93 vs. 179 ± 95 pmol/l, P = 0.007).

Conclusions CT-pro-ET-1 plasma levels are increased in patients admitted to the ICU because of respiratory failure. Elevated plasma levels of CT-pro-ET-1 are a potent marker of adverse outcome in our mixed cohort of critically ill patients. In the subgroup of patients admitted to the ICU because of respiratory failure due to a primary pulmonary cause or due to cardiogenic edema, however, elevation of CT-pro-ET-1 did not provide statistically significant prognostic information.

Serum high-mobility group box-1 protein as a specific marker of severe abdominal injury

Y Sakamoto1, K Mashiko1, H Matsumoto1, H Yokota2

1Chiba Hokusou Hospital, Nippon Medical School, Chiba, Japan; 2Nippon Medical School, Tokyo, Japan

Critical Care 2009, 13(Suppl 1):P375 (doi: 10.1186/cc7539)

Introduction Trauma patients with altered consciousness are at risk of missed abdominal injuries. A specific marker of severe abdominal trauma would therefore be useful for the rapid evaluation and treatment of abdominal injuries in trauma cases. High-mobility group box-1 protein (HMGB1), on the other hand, has been widely studied in relation to its role in sepsis and inflammation [1,2].

Methods We measured serum HMGB1 concentrations in 50 consecutive trauma patients as early as possible after arrival in the emergency room. All cases with an Abbreviated Injury Scale score over 3 were enrolled in the study. The correlations between serum HMGB1 levels and the body region of injury were examined. A comparative evaluation of the Injury Severity Score, Revised Trauma Score and probability of survival was then conducted in patients with and without elevated serum HMGB1 concentrations. In addition, we measured serum HMGB1 levels in 45 septic shock patients and examined the relationship between the HMGB1 level and the underlying disease.

Results There were no significant correlations between serum HMGB1 levels and the presence of severe head, chest or extremity injury. On the other hand, serum HMGB1 levels were significantly higher in the subgroup with severe abdominal injury (mean HMGB1 level, 16.9 ng/ml; 10 cases) than in the subgroup without severe abdominal injury (mean HMGB1 level, 2.2 ng/ml; 40 cases) (P = 0.0069). In septic shock patients, serum HMGB1 levels were significantly higher in the subgroup with peritonitis (mean HMGB1 level, 19.1 ng/ml; 22 cases) than in the subgroup without peritonitis (mean HMGB1 level, 8.8 ng/ml; 23 cases) (P = 0.0447).

Conclusions According to our data, high serum levels of HMGB1 did not correlate with an increased likelihood of head, chest or extremity injury, but were significantly correlated with the presence of severe abdominal injury. This result suggests that the serum HMGB1 level may represent a specific marker of severe abdominal injury.

References

1. Sakamoto Y, et al.: Relationship between effect of polymyxin B-immobilized fiber and high mobility group box-1 protein in septic shock patients. ASAIO J 2007, 53: 324-328.

2. Sakamoto Y, et al.: Clinical responses and improvement of some laboratory parameters following polymyxin B-immobilized fiber treatment in septic shock. ASAIO J 2007, 53:646-650.

LightCycler SeptiFast in early diagnosis of sepsis: our experience

SM Raineri, D Canzio, C Sarno, ND Cascio, G Mineo, R Chiaramonte, A Giarratano

University of Palermo, Italy

Critical Care 2009, 13(Suppl 1):P376 (doi: 10.1186/cc7540)

Introduction Conventional sepsis diagnosis by culture requires 24 hours for bacterial identification and 36 hours for fungi. Empiric therapy slows the growth of bacteria and fungi or may yield negative findings in many cases of septic shock. Molecular techniques can contribute to a more rapid and specific diagnosis in septic patients. SeptiFast detects the DNA of 26 bacterial and fungal species using real-time PCR, giving results within 6 hours. This is important for de-escalation therapy and early initiation of appropriate antibiotic treatment. The aim of this study was to evaluate the sensitivity and specificity of the SeptiFast test versus traditional diagnosis.

Methods We enrolled 16 patients admitted to the ICU during the preceding 6 months with surgical severe sepsis or septic shock. All patients were receiving empiric antibiotics at the time of testing. Fungal and bacterial DNA was detected with the LightCycler SeptiFast. We compared the results of molecular diagnosis with conventional blood culture (BACTEC 9050).

Results The preliminary results are presented in Table 1.

Table 1 (abstract P376). Comparison of results

Pathogen                           BACTEC   SeptiFast
Coagulase-negative staphylococci   7        6
Acinetobacter baumannii            3        4
Klebsiella                         3        4
Pseudomonas aeruginosa             2        3
Enterococcus faecalis              8        8
Stenotrophomonas maltophilia       1        1

Conclusions The results of BACTEC and SeptiFast were the same in 12/16 (75%) patients. In 3/16 (18%) patients the SeptiFast test was positive where BACTEC was not. In only 1/16 (6%) was BACTEC positive and SeptiFast negative; this result was probably due to contamination of the BACTEC samples. SeptiFast is a valuable adjunct to the traditional culture-based gold standard, including during antimicrobial treatment. This study represents a limited assessment of the assay's performance; a prospective study in a larger population is needed to properly assess its clinical value.
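A minimal check of the per-patient concordance figures quoted above (the abstract rounds them to 75%, 18% and 6% of the 16 patients):

    # Arithmetic behind the concordance percentages reported for P376.
    total = 16
    counts = [
        ("concordant (BACTEC = SeptiFast)", 12),
        ("SeptiFast positive, BACTEC negative", 3),
        ("BACTEC positive, SeptiFast negative", 1),
    ]
    for label, n in counts:
        print(f"{label}: {n}/{total} = {100 * n / total:.1f}%")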

Differential diagnosis of systemic inflammatory response syndrome versus sepsis based on a multiplex quantitative PCR assay

A Kortgen1, M Bauer1, E Möller2, S Rußwurm2, K Reinhart1

1Friedrich-Schiller-University, Jena, Germany; 2SIRS-Lab GmbH, Jena, Germany

Critical Care 2009, 13(Suppl 1):P377 (doi: 10.1186/cc7541)

Introduction Differential diagnosis of systemic inflammation versus sepsis is based primarily on clinical criteria and laboratory tests that lack the required sensitivity and specificity. This dilemma holds true for medical as well as surgical patients, for example after bypass surgery or in autoimmune disease. Both populations are characterized by signs of systemic inflammation and an increased risk of nosocomial infection and thus represent patients at risk for sepsis with significant diagnostic uncertainty. Novel and robust biomarkers are urgently needed to identify infection correctly and in a timely manner as the underlying cause of a systemic host response, because each hour of delay in anti-infectious therapy increases mortality.

Methods We analyzed the clinical utility of a new class of transcriptomic biomarkers derived from circulating leukocytes. Prospectively collected whole blood samples from 460 patients admitted to the operative ICU were included in a microarray/quantitative PCR study to identify sensitive and specific biomarkers. The identification of a signature specific for the discrimination between systemic inflammatory response syndrome and sepsis in patients suffering from shock and organ dysfunction was performed in independent training and test phases. The training set of 96 patients was selected by an independent ICU committee.

Results An algorithm was established combining and transforming the gene-expression signals into a continuous, nondimensional score indicating either infectious or noninfectious causes of organ dysfunction. The resulting classifier was validated in a test set comprising 1,784 ICU-days from 364 patients. For each marker, a robust quantitative PCR assay was established. The final microarray signature could be transferred into a multiplex quantitative PCR format retaining full sensitivity and specificity, with a time to result of approximately 5 hours. Moreover, a combination of seven biomarkers possessed the same accuracy as the complete biomarker set. The AUC in the test group was 0.79 (procalcitonin 0.65, C-reactive protein 0.67).

Conclusions With its high predictive value for the differentiation between infectious and noninfectious causes of shock and organ dysfunction, this new class of biomarkers may help to identify patients with life-threatening infections among patients at risk and to guide therapy, for example with anti-infective agents.

Sepsis in the emergency department: pathogen identification by blood cultures and PCR

S Hettwer, J Wilhelm, D Hammer, M Schürmann, M Amoury, S Scheubel, F Hofmann, A Oehme, D Wilhelms, AS Kekule, K Werdan

Martin-Luther-University, Halle, Germany

Critical Care 2009, 13(Suppl 1):P378 (doi: 10.1186/cc7542)

Introduction Sepsis in its early stage is a common condition in emergency medicine, and rapid diagnosis is essential. The aim of our monocentric observational study (characterization of patients with sepsis in the emergency department) was to compare pathogen diagnosis by blood cultures (BC) and PCR.

Methods Two BC (aerobic and anaerobic) and blood for PCR testing were taken from each patient immediately after admission, before initiating antibiotic therapy. For PCR testing we used the LightCycler SeptiFast (Roche Diagnostics, Basel, Switzerland), which enables rapid detection of the 20 most important sepsis pathogens. SeptiFast test kits were provided free of charge.

Results We analyzed 95 patients with suspected severe infection. Thirty-four patients (35.8%) had a PCT value >2 ng/ml and were classified as septic. Septic and nonseptic patients were comparable in age (63.8 ± 17.7 vs. 62.9 ± 7.4 years, P = NS) and male sex (60.3% vs. 64.1%, P = NS). Thirty-five patients (36.8% of the total and 44.1% of septic patients) had positive BC, and 32 patients (33.7% of the total and 52.9% of septic patients) had positive PCR tests. Septic patients showed significantly more positive PCR results than nonseptic patients (P < 0.01); no such difference was found for BC. PCR results agreed with BC in 84.4% of patients. In 11.4% of patients BC was positive and PCR negative (agent not in the PCR recognition spectrum, n = 2 (Micrococcus luteus); agent in the PCR spectrum but not detected, n = 6 (Streptococcus spp. and coagulase-negative Staphylococcus spp.); different pathogens identified by PCR and BC, n = 3). In seven patients (7.3%) with negative BC, additional, mostly Gram-negative, bacteria were found by PCR. PCR-positive patients had significantly higher values than PCR-negative patients for procalcitonin (17.3 ± 26.4 vs. 13.8 ± 57.2 ng/ml, P < 0.01), IL-6 (4,220 ± 14,385 vs. 626 ± 1,666 pg/ml, P < 0.01), APACHE II score (19.1 ± 9.1 vs. 14.6 ± 8.8, P < 0.05) and Sequential Organ Failure Assessment score (4.5 ± 3.4 vs. 2.9 ± 2.6, P < 0.05). For BC-positive patients a difference was found only for the Sequential Organ Failure Assessment score (4.0 ± 3.0 vs. 2.9 ± 2.6, P < 0.05).

Conclusions In septic patients in the emergency department, PCR identifies pathogens in about 50% of cases. A positive PCR correlates with sepsis severity as well as with the inflammatory response of the host.

Rapid detection and identification of pathogens in critically ill and immunocompromised hosts using molecular techniques

M Prucha1, S Pekova1, P Stastny2, R Zazula2, J Vydra3, M Kouba4, T Kozak3

1Na Homolce Hospital, Prague, Czech Republic; 2Thomayer Hospital, Prague, Czech Republic; 3Faculty Hospital Kralovske Vinohrady, Prague, Czech Republic; 4Institute of Haematology and Blood Transfusion, Prague, Czech Republic

Critical Care 2009, 13(Suppl 1):P379 (doi: 10.1186/cc7543)

Introduction Infectious complications in immunocompromised and critically ill patients represent a serious clinical problem. In these individuals, not only common nosocomial agents but also very unusual pathogens, pathogens with specific cultivation demands and fastidious pathogens can be identified. Timely information on the causative agents is highly important for rapid and accurate clinical decision-making and targeted pharmacotherapy.

Methods Patients fulfilling the criteria for sepsis, severe sepsis and septic shock, as well as patients with hematological malignancies, patients undergoing chemotherapy, and transplant and post-transplant patients, were included in our cohort. We developed a system employing pathogen-specific probes and real-time PCR technology to detect the 25 most frequent human pathogens causing severe nosocomial infections and five fastidious pathogens causing pulmonary infections. Each sample was also tested using a pan-bacterial and pan-fungal broad-range PCR system coupled with direct sequencing of PCR products for precise identification of the causative agents.

Results We investigated 548 clinical samples (peripheral blood, 236 samples; bronchoalveolar lavage (BAL), 186; cerebrospinal fluid, 32; sputum, 29; aspirate from the thoracic cavity, 19; drainage fluids, 12; abscesses, 11; urine, 10; tissue, 13). In addition to common pathogens we identified a set of unusual and fastidious pathogens: Chlamydophila pneumoniae (BAL), Mycoplasma pneumoniae (BAL), Peptostreptococcus micros (thoracic cavity), Fusobacterium nucleatum, Listeria monocytogenes and Porphyromonas endodontalis (cerebrospinal fluid), Candidatus Neoehrlichia mikurensis (peripheral blood), Mycobacterium tuberculosis (tissue, cerebrospinal fluid), Aspergillus flavus (tissue, sputum), Malassezia pachydermatis (tissue), and Cryptococcus carnescens (BAL).

Conclusions A pathogen-specific real-time PCR technique coupled with direct pan-bacterial and pan-fungal sequencing represents a very fast and useful tool to accelerate and refine the diagnostics of infections in critically ill patients.

Microalbuminuria: a biomarker of sepsis

S Basu1, M Bhattacharya1, A Majumdar1, T Chatterjee2, S Todi1

1AMRI Hospitals, Kolkata, India; 2Jadavpur University, Kolkata, India

Critical Care 2009, 13(Suppl 1):P380 (doi: 10.1186/cc7544)

Introduction To assess microalbuminuria as a diagnostic tool for predicting sepsis in critically ill patients.

Methods A prospective observational study in a 20-bed ICU of a tertiary-care hospital. Microalbuminuria, estimated as the spot urine albumin-creatinine ratio (ACR, mg/g), was measured on ICU admission (ACR1) and after 24 hours (ACR2). A total of 242 patients were recruited between January 2007 and May 2008. Patients with an ICU stay of less than 24 hours, pregnancy, menstruation, anuria, hematuria, urinary tract infection, or proteinuria due to renal and postrenal structural diseases were excluded.

Results Patients with sepsis (n = 95) had a significantly higher median ACR1 (145.8 (IQR 46 to 305)) and ACR2 (104.3 (IQR 33 to 179)) than those without sepsis (n = 147) (ACR1 = 56.6 (IQR 27 to 111) and ACR2 = 37.8 (IQR 18 to 93)) (P < 0.0001) (Figure 1). In a receiver operating characteristic curve analysis, ACR1 emerged as the most reliable indicator of sepsis (area under the curve (AUC) of ACR1 = 0.710 > AUC of ACR2 = 0.694). An ACR1 concentration of 145.7 mg/g had a sensitivity of 50.5% and specificity of 87.1%, with a positive predictive value of 71.6% and a negative predictive value of 73.1%, for the diagnosis of sepsis.

Figure 1 (abstract P380). Comparison of ACR1 between sepsis and nonsepsis patients.
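The reported test characteristics for the 145.7 mg/g ACR1 cutoff can be checked from an approximate 2x2 table back-calculated from the stated group sizes and proportions (95 sepsis, 147 nonsepsis patients). The counts below are estimates for illustration, not the study's raw data.

    # Back-of-envelope check of sensitivity, specificity, PPV and NPV.
    tp, fn = 48, 47    # sepsis patients above / below the ACR1 cutoff (approximate)
    fp, tn = 19, 128   # nonsepsis patients above / below the cutoff (approximate)

    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)

    print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}, "
          f"PPV {ppv:.1%}, NPV {npv:.1%}")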

Conclusions Absence of significant microalbuminuria at the time of ICU admission is unlikely to be associated with sepsis.

Newly developed endotoxin measurement method (endotoxin activity assay) may reflect the severity of sepsis

H Murayama, Y Kakihana, T Oryoji, N Kiyonaga, S Tashiro, T Imabayashi, T Yasuda, Y Kanmura

Kagoshima University Hospital, Kagoshima, Japan

Critical Care 2009, 13(Suppl 1):P381 (doi: 10.1186/cc7545)

Introduction Endotoxin (ET) is a structural molecule of the outer membrane of Gram-negative bacilli; it activates target cells, including macrophages and neutrophils, and causes septic shock. The conventional ET measurement method, however, has many problems, for example a discrepancy between the plasma ET concentration and the clinical manifestations in septic patients. We therefore evaluated the usefulness of a newly developed method for measuring plasma ET activity, the endotoxin activity assay (EAA) [1], in septic patients, compared with the conventional limulus amebocyte lysate (LAL) assay.

Methods With institutional approval and informed consent, we measured the EAA in 40 patients (aged 63.5 ± 17.7 years) admitted to the ICU. The EAA was measured using a chemiluminometer (Autolumat LB953; EG&G Berthold). Patients were divided into five categories: (1) control, (2) systemic inflammatory response syndrome (SIRS), (3) sepsis (SIRS and infection), (4) severe sepsis, and (5) septic shock. The EAA level in each group was then compared with that in the control group. Statistical evaluation was by unpaired t test, with P < 0.05 considered significant.

Results The EAA level increased significantly as sepsis severity rose. The measured EAA levels were 0.18 ± 0.09, 0.33 ± 0.19, 0.39 ± 0.16, 0.65 ± 0.25 and 0.78 ± 0.34 in the control, SIRS, sepsis, severe sepsis and septic shock groups, respectively. With the LAL method, only four patients in the severe sepsis group and two in the septic shock group exceeded the cutoff value. With the EAA, the more severe patients tended to exceed the cutoff value. There was no significant correlation between the EAA level and the plasma ET concentration.
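A sketch of the unpaired t test used above, computed from the reported group means and SDs for the control and septic shock groups. Group sizes are not given in the abstract, so the n values below are placeholders chosen only to make the example run.

    # Unpaired t test from summary statistics (group sizes are assumptions).
    from scipy.stats import ttest_ind_from_stats

    control = dict(mean1=0.18, std1=0.09, nobs1=8)        # nobs1 is a placeholder
    septic_shock = dict(mean2=0.78, std2=0.34, nobs2=8)   # nobs2 is a placeholder

    t, p = ttest_ind_from_stats(**control, **septic_shock)
    print(f"t = {t:.2f}, P = {p:.4f}")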

Conclusions The EAA measures ET using an anti-lipopolysaccharide monoclonal antibody. Our study suggests that this newly developed method reflects the severity of sepsis better than the conventional ET assay.

Reference

1. Romaschin AD, Harris DM, Ribeiro MB, et al.: A rapid assay of endotoxin in whole blood using autologous neutrophil-dependent chemiluminescence. J Immunol Methods 1998, 212:169-185.


Activated protein C-protein C inhibitor complex as a prognostic marker in sepsis

L Heslet1, R Hald2, C Recke2, K Bangert2, LO Uttenthal2

1Rigshospitalet, Copenhagen, Denmark; 2BioPorto Diagnostics A/S, Gentofte, Denmark

Critical Care 2009, 13(Suppl 1):P382 (doi: 10.1186/cc7546)

Introduction The PROWESS study and later trials of activated protein C (APC) treatment in sepsis have shown only modest reductions in mortality, and a recent Cochrane systematic review (CD004388) records doubtful efficacy and serious adverse effects. To optimize the benefit/risk ratio of APC treatment for each patient, a biomarker of protein C (PC) activation is urgently needed; the use of such a marker, the APC-protein C inhibitor (APC-PCI) complex, was investigated in the present study.

Methods APC-PCI was measured in acid citrate plasma by means of a newly developed sandwich ELISA (median normal value 0.13 ng/ml, range 0.07 to 0.26, n = 16). Levels of APC-PCI and PC were monitored (daily to alternate days) in 135 consecutive critically ill patients, 53 of whom had sepsis during the observation period. The state of PC activation was categorized as nonactivated (APC-PCI <0.25 ng/ml), moderately activated (APC-PCI 0.25 to 0.72 ng/ml) or highly activated (APC-PCI >0.72 ng/ml), based on maximum APC-PCI values in relation to the normal range.

Results The maximum APC-PCI values ranged from 0.03 to 29 ng/ml, median 0.44 ng/ml. The overall mortality of the 53 sepsis patients was 32% (17/53). The mortality in the PC activation groups was significantly different (P = 0.032, chi-square test): nonactivated 44% (7/16), moderately activated 13% (3/23) and highly activated 50% (7/14). A bell-shaped mortality relationship was noted, with high mortalities in both the nonactivated and highly activated groups, and a much lower mortality in the moderately activated group. Subdividing the PC activation groups according to APACHE II score yielded the highest mortality, 71% (5/7), in the nonactivated subgroup with APACHE II >25, whereas the APACHE II score showed no relationship with mortality in the other PC activation groups. Minimum PC levels did not correlate with APC-PCI and showed no significant differences between the activation groups.
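The chi-square comparison of mortality across the three PC activation groups can be reproduced directly from the counts given above (deaths and survivors per group); the printed P value should be approximately the reported 0.032.

    # Chi-square test on the 3x2 table of deaths vs. survivors by PC activation group.
    from scipy.stats import chi2_contingency

    table = [[7, 9],    # nonactivated:         7 of 16 died
             [3, 20],   # moderately activated: 3 of 23 died
             [7, 7]]    # highly activated:     7 of 14 died

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, P = {p:.3f}")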

Conclusions Nonactivation of PC in sepsis may represent the failure of an appropriate protective response and is therefore associated with increased mortality, especially when the APACHE II score is elevated. Septic patients without PC activation and a high APACHE II score may be those who are most likely to benefit from APC treatment. PC measurements were not predictive of PC activation as indicated by APC-PCI levels.

Copeptin is a strong and independent predictor of outcome in cardiogenic shock

B Meyer1, P Wexberg1, J Struck2, A Bergmann2, N Morgenthaler2, G Heinz1, R Pacher1, M Hulsmann1

Medical University of Vienna, Austria; 2BRAHMS AG, Hennigsdorf, Germany

Critical Care 2009, 13(Suppl 1):P383 (doi: 10.1186/cc7547)

Introduction As a stress hormone, arginine vasopressin (AVP) is significantly increased during acute hemodynamic instability. AVP is released in response to osmotic and haemodynamic changes, acting to maintain fluid volume and vascular tone. Copeptin is a stable fragment of pre-pro-vasopressin that is synthesised and released in equimolar amounts with AVP. Unlike AVP, copeptin is highly stable ex vivo. We aimed to test the prognostic value of an elevated copeptin level in patients with cardiogenic shock.

Methods In this prospective observational study we included consecutive patients with cardiogenic shock admitted to the ICU of the Department of Cardiology/Medical University of Vienna between November 2004 and March 2006. In all patients, blood samples for determination of routine laboratory tests and N-terminal pro-brain natriuretic peptide (NT-pro-BNP) and copeptin plasma levels were obtained on admission. Copeptin was assessed using an immunoassay in the chemiluminescence/coated-tube format. Copeptin, NT-pro-BNP, age, gender, presence of acute renal failure and mechanical ventilation were analysed for prediction of ICU survival.

Results We included 91 consecutive patients (66 male (72%), age 66.5 ± 11.4 years) with a diagnosis of cardiogenic shock on ICU admission. All patients required intravenous inotropic support, 19 patients (21%) were treated with intraaortic balloon counterpulsation and nine patients (9%) were on extracorporeal circulatory support (eight patients (8%) had extracorporeal membrane oxygenation, one patient (1%) was on Novacor support). Fifty-six patients (62%) survived and 35 patients (38%) died. Copeptin plasma levels were significantly higher in ICU nonsurvivors than in ICU survivors (248.2 ± 256.6 vs. 164.4 ± 117.8 pg/ml, P = 0.034). In a logistic regression model, copeptin was the best predictor of ICU survival, with only NT-pro-BNP providing independent additional information (copeptin OR = 1.002, P = 0.001; NT-pro-BNP OR = 1.001, P = 0.05).
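A sketch of the type of logistic regression described above, with odds ratios obtained by exponentiating the coefficients. The data are synthetic, the variable scaling and use of statsmodels are assumptions, and the output will not reproduce the published odds ratios.

    # Logistic regression of ICU death on copeptin and NT-pro-BNP (synthetic data).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 91
    copeptin = rng.gamma(shape=2.0, scale=100.0, size=n)    # synthetic, pg/ml
    ntprobnp = rng.gamma(shape=2.0, scale=5000.0, size=n)   # synthetic, pg/ml
    # Generate outcomes from an assumed underlying model so the fit has signal
    logit = -2.0 + 0.004 * copeptin + 0.00005 * ntprobnp
    died = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X = sm.add_constant(np.column_stack([copeptin, ntprobnp]))
    res = sm.Logit(died, X).fit(disp=False)
    odds_ratios = np.exp(res.params[1:])   # skip the intercept
    print("OR (copeptin, NT-pro-BNP):", np.round(odds_ratios, 4))
    print("P values:", np.round(res.pvalues[1:], 4))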

Conclusions Elevated plasma levels of copeptin are a strong and independent predictor of adverse outcome in patients with cardiogenic shock.

Daily assay of procalcitonin, C-reactive protein and IL-6: roles in diagnosis and management of severe sepsis

M Umbrello, S Marzorati, E Mantovani, F Colombo Pavini, F Rapido, G Mistraletti, M Langer, G Iapichino

Istituto di Anestesia e Rianimazione, Milan, Italy

Critical Care 2009, 13(Suppl 1):P384 (doi: 10.1186/cc7548)

Introduction Early diagnosis of sepsis is crucial for the management and outcome of critically ill patients. Clinical parameters, white cell count and body temperature have proved far from ideal in identifying patients who need antimicrobial therapy, and the lack of a sensitive and specific marker of infection may be responsible for delaying or prolonging antibiotic use [1]. The aims of this work were to test the ability of serial monitoring of IL-6, procalcitonin (PCT) and C-reactive protein (CRP) to stratify the different levels of sepsis, and to assess whether their measurement could add to the therapeutic decision-making process during a long-term ICU stay.

Methods In a prospective observational study we examined the time course of inflammatory markers in consecutive cases of long-term critical illness in the general ICU of a university hospital. Daily sera were subsequently analyzed for CRP, PCT and IL-6 (the latter only in the last 16 patients) in all patients with a length of stay >6 days.

Results We enrolled 26 patients, for a total of 592 days. In the seven patients who never experienced severe sepsis/septic shock, CRP, PCT and IL-6 levels decreased over time; the 14 who recovered had a reduction in markers, while no variation was found in the five patients who died in sepsis. The 198 days classified as severe sepsis/septic shock had CRP (72.1 (IQR 43.4 to 127.8) vs. 90.8 (46.8 to 213.9)), PCT (0.19 (0.09 to 0.49) vs. 1.9 (0.49 to 4.92)) and IL-6 (83.90 (57.25 to 133.85) vs. 199.18 (105.51 to 289