Hyperoxia inhibits alveolar epithelial repair by inhibiting the transdifferentiation of alveolar epithelial type II cells into type I cells

Critical Care Volume 12 Suppl 2, 2008

28th International Symposium on Intensive Care and Emergency Medicine

Brussels, Belgium, 18-21 March 2008

Published online: 13 March 2008

These abstracts are available online. © 2008 BioMed Central Ltd

Analytical survey of human rabies and animal bite prevalence during one decade in the province of Kerman, Iran

M Rezaeinasab1, M Rad2

1Rafsanjan University of Medical Sciences, Rafsanjan, Iran; 2Faculty

of Veterinary Medicine, University of Tehran, Iran

Critical Care 2008, 12(Suppl 2):P1 (doi: 10.1186/cc6222)

Introduction To determine the frequency of domestic and wild animal bites and the prevalence of rabies in the human population of the Province of Kerman, a retrospective study was designed to analyze the recorded data statistically. Methods This study was conducted within the framework of MPVM student research projects, as a collaboration between the University of Tehran, the Veterinary Organization of Kerman, the Medical Science Universities of Kerman and Rafsanjan, and the Networks of Health Centers of the 10 cities of Kerman Province.

Data were collected over one decade, from 21 March 1994 to 21 March 2003, in all 10 cities and the rural areas of the province of Kerman: the number of persons bitten by animals; the distribution of variables such as geographical location, age group, occupation, pre-exposure rabies prophylaxis, and the anatomical site of the bite injuries; and mortality from rabies. All data were analyzed with SPSS software (version 11.5).

Results On the basis of the recorded data, there were 10 human rabies deaths (eight males and two females) in the province of Kerman during the decade. One-half (50%) had been bitten by dogs and the other half (50%) by foxes. Among the reported deaths, 40% were from Kahnooj county (Jiroft region). In total, 21,546 persons were bitten by animals during the 10 years in the province of Kerman. The mean age of people bitten by dogs was 24.80 years (SD = ±14.6), while the mean age of people bitten by foxes was 57.25 years (SD = ±1.50); the difference between the two groups was significant (P < 0.05). Injuries were most frequent in the 10-19-year age group, and males (76.00%) were bitten more often than females (24.00%), a statistically significant difference (P < 0.01). About 60% of all persons bitten by animals were from rural areas and 40% from urban areas (P < 0.05). Of the people bitten and injured by animals during the decade, 85.70% did not receive the rabies prophylaxis treatment regimen. Fifty percent of those bitten were injured on the hands and feet, 40% on the head and face, and 10% on the trunk, cervical region and other parts of the body. In persons bitten on the head, the mean latency period for rabies was 33 days (SD = ±12.2 days), while in persons bitten on the hands and feet it was 77 days (SD = ±45.8 days); the P value was <0.1. The results also showed a significant reciprocal correlation between annual rainfall and the frequency of animal bites in the province of Kerman (r = 0.5, P < 0.01).

Conclusions According to this study, foxes appear to play a very important role in the epidemiology of human rabies in the province of Kerman, in the southeast of Iran. Since most individuals bitten by animals during the one-decade survey did not seem aware of the risk of rabies infection through animal bites, public education on preventive measures against rabies by the public health authorities is imperative. Vaccination of animals against rabies, especially dogs and cats, is also recommended, as is mass vaccination of wild animals through distribution of oral vaccines in the vast and scattered forests by helicopters of the Veterinary Organization. Intersectoral public health collaboration between the medical science universities of the province of Kerman and all related authorities is necessary to control rabies in the southeast and southwest regions and in the neighboring provinces of Fars, Hormozgan, Sistan-Baluchestan and Yazd.

What do people really know about MRSA? A survey of knowledge and attitudes in the general public and hospital visitors

A Mclaughlin, J Canavan, E McAdam, R Mcdonagh, H Brar, J Hardt, K Sinead, G Fitzpatrick, M Donnelly

Adelaide Meath Hospital, Dublin, Ireland

Critical Care 2008, 12(Suppl 2):P2 (doi: 10.1186/cc6223)

Introduction We set out to assess current understanding of MRSA among the lay public prior to writing an information booklet for relatives of patients in the ICU.

Methods Trained researchers approached potential participants at the hospital entrance and in public places to complete the questionnaire.

Results Of 545 participants who completed the questionnaire, 24 had never heard of MRSA, leaving 521 (176 visitors, 345 general public); 4.9% (n = 26) had previously contracted MRSA. The median age was 37 (21-49) years. The cohort had first heard of MRSA 24 (±18) months previously. The most common sources of information were television and newspapers. Participants who had had MRSA were more likely to think that the shortage of beds contributed to MRSA transmission (84% vs 69%). Of the public, 46.3% versus 16% of the MRSA group did not expect to acquire MRSA after routine surgery (P = 0.0095). Most participants (65.3% of the public, 70% of visitors and 52% of the MRSA group) thought MRSA was serious. Ninety-two percent of the MRSA group worried about transmission to family members. Only 3.6% of the cohort would not know where to find more information.

Conclusions MRSA is considered serious, information is obtained mainly through the media, and most participants know where to obtain further information.

Intensive care infections: risk factors and mortality

S Silvestri1, L Toma1, F Forfori2, C Mosca2, F Giunta2

1Scuola di specializzazione in Anestesia e Rianimazione, Universita degli Studi di Pisa, Pisa, Italy; 2Department of Surgery, AOUP, Pisa, Italy

Critical Care 2008, 12(Suppl 2):P3 (doi: 10.1186/cc6224)

Introduction The aim of this study was to elucidate the impact of ICU-acquired infection on ICU and hospital mortality. The main determinants of hospital infection onset were investigated, and the most frequently used antibiotics in the ICU were assessed as risk factors for the selection of particular bacterial species responsible for ICU pneumonia.

Methods Patients staying longer than 48 hours in a teaching hospital ICU between January 2005 and December 2006 were retrospectively enrolled. Risk factors for ICU and hospital mortality were analyzed with a logistic regression model adjusted for age, SAPS II score, and medical or surgical status. Univariate analysis was used to assess the relation between previous exposure to antibiotic therapy and development of ICU pneumonia. Results Of 343 patients enrolled, 39 had a diagnosis of ICU infection: 18 had an infection on admission and developed a second infection during the ICU stay, and 21 had a primary infection after ICU admission. Among patients with ICU-acquired infection, ICU mortality and hospital mortality were more than doubled (OR = 2.51 (95% CI = 1.05-5.98) and OR = 2.32 (95% CI = 1.10-4.86), respectively). Having more than one infection was associated with a more than tripled ICU mortality risk (OR = 3.36 (95% CI = 1.06-10.61)). Admission severity and an infection before ICU admission emerged as important risk factors for ICU-acquired infections (OR = 5.71 (95% CI = 1.19-27.29) and OR = 3.14 (95% CI = 1.42-6.97), respectively). Previous fluoroquinolone use demonstrated a clear role in favouring Pseudomonas aeruginosa pneumonia, and previous linezolid use in Acinetobacter baumannii pneumonia (Table 1). Conclusions ICU-acquired infections are an independent risk factor for ICU and hospital mortality. Some antibiotic categories may emerge as inducers of pneumonia, but further studies are needed to confirm this hypothesis. Reference

1. Aloush V, Navon-Venezia S: Antimicrob Agents Chemother 2006, 50:43-48.

Table 1 (abstract P3)

                  Pseudomonas aeruginosa   Acinetobacter baumannii   Stenotrophomonas maltophilia
Fluoroquinolones  RR = 2.80 (1.03-7.62)    RR = 0.35 (0.04-2.83)     RR = 0.47 (0.05-4.06)
Linezolid         RR = 0.38 (0.06-2.45)    RR = 6.21 (1.27-30.40)    RR = 1.38 (0.17-11.36)

RR, relative risk (95% confidence interval).
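The relative risks in Table 1 (and the odds ratios quoted in the abstract) are ratio measures whose confidence intervals are computed on the log scale. A minimal stdlib sketch of how such estimates are derived from a 2 x 2 exposure/outcome table (the counts below are illustrative, not the study's data):

```python
import math

def ratio_ci(estimate, se_log, z=1.96):
    """95% CI for a ratio measure, computed on the log scale."""
    return (math.exp(math.log(estimate) - z * se_log),
            math.exp(math.log(estimate) + z * se_log))

def relative_risk(a, b, c, d):
    """RR and 95% CI from a 2x2 table.

    a/b: exposed with/without outcome; c/d: unexposed with/without.
    """
    rr = (a / (a + b)) / (c / (c + d))
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))  # Katz log method
    return rr, ratio_ci(rr, se)

def odds_ratio(a, b, c, d):
    """OR and 95% CI from a 2x2 table (Woolf log method)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    return or_, ratio_ci(or_, se)

# Hypothetical counts: 8/20 exposed vs 4/20 unexposed developed pneumonia
rr, (rr_lo, rr_hi) = relative_risk(8, 12, 4, 16)  # RR = 2.0
```

A very wide interval such as 1.27-30.40 for linezolid and A. baumannii simply reflects small cell counts, since the standard error is driven by the reciprocals of the counts.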

Gram-positive nosocomial infections in a general ICU: searching for a clue

G Georgiev, S Milanov, V Todorova, M Milanov

Pirogov Emergency Institute, Sofia, Bulgaria

Critical Care 2008, 12(Suppl 2):P4 (doi: 10.1186/cc6225)

Introduction The pattern of nosocomial pathogens has changed gradually since the mid 1980s, and Gram(+) aerobes are the leading cause of infection in many ICUs today. Despite this trend there are still no firm recommendations for empiric Gram(+) antimicrobial coverage in patients with severe nosocomial infections. Methods A historical cohort study was conducted, including all cases of documented nosocomial infection in our general ICU over a 1-year period (November 2006-November 2007). Data on demographic characteristics, primary diagnosis, comorbidity, number of indwelling devices, previous microbial isolates and current antibiotics were cross-tabulated according to the presence and type of Gram(+) pathogens isolated. For the most likely risk factors identified, separate contingency tables were constructed and analyzed. Results Sixty-six patients (39.05% of the 169 with documented nosocomial infections) had Gram(+) isolates. Methicillin-resistant Staphylococcus epidermidis (MRSE) (34.85%) and Enterococci (25.76%) were most commonly isolated, followed by methicillin-resistant Staphylococcus aureus (MRSA), methicillin-susceptible S. epidermidis (MSSE), Streptococci, and methicillin-susceptible S. aureus (MSSA). In eight (12.12%) of these 66 patients the same pathogen was isolated more than once, and in 14 patients (21.21%) more than one Gram(+) pathogen was present during the ICU stay. There were no significant differences between the groups in demographic characteristics. The following independent risk factors for Gram(+) nosocomial infection were identified: for MRSE, gunshot wound, chronic obstructive pulmonary disease comorbidity, previous isolation of both Acinetobacter spp. and Pseudomonas spp., and previous/current treatment with a carbapenem; for Enterococcus spp., biliary peritonitis and previous/current treatment with the combination cefoperazone-sulbactam; for MRSA, clinical uroinfection; for MSSE, previous/current treatment with a first/second-generation cephalosporin-metronidazole combination; for MSSA, neurologic injury. Surprisingly, the number of indwelling devices was not linked with an increased risk of coagulase-negative staphylococcal infections, nor was a long latent period found for their clinical manifestation.

Conclusions Exploratory hypotheses have been generated for confirmation in further, larger samples. Whether some of these are pertinent to a particular ICU or could be generalized remains to be elucidated. Identification of risk factors associated with Gram(+) nosocomial infections would aid the initial antibiotic choice in patients at risk.

Descriptive analysis of ICU patients with hospital-acquired, ventilator-associated, and healthcare-associated pneumonia at four academic medical centers

DH Kett1, JA Ramirez2, P Peyrani2, JE Mangino3, MJ Zervos4, E Cano1, KD Ford5, EG Scerpella5, IMPACT-HAP study group1

1University of Miami/Jackson Memorial Hospital, Miami, FL, USA; 2University of Louisville, KY, USA; 3The Ohio State University Medical Center, Columbus, OH, USA; 4Henry Ford Health System, Detroit, MI, USA; 5Pfizer, New York, USA

Critical Care 2008, 12(Suppl 2):P5 (doi: 10.1186/cc6226)

Introduction We developed an ICU performance improvement project to evaluate patients with ventilator-associated pneumonia (VAP), hospital-acquired pneumonia (HAP), and healthcare-associated pneumonia (HCAP) using the 2005 American Thoracic Society/Infectious Diseases Society of America guidelines. Below is a descriptive analysis of the patients enrolled and their outcomes. Methods Data were collected prospectively. Patients were classified as VAP, HAP or HCAP. Antibiotics were chosen based on local antibiograms.

Results The first 158 patients are reported (VAP n = 120, HAP n = 26 and HCAP n = 12). Patients often had comorbidities: diabetes (22%), cardiac (22%), respiratory (21%) and renal (16%). Microorganisms were identified in 78% of patients. One hundred and twenty-five patients received empiric therapy (ET); ET was compliant with the guidelines in 31% of these patients. De-escalation of antibiotic therapy occurred on day 3 in 75% (77/103) of candidates. Clinical improvement and/or cure was seen in 70% of patients. Superinfections developed in 37% of patients. In patients requiring mechanical ventilatory support, the average time on the ventilator was 12 ± 17 days. Patients' average stays (days) in the ICU* and hospital* differed by group: VAP (17 ± 14 days, 23 ± 19 days), HAP (9 ± 10 days, 13 ± 13 days) and HCAP (11 ± 19 days, 22 ± 36 days), respectively. *Comparisons with P < 0.05. See Table 1.

Table 1 (abstract P5)

                                     VAP       HAP       HCAP
Age                                  57 ± 19   51 ± 18   64 ± 17
APACHE II score*                     21 ± 6    18 ± 6    17 ± 8
Clinical Pulmonary Infection Score*  6.8 ± 2   5.7 ± 2   5.2 ± 2
Day 14 mortality*                    19.7%     15.4%     8.3%

*P < 0.05.

Conclusions VAP, as compared with HAP and HCAP, had the highest severity of illness, mortality, and consumption of ICU and hospital resources. Published guidelines are not easily translated into daily practice. Reference

1. Kett DH, Ramirez JA, Peyrani P, et al.: Am J Respir Crit Care Med 2005, 171:388-416.

European multicenter survey on antibiotic prophylaxis in liver transplant patients

E Vandecasteele1, J De Waele1, S Blot1, D Vogelaers1, X Rogiers1, D Vandijck1, M Bourgeois2, J Decruyenaere1, E Hoste1

1University Hospital, Ghent, Belgium; 2AZ St-Jan, Bruges, Belgium

Critical Care 2008, 12(Suppl 2):P6 (doi: 10.1186/cc6227)

Introduction Infection remains a major problem for patients undergoing liver transplantation (LT). However, no data regarding perioperative antibiotic prophylaxis are available. The aim of the study was to gain insight into prophylactic antibiotic strategies used in European liver transplant centers.

Methods An electronic and postal survey was sent to all LT centers that are members of the European Liver and Intestine Transplantation Association. The questionnaire asked for the prophylactic antibiotic regimen used for recipients undergoing elective LT, for recipients with acute-on-chronic liver disease, and for recipients with acute liver failure. Results A total of 59 centers (46% response rate) from 16 different countries completed the questionnaire. Of the participating centers, 8.6% reported performing <25, 37.9% 25-50, 27.6% 50-75, 10.4% 75-100, and 15.5% >100 LTs annually. Antibiotic prophylaxis for elective LT recipients consisted of a single antibiotic in 48.3% of centers and of combination therapy in 50%; in 1.7%, the prophylactic regimen rotated from monotherapy to combination therapy on a 6-month basis. The mean duration of prophylaxis was 3.1 ± 2.0 days. Prophylaxis was restricted to 1 day in 19% of the centers, to the first 2-3 days in 55.2%, and lasted more than 3 days in 24.1% (one missing answer). Monotherapy consisted of a first-line antibiotic agent (first-generation or second-generation cephalosporin, or aminopenicillin) in 42.9% of centers, and of a broad-spectrum antibiotic (third-generation cephalosporin, piperacillin, or carbapenem) in 57.1%. For recipients with acute-on-chronic disease, 73.7% used the same antibiotic regimen as for elective LT, while 26.3% changed it (5.3% increased the duration of prophylaxis, and 21.0% changed the type of antibiotic). For recipients with acute liver failure, 66.7% used the same antibiotic regimen as for elective LT, while 33.3% changed it (10.5% changed the duration of prophylaxis, and 22.8% changed the type of antibiotic).

Conclusions Among European LT centers, considerable variation exists in the antibiotic prophylactic strategies used for liver transplant recipients, both in terms of antibiotic regimen used and in duration of therapy. These findings underscore the need for the development of specific guidelines.

A national survey on current practice of use of selective digestive decontamination in the United Kingdom

R Shah1, J Louw2, T Veenith2

1Frimley Park Hospital, Surrey, UK; 2Queen Elizabeth Hospital, Kings Lynn, UK

Critical Care 2008, 12(Suppl 2):P7 (doi: 10.1186/cc6228)

Introduction The incidence of nosocomial pneumonia in intensive care patients ranges between 7% and 40%, with a crude mortality exceeding 50% [1]. One way to reduce the incidence of ventilator-associated pneumonia in intensive care is selective digestive decontamination (SDD). In our clinical experience, SDD is not used frequently in the UK, despite the supporting evidence. Methods We conducted a telephone survey and collected data on the use of SDD. All ICUs in England were included (256 units) and we obtained a response from 249 units. The average unit size was 5.8 beds. The response was obtained from either an ICU consultant or a charge nurse in intensive care. Before discussing the questionnaire, we assessed the suitability of the person answering. We discussed our questionnaire with 73 consultants and 176 charge nurses.

Results We obtained a response from 249 of the 256 units. Only 6% (15 units) of the 249 units used SDD. In 94% (235) of the units SDD had not been considered for use, in 4% (12) it had been considered but not deemed suitable, and in 0.8% (two) it is currently being considered for implementation. Conclusions The oropharynx is the major source of potential pathogens that cause lower airway infections, and the role of SDD is to eradicate these bacteria from the oropharynx [1]. Our telephone survey found that SDD is not used by most of the ICUs in England. The main deterring factors were the high frequency of MRSA, drug resistance, lack of incorporation in sepsis bundles, relative disinterest from the drug companies, cost, and difficulty in obtaining the preparation.

One drawback of our survey may have been that we spoke with charge nurses and consultants who were not part of decision-making on the use of SDD in their ICUs. But the bottom line is that SDD is not used in the majority of ICUs.


Reference
1. Baxby D, van Saene HKF, Stoutenbeek CP, Zandstra DF: Selective decontamination of the digestive tract: 13 years on, what it is and what it is not. Intensive Care Med 1996, 22:699-706.

Community-acquired and healthcare-related urosepsis: a multicenter prospective study

T Cardoso1, O Ribeiro2, A Costa-Pereira2, A Carneiro1, SACiUCI Study Group1

1Hospital Geral Sto Antonio, Porto, Portugal; 2Faculty of Medicine, University of Oporto, Porto, Portugal

Critical Care 2008, 12(Suppl 2):P8 (doi: 10.1186/cc6229)

Introduction Urinary infection is the third most common focus of infection in sepsis. In this study we describe the epidemiology and microbiology of community-acquired urosepsis, determine the associated crude mortality, and identify independent predictors of mortality. Methods A prospective, multicenter, cohort study of community-acquired urosepsis cases admitted to Portuguese ICUs from 1 December 2004 to 30 November 2005, with follow-up until discharge.

Results Seventeen units from the north to the south of Portugal entered the study, corresponding to 41% of all mixed national ICU beds. Over this period 4,142 patients were admitted to the study; 897 (22%) had community-acquired sepsis, and of these 65 (7%) had urosepsis. Compared with other focuses of infection, urosepsis was more frequent in women (66% vs 33% in nonurosepsis, P < 0.001) and associated with a shorter ICU length of stay (7 days vs 9 days, P = 0.002). No significant differences were observed regarding severity of illness (SAPS II, sepsis severity) or crude mortality. The isolation rate was 68%, with 41% positive blood cultures. All isolates except one were Gram-negative and no fungus was isolated; Escherichia coli dominated the microbiological profile (63% of all isolates).

Healthcare-related infection (HCRI) was found in 31% of these patients: E. coli represented 58% of all isolates, but the resistance profile was different, with resistance to ciprofloxacin and co-trimoxazole increasing from 9% (in community-acquired sepsis) to 25% (in HCRI). The 28-day mortality was higher in the non-HCRI group (29%) than in the HCRI group (15%), although not statistically significantly. Conclusions Although urosepsis is described as the focus of infection with the best prognosis, we could not confirm this for community-acquired urosepsis in the present study. HCRI patients are a particular group with a similar microbiological profile but a different resistance profile, requiring a different empirical approach. Reference

1. Friedman ND, Kaye KS, Stout JE, et al.: Health care-associated bloodstream infections in adults: a reason to change the accepted definition of community-acquired infections. Ann Intern Med 2002, 137:791-797.

Bedside laparoscopy to diagnose intra-abdominal pathology in the ICU

S Matano, M Bonizzoli, A Di Filippo, G Manca, A Peris

Intensive Care and Emergency Service, Florence, Italy

Critical Care 2008, 12(Suppl 2):P9 (doi: 10.1186/cc6230)

Introduction The aim of the study was to evaluate the accuracy of bedside diagnostic laparoscopy (BDL) in critically ill patients (CIP) suspected of intra-abdominal pathology, compared with operative laparotomy or diagnostic imaging (CT scan), and to verify the safety of the procedure. A delay in the diagnosis of intra-abdominal pathology can worsen morbidity and mortality in these patients. In ICU patients treated with prolonged parenteral nutrition, mechanical ventilation and high-dose opioid analgesics, acalculous cholecystitis (AC) is a severe complication [1]. Clinical evaluation of the abdomen is difficult, as deep sedation often masks symptoms and physical examination is inconclusive, so these patients are potentially eligible for exploratory laparoscopy after abdominal CT. Furthermore, performing CT is often impossible because of the difficulty of safely transporting CIP.

Methods From January 2006 to November 2007 a BDL was performed in 24 CIP to confirm the clinical diagnosis of AC. Liver function tests were collected every day, and abdominal ultrasonography was performed when the suspicion of AC was high. Elevated liver function tests and ultrasonography signs such as gallbladder distension or wall thickening (>3-4 mm), with or without pericholecystic fluid, were the most significant findings of suspected AC and were considered admission criteria for the study. Twenty-four patients met the criteria. Ten were trauma victims, three were postcardiac surgical patients, and 11 had sepsis of unknown origin. Fifteen were hypotensive and required haemodynamic support. BDL was performed with the Visiport. The pneumoperitoneum was created with a CO2 pressure of 10-15 mmHg. The mean procedure time was 40 minutes.

Results The procedure was performed a mean of 8 days (range 5-15 days) after ICU admission. In two patients the BDL was positive for gangrenous cholecystitis (both after cardiac surgery), requiring laparoscopic cholecystectomy in the operating room. Purulent peritonitis was found in five patients with sepsis of unknown origin, but microbiological tests on ascites were negative in all cases. The other BDLs were negative for intra-abdominal pathology. Conclusions BDL seems to represent an alternative and effective technique that might be more accurate than a CT scan and less invasive than laparotomy for the diagnostic evaluation of intra-abdominal pathology in ICU patients. Reference

1. Rehm CG: Crit Care Clin 2000, 16:101-112.

A potential role for the chest X-ray in the transmission of resistant bacteria in the ICU

PD Levin, O Shatz, D Moriah, S Sviri, A Or-Barbash, CL Sprung, C Block

Hadassah Hebrew University Hospital, Jerusalem, Israel

Critical Care 2008, 12(Suppl 2):P10 (doi: 10.1186/cc6231)

Introduction We investigated the infection control practices used by X-ray technicians during the performance of routine chest X-rays in the ICU, the transmission of resistant bacteria to the X-ray machine, and the effect of an infection control intervention. Up to 20% of patients acquire infections in the ICU, 44% of which may be transferred on caregivers' hands. Daily routine chest X-rays are performed sequentially, presenting the potential for bacterial spread. The degree to which X-ray technicians apply infection control measures, and the extent to which bacteria are transferred, is unknown.

Methods Compliance with 14 infection control measures was measured covertly during the performance of daily chest X-rays. Bacterial surface cultures were taken from the X-ray machines. An educational intervention (informing the technicians about resistant bacteria, machine culture results and correct alcohol and glove use) was instituted. Observations and machine cultures were then repeated. The appearance of resistant bacteria in patient cultures was followed.

Results Infection control practices were compared before and after the intervention. Alcohol hand-rub use before patient contact increased from 12% to 25% of occasions (P = 0.009), from 0% to 62% prior to touching the X-ray machine (P < 0.001) and from 9% to 39% (P < 0.001) before touching the next patient. Glove use also improved significantly.

Resistant Gram-negative bacteria grew in 12/31 (39%) preintervention X-ray machine cultures and 0/29 (0%, P < 0.001) postintervention cultures. Cultures with no bacterial growth increased from 11/31 (33%) pre intervention to 22/29 (67%, P = 0.002) post intervention.

New occurrences of resistant Gram-negative bacteria in clinical cultures decreased from 19/68 patients (28%) pre intervention to 8/84 (10%, P = 0.003) post intervention.
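The drop in new resistant-isolate occurrences (19/68 vs 8/84) can be checked with a two-proportion z-test; a minimal stdlib sketch, assuming the pooled normal approximation (the authors' exact test is not stated):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test (pooled normal approximation).

    x1/n1, x2/n2: successes/trials in each group.
    Returns (z statistic, two-sided P value).
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided P from the standard normal survival function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Pre intervention 19/68 vs post intervention 8/84 (counts from the abstract)
z, p = two_proportion_z(19, 68, 8, 84)  # P ≈ 0.003, consistent with the reported value
```

The pooled z-test is equivalent to the chi-square test without continuity correction; with small expected counts, Fisher's exact test would be the more conservative choice.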

Conclusions Resistant Gram-negative bacteria are found frequently on the X-ray machine, probably transferred on technicians' hands. This represents a potential route for patient-to-patient bacterial transfer. A simple infection control intervention decreased X-ray machine contamination and was associated with a decrease in the appearance of resistant bacteria in patient cultures, although causality is not proven. References

1. Grundmann H, et al.: Crit Care Med 2005, 33:946-951.

2. Pittet D, et al.: Arch Intern Med 1999, 159:821-826.

Healthcare-related bacteraemia admitted to the ICU

G Castro1, T Cardoso1, R Carneiro1, O Ribeiro2, A Costa-Pereira2, A Carneiro1

1Hospital Geral de Santo Antonio, Porto, Portugal; 2Faculty of Medicine, University of Oporto, Porto, Portugal

Critical Care 2008, 12(Suppl 2):P11 (doi: 10.1186/cc6232)

Introduction Bacteraemia developing in patients outside the hospital is categorized as community acquired. Accumulating evidence suggests that healthcare-related bacteraemias (HCRB) are distinct from those that are community acquired. Methods A prospective, observational study of all patients with community-acquired bacteraemic sepsis (CABS) admitted to a tertiary, mixed, 12-bed ICU at a university hospital between 1 December 2004 and 30 November 2005. HCRB was defined according to the criteria proposed by Friedman and colleagues [1]. Results Throughout the study period, 160 patients were admitted with CABS; 50 (31%) had HCRB. In the CABS group the main focuses of infection were respiratory (41%), intra-abdominal (15%) and endovascular (15%); in the HCRB group, respiratory infection was present in 14 (28%) patients, intra-abdominal in 13 (26%) patients and urological in 10 (20%) patients (P = 0.227). The microbiological profile differed between the two groups: in the non-HCRB group the main agents were Gram-positive, 57 (63%), versus 34 (37%) Gram-negative, whereas in the HCRB group Gram-negative organisms dominated, 26 (65%) versus 34 (37%) (P = 0.003). ICU crude mortality differed between the groups (52% in HCRB versus 34% in CABS, P = 0.028), as did hospital mortality (60% vs 39%, P = 0.013). Conclusions The present study showed a higher crude mortality and a different microbiological profile for HCRB. This underscores the need for early recognition of patients with HCRB, who may need a different therapeutic approach.


Reference
1. Friedman ND, Kaye KS, Stout JE, et al.: Health care-associated bloodstream infections in adults: a reason to change the accepted definition of community-acquired infections. Ann Intern Med 2002, 137:791-797.

Incidence of nosocomial infection in patients with nontraumatic or traumatic coma

L Lorente Ramos, J Castedo, R Galván, C García, J Iribarren, J Jiménez, M Brouard, L Lorenzo, S Palmero, M Martín, M Mora

Hospital Universitario de Canarias, La Laguna, Tenerife, Spain

Critical Care 2008, 12(Suppl 2):P12 (doi: 10.1186/cc6233)

Introduction The aim was to determine the rate of nosocomial infection in patients with nontraumatic or traumatic coma.

Methods A prospective study over 24 months in a medical-surgical ICU. Infections were diagnosed according to CDC criteria and classified by onset as early onset (EO), developing during the first 4 days of ICU stay, or late onset (LO), developing from day 5 after ICU admission. Results We included 118 patients with nontraumatic coma (31 intracerebral hemorrhage, 30 subarachnoid hemorrhage, 15 brain infarction, 12 intoxication, nine CNS infection, six status epilepticus and 15 others), 63 of them males. The mean age was 55.07 (±16.12) years and the mean APACHE II score was 18.50 (±12.02). A total of 47 patients (39.83%) developed 70 nosocomial infections (28 EO and 42 LO), and 32 patients (27.12%) died. The infections were: 33 pneumonias (18 EO and 15 LO), 25 urinary tract infections (eight EO and 17 LO), five primary bacteremias (two EO and three LO), three catheter-related bacteremias (three LO), three ventriculitis (three LO) and one surgical wound infection (one LO). The microorganisms responsible were: nine Pseudomonas, nine coagulase-negative staphylococci, eight Escherichia coli, six MSSA, five MRSA, five Haemophilus, five Candida albicans, four Streptococcus faecalis, four Streptococcus pneumoniae, four Proteus mirabilis and 11 others. Also included were 67 patients with traumatic coma, 57 of them males. The mean age was 38.02 (±17.49) years and the mean APACHE II score was 18.32 (±12.21). A total of 27 patients (40.29%) developed 38 nosocomial infections (18 EO and 20 LO), and 14 patients (20.89%) died. The infections were: 27 pneumonias (15 EO and 12 LO), six urinary tract infections (one EO and five LO), two primary bacteremias (one EO and one LO), one catheter-related bacteremia (one LO), one ventriculitis (one EO) and one surgical wound infection (one LO). The microorganisms responsible were: eight MSSA, one MRSA, seven Pseudomonas aeruginosa, five coagulase-negative staphylococci, five Haemophilus influenzae and 12 others.

Conclusions Forty percent of patients with nontraumatic or traumatic coma developed infections, those of respiratory origin being the most frequent.

Comparative study on infection of the central nervous system in patients with head trauma and spontaneous cerebral hemorrhage

P Vartzeli, A Yiambides, K Daskalakis, M Moukas, K Schulpis, K Mandragos

Red Cross Hospital, Ampelokipoi, Greece

Critical Care 2008, 12(Suppl 2):P13 (doi: 10.1186/cc6234)

Introduction Emergency neurosurgical procedures, long operative duration (>4 hours) and an infected wound are factors that have been linked in previous studies with an increased probability of infection of the central nervous system (CNS) during the postoperative period.

Objective To study the appearance of infection of the CNS in patients who have been operated on after sustaining a head injury or spontaneous cerebral hemorrhage that were hospitalized in the ICU, over a period of 2 years.

Materials Records of 118 patients hospitalized in the ICU during the period 2005-2007. Patients were selected using the following criteria: the reason for admission to the ICU was head injury (70 patients) or cerebral hemorrhage (48 patients); all patients had undergone a neurosurgical procedure; and an infection occurred during the ICU stay.

Methods All patients among the 118 who presented fever or laboratory findings of infection that could not be attributed to any source other than the CNS underwent lumbar puncture.

Results Twenty-seven patients underwent lumbar puncture (22.88%). Findings from the lumbar puncture compatible with an infection of the CNS occurred in six patients (five patients with cerebral injury and one patient with cerebral hemorrhage) out of 118 patients, 5.08% of all patients (7.14% of head injury and 2.08% of cerebral hemorrhages).

Lumbar punctures were performed between the 4th and 19th postoperative days. The mean GCS on hospital admission was 8.88 (range 3-15) for all patients, and 7.86 (range 3-14) for those who developed CNS infection.

Conclusions The administration of antibiotics from the first day of ICU admission probably accounts for the very low rate of CNS infection in patients with head injury or cerebral hemorrhage. No important difference was found between patients operated on for head injury and those with spontaneous cerebral hemorrhage. Further studies are needed on the reduction and control of postoperative infections in these patients.

References

1. Korinek AM: Neurosurgery 1997, 41:1073-1079.

2. Korinek AM, Golmard JL, Elcheick A, et al.: Br J Neurosurg 2005, 19:155-162.

3. Kourbeti IS, Jacobs AV, Koslow M, et al.: Neurosurgery 2007, 60:317-325.

Respiratory community-acquired and healthcare-related sepsis: are they different?

G Castro1, O Ribeiro2, A Costa Pereira2, A Carneiro1, T Cardoso1

1Hospital Geral de Santo Antonio, Cuidados Intensivos, Porto, Portugal; 2Faculdade de Medicina do Porto, Serviço de Bioestatística e Informática, Porto, Portugal

Critical Care 2008, 12(Suppl 2):P14 (doi: 10.1186/cc6235)

Introduction Respiratory infection accounts for more than one-half of all ICU admissions with sepsis. In this study the epidemiology and microbiological profile of community-acquired and healthcare-related (HCR) respiratory sepsis are described.

Methods A prospective, observational study of all patients with community-acquired sepsis (CAS) admitted to our ICU over 1 year. Respiratory CAS was defined by the presence of respiratory infection and at least two SIRS criteria at the time of hospital admission or within the first 48 hours. HCR infection was defined according to the criteria proposed by Friedman and colleagues [1].

Results In the study period, 347 patients were admitted, 149 (43%) with CAS. Respiratory infection was present in 102 patients (68%). Comparing this group with nonrespiratory CAS, 73% versus 51% were male (P = 0.01), with a similar median age of 57 years versus 62 years (P = 0.334), more severe sepsis (40% vs 28%) and less septic shock (46% vs 68%) (P = 0.030). Blood cultures were obtained in 96 (94%) patients; only 8% were positive, versus 39% in nonrespiratory CAS (P < 0.001). Gram-positive microorganisms represented 51% of all isolations, Gram-negative 26%, Mycobacterium tuberculosis 6%, atypical organisms 5%, and fungi only 2%. Polymicrobial infections were documented in 5% of the patients. HCR respiratory infection was present in 17%; in this subgroup, Gram-positive microorganisms represented 50% of all isolations and Gram-negative 37%. ICU length of stay (9 vs 8 days, P = 0.595), as well as ICU (35% vs 32%, P = 0.686) and hospital (36% vs 41%, P = 0.559) mortality, were similar between respiratory and nonrespiratory CAS.

Conclusions Respiratory CAS is a very important problem in the ICU, representing 30% of all admissions. Although the microbiological profile is similar to that described in the literature, in this population tuberculosis still plays a representative role and needs to be considered. No significant differences in the microbiological profile were seen between CAS and HCR infection.

Reference

1. Friedman ND, Kaye KS, Stout JE, et al.: Health care-associated bloodstream infections in adults: a reason to change the accepted definition of community-acquired infections. Ann Intern Med 2002, 137:791-797.

Antibiotic costs in bacteremic and nonbacteremic patients treated with the de-escalation approach

E Evodia1, P Myrianthefs1, P Prezerakos2, G Baltopoulos1

1KAT General Hospital, Athens, Greece; 2Municipality of Athens, Educational Centre, Athens, Greece

Critical Care 2008, 12(Suppl 2):P15 (doi: 10.1186/cc6236)

Introduction Antibiotic therapy contributes significantly to healthcare costs, especially for infections due to multidrug-resistant pathogens. The purpose of this study was to compare the cost of empiric antibiotic therapy with that of the subsequently applied de-escalated therapy.

Methods We prospectively collected data on demographics and antibiotic costs in critically ill ICU patients with infection. We recorded the daily cost of empiric antibiotic therapy at identification or suspicion of infection, as well as the cost after pathogen identification and susceptibility testing.

Results We included 27 critically ill patients (15 males) of mean age 49.9 ± 4.3 years, with illness severity APACHE II score 15.0 ± 1.7, SAPS II 32.4 ± 3.7 and SOFA score 6.0 ± 0.5. In confirmed bacteremias, daily costs of initial empiric antibiotic therapy differed significantly from those of therapy guided by susceptibility results. This was the case for Gram-positive (€61.0 ± 12.7 vs €130.4 ± 56.3, P = 0.009), Gram-negative (€181.0 ± 47.8 vs €142.7 ± 42.9, P = 0.0063) and mixed (€166.0 ± 21.1 vs €96.0 ± 34.0, P = 0.0016) bacteremias. In patients with other sites of infection the antibiotic costs did not differ (P = 0.112) between susceptibility-guided and empiric therapy (€239.0 ± 49.7 vs €242.0 ± 88.7).

In patients with negative cultures the daily antibiotic cost was €110.7 ± 31.9. Therapy in these patients was discontinued earlier, and they had a significantly shorter ICU stay (8.7 ± 0.9 days vs 24.6 ± 4.1 days, P < 0.001).

Conclusions According to our bacteriologic susceptibility results, de-escalation therapy is applicable only in bacteremias, where it may lead to decreased antibiotic costs. Such an approach is not applicable to infections at other sites, possibly because of multidrug-resistant pathogens.

When appropriate antibiotic therapy is relevant in bacteremic septic patients

H Bagnulo, M Godino

Maciel Hospital, Montevideo, Uruguay

Critical Care 2008, 12(Suppl 2):P16 (doi: 10.1186/cc6237)

Introduction In the past 10 years different authors have reported higher mortality in severe infection related to inappropriate antibiotic therapy (IAT). A systematic review [1] recommends defining groups of patients that could benefit most from appropriate antibiotic therapy (AAT).

Methods Two hundred and twenty bacteremic septic patients admitted over 4 years to a medical-surgical ICU were analysed by place of acquisition (community acquired vs nosocomial), focus of origin, SAPS II and presence of shock, in relation to mortality and to the appropriateness of empiric antibiotic therapy. Mortality was assessed during the ICU stay.

Results Of the 220 septic patients, 106 (48%) died. AAT was given to 157 patients (71.4%), of whom 71 (45%) died; IAT to 63 patients (28.6%), of whom 35 (55.5%) died (P = 0.2). Community-acquired bacteremia occurred in 153 patients, with mortality in 73 (47%); nosocomial bacteremia in 67 patients, with mortality in 33 (49%) (P = 0.9). Among the 99 community-acquired bacteremia patients with SAPS II <50: IAT 23 patients, 12 deaths; AAT 76 patients, 20 deaths (P = 0.03, RR = 1.9). For the 54 patients with SAPS II >50 in this group, IAT was not related to mortality. See Table 1.

Table 1 (abstract P16)

Antibiotic therapy and mortality by foci of origin

Focus        n (%)       AAT/IAT                    Mortality (AAT/IAT)
Pulmonary    94 (43)     66/28 (P = 0.8)            25/18 (P = 0.05, RR = 2)
Peritoneal   30 (13.6)   17/13 (P = 0.08)           8/5 (P = 0.9)
Vascular     30 (13.6)   26/4 (P = 0.07)            12/1 (P = 0.4)
Urinary      24 (11)     19/5 (P = 0.5)             12/1 (P = 0.1)
Skin         17 (7.7)    13/4 (P = 0.8)             8/3 (P = 0.5)
Unknown      13 (5.5)    5/8 (P = 0.01, RR = 3.8)   2/6 (P = 0.2)
Meningeal    12 (5.5)    11/1 (P = 0.1)             4/1 (P = 0.4)

Conclusions IAT was related to an unknown focus of origin in septic patients irrespective of the site of acquisition and severity of illness (P = 0.01, RR = 2.3). Bacteremic pulmonary infections treated with empirical IAT had a higher attributable mortality (P = 0.02, RR = 2.9). Community-acquired septic patients with SAPS II <50, when treated with IAT, had a significantly higher mortality (P = 0.03, RR = 1.9). We were not able to document this in more severely compromised patients (SAPS II >50), probably because the severity of the septic condition masks the consequences of IAT.

Reference
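The relative risk quoted for the community-acquired subgroup follows directly from the 2x2 counts given in the Results (IAT: 12 deaths among 23 patients; AAT: 20 deaths among 76 patients). A minimal sketch using the standard risk-ratio definition; the function name is ours, not the authors':

```python
# Risk ratio (relative risk) from a 2x2 outcome table.
# Counts are taken from the abstract's Results section:
# IAT: 12 deaths / 23 patients; AAT: 20 deaths / 76 patients.

def risk_ratio(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Risk in the exposed group divided by risk in the unexposed group."""
    risk_exposed = events_exposed / n_exposed
    risk_unexposed = events_unexposed / n_unexposed
    return risk_exposed / risk_unexposed

rr = risk_ratio(12, 23, 20, 76)
print(f"{rr:.2f}")  # 1.98, consistent with the reported RR of about 1.9
```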

1. McGregor JC, Rich SE, Harris AD: Systematic review of the methods used to assess the association between appropriate antibiotic therapy and mortality in bacteremic patients. Clin Infect Dis 2007, 45:329-337.

Incidence of candidemia before and after fluconazole prophylaxis implementation in a 14-bed general ICU

P Vartzeli1, M Moukas1, L Kondili2, G Bethimoutis2, C Mandragos1

1ICU, Red Cross Hospital, Ampelokipoi, Greece; 2Microbiology

Department, Red Cross Hospital, Athens, Greece

Critical Care 2008, 12(Suppl 2):P17 (doi: 10.1186/cc6238)

Introduction Patients in ICUs account for the greatest number of candidemia episodes in most hospitals. Fluconazole prophylaxis has been used to prevent Candida infections in critically ill patients. To examine the effect of implementing fluconazole prophylaxis in our ICU, we reviewed the records of all patients with blood cultures that grew Candida spp. (albicans and nonalbicans) in the year before and the year after implementation.

Methods In 2006 we introduced a protocol of intravenous fluconazole prophylaxis (400 mg/day) in selected patients (surgical, with central venous catheters, receiving broad-spectrum antibiotics, receiving TPN, requiring hemodialysis, or spending more than 8 days in the ICU). We recorded the incidence of candidemia for 2005 (4.03%) and 2006 (1.7%), as well as the candidemic patients' age (mean, 47.84 years/51 years), sex (10 men, three women/four men, one woman), APACHE II score on admission (mean, 11.27/12), days spent in the ICU (46 ± 30.30 days/98 ± 68.44 days), median day of Candida isolation (17th day (2nd-50th day)/46th day (23rd-208th day)), receipt of TPN (30.8%/60%), and outcome (values given as 2005/2006). All candidemic patients were treated with liposomal amphotericin B.

Results In 2005, 322 patients were admitted to our ICU; 13 of them had at least one blood culture that yielded Candida (six C. albicans, seven Candida spp.). None of them received fluconazole prophylaxis. Seven patients (53.8%) died. In 2006, 291 patients were admitted; five of them developed candidemia (two C. albicans, three C. parapsilosis), four were under prophylaxis, and three of these developed C. parapsilosis. Three patients (60%) died.

Conclusions Although the number of patients is small, it seems that fluconazole prophylaxis can prevent candidemia in critically ill patients, but it may also promote the emergence of nonalbicans species that are resistant to fluconazole.

Reference

1. Fraser VJ, et al.: Candidemia in a tertiary care hospital: epidemiology, risk factors, and predictors of mortality. Clin Infect Dis 1992, 15:414.

Comparison between mortality and airway colonisation versus noncolonisation with Candida species in critically ill adults

G Browne, R McMullan, D McAuley, J Troughton, G Lavery

Royal Victoria Hospital, Belfast, UK

Critical Care 2008, 12(Suppl 2):P18 (doi: 10.1186/cc6239)

Introduction Candida airway colonisation in patients with a clinical suspicion of ventilator-associated pneumonia has been associated with increased mortality in the published literature. The aim of this study was to investigate whether there is an association between the presence of Candida spp. in the respiratory secretions of critically ill adults and ICU mortality, irrespective of the confirmed presence of ventilator-associated pneumonia. S7

Methods A retrospective analysis was performed on patients admitted to a large mixed ICU in Northern Ireland over a 1-year period. Data were analysed to determine mortality in patients whose respiratory secretions had cultured Candida spp. (with or without coexisting bacteria), compared with those in whom cultures were negative for Candida spp. but positive for bacterial pathogens. Patients with persistently culture-negative respiratory specimens were excluded from analysis. Statistical significance of observed differences was evaluated by chi-square testing.

Results In total, 287 patients were analysed. Of these, 202 (70%) were male. Bacteria only were cultured from the respiratory secretions of 208 (72%) patients (the 'non-Candida' group). The 'Candida' group consisted of 79 (28%) patients; of these, 39 had Candida spp. only and 40 had Candida spp. plus bacterial pathogens. Within the 'non-Candida' group, 39 patients died during the ICU episode; in the 'Candida' group, 17 died (18.8% vs 21.5%, P = 0.597).

Conclusions The presence of Candida spp. in the respiratory secretions of this critically ill cohort was not associated with a significant increase in ICU mortality. It appears, therefore, that airway colonisation with Candida spp. in the absence of ventilator-associated pneumonia may not be regarded as a reliable predictor of ICU mortality.
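The non-significant difference (18.8% vs 21.5%, P = 0.597) can be reproduced from the 2x2 table implied by the Results. A sketch assuming the Pearson chi-square statistic without continuity correction (the abstract does not state which variant was used):

```python
# Pearson chi-square statistic (1 degree of freedom, no continuity
# correction) for a 2x2 table laid out as [[a, b], [c, d]].
# Counts from the Results: 'non-Candida' group 39 deaths / 208 patients,
# 'Candida' group 17 deaths / 79 patients.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

stat = chi_square_2x2(39, 208 - 39, 17, 79 - 17)
print(f"{stat:.2f}")  # 0.28, far below 3.84 (the 5% critical value at 1 df)
```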

Risk factors for lung colonization by Candida albicans in a general ICU

L Toma1, S Silvestri1, F Forfori2, G Licitra2, F Giunta2

1Scuola di specializzazione in Anestesia e Rianimazione, Pisa, Italy; 2Azienda Ospedaliera Universitaria Pisana, Pisa, Italy

Critical Care 2008, 12(Suppl 2):P19 (doi: 10.1186/cc6240)

Introduction Although a substantial proportion of patients become colonized with Candida spp. during a hospital stay, only a few develop severe infection. Invasive candidiasis occurs in only 1-8% of patients admitted to hospital, but in 10% of patients in the ICU, where Candida infections represent up to 15% of all nosocomial infections [1]. Candida spp. isolates from bronchoalveolar lavage (BAL) cultures in immunocompetent patients are thought to be contaminants rather than pathogens. The objective of this study was to identify the most important risk factors for lung colonization by Candida albicans in ICU patients.

Methods Immunocompetent patients admitted to the ICU with C. albicans isolated from BAL over a 20-month period were retrospectively studied. Patients without any microbiological growth from BAL were also included. The clinical course, therapeutic decisions, potential risk factors and outcome were recorded.

Results The study population comprised 20 (33.3%) patients with C. albicans isolated from BAL (BAL+) and 12 (20%) patients with no growth from BAL (BAL-). Significant differences between BAL(+) and BAL(-) patients were observed: 80% of BAL(+) versus 8.3% of BAL(-) patients were treated with parenteral nutrition (OR = 44), 90% versus 33.3% were mechanically ventilated (OR = 20), and 65% versus 8.3% received corticosteroid therapy (OR = 18). See Table 1.

Conclusions Total parenteral nutrition, mechanical ventilation and treatment with corticosteroids are important risk factors for lung colonization by C. albicans. The highest risk is attributable to parenteral nutrition: the risk is more than twice as high as the ventilation-associated and corticosteroid-associated risks.

Reference

1. Rello J, Esandi ME, Mariscal D, et al.: The role of Candida spp. isolated from bronchoscopic samples in nonneutropenic patients. Chest 1998, 114:146-149.

Table 1 (abstract P19)

                 Odds ratio   Standard error   P > z    95% confidence interval (lower bound)
NTP              44           52.12            0.001    4.31
Corticosteroid                23.38            0.008    2.166
VMA              18           17.36            0.003    2.71

NTP, total parenteral nutrition; VMA, assisted mechanical ventilation.
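The odds ratio reported for parenteral nutrition can be reproduced from the group sizes and percentages in the Results. The 2x2 counts below (16 of 20 BAL-positive and 1 of 12 BAL-negative patients exposed) are back-calculated from those percentages, not reported directly by the authors:

```python
# Cross-product odds ratio from exposed/unexposed counts in two groups.
# 80% of 20 BAL(+) patients -> 16 exposed; 8.3% of 12 BAL(-) patients -> 1.

def odds_ratio(exposed_cases, n_cases, exposed_controls, n_controls):
    """Odds ratio for a 2x2 exposure table."""
    a, b = exposed_cases, n_cases - exposed_cases            # BAL(+) row
    c, d = exposed_controls, n_controls - exposed_controls   # BAL(-) row
    return (a * d) / (b * c)

print(odds_ratio(16, 20, 1, 12))  # 44.0, matching the NTP row of Table 1
```

The other rows of the table can be checked the same way from the percentages given in the Results.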

Combination therapy with efungumab for the treatment of invasive Candida infections: several illustrative case reports

P Spronk1, B Van der Hoven2, C Graham3, F Jacobs4, J Sterba5, E Liakopoulou6, A Qamruddin7

1Gelre Hospitals (Lukas Site), Apeldoorn, The Netherlands; 2Erasmus Hospital, Rotterdam, The Netherlands; 3Birmingham University Children's Hospital, Birmingham, UK; 4Free University of Brussels, Belgium; 5University Hospital Brno, Czech Republic; 6Christie Hospital, Manchester, UK; 7Central Manchester & Manchester Children's University Hospitals, Manchester, UK

Critical Care 2008, 12(Suppl 2):P20 (doi: 10.1186/cc6241)

Introduction Efungumab (Mycograb®) is a human recombinant antibody against fungal Hsp90 that, in combination with lipid-associated amphotericin B, has shown efficacy in patients with invasive candidiasis (phase 3 data). Eight compassionate-use case studies of efungumab in combination with antifungal agents in the treatment of invasive Candida infections are presented.

Methods Efungumab was given to eight patients at 1 mg/kg twice daily, typically for 5 days, combined with standard doses of amphotericin B, caspofungin, flucytosine or fluconazole. Patients were 7-69 years old with culture-confirmed invasive fungal infections from which Candida spp. (Candida albicans, Candida krusei, Candida glabrata) were isolated; five patients had candidal peritonitis, one candidaemia, one a subphrenic abscess and candidaemia, and one mediastinal, pleural and pulmonary candidiasis; one patient had neutropenia.

Results Seven out of eight patients responded to 10 doses of efungumab; one patient (a child with candida peritonitis and abdominal abscesses associated with a non-Hodgkin's abdominal lymphoma) responded but relapsed and required a second course of treatment, to which he responded. One patient, with mediastinal, pulmonary and pleural candidiasis associated with ARDS, was withdrawn after two doses of efungumab, due to blood pressure fluctuations, impaired gas exchange, increased cardiac output and fever; in this patient the efungumab was not prefiltered. Three further patients experienced transient hypotensive or hypertensive episodes after the first dose, which did not recur with subsequent doses. One patient experienced nausea and vomiting after the second dose.

Conclusions This experience with efungumab extends the clinical trial database. It shows efficacy in poor-prognosis patients who failed to respond to conventional monotherapy (6-20 days), in patients with multiple species of Candida, and in candidaemia in a neutropenic patient. All but one patient tolerated efungumab and seven patients completed the course without major side effects.

Pooled analysis of safety for micafungin

OA Cornely1, P Maddison2, AJ Ullmann3

1Universität Klinikum Köln, Germany; 2Astellas Pharma Europe BV, Leiderdorp, The Netherlands; 3Klinikum der Johannes Gutenberg-Universität, Mainz, Germany

Critical Care 2008, 12(Suppl 2):P21 (doi: 10.1186/cc6242)

Introduction Micafungin (MICA) is an efficacious antifungal treatment for life-threatening fungal infections [1-4].

Methods We characterised the safety of MICA by analysing pooled adverse event (AE) data from 17 clinical studies conducted worldwide. All patients (n = 3,028) received at least one dose of intravenous MICA: a median daily dose of 100 mg for adults and 1.5 mg/kg for children, over a mean duration of 18 and 29 days, respectively.

Results The median age was 40.5 (range <0.1-92) years, including 296 (9.8%) children (<16 years old) and 387 (12.8%) elderly patients (>65 years old). Common underlying conditions were haematopoietic stem cell or other transplantation (26%), malignancies (21%) and HIV (33%). The most frequently reported MICA-related AEs were nausea (2.8%), vomiting (2.5%), phlebitis (2.5%), hypokalaemia (2.1%), pyrexia (2.1%), diarrhoea (2.0%), and increases in alkaline phosphatase (2.7%), aspartate aminotransferase (2.3%) and alanine aminotransferase (2.0%). In comparative studies, the MICA safety profile was superior to that of liposomal amphotericin B, and similar to those of fluconazole and caspofungin (Figure 1).

Conclusions This large database of more than 3,000 patients demonstrated a favourable clinical safety profile for micafungin.

References

1. Kuse ER, et al.: Lancet 2007, 369:1519-1527.

2. de Wet NT, et al.: Aliment Pharmacol Ther 2005, 21:899-907.

3. van Burik JA, et al.: Clin Infect Dis 2004, 39:1407-1416.

4. Pappas PG, et al.: Clin Infect Dis 2007, 45:883-893.

Figure 1 (abstract P21)

Study groups: candidaemia/invasive candidiasis (MICA 100 mg vs AmB 3 mg/kg); candidaemia/invasive candidiasis (MICA 100 mg vs MICA 150 mg vs CASPO 50 mg¶); oesophageal candidiasis‡ (MICA 150 mg vs FLU 200 mg); prophylaxis† (MICA 50 mg vs FLU 400 mg).

Adverse event           MICA 100 mg   AmB 3 mg/kg   MICA 100 mg   MICA 150 mg   CASPO 50 mg¶   MICA 150 mg   FLU 200 mg   MICA 50 mg   FLU 400 mg
                        (n = 316)     (n = 321)     (n = 200)     (n = 202)                    (n = 260)     (n = 258)    (n = 425)    (n = 457)
Hypokalemia             21 (6.6%)*    38 (11.8%)    4 (2.0%)      5 (2.5%)      3 (1.6%)       1 (0.4%)      1 (0.4%)     8 (1.9%)     8 (1.8%)
Pyrexia                 23 (7.3%)*    39 (12.1%)    2 (1.0%)      0             1 (0.5%)       5 (1.9%)      1 (0.4%)     4 (0.9%)     5 (1.1%)
Rigors                  2 (0.6%)*     19 (5.9%)     1 (0.5%)      2 (1.0%)      1 (0.5%)       6 (2.3%)      0            1 (0.2%)     5 (1.1%)
Creatinine increased    6 (1.9%)*     17 (5.3%)     0             1 (0.5%)      1 (0.5%)       0             0            1 (0.2%)     3 (0.7%)
Infusion-related event  52 (16.5%)*   87 (27.1%)    5 (2.5%)      0             5 (2.6%)       9 (3.5%)      8 (3.1%)     2 (0.5%)     4 (0.9%)

Treatment-related: assessed by the investigator as having at least a possible relationship to study drug. AmB: liposomal amphotericin B; CASPO: caspofungin; FLU: fluconazole; MICA: micafungin. † Haematopoietic stem cell transplantation (HSCT) was the main inclusion criterion. ‡ Most oesophageal candidiasis (OEC) patients were HIV positive with AIDS. * P < 0.05, Fisher's exact test. ¶ After a 70 mg loading dose.

Treatment-related adverse events (incidence > 5%) from comparative studies. Number (%) of patients.

Pharmacokinetics of micafungin in adult patients with invasive candidiasis and candidaemia

ER Kuse1, I Demeyer2, N Undre3

1Medizinische Hochschule Hannover, Germany; 2Onze Lieve Vrouw Ziekenhuis, Aalst, Belgium; 3Astellas Pharma GmbH, Munich, Germany

Critical Care 2008, 12(Suppl 2):P22 (doi: 10.1186/cc6243)

Introduction Micafungin (MICA) is an antifungal therapy for the treatment of life-threatening fungal infections. Until this study, the pharmacokinetics (PK) of MICA in patients with confirmed invasive candidiasis (IC) or candidaemia (C) had not been studied. We report here the PK of MICA in this patient population.

Methods We characterised the PK of MICA in neutropenic and non-neutropenic patients with confirmed IC or C. Patients (n = 20) received MICA 100 mg daily for at least 14 days. Plasma concentration-time profiles to determine the PK were taken after the first dose (day 1) and on the last day of treatment.

Results The mean age was 50 years (range: 18-84 years) and the mean weight was 67 kg (range: 48-103 kg). There were 13 Caucasians, three Thais, one Black, one Asian Indian, one Mulatto and one Cape Coloured patient. PK parameters are presented in Figure 1. The mean half-life and mean clearance remained largely unchanged after repeated daily dosing for 14 or 28 days. There was no accumulation of MICA between day 1 and the end of therapy beyond that expected for a drug with linear PK. Systemic exposure to MICA metabolites was low throughout the study; they therefore do not contribute to the therapeutic antifungal effectiveness of MICA.

Figure 1 (abstract P22)

Parameter                t1/2 (hr)   Cmax (µg/ml)   AUC0-24 (µg·hr/ml)   AUC0-∞ (µg·hr/ml)   CL (ml/hr)
Day 1
  n                      20          20             20                   20                  19
  Mean                   14.47       5.69           56.64                83.25               1441
  SD                     7.01        2.15           30.10                51.07               728
Last day of treatment
  n                      20          20             20                   20                  20
  Mean                   13.37       10.05          97.11                137.18              1168
  SD                     1.99        4.37           28.97                42.92               561

Conclusions The PK of MICA in these critically ill patients with IC and C were generally similar to those in earlier studies in healthy adults [1]. These data support previous studies that show MICA requires no loading dose. Reference

1. Chandrasekar PH, Sobel JD: Clin Infect Dis 2006, 42:1171-1178.

Single-dose pharmacokinetics of the cholesteryl sulfate complex of amphotericin B in critically ill patients with cholestatic liver failure

S Weiler, M Joannidis, R Bellmann-Weiler, R Bellmann

Clinical Pharmacokinetics Unit, Innsbruck, Austria

Critical Care 2008, 12(Suppl 2):P23 (doi: 10.1186/cc6244)

Introduction Investigations on the pharmacokinetics and elimination of amphotericin B (AMB) lipid formulations in liver impairment have so far been lacking. In the present clinical study the pharmacokinetics of the cholesteryl sulfate complex of AMB (ABCD) were assessed in critically ill patients with cholestatic liver failure.

Methods Time-concentration profiles were determined in critically ill patients with cholestatic liver failure and in critically ill patients with normal hepatic function requiring ABCD for invasive fungal infections. The lipid-associated and liberated fractions of AMB were separated by solid-phase extraction and subsequently quantified by high-performance liquid chromatography.

Results Three patients with impaired and three patients with normal hepatic function on day 1 of ABCD therapy have so far been enrolled. After a single dose of ABCD (2.46 ± 0.54 mg/kg vs 2.94 ± 1.47 mg/kg in the impaired-liver group compared with the control group), the maximum concentration in patients with impaired liver function was increased fourfold compared with the control group (1.98 ± 0.61 vs 0.52 ± 0.12 µg/ml for total AMB (P < 0.05), 1.25 ± 0.58 vs 0.46 ± 0.14 µg/ml for the liberated fraction (P < 0.05), 0.74 ± 0.05 vs 0.06 ± 0.02 µg/ml for the lipid-associated fraction (P < 0.05)). The clearance was slower in the investigational group (0.15 ± 0.09 vs 0.38 ± 0.19 l/hour/kg for total AMB, 0.22 ± 0.10 vs 0.38 ± 0.19 l/hour/kg for the liberated AMB fraction (P < 0.05) and 0.52 ± 0.45 vs 17.84 ± 15.45 l/hour/kg for lipid-associated AMB (P < 0.05)). The volume of distribution at steady state was significantly decreased (2.17 ± 0.58 vs 9.78 ± 2.99 l/kg for total AMB (P < 0.05), 3.09 ± 0.88 vs 10.39 ± 2.70 l/kg for liberated AMB (P < 0.05) and 8.18 ± 3.47 vs 83.27 ± 64.98 l/kg for lipid-associated AMB (P < 0.05)).

Conclusions The elimination of ABCD appears to be delayed in cholestatic liver failure, particularly that of the lipid-associated fraction. More pharmacokinetic data are required to establish reliable dose recommendations for ABCD in patients with liver failure.

Serum tobramycin levels during selective decontamination of the digestive tract in ICU patients on renal replacement therapy

M Mol, H Van Kan, L Spanjaard, M Schultz, M Vroom, E De Jonge

Academic Medical Center, Amsterdam, The Netherlands

Critical Care 2008, 12(Suppl 2):P24 (doi: 10.1186/cc6245)

Introduction Selective decontamination of the digestive tract (SDD) is an infection prophylaxis regimen that may improve survival in ICU patients [1]. Antibiotics for SDD are nonabsorbable, are given enterally and are therefore considered safe to use. The aim of our study was to determine whether enteral administration of tobramycin as part of an SDD regimen may lead to detectable and potentially toxic serum tobramycin concentrations in patients with renal failure.

Methods A prospective, observational study in ICU patients given SDD treatment for at least 3 days. All patients were on continuous venovenous hemofiltration with a filtration rate of 35 ml/kg/hour. Tobramycin serum concentrations were measured every 3 days.

Results Serum samples were taken a median of 6 days after the start of SDD (IQR 3-9 days). Detectable tobramycin levels were found in 12 of 19 patients (63%) and in 15 of 26 serum samples (58%). In four patients tobramycin concentrations were >1 mg/l, and in one of these patients a toxic concentration of 3 mg/l was found. All patients with tobramycin levels >1 mg/l had ischemic bowel disease; in contrast, no patients with lower concentrations had intestinal ischemia.

Conclusions In patients with renal failure treated with continuous venovenous hemofiltration, administration of SDD can lead to detectable and potentially toxic tobramycin serum concentrations. The risk of increased enteral absorption of tobramycin may be particularly high in patients with intestinal ischemia. We advise monitoring plasma tobramycin concentrations in patients with renal failure on prolonged treatment with SDD.

Reference
1. de Jonge E, et al.: Effects of selective decontamination of digestive tract on mortality and acquisition of resistant bacteria in intensive care: a randomised controlled trial. Lancet 2003, 362:1011-1016.

The pharmacokinetics of dalbavancin in subjects with mild, moderate, or severe hepatic impairment

J Dowell1, E Seltzer1, M Buckwalter1, T Marbury2, D Simoneau3, E Boudry3

1Vicuron Pharmaceuticals, Pfizer Inc., New York, USA; 2Orlando Clinical Research Center, Orlando, FL, USA; 3Pfizer International Operations, Paris, France

Critical Care 2008, 12(Suppl 2):P25 (doi: 10.1186/cc6246)

Introduction Dalbavancin (DAL) is a semisynthetic lipoglycopeptide in phase 3 development with activity against Gram-positive bacteria. Weekly doses (1 g on day 1, 0.5 g on day 8) are being investigated for the treatment of complicated skin and soft tissue infections. DAL has both renal and nonrenal routes of elimination. A study was performed to assess the need for dosage adjustments in patients with hepatic impairment.

Methods Subjects received 1 g DAL intravenously on day 1 followed by 0.5 g on day 8. Subjects had mild, moderate, or severe hepatic impairment as defined by Child-Pugh class A, B, or C. Age-matched, gender-matched, and weight-matched controls with normal hepatic function were also enrolled. DAL plasma concentrations were determined and pharmacokinetic parameters were calculated. Drug exposure was calculated as the cumulative area under the concentration-time curve through day 15; drug clearance and the elimination half-life were also determined. Safety was assessed by physical examination, adverse event monitoring and laboratory monitoring.

Results Twenty-six subjects were enrolled, received DAL, and had evaluable pharmacokinetics. The drug was well tolerated, with no serious adverse events. DAL concentrations and exposures were not increased by hepatic impairment, and the elimination half-life was not affected. Slightly lower exposures and higher drug clearance were observed for subjects with moderate and severe hepatic impairment, presumably due to volume changes secondary to ascites and edema. The DAL concentrations observed for these subjects were comparable with the ranges observed in other studies.

Conclusions DAL concentrations are not increased due to hepatic impairment and no dosage adjustment should be required for patients with mild, moderate, or severe hepatic impairment.

Dalbavancin dosage adjustments not required for patients with mild to moderate renal impairment

J Dowell1, E Seltzer1, M Stogniew1, MB Dorr1, S Fayocavitz1, D Krause1, T Henkel1, T Marbury2, D Simoneau3, E Boudry3

1Vicuron Pharmaceuticals, Pfizer Inc., New York, USA; 2Orlando Clinical Research Center, Orlando, FL, USA; 3Pfizer International Operations, Paris, France

Critical Care 2008, 12(Suppl 2):P26 (doi: 10.1186/cc6247)

Introduction Dalbavancin (DAL) is a novel semisynthetic glycopeptide in phase 3 clinical development that has activity against Gram(+) organisms, including resistant strains. Two doses given 1 week apart have been shown to be effective in complicated skin and soft tissue infections. A clinical study was performed to determine the need for dosage adjustments in subjects with mild to moderate renal impairment (RI).

Figure 1 (abstract P26)

Figure 1 (abstract P27)

Methods Subjects with normal renal function (creatinine clearance (CrCL) > 80 ml/min), mild RI (CrCL of 50-79 ml/min), or moderate RI (CrCL of 30-49 ml/min) received DAL as a single intravenous infusion (1,000 mg). Plasma samples were collected through at least 14 days after the dose. DAL was assayed using validated LC-MS/MS methods. Pharmacokinetic (PK) data were analyzed using noncompartmental methods.

Results Twenty-one subjects were enrolled, received one dose of 1,000 mg dalbavancin, and were included in the PK analysis. Areas under the plasma concentration-time curve through 14 days (AUC0-14) were similar between subjects with normal renal function and subjects with mild or moderate RI. An increased concentration was observed in subjects with moderate RI beyond day 14, at a point in the profile when concentrations were below 40 mg/l (Figure 1).

Conclusions DAL does not require a dosage adjustment for patients with mild or moderate RI. These results are consistent with previous clinical and nonclinical PK studies showing that DAL has dual (both renal and nonrenal) routes of elimination.
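Noncompartmental exposure metrics such as AUC0-14 are typically obtained by trapezoidal integration of the concentration-time profile. A minimal sketch of that calculation (the sampling times and concentrations below are hypothetical, invented for illustration, not data from this study):

```python
# Linear trapezoidal estimate of the cumulative AUC0-t from a
# plasma concentration-time profile (noncompartmental analysis).
def auc_trapezoidal(times, concs):
    """Area under the concentration-time curve by the trapezoid rule."""
    auc = 0.0
    for (t0, c0), (t1, c1) in zip(zip(times, concs), zip(times[1:], concs[1:])):
        auc += 0.5 * (c0 + c1) * (t1 - t0)
    return auc

# Hypothetical profile: days post-dose vs plasma concentration (mg/l)
days = [0, 1, 2, 4, 7, 10, 14]
conc = [300, 210, 160, 110, 70, 50, 35]

print(auc_trapezoidal(days, conc))  # AUC0-14 in mg*day/l
```

Dense sampling near the distribution phase (early time points) matters most here, since that is where the curve changes fastest and the trapezoid approximation is least accurate.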

Dalbavancin safety in the phase 2/3 clinical development program

E Seltzer1, L Goldberg1, D Krause1, D Simoneau2, E Boudry2

1Vicuron Pharmaceuticals, Pfizer Inc., New York, USA; 2Pfizer International Operations, Paris, France

Critical Care 2008, 12(Suppl 2):P27 (doi: 10.1186/cc6248)

Introduction Dalbavancin (DAL) is a novel, next-generation lipoglycopeptide with a pharmacokinetic profile that allows weekly dosing. The safety of DAL in the treatment of complicated skin and soft tissue infections was demonstrated versus comparators (COMP) in the phase 2/3 clinical development program.

Methods Safety was assessed by analyzing adverse events (AEs), laboratory parameters, vital signs, and ECGs. Safety analyses were conducted on the intent-to-treat (ITT) population, using descriptive statistics only (consistent with ICH guidance). COMP included linezolid, cefazolin, and vancomycin.

Results Of 1,699 patients in the phase 2/3 integrated database, 1,126 received DAL. Demographic characteristics were similar between the treatment groups. The majority of patients were aged <65 years, male (60.2% DAL vs 58.8% COMP), and Caucasian (71.1% DAL vs 75% COMP). Safety and tolerability were good and comparable with each of the comparators, both separately and in toto. No compound-specific or unique toxicity was identified. The duration of AEs in patients treated with DAL was similar to that with COMP (median duration, 4 days vs 5.5 days for treatment-related AEs and 3 days vs 4 days for all AEs, respectively) (Figure 1). There was no hepatotoxic or renal signal in an examination of abnormalities in ALT, AST, BUN, and creatinine. The percentage of patients with abnormal hematology values was low and similar between treatment groups. No QT effect was demonstrated. Safety was also demonstrated in relevant subpopulations (such as elderly and diabetic patients).

Conclusions Dalbavancin is a well-tolerated lipoglycopeptide, with an AE profile similar to comparators in type and duration of AEs.

Efficacy of telavancin for treatment of surgical site infections

SE Wilson1, ME Stryjewski2, VG Fowler2, D Young3, F Jacobs4, A Hopkins5, SL Barriere5, MM Kitt5, GR Corey2

1University of California, Irvine School of Medicine, Orange, CA, USA; 2Duke University Medical Center, Durham, NC, USA; 3UCSF, San Francisco, CA, USA; 4Erasme Hospital, Brussels, Belgium; 5Theravance, Inc., South San Francisco, CA, USA

Critical Care 2008, 12(Suppl 2):P28 (doi: 10.1186/cc6249)

Introduction The purpose of this study was to evaluate the efficacy of telavancin (TLV), a novel bactericidal lipoglycopeptide with a multifunctional mechanism of action, for the treatment of surgical site infections due to Gram-positive bacteria, including methicillin-resistant Staphylococcus aureus (MRSA).

Methods The ATLAS program (Assessment of TLV in complicated skin and skin structure infections (cSSSI)) consisted of parallel, randomized, double-blind trials including >1,800 patients with cSSSI who received either TLV 10 mg/kg intravenously every 24 hours or vancomycin (VAN) 1 g intravenously every 12 hours for 7-14 days. This subgroup analysis of ATLAS patients with surgical site cSSSI compared clinical and microbiologic responses to treatment with TLV or VAN.

Results One hundred and ninety-four patients (10%) had operative-site cSSSI (TLV, n = 101; VAN, n = 93). Patient characteristics were similar between groups. Among all treated patients, clinical cure was achieved in 78 (77%) TLV patients and 65 (70%) VAN patients. The efficacy of TLV was numerically superior to VAN in SA-infected and MRSA-infected patients (Table 1), but the differences did not reach statistical significance. Incidences of adverse events were generally similar, although nausea (28% TLV, 16% VAN), headache (10% TLV, 5% VAN) and taste disturbance (20% TLV, 1% VAN) were more common in the TLV group.

Table 1 (abstract P28)

Clinical cure and pathogen eradication rates by treatment group

                               S. aureus           MRSA
                               TLV      VAN        TLV      VAN
Clinical cure, n (%)           41 (85)  33 (70)    18 (86)  15 (71)
Pathogen eradication, n (%)    40 (83)  30 (64)    17 (81)  12 (57)

Conclusions TLV was at least as efficacious as VAN for treatment of operative-site MRSA cSSSI and is a potentially useful treatment option.

Recurrence of skin infections in patients treated with telavancin versus vancomycin for complicated skin and soft tissue infections in a New Orleans emergency department

L Dunbar1, D Sibley1, J Hunt1, S Weintraub1, A Marr1, J Ramirez1, R Edler1, H Thompson1, M Kitt2

1Louisiana State University Health Sciences Center, New Orleans, LA, USA; 2Theravance, Inc., Research and Development, South San Francisco, CA, USA

Critical Care 2008, 12(Suppl 2):P29 (doi: 10.1186/cc6250)

Introduction Telavancin (TLV) is a novel lipoglycopeptide antibiotic that has a multifunctional mechanism to produce rapid bactericidal activity. TLV is highly active against Gram-positive bacteria, including methicillin-resistant and vancomycin (VAN)-intermediate and VAN-resistant strains of Staphylococcus aureus. The recently described community-acquired MRSA is known to have virulence factors associated with multiple lesions and recurrences. The objective of this study was to determine rates of recurrent skin infections within 6 months following treatment with TLV versus VAN.

Methods A cohort analysis of outcomes was performed in patients from a high-volume inner-city emergency department (ED) in New Orleans, LA, USA. This study was approved by the Human Use Committee (LSUHSC), and informed consent was obtained for all patients. The study included patients enrolled in randomized, double-blind, controlled, phase 2 and 3 multicenter clinical trials. Eligibility criteria included age >18 years and diagnosis of complicated skin and soft tissue infections caused by suspected or confirmed Gram-positive organisms. Randomization was 1:1 to receive TLV or VAN. ED visit records of enrolled patients were reviewed to determine the number with recurrent skin infections. Data were analyzed by logistic regression.

Results Ninety-nine patients were randomized and received at least one dose of study medication; 19 patients were not evaluable due to adverse events (AEs), loss to follow-up, or lack of response. Success rates were similar in both analysis populations at the end of therapy: TLV 40/43 (93.0%) versus VAN 35/37 (94.6%). In 68 patients with S. aureus at baseline, 34/35 (97.1%) were cured in the TLV group and 32/33 (97.0%) in the VAN group. For 56 MRSA patients, cure rates were 30/30 (100%) for TLV and 25/26 (96.2%) for VAN. A total of 14 baseline MRSA patients initially cured returned to the ED with a new skin and soft tissue infection: 4/30 (13.3%) patients treated with TLV and 10/26 (38.5%) patients treated with VAN. In a relative risk analysis, TLV-treated patients had a 3.34-fold greater chance of not returning with a recurrent infection than VAN-treated patients (P = 0.04; CI, -1.036 to 10.790). The overall incidence of AEs was similar in the two treatment groups: TLV (30%) versus VAN (32.7%).

Conclusions The results of this study suggest improved long-term eradication of pathogens by TLV based on recurrence of infection within 6 months, and support the development of TLV, especially for infections caused by community-acquired MRSA.

Reference

1. Stryjewski ME, et al.: Clin Infect Dis 2005, 40:1601-1607.
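For illustration, a crude (unadjusted) risk ratio can be computed directly from the recurrence counts reported in the abstract (4/30 TLV vs 10/26 VAN). Note that the abstract's 3.34-fold figure comes from a logistic regression model, so a crude calculation like this sketch is not expected to reproduce it exactly:

```python
# Crude risk ratio of recurrent infection from 2x2 counts,
# with a Wald 95% confidence interval on the log scale.
import math

def risk_ratio(events_a, n_a, events_b, n_b):
    """Risk ratio of group A relative to group B, with a Wald 95% CI."""
    rr = (events_a / n_a) / (events_b / n_b)
    # Standard error of log(RR) for the Wald interval
    se = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, (lo, hi)

# VAN recurrence risk relative to TLV, per the abstract's counts
rr, ci = risk_ratio(10, 26, 4, 30)
print(round(rr, 2), [round(x, 2) for x in ci])  # 2.88 [1.03, 8.11]
```

The wide interval reflects the small cell counts, which is consistent with the abstract treating this as a suggestive rather than definitive finding.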

Daptomycin therapy for Gram-positive bacterial infections: a retrospective study of 30 cardiac surgery patients

E Douka, G Stravopodis, I Mpisiadis, D Markantonaki, S Geroulanos, G Saroglou

Onassis Cardiac Surgery Center, Athens, Greece

Critical Care 2008, 12(Suppl 2):P30 (doi: 10.1186/cc6251)

Introduction Daptomycin is the first in a class of agents known as lipopeptides, used for the treatment of Gram-positive infections. The aim of this study was to evaluate the outcome in patients who were treated with daptomycin over a 12-month period.

Methods We performed a retrospective review of Onassis Cardiac Surgery Center medical records. Clinical information, including patient demographics and clinical outcome, was analyzed. Primary endpoints were resolution of signs and symptoms of infection and discharge from hospital.

Results Thirty inpatients were treated with daptomycin (27 men, median age 60.6 years, mean hospital stay 55 days). Seven patients had bloodstream infection (BSI) (six coagulase-negative staphylococcus (CNS), one methicillin-susceptible Staphylococcus aureus (MSSA)), six patients had catheter-related BSI (five CNS, one vancomycin-resistant enterococcus (VRE)), eight patients had nonbacteremic catheter-related infection (seven CNS, one VRE), two patients had wound infection caused by MSSA and one patient had defibrillator-wire endocarditis caused by CNS. Seven patients received daptomycin as empiric therapy without laboratory documentation. All bacterial isolates were tested for susceptibility to daptomycin (MIC <2 μg/ml was considered sensitive). Most patients received daptomycin at a dosage of 4-6 mg/kg intravenously every 24 hours. The dosing frequency was adjusted to once every 48 hours or thrice weekly in all patients who had received hemodialysis. Prior and concomitant antibiotic therapy had been administered to all patients.

Overall, 22 (73.3%) of the 30 patients improved and were discharged. Eight patients died of complications of their underlying medical conditions. CNS was the most common pathogen (19 patients, six of whom died). No adverse events were attributed to daptomycin.

Conclusions Given the limitations of this registry because of its retrospective nature, daptomycin appears promising for the treatment of Gram-positive bacteremia, including catheter-related BSI by CNS. It has a safety profile similar to other agents commonly administered. Clinical experience will help define its role in the treatment of catheter-related BSI, foreign body endocarditis and multidrug-resistant Gram-positive bacteremia.

Therapy with teicoplanin in the ICU: continuous intravenous infusion or intermittent intravenous administration?

E Nardi1, L Marioni1, F Forfori2, G Licitra2, M Del Turco2, F Giunta2

1Scuola di Specializzazione in Anestesia e Rianimazione, Università degli Studi di Pisa, AOUP, Pisa, Italy; 2Department of Surgery, AOUP, Pisa, Italy

Critical Care 2008, 12(Suppl 2):P31 (doi: 10.1186/cc6252)

Introduction Teicoplanin is a glycopeptide antibiotic whose principal pharmacodynamic parameter is time dependency.

Methods A total of 16 critically ill patients being treated for suspected or documented Gram-positive infections were enrolled into the study after informed consent. We administered an initial loading dose of 10 mg/kg every 12 hours for three doses, followed by a maintenance dose based on therapeutic drug monitoring and therapy personalization by Bayesian computerized software. For the maintenance dose we divided patients into two groups: in the first group teicoplanin was administered intermittently every 24 hours (control group), in the second group by continuous infusion (study group). Adequate drug exposure was defined as a plasma concentration of 10 mg/l or greater. Blood samples for therapeutic drug monitoring were collected immediately before teicoplanin administration. When the pathogenic agents were isolated, the bacteriostatic and bactericidal properties of the serum were tested in both groups.

Results No differences between groups were found regarding mortality and renal damage. In the study group we reached higher teicoplanin levels despite the same amount of drug (Figure 1). To reach an adequate plasma level we had to increase the amount of teicoplanin in four patients in the control group, whereas in the study group we halved the dose in one patient and reduced it to one-quarter in another. Bactericidal serum activity was greater in the continuous group, although without statistical significance (Figure 1).

Conclusions Our data suggest that the administration of teicoplanin by continuous intravenous infusion compared with the intermittent mode might be more efficient in critically ill patients.

Bacterial population and antibiotic resistance in an ICU during a 4-year period

I Karampela, C Nicolaou, Z Roussou, M Katsiari, E Mainas, F Tsimpoukas, A Maguina

Konstantopoulio General Hospital, Athens, Greece Critical Care 2008, 12(Suppl 2):P32 (doi: 10.1186/cc6253)

Introduction Nosocomial infections are responsible for significant morbidity and mortality in ICU patients. This is particularly important considering the emergence of multidrug-resistant bacteria. Our aim was to retrospectively study the bacterial population responsible for colonization and infection of patients hospitalized in a multidisciplinary seven-bed ICU in a general hospital, as well as antibiotic resistance, during a 4-year period.

Methods Nine hundred and forty-eight patients were admitted to the ICU from September 2003 to September 2007. Blood, bronchial aspirates, urine, pus, drainage fluid (trauma, pleural, ascitic) and central venous catheter (CVC) tips were cultured for diagnostic purposes as well as for colonization surveillance.

Results Gram-negative bacteria were most commonly isolated (59%). The most frequent isolates were Acinetobacter baumannii (35%) and coagulase-negative staphylococci (CNS) (30%), followed by Klebsiella pneumoniae (12%) and Pseudomonas aeruginosa (12%). Enterococci (6%) and Staphylococcus aureus (4.6%) were rarely isolated.

Seventy-five percent of S. aureus strains were methicillin resistant, but all were sensitive to linezolid, teicoplanin and vancomycin. Ninety percent of CNS strains were methicillin resistant, 1.1% were resistant to linezolid and 3.3% were resistant to teicoplanin, but all strains were sensitive to vancomycin. Four percent of Enterococcus strains were resistant to linezolid, while teicoplanin and vancomycin resistance was 32.6% and 35.8%, respectively. Ninety-seven percent and 68% of A. baumannii strains were resistant to aminoglycosides and carbapenems, respectively, while resistance to colimycin showed a significant increase during the last 2 years (from 5% to 16%). K. pneumoniae strains were resistant to aminoglycosides in 40%, aztreonam in 57%, and carbapenems in 52%, and showed increasing resistance to colimycin with time (18% in 2006 to 34% in 2007). Finally, only one P. aeruginosa strain was found to be colimycin resistant.

Conclusions Gram-negative resistant strains predominate among our ICU bacterial population. During the 4-year study period the overall bacterial resistance, although high, remains relatively stable. Emerging Gram-negative bacterial resistance to colimycin poses a potential therapeutic problem for the future.

Relationship of microorganism carriage and infections in ICU patients

V Soulountsi1, K Mitka1, E Massa1, C Trikaliotis2, I Houris1, M Bitzani1

1ICU and 2Microbiology Department, G. Papanikolaou, Thessaloniki, Greece

Critical Care 2008, 12(Suppl 2):P33 (doi: 10.1186/cc6254)

Figure 1 (abstract P31)

Introduction A prospective cohort study was undertaken to determine the usefulness of surveillance cultures to classify ICU infections and to predict the causative agent of ICU-acquired infections.

Methods A total of 48 community patients (36 men, 11 women; age 50.17 ± 17.97 years; APACHE II score 13.51 ± 6.15) who were expected to stay in the ICU for >5 days were included in this study. Surveillance cultures of the throat and rectum were obtained on admission and thereafter on days 4, 8 and 12 to look for potentially pathogenic microorganisms (PPMs), in order to distinguish community-acquired PPMs from those acquired during the ICU stay. The epidemiological data and the alteration of the patients' carriage state during these days were recorded. Total infection episodes were classified into three categories according to the carrier state: primary endogenous infection (PEI), caused by PPMs carried by the patient in surveillance cultures on admission; secondary endogenous infection (SEI), caused by PPMs of the ICU environment, yielded both in surveillance and diagnostic cultures; and exogenous infection (EXI), for those caused by PPMs that did not appear in surveillance cultures. Statistical analysis used the Pearson χ2 test, paired t test, and ROC curve analysis; P < 0.05 was considered significant.

Results On day 4, colonization was detected by throat and rectum surveillance cultures in 81.1% and 75% of patients, respectively (P << 0.05). The most common microorganism isolated in surveillance cultures from the throat was Acinetobacter baumannii (22.9%) and from the rectum was Escherichia coli (15.7%). A total of 100 infections were recorded during the patients' ICU stay (length of stay: 26.44 ± 17.95 days), classified as 28 PEI, 44 SEI and 25 EXI; ICU-acquired infections accounted for 69% of cases. The mean day of diagnosis was 6.2 ± 4.7 for PEI, 12.6 ± 8.2 for SEI and 12.6 ± 7.3 for EXI. The causative agent could be predicted in 72% of infections. Day 6 was the cutoff point for predicting the causative microbial agent from the surveillance cultures (sensitivity 80%, specificity 74.6%). Isolation of A. baumannii in surveillance cultures had a 92.1% probability of causing infection.
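The sensitivity and specificity reported for the day-6 cutoff are ratios over a 2 x 2 confusion matrix of predictions versus observed infections. A minimal sketch of that calculation (the counts below are invented for illustration only; the abstract does not report the underlying matrix):

```python
# Sensitivity and specificity of a binary predictor from a
# confusion matrix: tp/fn among true positives, tn/fp among negatives.
def sens_spec(tp, fn, tn, fp):
    """Return (sensitivity, specificity) from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)  # true-positive rate
    specificity = tn / (tn + fp)  # true-negative rate
    return sensitivity, specificity

# Hypothetical counts chosen to land near the reported 80% / 74.6%
sens, spec = sens_spec(tp=40, fn=10, tn=44, fp=15)
print(round(sens, 3), round(spec, 3))  # 0.8 0.746
```

An ROC analysis like the one described simply repeats this calculation at each candidate cutoff day and picks the day that best trades off the two rates.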

Conclusions Our data suggest that surveillance cultures may offer useful information to improve hygiene in the ICU, to determine the causative agent of infection and to follow better antimicrobial policy.

Rifampicin and colistin association in the treatment of severe multiresistant Acinetobacter baumannii infections

B Charra, A Hachimi, A Benslama, S Motaouakkil

Ibn Rochd University Hospital, Casablanca, Morocco Critical Care 2008, 12(Suppl 2):P34 (doi: 10.1186/cc6255)

Introduction The increased incidence of nosocomial infections by multidrug-resistant Acinetobacter baumannii creates demand for combinations of older antimicrobials against this species. We conducted the present observational study to evaluate the efficacy of intravenous and aerosolized colistin combined with rifampicin in the treatment of critically ill patients with nosocomial infections caused by multiresistant A. baumannii.

Methods Critically ill patients in a medical ICU with nosocomial infections caused by A. baumannii resistant to all antibiotics except colistin were included. Diagnosis of infection was based on clinical data and isolation of bacteria. Bacterial susceptibility to colistin was tested, and the clinical response to colistin and rifampicin was evaluated.

Results There were 26 patients (age 43.58 ± 18.29 years, APACHE II score 6.35 ± 2.99): 16 cases of nosocomial pneumonia treated with aerosolized colistin (1 x 10^6 IU three times/day) combined with intravenous rifampicin (10 mg/kg every 12 hours); nine cases of bacteraemia treated with intravenous colistin (2 x 10^6 IU three times/day) combined with intravenous rifampicin (10 mg/kg every 12 hours), of which three cases were associated with ventilator-associated pneumonia; and one case of nosocomial meningitis treated with intrathecal colistin combined with intravenous rifampicin. The clinical evolution was favourable in all patients. Concerning side effects, we noticed moderate hepatic cytolysis in three patients.

Conclusions This is a clinical report of colistin combined with rifampicin for treatment of A. baumannii infection. Despite the lack of a control group and the limited number of patients, the results seem to be encouraging.

Trends of resistance of Gram-negative bacteria in the ICU during a 3-year period

A Skiada1, J Pavleas2, G Thomopoulos2, I Stefanou3, A Mega2, N Archodoulis2, P Vernikos2, K Rigas2, A Rigas2, A Kaskantamis2, A Zerva2, E Pagalou2, J Floros2

1Research Laboratory for Infectious Diseases and Antimicrobial Chemotherapy, University of Athens, Greece; 2ICU, Laikon General Hospital, Athens, Greece; 3Microbiology Department, Laikon Hospital, Athens, Greece

Critical Care 2008, 12(Suppl 2):P35 (doi: 10.1186/cc6256)

Introduction The aim of the study was to calculate the incidence of bacteremias due to multidrug-resistant (MDR) Gram-negative bacteria (GNB) and to detect any emerging trends during a 3-year period.

Methods A prospective study of bloodstream infections in an ICU of a tertiary care hospital. The data collected prospectively included epidemiological and clinical characteristics of all patients admitted to the ICU from September 2004 to August 2007, the bacteria isolated from bloodstream infections and their patterns of resistance. A bacterium was characterized as MDR when it was resistant to three classes of antibiotics and as MDRc when it was also resistant to carbapenems. The study was divided into nine 4-month periods in order to calculate the incidence of MDR bacteremias in each such period and to evaluate each bacterium separately.

Results During this study 390 patients were admitted to the ICU, of whom 60% were male. Their mean age was 65 years, the mean APACHE score was 17.9 and the mean duration of stay in the ICU was 18 days. One hundred bacteremias due to MDR GNBs were recorded. Of the isolated MDR bacteria, 77 were MDRc, and 95% of those 77 were Acinetobacter baumannii (25 isolates), Pseudomonas aeruginosa (20 isolates) or Klebsiella pneumoniae (28 isolates). A clear trend emerged for K. pneumoniae, whose incidence increased exponentially during the study period. Of the 28 isolates of MDRc K. pneumoniae, 7% were recorded during the first 12 months of the study, 33% during the next 12 months and 60% during the last 12 months. The incidence of A. baumannii remained relatively stable (36%, 32% and 32% of isolates were recorded during each 12-month period) and the same was true for P. aeruginosa (25%, 40% and 35%, respectively).

Conclusions The incidence of bacteremias due to MDR GNBs that are also resistant to carbapenems is high in our ICU. Bacteremias due to MDRc K. pneumoniae have risen dramatically during the past months. Further studies are needed to investigate the risk factors and develop strategies to confront the problem.

References

1. Kollef M: Chest 2006, 130:1293-1296.

2. Patel G: J Pharm Practice 2005, 18:91-99.

Colonization and infection by potentially multiresistant Gram-negative microorganisms in a medical-surgical ICU

L Lorente Ramos, C García, J Iribarren, J Castedo, R Galván, J Jiménez, M Brouard, R Santacreu, L Lorenzo, M Martín, M Mora

Hospital Universitario de Canarias, La Laguna, Tenerife, Spain Critical Care 2008, 12(Suppl 2):P36 (doi: 10.1186/cc6257)

Introduction To determine the rate of colonization/infection by potentially multiresistant Gram-negative microorganisms (pseudomonas, stenotrophomonas and acinetobacter) in critically ill patients.

Methods A prospective study over 30 months in the ICU. Throat swabs, tracheal aspirates and urine were taken on admission and twice weekly. The infections were classified based on throat flora as: primary endogenous (PE), caused by germs already colonizing the throat on ICU admission; secondary endogenous (SE), caused by germs not colonizing the throat on ICU admission but acquired during the stay in the ICU; and exogenous (EX), caused by germs not colonizing the throat. The infections were classified based on onset as: early onset (EO), when developed during the first 4 days of ICU stay; and late onset (LO), when developed 5 days or more after ICU admission.

Results In total 1,582 patients were admitted. The mean APACHE II score was 13.95 (±8.93). Mortality was 14.79%. A total of 80 patients had colonization by pseudomonas, 26 patients at ICU admission and 54 patients during the ICU stay. We documented 46 infections by pseudomonas (nine EO and 37 LO; four PE, 35 SE and seven EX) with death in 13/46 patients (28.26%): 31 pneumonias (six EO and 25 LO; two PE, 24 SE and five EX), seven urinary tract infections (one EO and six LO; two PE, three SE and two EX), five primary bacteremias (two EO and three LO; five SE), two surgical wound infections (two LO and SE) and one pressure sore infection (one LO and SE). A total of 14 patients had colonization by stenotrophomonas, one patient at ICU admission and 13 patients during the ICU stay. We documented eight infections by stenotrophomonas (two EO and six LO; seven SE and one EX) with death in 3/8 patients (37.50%): all were pneumonias. A total of 12 patients had colonization by acinetobacter, one patient at ICU admission and 11 patients during the ICU stay.
We documented eight infections by acinetobacter (two EO and six LO; seven SE and one EX) with death in 2/8 patients (25%): six pneumonias (two EO and four LO; five SE and one EX) and two bloodstream infections (two LO and two SE).

Conclusions In our series, most of the infections caused by pseudomonas, stenotrophomonas and acinetobacter were pneumonias, had a late onset and were secondary endogenous infections.

Changing resistance pattern for Acinetobacter baumannii through the years

F Alkaya Alver1, O Memikoglu1, E Ozgencil1, E Eker2, M Oral1, N Unal1, M Tulunay1

1Ibni Sina Hospital, Ankara, Turkey; 2Başkent University, Adana, Turkey

Critical Care 2008, 12(Suppl 2):P37 (doi: 10.1186/cc6258)

Introduction Over the past two decades, Acinetobacter baumannii has emerged as an important nosocomial pathogen, and hospital outbreaks caused by this organism have increased worldwide. Its extraordinary ability to acquire resistance to almost all groups of commercially available antibiotics is a clinical problem of great concern. In fact, most A. baumannii strains isolated in the ICU are highly resistant to carbapenems, aminoglycosides and β-lactam antibiotics. The aim of this study was to determine the change in the resistance pattern of A. baumannii through the years.

Methods Isolates from patients admitted to the ICU in 2003 and 2007 were analyzed. Both colonization and infection isolates were evaluated: 118 isolates from 51 patients in 2003 and 108 isolates from 21 patients in 2007 were included in the study. The clinical specimens were sampled from tracheal aspirate, abscess, blood, wound, urine, pleural fluid, catheter and ascites. Susceptibilities to seven antibiotics were determined.

Results Resistance rates in 2003/2007 were: 54.7%/28.6% sulbactam-cefoperazone, 66.1%/85.7% imipenem, 96.6%/95.2% piperacillin-tazobactam, 66.1%/61.9% cefepime, 4.3%/0% colistin, 76.3%/85% amikacin, and 60.2%/66% tobramycin, respectively. Even though tobramycin is not on the market in Turkey, resistance to it increased over the 4 years.

Conclusions Colistin seems to be the best alternative in our hospital but, since it is not on the market in our country, its use is limited. Carbapenems are therefore the first choice in the treatment of multidrug-resistant A. baumannii in our country. This choice causes increased carbapenem resistance in A. baumannii isolates. Our hospital's antibiotic treatment policies and the cyclic usage of antibiotics must be reconsidered.

Reference

1. Del Mar Tomas M, Cartelle M, Pertega S, et al.: Hospital outbreak caused by a carbapenem-resistant strain of Acinetobacter baumannii: patient prognosis and risk-factors for colonisation and infection. Clin Microbiol Infect 2005, 11: 540-546.

Catheter-related bloodstream infection according to central venous accesses

L Lorente Ramos, C García, J Iribarren, J Castedo, J Jiménez, M Brouard, C Henry, R Santacreu, M Martín, M Mora

Hospital Universitario de Canarias, La Laguna, Tenerife, Spain Critical Care 2008, 12(Suppl 2):P38 (doi: 10.1186/cc6259)

Introduction Which venous catheterization site is associated with the highest risk of infection remains controversial. In the CDC guidelines of 1996 and in the latest guidelines of 2002, central venous catheter (CVC) insertion at the subclavian site is recommended rather than femoral or jugular access to minimize infection risk. The objective of this study was to analyze the incidence of catheter-related bloodstream infection (CRBSI) of CVCs according to the different accesses.

Methods A prospective and observational study, conducted in a polyvalent medical-surgical ICU. We included all consecutive patients admitted to the ICU during 4 years (1 May 2000-30 April 2004). The comparison of CRBSI incidence per 1,000 catheter-days between the different central venous accesses was performed using Poisson regression. P < 0.05 was considered statistically significant.

Results The number of CVCs, days of catheterization, number of bacteremias and CRBSI incidence density per 1,000 catheter-days were: global, 1,769, 15,683, 48 and 3.06; subclavian, 877, 7,805, 8 and 1.02; posterior jugular, 169, 1,647, 2 and 1.21; central jugular, 515, 4,552, 22 and 4.83; and femoral, 208, 1,679, 16 and 9.52. The CRBSI incidence density was statistically higher for femoral than for central jugular (OR = 1.40, 95% CI = 1.04-infinite, P = 0.03), posterior jugular (OR = 1.99, 95% CI = 1.30-infinite, P < 0.001) and subclavian accesses (OR = 9.30, 95% CI = 4.27-infinite, P < 0.001), and for central jugular than for posterior jugular (OR = 3.98, 95% CI = 1.15-infinite, P = 0.03) and subclavian accesses (OR = 4.72, 95% CI = 2.27-infinite, P < 0.001); there were no significant differences between posterior jugular and subclavian access (OR = 1.09, 95% CI = 0.43-infinite, P = 0.99).
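The incidence densities above follow directly from the reported counts: events divided by catheter-days, scaled to 1,000 days. A minimal sketch of that arithmetic, using the femoral and subclavian counts from the abstract (the crude rate ratio it yields matches the reported femoral-versus-subclavian figure of 9.30):

```python
# CRBSI incidence density (events per 1,000 catheter-days) and the
# crude rate ratio between two insertion sites, from the counts
# reported in the abstract: femoral 16 CRBSIs / 1,679 catheter-days,
# subclavian 8 CRBSIs / 7,805 catheter-days.
def incidence_density(events, catheter_days):
    """Events per 1,000 catheter-days of exposure."""
    return 1000 * events / catheter_days

femoral = incidence_density(16, 1679)      # ~9.53 per 1,000 days
subclavian = incidence_density(8, 7805)    # ~1.02 per 1,000 days
rate_ratio = femoral / subclavian          # ~9.30

print(round(femoral, 2), round(subclavian, 2), round(rate_ratio, 2))
```

The Poisson regression mentioned in the Methods is what turns these crude rates into the confidence intervals and P values quoted in the Results.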

Conclusions Our results suggest that the order of venous puncture, to minimize the CVC-related infection risk, should be subclavian or posterior internal jugular as the first option, then central internal jugular and finally the femoral vein.

Comparison of oligon central venous catheters with standard multilumen central venous catheters in cardiac surgery ICU patients

I Mpisiadis, E Douka, I Kriaras, D Markantonaki, S Geroulanos

Onassis Cardiac Surgery Center, Athens, Greece Critical Care 2008, 12(Suppl 2):P39 (doi: 10.1186/cc6260)

Introduction Catheter-related infections account for a large part of all nosocomial infections, and clinical studies have suggested that impregnation of catheters with antiseptics or antibiotics could decrease the rates of colonization. The purpose of this study was to assess the efficacy of oligon catheters to reduce bacterial colonization.

Methods A prospective, randomized clinical study was conducted among patients admitted to our 16-bed cardiac surgery ICU from 1 December 2006 to 1 December 2007 who required a central venous catheter after cardiac surgery. A total of 139 patients were prospectively randomized to receive either an oligon (O group, n = 69) or a standard catheter (S group, n = 70), expected to remain in place for >3 days. Catheter colonization, catheter-related bloodstream infection and nonbacteremic catheter-related infection were defined according to the Centers for Disease Control and Prevention. Blood cultures were drawn at catheter removal, and the removed catheters were analyzed with quantitative cultures. Catheters were removed aseptically when no longer necessary, when the patient died, or when there were signs of sepsis.

Results A total of 69 catheters were studied in the oligon group and 70 in the standard group. Characteristics of the patients, the insertion site, the duration of catheterization, and other risk factors for infection were similar in the two groups. The difference in catheter colonization, 3 (4.35%) in the O group versus 3 (4.28%) in the S group, did not reach significance despite the relatively long median duration of catheterization of 9 days versus 8 days, respectively. When catheter colonization occurred, coagulase-negative staphylococci were found most frequently in both groups.

Conclusions Oligon central venous catheters did not significantly reduce bacterial catheter colonization or the catheter-related infection rate compared with standard catheters. Usual preventive measures therefore remain the cornerstone of controlling catheter-related infections.

A multicenter randomized controlled clinical trial comparing central venous catheters impregnated with either 5-fluorouracil or chlorhexidine/silver sulfadiazine in preventing catheter colonization

JM Walz1, J Luber2, J Reyno3, G Stanford4, R Gitter5, KJ Longtine1, JK Longtine1, MA O'Neill1, SO Heard1

1University of Massachusetts Medical Center, Worcester, MA, USA; 2Franciscan Health System Research Center, Tacoma, WA, USA; 3Rapid City Regional Hospital, Rapid City, SD, USA; 4Winchester Medical Center, Winchester, VA, USA; 5Mercy Heart Center, Sioux City, IA, USA

Critical Care 2008, 12(Suppl 2):P40 (doi: 10.1186/cc6261)

Introduction We conducted a national multicenter randomized clinical trial to compare the efficacy of a novel anti-infective central venous catheter (CVC) coated with 5-fluorouracil (5-FU) (Angiotech Pharmaceuticals, Vancouver, Canada) with a catheter coated with chlorhexidine and silver sulfadiazine (CH-SS) (ARROWgard Blue; Arrow International, Inc., Reading, PA, USA) in preventing catheter colonization, local site infection and catheter-related bloodstream infection (CRBSI).

Methods Male and nonpregnant female subjects >18 years, who were initially hospitalized in an ICU and required insertion of a triple-lumen CVC, were randomized (1:1) to receive either the 5-FU or CH-SS CVCs, implanted for a maximum of 28 days. Upon removal, catheter tips were cultured using the roll-plate method. CRBSI was defined as isolation of the same species from peripheral blood and the catheter tip. Incidence rates of bacterial catheter colonization between the two treatments were compared using the Cochran-Mantel-Haenszel χ2 test. To evaluate bacterial isolates for evidence of acquired 5-FU resistance, isolates cultured from catheter tips were exposed to 5-FU in vitro for a second time.

Results Of 960 subjects who were randomized, 817 completed the study. Four hundred and nineteen were randomized to the 5-FU group and 398 to the CH-SS group. The rate of colonization of 5-FU catheters was 2.9% (n = 12) compared with 5.3% (n = 21) for the CH-SS-coated catheters (relative reduction in colonization with 5-FU coating of 46%, P = 0.055). There were no statistically significant differences (5-FU vs CH-SS) in local site infections (1.4% vs 0.9%), CRBSI (0% vs 2.8%), or the rate of adverse events related to the study devices (3.4% vs 3.5%). There was no evidence of acquired resistance to 5-FU in clinical isolates exposed to the drug for a second time.
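The trial compared colonization rates with the stratified Cochran-Mantel-Haenszel test; as a rough unstratified approximation, a plain Pearson chi-squared statistic on the pooled 2x2 table lands near the reported borderline significance. A minimal Python sketch (the function name is ours; the counts are from the abstract):

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-squared statistic (no continuity correction) for the
    2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Colonized vs not colonized, by catheter coating.
stat = chi2_2x2(12, 419 - 12,   # 5-FU:  12 of 419 colonized
                21, 398 - 21)   # CH-SS: 21 of 398 colonized
# For 1 degree of freedom, 2.71 corresponds to P = 0.10 and 3.84 to P = 0.05,
# so this unstratified statistic is consistent with the reported P = 0.055.
print(f"chi2 = {stat:.2f}")
```

Because this ignores center stratification, it will not reproduce the trial's exact P value, only its order of magnitude.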

Conclusions The CVC coated with 5-FU is noninferior in its ability to prevent bacterial colonization of the catheter tip when compared with catheters coated with CH-SS.

High epithelial lining fluid concentrations of NKTR-061 (inhaled amikacin) twice daily achieved in pneumonic portions of lung

J Chastre1, K Corkery2, C Wood3, M Clavel4, K Guntupalli5, D Gribben2, C Luyt1

1Hopital Pitie-Salpetriere, Paris, France; 2Nektar Therapeutics, San Carlos, CA, USA; 3University of Tennessee Health Sciences Center, Memphis, TN, USA; 4Centre Hospitalier Universitaire Dupuytren, Limoges, France; 5Baylor College of Medicine, Houston, TX, USA

Critical Care 2008, 12(Suppl 2):P41 (doi: 10.1186/cc6262)

Introduction The use of systemic aminoglycosides to treat ventilated patients with Gram-negative pneumonia (GNP) is limited by their toxicity and poor penetration into the lung. Luyt and colleagues demonstrated high amikacin epithelial lining fluid (ELF) concentrations after twice-daily administration of NKTR-061 (amikacin 400 mg in 3.2 ml) to intubated patients with pneumonia (n = 4) [1]. The present study was conducted to confirm high levels of NKTR-061 in the ELF of the pneumonic portion of the lung after 400 mg twice-daily dosing.

Methods NKTR-061 was delivered via the pulmonary drug delivery system (PDDS® Clinical; Nektar Therapeutics) to mechanically ventilated patients with GNP for 7-14 days. The aerosol therapy was adjunctive to intravenous therapy per ATS guidelines. Twenty-eight evaluable patients received a daily dose of 800 mg in two divided doses every 12 hours. On treatment day 3, all patients underwent bronchoalveolar lavage 30 minutes after aerosol delivery. The lung fluid was obtained from the infection-involved area of the lung.

The apparent volume of ELF recovered by bronchoalveolar lavage was determined using urea as an endogenous marker of dilution. The same day, the amikacin concentration was determined in serum collected 0.5, 1, 3, 6, 9, 12, 13, and 24 hours after delivery of the morning dose.
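The urea dilution method works because urea equilibrates freely between serum and ELF, so the serum-to-BALF urea ratio gives the factor by which lavage diluted the ELF. A sketch of the standard calculation (the function name and all numbers are illustrative, not study data):

```python
def elf_concentration(drug_balf, urea_serum, urea_balf):
    """Correct a BALF drug concentration for ELF dilution using urea as an
    endogenous marker: C_ELF = C_BALF * (urea_serum / urea_BALF).
    Drug and urea concentrations must each be in internally consistent units.
    """
    dilution_factor = urea_serum / urea_balf
    return drug_balf * dilution_factor

# Illustrative numbers only; the abstract does not report individual urea values.
# BALF amikacin 20 ug/ml, serum urea 30 mg/dl, BALF urea 0.5 mg/dl:
print(elf_concentration(drug_balf=20.0, urea_serum=30.0, urea_balf=0.5))  # -> 1200.0
```

Equivalently, the apparent ELF volume is V_BAL multiplied by the inverse ratio (urea_BALF / urea_serum).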

Results The median (range) ELF amikacin concentration was 976.1 μg/ml (135.7-16,127.6), whereas the median (range) serum maximum concentration was 0.9 μg/ml (0.62-1.73). The median (range) duration of aerosol treatment was 7 days (2-10).

Conclusions Delivery of aerosolized amikacin using the PDDS Clinical achieved very high aminoglycoside concentrations in the ELF, in the pneumonic area of the lung, while maintaining safe serum amikacin concentrations. The ELF concentrations achieved always exceeded the amikacin MIC for microorganisms usually responsible for GNP. The clinical impact of this route remains to be determined.

Reference

1. Luyt et al.: [abstract]. Am J Respir Crit Care Med 2007, 175: A328.

NKTR-061 (inhaled amikacin) delivers high lung doses in mechanically ventilated patients with pneumonia and in healthy subjects

M Eldon1, K Corkery1, D Gribben1, J Fink1, M Niederman2

1Nektar Therapeutics, San Carlos, CA, USA; 2Winthrop University Hospital, Mineola, NY, USA

Critical Care 2008, 12(Suppl 2):P42 (doi: 10.1186/cc6263)

Introduction Intravenous antibiotics (AB) have a clinical cure rate of ~55% in mechanically ventilated patients (MVP) with pneumonia. AB inhalation delivers larger lung doses than intravenous AB, but is problematic in MVP due to inefficient and variable current delivery systems. NKTR-061, a proprietary amikacin (AMK) aerosolization system optimized for ventilator circuits, is in clinical development as an adjunct to intravenous AB therapy for the treatment of pneumonia.

Methods In a phase II study, MVP (n = 44) received 400 mg every 24 hours or every 12 hours for 7-14 days; serial serum, tracheal aspirate and urine samples were collected on day 3. Days on the ventilator and intravenous AB use were monitored. In a separate study, healthy subjects (n = 14) inhaled a single technetium-99m-labeled 400 mg dose; serum and urine were collected and lung deposition was determined with gamma scintigraphy.

Results The AMK lung dose in healthy subjects was 172.2 mg, 43% of the nominal dose. The lung dose in MVP was 112 mg, with the difference arising from loss of drug in the ventilator circuit. The predicted peak lung AMK in healthy subjects (Figure 1) ranged from 75 to 165 μg/g. The peak tracheal aspirate AMK after NKTR-061 was 16,212 ± 3,675 μg/ml versus 14 ± 4.2 μg/ml after intravenous administration (15 mg/kg [1]); the peak serum AMK after NKTR-061 was 3.2 ± 0.5 μg/ml versus 47 ± 4.2 μg/ml after intravenous administration. NKTR-061 caused a significant (P = 0.02) dose-dependent reduction in intravenous AB use, with MVP dosed every 12 hours requiring half as much concurrent intravenous AB after 7 days of treatment as those receiving placebo [2].

Conclusions NKTR-061 achieves AMK lung exposures in MVP much greater than those after intravenous dosing. Greater lung exposure with concurrent lower overall dose and serum exposure is expected to increase efficacy, reduce the incidence of AB resistance and limit systemic AB toxicity.

References

Figure 1 (abstract P42). HS, healthy subjects; IH, inhaled; IV, intravenous.

1. Santre C, et al.: Amikacin levels in bronchial secretions of 10 pneumonia patients with respiratory support treated once daily versus twice daily. Antimicrob Agents Chemother 1995, 39:264-267.
2. Niederman MS, et al.: Inhaled amikacin reduces IV antibiotic use in intubated mechanically ventilated patients during treatment of Gram-negative pneumonia [abstract]. Am J Respir Crit Care Med 2007, 175:A328.

Ten-year exploratory retrospective study on empyema

KM Marmagkiolis1, M Omar1, N Nikolaidis2, T Politis3, I Nikolaidis4, S Fournogerakis4, MP Papamichail5, L Goldstein1

1Forum Health - WRCS, Youngstown, OH, USA; 2Aberdeen Royal Infirmary Hospital, Aberdeen, UK; 3Northeastern Ohio Universities College of Medicine, Rootstown, OH, USA; 4Tzanion General Hospital, Athens, Greece; 5Hospital Asklipion Voulas, Athens, Greece

Critical Care 2008, 12(Suppl 2):P43 (doi: 10.1186/cc6264)

Introduction Thoracic empyema remains a serious illness that usually represents a complication of pneumonia in susceptible patients. Our exploratory study aims to describe this potentially fatal disease and identify clinically useful correlations that would lead to more effective management and treatment. Methods We performed a retrospective review of patients hospitalized between the years 1996 and 2006 at Forum Health - WRCS. Demographics, initial symptoms and signs, underlying diseases, pleural fluid analysis and cultures, chest CT reports, length of stay and outcome were reviewed.

Results The charts of 104 patients who met the above criteria were reviewed. Their ages ranged from 10 months to 87 years; 52% were nonsmokers. The main presenting symptoms were dyspnea (65%), fever (60%), cough (60%), chest pain (45%), weight loss (14%) and hemoptysis (9%). Approximately 22% of the patients had an underlying malignancy. Other underlying chronic illnesses included chronic obstructive pulmonary disease (27%), congestive heart failure (24%), and diabetes (21%). Pleural fluid Gram stain was positive in 25% of the patients and pleural fluid cultures in 49%. Of those with positive cultures, Gram(+) aerobes were found in 60%, Gram(-) in 24% and anaerobes in only 12%. Treatment of the patients included: repeat thoracentesis (effective in only two patients); intrapleural thrombolysis (performed in five patients, effective in four); and chest tube drainage (performed in 80% of the patients). Approximately one-half of them required further procedures: video-assisted thoracoscopic surgery was performed in 10 patients (10%), six of whom required subsequent thoracotomy; and thoracotomy and decortication was performed in 46% of the patients. Overall mortality was 9% and surgical mortality was 2.1%.

Conclusions Clinical suggestions arising from our study are as follows. Empyema is a potentially fatal complication of pneumonia and should always be suspected in patients with nonimproving pneumonia. Early aggressive antibiotic therapy targeting Gram-positive aerobes should be initiated. An underlying malignancy should always be considered in the differential diagnosis. Cardiothoracic surgeons should be consulted early in the clinical course for evaluation of possible video-assisted thoracoscopic surgery or thoracotomy.

Validation of predictive rules for ICU patients with community-acquired pneumonia

V Rudnov, A Fesenko

Medical Academy, Ekaterinburg, Russian Federation Critical Care 2008, 12(Suppl 2):P44 (doi: 10.1186/cc6265)

Introduction The objective of the study was to validate specific rules and nonspecific scores for prediction of mortality in patients with severe community-acquired pneumonia (CAP). Methods The study included 300 patients with CAP admitted to six ICUs of the city during 2005-2006. On admission, each patient was assessed with the specific rules PORT, CURB-65, CRB-65 and SMART-CO and with the SOFA and APACHE II scores with regard to pneumonia severity and mortality. All data were analysed and processed with receiver operating characteristic curves. Results See Table 1. The analysis demonstrated high predictive values for the CAP-specific rules and the SOFA score. The areas under the receiver operating characteristic curves (AUROCs) were compared. The APACHE II score had no prognostic ability, because its AUROC did not differ significantly from the diagonal: 0.71 ± 0.17 (P = 0.2). The PORT and SOFA scores had maximal sensitivity and specificity: 92.3 (63.9-98.7) and 81.0 (65.9-91.4), respectively.

Table 1 (abstract P44)

Score      AUROC  95% CI    P value
PORT       0.88   0.7-0.9   <0.01
CURB-65    0.86   0.6-0.9   <0.01
CRB-65     0.84   0.7-0.9   <0.01
SMRT-CO    0.77   0.6-0.9   <0.01
SOFA       0.90   0.8-0.96  <0.01
APACHE II  0.71   0.4-0.91  >0.05
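AUROC values like those in Table 1 are equivalent to the Mann-Whitney statistic: the probability that a randomly chosen non-survivor has a higher score than a randomly chosen survivor (0.5 is chance, 1.0 perfect discrimination). A minimal pure-Python sketch with toy data (the function and data are illustrative, not from the study):

```python
def auroc(scores, outcomes):
    """Area under the ROC curve via the rank (Mann-Whitney) formulation:
    the fraction of (non-survivor, survivor) pairs in which the non-survivor
    has the higher score; ties count 0.5. outcomes: 1 = died, 0 = survived."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy data: higher SOFA-like scores tend to occur in non-survivors.
scores   = [2, 4, 5, 7, 9, 11, 12, 14]
outcomes = [0, 0, 0, 1, 0, 1,  1,  1]
print(round(auroc(scores, outcomes), 3))  # -> 0.938
```

This pairwise form is O(n^2) but transparent; rank-sum implementations give the same value faster.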

Conclusions The specific rules PORT, CURB-65, CRB-65 and SMART-CO and the SOFA score are comparably informative and valuable in predicting short-term mortality in severe CAP. The APACHE II score is of low specificity and cannot be used to predict outcomes.

Effect of age on resolution of ventilator-associated pneumonia

M Sartzi, G Kalitsi, A Stogianidi, M Charitidi, K Michas, M Komboti, F Tsidemiadou, P Clouva

General Hospital of Eleusis, Athens, Greece

Critical Care 2008, 12(Suppl 2):P45 (doi: 10.1186/cc6266)

Introduction To study the clinical and paraclinical response to therapy in patients with ventilator-associated pneumonia (VAP). Methods A prospective, 4-year study of 450 patients ventilated over 48 hours. Patients were compared according to four age groups: Group I (<35 years), Group II (35-55 years), Group III (55-79 years) and Group IV (>80 years). Patient and VAP characteristics, ICU and hospital lengths of stay (LOSs), duration of mechanical ventilation (MV)-intubation (TT), patient outcome and the first day of normalization of each clinical and paraclinical parameter were studied using Pearson's chi-square test and one-way ANOVA.

Results One hundred and thirty-four patients developed VAP. Twenty-five (18.6%) patients of Group I had (mean ± SD) APACHE II score 16.5 ± 3.6, MV 19.08 ± 7.8, TT 21.7 ± 8.4, ICU LOS 25.04 ± 11.5 days, hospital LOS 18.4 ± 10.3 days, ICU mortality 3 (12%), and hospital mortality 0 (0%). Thirty-eight (28.4%) patients of Group II had APACHE II score 18.5 ± 5.7, MV 24.08 ± 10.8, TT 29.7 ± 10.6, ICU LOS 33.04 ± 12.5 days, hospital LOS 28.4 ± 11.3 days, ICU mortality 5 (20.8%), and hospital mortality 6 (21.4%). Sixty (44.8%) patients of Group III had APACHE II score 21.4 ± 5.6, MV 31.1 ± 16.5, TT 32.8 ± 16.8, ICU LOS 37.04 ± 13.8 days, hospital LOS 34.2 ± 14.1 days, ICU mortality 15 (25%), and hospital mortality 16 (26.6%). Ten (7.5%) patients of Group IV had APACHE II score 26.3 ± 4.5, MV 41.6 ± 10.5, TT 45.8 ± 14.7, ICU LOS 49.04 ± 15.2 days, hospital LOS 47.6 ± 18.7 days, ICU mortality 3 (30%), and hospital mortality 5 (50%). The APACHE II score (P < 0.001), duration of MV (P < 0.02) and TT (P < 0.04), hospital mortality (P < 0.001), hospital LOS (P < 0.01), and MODS (P < 0.04) differed statistically significantly. ICU mortality (P < 0.4), CPIS (P < 0.7), and duration of antibiotic treatment (P < 0.6) did not differ significantly. VAP was caused by MDR Gram(-) microorganisms, except for three cases caused by MRSA (P < 0.8). Time to resolution was 7 days for temperature (6.6 ± 1.1, P < 0.1), 6 days for leucocyte count (5.8 ± 1.2, P < 0.5), 5 days for hemodynamic stability (4.8 ± 0.8, P < 0.3), 4 days for normalization of PaO2/FiO2 (3.9 ± 0.7, P < 0.07), and 10 days for microbiological eradication (9.3 ± 1.1, P < 0.3). Colonization after VAP resolution was higher in the elderly patients (P < 0.02).

Conclusions Age does not influence the clinical response to therapy. Patients in whom the tracheobronchial aspirates were not sterilized after the resolution of VAP are at higher risk of a longer time of hospitalization and of dying after discharge from the ICU.

An endotracheal tube with a polyurethane cuff and subglottic secretion drainage reduces the incidence of primary and secondary endogenous ventilator-associated pneumonia

L Lorente Ramos, M Lecuona, A Jiménez, J Iribarren, J Castedo, R Galván, C García, J Jiménez, M Brouard, M Mora, A Sierra

Hospital Universitario de Canarias, La Laguna, Tenerife, Spain

Critical Care 2008, 12(Suppl 2):P46 (doi: 10.1186/cc6267)

Introduction To compare the incidence of ventilator-associated pneumonia (VAP) using an endotracheal tube with a polyurethane cuff, designed to reduce channel formation, and subglottic secretion drainage (ETT-PUC-SSD) versus a conventional endotracheal tube (ETT-C) with a polyvinyl cuff and without subglottic secretion drainage.

Methods A clinical randomized trial, conducted between 1 March 2006 and 31 October 2006 in a medical-surgical ICU. Patients requiring mechanical ventilation for more than 24 hours were included. Patients were randomized into two groups: one group was ventilated with ETT-PUC-SSD and the other with ETT-C. Throat swabs and tracheal aspirates were taken at the moment of admission and twice a week until discharge. The infections were classified based on throat flora as: primary endogenous, caused by germs already colonizing the throat on ICU admission; secondary endogenous, caused by germs not colonizing the throat on ICU admission but acquired during the ICU stay; and exogenous, caused by germs that were not colonizing the throat.

Results There were no significant differences between the two groups of patients in age, sex, diagnosis groups, APACHE II score, pre-VAP use of antibiotics, paralytic agents, reintubation, tracheostomy, and days on mechanical ventilation. VAP was found in 31 of 140 (22.1%) patients in the ETT-C group and in 11 of 140 (7.9%) in the ETT-PUC-SSD group (P = 0.001). Cox regression analysis showed the conventional tube (ETT-C) to be a risk factor for global VAP (hazard rate = 3.3, 95% CI = 1.66-6.67, P = 0.001), primary endogenous VAP (hazard rate = 5.12, 95% CI = 1.12-23.38, P = 0.04) and secondary endogenous VAP (hazard rate = 2.87, 95% CI = 1.20-6.84, P = 0.02), but not for exogenous VAP. Conclusions The endotracheal tube with a polyurethane cuff and subglottic secretion drainage is effective in preventing primary endogenous and secondary endogenous VAP.

Effect of continuous aspiration of subglottic secretions on the prevention of ventilator-associated pneumonia in mechanically ventilated patients

C Yang, H Qiu, Y Zhu

Zhong-da Hospital, Nanjing, China

Critical Care 2008, 12(Suppl 2):P47 (doi: 10.1186/cc6268)

Introduction The objective was to evaluate the effect of continuous aspiration of subglottic secretions (CASS) on the prevention of ventilator-associated pneumonia (VAP) in mechanically ventilated patients.

Methods Patients ventilated mechanically in the ICU from October 2004 to April 2006 were randomly divided into two groups: one group of patients was treated with CASS and the other group was not (NASS group). CASS was started immediately after admission in the CASS group. The diagnosis of VAP was made based on the clinical presentation, and the evaluation of VAP was done using the simplified version of the Clinical Pulmonary Infection Score. The general status of the patients, days of ventilator treatment, the volume of daily aspirated subglottic secretions, the morbidity and timing of VAP, days of stay in the ICU and mortality within 28 days of hospitalization were recorded.

Results One hundred and one patients were included in the study. There were 48 patients in the CASS group who were treated with mechanical ventilation for longer than 48 hours, and 43 patients in the NASS group. The median volume of aspirated subglottic secretions within the first 24 hours in the CASS group (48 cases) was 28.8 ml. The morbidity of VAP in the CASS and NASS groups was 25.0% and 46.5%, respectively (P = 0.032), and the length of time before the onset of VAP in these two groups was 7.3 ± 4.2 days and 5.1 ± 3.0 days, respectively (P = 0.088). There was a significant increase in the percentage of Gram-positive cocci from the lower respiratory tracts in the NASS group compared with the CASS group (P = 0.004). In the CASS group, the volume of the first daily aspirated subglottic secretions in patients with VAP was significantly less than that in patients without VAP (P = 0.006). The morbidity of VAP in patients with failed early aspiration (volume of first daily aspirated secretions <20 ml) was significantly higher than in patients who were aspirated effectively (P < 0.01). The length of mechanical ventilation in patients with VAP was significantly longer than that in patients without VAP (P < 0.001). The inhospital mortality in patients with VAP was significantly higher than that in patients without VAP (P = 0.009), and mortality in the 28 days after admission in patients with VAP was significantly higher than that in patients without VAP (P = 0.035).

Conclusions Effective continuous aspiration of subglottic secretions could significantly reduce the morbidity of early-onset VAP, and, accordingly, may decrease the mortality of critically ill patients.

Effect of an oral care protocol in preventing ventilator-associated pneumonia in ICU patients

L Yao1, C Chang2, C Wang2, C Chen1

1National Taiwan University, School of Nursing, Taipei, Taiwan;

2Mackay Memorial Hospital, Taipei, Taiwan

Critical Care 2008, 12(Suppl 2):P48 (doi: 10.1186/cc6269)

Introduction Ventilator-associated pneumonia (VAP) remains a major complication for intubated patients on ventilators. Most cases are attributed to increased bacterial flora in the oropharyngeal secretions and aspiration of those organisms. Evidence suggests that oral care could reduce bacterial flora, prevent aspiration, and subsequently decrease the incidence of VAP in ventilated patients. This study aims to evaluate the effectiveness of a standardized oral care protocol in improving oral hygiene and reducing the incidence of VAP in a sample of surgical patients in the ICU (SICU).

Methods Patients newly admitted to the SICU who were under ventilator support for 48-72 hours and without pneumonia present were enrolled during March-November 2007 from a tertiary medical center in Taiwan. Subjects were randomized into the experimental or control groups and both received a 7-day oral care protocol. For the experimental group (EG), a standardized 20-minute oral care protocol was performed using an electronic toothbrush to clean and moisturize the oral cavity twice daily. For the control group (CG), a mimic 20-minute protocol involving moisturizing and attention control was performed at the same intervals. The incidence of VAP, defined by the Clinical Pulmonary Infection Score, and oral hygiene, measured by the Oral Assessment Guide (OAG) and plaque index, were compared between the two groups. Variables were compared using the Fisher exact test, chi-square test, and Mann-Whitney U test. P < 0.05 was considered significant.

Results Forty-four patients were studied, with a mean age of 60.6 ± 16.1 years; 63.6% were male. The cumulative incidence of VAP was significantly lower in the EG, with 22.7% occurrence in the EG versus 77.8% in the CG on day 9 (P < 0.05). In terms of oral hygiene, subjects in the EG performed significantly better on both OAG scores and plaque index. Specifically, the OAG decreased from 16.3 ± 1.9 to 14.9 ± 2.6 in the EG but remained high, from 16.5 ± 1.6 to 16.6 ± 2.1, in the CG (P < 0.05). The plaque index decreased from 0.76 ± 0.14 to 0.49 ± 0.18 in the EG and remained high, from 0.74 ± 0.13 to 0.75 ± 0.21, in the CG (P < 0.05).

Conclusions The findings support the effectiveness of an oral care protocol in preventing VAP and improving oral hygiene for patients admitted to the SICU with ventilator support.

Soluble triggering receptor expressed on myeloid cells-1 in bronchoalveolar lavage is not predictive for ventilator-associated pneumonia

G Oudhuis, J Beuving, J Zwaveling, D Bergmans, EE Stobberingh, G Ten Velde, C Linssen, A Verbon

University Hospital Maastricht, The Netherlands

Critical Care 2008, 12(Suppl 2):P49 (doi: 10.1186/cc6270)

Introduction The aim of the study was to evaluate the usefulness of soluble triggering receptor expressed on myeloid cells-1 (sTREM-1) as a rapid diagnostic test for ventilator-associated pneumonia (VAP). To develop a rapid diagnostic test for VAP, a common complication of mechanical ventilation [1], multiple biomarkers have been evaluated with variable results. sTREM-1 proved to be a good biomarker for sepsis [2]. For the diagnosis of VAP, however, there have only been a few, relatively small, studies on the role of this receptor [3].

Methods Retrospectively, 240 bronchoalveolar lavage fluid (BALF) samples, taken from patients in the ICU of a university hospital, were tested. sTREM-1 in BALF was measured using a quantitative sandwich enzyme immunoassay. Two researchers, unaware of the results of the assay, determined whether a VAP was present. Clinical suspicion of VAP was confirmed by the presence of >2% cells containing intracellular organisms and/or a quantitative culture result of >10^4 colony-forming units/ml in BALF. The disease had to be acquired after at least 48 hours of mechanical ventilation.

Results The mean concentration of sTREM-1 was significantly higher in BALF of patients with confirmed VAP compared with patients without VAP (P = 0.045). The area under the curve was 0.577 (95% CI = 0.503-0.651, P = 0.042). sTREM-1 levels in our hands proved not to be discriminative for VAP. Choosing a sensitivity of 95% resulted in a positive predictive value (PPV) of 41% and a negative predictive value (NPV) of 62% in our population. Taking a specificity of 95% led to a PPV of 67% and an NPV of 62%. sTREM-1 levels were not different in VAP cases caused by Gram-positive or Gram-negative bacteria. sTREM-1 levels were higher in nonsurvivors than in survivors with regard to inhospital mortality.
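PPV and NPV at a fixed sensitivity and specificity depend on the prevalence of confirmed VAP in the sample, which the abstract does not state. A sketch of the Bayes-rule relationship (the function name and all numbers are illustrative only and will not reproduce the study's figures):

```python
def predictive_values(sensitivity, specificity, prevalence):
    """PPV and NPV from test characteristics and pre-test prevalence (Bayes' rule)."""
    ppv = (sensitivity * prevalence) / (
        sensitivity * prevalence + (1 - specificity) * (1 - prevalence))
    npv = (specificity * (1 - prevalence)) / (
        specificity * (1 - prevalence) + (1 - sensitivity) * prevalence)
    return ppv, npv

# Illustrative inputs: sensitivity fixed at 95% as in the abstract, with an
# assumed specificity of 30% and assumed VAP prevalence of 40%.
ppv, npv = predictive_values(sensitivity=0.95, specificity=0.30, prevalence=0.40)
print(f"PPV = {ppv:.3f}, NPV = {npv:.3f}")
```

The low PPV at high sensitivity illustrates why a weakly discriminating assay (AUROC 0.577) cannot rule VAP in or out reliably.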

Conclusions The results imply that the sTREM-1 assay in BALF is not discriminative for VAP.

References


1. Jackson WL, et al.: Curr Opin Anaesthesiol 2006, 19:117-121.

2. Bouchon A, et al.: Nature 2001, 410:1103-1107.

3. Gibot S, et al.: N Engl J Med 2004, 350:451-458.

Ventilator-associated pneumonia in an ICU: epidemiology, etiology and comparison of two bronchoscopic methods for microbiological specimen sampling

B Kostadinovska-Jordanoska, M Sosolceva, S Stojanovska, M Licenovska, O Vasileva

City Hospital-Skopje, Skopje, Macedonia

Critical Care 2008, 12(Suppl 2):P50 (doi: 10.1186/cc6271)

Introduction Ventilator-associated pneumonia (VAP) is the most important ICU-acquired infection in mechanically ventilated patients, appearing 48-72 hours after the beginning of mechanical ventilation. The aim of this study was to evaluate the incidence and microbiology of VAP and to compare two quantitative bronchoscopic methods for its diagnosis - bronchoalveolar lavage (BAL) and bronchoscopic tracheobronchial secretion sampling (TBS).

Methods The epidemiology and microbiological etiology of VAP in a surgical ICU with 65 patients during a 1-year period were evaluated in this prospective, open clinical study. The patients were divided into two groups: group I, 30 patients with mechanical ventilation longer than 48 hours who developed VAP (the case group); and group II, 35 patients with mechanical ventilation longer than 48 hours without VAP (the control group). Two types of quantitative bronchoscopic methods for identifying the etiological agent were compared (BAL and TBS).

Results Among the 65 long-term ventilated patients, 30 developed one or more episodes of VAP. VAP was caused predominantly by MRSA (35%), Pseudomonas aeruginosa (28%), Klebsiella sp. (14%), and Acinetobacter sp. (14%). Patients with VAP required longer treatment and longer ventilation in the ICU than patients without VAP. We did not find an increased mortality rate in long-term ventilated patients who acquired VAP compared with long-term ventilated patients without VAP.

Conclusions The study showed that quantitative analysis to identify the bacterial etiology of VAP with either of the two accessible bronchoscopic methods (BAL and TBS) produced identical results.

Reference

1. Vincent JL, Bihari DJ, Suter PM, et al.: The prevalence of nosocomial infection in intensive care units in Europe. Results of the European Prevalence of Infection in Intensive Care (EPIC) Study. JAMA 1995, 274:639-644.

Presence of human metapneumovirus in bronchoalveolar lavage fluid samples detected by means of real-time polymerase chain reaction

C Linssen, P Wolffs, B Wulms, M Drent, L Span, W van Mook

University Hospital Maastricht, The Netherlands

Critical Care 2008, 12(Suppl 2):P51 (doi: 10.1186/cc6272)

Introduction Human metapneumovirus (hMPV) is a paramyxovirus causing symptoms of respiratory tract infection comparable with those of respiratory syncytial virus. The virus can play a causative role in respiratory tract infection in infants, the elderly and immunocompromised patients. Analysis of bronchoalveolar lavage fluid (BALF) samples obtained from patients with hematological malignancies suspected of pneumonia often does not result in the identification of a causative infectious organism. To investigate the potential role of hMPV, we analysed BALF samples of these patients for the presence of hMPV by means of real-time PCR. Methods The study was conducted in the ICU and the hematology ward of the University Hospital Maastricht. All consecutive BALF samples obtained in the period April 1999-June 2006 from patients with a hematological malignancy suspected of pulmonary infection were eligible for inclusion. Data on the BALF total cell count, differential cell count, quantitative bacterial culture and detection of viruses, mycobacteria and fungi were noted. All samples were analyzed by real-time RT-PCR targeting the nucleoprotein gene of hMPV.

Results A total of 117 BALF samples from 95 patients (82 patients from the hematology ward, 15 ICU patients) were included. RNA of hMPV was detected in seven out of 117 (6%) BALF samples from five patients (three patients from the hematology ward, two ICU patients). In two of the five hMPV-positive patients, the underlying disease was non-Hodgkin lymphoma; the other three patients suffered from multiple myeloma, myelodysplastic syndrome and mantle cell lymphoma. In one patient, four BALF samples were retrieved within 1 month. The first three BALF samples were hMPV PCR-positive; the fourth (collected 1 month after the first) was PCR-negative. No other infectious agents were detected in the hMPV-positive BALF samples. Neither the total cell count nor the differential cell count differed significantly between the hMPV-positive and hMPV-negative groups. Conclusions In 6% of BALF samples collected from adult patients with a hematological malignancy suspected of a pulmonary infection, hMPV RNA was detected whereas no other infectious agents were found. hMPV may thus be considered the causative agent of pulmonary infection in patients with a hematological malignancy when analysis for other infectious agents is negative.

Prebiotic, probiotic and synbiotic usage and gastrointestinal and trachea colonization in mechanically ventilated patients

CR Shinotsuka, MR Alexandre, CM David

UFRJ, Rio de Janeiro, Brazil

Critical Care 2008, 12(Suppl 2):P52 (doi: 10.1186/cc6273)

Introduction Sepsis and its complications are the main causes of death in the ICU. New preventive measures for nosocomial infections, such as probiotic administration, have been investigated as alternatives to antibiotic therapy.

Methods This clinical, randomized trial evaluated 49 patients who were admitted to the ICU of Hospital Universitario Clementino Fraga Filho and were mechanically ventilated. The patients were randomized into one of four groups: control (n = 16), prebiotic (n = 10), probiotic (n = 12) or synbiotic (n = 11). Enteral nutrition, fibers, and lactobacillus were administered for 14 days. Colonization of the gastrointestinal tract and trachea, and the incidence of nosocomial infections, particularly ventilator-associated pneumonia, were measured. Other outcomes measured included duration of mechanical ventilation, length of stay in the ICU, duration of hospitalization, mortality rates, and development of organ dysfunction. Results The groups were matched at admission. There was no difference between the groups in relation to the incidence of ventilator-associated pneumonia or the incidence of nosocomial infection. There was a nonsignificant increase in the proportion of enterobacteria in the trachea at the seventh day in the prebiotic and probiotic groups compared with the control group. There was a nonsignificant decrease in the number of bacteria found in the stomach in the prebiotic, probiotic and synbiotic groups at day 7. No significant difference with regard to the remaining measured parameters could be found.

Conclusions Prebiotic, probiotic and synbiotic usage had no effect on the colonization of the gastrointestinal tract and trachea of mechanically ventilated patients. Reference

1. Bengmark S: Bioecologic control of the gastrointestinal tract: the role of flora and supplemented probiotics and synbiotics. Gastroenterol Clin North Am 2005, 34:413-436.

Comparison of surveillance with quantitative and nonquantitative ETA cultures in predicting ventilator-associated pneumonia etiology in patients receiving antibiotic therapy

G Gursel, M Aydogdu, K Hizel, T Ozis

Gazi University School of Medicine, Ankara, Turkey
Critical Care 2008, 12(Suppl 2):P53 (doi: 10.1186/cc6274)

Introduction It is not yet clear whether surveillance of lower respiratory tract secretions should be performed routinely and which method should be used for this. The aim of the present study is to investigate the value of quantitative (QC-ETA) and nonquantitative (NQC-ETA) surveillance cultures in predicting the causative pathogen of ventilator-associated pneumonia (VAP) in patients receiving antibiotic therapy.

Methods A prospective, observational cohort study was carried out in the medical ICU of a tertiary hospital. One hundred and nine ICU patients receiving mechanical ventilation for at least 4 days were included in the study.

Results Concordance and discordance of causative pathogens of VAP with prior quantitative and nonquantitative surveillance cultures were assessed. Tracheal surveillance cultures were obtained routinely at the time of intubation and thrice weekly. Each sample was processed nonquantitatively and quantitatively (thresholds of 10³ and 10⁵ cfu/ml). Diagnosis of VAP was made with microbiologically confirmed clinical criteria (CPIS > 6 and growth > 10⁵ cfu/ml in ETA). Sixty-eight VAP episodes developed during this period. Sensitivity (63%, 28%), specificity (78%, 85%), positive predictive value (82%, 76%), negative predictive value (56%, 41%), false positive (22%, 15%) and false negative (37%, 72%) results of the NQC-ETA and QC-ETA were calculated, respectively. NQC-ETA and QC-ETA predicted the causative pathogens 3.3 (2.7) days and 2.5 (1.7) days prior to the development of VAP episodes, respectively.
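As an illustration, the accuracy measures reported above follow from a standard 2×2 contingency table. The cell counts below are hypothetical (the abstract reports only the derived percentages), so this is a sketch of the calculation rather than a reconstruction of the study data:

```python
# Diagnostic-accuracy measures from a 2x2 contingency table.
# The cell counts below are hypothetical: the abstract reports only
# the derived percentages, not the underlying table.

def diagnostic_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV and NPV as fractions."""
    sensitivity = tp / (tp + fn)   # pathogens detected among confirmed VAP
    specificity = tn / (tn + fp)   # correctly negative among non-pathogens
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# Hypothetical counts for a surveillance culture vs confirmed VAP:
sens, spec, ppv, npv = diagnostic_metrics(tp=43, fp=9, fn=25, tn=32)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}, "
      f"PPV {ppv:.0%}, NPV {npv:.0%}")
```

With a positive surveillance culture as the test and microbiologically confirmed VAP as the reference standard, sensitivity and specificity describe detection performance, while the predictive values additionally depend on VAP prevalence.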

Conclusions The results of this study suggest that NQC-ETA would be an acceptable tool in surveillance for and predicting the causative pathogen of VAP developing in patients who have already received antibiotic therapy.

Errors regarding specific preventive measures of ventilator-associated pneumonia in the ICU

A Gavala, X Kontou, K Bella, E Evodia, M Papa, P Myrianthefs, G Baltopoulos

KAT, Kifisia, Greece

Critical Care 2008, 12(Suppl 2):P54 (doi: 10.1186/cc6275)

Introduction Prevention of aspiration-induced ventilator-associated pneumonia (VAP) includes, among other factors, head elevation, appropriate cuff pressure and correct positioning of nasogastric tubes. The purpose of the study was to identify errors regarding these prevention measures in critically ill ICU patients. Methods We included all mechanically ventilated patients hospitalized in our seven-bed ICU. We prospectively collected the demographics and positioning of the patients (degrees), cuff pressures (mmHg) and correct position of the nasogastric tubes in the stomach at 08:00 every day for six consecutive months. Results We included 37 patients (25 males) of mean age 66.9 ± 3.3 years and illness severity scores of SAPS II 53.05 ± 1.5 and SOFA 7.2 ± 0.3. In total we had 267 observations. The mean cuff pressure was 26.8 ± 0.9 mmHg. The mean slope of the patients' bed was 29.5 ± 0.4°. The mean volume of the oropharyngeal aspirates was 9.9 ± 0.4 ml and of tracheal aspirates was 7.1 ± 0.2 ml. In 24 and 69 observations, tracheal and oropharyngeal aspirates, respectively, were >10 ml. In 109/267 (40.1%) observations, the slope of the patients was <30°. All patients had at least one positioning with a slope <30°. In 64/267 (23.9%) observations, the cuff pressures were <20.0 mmHg. One-half of the patients had at least one measurement <20.0 mmHg. In 10 cases, the end of the nasogastric tube was in the esophagus and in five cases it was obstructed. Twenty patients developed VAP (20/37, 54.1%). Patients with a large amount of oropharyngeal aspirates (>10 ml) and low cuff pressures (<20 mmHg) had a significantly higher incidence of subsequent VAP (72.2% vs 36.8%) compared with those with a low amount of oropharyngeal aspirates (<5 ml) and normal cuff pressures (chi-square test, P = 0.049; RR = 1.960, CI = 1.018-3.774; OR = 4.457, CI = 1.110-17.90). Conclusions Errors regarding specific prevention measures of VAP are frequently observed. Our data also show the significance of the amount of oropharyngeal aspirates and cuff pressures for the subsequent development of VAP. The tightness of the stomach-oropharyngeal-tracheal axis seems to be a significant factor influencing the subsequent development of VAP. Reference
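The reported risk estimates can be checked from the implied 2×2 table: the proportions 72.2% and 36.8% correspond to 13/18 exposed and 7/19 unexposed patients (summing to the 37 patients studied), and these counts closely reproduce the published relative risk and odds ratio. A minimal sketch using Wald confidence intervals on the log scale:

```python
import math

# Relative risk and odds ratio with Wald 95% CIs from a 2x2 table.
# Rows: exposed (large aspirates + low cuff pressure) / unexposed.
# Columns: VAP+ / VAP-. The counts 13/18 and 7/19 are implied by the
# reported proportions (72.2% vs 36.8%) and the 37 patients studied.

def risk_estimates(a, b, c, d):
    """a,b = exposed VAP+/VAP-; c,d = unexposed VAP+/VAP-."""
    rr = (a / (a + b)) / (c / (c + d))
    odds_ratio = (a * d) / (b * c)
    se_log_rr = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)

    def wald_ci(est, se):
        return est * math.exp(-1.96 * se), est * math.exp(1.96 * se)

    return rr, wald_ci(rr, se_log_rr), odds_ratio, wald_ci(odds_ratio, se_log_or)

rr, rr_ci, or_, or_ci = risk_estimates(13, 5, 7, 12)
print(f"RR = {rr:.2f} ({rr_ci[0]:.2f}-{rr_ci[1]:.2f}), "
      f"OR = {or_:.2f} ({or_ci[0]:.2f}-{or_ci[1]:.2f})")
```

These counts yield RR ≈ 1.96 (CI ≈ 1.02-3.77) and OR ≈ 4.46 (CI ≈ 1.11-17.90), matching the abstract to rounding.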

1. Chastre J: Respir Care 2005, 50:975-983.

Clinical predictors for septic shock in patients with ventilator-associated pneumonia

G Gursel, M Aydogdu

Gazi University School of Medicine, Ankara, Turkey
Critical Care 2008, 12(Suppl 2):P55 (doi: 10.1186/cc6276)

Introduction Ventilator-associated pneumonia (VAP) is one of the most frequent infections of ICUs, and nearly 50% of patients develop septic shock during VAP. Septic shock is an independent predictor for mortality in these patients. The aim of this study is to investigate the predictors for septic shock in patients with VAP receiving appropriate antibiotic therapy.

Methods Eighty-nine patients with microbiologically confirmed VAP and receiving appropriate antibiotic therapy were included in the study. The patients were divided into two groups according to the existence of septic shock. Clinical, hematological, biochemical and microbiological characteristics of the patients were compared. Results Thirty-one percent of the patients developed septic shock, and advanced age (OR = 1.07, 95% CI = 1.02-1.13, P = 0.009), lymphocytopenia <1,000/mm³ (OR = 7.48, 95% CI = 1.91-29, P = 0.004), high blood glucose levels >120 mg/dl (OR = 4.75, 95% CI = 1.38-16, P = 0.014), and higher Clinical Pulmonary Infection Scores (OR = 1.64, 95% CI = 1.16-2.33, P = 0.006) were independent predictors for the development of septic shock. Conclusions Some clinical parameters such as lymphocytopenia, advanced age, higher blood glucose levels and Clinical Pulmonary Infection Scores can predict septic shock during VAP but large randomized controlled studies are needed to confirm these results.
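Odds ratios with confidence intervals of this kind are obtained by exponentiating logistic-regression coefficients. A minimal sketch; the coefficient and standard error below are back-calculated for illustration only (they approximately reproduce the reported lymphocytopenia OR of 7.48, 95% CI 1.91-29) and are not the study's fitted values:

```python
import math

# Odds ratio and Wald 95% CI from a logistic-regression coefficient:
# OR = exp(beta), CI = exp(beta +/- 1.96 * SE(beta)).
# beta and se are illustrative back-calculations, not fitted values.

def odds_ratio_ci(beta, se, z=1.96):
    """Return (OR, (lower, upper)) for coefficient beta with standard error se."""
    return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))

or_, (lo, hi) = odds_ratio_ci(beta=2.012, se=0.693)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```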

Antibiotic-related acute effects within the intestinal microcirculation in experimental sepsis

CH Lehmann1, V Bac1, J Richter2, M Gründling2, M Wendt2, S Whynot1, O Hung1, M Murphy1, T Issekutz1, D Pavlovic2

1Dalhousie University, Halifax, NS, Canada; 2Ernst-Moritz-Arndt-Universität Greifswald, Germany
Critical Care 2008, 12(Suppl 2):P56 (doi: 10.1186/cc6277)

Introduction Although the benefit of antibiotic therapy in infectious inflammatory diseases remains unquestioned [1], little is known about the effects of antibiotics on the inflamed microcirculation independent of their antimicrobial properties. The aim of this study was to evaluate the acute effects of antibiotic administration on the intestinal microcirculation, which plays a crucial role in the pathogenesis of sepsis and subsequent multiorgan failure [2,3].

Methods Experimental sepsis was induced in 50 Lewis rats using the colon ascendens stent peritonitis model [4]. Four frequently used antibiotics were included in the study (20 mg/kg imipenem/cilastatin (IMI), 25 mg/kg tobramycin (TOB), 70 mg/kg vancomycin (VAN), 5 mg/kg erythromycin (ERY)). The antibiotics were administered as a single intravenous bolus following 16 hours of observation time. The intestinal functional capillary density and leukocyte-endothelial interactions were evaluated using intravital microscopy 1 hour following antibiotic treatment. Additional experiments were performed in an abacterial setting with comparable microcirculatory disturbances (2 hours of endotoxemia; n = 50). Results Acute IMI or TOB administration did not affect the intestinal microcirculation. VAN treatment aggravated the leukocyte rolling behavior in this acute setting. In contrast, ERY administration significantly reduced leukocyte activation and improved the functional capillary density within the intestinal microcirculation during experimental sepsis. These effects could be confirmed during endotoxemia, suggesting that ERY exerts anti-inflammatory effects in addition to its antibacterial action. Conclusions When choosing antimicrobial agents in septic conditions, possible effects of the antibiotics within the pathogenetically important intestinal microcirculation should be considered. In conjunction with microbial sensitivity tests, the results of such studies assist in selecting the appropriate antibiotic therapy. References

1. Nobre V, et al.: Curr Opin Crit Care 2007, 13:586-591.

2. Suliburk J, et al.: Eur Surg Res 2007, 40:184-118.

3. Clark JA, et al.: Shock 2007, 28:384-393.

4. Lustig MK, et al.: Shock 2007, 28:59-64.

Intraperitoneal lipopolysaccharide-induced neutrophil sequestration in the lung microvasculature is due to a factor produced in the peritoneal cavity

S Mullaly, S Johnson, P Kubes

University of Calgary, Calgary, AB, Canada

Critical Care 2008, 12(Suppl 2):P57 (doi: 10.1186/cc6278)

Introduction Intraperitoneal injection of Gram-negative lipopolysaccharide (LPS) results in a profound decrease in circulating leukocyte counts with a significant increase in neutrophil sequestration in the lung microvasculature as measured by lung myeloperoxidase levels. Mice made deficient in toll-like receptor 4 (TLR4) are resistant to LPS challenges. Furthermore, it has been demonstrated that the systemic effects of LPS are due to parenchymal, possibly endothelial, cells and not due to bone marrow-derived cells, such as leukocytes.

Methods C57B/6 mice were anesthetized and sacrificed. Peritoneal lavage using 3 ml PBS followed by a gentle abdominal massage for 1 minute was performed. Lavage fluid was aspirated and centrifuged to concentrate the peritoneal cells. Cells were then treated with normal saline or 10 μg LPS. The treated peritoneal cells were then injected into the peritoneal cavities of TLR4-deficient mice (C57B/10ScNJ) for a duration of 4 hours. Measured outcomes included circulating leukocyte counts and lung myeloperoxidase (MPO) levels.

Results Normal saline-treated control C57B/6 mice had circulating counts of 6.38 ± 0.56 million cells/ml, and the MPO levels in normal saline-treated peritoneal cells transferred into C57B/6 mice were 5.80 ± 0.73 units. Normal saline-treated peritoneal cells of C57B/6 mice injected into TLR4-deficient mice resulted in circulating counts of 5.18 ± 0.22 million cells/ml and MPO levels of 4.23 ± 0.73 units. LPS-treated control C57B/6 mice had circulating counts of 1.94 ± 0.22 million cells/ml, and the MPO levels in LPS-treated peritoneal cells transferred into C57B/6 mice were 16.52 ± 4.3 units. LPS-treated peritoneal cells of C57B/6 mice injected into TLR4-deficient mice resulted in circulating counts of 2.48 ± 0.44 million cells/ml and MPO levels of 12.06 ± 1.74 units. The liver endothelium was the only organ activated in this model. Furthermore, this process of LPS-treated, peritoneal cell-induced neutrophil sequestration in the lung microvasculature was found to be independent of mast cells and NKT cells.

Conclusions An as-yet-unidentified factor, or factors, produced by the peritoneal lavage cells can induce neutrophil sequestration in the lung microvasculature. Reference

1. Andonegui G, et al.: J Clin Invest 2003, 111:1011-1020.

Correlation between microcirculatory flow, density and heterogeneity scores in septic shock patients

C Ruiz, G Izquierdo, R Lopez, M Andresen, G Hernandez, A Bruhn

Pontificia Universidad Católica de Chile, Santiago, Chile
Critical Care 2008, 12(Suppl 2):P58 (doi: 10.1186/cc6279)

Introduction A recent conference recommended that microcirculatory images should be analyzed with several scores that evaluate density, flow and flow heterogeneity [1]. Following these recommendations, we analyzed sublingual microcirculatory images from septic shock patients to determine how these different scores were correlated with each other and with clinical hemodynamic and perfusion parameters.

Methods Using sidestream dark-field videomicroscopy (Microscan®; Microvision Medical) we repeatedly evaluated the sublingual microcirculation at different time points in septic shock patients. In total, we performed 17 microcirculatory single-time-point assessments (3-6 site images/time point), in parallel with hemodynamic and perfusion measurements (mean arterial pressure (MAP), noradrenaline dose (NA), cardiac index (CI), mixed venous O2 saturation (SmvO2), arterial lactate). Images were analyzed by semiquantitative scores of flow (mean flow index (MFI) and proportion of perfused vessels (PPV)) and density (perfused vascular density (PVD)) of small vessels (<20 μm). Heterogeneity indexes (Het Index = (maximum - minimum) / mean) were calculated for the MFI and PPV. Correlations between parameters were determined by the Pearson coefficient and considered significant if P < 0.05. Results We found that PVD was correlated with PPV (r = 0.55), and negatively with Het Index PVD (r = -0.54) and Het Index PPV (r = -0.43), but we found no correlation of PVD with any hemodynamic or perfusion parameter. Flow indexes (PPV and MFI) were strongly correlated with each other (r = 0.81) and inversely with their respective heterogeneity indexes (PPV and Het Index PPV, r = -0.88; MFI and Het Index MFI, r = -0.83). In addition, PPV and MFI were correlated with SmvO2 (r = 0.44 and 0.52) and CI (r = 0.49 and 0.47), and inversely with lactate levels (r = -0.46 and -0.4). Only the MFI was correlated with MAP (r = 0.5). Heterogeneity indexes were correlated with lactate (r = 0.40 for PPV and r = 0.44 for MFI), and inversely with MAP (r = -0.40 for Het Index PPV and r = -0.64 for Het Index MFI). The Het Index MFI was also correlated with NA (r = 0.5).
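The scores above are simple summary statistics. A minimal sketch of the heterogeneity index as defined here, (maximum - minimum) / mean across the imaged sites, and of the Pearson coefficient used for the correlations; the site values are illustrative, not study data:

```python
import math

# Heterogeneity index as defined in the abstract:
# Het Index = (maximum - minimum) / mean, computed across the
# 3-6 sublingual sites imaged at one time point.

def het_index(values):
    mean = sum(values) / len(values)
    return (max(values) - min(values)) / mean

# Pearson correlation coefficient used to relate the scores:
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

mfi_sites = [2.1, 2.8, 1.9, 2.5, 2.4]   # hypothetical MFI per site
print(f"Het Index MFI = {het_index(mfi_sites):.2f}")
```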

Conclusions Higher microcirculatory flow scores, but not density, are associated with higher CI and better systemic perfusion parameters. Both MFI and PPV seem equally effective to assess microcirculatory flow. Reference

1. De Backer D, et al.: Crit Care 2007, 11:R101.

Effect of iloprost on the microcirculation and liver function after orthotopic liver transplantation

S Klinzing, C Stumme, K Albin, O Habrecht, U Settmacher, G Marx

Friedrich Schiller University, Jena, Germany

Critical Care 2008, 12(Suppl 2):P59 (doi: 10.1186/cc6280)

Introduction Iloprost, a prostacyclin analogue, can probably improve the microcirculation in various tissues. It may have a positive effect on the perfusion and function of the liver, especially in patients after liver transplantation. We therefore estimated liver function with the indocyanine green (ICG) plasma disappearance rate (PDR). We also evaluated the effects of iloprost on the microcirculation of the oral mucosal tissue.

Methods Sixteen patients after orthotopic liver transplantation were randomly included in the study. They received either iloprost (n = 9) or placebo (n = 7). Iloprost was given at a dose of 1 ng/kg/min. The oral mucosal tissue oxygen saturation, microcirculatory blood flow and blood flow velocity were measured at a depth of one with a laser Doppler flowmetry and remission spectroscopy system (O2C; LEA, Gießen, Germany). The ICG-PDR was determined with the LIMON (Pulsion, München, Germany). All measurements were performed 6, 12, 24, 48, 72 and 96 hours postoperatively.

Results We saw a significant increase in the microcirculatory blood flow and blood flow velocity 6 and 12 hours postoperatively in the iloprost group in comparison with the placebo group. After 6 hours the blood flow was 156 (minimum, 54; maximum, 460; arbitrary units) in the placebo group and 213 (minimum, 57; maximum, 456; P < 0.05) in the iloprost group. After 12 hours it was 175 (minimum, 53; maximum, 483) and 318 (minimum, 86; maximum, 569; P < 0.05), respectively. The blood flow velocity was 18 (minimum, 14; maximum, 50) in the placebo group and 29 (minimum, 17; maximum, 51; P < 0.05) in the iloprost group 6 hours postoperatively. Twelve hours postoperatively the velocity was 21 (minimum, 16; maximum, 52) and 37.5 (minimum, 19; maximum, 64; P < 0.05). The oral mucosal tissue oxygen saturation did not change. We could not find any difference in the ICG-PDR between the two groups.

Conclusions Iloprost improved the microcirculatory blood flow and blood flow velocity of the oral mucosa in the early postoperative phase. This effect did not lead to an improved ICG clearance in patients after liver transplantation.

Noninvasive monitoring of peripheral perfusion with physical examination and the peripheral flow index correlates with dynamic near-infrared spectroscopy measurements in patients with septic shock

A Lima, T Jansen, C Ince, J Bakker

Erasmus MC University Medical Centre Rotterdam, The Netherlands
Critical Care 2008, 12(Suppl 2):P60 (doi: 10.1186/cc6281)

Introduction Peripheral blood flow can be markedly impaired in septic shock. Bedside assessment of this derangement has not yet been incorporated into routine clinical practice. We hypothesize that noninvasive monitoring of peripheral perfusion with physical examination and the peripheral flow index (PFI) derived from the pulse oximetry signal can reflect sepsis-induced microcirculation alteration as measured by near-infrared spectroscopy (NIRS) in patients with septic shock.

Methods NIRS (InSpectra) was used to quantify sepsis-induced circulatory alterations by calculating the increase rate of tissue oxygen saturation (slope-StO2) in a standard hyperaemia test (3 min arterial occlusion followed by rapid reperfusion). The increase rate of the PFI signal (slope-PFI) following the occlusion was compared with slope-StO2. We performed a physical examination of the extremities before arterial occlusion, and abnormal peripheral perfusion was defined as an increase in the capillary refill time (>4.5 s). The measurements were registered at admission after hemodynamic stability was obtained. We performed regression analysis to study the effect of abnormal peripheral perfusion on slope-StO2 and to study the relationship between slope-PFI and slope-StO2.
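The abstract does not specify how slope-StO2 and slope-PFI are fitted; a common approach, assumed here as a sketch, is a least-squares line through the samples of the initial reperfusion phase (the StO2 values below are hypothetical):

```python
# Slope of StO2 recovery after cuff release (slope-StO2, %/min).
# The abstract does not state the fitting method; a least-squares
# line through the samples of the initial reperfusion phase is
# assumed here, and the sample values are hypothetical.

def recovery_slope(times_min, sto2_percent):
    """Least-squares slope in %/min."""
    n = len(times_min)
    mt = sum(times_min) / n
    ms = sum(sto2_percent) / n
    num = sum((t - mt) * (s - ms) for t, s in zip(times_min, sto2_percent))
    den = sum((t - mt) ** 2 for t in times_min)
    return num / den

# Hypothetical 1 Hz StO2 samples over the first 6 s of reperfusion:
t_min = [i / 60.0 for i in range(7)]
sto2 = [50, 53, 56, 59, 62, 65, 68]
print(f"slope-StO2 = {recovery_slope(t_min, sto2):.0f} %/min")  # 180 %/min
```

The same routine applies to the slope-PFI, substituting the PFI signal for StO2.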

Results We prospectively studied 20 consecutive septic shock patients (age 54 ± 15 years; 16 males and four females). The admission diagnoses were 10 pneumonia, seven abdominal sepsis, two meningitis and one urosepsis. The slope-StO2 was significantly different between patients with normal peripheral perfusion (n = 8; mean = 218%/min; 95% CI = 141-339) and abnormal peripheral perfusion (n = 12; mean = 92%/min; 95% CI = 68-123). Regression analysis showed that the slope-StO2 is 138%/min lower in patients with abnormal than in patients with normal peripheral perfusion, controlled for the possible effects of central temperature (r2 = 0.42; P < 0.01). We found a strong association between slope-PFI and slope-StO2 (Pearson correlation = 0.84; P < 0.001). The effect of slope-StO2 on the slope-PFI was an increase in slope-StO2 of 90%/min per 1 unit/min slope-PFI (r2 = 0.65; P < 0.001). Conclusions Peripheral vascular reactivity in patients with septic shock, as measured by changes in StO2 following an ischemia-reperfusion challenge, is related to the clinical assessment with the capillary refill time and PFI.

Effect of intravenous nitroglycerin on the sublingual microcirculation in patients admitted to the intensive cardiac care unit

CA Den Uil1, WK Lagrand2, E Klijn1, M Van der Ent1, FC Visser1, C Ince1, PE Spronk3, ML Simoons1

1Erasmus Medical Center, Rotterdam, The Netherlands; 2Leiden University Medical Center, Leiden, The Netherlands; 3Gelre Hospitals, Apeldoorn, The Netherlands
Critical Care 2008, 12(Suppl 2):P61 (doi: 10.1186/cc6282)

Introduction Nitroglycerin (NTG), a nitric oxide donor with vasodilating effects, is frequently administered to patients with heart failure. We tested whether the vasodilating effects of NTG could be monitored at the bedside. Methods We included heart failure patients who were admitted to the intensive cardiac care unit. In each patient, continuous NTG infusion (2 mg/hour) was started immediately after an intravenous NTG loading dose of 0.5 mg. Using side-stream dark-field imaging, sublingual microvascular perfusion was evaluated before NTG administration (T0, baseline), 2 minutes after the NTG loading dose (T1) and 15 minutes after T1 (T2). At least three video sequences of the microcirculation were recorded and analyzed. Microscan Analysis Software was used to measure the functional capillary density (FCD), an indicator of tissue perfusion. Capillaries were defined as the microvessels with a diameter <25 μm. Each value is represented as the median and interquartile range (P25-P75). Results Seven patients were included in this study. The mean arterial pressure decreased during execution of the study protocol: 80 (78-85) mmHg at baseline versus 75 (66-79) mmHg at T2; P = 0.03. There was a nonsignificant trend to an increase in FCD throughout the study (10.9 (9.0-12.5) μm-1 at T0 vs 12.3 (11.3-14.0) μm-1 at T2; P = 0.06).

Conclusions We observed a trend to increasing FCD values during NTG treatment, despite a temporary decrease in the mean arterial pressure. This finding suggests improvement of microvascular perfusion by low-dose, intravenously administered NTG. Based on these interim results, more patients will be included in the study for final analysis and conclusions.

Relationship between the sublingual microcirculation and lactate levels in patients with heart failure

CA Den Uil1, WK Lagrand2, E Klijn1, C Ince1, PE Spronk3, ML Simoons1

1Erasmus Medical Center, Rotterdam, The Netherlands; 2Leiden University Medical Center, Leiden, The Netherlands; 3Gelre Hospitals, Apeldoorn, The Netherlands
Critical Care 2008, 12(Suppl 2):P62 (doi: 10.1186/cc6283)

Introduction Treatment of patients with severe heart failure aims at normalizing hemodynamic and metabolic parameters. We tested whether the sublingual functional capillary density (FCD), an indicator of tissue perfusion at the microvascular level, correlates with lactate levels in heart failure patients.

Methods We investigated 12 heart failure patients, treated with inotropes, within 24 hours after hospital admission. Sidestream dark-field imaging was used to investigate the sublingual microcirculation. At least three video sequences of the microcirculation were recorded and analyzed. Microscan Analysis Software was used to measure the FCD, where the FCD was determined as the total length of perfused capillaries per field of view. Capillaries were defined as microvessels with a diameter <25 μm.

Figure 1 (abstract P62)

Correlation between lactate (mmol/l) and the functional capillary density (FCD).

Results The mean arterial pressure was 75 (68-78) mmHg, lactate levels were 1.2 (1.0-1.7) mmol/l and SvO2 was 0.78 (0.65-0.84) mol/mol. The FCD was 11.7 (10.1-12.2) μm-1. The correlation between lactate and FCD is shown in Figure 1 (regression line: β1 = -0.88, P = 0.05).

Conclusions In this study, FCD correlated with lactate levels. This finding is indicative of a relationship between global hemodynamics and the sublingual microcirculation in patients treated for heart failure. Based on the interim results, more patients will be included in this study for final analysis and conclusions.

Sidestream dark-field imaging versus orthogonal polarization spectroscopic imaging: a comparative study

R Bezemer, P Goedhart, M Khalilzada, C Ince

Academic Medical Center, Amsterdam, The Netherlands
Critical Care 2008, 12(Suppl 2):P63 (doi: 10.1186/cc6284)

Introduction Sidestream dark-field (SDF) imaging, a stroboscopic LED ring-based imaging modality for clinical observation of the microcirculation, was validated by quantitative and qualitative comparison with orthogonal polarization spectral (OPS) imaging. Methods For OPS imaging a Cytocan-II backfocus type device (Cytometrics, Philadelphia, PA, USA) was used, and for SDF imaging a MicroScan Video Microscope (MicroVision Medical Inc., Amsterdam, The Netherlands) was employed. To validate SDF imaging, nailfold capillary diameters and red blood cell velocities were measured in the exact same capillaries using OPS and SDF imaging. For quantitative comparison of the quality of sublingually acquired microcirculatory images, an image quality quantification system was developed to score venular and capillary contrast and sharpness on scales from 0 to 1.

Results After introduction of a scaling factor to correct for the slightly higher magnification of the SDF device with respect to the OPS device, equal quantitative results for capillary diameters and red blood cell velocities were obtained. Venular contrast and sharpness were shown to be comparable for OPS and SDF imaging. Capillary sharpness and contrast, however, were shown to be significantly higher using SDF imaging (Figure 1). Venular granularity, in addition, was more clearly observable employing the SDF device.

Figure 1 (abstract P63)


Conclusions SDF imaging provided significantly higher image quality by the use of stroboscopic LED ring-based SDF illumination. It is anticipated that SDF imaging will serve as a reliable imaging modality for the clinical assessment of the microcirculation and will enhance computer-aided image analysis. Reference

1. Goedhart PT, et al.: Opt Express 2007, 15:15101-15114.

Finger reactive hyperaemia to measure endothelial function in sepsis and health (the FRESH study)

JS Davis1, J Thomas2, M McMillan1, T Yeo1, D Celermajer3, D Stephens2, NM Anstey1

1Menzies School of Health Research, Darwin, Australia; 2Royal Darwin Hospital, Darwin, Australia; 3Royal Prince Alfred Hospital, Sydney, Australia

Critical Care 2008, 12(Suppl 2):P64 (doi: 10.1186/cc6285)

Introduction Endothelial dysfunction is thought to be an important mechanism of organ failure in sepsis. We hypothesised that endothelial function (EF) would be impaired in adult patients with sepsis; that it would improve with treatment; and that the degree of its impairment would correlate with disease severity and outcome. Methods EF was measured using a novel, noninvasive technique at the bedside (reactive hyperaemia peripheral arterial tonometry (RH-PAT)) in three groups: patients with sepsis requiring admission to the ICU (ICU sepsis); patients with sepsis requiring hospital but not ICU admission (ward sepsis); and control patients without sepsis. Measurements were taken on days 0, 2 and 7 in the sepsis patients and at baseline in the control patients. Results Planned interim analysis was performed on 38 ICU sepsis patients, 19 ward sepsis patients and 28 control patients. The mean (95% CI) baseline RH-PAT index was significantly lower in ICU sepsis (1.56 (1.41-1.71)) than in control patients (2.03 (1.87-2.19)), P = 0.0001. It was intermediate in the ward sepsis group: baseline RH-PAT index = 1.72 (1.52-1.92) (P = 0.02 cf controls, not significant cf ICU sepsis). See Figure 1. The RH-PAT index improved markedly in the ward sepsis patients over the first 2 days (1.72 (1.52-1.92) to 2.29 (2.08-2.57); P = 0.0004); however, it did not change significantly in the ICU sepsis patients (1.56 (1.41-1.71) to 1.77 (1.56-1.98)).

Conclusions Noninvasive measurement of EF is feasible in sepsis. EF in sepsis is initially markedly impaired. It improves over the first 2 days in those patients with moderate sepsis but not in those with sepsis requiring ICU admission. These data will be further analysed to explore correlations, and blood samples have been stored for the measurement of serum arginine and markers of endothelial activation.

Figure 1 (abstract P63) Sidestream dark-field (SDF) imaging versus orthogonal polarization spectral (OPS) imaging: capillary contrast and sharpness (scale bar: 225 μm).

Figure 1 (abstract P64) Baseline endothelial function.

Microcirculatory investigation in severe trauma injury

E Grigoryev

Medical University, Kemerovo, Russian Federation
Critical Care 2008, 12(Suppl 2):P65 (doi: 10.1186/cc6286)

Introduction Disturbances of the regional microcirculation do not correlate with global indices of perfusion such as the cardiac index and cardiac output. Assessment of the local regional microcirculation may therefore be of diagnostic and prognostic value. The aim of this study was to investigate the diagnostic and prognostic significance of microcirculatory indices in severe trauma injury.

Methods Thirty-four patients with severe trauma injury were entered into a prospective trial (TRISS scale; patients with penetrating trauma and severe head injury were excluded). The patients were divided into two groups according to the TRISS scale: group 1 (TRISS 10-15 points, lethal probability 30%, favorable outcome) and group 2 (TRISS 16-25 points, lethal probability >30%, unfavorable outcome). The standard intensive therapy included respiratory support, infusion therapy, sedation/muscle relaxation, and analgesia. We investigated the red blood cell flux by laser Doppler flowmetry (perfusion units), with the probe placed on the skin and the antral gastric mucosa (LAKK 01, Russia); hemodynamic monitoring was noninvasive with thoracic bioimpedансometry (Diamant M, Russia) and invasive with PiCCO+ in unstable patients (Pulsion, Germany). Blood gases and oxygen consumption were evaluated by Bayer RapidLab (Bayer, Germany) and the lactate/pyruvate concentration (Boehringer Mannheim, Germany). The data were analyzed by the t test and Fisher criteria; P < 0.05 was considered statistically significant.

Results The serum lactate/pyruvate level was increased in both groups of patients on day 1 (mean ± SD, 3.5 ± 1.2 vs 3.8 ± 1.0 mmol/l; nonsignificant). According to laser Doppler flowmetry, the skin and mucosa perfusion were decreased versus the control group (red blood cell flux, perfusion units: 0.45 ± 0.12 in controls vs 0.12 ± 0.03; significant). Group 1 showed normalization of perfusion by day 2, whereas the disturbances of microcirculation persisted in group 2 beyond day 2. The microcirculatory index was not correlated with the cardiac index (r = 0.26, nonsignificant). The extravascular lung water index by PiCCO+ was correlated with the microcirculatory index (r = 0.56, P < 0.05): group 1 was associated with a normal index whereas group 2 had increased extravascular lung water (13 ± 4 vs 23 ± 12; significant).

Conclusions The recovery of the microcirculatory index (red blood cell flux) is associated with favorable outcome in severe trauma patients.
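The Pearson correlation reported above (for example, r = 0.56 between the microcirculatory index and the extravascular lung water index) can be sketched in a few lines; the paired values below are hypothetical illustrations, not the study's data.

```python
# Minimal sketch of a Pearson product-moment correlation, as used in this
# abstract. The flux and EVLWI values are invented for illustration only.
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

flux = [0.12, 0.18, 0.25, 0.31, 0.40, 0.22, 0.15, 0.35]  # perfusion units
evlwi = [24, 21, 16, 14, 11, 19, 23, 12]                  # ml/kg
print(f"r = {pearson_r(flux, evlwi):.2f}")
```

With these invented data, worse flux pairs with higher lung water, so the coefficient comes out strongly negative; the sign and magnitude depend entirely on the data supplied.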

Assessment of tissue hypoperfusion by subcutaneous microdialysis during septic shock: cases with bacteremia versus nonbacteremia

K Morisawa, S Fujitani, H Takahashi, H Oohashi, S Ishigouoka, M Yanai, Y Masui, Y Taira

St Marianna University, Kawasaki-shi, Japan

Critical Care 2008, 12(Suppl 2):P66 (doi: 10.1186/cc6287)

Introduction Plasma lactate has been used as a marker of tissue hypoperfusion in patients with sepsis, but plasma lactate elevation can be delayed relative to tissue hypoperfusion. Microdialysis has been used for the assessment of tissue hypoperfusion in neurosurgery; however, few studies have been published in the setting of septic shock. We hypothesized that septic patients with bacteremia (BA) suffer from more severe hypoperfusion than those without bacteremia (Non-BA). We therefore compared the subcutaneous lactate level and lactate/pyruvate ratio in cases with BA versus Non-BA as an assessment of tissue hypoperfusion in both groups.

Methods Cases with septic shock were enrolled between April 2006 and November 2007 in a mixed ICU of a tertiary care hospital in Japan. Microdialysis (CMA/Microdialysis, Sweden) was used as in a previous study [1]. Lactate, pyruvate and glucose in the subcutaneous tissue of cases with BA and Non-BA were measured three times at 8-hour intervals after ICU admission, and the two groups were then compared on these measurements. All data are reported as medians and interquartile ranges (IQR). The Mann-Whitney U test was used for statistical analysis and P < 0.05 was considered statistically significant.

Results Fourteen cases were evaluated; the male/female ratio of BA was 2/5 (age 62-86 years) and of Non-BA was 4/3 (age 57-88 years). No difference in APACHE II score was observed (mean: BA 30 vs Non-BA 29). The lactate level (mmol/l) in BA (median 3.8, IQR 1.9-5.4) was significantly higher than in Non-BA (median 1.9, IQR 1.6-2.6) (P = 0.012). The glucose level (mmol/l) in BA (median 3.9, IQR 2.6-7.1) was significantly lower than in Non-BA (median 6.3, IQR 4.9-10.1) (P = 0.004). The lactate/pyruvate ratio in BA (median 1.8%, IQR 1.4-2.5%) was significantly higher than in Non-BA (median 1.4%, IQR 1.2-1.6%) (P = 0.023).
Conclusions Our data suggest that tissue ischemia was more prominent in septic patients with BA than in those with Non-BA. Microdialysis may be a promising method to differentiate septic shock with BA from that without.


1. Ungerstedt U, et al.: Microdialysis - principles and applications for studies in animals and man. J Intern Med 1991, 230:365-373.
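The Mann-Whitney U comparison of group medians used in this abstract can be sketched as follows; the lactate values are hypothetical, chosen only to echo the reported medians (BA ≈ 3.8, Non-BA ≈ 1.9 mmol/l).

```python
# Sketch of the Mann-Whitney U statistic for two independent samples,
# as used to compare BA versus Non-BA lactate. Data are invented.
def mann_whitney_u(x, y):
    """U statistic for sample x versus y (ties contribute 0.5)."""
    u = 0.0
    for a in x:
        for b in y:
            if a > b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u

ba     = [3.8, 5.4, 1.9, 4.2, 3.1, 5.0, 2.8]  # mmol/l (hypothetical)
non_ba = [1.9, 1.6, 2.6, 1.8, 2.2, 1.5, 2.0]

u = mann_whitney_u(ba, non_ba)
print(f"U = {u} (maximum possible {len(ba) * len(non_ba)})")
```

A U statistic near the maximum n1·n2 indicates near-complete separation of the groups; a p-value would then be taken from exact tables or a normal approximation.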

Tissue perfusion evaluation with near-infrared spectroscopy during treatment with activated protein C

A Donati, L Botticelli, C Anastasi, M Romanelli, L Romagnoli, V Beato, P Pelaia

Università Politecnica delle Marche, Torrette di Ancona, Italy

Critical Care 2008, 12(Suppl 2):P67 (doi: 10.1186/cc6288)

Introduction In sepsis, the link between the systemic inflammatory response and the development of multiorgan failure is represented by microcirculatory and mitochondrial distress syndrome, which causes an important cellular deoxygenation that is not correctable solely by restoring a normal hemodynamic state and a satisfactory systemic oxygen transport. We determined the changes caused by stagnant ischemia in tissue oxygenation with near-infrared spectroscopy (NIRS) in patients with severe sepsis during therapy with activated protein C (APC), evaluating whether APC influences tissue saturation (an index of O2ER) and whether alterations of the hemodynamic state are connected with these changes.
Methods A prospective observational study. We evaluated 10 septic patients (treated with APC) from December 2005 to September 2007. Tissue oxygen saturation (StO2) was evaluated with NIRS using the InSpectra spectrometer (Hutchinson Technology Inc., MN, USA), placing a 15 mm probe over the brachioradialis muscle of the patients. The measurements were made at six time points: pre-APC, at 24 hours, at 48 hours, at 72 hours, at 96 hours, and 24 hours after the end of the infusion (post-APC). Each measurement (of the basic StO2 and of the slope during and after ischemia) was registered and processed with the InSpectra Analysis software. The parameters were analyzed with the Wilcoxon nonparametric test for repeated measurements (P < 0.05).

Results The increase of the basic StO2 during and after the treatment and its decrease during arterial occlusion were statistically significant (P < 0.05). The increase of the StO2 slope after arterial occlusion was statistically significant starting from the second day of APC infusion (P < 0.03).

Conclusions All the NIRS parameters improved after the infusion of APC; that is, O2ER increased. It remains to be verified whether this increase is connected with a reduced shunt effect in the microcirculation or with the end of the metabolic downregulation that involves the mitochondrial system. NIRS was used in this study for the first time during treatment with APC. Spectroscopy and videomicroscopy focus attention on perfusion and tissue oxygenation, which cannot be separated when evaluating the severity, the therapeutic choices and the treatment response of severe sepsis. References

1. Skarda et al.: Shock 2007, 27:345-353.

2. Ince C: Crit Care 2005, 9(Suppl 4):S13-S19.

Tissue oxygen saturation does not correlate with the oxygen delivery index during major abdominal surgery

CJ Kirwan, N Al-Subaie, P Morgan, T Addei, R Sunderland, R Hagger, A Rhodes, RM Grounds, M Hamilton

St Georges Hospital, London, UK

Critical Care 2008, 12(Suppl 2):P68 (doi: 10.1186/cc6289)

Introduction Tissue oxygenation (STO2) measured by near-infrared spectroscopy (NIRS) has been shown to correlate with the global oxygen delivery index (DO2I) in both humans and animals during haemorrhagic shock and its fluid resuscitation [1]. This is a pilot study to determine whether STO2 can be used as a surrogate marker of DO2I, with a view to utilising this simple noninvasive technique to guide intraoperative haemodynamic therapy.
Methods Eighteen patients undergoing major abdominal surgery were recruited from a London teaching hospital. All patients received the same induction and maintenance anaesthesia. Ten patients were actively haemodynamically optimised to a DO2I >600 ml/min/m2 with fluid resuscitation. The DO2I was determined using an oesophageal Doppler cardiac output monitor (CardioQ; Deltex Medical, UK). The STO2 of the thenar muscle was determined using the InSpectra STO2 monitor (Hutchinson Technology, USA). Paired measurements of the DO2I and STO2 were taken every 20 minutes from the start of surgery.
Results The average patient age was 67 (30-84) years; seven patients (39%) were female. A total of 173 paired observations were made. The median (IQR) DO2I and STO2 were 454 (332.5-595.5) and 88 (83-93), respectively. There was no correlation between the DO2I and STO2 (Figure 1; r = 0.1, P > 0.1). In addition, there was no statistically significant difference in STO2 when the DO2I was >600 ml/min/m2 (paired t test, P = 0.6). STO2 did not track the changes in DO2I.

Figure 1 (abstract P68)

Conclusions There is no clear relationship between STO2 and the DO2I during major abdominal surgery. STO2 in the intraoperative period cannot currently be used as a surrogate marker for oxygen delivery in this group of patients.
Reference

1. McKinley et al.: J Trauma 2000, 48:637-642.

Near-infrared spectroscopy as a potential surrogate for mixed venous oxygen saturation for evaluation of patients with hemodynamic derangements

J Crawford, R Otero, EP Rivers, D Goldsmith

Henry Ford Hospital, Detroit, MI, USA

Critical Care 2008, 12(Suppl 2):P69 (doi: 10.1186/cc6290)

Introduction Crookes and colleagues demonstrated that a decreased thenar muscle tissue oxygen saturation may reflect the presence of severe hypoperfusion (shock) in trauma patients better than traditional hemodynamic parameters [1]. Near-infrared spectroscopy (NIRS) may be a novel method for rapidly and noninvasively assessing changes in tissue-level oxygenation. The purpose of this study was to compare and correlate NIRS measurements (StO2) with the central venous blood saturation measurement (ScvO2) in the setting of compromised systemic perfusion in critical patients in the emergency department (ED).
Methods A prospective, nonrandomized, observational study in patients >18 years admitted to the critical care area (CAT 1) of the ED with various complaints classified as of cardiovascular, pulmonary, neurological, trauma or gastrointestinal etiology (n = 500). The NIRS probe was applied to the right thenar eminence and data were collected and stored for analysis. StO2 and ScvO2 monitoring was performed within 15 minutes of admission to CAT 1, and values were recorded at a single point in time. The ED physicians were blinded to StO2 values. Exclusion criteria included Do Not Resuscitate status, peripheral vascular disease, cardiac arrest, amputated upper extremities or skin abnormalities at the monitoring site, as well as refusal to participate.
Results StO2 correlation analysis was performed against all continuous variables. Subsequently, ANOVA was run for all of the continuous variables, allowing pairwise comparisons. For this cohort of 500 patients, 305 paired data points of StO2 and ScvO2 were compared. StO2 and ScvO2 had a strong linear correlation that was statistically significant (r = 0.76, P < 0.001). We also observed that time spent below an StO2 of 75% was associated with an APACHE score greater than 15 and with a higher rate of admission to the ICU (P = 0.05).
Conclusions NIRS demonstrated a significant, strong correlation between StO2 and ScvO2 in critically ill patients presenting to the ED. There also appears to be an association between the time a patient spends below an StO2 of 75% and an increased APACHE score and ICU admission.
Reference


1. Crookes BA, Cohn SM, Bloch S, et al.: Can near-infrared spectroscopy identify the severity of shock in trauma patients? J Trauma 2005, 58:806-813.

Changes in thenar eminence tissue oxygen saturation measured using near-infrared spectroscopy suggest ischaemic preconditioning in a repeated arterial occlusion forearm ischaemia model

A Sen, D Martin, M Grocott, H Montgomery

University College London, Institute of Human Health and Performance, London, UK

Critical Care 2008, 12(Suppl 2):P70 (doi: 10.1186/cc6291)

Introduction Ischaemic preconditioning (IP) describes the process whereby a tissue exposed to brief sublethal periods of ischaemia becomes protected from longer lethal episodes of ischaemia. One mechanism by which skeletal muscle may effect protection from ischaemic insult is to reduce its resting rate of oxygen consumption (VO2) following a preconditioning stimulus. Tissue oxygen saturation (StO2) reflects the dynamic balance between oxygen supply and utilisation. We hypothesised that repeated arterial occlusion of the upper arm would induce an IP effect detectable by measuring thenar eminence StO2 with near-infrared spectroscopy.
Methods The study was approved by the UCL Research Ethics Committee and written consent was obtained from 20 healthy volunteers. StO2 was measured using the InSpectra Tissue Spectrometer (Model 325; Hutchinson Technology Inc., USA). The tissue spectrometer probe was attached to the left thenar eminence and a blood pressure cuff was placed around the left upper arm. The repeated arterial occlusion forearm ischaemia model (RAOFIM) consisted of resting measurements followed by a cycle of four cuff inflations (200 mmHg, 3 min) and four deflations (5 min). Finally, the cuff was inflated for 3 minutes on the right upper arm while StO2 was measured from the right thenar eminence. Paired t tests were used to compare rates of oxygen desaturation; P < 0.05 was considered statistically significant.
Results There was a fall in thenar eminence StO2 during all arterial occlusions. The rate of decline of StO2 was significantly reduced during the fourth inflation (0.160%/s) compared with the first in the left arm (0.213%/s), P < 0.001. There was an increase in the rate of StO2 decline in the right arm (0.268%/s) when compared with the first left occlusion (P < 0.001).

Conclusions The data from this pilot study demonstrate that, following preconditioning using a RAOFIM, the rate of oxygen desaturation in resting skeletal muscle during subsequent arterial occlusion manoeuvres is reduced. This could be explained by a fall in resting muscle VO2 as a result of the preceding short ischaemic stimuli and therefore represents evidence of IP in skeletal muscle. These data do not provide evidence to support a remote IP effect in the contralateral arm.
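The desaturation rates compared above (%/s during occlusion) are in effect least-squares slopes of StO2 against time; a minimal sketch follows, with an invented StO2 trace of roughly the reported magnitude (≈0.2%/s).

```python
# Sketch: estimating a StO2 desaturation rate during cuff occlusion as
# the ordinary least-squares slope of StO2 (%) against time (s).
# The trace below is hypothetical, not a recorded measurement.
def slope(t, y):
    """Ordinary least-squares slope of y regressed on t."""
    n = len(t)
    mt, my = sum(t) / n, sum(y) / n
    num = sum((a - mt) * (b - my) for a, b in zip(t, y))
    den = sum((a - mt) ** 2 for a in t)
    return num / den

t_s  = [0, 30, 60, 90, 120, 150, 180]  # seconds of cuff inflation
sto2 = [85, 79, 73, 66, 60, 54, 47]    # % (hypothetical occlusion trace)
print(f"desaturation rate = {slope(t_s, sto2):.3f} %/s")
```

In practice a device's analysis software fits this slope over a selected window of the occlusion phase; comparing slopes between inflations is then a paired comparison, as in the abstract.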

Tissue oxygen saturation during anaesthesia, cardiopulmonary bypass and intensive care stay for cardiac surgery

J Sanders1, D Martin1, A Smith2, B Keogh2, M Mutch2, H Montgomery1, M Hamilton3, M Mythen4

1University College London, Institute of Human Health and Performance, London, UK; 2The Heart Hospital, University College London Hospitals NHS Trust, London, UK; 3St Georges Healthcare NHS Trust, London, UK; 4Great Ormond Street Hospital, UCL Institute of Child Health, London, UK

Critical Care 2008, 12(Suppl 2):P71 (doi: 10.1186/cc6292)

Introduction Near-infrared spectroscopy is a novel method for rapid and noninvasive assessment of tissue oxygen saturation (StO2). An association between StO2 and oxygen delivery has been demonstrated during shock, trauma and resuscitation. We conducted a pilot observational study in which the aims were to measure changes in StO2 during the perioperative period for scheduled cardiac surgery and to explore correlations between StO2 and routine haemodynamic measures. Methods The study was approved by the UCLH Joint Research Ethics Committee. Written informed consent was gained from 74 patients undergoing scheduled coronary artery bypass grafting (CABG) and valvular surgery requiring cardiopulmonary bypass (CPB). The thenar eminence StO2 was measured continuously during the perioperative period for a maximum of 24 hours using the InSpectra Tissue Spectrometer (Model 325; Hutchinson Technology Inc., USA). Haemodynamic variables were collected from patient records. The mean StO2 was calculated for various time points within the study.

Results The tissue spectrometer performed well throughout the study. From a baseline of 81.7%, the StO2 rose significantly during induction of anaesthesia to 88.5% (P < 0.001). Prior to and during CPB the StO2 fell to a minimum of 77.6%, and rose significantly to 83.1% after CPB (P < 0.001). The mean StO2 decreased during the ICU stay to a minimum of 70.0% at 2 hours postoperatively. There was only a marginal association between StO2 measures and haemodynamic changes, and all analyses resulted in areas under ROC curves <0.70.

Conclusions The present study demonstrates interesting changes in tissue StO2 during the perioperative period surrounding scheduled cardiac surgery. The trends suggest a fall in StO2 throughout CPB and during early recovery in the ICU. Changes in StO2 may reflect underlying tissue perfusion; therefore the utilisation of StO2 as both an index for tissue hypoperfusion and as a therapeutic goal needs further exploration.

Near-infrared spectroscopy during stagnant ischemia: a marker of ScvO2-SvO2 mismatch in septic patients with low cardiac output

H Mozina, M Podbregar

University Medical Center Ljubljana, Slovenia

Critical Care 2008, 12(Suppl 2):P72 (doi: 10.1186/cc6293)

Introduction Monitoring of oxygen saturation in the superior vena cava (ScvO2) has been suggested as a simpler and cheaper assessment of the global DO2:VO2 ratio, and was used successfully as a goal in the treatment of patients with septic shock and severe sepsis [1]. In patients with low cardiac output (CO) the difference between SvO2 and ScvO2 is more pronounced, and problematically large confidence limits and poor correlation have been found between the two values [2]. The thenar muscle oxygen saturation (StO2) measured with near-infrared spectroscopy (NIRS) during stagnant ischemia (cuff inflation-induced vascular occlusion) decreases more slowly in septic shock patients [3]. This may be due to slower muscle tissue oxygen consumption in sepsis. This phenomenon possibly contributes to the ScvO2-SvO2 mismatch in patients with low CO by adding more oxygenated venous blood to the flow through the superior vena cava. The aim of the present study was to determine the relationship between the StO2 deceleration rate and the ScvO2-SvO2 difference in septic patients with low CO.
Methods In septic patients with low CO and no signs of hypovolaemia, catheterization with a pulmonary artery floating catheter was performed. Blood was drawn from the superior vena cava and pulmonary artery at the time of each StO2 measurement in order to determine ScvO2 and SvO2. The thenar muscle StO2 during stagnant ischemia was measured using NIRS (InSpectra™) and the StO2 deceleration rate (StO2%/min) was obtained using the InSpectra Analysis Program V2.0.

Results Fifty-four patients (47 male, seven female) were studied: age 68 ± 13 years, SOFA score 12.2 ± 2.5 points, CI 2.5 ± 0.7 l/min/m2, SvO2 67 ± 10%, ScvO2 77 ± 8%, lactate 3.5 ± 3.0 mmol/l, CRP 127 ± 78 mg/l. NIRS data: basal StO2 89 ± 8%, deceleration rate -12.6 ± 4.9%/min; StO2 deceleration rate versus ScvO2-SvO2 difference: r = 0.651, P = 0.001 (Pearson correlation).
Conclusions The StO2 deceleration rate during cuff inflation is inversely proportional to the difference between ScvO2 and SvO2 in septic patients with low CO. When using ScvO2 as a treatment goal, this simple noninvasive NIRS measurement might be useful to identify patients with a normal ScvO2 but a probably abnormally low SvO2.


1. Rivers E, et al.: N Engl J Med 2001, 345:1368-1377.

2. Martin C, etal.: Intensive Care Med 1992, 18:101-104.

3. Pareznik R, et al.: Intensive Care Med 2006, 32:87-92.

Relationship between central venous oxygen saturation measured in the inferior and superior vena cava

T Leiner, A Mikor, Z Heil, Z Molnar

University of Pecs, Hungary

Critical Care 2008, 12(Suppl 2):P73 (doi: 10.1186/cc6294)

Introduction Central venous oxygen saturation (ScvO2) has proved a useful alternative to mixed venous saturation, and a reliable parameter for evaluating oxygen debt in emergency and intensive care [1-3]. Under special circumstances of anatomical or coagulation disorders, however, cannulation of the jugular or subclavian veins is not recommended. In these cases the femoral vein has to be catheterised. The aim of our prospective study was to investigate the relationship between the ScvO2 measured in the superior vena cava (ScvsO2) and in the inferior vena cava (ScviO2).
Methods After local ethics committee approval, every ICU patient with two central venous catheters (one subclavian/internal jugular and one femoral) entered the study. Parallel blood gas analyses were performed at random, whenever ScvO2 was requested by the attending physician. Vital parameters (heart rate, mean arterial pressure, Glasgow Coma Scale, respiratory parameters) and the doses of vasopressor and sedative drugs were also recorded. Results were compared using the Wilcoxon test, Pearson correlation and Bland-Altman plots.

Results In 13 patients, 47 matched pairs were compared. Although ScvsO2 (median 79%, range 50-85%) was significantly higher than ScviO2 (median 71%, range 45-87%) (P < 0.001), there was a significant correlation between the two variables (r = 0.690, P < 0.001). Bland-Altman plots showed a mean bias of 7.6% with lower and upper levels of agreement of -5.6% and 20.7%, respectively. The dose of vasopressor (norepinephrine) and the dose of sedative (propofol) had a significant influence on the measured difference between the investigated variables (r = 0.562, P < 0.001 and r = 0.538, P < 0.001, respectively).

Conclusions The preliminary results of this study show that ScviO2 underestimates ScvsO2 with a low level of agreement and that this difference is affected by vasopressor support and sedation. ScviO2 may be useful, however, for monitoring the trend of ScvsO2. References

1. Reinhart K, et al.: Intensive Care Med 2004, 30:1572-1578.

2. Rivers E, et al.: N Engl J Med 2001, 345:1368-1377.

3. Molnar Z, et al.: Intensive Care Med 2007, 33:1767-1770.
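The Bland-Altman analysis used in the abstract above (bias 7.6%, limits of agreement -5.6 to 20.7) reduces to simple arithmetic on the paired differences; the saturations below are hypothetical, not the study's 47 pairs.

```python
# Sketch of a Bland-Altman agreement analysis: mean bias of the paired
# differences and 95% limits of agreement (bias ± 1.96 SD).
# The paired saturations are invented for illustration.
from math import sqrt

def bland_altman(a, b):
    """Return (bias, lower limit, upper limit of agreement) for a - b."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    bias = sum(d) / n
    sd = sqrt(sum((x - bias) ** 2 for x in d) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

scvs = [79, 82, 75, 80, 68, 85, 77, 73]  # superior vena cava ScvO2 (%)
scvi = [71, 70, 69, 75, 66, 74, 70, 71]  # inferior vena cava ScvO2 (%)
bias, lo, hi = bland_altman(scvs, scvi)
print(f"bias {bias:.1f}%, limits of agreement {lo:.1f} to {hi:.1f}%")
```

A positive bias, as here, corresponds to the inferior vena cava value systematically underestimating the superior one, which is what the abstract reports.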

Early worst central venous oxygen saturation is predictive of mortality in severe head trauma but not in moderate head trauma

A Di Filippo, S Matano, R Spina, L Perretta, C Gonnelli, A Peris

Teaching Hospital Careggi, Florence, Italy

Critical Care 2008, 12(Suppl 2):P74 (doi: 10.1186/cc6295)

Introduction The aim of the present study was to evaluate the mortality prediction power of central venous oxygen saturation (ScVO2) in critically ill patients suffering from major trauma and head trauma.

Methods In an eight-bed ED ICU of a teaching hospital, from January 2004 to November 2007, all patients with major trauma (RTS < 10) and head trauma were included in the study. On the basis of the severity of head trauma, the patients were divided into two groups: severe (GCS < 8; n = 91) and moderate head trauma (GCS > 8 and < 12; n = 116). Each group was in turn divided into survivors and nonsurvivors. In each subgroup we compared age, sex, ISS, SAPS II, the worst ScVO2 on the first day after trauma (blood gas analysis of venous blood sampled through a catheter inserted in the superior vena cava 2 hours after trauma), and the worst circulating blood lactate level on the first day after trauma. Statistics were performed with the Student t test and the χ2 test.

Results The results showing a significant difference are summarized in Table 1.

Table 1 (abstract P74)

                    Severe head trauma                Moderate head trauma
                    Survivors (n = 76)  Dead (n = 15)  Survivors (n = 99)  Dead (n = 17)
ISS (pt)            30.3 ± 10.7         43.4 ± 18°     27.9 ± 16           41.1 ± 16.3°
ScVO2 (%)           71 ± 7              62 ± 9*        73 ± 7              75 ± 7
Lactate (mmol/l)    3.5 ± 1.9           9 ± 5.4°       2.9 ± 1.6           6.1 ± 4°

Data presented as the mean ± SD. *P < 0.05, °P < 0.01.

Conclusions ScVO2 seems to be predictive of major outcome in severe head trauma but not in moderate head trauma. Venous mixing of the superior vena cava could play a role in this difference. Reference

1. Reinhart K, et al.: Intensive Care Med 2004, 30:1572.

Comparison between continuous and discontinuous central venous oxygen saturation in the ICU: a prospective study and preliminary results

H Gharsallah, I Labbene, W Koubaa, F Gargouri, K Lamine, M Ferjani

Military Hospital of Tunis, Tunisia

Critical Care 2008, 12(Suppl 2):P75 (doi: 10.1186/cc6296)

Introduction Benefits of central venous oxygen saturation (ScvO2) optimization, at early stages of severe sepsis and septic shock, have been shown [1]. Discontinuous ScvO2 measurement has not been studied earlier. The purpose of our study was to compare continuous and discontinuous ScvO2 in terms of the number of therapeutic interventions in septic patients.

Methods After approval by our institutional ethics committee, 16 patients with severe sepsis or septic shock were included in this prospective study. Patients were randomly assigned to the continuous ScvO2 group (central venous catheter; Edwards Lifesciences X3820HS) or to the discontinuous ScvO2 group. Blood pressure, heart rate and pulse oximetry were continuously monitored, and the lactate concentration was measured in all patients. In both groups we recorded the number of therapeutic interventions triggered by a ScvO2 value <70%. Statistical analysis used Fisher's exact test for qualitative variables and the Student t test (Mann-Whitney) for quantitative variables. P < 0.05 was considered significant.

Results There were no significant differences between the groups with respect to baseline characteristics. The median number of therapeutic interventions was significantly higher in the continuous ScvO2 group (13 vs 7, P = 0.016). No significant differences occurred between the length of stay in the two groups. See Table 1.

Table 1 (abstract P75)

Comparison between continuous and discontinuous central venous oxygen saturation

                           Continuous    Discontinuous   P value
Age                        63 ± 17       66.8 ± 5.6      0.64
Weight                     63 ± 17       73 ± 8.9        0.44
MODS                       6.62 ± 3.9    6.12 ± 1.8      0.87
APACHE II                  18.8 ± 7.4    18.1 ± 6.8      0.87
Length of stay             6.1 ± 2.51    10.2 ± 5        0.95
Therapeutic interventions  13 (6-19)     7 (4-12)        0.01*

*Significant at P < 0.05.

Conclusions These preliminary results showed that continuous ScvO2 measurements increase the number of therapeutic interventions. We conclude that continuous measurement of ScvO2 is helpful for early therapeutic interventions. Controlled trials of sufficient size, however, are needed to confirm these results. Reference

1. Rivers E, et al.: N Engl J Med 2001, 345:1368-1377.

Influence of tissue perfusion on the outcome of high-risk surgical patients needing blood transfusion

JM Silva Jr, DO Toledo, A Gulinelli, FG Correa, KR Genga, T Cezario, M Pinto, A Isola, E Rezende

Hospital do Servidor Público Estadual, Sao Paulo, Brazil

Critical Care 2008, 12(Suppl 2):P76 (doi: 10.1186/cc6297)

Introduction The objective of this study was to evaluate the clinical outcomes of patients who required intraoperative blood transfusion, aiming to compare the pretransfusion hematimetric values with tissue perfusion markers.

Methods A prospective single-center cohort study. Patients were selected in the operative room of a tertiary hospital. Adult patients who required blood transfusion during the intraoperative period were included in the study. Arterial and central venous blood samples were collected at the moment in which the blood transfusion decision was made.

Results Sixty-one patients were included, with a mean age of 68 years. The POSSUM score was 36.2 ± 10.3 and the MODS score was 2.4 ± 1.9. At the time of the blood transfusion the mean hemoglobin level was 8.4 ± 1.8 g/dl. The overall inhospital mortality rate was 24.6%. The ScvO2 cutoff point on the ROC curve was 80% (AUC = 0.75; sensitivity = 80%; specificity = 65.2%). Patients who received a blood transfusion with ScvO2 < 80% (n = 29), in comparison with those with ScvO2 > 80% (n = 32), had lower mortality rates (12.5% vs 47.1%; P = 0.008) and a lower incidence of postoperative complications (58.9% vs 72.9%; P = 0.06). Blood transfusion with ScvO2 < 80% was also associated with reduced use of vasopressors (5.9% vs 36.8%; P = 0.009), a lower incidence of hypoperfusion (17.6% vs 52.6%; P = 0.009) and a lower incidence of infection (23.5% vs 52.6%; P = 0.038) in the postoperative period.

Conclusions In patients submitted to major surgery, the ScvO2 appears to be an important variable to take into consideration when deciding for or against blood transfusion, since transfusions given despite adequate perfusion, reflected by ScvO2 > 80%, were associated with higher mortality rates and worse clinical outcomes. References

1. Hebert PC, Wells G, Blajchman MA, et al.: A multicenter, randomized, controlled clinical trial of transfusion requirements in critical care. N Engl J Med 1999, 340:409-417.

2. Vincent JL, Baron JF, Reinhart K, et al.: Anemia and blood transfusion in critically ill patients. JAMA 2002, 288:1499-1507.

3. Rivers E, Nguyen B, Havstad S, et al.: Early goal-directed therapy in the treatment of severe sepsis and septic shock. N Engl J Med 2001, 345:1368-1377.
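The ROC analysis in the preceding abstract (ScvO2 cutoff 80%, AUC = 0.75, sensitivity 80%, specificity 65.2%) can be sketched with a rank-based AUC and a simple cutoff rule; the outcomes and saturations below are invented for illustration.

```python
# Sketch: AUC as the probability that a "positive" case scores above a
# "negative" one, plus sensitivity/specificity at a fixed cutoff.
# Groups and saturations are hypothetical, not the study's 61 patients.
def auc(pos, neg):
    """Rank-based AUC: P(positive score > negative score), ties count 0.5."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def sens_spec(pos, neg, cutoff):
    """Sensitivity and specificity of the rule 'score >= cutoff is positive'."""
    sens = sum(p >= cutoff for p in pos) / len(pos)
    spec = sum(n < cutoff for n in neg) / len(neg)
    return sens, spec

died  = [84, 88, 79, 91, 82]      # ScvO2 (%) in nonsurvivors (hypothetical)
lived = [72, 78, 81, 69, 75, 83]  # ScvO2 (%) in survivors (hypothetical)
print("AUC =", auc(died, lived))
print("sens/spec at 80%:", sens_spec(died, lived, 80))
```

Choosing the cutoff that best balances sensitivity and specificity on the ROC curve is how a threshold such as the abstract's 80% is typically selected.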

Hyperlactatemia and low central venous saturation can predict prognosis after cardiac surgery

L Hajjar, M Rego, F Maeda, M Sundin, L Sampaio, F Galas, J Auler Jr

Heart Institute, Faculdade de Medicina da Universidade de Sao Paulo, Brazil

Critical Care 2008, 12(Suppl 2):P77 (doi: 10.1186/cc6298)

Introduction Hyperlactatemia and low central venous saturation are commonly encountered in critically ill patients and carry prognostic significance in those with sepsis. Few reports, however, have examined the relationship between these parameters immediately after cardiac surgery and clinical outcome. We examined the ability of the arterial plasma lactate concentration and central venous saturation to predict patient outcome after cardiac surgery. Methods We performed a consecutive observational study in a university hospital. A total of 125 patients undergoing cardiac surgery were studied. Demographic data were analyzed. Samples for arterial lactate, arterial blood gases, and central venous saturation were collected on admission to the ICU and 12 and 24 hours later. Univariate and multivariate analyses were performed.

Results Of the 125 patients in this study, 115 (92%) survived and 10 (8%) died. The lactate level was higher in nonsurvivors than in survivors (P < 0.001). A higher lactate level (>3.3 mmol/l) was an independent predictor of death (OR = 23, 95% CI = 3.9-136), of the occurrence of arrhythmias (OR = 5.36, 95% CI = 1.9-15), renal dysfunction (OR = 9.93, 95% CI = 2.9-34), and shock (OR = 67.2, 95% CI = 6.4-710). There was no relationship between a higher lactate level and a longer ICU stay, cardiac dysfunction or myocardial ischemia. Low central venous saturation (<60%) was an independent predictor of arrhythmias (OR = 12.74, 95% CI = 3.45-47), infection (OR = 6.61, 95% CI = 2.2-19.6), shock (OR = 16.7, 95% CI = 1.8-156), and need for transfusion (OR = 3.68, 95% CI = 1.25-10.8). There was no relationship between low central venous saturation and cardiac dysfunction, renal dysfunction or myocardial ischemia. Conclusions In this observational study, the postoperative arterial lactate concentration and central venous saturation strongly and independently predicted the outcome after cardiac surgery. These findings suggest that these parameters may be markers of prognosis after cardiac surgery and support the role of hemodynamic optimization in reducing complications. Reference

1. Bakker J, Coffernils M, et al.: Blood lactate levels are superior to oxygen-derived variables in predicting outcome in human septic shock. Chest 1991, 99:956-962.
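The odds ratios with 95% confidence intervals reported in the abstract above can be computed from a 2x2 table with the standard log-odds method; the counts below are invented for illustration, not the study's data.

```python
# Sketch: odds ratio and Woolf (log-odds) 95% confidence interval for a
# binary predictor of a binary outcome, e.g. lactate >3.3 mmol/l vs death.
# The 2x2 counts are hypothetical.
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d):
    """OR and 95% CI for the table [[a, b], [c, d]]
    (rows: exposed/unexposed; columns: event/no event)."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = exp(log(or_) - 1.96 * se)
    hi = exp(log(or_) + 1.96 * se)
    return or_, lo, hi

# hypothetical: high lactate 8 deaths / 22 survivors; normal lactate 2 / 93
or_, lo, hi = odds_ratio_ci(8, 22, 2, 93)
print(f"OR = {or_:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```

The wide intervals in the abstract (e.g. 95% CI 3.9-136 around OR = 23) are characteristic of this method when event counts in a cell are small, since the standard error grows with the reciprocals of the cell counts.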

Clinical utility of arterialised capillary earlobe sampling in the critical care setting

S Stott, G Thain, N Duncan

Aberdeen Royal Infirmary, Aberdeen, UK

Critical Care 2008, 12(Suppl 2):P78 (doi: 10.1186/cc6299)

Introduction The purpose of this study was to determine whether earlobe capillary blood gas sampling, performed by nonmedical staff, provides a clinically acceptable estimate of the pH, pCO2 and pO2 in critically ill adults.

Methods Paired samples (arterial and capillary) were obtained from patients aged 16 years and over admitted to the general intensive therapy unit who had an arterial line in situ. Details of the severity of illness, use of vasoactive agents and complications were recorded. Results One hundred and thirty-one paired independent samples were obtained from 142 patients. The mean age was 60 (18-87) years and the mean APACHE II score 20 (5-44). Bland-Altman analysis was used to compare arterial and capillary pH, pCO2 and pO2, respectively. See Table 1. The use of vasoconstricting drugs had no significant effect on the mean differences between arterial and capillary values for pH, pCO2 or pO2 (P = 0.4, 0.8 and 0.7, respectively). For high arterial pCO2 tensions (>6.5 kPa), capillary measurements showed a mean bias of 0.95 kPa with limits of agreement of -0.22 to 2.12 kPa. For hypoxic patients (PaO2 < 10 kPa), capillary sampling had a mean bias of 0.05 kPa with limits of agreement of -1.08 to 1.17 kPa. There were no complications of capillary sampling in terms of bruising, bleeding or infection. It took significantly longer to obtain capillary samples than arterial ones (by 35 s, P < 0.001).

Conclusions Capillary earlobe sampling provides a reliable estimation of the arterial pH and pCO2 in critically ill patients. For pO2 estimation, the technique has a higher level of agreement when the arterial PO2 is below 10 kPa. Capillary earlobe blood sampling would be a reliable method of monitoring for patients who do not have an arterial line in situ and can be performed without complications by nonmedical personnel. Reference

1. Zavorsky GS, et al.: Arterial versus capillary blood gases: a meta-analysis. Respir Physiol Neurobiol 2007, 155:268-279.

Emergent internal jugular vein cannulation as a risk factor associated with arterial puncture

R Jankovic, J Ristic, M Pavlovic, Z Stevanovic, D Antic, D Djordjevic, M Andjelkovic

School of Medicine, University of Nis, Serbia

Critical Care 2008, 12(Suppl 2):P79 (doi: 10.1186/cc6300)

Introduction Placement of central venous catheters is frequently associated with serious complications. Arterial puncture is the most common mechanical complication associated with internal jugular vein access procedures (IJVAP). The influence of emergent indication as a sole risk factor for arterial puncture during IJVAP has not been fully explored. We evaluated the impact of emergent IJVAP, performed in the operating room, on the carotid arterial puncture rate.

Methods We analyzed all landmark techniques of guided IJVAP that were performed by either the anterior or the posterior approach, using the Seldinger technique in the operating theater during a 2-year period. All IJVAP were defined either as elective or emergent. A procedure was defined as emergent if the anesthesiologist judged that any delay would be harmful. The side of the puncture site was chosen according to clinical necessity. The puncture side, number of cannulation attempts, the time relationship between surgical incision and IJVAP, and the number of arterial punctures after cannulation were recorded. Correct placement of the central venous catheter was confirmed by free venous blood return, free flow of fluid through all ports of catheter and postinsertion chest X-ray scan.

Results We analyzed 86 IJVAP performed in the operating room (22 left-sided and 64 right-sided). In 32 cases, IJVAP were performed as emergent (37.2%). The overall rate of carotid artery puncture was 9.30%. Arterial puncture following emergent IJVAP occurred in seven cases (21.87%). After elective IJVAP, accidental arterial puncture occurred in only one case (1.85%). This difference was statistically significant (P = 0.003). Emergent IJVAP were significantly associated with repeated cannulation attempts (P < 0.001). In 16 cases (50%), emergent IJVAP were performed after surgical incision, including five cases of unintentional carotid puncture. Although arterial puncture occurred more frequently after postincisional emergent IJVAP, the difference was not statistically significant (31.25% vs 12.15%; P = 0.15).
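The emergent vs elective comparison (seven punctures in 32 emergent vs one in 54 elective procedures) is a 2 x 2 contingency table. The abstract does not state which test produced P = 0.003, so the stdlib-only two-sided Fisher's exact test below is an assumed analysis, not the authors' stated method:

```python
# Two-sided Fisher's exact test for a 2x2 table, using only the standard
# library (math.comb for hypergeometric probabilities).
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """2x2 table [[a, b], [c, d]]: rows are groups, columns event/no event.
    Sums the probabilities of all tables at most as likely as the observed."""
    row1 = a + b
    col1 = a + c
    n = a + b + c + d

    def p_table(k):
        # hypergeometric probability of k events in the first row
        return comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)

    p_obs = p_table(a)
    k_min = max(0, row1 - (n - col1))
    k_max = min(row1, col1)
    return sum(p_table(k) for k in range(k_min, k_max + 1)
               if p_table(k) <= p_obs + 1e-12)

# Emergent: 7 punctures / 25 clean; elective: 1 puncture / 53 clean
p = fisher_exact_two_sided(7, 25, 1, 53)
print(f"P = {p:.4f}")
```

This lands close to the reported P = 0.003, consistent with an exact test on these counts.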

Conclusions Emergent internal jugular vein cannulation might be identified as a factor associated with an increased arterial puncture rate.

Table 1 (abstract P78)

Parameter      Mean bias   Limits of agreement
pH             -0.02       -0.07 to 0.02
pCO2 (kPa)      0.35       -0.62 to 1.33
pO2 (kPa)       1.05       -2.4 to 4.5

Use of ultrasound for central venous catheter placement

T Angelini1, L Bevilacqua1, F Forfori2, G Licitra2, L DeSimone2, F Giunta2

1Scuola di Specializzazione in Anestesia e Rianimazione, Pisa,

Italy; 2Department of surgery, AOUP, Pisa, Italy

Critical Care 2008, 12(Suppl 2):P80 (doi: 10.1186/cc6301)

Introduction The placement of a central venous catheter is a common practice in the ICU, and mechanical complications occur in 5-19% of patients. In this study we compare the ultrasound approach with the classic landmark technique in terms of reduction of mechanical complications and of the number of attempts needed for cannulation of the internal jugular vein.

Methods We examined 31 patients admitted to the ICU: in 20 of them cannulation of the internal jugular vein was obtained using real-time ultrasound guidance, while in 11 patients we employed the landmark technique (axial approach). We recorded the number of complications and the number of attempts, correlating these with the experience of the operator. All data were statistically examined with Student's t test (number of attempts) and Fisher's exact test with odds ratios (incidence of complications).

Results We recorded a 9% complication rate in the landmark group (one accidental arterial puncture) and 6% in the ultrasound group (one pneumothorax). The odds ratio for these data is 0.5 (95% CI = 0.006-45.4). We found a statistically significant difference in the number of attempts performed, with a lower value in the ultrasound group (mean ± SD, ultrasound 1.1 ± 0.30 vs landmark technique 1.7 ± 0.78; P = 0.034). With the ultrasound approach, operator experience made no difference to the number of attempts. See Figure 1.

Conclusions Our data confirm that use of ultrasound for central venous catheter placement is safer and is associated with a lower risk of complications than the classical landmark approach, especially for less experienced operators.

References

1. McGee D, Gould M: Preventing complications of central venous catheterization. N Engl J Med 2003, 348:1123-1133.

2. Maecken T: Ultrasound imaging in vascular access. Crit Care Med 2007, 35:S178-S185.

3. Karakitsos D, et al.: Real-time ultrasound-guided catheterisation of the internal jugular vein: a prospective comparison with the landmark technique in critical care patients. Crit Care 2006, 10:R162.

Figure 1 (abstract P80)

Is there a 'safe mean airways pressure' in preterm babies with hyaline membrane disease: an echocardiographic retrospective approach

M Di Nardo, C Cecchetti, F Stoppa, D Perrotta, M Marano, C Tomasello, N Pirozzi

Ospedale Pediatrico Bambino Gesu, Rome, Italy

Critical Care 2008, 12(Suppl 2):P81 (doi: 10.1186/cc6302)

Introduction Airway pressure limitation is a largely accepted strategy in neonatal respiratory distress syndrome (NRDS), yet a lot of debate persists about the exact level of mean airways pressure (M-PAW) that can be safely used. The aim of the present study was to examine whether the echocardiographic evaluation of tricuspid regurgitation (TR) and right ventricular function (RVF) may help to indirectly solve this problem.

Methods A retrospective study. Thirty preterm infants with a diagnosis of NRDS [1] were enrolled and divided into two groups: Group A (control group), 15 patients; and Group B, 15 patients. Mean gestational age was 32 ± 1 weeks and body weight 1.55 ± 0.55 kg. All of the patients were treated with surfactant therapy (Curosurf 100 mg) for grade 3 and grade 4 NRDS and with high-frequency pressure-controlled ventilation: peak pressure according to body weight, PEEP 3 ± 2 cmH2O, I:E = 1:1.5, breath rate 80 ± 10, FiO2 50 ± 15%. In Group B the M-PAW was reduced according to our echocardiographic evaluations. TR and RVF (pulmonary arterial systolic and diastolic pressures (PAPs, PAPd), leftward flattening of the interventricular septum) were monitored daily (Philips SONOS 5500 echocardiography machine equipped with 8-12 MHz probes) until weaning from mechanical ventilation began.

Results Signs of right ventricular dysfunction (moderate to severe TR, flattening of the interventricular septum, PAPs >36 mmHg) were observed especially in Group A, at a M-PAW of 14 ± 3 cmH2O. The duration of mechanical ventilation was 24 hours longer in Group A than in Group B (P < 0.005, Student's t test).

Conclusions This small experience shows that RVF worsens as the M-PAW increases over 11 cmH2O, which could prolong the weaning time in these patients. Even though a larger number of patients should be enrolled in future studies, we believe that any occurrence of right ventricular dysfunction should be immediately corrected by reducing the M-PAW with the help of echocardiography.

Reference

1. Sweet D, Bevilacqua G, Carnielli V, et al.: European consensus guidelines on the management of neonatal respiratory distress syndrome. J Perinat Med 2007, 35:175-186.

Right ventricular dysfunction in liver failure: a hemodynamic study

P Gruber, P Kennedy, N Stoesser, J Wendon, G Auzinger

King's College Hospital, Institute of Liver Studies, London, UK

Critical Care 2008, 12(Suppl 2):P82 (doi: 10.1186/cc6303)

Introduction Data investigating the clinical importance of right ventricular dysfunction (RVD) in liver disease are sparse. The use of a modified pulmonary artery catheter with a fast-response thermistor to assess the right ventricular ejection fraction (RVEF) and right ventricular end-diastolic volume index (RVEDVI) can aid in the diagnosis of and guide appropriate therapy for RVD in critically ill patients. In a previous study, RVD was defined as RVEF < 52% and RVEDVI > 92 ml/m2 [1]. We aimed to investigate the prevalence of RVD in a heterogeneous group of patients with multiorgan failure and liver disease admitted to a specialist liver ICU. In addition, differences in right ventricular function were compared in patients in acute liver failure (ALF) and with acutely decompensated chronic liver disease.

Methods Over a 24-month period, hemodynamic data for 16 patients were analyzed. Patients with known significant tricuspid regurgitation and arrhythmias were excluded. Patients were grouped according to etiology into ALF and acute-on-chronic liver disease (AoCLD). Comparison of hemodynamic data was performed using Mann-Whitney U tests.

Results See Table 1. All patients showed evidence of RVD, but the RVEDVI was higher in patients with AoCLD. The pulmonary artery occlusion pressure (PAOP) was not different between groups. The transpulmonary gradient (TPG = MPAP - PAOP) as a marker of increased pulmonary vascular resistance was higher in AoCLD patients despite similar pulmonary artery pressures.

Table 1 (abstract P82)

Results of hemodynamic parameters obtained from right heart catheterisation

Parameter        All patients   ALF patients (n = 9)   AoCLD patients (n = 7)   P value
RVEF (%)         36.3           36                     35                       NS
RVEDVI (ml/m2)   129.3          114.7                  148                      <0.005
MPAP (mmHg)      29.5           28.4                   31                       NS
PAOP (mmHg)      16.1           18.2                   13.3                     NS
CI (l/min/m2)    4.4            4.4                    4.4                      NS
TPG (mmHg)       13.5           10.1                   17.9                     <0.05

Figure 1 (abstract P83)

Conclusions RVD is common in patients with liver failure and is more severe in AoCLD patients. Whether treatment based on RVEF and RVEDVI monitoring in liver disease can improve patient outcome still needs to be proven. Reference

1. Le Tulzo Y, et al.: Intensive Care Med 1997, 23:664-670.

Diastolic function abnormalities in ICU patients with chronic obstructive pulmonary disease

H Michalopoulou, A Bakhal, J Vaitsis, D Nastou, D Pragastis, N Stravodimou

Metaxa Hospital, Athens, Greece

Critical Care 2008, 12(Suppl 2):P83 (doi: 10.1186/cc6304)

Introduction It has been shown in several studies that patients with right ventricular pressure overload often have left ventricular (LV) diastolic dysfunction. Our aim was to evaluate LV function in ICU patients with chronic obstructive pulmonary disease (COPD).

Methods We studied 35 patients (20 males, 15 females), mean age 65 ± 8.4 years, with COPD and without comorbidities (hypertension, diabetes, coronary heart disease or heart failure). Twenty-five healthy subjects matched for age and sex served as controls. Using conventional echocardiography, the LV end-diastolic diameter (LVD), LV ejection fraction (EF%), right ventricular diastolic diameter (RVD), right ventricular systolic pressure (RVSP), the maximal velocity of the E wave, the maximal velocity of the A wave, and the E/A ratio were assessed. Using tissue Doppler echocardiography, Em, Am and Em/Am were calculated.

Results The differences between the groups are presented in Figure 1. The parameters of diastolic LV function, E/A and Em/Am, were significantly lower in COPD patients in comparison with healthy subjects. There was also a significant negative correlation between RVSP and Em/Am (r = -0.74, P < 0.01) and E/A (r = 0.5; P < 0.005).

Conclusions Systolic LV function is well preserved in COPD patients, but we found severe LV diastolic impairment that might be due to alterations of chamber stiffness from the hypertrophic right ventricle.

Noninvasive rodent cardiac output: comparison of a compact clinical monitor with echocardiography and proposal of a simple correction factor

DJ Sturgess, BA Haluska, M Jones, B Venkatesh

University of Queensland, School of Medicine, Brisbane, Australia

Critical Care 2008, 12(Suppl 2):P84 (doi: 10.1186/cc6305)

Introduction Rodent models are often studied in critical care research. Noninvasive cardiac output (CO) measurement is desirable but often impractical. The present study aimed to compare a commercially available human CO monitor, USCOM® (USCOM Ltd, Sydney, Australia), with specialised rodent echocardiography for noninvasive measurement of rat CO.

Methods With institutional ethics committee approval (UQ AEC protocol 675/05), 21 anaesthetised, mechanically ventilated male Sprague-Dawley rats (573 ± 96 g) were studied during refinement and study of an endotoxic shock model. Pulsed-wave Doppler echocardiography (15 MHz rodent probe) was used to measure the left ventricular outflow velocity and to calculate the stroke volume and CO. USCOM (v1.7, human neonatal algorithm; 2.2 MHz) measurements followed each echocardiographic examination. USCOM CO was measured by combining continuous-wave Doppler with the predicted outflow tract diameter (OTD-U).

Results Twenty-one paired measurements were analysed. The mean echocardiographic CO was 113 ml/min (range 46-236). The mean USCOM CO was 245 ml/min (range 75-553). Paired echocardiographic and USCOM measurements demonstrated significant correlations for heart rate (r = 0.92, P < 0.0001) and CO (r = 0.68, P = 0.001). Bland-Altman analysis of CO demonstrated a mean bias of -131 ml/min and precision of 52 ml/min. Linear regression analysis yielded a simple correction factor for USCOM OTD estimation. Following application of this correction factor (0.68 x OTD-U), the mean bias improved to -0.1 ml/min with precision of 38 ml/min.
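Because Doppler cardiac output scales with the cross-sectional area of the outflow tract, i.e. with the square of its diameter, rescaling the OTD by 0.68 rescales the CO by 0.68 squared. A minimal sketch of how the proposed correction would be applied (the function name is illustrative, not from the study):

```python
# Applying the outflow tract diameter (OTD) correction factor proposed in
# abstract P84. CO is proportional to OTD**2, so a diameter factor of 0.68
# rescales CO by 0.68**2 (about 0.46).
OTD_FACTOR = 0.68

def correct_uscom_co(co_uscom_ml_min: float) -> float:
    """Rescale a USCOM cardiac output (ml/min) via the squared OTD factor."""
    return co_uscom_ml_min * OTD_FACTOR ** 2

# The reported mean USCOM CO (245 ml/min) rescales close to the mean
# echocardiographic CO (113 ml/min), consistent with the improved bias.
print(round(correct_uscom_co(245), 1))
```

This squared-diameter relationship is why a modest diameter correction roughly halves the measured CO.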

Conclusions Measurement of rat CO using the USCOM human neonatal algorithm (v1.7) is not interchangeable with specialised pulsed-wave Doppler echocardiography. We propose a simple correction factor that should improve performance of this device in the rodent laboratory. Incorporation into a rat-specific algorithm should be evaluated prospectively across a range of potential applications.

Prospective, observational study of the reliability of achieving diagnostic quality transthoracic echocardiography images in critically ill adult patients

C Weaver, N Masani, J Parry-Jones

University Hospital of Wales, Cardiff, UK

Critical Care 2008, 12(Suppl 2):P85 (doi: 10.1186/cc6306)

Introduction Echocardiography is often requested in the management and diagnosis of hemodynamically unstable critically ill patients [1]. Transoesophageal echocardiography (TOE) is often considered the echocardiographic test of choice in the general ICU patient population, based on studies in which transthoracic echocardiography (TTE) commonly offered inadequate images [2]. The aim of this study was to assess the quality and quantity of images obtained in critically ill patients.

Methods Patients were recruited from February 2006 to December 2007, when the attending consultant requested a TTE on clinical grounds. A single operator carried out all of the TTE procedures. Each study was performed in the 45° head-up, left lateral position. Left ventricular function was assessed using either Simpson's biplane model or the 16-segment wall motion score index (WMSI). All studies and changes in management were recorded in the patient's notes. Demographic, diagnostic and severity scoring data were collected.

Results Sixty-six TTE procedures were performed. Mean age of patients was 69 ± 13 years. Eighteen out of 66 studies lacked one or more basic views. The commonest request was for left ventricular function, 45% were normal studies, and the commonest changes in management were fluid boluses, inotrope changes, and commencement of ACE inhibitor therapy. Five TOE procedures were requested. The Simpson biplane method was obtained in 65% of the patients. The WMSI was obtained in 73% of studies. In ventilated patients, the mean positive end-expiratory pressure (PEEP) in the full studies was 7.7 cmH2O. The mean PEEP was 11.5 cmH2O in the inadequate studies. The parasternal windows were impaired by high PEEP settings.

Conclusions In 73% of the patients a full study was performed. Studies may be impaired in patients whose respiratory support requires PEEP > 10 cmH2O. Changes in management occurred in 60% of the patients within 48 hours. TTE should therefore be considered the initial and principal echocardiographic investigation in critically ill patients. In a minority of cases, inadequate views may require progression to TOE.

References

1. Chandrasekhar J, et al.: Chest 2003, 124:174-177.

2. Majo X, et al.: Chest 2004, 126:1592-1597.

Evaluation of bedside lung ultrasonography in the diagnosis of alveolar-interstitial syndrome and pleural effusion in the ICU

K Stefanidis, K Vitzilaios, E Tripodakis, S Poulaki, E Angelopoulos, V Markaki, P Politis, D Zervakis, S Nanas

Evaggelismos Hospital, National and Kapodistrian University of Athens, Greece

Critical Care 2008, 12(Suppl 2):P86 (doi: 10.1186/cc6307)

Introduction The purpose of this study was to determine the efficacy of ultrasonography (US) in the detection of alveolar-interstitial syndrome and pleural effusion in critically ill patients, compared with the results of the gold standard, computed tomography (CT).

Methods Twenty-seven consecutive critically ill patients were enrolled in this study (age = 65 ± 17 years, male/female = 10/17, APACHE II score = 18.3 ± 6.2 and Lung Injury Score = 1.0 ± 0.7). Lung US was performed before or after CT within an interval of 20 hours by two independent physicians blinded to the results of the CT. Ultrasound scanning of the anterior and the lateral chest was obtained on the right and left hemithorax, from the second to the fourth-fifth intercostal space from parasternal to midaxillary line. The results of the US scanning in each intercostal space were grouped into the respective lobe (superior, mid and inferior for the right lung, and superior and inferior for the left lung) and were compared with the findings of the CT in each lobe respectively. Alveolar-interstitial syndrome was defined as the presence of more than two comet-tail artifacts perpendicular to the pleural line, and the pleural effusion was detected as a hypoechoic space above the diaphragm.

Results The diagnostic sensitivity and specificity of US for the alveolar-interstitial syndrome were 94.1% and 60% for the right superior lobe, 93.7% and 100% for the right mid lobe, 76.5% and 90% for the right inferior lobe, 93.3% and 72.7% for the left superior lobe, and 88.2% and 90% for the left inferior lobe, respectively. Finally, the sensitivity and specificity of US for pleural effusion were 94.7% and 100% on the right and 86.6% and 91.7% on the left, respectively.

Conclusions The preliminary data of this study suggest that US may provide essential information about the respiratory condition of the critically ill patient. The fact that lung US is an imaging tool that can be easily performed at the bedside, is free of radiation exposure and is less costly makes it an attractive and promising alternative to CT.
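The per-lobe figures follow directly from the standard definitions of sensitivity and specificity. A minimal sketch, with contingency counts chosen to reproduce the right superior lobe values; the study's actual per-lobe counts are not given in the abstract, so these are assumed:

```python
# Sensitivity and specificity from a 2x2 comparison of lung ultrasound
# against CT (the reference standard), per lobe, as in abstract P86.
def sens_spec(tp: int, fn: int, tn: int, fp: int):
    """tp/fn: CT-positive lobes detected/missed by US;
    tn/fp: CT-negative lobes correctly/incorrectly called by US."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Counts chosen so the right superior lobe figures (94.1%, 60%) emerge:
# 16 of 17 CT-positive lobes detected, 6 of 10 CT-negative lobes excluded.
sens, spec = sens_spec(tp=16, fn=1, tn=6, fp=4)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
```

The asymmetry here (high sensitivity, modest specificity) matches the pattern reported for the anterior-superior regions.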

Measurement of vena cava inferior diameter in hemorrhagic shock diagnosis

D Akıllı, A Bayır, M Cander, D Kara

Meram Faculty of Medicine, Selçuk University, Konya, Turkey

Critical Care 2008, 12(Suppl 2):P87 (doi: 10.1186/cc6308)

Introduction The present study aimed to investigate the usability of the vena cava inferior (VCI) diameter as a predictor for acute blood loss and to compare it with other parameters in patients with shock.

Methods A total of 78 patients were included in the study. These patients were divided into two groups: a control group consisting of 50 healthy individuals and a study group consisting of 28 case patients. Vital signs of both groups were taken, shock indices were calculated and the measured VCI diameters of both groups were compared. Furthermore, VCI diameters were also compared with the shock index, lactate, base excess and bicarbonate, and the relationships between these parameters and mortality were investigated.

Results Vena cava inferior expirium (VCIe)-anteroposterior diameter (AP) was 14.3 ± 3.6 mm in the study group and 29.3 ± 4.8 mm in the control group (P < 0.001), VCIe-mediolateral diameter (ML) was 8.9 ± 2.5 mm in the study group and 19.4 ± 3.6 mm in the control group (P < 0.001), vena cava inferior inspirium (VCIi)-AP was 10.9 ± 3.6 mm in the study group and 23.8 ± 5.0 mm in the control group (P < 0.001), and VCIi-ML was 7.0 ± 3.8 mm in the study group and 15.4 ± 3.2 mm in the control group (P < 0.001). The VCI diameters of the study and control groups were significantly different. No correlation was determined between the VCI diameters and the shock index, heart rate, systolic blood pressure, diastolic blood pressure, given liquid amount, hemoglobin, hematocrit, white blood cells and base excess.

Lactate (r = 0.55) was correlated with all VCI diameters; this correlation was strongest for VCIe-ML. The shock index was less well correlated with base excess and lactate (r = 0.37 and 0.43, respectively). A significant decrease was found in diastolic blood pressure, VCIe-ML and VCIi-ML, in addition to lactate, bicarbonate and base excess, when dead and alive patients were compared (P < 0.05).

Conclusions The VCI diameter can give more reliable information than the shock index and other parameters, especially in trauma patients, to determine acute blood loss, and it can be used as a follow-up parameter of hemorrhagic shock. A decreased VCI diameter measured on admission in patients with hemorrhagic shock might be a predictor of high mortality.

References
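The shock index that the VCI diameters were compared against is the simple ratio of heart rate to systolic blood pressure. A minimal sketch with illustrative values (not the study's data):

```python
# Shock index, the bedside hemorrhage ratio used as a comparator in
# abstract P87: heart rate (bpm) divided by systolic blood pressure (mmHg).
def shock_index(heart_rate_bpm: float, systolic_bp_mmhg: float) -> float:
    return heart_rate_bpm / systolic_bp_mmhg

# A normal index is roughly 0.5-0.7; values approaching or exceeding 1.0
# suggest significant blood loss.
si = shock_index(120, 90)  # a tachycardic, hypotensive patient
print(round(si, 2))
```

Its weak correlation with lactate and base excess in this series is part of why the authors favour VCI diameter as the follow-up parameter.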

1. Tetsuka T, et al.: ASAIO J 1995, 41:105-110.

2. Morishita Y, et al.: Clin Exp Nephrol 2005, 9:233-237.

3. Marcelino P, et al.: Rev Port Pneumol 2006, 12:637-658.

4. Barbier C, et al.: Intensive Care Med 2004, 30:1740-1746.

5. Feissel M, et al.: Intensive Care Med 2004, 30:1834-1837.

Arterial pressure changes during the Valsalva maneuver to predict fluid responsiveness in spontaneously breathing patients

MI Monge Garcia, A Gil Cano, JC Diaz Monrove

Hospital de Jerez, Jerez de la Frontera, Spain

Critical Care 2008, 12(Suppl 2):P88 (doi: 10.1186/cc6309)

Introduction Although the superiority of dynamic parameters over static values of preload is widely accepted, predicting fluid responsiveness in spontaneously breathing patients is still a challenging problem. A sudden increase in intrathoracic pressure during the Valsalva maneuver transiently impairs venous return and, only in preload-dependent patients, decreases the stroke volume and arterial pulse pressure (PP). We designed this study to assess the predictive value of the arterial pressure response to a Valsalva maneuver for fluid responsiveness in spontaneously breathing patients.

Methods In 27 spontaneously breathing patients, the Valsalva maneuver, consisting of a forced expiration through a closed mouthpiece, was performed. Patients were encouraged to maintain a constant pressure of 30 cmH2O for 10 seconds. The Valsalva pulse pressure variation (ΔVPP) was defined using the maximum PP at the beginning of the strain (PPmax) and the minimum PP (PPmin) according to the known formula: ΔVPP (%) = (PPmax - PPmin) / [(PPmax + PPmin) / 2]. A first set of measurements was obtained at baseline and immediately after the Valsalva maneuver was performed. Cardiac output (FloTrac/Vigileo™), central venous pressure, invasive arterial pressure and respiratory pressure were continuously recorded during the whole strain time. After volume expansion, new measurements were obtained and the Valsalva maneuver was performed again post infusion.

Results The volume expansion-induced increase in the stroke volume index was >15% in 10 patients (responders) and <15% in 17 patients (nonresponders). The baseline ΔVPP was higher and decreased more after volume expansion in responders (P < 0.0001). The baseline ΔVPP and the ΔVPP change after volume expansion were statistically correlated with changes in the stroke volume index (r = 0.83 and r = 0.74, P < 0.0001, respectively). The baseline ΔVPP accurately predicted changes induced by volume expansion with a sensitivity of 90% and a specificity of 94%. The area under the ROC curve for ΔVPP was 0.96 (95% CI = 0.81-0.99, P = 0.0001), with a best cutoff value of 50%.

Conclusions Arterial pressure variations induced by a Valsalva maneuver reliably predict fluid responsiveness in spontaneously breathing patients.
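The Valsalva pulse pressure variation formula defined in the Methods, with the implicit factor of 100 that makes it a percentage, can be sketched as:

```python
# Valsalva pulse pressure variation as defined in abstract P88:
# VPP (%) = (PPmax - PPmin) / [(PPmax + PPmin) / 2] * 100.
def valsalva_pp_variation(pp_max_mmhg: float, pp_min_mmhg: float) -> float:
    """Percentage fall in pulse pressure, normalised to its mean,
    during the Valsalva strain."""
    mean_pp = (pp_max_mmhg + pp_min_mmhg) / 2
    return (pp_max_mmhg - pp_min_mmhg) / mean_pp * 100

# With the reported cutoff of 50%, a fall in pulse pressure from 60 to
# 30 mmHg during the strain would classify the patient as a likely responder.
dvpp = valsalva_pp_variation(60, 30)
print(f"{dvpp:.0f}%")
```

Normalising to the mean of the two pulse pressures, rather than to PPmax alone, keeps the index symmetric and is the same convention used for respiratory pulse pressure variation.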

Passive leg raising predicts fluid responsiveness after cardiac surgery

F Galas, L Hajjar, T Polastri, T Faustino, W Leao, L Sampaio, J Auler Jr

Heart Institute, Faculdade de Medicina da Universidade de Sao Paulo, Brazil

Critical Care 2008, 12(Suppl 2):P89 (doi: 10.1186/cc6310)

Introduction Passive leg raising (PLR) represents a self-volume challenge that could predict fluid responsiveness with fewer limitations than other methods such as stroke volume variation and pulse pressure variation. We hypothesized that the hemodynamic response to PLR is able to predict fluid responsiveness in mechanically ventilated patients after cardiac surgery.

Methods A prospective study in a surgical ICU of a university hospital was performed. We investigated 44 patients in the immediate postoperative period after cardiac surgery while on mechanical ventilation with no spontaneous breathing activity. Fourteen patients had arrhythmias. The hemodynamic status was evaluated at baseline, after PLR and after volume expansion (500 ml HAES 130/0.4 infusion over 10 min). In patients without arrhythmias, the pulse pressure variation was calculated.

Results In 22 patients (responders), the cardiac index increased by >15% after fluid infusion. A PLR-induced increase of cardiac index >15% predicted fluid responsiveness with a sensitivity of 95% and a specificity of 94%. In patients without arrhythmias, a respiratory variation in pulse pressure >13% predicted fluid responsiveness with a sensitivity of 92% and a specificity of 88%.

Conclusions In a group of patients submitted to cardiac surgery, the changes in cardiac index induced by PLR predicted fluid responsiveness in ventilated patients with higher sensitivity and specificity than the respiratory variation in pulse pressure.

References

1. Monnet X, Rienzo M, Osman D, et al.: Passive leg raising predicts fluid responsiveness in the critically ill. Crit Care Med 2006, 34:1402-1407.

2. Michard F, Teboul J-L: Predicting fluid responsiveness in ICU patients: a critical analysis of the literature. Chest 2002, 121:2000-2008.

Variation of hemodynamic parameters after fluid challenge

A Donati, L Botticelli, C Anastasi, M Romanelli, L Romagnoli, R Nardella, P Pelaia

Universita Politecnica delle Marche, Torrette di Ancona, Italy

Critical Care 2008, 12(Suppl 2):P90 (doi: 10.1186/cc6311)

Introduction We compared thoracic electrical bioimpedance (TEB) with transpulmonary thermodilution (TT) to evaluate the accuracy of the cardiac index with the ICG (CI-I) and the cardiac index with the PiCCO (CI-P) before and after fluid challenge (FC), to determine the correlation between intrathoracic blood volume (ITBV) and corrected flow time (FTc) before and after FC, to verify the parameters' response after FC, and to establish the credibility of total fluid content (TFC) as a pulmonary fluid index in comparison with extravascular lung water (EVLW).

Methods We recruited 33 patients from May 2006 to July 2007. Inclusion criteria: unstable hemodynamic conditions, mechanical ventilation. Exclusion criteria: intraaortic balloon pump, aortic failure. We used 7 ml/kg hydroxyethyl starch 6% in 30 minutes for the FC. We used the PiCCO (Pulsion Medical Systems AG) and the ICG (Solar ICG module; GE Medical Systems Technology, Milwaukee, USA) to monitor hemodynamic parameters. We studied the strength of the association between all of the hemodynamic parameters of the ICG and the PiCCO with the correlation coefficient (P < 0.005).

Results The correlation coefficient between the differences of CI-I and CI-P before and after FC is 0.6090 (P = 0.0002). The correlation coefficient between the differences of EVLW and TFC before and after FC is 0.1192 (P = 0.51). The correlation coefficient between the differences of FTc and ITBV before and after FC is 0.3443 (P = 0.04).
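The correlation coefficients reported here are Pearson correlations between the pre/post-FC differences measured by the two monitors. A stdlib-only sketch with illustrative paired differences, not the study's data:

```python
# Pearson correlation between paired fluid-challenge responses from two
# monitors, as in abstract P90 (e.g. r = 0.6090 for delta CI-I vs delta CI-P).
from math import sqrt
from statistics import mean

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(vx * vy)

# Illustrative paired changes in cardiac index (l/min/m2) after FC
delta_ci_icg = [0.3, 0.5, -0.1, 0.8, 0.2, 0.6]
delta_ci_picco = [0.4, 0.7, 0.0, 1.0, 0.1, 0.5]
print(f"r = {pearson_r(delta_ci_icg, delta_ci_picco):.2f}")
```

Correlating the differences, rather than the raw values, is what tests whether the two devices track the same filling response.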

Conclusions The study demonstrates that the ICG can detect an increase in cardiac output after FC, but less well than the PiCCO. The correlation coefficient between CI-P and CI-I is lower after FC, so CI-I seems less accurate in identifying the filling response. There is agreement between TFC and EVLW before the FC. The ITBV from the PiCCO demonstrates more clinical utility in identifying a response to FC. Even if TEB is clinically useful, it is not a viable substitute for TT. The parameters we studied have less clinical efficacy than the classic methods of TT, as recent studies in the literature demonstrate. TEB should be used when catheterization of a central artery is contraindicated, when no other monitoring method is available and especially when there is a need for rapid monitoring.

References

1. Engoren et al.: Am J Crit Care 2005, 14:40-45.

2. Heringlake et al.: Intensive Care Med 2007, 33:2168-2172.

Comparison of pulse pressure variation and end-diastolic volume index in an experimental model of hemorrhagic shock in the pig

MA Oliveira, DA Otsuki, JO Auler Jr

Faculdade de Medicina da Universidade de Sao Paulo, Brazil

Critical Care 2008, 12(Suppl 2):P91 (doi: 10.1186/cc6312)

Introduction Different hemodynamic parameters, including static indicators of cardiac preload such as the end-diastolic volume index (EDVI) and dynamic parameters such as the pulse pressure variation (PPV) have been used in the decision-making process regarding volume expansion in critically ill patients. The objective of this study was to compare fluid resuscitation guided by either PPV or EDVI, after hemorrhagic shock.

Methods Twenty anesthetized and mechanically ventilated pigs were randomly allocated into two groups: PPV and EDVI. Hemorrhagic shock was induced by removing blood to a target pressure of 40 mmHg and was maintained for 60 minutes. Parameters were measured at baseline, at the time of the shock (Shock0), 60 minutes after the shock (Shock60), immediately after resuscitation with hydroxyethyl starch 6% (130/0.4) (R0), and 1 hour (R60) and 2 hours (R120) after resuscitation. The endpoint of fluid replacement was to re-establish the baseline values of PPV or EDVI. Data were submitted to ANOVA for repeated measures followed by the Bonferroni test.

Results The resuscitation solution volume was higher in the EDVI group when compared with the PPV group (EDVI = 1,305 ± 331 ml and PPV = 965 ± 245 ml; P < 0.05). The time required to reach the endpoint was also different between groups (PPV = 8.8 ± 1.3 min and EDVI = 24.8 ± 4.7 min). The cardiac index decreased after shock (Shock0 and Shock60, P < 0.01) and increased after resuscitation (R0, P < 0.01) in the PPV group. In the EDVI group, the cardiac index decreased at Shock0 (P < 0.05) and increased during R0 and R60 (P < 0.05). The right atrial pressure and pulmonary artery wedge pressure decreased after shock in both groups (Shock0 and Shock60, P < 0.05), reaching baseline values after resuscitation. Oxygen delivery decreased after shock in both groups (Shock0 and Shock60, P < 0.001), recovered the baseline value at R0 in both groups, but decreased at R60 and R120 in the PPV group and at R120 in the EDVI group. Lactate increased at Shock60 in both groups and remained high at R0 in the PPV group and at R0 and R60 in the EDVI group.

Conclusions After hemorrhagic shock, the resuscitation to an established endpoint was quicker and required less fluid with PPV when compared with EDVI.

Acknowledgements Supported by grants from FAPESP (05/59470-0). Performed at LIM08.


Reference

1. Frédéric M, et al.: Crit Care 2007, 11:131.

Evaluation of systolic pressure variation and pulse pressure variation in an experimental model of acute normovolemic hemodilution

AJ Sant'Ana1, DA Otsuki1, DT Fantoni2, JO Auler Jr1

1Faculdade de Medicina da Universidade de Sao Paulo, Brazil;

2Faculdade de Medicina Veterinaria da Universidade de Sao Paulo, Brazil

Critical Care 2008, 12(Suppl 2):P92 (doi: 10.1186/cc6313)

Introduction Dynamic indicators derived from arterial pressure waveform analysis, such as systolic pressure variation (SPV) and pulse pressure variation (PPV), have been shown to be useful tools to optimize tissue perfusion in clinical and experimental studies. These indicators, however, have not been adequately explored in situations of acute variation of blood viscosity, as occurs during acute normovolemic hemodilution (ANH). The purpose of this research was to compare the behavior of these dynamic indicators in a porcine model of ANH with two different solutions. Methods Fourteen pigs were anesthetized and randomly allocated into two groups: GI (ANH with 6% hydroxyethyl starch, 1:1) and GII (ANH with 0.9% saline solution, 1:3). Static and dynamic hemodynamic parameters were evaluated at baseline (T1), immediately after ANH (T2), and 1 hour and 2 hours after ANH (T3 and T4). Data were submitted to ANOVA for repeated measures followed by the Bonferroni test.
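As background, SPV and PPV are conventionally computed over one respiratory cycle from per-beat arterial pressures. The following is a minimal sketch with invented per-beat values (the study itself used monitor-derived indices; all variable names here are illustrative):

```python
# Illustrative computation of SPV and PPV over one respiratory cycle.
# Per-beat values (mmHg) are invented for demonstration.

def spv(systolic):
    """Systolic pressure variation: max minus min systolic pressure (mmHg)."""
    return max(systolic) - min(systolic)

def ppv(pulse_pressure):
    """Pulse pressure variation, as a percentage of the mean of the extremes."""
    pp_max, pp_min = max(pulse_pressure), min(pulse_pressure)
    return 100.0 * (pp_max - pp_min) / ((pp_max + pp_min) / 2.0)

beat_systolic = [118, 112, 106, 110, 116]       # mmHg over one breath
beat_pulse_pressure = [52, 46, 40, 44, 50]      # mmHg over one breath

print(spv(beat_systolic))                   # 12 (mmHg)
print(round(ppv(beat_pulse_pressure), 1))   # 26.1 (%)
```

In ventilated patients, larger SPV/PPV values indicate greater respiratory swing in stroke volume and hence likely fluid responsiveness, which is why both indices fall after volume expansion.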

Results The cardiac index increased in GI after ANH (T2 and T3, P < 0.001). The mean arterial pressure was sustained in GI, but decreased after ANH in GII (T2, T3 and T4, P < 0.05). The right atrial pressure and pulmonary artery wedge pressure decreased in GI (T3 and T4, P < 0.05). The PPV increased in GII (T3 and T4, P < 0.001). The SPV increased in GI (T4, P < 0.05) and GII (T3 and T4, P < 0.001).

Conclusions Fluid responsiveness evaluated by PPV and SPV, together with the cardiac index and mean arterial pressure, showed that hemodynamics were better preserved in animals submitted to ANH with hydroxyethyl starch. We may suppose that starch remains longer in the vessels than normal saline solution.

Acknowledgements Supported by grants from FAPESP (06/55221-8). Performed at LIM08.


References
1. Grocott MPW, et al.: Anesth Analg 2005, 100:1093-1106.
2. Michard F: Anesthesiology 2005, 103:419-428.
3. Lopes et al.: Clinics 2006, 61:345-350.

Venous return in ICU patients

J Maas1, WK Lagrand1, PC Van den Berg1, MR Pinsky2, JR Jansen1

1Leiden University Medical Center, Leiden, The Netherlands; 2University of Pittsburgh, PA, USA

Critical Care 2008, 12(Suppl 2):P93 (doi: 10.1186/cc6314)

Introduction Guyton's theory on venous return, implying a linear relationship between blood flow and central venous pressure, was tested in 12 mechanically ventilated ICU patients during standard care.

Methods The central venous pressure was changed by applying four different constant inspiratory plateau pressures over 12 seconds. Mean values of central venous pressure and cardiac output were measured with pulse contour analysis over the last 3 seconds of this plateau period and were plotted against each other to construct a venous return curve. During the inspiratory plateau periods, hemodynamic steady-state circumstances were met without an observable change in cardiovascular control mechanisms. Two different volemic states were created: normovolemia in the supine position (SUP) and hypervolemia by volume loading with 0.5 l intravenously (SUP-V).

Results Guyton's linear venous return pressure-flow relationship was confirmed. The average slope of the relation during SUP was not significantly different from the slope during SUP-V. The mean systemic filling pressures derived from these venous return curves during SUP and SUP-V were 18.8 ± 4.5 mmHg and 29.1 ± 5.2 mmHg, respectively (P < 0.001). During SUP the calculated total circulatory mean compliance was 0.98 ml/mmHg/kg and the mean stressed volume was 1,677 ml.
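The mean systemic filling pressure is obtained by fitting the linear venous return relationship and extrapolating to zero flow, the point where central venous pressure equals Pmsf. A minimal sketch with invented plateau data (not the patient data above):

```python
# Sketch of the venous-return analysis: fit the linear CO-vs-CVP relation
# from the four inspiratory plateaus and extrapolate to zero flow, where
# CVP equals the mean systemic filling pressure (Pmsf). Data are invented.

def fit_line(x, y):
    """Ordinary least-squares intercept and slope for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

cvp = [8.0, 11.0, 14.0, 17.0]   # mmHg at four inspiratory plateau pressures
co = [5.4, 3.9, 2.4, 0.9]       # l/min, pulse contour cardiac output

a, b = fit_line(cvp, co)        # venous return curve: CO = a + b * CVP
pmsf = -a / b                   # CVP at which venous return is zero
print(round(pmsf, 1))           # 18.8 (mmHg)
```

Once Pmsf is known, the stressed volume follows as the product of total circulatory compliance and Pmsf, which is how the abstract's compliance and stressed-volume figures relate.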

Conclusions The mean systemic filling pressure, systemic compliance and stressed volume can be determined in mechanically ventilated patients with intact circulation using inspiratory pause procedures. These results may imply a physiological tool to assess the volume state of the circulation as well as fluid responsiveness of mechanically ventilated patients in the ICU.

Prediction of fluid responsiveness by FloTrac™ and PiCCOplus™ in cardiac surgery patients

C Hofer, A Senn, A Zollinger

Triemli City Hospital, Zurich, Switzerland

Critical Care 2008, 12(Suppl 2):P94 (doi: 10.1186/cc6315)

Introduction The aim of this study was to compare the prediction of fluid responsiveness [1] using the stroke volume variation (SVV) determined by FloTrac™ (SVV-FloTrac; Edwards Lifesciences, USA) and PiCCOplus™ (SVV-PiCCO; Pulsion Medical Systems, Germany).

Methods With ethics committee approval, the SVV-FloTrac, SVV-PiCCO, pulse pressure variation (PPV), global end-diastolic volume (GEDV) and stroke volume (SV) were measured before and after a volume shift induced by body positioning (30° head-up to 30° head-down) in 40 patients after cardiac surgery. A t test, Bland-Altman analysis, Pearson correlation and areas under the receiver operating characteristic curves (AUC) were calculated. P < 0.05 was considered significant.
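The AUC used to rank these predictors can be computed as the Mann-Whitney probability that a responder's baseline value exceeds a non-responder's. A small sketch with invented SVV values (not the study data; the 25% ΔSV responder definition follows the table below):

```python
# AUC as the Mann-Whitney statistic: the probability that a randomly chosen
# responder's baseline index exceeds a randomly chosen non-responder's.
# The SVV values below are invented for illustration.

def auc(responders, non_responders):
    """Fraction of responder/non-responder pairs ranked correctly (ties = 0.5)."""
    wins = 0.0
    for r in responders:
        for n in non_responders:
            if r > n:
                wins += 1.0
            elif r == n:
                wins += 0.5
    return wins / (len(responders) * len(non_responders))

svv_responders = [14, 16, 11, 18, 13]      # baseline SVV (%), dSV > 25%
svv_non_responders = [8, 10, 12, 7, 9]     # baseline SVV (%), dSV <= 25%

print(auc(svv_responders, svv_non_responders))   # 0.96
```

An AUC of 0.5 means no discrimination (compare GEDV in the table below), while values toward 1.0 indicate a useful predictor of fluid responsiveness.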

Results Body positioning resulted in a significant SV and GEDV increase, while SVV-FloTrac, SVV-PiCCO and PPV significantly decreased. Comparably strong correlations between SVV-FloTrac/SVV-PiCCO and ΔSV were observed (Table 1). The best AUC was found for SVV-FloTrac (threshold value: 12.1%) and

Table 1 (abstract P94)

AUC predicting ΔSV > 25% and Pearson correlation of baseline indices versus ΔSV

                 AUC      P value    r2       P value
SVV-FloTrac      0.824    0.001      0.426    <0.001
SVV-PiCCO        0.858    <0.001     0.492    <0.001
PPV              0.718    0.011      0.334    <0.001
GEDV             0.509    0.924      0.091    0.580

SVV-PiCCO (threshold value: 9.6%). Mean bias ± 2SD (SVV-FloTrac - SVV-PiCCO) was -2.5 ± 6.2%, and the correlation coefficient (r2) was 0.72 (P < 0.01).

Conclusions SVV-FloTrac and SVV-PiCCO showed comparable performance in predicting fluid responsiveness. When compared with SVV-PiCCO, a lower threshold value for SVV-FloTrac has to be considered.
Reference

1. Hofer CK, et al.: Chest 2005, 128:848-854.

Myocardial dysfunction in sepsis studied with the pressure recording analytical method

F Pratesi1, T Balden1, C Sommario1, P Bertini1, F Forfori2, SM Romano3, F Giunta2

1Scuola di specializzazione in Anestesia e Rianimazione, Universita degli studi di Pisa, AOUP, Pisa, Italy; 2Department of Surgery, AOUP, Pisa, Italy; 3University of Florence, School of Medicine, Florence, Italy

Critical Care 2008, 12(Suppl 2):P95 (doi: 10.1186/cc6316)

Introduction Myocardial dysfunction is one of the most common organ failures in patients with sepsis, characterized by transient biventricular contractility impairment as well as systolic and diastolic dysfunction. The aim of this study was to evaluate early hemodynamic alterations and myocardial dysfunction in severe sepsis using the pressure recording analytical method (PRAM). Methods Patients with severe sepsis or septic shock admitted to the ICU were enrolled. We studied hemodynamic variations and cardiac performance measured with PRAM (cardiac output, stroke volume, systemic vascular resistance, stroke volume variation, pulse pressure variation, cardiac cycle efficiency (CCE), maximal rate of rise in the arterial pulse curve (dP/dt max)) for the first

Figure 1 (abstract P95)


Per-patient values of LVEF%, E/A ratio, LVEDS, mean CCE and mean dP/dt max for patients 1-11 (tabulated data and bar chart).

24 hours. At admission, transthoracic echocardiography (TEC) was used to assess the left ventricular ejection fraction (LVEF%), E/A ratio and left ventricular end-diastolic surface (LVEDS). Mortality at 28 days was measured.

Results Eleven patients were included (six severe sepsis, five septic shock). Mortality was 9%. TEC documented preserved systolic function in all patients (LVEF > 50%), diastolic dysfunction (E/A ratio < 1) in five patients, and reduced LVEDS in two patients. Hemodynamic monitoring documented myocardial dysfunction, as a reduction of CCE and dP/dt max, in two patients who required inotropic support. CCE represents the ratio between the hemodynamic work and the energetic cost sustained by the heart [1].

Conclusions PRAM seems to be a valid, minimally invasive instrument for early diagnosis of myocardial dysfunction in patients with sepsis and for guiding therapy. In fact, as shown in Figure 1, despite a preserved LVEF%, CCE and dP/dt max clearly detected myocardial dysfunction, because these parameters relate cardiac function to vascular condition.
Reference

1. Romano SM, Pistolesi M: Assessment of cardiac output from systemic arterial pressure in humans. Crit Care Med 2002, 30:1834-1841.

TruCCOMS: real-time continuous cardiac output?

PR Lichtenthal1, FA Tinker2

1University of Arizona, Tucson, AZ, USA; 2Da Vinci Research, LLC, Tucson, AZ, USA

Critical Care 2008, 12(Suppl 2):P96 (doi: 10.1186/cc6317)

Introduction Omega Critical Care has introduced the truCCOMS system to address the need for a rapid and accurate cardiac output monitor. This monitor relates blood flow to the power required to maintain a fixed temperature difference between a coil on a pulmonary artery catheter and the surrounding blood. The system was tested to assess these claims of accuracy and speed.

Methods To test for accuracy, four truCCOMS catheters were tested against the reference values of a Transonic T201 flowmeter using pulsatile flow provided by a Syncardia Systems artificial heart

Figure 1 (abstract P96)

Measured accuracy of the truCCOMS device. CO, cardiac output.

Figure 2 (abstract P96)

Response time: truCCOMS versus Edwards continuous cardiac output (CCO) catheter.

connected to a Donovan mock circulation tank generating flows between 1 and 8 l/min. Additionally, the response time was monitored for abrupt changes in flow from 3 to 6 l/min and compared with an Edwards continuous cardiac output catheter. Results Measured flows from the truCCOMS unit, properly corrected for thermodynamic differences between blood and the glycerine used, correlate closely with Transonic values, as seen in Figure 1. Figure 2 shows the substantial improvement in response time provided by the truCCOMS system. Conclusions Our results show that the principles used in the truCCOMS monitor can provide accurate measurement of cardiac output. More importantly, it provides this measurement in near real-time. Clinical studies should confirm these results. This system promises to be an accurate and responsive monitor in the clinical setting.

Evaluation of a modified FloTrac™ algorithm for cardiac output measurement in cardiac surgery patients

A Senn, A Zollinger, C Hofer

Triemli City Hospital, Zurich, Switzerland

Critical Care 2008, 12(Suppl 2):P97 (doi: 10.1186/cc6318)

Introduction The first evaluation studies of the FloTrac™ device (Edwards Lifesciences, USA) for cardiac output (CO) measurement revealed conflicting results [1,2]. The initially used software version may in part be responsible for these findings. The aim of this study was to compare the CO determined by FloTrac™ using software versions 1.03 and 1.07 (aFCO and bFCO) with the CO measured by PiCCOplus™ (Pulsion Medical Systems, Germany) (PCO) and the CO assessed by intermittent thermodilution (ICO). Methods With ethics committee approval, CO was assessed after cardiac surgery. For one set of data (dataset A) aFCO was used, and for the other (dataset B) bFCO. After PiCCO calibration, the means of triplicate FCO, PCO and ICO values were recorded 15 minutes after inducing CO changes by different body positions (supine, 30°

head-up, 30° head-down, supine). Statistical analysis was performed using the t test, ANOVA and Bland-Altman analysis for absolute values and percentage changes (Δ). P < 0.05 was considered significant.

Results Data were obtained from 25 patients and 22 patients for dataset A and dataset B, respectively. Significant changes of FCO, PCO and ICO between measurement points were observed in datasets A and B. During dataset A, ΔaFCO was significantly greater and ΔPCO was significantly smaller than ΔICO induced by head-down positioning (P = 0.017 and P < 0.001, respectively). During dataset B no significant difference was observed between ΔbFCO and ΔICO. ΔPCO was significantly smaller than ΔICO during dataset B. Increased limits of agreement for aFCO-ICO and ΔaFCO-ΔICO (dataset A) were found when compared with PCO-ICO (Table 1). For dataset B the mean bias and limits of agreement were comparable.

Table 1 (abstract P97)

Bland-Altman analysis of absolute cardiac output (CO) values and percentage CO changes (Δ)

              aFCO-ICO      PCO-ICO (A)    bFCO-ICO      PCO-ICO (B)
CO (l/min)    -0.1 ± 2.4    -0.5 ± 1.1     -0.2 ± 1.4    -0.2 ± 1.3
ΔCO (%)       -0.6 ± 48.3   -0.4 ± 24.8    -3.8 ± 28.0   -1.5 ± 25.3

Conclusions These results indicate that the new FloTrac software version (reduced time window for vascular compliance adjustment) improved the performance of CO measurement in patients after cardiac surgery.
References

1. Mayer J, Boldt J, Schollhorn T, et al.: Semi-invasive monitoring of cardiac output by a new device using arterial pressure waveform analysis: a comparison with intermittent pulmonary artery thermodilution in patients undergoing cardiac surgery. Br J Anaesth 2007, 98:176-182.

2. Button D, Weibel L, Reuthebuch O, et al.: Clinical evaluation of the FloTrac/Vigileo system and two established continuous cardiac output monitoring devices in patients undergoing cardiac surgery. Br J Anaesth 2007, 99:329-336.

Reliability of continuous pulse contour cardiac output measurement

L Weng1, H Xu2, X Hu1, J Peng1, B Du1

1Peking Union Medical College Hospital, Beijing, China; 2Cancer Institute, CAMS, Beijing, China

Critical Care 2008, 12(Suppl 2):P98 (doi: 10.1186/cc6319)

Introduction To evaluate the reliability of continuous cardiac output monitoring using pulse-contour analysis in critically ill patients.

Methods We retrospectively analyzed the agreement between transpulmonary thermodilution cardiac output (TPCO) and pulse contour cardiac output (PCCO) measured before recalibration of the TPCO from 34 patients with hemodynamic instability. Logistic regression analysis was used to identify the independent factors for the disagreement between TPCO and PCCO, defined as a relative change >15%.

Results We obtained 261 pairs of measurements. A relative change in systemic vascular resistance calculated with TPCO (ΔSVR-TPCO) of over 20% was the only independent factor for disagreement, while the relative change in systemic vascular resistance calculated with PCCO (ΔSVR-PCCO) and the time interval between calibrations had no predictive value for the reliability of PCCO. See Figure 1.

Figure 1 (abstract P98)

Conclusions ΔSVR-TPCO > 20% was associated with unreliability of PCCO measurement. Reliability of PCCO could not be predicted by continuous monitoring parameters such as ΔSVR-PCCO.
Indication of peripheral decoupling during extreme hyperdynamic conditions in septic patients

L Hüter, M Wirsing, G Marx

Friedrich Schiller Universität, Jena, Germany

Critical Care 2008, 12(Suppl 2):P99 (doi: 10.1186/cc6320)

Introduction Understanding the limitations of minimally invasive hemodynamic monitoring devices is important in order to assess when and where these devices provide utility. Challenging situations for these devices occur during extreme hyperdynamic conditions such as septic shock, characterized by extreme loss of vascular tone and high cardiac output (CO). Identifying when a patient has entered this condition could prove useful both for assessing the performance of such devices and for providing additional information about the patient's condition. Methods In a group of 18 patients we evaluated the sensitivity and specificity of a new peripheral decoupling indicator (PDI) in identifying moments when physiological patient conditions may cause the arterial pressure-derived cardiac output (APCO) to be underestimated. Comparison was made between a pulmonary artery catheter (PAC) and an APCO sensor (FloTrac; Edwards Lifesciences, CA, USA). Data were collected over a period of 1,090 hours, providing a total of 196,369 data points for evaluation, with CO values ranging from 2 to 16 l/min.

Results The PDI demonstrated specificity of 96.7% and sensitivity of 82.6%. During these identified periods, FloTrac consistently exhibited a one-sided bias in its CO value, with a lower CO value when compared with the PAC. Only two of the patients exhibited periods of peripheral decoupling. Overall, the PDI indicated 'on' for 4,392 of the data points collected, or 2.2% of the time. Figure 1 illustrates the data from one of these patients. Conclusions The PDI identified moments when patient physiology led to underestimation of the FloTrac CO value. Additional research is needed to determine whether the low incidence rate of extended peripheral decoupling observed in our study is typical in

septic patients, and whether it could be correlated to patient condition or treatment. Further research is necessary to determine any potential prognostic value from the PDI.

Acknowledgement The study was supported by a limited grant from Edwards Lifesciences.

How accurate are different arterial pressure-derived estimates of cardiac output and stroke volume variation measures in critically ill patients?

B Lamia, HK Kim, A Hefner, D Severyn, H Gomez, JC Puyana, MR Pinsky

University of Pittsburgh Medical Center, Pittsburgh, PA, USA

Critical Care 2008, 12(Suppl 2):P100 (doi: 10.1186/cc6321)

Introduction We compared the cross-correlation between estimates of cardiac output (CO) and left ventricular stroke volume variation (SVV) amongst three commercially available, minimally invasive devices (LiDCOplus, FloTrac and PiCCO). Methods We compared continuous and bolus thermodilution CO measures from a pulmonary artery catheter (PAC) with simultaneous estimates of arterial pulse contour-derived CO using the FloTrac®, LiDCOplus® and PiCCO® in 20 cardiac surgery patients during the first two postoperative hours. We also compared SVV estimates among the three devices. Mean and absolute values for CO and SVV across all devices were compared by ANOVA and Bland-Altman analysis.

Results Mean CO values were not different across devices (5.8 ± 1.6 l/min vs 5.9 ± 1.7 l/min vs 5.8 ± 1.6 l/min for PiCCO, LiDCOplus and FloTrac, respectively; P = 0.4). The mean PAC CO (5.8 ± 1.6 l/min) was similar to PiCCO and FloTrac estimated CO values, but less than LiDCO CO values (P < 0.01). Biases between PAC and PiCCO, LiDCO and FloTrac values were 0.19 ± 0.57 l/min, -0.35 ± 0.56 l/min and -0.30 ± 1.56 l/min, respectively, and precision was -1.31 to 0.92 l/min, -1.46 to 0.77 l/min and -2.6 to 2.0 l/min, respectively. LiDCO and FloTrac SVV correlated (r2 = 0.58), however, with a bias of -0.40 ± 6.50% and a precision of -13 to 7%; whereas FloTrac and PiCCO SVV were not correlated (r2 = ns), with a bias of 4.0 ± 6.0% and a precision of -8 to 16%. LiDCO and PiCCO SVV were also not correlated (r2 = ns), with a

bias of -5.4 ± 9.0% and a precision of -22 to 17%. Finally, PiCCO and LIDCO pulse pressure variation were correlated (r2 = 0.64, P < 0.05), with a bias of 17.0 ± 6.5% and a precision of -10 to 15%.

Conclusions All three arterial pulse contour analysis devices estimated CO well with a high degree of accuracy and precision. Furthermore, of the two devices that also report pulse pressure variation, both gave similar estimates, whereas SVV estimates correlated well only between LiDCO and FloTrac. The results of prior studies using LiDCO and PiCCO-derived estimates of SVV cannot therefore be compared with each other, nor can absolute values be used to drive similar resuscitation protocols unless independently validated for that catheter.
Acknowledgements Funded by HL67181 and HL0761570.

Uncalibrated arterial pulse contour analysis in major vascular surgery

L Vetrugno, L Spagnesi, C Centonze, F Bassi, F Giordano, G Della Rocca

University Hospital, Udine, Italy

Critical Care 2008, 12(Suppl 2):P101 (doi: 10.1186/cc6322)

Introduction Assessment of continuous cardiac output using the arterial pulse wave (APCO) is currently available only with standard radial artery catheterization (Vigileo System, FloTrac™, Edwards Lifesciences, Irvine, CA, USA) [1,2]. Many of the studies available in the literature have compared APCO versus intermittent cardiac output (ICO) obtained with a pulmonary artery catheter (Intellicath, Edwards Lifesciences, Irvine, CA, USA) in patients undergoing cardiac surgery [3]. The aim of this study was to assess the bias and level of agreement between the APCO and ICO in patients undergoing major vascular surgery.

Methods Twenty elective patients undergoing abdominal aortic aneurysm (AAA) repair were enrolled. Patients with a preoperative history of valvular heart disease, preoperative dysrhythmias, or ejection fraction <40% were excluded from the study. APCO and ICO measurements were simultaneously collected at the following steps: before anesthesia induction (T1), after anesthesia induction (T2), 30 min after anesthesia induction (T3), at aortic cross-clamping (T4), 30 min after aortic cross-clamping (T5), 5 (T6), 10 (T7) and 30 (T8) min after aortic unclamping, and at the end of surgery (T9). Statistical evaluation was performed using Bland-Altman analysis. The percentage error was calculated according to the method described by Critchley et al. [4]. Results A total of 360 pairs of APCO/ICO measurements were analyzed and the bias was 0.09 ± 1.93 l/min/m2 with a percentage error of 28%. Subgroup analysis revealed that the bias, calculated without the measurements obtained during the T4 and T5 aortic cross-clamping periods, was 0.06 ± 1.97 l/min/m2 with a percentage error of 29%, surprisingly similar to the all-pairs results. Conclusions In patients undergoing major vascular surgery, APCO obtained with the Vigileo System provided a clinically acceptable bias and agreement with intermittent pulmonary thermodilution measurements, surprisingly also during the aortic cross-clamping period. Larger population studies are needed to confirm these very preliminary data.
References

1. Breukers RM, et al.: J Cardiothorac Vasc Anesth 2007, 21: 632-635.

2. Sander M, et al.: Crit Care 2005, 9:R729-R734.

3. Mayer J, et al.: Br J Anaesth 2007, 98:176-182.

4. Critchley LAH, et al.: J Clin Monit Comput 1999, 15:85-91.
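The Bland-Altman bias and Critchley-style percentage error used in the abstract above can be sketched as follows. The paired CO values are invented, and conventions differ on which mean CO to divide by (the mean of both methods is used here):

```python
# Minimal sketch of Bland-Altman bias/precision and the percentage error
# (1.96 * SD of the differences over the mean CO). Data are invented.
import statistics

def bland_altman(test, reference):
    """Return (bias, SD of differences) for paired measurements."""
    diffs = [t - r for t, r in zip(test, reference)]
    return statistics.mean(diffs), statistics.stdev(diffs)

def percentage_error(test, reference):
    """Percentage error: 1.96 * SD of differences / mean CO of both methods."""
    _, sd = bland_altman(test, reference)
    mean_co = statistics.mean(test + reference)
    return 100.0 * 1.96 * sd / mean_co

apco = [3.1, 2.8, 3.5, 2.6, 3.0, 3.3]   # pulse contour CO (l/min/m2)
ico = [3.0, 3.1, 3.2, 2.9, 2.8, 3.4]    # intermittent thermodilution CO

bias, sd = bland_altman(apco, ico)
print(round(bias, 2), round(percentage_error(apco, ico), 1))
```

A percentage error under the commonly accepted 30% level (the Critchley criterion cited in these abstracts) is taken to indicate clinical interchangeability of the two methods.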

Comparison of Niccomo™ bioimpedance cardiac output with lithium dilution (LiDCO™) in ICU patients

J Walker, M Jonas

Southampton University Hospital, Southampton, UK

Critical Care 2008, 12(Suppl 2):P102 (doi: 10.1186/cc6323)

Introduction Knowledge of cardiac output (CO) and vascular resistance has been shown to influence and direct clinical decisions in critical care patients [1,2]. The choice of which technology to measure CO, however, remains troublesome and confusing. The accuracy and invasiveness of CO measurement technologies are variable, as are the concepts surrounding whether calibration is necessary. These issues of invasiveness and accuracy are central to decisions relating to CO monitor selection and are perceived to have an inverse relationship. Niccomo™ is a new impedance cardiography (ICG) algorithm that is noninvasive ('plug and play') and uses no calibration. LiDCO™ is a well validated indicator dilution (ID) technique. This study compared the accuracy of the new ICG algorithm versus the LiDCO™ standard. Methods With consent/assent, 14 critically ill patients were studied. The ICG monitor was set up and, following two initial ID determinations, CO measurements were recorded from both monitors simultaneously at one to six time points over 6 hours. Results Fifty-one paired measurements were obtained from the ICG and ID monitors. The mean (±SD) COs were 6.11 (1.62) l/min and 4.67 (1.25) l/min, respectively. The mean bias was -1.44 l/min with a precision (standard deviation) of ±2.02 l/min. The lower and upper limits of agreement were -5.48 l/min (mean - 2SD) and 2.61 l/min (mean + 2SD) (P < 0.001). Spearman's correlation analysis showed r = 0.149, P = 0.297. On direct comparison of the initial paired readings, the Niccomo™ estimated a lower CO than the LiDCO™; however, the differences were extremely variable.

Conclusions ICG showed both poor agreement and poor correlation versus ID. The percentage error (53%) lies outside the graphically derived accepted 30% level [3], and in this patient population suggests that this ICG algorithm does not have the required accuracy to drive, or the clinical confidence to make, haemodynamic management decisions.
References

1. Perel A, et al.: Crit Care 2007, 11(Suppl 2):P285.

2. Jonas M, et al.: Crit Care 2003, 7(Suppl 2):P233.

3. Critchley LAH, et al.: J Clin Monit Comput 1999, 15:85-91.

Intrathoracic blood volume measurement: comparison of transpulmonary lithium indicator dilution with indocyanine green indicator dilution

B Maddison1, C Wolff1, G Findlay2, E Calzia3, C Hinds1, R Pearse1

1Queen Mary's University of London, William Harvey Research Institute, London, UK; 2University Hospital of Wales, Cardiff, UK;

3Universitatsklinikum, Ulm, Germany

Critical Care 2008, 12(Suppl 2):P103 (doi: 10.1186/cc6324)

Introduction Intrathoracic blood volume (ITBV) is thought to be a superior measure of cardiac preload compared with intravascular pressure [1]. Transpulmonary indocyanine green (ICG) indicator dilution is regarded as the most reliable method of ITBV measurement but is no longer commercially available. Our previous work suggests lithium indicator dilution could be used to measure the ITBV [2].

Methods Patients undergoing cardiac surgery with cardiopulmonary bypass who met inclusion criteria were enrolled into a single-centre, observational study. Perioperative care was standardised. Comparative ITBV measurements were performed 1, 2, 4 and 6 hours after surgery, using lithium indicator dilution via a radial artery catheter (LiDCOplus; LIDCO Ltd, UK) and ICG indicator dilution via a femoral artery catheter (COLD-Z; Pulsion, Germany). Data were compared by Bland-Altman analysis. Results Seventeen patients were recruited (age 69 (54-87) years; Parsonnet score 10 (0-29)), providing a total of 68 paired measurements. Sixteen ICG measurements were excluded because of poor-quality indicator dilution curves, leaving 52 paired comparisons. The mean ITBV measured by lithium dilution was 2,522 ml (±691) and measured by ICG dilution was 1,708 ml (±432). The mean bias between paired measurements was 813 ml (limits of agreement (Bland-Altman analysis) ±1,248; P < 0.001). For the cardiac index, however, the bias between techniques was only 0.39 l/min/m2 (limits of agreement (Bland-Altman analysis) ±0.9 l/min/m2; P < 0.0001). The discrepancy between the techniques therefore related to differences in the measurement of the mean indicator transit time. There was a decreasing trend in the mean differences in ITBV and mean indicator transit time (Li-ICG) from 1,014 ml and 16.1 seconds at hour 1 to 466 ml and 10.6 seconds at hour 6 (P = not significant). Conclusions Poor agreement between ITBV measurements taken using ICG and lithium indicator dilution appears to be due to inaccurate measurement of the mean indicator transit time. This may relate to the use of a radial as opposed to a femoral artery catheter in patients with poor peripheral perfusion. References

1. Wiesenack C, et al.: J Cardiothorac Vasc Anesth 2001, 15:584.
2. Maddison B, et al.: Crit Care 2007, 11(Suppl 2):295.
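The volume measured by any indicator dilution method is the product of blood flow and the mean indicator transit time (the Stewart-Hamilton relationship), so the ITBV discrepancy described above maps directly onto the transit-time discrepancy. A minimal sketch with invented numbers (not the study's patient data):

```python
# Indicator-dilution volume = blood flow x mean indicator transit time.
# All numbers below are illustrative only.

def dilution_volume_ml(co_l_min, mean_transit_time_s):
    """Volume (ml) traversed by the indicator: flow (ml/s) * transit time (s)."""
    flow_ml_s = co_l_min * 1000.0 / 60.0
    return flow_ml_s * mean_transit_time_s

co = 5.0  # l/min, assumed identical for both indicators (as the CI data suggest)

itbv_lithium = dilution_volume_ml(co, 30.0)   # longer measured transit time
itbv_icg = dilution_volume_ml(co, 20.0)       # shorter measured transit time
print(round(itbv_lithium), round(itbv_icg), round(itbv_lithium - itbv_icg))
```

This is why a near-zero cardiac index bias can coexist with a large ITBV bias: with flow measured almost identically, any systematic difference in mean transit time is multiplied by the flow and appears directly as a volume difference.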

Global end-diastolic volume as an indicator of cardiac preload in hemorrhagic shock and resuscitation in swine

CR Phillips, JM Watters, DS Hagg, MA Schreiber

Oregon Health & Science University, Portland, OR, USA

Critical Care 2008, 12(Suppl 2):P104 (doi: 10.1186/cc6325)

Introduction Optimal monitoring of cardiac preload is essential during resuscitation from hemorrhagic shock (HSR) to avoid under-resuscitation and over-resuscitation. The maintenance of adequate preload by administration of intravenous fluids remains a primary target to optimize hemodynamics in the early phase of HSR prior to the arrival of blood products. The central venous pressure (CVP) is commonly used as a goal to resuscitation; however, several studies have shown that cardiac filling pressures are not always accurate indicators of ventricular preload. The global end-diastolic volume (GEDV) determined at the bedside by the transpulmonary thermodilution method has been found to better assess cardiac preload in septic patients than CVP but this has not been examined in HSR. The present study was designed to assess the value of GEDV measured by transpulmonary thermodilution as an indicator of cardiac preload in HSR. Methods Twenty anesthetized swine underwent a grade V liver injury and bled without resuscitation for 30 minutes. Animals were then resuscitated with study fluid to, and maintained at, the preinjury mean arterial pressure. Hemodynamic parameters were evaluated in triplicate by the transpulmonary thermodilution technique: before and immediately after the liver injury and spontaneous hemorrhage; and 30 minutes after hemorrhage, immediately before and after resuscitation to the preinjury mean arterial pressure.

Figure 1 (abstract P104)

Results Changes in the GEDV index correlated more strongly with changes in stroke volume (SV) than did changes in CVP (Figure 1).

Conclusions In this porcine model of traumatic hemorrhagic shock and resuscitation, the GEDV, in contrast to the CVP, behaved as an indicator of cardiac preload.

Prognostic value of the extravascular lung water index in critically ill septic shock patients

J Mallat, P Salaun, D Thevenin, L Tronchon, C Patoir, G Gazan

Hospital of Lens, France

Critical Care 2008, 12(Suppl 2):P105 (doi: 10.1186/cc6326)

Introduction The study investigated the prognostic value of the extravascular lung water index (EVLWI) determined by the single transpulmonary thermodilution technique and its relationship with physiologic indexes of lung injury in critically ill patients with septic shock in the ICU.

Methods The EVLWI was determined using a PiCCO monitor, and the daily fluid balance, oxygenation ratio (PaO2/FiO2), pulmonary vascular index (PVI), lung compliance and lung injury score (LIS) were recorded. The final outcome was assessed at day 28. Data (mean ± SD) were compared using Student's t test for continuous variables and by the chi-squared test for discrete variables. The correlations were estimated using Pearson's coefficient. P < 0.05 was regarded as statistically significant.

Results Thirty patients with septic shock were admitted prospectively. Fourteen (47%) patients died before day 28. At day 1 and day 3 the EVLWI was correlated with PaO2/FiO2 (r = -0.4 and r = -0.47, respectively; P < 0.05) and with the LIS (r = 0.47 and r = 0.43, respectively; P < 0.05). No correlation was found, however, between the EVLWI and lung compliance or fluid balance. The average EVLWI at baseline was 12 ± 5 ml/kg and did not differ between survivors and nonsurvivors (P = 0.14). The EVLWI and PVI at day 3 in nonsurvivors were significantly higher than in the survivors (13.7 ± 4.5 vs 8.6 ± 2.6 ml/kg; P = 0.001 and 2.69 ± 0.98 vs 1.93 ± 0.65; P = 0.01, respectively). ROC statistics using the highest EVLWI value at day 3 in each individual revealed an area under the curve of 0.868 ± 0.128 (P = 0.001) with a cutoff point >11.5 ml/kg. At day 3, the hospital mortality of patients with EVLWI >11.5 ml/kg was significantly higher than that of patients with EVLWI <11.5 ml/kg (77% vs 19%; P = 0.02), with sensitivity of 77% and specificity of 80%. During the course of illness, the EVLWI, PVI and fluid balance decreased from days 1 to 3 only in the survivors (P < 0.05). Conclusions In human septic shock, the EVLWI demonstrated moderate correlation with markers of the severity of lung injury. Dynamic observation of the EVLWI can be one of the

factors for predicting the prognosis of patients with septic shock. A reduction of the EVLWI at early treatment was associated with a better prognosis.

Conjunctival microcirculation in patients with traumatic brain injury

E Klijn1, R Van Zijderveld2, C Den Uil1, C Ince1, J Bakker1

1Erasmus Medical Center, Rotterdam, The Netherlands; 2Academic Medical Centre, Amsterdam, The Netherlands

Critical Care 2008, 12(Suppl 2):P106 (doi: 10.1186/cc6327)

Introduction Traumatic brain injury (TBI) is one of the most important causes of death in young adults. Treatment aims at controlling the intracranial pressure (ICP) in order to maintain an adequate cerebral blood flow and to reduce the risk of secondary ischemic damage. Abnormal blood flow in the middle cerebral artery in patients with TBI was previously associated with poor outcome. Because the blood supply of the brain and that of the conjunctiva share a common origin, we hypothesized that the conjunctival microcirculation is altered after TBI in comparison with healthy subjects.

Methods We used sidestream dark-field (SDF) imaging for evaluation of the readily accessible microcirculation of the bulbar conjunctiva as a noninvasive research site. Conjunctival microcirculation was studied in eight patients with TBI requiring sedation and continuous ICP monitoring. In addition, we investigated eight age-matched healthy control individuals. Using MAS software we determined the functional vascular density (FVD) as the total length of perfused vessels per field of view as well as the microvascular flow index (MFI).

Results Data are presented as the median (interquartile range). The TBI patients had an ICP of 20 (15-25) mmHg and a cerebral perfusion pressure of 61 (53-77) mmHg. The conjunctival MFI in TBI patients was 2.94 (2.88-3.00) in comparison with 2.93 (2.79-3.00) in healthy controls. The FVD was 7.78 (7.54-8.14) and 8.53 (7.60-9.97) in TBI patients and healthy controls, respectively. There was no significant difference in microcirculatory parameters found between the groups.

Conclusions We found that the FVD and MFI did not differ between healthy subjects and patients with TBI. Based on these interim results, further research will focus on the effect of an elevated ICP on conjunctival microvascular blood flow.

Novel models for the prediction of mortality after traumatic brain injury requiring intensive care

JA Hyam1, CA Welch2, DA Harrison2, DK Menon3

1Charing Cross Hospital, London, UK; 2ICNARC, Case Mix Programme, London, UK; 3University of Cambridge, UK
Critical Care 2008, 12(Suppl 2):P107 (doi: 10.1186/cc6328)

Introduction Major head injury is a common reason for admission to the ICU. Knowledge of factors that predict mortality provides clues to the pathophysiology of head injury and to where clinicians' interventions can be most effective, allows audit between different units or time points, and provides objective data with which to communicate with patients' relatives. Several established risk prediction models exist in the ICU; however, they have been shown to have suboptimal discrimination and calibration in this patient group [1]. Our aim was therefore to develop a novel model to predict mortality specifically for head injury.

Methods A literature review was undertaken to identify variables predictive of mortality after severe head injury. The ICNARC Case Mix Programme database, containing data from 374,594 admissions to 171 critical care units in England, Wales and Northern Ireland from 1995 to 2005, was searched for head injury patients with a primary diagnosis of 'primary brain injury', 'subdural haematoma' or 'extradural haematoma'. Each variable that could be supported by the database was entered into a stepwise logistic regression model with mortality as the outcome. Discrimination of the risk prediction model was assessed by the area under the receiver operating characteristic curve, calibration by the Hosmer-Lemeshow C statistic and overall fit by the Brier score.

Results A total of 10,937 admissions with head injury were identified. A prediction model was constructed using 14 variables and shown to have superior discrimination and calibration to APACHE II, SAPS II and MPM II. A simplified model consisting of only three variables also performed better than the existing models.

Conclusions We present two novel prediction models for mortality after head injury requiring intensive care. Both models, even the simplified model of only three variables, had superior discrimination and calibration to existing ICU risk-prediction models. Reference

1. Hyam JA, Welch CA, Harrison DA, Menon DK: Case mix, outcomes and comparison of risk prediction models for admissions to adult, general and specialist critical care units for head injury: a secondary analysis of the ICNARC Case Mix Programme Database. Crit Care 2006, 10(Suppl 2):S2.
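For readers unfamiliar with the three model-evaluation metrics named in this abstract, a minimal sketch follows. The predicted risks and outcomes are simulated, and the Hosmer-Lemeshow implementation is a simplified quantile-bin version, not necessarily the exact procedure the authors used.

```python
import numpy as np

def auc_roc(y, p):
    """Discrimination: area under the ROC curve (rank/Mann-Whitney form)."""
    order = np.argsort(p)
    ranks = np.empty(len(p))
    ranks[order] = np.arange(1, len(p) + 1)
    n1 = y.sum()
    n0 = len(y) - n1
    return (ranks[y == 1].sum() - n1 * (n1 + 1) / 2) / (n0 * n1)

def brier(y, p):
    """Overall fit: mean squared error of the predicted probabilities."""
    return np.mean((p - y) ** 2)

def hosmer_lemeshow(y, p, groups=10):
    """Calibration: sum over risk groups of (observed - expected)^2 / variance."""
    edges = np.quantile(p, np.linspace(0, 1, groups + 1))
    c = 0.0
    for i, (lo, hi) in enumerate(zip(edges[:-1], edges[1:])):
        last = i == groups - 1
        m = (p >= lo) & ((p <= hi) if last else (p < hi))
        if m.sum() == 0:
            continue
        obs = y[m].sum()
        expect = p[m].sum()
        var = expect * (1 - p[m].mean())
        if var > 0:
            c += (obs - expect) ** 2 / var
    return c

# Simulated cohort: predicted risks, and outcomes drawn to be consistent
rng = np.random.default_rng(0)
p = rng.uniform(0.05, 0.95, 200)
y = (rng.uniform(size=200) < p).astype(int)
print(auc_roc(y, p), brier(y, p), hosmer_lemeshow(y, p))
```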

Changes in cerebral physiology following cranioplasty: a 15O positron emission tomography study

M Abate, D Chatfield, J Outtrim, G Gee, T Fryer, F Aigbirhio, D Menon, J Coles

Wolfson Brain Imaging Centre, University of Cambridge, UK
Critical Care 2008, 12(Suppl 2):P108 (doi: 10.1186/cc6329)

Introduction Patients with skull defects report symptoms that improve with cranioplasty (CP). We used 15O positron emission tomography (PET) to examine whether this resulted from improvements in cerebral physiology.

Methods Seven patients were imaged 6-12 months post craniectomy with PET to derive maps of cerebral blood flow (CBF), oxygen metabolism (CMRO2), and oxygen extraction fraction (OEF) before and after CP. PET maps were coregistered with magnetic resonance images and segmented into grey matter (GM) and white matter (WM). Physiology was quantified in mixed GM + WM, GM and WM regions of interest (ROIs) underlying the craniectomy and in whole-brain GM, WM and GM + WM ROIs.

Figure 1 (abstract P108). Cerebrovascular physiology and cranioplasty: grey matter and white matter values before [A] and after [B] cranioplasty.

Results See Figure 1. There were no significant changes in CBF, CMRO2 or OEF following CP, even within ROIs underlying skull defects. Individual patients showed increases in CBF and CMRO2 and decreases in OEF, but all values were above ischemic thresholds [1].

Conclusions Although individual subjects demonstrated improvements in physiology following CP, there were no systematic changes. Future studies will assess changes in individuals and relate these to metabolic changes within specific brain regions. Reference

1. Cunningham AS, et al.: Brain 2005, 128:1931-1942.

Brain tissue oxygenation: more than a number

DK Radolovich1, I Timofeev2, A Lavinio2, M Czosnyka2, DJ Kim2, P Hutchinson2, J Pickard2, P Smielewski2

1Policlinico S. Matteo, Pavia, Italy; 2Addenbrooke's Hospital, Cambridge, UK

Critical Care 2008, 12(Suppl 2):P109 (doi: 10.1186/cc6330)

Introduction The study objective was to analyse the dynamic interrelations between brain tissue oxygenation (PbtO2) and corresponding fast changes in arterial blood pressure (ABP), cerebral perfusion pressure (CPP) and intracranial pressure (ICP) during transient events.

Methods We reviewed retrospectively 325 computer recordings of PbtO2, invasive ABP and ICP waveforms from 23 head-injured patients. All patients were sedated, paralysed and ventilated. All signals were digitised, and recorded using ICM+ software. We divided the events into two groups, depending on whether ABP (Group 1) or ICP (Group 2) was the first parameter to change. Group 1 was further subdivided based on whether the vascular autoregulation was intact (ABP-ICP negative correlation) or was impaired (ABP-ICP positive correlation).

Results Group 1 (n = 255): with intact cerebral autoregulation (n = 179), during hypotension PbtO2 decreased with a delay with respect to CPP (48.5 s; SEM 92.1) and ICP (39.9 s; SEM 91.4), and during hypertension PbtO2 increased with a delay of 58.1 s (SEM 71.9) with respect to CPP and 52.2 s (SEM 72.2) with respect to ICP; with impaired cerebral autoregulation (n = 76), PbtO2 changed following ABP changes, with a delay of 56.8 s (SEM 59.3) with respect to CPP and 54.2 s (SEM 58.8) with respect to ICP. Group 2 (n = 61): plateau waves and isolated gradual increases in ICP caused CPP to fall, followed by a PbtO2 decrease. The delay in the PbtO2 reaction was 23.1 s (SEM 55.7, n = 23) with respect to ICP and 18.4 s (SEM 54.9, n = 24) with respect to CPP.

Conclusions Transient events were observed in PbtO2 related to ABP or ICP modifications. Changes in PbtO2 were present irrespective of the state of autoregulation or the origin of the event (haemodynamic or ICP related). Generally PbtO2 followed the CPP direction. PbtO2 usually changed with a delay relative to the pressure parameters. The CPP-PbtO2 delay was significantly shorter in the events characterized by primary ICP modification (Group 2) in comparison with the ABP-led events (Group 1), irrespective of the state of autoregulation. These findings should be taken into account to evaluate the validity of indices assessing cerebral autoregulation using PbtO2. References

1. Masamoto K, et al.: J Appl Physiol 2007, 103:1352-1358.

2. Czosnyka M, et al.: J Neurol Neurosurg Psychiatry 2004, 75:813-821.
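The delays reported in this abstract are time lags between paired signals. One common way to estimate such a lag is the peak of the cross-correlation; the sketch below uses synthetic signals and is a generic illustration, not the authors' ICM+ method, which the abstract does not describe in detail.

```python
import numpy as np

def estimate_lag(reference, delayed):
    """Lag (in samples) at which `delayed` best matches `reference`;
    positive means `delayed` lags behind `reference`."""
    a = delayed - delayed.mean()
    b = reference - reference.mean()
    xc = np.correlate(a, b, mode="full")
    return int(np.argmax(xc)) - (len(b) - 1)

# Synthetic example: a slow CPP-like oscillation, and a PbtO2-like signal
# equal to the same waveform delayed by 50 samples (50 s at 1 Hz) plus noise
rng = np.random.default_rng(1)
t = np.arange(600)                        # 10 min at an assumed 1 Hz
cpp = np.sin(2 * np.pi * t / 120)
pbto2 = np.roll(cpp, 50) + 0.05 * rng.standard_normal(len(t))
print(estimate_lag(cpp, pbto2))           # recovers a lag close to 50 s
```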

Fatty acid binding protein and tau levels are related to brain damage and outcome after subarachnoid hemorrhage

T Zoerle1, M Fiorini2, L Longhi1, ER Zanier1, A Bersano1, L Cracco2, S Monaco2, N Stocchetti1

1University of Milano, Fondazione IRCCS Ospedale Maggiore Policlinico, Milan, Italy; 2Ospedale Borgo Roma, Verona, Italy
Critical Care 2008, 12(Suppl 2):P110 (doi: 10.1186/cc6331)

Introduction We measured heart-type fatty acid binding protein (H-FABP) and tau levels in the cerebrospinal fluid (CSF) of patients after subarachnoid hemorrhage (SAH): to evaluate the relationship between SAH severity and H-FABP/tau values; to test the hypothesis that H-FABP/tau might help in the diagnosis of vasospasm; and to evaluate their association with outcome.

Methods We studied 38 SAH patients, whose severity was assessed by the Glasgow Coma Scale (GCS). Serial CSF samples were obtained in every patient starting on the day of SAH and up to 2 weeks post SAH. H-FABP/tau levels were measured by ELISA. Vasospasm was defined as neuro-worsening (loss of at least one point of the motor component of the GCS and/or appearance of a new focal deficit) plus angiographic confirmation. The 6-month outcome was assessed by the dichotomized Glasgow Outcome Score (GOS): good (GOS 4-5) and bad (GOS 1-3). Multiple logistic regression analyses were performed to assess the association between H-FABP/tau values and the GOS.

Results H-FABP and tau increased after SAH. We observed a significant association between the peak H-FABP/tau values and the admission motor GCS (Spearman r = -0.581, P = 0.0001 and r = -0.582, P = 0.0001, respectively). Eight patients progressed to brain death. Among the survivors we observed vasospasm in 11 patients. Both proteins were significantly higher in this group compared with those without ischemia (H-FABP = 15,958 ± 21,736 pg/ml vs 2,527 ± 2,427 pg/ml, P < 0.05; tau = 5,821 ± 3,774 pg/ml vs 1,118 ± 1,547 pg/ml, P < 0.05). The H-FABP rise preceded clinical recognition of vasospasm in seven patients and was simultaneous in four patients. Tau increased before clinical recognition of vasospasm in five patients. Patients with a bad outcome showed higher peak levels of both proteins than patients with a good outcome (H-FABP = 23,977 ± 25,593 pg/ml vs 3,374 ± 2,549 pg/ml, P < 0.001; tau = 6,756 ± 4,544 pg/ml vs 1,591 ± 1,639 pg/ml, P < 0.001). Logistic regression showed that, after correction for age, sex and SAH severity, the peak value of tau protein was an independent predictor of outcome.

Conclusions H-FABP and tau increase following SAH and might add complementary information for the diagnosis of vasospasm. There is an association between their CSF values and outcome following SAH.

Transdermal nicotine replacement is associated with lower mortality among active smokers admitted with spontaneous subarachnoid hemorrhage

D Seder, M Schmidt, N Badjatia, F Rincon, J Claassen, E Gordon, E Carrera, M Oddo, L Fernandez, C Lesch, K Lee, E Connolly, S Mayer

Columbia University, New York, USA

Critical Care 2008, 12(Suppl 2):P111 (doi: 10.1186/cc6332)

Introduction Active smokers comprise 35-55% of patients admitted with acute spontaneous subarachnoid hemorrhage (SAH). Transdermal nicotine replacement is sometimes prescribed to these patients to prevent a withdrawal syndrome, but the safety of exogenous nicotine during the acute period after SAH is unknown.

Methods We conducted a prospective, observational study from 2001 to 2007 in the neurological ICU of a major academic medical center. All active smokers admitted with SAH were included in the analysis, but we excluded patients who died within 7 days of admission to remove those whose death was due to discontinuation of life support. The primary endpoint was 3-month mortality. Secondary endpoints were delayed cerebral ischemia (DCI) and clinical vasospasm.

Results One hundred and ninety-two active smokers, including 104 (54%) who received transdermal nicotine, were well matched on demographics, gender, age, Hunt and Hess grade, SAH sum score, aneurysm size, and smoking pack-year history, but a higher percentage of current heavy smokers (>10 cigarettes daily) received nicotine (67%, P < 0.001). There was no association between nicotine replacement and clinical vasospasm or DCI. After controlling for disease severity and cerebral edema on head CT (OR = 13.9, CI = 1.5-125.3), multivariable logistic regression revealed that heavy smokers were more likely than light smokers to die (OR = 6.0, CI = 1.11-32.7). Smokers who received nicotine had lower mortality (OR = 0.26, CI = 0.68-0.98), an effect that seemed on secondary analysis to be driven by high mortality among heavy smokers who did not receive nicotine.

Conclusions Transdermal nicotine replacement is not associated with clinical vasospasm or DCI in smokers admitted with SAH, and is associated with lower mortality, particularly among smokers of more than 10 cigarettes daily. This may be due to prevention of the physiological derangements associated with nicotine withdrawal. Nicotine replacement after acute SAH is probably safe, and should be given to active heavy smokers at the time of admission. More research is needed to verify these findings and define the therapeutic role of nicotine in the ICU. Reference

1. Lee AH, et al.: The association of nicotine replacement therapy with mortality in a medical intensive care unit. Crit Care Med 2007, 35:1517.

Impact of treatment with pravastatin on delayed ischemic disease and mortality after aneurysmal subarachnoid hemorrhage

U Jaschinski, K Scherer, M Lichtwarck, H Forst

Klinikum Augsburg, Germany

Critical Care 2008, 12(Suppl 2):P112 (doi: 10.1186/cc6333)

Introduction Statins have neuroprotective properties including improved vasomotor reactivity, reduced platelet activation and anti-inflammatory effects [1]. A prospective observational controlled study was conducted to evaluate the impact of pravastatin on the development of delayed ischemic disease (DID) and ICU mortality after aneurysmal subarachnoid hemorrhage (aSAH).

Methods A total of 98 patients (20-80 years old) with aSAH were randomized to receive either pravastatin 40 mg (n = 40) or nonstatin treatment (n = 58) within 24 hours after the ictus. The primary endpoints were the incidence of DID and the extent of disability measured by the Glasgow Outcome Scale; the secondary endpoint was ICU mortality.

Results The groups were comparable with respect to age (54.2 (50.3-58.3) vs 53.2 (49.8-56.7), 95% CI), grade of aSAH (Hunt and Hess) (2.6 (2.17-3.03) vs 3.06 (3.00-3.80), 95% CI) and stroke severity (Glasgow Coma Scale 10.9 (9.4-12.4) vs 10.5 (9.3-11.8), 95% CI). There was a trend towards less DID in the statin group (37.5% vs 60.3% nonstatin; standard error of the difference of the means 9.8 (3.64-28.00), 95% CI). The extent of

disability between the groups, however, was not different (Glasgow Outcome Scale 3.65 (3.16-4.14) statin vs 3.39 (3.00-3.80) nonstatin 95% CI). Mortality was unchanged as well (22.5% statin vs 22.4% nonstatin).

Conclusions These results are in line with a recently published study demonstrating reduced vasospasm-related DID in patients treated with pravastatin after aSAH [2]. We could not confirm the mortality benefit of statin treatment mentioned in the cited trial, since our study was not powered to detect a difference in mortality. It is to be hoped that the Statins for Aneurysmal Hemorrhage (STASH) trial will clarify this topic. References

1. Topcuoglu MA: Expert Opin Drug Saf 2006, 5:57.

2. Tseng MY: Stroke 2007, 38:1545.

Pentraxin 3 as a marker of vasospasm following subarachnoid hemorrhage

ER Zanier1, G Peri2, G Brandi1, L Longhi1, M Tettamanti3, C Garlanda2, A Mantovani2, MG De Simoni3, N Stocchetti1

1University of Milano, Fondazione IRCCS Ospedale Maggiore Policlinico, Milan, Italy; 2Clinical Institute Humanitas, Milan, Italy; 3Mario Negri Institute, Milan, Italy

Critical Care 2008, 12(Suppl 2):P113 (doi: 10.1186/cc6334)

Introduction We studied the induction of Pentraxin 3 (PTX3), a prototypic long pentraxin induced by proinflammatory signals, in subarachnoid hemorrhage (SAH) patients, to investigate a possible relation with SAH-associated ischemic brain damage.

Methods PTX3 was measured in the plasma and cerebrospinal fluid (CSF) of 38 SAH patients admitted to the neuroscience ICU, who were divided into three groups: occurrence of vasospasm, defined as neuro-worsening (loss of at least one point of the Glasgow Coma Scale motor component and/or appearance of a new focal deficit) with angiographic confirmation of vasospasm; presence of an early hypodense lesion, defined as the appearance of a new hypodense lesion on CT scan following endovascular or surgical treatment, or around the initial intracerebral hematoma; and absence of a hypodense lesion. Arterial and CSF samples were obtained every 12 hours starting on the day of SAH and up to 2 weeks post SAH.

Results PTX3 was induced in the plasma and CSF of SAH patients. CSF peak concentrations were significantly higher in patients with vasospasm (21.5 ± 5.1 ng/ml) compared with those with no CT hypodense lesion (5.8 ± 4.5 ng/ml, P < 0.05). Patients with an early hypodense lesion showed a peak concentration that was intermediate between the other two groups (12.4 ± 5.2 ng/ml). No difference was observed in plasma levels among the three groups. The temporal pattern of CSF PTX3 in patients with vasospasm was triphasic: there was an initial increase of PTX3 during the first 48 hours following SAH (acute phase, up to 17.2 ± 5.2 ng/ml), followed by a subsequent decrease in the next 48-96 hours (subacute phase, up to 1.2 ± 0.3 ng/ml, P < 0.01 compared with the acute phase). With the appearance of vasospasm, a secondary peak of PTX3 was detected (up to 7.1 ± 1.4 ng/ml, P < 0.01 compared with the subacute phase). No changes were detectable in plasma.

Conclusions PTX3 is induced in the CSF and in plasma following SAH; however, the CSF but not plasma levels are directly related to the degree of brain injury. In addition the data show that PTX3 measured in the CSF might be a reliable marker of vasospasm following SAH, and suggest that measurements of PTX3-CSF levels associated with clinical evaluation could improve early diagnosis of vasospasm in these patients.

Transcranial sonography investigations of the cerebral blood flow disturbances after hypothalamic pituitary and brain stem surgery

S Madorskiy, O Shovikova, A Safin

Neurosurgery Institute, Moscow, Russian Federation
Critical Care 2008, 12(Suppl 2):P114 (doi: 10.1186/cc6335)

Introduction Cerebral blood flow (CBF) disturbances in focal lesions of the hypothalamopituitary system and brain stem structures remain an unexplored and pressing question of modern neurosurgery [1]. Transcranial duplex sonography (TCDS) is a widely used method for determining the cerebral blood-flow velocity (FV) in neurosurgical patients.

Methods We studied the characteristics of cerebral hemodynamics by TCDS after hypothalamic and brain stem tumor excision in 186 patients. The data obtained were compared with CT-MRI data, clinical parameters and factors of neurohumoral regulation.

Results FV disturbances were observed in 82% of patients after surgery for hypothalamic and brain stem lesions. The FV disturbances were evaluated as vasomotor spasm of varying severity; FV distress was caused by thrombosis of branches of the cerebral vessels, hyperperfusion, hypoperfusion, and impairment of venous outflow and CBF autoregulation. Stable neurological disorders were observed in 100% of patients at FV <40 cm/s and >200 cm/s. At FV >120 cm/s we observed transient neurological disorders in 61% of patients and stable disorders in 24%, and at FV >150 cm/s transient disorders in 22% and stable disorders in 78%. A middle cerebral artery (MCA)/internal carotid artery ratio >3.0 with FV >120 cm/s, and a basilar artery (BA)/extracranial vertebral artery ratio >2.0 with BA velocities >85 cm/s, were associated with 92% sensitivity and 97% specificity for vasospasm in the MCA and BA, respectively. In 87% of patients with FV in the MCA >185 cm/s we observed focal ischemic brain lesions verified on CT scan. Consistent pathological interactions between the degree of FV disturbance and factors of neurohumoral regulation suggest underlying pathogenetic mechanisms of CBF disturbance in focal lesions of the hypothalamopituitary system and brain stem. A dependence of FV disturbances on the plasma vasopressin level was established. The FV disorders revealed allowed us to develop algorithms for therapy and for prevention of secondary ischemic brain lesions.

Conclusions Investigation of FV by TCDS in focal brain lesions, together with MRI and neurological examination, has allowed us to specify the pathogenic mechanisms of CBF disturbances and algorithms for their therapy. Reference

1. Macdonald RL, et al.: J Clin Neurosci 1997, 4:348-352.

Flow velocity in head injury of different severity: findings of transcranial duplex sonography

A Safin, A Parfenov, S Madorsky, E Grinenko

Neurosurgical Institute, Moscow, Russian Federation
Critical Care 2008, 12(Suppl 2):P115 (doi: 10.1186/cc6336)

Introduction Most commonly, transcranial duplex sonography (TCDS) is used to evaluate the flow velocity (FV). The main signs of cerebral blood flow disturbance in head injury are oligemia, hyperemia and vasospasm [1], which are closely connected with the severity of traumatic brain injury and the dynamics of the disease. The influence of FV values measured by TCDS on the course and outcome of head injury of different severity is of special importance.

Methods FV was measured using an ultrasound triplex system in 83 patients with head injuries. The traumatic brain injury substrate was verified by CT and nuclear MRI. Mean FV values were registered in the middle cerebral artery (MCA) every 48 hours. The hemispheric index (HI = mean MCA FV / mean internal carotid artery (ICA) FV) was measured to differentiate vasospasm.

Results Depending on the FV values, all patients were divided into three groups: Group I, 22 patients with FV < 70 cm/s; Group II, 23 patients with FV 70-120 cm/s and HI < 3; Group III, 38 patients with FV > 120 cm/s and HI > 3.0. The severity of cerebral lesions in Group I was caused by unilateral intracranial haematomas in six cases, type 1-2 contusions in nine cases and diffuse axonal injury (DAI) in 12 cases. Patients in Groups II and III revealed bilateral intracranial haematomas combined with type 2-3 contusions and DAI; Group II showed a predominance of contusions, and Group III a predominance of concomitant brain damage (that is, intracranial haematomas combined with type 2-3 contusions and post-traumatic SAH). Outcome analysis in Group I revealed a GOS score of 1-2 in 13 patients, of 3 in seven patients and of 4 in two patients. In Group II the GOS score was 1-2 in 17 patients, 3 in four patients and 4 in two patients. In Group III the GOS score was 1-2 in 12 patients, 3 in 14 patients and 4 in six patients, and mortality was marked in five patients.
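The grouping rule used in this abstract can be written out explicitly. The sketch below is a hypothetical rendering of the stated thresholds (FV in cm/s; HI = mean MCA FV / mean ICA FV), with an "indeterminate" bucket for combinations the abstract does not cover.

```python
def hemispheric_index(mean_mca_fv: float, mean_ica_fv: float) -> float:
    """HI = mean MCA flow velocity / mean ICA flow velocity."""
    return mean_mca_fv / mean_ica_fv

def tcds_group(fv_mca: float, hi: float) -> str:
    """Group a patient by mean MCA FV (cm/s) and hemispheric index."""
    if fv_mca < 70:
        return "I"                 # low flow velocity
    if fv_mca <= 120 and hi < 3:
        return "II"                # moderate FV without a vasospasm pattern
    if fv_mca > 120 and hi > 3:
        return "III"               # high FV with HI > 3: vasospasm pattern
    return "indeterminate"         # combinations the abstract does not cover

print(tcds_group(60, 1.5))                           # Group I
print(tcds_group(150, hemispheric_index(150, 40)))   # HI = 3.75, Group III
```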

Conclusions The analysis showed a close relationship between the severity of traumatic brain damage and the character of FV disturbances. Marked traumatic brain injuries, represented by multiple contusions and intracranial haematomas or type 2-3 DAI combined with SAH, resulted in the development of vasospasm in the MCA. Low FV values, as well as the development of vasospasm in the middle cerebral artery, are regarded as unfavourable prognostic signs for patients in the acute period of severe head injury. Reference

1. Oertel M, et al.: J Neurosurg 2005, 103:812-824.

Transcranial Doppler in severe malaria

V Mardelle, A Nau, E Peytel

HIA Laveran, Marseille, France

Critical Care 2008, 12(Suppl 2):P116 (doi: 10.1186/cc6337)

Introduction Many hypotheses have been proposed to explain the sequestration of red blood cells infected by Plasmodium falciparum in the cerebral capillaries (cytoadherence, rosetting), involving an increase in blood viscosity and a slowing of blood flow inside the capillaries.

Methods In this study, 10 nonimmune adults with severe malaria according to the WHO classification were included. All of them had one or more severity criteria. A Quantitative Buffy Coat malaria test, microscopic examination of thick and thin blood smears and transcranial Doppler were carried out on admission. We compared the transcranial Doppler findings, namely the pulsatility index (PI), with the degree of parasitemia. Data are expressed as the mean, standard deviation, extremes and percentage.

Results The age of the patients was 40 ± 13 (SD) years (19-62). The sex ratio was 0.9. The SAPS II score was 34.3 ± 10 (SD) (20-53). The Glasgow Coma Scale score was 10 ± 4 (SD) (3-14). Parasitemia was 12.2 ± 16.9% (SD) (0.01-50). The PI (obtained by averaging the PI of the two middle cerebral arteries) was 1.9 ± 2.5 (0.8-9). The correlation coefficient between parasitemia and the PI was 0.86.

Conclusions Some studies, carried out in children, demonstrated the value of monitoring cerebral perfusion pressure and transcranial Doppler in the prognostic evaluation of cerebral malaria

[1]. In adults, the value of monitoring cerebral perfusion pressure has also been demonstrated [2]. Nevertheless, measurement of intracranial pressure carries a hemorrhagic risk because of the hemostasis disorders usually observed during severe malaria. We know there is no exact correlation between the degree of parasitemia and the quantity of red blood cells sequestered in the cerebral capillaries. Nevertheless, in our preliminary study, there was a correlation between the degree of parasitemia and disturbance of cerebral blood flow: the PI rises as parasitemia increases. References

1. Newton CR, et al.: Pediatr Neurol 1996, 15:41.

2. Angel G, et al.: Med Trop 1997, 57(3S):76.
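The abstract does not define how the PI was computed; the common Gosling formulation, PI = (systolic − diastolic velocity) / mean velocity, is assumed in the sketch below, and all patient values are invented for illustration.

```python
import numpy as np

def pulsatility_index(v_sys: float, v_dia: float, v_mean: float) -> float:
    """Gosling pulsatility index: (systolic - diastolic) / mean velocity."""
    return (v_sys - v_dia) / v_mean

def mean_mca_pi(left, right):
    """Average the PI of the two middle cerebral arteries, each given as
    (systolic, diastolic, mean) velocities in cm/s."""
    return (pulsatility_index(*left) + pulsatility_index(*right)) / 2

# Hypothetical per-patient parasitemia (%) and averaged PI values
parasitemia = np.array([0.1, 1.0, 5.0, 12.0, 30.0, 50.0])
pi = np.array([0.8, 0.9, 1.2, 1.8, 4.0, 8.5])
r = np.corrcoef(parasitemia, pi)[0, 1]
print(round(r, 2))  # strongly positive, as in the reported trend
```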

Noninvasive assessment of intracranial pressure using ocular sonography in neurocritical care patients

T Geeraerts, S Merceron, D Benhamou, B Vigue, J Duranteau

CHU de Bicetre, Le Kremlin Bicetre, France

Critical Care 2008, 12(Suppl 2):P117 (doi: 10.1186/cc6338)

Introduction Invasive devices are the 'gold standard' for measurement of intracranial pressure (ICP). Their placement, however, can be challenging (coagulation disorders, lack of surgical availability). Noninvasive sonography of the optic nerve sheath diameter (ONSD) has been proposed to detect elevated ICP [1,2]. However, this method needs further validation. This study was performed to assess the relationship between the ONSD and ICP in neurocritical care patients.

Methods After approval from the local ethics committee, 37 adult patients with severe traumatic brain injury (n = 22), subarachnoid hemorrhage (n = 6), intracranial hematoma (n = 8) or stroke (n = 1) requiring sedation and ICP monitoring (intraparenchymal probe in the frontal lobe; Codman, Johnson & Johnson) were included. For each optic nerve, two measurements of the ONSD were made using a 7.5 MHz linear probe (HP Sonos 5500®; Hewlett Packard) (2D mode, 3 mm behind the globe, one measure in the sagittal and one in the transverse plane). The mean value for both eyes was retained. The ONSD and ICP were measured simultaneously once a day during the first 2 days after ICP probe placement and in cases of marked changes in ICP.

Results There was a significant linear relationship between the ONSD and ICP (Spearman correlation ρ = 0.75, P < 0.0001; Figure 1a). Changes in ICP (delta) were also significantly correlated with ONSD variations (ρ = 0.78, P < 0.001; Figure 1b). The ONSD cutoff for detecting ICP > 20 mmHg was 5.8 mm (area under the ROC curve = 0.91). The negative likelihood ratio of this cutoff was 0.07.

Figure 1 (abstract P117)

Relationship between intracranial pressure (ICP) and the optic nerve sheath diameter (ONSD).

Conclusions There is a significant relationship between the ONSD and ICP in neuro-ICU patients. Changes in ICP are accurately detected by the ONSD. The probability of having high ICP when the ONSD is below 5.8 mm is very low. This noninvasive method could be used to check the absence of raised ICP. References

1. Hansen HC, et al.: J Neurosurg 1997, 87:34-40.

2. Geeraerts T, et al.: Intensive Care Med 2007, 33:1704-1711.
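The negative likelihood ratio quoted in this abstract derives directly from sensitivity and specificity. The sensitivity/specificity values in the sketch below are illustrative (the abstract reports only the resulting LR−, 0.07), chosen so the arithmetic lands near the reported figure.

```python
def likelihood_ratios(sensitivity: float, specificity: float):
    """Positive and negative likelihood ratios of a binary diagnostic test."""
    lr_pos = sensitivity / (1 - specificity)
    lr_neg = (1 - sensitivity) / specificity
    return lr_pos, lr_neg

# Illustrative sensitivity/specificity chosen so that the negative
# likelihood ratio comes out near the reported 0.07
lr_pos, lr_neg = likelihood_ratios(sensitivity=0.94, specificity=0.90)
print(round(lr_neg, 2))  # 0.07: ONSD below the cutoff makes high ICP unlikely
```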

Clinical and prognostic role of intracranial pressure monitoring in patients with aneurysmal subarachnoid haemorrhage

E Grinenko, A Parfenov, E Gribova, A Safin, V Emelianov

Neurosurgical Institute, Moscow, Russian Federation
Critical Care 2008, 12(Suppl 2):P118 (doi: 10.1186/cc6339)

Introduction Intracranial hypertension (ICH) caused by brain oedema is a frequent complication of acute aneurysmal subarachnoid haemorrhage (SAH) [1]. The only adequate method of diagnosis and assessment of the degree of ICH is continuous monitoring, which is necessary for efficient and timely anti-edematous therapy [2].

Methods We report 75 patients with SAH and a risk of ICH. Intracranial pressure (ICP) monitoring was performed with Codman sensors in 35 patients (Group 1): in 32 of them using subdural sensors, and in three using intraventricular sensors. In seven cases ICP monitoring was carried out in the preoperative period, and in 28 cases after aneurysm exclusion. In 40 patients without ICP monitoring (retrospective material; Group 2) the basic methods of diagnosis were neurological examination and computed tomography (CT). The groups were comparable in sex, age, time of operative intervention, methods of intensive therapy and severity of state. The basic difference was the starting time of anti-edematous therapy.

Results Cerebral ischemia and marked neurological deficits were more frequently observed in Group 1 than in Group 2 (80% and 17%, respectively; P < 0.05). The favourable outcome rate was 65.7% (GOS V-IV) in Group 1 and 17.5% in Group 2; the unfavourable outcome rate was 34.3% (GOS III-I) in Group 1 and 77.5% in Group 2. In Group 1, eight patients (22.9%) died: two of brain oedema (25% of all deaths in this group) and six (75%) of SAH recurrence. In Group 2, 10 patients (25%) died: one (10%) of SAH recurrence and nine of brain oedema (90% of all deaths in this group).

Conclusions ICP monitoring in patients with aneurysmal SAH allows ICH to be revealed at an early stage and the cause of the increased ICP to be determined according to CT data. Moreover, ICP monitoring in the acute stage of aneurysmal SAH allows timely, adequate intensive care and thus outcome improvement (P < 0.05). References

1. Heuer G, et al.: J Neurosurg 2004, 101:408-416.

2. Gitte Y, et al.: Pediatr Rev 1999, 7:234-239.

Electroencephalogram desynchronization in brain trauma patients

I Ratsep1, T Lipping2

1North Estonian Regional Hospital, Tallinn, Estonia; 2Tampere

University of Technology, Pori, Finland

Critical Care 2008, 12(Suppl 2):P119 (doi: 10.1186/cc6340)

Introduction This study aimed at investigating the advantages of brain function monitoring in patients with subdural haematoma or spontaneous haemorrhage. We hypothesized that the reactivity of the EEG signal to stimuli could aid in the assessment of the condition of the brain and prediction of the outcome. We were also interested in the EEG patterns and features induced by midazolam in these critically ill patients as there are only a few studies on this subject in the literature.

Methods Twenty-three patients with subdural haematoma and four patients with spontaneous haemorrhage were included in the study. Midazolam and fentanyl were used as sedative agents. The EEG signal from four channels (C3, C4, Fp1, Fp2) was recorded for at least 24 hours following surgery. Every 4-6 hours, on average, a standardized sequence of stimuli (voice, noise, train-of-four, tetanic) was applied. Reactions to the stimuli were carefully annotated by the study nurse. Segments of the EEG signal from 20 seconds before to 40 seconds after each stimulus were extracted. The segments were further divided into 10-second subsegments overlapping by 5 seconds. The modulation of alpha activity (8-13 Hz) by the phase of the delta rhythm (0.5-4 Hz) was estimated for each subsegment.
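The modulation estimate described above (alpha-band envelope as a function of delta-band phase) can be sketched with a standard filter-Hilbert approach. This is an illustrative reconstruction, not the authors' code; the function names, filter order and bin count are assumptions (Python with NumPy/SciPy assumed):

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    # zero-phase Butterworth band-pass filter (second-order sections
    # for numerical stability at low normalized frequencies)
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

def alpha_by_delta_phase(eeg, fs, n_bins=18):
    """Mean alpha-band (8-13 Hz) envelope as a function of delta-band
    (0.5-4 Hz) phase; a flat returned curve means no modulation."""
    phase = np.angle(hilbert(bandpass(eeg, 0.5, 4.0, fs)))  # delta phase
    env = np.abs(hilbert(bandpass(eeg, 8.0, 13.0, fs)))     # alpha envelope
    edges = np.linspace(-np.pi, np.pi, n_bins + 1)
    idx = np.clip(np.digitize(phase, edges) - 1, 0, n_bins - 1)
    return np.array([env[idx == k].mean() for k in range(n_bins)])
```

Applied to a subsegment, deviation of the returned curve from a flat line corresponds to the curve deviations the abstract describes in Figure 1.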

Results The averaged results are shown grouped by the response of the patient to the stimuli (Figure 1). Deviation of the curves from a straight line indicates modulation. The lowermost curves correspond to the first subsegment (-20 to -10 s relative to the stimulus) and the uppermost curve to the last subsegment (30-40 s).

Figure 1 (abstract 119)

(Both panels: x axis, phase [rad], from -2 to 2.)

Conclusions Slight modulation of alpha activity by the delta rhythm can be seen. In cases where clinical response was noted, the modulation is stronger but tends to disappear at about 5-15 seconds post stimulus, indicating desynchronization. Further analysis is needed to draw final conclusions.

Effects of mannitol and melatonin on magnetic resonance imaging findings in secondary brain damage

A Bayir, DA Kiresi, H Kara, S Koçak, S Özdinç, A Ak

Selçuk University, Konya, Turkey

Critical Care 2008, 12(Suppl 2):P120 (doi: 10.1186/cc6341)

Introduction This study compares the effects of mannitol and melatonin on traumatic secondary brain damage using magnetic resonance imaging (MRI) findings.

Methods We used 12 New Zealand rabbits weighing 2,000-2,500 g. After anesthesia was induced, the subjects underwent head trauma by the Feeney method. Three hours after the trauma, MRI scans were taken. The subjects were then divided into a mannitol group and a melatonin group. After the first MRI, 20% mannitol at 2 g/kg was given to the mannitol group and melatonin at 100 mg/kg to the melatonin group. Thirty-six hours after the trauma, MRI was repeated. The MRI images before and after treatment were compared, and the 36-hour MRI results of the melatonin and mannitol groups were also compared against each other.

Results In the melatonin group, comparison of the 36-hour MRI findings with those taken 3 hours after the trauma showed increased ventricular compression and parenchymal edema, development of parenchymal protrusion, and worsening contusion findings. In the mannitol group, the changes on the 36-hour MRI images were only slight. A significant difference was found between the melatonin and mannitol groups' findings on the MRI images taken 36 hours after the trauma.

Conclusions Mannitol is more effective than melatonin in reducing traumatic secondary brain damage.



Brain trauma care targets analysis using a high-rate recording and computing network

H Mehdaoui1, R Valentino1, L Allart2, D Zitouni2, B Sarrazin1, C Meunier1, I Elzein1, S Tissier1, P Ravaux2

1Fort de France University Hospital, Fort De France, Martinique;

2Lille 2 University, Lille, France

Critical Care 2008, 12(Suppl 2):P121 (doi: 10.1186/cc6342)

Introduction We analyzed information on the monitoring and care of brain-injured patients provided by a powerful information system. Methods We analyzed 543 hours of recordings from 11 patients, limited to 72 hours per patient when available: mean arterial pressure (MAP), intracranial pressure (ICP) and cerebral perfusion pressure (CPP) values were plotted against guideline thresholds, respectively 90 mmHg, 20 mmHg and 60 mmHg. The data were sampled every 2 seconds. Extraction was performed using a 3-teraflops supercomputer. We developed a method to detect periods of abnormal values.
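The detection of abnormal-value periods from 2-second samples can be sketched as a run-length scan against a threshold predicate. The authors' actual method is not specified, so the function name, the predicate interface and the 5-minute floor below are assumptions:

```python
import numpy as np

def abnormal_episodes(values, is_abnormal, dt_s=2.0, min_s=300.0):
    """Find runs of consecutive abnormal samples (one sample every dt_s
    seconds) lasting at least min_s; returns a list of (start_s, duration_s)."""
    mask = np.fromiter((is_abnormal(v) for v in values), dtype=int)
    # pad with zeros so every run has both a rising and a falling edge
    edges = np.flatnonzero(np.diff(np.r_[0, mask, 0]))
    episodes = []
    for start, stop in zip(edges[::2], edges[1::2]):
        duration = (stop - start) * dt_s
        if duration >= min_s:
            episodes.append((start * dt_s, duration))
    return episodes
```

For example, `abnormal_episodes(cpp_values, lambda v: v < 60)` would collect episodes of CPP below the 60 mmHg guideline threshold, ready for bucketing into the duration bands of Table 1.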

Results The calculated CPP and monitored CPP differed despite a good correlation (r = 0.91, P < 0.0001). Fifty-seven percent, 40% and 27% of the recorded MAP, ICP and CPP values reached thresholds. The time distributions of abnormal MAP, ICP and CPP values are detailed in Table 1: 51.7% of the MAP periods, 48.8% of the ICP periods and 51.8% of the calculated CPP periods were short episodes (<30 min). Mortality was associated with CPP < 60 mmHg (OR = 4.13, logistic regression model, P < 0.0001) and inversely associated with MAP drops and intracranial hypertension episodes (OR = 0.58 and 0.45, respectively). The mean time spent in each episode was higher in nonsurvivors (76 ± 6 vs 48 ± 5 min). Caregivers' actions are perceptible on a CPP distribution chart.

Table 1 (abstract P121)

Detected abnormal episodes

Episode duration   MAP   ICP   CPP
5-15 min            53    42    58
15-30 min           25    41    60
30-60 min           17    27    44
60-120 min           6    18    26
>120 min            13    17    40

Conclusions Monitoring artifacts should be better identified when monitoring-based targets are used to guide therapy. Computer-based data analysis shows evidence of frequent episodes requiring therapeutic actions according to published guidelines, assuming that multimodal monitoring is not limited to the three studied parameters. Caregivers need new tools for data management to provide a better quality of care.

Acknowledgements Project funded by the EC and Martinique.

Audit of compliance with ventilation protocol in severe head injuries: a retrospective study

V Garikipati, A Eynon, R Lightfoot

Southampton General Hospital, Southampton, UK Critical Care 2008, 12(Suppl 2):P122 (doi: 10.1186/cc6343)

Introduction A reduction in mortality of severely head-injured patients is associated with the development of evidence-based protocols [1,2]. This audit studied adherence to the neurointensive care unit (NICU) protocol for the management of respiratory parameters in severely head-injured patients during the first 24 hours.

Methods A random case-note review was undertaken of 50 patients intubated prior to admission to the NICU between March 2005 and April 2007. All data from the first 24 hours were compared with protocol targets.

Results There were 170 severely head-injured patients admitted to the NICU in the defined period. The patients reviewed comprised 39 males and 11 females; median age 34 years, range 17-74 years. The median presenting GCS was 7. Eighteen patients had thoracic pathology on admission: seven spinal fractures, four haemothoraces, one sternal fracture, six rib fractures, six aspiration pneumonitis and one collapsed lung. Admission ventilation targets and compliance with them were measured: ventilation mode (SIMV) 98% compliance, tidal volume (6-10 ml/kg) 96%, FiO2 (30-40%) 38%, respiratory rate (12-16) 30%, I:E ratio (1:2) 78% and PEEP (5-10 cmH2O) 94%. See Table 1. Conclusions Overall, our audit detected only 18 protocol deviations out of 311 interventions regarding maintenance of adequate oxygenation and tight PaCO2 control (6%). There were 78 episodes out of 397 samples where the protocol should have been activated for the management of PaCO2 (20%).

Table 1 (abstract P122)

Blood gas analysis

Protocol target    Total samples   Total interventions   Protocol deviations   Episodes when protocol not activated
PaO2 > 11 kPa           397               138                     1                           5
PaCO2 4-4.5 kPa         397               173                    17                          78
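The compliance percentages quoted in the conclusions follow directly from the per-target totals in Table 1; a minimal arithmetic check (Python assumed):

```python
# Totals taken from Table 1 of the abstract
interventions = 138 + 173   # PaO2 + PaCO2 interventions
deviations = 1 + 17         # protocol deviations across both targets
samples = 397               # blood gas samples per target
missed_paco2 = 78           # episodes where the PaCO2 protocol was not activated

deviation_rate = round(100 * deviations / interventions)  # -> 6 (%)
missed_rate = round(100 * missed_paco2 / samples)         # -> 20 (%)
print(deviation_rate, missed_rate)
```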

Protocols can reduce mortality, but knowledge of adherence to protocols is necessary to improve clinical practice. References

1. Patel HC, et al.: Lancet 2005, 366:1538-1544.

2. Clayton et al.: Br J Anaesth 2004, 93:761-766.

c-Jun N-terminal kinase pathway activation in human and experimental traumatic brain injury: neuroprotective effects of its inhibition

F Ortolano1, ER Zanier1, A Colombo2, A Sclip2, L Longhi1, C Perego2, T Borsello2, N Stocchetti1, MG De Simoni2

1 University of Milano, Milan, Italy; 2Mario Negri Institute, Milan, Italy Critical Care 2008, 12(Suppl 2):P123 (doi: 10.1186/cc6344)

Introduction c-Jun N-terminal kinase (JNK) is a regulator of many cellular events, including programmed cell death (apoptosis). The JNK pathway is activated in several models of brain injury and its inhibition confers neuroprotection. The role of JNK following traumatic brain injury (TBI) is unclear. We tested the hypothesis that JNK might be a relevant pathway following TBI in humans and in a model of cerebral contusion, and evaluated the neurobehavioral and histological effects of its pharmacological inhibition by the administration of DJNKI-1, a peptide that selectively prevents the binding between JNK and its substrates. Methods JNK activation was investigated by western blot analysis performed on brain samples obtained from four TBI patients who underwent surgical removal of a cerebral contusion, and on the injured cortex and hippocampus of mice subjected to anesthesia followed by controlled cortical impact brain injury, at 1, 4 and 48 hours post injury. In addition, at 10 minutes post injury, animals randomly received an intraperitoneal administration of either DJNKI-1 (11 mg/kg) or an equal volume of saline (100 μl). A second group of mice received identical anesthesia, surgery without injury, and saline, to serve as uninjured controls. Neurobehavioral motor outcome was evaluated at 48 hours and 7 days post injury by performing the Neuroscore. Cell death was quantified by the histochemical TUNEL technique at 48 hours post injury and the contusion volume was evaluated at 7 days post injury. Results We observed a robust activation of the JNK pathway both in the human pericontusional brain tissue and in the injured cortex and hippocampus of mice at 1, 4 and 48 hours post injury. At 48 hours and 7 days post injury, mice receiving DJNKI-1 showed a better motor performance compared with mice receiving saline (P < 0.05 at both time points).
Moreover, mice receiving DJNKI-1 showed a significant reduction of TUNEL-positive cells in the hippocampus compared with mice receiving saline at 48 hours post injury (P < 0.05) and a reduced contusion volume at 7 days post injury (P < 0.01).

Conclusions JNK is activated following human and experimental TBI. The administration of the inhibitor DJNKI-1 to injured mice induced an amelioration of neurobehavioral deficits and histological damage following controlled cortical impact brain injury.

Hormones and cytokines as biomarkers for immediate cure measures in severe neurosurgical patients: base for inclusion in a neuromonitoring algorithm

V Tenedieva, A Potapov, I Trubina, A Parfenov, E Alexandrova

Burdenko Neurosurgical Institute, Moscow, Russian Federation Critical Care 2008, 12(Suppl 2):P124 (doi: 10.1186/cc6345)

Introduction In spite of dramatic recent achievements in neuroendocrine immunology and neuroprotection, an adequate treatment strategy for interrupting molecular cascade reactions in severe brain damage remains unclear. The value of daily monitoring of hormone and cytokine levels in these patients in the ICU is likewise not well understood or widely recognized.

Methods Two hundred and eighty-two patients with severe traumatic brain injury (GCS < 8 at admission), 226 patients with aneurysmal subarachnoid haemorrhage and 325 operated patients with brain tumors were studied. Prolactin (as an immunomodulator), free and total thyroxine and triiodothyronine (FT4, T4, FT3 and T3), cytokines (IL-6, sIL-2R) and NT-proBNP were assayed in blood and CSF by RIA kits and chemiluminescent analysis (Immulite 2000). The obtained data were compared with clinical, neurological and neuroimaging data.

Results Independent of causation and gender, an abrupt decrease in the serum prolactin level (P < 0.001) started 2-3 days before respiratory and brain inflammatory complications were verified by roentgenogram. Significant decreases, especially of T3 and FT3 to undetectable values (P < 0.05), were characteristic of worsening patient condition (increasing brain ischemia/hypoxia and brain edema, depression of consciousness (r = -0.239, P = 0.000415)). Simultaneously, sIL-2R, IL-6 and NT-proBNP levels in blood and CSF were significantly increased in comparison with normal values (P < 0.001). The highest values were found in patients with unfavourable outcomes.

Conclusions Daily monitoring of serum and CSF hormone and cytokine levels in critically ill patients with severe brain damage, ARDS, haemodynamic disturbances, sepsis and multiorgan failure closely reflects the patient's condition (upregulation and downregulation of the neuroendocrine and immune systems and their roles in neurosystemic and systemic inflammatory responses) and allows immediate prognosis of the disease course. The 'brain low-T3 syndrome' we proposed earlier for severe neurosurgical patients serves as a basis for brief thyroid hormone substitution therapy in addition to conventional therapy, given the crucial role of T3 in neurogenesis in the adult brain and its important influence on the endothelium and cardiodynamics.

Is the ratio of lactate to pyruvate in cerebral tissue a prognostic index for the outcome of patients with intracerebral hemorrhage?

G Paraforos1, E Mavromatidi1, A Chovas1, T Paraforou1, K Katselou2, V Christodoulou1, A Komnos1

1General Hospital of Larisa, Greece; 2University Hospital of Larisa, Greece

Critical Care 2008, 12(Suppl 2):P125 (doi: 10.1186/cc6346)

Introduction The objective was to correlate the lactate/pyruvate ratio (L/P) in cerebral tissue with the outcome of patients with intracerebral hemorrhage, according to the Glasgow Outcome Scale (GOS).

Methods ICU patients with spontaneous intracerebral hemorrhage diagnosed by brain CT were enrolled in the study. The inclusion criterion was a GCS on admission <8. An intracranial microdialysis catheter was inserted into cerebral tissue and an extracellular brain fluid sample was collected every 2 hours for analysis. A CMA 600 Microdialysis Analyzer was used for the measurements. Patients were divided into two groups according to their GOS score 6 months later: group A (GOS 4-5, good outcome) and group B (GOS 1-3, poor outcome). The L/P variable was dichotomized, and a cut-off value significantly correlated with outcome was sought. Comparison of the mean L/P value between the two groups was carried out at a 95% significance level.

Results Twenty-nine patients were enrolled in the study, with a mean age of 62 (±9.86) years. Six months later there were six patients in group A (mean L/P value: 34.13 ± 2.64) and 23 patients in group B (mean L/P value: 41.21 ± 16.39). There was a borderline correlation between the L/P value and outcome: group A, with a good outcome, had a lower mean L/P ratio (P = 0.059). All patients with a good outcome had an L/P value lower than 37, whereas all patients with an L/P value greater than 37 had a poor outcome, as shown in Table 1.

Table 1 (abstract P125)

Correlation between L/P ratio and GOS scale

L/P ratio   GOS 4-5 (n = 6)   GOS 1-3 (n = 23)
L/P < 37           6                 11
L/P > 37           0                 12

Fisher's exact test P = 0.028
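The reported Fisher's exact test can be reproduced from the 2 x 2 table; a minimal check, assuming SciPy:

```python
from scipy.stats import fisher_exact

# 2x2 table from Table 1: rows L/P < 37 and L/P > 37,
# columns GOS 4-5 (good outcome) and GOS 1-3 (poor outcome)
odds_ratio, p_value = fisher_exact([[6, 11], [0, 12]])  # two-sided by default
print(round(p_value, 3))  # -> 0.028, matching the value reported above
```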

Conclusions According to our results, the lactate/pyruvate ratio is correlated with the outcome of patients with intracerebral hemorrhage 6 months after admission to the ICU.

Coagulopathy predicts poor outcome in traumatic brain injury

G Fuller, H Pattani, A Chalmers, D Sperry, P Yeoman

Queen's Medical Centre, Nottingham, UK

Critical Care 2008, 12(Suppl 2):P126 (doi: 10.1186/cc6347)

Introduction Cerebral damage arising from traumatic brain injury (TBI) can occur primarily, at the time of injury, or secondarily, at a temporally distant time point post insult [1]. Abnormal clotting occurs in 10-20% of head-injured patients and may exacerbate secondary brain injury [2,3]. It may also be a marker of the degree of the primary injury. Brain tissue is rich in

thromboplastin, and activation of clotting pathways following TBI is thought to occur leading to abnormal coagulation. This may result in disseminated intravascular coagulation, cerebral microthrombi and ischaemia, or exacerbation of intracranial haemorrhage [4,5]. We have studied the admission International Normalised Ratio (INR) in moderate to severe TBI patients, examining its role as a prognostic indicator in these patients.

Methods All patients admitted to the Queen's Medical Centre from 1993 to 2002 with a recorded Glasgow Coma Score of 12 or less within 48 hours of a TBI were included in the Nottingham Head Injury Register. The INR and outcome at 1 year were recorded on the register. We examined the strength of the association between the admission INR and the outcome at 1 year. Results Data were available on 497 patients. Their mean age was 36 years (range 16-91) and 75% were male. Of the 497 patients, 199 had died at 1 year. The INR was increased in 60% of patients. Linear regression, and logistic regression after division into dead versus alive and good versus poor outcome, were significant across the whole range of increased INR, but a particularly striking and clinically relevant outcome difference was found where the INR was >1.5 (chi-squared, P < 0.001).

Conclusions A prolonged INR was observed in patients presenting with moderate or severe TBI and was associated with unfavourable outcome. An admission INR >1.5 is a statistically significant indicator of poor prognosis in moderate to severe TBI patients and may be a useful prognostic marker in these patients. This may be a valuable addition to prognostic scoring systems. References

1. Bullock R, et al.: J Neurotrauma 1996, 13:639-734.

2. Olson JD, et al.: Neurosurgery 1989, 24:825.

3. Stein C, et al.: J Neurosurg Anesthesiol 2001, 13:13.

4. Kaufman HH, et al.: Neurosurgery 1984, 15:34.

5. Bjorklid et al.: Thromb Haemost 1977, 37:91.

Alteplase for acute ischemic stroke: 2 years in a community hospital without previous experience in stroke thrombolysis

A Estella, A Sainz de Baranda, E Moreno, MJ Galan, E Leal, A Jareno

Hospital of Jerez, Spain

Critical Care 2008, 12(Suppl 2):P127 (doi: 10.1186/cc6348)

Introduction Intravenous administration of recombinant tissue plasminogen activator (rt-PA) remains the most beneficial proven intervention for emergency treatment of stroke. The objective of the present study was to assess the implementation of the 'Stroke code' in routine clinical care at our center in the last 2 years and to describe the clinical outcome of patients who received treatment with intravenous rt-PA.

Methods The aim of the 'Stroke code' is the early recognition of selected patients with a suspected stroke who may be treated with thrombolysis therapy. Prehospital emergency medical services, critical care, radiology and neurology departments are implicated. Inclusion criteria for intravenous administration of rt-PA (0.9 mg/kg) were: age 18 years or greater, measurable neurological deficit, NIHSS >4 and <25, onset of symptoms <3 hours before beginning treatment, CT without a multilobar infarction (hypodensity >1/3 cerebral hemisphere).

Results Fifty-five 'Stroke codes' were activated from November 2005 to November 2007. rt-PA was administered in 27 patients (49%); 21 were male and six female. The mean age was 64 years. The APACHE II score on admission was 8.8 ± 3.5 points and ICU length of stay was 3.5 ± 1.5 days. Eighty-eight percent of patients had a vascular risk factor, and 33.3% were receiving aspirin at stroke onset.

Post-treatment imaging was performed 48 hours after thrombolysis: three patients developed haemorrhagic infarct type 1 on CT (asymptomatic small petechiae along the margins of the infarct). Two patients died of cerebral infarction with cerebral edema. The median NIHSS score was 12.8 points at admission and 10.2, 8 and 7.2 at 2, 24 and 48 hours after treatment, respectively.

Conclusions In selected patients rt-PA is effective when used within 3 hours of stroke onset [1]. rt-PA is safe in routine clinical use despite limited prior experience of thrombolysis for acute stroke [2]. References

1. Broderick J, et al.: Stroke 2007, 38:2001-2023.

2. Wahlgren N, et al.: Lancet 2007, 369:275-282.

Acute lung injury in a neurosciences critical care unit

RD Stevens, E Lin, RE Hoesch

Johns Hopkins University, Baltimore, MD, USA

Critical Care 2008, 12(Suppl 2):P128 (doi: 10.1186/cc6349)

Introduction Acute lung injury (ALI) may complicate neurological illness, but the mechanisms and outcomes of ALI in this setting are poorly understood. We hypothesized that ALI is linked to severity of neurological illness, mechanical ventilation (MV) parameters, and outcomes in brain-injured patients.

Methods We identified consecutive patients admitted over a 2-year period to a tertiary hospital neurosciences critical care unit and requiring MV for >48 hours. ALI was determined using AECC criteria. Univariable and multivariable predictors of ALI and of mortality were assessed.

Results We evaluated 124 patients with head trauma (34 patients), intracerebral hemorrhage (29 patients), subarachnoid hemorrhage (25 patients), ischemic stroke (12 patients), and other brain disorders (24 patients). The primary indication for MV was neurological (impaired consciousness, seizures, intracranial hypertension) in 89 patients, respiratory failure in 22 patients, surgery in 10 patients, and other in three patients. ALI developed in 36 patients (29%) a mean (SD) of 2.7 (1.8) days after initiation of MV. Neither ALI risk factors (pneumonia, aspiration, sepsis, trauma, transfusion, pancreatitis) nor neurological insult severity (Glasgow Coma Scale on admission, absence of brainstem reflexes) was significantly associated with ALI. Tidal volumes and positive end-expiratory pressures on days 1 and 2 of MV were not significantly different in patients with and without ALI. Fifty-two patients (42%) died during hospitalization, and independent predictors of death were admission with intracerebral hemorrhage (OR = 4.3, 95% CI = 1.5-12.2), absence of corneal reflex (OR = 5.0, 95% CI = 1.2-20.0), and circulatory shock (OR = 6.2, 95% CI = 1.9-20.9). There was no independent association between ALI and mortality. Conclusions ALI developed in nearly one-third of patients undergoing MV following either traumatic or nontraumatic brain injury. The postulated relationships between ALI and MV parameters, neurological severity of illness, and short-term mortality were not confirmed in this population.
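ALI determination by the AECC criteria, as used above, reduces to a simple rule: PaO2/FiO2 of 300 mmHg or less with bilateral infiltrates and no left atrial hypertension (a ratio of 200 mmHg or less defines ARDS). A hedged sketch, with illustrative names that are not from the abstract:

```python
def aecc_category(pao2_mmhg, fio2, bilateral_infiltrates, left_atrial_htn):
    """Classify oxygenation failure by the AECC (1994) definitions.
    fio2 is given as a fraction, e.g. 0.5 for 50% oxygen."""
    if not bilateral_infiltrates or left_atrial_htn:
        return "neither"
    ratio = pao2_mmhg / fio2
    if ratio <= 200:
        return "ARDS"   # ARDS is the more severe subset of ALI
    if ratio <= 300:
        return "ALI"
    return "neither"
```

For example, a patient with PaO2 of 75 mmHg on FiO2 0.3 (ratio 250) and bilateral infiltrates would be classified as ALI.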

Hemodynamic changes after hypothalamic and brain stem surgery: interdisciplinary approach to studying

S Madorskiy, A Parfenov, A Zakharova

Neurosurgery Institute, Moscow, Russian Federation Critical Care 2008, 12(Suppl 2):P129 (doi: 10.1186/cc6350)

Introduction We discuss prognostic criteria of hemodynamic changes in patients after hypothalamic and brain stem surgery. We hope that a better understanding of the mechanisms of adaptive disorders in local brain lesions will help optimize the postoperative management of these patients. This study was based on an interdisciplinary neurocardiologic approach [1].

Methods Cardiac output, measured echocardiographically, and hemodynamic and humoral parameters were investigated in 139 patients with pituitary adenomas or craniopharyngiomas and in 148 patients with brain stem tumors.

Results We consider that unfavorable hemodynamic changes may serve as prognostic criteria of severe damage to regulatory centers in the hypothalamus or brain stem. A favorable type of hemodynamic change is essentially a postoperative stress reaction; this reaction was reduced or delayed after pituitary tumor surgery and grew to its peak by the third day after brain stem surgery. The main unfavorable hemodynamic pattern was decreased cardiac output (CO), although the causes of this decrease differed. In damage to the hypothalamus, the decrease in CO was connected with decreased blood volume, which in turn was connected with decreased vasopressin secretion. Patients with lesions of different structures of the hypothalamus and brain stem revealed specific changes in various neurohumoral systems. In damage to the dorsomedial part of the medulla oblongata, the decrease in CO was caused by primary neurogenic cardiac insufficiency. In hypothalamic damage we observed an increased amplitude of the power spectral density at the respiratory period of heart rate variability (HRV), a decreased amplitude of the low-frequency peak and a very high degree of coherence between HRV and respiratory variability. In brain stem damage, only low-frequency components were seen in the HRV power spectral density. We postulate that these distinctions in the HRV power spectral density indicate that different pathological types of cerebral regulation of hemodynamics form in hypothalamic and brain stem lesions with hemodynamic disturbance.

Conclusions Disorders of humoral regulation in focal lesions of the hypothalamus and brain stem are specific. Intensive care should therefore be directed at restoring a normal humoral pattern. Reference

1. Goldstein D: The Autonomic Nervous System in Health and Disease. New York: Marcel Dekker; 2001.
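The HRV spectral measures discussed above (respiratory/high-frequency and low-frequency components) are conventionally estimated from the RR-interval tachogram resampled to an even time grid. The authors' exact method is not specified; this is a minimal sketch assuming NumPy/SciPy and the standard LF (0.04-0.15 Hz) and HF (0.15-0.4 Hz) bands:

```python
import numpy as np
from scipy.signal import welch

def hrv_band_power(rr_s, fs=4.0):
    """LF and HF power of an RR-interval series (seconds per beat):
    resample the tachogram to an even grid, then take a Welch PSD."""
    t = np.cumsum(rr_s)                        # beat times
    grid = np.arange(t[0], t[-1], 1.0 / fs)    # even 4 Hz time grid
    rr_even = np.interp(grid, t, rr_s)
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)
    df = f[1] - f[0]
    lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df   # low-frequency power
    hf = pxx[(f >= 0.15) & (f < 0.40)].sum() * df   # respiratory-band power
    return lf, hf
```

A respiratory-coupled series (modulation near 0.25 Hz) yields HF-dominant power, matching the respiratory-period peak described for hypothalamic damage; an LF-modulated series yields the low-frequency-dominant pattern described for brain stem damage.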

Evaluation of development of diabetes insipidus in the early phase following traumatic brain injury in critically ill patients

V Karali, E Massa, G Vassiliadou, I Chouris, I Rodin, M Bitzani

G. Papanikolaou Hospital, Thessaloniki, Greece

Critical Care 2008, 12(Suppl 2):P130 (doi: 10.1186/cc6351)

Introduction The purpose of this study was to define the prevalence and outcome of diabetes insipidus (DI) in the early post-traumatic brain injury (TBI) period in ICU patients. Inadequate

antidiuretic hormone secretion, which results in DI, is a well recognized complication of TBI, owing to post-traumatic posterior pituitary dysfunction.

Methods This prospective study was performed in 73 ICU-TBI patients (with or without multisystem trauma) admitted to a general ICU at a tertiary center between December 2005 and November 2007. Patients had suffered severe TBI according to the initial GCS score (<8). DI was diagnosed if plasma sodium exceeded 145 mmol/l in the presence of inappropriately dilute urine: 24-hour urine volume >30 ml/kg body weight, urine specific gravity <1.005, or urine osmolality <300 mOsm/kg with a simultaneous plasma osmolality >300 mOsm/kg. The age, gender, GCS, Injury Severity Score (ISS), onset of DI, peak recorded plasma sodium and outcome were noted. Statistical analysis used the t test and Fisher exact test. P < 0.05 was considered statistically significant.
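The diagnostic rule stated in Methods can be expressed as a small predicate; an illustrative sketch in which the function and parameter names are assumptions, not the authors':

```python
def meets_di_criteria(plasma_na, urine_vol_ml_kg, usg=None,
                      urine_osm=None, plasma_osm=None):
    """Diagnostic rule for post-traumatic diabetes insipidus as stated in
    the abstract. Units: mmol/l, ml/kg per 24 h, mOsm/kg; usg is urine
    specific gravity. Unmeasured values may be left as None."""
    # plasma sodium > 145 mmol/l with polyuria (> 30 ml/kg per 24 h)
    if plasma_na <= 145 or urine_vol_ml_kg <= 30:
        return False
    # inappropriately dilute urine by either criterion
    dilute_by_usg = usg is not None and usg < 1.005
    dilute_by_osm = (urine_osm is not None and plasma_osm is not None
                     and urine_osm < 300 and plasma_osm > 300)
    return dilute_by_usg or dilute_by_osm
```

For example, plasma sodium 150 mmol/l with 40 ml/kg/24 h of urine at specific gravity 1.003 satisfies the rule, whereas the same urine output at a normal plasma sodium of 143 mmol/l does not.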

Results Twenty-one ICU-TBI patients (28.7%) developed acute DI. Comparison was made between two groups of these patients: Group A, nine survivors, and Group B, 12 nonsurvivors of TBI. There was no statistically significant difference between them with respect to age or gender (P > 0.05). Group B had a lower GCS (4.5 ± 1.5) than Group A (7.8 ± 3, P = 0.003). The ISS was significantly greater in Group B: 38 ± 8 versus 17 ± 7 in Group A, P < 0.001. Peak plasma sodium was significantly greater in Group B: 167 ± 4 mmol/l versus 156 ± 3 mmol/l in Group A, P < 0.05. The mean onset time of DI in Group B (1.7 ± 0.9 days) was shorter than in Group A (7.4 ± 3.3 days), P = 0.004. Overall mortality was 57.1%. The mortality rate when DI developed within the first 3 days after TBI was 90%, versus 27.2% if DI occurred later. Nonsurvivors died from brain death and not as a result of their associated injuries.

Conclusions Our results demonstrate that DI is common following severe TBI. ICU-TBI patients presenting with features of DI have a high overall mortality. The development of DI within the first 3 days of TBI is associated with a high mortality rate and impending brain death; by contrast, ICU-TBI patients who develop DI later have a better prognosis.

Hypernatremia and mortality in patients with severe traumatic brain injury

E Picetti, E Antonucci, M Mergoni, A Vezzani, E Parenti, G Regolisti, U Maggiore, A Cabassi, E Fiaccadori

Parma University Medical School, Parma, Italy

Critical Care 2008, 12(Suppl 2):P131 (doi: 10.1186/cc6352)

Introduction Hypernatremia (HyperNa) carries an increased risk of death in critically ill patients [1]. It is not known, however, whether this also holds in patients with severe traumatic brain injury (TBI).

Methods We analyzed prospective data from all patients admitted for severe TBI (GCS < 8) to a trauma ICU over a 3-year period. We collected demographics, clinical variables, complications, and the available laboratory data for each day of ICU stay. Major outcomes were ICU and hospital mortality, and ICU length of stay (LOS). We used Cox proportional-hazards regression models with time-dependent covariates designed to reflect exposure to the varying sodium (Na) levels over time during the ICU stay. The models were adjusted for age, gender, and Na level at admission as baseline covariates.

Results We included in the study 130 TBI patients (mean age 52 years, SD 23, range 18-96; males 74%; median GCS 3, range 3-8; mean SAPS II 50, SD 14, range 9-84; all mechanically ventilated; tracheostomy in 64/130, 49%). ICU mortality was

36/130 (27.7%) and hospital mortality 42/130 (32.3%). Follow-up included a total of 1,334 patient-days (average of 2.9 measurements of serum Na/day). Serum Na values were computed as the daily average, which was 140 mmol/l (range 133-153); the patient average of the daily maximum Na levels was 143 mmol/l (range 131-164). Twenty-six percent of the days in the ICU were complicated by HyperNa (that is, at least one value of Na > 145 mmol/l), with 70% of the patients showing this abnormality. The average time of first occurrence of HyperNa was 5 days from ICU admission, while only five patients had HyperNa at ICU admission. A daily increase from the cumulative patient average by 1 SD unit (about 2.4 mmol/l Na) was associated with a 2.15-fold increased hazard of death (95% CI = 1.28-3.59; P = 0.004). Adjustment for the daily use of hypertonic solutions did not change our findings. HyperNa was slightly associated with increased ICU LOS. Conclusions Our study suggests a strong relation between increased Na levels and mortality in patients with severe TBI. Although these results do not prove a causal relation between increased Na levels and death, we urge interventional studies to ascertain the safety of treatment strategies that might increase serum Na levels in patients with severe TBI. Reference

1. Lindner G, et al.: Am J Kidney Dis 2007, 50:952-957.
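Because the log hazard is linear in the covariate in a Cox model, the per-SD hazard ratio above can be rescaled to other increments of serum Na. A minimal stdlib sketch (the 2.15 hazard ratio per 2.4 mmol/l SD is taken from the abstract; the function name is ours):

```python
import math

def rescale_hazard_ratio(hr, from_increment, to_increment):
    """Rescale a Cox hazard ratio from one covariate increment to another.

    The log hazard is linear in the covariate, so beta = ln(hr)/from_increment
    and HR(to_increment) = exp(beta * to_increment).
    """
    beta = math.log(hr) / from_increment  # per-unit log hazard
    return math.exp(beta * to_increment)

# Reported: HR = 2.15 per 1 SD (about 2.4 mmol/l) of daily serum Na
hr_per_mmol = rescale_hazard_ratio(2.15, 2.4, 1.0)   # per 1 mmol/l, ~1.38
hr_per_5mmol = rescale_hazard_ratio(2.15, 2.4, 5.0)  # per 5 mmol/l
```

This illustrates only the arithmetic of rescaling, not the fitting of the time-dependent model itself.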

Gastric tubes in patients with severe brain injury

R Lyon1, GR Nimmo2

1Royal Infirmary of Edinburgh, UK; 2Western General Hospital, Edinburgh, UK

Critical Care 2008, 12(Suppl 2):P132 (doi: 10.1186/cc6353)

Introduction Following severe brain injury most patients require intubation and ventilation. Gastric tubes, whether nasogastric or orogastric, allow the stomach to be decompressed, which can aid mechanical ventilation, reduce the risk of aspiration and provide a route for drug administration and subsequently nutrition.

Methods A 4-month prospective audit was carried out on patients admitted to the ICU of a regional neurosurgical centre following severe brain injury. Patients were included following primary intracerebral haemorrhage or traumatic brain injury.

Results All patients (n = 25) were admitted to the ICU from an Emergency Department. All had a Glasgow Coma Score of 3 on admission to the ICU and were intubated and ventilated prior to arrival. The mean time from accident to arrival in the ICU was 15 hours. Only 32% of patients had a gastric tube in situ on arrival in the ICU; 16% had a nasogastric tube and 16% an orogastric tube. Only 16% of patients had the gastric tube inserted at the time of rapid sequence intubation. Thirty-five percent of patients who required gastric tube insertion after admission to the ICU had documented changes in management or complications as a consequence of the procedure. These included the need for bolus sedation and muscle relaxant use, with ensuing hypotension requiring inotrope support; delay in commencement of enteral feeding; and the need for extra chest radiographs to confirm the tube position.

Conclusions Instrumentation to pass a gastric tube may cause a rise in intracranial pressure or induce hypertension, which may precipitate rebleeding in patients with intracerebral haemorrhage. Transfer times to regional neurosurgical units can be long. Optimal management of the brain-injured patient should include insertion of gastric tubes at the time of initial rapid sequence intubation. This is not current practice in the emergency department and improved awareness of the need to place gastric tubes early in brain-injured patients may avoid unnecessary complications.

Alcohol: a risk factor for head injury

JE Johnston, SJ McGovern

North Tyneside General Hospital, Newcastle Upon Tyne, UK

Critical Care 2008, 12(Suppl 2):P133 (doi: 10.1186/cc6354)

Introduction The study objective was to determine whether there is a significant difference in the pattern and severity of injury sustained during falls between patients who have consumed alcohol and those who have not, and to determine how the pattern and severity of injury correlate with the blood alcohol level (BAL).

Methods A prospective, quasi-randomised controlled study conducted between November 2001 and July 2002. All healthy adults between 16 and 60 years old who had fallen from standing height were included. A systematic history and examination allowed calculation of injury severity scores as per the Abbreviated Injury Scale, 1998 update. BALs were obtained from intoxicated patients with consent.

Results Three hundred and fifty-one healthy adult patients were included in the study: 238 were in the no-alcohol group, 113 had consumed alcohol, and blood alcohol levels were obtained for 47 patients. The alcohol group had a higher incidence of head injuries (46 (48%) vs 22 (9%)) and a lower incidence of limb injuries (39 (39%) vs 183 (76%)) than the no-alcohol group. There was a significant difference in the pattern of injury between the alcohol and no-alcohol groups (χ², P < 0.001) and a significant difference in the injury severity scores (P < 0.001, Z = -2.5). In the alcohol group, the severity and pattern of injury correlated with the alcohol level at the time of injury. Patients with an alcohol level <200 mg/dl had mostly soft-tissue limb injuries (58%), those with 200-250 mg/dl mostly significant limb fractures (55%) and those with >250 mg/dl mostly significant head injuries (90%).

Conclusions Alcohol-related falls are more often associated with severe craniofacial injury. The severity of both limb and head injury is greater and correlates directly with the BAL.
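The significance of the difference in injury pattern can be checked with a Pearson chi-square test on a 2 x 2 table of head injury versus group. A minimal stdlib sketch using the head-injury counts from the abstract (the function name is ours):

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    given as [[a, b], [c, d]] (rows = groups, columns = outcome yes/no)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (observed - expected) ** 2 / expected
    return chi2

# Head injury: 46/113 in the alcohol group vs 22/238 in the no-alcohol group
stat = chi_square_2x2([[46, 113 - 46], [22, 238 - 22]])  # ~48.6, so P < 0.001
```

With 1 degree of freedom, a statistic this large is consistent with the reported P < 0.001.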

Existence of microalbuminuria during evolution of acute coronary syndrome is a powerful short-term and long-term prognostic factor

J Garcia Acuna, E Gonzalez Babarro, A Lopez Lago, J Fernandez Villanueva, S De Lange, M Gutierrez Feijoo, J Gonzalez Juanatey

Hospital Clinico Universitario, Santiago de Compostela, Spain Critical Care 2008, 12(Suppl 2):P134 (doi: 10.1186/cc6355)

Introduction Microalbuminuria (MA) is considered a risk factor in the hypertensive and diabetic population. The presence of MA during the evolution of acute coronary syndrome (ACS) is a bad prognosis criterion.

Methods We studied the presence of MA by 24-hour urine test in 396 consecutively hospitalized patients with ACS. During the hospitalization period, blood samples were taken from all patients in the first 24 hours (leukocyte count, hemoglobin and hematocrit levels, troponin I, total cholesterol, LDL-cholesterol, fibrinogen, ultrasensitive C-reactive protein (US-CRP), glucose and glycosylated hemoglobin (HbA1) serum levels). The left ventricular function was determined in all cases through echocardiography. The follow-up period was 2.5 years.

Results One hundred and forty-seven patients presented MA (37%). This group was also older (P = 0.001) and had more hypertension (P = 0.001), diabetes (P = 0.0001), strokes (P = 0.04), peripheral arteriopathy (P = 0.0001) and chronic renal failure (P = 0.0001). Thirty-seven percent of patients were hospitalized in Killip stage >I (P = 0.0001). This group was characterized by a poorer left ventricular ejection fraction (51% vs 46%, P = 0.001), worse renal function (P = 0.001) and higher glycemic levels (P = 0.0001). Patients with MA presented higher in-hospital mortality (9% vs 4%; P = 0.004), more heart failure (45% vs 21%; P = 0.0001), atrial fibrillation (25% vs 12%; P = 0.004), conduction abnormalities (15% vs 7%; P = 0.02), and strokes (4% vs 1%; P = 0.02). During follow-up, the mortality rate in the MA group rose to 15% (P = 0.0001). In the multivariate analysis adjusted for age, gender, left ventricular ejection fraction, troponin-I serum levels, presence of anemia and creatinine clearance, MA was an independent risk factor for heart failure (OR = 1.75; 95% CI = 1.02-3.01; P = 0.04) and for mortality (OR = 2.6; 95% CI = 1.05-6.41).

Conclusions The presence of MA during evolution of ACS is associated with high-profile vascular risk and is a powerful short-term and long-term prognostic factor.

Nonoperative management of blunt trauma in abdominal solid organ: a prospective study to evaluate the success rate and predictive factors of failure

S Hashemzadeh, KH Hashemzadeh, S Resaeii, MJ Dehdilani, MZ Dehdilani

Tabriz University of Medical Sciences, Tabriz, Iran

Critical Care 2008, 12(Suppl 2):P135 (doi: 10.1186/cc6356)

Introduction Over the past several years, nonoperative management (NOM) has increasingly been recommended for the care of selected blunt abdominal solid organ injuries. No prospective study has evaluated the rate of NOM of blunt abdominal trauma in the northwest of Iran. The objective of our study was to evaluate the success rate of this kind of management in patients who do not require emergency surgery.

Methods This prospective study was performed in Imam Khomeini Hospital (a referral trauma center) at Tabriz University of Medical Sciences, Iran, between 20 March 2004 and 20 March 2007. All trauma patients who had sustained injury to a solid abdominal organ (kidney, liver, or spleen) were selected for initial analysis, using Student's t test or the chi-square test.

Results During the 3 years of the study, 98 patients (83 male and 15 female) with blunt trauma were selected for NOM of renal, hepatic and splenic injuries. Mean age was 26.1 ± 17.7 years (range, 2-89) and the mean injury severity score (ISS) was 14.5 ± 7.4. The overall success rate of NOM was 93.8%. Fifty-one patients (43 men, eight women; mean ISS, 14.2 ± 5.8) underwent NOM of splenic trauma, 38 patients (33 men, five women; mean ISS, 12.9 ± 8.2) of hepatic trauma, and nine patients (seven men, two women; mean ISS, 22.2 ± 7.6) of renal trauma. Six patients underwent laparotomy due to the failure of NOM. The success rates were 94.1%, 94.7% and 88.8% for spleen, liver and kidney injuries, respectively. Female gender and ISS were significant predictors of the failure of NOM (P = 0.005 and P = 0.039, respectively).

Conclusions We suggest that NOM can be undertaken successfully in hemodynamically stable patients with blunt solid organ trauma. The study indicates that the rates of NOM vary in relation to the severity of the organ injury. These findings suggest that this approach to the care of blunt injury to abdominal solid organs should be led by trauma centers.

Evidence for early presence of intestinal epithelial cell damage in multitrauma patients

J De Haan1, J Derikx1, B Relja2, T Lubbers1, MD Luyer3, WA Buurman1, JW Greve1, I Marzi2

1NUTRIM, Maastricht University Medical Center, Maastricht, The Netherlands; 2University Hospital, JW Goethe University, Frankfurt am Main, Germany; 3Maasland Ziekenhuis, Sittard, The Netherlands Critical Care 2008, 12(Suppl 2):P136 (doi: 10.1186/cc6357)

Introduction The present study investigates the presence of intestinal epithelial cell damage in multitrauma patients on admission. In trauma patients, the development of SIRS and sepsis are important determinants of clinical outcome. Intestinal damage is considered to play an important role in the development of these inflammatory syndromes; however, clinical evidence remains scarce. Previously, in a rat model of hemorrhagic shock, we demonstrated that interventions reducing intestinal damage strongly attenuated the inflammatory response. In order to explore the potential applicability of such therapies in trauma patients, the presence of early intestinal damage was assessed after trauma.

Methods Trauma patients (n = 95) admitted to the emergency room (ER) were divided into four groups according to the Injury Severity Score (ISS) and the presence of abdominal injury (+AI or -AI): ISS < 25 +AI (n = 27); ISS > 25 +AI (n = 26); ISS < 25 -AI (n = 24) and ISS > 25 -AI (n = 18). Plasma was obtained directly after admittance to the ER. Intestinal fatty acid binding protein (I-FABP), a cytosolic protein constitutively present in mature enterocytes and released after cellular damage, was measured by ELISA. Circulating procalcitonin (PCT), representing inflammation, was assessed by Kryptor assay.

Results On admission, concentrations of I-FABP (1,395 ± 438 pg/ml) were significantly (P < 0.05) elevated in patients with ISS > 25 +AI compared with all other groups (ISS > 25 -AI: 309 ± 67 pg/ml; ISS < 25 +AI: 531 ± 202 pg/ml; ISS < 25 -AI: 221 ± 46 pg/ml) (Mann-Whitney U test). Notably, I-FABP was also significantly increased in patients without AI (ISS > 25) in comparison with 76 healthy volunteers (102 ± 12 pg/ml). On admission, I-FABP levels correlated positively with ISS (Pearson r2 = 0.28; P < 0.0001). Furthermore, I-FABP concentrations at the ER correlated with PCT levels on day 1 (Pearson r2 = 0.50; P < 0.0001).

Conclusions This is the first study to provide evidence for rapid development of intestinal epithelial cell damage in severe multitrauma patients with and without abdominal trauma. The extent of early intestinal damage is associated with the inflammatory response present at 24 hours. Further studies are needed to determine whether therapies aimed at reduction of intestinal damage improve clinical outcome of patients with severe trauma.
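The Pearson r² values above quantify shared variance (r² = 0.28 means ISS accounts for about 28% of the variance in admission I-FABP). The coefficient itself reduces to the following stdlib computation (function name and toy data are ours, not patient data):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of two equal-length
    sequences: covariance divided by the product of standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Perfectly linear toy data: r = 1.0, so r squared = 1.0
r = pearson_r([1, 2, 3, 4], [100, 200, 300, 400])
r_squared = r ** 2
```

Real ISS/I-FABP data would of course give an intermediate r, as reported.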

ICU predictors of morbidity after major trauma

F Franchi, S Scolletta, P Mongelli, E Casadei, M Cozzolino, P Giomarelli

University of Siena, Italy

Critical Care 2008, 12(Suppl 2):P137 (doi: 10.1186/cc6358)

Introduction ICU trauma patients often experience tissue hypoperfusion due to low cardiac output and oxygen delivery (DO2). The imbalance between oxygen demand and DO2 can drive an anaerobic metabolism that is correlated with poor outcome. Several authors have demonstrated that traditional (that is, serum lactate, base deficit) and oxygen-derived and carbon dioxide-derived parameters of anaerobiosis are helpful indicators of poor outcome in trauma patients. We aimed to identify predictors of morbidity in our ICU trauma patients.

Methods Data for 175 adult trauma patients (mean age 50 ± 18.5 years) admitted to our ICU were prospectively collected between May 2006 and April 2007. Seventy hemodynamic, ventilatory, and metabolic parameters were evaluated within 3 hours after ICU admission. According to the GIVITI (Italian Group for the Evaluation of Interventions in ICU) database definitions, complications were defined as one or more organ dysfunctions or failures occurring during the ICU stay. Multivariate and receiver operating characteristic (ROC) curve analyses were applied.

Results Morbidity was 40.5%. The Simplified Acute Physiology Score II (SAPS II), a high CO2 production (VCO2), and a low DO2/VCO2 ratio were significant in the multivariate analysis (Table 1). The DO2/VCO2 ratio was the best predictor of morbidity; its cutoff value was 3, and its area under the ROC curve was 0.87 (sensitivity 82%, specificity 75%). The ICU stay was longer for complicated patients (4.4 vs 14.5 days, P < 0.001), and mortality was higher (9% vs 22%, P < 0.001).

Table 1 (abstract P137). Multivariate analysis results

Variable      OR     95% CI      P value
DO2/VCO2      1.9    1.35-2.9    0.012
VCO2          1.7    1.2-2.3     0.03
SAPS II       1.2    1.01-2.1    0.04

Conclusions This study demonstrated that the DO2/VCO2 ratio correlated well with morbidity. This ratio represents the imbalance between oxygen demand and delivery, and might be continuously monitored in critically ill patients to assess an anaerobic state. Together with the SAPS II score, this ratio could predict complications in trauma patients.

Reference

1. Husain FA, et al.: Serum lactate and base deficit as predictors of mortality and morbidity. Am J Surg 2003, 185:485-491.
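The reported cutoff reads as: a DO2/VCO2 ratio at or below 3 flags a patient as high risk. Sensitivity and specificity at such a cutoff can be computed as below (a sketch with invented toy data, not the study's 175-patient dataset; the function name is ours):

```python
def sens_spec_at_cutoff(values, outcomes, cutoff):
    """Sensitivity and specificity when values <= cutoff predict the
    outcome (e.g. a low DO2/VCO2 ratio predicting morbidity)."""
    tp = sum(1 for v, o in zip(values, outcomes) if v <= cutoff and o)
    fn = sum(1 for v, o in zip(values, outcomes) if v > cutoff and o)
    tn = sum(1 for v, o in zip(values, outcomes) if v > cutoff and not o)
    fp = sum(1 for v, o in zip(values, outcomes) if v <= cutoff and not o)
    return tp / (tp + fn), tn / (tn + fp)

# Toy data: DO2/VCO2-like values and observed morbidity (1 = complication)
ratios = [2.0, 4.1, 2.5, 5.0, 1.8, 2.8]
morbid = [1, 1, 0, 0, 1, 0]
sensitivity, specificity = sens_spec_at_cutoff(ratios, morbid, 3.0)
```

Sweeping the cutoff over all observed values and plotting sensitivity against (1 - specificity) yields the ROC curve whose area the abstract reports as 0.87.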

Hospital mortality and length of ICU stay in severely burned patients

S Meier1, G Kleger1, W Künzi2, R Stocker2

1Kantonsspital St Gallen, Switzerland; 2Universitätsspital Zürich, Switzerland

Critical Care 2008, 12(Suppl 2):P138 (doi: 10.1186/cc6359)

Introduction Survival and the length of ICU stay (LOS) of severely ill or injured patients are dependent on demographic (for example, age, gender) and organizational factors as well as pre-existing diseases and the degree of physiological abnormalities. Different scores allow one to predict hospital mortality of general ICU patients. Such scores (for example, APACHE II or SAPS II) are developed by multivariate statistical methods. Burned patients, however, have been excluded in the development of most scoring systems. We are interested in finding relevant risk factors concerning hospital mortality and LOS.

Methods Patients with >10% burned surface area (BSA) admitted to the burn unit of the University Hospital Zurich between 1997 and 2006 were retrospectively analysed. Relevant epidemiologic and clinical parameters were included in a univariate analysis and subsequently in a multivariate analysis with either hospital mortality or LOS as endpoints.

Results Six hundred and sixty-two burned patients were treated between 1997 and 2006. Four hundred and eighty-nine patients with a BSA > 10% were included. One hundred and forty-one (28.8%) died and the median LOS was 19 days in survivors. There were no changes in overall mortality, gender distribution, surgical treatment or intensive care throughout the whole study period.

Conclusions We could confirm age, burned surface area, male sex, inhalation injury, diabetes mellitus and psychiatric illness of any kind as important risk factors for mortality. Additionally, suicide attempts were included in the model but did not reach statistical significance. LOS in survivors was correlated with burned surface area, inhalation injury and the presence of a suicide attempt.

References

1. Knaus WA, et al.: Crit Care Med 1985, 13:818-829.

2. Le Gall JR, et al.: JAMA 1993, 270:2957-2963.

Sympathetic responses during hemorrhagic shock

A Terada, A Caricati, J Mitsunaga, L Poli-de-Figueiredo

Federal University of Sao Paulo, Brazil

Critical Care 2008, 12(Suppl 2):P139 (doi: 10.1186/cc6360)

Introduction Hemorrhagic shock is associated with an adrenergic discharge that has been linked to neurohumoral and immune-inflammatory responses, vasoconstriction and end-organ perfusion deficits. Our goal was to characterize the sympathetic response to hemorrhage in a tissue with a rich supply of sympathetic nerves (the deferens duct).

Methods Seventy anesthetized male Wistar rats underwent femoral artery and vein catheterization for mean arterial pressure (MAP) measurement and blood withdrawal to reach a MAP of 40 mmHg. Deferens ducts were removed from rats after 10, 30 and 60 minutes, and were placed in isolated organ baths between two platinum electrodes for transmural electrical stimulation (TES) (0.1-20 Hz, 1 ms, 60 V). This technique allows evaluation of the neurotransmitters released by sympathetic nerves (noradrenaline and ATP).

Results Controls maintained a MAP of 105 ± 3 mmHg in all experimental groups. Hemorrhaged rats presented a MAP of 39 ± 3 mmHg after 10, 30 or 60 minutes. The contraction profiles for ATP and noradrenaline after TES were similar between control and hemorrhaged rats; the amplitude, however, was greater in the three hemorrhaged groups. The addition of tetrodotoxin abolished contractions induced by TES, confirming their neurogenic nature. The ATP-mediated contraction was blocked by the selective P2 purinoceptor antagonist suramin, and the noradrenaline-mediated contraction by prazosin, a selective α-adrenoceptor antagonist.

Conclusions We conclude that, based on the increased amplitude contraction induced by both noradrenaline and ATP, sympathetic nerve activity is increased in hemorrhagic shock animals.

Metabolic evaluation during weaning from mechanical ventilation using indirect calorimetry

LJ Santos, SR Vieira

Hospital de Clínicas de Porto Alegre, Brazil

Critical Care 2008, 12(Suppl 2):P140 (doi: 10.1186/cc6361)

Introduction Indirect calorimetry (IC) can be useful in the evaluation of the metabolic status of critical care patients, especially during weaning from mechanical ventilation (MV), when energy expenditure can increase. The goals of this study were to compare the energy expenditure (EE) of patients during weaning from MV under pressure support ventilation (PSV) and T tube (TT) using IC, and to compare these findings with values calculated with the Harris-Benedict equation.

Methods Patients clinically ready to discontinue MV support were evaluated from August 2006 to January 2007. They were studied, in random order, during PSV and TT. EE measurements were recorded for 20 minutes with each method. Indirect calorimetry was performed with a specific metabolic monitor (Datex-Ohmeda/M-COVX). EE was also estimated using the Harris-Benedict equation with and without an activity factor. Results are shown as the mean ± standard deviation. Statistical analysis was performed with the paired t test, Pearson's correlation coefficient and Bland-Altman analysis. The significance level was P < 0.05.

Results Forty patients were enrolled. The mean age was 56 ± 16 years, the APACHE II score was 23 ± 8 and the majority of patients were male (70%). The mean EE during TT was 14.43% greater than during PSV (P < 0.001). The mean EE estimated by the Harris-Benedict equation was 1,455.05 ± 210.4 kcal/24 hours, and 1,608 ± 236.14 kcal/24 hours when the activity factor was considered. Both calculated values correlated with those measured by indirect calorimetry during PSV (r = 0.647) and TT (r = 0.539). The agreement limits comparing measured and estimated EE with Bland-Altman analysis suggest that the Harris-Benedict equation underestimates EE during TT.

Conclusions Comparing EE during PSV and TT, using IC, we observed that during TT there was, as expected, an increase in EE (14.43%). The results also suggest that the Harris-Benedict equation underestimates energy expenditure during TT. References

1. Haugen HA, et al.: Indirect calorimetry: a practical guide for clinicians. Nutr Clin Practice 2007, 22:377-388.

2. Cheng CH, et al.: Measured versus estimated energy expenditure in mechanically ventilated critically ill patients. Clin Nutr 2002, 21:165-172.
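The Harris-Benedict estimate discussed above can be sketched as follows. The coefficients are the classic 1919 equations (weight in kg, height in cm, age in years); treat them, and the illustrative activity factor, as assumptions to be checked against the authors' exact formula:

```python
def harris_benedict(weight_kg, height_cm, age_yr, male, activity_factor=1.0):
    """Basal energy expenditure (kcal/24 h) by the classic Harris-Benedict
    equations, optionally scaled by an activity factor."""
    if male:
        bee = 66.473 + 13.7516 * weight_kg + 5.0033 * height_cm - 6.755 * age_yr
    else:
        bee = 655.0955 + 9.5634 * weight_kg + 1.8496 * height_cm - 4.6756 * age_yr
    return bee * activity_factor

# A 70 kg, 175 cm, 40-year-old man: roughly 1,634 kcal/24 h at rest
ee = harris_benedict(70, 175, 40, male=True)
ee_active = harris_benedict(70, 175, 40, male=True, activity_factor=1.2)
```

The abstract's point is that such static estimates drifted below the calorimetry measurements during the T-tube trial, when the work of breathing rises.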

Hypocaloric nutrition and outcome in critically ill patients with prolonged ICU stay

S Iff, M Leuenberger, Z Stanga, SM Jakob

University Hospital, Bern, Switzerland

Critical Care 2008, 12(Suppl 2):P141 (doi: 10.1186/cc6362)

Introduction While implementation of protocols for nutritional support is associated with less energy deficit [1,2], the impact of hypocaloric feeding on clinically relevant outcomes is more controversial: recent studies suggested both positive [3,4] and negative [5] effects in patients receiving the recommended intakes. The aim of this study was to assess the incidence and magnitude of hypocaloric feeding in an ICU without explicit nutrition protocols, together with standardized mortality ratios.

Methods A retrospective analysis of data from all patients staying >72 hours in a mixed medical-surgical 30-bed university hospital ICU in 2006.

Results Data from 562 patients (270 medical, 292 surgical) were analyzed. The lengths of ICU and hospital stay were 9 ± 9 days and 27 ± 25 days. Age was 61 ± 16 years, weight 77 ± 17 kg, BMI 26 ± 5 kg/m2, and APACHE II and SAPS II scores 24 ± 8 and 50 ± 17. Daily energy and protein intake were 302 ± 33 kcal and 12 ± 1 g (recommended intakes according to the European Society of Parenteral and Enteral Nutrition: 1,549 ± 34 kcal and 114 ± 2 g). Patients were mechanically ventilated for 7 ± 8 days. ICU mortality was 14% (expected by APACHE II and SAPS II: 50% and 46%), and hospital mortality was 22%. The total caloric deficit per patient was 9,820 ± 1,126 kcal. The distribution of the acquired energy deficit was: 0-5,000 kcal (20%), 5,000-7,500 kcal (31%), 7,500-10,000 kcal (20%), 10,000-20,000 kcal (22%), 20,000-30,000 kcal (4%), >30,000 kcal (3%).

Conclusions Most patients with an ICU stay >72 hours acquired a substantial caloric deficit during the study period when compared with recommendations. Despite this, mortality was relatively low for the measured APACHE II and SAPS II scores. Nutrition protocols should be used and their impact on both the delivered calories and clinically relevant outcome parameters be monitored. References

1. Berger et al.: Nutrition 2006, 22:221-229.

2. Heyland et al.: Crit Care Med 2004, 32:2260-2266.

3. Villet et al.: Clin Nutr 2005, 24:502-509.

4. Artinian et al.: Chest 2006, 129:960-967.

5. Krishnan et al.: Chest 2003, 124:297-305.
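The cumulative caloric deficit reported above is simply the day-by-day shortfall between recommended and delivered energy, summed over the ICU stay. A minimal sketch (the daily figures are illustrative round numbers near the study means, not patient data):

```python
def caloric_deficit(recommended, delivered):
    """Cumulative energy deficit (kcal) over an ICU stay, given parallel
    lists of daily recommended and daily delivered intake."""
    return sum(r - d for r, d in zip(recommended, delivered))

# E.g. 1,550 kcal/day recommended but only 300 kcal/day delivered for 8 days
deficit = caloric_deficit([1550] * 8, [300] * 8)  # 10,000 kcal
```

At the study's mean intakes, a deficit on the order of the reported ~9,820 kcal accrues within roughly the mean ICU stay.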

Early introduction of enteral feeding for patients with percutaneous cardiopulmonary support

H Hayami, O Yamaguchi, H Yamada, S Nagai, S Oohama, Y Sugawara, A Sakurai

Yokohama City University Medical Center, Yokohama, Japan Critical Care 2008, 12(Suppl 2):P142 (doi: 10.1186/cc6363)

Introduction Early enteral nutrition has been shown to have a beneficial effect on intestinal integrity and motility, immunocompetence, and patient outcome. Generally, circulatory stability is required before its introduction but, because there is no precise definition, the right time to begin may be missed. In the present study we attempted to establish early enteral nutrition in patients with cardiogenic shock on a ventricular assist device.

Methods Ten postoperative patients with cardiogenic shock under percutaneous cardiopulmonary support were included. An enteral feeding tube was placed beyond the pylorus within 36 hours of operation under observation with an upper gastrointestinal fiberscope. We estimated gastric motility by counting the number of vermiculations over 3 minutes at the pylorus. We assessed intestinal movement by observing on an X-ray film whether contrast medium injected 3 hours earlier had moved. If the medium had moved rapidly to the colon, enteral formula was started at a rate of 20 ml/hour. The serum prealbumin concentration was measured every 7 days. Other laboratory data were compared retrospectively with five control TPN patients.

Results Gastric motility was decreased to 4.6 ± 3.2 vermiculations/3 minutes, but contrast medium moved rapidly to the ascending colon in two patients, the transverse colon in three, the sigmoid colon in one, and the rectum in three. One patient needed to stop enteral nutrition transiently because of reflux, but in the other nine patients enteral nutrition was well established. The prealbumin level rose weekly (13 ± 3.5, 14.1 ± 4.9 and 22 ± 2.8), but comparison with control TPN patients was difficult because many of them died early.
Serum ALP, total bilirubin, and direct bilirubin concentrations 1 week later were lower in enterally fed survivors (ALP 437 ± 248 vs 566 ± 300, P = 0.57; total bilirubin 2.5 ± 2.5 vs 3.1 ± 1.0, P = 0.09; and direct bilirubin 1.5 ± 1.8 vs 2.1 ± 0.8, P = 0.09). Seven (70%) of the enterally fed patients survived over 90 days (all five patients in the TPN group died).

Discussion If the mesenteric circulation is stable, enteral nutrition is not contraindicated. An improvement in patient outcome can even be expected through the avoidance of complications such as bacterial translocation.

Conclusions Intestinal mobility is fairly maintained in patients with cardiopulmonary support, and early enteral nutrition can be established under close observation.

Nutritional activation of the cholinergic pathway after hemorrhagic shock reduces inflammation and preserves intestinal integrity

J De Haan1, T Lubbers1, M Hadfoune1, MD Luyer2, CH Dejong1, WA Buurman1, JW Greve1

1NUTRIM, Maastricht University Medical Center, Maastricht, The Netherlands; 2Maasland Ziekenhuis, Sittard, The Netherlands Critical Care 2008, 12(Suppl 2):P143 (doi: 10.1186/cc6364)

Introduction This study investigates the effects of lipid-enriched nutrition administered early after hemorrhagic shock. Previously we have shown that high-lipid feeding effectively inhibits systemic inflammation and preserves intestinal integrity when given before hemorrhagic shock by stimulation of the cholinergic anti-inflammatory pathway via activation of CCK receptors. Control of the inflammatory status of trauma patients forms a major clinical problem since the inflammatory cascade is already ongoing upon presentation. The anti-inflammatory effects of high-lipid intervention after shock are therefore examined.

Methods Hemorrhagic shock in rats was induced by extracting 30-40% of the circulating volume. Animals were subsequently fasted or given enteral feedings containing high or low concentrations of lipids at 30 and 180 minutes after shock (n = 8). CCK-receptor antagonists were administered 10 minutes before feeding. Tissue and plasma were collected 4 hours after shock to assess inflammation and intestinal integrity.

Results Administration of lipid-enriched nutrition early after shock significantly reduced plasma levels of IFNγ at 4 hours (0.39 ± 0.06 ng/ml) compared with low-lipid treated (0.77 ± 0.09; P < 0.01) and fasted animals (1.38 ± 0.11; P < 0.001). Enterocyte damage, expressed as circulating levels of ileal lipid binding protein, was prevented by high-lipid feeding compared with animals that received a low-lipid composition or were fasted (3.7 ± 0.3 vs 4.9 ± 0.5 vs 8.0 ± 1.1 pg/ml; P < 0.05 and P < 0.0001, respectively). Furthermore, early post-shock intervention with lipid-enriched feeding significantly reduced translocation of bacteria to distant organs (69.7 ± 6.4 vs low lipid: 100.9 ± 9.2 CFU/g tissue; P < 0.05). Blockade of CCK receptors abrogated the anti-inflammatory effects of high-lipid nutrition (IFNγ 1.18 ± 0.15 vs vehicle 0.58 ± 0.14 ng/ml; P < 0.05).

Conclusions Administration of lipid-enriched nutrition after hemorrhagic shock reduces inflammation and preserves intestinal integrity. This study implicates lipid-enriched nutrition as a potential therapeutic option in settings in which inflammation and tissue damage are already present, such as in trauma patients.

Inflammatory response in patients requiring parenteral nutrition: comparison of a new fish-oil-containing emulsion (SMOF®) versus an olive/soybean oil-based formula

I Schade, KD Rohm, A Schellhaass, A Mengistu, J Boldt, SN Piper

Klinikum Ludwigshafen, Germany

Critical Care 2008, 12(Suppl 2):P144 (doi: 10.1186/cc6365)

Introduction Lipid emulsions are an essential part of parenteral nutrition (PN), serving both as an energy supply and as a source of essential fatty acids. It has been shown that the composition of cell membranes is influenced by the fatty acid profile of dietary lipids, and may therefore be responsible for modulations in the immune response. The aim of this study was to assess the effects of a new lipid emulsion based on soybean oil, medium-chain triglycerides, olive oil and fish oil (SMOF®) compared with a lipid emulsion based on olive and soybean oil (ClinOleic®) on the inflammatory response in postoperative ICU patients.

Methods A prospective randomised study. After approval from the ethics committee, 44 postoperative surgical patients with an indication for PN were included. Nonprotein calories were given as 60% glucose and 40% lipid emulsion. The total energy intake per day was calculated as 25 kcal/kg body weight. The sedation regimen was standardized, excluding propofol administration. Patients were allocated to one of two nutrition regimens: group A (n = 22) received SMOFlipid® 20%, and group B (n = 22) a lipid emulsion based on olive and soybean oil (ClinOleic® 20%). Lipid emulsions were administered for 5 days postoperatively, corresponding to the observation time. IL-6, TNFα, and soluble E-selectin (sE-selectin) levels were measured before the start of infusion (d0), at day 2 (d2) and at day 5 (d5) after the start of administration. The significance level was defined as P < 0.05.

Results There were no significant differences between the two groups in the inflammatory response at d0 and d2. At d5, however, significantly lower IL-6 (group A: 73 ± 58 vs group B: 123 ± 107 pg/ml), TNFα (group A: 15.2 ± 7.9 vs group B: 22.6 ± 12.9 pg/ml), and soluble E-selectin concentrations (group A: 21.5 ± 13.7 vs group B: 32.6 ± 21.2 ng/ml) were seen in patients receiving SMOF® compared with patients administered ClinOleic®.

Conclusions The administration of SMOFlipid® within a PN regimen led to a significantly reduced inflammatory response at day 5 of the nutrition regimen, as measured by IL-6, TNFα, and soluble E-selectin values, compared with a lipid emulsion based on olive and soybean oil.

Lipid-enriched nutrition reduces inflammation via local activation of the autonomic nervous system by cholecystokinin

T Lubbers1, J De Haan1, M Luyer2, M Hadfoune1, C Dejong1, W Buurman1, J Greve1

1Maastricht University Medical Center, Nutrition and Toxicology Institute (NUTRIM), Maastricht, The Netherlands; 2Maasland Hospital, Sittard, The Netherlands

Critical Care 2008, 12(Suppl 2):P145 (doi: 10.1186/cc6366)

Introduction The present study investigates the nutritional activation of the cholinergic anti-inflammatory pathway. Lipid-enriched nutrition effectively attenuates systemic inflammation and prevents gut barrier failure by stimulation of the cholinergic pathway via cholecystokinin (CCK) receptors. This study investigates whether enteral lipids activate the autonomic nervous system via local stimulation of CCK receptors on the afferent vagus or by activation of receptors within the central nervous system via circulating CCK. Methods Sprague-Dawley rats were subjected to hemorrhagic shock. Before shock, animals were fasted or fed a lipid-enriched oral nutrition at 18 hours, 2 hours and 45 minutes. Peripheral activation of the autonomic nervous system was determined by performing deafferentations with perivagal application of capsaicin prior to shock. Central activation of the autonomic nervous system by circulating levels of CCK was studied by infusion of high levels of sulfated CCK8 starting 30 minutes prior to shock until sacrifice in fasted animals. Plasma and tissue samples were collected 90 minutes after shock to assess the inflammatory status and gut barrier function.

Results Deafferentation significantly abrogated the inhibitory effect of dietary fat on TNFα (133.7 ± 31.6 pg/ml vs 45.3 ± 12.9 pg/ml (sham); P < 0.001) and IL-6 (168 ± 14 pg/ml vs 69 ± 9 pg/ml (sham); P < 0.001). Preservation of gut barrier function was hindered by vagal deafferentation, expressed as increased leakage of HRP in ileal segments (6.1 ± 0.3 µg/ml vs 2.7 ± 0.3 µg/ml (sham); P < 0.001) and bacterial translocation (113 ± 20 CFU/g tissue vs 33 ± 4 CFU/g tissue (sham); P < 0.001). Infusion of sulfated CCK8 (arterial levels: 13 ± 2 pM at shock and 19 ± 4 pM at sacrifice) failed to attenuate inflammation and improve gut barrier function. Conclusions Our study shows for the first time that lipid-enriched nutrition attenuates systemic inflammation and improves intestinal integrity via local activation of the afferent vagus nerve. The presence of enteral lipids is essential to exert these protective effects. Clinically, nutritional activation of this potent anti-inflammatory pathway could provide a novel therapeutic treatment for patients prone to develop excessive inflammation.

Efficacy of glutamine dipeptide-supplemented total parenteral nutrition in critically ill patients: a prospective, double-blind randomized trial

TG Grau1, A Bonet2, E Miñambres3, L Piñeiro2, A Robles4,

JA Irles5, J Acosta6, J Lopez1

1Hospital Severo Ochoa, Madrid, Spain; 2Hospital Josep Trueta, Girona, Spain; 3Hospital Marques de Valdecilla, Santander, Spain;

4Hospital Vall d'Hebron, Barcelona, Spain; 5Hospital de Valme,

Seville, Spain; 6Hospital General de Alicante, Spain

Critical Care 2008, 12(Suppl 2):P146 (doi: 10.1186/cc6367)

Introduction The aim of this study was to assess the clinical efficacy of glutamine dipeptide-supplemented total parenteral

nutrition (TPN), defined by the occurrence of nosocomial infections or new organ failure as clinical endpoints.

Methods Patients received a glutamine dipeptide-supplemented TPN (Glu-TPN) or a standard TPN (S-TPN). Entry criteria: adult

patients in the ICU requiring TPN for 3 days or more and APACHE

II score >12. Exclusion criteria: malnutrition or obesity, chronic renal or hepatic failure, immunocompromised patients and poor life expectancy. Both groups received isonitrogenous and isocaloric TPN. Nutritional needs were calculated as 0.25 g N/kg/day and 25 kcal/kg/day. The Glu-TPN group received 0.5 g/kg/day glutamine dipeptide and the S-TPN group a similar amount of amino acids. Vital signs, sepsis and septic shock on admission, type of patient, daily SOFA score, daily calories administered, nosocomial infections based on CDC criteria, ICU and hospital lengths of stay and ICU mortality were recorded. Intent-to-treat and per-protocol analyses were done. Infection rates were compared using density rates and the ΔSOFA score was analyzed using ANOVA. Results One hundred and seventeen patients received any intervention, 53 assigned to Glu-TPN and 64 to S-TPN. Baseline characteristics were similar in both groups. Fewer new infections occurred in Glu-TPN patients: nosocomial pneumonia 8.04 versus 29.25 episodes per 1,000 days of mechanical ventilation (RR = 1.4; 95% CI = 1.2-1.7; P = 0.02), and urinary tract infections 2.5 versus 16.7 episodes per 1,000 days of urinary catheter (RR = 1.6; 95% CI = 1.3-2.1; P = 0.04). There were no differences in the incidence of catheter-related sepsis, primary bacteremias and intra-abdominal infections. There was a trend to improved ΔSOFA score in patients receiving Glu-TPN: ΔSOFA at 72 hours (1.9 ± 2.4 vs 2.6 ± 2.7, P = 0.07). There were no differences in ICU and hospital lengths of stay or ICU mortality (15% vs 18%).
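The density rates used here express infection episodes per unit of device-exposure time. A minimal hedged sketch of the calculation (function name and figures are ours, not the study data):

```python
def incidence_density(episodes: int, exposure_days: float, per: float = 1000.0) -> float:
    """Infection episodes per `per` days of device exposure
    (e.g. ventilator-days or urinary catheter-days)."""
    if exposure_days <= 0:
        raise ValueError("exposure_days must be positive")
    return per * episodes / exposure_days

# Hypothetical: 8 pneumonia episodes over 1,000 ventilator-days
rate = incidence_density(8, 1000)  # 8.0 episodes per 1,000 ventilator-days
```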

Conclusions Glu-TPN used in critically ill patients for longer than 3 days significantly reduces the incidence of nosocomial pneumonias and urinary tract infections, and decreases the severity of organ failures.

Arginine reduces leukocyte/endothelial cell interaction in a model of normotensive endotoxemia without attenuating capillary perfusion failure

J Fertmann1, M Laschke2, B Vollmar3, MD Menger2, JN Hoffmann1

1LMU Munich, Germany; 2University of Saarland, Homburg-Saar,

Germany; 3University of Rostock, Germany

Critical Care 2008, 12(Suppl 2):P147 (doi: 10.1186/cc6368)

Introduction Sepsis and septic multiorgan failure are still associated with a high mortality. Recent pathophysiological studies have shown that a substantial depletion of the semi-essential amino acid arginine occurs during sepsis. However, the effects of high-dose supplementation of L-arginine on the microcirculation have not been well characterised. This study addresses the effect of an intravenous L-arginine application on the microcirculation in a well-established model of normotensive endotoxemia. Methods In a dorsal skinfold chamber preparation in male Syrian golden hamsters, normotensive endotoxemia was induced by intravenous lipopolysaccharide (LPS) administration (Escherichia coli, 2 mg/kg BW). Before and 30 minutes, 3 hours, 8 hours and 24 hours after LPS application, arteriolar and venular leukocyte rolling and adhesion as well as functional capillary density as a parameter of microvascular perfusion injury were quantified by intravital microscopy. In the treatment group, animals received intravenous L-arginine (50 mg/kg BW, n = 5) 15 minutes before LPS administration. Animals infused with the stereoisomer D-arginine (n = 4, 50 mg/kg BW) or sodium chloride (NaCl 0.9%, vehicle) served as controls.

Results Administration of LPS markedly increased leukocyte rolling and adherence in control animals (P < 0.01 vs baseline). L-Arginine induced a significant reduction of leukocyte rolling (P < 0.05) and adherence (P < 0.01) in postcapillary venules, whereas D-arginine did not lead to significant differences when compared with vehicle controls. Interestingly, despite its effect on leukocyte/endothelial cell interaction, L-arginine did not attenuate capillary perfusion failure. Conclusions L-Arginine supplementation results in a significant reduction of LPS-induced leukocyte/endothelial cell interaction in this in-vivo microcirculation model (dorsal skinfold chamber). The lack of improvement in capillary perfusion has to be further characterised in additional studies.

Antioxidant intake by intensive care patients

S Friar, S Stott

Aberdeen Royal Infirmary, Aberdeen, UK

Critical Care 2008, 12(Suppl 2):P148 (doi: 10.1186/cc6369)

Introduction Evidence shows that in critical illness antioxidant defences are overwhelmed by a massive increase in reactive oxygen species [1]. Antioxidant supplementation may be beneficial in these patients. We quantified antioxidant intake from enteral

nutrition by our patients and compared this with the dietary reference value (DRV) for the healthy population [2]. Methods Data were collected from a retrospective case note review during January 2007. The volume and type of feed delivered to each patient were recorded daily. Antioxidant intake was calculated from the volume of feed and the feed nutritional data. Results Antioxidant intake of vitamins and trace elements was assessed for the enterally fed patients over the first 7 days in the ICU or part thereof. This amounted to 117 days of feeding. The mean intake per day and the intake as a percentage of DRV are presented in Table 1.
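The intake calculation described in Methods reduces to multiplying the delivered feed volume by the feed's nutrient density and expressing the result against the DRV. A hedged sketch (the feed composition and DRV figures below are illustrative assumptions, not the audit data):

```python
def daily_intake(feed_ml: float, nutrient_per_100ml: float) -> float:
    """Nutrient delivered per day, from the feed volume (ml) and
    the nutrient content per 100 ml of feed."""
    return feed_ml * nutrient_per_100ml / 100.0

def percent_of_drv(intake: float, drv: float) -> float:
    """Intake expressed as a percentage of the dietary reference value."""
    return 100.0 * intake / drv

# Illustrative only: 1,400 ml of a feed containing 7.0 mg vitamin C per 100 ml,
# against an assumed adult DRV of 40 mg/day.
vit_c = daily_intake(1400, 7.0)        # 98.0 mg/day
share = percent_of_drv(vit_c, 40.0)    # 245.0% of DRV
```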

Conclusions There is no evidence to recommend an optimal intake of antioxidants, but doses of antioxidants used in clinical trials with beneficial outcomes have been up to 10-20 times the DRV [3]. Antioxidant intakes in our patients were much lower than this. The present audit shows that potentially beneficial antioxidant doses are unlikely to be achieved by standardised feed delivery alone, and additional supplementation will be required. References

1. Goodyear-Bruch C, et al.: Am J Crit Care 2002, 11:543-551.

2. Dietary Reference Values for Food Energy and Nutrients for the United Kingdom. Report of the Panel on Dietary Reference Values of the Committee on Medical Aspects of Food Policy. 11th impression. London: HMSO; 2001.

3. Berger MM: Clin Nutr 2005, 24:172-183.

Influence of first glycemia determination in acute coronary syndrome: long-term prognosis

E Gonzalez Babarro, J Garcia Acuna, A Lopez Lago, J Fernanadez Villanueva, S De Lange, M Gutierrez Feijoo, J Gonzalez Juanatey

Hospital Clinico Universitario, Santiago de Compostela, Spain Critical Care 2008, 12(Suppl 2):P149 (doi: 10.1186/cc6370)

Introduction Hyperglycemia at the moment of hospitalization is associated with a worse prognosis in patients with acute coronary syndrome (ACS). We present a study of the influence of glycemic levels at hospital admission in patients with ACS. Methods We consecutively studied the glycemic levels of 611 patients with ACS at hospital admission. We established three groups based on the glycemic levels: Group 1, <114 mg/dl; Group 2, 114-163 mg/dl; Group 3, >163 mg/dl. The clinical and outcome characteristics were evaluated, with a median follow-up of 3 years.

Results Group 3 presented significantly older age, higher hypertension levels and more diabetes, peripheral arteriopathy and chronic renal failure cases. During the follow-up, Group 3 showed the worst Killip class at the moment of hospitalization, a higher rate of heart failure (44%) and atrial fibrillation, and a lower survival rate at the end of follow-up (Group 1, 92%; Group 2, 89%; and Group 3, 82%; P = 0.03).

Conclusions A high glycemic level at the first determination in ACS patients is a long-term prognostic factor. It remains to be determined whether correct glycemic control during the acute phase of ACS modifies this prognostic influence.

Table 1 (abstract P148). Intake of antioxidants

                    Vitamin A   Vitamin C   Vitamin E   Selenium   Copper   Zinc
Mean intake         806.9 µg    98.6 mg     13.6 mg     56.9 µg    1.7 mg   12.2 mg
Percentage of DRV   119%        248%        158%        79%        150%     137%

Tight blood glucose control decreases surgical wound infection in the cardiac surgical patient population in the ICU

E Saad, N Shwaihet, AM Mousa, AK Kalloghlian, BA Afrane, MG Guy, CC Canver

King Faisal Specialist Hospital, Riyadh, Saudi Arabia Critical Care 2008, 12(Suppl 2):P150 (doi: 10.1186/cc6371)

Introduction Tight blood glucose control (TBGC) results in a decrease in the infection rate in critically ill patients. In 2002 a retrospective analysis of 38 postoperative patients in our cardiac surgical ICU revealed that most of the patients had a high serum glucose level upon arrival and remained so throughout their stay, irrespective of their diabetes status. Additionally, it was noted that the number of infections exceeded the internationally accepted rate. Methods Based on those findings, we initiated a prospective observational study implementing a continuous intravenous insulin infusion protocol, as recommended internationally, in our patients, both diabetic and nondiabetic, to achieve a blood glucose level (BGL) between 4 and 8 mmol/l. Our sample study population included 116 patients, mean age 54 (±17.9) years; 65 (56%) were males, 62 (53%) received coronary artery bypass grafting and 46 (40%) were diabetic. Initially there was resistance to implementing this protocol and compliance was poor. We therefore embarked on a nursing and physician education program for more than 1 year. We initiated a new prospective study in 2006-2007. The study included 270 patients, mean age 52 years (±15.8); 155 (57%) were males, 136 (50%) received coronary artery bypass grafting and 97 (36%) were diabetic.

Results The demographics of the study patients were similar. The mean admission BGL, highest BGL, lowest BGL and discharge BGL for 2003 and 2006-2007 were 8.1/13/7.9/11 mmol/l and 7.8/12.8/4.6/8.3 mmol/l, respectively. A comparison of wound infection rates before and after full implementation of TBGC showed a decrease in the rate from 7.25% in 2002 to 3.3% in 2007 (P = 0.02). The bloodstream infection rate, however, did not show any statistically significant change: 2% in 2003 versus 1.9% in 2007 (P = 0.4).

Conclusions Our study showed that implementing TBGC in cardiac surgical patients decreases the surgical wound infection rate but does not significantly change the bloodstream infection rate. References

1. Van den Berghe G, Wouters P, Weekers F, et al.: Intensive insulin therapy in critically ill patients. N Engl J Med 2001, 345:1359-1367.

2. Butler SO, Btaiche IF, Alaniz C: Relationship between hyperglycemia and infection in critically ill patients. Pharmacotherapy 2005, 25:963-976.

Mechanisms of kidney protection by intensive insulin therapy during critical illness

I Vanhorebeek1, B Ellger1, J Gunst1, M Boussemaere1, Y Debaveye1, N Rabbani2, P Thornalley2, M Schetz1, G Van den Berghe1

1Katholieke Universiteit Leuven, Belgium; 2University of Warwick, UK Critical Care 2008, 12(Suppl 2):P151 (doi: 10.1186/cc6372)

Introduction Strict blood glucose control with intensive insulin therapy reduces mortality and morbidity of critical illness, including newly acquired kidney injury [1-3].

Methods To study the underlying mechanisms, we independently manipulated blood glucose (G) and insulin (I) to normal (N) or high

(H) levels in our rabbit model of prolonged critical illness [4], resulting in four experimental groups: NI/NG, HI/NG, NI/HG and HI/HG.

Results Plasma creatinine levels were elevated in the two HG compared with the two NG groups. Light microscopy showed severe renal structural abnormalities in HG rabbits, with formation of tubular casts. These effects of blood glucose control on kidney function and structure were not explained by an effect on blood flow or oxygen delivery to the kidney. In contrast, in the renal cortex of HG rabbits, the activities of the mitochondrial respiratory chain enzymes were reduced to 30-50% of the values observed in controls and NG rabbits, a finding that was independent of insulin. No significant correlations were found between respiratory chain complex activities and blood flow or oxygen delivery to the cortex. Strongly significant inverse correlations were found between the enzyme activities and plasma levels of creatinine, suggesting that mitochondrial protection by intensive insulin therapy mediated at least part of the prevention of kidney injury. The glucose content in the renal cortex was more than fourfold higher in the HG than the NG groups and correlated directly with creatinine levels and inversely with enzyme activities, supporting glucose toxicity as the mediator of renal mitochondrial damage. The dicarbonyls glyoxal, methylglyoxal and 3-deoxyglucosone were elevated in plasma of the HG groups and strongly correlated with glucose in the cortex and plasma creatinine, suggesting a possible contribution of these toxic metabolites of glucose. Conclusions Intensive insulin therapy during critical illness confers renal protection by prevention of hyperglycemia-induced mitochondrial damage rather than by improving perfusion and oxygen delivery. References

1. Van den Berghe et al.: N Engl J Med 2001, 345:1359.

2. Van den Berghe et al.: N Engl J Med 2006, 354:449.

3. Schetz M, et al.: J Am Soc Nephrol, in press.

4. Ellger et al.: Diabetes 2006, 55:1096.

Complement activation after uncomplicated coronary artery bypass grafting: role of strict glucose control

C Hoedemaekers1, M Van Deuren1, T Sprong1, P Pickkers1, TE Mollnes2, I Klasen1, J Van der Hoeven1

1Radboud University Nijmegen Medical Centre, Nijmegen, The

Netherlands; 2Rikshospitalet, Oslo, Norway

Critical Care 2008, 12(Suppl 2):P152 (doi: 10.1186/cc6373)

Introduction The complement system is a key component in the SIRS response after cardiac surgery. The aim of this study was to investigate whether strict glucose control modifies complement activation in this setting and to analyze the route of complement activation.

Methods We performed a randomized trial in 20 adult patients after coronary artery bypass grafting (CABG). Patients were assigned to receive intensive or conventional insulin treatment immediately after admission to the ICU. Components of the complement system were determined by ELISA. Changes in complement levels over time were analyzed with one-way ANOVA repeated measures.

Results Blood glucose levels were significantly lower in the intensive treatment group (P < 0.003). Serum concentrations of terminal complement complex were increased on admission to the ICU in both groups (2.80 ± 1.45 AU/ml vs 3.21 ± 2.17 AU/ml, P = 0.817) and declined significantly thereafter. All complement activation pathways converge at the point of C3 activation. The C3bc concentration was strongly increased on admission in both

groups (78.9 ± 36.7 AU/ml vs 103.4 ± 68.0 AU/ml, P = 0.355) and declined in the following hours with a second peak at 8 hours after admission (P = 0.005). C3bBbP (alternative pathway activation) was increased on admission in both groups (106.44 ± 42.72 AU/ml vs 144.44 ± 73.51 AU/ml respectively, P = 0.199), followed by a significant decline in the following hours (P < 0.001). C1rs-C1inh complexes (classical pathway) were increased on admission in both groups (38.00 ± 12.27 AU/ml vs 40.78 ± 16.41 AU/ml, P = 0.690), followed in time by a gradual decrease and later by an increase (P < 0.001). No differences in C4bc (combined classical and lectin pathway) concentrations were measured between the treatment groups, and the concentrations remained constant during ICU stay. MBL (lectin pathway) concentrations were comparable in both treatment groups and did not change significantly during the 24-hour follow-up. Conclusions Strict glucose regulation does not alter the concentration of complement components or the route of activation. Complement activation after CABG shows a biphasic pattern. Initially, complement is activated through the classical/lectin pathway and augmented by the alternative pathway. In a second phase, complement is activated by the classical/lectin pathway to the point of C3b formation without production of terminal complement complexes, indicating inhibition beyond C3b.

Relationship between admission blood glucose level and prognosis in acute ischemic and hemorrhagic stroke patients

A Bayir, S Ozdinç, A Ak, B Cander, F Kara

Selçuk University, Konya, Turkey

Critical Care 2008, 12(Suppl 2):P153 (doi: 10.1186/cc6374)

Introduction The aim of this study was to investigate the relationship between blood glucose level measured on admission and hospital mortality with the Glasgow Coma Score (GCS) in ischemic and hemorrhagic stroke patients.

Methods Those patients who experienced ischemic and hemorrhagic stroke and who arrived at the hospital within the first

3 hours after the beginning of the symptoms were included in the study. On arrival the GCS was determined. Blood glucose levels were measured for each patient. The patients were allocated to ischemic and hemorrhagic stroke groups on admission. In addition, the ischemic and hemorrhagic stroke groups were divided into GCS ≤ 8 and GCS ≥ 9 subgroups. The patients were observed in terms of mortality during their stay in the hospital. The data were compared using Kruskal-Wallis variance analysis and the Mann-Whitney U test with Bonferroni correction. P < 0.05 was considered significant. Results We enrolled 113 patients (26 hemorrhagic, 87 ischemic stroke) in the study. The mean blood glucose level in the ischemic stroke and GCS ≤ 8 group (25 patients) was 189 ± 69.23 mg/dl on admission. The mean blood glucose level for the ischemic stroke and GCS ≥ 9 group (62 patients) was 165 ± 79.8 mg/dl. The mean blood glucose level of the hemorrhagic stroke and GCS ≤ 8 group (16 patients) was 291.7 ± 162.63 mg/dl. On admission, the mean blood glucose level of the hemorrhagic stroke and GCS ≥ 9 group (10 patients) was 141.8 ± 35.46 mg/dl. The mean blood glucose level of patients who died (n = 35) was 236.25 ± 128.88 mg/dl. A significant inverse relationship was found between the GCS and blood glucose level (P = 0.00), including among the patients who died (P = 0.00). Conclusions In patients with ischemic and hemorrhagic stroke who presented to the emergency clinic within the first 3 hours after stroke onset, a high glucose level measured on admission could be an indicator of bad prognosis and high hospital mortality.

References
1. Perttu J, et al.: Stroke 2004, 35:363-364.

2. Martini SR, et al.: J Cerebral Blood Flow Metab 2007, 27: 435-451.

Significance of the suppression of blood glucose variability in acutely ill severe patients with glucose intolerance evaluated by means of bedside-type artificial pancreas

M Hoshino1, Y Haraguchi2, I Mizushima3, S Kajiwara1, M Takagi1

1Shisei Hospital, Saitama, Japan; 2National Hospital Organization Disaster Medical Center, Tokyo, Japan; 3Tokyo Police Hospital, Tokyo, Japan

Critical Care 2008, 12(Suppl 2):P154 (doi: 10.1186/cc6375)

Introduction We hereby report the usefulness of continuous use of an artificial pancreas (AP) for clinical blood glucose (BG) control. In this report we analyzed the significance of BG stability, or variability under strict control of BG, by the use of an AP. Methods BG control was performed by an AP (STG-22). Patients were evaluated at an early (E) phase and a late (L) phase (1 week after the E phase). Based on the daily mean BG (BGm), basically calculated from 24 data points obtained hourly, patients were classified into two groups: patients with BGm <200 mg/dl and those with BGm >200 mg/dl were denoted group B and group A, respectively. Each group was then classified into two subgroups based on the daily BG difference (BGd), using 100 mg/dl as the cutoff between high- and low-variability subgroups. Group B patients with BGd <100 mg/dl were denoted BL, and group B patients with BGd >100 mg/dl as BH. Subgroups AL and AH were classified similarly. The parameters studied were BGm, BGd, SOFA score and mortality.
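The group assignment above can be sketched as follows; note that the handling of the exact cutoff values and the definition of BGd as the daily max-min spread are our assumptions, since the abstract does not specify them:

```python
def classify_patient(hourly_bg_mg_dl: list[float]) -> str:
    """Assign a subgroup label from a day of hourly blood glucose readings.

    Group A: daily mean BG (BGm) >= 200 mg/dl; group B: BGm < 200 mg/dl.
    Suffix H/L: daily BG difference (BGd, taken here as max - min)
    at or above / below 100 mg/dl.
    """
    bgm = sum(hourly_bg_mg_dl) / len(hourly_bg_mg_dl)
    bgd = max(hourly_bg_mg_dl) - min(hourly_bg_mg_dl)
    group = "A" if bgm >= 200.0 else "B"
    variability = "H" if bgd >= 100.0 else "L"
    return group + variability

# A stable, moderately elevated day: mean 150 mg/dl, spread 40 mg/dl -> "BL"
label = classify_patient([130.0, 150.0, 170.0, 150.0])
```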

Results (1) Group A had BGm in the E phase and L phase of 231 ± 24 (n = 11) and 220 ± 19 (n = 7), respectively. Group B had BGm in the E phase and L phase of 175 ± 19 (n = 35) and 166 ± 21 (n = 42), respectively. (2) Relationship between BGm and BGd: (E phase) group A had a tendency towards higher BGd as compared with group B (101 ± 60 vs 68 ± 46, P < 0.10); (L phase) group A had significantly higher BGd as compared with group B (109 ± 43 vs 66 ± 46, P < 0.025). (3) Relationships between BGd and SOFA score and mortality: (E phase) group AH had a tendency towards higher mortality as compared with group AL (100%, n = 3 vs 50%, n = 8); (L phase) group BH had a tendency towards a higher SOFA score and mortality as compared with group BL (8.0 ± 6.7, 80%, n = 5 vs 6.7 ± 5.7, 38%, n = 37). Conclusions Although this is a preliminary study, based on the precise data measured by the AP, the following conclusions were suggested. High BG variability, or instability, was associated with high morbidity and mortality. BG control aimed at the suppression of BG variability, keeping BGd lower than 100 mg/dl, may therefore improve outcome as well as improving BGm.

Implementing intensive insulin therapy in daily practice reduces the incidence of critical illness polyneuropathy and/or myopathy

N Berends, G Hermans, B Bouckaert, P Van Damme, M Schrooten, W De Vooght, P Wouters, G Van den Berghe

KU Leuven, Belgium

Critical Care 2008, 12(Suppl 2):P155 (doi: 10.1186/cc6376)

Introduction In two randomised controlled trials (RCTs) on the effect of intensive insulin therapy (IIT) in a surgical ICU (SICU) and

a medical ICU (MICU), IIT reduced the incidence of critical illness polyneuropathy and/or myopathy (CIP/CIM) and the need for prolonged mechanical ventilation (MV > 14 days). Here we investigated whether these effects are present in daily practice when IIT is implemented outside a study protocol.

Methods We retrospectively studied all electronically available electrophysiological data (electroneuromyography (ENMG)) from patients in the SICU and MICU before and after implementation of IIT in routine practice (omitting data obtained during the two RCTs). All ENMGs were performed because of clinical weakness and/or weaning failure. As in the RCTs, CIP/CIM was diagnosed by the presence of abundant spontaneous electrical activity (fibrillation potentials or positive sharp waves). Baseline and outcome variables were compared using Student's t test, chi-square test or Mann-Whitney U test as appropriate. The effect of implementing IIT on CIP/CIM and prolonged MV was assessed using univariate analysis and multivariate logistic regression analysis (MVLR) correcting for baseline and ICU risk factors. Results ENMGs were performed in 193 long-stay ICU patients before and 494 after implementing IIT. This population comprised 4.6% of all patients before and 5.6% after IIT implementation in the MICU, and 4.0% before and 3.9% after IIT implementation in the SICU. With IIT, mean glycemia was significantly lowered (median 142 (130-153) to 106 mg/dl (100-113)). IIT implementation significantly reduced the ENMG diagnosis of CIP/CIM in this population (71.6% to 48.7%; P < 0.0001). MVLR identified implementation of IIT as an independent protective factor (P < 0.0001, OR = 0.24 (95% CI = 0.14-0.43)). MVLR confirmed the independent protective effect of IIT on prolonged MV (P = 0.03, OR = 0.55 (95% CI = 0.31-0.95)). This effect was explained by the reduction in CIP/CIM (P = 0.009, OR = 1.13 (95% CI = 1.65-2.42)).

Conclusions Implementing IIT in daily practice evokes a similar beneficial effect on neuromuscular function to that observed in the two RCTs. IIT significantly improves glycemic control and significantly and independently reduces the electrophysiological incidence of CIP/CIM. This reduction explains the beneficial effect of IIT on prolonged MV. References

1. Van den Berghe G, et al.: Neurology 2005, 64:1348-1353.

2. Hermans G, et al.: Am J Respir Crit Care Med 2007, 175: 480-489.

3. Van den Berghe G, et al.: N Engl J Med 2001, 345:1359-1367.

4. Van den Berghe G, et al.: N Engl J Med 2006, 354:449-461.

Evaluation of the implementation of a fully automated algorithm (eMPC) in an interacting infusion pump system for the establishment of tight glycaemic control in medical ICU patients

J Plank1, R Kulnik1, C Pachler1, R Hovorka2, D Rothlein3, N Kachel3, M Wufka3, K Smolle1, S Perl1, R Zweiker1, T Pieber1, M Ellmerer1

1Medical University Graz, Austria; 2Addenbrooke's Hospital, Cambridge, UK; 3B. Braun Melsungen AG, Melsungen, Germany Critical Care 2008, 12(Suppl 2):P156 (doi: 10.1186/cc6377)

Introduction The purpose of this study was to investigate the performance of a newly developed prototype decision support system for the establishment of tight glycaemic control in patients in the medical ICU for a period of 72 hours.

Methods The study was conducted as a single-center, open, noncontrolled clinical investigation in 10 mechanically ventilated

patients at the Medical University Graz. After admission to the ICU, arterial blood glucose values were monitored and the CS-1 Decision Support System (interacting infusion pumps with the integrated eMPC algorithm and a user interface) was used to adjust the infusion rate of intravenously administered human soluble insulin to normalize arterial blood glucose. Efficacy and safety were assessed by calculating the percentage of readings within the target range (4.4-6.1 mM), the hyperglycaemic index (HGI), mean glucose and the number of hypoglycaemic episodes (<2.2 mM). Results The percentage of readings within the target range was 47.0% (±13.0). The average blood glucose concentration and HGI were 6.08 mM (±0.73) and 0.54 mM (±0.52), respectively. No hypoglycaemic episode (<2.2 mM) was detected. Several technical malfunctions of the device, such as repetitive error messages and missing data in the data log owing to communication problems between the new hardware components, are shortcomings of the present version. Owing to these technical failures of system integration, treatment had to be stopped ahead of schedule in three patients. Conclusions For the first time, a decision support system fully integrated into an infusion pump system was tested clinically. Despite the technical malfunctions, this prototype of the CS-1 Decision Support System was, from a clinical point of view, already effective in maintaining tight glycaemic control. Accordingly, with the required technical improvements, the CS-1 system can be refined in the next phase of the development process and serve as a reliable tool for the routine establishment of glycaemic control in critically ill patients.
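Two of the efficacy measures above are easy to state precisely: the fraction of readings inside the 4.4-6.1 mM band, and the hyperglycaemic index (HGI), i.e. the area of the glucose curve above an upper limit divided by total time. The sketch below is a trapezoidal approximation under our own assumptions (the 6.1 mM upper limit and the clipping scheme are ours; the published HGI definition may differ in detail):

```python
def percent_in_target(bg_mM: list[float], low: float = 4.4, high: float = 6.1) -> float:
    """Share of blood glucose readings inside the target band (mM)."""
    return 100.0 * sum(low <= v <= high for v in bg_mM) / len(bg_mM)

def hyperglycaemic_index(times_h: list[float], bg_mM: list[float],
                         upper: float = 6.1) -> float:
    """Area of the glucose curve above `upper` divided by total time (result in mM).

    Trapezoidal rule on the excursions above the limit, clipped at zero;
    crossings between samples are not resolved exactly.
    """
    area = 0.0
    for i in range(len(times_h) - 1):
        e0 = max(bg_mM[i] - upper, 0.0)
        e1 = max(bg_mM[i + 1] - upper, 0.0)
        area += 0.5 * (e0 + e1) * (times_h[i + 1] - times_h[i])
    return area / (times_h[-1] - times_h[0])

# Hourly samples: two of four readings fall inside the 4.4-6.1 mM band
pct = percent_in_target([5.0, 7.0, 6.0, 4.0])   # 50.0
```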

Tight glucose control by intensive insulin therapy in Belgian ICU: an evaluation of practice

JC Preiser1, T Sottiaux2

1University Hospital Centre of Liege, Belgium; 2Notre-Dame de Grâce Hospital, Gosselies, Belgium

Critical Care 2008, 12(Suppl 2):P157 (doi: 10.1186/cc6378)

Introduction Recent data suggest that tight glucose control by intensive insulin therapy (TGCIIT) may improve survival of critically ill patients. The optimal target for blood glucose (BG), however, is a matter of debate and controversy, and the constraints associated with the implementation of TGCIIT are considerable. Methods The present study surveyed the current practice of glucose management. We sent a multiple-choice questionnaire to

120 Belgian ICUs, on behalf of the Belgian Federal Board for Intensive Care.

Results Fifty-two ICUs (43%) answered. A total of 489 patients were staying in the ICUs when the questionnaire was filled in. The number of glucometers per ICU bed averaged 0.6 ± 0.4 and the nurse/patient ratio averaged 0.6 ± 0.3. Glucose control is felt to be an important issue by all participants, and 96% are aware of the results of the landmark 2001 study of Van den Berghe and colleagues. Ninety percent changed their practice following this study. Fifty percent of the responders use TGCIIT for every patient, while others restrict TGCIIT to long-stayers or to septic or diabetic patients. An algorithm is used for glucose control by 98% of the participants. The BG target is 80-110 mg/dl for 27% and 110-140 mg/dl for 56% of the responders. BG is checked systematically two to eight times per day (five to eight times for 54%), on blood and capillary samples. Prior to the achievement of the target BG, checks are performed hourly (60%) or every 2 hours (28%). Once the target BG is reached, checks are performed six times (45%) to 12 times (36%) a day. The amount of glucose supplied per day ranged from 50 g to more than 200 g, with 55% of the participants providing 75-150 g. For 81% of the responders, patients are discharged from the ICU with subcutaneous insulin therapy. Finally, 98% of the responders are waiting for recommendations concerning TGCIIT. Conclusions In spite of an awareness of TGCIIT, current practice varies widely among ICUs. The need for practical recommendations, including the type of patients, the equipment required and the optimal BG target, is underlined by these data.

Glucose control and the incidence of severe hypoglycaemia in a burns population following the introduction of intensive insulin therapy

D Arawwawala, T Kooner, P Dziewulski

Broomfield Hospital, Chelmsford, UK

Critical Care 2008, 12(Suppl 2):P158 (doi: 10.1186/cc6379)

Introduction Hyperglycaemia is often associated with the hypermetabolic response to burn injury. It is perceived that glycaemic control can be difficult in a burns population. The aims of this study are to define the level of glycaemic control and the incidence of severe hypoglycaemia since the introduction of a nurse-led intensive insulin programme in a tertiary referral burns intensive therapy unit (BITU).

Methods We performed a retrospective analysis of blood glucose levels following the introduction of a tight glycaemic target range (4.4-6.1 mmol/l) in November 2003. The study period was 42 months. All patients were admissions to the BITU. Insulin therapy was initiated once glucose levels were outside the defined range and was adjusted by nursing staff according to a regularly revised glucose/insulin sliding scale. Glucose levels were obtained by whole blood analysis using an onsite blood gas analyser (Chiron Diagnostics, Novartis, USA) subjected to daily calibration. Results In total, 24,602 blood glucose measurements were recorded within the study period. For 146 adult admissions (mean age = 47.7 years, mean % burn = 42.25%), there were 19,723 measurements. Median blood glucose = 7.1 mmol/l (IQR ± 2.2). Of these measurements, 22.6% were within the target range and 30% were >8.0 mmol/l. The incidence of severe hypoglycaemia was 0.21%. For 85 paediatric (age < 16 years) admissions (mean age = 6.9 years, mean % burn = 43.5%), there were 4,879 recorded blood glucose measurements. Median = 6.8 mmol/l

(IQR ± 2.2); 29.1% of measurements were within the target range, and 23.5% were >8.0 mmol/l. The incidence of severe hypoglycaemia was 0.22% (see Figure 1).

Conclusions The defined management strategy did not achieve tight glycaemic control; however, the majority of measurements were less than 8.0 mmol/l, as recommended by the Surviving Sepsis Campaign [1]. Paediatric patients had more results within the defined range compared with adult patients. The rate of severe hypoglycaemia was only 0.2%. Given the potential morbidity of severe hypoglycaemia and the uncertain benefit of intensive insulin therapy, this approach has produced an acceptable level of glycaemic control. Reference

1. Dellinger RP, et al.: Surviving Sepsis Campaign guidelines for management of severe sepsis and septic shock. Crit Care Med 2004, 32:858-873.

Achieving glycemic control with intensive insulin therapy in the ICU

A Vakalos, G Tsigaras, G Zempekakis, V Makrakis

Xanthi General Hospital, Xanthi, Greece

Critical Care 2008, 12(Suppl 2):P159 (doi: 10.1186/cc6380)

Introduction Hyperglycemia is common in critically ill patients and is associated with increased mortality. The aim of our study was to test the efficacy of intensive insulin therapy in maintaining blood glucose levels within the target range.

Methods During a 2-month period, three patients (mean age: 67.6 years; mean APACHE II score: 13) were included in our study. The goal of the procedure was to maintain blood glucose levels below 150 mg/dl. Intravenous insulin dosages (continuous infusion and push) were adjusted by the ICU nurses based on arterial blood gas (ABG) glucose levels, and according to nutritional support and a glucose-trend algorithm.

Results During this period, 547 ABG samples were obtained overall for the patients. The number of samples per patient per day was 7.5 ± 1.6 (mean ± SD): minimum 5, maximum 12. The blood glucose value per patient per day was 123.6 ± 25.7 mg/dl (mean ± SD), minimum 13.5, maximum 196.33. The insulin dosage per patient per day was 86.34 ± 76.86 (mean ± SD), minimum 13.5, maximum 334. We recorded eight episodes of hypoglycemia (1.46% of all measurements), all successfully treated after 30% dextrose

Figure 1 (abstract P158)

Glucose distribution.

infusion. Within the target range were 427 blood glucose levels (78.06%), while the higher glucose values were associated with the initial hyperglycemia correction. The regression between glucose values and insulin dosage was not linear but rather polynomial, while the higher values of insulin dosage correlated with both the higher and the lower glucose levels.

Conclusions The blood glucose level target is difficult to achieve with intensive insulin therapy in a population of ICU patients with a high severity score on admission. In our study, the glycemic control target below 150 mg/dl was achieved in more than two-thirds of measurements using a high insulin dosage. On the other hand, the rate of hypoglycemia was high in our study (1.46%), probably because the insulin dosage algorithm was not applied correctly during interruptions of nutrition. We suggest that application of an intensive insulin therapy protocol adjusted to nutritional support and the glucose trend will achieve glycemic control in clinical practice, while minimizing the risk of hypoglycemia.

Intensive insulin therapy: protocols in use in The Netherlands

MJ De Graaff1, H Kieft2, JP Van der Sluijs3, AA Royakkers4, JC Korevaar1, PE Spronk5, MJ Schultz1

1Academic Medical Centre, Amsterdam, The Netherlands; 2Isala Hospitals, Zwolle, The Netherlands; 3Medical Centre Haaglanden, The Hague, The Netherlands; 4Tergooi Hospitals, Blaricum, The Netherlands; 5Gelre Hospitals, Apeldoorn, The Netherlands

Critical Care 2008, 12(Suppl 2):P160 (doi: 10.1186/cc6381)

Introduction Intensive insulin therapy (IIT) reduces mortality and morbidity of critically ill patients [1]. The 'original IIT protocol' as used by the group of van den Berghe (Leuven, Belgium) is a simple text-based protocol aiming for blood glucose values between 4.4 and 6.1 mmol/l. We conducted a postal survey amongst intensive care physicians and nurses in February 2007. As part of this survey, respondents were asked to send in a copy of their protocol on glycemic control (GC).

Methods All Dutch ICUs with >5 beds available for mechanical ventilation received a questionnaire on GC policies, in particular thresholds for blood glucose values to start insulin, and the targets of GC. Respondents were explicitly asked to send in their GC protocol too, when available.

Results Of 71 ICUs responding to the questionnaire, 46 (65%) sent in their GC protocol. Formats of the GC protocols varied widely; four different types of protocol format could be recognized: 'flow chart' based (n = 17), 'sliding scales' based (n = 16), 'text' based (n = 7), and 'others' (n = 5). In three ICUs the GC protocol was computer based. In only 11 GC protocols (24%) were blood glucose targets between 4.4 and 6.1 mmol/l. In the majority of GC protocols (87%), the lower target for blood glucose was <4.5 mmol/l; in only 43% of GC protocols, the upper target for blood glucose was <6.1 mmol/l. In four GC protocols, the thresholds for starting insulin were unclear. Conclusions There is large variability in the presently used GC protocols in The Netherlands. In only 24% did the GC-protocol targets reflect those of the original IIT protocol as used by van den Berghe. Reference

1. van den Berghe G, et al.: N Engl J Med 2001, 345:1359-1367.

Serum insulin-like growth factor binding protein 1 and C-peptide to assess insulin resistance in septic patients

C Chelazzi, AR De Gaudio

University of Florence, Italy

Critical Care 2008, 12(Suppl 2):P161 (doi: 10.1186/cc6382)

Introduction Insulin resistance and hyperglycemia are important features of the care of the critically ill. They are part of the metabolic pathophysiology of acute conditions [1,2]. In septic patients, insulin resistance is linked to the synergic effect of cytokines, bacterial products and catecholamines [3]. Laboratory findings include elevated C-peptide and serum insulin-like growth factor binding protein 1 (IGFBP-1) [4]. IGFBP-1 is secreted by the liver and its secretion is inhibited by insulin. In the insulin-resistant patient, a rise in insulin fails to reduce its plasma levels. We measured serum IGFBP-1 in samples from patients with sepsis and compared it with glycemia and serum C-peptide.

Methods Only patients with a definite diagnosis of sepsis were included. Five blood samples were taken from each patient: at admission and every 24 hours for the next 96 hours. For each sample, glycemia, C-peptide and IGFBP-1 were measured and their values compared.

Results C-peptide levels remained constantly high, even in normoglycemic patients. Higher glycemias were associated with raised serum IGFBP-1 levels. Insulin increases could not inhibit IGFBP-1 and, hence, the worse the insulin resistance, the higher the glycemia and the higher the IGFBP-1. The relation between C-peptide and IGFBP-1 showed that higher levels of circulating insulin were associated with higher levels of IGFBP-1. This was interpreted as a direct sign of the existing insulin-resistant state. Conclusions IGFBP-1 can be used to assess insulin resistance in septic critical patients, particularly when compared with glycemia. Its correlation with C-peptide might better define the severity of insulin resistance and, thus, of the underlying sepsis. References

1. Vincent JL: Metabolic support in sepsis and multiple organ failure: more questions than answers. Crit Care Med 2007, 35(Suppl):S436-S440.

2. Vanhorebeek I, et al.: Tight blood glucose control with insulin in the ICU: facts and controversies. Chest 2007, 132:268-278.

3. Turina M, et al.: Acute hyperglycemia and the innate immune system: clinical, cellular and molecular aspects. Crit Care Med 2005, 33:1624-1633.

4. Langouche L, et al.: Effect of intensive insulin therapy on insulin sensitivity in the critically ill. J Clin Endocrinol Metab 2007, 92:3890-3897.

Insulin increases deformation-induced lung cell death

B Wilcox, N Vlahakis, G Liu

Mayo Clinic Rochester, MN, USA

Critical Care 2008, 12(Suppl 2):P162 (doi: 10.1186/cc6383)

Introduction With insulin use increasing to treat ICU hyperglycemia, we wished to determine whether insulin is cytoprotective in the presence of deformation-induced lung cell injury. Methods A549 epithelial cells were grown in full growth media for 24 hours on deformable six-well culture plates coated with type 1 collagen. Cells were then grown in glucose concentrations of 1.26 g/l, 4.26 g/l and 7.26 g/l for an additional 24 hours. After 2 hours of serum starvation, cells were exposed to 100 nM insulin for

30 minutes, followed by stretching for 2 minutes at 8 Hz, 30% strain amplitude and 140%/s strain velocity in the presence of 1% FITC solution, a marker of plasma membrane injury and reseal. Following stretching, the cells were incubated with 1% PI, a marker of cell death, with quantification of death performed by confocal microscopy. Results In undeformed cells the rate of cell death was 0.8%, 0.5% and 1.2% at 1.26 g/l, 4.26 g/l and 7.26 g/l glucose, respectively. In undeformed cells treated with 100 nM insulin, cell death was 0.2%, 1.3% and 2.1%. Deformation increased the percentage of cellular death to 3%, 5% and 2% compared with undeformed cells. Deformation increased the percentage of death in cells treated with 100 nM insulin to 15%, 15%, and 8% as compared with undeformed cells treated with insulin and deformed cells grown in glucose alone. Cell death did not differ from control at insulin concentrations up to 100 nM, but doubled and plateaued at concentrations of 100-300 nM. Conclusions Insulin increases deformation-induced death in lung cells. This is consistent with a previously published multivariate analysis of patients with respiratory failure showing that the amount of infused insulin and the mean glucose level were independent risk factors for increased mortality. Our findings may have important implications for the use of insulin therapy in critically ill and ventilated patients. The mechanism by which insulin influences lung cell death after deformation is not yet known.

Factors influencing accuracy of blood glucose measurements in critically ill patients

J Peng, Y Liu, Y Meng, X Song, L Weng, B Du

Peking Union Medical College Hospital, Beijing, China

Critical Care 2008, 12(Suppl 2):P163 (doi: 10.1186/cc6384)

Introduction Rapid and accurate blood glucose measurement is essential for treatment of critically ill patients. This prospective observational study, performed in a nine-bed medical ICU, was designed to evaluate factors affecting the accuracy of different methods of blood glucose measurement.

Methods A total of 29 consecutively admitted patients were included. Blood glucose was measured with a glucometer (LifeScan SureStep™, capillary and arterial), a blood gas analyzer (Radiometer ABL-735, arterial), and a central laboratory (Olympus AU5400, arterial). Each value was compared with the reference laboratory result. Discrepancy was defined as the percentage of paired values not in accordance (>0.83 mmol/l difference for laboratory values <4.12 mmol/l, and >20% difference for values >4.12 mmol/l). Patient demographics and clinical data, including the presence of peripheral edema, vasopressor dependence, hematocrit, arterial pH and PaO2, were also recorded. Binary logistic regression analysis was used to determine the independent factors of discrepancy of blood glucose measurements. Results Discrepancy occurred in 46% (152/332) of blood glucose measurements. Independent factors of discrepancy are presented in Table 1.

Table 1 (abstract P163)

Independent factors of discrepancy of blood glucose measurements

Factor       OR    95% CI      P value
Hematocrit   0.93  0.89-0.98   0.006
PaO2         0.98  0.98-0.99   0.003
Laboratory   1.0   (reference)
ABG          2.4   1.1-4.9     0.021
Glucometer   7.5   3.6-15.6    0.000

Conclusions Decreased hematocrit, poor oxygenation, and use of glucometer significantly increased the risk of discrepancy of glucose measurements in critically ill patients.
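The discrepancy criterion defined in the Methods above is straightforward to express in code. The following sketch applies the two thresholds; the function name and sample pairs are illustrative, not study data:

```python
def discrepant(lab, poc):
    """Discrepancy criterion (mmol/l): >0.83 mmol/l absolute difference when
    the laboratory value is <4.12 mmol/l, otherwise >20% relative difference."""
    if lab < 4.12:
        return abs(poc - lab) > 0.83
    return abs(poc - lab) / lab > 0.20

# Illustrative (laboratory, point-of-care) pairs:
pairs = [(3.5, 4.5), (8.0, 9.0), (8.0, 10.0)]
discrepancy_rate = sum(discrepant(lab, poc) for lab, poc in pairs) / len(pairs)
```

The same rate over all paired measurements gives the 46% figure reported in the Results.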

Evaluation of a noninvasive blood glucose monitoring device for critically ill patients

O Amir1, D Dvir2, B Grunberg2, J Cohen2, E Gabis1, P Singer2

1OrSense Ltd., Nes Ziona, Israel; 2Rabin Medical Center, Petach Tikva, Israel

Critical Care 2008, 12(Suppl 2):P164 (doi: 10.1186/cc6385)

Introduction The purpose of this study was to evaluate the feasibility of the NBM device (OrSense Ltd) for noninvasive continuous glucose monitoring in critically ill patients. Critically ill patients frequently experience abnormalities in carbohydrate metabolism and a severe insulin resistance state. Hyperglycemia is a negative predictor of outcome in these patients, as high blood glucose (BG) values are associated with an increased risk of morbidity and mortality. Current BG monitoring methods do not provide the continuous glucose monitoring needed to implement tight glucose control protocols. Methods The NBM uses a sensor shaped like a ring, placed at the base of the thumb. Red/near-infrared occlusion spectroscopy detects and analyzes BG and hemoglobin concentrations. A study was conducted on 14 patients (seven females, seven males, ages 34-92 years) in the ICU of the Rabin Medical Center. The NBM probe performed noninvasive continuous glucose monitoring for up to 24 hours, with readings every 10 minutes. There were a total of 22 sessions, with two excluded due to insufficient calibration. NBM results were compared with arterial blood samples taken through an arterial line every 30-60 minutes and analyzed with a blood gas machine (ABL 700; Radiometer). In all sessions there was good patient compliance and no adverse effects were identified. Results A prospective analysis based on a uniform model with personal calibration was performed on the NBM readings, for a total of 195 paired data points. The calibration phase lasted 3 hours, utilizing reference BG values taken at t0+0:30, t0+1:30, t0+2:30, and t0+3:30. The reference BG range was 62-369 mg/dl. The median relative absolute error was 7.6%. A Clarke error grid analysis showed that 95.9% of the measurements fell within zones A (66.7%) and B (29.2%).
Furthermore, the NBM and ABL 700 showed comparable estimates for the average percentage of time in hypoglycemia (8% ABL 700, 11% NBM), euglycemia (25% ABL 700, 26% NBM), and hyperglycemia (49% ABL 700, 57% NBM). Conclusions This study indicates the potential use of the noninvasive NBM as a device for continual, accurate, safe, and easy-to-use BG evaluation in the ICU. Consequently, it could improve patient care and survival, as well as reducing staff workload. The device holds promise for trend analysis, hypoglycemia detection and closed-loop systems enabling automatic glycemic control.
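The median relative absolute error reported above is a standard accuracy summary for continuous glucose monitors. A minimal sketch of the calculation, with a hypothetical helper name and illustrative paired readings rather than study data:

```python
import statistics

def median_relative_abs_error(ref, est):
    """Median relative absolute error (%) between reference blood glucose
    values and paired sensor readings."""
    return statistics.median(abs(e - r) / r * 100.0 for r, e in zip(ref, est))

# Hypothetical paired readings (mg/dl):
reference = [100.0, 200.0, 150.0]
sensor = [110.0, 190.0, 150.0]
mre = median_relative_abs_error(reference, sensor)  # median of 10%, 5%, 0%
```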

Evaluation of a near-infrared automated blood glucose monitor for use in critical care settings

S Hendee, S Vanslyke, F Stout, M Borrello, D Welsh, A Ross, A Fettig, S Martha, A Truong, R Robinson, R Thompson

Luminous Medical, Inc., Carlsbad, CA, USA

Critical Care 2008, 12(Suppl 2):P165 (doi: 10.1186/cc6386)

Introduction Luminous Medical (Carlsbad, CA, USA) is developing an automated, patient-attached system that uses near-infrared spectroscopy to measure glucose in whole blood. The system under development will aid caregivers in achieving tight glycemic

control in critical care patients. Luminous Medical conducted a pilot study to characterize system performance in terms of automated blood access and glucose measurement accuracy. Methods Four volunteers with type 2 diabetes (mean BMI = 32) participated in an IRB-approved study of the Luminous Medical Automated Glucose Monitor (AGM). Two subjects were enrolled for 24-hour sessions, and two for 48-hour sessions. Two AGM systems were used in the study. A standard peripheral intravenous catheter was placed in the subject's arm to provide venous access. The AGM was attached to the catheter via a sterile, patient-dedicated, disposable tubing set. The system was configured to automatically draw a blood sample through a flow cell integrated into the disposable set at 30-minute intervals. Near-infrared transmission spectra were collected as blood was drawn through the flow cell. After measurement, the system reversed flow to return the blood to the subject and to flush the circuit with saline. Glucose measurements were determined from collected spectra using partial least-squares regression applied in subject-out cross-validation. Simultaneous blood samples collected and analyzed with a YSI 2700 Select provided reference glucose values. Results The Luminous Medical AGM systems collected 283 blood glucose measurements during 144 hours of operation. The system operated with a single disposable set without interruption during each of the four sessions, infrequently requiring only minor operator interventions (such as slight adjustment of the arm position). Glucose values ranged from 75 to 340 mg/dl. Bland-Altman analysis showed good agreement between Luminous Medical AGM glucose measurements and paired reference values, with a mean difference of 4.15 mg/dl, 95% confidence limits of -18.8 to 10.5 mg/dl, and R2 = 0.97.

Conclusions Luminous Medical's AGM provides reliable access to peripheral venous blood samples in volunteers with type 2 diabetes, and accurately measures glucose in these samples. Luminous technology holds considerable promise for providing an improved critical care glucose monitoring solution over currently available methods.

Comparison of accuracy of three point-of-care glucometers in an adult ICU

A Roman, C Hanicq, P Flament, T El Mahi, F Vertongen, E Stevens

CHU Saint-Pierre, Brussels, Belgium

Critical Care 2008, 12(Suppl 2):P166 (doi: 10.1186/cc6387)

Introduction Obtaining accurate blood glucose levels at the bedside is mandatory to titrate insulin infusions in ICU patients under tight glycemic control. We concurrently evaluated the performance of three point-of-care devices - one blood gas analyzer and two glucometers - in an adult ICU. Methods Arterial blood glucose was measured simultaneously with the RapidLab 1265, the Accu-Chek Aviva, the Nova StatStrip and

Table 1 (abstract P166)

Device           n    Mean bias (mg/dl)  SD   n >10% discrepancy  n >20% discrepancy
RapidLab 1265    329  -2.9               5.6  22 (6.6%)           0
Accu-Chek Aviva  329  -1.2               7.7  45 (13.6%)          5 (1.5%)
Nova StatStrip   329  -0.4               5.6  20 (6.0%)           1 (0.3%)

in the central laboratory as reference using the hexokinase method. The Bland-Altman approach and a modified Kanji approach [1] were used.

Results A total of 330 matched analyses were randomly performed in 275 patients. The mean SOFA score was 4.5 (minimum 0; maximum 21). The range of laboratory glucose was 34-526 mg/dl. One patient showed 1,025 mg/dl and was not included in the statistical analysis as the glucometers all indicated a high out-of-range value. No patient was receiving peritoneal dialysis with icodextrin, and none had a paracetamol overdose. Biases are defined as point-of-care minus laboratory glucose values. These mean biases were -2.9 mg/dl for the RapidLab 1265 blood gas analyzer, -1.2 mg/dl for the Accu-Chek Aviva and -0.3 mg/dl for the Nova StatStrip. The analysis of the 20% discrepancy showed, respectively, zero cases, five cases and one case in the study, while another 22 cases, 40 cases and 19 cases revealed more than 10% discrepancy. See Table 1.

Conclusions The very low biases and the low rate of significant (>20%) discrepancy appear sufficient for safe tight glucose control monitoring in the adult ICU. Reference

1. Kanji S, et al.: Crit Care Med 2005, 33:2778-2785.

Accuracy of point-of-care blood glucose measurements in the medical ICU

Y Liu, D Wu, X Song, Y Meng, L Weng, B Du

Peking Union Medical College Hospital, Beijing, China

Critical Care 2008, 12(Suppl 2):P167 (doi: 10.1186/cc6388)

Introduction Accurate blood glucose measurement is essential for strict glycemic control. We evaluated four methods of point-of-care blood glucose measurement in critically ill patients. Methods In this prospective observational study, blood glucose was measured with the Roche Accu-Chek (capillary and arterial), the Radiometer ABL-700 analyzer (arterial), and in the central laboratory (arterial). Each value was compared with the reference laboratory result. Discrepancy was defined as the percentage of paired values not in accordance (>0.83 mmol/l difference for laboratory values <4.12 mmol/l, and >20% difference for values >4.12 mmol/l).

Table 1 (abstract P167)

                      Mean ± SD (mmol/l)  Bias (mmol/l)  r2 (95% CI)          Cb     Discrepancy (%)
Reference             7.7 ± 2.8           NA             NA                   NA     NA
Accu-Chek, capillary  9.3 ± 2.7           1.6 ± 1.4      0.874 (0.818-0.914)  0.848  65.3 (62/95)
Accu-Chek, arterial   9.1 ± 2.6           1.4 ± 1.5      0.835 (0.758-0.889)  0.867  59.8 (52/87)
ABL-700, arterial     8.7 ± 3.0           1.1 ± 1.1      0.925 (0.890-0.950)  0.930  35.8 (34/95)
Laboratory, arterial  8.3 ± 2.8           0.6 ± 1.2      0.908 (0.864-0.938)  0.978  21.5 (20/93)

r2, Pearson correlation coefficient; Cb, bias correction factor

Results The mean value, bias, agreement and discrepancy are presented in Table 1.

Conclusions Our findings suggest that capillary blood glucose measured by a glucometer is inaccurate in critically ill medical patients. Fingerstick measurements should be interpreted with great caution to avoid hypoglycemia.

Tight glycemic control: comparison of a real-time continuous interstitial tissue glucose monitoring system with arterial plasma glucose measurement in critically ill patients

A Vlkova, F Musil, P Dostal, Z Zadak, A Smahelova, V Cerny

University Hospital Hradec Kralove, Czech Republic

Critical Care 2008, 12(Suppl 2):P168 (doi: 10.1186/cc6389)

Introduction The purpose of the study was to compare the results of interstitial glucose measurements obtained by the Guardian Real-time system and arterial plasma glucose concentrations in mechanically ventilated, critically ill patients. The Guardian Real-time continuous glucose monitoring system is an external device that uses a subcutaneous microsensor to measure the concentration of glucose in interstitial fluid.

Methods Ten mechanically ventilated critically ill patients with tight glycemic control based on arterial blood glucose measurements admitted to the six-bed multidisciplinary ICU of a tertiary care hospital, with no clinical and laboratory signs of inadequate tissue perfusion, were included in this single-center study. Interstitial glucose concentrations were measured by the Guardian Real-time monitoring system and compared with a standard reference method of plasma glucose measurement. The Guardian Real-time system was calibrated against the arterial plasma glucose measurement every 8 hours. Arterial blood glucose concentrations were measured every 60 minutes (glucose oxidation reaction) and the Guardian Real-time data were downloaded and paired with plasma glucose. Data were analyzed using the Bland-Altman method and the correlation coefficient was calculated. Results Two hundred and seventeen paired results were obtained and analyzed. Correlation between both methods was reasonable, but not perfect (correlation coefficient r = 0.6930, P < 0.0001). This was confirmed by Bland-Altman analysis (Figure 1), demonstrating broad limits of agreement: +2.3 and -3.1 mmol/l. Conclusions The observed, clinically unacceptable broad limits of agreement do not support the use of the Guardian Real-time

Figure 1 (abstract P168)

system for tight glycemic control management in mechanically ventilated, critically ill patients. Acknowledgement Supported by MZO 00179906. Reference

1. Aussedat B, et al.: Am J Physiol Endocrinol Metab 2000, 278:E16-E28.
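The Bland-Altman analysis used above reduces to a mean bias and 95% limits of agreement (bias ± 1.96 SD of the paired differences). A minimal sketch of that calculation, with illustrative glucose pairs rather than study data:

```python
import statistics

def bland_altman(ref, test):
    """Mean bias and 95% limits of agreement (bias ± 1.96 SD) of the
    paired differences test - ref, as in a Bland-Altman analysis."""
    diffs = [t - r for r, t in zip(ref, test)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired glucose values (mmol/l):
bias, lower, upper = bland_altman([5.0, 6.0, 7.0, 8.0], [5.5, 6.5, 6.5, 8.5])
```

Limits of agreement as broad as the +2.3 and -3.1 mmol/l reported above mean that, for roughly 1 in 20 paired readings, the two methods would still disagree by more than those amounts.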

Lactate measurement by the capillary method in shocked patients

O Collange, F Cortot, A Meyer, B Calon, T Pottecher, P Diemunsch

Strasbourg University Hospital, Strasbourg, France

Critical Care 2008, 12(Suppl 2):P169 (doi: 10.1186/cc6390)

Introduction Arterial blood lactate is a reliable indicator of tissue oxygen debt and is of value in expressing the degree and prognosis of circulatory failure. Compared with arterial measurement, capillary lactate measurement is easier, faster, cheaper and lowers the incidence of arterial puncture complications. Capillary lactate measurement has already been validated to assess fetal well-being. The aim of this study was to compare arterial and capillary lactate in adult shocked patients.

Methods Consecutive shocked patients hospitalized in a university hospital surgical ICU were simultaneously tested for arterial and capillary lactate measurements. Arterial lactate was measured by the usual method described by Marbach and Weil; capillary lactate was measured using a micromethod device (Lactate Pro LT-1710; Arkray, KGK, Japan). Lactate levels were compared by linear regression, calculation of Pearson's correlation coefficient R2 and using a Bland-Altman plot.

Results In total, 60 simultaneous measurements of capillary and arterial blood lactate concentrations were performed in 16 patients with shock. A good linear correlation was found between capillary lactate (CapL) and arterial lactate (ArtL) concentrations: CapL = 0.85ArtL + 1.61; r = 0.8 (P < 0.0001). The mean difference was 0.78 ± 2.3 mmol/l. See Figure 1.

Conclusions These preliminary findings suggest that capillary lactate values could be used to assess the severity and guide therapy during shock.

Figure 1 (abstract P169)
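The relation CapL = 0.85ArtL + 1.61 reported above is an ordinary least-squares fit. A minimal sketch of such a fit (the sample pairs are illustrative, not the study's measurements):

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y ≈ a*x + b: slope from the ratio of
    covariance to variance, intercept from the means."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Hypothetical arterial/capillary lactate pairs (mmol/l):
slope, intercept = linear_fit([1.0, 2.0, 4.0], [2.5, 3.3, 5.0])
```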

Influence of Stewart acid-base variables on plasma pH in critically ill patients

A Marcelli, L Bucci, A Pochini, L Errico, RA De Blasi, M Mercieri

Università La Sapienza di Roma, Rome, Italy

Critical Care 2008, 12(Suppl 2):P170 (doi: 10.1186/cc6391)

Introduction The study objective was to determine and quantify the influence of strong-ion approach variables on the plasma pH in critically ill patients.

Methods A retrospective analysis of clinical records for 284 consecutive patients admitted to the medical adult ICU of a university hospital. Analysis was made of plasma acid-base data for 5,172 blood samples collected at admission and throughout hospitalization (one sample per patient). By substituting bicarbonate with the apparent strong-ion difference (SID), the weak acid anionic component (A-) and unmeasured ions (UI) in the Henderson-Hasselbalch equation, and after selecting samples

Figure 1 (abstract P170)

Correlation between the SID and pH (patients with normal UI and PaCO2).

Figure 2 (abstract P170)

Correlation between UI and pH (patients with normal SID and PaCO2).

Figure 3 (abstract P170)

Correlation between the PaCO2 and pH (patients with normal UI and SID). Linear fit: y = -0.01x + 7.8, R2 = 0.854.

having at least one variable within the normal range, we determined possible linear relationships between the study variables. We then compared our results with those calculated using derivatives of the simplified strong-ion equation.

Results In samples with normal UI and PaCO2, the SID had a strong correlation with plasma pH (r2 = 0.84), yielding a ΔpH/ΔSID ratio of +0.013 (strong-ion acidosis/alkalosis) (Figure 1). In samples with normal SID and PaCO2, UI also correlated strongly with pH (r2 = 0.69), yielding a ΔpH/ΔUI ratio of -0.014 (uncompensated metabolic acidosis) (Figure 2). Hypoalbuminemia caused a compensatory reduction in SID of about 3 mEq/l per g/dl, thereby also influencing the pH (ΔpH/Δalbumin ratio of -0.040). In samples with normal SID and UI, PaCO2 correlated strongly with pH (r2 = 0.85), yielding a ΔpH/ΔPaCO2 ratio of -0.01 (pure acute respiratory acidosis/alkalosis) (Figure 3).

Conclusions The SID correlates strongly with changes in pH, thus identifying a strong-ion acidosis or alkalosis. The changes in pH related to the SID, total nonvolatile weak anions and PaCO2 in critically ill patients almost match those calculated using Stewart's simplified strong-ion equation. Reference

1. Constable PD: J Appl Physiol 1997, 83:297-311.
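The reported ΔpH/ΔPaCO2 ratio of -0.01 can be checked against the conventional bicarbonate-based Henderson-Hasselbalch equation (the abstract's analysis substitutes Stewart variables into this equation; the sketch below uses the unmodified form with illustrative normal values):

```python
import math

def ph_hh(hco3, paco2, pk=6.1, sol=0.03):
    """Henderson-Hasselbalch: pH = pK + log10([HCO3-] / (0.03 * PaCO2)),
    with [HCO3-] in mmol/l and PaCO2 in mmHg."""
    return pk + math.log10(hco3 / (sol * paco2))

# Central-difference slope of pH vs PaCO2 around normal values
# ([HCO3-] = 24 mmol/l, PaCO2 = 40 mmHg); close to the reported -0.01/mmHg.
slope = (ph_hh(24.0, 41.0) - ph_hh(24.0, 39.0)) / 2.0
```

With bicarbonate held fixed, the analytic slope is -1/(ln 10 · PaCO2) ≈ -0.011/mmHg at PaCO2 = 40 mmHg, consistent with the -0.01 ratio found in the patient data.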

Value of postoperative C-reactive protein and leukocyte count after elective thoracoabdominal aneurysm surgery

GB Brunnekreef1, E Scholten2, LJ Bras2, MA Schepens2, LP Aarts3, EP Van Dongen2

1University Medical Centre, Utrecht, The Netherlands; 2St Antonius Hospital, Nieuwegein, The Netherlands; 3University Medical Centre, Leiden, The Netherlands

Critical Care 2008, 12(Suppl 2):P171 (doi: 10.1186/cc6392)

Introduction There are many causes for a systemic inflammatory response after thoracoabdominal aortic aneurysm (TAAA) repair. The aneurysm itself, the surgical trauma, ischemia-reperfusion injury and reactions to graft material can all cause an inflammatory response. This makes it difficult to identify postoperative infection. A PubMed search revealed no study on how to discern between normal postoperative levels of inflammation and postoperative infection after TAAA repair.

Methods In this prospective single-centre study we included 34 patients. They underwent elective surgical TAAA repair. Immunocompromised patients and patients using immunosuppressive agents were excluded. C-reactive protein (CRP) levels and leukocyte count were measured in the operating room and on every postoperative day until discharge from the ICU, with a maximum of 14 days. We also determined the occurrence of fever. Results Five patients (15%) suffered a postoperative infection: three pulmonary infections, one bacteraemia of unknown origin and one patient suffered a septic period without positive cultures. In all patients there was a postoperative rise in CRP with a maximum on the second and third postoperative day. The median CRP was 229 mg/l on the second day and 221 mg/l on the third postoperative day. CRP declined towards preoperative levels during the first 2 weeks after surgery. The leukocyte count continued to rise postoperatively to 13 x 109/l on day 14. There was no correlation between fever or leukocyte count and infection. In only three of five patients with postoperative infection was a second rise in CRP noted.

Conclusions This study shows the CRP levels and leukocyte count that can be expected in the ICU after TAAA surgery. Surprisingly, the leukocyte count continued to rise. This may be caused by the fact that patients with infection tend to stay longer in the ICU. The median CRP level on day 14, however, was only 22 mg/l. Not all postoperative infections caused a rise in the already high CRP levels. In some cases, therefore, CRP, leukocyte count and temperature may be of no clinical value. Clinical evaluation combined with positive cultures may be the only method to diagnose postoperative infection in this group of patients.

Role of inflammation in nonhemorrhagic strokes

H Michalopoulou, A Bakhal, J Vaitsis, D Nastou, N Stravodimou, D Pragastis

Metaxa Hospital, Athens, Greece

Critical Care 2008, 12(Suppl 2):P172 (doi: 10.1186/cc6393)

Introduction In recent years considerable interest has been focused on the role of inflammation in the pathophysiology of acute coronary syndromes. There are limited data, however, about its participation in the pathogenesis of strokes. We investigated whether inflammation markers are increased in the acute phase of strokes.

Methods We consecutively studied 54 patients aged 55 ± 8 years (32 males) who were hospitalized in the ICU from June 2005 to December 2007 with a diagnosis of nonhemorrhagic stroke proven by computed tomography (CT) or magnetic resonance imaging (MRI). Within 24 hours of admission, C-reactive protein (CRP), IL-6 and fibrinogen values were determined in all patients. Seventy patients, comparable in age and sex, served as a control group. Results See Figure 1.

Conclusions Inflammation markers are increased in the acute phase of ischemic strokes. Further studies are needed to show whether this increase is secondary to the stroke or itself contributes to the pathogenesis of ischemic strokes.

Figure 1 (abstract P172)

                    Patients with stroke (n = 54)   Control group (n = 70)   P value
CRP (mg/l)          4.5 (1.8-12.5)                  1.9 (0.9-3.6)            0.0001
Fibrinogen (mg/dl)  432 (352-496)                   315 (280-392)            0.0001
IL-6 (pg/ml)        4.7 (2.2-9.7)                   2.9 (1.9-4.6)            0.0001

Biomarkers that might improve prognostic accuracy in ICU patients

M Karvouniaris, M Stougianni, A Tefas, D Lagonidis

General Hospital of Giannitsa, Thessaloniki, Greece

Critical Care 2008, 12(Suppl 2):P173 (doi: 10.1186/cc6394)

Introduction There is often a discrepancy between physician prediction of mortality and clinical prediction scores, and the accuracy of the latter is rather moderate. Nevertheless, it is tempting to simply measure a biochemical parameter - a biomarker - to obtain prognostic information. Serum lipoproteins, for example, can mirror inflammatory activity and prognosis as well.

Methods A prospective study of 29 patients who stayed in the ICU for at least 4 days and had the following characteristics: age 62.28 ± 16.92 years, length of stay in the ICU 15.55 ± 10.51 days and APACHE II score 21.28 ± 7.83. C-reactive protein (CRP), total cholesterol and high-density lipoprotein (HDL) were measured on admission and on day 4 in the ICU. First, we correlated these parameters with the length of stay in the ICU using Pearson's correlation method. Second, we compared the means between survivors and nonsurvivors after 6 months with an independent-samples t test. We finally constructed receiver operating characteristic (ROC) curves of the above parameters according to mortality.

Results CRP both on admission and on day 4 was positively correlated with length of stay in the ICU (P < 0.05). Mortality at 6 months was 18/29 (62%). According to the independent-samples t test, statistical significance (P < 0.05) was only found for CRP on admission. On admission, the values of the areas under the curve (AUC) were: CRP = 0.793, HDL = 0.667, total cholesterol = 0.604. Furthermore, on day 4 the AUC values were: CRP = 0.629, HDL = 0.712 and cholesterol = 0.629. Using a cutoff CRP value on admission of <0.87 mg/dl, there was a better chance of survival, with a sensitivity of 63.6% and a specificity of 94.44% (95% CI = 72.6-99.1). In addition, a cutoff HDL value on day 4 of >28.4 mg/dl predicted survival with a sensitivity of 63.6% and a specificity of 83.3% (95% CI = 46.5-90.2).

Conclusions Serial measurements of CRP and HDL are both easy to perform and can add prognostic information. On the other hand, total cholesterol seems not to have any prognostic significance.
Reference

1. Schuetz et al.: Curr Opin Crit Care 2007, 13:578-585.
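The AUC values reported above rest on a simple rank statistic: the AUC equals the probability that a randomly chosen nonsurvivor has a higher marker value than a randomly chosen survivor (the Mann-Whitney identity). A minimal sketch follows; the marker values below are invented for illustration and are not the study's data.

```python
# Illustrative sketch: AUC of a biomarker via the Mann-Whitney identity.
# All data below are invented, not taken from the abstract.

def roc_auc(marker_events, marker_nonevents):
    """AUC = probability that a randomly chosen event case has a higher
    marker value than a randomly chosen non-event case (ties count 0.5)."""
    wins = 0.0
    for x in marker_events:
        for y in marker_nonevents:
            if x > y:
                wins += 1.0
            elif x == y:
                wins += 0.5
    return wins / (len(marker_events) * len(marker_nonevents))

# Hypothetical admission CRP values (mg/dl) in nonsurvivors and survivors
crp_nonsurvivors = [5.2, 9.8, 3.1, 12.4, 7.7]
crp_survivors = [0.6, 1.9, 0.4, 4.0, 0.8]
print(round(roc_auc(crp_nonsurvivors, crp_survivors), 3))
```

An AUC of 0.5 means the marker is no better than chance; values such as the 0.793 reported for admission CRP indicate moderate discrimination.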

Plasma C-reactive protein and albumin as predictors of readmission to intensive care

N Al-Subaie, T Reynolds, R Sunderland, A Myers, P Morgan, RM Grounds, A Rhodes

St Georges Hospital, London, UK

Critical Care 2008, 12(Suppl 2):P174 (doi: 10.1186/cc6395)

Introduction Readmission to intensive care is associated with poor outcome. Raised C-reactive protein (CRP) and low albumin are associated with systemic inflammation, and this study aims to assess their usefulness in predicting readiness for discharge from the ICU. Methods An observational study based in a London teaching hospital mixed medical/surgical ICU. Plasma CRP and albumin on the day of ICU discharge and patients' demographic and outcome data were collected.

Results Seven hundred consecutive patients were identified, of whom 125 were excluded as they were not suitable for readmission. Eleven patients did not have plasma CRP and albumin data on the day of discharge or the outcome was unknown. Of the 564 patients included, 53.1% were males, the median age was 64 (16-97) years and 38.1% were medical admissions. In the 55 patients who were readmitted to the ICU (9.8%), there was a significant difference in their median CRP (70.2 vs 128.4, P = 0.036) and median albumin (19 vs 21, P < 0.001) compared with the remaining patients. See Figure 1. The areas under the ROC curve for plasma CRP and albumin were 0.583 and 0.313, respectively, which precludes the use of these biochemical markers as useful predictors of readmission to the ICU in our population.

Figure 1 (abstract P174)

Subsequent ICU readmission
                 No readmission   Readmission   P value
Age              63               68            0.053
Plasma CRP       70.2             128.4         0.037
Plasma albumin   23               19            <0.001

Conclusions Plasma CRP and albumin are of limited clinical value in predicting successful ICU discharge despite the significant difference of their values in patients who were subsequently readmitted to the ICU compared with the others. We are currently validating these results by collecting the data prospectively.

Role of the leukocyte antisedimentation rate in the early recognition of post-stroke infection

T Molnar, A Peterfalvi, L Bogar, Z Illes

University of Pecs, Hungary

Critical Care 2008, 12(Suppl 2):P175 (doi: 10.1186/cc6396)

Introduction Patients with stroke are more susceptible to bacterial infections, which may reflect early alterations of immune responses, especially those of leukocytes. The leukocyte antisedimentation rate (LAR), a simple test to detect activation of leukocytes, was therefore serially examined and correlated with high-sensitivity C-reactive protein (hsCRP), S100b, procalcitonin (PCT) and outcome in patients with acute ischemic events.

Methods Venous blood samples were taken serially for measuring the LAR, S100b, hsCRP and PCT within 6 hours after the onset of first symptoms (T0), at 24 hours (T24) and at 72 hours (T72). After 24 hours, enrolled patients were categorized into acute ischemic stroke (AIS) and transient ischemic attack (TIA) groups, based on clinical and imaging data. The LAR and hsCRP were also obtained in 61 healthy volunteers. For statistical analysis, the Wilcoxon test, Spearman correlation, ROC analysis and Mann-Whitney U test were used. Results The LAR measured on admission (T0) was significantly higher in patients with acute ischemic events (AIS, n = 38 and TIA, n = 11) compared with healthy controls (median, IQR: 0.329, 0.212 vs 0.159, 0.218 vs 0.060, 0.069, respectively; P < 0.001, P = 0.002). In addition, the LAR was significantly higher at T0 in AIS patients compared with patients with TIA (median, IQR: 0.338, 0.204 vs 0.149, 0.168; P < 0.05). When the LAR was serially analyzed in the AIS group, a significant decrease in the LAR at T24 was found in 10 patients complicated by post-stroke infections (P = 0.028). The cutoff value of the LAR at T24 differentiating patients at high risk of post-stroke infections was 25.6%, with a sensitivity of 81.5% and a specificity of 60% (AUC: 0.728, P = 0.035). The cutoff value of the LAR at T24 for predicting poor outcome (defined by Glasgow Outcome Scale < 3) was 26.4%, with a sensitivity of 84.2% and a specificity of 53% (AUC: 0.728, P = 0.020). We also observed a positive correlation of S100b with both the LAR and hsCRP at 72 hours (P < 0.05).

Conclusions The simple LAR test was capable of separating individuals with definitive ischemic stroke from those with TIA within 6 hours after onset of symptoms, and of selecting patients with AIS at T24 who are at high risk for post-stroke infection. Our results indicate a very early and rapid activation of innate immune responses in stroke, correlating with the size of infarct, and suggest that a lack of elevation in the LAR could be related to an increased risk of infection due to a dysregulated activation of leukocytes.
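Cutoff values with paired sensitivity and specificity, like those quoted above, are typically read off the ROC curve; one common recipe is to pick the threshold maximising Youden's index (sensitivity + specificity - 1). The sketch below assumes higher marker values indicate the event (for the LAR at T24 the direction is reversed, since a decrease signalled infection risk) and uses invented data, not the study's.

```python
# Illustrative sketch: choosing a biomarker cutoff by maximising
# Youden's index. Data below are invented for illustration.

def best_cutoff(values_with_outcome):
    """values_with_outcome: list of (value, is_event) pairs; higher values
    are assumed to indicate the event. Returns (cutoff, sens, spec)."""
    events = [v for v, e in values_with_outcome if e]
    nonevents = [v for v, e in values_with_outcome if not e]
    best = None
    for cut in sorted({v for v, _ in values_with_outcome}):
        sens = sum(v >= cut for v in events) / len(events)
        spec = sum(v < cut for v in nonevents) / len(nonevents)
        youden = sens + spec - 1
        if best is None or youden > best[0]:
            best = (youden, cut, sens, spec)
    return best[1], best[2], best[3]

# Hypothetical marker values with event labels
data = [(0.35, True), (0.30, True), (0.28, True), (0.32, True),
        (0.20, False), (0.25, False), (0.15, False), (0.18, False)]
cut, sens, spec = best_cutoff(data)
print(cut, sens, spec)
```

In practice the cutoff is a trade-off: the abstract's 25.6% threshold favoured sensitivity (81.5%) over specificity (60%), which is sensible when missing an infection is costlier than a false alarm.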

Identifying sepsis in the emergency room: the best clinical and laboratory variables

B Gardlund, P Gille-Johnson, K Hansson

Karolinska Hospital, Stockholm, Sweden

Critical Care 2008, 12(Suppl 2):P176 (doi: 10.1186/cc6397)

Introduction Early diagnosis, antibiotics and supportive therapy are essential in sepsis. The diagnostic value of clinical and laboratory variables was evaluated in a prospective observational study. Methods A cohort of 404 adult patients admitted to the Department of Infectious Diseases from the emergency room (ER) for suspected severe infection was studied. A bacterial infection requiring antibiotic treatment was diagnosed in 306 patients (pneumonia 130 patients, urinary tract infection 80 patients, skin/soft tissue 43 patients, other bacterial infections 53 patients). Nonbacterial infections or noninfectious conditions were diagnosed in 82 patients. Significant bacteremia was detected in 68 patients (most common isolates: pneumococci 19, Escherichia coli 18, Staphylococcus aureus eight, β-haemolytic streptococci seven). Physiological variables recorded were temperature, heart rate, blood pressure, respiratory rate (RR), oxygen saturation, urine output and cerebral status. Laboratory variables were C-reactive protein (CRP), lactate, bicarbonate, creatinine, urea, hemoglobin (Hb), white blood cells (WBC), neutrophils, platelets, International Normalized Ratio, D-dimer, albumin, bilirubin, procalcitonin (PCT), IL-6 and LPS-binding protein (LBP). Results The value of each variable in identifying patients with bacteremic sepsis or bacterial infections requiring antibiotics was evaluated. In a univariate analysis, PCT, IL-6, LBP, CRP, bilirubin and the maximum RR during the first 4 hours (RRmax 0-4 h) were significantly associated with bacteremia (P < 0.001), and CRP, PCT, IL-6, LBP, WBC, neutrophils, RRmax 0-4 h and Hb were associated with a bacterial infection (P < 0.001). In a multivariate logistic regression, PCT, RRmax 0-4 h, bilirubin and CRP each contributed significantly to the accurate prediction of bacteremia. To predict a bacterial infection, CRP, WBC, Hb and RRmax 0-4 h contributed significantly.
If patients with pneumonia were excluded, the RR still contributed significantly to the prediction of bacteremia. Conclusions The studied patients had a high level of suspicion of a serious infection. The patients without bacterial infections often had other inflammatory processes, sometimes mimicking sepsis. This is indeed a challenging population for a diagnostic variable to prove its value, but also one where it would be most needed. The results show that the RR is the best discriminatory physiological variable, that PCT is best suited to predicting bacteremia, and that CRP best predicts bacterial infections.

Procalcitonin, cytokine and NOx in diabetic ketoacidosis

S Suwarto, R Noer, M Oemardi, S Waspadji, K Inada

Cipto Mangunkusumo Hospital, Jakarta, Indonesia

Critical Care 2008, 12(Suppl 2):P177 (doi: 10.1186/cc6398)

Introduction Procalcitonin (PCT), cytokine and NOx concentrations differ between patients with systemic inflammatory response syndrome (SIRS) and sepsis. Diabetic ketoacidosis (DKA) is frequently accompanied by SIRS, and inflammatory cytokines can increase in the absence of infection. The aim of this study was to compare PCT, IL-6, IL-8, IL-10 and NOx between patients with SIRS and those with sepsis among DKA patients.

Methods Patients with DKA admitted to Dr Cipto Mangunkusumo Hospital, Jakarta, between 1998 and 1999 were retrospectively reviewed. Plasma IL-6, IL-8 and IL-10 levels were measured by commercially available ELISA-based kits. Nitrate and nitrite (NOx) levels were measured by Griess reagent. PCT concentrations were measured by the immunoluminometric method (PCT LIA; BRAHMS Aktiengesellschaft GmbH, Germany). Results Patients' characteristics are presented in Table 1. Clinical characteristics were similar in both groups. PCT and IL-6 on admission were higher in the sepsis group than in the SIRS group. The mean PCT was 0.1 ± 0.1 in the SIRS group and 26.5 ± 25.9 in the sepsis group (P < 0.05). IL-6 in the SIRS group ranged from 10 to 38 (median 24.7) and in the sepsis group from 15.9 to 562.1 (median 46.2; P < 0.05). Serum IL-8, IL-10 and NOx did not differ between the groups.

Table 1 (abstract P177)

Characteristics of 22 patients with DKA

             SIRS           Sepsis
Glucose      362.8 ± 49.5   372.9 ± 63.8
pH           7.2 ± 0.1      7.1 ± 0.2
Temperature  37.8 ± 0.8     37.7 ± 0.9
Leukocyte    16.7 ± 7.0     19.5 ± 8.0
Conclusions Most cases of DKA had signs of SIRS. PCT and IL-6 are useful markers to differentiate between SIRS and sepsis in ketoacidosis patients. References

1. Gogos CA, Giali S, Paliogianni F, et al.: Interleukin-6 and C-reactive protein as early markers of sepsis in patients with diabetic ketoacidosis or hyperosmosis. Diabetologia 2001, 44:1011-1014.

2. Reith HB, Mittelkotter U, Wagner R, Thiede A: Procalcitonin (PCT) in patients with abdominal sepsis. Intensive Care Med 2000, 26:S165-S169.

Assessment of procalcitonin values in deep mycosis associated with high β-D-glucan values

S Kikuchi, Y Suzuki, G Takahashi, M Kojika, N Sato, S Endo

Iwate Medical University, Morioka, Japan

Critical Care 2008, 12(Suppl 2):P178 (doi: 10.1186/cc6399)

Introduction Measurement of serum β-D-glucan values has come into widespread use in routine clinical practice as a means of diagnosing deep mycosis. We have previously reported on the usefulness of measuring procalcitonin (PCT) as a means of diagnosing infections and sepsis and assessing the severity of mycoses, and the fact that PCT values do not increase in deep mycoses that are single infections.

Methods In the present study we made simultaneous measurements of the PCT values of patients with hyper-β-D-glucanemia and assessed the results.

Results Fungi were isolated from every patient by local or blood culture. In 16 patients with β-D-glucan values of 100 pg/ml or more it was also possible to continuously measure both β-D-glucan values and PCT values, and six of them had β-D-glucan values that exceeded 1,000 pg/ml. There were eight patients with fungal infections alone, and all of them had PCT values below 0.5 ng/ml. There were four patients with mixed infections caused by fungi and Gram-negative bacteria, and three of them had PCT values of 0.5 ng/ml or more. There were also five cases of mixed infection by fungi and Gram-positive bacteria, and in three of them the PCT value exceeded 0.5 ng/ml. When there was a fungal infection alone, the PCT value never rose, even when the β-D-glucan value exceeded 1,000 pg/ml. No significant correlation was found between the β-D-glucan values and the PCT values. Conclusions Simultaneous measurement of β-D-glucan values and PCT values was shown to be useful in making the differential diagnosis between mycoses alone and mixed infections.

Elevation of procalcitonin in chronic dialysed patients

H Brodska, K Malickova, A Kazda, J Lachmanova, J Uhrova, T Zima

University Hospital, Prague, Czech Republic

Critical Care 2008, 12(Suppl 2):P179 (doi: 10.1186/cc6400)

Introduction In some chronic dialysed patients, without signs of infection, increased values of procalcitonin (PCT) are found. The aim of our work was to determine the relations of PCT with other markers of inflammation.

Methods Heparinized plasma from 35 chronically dialysed patients without infection was analysed before and after 4 hours of dialysis. Parameters were daily diuresis, diabetes mellitus (yes/no), secondary hyperparathyroidism (yes/no), IL-10, IL-12 (flow cytometry), calprotectin (spectrophotometry), PCT (ELFA; Brahms) and C-reactive protein (CRP) (turbidimetry; Modular). Statistics involved the Spearman correlation coefficient and the nonparametric Mann-Whitney test.

Results The PCT level was above 0.5 μg/l in seven patients and the maximal value was 4.9 μg/l. The median and the lower and upper quartiles were calculated before and after dialysis. During dialysis the values of PCT were not statistically different, and similarly the values of IL-10, IL-12, CRP and calprotectin. Calprotectin was significantly elevated in hemodialysed patients in comparison with blood donors (P < 0.001). Reference ranges: IL-10, 10-35%; IL-12, 20-40%; calprotectin, 0-12 μg/ml; CRP, <7 mg/l; PCT, <0.5 μg/l. See Table 1. Conclusions No significant change of IL-10, IL-12 and calprotectin during dialysis indicates no activation of monocytes, nor of polymorphonuclear cells. An elevated level of calprotectin confirms chronic persisting inflammation. There was no correlation between PCT and all other markers of inflammation. Elevation of

Table 1 (abstract P179)

Markers of inflammation: statistical parameters

              Median before  Median after  First; third quartiles before  First; third quartiles after
PCT           0.23           0.23          0.13; 0.375                    0.11; 0.397
IL-10         11.1           9.3           6.8; 21.9                      4.4; 18
IL-12         26.7           25.7          18.9; 38.6                     14.7; 32.8
Calprotectin  25             29.6          11.6; 41.1                     18.2; 68.3
CRP           6.0            6.0           3.0; 26                        3.0; 22
PCT in chronic dialysed patients is not caused by infection or systemic inflammation. We also found no correlation of the elevation of PCT with diuresis, diabetes mellitus or secondary hyperparathyroidism.

Role of procalcitonin in diagnostics of acute adrenal insufficiency

K Popugaev, I Savin, L Astafieva, O Gadjieva, V Tenedieva

NN Burdenko, Russian Academy of Medical Sciences, Moscow, Russian Federation

Critical Care 2008, 12(Suppl 2):P180 (doi: 10.1186/cc6401)

Introduction Acute adrenal insufficiency (AAI) with refractory arterial hypotension (RAH) is a rare but life-threatening complication after neurosurgery. Clinical and laboratory diagnostics of AAI are difficult and its treatment must be begun immediately. Hyperthermia, leukocytosis and increased C-reactive protein (CRP) are commonly revealed in these patients. RAH and the other abovementioned symptoms are also typical for patients with septic shock onset. These similarities delay the timely beginning of adequate therapy. We undertook a pilot study to elucidate the role of procalcitonin (PCT) in the diagnostics of AAI.

Methods RAH developed in three patients postoperatively: one patient had an aneurysm of the anterior cerebral artery, one a cavernoma of the midbrain, and one a clival chordoma. After the hemodynamic insult, clinical blood analysis was performed. PCT (LUMItest PCT; BRAHMS), CRP, electrolytes, glucose and cortisol were investigated. X-ray investigation and urine and cerebrospinal fluid examinations were performed.

Results Patients had hyperthermia (>38°C) and increased CRP (>90 mg/l); leukocytosis (>11 × 10⁹/l) was present in two patients and leucopenia in one case. Two patients had PCT < 0.5 ng/ml and one had 1-1.3 ng/ml (2 days after hydrocortisone administration, PCT was 0 ng/ml). No sites of infection were revealed. Hyponatremia, normoglycemia and a tendency to hyperkalemia were found. Cortisol was normal in two patients and low in one patient. Patients received a stress dose of hydrocortisone with sympathomimetics and infusion. The hemodynamics stabilized. Within 2-3 days patients were weaned from the sympathomimetics. There were no indications for antibiotics.

Conclusions PCT is normal in patients with AAI despite the presence of hyperthermia, leukocytosis and increased CRP. The results show that PCT helps in the timely diagnosis and treatment of AAI.

Use of procalcitonin as an aid to antibiotic prescribing in intensive care

A Fletcher, P Moondi

The Queen Elizabeth Hospital, Kings Lynn, UK

Critical Care 2008, 12(Suppl 2):P181 (doi: 10.1186/cc6402)

Introduction Procalcitonin (PCT) is increasingly used as a specific marker for bacterial infection and sepsis. It has been shown to increase the accuracy of sepsis diagnosis at an early stage. PCT levels are low in viral infections, chronic inflammatory disorders or autoimmune processes. PCT levels in sepsis are generally greater than 1-2 ng/ml and often reach values between 10 and 100 ng/ml. An audit was carried out to ascertain whether a change in antibiotic prescribing occurred when PCT results were used in conjunction with white cell count (WCC) and C-reactive protein (CRP). Methods The audit was carried out over a 1-month period. All patients with suspected infection had their WCC, CRP and PCT measured. The consultant intensivist was blinded to the PCT result

Figure 1 (abstract P181)

[Bar chart of antibiotic management decisions by category, before and after the PCT result was revealed.]
and asked for their management plan on the basis of all other blood tests and clinical assessment. The PCT result was then revealed and the management plan was re-evaluated. Results A total of 100 PCT tests were carried out on 30 patients. The PCT result changed management 26% of the time. The PCT result led to an omission of antibiotics that would otherwise have been given in eight of the 30 tests. The PCT result also led to continuation of antibiotics that otherwise would have been changed in 10 of 16 tests. The number of continuations of antibiotics was higher after PCT (65 post PCT and 52 pre PCT), but this is related to the fact that 10 of these would otherwise have had an alteration of antibiotics. See Figure 1.

Conclusions The use of PCT had a useful role in changing antibiotic prescribing in 26% of instances. Most commonly it led to antibiotics not being initiated or to avoiding a change of antibiotics once started. References

1. Tang et al.: Accuracy of procalcitonin for sepsis diagnosis. Lancet Infect Dis 2007, 7:210-217.

2. Uzzan et al.: Procalcitonin as a diagnostic test for sepsis in critically ill adults. Crit Care Med 2006, 34:1996-2003.

3. BRAHMS PCT International [].

Procalcitonin to guide length of antibiotic therapy in surgical intensive care patients

S Schroeder, M Hochreiter, T Koehler, T Von Spiegel

Westkuestenklinikum Heide, Germany

Critical Care 2008, 12(Suppl 2):P182 (doi: 10.1186/cc6403)

Introduction The development of resistance by infective bacterial species encourages us to reconsider the indications for and administration of the available antibiotics. Proper recognition of the indication and the correct duration of therapy are particularly important for the use of highly potent substances in intensive care. There has as yet been no clinical chemistry parameter capable of specifically distinguishing a bacterial infection from a viral or noninfectious inflammatory reaction. It now appears that procalcitonin (PCT) offers this possibility [1-3]. The present study is intended to clarify whether PCT can be used to guide antibiotic therapy in surgical intensive care patients.

Methods One hundred and ten patients in a surgical intensive care ward receiving antibiotic therapy after confirmed infection or a high-grade suspicion of an infection were enrolled in this study. In 57 of these patients a new decision was reached each day as to whether the antibiotic therapy should be continued after daily PCT determination and clinical judgement. The control group consisted of 53 patients with a standardised duration of antibiotic therapy over 8 days.

Results Demographic and clinical data were comparable in both groups. In the PCT group, however, the period of antibiotic therapy was significantly shorter compared with controls (5.9 ± 1.7 vs 7.9 ± 0.5 days, P < 0.001), without unfavourable effects on clinical outcome. Conclusions The daily determination of PCT for intensive care patients shortened the duration of antibiotic therapy. There were no unfavourable effects on the outcome. References

1. Christ-Crain M, Muller B: Swiss Med Wkly 2005, 135:451-460.

2. Harbarth S: Am J Respir Crit Care Med 2001, 164:396-402.

3. Oberhoffer M: Clin Chem Lab Med 1999, 37:363-368.
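The abstract does not state the daily decision criteria used in the PCT group. As a purely hypothetical sketch of what a PCT-guided stopping rule of this kind can look like, the example below combines an absolute threshold with a relative decline from the peak value; both thresholds are assumptions for illustration, not the study's protocol.

```python
# Hypothetical sketch of a daily PCT-guided antibiotic stopping rule.
# The thresholds (stop below 1.0 ng/ml, or after a >= 65% fall from the
# peak) are invented for illustration; the abstract does not specify them.

def continue_antibiotics(pct_series_ng_ml, abs_stop=1.0, rel_drop=0.65):
    """Return True if therapy should continue after today's PCT value."""
    today = pct_series_ng_ml[-1]
    peak = max(pct_series_ng_ml)
    if today < abs_stop:
        return False                      # absolute stopping threshold reached
    if peak > 0 and (peak - today) / peak >= rel_drop:
        return False                      # sufficient relative decline
    return True

daily_pct = [12.0, 9.5, 6.1, 3.4]         # ng/ml, days 1 to 4
print(continue_antibiotics(daily_pct))
```

Rules of this shape make the trade-off explicit: a slowly falling PCT keeps therapy running, while a steep decline or a near-normal value supports stopping, subject always to clinical judgement as the abstract emphasises.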

Validation of bedside procalcitonin measurement as a marker of infection in intensive therapy patients

M Vaisman, R Lima, C Filho, M Dourado, A Castro, H Torres, I Barbosa, D Castro, J Machado

Samaritano Hospital, Rio de Janeiro, Brazil

Critical Care 2008, 12(Suppl 2):P183 (doi: 10.1186/cc6404)

Introduction Elevation of the serum concentration of procalcitonin (PCT) has been proposed as a marker of disease severity and is associated with systemic infection. This association has led to the proposed use of PCT as a novel biomarker for bacterial sepsis. We sought to compare the PCT measurement with culture samples in order to confirm sepsis quickly and rapidly begin the use of antibiotics. Methods Between September 2006 and March 2007 we evaluated 82 blood samples from 82 patients - 48 males (80.33 ± 10.55 years old) and 34 females (81.17 ± 13.83 years old) - with sepsis or SIRS in the adult ICU of a tertiary hospital. The PCT levels were measured by a quantitative immunoturbidimetric method (PCTL) in ng/ml (Lumitest PCT; Brahms, Germany) and the results compared with a sample culture (blood, urine, tracheal secretion and others).

Results With a cutoff of PCT levels at 2 ng/ml and positive or negative sample cultures, the analysis found a sensitivity of 37%, specificity of 92%, positive predictive value of 0.84, negative predictive value of 0.40, positive likelihood ratio of 4.62 and negative likelihood ratio of 0.68. With a cutoff of PCT levels at 0.5 ng/ml, the analysis found a sensitivity of 72%, specificity of 33%, positive predictive value of 0.54, negative predictive value of 0.48, positive likelihood ratio of 1.07 and negative likelihood ratio of 0.84. Conclusions This preliminary analysis suggests that PCT can be used for accurate early identification of sepsis only at levels above 2 ng/ml, allowing a decision to rapidly begin the use of antibiotics. In patients with PCT < 2 ng/ml we cannot use PCT to exclude the diagnosis of sepsis; with the cutoff of 0.5 ng/ml we found the same result. Other studies with more samples are necessary to confirm this conclusion. Reference

1. Giamarellos-Bourboulis EJ, Mega A, Grecka P, et al.: Procal-citonin: a marker to clearly differentiate systemic inflammatory response syndrome and sepsis in the critically ill patient? Intensive Care Med 2002, 28:1351-1356.
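The sensitivity, specificity, predictive values and likelihood ratios quoted above all follow from the 2 × 2 table of PCT result against culture result. A minimal sketch of those calculations follows; the counts below are hypothetical, since the abstract does not report the raw table.

```python
# Illustrative sketch: standard diagnostic-test metrics from a 2x2 table.
# tp/fp/fn/tn counts below are invented, not the study's data.

def diagnostic_metrics(tp, fp, fn, tn):
    sens = tp / (tp + fn)           # sensitivity
    spec = tn / (tn + fp)           # specificity
    ppv = tp / (tp + fp)            # positive predictive value
    npv = tn / (tn + fn)            # negative predictive value
    lr_pos = sens / (1 - spec)      # positive likelihood ratio
    lr_neg = (1 - sens) / spec      # negative likelihood ratio
    return sens, spec, ppv, npv, lr_pos, lr_neg

# Hypothetical counts for a PCT cutoff of 2 ng/ml
sens, spec, ppv, npv, lrp, lrn = diagnostic_metrics(tp=18, fp=4, fn=30, tn=30)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} "
      f"PPV={ppv:.2f} NPV={npv:.2f} LR+={lrp:.2f} LR-={lrn:.2f}")
```

Note that, unlike sensitivity and specificity, the predictive values depend on the prevalence of positive cultures in the sample, which is one reason the two cutoffs in the abstract behave so differently.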

Prognostic value of raised procalcitonin when combined with routine biomarkers of sepsis among critically ill patients

J Louw, T Veenith, P Moondi

Queen Elizabeth Hospital, Kings Lynn, UK

Critical Care 2008, 12(Suppl 2):P184 (doi: 10.1186/cc6405)

Introduction In sepsis the timing of treatment is vital for the survival of patients. Procalcitonin (PCT) by itself cannot reliably differentiate sepsis from other noninfective causes [1]. PCT, however, may help to identify critically ill patients with poor prognosis when used in combination with other markers, such as C-reactive protein (CRP) and white cell count (WCC). The aim of this retrospective study was to look at the prognosis of patients admitted to the ICU with a raised PCT >10 ng/ml. A novel approach for prediction of prognosis and severity may be to combine the biomarkers with PCT.

Methods We looked at all patients with a raised PCT (>10 ng/ml) admitted to a general ICU in a district general hospital over a period of 17 months. The total number of patients admitted over this time was 976 (surgical patients 67% and medical patients 33%), with a corresponding unit mortality of 16% and a hospital mortality of 21%. The corresponding WCC and CRP were noted. Our patients had similar SOFA and IPS scores, so they were comparable with each other.

Results The overall mortality of patients with a PCT >10 ng/ml was 28%, compared with our ICU mortality of 16%. When the biomarkers were combined, the mortality of patients with all three biomarkers raised (abnormal WCC, increased CRP, increased PCT) was 30%. The mean length of stay in patients with all biomarkers raised was 10.5 days, compared with 6.5 days in patients with an isolated marker (isolated rise of WCC or CRP).

Conclusions These results support the use of PCT as a prognostic marker in the critically ill, but also emphasize the role of CRP for added accuracy in predicting mortality. The WCC seems to have less significance as a predictive indicator. This may even be important in an inpatient setting to identify high-risk patients for early intervention. Reference

1. Tang et al.: Accuracy of procalcitonin for sepsis diagnosis in critically ill patients: systematic review and meta-analysis. Lancet Infect Dis 2007, 7:210-217.

Procalcitonin in elective colorectal surgery and its predictive value for an early discharge of fast-track patients

A Chromik1, W Uhl1, A Thiede2, H Reith2, U Mittelkotter1

1St Josef University Hospital, Bochum, Germany; 2University Hospital of Würzburg, Germany

Critical Care 2008, 12(Suppl 2):P185 (doi: 10.1186/cc6046)

Introduction Procalcitonin (PCT) is regarded as a specific indicator of bacterial infection. Infectious complications in patients after colorectal surgery are a common cause of morbidity and mortality. The aim of this study was to investigate whether PCT could serve as a negative predictive marker for postoperative complications, and whether, in patients with elevated PCT levels, preemptive treatment with the third-generation cephalosporin ceftriaxone is superior to antibiotic treatment started later on the appearance of clinical signs and symptoms of infection.

Methods By screening 250 patients undergoing colorectal surgery we identified 20 patients with PCT serum levels >1.5 ng/ml on at least two of the first three postoperative days. The remaining 230 patients were followed up for the occurrence of infectious complications. The 20 patients with elevated PCT were included in a prospective randomised pilot study comparing preemptive antibiotic treatment with ceftriaxone versus standard treatment.

Results The negative predictive value of PCT for systemic infectious complications was 98.3%. In patients receiving preemptive antibiotic treatment (ceftriaxone), both the incidence and the severity of postoperative systemic infections were significantly lower compared with those in the control group (Pearson's chi-squared test, P = 0.001 and P = 0.007, respectively). Major differences were also observed with respect to the duration of antibiotic treatment and the length of hospital stay.

Conclusions PCT is an early marker for systemic infectious complications after colorectal surgery, with a high negative predictive value. A significant reduction in the rate of postoperative infections in patients with elevated PCT serum concentrations was achieved by means of preemptive antibiotic treatment.

Endotoxin adsorption method may affect serum procalcitonin

T Ikeda, K Ikeda, T Ueno, S Suda

Tokyo Medical University, Hachioji Medical Center, Tokyo, Japan

Critical Care 2008, 12(Suppl 2):P186 (doi: 10.1186/cc6407)

Introduction Septic patients who progress to endotoxin shock have a high mortality rate. The mortality rate of septic patients with multiple organ failure has been reported to be 30-80%. The endotoxin adsorption method (PMX-DHP; Toray Industries, Inc., Tokyo, Japan) has been used in Japan for the treatment of patients with severe sepsis and septic shock primarily caused by Gram-negative infections. We have also reported that PMX-DHP removed plasma endotoxin and improved hemodynamic parameters in clinical trials [1]. The purpose of this study was to assess the changes in procalcitonin (PCT) values during PMX-DHP.

Methods This retrospective study was carried out in our ICU. In this study, 68 septic patients who had multiple organ failure due to intra-abdominal infection were treated with PMX-DHP. Sepsis was diagnosed according to the criteria of the ACCP/SCCM Consensus Conference Committee. These patients were separated into two groups: those who survived for at least 28 days after the start of PMX-DHP therapy (survival group; 49 patients), and those who did not (nonsurvival group; 19 patients). Background factors and inflammatory mediators were examined in each group. PCT was measured by immunoluminometric assay before and after PMX-DHP and 24 hours later. The luminometer used was an Autolumat LB953 (Berthold, Bad Wildbad, Germany). Endotoxin (kinetic turbidimetric method) was also measured just before and immediately after PMX-DHP.

Results The 28-day survival rate was 72.1% (49 survivors, 19 nonsurvivors). The APACHE II scores were 22.0 ± 8.3 and 28.3 ± 7.0, and the Sequential Organ Failure Assessment scores were 9.1 ± 3.9 and 11.1 ± 2.8, in the survival and nonsurvival groups, respectively, showing significantly higher scores in the nonsurvival group. PCT before PMX-DHP in all patients was 59.1 ± 97.2 ng/ml and tended to decrease to 54.7 ± 81.7 ng/ml after PMX-DHP. PCT was 59.4 ± 109.4 ng/ml before PMX-DHP and significantly decreased to 42.2 ± 67.4 ng/ml at 24 hours after PMX-DHP in the survival group, but it did not change significantly in the nonsurvival group. There was a significant correlation between endotoxin and PCT (r = 0.527, P < 0.001).

Conclusions Our results may suggest that PMX-DHP can reduce systemic inflammatory cytokines and serum PCT in the survival group.

Reference
1. Ikeda T, et al.: Clinical evaluation of PMX-DHP for hypercytokinemia caused by septic multiple organ failure. Ther Apher Dial 2004, 8:293-298.

Discriminative procalcitonin values for diagnosis of systemic inflammatory response syndrome and sepsis in the ICU

H Gharsallah, I Labbene, B Fatthallah, N Hichri, K Lamine, M Ferjani

Military Hospital of Tunis, Tunisia

Critical Care 2008, 12(Suppl 2):P187 (doi: 10.1186/cc6408)

Introduction No study has found a procalcitonin level that distinguishes between systemic inflammatory response syndrome (SIRS) and sepsis [1]. The goal of our study was to determine a cutoff value of serum procalcitonin concentration for diagnosis of SIRS and sepsis.

Methods In this prospective and observational study, we included all patients admitted to the ICU. The procalcitonin level was determined on day 0, day 2, day 4 and day 7 of hospitalization using the immunoluminometric method (PCT-lumin; Brahms Diagnostica, Berlin, Germany). Normal values are <0.1 ng/ml. The detection limit was 0.3 ng/ml. P < 0.05 was considered significant. Time points were defined as the procalcitonin concentrations measured at different times in all patients. Time points were assigned to SIRS, sepsis or septic shock (SS) according to the established ACCP/SCCM consensus definition. Statistical analysis was performed using SPSS for Windows, version 10.

Results A total of 70 patients were included in our study. Two hundred and sixty-five time points were categorized into three groups (SIRS, sepsis and SS). The mean IGS II score was 32 ± 14; the mean APACHE II score, 15 ± 7. The median procalcitonin levels in the SIRS group and the sepsis + SS group were, respectively, 0.325 and 1.115 ng/ml (P < 0.001). The area under the ROC curve to distinguish the presence or absence of sepsis was 0.745 (0.685-0.805). A cutoff value of 1.3 ng/ml has a specificity of 89% and a sensitivity of 45%. If we exclude patients with SS, a cutoff value of 1.3 ng/ml has the same specificity (89%) and was less sensitive (58%). A cutoff value of 0.265 ng/ml had a specificity of 58% and a sensitivity of 80%.

Conclusions According to these preliminary results, a procalcitonin level between 0.265 and 1.3 ng/ml cannot distinguish between SIRS and sepsis. This can be explained by the fact that we considered different time points in the same patients and by the fact that we did not separate medical and surgical patients.

Reference

1. Simon L, Gauvin F, Amre DK, et al.: Clin Infect Dis 2004, 39:206-217.
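The cutoff analysis reported above (specificity and sensitivity of a PCT threshold) amounts to a simple confusion-matrix computation. A minimal sketch in Python, using invented illustrative values rather than the study's data (the function name is ours):

```python
# Sensitivity and specificity of a biomarker cutoff: classify a value
# above the cutoff as "sepsis" and compare against the true labels.

def cutoff_performance(values, has_sepsis, cutoff):
    """Return (sensitivity, specificity) for value > cutoff => sepsis."""
    tp = sum(1 for v, s in zip(values, has_sepsis) if s and v > cutoff)
    fn = sum(1 for v, s in zip(values, has_sepsis) if s and v <= cutoff)
    tn = sum(1 for v, s in zip(values, has_sepsis) if not s and v <= cutoff)
    fp = sum(1 for v, s in zip(values, has_sepsis) if not s and v > cutoff)
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative PCT values (ng/ml) and sepsis labels - not study data:
pct = [0.2, 0.4, 1.5, 2.0, 0.9, 3.1, 0.3, 1.1]
sepsis = [False, False, True, True, False, True, False, True]
sens, spec = cutoff_performance(pct, sepsis, 1.3)
```

Sweeping the cutoff over all observed values and plotting sensitivity against 1 - specificity yields the ROC curve whose area is quoted in the Results.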

Measurement of procalcitonin in bronchoalveolar lavage and serum as early predictors in acute respiratory distress syndrome

A Abdel Razek, E Abdel Moneim Arida, A Deghady

Alexandria Faculty of Medicine, Alexandria, Egypt

Critical Care 2008, 12(Suppl 2):P188 (doi: 10.1186/cc6409)

Introduction Procalcitonin (PCT) is a diagnostic marker for identifying severe bacterial infections and reliably indicating complications secondary to systemic inflammation. PCT levels increase in cases of sepsis, septic shock and severe systemic inflammatory reactions. The early pathological features of acute respiratory distress syndrome (ARDS) are generally described as diffuse alveolar damage, which can be diagnosed by cytological examination of bronchoalveolar lavage fluid (BAL) [1]. The aim of this study was to evaluate the value of PCT measurement in BAL and serum in the early diagnosis of ARDS, and to facilitate reliable follow-up of its clinical course.

Methods This study included 35 patients admitted to the Critical Care Department at Alexandria Hospital. Patients were allocated into two groups, the study group (25 cases) and the control group (10 cases). A plain chest X-ray was performed, and the hypoxic index (PaO2/FiO2) was calculated daily. PCT in BAL and blood was measured on days 0, 3 and 6 from the diagnosis of ARDS in the patient group using an automated immunofluorescent assay (BRAHMS KRYPTOR PCT). Disease severity was assessed daily during the patient's stay using the Acute Lung Injury Score and the Multiple Organ Dysfunction Score (MODS).

Results In the ARDS group the mean serum PCT was higher than in the control group, 6.86 ± 3.34, and increased nonsignificantly after 3 days to 8.06 ± 7.21 (P = 0.195) and after 6 days to 8.60 ± 9.49 (P = 0.232). On the other hand, there was no significant change between the study and control groups for BAL PCT. There was a significant direct correlation between serum PCT and the Murray score on diagnosis of ARDS, on day 3 and on day 6. All were significantly higher in nonsurvivors compared with survivors.

Conclusions Serum PCT is helpful in the diagnosis of ARDS, but there was no significant change in the value of BAL PCT. A direct correlation exists between serum PCT, the MODS and the Murray score in ARDS patients.

References


1. Beskow CO, Drachenberg CB, Bourquin PM, et al.: Diffuse alveolar damage. Morphologic features in bronchoalveolar lavage fluid. Acta Cytol 2000, 44:640-645.

2. Estenssoro E, Dubin A, Laffaire E, et al.: Incidence, clinical course, and outcome in 217 patients with acute respiratory distress syndrome. Crit Care Med 2002, 30:2450-2456.

Correlation of endotoxin, procalcitonin and C-reactive protein patterns with ICU admission

R Otero, J Crawford, A Suarez, E Rivers, J Yang

Henry Ford Hospital, Detroit, MI, USA

Critical Care 2008, 12(Suppl 2):P189 (doi: 10.1186/cc6410)

Introduction The ability to stratify patients with evidence of sepsis and to determine appropriate ICU admission is often hampered by an inadequate history and a paucity of physical examination findings. The literature is replete with markers used to determine the presence and severity of sepsis. Among the proposed biomarkers, procalcitonin (PCT), C-reactive protein (CRP) and, recently, the endotoxin activity assay (EAA) have been proposed as tools to aid in the diagnosis of sepsis of various etiologies. The purpose of this investigation was to compare the ability of baseline EAA, PCT and CRP levels above specified thresholds to correlate with ICU admission.

Methods This is a secondary analysis of a prospective observational study of patients qualifying for early goal-directed therapy for severe sepsis. Emergency department patients enrolled were >18 years old, with at least two SIRS criteria and evidence of infection. From a total of 95 patients (25 nonsevere sepsis and 70 severe sepsis (lactate >4 mmol/l)) who were enrolled, 92 had complete data. Descriptive statistics are provided for ICU and non-ICU patients. Non-normally distributed data for disposition and values for EAA, PCT and CRP were analyzed by Spearman correlation. An alpha level <0.05 was considered statistically significant.

Results The mean EAA value for patients was 0.54 (SD = 0.23, n = 92), PCT 19.72 (SD = 54.82, n = 91) and CRP 122.29 (SD = 94.14, n = 86). Based upon baseline values for EAA (>0.6), PCT (>5 ng/ml) and CRP (>5 mg/l), no statistically significant correlations were found between an elevated EAA, PCT or CRP and ICU admission (P = 0.67, 0.16 and 0.67, respectively). Similarly, the combination of EAA (>0.6) and lactate (>2.0 mmol/l) did not correlate with a significantly higher rate of ICU admission (P = 0.86). When the maximum level of EAA was followed throughout the first 72 hours of evaluation and treatment, however, there was a trend towards higher ICU admission (P = 0.054).

Conclusions In this analysis, baseline levels of EAA, PCT and CRP in patients with severe sepsis showed no statistically significant correlation with ICU admission. Evaluation of the maximum value of EAA did display a trend towards higher ICU admission. Possible explanations of this discrepancy may point to heterogeneity in infectious etiologies and the presence of comorbidities that complicate interpretation of biomarker data.
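The Spearman rank correlation used in the Methods above (appropriate for the non-normally distributed biomarker values) reduces to the Pearson correlation of the two rank vectors. A minimal plain-Python sketch, with illustrative data rather than the study's patient-level values (function names are ours):

```python
# Spearman rank correlation: rank both samples (ties share their mean
# rank), then compute the Pearson correlation of the rank vectors.

def ranks(xs):
    """Average ranks (1-based), with tied values sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        # extend j over the run of values equal to xs[order[i]]
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of ranks i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

In practice a library routine (e.g. `scipy.stats.spearmanr`) would also supply the P value; this sketch only shows where the coefficient comes from.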

Why measure endotoxin in septic shock patients?

G Monti, S Colombo, M Mininni, V Terzi, GM Ortisi, S Vesconi, G Casella

Niguarda Hospital, Milan, Italy

Critical Care 2008, 12(Suppl 2):P190 (doi: 10.1186/cc6411)

Introduction The aim of the present study was to evaluate the clinical utility of endotoxin activity (EAA) measurement in critically ill septic shock (SS) patients.

Methods From January 2007 to August 2007, in an eight-bed general ICU, we performed a prospective analysis of the EAA level in 29 critically ill patients within 24 hours of SS diagnosis (CDC criteria). The EAA level was assessed by a new and rapid assay based on neutrophil-dependent chemiluminescence. The EAA level (defined as low, intermediate and high for values <0.40, 0.40-0.60 and >0.60, respectively) was then correlated with severity of illness and ICU mortality.

Results The clinical profile of the SS patients is shown in Table 1. The EAA level was low in a minority of SS patients (13%), while intermediate and high EAA levels were found in 31% and 56% of SS patients, respectively (Table 2). Our results suggest a good correlation between EAA levels and severity of illness (Table 2). The EAA level also seems to correlate with ICU mortality, which was 0% in low-EAA patients, and 17% and 37% in intermediate-EAA and high-EAA patients, respectively.

Table 1 (abstract P190)

Clinical profile of septic shock patients

Age (years) 58.6 ± 16.4

CRP (mg/dl) 27 ± 10

Mechanical ventilation / CRRT 90% / 22.5%

SOFA / SAPS II 12.3 ± 3 / 47 ± 9

Gram-positive / Gram-negative 24% / 17%

Table 2 (abstract P190)

Endotoxin activity (EAA) level and severity of illness

EAA < 0.4 0.4 < EAA < 0.6 EAA > 0.6

MAP (mmHg) 86.8 ± 4.7 80.5 ± 5.8 79.3 ± 17.1

NE (μg/kg/min) 0.36 ± 0.21 0.43 ± 0.36 0.68 ± 0.53

Lac (mmol/l) 3.0 ± 1.5 3.9 ± 3.7 6.4 ± 5.8

SOFA 9.7 ± 5 10.1 ± 3. 12.6 ± 3.8

CI (l/min/m²) 3.78 ± 1.26 4.16 ± 1.14 3.64 ± 2.10

Conclusions Although our sample is too small to reach statistical significance, the EAA level could be a good marker of severity in SS patients. A high level of EAA seems to correlate with worse prognosis in SS patients.

Measuring endotoxin with newly developed endotoxin scattering photometry

Y Kase, A Endo, T Obata

Jikei University School of Medicine, Tokyo, Japan

Critical Care 2008, 12(Suppl 2):P191 (doi: 10.1186/cc6412)

Introduction Endotoxin scattering photometry (ESP) is a newly developed endotoxin assay. The mechanism of ESP is the same as that of the turbidimetric method, a conventional endotoxin assay in Japan; however, ESP can detect a very small amount of endotoxin within 1 hour. This is because ESP detects coagulin, the clotting enzyme product that appears first in the limulus amebocyte lysate cascade evoked by endotoxin [1].

Methods For measurement of clinical samples of endotoxin with ESP, three groups were examined: normal healthy volunteers (n = 14), patients undergoing elective surgery (n = 10) and patients with sepsis (n = 19) admitted to the ICU between February and September 2007. Sepsis was defined according to the American College of Chest Physicians/Society of Critical Care Medicine as systemic inflammatory response syndrome resulting from infection.

Results Using endotoxin measurement with ESP, the value was higher in patients with sepsis (median, 20.7 pg/ml (interquartile range, 5.1-64.1 pg/ml)) than in patients with elective surgery (0.259 pg/ml (0.050-0.875 pg/ml)) and in normal healthy volunteers (0.073 pg/ml (0.031-0.345 pg/ml)).

Conclusions Endotoxin was detectable in every clinical sample by ESP, whereas the turbidimetric method was positive in only 15% of patients with sepsis. These data suggest the potential value of measuring endotoxin with ESP for detecting a hidden infection or an early manifestation of Gram-negative infection.

Reference

1. Obata T, Nomura M, Kase Y, Sasaki H, Shirasawa Y: Early detection of the limulus amebocyte lysate reaction evoked by endotoxins. Anal Biochem 2008, 373:281-286.

Biochemical markers of the iron metabolism and their relationship with the inflammatory status in multiple trauma patients

E Pavlou, K Makris, A Palaiologou, O Tsimpoukidou, P Sarafidou, A Kotouzas, M Stavropoulou, E Katsioula, I Drakopoulos, E Ioannidou

KAT General Hospital, Athens, Greece

Critical Care 2008, 12(Suppl 2):P192 (doi: 10.1186/cc6413)

Introduction Anaemia is a common problem in critically ill patients. The pathophysiology of anaemia involves altered iron metabolism and impaired erythropoiesis. In this study we compared the iron metabolism in septic and nonseptic patients on admission and during their stay in the ICU.

Methods Sixty polytrauma patients under mechanical ventilation were studied: 34 septic patients (Group I) and 26 nonseptic patients (Group II). The mean age of all patients was 51 ± 19 years, APACHE II score 13 ± 6, ISS 24 ± 11, and the mean ICU stay 25 ± 8 days. Blood samples were collected on admission, on the 7th day or the day of the onset of sepsis, and on the 15th day, and were tested for serum iron (Fe), ferritin (Ft), transferrin (Tf), and soluble transferrin receptor (sTfR). The measured inflammatory parameters were white blood cell count, C-reactive protein and procalcitonin. Statistical analysis involved Student's t test and linear regression analysis.

Results In Group I the mean values were Fe 22 μg/dl (12-78), Ft 877 ng/dl (85-4,797), and Tf 128 mg/dl (61-212). In Group II the mean values were Fe 43 μg/dl (23-97), Ft 377 ng/dl (36-1,127) and Tf 151 mg/dl (69-216). There was a statistical difference between the two groups (P < 0.05). No difference was observed between the two groups for sTfR: 1.08 mg/dl (0.44-2.16) vs 0.94 mg/dl (0.6-1.49). No correlation could be established between any of the markers of iron metabolism and patient outcome. For all patients, C-reactive protein was weakly correlated with Ft (r = 0.43, P < 0.001) and inversely with Tf (r = -0.39, P < 0.001).

Conclusions Iron metabolism is altered in patients who develop sepsis in the ICU, but this does not appear to determine patient outcome.

Compartmentalization of the inflammatory response in abdominal sepsis

E Grigoryev

Medical University, Kemerovo, Russian Federation

Critical Care 2008, 12(Suppl 2):P193 (doi: 10.1186/cc6414)

Introduction There are three forms of the translocation phenomenon in sepsis: translocation proximally into the small intestine, via the lymphatic route to peritoneal exudates and lymphatic collectors, and to the portal vein and hepatic circulation. Such factors of sepsis evolution are called decompartmentalization. The aim of our study was to investigate the prognostic value of several biochemical markers in peritoneal exudates in abdominal sepsis.

Methods One hundred and four patients with general peritonitis and abdominal sepsis were examined. According to the ACCP/SCCM Consensus Conference (1992), the patients were divided into three groups: sepsis (n = 34, a focus of infection and two SIRS symptoms; APACHE II 5 ± 1; SOFA 1.0 ± 0.5); severe sepsis (n = 50, sepsis + multiorgan dysfunction; APACHE II 15 ± 2; SOFA 4.5 ± 2.5); and septic shock (n = 20, severe sepsis + vasopressor agents; APACHE II 25 ± 6; SOFA 7.6 ± 3.5). We measured markers of SIRS in blood serum and in peritoneal exudates: TNFα (ELISA; DPC Biermann, Bad Nauheim, Germany), IL-1 (ELISA, LIA; Sangtec Medical, Bromma, Sweden) and lactoferrin (Vector Best, Russia). The data were analyzed by t test and Fisher criteria. P < 0.05 was considered statistically significant.

Results The sepsis group was characterized by a brief increase of TNF and IL-1 levels in blood serum on the first day (mean ± SD: TNF, 0.24 ± 0.1 pg/ml vs 0.1 ± 0.06 pg/ml; IL-1, 0.34 ± 0.12 pg/ml vs 0.1 pg/ml; significant). The severe sepsis group was characterized by an increase of TNF and IL-1 levels in blood serum, a considerable increase of the TNF level in peritoneal exudates (severe sepsis 0.56 ± 0.21 pg/ml vs sepsis 0.12 ± 0.08 pg/ml; significant), and a significant increase of the lactoferrin level in peritoneal exudates. The septic shock group was characterized by a low level of proinflammatory cytokines in blood serum, an increase of the IL-1 level in peritoneal exudates (septic shock 0.78 ± 0.24 pg/ml vs severe sepsis 0.54 ± 0.25 pg/ml vs sepsis 0.18 ± 0.09 pg/ml; significant), and a low concentration of lactoferrin in peritoneal exudates.

Conclusions An unfavourable outcome in abdominal sepsis was associated with an increase of TNFα and IL-1 levels, and a decrease of the lactoferrin level, in peritoneal exudates.


Reference
1. Dugernier TL, Laterre PF, Wittebole X, et al.: Compartmentalization of the inflammatory response during acute pancreatitis. Correlation with local and systemic complications. Am J Respir Crit Care Med 2003, 168:148-157.

Early elevation of plasma soluble CD14 subtype, a novel biomarker for sepsis, in a rabbit cecal ligation and puncture model

M Nakamura, T Takeuchi, K Naito, K Shirakawa, Y Hosaka, F Yamasaki, S Furusako

Mochida Pharmaceutical Co., Ltd, Gotemba, Japan

Critical Care 2008, 12(Suppl 2):P194 (doi: 10.1186/cc6415)

Introduction To reduce the mortality rate of patients with sepsis, rapid diagnosis and therapeutic decisions are required. We therefore discovered the soluble CD14 subtype (sCD14-ST), which is specific for sepsis and is elevated at an early stage of disease progression [1]. Additionally, we have been investigating a novel fusion protein, MR1007, which consists of the modified light chain of inter-alpha inhibitor and an anti-CD14 antibody, as an anti-sepsis agent.

Methods We developed an ELISA using two rat monoclonal antibodies against N-terminal and C-terminal peptide sequences of rabbit sCD14-ST to determine sCD14-ST concentrations in rabbit plasma. Survival rates and the time courses of plasma levels of sCD14-ST, IL-6 and D-dimer were examined in a rabbit cecal ligation and puncture (CLP) model. Blood bacterial counts were also determined as colony-forming units.

Results The plasma sCD14-ST levels in seven dead animals clearly increased at 2 hours or later together with blood bacterial counts, peaked at 3 hours, and then gradually decreased at 4-8 hours, whereas those in the one surviving animal did not. The induction phase was about 24 minutes and the half-life ranged from 4 to 5 hours. Additionally, the plasma IL-6 and D-dimer levels in dead animals clearly increased at 3 hours or later, whereas those in the surviving animal did not. Intravenous administration of MR1007 with an antibiotic, latamoxef sodium, following the observation of increases in sCD14-ST levels and blood bacterial counts, improved survival and plasma D-dimer levels in the rabbit CLP model (n = 9, P < 0.05).

Conclusions Plasma sCD14-ST levels were elevated earlier than IL-6 and D-dimer, together with the appearance of bacteria in blood, in a rabbit CLP model. Therapy with an anti-sepsis agent such as MR1007 following the elevation of sCD14-ST improved the outcome in the CLP model. These results suggest that sCD14-ST is useful for determining earlier initiation of anti-sepsis therapy.

Reference

1. Yaegashi Y, et al.: J Infect Chemother 2005, 11:234-238.

Proinflammatory versus anti-inflammatory cytokine profiles as an early predictor of outcome in severe multiple trauma

E Apostolidou1, V Grosomanidis2, A Mouzaki3, E Xenou3, M Rodi3, C Gogos4, C Skourtis2, D Vassilakos2, M Giala2

1Mpodosakeio Hospital, Ptolemaida, Greece; 2Ahepa University Hospital, Thessaloniki, Greece; 3School of Medicine, Patras, Greece; 4Patras University Hospital, Rion-Patras, Greece

Critical Care 2008, 12(Suppl 2):P195 (doi: 10.1186/cc6416)

Introduction In the present study we investigated the early prognostic value of the serum levels of the main proinflammatory and anti-inflammatory cytokines and soluble cytokine inhibitors for mortality and late complications, such as sepsis and multiorgan failure (MOF), in a well-defined population of patients with severe trauma.

Methods A total of 62 previously healthy immunocompetent patients with severe multiple trauma (ISS > 16) admitted to the Emergency Room and aged less than 65 years were included during a period of 18 months. Sixty-four healthy individuals served as controls. Sera for sequential cytokine determination were obtained from patients on admission, and 12 hours and 24 hours after trauma. We used an ELISA kit for simultaneous quantitative determination of a wide spectrum of proinflammatory and anti-inflammatory cytokines (TNFα, IL-1β, IL-6, IL-10, sTNFR type I and type II, IL-1ra and TGFβ). All patients were evaluated clinically and microbiologically and were followed up for clinical outcome until discharge from the hospital.

Results The patient characteristics (57 men and 5 women) were age 34.51 ± 11.65 years and ISS 22.16 ± 12.43. The mortality rate was 11.29%, MOF 22.58%, ARDS 8.06% and sepsis 33.87%. On admission, trauma patients had significantly higher levels of IL-6, IL-10, sTNFRII, IL-1ra and TGFβ than did controls. Among the various cytokines, IL-6 (admission, 12 hours, 24 hours) and IL-10 (24 hours) were most closely related to the severity of trauma and the ISS (P < 0.001). Elevated serum IL-6 (24 hours), TGFβ (admission) and IL-1ra (24 hours) were associated with intrahospital death; higher levels of IL-6 (24 hours), IL-10 (24 hours), sTNFRI (24 hours), sTNFRII (12 hours and 24 hours) and IL-1ra (24 hours) were detected in patients who later developed sepsis; and higher levels of IL-6 (admission, 12 hours and 24 hours), IL-10 (12 hours and 24 hours) and IL-1ra (12 and 24 hours) were detected in patients who later developed MOF. In the multivariate analysis, higher values of IL-6 (12 hours and 24 hours) were detected in sepsis and MOF (P = 0.006 and P = 0.029, respectively). In addition, a significant decline in IL-10 at 12 hours and 24 hours was observed in patients without sepsis and MOF, as was a decline in IL-1ra at 24 hours in survivors.

Conclusions The levels of IL-6, as well as sustained IL-10 and IL-1ra production, may predict death and late complications as early as the first 24 hours following severe trauma.

Immunoparalysis in patients with acute respiratory distress syndrome

A Lahana1, E Galiatsou2, G Nakos2

1NIMTS Hospital, Athens, Greece; 2University Hospital, Ioannina, Greece

Critical Care 2008, 12(Suppl 2):P196 (doi: 10.1186/cc6417)

Introduction Dysregulation of innate immunity may contribute to both the initiation and progression of acute respiratory distress syndrome (ARDS) [1]. Deactivation of alveolar macrophages (AMs), expressed as reduced HLA-DR surface molecules, has been associated with a higher mortality rate in patients with acute lung injury [2]. Our aim was to investigate the immune status in the lungs and systemically in early ARDS, by evaluating AM and peripheral blood monocyte (PBM) HLA-DR expression.

Methods Forty-one mechanically ventilated patients, 34 with early ARDS and seven without lung disease (controls), were studied. On the third day after the onset of ARDS, all patients underwent fiberoptic bronchoscopy. Bronchoalveolar lavage fluid was obtained and, besides differential cell analysis, AM HLA-DR expression was evaluated. At the same time, peripheral blood samples were obtained for evaluation of HLA-DR expression on PBMs. The three-step immunoperoxidase method was applied using a streptavidin-biotin complex kit and a monoclonal mouse anti-human HLA-DR antibody. Levels of HLA-DR expression were determined as the percentage of cells with positive cytoplasmic staining relative to the total number of cells.

Results Patients were characterized as having direct ARDS (group A, 17 patients) or indirect ARDS (group B, 17 patients). In both groups, percentages of polymorphonuclear cells and lymphocytes were higher, while AM percentages were lower, in comparison with the control group. HLA-DR expression on AMs in both ARDS groups was lower than in controls (19.9 ± 11.4% (group A) and 32.1 ± 10.4% (group B) vs 56.4 ± 10.5% (control); P < 0.05). AM HLA-DR expression in group A was lower than in group B (P < 0.05). PBM HLA-DR expression in both ARDS groups was lower than in controls (38.06 ± 15.7% (group A) and 27.5 ± 12.6% (group B) vs 54.1 ± 15.4% (control); P < 0.05). PBM HLA-DR expression in group B was lower than in group A (P = 0.01).

Conclusions In early ARDS, HLA-DR expression on AMs as well as on PBMs was low. In direct ARDS, however, local immunoparalysis was more profound, while more intense peripheral monocyte deactivation was observed in the indirect syndrome. Understanding the immune dysfunction in ARDS may allow the assessment of novel treatments in an attempt to modify lung injury.

References

1. Gunther A, et al.: Am J Respir Crit Care Med 1996, 153:176-184.

2. Pugin J, et al.: Crit Care Med 1999, 27:304-312.

Transforming growth factor beta 1 gene transcription in infection and severe sepsis displays distinguishing characteristics

M White1, MJ O'Dwyer1, R Grealy1, P Stordeur2, B O'Connell1, DK Kelleher1, R McManus3, T Ryan1

1St James Hospital, Dublin, Ireland; 2Hôpital Erasme, Université Libre de Bruxelles, Brussels, Belgium; 3Trinity Centre for Health Sciences, Dublin, Ireland

Critical Care 2008, 12(Suppl 2):P197 (doi: 10.1186/cc6418)

Introduction Transforming growth factor beta (TGFβ) is a pleiotropic cytokine that promotes a CD4 Th1 response to infection. We examined the gene expression of TGFβ by quantitative RT-PCR in three study groups: 10 healthy controls, 15 patients with Gram-negative bacteraemia but without severe sepsis, and 58 patients with severe sepsis.

Methods Blood samples were collected from healthy controls at one time point. In bacteraemic patients, blood sampling was carried out within 24 hours of the positive blood culture being reported. In the 58 patients presenting with severe sepsis, blood sampling was carried out on day 1 of intensive care admission and on day 7 in survivors. Mononuclear cells were isolated and TGFβ mRNA was quantified by quantitative RT-PCR. All values are stated as the median and interquartile range. Between-group comparisons were performed by the Wilcoxon rank sum test.

Results TGFβ mRNA copy numbers were significantly reduced in the bacteraemic group (1.99 x 10^6; 2.22 x 10^6 to 1.92 x 10^6) compared with controls (3.8 x 10^6; 4.1 x 10^6 to 2.9 x 10^6), P = 0.01, and significantly reduced in the sepsis group (1.97 x 10^6; 2.8 x 10^6 to 0.76 x 10^6) compared with the control group, P = 0.009. While median TGFβ copy numbers were similar in the sepsis and bacteraemia groups, 18 of 58 (30%) patients with sepsis had TGFβ copy numbers less than the lowest of the bacteraemic group (P = 0.02). In the sepsis group, 19 patients died. There was no association between TGFβ mRNA copy numbers and outcome measures such as mortality, the presence of shock after prolonged sepsis, duration of vasopressor support, duration of mechanical ventilation and duration of intensive care stay.

Conclusions The human host response to infection is related to a distinct pattern of TGFβ gene transcription, with deficient TGFβ gene transcription related to the occurrence of infection and the onset of septic shock rather than to recovery from a shocked state or survival. This information could be used to structure genomic studies in sepsis and infection.

Accumulation of advanced glycation end products in intensive care patients

WL Greven, J Smit, JH Rommes, PE Spronk

Gelre Hospitals, Apeldoorn, The Netherlands

Critical Care 2008, 12(Suppl 2):P198 (doi: 10.1186/cc6419)

Introduction Oxidative stress plays an important role in the course and eventual outcome of a majority of patients admitted to the ICU. Markers to estimate oxidative stress are not readily available in a clinical setting. Recently, advanced glycation end products (AGEs), compounds that accumulate with age and play an important role in the development of end-organ damage in several conditions, have emerged as one of the very few stable end products of oxidative stress. Skin autofluorescence (AF) is a validated marker of the tissue content of AGEs, and can be measured rapidly and noninvasively. We hypothesized that AGEs, measured by skin AF, accumulate in ICU patients and are a prognostic factor for outcome.

Methods Skin AF was measured using an AGE reader in 40 consecutive ICU patients (including a small subgroup of five diabetic patients), age >18 years. As a comparison, historical data from a nondiabetic control group (n = 231) and a diabetic control group (n = 973) were used to calculate age-adjusted AF levels (AF-adj). Values are expressed as the median and interquartile range (P25-P75). Differences between groups were tested by the Mann-Whitney U test. P < 0.05 was considered statistically significant.

Results AF-adj values were higher in nondiabetic ICU patients (0.333 (0.002-0.676)) than in nondiabetic controls (-0.070 (-0.290 to 0.240); P < 0.001). AF-adj values were also higher in diabetic ICU patients (0.770 (0.566-0.892)) compared with diabetic controls (0.000 (0.000-0.000); P < 0.001). No differences in skin AF were observed between acute and planned admissions, nor was skin AF related to severity of disease as estimated by the APACHE II score, length of ICU and hospital stay, or mortality.

Conclusions Acute AGE accumulation occurs in ICU patients, probably reflecting oxidative stress. The group was too small to allow any conclusions on the possible predictive value of skin AF for prognosis in the ICU. Further studies should reveal whether AGE accumulation is a useful parameter in ICU patients.
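The Mann-Whitney U statistic used above to compare AF-adj values between groups reduces to counting, over all cross-group pairs, how often one sample's value exceeds the other's (ties count half). A minimal sketch with illustrative numbers (the function name is ours; a library routine such as `scipy.stats.mannwhitneyu` would also supply the P value):

```python
# Mann-Whitney U: for every (x, y) pair across the two samples,
# score 1 if x > y, 0.5 on a tie, 0 otherwise. U ranges from 0 to
# len(xs) * len(ys); the null expectation is the midpoint.

def mann_whitney_u(xs, ys):
    """U statistic for sample xs against sample ys."""
    u = 0.0
    for x in xs:
        for y in ys:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u
```

Note that `mann_whitney_u(xs, ys) + mann_whitney_u(ys, xs)` always equals `len(xs) * len(ys)`, which is the usual consistency check.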

Total antioxidant status and lipid metabolism in patients with severe multiple trauma

M Muravyeva1, A Zhanataev2, V Moroz1, A Durnev2, V Reshetnyak1

1Science Research Institute of General Reanimatology and 2Zakusov's State Research Institute of Pharmacology, Russian Academy of Medical Sciences, Moscow, Russian Federation

Critical Care 2008, 12(Suppl 2):P199 (doi: 10.1186/cc6420)

Introduction The objective was to study the parameters of free radical processes and cholesterol metabolism in patients with severe multiple trauma (SMT).

Methods The investigation included 77 persons. The patients were divided into two groups in relation to the outcome of disease: group I, nonsurvivors; group II, survivors. The concentrations of lipid metabolic parameters, total antioxidant status (TAOS) and a number of biochemical plasma parameters were determined on a biochemical analyzer on days 1, 3, 5, 7 and 15. Very low-density lipoprotein and low-density lipoprotein (LDL) cholesterol were calculated. 8-Hydroxy-2'-deoxyguanosine was determined using gel electrophoresis of isolated blood cells.

Results The study indicated normal levels of 8-hydroxy-2'-deoxyguanosine in group II in the early period after SMT. There was a rise of this parameter on days 5 and 7 in group I. The TAOS was decreased in comparison with the normal range in both groups and tended to decrease further later. The level of total cholesterol was decreased in both groups during the first week after SMT. A rise of total cholesterol occurred in group II on day 15 (4.48 ± 1.81 mmol/l), while this parameter remained decreased in group I. The LDL cholesterol content in the first week after trauma tended to increase in group II and to decrease in group I. The study findings suggest that an LDL cholesterol level lower than 2.0 mmol/l during the first week after SMT, together with a decreased (<3.2 mmol/l) level of total cholesterol, is an unfavourable prognostic factor. There was a reduction of high-density lipoprotein cholesterol in the early period after trauma; this parameter, however, tended to increase in group II and to decrease in group I. There was a rise of GGT in group I, although total protein tended to decrease. Enhanced alkaline phosphatase activity was observed in both groups, and on day 15 was 1.5-2 times higher than the normal range.
Conclusions The dynamics of changes in total cholesterol, LDL cholesterol, total protein, GGT and 8-hydroxy-2'-deoxyguanosine can be used as prognostic factors in patients in the early period after SMT.
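The Methods above state only that VLDL and LDL cholesterol "were calculated". A common approach, and an assumption here, is the Friedewald approximation (mmol/l units; it is invalid when triglycerides exceed roughly 4.5 mmol/l); the input values below are purely illustrative:

```python
def friedewald(total_chol, hdl, triglycerides):
    """Estimate VLDL and LDL cholesterol (mmol/l) from a standard lipid panel.

    Friedewald approximation: VLDL ~= TG / 2.2 in mmol/l units (TG / 5 for mg/dl).
    Not valid when triglycerides exceed ~4.5 mmol/l.
    """
    vldl = triglycerides / 2.2
    ldl = total_chol - hdl - vldl
    return vldl, ldl

# Illustrative panel: total cholesterol 4.48 mmol/l (the group II day-15 value),
# hypothetical HDL and triglyceride values
vldl, ldl = friedewald(total_chol=4.48, hdl=1.1, triglycerides=1.65)
```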

Evaluation of plasma thiolic groups and reactive oxygen metabolites in critically ill patients

L Montini, M Antonelli, M Calabrese, C Rossi, A Minucci, S Persichilli, P De Sole

Universita Cattolica del Sacro Cuore, Roma, Italy

Critical Care 2008, 12(Suppl 2):P200 (doi: 10.1186/cc6421)

Introduction Free thiolic group (SH) and reactive oxygen metabolite (ROM) determination could provide helpful information on the balance between oxidative damage and antioxidant capacity [1]. In previous work we reported the change in the relationship between ROMs and SHs in a group of patients with severe sepsis [2]. In this work we report the ROM and SH values of ICU patients divided into three groups according to the severity of sepsis, to investigate a possible relationship between these parameters and the clinical state.

Methods Sixty patients admitted to the ICU were divided into three groups (sepsis, severe sepsis, septic shock). At least three determinations of ROMs and SHs per patient were assayed over 2-3 weeks. Controls were 20 surgical patients without complications, in whom blood for ROM and SH determinations was drawn within 24 hours after surgery. SH groups were assayed in plasma by Ellman's reaction with a spectrophotometric method applied to an automatic instrument (OLYMPUS AU 460) [3]. Plasma ROM values were assayed with a DIACRON-Italia kit applied to an automatic instrument (OLYMPUS AU 640).

Results The results obtained show a significant reduction of both plasma SHs and ROMs in the three groups according to their level of sepsis. The analysis of variability (coefficient of variation, CV) of ROMs shows a clear CV increase in the three patient groups (CV 40-60%) in comparison with the relatively low value in the control group (CV 20%). When the septic shock patients are divided into two groups according to their ROM levels (lower and higher than 150 Ucarr), the frequency of death during the observation time in the low-ROM group (12/20) is markedly higher than in the high-ROM group (3/17).

Conclusions This last result suggests that plasma ROM levels decrease significantly as the clinical situation worsens, and allows one to hypothesize a possible use of this parameter as a prognostic index.
References

1. Bergamini CM, et al.: Curr Pharm Des 2004, 10:1611-1626.

2. Montini L, et al.: 20th ESICM Annual Congress, Berlin, 7-10 October 2007 [abstract 0843].

3. Ellman G, Lysko H: A precise method for the determination of whole blood and plasma sulphydryl groups. Anal Biochem 1979, 93:98-102.
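The CV comparison in the Results above (about 20% in controls versus 40-60% in the patient groups) is the coefficient of variation. A sketch of that computation with purely illustrative ROM readings:

```python
from statistics import mean, stdev

def coefficient_of_variation(values):
    """CV expressed as a percentage: 100 * sample standard deviation / mean."""
    return 100.0 * stdev(values) / mean(values)

# Hypothetical ROM readings (Ucarr): a tight control group versus a dispersed
# patient group; the actual per-patient data are not given in the abstract
cv_controls = coefficient_of_variation([300, 310, 290, 305, 295])
cv_patients = coefficient_of_variation([120, 310, 90, 400, 180])
```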

Decreased apolipoprotein A1 levels correlate with sepsis and adverse outcome among ICU patients

E Pavlou1, K Makris1, A Palaiologou1, B Kaldis1, G Vrioni2, E Economou1, M Eforakopoulou1, L Zerva1, I Drakopoulos1, E Ioannidou1

1KAT General Hospital, Athens, Greece; 2Atticon, Athens, Greece
Critical Care 2008, 12(Suppl 2):P201 (doi: 10.1186/cc6422)

Introduction Although changes in lipoprotein levels occur in a variety of inflammatory disorders, little is known about lipoprotein metabolism among septic patients. This study investigated the dynamics of plasma apolipoprotein A1 (apoA1), as well as other inflammatory markers, in ICU patients with and without sepsis.

Methods Sixty patients (34 with sepsis and 26 without) on mechanical ventilation (mean age 51 ± 19.6 years, mean ICU stay 24 ± 18.8 days, APACHE II score 13 ± 6.8) admitted directly to our ICU were enrolled in our study. Three blood samples were collected, on day 0, on day 7 or the day of sepsis onset, and on day 15, for the determination of plasma apoA1, C-reactive protein and serum amyloid A levels by the nephelometric technique (BNProSpec; Dade-Behring).

Results Among septic patients, apoA1 levels decreased from 81.9 ± 28.3 mg/dl on day 0 to 56.2 ± 16.0 mg/dl on the day of sepsis onset (63.2 ± 14.4 and 50.6 ± 15.4 mg/dl for survivors and nonsurvivors, respectively). On day 15, surviving patients demonstrated increasing values (79.3 ± 16.4 mg/dl); the opposite was true for nonsurvivors (30.3 ± 15.4 mg/dl on the third sample). Among nonseptic patients, the apoA1 values were 92.8 ± 26.2 on day 0, 85.2 ± 19.3 on day 7, and 87.2 ± 20.9 mg/dl on day 15. Significantly different levels (paired Student's t test, P < 0.05) were detected between septic and nonseptic patients on day 7 or on the day of sepsis onset, between surviving and nonsurviving septic patients on the same day, and between surviving and nonsurviving septic patients on day 15. C-reactive protein and serum amyloid A concentrations showed no difference between patients who survived and those who died (second or third sample).

Conclusions Among ICU patients with sepsis, apoA1 concentrations decrease rapidly, but not in nonseptic patients. A low apoA1 level on the day of onset of sepsis appears to be a predictive factor for adverse outcome.

Coagulation in hospitalized community-acquired pneumonia: disturbances in even the least ill

MC Reade, EB Milbrandt, S Yende, SL Shook, L Kong, DC Angus, JA Kellum, for the GenIMS Investigators

University of Pittsburgh, PA, USA

Critical Care 2008, 12(Suppl 2):P202 (doi: 10.1186/cc6423)

Introduction Although previous studies of severe sepsis (SS) patients found coagulopathy quite common, little is known of coagulopathy in infection with lesser degrees of illness severity.

Methods In a 28-center prospective cohort study (GenIMS) of patients presenting to US emergency departments with community-acquired pneumonia, we measured serum coagulation markers (INR, partial thromboplastin time, platelets, antithrombin, D-dimer, factor IX, plasminogen activator inhibitor (PAI) and thrombin-antithrombin (TAT)) on emergency department presentation. We stratified the proportion of subjects with abnormal values by illness severity (APACHE III), subsequent development of SS, and 90-day mortality. We hypothesized that coagulation abnormalities would increase with illness severity and be greater in those with poor outcomes.

Results Of 1,895 hospitalized subjects, 31% developed SS and 11% died by day 90. The proportion with abnormal initial coagulation marker values increased with initial illness severity (Figure 1). Yet, even among the least ill (APACHE III mean (SD), 31 (7); ICU admission rate 6%), coagulation abnormalities were common. The day 1 percentages of abnormal PAI and TAT were greater in those who developed SS, while day 1 PAI, TAT, partial thromboplastin time and D-dimer were more often abnormal in those dying by day 90. Many subjects who neither developed SS nor died had evidence of coagulopathy at presentation (see table in Figure 1).

Figure 1 (abstract P202)

Figure 1 (abstract P203)

Day 1 coagulation abnormalities by initial APACHE III quartile (bar chart, Quartiles 1-4; *P < 0.05), with the accompanying table of abnormal values:

% abnormal: INR, PTT, PLT, AT3, D-dimer, Factor IX, PAI-1, TAT
No sepsis: 48%, 14%, 10%, 14%, 79%, 8%, 7%, 31%
Alive at 90 days: 43%, 12%, 11%, 15%, 79%, 10%, 8%, 33%

Conclusions Coagulation abnormalities are common in hospitalized community-acquired pneumonia patients, increasing with illness severity and poor outcome. Abnormalities were seen even in the least ill, however, and differences between groups were not large. Therapeutic manipulation of coagulation in infection will probably require a carefully titrated approach. Acknowledgement Supported by NIGMS R01GM61992.

Testing of anti-activated protein C antibodies in four drotrecogin alfa (activated) severe sepsis studies

S Yan, J Brandt, G Vail, S Um, J Bourdage, N Correll

Lilly Research Laboratories, Indianapolis, IN, USA

Critical Care 2008, 12(Suppl 2):P203 (doi: 10.1186/cc6424)

Introduction This study evaluated anti-activated protein C (anti-APC) antibody (Ab) development in drotrecogin alfa (activated) (DAA) (recombinant human APC)-treated adult patients with severe sepsis.

Figure 1 (abstract P203). Patients with negative BL and positive post-BL anti-APC Abs; values are the anti-APC Ab+ rate (n/N) followed by the number with neutralizing Abs (ntrlz Ab+).

Study: DAA (n = 1,855) / PBO (n = 1,493)
PROWESS: 1.2% (7/586) (3/7) / 1.8% (10/564) (2/10)
EVBF: 2.9% (2/70) (1/2) / -
ADDRESS: 1.5% (14/938) (1/14) / 1.5% (14/928) (1/14)
XPRESS: 0.8% (2/261) (0/2) / 0% (0/1)*
Total: 1.3% (25/1855) (5/25) / 1.6% (24/1493) (3/24)

*DAA not given.

Methods Serum and plasma samples were collected for anti-APC Ab testing from patients in the PROWESS, EVBF (ENHANCE substudy), ADDRESS and XPRESS trials at baseline (BL) and on days 14, 28 and 60 (except PROWESS). PROWESS and ADDRESS were placebo-controlled studies. All patients in EVBF and XPRESS were DAA-treated. An ELISA detecting anti-APC IgA/IgG/IgM Abs (sensitivity: 0.26 |ig/ml) was used to screen all serum samples from patients who had a BL sample and at least one post-BL sample. Confirmed positive samples (binding inhibited >50% with 50 |ig/ml exogenous DAA) were titered by twofold serial dilutions. IgG isolated from plasma of positive samples was tested for neutralizing activity against DAA-induced prolongation of aPTT. Positive anti-APC Ab was analyzed on an 'as treated' basis. Results The proportions of patients who tested negative for BL and positive for post-BL anti-APC Abs in all studies are presented in Figure 1, and were similar in the DAA and placebo cohorts at each sampling time. Twenty-five DAA patients and 24 placebo patients had a negative BL but positive post-BL anti-APC Abs; all were alive at day 28 and all but two in each group were alive at hospital discharge, including all eight with positive neutralizing Abs. No thrombotic events were reported. No relationship between the titer of anti-APC Abs and neutralizing Abs was observed. In PROWESS, no difference in markers of coagulopathy between Ab-positive and Ab-negative patients was observed. Conclusions The proportion of patients with anti-APC or neutralizing Ab was low and was similar between the 1,855 DAA patients and 1,493 placebo patients tested. No relationship between anti-APC Ab development and adverse reactions was observed. There was no evidence that the anti-APC Abs detected represented a specific immune response to DAA therapy.

Early infusion of recombinant human activated protein C decreases the number of years lost due to premature death

GF Vazquez de Anda1, J Gutierrez Ruiz2, L De la Cruz Avila2, C Zuniga Velazquez2, E Quintero Zepeda2, AP Arriaga2

1Universidad Autonoma del Estado de Mexico, Centro de Investigation en Ciencias Medicas, Toluca, Mexico; 2ISSEMYM Medical Center, Toluca, Mexico

Critical Care 2008, 12(Suppl 2):P204 (doi: 10.1186/cc6425)

Introduction Multicentre studies have demonstrated that early infusion of recombinant human activated protein C (rhAPC) improves survival of patients suffering from severe sepsis. The objective of this study was to demonstrate the benefit of early infusion of rhAPC on the number of years lost due to premature death (YLPD).

Methods This case-control study included 146 patients suffering from severe sepsis admitted to the ICU from January 2003 to December 2006. Patients were divided into three groups based on the initiation time of rhAPC after the diagnosis of severe sepsis: Group I (GI), patients who received rhAPC within the first 24 hours of severe sepsis (n = 53); Group II (GII), patients who received rhAPC after 24 hours from diagnosis of severe sepsis (n = 41); and Group III (GIII), patients with severe sepsis who did not receive rhAPC (n = 52). Variables recorded included age, gender, APACHE II score, the number of organs with acute failure at the time of admission, YLPD and mortality. Four follow-up time periods were established: T1, from initiation to day 4 of rhAPC infusion; T2, from completion of infusion to day 8; T3, from day 9 to day 30; and T4, from day 30 to the end of the study period (December 2006). Descriptive statistics were performed to identify the variable distributions. Chi-square analysis was used to determine the association between mortality and therapy. The number of YLPD was calculated according to conventional equations.

Results There were no differences between groups in age or APACHE II score at admission. There were statistical differences in the number of organs in acute failure: GI 2 (1-5), GII 3 (2-5) and GIII 2 (1-4) (median, minimum and maximum) (P = 0.03). At T1, mortality was 7.5% GI, 26.8% GII and 23.1% GIII (P = 0.03), and YLPD were 81.82 years GI, 226.65 years GII and 189.6 years GIII. Within T2, mortality was 4% GI, 23% GII and 7.5% GIII (P = 0.017), and YLPD were 20 years GI, 102.63 years GII and 54.89 years GIII. Within T4, mortality was 28.3% GI, 70.7% GII and 48.1% GIII (P < 0.001), and YLPD were 244 years GI, 529.7 years GII and 369.24 years GIII.

Conclusions Early infusion of rhAPC improves survival and decreases the YLPD in patients suffering from severe sepsis.
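The "conventional equations" for YLPD are not cited in abstract P204. A common formulation of years of life lost to premature death, sketched here with an assumed reference life expectancy and hypothetical ages at death, sums the years remaining to the reference age over all deaths:

```python
def years_of_life_lost(ages_at_death, life_expectancy=75.0):
    """Sum, over all premature deaths, the years remaining to a reference
    life expectancy. Deaths at or beyond the reference age contribute zero.

    The reference age of 75 is an assumption, not taken from the abstract.
    """
    return sum(max(0.0, life_expectancy - age) for age in ages_at_death)

# Hypothetical ages of four nonsurvivors
ylpd = years_of_life_lost([48, 55, 62, 71], life_expectancy=75.0)
```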

Extended drotrecogin alfa (activated) therapy in patients with persistent requirement for vasopressor support after 96-hour infusion with commercial drotrecogin alfa (activated)

J Dhainaut1, M Antonelli2, P Wright3, M Belger4, M Cobas-Meyer4, M Mignini4, J Janes4

1Cochin University, Paris, France; 2A Gemelli University, Rome, Italy; 3Moses Cone Memorial Hospital, Greensboro, NC, USA; 4Eli Lilly & Co. Ltd, Windlesham, UK

Critical Care 2008, 12(Suppl 2):P205 (doi: 10.1186/cc6426)

Introduction In the European Union, drotrecogin alfa (activated) (DAA) is licensed (intravenous infusion, 96 hours) for adults with severe sepsis with multiple organ failure. In the PROWESS trial, DAA treatment was associated with a significant mortality reduction and more rapid improvement in cardiovascular function over 7 days (decreased need for vasopressors), but 22% of DAA-treated patients remained on vasopressors at the end of infusion. The primary aim of this study was to investigate, in severe sepsis patients with persistent vasopressor dependency at the end of 96-hour commercial DAA treatment, whether continued administration of DAA for up to a further 72 hours results in more rapid resolution of vasopressor dependency compared with placebo (no DAA after commercial DAA infusion). Secondary objectives were mortality, biomarker changes, and safety.

Methods A multicentre, double-blind, randomized, placebo-controlled study. Owing to slower than anticipated recruitment, the planned sample size was reduced from 275 to 200.

Results Two hundred and one patients (64 centers, nine countries) were entered, 199 were randomized, and 193 received study medication for any length of time (ITT population). There were clinically relevant differences in baseline characteristics, with more DAA patients having a cardiovascular SOFA score of 4 compared with placebo (78.7% vs 64.3%, P = 0.03), having higher median doses of norepinephrine (0.26 μg/kg/min vs 0.16 μg/kg/min, P = 0.03) and tending to have lower protein C levels (66.8% vs 72.9%, P = 0.23). There was no statistically significant difference for the primary endpoint of resolution of vasopressor dependency (log-rank P = 0.42), nor in the proportion of resolvers (34.0% DAA vs 40.4% placebo, P = 0.36). Day 28 mortality was 39.8% in the DAA group and 32.3% in the placebo group (P = 0.28). The DAA group had a significantly lower percentage change in D-dimers (21.9% vs 63.2%, P < 0.001), driven primarily by a larger increase in the placebo group. By the end of infusion, protein C levels were similar (81.7% DAA vs 79.4% placebo, P = 0.23). One serious bleeding event occurred during the infusion period in each group.

Conclusions Continued DAA for up to a further 72 hours after commercial drug administration did not result in more rapid resolution of vasopressor-dependent hypotension, despite anticipated effects on D-dimer and protein C levels, and was associated with an acceptable safety profile. The reduction in the planned sample size combined with baseline imbalances in protein C levels and vasopressor requirements may have limited our ability to show clinical benefit.

Drotrecogin alfa: start early, ensure response, stop early!

S Jog, B Pawar, P Rajhans, P Akole, B Bhurke, S Gadgil

Deenanath Mangeshkar Hospital and Research Centre, Pune, India
Critical Care 2008, 12(Suppl 2):P206 (doi: 10.1186/cc6427)

Introduction Drotrecogin alfa (DA) is an effective treatment in sepsis-induced MODS. Optimum duration of treatment is 96 hours of infusion. It is unclear whether stopping DA before 96 hours, in patients in whom organ dysfunction rapidly resolves, ultimately affects the 30-day mortality. We performed a prospective study to evaluate this concept.

Methods We evaluated patients with severe sepsis and three or more organ failures (OF) who received DA within 24 hours of onset. We stopped DA before completion of the 96-hour infusion once complete resolution of OF was ensured. All of these patients were monitored for reappearance of OF until discharge from hospital.

Results Six patients with an APACHE II score of 25 ± 1.89 were evaluated. All six patients recovered completely from MODS and were discharged home. Reappearance of OF was not seen in any of them. See Table 1.

Conclusions DA can be safely stopped before 96 hours in patients who show rapid reversal of organ dysfunction.

Table 1 (abstract P206)

Patient data

Patient number APACHE II score Shock reversal (hours) ARDS reversal (hours) Duration of drotrecogin alfa (hours) ICU stay (days)

1 28 70 48 72 16

2 25 No shock 66 72 7

3 23 48 20 76 8

4 23 56 70 72 6

5 25 76 40 76 9

6 26 56 100 90 10

Association of mortality in the surgical ICU with plasma concentrations of plasminogen activator inhibitor-1 and soluble E-selectin

T Yasuda, M Nakahara, Y Kakihana, Y Kanmura

Kagoshima University Hospital, Kagoshima, Japan

Critical Care 2008, 12(Suppl 2):P207 (doi: 10.1186/cc6428)

Introduction Both plasminogen activator inhibitor-1 (PAI-1) and soluble E-selectin (sES) are substances activated by cytokines under strong inflammation. PAI-1 is a rapid inhibitor of tissue plasminogen activator in vivo and is known as a marker of the systemic inflammatory response syndrome, which is followed by multiple organ dysfunction. sES is an adhesion molecule expressed by endothelial cells activated by TNF, and its elevation has been reported to precede respiratory failure such as the acute respiratory distress syndrome. It is not clear, however, whether the plasma levels of these substances affect the mortality and morbidity of critically ill patients. We therefore divided patients into two groups by the plasma levels of PAI-1 and sES and evaluated mortality in each.

Methods We compared the levels of PAI-1 and sES in survivors with those in nonsurvivors in 29 patients admitted to our surgical ICU in the hospital of Kagoshima University. High levels of PAI-1 are known to accompany hemorrhage after surgery; we therefore used the values of PAI-1 and sES on the admission day (day 1), day 2 and the day when hemorrhage was controlled. The plasma levels of PAI-1 and sES were measured by latex agglutination assay with an automatic analyzer (LPIA-NV7; Mitsubishi Kagaku Iatron Co., Tokyo, Japan). For statistical analysis, a two-sided Fisher exact probability test was used to analyze the difference in mortality; P < 0.05 indicated statistical significance.

Results Among the patients examined, 11 showed elevated tPAI levels (>50 ng/ml) (PE group) and 18 showed normal tPAI levels (<50 ng/ml) (PN group); 14 showed elevated sES levels (>30 ng/ml) (EE group) and 15 showed normal sES levels (<30 ng/ml) (EN group). Mortality was significantly higher in the PE group (9/11, 81.8%) and EE group (8/14, 57.1%) than in the PN group (1/18, 5.5%) (P < 0.0001) and EN group (2/15, 13.3%) (P = 0.0209), respectively.

Conclusions Both the levels of PAI-1 and sES are useful for evaluating the prognosis of critically ill patients in the surgical ICU.
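The two-sided Fisher exact test reported above can be reproduced from the published counts (9/11 deaths in the PE group vs 1/18 in the PN group) using only the standard library; this is a sketch of the test itself, not the authors' software:

```python
from math import comb

def fisher_exact_two_sided(table):
    """Two-sided Fisher exact test for a 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of every table with the same
    margins whose probability does not exceed that of the observed table.
    """
    (a, b), (c, d) = table
    row1, row2 = a + b, c + d
    col1 = a + c
    denom = comb(row1 + row2, col1)

    def prob(x):
        # P(top-left cell = x) under the hypergeometric null
        return comb(row1, x) * comb(row2, col1 - x) / denom

    p_obs = prob(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    return sum(prob(x) for x in range(lo, hi + 1)
               if prob(x) <= p_obs * (1 + 1e-9))

# PE group: 9 of 11 died; PN group: 1 of 18 died
p = fisher_exact_two_sided([[9, 2], [1, 17]])
```

The result is consistent with the reported P < 0.0001.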

Efficacy of antithrombin administration in the acute phase of burn injury

A Lavrentieva, I Houris, S Aspragathou, K Mitka, K Kazanas, M Bitzani

G. Papanikolaou Hospital, Thessaloniki, Greece

Critical Care 2008, 12(Suppl 2):P208 (doi: 10.1186/cc6429)

Introduction Severe burn injury is characterized by activation of coagulation, decreased fibrinolytic activity and decreased natural anticoagulant activity. The aim of our study was to investigate the effect of antithrombin administration on coagulation status and on organ function in the early postburn period.

Methods Thirty-one patients admitted to the burn ICU were randomized into two groups, antithrombin-treated (n = 15) and control (n = 16), for four consecutive days after thermal injury. Clinical data, coagulation parameters and fibrinolysis parameters were compared and adverse effects were monitored.

Results Significant differences in the time trend of D-dimers and thrombin-antithrombin complexes were observed between antithrombin-treated and control groups (a decrease in the antithrombin-treated group and an increase in the control group). According to the International Society on Thrombosis and Haemostasis criteria, a diagnosis of disseminated intravascular coagulation (DIC) was established in 28 of 31 patients. The presence of overt DIC was associated with mortality (P = 0.002). The Sequential Organ Failure Assessment score time trend differed significantly between the two investigation groups (it decreased in the treated group and did not change in the control group). Antithrombin-treated patients had an absolute reduction in 28-day mortality of 25% compared with the control group (P = 0.004). No treatment-related side effects were observed.

Conclusions Treatment with antithrombin seems to affect the coagulation status and to reduce multiple organ failure incidence and mortality in the early postburn period.

Whole blood coagulation and platelet activation in the athlete: a comparison of marathon, triathlon and long-distance cycling

A Hanke1, A Staib1, K Görlinger2, M Perrey1, D Dirkmann2, P Kienbaum1

1Heinrich-Heine-Universität Düsseldorf, Germany; 2Uniklinikum Essen, Germany

Critical Care 2008, 12(Suppl 2):P209 (doi: 10.1186/cc6430)

Introduction Thromboembolic events have been reported in marathon athletes during competition. We tested the hypothesis that activation of coagulation and platelets depends on the type of endurance sport and running fraction.

Methods After ethics committee approval, 68 healthy athletes participating in a marathon (MAR, running 42 km, n = 24), a triathlon (TRI, swimming 2.5 km + cycling 90 km + running 21 km, n = 22), and long-distance cycling (CYC, 151 km, n = 22) were included in the study. Blood samples were taken before and immediately after competition. Rotational thromboelastometry was performed (ROTEM; Pentapharm, Germany), and the coagulation time (CT) and maximum clot firmness (MCF) after intrinsic activation were assessed. Platelet aggregation was tested using a multiple platelet function analyzer (Multiplate; Dynabyte, Germany) by activation with ADP as well as thrombin receptor-activating peptide 6 (TRAP-6) and expressed as the area under the curve (AUC). Statistics used the Wilcoxon signed-rank test; P < 0.05 was considered significant.

Results Complete datasets were obtained in 59 athletes (MAR: n = 21, TRI: n = 19, CYC: n = 19). The CT significantly decreased in MAR (from 172 ± 15.3 s to 155 ± 18.3 s), TRI (from 168.1 ± 12.9 s to 154.2 ± 11.3 s), and CYC (from 164.7 ± 17.7 s to 152.5 ± 13.0 s) without differences between groups. In parallel, the MCF increased in all groups (MAR: from 58.1 ± 3.9 mm to 62.4 ± 3.8 mm, TRI: from 56.1 ± 3.2 mm to 59.5 ± 3.1 mm, CYC: from 59.3 ± 5.0 mm to 64.2 ± 4.2 mm). Platelets were activated only during the MAR and TRI, however, as indicated by an increased AUC during TRAP activation in the MAR (from 919 ± 149 to 1,074 ± 290) and an increased AUC during ADP activation in the MAR (from 532 ± 184 to 827 ± 262) and TRI (from 505 ± 205 to 799 ± 329).

Conclusions As shown before, coagulation is activated during physical activity. We observed significant platelet activation during a marathon and to a lesser extent during a triathlon. We conclude that prolonged running may increase platelet activity, and we speculate that direct mechanical stress during running contributes to the observed effect. Running therefore activates both coagulation and platelets, resulting in an increased risk of thromboembolic events in running athletes.
Reference

1. Sumann G, Fries D, Griesmacher A, et al.: Blood Coagul Fibrinolysis 2007, 18:435-440.

Admission platelet count as a prognostic indicator in intensive care

V Hariharan, J Paddle

Royal Cornwall Hospital, Truro, UK

Critical Care 2008, 12(Suppl 2):P210 (doi: 10.1186/cc6431)

Introduction Abnormal platelet counts are common findings in ICU patients. Thrombocytopenia is associated with a poor outcome [1]; conversely, thrombocytosis may be associated with an improved outcome [2]. We therefore conducted a retrospective observational study in our own unit to investigate this further.

Methods All patients admitted to the ICU of a large district general hospital (Royal Cornwall Hospital) from January 2002 to April 2005 were included in this retrospective study. We collected data on age, sex, admission category, platelet count, APACHE II score, APACHE II predicted mortality, and hospital mortality. The platelet value was taken as the lowest platelet count obtained within the first 24 hours of ICU admission. The primary outcome was hospital mortality. Statistical analysis was conducted with SPSS version 15.0 using logistic regression models.

Results A total of 1,767 patients were admitted during the study period. We excluded 119 patients with no recorded platelet data. We found a strong negative correlation between the admission platelet count and mortality, which was significant (P = 0.001, logistic regression). To test this relationship against actual hospital mortality we divided the cohort into deciles of platelet count and plotted the data against mortality. Those with platelet counts below 67 had a mortality rate of 57.2%, substantially higher than the remaining deciles (P = 0.0001, Fisher's exact test). We did not demonstrate any significant reduction in mortality in patients with thrombocytosis (P = 0.523, Fisher's exact test). We compared medical versus surgical patients and found that, for any given platelet value, the predicted outcome for surgical patients was better (P = 0.008, t test). We then analysed a model that included platelets as an additional indicator of outcome. In binary logistic regression analysis there was a significant association between platelet count and mortality (coefficient = 0.998, CI = 0.996-0.999). This association remained significant in a multiple logistic regression model that included APACHE II (P < 0.001). A model including both APACHE II and platelet count improved the proportion of deaths correctly predicted from 69.5% with APACHE II alone to 71.3% with platelets included.

Conclusions We confirmed previous findings that there is a correlation between low platelet counts and adverse outcome, and we have further demonstrated that the correlation between platelet count and predicted mortality exists across the spectrum of platelet values. In addition, we have demonstrated a difference in mortality between medical and surgical patients for any given admission platelet value. Finally, we have demonstrated that platelet values provide additional prognostic information above the APACHE II score.
References

1. Vanderschueren S, et al.: Crit Care Med 2000, 28:1871-1876.

2. Gurung AM, et al.: Br J Anaesth 2001, 87:926-968.
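The binary logistic regression in abstract P210 models the log-odds of hospital mortality as a linear function of the predictors. A minimal illustrative sketch follows; only the per-unit platelet coefficient (odds ratio of about 0.998) comes from the abstract, while the intercept and APACHE II coefficient are invented for illustration:

```python
from math import exp, log

def mortality_probability(platelets, apache_ii,
                          b0=-4.0,            # hypothetical intercept
                          b_plt=log(0.998),   # log-odds per unit platelet count (from the abstract)
                          b_apache=0.15):     # hypothetical APACHE II coefficient
    """Two-predictor logistic model:
    P(death) = 1 / (1 + exp(-(b0 + b_plt*platelets + b_apache*apache_ii)))."""
    log_odds = b0 + b_plt * platelets + b_apache * apache_ii
    return 1.0 / (1.0 + exp(-log_odds))

# A negative platelet coefficient means lower admission platelet counts imply
# higher predicted mortality at a fixed APACHE II score
p_low = mortality_probability(platelets=50, apache_ii=20)
p_high = mortality_probability(platelets=300, apache_ii=20)
```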

A phase 1 trial of nebulized heparin in acute lung injury

B Dixon, D Santamaria, J Campbell, A Tobin

St Vincent's Health, Melbourne, Australia

Critical Care 2008, 12(Suppl 2):P211 (doi: 10.1186/cc6432)

Introduction Animal studies of acute lung injury (ALI) suggest nebulized heparin may limit damage from fibrin deposition in the alveolar space and microcirculation. We therefore undertook a trial to assess the safety and tolerability of nebulized heparin in patients with ALI.

Methods In an open-label phase 1 trial, nebulized heparin was administered over 2 days at one of four escalating doses. A total of 16 ventilated patients with ALI were studied, with each dose assessed in four patients: the first group was administered 50,000 U/day, the second 100,000 U/day, the third 200,000 U/day and the fourth 400,000 U/day. We measured the arterial to inspired oxygen ratio (PaO2/FiO2), lung compliance, the alveolar dead space fraction, the blood thrombin clotting time and the activated partial thromboplastin time (APTT). Bronchoalveolar lavage (BAL) fluid was collected and the prothrombin fragment and tissue plasminogen activator levels assessed.

Results There was no difference between groups in the PaO2/FiO2, lung compliance or the alveolar dead space fraction over the study period. A trend to reduced prothrombin fragment levels in BAL fluid was present with higher doses of nebulized heparin (P = 0.1). Nebulized heparin did not increase tissue plasminogen activator levels in BAL fluid. A trend to increased blood thrombin clotting time and APTT levels was present with higher doses of nebulized heparin (P = 0.1 and P = 0.09, respectively). For the highest dose, the APTT reached 64 seconds. Conclusions Nebulized heparin can be administered safely to ventilated patients with ALI. At higher doses, nebulized heparin may limit coagulation activation in the lungs and increase systemic APTT levels.

Thromboelastography in clinical decision-making in the critically ill patient in a district general hospital ICU

J Louw, T Veenith, P Moondi

Queen Elizabeth Hospital, Kings Lynn, UK

Critical Care 2008, 12(Suppl 2):P212 (doi: 10.1186/cc6433)

Introduction Thromboelastography (TEG) is a point-of-care monitoring tool that could help in managing coagulopathy in the critically ill. This may be beneficial in reducing the length of stay in the ICU, guide blood product transfusion and improve patient outcome. Methods We conducted a retrospective analysis of the use of TEG in a busy district general hospital ICU. We included all 100 patients in whom TEG was performed over 1 year. They required >4 units blood intraoperatively or >2 units blood on the ICU, abdominal aortic aneurysm repair or had sepsis. TEG was performed on 212 occasions, in parallel with routine coagulation studies. Results We transfused 656 units of packed RBCs, 27 units of cryoprecipitate, 180 units of FFP and 130 units of platelets, incurring an expenditure of £722,682. The cost of running TEG for that year was £1,845. Two hundred and twelve clinical decisions were made following TEG along with clotting results. We identified 174 (82.08%) abnormal TEG results, of which 88 (50.57%) were accompanied by abnormal clotting. One hundred and eighty-seven (88.21%) clinical decisions were influenced by the TEG result. In this group, 171 (91.44%) were related to guiding transfusion of blood products. Fifteen (8.02%) resulted in a change of medical management, guiding activated protein C administration, renal replacement therapy, invasive procedures and starting secondary anticoagulation prophylaxis.

Conclusions Standard coagulation assays do not provide any information on platelet function or fibrinolysis [1]. TEG can replace clotting studies and the assessment of platelet function [2]. TEG can guide blood product transfusion in cardiac surgery [1]. TEG can be done with a fraction of the total costs of transfusion and provides confidence during the management of coagulopathy.

Ongoing research should focus on establishing clear guidelines for the appropriate use of the thromboelastograph.
References
1. Avidan MS, et al.: Comparison of structured use of routine laboratory tests or near patient assessment with clinical judgment in the management of bleeding after cardiac surgery. Br J Anaesth 2004, 92:178-186.

2. The clinical and cost effectiveness of thromboelastogra-phy/thromboelastometry. Health Technology Assessment Report, NHS Quality Improvement Scotland; December 2007.

Modifications of coagulation imbalance during antithrombin treatment in preeclamptic patients: our experience

C Buscemi, S Pirri, D Mangione, A Giarratano

University of Palermo, Italy

Critical Care 2008, 12(Suppl 2):P213 (doi: 10.1186/cc6434)

Introduction Preeclamptic conditions are often associated with consumption of natural coagulation inhibitors. Many studies have shown the validity of antithrombin (AT) treatment in preeclamptic conditions. The aim of this study was to restore coagulation balance by administering AT under the guidance of thromboelastographic monitoring (TEG).

Methods Ten preeclamptic pregnant women in the 24th-30th weeks of gestation with diastolic blood pressure >90 mmHg and 24-hour urinary protein >0.3 g were included. All patients underwent a complete study of coagulation function: prothrombin time (PT), activated partial thromboplastin time (aPTT), International Normalized Ratio (INR), fibrinogen C, D-dimer, AT and TEG at the beginning, after every administration of AT, weekly until caesarean section, and daily for 1 week in the postoperative period. AT was administered whenever the AT plasma level was less than 80%, to restore the level to more than 120%, using the following algorithm: dose = (120% - AT plasma level) x body weight (kg). At the beginning, only seven patients were treated with AT.

Results At the beginning, all patients showed AT consumption and a hypercoagulable TEG trace (Figure 1), but the INR and aPTT were in the normal ranges. Patients treated with AT at the beginning did not need a new administration. The remaining patients were treated at the 31st, 33rd and 34th weeks, respectively. In all patients, AT administration determined a normalization of TEG without any modification of the PT and aPTT and without bleeding. All patients were submitted to caesarean section between the 36th and 39th weeks.

Figure 1 (abstract P213): Thromboelastographic monitoring at admission.
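The dosing algorithm in Methods is simple arithmetic and can be sketched as a short function. The abstract gives the rule (120% - AT plasma level) x kg and the 80% treatment threshold; the function name, the IU unit and the example values are illustrative assumptions:

```python
def at_dose_iu(at_level_pct, weight_kg, threshold_pct=80.0, target_pct=120.0):
    """Antithrombin dose suggested by the abstract's rule:
    (120% - measured AT plasma level) x body weight (kg),
    given only when the plasma level falls below the 80% threshold."""
    if at_level_pct >= threshold_pct:
        return 0.0  # no substitution needed
    return (target_pct - at_level_pct) * weight_kg

# A 70 kg patient with an AT plasma level of 60% would receive
# (120 - 60) x 70 = 4,200 IU under this rule.
print(at_dose_iu(60, 70))
```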

Conclusions AT administration could play a central role in preeclampsia treatment. TEG monitoring revealed, in real time, coagulation changes that common laboratory tests could not show. Reference

1. Redman C: Am J Obstet Gynecol 1999, 180:499-506.

Thrombocytopenia is associated with mortality in hospitalized patients with low risk of death

M Oliveira, R Gomes, L Silva, F Ribeiro, C Boaventura, A Camelier, R Passos, D Flores, J Teles, A Farias, O Messeder

Hospital Portugues, Salvador Bahia, Brazil

Critical Care 2008, 12(Suppl 2):P214 (doi: 10.1186/cc6435)

Introduction Thrombocytopenia is inversely related to survival in critical care patients [1]. The objective of the present study was to evaluate the prevalence of thrombocytopenia in patients of an ICU and to determine whether it might be a significant predictor of outcome.

Methods A prospective observational cohort study was performed from April to September 2007 in a 24-bed medical-surgical ICU. All patients admitted to the ICU during the period of observation were included in the study. Patients were prospectively studied until 14 days from admission, discharge from the ICU, or death. Patients who had thrombocytopenia on admission or spent less than 48 hours in the ICU were excluded. Results During the period of observation, 215 patients were admitted to the ICU (57.5% male), with a median age of 65.0 years (IQR 54-77) and an APACHE II score of 14.0 (IQR 10.0-19.0). One hundred and seventy-six subjects (81.9%) were alive after a 14-day follow-up. Seventy patients (32.6%) developed thrombocytopenia during the study. Patients who ever developed thrombocytopenia had a higher ICU mortality (28.6% vs 13.0%, respectively; P < 0.006) and a higher consumption of blood products (24% vs 2%, P < 0.0001). However, both groups had the same APACHE II score (15.15 ± 6.1 vs 15.15 ± 7.2, P = 0.99) and ICU stay (8.2 ± 7.1 vs 8.4 ± 12.8, P = 0.93).

Conclusions Even in an ICU sample with a low risk of death predicted by the APACHE II score, thrombocytopenia was strongly associated with higher mortality and greater consumption of blood products. Reference

1. Akca S, Haji-Michael P, De MA, Suter P, Levi M, Vincent JL: Time course of platelet counts in critically ill patients. Crit Care Med 2002, 30:753-756.

Functional state of the hemostasis system in physiological pregnancy and late toxicosis

V Mazur, O Tarabrin, A Suhanov, S Shcherbakov

Odessa Medical University, Odessa, Ukraine

Critical Care 2008, 12(Suppl 2):P215 (doi: 10.1186/cc6436)

Introduction One of the causes of obstetric hemorrhage is toxicosis in the second half of pregnancy accompanied by a chronic form of disseminated intravascular coagulation syndrome, hypercoagulation and increased aggregation activity of platelets. Methods To assess the functional state of the hemostasis system, we used a test we devised involving local ischemia of the upper extremity. The analysis of the coagulation, vascular and thrombocytic components of hemostasis and fibrinolysis was made on the basis of parameters of the blood aggregate state obtained using the method of haemoviscoelastography.

Results We examined 30 healthy pregnant women aged 20-31 years (control group) and 30 pregnant women with late toxicosis of differing severity (degree II nephropathy, 10 women; degree III nephropathy, 20 women). Analysis of the functional state of the hemostasis system in healthy pregnant women distinguished two types of response to the test: a compensated type (1) in 30% and a subcompensated type (2) in 70%. The pregnant women suffering from late toxicosis showed a subcompensated type of hemostasis response (3) in 20% of cases and a decompensated type (4) in 80% of cases. The functional test in group 1 resulted in decreased aggregation activity of platelets, reduced activity of phases I and II of blood coagulation (elevation of r and k) and activation of the fibrinolytic system. Group 2 showed enhanced aggregation activity of platelets, enhanced thrombin activity, acceleration of thrombin formation and activation of coagulation phases II and III. The total fibrinolytic blood activity was reduced by 42%.

Conclusions Late toxicosis is therefore accompanied by changes in the hemostasis system, causing exhaustion of compensatory potentials of the regulation system of the blood aggregate state and promoting a high risk of thrombohemorrhagic complications during delivery and in the postpartum period.

Plasma fibrinolysis is related to the SOFA score but not to the von Willebrand factor on ICU admission

K Zouaoui Boudjeltia1, S Ollieuz2, M Piagnerelli3, P Biston2, P Cauchie1, M Vanhaeverbeek1

1ISPPC CHU Charleroi, Vesale Hospital, Montigny-le-Tilleul, Belgium; 2ISPPC CHU Charleroi, Belgium; 3Erasme University, Brussels, Belgium

Critical Care 2008, 12(Suppl 2):P216 (doi: 10.1186/cc6437)

Introduction Endothelial cell activation and injury are important causes of multiorgan failure (MOF). Altered fibrinolysis promotes fibrin deposition and may create microvascular alterations during inflammation. C-reactive protein (CRP) is correlated with an increased risk of MOF [1], and CRP may inhibit fibrinolysis [2]. We aimed to determine whether plasma fibrinolysis is related to the SOFA score and to von Willebrand factor (vWF antigen), as a marker of endothelial dysfunction, in critically ill patients at ICU admission. Methods A cross-sectional study in an adult medicosurgical ICU included 49 consecutive patients (31 nonseptic and 18 septic). Plasma fibrinolysis was assessed by the euglobulin clot lysis time (ECLT) at ICU admission [3].

Results The ECLT was significantly longer in septic than in nonseptic patients (1,219 ± 574 min versus 701 ± 224 min, P = 0.001). Significant correlations between the ECLT and CRP (R = 0.67, P < 0.001) and between the ECLT and SOFA score (R = 0.36, P = 0.009) were observed. CRP was weakly correlated with vWF (R = 0.29, P = 0.04). The vWF was not correlated with either the ECLT (R = -0.06, P = 0.65) or the SOFA score (R = -0.02, P = 0.88). Conclusions The ECLT measurement could be a marker of organ dysfunction and a prognostic factor in critically ill patients. Further studies measuring plasma fibrinolysis by the ECLT in ICU patients are warranted.


References

1. Lobo SM, et al.: Chest 2003, 123:2043-2049.

2. Devaraj S, et al.: Circulation 2003, 107:398-404.

3. Boudjeltia Z, et al.: BMC Biotechnol 2002, 2:8.

Perioperative monitoring of coagulation in patients after abdominal surgery

O Tarabrin, V Mazur, A Suhanov, S Shcherbakov

Odessa Medical University, Odessa, Ukraine

Critical Care 2008, 12(Suppl 2):P217 (doi: 10.1186/cc6438)

Introduction Despite the evidence of perioperative hypercoagulability in cancer patients, there are no consistent data evaluating the extent, duration, and specific contribution of platelets and procoagulatory proteins by in vitro testing. This study compared the efficacy of haemoviscoelastography (HVG) versus thromboelastography (TEG) for monitoring coagulation imbalance. Methods In 108 patients undergoing surgery for abdominal cancer we examined the efficacy of a variety of coagulation tests. A complete coagulation screening, TEG and HVG were performed before and at the end of surgery.

Results We calculated the elastic shear modulus from the standard maximum amplitude (MA) (Gt) and from the HVG MA (Gh), which reflect the total clot strength and the procoagulatory protein component, respectively. Their difference was an estimate of the platelet component (Gp). There was a 14% perioperative increase in standard MA, corresponding to a 48% increase in Gt (P < 0.05) and an 80-86% contribution of the calculated Gp to Gt. Serial standard TEG and HVG viscoelastic tests may therefore reveal the independent contributions of platelets and procoagulatory proteins to clot strength. Some components of the TEG failed to identify hypercoagulation (r < 0.2, P > 0.75), whereas all components of the HVG test reflected postoperative coagulopathies. Conclusions Hypercoagulability is not reflected completely by standard coagulation monitoring and TEG, and seems to be predominantly caused by increased platelet reactivity. HVG provides a fast and easy-to-perform bedside test to quantify in vitro coagulation, and may be useful in determining the coagulation status of cancer patients perioperatively. Reference

1. Samama CM, et al.: Anesthesiology 2001, 94:74-78.
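The platelet-component estimate above (Gp = Gt - Gh) can be illustrated with a short sketch. The abstract does not state how MA was converted to an elastic shear modulus; the relation G = (5000 x MA) / (100 - MA), commonly used in thromboelastography, is assumed here, and the MA values are hypothetical:

```python
def shear_modulus(ma_mm):
    """Convert a viscoelastographic maximum amplitude (MA, in mm) to an
    elastic shear modulus G (dyn/cm^2) using the relation commonly cited
    for thromboelastography: G = (5000 * MA) / (100 - MA)."""
    return 5000.0 * ma_mm / (100.0 - ma_mm)

# Hypothetical values: total clot strength from the standard TEG MA (Gt),
# procoagulatory protein component from the HVG MA (Gh); their
# difference estimates the platelet component (Gp).
gt = shear_modulus(62.0)
gh = shear_modulus(30.0)
gp = gt - gh
print(round(gp, 1))
```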

Neutrophil oxidative burst evaluation during acute normovolemic hemodilution

MA Kahvegian1, DT Fantoni2, DA Otsuki1, CA Holms1, CO Massoco2, JO Auler Jr1

1Faculdade de Medicina da Universidade de Sao Paulo, Brazil; 2Faculdade de Medicina Veterinaria da Universidade de Sao Paulo, Brazil

Critical Care 2008, 12(Suppl 2):P218 (doi: 10.1186/cc6439)

Introduction In recent years there has been increasing evidence that resuscitation strategies with different fluids can have widely divergent impacts on the immune response and neutrophil activation. This study was undertaken to determine the neutrophil oxidative burst in a swine model during the acute normovolemic hemodilution (ANH) procedure with hydroxyethyl starch (HES), normal saline solution (NSS) or gelatin (GEL). Methods Twenty-four pigs were anesthetized, instrumented and randomized into four groups: Control, ANH + HES, ANH + NSS and ANH + GEL. Animals in the ANH groups were submitted to acute normovolemic hemodilution to a target hematocrit of 15%, with volume replacement performed with HES 130/0.4 and GEL at a 1:1 ratio and NSS at a 3:1 ratio. The withdrawn blood was returned to the animals 120 minutes after the end of hemodilution. Neutrophil oxidative burst was measured in blood samples collected from the femoral vein at the following time points: before ANH (baseline), after instrumentation (INST), immediately after ANH (H), 60 minutes after ANH (60H), 120 minutes after ANH (120H), 60 minutes after blood infusion (60BI) and 120 minutes after blood infusion (120BI), and was determined with a flow cytometer. A t test was performed to evaluate differences between groups. P < 0.05 was considered statistically significant. Results Between groups there were significant differences at time point H between Control (25.75 ± 8.45) and HES (60.61 ± 10.49; P < 0.01), between Control and NSS (55.94 ± 10.38; P < 0.05), and between Control and GEL (68.42 ± 27.83; P < 0.01). At time point 60H, the differences were between Control (34.48 ± 8.11) and HES (54.15 ± 12.49; P < 0.01). At 120H, Control (29.05 ± 9.39) and HES (45.20 ± 5.80; P < 0.05) and NSS (46.18 ± 9.42; P < 0.05) showed significant differences. Sixty minutes after blood infusion, only HES (38.57 ± 7.89; P < 0.05) was different from Control (26.46 ± 7.54).

Conclusions Fluid replacement immediately after induced ANH increased inflammation, expressed by oxidative burst activity, without significant differences among the three fluids. Acknowledgements Performed at LIM 08. Supported by grants from FAPESP (05/58987-9). References

1. Lee CC, et al.: Shock 2005, 2:177-181.

2. Watters JM, et al.: Shock 2004, 22:283-287.

Reticulocyte counts and their relation to hemoglobin levels in trauma patients

M Otterman, JM Nijboer, IC Van der Horst, HJ Ten Duis, MW Nijsten

University Medical Center Groningen, The Netherlands Critical Care 2008, 12(Suppl 2):P219 (doi: 10.1186/cc6440)

Introduction In many trauma patients, blood loss is the major cause of anaemia. The subsequent increased production of red blood cells is reflected by increased reticulocyte numbers (R). Measurement of R might be useful in predicting the recovery of hemoglobin (Hb), especially since current transfusion guidelines accept lower Hb levels. The value of modern, fully automated measurement of R in assessing recovery of Hb after blood loss has not been investigated in this context. We therefore investigated the temporal relation of Hb and R in a cohort of trauma patients. Methods Over a 10-month period, all patients with trauma admitted to our hospital were analysed. Patients were grouped by comorbidity and reason for admission. Whenever an Hb was routinely measured, an R measurement was also performed on the same sample. Both Hb and R (reference range 8-26 promille) were determined in EDTA-anticoagulated blood in the central laboratory with a Sysmex XE-2100. Before further pooled analysis, values for individual patients were averaged or interpolated to daily values. Red blood cell (RBC) transfusions were administered according to modern restrictive transfusion guidelines, with an Hb threshold of 4.3 mmol/l in otherwise healthy patients. Hb and R were analyzed for a maximum of 30 days post-trauma, and were related to age, sex and the presence of comorbidity.
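The pooling step in Methods (averaging same-day measurements, then interpolating to one value per day) might look like the following sketch; the abstract does not specify the exact procedure, so the interpolation scheme, function name and sample values here are illustrative assumptions:

```python
def to_daily(samples):
    """samples: list of (day, value) measurements for one patient.
    Average repeated same-day measurements, then linearly interpolate
    to one value per whole day across the observed span."""
    by_day = {}
    for day, value in samples:
        by_day.setdefault(day, []).append(value)
    days = sorted(by_day)
    means = {d: sum(v) / len(v) for d, v in by_day.items()}
    daily = {}
    for d in range(days[0], days[-1] + 1):
        if d in means:
            daily[d] = means[d]
        else:
            # linear interpolation between the nearest measured days
            lo = max(x for x in days if x < d)
            hi = min(x for x in days if x > d)
            t = (d - lo) / (hi - lo)
            daily[d] = means[lo] + t * (means[hi] - means[lo])
    return daily

# Hypothetical Hb samples (mmol/l) on days 0, 0, 2 and 5 post-trauma
daily = to_daily([(0, 7.8), (0, 7.4), (2, 6.8), (5, 7.0)])
print(daily)
```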

Results Two hundred and forty-one patients with a mean ± SD age of 52 ± 21 years were studied. The mean length of stay was 15 days (range 1-110). In 107 patients (44%), important comorbidity was present. In 28 patients (12%), one or more RBC transfusions were administered, with a mean of 2.2 RBC units (range 1-4). Hb decreased from a mean level of 7.6 ± 1.5 mmol/l at admission to 6.8 ± 1.3 on day 3. R slowly rose from 16 ± 11 promille at admission to 38 ± 21 promille on day 13. The highest R value observed was 121 promille. Nadir Hb values and maximum R values were inversely related upon univariate analysis (Pearson R = -0.62, P < 0.001). Multivariate analysis with the variables minimal Hb, maximum R, age, sex, and the presence of comorbidity showed that only minimal Hb was a significant determinant of R (R = 0.63).

Conclusions There is a strong relationship between minimal Hb and maximum R in trauma patients. The measurement of reticulocytes may be helpful in predicting the recovery in Hb after acute blood loss due to trauma and to assist in deciding whether a patient needs to be transfused.

Geographic variation in the reversal of vitamin K antagonist-associated coagulopathy

M Neal1, M Crowther2, J Douketis2, M Verhovsek2, C Stidley1, D Garcia1

1University of New Mexico, Albuquerque, NM, USA; 2McMaster University, Hamilton, ON, Canada

Critical Care 2008, 12(Suppl 2):P220 (doi: 10.1186/cc6441)

Introduction Serious bleeding is the most feared adverse effect of vitamin K antagonists (VKA) such as warfarin. VKA-treated patients who are bleeding (or found to have a supratherapeutic INR value) can be managed by administering one or more of the following: vitamin K, fresh frozen plasma, recombinant activated factor VII, or prothrombin complex concentrates. Current guidelines and review articles addressing this subject are discordant. We tested the hypothesis that significant clinical practice differences exist between North America and the rest of the world for reversal of VKA-associated coagulopathy.

Methods A survey containing three hypothetical clinical cases was presented to attendees at a meeting of the International Society of Thrombosis and Haemostasis in July 2007. The respondents were primarily physicians with experience in anticoagulant management. The cases involved patients with an elevated INR value and either intracerebral bleeding, gastrointestinal bleeding, or no clinical evidence of bleeding. For each case, the attendee was asked to choose the intervention they would most probably order at their institution.

Results A total of 119 surveys were distributed and 46 were completed. See Table 1. For patients with intracerebral or gastrointestinal bleeding who required urgent reversal of VKA-associated coagulopathy, there was significantly greater use of fresh frozen plasma and recombinant activated factor VII in North America and significantly greater use of prothrombin complex concentrates in the rest of the world. For patients with an elevated INR but no bleeding, there was no significant difference in practice by geographic region; vitamin K was used consistently in all cases. Conclusions Significant geographical differences exist in the way clinicians urgently reverse VKA-associated coagulopathy in bleeding patients. This suggests that randomized trials are needed to define optimal management strategies.

Table 1 (abstract P220)

Comparison by region of respondents recommending prothrombin complex concentrates for each case

Case                        North America (%) (n = 10)    Other (%) (n = 36)    P value
Intracerebral bleeding                 10                          81            <0.0001
Either                                 10                          86            <0.001

Prothrombin complex concentrate use in surgical patients: a retrospective analysis of efficacy and safety for coumarin reversal and bleeding management

KS Schick, JM Fertmann, KW Jauch, JN Hoffmann

Klinikum Grosshadern, Munich, Germany

Critical Care 2008, 12(Suppl 2):P221 (doi: 10.1186/cc6442)

Introduction Anticoagulation, coagulation disorders and haemorrhage cause considerable morbidity and mortality in surgical patients. Reversal of vitamin K anticoagulants and treatment of perioperative coagulopathy can be achieved with prothrombin complex concentrates (PCC). However, the effects on coagulation parameters, and any side effects on organ function, have yet to be determined.

Methods Patients of the surgical department were analysed retrospectively over 1 year in a case-note review of patient charts and documentation. Patients with vitamin K antagonist reversal (reversal group: n = 12) were compared with patients receiving PCC for management of severe bleeding (bleeding group, n = 38). Coagulation was assessed using thromboplastin times (INR/Quick's value). Serum bilirubin and creatinine concentrations at day 3 after PCC application served as safety variables. Results Both patient groups were comparable in terms of age (reversal: 67.3 ± 4.1 years vs bleeding: 66.1 ± 1.8 years) and body temperature (37.2 ± 0.2°C vs 36.8 ± 0.3°C). Thromboplastin times (INR) before PCC treatment were significantly higher in the reversal group (reversal: 2.4 ± 0.2 vs bleeding: 1.5 ± 0.2; P < 0.001), whereas anaemia occurred significantly more frequently in bleeding patients (haemoglobin: reversal 11.8 ± 0.6 g/dl vs bleeding: 8.2 ± 0.3 g/dl; P < 0.001). Both groups showed a highly significant decrease in INR values over time (reversal: 1.3 ± 0.2 at 180 ± 31 min after PCC application vs bleeding: 1.2 ± 0.2 at 147 ± 15 min after treatment; INR: P < 0.001 vs baseline, time: not significant). Creatinine and bilirubin concentrations at day 3 were not significantly increased in either group (P > 0.05), indicating no significant effect on renal and hepatic function. Conclusions Patients of the reversal group showed significant differences when compared with bleeding patients in terms of baseline INRs and cardiocirculatory situation (data not shown). Our results demonstrate that PCC can effectively improve INR in nonhypothermic surgical patients requiring coumarin reversal or experiencing severe bleeding. In almost all patients, this improvement in plasmatic coagulation was judged to be clinically significant, and allowed operative and/or interventional procedures.

Postoperative dose of tranexamic acid decreases postoperative bleeding and inflammatory response associated with cardiopulmonary bypass: a randomized, double-blind study

M Brouard, J Jimenez, J Iribarren, L Lorenzo, S Palmero,

I Rodriguez, R Perez, P Machado, J Raya, J Rodriguez, P Garrido, I Nassar, R De la Llana, R Martinez, M Mora

Hospital Universitario de Canarias, La Laguna, Tenerife, Spain Critical Care 2008, 12(Suppl 2):P222 (doi: 10.1186/cc6443)

Introduction Postoperative bleeding reflects haemostatic alterations associated with cardiopulmonary bypass (CPB), which may lead to an inflammatory response (IR). We evaluated the efficacy of different doses of tranexamic acid (TA) (before versus before and after CPB) on IR and postoperative bleeding.

Methods We performed a randomized, double-blind study with consecutive Caucasian adult patients undergoing elective CPB surgery from January 2006 to January 2007 in a 24-bed ICU at a university hospital. Of 209 consecutive patients, 49 met the criteria for exclusion. After obtaining informed written consent, patients were randomized to receive coded infusions of a single pre-CPB dose (40 mg/kg) of TA (n = 80), or 40 mg/kg TA before and after (twice) CPB (n = 80). We compared the incidence of IR (defined as core body temperature higher than 38°C (100.4°F) in the first 4 hours after intervention, systemic vascular resistance index <1,600 dyn·s·cm-5·m-2 and cardiac index higher than 3.5 l·min-1·m-2) and postoperative 24-hour bleeding. We also analyzed several biological parameters related to inflammation, coagulation, fibrinolysis and hemoderivative requirements. SPSS version 15 was used.
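The IR definition in Methods amounts to a conjunction of three thresholds, which can be expressed as a small predicate; the function name and example values are illustrative, and the 4-hour timing window for the temperature criterion is assumed to be enforced by the caller:

```python
def inflammatory_response(core_temp_c, svri, cardiac_index):
    """IR as defined in the study: core body temperature > 38 degrees C
    (within 4 h of intervention), systemic vascular resistance index
    < 1,600 dyn.s.cm-5.m-2, and cardiac index > 3.5 l.min-1.m-2."""
    return core_temp_c > 38.0 and svri < 1600.0 and cardiac_index > 3.5

# Hypothetical patient meeting all three criteria
print(inflammatory_response(38.6, 1450, 3.9))
```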

Results The incidence of post-CPB IR was significantly lower in the twice-TA group than in the single-TA group (7.5% vs 20%; P = 0.037). The twice-TA group had lower D-dimer at 4 and 24 hours after CPB (both P < 0.001). The twice-TA group lost less blood at 24 hours after CPB than the single-TA group: 670 (95% CI = 543-798) ml vs 827 (95% CI = 704-950) ml (P = 0.007). No differences in blood transfusions were observed. Conclusions We observed a significant reduction of IR and postoperative bleeding, with lower postoperative fibrinolysis, in the group of CPB patients who received TA before and after CPB.

Predicting response to recombinant activated factor VIIa administration in the critically ill

R Pugh, R Wenstone

Royal Liverpool University Hospital, Liverpool, UK

Critical Care 2008, 12(Suppl 2):P223 (doi: 10.1186/cc6444)

Introduction There is considerable interest in the potential use of recombinant activated factor VIIa (rFVIIa) as adjunctive therapy in major haemorrhage; to date, however, only a single RCT supports its use as rescue treatment [1]. Previous efforts have been made to establish which patients are most likely to benefit from rFVIIa using scoring systems, but the optimal circumstances remain unclear [2,3]. The purpose of this study was to investigate potential factors influencing response to rFVIIa (in terms of subsequent packed red cell (PRBC) transfusion) and survival in a cohort of nonhaemophiliac patients treated with rFVIIa for haemorrhage in our region. Methods We performed a retrospective analysis of the records of 40 nonhaemophiliac critically ill adults treated at seven hospitals in the Cheshire and Mersey region with rFVIIa for haemorrhage resistant to conventional management. The influence of potential factors on post-rFVIIa PRBC transfusion and ICU survival were evaluated using the Mann-Whitney U test and Fisher's exact test, respectively.

Results The 40 patients were surgical (21 patients), trauma (11 patients), obstetric (three patients), cardiothoracic (three patients) and medical (two patients). The median age was 53.5 years and 26 patients were male. A median single dose of 90 μg/kg rFVIIa was administered after a median 14.5 units of PRBC. Fifty-three per cent of patients survived to ICU discharge. Factors influencing PRBC transfusion in the 24 hours post rFVIIa administration and ICU survival are presented in Tables 1 and 2.

Conclusions While the optimal circumstances for rFVIIa administration remain unclear, it seems easier to focus on the question 'Who is unlikely to benefit?' pH < 7.1 at the time of administration of rFVIIa was associated with significantly increased PRBC transfusion and 100% mortality. Hypothermia (temperature <35°C) and cardiac arrest prior to rFVIIa administration were also

Table 1 (abstract P223)

Median post-rFVIIa PRBC transfusion in the presence or absence of factor

PRBC transfusion (units)

Factor                               Present   Absent   P value
pH < 7.1                             6         3        0.050*
Temperature < 35°C                   8         3        0.001*
Prior cardiac arrest                 10        3        0.034*
Prior PRBC transfusion > 20 units    4.5       3        0.062
INR > 1.5                            3         3        0.621
Platelet count < 50 x 10^9/l         4.5       3        0.609

*Significant at P < 0.05.

Table 2 (abstract P223)

Survival in the presence or absence of factor

Survival (%)

Factor                               Present   Absent   P value
pH < 7.1                             0         63       0.003*
Temperature < 35°C                   44        65       0.711
Prior cardiac arrest                 33        56       0.398
Prior PRBC transfusion > 20 units    47        56       0.745
INR > 1.5                            42        61       0.342
Platelet count < 50 x 10^9/l         25        56       0.331

*Significant at P < 0.05.

associated with significantly increased PRBC transfusion. Although there have been previous reports of survival following rFVIIa administration for haemorrhage in the presence of pH < 7.1, such profound acidaemia provides a strong indication that rFVIIa is likely to be futile. References

1. Boffard et al.: J Trauma 2005, 59:8-18.

2. Biss et al.: Vox Sang 2006, 90:45-52.

3. Bowles et al.: Br J Anaesth 2006, 97:476-481.

A multicentre prospective open-label study assessing efficacy and safety of a triple-secured fibrinogen concentrate in the treatment of postpartum haemorrhage

AS Ducloy-Bouthors1, F Broisin2, C Teboul3, A Jaffry3, T Waegemans3, B Padrazzi3

1Hôpital Jeanne de Flandre, Lille, France; 2Hôpital de la Croix-Rousse, Lyon, France; 3LFB, Les Ulis, France

Critical Care 2008, 12(Suppl 2):P224 (doi: 10.1186/cc6445)

Introduction Postpartum haemorrhage (PPH) is a major cause of global maternal morbidity and mortality. The use of haemostatic drugs in the therapeutic management of patients is mainly empirical. A rationale for an early treatment with fibrinogen, however, has been recently suggested [1].

Methods A multicentre, noncontrolled, phase 2 study was performed to assess the efficacy and the safety of a new triple-secured fibrinogen concentrate (FGT1; LFB, Les Ulis, France) in the treatment of PPH. A single median dose of 30 mg/kg was administered in addition to standard care. Patients were followed until 6 weeks after inclusion. Failure of treatment was defined as the need for ultimate resources (that is, invasive hemostatic intervention or treatment with recombinant activated factor VII) to stop the hemorrhage, or as massive transfusion or death. Other clinical criteria included the course of haemorrhage and the investigator's assessment using a four-point scale. Laboratory assessments were changes in fibrinogen plasma levels and in the FibTEM A15 parameter (RoTEM®). Safety included adverse events and vital signs.

Results Sixteen patients were included with a median (range) volume of haemorrhage of 1,667 (800; 3,160) ml at baseline. FGT1 succeeded in controlling the haemorrhage in 75% of the 12 patients who were clinically assessable. A convergence was observed for all efficacy criteria used. The median fibrinogen incremental recovery was 10.0 (g/l)/(g/kg) with a 7% concomitant median increase of A15 FibTEM. Biological ranges were also very large. Higher incremental recovery and relative increases of FibTEM A15 were associated with clinical success. FGT1 was well tolerated in all patients. Among the 14 adverse events reported, only one was serious, but all were reported as not related to FGT1. No thrombosis or allergic reaction to the study drug occurred. Conclusions This exploratory study suggests efficacy of FGT1 to control PPH in cases of failure of first-line treatments. Most severe PPH may require higher doses than used in this study. These results are to be confirmed by larger controlled trials. Reference

1. Charbit B, et al.: J Thromb Haemost 2007, 5:266-273.

Comparison of prothrombin complex concentrate and fresh frozen plasma on thrombin generation and hemorrhage in experimental dilutional coagulopathy

G Dickneite, I Pragst

CSL Behring, Marburg, Germany

Critical Care 2008, 12(Suppl 2):P225 (doi: 10.1186/cc6446)

Introduction Thrombin, the key enzyme of blood clotting, is responsible for the conversion of fibrinogen to fibrin, and for the activation of coagulation factors and platelets. Sufficient thrombin generation (TG) is therefore crucial for hemostasis. In an experimental dilutional coagulopathy in pigs with subsequent bone or spleen injury, the effect of substitution therapy with prothrombin complex concentrate (PCC) (Beriplex P/N; CSL Behring) or autologous porcine fresh frozen plasma (pFFP) on TG and hemorrhage was evaluated.

Methods A dilutional coagulopathy was induced in 44 anaesthetized pigs. Erythrocytes were retransfused, and the lost volume was replaced with hydroxyethyl starch. Animals were randomized to two study groups - A: bone injury, (1) placebo (n = 7), (2) PCC 25 U/kg (n = 7), (3) 15 ml/kg pFFP (pFFP15, n = 7), or (4) 40 ml/kg pFFP (pFFP40, n = 4); B: spleen injury, (1) placebo (n = 7), (2) PCC 25 U/kg (n = 6), or (3) 15 ml/kg pFFP (n = 6). A 3 mm bone injury was performed by drilling a hole into the femoral neck. A spleen incision was created with a scalpel blade. Blood loss (BL) and the time to hemostasis (TH) were determined. TG, the prothrombin time (PT) and coagulation factor levels were measured. Results The dilutional coagulopathy led to a decrease in circulating coagulation factors, a prolonged PT and decreased TG. Substitution with PCC and pFFP normalized the impaired PT, but only PCC normalized TG. PCC but not pFFP restored the decreased plasma levels of the prothrombin complex coagulation factors to sufficiently high levels. In the placebo group, TH and BL were 90.0 ± 27.4 minutes and 625 ± 330 ml after bone injury, and 82.8 ± 24.5 minutes and 757 ± 251 ml after spleen injury. After substitution therapy with PCC, a significantly faster TH of 39.6 ± 9.8 minutes (P = 0.0004) and a decreased BL of 191 ± 119 ml (P = 0.0127) were observed after bone injury. After spleen injury, TH was 45.3 ± 9.1 minutes (P = 0.0129) and BL was 356 ± 175 ml (P = 0.0152). Neither dose of pFFP corrected hemorrhage.

Conclusions Substitution with PCC but not FFP normalized TG and provided sufficient coagulation factors to reduce hemorrhage. In both models, of venous and arterial injury respectively, TH and BL were significantly decreased by PCC. It was concluded that the correction of TG correlates with hemostasis after traumatic injury.

Potential effects of infused particles in paediatric intensive care patients

T Jack1, B Brent1, M Mueller2, M Boehne1, A Wessel1, M Sasse1

1Medical School Hanover, Germany; 2Fraunhofer ITEM, Hanover, Germany

Critical Care 2008, 12(Suppl 2):P226 (doi: 10.1186/cc6447)

Introduction As a part of a clinical trial to evaluate the potential benefits of inline filtration on reducing major complications of paediatric ICU (PICU) patients (Clinical ID NCT 00209768), we examined the physical aspects and chemical composition of particles captured by inline microfilters. Additionally we investigated the inflammatory and cytotoxic effects of particles on human endothelial cells and macrophages in vitro.

Methods We analysed 22 filters used by critically ill children with electron microscopy and energy dispersion spectroscopy. The average number of particles on the surface as well as their composition was examined. In the in vitro model, human endothelial cells and murine macrophages were exposed to different solutions of glass particles and the cytokine levels assayed to assess their immune response. Levels of IL-1β, IL-6, IL-8, and TNFα were measured.

Results The average number of particles found on the surface of a filter membrane was 542/cm² and energy dispersion spectroscopy analysis confirmed silicon as one of the major particle constituents. When human endothelial cells and murine macrophages were exposed to different solutions of glass particles (matched to the particles found on the filter membranes), levels of IL-1β, IL-6, IL-8, and TNFα were found to be significantly suppressed. Conclusions Inline filtration prevents the infusion of potentially harmful particles. The suppression of macrophage and endothelial cell cytokine secretion by particles in vitro suggests that the infusion of microparticles may also contribute to the immune compromise often seen in the clinical course of PICU patients. These findings and their effect on the clinical outcome of our PICU patients remain to be elucidated.

What is the efficacy of a filter needle in the retention of typical bacterial pathogens?

W Elbaz, J Moore, C Goldsmith, G McCarthy, T Mawhinney

Belfast Health and Social Services Trust, Belfast, UK Critical Care 2008, 12(Suppl 2):P227 (doi: 10.1186/cc6448)

Introduction The use of filter needles to prepare medication is suggested as a way of preventing particulate contamination of infusions, which is regarded as a source of infection or inflammation in critical care patients [1]. To assess the impact of such needles purely on bacterial contamination, we carried out a comparison of the retention of four bacterial pathogens through a standard 25-swg needle (BD), a standard needle with a 5 μm filter (BD) and the filter of an Epidural minipack system (Portex). Comparisons were made at high and then low ('real world') bacterial contamination levels.

Methods We prepared four bacterial suspensions mixed together in peptone saline to produce Staphylococcus aureus (2.35 x 10⁶ colony-forming units (cfu)/ml), Bacillus cereus (5.77 x 10⁵ cfu/ml), Escherichia coli (4.38 x 10⁶ cfu/ml) and Pseudomonas aeruginosa (3.86 x 10⁶ cfu/ml) per 10 ml test fluid. This volume was then injected through each type of needle and the epidural filter, after which the filtrate was taken for culture. Small standard amounts were plated on Columbia blood agar while the remainder of the sample (9.9 ml) was mixed with double-strength nutrient broth and incubated for 24 hours at 37°C. This was repeated with three sets of needles and filters. We then prepared a low-density inoculate of 1 ml equating to a total density of 2.63 x 10² cfu of an equal mix of the above bacteria. This was injected through six sets of the devices in question, and both quantitative counts and cultures were performed. Student's t test was used to compare counts.

Results No bacteria could be cultured following the use of the 0.2 μm epidural filter either from Columbia blood agar or from broth at high or low contamination levels. In contrast, it was easy to isolate all four pathogens from both needles. In quantitative counts there was no difference in the mean counts (175 cfu/ml vs 190 cfu/ml) between filtered and unfiltered needles.
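The count comparison above used Student's t test; the following is a minimal, stdlib-only sketch of a two-sample (Welch) t statistic on hypothetical cfu/ml counts, purely for illustration - the sample values are made up, not the study's raw data:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Two-sample t statistic with unequal variances (Welch's form)."""
    se = (variance(a) / len(a) + variance(b) / len(b)) ** 0.5
    return (mean(a) - mean(b)) / se

# Hypothetical colony counts (cfu/ml) for filtered vs unfiltered needles;
# the abstract reports similar means (175 vs 190 cfu/ml), i.e. no difference.
filtered = [160, 175, 180, 170, 185, 180]
unfiltered = [185, 195, 190, 180, 200, 190]
print(round(welch_t(filtered, unfiltered), 2))  # prints -3.22 for these made-up samples
```

A |t| of this size with such small samples would still need comparison against the t distribution's critical value before any claim of significance.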

Conclusions In terms of preventing bacterial transmission where an infusion is contaminated (even at low levels), the use of a 5 μm filter needle is no better than a normal needle. In contrast, a 0.2 μm filter is highly efficient at preventing transmission, at least when resisting a single challenge. Reference

1. Stein HG: Medsurg Nurs 2006, 15:290-294.

Administration of a standardized plasma-protein solution (Biseko®) in high-risk patients with systemic inflammatory response syndrome: influence on cytokine levels and survival

B Meyer1, A Bojic1, GJ Locker1, T Staudinger1, B Wulkersdorfer1, M Mullner1, A Spittler1, E Schuster1, B Reith2, A Kaye3, M Frass1

1Medical University of Vienna, Austria; 2Clinic of Konstanz, Germany; 3LSU School of Medicine, New Orleans, LA, USA Critical Care 2008, 12(Suppl 2):P228 (doi: 10.1186/cc6449)

Introduction Morbidity and mortality in patients suffering from systemic inflammatory response syndrome (SIRS) remain high. The present study was carried out because proteins might serve as promising agents to reduce mortality in SIRS patients. We therefore investigated the effects of a plasma-protein solution containing immunoglobulins on serum cytokine levels and survival. Methods This prospective, double-blind, randomized, controlled trial was performed in the medical ICU of a university hospital. Forty consecutive patients with SIRS were randomized to receive either a commercially available standardized plasma-protein solution (Biseko®; Biotest, Dreieich, Germany) [1], consisting of all important transport and inhibitor proteins as well as immunoglobulins, or a 5% albumin solution. Plasma/albumin was given intravenously at a volume of 1,000 ml on the first day and 500 ml/day during the following 4 days. Serum cytokine levels of IL-1β and IL-6 were measured on days 1-6; TNFα and TNF-R levels were determined on days 1 and 14 and at day 28. Survival was assessed on day 28 and on day 180.

Results Eighteen patients received Biseko®, 20 patients received albumin. Two patients died before receiving the complete study medication. During days 1-6 of the study period, serum levels of IL-1β were significantly lower in patients with Biseko® therapy compared with patients receiving albumin (IL-1β AUC 65 ± 71 vs 111 ± 157, P = 0.03). No statistically significant difference could be found in serum levels of IL-6, TNFα and TNF-R between the groups. While a statistically nonsignificant trend towards better survival could be observed in the Biseko® group on day 28, the survival rate on day 180 was significantly higher in the Biseko® group (50% (9/18)) than in the albumin group (10% (2/20); P < 0.008).

Conclusions The data suggest that Biseko® therapy was associated with significantly lower IL-1β plasma concentrations (days 1-6) and with improved survival rates. Reference

1. Keller E, et al.: Comparison of fresh frozen plasma with a standardized serum protein solution following therapeutic plasma exchange in patients with autoimmune disease. Ther Apher 2000, 4:332-337.

Albumin versus colloids in colon surgery patients: preliminary results

K Kotzampassi1, V Grosomanidis2, K Andreopoulos2, CH Skourtis2, E Eleftheriadis1

1Department of Surgery and 2Department of Anaesthesiology, University of Thessaloniki Medical School, Thessaloniki, Greece Critical Care 2008, 12(Suppl 2):P229 (doi: 10.1186/cc6450)

Introduction In colon surgery for malignancy, tissue oedema as a result of increased capillary permeability - due both to the operational stress response and to perioperative fluid therapy - may contribute not only to systemic consequences but also to anastomotic dehiscence, with questionable end results for the patient. The use of albumin, at least theoretically justified by the accompanying hypoalbuminaemia, is considered the gold standard for prevention of this complication; in recent years, however, its use has become controversial. On the other hand, after acknowledgement of the pharmacokinetic advantages of synthetic colloids, there has been an ongoing shift towards their use as perioperative fluid therapy in major elective surgery, too. We aimed to investigate the effect of colloids as a postoperative regimen against routinely given human albumin in patients subjected to colectomy for cancer. Thirty-day morbidity, including anastomotic leakage, abdominal wound infection and dehiscence, as well as organ-specific and systemic infections, sepsis and septic shock, was assessed. Methods Fifty colon cancer patients with an actual indication for early postoperative albumin treatment were randomized to receive either human albumin (100 ml/day) or 6% HES 130/0.4 (Voluven; Fresenius AG) (500 ml/day) for six consecutive days. Patients were then followed up for the next 30 days.

Results In the albumin and Voluven groups, anastomotic leakage was prominent in three patients and one patient, respectively; wound infection in three patients and one patient, respectively; systemic infection in five patients and four patients, respectively; and sepsis in two patients and zero patients, respectively. One patient finally died from sepsis in the albumin group. Conclusions We conclude that, in our study, patients receiving Voluven rather than albumin as a perioperative 6-day treatment exhibited lower morbidity rates. However, further research is needed.

References

1. Desborough JP: Br J Anaesth 2000, 85:109-117.

2. Wilkes MM, Navickis RJ: Ann Intern Med 2001, 135:149-164.

3. Lang K, Suttner S, Boldt J, Kumle B, Nagel D: Can J Anaesth 2003, 50:1009-1016.

Influence of hydroxyethyl starch infusions on cerebral blood flow in patients with severe traumatic brain injury

A Akmalov, V Avakov

Tashkent Medical Academy, Tashkent, Uzbekistan

Critical Care 2008, 12(Suppl 2):P230 (doi: 10.1186/cc6451)

Introduction The aim of the study was to evaluate the effects of hydroxyethyl starch (HES) solutions in patients with severe brain injury depending on the cerebral blood flow (CBF) type. Methods One hundred and twenty-six patients (female/male ratio, 32/94) with severe brain injury (GCS < 8) were included. 76.2% of patients underwent surgery due to depressed skull fracture or intracerebral hemorrhage. CBF was assessed by transcranial Doppler ultrasonography (insonation of M1-2 segments of the middle cerebral artery (MCA) with assessment of linear blood velocity (LBV)). Patients were divided into three groups: I, with brain hyperemia (BH) (n = 42); II, with cerebral vasospasm (CVS) (n = 58); III, with brain hypoperfusion (BHP) (n = 26). All patients received volume replacement with 6% HES 200/0.5. Results Infusion of HES in group I caused a statistically significant increase of the cerebral blood volume, a rise of intracranial pressure (ICP) and central venous pressure (CVP), and led to deeper depression of consciousness. In group II, HES provided some reduction of LBV in the MCA and improvement of consciousness; the increase in CVP was statistically significant. In group III, increases in CVP and LBV in the MCA were observed. See Table 1. Conclusions Volume replacement with HES during severe traumatic brain injury gave the best results in patients with CVS and BHP. HES in patients with BH is not expedient as it may increase ICP and CVP and lead to deeper depression of consciousness.

Haemodynamic effects of a fluid challenge with hydroxyethyl starch 130/0.4 (Voluven®) in patients suffering from symptomatic vasospasm after subarachnoid haemorrhage

L Van Tulder, RL Chiolero, M Oddo, P Eggimann, L Regli, JP Revelly

Centre Hospitalier Universitaire Vaudois, Lausanne, Switzerland Critical Care 2008, 12(Suppl 2):P231 (doi: 10.1186/cc6452)

Introduction Patients suffering from symptomatic cerebral artery vasospasm (CAV) after subarachnoid haemorrhage (SAH) develop alterations in sodium and fluid homeostasis, and the effects of fluid

Table 1 (abstract P230)

Data BH initial BH after 4 hours CVS initial CVS after 4 hours BHP initial BHP after 4 hours

GCS 7.8 ± 0.4 5.7 ± 0.5* 7.6 ± 0.6 8.8 ± 0.7 5.4 ± 0.4 5.9 ± 0.4

LBV in MCA (cm/s) 148.2 ± 11.4 165.3 ± 13.5* 156.6 ± 15.2 139.3 ± 14.8 63.2 ± 5.8 81.2 ± 7.1*

CVP (cmH2O) 9.6 ± 3.5 11.3 ± 5.9* 5.6 ± 2.4 7.3 ± 3.7* 7.2 ± 4.1 9.1 ± 5.3*

ICP (mmHg) 109.4 ± 5.8 132.3 ± 6.2* 101.3 ± 3.1 108.1 ± 4.9 180.3 ± 9.4 168.4 ± 7.2

*P < 0.05.

infusion are uncertain. Since their fluid management is controversial, we assessed the effect of a single colloid infusion on global haemodynamics and fluid balance.

Methods In a prospective study, 500 ml of 130/0.4 hydroxyethyl starch (HES) was administered over 30 minutes in patients with CAV after SAH. The mean arterial pressure (MAP), central venous pressure (CVP), fluid balance, cardiac index (CI), intrathoracic blood volume (ITBV), and extravascular lung water (EVLW) were measured immediately before, and 60, 120, 180 and 360 minutes after HES by transpulmonary thermodilution (PiCCO; Pulsion). Patients increasing CI by more than 10% were considered responders (R), versus nonresponders (NR). Comparisons were made between groups by one-way ANOVA, and at various time points by two-way ANOVA (P < 0.05 significant, mean ± SD). Results After HES, the CI changed from -14% to +62%. Considering all patients (n = 20), the CI increased at 60 minutes (4.3 ± 0.7 vs 4.8 ± 0.9 l/m2/min, P < 0.05) but returned to the baseline value at 120 minutes (4.6 ± 0.9 l/m2/min) and thereafter. There was no difference in the MAP, CVP, ITBV and EVLW over time. Ten patients were R and 10 were NR. Baseline MAP, CVP, CI, ITBV and EVLW were not different between R and NR. The norepinephrine infusion rate was higher in NR than in R (18 ± 12 vs 6 ± 9 μg/min, P < 0.05). The CI increased in R from 60 to 180 minutes, and returned to baseline at 360 minutes (respectively 4.0 ± 0.6, 5.2 ± 1.1, 4.9 ± 1.1, 4.8 ± 1.0, and 4.2 ± 1.1 l/m2/min). The evolution of fluid balance was different between R and NR: it remained unchanged in R, while it was negative at 360 minutes in NR (-0.60 ± 0.87 vs -0.04 ± 0.47). The MAP, CVP, ITBV and EVLW were not different between R and NR throughout. Conclusions By transpulmonary thermodilution, the haemodynamic effects of a short HES infusion were variable and unpredictable. In R, the increase in cardiac output lasted 3 hours. In NR, fluid therapy should be considered with caution, since it may be associated with a negative fluid balance, probably due to cerebral salt wasting.
Our data suggest that fluid therapy should be closely monitored in this population of patients with altered homeostasis.

Hydroxyethyl starch 200/0.5 induces more renal macrophage infiltration than hydroxyethyl starch 130/0.42 in an isolated renal perfusion model

L Hüter, T Simon, L Weinmann, G Marx

Friedrich Schiller Universität, Jena, Germany

Critical Care 2008, 12(Suppl 2):P232 (doi: 10.1186/cc6453)

Introduction The pathological renal mechanisms of hydroxyethyl starch (HES) have not been identified. We designed an isolated renal perfusion model in which we tried to identify possible mechanisms of injury for different HES preparations.

Methods After approval by the local animal protection committee, 24 porcine kidneys were studied in an isolated renal perfusion model of 6 hours' duration. We compared three different infusion solutions: 10% HES 200/0.5 (HES200) versus 6% HES 130/0.42 (HES130) versus Ringer's lactate (RL). Infusion was supplied to achieve a stable hematocrit of 0.2. Tubular damage was assessed with N-acetyl-β-D-glucosaminidase (β-NAG). After immunohistological staining, proliferation (proliferating cell nuclear antigen (PCNA)) and macrophage activation (ED-1+ macrophages (ED-1)) were analyzed as positive cells/visual field. Effects of infusion solution and time were statistically analyzed by ANOVA for repeated measurements. The histological changes were analyzed using ANOVA. Results β-NAG was significantly different between groups (P < 0.001) (Figure 1). For ED-1 there were significant differences

Figure 1 (abstract P232). N-acetyl-β-D-glucosaminidase over time (hours) for the three infusion solutions (HES 200, HES 130, RL).

between HES200 and HES130 (1.3 ± 0.4 vs 0.17 ± 0.04, P = 0.044). Proliferation was significantly greater in the HES200 versus the HES130 group (18.8 ± 3.2 vs 7.2 ± 0.8, P = 0.008). Subanalysis of PCNA showed that these differences occurred in the interstitium and not in the glomerulus (18.0 ± 0.3 vs 6.5 ± 0.1, P = 0.006). Conclusions For the first time we identified proliferation and macrophage activation as the pathomechanism causing renal injury after HES application. Furthermore, the degree of renal injury in our model was significantly lower using HES130 compared with HES200.
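The group comparisons above used ANOVA; as an illustrative, stdlib-only sketch, a one-way ANOVA F statistic computed on hypothetical positive-cells/visual-field counts (made-up values, not the study's data):

```python
from statistics import mean

def one_way_f(*groups):
    """One-way ANOVA F statistic: between-group over within-group mean squares."""
    grand = mean(x for g in groups for x in g)
    k = len(groups)
    n = sum(len(g) for g in groups)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical ED-1-positive macrophage counts per visual field
hes200 = [1.2, 1.5, 1.1, 1.4]
hes130 = [0.20, 0.15, 0.18, 0.16]
ringer = [0.10, 0.12, 0.09, 0.11]
f = one_way_f(hes200, hes130, ringer)  # large F suggests the group means differ
```

The F value would then be compared against the F distribution with (k - 1, n - k) degrees of freedom to obtain a P value; the repeated-measures ANOVA used for the time-course data is a more involved variant of the same idea.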

Hydroxyethyl starch 130/0.42/6:1 for perioperative plasma volume replacement is safe and effective in children

R Suempelmann, W Osthaus

Medizinische Hochschule Hannover, Germany

Critical Care 2008, 12(Suppl 2):P233 (doi: 10.1186/cc6454)

Introduction In several clinical studies it has been shown that hydroxyethyl starch (HES) may be as effective and safe as, but less expensive than, albumin when used for perioperative plasma volume replacement in children. The new third-generation HES 130/0.42 solution was designed to reduce adverse drug reactions (ADR) and improve safety while maintaining efficacy. In this prospective multicentre observational postauthorization safety study, therefore, the perioperative application of HES 130/0.42 was examined in children with a focus on possible ADR.

Methods In the first year, approximately 300 pediatric patients aged up to 12 years with risk score ASA I-III undergoing perioperative application of HES 130/0.42 (Venofundin 6%; Braun, Germany) were to be included. According to statistics, this number of patients is sufficient to show a 1% occurrence of ADR. After approval by a local ethics committee, patient data and those relating to the application of HES 130/0.42, the performed procedure, anaesthesia-related data and ADR were documented with a focus on cardiovascular stability, hemodilution, acid-base balance, renal function, blood coagulation and hypersensitivity.

Results Three hundred and sixteen children (ASA I-III, age 3 (SD 3.4, range day of birth-12) years, body weight 13 (SD 10.5, range 1.1-60) kg) were studied in five centres in Germany, Austria and Italy until August 2007. Forty-five percent of the patients underwent abdominal, 12.4% urological, 11.4% thoracic, 7.6% orthopedic and 7% cardiac surgical procedures. The mean volume of infused HES 130/0.42 was 11 (SD 4.8, range per day 5-42) ml/kg. Cardiovascular stability was maintained in all cases. After HES infusion, values of hemoglobin (11.5 vs 10.25 g/dl), base excess (-2 vs -2.7 mmol/l), anion gap (12.9 vs 11.17 mmol/l) and strong ion difference (34.3 vs 31.4 mmol/l) decreased and chloride (105.7 vs 107.8 mmol/l) increased significantly (P < 0.05). No serious ADR (i.e. bleeding, renal insufficiency, hypersensitivity) were observed.

Conclusions Moderate doses of HES 130/0.42 help to maintain cardiovascular stability and lead to only moderate changes in haemoglobin concentration and acid-base balance in children. The probability of serious ADR is lower than 1%. HES 130/0.42/6:1 for plasma volume replacement therefore seems to be safe and effective even in neonates and small infants.

Effects of two different hydroxyethylstarch solutions on colloid osmotic pressure and renal function in ovine endotoxemic shock

C Ertmer1, S Rehberg1, M Lange1, A Morelli2, C Hucklenbruch1, B Ellger1, H Van Aken1, M Westphal1

1 University of Muenster, Germany; 2University of Rome 'La Sapienza', Rome, Italy

Critical Care 2008, 12(Suppl 2):P234 (doi: 10.1186/cc6455)

Introduction The purpose of the present study was to directly compare the impact of two different hydroxyethylstarch solutions (HES 6% 130/0.4, Voluven® 6%; and HES 10% 200/0.5, Hemohes® 10%) and a balanced crystalloid (Sterofundin® ISO) on the colloid-osmotic pressure (COP) and renal function in fulminant ovine endotoxemia.

Methods Thirty healthy ewes received a continuous infusion of Salmonella typhosa endotoxin started at 5 ng/kg/min, which was doubled every hour until the mean arterial pressure fell below 65 mmHg. Thereafter, sheep were randomized (each group n = 10) to either receive repeated bolus infusions of 5 ml/kg HES 130 or HES 200, or 10 ml/kg crystalloid to increase the central venous pressure (8-12 mmHg), pulmonary arterial occlusion pressure (12-15 mmHg) and mixed-venous oxygen saturation (>65%). Following infusion of the maximum colloid dose (20 ml/kg), all groups received only crystalloid infusions (10 ml/kg), if necessary. Animals surviving the 12-hour intervention period were anesthetized and killed. Data are expressed as means ± SEM. Statistics were performed using two-way ANOVA with Student-Newman-Keuls post-hoc comparisons.

Results The colloid groups needed less total fluids than the crystalloid group. Apart from significantly higher systemic oxygen delivery index and stroke volume index in sheep treated with HES 130, the hemodynamics were comparable between groups. COP was higher in both colloid-treated groups as compared with the crystalloid group (12.5 ± 0.6 and 14.7 ± 1.0 vs 8.4 ± 1.7; P < 0.05 for HES 130 and HES 200 vs crystalloids). However, there was no significant difference in COP between the two colloid groups (P = 0.429). Urinary output was markedly reduced in the HES 200 group (2.5 ± 0.9 vs 5.7 ± 1.3 and 7.2 ± 1.2 ml/kg/hour; P < 0.05 for HES 200 vs HES 130 and crystalloids). Plasma creatinine was highest in sheep treated with HES 200 (1.4 ± 0.1 vs 1.0 ± 0.0 and 1.0 ± 0.1 mg/dl; P < 0.05 for HES 200 vs HES 130 and crystalloids).

Conclusions In ovine endotoxemia, treatment with HES 200 may compromise oxygen transport and renal function as compared with HES 130 and crystalloids. The underlying mechanisms remain to be elucidated but appear to be independent of COP.

Transfusion policy and outcome in critically ill patients with a long ICU stay

I Grigoras, O Chelarescu, D Rusu

Sf Spiridon Hospital, Iasi, Romania

Critical Care 2008, 12(Suppl 2):P235 (doi: 10.1186/cc6456)

Introduction Patients with a long ICU stay (>7 days) are prone to develop anemia due to high severity of disease, repeated phlebotomies and an inflammatory status with altered erythropoiesis. They are also more prone to receive a blood transfusion. The aim of our study was to assess the hemoglobin (Hb) transfusion trigger and the influence of blood transfusion on outcome in critically ill patients with an ICU length of stay (LOS) >7 days. Methods The prospective noninterventional study was performed in a mixed 19-bed ICU of a tertiary care university hospital and included all patients with an ICU LOS >7 days admitted during 1 year. Patients were divided into two groups: patients never transfused (NT group), and patients ever transfused (ET group). Collected data were demographic data, severity scores, Hb transfusion trigger, transfusion data, ICU LOS and outcome. Statistical analysis was conducted using the Student t test and multinomial logistic regression. Results The study enrolled 132 patients (NT, 54 patients; ET, 78 patients) with a mean ICU LOS of 12.9 days, a mean worst APACHE II score of 22.8 and a mean worst SOFA score of 9.3. Anemia (Hb <12 g%) was present in 83.3% of patients at ICU admission and in 95.4% at ICU discharge. In the ET group the transfusion trigger Hb was 7.8 ± 2.3 g%. In the ET group a total of 228 red blood cell units were transfused on 154 different occasions with a median of 2 (1-16) units/patient. The mortality was significantly different in the ET group (51 patients, 65.3%) versus the NT group (11 patients, 20.7%). Mortality significantly correlated with the worst SOFA score (P = 0.041) and mostly with transfusion status (P = 0.002). See Table 1.

Table 1 (abstract P235)

Patient data

Variable NT group ET group P value

Patients, n (%) 54 (40.9%) 78 (59.1%) NS

Worst APACHE score 23.2 ± 9.5 22.2 ± 9.8 NS

Worst SOFA score 8.9 ± 3.6 9.8 ± 4.4 NS

ICU mortality 11 (20.7%) 51 (65.3%) 0.0004

Conclusions The incidence of anemia in critically ill patients with a long ICU stay is high (83% at ICU admission, 95% at ICU discharge). The transfusion trigger Hb was 7.8 g%, a value consistent with the current restrictive transfusion policy. Blood transfusion was an independent risk factor for increased mortality in the ET group.
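As a worked check on Table 1, the unadjusted odds ratio for ICU mortality with transfusion can be computed from the reported counts (ET: 51 deaths/27 survivors; NT: 11 deaths/43 survivors); note this is the crude association only, not the adjusted regression result reported in the abstract:

```python
import math

def odds_ratio(d1, s1, d0, s0):
    """Unadjusted odds ratio for a 2x2 table: exposed deaths/survivors over unexposed."""
    return (d1 / s1) / (d0 / s0)

# Counts from Table 1 (abstract P235)
or_et = odds_ratio(51, 27, 11, 43)  # roughly 7.4

# Approximate 95% CI on the log-odds scale (Woolf method)
se = math.sqrt(1 / 51 + 1 / 27 + 1 / 11 + 1 / 43)
lo, hi = (math.exp(math.log(or_et) - 1.96 * se),
          math.exp(math.log(or_et) + 1.96 * se))
```

A crude odds ratio this far above 1 is consistent with the significant mortality difference in the table, but only the multivariable model can speak to independence from severity scores.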

Outcome of surgical patients who needed blood transfusion

DO Toledo, JM Silva Jr, A Gulinelli, DD Magalhaes, F Correa, H Nuevo, KR Genga, AM Isola, E Rezende

Hospital do Servidor Público Estadual, Sao Paulo, Brazil Critical Care 2008, 12(Suppl 2):P236 (doi: 10.1186/cc6457)

Introduction Blood transfusions are associated with immune modulation, a higher postoperative infection rate, transmission of infectious diseases and higher hospital costs. Measures are necessary to avoid blood transfusion both intraoperatively and postoperatively in surgeries with major blood loss. This study aimed to verify the practices of blood transfusion in surgical patients. Methods A prospective cohort study, with a follow-up of 10 months, in a surgical center of a tertiary hospital. Inclusion criteria were age above 18 years and the need for blood transfusion in the intraoperative period. Exclusion criteria were refusal of blood transfusion for religious reasons, coronary artery disease, and acute brain injury. The decision to transfuse was the responsibility of the surgical team.

Results Eighty patients were included, with mean age 68.3 ± 13.1 years, 55% female. The POSSUM and MODS scores were 36.2 ± 10.3 and 2.4 ± 1.9, respectively. Eighty-two percent of the surgeries were elective, with a mean length of 6.3 ± 3.2 hours. The basal hemoglobin level was 12.3 ± 1.6 g/dl, and at the moment of blood transfusion it was 8.4 ± 1.8 g/dl. Patients were transfused, on average, 2.3 ± 0.9 units of packed red cells, stocked for 16.8 ± 11.8 days. The hospital mortality rate was 24.6%. Patients who died (death vs discharge) were older (77.3 ± 8.1 vs 66.2 ± 13.4 years; P = 0.005), had higher POSSUM (44.2 ± 10.6 vs 33.4 ± 9.1; P = 0.001) and MODS (3.4 ± 1.9 vs 2.1 ± 1.7; P = 0.02) scores, and more often had complications in the first 28 postoperative days (92.9% vs 39.5%; P < 0.001), such as infections, tissue hypoperfusion, shock, neurologic disturbances, ARDS, ARF, and digestive fistulae, in decreasing order of frequency.

Conclusions The mean hemoglobin level used to trigger blood transfusion in surgical patients was 8.4 ± 1.8 g/dl, and patients were transfused 2 units packed red cells on average. Age, POSSUM and MODS scores, urgent surgeries, and 28-day postoperative complications determined a worse outcome in this population. References

1. Hebert PC, et al.: Does transfusion practice affect mortality in critically ill patients? Am J Respir Crit Care Med 1997, 155:1618-1623.

2. Park KW: Transfusion-associated complications. Int Anesthesiol Clin 2004, 42:11-26.

Immediate transfusion without crossmatching

Y Moriwaki, M Iwashita, J Ishikawa, S Matsuzaki, Y Tahara, H Toyoda, T Kosuge, M Sugiyama, N Suzuki

Yokohama City University Medical Center, Yokohama, Japan Critical Care 2008, 12(Suppl 2):P237 (doi: 10.1186/cc6458)

Introduction Red cell blood transfusion therapy (RC-BTF) should be performed only after crossmatching, to reduce the overall risks of this treatment, particularly ABO-incompatible transfusion. In a rapidly catastrophic bleeding condition, however, we often have to perform it without crossmatching and without establishment of the ABO blood type. Our BTF manual (2002) states that the ABO blood type is confirmed only after double examination, usually a first examination in the routine work-up at admission or first visit to the hospital and a second during the crossmatching procedure. Most of our medical staff have feared that type O RC-BTF would be unacceptable to patients and their families. The aim of this study was to establish the safety of immediate RC-BTF without crossmatching and to clarify how to explain this safety to medical staff, patients and their families.

Methods We examined the medical records of the patients who underwent immediate RC-BTF without crossmatching over the past 5 years in our Critical Care and Emergency Center. Data were the number of requested and used packed red cells (PRC) without crossmatching, adverse events of transfusion, and incidents concerning RC-BTF therapy.

Results Over 5 years in our Critical Care and Emergency Center, 1,036 units of PRC were used for 109 cases without crossmatching. Type O RC-BTF without crossmatching before blood type detection was performed in 30 cases. These 109 patients received a mean of 9.42 units of PRC without crossmatching. Patients who underwent RC-BTF without crossmatching at first and with crossmatching subsequently received a mean of 6.10 units of noncrossmatched PRC. On the other hand, patients who underwent repeated RC-BTF without crossmatching required a mean of 7.03 units at first and additional noncrossmatched units subsequently. In total, we actually used 85% of the requested units of noncrossmatched PRC in immediate RC-BTF. No incompatible RC-BTF was performed, and eight incidents were noted (errors in blood sampling, labeling of the examination tube, data entry, and blood type detection). Conclusions Our manual for transfusion therapy is safe and useful. In immediate RC-BTF without crossmatching, we should use type O PRC, and with step-by-step education we can make this therapy acceptable to medical staff in the hospital as well as to patients and their families.

Intraoperative red blood transfusion is associated with adverse outcome after cardiac surgery

L Hajjar1, F Galas1, S Zeferino1, W Leao1, G Fallico2, J Auler Jr1

1Heart Institute - Faculdade de Medicina da Universidade de Sao Paulo, Brazil; 2University of Catania Policlinico G. Rodolico of Catania, Trieste, Italy

Critical Care 2008, 12(Suppl 2):P238 (doi: 10.1186/cc6459)

Introduction Bleeding occurs in about 20% of cardiac surgeries and is associated with significant morbidity. In this context, red cell transfusions are used to augment the delivery of oxygen and avoid the deleterious effect of oxygen debt. However, blood cell transfusions are associated with morbidity and mortality in clinical studies of critically ill patients. The purpose of this study was to evaluate the relationship between blood transfusion in the operating room and clinical outcomes after cardiac surgery. Methods We performed a consecutive observational study in a university hospital. A total of 125 patients undergoing elective coronary artery bypass graft surgery or valve surgery were studied. Demographic data were analyzed. Postoperative cardiac dysfunction was defined as low cardiac output or hemodynamic instability requiring inotropic support for >24 hours. Renal dysfunction was defined as a >50% increase in serum creatinine from baseline. Univariate and multivariate analyses were performed. Results Of 125 patients, 41 (32.8%) received blood transfusion in the operating room. Patients who received blood transfusion had more postoperative complications than those who did not (P < 0.001). Intraoperative blood transfusion was a risk factor for low-output syndrome (P = 0.02), renal dysfunction (P < 0.003), infection (P < 0.008) and longer stay in the ICU (P = 0.01). In a multivariate analysis, blood cell transfusion independently increased the risk of renal dysfunction (OR = 5.5, 95% CI = 1.5-16.9). Conclusions In this observational study, intraoperative blood cell transfusion predicted the outcome after cardiac surgery, resulting in more postoperative complications. These findings suggest that the criteria for perioperative blood transfusion should be revised, considering the potential risks.


Reference

1. Covin RB, et al.: Hypotension and acute pulmonary insufficiency following transfusion of autologous red blood cells during surgery: a case report and review of the literature. Transfus Med 2004, 14:375-383.

Acute transfusion reactions in critically ill pediatric patients

N Ozata, D Demirkol, M Karabocuoglu, A Citak, R Ucsel, N Uzel

Istanbul Medical Faculty, Istanbul, Turkey

Critical Care 2008, 12(Suppl 2):P239 (doi: 10.1186/cc6460)

Introduction This study was undertaken to determine the incidence, type, and severity of acute transfusion reactions observed in a tertiary care pediatric ICU.

Methods All transfusions of blood products administered to consecutive patients admitted to our pediatric ICU between February 2006 and February 2007 were prospectively recorded. For each transfusion, the bedside nurse recorded the patient's status before, during, and up to 4 hours after the transfusion, as well as the presence of any new sign or symptom suggesting an acute transfusion reaction.

Results A total of 651 transfusions were administered during the study period. Sixty-one febrile nonhemolytic transfusion reactions (9.4%) were recorded. No allergic reactions, hypotensive reactions or transfusion-related acute lung injury were seen. Seventy-seven percent (n = 47) of the febrile reactions were recorded during red blood cell transfusions.

Conclusions The incidence of febrile nonhemolytic reactions was higher than that reported in comparable studies; a possible cause is the use of non-leukoreduced components. Transfusion-related acute lung injury is not common in critically ill pediatric patients. These estimates are useful for decisions concerning transfusion therapy, and for evaluating the efficacy of interventions to reduce risk in critically ill pediatric patients. References

1. Gauvin F, Lacroix J, Robillard P, Lapointe H, Hume H: Acute transfusion reactions in the pediatric intensive care unit. Transfusion 2006, 46:1899-1908.

2. Sanchez R, Toy P: Transfusion related acute lung injury: a pediatric perspective. Pediatr Blood Cancer 2005, 45:248-255.

Utility of an artificial oxygen carrier in a rat haemorrhagic shock model

M Yanai, K Morisawa, H Takahashi, S Fujitani, K Ishigooka, Y Taira

St Marianna University School of Medicine, Kawasaki, Japan Critical Care 2008, 12(Suppl 2):P240 (doi: 10.1186/cc6461)

Introduction Hemorrhagic shock is a frequently encountered entity in critical care. The initial treatment for hemorrhagic shock is rapid replacement of massive extracellular fluid (ECF); however, sufficient oxygen transportation to peripheral tissue cannot be achieved by ECF replacement alone. Recently, using state-of-the-art nanotechnology, an artificial oxygen carrier based on liposome-encapsulated hemoglobin (hemoglobin vesicles, HbV) has been developed. In this study, by measuring serum and tissue lactate levels, we investigated the effect of HbV on peripheral oxygen metabolism using the microdialysis method in a fatal hemorrhagic shock model in Wistar rats. Methods A cannula was placed in the femoral vein of each rat, and probes for measurement of tissue oxygen partial pressure and for microdialysis were inserted into the subcutaneous tissue of the abdominal wall. A shock model was created by exsanguination from the femoral vein of 60% of the total blood volume, until the mean arterial pressure decreased to 25 ± 5 mmHg. We randomly assigned this shock model to two arms: an ECF replacement arm (n = 10) and an HbV arm (n = 8). The mean arterial pressure, plasma oxygen pressure, plasma lactate (p-lac), tissue partial oxygen pressure and tissue lactate (t-lac) were measured every 50 minutes after exsanguination until 250 minutes had elapsed.

Results Elevation of p-lac and t-lac was observed in each arm after exsanguination. In both arms, p-lac rose rapidly and then decreased after exsanguination. Whereas p-lac in the ECF arm re-elevated, that in the HbV arm decreased steadily. The p-lac in the HbV arm was significantly lower than that in the ECF arm at each measured point (P < 0.05). After exsanguination, t-lac rose rapidly and then continued to rise in the ECF arm, while t-lac in the HbV arm decreased. Similarly, t-lac in the HbV arm was significantly lower than that in the ECF arm at each measured point (P < 0.05). Overall, the HbV arm showed significantly lower p-lac and t-lac than the ECF arm in this fatal rat model of hemorrhagic shock. Conclusions Our study showed that HbV had better outcomes than ECF in terms of oxygen metabolism in peripheral tissue. References

1. Klaus S, et al.: Intensive Care Med 2003, 29:634-641.

2. Sakai H, et al.: Biomaterials 2004, 25:4317-4325.

Clinical outcome and mortality associated with postoperative low cardiac output after cardiopulmonary bypass: a cohort study

J Jimenez, J Iribarren, M Brouard, L Lorente, R Perez, S Palmero, C Henry, J Malaga, J Lorenzo, N Serrano, R Martinez, M Mora

Hospital Universitario de Canarias, La Laguna, Tenerife, Spain Critical Care 2008, 12(Suppl 2):P241 (doi: 10.1186/cc6462)

Introduction Postoperative low cardiac output (PLCO) remains a serious complication after cardiopulmonary bypass (CPB). Our aim was to determine the incidence and clinical outcome of PLCO. Methods We performed a cohort study in consecutive patients who underwent CPB surgery over a period of 8 months. PLCO was defined as a dobutamine requirement >5 μg/kg/min for longer than 4 hours, after optimization of the pulmonary capillary wedge pressure to 18 mmHg, to achieve a cardiac index higher than 2.2 l/min/m2. We recorded the preoperative left ventricular function, postoperative haemodynamic parameters, and clinical outcomes (postoperative arrhythmias, length of mechanical ventilation, ICU and hospital stays, and mortality). SPSS version 15 was used. Results We studied 166 patients, 50 (30.1%) women and 116 (69.9%) men, mean age 67 ± 1 years. Surgical procedures were 92 (55.4%) coronary artery bypass grafting, 55 (33.1%) valvular, 16 (9.5%) combined surgery and three (1.8%) other procedures. The preoperative left ventricular function was 65 ± 10%, with no difference between patients with and without PLCO. Thirty-nine (23.5%) patients developed PLCO. Aortic clamping and CPB times showed no differences. According to the type of surgery, PLCO occurred in 19 (48.7%) valvular procedures, 14 (35.9%) coronary artery bypass grafting procedures and six (15.4%) combined procedures (P = 0.037). According to the type of valvulopathy, PLCO was associated with mitral valvulopathy in 16 (59.3%) cases versus 11 (40.7%) other valvulopathies (P = 0.011). Patients with PLCO needed longer mechanical ventilation (15 (7-37) hours versus 7 (5-10) hours, P < 0.001), ICU stay (5 (3.5-12.5) days versus 3 (2-4) days, P < 0.001) and hospital stay (20 (15-28) days versus 24 (18-37) days, P = 0.022). We observed 50 postoperative arrhythmias, 22 of which occurred in patients with PLCO (P < 0.001). There were nine deaths, seven of which were in patients with PLCO (P < 0.001).

Conclusions PLCO was associated with valvular procedures, particularly mitral valvulopathy. Patients with PLCO had a higher incidence of arrhythmias, longer ICU and hospital stays, longer mechanical ventilation and higher mortality.

Renal insufficiency is a powerful predictor of worse outcome in patients with acute heart failure

H Michalopoulou, P Stamatis, A Bakhal, E Ktorou, J Vaitsis, E Drabalou, D Nastou, D Stamatis, D Pragastis

'Metaxa' Hospital, Athens, Greece

Critical Care 2008, 12(Suppl 2):P242 (doi: 10.1186/cc6463)

Introduction Large clinical trials have revealed that renal dysfunction (RD) is a common problem and one of the major negative predictors of survival in patients with acute heart failure (AHF); however, the influence of different degrees of RD on prognosis has been less well defined.

Methods We studied 82 patients (mean age 69 ± 11 years, 64 male) admitted to our unit for AHF from July 2005 to November 2006. Fifty-eight percent of them (48 patients) met the criteria for RD. The aim of our study was to evaluate whether creatinine clearance (Cr-C) values calculated by the Cockcroft-Gault formula, [(140 - age (years)) x weight (kg)] / [72 x plasma creatinine level (mg/dl)], adjusted by sex, correlated with inhospital mortality in this ICU population with AHF. We analyzed four subgroups of patients according to their Cr-C: >90 ml/min, 89-60 ml/min, 59-30 ml/min and <30 ml/min. Kidney failure was defined as Cr-C < 60 ml/min. The etiology of AHF was mainly ischemic heart disease (68%), and the mean left ventricular ejection fraction was 31.6 ± 12.7%. We compared baseline characteristics and used a multivariable model to adjust and compare inhospital all-cause mortality across the Cr-C groups.
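As a worked illustration of the formula above, the following Python sketch computes the Cockcroft-Gault creatinine clearance and assigns the four Cr-C subgroups used in this study (the 0.85 multiplier for women is the standard sex adjustment; the function names and exact cutoff handling are ours):

```python
def cockcroft_gault(age_years, weight_kg, creatinine_mg_dl, female):
    """Estimated creatinine clearance (ml/min) by the Cockcroft-Gault formula."""
    crcl = ((140 - age_years) * weight_kg) / (72.0 * creatinine_mg_dl)
    return crcl * 0.85 if female else crcl  # standard adjustment for women

def crcl_subgroup(crcl):
    """Assign one of the four Cr-C subgroups analyzed in the study."""
    if crcl > 90:
        return ">90 ml/min"
    if crcl >= 60:
        return "89-60 ml/min"
    if crcl >= 30:
        return "59-30 ml/min"
    return "<30 ml/min"

# Example: a 69-year-old, 70 kg man with plasma creatinine 1.2 mg/dl
crcl = cockcroft_gault(69, 70, 1.2, female=False)
group = crcl_subgroup(crcl)
```

Note that the abstract does not state how boundary values (for example exactly 90 or 60 ml/min) were assigned, so the cutoffs here are one plausible reading.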

Results Lower Cr-C was significantly related to older age, female gender, lower blood pressure and ischemic etiology. Cardiogenic shock was more frequent in patients with reduced Cr-C. Inhospital total mortality was significantly higher in RD patients than in those without RD (10% vs 3.3%, P < 0.0001). Mortality was 3.3% in patients with Cr-C > 90 ml/min, 13.4% in patients with Cr-C between 89 and 60 ml/min, 24.2% in patients with Cr-C between 59 and 30 ml/min, and 62.5% in patients with Cr-C < 30 ml/min (OR = 3.2, 95% CI = 2.23-4.58, P < 0.001).

Conclusions Among patients with AHF, RD is a frequent finding and a major risk factor for inhospital mortality. Even mild degrees of Cr-C impairment showed higher mortality rates than normal values.

Study of risk factors and prognoses in female patients younger than 60 years old with acute myocardial infarction

H Fu1, Q Hua1, Y Zhao2

1Xuanwu Hospital of Capital Medical University, Beijing, China;

2Gerontic Cardiovascular Disease Institution in Chinese People Liberation Army Hospital, Beijing, China Critical Care 2008, 12(Suppl 2):P243 (doi: 10.1186/cc6464)

Introduction Acute myocardial infarction (AMI) is one of the most common cardiovascular emergencies. Studies about old female patients with AMI are much more frequent than those about younger patients. The objective of our study is to analyze the risk factors and prognoses of female patients younger than 60 years old with AMI.

Methods Seventy-five female patients younger than 60 years old with AMI were compared with 440 male patients regarding hypertension, hyperlipemia, diabetes, smoking, occupation, body mass index, complications and hospital mortality. Results The prevalence of hyperlipemia and the proportions of mental labor and smoking in female patients were significantly lower than those in male patients (P < 0.001, P < 0.05 and P < 0.001, respectively); the prevalence of hypertension and the proportion of physical labor in female patients were significantly higher than those in male patients (both P < 0.001); the prevalence of diabetes and the body mass index were similar in both sexes. The incidence of complications in female patients was significantly higher than that in male patients (P < 0.05), and hospital mortality was similar in both sexes.

Conclusions The incidence of AMI in female patients younger than 60 years old was much lower than that in male patients, probably related to lower blood lipid levels and more physical labor in female patients. Hypertension played a more important role in female patients younger than 60 years old with AMI than in male patients. The prognoses in female patients were worse than those in male patients, probably owing to the higher prevalence of hypertension in female patients.

Impact factors of multiple organ dysfunction syndromes complicating acute myocardial infarction in the elderly: multivariate logistic regression analysis

Y Zhao, X Li, Q Xue, W Gao, S Wang

Institute of Geriatric Cardiology, Chinese PLA General Hospital, Beijing, China

Critical Care 2008, 12(Suppl 2):P244 (doi: 10.1186/cc6465)

Introduction Multiple organ dysfunction syndrome (MODS) is one of the leading causes of inhospital mortality after acute myocardial infarction (AMI) in the elderly. The identification of patients at increased risk of MODS in the immediate postmyocardial infarction period could therefore aid in targeting more aggressive treatment, thereby leading to improved outcomes in these patients. Methods Eight hundred patients 60 years of age and older, who presented after onset of symptoms and had either ST elevation in any two contiguous leads or new left bundle branch block, were admitted to the Chinese PLA General Hospital from 1 January 1993 to 30 June 2006. Patients were divided into two groups based on whether or not they developed MODS in the immediate postmyocardial infarction period. Data were obtained from case-record forms. The clinical characteristics, risk factors, clinical presentation, and complications were analyzed. All statistical tests were two-sided and nominal P values with a threshold of 0.05 were used in these exploratory analyses. All baseline variables with P < 0.05 on univariate analyses were included as candidate variables in the multivariable models.

Results Of the 800 patients enrolled, 27 (3.4%) developed MODS within 30 days after AMI. Patients with MODS had higher mortality rates (55.6% vs 11.6%, P < 0.001) and more complications of cardiogenic shock (25.9% vs 6.2%, P < 0.001), heart failure (59.3% vs 18.2%, P < 0.001), arrhythmia (44.4% vs 26.4%, P < 0.05) and pneumonia (55.6% vs 16.3%, P < 0.001) at 30 days, compared with patients without MODS. In multivariate logistic regression analysis using MODS as the dependent variable, and the major acute symptoms during AMI, risk history, inhospital complications and so on as the independent variables, the major determinants of inhospital MODS secondary to AMI were shortness of breath (OR = 2.64, 95% CI = 1.13-6.16), heart rate on the first day of admission (OR = 1.74, 95% CI = 1.14-2.64), and the inhospital complications of heart failure (OR = 3.03, 95% CI = 1.26-7.26) and pneumonia (OR = 2.82, 95% CI = 1.18-6.77).

Conclusions These findings demonstrate that heart rate on the first day of admission and the inhospital complications of heart failure and pneumonia were independent impact factors for MODS complicating AMI in the elderly.

Impact factors of pneumonia in hospitalized patients with acute myocardial infarction

X Li, Y Zhao, D Wang, X Wu

Institute of Geriatric Cardiology, Chinese PLA General Hospital, Beijing, China

Critical Care 2008, 12(Suppl 2):P245 (doi: 10.1186/cc6466)

Introduction Acute myocardial infarction (AMI) may be accompanied by acute, severe, concomitant, noncardiac conditions. The most common concomitant condition was pneumonia. The study objective was to investigate the impact factors of pneumonia in hospitalized patients with AMI.

Methods A total of 1,443 patients admitted with AMI to the Chinese PLA General Hospital between January 1993 and June 2006 were prospectively enrolled. Patients were divided into two groups based on whether or not they developed pneumonia within 30 days in hospital. The clinical characteristics, risk factors, clinical treatment and complications were analyzed. Results In multivariate logistic regression analysis using pneumonia as the dependent variable, and the history, inhospital complications and so on as independent variables, the major determinants of pneumonia were age (OR = 1.983, 95% CI = 1.499-2.623), history of coronary heart disease (OR = 1.566, 95% CI = 1.034-2.371), heart rate (OR = 1.823, 95% CI = 1.452-2.287) and white blood cell count (OR = 1.409, 95% CI = 1.071-1.853) at admission, the complications of heart failure (OR = 3.264, 95% CI = 2.130-5.002), ventricular tachycardia or fibrillation (OR = 2.347, 95% CI = 1.231-4.476) and anemia (OR = 2.292, 95% CI = 1.482-3.543), and percutaneous coronary intervention (OR = 0.519, 95% CI = 0.327-0.824). Conclusions These findings demonstrate that age, history of coronary heart disease, heart rate and white blood cell count at admission, the complications of heart failure, ventricular tachycardia or fibrillation and anemia, and percutaneous coronary intervention are independent impact factors for pneumonia complicating AMI. Reference

1. Lichtman JH, Spertus JA, Reid KJ, et al.: Acute noncardiac conditions and in-hospital mortality in patients with acute myocardial infarction. Circulation 2007, 116:1925-1930.

Left ventricular TEI index: comparison between flow and tissue Doppler analyses and its association with postoperative atrial fibrillation in cardiopulmonary bypass surgery

J Iribarren, C Naranjo, M Mora, R Galvan, R Perez, L Lorenzo, M Brouard, A Barragan, I Laynez, J Jimenez

Hospital Universitario de Canarias, La Laguna, Tenerife, Spain Critical Care 2008, 12(Suppl 2):P246 (doi: 10.1186/cc6467)

Introduction The TEI index (myocardial performance index), an indicator of combined ventricular systolic and diastolic function, is defined as the ratio of the sum of the isovolumic relaxation time and the isovolumic contraction time to the ejection time. The TEI index is independent of ventricular geometry, and is not significantly affected by heart rate or blood pressure. We sought to determine whether it was associated with postoperative atrial fibrillation (AF) after cardiopulmonary bypass surgery.
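The definition above reduces to a one-line calculation; a minimal sketch (interval names are ours) is:

```python
def tei_index(ivct_ms, ivrt_ms, ejection_time_ms):
    """Myocardial performance (TEI) index:
    (isovolumic contraction time + isovolumic relaxation time) / ejection time.
    All intervals in the same unit (here milliseconds)."""
    return (ivct_ms + ivrt_ms) / ejection_time_ms

# Equivalently, from Doppler intervals: (a - b) / b, where a is the interval
# from mitral valve closure to mitral valve opening (IVCT + ejection + IVRT)
# and b is the ejection time.
```

For example, with IVCT = 60 ms, IVRT = 80 ms and an ejection time of 200 ms, the index is 0.7, in the range of the values reported in this abstract.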

Methods We compared flow and tissue Doppler imaging analyses obtained preoperatively and in the first postoperative hour in patients who underwent cardiopulmonary bypass surgery. The Doppler sample volume was placed at the tips of the mitral leaflets to obtain the left ventricular inflow waveforms from the apical four-chamber view, and just below the aortic valve to obtain the left ventricular outflow waveforms from the apical long-axis view, sequentially. All sample volumes were positioned with the ultrasonic beam aligned to flow. Tissue Doppler imaging was obtained with the sample volume placed at the lateral and septal corners of the mitral annulus from the apical four-chamber view. We analyzed the mean of five consecutive measurements. We compared the results according to the presence of postoperative atrial fibrillation. SPSS version 15 was used.

Results We studied 166 patients, 50 (30.1%) women and 116 (69.9%) men, mean age 67 ± 1 years. Surgical procedures were 92 (55.4%) coronary artery bypass grafting, 55 (33.1%) valvular, 16 (9.5%) combined surgery and three (1.8%) other procedures. The onset of postoperative AF was 38 ± 5 hours. We observed a higher preoperative lateral-mitral tissue Doppler TEI index (0.87 ± 0.43 versus 0.68 ± 0.32, P = 0.017) and preoperative septal-mitral tissue Doppler (0.96 ± 0.45 versus 0.67 ± 0.31, P = 0.004) in patients who developed postoperative AF. The flow Doppler TEI index showed no differences between both groups. The postoperative tissue Doppler TEI index showed no differences: lateral-mitral, 0.76 ± 0.42 versus 0.71 ± 0.43 and septal-mitral, 0.76 ± 0.45 versus 0.78 ± 0.44, and the postoperative flow TEI index showed similar values, 0.66 ± 0.30 versus 0.64 ± 0.27. Conclusions Higher values of the preoperative tissue Doppler TEI index, which reflects a worse global ventricular function, were associated with postoperative AF.

Color-coded speckle tracking radial strain dyssynchrony analysis in a canine model of left bundle branch block and cardiac resynchronization therapy

B Lamia, M Tanabe, H Kim, J Gorcsan III, M Pinsky

University of Pittsburgh Medical Center, Pittsburgh, PA, USA Critical Care 2008, 12(Suppl 2):P247 (doi: 10.1186/cc6468)

Introduction Quantification of left ventricular (LV) dyssynchrony is important for heart failure patients with left bundle branch block to assess the effectiveness of cardiac resynchronization therapy (CRT). We tested the hypothesis that LV contraction dyssynchrony, and the impact of CRT on restoration of efficient synchronous contraction, could be quantified from discordant regional radial strain. Methods Grayscale mid-LV short-axis echo images were obtained in seven open-chest dogs. Right atrial (RA) and right ventricular (RV) pacing leads were placed to simulate left bundle branch block, and LV free wall (LVf) and apical LV (LVa) pacing leads were placed to create CRT. Regional radial strain was analyzed by custom software (Toshiba Corp.) for color-coded speckle tracking in six radial sites during four different pacing modes: RA, RA-RV, RA-RV-LVf (CRTf) and RA-RV-LVa (CRTa). Dyssynchrony was assessed as the maximum time difference between the earliest and latest time to peak segmental strain. RA pacing was used as minimal dyssynchrony and RA-RV pacing as maximal dyssynchrony. For each pacing mode, the global efficient strain was calculated as the area under the curve (AUC) of the global positive strain. During RA-RV we calculated the global negative strain as the sum of the AUCs of the negative individual segment strains. Results Baseline dyssynchrony during RA pacing control was minimal (58 ± 40 ms). RV pacing increased dyssynchrony (213 ± 67 ms, P < 0.05 vs control) and reduced LV stroke work (89 ± 46 mJ, P < 0.05 vs RA). Radial dyssynchrony was improved by both

CRTf and CRTa (116 ± 47 ms and 50 ± 34 ms, respectively, P < 0.05 vs RV). RV pacing displayed early septal wall thickening and opposing wall thinning with a lower efficient strain compared with RA (257 ± 124%/ms vs 129 ± 80%/ms, P < 0.05), whereas both CRTf and CRTa restored efficient strain to RA pacing levels (205 ± 78%/ms and 223 ± 76%/ms). During RA-RV, the global efficient strain and negative global strain were similar (230 ± 88%/ms vs 257 ± 123%/ms, P < 0.05) and correlated (r2 = 0.96) with RA global efficient strain.

Conclusions LV contraction efficiency and cardiac performance can be quantified by speckle tracking radial strain analysis. RA-RV-induced decreased efficiency and improved efficiency with both CRTa and CRTf can be characterized by summed regional strain changes.
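The two measures described in the Methods of this abstract, the maximum time-to-peak difference and the efficient (positive) strain AUC, can be sketched as follows; the data shapes and function names are ours, and trapezoidal integration is one reasonable way to compute the AUC:

```python
def dyssynchrony_ms(time_to_peak_ms):
    """Maximum difference between the earliest and latest segmental
    time to peak strain (one value per radial site, in ms)."""
    return max(time_to_peak_ms) - min(time_to_peak_ms)

def efficient_strain_auc(times_ms, global_strain):
    """Area under the positive part of the global strain curve,
    approximated with the trapezoidal rule (negatives clipped to 0)."""
    auc = 0.0
    for i in range(1, len(times_ms)):
        y0 = max(global_strain[i - 1], 0.0)
        y1 = max(global_strain[i], 0.0)
        auc += 0.5 * (y0 + y1) * (times_ms[i] - times_ms[i - 1])
    return auc

# Example: six radial sites with times to peak strain (ms)
spread = dyssynchrony_ms([120, 95, 180, 210, 160, 140])
```

The global negative strain during RA-RV pacing would follow the same pattern, summing the clipped-negative AUCs over the individual segment curves.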

Endotoxin impairs the human pacemaker current If

K Zorn-Pauly1, B Pelzmann1, P Lang1, H Mächler2, H Schmidt3, H Ebelt3, K Werdan3, K Koidl1, U Mueller-Werdan3

1Institut für Biophysik, Graz, Austria; 2Klinik für Chirurgie, Graz,

Austria; 3Martin-Luther-University Halle, Germany

Critical Care 2008, 12(Suppl 2):P248 (doi: 10.1186/cc6469)

Introduction Lipopolysaccharides (LPSs) from Gram-negative bacteria trigger the development of sepsis and cause a variety of biological effects on host cells, including alterations in ionic channels. As heart rate variability is reduced in human sepsis and endotoxemia, we hypothesized that LPS affects the pacemaker current If in the human heart, which might, at least in part, explain this phenomenon.

Methods Isolated human myocytes from right atrial appendages were incubated for 6-10 hours with LPS (1 μg/ml and 10 μg/ml), and afterwards used to investigate the pacemaker current If. The If was measured with the whole-cell patch-clamp technique (at 37°C). Results Incubation of atrial myocytes with 10 μg/ml LPS was found to significantly impair If by suppressing the current at membrane potentials positive to -80 mV and slowing down current activation, but without affecting maximal current conductance. Furthermore, in incubated cells (10 μg/ml) the response of If to β-adrenergic stimulation (1 μM isoproterenol) was significantly larger compared with control cells (the shift of half-maximal activation voltage to more positive potentials amounted to 10 mV and 14 mV in untreated and treated cells, respectively). Simulations using a spontaneously active sinoatrial cell model indicated that LPS-induced If impairment reduced the responsiveness of the model cell to fluctuations of autonomic input.

Conclusions This study showed a direct impact of LPS on the cardiac pacemaker current If. The LPS-induced If impairment may contribute to the clinically observed reduction in heart rate variability under septic conditions and in cardiac diseases like heart failure where endotoxin could be of pathophysiological relevance.

Cardiac cycle efficiency correlates with pro-B-type natriuretic peptide in cardiac surgery patients

S Scolletta, F Carlucci, A Tabucchi, F Franchi, S Romano, P Giomarelli, B Biagioli

University of Siena, Italy

Critical Care 2008, 12(Suppl 2):P249 (doi: 10.1186/cc6470)

Introduction Cardiac cycle efficiency (CCE) can be calculated by the pressure recording analytical method (PRAM), a less invasive pulse-contour system that can provide beat-to-beat monitoring of

cardiac output (CO). CCE is an innovative parameter that ranges from -1 to +1, with -1 being the worst and +1 the best possible efficiency of the cardiac cycle (that is, better ventricular-arterial coupling). Pro-B-type natriuretic peptide (pro-BNP) is predominantly secreted from the cardiac ventricles in response to increases in ventricular wall stress (VWS). Pro-BNP has been shown to correlate with myocardial hypertrophy and dysfunction [1]. We studied the feasibility of CCE measurement by PRAM, compared with pro-BNP, for monitoring VWS and myocardial impairment and recovery in cardiac surgery.

Methods Ten patients with myocardial hypertrophy undergoing aortic valve replacement were studied. Plasma pro-BNP concentrations were obtained 15 minutes after the induction of anesthesia (t0), 15 minutes after myocardial reperfusion (t1), and 24 hours after surgery (t2). CCE measurements were acquired at the same times and correlations with pro-BNP levels were assessed. Results CCE values ranged from -0.38 to +0.44. CCE decreased by 18-42% at t1 with respect to t0 (P < 0.05). Also, at t1 a decrease in CO of 10-25% with respect to t0 was observed (P < 0.05). The t2 and t0 intervals showed similar values for CCE (+0.37 ± 0.08 vs +0.35 ± 0.11) and CO (5.0 ± 0.9 vs 4.8 ± 1.1 l/min). Pro-BNP was 1,270 ± 1,560 pg/ml at t0, increased moderately at t1, and peaked significantly at t2 (2,839 ± 873 pg/ml; P < 0.001). Overall, a negative correlation between CCE and pro-BNP values was found (r = -0.89, P < 0.01). At each time point of the study, the correlations between CCE and pro-BNP were -0.91, -0.83, and -0.88 (t0, t1, and t2, respectively; P < 0.01). Conclusions This study demonstrated an inverse correlation between CCE and pro-BNP values. PRAM appears feasible for assessing VWS and myocardial impairment and recovery during the various phases of surgery. This new pulse-contour system seems a valuable tool that, together with pro-BNP measurements, may provide new insights into the cardiac and hemodynamic assessment of patients scheduled for cardiac surgery. Reference

1. Ikeda T, et al.: Pro-BNP and ANP elevate in proportion to left ventricular end-systolic wall stress in patients with aortic stenosis. Am Heart J 1997, 133:307-314.

Bradford Acute Coronary Syndrome study - the impact of primary percutaneous coronary intervention in a tertiary centre: a review of the process

A Alfian Yusof, MA Khan, N Denbow, A Khan

Bradford Teaching Hospital, Bradford, UK

Critical Care 2008, 12(Suppl 2):P250 (doi: 10.1186/cc6471)

Introduction The Bradford Acute Coronary Syndrome (BACS) study was set up after the introduction of a 24-hour primary percutaneous coronary intervention (pPCI) service at a tertiary centre, the Leeds General Infirmary (LGI), approximately 15 miles away from the initial site of presentation. This paper reviews the process, the demographics and the challenges faced with the introduction of pPCI at a tertiary centre remote from our hospital, the Bradford Royal Infirmary (BRI). The BRI is an urban hospital serving a population of about half a million people, and our busy emergency department (ED) sees in excess of 110,000 new patients per year. Methods Data on all acute coronary syndromes presenting to the ED at the BRI are stored prospectively on a database. The BACS study reviewed patients presenting between 22 May 2005 and 21 May 2007, 1 year before and 1 year after the introduction of 24-hour pPCI, which commenced on 22 May 2006. A structured analysis of the database was performed for the purpose of this paper. Data concerning treatment modalities, times to

achieve treatments and all the complications were tabulated and presented graphically.

Results The study looked at 161 patients who presented in the year prior to the introduction of pPCI, and 156 patients who attended the ED at the BRI in the year after pPCI was introduced. After the introduction of 24-hour pPCI, 87 (56%) patients had primary angioplasty at the LGI, 24 (15%) had an angiogram only at the LGI, two (1.3%) had primary angioplasty at the BRI, eight (5%) were thrombolysed at the BRI, three (2%) were thrombolysed at the LGI, one (0.7%) was thrombolysed prehospitally, 26 (17%) had medical management at the BRI and five (3%) had medical management at the LGI. Of the 119 patients transferred to the LGI, 87 (73%) had primary angioplasty, 24 (20%) had an angiogram only, three (2.5%) were thrombolysed, and five (4.5%) were managed medically. Conclusions pPCI is the gold standard for the management of acute myocardial infarction. This paper by the BACS study group reviews the processes, the demographics of patients, and the complications that can occur in patients who present with acute myocardial infarction and need to be transferred to a tertiary centre where an onsite 24-hour pPCI service is available.

Preoperative tissue Doppler imaging and diastolic filling patterns on postoperative new-onset atrial fibrillation in cardiopulmonary bypass surgery

J Iribarren, J Jimenez, I Laynez, A Barragan, J Lacalzada, L Lorente, R Perez, L Lorenzo, R Galvan, M Mora

Hospital Universitario de Canarias, La Laguna, Tenerife, Spain Critical Care 2008, 12(Suppl 2):P251 (doi: 10.1186/cc6472)

Introduction Postoperative atrial fibrillation is one of the most frequent complications after cardiopulmonary bypass (CPB) surgery. We evaluated preoperative transthoracic echocardiography and tissue Doppler imaging (TDI) analysis of the mitral annulus in relation to the incidence of postoperative new-onset atrial fibrillation (NOAF) in CPB surgery.

Methods A cohort of CPB surgery patients underwent a preoperative transthoracic echocardiography. The annular TDI waveforms were obtained from the apical four-chamber view. The sample volume was located at the septal and lateral side of the mitral annulus. Early (E') and late (A') diastolic mitral annulus velocities and the ratio of early to late peak velocities (E'/A') were obtained. SPSS version 15 was used.

Results We studied 166 patients, 50 (30.1%) women and 116 (69.9%) men, mean age 67 ± 1 years. Surgical procedures were 92 (55.4%) coronary artery bypass grafting, 55 (33.1%) valvular, 16 (9.5%) combined surgery and three (1.8%) other procedures. Postoperative NOAF developed in 37 (74%) of the 50 patients with postoperative atrial fibrillation. There were no differences in preoperative left ventricular function between groups. We found a larger left atrial dimension in systole and diastole in patients with NOAF (P = 0.005 and P = 0.038, respectively) and a higher D pulmonary vein peak velocity (52 ± 18 cm/s versus 41 ± 14 cm/s, P = 0.02). Patients with NOAF had a higher TDI E'/A' septal ratio (0.90 ± 0.33 versus 0.74 ± 0.29, P = 0.22). Regarding preoperative diastolic filling patterns, patients with NOAF had a normal pattern in nine (24.3%), abnormal relaxation in 19 (51.3%), a pseudonormal pattern in eight (21.6%) and a restrictive pattern in one (2.7%) (P = 0.11), but NOAF patients were more likely to have an alteration in the diastolic filling pattern (28 (75.7%) versus 9 (24.3%), P = 0.028). Conclusions A larger left atrium, higher preoperative D pulmonary vein peak velocity and TDI E'/A' ratio, together with any degree of alteration of the preoperative diastolic filling pattern, were associated with postoperative NOAF in CPB surgery.

Risk factors of hospitalisation in general surgery units: new application of International Classification of Diseases

M Piechota1, M Banach2, M Marczak3, A Jacon3

1Boleslaw Szarecki University Hospital No. 5 in Lodz, Poland; 2University Hospital No. 3 in Lodz, Poland; 3Medical University of Lodz, Poland

Critical Care 2008, 12(Suppl 2):P252 (doi: 10.1186/cc6473)

Introduction The authors set out to estimate the risk of death for patients admitted to general surgery units according to suggested risk factors contained in the description of the basic diagnosis (according to the International Classification of Diseases (ICD-10)).

Methods The study was a retrospective analysis of mortality in general surgery units located at three university hospitals: N. Barlicki University Hospital No. 1 in Lodz, WAM University Hospital No. 2 in Lodz and B. Szarecki University Hospital No. 5. The study comprised 26,020 patients treated in these units from 1 January 2003 to 31 December 2006. One of the distinguished death risk factors - malignant neoplasm, suspicion of malignant neoplasm, acute diffuse peritonitis, paralytic ileus, acute pancreatitis, other inflammatory conditions, bleeding from the digestive tract, acute vascular disorders of the intestines (included in the basic diagnosis), states with peritoneal obliteration, perforation or peritonitis (included in the basic diagnosis), states with acute hepatic failure or cirrhosis (included in the basic diagnosis), or the absence of any death risk factor - was ascribed to each basic diagnosis of patients hospitalised in one of the selected units (after modification of the structure). The death risk groups formed in this way were subjected to further statistical analysis to determine whether mortality differed significantly between the group without a risk factor and the groups with particular risk factors.

Results Among the risk factors analysed, only one (malignant neoplasm) demonstrated a significant difference in mortality relative to the group of diagnoses without a risk factor in every general surgery unit analysed. Three risk factors (paralytic ileus, acute vascular disorders of the intestines, and states with peritoneal obliteration, perforation or peritonitis) showed a significant difference in mortality relative to the group of diagnoses without a risk factor in one of the three surgical units.

Conclusions 1. A patient hospitalised in a general surgery unit with a basic diagnosis (according to ICD-10) comprising malignant neoplasm is at increased risk of death (high-risk factor). 2. A patient hospitalised with a basic diagnosis comprising paralytic ileus, acute vascular disorders of the intestines or states with peritoneal obliteration, perforation or peritonitis is at moderately increased risk of death (low-risk factor). 3. For a patient hospitalised with a basic diagnosis comprising acute diffuse peritonitis or states with acute hepatic failure or cirrhosis, further studies with a larger sample size are required.
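The mortality comparisons above rest on tests of a 2 x 2 contingency table (deaths versus survivors, with versus without a given risk factor). The abstract does not name the test used, so, purely as an illustration, here is a minimal pure-Python sketch of Fisher's exact test; the function name and the toy counts are my own, not the study's.

```python
from math import comb

# Illustrative only: two-sided Fisher's exact test for a 2x2 table
# [[a, b], [c, d]], e.g. a = deaths with risk factor, b = survivors with
# risk factor, c = deaths without, d = survivors without.
def fisher_exact(a, b, c, d):
    n = a + b + c + d
    row1, col1 = a + b, a + c
    denom = comb(n, col1)

    def p_table(x):
        # Hypergeometric probability of observing x in the top-left cell
        return comb(row1, x) * comb(n - row1, col1 - x) / denom

    p_obs = p_table(a)
    lo, hi = max(0, col1 - (n - row1)), min(row1, col1)
    # Two-sided P value: sum the probabilities of all tables whose point
    # probability is no larger than that of the observed table
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)

# Toy table (Fisher's classic tea-tasting data, not from the abstract):
print(fisher_exact(3, 1, 1, 3))  # ~0.486
```

This point-probability definition of the two-sided P value matches the convention used by most statistical packages for small tables.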

Prophylactic modalities against venous thromboembolism in complicated surgery for cancer patients

W Salem, R Abulmagd, A Shaker

National Cancer Institute, Cairo, Egypt

Critical Care 2008, 12(Suppl 2):P253 (doi: 10.1186/cc6474)

Introduction Venous thromboembolism (VTE) is the most frequent complication following surgery in cancer patients, and becomes more serious after complicated surgery. The morbidity and mortality associated with VTE remain unacceptably high. The surgeon may not perceive VTE as a significant problem and may be unaware of the benefits of prophylaxis. The aim of this work was to evaluate different modalities of prophylaxis against VTE among patients undergoing complicated major surgery for cancer treatment.

Methods One hundred and seventy-four patients admitted to the surgical ICU after complicated (unexpectedly long duration (more than 6 hours) or vascular injury) major surgery for cancer treatment, in the period from January 2006 to June 2007, were included. The patients were randomized to receive enoxaparin 40 mg/12 hours (group (E)), intermittent pneumatic compression (group (PC)) or enoxaparin 40 mg/24 hours plus intermittent pneumatic compression (group (E + PC)). All patients underwent duplex venous ultrasonography on day 0, at discharge and at clinical suspicion of deep vein thrombosis (DVT) or pulmonary embolism (PE) (complaint of chest discomfort or shortness of breath, change on ECG); a same-day chest X-ray and ventilation-perfusion scan were obtained to confirm PE. The incidence of DVT, PE and bleeding was recorded.

Results Calf DVT was recorded in only one patient, in group (E). The incidence of proximal DVT was significantly higher in group (E), 3.6%, compared with group (PC), 1.7%, and group (E + PC), 1.6%. No significant difference occurred in the incidence of clinical PE between the three groups, but the incidences of total and fatal PE were higher in group (PC), 3.4% and 1.7%, respectively. Bleeding complications were recorded in three patients in group (E), 5.5%, one patient in group (PC), 1.7%, and one patient in group (E + PC), 1.6%. The total mortality among the 174 patients admitted to the surgical ICU was 5.75%; 30% of deaths were ascribed to PE and 20% were sudden cardiac deaths (which undoubtedly included some undiagnosed PE). Fifty percent were due to surgical complications and cancer, of which 60% were considered due to respiratory failure, which may also have included some deaths due to PE.

Conclusions In high-risk patients undergoing complicated surgery, the multimodality approach (intermittent pneumatic compression plus LMWH) provided excellent and safe prophylaxis against VTE. Reference

1. Patiar S, et al.: Br J Surg 2007, 94:412-420.

Cardiac output and oxygen delivery are affected by intraoperative hyperthermic intrathoracic chemotherapy

S Scolletta, F Franchi, M Garosi, L Voltolini, M Caciorgna, S Romano, P Giomarelli, B Biagioli

University of Siena, Italy

Critical Care 2008, 12(Suppl 2):P254 (doi: 10.1186/cc6475)

Introduction Pleural space hyperthermic perfusion with cisplatin (hyperthermic intrathoracic chemotherapy (HIC)) in the multimodality treatment of malignant mesothelioma is a relatively modern procedure [1]. Published data relate to postoperative lung function and medium- to long-term outcome. To our knowledge, no study has described the effects of HIC on cardiovascular and metabolic parameters. We aimed to evaluate the influence of HIC on cardiac output (CO) and oxygen delivery (DO2) in thoracic surgery patients.

Methods Ten patients (mean age 67 years) undergoing thoracic surgery for malignant mesothelioma were studied. HIC was applied with 3 l of 0.9% saline solution warmed to 42.5°C, containing cisplatin (100 mg/m2), and infused over 60 minutes. CO, DO2 and systemic vascular resistance (SVR) were calculated with a pulse contour system called the pressure recording analytical method (PRAM) [2]. PRAM parameters were blinded to the anaesthesiologists, who based their management (for example, fluids and/or vasoactive drugs) on standard protocols. Data were retrieved before, during and after the HIC.

Results When the HIC started, the mean arterial pressure (MAP) and SVR decreased from 81 to 51 mmHg, and from 1,500 to 1,050 dyne*s/cm5, respectively (P < 0.05). The MAP quickly returned to pre-HIC values before the end of HIC (within 10 min). Conversely, SVR reached pre-HIC values only after 3 hours. CO and DO2 decreased from 4.6 to 2.6 l/min, and from 610 to 370 ml/min, respectively (P < 0.05). They increased after the end of HIC and reached the pre-HIC values after 2 hours. Serum lactate peaked during the HIC, rising from 0.9 to 2.8 mmol/l (basal vs on-HIC values, P < 0.01), and slowly decreased to reach pre-HIC values after 3 hours.

Conclusions The hemodynamic and metabolic state of patients undergoing thoracic surgery is severely affected by HIC. Standard monitoring may not disclose the intraoperative hemodynamic changes of patients undergoing HIC. Furthermore, it does not provide key information about oxygen delivery, with the hazard of an imbalance between tissue oxygen demand and consumption. We believe that beat-to-beat hemodynamic monitoring should be used whenever HIC is scheduled for thoracic surgery patients, to avoid the risk of a low output state, tissue hypoperfusion and poor outcome.


1. Ratto GB, et al.: Pleural perfusion with cisplatin in the treatment of malignant mesothelioma. J Thorac Cardiovasc Surg 1999, 117:759-765.

2. Scolletta S, et al.: PRAM for measurement of CO during various hemodynamic states. Br J Anaesth 2005, 95:159-165.
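The CO and DO2 values reported in the abstract above are linked by the standard oxygen-delivery formula, DO2 = CO x CaO2 x 10, where CaO2 = 1.34 x Hb x SaO2 + 0.003 x PaO2. As a worked sketch (the hemoglobin and blood-gas values below are assumptions for illustration, not data from the study):

```python
# Standard oxygen-delivery formula; the patient values in the example are
# assumed, not taken from the abstract.
def oxygen_delivery(co_l_min, hb_g_dl, sao2_frac, pao2_mmhg):
    """DO2 in ml O2/min from cardiac output and arterial O2 content."""
    cao2 = 1.34 * hb_g_dl * sao2_frac + 0.003 * pao2_mmhg  # ml O2/dl blood
    return co_l_min * cao2 * 10  # x10 converts per-dl content to per-litre

# With CO = 4.6 l/min (the pre-HIC value above), Hb 10 g/dl, SaO2 98%,
# PaO2 100 mmHg, DO2 comes out close to the ~610 ml/min baseline reported.
print(round(oxygen_delivery(4.6, 10, 0.98, 100)))  # ~618 ml/min
```

The same function with the on-HIC cardiac output of 2.6 l/min illustrates why DO2 fell roughly in proportion to CO.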

Influence of donor gender on early outcome after lung transplantation

E Miñambres1, J Llorca2, B Suberviola1, A Vallejo1, F Ortíz-Melón1, A González-Castro1

1Hospital Universitario Marques de Valdecilla, Santander, Spain;

2University of Cantabria School of Medicine, Santander, Spain

Critical Care 2008, 12(Suppl 2):P255 (doi: 10.1186/cc6476)

Introduction In the current practice of lung transplantation (LT), donor and recipient genders are neither directly considered nor matched. However, donor female gender has been suggested as a significant risk factor for mortality in recipients after solid organ transplantation. The purpose of this study was to evaluate the early mortality (30 days) in LT recipients according to the donor gender (male or female).

Methods We analysed the potential effect of donor gender on early survival in all lung transplants performed in our institution between January 1999 and December 2006. Survival curves were calculated by the Kaplan-Meier method and compared using the log-rank test.

Results During the study period 153 LT procedures were performed in 150 patients. There were 99 male donors and 54 female donors. The study groups were homogeneous with regard to the major preoperative risk factors (etiology, status at transplantation, donor and recipient age, total ischemic time). The mean age of recipients was 54 ± 10 years (range 14-70). Indications included chronic obstructive pulmonary disease in 49%, idiopathic pulmonary fibrosis in 40%, and other in 11%. The 30-day survival was 86% (95% CI, 77-91%) for recipients who received male donor lungs and 80% (95% CI, 66-88%) for recipients who received female donor lungs. No differences were observed between the survival curves according to the log-rank test (P = 0.983). A Cox proportional hazards analysis for overall survival at 30 days showed a hazard ratio of 0.99 (95% CI, 0.63-1.58; P = 0.98) in recipients who received male donor lungs.

Conclusions Even though previous reports suggest that donor gender negatively affects survival, this factor proved to have no influence on the early outcome of the present series. Reference

1. International Society of Heart and Lung Transplantation Registry, Sato M, Gutierrez C, Kaneda H, et al.: The effect of gender combinations on outcome in human lung transplantation: the International Society of Heart and Lung Transplantation Registry experience. J Heart Lung Transplant 2006, 25:634-637.
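The survival analyses in this and the following abstract use the Kaplan-Meier product-limit estimator. As a minimal illustration of how such 30-day curves are built (the helper function and its toy data are mine, not the study's), each observed death at time t multiplies the running survival estimate by (n_at_risk - 1)/n_at_risk, while censored patients only shrink the risk set:

```python
# Minimal Kaplan-Meier sketch with toy data (not the study's); ties between
# a death and a censoring at the same time are not specially handled here.
def kaplan_meier(times, events):
    """times: follow-up in days; events: 1 = death observed, 0 = censored.
    Returns a list of (time, survival estimate) steps at each death."""
    at_risk = len(times)
    surv = 1.0
    curve = []
    for t, died in sorted(zip(times, events)):
        if died:  # the curve steps down only at observed deaths
            surv *= (at_risk - 1) / at_risk
            curve.append((t, surv))
        at_risk -= 1  # both deaths and censorings leave the risk set
    return curve

# Five hypothetical recipients followed to day 30; deaths at days 12 and 20.
print(kaplan_meier([30, 12, 30, 20, 30], [0, 1, 0, 1, 0]))
```

The log-rank test the abstracts cite then compares the observed deaths per group at each event time with those expected if the two curves were identical.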

Early outcome following single versus bilateral lung transplantation in recipients 60 years of age and older

A González-Castro1, J Llorca2, B Suberviola1, A Vallejo1, F Ortíz-Melón1, E Miñambres1

1Hospital Universitario Marqués de Valdecilla, Santander, Spain;

2University of Cantabria School of Medicine, Santander, Spain Critical Care 2008, 12(Suppl 2):P256 (doi: 10.1186/cc6477)

Introduction Lung transplantation (LT) has been increasingly applied to patients over 60 years. The outcome of LT recipients in this age group has not been analyzed systematically. The purpose of this study was to evaluate the early mortality (30 days) in LT recipients older than 60 years according to the type of procedure (single vs bilateral LT).

Methods We retrospectively reviewed our experience with older recipients between January 1999 and August 2007. Survival curves were calculated by the Kaplan-Meier method and compared using the log-rank test.

Results During the study period 167 LT procedures were performed in 164 patients, of which 51 (30.5%) involved recipients aged 60 years and older (range 60-70, mean 63.3 ± 2.4 years). Thirty-seven of the recipients 60 years and older received a single LT and 14 a bilateral LT. Indications included chronic obstructive pulmonary disease in 51% (26/51), idiopathic pulmonary fibrosis in 43% (22/51), and other in 6% (3/51). The 30-day survival was 84% (95% CI, 67-92%) for patients who underwent a single LT and 93% (95% CI, 59-99%) for patients who underwent a bilateral LT. No differences were observed between the survival curves according to the log-rank test (P = 0.896). A Cox proportional hazards analysis for overall survival at 30 days showed a hazard ratio of 1.05 (95% CI, 0.46-2.38; P = 0.897) in the unilateral LT group.

Conclusions The early survival of lung transplant recipients 60 years of age or older who underwent bilateral versus single LT is comparable. The type of procedure is not a predictor of mortality in this age group. In carefully selected recipients >60 years of age, LT offers acceptable early survival. Reference

1. Nwakanma LU, Simpkins CE, Williams JA, et al.: Impact of bilateral versus single lung transplantation on survival in recipients 60 years of age and older: analysis of United Network for Organ Sharing database. J Thorac Cardiovasc Surg 2007, 133:541-547.

Acute mesenteric ischemia: a comparative study of causes and mortality rates in Shiraz, Southern Iran

P Hasanpour1, K Bagheri Lankarani2

1Shiraz University of Medical Education, Shiraz, Iran; 2Ministry of Health and Education, Tehran, Iran

Critical Care 2008, 12(Suppl 2):P257 (doi: 10.1186/cc6478)

Introduction Acute mesenteric ischemia (AMI) is a catastrophic disorder of the gastrointestinal tract with high mortality. Owing to recognition at advanced stages and late treatment, AMI is still considered a highly lethal condition. There are few data on the characteristics of this disease in Iran, so this study was conducted to determine its characteristics in this population.

Methods In a retrospective study, all patient records of public and private hospitals in Shiraz, Southern Iran, with an impression of acute abdomen, bowel gangrene or abdominal pain, and patients with risk factors for this disease, who were admitted from March 1989 to March 2005, were reviewed; those with AMI were identified, analyzed and compared with other research.

Results Among the 10,000 patient records studied, 105 patients with AMI were identified. The mean age of patients was 57 years. The most common symptoms were abdominal pain (98.09%), vomiting (68.5%) and constipation (36.1%). Heart disease was seen in 44.7% of cases. The mortality rate in patients with AMI was 50.5%, and was lower in patients undergoing mesenteric angiography (P = 0.014). In those patients in whom the site of the lesion was exactly defined, 41.9% of cases were due to venous thrombosis, 25.7% to mesenteric arterial emboli, 19% to mesenteric arterial thrombosis, and 8.5% were of nonocclusive types.

Conclusions AMI is a relatively common cause of acute abdomen, especially in elderly patients referred to Shiraz hospitals, with venous thrombosis being the most common type. Early diagnosis, particularly with early use of mesenteric angiography, and treatment may decrease the mortality from AMI. References

1. Brandt LJ, Boley SJ: Intestinal ischemia. In: Sleisenger and Fordtran's Gastrointestinal and Liver Disease. 7th edition. Philadelphia, PA: W.B Saunders, Co.; 2002.

2. Williams LF Jr: Mesenteric ischemia. Surg Clin North Am 1988, 68:331-353.

3. Benjamin E, Oropello JM, Iberti TJ: Acute mesenteric ischemia: pathophysiology, diagnosis and treatment. Dis Mon 1993, 39:131-210.

4. Luther B, Bureger K, Sellentin W: Acute occlusion of the intestinal arteries - diagnosis and surgical therapy. Zentralbl Chir 1997, 112:1411-1419.

Systemic inflammatory response syndrome post cardiac surgery: a useful concept?

NS MacCallum, SE Gordon, GJ Quinlan, TW Evans, SJ Finney

NHLI, Imperial College, London, UK

Critical Care 2008, 12(Suppl 2):P258 (doi: 10.1186/cc6479)

Introduction Systemic inflammatory response syndrome (SIRS) is the leading cause of morbidity and mortality in the critically ill, and is associated with a 50% reduction in 5-year life expectancy. SIRS is defined as at least two of the following criteria: heart rate >90 beats/min, respiratory rate >20 breaths/min or pCO2 <4.3 kPa, temperature <36°C or >38°C, white cell count <4 x 109/l or >12 x 109/l. These criteria are used to stratify patients for specific therapies and in research to define interventional groups. Cardiac surgery is associated with systemic inflammation. We undertook to describe the incidence of SIRS post cardiac surgery and relate this to outcome.
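The four SIRS criteria above translate directly into a criterion count per patient (or per 1-hour epoch, as in the Methods below). A minimal sketch, with a function name and example vitals of my own invention rather than from the study:

```python
# Hypothetical helper, not from the study: count how many of the four SIRS
# criteria are met, using the thresholds quoted in the Introduction.
def sirs_count(hr, rr, pco2_kpa, temp_c, wcc):
    """Return the number of SIRS criteria met (0-4)."""
    criteria = [
        hr > 90,                         # heart rate >90 beats/min
        rr > 20 or pco2_kpa < 4.3,       # tachypnoea or hypocapnia
        temp_c < 36.0 or temp_c > 38.0,  # temperature derangement
        wcc < 4.0 or wcc > 12.0,         # white cell count (x10^9/l)
    ]
    return sum(criteria)

# Example vitals: tachycardia, tachypnoea and leukocytosis meet 3 criteria.
print(sirs_count(hr=105, rr=24, pco2_kpa=5.0, temp_c=37.2, wcc=13.5))  # 3
```

A count of >=2 meets the standard SIRS definition; the abstract's argument is that >=3 better discriminates outcome after cardiac surgery.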

Methods We retrospectively analysed prospectively collected data from 2,764 consecutive admissions post cardiac surgery (coronary artery bypass grafting 1,425 admissions, valve 763 admissions, combined 252 admissions, other 324 admissions). The number of criteria met simultaneously within 1-hour epochs was recorded for the entire admission.

Results Totals of 96.4%, 57.9% and 12.2% of patients met at least two, three or four criteria, respectively, within 24 hours of admission (Table 1). The ICU mortality was 2.67%. The length of stay (LOS) exceeded 3 days in 18.5% of patients. The temperature criterion was least often fulfilled. Scoring and outcome data are presented in Table 1. The simultaneous presence of more criteria was associated with greater mortality and a more prolonged ICU stay, P < 0.0001.

Table 1 (abstract P258). Scoring and outcome variables associated with meeting SIRS criteria (n = 2,764)

SIRS criteria          ≥2 (96.4%)    ≥3 (57.9%)    4 (12.2%)
APACHE II score        15.2 ± 4.7    15.8 ± 5.1    17.1 ± 5.9
SOFA score (day 1)     5.5 ± 2.2     5.7 ± 2.3     6.3 ± 2.6
ICU mortality (%)      2.78          4.25          10.42
LOS (days)             3.2 ± 7.0     4.0 ± 8.5     6.8 ± 14.0

Conclusions Nearly all patients fulfilled the standard two-criteria definition of SIRS within 24 hours of admission. This definition does not adequately define the subgroup of patients with greater systemic inflammation, mortality or LOS. Thus, some clinical manifestations of inflammation are very common following cardiac surgery, although not necessarily prognostic. The presence of three or more criteria was more discriminatory for death and prolonged ICU stay. We propose that a threshold of three SIRS criteria more appropriately defines those patients with clinically significant inflammation post cardiac surgery.

Figure 1 (abstract P259). Systemic inflammatory response syndrome (SIRS) post coronary artery bypass grafting.

Modulation of the inflammatory response induced during coronary artery bypass graft surgery

SE Gordon, MJ Griffiths

Royal Brompton Hospital, London, UK

Critical Care 2008, 12(Suppl 2):P259 (doi: 10.1186/cc6480)

Introduction Ischaemic preconditioning provides endogenous protection against ischaemia and also against the inflammation resulting from ischaemia-reperfusion injury. A number of exogenous pharmacological substances, including adenosine, bradykinin, noradrenaline and inhalational halogenated anaesthetic agents, are recognised triggers of preconditioning.

Methods The investigation was approved by the Royal Brompton, Harefield & NHLI research ethics committees. Patients scheduled for first-time coronary artery bypass grafting (CABG) surgery with triple-vessel coronary artery disease and at least moderate left ventricular function were recruited. Exclusion criteria included: age >80 years; unstable angina; noninsulin-dependent diabetes mellitus treated with KATP channel blockade; use of nicorandil or nitrates within 24 hours of surgery. Preoperative risk variables were compared using the EuroSCORE. Patients were randomised to anaesthesia facilitated using halogenated inhalational agents or total intravenous anaesthesia (TIVA) (propofol). The surgical technique was standardised as far as possible. All cases necessitated the use of cardiopulmonary bypass. Inflammation was assessed up to 72 hours postoperatively using a combination of physiological and biochemical parameters. Physiological assessment consisted of the development of systemic inflammatory response syndrome (SIRS). Biochemical assessment consisted of measurement of plasma IL-6, myeloperoxidase and C-reactive protein. Blood samples were obtained preoperatively and at 5, 24, 48 and 72 hours postoperatively.

Results SIRS was reduced in patients who received TIVA (P < 0.05, Fisher's exact test; n = 13 TIVA, n = 8 inhalational; Figure 1). Plasma IL-6, myeloperoxidase and C-reactive protein were elevated postoperatively, although levels were unaffected by the mode of anaesthesia (P = not significant, two-way ANOVA; n = 13 TIVA, n = 8 inhalational).

Conclusions We have demonstrated a protective benefit of TIVA on the development of SIRS postoperatively in patients undergoing CABG surgery. A larger double-blind randomised controlled trial is required to confirm these results.

Intraoperative optimization of hemodynamic parameters is associated with a better outcome after cardiac surgery

L Hajjar1, R Melo1, F Galas2, M Sundin2, A Gullo3, J Auler Jr2

1Heart Institute, Sao Paulo, Brazil; 2Heart Institute - Faculdade de Medicina da Universidade de Sao Paulo, Brazil; 3University of Catania Policlinico G. Rodolico of Catania, Trieste, Italy

Critical Care 2008, 12(Suppl 2):P260 (doi: 10.1186/cc6481)

Introduction Goal-directed therapy has been used for critically ill patients. The purpose of this study was to evaluate the ability of intraoperative perfusion parameters in predicting outcome after cardiac surgery.

Methods A total of 98 patients undergoing cardiac surgery were prospectively evaluated. Samples for lactate, arterial gases and central venous saturation (SVO2) were collected 60 minutes after the beginning of surgery, 120 minutes after, and at the end of the procedure. Univariate and multivariate analyses were performed.

Results Factors associated with cardiac dysfunction were a previous low ejection fraction (P = 0.035), on-pump surgery (P = 0.001), longer duration of bypass (P = 0.003), a low initial intraoperative central venous saturation (P = 0.001) and a high initial CO2 gap (P = 0.02). A low intraoperative SVO2 was independently associated with a sevenfold increase (95% CI, 2-21), and a high initial CO2 gap with a 4.5-fold increase (95% CI, 1.6-12), in cardiac dysfunction after cardiac surgery. Factors associated with renal dysfunction were age (P = 0.045), longer duration of bypass (P = 0.01) and a low initial intraoperative central venous saturation (P = 0.001). A low intraoperative SVO2 was associated with a 12-fold increase, and a low initial base excess with a 27-fold increase, in rates of infection after cardiac surgery. A high final arterial lactate predicted a longer time of mechanical ventilation (OR, 4.56; 95% CI, 1.4-11.2). Perfusion parameters were not related to length of ICU stay or mortality.

Conclusions In this observational study, a low intraoperative level of SVO2 is an independent predictor of cardiac dysfunction, renal failure and infection after cardiac surgery and a high level of lactate is associated with a longer time of mechanical ventilation. These findings suggest that these parameters may be markers of prognosis after cardiac surgery. Reference

1. Rivers E, Nguyen B, et al.: Early goal-directed therapy in the treatment of severe sepsis and septic shock. N Engl J Med 2001, 345:1368-1377.

Effects of inhaled iloprost on right ventriculovascular coupling and ventricular interdependence in acute pulmonary hypertension

S Rex1, C Missant2, P Claus2, W Buhre3, R Rossaint1, PF Wouters4

1University Hospital Aachen, Germany; 2Katholieke Universiteit Leuven, Belgium; 3University of Witten-Herdecke, Hospital Koln-Merheim, Cologne, Germany; 4Ghent University Hospital, Ghent, Belgium

Critical Care 2008, 12(Suppl 2):P261 (doi: 10.1186/cc6482)

Introduction Prostacyclin inhalation is increasingly used to treat acute pulmonary hypertension (PHT) and right ventricular (RV) failure. Prostacyclins not only affect vasomotor tone, but may also have cyclic adenosine 3',5'-monophosphate-mediated positive inotropic effects and modulate autonomic nervous system (ANS) tone. We studied the role of these different mechanisms in the overall hemodynamic effects produced by iloprost (ILO) inhalation in an experimental model of acute PHT.

Methods Twenty-six pigs were instrumented with biventricular conductance catheters, a pulmonary artery (PA) flow probe and a high-fidelity PA-pressure catheter. The effects of 50 μg inhaled ILO were studied in healthy animals with and without blockade of the ANS, and in animals with acute hypoxia-induced PHT.

Results ILO had minimal hemodynamic effects in healthy animals and produced no direct effects on myocardial contractility after pharmacological ANS blockade. During PHT, ILO resulted in a 51% increase in cardiac output when compared with placebo (5.6 ± 0.7 vs 3.7 ± 0.8 l/min, P = 0.0013), a selective reduction of RV afterload (effective PA-elastance (PA-Ea): 0.6 ± 0.3 vs 1.2 ± 0.5 mmHg/ml; P = 0.0005) and a significant increase in left ventricular (LV) end-diastolic volume (91 ± 12 vs 70 ± 20 ml, P = 0.006). Interestingly, RV contractility was reduced after ILO (slope of preload recruitable stroke work: 3.4 ± 0.8 vs 2.2 ± 0.5 mW/ml; P = 0.0002), while ventriculovascular coupling remained essentially preserved (ratio of RV end-systolic elastance over PA-Ea: 0.97 ± 0.33 vs 1.03 ± 0.15).

Conclusions In acute PHT, ILO improved global hemodynamics primarily via selective pulmonary vasodilation and a restoration of LV preload. The reduction of RV afterload was associated with a paradoxical decrease in RV contractility. This appears to reflect an indirect mechanism serving to maintain ventriculovascular coupling at the lowest possible energetic cost, since no evidence for a direct negative inotropic effect of ILO was found. Reference

1. Rex S, et al.: Epoprostenol treatment of acute pulmonary hypertension is associated with a paradoxical decrease in right ventricular contractility. Intensive Care Med 2008, 34:179-189.

Dobutamine in acute myocardial infarction: should we use it for reduction of pulmonary hypertension and pulmonary capillary wedge pressure in acute myocardial infarction?

A Macas, G Baksyte, A Pikciunas, A Semetaite

Kaunas University of Medicine, Kaunas, Lithuania

Critical Care 2008, 12(Suppl 2):P262 (doi: 10.1186/cc6483)

Introduction Enthusiasm for dobutamine has waned since the introduction to clinical practice of newer medications such as levosimendan. Such treatment, however, is rather expensive, and its long-term outcomes are still under discussion. Dobutamine therefore still has an appropriate place in the treatment of patients with complicated acute myocardial infarction (AMI), in whom signs of acute heart failure and pulmonary hypertension should be treated immediately. The study objective was to evaluate the hemodynamic effect of dobutamine, using the pulmonary artery catheterization technique, in patients with AMI complicated by cardiogenic shock and pulmonary hypertension.

Methods Dobutamine was infused continuously in patients with AMI complicated by cardiogenic shock and with verified pulmonary hypertension. Only low doses of dobutamine, not exceeding 4 μg/kg/min, were infused. Data were obtained using a pulmonary artery catheter. Hemodynamic indices including the cardiac output (CO), pulmonary pressures and pulmonary artery capillary wedge pressure (PAWP) were measured.

Results Nineteen patients were investigated according to the study protocol, 11 (57.9%) men and eight (42.1%) women. The average age was 65.1 ± 11.2 years. Anterior AMI was diagnosed in 14 (73.7%) patients, inferior in five (26.3%). The in-hospital mortality rate was 52.6% (10 patients). The initial CO was 3.3 ± 0.9 (range 1.8 to 5.4 l/min), the mean pulmonary artery pressure (MPAP) was 34.8 ± 13.4 mmHg (maximum 50 mmHg), and the PAWP was 25.7 ± 10.4 mmHg (maximum 42 mmHg). After the first day of continuous dobutamine infusion, the CO was 4.2 ± 1.2 (range 2.5 to 6.4 l/min), the MPAP was 31.1 ± 7.9 mmHg (maximum 43 mmHg), and the PAWP was 16.2 ± 3.8 mmHg (maximum 21 mmHg).
After the termination of dobutamine (after 48 hours), the CO was 4.2 ± 0.9 (range from 3.1 to 6.1 l/min), the MPAP was 30.2 ± 8.1 mmHg (maximum 46 mmHg), and the PAWP was 16.4 ± 3.4 mmHg (maximum 21 mmHg). The increase of the initial CO and reduction of PAWP after the first day of continuous dobutamine infusion were statistically significant (P < 0.05).

Conclusions Application of dobutamine showed a positive benefit in reduction of pulmonary hypertension and pulmonary capillary wedge pressure as well as in the increase of cardiac output for patients with AMI complicated by cardiogenic shock and pulmonary hypertension.

Dobutamine protects lymphocytes against staurosporin-induced apoptosis: investigation of the antioxidative action of the dobutamine molecule

J Van Melkebeek1, S Vandeput2, M Roesslein3, F Jans1

1Ziekenhuis Oost-Limburg, Genk, Belgium; 2University of Hasselt, Diepenbeek, Belgium; 3University Hospital, Freiburg, Germany Critical Care 2008, 12(Suppl 2):P263 (doi: 10.1186/cc6484)

Introduction Catecholamines have been shown to modulate various immunological functions. In previous experiments we demonstrated that dobutamine pretreatment protects T cells from staurosporin-induced apoptosis [1]. In the current study we investigated whether antioxidative properties of the dobutamine molecule might be responsible for its protective effect.

Methods Jurkat T-cell passages 1-12 were used.

Results Experiments with a caspase-activity assay confirmed previous results: pretreatment (4 hours) with dobutamine 0.1 mM and 0.5 mM decreased staurosporin-induced apoptosis in Jurkat T cells from 14.0% to 11.6% and 8.7%, respectively (P < 0.01). Other catecholamines such as epinephrine and norepinephrine had no protective effect. To investigate whether production of ROS could be measured, Jurkat T cells were loaded with CM-H2DCFDA. After washing steps, the cells were exposed to 0 μM, 1 μM, 10 μM and 100 μM H2O2 for 6 hours. The fluorescence signal (ex 480/em 520 nm) measured was 36.7 U, 37.7 U, 38.1 U and 54.3 U, respectively, demonstrating the relation between ROS and the fluorescence signal. Next, production of ROS due to staurosporin treatment (2 μM) was measured: ROS production increased minimally after exposure for 2 hours. Only after 6 hours of staurosporin treatment did the ROS signal increase, from 36.7 U to 42.1 U (P < 0.05). Subsequently, the ROS-scavenging effect of dobutamine was investigated. CM-H2DCFDA-loaded cells were exposed to staurosporin (2 μM) for 2 hours, with or without dobutamine pretreatment (0.1 mM and 0.5 mM): the ROS-scavenging effect was very pronounced in the 0.1 mM group (decrease in fluorescence signal from 56.1 U to 22.5 U, P < 0.01), and increased further in the 0.5 mM group (17.8 U, P < 0.01). Control experiments with unstained cells showed that addition of dobutamine did not change the autofluorescence signal.

Conclusions These experiments demonstrate that dobutamine acts as a ROS scavenger.
Whether this scavenging effect is responsible for the protective properties of dobutamine against staurosporin-induced apoptosis is currently under investigation. Reference

1. Jans F, et al.: Crit Care 2007, 11(Suppl 2):31.

Hemodynamic effects of levosimendan in patients with low-output heart failure

H Gharsallah, M Bel Hadj Amor, W Koubaa, F Gargouri, K Lamine, M Ferjani

Military Hospital of Tunis, Tunisia

Critical Care 2008, 12(Suppl 2):P264 (doi: 10.1186/cc6485)

Introduction Levosimendan is a positive inotropic drug agent that increases the sensibility of contractile proteins to calcium [1]. The drug is used in patients with decompensated low-output heart failure [2]. The aim of our study is to analyze the hemodynamic effects of levosimendan in patients with refractory cardiogenic shock. Methods After approval by our institutional ethics committee, 16 patients who had a refractory cardiogenic shock after myocardial infarction (n =10), peripartum cardiomyopathy (n = 3) and

cardiomyopathy (n = 3) were prospectively included in our study. Levosimendan was added to conventional inotropic agents (dobutamine) with a bolus dose of 12 µg/kg over 30 minutes followed by a continuous infusion at a rate of 0.1 µg/kg/min for 24 hours. Hemodynamic data (continuous cardiac output and venous oxygen saturation) were obtained by a Swan-Ganz catheter at T0, 30 minutes, 90 minutes, 2 hours, 4 hours, 8 hours, 12 hours, 24 hours and 48 hours. Transoesophageal echocardiography was performed at T0, day 1, day 2, day 7 and day 15. SPSS version 10 was used for all statistical analyses. Results After levosimendan administration, a significant reduction in pulmonary and vascular resistance values was followed by a significant increase in the cardiac index and venous oxygen saturation. Changes in heart rate and mean arterial blood pressure were not significant. The left ventricular ejection fraction increased from 24% (T0) to 40% (day 2). Conclusions This study showed that levosimendan improves hemodynamic parameters and left ventricular ejection fraction in patients with cardiogenic shock. Controlled trials of sufficient size are needed, however, to confirm these results. References

1. Figgitt DP, Gillies PS, Goa KL: Drugs 2001, 61:613-627.

2. Follath F, Cleland JGF, Just H, et al.: Lancet 2002, 360:196-202.

Levosimendan does not improve resuscitation success after hypovolemic cardiac arrest

E Semenas, L Wiklund

Uppsala University Hospital, Uppsala, Sweden

Critical Care 2008, 12(Suppl 2):P265 (doi: 10.1186/cc6486)

Introduction Resuscitation from hemorrhagic shock and subsequent cardiac arrest (CA) is a major clinical challenge in the care of trauma patients. Levosimendan, a new calcium sensitizer, exerts positive inotropic and lusitropic effects in failing human myocardium without an increase in energy expenditure [1]. The aim of this study was to evaluate possible beneficial effects of levosimendan in combination with vasopressin in hypovolemic CA and subsequent cardiopulmonary resuscitation. Methods Five anesthetized male piglets (26.5 ± 1.1 kg) were bled (25.1 ± 3.4% of calculated total blood volume) to a mean arterial blood pressure of 35 mmHg during 12.9 ± 0.2 minutes. Afterwards the piglets were subjected to 4 minutes of untreated ventricular fibrillation followed by 12 minutes of open-chest cardiopulmonary resuscitation (CPR). At 5 minutes of CA, 0.4 U/kg vasopressin and 12 µg/kg levosimendan were given intravenously and an infusion of

3 ml/kg hypertonic saline and dextran (7.5% saline, 6% dextran 70) was given over 20 minutes. Internal defibrillation was attempted from 7 minutes of CA to achieve restoration of spontaneous circulation (ROSC). If necessary, at 8 minutes of CA, 0.4 U/kg vasopressin was repeated intravenously. Hemodynamic variables, continuous cerebral cortical blood flow and blood gas parameters were measured during CPR and up to 180 minutes after ROSC. Blood samples for 8-iso-PGF2α, 15-keto-dihydro-PGF2α, protein S-100β and troponin I were taken.

Results ROSC was achieved in two out of five piglets. Only one of these piglets survived the whole experiment. Another piglet died 60 minutes after ROSC due to a new episode of ventricular fibrillation. It was difficult to achieve ROSC due to persistent ventricular fibrillation during CPR. The mean number of defibrillation attempts was 14.2 (range: 8-21). The mean coronary perfusion pressure was 19-21 mmHg during CPR. Piglets that achieved ROSC needed a constant dobutamine infusion for

hemodynamic stability. Concentrations of troponin I increased continuously after ROSC, reaching maximum levels at the end of the study. During the very early reperfusion phase (5-15 min after ROSC), the cerebral cortical blood flow was 18-47% greater than baseline values. Thereafter, it remained elevated by about 18% at 30 minutes, and decreased to baseline levels during the remainder of the experiment.

Conclusions A combination of levosimendan and vasopressin did not improve resuscitation success; instead, it produced ventricular fibrillation resistant to defibrillation attempts in this hypovolemic cardiac arrest model. Further studies are necessary to evaluate the effects of vasopressin and other inotropic agents in hypovolemic animal models. Reference

1. De Luca L, et al.: Eur Heart J 2006, 27:1908-1920.

Early experience with levosimendan in children with low-output syndrome after cardiac surgery

F Galas, L Hajjar, A Marques, A Rodrigues, C Silva, M Sundin, J Auler Jr

Heart Institute - Faculdade de Medicina da Universidade de Sao Paulo, Brazil

Critical Care 2008, 12(Suppl 2):P266 (doi: 10.1186/cc6487)

Introduction After cardiac surgery, most children require inotropic support. Dobutamine, dopamine and milrinone are not always effective, however, and in some cases these drugs are associated with significant adverse effects. Levosimendan, a new calcium-sensitizing agent with positive inotropic effects and vasodilating properties without catecholamine release, may be an alternative treatment for this syndrome. We sought to investigate the effect of levosimendan in children after cardiac surgery. Methods A prospective open-label study was carried out in 18 children. Their mean age was 42 months (5 days-18 years), the mean ejection fraction was 31% and all of them required one or more inotropic drugs for more than 24 hours before receiving levosimendan. Levosimendan was administered without a bolus dose, as an intravenous infusion of 0.2 µg/kg/min over 24 hours. Echocardiographic assessments of ventricular function were made before and 3-5 days after levosimendan infusion. Results The heart rate, systolic pressure, diastolic pressure, mean blood pressure, and central venous pressure were unchanged during and after levosimendan administration. Levosimendan allowed for discontinuation of catecholamines in 10 patients and a dose reduction in five patients. The dose of dobutamine was reduced from 8.4 µg/kg/min prelevosimendan to 3 µg/kg/min on day 5 (P < 0.01). The ejection fraction for the group improved from 31% to 40.5% (P < 0.01).

Conclusions Levosimendan can be safely administered to infants and children with low-output syndrome after cardiac surgery. Levosimendan allowed for significant reduction in catecholamine infusions and also produced an objective improvement in myocardial performance in children after cardiac surgery without significant adverse effects. References

1. Michaels AD, McKeown B, Kostal M, et al.: Effects of intravenous levosimendan on human coronary vasomotor regulation, left ventricular wall stress, and myocardial oxygen uptake. Circulation 2005, 111:1504-1509.

2. Kivikko M, Lehtonen L, et al.: Sustained hemodynamic effects of intravenous levosimendan. Circulation 2003, 107:81-86.

Effects of levosimendan in acute heart failure, cardiogenic and septic shock

JA Alhashemi, Q Alotaibi

King Abdulaziz University, Jeddah, Saudi Arabia

Critical Care 2008, 12(Suppl 2):P267 (doi: 10.1186/cc6488)

Introduction Levosimendan, a calcium sensitizer, improves cardiac output in patients with acute decompensated heart failure (ADHF). Its effect in patients with cardiogenic and septic shock remains unknown. We hypothesized that levosimendan improves cardiac output similarly in patients with ADHF (group A), cardiogenic (group C), and septic shock (group S).

Methods Sixty consecutive patients were enrolled in this prospective observational study. Levosimendan infusion was started in all patients at 0.05 µg/kg/min intravenously, without a bolus dose, and was increased by 0.05 µg/kg/min every 30 minutes to a maximum of 0.2 µg/kg/min intravenously, at which time levosimendan infusion was continued for 24 hours. The thermodilution cardiac output and central venous saturation (ScvO2) were measured at baseline and every 4 hours thereafter for a total of 48 hours. Hypotension (mean arterial pressure < 65 mmHg) was treated with norepinephrine, titrated to a mean arterial pressure > 65 mmHg.
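The stepwise up-titration described above can be sketched as a small function. The schedule values (start dose, increment, interval, maximum) are those reported; the function itself and its interface are illustrative, not part of the study protocol:

```python
def levosimendan_rate(minutes_elapsed: float,
                      start: float = 0.05, step: float = 0.05,
                      interval: float = 30.0, maximum: float = 0.2) -> float:
    """Infusion rate (ug/kg/min): start at 0.05, up-titrate by 0.05 every
    30 minutes until the 0.2 ug/kg/min maximum is reached."""
    steps_taken = int(minutes_elapsed // interval)
    return min(start + steps_taken * step, maximum)
```

Under this schedule the maximum rate is reached after 90 minutes, after which the infusion runs at 0.2 µg/kg/min for the remaining 24 hours.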

Results APACHE II scores were 15 ± 7, 18 ± 7, and 22 ± 7 (P = 0.01) and the median (IQR) ICU length of stay was 5 (3, 9), 8 (6, 23), and 11 (7, 21) days (P < 0.01) for groups A, C, and S, respectively. During the first 48 hours, the cardiac index increased within each group (P = 0.01) but there were no differences in cardiac index between the study groups (P = 0.58), and the ScvO2 did not change significantly within or between study groups (Figure 1). Group S received more norepinephrine than did the other groups (P < 0.05).

Conclusions Levosimendan has similar hemodynamic effects when administered to patients with ADHF, cardiogenic, and septic shock. It did not appear to have a significant effect on ScvO2.

Figure 1 (abstract P267)


Changes in central venous saturation (ScvO2).

Sedation during mechanical ventilation: a comparison of sedatonarcosis and awake sedation

B Volgyes, MM Mezei, PG Golopencza

Bajcsy Zs. Kórház, Budapest, Hungary

Critical Care 2008, 12(Suppl 2):P268 (doi: 10.1186/cc6489)

Introduction The most important goal during mechanical ventilation in the ICU is to achieve patient comfort and patient-ventilator synchrony. Once proper analgesia has been established, an infusion of a sedative should be added. The goal of this study was to investigate whether continuously awake sedation during mechanical ventilation (MV) decreased the days of ventilation and complications.

Methods All patients with MV who met the following criteria were included: age 20-70 years; community-acquired pneumonia; two-quadrant infiltrates; PaO2/FiO2 < 200; no other organ dysfunction. From June 2001 to February 2004, patients with MV received deep sedation (midazolam 0.03-0.20 mg/kg body weight/hour and propofol 0.5-2.0 mg/kg body weight/hour). This is the 'sedatonarcosis' group. From March 2004 to July 2007, patients were treated with 'awake sedation' (alprazolam 1.5-2.0 mg/day).

Results All of the patients received low tidal volume ventilation, de-escalation antibiotics, continuous correction of homeostasis, management of enteral feeding and pulmonary physiotherapy. In both groups we applied noninvasive respiratory therapy after extubation. It was possible to mobilise patients earlier - before the extubation - in the awake sedation group. See Table 1. Conclusions Adopting awake sedation during MV (compared with continuous sedatonarcosis) decreased the days on ventilation, and the lengths of ICU and hospital stay.

Multiparametric evaluation of sedation in the ICU

A Donati, L Botticelli, L Romagnoli, M Romanelli, S Marzocchini, C Anastasi, P Pelaia

Università Politecnica delle Marche, Torrette di Ancona, Italy

Critical Care 2008, 12(Suppl 2):P269 (doi: 10.1186/cc6490)

Introduction A problem for patients undergoing extended sedation is evidence of liver dysfunction. We wanted to verify whether this dysfunction is correlated with splanchnic hypoperfusion or with the extended administration of drugs. Methods During March-September 2007, four patients with pure cranial trauma (age 34-50 years) were treated with midazolam, propofol, sodium thiopental and fentanyl. Parameters evaluated: burst suppression ratio (BSR) with electroencephalography (Aespect; GE Health Care), functional capillary density (FCD) and mean velocity with Microscan and MAS (MicroVision Medical, Amsterdam, The Netherlands), plasma disappearance ratio (PDR)

with Pulsion LIMON (SEDA; Milano), intramucosal pH and regional PCO2 with Tonocap (GE Health Care), electrocardiography, mean arterial pressure and hematochemical examinations (transaminases, γ-glutamyl transpeptidase (GGT), serum bilirubin, serum amylases, serum lactates and drug dosages). Exclusion criteria: hepatopathy at admission, age <18 years or >60 years, BMI > 30, clinical factors favouring splanchnic hypoperfusion. All parameters were analysed at t0 (admission to the ICU) and then every 48 hours during sedation (t1, t2, t3) and at its end (t4, t5) using the Friedman test (P < 0.05) and the Spearman test (P < 0.05 and R > 0.6).

Results Transaminases increased at the end of sedation in three of the four patients. GGT increased earlier in all patients. Serum bilirubin was always in range. The increase of serum amylases in three of the four patients was correlated with propofol dosage. PDR was always >16%/min (cutoff for hepatic hypoperfusion). BSR was always >0% during t1, t2, t3. FCD was always steady and mean velocities were always high. Hepatic cytonecrosis indexes, GGT and serum amylases correlated well with splanchnic perfusion indexes (serum lactates, FCD and regional PCO2), so their increase is apparently not due to splanchnic hypoperfusion. Patient 1 (propofol, fentanyl early replaced with sodium thiopental, midazolam): increase of transaminases when midazolam was stopped; because of the onset of an epileptic status, the patient was withdrawn from the study. Patient 2 (propofol, midazolam, fentanyl): increase of transaminases associated with a paradoxical increase of splanchnic perfusion. Patient 3 (propofol, midazolam, fentanyl): increase of transaminases and serum amylases not associated with splanchnic hypoperfusion; serum amylases increased with increasing propofol dosage. Patient 4 (propofol, fentanyl): late increase of transaminases, serum amylases and GGT associated with propofol dosage.

Conclusions Drugs used for the analgosedation seem responsible for the increase of transaminases but not for the decrease of splanchnic perfusion. This study has to be confirmed by other studies recruiting more patients and with more precise exclusion criteria. Reference

1. Jacobi J, et al.: Crit Care Med 2002, 30:119-141.

Short-term sevoflurane sedation using the anaesthetic conserving device AnaConDa® after cardiac surgery: feasibility, recovery and clinical issues

KD Rohm, MW Wolf, J Boldt, T Schollhorn, A Schellhaass, SN Piper

Klinikum Ludwigshafen, Germany

Critical Care 2008, 12(Suppl 2):P270 (doi: 10.1186/cc6491)

Introduction With the approval of the anaesthetic conserving device (AnaConDa®), inhalative sedation in the ICU has become feasible [1]. Isoflurane has been investigated in postoperative and

Table 1 (abstract P268)


Patients                          MV (average, days)   Length of ICU stay (average, days)   Length of hospital stay (average, days)   Died (n)   Exclusion (n)

Sedatonarcosis group (n = 21)     6.37                 9.06                                 18.66                                     4          1

Awake sedation group (n = 23)     4.38                 7.22                                 15.16                                     3          2

                                  1.8357               1.28725                              1.2940

P value                           <0.05                <0.05                                <0.05                                     Not significant

critically ill patients using AnaConDa® [2,3], whereas sevoflurane sedation has only been reported in small observational series [4,5]. This randomised, single-blinded, BIS-controlled study was designed to evaluate, for the first time, sevoflurane via AnaConDa® compared with propofol with regard to recovery, sedation quality and consumption of anaesthetics.

Methods Seventy patients scheduled for elective coronary artery bypass graft surgery were randomised at admission to the ICU to receive either sevoflurane (n = 35) or propofol (n = 35) for postoperative sedation. The primary endpoint was recovery time from termination of sedation (extubation time, spontaneous eye opening and hand grip). Sedation quality (using the Richmond Agitation Sedation Scale, RASS), sevoflurane consumption, duration of ICU and hospital stays, and adverse side effects were documented. Results Median recovery times were significantly shorter (P < 0.002) with sevoflurane than with propofol (extubation time: 21.5 min (2-259) vs 150.5 min (22-910)). Mean sevoflurane consumption was 3.2 ± 1.4 ml/hour to obtain end-tidal concentrations of 0.5-1 vol%; mean administration of propofol was 2.4 ± 1.1 mg/kg/hour. Sedation quality was comparable in both groups (RASS -3 to -4), and no serious complications, including haemodynamic events related to either sedative drug, occurred. Length of stay in the ICU was similar in both groups, whereas patients receiving sevoflurane were discharged significantly (P < 0.03) earlier from hospital (10.6 ± 3.3 days vs 14 ± 7.7 days). Conclusions Sevoflurane administration via AnaConDa® is an efficacious and easily titratable way to provide postoperative sedation in the ICU. Recovery from sedation was faster with sevoflurane than with propofol, and resulted in a shorter ventilation time. Sevoflurane-sedated patients left hospital a mean of 3 days earlier than under the propofol-based regimen. References

1. Enlund M: Anaesthesia 2001, 56:429-432.

2. Sackey PV: Crit Care Med 2004, 32:2241-2246.

3. Hanafy M: Egyptian J Anaesth 2005, 21:237-242.

4. Soukup J: Intensiv- und Notfallbehandlung 2007, 32:29-36.

5. Berton J: Anesth Analg 2007, 104:130-134.

Thoracic epidural analgesia in upper abdominal surgery

S Vukosavljevic, T Randjelovic, D Pavlovic

KBC Bezanijska Kosa, Belgrade, Serbia

Critical Care 2008, 12(Suppl 2):P271 (doi: 10.1186/cc6492)

Introduction Surgery of the upper abdominal organs is painful and mutilating, and is associated with possible serious postoperative complications - pulmonary, abdominal (anastomosis related), cardiological, and thromboembolic.

Methods During the past year, 100 patients underwent upper abdominal surgery. According to analgesia type, they were divided into two groups. The first group (G1) received a first dose of local anesthetic (bupivacaine 0.25%, 5 ml) through a thoracic epidural catheter prior to general anesthesia. After that they underwent classical general anesthesia (midazolam, Diprivan, fentanyl, relaxant), followed by anesthesia with Diprivan 6 mg/kg/hour and analgesia with epidural local anesthetic. Postoperatively they received, through the thoracic epidural catheter, a combination of an opioid (morphine 2 mg) and local anesthetic (bupivacaine 0.125%, 6-8 ml) every 8 hours. The other group (G2) underwent classical general anesthesia followed by a classical mixture of nitrous oxide and oxygen with analgesia by fentanyl, and postoperative systemic analgesia with nonsteroidal anti-inflammatory drugs, paracetamol, and metamizole sodium. The parameters followed during surgery were arterial tension, heart rate, gas analysis, diuresis, and operating

field bleeding. The parameters followed postoperatively were the Visual Analog Scale, arterial tension, heart rate, gas analysis, onset of peristalsis, and pulmonary complications. Results Thoracic epidural analgesia during surgery provides better hemodynamic stability and lower blood loss from intraoperative bleeding, as well as statistically and clinically significantly better analgesia in the first 72 postoperative hours compared with systemic analgesia (Visual Analog Scale, G1 < 8 on movement vs G2 > 30 on movement); it reduces the period of postoperative ileus (by 1.06 days), reduces pulmonary and cardiologic complications, provides early patient mobilization and decreases the number of intensive postoperative care days (by 3.9 days). Conclusions Our experience shows that thoracic epidural analgesia is the right choice, because it provides effective pain relief, prevention of postoperative complications and early patient mobilization, and reduces the length of stay in the ICU. References

1. Bromage PR: Br J Anaesth 1975, 47:199-212.

2. Bonica JJ, et al.: Anesthesiology 1970, 33:619-626.

Etomidate and relative adrenal insufficiency in cardiopulmonary bypass surgery: impact on the postoperative hemodynamic status

L Lorenzo, M Brouard, J Iribarren, J Jimenez, L Lorente, R Perez, S Palmero, R Santacreu, R Martinez, M Mora

Hospital Universitario de Canarias, La Laguna, Tenerife, Spain

Critical Care 2008, 12(Suppl 2):P272 (doi: 10.1186/cc6493)

Introduction Use of etomidate during anesthetic induction for cardiopulmonary bypass (CPB) surgery is usual practice. The objective of this study was to determine the incidence of relative adrenal insufficiency (RAI) in CPB patients after etomidate administration and the impact on hemodynamic status. Methods A prospective cohort study was performed on CPB patients who did or did not receive etomidate during anesthetic induction. Patients were excluded if they had received systemic or inhaled corticosteroids or immunosuppressants, or had an active preoperative infection. RAI was defined as a rise in serum cortisol <9 µg/dl after the administration of 250 µg cosyntropin. Cortisol levels were measured preoperatively, immediately before and 30 minutes, 60 minutes and 90 minutes after the administration of cosyntropin (250 µg). We used SPSS version 15. Results We studied 65 patients (74% men), mean age 68 ± 11 years. The incidence of RAI was 89.4% in these patients compared with 50% in patients who did not receive etomidate (P = 0.01) (Table 1). Higher postoperative cortisol levels were associated with lower doses of norepinephrine at 4 hours post CPB. Cortisol levels in etomidate patients were inversely proportional to the need for norepinephrine. Finally, the maximum increase of cortisol levels after ACTH stimulation was directly associated with the systemic resistance index in non-etomidate patients at 4 postoperative hours.

Table 1 (abstract P272)

Incidence of relative adrenal insufficiency

RAI        Etomidate (n = 47)    Others (n = 18)    P value

Yes (%)    89.4                  50                 0.01

No (%)     10.6                  50

Conclusions The use of etomidate was associated with RAI post CPB surgery. Cortisol levels were related to the postoperative hemodynamic profile and the need for vasopressor drugs.
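The RAI criterion used in this study (a rise in serum cortisol of less than 9 µg/dl after 250 µg cosyntropin) can be expressed as a small check. The function name and sampling structure below are illustrative, not from the study:

```python
def has_rai(baseline_cortisol: float, stimulated_cortisols: list) -> bool:
    """Relative adrenal insufficiency: peak rise over baseline < 9 ug/dl
    after cosyntropin (samples here stand in for the 30/60/90-minute draws)."""
    peak_rise = max(stimulated_cortisols) - baseline_cortisol
    return peak_rise < 9.0

# e.g. baseline 12 ug/dl, post-stimulation 15/17/16 ug/dl -> rise of 5 -> RAI
```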

Economic evaluation of remifentanil-based versus conventional sedation for patients with an anticipated mechanical ventilation duration of 2-3 days in Germany

MJ Al1, J Martin2, J Bakker1, R Welte3

1Erasmus MC, University Medical Center, Rotterdam, The Netherlands; 2Klinik am Eichert, Goeppingen, Germany; 3GlaxoSmithKline, Munich, Germany

Critical Care 2008, 12(Suppl 2):P273 (doi: 10.1186/cc6494)

Introduction Hospitals are increasingly forced to consider the economics of technology use. We estimated the incremental cost-consequences of remifentanil-based sedation (RS) versus conventional sedation (CS) in ICU patients with an anticipated mechanical ventilation (MV) time of 2-3 days for Germany. Methods A probabilistic Markov model was utilized that describes the patient flow in the ICU using eight model states: MV - maintenance, MV - eligible to start weaning, MV - actual weaning started, MV - eligible for extubation, ICU - extubated, ICU - eligible for discharge, Discharged from ICU, and Death. At every hour, patients stay in the current state, move to the next state, or die. The respective transition probabilities and the utilization of sedation drugs were derived from UltiSAFE, a Dutch open-label trial with 205 critically ill patients. In UltiSAFE, patients received either CS (predominantly morphine or fentanyl combined with propofol or midazolam) or RS (remifentanil, combined with propofol if required). Unit prices for drugs and total costs per ICU-hour with and without MV were collected in one 12-bed adult mixed ICU in a general German hospital. Material, staff and overhead costs were considered. All costs were measured from a hospital perspective with 2006 as the reference year. In accordance with the UltiSAFE target population, only patients who started weaning within 72 hours of the start of treatment were included.
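The hourly-cycle patient-flow model can be sketched as follows. This is a minimal illustration of the eight-state structure only: the transition probabilities here are placeholders, not the UltiSAFE-derived values used in the study.

```python
import random

# The eight model states, in order of progression.
STATES = [
    "MV - maintenance",
    "MV - eligible to start weaning",
    "MV - actual weaning started",
    "MV - eligible for extubation",
    "ICU - extubated",
    "ICU - eligible for discharge",
    "Discharged from ICU",
    "Death",
]
ABSORBING = {"Discharged from ICU", "Death"}

def step(state: int, p_progress: float = 0.05, p_death: float = 0.001) -> int:
    """One hourly cycle: stay in the current state, move to the next, or die."""
    if STATES[state] in ABSORBING:
        return state
    r = random.random()
    if r < p_death:
        return STATES.index("Death")
    if r < p_death + p_progress:
        return state + 1
    return state

def simulate(max_hours: int = 2000, seed: int = 1):
    """Run one simulated patient; return (final state, hours until absorption)."""
    random.seed(seed)
    state, hours = 0, 0
    while hours < max_hours and STATES[state] not in ABSORBING:
        state = step(state)
        hours += 1
    return STATES[state], hours
```

In the actual model, costs per ICU-hour (with and without MV) would be accumulated along each simulated trajectory and averaged over many runs per treatment arm.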

Results The average durations of MV and of ICU stay were shorter by 0.9 and 0.8 days, respectively, in the RS group compared with the CS group, while the average costs per patient were €6,157 in the RS group versus €7,160 in the CS group. The savings from the shorter length of stay therefore more than offset the additional drug acquisition costs, yielding savings of €1,003 per patient in the RS group. The probability of RS being cost-saving was estimated at 91%.

Conclusions RS seems to be the economically preferred option for patients with an anticipated MV time of 2-3 days: RS decreases the length of stay in the ICU, the total costs per patient and the duration of MV, which is a risk factor for ventilator-associated morbidity.

Propofol pharmacokinetics in preterm and term neonates: the relevance of both postmenstrual and postnatal age

K Allegaert1, M Peeters2, R Verbesselt1, D Tibboel3, G Naulaers1, J De Hoon1, C Knibbe2

1University Hospitals Leuven, Division of Woman and Child, Leuven, Belgium; 2Leiden/Amsterdam Centre for Drug Research, Leiden, The Netherlands; 3Sophia Children Hospital, Erasmus MC, Rotterdam, The Netherlands

Critical Care 2008, 12(Suppl 2):P274 (doi: 10.1186/cc6495)

Introduction Although the disposition of propofol has been extensively studied in different adult and paediatric populations, data in neonates are still very limited. Preliminary data in neonates suggested that propofol clearance is significantly different compared with toddlers and children, with important interindividual variability in propofol

clearance in neonates [1]. We therefore wanted to document covariates that contribute to interindividual variability in propofol pharmacokinetics in preterm and term neonates. Methods Population pharmacokinetics were estimated (nonlinear mixed-effects model) based on arterial blood samples collected in (pre)term neonates after intravenous bolus administration of propofol (3 mg/kg, 10 s). Covariate analysis included postmenstrual age (PMA), postnatal age (PNA), gestational age, weight and creatinaemia.

Results Two hundred and thirty-five arterial concentration-time points were collected in 25 neonates. The median weight was 2,930 (range 680-4,030) g, PMA 38 (27-43) weeks and PNA 8 (1-25) days. In a three-compartment model, PMA was the most predictive covariate for clearance (P < 0.001) when parameterized as [CLstd × (PMA / 38)^11.5]. The standardized propofol clearance (CLstd) at 38 weeks PMA was 0.029 l/min. The addition of a fixed value in neonates with a postnatal age >10 days further improved the model (P < 0.001) and resulted in the equation [CLstd × (PMA / 38)^11.5 + 0.03] for neonates >10 days old. Values for the central volume (1.32 l), peripheral volume 1 (15.4 l) and peripheral volume 2 (1.29 l) were not significantly influenced by any of the covariates (P > 0.001).
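The reported clearance model can be written out as a small function. The parameter values are those reported above; the function name and interface are illustrative:

```python
def propofol_clearance(pma_weeks: float, pna_days: float,
                       cl_std: float = 0.029) -> float:
    """Predicted propofol clearance (l/min) in a (pre)term neonate:
    CL = CLstd * (PMA/38)^11.5, plus a fixed 0.03 l/min when PNA > 10 days."""
    cl = cl_std * (pma_weeks / 38.0) ** 11.5
    if pna_days > 10:
        cl += 0.03
    return cl

# The steep exponent means, for example, that a neonate at 32 weeks PMA is
# predicted to clear propofol at only about 14% of the 38-week standard value.
```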

Conclusions PMA and PNA contribute to the interindividual variability of propofol clearance with very fast maturation of clearance in neonatal life. This implicates that preterm neonates and neonates in the first week of postnatal life are at an increased risk for accumulation during either intermittent bolus or continuous administration of propofol. Reference

1. Allegaert K, et al.: Crit Care 2007, 11:S170.

Comparison of sedation with dexmedetomidine versus lorazepam in septic ICU patients

P Pandharipande1, TD Girard1, RD Sanders2, JL Thompson1, M Maze2, EW Ely1

1Vanderbilt University, Nashville, TN, USA; 2Imperial College of London, UK

Critical Care 2008, 12(Suppl 2):P275 (doi: 10.1186/cc6496)

Introduction New strategies for sedation in mechanically ventilated (MV) patients have yielded improvements in patient outcomes, including acute brain dysfunction, but the differential effect of sedation regimens across patient diagnosis categories is not known. In this pilot project, we evaluated the impact of sedation using dexmedetomidine versus lorazepam in an a priori determined subgroup of septic patients enrolled in the MENDS trial [1]. Methods The MENDS study enrolled 103 adult medical/surgical MV patients and excluded those with neurological disease, severe liver failure, active coronary ischemia, and seizures. Patients were randomized in a double-blind fashion to receive dexmedetomidine (DEX)-based (maximum 1.5 µg/kg/hour) or lorazepam (LZ)-based (maximum 10 mg/hour) sedation for up to 5 days, titrated to a target Richmond Agitation-Sedation Scale score. Patients were monitored for delirium daily with the Confusion Assessment Method for the ICU.

Results Thirty-nine patients in the MENDS study were admitted with sepsis, with 19 in the DEX group and 20 in the LZ group. Baseline demographics, ICU type and admission diagnoses of this septic subgroup were balanced between DEX and LZ, with the median (interquartile range) age being 57 (49, 66) vs 55 (44, 65) years, P = 0.66, and APACHE II scores of 30 (24, 32) vs 28.5 (25, 32), P = 0.86, respectively. The median DEX dose was 0.9 µg/kg/hour and the median LZ dose was 3 mg/hour. DEX

patients had greater delirium and coma-free days (8 (4, 10) vs 1.5 (0.7, 5) days, P = 0.002), delirium-free days (10 (7.5, 11) vs 7.4 (4, 8.2) days, P = 0.007), MV-free days (9.5 (0, 11.6) vs 2 (0, 8.5) days, P = 0.037) and a reduction in the risk of dying at 28 days (hazard ratio 0.3 (0.1, 0.9), P = 0.036) as compared with the LZ patients.

Conclusions In this subgroup analysis of severe sepsis patients from the MENDS trial, sedation incorporating dexmedetomidine reduced the duration of delirium and coma and the length of time on the ventilator, and decreased the risk of dying as compared with lorazepam. This serves as a hypothesis-generating analysis to help direct further prospective study in such patients. Reference

1. Pandharipande P, et al.: Effect of sedation with dexmedetomidine vs lorazepam on acute brain dysfunction in mechanically ventilated patients: the MENDS randomized controlled trial. JAMA 2007, 298:2644-2653.

Dexmedetomidine-based sedation for noninvasive ventilation failure

O Senturk, O Demirkiran, T Utku, S Urkmez, Y Dikmen

Istanbul University Cerrahpasa Medical School, Istanbul, Turkey

Critical Care 2008, 12(Suppl 2):P276 (doi: 10.1186/cc6497)

Introduction Noninvasive ventilation (NIV) reduces intubation rates and mortality in patients with acute-on-chronic respiratory failure. NIV is, however, associated with a large number of failures and with patient refusal. The purpose of the study was to assess the feasibility and safety of dexmedetomidine-based sedation during NIV. Methods In this prospective, randomised controlled study, patients on NIV support with agitation and ventilatory discomfort were included. The patients were allocated randomly into two groups: dexmedetomidine (Dex) and control. In the Dex group, the infusion rate was 0.2-0.7 µg/kg/hour to reach a Ramsay sedation score (RSS) between 3 and 4. Haemodynamic and respiratory characteristics and the RSS were documented at 1 minute, 10 minutes, 30 minutes, 1 hour, 4 hours and 24 hours after Dex infusion was started. When additional sedation was needed, 0.02-0.03 mg/kg intravenous midazolam was used. Spontaneous ventilation and NIV support durations, the total infusion time (hours), the total Dex consumption, the reason for cessation of infusion therapy, additional sedative agent requirements, and the duration of mechanical ventilation were documented. Results Thirty patients under NIV support with agitation and ventilatory discomfort were included in this study. The results in the Dex group are summarized in Table 1. Additional sedative agent requirement was significantly higher in the control group. Side effects such as hypotension and hypoglycemia were found in the Dex group.

Table 1 (abstract P276)

Respiratory and haemodynamic characteristics and Ramsay sedation score variables

1 minute 1 hour P value

PaO2/FiO2 159 ± 47 239 ± 80 <0.01

SaO2 95.8 ± 4 98.8 ± 1 <0.05

PaO2 83.6 ± 27 130 ± 31 <0.05

Respiratory rate 30.4 ± 5 24.5 ± 6 <0.05

Mean arterial pressure 86.6 ± 12 77.7 ± 11 <0.01

Heart rate 108 ± 23 98.9 ± 18 <0.05

RSS 1.0 ± 0 2.6 ± 0.8 <0.01

Conclusions Dexmedetomidine is safe and effective for the sedation of patients under NIV support.


Reference
1. Cooper AB, Thornley KS, Young GB, et al.: Chest 2000, 117:809-818.

Dexmedetomidine for endovascular neurosurgery

Y Kase, T Obata

Jikei University School of Medicine, Tokyo, Japan

Critical Care 2008, 12(Suppl 2):P277 (doi: 10.1186/cc6498)

Introduction Perioperative management for endovascular neurosurgery requires several considerations, such as minimizing hemodynamic changes and especially avoiding the blood pressure elevation accompanying extubation, immobilizing the trunk and lower limbs until bleeding from the site of femoral artery sheath removal has stopped, and monitoring patients in the ICU for at least 12 hours so that neurological deterioration is noticed promptly. The aim of this study was to assess the usefulness of dexmedetomidine as a postoperative sedative drug for endovascular neurosurgery. Methods The study included 182 patients undergoing endovascular neurosurgery admitted to the ICU in 2006. The authors evaluated the postoperative sedative state and hemodynamics with the Richmond Agitation Sedation Scale (RASS) [1], heart rate (HR) and mean arterial pressure (MAP). To examine the time-dependent changes of the RASS, MAP and HR, data were collected from medical records, including at the start and end of dexmedetomidine infusion and at the time of extubation. Results The surgical indications in patients undergoing endovascular neurosurgery were unruptured cerebral aneurysm (57%), subarachnoid hemorrhage (20%), arteriovenous malformation (5%), arteriovenous fistula (3%) and others (15%). One hundred and eighteen patients (85.3%) received dexmedetomidine. The RASS showed that patients receiving dexmedetomidine experienced RASS -1 to -3 states, arousable with verbal stimulation. There were no dexmedetomidine-induced HR and MAP deteriorations; furthermore, dexmedetomidine prevented the blood pressure elevation accompanying extubation. The overall morbidity and mortality rates relating to endovascular neurosurgery were 1.6% and 0.54%, respectively.

Conclusions The use of sedative drugs in the postoperative management of neurovascular disease may be controversial; however, dexmedetomidine facilitated the postoperative management of endovascular neurosurgery. Compared with the reported endovascular neurosurgery morbidity (3.7-5%) and mortality (1.1-1.5%), our morbidity and mortality rates suggest that dexmedetomidine did not cause neurological deterioration. Reference

1. Ely EW, Truman B, Shintani A, et al.: JAMA 2003, 289:2983-2991.

Single-centre audit on the use of intravenous paracetamol in neonates

C Vanhole, K Allegaert, F van Beek, A Debeer, T de Rijdt, G Naulaers

University Hospitals Leuven, Belgium

Critical Care 2008, 12(Suppl 2):P278 (doi: 10.1186/cc6499)

Introduction An intravenous formulation of paracetamol is available, but remains off-label in patients below 10 kg, although its pharmacokinetics during repeated intravenous administration have been documented and dosing regimens suggested [1,2]. We therefore retrospectively evaluated aspects of the administration of intravenous paracetamol in neonates.

Methods A single-centre retrospective study. Data were collected in neonates born and admitted between 1 January 2006 and 1 October 2007 to whom intravenous paracetamol was administered. In these patients, clinical data (age, duration of treatment, switch to oral treatment, liver enzymes during and up to 2 days after intravenous treatment) were retrieved. Correlations (Spearman rank) of hepatic enzymes with duration of treatment (hours) and differences in liver enzymes during/after (Mann-Whitney U test) were investigated.

Results Information on 2,360 administrations in 189 cases (postmenstrual age 38 (range 30-55) weeks, postnatal age 5 (1-182) days) was available. The median duration of administration was 48 (6-480) hours. The indication for initiation of intravenous paracetamol was postoperative analgesia in about 50% of cases, the most frequent surgical interventions being cardiac surgery (39 cases), abdominal surgery (31 cases), thoracic surgery (16 cases) and neurosurgery (seven cases). A switch to oral treatment was documented in only 68/189 cases, end of paracetamol administration in 84 cases, and insufficient analgesia (unscheduled initiation of opioids) in 23 cases. Six hundred and forty-nine observations on liver enzymes (ALT 280, AST 284, γGT 85) during and 174 (74, 75 and 25, respectively) after intravenous administration were available. No significant correlations between liver enzymes and duration of administration were observed, and there was no significant difference in liver enzymes during versus after intravenous administration.

Conclusions The current observations in 189 (pre)term neonates suggest that intravenous paracetamol does not alter hepatic enzyme profiles during or after intravenous administration in this specific population, and therefore seems to be a safe drug. The switch to oral treatment was documented in only a relatively limited number of patients, probably reflecting the need to implement strategies to facilitate this switch. References

1. Anderson B, et al.: Pediatr Anesth 2005, 15:282-292.

2. Allegaert K, et al.: Pediatr Anesth 2007, 17:811-812.

Postmenstrual age and CYP2D6 polymorphisms determine urinary tramadol O-demethylation excretion in critically ill neonates and infants

K Allegaert1, R Verbesselt1, R Van Schaik2, J Van den Anker2, C Vanhole1, A Debeer1, J De Hoon1

1University Hospitals Leuven, Belgium; 2Erasmus MC, Rotterdam, The Netherlands

Critical Care 2008, 12(Suppl 2):P279 (doi: 10.1186/cc6500)

Introduction Interindividual variability in drug metabolism is based on constitutional, environmental and genetic factors but mainly reflects ontogeny in early neonatal life. We therefore wanted to document determinants of O-demethylation activity in critically ill neonates and young infants.

Methods Tramadol (M) and O-demethyltramadol (M1) concentrations were determined in 24-hour urine collections from neonates receiving continuous intravenous tramadol [1]. Samples were analysed by an HPLC method described earlier [2]. The log M/M1 of each 24-hour urine collection was calculated, and correlations with clinical characteristics and CYP2D6 polymorphisms were investigated.
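The central computation here, the log urinary metabolic ratio and its rank correlation with a clinical covariate, can be sketched as follows; the values and variable names are hypothetical illustrations, not study data:

```python
import numpy as np
from scipy import stats

# Hypothetical 24-hour urine data: tramadol (M) and O-demethyltramadol (M1)
# concentrations (arbitrary units) and postmenstrual age (PMA, weeks).
m = np.array([12.0, 9.5, 8.1, 6.4, 5.0, 3.8])
m1 = np.array([0.9, 1.1, 1.6, 2.2, 3.1, 4.0])
pma = np.array([28.0, 31.0, 34.0, 37.0, 40.0, 43.0])

# A lower log M/M1 means proportionally more O-demethylation activity.
log_ratio = np.log10(m / m1)

# Spearman rank correlation of log M/M1 with PMA, as in the abstract.
rho, p_value = stats.spearmanr(log_ratio, pma)
print(f"Spearman rho = {rho:.2f}")  # negative: activity rises with maturation
```

A forward multiple regression, as described in the Results, would then add the number of active CYP2D6 alleles as a second candidate predictor.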

Results Based on 86 urine collections, a significant correlation between the urine log M/M1 (0.98, SD 0.66) and postmenstrual age (PMA) (r = -0.69) was observed. One-way analysis of variance documented a significant decrease in log M/M1 with an increasing number of active CYP2D6 alleles. In a forward multiple regression model, PMA and the number of active CYP2D6 alleles remained independent determinants of the urine log M/M1. Conclusions Both ontogeny (PMA) and CYP2D6 polymorphisms already contribute to the interindividual variability of phenotypic O-demethylation activity of tramadol in critically ill (pre)term neonates and young infants. The current observations are of pharmacodynamic relevance. In addition, to our knowledge this is the first illustration of the simultaneous impact of both age and genetic polymorphisms on drug metabolism in early life. References

1. Allegaert K, et al.: Br J Anaesth 2005, 95:231-239.

2. Allegaert K, et al.: Eur J Clin Pharmacol 2005, 61:837-842.

Pressure support ventilation improves oxygenation by redistribution of pulmonary blood flow in experimental lung injury

A Carvalho1, P Spieth1, P Pelosi2, B Neykova1, A Heller1, T Koch1, M Gama de Abreu1

1University Clinic Carl Gustav Carus, Dresden, Germany;

2University of Insubria, Varese, Italy

Critical Care 2008, 12(Suppl 2):P280 (doi: 10.1186/cc6501)

Introduction The presence of spontaneous breathing (SB) activity may improve gas exchange during mechanical ventilation. Pressure support ventilation (PSV) is one of the most frequently used modes of assisted mechanical ventilation, but little is known about the mechanisms of improvement of lung function during PSV. To shed light on this issue, we evaluated the regional distribution of aeration and pulmonary blood flow (PBF) during controlled and assisted mechanical ventilation with PSV in experimental acute lung injury.

Methods In five anesthetized, controlled mechanically ventilated pigs, acute lung injury was induced by surfactant depletion. The ventilatory mode was switched to biphasic intermittent positive airway pressure ventilation and the depth of anesthesia was reduced to resume SB. When SB represented 20% of the minute ventilation, animals were ventilated with PSV for 1 hour. Measurements of lung mechanics, gas exchange and hemodynamics, as well as whole-lung computed tomography at mean airway pressure, were obtained at baseline, after injury and during assisted ventilation with PSV. In addition, PBF was marked with intravenously administered fluorescent microspheres and spatial cluster analysis was used to determine the effects of the interventions on the distribution of PBF. Statistical analysis was performed with Wilcoxon's test and P < 0.05 was considered significant.

Results In injured lungs under controlled mechanical ventilation, impairment of oxygenation was associated with a significant increase of poorly aerated and nonaerated areas in dependent lung regions. Resumption of SB and assisted mechanical ventilation with PSV led to a decrease in mean airway pressures and improvement in oxygenation, but not in total and dependent lung aeration. However, redistribution of PBF toward well aerated nondependent regions was observed.

Conclusions The improvement of oxygenation during PSV seems not to result from recruitment of dependent lung areas, but rather from redistribution of PBF from dependent, less aerated lung zones toward better aerated, nondependent lung regions.

Model based analysis reveals differences in viscoelasticity between acute respiratory distress syndrome and healthy lungs

S Ganzert1, K Möller2, CA Stahl1, D Steinmann1, S Schumann1, J Guttmann1

1University Hospital Freiburg, Germany; 2Hochschule Furtwangen University, Villingen-Schwenningen, Germany

Critical Care 2008, 12(Suppl 2):P281 (doi: 10.1186/cc6502)

Introduction We hypothesized that the time course of the slow pressure drop after interruption of inspiratory flow contains information about the underlying lung disease. Respiratory data obtained from repetitive inspiratory flow interruption manoeuvres were compared between mechanically ventilated patients without pulmonary disease and patients with acute respiratory distress syndrome (ARDS), and were analyzed using a model describing the Newtonian and viscoelastic properties of the lung.

Methods Inspiratory airflow was repetitively interrupted for 3 seconds after inflation of 100 ml volume steps up to a maximum plateau pressure of 45 mbar by means of an automated super-syringe manoeuvre (Evita 4Lab; Dräger Medical, Lübeck, Germany). Twelve patients with healthy lungs and 20 patients suffering from ARDS were investigated. We determined the Newtonian and viscoelastic units of a model (Figure 1a) [1,2] by mathematically fitting the model to segments of the pressure curve (Figure 1b). The slow pressure drop was described by viscoelasticity (resistance R2, compliance C2).
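As a sketch of the model-fitting step: after flow interruption, the slow pressure drop toward the plateau is described by a mono-exponential decay with time constant T = R2 x C2. The example below fits synthetic data with SciPy; it is an illustrative assumption, not the authors' implementation:

```python
import numpy as np
from scipy.optimize import curve_fit

# Slow pressure drop after airflow interruption:
# P(t) = P_plat + dP * exp(-t / tau), where tau = R2 * C2 is the
# time constant of the viscoelastic unit.
def slow_decay(t, p_plat, dp, tau):
    return p_plat + dp * np.exp(-t / tau)

# Synthetic 3-second occlusion trace (mbar) with a little measurement noise.
t = np.linspace(0.0, 3.0, 60)
rng = np.random.default_rng(0)
p = slow_decay(t, 20.0, 5.0, 0.9) + rng.normal(0.0, 0.05, t.size)

popt, _ = curve_fit(slow_decay, t, p, p0=(15.0, 3.0, 0.5))
p_plat, dp, tau = popt
print(f"fitted tau = {tau:.2f} s")
# Given an independent estimate of C2 (e.g. from the volume step and dP),
# the viscoelastic resistance follows as R2 = tau / C2.
```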

Results Analysis of time constant T of the viscoelastic unit revealed no differences between ARDS and healthy patients (Figure 2a). However, compliance C2 (Figure 2b) and resistance R2 (Figure 2c) of the viscoelastic unit were significantly different. C2 was lower

Figure 1 (abstract P281)

(a, left) Newtonian (R1, C1) and viscoelastic unit (R2, C2). (b, right) Data fit.

and R2 was higher in ARDS patients. The time constant, as well as C2 and R2, showed a significant dependency on pressure. Conclusions The time constant of the viscoelastic unit determines the decay of the pressure curve after airflow interruption. Identical time constants mean that there is no significant difference in this decay between healthy and ARDS lungs. Only the model-based analysis revealed the significant difference in viscoelasticity associated with ARDS. References

1. Jonson B, et al.: J Appl Physiol 1993, 75:132-140.

2. Bates JH, et al.: J Appl Physiol 1985, 58:1840-1848.

Short time efficacy and safety of modified pressure-controlled ventilation recruitment maneuver in a group of patients with acute respiratory distress syndrome

G Georgiev, S Milanov, V Todorova, M Milanov

Pirogov Emergency Institute, Sofia, Bulgaria

Critical Care 2008, 12(Suppl 2):P282 (doi: 10.1186/cc6503)

Introduction Recruitment maneuvers (RM) attempt mechanical homogenization and improvement of V/Q matching in the heterogeneous acute respiratory distress syndrome (ARDS) lung in a short time. Apart from the postulated beneficial effect, their performance carries a risk of serious adverse events. Methods The study included 17 consecutive ARDS patients placed on baseline ventilation with standardized ventilatory parameters. A pressure-controlled ventilation RM was then applied for 2 minutes and consisted of: peak inspiratory pressure = 45 mbar, respiratory rate = 10/min, I:E = 1:1, and positive end-expiratory pressure (PEEP) = 20 mbar for the first minute and 25 mbar for the remaining time. Predefined safety criteria were used for premature RM termination. Patients with a minimum 30% PaO2/FiO2 increment on the fifth minute after RM completion were judged responders. Those with a prematurely terminated RM and nonresponders were excluded from the subsequent study. In the remaining group, a decremental PEEP trial was then conducted. ECG, SpO2, invasive systemic arterial pressures, Paw, exhaled Vt/MV and total respiratory system compliance (Ctot) were continuously monitored and their representative values were recorded for different time periods. Arterial blood samples for blood gas analysis were taken immediately before the RM, on the fifth minute and on the sixth hour after RM completion. Twenty-four hours after the RM, a bedside chest X-ray was conducted for extraalveolar air detection.

Results Six patients (35.3%) were considered nonresponders, and in one of them RM was prematurely terminated. In the responders' group there was statistically significant PaO2/FiO2

Figure 2 (abstract P281)

Results for the viscoelastic unit. (a, left) Time constant T; (b, middle) compliance C2; (c, right) resistance R2.

increment on the fifth minute after the RM, which was preserved on the sixth hour. The PaO2/FiO2 increment was also significant in the nonresponders' group, but of smaller magnitude. There was also a significant increase in Ctot and a PaCO2 decrement in the responders' group on the sixth hour. No significant changes in PaCO2 and Ctot were noted in the nonresponders' group. PaO2/FiO2 on the fifth minute after the RM was not significantly different between responders and nonresponders, but PaCO2 and Ctot were. None of the monitored hemodynamic parameters changed significantly at any time in either group. No clinical or radiographic signs of barotrauma were found.

Conclusions The described pressure-controlled ventilation RM with decremental PEEP titration effectively increased arterial oxygenation in the short term. Surrogate markers of alveolar recruitment were also influenced. The RM showed good tolerability with regard to hemodynamic stability and barotrauma potential.

Effect of frequency on lung protection during high-frequency oscillation ventilation in a sheep acute respiratory distress syndrome model

S Liu, H Qiu, Y Yang, Q Chen

Nanjing Zhong-da Hospital and School of Clinical Medicine, Southeast University, Nanjing, China

Critical Care 2008, 12(Suppl 2):P283 (doi: 10.1186/cc6504)

Introduction The objective was to evaluate the effect of frequency on the prevention of ventilator-induced lung injury during high-frequency oscillation ventilation (HFOV) in a sheep acute respiratory distress syndrome (ARDS) model. Methods Twenty-four adult sheep (38.3 ± 2.3 kg) were randomly divided into four groups (n = 6): three HFOV groups (3 Hz, 6 Hz, 9 Hz) and a conventional mechanical ventilation (CMV) group. After induction of the ARDS model (PaO2 < 60 mmHg) by repeated normal saline lavage, step-by-step lung recruitment was performed in all groups, with optimal alveolar recruitment defined as PaO2 > 400 mmHg. After this recruitment procedure, the optimal mean airway pressure was selected by decreasing it by 2 mmHg every 5 minutes until the PaO2 decreased below 400 mmHg, and ventilation was continued for 4 hours. Hemodynamics, respiratory mechanics and gas exchange were measured throughout the experiment, and lung histopathological changes, the lung wet/dry weight ratio, lung myeloperoxidase (MPO) activity, and lung and plasma IL-6 expression (ELISA) were determined. Results The heart rate, mean arterial pressure, cardiac output, central venous pressure and pulmonary arterial wedge pressure did not differ among the four groups during the experiment (P > 0.05). The mean pulmonary arterial pressure was significantly higher in the HFOV group after 4 hours than in the CMV group (P < 0.05). After lung recruitment, sustained improvements in the oxygenation index were observed in all groups. Lung compliance and the intrapulmonary shunt (Qs/Qt) were significantly improved in the 6 Hz and 9 Hz HFOV groups after 4 hours of ventilation (P < 0.05). The amplitude of alveolar pressure was significantly lower in the 9 Hz HFOV group during the experiment (P < 0.05). Histologically, the lung injury score was significantly lower in the 9 Hz HFOV group than in the other groups (P < 0.05). The lung wet/dry weight ratio did not differ among the four groups (P > 0.05). The lung MPO activity and the expression of IL-6 in lung tissue and blood plasma were significantly reduced in the 6 Hz and 9 Hz HFOV-treated animals (P < 0.05).

Conclusions Compared with CMV and low frequency in HFOV, the higher frequency in HFOV results in less lung injury. HFOV may be an optimal lung-protective strategy.

Hungarian sites of the multinational VALID study are responsive to training measures advocating the ARDS Network ventilation protocol

K Madach1, Z Marjanek2, A Ortutay2, P Golopencza3, D Kiss4, Z Koroknai5, F Taut6

1Semmelweis University, Budapest, Hungary; 2Vac Hospital, Vac, Hungary; 3Bajcsy Zsilinszky Hospital, Budapest, Hungary; 4Jahn Ferenc Hospital, Budapest, Hungary; 5Omnicare Clin Research Kft, Budapest, Hungary; 6Nycomed, Konstanz, Germany

Critical Care 2008, 12(Suppl 2):P284 (doi: 10.1186/cc6505)

Introduction Adherence to the ARDS Network low-stretch ventilation protocol [1] is recommended as standard of care in the VALID study, a randomised double-blind mortality study of rSP-C surfactant (Venticute®) in intubated and mechanically ventilated patients with severe respiratory failure due to pneumonia or aspiration. Ten Hungarian study sites have recruited about 9% of patients in the VALID study to date.
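For context, the low-stretch protocol's tidal-volume targets are expressed per kilogram of predicted body weight (PBW), which the ARDS Network computes from sex and height. A minimal sketch:

```python
# ARDS Network predicted body weight (PBW), used to normalize tidal volume (VT):
# male:   50   + 0.91 * (height_cm - 152.4)
# female: 45.5 + 0.91 * (height_cm - 152.4)
def predicted_body_weight(height_cm: float, male: bool) -> float:
    return (50.0 if male else 45.5) + 0.91 * (height_cm - 152.4)

def vt_ml_per_kg_pbw(vt_ml: float, height_cm: float, male: bool) -> float:
    return vt_ml / predicted_body_weight(height_cm, male)

# Example: a 500 ml tidal volume in a 175 cm man is about 7.1 ml/kg PBW,
# above the 6 ml/kg PBW low-stretch target.
print(round(vt_ml_per_kg_pbw(500.0, 175.0, male=True), 1))  # 7.1
```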

Methods The VALID study is conducted in more than 140 medical centers in 23 countries. Adherence to ARDS Network ventilation standards was assessed after 200 patients were randomised. Subsequently, an intensified training program to promote low-stretch ventilation was conducted during site visits and investigator meetings, through letters and emails, by distributing training material, and by discussing ventilation settings prior to enrolment of individual patients. In Hungary, data are available from 24 patients enrolled prior to implementation of training measures and from 45 patients enrolled thereafter. Tidal volumes (VT) and peak inspiratory pressures (PIP) at baseline were assessed. Results In Hungary, the median VT at baseline decreased from 8.2 ml/kg predicted body weight (PBW) prior to initiation of intensified training measures to 7.3 ml/kg PBW thereafter (P < 0.001, Wilcoxon, two-sided). Concurrently, the overall study median VT decreased from 7.8 ml/kg PBW (Patient 1 to Patient 200) to 7.0 ml/kg PBW (Patient 201 to Patient 776). The two Hungarian sites with the highest enrolment decreased the median VT from 8.2 to 7.5 ml/kg PBW and from 9.1 to 6.6 ml/kg PBW. The median PIP at baseline decreased from 30.8 to 29.0 cmH2O in Hungary and from 29.5 to 28.5 cmH2O in the VALID study overall. Conclusions In response to training measures, Hungarian investigators participating in a multinational multicenter intensive care trial of lung surfactant in ventilated patients have decreased VT and PIP, thereby improving adherence to the global standards of the ARDS Network ventilation protocol. Reference

1. The ARDS Network: N Engl J Med 2000, 342:1301-1308.

High-frequency oscillatory ventilation and adult patients with acute respiratory distress syndrome: our impressions and experience

L Pokorny1, R Bartova2, P Rolecek2, C Bryson1

1Antrim Area Hospital, Antrim, UK; 2Masaryk Hospital, Usti nad Labem, Czech Republic

Critical Care 2008, 12(Suppl 2):P285 (doi: 10.1186/cc6506)

Introduction An analysis of clinical experience of patients treated with high-frequency oscillatory ventilation (HFOV) was performed. This alternative technique of mechanical ventilation is used as 'rescue' therapy for patients with severe acute respiratory distress syndrome (ARDS) when it is not possible to provide adequate oxygenation and ventilation by conventional methods.

Methods A prospective review of all patients treated with HFOV (SensorMedics 3100B) in the ICU during 2004-2006 was performed. The data (patient demographics, aetiology of ARDS, gas exchange, ventilator settings before and after initiation of HFOV, duration of HFOV, complications, outcome at 30 days, etc.) were obtained and statistical analysis was performed (mean ± SD, %, t test). For all analyses P < 0.05 was considered significant. Results Values are given as mean ± SD. Thirty-one patients (13 women and 18 men; age 42.8 ± 16.1 years; APACHE II score 22.1 ± 4.9) with severe ARDS (PaO2/FiO2, 72.0 ± 14.7; oxygenation index (OI), 44.0 ± 16.5) were connected to HFOV (38 trials) after previous conventional ventilation (CV) for 6.8 ± 4.1 days with ventilator settings: plateau, 39.3 ± 5.1 cmH2O; PEEP, 14.5 ± 3.6 cmH2O; mPaw, 26.5 ± 6.3 cmH2O. Patients were treated with HFOV for 4.7 ± 2.1 days. The 30-day mortality rate was 70.9%. Of the patients, 51.6% were treated with steroids and 22.6% underwent prone positioning. Survivors/nonsurvivors: 6/9 women, 3/13 men; age 27.4 ± 4.9/49.0 ± 16 years; duration of CV 4.4 ± 3.1/7.8 ± 4.0 days; ventilator settings - plateau 41.2 ± 4.3/38.5 ± 5.2 cmH2O, PEEP 16 ± 1.9/13.9 ± 4.0 cmH2O, mPaw 26.6 ± 3.4/26.4 ± 7.1 cmH2O; duration of HFOV 6.41 ± 1.4/4.0 ± 1.9 days.

Conclusions We found significant improvement in PaO2/FiO2 and the OI and a reduction in PaCO2 within 12 hours of transition to HFOV. The age of patients and days on CV were significantly higher in nonsurvivors (49 years; 7.8 days) than in survivors (27 years; 4.4 days). Early treatment with HFOV can help to bridge the most critical period of respiratory failure and improve survival. Timing of HFOV initiation is the most important factor; that is, early intervention may improve outcome. References

1. Fessler HE, et al.: Lessons from pediatric high-frequency oscillatory ventilation may extend the application in critically ill adults. Crit Care Med 2007, 35:2473.

2. Mehta S, et al.: High-frequency oscillatory ventilation in adults: the Toronto experience. Chest 2004, 126:518-527.

3. Derdak S: High-frequency oscillatory ventilation for acute respiratory distress syndrome in adult patients. Crit Care Med 2003, 31:S317-S323.

4. David M, et al.: High-frequency oscillatory ventilation in adult acute respiratory distress syndrome. Intensive Care Med 2003, 29:1656-1665.

Perfusion pressure and positive end-expiratory pressure influence edema formation in isolated porcine lungs

S Schumann1, A Kirschbaum1, S Schliessmann1, K Gamerdinger1, C Armbruster1, F Erschig2, B Passlick1, J Guttmann1

1University Hospital of Freiburg, Germany; 2Public Agency for Veterinary Affairs, Freiburg, Germany

Critical Care 2008, 12(Suppl 2):P286 (doi: 10.1186/cc6507)

Introduction Preparation of lungs for transplantation includes perfusion with lung protection solution, typically by flushing the organ within a short time period, leading to high fluidic (arterial) pressures. In an isolated porcine lung model we analyzed the influence of perfusion pressure during anterograde perfusion on pulmonary edema formation during mechanical ventilation. Methods Isolated porcine lungs were ventilated in the volume control mode (SV 900 C; Siemens-Elema, Solna, Sweden) with a tidal volume of 3 ml/kg BW and with two positive end-expiratory pressure (PEEP) levels, 4 cmH2O and 8 cmH2O. After stationary ventilation conditions had been reached, flush perfusion with nutrition solution (PERFADEX; Vitrolife, Göteborg, Sweden) was applied at different fluidic (hydrostatic) pressures, achieved by height differences between the lung and fluid reservoir of 100 cm (high level) or 55 cm (low level), respectively. Results During high-level perfusion, the maximal fluidic pressure reached 50 mmHg and lung mass increased by 130%. During low-level perfusion, the fluidic pressure reached 28 mmHg and lung mass increased by only 91% at a PEEP of 4 cmH2O. Histological examination of the lung tissue confirmed that this increase in lung mass corresponded to an increase in interstitial edema. Using a PEEP of 8 cmH2O at low-level perfusion reduced the relative increase in lung mass to 30%. With increased PEEP, the relative increase of lung mass caused by flush perfusion was reduced. Conclusions Flush perfusion at high fluidic pressure amplitudes leads to a greater increase in lung mass than perfusion at low fluidic pressure amplitudes. Edema formation in isolated lungs caused by flush perfusion is reduced by using low perfusion pressures in combination with high PEEP. Low flush perfusion pressures might be advantageous in the preparation of lungs for transplantation.

Closed system endotracheal suction to reduce loss in functional residual capacity during pressure-controlled mechanical ventilation

G Falzetti, T Principi, S Marzocchini, G Narcisi, M Caimmi, P Pelaia

Polytechnic University of Marche, Ancona, Italy

Critical Care 2008, 12(Suppl 2):P287 (doi: 10.1186/cc6508)

Introduction The aim of the study was to evaluate the efficacy of endotracheal suction with a closed system (ESCS) versus endotracheal suction with an open standard system (ESOS) in limiting the loss of functional residual capacity (FRC) in patients needing ventilation with PEEP > 10 cmH2O.

Methods After IRB approval and obtaining consent, 15 patients admitted to the ICU for acute respiratory failure were connected and adapted to the Engstrom Carestation (GE Healthcare - 2006) via the closed suction system Cathy (Vygon - 2006). We performed ESCS and ESOS in an alternating randomized sequence, with the second suction 2 hours after the first. FRC measurements (based on evaluation of nitrogen washin and washout by the COVX metabolic module - Engstrom Carestation Lung FRC INview) were made at baseline and 5, 10, 15, 20, 25 and 30 minutes after suction. The PaO2 was measured at baseline, immediately after suction and 30 minutes after suction. Loss in FRC was defined as the difference between the basal value and the values obtained after suction, and the time to FRC recovery after suction as the minutes taken to return to the basal value. Data are shown as the mean ± SD; FRC was measured in millilitres, time in minutes and PaO2 in mmHg; intergroup variables were analysed with the Mann-Whitney test. P < 0.05 was taken as statistically significant.
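The washin/washout principle behind such FRC measurements reduces to a nitrogen mass balance; the function below is a deliberately simplified illustration (the COVX module's breath-by-breath algorithm is more elaborate):

```python
# Simplified nitrogen-washout mass balance: if a step change in FiO2 washes the
# alveolar N2 fraction from f_start down to f_end, and v_n2_ml of nitrogen is
# recovered at the airway during the washout, then
# FRC ~ v_n2_ml / (f_start - f_end).
def frc_from_n2_washout(v_n2_ml: float, f_start: float, f_end: float) -> float:
    if f_start <= f_end:
        raise ValueError("washout requires the N2 fraction to fall")
    return v_n2_ml / (f_start - f_end)

# Example: 1,200 ml of N2 recovered while the alveolar N2 fraction falls
# from 0.75 to 0.25 gives FRC of about 2,400 ml.
print(frc_from_n2_washout(1200.0, 0.75, 0.25))  # 2400.0
```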

Results Basal values of all studied parameters did not show significant differences between the two groups categorized by suction method. Loss in FRC 5, 10 and 15 minutes after ESCS was significantly lower than after ESOS (5 minutes: ESCS = -250 ± 483, ESOS = -740 ± 567, P = 0.002; 10 minutes: ESCS = -36 ± 388, ESOS = -211 ± 188, P = 0.006; 15 minutes: ESCS = -89 ± 489, ESOS = -268 ± 148, P = 0.046; 20 minutes: ESCS = -157 ± 569, ESOS = -125 ± 176, not significant; 25 minutes: ESCS = -167 ± 570, ESOS = -89 ± 133, not significant; 30 minutes: ESCS = +216 ± 246, ESOS = +22 ± 81, not significant). Time to recovery of the FRC basal value after ESCS was significantly lower than after ESOS (ESCS = 9 ± 5, ESOS = 21 ± 7, P < 0.0001). The PaO2 reduction was significantly lower after ESCS than after ESOS (ESCS = 151 ± 18, ESOS = 119 ± 9, P = 0.048). Conclusions In patients needing mechanical ventilation with PEEP > 10 cmH2O for acute respiratory failure, ESCS significantly reduces the loss of FRC, the reduction in PaO2 and the time to recovery after suction compared with standard open suction. It could thereby make it possible to avoid the pulmonary overdistension caused by the recruitment manoeuvres that are often necessary after suction with an open system to recover the loss in FRC and PaO2. Reference

1. Olegard C, et al.: Anesth Analg 2005, 101:206-212.

Effects of melatonin in an experimental model of ventilator-induced lung injury

PR Pedreira, E Garcia-Prieto, D Parra, F Taboada, GM Albaiceta

Hospital Universitario Central de Asturias, Oviedo, Spain

Critical Care 2008, 12(Suppl 2):P288 (doi: 10.1186/cc6509)

Introduction Melatonin is a hormone with antioxidant and immunomodulatory effects. We studied the effects of melatonin treatment on lung damage, inflammation and oxidative stress in a model of ventilator-induced lung injury (VILI).

Methods Forty-eight Swiss mice were randomized into three experimental groups: control, low-pressure ventilation (peak pressure 15 cmH2O) or high-pressure ventilation (peak pressure 25 cmH2O). Within each group, eight mice were treated with melatonin (10 mg/kg) and eight mice with placebo. After 2 hours, lung injury was evaluated by gas exchange and histological analysis. Tissue levels of malondialdehyde, IL-6 and IL-10 were measured as indicators of lipid peroxidation and inflammation. Variables were compared using a two-way ANOVA. P < 0.05 was considered significant.

Results See Table 1. Ventilation with high pressures induced severe lung damage and release of IL-6. Treatment with melatonin improved oxygenation and decreased histological lung injury, but significantly increased oxidative stress as quantified by malondialdehyde levels. The increase of IL-10 observed after melatonin treatment could be responsible for the differences. There were no differences in IL-6 caused by melatonin. Conclusions Melatonin reduces VILI by increasing the anti-inflammatory response. The combination of high pressure and melatonin increased oxidative stress.

Hyperoxia inhibits alveolar epithelial repair by inhibiting the transdifferentiation of alveolar epithelial type II cells into type I cells

S McKechnie, D Harrison, M McElroy

Royal Infirmary of Edinburgh, UK

Critical Care 2008, 12(Suppl 2):P289 (doi: 10.1186/cc6510)

Introduction The alveolar epithelium is comprised of type I (ATI) and type II (ATII) cells. ATI cells are incapable of cell division, so epithelial repair is achieved by proliferation of ATII cells and transdifferentiation of ATII cells into ATI cells. We have previously shown that hyperoxia inhibits the transdifferentiation of ATII cells in vitro [1]. The objective of these studies was to determine the effect of hyperoxia on the transdifferentiation of ATII cells in vivo. Methods Rats (n = 9) were anaesthetised and Staphylococcus aureus or saline (controls) was instilled into the distal airways. Animals recovered in air for 72 hours and were then randomised to normoxia (air) or hyperoxia (FiO2 ~0.9) for 48 hours. Lung sections were stained with combinations of cell-selective antibodies, immunofluorescent images were obtained using confocal microscopy, and the proportions of the alveolar surface covered with ATII (RTII70/MMC4-positive), ATI (RTI40-positive) and transitional (RTI40/MMC4-positive) cell-staining membrane were determined by quantification of binary masks.
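The binary-mask quantification step amounts to counting stain-positive pixels over the alveolar-surface mask; a minimal NumPy sketch with toy masks (not the authors' image-analysis pipeline):

```python
import numpy as np

# Fraction of the alveolar surface covered by a given cell-selective stain:
# positive pixels in (stain AND surface) divided by all surface pixels.
def coverage_percent(stain: np.ndarray, surface: np.ndarray) -> float:
    return 100.0 * np.count_nonzero(stain & surface) / np.count_nonzero(surface)

# Toy 4 x 4 field: the whole field is alveolar surface; 12/16 pixels stain
# ATI-positive (e.g. RTI40) and 2/16 stain ATII-positive (e.g. RTII70/MMC4).
surface = np.ones((4, 4), dtype=bool)
ati = np.zeros((4, 4), dtype=bool)
ati[:3, :] = True
atii = np.zeros((4, 4), dtype=bool)
atii[3, :2] = True

print(coverage_percent(ati, surface), coverage_percent(atii, surface))  # 75.0 12.5
```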

Results In control lungs, 94 ± 2% of the alveolar surface was lined by ATI, 2 ± 1% by ATII and <1% by transitional cell-staining membrane. In S. aureus-instilled lungs exposed to normoxia alone, there was a decrease in ATI cell-staining membrane (84 ± 3%) with an increase in ATII (8 ± 1%) and transitional (12 ± 4%) cell-staining membrane consistent with ATI cell necrosis, ATII cell hyperplasia and transdifferentiation of ATII cells into ATI cells. In S. aureus-instilled lungs exposed to hyperoxia, there was a decrease in ATI cell-staining membrane (73 ± 5%, P < 0.05) with a marked increase in ATII cell-staining membrane (16 ± 1%, P < 0.001) and less transitional cell-staining membrane (3 ± 1%, P < 0.05). As hyperoxia is proapoptotic and inhibits ATII proliferation [2,3], these data suggest persistent ATII cell hyperplasia and reduced ATII cell transdifferentiation.

Conclusions Hyperoxia impairs epithelial repair by inhibiting the transdifferentiation of ATII cells into ATI cells in a model of resolving S. aureus-induced pneumonia. References

1. McKechnie S, et al.: Eur Respir J 2005, 26:S49, 21s.

2. Buccellato L, et al.: J Biol Chem 2004, 279:6753-6760.

3. O'Reilly M, et al.: Am J Respir Cell Mol Biol 1998, 18:43-50.

Pretreatment with atorvastatin ameliorates lung injury caused by high-pressure/high-tidal-volume mechanical ventilation in isolated normal rabbit lungs

II Siempos1, P Kopterides1, NA Maniatis2, C Magkou2, TK Ntaidou2, A Armaganidis1

1Attikon University Hospital, Athens, Greece; 2Evangelismos Hospital, Athens, Greece

Critical Care 2008, 12(Suppl 2):P290 (doi: 10.1186/cc6511)

Introduction Previous animal studies revealed that administration of statins ameliorates lung injury following endotoxin exposure or ischemia-reperfusion. In this experiment, we endeavored to investigate whether pretreatment with atorvastatin confers protection from lung injury caused by high-pressure/high-tidal-volume ventilation.

Table 1 (abstract P288)

Group                      PaO2/FiO2     Histological score   IL-10 (pg/mg protein)   Malondialdehyde (nmol/mg protein)
Control, placebo           349 ± 38      0.8 ± 1.3            221 ± 156               7 ± 1.5
Control, melatonin         441 ± 61      1 ± 1.6              438 ± 236               7.5 ± 0.9
PIP 15 cmH2O, placebo      333 ± 49      1.9 ± 2.8            208 ± 102               8.8 ± 2.1
PIP 15 cmH2O, melatonin    322 ± 54      2.1 ± 1.8            513 ± 202*              7.3 ± 1.1
PIP 25 cmH2O, placebo      127 ± 14#,+   6.3 ± 3.3            123 ± 191               10.1 ± 3.4
PIP 25 cmH2O, melatonin    380 ± 86*     2.7 ± 3.1            457 ± 270*              13.8 ± 3.4#,+,*

Data are mean ± SEM. P < 0.05 compared with #control, +PIP 15 cmH2O or *melatonin.

Methods Twenty-four isolated sets of normal rabbit lungs were utilized. Treated animals received atorvastatin (20 mg/kg body weight/day per os) for 3 days before surgery. All isolated lungs were perfused (constant flow, 300 ml/min) and ventilated for 1 hour with pressure control ventilation at either 23 cmH2O (high-pressure, injurious ventilation) or 11 cmH2O (low-pressure, noninjurious ventilation) peak static pressure and a positive end-expiratory pressure of 3 cmH2O. Four groups of six lung sets each were examined: HPC (high pressure, no statin), HPS (high pressure, statin pretreatment), LPC (low pressure, no statin), and LPS (low pressure, statin pretreatment). Changes (from baseline to the end of ventilation) in the ultrafiltration coefficient (ΔKfc; pulmonary capillary permeability), weight gain (ΔW; edema formation) and histological lesions (hemorrhage) were used as indices of lung damage.

Results At baseline, the compared groups did not differ with regard to Kfc (P = 0.3). At the end of ventilation, the HPC group suffered a greater ΔKfc (P < 0.001) and a greater ΔW (P < 0.001) than both the LPC and LPS groups. In contrast, the HPS group did not differ from the LPS and LPC groups regarding these variables (P > 0.4). Among the high-pressure groups, lungs with statin pretreatment (HPS) experienced a significantly lower ΔKfc than those without (HPC) (-0.013 ± 0.016 versus 1.723 ± 0.495 g/min/cmH2O/100 g; P = 0.0006) and a lower ΔW (4.62 ± 1.50 versus 17.75 ± 4.71 g; P = 0.007). This was also the case for the histological lesions.

Conclusions In an ex vivo model of ventilator-induced lung injury, pretreatment with atorvastatin attenuates lung injury following high-pressure/high-tidal-volume ventilation.
References

1. Jacobson JR, et al.: Am J Physiol Lung Cell Mol Physiol 2005, 288:L1026-L1032.

2. Pirat A, et al.: Anesth Analg 2006, 102:225-232.

Systemic markers of inflammation in mechanically ventilated brain-injured patients in the absence of sepsis and acute lung injury: the effect of positive end-expiratory pressure

I Korovesi1, A Kotanidou1, E Papadomichelakis2, E Giamarellos-Bourboulis2, A Pelekanou2, O Livaditi1, C Sotiropoulou1, A Koutsoukou1, I Dimopoulou2, A Armaganidis2, C Roussos1, N Marczin3, S Orfanos2

1University of Athens Medical School, Evangelismos Hospital, Athens, Greece; 2Attikon Hospital, University of Athens Medical School, Haidari, Athens, Greece; 3Imperial College London, UK
Critical Care 2008, 12(Suppl 2):P291 (doi: 10.1186/cc6512)

Introduction In mechanically ventilated (MV) brain-injured patients, pulmonary complications appear to be the leading cause of non-neurological morbidity, suggesting that the local and systemic inflammatory responses associated with brain damage and/or the mechanical ventilation itself may contribute to the pulmonary injury (that is, a one-hit or two-hit model). We recently provided evidence for lung inflammation in brain-injured, mechanically ventilated patients with neither acute lung injury nor sepsis [1]. We now investigate whether positive end-expiratory pressure (PEEP) application in the same cohort (27 MV brain-injured subjects) is associated with systemic inflammatory changes that could contribute to the lung inflammation observed in these patients.

Methods Patients were ventilated with 8 ml/kg tidal volume and were put on either zero PEEP (ZEEP, n = 12) or 8 cmH2O PEEP (PEEP, n = 15). The following markers of systemic inflammation were recorded or measured in blood on the first, third, and fifth days of MV: temperature, leukocyte and neutrophil counts, albumin, soluble triggering receptor expressed on myeloid cells (sTREM-1), C-reactive protein, procalcitonin, IL-10, IL-1β, IL-6, IL-8, IL-12p70, and TNFα.

Results Upon entry, the two groups were well balanced clinically and demographically. Significant differences between the two patient groups were observed in leukocyte counts, IL-6 and sTREM-1; all three parameters were consistently higher on ZEEP (P < 0.05; two-way repeated-measures ANOVA), while the former two markers decreased with time in both groups (P < 0.05).
Conclusions In our population of MV brain-injured patients, ZEEP is associated with increases in systemic inflammatory markers that are present early on and throughout the first 5 days of MV. Our findings suggest that PEEP application influences the systemic inflammatory response observed in the absence of sepsis and acute lung injury, and probably point to significant IL-6 and sTREM-1 roles in the systemic and pulmonary inflammation observed in this patient setting.
Reference

1. Korovesi I, et al.: Eur Respir J 2007, 30(Suppl 51):444S.

Shotgun proteomics reveals biochemistry of normal and acute respiratory distress syndrome bronchoalveolar lavage fluid proteome

D Goodlett

University of Washington, Seattle, WA, USA

Critical Care 2008, 12(Suppl 2):P292 (doi: 10.1186/cc6513)

Introduction Recently, so-called label-free quantitative proteomic methods have gained acceptance for protein expression analysis via mass spectrometry (MS). These methods allow better experiment design by circumventing the need to establish pair-wise comparisons required by popular stable isotope dilution methods (for example, isotope-coded affinity tags, isobaric tags for relative and absolute quantitation, and stable-isotope labelling by amino acids in cell culture) and are thus fundamentally better suited for proteome studies where the sample number is large. Here we discuss the use of shotgun proteomics (that is, no prior protein fractionation) and label-free quantitative methods to characterize the human bronchoalveolar lavage fluid (BALF) proteomes of six normal healthy volunteers and three patients with acute respiratory distress syndrome (ARDS) [1].

Methods Our proteomic profiling technology for BALF involves: (i) removal of cells by centrifugation followed by precipitation of the remaining soluble protein, (ii) denaturation and proteolysis of soluble proteins, (iii) analysis in quadruplicate of each sample by LC-MS/MS using Fourier transform mass spectrometry (FT-MS), (iv) identification of proteins by database search of tandem mass spectra (MS2 data), and (v) extraction of changes in protein relative expression directly from peptide ion intensity values (MS1 data) as recently described [2]. Notably, data for each sample are acquired independently of all other samples, allowing any patient's data to be compared directly with all others in silico. We also measured levels of several proteins of interest (identified in LC-MS/MS experiments) by ELISA in ARDS BALF from 69 patients, ranging from those at risk to 14 days post diagnosis, and six normals. Enriched functional categories within BALF proteins relative to the entire human proteome were determined using Expression Analysis Systematic Explorer (EASE) software [3].
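As a simplified illustration of the label-free idea described above, a protein's relative abundance can be estimated by aggregating its MS1 peptide ion intensities and normalising within each run, so that independently acquired runs can be compared in silico. This sketch is illustrative only; the protein names and intensity values are hypothetical, and the actual pipeline [2] uses far more sophisticated peptide alignment and normalisation.

```python
from collections import defaultdict

def protein_relative_abundance(peptide_hits):
    """Sum MS1 peptide ion intensities per protein, then normalise to
    the run total so that independently acquired runs are comparable."""
    totals = defaultdict(float)
    for protein, intensity in peptide_hits:
        totals[protein] += intensity
    grand_total = sum(totals.values()) or 1.0
    return {p: t / grand_total for p, t in totals.items()}

# Two hypothetical runs acquired independently, compared in silico.
run_a = protein_relative_abundance(
    [("SFTPA1", 8e6), ("ALB", 2e7), ("SFTPA1", 4e6)])
run_b = protein_relative_abundance(
    [("SFTPA1", 3e6), ("ALB", 2.5e7)])

# Relative change of one protein between the two runs.
fold_change = run_a["SFTPA1"] / run_b["SFTPA1"]
```

Because each run is normalised internally, any pair of samples can be compared after acquisition without a pre-planned pairing, which is the design advantage claimed for label-free methods.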
Results Using BALF from three patients, we identified a total of 870 different proteins, a nearly 10-fold increase over previous reports. Among the proteins identified were known markers of lung injury, such as surfactant, proteases, and serum proteins. We also identified several biologically interesting proteins not previously identified in patients with ARDS, including insulin-like growth factor-binding protein-3 (IGFBP-3). Immunoassay showed elevated levels of IGFBP-3 and IGF-I in at-risk patients and those with early ARDS, whereas normal controls had low levels of IGFBP-3. The IGF pathway, acting through the type 1 IGF receptor, repressed apoptosis of lung fibroblasts but not lung epithelial cells. Furthermore, depletion of IGF in ARDS BALF led to enhanced fibroblast apoptosis. Additionally, normal human BALF was profiled from six volunteers. From these analyses a total of 167 unique proteins were detected, with >100 proteins detected in each of the six individual BAL samples, 42 of which were common to all six subjects.

Conclusions Our data suggest that the IGFBP-3/IGF pathway is involved in the pathogenesis of lung injury, illustrating the power of shotgun proteomics to catalog proteins present in complex biological fluids, such as BALF, from which hypotheses can be developed and tested. From the normal BALF proteome data, gene ontology analysis demonstrated enrichment of several biological processes in the lung that reflects its expected role in gas exchange and in host defense as an immune organ. The same biological processes were enriched compared with either the human plasma proteome or the total human genome calculated proteome, suggesting an active enrichment of plasma proteins in the lung rather than passive capillary leak.
References

1. Schnapp et al.: Mining the acute respiratory distress syndrome proteome: identification of the insulin-like growth factor (IGF)/IGF-binding protein-3 pathway in acute lung injury. Am J Pathol 2006, 169:86-95.

2. Foss et al.: Genetic basis of proteome variation in yeast. Nat Genet 2007, 39:1369-1375.

3. Hosack et al.: Identifying biological themes within lists of genes with EASE. Genome Biol 2003, 4:R70 [http://david.]

Figure 1 (abstract P293)

Surrogate markers of pulmonary edema and severity of lung injury do not accurately reflect measured extravascular lung water

JL LeTourneau1, J Pinney1, K Bacon1, MA Chesnutt2, CR Phillips1

1Oregon Health & Science University, Portland, OR, USA; 2Portland Veterans Administration Medical Center, Portland, OR, USA

Critical Care 2008, 12(Suppl 2):P293 (doi: 10.1186/cc6514)

Introduction Determination of extravascular lung water (EVLW) by transpulmonary thermodilution predicts progression to acute lung injury (ALI). Early identification of patients at risk for developing ALI may impact clinical decision-making. Measurement of EVLW, however, is invasive, requiring central venous and arterial catheters. We asked whether less invasive clinical parameters and markers of severity of lung injury could be used to estimate lung water, obviating the need for the more invasive determination.

Methods Eighteen patients at risk for ALI due to massive aspiration (n = 1), sepsis (n = 16), massive transfusion (n = 1), and/or trauma (n = 2) were studied prospectively. The PaO2/FiO2 ratio, central venous pressure (CVP), Vd/Vt, fluid balance, static compliance (Cs), and Lung Injury Scores (LIS) were compared with the EVLW measured on the same day by linear regression analysis (Figure 1).

Results Poor correlation of EVLW with the PaO2/FiO2 ratio (r2 = 0.0061), CVP (r2 = 0.1261), fluid balance (r2 = 0.0903), Vd/Vt (r2 = 0.0135), Cs (r2 = 0.0014), and LIS (r2 = 0.0837) was seen (Figure 1).

Conclusions The clinical parameters examined in this study do not accurately reflect the EVLW and should not be used as surrogates for it.
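The correlation analysis reported above amounts to fitting each surrogate against measured EVLW by ordinary least squares and reporting the coefficient of determination r2. A minimal sketch of that computation, using made-up illustrative values rather than the study's data:

```python
def r_squared(x, y):
    """Coefficient of determination for a simple least-squares fit of y on x,
    computed as Sxy^2 / (Sxx * Syy)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return (sxy * sxy) / (sxx * syy)

# Hypothetical illustrative pairs: a surrogate (e.g. CVP, mmHg) against
# thermodilution EVLW; these are NOT the study's measurements.
cvp = [4, 8, 6, 12, 10, 5, 9, 7]
evlw = [7, 14, 9, 11, 16, 8, 10, 13]
r2 = r_squared(cvp, evlw)   # values near 0 indicate a poor surrogate
```

An r2 of, say, 0.13 (as reported for CVP) means the surrogate explains only 13% of the variance in measured EVLW, which is why the authors reject these parameters as substitutes.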

Different tidal volumes induce similar elevation of lung injury markers in animals sensitized to injury by previous anesthesia and surgery

P Dostal, V Cerny, M Senkerik, R Parizkova, H Zivna, P Zivny

University Hospital Hradec Kralove, Czech Republic Critical Care 2008, 12(Suppl 2):P294 (doi: 10.1186/cc6515)

Introduction Large tidal volumes and transpulmonary pressures play a central role in the pathogenesis of ventilator-induced lung injury. Moderate tidal volumes are considered safe in healthy lungs [1]. Recent studies suggested that insults like endotoxin [2] or surgery [3] sensitize the lung to injury by priming for an exaggerated response to a second stimulus. Our aim was to investigate how animals sensitized to lung injury by previous anesthesia and surgery respond to mechanical ventilation (MV).

Methods Twenty-two male adult Sprague-Dawley rats, instrumented surgically under ether anesthesia with vascular catheters on the previous day, were anesthetized, tracheostomized and randomly allocated to two mechanically ventilated groups - an MVLP group (FiO2 1.0, respiratory rate 60/min, tidal volume 10 ml/kg, PEEP 2 cmH2O, n = 8) and an HVZP group (FiO2 1.0, respiratory rate 20/min, tidal volume 30 ml/kg, PEEP 0 cmH2O, n = 8) - and a no-MV group C (n = 6). After randomization (C group) or after 2 hours of MV, rats were sacrificed, the P-V curve of the respiratory system was constructed, and bronchoalveolar lavage fluid (BALF) and aortic blood samples were obtained.

Results Comparison of P-V curves suggested derecruitment of the lung in the MVLP group, but no significant difference in airway pressures at maximal lung volume (14 ml) was observed between groups. Total protein (μg/ml) in BALF was similar in the MVLP and HVZP groups (0.21 (0.16, 0.30) and 0.22 (0.18, 0.50), P = 0.366) and lower in the C group (0.13 (0.07, 0.16), P < 0.05 versus both MV groups). Similar results were obtained for IL-6 levels (pg/ml) in BALF (102.0 (93.4, 111.8) and 120.0 (53.7, 130.0) for the MVLP and HVZP groups, P = 0.628, and 48.5 (43.4, 56.4) pg/ml for the C group, P < 0.05 vs both MV groups).

Conclusions Both moderate and high tidal volumes induce a similar elevation of lung injury markers in mechanically ventilated animals sensitized to injury by previous anesthesia and surgery.

Acknowledgement Supported by Research Project MZO 00179906.

References


1. Ricard J-D, et al.: Eur Respir J 2003, 22(Suppl 42):2s-9s.

2. Schreiber T, et al.: Anesthesiology 2006, 104:133-141.

3. Kaneko A, et al.: J Surg Res 2006, 134:215-222.

Effects of an open lung approach following the ARDS Network ventilatory strategy in patients with early acute lung injury/acute respiratory distress syndrome

V Rotman1, F Bozza2, A Carvalho3, R Rodrigues4, J Castro1, J Pantoja1, F Saddy1, D Medeiros1, W Viana1, E Salgueiro1, CRR de Carvalho5

1Copa Dor Hospital, Rio de Janeiro, Brazil; 2Oswaldo Cruz Foundation, Rio de Janeiro, Brazil; 3University Clinic Carl Gustav Carus, Dresden, Germany; 4Federal University of Rio de Janeiro, Brazil; 5Sao Paulo University, Sao Paulo, Brazil Critical Care 2008, 12(Suppl 2):P295 (doi: 10.1186/cc6516)

Introduction The beneficial effects of the institution of high levels of positive end-expiratory pressure (PEEP) after recruitment maneuvers are controversial. We aim to compare the effects of the ARDS Network (ARDSNet) ventilatory strategy and open lung approach (OLA) applied in a sequential way, in patients with acute lung injury/acute respiratory distress syndrome (ALI/ARDS). Methods Ten patients fulfilling criteria for early ALI/ARDS were recruited. For definitive selection, blood gas collected after 30 minutes application of 5 cmH2O PEEP and tidal volume (VT) = 10 ml/kg had to demonstrate PaO2/FIO2 < 300 mmHg. The patients were initially ventilated for 24 hours according to the ARDSNet protocol. After this period, if the PaO2/FIO2 was <350 mmHg, a recruitment maneuver was performed (sequential 5 cmH2O increments in PEEP starting from 20 cmH2O, until PaO2/FIO2 > 350 mmHg) and an additional 24 hours of ventilation according to the OLA (VT = 6 ml/kg and PEEP to achieve a PaO2/FIO2 > 350 mmHg) was applied. Whole lung computed tomography images (1.0 mm thickness with 10 mm gap) were acquired after 24 hours of each strategy.

Results The institution of OLA was necessary in nine of the 10 studied patients. The PEEP was significantly higher during OLA (17 cmH2O (17-19) vs 8 cmH2O (8-11); P = 0.007) and resulted in a significant improvement of oxygenation sustained for 24 hours of follow-up, with no significant differences in plateau pressure, static compliance, minute ventilation, PaCO2 and pH (P > 0.1). OLA resulted in a significant reduction of the fraction of nonaerated regions as compared with the ARDSNet protocol (13% (10-23) vs 37% (33-42); P = 0.018) without a significant increase in the percentage of hyperinflation (5% (1-13) vs 2% (0-7); P = 0.149). No significant differences were observed in the infused doses of vasopressors, fluid balance and arterial blood pressure.

Conclusions When compared with the ARDSNet protocol, OLA improved oxygenation and reduced the fraction of nonaerated regions without a significant increase in hyperinflated areas, with comparable hemodynamics and fluid balance.

Reference


1. Grasso S, et al.: Anesthesiology 2002, 96:795-802.

From low-tidal-volume ventilation to lowest-tidal-volume ventilation

A Rezaie-Majd, N Gauss, L Adler, U Leitgeb, P Kraincuk, L Cokoja, A Aloy

Medical University of Vienna, Austria

Critical Care 2008, 12(Suppl 2):P296 (doi: 10.1186/cc6517)

Introduction The therapeutic measures of lung-protective mechanical ventilation used in the treatment of acute lung injury (ALI) and acute respiratory distress syndrome (ARDS) have revived interest in high-frequency ventilation (HFV). The tidal volume reduction achievable with conventional ventilation (CV), that is, low-tidal-volume ventilation, is limited; HFV, however, allows a further reduction of the tidal volume. Established HFV techniques are high-frequency oscillation (HFO), high-frequency percussive ventilation (HFPV) and superimposed high-frequency jet ventilation (SHFJV). The aim of this study was to evaluate the amelioration of the oxygenation index (OI).

Methods Twenty-four patients with ALI/ARDS admitted to an ICU were involved. Haemodynamic parameters, blood gas analysis and ventilation pressures (positive end-expiratory pressure (PEEP), plateau and mean airway pressures) were measured. The use of HFV was indicated if the OI was still lower than 200 under CV. The initial parameters (plateau and mean airway pressure, PEEP, I:E ratio, ventilation frequency and FiO2) were chosen as the latest setups of the CV. We randomly used one of the abovementioned techniques to treat patients with ALI/ARDS. The clinically relevant parameters were reviewed every 4 hours and the ventilation settings were adapted accordingly.

Results All patients treated with HFV showed an amelioration of the OI within 24 hours after the start (Figure 1). Furthermore, we registered a significant increase of the OI after 24 hours compared with baseline CV (Figure 1). However, we did not measure any significant differences between the three HFV techniques at this time point. We observed fewer or no haemodynamic disturbances with SHFJV and HFPV compared with HFO; it was therefore important to clinically stabilize the patients.

Conclusions We achieved a significant amelioration of the OI using HFV rather than CV. Each of the HFV techniques, however, needs a period of a few hours before it can be judged whether the patient is responding or not responding to the technique.

Figure 1 (abstract P296)

Relationship of the stress index, lung recruitment and gas exchange in patients with acute respiratory distress syndrome

Y Huang, H Qiu, Y Yang

Nanjing Zhong-da Hospital and School of Clinical Medicine, Southeast University, Nanjing, China

Critical Care 2008, 12(Suppl 2):P297 (doi: 10.1186/cc6518)

Introduction The study objective was to investigate the relationship between the stress index and positive end-expiratory pressure (PEEP) in patients with acute respiratory distress syndrome (ARDS), and to determine the relationship between the stress index, lung recruitment, oxygenation and respiratory mechanics.

Methods Fourteen patients with ARDS were enrolled in the study. During volume control ventilation with constant inspiratory flow, the pressure-time (P-t) curve was fitted to a power equation: P = a x time^b + c, where coefficient b (the stress index) describes the shape of the curve: b = 1, straight curve; b < 1, progressive decrease in slope (downward concavity); and b > 1, progressive increase in slope (upward concavity). PEEP was set to obtain a b value between 0.9 and 1.1 after application of a recruiting maneuver (RM). PEEP was then changed to obtain 0.6 < b < 0.8 and 1.1 < b < 1.3. The sequence of experimental conditions was random. The recruited volume (RV) was measured by the static pressure-volume curve method. Hemodynamics, pulmonary mechanics and gas exchange were observed at the same time.

Results The PEEPs at b < 1, b = 1 and b > 1 were 8.3 ± 1.5 cmH2O, 15.0 ± 1.9 cmH2O and 18.4 ± 1.9 cmH2O, respectively, which were significantly different (P < 0.001). At b = 1 and b > 1, the PaO2/FiO2 ratio (350.1 ± 113.0 mmHg, 338.3 ± 123.8 mmHg) was higher than that pre-RM (165.1 ± 59.9 mmHg) (P < 0.05). The plateau pressures (Pplat) at b = 1 (29.0 ± 3.5 cmH2O) and b > 1 (32.9 ± 7.3 cmH2O) post-RM were significantly higher than that at b < 1 (21.9 ± 4.3 cmH2O) (P < 0.05). The Pplat at b > 1 was higher than that pre-RM (25.3 ± 15.9 cmH2O) (P < 0.05). Compared with the static pulmonary compliance (Cst) at b = 1 (38.6 ± 10.9 ml/cmH2O), the Cst at b > 1 (26.4 ± 6.5 ml/cmH2O) decreased significantly (P < 0.05). The RVs at b = 1 and b > 1 (401.6 ± 204.0 ml, 588.3 ± 269.1 ml) were significantly higher than those pre-RM and at b < 1 (135.9 ± 111.1 ml, 175.2 ± 122.4 ml) (P < 0.05). At pre-RM, b < 1, b = 1 and b > 1, the heart rate, mean arterial pressure and lactate were not significantly different (P > 0.05).

Conclusions The stress index at post-RM could be a good method of PEEP titration for ARDS patients.
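The power-equation fit in the Methods above can be illustrated with a small sketch. To keep it self-contained, the baseline pressure c is assumed known here (the total PEEP), which turns the fit for the stress index b into a log-log linear regression; the study itself fitted all three coefficients (a, b, c) to the measured P-t data.

```python
import math

def stress_index(t, p, c):
    """Estimate b in P = a * t**b + c by ordinary least squares on
    log(P - c) versus log(t). c is taken as the known baseline pressure
    (total PEEP), which makes the power law linear in log-log space."""
    xs = [math.log(ti) for ti in t]
    ys = [math.log(pi - c) for pi in p]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx   # slope of the log-log fit = b

# Synthetic constant-flow inspiration: 50 samples over 1 s, PEEP 5 cmH2O.
# b > 1 (upward concavity) suggests overdistension, b < 1 suggests tidal
# recruitment, b near 1 a well-titrated (linear) curve.
t = [0.02 * (i + 1) for i in range(50)]
p_concave_up = [8 * ti ** 1.3 + 5 for ti in t]
b = stress_index(t, p_concave_up, c=5)
```

In the protocol above, PEEP would then be adjusted until the fitted b falls between 0.9 and 1.1.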

Pulmonary homogenicity changes during recruitment maneuvers and positive end-expiratory pressure in dogs with acute respiratory distress syndrome

Q Chen, Y Yang, H Qiu, S Liu

Nanjing Zhong-da Hospital and School of Clinical Medicine, Southeast University, Nanjing, China

Critical Care 2008, 12(Suppl 2):P298 (doi: 10.1186/cc6519)

Introduction The objective of the study was to investigate pulmonary homogenicity changes during recruitment maneuvers (RM) and positive end-expiratory pressure (PEEP) in dogs with pulmonary acute respiratory distress syndrome (ARDSp) or extrapulmonary acute respiratory distress syndrome (ARDSexp).

Methods After induction of saline lavage-injured ARDS (ARDSp, n = 8) or oleic acid-injured ARDS (ARDSexp, n = 8), PEEP was set at 20 cmH2O and RM were performed (40/30 maneuver). RM were repeated every 5 minutes until sufficient alveolar recruitment was reached (PaO2/FiO2 > 400 mmHg); the tidal volume was then set at 10 ml/kg and PEEP was lowered by 2 cmH2O every 10 minutes. Optimal PEEP was defined as 2 cmH2O above the PEEP at which PaO2/FiO2 dropped below 400 mmHg. Computed tomography (CT) scans were done before and after induction of ARDS and at each pressure level. On the basis of the changes in CT values, the lung was divided into hyperinflated, normally aerated, poorly aerated and nonaerated regions. Lung volumes were calculated with Pulmo software.

Results After RM, the total lung volume and air volume were significantly increased both before and after induction of ARDS in the two models (P < 0.05). At optimal PEEP, poorly aerated and nonaerated lung areas decreased and normally aerated lung areas increased sharply, but this was accompanied by significant alveolar hyperinflation in the two models (P < 0.05). Compared with the ARDSexp model, the change in hyperinflated lung areas at optimal PEEP was markedly greater in the ARDSp model (P < 0.05). On three-dimensional renderings of the CT scans, alveolar hyperinflation occurred mainly in nondependent lung regions, whereas alveolar recruitment occurred in dependent regions.

Conclusions Alveolar hyperinflation and pulmonary heterogeneity increase during RM and at optimal PEEP. The focal distribution of lung injury in ARDSp may make it more susceptible to alveolar hyperinflation at optimal PEEP.
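The CT-based division into aeration compartments can be sketched as a simple threshold classification of voxel Hounsfield units (HU). The cut-offs below are the ones commonly used in the quantitative CT literature and are an assumption here; the abstract does not state the thresholds applied by the Pulmo software.

```python
# Assumed, conventional HU cut-offs (not stated in the abstract):
#   hyperinflated:    -1000 to -900 HU
#   normally aerated:  -900 to -500 HU
#   poorly aerated:    -500 to -100 HU
#   nonaerated:        -100 to  +100 HU

def classify_voxel(hu):
    """Assign one lung voxel to an aeration compartment by its HU value."""
    if -1000 <= hu < -900:
        return "hyperinflated"
    if -900 <= hu < -500:
        return "normally aerated"
    if -500 <= hu < -100:
        return "poorly aerated"
    if -100 <= hu <= 100:
        return "nonaerated"
    return "excluded"   # outside the lung window (e.g. bone, air outside body)

def aeration_fractions(voxels):
    """Fraction of lung voxels falling in each aeration compartment."""
    counts = {}
    for hu in voxels:
        label = classify_voxel(hu)
        counts[label] = counts.get(label, 0) + 1
    lung = sum(n for k, n in counts.items() if k != "excluded") or 1
    return {k: n / lung for k, n in counts.items() if k != "excluded"}

# A tiny hypothetical voxel sample, one HU value per voxel.
fractions = aeration_fractions([-950, -870, -600, -300, -150, -50, 20])
```

Tracking how these fractions shift between pressure levels is what the abstract reports as decreases in the poorly aerated and nonaerated compartments with a rise in hyperinflation.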

Positive end-expiratory pressure-induced changes of the vibration response image

S Lev, I Kagan, M Grinev, J Cohen, P Singer

Rabin Medical Center, Beilinson Hospital and Sackler School of Medicine, Tel Aviv University, Petah Tiqva, Israel

Critical Care 2008, 12(Suppl 2):P299 (doi: 10.1186/cc6520)

Introduction Vibration response imaging (VRI) is a new modality that reflects the distribution of vibration in the lung during the respiratory process. The VRI dynamic and functional image has been shown to be sensitive to changes in ventilator settings, including changes in the mode of mechanical ventilation. In the present study, we assess the changes of the VRI image and quantification data as a function of positive end-expiratory pressure (PEEP) changes.

Methods Thirty-four ventilated patients were consecutively enrolled in this study. PEEP levels (0-15 cmH2O) were assigned according to the level of oxygenation and blood pressure. A change in vibration distribution of more than 10% in one of the six lung segments was considered major. One hundred and thirteen recordings were performed sequentially in 21 patients at the same level of PEEP in order to assess the reproducibility of the measurement.

Results One hundred and sixty VRI recordings were completed. Sixty-eight percent showed major changes in the VRI measurement when the PEEP was changed. These changes were detectable for a PEEP change of 5 cmH2O or less in 17 patients and of 10 cmH2O in six patients. In most VRI responders, an optimal PEEP could be identified within the range 5-10 cmH2O (Figure 1). Ninety-six percent of the sequential recordings performed on the same patient at a given PEEP level exhibited less than 10% change.

Conclusions VRI measurements respond rapidly to PEEP changes, and can provide the optimal vibration distribution at different PEEP levels.
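The "major change" criterion used above (a shift of more than 10% in any of the six lung segments' share of the vibration distribution) can be sketched as follows. The segment energies are hypothetical illustrative values, not VRI data.

```python
def major_change(baseline, follow_up, threshold=0.10):
    """Flag a 'major' VRI change: any of the six lung segments whose share
    of the total vibration distribution shifts by more than the threshold
    (10%, per the criterion in the abstract)."""
    def shares(segment_energy):
        total = sum(segment_energy)
        return [e / total for e in segment_energy]
    return any(abs(a - b) > threshold
               for a, b in zip(shares(baseline), shares(follow_up)))

# Hypothetical vibration energy per segment (upper/middle/lower, R and L),
# recorded at ZEEP and again at 8 cmH2O PEEP.
zeep = [10, 20, 25, 10, 18, 17]
peep8 = [22, 20, 13, 10, 18, 17]
changed = major_change(zeep, peep8)
```

Comparing shares rather than raw energies makes the criterion insensitive to overall signal amplitude, so only redistribution between segments counts as a change.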

Figure 1 (abstract P299)


Effect of positive end-expiratory pressure application on inflammation in acute respiratory distress syndrome patients during pressure-volume curve maneuver

T Adanir, A Sencan, T Aydin, M Koseoglu

Ataturk Training and Research Hospital, Izmir, Turkey Critical Care 2008, 12(Suppl 2):P300 (doi: 10.1186/cc6521)

Introduction There are several studies indicating that low-tidal-volume ventilation causes less damage to the alveoli than a high tidal volume [1], but this strategy may facilitate alveolar derecruitment and deterioration of gas exchange [2]. Recruitment maneuvers may improve gas exchange, but inflating the lungs to nearly vital capacity might be harmful owing to the stretch stress imposed on the pulmonary parenchyma. Alveolar macrophages liberate inflammatory cytokines in response to stretch [3]. The pressure-volume (PV) curve is a physiological tool proposed for diagnostic or monitoring purposes during mechanical ventilation in acute respiratory distress syndrome (ARDS); the effect of this maneuver is similar to that of a recruitment maneuver. We tested the hypothesis that the systemic level of proinflammatory cytokines may be affected by a PV curve maneuver in ARDS patients, and that positive end-expiratory pressure (PEEP) application may affect these levels.

Methods This prospective, interventional clinical trial was performed in the ICU of a teaching hospital. Twenty-two consecutive mechanically ventilated patients with clinical and radiological signs of ARDS were included in the study. A single PV curve maneuver was performed by elevating the airway pressure to 40 cmH2O for 7 seconds with (group 2, n = 10) or without (group 1, n = 12) PEEP application. Plasma concentrations of IL-6, TNFα and antioxidant capacity, arterial blood gases and respiratory changes were measured immediately before and 5 minutes, 1 hour and 5 hours after the PV curve maneuver.

Results The PV curve maneuver caused a minor but nevertheless significant improvement of oxygenation (P = 0.015) in both groups. Plasma concentrations of TNFα were similar in the two groups, but after the maneuver the IL-6 level was higher in the group without PEEP application than in the group with PEEP (P < 0.001).

Conclusions The PV curve maneuver, with or without PEEP, improved oxygenation slightly. When the maneuver was applied with PEEP, however, it did not raise systemic inflammatory mediators in mechanically ventilated ARDS patients.
References

1. Acute Respiratory Distress Syndrome Network: N Engl J Med 2000, 342:1301-1308.

2. Richard JC, et al.: Am J Respir Crit Care Med 2001, 163: 1609-1613.

3. Pugin J, et al.: Am J Physiol 1998, 275:L1040-L1050. P301

Airway pressure release ventilation: an alternative ventilation mode for pediatric acute respiratory distress syndrome

D Demirkol, M Karabocuoglu, A Citak, R Ucsel, N Uzel

Istanbul Medical Faculty, Istanbul, Turkey

Critical Care 2008, 12(Suppl 2):P301 (doi: 10.1186/cc6522)

Introduction The purpose of the present study was to determine whether airway pressure release ventilation (APRV) can improve oxygenation in pediatric patients with acute respiratory distress syndrome relative to pressure-controlled ventilation (PCV).

Methods Data on patients with acute respiratory distress syndrome whose oxygenation did not improve with conventional ventilation and who were switched to APRV were collected retrospectively.

Results Five patients were switched from conventional ventilation to APRV. Of these five, three patients responded to APRV with improvement in oxygenation. The mean age of the responders was 5.8 ± 1.3 (4.3-7.4) months. The fractional oxygen concentration decreased from 96.6 ± 2.3% for PCV to 68.3 ± 11.5% for APRV, the peak airway pressure fell from 36.6 ± 11.5 cmH2O for PCV to 33.3 ± 5.7 cmH2O for APRV, the mean airway pressure increased from 17.9 ± 5.9 cmH2O for PCV to 27 ± 2.6 cmH2O for APRV, and the release tidal volume increased from 8.3 ± 1.5 ml/kg for PCV to 13.2 ± 1.1 ml/kg for APRV at the first hour (Table 1). Both nonresponders had primary acute respiratory distress syndrome, and one of them had a prior ventilation history. We suggest that the lungs of the nonresponders were not recruitable.

Conclusions APRV may improve oxygenation in pediatric ARDS patients when conventional ventilation does not work. The APRV modality may provide better oxygenation with a lower peak airway pressure. The recruitability of the lungs may affect the response to APRV.
References

1. Stock CM, Downs JB, Frolicher DA: Airway pressure release ventilation. Crit Care Med 1987, 15:462-466.

2. Habashi NM: Other approaches to open-lung ventilation: airway pressure release ventilation. Crit Care Med 2005, 33:S228-S240.

3. Schultz TR, Costarino AT, Durning SM, et al.: Airway pressure release ventilation in pediatrics. Pediatr Crit Care Med 2001, 2:243-246.

Table 1 (abstract P301) Conventional ventilation versus APRV

                       Pre-enrollment   APRV time (hours)
                       to APRV          T1            T24           T48           T72
PIP/Phigh (cmH2O)      36.6 ± 11.5      33.3 ± 5.7    30.3 ± 5.5    29 ± 1.4      25.5 ± 2.1
MAP (cmH2O)            17.9 ± 5.9       27 ± 2.64     24.6 ± 4.16   23.5 ± 0.7    21.6 ± 2.3
PEEP/Plow (cmH2O)      13.3 ± 1.5       0             0             0             0
Thigh (sec)            -                3.6 ± 0.5     3.7 ± 0.4     3.8 ± 0.2     3.65 ± 0.21
Tlow (sec)             -                0.73 ± 0.2    0.76 ± 0.11   0.9 ± 0       0.75 ± 0.21
VTe (ml/kg)            8.3 ± 1.5        13.2 ± 1.1    12 ± 1.7      11.0 ± 1.41   9.5 ± 6.7
Set FiO2 (%)           96.6 ± 2.3       68.3 ± 11.5   60.0 ± 5      57.5 ± 3.5    52.5 ± 3.5
                                        T4            T24           T48           T72
pH                     7.23 ± 9.7       7.32 ± 9.5    7.35 ± 6.42   7.34 ± 4.9    7.39 ± 1.41
pCO2 (mmHg)            72.3 ± 21.4      63.4 ± 16.8   61.4 ± 7.85   56.4 ± 6.43   57.0 ± 12.7

PIP, peak airway pressure; MAP, mean airway pressure; PEEP, positive end-expiratory pressure; VTe, expiratory tidal volume; FiO2, fractional oxygen concentration; T, time.

Influence of rhDNase on the duration of mechanical ventilation in intensive care patients

N Deschner, B Friedrich, W Brehm, R Vonthein, J Riethmueller

University Hospital Tuebingen, Germany

Critical Care 2008, 12(Suppl 2):P302 (doi: 10.1186/cc6523)

Introduction rhDNase is effective in the treatment of children with cystic fibrosis [1]. Significant reduction of the duration of ventilation by rhDNase has been reported in children following cardiac surgery [2]. The goal of the present study was to investigate whether rhDNase is able to reduce the duration of ventilation in adult mechanically ventilated intensive care patients.

Methods After approval of the local ethics committees we conducted a double-blind, placebo-controlled, randomised, multicentre national trial. Patients were stratified into two subgroups depending on their status as surgical or nonsurgical. The trial was started within 48 hours after the start of mechanical ventilation and lasted until weaning was successful. Patients in the treatment group received 2.5 ml rhDNase endotracheally twice a day. Patients in the placebo group received the same amount of normal saline.

Results One hundred and twenty-three surgical and 162 nonsurgical patients were included in the study. Factors such as gender, weight, smoking habit, chronic pre-existing diseases and prevalence of chronic obstructive pulmonary disease were distributed equally between the groups in surgical patients. In nonsurgical patients, more smokers were randomized to the rhDNase group. Acute burn patients were randomized to the rhDNase group only. Twelve patients (two surgical) died in the rhDNase group versus 16 (four surgical) in the placebo group. In surviving surgical patients, the median duration of ventilation was 16.6 days (95% CI 11.5-21 days) in the rhDNase group and 11.7 days (95% CI 8.4-15.6 days; P = 0.39) in the placebo group. In surviving nonsurgical patients, the median duration of ventilation was 7.8 days (95% CI 6-9.3 days) in the rhDNase group and 12.6 days (95% CI 7.9-16.9 days; P = 0.038) in the placebo group, without adjustment for smoking habits.

Conclusions In adult nonsurgical intensive care patients, rhDNase significantly shortens the duration of ventilation. This effect is not seen in surgical patients. The hypothesis that pneumonia requiring mechanical ventilation in particular responds favourably to treatment with rhDNase should be investigated. References

1. Fuchs HJ, et al.: N Engl J Med 1994, 331:637-642.

2. Riethmueller J, et al.: Pediatr Pulmonol 2006, 41:61-66.

Respiratory dialysis: a new therapy for chronic obstructive pulmonary disease

S Morley1, M Rosenberg1, W Federspiel1, BG Hattler1, A Batchinsky2

1ALung Technologies Inc. and University of Pittsburgh McGowan Institute for Regenerative Medicine, Pittsburgh, PA, USA; 2US Army Institute of Surgical Research, San Antonio, TX, USA

Critical Care 2008, 12(Suppl 2):P303 (doi: 10.1186/cc6524)

Introduction Chronic obstructive pulmonary disease (COPD) is predicted to be the third leading cause of death in the United States by 2020. Approximately 125,000 people die yearly from acute exacerbations of the disease. Once intubation and mechanical ventilation become necessary, the death rate increases. To avoid the need for ventilator use we have developed a new device (the Hemolung), an integrated pump/oxygenator that functions at low blood flow rates (250-500 ml/min) equivalent to those used in renal dialysis. The small priming volume (190 ml), reduced membrane surface area (0.5 m2), and use of a percutaneously inserted dual-lumen venous catheter (15 Fr) to provide blood inflow and outflow make the entire system suitable for repetitive use in patients with hypercapnic acute respiratory failure. We report here 7-day animal data stressing the hemocompatibility and gas exchange capabilities of the device. Methods The venous catheter was inserted into the right external jugular vein of four adult sheep and connected to the saline-primed Hemolung circuit. Hollow-fiber membranes were coated with siloxane and heparin to prevent plasma wetting and to increase biocompatibility. Animals were minimally anticoagulated with heparin (ACT 150). Blood flow, CO2 exchange, blood gases and key hematological parameters were measured over 7 days. Necropsy was performed on termination.

Results Removal of CO2 remained steady over 7 days, averaging 72 ± 12 ml/min at blood flows of 384 ± 18 ml/min. As the venous PCO2 rose or fell, so did the level of CO2 removal. No changes were necessary in the system and no plasma wetting was noted over the 7 days. Hematocrit remained stable and no blood products were required. Initial platelet counts dropped to 221,000 ± 58,000/µl by the second day, but recovered to baseline values on day 4 and remained stable. Necropsy showed no signs of thromboembolism or organ damage. Conclusions A simple alternative to mechanical ventilation for patients with COPD and hypercapnic respiratory failure has been successfully tested in animals. Human trials are planned for 2008 to determine what role 'respiratory dialysis' will have in this patient population.

Effect of the inspiratory flow pattern on inspiration/ expiration transition of lung vibrations in mechanically ventilated patients

S Jean, I Cinel, R Dellinger

Robert Wood Johnson School of Medicine, UMDNJ, Cooper

University Hospital, Camden, NJ, USA

Critical Care 2008, 12(Suppl 2):P304 (doi: 10.1186/cc6525)

Introduction Mechanical ventilation can be performed using different inspiratory flow patterns. We used vibration response imaging (VRI) technology to ascertain the intensity of lung vibration with different inspiratory flow patterns during various modes of mechanical ventilation.

Methods VRI was performed in succession during volume control ventilation with a square and decelerating inspiratory flow waveform, a pressure control waveform and a pressure support waveform. The total lung vibration energy transmitted to the posterior thorax (from 36 sensors) was plotted over time and the transition of vibrations from inspiration to expiration noted.

Figure 1 (abstract P304)

Results During volume control ventilation with a square inspiratory flow waveform, peak inspiratory flow (PIF) is immediately followed by peak expiratory flow (PEF) and, as such, the separation of peak inspiratory (I) and expiratory (E) lung vibrations transmitted to the chest wall surface was consistently minimal or absent (Figure 1a). During pressure support ventilation, PIF and PEF are separated, resulting in a consistent separation of peak I and E lung vibration (Figure 1b). During pressure control ventilation with sufficient inspiratory time to allow inspiratory flow to return to baseline or a near-baseline level, the valley between the peak I and E lung vibration energy was consistently widened, reflecting the decrease in vibration energy associated with lower end-inspiratory airflow (Figure 1c). Conclusions When the flow-time waveform recorded between the patient and the ventilator is compared with the vibration waveform summed from the surface sensors, two patterns emerge: as the time between PIF and PEF increases, the separation of peak I and E lung vibration energy increases; and as end-inspiratory flow decreases, the valley between peak I and E vibration deepens. Decelerating flows that gradually diminish to zero will therefore produce the greatest separation in respiratory vibrations. The effect of the inspiratory flow waveform on vibration transition in the chest could therefore have clinical relevance.

Effect of the pressure ramp on the breathing pattern and respiratory drive in pressure support ventilation

A Tejero Pedregosa, F Ruiz Ferrón, MI Ruiz García, S Galindo Rodríguez, A Morante Valle, A Castillo Rivera, J Pérez Valenzuela, ÁJ Ferrezuelo Mata, L Rucabado Aguilar

Complejo Hospitalario de Jaén, Spain

Critical Care 2008, 12(Suppl 2):P305 (doi: 10.1186/cc6526)

Introduction Assisted ventilation with pressure support (PS) is one of the most commonly used ventilation modes; however, there is no agreement about the most adequate method to set an optimal level of PS. The aim of our study is to analyze the effect of different pressurization ramps on the work of breathing. This was estimated by the airway occlusion pressure in the first 100 ms of inspiration (P0.1) and by the breathing pattern: the respiratory rate (RR) and tidal volume (TV).

Methods We carried out an interventional prospective study in a group of 15 patients mechanically ventilated after acute respiratory failure (ARF) of different initial causes. At the beginning of assisted ventilation, the initial PS level was set equal to the plateau pressure, and this was decreased in order to keep the patient comfortable and the P0.1 lower than 3 cmH2O. The PEEP level used in controlled ventilation was maintained, and we subsequently changed the inspiratory rise time between three settings: 0.0, 0.2 and 0.4 seconds. Once the patient was respiratorily and haemodynamically stable, we measured the RR, TV and P0.1. Data were compared using nonparametric tests. Results are presented as the mean ± standard deviation at 0.0, 0.2 and 0.4 seconds. Results The cause of ARF was acute respiratory distress syndrome (ARDS) in nine patients and acute-on-chronic respiratory failure in six patients. The mean age was 61.2 ± 14.02 years. The mean levels of PS and PEEP were 17.93 ± 7.10 and 6 ± 1.69 cmH2O. A slower pressurization ramp (longer rise time) was associated with significantly higher P0.1 levels (1.25 ± 0.8, 1.47 ± 1, 1.96 ± 1.24; P = 0.01), whereas the RR and TV did not change significantly (RR: 22.66 ± 9.38, 21.73 ± 7.05, 22 ± 6.67; TV: 534.46 ± 162.59, 541.8 ± 168.5, 522.73 ± 157.09). There were no significant differences in the P0.1 levels between ARDS and acute-on-chronic respiratory failure. Conclusions The parameters frequently used to estimate the breathing pattern do not necessarily reflect changes in the work of breathing. The availability of automatic measurements in some ventilators can help to optimize the ventilatory mode used.

Figure 1 (abstract P306)

Knowledge acquisition to design a fuzzy system for disease-specific automatic control of mechanical ventilation

D Gottlieb1, S Lozano2, J Guttmann1, K Moller2

1University Clinics, Freiburg, Germany; 2Furtwangen University, VS-Schwenningen, Germany

Critical Care 2008, 12(Suppl 2):P306 (doi: 10.1186/cc6527)

Introduction A closed-loop system for automated control of mechanical ventilation, the Autopilot-BT, is to be enhanced [1]. It must be able to adapt to diverse disease patterns. The Autopilot-BT is based on fuzzy logic, which can model complex systems using expert knowledge. The expert knowledge was acquired with a specifically designed questionnaire (Figure 1a). Methods As an example, we focus on the respiratory rate (RR) controller, which is responsible for control of the arterial partial pressure of carbon dioxide via the end-tidal carbon dioxide pressure (etCO2). The etCO2 values are classified into seven fuzzy sets ranging from 'extreme hyperventilation' to 'extreme hypoventilation'. For different diseases, such as chronic obstructive pulmonary disease or acute respiratory distress syndrome (ARDS), every clinician assigns given etCO2 values to a ventilation status. By averaging over all clinicians' assignments, new targets and limits for each disease are obtained. The new target and limit areas were then implemented in a new fuzzy system controlling the RR.
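The classification step can be illustrated with a minimal sketch. Triangular membership functions are an assumption of this sketch, and the 'healthy' breakpoints are purely illustrative; only the 30-60 mmHg base of the ARDS 'normal state' set follows the abstract.

```python
# Minimal sketch of disease-specific fuzzy classification of etCO2 (mmHg).
# Triangular membership functions and the 'healthy' breakpoints are
# illustrative assumptions; the ARDS 'normal state' base (30-60 mmHg)
# follows the reported questionnaire result.

def tri(x, a, b, c):
    """Triangular membership function rising over [a, b], falling over [b, c]."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# (a, b, c) breakpoints of the 'normal state' fuzzy set per disease.
NORMAL_STATE = {
    "healthy": (30.0, 40.0, 50.0),   # illustrative
    "ARDS": (30.0, 45.0, 60.0),      # base 30-60 mmHg as reported
}

def normal_membership(etco2, disease):
    """Degree (0..1) to which etco2 counts as 'normal state' for `disease`."""
    a, b, c = NORMAL_STATE[disease]
    return tri(etco2, a, b, c)
```

Under this sketch, an etCO2 of 55 mmHg has zero 'normal state' membership for a healthy lung but nonzero membership under the right-shifted ARDS set, mirroring the reported tolerance of hypoventilation in ARDS.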

(a) Questionnaire for ARDS. (b) Membership functions for "healthy" and "ARDS".

Results Sixty-one of the anaesthesiologists completed the questionnaire; two did not answer. Figure 1b shows, as an example, the different classifications of etCO2 (membership functions) in 'healthy' and ARDS derived from the questionnaire. The membership areas of 'normal state', 'moderate', 'strong' and 'extreme hypoventilation' in the ARDS fuzzy system are shifted to the right. The base of the target area 'normal state' in the ARDS system ranges from 30 to 60 mmHg etCO2. One limit in the fuzzy system therefore shifts towards the hypoventilated area, and the system tolerates etCO2 values up to 60 mmHg as the normal-state range. Conclusions Disease-specific expert knowledge derived from the questionnaire greatly modifies the performance of the RR controller. The developed disease-specific adaptive controller provides better mechanical ventilation support to patients. Reference

1. Lozano S, et al.: Technol Health Care, in press.

Use of chest sonography to estimate alveolar recruitment in general anesthesia

B Serretti1, B Ferro1, L Gargani2, F Forfori3, R Mori3, C Mosca3, F Giunta3

1Scuola di Specializzazione in Anestesia e Rianimazione, Università degli Studi di Pisa, Pisa, Italy; 2Institute of Clinical Physiology, National Research Council, Pisa, Italy; 3AOUP, Pisa, Italy

Critical Care 2008, 12(Suppl 2):P307 (doi: 10.1186/cc6528)

Introduction Chest sonography evaluates extravascular lung water as the number of ultrasound lung comets (ULC). The goal of this study was to assess the potential role of chest sonography in detecting lung reaeration after alveolar recruitment maneuvers (RM), using the PaO2/FiO2 ratio as the reference parameter. Methods Chest sonography and arterial blood gas (ABG) analysis were independently performed in eight anesthetized and mechanically ventilated patients after abdominal surgery, before RM (achieved with positive end-expiratory pressure at 35 cmH2O for 30 s) and 5 and 60 minutes later. A total of 11 anterior and lateral chest regions were studied. For each chest region, lung patterns were scored as: 0 = normal, 1 = B lines >7 mm, 2 = B lines <7 mm, 3 = white lung, 4 = alveolar consolidation. A global ULC score for each patient was calculated by adding the scores of the 11 chest regions. Using the PaO2/FiO2 ratio as the reference parameter, we calculated the sensitivity, specificity and positive predictive value of lung ultrasound. The baseline ULC score was compared with the baseline PaO2/FiO2 ratio, and the variation of the echo score was compared with the variation of the PaO2/FiO2 ratio. Results Chest sonography showed a sensitivity of 50-100%, a specificity of 75% and a positive predictive value of 67-88%. A linear correlation between the PaO2/FiO2 ratio and the echo score was found (Figures 1 and 2).

Figure 1 (abstract P307)

Figure 2 (abstract P307)
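The global ULC score described above can be sketched as a simple sum over the 11 regional scores; the example scores below are illustrative, not study data.

```python
# Sketch of the global ULC score: each of the 11 anterior/lateral chest
# regions is scored 0 (normal) to 4 (alveolar consolidation), and the
# per-region scores are summed. Example inputs are illustrative.

def global_ulc_score(region_scores):
    """Sum of the 11 regional scores (each 0-4); range 0-44."""
    if len(region_scores) != 11:
        raise ValueError("expected scores for 11 chest regions")
    if any(not 0 <= s <= 4 for s in region_scores):
        raise ValueError("each region score must be 0-4")
    return sum(region_scores)
```

A fully normal examination therefore scores 0 and the worst possible examination (consolidation in every region) scores 44.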

Conclusions Chest sonography is a promising, simple and bedside tool to estimate the efficacy of alveolar recruitment maneuvers.

Lung sound analysis to detect recruitment processes during mechanical ventilation

K Möller1, Y Lu1, B Vondenbusch1, S Schließmann2, S Schumann2, A Kirschbaum2, J Guttmann2

1Furtwangen University, VS-Schwenningen, Germany; 2University Clinics, Freiburg, Germany

Critical Care 2008, 12(Suppl 2):P308 (doi: 10.1186/cc6529)

Introduction The optimal setting of positive end-expiratory pressure (PEEP) during mechanical ventilation in the ICU is still an open problem [1]. Recruitment processes are important influencing factors that are difficult to measure in patients [2,3]. We propose the use of lung sound analysis to identify recruitment during mechanical ventilation.

Figure 1 (abstract P308)

Methods A special low-noise microphone and amplifier were developed (Figure 1, left) and integrated into a standard stethoscope. A series of experiments was performed to obtain sound data from isolated animal lungs. A standard protocol was implemented using the Evita4Lab equipment (Dräger Medical, Lübeck, Germany). After preparation of the lung, a low-flow manoeuvre was performed as a first inflation to open the atelectatic lung. Visual inspection allowed us to identify the amount of recruitment. At most two more low-flow manoeuvres were performed while recording the lung sounds, until the whole lobe below the stethoscope was opened. Finally, a PEEP wave manoeuvre was used to record normal breaths at different PEEP levels, starting from zero PEEP up to a peak pressure of 50 mbar. Results Crackle sounds (Figure 1, right) were identified and analysed offline in the recorded sound data using Matlab (Mathworks, Natick, MA, USA) for analysis and visualization. With a window technique, the power of the sound signal in a frequency range specific for crackles (700-900 Hz) was determined. At the beginning of the low-flow manoeuvre, trains of crackles were found that increased in intensity with pressure. At higher pressure levels the crackle intensity decreased, especially in the second and third manoeuvres.
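The windowed band-power computation described above can be sketched as follows; the sampling rate, window length, hop and test tones are illustrative assumptions, not the study's recording parameters.

```python
# Sketch of windowed band-power analysis: the power of the sound signal in
# the assumed crackle band (700-900 Hz) is computed per window with an FFT.
# Window/hop sizes and the synthetic tones below are illustrative.
import numpy as np

def band_power(signal, fs, lo=700.0, hi=900.0, win=1024, hop=512):
    """Per-window power of `signal` (sampled at `fs` Hz) in [lo, hi] Hz."""
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    powers = []
    for start in range(0, len(signal) - win + 1, hop):
        seg = signal[start:start + win] * np.hanning(win)  # taper the window
        spec = np.abs(np.fft.rfft(seg)) ** 2               # power spectrum
        powers.append(spec[mask].sum())
    return np.array(powers)

# An 800 Hz tone falls inside the crackle band; a 200 Hz tone does not.
fs = 8000
t = np.arange(fs) / fs                      # 1 s of signal
in_band = np.sin(2 * np.pi * 800.0 * t)
out_of_band = np.sin(2 * np.pi * 200.0 * t)
```

Plotting the per-window band power over a manoeuvre would reproduce the kind of crackle-intensity trace analysed in the study.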

Conclusions Sound analysis in isolated lungs can reliably detect increasing crackle activities that seem to correlate with visually observable opening of atelectasis. References

1. Hickling KG: Curr Opin Crit Care 2002, 8:32-38.

2. Gattinoni L, et al.: N Engl J Med 2006, 354:1775-1786.

3. Stahl CA, et al.: Crit Care Med 2006, 34:2090-2098.

Lung sound patterns of exacerbation of congestive heart failure, chronic obstructive pulmonary disease and asthma

T Bartter, Z Wang

Cooper University Hospital, Camden, NJ, USA

Critical Care 2008, 12(Suppl 2):P309 (doi: 10.1186/cc6530)

Introduction Congestive heart failure (CHF), chronic obstructive pulmonary disease (COPD) and asthma patients typically present with abnormal auscultatory findings on lung examination. The aim of this study was to evaluate in detail the distribution of respiratory sound intensity in CHF, COPD and asthma patients during acute exacerbation.

Methods Respiratory sounds throughout the respiratory cycle were captured and displayed using an acoustic-based imaging technique. The breath sound distribution was mapped to create a gray-scale sequence of two-dimensional images based on the intensity of sound (vibration). Consecutive CHF (n = 23), COPD (n = 12) and asthma (n = 22) patients were imaged at the time of presentation to the emergency department. The geographical area of the images and the respiratory sound patterns were quantitatively analyzed. Results In healthy volunteers, CHF, COPD and asthma patients, the mean geographical area of the vibration energy image in an inspiratory maximal energy frame was 76.2 ± 4.5, 67.6 ± 6.7, 72.2 ± 7.6 and 52 ± 11.7 kilo-pixels, respectively (P < 0.01). In healthy volunteers, CHF, COPD and asthma patients, the ratio of vibration energy values at peak inspiration and expiration (peak I/E ratio) was 6.3 ± 5.2, 5.6 ± 4, 2.8 ± 2.2 and 0.3 ± 0.3, respectively (P < 0.01). Mathematical analysis of the timing of vibration energy peaks of the right versus left lungs showed that the time between inspiratory peaks was 0.03 ± 0.04 seconds and between expiratory peaks was 0.14 ± 0.09 seconds in symptomatic asthmatic patients. There were no significant differences in the timing of vibration energy peaks in healthy volunteers, CHF and COPD patients.

Conclusions Compared with healthy volunteers, in CHF the geographic area of the image is smaller, there is no difference in the peak I/E vibration ratio and there is no peak-energy asynchrony between the two lungs; in COPD, there is no difference in the geographic area of the image and no asynchrony in peak energy between the two lungs, but there is a significant decrease in the peak I/E vibration ratio; in asthma, the geographic area of the image is much smaller, the peak I/E ratio is decreased even further and there is asynchrony in peak energy between the two lungs. These characteristics may be helpful in distinguishing acute symptomology due to CHF, COPD or asthma.
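The peak I/E vibration ratio used above can be sketched as follows; the energy trace and the inspiration/expiration boundary in the example are synthetic assumptions (in practice the boundary would come from the flow signal).

```python
# Sketch of the peak I/E vibration-energy ratio: the maximum of the
# vibration-energy trace during inspiration divided by the maximum during
# expiration. `insp_end` (the sample where expiration begins) is assumed
# known, e.g. from the ventilator or airflow signal.

def peak_ie_ratio(energy, insp_end):
    """Ratio of peak inspiratory to peak expiratory vibration energy."""
    peak_i = max(energy[:insp_end])
    peak_e = max(energy[insp_end:])
    return peak_i / peak_e
```

A ratio well below 1, as reported for asthma (0.3 ± 0.3), means expiration is acoustically dominant, consistent with expiratory wheeze.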

Determination of expiratory lung mechanics using cardiogenic oscillations during decelerated expiration

A Wahl1, M Lichtwarck-Aschoff2, K Möller3, S Schumann1, J Guttmann1

1University Hospital Freiburg, Germany; 2Uppsala University, Sweden; 3Furtwangen University, Furtwangen, Germany

Critical Care 2008, 12(Suppl 2):P310 (doi: 10.1186/cc6531)

Introduction Mechanical energy from the beating heart is transferred to the lung, inducing variations in the airway pressure signal called cardiogenic oscillations (COS), which we hypothesize reflect intratidal nonlinear lung mechanics. However, at the high flow rates characteristic of passive expiration, the analysis of lung mechanics is impractical since COS are almost suppressed and their amplitude is low.

Methods Five piglets with atelectasis were investigated during constant inspiratory flow mechanical ventilation with positive end-expiratory pressure of 0, 5, 8, 12 and 16 mbar. The airflow rate, airway pressure, pleural pressure and ECG were recorded (sample frequency 100 Hz). The expiratory airflow rate was limited using two switchable tubes of different lumina. Signals were separated and compared breath by breath.

Results Compared with passive expiration, COS became clearly distinguishable during decelerated expiration (Figure 1). COS amplitudes increased with decreasing airflow rate. Conclusions By decelerating the expiration, COS become distinguishable and therefore analyzable. With this method, lung mechanics can be determined separately during expiration. References

1. Lichtwarck-Aschoff M, et al.: J Appl Physiol 2004, 96:879-884.

2. Bijaoui E, et al.: Adv Exp Med Biol 2004, 551:251-257.

3. Bijaoui E, et al.: J Appl Physiol 2001, 91:859-865.

Figure 1 (abstract P310)

Cardiogenic oscillations reflect nonlinear lung mechanics

A Wahl1, M Lichtwarck-Aschoff2, K Möller3, S Schumann1, J Guttmann1

1University Hospital Freiburg, Germany; 2Uppsala University, Sweden; 3Furtwangen University, Furtwangen, Germany

Critical Care 2008, 12(Suppl 2):P311 (doi: 10.1186/cc6532)

Introduction Heartbeats transfer mechanical energy to the lung, inducing variations in local compliance (dV/dP) [1], which translate to cardiogenic oscillations (COS) in the pressure-volume (PV) loop. We hypothesized that the COS-related local dV/dP change reflects intratidal nonlinear lung mechanics modulated by positive end-expiratory pressure (PEEP).

Methods Ten piglets with atelectasis were investigated during constant inspiratory flow mechanical ventilation with PEEP of 0, 5, 8, 12 and 16 mbar. The airflow rate, airway pressure and ECG were recorded (sample frequency: 100 Hz). The inspiratory limb of the PV loop was partitioned into segments confined by two consecutive ECG R-peaks, each containing one COS. Local compliances were analyzed as the local slope (dV/dP) within consecutive volume windows of 50 ml. Results The COS-related local compliances depend on the PEEP level, as shown for one representative animal (Figure 1). They are maximal at PEEP levels of 5 and 8 mbar, minimal at zero PEEP and at high PEEP levels of 12 and 16 mbar, and they decrease with increasing inspired volume.
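The local-compliance analysis can be sketched as follows: the inspiratory pressure-volume data are divided into consecutive 50 ml volume windows and the local slope dV/dP is estimated in each by a least-squares fit. The synthetic linear PV curve in the example is an illustrative assumption, not study data.

```python
# Sketch of intratidal local-compliance analysis: dV/dP estimated within
# consecutive 50 ml volume windows of the inspiratory PV data.
import numpy as np

def local_compliances(pressure, volume, window_ml=50.0):
    """Return dV/dP slopes (ml/mbar), one per consecutive 50 ml volume window."""
    pressure = np.asarray(pressure, dtype=float)
    volume = np.asarray(volume, dtype=float)
    slopes = []
    v0 = volume[0]
    while v0 + window_ml <= volume[-1]:
        mask = (volume >= v0) & (volume < v0 + window_ml)
        if mask.sum() >= 2:
            # least-squares slope of V over P within this volume window
            slopes.append(np.polyfit(pressure[mask], volume[mask], 1)[0])
        v0 += window_ml
    return slopes

# Synthetic linear PV curve with constant compliance of 20 ml/mbar.
pressure = np.linspace(0.0, 20.0, 401)   # mbar
volume = 20.0 * pressure                 # ml
slopes = local_compliances(pressure, volume)
```

For real data the sequence of window slopes would trace the intratidal nonlinearity described in the abstract (low, then high, then again low compliance).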

Conclusions COS-related local compliances reflect nonlinear lung mechanics. The information obtained by COS corresponds to what can be learnt from the sigmoid PV loop of a quasistatic manoeuvre with low compliance at low pressure, high compliance at intermediate pressure and again low compliance with overdistension. The intratidal pattern of COS-related compliances possibly reflects the nonlinearity of pulmonary volume distensibility, which, among others, is modulated by PEEP. Analysis of COS-related local compliances may open a window towards lung mechanics determination in spontaneous breathing. References

1. Lichtwarck-Aschoff M, et al.: J Appl Physiol 2004, 96:879-884.

2. Bijaoui E, et al.: Adv Exp Med Biol 2004, 551:251-257.

3. Bijaoui E, et al.: J Appl Physiol 2001, 91:859-865.

Figure 1 (abstract P311)

Functional residual capacity measurements during mechanical ventilation in ICU patients

IG Bikker, J Van Bommel, D Dos Reis Miranda, D Gommers

Erasmus Medical Center, Rotterdam, The Netherlands Critical Care 2008, 12(Suppl 2):P312 (doi: 10.1186/cc6533)

Introduction The level of positive end-expiratory pressure (PEEP) is important to avoid ventilator-induced lung injury (VILI) by preventing alveolar collapse and alveolar overdistension. One way to guide the setting of optimal PEEP could be measurement of the functional residual capacity or end-expiratory lung volume (EELV) in mechanically ventilated patients. Recently, GE Healthcare introduced a multibreath open-circuit nitrogen technique to measure the EELV during mechanical ventilation. The aim of this study was to measure EELV at three different PEEP levels in ventilated patients with different diseases.

Methods We examined 45 sedated mechanically ventilated patients in a mixed ICU of a university hospital. Patients were divided into three groups: normal pulmonary function (group N), respiratory failure due to primary lung disorders (group P) and respiratory failure due to secondary lung disorders (group S). In all patients the EELV measurements were performed at three PEEP levels (15 cmH2O, 10 cmH2O, 5 cmH2O). Arterial blood gases were also obtained at each PEEP level.

Results Figures 1 and 2 show the EELV data and PaO2/FiO2 (PF) ratio data, respectively.

Figure 1 (abstract P312)

Figure 2 (abstract P312)

Conclusions We conclude that the EELV values decreased significantly after stepwise reduction of the PEEP level from 15 to 5 cmH2O, whereas the PaO2/FiO2 ratio did not change. This indicates that monitoring a patient's lung volume could be a prerequisite for finding the optimal PEEP in order to prevent VILI.

Clinical evaluation of a system for measuring functional residual capacity in mechanically ventilated patients

L Brewer, B Markewitz, E Fulcher, J Orr

University of Utah Health Sciences Center, Salt Lake City, UT, USA Critical Care 2008, 12(Suppl 2):P313 (doi: 10.1186/cc6534)

Introduction The ability to track the functional residual capacity (FRC) volume for a patient treated with ventilator therapy may be useful for guidance in improving or preserving gas exchange and for avoiding ventilator-associated injury to the lung. We tested the repeatability of the FRC measurement at the bedside for mechanically ventilated patients in the ICU. Methods All data needed for the FRC measurement were collected using a volumetric capnography system (NICO2; Respironics, Wallingford, CT, USA), which had been modified to contain a fast, on-airway oxygen sensor. The nitrogen washout and washin method was used to calculate the FRC for 13 ICU patients. The protocol for a measurement set called for increasing the FIO2 from the clinically determined baseline to 100% for 5 minutes, returning the FIO2 to the baseline level for 5 minutes, and then repeating the FIO2 change. After approximately 1 hour, the measurement set was repeated. The differences between the first and second measurements in each set were analyzed. Results Bland-Altman analysis yielded a bias between repeated measurements of 85 ml (2.8%) and a standard deviation of the differences of ±278 ml (9.0%). The r2 of the repeated measurements was 0.92 (n = 28). See Figure 1.
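The repeatability statistics reported above (Bland-Altman bias, standard deviation of the differences, and r²) can be sketched as follows; the paired values in the example are synthetic, not the study's measurements.

```python
# Sketch of the Bland-Altman repeatability analysis for paired FRC
# measurements: bias (mean difference), sample SD of the differences,
# and the r^2 of the pairs. Example data are synthetic.
import numpy as np

def bland_altman(m1, m2):
    """Return (bias, sd_of_differences, r_squared) for paired measurements."""
    m1 = np.asarray(m1, dtype=float)
    m2 = np.asarray(m2, dtype=float)
    diff = m2 - m1
    bias = diff.mean()
    sd = diff.std(ddof=1)                      # sample SD of the differences
    r2 = np.corrcoef(m1, m2)[0, 1] ** 2
    return bias, sd, r2

# Synthetic repeated FRC measurements (ml)
m1 = [2000.0, 2500.0, 3000.0, 3500.0]
m2 = [2100.0, 2550.0, 3100.0, 3550.0]
bias, sd, r2 = bland_altman(m1, m2)
```

The study's Figure 1 plots the differences against the pairwise means, which is the conventional Bland-Altman presentation of these quantities.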

Conclusions We have previously verified the FRC measurement accuracy of this system in bench tests and volunteer studies. The satisfactory repeatability demonstrated in this study suggests the system is clinically viable in the ICU.

Figure 1 (abstract P313)


(FRC Measurement #1 + FRC Measurement #2)/2 (ml)

Effects of alveolar recruitment in patients after cardiac surgery: a prospective, randomized, controlled clinical trial

L Hajjar, F Galas, E Nozawa, A Leme, C Shigemi, C Barbas, J Auler

Heart Institute - Faculdade de Medicina da Universidade de São Paulo, Brazil

Critical Care 2008, 12(Suppl 2):P314 (doi: 10.1186/cc6535)

Introduction Pulmonary atelectasis and hypoxemia are significant events after cardiac surgery. The aim of the present study was to determine the efficacy of alveolar recruitment in different ventilation modes after cardiac surgery.

Methods We evaluated 480 patients submitted to cardiac surgery after arrival in the ICU. Patients were randomly allocated to one of eight groups: group 1 - volume controlled (8 ml/kg), respiratory rate of 12/min, PEEP = 5 cmH2O with no recruitment, FiO2 = 0.6; group 2 - volume controlled (8 ml/kg), respiratory rate of 12/min, PEEP = 5 cmH2O, FiO2 = 0.6 and recruitment with three maneuvers (PEEP 30 cmH2O for 30 seconds); group 3 - volume controlled (8 ml/kg), respiratory rate of 12/min, PEEP = 10 cmH2O with no recruitment, FiO2 = 0.6; group 4 - volume controlled (8 ml/kg), respiratory rate of 12/min, PEEP = 10 cmH2O, FiO2 = 0.6 and recruitment with three maneuvers (PEEP 30 cmH2O for 30 seconds); group 5 - pressure controlled (to achieve 8 ml/kg), respiratory rate of 12/min, PEEP = 5 cmH2O with no recruitment, FiO2 = 0.6; group 6 - pressure controlled, respiratory rate of 12/min, PEEP = 5 cmH2O, FiO2 = 0.6 and recruitment with three maneuvers (PEEP 30 cmH2O for 30 seconds); group 7 - pressure controlled, respiratory rate of 12/min, PEEP = 10 cmH2O with no recruitment, FiO2 = 0.6; and group 8 - pressure controlled, respiratory rate of 12/min, PEEP = 10 cmH2O, FiO2 = 0.6 and recruitment with three maneuvers (PEEP 30 cmH2O for 30 seconds). The primary outcomes were the ratio of arterial oxygen tension to inspired oxygen fraction measured after 4 hours of ventilation and the time to extubation.

Results Oxygenation was higher in the recruitment groups (P < 0.01), and pressure-controlled ventilation resulted in better oxygenation than volume-controlled ventilation (P < 0.05). Patients in groups 6 and 8 (pressure controlled with recruitment maneuvers) had a shorter time to extubation than the other groups (280 min vs 476 min, P < 0.01).

Conclusions After cardiac surgery, pressure-controlled ventilation with recruitment is an effective method to reduce hypoxemia and results in a shorter duration of mechanical ventilation. Reference

1. Minkovich L, et al.: Effects of alveolar recruitment on arterial oxygenation in patients after cardiac surgery. J Cardiothorac Vasc Anesth 2007, 21:375-378.

Evaluation of homogeneity of alveolar ventilation with electrical impedance tomography during anaesthesia and laparoscopic surgery

T Meier, H Luepschen, J Karsten, M Großherr, C Eckmann, H Gehring, S Leonhardt

Klinik für Anästhesiologie, Lübeck, Germany

Critical Care 2008, 12(Suppl 2):P315 (doi: 10.1186/cc6536)

Introduction After induction of anaesthesia and during abdominal surgery, the homogeneity of ventilation is influenced by factors such as compression and absorption atelectasis and changes in abdominal pressure [1]. To assess the spatial change in ventilation when applying positive end-expiratory pressure (PEEP), we employed the dynamic centre of gravity index ycog [2], a new mathematical feature of electrical impedance tomography (EIT) measurement, in a clinical study.

Methods After approval by the local ethics committee and with informed consent, we prospectively randomized 32 consecutive patients (ASA physical status I/II) scheduled to undergo elective laparoscopic cholecystectomy. The patients were randomly assigned to a PEEP group (10 cmH2O) or a ZEEP group (0 cmH2O). Patients received volume-controlled ventilation (8 ml/kg bw), and the minute volume was adjusted by increasing the respiratory rate while maintaining a PaCO2 level between 35 and 45 mmHg. EIT (EIT evaluation KIT; Dräger Medical, Lübeck, Germany / GoeMF II system; University of Göttingen, Germany) was performed at the intercostal level of Th 6 in the supine position. Measurements were carried out preoperatively and intraoperatively at five different time points (T0-T4). We calculated the ventral/dorsal lung ycog [2] to investigate differences in the homogeneity of pulmonary ventilation. EIT data and gas exchange parameters were compared between the randomized groups. A t test and variance analysis by the GLM repeated-measures procedure (Greenhouse-Geisser correction) were used for statistical analysis.
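A centre-of-gravity index of this kind can be sketched as follows, assuming a 2D array of per-pixel tidal impedance change as input; this is one common way to define such an index, and the exact definition in [2] may differ.

```python
# Sketch of a ventral/dorsal centre-of-gravity index for an EIT ventilation
# image: the amplitude-weighted mean position along the ventral-dorsal axis,
# normalised to 0 (fully ventral) .. 1 (fully dorsal). The normalisation and
# row convention are assumptions of this sketch.
import numpy as np

def y_cog(tidal_image):
    """Weighted centre of gravity over rows; row 0 = ventral, last = dorsal."""
    img = np.asarray(tidal_image, dtype=float)
    rows = np.arange(img.shape[0])
    row_sums = img.sum(axis=1)                 # ventilation per row
    return float((rows * row_sums).sum() / (row_sums.sum() * (img.shape[0] - 1)))
```

A perfectly homogeneous image yields 0.5, while atelectasis-related redistribution of ventilation shifts the index towards the better-ventilated regions.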

Results Both study groups showed no differences in their preoperative characteristics. After induction of anaesthesia, oxygenation was reduced in the ZEEP group compared with the PEEP group, and the PaO2/FiO2 ratio was lower during anaesthesia compared with the T0 measurements. The PEEP-ventilated patients showed higher values of respiratory compliance. The ZEEP-ventilated patients showed a lower gravity index compared with the PEEP group (P = 0.018), whereas ventilation with PEEP showed no difference in ycog compared with T0.

Conclusions The dynamic change in the homogeneity of ventilation after induction of anaesthesia and during surgery can be characterized by calculating the gravity index, which is derived mathematically from noninvasive EIT measurements. ZEEP ventilation resulted in a prominent reduction of oxygenation and a shift of the dynamic centre of gravity index compared with preoperative measurements and with ventilation with PEEP. This new index may be of considerable help in optimizing ventilation in anaesthetized patients. References

1. Duggan M, Kavanagh BP: Anesthesiology 2005, 102:838-854.

2. Luepschen H, et al.: Physiol Meas 2007, 28:S247-S260.

Serum level of growth-related oncogene alpha during abdominal aortic aneurysm repair in humans

M Jedynak, A Siemiatkowski, M Gacko, B Mroczko

Bialystok Medical University, Bialystok, Poland

Critical Care 2008, 12(Suppl 2):P316 (doi: 10.1186/cc6537)

Introduction The high postoperative mortality in patients after open elective abdominal aortic aneurysm (AAA) repair is thought to be the effect of ischemia-reperfusion organ injury followed by multiple organ dysfunction syndrome. Neutrophils appear to be the predominant leukocytes mediating ischemia-reperfusion injury and organ damage. Growth-related oncogene alpha (GROα) (CXCL1) is a chemokine with potent chemotactic activity for neutrophils. The aim of this study was to determine changes in serum GROα concentrations in the course of ischemia-reperfusion during AAA repair.

Methods Blood samples were taken before surgery (Preop), before unclamping of the aorta (Pre-Xoff), 90 minutes after unclamping (90min-Xoff) and 24 hours after surgery. GROα serum concentrations were measured by ELISA.

Results Seventeen patients, all men, with median age 65 (range 44-76) years undergoing AAA repair and a control group comprising 11 volunteers, all men, were included in the study. Nine patients made an uncomplicated recovery, eight (47%) developed complications and four of them (24% of all) died. During AAA repair the GROα level decreased from 79 pg/ml at Preop and 76 pg/ml at Pre-Xoff to the lowest value of 61 pg/ml at 90min-Xoff (P = 0.308 vs Preop), followed by an increase to 100 pg/ml 24 hours after operation (P = 0.055 vs 90min-Xoff). In contrast to the uncomplicated group, patients who died or developed complications showed no depletion of GROα levels during surgery; instead, a rise to 137 pg/ml (P = 0.144 vs Preop) and 133 pg/ml (P = 0.86 vs Preop), respectively, was observed 24 hours after surgery. There was a significant positive correlation between the GROα level and the Multiple Organ Dysfunction Score (r = 0.417) calculated on the second day after AAA repair.

Conclusions Serum GROα concentrations decreased in the course of ischemia-reperfusion during AAA repair in uncomplicated patients. The lack of depletion of the chemokine during surgery and its high value after AAA repair were associated with the development of postoperative organ dysfunction and death. References

1. Miura M, et al.: Am J Pathol 2001, 159:2137-2145.

2. Belperio JA, et al.: J Immunol 2006, 175:6931-6939.

Serum bilirubin over 50 μmol/l on postoperative day 5: causes, consequences and outcome

S Toth1, M Radnai1, B Fule1, A Doros1, B Nemes1, M Varga1, J Fazakas2

1Semmelweis University, Budapest, Hungary; 2Bajcsy-Zsilinsky Hospital, Budapest, Hungary

Critical Care 2008, 12(Suppl 2):P317 (doi: 10.1186/cc6538)

Introduction The 50-50 criteria (serum bilirubin (SeBi) > 50 μmol/l, serum prothrombin index (PI) < 50%) on postoperative day 5 are a predictor of mortality after liver transplantation [1]. The aim of this study was to analyse the perioperative causes, consequences and outcome of early excretory liver dysfunction.

Methods In 96 liver-transplanted patients, graft dysfunction was defined as PI < 50% and SeBi > 50 μmol/l. The multiorgan donation data, including liver biopsy results, were recorded. During and after liver transplantation, volumetric hemodynamic, global oxygenation and regional splanchnic perfusion parameters were measured. The hepatic and renal functions were analysed. Based on the postoperative fifth-day SeBi levels, the patients were divided into two groups: group A (SeBi < 50 μmol/l, n = 47) and group B (SeBi > 50 μmol/l, n = 49). The postoperative complications (bleeding, renal and respiratory failure, infection) were noted and mortality was recorded. Statistical analyses were performed with the Wilcoxon signed rank test, chi-squared test and Kaplan-Meier model.

Results Before organ retrieval, more group B donors received dopamine (P < 0.04), compared with group A donors, who more often received noradrenaline (P < 0.004). The occurrence of donor fatty liver was the same in both groups. Group B included more Child-Pugh C recipients (P < 0.004) with a higher Model for End-Stage Liver Disease (MELD) score (P < 0.001) and longer transplantations (P < 0.05). Worse volumetric hemodynamic (intrathoracic blood volume index, cardiac index; P < 0.03), global oxygenation (oxygen delivery index; P < 0.04) and regional splanchnic perfusion parameters (intramucosal gastric pH; P < 0.02) were found in group B only after portal and arterial reperfusion. In group B the occurrence of continuous renal replacement therapy and sepsis was higher (P < 0.02), ICU therapy was longer (P < 0.02), and 1-year mortality was 47% compared with 4% in group A (P < 0.004).

Conclusions Early biliary injury after liver transplantation can be connected with the vasopressor therapy of the donor, the severity of cirrhosis, and the worse oxygenation and hemodynamic parameters after reperfusion. Reference

1. Dondero F, et al.: Liver Transplantation 2006, 12(Suppl 5):119.

Liver stiffness measurement for diagnosis of portal hypertension-related digestive bleeding in the ICU

L Tual1, R Amathieu1, P Nahon1, D Thabut2, G Dhonneur1

1AP-HP CHU Jean Verdier, Bondy, France; 2AP-HP CHU Pitie Salpetriere, Paris, France

Critical Care 2008, 12(Suppl 2):P318 (doi: 10.1186/cc6539)

Introduction Without a past history or clinical signs of cirrhosis, the diagnosis of portal hypertension (PHT)-related digestive haemorrhage (DH) is often delayed until upper intestinal endoscopy is performed, thus delaying specific medical treatment such as vasoactive drugs. As the liver stiffness measurement (LSM) is correlated with hepatic fibrosis [1] and PHT, it could therefore be useful for the diagnosis of cirrhosis in this situation. The aim of this study was to assess the predictive value of LSM by FIBROSCAN® for the diagnosis of cirrhotic PHT-related DH.

Methods Between January and May 2006, all consecutive patients referred for DH in two ICUs were prospectively included. In all patients, LSM and upper gastrointestinal endoscopy were performed simultaneously at admission, with each operator blinded to the result of the other examination. Exclusion criteria were the presence of ascites or portal thrombosis without cirrhosis.

Results Sixty-two patients were included (mean age: 60.2 ± 14.2 years; males: 40/62 (65%); BMI: 24.0 ± 11.3 kg/m2; SAPS II: 26.0 ± 19.3): 27/62 (44%) had non-PHT-related DH (gastroduodenal ulcer, 16 cases; oesophagitis, seven cases; others, four cases); 35/62 (56%) had PHT-related DH (oesophageal varices, 100%). All of these 35 patients had cirrhosis, either previously known or clinically obvious (18/35, 51%) or biopsy-proven later. The median LSM was significantly higher in patients with PHT-related DH (54.6 kPa (45.0-65.7) vs 5.2 kPa (4.3-6.3), P < 10^-6). The AUROC for the diagnosis of PHT-related DH was 0.97 ± 0.03. A threshold of 13.7 kPa gave a specificity and positive predictive value of 100% (sensitivity, 93%; negative predictive value, 94%).

Conclusions LSM is a powerful noninvasive tool for the instant diagnosis of PHT-related DH. Performed at admission, it could allow the rapid onset of specific medical management. Assessment of the prognostic value of LSM in these patients is ongoing. Reference

1. Ganne-Carrie N, et al.: Hepatology 44:1511-1517.
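The threshold-based performance figures reported above (specificity, positive predictive value, sensitivity, negative predictive value) all follow from the standard 2 x 2 confusion table. A minimal sketch, using hypothetical counts for illustration since the abstract does not report the exact table:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),   # true positives / all diseased
        "specificity": tn / (tn + fp),   # true negatives / all non-diseased
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts for a cohort like the one above (35 PHT-related and
# 27 non-PHT-related bleeds) with a cutoff producing no false positives:
m = diagnostic_metrics(tp=33, fp=0, fn=2, tn=27)
print({k: round(v, 2) for k, v in m.items()})
```

With no false positives, specificity and PPV are 1.0 by construction; the remaining two metrics depend only on the number of missed cases.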

Small intestinal transit time in critically ill patients using endoscopic video capsule

S Rauch1, K Krueger1, DI Sessler2, N Roewer3

1University of Louisville, KY, USA; 2Cleveland Clinic, Cleveland, OH, USA; 3University of Würzburg, Germany

Critical Care 2008, 12(Suppl 2):P319 (doi: 10.1186/cc6540)

Introduction We investigated the small bowel transit time and pathophysiological changes using a video capsule in ICU patients

with head injuries. The gut was only recently recognized as pivotal in the disease process of critical illness; hence, work is needed to improve our understanding of how bowel dysfunction impacts healing. We hypothesized that new diagnostic technology such as wireless capsule endoscopy, which allows real-time investigation of the small bowel, would show that the small bowel transit time is increased in critically ill patients with brain injuries.

Methods We recruited 32 patients older than 18 years into this prospective, controlled, IRB-approved trial. Their authorized representatives gave written, informed consent. Sixteen were neuro-ICU patients with mild-to-moderate brain injury (GCS 6-14) who required a feeding tube. The control group consisted of 16 ambulatory patients. A small capsule containing a video camera (PillCam™) was positioned in each patient's proximal small intestine at the time of endoscopic feeding tube insertion. Sensors on the patient's abdomen picked up signals transmitted by the capsule, allowing real-time video recording of the gut. Two independent observers analyzed the data.

Results The small bowel transit time for neuro-ICU patients ranged from 144 to 480 minutes (median 344 min, mean 338 min). For the ambulatory patients, the range was 228-389 minutes (median 250 min, mean 279 min). All five patients with small bowel transit times greater than 400 minutes were neuro-ICU patients. The mean difference between the groups was 59 minutes (95% CI: -17 to 135; P = 0.184, Mann-Whitney rank-sum test).

Conclusions The current results suggest that the small bowel transit time is not significantly increased in our critical care patients.

Transpulmonary pressure evaluation in an obese patient under mechanical ventilation

S Delisle1, M Francoeur2, M Albert1

1Hôpital du Sacré-Coeur de Montréal, QC, Canada; 2Summit Technologies Inc., Montréal, QC, Canada

Critical Care 2008, 12(Suppl 2):P320 (doi: 10.1186/cc6541)

Introduction It is often difficult to manage optimal ventilation parameter settings in patients with low chest wall compliance, as in an obese patient. This case report demonstrates options to ventilate such difficult patients.

Methods A 34-year-old obese woman was transferred to our academic ICU to solve difficult ventilation and oxygenation problems in the context of septic shock following urinary tract infection. This patient was transferred to an Avea ventilator (Viasys Healthcare) and an 8 Fr esophageal balloon catheter was inserted in the lower third of the esophagus. An expiratory airway occlusion maneuver, described by Baydur and colleagues [1], was performed to confirm correct positioning. We performed an inspiratory hold to obtain and compare the airway plateau pressure (Pplat) and transpulmonary plateau pressure (Ptpplateau), and an end-expiratory hold to obtain and compare airway total positive end-expiratory pressure (PEEPt) and transpulmonary total PEEP (PtpPEEP). Transpulmonary pressures were used to change ventilator parameter settings.

Results The patient had a BMI of 58.6. Arterial blood gas analysis showed metabolic and respiratory acidosis (pH: 7.18, PaCO2: 53, HCO3: 19) and hypoxemia (PaO2: 75, SpO2: 94% on FiO2 1.0) that had persisted for 4 days prior to insertion of the esophageal balloon. The Pplat was 52.6 cmH2O and PEEPt was 24.8 cmH2O, with a set PEEP of 8 cmH2O. The Ptpplateau was 14.3 cmH2O and PtpPEEP was -4.4 cmH2O. The set PEEP was increased to 25 cmH2O, which resulted in a PtpPEEP of -1.4 cmH2O.

Hemodynamic parameters remained unchanged. Within 24 hours, FiO2 was decreased to 0.35 (pH: 7.34, paCO2: 35, HCO3: 18, paO2: 165, SpO2: 99%), and the patient was extubated 3 days later.
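Transpulmonary pressure is simply airway pressure minus esophageal (pleural-surrogate) pressure measured at the same hold; the values reported above imply esophageal pressures of roughly 38.3 and 29.2 cmH2O at the inspiratory and expiratory holds (back-computed from the report, not stated in it). A minimal sketch of the arithmetic:

```python
def transpulmonary(p_airway, p_esophageal):
    """Transpulmonary pressure (cmH2O): airway minus esophageal pressure."""
    return p_airway - p_esophageal

# End-inspiratory hold: Pplat 52.6 cmH2O, assumed esophageal ~38.3 cmH2O
ptp_plateau = transpulmonary(52.6, 38.3)  # approximately 14.3 cmH2O
# End-expiratory hold: total PEEP 24.8 cmH2O, assumed esophageal ~29.2 cmH2O
ptp_peep = transpulmonary(24.8, 29.2)     # approximately -4.4 cmH2O
```

A negative end-expiratory transpulmonary pressure suggests the set PEEP does not counterbalance the chest wall load, which is what motivated raising the PEEP in this case.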

Conclusions As demonstrated in acute lung injury patients [2], this case also showed a clinical benefit of measuring transpulmonary pressures to adjust ventilator settings, especially PEEP to recruit the lung, in patients with very abnormal chest wall compliance. References

1. Baydur A, Behrakis PK, Zin WA, et al.: Am Rev Respir Dis 1982, 126:788-791.

2. Talmor D, Sarge T, O'Donnell CR, et al.: Crit Care Med 2006, 34:1389-1394.

Passive mechanical properties of rat diaphragms: a new method for analyzing mechanical tissue properties

S Schumann, C Armbruster, K Gamerdinger, M Schneider, CA Stahl, J Guttmann

University Hospital of Freiburg, Germany

Critical Care 2008, 12(Suppl 2):P321 (doi: 10.1186/cc6542)

Introduction During controlled mechanical ventilation the diaphragm's passive mechanical characteristics contribute to the respiratory system's impedance, reflecting predominantly a part of thorax compliance. We focus here on the passive mechanical properties of the diaphragm. We hypothesize that changes in diaphragm compliance can be quantitatively assessed with our new bioreactor setup [1].

Methods Isolated diaphragms of wildtype rats were placed inside the bioreactor on an elastic membrane forming the deformable wall of a pressure chamber of 5.5 ml volume. By increasing the pressure inside the chamber, the membrane and the diaphragms were deflected following the shape of a spherical cap. By analysing the pressure-volume relationship inside the pressure chamber we calculated the mechanical properties (that is, compliance) of the passive diaphragms at certain points in time.

Results Two diaphragms were investigated for 24 hours after explantation. The time course of compliance of the cultivated diaphragms showed characteristic phases reflecting relaxation, onset and end of rigor mortis, and breakup of tissue structure (Figure 1).
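Compliance in such a setup is the slope of the chamber pressure-volume relationship, C = ΔV/ΔP. A minimal sketch of how it could be estimated from sampled pressure-volume pairs by least squares; the data here are illustrative, not the study's measurements:

```python
def compliance(pressures, volumes):
    """Least-squares slope dV/dP (ml/cmH2O) of a pressure-volume series."""
    n = len(pressures)
    mean_p = sum(pressures) / n
    mean_v = sum(volumes) / n
    num = sum((p - mean_p) * (v - mean_v) for p, v in zip(pressures, volumes))
    den = sum((p - mean_p) ** 2 for p in pressures)
    return num / den

# Illustrative samples: chamber pressure (cmH2O) vs injected volume (ml)
p = [2.0, 4.0, 6.0, 8.0]
v = [0.5, 1.0, 1.5, 2.0]
print(compliance(p, v))  # 0.25 ml/cmH2O for this linear series
```

Repeating the fit at successive time points yields the compliance-versus-time course described in the Results.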

Conclusions We attribute the increase in compliance to time-dependent changes in the mechanical tissue properties of the diaphragms after explantation. We conclude that our method allows investigation of changes in the mechanical characteristics of biological tissue during application of strain. In combination with histological and molecular-biological examinations, our method could give new insights into mechanically evoked development of inflammation, apoptosis, necrosis or physiological reactions (for example, muscle fatigue). To our knowledge, this is

Figure 1 (abstract P321)



the first setup that allows repeated measurement of mechanical tissue properties on the same sample. We therefore further conclude that with our method the number of animal experiments could be reduced. Reference

1. Schumann S, Stahl CA, Möller K, et al.: Contact-free determination of material characteristics using a newly developed pressure-operated strain-applying bioreactor. J Biomed Mater Res B Appl Biomater, in press.

Negative pressure therapy improved outcome in a clinically applicable abdominal compartment syndrome porcine model

S Albert1, B Kubiak1, G Hutchinson2, K Norbury2, G Nieman1

1SUNY Upstate Medical University, Syracuse, NY, USA; 2KCI USA Inc., San Antonio, TX, USA

Critical Care 2008, 12(Suppl 2):P322 (doi: 10.1186/cc6543)

Introduction Abdominal compartment syndrome (ACS) is manifested by elevated intra-abdominal pressures (IAP) and associated hemodynamic, lung, or renal dysfunction. ACS may develop in trauma, pancreatitis, or burn patients. Abdominal closure after laparotomy, using negative pressure therapy (NPT) via a reticulated open-cell foam-based dressing, provides indirect negative pressure to the abdominal wall and viscera. We hypothesize that NPT improves hemodynamic, lung, and renal function as compared with a dressing-covered open abdomen without NPT.

Methods Pigs (25-37 kg) were anesthetized and ventilated. After laparotomy, the superior mesenteric artery was occluded for 30 minutes. The cecum was perforated and a fecal clot was created to induce severe sepsis. Animals received isotonic fluid resuscitation titrated to mean arterial pressure (MAP) > 60 mmHg.

Table 1 (abstract P322). Data from T24 through T48

         MAP       Cardiac output  Plateau pressure  Bladder pressure  Wet/dry       Creatinine    Urine output
         (mmHg)+   (l/min)+        (cmH2O)+          (cmH2O)+          ratio+        (mg/dl)+      (ml/hour)+
NPT      65 ± 1.3  3.00 ± 0.08     19 ± 0.46         11 ± 1.33         5.99 ± 0.20   1.08 ± 0.06   135 ± 17
No NPT   59 ± 2.3  1.70 ± 0.07     23 ± 1.34         18 ± 1.14         8.22 ± 0.67   1.7 ± 0.22    45 ± 11

Data are mean ± SE. +P < 0.05 between groups. MAP and cardiac output reflect hemodynamics; plateau pressure, lung function; wet/dry ratio, intestinal edema; creatinine and urine output, renal function.

The abdomen was closed at the time of injury, then reopened 12 hours later, and the animals were randomized to receive either NPT at -125 mmHg (n = 3) or no NPT (n = 3). Parameters were recorded hourly for 48 hours or until premature death.

Results Hemodynamic, lung, and renal function parameters were similar prior to application of NPT (T0-T11) and improved after placement of the NPT device (Table 1).

Conclusions NPT improved physiologic parameters in a clinical model of ACS. NPT is an effective strategy for the treatment of ACS in a severe sepsis model.

Clinically applicable porcine model of abdominal compartment syndrome

B Kubiak1, S Albert1, G Hutchinson2, K Norbury2, G Nieman1

1SUNY Upstate Medical University, Syracuse, NY, USA; 2KCI USA, Inc., San Antonio, TX, USA

Critical Care 2008, 12(Suppl 2):P323 (doi: 10.1186/cc6544)

Introduction Aggressive resuscitation in disease processes such as sepsis, peritonitis, and bowel ischemia can result in elevated intra-abdominal pressure (IAP), leading ultimately to abdominal compartment syndrome (ACS). Clinically, ACS causes organ dysfunction with oliguria, increased airway pressures, reduced oxygenation, and a fall in cardiac output (CO). There are currently no animal models that adequately mimic the complex pathophysiology associated with ACS. We have developed a clinically applicable porcine model that closely mimics the pathology seen in human patients.

Methods Pigs (n = 3) weighing 25-28 kg were anesthetized and placed on mechanical ventilation. Bladder, venous, systemic and pulmonary arterial catheters were placed for hemodynamic monitoring, infusion of fluids and drugs, blood sampling, and measurement of bladder pressure (IAP). The injury model consists of 'two hits': through a midline laparotomy, the superior mesenteric artery was occluded for 30 minutes then released to create intestinal ischemia/reperfusion injury; and the cecum was perforated, and stool collected (0.5 ml/kg) and mixed with blood (2 ml/kg) to form a fecal clot that was placed in the right lower quadrant of the peritoneal cavity. Following injury the laparotomy was closed and animals received vigorous fluid resuscitation to maintain the mean arterial pressure (>60 mmHg) and urine output (UOP) (>0.5 cm3/kg/hour), and broad-spectrum antibiotics (ampicillin 2 g and metronidazole (Flagyl) 500 mg) were administered. The abdomen was

Figure 1 (abstract P323)


reopened 12 hours after injury and passively drained to mimic current clinical treatment.

Results See Figure 1.

Conclusions This model accurately mimics the development of human ACS as indicated by an increasing IAP and plateau pressure (Ppl) with a decrease in oxygenation (P/F ratio), CO, and UOP.

Relation between ventilatory pressures and intra-abdominal pressure

A Ashraf, JM Conil, B Georges, H Gonzalez, P Cougot, K Samii

Pôle Anesthésie et Réanimation, Toulouse, France Critical Care 2008, 12(Suppl 2):P324 (doi: 10.1186/cc6545)

Introduction The intra-abdominal pressure (IAP) may increase in critically ill ventilated patients, inducing abdominal compartment syndrome with irreversible intra-abdominal organ ischemia. Increases in positive end-expiratory pressure (PEEP) induce an increase in plateau pressure (Pplat) and in intrathoracic pressure, which lead to hemodynamic changes and may also increase IAP by pressure transmission through the diaphragm. The aim of this study was to evaluate the relation between Pplat changes induced by PEEP and IAP.

Methods During a 6-month period, 278 measurements were prospectively performed in 27 ICU patients. Pplat and IAP were measured 20 minutes after changes in the PEEP level. IAP was measured using an intravesical pressure monitoring method, by clamping the Foley urinary tube after injection of 30 ml normal saline under sterile conditions. Statistical analysis was performed using parametric and nonparametric tests, as appropriate, and correlation tests.

Results Twenty-seven patients (22 males, five females) were included, with a mean age of 58.2 years. The overall relation between Pplat and IAP was significant (r2 = 0.143, P < 0.001; Figure 1).

Conclusions Our study shows that ventilatory pressure is a factor in the increase in IAP. In patients at high risk of intra-abdominal hypertension, therefore, IAP monitoring using a vesical pressure method may be useful before and after each PEEP adjustment. References

1. Gracias VH: Abdominal compartment syndrome in the open abdomen. Arch Surg 2002, 137:1298-1300.

2. Biancofiore G, et al.: Postoperative intra-abdominal pressure and renal function after liver transplantation. Arch Surg 2003, 138:703-706.

Figure 1 (abstract P324)

(Scatter plot of IAP against plateau pressure, 10-37.5 cmH2O.)
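The r2 reported in this abstract is the squared Pearson correlation between Pplat and IAP across the paired measurements. A minimal sketch of the calculation, using illustrative paired values rather than the study data:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative paired measurements: plateau pressure (cmH2O), IAP (cmH2O)
pplat = [12, 15, 18, 22, 25, 30]
iap = [8, 9, 12, 11, 14, 16]
r = pearson_r(pplat, iap)
print(round(r * r, 3))  # r2 for these made-up data; the study reported 0.143
```

A modest r2 like the 0.143 reported means PEEP-driven Pplat changes explain only part of the variation in IAP, which is why the authors still recommend direct vesical monitoring.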

Evaluation of the role of noninvasive positive pressure ventilation in prevention of postextubation respiratory failure in high-risk patients

G Hassan Abdel Naby, T Habib, AR Abdel Razik

Alexandria University Faculty of Medicine, Alexandria, Egypt

Critical Care 2008, 12(Suppl 2):P325 (doi: 10.1186/cc6546)

Introduction Unsuccessful extubation (the need for reintubation) occurs in up to 20% of patients within 24-72 hours of planned extubation. Factors that appear to increase the risk are age >70 years, higher severity of illness at weaning onset, use of intravenous sedation, and longer duration of mechanical ventilation prior to extubation [1]. Reintubation is associated with increased hospital stay and mortality [2]. Noninvasive positive pressure ventilation (NIPPV) has been proposed in the management of acute respiratory failure occurring in the postextubation period. The use of NIPPV to prevent postextubation respiratory failure must therefore be considered.

Methods Thirty patients at high risk for postextubation failure were enrolled in this study and divided into two groups. Group A received standard medical therapy just after extubation, while group B received NIPPV just after extubation.

Results Reintubation and NIPPV were applied in 8/15 patients (53.33%) in group A, compared with 2/15 patients (13.33%) in group B. The improvement in oxygen extraction in group B after 48 hours of the study was greater than in group A (25.32 ± 0.69% and 27.89 ± 1.82%, respectively) (P = 0.004). The shunt fraction was significantly different (P = 0.001) after 48 hours between group A and group B (3.55 ± 0.35 and 2.92 ± 0.37, respectively).

Conclusions NIPPV is an efficient means to prevent postextubation respiratory failure in high-risk patients when applied immediately after extubation. References

1. Epstein SK: Endotracheal extubation. Respir Care Clin N Am 2000, 6:321-326.

2. Epstein SK: Effect of failed extubation on the outcome of mechanical ventilation. Chest 1997, 112:186-192.

A randomized controlled trial comparing adaptive support ventilation with pressure-regulated volume control ventilation in weaning patients after cardiac surgery

P Gruber, C Gomersall, P Leung, G Joynt, S Ng, M Underwood

The Chinese University of Hong Kong, New Territories, Hong Kong

Critical Care 2008, 12(Suppl 2):P326 (doi: 10.1186/cc6547)

Introduction Adaptive support ventilation (ASV) is a minute ventilation-controlled mode governed by a closed-loop algorithm. The combination of target tidal volume and respiratory rate is continuously adjusted with the goal of maintaining the patient at isominute ventilation, thus reducing the work of breathing. A recent study demonstrated a reduction in time to extubation in patients ventilated in the ASV mode compared with those ventilated in synchronized intermittent mandatory ventilation (SIMV) followed by a pressure support mode [1]. This might be explained by a delay in switching the patient from SIMV to the pressure support mode. Pressure-regulated volume control (PRVC) with automode is a better comparator as it delivers pressure control breaths in the absence of triggering and automatically switches to pressure support breaths when triggered. We compared ASV with PRVC in the duration of intensive care ventilation in 50 patients after elective coronary artery bypass surgery.

Methods Patients were randomized to either ASV or PRVC on arrival in the ICU. Respiratory weaning progressed through three phases: phase 1 (start of intensive care ventilation to recovery of sustained spontaneous breaths), phase 2 (end of phase 1 to peak airway pressures <15 cmH2O during spontaneous breaths), and phase 3 (T-piece trial). Following a successful T-piece trial, patients were extubated. The primary outcome was the duration of intensive care ventilation. Secondary outcomes were the time from intensive care admission to extubation, duration of phases 1-3, number of patients failing to wean, arterial blood-gas samples and ventilator setting changes made prior to extubation.

Results Forty-eight patients completed the study. The duration of intensive care ventilation was significantly shorter in the ASV than the PRVC group (165 (120-195) vs 480 (360-510) min; P < 0.001). The observed reduction in intubation time was mainly a result of shortening of phase 1 (21 (6-41) min in the ASV group vs 60 (24-153) min in the PRVC group) and of phase 2 (147 (91-171) min in the ASV group vs 357 (163-458) min in the PRVC group) (P < 0.001). Seventeen patients in the PRVC group and three patients in the ASV group did not reach the protocol criteria for a T-piece trial within 8 hours but were successfully extubated. There were no significant differences in the number of arterial blood-gas samples taken or ventilator setting changes between the groups.

Conclusions ASV is associated with earlier extubation, without an increase in clinician intervention, when compared with PRVC in patients undergoing uncomplicated cardiac surgery. Reference

1. Sulzer CF, et al.: Anesthesiology 2001, 95:1339-1345.

Effectiveness of a spontaneous breathing trial with a low-pressure support protocol for liberation from the mechanical ventilator in a general surgical ICU

K Chittawatanarat

Chiang Mai University, Chiang Mai, Thailand

Critical Care 2008, 12(Suppl 2):P327 (doi: 10.1186/cc6548)

Introduction Discontinuing patients from mechanical ventilation is an important problem in ICUs. The aim of this study was to compare the effectiveness of a spontaneous breathing trial with a low-pressure support protocol against a liberal, nonprotocol-directed method.

Methods We conducted a retrospective cohort study involving 577 patients who were considered appropriate for weaning from mechanical ventilation on a general surgical ICU in an academic university-affiliated hospital between 1 July 2003 and 30 June 2007. Two hundred and twenty-two patients (liberal group) underwent a weaning process at their physicians' discretion. Three hundred and fifty-five patients underwent a once-daily spontaneous breathing trial with a low-pressure support protocol. Patients assigned to this protocol had the pressure support level decreased to 5-7 cmH2O for up to 2 hours each day. If signs of intolerance occurred, assisted control ventilation was reinstituted for 24 hours. Patients who tolerated a 2-hour trial without signs of distress were extubated. We collected demographic data, cause of ICU admission, APACHE II score at the arranged time of weaning, weaning process time, ventilator days and ICU length of stay.

Results There were no statistical differences between the liberal and protocol groups in age (59.2 ± 19.3 vs 55.6 ± 19.8 years; P = 0.03), gender (male 74.3 vs 67.9%; P = 0.2) or APACHE II score at the arranged time of weaning (14.7 ± 7.4 vs 15.3 ± 6.3; P = 0.2). The mean duration of the weaning process was 72.1 ± 101.3 hours in the liberal group and 7.7 ± 16.8 hours in the protocol group (P < 0.01). The mean ventilator days and length of ICU stay were statistically different between the liberal and protocol groups (5.7 ± 2.8 vs 2.7 ± 2.3; and 7.3 ± 7.1 vs 4.4 ± 3.4 days, respectively; P < 0.01) (Figure 1).

Figure 1 (abstract P327)

Conclusions The spontaneous breathing trial with a low-pressure support protocol was effective for liberation from the mechanical ventilator in the general surgical ICU. Reference

1. MacIntyre N: Discontinuing mechanical ventilatory support. Chest 2007, 132:1049-1056.

Predicting success in weaning from mechanical ventilation

S Vieira1, A Savi2, C Teixeira3, L Nasi1, C Trevisan1, A Guntzel1, R Oliveira2, R Cremonesi2, T Tonietto2, J Hervé1, S Brodt2, F Alves2, J Horer3, N Silva2

1Hospital de Clínicas de Porto Alegre, Porto Alegre, Brazil; 2Hospital Moinhos de Vento, Porto Alegre, Brazil; 3Complexo Hospitalar Santa Casa, Porto Alegre, Brazil

Critical Care 2008, 12(Suppl 2):P328 (doi: 10.1186/cc6549)

Introduction Failure of weaning from mechanical ventilation (MV) is frequent (25-30%) and associated with high mortality. Indexes predicting success can be clinically helpful; however, their predictive capacity can be low. The goal of this study was to evaluate weaning predictor indexes in patients during weaning from MV.

Methods We included patients under MV for at least 48 hours who were submitted to a spontaneous breathing trial (SBT) for 30 minutes, extubated according to clinical decision, and followed for 48 hours. They were evaluated concerning age, sex, clinical characteristics, length of hospital and ICU stay, and length of MV. At the first and 30th minutes of the SBT we analyzed arterial blood gases and hemodynamic and respiratory parameters such as respiratory rate (f), tidal volume (VT), rapid shallow breathing index (f/VT), and maximal inspiratory and expiratory pressures. Comparisons were made between two groups of patients, success vs failure, defining failure as return to MV within the first 48 hours.
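The rapid shallow breathing index used here is simply respiratory rate divided by tidal volume expressed in litres. A minimal sketch, with illustrative input values chosen near the group means reported in the Results:

```python
def rsbi(resp_rate_bpm, tidal_volume_ml):
    """Rapid shallow breathing index f/VT (breaths/min per litre)."""
    return resp_rate_bpm / (tidal_volume_ml / 1000.0)

# Illustrative values, not individual patient data:
print(rsbi(24, 430))   # approximately 56, in the range of the success group
print(rsbi(28, 333))   # approximately 84, in the range of the failure group
```

Higher values flag rapid, shallow breathing; classically, f/VT above roughly 105 predicts weaning failure, while both group means in this study fall well below that cutoff.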

Results Four hundred and fifty-eight patients were studied. The overall mortality rate was 14%. Return to MV occurred in 21%. The most important differences between the success and failure groups were: lower age (56 ± 19 vs 62 ± 17 years, P < 0.01), lower mortality rate (10% vs 31%, P < 0.001), shorter length of ICU stay (15 ± 12 vs 19 ± 13 days, P < 0.01), higher oxygen saturation at the first and 30th minutes (97 ± 3 vs 96 ± 6 and 95 ± 4 vs 94 ± 4, P < 0.05), lower f at the first and 30th minutes (24 ± 6 vs 26 ± 6 bpm and 25 ± 6 vs 28 ± 7 bpm, P < 0.001), lower f/VT at the first minute and principally at the 30th minute (56 ± 32 vs 69 ± 38 and 62 ± 39 vs 84 ± 55, P < 0.001), and a lower increase in f/VT during the test (4 ± 28 vs 12 ± 38, P < 0.05).

Conclusions In this group of patients a great number failed the weaning process and, as expected, showed a higher mortality rate. Parameters related to failure were higher age, longer ICU stay, lower oxygenation, higher f and f/VT, and a greater increase in f/VT during the test.

Acknowledgements Members of the Weaning Study Group: R Wickert, LG Borges, ME Alves, ACT Silva, R Condessa, CE Hahn, L Cassel, MB Blom, R Zancanaro, F Callefe, KB Pinto, K Hartmann, P Pinheiro, and ES Oliveira. References

1. Frutos-Vivar F, et al.: Chest 2006, 130:1664-1671.

2. Tanios MA, et al.: Crit Care Med 2006, 34:2530-2535.

Is the threshold useful in accelerating weaning from mechanical ventilation?

R Condessa, S Vieira, J Brauner, A Saul, A Silva, M Baptista, L Borges, M Moura, M Alves, F Kutchak, L Biz

Hospital de Clinicas de Porto Alegre, Brazil

Critical Care 2008, 12(Suppl 2):P329 (doi: 10.1186/cc6550)

Introduction The Threshold inspiratory muscle training device can be used as a physiotherapy tool to increase respiratory muscle strength. This effect can be useful in weaning patients; however, there are still controversies concerning its advantages during weaning from mechanical ventilation (MV). This study aims to evaluate the effects of Threshold training in this setting.

Methods Patients under MV for more than 48 hours and prone to weaning were randomly assigned to the control group or to the threshold group (trained twice daily). They were followed until extubation, tracheotomy or death. All cardiorespiratory variables, maximal inspiratory and expiratory pressures (MIP and MEP), length of weaning and success or failure were registered. Statistical analysis was performed using ANOVA, Mann-Whitney U test and chi-square test, where appropriate. A level of 0.05 was considered significant.

Results Eighty-six patients were studied (52% men, mean age 63 ± 17 years, 48% with chronic obstructive pulmonary disease). No differences were observed when comparing initial versus final cardiorespiratory variables in either group, with the exception of the MIP (from -33.72 ± 13.5 cmH2O to -40.81 ± 12.67 cmH2O in the threshold group and from -37.67 ± 10.49 cmH2O to -34.19 ± 10.85 cmH2O in the control group, P < 0.001), the MEP (from 25.47 ± 12.48 cmH2O to 29.65 ± 12.02 cmH2O in the threshold group and from 29.65 ± 11.97 cmH2O to 26.86 ± 11.6 cmH2O in the control group, P < 0.05) and tidal volume (from 386.16 ± 236.56 ml to 436.16 ± 228.39 ml in the threshold group and from 361.91 ± 168.81 ml to 357.14 ± 121.35 ml in the control group, P < 0.05). No differences were observed in the length of weaning (1.36 days in the threshold group versus 1.98 days in the control group, P > 0.05) or weaning success (83.7% in the threshold group versus 76.7% in the control group, P > 0.05).

Conclusions The threshold during weaning from MV can cause an increase in MIP, MEP and tidal volume. In this group of patients, however, it was not associated with a decrease in the length of weaning or with an increase in weaning success.

Reference

1. Caruso P, et al.: Clinics 2005, 60:479-484.

Addition of a spontaneous awakening trial improves outcome in mechanically ventilated medical ICU patients

H Mann1, M Lupei1, C Weinert2

1University of Minnesota College of Pharmacy and 2University of Minnesota College of Medicine, Center for Excellence in Critical Care, Minneapolis, MN, USA

Critical Care 2008, 12(Suppl 2):P330 (doi: 10.1186/cc6551)

Introduction Delayed discontinuation of mechanical ventilation is associated with increased mortality. The Sixth International Consensus Conference on Intensive Care Medicine recommends spontaneous breathing trials (SBT) as best practice for mechanical ventilation weaning. Daily spontaneous awakening trials (SAT) are also correlated with reduced ventilation duration and ICU length of stay. The aim of our study was to implement the SBT and SAT as best practices in the ICU and to assess the outcome of using the SAT and SBT combined.

Methods We collected information on medical ICU patients for 12 weeks in 2006 after implementing a SBT protocol and in 2007 after adding a SAT protocol to the SBT. We compared the likelihood of passing the SBT, extubation after a complete SBT, reasons for not extubating after a passed SBT, and the median ventilator days. Statistical comparison included the chi-square test and Mann-Whitney test (two-tailed with P < 0.05 considered significant).

Results Fifty-three patients were enrolled in the SBT-only group and 44 patients were included in the SAT + SBT group. In the SAT + SBT group the likelihood of passing both a safety screen (38% vs 47%; P < 0.05) and a 30-minute SBT (73% vs 85%, P < 0.05) was lower than in the SBT-only group. The decreased likelihood of passing the safety screen in the SAT + SBT group was associated with an increased incidence of physician override of the protocol. The number of SBTs performed decreased from 6.1 to 5.7 per patient with the addition of the SAT. The likelihood of extubation following a complete SBT increased in the SAT + SBT group versus the SBT-only group (42% versus 29%, P = 0.143). The likelihood of not extubating following a passed SBT because of sedation was decreased in the SAT + SBT group (10% vs 36%, P = 0.002). The median number of ventilator days was reduced in the SAT + SBT group versus the SBT-only group (5 days versus 6 days, P = 0.18).

Conclusions Addition of a SAT protocol to an SBT best practice protocol in the medical ICU improved patient outcome by decreasing the days on the ventilator and increasing the likelihood of extubation.

Early versus late tracheotomy in the ICU

K Mitka, V Soulountsi, I Houris, E Massa, G Vasiliadou, M Bitzani

G. Papanikolaou, Thessaloniki, Greece

Critical Care 2008, 12(Suppl 2):P331 (doi: 10.1186/cc6552)

Introduction This study was conducted to compare early versus late tracheotomy in ICU patients.

Methods A total of 103 patients (81 men, 22 women) were included in this study. They were classified into two groups: Group A - early tracheotomy (<7 days) included 36 patients (mean age 55.14 ± 17.698 years), and Group B - late tracheotomy (>7 days) included 67 patients (mean age 56.55 ± 19.078 years). We studied the impact of the timing of tracheotomy on the duration of mechanical ventilation, duration of weaning, length of stay in the ICU (LOS), outcome at 28 days, incidence of ventilator-associated pneumonia (VAP), and days of sedation administration. Severity of illness and organ dysfunction were assessed by the APACHE II, SAPS and SOFA scores. Statistical analysis was performed using the Pearson χ2 test, independent t test, Levene's test, Mann-Whitney U test, and paired t test. The significance criterion was P < α, with significance level α = 5%.

Results The two groups were comparable in terms of age, APACHE II score and SAPS. There was a statistically significant difference in the admission SOFA score (P << α), the SOFA score on the day of tracheotomy (P = 0.003) and the maximum SOFA score (P << α), as well as in the total days of mechanical ventilation (Group A 18.36 ± 12.059 vs Group B 24.19 ± 14.27, P = 0.05) and the LOS (Group A 16.75 ± 7.038 vs Group B 22.51 ± 10.726, P = 0.007). No difference was observed regarding the days of weaning after tracheotomy (Group A 7.56 ± 6.135 vs Group B 9.19 ± 9.24) or mortality (25% vs 23.9%, respectively). The prevalence of VAP was evaluated in 58 patients; VAP developed in 23.1% of patients in Group A vs 76.9% in Group B (P = 0.099). There was no difference in the day VAP was diagnosed (P = 0.959). A significant difference in the days of sedative administration before and after tracheotomy was observed in both groups (before: 7.49 ± 5.34 days, after: 4.76 ± 8.05 days, P = 0.005). Days of sedative administration before tracheotomy were significantly different (Group A 4.32 ± 2.083 vs Group B 9 ± 5.690, P = 0.003).

Conclusions Our results reinforce the findings of previous studies showing that early tracheotomy significantly decreases the duration of mechanical ventilation, ICU LOS and total days of sedative administration, and may provide a benefit in reducing the occurrence of VAP.

Early tracheotomy versus prolonged intubation in a medical ICU

B Charra, A Hachimi, A Benslama, S Motaouakkil

Ibn Rochd University Hospital, Casablanca, Morocco

Critical Care 2008, 12(Suppl 2):P332 (doi: 10.1186/cc6553)

Introduction The main purpose of our study was to assess whether early tracheotomy, compared with prolonged intubation, reduces the duration of ventilation, the frequency of nosocomial pneumonia, the duration of hospitalization and mortality.

Methods A retrospective, comparative study of two groups of patients presenting with neurologic or respiratory pathology and requiring mechanical ventilation for more than 3 weeks. The study covered 7 years and included 60 patients divided into two groups: a tracheotomy group (TG, n = 30), in which the tracheotomy was performed between the eighth and the 15th day, after a first period of intubation; and an intubation group (IG, n = 30), in which the patients remained intubated throughout their hospitalization until extubation or death. We determined the duration of ventilation, the frequency of nosocomial pneumonia, the mean duration of hospitalization and mortality. The statistical analysis was based on the chi-square test for qualitative variables and on Student's t test for quantitative variables. P < 0.05 was considered significant. The two groups contained a similar number of cases with the same diagnoses, and were comparable in age, sex and severity scores (SAPS II and APACHE II).

Results There was a statistically significant decrease in the total duration of mechanical ventilation for the TG, 27.03 ± 3.31 days versus 31.63 ± 6.05 days for the IG (P = 0.001). However, there was no significant difference between the two groups in the frequency of nosocomial pneumonia (P = 0.18). The mean duration of hospitalization did not differ between the two groups: 30.96 ± 9.47 days for the TG versus 34.26 ± 9.74 days for the IG (P = 0.10). There was also no statistically significant difference between the two groups in mortality: 26.7% in the TG versus 46.7% in the IG (P = 0.10).

Conclusions Early tracheotomy in ICU patients appears to decrease the duration of ventilation and to delay the onset of nosocomial pneumonia, without modifying the mean duration of ICU stay or mortality.

Ciaglia Blue Dolphin: a new technique for percutaneous tracheostomy using balloon dilation

C Byhahn1, M Zgoda2, O Birkelbach3, C Hofstetter1, T Gromann3

1JW Goethe Medical School, Frankfurt, Germany; 2Carolinas Medical Center, Charlotte, NC, USA; 3German Heart Institute, Berlin, Germany

Critical Care 2008, 12(Suppl 2):P333 (doi: 10.1186/cc6554)

Introduction Percutaneous tracheostomy (PcT) has reached a high level of safety; however, significant perioperative complications still occur. The most feared complication is posterior tracheal wall injury during insertion of the dilational device into the trachea while applying downward pressure. Ciaglia Blue Dolphin (CBD) is a new technique for PcT using radial balloon dilation, thereby eliminating downward pressure during insertion and dilation.

Methods An observational clinical trial was conducted in 20 adult ICU patients undergoing elective PcT with the CBD technique (Cook Inc., Bloomington, IN, USA). After a 15 mm skin incision, tracheal puncture, and predilation of the puncture channel with a 14 F punch dilator, a balloon-cannula apparatus was passed over a guidewire until the tip of the balloon, mounted at the distal end of the apparatus, was seen in the trachea. The balloon was inflated with saline solution to 11 atm for a few seconds, then deflated, and the 8.0 mm ID tracheostomy tube, preloaded onto a customized stylet that formed the proximal portion of the apparatus, was placed by advancing the entire apparatus further into the trachea. The apparatus and guidewire were then removed, leaving only the cannula in place.

Results Twenty patients underwent CBD PcT under bronchoscopic control. All procedures were successfully completed in a mean time of 3.8 ± 1.7 minutes. Even though six patients were under continuous therapeutic anticoagulation, blood loss was classified as 'none' (n = 14), 'marginal' (n = 5), or 'moderate' (n = 1). In the latter patient, bleeding occurred from a subcutaneous vein but ceased without further intervention once the tracheostomy tube was in place. No other complications of either a medical or technical nature were noted.

Conclusions Based on the data of this first clinical report, the new CBD device allows for quick, reliable, and safe dilation and subsequent cannula placement with one single apparatus. Even though the operators had no previous experience with CBD, no complications were noted. Randomized trials now need to be conducted to confirm the promising results of our study and to determine both the advantages and disadvantages of the CBD technique compared with other techniques of PcT.

Reference

1. Zgoda M, Berger R: Chest 2005, 128:3688-3690.

Percutaneous dilatation tracheostomy in critically ill patients with documented coagulopathy

P Kakar, D Govil, S Gupta, S Srinivasan, P Mehta, A Malhotra, O Prakash, D Arora, S Das, P Govil

Max Super Speciality Hospital, New Delhi, India

Critical Care 2008, 12(Suppl 2):P334 (doi: 10.1186/cc6555)

Introduction Percutaneous tracheostomy techniques are gaining greater popularity in ICUs. Refinement of the percutaneous tracheostomy technique has made this a straightforward and safe procedure in appropriately selected patients. Generally, coagulopathy is a relative contraindication for surgical tracheotomy. We sought to determine its usage in high-risk patients with documented coagulopathy.

Methods Twenty critically ill patients with coagulopathy (International Normalized Ratio (INR) > 1.5) underwent elective percutaneous tracheostomy using a Portex percutaneous tracheostomy kit (Ultraperc). The Ciaglia Blue Rhino single-stage dilator set was used in all cases and the same intensivists performed all of the tracheotomies.

Results There were 17 patients with an INR > 1.5, two patients were on a heparin drip, and one patient had a platelet count <20,000. One patient included in the study met the requirements for two categories, with a platelet count of 17,000 and an INR of 1.7. The procedural times ranged from 3 to 5 minutes. Apart from minor bleeding episodes during and after the procedures in three patients, which were controlled promptly, no other complications occurred; the average estimated blood loss was around 5-10 ml.

Conclusions In trained hands and with careful precautions, we believe that percutaneous tracheostomy is safe even in patients with documented coagulopathy.

Reference

1. Aoyagi M: Percutaneous tracheostomy to the patient with coagulopathy. Jpn J Anesthesiol 2005, 54:153-155.

An audit of perioperative staffing and complications during percutaneous and surgical tracheostomy insertion

P O'Neil, D Noble

Aberdeen Royal Infirmary, Aberdeen, UK

Critical Care 2008, 12(Suppl 2):P335 (doi: 10.1186/cc6556)

Introduction Percutaneous tracheostomy (PDT) has been established as a safe technique in the critically ill, with a complication rate equivalent to that of surgical tracheostomy (ST). However, PDT insertion may result in unrecognised hypercarbia, and has been associated with an increased perioperative complication rate. We therefore decided to audit current practice within our ICU.

Methods Over a 3-month period, prospective data were collected on 25 patients within a 14-bed regional ICU. A single observer collected data on the staff present, cardiovascular recordings and end-tidal carbon dioxide.

Results PDT was performed on 15 patients within the ICU, and ST was performed on 10 patients. The indication for tracheostomy was prolonged mechanical ventilation in 16 patients, poor neurological status in eight patients and sputum retention in one patient. Cardiovascular instability, defined as a greater than 20% deviation from normal blood pressure, occurred in nine (60%) patients during PDT. For ST, eight (80%) patients were cardiovascularly unstable. Hypercarbia, as detected by an end-tidal CO2 rise of more than 20%, occurred in six (40%) patients during PDT and in one (10%) patient during ST. See Table 1.

Table 1 (abstract P335)

Staff involved in PDT and ST

                      PDT (%)      ST (%)
Assistants >2         8 (53)       10 (100)
Operator SpR3+        12 (80)      4 (40)
Anaesthetist SpR3+    6 (40)       9 (90)

Conclusions This audit has shown that assistance for PDT is inferior to that provided in the operating theatre, and this has potential safety implications, particularly when junior staff are anaesthetising. Perioperative complication rates were similar overall, confirming the safety of PDT as a technique. Hypercarbia occurred relatively frequently during PDT, however, which may have deleterious effects in the brain-injured patient. From this audit, we would recommend that within our ICU more attention be focused on adequate staffing during the performance of this operative procedure on critically ill patients. Also, end-tidal carbon dioxide should be monitored carefully and treated if elevated.

Reference

1. Rana S, et al.: Tracheostomy in critically ill patients. Mayo Clin Proc 2005, 80:1632-1638.

Severe airway compromise after percutaneous dilatational tracheostomy

AT Bos, BI Cleffken, P Breedveld

University Hospital Maastricht, The Netherlands

Critical Care 2008, 12(Suppl 2):P336 (doi: 10.1186/cc6557)

Introduction Percutaneous dilatational tracheostomy (PDT) is considered a safe alternative to open surgical tracheotomy, with comparable complication rates. Major complications are reported to be <1.5%, with a mortality rate of 0.3% [1].

Methods A 52-year-old male was admitted to our ICU following craniotomy for an intracranial hemorrhage. His prior history revealed hypertension and morbid obesity (BMI 46).

PDT was performed on the fourth day after intubation, because of persisting low Glasgow Coma Score and failure to clear secretions. PDT was performed with a Seldinger technique. With bronchoscopic guidance, endotracheal placement was confirmed. Initial airway pressure was high, but normalized quickly. Although oxygenation was maintained, saturation was 84% at the end of the procedure.

Results After 3 days a subcutaneous swelling developed around the tracheostomy tube (TT), compromising the airway. An abscess was suspected but could not be confirmed by stab incision or CAT scan. A rise in airway pressure with loss of tidal volume was seen over the next hours. On oral and transtracheostomy bronchoscopy, a diffusely swollen larynx with narrowing of the proximal trachea was seen. The TT was exchanged for a Bivona TT. In retrospect, the CAT scan revealed a dislocated cuff, visualized as a double bubble. This was caused by tissue swelling gradually enlarging the distance between skin and trachea. In this morbidly obese patient the standard TT was too short, so dislocation could occur. A second CAT scan confirmed an adequate position of the Bivona TT. After 1 week, a TT with increased skin-to-trachea length was inserted and the patient was successfully weaned from ventilation.

Conclusions Since the complication rate is increased when performing PDT in the obese [2], we suggest the following. First, PDT should be guided by fiberoptic bronchoscopy. Second, a TT of adequate diameter and length should be used; inadequate skin-to-trachea length of the TT can result in improper placement, with cuff dislocation not necessarily resulting in an air leak during ventilation. Third, an experienced team should perform the procedure: one person performing the bronchoscopy, another placing the TT.

References

1. Marx WH, et al.: Chest 1996, 110:762-766.

2. Byhahn C, et al.: Anaesthesia 2005, 60:12-15.

Incidence of postoperative sore throat and cough: comparison of a polyvinylchloride tube and an armoured tube

J Saleem, S Athar

Hameed Latif Hospital, Lahore, Pakistan

Critical Care 2008, 12(Suppl 2):P337 (doi: 10.1186/cc6558)

Introduction One of the common postoperative complications is sore throat and cough. A number of factors may be responsible, namely direct trauma from the airway devices and the endotracheal tube, and mucosal damage caused by pressure from a cuff hyperinflated by N2O. This complication is minor but distressing to an otherwise healthy patient, and different strategies have been proposed to prevent it. They include changes in the technique of intubation, in the endotracheal tube material, and lower cuff pressures.

Methods Eighty female patients of ASA status I and II, scheduled for elective caesarean sections, were divided randomly into two groups. See Table 1. The same anaesthetist performed all the intubations and extubations. The patients were interviewed on the day of operation and on the following 2 days about cough and sore throat. See Table 2.

Results The frequency of postoperative sore throat and cough was greater with the use of the polyvinylchloride endotracheal tube.

Conclusions The study demonstrates that the use of a polyvinylchloride endotracheal tube was associated with a significantly higher incidence of postoperative sore throat and cough.

Table 1 (abstract P337)

Demographic data

Tube                 Number    Mean age (years)    Mean weight (kg)
Polyvinylchloride    40        25.2                67.27
Armoured             40        28.8                65.95

Table 2 (abstract P337)

Frequency of cough and sore throat

Tube                 Cough         Sore throat    P value
Armoured             10 (12.5%)    26 (32.5%)     0.0704
Polyvinylchloride    26 (32.5%)    58 (72.5%)     0.0129

References
1. Mandoe H, Nikolajsen L: Sore throat after endotracheal intubation. Anesth Analg 1992, 74:897-900.

2. Stout DM, et al.: Correlation of endotracheal size with sore throat and hoarseness following general anaesthesia. Anesthesiology 1987, 67:419-421.

3. Hahnel J, Treiber H: A comparison of different endotracheal tubes. Tracheal cuff seal, peak centering and the incidence of postoperative sore throat [in German]. Anaesthesist 1993, 42:232-237.

4. Karasawa F, Mori T: Profile soft-seal cuff, a new endotracheal tube, effectively inhibits an increase in cuff pressure through high compliance rather than low diffusion of nitrous oxide. Anesth Analg 2001, 92:140-144.

5. Ouellette RG: The effect of nitrous oxide on the laryngeal mask airway. AANA J 2000, 68:411-414.

6. Klemola UM: Postoperative sore throat: effect of lignocaine jelly and spray with endotracheal intubation. Eur J Anaesthesiol 1988, 5:391-399.

7. Soltani HA, Aghadavoudi O: The effect of different lignocaine application methods on postoperative cough and sore throat. J Clin Anesth 2002, 14:15-18.

Efficacy of postprocedural chest radiographs after percutaneous dilational tracheostomy

M Jenkins-Welch1, C Whitehead2

1University Hospital Cardiff, Cardiff, UK; 2The Walton Centre for Neurology and Neurosurgery, Liverpool, UK

Critical Care 2008, 12(Suppl 2):P338 (doi: 10.1186/cc6559)

Introduction Both retrospective and prospective cohort studies of routine chest radiographs after uncomplicated percutaneous dilational tracheostomy have shown a low pick-up rate of clinically important new data [1,2]. It was felt that the true level of complications on insertion was much lower than the reported rate of between 3% and 18% [3]. We decided to undertake a retrospective analysis of the last 100 percutaneous tracheostomies in our ICU population, looking at the utility of a postprocedural radiograph in terms of new data added.

Methods Percutaneous tracheostomies were performed consistently by the Portex Blue Rhino™ (Portex, UK) dilational method under direct bronchoscopic control. At the end of the procedure the tip-carina distance was measured with the fibrescope and recorded. The bronchoscope logbook was examined to identify patients. Patients were excluded if they were aged under 18 or could not be identified on the electronic radiographic database. The report on the postprocedural radiograph was compared with the previous report for data that could not be detected clinically or bronchoscopically.

Results Two hundred and two records were examined to give 100 procedures. Of these, 89 could be identified on the radiology database. Eighty-three reports (93.25%) showed no new data. In three cases the tube tip was reported as close to the carina, which was not correct on direct vision. No radiograph showed any serious complication of the procedure.

Conclusions In this series the pneumothorax rate was 0%, and over 93% of radiographs added no new clinical data. This evidence does not support the use of a routine radiograph, and we recommend one only if clinically indicated.

References

1. Datta D, Onyirimba F, McNamee MJ: The utility of chest radiographs following percutaneous dilational tracheostomy. Chest 2003, 123:1603-1606.

2. Haddad SH, Alderwood AS, Arabi YM: The diagnostic yield and clinical impact of a chest X-ray after percutaneous dilation tracheostomy: a prospective cohort study. Anaesth Intensive Care 2007, 35:393-397.

3. Bodenham A, Diamet R, Cohen A, et al.: Percutaneous dilational tracheostomy: a bedside procedure on the intensive care unit. Anaesthesia 1991, 46:570-572.

Tracheal wall pressures in the clinical setting: comparing Portex Soft Seal and Lotrach cuffs

R Santhirapala, MC Blunt, A Fletcher, PJ Young

Queen Elizabeth Hospital, Kings Lynn, UK

Critical Care 2008, 12(Suppl 2):P339 (doi: 10.1186/cc6560)

Introduction The tracheal wall pressure (TWP) exerted by a tracheal tube cuff should normally be kept between 20 and 30 cmH2O. This protects against mucosal injury whilst allowing ventilation without an audible air leak [1]. The Portex Soft Seal high-volume low-pressure (HVLP) cuff has a working intracuff pressure of 30 cmH2O, providing a safe TWP of the same value because there is no tension in the cuff wall material. There are, however, folds within the cuff wall that allow passage of subglottic fluid to the tracheobronchial tree below, increasing the risk of ventilator-associated pneumonia [2]. The Lotrach endotracheal tube was designed to prevent this leakage at a TWP equivalent to that of correctly inflated HVLP cuffs [3]. Each Lotrach cuff is individually calibrated to transmit only 30 cmH2O to the tracheal wall, yet because of the lack of folds in the cuff wall it has been shown to prevent aspiration of subglottic contents [4]. Although the cuff has been extensively tested in benchtop models and pig tracheas, we wished to demonstrate that the Lotrach cuff has a sealing pressure, and therefore TWP, identical to that of the HVLP cuff in normal clinical practice.

Methods One hundred and two ventilated patients were intubated with either the Lotrach (n = 54) or the Portex Soft Seal (n = 48) tube. Both the Lotrach and Portex Soft Seal cuffs were inflated to their working pressures. Whilst undertaking staged recruitment manoeuvres (up to 40 cmH2O), the positive end-expiratory pressure at which laryngeal air leak occurred was noted.

Results The seal pressures (TWP) are presented in Table 1.

Table 1 (abstract P339)

Seal pressures for the Lotrach and Portex Soft Seal cuffs

Type of tube          Number of measurements    Mean (SD) TWP (cmH2O)
Portex (30 cmH2O)     73 (in 54 patients)       32.4 (3.0)
Lotrach (80 cmH2O)    100 (in 48 patients)      30.0 (3.8)

Conclusions Both the Portex Soft Seal and Lotrach cuffs exert an equal and safe TWP when inflated to their recommended working pressures.

References

1. Seegobin RD, et al.: BMJ 1984, 288:965-968.

2. Young PJ, et al.: BJA 1997, 78:557-562.

3. Young PJ, et al.: Med Eng Phys 2003, 25:465-473.

4. Young PJ, et al.: Anaesthesia 1999, 54:559-563.

Intraluminal measurement probe increases resistance of pediatric endotracheal tubes

J Spaeth, D Steinmann, S Schumann, J Guttmann

University Hospital of Freiburg, Germany

Critical Care 2008, 12(Suppl 2):P340 (doi: 10.1186/cc6561)

Introduction During mechanical ventilation the resistance of the endotracheal tube (ETT) causes a noticeable pressure drop between the airway pressure and the tracheal pressure. Analysis of lung mechanics requires knowledge of the tracheal pressure.

Figure 1 (abstract P340)

Besides methods for calculation of the tracheal pressure [1,2], direct measurement of the tracheal pressure has been suggested [3]. We hypothesized that the measurement probe significantly increases the ETT's resistance and is therefore inappropriate for continuous monitoring of the intratracheal pressure in pediatric ETTs. In a laboratory study we investigated the pressure drop across pediatric ETTs with and without an intraluminal sensor probe.

Methods A physical model consisting of a special tube connector for insertion of the sensor probe (Samba Preclin 420 LP; Samba Sensors, Vastra Frolunda, Sweden), anatomically curved ETTs of inner diameter 2.0-4.5 mm, and an artificial trachea was ventilated with sinusoidal gas flow with an amplitude of 240 ml/s and a ventilation rate ranging from 20 to 50 cycles/min. The airway pressure was measured at the proximal end of the ETT, and the tracheal pressure at the distal end.

Results We found that placement of the intraluminal sensor significantly increased the pressure drop across the ETT (P < 0.05) for all sizes of ETT. Figure 1 shows the pressure-flow relationship of a 2-mm-ID tube. The relative increase of this pressure drop caused by the intraluminal sensor was more prominent for smaller ETTs.

Conclusions Measurement of tracheal pressure using intraluminal sensors results in an increased ETT resistance and thus in an additional increase in the work of breathing. We conclude that direct tracheal measurement is inappropriate for continuous bedside monitoring of tracheal pressure in small children.

References

1. Guttmann J, et al.: Crit Care Med 2000, 28:1018-1026.

2. Jarreau PH, et al.: J Appl Physiol 1999, 87:36-46.

3. Sondergaard S, et al.: Pediatr Res 2002, 51:339-345.

Effect of telemedicine for a prehospital suburban emergency medical service

KH Lee, YK Kim, SO Hwang, H Kim

Wonju College of Medicine, Wonju, Republic of Korea

Critical Care 2008, 12(Suppl 2):P341 (doi: 10.1186/cc6562)

Introduction Telemedicine systems from ambulance to hospital are not yet widespread in emergency medical service (EMS) systems worldwide. In this study we investigated the effect of telemedicine from ambulance to hospital in a suburban EMS.

Methods From June 2007 to October 2007, 2,934 patients were enrolled in our study. Emergency patient information from the ambulance was transferred to the emergency medical information center and the emergency center by a code-division multiple access (CDMA) transfer system. In the emergency medical information center, the patient data were stored and analyzed. The transferred data were the patient's ECG, blood pressure, respiratory rate, pulse oximetry, and body temperature. We analyzed the effect of using the telemedicine system in our suburban EMS.

Results Of the 2,934 patients, 351 patients (12%) used the telemedicine system from ambulance to hospital (group 1). The other 2,583 patients (88%) did not use the telemedicine system (group 2). The rate of medical control was higher in group 1 (100%) than in group 2 (0%). Patient severity was higher in group 1 than in group 2. The prehospital time to treatment was longer in group 2 (6.3 ± 5.3 min) than in group 1 (5.6 ± 4.7 min). The transfer time was longer in group 1 (21 ± 10.4 min) than in group 2 (15.7 ± 8.9 min). Telemedicine use was higher among paramedics (24.6%) than among EMT-intermediates (9.6%) or EMT-basics (4.0%).

Conclusions Our telemedicine system from ambulance to hospital is an effective system for medical control and prehospital care in a suburban EMS.

References

1. Galli R, Keith J, McKenzie K, et al.: TelEmergency: a novel system for delivering emergency care to rural hospitals. Ann Emerg Med 2007. [Epub ahead of print]

2. Gutierrez G: Medicare, the internet, and the future of telemedicine. Crit Care Med 2001, 29(8 Suppl):N144-N150.

An improved Boussignac device for the delivery of noninvasive continuous positive airway pressure: the SUPER-Boussignac

G Bellani1, G Foti2, E Spagnolli1, M Greco1, L Castagna1, V Scaravilli1, N Patroniti1, A Pesenti1

1Milan-Bicocca University, Monza, Italy; 2San Gerardo Hospital, Monza, Italy

Critical Care 2008, 12(Suppl 2):P342 (doi: 10.1186/cc6563)

Introduction The purpose of this work was to test, in a bench study, the performance of a modified Boussignac continuous positive airway pressure (CPAP) system aimed at avoiding the drop in inspired oxygen fraction (FiO2) during high patient inspiratory peak flows.

Methods We modified a Boussignac system (Bssc), inserting between the valve and the face mask a T-piece connected to a 1.5 l reservoir balloon, supplemented with oxygen independently of the valve itself. During inspiration, the patient inhales oxygen from the reservoir, diminishing the blending with atmospheric air and keeping the FiO2 higher. The performance of the system was evaluated in a bench study, applying a CPAP face mask to a plexiglass cylinder connected to an active lung simulator (ASL 5000; IngMar Medical, USA) generating tidal volumes with increasing inspiratory peak flow rates (V̇insp). Two positive end-expiratory pressure (PEEP) levels were tested for the Bssc and for the SUPER-Boussignac with 10 l/min (SB10) and 30 l/min (SB30) supplementary oxygen flows. The FiO2 was continuously recorded, and the lowest value reached during each tidal volume was averaged over 20 breaths.

Results With all systems the FiO2 increased with increasing PEEP levels and decreased at higher V̇insp. SB10 and, to a greater extent, SB30 allowed one to reach greater FiO2 values, and decreased the drop in FiO2 due to increasing V̇insp. See Tables 1 and 2.

Conclusions The SUPER-Boussignac is simple and effective in increasing (by up to 30%) the FiO2, and in limiting the drop related to V̇insp during noninvasive CPAP.

Table 1 (abstract P342)

FiO2 values at PEEP 7 cmH2O

V̇insp    30 l/min    60 l/min    90 l/min
Bssc      84 ± 0.3    61 ± 0.6    51 ± 0.4
SB10      91 ± 0.4    72 ± 0.3    58 ± 0.4
SB30      92 ± 0.4    93 ± 0.2    77 ± 0.3

Table 2 (abstract P342)

FiO2 values at PEEP 13 cmH2O

V̇insp    30 l/min    60 l/min    90 l/min
Bssc      90 ± 0.2    72 ± 0.4    59 ± 0.4
SB10      91 ± 0.7    81 ± 0.5    68 ± 0.5
SB30      92 ± 0.4    93 ± 0.2    83 ± 0.3

Abstract withdrawn

How quick is soon? Early response to continuous positive airway pressure: a randomized controlled trial

J Crawford, R Otero, EP Rivers, T Lenoir, J Garcia

Henry Ford Hospital, Detroit, MI, USA

Critical Care 2008, 12(Suppl 2):P344 (doi: 10.1186/cc6565)

Introduction Numerous studies have confirmed that using noninvasive continuous positive airway pressure (nCPAP) for chronic obstructive pulmonary disease and congestive heart failure improves the respiratory rate, heart rate (HR), and work of breathing. We hypothesize that early application of nCPAP with concomitant medical therapy to patients with acute undifferentiated shortness of breath (SOB) will improve objective measures of respiratory distress. Specifically, within 15 minutes, early application of nCPAP can improve the tidal volume (TV), end-tidal carbon dioxide (EtCO2) and Rapid Shallow Breathing Index (RSBI), and reduce intubations, compared with standard treatment alone.

Methods Fifty-two patients with acute undifferentiated SOB were randomized equally to either nCPAP + standard of care (nCPAP group) or standard of care alone (standard group). nCPAP was applied for 15 minutes. Demographic data were recorded upon enrollment. Volumetric measures were obtained by breathing through the EtCO2/flow sensor for 30 seconds at 5-minute intervals, along with vital signs. Inclusion criteria were adults >18 years with acute respiratory distress admitted to the resuscitation room of the emergency department, respiratory rate > 25 bpm, SpO2 > 75%, Glasgow Coma Score > 8, HR > 60/min, and systolic blood pressure > 90 mmHg. Exclusion criteria were respiratory arrest and/or cardiac arrest, suspected pulmonary embolism, pneumothorax, myocardial infarction, temperature > 38.5°C, or refusal to participate.

Results All tests were two-sided and assessed at the 0.05 type-I error rate. The gender distribution was equal for both groups. There was no difference in baseline characteristics except for age, HR and diastolic blood pressure (P < 0.05). Subjects in the nCPAP group had a greater improvement than the standard group in TV (0.8 l vs 0.3 l), EtCO2 (30 mmHg vs 38 mmHg) and RSBI (39 vs 150). The nCPAP group also had shorter hospital and ICU lengths of stay than the standard group (4 vs 5 days, and 2 vs 3 days, respectively). Finally, the rate of intubation was higher in the standard group than in the nCPAP group (n = 8 vs n = 3, P < 0.01).

Conclusions The early application of nCPAP in patients with acute undifferentiated SOB improves their volumetric parameters and vital signs in as early as 5 minutes. This pilot study provides objective support for the notion that early application of nCPAP can lead to measurable improvement in TV, EtCO2 and RSBI, and to a reduction in intubations.

Conventional versus noninvasive ventilation in acute respiratory failure

SH Zaki, G Hamed, A Andraos, A Abdel Aziz, H Fawzy, F Ragab, S Mokhtar

Kasr Al Ainy University Hospital, Cairo, Egypt

Critical Care 2008, 12(Suppl 2):P345 (doi: 10.1186/cc6566)

Introduction Treatment of patients with acute respiratory failure (ARF) often involves mechanical ventilation via endotracheal intubation (INV). Noninvasive positive pressure ventilation (NIV) using bilevel positive airway pressure (BiPAP) can be safe and effective in improving gas exchange. The aim of the study was to assess NIV (BiPAP) as an alternative method of ventilation in ARF and to determine factors that predict the successful use of BiPAP. Methods Thirty patients with ARF (type I and type II) were enrolled and divided into two groups. Group I included 10 patients subjected to INV. Group II included 20 patients subjected to NIV using BiPAP. Both groups were compared regarding arterial blood gases (ABG) on admission, 30 minutes after the beginning of ventilation, at 1.5 hours and then once daily. Complications (namely ventilator-associated pneumonia (VAP), skin necrosis and carbon dioxide narcosis) were recorded, and static compliance and resistance were measured at day 1 and day 2. Results Compared with group I, group II patients showed similar improvement in ABG at 30 minutes and at discontinuation of ventilation. Group II patients showed a lower incidence of VAP (20% vs 80%), a shorter duration of ventilation (3 ± 3 vs 6 ± 5 days, P < 0.01) and a shorter length of hospital stay (5.8 ± 3.6 vs 8.9 ± 2.7 days, P < 0.01) when compared with group I. Skin necrosis and carbon dioxide narcosis occurred in group II only. Group II patients showed a different change in compliance and in resistance from day 1 to day 2 when compared with group I. On a univariate basis, parameters were analyzed to identify those associated with the outcome of interest (successful NIV).
The following parameters were identified: level of consciousness; on admission, pH (7.3 ± 0.03 vs 7.26 ± 0.1, P = 0.009) and PCO2 (69.16 ± 13.14 vs 100.97 ± 12.04); and 1.5 hours after NIV, pH (7.37 ± 0.03 vs 7.31 ± 0.17, P = 0.005) and PCO2 (53.98 ± 8.95 vs 77.47 ± 5.22, P = 0.0001), for patients in whom NIV succeeded and failed, respectively. The variable identified in both models was PCO2 after 1.5 hours, with 100% specificity.

Conclusions In patients with ARF, NIV was as effective as conventional ventilation in improving gas exchange, and was associated with fewer serious complications and a shorter stay in intensive care. A 1.5-hour trial of NIV can predict success with BiPAP, as shown by improvement in pH, PCO2 and the overall clinical picture. PCO2 after 1.5 hours could be the sole predictor of successful NIV, with 100% specificity.

Can the delta Rapid Shallow Breathing Index predict respiratory failure in spontaneously breathing patients receiving positive pressure ventilation?

R Otero, JC Crawford, ER Rivers, DG Goldsmith

Henry Ford Hospital, Detroit, MI, USA

Critical Care 2008, 12(Suppl 2):P346 (doi: 10.1186/cc6567)

Introduction The Rapid Shallow Breathing Index (RSBI) is derived by dividing the respiratory rate by the tidal volume. Previous work by this group has shown an association between an elevated RSBI (>105) and the need for noninvasive ventilation. Hypothesis: an improvement in RSBI, defined as a decrease from baseline (that is, ΔRSBI), can predict whether patients will develop respiratory failure when receiving either conventional therapy (non-CPAP) or continuous positive airway pressure (CPAP). Methods A secondary analysis of a prospective randomized controlled trial of patients receiving CPAP plus conventional therapy (CPAP group) versus conventional therapy alone (non-CPAP group) for undifferentiated dyspnea. The tidal volume was determined using volumetric capnography with an end-tidal carbon dioxide flow sensor while receiving treatment. There were 26 patients in each group (CPAP and non-CPAP). Comparisons of ΔRSBI between the CPAP and non-CPAP groups were made using simple t tests. All tests were two-sided and assessed at the 0.05 type-I error rate.
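The index and its change from baseline follow directly from the definition above. A minimal sketch with hypothetical values (not patient data from the study):

```python
def rsbi(respiratory_rate: float, tidal_volume_l: float) -> float:
    """Rapid Shallow Breathing Index: breaths/min divided by tidal volume (litres)."""
    return respiratory_rate / tidal_volume_l

def delta_rsbi(baseline: float, current: float) -> float:
    """Improvement from baseline: positive values mean the RSBI has fallen."""
    return baseline - current

# Hypothetical patient: 30 breaths/min at 0.25 l gives RSBI 120 (>105, elevated);
# after treatment, 20 breaths/min at 0.5 l gives RSBI 40.
baseline = rsbi(30, 0.25)   # 120.0
current = rsbi(20, 0.5)     # 40.0
print(delta_rsbi(baseline, current))  # 80.0
```

A positive ΔRSBI therefore denotes improvement, which is why the negative non-CPAP values in the Results indicate worsening rapid shallow breathing.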

Results The mean ΔRSBI in the CPAP group at t = 0-5 minutes, 0-10 minutes, and 0-15 minutes was 79.1, 96.2, and 93.6, respectively. For the period t = 5-10 minutes the mean ΔRSBI was 15.8, and for t = 10-15 minutes the mean was -2.5. In the non-CPAP group the mean ΔRSBI for t = 0-5, t = 0-10, t = 0-15, t = 5-10 and t = 10-15 minutes was 6.7, -30.2, -5.4, -36.9, and 11.4, respectively. Patients randomized to CPAP had a greater improvement in ΔRSBI than patients receiving conventional therapy: change from 0 to 5 minutes, P = 0.01; from 5 to 10 minutes, P = 0.03; from 10 to 15 minutes, P = 0.42. The largest improvement in RSBI was seen in the first 10 minutes. There were more intubations in the non-CPAP group (n = 8) than in the CPAP group (n = 4).

Conclusions ΔRSBI may be used as a noninvasive technique to predict respiratory failure in patients receiving CPAP. The largest improvement in respiratory function in this group occurred during the first 10 minutes of treatment with CPAP. Further studies are needed to compare ΔRSBI with conventional predictive techniques.

Plain radiological investigations in admissions to a trauma centre ICU

K O'Connor, P Glover

Royal Victoria Hospital, Belfast, UK

Critical Care 2008, 12(Suppl 2):P347 (doi: 10.1186/cc6568)

Introduction The Royal Victoria Hospital (RVH) is the tertiary centre in Northern Ireland, and severely injured patients are transferred to its regional ICU for further definitive management. Minimum radiology should be performed before transfer.

Methods ICU transfers to the trauma centre from the RVH emergency department (ED) and from district general hospital (DGH) EDs were prospectively audited over 4 months.

Results Thirteen patients were admitted from RVH ED, 25 from a DGH ED. Spinal injury was diagnosed in 7.7% in the RVH group

Table 1 (abstract P347)

Plain radiological investigations performed

       Chest (%)   Cervical spine   Cervical spine         Thoracolumbar   Pelvis (%)
                   lateral (%)      anteroposterior (%)    spine (%)
RVH    100         100              100                    53.8            38.5
DGH    96          88               88                     68              32

Table 2 (abstract P347)

Injuries identified prior to ICU admission

       Brain (%)   Cervical spine (%)   Thoracolumbar spine (%)   Chest (%)
RVH    53.8        0                    0                         38.5
DGH    76          8                    8                         28

versus 28% from the DGH ED. The median time to clear the cervical spine was 24 hours (RVH) versus 48 hours (DGH), and the thoracolumbar spine 34 hours (RVH) versus 48 hours (DGH). See Tables 1 and 2.

Conclusions Despite Advanced Trauma Life Support guidelines, plain X-ray scans of the lateral cervical spine, chest and pelvis are not routinely performed in all trauma patients. Not all spinal injuries are being detected by radiology performed in EDs. Significant delays in clearance of spinal injuries occur despite a protocol in place, exposing patients to other potential risks. The development of a critical care network in Northern Ireland should allow the standardisation of pre-ICU management of trauma patients.

Needlestick injuries and infectious diseases in emergency medicine

JE Johnston, O Oconor

North Tyneside General Hospital, Newcastle Upon Tyne, UK

Critical Care 2008, 12(Suppl 2):P348 (doi: 10.1186/cc6569)

Introduction Needlestick injuries are an occupational hazard for junior doctors, especially in emergency medicine. The emergency department is involved in the management of injuries occurring both in the hospital setting and in the community. The setting was an inner-city area with a high incidence of intravenous drug abuse, HIV, hepatitis B and hepatitis C. The study aimed to highlight areas for improvement in management.

Methods A retrospective review of all emergency notes coded as needlestick injury for the 12-month period from 1 July 2001 to 1 July 2002. Information recorded included times from incident to arrival at the department, to being seen by a doctor and to receiving postexposure prophylaxis (PEP) if indicated; also the numbers of tetanus toxoid, hepatitis B immunoglobulin/vaccine and HIV PEP doses given, as well as the numbers indicated. Risk of injury and exposure was assessed and follow-up was checked.

Results There were 73 needlestick injuries: 35 (48%) presented during normal working hours (9 am-5 pm) and 38 (52%) outside these hours. Twenty-six (34%) were healthcare workers and 51 (66%) were nonhealthcare workers. The average time from incident to arrival was 1.4 hours for healthcare workers and 22.6 hours for nonhealthcare workers. The median time from arrival in the department to being seen by a doctor was 90 minutes. Ten (13.7%) injuries were high risk. Antiretroviral agents were given to 15 (20.1%) patients and the average time from door to HIV PEP was 141 minutes.

Conclusions Emergency medicine staff should be aware of the risks of blood-borne viral transmission as they have greater exposure than other healthcare groups. They are at higher risk of percutaneous injury and therefore should adopt universal precautions; shield and sheath devices would also reduce the risk of sharp injury. The HIV PEP is effective if given early, so these patients must be assessed urgently and antiretroviral agents given as soon as possible if indicated. Emergency medicine has had an increasing role in management of needlestick injuries in healthcare workers occurring outside working hours, out-of-hospital injuries and other attendances for HIV PEP. Greater education of emergency staff, other healthcare workers and the general public is required for optimal management of needlestick injuries.

Impact of a pandemic triage tool on intensive care admission

A Bailey, I Leditschke, J Ranse, K Grove

The Canberra Hospital, Canberra, Australia

Critical Care 2008, 12(Suppl 2):P349 (doi: 10.1186/cc6570)

Introduction The issues of pandemic preparedness and the use of critical care resources in a pandemic have been of increased interest recently [1]. We assessed the effect of a proposed pandemic critical care triage tool [2] on admissions to the ICU. The tool aims to identify patients who will most benefit from admission to the ICU and excludes patients considered 'too well', 'too sick', or with comorbidities likely to limit survival in the shorter term. Methods To assess the impact of the pandemic tool on our current practice, we performed a retrospective observational study of the application of the pandemic triage tool described by Christian and colleagues [2] to all admissions to our 14-bed general medical-surgical ICU over a 1-month period.

Results One hundred and nineteen patients were admitted to the ICU. Using the pandemic triage tool, 91 of these patients (76%) did not meet the triage inclusion criteria on admission. As required by the triage tool, patients were reassessed at 48 and 120 hours, with only one of the 91 patients becoming eligible for admission on reassessment. Further assessment of the 29 patients (24%) who met the triage inclusion criteria revealed that 17 of these met the triage exclusion criteria, leaving 12 patients (10%) from the original 119 as qualifying for ICU. One of these 12 was deemed 'too sick' by the triage tool and therefore was also excluded, leaving 11 patients (9%).

Conclusions Application of this triage tool to our current ICU patient population would radically change practice, and would generate substantial capacity that could be used in the situation of a pandemic. In addition, as the triage tool aims to exclude patients who are less likely to benefit from admission to the ICU, these results also have implications for ICU resource management in the nonpandemic situation. References

1. Challen K, et al: Crit Care 2007, 11:212.

2. Christian MD, et al: CMAJ 2006, 175:1377.

Pneumonia Severity Index: a validation and assessment study for its use as a triage tool in the emergency department

S Jacobs

Alice Springs Hospital, Alice Springs, Australia

Critical Care 2008, 12(Suppl 2):P350 (doi: 10.1186/cc6571)

Introduction Our rural 10-bed ICU serves a 167-bed hospital situated in a small town at the centre of the Australian continent. The hospital is utilised mainly by Indigenous patients. We estimated the Pneumonia Severity Index (PSI) for all patients presenting to our emergency department (ED) with community-acquired pneumonia (CAP) during 2006, to ascertain whether this scoring system could be validated in our unique patient population and whether the score could support a clinical decision to transfer patients to the medical ward or ICU.

Methods All patients presenting to the ED during 2006 were identified, and their demographic details and physiological parameters were noted in order to calculate PSI scores retrospectively. Triage was performed clinically by the ED doctors. The following complications were noted: requirement for artificial ventilation, septic shock, acute renal failure, requirement for percutaneous tracheostomy, and mortality. All patients transferred to tertiary centres because of staff or bed shortages were also noted. Microbiological data were collected for all ICU patients and most of the ward patients.

Results Four hundred and seventy-six CAP patients presented to the ED in 2006, of whom 91% (436/476) were Indigenous and 12% (57/476) were transferred to the ICU. Admission characteristics of ICU patients revealed high incidences of alcoholism (76%) and chronic illness (70%). Artificial ventilation rates among these 57 patients were stratified by PSI severity: no patient with a score <91 required artificial ventilation, whereas 64% of patients with a score of 91-130 and 75% of patients with a score >130 required artificial ventilation. Using PSI < 91 to predict the absence of a need for artificial ventilation, a specificity of 100% and a sensitivity of 91% were demonstrated.
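The reported operating characteristics follow from the standard 2x2 definitions. A minimal sketch (the counts below are hypothetical, chosen only to echo the 91%/100% figures; the study's raw 2x2 table is not given):

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts, with 'test positive' = PSI < 91 and the condition being
# 'did not need ventilation'. fp = 0 forces specificity to 1.0, mirroring the
# finding that no ventilated patient scored < 91.
sens, spec = sensitivity_specificity(tp=10, fn=1, tn=5, fp=0)
print(round(sens, 2), spec)  # 0.91 1.0
```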

Conclusions The CAP rate among the central Australian Indigenous population is unacceptably high. This high rate is associated with high incidences of alcoholism, chronic ill health and poor social conditions. The PSI has been validated in this study, accurately predicting mortality and the need for artificial ventilation. The PSI could be a useful tool to support a clinical decision to transfer patients from the ED to the general medical wards (PSI < 91) or to the ICU (PSI > 91).

Injuries sustained by patients with behavioral disturbances brought to an emergency department in police custody

J Metzger, P Maher, M Wainscott, PE Pepe, K Delaney

UT Southwestern Medical School, Dallas, TX, USA Critical Care 2008, 12(Suppl 2):P351 (doi: 10.1186/cc6572)

Introduction Although it is widely recognized that physical restraint of violent persons can result in death or serious injury, formal reports documenting the incidence or rate of such injuries are lacking. The rate and nature of injuries for emotionally disturbed patients brought in police custody to a major urban emergency department (ED) were therefore recorded.

Methods All medical records of persons brought by police to the public hospital ED for a large metropolitan county (population 2.5 million) are electronically marked for subsequent rapid searches. Excluding those arriving following commission of a crime or for evaluation of medical conditions or sexual assaults, all patients brought with the transport assistance of paramedic crews to the ED in police custody for psychiatric evaluation as an emotionally disturbed person from 12 December 1999 through 31 August 2003 were studied. Patients were classified specifically as 'agitated' if they were described as violent, psychotic, aggressive, combative, hostile, threatening, homicidal or dangerous.

Results Of the total 24,895 police custody visits, 17,733 met the inclusion criteria for receiving psychiatric evaluation as emotionally disturbed persons. Of these, 10,173 (57%) could be classified as agitated. A potentially lethal weapon was confiscated in 447 cases. Of the 17,733 studied, 511 (3.3%) sustained injuries - 398 (78%) of which were self-inflicted stab wounds, wrist lacerations and hangings. Mace exposure resulted in 34 minor injuries, while none were attributable to conductive electrical devices. Overall, the rate of self-inflicted injuries over 3 years and 9 months was 2.2% (n = 398) while it was 0.6% (n = 113) for those inflicted by others. Only four of these patients (approximately one per year) required admission to a surgical service. Of note, 29% of injuries sustained by agitated patients were not self-inflicted, compared with 13.6% in nonagitated patients (70/238 vs 37/273, P = 0.0001).

Conclusions With the assistance of transporting paramedics, police officers were able to restrain violent, emotionally disturbed patients with a very low risk of serious injury. Reference

1. Morrison A, et al.: Death of a psychiatric patient during physical restraint. Med Sci Law 2001, 41:46-50.

A study on the reliability and validity of two four-level emergency triage systems

N Parenti1, L Ferrara1, D Sangiorgi2, M Bacchi Reggiani2, R Manfredi1, T Lenzi1

1Hospital Santa Maria della Scaletta Imola, Bologna, Italy;

2Policlinico Sant'Orsola, Bologna, Italy

Critical Care 2008, 12(Suppl 2):P352 (doi: 10.1186/cc6573)

Introduction We compared the reliability and validity of two four-level emergency triage systems: the ETS1, used in our emergency department (ED); and the ETS2, a triage algorithm derived from the Emergency Severity Index [1].

Methods This is an observational retrospective study of 189 patients admitted to our ED in 2 weeks of October 2006. Triage scenarios were designed from medical records. Ten trained triage nurses were randomly assigned either to training in ETS2 or to a refresher in ETS1. They independently assigned triage scores to the scenarios, at time zero and after 6 months. Both triage systems have four urgency categories (UC): UC1 = immediate response; UC2, UC3, UC4 = assessment within 20, 60, 120 minutes. We collected demographic and clinical characteristics, nurse triage category, admission status and site, and nurse triage forms with presenting complaint, mode and time of arrival, past diseases, vital signs, and pain score. For each scenario, the mode of the UCs assigned by the nurses was considered the 'true triage'. Weighted kappa (K) was used to calculate inter-rater and intra-rater reliability in each of the two groups of nurses. The relationships between 'true triage' and admission, and admission site, were assessed. Results The UCs assigned were similar in the two groups: 20% versus 21% with UC4, 50% versus 48% with UC3, 28% versus 28% with UC2, 2% versus 3% with UC1. Complete disagreement in UC was found in 3% and 5% of cases for ETS1 and ETS2, respectively, and complete agreement in 52% and 56%. Inter-rater reliability among nurses using ETS1 and ETS2 was K = 0.73 (95% CI: 0.59-0.87) and K = 0.79 (95% CI: 0.65-0.93), respectively; intra-rater reliability was K = 0.82 (95% CI: 0.67-0.96) and K = 0.78 (95% CI: 0.62-0.93) for ETS1 and ETS2. Hospital admission by ETS1 and ETS2 was similar for UC2 (39% vs 37%), UC3 (5% vs 8%), and UC4 (3% vs 0%); 100% of patients with UC1 in ETS1 and 60% in ETS2 were admitted to the ICU.
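The reliability statistic used here can be illustrated with a short generic sketch of linear-weighted Cohen's kappa for ordinal triage categories (an assumption for illustration; the abstract does not specify the weighting scheme or software actually used):

```python
from collections import Counter

def weighted_kappa(r1, r2, categories=4):
    """Linear-weighted Cohen's kappa for two raters over ordinal categories 1..categories.

    Uses disagreement weights |i - j|: kappa = 1 - observed/expected disagreement.
    """
    n = len(r1)
    # Observed disagreement, weighted by distance between assigned categories.
    obs = sum(abs(a - b) for a, b in zip(r1, r2)) / n
    # Expected (chance) disagreement from each rater's marginal distribution.
    c1, c2 = Counter(r1), Counter(r2)
    exp = sum(abs(i - j) * c1[i] * c2[j]
              for i in range(1, categories + 1)
              for j in range(1, categories + 1)) / n ** 2
    return 1 - obs / exp

# Perfect agreement on five hypothetical scenarios gives kappa = 1.0.
print(weighted_kappa([1, 2, 3, 4, 2], [1, 2, 3, 4, 2]))  # 1.0
```

With these weights, kappa equals 1 only under perfect agreement and 0 when agreement is at chance level, so the 0.73-0.82 values above indicate substantial agreement.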

Conclusions The two emergency triage systems showed similar reliability and validity. ETS2 is easier to consult but worse in prediction of ICU admission. To our knowledge this is the first study on the intra-rater reliability of two four-level emergency triage systems. Reference

1. Wuerz RC, Milne LW, Eitel DR, et al.: Reliability and validity of a new five-level triage instrument. Acad Emerg Med 2000, 7:236-242.

Patterns of medical illness and injuries in emotionally disturbed patients brought to an emergency department in police custody

J Metzger, P Maher, M Wainscott, P Pepe, K Delaney

UT Southwestern Medical School, Dallas, TX, USA

Critical Care 2008, 12(Suppl 2):P353 (doi: 10.1186/cc6574)

Introduction Emotionally disturbed persons brought to the emergency department (ED) are at risk of surgical and medical illness. The patterns of medical and surgical illness for emotionally disturbed patients brought in police custody to a major urban ED were recorded.

Methods Our electronic ED medical records include the chief complaint, final diagnosis and disposition. A smaller number have detailed nursing notes. All patients with detailed nursing notes who were brought between 1 December 1999 and 31 August 2003 in police custody for psychiatric evaluation were studied. Patients were classified as 'agitated' if they were described as violent, psychotic, aggressive, combative, hostile, threatening, homicidal or dangerous.

Results In total, 17,733 patients were brought for psychiatric evaluation. Of these, 6,432 had complete nursing notes. Rates of injury and illness were low. Of 1,985 nonagitated patients, 194 (9.8%) were injured: 180 (93%) self-inflicted and none attributed to a restraint process. Of 4,447 agitated patients, 227 (3.5%) were injured: 160 (70%) were self-inflicted and 39 (0.9%) were attributed to the restraint process. Of these, 31 were exposed to mace, six had minor head or soft-tissue injuries, one had a pneumothorax, and one had an airway injury. For the subgroup, self-inflicted injuries included lacerations (245), head injuries (60), hangings (13), penetrating wounds (10), hand fractures or infection (three), and other minor injuries (nine). Overall, 2,903 (45.8%) were admitted to a psychiatric service, 110 (1.7%) to a medical service and seven (0.12%) to a surgical service. Admitted surgical diagnoses were abdominal stab wound (one), tibial fracture (two), pneumothorax (one), airway injury (one), infected human bite (one), complicated lacerations (three), and penile foreign body (one). The predominant medical admission diagnoses were alcohol withdrawal (16 patients), overdose (24 patients), and rhabdomyolysis (12 patients). Forty-five patients had complications of chronic medical illness and eight patients had dementia with psychosis.

Conclusions Although serious medical illnesses and injuries occurred, there was a low rate of medical and surgical illness that required admission. There was a high rate of psychiatric admission. Reference

1. Morrison A, et al.: Death of a psychiatric patient during physical restraint. Med Sci Law 2001, 41:46-50.

Factors of hospitalization for mild heat illness

M Todani, T Ryosuke, T Kaneko, Y Kawamura, S Kasaoka, T Maekawa

Yamaguchi University Hospital, Ube, Yamaguchi, Japan

Critical Care 2008, 12(Suppl 2):P354 (doi: 10.1186/cc6575)

Introduction Untreated mild heat illness (heat exhaustion) becomes progressively more severe (heat stroke). Although the prognosis and risk factors for hospital mortality in patients with severe heat illness are often reported, epidemiologic data for mild heat illness are rare. We therefore investigated predictive factors for hospitalization in patients with mild heat illness. Methods Questionnaire sheets were sent to our affiliated hospitals in Yamaguchi prefecture, located at 34° north latitude with a population of 1.5 million, to identify patients who received medical attention for heat illness from 1 July to 31 August 2007. The questionnaire covered symptoms, vital signs, laboratory data and presence or absence of hospitalization (decided by each doctor). Results We analyzed the data from 114 of the 126 patients with mild heat illness; 12 patients were excluded because of insufficient description. The total number of hospitalizations was 44 (35%) and all patients were discharged without subsequent complications. Significant differences between hospitalized and nonhospitalized patients were found in body temperature, consciousness disturbance, central nervous system dysfunction including convulsion or cerebellar ataxia, age, serum C-reactive protein, blood urea nitrogen (BUN) and white blood cell count. Independent predictive factors for hospitalization were Glasgow Coma Scale < 15 (P = 0.04, OR = 3.56, 95% CI = 1.05-12.01), age > 60 years (P < 0.01, OR = 4.44, 95% CI = 1.50-13.08), and BUN > 21 mg/dl (P = 0.03, OR = 3.35, 95% CI = 1.16-9.67).

Conclusions Based on the present findings, the factors predicting hospitalization for mild heat illness were consciousness disturbance, older age and a high BUN level. These factors do not directly reflect the severity of heat illness, but they help to identify the patients who should be hospitalized. References

1. Bouchama A, Knochel JP: N Engl J Med 2002, 346:1978-1988.

2. Misset B, et al: Crit Care Med 2006, 34:1087-1092.

Missed opportunities to facilitate the diagnosis of colon cancer in emergency department patients

K Delaney

UT Southwestern Medical School, Dallas, TX, USA

Critical Care 2008, 12(Suppl 2):P355 (doi: 10.1186/cc6576)

Introduction We have been aware that colon cancer in our public institution is too often diagnosed at a late stage. We evaluated the pattern of visits of these patients to the emergency department (ED) prior to the diagnosis of cancer.

Methods All patients in the hospital cancer registry with a diagnosis of colon cancer between January 2001 and December 2004 were evaluated. Prior ED visits were tracked in the searchable ED electronic medical record back to January 1999. Chief complaints and hematocrits were recorded. Anemia was defined as a hematocrit <37%. Symptoms were unexplained anorectal or abdominal pain or bleeding, chronic constipation, diarrhea or vomiting.

Results Two hundred and thirteen patients (116 men and 97 women) with colon cancer diagnosed during the 3-year period had visited the ED at least once prior to diagnosis; 37.6% were diagnosed within 1 week of the first symptomatic ED visit and 50% within the first 40 days. The median time to diagnosis after the first symptomatic ED visit was 46 days, while the average was 256 days. Diagnosis was delayed more than 1 year in 51 patients. The median time to diagnosis after the first detection of anemia was 84 days; 41 patients were diagnosed at greater than 1 year. Women with anemia (n = 27) were nearly twice as likely as men (n = 16) to have >1-year delays in diagnosis following the detection of anemia.

Conclusions Opportunities to facilitate early diagnosis of colon cancer were missed in some cases. Focused interaction of ED providers with outpatient care providers to facilitate evaluation of suspected colon cancer is necessary to improve early detection. Women with anemia are less likely to be evaluated for a gastrointestinal source of blood loss.

When are chest X-ray scans performed in the emergency department useful?

JE Johnston

North Tyneside General Hospital, Newcastle Upon Tyne, UK

Critical Care 2008, 12(Suppl 2):P356 (doi: 10.1186/cc6577)

Introduction This study was designed to demonstrate use of chest X-ray scans in a district general hospital emergency department and to highlight areas of inappropriate use.

Methods A retrospective chart review of 62 consecutive chest X-ray exposures for emergency patients in the department. The frequency with which temperature, pulse, respiratory rate and oxygen saturation were recorded was noted, along with those subsequently found to be abnormal. The indication for the X-ray scan, information about previous chest X-ray scans and the record of the observed result were reviewed, as well as whether the patient's management was altered by the result of the X-ray scan.

Results Only 50% of the X-ray scans provided information that would potentially change the patient's management, although 68% had positive findings recorded. Twenty-nine per cent were requested for investigation of chest pain (pleuritic in 8.1%), 11.3% for investigation of abdominal pain, and 5% for transient ischemic attacks.

Conclusions Approximately 50% of chest X-ray scans requested from the emergency department are inappropriate. They are often requested unnecessarily for chest pain, transient ischaemic attacks, mild chest infections, head injury, haematemesis and minor injuries.


Management of acute organophosphorus poisoning at a university hospital

JM Shaikh

Liaquat University of Medical & Health Sciences, Jamshoro, Pakistan

Critical Care 2008, 12(Suppl 2):P357 (doi: 10.1186/cc6578)

Introduction Organophosphorus (OP) insecticides are widely used in agriculture, usually as pesticides, and frequently cause ill health and death for hundreds of thousands of people each year. The majority of deaths occur following deliberate self-poisoning. They are common suicidal agents in Pakistan, India, Sri Lanka and other South Asian countries. The Accident & Emergency Department of Liaquat University Hospital Hyderabad routinely receives victims of OP poisoning from the surrounding farming communities. Our objective was to document the management, complications and subsequent outcome of patients with acute organophosphorus poisoning in the ICU of Liaquat University Hospital. Methods All victims of OP poisoning admitted to the ICU of Liaquat University Hospital from May 2004 to October 2006 were included in the study. Diagnosis of OP poisoning was confirmed from history and clinical findings. Management, complications and subsequent outcome were noted. Statistical analysis was performed using SPSS 10.

Results A total of 111 patients with OP poisoning were admitted to the ICU during the study period; 60.4% of patients were male. The mean age was 25.26 ± 8.52 years; 85.6% were within the age range of 12-30 years. In 89.2% of patients the poisoning was a suicidal attempt. In 94.6% of patients, ingestion was the route of exposure. The mean ICU stay was 2.3 ± 3.2 days. Twenty (18%) patients needed mechanical ventilatory support. The overall mortality rate was 9% (n = 10). The mortality rate for patients who required mechanical ventilation was 40% (n = 8), but was 2.2% (n = 2) for patients who were not mechanically ventilated.

Conclusions Because of the widespread use of OP pesticides by farming communities of the developing world, it is very difficult to reduce mortality by primary prevention. Immediate transfer of the victim to a well-equipped and well-staffed ICU, careful resuscitation with appropriate use of antidotes, and good supportive care and observation can help reduce the number of deaths in the period after admission to hospital. Awareness and education of general practitioners in rural areas regarding emergency management, as well as prompt referral to an appropriate facility, is also recommended to reduce the mortality rate.

References

1. World Health Organization in collaboration with the United Nations Environmental Programme: Public Health Impact of Pesticides Used in Agriculture. Geneva: WHO; 1990.

2. Bairy KL, Vidyasagar S, Sharma A, Sammad V: Controversies in the management of organophosphate poisoning. Ind J Pharmacol 2007, 39:71-74.

3. Dreisbach RH: Cholinesterase inhibitor pesticides. In Handbook of Poisoning. 11th edition. CA: Lange Medical Publications; 1983:106-114.

4. Phillips MR, Li X, Zhang Y: Suicide rates in China, 1995-99. Lancet 2002, 359:835-840.

5. Niwaz A, Faridi MA: Organophosphate insecticide poisoning [case report]. Anaesth Pain Intensive Care 1999, 3:34-36.

6. Vijayakumar L: Suicide prevention: the urgent need in developing countries. World Psychiatry 2004, 3:158-159.

7. Eddleston M, Nick AB, Andrew HD: The need for translational research on antidotes for pesticide poisoning. Clin Exp Pharmacol Physiol 2005, 32:999-1005.

8. Eddleston M, Phillips MR: Self poisoning with pesticides. BMJ 2004, 328:42-44.

9. Gunnell D, Eddleston M: Suicide by intentional ingestion of pesticides: a continuing tragedy in developing countries. Int J Epidemiol 2003, 32:902-909.

10. Eddleston M: Patterns and problems of deliberate self-poisoning in the developing world. Q J Med 2000, 93:715-731.

11. Karalliedde L, Eddleston M, Murray V: The global picture of organophosphate insecticide poisoning. In Karalliedde L, Feldman F, Henry J, Marrs T (Eds): Organophosphates and Health. London: Imperial Press; 2001:432-471.

12. Jamil H: Organophosphorus insecticide poisoning. J Pak Med Assoc 1989, 39:27-31.

13. Singh S, Wig N, Chaudhary D, Sood NK, Sharma BK: Changing pattern of acute poisoning in adults: experience of a large North-West Indian hospital (1970-89). J Assoc Physicians India 1997, 45:194-197.

14. Ganesvaran T, Subramaniam S, Mahadevan K: Suicide in a northern town of Sri Lanka. Acta Psychiatr Scand 1984, 69:420-425.

Suicidal intoxication by the black stone in Tunisia

I El Amri1, K Garrab1, Y Youssef2, S Slaheddine2

1CHU Hached, Kalaa-Sghira, Tunisia; 2CHU Hached, Sousse, Tunisia

Critical Care 2008, 12(Suppl 2):P358 (doi: 10.1186/cc6579)

Introduction Paraphenylenediamine (PPD) is frequently used by women in certain countries such as Tunisia, Morocco, India, Pakistan and Sudan to dye their hair black. Knowledge of its systemic toxicity has led to its use for suicidal purposes, but comprehensive studies including medico-legal and toxicological investigation remain very rare. In Tunisia, sporadic cases of suicide by ingestion of this substance have been recorded in the southern and central regions.

Methods A retrospective study of a series of 10 cases of voluntary acute intoxication with PPD collected at the Laboratory of Toxicology of CHU Farhat Hached, Sousse. Samples of blood, urine and gastric contents were taken during the clinical examination and at autopsy in the hospitals of Sfax, Sousse and Kairouan. Characterization of PPD in the aqueous biological media was made possible by the development of a separation technique.

Results The sex ratio (male/female) was 0.25. The average age was 28 years. Seven subjects were single and three were married. The socioeconomic level was low in all cases. Death occurred in 9/10 cases. The most common clinical findings were cervicofacial oedema (nine cases), diffuse myalgia (three cases) and blackish urine (six cases). Toxicological analysis provided conclusive evidence of PPD ingestion, with detection in the gastric contents and urine in almost all samples (10 cases).

Conclusions The study shows that intoxication by the black stone is relatively rare but potentially lethal in the absence of rapid and effective medical intervention. The prognosis for survival depends on the initial phase, characterized by cervicofacial oedema that may require a rescue tracheotomy when intubation, often difficult, fails. This is the rationale for the Laboratory of Toxicology providing, in the shortest possible time, proof of an acute intoxication whose diagnosis must not be underestimated.

References

1. Zeggwagh AA, Aboukal R, Madani R, et al.: Myocardite toxique due a la paraphenylene diamine, a propos de deux cas. Rea Urg 1996, 5:699-703.

2. Brasch J: 'New' contact allergens. Curr Opin Allergy Clin Immunol 2007, 7:409-412.

Prognostic factors of poisonings treated with extracorporeal life support in the ICU

B Mégarbane1, N Deye1, S Mohebbi-Amoli1, J Théodore1, P Brun1, I Malissin1, P Le Prince2, F Baud1

1Hôpital Lariboisière, Paris, France; 2Hôpital Pitié Salpétrière, Paris, France

Critical Care 2008, 12(Suppl 2):P359 (doi: 10.1186/cc6580)

Introduction Massive drug ingestions may be associated with refractory cardiac failure, whose reversibility makes extracorporeal life support (ECLS) promising despite prolonged arrest. Our objective was to determine the prognostic factors of ECLS-treated patients.

Methods A prospective study including all poisoned patients treated with ECLS during 2003-2007; surgical cannulation of the femoral vessels in the ICU to perform ECLS (Rotaflow®; Jostra-Maquet SA) in collaboration with a cardiosurgical team from a neighbouring hospital. Descriptive analysis (median (10th-90th percentiles)); univariate comparisons using chi-squared and Mann-Whitney tests.

Results Fifty-seven poisoned patients (19 males/38 females; 41 years (21-59); SAPS II, 75 (49-94)) were treated with ECLS over a 4-year period for cardiac failure (26/57) or arrest (31/57). Patients had ingested high doses of cardiotoxicants in 49/57 cases (chloroquine 19%, class I antiarrhythmic drugs 19%, β-blockers 14%, calcium channel blockers 11%). Sixteen patients (28%) survived, including five after prolonged cardiac arrest (maximal duration 180 min). Death resulted from multiorgan failure, anoxic encephalopathy or capillary leak syndrome when ECLS was performed under cardiac massage. Four patients presented documented brain death, allowing organ donation in two cases. Among these patients, the heart of one flecainide-poisoned patient was successfully transplanted, after normalization of the ECG and myocardial function as well as toxicant elimination under ECLS. Prognostic factors in ECLS-treated poisoned patients were as follows: QRS enlargement on admission (P = 0.009), SAPS II score on admission (P = 0.005), ECLS performance under massage (P = 0.008), arterial pH (P < 0.001), lactate concentration (10.7 (6.6-19.6) versus 15.0 mmol/l (6.2-29.5), P = 0.003), as well as red cell (P = 0.008), fresh plasma (P = 0.003) and platelet (P = 0.03) transfusions within the first 24 hours.
Conclusions ECLS appears to be an efficient salvage technique in cases of refractory toxic cardiac failure or arrest.

Prognostic factors of acute calcium-channel blocker poisonings

S Karyo1, B Mégarbane1, K Abidi2, P Sauder2, A Jaeger3, F Baud1

1Hôpital Lariboisière, Paris, France; 2Hôpital Civil, Strasbourg, France; 3Hôpital Hautepierre, Strasbourg, France

Critical Care 2008, 12(Suppl 2):P360 (doi: 10.1186/cc6581)

Introduction The incidence of acute calcium-channel blocker (CCB) poisonings is increasing. Our objectives were to describe the CCB-poisoned patients admitted to the ICU and to determine the prognostic factors.

Methods Retrospective collection of clinical data in three ICUs in 2000-2006; determination of plasma concentrations using HPLC (REMEDI). Description (median (25th-75th percentiles)); comparisons using Mann-Whitney and chi-squared tests; multivariate analysis using a stepwise logistic regression model.

Results Eighty-four patients (47 males/36 females; 44 years (31-56); SAPS II, 15 (8-25)) were included. Verapamil (39/83), diltiazem (13/83), nifedipine (11/83), nicardipine (9/83), and amlodipine (8/83) were involved. On admission, systolic blood pressure was 105 mmHg (86-118), heart rate 76/min (67-91), QRS duration 85 ms (80-110), and plasma lactate concentration 2.86 mmol/l (1.79-5.98). Poisoning features included shock (42/83), atrioventricular block (34/83), asystole (8/83), and/or ventricular arrhythmia (4/83). All patients received fluid replacement, 50/83 epinephrine infusion (maximal rate: 3.0 mg/hour (1.4-8.0)), and 27/83 norepinephrine (5.0 mg/hour (2.9-15.0)). Thirty-three out of 83 were mechanically ventilated. Treatments included calcium salts (22/83), glucagon (18/83), dobutamine (18/83), 8.4% sodium bicarbonate (16/83), isoprenaline (14/83), insulin + glucose (13/83), terlipressin (4/83), electrosystolic stimulation (2/83), and extracorporeal life support (5/83). Eleven patients (13%) died in the ICU. The plasma verapamil concentration on admission differed significantly according to survival (800 versus 2,522 mg/l, P < 0.05). Excluding SAPS II from the model, multivariate analysis showed that QRS duration (>100 ms; OR, 5.3; 95% CI, 1.1-27.0) and maximal epinephrine rate (>5 mg/hour; OR, 27.6; 95% CI, 5.3-144.7) were the only two predictive factors of death (P = 0.007). Shock was refractory if epinephrine + norepinephrine was >8 mg/hour with renal failure (creatinine >150 µmol/l) or respiratory failure (PaO2/FiO2 > 150 mmHg) (sensitivity, 100%; specificity, 89%).

Conclusions Despite optimal management in the ICU, mortality from CCB poisoning remains high (13%), encouraging the development of extracorporeal life support and new antidotes.

Retrospective study of patients following deliberate self-poisoning admitted to Cardiff and Vale NHS critical care services between April 2006 and December 2007

A Yates, C Weaver, J Parry-Jones

University Hospital of Wales, Cardiff, UK

Critical Care 2008, 12(Suppl 2):P361 (doi: 10.1186/cc6582)

Introduction There is a paucity of data regarding the demographics and the types of drugs ingested by patients who require admission to critical care following deliberate self-poisoning (DSP). Critical care admissions place a large economic burden on the health care system, and potential measures to prevent critical care admissions should be taken. Ingestion of antihypertensive medication, coma upon presentation, and presentation to the emergency department less than 2 hours after ingestion are predictive of ICU admission. Our aim was to establish the incidence of DSP, to assess the demographics, and to identify factors that may contribute to multiple admissions for DSP.

Methods Data were retrieved from the Riyadh Intensive Care Programme (RIP) database for each case of DSP between April 2006 and October 2007 that required critical care. Case notes were reviewed and the following data recorded: type of poison ingested, past medical history, past psychiatric history, previous ICU admission for DSP, and demographics.

Results The RIP database identified 64 episodes of DSP involving 55 patients. The mean age was 40 years (26 males, 29 females). Forty-eight episodes required level 3 care. Forty-one patients required intubation. The average length of stay was 2.5 days. Nine patients (16.3% of admissions) had more than one admission to the ICU during the study period. Of these, seven were female (average age 35 years) and two were male (mean age 50 years). The commonest drug in multiple DSP was alcohol, followed by benzodiazepines. Six patients (75%) had a known psychiatric history. Three patients died during this period, one male and two female; their average age was 46 years. None of these patients had previously presented to critical care. One of the deceased patients had a psychiatric history.

Conclusions Most of the ICU admissions to Cardiff and Vale NHS Trust following multiple DSP episodes involve young females. The most frequently ingested drug is alcohol, followed by benzodiazepines. Further targeted psychiatric involvement with young women with a known psychiatric history may be financially warranted to prevent multiple critical care admissions, although this may have no impact on the overall number of completed suicides.

Reference

1. Novack V, et al.: Eur J Intern Med 2006, 17:485-489.

Immunological manifestations in paraphenylenediamine intoxication


B Charra, A Hachimi, A Benslama, S Motaouakkil

Ibn Rochd University Hospital, Casablanca, Morocco

Critical Care 2008, 12(Suppl 2):P362 (doi: 10.1186/cc6583)

Introduction The classic pathophysiological approach to the manifestations of paraphenylenediamine (PPD) intoxication has many limits. Recently, the immunogenic role of PPD (particularly its oxidation derivatives) in the genesis of contact dermatitis and immunologic disturbances has been revealed. Our aim was to establish the immunologic profile of PPD-intoxicated patients based on monitoring of the inflammatory reaction.

Methods A prospective study of 21 patients treated for PPD intoxication in the medical resuscitation unit of Ibn Rochd University Hospital of Casablanca, carried out during 2005. Demographic, clinical, paraclinical, therapeutic and evolutive parameters as well as severity scores (SAPS II, APACHE II, OSF) were recorded for all patients, and an inflammatory work-up including the white blood cell count, C-reactive protein, the C3 and C4 complement fractions and the CD3, CD4, CD8 and CD19 lymphocyte subpopulations was performed in all of them. The kinetics of these markers were followed up and compared with the clinical and paraclinical evolution of the patients.

Results Monitoring of the inflammatory reaction in our patients showed a triphasic evolution: a first phase of inflammatory stress during the first 3 days after intoxication, characterized by relative immunodepression; a second phase from the third day, when rhabdomyolysis exerts its proinflammatory effect; and a third phase (from the sixth day) corresponding to the immunomodulatory action of PPD and its oxidative metabolism. It is a systemic inflammatory reaction with a specific cytotoxic cellular component, which would explain the secondary worsening of the clinical and paraclinical parameters of our patients (the nature of the haemodynamic shock, the cause of multivisceral failures, etc.).
Conclusions It seems that the immunological aspect may answer several questions that rhabdomyolysis alone cannot. This study attempted to establish a first immunologic profile of PPD-intoxicated patients, and to correlate it with their evolution.

Introducing rapid response teams in Slovenia

G Lokajner, D Radolic, A Zagar, P Ecimovic

Institute of Oncology, Ljubljana, Slovenia

Critical Care 2008, 12(Suppl 2):P363 (doi: 10.1186/cc6584)

Introduction Hospitals in Slovenia lack an organized approach to medical emergencies and clinical deterioration in hospitalized patients. The Institute of Oncology (IO) is the first hospital in Slovenia that is considering implementation of rapid response teams (RRT), with the intention of improving patient safety and care. The IO is a teaching hospital and the tertiary national cancer centre for Slovenia. On average, 350 outpatients and 210 inpatients are treated every day.

Methods A cross-sectional study of emergencies and clinical deteriorations in our wards from August to October 2007, with the intent to assess the situation and provide the optimal basis for introduction of a RRT. Data were collected through a report form filled out by doctors and nurses on the ward at the time of the emergency or clinical deterioration. All hospital wards were included.

Results A total of 3,140 patients were hospitalized during this 3-month period and 43 reports were returned. Most emergencies and clinical deteriorations were linked to active patient treatment - surgical and systemic therapy (chemotherapy and targeted therapies). The most common complications were sepsis (34.8%), serious hypersensitivity to therapy (30.2%), pulmonary embolism (9.3%) and bleeding (4.6%), followed by single cases of ileus, acute respiratory failure, cardiac arrest, spinal cord compression, stroke, pneumothorax, high intracranial pressure, peritonitis and acute renal failure. There were seven fatal outcomes (16.3%), all transferred to and treated in the ICU, caused mainly by late identification and treatment of sepsis, and possibly preventable.

Conclusions The incidence of emergencies and clinical deteriorations in the IO was somewhat lower than reported elsewhere [1], which can be ascribed to under-reporting. The results are useful for providing a basis for planning the most efficient and appropriate form of RRT, and for better education of ward staff to improve their awareness and immediate management of these conditions. As a result we hope to introduce a RRT by the end of 2008, as the first hospital in Slovenia to do so. We intend to continue assessing emergencies on the wards as part of quality assessment after the introduction of the RRT.

Reference

1. Vincent C, et al.: Adverse events in British hospitals: preliminary retrospective record review. BMJ 2001, 322:517-519.

Emergency call system in the hospital

M Sugiyama, Y Moriwaki, H Toyoda, T Kosuge, M Iwashita, J Ishikawa, Y Tahara, N Harunari, S Arata, N Suzuki

Yokohama City University Medical Center, Yokohama, Japan

Critical Care 2008, 12(Suppl 2):P364 (doi: 10.1186/cc6585)

Introduction Although many patients in hospital are in poor condition, their emergencies have not been managed as systematically as those of out-of-hospital patients served by the emergency medical service system. An inhospital medical emergency team (MET) system is therefore desirable. Our staff of the Critical Care and Emergency Center functioned as a voluntary MET in response to the inhospital whole-paging system. Since 2006, our hospital has adopted the emergency call system as a regular system. The objective of this study was to clarify the usefulness and problems of our MET system.

Methods We examined the medical records of our MET system (Doctor Call (DC)) for the past 1 year and 8 months.

Results Thirty-four cases were enrolled. Events occurred in wards or diagnostic and treatment rooms in the outpatient department in 29% of cases, in an examination room in 6%, and in other nonmedical areas in 65%. Patients were found by a doctor in 21%, a nurse in 18%, the patient's family in 6%, nonhospital staff including other patients in 12%, and other nonmedical staff in 43%. The reasons bystanders decided to activate the DC system were cardiac arrest in 12%, unresponsiveness in 26%, convulsion in 12%, falling down in 29%, lying down in 15%, and others in 6%. In two cases, both ward inpatients, the bystander who found the patient unresponsive and not breathing first called the ward doctor on night duty before starting the DC system, hesitating to use this system because the events occurred at night. We experienced six cases of cardiopulmonary arrest in the DC system, including these two cases; 33% survived without any functional disturbance, 49% died after temporary return of spontaneous circulation, and 18% died without return of spontaneous circulation because of acute aortic dissection in the outpatient department of cardiac surgery during consultation by a specialist. Except for these two cases, patients were managed first by bystanding doctors before DC in 25%, by the MET in 53% within 3 minutes, and by other generalists and specialists in 22% within 2 minutes.

Conclusions Although our MET system is thought to work well, it needs support from other doctors working nearer to the scene than the MET. Education of all hospital staff on the importance of the emergency call and the MET system, even at night, is thought necessary.

Relevance of a cardiac arrest team in an Indian cancer hospital ICU

S Nainan Myatra, J Divatia, R Sareen

Tata Memorial Hospital, Mumbai, India

Critical Care 2008, 12(Suppl 2):P365 (doi: 10.1186/cc6586)

Introduction Cardiopulmonary resuscitation (CPR) after cardiac arrest in cancer patients is often discouraged as it is associated with very poor outcomes. In our 560-bed tertiary cancer hospital in Mumbai, India, the ICU runs a cardiac arrest team (CAT). We reviewed our data to determine outcomes in our patients and whether it is justified to continue the CAT.

Methods All inhospital patients from June 2005 to July 2007 with unanticipated cardiorespiratory arrests and other emergencies for whom the CAT was called were included. Data were recorded using the Utstein template. Patients with anticipated progression towards arrest, and those with seizures, hypotension without dysrhythmias or dysrhythmias without hypotension, were excluded. The outcome studied was survival on hospital discharge (SOHD). Binary logistic regression analysis was performed to determine factors associated with SOHD.

Results Three hundred and sixty patients (227 males, 133 females; mean age 45.2 ± 18.3 years) were studied. The mean time interval between collapse and onset of resuscitation was 2.3 ± 2.1 minutes. The overall SOHD was 25.3%. Sixteen out of 244 patients (6.6%) with cardiac arrest and 75/116 (64.7%) patients with respiratory arrest or other emergencies had SOHD. The initial rhythm recorded during CPR was asystole in 189 patients, pulseless electrical activity in 31 patients and ventricular fibrillation/tachycardia in 24 patients, while 116 patients had other rhythms. SOHD for these rhythms was 1.6%, 3.2%, 54% and 65.6%, respectively. Cardiac arrest in medical oncology patients was associated with significantly worse SOHD than in other patients (3/117, 2.6% vs 13/127, 10.1%; P = 0.02). On univariate analysis, age, medical oncology admission and monitored arrest were not associated with SOHD. On multivariate analysis, only asystole (OR = 97.5, 95% CI = 29.0-327.5) and time to resuscitation (OR = 1.4, 95% CI = 1.17-1.67) were significantly associated with mortality (P < 0.001), while witnessed arrest and cardiac arrest were not.

Conclusions The overall survival was 25.3%. Nearly one-third of patients suffered from respiratory arrest or other emergencies, with good (64.7%) SOHD. A reduced response time was associated with improved SOHD. These considerations justify the presence of a CAT in our cancer hospital. Asystolic patients should not be resuscitated.

Outcome after ICU admission following out-of-hospital cardiac arrest in a UK teaching hospital

H Roberts, M Smithies

University Hospital of Wales, Cardiff, UK

Critical Care 2008, 12(Suppl 2):P366 (doi: 10.1186/cc6587)

Introduction Outcome after out-of-hospital cardiac arrest is poor [1]. We examined the records of patients admitted to the University Hospital of Wales ICU to compare our outcomes with the published literature and to identify risk factors for poor outcome.

Methods All patients aged >18 years who were admitted between 1 January 2004 and 31 December 2006 after out-of-hospital cardiac arrest were identified from computerised records and case notes. Patients admitted between 1 January 2007 and 31 May 2007 were studied prospectively. Demographic and outcome data were collected as well as information related to the cardiac arrest episode.

Results Sixty patients were admitted over 41 months. Twenty-one of 60 were female (male:female ratio 2:1). The mean age was 61.8 ± 15.2 years. Six patients were >80 years old. Bystander cardiopulmonary resuscitation (CPR) was attempted in 73% of cases. The response time of medical services ranged from 3 to 45 minutes (mean 10.5, median 7 min). The longest response time for a surviving patient was 6 minutes. No patient survived with a total duration of cardiac arrest >15 minutes or time without CPR >6 minutes. There were no survivors with any initial rhythm other than ventricular fibrillation/ventricular tachycardia (VF/VT). The mean ICU length of stay was 3.3 days for nonsurvivors (range 1-15 days) and 12.9 days (range 1-35 days) for survivors. Mean hospital length of stay was 4.4 days for nonsurvivors (range 1-35 days) and 31.4 days (range 1-91 days) for survivors. Overall survival to ICU discharge was 38.3% and to home discharge 33.3%. Survival in the >80-year-old group was 0% compared with 40% in those aged <80 years (P = 0.024). Survival in males was 38.5% and in females 23.8% (P = 0.25). Information on neurological outcome was available for seven of the 20 survivors. All seven received therapeutic hypothermia treatment. Five (71%) had 'good' neurological outcomes. One had a minor cognitive deficit and one required long-term nursing home care.

Conclusions The high male:female ratio may reflect the higher incidence of ischaemic heart disease in males. Gender did not affect outcome after ICU admission. Our survival rate of 33.3% is higher than the national average of 28.6% [1], but we have had no survivors over the age of 80 years or with any initial rhythm other than VF/VT. Delay to initiation of CPR (>6 min) and prolonged CPR (>15 min) were also universally associated with death in this patient cohort. Neurological outcomes of survivors appear good.

Reference

1. Nolan JP, Laver SR, Welch C, Harrison D, Gupta V, Rowan K: Case mix, outcome and activity for admissions to adult general ICUs following a cardiac arrest: a secondary analysis of the ICNARC Case Mix Programme Database. J Intens Care Soc 2007, 8:38.

Comparison of the characteristics and outcome between patients suffering from out-of-hospital primary cardiac arrest and drowning victims with cardiac arrest: an analysis of variables based on the Utstein Style for Drowning

S Grmec, M Strnad, D Podgorsek, A Jus

ZD dr. Adolfa Drolca Maribor, Slovenia

Critical Care 2008, 12(Suppl 2):P367 (doi: 10.1186/cc6588)

Introduction In 2003, ILCOR published the Utstein Style for Drowning (UFSD) to improve the understanding of epidemiology, treatment and outcome prediction after drowning. Characteristics and outcome among patients with out-of-hospital primary cardiac arrest (OHPCA) were compared with those of drowning victims with cardiac arrest (DCA), with application and evaluation of UFSD data for outcome analysis.

Methods All patients with OHPCA and DCA from February 1998 to February 2007 were included in the research and analysis. Data on patients with OHPCA and DCA were collected prospectively using the Utstein method. Data on patients with DCA were then compared with data on patients with OHPCA.

Results During the study period 788 cardiac arrests with resuscitation attempts were identified: 528 (67%) were OHPCA and 32 (4%) were DCA. The differences between patients with DCA and patients with OHPCA were as follows: patients with DCA were younger (46.5 ± 21.4 vs 62.5 ± 15.8 years; P = 0.01), less frequently suffered a witnessed cardiac arrest (9/32 vs 343/528; P = 0.03), were more often found in a nonshockable rhythm (29/3 vs 297/231; P < 0.0001), had a longer ambulance response time (11 vs 6 min; P = 0.001), had a relatively better (but not statistically significant) rate of return of spontaneous circulation in the field (22/32 (65%) vs 301/528 (57%); P = 0.33), were more often admitted to hospital (19/32 (60%) vs 253/528 (48%); P = 0.27) and had a significantly higher survival rate to hospital discharge (14/32 (44%) vs 116/528 (22%); P = 0.01). Patients with DCA had higher values of initial partial pressure of end-tidal carbon dioxide (petCO2) (53.2 ± 16.8 vs 15.8 ± 8.3 mmHg; P < 0.0001) and average petCO2 (43.5 ± 13.8 vs 23.5 ± 8.2; P = 0.002). These petCO2 values suggest an asphyxial mechanism of cardiac arrest. The analysis showed that surviving patients with DCA were younger, more often received bystander cardiopulmonary resuscitation, had a shorter call-to-arrival interval, higher petCO2 values after 1 minute of cardiopulmonary resuscitation, higher average and final petCO2 values, a lower initial serum K+ value, and more of them received vasopressin (P < 0.05), in comparison with DCA patients who did not survive.

Conclusions Patients with DCA had a better survival rate (discharge from hospital) and higher initial and average petCO2 values, and more of them had a nonshockable initial rhythm.

Trends (1998-2006) in hospital mortality for admissions to UK critical care units following cardiopulmonary resuscitation

JP Nolan1, DA Harrison2, CA Welch2, SR Laver1, K Rowan1

1Royal United Hospital, Bath, UK; 2Intensive Care National Audit and Research Centre, London, UK

Critical Care 2008, 12(Suppl 2):P368 (doi: 10.1186/cc6589)

Introduction The objective of this study was to investigate trends in hospital mortality for admissions to UK critical care units following cardiopulmonary resuscitation (CPR) for the period 1998-2006.

Methods A retrospective analysis of the Intensive Care National Audit and Research Centre Case Mix Programme Database of 480,433 admissions to 178 units in England, Wales and Northern Ireland. Admissions, mechanically ventilated in the first 24 hours in the critical care unit and admitted following CPR in the 24 hours before admission, were identified for the period 1 January 1998-31 December 2006.

Results Mechanically ventilated admissions following CPR accounted for 26,722 (5.6%) of admissions. In total 15,143 (56.7%) died on the admitting critical care unit and 18,573 (70.7%) died before ultimate discharge from the acute hospital. Over the 9 years, relative to all admissions, the proportion of patients admitted following CPR decreased from 6.6% to 5.0%; this reduction occurred mainly among those admitted following inhospital CPR. The mean age of admissions following CPR increased (from 62 to 65 years following inhospital CPR (P < 0.001) and from 58 to 62 years for out-of-hospital CPR (P < 0.001)). Hospital mortality decreased significantly from 70.5% to 69.0% (trend analysis odds ratio (95% confidence interval): 0.98 per year (0.97-0.99), P < 0.001). After adjustment for case mix, the reduction in hospital mortality following inhospital CPR remained significant (0.97 per year (0.96-0.99), P = 0.001) but not for out-of-hospital CPR (0.99 per year (0.97-1.01), P = 0.43).

Conclusions In the period 1998-2006, the crude hospital mortality for admissions to UK critical care units following CPR has decreased significantly, and for inhospital CPR this decrease remained significant after adjustment for case mix.

Return of spontaneous circulation after cardiac arrest using mechanical chest compressions with the Lund Cardiac Arrest System compared with manual chest compressions

M Beukema, P Van der Weyden, R De Wilde, MS Arbous, HI Harinck

Leiden University Medical Center, Leiden, The Netherlands

Critical Care 2008, 12(Suppl 2):P369 (doi: 10.1186/cc6590)

Introduction Experimental studies have shown improved organ perfusion with mechanical chest compression with the Lund Cardiac Arrest System (LUCAS) compared with conventional cardiopulmonary resuscitation (CPR). Few data exist on the effects on clinical outcome. From September 2006 onwards, all out-of-hospital resuscitations for cardiac arrest in the Leiden area were performed using the LUCAS in combination with continuous flow insufflations of oxygen (CFI). We studied the effect of the LUCAS-CFI on the return of spontaneous circulation (ROSC) on arrival at the hospital compared with conventional CPR. Methods From January 2007 to September 2007, data on ROSC on arrival at the hospital were collected prospectively, and were compared with historical controls, manual CPR, for January 2006-September 2006. Only patients with primary cardiac disease (ischemia or arrhythmias) were included in the analysis. Groups were compared using the chi-square test and the Mann-Whitney test. Potential confounders of the effect of the LUCAS on ROSC were tested in a univariate logistic regression model. In a multivariate logistic regression model, the effect of LUCAS was tested, corrected for confounders. Results From January 2007 to September 2007, 57 patients were resuscitated using LUCAS-CFI. Fifty-six patients were used as historical controls. Groups were comparable (Table 1) with the exception of bystander CPR. ROSC occurred in 20 (35%) patients in the LUCAS-CFI group versus 14 (25%) in the control group. In the univariate analysis, asystole significantly decreased the chance of ROSC (OR = 0.21, 95% CI = 0.05-0.96). Corrected for

Table 1 (abstract P369)

Manual CPR LUCAS-CPR P value

Age 64 61 0.40

Ventricular fibrillation (%) 56.5 65.5

Arrival time (min) 7 6 0.27

Bystander CPR (%) 45.2 69.0 0.01

confounders the LUCAS did not perform better than manual chest compression with respect to ROSC (OR = 1.25, 95% CI = 0.53-2.94).

Conclusions We found no significant difference in ROSC between LUCAS-CPR and conventional CPR with manual chest compressions.
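The between-group comparison of ROSC rates above is a 2x2 chi-square test. As an illustrative sketch only (not the authors' code), the uncorrected chi-square statistic and its df = 1 P value for the reported counts (20/57 LUCAS-CFI vs 14/56 manual CPR) can be computed in plain Python:

```python
import math

def chi2_2x2(a, b, c, d):
    """Uncorrected chi-square test for a 2x2 table [[a, b], [c, d]].

    For df = 1, the upper-tail P value is erfc(sqrt(chi2 / 2)).
    """
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2.0))
    return chi2, p

# ROSC vs no ROSC: LUCAS-CFI 20/57, manual CPR 14/56 (counts from the abstract)
chi2, p = chi2_2x2(20, 37, 14, 42)
print(round(chi2, 3), round(p, 3))  # p > 0.05: not significant
```

With Yates' continuity correction the statistic would be smaller still; either way the difference does not reach significance, consistent with the abstract's conclusion.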

Abstract withdrawn

Return of spontaneous circulation and neurological outcome after inhospital Lund University Cardiac Arrest System cardiopulmonary resuscitation

P Durnez, W Stockman, R Wynendaele, P Germonpré, P Dobbels

Hart Roeselare, Roeselare, Belgium

Critical Care 2008, 12(Suppl 2):P371 (doi: 10.1186/cc6592)

Introduction We report return of spontaneous circulation (ROSC) and neurological outcome after inhospital Lund University Cardiac Arrest System cardiopulmonary resuscitation (LUCAS-CPR). Methods From February 2006 onwards, we intended to use LUCAS-CPR for all cases of adult inhospital arrest, after arrival of the inhospital emergency team. The Glasgow Coma Scale (GCS) was used to determine neurological outcome 24 hours after discontinuing sedative drugs. Outcome at 3 and 6 months was determined by the Cerebral Performance Categories (CPC) [1]. Results Seventy-two patients received inhospital LUCAS-CPR. Twenty-two were female. The mean age was 71.46 (SD ± 11.9) years. The location of arrest was a monitored ward in 28 cases (emergency department, coronary care unit, ICU) and a general ward in 44. All but three arrests were witnessed. Because of obesity, LUCAS-CPR could not be initiated in three patients. The first rhythm was asystole in 15 patients (20.8%), pulseless electrical activity in 40 (55.5%) and ventricular tachycardia/ventricular fibrillation in 17 cases (23.6%). ROSC was obtained in 46 of 72 patients (63.8%). The GCS was favourable (14 or 15/15) in 25 cases (34.7%). The CPC at 3 months was 1 in seven patients (9.7%) and 2 in 10 patients (13.9%); one patient had a CPC of 3 and one patient a CPC of 4. The CPC at 6 months was slightly different: one patient with a CPC of 1 died, one patient with a CPC of 3 improved to a CPC of 2 after rehabilitation, and one patient with a CPC of 3 died. Conclusions LUCAS-CPR is a good alternative to manual closed-chest compression in patients with inhospital cardiac arrest. The ROSC rate (63.8%) and the rate of favourable early neurologic outcome as determined by the GCS (34.7%) are high. The CPC at 3 and 6 months revealed a good outcome (CPC 1 or 2) in 23.6%. Reference

1. Brain Resuscitation Clinical Trial II Study Group: A randomized clinical trial of calcium entry blocker administration to comatose survivors of cardiac arrest: design, methods, and patient characteristics. Control Clin Trials 1991, 12:525-545.

Etiology of prehospital cardiac arrest largely determines outcome in patients treated with mild hypothermia

HJ Busch, T Schwab, S Richter, T Bestehorn, E Kohls, K Fink, M Zehender, C Bode

Intensive Care Unit Heilmeyer, Freiburg, Germany

Critical Care 2008, 12(Suppl 2):P372 (doi: 10.1186/cc6593)

Introduction Clinical and experimental investigations have demonstrated improved neurological outcome after therapeutic mild hypothermia in patients successfully resuscitated from prehospital cardiac arrest. In these investigations only patients with prehospital cardiac arrest due to ventricular fibrillation were included. After the presentation of controlled studies, therapeutic hypothermia moved into the current international guidelines.

Methods We investigated efficacy and outcome of mild therapeutic hypothermia in the treatment of out-of-hospital cardiac arrest due to varied etiologies. We compared retrospectively 168 patients admitted in the years 2001-2006 to our medical ICU with the indication for cooling therapy after cardiac arrest. Eighty-nine patients received cooling therapy (MHT Group), 79 patients were not cooled after cardiac arrest (NO-COOL Group). Cooling was obtained by endovascular cooling device or surface cooling. Survival in the two groups and factors associated with survival were analysed.

Results In the MHT Group, survival was significantly higher (53% versus 47%, P = 0.0012). Age, duration of resuscitation and therapeutic hypothermia were independently associated with survival. In patients with a first registered rhythm of asystole (8/25 (32%) vs 2/13 (15%), P = 0.06), prolonged resuscitation, time to return of spontaneous circulation >20 minutes and prolonged time to arrival on scene, cooling therapy was associated with a significant improvement in neurological outcome. Conclusions Therapeutic hypothermia significantly improves survival and neurological outcome of out-of-hospital cardiac arrest independent of the first registered rhythm. Patients with a prolonged episode of hypoxia and a prolonged time to return of spontaneous circulation profit significantly from treatment with therapeutic hypothermia.

Good neurological recovery at ICU discharge of asystole patients treated with induced mild hypothermia

M Foedisch1, A Viehoefer1, N Richling2, M Holzer2, I Herold3, M Adriaanse4, A Van Zanten4, E Rijnsburger5, A Girbes5, S Tompot6, K Polderman3

1Evangelische Kliniken, Bonn, Germany; 2Medical University, Vienna, Austria; 3Universitair Medisch Centrum, Utrecht, The Netherlands; 4Ziekenhuis Gelderse Vallei, Ede, The Netherlands; 5Vrije Universiteit, Amsterdam, The Netherlands; 6KCI Europe Holding BV, Amstelveen, The Netherlands Critical Care 2008, 12(Suppl 2):P373 (doi: 10.1186/cc6594)

Introduction Induced mild hypothermia therapy (MHyT) improves neurological outcome in postresuscitation cardiac arrest patients with ventricular fibrillation/tachycardia (VF/VT). Patients with asystole were excluded from earlier studies due to poor overall outcome [1,2]. The present study enrolled both patients with asystole or VT/VF; one of the objectives was to assess neurological function at ICU discharge.

Methods A prospective multicenter single-arm registry in 49 patients with witnessed cardiac arrest who were selected for MHyT. Patients had to be >18 years old and unconscious at ICU/emergency room admission (GCS < 8), with a time interval between cardiac arrest and initiation of MHyT treatment <6 hours. Informed consent was obtained from the patient or legal representative. Neurological status was documented upon emergency room arrival, during MHyT treatment, at ICU discharge and at hospital discharge as measured by the Glasgow Outcome Scale (GOS). Temperature measurements were continuously taken throughout the MHyT treatment via bladder catheter. For the MHyT treatment the Deltatherm® device (KCI, San Antonio, TX, USA) was used.

Results Of the 49 patients included in the registry 31 (63%) had VF/VT and 17 (35%) had asystole as first rhythm. In one patient (2%) the rhythm was unknown. Good neurological recovery at ICU discharge (GOS 5 and 4) was seen in 22 (45%), three patients (6%) were neurologically impaired (GOS 3), six patients (12%) were in a vegetative state (GOS 2) and 16 (33%) of the patients

died. In two patients (4%) the neurologic outcome was unknown. Good neurological recovery was seen in 48% (n = 15) of patients with VF/VT and in 41% (n = 7) of patients with asystole. Conclusions MHyT improves neurological outcome in patients with witnessed cardiac arrest regardless of the initial rhythm. Favorable results in VT/VF patients were similar to preceding studies [1,2]. Hypothermia also appears to provide neurological protection in patients presenting with asystole. References

1. Hypothermia after Cardiac Arrest Registry Group: N Engl J Med 2002, 346:549-556.

2. Bernard SA, et al.: N Engl J Med 2002, 346:557-563.

Use of hypothermia for out-of-hospital cardiac arrest survivors: a single-centre experience

A Gupta, S Morrish, E Thomas

Derriford Hospital, Plymouth, UK

Critical Care 2008, 12(Suppl 2):P374 (doi: 10.1186/cc6595)

Introduction Out-of-hospital cardiac arrest patients have a poor prognosis. Recent randomised controlled trials have shown that moderate hypothermia improves neurologic outcome and survival in selected patients after cardiac arrest [1]. Therapeutic hypothermia is now recommended by the ALS Task Force of the ILCOR and incorporated in the American and European resuscitation guidelines for postresuscitation care.

Methods Case notes of all OHCA patients admitted alive to our ICU in 2006 were retrospectively analysed. All patients received standard care including adequate sedation and mechanical ventilation. Mild hypothermia was initiated as soon as possible and ideally maintained at 32-34°C for 12-24 hours with a combination of cold saline, cooling blanket and ice packs. Patients were allowed to passively rewarm. The institution's ethics committee approved the study. Discharged survivors were followed up for 6 months, and neurologic outcome was evaluated using the Glasgow Outcome Score (GOS).

Results Twenty-five patients were admitted following OHCA. Twenty (80%) fulfilled our cooling criteria. Eight patients had cooling initiated in A&E whereas nine had cooling initiated in the ICU and three were not cooled. Five patients were at target temperature on arrival in A&E. Of the 17 patients who had cooling initiated in hospital, the target temperature was achieved in only 15 patients. In patients where cooling was initiated in A&E, the median time to reach the target temperature from hospital admission was 4.25 hours; and when cooling was initiated in the ICU, it was 6.25 hours. A temperature above 37.5°C was noted in 12 (48%) patients during rewarming. Seven (28%) had a favourable outcome and were discharged from the hospital with a GCS of 15 and all had a GOS of 5 at 6 months. Five out of seven survivors were cooled. The cause of death was hypoxic brain injury in 15 (60%) and cardiogenic shock in three (12%) patients. All deaths occurred in hospital following treatment limitation decisions at (median (range)) 41 (9.5-94) hours.

Conclusions Our mortality rate of 72% was higher than that in the HACA study but the same as that reported by the Intensive Care National Audit and Research Centre. All survivors had a good neurologic outcome. Using this method of cooling we failed to achieve consistent hypothermia. Preventing rebound hyperthermia was difficult, and many treatment limitation decisions were made before 72 hours. Reference

1. Hypothermia after Cardiac Arrest Study Group: Mild therapeutic hypothermia to improve the neurologic outcome after cardiac arrest. N Engl J Med 2002, 346:549-556.

Therapeutic hypothermia preserves the brain by reducing nitric oxide synthase after asphyxial cardiac arrest in rats

D Ndjekembo Shango, S Hachimi-Idrissi, G Ebinger, Y Michotte, L Huyghens

UZ Brussels, Belgium

Critical Care 2008, 12(Suppl 2):P375 (doi: 10.1186/cc6596)

Introduction Induced therapeutic hypothermia (TH) following cardiac arrest (CA) is the only strategy that has demonstrated improvement in outcomes. The mechanism by which TH, when applied after reperfusion, exerts its cell-protective effect after CA remains unclear. To elucidate these mechanisms, dopamine, glutamate (as a marker of excitatory amino acid overflow) and the citrulline/arginine ratio (CAR, as a marker of nitric oxide synthase activity) were measured during reperfusion after asphyxial CA in a sham-operated group of rats, a normothermic group and a hypothermic group. The effect of TH on the histological findings in the rat brain 24 hours and 7 days post insult was also analyzed.

Methods Anesthetized rats were exposed to 8 minutes of asphyxiation including 5 minutes of CA. The CA was reversed, with restoration of spontaneous circulation achieved by brief external heart massage and ventilation within 2 minutes. Results After the insult and during reperfusion, the extracellular concentrations of glutamate and dopamine, as determined by microdialysis in the rat striatum, increased up to 3,000% and 5,000%, respectively, compared with the baseline values in the normothermic group. However, when TH was induced for a period of 60 minutes after the insult and restoration of spontaneous circulation, the glutamate and dopamine concentrations were not significantly different from those in the sham group. The CAR increased up to fivefold compared with the basal value in the normothermic group and only 2.5-fold in the hypothermic group. However, in the sham-operated group this ratio remained low and stable throughout the experiment. Histological analysis of the brain showed that TH reduced brain damage and ischemic neurons, as well as astroglial cell proliferation.

Conclusions TH induced after asphyxial CA mitigates the excitotoxic process, and diminishes nitric oxide synthase activity and brain damage as well as astroglial cell proliferation.

Cerebral blood flow and cerebrovascular reactivity during hypothermia after cardiac arrest

L Bisschops, C Hoedemaekers, K Simons, J van der Hoeven

Radboud University Nijmegen Medical Centre, Nijmegen, The Netherlands

Critical Care 2008, 12(Suppl 2):P376 (doi: 10.1186/cc6597)

Introduction Anoxic neurological injury is a major cause of morbidity and mortality in cardiac arrest patients. After restoration of spontaneous circulation, pathophysiological changes in cerebral perfusion appear, known as postresuscitation syndrome. In the delayed hypoperfusion phase, the cerebral blood flow is reduced, resulting in a potential mismatch between cerebral oxygen supply and demand, and secondary neurological damage. The aim of this study was to assess cerebral blood flow and cerebrovascular reactivity to changes in PaCO2 in patients after cardiac arrest treated with mild hypothermia.

Methods We performed an observational study in 10 adult comatose patients after out-of-hospital cardiac arrest. All patients

were cooled to 32-34°C for 24 hours, followed by passive rewarming. Blood flow velocity in the middle cerebral artery (MCA) was measured by transcranial Doppler. Oxygen saturation in the jugular bulb (SjbO2) was measured by repeated blood sampling. Hypocapnia and hypercapnia were induced by a 20% increase and decrease in minute ventilation for 20 minutes. Data are expressed as the mean ± SEM. Changes over time were analysed by ANOVA. The relation between MCA velocity and SjbO2 was determined by linear regression analysis.

Results We present the results of the first five patients. All patients were male, with a mean age of 66 ± 5 years. Ventricular fibrillation was the cause of cardiac arrest in all patients. The mean time from collapse to return of spontaneous circulation was 25 ± 15 minutes. At the start of the experiment, the mean flow velocity (MFV) in the MCA was low (32.2 ± 9.6 cm/s), increasing significantly to 62.5 ± 11.3 cm/s at 48 hours (P < 0.001). The SjbO2 at the start of the experiment was 68.2 ± 4.0%, increasing significantly to 79.7 ± 3.8% at 48 hours (P < 0.001). Regression analysis showed that the change in SjbO2 correlated significantly with the change in PaCO2 (P < 0.001). A 1 kPa decrease in PaCO2 resulted in a 9.5% decrease in SjbO2. A decrease in PaCO2 also resulted in decreased flow velocities in the MCA (P = 0.09). Conclusions During mild hypothermia after cardiac arrest, the MFV in the MCA is low, suggesting active cerebral vasoconstriction. Cerebrovascular reactivity to changes in PaCO2 is preserved in comatose cardiac arrest patients during mild hypothermia. Hyperventilation may induce cerebral ischemia in the postresuscitation period.
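The regression finding above (a 9.5% change in SjbO2 per kPa change in PaCO2) is a least-squares slope. A minimal sketch with hypothetical paired data (the study's raw measurements are not given in the abstract):

```python
def ls_slope(x, y):
    """Least-squares regression slope of y on x (slope = Sxy / Sxx)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

# Hypothetical paired changes (kPa, % saturation), constructed so a 1 kPa
# fall in PaCO2 gives roughly a 9.5% fall in SjbO2, as in the abstract
d_paco2 = [-1.0, -0.5, 0.0, 0.5, 1.0]
d_sjbo2 = [-9.5, -4.6, 0.1, 4.9, 9.4]
print(round(ls_slope(d_paco2, d_sjbo2), 2))  # slope in % per kPa
```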

Early therapeutic hypothermia in sudden cardiac death and following return of spontaneous circulation

A Luzzani, E Polati, G Larosa, F Righetti, G Castelli

Osp Maggiore Borgo Trento, Verona, Italy

Critical Care 2008, 12(Suppl 2):P377 (doi: 10.1186/cc6598)

Introduction The aim of this study is to assess the role of therapeutic hypothermia on neurological outcome in patients who experienced sudden cardiac death (SCA) with ensuing return of spontaneous circulation.

Methods Thirty adult patients, aged between 18 and 85 years, referred to our ICU after SCA due to cardiac disease with following return of spontaneous circulation were randomly allocated to the following treatment groups: patients in group 1 were treated immediately after admission with therapeutic hypothermia plus standard treatment, patients in group 2 received only standard treatment. All patients at entry presented with GCS 3. Neurological outcome was assessed on discharge and after 6 months, by means of the GOS scale (0 = dead, 1 = vegetative,

2 = severely disabled, 3 = moderately disabled, 4 = good recovery). We considered scores 0-1 as unfavourable outcome and scores 2-4 as favourable outcome. To compare the two groups we used the Mann-Whitney U test for continuous variables and the chi-square test for qualitative variables.

Results Patients in group 1 (15 patients) and in group 2 (15 patients) were statistically comparable for sex (P = 0.16), age (P = 0.77) and presenting ECG rhythm (P = 0.63). Median GOS values at discharge from the ICU were 2 (interquartile range 25-75% (IR) 1-3) in group 1 and 1 (IR 0-1) in group 2 (P = 0.06). Median GOS values at 6 months were 3 (IR 1-4) in group 1 and 1 (IR 1-2) in group 2 (P = 0.09). The patients who improved their GOS values were 9/15 (60%) in group 1 and 2/15 (13.3%) in group 2 (P < 0.003).

Conclusions Our study demonstrated that early treatment with therapeutic hypothermia improves neurological outcome in patients after SCA.

Effect of therapeutic mild hypothermia initiated by cold fluids after cardiac arrest on haemostasis as measured by thrombelastography

A Spiel, A Kliegel, A Janata, T Uray, C Wandaller, F Sterz, A Laggner, B Jilma

Medical University of Vienna, Austria

Critical Care 2008, 12(Suppl 2):P378 (doi: 10.1186/cc6599)

Introduction Application of mild hypothermia (32-33°C) has been shown to improve neurological outcome in patients with cardiac arrest, and is therefore a class IIa recommendation in the treatment of those patients. However, hypothermia affects haemostasis, and even mild hypothermia has been found to be associated with bleeding and increased transfusion requirements in patients undergoing surgery [1]. On the other hand, crystalloid hemodilution has been shown to enhance the coagulation onset [2]. Currently, it is unknown in which way the induction of mild therapeutic hypothermia by a bolus infusion of cold crystalloids affects the coagulation system of patients with cardiac arrest. Methods This was a prospective pilot study in 18 patients with cardiac arrest and return of spontaneous circulation. Mild hypothermia was initiated by a bolus infusion of cold 0.9% saline (4°C; 30 ml/kg/30 min) and maintained for 24 hours. At 0 hours (before hypothermia), 1 hour, 6 hours and 24 hours we assessed the prothrombin time, activated partial thromboplastin time (aPTT) and platelet count, and performed thrombelastography (ROTEM®) after in vitro addition of heparinase. Thrombelastography yields information on the cumulative effects of various blood compounds (for example, coagulation factors, haematocrit, platelets, leukocytes) involved in the coagulation process. Results A total amount of 2,527 (± 527) ml of 0.9% saline was given. Platelet counts dropped by 27% (P < 0.01) after 24 hours. The haematocrit significantly decreased after 1 hour (P < 0.05) due to hemodilution and returned thereafter to baseline values. The aPTT increased 2.7-fold after 1 hour (P < 0.01), mainly due to administration of heparins. No ROTEM® parameter changed significantly over the time course. None of the patients developed bleeding complications during the observation period. Conclusions Despite significant changes in haematocrit, platelet count and aPTT, thrombelastographic parameters were not altered during the course of mild hypothermia. Therapeutic hypothermia initiated by cold crystalloid fluids therefore has only minor overall effects on the coagulation system in patients with cardiac arrest. References

1. Schmied H, et al.: Lancet 1996, 347:289-292.

2. Ruttmann TG, et al.: Br J Anaesth 1996, 76:412-414.

Immunomodulatory effects of esmolol in a septic animal model due to multidrug-resistant Pseudomonas aeruginosa pyelonephritis

G Dimopoulos, T Tsaganos, M Theodorakopoulou, H Tzepi, M Lignos, A Armaganidis, E Giamarellos-Bourboulis

Medical School, University of Athens, Greece

Critical Care 2008, 12(Suppl 2):P379 (doi: 10.1186/cc6600)

Methods Eighty white rabbits underwent induction of pyelonephritis by multidrug-resistant Pseudomonas aeruginosa and were classified into a pretreatment (PT) group (n = 40; esmolol infused immediately after pyelonephritis induction) and a treatment (T) group (n = 40; esmolol infusion started 2 hours after pyelonephritis induction). The PT group comprised group A (n = 10; control, N/S 0.9% infusion), group B (n = 10; esmolol infusion), group C (n = 10; amikacin infusion) and group D (n = 10; esmolol and amikacin infusion); the T group comprised groups E, F, G and H receiving the corresponding treatments. Serum malondialdehyde (MDA) was estimated at serial time intervals by the thiobarbiturate assay followed by HPLC analysis. The animals were under survival follow-up every 12 hours for the next 21 days. After death, quantitative organ cultures were performed.

Results Median (IQR) MDA at 24 hours was 1.95 (0.75), 0.78 (1.79), 1.55 (1.60) and 0.12 (0.24) µmol/ml in groups A, B, C and

Figure 1 (abstract P379)

Pretreatment group.

Figure 2 (abstract P379)

Introduction The infusion of esmolol (a hyperselective β1-blocker) is associated with immunomodulatory effects [1].

Treatment group.

D, respectively. Respective values at 48 hours were 2.60 (2.00), 1.40 (2.36), 3.15 (3.00) and 0.25 (0.20) µmol/ml. At 24 hours, the median (IQR) MDA of groups E, F, G and H was 2.80 (5.74), 0.32 (0.87), 0.61 (5.83) and 0.19 (2.75) µmol/ml, respectively. Tissue bacterial load was similar within groups. See Figures 1 and 2. Conclusions In the present septic model, esmolol prolonged survival probably by exerting an immunomodulatory effect as assessed by reduced oxidative stress, without any effect on tissue bacterial load. Reference

1. Suzuki T, et al.: Crit Care Med 2005, 33:2294-2300.

Applying the 2003 SCCM/ESICM/ACCP/ATS/SIS instead of the 1992 ACCP/SCCM sepsis definitions increases the numbers of patients with systemic inflammatory response syndrome shock and septic shock but decreases mortality rates

M Weiss, M Taenzer, K Traeger, J Altherr, B Hay, M Kron, M Huber-Lang, M Schneider

University Hospital, Ulm, Germany

Critical Care 2008, 12(Suppl 2):P380 (doi: 10.1186/cc6601)

Introduction To compare the prevalence of patients suffering from different stages of systemic inflammatory response syndrome (SIRS) and sepsis applying the original 1992 ACCP/SCCM and the revised 2003 SCCM/ESICM/ACCP/ATS/SIS sepsis definitions. Methods Set in a university adult ICU; patients were postoperative/post-traumatic critically ill patients admitted to the ICU from October 2006 to October 2007. No interventions were used. A total of 714 patients were surveyed using computer assistance with respect to the different stages of SIRS and sepsis using the 1992 and the 2003 sepsis definitions, respectively.

Results Within the same patient collective, applying the 2003 definitions instead of the 1992 definitions, the prevalence of no SIRS (11 vs 110, respectively), no SIRS due to infection (sepsis) (0 vs 12), SIRS (129 vs 169) and sepsis (18 vs 52) decreased, and the prevalence of severe SIRS (169 vs 86), SIRS shock (121 vs 65) and septic shock (216 vs 168) increased. The prevalence of severe sepsis was comparable with both definitions (50 vs 52). Applying the 2003 definitions in patients with SIRS shock and septic shock, the mortality rates of 17% and 25% were markedly lower than the rates of 23% and 30%, respectively, under the 1992 definitions. Compared with patients classified as being without SIRS shock and septic shock, the risk of mortality was markedly elevated in those patients classified as being in SIRS shock or septic shock with the 2003 definitions but not with the 1992 definitions (odds ratio = 5.0, CI = 2.2-11.2, P < 0.0001). Conclusions Replacing the original 1992 sepsis definitions with the 2003 revised sepsis definitions may result in an increased prevalence of severe SIRS, SIRS shock and septic shock. However, the mortality rates of patients with SIRS shock and septic shock will be lower. References
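The odds ratio with 95% CI quoted above follows the standard Wald construction. A sketch of that arithmetic only; the 2x2 counts below are hypothetical, since the abstract does not report the underlying table:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table [[a, b], [c, d]] with a Wald 95% CI:
    exp(ln OR +/- z * sqrt(1/a + 1/b + 1/c + 1/d))."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts (deaths/survivors by shock classification), chosen
# only to make the formula visible, not taken from the abstract
or_, lo, hi = odds_ratio_ci(40, 60, 10, 75)
print(round(or_, 1), round(lo, 1), round(hi, 1))
```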

1. American College of Chest Physicians/Society of Critical Care Medicine Consensus Conference: Definitions for sepsis and organ failure and guidelines for the use of innovative therapies in sepsis. Crit Care Med 1992, 20:864-874.

2. Levy MM, Fink MP, Marshall JC, et al.: 2001 SCCM/ESICM/ ACCP/ATS/SIS International Sepsis Definitions Conference. Intensive Care Med 2003, 29:530-538.

Systemic inflammatory response syndrome as a clinical detection tool and inclusion criterion in sepsis trials: too blunt an instrument?

P Gille-Johnson, K Hansson, B Gardlund

Karolinska Hospital, Stockholm, Sweden

Critical Care 2008, 12(Suppl 2):P381 (doi: 10.1186/cc6602)

Introduction The term systemic inflammatory response syndrome (SIRS) was introduced in 1992 [1] and has frequently served as a criterion for enrollment in sepsis trials. In the present study, the prevalence of SIRS in patients with significant bacterial infections and in patients with septic shock was assessed. Methods A cohort of 404 adult patients admitted to the Department of Infectious Diseases from the emergency room (ER) for suspected severe infection was studied prospectively. Of the SIRS variables, white blood cells (WBC) were measured on arrival while the physiological variables (temperature, heart rate (HR) and respiratory rate (RR)) were recorded on arrival to the ER and every 4 hours for 24 hours. In another cohort of 36 consecutive adults with vasopressor-dependent septic shock, the presence of SIRS criteria during 24 hours around the start of vasopressors was evaluated.

Results Bacterial infections requiring antibiotic treatment were diagnosed in 306 patients in the ER cohort. Nonbacterial infection or noninfection was diagnosed in 82 patients. In 16 patients, no diagnosis could be verified. Significant bacteremia was detected in 68 patients; the most common pathogens were pneumococci and Escherichia coli. Of the 306 patients with a verified bacterial infection and of the 68 with verified bacteremia, 26% and 21%, respectively, failed to meet two or more of the SIRS criteria on arrival in the ER. SIRS on arrival correlated significantly with bacterial infection, but not with bacteremia. Only RR and WBC contributed significantly to this finding; HR and temperature did not. In the septic shock group, all patients eventually fulfilled the HR and RR criteria but only 23/36 (64%) reached the temperature criterion and 25/36 (69%) the WBC criterion during 24 hours. Conclusions SIRS correlated with a subsequently verified bacterial infection requiring antibiotic treatment, but only the RR and WBC criteria contributed to this finding. As a tool for defining sepsis and selecting patients for enrollment in clinical sepsis trials, SIRS is nonspecific, and for >3 fulfilled criteria it lacks sensitivity. It may be time to abandon the SIRS criteria in selecting patients for sepsis trials and instead focus on more strict definitions of underlying infections in association with sepsis-related hypoperfusion and organ dysfunction. Reference
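The 1992 SIRS definition applied above (two or more of abnormal temperature, heart rate, respiratory rate and white cell count) can be sketched as a small checker. This is a simplified illustration: the PaCO2 alternative for tachypnoea and the immature-band alternative for WBC are deliberately omitted.

```python
def sirs_criteria_met(temp_c, hr, rr, wbc):
    """Count how many of the four 1992 ACCP/SCCM SIRS criteria are met.

    Simplified: PaCO2 and immature-band alternatives omitted; wbc in 10^9/l.
    """
    met = 0
    if temp_c > 38.0 or temp_c < 36.0:
        met += 1                      # temperature criterion
    if hr > 90:
        met += 1                      # heart rate criterion
    if rr > 20:
        met += 1                      # respiratory rate criterion
    if wbc > 12.0 or wbc < 4.0:
        met += 1                      # white blood cell criterion
    return met

def has_sirs(temp_c, hr, rr, wbc):
    """SIRS requires two or more fulfilled criteria."""
    return sirs_criteria_met(temp_c, hr, rr, wbc) >= 2

print(has_sirs(37.0, 110, 24, 9.0))   # HR + RR criteria met -> True
print(has_sirs(37.2, 80, 24, 8.0))    # RR criterion only -> False
```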

1. Bone RC, et al.: Chest 1992, 101:1644-1655.

Pharmacokinetic-pharmacodynamic analysis of human purified C1-esterase inhibitor in patients with sepsis

L Dolzhenkova, N Lazareva, A Igonin

Sechenov Medical Academy, Moscow, Russian Federation Critical Care 2008, 12(Suppl 2):P382 (doi: 10.1186/cc6603)

Introduction Several randomized prospective studies showed some beneficial protective effects of exogenous human purified C1-esterase inhibitor (C1INH) in patients with sepsis [1,2]. Our purpose was to evaluate the influence of systemic inflammation on the pharmacokinetics-pharmacodynamics of C1INH in patients with sepsis. Methods C1INH (Bicizar®; BioGenius LLC, Russia) was administered at a total dosage of 12,000 U over 48 hours (scheme

of infusion: 6,000 U, 3,000 U, 2,000 U, 1,000 U every 12 hours) to 13 patients meeting ACCP/SCCM sepsis criteria during the first 24 hours after hospitalization. C1INH activity and C3, C4, IL-6 and procalcitonin levels were measured at baseline, 5 minutes, 30 minutes, 1 hour, 2 hours, 4 hours, 6 hours, 8 hours and 10 hours after C1INH intravenous infusion. The ratio of Cinitial to Cmax 0-10 hours reflected changes in C1INH activity after 6,000 U infusion. AUC 0-10 hours was calculated after correction of the C1INH activity-time curve to baseline.
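The baseline-corrected AUC 0-10 hours described above amounts to a trapezoidal integration of the activity-time curve after subtracting the t = 0 value. A sketch with hypothetical C1INH activities (the study's individual curves are not reported):

```python
def auc_baseline_corrected(times_h, activity):
    """Trapezoidal AUC of an activity-time curve after subtracting the
    baseline (t = 0) value; one way to compute an AUC 0-10 hours."""
    baseline = activity[0]
    corrected = [a - baseline for a in activity]
    auc = 0.0
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        auc += 0.5 * (corrected[i] + corrected[i - 1]) * dt
    return auc

# Hypothetical C1INH activities (U/ml) at the study's sampling times:
# baseline, 5 min, 30 min, 1, 2, 4, 6, 8 and 10 hours after infusion
times = [0.0, 5 / 60, 0.5, 1.0, 2.0, 4.0, 6.0, 8.0, 10.0]
activity = [1.0, 3.0, 2.5, 2.0, 1.8, 1.5, 1.3, 1.2, 1.1]
print(round(auc_baseline_corrected(times, activity), 4))  # in U-hours/ml
```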

Results The median C1INH maximal shift after the first infusion was 55% (38-75%), as reflected by Cinitial/Cmax 0-10 hours. The calculated AUC 0-10 hours was 8.8 U-hours/ml (4.6-14.5 U-hours/ml). In patients with lower Cinitial, 1.69 U/ml (0.96-2.65 U/ml), levels of C3 (r = 0.69, P < 0.01) and C4 (r = 0.67, P < 0.05) at baseline were likely to be decreased. A direct correlation between the C3 level and Cinitial/Cmax 0-10 hours (r = 0.49, P < 0.05) as well as an inverse correlation with AUC 0-10 hours (r = -0.613, P < 0.05) were found. A significant correlation of Cinitial/Cmax 0-10 hours with the baseline procalcitonin was also observed (r = 0.57, P < 0.05). Conclusions The shift in C1INH activity after 6,000 U infusion of purified protein was likely to be connected with baseline complement activity in sepsis. Initial C3 and C4 depletion was associated with increased C1INH activity. The pharmacokinetic-pharmacodynamic profile of human purified C1INH might also be influenced by the severity of the systemic inflammatory response. These factors could have some implication for a dosage-adjustment strategy. References

1. Caliezi C, et al.: Crit Care Med 2002, 30:1722-1728.

2. Zeerleder S, et al.: Clin Diagn Lab Immunol 2003, 10:529-535.

Nitrite consumption and production in the cardiopulmonary circulation during hemorrhagic shock and resuscitation

CR Phillips1, CN Wong1, TT Oyama2, S Anderson2, GD Giraud1, WE Holden2

1Oregon Health & Science University, Portland, OR, USA;

2Portland Veterans Administration Medical Center, Portland, OR, USA

Critical Care 2008, 12(Suppl 2):P383 (doi: 10.1186/cc6604)

Introduction Nitrite (NO2-) is reduced to nitric oxide (NO) by deoxyhemoglobin, and resynthesized in blood by oxidation of NO in the presence of ceruloplasmin. The central circulation seems a probable site for nitrite consumption and repletion during periods of oxidative stress and recovery such as that seen in hemorrhagic shock and resuscitation (HSR). We asked whether NO2- is consumed in the central circulation during hemorrhage, and reconstituted during resuscitation.

Methods Male Sprague-Dawley rats (n = 13) were anesthetized, ventilated via tracheostomy, and then underwent HSR by withdrawing venous blood to a target systolic pressure of 40% of baseline, waiting 30 minutes and then resuscitating with saline to the prebleed mean arterial pressure. Whole blood NO2- (arteriovenous NO2-) and exhaled NO (NOexh) (measured by chemiluminescence), blood gases and hemodynamics were sampled at baseline, at the end of hemorrhage, after 20 minutes of autoresuscitation, and after saline resuscitation. Mass flow of NO2- (mass NO2-) across the central circulation was calculated as the product of the arteriovenous difference and blood flow. Results Figure 1 shows changes (± SEM) in hemodynamics, arterial and venous whole blood nitrite, and NOexh during HSR. Mass flow of NO2- decreased acutely with hemorrhage and NOexh

Figure 1 (abstract P383)

SP (mmHg) Flow (ml/min) NO2- A (µM) NO2- V (µM) A-V NO2- (µM) Mass NO2- (µM/min) NOexh (nl/l)

Control 131±4 45±5 0.62±.07 0.57±.06 0.05±.02 1.85±1.2 0.8±0.2

Hemorrhage 87±6 28±4 0.56±.04* 0.56±.05 0.00±.02** 0.20±0.7 0.9±0.2*

Auto resus 75±5 26±4 0.76±.08# 0.65±.06 0.12±.05 1.90±1.3# 0.6±0.1#

Saline resus 90±7 41±6 0.63±.07 0.63±.08 0.00±.05 1.50±0.3 0.9±0.2

*P = 0.06 vs control, **P < 0.02 vs control, #P < 0.04 vs hemorrhage

increased, suggesting consumption of NO2- to NO across the central circulation. Conversely, during autoresuscitation, mass flow increased and NOexh decreased - suggesting production of NO2-. Conclusions Our findings are consistent with the hypothesis that NO2- consumption to NO is involved in the hemodynamic response to HSR. We also provide evidence that the lung is a major site of repletion of the NO2- pool, presumably by oxidation of NO to NO2- during both autoresuscitation and saline resuscitation.
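The mass-flow calculation described in the Methods (the arteriovenous NO2- difference multiplied by blood flow) can be sketched as follows. This is an illustrative sketch only: the function name and the sample values are ours, not the study's, although the unit arithmetic (µM × ml/min = nmol/min) holds generally.

```python
def nitrite_mass_flow_nmol_min(no2_arterial_um, no2_venous_um, flow_ml_min):
    """Mass flow of NO2- across a vascular bed.

    Concentrations in uM (umol/l) and flow in ml/min multiply out
    directly to nmol/min, since 1 uM * 1 ml/min = 1 nmol/min.
    A positive result means net consumption across the bed.
    """
    a_v_difference_um = no2_arterial_um - no2_venous_um
    return a_v_difference_um * flow_ml_min

# Illustrative numbers only (not taken from the abstract):
# an A-V difference of 0.05 uM at 45 ml/min gives ~2.25 nmol/min.
flux = nitrite_mass_flow_nmol_min(0.62, 0.57, 45.0)
```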

Postconditioning in a rat model of gut ischemia-reperfusion

P Diemunsch, O Collange, M Kindo, A Steib, F Piquard, B Geny

Strasbourg University Hospital, Strasbourg, France Critical Care 2008, 12(Suppl 2):P384 (doi: 10.1186/cc6605)

Introduction Ischemic postconditioning has been shown to protect several organs (heart, brain, liver) from prolonged ischemia-reperfusion-induced damage. However, a possible protective effect of postconditioning after gut ischemia-reperfusion remains to be demonstrated. In this study, we evaluated an ischemic postconditioning protocol in a rat model of gut ischemia-reperfusion.

Methods Male Wistar rats (300 g) were randomized into three groups of eight rats: control (C), gut ischemia-reperfusion (IR) and gut IR plus postconditioning (IR-postC). A laparotomy was performed under ketamine anesthesia in all rats. The superior mesenteric artery (SMA) was then occluded for 60 minutes and reperfused for 60 minutes in both the IR and IR-postC groups. Postconditioning consisted of a succession of three ischemia (30 seconds) and reperfusion (120 seconds) periods. At the end of reperfusion, mesenteric and systemic blood was sampled for lactate measurement. Lactate levels were compared using Student's t test.

Results All animals survived the duration of the study. Gut IR produced a significant increase in mesenteric lactate (LacM), 3.9 versus 1.34 mmol/l (P < 0.0001), and in systemic lactate (LacS), 4.2 versus

Figure 1 (abstract P384)

0.9 mmol/l (P = 0.007). There was no significant difference in lactate between the IR and IR-postC groups: LacM: 3.4 mmol/l (P = 0.35); LacS: 3.4 mmol/l (P = 0.66). See Figure 1. Conclusions This postconditioning protocol was not effective in reducing hyperlactatemia in our rat model of gut IR. Further studies will be needed to determine whether postconditioning might be a therapeutic alternative in cases of gut ischemia-reperfusion.
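The lactate comparison above used a Student test. A minimal pooled two-sample t statistic can be computed with the Python standard library alone; the function and the sample values below are purely illustrative and are not the study's data.

```python
import math
from statistics import mean, variance

def students_t(sample_a, sample_b):
    """Pooled two-sample Student's t statistic (equal-variance form)."""
    n_a, n_b = len(sample_a), len(sample_b)
    # Pooled variance weights each sample variance by its degrees of freedom.
    pooled_var = ((n_a - 1) * variance(sample_a) +
                  (n_b - 1) * variance(sample_b)) / (n_a + n_b - 2)
    standard_error = math.sqrt(pooled_var * (1 / n_a + 1 / n_b))
    return (mean(sample_a) - mean(sample_b)) / standard_error

# Made-up lactate values (mmol/l) for two hypothetical groups of rats:
control = [1.1, 1.3, 1.5, 1.4]
ischemia = [3.6, 4.0, 4.4, 3.8]
t_statistic = students_t(ischemia, control)  # positive: ischemia group higher
```

The t statistic would then be compared against the t distribution with n_a + n_b - 2 degrees of freedom to obtain a P value.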

Endotoxin-induced activation of hypoxia-inducible factor 1α in cultured human hepatocytes and monocytes: impact on cellular and mitochondrial respiration

T Regueira, PM Lepper, S Brandt, J Takala, SM Jakob, S Djafarzadeh

Bern University Hospital (Inselspital), University of Bern, Switzerland Critical Care 2008, 12(Suppl 2):P385 (doi: 10.1186/cc6606)

Introduction Hypoxia-inducible factor 1α (HIF1α) is a transcription factor activated by hypoxia. HIF1α coordinates cell adaptation to hypoxia and modulates cellular metabolism and respiration. Recent data suggest that HIF1α may also be activated via proinflammatory mediators and Toll-like receptors under normoxic conditions. The aim of this study was to evaluate whether lipopolysaccharide (LPS) could increase HIF1α expression in a time-dependent and dose-dependent manner in cultured human hepatocytes and monocytes, and to determine a possible role in the modulation of cellular respiration.

Methods Cultured human hepatocytes (HepG2) and monocytes (MM6) were exposed to cobalt chloride, hypoxia (1.5% oxygen) and different concentrations of LPS. The time-course expression of HIF1a was determined by western blotting. Mitochondrial respiration was assessed after cell permeabilization with a protocol of stimulation-inhibition of each mitochondrial complex using the Oxygraph 2K (Oroboros Instruments, Innsbruck, Austria) and Datlab 4.2 software for data acquisition.

Results Hypoxia, cobalt chloride and LPS induced accumulation of HIF1α in both cell lines in comparison with controls. In monocytes, HIF1α was detected after 4 hours of normoxic LPS incubation at a concentration of 1 mg/ml. In cultured hepatocytes, HIF1α was detected after 2 hours of normoxic LPS incubation at a concentration of 1 mg/ml. Cellular respiration of permeabilized cultured hepatocytes was not affected after 6 hours (complex I and II respiratory control ratio (RCR)-dependent respiration: controls: 1.7 ± 0.4 vs LPS: 2.2 ± 0.4 and controls: 2.4 ± 1 vs LPS: 3.4 ± 0.5, respectively, P > 0.05, n = 5) or after 24 hours (complex I and II RCR-dependent respiration: controls: 2 ± 0.4 vs LPS: 1.9 ± 0.5 and controls: 3.9 ± 1.7 vs LPS: 4.5 ± 2.6, respectively, P > 0.05, n = 6) of normoxic LPS 1 mg/ml incubation. Conclusions LPS induces the expression of HIF1α in human monocytes and hepatocytes under normoxic conditions. Exposing hepatocytes to LPS (1 mg/ml) for 6 and 24 hours does not impair their cellular respiration.
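The respiratory control ratios reported above are conventionally computed as the ratio of ADP-stimulated (state 3) to resting (state 4) oxygen flux; the sketch below assumes that convention, since the abstract does not spell out the Oroboros protocol arithmetic, and the example fluxes are invented for illustration.

```python
def respiratory_control_ratio(state3_flux, state4_flux):
    """RCR = ADP-stimulated (state 3) / resting (state 4) O2 flux.

    Both fluxes must share the same units (e.g. pmol O2/s per million
    cells); a higher ratio indicates tighter coupling of respiration
    to ATP synthesis.
    """
    if state4_flux <= 0:
        raise ValueError("state 4 flux must be positive")
    return state3_flux / state4_flux

# Illustrative fluxes only (not the study's data):
rcr = respiratory_control_ratio(55.0, 22.0)  # 2.5, a well-coupled sample
```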

Induction of hypoxia-inducible factor 1α by Toll-like receptors in human dendritic cells

S Djafarzadeh, R Spirig, T Regueira, J Takala, SM Jakob, R Rieben, PM Lepper

Bern University Hospital (Inselspital), University of Bern, Switzerland Critical Care 2008, 12(Suppl 2):P386 (doi: 10.1186/cc6607)

Introduction Nonhypoxic stimuli can induce the expression of hypoxia-inducible factor 1α (HIF1α). Only recently has it been

demonstrated that lipopolysaccharide (LPS) induces the expression of HIF1α in macrophages and that the induction of hypoxic genes in macrophages is also Toll-like receptor 4 (TLR4)-dependent. We hypothesized that HIF1α expression is induced in dendritic cells in a TLR-dependent manner, plays a crucial role in linking the innate with the adaptive immune system and may also influence mitochondrial respiration.

Methods Human monocyte-derived immature dendritic cells (iDC) were stimulated with different TLR ligands (hyaluronic acid (HA), LPS or lipoteichoic acid) under normoxia. Furthermore, iDC were incubated under hypoxic conditions (1.5% oxygen) for the same time points with or without additional stimulation with LPS. HIF1α expression was examined by western blot at 2, 4, 6, 8, 12 and 24 hours after TLR stimulation. In parallel, the cells were analyzed for the expression of the costimulatory molecules and maturation markers CD80 and CD86 by flow cytometry (FACScan; B&D). Finally, iDC were incubated in the presence or absence of 1 mg/ml LPS, and mitochondrial respiration of digitonin-permeabilized iDC was determined using the Oxygraph 2K (Oroboros Instruments, Innsbruck, Austria) and DatLab 4.2 software for data acquisition and analysis. Results All tested TLR ligands stimulated the expression of HIF1α in a time-dependent manner. Interestingly, TLR-induced HIF1α expression levels under normoxia were even higher than under hypoxia. Hyaluronic acid, LPS and lipoteichoic acid led to dendritic cell maturation, as shown by CD80 and CD86 induction. LPS also increased complex II-dependent mitochondrial respiration of iDC (complex II respiratory control ratio: 1.5 ± 0.5 for controls vs 3.8 ± 1.2 for LPS, P < 0.05; n = 3).

Conclusions The current data demonstrate that HIF1α expression in dendritic cells is induced under normoxic conditions via TLR2 and TLR4 agonists in a time-dependent manner. LPS also increases complex II-dependent mitochondrial respiration of dendritic cells.

Association between ATP production and oxidative mtDNA damage through mitochondrial respiratory chain in the rat caecal ligation and puncture heart injury model

J Hirata, M Oya, J Kotani, T Yamada, A Hashimoto, T Ueda, M Terashima, S Marukawa

Hyogo College of Medicine, Nishinomiya, Japan

Critical Care 2008, 12(Suppl 2):P387 (doi: 10.1186/cc6608)

Introduction Undisturbed generation of ATP, produced through NADH dehydrogenase in the respiratory chain, is required for the homeostasis of aerobic metabolism. In the failing heart, excess production of reactive oxygen species (ROS) can cause oxidative modification of mtDNA, such as 8-oxo-dGTP, which can lead to defects in DNA replication. On the other hand, free radical scavengers such as polyethylene glycol catalase (PEG-CAT) have been associated with improved DNA replication, through hydrolysis of 8-oxo-dG by the human functional homologue of the MutT protein (hMTH1, MutT homologue 1; MTH-1), in the rat caecal ligation and puncture (CLP) model. However, the association between oxidative mtDNA damage and ATP production has not been clearly characterized in the rat CLP heart injury model. Methods Sepsis was induced by CLP. Adult male Sprague-Dawley rats (n = 20) were treated with or without the free radical scavenger PEG-CAT (an H2O2 scavenger) 4 hours after CLP. We measured cardiomyocyte generation of MTH-1 by RT-PCR, the NAD/NADH ratio, ATP production and 8-oxo-dG by HPLC, with or without inhibition of ROS by PEG-CAT, in the rat CLP heart injury model.

Table 1 (abstract P387)

Effects of PEG-CAT on myocarditis

            group          group          group
NAD/NADH    148.1 ± 0.48   128.5 ± 1.05   90.2 ± 1.03
ATP         2,560 ± 2.3    1,960 ± 2.2    1,127 ± 1.8
8-oxo-dG    1.06 ± 0.83    1.07 ± 0.62    1.71 ± 0.20

Results Both the NAD/NADH ratio and the ATP production level of the PEG-CAT(+) group were significantly increased compared with the PEG-CAT(-) group (P < 0.05), but these levels had not normalized. The 8-oxo-dG level of the PEG-CAT(-) group was increased compared with the PEG-CAT(+) group (P < 0.05). The MTH-1 mRNA level of the PEG-CAT(+) group was significantly higher than that of the PEG-CAT(-) group (P < 0.05). See Table 1. Conclusions Mitochondrial ATP production may be inhibited by ROS-induced mtDNA damage acting through the respiratory chain. Reference

1. Lazzarino G, et al.: Single-sample preparation for simultaneous cellular redox and energy state determination. Anal Biochem 2003, 322:51-59.

Cecal ligation and perforation, when disrupting proper abscess formation, provides highly reproducible results and has common features with human data

S Johnson, P Kubes

University of Calgary, Calgary, AB, Canada

Critical Care 2008, 12(Suppl 2):P388 (doi: 10.1186/cc6609)

Introduction Sepsis-induced acute respiratory distress syndrome is generally accepted to be caused by neutrophil sequestration in the lung microvasculature with resultant pulmonary endothelial damage from neutrophilic enzymes and metabolites. Developing models to study this condition accurately is crucial if therapeutic goals are to be achieved. Currently, endotoxemia models such as systemic lipopolysaccharide (LPS) injection predominate in the study of this condition due to the poor reproducibility of septic models such as cecal ligation and perforation (CLP). Methods A method of CLP designed to inhibit proper abscess formation was compared against intraperitoneal injection of 0.5 mg/kg LPS using C57BL/6 mice at various time points up to 24 hours. Outcomes included circulating leukocyte counts, lung myeloperoxidase levels, and a multitude of cytokines and chemokines using Luminex technology. Septic human plasma from patients in the ICU was also analyzed for comparison using Luminex technology. Results LPS-treated mice consistently demonstrated earlier and greater peaks in MPO, TNFα, IL-1α, IL-5, IL-6, IL-10, MIP-1α, MCP-1, and RANTES levels, which were shorter lived than in our CLP model, which consistently demonstrated steadily increasing levels over time. Interestingly, IL-17 levels were observed to peak at 424.3 ± 7.0 pg/ml in our CLP model but only reached 31.7 ± 17.8 pg/ml in the LPS model, which was comparable with the control value. Our CLP model demonstrated multiple comparable trends in cytokine and chemokine levels with the septic human plasma data, taking into account the differences in time-point collection. The most apparent trend was the high and consistently elevated IL-6 level, found to be 11,528.7 ± 955.3 pg/ml in septic C57BL/6 mice and

11,718.7 ± 4,511.0 pg/ml in septic human patients. Conclusions Systemic LPS effects are very robust and short-lived; therefore, this model is not as relevant as CLP with respect to

human sepsis. Furthermore, septic effects such as those seen with IL-17 are not observed in LPS models. Here, we demonstrate that CLP with abscess impairment can be highly reproducible and comparable with human data. References