Publications
2011
OBJECTIVES
The aim of this study was to compare single- versus dual-chamber implantable cardioverter-defibrillator (ICD) implantation and complication rates in a large, real-world population.
BACKGROUND
The majority of patients enrolled in ICD efficacy trials received single-chamber devices. Although dual-chamber ICDs offer theoretical advantages over single-chamber defibrillators, the clinical superiority of dual-chamber models has not been conclusively proven, and they may increase complications.
METHODS
The National Cardiovascular Data Registry ICD Registry was used to examine the association between baseline characteristics and device selection in 104,049 patients receiving single- and dual-chamber ICDs between January 1, 2006, and December 31, 2007. A longitudinal cohort design was then used to determine in-hospital complication rates.
RESULTS
Dual-chamber devices were implanted in 64,489 patients (62%). Adverse events were more frequent with dual-chamber than with single-chamber device implantation (3.17% vs. 2.11%, p < 0.001), as was the rate of in-hospital mortality (0.40% vs. 0.23%, p < 0.001). After adjusting for demographics, medical comorbidities, diagnostic test data, and ICD indication, the odds of any complication (odds ratio: 1.40; 95% confidence interval: 1.28 to 1.52; p < 0.001) and in-hospital mortality (odds ratio: 1.45; 95% confidence interval: 1.20 to 1.74; p < 0.001) were increased with dual-chamber versus single-chamber ICD implantation.
CONCLUSIONS
In this large, multicenter cohort of patients, dual-chamber ICD use was common. Dual-chamber device implantation was associated with increases in periprocedural complications and in-hospital mortality compared with single-chamber defibrillator selection.
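The adjusted odds ratios reported above come from multivariable logistic regression on the registry data. Below is a minimal sketch of that kind of analysis on simulated data; the column names and covariates are hypothetical placeholders, and the registry's actual covariate set is far larger.

```python
# Sketch: adjusted odds ratio for dual- vs. single-chamber ICD complications.
# Data and column names are hypothetical; this is not the registry analysis itself.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

np.random.seed(0)
df = pd.DataFrame({
    "complication": np.random.binomial(1, 0.03, 1000),   # any in-hospital complication
    "dual_chamber": np.random.binomial(1, 0.62, 1000),   # 1 = dual-chamber device
    "age": np.random.normal(67, 11, 1000),
    "ef": np.random.normal(30, 10, 1000),                # ejection fraction (%)
})

# Logistic regression adjusting for covariates; exp(coefficient) is the adjusted OR.
model = smf.logit("complication ~ dual_chamber + age + ef", data=df).fit(disp=False)
or_est = np.exp(model.params["dual_chamber"])
ci_low, ci_high = np.exp(model.conf_int().loc["dual_chamber"])
print(f"adjusted OR {or_est:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```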
View on PubMed
2011
PURPOSE
To prospectively compare adequacy of colonic cleansing, adequacy of solid stool and fluid tagging, and patient acceptance by using reduced-volume, 2-L polyethylene glycol (PEG) versus magnesium citrate bowel preparations for CT colonography.
MATERIALS AND METHODS
This study was approved by the institutional Committee on Human Research and was compliant with HIPAA; all patients provided written consent. In this randomized, investigator-blinded study, 50 patients underwent oral preparation with either a 2-L PEG or a magnesium citrate solution, tagging with oral contrast agents, and subsequent CT colonography and segmentally unblinded colonoscopy. The residual stool (score 0 [best] to 3 [worst]) and fluid (score 0 [best] to 4 [worst]) burden and tagging adequacy were qualitatively assessed. Residual fluid attenuation was recorded as a quantitative measure of tagging adequacy. Patients completed a tolerance questionnaire within 2 weeks of scanning. Preparations were compared for residual stool and fluid by using generalized estimating equations; the Mann-Whitney test was used to compare the qualitative tagging score, mean residual fluid attenuation, and adverse effects assessed on the patient experience questionnaire.
RESULTS
The mean residual stool score (0.90 of a maximum of 3) and residual fluid score (1.05 of a maximum of 4) for PEG were similar to those for magnesium citrate (0.96 [P = .58] and 0.98 [P = .48], respectively). However, the mean fecal and fluid tagging scores were significantly better for PEG (0.48 and 0.28, respectively) than for magnesium citrate (1.52 [P < .01] and 1.28 [P < .01], respectively). Mean residual fluid attenuation was higher for PEG (765 HU) than for magnesium citrate (443 HU; P = .01), and mean interpretation time was shorter for PEG (14.8 minutes) than for magnesium citrate (18.0 minutes; P = .04). Tolerance ratings did not differ significantly between preparations.
CONCLUSION
Reduced-volume PEG and magnesium citrate bowel preparations demonstrated adequate cleansing effectiveness for CT colonography, with better tagging and shorter interpretation time observed in the PEG group. Adequate polyp detection was maintained but requires further validation because of the small number of clinically important polyps.
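The analysis above pairs generalized estimating equations (which handle the correlation of segment-level scores within a patient) with Mann-Whitney tests for per-patient measures. A minimal sketch under those assumptions, with simulated scores and placeholder variable names:

```python
# Sketch: segment-level scores compared with a GEE clustered on patient, plus a
# Mann-Whitney U test on per-patient tagging scores. All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
n_pat, n_seg = 50, 6   # 50 patients, 6 colonic segments each (assumed layout)
df = pd.DataFrame({
    "patient": np.repeat(np.arange(n_pat), n_seg),
    "prep": np.repeat(rng.integers(0, 2, n_pat), n_seg),  # 0 = PEG, 1 = Mg citrate
})
df["stool_score"] = rng.integers(0, 4, len(df))  # residual stool, 0 (best) to 3

# GEE accounts for correlation among segments within the same patient.
gee = smf.gee("stool_score ~ prep", groups="patient", data=df,
              cov_struct=sm.cov_struct.Exchangeable()).fit()
print(f"GEE p-value for preparation effect: {gee.pvalues['prep']:.3f}")

# Mann-Whitney on one tagging score per patient (placeholder values).
peg = rng.normal(0.5, 0.3, 25)
mgc = rng.normal(1.5, 0.4, 25)
print(f"Mann-Whitney p-value: {mannwhitneyu(peg, mgc, alternative='two-sided').pvalue:.3f}")
```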
View on PubMed
2011
We have observed in clinical practice that Native Americans require lower dosages of tacrolimus to attain similar target blood trough levels compared with whites after renal transplant. Because there are no pharmacokinetic studies of tacrolimus in this ethnic group, we investigated whether this clinical observation could be corroborated by pharmacokinetic differences between Native Americans and other ethnic and racial groups. We recruited 24 adult Native American kidney transplant recipients on stable oral doses of tacrolimus for at least 1 month posttransplant. We conducted a 12-h steady-state pharmacokinetic profile for each patient and estimated pharmacokinetic parameters using NONMEM. The concentration-time data were fit to a linear two-compartment model with first-order absorption and lag time using an empirical Bayesian approach. The mean estimate of oral clearance (CL/F) was 11.1 L/h; compared with previously reported data, the Native American cohort had approximately one-third the clearance of other ethnic and racial groups. Our pharmacokinetic study shows that the clinically observed low tacrolimus dosing in Native American renal transplant patients is associated with decreased oral tacrolimus clearance. Little is known about the genetic or environmental characteristics unique to this group that affect pharmacokinetics, and elucidating these factors would further facilitate individualized treatment with tacrolimus and with a wide range of other drugs that share similar clearance processes.
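The fitted model above is a standard linear two-compartment model with first-order absorption and a lag time. The sketch below simulates one 12-h dosing interval using the reported mean CL/F of 11.1 L/h; every other value (volumes, intercompartmental clearance, absorption rate, lag, dose) is an illustrative assumption, not a study estimate.

```python
# Sketch: linear two-compartment model, first-order absorption with lag time.
# CL/F = 11.1 L/h is the reported mean; all other parameters are assumed for
# illustration only and are NOT the study's empirical Bayesian estimates.
import numpy as np
from scipy.integrate import solve_ivp

CL, Vc, Vp, Q = 11.1, 30.0, 100.0, 15.0   # clearances (L/h) and volumes (L), apparent
ka, tlag, dose = 1.5, 0.5, 4.0            # absorption rate (1/h), lag (h), dose (mg)

def rates(t, y):
    a_d, a_c, a_p = y                          # amounts in depot, central, peripheral (mg)
    absorb = ka * a_d if t >= tlag else 0.0    # absorption begins after the lag time
    return [-absorb,
            absorb - (CL / Vc) * a_c - (Q / Vc) * a_c + (Q / Vp) * a_p,
            (Q / Vc) * a_c - (Q / Vp) * a_p]

t = np.linspace(0, 12, 121)                    # one 12-h dosing interval
sol = solve_ivp(rates, (0, 12), [dose, 0.0, 0.0], t_eval=t, max_step=0.1)
conc = sol.y[1] / Vc * 1000                    # central concentration, mg/L -> ng/mL
print(f"Cmax ~ {conc.max():.1f} ng/mL at t = {t[conc.argmax()]:.1f} h")
```

With these placeholder values the curve only illustrates the model's shape; in the study, the empirical Bayesian parameter estimates take their place.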
View on PubMed