Purpose
To evaluate differences in liver enhancement between patients at low and high morbidity risk and to determine the relationship between the severity of liver dysfunction and the relative ratio of liver to aortic enhancement (RE) on MRI using a hepatocyte-specific contrast agent.

Materials and Methods
A total of 126 patients underwent magnetic resonance imaging (MRI) and blood serum testing, including serology, bilirubin, international normalized ratio, and creatinine tests. Radiologists analyzed a region of interest in the liver and aorta on precontrast and 10- and 20-minute delayed hepatobiliary phase MR images. Liver enhancement at 10 minutes (LE10min) and at 20 minutes (LE20min) was compared between the low- and high-risk groups by independent t-test. Regression analysis was used to assess the relationship between the Model for End-stage Liver Disease (MELD) score and RE.

Results
All 126 patients were classified into either the low-risk group (MELD <8; n = 85) or the high-risk group (MELD ≥8; n = 41). Mean LE10min and LE20min were significantly higher in the low-risk group (471.61; 95% confidence interval [CI]: 449.79-493.43 and 510.69; 95% CI: 486.51-534.87, respectively) than in the high-risk group (401.68; 95% CI: 364.75-438.61 and 413.81; 95% CI: 370.91-456.70). There was a moderate inverse correlation between MELD score and the relative ratio of liver enhancement (RLE) (r = -0.5442; 95% CI: -0.6480 to -0.4207; P < 0.01), but a high positive correlation between MELD score and RE (r = 0.7470; 95% CI: 0.6665-0.8102; P < 0.01).

Conclusion
Although liver enhancement was significantly greater in low-risk patients than in high-risk patients, RE may be a better predictor of liver function than RLE. J. Magn. Reson. Imaging 2014;39:24-30. © 2013 Wiley Periodicals, Inc.
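The statistical workflow summarized above (an independent t-test of liver enhancement between risk groups, and a correlation between MELD score and RE) can be sketched as below. This is a minimal illustration on synthetic data, not the study's analysis: the exact formulas for RE and RLE are not given in the abstract, so the definitions in the helper functions are assumptions, and all numbers are randomly generated.

```python
import numpy as np
from scipy import stats

def liver_enhancement(liver_post, liver_pre):
    # Assumed definition: change in liver ROI signal from precontrast
    # to the delayed hepatobiliary phase (formula not stated in the abstract).
    return liver_post - liver_pre

def relative_enhancement(liver_post, aorta_post):
    # Assumed definition: ratio of liver to aortic ROI signal on the
    # hepatobiliary phase (formula not stated in the abstract).
    return liver_post / aorta_post

rng = np.random.default_rng(0)

# Synthetic LE10min values, loosely modeled on the reported group means
# (low-risk ~471, high-risk ~402); group sizes match the abstract.
le_low_risk = rng.normal(470.0, 100.0, 85)    # MELD < 8
le_high_risk = rng.normal(400.0, 115.0, 41)   # MELD >= 8

# Independent two-sample t-test between the risk groups.
t_stat, p_ttest = stats.ttest_ind(le_low_risk, le_high_risk)

# Synthetic MELD scores and RE values with a built-in positive trend,
# mimicking the reported positive MELD-RE correlation.
meld = rng.uniform(6.0, 20.0, 126)
re = 1.0 + 0.05 * meld + rng.normal(0.0, 0.1, 126)
r_re, p_re = stats.pearsonr(meld, re)
```

In practice the per-patient LE and RE values would come from ROI measurements on the precontrast and delayed images, with `liver_enhancement` and `relative_enhancement` applied per patient before the group comparison and regression.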
All Science Journal Classification (ASJC) codes
- Radiology, Nuclear Medicine and Imaging