Subscribe: Nephrology Dialysis Transplantation - current issue
http://ndt.oxfordjournals.org/rss/current.xml

Nephrology Dialysis Transplantation Current Issue

Published: Fri, 10 Feb 2017 00:00:00 GMT

Last Build Date: Fri, 10 Feb 2017 10:46:33 GMT

Announcements

2017-02-10

News from ERA-EDTA:




Erythropoiesis stimulating agents and nephroprotection: is there any room for new trials?

2017-02-10

The paper by Fliser et al. [1], published in this issue of Nephrology Dialysis Transplantation, assessed the effect of a low dose of the continuous erythropoiesis receptor activator (CERA) on the rate of decline of kidney function in patients with moderate chronic kidney disease (CKD) [estimated glomerular filtration rate (eGFR) between 30 and 59 mL/min/1.73 m2]. The other key inclusion criterion was that patients should either have type 2 diabetes mellitus or have undergone kidney transplantation at least 6 months prior to study entry.



Is trough level variability the new tool for identifying patients at risk for rejection after transplantation?

2017-02-10

We know from long experience that the overall success of transplantation is determined during the first few months post-transplantation, when acute rejection episodes occur most frequently. Immunosuppression therefore has to be adequate and more intensive during this early period than at later times. Early targeted trough levels are higher than in later periods, and consequently a higher risk of infection, with more frequent infectious episodes, is an accepted trade-off. Moreover, early non-adherence to immunosuppression has been reported to be associated with acute rejection and premature graft loss [1].



Long-term risks of kidney living donation: review and position paper by the ERA-EDTA DESCARTES working group

2017-02-10

Abstract
Two matched cohort studies from the USA and Norway, published in 2014, raised concerns about the long-term safety of kidney living donation, and further studies on these long-term risks have since been published. In this position paper, Developing Education Science and Care for Renal Transplantation in European States (DESCARTES) board members critically review the literature to summarize current knowledge of the long-term risks of kidney living donation, in order to support physicians in decision-making and in informing prospective live donors. The long-term risk of end-stage renal disease (ESRD) can be partially foreseen by trying to identify donors at risk of developing ‘de novo’ kidney diseases post-donation and by predicting lifetime ESRD risk. However, lifetime risk may be difficult to assess in young donors, especially those with first-degree relatives with ESRD. The study from Norway also found an increased risk of death after living donor nephrectomy, which became visible only after >15 years of post-donation follow-up; these findings, however, are likely to be largely an overestimation due to confounding by a family history of renal disease. DESCARTES board members emphasize the importance of optimal risk–benefit assessment and proper information for the prospective donor, which should also include recommendations on health-promoting behaviour post-donation.
Keywords: kidney transplant, live donor, living donor, long-term risk, outcome



Early low-dose erythropoiesis-stimulating agent therapy and progression of moderate chronic kidney disease: a randomized, placebo-controlled trial

2017-02-10

Abstract
Background. It is unknown whether early intervention with low-dose erythropoiesis-stimulating agents (ESAs) in non-anaemic patients delays progression of chronic kidney disease (CKD).
Methods. In a single-blind, 24-month trial, adults with estimated glomerular filtration rate (eGFR) 30–59 mL/min/1.73 m2 and either Type 2 diabetes mellitus or previous kidney transplantation were randomized to low-dose continuous erythropoiesis receptor activator (CERA; monthly dose 30–75 µg; n = 115) or placebo (n = 120). The primary endpoint was the annual change in eGFR (abbreviated Modification of Diet in Renal Disease formula).
Results. Mean (standard deviation) eGFR was 40.7 (9.8) mL/min/1.73 m2 versus 39.8 (9.2) mL/min/1.73 m2 at baseline for CERA and placebo, respectively, and 39.0 (11.6) mL/min/1.73 m2 versus 39.7 (10.6) mL/min/1.73 m2 at the final visit. The median (interquartile range) annual reduction in eGFR was 0.5 (−2.2, 3.8) mL/min/1.73 m2 with CERA versus 0.4 (−2.0, 3.2) mL/min/1.73 m2 with placebo (P = 0.657). No significant difference in the annual change in eGFR was observed between treatment groups in the subpopulations with Type 2 diabetes or kidney transplant. Adverse events with a suspected relation to study drug occurred in 22.0% and 16.2% of patients randomized to CERA or placebo, respectively, and adverse events led to study drug discontinuation in 11.0% and 8.5% of patients.
Conclusions. Patients with moderate CKD and Type 2 diabetes or previous kidney transplantation showed stable renal function that was unaffected by administration of low-dose ESA. In addition, there was no clinically meaningful effect of 2-year low-dose ESA treatment on albuminuria, an important surrogate marker of kidney injury.
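The trial's primary endpoint relies on the abbreviated (4-variable) MDRD equation for eGFR. As a minimal sketch of how such a value is computed, assuming the common IDMS-traceable form with constant 175 (the trial's exact calibration is not stated in the abstract):

```python
def egfr_mdrd(scr_mg_dl: float, age_years: float, female: bool, black: bool = False) -> float:
    """Abbreviated (4-variable) MDRD estimate of GFR in mL/min/1.73 m2.

    Assumes the IDMS-traceable constant 175; other calibrations exist.
    """
    egfr = 175.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742   # sex coefficient
    if black:
        egfr *= 1.212   # ethnicity coefficient in the original equation
    return egfr

# Example: a 60-year-old man with serum creatinine 1.8 mg/dL falls inside the
# trial's eGFR 30-59 mL/min/1.73 m2 inclusion window.
print(round(egfr_mdrd(1.8, 60, female=False), 1))  # -> 38.7
```

Note that the same creatinine value maps to a lower eGFR for women (the 0.742 factor), which is why trials specify the equation rather than a raw creatinine cut-off.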



Association between e-alert implementation for detection of acute kidney injury and outcomes: a systematic review

2017-01-14

Abstract
Background. Electronic alerts (e-alerts) for acute kidney injury (AKI) in hospitalized patients are increasingly being implemented; however, their impact on outcomes remains uncertain.
Methods. We performed a systematic review. Electronic databases and grey literature were searched for original studies published between 1990 and 2016. Randomized, quasi-randomized, observational and before-and-after studies that included hospitalized patients, implemented e-alerts for AKI and described their impact on at least one of care processes, patient-centred outcomes or resource utilization measures were included.
Results. Our search yielded six studies (n = 10 165 patients). E-alerts were generally automated, triggered through electronic health records and not linked to clinical decision support. In pooled analysis, e-alerts did not improve mortality [odds ratio (OR) 1.05; 95% confidence interval (CI) 0.84–1.31; n = 3 studies; n = 3425 patients; I2 = 0%] or reduce renal replacement therapy (RRT) use (OR 1.20; 95% CI 0.91–1.57; n = 2 studies; n = 3236 patients; I2 = 0%). Isolated studies reported improvements in selected care processes. Pooled analysis found no significant differences in prescribed fluid therapy.
Conclusions. In the available studies, e-alerts for AKI do not improve survival or reduce RRT utilization. The impact of e-alerts on processes of care was variable. Additional research is needed to understand those aspects of e-alerts that are most likely to improve care processes and outcomes.
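The pooled odds ratios above come from inverse-variance meta-analysis on the log-OR scale. A minimal fixed-effect sketch of that pooling, using hypothetical study values rather than the review's actual data:

```python
import math

def pooled_or(odds_ratios, cis, z=1.959964):
    """Fixed-effect inverse-variance pooling of study odds ratios.

    Each study's standard error is recovered from its 95% CI on the log
    scale; weights are the inverse variances of the log-ORs.
    """
    log_ors, weights = [], []
    for or_i, (lo, hi) in zip(odds_ratios, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * z)  # SE from CI width
        log_ors.append(math.log(or_i))
        weights.append(1.0 / se ** 2)
    w_sum = sum(weights)
    pooled = sum(w * b for w, b in zip(weights, log_ors)) / w_sum
    se_pooled = math.sqrt(1.0 / w_sum)
    ci = (math.exp(pooled - z * se_pooled), math.exp(pooled + z * se_pooled))
    return math.exp(pooled), ci

# Hypothetical two-study example (illustrative numbers, not the review's):
or_pooled, (ci_lo, ci_hi) = pooled_or([1.10, 1.00], [(0.80, 1.51), (0.70, 1.43)])
```

With I2 = 0% as reported, a fixed-effect and a random-effects pooling coincide, which is why the simpler estimator is sketched here.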



Association of the combination of time-weighted variability of tacrolimus blood level and exposure to low drug levels with graft survival after kidney transplantation

2016-12-26

Abstract
Background. The variability of tacrolimus blood levels has been shown to be associated with inferior graft survival. However, the effect of variability during the early post-transplantation period has not been evaluated. We sought to evaluate the association between time-weighted variability in the early post-transplantation period and graft survival. We also explored the interaction between drug level variability and exposure to inadequate drug levels.
Methods. This retrospective cohort study included all patients who underwent kidney transplantation in the Rabin Medical Center and were treated with tacrolimus. The time-weighted coefficient of variability (TWCV) was defined as the time-weighted standard deviation divided by the mean drug level. Univariate and multivariate Cox proportional hazards models were used, with patient and graft survival as the primary outcome.
Results. The study population included 803 patients who underwent kidney transplantation between 1 January 2000 and 29 September 2013. The high tertile of TWCV of tacrolimus blood levels was associated with reduced graft survival by univariate and multivariate analyses [hazard ratio (HR) 1.69, 95% confidence interval (CI) 1.14–2.53, P = 0.01 and HR 1.74, 95% CI 1.14–2.63, P = 0.01, respectively]. The interaction between high TWCV and exposure to inadequately low drug levels was significantly associated with reduced survival (P = 0.004), while the interaction between TWCV and high drug blood levels was not. One hundred and thirty patients (16.2%) had the combination of high TWCV and exposure to low drug values (<5 ng/mL). These patients had reduced graft survival by univariate and multivariate analyses (HR 2.42, 95% CI 1.57–3.74, P < 0.001 and HR 2.6, 95% CI 1.65–4.11, P < 0.001, respectively).
Conclusions. The combination of high TWCV and exposure to low drug levels might identify high-risk patients in the early post-transplantation period.
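The TWCV can be sketched directly from its stated definition (time-weighted standard deviation divided by the mean drug level). The weighting scheme below, in which each trough level covers the interval until the next draw and the final draw gets a one-day weight, is an illustrative assumption, not the paper's stated scheme:

```python
def twcv(levels, days):
    """Time-weighted coefficient of variability (%) of drug trough levels.

    Follows the abstract's definition: time-weighted standard deviation
    divided by the mean drug level. Each level is weighted by the interval
    until the next draw (assumption); the last draw gets weight 1 day.
    """
    weights = [t2 - t1 for t1, t2 in zip(days, days[1:])] + [1.0]
    total = sum(weights)
    mean = sum(x * w for x, w in zip(levels, weights)) / total
    var = sum(w * (x - mean) ** 2 for x, w in zip(levels, weights)) / total
    return 100.0 * var ** 0.5 / mean

# Stable troughs give 0% variability; swinging troughs give a high TWCV.
print(twcv([8.0, 8.0, 8.0], [0.0, 7.0, 14.0]))            # -> 0.0
print(round(twcv([12.0, 4.0, 8.0], [0.0, 7.0, 14.0]), 1))  # -> 48.3
```

Time-weighting matters because trough draws are irregularly spaced early after transplantation; an unweighted coefficient of variation would over-count clusters of closely spaced measurements.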



Lifetime risk of renal replacement therapy in Europe: a population-based study using data from the ERA-EDTA Registry

2016-12-24

Abstract
Background. Upcoming KDIGO guidelines for the evaluation of living kidney donors are expected to move towards a personal risk-based evaluation of potential donors. We present the age- and sex-specific lifetime risk of renal replacement therapy (RRT) for end-stage renal disease in 10 European countries.
Methods. We defined lifetime risk of RRT as the cumulative incidence of RRT up to age 90 years. We obtained RRT incidence rates per million population by 5-year age groups and sex using data from the European Renal Association–European Dialysis and Transplant Association (ERA-EDTA) Registry, and used these to estimate the cumulative incidence of RRT, adjusting for the competing risk of mortality.
Results. Lifetime risk of RRT varied from 0.44% to 2.05% at age 20 years and from 0.17% to 1.59% at age 70 years across countries, and was twice as high in men as in women. Lifetime RRT risk decreased with age, ranging from an average of 0.77% to 0.44% in 20- to 70-year-old women, and from 1.45% to 0.96% in 20- to 70-year-old men. The lifetime risk of RRT increased slightly over the past decade, more so in men than in women; however, it appears to have stabilized or even decreased slightly in more recent years.
Conclusions. The lifetime risk of RRT decreased with age, was lower in women than in men of equal age and varied considerably throughout Europe. Given the substantial differences in lifetime risk of RRT between the USA and Europe, country-specific estimates should be used in the evaluation and communication of the risk of RRT for potential living kidney donors.
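Cumulative incidence adjusted for competing mortality can be sketched from age-band hazards as follows; the exponential-within-band formula is a standard approximation (not necessarily the Registry's exact method), and the rates used are hypothetical, not Registry values:

```python
import math

def lifetime_rrt_risk(rrt_hazards, death_hazards, band_years=5.0):
    """Cumulative incidence of RRT with death as a competing risk.

    rrt_hazards and death_hazards are per-person-year rates for
    consecutive age bands of band_years each (hypothetical values below).
    """
    rrt_free_alive = 1.0   # probability of reaching the band alive and RRT-free
    cum_inc = 0.0
    for h_rrt, h_death in zip(rrt_hazards, death_hazards):
        h_total = h_rrt + h_death
        if h_total <= 0:
            continue
        p_any = 1.0 - math.exp(-h_total * band_years)          # any event in band
        cum_inc += rrt_free_alive * p_any * (h_rrt / h_total)  # share that is RRT
        rrt_free_alive *= 1.0 - p_any
    return cum_inc

# Hypothetical flat hazards over 14 five-year bands (ages 20-90): competing
# mortality pushes the lifetime RRT risk below the no-death figure.
risk = lifetime_rrt_risk([0.0001] * 14, [0.01] * 14)
risk_no_death = lifetime_rrt_risk([0.0001] * 14, [0.0] * 14)
```

This is why lifetime risk falls with the starting age in the abstract: older entrants face fewer remaining RRT-bearing bands and more competing mortality in each of them.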



A reliable method to assess the water permeability of a dialysis system: the global ultrafiltration coefficient *

2016-11-10

Abstract
Background: Recent randomized controlled trials suggest that sufficiently high-convection post-dilutional haemodiafiltration (HC-HDF) improves survival in dialysis patients; consequently, this technique is increasingly being adopted. However, when performing HC-HDF, rigorous control of the ultrafiltration setting is required. Assessing the global ultrafiltration coefficient of the dialysis system [GKD-UF; defined as the ultrafiltration rate (QUF) divided by the transmembrane pressure], i.e. its water permeability, can be adapted to present dialysis settings and be of value in clinics.
Methods: GKD-UF was determined, and its reproducibility, variability and influencing factors were specifically assessed, in 15 stable patients routinely treated by high-flux haemodialysis or HC-HDF in a single unit.
Results: GKD-UF invariably followed a parabolic function with increasing QUF in dialysis and in both pre- and post-dilution HC-HDF (R2 constantly >0.96). The vertex of the parabola (GKD-UF-max) and the related QUF were very reproducible per patient (coefficients of variation 3.9 ± 0.6% and 3.3 ± 0.3%, respectively) but varied greatly across patients (31–42 mL/h/mmHg and 82–100 mL/min, respectively). GKD-UF-max and its associated QUF decreased during dialysis treatment (P < 0.01). The GKD-UF-max decrease was related to weight loss (R2 = 0.66; P = 0.0015).
Conclusions: GKD-UF is a reliable and accurate method to assess the water permeability of a dialysis system in vivo. It varies according to the dialysis setting and patient-related factors. It is an objective parameter evaluating the forces driving convection, and it identifies any deviation of the system during the treatment procedure. It is applicable to low- or high-flux dialysis as well as pre- or post-dilution HDF. Thus, it may be used to describe the characteristics of a dialysis system, is suitable for clinical use and may help with personalized prescription.
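Fitting the described parabola and locating its vertex (GKD-UF-max and the associated QUF) is a simple least-squares exercise. A sketch with illustrative data points only (not the study's measurements):

```python
import numpy as np

# Hypothetical (QUF, GKD-UF) measurements shaped like the reported parabola;
# units mL/min and mL/h/mmHg, values illustrative only.
q_uf = np.array([40.0, 60.0, 80.0, 100.0, 120.0])
g_kd_uf = np.array([24.0, 31.0, 35.0, 34.0, 29.0])

a, b, c = np.polyfit(q_uf, g_kd_uf, 2)        # GKD-UF = a*QUF**2 + b*QUF + c
q_uf_at_max = -b / (2.0 * a)                  # vertex abscissa (QUF at maximum)
g_kd_uf_max = np.polyval([a, b, c], q_uf_at_max)

fitted = np.polyval([a, b, c], q_uf)
r2 = 1.0 - np.sum((g_kd_uf - fitted) ** 2) / np.sum((g_kd_uf - g_kd_uf.mean()) ** 2)
```

For a concave curve the fitted a is negative and the vertex gives GKD-UF-max; the r2 computed here corresponds to the abstract's "R2 constantly >0.96" goodness-of-fit criterion.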



Selection of peritoneal dialysis among older eligible patients with end-stage renal disease

2016-10-25

Abstract
Background: Older patients with end-stage renal disease (ESRD) are less likely to choose peritoneal dialysis (PD) over hemodialysis (HD), and the reasons behind their choice of dialysis modality are not clear. This study sought to determine the patient-perceived factors that influence the choice of dialysis modality among older ESRD patients deemed eligible for both PD and HD.
Methods: All patients had completed a multidisciplinary modality assessment, were deemed eligible for both PD and HD, and had received modality education. Semi-structured interviews were conducted, and transcripts were read repeatedly to derive potential codes using line-by-line textual analysis. The Capability, Opportunity, Motivation – Behaviour (COM-B) model and Theoretical Domains Framework (TDF), validated tools developed for designing behavioural change interventions, were used to help guide the coding framework.
Results: Among older ESRD patients deemed eligible for both PD and HD, factors relevant to modality decision-making were identified with respect to physical strength/dexterity and having a sound mind (capability), external forces and constraints (opportunity), and values and beliefs (motivation). Often a combination of factors led to an individual's choice of a particular dialysis modality. However, preferences for PD were primarily based on convenience and maintaining a normal life, while a heightened sense of security was the primary reason for those who selected HD.
Conclusions: We have identified patient-perceived factors that influence the choice of dialysis modality in older individuals with ESRD who are eligible for both PD and HD. These factors should be considered and/or addressed within programs seeking to promote PD.



Chronic interstitial nephritis in agricultural communities: a worldwide epidemic with social, occupational and environmental determinants

2016-10-13

Abstract
An increase in the prevalence of chronic kidney disease (CKD) has been observed in Central America, Sri Lanka and other tropical countries. It has been named chronic interstitial nephritis in agricultural communities (CINAC). CINAC is defined as a form of CKD that affects mainly young men, and occasionally women, with an aetiology not linked to diabetes, hypertension, glomerulopathies or other known causes. CINAC patients live and work in poor agricultural communities located in CINAC-endemic areas with a hot tropical climate, and are exposed to toxic agrochemicals through work, by ingestion of contaminated food and water, or by inhalation. The disease is characterized by low or absent proteinuria and small kidneys with irregular contours in CKD stages 3–4, presenting tubulo-interstitial lesions and glomerulosclerosis at renal biopsy. Although the aetiology of CINAC is unclear, it appears to be multifactorial. Two hypotheses emphasizing different primary triggers have been proposed: one related to toxic exposures in the agricultural communities, the other related to repeated episodes of heat stress and dehydration. Existing evidence supports occupational and environmental toxins as the primary trigger. The heat stress and dehydration hypothesis, however, cannot explain: why the incidence of CINAC rose along with the increasing mechanization of paddy farming in the 1990s; the non-existence of CINAC in hotter northern Sri Lanka, Cuba and Myanmar, where agrochemicals are sparsely used; the mosaic geographical pattern in CINAC-endemic areas; the presence of CINAC among women, children and adolescents who are not exposed to the harsh working conditions; and the observed extrarenal manifestations of CINAC. This indicates that heat stress and dehydration may be a contributory or even a necessary risk factor, but one that cannot cause CINAC by itself.



Snail and kidney fibrosis

2016-10-06

Abstract
Snail family zinc finger 1 (SNAI1) is a transcription factor expressed during renal embryogenesis, and re-expressed in various settings of acute kidney injury (AKI). Subjected to tight regulation, SNAI1 controls major biological processes responsible for renal fibrogenesis, including mesenchymal reprogramming of tubular epithelial cells, shutdown of fatty acid metabolism, cell cycle arrest and inflammation of the microenvironment surrounding tubular epithelial cells. The present review describes in detail the interactions of SNAI1 with AKI-associated signalling pathways. We also discuss how this central factor has been iteratively (and promisingly) targeted in a number of animal models in order to prevent or slow down renal fibrogenesis.



Comparison of outcomes between the incremental and thrice-weekly initiation of hemodialysis: a propensity-matched study of a prospective cohort in Korea

2016-09-28

Abstract
Background: Recent reports have suggested a possible benefit of beginning hemodialysis (HD) less frequently than three times weekly and incrementally increasing the dialysis dose. However, data regarding the benefits and safety of incremental HD are insufficient.
Methods: We analyzed 927 patients with newly initiated HD from the Clinical Research Center for End-Stage Renal Disease cohort from 2008 to 2014. The patients were classified into a thrice-weekly initiation group or an incremental initiation group (one to two sessions per week) according to the frequency of HD per week at baseline. We compared health-related quality of life (HRQOL), daily urine volume at 12 months and all-cause mortality between the groups. We matched the thrice-weekly and incremental groups at a 1:2 ratio using propensity score matching.
Results: A total of 312 patients (207 in the thrice-weekly group and 105 in the incremental group) were selected. All-cause mortality was comparable between the two groups before and after propensity score matching. The HRQOL tended to be better in the incremental group for the majority of domains of the Kidney Disease Quality of Life Short Form and Beck's Depression Inventory; however, only the symptoms and problems domain was significantly better in the incremental group at 3 months after HD initiation. At 12 months after HD initiation, there were no differences between the groups, and the daily urine volume at 12 months was similar between the two groups.
Conclusions: Incremental HD initiation showed results comparable to thrice-weekly initiation for HRQOL, residual renal function and all-cause mortality. Incremental HD may be considered an additional option for HD initiation in selected patients.



Serum iron level and kidney function: a Mendelian randomization study

2016-06-02

Abstract
Background: Iron depletion is a known consequence of chronic kidney disease (CKD), but there is contradictory epidemiological evidence on whether iron itself affects kidney function and whether its effect is protective or detrimental in the general population. While epidemiological studies tend to be affected by confounding and reverse causation, Mendelian randomization (MR) can provide unconfounded estimates of causal effects by using genes as instruments.
Methods: We performed an MR study of the effect of serum iron levels on estimated glomerular filtration rate (eGFR), using genetic variants known to be associated with iron. MR estimates of the effect of iron on eGFR were derived based on the association of each variant with iron and eGFR from two large genome-wide meta-analyses of 48 978 and 74 354 individuals. We performed a similar MR analysis for ferritin, which measures iron stored in the body, using variants associated with ferritin.
Results: A combined MR estimate across all variants showed a 1.3% increase in eGFR per standard deviation increase in iron (95% confidence interval 0.4–2.1%; P = 0.004). The results for ferritin were consistent with those for iron. Secondary MR analyses of the effects of iron and ferritin on CKD did not show significant associations but had very low statistical power.
Conclusions: Our study suggests a protective effect of iron on kidney function in the general population. Further research is required to confirm this causal association, investigate it in study populations at higher risk of CKD and explore its underlying mechanism of action.
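A combined MR estimate across variants is typically obtained by inverse-variance weighting of per-variant Wald ratios. This is a generic sketch of that estimator under a first-order approximation, not necessarily the paper's exact implementation, with hypothetical effect sizes:

```python
def ivw_mr(beta_exposure, beta_outcome, se_outcome):
    """Inverse-variance-weighted MR estimate across genetic variants.

    Per-variant causal estimates are Wald ratios (variant-outcome effect
    divided by variant-exposure effect), combined with weights derived
    from the outcome standard errors (first-order approximation).
    """
    ratios = [by / bx for bx, by in zip(beta_exposure, beta_outcome)]
    ratio_ses = [se / abs(bx) for bx, se in zip(beta_exposure, se_outcome)]
    weights = [1.0 / s ** 2 for s in ratio_ses]
    w_sum = sum(weights)
    estimate = sum(w * r for w, r in zip(weights, ratios)) / w_sum
    return estimate, (1.0 / w_sum) ** 0.5

# Hypothetical three-variant example: per-variant effects on standardized
# serum iron (exposure) and on log-eGFR (outcome), with outcome SEs.
est, se = ivw_mr([0.30, 0.20, 0.50], [0.004, 0.002, 0.007], [0.002, 0.002, 0.003])
```

Combining variants shrinks the standard error below that of any single-variant Wald ratio, which is what gives the pooled estimate its precision.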



Age-determined severity of anti-myeloperoxidase autoantibody-mediated glomerulonephritis in mice

2016-05-26

Abstract
Background: Anti-neutrophil cytoplasmic antibody-associated vasculitis (AAV) is a typical disease of the elderly: disease incidence increases with age, and age is a predictor of disease outcome. In this study, we aimed to determine the contribution of age to the development of AAV employing a mouse model of anti-myeloperoxidase (MPO) antibody-mediated glomerulonephritis.
Methods: Anti-MPO IgG- and lipopolysaccharide (LPS)-mediated glomerulonephritis was induced in 3- and 18-month-old C57BL/6 mice. Clinical and pathological parameters of disease severity, alterations in the immune system and kidney-specific changes in these mice were evaluated.
Results: Eighteen-month-old mice developed increased disease severity upon injection of anti-MPO IgG/LPS compared with 3-month-old mice, as evidenced by increased albuminuria, more extensive glomerular capillary necrosis and increased glomerular neutrophil accumulation. Glomerular crescent formation was mild in both young and old mice. Old mice displayed higher plasma interleukin-6 levels as well as higher proportions of circulating neutrophils and activated monocytes compared with young mice. In addition, renal mRNA levels of inflammatory genes and endothelial adhesion molecules were higher in 18-month-old mice than in 3-month-old mice.
Conclusions: Our results indicate that aged mice develop more severe clinical and pathological disease upon induction of anti-MPO IgG/LPS-mediated glomerulonephritis. These findings may be attributed to age-related changes in the immune system as well as in the kidney itself.



Monocytic angiotensin-converting enzyme 2 relates to atherosclerosis in patients with chronic kidney disease

2016-05-24

Abstract
Background: Increased levels of monocytic angiotensin-converting enzyme (ACE) found in haemodialysis (HD) patients may directly participate in the pathogenesis of atherosclerosis. We recently demonstrated that uraemia triggers the development of highly pro-atherogenic monocytes via an angiotensin II (AngII)-dependent mechanism. The opposing actions of the AngII-degrading ACE2 remain largely unknown. We examined the status of both ACEs and related receptors in circulating leukocytes of HD patients, non-dialysed CKD patients and healthy individuals. Furthermore, we tested the possible impact of monocytic ACEs on atherogenesis and the behaviour of the cells under conditions mimicking chronic renal failure.
Methods: Expression of ACE, ACE2, AT1R, AT2R and MASR was investigated in circulating leukocytes from 71 HD patients (62 ± 14 years), 24 CKD stage 3–5 patients (74 ± 10 years) and 37 healthy control subjects (53 ± 6 years), and in isolated healthy monocytes treated with normal and uraemic serum. ACE, ACE2, ICAM-1, VCAM-1 and MCSF expression and endothelial adhesion were analysed in ACE-overexpressing THP-1 monocytes treated with captopril or losartan. ACE2-overexpressing monocytes were subjected to transmigration and adhesion assays and investigated for MCP-1, ICAM-1, VCAM-1, MCSF, AT1R and AT2R expression.
Results: The ACE mRNA level was significantly increased in HD and CKD stage 3–5 leukocytes. Correspondingly, ACE2 was downregulated, and AngII as well as MAS receptor expression was upregulated, in these cells. Healthy monocytes preconditioned with uraemic serum showed the same expressional regulation of ACE/ACE2, MAS and AngII receptors as that observed in HD and CKD stage 3–5 leukocytes. Overexpression of monocytic ACE dramatically decreased levels of ACE2 and induced a pro-atherogenic phenotype, partly reversed by AngII-modifying treatments, leading to an increase in ACE2. Overexpression of ACE2 in monocytes led to reduced endothelial adhesion and transmigration and to downregulation of adhesion-related molecules.
Conclusions: HD and non-dialysed CKD stage 3–5 patients show enhanced ACE and decreased ACE2 expression on monocytes. This constellation renders the cells adhesive to the endothelium and likely supports the development of atherosclerosis.



C-C motif-ligand 2 inhibition with emapticap pegol (NOX-E36) in type 2 diabetic patients with albuminuria

2016-04-27

Abstract
Background: Emapticap pegol (NOX-E36) is a Spiegelmer® that specifically binds and inhibits the pro-inflammatory chemokine C-C motif-ligand 2 (CCL2) (also called monocyte-chemotactic protein 1). The objective of this exploratory study was to evaluate the safety and tolerability as well as the renoprotective and anti-diabetic potential of emapticap in type 2 diabetic patients with albuminuria.
Methods: A randomized, double-blind, placebo-controlled Phase IIa study was initiated in 75 type 2 diabetic patients with albuminuria. Emapticap at 0.5 mg/kg and placebo were administered subcutaneously twice weekly for 12 weeks to 50 and 25 patients, respectively, followed by a treatment-free phase of 12 weeks.
Results: Twice-weekly subcutaneous treatment with emapticap over 3 months was generally safe and well tolerated and reduced the urinary albumin/creatinine ratio (ACR) from baseline to Week 12 by 29% (P < 0.05); versus placebo, a non-significant ACR reduction of 15% was observed (P = 0.221). The maximum difference between emapticap and placebo, 26% (P = 0.064), was seen 8 weeks after discontinuation of treatment. At Week 12, HbA1c changed by −0.31% in the emapticap group versus +0.05% in the placebo group (P = 0.146). The maximum difference for HbA1c was observed 4 weeks after the last dose, with −0.35% for emapticap versus +0.12% for placebo (P = 0.026). No relevant change in blood pressure or estimated glomerular filtration rate was seen between the treatment groups throughout the study. A post hoc analysis excluding patients with major protocol violations, dual RAS blockade or haematuria increased the ACR difference between the two treatment arms to 32% at Week 12 (P = 0.014) and 39% at Week 20 (P = 0.010).
Conclusions: Inhibition of the CCL2/CCL2 receptor axis with emapticap pegol was generally safe and well tolerated. Beneficial effects on ACR and HbA1c were observed in this exploratory study and were maintained after cessation of treatment. Taken together, emapticap may have disease-modifying effects that warrant further investigation in adequately powered confirmatory studies.



Hypothermia and kidney: a focus on ischaemia–reperfusion injury

2016-04-06

Abstract
Cellular damage occurring after reperfusion of ischaemic tissue is defined as ischaemia–reperfusion injury (IRI). Hypothermia is able to decrease oxygen consumption, preventing a rapid loss of mitochondrial activity. However, even though cooling can lessen the deleterious effects of ischaemia, its consequences are not exclusively beneficial, so that hypothermic storage is a compromise between benefit and harm. The present review details the relationship between renal IRI and hypothermia, describing the pathophysiology of IRI and hypothermic protection through experimental evidence. Although experimental models of renal IRI are a valuable tool for understanding the pathophysiology of renal ischaemia–reperfusion, the clinical transfer of experimental results has several limitations, particularly because of anatomical and physiological differences. This review discusses these limitations of animal models as well as hypothermia as a strategy to protect the kidney from IRI. We also describe three clinical scenarios in which hypothermia is used to mitigate IRI: transplantation, deceased donors and post-cardiac arrest.



End-stage renal disease in ANCA-associated vasculitis

2016-04-06

Abstract
The outcomes of patients with anti-neutrophil cytoplasmic antibody (ANCA)-associated vasculitis have improved significantly over the past decades, although a significant proportion still reach end-stage renal disease (ESRD). Renal replacement therapy (RRT) is associated with a relatively low risk of relapsing vasculitis, as a result of anti-rejection treatment after kidney transplantation or quiescence of the autoimmune process in haemodialysis patients; a flare of vasculitis in the latter setting nevertheless presents a challenge, because the treatment is poorly tolerated. Rituximab has benefits in haemodialysed patients, as it is more steroid-sparing in the treatment of extrarenal disease. More favourable outcomes of kidney transplantation compared with haemodialysis support its use as the preferred method of RRT in patients with vasculitis remission or low disease activity.



Depression and all-cause and cardiovascular mortality in patients on haemodialysis: a multinational cohort study

2016-03-22

Abstract
Background: Depression and early death are both common in adults with Stage 5 chronic kidney disease. Studies have shown an association between depression and total mortality, but the association between depression and cardiovascular death is less certain.
Methods: We conducted a prospective multinational cohort study involving adults who were treated with long-term haemodialysis within a single dialysis network between April and November 2010. Depression was considered present when patients reported a Beck Depression Inventory (BDI) II score ≥14 at baseline; sensitivity analyses considered a BDI II score ≥20 to identify moderate depression. Multivariable Cox proportional hazards regression was used to assess adjusted hazards for all-cause and cardiovascular mortality at 12 months.
Results: Three thousand and eighty-six participants in the network received the BDI II questionnaire, and 2278 (73%) provided complete responses to the survey questions. Among these, 1047 (46%) reported depression. During a mean follow-up of 11 (standard deviation 2.5) months (2096 person-years), we recorded 175 deaths, of which 66 were attributable to cardiovascular causes. Depression (BDI score ≥14) was not associated with all-cause mortality [adjusted hazard ratio 1.26 (95% confidence interval 0.93–1.71)] or cardiovascular mortality [0.82 (0.50–1.34)]. When the higher cut-off (BDI score ≥20) was used to identify moderate depression, depression was associated with total mortality [1.40 (1.02–1.93)] but not cardiovascular mortality [1.05 (0.63–1.77)].
Conclusions: The association between depression and cardiovascular mortality in adults with kidney failure treated with haemodialysis is uncertain. Depression is a heterogeneous disorder and may only be a risk factor for premature death when it is at least of moderate severity.



Vascular dysfunction in children and young adults with autosomal dominant polycystic kidney disease

2016-03-21

Abstract
Background: Adults with autosomal dominant polycystic kidney disease (ADPKD) exhibit vascular dysfunction, as evidenced by impaired endothelium-dependent dilation (EDD) and stiffening of the large elastic arteries. However, it is unknown whether vascular dysfunction begins earlier in the course of ADPKD. The aim of the study was to assess EDD and arterial stiffness in children and young adults with ADPKD.
Methods: Fifteen children and young adults 6–22 years of age with ADPKD and normal renal function were prospectively recruited for participation in a cross-sectional study. Fifteen healthy controls were enrolled to match cases for age and sex. The primary outcomes were EDD, measured as brachial artery flow-mediated dilation (FMDBA), and arterial stiffness, measured as carotid-femoral pulse wave velocity (CFPWV).
Results: ADPKD cases were more likely to be taking an angiotensin-converting enzyme inhibitor, but otherwise did not differ from controls in clinical characteristics, including blood pressure. FMDBA was 25% lower in children and young adults with ADPKD (7.7 ± 0.9%, mean ± SE) when compared with matched controls (10.2 ± 0.8%) (P < 0.05). CFPWV was 14% higher in children and young adults with ADPKD (544 ± 23 cm/s) when compared with matched controls (478 ± 17 cm/s) (P < 0.05). Secondary measures of arterial stiffness, carotid augmentation index and carotid systolic blood pressure, were also increased in cases when compared with controls (P < 0.05).
Conclusions: Impaired EDD and increased arterial stiffness, important independent predictors of future cardiovascular events and mortality, are evident very early in the course of ADPKD in the presence of normal kidney function. Novel interventions to reduce vascular dysfunction in children and young adults with ADPKD should be evaluated, as childhood and young adulthood may represent a critical therapeutic window to reduce future cardiovascular risk in patients with ADPKD.
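The relative group differences quoted above (FMDBA about 25% lower, CFPWV about 14% higher in ADPKD) follow from simple percent-difference arithmetic on the reported means; a small sketch, assuming that is how the percentages were derived:

```python
def percent_lower(case_mean: float, control_mean: float) -> float:
    """Percent by which the case-group mean is lower than the control mean."""
    return 100.0 * (control_mean - case_mean) / control_mean

def percent_higher(case_mean: float, control_mean: float) -> float:
    """Percent by which the case-group mean exceeds the control mean."""
    return 100.0 * (case_mean - control_mean) / control_mean

# Means reported in the abstract
fmd_diff = percent_lower(7.7, 10.2)   # FMDBA: ~25% lower in ADPKD
pwv_diff = percent_higher(544, 478)   # CFPWV: ~14% higher in ADPKD
print(round(fmd_diff), round(pwv_diff))  # 25 14
```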



Longitudinal trends in serum ferritin levels and associated factors in a national incident hemodialysis cohort

2016-03-21

Abstract
Background: The rise in serum ferritin levels among US maintenance hemodialysis patients has been attributed to higher intravenous iron administration and other changes in practice. We examined ferritin trends over time in hemodialysis patients and whether iron utilization patterns and other factors [erythropoietin-stimulating agent (ESA) prescribing patterns, inflammatory markers] were associated with ferritin trajectory.
Methods: In a 5-year (January 2007–December 2011) cohort of 81 864 incident US hemodialysis patients, we examined changes in ferritin averaged over 3-month intervals using linear mixed effects models adjusted for intravenous iron dose, malnutrition and inflammatory markers. We then examined ferritin trends across strata of baseline ferritin level, dialysis initiation year, cumulative iron and ESA use in the first dialysis year and baseline hemoglobin level.
Results: In models adjusted for iron dose, malnutrition and inflammation, mean ferritin levels increased over time in the overall cohort and across the three lower baseline ferritin strata. Among patients initiating dialysis in 2007, mean ferritin levels increased sharply in the first versus second year of dialysis and again abruptly increased in the fifth year, independent of iron dose, malnutrition and inflammatory markers; similar trends were observed among patients who initiated dialysis in 2008 and 2009. In analyses stratified by cumulative iron use, mean ferritin increased among groups receiving iron, but decreased in the no-iron group. In analyses stratified by cumulative ESA dose and baseline hemoglobin, mean ferritin increased over time.
Conclusions: While ferritin trends correlated with patterns of iron use, increases in ferritin over time persisted independent of intravenous iron and ESA exposure, malnutrition and inflammation.
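The study's outcome, ferritin averaged over consecutive 3-month intervals from dialysis initiation, can be sketched in a few lines. This is a hypothetical simplification of the data preparation step, not the mixed-model analysis itself; measurement times are expressed as whole months since initiation.

```python
from collections import defaultdict

def quarterly_means(measurements):
    """Average ferritin (ng/mL) over consecutive 3-month intervals.

    `measurements` is a list of (month_since_initiation, ferritin) pairs;
    months 0-2 fall in interval 0, months 3-5 in interval 1, and so on.
    """
    buckets = defaultdict(list)
    for month, ferritin in measurements:
        buckets[month // 3].append(ferritin)
    return {interval: sum(vals) / len(vals)
            for interval, vals in sorted(buckets.items())}

# Illustrative values only, not study data
data = [(0, 180), (1, 220), (4, 350), (5, 410), (7, 500)]
print(quarterly_means(data))  # {0: 200.0, 1: 380.0, 2: 500.0}
```

The interval means produced this way would then feed a linear mixed effects model with per-patient random effects, as the methods describe.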



Risk for cognitive impairment across 22 measures of cognitive ability in early-stage chronic kidney disease

2016-03-08

Abstract
Background: Chronic kidney disease (CKD) is a significant risk factor for cognitive impairment. Previous studies have examined differences in cognitive impairment between persons with and without CKD using multiple cognitive outcomes, but few have done this for an extensive battery of cognitive tests. We relate early-stage CKD to two indices of impairment for 22 measures of cognitive ability.
Methods: The study was community-based and cross-sectional, with 898 individuals free from dementia and end-stage renal disease. Estimated glomerular filtration rate (eGFR) was calculated using the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation and classified as <60 or ≥60 mL/min/1.73 m2, based on consensus definitions of Stage 3 or greater CKD. The eGFR classifications were related to modest [≥1 standard deviation (SD) below the mean] and severe (≥1.5 SD below the mean) impairment on each measure using logistic regression analyses adjusting for potential risk factors.
Results: A total of 146 individuals (16.3%) had eGFR <60 mL/min/1.73 m2 (mean 51.6 ± 10.1 mL/min/1.73 m2). These participants had significantly greater risk for modestly impaired abilities in the scanning and tracking and visual-spatial organization/memory (VSOM) domains after accounting for comorbidity-related risk factors [odds ratios (ORs) between 1.68 and 2.16], as well as greater risk for severely impaired functioning in the language domain (OR = 2.65).
Conclusions: Participants with eGFR <60 mL/min/1.73 m2 were at higher risk for cognitive impairment than those with eGFR ≥60 mL/min/1.73 m2 on the majority of cognitive abilities, specifically those within the VSOM, language, and scanning and tracking domains. Targeted screening for cognitive deficits in kidney disease patients early in their disease course may be warranted.
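The eGFR classification used in this study rests on the 2009 CKD-EPI creatinine equation; a sketch of the published formula follows. The race coefficient that era-contemporary implementations included is deliberately omitted here, and the function names are illustrative.

```python
def ckd_epi_egfr(scr_mg_dl: float, age: float, female: bool) -> float:
    """2009 CKD-EPI creatinine equation (race coefficient omitted).
    Returns eGFR in mL/min/1.73 m2."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = 141 * min(ratio, 1) ** alpha * max(ratio, 1) ** -1.209 * 0.993 ** age
    if female:
        egfr *= 1.018
    return egfr

def stage3_or_worse(egfr: float) -> bool:
    """Consensus cutoff used in the study: eGFR < 60 mL/min/1.73 m2."""
    return egfr < 60

print(round(ckd_epi_egfr(1.2, 60, female=False), 1))         # ~65.3
print(stage3_or_worse(ckd_epi_egfr(1.5, 70, female=True)))   # True
```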



Altered circadian hemodynamic and renal function in cirrhosis

2016-02-29

Abstract
Background: Given that alterations in systemic hemodynamics have a profound influence on renal function in patients with cirrhosis, it is surprising that circadian variations in blood pressure (BP) and renal electrolyte excretion have scarcely been studied. Our aims were to define the relationship of 24-h ambulatory BP changes with renal tubular function and to determine the influence of endotoxemia on BP and urinary parameters.
Methods: Twenty patients with cirrhosis were compared with 40 healthy controls. All underwent 24-h ambulatory BP monitoring and 24-h urine collection.
Results: Subjects with cirrhosis demonstrated significant diurnal variations in urinary excretion of sodium (57.7 µmol/min day versus 87 µmol/min night) and creatinine (826 µg/min day versus 1202 µg/min night). Increasing severity of cirrhosis was associated with a progressive reduction in ambulatory awake systolic (P-trend = 0.015), diastolic (P-trend < 0.001) and mean BP (P-trend < 0.001). In patients with cirrhosis, the magnitude of change in BP from awake to sleep state was blunted for systolic BP (5% reduction, P = 0.039) and pulse rate (2% reduction, P < 0.001). The amplitude of variation in pulse rate was blunted with increasing severity of cirrhosis (controls 6.5, Child-Pugh Class A 5.3, Child B 3.4, Child C 1.2, P = 0.03), and the acrophase was right-shifted with increasing severity of cirrhosis. During the awake state, compared with sleep, endotoxin was associated with lower sodium excretion and a lower systolic BP. Compared with the awake state, endotoxin was associated with a higher sleeping pulse rate (P < 0.001).
Conclusions: Patients with cirrhosis have altered diurnal profiles in renal tubular function and blood pressure that appear to be related to endotoxemia. Determining whether endotoxemia is causally related to these perturbations will require interventional studies.
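The awake-to-sleep BP change quoted above is conventionally expressed as a "dipping" percentage, with a nocturnal fall of ≥10% the usual definition of a dipper; a minimal sketch, with that threshold stated as an assumption rather than taken from this study:

```python
def dip_percent(awake_sbp: float, sleep_sbp: float) -> float:
    """Percent fall in systolic BP from awake to sleep."""
    return 100.0 * (awake_sbp - sleep_sbp) / awake_sbp

def is_dipper(awake_sbp: float, sleep_sbp: float, threshold: float = 10.0) -> bool:
    """Conventional dipper definition: >=10% nocturnal fall.
    The 10% threshold is an assumption, not a value from this abstract."""
    return dip_percent(awake_sbp, sleep_sbp) >= threshold

# A 5% fall, as reported for the cirrhosis group, is a blunted (non-dipping) profile
print(dip_percent(120, 114))  # 5.0
print(is_dipper(120, 114))    # False
```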



Genetic risk variants for membranous nephropathy: extension of and association with other chronic kidney disease aetiologies

2016-02-04

Abstract
Background: Membranous nephropathy (MN) is a common cause of nephrotic syndrome in adults. Previous genome-wide association studies (GWAS) of 300 000 genotyped variants identified MN-associated loci at HLA-DQA1 and PLA2R1.
Methods: We used a combined approach of genotype imputation, GWAS, human leucocyte antigen (HLA) imputation and extension to other aetiologies of chronic kidney disease (CKD) to investigate genetic MN risk variants more comprehensively. GWAS using 9 million high-quality imputed genotypes and classical HLA alleles were conducted for 323 MN European-ancestry cases and 345 controls. Additionally, 4960 patients with different CKD aetiologies in the German Chronic Kidney Disease (GCKD) study were genotyped for risk variants at HLA-DQA1 and PLA2R1.
Results: In GWAS, lead variants in known loci [rs9272729, HLA-DQA1, odds ratio (OR) = 7.3 per risk allele, P = 5.9 × 10−27 and rs17830558, PLA2R1, OR = 2.2, P = 1.9 × 10−8] were significantly associated with MN. No novel signals emerged in GWAS of X-chromosomal variants or in sex-specific analyses. Classical HLA alleles (DRB1*0301-DQA1*0501-DQB1*0201 haplotype) were associated with MN but provided little additional information beyond rs9272729. Associations were replicated in 137 GCKD patients with MN (HLA-DQA1: P = 6.4 × 10−24; PLA2R1: P = 5.0 × 10−4). MN risk increased steeply for patients with high-risk genotype combinations (OR > 79). While genetic variation in PLA2R1 exclusively associated with MN across 19 CKD aetiologies, the HLA-DQA1 risk allele was also associated with lupus nephritis (P = 2.8 × 10−6), type 1 diabetic nephropathy (P = 6.9 × 10−5) and focal segmental glomerulosclerosis (P = 5.1 × 10−5), but not with immunoglobulin A nephropathy.
Conclusions: PLA2R1 and HLA-DQA1 are the predominant risk loci for MN detected by GWAS. While HLA-DQA1 risk variants show an association with other CKD aetiologies, PLA2R1 variants are specific to MN.
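The per-allele odds ratios above come from comparing risk-allele frequencies in cases and controls; the arithmetic of an allelic OR from a 2×2 table of allele counts can be sketched as follows. The counts in the example are invented for illustration, not taken from the study.

```python
def allelic_odds_ratio(case_risk: int, case_other: int,
                       control_risk: int, control_other: int) -> float:
    """Odds ratio from a 2x2 table of allele counts:
    (risk/other odds in cases) / (risk/other odds in controls)."""
    return (case_risk / case_other) / (control_risk / control_other)

# Hypothetical counts, not study data
print(allelic_odds_ratio(40, 10, 20, 30))  # 6.0
```

A GWAS additionally estimates the OR per risk allele within a logistic regression adjusted for covariates, so published values rarely equal this raw cross-product exactly.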



The Choice of Renal Replacement Therapy (CORETH) project: dialysis patients' psychosocial characteristics and treatment satisfaction

2016-02-02

Abstract
Background: To date, research has underestimated the role of psychosocial conditions as contributing factors to dialysis modality choice. The novelty of the Choice of Renal Replacement Therapy (CORETH) project (German Clinical Trials Register #DRKS00006350) is its focus on the multivariate associations between these aspects and their subsequent significance for treatment satisfaction (TS) in peritoneal dialysis (PD) versus haemodialysis (HD) patients. In this article, we present the baseline results of a multicentre study, which is supported by a grant from the German Ministry for Education and Research.
Methods: Six to 24 months after initiation of dialysis, 780 patients from 55 dialysis centres all over Germany were surveyed. The questionnaire addressed psychosocial, physical, socio-demographic and shared decision-making (SDM) aspects. Furthermore, cognitive functioning was tested. After indexing the measures, two propensity score-matched groups (n = 482) of patients who had chosen PD or HD were compared in a first step. In a second step, a moderated multiple regression (n = 445) was conducted to investigate, for the first time, the multivariate impact of patient characteristics on TS.
Results: In comparison with HD patients, PD patients were more satisfied with their treatment (P < 0.001), had a more autonomy-seeking personality (P = 0.04), had better cognitive functioning (P = 0.001), indicated more satisfying SDM (P < 0.001) and had a larger living space (P < 0.001). All patients were more satisfied when they had a good psychological state and experienced SDM. Especially in HD patients, TS was higher when the patient had a less autonomous personality, lower cognitive functioning, more social support, a poorer physical state and poorer socio-demographic conditions (R2 = 0.26).
Conclusions: Psychosocial characteristics play a major role in TS in dialysis patients. Within a multivariate approach, these factors are even more important than physical or environment-related factors. In practice, focusing on SDM and screening patient characteristics at an early stage can foster patients' TS. Changes will be examined in a 1-year follow-up.
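Propensity score matching, as used in the first analysis step of the CORETH study, pairs each patient in one modality group with the patient in the other group whose estimated score is closest. A minimal greedy 1:1 sketch over precomputed scores follows; the caliper value and the data are illustrative assumptions, not study parameters.

```python
def greedy_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on propensity scores.

    `treated` and `controls` are lists of (id, score) pairs; each control
    is used at most once, and pairs farther apart than `caliper` are skipped.
    """
    available = dict(controls)
    pairs = []
    for tid, tscore in sorted(treated, key=lambda x: x[1]):
        if not available:
            break
        cid = min(available, key=lambda c: abs(available[c] - tscore))
        if abs(available[cid] - tscore) <= caliper:
            pairs.append((tid, cid))
            del available[cid]
    return pairs

# Hypothetical patients and propensity scores
pd_pts = [("PD1", 0.30), ("PD2", 0.62)]
hd_pts = [("HD1", 0.31), ("HD2", 0.58), ("HD3", 0.90)]
print(greedy_match(pd_pts, hd_pts))  # [('PD1', 'HD1'), ('PD2', 'HD2')]
```

In practice the scores themselves come from a logistic regression of modality choice on baseline covariates, and optimal (rather than greedy) matching is often preferred; this sketch only shows the pairing logic.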