Subscribe: Nephrology Dialysis Transplantation - recent issues
http://ndt.oxfordjournals.org/rss/recent.xml
Language: English

Nephrology Dialysis Transplantation - RSS feed of recent issues (covers the latest 3 issues, including the current issue)

Causality at the dawn of the 'omics era in medicine and in nephrology

2016-09-02T00:05:37-07:00

Causality is a core concept in medicine. The quantitative determinacy characterizing today's biomedical science is unprecedented. The assessment of causal relations in human diseases is evolving, and it is therefore fundamental to keep up with the steady pace of theoretical and technological advancements. The exact specification of all causes of pathologies at the individual level, precision medicine, is expected to allow the complete eradication of disease. In this article, we discuss the various conceptualizations of causation that are at play in the context of randomized clinical trials and observational studies. Genomics, proteomics, metabolomics and epigenetics can now produce the precise knowledge we need for 21st century medicine. New conceptions of causality are needed to form the basis of the new precision medicine.




The blood pressure-salt sensitivity paradigm: pathophysiologically sound yet of no practical value

2016-09-02T00:05:37-07:00

Sodium plays an important pathophysiological role in blood pressure (BP) values and in the development of hypertension, and epidemiological studies such as the Intersalt Study have shown that the increase in BP occurring with age is determined by salt intake. Recently, a meta-analysis of 13 prospective studies has also shown the close relationship between excess sodium intake and higher risk of stroke and total cardiovascular events. However, the BP response to changing salt intake displays marked variability, as first suggested by Kawasaki et al. (The effect of high-sodium and low-sodium intakes on blood pressure and other related variables in human subjects with idiopathic hypertension. Am J Med 1978; 64: 193–198) and later by Weinberger et al. (Definitions and characteristics of sodium sensitivity and blood pressure resistance. Hypertension 1986; 8: II127–II134), who recognized the heterogeneity of the BP response to salt and developed the concept of salt sensitivity. We have a large body of evidence in favour of a major role of metabolic and neuro-hormonal factors in determining BP salt sensitivity in association with the effect of genetic variation. There is evidence that salt sensitivity influences the development of organ damage, even independently—at least in part—of BP levels and the occurrence of hypertension. In addition, several observational studies indicate that salt sensitivity is clearly associated with a higher rate of cardiovascular events and mortality, independently of BP levels and hypertension. A cluster of factors with well-known atherogenic potential such as hyperinsulinaemia, dyslipidaemia and microalbuminuria—all known to be prevalent in salt-sensitive hypertension—might at least partially explain the increased cardiovascular risk observed in salt-sensitive individuals.
The gold standard for the evaluation of BP salt sensitivity is the BP response to a moderate reduction of salt intake for several weeks; nevertheless, these protocols often suffer from poor patient compliance with dietary instructions. To overcome this problem, short-term tests have been proposed that evaluate either large differences in salt intake for a few days or the response to intravenous administration of saline solution and short-acting diuretics. Recently, the use of ambulatory BP measurement has been proposed for the clinical assessment of BP salt sensitivity. Of note, BP salt sensitivity, however assessed, behaves as a continuous variable, but salt sensitivity is used as a categorical parameter, with salt-sensitive individuals being defined as those with a difference in BP between low- and high-sodium intake >10%, and salt-resistant subjects those in whom BP does not increase or shows an increase <5% under sodium loading. The general conclusion that can and should be drawn from the above considerations is that the paradigm of salt sensitivity, despite its important pathophysiological meaning, is so far not helpful to the practising physician in clinical practice, nor is it relevant or useful to the design and implementation of a population-based strategy of salt intake reduction; however, further studies are warranted for an accurate assessment of the salt-sensitivity phenotype in clinical practice. In the absence of a population strategy for salt intake reduction, the aim should be the generation of a ‘low sodium environment’ allowing for a dietary salt intake tailored to true human requirements and not to deleterious lifestyle habits.
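The categorical cut-offs described above (>10% BP difference for salt sensitivity; no increase or a <5% increase for salt resistance) can be illustrated with a short sketch. The function name and the handling of the intermediate 5–10% band are our illustrative assumptions, not part of the cited studies:

```python
def classify_salt_sensitivity(bp_low_sodium, bp_high_sodium):
    """Classify BP salt sensitivity using the categorical cut-offs
    described above (illustrative only; mean BP in mmHg)."""
    change_pct = (bp_high_sodium - bp_low_sodium) / bp_low_sodium * 100
    if change_pct > 10:
        return "salt-sensitive"
    if change_pct < 5:          # includes no increase or a decrease
        return "salt-resistant"
    return "indeterminate"      # the 5-10% band is left unclassified

# A subject whose mean BP rises from 90 to 103 mmHg (~14%) under
# sodium loading would be labelled salt-sensitive:
print(classify_salt_sensitivity(90, 103))  # salt-sensitive
```

Treating the response as a continuous variable, as the text recommends, would mean reporting `change_pct` itself rather than collapsing it to a category.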




Pro: Reducing salt intake at population level: is it really a public health priority?

2016-09-02T00:05:37-07:00

A reduction in salt intake reduces blood pressure, stroke and other cardiovascular events, including chronic kidney disease, by as much as 23% (i.e. 1.25 million deaths worldwide). It is effective in both genders, at any age, in any ethnic group, and in high-, middle- and low-income countries. Population salt reduction programmes are both feasible and effective (preventive imperative). Salt reduction programmes are cost-saving in all settings (high-, middle- and low-income countries) (economic imperative). Public health policies are powerful, rapid, equitable and cost-saving (political imperative). This important shift in public health has not occurred without obstinate opposition from organizations concerned primarily with the profits deriving from a high population salt intake and less with public health benefits. A key component of the denial strategy is misinformation (with ‘pseudo’ controversies). In general, poor science has been used to create uncertainty and to support inaction. This paper summarizes the evidence in favour of a global salt reduction strategy and analyses the peddling of well-worn myths behind the false controversies.




Opponent's comments

2016-09-02T00:05:37-07:00




Con: Reducing salt intake at the population level: is it really a public health priority?

2016-09-02T00:05:37-07:00

Scientific evidence to support the recommended salt intake of <5.8 g/day is virtually non-existent. There are no randomized controlled trials (RCTs) investigating the effect of salt reduction (SR) below 5.8 g on health outcomes. The effect of SR on blood pressure (BP) reaches maximal efficacy at 1 week. RCTs in healthy individuals lasting at least 1 week show that the effect of SR on BP is <1 mmHg, but that SR has significant side effects, including increases in renin, aldosterone, noradrenaline, adrenaline, cholesterol and triglycerides. Still, disregarding confounders and side effects, health authorities use BP effects obtained in studies of pre-hypertensive and hypertensive patients to recommend SR in the healthy population and use these biased BP effects in statistical models indirectly to project millions of saved lives. These fantasy projections are in contrast to real data from prospective observational population studies directly associating salt intake with mortality, which show that a salt intake <5.8 g/day is associated with an increased mortality of ~15%. The population studies also show that a very high salt intake (>12.2 g) is associated with increased mortality. However, since <5% of populations consume such high amounts of salt, SR at the population level should not be a public health priority. Consequently, this policy should be abolished, not because any attempt to implement it has failed, and not because it costs taxpayers and food consumers unnecessary billions of dollars, but because—if implemented—it might kill people instead of saving them.




Opponent's comments

2016-09-02T00:05:37-07:00




Moderator's view: Salt, cardiovascular risk, observational research and recommendations for clinical practice

2016-09-02T00:05:37-07:00

In observational studies, blood pressure (BP), cholesterol and nutritional status biomarkers, including sodium intake, coherently show a J- or U-shaped relationship with health outcomes. Yet these data may reflect a stable sodium intake or a reduced intake due to comorbidities or intercurrent disease, or an intentional decrease in salt intake. Adjusting for comorbidities and risk factors may fail to eliminate confounding. For cholesterol and BP, we base our recommendations for prevention and treatment on interventional (experimental) studies. For sodium, we lack the perfect large-scale trial we would need, but substantial circumstantial information derived from interventional studies cannot be ignored. The objection that modelling the risk of salt excess for cardiovascular disease events based on the effect of salt intake on BP is unjustified fails to consider a recent meta-analysis showing that, independently of the intervention applied, intensive BP-lowering treatment (average BP 133/76 mmHg), compared with the less intensive treatment (140/81 mmHg), is associated with a 14% risk reduction for major cardiovascular events. In this knowledge context, inertia, i.e. awaiting the ‘mother trial’, is not justified. While recognizing that this trial may still be needed and that actual data, rather than modelled data, are the ideal solution, for now, the World Health Organization recommendation of reducing salt intake to <2 g/day of sodium (5 g/day of salt) in adults stands.




Chronicity following ischaemia-reperfusion injury depends on tubular-macrophage crosstalk involving two tubular cell-derived CSF-1R activators: CSF-1 and IL-34

2016-09-02T00:05:37-07:00

Two structurally unrelated ligands activate the macrophage colony-stimulating factor receptor (CSF-1R, c-fms, CD115): M-CSF/CSF-1 and interleukin-34 (IL-34). Both ligands promote macrophage proliferation, survival and differentiation. IL-34 also activates the protein-tyrosine phosphatase receptor PTP-ζ (PTPRZ1). Both receptors and cytokines are increased during acute kidney injury. While tubular cell-derived CSF-1 is required for kidney repair, Baek et al. (J Clin Invest 2015; 125: 3198–3214) have now identified tubular epithelial cell-derived IL-34 as a promoter of kidney neutrophil and macrophage infiltration and tubular cell destruction during experimental kidney ischaemia-reperfusion, leading to chronic injury. IL-34 promoted proliferation of both intrarenal macrophages and bone marrow cells, increasing circulating neutrophils and monocytes and their kidney recruitment. Thus, injured tubular cells release two CSF-1R activators, one (CSF-1) that promotes tubular cell survival and kidney repair and another (IL-34) that promotes chronic kidney damage. These results hold promise for the development of IL-34-targeting strategies to prevent ischaemia-reperfusion kidney injury in contexts such as kidney transplantation. However, careful consideration should be given to the recent characterization by Bezie et al. (J Clin Invest 2015; 125: 3952–3964) of IL-34 as a T regulatory cell (Treg) cytokine that modulates macrophage responses so that IL-34-primed macrophages potentiate the immune suppressive capacity of Tregs and promote graft tolerance.




Kelch-like 3/Cullin 3 ubiquitin ligase complex and WNK signaling in salt-sensitive hypertension and electrolyte disorder

2016-09-02T00:05:38-07:00

Pseudohypoaldosteronism type II (PHAII) is a hereditary disease characterized by salt-sensitive hypertension, hyperkalemia and thiazide sensitivity. Mutations in the with-no-lysine kinase 1 (WNK1) and WNK4 genes are reported to cause PHAII. Rigorous studies have demonstrated that WNK kinases constitute a signaling cascade with oxidative stress-responsive gene 1 (OSR1), Ste20-related proline-alanine-rich kinase (SPAK) and the solute carrier family 12a (SLC12a) transporters, including the thiazide-sensitive NaCl cotransporter. The WNK–OSR1/SPAK–SLC12a signaling cascade is present in the kidneys and vascular smooth muscle cells (VSMCs), where it physiologically regulates salt sensitivity, i.e. urinary sodium excretion and arterial tone, in response to various hormonal and dietary factors. However, although it is clear that abnormal activation of this signaling cascade is the molecular basis of PHAII, the molecular mechanisms responsible for the physiological regulation of WNK signaling and the effect of WNK4 mutations on PHAII pathogenesis are poorly understood. Two additional genes responsible for PHAII, Kelch-like 3 (KLHL3) and Cullin 3 (CUL3), were identified in 2012. WNK1 and WNK4 have been shown to be substrates of the KLHL3–CUL3 E3 ubiquitin ligase both in vitro and in vivo. In PHAII, the loss of interaction between KLHL3 and WNK4 leads to increased levels of WNK kinases due to impaired ubiquitination. These results indicate that WNK signaling is physiologically regulated by KLHL3/CUL3-mediated ubiquitination. Here, we review recent studies investigating the pathophysiological roles of the WNK signaling cascade in the kidneys and VSMCs and recently discovered mechanisms underlying the regulation of WNK signaling by KLHL3 and CUL3.




Glomerular filtration rate decline as a surrogate end point in kidney disease progression trials

2016-09-02T00:05:38-07:00

Chronic kidney disease (CKD) is strongly associated with increased risks of progression to end-stage kidney disease (ESKD) and mortality. Clinical trials evaluating CKD progression commonly use a composite end point of death, ESKD or serum creatinine doubling. However, due to low event rates, such trials require large sample sizes and long-term follow-up for adequate statistical power. As a result, very few interventions targeting CKD progression have been tested in randomized controlled trials. To overcome this problem, the National Kidney Foundation and the Food and Drug Administration conducted a series of analyses to determine whether an end point of a 30 or 40% decline in estimated glomerular filtration rate (eGFR) over 2–3 years can substitute for serum creatinine doubling in the composite end point. These analyses demonstrated that these alternate kidney end points were significantly associated with subsequent risks of ESKD and death. However, the association between eGFR decline and clinical end points, and the consistency of treatment effects on both, were influenced by baseline eGFR, follow-up duration and acute haemodynamic effects. The investigators concluded that a 40% eGFR decline is broadly acceptable as a kidney end point across a wide baseline eGFR range and that a 30% eGFR decline may be acceptable in some situations. Although these alternate kidney end points could potentially allow investigators to conduct shorter clinical trials with smaller sample sizes, thereby generating evidence to guide clinical decision-making in a timely manner, it is uncertain whether these end points will improve trial efficiency and feasibility. This review critically appraises the evidence, strengths and limitations pertaining to eGFR end points.
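As a rough illustration of the alternate end point discussed above, the percentage eGFR decline from baseline can be checked against a 30 or 40% threshold. This is only the arithmetic of the end point definition, not the NKF/FDA analyses themselves; for comparison, a doubling of serum creatinine corresponds to roughly a 57% eGFR decline:

```python
def egfr_endpoint_reached(baseline_egfr, current_egfr, threshold_pct=40):
    """Return True when eGFR (mL/min/1.73 m2) has declined by at least
    `threshold_pct` percent from baseline. The 30 or 40% thresholds are
    the alternate kidney end points discussed above."""
    decline_pct = (baseline_egfr - current_egfr) / baseline_egfr * 100
    return decline_pct >= threshold_pct

# A fall from 60 to 35 mL/min/1.73 m2 is a ~42% decline:
print(egfr_endpoint_reached(60, 35))      # True
print(egfr_endpoint_reached(60, 45, 30))  # 25% decline -> False
```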




Changes in inflammatory biomarkers after renal revascularization in atherosclerotic renal artery stenosis

2016-09-02T00:05:38-07:00

Background

Atherosclerotic renal artery stenosis (ARAS) activates oxidative stress and chronic inflammatory injury. Contrast imaging and endovascular stenting pose potential hazards for acute kidney injury, particularly when superimposed upon reduced kidney perfusion.

Methods

We measured sequential early and long-term changes in circulating inflammatory and injury biomarkers in 12 subjects with ARAS undergoing computed tomography imaging and stent revascularization, compared with essential hypertensive (EH) subjects of similar age, under fixed sodium intake and medication regimens in a clinical research unit.

Results

NGAL, TIMP-2, IGFBP7, MCP-1 and TNF-α were all elevated before intervention. Post-stenotic kidney volume, perfusion, blood flow and glomerular filtration rate (GFR) were lower in ARAS than in EH subjects. TIMP-2 and IGFBP7 fell briefly, then rose over 18 h after contrast imaging and stent deployment. Circulating NGAL decreased and remained lower for 27 h. These biomarkers in ARAS returned to baseline after 3 months, while kidney volume, perfusion, blood flow and GFR increased but remained lower than in EH subjects.

Conclusions

These divergent patterns of inflammatory signals are consistent with cell cycle arrest (TIMP-2, IGFBP7) and relative protection from acute kidney injury after imaging and stenting. Sustained basal elevation of circulating and renal venous inflammatory biomarkers supports ongoing, possibly episodic, renal stress in ARAS that limits toxicity from stent revascularization.




Metallothioneins and renal ageing

2016-09-02T00:05:38-07:00

Background

Human lifespan is increasing continuously and about one-third of the population >70 years of age suffers from chronic kidney disease. The pathophysiology of the loss of renal function with ageing is unclear.

Methods

We determined age-associated gene expression changes in zero-hour biopsies of deceased-donor kidneys without laboratory signs of impaired renal function (defined as a last serum creatinine >0.96 mg/dL in females and >1.18 mg/dL in males), using microarray technology and the Significance Analysis of Microarrays routine. Expression changes of selected genes were confirmed by quantitative polymerase chain reaction, and the respective mRNA and protein were localized by in situ hybridization and immunohistochemistry. Functional aspects were examined in vitro.

Results

Donors were classified into three age groups (<40, 40–59 and >59 years; Groups 1, 2 and 3, respectively). In Group 3 especially, genes encoding metallothionein (MT) isoforms were significantly more highly expressed than in Group 1; localization studies revealed predominant staining in renal proximal tubular cells. RPTEC/TERT1 cells overexpressing MT2A were less susceptible to cadmium chloride-induced cytotoxicity and hypoxia-induced apoptosis, both models for increased generation of reactive oxygen species.

Conclusions

Increased expression of MTs in the kidney with ageing might be a protective mechanism against increased oxidative stress, which is closely related to the ageing process. Our findings indicate that MTs are functionally involved in the pathophysiology of ageing-related processes.




Extended versus standard azathioprine maintenance therapy in newly diagnosed proteinase-3 anti-neutrophil cytoplasmic antibody-associated vasculitis patients who remain cytoplasmic anti-neutrophil cytoplasmic antibody-positive after induction of remission: a randomized clinical trial

2016-09-02T00:05:38-07:00

Background

Cytoplasmic anti-neutrophil cytoplasmic antibody (C-ANCA) positivity at remission has been associated with an increased relapse rate in patients with proteinase 3 anti-neutrophil cytoplasmic antibody-associated vasculitis (PR3-AAV) after a switch to azathioprine maintenance therapy. We therefore hypothesized that extended azathioprine maintenance therapy could reduce the incidence of relapse in this setting.

Methods

Patients newly diagnosed with PR3-AAV at 12 centres in The Netherlands during 2003–11 who received a standardized induction regimen consisting of oral cyclophosphamide and corticosteroids were enrolled (n = 131). Patients who were C-ANCA positive at the time of stable remission were randomized to standard or extended azathioprine maintenance therapy. Standard maintenance treatment consisted of azathioprine (1.5–2.0 mg/kg) until 1 year after diagnosis, subsequently tapered by 25 mg every 3 months. Extended azathioprine maintenance therapy (1.5–2.0 mg/kg) was continued until 4 years after diagnosis and tapered thereafter. The primary endpoint was relapse-free survival at 4 years after diagnosis.

Results

In patients with PR3-AAV who were C-ANCA positive at the time of stable remission, relapse-free survival at 4 years after diagnosis did not differ significantly between standard azathioprine (n = 24) and extended azathioprine (n = 21) maintenance therapy (P = 0.40). There was also no significant difference in relapse-free survival between patients receiving standard azathioprine (n = 106) versus extended azathioprine maintenance therapy (n = 21; P = 0.94). In addition, there was no difference in the relapse rate between patients with PR3-AAV who were C-ANCA positive (n = 45) at the time of remission versus patients who became C-ANCA negative at the time of remission (n = 82; P = 0.62).

Conclusions

This randomized trial suggests that extended azathioprine maintenance therapy has only a limited effect on the prevention of relapse in patients with PR3-AAV at 4 years after diagnosis. Moreover, positive C-ANCA status at stable remission was not associated with an increased rate of relapse.

Trial registration

ClinicalTrials.gov NCT00128895.




Relationship of proximal tubular injury to chronic kidney disease as assessed by urinary kidney injury molecule-1 in five cohort studies

2016-09-02T00:05:38-07:00

Background

The primary biomarkers used to define CKD are serum creatinine and albuminuria. These biomarkers have directed focus on the filtration and barrier functions of the kidney glomerulus even though albuminuria results from tubule dysfunction as well. Given that proximal tubules make up ~90% of kidney cortical mass, we evaluated whether a sensitive and specific marker of proximal tubule injury, urinary kidney injury molecule-1 (KIM-1), is elevated in individuals with CKD or with risk factors for CKD.

Methods

We measured urinary KIM-1 in participants of five cohort studies from the USA and Sweden. Participants had a wide range of kidney function and were racially and ethnically diverse. Multivariable linear regression models were used to test the association of urinary KIM-1 with demographic, clinical and laboratory values.

Results

In pooled, multivariable-adjusted analyses, log-transformed, creatinine-normalized urinary KIM-1 levels were higher in those with lower eGFR {β = –0.03 per 10 mL/min/1.73 m2 [95% confidence interval (CI) –0.05 to –0.02]} and greater albuminuria [β = 0.16 per unit of log albumin:creatinine ratio (95% CI 0.15–0.17)]. Urinary KIM-1 levels were higher in current smokers, lower in blacks than nonblacks and lower in users versus nonusers of angiotensin-converting enzyme inhibitors and angiotensin receptor blockers.

Conclusion

Proximal tubule injury appears to be an integral and measurable element of multiple stages of CKD.




Individual long-term albuminuria exposure during angiotensin receptor blocker therapy is the optimal predictor for renal outcome

2016-09-02T00:05:38-07:00

Background

Albuminuria reduction due to angiotensin receptor blockers (ARBs) predicts subsequent renoprotection. Relating the initial albuminuria reduction to subsequent renoprotection assumes that the initial ARB-induced albuminuria reduction remains stable during follow-up. The aim of this study was to assess individual albuminuria fluctuations after the initial ARB response and to determine whether taking individual albuminuria fluctuations into account improves renal outcome prediction.

Methods

Patients with diabetes and nephropathy treated with losartan or irbesartan in the RENAAL and IDNT trials were included. Patients with a >30% reduction in albuminuria 3 months after ARB initiation were stratified by the subsequent change in albuminuria until Month 12 into enhanced responders (>50% albuminuria reduction), sustained responders (20–50% reduction) and response escapers (<20% reduction). The predictive performance of the individual albuminuria exposure until Month 3 was compared with the exposure over the first 12 months using receiver operating characteristic (ROC) curves.
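The responder strata defined above can be sketched as a small classifier. We assume, for illustration, that both the Month-3 and Month-12 reductions are computed relative to baseline; the function and its boundary handling are our assumptions, and the trial analyses were more elaborate:

```python
def stratify_response(alb_baseline, alb_month3, alb_month12):
    """Stratify the albuminuria response using the cut-offs above.
    Reductions are taken relative to baseline (our assumption).
    Returns None when the Month-3 reduction does not exceed 30%."""
    month3_reduction = (alb_baseline - alb_month3) / alb_baseline * 100
    if month3_reduction <= 30:
        return None                     # not an initial responder
    month12_reduction = (alb_baseline - alb_month12) / alb_baseline * 100
    if month12_reduction > 50:
        return "enhanced responder"
    if month12_reduction >= 20:
        return "sustained responder"
    return "response escaper"

# A patient going from 1000 to 600 mg/g at Month 3 (40% reduction) and
# to 350 mg/g at Month 12 (65% reduction) is an enhanced responder:
print(stratify_response(1000, 600, 350))
```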

Results

Following ARB initiation, 388 (36.3%) patients showed a >30% reduction in albuminuria. Among these patients, the albuminuria level further decreased in 174 (44.8%), remained stable in 123 (31.7%) and increased in 91 (23.5%). Similar albuminuria fluctuations were observed in patients with a <30% albuminuria reduction. Renal risk prediction improved when using the albuminuria exposure during the first 12 months versus the initial Month 3 change [ROC difference: 0.78 (95% CI 0.75–0.82) versus 0.68 (0.64–0.72); P < 0.0001].

Conclusions

Following the initial response to ARBs, a large within-patient albuminuria variability is observed. Hence, incorporating multiple albuminuria measurements over time in risk algorithms may be more appropriate to monitor treatment effects and quantify renal risk.




Use of urine biomarker-derived clusters to predict the risk of chronic kidney disease and all-cause mortality in HIV-infected women

2016-09-02T00:05:38-07:00

Background

Although individual urine biomarkers are associated with chronic kidney disease (CKD) incidence and all-cause mortality in the setting of HIV infection, their combined utility for prediction remains unknown.

Methods

We measured eight urine biomarkers shown previously to be associated with incident CKD and mortality risk among 902 HIV-infected women in the Women's Interagency HIV Study: N-acetyl-β-d-glucosaminidase (NAG), kidney injury molecule-1 (KIM-1), alpha-1 microglobulin (α1m), interleukin 18, neutrophil gelatinase-associated lipocalin, albumin-to-creatinine ratio, liver fatty acid-binding protein and α-1-acid-glycoprotein. A group-based cluster method classified participants into three distinct clusters using the three most distinguishing biomarkers (NAG, KIM-1 and α1m), independent of the study outcomes. We then evaluated associations of each cluster with incident CKD (estimated glomerular filtration rate <60 mL/min/1.73 m2 by cystatin C) and all-cause mortality, adjusting for traditional and HIV-related risk factors.

Results

Over 8 years of follow-up, 177 CKD events and 128 deaths occurred. The three-biomarker clustering partitioned women into three groups, containing 301 (Cluster 1), 470 (Cluster 2) and 131 (Cluster 3) participants. The rate of CKD incidence was 13, 21 and 50% across the three clusters; mortality rates were 7.3, 13 and 34%. After multivariable adjustment, Cluster 3 remained associated with a nearly 3-fold increased risk of both CKD and mortality, relative to Cluster 1 (both P < 0.001). The addition of the multi-biomarker cluster to the multivariable model improved discrimination for CKD (c-statistic 0.72 to 0.76, P = 0.0029), but only modestly for mortality (c-statistic 0.79 to 0.80, P = 0.099). Clusters derived with all eight markers were no better for discrimination than the three-biomarker clusters.

Conclusions

For predicting incident CKD in HIV-infected women, clusters developed from three urine-based kidney disease biomarkers were as effective as an eight-marker panel in improving risk discrimination.




PLA2R antibodies, glomerular PLA2R deposits and variations in PLA2R1 and HLA-DQA1 genes in primary membranous nephropathy in South Asians

2016-09-02T00:05:38-07:00

Background

Antibodies to M-type phospholipase A2 receptor (PLA2R) correlate with clinical activity of primary membranous nephropathy (PMN). Risk alleles in PLA2R1 and HLA-DQA1 genes are associated with PMN. Whether these alleles are associated with the development of anti-PLA2R is unknown. In this prospective study we evaluated anti-PLA2R, enhanced glomerular staining for PLA2R and variations in PLA2R1 and HLA-DQA1 genes in Indian patients with PMN and examined their association with response to treatment.

Methods

A total of 114 adult PMN patients were studied. Anti-PLA2R was estimated before treatment and after 6 and 12 months of therapy. Enhanced glomerular staining for PLA2R was assessed on fresh frozen tissue. Genotype analysis was done on recruited patients and 95 healthy controls by TaqMan assays for six single-nucleotide polymorphisms (SNPs; rs4664308, rs3749119, rs3749117, rs4664308, rs3828323 and rs2187668). Patients were followed up monthly for a period of 12 months.

Results

Of the 114 patients, 66.7% showed elevated serum anti-PLA2R by ELISA and 64.9% by indirect immunofluorescence. About 75% had enhanced glomerular staining for PLA2R, and 82% of patients had PLA2R-related disease. Reduction in serum anti-PLA2R titer had a significant association with remission of nephrotic syndrome (P = 0.0003) at 6 and 12 months. More than 85% of patients showing a >90% reduction in the anti-PLA2R titer achieved remission of the nephrotic state, whereas of those showing a <50% reduction in titer, 87.5% had a persistent nephrotic state. The SNPs rs3749119, rs3749117 and rs4664308 in PLA2R1 and rs2187668 in HLA-DQA1 were significantly associated with PMN. The SNP rs2187668 was associated with anti-PLA2R positivity. Patients with a high-risk genotype had higher anti-PLA2R levels.

Conclusion

Anti-PLA2R antibodies and enhanced glomerular PLA2R staining are found in more than two-thirds of Indian PMN cases. A reduction in the anti-PLA2R titer correlated with response to therapy.




Fibroblast growth factor 23 correlates with volume status in haemodialysis patients and is not reduced by haemodialysis

2016-09-02T00:05:38-07:00

Background

Recent data suggest a role for fibroblast growth factor 23 (FGF-23) in volume regulation. In haemodialysis patients, a large ultrafiltration volume (UFV) reflects poor volume control, and both FGF-23 and a large UFV are risk factors for mortality in this population. We studied the association between FGF-23 and markers of volume status including UFV, as well as the intradialytic course of FGF-23, in a cohort of haemodialysis patients.

Methods

We carried out an observational, post hoc analysis of 109 prevalent haemodialysis patients who underwent a standardized, low-flux haemodialysis session with a constant ultrafiltration rate. We measured UFV, plasma copeptin and echocardiographic parameters, including cardiac output, end-diastolic volume and left ventricular mass index, at the onset of the haemodialysis session. We measured the intradialytic course of plasma C-terminal FGF-23 (corrected for haemoconcentration) and serum phosphate levels at 0, 1, 3 and 4 h after the onset of haemodialysis and analysed changes with a linear mixed-effects model.

Results

Median age was 66 (interquartile range 51–75) years; 65% were male, with a weekly Kt/V of 4.3 ± 0.7 and a dialysis vintage of 25.4 (8.5–52.5) months. In univariable analysis, pre-dialysis plasma FGF-23 was associated with UFV, end-diastolic volume, cardiac output, early diastolic velocity e' and plasma copeptin. In multivariable regression analysis, UFV correlated with FGF-23 (standardized β: 0.373, P < 0.001, model R2: 57%), independent of serum calcium and phosphate. The association between FGF-23 and echocardiographic volume markers was lost for all but cardiac output upon adjustment for UFV. Overall, FGF-23 levels did not change during dialysis [7627 (3300–13 514) to 7503 (3109–14 433) RU/mL; P = 0.98], whereas phosphate decreased (1.71 ± 0.50 to 0.88 ± 0.26 mmol/L; P < 0.001).

Conclusions

FGF-23 was associated with volume status in haemodialysis patients. The strong association with UFV suggests that optimization of volume status, for example by more intensive haemodialysis regimens, may also benefit mineral homeostasis. A single dialysis session did not lower FGF-23 levels.




Mortality trends among Japanese dialysis patients, 1988-2013: a joinpoint regression analysis

2016-09-02T00:05:38-07:00

Background

Evaluation of mortality trends in dialysis patients is important for improving their prognoses. The present study aimed to examine temporal trends in deaths (all-cause, cardiovascular, noncardiovascular and the five leading causes) among Japanese dialysis patients.

Methods

Mortality data were extracted from the Japanese Society for Dialysis Therapy registry. Age-standardized mortality rates were calculated by direct standardization against the 2013 dialysis population. The annual percentage change (APC) and the corresponding 95% confidence interval (CI) were computed for trends using joinpoint regression analysis.
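Within a single segment, joinpoint regression reduces to an ordinary least-squares fit of the log rate on calendar year, with APC = 100 × (e^slope − 1). A minimal stdlib sketch (illustrative only; the study used dedicated joinpoint software, and the rates below are synthetic):

```python
import math

def annual_percentage_change(years, rates):
    """APC within one joinpoint segment: OLS slope of ln(rate) on year,
    back-transformed to a percentage change per year."""
    logs = [math.log(r) for r in rates]
    n = len(years)
    mean_y = sum(years) / n
    mean_l = sum(logs) / n
    slope = (sum((y - mean_y) * (l - mean_l) for y, l in zip(years, logs))
             / sum((y - mean_y) ** 2 for y in years))
    return 100.0 * (math.exp(slope) - 1.0)

# Synthetic rates falling 3.7% per year, mimicking the 1988-2000 segment
years = list(range(1988, 2001))
rates = [100.0 * 0.963 ** (y - 1988) for y in years]
apc = annual_percentage_change(years, rates)   # close to -3.7
```

Joinpoint software additionally searches for the change-points (here, the year 2000) that best partition the series into such segments.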

Results

A total of 469 324 deaths occurred, of which 25.9% were from cardiac failure, 17.5% from infectious disease, 10.2% from cerebrovascular disorders, 8.6% from malignant tumors and 5.6% from cardiac infarction. The joinpoint trend for all-cause mortality decreased significantly, by –3.7% (95% CI –4.2 to –3.2) per year from 1988 through 2000, then decreased more gradually, by –1.4% (95% CI –1.7 to –1.2) per year during 2000–13. The improved mortality rates were mainly due to decreased deaths from cardiovascular disease, with mortality rates due to noncardiovascular disease outnumbering those of cardiovascular disease in the last decade. Among the top five causes of death, cardiac failure showed the most marked decrease in mortality rate, whereas rates due to infectious disease remained stable throughout the study period [APC 0.1 (95% CI –0.2 to 0.3)].

Conclusions

Significant progress has been made, particularly with regard to the decrease in age-standardized mortality rates. The risk of cardiovascular death has decreased, while the risk of death from infection has remained unchanged for 25 years.




Phosphorus metabolism in peritoneal dialysis- and haemodialysis-treated patients

2016-09-02T00:05:38-07:00

Background

Phosphorus control is generally considered to be better in peritoneal dialysis (PD) patients as compared with haemodialysis (HD) patients. Predialysis phosphorus concentrations are misleading as a measure of phosphorus exposure in HD, as these neglect significant dialysis-related fluctuations.

Methods

Parameters of mineral metabolism, including parathyroid hormone (PTH) and fibroblast growth factor-23 (FGF-23), were determined in 79 HD and 61 PD patients. In PD, phosphorus levels were determined mid-morning. In HD, time-averaged phosphorus concentrations were modelled from measurements before and after the mid-week dialysis session. Weekly renal, dialytic and total phosphorus clearances as well as total mass removal were calculated from urine and dialysate collections.

Results

Time-averaged serum phosphorus concentrations in HD (3.5 ± 1.0 mg/dL) were significantly lower than the mid-morning concentrations in PD (5.0 ± 1.4 mg/dL, P < 0.0001). In contrast, predialysis phosphorus concentrations (4.6 ± 1.4 mg/dL) were not different from PD. PTH and FGF-23 levels were significantly higher in PD. Despite higher residual renal function, total phosphorus clearance was significantly lower in PD (P < 0.0001). Total phosphorus mass removal, conversely, was significantly higher in PD (P < 0.05).

Conclusions

Our data suggest that time-averaged phosphorus concentrations are higher in patients treated with PD than in those treated with HD. Despite better preserved residual renal function, total phosphorus clearance is lower in patients treated with PD. Additional studies are needed to confirm these findings in populations with different demographic profiles and dietary backgrounds and to define the clinical implications.




High-urgency kidney transplantation in the Eurotransplant Kidney Allocation System: success or waste of organs? The Eurotransplant 15-year all-centre survey

2016-09-02T00:05:38-07:00

Background

In the Eurotransplant Kidney Allocation System (ETKAS), transplant candidates can be considered for high-urgency (HU) status in case of life-threatening inability to undergo renal replacement therapy. Data on the outcomes of HU transplantation are sparse and the benefit is controversial.

Methods

We systematically analysed data from 898 ET HU kidney transplant recipients from 61 transplant centres between 1996 and 2010 and investigated the 5-year patient and graft outcomes and differences between relevant subgroups.

Results

Kidney recipients with an HU status were younger (median 43 versus 55 years) and spent less time on the waiting list compared with non-HU recipients (34 versus 54 months). They received grafts with significantly more mismatches (mean 3.79 versus 2.42; P < 0.001) and the percentage of retransplantations was remarkably higher (37.5 versus 16.7%). Patient survival (P = 0.0053) and death with a functioning graft (DwFG; P < 0.0001) after HU transplantation were significantly worse than in non-HU recipients, whereas graft outcome was comparable (P = 0.094). Analysis according to the different HU indications revealed that recipients listed HU because of an imminent lack of access for dialysis had a significantly worse patient survival (P = 0.0053) and DwFG (P = 0.0462) compared with recipients with psychological problems and suicidality because of dialysis. In addition, retransplantation had a negative impact on patient and graft outcome.

Conclusions

Facing organ shortages, increasing waiting times and considerable mortality on dialysis, we question the current policy of HU allocation and propose more restrictive criteria with regard to individuals with vascular complications or repeated retransplantations, in order to support patients on the non-HU waiting list, who have a much better long-term prognosis.




Estimated nephron number of the remaining donor kidney: impact on living kidney donor outcomes

2016-09-02T00:05:38-07:00

Background

It has been demonstrated that low birth weight gives rise to a reduction in nephron number with increased risks for hypertension and renal disease. Its impact on renal function in kidney donors, however, has not been addressed.

Methods

To investigate the impact of birth weight, kidney weight, kidney volume and estimated nephron number on kidney function, we collected data from 91 living kidney donors before nephrectomy and at +12, +36 and +60 months after nephrectomy.

Results

Birth weight showed a positive correlation with estimated glomerular filtration rate (eGFR) at +12, +36 and +60 months after nephrectomy (P < 0.05). The strongest link was observed in donors >50 years old (R = 0.535, P < 0.001 at +12 months). Estimated nephron number and eGFR showed a strong positive correlation at +12, +36 and +60 months after nephrectomy (R = 0.540; R = 0.459; R = 0.506, P < 0.05). Daily proteinuria at +12 months showed a negative correlation with birth weight (P = 0.009). Donors with new-onset hypertension showed significantly lower birth weights and higher uric acid levels (P < 0.05). Kidney weight and volume did not show any impact on donor outcomes (P > 0.05).

Conclusions

A low nephron number predisposes donors to an inferior remaining eGFR, hypertension and proteinuria. The strong correlation in elderly donors may be attributed to reduced renal functional reserve owing to the age-related decline of renal function.




'I feel stronger and younger all the time'--perspectives of elderly kidney transplant recipients: thematic synthesis of qualitative research

2016-09-02T00:05:38-07:00

Background

Kidney transplantation offers improved survival and quality of life to an increasing number of elderly patients with end-stage kidney disease. However, elderly kidney transplant recipients may face unique challenges due to a higher burden of comorbidity, greater cumulative risk of immunosuppression-related complications and increasing frailty. We aimed to describe the perspectives of elderly kidney transplant recipients.

Methods

Electronic databases were searched to April 2015. Qualitative studies were eligible if they reported views from elderly kidney transplant recipients (≥60 years). Thematic synthesis was used to analyse the findings.

Results

Twenty-one studies involving >116 recipients were included. We identified seven themes. ‘Regaining strength and vitality’ meant valuing the physical and psychosocial improvements in daily functioning and life participation. ‘Extending life’ was the willingness to accept any organ, including extended criteria kidneys, to prolong survival. ‘Debt of gratitude’ entailed conscious appreciation toward their donor while knowing they were unable to repay their sacrifice. ‘Moral responsibility to maintain health’ motivated adherence to medication and lifestyle recommendations out of an ethical duty to protect their gift for graft survival. ‘Unabating and worsening forgetfulness’ hindered self-management. ‘Disillusionment with side effects and complications’ reflected disappointment and exasperation with the unintended consequences of medications. ‘Finality of treatment option’ was an acute awareness that the current transplant may be their last.

Conclusions

Kidney transplantation was perceived to slow and even reverse the experience of aging among elderly recipients, especially compared with dialysis. However, some were frustrated over persistent limitations after transplant, struggled with the burden of medication side effects and worried about a possible return to dialysis if the transplant failed. Clarifying patient expectations of transplantation, providing support to alleviate the debilitating impacts of immunosuppression and addressing fears about deteriorating health and graft failure may improve satisfaction and outcomes in elderly kidney transplant recipients.




Next generation sequencing and functional analysis of patient urine renal progenitor-derived podocytes to unravel the diagnosis underlying refractory lupus nephritis

2016-09-02T00:05:38-07:00

Often the cause of refractory lupus nephritis (RLN) remains unclear. We performed next-generation sequencing of podocyte genes in an RLN patient and identified compound heterozygosity for APOL1 risk alleles G1 and G2 and a novel homozygous c.[1049C>T]+[1049C>T] NPHS1 gene variant of unknown significance. To test for causality, renal progenitor cells isolated from the urine of this patient were differentiated into podocytes in vitro. These podocytes revealed aberrant nephrin trafficking, cytoskeletal structure and lysosomal leakage, and increased detachment compared with podocytes isolated from controls. Thus, lupus podocytopathy can be confirmed as a cause of RLN by functional genetics on patient-derived podocytes.




Announcements

2016-09-02T00:05:38-07:00





Effects of non-pharmacological interventions on urinary citrate levels: a systematic review and meta-analysis

2016-08-01T00:05:39-07:00

Background

Hypocitraturia is a known risk factor for nephrolithiasis, present in 20–60% of stone-forming patients. The administration of citrate or other alkali preparations has been demonstrated to benefit hypocitraturic stone formers. Dietary modifications that include citrate-containing fluids can be an alternative option to pharmacological agents. We aimed to systematically review, summarize and quantify available evidence on the effects of non-pharmacological interventions on urinary citrate and nephrolithiasis.

Methods

Manual and electronic database searches (MEDLINE/PubMed, Embase, Cochrane Library, Scopus, Scielo, LILACS) were performed for studies published up to July 2014. Two reviewers independently identified studies for inclusion and extracted data on study characteristics, outcomes and quality assessments. We included controlled studies with non-pharmacological interventions that assessed urinary citrate levels or nephrolithiasis pre- and post-intervention. Meta-analysis was performed by random effects and subgrouped by the type of intervention, and heterogeneity was analysed by I2.
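The random-effects pooling and I2 heterogeneity statistic used here follow the standard DerSimonian-Laird recipe. A self-contained sketch, assuming each study contributes an effect estimate with its variance (the inputs below are made up, not the review's data):

```python
import math

def random_effects_pool(effects, variances, z=1.96):
    """DerSimonian-Laird random-effects meta-analysis.
    Returns the pooled estimate, its z-based confidence interval
    and the I^2 heterogeneity statistic (percent)."""
    w = [1.0 / v for v in variances]            # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)               # between-study variance
    ws = [1.0 / (v + tau2) for v in variances]  # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(ws, effects)) / sum(ws)
    se = math.sqrt(1.0 / sum(ws))
    i2 = max(0.0, 100.0 * (q - df) / q) if q > 0 else 0.0
    return pooled, (pooled - z * se, pooled + z * se), i2

pooled, ci, i2 = random_effects_pool([0.5, 0.1], [0.01, 0.01])
```

When the heterogeneity statistic Q does not exceed its degrees of freedom, tau2 and I2 collapse to zero and the model reduces to fixed-effect pooling.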

Results

Of the 427 studies identified, 13 were included (18 samples), involving 358 participants with a mean age of 43 ± 11.0 years across the studies. Interventions were grouped as commercial fruit juices, soft drinks, calcium-/magnesium-rich mineral water, high-fiber diet, low-animal-protein diet and plant extract. Almost half of the studies (6/13; 8/18 samples) reported effects in non-stone formers, and two studies included both stone formers and non-stone formers. Commercial fruit juice interventions showed high heterogeneity (I2 = 88.1%, P < 0.001) and an increase in citraturia of 167.2 (95% confidence interval 65.4–269) mg/day. The other types of intervention did not show important heterogeneity, but their pooled estimates were not significant.

Conclusion

Our review indicates that larger-scale trials are required to establish whether non-pharmacological interventions can increase urinary citrate levels and contribute to kidney stone prevention.




Lack of evidence does not justify neglect: how can we address unmet medical needs in calciphylaxis?

2016-08-01T00:05:39-07:00

Calcific uraemic arteriolopathy (CUA), or calciphylaxis, is a rare disease occurring predominantly in patients on dialysis. Owing to the very low frequency of CUA, prospective studies on its management are lacking and even anecdotal reports on treatment remain scarce. Calciphylaxis therefore remains a challenging disease with a dismal prognosis, urgently requiring adequate strategies for diagnosis and treatment.

In an attempt to fill some of the current gaps in evidence on various highly debated and controversial aspects of dialysis-associated calciphylaxis, 13 international experts joined the 1st Consensus Conference on CUA, held in Leuven, Belgium on 21 September 2015. The conference was supported by the European Calciphylaxis Network (EuCalNet), a task force of the ERA-EDTA scientific working group on Chronic Kidney Disease—Mineral and Bone Disorders (CKD-MBD). After an intense discussion, each participant anonymously answered a 9-point Likert-scale questionnaire covering 20 unresolved items on the diagnosis and management of calciphylaxis. On the one hand, analysis of the expert opinions identified areas of general consensus, which might be a valuable aid for physicians with less experience in treating this disease. On the other hand, some topics, such as the pertinence of skin biopsy and the administration of certain treatments, revealed divergent opinions. The aim of the present summary report is to provide some guidance for clinicians who face patients with calciphylaxis in the current absence of evidence-based medicine.




Pro: Higher serum bicarbonate in dialysis patients is protective

2016-08-01T00:05:39-07:00

Chronic metabolic acidosis is common in dialysis patients, and bicarbonate administration via the dialysate helps maintain their acid–base balance. The serum bicarbonate level in dialysis patients is determined by several factors, including dietary protein intake, nutritional status and the dialysis prescription. Additionally, a meaningful interpretation of serum bicarbonate in dialysis patients requires an understanding of the complexities involved in its measurement. Both very low and very high levels of serum bicarbonate have been associated with adverse outcomes in observational studies. However, recent observational data, when adjusted for the confounding effects of nutritional status, do not associate higher predialysis serum bicarbonate with adverse consequences. At this time, no prospective studies have examined the association of serum bicarbonate with hard outcomes in dialysis patients, so the ideal level of serum bicarbonate in dialysis patients remains unknown. This article examines the available data with regard to the benefits of higher predialysis serum bicarbonate.




Opponent's comments

2016-08-01T00:05:39-07:00




Con: Higher serum bicarbonate in dialysis patients is protective

2016-08-01T00:05:39-07:00

Metabolic acidosis is often observed in advanced chronic kidney disease, with deleterious consequences on the nutritional status, bone and mineral status, inflammation and mortality. Through clearance of the daily acid load and a net gain in alkaline buffers, dialysis therapy is aimed at correcting metabolic acidosis. A normal bicarbonate serum concentration is the recommended target in dialysis patients. However, several studies have shown that a mild degree of metabolic acidosis in patients treated with dialysis is associated with better nutritional status, higher protein intake and improved survival. Conversely, a high bicarbonate serum concentration is associated with poor nutritional status and lower survival. It is likely that mild acidosis results from a dietary acid load linked to animal protein intake. In contrast, a high bicarbonate concentration in patients treated with dialysis could result mainly from an insufficient dietary acid load, i.e. low protein intake. Therefore, a high pre-dialysis serum bicarbonate concentration should prompt nephrologists to carry out nutritional investigations to detect insufficient dietary protein intake. In any case, a high bicarbonate concentration should be neither a goal of dialysis therapy nor an index of adequate dialysis, whereas mild acidosis could be considered as an indicator of appropriate protein intake.




Opponent's comments

2016-08-01T00:05:39-07:00




Moderator's view: Higher serum bicarbonate in dialysis patients is protective

2016-08-01T00:05:39-07:00

Several observational studies have reported an association between higher serum bicarbonate levels and high mortality risk in dialysis patients. However, the mere discovery of an association in such studies does not allow one to infer a causal relationship. This association may be related to inadequate dietary protein intake, which leads to less acid generation and hence a higher serum bicarbonate level. Since undernutrition is a strong predictor of death in hemodialysis patients, the observed association may be an epiphenomenon rather than a biologically plausible relationship. Higher protein and fluid intake between two subsequent hemodialysis treatments may lead to a lower serum bicarbonate level. This low bicarbonate level may appear protective, as patients with higher food intake and better appetite generally exhibit greater survival. In the contemporary three-stream proportioning system of hemodialysis treatment, the bicarbonate concentrate is separate from the acid concentrate, and the contribution of the acid concentrate's organic acid (acetate, citrate or diacetate) to the delivered bicarbonate pool of the patient is negligible. The concept of 'total buffer', which assumes that the bicarbonate and acetate concentrations in the dialysate add equally as bicarbonate equivalents, is likely wrong; it rests on the misleading notion that the acetate of the acid concentrate is fully metabolized to bicarbonate in the dialysate. Given these uncertainties, it is prudent to avoid excessively high or low bicarbonate levels in dialysis patients.




Protective role of mouse IgG1 in cryoglobulinaemia; insights from an animal model and relevance to human pathology

2016-08-01T00:05:39-07:00

Strait et al. described a novel mouse model of cryoglobulinaemia by challenging mice deficient in the immunoglobulin (Ig)G1 subclass (γ1 mice) with goat anti-mouse IgD [5]. The phenotype of wild-type mice was not remarkable, whereas γ1 mice developed IgG3 anti-goat IgG cryoglobulins as well as severe and lethal glomerulonephritis. The renal phenotype could not be rescued in γ1 mice by deletion of C3, the fragment crystallizable receptor (FcγR) or the J chain. On the other hand, early injection of IgG1, IgG2a or IgG2b inhibited the pathogenic effects of IgG3 in an antigen-dependent manner, even in the absence of FcγRIIb, an anti-inflammatory receptor. The authors concluded that the pathogenic role of IgG3 and the protective characteristic of IgG1 in this model are not explained by their ability to bind FcγRs or effector molecules but rather by structural differences affecting the precipitation properties/solubility of IgG3- and IgG1-containing immune complexes. The present article discusses current knowledge of IgG biology and the properties of IgGs that explain their differential propensity to acquire cryoglobulin activity.




How much can the tubule regenerate and who does it? An open question

2016-08-01T00:05:39-07:00

The tubular compartment of the kidney is the primary site of a wide range of insults that can result in acute kidney injury (AKI), a condition associated with high mortality and an increased risk of developing end-stage renal disease. Nevertheless, kidney function often recovers quickly after tubular injury. How this happens has only partially been unveiled. Indeed, although it has been clearly demonstrated that regenerated epithelial cells arise from surviving intratubular cells, the true extent, as well as the cellular source, of this regenerative process remains largely unknown. Can any proximal tubular epithelial cell dedifferentiate and divide to replace neighbouring lost tubular cells, suggesting an extreme regenerative ability of the residual tubular epithelium, or is the regenerative potential of the tubular epithelium limited and mostly confined to a pre-existing population of scattered intratubular progenitor cells that are more resistant to death? Gaining insight into how this process takes place is essential for developing new therapeutic strategies to prevent AKI, as well as AKI-related chronic kidney disease. The aim of this review is to discuss why the answers to these questions remain open and which further investigations are needed to establish the true regenerative potential of the tubule and the players that allow functional recovery after AKI.




Cost of renal replacement: how to help as many as possible while keeping expenses reasonable?

2016-08-01T00:05:39-07:00

The treatment of kidney diseases consumes a substantial share of the health budget for a relatively small fraction of the overall population. If the nephrological community and society do not develop mechanisms to contain these costs, it will become impossible to continue assuring optimal outcomes and quality of life while treating all patients who need it. In this article, we describe several mechanisms to maintain the sustainability of renal replacement therapy. These include (i) encouragement of transplantation after both living and deceased donation; (ii) stimulation of alternative dialysis strategies besides classical in-hospital haemodialysis, such as home haemodialysis, peritoneal dialysis or self-care, which require less reimbursement; (iii) promotion of educational activities guiding patients towards the therapies that are most suited for them; (iv) consideration of one or more cost-containment incentives, such as bundling of reimbursement (provided it does not affect the quality of treatment), timely patient referral, green dialysis, starting dialysis based on clinical necessity rather than renal function parameters, and/or prevention of CKD or its progression; (v) strategically planned adaptation to the expected growth of the ageing population in need of renal replacement; (vi) support for research directed at helping the largest possible patient populations at acceptable cost; and (vii) more patient-centred approaches. We also extend the discussion to the specific situation of kidney diseases in low- and middle-income countries. Finally, we point to the dramatic differences in accessibility and reimbursement of the different modalities throughout Europe. We hope that this text offers a framework for the nephrological community, including patients and nurses, and for the policy makers and caregivers concerned, on how to continue reaching all patients in need of renal replacement at affordable expense.




Immunosuppression in the failing and failed transplant kidney: optimizing outcomes

2016-08-01T00:05:39-07:00

There are few data to guide clinicians on the optimal management of immunosuppression in patients whose kidney transplant has failed and who have returned to dialysis, nor are there robust data on whether to perform a transplant nephrectomy. Finally, management of late-stage chronic kidney disease, including decisions on dialysis initiation, modality and access planning, must occur simultaneously with efforts aimed at preserving the failing kidney and residual renal function for as long as possible. In this article, we review the evidence on these topics and suggest areas for improvement.




Endurance exercise and growth hormone improve bone formation in young and growth-retarded chronic kidney disease rats

2016-08-01T00:05:39-07:00

Background

Childhood chronic kidney disease (CKD) is associated with both short stature and abnormal bone mineralization. Normal longitudinal growth depends on proper maturation of epiphyseal growth plate (EGP) chondrocytes, leading to the formation of trabecular bone in the primary ossification centre. We have recently shown that linear growth impairment in CKD is associated with impaired EGP growth hormone (GH) receptor signalling and that exercise improved insulin-like growth factor I (IGF-I) signalling in CKD-related muscle atrophy.

Methods

In this study, 20-day-old rats underwent 5/6 nephrectomy (CKD) or sham surgery (C) and were exercised with treadmill, with or without GH supplementation.

Results

CKD-related growth retardation was associated with a widened EGP hypertrophic zone, which was not fully corrected by exercise (except for tibial length). Exercise in CKD improved the EGP expression of key factors of endochondral ossification such as IGF-I, vascular endothelial growth factor (VEGF), receptor activator of nuclear factor kappa-B ligand (RANKL) and osteocalcin. Combining GH treatment with treadmill exercise for 2 weeks improved the decreased trabecular bone volume in CKD, as well as the growth plate expression of runt-related transcription factor 2, RANKL, metalloproteinase 13 and VEGF, whereas GH treatment alone did not.

Conclusions

Treadmill exercise improves tibial bone linear growth, as well as growth plate local IGF-I. When combined with GH treatment, running exercise shows beneficial effects on trabecular bone formation, suggesting the potential benefit of this combination for CKD-related short stature and bone disease.




Targeted sequencing of 96 renal developmental microRNAs in 1213 individuals from 980 families with congenital anomalies of the kidney and urinary tract

2016-08-01T00:05:39-07:00

Background

Congenital anomalies of the kidney and urinary tract (CAKUT) are the most common cause of chronic kidney diseases in children and young adults, accounting for ~50% of cases. These anomalies represent maldevelopment of the genitourinary system and can be genetically explained in only 10–16% of cases by mutations or by copy number variations in protein coding sequences. Knock-out mouse models, lacking components of the microRNA (miRNA) processing machinery (i.e. Dicer, Drosha, Dgcr8), exhibit kidney malformations resembling human CAKUT.

Methods

Given the Dicer-null mouse phenotype, which implicates a central role for miRNA-mediated gene regulation during kidney development, we hypothesized that miRNAs expressed during kidney development may, if mutated, cause CAKUT in humans. To evaluate this possibility, we carried out next-generation sequencing of 96 stem-loop regions of 73 renal developmental miRNA genes in 1248 individuals with non-syndromic CAKUT from 980 families.

Results

We sequenced 96 stem-loop regions encoded by 73 miRNA genes that are expressed during kidney development in humans, mice and rats. Overall, we identified 17 different single nucleotide variants in 31/1213 individuals from 26 families. Two variants did not segregate with the disease and hence were not causative. Thirteen variants were likely benign because they occurred in control populations and/or affected nucleotides of weak evolutionary conservation. Two of the 1213 unrelated individuals had potentially pathogenic variants of unknown biologic relevance affecting the miRNAs MIR19B1 and MIR99A.

Conclusions

Our results indicate that mutations affecting mature microRNAs in individuals with CAKUT are rare and thus most likely not a common cause of CAKUT in humans.




The impact of dialysis on the survival of patients with immunoglobulin light chain (AL) amyloidosis undergoing autologous stem cell transplantation

2016-08-01T00:05:39-07:00

Background

Acute renal failure requiring dialysis is associated with high mortality during autologous stem cell transplantation (ASCT). This study examined the association between acute renal failure and mortality in immunoglobulin light chain (AL) amyloidosis during ASCT.

Methods

Between 1996 and 2010, 408 ASCT patients were evaluated. Data were collected from electronic medical records.

Results

Dialysis was performed in 72 (17.6%) patients. Eight patients started dialysis >30 days prior to ASCT (Group II), 36 started within ±30 days of ASCT (Group III) and 28 initiated dialysis >1 month after ASCT (Group IV). Patients who never received dialysis were assigned to Group I. There were no significant age or sex differences. Median overall survival (OS) had not been reached in Groups I and II, but was 7.0 months in Group III and 48.5 months in Group IV (P < 0.001). Treatment-related mortality (TRM) occurred in 44.4% of the patients in Group III, 6-fold higher than in the next highest group (P < 0.001). The most common causes of TRM were cardiac events and sepsis. In multivariate analysis, only hypoalbuminemia (<2.5 g/dL, P < 0.001) and an estimated glomerular filtration rate (eGFR) <40 mL/min/1.73 m2 (P < 0.001) were independently associated with starting dialysis within 30 days of ASCT.

Conclusions

The study found significant differences in the OS depending on when the acute renal failure occurred. Patients who required dialysis within 30 days of ASCT had the highest rate of TRM. Screening with serum albumin and eGFR may reduce the risk.




Renal hemodynamic effects of the HMG-CoA reductase inhibitors in autosomal dominant polycystic kidney disease

2016-08-01T00:05:39-07:00

Background

The aim of this study was to determine the effect of statins on renal hemodynamics in normal volunteers and in patients with autosomal dominant polycystic kidney disease with either mild or moderate renal dysfunction.

Methods

Thirty-two study subjects were enrolled in this study: 11 normal volunteers, 11 study subjects with autosomal dominant polycystic kidney disease (ADPKD) and mild kidney disease and 10 study subjects with ADPKD and moderate kidney disease. Subjects in each group received simvastatin 40 mg once daily for a period of 4 weeks. Renal blood flow was measured based on para-amino-hippurate (PAH) clearance and with the use of a magnetic resonance (MR) scanner at the beginning and following 4 weeks of therapy with statins.

Results

At the end of the study, the lipid profile was significantly lower in all groups, while other laboratory results showed no change. Four weeks of therapy with simvastatin resulted in no change in serum creatinine, 24-h urinary protein, sodium, iothalamate clearance, PAH clearance or renal blood flow as measured by MRI or based on PAH clearance.

Conclusions

Four weeks of therapy with simvastatin did not change renal blood flow in the study subjects with ADPKD with mild-to-moderate renal dysfunction or in healthy volunteers.

Clinical Trial Registration Number

NCT02511418.




Increased risk of glomerular hyperfiltration in subjects with impaired glucose tolerance and newly diagnosed diabetes

2016-08-01T00:05:39-07:00

Background

Glomerular hyperfiltration is closely related to diabetes and may lead to subsequent nephropathy, but the association between glomerular hyperfiltration and prediabetic state is unclear. We examined the relationship of different glycemic statuses, including normal glucose tolerance (NGT), isolated impaired fasting glucose (IFG), impaired glucose tolerance (IGT) and newly diagnosed diabetes (NDD), with glomerular hyperfiltration.

Methods

This study included 12 833 subjects ≥20 years of age without a history of renal disease, cancer, moderate/severe anemia or diabetes and not taking medications for hypertension, diabetes, hyperlipidemia or cardiovascular disease, recruited from National Cheng Kung University Hospital between January 2000 and August 2009. Hyperfiltration was defined as an estimated GFR (eGFR) above the age- and gender-specific 95th percentile for apparently healthy subjects, while hypofiltration was defined as an eGFR below the 5th percentile. eGFR was assessed using the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation.
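The eGFR calculation used above can be sketched in a few lines. This is a minimal illustration of the published 2009 CKD-EPI creatinine equation; the `classify` helper and its percentile cut-offs are hypothetical stand-ins, since the study derived age- and gender-specific 95th/5th percentiles from its own healthy subsample, which are not reproduced here.

```python
def ckd_epi_egfr(scr_mg_dl: float, age: int, female: bool, black: bool = False) -> float:
    """Estimated GFR (mL/min/1.73 m^2) by the 2009 CKD-EPI creatinine equation."""
    kappa = 0.7 if female else 0.9          # sex-specific creatinine threshold (mg/dL)
    alpha = -0.329 if female else -0.411    # sex-specific exponent below the threshold
    egfr = (141
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

def classify(egfr: float, p5: float, p95: float) -> str:
    """Study-style classification against (hypothetical) percentile cut-offs."""
    if egfr > p95:
        return "hyperfiltration"
    if egfr < p5:
        return "hypofiltration"
    return "normal"
```

For example, a 50-year-old man with serum creatinine 1.0 mg/dL has an eGFR of roughly 87 mL/min/1.73 m² by this equation.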

Results

After further excluding hypofiltration and adjusting for available confounders, fasting plasma glucose (FPG), 2-hour postload glucose (2hPG), 2hPG-FPG (fluctuating blood glucose), HbA1c (average blood glucose), NDD and IGT, but not isolated IFG, were significantly associated with increased eGFR and a higher risk of hyperfiltration [NDD: odds ratio (OR) 1.97, 95% confidence interval (CI) 1.48–2.64, P < 0.001; IGT: OR 1.34, 95% CI 1.07–1.66, P = 0.009].
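An odds ratio with a 95% confidence interval of the kind reported above can be computed from a 2×2 table with the standard Woolf (logit) method. A brief sketch; the counts in the usage note are illustrative only, not the study data:

```python
import math

def odds_ratio(a: int, b: int, c: int, d: int):
    """OR and Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi
```

For example, `odds_ratio(10, 20, 5, 40)` gives OR 4.0 with a 95% CI of roughly 1.2–13.3. The CI is symmetric on the log scale, which is why reported intervals such as 1.48–2.64 are asymmetric around the point estimate.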

Conclusions

High glucose states increase hyperfiltration risk. In addition to newly diagnosed diabetes, excessively high GFR also deserves attention in subjects with IGT.




Effects of an intradialytic resistance training programme on physical function: a prospective stepped-wedge randomized controlled trial

2016-08-01T00:05:39-07:00

Background

Intradialytic exercise programmes are important because of the deterioration in physical function that occurs in people receiving haemodialysis. Unfortunately, exercise programmes are rarely sustained in haemodialysis clinics. The aim of this study was to determine the efficacy of a sustainable resistance exercise programme on the physical function of people receiving haemodialysis.

Methods

A total of 171 participants from 15 community satellite haemodialysis clinics performed progressive resistance training using elastic resistance bands in a seated position during the first hour of haemodialysis treatment. We used a stepped-wedge design of three groups, each containing five randomly allocated cluster units, assigned to an intervention of 12, 24 or 36 weeks. The primary outcome measure was objective physical function measured by the 30-s sit-to-stand (STS) test, the 8-foot timed up and go (TUG) test and the four-square step test. Secondary outcome measures included quality of life, involvement in community activity, blood pressure and self-reported falls.

Results

Exercise training led to significant improvements in physical function as measured by STS and TUG. There was a significant average downward change (β = –1.59, P < 0.01) before the intervention and a significant upward change after the intervention (β = 0.38, P < 0.01) for the 30-s STS with a similar pattern noted for the TUG.

Conclusion

Intradialytic resistance training can improve the physical function of people receiving dialysis.




Examining the robustness of the obesity paradox in maintenance hemodialysis patients: a marginal structural model analysis

2016-08-01T00:05:39-07:00

Background

The inverse association between body mass index (BMI) and mortality observed in patients treated with maintenance hemodialysis (MHD), also known as the obesity paradox, may be a result of residual confounding. Marginal structural model (MSM) analysis, a technique that accounts for time-varying confounders, may be more appropriate to investigate this association. We hypothesize that after applying MSM, the inverse association between BMI and mortality in MHD patients is attenuated.

Methods

We examined the associations between BMI and all-cause mortality among 123 624 adult MHD patients treated during 2001–2006. Baseline and time-varying BMI were examined using Cox proportional hazards models and MSM while considering baseline and time-varying covariates, including demographics, comorbidities and markers of malnutrition and inflammation.

Results

The patients included 45% women and 32% African Americans with a mean age of 61 (SD 15) years. In all models, BMI showed a linear incremental inverse association with mortality. Compared with the reference (BMI 25 to <27.5 kg/m2), a BMI of <18 kg/m2 was associated with a 3.2-fold higher death risk [hazard ratio (HR) 3.17 (95% CI 3.05–3.29)], and mortality risks declined with increasing BMI, with the greatest survival advantage of 31% lower risk [HR 0.69 (95% CI 0.64–0.75)] observed with a BMI of 40 to <45 kg/m2.

Conclusions

The linear inverse relationship between BMI and mortality is robust across models including MSM analyses that more completely account for time-varying confounders and biases.




Dialysis modality and nutritional status are associated with variability of inflammatory markers

2016-08-01T00:05:40-07:00

Background

Inflammation is a common feature in dialysis patients and is associated with cardiovascular complications and poor outcome. Measuring the variability of inflammatory markers may help in understanding underlying factors triggering inflammation. Whether the inflammatory pattern in hemodialysis (HD) and peritoneal dialysis (PD) patients differs has scarcely been studied. Here we explored factors associated with the magnitude and variability of inflammation markers in HD and PD patients.

Methods

In two 3-month, prospective cohort studies comprising 228 prevalent HD and 80 prevalent PD patients, interleukin-6 (IL-6) and high-sensitivity C-reactive protein (CRP) were measured in blood samples drawn each month and every week, respectively. Information on comorbidity, protein-energy wasting (PEW) and medications was gathered at baseline, and information on symptoms potentially related to inflammation was gathered weekly. A mixed-effect model was used for multivariate analysis of factors linked to CRP and IL-6 variation.

Results

IL-6 and CRP levels were higher and showed higher variability in HD versus PD patients [median IL-6 8.3 (interquartile range, IQR, 5.3–14.5) versus 6.7 (IQR 4.2–10.0) pg/mL, P < 0.001 and median CRP 6.1 (IQR 2.5–14.0) versus 5.4 (IQR 1.6–9.0) mg/L, P < 0.001]. PEW predicted increased inflammation variability after correcting for age, sex, dialysis vintage, modality and comorbidity. Increased comorbidity predicted IL-6, but not CRP, variability.

Conclusions

Circulating concentrations and variability of IL-6 and CRP were higher in HD than in PD patients. In both HD and PD patients, short-term variability of IL-6 and CRP levels was strongly associated with PEW, while comorbidity was related to IL-6 but not to CRP variability.




Effects of citrate dialysate in chronic dialysis: a multicentre randomized crossover study

2016-08-01T00:05:40-07:00

Background

Although citrate dialysate (CiDi) is regarded as safe, dialysis modalities using higher dialysate volumes, such as haemodiafiltration (HDF), may expose patients to a higher citrate load and thus increase the risk of complications. We investigated the residual risk of CiDi compared with standard dialysate (StDi) in patients on different dialysis modalities and its effect on dialysis dose.

Methods

In a multicentre randomized crossover study, 92 dialysis patients (HDF post-dilution: n = 44, HDF pre-dilution: n = 26, haemodialysis: n = 25) were treated for 4 weeks with each dialysate (StDi and CiDi). Hypocalcaemia (ionized calcium ≤0.9 mmol/L), alkalosis (pH ≥7.55), post-treatment bicarbonate ≥32 mmol/L, pre-treatment bicarbonate ≥27 mmol/L, intra-dialytic events (IEs) and adverse events (AEs) between dialysis sessions were investigated as primary end points. The secondary objective was dialysis efficacy, i.e. dose and removal ratios of urea, creatinine, phosphate and β-2-microglobulin.

Results

Post-dialysis overcorrection of bicarbonate (>32 mmol/L) was less frequent with CiDi (P = 0.008). Other predefined calcium and acid–base disturbances did not vary. There was no significant difference in IEs. However, more patients developed AEs such as fatigue, muscle spasms or pain using CiDi (StDi: 41 versus CiDi: 55 patients, P = 0.02), particularly in the first 2 weeks of exposure. Dialysis efficacy was comparable with both dialysates.

Conclusions

These results confirm that CiDi is not associated with the development of severe calcium and acid–base disorders, even when dialysis modalities with higher citrate loads are used. However, refinement of the CiDi composition to minimize AEs is necessary.




Effect of a sustained difference in hemodialytic clearance on the plasma levels of p-cresol sulfate and indoxyl sulfate

2016-08-01T00:05:40-07:00

Background

The protein-bound solutes p-cresol sulfate (PCS) and indoxyl sulfate (IS) accumulate to high plasma levels in renal failure and have been associated with adverse events. The clearance of these bound solutes can be altered independently of the urea clearance by changing the dialysate flow and dialyzer size. This study tested whether a sustained difference in clearance would change the plasma levels of PCS and IS.

Methods

Fourteen patients on thrice-weekly nocturnal hemodialysis completed a crossover study of two periods designed to achieve widely different bound solute clearances. We compared the changes in pre-dialysis plasma PCS and IS levels from baseline over the course of the two periods.

Results

The high-clearance period provided much higher PCS and IS clearances than the low-clearance period (PCS: 23 ± 4 mL/min versus 12 ± 3 mL/min, P < 0.001; IS: 30 ± 5 mL/min versus 17 ± 4 mL/min, P < 0.001). Despite the large difference in clearance, the high-clearance period did not have a different effect on PCS levels than the low-clearance period [from baseline, high: +11% (–5, +37) versus low: –8% (–18, +32) (median, 25th, 75th percentile), P = 0.50]. In contrast, the high-clearance period significantly lowered IS levels compared with the low-clearance period [from baseline, high: –4% (–17, +1) versus low: +22% (+14, +31), P < 0.001]. The amount of PCS removed in the dialysate was significantly greater at the end of the high-clearance period [269 (206, 312) versus 199 (111, 232) mg per treatment, P < 0.001], while the amount of IS removed was not different [140 (87, 196) versus 116 (89, 170) mg per treatment, P = 0.15].

Conclusions

These findings suggest that an increase in PCS generation prevents plasma levels from falling when the dialytic clearance is increased. Suppression of solute generation may be required to reduce plasma PCS levels in dialysis patients.




Deceased donor kidney transplantation across donor-specific antibody barriers: predictors of antibody-mediated rejection

2016-08-01T00:05:40-07:00

Background

Apheresis-based desensitization allows for successful transplantation across major immunological barriers. For donor-specific antibody (DSA)- and/or crossmatch-positive transplantation, however, it has been shown that even intense immunomodulation may not completely prevent antibody-mediated rejection (ABMR).

Methods

In this study, we evaluated transplant outcomes in 101 DSA+ deceased donor kidney transplant recipients (transplantation between 2009 and 2013; median follow-up: 24 months) who were subjected to immunoadsorption (IA)-based desensitization. Treatment included a single pre-transplant IA session, followed by anti-lymphocyte antibody and serial post-transplant IA. In 27 cases, a positive complement-dependent cytotoxicity crossmatch (CDCXM) was rendered negative immediately before transplantation. Seventy-four of the DSA+ recipients already had a negative CDCXM before IA.

Results

Three-year death-censored graft survival in DSA+ patients was significantly worse than in 513 DSA– recipients transplanted during the same period (79 versus 88%, P = 0.008). Thirty-three DSA+ recipients (33%) had ABMR. While a positive baseline CDCXM showed only a trend towards higher ABMR rates (41 versus 30% in CDCXM– recipients, P = 0.2), DSA mean fluorescence intensity (MFI) in single bead assays significantly associated with rejection, showing 20 versus 71% ABMR rates at <5000 versus >15 000 peak DSA MFI. The predictive value of MFI was moderate, with the highest accuracy at a median of 13 300 MFI (after cross-validation: 0.72). Other baseline variables, including CDC assay results, human leukocyte antigen mismatch, prior transplantation or type of induction treatment, did not add independent predictive information.

Conclusions

IA-based desensitization failed to prevent ABMR in a considerable number of DSA+ recipients. Assessing DSA MFI may help stratify risk of rejection, supporting its use as a guide to organ allocation and individualized treatment.




Dynamics and epitope specificity of anti-human leukocyte antibodies following renal allograft nephrectomy

2016-08-01T00:05:40-07:00

Background

A considerable proportion of patients awaiting kidney transplantation is immunized by previous transplantation(s). We investigated how allograft nephrectomy (Nx) and withdrawal of maintenance immunosuppression (WD-MIS) in patients with a failed renal allograft contribute to allosensitization.

Methods

HLA antibodies (HLAabs) were analyzed before and after Nx and/or WD-MIS using a single antigen bead assay. Patients were grouped as follows: (A) Nx and concomitant WD-MIS (n = 28), (B) Nx (n = 14) and (C) WD-MIS (n = 12). In a subgroup of patients, the epitope specificity of HLAabs was determined by adsorption and elution of sera with recombinant single HLA allele–expressing cell lines.

Results

Following Nx and/or WD-MIS, HLAabs were detectable in 100, 100 and 92% of patients in Groups A, B and C, respectively. In patients of all groups, de novo donor-specific HLAabs (DSAs) were found. After Nx, an increase in the breadth [percent panel reactive antibody (%PRA)] and mean fluorescence intensity of class I HLAabs was predominant. In contrast, an increase of class II HLAabs prevailed following WD-MIS. Experimental analysis of the epitope specificities revealed that 64% of the class I HLAabs classically denoted as non-DSA were donor epitope-specific HLAabs (DESA).

Conclusions

Both Nx and WD-MIS contribute to alloimmunization with differing patterns concerning class I and II HLAabs. Nx preferentially increased class I HLAabs and most of the observed class I HLAabs were DESA. Considering that class I, but not class II, HLA molecules are constitutively expressed, our results support the hypothesis that the increase of HLAabs following Nx might have been caused by removal of the adsorbing donor tissue (sponge hypothesis).




Immunosuppression with mammalian target of rapamycin inhibitor and incidence of post-transplant cancer in kidney transplant recipients

2016-08-01T00:05:40-07:00

Background

Evidence is limited regarding the effect of de novo therapy with mammalian target of rapamycin (mTOR) inhibitors on cancer risk after kidney transplantation.

Methods

Collaborative Transplant Study data from 78 146 adult recipients of first deceased-donor kidney transplants (1999–2013) were analysed (4279 mTOR inhibitor, 73 867 no mTOR inhibitor) using standard methods. Propensity score matching was performed for analysis of basal cell and squamous cell skin cancer.

Results

Standardized incidence ratios (SIR) versus a matched non-transplant population showed reduced tumour incidence in recipients with de novo mTOR inhibitor therapy compared with no mTOR inhibitor for non-melanoma skin cancer (NMSC) (SIR 5.1 versus 6.1; P = 0.019) but not non-NMSC cancers (SIR 1.6 versus 1.7; P = 0.35). Within propensity score-matched groups (n = 4265), multivariable Cox regression analysis showed a trend to reduced NMSC with mTOR inhibition [hazard ratio (HR) 0.77; P = 0.063] but not for all non-NMSC tumours (HR 0.94; P = 0.59). A significant effect for mTOR inhibition was observed for basal cell carcinoma of the skin (HR 0.56; P = 0.004) but not squamous cell carcinoma (HR 0.87; P = 0.54).
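A standardized incidence ratio of the kind reported above is simply observed cases divided by the cases expected from reference-population rates, usually accompanied by an approximate Poisson interval. A minimal sketch using Byar's approximation; the counts in the test are illustrative, not the study data:

```python
import math

def sir(observed: int, expected: float, z: float = 1.96):
    """Standardized incidence ratio with Byar's approximate 95% CI.
    observed: tumour count in the cohort;
    expected: count predicted from reference-population incidence rates."""
    point = observed / expected
    o = observed
    # Byar's approximation to the exact Poisson limits on the observed count
    lo = (o / expected) * (1 - 1/(9*o) - z/(3*math.sqrt(o)))**3
    hi = ((o + 1) / expected) * (1 - 1/(9*(o + 1)) + z/(3*math.sqrt(o + 1)))**3
    return point, lo, hi
```

An SIR of 5.1 versus 6.1, as in the NMSC comparison, therefore means both groups developed skin cancer several times more often than the matched non-transplant population, with the mTOR inhibitor group at the lower multiple.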

Conclusions

De novo mTOR inhibition was associated with a substantially and significantly reduced risk of basal cell carcinoma of the skin after kidney transplantation. A significant reduction of the incidence of other cancers was not found.




Announcements

2016-08-01T00:05:40-07:00




Competing risks: you only die once

2016-06-22T09:15:57-07:00




Beat it early: putative renoprotective haemodynamic effects of oral hypoglycaemic agents

2016-06-22T09:15:57-07:00

Diabetic kidney disease represents a considerable burden; around one-third of patients with type 2 diabetes develop chronic kidney disease. In health, the kidneys play an important role in the regulation of glucose homeostasis via glucose utilization, gluconeogenesis and glucose reabsorption. In patients with diabetes, renal glucose homeostasis is significantly altered, with an increase in both gluconeogenesis and renal tubular reabsorption of glucose. Environmental factors, both metabolic (hyperglycaemia, obesity and dyslipidaemia) and haemodynamic, together with a genetic susceptibility, lead to the activation of pro-oxidative, pro-inflammatory and pro-fibrotic pathways resulting in kidney damage. Hyperfiltration and its haemodynamically driven insult to the kidney glomeruli is an important player in proteinuria and in the progression of kidney disease towards end-stage renal failure. Control of glycaemia and blood pressure are the mainstays of preventing kidney damage and slowing its progression. There is emerging evidence that some hypoglycaemic agents may have renoprotective effects that are independent of their glucose-lowering effects. Sodium-glucose co-transporter-2 (SGLT-2) inhibitors may exert a renoprotective effect by a number of mechanisms, including restoring the tubuloglomerular feedback mechanism and lowering glomerular hyperfiltration, and reducing inflammatory and fibrotic markers induced by hyperglycaemia, thus limiting renal damage. Simultaneous use of an SGLT-2 inhibitor and blockade of the renin-angiotensin-aldosterone system may slow progression of diabetic nephropathy more than either drug alone. The use of dipeptidyl peptidase-4 inhibitors and glucagon-like peptide 1 receptor agonists may exert a renoprotective effect by reducing inflammation, fibrosis and blood pressure. Given the burden of diabetic kidney disease, any additional renoprotective benefit from hypoglycaemic therapy is to be welcomed. Large randomized controlled trials are currently underway investigating whether these new anti-diabetic agents can provide renoprotection in diabetes.




Highlights of the 2015 ERA-EDTA congress: chronic kidney disease, hypertension

2016-06-22T09:15:57-07:00

This paper reports the highlights in the field of chronic kidney disease (CKD) and hypertension from the latest European Renal Association (ERA-EDTA) Congress (London, May 21–24, 2016). The items include a European survey on CKD epidemiology and a multicentre, open-label, randomized controlled trial on the effect of renal denervation for hypertension management. Furthermore, polar views regarding ambulatory blood pressure monitoring (ABPM) for the diagnosis and monitoring of hypertension in dialysis patients are presented.




Pro: Cyclophosphamide in lupus nephritis

2016-06-22T09:15:57-07:00

Based on efficacy and toxicity considerations, both low-dose pulse cyclophosphamide as part of the Euro-Lupus Nephritis protocol and mycophenolate mofetil (MMF) with corticosteroids may be considered for induction of remission in patients with proliferative lupus nephritis. The long-term follow-up data available for low-dose pulse cyclophosphamide, the guaranteed compliance with this regimen and economic considerations all favour the Euro-Lupus regimen in this author's opinion. For maintenance treatment, either azathioprine (AZA) or MMF may be used; AZA is preferred when pregnancy is planned, while MMF is preferred when the disease relapses during AZA use and, possibly, after successful induction of remission with MMF.




Opponent's comments

2016-06-22T09:15:57-07:00




Con: Cyclophosphamide for the treatment of lupus nephritis

2016-06-22T09:15:57-07:00

Kidney involvement is a major determinant for morbidity and mortality in patients with systemic lupus erythematosus. The treatment target of lupus renal disease is to induce and maintain remission and to minimize disease or treatment-related comorbidities. Cyclophosphamide (CYC), in conjunction with glucocorticoids, has conventionally been used for the initial treatment of lupus nephritis. However, the major concerns of CYC are its toxicities, such as infertility, urotoxicity and oncogenicity, which are particularly relevant in women of childbearing age. As a result, maintenance therapy of lupus nephritis with an extended course of CYC pulses has largely been replaced by other immunosuppressive agents such as mycophenolate mofetil (MMF) and azathioprine. Recent randomized controlled trials have demonstrated non-inferiority of MMF to pulse CYC as induction therapy of lupus nephritis. Although MMF as induction-maintenance therapy has been increasingly used in lupus nephritis, its efficacy in the long-term preservation of renal function remains to be elucidated. MMF is not necessarily less toxic than CYC. Meta-analyses of clinical trials show similar incidence of infective complications and gastrointestinal adverse events in both MMF- and CYC-based regimens. However, considering the reduction in gonadal toxicity and the risk of oncogenicity, MMF may be used as first-line therapy of lupus nephritis. Tacrolimus (TAC) has recently been shown to be equivalent to either MMF or CYC for inducing remission of lupus nephritis and may be considered as another non-CYC alternative. Combined low-dose MMF and TAC appears to be more effective than CYC pulses in Chinese patients with lupus nephritis and has the potential to replace the more toxic CYC regimens in high-risk patients. 
Currently, CYC still plays an important role in the management of lupus nephritis patients with impaired or rapidly deteriorating renal function, crescentic glomerulonephritis or as salvage therapy for recalcitrant disease.




Opponent's comments

2016-06-22T09:15:57-07:00




Moderator's view: Cyclophosphamide in lupus nephritis

2016-06-22T09:15:57-07:00

Mycophenolate mofetil has recently been accepted as an effective induction treatment for lupus nephritis, with the potential to replace cyclophosphamide or at least expand our therapeutic armamentarium in patients with this lifelong disease, which often requires repeated induction treatment of its relapses. Compared with cyclophosphamide, mycophenolate may be more effective in black patients, and the risk of gonadotoxicity may be significantly lower in mycophenolate-treated subjects. However, experience with mycophenolate in severe lupus nephritis is still limited and we also have insufficient data on the long-term outcome of mycophenolate-treated patients. Treatment with mycophenolate is more expensive than with cyclophosphamide, which may limit its use, especially in low- and middle-income countries. The efficacy of mycophenolate mofetil may also depend more on the patient's compliance than intravenous cyclophosphamide pulses do. Low-dose cyclophosphamide remains an effective and relatively safe induction treatment for active lupus nephritis, but to decrease its cumulative toxicity, repeated exposure to cyclophosphamide in relapsing patients should be avoided if possible.




Mitochondria: a therapeutic target in acute kidney injury

2016-06-22T09:15:57-07:00

Acute kidney injury (AKI) is a common clinical entity that is associated with high mortality and morbidity. It is a risk factor for the development and progression of chronic kidney disease. Presently, no effective treatment for AKI is available, and novel therapeutic approaches are desperately needed. Accumulating evidence highlights mitochondrial dysfunction as an important factor in the pathogenesis of AKI. Recent advances in our understanding of the molecules involved in mitochondrial biogenesis, fusion/fission, mitophagy and their pathophysiological roles will lead to the development of drugs that target mitochondria for the treatment of various diseases, including AKI. In this review, we summarize current knowledge of the contribution of mitochondria-related pathophysiology in AKI and the prospective benefits of mitochondria-targeting therapeutic approaches against AKI.




Muscle wasting in end-stage renal disease promulgates premature death: established, emerging and potential novel treatment strategies

2016-06-22T09:15:57-07:00

Muscle wasting (or sarcopenia) is a common feature of the uremic phenotype and predisposes this vulnerable patient population to an increased risk of comorbid complications, poor quality of life, frailty and premature death. The advanced age of dialysis patients is an additional likely contributor to loss of muscle mass. As recent evidence suggests that assessment of muscle strength (i.e. function) is a better predictor of outcome and comorbidities than muscle mass, this opens new screening, assessment and therapeutic opportunities. Among established treatment strategies, the benefits of resistance exercise and endurance training are increasingly recognized among nephrologists and should be promoted in sedentary chronic kidney disease patients. Testosterone and growth hormone replacement appear to be the most promising of the emerging treatment strategies for muscle wasting. As treatment of muscle wasting is difficult and seldom successful in this often old, frail, sedentary and exercise-hesitant patient group, novel treatment strategies are urgently needed. In this review, we summarize recent studies on stimulation of mitochondrial biogenesis, myogenic stem (satellite) cells and manipulation of transforming growth factor family members, all of which hold promise for more effective therapies to target muscle mass loss and function in the future.




Balancing wobbles in the body sodium

2016-06-22T09:15:57-07:00

Sodium balance is thought to be achieved within a matter of days: everything that enters should come out, sodium stores are of questionable relevance and sodium accumulation is accompanied by weight gain. Careful balance studies have often conflicted with this view, and long-term studies suggested that total body sodium (TBNa) fluctuates independently of intake or body weight. We recently performed the opposite experiment, fixing sodium intake for weeks at three levels and collecting all urine made. We found weekly (circaseptan) patterns in sodium excretion that were inversely related to aldosterone and directly related to cortisol. TBNa was not dependent on sodium intake, but instead exhibited far longer (monthly or longer) infradian rhythms independent of extracellular water, body weight or blood pressure. To discern the mechanisms further, we used sodium magnetic resonance imaging (Na-MRI) to identify sodium storage clinically. We found that sodium stores are greater in men than in women, increase with age and are higher in hypertensive than in normotensive persons. We have suggestive evidence that these sodium stores can be mobilized, including in dialysis patients. These observations are in accordance with our findings that immune cells regulate a hypertonic interface in the skin interstitium that could serve as a protective barrier. Returning to our balance studies, we found that, owing to biological variability in 24-h sodium excretion, collecting urine for a single day could not distinguish 12, 9 or 6 g/day sodium intakes any better than tossing a coin. Urine sampled every other day correctly classified a 3-g difference in salt intake less than half the time, making the gold standard 24-h urine collection of little value in predicting salt intake. We suggest that wobbles in expected outcomes can lead to novel clinical insights, even with respect to banal salt questions.




Vitamin D receptor activator and dietary sodium restriction to reduce residual urinary albumin excretion in chronic kidney disease (ViRTUE study): rationale and study protocol

2016-06-22T09:15:57-07:00

Optimal albuminuria reduction is considered essential to halting chronic kidney disease (CKD) progression. Both vitamin D receptor activator (VDRA) treatment and dietary sodium restriction potentiate the efficacy of renin–angiotensin–aldosterone system (RAAS) blockade in reducing albuminuria. The ViRTUE study addresses whether a VDRA combined with dietary sodium restriction provides further albuminuria reduction in non-diabetic CKD patients on top of RAAS blockade. The ViRTUE study is an investigator-initiated, prospective, multi-centre, randomized, double-blind (paricalcitol versus placebo), placebo-controlled trial targeting stage 1–3 CKD patients with residual albuminuria of >300 mg/day due to non-diabetic glomerular disease, despite angiotensin-converting enzyme inhibitor or angiotensin receptor blocker use. During run-in, all subjects are switched to standardized RAAS blockade (ramipril 10 mg/day) and blood pressure is titrated to <140/90 mmHg according to a standardized protocol. Eligible patients are subsequently enrolled and undergo four consecutive study periods in random order of 8 weeks each: (i) paricalcitol (2 µg/day) combined with a liberal sodium diet (~200 mmol Na+/day, i.e. the mean sodium intake in the general population), (ii) paricalcitol (2 µg/day) combined with dietary sodium restriction (target: 50 mmol Na+/day), (iii) placebo combined with a liberal sodium diet and (iv) placebo combined with dietary sodium restriction. Data are collected at the end of each study period. The primary outcome is 24-h urinary albumin excretion. Secondary study outcomes are blood pressure, renal function (estimated glomerular filtration rate), plasma renin activity and, in a sub-population (N = 9), renal haemodynamics (measured glomerular filtration rate and effective renal plasma flow). A sample size of 50 patients provides 90% power to detect a 23% reduction in albuminuria, assuming a 25% dropout rate.
Further reduction of residual albuminuria by combination of VDRA treatment and sodium restriction during single-agent RAAS-blockade [...]



Induction of cardiac FGF23/FGFR4 expression is associated with left ventricular hypertrophy in patients with chronic kidney disease

2016-06-22T09:15:57-07:00

Background

In chronic kidney disease (CKD), serum concentrations of fibroblast growth factor 23 (FGF23) increase progressively as glomerular filtration rate declines, while renal expression of the FGF23 coreceptor Klotho decreases. Elevated circulating FGF23 levels are strongly associated with mortality and with left ventricular hypertrophy (LVH), which is a major cause of cardiovascular death in CKD patients. The cardiac FGF23/FGF receptor (FGFR) system and its role in the development of LVH in humans have not been addressed previously.

Methods

We conducted a retrospective case–control study in 24 deceased patients with childhood-onset end-stage renal disease (dialysis: n = 17; transplanted: n = 7) and 24 age- and sex-matched control subjects. Myocardial autopsy samples of the left ventricle were evaluated for expression of endogenous FGF23, FGFR isoforms, Klotho, calcineurin and nuclear factor of activated T-cells (NFAT) by immunohistochemistry, immunofluorescence microscopy, qRT-PCR and western blotting.

Results

The majority of patients presented with LVH (67%). Human cardiomyocytes express full-length FGF23, and cardiac FGF23 is excessively high in patients with CKD. Enhanced myocardial expression of FGF23 in concert with Klotho deficiency strongly correlates with the presence of LVH. Cardiac FGF23 levels associate with time-averaged serum phosphate levels, up-regulation of FGFR4 and activation of the calcineurin-NFAT signaling pathway, an established mediator of cardiac remodelling and LVH. These changes are detected in patients on dialysis but not in those with a functioning kidney transplant.

Conclusions

Our results indicate a strong association between LVH and enhanced expression levels of FGF23, FGFR4 and calcineurin, activation of NFAT and reduced levels of soluble Klotho in the myocardium of patients with CKD. These alterations are not observed in kidney transplant patients. [...]



Preconditioned suppression of prolyl-hydroxylases attenuates renal injury but increases mortality in septic murine models

2016-06-22T09:15:57-07:00

Background Septic conditions contribute to tissue hypoxia, potentially leading to multiple organ failure, including acute kidney injury. Cellular adaptation to low oxygen levels is regulated by hypoxia-inducible transcription factors (HIFs). While the role of HIFs in ischaemia/reperfusion has been studied extensively, their function in sepsis-induced renal injury is not well characterized. In this study, we investigated whether pharmacological activation of HIFs by suppression of prolyl-hydroxylases (PHDs) protects against septic acute kidney injury. Methods Two models of sepsis—caecal ligation and puncture and peritoneal contamination and infection—were induced in 12-week-old C57BL6/J mice. Pharmacological inhibition of PHDs, leading to HIF activation, was achieved by intraperitoneal application of 3,4-dihydroxybenzoate (3,4-DHB) before sepsis. Quantitative real-time reverse transcription polymerase chain reaction, immunohistology and enzyme-linked immunosorbent assays were utilized to detect gene expression, renal protein levels and renal functional parameters, respectively. Tissue morphology was analysed by periodic acid–Schiff reaction. Early kidney injury was estimated by kidney injury molecule-1 analyses. Apoptosis was detected in situ by terminal deoxynucleotidyl transferase–mediated dUTP nick end labelling staining. The systemic effect of 3,4-DHB pretreatment in sepsis was analysed in 72-h survival studies. Results Pharmacological activation of HIFs before sepsis induction attenuated sepsis-related vacuolization and dilation of the proximal tubules, reduced tubular apoptosis and correlated with lower T-cell infiltration in renal tissue compared with non-treated septic animals. PHD suppression elevated basal renal HIF-1α expression and basal plasma concentrations of the HIF targets erythropoietin and vascular endothelial growth factor. Whereas it preserved renal structure in both models, it improved renal function in a model-dependent manner. 
Moreover, inhibition of P[...]



Systemic complement activation and complement gene analysis in enterohaemorrhagic Escherichia coli-associated paediatric haemolytic uraemic syndrome

2016-06-22T09:15:57-07:00

Background

In contrast to atypical haemolytic uraemic syndrome (aHUS), only single case reports and limited data have been published on systemic activation of the complement system and mutations in complement genes in paediatric enterohaemorrhagic Escherichia coli-induced HUS (EHEC-HUS).

Methods

Complement activation (CH50, APH50, C3d, sC5b-9) was analysed at four timepoints (Week 1, Week 2, Month 3 and Month 6 after primary diagnosis of HUS) in 25 children with EHEC-HUS. Seven patients received the complement C5 inhibitor eculizumab. Targeted next-generation sequencing of a total of 89 genes involved in complement regulation, coagulation and haemostasis was performed in all patients.

Results

Activity of the classical (CH50) and alternative (APH50) complement pathways was normal or even elevated throughout the observation period, except in patients under eculizumab treatment. In contrast, the mean concentration of the soluble terminal complement complex (sC5b-9) was significantly elevated at the first timepoint (mean 498 ng/mL), dropping to normal values after 2 weeks. The initially elevated median C3d concentration (42 mU/L) reached normal levels from Week 2 onwards. Levels of sC5b-9 >320 ng/mL at the time of HUS diagnosis were associated with arterial hypertension, oedema and lower platelet counts, but not with the duration of dialysis. Genetic analysis revealed various changes that may have had a modifying impact on the clinical course.

Conclusions

Complement activation at the acute phase of EHEC-HUS, indicated by increased levels of sC5b-9, predicts a poor outcome. Complement alterations appear to be more frequent in patients with EHEC-HUS than previously thought and are suspected to have a role in the severity of the disease.




Progression to Stage 4 chronic kidney disease and death, acute kidney injury and hospitalization risk: a retrospective cohort study

2016-06-22T09:15:57-07:00

Background

Chronic kidney disease (CKD) Stage 4 is on the path to kidney failure, but there is little information on the risks associated with progression to Stage 4 per se. The objective of this study is to determine how progression from Stage 3 to Stage 4 CKD alters morbidity and mortality in a referred cohort of patients.

Methods

We conducted a retrospective cohort study consisting of 1607 patients with an estimated glomerular filtration rate (eGFR) of 30–59 mL/min/1.73 m2 referred to a nephrologist at a tertiary care center in Ontario, Canada, between January 2001 and December 2008. Interim progression from Stage 3 to Stage 4 CKD was defined by two independent outpatient eGFR values <30 mL/min/1.73 m2. Death, acute kidney injury (AKI) and all-cause hospitalizations subsequent to Stage 4 progression, but prior to the development of end-stage renal disease (ESRD), were ascertained from administrative databases.
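The staging thresholds used in this definition can be sketched as a small check. The G-stage labels below follow the usual KDIGO cut-offs, and the function names are illustrative only, not the study's actual code:

```python
def ckd_stage(egfr):
    """Map an eGFR value (mL/min/1.73 m2) to a KDIGO-style G-stage label."""
    if egfr >= 90:
        return "G1"
    if egfr >= 60:
        return "G2"
    if egfr >= 45:
        return "G3a"
    if egfr >= 30:
        return "G3b"
    if egfr >= 15:
        return "G4"
    return "G5"


def progressed_to_stage_4(outpatient_egfrs):
    """True once at least two independent outpatient eGFR values fall below
    30 mL/min/1.73 m2, mirroring the study's definition of interim
    Stage 3 to Stage 4 progression."""
    return sum(1 for v in outpatient_egfrs if v < 30) >= 2
```

Under this definition a patient with outpatient readings of 43, 29, 35 and 28 mL/min/1.73 m2 meets the progression criterion, whereas a single dip below 30 does not.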

Results

The mean (standard deviation) baseline eGFR was 43 (8) mL/min/1.73 m2. Over a median follow-up of 2.66 years (interquartile range: 1.42–4.45), 344 (21%) patients progressed to Stage 4, 47 (3%) developed ESRD, 188 (12%) patients died, 143 (9%) were hospitalized with AKI and 688 (43%) were hospitalized for any reason. Compared with patients who did not progress to Stage 4, those who did progress had significantly higher adjusted risks of death [hazard ratio (HR) = 2.56, 95% confidence interval (95% CI): 1.75–3.75], AKI (HR = 2.32, 95% CI: 1.44–3.74) and all-cause hospitalization (HR = 1.87, 95% CI: 1.45–2.42).

Conclusions

Progression from Stage 3 to Stage 4 CKD is associated with increased risks of death, AKI and hospitalization prior to ESRD.




Increased psychosocial risk, depression and reduced quality of life living with autosomal dominant polycystic kidney disease

2016-06-22T09:15:57-07:00

Background The psychosocial impact of living with autosomal dominant polycystic kidney disease (ADPKD) is poorly understood. In this study, we assessed the overall quality of life (QOL), mood, perceived social support and psychosocial risk of having a diagnosis of ADPKD in a patient cohort from a major UK nephrology centre serving a large catchment population. Methods A postal questionnaire was sent to 349 patients registered at the Sheffield Kidney Institute with chronic kidney disease but not on renal replacement therapy (RRT). The questionnaire incorporated three validated forms: the kidney disease quality-of-life short form (KDQOL SF1.3) to assess QOL; the nine-item patient health questionnaire (PHQ9) to screen for depression; and the multidimensional scale of perceived social support (MSPSS) to evaluate perceived social support; as well as a novel genetic psychosocial risk instrument (GPRI-ADPKD) designed to study the specific psychosocial impact of coping with a diagnosis of ADPKD. Results The overall response rate was 53%. Patients with a lower estimated glomerular filtration rate (<30 mL/min) or larger kidneys (mean length on ultrasound ≥17 cm) reported reduced QOL and increased psychosocial risk. Clinically significant depression was reported in 22%, and 62% felt guilty about passing ADPKD on to their children. In multivariate analysis, female gender was associated with overall poorer psychosocial well-being, whereas increasing age, lower kidney function, larger kidneys and loss of a first-degree relative from ADPKD were additional risk factors for QOL, depression or psychosocial risk, respectively. Conclusions Our results reveal a significantly poorer QOL and increasing psychosocial risk with markers of disease progression in patients, particularly women, with ADPKD prior to starting RRT. The future management strategy of ADPKD should address these issues and provide for better individual and family support throughout the patient[...]



Caveolin-1 single-nucleotide polymorphism and arterial stiffness in non-dialysis chronic kidney disease

2016-06-22T09:15:57-07:00

Background Arteriosclerosis is an independent predictor of increased cardiovascular mortality in chronic kidney disease (CKD). Histologically it is characterized by hypertrophy and fibrosis of the arterial media wall leading to increased arterial stiffness and end-organ damage. Caveolin-1 acts as an intracellular signalling pathway chaperone in human fibrotic and vascular diseases. The purpose of this study was to assess the association between caveolin-1 (CAV1) single-nucleotide polymorphism (SNP) rs4730751 and arterial stiffness as measured by arterial pulse wave velocity (PWV) in an early-stage CKD cohort and in a cohort with more severe CKD. Methods Two prospectively maintained patient cohorts with non-dialysis CKD were studied: 144 patients in the Chronic Renal Impairment in Birmingham (CRIB) cohort and 147 patients in the Renal Impairment in Secondary Care (RIISC) cohort, with matched exclusion criteria and DNA sampling availability. At entry to each cohort database, each patient's initial arterial PWV was measured, as well as their anthropometric and biochemical data. CAV1 rs4730751 SNP genotyping was performed using TaqMan technology. Results The CAV1 rs4730751 SNP CC genotype was associated with lower arterial PWV in both CRIB early-stage CKD patients [8.1 versus 8.6 m/s; coefficient –0.780 (–1.412, –0.149); P = 0.016] and RIISC more advanced-stage CKD patients [8.7 versus 9.4 m/s; coefficient –0.695 (–1.288, –0.102); P = 0.022]; these relationships held following adjustment for other important confounders. Conclusions This replicated study suggests potential utility of the studied CAV1 SNP as a genetic biomarker in CKD and a role for CAV1 in the development of arteriosclerosis in this setting. Further studies are warranted to further explore the basic science driving these clinical observations. [...]



Traditional and non-traditional risk factors for incident peripheral arterial disease among patients with chronic kidney disease

2016-06-22T09:15:57-07:00

Background The risk of peripheral arterial disease (PAD) is higher in patients with chronic kidney disease (CKD) compared with those without. However, reasons for this increased risk are not fully understood. Methods We studied risk factors for incident PAD among 3169 participants in the Chronic Renal Insufficiency Cohort (CRIC) Study. Patients with CKD aged 21–74 years were recruited between 2003 and 2008 and followed for a median of 6.3 years. Incident PAD was defined as a new-onset ankle-brachial index (ABI) of <0.9 or confirmed clinical PAD. Results In a multivariate-adjusted model, older age, female sex, non-Hispanic Black race, current smoking, diabetes, higher pulse pressure, lower high-density lipoprotein cholesterol, higher low-density lipoprotein cholesterol, and lower estimated glomerular filtration rate were significantly associated with an increased risk of incident PAD. After adjustment for these traditional risk factors as well as use of medications and CRIC Study clinic sites, the following baseline novel risk factors were significantly associated with risk of incident PAD [hazard ratio and 95% confidence interval (CI) for a one standard deviation (SD) higher level]: log[C-reactive protein (CRP)] (1.16, 1.06–1.25, P < 0.001), white blood cell count (1.09, 1.01–1.18, P = 0.03), fibrinogen (1.15, 1.06–1.26, P = 0.002), log(myeloperoxidase) (1.12, 1.03–1.23, P = 0.01), uric acid (0.88, 0.80–0.97, P = 0.01), glycated hemoglobin (1.16, 1.05–1.27, P = 0.003), log(homeostatic model assessment-insulin resistance) (1.21, 1.10–1.32, P < 0.001) and alkaline phosphatase (1.15, 1.07–1.24, P < 0.001). Conclusions Among patients with CKD, inflammation, prothrombotic state, oxidative stress, glycated hemoglobin, insulin resistance and alkaline phosphatase are associated with an increased risk of PAD, independent o[...]



Comparison of oral versus intravenous vitamin D receptor activator in reducing infection-related mortality in hemodialysis patients: the Q-Cohort Study

2016-06-22T09:15:57-07:00

Background

Hemodialysis patients who receive vitamin D receptor activator (VDRA) reportedly have better survival after infection than those who do not. However, the optimal route of its administration for minimizing death from infection remains unclear.

Methods

This prospective cohort study aimed to compare the effectiveness of oral versus intravenous VDRA regarding infection-related mortality in 3372 hemodialysis patients. Eligible subjects were divided into the following three groups by route of administration of VDRA: oral (n = 1868), intravenous (n = 492) and not administered (n = 1012). The effect of VDRA on infection-related mortality was examined using a Cox regression model with propensity score-based adjustments.
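One common form of propensity score-based adjustment is inverse probability of treatment weighting (IPTW); the abstract does not state which variant the authors used, so the sketch below is a hedged illustration of the general idea rather than the study's method:

```python
def iptw_weights(treated, propensity):
    """Inverse-probability-of-treatment weights from fitted propensity
    scores: 1/p for treated subjects, 1/(1 - p) for untreated. Note that
    scores close to 0 or 1 produce very large weights, which is why
    propensity scores are often trimmed in practice."""
    return [1.0 / p if t else 1.0 / (1.0 - p)
            for t, p in zip(treated, propensity)]
```

These weights create a pseudo-population in which measured covariates are balanced across treatment groups, and can then be supplied to a weighted Cox regression.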

Results

During follow-up (median, 4.0 years), 118 study patients died of infection. There was a significantly lower incidence of death from infection in subjects who received intravenous VDRA than in those who did not receive VDRA; however, oral VDRA did not significantly reduce the risk of mortality from infection compared with those who did not receive VDRA [hazard ratio (HR) for intravenous VDRA, 0.16; 95% confidence interval (CI), 0.10–0.25, and HR for oral VDRA, 0.78; 95% CI, 0.60–1.01]. Direct comparison between the oral and intravenous VDRA groups showed that the intravenous group had significantly better survival than the oral group (HR, 0.39; 95% CI, 0.27–0.62).

Conclusions

Treatment with intravenous VDRA more effectively reduces the incidence of mortality from infection than oral VDRA in hemodialysis patients.




Differences in survival on chronic dialysis treatment between ethnic groups in Denmark: a population-wide, national cohort study

2016-06-22T09:15:57-07:00

Background In Western countries, black and Asian dialysis patients experience better survival compared with white patients. The aim of this study is to compare the survival of native Danish dialysis patients with that of dialysis patients originating from other countries and to explore the association between the duration of residence in Denmark before the start of dialysis and the mortality on dialysis. Methods We performed a population-wide national cohort study of incident chronic dialysis patients in Denmark (≥18 years old) who started dialysis between 1995 and 2010. Results In total, 8459 patients were native Danes, 344 originated from other Western countries, 79 from North Africa or West Asia, 173 from South or South-East Asia and 54 from sub-Saharan Africa. Native Danes were more likely to die on dialysis compared with the other groups (crude incidence rates for mortality: 234, 166, 96, 110 and 53 per 1000 person-years, respectively). Native Danes had greater hazard ratios (HRs) for mortality compared with the other groups {HRs for mortality adjusted for sociodemographic and clinical characteristics: 1.32 [95% confidence interval (CI) 1.14–1.54]; 2.22 [95% CI 1.51–3.23]; 1.79 [95% CI 1.41–2.27]; 2.00 [95% CI 1.10–3.57], respectively}. Compared with native Danes, adjusted HRs for mortality for Western immigrants living in Denmark for ≤10 years, >10 to ≤20 years and >20 years were 0.44 (95% CI 0.27–0.71), 0.56 (95% CI 0.39–0.82) and 0.86 (95% CI 0.70–1.04), respectively. For non-Western immigrants, these HRs were 0.42 (95% CI 0.27–0.67), 0.52 (95% CI 0.33–0.80) and 0.48 (95% CI 0.35–0.66), respectively. Conclusions Incident chronic dialysis patients in Denmark originating from countries other than Denmark have a better survival compared with [...]



Regression of asymptomatic cardiomyopathy and clinical outcome of renal transplant recipients: a long-term prospective cohort study

2016-06-22T09:15:57-07:00

Background Asymptomatic left ventricular hypertrophy (LVH) is highly prevalent and associated with an adverse outcome in renal transplant recipients (RTRs). Nonetheless, there are currently no available studies analyzing the effect of LVH regression on solid clinical endpoints in these patients. Methods This study is the prospective observational extension of two randomized controlled trials aimed at assessing the effect of active intervention on post-transplant LVH in RTRs. We evaluated the incidence of a composite of death and any cardiovascular (CV) or renal event in 60 RTRs in whom LVH regression was observed and in 40 whose LVH remained unchanged or worsened. Results During an 8.4 ± 3.5-year follow-up, 8 deaths, 18 CV events and 6 renal events occurred in the entire cohort. Multivariable analysis showed that age [hazard ratio (HR) 1.07, 95% confidence interval (CI) 1.03–1.12 each 1 year, P = 0.002] and LVH regression (HR 0.42, 95% CI 0.22–0.87, P = 0.019) were significant predictors of the composite endpoint. Kaplan–Meier estimates showed better survival rates in patients in whom actual LVH regression was achieved (P < 0.001, log-rank test). Age (HR 1.09, 95% CI 1.03–1.15 each 1 year, P = 0.004), better graft function (HR 0.95, 95% CI 0.91–0.99 each 1 mL/min/1.73 m2 increase in estimated glomerular filtration rate, P = 0.03) and LVH regression (HR 0.41, 95% CI 0.22–0.79, P = 0.01) were significant predictors of the CV endpoint. Patients with a left ventricular mass index decrease also showed better cardiac event-free survival (P = 0.0022, log-rank test). Conclusions This is the first study to demonstrate that LVH regression, regardless of the therapeutic strategy adopted to achieve it, portends better long-term clinical outcome in RTRs. [...]



Simultaneous pancreas/kidney transplant recipients present with late-onset BK polyomavirus-associated nephropathy

2016-06-22T09:15:57-07:00

Background Infections have increased in simultaneous pancreas/kidney transplant recipients (SPKTRs), with BK polyomavirus (BKV)-associated nephropathy (BKVN) being the most important infectious cause of allograft loss. Comparisons of BKVN with kidney transplant recipients (KTRs), however, are lacking. Methods We studied all SPKTRs and KTRs at our transplant centre between 2003 and 2012. Eleven of 106 SPKTRs (10.4%) and 21 of 1062 KTRs (2.0%) were diagnosed with BKVN, with allograft loss in 1 SPKTR (9.1%) and 2 KTRs (9.5%). A control group of 95 SPKTRs without BKVN was used for comparison. Results SPKTRs showed an increased incidence of BKVN compared with KTRs (P < 0.001). Onset of BKVN in SPKTRs was significantly later compared with KTRs (P = 0.033). While 67% of KTRs showed early-onset BKVN, 64% of SPKTRs developed late-onset BKVN. Older recipient age and male gender increased the risk of BKVN in SPKTRs (P < 0.05). No differences were observed for patient and allograft survival (P > 0.05). However, SPKTRs with BKVN showed inferior estimated glomerular filtration rate and a higher incidence of de novo donor-specific antibodies compared with SPKTRs without BKVN in long-term follow-up (P < 0.05). SPKTRs showed higher peak BKV loads, a need for more intense therapeutic intervention and were more likely not to recover to baseline creatinine after BKVN (P < 0.05). Conclusions Our results suggest a higher incidence, more severe course and inferior outcome of BKVN in SPKTRs. An increased vulnerability of the allograft kidney due to inferior organ quality may predispose KTRs to early-onset BKVN. In contrast, SPKTRs present with late-onset BKVN in the presence of high-dose immunosuppression. [...]



Measured and not estimated glomerular filtration rate should be used to assess renal function in heart transplant recipients

2016-06-22T09:15:57-07:00

Background In organ transplanted patients, impaired renal function is of major prognostic importance and influences therapeutic decisions. Therefore, monitoring of renal function with glomerular filtration rate (GFR) is of importance, both before and after heart transplantation (HTx). The GFR can be measured directly (mGFR) or estimated (eGFR) with equations based on circulating creatinine or cystatin C levels. However, these equations have not been thoroughly validated in the HTx population. Methods We investigated the correlation, agreement and accuracy between mGFR (using 51Cr-ethylenediaminetetraacetic acid or iohexol clearance) and three commonly used eGFR equations (Modification of Diet in Renal Disease, Cockcroft–Gault and Chronic Kidney Disease Epidemiology Collaboration) in a retrospective analysis of 416 HTx recipients followed between 1988 and 2012. Comparisons were performed prior to transplantation and at 1, 5 and 10 years of follow-up. Results The correlations between eGFR and mGFR were only moderate, with r-values ranging from 0.55 preoperatively to 0.82 during follow-up. Most importantly, the level of agreement between eGFR and mGFR was very low for all three estimates, with percentage errors ranging from 93.3 to 157.3%. Also, the percentage of patients with eGFR within 30% of mGFR (P30) rarely reached the National Kidney Foundation recommended minimum level of 75%. Conclusion We argue that the accuracy and the precision of the most commonly used estimation equations for assessment of kidney function are unacceptably low and we believe that mGFR should be used liberally as the basis for clinical decision-making both before and after HTx when eGFR is subnormal. [...]
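The two agreement metrics reported in this abstract can be computed as follows. P30 is standard (the share of eGFR values within ±30% of mGFR), while the percentage-error formula shown is the commonly used Critchley definition, assumed here because the abstract does not spell out its formula:

```python
import statistics


def p30(egfr, mgfr):
    """Share (in %) of patients whose estimated GFR falls within +/-30% of
    the measured GFR; the National Kidney Foundation benchmark cited in the
    abstract is a minimum of 75%."""
    hits = sum(1 for e, m in zip(egfr, mgfr) if abs(e - m) <= 0.30 * m)
    return 100.0 * hits / len(mgfr)


def percentage_error(egfr, mgfr):
    """Percentage error of agreement: 1.96 x SD of the paired differences
    divided by the mean measured GFR (an assumed, Critchley-style
    definition; the abstract does not state the formula it used)."""
    diffs = [e - m for e, m in zip(egfr, mgfr)]
    return 100.0 * 1.96 * statistics.stdev(diffs) / statistics.mean(mgfr)
```

For example, with eGFR values of 80, 100 and 50 against a uniform mGFR of 100 mL/min/1.73 m2, only the first two fall within ±30%, giving a P30 of about 67%, below the 75% benchmark.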



Announcements

2016-06-22T09:15:57-07:00