Subscribe: Nephrology Dialysis Transplantation - current issue
http://ndt.oxfordjournals.org/rss/current.xml
Language: English

Nephrology Dialysis Transplantation - RSS feed of current issue

Causality at the dawn of the 'omics era in medicine and in nephrology

2016-09-02T00:05:37-07:00

Causality is a core concept in medicine. The quantitative determinacy characterizing today's biomedical science is unprecedented. The assessment of causal relations in human diseases is evolving, and it is therefore fundamental to keep up with the steady pace of theoretical and technological advancements. The exact specification of all causes of pathologies at the individual level, precision medicine, is expected to allow the complete eradication of disease. In this article, we discuss the various conceptualizations of causation that are at play in the context of randomized clinical trials and observational studies. Genomics, proteomics, metabolomics and epigenetics can now produce the precise knowledge we need for 21st century medicine. New conceptions of causality are needed to form the basis of the new precision medicine.




The blood pressure-salt sensitivity paradigm: pathophysiologically sound yet of no practical value

2016-09-02T00:05:37-07:00

Sodium plays an important pathophysiological role in blood pressure (BP) values and in the development of hypertension, and epidemiological studies such as the Intersalt Study have shown that the increase in BP occurring with age is determined by salt intake. Recently, a meta-analysis of 13 prospective studies has also shown the close relationship between excess sodium intake and higher risk of stroke and total cardiovascular events. However, the BP response to changing salt intake displays a marked variability, as first suggested by Kawasaki et al. (The effect of high-sodium and low-sodium intakes on blood pressure and other related variables in human subjects with idiopathic hypertension. Am J Med 1978; 64: 193–198) and later by Weinberger et al. (Definitions and characteristics of sodium sensitivity and blood pressure resistance. Hypertension 1986; 8: II127–II134), who recognized the heterogeneity of the BP response to salt and developed the concept of salt sensitivity. We have a large body of evidence in favour of a major role of metabolic and neuro-hormonal factors in determining BP salt sensitivity, in association with the effect of genetic variation. There is evidence that salt sensitivity influences the development of organ damage, even independently, at least in part, of BP levels and the occurrence of hypertension. In addition, several observational studies indicate that salt sensitivity is clearly associated with a higher rate of cardiovascular events and mortality, independently of BP levels and hypertension. A cluster of factors with well-known atherogenic potential, such as hyperinsulinaemia, dyslipidaemia and microalbuminuria (all known to be prevalent in salt-sensitive hypertension), might at least partially explain the increased cardiovascular risk observed in salt-sensitive individuals. The gold standard for the evaluation of BP salt sensitivity is the BP response to a moderate reduction of salt intake for several weeks; nevertheless, these protocols often suffer from poor patient compliance with dietary instructions. To overcome this problem, short-term tests have been proposed that evaluate either large differences in salt intake for a few days or the response to intravenous administration of saline solution and short-acting diuretics. Recently, the use of ambulatory BP measurement has been proposed for the clinical assessment of BP salt sensitivity. Of note, BP salt sensitivity, in whomever and however assessed, behaves as a continuous variable, yet it is used as a categorical parameter, with salt-sensitive individuals being defined as those with a difference in BP between low- and high-sodium intake >10%, and salt-resistant subjects as those in whom BP does not increase or shows an increase <5% under sodium loading. The general conclusion that can and should be drawn from the above considerations is that the paradigm of salt sensitivity, despite its important pathophysiological meaning, is so far not helpful to the practising physician in clinical practice, nor is it relevant or useful to the design and implementation of a population-based strategy of salt intake reduction; however, further studies are warranted for an accurate assessment of the salt-sensitivity phenotype in clinical practice. In the absence of a population strategy for salt intake reduction, the aim should be the generation of a ‘low sodium environment’ allowing for a dietary salt intake tailored to true human requirements rather than to deleterious lifestyle habits.
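
[Editor's illustration, not part of the abstract] The categorical cut-offs quoted above reduce to simple percentage arithmetic. The Python sketch below applies them to hypothetical readings: a subject is labelled salt sensitive when mean BP on high-sodium intake exceeds the low-sodium value by >10%, salt resistant when the increase is <5%, and indeterminate otherwise. Function names and example values are assumptions for illustration only.

def mean_arterial_pressure(systolic: float, diastolic: float) -> float:
    """Approximate mean arterial pressure (mmHg)."""
    return diastolic + (systolic - diastolic) / 3.0

def classify_salt_sensitivity(bp_low_salt: float, bp_high_salt: float) -> str:
    """Categorize BP salt sensitivity from mean BP on low- vs high-sodium intake."""
    change_pct = 100.0 * (bp_high_salt - bp_low_salt) / bp_low_salt
    if change_pct > 10.0:
        return "salt sensitive"
    if change_pct < 5.0:
        return "salt resistant"
    return "indeterminate"  # the 5-10% grey zone is left unclassified here

if __name__ == "__main__":
    low = mean_arterial_pressure(128, 82)   # hypothetical low-sodium phase
    high = mean_arterial_pressure(146, 94)  # hypothetical high-sodium phase
    print(classify_salt_sensitivity(low, high))  # -> salt sensitive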




Pro: Reducing salt intake at population level: is it really a public health priority?

2016-09-02T00:05:37-07:00

A reduction in salt intake lowers blood pressure and reduces stroke and other cardiovascular events, as well as chronic kidney disease, by as much as 23% (i.e. 1.25 million deaths worldwide). It is effective in both genders, at any age, in every ethnic group and in high-, middle- and low-income countries. Population salt reduction programmes are both feasible and effective (preventive imperative). Salt reduction programmes are cost-saving in all settings (high-, middle- and low-income countries) (economic imperative). Public health policies are powerful, rapid, equitable and cost-saving (political imperative). This important shift in public health has not occurred without obstinate opposition from organizations concerned primarily with the profits deriving from a high population salt intake and less with the public health benefits. A key component of the denial strategy is misinformation (with ‘pseudo’ controversies). In general, poor science has been used to create uncertainty and to support inaction. This paper summarizes the evidence in favour of a global salt reduction strategy and analyses the peddling of well-worn myths behind the false controversies.




Opponent's comments

2016-09-02T00:05:37-07:00




Con: Reducing salt intake at the population level: is it really a public health priority?

2016-09-02T00:05:37-07:00

Scientific evidence to support the recommended salt intake of <5.8 g/day is virtually non-existent. There are no randomized controlled trials (RCTs) investigating the effect of salt reduction (SR) below 5.8 g/day on health outcomes. The effect of SR on blood pressure (BP) reaches maximal efficacy at 1 week. RCTs in healthy individuals lasting at least 1 week show that the effect of SR on BP is <1 mmHg, but that SR has significant side effects, including increases in renin, aldosterone, noradrenaline, adrenaline, cholesterol and triglycerides. Still, disregarding confounders and side effects, health authorities use BP effects obtained in studies of pre-hypertensive and hypertensive patients to recommend SR in the healthy population and use these biased BP effects in statistical models indirectly to project millions of saved lives. These fantasy projections are in contrast to real data from prospective observational population studies directly associating salt intake with mortality, which show that salt intake <5.8 g/day is associated with an increased mortality of ~15%. The population studies also show that a very high salt intake >12.2 g/day is associated with increased mortality. However, since <5% of populations consume such high amounts of salt, SR at the population level should not be a public health priority. Consequently, this policy should be abolished, not because any attempt to implement it has failed, and not because it costs taxpayers and food consumers unnecessary billions of dollars, but because, if implemented, it might kill people instead of saving them.




Opponent's comments

2016-09-02T00:05:37-07:00




Moderator's view: Salt, cardiovascular risk, observational research and recommendations for clinical practice

2016-09-02T00:05:37-07:00

In observational studies, blood pressure (BP), cholesterol and nutritional status biomarkers, including sodium intake, coherently show a J- or U-shaped relationship with health outcomes. Yet these data may reflect a stable sodium intake or a reduced intake due to comorbidities or intercurrent disease, or an intentional decrease in salt intake. Adjusting for comorbidities and risk factors may fail to eliminate confounding. For cholesterol and BP, we base our recommendations for prevention and treatment on interventional (experimental) studies. For sodium, we lack the perfect large-scale trial we would need, but substantial circumstantial information derived from interventional studies cannot be ignored. The objection that modelling the risk of salt excess for cardiovascular disease events based on the effect of salt intake on BP is unjustified fails to consider a recent meta-analysis showing that, independently of the intervention applied, intensive BP-lowering treatment (average BP 133/76 mmHg), compared with the less intensive treatment (140/81 mmHg), is associated with a 14% risk reduction for major cardiovascular events. In this knowledge context, inertia, i.e. awaiting the ‘mother trial’, is not justified. While recognizing that this trial may still be needed and that actual data, rather than modelled data, are the ideal solution, for now, the World Health Organization recommendation of reducing salt intake to <2 g/day of sodium (5 g/day of salt) in adults stands.




Chronicity following ischaemia-reperfusion injury depends on tubular-macrophage crosstalk involving two tubular cell-derived CSF-1R activators: CSF-1 and IL-34

2016-09-02T00:05:37-07:00

Two structurally unrelated ligands activate the macrophage colony-stimulating factor receptor (CSF-1R, c-fms, CD115): M-CSF/CSF-1 and interleukin-34 (IL-34). Both ligands promote macrophage proliferation, survival and differentiation. IL-34 also activates the protein-tyrosine phosphatase receptor (PTP-ζ, PTPRZ1). Both receptors and cytokines are increased during acute kidney injury. While tubular cell-derived CSF-1 is required for kidney repair, Baek et al. (J Clin Invest 2015; 125: 3198–3214) have now identified tubular epithelial cell-derived IL-34 as a promoter of kidney neutrophil and macrophage infiltration and tubular cell destruction during experimental kidney ischaemia-reperfusion, leading to chronic injury. IL-34 promoted proliferation of both intrarenal macrophages and bone marrow cells, increasing circulating neutrophils and monocytes and their kidney recruitment. Thus, injured tubular cells release two CSF-1R activators, one (CSF-1) that promotes tubular cell survival and kidney repair and another (IL-34) that promotes chronic kidney damage. These results hold promise for the development of IL-34-targeting strategies to prevent ischaemia-reperfusion kidney injury in contexts such as kidney transplantation. However, careful consideration should be given to the recent characterization by Bézie et al. (J Clin Invest 2015; 125: 3952–3964) of IL-34 as a T regulatory cell (Treg) cytokine that modulates macrophage responses so that IL-34-primed macrophages potentiate the immunosuppressive capacity of Tregs and promote graft tolerance.




Kelch-like 3/Cullin 3 ubiquitin ligase complex and WNK signaling in salt-sensitive hypertension and electrolyte disorder

2016-09-02T00:05:38-07:00

Pseudohypoaldosteronism type II (PHAII) is a hereditary disease characterized by salt-sensitive hypertension, hyperkalemia and thiazide sensitivity. Mutations in the with-no-lysine kinase 1 (WNK1) and WNK4 genes are reported to cause PHAII. Rigorous studies have demonstrated that WNK kinases constitute a signaling cascade with oxidative stress-responsive gene 1 (OSR1), Ste20-related proline-alanine-rich kinase (SPAK) and the solute carrier family 12A (SLC12A) transporters, including the thiazide-sensitive NaCl cotransporter. The WNK–OSR1/SPAK–SLC12A signaling cascade is present in the kidneys and vascular smooth muscle cells (VSMCs) and physiologically regulates salt sensitivity, i.e. urinary sodium excretion and arterial tone, in response to various hormonal and dietary factors. However, although it was clear that abnormal activation of this signaling cascade is the molecular basis of PHAII, the molecular mechanisms responsible for the physiological regulation of WNK signaling and the effect of WNK4 mutations on PHAII pathogenesis were poorly understood. Two additional genes responsible for PHAII, Kelch-like 3 (KLHL3) and Cullin 3 (CUL3), were identified in 2012. WNK1 and WNK4 have been shown to be substrates of the KLHL3–CUL3 E3 ubiquitin ligase both in vitro and in vivo. In PHAII, the loss of interaction between KLHL3 and WNK4 increases levels of WNK kinases due to impaired ubiquitination. These results indicate that WNK signaling is physiologically regulated by KLHL3/CUL3-mediated ubiquitination. Here, we review recent studies investigating the pathophysiological roles of the WNK signaling cascade in the kidneys and VSMCs, as well as recently discovered mechanisms underlying the regulation of WNK signaling by KLHL3 and CUL3.




Glomerular filtration rate decline as a surrogate end point in kidney disease progression trials

2016-09-02T00:05:38-07:00

Chronic kidney disease (CKD) is strongly associated with increased risks of progression to end-stage kidney disease (ESKD) and mortality. Clinical trials evaluating CKD progression commonly use a composite end point of death, ESKD or serum creatinine doubling. However, due to low event rates, such trials require large sample sizes and long-term follow-up for adequate statistical power. As a result, very few interventions targeting CKD progression have been tested in randomized controlled trials. To overcome this problem, the National Kidney Foundation and the Food and Drug Administration conducted a series of analyses to determine whether an end point of a 30 or 40% decline in estimated glomerular filtration rate (eGFR) over 2–3 years can substitute for serum creatinine doubling in the composite end point. These analyses demonstrated that these alternate kidney end points were significantly associated with subsequent risks of ESKD and death. However, the association between, and consistency of treatment effects on, eGFR decline and clinical end points were influenced by baseline eGFR, follow-up duration and acute haemodynamic effects. The investigators concluded that a 40% eGFR decline is broadly acceptable as a kidney end point across a wide baseline eGFR range and that a 30% eGFR decline may be acceptable in some situations. Although these alternate kidney end points could potentially allow investigators to conduct shorter-duration clinical trials with smaller sample sizes, thereby generating evidence to guide clinical decision-making in a timely manner, it is uncertain whether these end points will improve trial efficiency and feasibility. This review critically appraises the evidence, strengths and limitations pertaining to eGFR end points.
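
[Editor's illustration, not part of the review] The arithmetic behind the proposed surrogate end points is a simple percentage decline. The sketch below flags whether a hypothetical eGFR trajectory meets the 30% or 40% threshold; all names and values are illustrative assumptions.

def egfr_decline_pct(egfr_baseline: float, egfr_followup: float) -> float:
    """Percentage decline in eGFR (mL/min/1.73 m2); negative if eGFR improved."""
    return 100.0 * (egfr_baseline - egfr_followup) / egfr_baseline

def meets_surrogate_end_point(egfr_baseline: float, egfr_followup: float,
                              threshold_pct: float = 40.0) -> bool:
    """True if the decline reaches the chosen surrogate threshold (30 or 40%)."""
    return egfr_decline_pct(egfr_baseline, egfr_followup) >= threshold_pct

if __name__ == "__main__":
    baseline, followup = 60.0, 33.0  # hypothetical eGFR values over 2-3 years
    print(round(egfr_decline_pct(baseline, followup), 1))       # 45.0
    print(meets_surrogate_end_point(baseline, followup, 40.0))  # True
    print(meets_surrogate_end_point(baseline, followup, 30.0))  # True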




Changes in inflammatory biomarkers after renal revascularization in atherosclerotic renal artery stenosis

2016-09-02T00:05:38-07:00

Background

Atherosclerotic renal artery stenosis (ARAS) activates oxidative stress and chronic inflammatory injury. Contrast imaging and endovascular stenting pose potential hazards for acute kidney injury, particularly when superimposed upon reduced kidney perfusion.

Methods

We measured sequential early and long-term changes in circulating inflammatory and injury biomarkers in 12 ARAS subjects undergoing computed tomography imaging and stent revascularization, compared with essential hypertensive (EH) subjects of similar age, under fixed sodium intake and medication regimens in a clinical research unit.

Results

Neutrophil gelatinase-associated lipocalin (NGAL), tissue inhibitor of metalloproteinase-2 (TIMP-2), insulin-like growth factor-binding protein 7 (IGFBP7), monocyte chemoattractant protein-1 (MCP-1) and tumour necrosis factor-α (TNF-α) were all elevated before intervention. Post-stenotic kidney volume, perfusion, blood flow and glomerular filtration rate (GFR) were lower in ARAS than in EH subjects. TIMP-2 and IGFBP7 fell briefly, then rose over 18 h after contrast imaging and stent deployment. Circulating NGAL decreased and remained lower for 27 h. In ARAS, these biomarkers returned to baseline after 3 months, while kidney volume, perfusion, blood flow and GFR increased but remained lower than in EH subjects.

Conclusions

These divergent patterns of inflammatory signals are consistent with cell cycle arrest (TIMP-2, IGFBP7) and relative protection from acute kidney injury after imaging and stenting. Sustained basal elevation of circulating and renal venous inflammatory biomarkers supports ongoing, possibly episodic, renal stress in ARAS that limits toxicity from stent revascularization.




Metallothioneins and renal ageing

2016-09-02T00:05:38-07:00

Background

Human lifespan is increasing continuously and about one-third of the population >70 years of age suffers from chronic kidney disease. The pathophysiology of the loss of renal function with ageing is unclear.

Methods

We determined age-associated gene expression changes in zero-hour biopsies of deceased donor kidneys without laboratory signs of impaired renal function, defined as a last serum creatinine >0.96 mg/dL in females and >1.18 mg/dL in males, using microarray technology and the Significance Analysis of Microarrays routine. Expression changes of selected genes were confirmed by quantitative polymerase chain reaction; in situ hybridization and immunohistochemistry were used to localize the respective mRNA and protein. Functional aspects were examined in vitro.

Results

Donors were classified into three age groups (<40, 40–59 and >59 years; Groups 1, 2 and 3, respectively). In Group 3 especially, genes encoding metallothionein (MT) isoforms showed significantly higher expression compared with Group 1; localization studies revealed predominant staining in renal proximal tubular cells. RPTEC/TERT1 cells overexpressing MT2A were less susceptible to cadmium chloride–induced cytotoxicity and hypoxia-induced apoptosis, both models of increased generation of reactive oxygen species.

Conclusions

Increased expression of MTs in the kidney with ageing might be a protective mechanism against increased oxidative stress, which is closely related to the ageing process. Our findings indicate that MTs are functionally involved in the pathophysiology of ageing-related processes.




Extended versus standard azathioprine maintenance therapy in newly diagnosed proteinase-3 anti-neutrophil cytoplasmic antibody-associated vasculitis patients who remain cytoplasmic anti-neutrophil cytoplasmic antibody-positive after induction of remission: a randomized clinical trial

2016-09-02T00:05:38-07:00

Background

Cytoplasmic anti-neutrophil cytoplasmic antibody (C-ANCA) positivity at remission has been associated with an increased relapse rate in patients with proteinase 3 anti-neutrophil cytoplasmic antibody-associated vasculitis (PR3-AAV) after a switch to azathioprine maintenance therapy. We therefore hypothesized that extended azathioprine maintenance therapy could reduce the incidence of relapse in this setting.

Methods

Patients newly diagnosed with PR3-AAV at 12 centres in The Netherlands during 2003–11 who received a standardized induction regimen consisting of oral cyclophosphamide and corticosteroids were enrolled (n = 131). Patients were randomized to standard or extended azathioprine maintenance therapy when C-ANCA was positive at the time of stable remission. Standard maintenance treatment consisted of azathioprine (1.5–2.0 mg/kg) until 1 year after diagnosis and subsequent tapering to 25 mg every 3 months. Extended azathioprine maintenance therapy (1.5–2.0 mg/kg) was continued until 4 years after diagnosis and tapered thereafter. The primary endpoint was relapse-free survival at 4 years after diagnosis.

Results

In patients with PR3-AAV who were C-ANCA positive at the time of stable remission, relapse-free survival at 4 years after diagnosis did not differ significantly between standard azathioprine (n = 24) and extended azathioprine (n = 21) maintenance therapy (P = 0.40). There was also no significant difference in relapse-free survival between patients receiving standard azathioprine (n = 106) versus extended azathioprine maintenance therapy (n = 21; P = 0.94). In addition, there was no difference in the relapse rate between patients with PR3-AAV who were C-ANCA positive (n = 45) at the time of remission versus patients who became C-ANCA negative at the time of remission (n = 82; P = 0.62).

Conclusions

This randomized trial suggests that extended azathioprine maintenance therapy has only a limited effect on the prevention of relapse in patients with PR3-AAV at 4 years after diagnosis. Moreover, positive C-ANCA status at stable remission was not associated with an increased rate of relapse.

Trial registration

ClinicalTrials.gov NCT00128895.




Relationship of proximal tubular injury to chronic kidney disease as assessed by urinary kidney injury molecule-1 in five cohort studies

2016-09-02T00:05:38-07:00

Background

The primary biomarkers used to define CKD are serum creatinine and albuminuria. These biomarkers have directed focus on the filtration and barrier functions of the kidney glomerulus even though albuminuria results from tubule dysfunction as well. Given that proximal tubules make up ~90% of kidney cortical mass, we evaluated whether a sensitive and specific marker of proximal tubule injury, urinary kidney injury molecule-1 (KIM-1), is elevated in individuals with CKD or with risk factors for CKD.

Methods

We measured urinary KIM-1 in participants of five cohort studies from the USA and Sweden. Participants had a wide range of kidney function and were racially and ethnically diverse. Multivariable linear regression models were used to test the association of urinary KIM-1 with demographic, clinical and laboratory values.

Results

In pooled, multivariable-adjusted analyses, log-transformed, creatinine-normalized urinary KIM-1 levels were higher in those with lower eGFR {β = –0.03 per 10 mL/min/1.73 m2 [95% confidence interval (CI) –0.05 to –0.02]} and greater albuminuria [β = 0.16 per unit of log albumin:creatinine ratio (95% CI 0.15–0.17)]. Urinary KIM-1 levels were higher in current smokers, lower in blacks than nonblacks and lower in users versus nonusers of angiotensin-converting enzyme inhibitors and angiotensin receptor blockers.

Conclusion

Proximal tubule injury appears to be an integral and measurable element of multiple stages of CKD.




Individual long-term albuminuria exposure during angiotensin receptor blocker therapy is the optimal predictor for renal outcome

2016-09-02T00:05:38-07:00

Background

Albuminuria reduction due to angiotensin receptor blockers (ARBs) predicts subsequent renoprotection. Relating the initial albuminuria reduction to subsequent renoprotection assumes that the initial ARB-induced albuminuria reduction remains stable during follow-up. The aim of this study was to assess individual albuminuria fluctuations after the initial ARB response and to determine whether taking individual albuminuria fluctuations into account improves renal outcome prediction.

Methods

Patients with diabetes and nephropathy treated with losartan or irbesartan in the RENAAL and IDNT trials were included. Patients with a >30% reduction in albuminuria 3 months after ARB initiation were stratified by the subsequent change in albuminuria until Month 12 into enhanced responders (>50% albuminuria reduction), sustained responders (between 20 and 50% reduction) and response escapers (<20% reduction). Predictive performance of the individual albuminuria exposure until Month 3 was compared with the exposure over the first 12 months using receiver operating characteristic (ROC) curves.
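
[Editor's illustration, not the trials' analysis code] The response categories above reduce to percentage arithmetic on the albuminuria change from baseline; the hypothetical Python sketch below shows how a Month-12 measurement would be classified. Names and example values are assumptions.

def albuminuria_reduction_pct(baseline: float, followup: float) -> float:
    """Percentage reduction in albuminuria from baseline (negative = increase)."""
    return 100.0 * (baseline - followup) / baseline

def response_category(baseline: float, month12: float) -> str:
    """Classify the Month-12 response using the cut-offs quoted in the Methods."""
    reduction = albuminuria_reduction_pct(baseline, month12)
    if reduction > 50.0:
        return "enhanced responder"
    if reduction >= 20.0:
        return "sustained responder"
    return "response escaper"

if __name__ == "__main__":
    baseline_acr, month12_acr = 1200.0, 500.0  # hypothetical albumin:creatinine ratios
    print(response_category(baseline_acr, month12_acr))  # -> enhanced responder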

Results

Following ARB initiation, 388 (36.3%) patients showed a >30% reduction in albuminuria. Among these patients, the albuminuria level further decreased in 174 (44.8%), remained stable in 123 (31.7%) and increased in 91 (23.5%). Similar albuminuria fluctuations were observed in patients with <30% albuminuria reduction. Renal risk prediction improved when using the albuminuria exposure during the first 12 months versus the initial Month 3 change [ROC difference: 0.78 (95% CI 0.75–0.82) versus 0.68 (0.64–0.72); P < 0.0001].

Conclusions

Following the initial response to ARBs, a large within-patient albuminuria variability is observed. Hence, incorporating multiple albuminuria measurements over time in risk algorithms may be more appropriate to monitor treatment effects and quantify renal risk.




Use of urine biomarker-derived clusters to predict the risk of chronic kidney disease and all-cause mortality in HIV-infected women

2016-09-02T00:05:38-07:00

Background

Although individual urine biomarkers are associated with chronic kidney disease (CKD) incidence and all-cause mortality in the setting of HIV infection, their combined utility for prediction remains unknown.

Methods

We measured eight urine biomarkers shown previously to be associated with incident CKD and mortality risk among 902 HIV-infected women in the Women's Interagency HIV Study: N-acetyl-β-d-glucosaminidase (NAG), kidney injury molecule-1 (KIM-1), alpha-1 microglobulin (α1m), interleukin 18, neutrophil gelatinase-associated lipocalin, albumin-to-creatinine ratio, liver fatty acid-binding protein and α-1-acid-glycoprotein. A group-based cluster method classified participants into three distinct clusters using the three most distinguishing biomarkers (NAG, KIM-1 and α1m), independent of the study outcomes. We then evaluated associations of each cluster with incident CKD (estimated glomerular filtration rate <60 mL/min/1.73 m2 by cystatin C) and all-cause mortality, adjusting for traditional and HIV-related risk factors.
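
[Editor's illustration only] The sketch below derives three outcome-independent biomarker clusters with k-means on simulated data; this is a stand-in for the group-based cluster method the study actually used, and all data, names and parameters are assumptions, not study values.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Simulated urine biomarker matrix: rows = participants; columns = NAG, KIM-1, a1m
biomarkers = rng.lognormal(mean=0.0, sigma=1.0, size=(902, 3))

# Log-transform (biomarkers are right-skewed), standardize, then cluster into 3 groups
X = StandardScaler().fit_transform(np.log(biomarkers))
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# The cluster label could then enter multivariable models for incident CKD and mortality
for k in range(3):
    print(f"Cluster {k + 1}: n = {(clusters == k).sum()}")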

Results

Over 8 years of follow-up, 177 CKD events and 128 deaths occurred. The first set of clusters partitioned women into three groups, containing 301 (Cluster 1), 470 (Cluster 2) and 131 (Cluster 3) participants. The rate of CKD incidence was 13, 21 and 50% across the three clusters; mortality rates were 7.3, 13 and 34%. After multivariable adjustment, Cluster 3 remained associated with a nearly 3-fold increased risk of both CKD and mortality, relative to Cluster 1 (both P < 0.001). The addition of the multi-biomarker cluster to the multivariable model improved discrimination for CKD (c-statistic = 0.72–0.76, P = 0.0029), but only modestly for mortality (c = 0.79–0.80, P = 0.099). Clusters derived with all eight markers were no better for discrimination than the three-biomarker clusters.

Conclusions

For predicting incident CKD in HIV-infected women, clusters developed from three urine-based kidney disease biomarkers were as effective as an eight-marker panel in improving risk discrimination.




PLA2R antibodies, glomerular PLA2R deposits and variations in PLA2R1 and HLA-DQA1 genes in primary membranous nephropathy in South Asians

2016-09-02T00:05:38-07:00

Background

Antibodies to M-type phospholipase A2 receptor (PLA2R) correlate with clinical activity of primary membranous nephropathy (PMN). Risk alleles in PLA2R1 and HLA-DQA1 genes are associated with PMN. Whether these alleles are associated with the development of anti-PLA2R is unknown. In this prospective study we evaluated anti-PLA2R, enhanced glomerular staining for PLA2R and variations in PLA2R1 and HLA-DQA1 genes in Indian patients with PMN and examined their association with response to treatment.

Methods

A total of 114 adult PMN patients were studied. Anti-PLA2R was estimated before treatment and after 6 and 12 months of therapy. Enhanced glomerular staining for PLA2R was assessed on fresh frozen tissue. Genotype analysis was done on recruited patients and 95 healthy controls by TaqMan assays for six single-nucleotide polymorphisms (SNPs; rs4664308, rs3749119, rs3749117, rs4664308, rs3828323 and rs2187668). Patients were followed up monthly for a period of 12 months.

Results

Of 114 patients, 66.7% showed elevated serum anti-PLA2R by ELISA and 64.9% by indirect immunofluorescence. About 75% had enhanced glomerular staining for PLA2R. A total of 82% of patients had PLA2R-related disease. Reduction in the serum anti-PLA2R titer had a significant association with remission of nephrotic syndrome (P = 0.0003) at 6 and 12 months. More than 85% of patients showing a >90% reduction in the anti-PLA2R titer achieved remission of the nephrotic state, whereas of those showing a <50% reduction in titers, 87.5% had a persistent nephrotic state. The SNPs rs3749119, rs3749117 and rs4664308 in PLA2R1 and rs2187668 in HLA-DQA1 were significantly associated with PMN. The SNP rs2187668 was associated with anti-PLA2R positivity. Patients with a high-risk genotype had higher anti-PLA2R levels.

Conclusion

To conclude, anti-PLA2R and enhanced glomerular PLA2R staining are found in more than two-thirds of Indian PMN cases. A reduction in the anti-PLA2R titer correlated with response to therapy.




Fibroblast growth factor 23 correlates with volume status in haemodialysis patients and is not reduced by haemodialysis

2016-09-02T00:05:38-07:00

Background

Recent data suggest a role for fibroblast growth factor 23 (FGF-23) in volume regulation. In haemodialysis patients, a large ultrafiltration volume (UFV) reflects poor volume control, and both FGF-23 and a large UFV are risk factors for mortality in this population. We studied the association between FGF-23 and markers of volume status including UFV, as well as the intradialytic course of FGF-23, in a cohort of haemodialysis patients.

Methods

We carried out an observational, post hoc analysis of 109 prevalent haemodialysis patients who underwent a standardized, low-flux haemodialysis session with a constant ultrafiltration rate. We measured UFV, plasma copeptin and echocardiographic parameters, including cardiac output, end-diastolic volume and left ventricular mass index, at the onset of the haemodialysis session. We measured the intradialytic course of plasma C-terminal FGF-23 (corrected for haemoconcentration) and serum phosphate levels at 0, 1, 3 and 4 h after the onset of haemodialysis and analysed changes with a linear mixed effects model.

Results

Median age was 66 years (interquartile range 51–75), 65% were male, weekly Kt/V was 4.3 ± 0.7 and dialysis vintage was 25.4 (8.5–52.5) months. In univariable analysis, pre-dialysis plasma FGF-23 was associated with UFV, end-diastolic volume, cardiac output, early diastolic velocity e' and plasma copeptin. In multivariable regression analysis, UFV correlated with FGF-23 (standardized β: 0.373, P < 0.001, model R2: 57%), independent of serum calcium and phosphate. The association between FGF-23 and echocardiographic volume markers was lost for all but cardiac output upon adjustment for UFV. Overall, FGF-23 levels did not change during dialysis [7627 (3300–13 514) to 7503 (3109–14 433) RU/mL; P = 0.98], whereas phosphate decreased (1.71 ± 0.50 to 0.88 ± 0.26 mmol/L; P < 0.001).

Conclusions

FGF-23 was associated with volume status in haemodialysis patients. The strong association with UFV suggests that optimization of volume status, for example by more intensive haemodialysis regimens, may also benefit mineral homeostasis. A single dialysis session did not lower FGF-23 levels.




Mortality trends among Japanese dialysis patients, 1988-2013: a joinpoint regression analysis

2016-09-02T00:05:38-07:00

Background

Evaluation of mortality trends in dialysis patients is important for improving their prognoses. The present study aimed to examine temporal trends in deaths (all-cause, cardiovascular, noncardiovascular and the five leading causes) among Japanese dialysis patients.

Methods

Mortality data were extracted from the Japanese Society for Dialysis Therapy registry. Age-standardized mortality rates were calculated by direct standardization against the 2013 dialysis population. The average annual percentage change (APC) and the corresponding 95% confidence interval (CI) were computed for each trend using joinpoint regression analysis.
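
[Editor's illustration, not the registry's code] The two techniques named in the Methods reduce to simple formulas; the hedged sketch below shows direct age standardization (weighting stratum rates by a standard population, here the 2013 dialysis population) and the APC implied by the slope of a log(rate)-versus-year regression segment. All numbers are invented.

from math import exp

def age_standardized_rate(stratum_rates, standard_counts):
    """Directly standardized rate: age-stratum rates weighted by the
    standard population's age distribution."""
    total = sum(standard_counts)
    return sum(r * n for r, n in zip(stratum_rates, standard_counts)) / total

def annual_percentage_change(slope_log_rate: float) -> float:
    """APC for one joinpoint segment, from the slope of log(rate) on calendar year."""
    return 100.0 * (exp(slope_log_rate) - 1.0)

if __name__ == "__main__":
    observed_rates = [20.0, 45.0, 90.0, 180.0]                # hypothetical deaths/1000 patient-years
    standard_population = [30_000, 80_000, 120_000, 70_000]   # hypothetical 2013 age distribution
    print(round(age_standardized_rate(observed_rates, standard_population), 1))  # 92.0
    print(round(annual_percentage_change(-0.038), 1))          # slope -0.038/year -> about -3.7% APC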

Results

A total of 469 324 deaths occurred, of which 25.9% were from cardiac failure, 17.5% from infectious disease, 10.2% from cerebrovascular disorders, 8.6% from malignant tumors and 5.6% from cardiac infarction. The joinpoint trend for all-cause mortality decreased significantly, by –3.7% (95% CI –4.2 to –3.2) per year from 1988 through 2000, then decreased more gradually, by –1.4% (95% CI –1.7 to –1.2) per year during 2000–13. The improved mortality rates were mainly due to decreased deaths from cardiovascular disease, with mortality rates due to noncardiovascular disease exceeding those due to cardiovascular disease in the last decade. Among the top five causes of death, cardiac failure has shown a marked decrease in mortality rate. However, the rates due to infectious disease have remained stable during the study period [APC 0.1 (95% CI –0.2 to 0.3)].

Conclusions

Significant progress has been made, particularly with regard to the decrease in age-standardized mortality rates. The risk of cardiovascular death has decreased, while the risk of death from infection has remained unchanged for 25 years.




Phosphorus metabolism in peritoneal dialysis- and haemodialysis-treated patients

2016-09-02T00:05:38-07:00

Background

Phosphorus control is generally considered to be better in peritoneal dialysis (PD) patients as compared with haemodialysis (HD) patients. Predialysis phosphorus concentrations are misleading as a measure of phosphorus exposure in HD, as these neglect significant dialysis-related fluctuations.

Methods

Parameters of mineral metabolism, including parathyroid hormone (PTH) and fibroblast growth factor-23 (FGF-23), were determined in 79 HD and 61 PD patients. In PD, phosphorus levels were determined mid-morning. In HD, time-averaged phosphorus concentrations were modelled from measurements before and after the mid-week dialysis session. Weekly renal, dialytic and total phosphorus clearances as well as total mass removal were calculated from urine and dialysate collections.
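
[Editor's illustration, not the authors' code] The weekly clearance calculation implied here is standard clearance arithmetic: the phosphorus mass recovered in dialysate or urine per week divided by the plasma concentration. The sketch below uses invented values; total mass removal is simply the summed mass terms, without dividing by the plasma concentration.

def weekly_clearance_l(effluent_conc_mg_dl: float, effluent_volume_l_week: float,
                       plasma_conc_mg_dl: float) -> float:
    """Weekly clearance (L/week) from a dialysate or urine collection."""
    mass_removed_mg = effluent_conc_mg_dl * 10.0 * effluent_volume_l_week  # mg/dL -> mg/L
    return mass_removed_mg / (plasma_conc_mg_dl * 10.0)

if __name__ == "__main__":
    plasma_p = 5.0                                      # mg/dL, mid-morning PD sample (hypothetical)
    dialytic = weekly_clearance_l(3.5, 70.0, plasma_p)  # hypothetical weekly PD effluent collection
    renal = weekly_clearance_l(8.0, 7.0, plasma_p)      # hypothetical 24-h urine x 7 days
    print(round(dialytic + renal, 1), "L/week total phosphorus clearance")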

Results

Time-averaged serum phosphorus concentrations in HD (3.5 ± 1.0 mg/dL) were significantly lower than the mid-morning concentrations in PD (5.0 ± 1.4 mg/dL, P < 0.0001). In contrast, predialysis phosphorus concentrations (4.6 ± 1.4 mg/dL) were not different from PD. PTH and FGF-23 levels were significantly higher in PD. Despite higher residual renal function, total phosphorus clearance was significantly lower in PD (P < 0.0001). Total phosphorus mass removal, conversely, was significantly higher in PD (P < 0.05).

Conclusions

Our data suggest that the time-averaged phosphorus concentrations in patients treated with PD are higher as compared with patients treated with HD. Despite a better preserved renal function, total phosphorus clearance is lower in patients treated with PD. Additional studies are needed to confirm these findings in a population with a different demographic profile and dietary background and to define clinical implications.




High-urgency kidney transplantation in the Eurotransplant Kidney Allocation System: success or waste of organs? The Eurotransplant 15-year all-centre survey

2016-09-02T00:05:38-07:00

Background

In the Eurotransplant Kidney Allocation System (ETKAS), transplant candidates can be considered for high-urgency (HU) status in case of life-threatening inability to undergo renal replacement therapy. Data on the outcomes of HU transplantation are sparse and the benefit is controversial.

Methods

We systematically analysed data from 898 ET HU kidney transplant recipients from 61 transplant centres between 1996 and 2010 and investigated the 5-year patient and graft outcomes and differences between relevant subgroups.

Results

Kidney recipients with an HU status were younger (median 43 versus 55 years) and spent less time on the waiting list compared with non-HU recipients (34 versus 54 months). They received grafts with significantly more mismatches (mean 3.79 versus 2.42; P < 0.001) and the percentage of retransplantations was remarkably higher (37.5 versus 16.7%). Patient survival (P = 0.0053) and death with a functioning graft (DwFG; P < 0.0001) after HU transplantation were significantly worse than in non-HU recipients, whereas graft outcome was comparable (P = 0.094). Analysis according to the different HU indications revealed that recipients listed HU because of an imminent lack of access for dialysis had a significantly worse patient survival (P = 0.0053) and DwFG (P = 0.0462) compared with recipients with psychological problems and suicidality because of dialysis. In addition, retransplantation had a negative impact on patient and graft outcome.

Conclusions

Facing organ shortages, increasing wait times and considerable mortality on dialysis, we question the current policy of HU allocation and propose more restrictive criteria with regard to individuals with vascular complications or repeated retransplantations in order to support patients on the non-HU waiting list with a much better long-term prognosis.




Estimated nephron number of the remaining donor kidney: impact on living kidney donor outcomes

2016-09-02T00:05:38-07:00

Background

It has been demonstrated that low birth weight gives rise to a reduction in nephron number with increased risks for hypertension and renal disease. Its impact on renal function in kidney donors, however, has not been addressed.

Methods

To investigate the impact of birth weight, kidney weight, kidney volume and estimated nephron number on kidney function, we collected data from 91 living kidney donors before nephrectomy and at +12, +36 and +60 months after nephrectomy.

Results

Birth weight showed a positive correlation with estimated glomerular filtration rate (eGFR) at +12, +36 and +60 months after nephrectomy (P < 0.05). The strongest link was observed in donors >50 years old (R = 0.535, P < 0.001 at +12 months). Estimated nephron number and eGFR showed a strong positive correlation at +12, +36 and +60 months after nephrectomy (R = 0.540; R = 0.459; R = 0.506, P < 0.05). Daily proteinuria at +12 months showed a negative correlation with birth weight (P = 0.009). Donors with new-onset hypertension showed significantly lower birth weights and higher uric acid levels (P < 0.05). Kidney weight and volume did not show any impact on donor outcomes (P > 0.05).

Conclusions

Low nephron number predisposes donors to inferior remaining eGFR, hypertension and proteinuria. The strong correlation in elderly donors may be attributed to reduced renal functional reserve due to the decline of renal function with age.




'I feel stronger and younger all the time' – perspectives of elderly kidney transplant recipients: thematic synthesis of qualitative research

2016-09-02T00:05:38-07:00

Background

Kidney transplantation offers improved survival and quality of life to an increasing number of elderly patients with end-stage kidney disease. However, elderly kidney transplant recipients may face unique challenges due to a higher burden of comorbidity, greater cumulative risk of immunosuppression-related complications and increasing frailty. We aimed to describe the perspectives of elderly kidney transplant recipients.

Methods

Electronic databases were searched to April 2015. Qualitative studies were eligible if they reported views from elderly kidney transplant recipients (≥60 years). Thematic synthesis was used to analyse the findings.

Results

Twenty-one studies involving >116 recipients were included. We identified seven themes. ‘Regaining strength and vitality’ meant valuing the physical and psychosocial improvements in daily functioning and life participation. ‘Extending life’ was the willingness to accept any organ, including extended criteria kidneys, to prolong survival. ‘Debt of gratitude’ entailed conscious appreciation toward their donor while knowing they were unable to repay their sacrifice. ‘Moral responsibility to maintain health’ motivated adherence to medication and lifestyle recommendations out of an ethical duty to protect their gift for graft survival. ‘Unabating and worsening forgetfulness’ hindered self-management. ‘Disillusionment with side effects and complications’ reflected disappointment and exasperation with the unintended consequences of medications. ‘Finality of treatment option’ was an acute awareness that the current transplant may be their last.

Conclusions

Kidney transplantation was perceived to slow and even reverse the experience of aging among elderly recipients, especially compared with dialysis. However, some were frustrated over persistent limitations after transplant, struggled with the burden of medication side effects and worried about a possible return to dialysis if the transplant failed. Clarifying patient expectations of transplantation, providing support to alleviate the debilitating impacts of immunosuppression and addressing fears about deteriorating health and graft failure may improve satisfaction and outcomes in elderly kidney transplant recipients.




Next generation sequencing and functional analysis of patient urine renal progenitor-derived podocytes to unravel the diagnosis underlying refractory lupus nephritis

2016-09-02T00:05:38-07:00

Often the cause of refractory lupus nephritis (RLN) remains unclear. We performed next-generation sequencing for podocyte genes in an RLN patient and identified compound heterozygosity for APOL1 risk alleles G1 and G2 and a novel homozygous c.[1049C>T]+[1049C>T] NPHS1 gene variant of unknown significance. To test for causality, renal progenitor cells isolated from the urine of this patient were differentiated into podocytes in vitro. These podocytes showed aberrant nephrin trafficking, cytoskeletal structure and lysosomal leakage, and increased detachment as compared with podocytes isolated from controls. Thus, lupus podocytopathy can be confirmed as a cause of RLN by functional genetics on patient-derived podocytes.




Announcements

2016-09-02T00:05:38-07:00