Subscribe: Nephrology Dialysis Transplantation - recent issues
http://ndt.oxfordjournals.org/rss/recent.xml
Language: English



Nephrology Dialysis Transplantation - RSS feed of recent issues (covers the latest 3 issues, including the current issue)




Pro: STOP immunosuppression in IgA nephropathy?

2016-11-02T00:05:41-07:00

The Kidney Disease: Improving Global Outcomes (KDIGO) guidelines suggest a 6-month course of corticosteroids (CS) for IgA nephropathy (IgAN) patients with persistent proteinuria ≥1 g/day despite 3–6 months of renin–angiotensin system (RAS) blockers and glomerular filtration rate (GFR) >50 mL/min/1.73 m2. In December 2015, Rauen et al. (N Engl J Med 2015; 373: 2225–2236) published an article entitled ‘Intensive supportive care plus immunosuppression in IgA nephropathy’ (STOP-IgAN), which presented results from 379 IgAN patients from 32 nephrology centres in Germany. During a run-in phase of 6 months, patients received supportive care therapy including RAS blockers, dietary counselling, advice to stop smoking and avoid nephrotoxic drugs, and statins if required. After 6 months, 177 patients with proteinuria >0.75 g/day (non-responder patients) were randomized either to continued supportive care or to supportive care plus immunosuppression (monotherapy with CS or combined therapy with three immunosuppressants). The authors reported that, after 36 months of observation, the addition of immunosuppressants to ongoing comprehensive supportive care was not beneficial in IgAN patients with moderate proteinuria and chronic kidney disease stages 1–3. These conclusions are questionable for several reasons: (i) studies of time-averaged proteinuria have shown that beneficial effects on renal survival, not evident after 36 months, emerge over the course of longer observation periods; (ii) supportive care in the STOP-IgAN study resulted in a small loss of renal function during the 36 months of observation (annual decrease in the estimated GFR of 1.6 mL/min/1.73 m2), but was unable to reduce proteinuria below 1 g/day; in contrast, 6 months of steroid therapy lowered proteinuria below 1 g/day; and (iii) the lack of any assessment of the histological data does not allow the impact of morphological lesions on renal survival and on treatment effects to be assessed. Further evaluation with a longer follow-up period is needed to obtain more reliable answers than the weak evidence provided by this study.




Opponent's comments

2016-11-02T00:05:41-07:00




Con: STOP immunosuppression in IgA nephropathy

2016-11-02T00:05:41-07:00

A comprehensive supportive therapy approach constitutes the mainstay treatment of IgA nephropathy (IgAN) patients. In our recent Supportive versus immunosuppressive Therapy Of Progressive IgA Nephropathy (STOP-IgAN) trial, we systematically selected for patients at high risk of a progressive disease course and evaluated the effect of immunosuppression, combined with supportive care, on renal end points in these patients. There was a higher rate of full clinical remission and transient proteinuria reduction in immunosuppressed patients. However, deterioration of renal function (i.e. number of patients with an estimated glomerular filtration rate (eGFR) decrease of at least 15 mL/min over the 3-year trial phase) was remarkably slow in both groups, compared with previous studies, and was not slowed further by adding immunosuppression to supportive care. Here, we address several concerns raised about the design and interpretation of our trial. In our randomized patients, we confirmed that lower baseline proteinuria is predictive of clinical remission in IgAN. However, the observed transient drop in proteinuria in the immunosuppressed patients did not translate into an improved overall renal outcome in these patients. Although longer follow-up would be desirable, there was not even a trend for the eGFR course to diverge between our two treatment arms during the trial phase. Finally, it is important to note that we excluded specific infrequent patient groups during our run-in phase. Therefore, IgAN patients with a rapidly progressing course and those with persistent proteinuria >3.5 g/day would require further evaluation regarding potential benefits of immunosuppressive therapies.




Opponent's comments

2016-11-02T00:05:41-07:00




Moderator's view: Treatment of IgA nephropathy – getting comfortable with uncertainty

2016-11-02T00:05:41-07:00

A Polar Views discussion by Pozzi and Rauen et al. on the interpretation and clinical application of the recently published Supportive Versus Immunosuppressive Therapy of Progressive IgA Nephropathy (STOP-IgAN) trial has elucidated important points concerning potential strengths and weaknesses of this landmark randomized trial. This critical examination of the impact of steroid monotherapy or steroid plus an immunosuppressive (IS) agent compared with ‘supportive’ therapy with inhibitors of the renin–angiotensin system (RAS) has enhanced our appreciation of the importance of rigorous application of titrated RAS inhibition in high-risk patients with persistent proteinuria >0.75 g/day. At the same time, it brings a new level of uncertainty concerning the overall value and risk of steroid or steroid plus IS therapy in patients failing such ‘supportive’ therapy. Some of these uncertainties revolve around issues of study design, such as the duration of follow-up, participant stratification (particularly underlying pathology) and dosing regimens. It is hoped that additional trials, better methods of patient selection, improved surrogate end points and safer regimens will lead to less uncertainty over the best treatment practices. On balance, the STOP-IgAN trial raises some key concerns about the merits of steroid alone or steroid plus IS therapy for selected subjects with IgAN, but it does not by itself close the door on the utility of steroid monotherapy in subjects with high-risk IgAN, even as it further degrades the value of steroid plus IS, at least with sequential cyclophosphamide and azathioprine.




Midkine, a heparin-binding growth factor, and its roles in atherogenesis and inflammatory kidney diseases

2016-11-02T00:05:41-07:00

The heparin-binding protein midkine is a potent growth factor with emerging roles in numerous inflammatory diseases. Beyond its characterization in embryogenesis and organ development, ample insights into its function have been collected from experimental disease models using knockout animals or knockdown intervention strategies. Here, a comprehensive overview of midkine and its functions in atherogenesis and kidney diseases is provided. Molecular clues to key signalling pathways (Akt, ERK, HIF1α) and key events in atherosclerotic vessels link midkine expression with vascular smooth muscle proliferation and (neo)angiogenesis. In acute and chronic kidney diseases, midkine expression is upregulated in tubular as well as endothelial cells. Experimental disease models that mimic diabetic nephropathy and/or immunologic glomerular damage indicate dichotomous midkine activities, with cytoprotective as well as injurious effects. This review also pinpoints the commonalities of the disease models. An understanding of the underlying molecular events will be required in order to design targeted interventions in cardiovascular or renal diseases as well as inflammatory processes.




Sex hormones in women with kidney disease

2016-11-02T00:05:41-07:00

Menstrual disorders, infertility and premature menopause are common but often underrecognized phenomena among women with chronic kidney disease. Hypothalamic, rather than ovarian, dysfunction may be the cause of the abnormal reproductive milieu, which can be at least partially reversed by kidney transplantation and increased intensity of hemodialysis. Endogenous sex hormones, and specifically estradiol, appear to be renoprotective in women, although the effects of exogenous estradiol (as an oral contraceptive and postmenopausal hormone therapy) on kidney function are more controversial. Treatment with postmenopausal hormone therapy in women with end-stage kidney disease (ESKD) has been associated with improved quality of life, bone health and markers of cardiovascular risk, as well as an increased risk of arteriovenous access thrombosis. The selective estrogen receptor modulator raloxifene has been associated with both a decreased fracture risk and renoprotection in women with kidney disease. Young women with ESKD are more likely to die from infection or develop malignancy, suggesting an immunomodulatory role of estrogen. Whether the premature menopause commonly observed in female patients with kidney disease results in increased cardiovascular morbidity and mortality is unknown, although preliminary studies have suggested a possible therapeutic role for manipulation of the sex hormone milieu to mitigate risk in this population. Large, prospective, randomized studies examining the role of sex hormones in women with kidney disease are required to address the question.




Should chronic metabolic acidosis be treated in older people with chronic kidney disease?

2016-11-02T00:05:41-07:00

Metabolic acidosis is common in advanced chronic kidney disease and has been associated with a range of physiological derangements of importance to the health of older people. These include associations with skeletal muscle weakness, cardiovascular risk factors, and bone and mineral disorders that may lead to fragility fractures. Although metabolic acidosis is associated with accelerated decline in kidney function, end-stage renal failure is a much less common outcome in older, frail patients than cardiovascular death. Correction of metabolic acidosis using bicarbonate therapy is commonly employed, but the existing evidence is insufficient to know whether such therapy is of net benefit to older people. Bicarbonate is bulky and awkward to take, may impose additional sodium load with effects on fluid retention and blood pressure, and may cause gastrointestinal side effects. Trial data to date suggest potential benefits of bicarbonate therapy on progression of renal disease and nutrition, but trials have not as yet been published examining the effect of bicarbonate therapy across a range of domains relevant to the health of older people. Fortunately, a number of trials are now underway that should allow us to ascertain whether bicarbonate therapy can improve physical function, quality of life, and vascular, bone and kidney health in older people, and hence decide whether any benefits seen outweigh adverse effects and additional treatment burden in this vulnerable group of patients.




Genetic testing in steroid-resistant nephrotic syndrome: when and how?

2016-11-02T00:05:41-07:00

Steroid-resistant nephrotic syndrome (SRNS) represents the second most frequent cause of chronic kidney disease in the first three decades of life. It manifests histologically as focal segmental glomerulosclerosis (FSGS) and carries a 33% risk of relapse in a renal transplant. No efficient treatment exists. Identification of single-gene (monogenic) causes of SRNS has moved the glomerular epithelial cell (podocyte) to the center of its pathogenesis. Recently, mutations in >30 recessive or dominant genes were identified as causing monogenic forms of SRNS, thereby revealing the encoded proteins as essential for glomerular function. These findings helped define protein interaction complexes and functional pathways that could be targeted for treatment of SRNS. Very recently, it was discovered that in the surprisingly high fraction of ~30% of all individuals who manifest with SRNS before 25 years of age, a causative mutation can be detected in one of the ~30 different SRNS-causing genes. These findings revealed that SRNS and FSGS are not single disease entities but rather are part of a spectrum of distinct diseases with an identifiable genetic etiology. Mutation analysis should be offered to all individuals who manifest with SRNS before the age of 25 years, because (i) it will provide the patient and families with an unequivocal cause-based diagnosis, (ii) it may uncover a form of SRNS that is amenable to treatment (e.g. coenzyme Q10), (iii) it may allow avoidance of a renal biopsy procedure, (iv) it will further unravel the puzzle of pathogenic pathways of SRNS and (v) it will permit personalized treatment options for SRNS, based on genetic causation by way of ‘precision medicine’.




Innate immunity in CKD-associated vascular diseases

2016-11-02T00:05:42-07:00

Chronic kidney disease (CKD) is associated with an increased risk for cardiovascular events, and activation of the innate immune system plays an important role in this association. In contrast to adaptive immunity, non-specific recognition of conserved endogenous and exogenous structures by pattern recognition receptors (PRRs) is a key feature of innate immunity. Of these PRRs, Toll-like receptors (TLRs) as well as the inflammasome complex have been documented to be involved in the pathogenesis of cardiovascular diseases (CVDs). They are expressed not only in leukocytes but also in a variety of cell types such as endothelial cells and fibroblasts. While activation of TLRs on the cell surface leads to nuclear factor κB-dependent expression of pro-inflammatory mediators, the inflammasome is a cytosolic multimeric protein complex, which cleaves cytokines such as interleukin-1β into their biologically active forms. Several endogenous ligands for these PRRs have been identified as contributing to the development of a CKD-specific pro-inflammatory microenvironment. Notably, activation of TLRs as well as the inflammasome is associated with arterial hypertension, formation of atherosclerotic vascular lesions and vascular calcification. However, detailed molecular mechanisms on how the innate immune system contributes to CKD-associated CVDs are as yet poorly understood. Currently, several agents modulating the activation of the innate immune system are the focus of cardiovascular research. Large clinical studies will provide further information on the therapeutic applicability of these substances to reduce cardiovascular morbidity and mortality in the general population. Further trials including patients with CKD will be necessary to assess their effects on CKD-associated CVD.







Resveratrol delays polycystic kidney disease progression through attenuation of nuclear factor κB-induced inflammation

2016-11-02T00:05:42-07:00

Background

Inflammation plays an important role in polycystic kidney disease (PKD). The current study aimed to examine the efficacy of the anti-inflammatory compound resveratrol in PKD and to investigate its underlying mechanism of action.

Methods

Male Han:SPRD (Cy/+) rats with PKD were treated with 200 mg/kg/day resveratrol or vehicle by gavage for 5 weeks. Human autosomal dominant (AD) PKD cells, three-dimensional (3D) Madin-Darby canine kidney cells and zebrafish were treated with various concentrations of resveratrol or the nuclear factor κB (NF-κB) inhibitor QNZ.

Results

Resveratrol treatment reduced blood urea nitrogen levels and creatinine levels by 20 and 24%, respectively, and decreased two-kidney/total body weight ratio by 15% and cyst volume density by 24% in Cy/+ rats. The proliferation index and the macrophage infiltration index were reduced by 40 and 43%, respectively, in resveratrol-treated cystic kidneys. Resveratrol reduced the levels of the pro-inflammatory factors monocyte chemoattractant protein-1 (MCP-1), tumor necrosis factor-α (TNF-α) and complement factor B (CFB) in Cy/+ rat kidneys in parallel with the decreased activity of NF-κB (p50/p65). The activation of NF-κB and its correlation with pro-inflammatory factor expression were confirmed in human ADPKD cells and kidney tissues. Resveratrol and QNZ inhibited the expression of MCP-1, TNF-α and CFB and reduced NF-κB activity in ADPKD cells. Moreover, NF-κB blockade minimized the inhibition of inflammatory factor production by resveratrol treatment. Furthermore, resveratrol or QNZ inhibited cyst formation in the 3D cyst and zebrafish models.

Conclusions

The NF-κB signaling pathway is activated and partly responsible for inflammation in polycystic kidney tissues. Targeting inflammation through resveratrol could be a new strategy for PKD treatment in the future.




Impact of individual intravenous iron preparations on the differentiation of monocytes towards macrophages and dendritic cells

2016-11-02T00:05:42-07:00

Background

Treatment of iron deficiency with intravenous (i.v.) iron is a first-line strategy to improve anaemia of chronic kidney disease. Previous in vitro experiments demonstrated that different i.v. iron preparations inhibit differentiation of haematopoietic stem cells to monocytes, but their effect on monocyte differentiation to macrophages and mature dendritic cells (mDCs) has not been assessed. We investigated substance-specific effects of iron sucrose (IS), sodium ferric gluconate (SFG), ferric carboxymaltose (FCM) and iron isomaltoside 1000 (IIM) on monocytic differentiation to M1/M2 macrophages and mDCs.

Methods

Via flow cytometry and microRNA (miRNA) expression analysis, we morphologically and functionally characterized monocyte differentiation to M1/M2 macrophages and mDCs after monocyte stimulation with IS, SFG, FCM and IIM (0.133, 0.266 and 0.533 mg/mL, respectively). To assess potential clinical implications, we compared monocytic phagocytosis capacity in dialysis patients who received either 500 mg IS or IIM.

Results

Phenotypically, IS and SFG dysregulated the expression of macrophage (e.g. CD40, CD163) and mDC (e.g. CD1c, CD141) surface markers. Functionally, IS and SFG impaired macrophage phagocytosis capacity. Phenotypic and functional alterations were less pronounced with FCM, and virtually absent with IIM. In miRNA expression analysis of mDCs, IS dysregulated miRNAs such as miR-146b-5p and miR-155-5p, which are linked to Toll-like receptor and mitogen-activated protein kinase signalling pathways. In vivo, IS reduced monocytic phagocytosis capacity within 1 h after infusion, while IIM did not.

Conclusions

This study demonstrates that less stable i.v. iron preparations specifically affect monocyte differentiation towards macrophages and mDCs.




A simple care bundle for use in acute kidney injury: a propensity score-matched cohort study

2016-11-02T00:05:42-07:00

Background

Consensus guidelines for acute kidney injury (AKI) have recommended prompt treatment including attention to fluid balance, drug dosing and avoidance of nephrotoxins. These simple measures can be incorporated in a care bundle to facilitate early implementation. The objective of this study was to assess the effect of compliance with the AKI care bundle (AKI-CB) on in-hospital case–fatality and AKI progression.

Methods

In this larger, propensity score-matched cohort of multifactorial AKI, we examined the impact of compliance with an AKI-CB in 3717 consecutive episodes of AKI in 3518 patients between 1 August 2013 and 31 January 2015. Propensity score matching was performed to match 939 AKI events where the AKI-CB was completed with 1823 AKI events where AKI-CB was not completed.

Results

The AKI-CB was completed in 25.6% of patients within 24 h. The unadjusted case–fatality was higher when the AKI-CB was not completed versus when the AKI-CB was completed (24.4 versus 20.4%, P = 0.017). In multivariable analysis, AKI-CB completion within 24 h was associated with lower odds for in-hospital death [odds ratio (OR): 0.76; 95% confidence interval (95% CI): 0.62–0.92]. Increasing age (OR: 1.04; 95% CI: 1.03–1.05), hospital-acquired AKI (OR: 1.28; 95% CI: 1.04–1.58), AKI stage 2 (OR: 1.91; 95% CI: 1.53–2.39) and increasing Charlson's comorbidity index (CCI) [OR: 3.31 (95% CI: 2.37–4.64) for CCI of more than 5 compared with zero] had higher odds for death, whereas AKI during elective admission was associated with lower odds for death (OR: 0.29; 95% CI: 0.16–0.52). Progression to higher AKI stages was lower when the AKI-CB was completed (4.2 versus 6.7%, P = 0.02).

Conclusions

Compliance with an AKI-CB was associated with lower mortality and reduced progression of AKI to higher stages. The AKI-CB is simple and inexpensive, and could therefore be applied in all healthcare settings to improve outcomes.




Atherosclerotic renal artery stenosis is associated with elevated cell cycle arrest markers related to reduced renal blood flow and postcontrast hypoxia

2016-11-02T00:05:42-07:00

Background

Atherosclerotic renal artery stenosis (ARAS) reduces renal blood flow (RBF), ultimately leading to kidney hypoxia and inflammation. Insulin-like growth factor binding protein-7 (IGFBP-7) and tissue inhibitor of metalloproteinases-2 (TIMP-2) are biomarkers of cell cycle arrest, often increased in ischemic conditions and predictive of acute kidney injury (AKI). This study sought to examine the relationships between renal vein levels of IGFBP-7, TIMP-2, reductions in RBF and postcontrast hypoxia as measured by blood oxygen level–dependent (BOLD) magnetic resonance imaging.

Methods

Renal vein levels of IGFBP-7 and TIMP-2 were obtained in an ARAS cohort (n = 29) scheduled for renal artery stenting and essential hypertensive (EH) healthy controls (n = 32). Cortical and medullary RBFs were measured by multidetector computed tomography (CT) immediately before renal artery stenting and 3 months later. BOLD imaging was performed before and 3 months after stenting in all patients, and a subgroup (n = 12) underwent repeat BOLD imaging 24 h after CT/stenting to examine postcontrast/procedure levels of hypoxia.

Results

Preintervention IGFBP-7 and TIMP-2 levels were elevated in ARAS compared with EH (18.5 ± 2.0 versus 15.7 ± 1.5 and 97.4 ± 23.1 versus 62.7 ± 9.2 ng/mL, respectively; P < 0.0001); baseline IGFBP-7 correlated inversely with hypoxia developing 24 h after contrast injection (r = –0.73, P < 0.0001) and with prestent cortical blood flow (r = –0.59, P = 0.004).

Conclusion

These data demonstrate elevated IGFBP-7 and TIMP-2 levels in ARAS as a function of the degree of reduced RBF. Elevated baseline IGFBP-7 levels were associated with protection against postimaging hypoxia, consistent with ‘ischemic preconditioning’. Despite contrast injection and stenting, AKI in these high-risk ARAS subjects with elevated IGFBP-7/TIMP-2 was rare and did not affect long-term kidney function.




Primary care physicians' perceived barriers, facilitators and strategies to enhance conservative care for older adults with chronic kidney disease: a qualitative descriptive study

2016-11-02T00:05:42-07:00

Background

Although primary care physicians (PCPs) are often responsible for the routine care of older adults with chronic kidney disease (CKD), there is a paucity of evidence regarding their perspectives and practice of conservative (non-dialysis) care. We undertook a qualitative study to describe barriers, facilitators and strategies to enhance conservative, non-dialysis, CKD care by PCPs in the community.

Methods

Semi-structured telephone and face-to-face interviews were conducted with PCPs from Alberta, Canada. Participants were identified using a snowball sampling strategy and purposively sampled based on sex, age and rural/urban location of clinical practice. Eligible participants had managed at least one patient ≥75 years with Stage 5 CKD (estimated glomerular filtration rate <15 mL/min/1.73 m2, not on dialysis) in the prior year. Participant recruitment ceased when data saturation was reached. Transcripts were analyzed thematically using conventional content analysis.

Results

In total, 27 PCPs were interviewed. The majority were male (15/27), were aged 40–60 years (15/27) and had practiced in primary care for >20 years (14/27). Perceived barriers to conservative CKD care included: managing expectations of kidney failure for patients and their families; dealing with the complexity of medical management of patients requiring conservative care; and challenges associated with managing patients jointly with specialists. Factors that facilitated conservative CKD care included: establishing patient/family expectations early; preserving continuity of care; and utilizing a multidisciplinary team approach. Suggested strategies for improving conservative care included having: direct telephone access to clinicians familiar with conservative care; treatment decision aids for patients and their families; and a conservative care clinical pathway to guide management.

Conclusions

PCPs identified important barriers and facilitators to conservative care for their older patients with Stage 5 CKD. Further investigation of potential strategies that address barriers and enable facilitators is required to improve the quality of conservative care for older adults in the community.




Selective screening for distal renal tubular acidosis in recurrent kidney stone formers: initial experience and comparison of the simultaneous furosemide and fludrocortisone test with the short ammonium chloride test

2016-11-02T00:05:42-07:00

Background

Distal renal tubular acidosis (dRTA) is associated with renal stone disease, and it often needs to be considered and excluded in recurrent calcium kidney stone formers (KSFs). However, a diagnosis of dRTA, especially when ‘incomplete’, can be missed and needs to be confirmed by a urinary acidification (UA) test. The gold standard reference test is still the short ammonium chloride (NH4Cl) test, but it is limited by gastrointestinal side effects and occasional failure to ingest sufficient NH4Cl. For this reason, the furosemide plus fludrocortisone (F+F) test has been proposed as an easier and better-tolerated screening test. The aim of the present study was to assess the usefulness of the F+F test as a clinical screening tool for dRTA in a renal stone clinic.

Methods

We studied 124 patients retrospectively in whom incomplete dRTA was suspected: 71 had kidney stones only, 9 had nephrocalcinosis only and 44 had both. A total of 158 UA tests were performed: 124 F+F and 34 NH4Cl; both tests were completed in 34 patients.

Results

The mean age was 45.4 ± 15 years, and 49% of patients were male. The prevalence of complete and incomplete dRTAs was 7 and 13.7%, respectively. Of the 34 patients tested using both tests, 17 (50%) were abnormal by both and 4 (12%) were normal by both. Thirteen (39%) patients were abnormal by F+F, but normal by NH4Cl [sensitivity 100% (95% CI 80–100), specificity 24% (95% CI 7–50), positive predictive value 57% (95% CI 37–75), negative predictive value 100% (95% CI 40–100)].

Conclusions

The F+F test is characterized by an excellent sensitivity and negative predictive value, and the diagnosis of incomplete dRTA can be excluded reliably in a patient who acidifies their urine normally with this test. However, its lack of specificity is a drawback, and if there is any doubt, an abnormal F+F test may need to be confirmed by a follow-up NH4Cl test. Ideally, a prospective blinded study in unselected KSFs is needed to accurately assess the reliability of the F+F test in diagnosing, rather than excluding, dRTA.
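For readers who want to check the reported accuracy figures, they follow directly from the 2×2 counts implied by the Results above, with the NH4Cl test treated as the reference standard. The short TypeScript sketch below is ours, not the authors'; the counts come from the abstract, the variable names are illustrative and the confidence intervals are not recomputed.

// Minimal sketch: diagnostic accuracy of the F+F test versus the NH4Cl reference test.
const truePositive = 17;   // abnormal by both F+F and NH4Cl
const falsePositive = 13;  // abnormal by F+F but normal by NH4Cl
const trueNegative = 4;    // normal by both tests
const falseNegative = 0;   // normal by F+F but abnormal by NH4Cl

const pct = (x: number): string => `${(100 * x).toFixed(0)}%`;

console.log('sensitivity', pct(truePositive / (truePositive + falseNegative))); // 100%
console.log('specificity', pct(trueNegative / (trueNegative + falsePositive))); // 24%
console.log('PPV', pct(truePositive / (truePositive + falsePositive)));         // 57%
console.log('NPV', pct(trueNegative / (trueNegative + falseNegative)));         // 100%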




Development and initial validation of prescribing quality indicators for patients with chronic kidney disease

2016-11-02T00:05:42-07:00

Background

Quality assessment is a key element for improving the quality of care. Currently, a comprehensive indicator set for measuring the quality of medication treatment in patients with chronic kidney disease (CKD) is lacking. Our aim was to develop and validate a set of prescribing quality indicators (PQIs) for CKD care, and to test the feasibility of applying this set in practice.

Methods

Potential indicators were based on clinical practice guidelines and evaluated using the RAND/UCLA Appropriateness Method. This is a structured process in which an expert panel assesses the validity of the indicators. Feasibility was tested in a Dutch primary care database including >4500 diabetes patients with CKD.

Results

An initial list of 22 PQIs was assessed by 12 experts. After changing 10 PQIs, adding 2 and rejecting 8, a final list of 16 indicators was accepted by the expert panel as valid. These PQIs focused on the treatment of hypertension, albuminuria, mineral and bone disorder, statin prescribing and possible unsafe medication. The indicators were successfully applied to measure treatment quality in the primary care database, but for some indicators the number of eligible patients was too small for reliable calculation. Results showed that there was room for improvement in the treatment quality of this population.

Conclusions

We developed a set of 16 PQIs for measuring the quality of treatment in CKD patients, which had sufficient content and face validity as well as operational feasibility. These PQIs can be used to point out priority areas for improvement.




Albuminuria and tolvaptan in autosomal-dominant polycystic kidney disease: results of the TEMPO 3:4 Trial

2016-11-02T00:05:42-07:00

Background

The TEMPO 3:4 Trial results suggested that tolvaptan had no effect compared with placebo on albuminuria in autosomal-dominant polycystic kidney disease (ADPKD) patients. However, the use of categorical ‘albuminuria events’ may have resulted in a loss of sensitivity to detect changes. The aim of this study is to investigate the effects of tolvaptan on albuminuria as a continuous variable.

Methods

Post hoc analysis of a 3-year prospective, blinded randomized controlled trial, including 1375 ADPKD patients. Albuminuria was measured in a spot morning urine sample prior to tolvaptan dosing and expressed as albumin-to-creatinine ratio (ACR).

Results

Baseline median (interquartile range) ACR was 3.2 (1.7–7.1) mg/mmol. Of note, 47.9% of ADPKD patients had normal, 48.7% moderately increased and 3.4% severely increased ACR. Subjects with higher baseline ACR had higher blood pressure and total kidney volume (TKV) and lower estimated glomerular filtration rate (eGFR). During follow-up, higher baseline ACR was associated with more rapid eGFR loss (P < 0.0001 for trend), but not with rate of growth in TKV. During the 3-year trial, ACR rose in placebo- and decreased in tolvaptan-treated patients (+0.23 versus –0.40 mg/mmol). The difference in ACR increased over time, reaching a maximum of 24% at Month 36 (P < 0.001). At that time only a minor difference in blood pressure was observed (mean arterial pressure –1.9 mmHg for tolvaptan). The decrease in ACR was similar in all subgroups investigated, and remained after withdrawal of study drug. The beneficial effect of tolvaptan on TKV growth and eGFR loss was stronger in patients with higher baseline ACR.

Conclusions

In ADPKD, higher baseline albuminuria was associated with more eGFR loss. Tolvaptan decreased albuminuria compared with placebo, independent of blood pressure. Treatment efficacy of tolvaptan on changes in TKV and eGFR was more readily detected in patients with higher albuminuria.




Association between low birth weight and childhood-onset chronic kidney disease in Japan: a combined analysis of a nationwide survey for paediatric chronic kidney disease and the National Vital Statistics Report

2016-11-02T00:05:42-07:00

Background

Although numerous epidemiological surveys performed across several continents and ethnic groups have linked low birth weight (LBW) to increased risk of chronic kidney disease (CKD) in adulthood, the effects of birth weight and prematurity on development of CKD in childhood have not been clearly established.

Methods

Data on sex, LBW incidence and gestational age were compared between paediatric CKD cases and a control group. Paediatric CKD cases were obtained from a nationwide survey conducted by the Pediatric CKD Study Group in Japan. The population attributable fraction was calculated to evaluate the effects of reducing the prevalence of LBW infants (LBWI).

Results

Of 447 individuals born between 1993 and 2010 who fulfilled the eligibility criteria, birth weight data were obtained for 381 (85.2%) (231 boys and 150 girls), 106 (27.8%) of whom were LBWI. The proportion of LBWI in the general population during the same period was much lower (8.6%). Accordingly, the risk ratio (RR) for paediatric CKD was significantly higher in the LBW group [crude RR: 4.10; 95% confidence interval (CI) 3.62–5.01], and the population attributable fraction of LBW for paediatric CKD was 21.1% (95% CI 16.0–26.1%). In addition, 82 patients (21.9%) with paediatric CKD were born prematurely (before 37 weeks of gestation), and as with LBW, a strong correlation was observed between prematurity and CKD (RR: 4.73; 95% CI 3.91–5.73).

Conclusions

Both birth weight and gestational age were strongly associated with childhood-onset CKD in this study.




Lipoprotein(a) concentrations, apolipoprotein(a) isoforms and clinical endpoints in haemodialysis patients with type 2 diabetes mellitus: results from the 4D Study

2016-11-02T00:05:42-07:00

Background

High lipoprotein(a) [Lp(a)] concentrations and low molecular weight (LMW) apolipoprotein(a) [apo(a)] isoforms are associated with cardiovascular disease and mortality in the general population. We examined the association of both with all-cause mortality and cardiovascular endpoints in haemodialysis patients with diabetes mellitus.

Methods

This is a post hoc analysis of the prospective 4D Study (German Diabetes Dialysis Study) that evaluated atorvastatin compared with placebo in 1255 haemodialysis patients with type 2 diabetes mellitus (median follow-up 4 years). The association of natural logarithm-transformed Lp(a) concentrations (per one-unit increment) and apo(a) isoforms with outcomes was analysed by Cox proportional hazards regression. The influence of age (median 66 years) was evaluated by stratified survival analyses.

Results

The median baseline Lp(a) concentration was 11.5 mg/dL (IQR 5.0–41.8). A quarter of patients had at least one LMW apo(a) isoform. Increased Lp(a) concentrations were associated with all-cause mortality in the total group [hazard ratio (HR) 1.09 (95% CI 1.03–1.16), P = 0.004]. LMW apo(a) isoforms were only associated with all-cause mortality in patients ≤ 66 years [HR 1.38 (95% CI 1.05–1.80), P = 0.02]. The strongest association for Lp(a) concentrations and LMW apo(a) isoforms was found for death due to infection in patients ≤ 66 years [HR 1.39 (95% CI 1.14–1.71), P = 0.001; HR 2.17 (95% CI 1.26–3.75), P = 0.005]. Lp(a) concentrations were also associated with fatal stroke in patients ≤66 years of age [HR 1.54 (95% CI 1.05–2.24), P = 0.03]. Neither Lp(a) nor LMW apo(a) isoforms were associated with other atherosclerosis-related events.

Conclusions

High Lp(a) concentrations and LMW apo(a) isoforms are risk predictors for all-cause mortality and death due to infection in haemodialysis patients with diabetes mellitus. These associations are modified by age.




A novel COL4A1 frameshift mutation in familial kidney disease: the importance of the C-terminal NC1 domain of type IV collagen

2016-11-02T00:05:42-07:00

Background

Hereditary microscopic haematuria often segregates with mutations of COL4A3, COL4A4 or COL4A5 but in half of families a gene is not identified. We investigated a Cypriot family with autosomal dominant microscopic haematuria with renal failure and kidney cysts.

Methods

We used genome-wide linkage analysis, whole exome sequencing and cosegregation analyses.

Results

We identified a novel frameshift mutation, c.4611_4612insG:p.T1537fs, in exon 49 of COL4A1. This mutation predicts truncation of the protein with disruption of the C-terminal part of the NC1 domain. We confirmed its presence in 20 family members, 17 with confirmed haematuria, 5 of whom also had stage 4 or 5 chronic kidney disease. Eleven family members exhibited kidney cysts (55% of those with the mutation), but muscle cramps or cerebral aneurysms were not observed and serum creatine kinase was normal in all individuals tested.

Conclusions

Missense mutations of COL4A1 within the region encoding the CB3 [IV] segment of the triple helical domain (exons 24 and 25) are associated with HANAC syndrome (hereditary angiopathy, nephropathy, aneurysms and cramps). Missense mutations of COL4A1 that disrupt the NC1 domain are associated with antenatal cerebral haemorrhage and porencephaly, but not kidney disease. Our findings extend the spectrum of COL4A1 mutations linked with renal disease and demonstrate that the highly conserved C-terminal part of the NC1 domain of the α1 chain of type IV collagen is important for the integrity of the glomerular basement membrane in humans.




Pregnancy in dialysis patients in the new millennium: a systematic review and meta-regression analysis correlating dialysis schedules and pregnancy outcomes

2016-11-02T00:05:42-07:00

Background

Advances have been made in the management of pregnancies in women receiving dialysis; however, single-centre studies and small numbers of cases have so far precluded a clear definition of the relationship between dialysis schedules and pregnancy outcomes. The aim of the present systematic review was to analyse the relationship between dialysis schedule and pregnancy outcomes in pregnancies in chronic dialysis in the new millennium.

Methods

Medline–PubMed, Embase and the Cochrane library were searched (1 January 2000–31 December 2014: MESH, Emtree, free terms on pregnancy and dialysis). A separate analysis was performed for case series (more than five cases) and case reports. Meta-regression was performed in case series dealing with the larger subset of haemodialysis (HD) patients; case reports were analysed separately [according to peritoneal dialysis (PD) versus HD; conception before or during dialysis].

Results

We obtained 190 full texts and 25 congress abstracts from 2048 references. We selected 101 full papers and 25 abstracts (36 series; 90 case reports), for a total of 681 pregnancies in 647 patients. In the case series (574 pregnancies in 543 patients), preterm delivery was extremely frequent (83%). Meta-regression analysis showed a significant relationship between hours of dialysis per week in HD and both preterm delivery (<37 gestational weeks: P = 0.044; r2 = 0.22) and small for gestational age (SGA) babies (P = 0.017; r2 = 0.54). SGA was closely associated with the number of dialysis sessions per week (P = 0.003; r2 = 0.84). Case report analysis suggests a lower incidence of SGA on HD versus PD (31 versus 66.7%; P = 0.015). No evidence of an increased risk of congenital abnormality was found in the retrieved papers.

Conclusions

Data on pregnancy on dialysis are heterogeneous but rapidly accumulating; the main determinant of outcomes on HD is the dialysis schedule. The differences between PD and HD should be further analysed.




Solute Solver ‘what if’ module for modeling urea kinetics

2016-11-02T00:05:42-07:00

Background

The publicly available Solute Solver module allows calculation of a variety of two-pool urea kinetic measures of dialysis adequacy using pre- and postdialysis plasma urea and estimated dialyzer clearance or estimated urea distribution volumes as inputs. However, the existing program does not have a ‘what if’ module, which would estimate the plasma urea values as well as commonly used measures of hemodialysis adequacy for a patient with a given urea distribution volume and urea nitrogen generation rate dialyzed according to a particular dialysis schedule.

Methods

Conventional variable extracellular volume 2-pool urea kinetic equations were used.

Results

A JavaScript/HTML web form was created that can be used on any personal computer with a web browser to compute commonly used Kt/V-based measures of hemodialysis adequacy for patients with differing amounts of residual kidney function and following a variety of treatment schedules.

Conclusions

The completed Web form calculator may be particularly useful in computing equivalent continuous clearances for incremental hemodialysis strategies.
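The Solute Solver equations themselves are not reproduced in this feed. Purely to illustrate the kind of adequacy arithmetic such a calculator automates, the sketch below implements the widely used second-generation Daugirdas single-pool Kt/V estimate rather than the two-pool, variable-volume model described above; the interface, function name and example values are hypothetical.

// Illustrative sketch only: second-generation Daugirdas single-pool Kt/V estimate.
// This is NOT the two-pool, variable-volume Solute Solver model described in the abstract.
interface HdSession {
  preBun: number;            // pre-dialysis blood urea nitrogen (mg/dL)
  postBun: number;           // post-dialysis blood urea nitrogen (mg/dL)
  hours: number;             // session length (h)
  ultrafiltrationL: number;  // fluid removed during the session (L)
  postWeightKg: number;      // post-dialysis body weight (kg)
}

function singlePoolKtV(s: HdSession): number {
  const r = s.postBun / s.preBun; // post/pre urea ratio
  // spKt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF / W
  return -Math.log(r - 0.008 * s.hours) +
    (4 - 3.5 * r) * (s.ultrafiltrationL / s.postWeightKg);
}

// Hypothetical example: pre 70 and post 25 mg/dL BUN, 4 h session, 2 L ultrafiltration, 70 kg
console.log(singlePoolKtV({
  preBun: 70, postBun: 25, hours: 4, ultrafiltrationL: 2, postWeightKg: 70,
}).toFixed(2)); // ~1.20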




Impact of the Banff 2013 classification on the diagnosis of suspicious versus conclusive late antibody-mediated rejection in allografts without acute dysfunction

2016-11-02T00:05:42-07:00

Background

The Banff classification is used worldwide to characterize pathological findings in renal allograft biopsies. During the 11th Banff meeting, relevant changes were introduced in the diagnostic criteria for Category 2 antibody-mediated rejection (ABMR). Here, we assess the effect of these changes on the diagnosis of late chronic ABMR.

Methods

Seventy-three indication renal graft biopsies (chronic dysfunction, proteinuria and/or the presence of de novo donor-specific antibodies) from 68 kidney transplant recipients initially classified following the Banff 2009 criteria were reviewed and reclassified as per the new Banff 2013 criteria.

Results

The diagnostic category changed in 18% of the study biopsies with Banff 2013. The reclassification mainly involved Category 2 cases, of which 23.5% (biopsies from older patients with worse graft function) had been overlooked by Banff 2009. ABMR was ruled out in 13% of cases under the Banff 2009 criteria. A significant number of the study samples were conclusively diagnosed as ABMR (40% as per Banff 2009 and 74% as per Banff 2013; P = 0.006), because of the inclusion of microvascular inflammation and the acceptance of some ultrastructural diagnostic criteria. However, when following the criteria of the new classification, samples with histological signs of chronic ABMR, in which human leucocyte antigen donor-specific antibodies are not detected or ultrastructural studies are not performed, may be inadequately characterized.

Conclusions

The Banff 2013 classification helps in making a diagnosis of late ABMR, identifying cases, decreasing the percentage of suspected ABMR and making more conclusive diagnoses.




Epidemiology and management of hypertension in paediatric and young adult kidney transplant recipients in The Netherlands

2016-11-02T00:05:42-07:00

Introduction

Hypertension in kidney transplant recipients (KTRs) is a risk factor for cardiovascular mortality and graft loss. Data on the prevalence of hypertension and uncontrolled hypertension (uHT) in paediatric and young adult KTRs are scarce. Also, it is unknown whether ‘transition’ (the transfer from paediatric to adult care) influences control of hypertension. We assessed the prevalence of hypertension and uHT among Dutch paediatric and young adult KTRs and analysed the effects of transition. Additionally, we made an inventory of variations in treatment policies in Dutch transplant centres.

Methods

Cross-sectional and longitudinal national data from living KTRs ≤30 years of age (≥1-year post-transplant, eGFR >20 mL/min) were extracted from the ‘RICH Q’ database, which comprises information about all Dutch KTRs <19 years of age, and the Netherlands Organ Transplant Registry database for adult KTRs (≥18–30 years of age). We used both upper-limit blood pressure (BP) thresholds for treatment according to Kidney Disease: Improving Global Outcomes (KDIGO) guidelines. uHT was defined as a BP above the threshold. A questionnaire on treatment policies was sent to paediatric and adult nephrologists at eight Dutch transplant centres.

Results

Hypertension and uHT were more prevalent in young adult KTRs (86.4 and 75.8%) than in paediatric KTRs (62.7 and 38.3%) according to the KDIGO definition. Time after transplantation was comparable between these groups. Longitudinal analysis showed no evidence of effect of transition on systolic BP or prevalence of uHT. Policies vary considerably between and within centres on the definition of hypertension, BP measurement and antihypertensive treatment.

Conclusion

Average BP in KTRs increases continuously with age between 6 and 30 years. Young adult KTRs have significantly more uHT than paediatric KTRs according to KDIGO guidelines. Transition does not influence the prevalence of uHT.




Pregnancy outcomes after kidney graft in Italy: are the changes over time the result of different therapies or of different policies? A nationwide survey (1978-2013)

2016-11-02T00:05:42-07:00

Background

Kidney transplantation is the treatment of choice to restore fertility to women on renal replacement therapy. Over time, immunosuppressive, support therapies and approaches towards high-risk pregnancies have changed. The aim of this study was to analyse maternal–foetal outcomes in two cohorts of transplanted women who delivered a live-born baby in Italy in 1978–2013, dichotomized into delivery before and after January 2000.

Methods

A survey involving all the Italian transplant centres was carried out, gathering data on all pregnancies recorded since the start of activity at each centre; the estimated nationwide coverage was 75%. Data on cause of ESRD, dialysis, living/cadaveric transplantation, drug therapy, comorbidity, and the main maternal–foetal outcomes were recorded and reviewed. Data were compared with a low-risk cohort of pregnancies from two large Italian centres (2000–14; Torino and Cagliari Observational Study cohort).

Results

The database consists of 222 pregnancies with live-born babies after transplantation (83 before 2000 and 139 in 2000–13; 68 and 121 with baseline and birth data, respectively), and 1418 low-risk controls. The age of the patients significantly increased over time (1978–99: age 30.7 ± 3.7 versus 34.1 ± 3.7 in 2000–13; P < 0.001). Azathioprine, steroids and cyclosporine A were the main drugs employed in the first time period, while tacrolimus emerged in the second. The prevalence of early preterm babies increased from 13.4% in the first to 27.1% in the second period (P = 0.049), while late-preterm babies non-significantly decreased (38.8 versus 33.1%), thus leaving the prevalence of all preterm babies almost unchanged (52.2 and 60.2%; P = 0.372). Babies below the 5th percentile decreased over time (22.2 versus 9.6%; P = 0.036). In spite of high prematurity rates, no neonatal deaths occurred after 2000. The results in kidney transplant patients are significantly different from controls both considering all cases [preterm delivery: 57.3 versus 6.3%; early preterm: 22.2 versus 0.9%; small for gestational age (SGA): 14 versus 4.5%; P < 0.001] and considering only transplant patients with normal kidney function [preterm delivery: 35 versus 6.3%; early preterm: 10 versus 0.9%; SGA: 23.7 versus 4.5% (P < 0.001); risks increase across CKD stages]. Kidney function remained stable in most of the patients up to 6 months after delivery. Multiple regression analysis performed on the transplant cohort highlights a higher risk of preterm delivery in later CKD stages, an increase in preterm delivery and a decrease in SGA across periods.

Conclusions

Pregnancy after transplantation has a higher risk of adverse outcomes compared with the general population. Over time, the incidence of SGA babies decreased while the incidence of ‘early preterm’ babies increased. Although acknowledging the differences in therapy (cyclosporine versus tacrolimus) and in maternal age (significantly increased), the decrease in SGA and the increase in prematurity may be explained by an obstetric policy favouring earlier delivery against the risk of foetal growth restriction. [...]



Announcements

2016-11-02T00:05:42-07:00




Cum grano salis

2016-09-28T00:05:55-07:00







Pro: The use of calcineurin inhibitors in the treatment of lupus nephritis

2016-09-28T00:05:55-07:00

Renal disease in systemic lupus erythematosus (SLE) carries significant morbidity and mortality. Cyclophosphamide (CYC)- and mycophenolate mofetil (MMF)–based induction regimens are not ideal in terms of efficacy and toxicity. The adverse effects of CYC, such as infection risk, infertility, urotoxicity and oncogenicity, limit its use in lupus nephritis. Although MMF is non-inferior to CYC as induction therapy and has reduced gonadal toxicity and oncogenic potential, meta-analyses of clinical trials do not show a lower rate of infective and gastrointestinal complications. Tacrolimus (TAC) has recently been shown to have equal efficacy to either MMF or CYC for inducing remission of lupus nephritis. A low-dose combination of MMF and TAC appears to be more effective than intravenous CYC pulses in Chinese patients, and has potential to replace the more toxic CYC regimens in high-risk subgroups. TAC may be considered as another non-CYC alternative for induction therapy of lupus nephritis and in those with refractory disease or intolerance to CYC or MMF. TAC has no negative effect on fertility in younger women, and unlike MMF and CYC, it is safe in pregnancy. However, TAC has a narrow therapeutic window and drug level monitoring is required to ensure drug exposure and minimize acute toxicities. Current evidence for the efficacy of TAC in lupus nephritis is limited to 6 months and the incidence of renal flare after discontinuation of therapy or switching to azathioprine appears to be higher than other induction agents. Long-term data and the incidence of chronic nephrotoxicity of TAC as maintenance therapy in lupus nephritis are currently lacking and further prospective trials are needed to address these issues.




Opponent's comments

2016-09-28T00:05:55-07:00




Con: The use of calcineurin inhibitors in the treatment of lupus nephritis

2016-09-28T00:05:55-07:00

Lupus nephritis (LN) therapy is limited by incomplete efficacy and treatment toxicity, and LN patients suffer high risks of renal and cardiovascular morbidity and mortality. Calcineurin inhibitors (CNIs) have been used for >30 years in LN treatment and are an established alternative therapy for Class V nephritis, but uncertainty remains about their role in proliferative disease or in the maintenance of remission. More recently, the combination of CNIs with mycophenolate mofetil (MMF) and glucocorticoids (‘multitarget’ therapy), and the use of tacrolimus as opposed to ciclosporin, have received attention. Is the evidence now sufficient to support the routine use of regimens including CNIs in LN? Although CNIs appear to have similar efficacy to MMF-based regimens as induction therapy, and are comparable with azathioprine as maintenance treatment, CNI toxicities, such as new-onset hypertension, hyperglycaemia and nephrotoxicity, have been problematic. Multitarget therapy improves the rate of complete remission in short-term studies, but whether this benefit is maintained over the longer term is uncertain. However, patient tolerability is lower and the frequency of serious events is higher in multitarget versus cyclophosphamide-based regimens, and there is a paucity of evidence from non-Asian ethnic groups. CNI-based therapy is also complicated by the absence of standardized dosing and the need for drug level monitoring, as well as by pharmacogenetic differences. Also, multitarget therapy increases the complexity and the cost of treatment. There is insufficient evidence to support the routine use of CNI-based or multitarget therapy for proliferative LN. Further data on long-term renal and cardiovascular outcomes and strategies to improve tolerability and safety are required.




Opponent's comments

2016-09-28T00:05:55-07:00




Moderator's view: The use of calcineurin inhibitors in the treatment of lupus nephritis

2016-09-28T00:05:55-07:00

Lupus nephritis (LN) is one of the most severe manifestations of systemic lupus erythematosus (SLE), affecting ~50% of patients, and both renal disease and treatment-related toxicity contribute to significant morbidity and mortality. Although our understanding of the aetiopathogenesis of LN is improving, treatment still remains a challenge, with the achievement of complete remission at 1 year in <50% of patients treated with current standard of care immunosuppressive therapy; this is associated with considerable short- and long-term side effects, some of which further contribute to non-adherence. Calcineurin inhibitors (CNIs) have been successfully used in organ transplantation and there is increasing evidence that cyclosporin A (CSA), and especially tacrolimus (TAC), are also effective in the treatment of LN. Randomised controlled trials showed similar efficacy for TAC when compared with mycophenolate mofetil (MMF), while multitarget therapy, including TAC and low-dose MMF, resulted in significantly more complete remissions and overall responses compared with intravenous cyclophosphamide (CYC). Flares are observed in up to 45% of patients with LN, and an increase in relapse rate following induction with CNIs may be an issue. Most studies on this matter have been restricted to patients from Asia, and studies in more balanced cohorts are desirable. Moreover, there is a need to understand and determine the long-term effects of CNIs on renal function, proteinuria and comorbidities, with a special focus on cardiovascular risk. In this ‘Pros and Cons’ debate, the potential benefits and disadvantages of CNIs in the treatment of LN will be critically highlighted.




Targeting the podocyte cytoskeleton: from pathogenesis to therapy in proteinuric kidney disease

2016-09-28T00:05:55-07:00

Glomerular injury often incites progression to chronic kidney disease, which affects millions of patients worldwide. Despite our current understanding of this disease's pathogenesis, there is still a lack of therapy available to curtail its progression. However, exciting new data strongly point to the podocyte—an actin-rich, terminally differentiated epithelial cell that lines the outside of the glomerular filtration barrier—as a therapeutic target. The importance of podocytes in the pathogenesis of human nephrotic syndrome is best characterized by identification of genetic mutations, many of which regulate the actin cytoskeleton. The intricate regulation of the podocyte actin cytoskeleton is fundamental in preserving an intact glomerular filtration barrier, and this knowledge has inspired new research targeting actin-regulating proteins in these cells. This review will shed light on recent findings, which have furthered our understanding of the molecular mechanisms regulating podocyte actin dynamics, as well as discoveries that have therapeutic implications in the treatment of proteinuric kidney disease.




Fetuin-A-containing calciprotein particles in mineral trafficking and vascular disease

2016-09-28T00:05:55-07:00

Calcium and phosphate combine to form insoluble precipitates in both inorganic and organic materials. This property is useful biologically and has been used by numerous organisms to create hard tissues, a process referred to as biomineralisation [1]. In humans, calcium and phosphate combine to form useful crystal structures largely composed of calcium hydroxyapatite [Ca10(PO4)6(OH)2] and these are essential in the growth, maintenance and strength of parts of the skeleton and other structures like teeth. However, it remains unclear how the body achieves the exquisite specificity involved in biomineralisation. In ageing and disease, these pathways are perturbed, resulting in ectopic calcium crystal deposition impairing tissue function and, interestingly, frequently accompanied by simultaneous loss of mineral from sites where it is useful (e.g. bone). One paradigm for this maladaptive situation is renal failure, a situation that we know is associated with vascular stiffening and calcification, along with mineral loss from the skeleton. Mineral trafficking is a loose term used to describe the movements of calcium salts around the body, and new insights into these pathways may explain some of the problems of previous models of bone mineral disease in renal failure and point to potential future therapeutic strategies.




Novel iron-containing phosphate binders and anemia treatment in CKD: oral iron intake revisited

2016-09-28T00:05:55-07:00

Recent reports have shown that novel phosphate binders containing iron are not only efficacious for the treatment of hyperphosphatemia but also may reduce the need for erythropoiesis-stimulating agents and intravenous (IV) iron for anemia management in patients on maintenance hemodialysis (MHD). Possible healthcare cost savings, which have not been demonstrated in a long-term study, may be an additional advantage of using such multi-pronged treatment strategies for the control of both hyperphosphatemia and iron needs. It is currently assumed that oral iron supplementation is less efficient than the IV route in patients with chronic kidney disease (CKD). The unexpected efficacy of novel iron-containing phosphate binders, such as ferric citrate, in repleting insufficient iron stores and improving the anemia of CKD could change this view. Previous assumptions of self-controlled iron uptake via a ‘mucosal block’ or hepcidin, or of impaired intestinal iron absorption due to CKD-associated inflammation, cannot be reconciled with recent observations of the effects of ferric citrate administration. Citrate in the intestinal lumen may partly contribute to the acceleration of iron absorption. Animal experiments and clinical studies have also shown that oral iron overload can cause excessive iron accumulation despite high hepcidin levels, which are not able to block iron absorption completely. However, as with IV iron agents, no long-term safety data exist with respect to the effects of iron-containing phosphate binders on ‘hard’ patient outcomes. Future randomized prospective studies in patients with CKD are necessary to establish the safety of oral iron-containing phosphate binders for the control of both hyperphosphatemia and renal anemia.




Progress in the treatment of atherosclerotic renovascular disease: the conceptual journey and the unanswered questions

2016-09-28T00:05:55-07:00

Over the past decades, management of atherosclerotic renovascular disease (ARVD) has undergone significant progress, in parallel with increased knowledge about the complex pathophysiology of this condition. Modern multi-targeted medical management of atherosclerosis has driven a change in both the natural history and the clinical outcomes of ARVD. Progression to total renal artery occlusion is a much less common occurrence and, while early studies reported that up to 41% of patients reached renal end-points over a mean follow-up of 44 months, the latest randomized controlled trials have shown that progressive renal impairment occurs in 16–22% of patients, with <8% of patients reaching end-stage kidney disease (ESKD) over a similar time-frame. However, the results of the latest large ARVD trials investigating the effect of renal stenting upon clinical outcomes have been influenced by selection bias, as high-risk patients with clinically significant renal artery stenosis (RAS) have largely been excluded from these studies. Although the neutral results of these trials have created uncertainty about the role of revascularization in the management of patients with ARVD, there is evidence that revascularization can optimize outcomes in selected patients with a high-risk clinical phenotype. Future challenges lie in identifying important subgroups of patients with critical RAS and viable kidneys, while continuing to develop strategies to protect the renal parenchyma and hence improve clinical outcomes.




Neurological complications in chronic kidney disease patients

2016-09-28T00:05:55-07:00

Chronic kidney disease (CKD) is associated with a high prevalence of cerebrovascular disorders such as stroke, white matter diseases, intracerebral microbleeds and cognitive impairment. This situation has been observed not only in end-stage renal disease patients but also in patients with mild or moderate CKD. The occurrence of cerebrovascular disorders may be linked to the presence of traditional and non-traditional cardiovascular risk factors in CKD. Here, we review current knowledge on the epidemiological aspects of CKD-associated neurological and cognitive disorders and discuss putative causes and potential treatment. CKD is associated with traditional (hypertension, hypercholesterolaemia, diabetes etc.) and non-traditional cardiovascular risk factors such as elevated levels of oxidative stress, chronic inflammation, endothelial dysfunction, vascular calcification, anaemia and uraemic toxins. Clinical and animal studies indicate that these factors may modify the incidence and/or outcomes of stroke and are associated with white matter diseases and cognitive impairment. However, direct evidence in CKD patients is still lacking. A better understanding of the factors responsible for the elevated prevalence of cerebrovascular diseases in CKD patients may facilitate the development of novel treatments. Very few clinical trials have actually been performed in CKD patients, and the impact of certain treatments is subject to debate. Treatments that lower LDL cholesterol or blood pressure may reduce the incidence of cerebrovascular diseases in CKD patients, whereas treatment with erythropoiesis-stimulating agents may be associated with an increased risk of stroke but a decreased risk of cognitive disorders. The impact of therapeutic approaches that reduce levels of uraemic toxins has yet to be evaluated.




Curcumin ameliorates nephrosclerosis via suppression of histone acetylation independent of hypertension

2016-09-28T00:05:55-07:00

Background

Although histone acetylation, an epigenetic modification, has been reported to be related to the progression of various diseases, its involvement in nephrosclerosis is unclear.

Methods

Dahl salt-sensitive rats were used as a model of nephrosclerosis in this study. The rats were divided into three groups: (i) normal-salt diet group, (ii) high-salt diet group (HS), and (iii) high-salt diet with daily administration of curcumin, a histone acetyltransferase inhibitor (HS+C). After 6 weeks of treatment, the kidneys were dissected. Morphologic changes were assessed by Masson's trichrome staining. The numbers of macrophages, fibroblasts and cells expressing histone H3 acetylated at Lys 9 (H3K9) were assessed by immunohistochemistry.

Results

Although both HS and HS+C rats showed a marked increase in systolic blood pressure, serum creatinine was increased only in HS rats at 6 weeks. In the HS rats, nephrosclerosis was induced, accompanied by a significant accumulation of macrophages and fibroblasts. This inflammation and fibrosis were markedly suppressed in the HS+C group. The level of histone acetylation at Lys 9 was enhanced in the HS rats, whereas curcumin administration suppressed histone acetylation. Moreover, in the HS rats, interleukin-6 gene expression was associated with acetylated H3K9, as revealed by chromatin immunoprecipitation assay.

Conclusions

Our results suggested that curcumin ameliorates nephrosclerosis via suppression of histone acetylation, independently of hypertension.




Acetazolamide enhances the release of urinary exosomal aquaporin-1

2016-09-28T00:05:55-07:00

Background

Renal aquaporin-1 (AQP1), a water channel protein, is known to be secreted into urine, conveyed by nano-sized extracellular vesicles called exosomes. A previous study has demonstrated that acetazolamide (AZ), a diuretic that inhibits carbonic anhydrases, alters the expression level of AQP1 in cultured cells. Here we investigated whether AZ alters the release of urinary exosomal AQP1 in vivo.

Methods

The effect of AZ on urinary exosomal AQP1 secretion was examined in rats and compared with furosemide (another diuretic), NaHCO3 (an alkalizing agent) and NH4Cl (an acidifying agent). Urine, blood and kidney samples were obtained 2 h after each treatment. Urinary exosomes were isolated by a differential centrifugation technique and urinary exosomal proteins were analyzed by immunoblotting.

Results

The release of exosomal AQP1 into urine was markedly increased after treatment with AZ, accompanied by alkaluria and metabolic acidosis. Immunohistochemistry clearly demonstrated that AZ increased the apical membrane expression of AQP1 in the proximal tubules. AZ did not affect the release of exosomal marker proteins (tumor susceptibility gene 101 protein and apoptosis-linked gene 2 interacting protein X). Treatment with furosemide did not change the exosomal release of AQP1, whereas NaHCO3 and NH4Cl decreased it.

Conclusion

The present findings indicate that AZ increases the release of exosomal AQP1 into urine in association with enhanced apical membrane expression of AQP1.




Common chronic conditions do not affect performance of cell cycle arrest biomarkers for risk stratification of acute kidney injury

2016-09-28T00:05:55-07:00

Background

Identification of acute kidney injury (AKI) can be challenging in patients with underlying chronic disease, and biomarkers often perform poorly in this population. In this study we examined the performance characteristics of the novel biomarker panel of urinary tissue inhibitor of metalloproteinases-2 (TIMP2) and insulin-like growth factor-binding protein 7 (IGFBP7) in patients with a variety of comorbid conditions.

Methods

We analyzed data from two multicenter studies of critically ill patients in which [TIMP2]•[IGFBP7] was validated for prediction of Kidney Disease: Improving Global Outcomes (KDIGO) Stage 2 or 3 AKI within 12 h. We constructed receiver operating characteristic (ROC) curves for AKI prediction both overall and by comorbid conditions common among patients with AKI, including diabetes mellitus, congestive heart failure (CHF) and chronic kidney disease (CKD).

Results

In the overall cohort of 1131 patients, 139 (12.3%) developed KDIGO Stage 2 or 3 AKI. [TIMP2]•[IGFBP7] was significantly higher in AKI versus non-AKI patients, both overall and within each comorbidity subgroup. The AUC for [TIMP2]•[IGFBP7] in predicting AKI was 0.81 overall. Higher AUC was noted in patients with versus without CHF (0.89 versus 0.79; P = 0.026) and CKD (0.91 versus 0.80; P = 0.024).

Conclusions

We observed no significant impairment in the performance of cell cycle arrest biomarkers due to the presence of chronic comorbid conditions.
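
As a rough illustration of the subgroup comparison described above, the following sketch computes an area under the ROC curve overall and within a comorbidity subgroup using scikit-learn. The data, column names and effect sizes are hypothetical and are not taken from the study.

```python
# Illustrative sketch only: hypothetical data, not the study's analysis.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
aki = rng.integers(0, 2, size=n)                      # 1 = KDIGO stage 2 or 3 AKI (hypothetical)
df = pd.DataFrame({
    "aki": aki,
    # hypothetical biomarker values, higher on average in AKI
    "biomarker": rng.lognormal(mean=-1.0 + 0.8 * aki, sigma=0.8),
    "chf": rng.integers(0, 2, size=n),                # hypothetical comorbidity flag
})

# Overall discrimination
print("overall AUC:", round(roc_auc_score(df["aki"], df["biomarker"]), 2))

# Discrimination within each comorbidity subgroup
for flag, group in df.groupby("chf"):
    auc = roc_auc_score(group["aki"], group["biomarker"])
    print(f"CHF={flag}: AUC={auc:.2f}")
```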




Clinical adjudication in acute kidney injury studies: findings from the pivotal TIMP-2*IGFBP7 biomarker study

2016-09-28T00:05:55-07:00

Background

The NephroCheck test (Astute Medical, San Diego, CA, USA) combines urinary tissue inhibitor of metalloproteinases-2 (TIMP-2) and insulin-like growth factor binding protein 7 (IGFBP7) to identify patients at high risk for acute kidney injury (AKI). In a US Food and Drug Administration registration trial (NCT01573962), AKI was determined by a three-member clinical adjudication committee. The objectives were to examine agreement among adjudicators as well as between adjudicators and consensus criteria for AKI and to determine the relationship between biomarker concentrations and adjudicator agreement.

Methods

Subjects were classified as AKI 3/3, 2/3, 1/3 or 0/3 according to the proportion of adjudicators classifying the case as AKI. Subjects were classified as Kidney Disease: Improving Global Outcomes (KDIGO) AKI(+) when stage 2 or 3 AKI criteria were met.

Results

Concordance between adjudicators and between adjudicators and KDIGO criteria was lower for AKI than for non-AKI subjects [78.9 versus 97.3% (P < 0.001) and 91.5 versus 97.9% (P = 0.01)]. Subjects who were AKI 3/3 or 2/3 but KDIGO AKI(–) had higher median [TIMP-2]•[IGFBP7] compared with those who were AKI 1/3 or 0/3 but KDIGO AKI(+) {2.78 [interquartile range (IQR) 2.33–3.56] versus 0.52 [IQR 0.26–1.64]; P = 0.008}. [TIMP-2]•[IGFBP7] levels were highest in patients with AKI 3/3 and lowest in AKI 0/3, whereas AKI 2/3 and 1/3 exhibited intermediate values.

Conclusions

In this analysis, urine [TIMP-2]•[IGFBP7] levels correlated with clinically adjudicated AKI better than with KDIGO criteria. Furthermore, in difficult cases where adjudicators overruled KDIGO criteria, the biomarker test discriminated well. This study highlights the importance of clinical adjudication of AKI for biomarker studies and lends further support to the value of urine [TIMP-2]•[IGFBP7].
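
The adjudication categories above (AKI 3/3, 2/3, 1/3 and 0/3) can in principle be reproduced by counting adjudicator votes and cross-tabulating them against a consensus-criteria flag. The minimal sketch below uses invented data purely to illustrate the bookkeeping; it is not the trial's analysis.

```python
# Illustrative sketch only: invented adjudication data.
import pandas as pd

df = pd.DataFrame({
    "subject": [1, 2, 3, 4, 5, 6],
    "votes_for_aki": [3, 2, 1, 0, 3, 0],                        # adjudicators (of 3) calling AKI
    "kdigo_stage23": [True, False, True, False, True, False],   # consensus-criteria flag
})

# Classify subjects as AKI 3/3, 2/3, 1/3 or 0/3.
df["adjudicated_class"] = df["votes_for_aki"].astype(str) + "/3"

# Cross-tabulate the adjudicated class against the KDIGO stage 2-3 flag to see
# where adjudicators and consensus criteria disagree.
print(pd.crosstab(df["adjudicated_class"], df["kdigo_stage23"]))
```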




Patterns of oral disease in adults with chronic kidney disease treated with hemodialysis

2016-09-28T00:05:55-07:00

Background

Oral disease is a potentially treatable determinant of mortality and quality of life. No comprehensive multinational study to quantify oral disease burden and to identify candidate preventative strategies has been performed in the dialysis setting.

Methods

The ORAL disease in hemoDialysis (ORALD) study was a prospective study in adults treated with hemodialysis in Europe (France, Hungary, Italy, Poland, Portugal and Spain) and Argentina. Oral disease was assessed using standardized WHO methods. Participants self-reported oral health practices and symptoms. Sociodemographic and clinical factors associated with oral diseases were determined and assessed within nation states.

Results

Of 4726 eligible adults, 4205 (88.9%) participated. Overall, 20.6% were edentulous [95% confidence interval (CI), 19.4–21.8]. Participants had on average 22 (95% CI 21.7–22.2) decayed, missing or filled teeth, while moderate to severe periodontitis affected 40.6% (95% CI 38.9–42.3). Oral disease patterns varied markedly across countries, independent of participant demographics, comorbidity and health practices. Participants in Spain, Poland, Italy and Hungary had the highest mean adjusted odds of edentulousness (2.31, 1.90, 1.90 and 1.54, respectively), while those in Poland, Hungary, Spain and Argentina had the highest odds of ≥14 decayed, missing or filled teeth (23.2, 12.5, 8.14 and 5.23, respectively). Compared with Argentina, adjusted odds ratios for periodontitis were 58.8, 58.3, 27.7, 12.1 and 6.30 for Portugal, Italy, Hungary, France and Poland, respectively. National levels of tobacco consumption, diabetes and child poverty were associated with edentulousness within countries.

Conclusions

Oral disease in adults on hemodialysis is very common, frequently severe and highly variable among countries, with much of the variability unexplained by participant characteristics or healthcare. Given the national variation and high burden of disease, strategies to improve oral health in hemodialysis patients will require implementation at a country level rather than at the level of individuals.




Risk of fracture in adults on renal replacement therapy: a Danish national cohort study

2016-09-28T00:05:56-07:00

Background

Patients on dialysis treatment or living with a transplanted kidney have several risk factors for bone fracture, especially disturbances in mineral metabolism and immunosuppressive therapy. We describe the incidence of fracture in this retrospective national Danish cohort study and explore the influence of age, gender, comorbidity and prescribed medication.

Methods

By individual-level linkage between nationwide administrative registries, the risk of fracture was compared between the group of patients receiving chronic dialysis treatment and patients receiving their first renal transplant in the study period, using the Danish background population as reference group. All three groups were followed up until first fracture, emigration, death or end of study. Cox proportional hazard models with fracture as outcome were fitted to the data.

Results

The hazard ratio (HR) for any fracture was 3.14 [95% confidence interval (CI): 2.97–3.31] in the dialysis group and 1.94 (95% CI: 1.72–2.18) in the renal transplanted group. The HR remained increased, but was modified by adjustment for age, gender, comorbidity and prior fracture [dialysis group: 1.85 (95% CI: 1.75–1.95); renal transplanted group: 1.82 (95% CI: 1.62–2.06)]. Prescribed diuretics, lipid-modifying agents and proton pump inhibitors also modulated the fracture risk.

Conclusions

Patients on dialysis or living with a transplanted kidney have a significantly higher risk of fracture than the Danish background population. Differences in age, gender, drug use and comorbidity only partly explain this increased risk. Further studies are warranted to explore the reason for this increased fracture risk in patients on renal replacement therapy.
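
For readers unfamiliar with the modelling approach, the following is a minimal sketch of a Cox proportional hazards fit using the Python lifelines package. The data frame, covariates and follow-up times are hypothetical; the sketch does not reproduce the registry analysis.

```python
# Illustrative sketch only: hypothetical data and column names, not the Danish registry analysis.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 300
df = pd.DataFrame({
    "age": rng.normal(60, 12, n).round(),
    "female": rng.integers(0, 2, n),
    "dialysis": rng.integers(0, 2, n),            # 1 = dialysis, 0 = transplant (hypothetical coding)
    "time": rng.exponential(scale=5.0, size=n),   # follow-up time in years
    "fracture": rng.integers(0, 2, n),            # 1 = fracture observed, 0 = censored
})

# Fit a Cox proportional hazards model; exp(coef) in the summary is the hazard ratio.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="fracture")
cph.print_summary()
```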




Mild prolonged chronic hyponatremia and risk of hip fracture in the elderly

2016-09-28T00:05:56-07:00

Background

Hip fractures are among the most serious bone fractures in the elderly, producing significant morbidity and mortality. Several observational studies have found that mild hyponatremia can adversely affect bone, with fractures occurring as a potential complication. We examined if there is an independent association between prolonged chronic hyponatremia (>90 days duration) and risk of hip fracture in the elderly.

Methods

We performed a retrospective cohort study in adults >60 years of age from a prepaid health maintenance organization who had two or more measurements of plasma sodium between 2005 and 2012. The incidence of hip fractures was assessed in a very restrictive population: subjects with prolonged chronic hyponatremia, defined as plasma sodium values <135 mmol/L, lasting >90 days. Multivariable Cox regression was performed to determine the hazard ratio (HR) for hip fracture risk associated with prolonged chronic hyponatremia after adjustment for the propensity to have hyponatremia, fracture risk factors and relevant baseline characteristics.

Results

Among 31 527 eligible patients, only 228 (0.9%) had prolonged chronic hyponatremia. Mean plasma sodium was 132 ± 5 mmol/L in hyponatremic patients and 139 ± 3 mmol/L in normonatremic patients (P < 0.001). The absolute risk for hip fracture was 7/282 in patients with prolonged chronic hyponatremia and 411/313 299 in normonatremic patients. Hyponatremic patients had a substantially elevated rate of hip fracture [adjusted HR 4.52 (95% CI 2.14–9.6)], which was even higher in those with moderate hyponatremia (<130 mmol/L) [adjusted HR 7.61 (95% CI 2.8–20.5)].

Conclusion

Mild prolonged chronic hyponatremia is independently associated with hip fracture risk in the elderly population, although the absolute risk is low. However, proof that correcting hyponatremia will result in a reduction of hip fractures is lacking.




Smoking patterns and chronic kidney disease in US Hispanics: Hispanic Community Health Study/Study of Latinos

2016-09-28T00:05:56-07:00

Background

Intermittent smoking is prevalent among Hispanics, but little is known about whether this smoking pattern associates with increased chronic kidney disease (CKD) risk in this population. The objective of the present study is to identify patterns of exposure associated with CKD in US Hispanics.

Methods

We used cross-sectional data on 15 410 participants of the Hispanic Community Health Study/Study of Latinos, a population-based study of individuals aged 18–74 years, recruited from 2008 to 2011 at four US field centers (Bronx, NY; Chicago, IL; Miami, FL; San Diego, CA). Smoking exposure was assessed by questionnaire. CKD was defined by an estimated glomerular filtration rate of <60 mL/min/1.73 m2 or a urine albumin-to-creatinine ratio of ≥30 mg/g.

Results

Approximately 14% of individuals were daily smokers, 7% were intermittent smokers and 16% were past smokers. There was a significant interaction between smoking status and pack-years of exposure (P = 0.0003). In adjusted models, the odds of CKD increased with pack-years among daily, intermittent and past smokers compared with never smokers. The association for intermittent smokers was significant at 10 pack-years [odds ratio (OR) = 1.38, 95% confidence interval (CI) 1.06, 1.81], whereas for daily smokers this association was observed at 40 pack-years (OR = 1.43, 95% CI 1.09, 1.89).

Conclusions

Our findings of increased risk of CKD among Hispanics who are intermittent smokers support screening and smoking cessation interventions targeted to this population for the prevention of CKD. These findings also suggest novel mechanistic pathways for kidney toxicity that should be further explored in future studies.
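
The exposure and outcome definitions used above are straightforward to encode. The sketch below assumes the standard pack-years formula (cigarettes per day divided by 20, multiplied by years smoked) and the CKD definition quoted in the Methods; it is illustrative only and not code from the study.

```python
# Illustrative sketch only: the CKD thresholds follow the definition quoted above;
# the pack-years formula is the standard one, not code from the study.
def pack_years(cigarettes_per_day: float, years_smoked: float) -> float:
    """Pack-years = (cigarettes per day / 20) * years smoked."""
    return cigarettes_per_day / 20.0 * years_smoked

def has_ckd(egfr: float, uacr: float) -> bool:
    """CKD: eGFR < 60 mL/min/1.73 m2 or urine albumin-to-creatinine ratio >= 30 mg/g."""
    return egfr < 60 or uacr >= 30

print(pack_years(10, 20))   # 10 cigarettes/day for 20 years -> 10.0 pack-years
print(has_ckd(75, 42))      # preserved eGFR but elevated albuminuria -> True
```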




Micro-RNA analysis of renal biopsies in human lupus nephritis demonstrates up-regulated miR-422a driving reduction of kallikrein-related peptidase 4

2016-09-28T00:05:56-07:00

Background

Aberrancies in gene expression in immune effector cells and in end-organs are implicated in lupus pathogenesis. To gain insights into the mechanisms of tissue injury, we profiled the expression of micro-RNAs in inflammatory kidney lesions of human lupus nephritis (LN).

Methods

Kidney specimens were from patients with active proliferative, membranous or mixed LN and unaffected control tissue. Micro-RNAs were quantified by TaqMan Low Density Arrays. Bioinformatics was employed to predict gene targets, gene networks and perturbed signaling pathways. Results were validated by transfection studies (luciferase assay, real-time PCR) and in murine LN. Protein expression was determined by immunoblotting and immunohistochemistry.

Results

Twenty-four micro-RNAs were dysregulated (9 up-regulated, 15 down-regulated) in human LN compared with control renal tissue. Their predicted gene targets participated in pathways associated with TGF-β, kinases, NF-κB, HNF4A, Wnt/β-catenin, STAT3 and IL-4. miR-422a showed the highest upregulation (17-fold) in active LN and correlated with fibrinoid necrosis lesions (β = 0.63, P = 0.002). In transfection studies, miR-422a was found to directly target kallikrein-related peptidase 4 (KLK4) mRNA. Concordantly, KLK4 mRNA was significantly reduced in the kidneys of human and murine LN and correlated inversely with miR-422a levels. Immunohistochemistry confirmed reduced KLK4 protein expression in renal mesangial and tubular epithelial cells in human and murine LN.

Conclusions

KLK4, a serine esterase with putative renoprotective properties, is down-regulated by miR-422a in LN kidney suggesting that, in addition to immune activation, local factors may be implicated in the disease.




Prevalence and distribution of (micro)albuminuria in toddlers

2016-09-28T00:05:56-07:00

Background

Microalbuminuria is common in the general adult population, with a prevalence of ~7%, and is an independent indicator of renal and cardiovascular risks. Whether albuminuria is acquired during life (as a result of hypertension/diabetes) or is congenital and already present at birth is unknown. We studied the prevalence of microalbuminuria in toddlers and compared the distribution of albuminuria with that of the general adult population. In addition, we looked for possible associations between microalbuminuria and antenatal, postnatal and maternal factors.

Methods

The urinary albumin concentration (UAC) was measured in 1352 children and the urinary albumin:creatinine ratio (UACR) in 1288 children from the Groningen Expert Center for Kids with Obesity (GECKO) Drenthe cohort (age range 20–40 months). Albuminuria distribution was compared with the albuminuria distribution in 40 854 participants of the general adult cohort of the Prevention of Renal and Vascular End stage Disease (PREVEND) study. Associations between albuminuria (expressed as UAC and UACR) and antenatal, postnatal and maternal factors were tested with linear regression analysis.

Results

The median UAC in the GECKO study was 2.3 mg/L (5th–95th percentiles: 2.1–25.5) and in the PREVEND study it was 6.0 mg/L (2.3–28.6) (P distribution comparison 0.053). The prevalence of UAC ≥ 20 mg/L was 6.9% in the GECKO study and 7.8% in the PREVEND study (P = 0.195). The prevalence of UACR ≥ 30 mg/g in the GECKO study was 23.4%. UAC and UACR were lower in boys. UAC was not associated with other determinants, but UACR was associated with age and gestational diabetes.

Conclusions

The distribution of UAC and the prevalence of UAC > 20 mg/L in toddlers and in the young general adult population are comparable. These findings suggest that microalbuminuria is a congenital condition that may predispose to a higher cardiovascular risk later in life.
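
Because the abstract reports both the urinary albumin concentration (UAC, mg/L) and the albumin:creatinine ratio (UACR, mg/g), a short worked example of the unit handling may help. The spot-urine values below are hypothetical and chosen only to illustrate the conversion and the thresholds quoted above.

```python
# Illustrative sketch only: hypothetical spot-urine values.
def uacr_mg_per_g(albumin_mg_per_l: float, creatinine_mg_per_dl: float) -> float:
    """Urinary albumin:creatinine ratio (mg/g) from spot-urine concentrations."""
    creatinine_g_per_l = creatinine_mg_per_dl / 100.0   # mg/dL -> g/L
    return albumin_mg_per_l / creatinine_g_per_l

uac = 6.0                                               # urinary albumin concentration, mg/L
uacr = uacr_mg_per_g(albumin_mg_per_l=uac, creatinine_mg_per_dl=30.0)
print(uac >= 20)    # UAC threshold used above (20 mg/L)  -> False
print(uacr >= 30)   # UACR threshold used above (30 mg/g) -> False (uacr == 20.0)
```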




Prenatal diagnosis of fetal multicystic dysplastic kidney via high-resolution whole-genome array

2016-09-28T00:05:56-07:00

Background

Women with fetal multicystic dysplastic kidneys (MCDK) are commonly referred for genetic counseling, for which identification of the correct etiology is a prerequisite.

Methods

A total of 72 women with fetal MCDK at Guangzhou Women and Children's Medical Center were examined via invasive prenatal diagnosis from May 2010 to June 2015. Standard karyotyping analysis was provided to all fetuses, and chromosomal microarray with Affymetrix CytoScan HD arrays was offered in cases for which DNA samples were available.

Results

Abnormal karyotypes were detected in 3 of 72 (4.17%) fetuses. Of the 69 (95.8%, 69/72) fetuses with normal karyotypes, 30 (42%, 30/69) underwent chromosome microarray analysis (CMA) testing. The CMA identified pathogenic copy number variations in five fetuses, leading to a pathogenic detection rate of 16.7% (5/30). Well-known microdeletion or microduplication syndromes including renal cysts and diabetes (RCAD) syndrome and Williams–Beuren syndrome (WBS) were identified in three cases. Moreover, four chromosomal imbalanced regions were also identified in our MCDK fetuses: 22q11.1 duplication, 4q35.2 deletion, 22q13.33 duplication and 1p33 duplication. Genes PEX26, ELN, HNF1B, ALG12, FRG1, FRG2 and CYP4A11 were possible candidates for fetal MCDK. The proportions of variants of unknown significance before and after parental analysis were 13.3% (4/30) and 3.3% (1/30), respectively.

Conclusions

In the present study, the frequency of chromosomal abnormalities in MCDK fetuses was 4.17% and all rearrangements were imbalanced aberrations. CMA was able to increase the pathogenic detection rate to 16.7% in MCDK fetuses with normal karyotype. Critical regions for RCAD syndrome, WBS and copy number variants of 22q11.1 duplication, 4q35.2 deletion, 22q13.33 duplication and 1p33 duplication were associated with fetal MCDK. Genes PEX26, ELN, HNF1B, ALG12, FRG1, FRG2 and CYP4A11 were possible candidates for fetal MCDK.




Body mass index trend in haemodialysis patients: the shift of nutritional disorders in two Italian regions

2016-09-28T00:05:56-07:00

Background

In the USA, the increase in the prevalence of obesity in the general population has been accompanied by a marked increase in the prevalence and incidence of obesity in the dialysis population. However, secular trends of body mass index (BMI) have not been investigated in European renal registries.

Methods

We investigated the secular trend of BMI across 18 years (1994–2011) in two haemodialysis (HD) registries (Calabria in southern Italy and Emilia in northern Italy) on a total of 16 201 prevalent HD patients and in a series of 3559 incident HD patients. We compared trends in BMI for HD patients with those in the background general population of the same regions.

Results

The average BMI rose from 23.5 kg/m2 in 1994 to 25.5 (+8.5%) in 2011 in the Calabria registry and from 23.7 in 1998 to 25.4 (+7.1%) in 2011 in the Emilia registry (P < 0.001). The proportion of obese patients (i.e. with BMI >30 kg/m2) rose from 6 to 14% in Calabria and from 6 to 16% in Emilia (P < 0.001). These patterns were fully confirmed in incident patients and were mirrored by a substantial decline in the prevalence of underweight–normal and underweight (P < 0.001) patients. Of note, the steepness of the increase in BMI in haemodialysis patients was 3.7 times more pronounced than that in the coeval, age- and sex-matched general population of Calabria and Emilia.

Conclusions

In two regional haemodialysis registries in Italy a steady increase in overweight and obese patients is observed. These patterns are more pronounced than those found in the general population. If further confirmed in other European haemodialysis cohorts, these findings may have relevant public health implications.
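
The reported relative increases can be checked directly from the mean BMI values:

```latex
\[
\mathrm{BMI} = \frac{\text{weight (kg)}}{\text{height (m)}^2}, \qquad
\frac{25.5 - 23.5}{23.5} \approx 0.085 \;(+8.5\%, \text{Calabria}), \qquad
\frac{25.4 - 23.7}{23.7} \approx 0.071 \;(+7.1\%, \text{Emilia}).
\]
```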




High cut-off dialysis in chronic haemodialysis patients reduces serum procalcific activity

2016-09-28T00:05:56-07:00

Background

Vascular calcification is enhanced in chronic dialysis patients, possibly due to the insufficient removal of various intermediate molecular weight uraemic toxins such as interleukins with conventional membranes. In this study, we assessed the modulation of in vitro vascular calcification with the use of high cut-off (HCO) membranes in chronic dialysis patients.

Methods

In the PERCI trial, 43 chronic dialysis patients were treated with conventional high-flux and HCO filters for 3 weeks in a randomized order following a 2-period crossover design. After each phase, predialysis serum samples were drawn. Calcifying human coronary vascular smooth muscle cells (VSMCs) were incubated with the patients' serum samples. Calcification was assessed with alkaline phosphatase (ALP) and alizarin red staining. In the clinical trial, HCO dialysis reduced the serum levels of the soluble tumour necrosis factor receptor (sTNFR) 1 and 2, vascular cell adhesion molecule 1 (VCAM-1) and soluble interleukin-2 receptor (sIL2R). We therefore investigated the in vitro effects of these mediators on vascular calcification.

Results

VSMCs incubated with HCO dialysis serum showed a 26% reduction in ALP compared with incubation with high-flux serum. Alizarin red staining was 43% lower after incubation with the HCO serum compared with the high-flux serum. While sIL2R and sTNFR 1 and 2 showed no effects on VSMC calcification, VCAM-1 caused a dose-dependent enhancement of calcification.

Conclusions

The use of HCO dialysis membranes in chronic dialysis patients reduces the procalcific effects of serum on VSMCs in vitro. The mechanisms of the strong effect of HCO on in vitro calcification are not completely understood. One factor may be lower levels of VCAM-1 in HCO serum samples, since VCAM-1 was able to induce vascular calcification in our experiments. Neither sTNFR 1, sTNFR 2 nor sIL2R enhanced vascular calcification in vitro. Regardless of the mechanisms, our results encourage further studies of highly permeable filters in chronic dialysis patients.




The effects of resistance exercise and oral nutritional supplementation during hemodialysis on indicators of nutritional status and quality of life

2016-09-28T00:05:56-07:00

Background

Protein-energy wasting (PEW) is common in patients undergoing hemodialysis (HD). Studies have assessed the positive effect of oral nutritional supplementation (ONS) or resistance exercise (RE) on nutritional status (NS) markers in patients undergoing HD.

Methods

The aim of this study was to assess the effect of ONS and RE on NS and the quality of life (QOL) of 36 patients undergoing HD. In a randomized clinical trial, patients were divided into the following two groups: a control group (ONS) that received a can of ONS during their HD sessions and an intervention group (ONS + RE) that received a can of ONS and underwent a 40-min session of RE during their HD sessions. Both interventions lasted 12 weeks. The patients' anthropometric, biochemical, dietetic and bioelectrical impedance measurements as well as their QOL, evaluated using the Kidney Disease Quality of Life Short Form, were recorded.

Results

At baseline, 55.5% of patients presented with PEW according to International Society of Renal Nutrition and Metabolism criteria (20 patients). We found statistically significant changes from baseline in both groups, such as increases in body weight, body mass index, midarm circumference, midarm muscle circumference, triceps skinfold thickness, fat mass percentage, handgrip strength, phase angle and serum albumin. A decrease in the prevalence of PEW was observed in both groups at the end of the intervention. A delta comparison between groups showed no statistically significant differences in the anthropometric and biochemical parameters. No significant improvement was observed in QOL and body composition measured by bioimpedance vector analysis. Dietary energy and protein intake increased significantly during the study period for all patients.

Conclusion

Oral nutritional supplementation during HD improves NS. The addition of RE during HD does not seem to augment the acute anabolic effects of intradialytic ONS on NS.




The influence of renal transplantation on retained microbial-human co-metabolites

2016-09-28T00:05:56-07:00

Background

Colonic microbial metabolism contributes substantially to uraemic retention solutes accumulating in chronic kidney disease (CKD) and various microbial–human co-metabolites relate to adverse outcomes. The influence of renal transplantation on these solutes is largely unexplored.

Methods

We prospectively followed 51 renal transplant recipients at the time of transplantation, Day 7 and Months 3 and 12 post-transplantation. Serum levels of p-cresyl sulphate (PCS), p-cresyl glucuronide (PCG), indoxyl sulphate (IS), trimethylamine N-oxide (TMAO) and phenylacetylglutamine (PAG) were determined with liquid chromatography–tandem mass spectrometry. At each time point, transplant recipients were compared with CKD control patients matched for age, gender, diabetes mellitus and renal function. Determinants of serum levels were also compared between an unrelated cohort of 65 transplant recipients at Month 3 post-transplantation and CKD patients with 24-h urinary collection.

Results

Serum levels of the tested microbial–human co-metabolites significantly decreased following renal transplantation (P < 0.001). At each time point post-transplantation, serum levels of PCS, PCG, PAG and, to a lesser extent, IS, but not TMAO, were significantly lower in transplant recipients when compared with CKD control patients. Further analysis demonstrated significantly lower 24-h urinary excretion of these solutes in transplant recipients (P < 0.001). Also, renal clearances of PCG, IS, TMAO and PAG were significantly lower in transplant recipients without differences in estimated glomerular filtration rate.

Conclusions

Colonic microbiota-derived uraemic retention solutes substantially decrease following renal transplantation. The 24-h urinary excretion of these microbial–human co-metabolites is lower when compared with CKD patients, suggesting an independent influence of transplantation on intestinal uptake, a composite of colonic microbial metabolism and intestinal absorption. Renal solute handling may differ between transplant recipients and CKD patients.




Efficacy and safety of antibody induction therapy in the current era of kidney transplantation

2016-09-28T00:05:56-07:00

Background

Antibody induction with polyclonal rabbit-antithymocyte globulin (rATG) or an interleukin-2 receptor antagonist (IL–2RA) is widely used in kidney transplantation.

Methods

Collaborative Transplant Study data from 38 311 first deceased-donor kidney transplants (2004–13) were analysed. Transplants were classified as ‘normal risk’ or ‘increased risk’ according to current guidelines. Cox regression analysis was applied to subpopulations of propensity score–matched recipients.

Results

rATG or IL–2RA induction was given to 64% of increased-risk and 53% of normal-risk patients, respectively. rATG and IL–2RA induction were each associated with reduced risk for graft loss versus no induction in increased-risk patients [hazard ratio (HR) 0.85, P = 0.046 and HR 0.89, P = 0.011, respectively]. The HR values for incidence of treated rejection in increased-risk patients for rATG and IL–2RA versus no induction were 0.75 (P = 0.037) and 0.77 (P < 0.001), respectively. In the normal risk subpopulation, neither induction therapy significantly affected the risk of graft loss or treated rejection. Hospitalization for infection was increased by rATG (P < 0.001) and IL–2RA (P < 0.001) induction. In contrast to patients transplanted during 1994–2003, among patients transplanted during 2004–13, rATG did not significantly affect the risk of non-Hodgkin's lymphoma versus no induction (P = 0.68).

Conclusion

Induction therapy following kidney transplantation should be targeted to increased-risk transplants. In this analysis, a beneficial effect of antibody induction in normal-risk transplants could not be demonstrated.
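
As a generic illustration of propensity score matching of the kind referred to above, the following sketch estimates a propensity score with logistic regression and performs 1:1 nearest-neighbour matching using scikit-learn. The covariates, data and matching details are hypothetical and do not reflect the Collaborative Transplant Study analysis.

```python
# Illustrative sketch only: hypothetical data, not the registry analysis.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(42)
n = 1000
df = pd.DataFrame({
    "age": rng.normal(50, 15, n),            # hypothetical baseline covariates
    "diabetes": rng.integers(0, 2, n),
    "induction": rng.integers(0, 2, n),      # 1 = received antibody induction (hypothetical)
})

# Step 1: model the propensity of receiving induction from baseline covariates.
X = df[["age", "diabetes"]]
df["ps"] = LogisticRegression().fit(X, df["induction"]).predict_proba(X)[:, 1]

# Step 2: match each induced recipient to the non-induced recipient with the
# closest propensity score (1:1 nearest neighbour, with replacement).
treated = df[df["induction"] == 1]
control = df[df["induction"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]])

# Outcome models (e.g. Cox regression for graft loss or treated rejection)
# would then be fitted on the matched sample.
print(matched["induction"].value_counts())
```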




Pretransplant angiotensin II type 1-receptor antibodies are a risk factor for earlier detection of de novo HLA donor-specific antibodies

2016-09-28T00:05:56-07:00

Background

Angiotensin II type 1 receptor antibodies (AT1Rabs) have been associated with significantly reduced graft survival. Earlier graft loss has been observed in patients who had pretransplant AT1Rabs and posttransplant donor-specific antibodies (DSA).

Methods

The main goal of this retrospective cohort study was to examine the association between AT1Rabs and the time to detection of de novo human leukocyte antigen donor-specific antibodies (HLA-DSA) posttransplantation in living donor kidney transplant recipients (KTR). The analysis included 141 KTRs. Frozen serum samples were tested for AT1Rabs by ELISA and for HLA-DSA by single antigen bead (SAB) assay (Luminex) at both the pre- and posttransplant time points.

Results

The median AT1Rab level was 9.13 U (interquartile range 5.22–14.33). After a mean follow-up period of 3.55 years, 48 patients were found to harbour de novo HLA-DSAs. The presence of AT1Rabs [hazard ratio (HR) 1.009, 95% confidence interval (CI) 1.002–1.01, P = 0.010], male-to-male transplantation (HR 2.57, 95% CI 1.42–4.67, P = 0.002) and antecedent borderline changes or acute cellular rejection (ACR) (HR 2.47, 95% CI 1.29–4.75, P = 0.006) were significantly associated with de novo DSA detection. A dose-dependent association between AT1Rab levels (<10 U, 10.1–16.9 U, 17–29.9 U and >30 U) and de novo DSA detection was observed (log-rank P = 0.0031). After multivariate analysis of AT1Rab levels (continuous variable), AT1Rabs >30 U, male-to-male transplantation, donor age, a higher class I panel reactive antibody percentage and antecedent borderline changes or ACR remained independent significant risk factors for the detection of de novo DSAs.

Conclusions

The findings suggest that higher levels of pretransplant circulating antibodies against AT1R (>30 U) in kidney graft recipients constitute an independent risk factor for earlier de novo HLA-DSA detection during the posttransplant period.




The mode of sensitization and its influence on allograft outcomes in highly sensitized kidney transplant recipients

2016-09-28T00:05:56-07:00

Background

We sought to determine whether the mode of sensitization in highly sensitized patients contributed to kidney allograft survival.

Methods

An analysis of the United Network for Organ Sharing dataset involving all kidney transplants between 1997 and 2014 was undertaken. Highly sensitized adult kidney transplant recipients [panel reactive antibody (PRA) ≥98%] were compared with adult, primary non-sensitized and re-transplant recipients. Kaplan–Meier survival analyses were used to determine allograft survival rates. Cox proportional hazards regression analyses were conducted to determine the association of graft loss with key predictors.

Results

Fifty-three percent of highly sensitized patients transplanted were re-transplants. Pregnancy and transfusion were the only sensitizing events in 20 and 5% of patients, respectively. The 10-year actuarial graft survival for highly sensitized recipients was 43.9% compared with 52.4% for non-sensitized patients, P < 0.001. Being highly sensitized by either pregnancy or blood transfusion increased the risk of graft loss by 23% [hazard ratio (HR) 1.230, confidence interval (CI) 1.150–1.315, P < 0.001], and being highly sensitized from a prior transplant increased the risk of graft loss by 58.1% (HR 1.581, CI 1.473–1.698, P < 0.001).

Conclusions

The mode of sensitization predicts graft survival in highly sensitized kidney transplant recipients (PRA ≥98%). Patients who are highly sensitized from re-transplants have inferior graft survival compared with patients who are highly sensitized from other modes of sensitization.




Announcements

2016-09-28T00:05:56-07:00
















Causality at the dawn of the 'omics era in medicine and in nephrology

2016-09-02T00:05:37-07:00

Causality is a core concept in medicine. The quantitative determinacy characterizing today's biomedical science is unprecedented. The assessment of causal relations in human diseases is evolving, and it is therefore fundamental to keep up with the steady pace of theoretical and technological advancements. The exact specification of all causes of pathologies at the individual level, precision medicine, is expected to allow the complete eradication of disease. In this article, we discuss the various conceptualizations of causation that are at play in the context of randomized clinical trials and observational studies. Genomics, proteomics, metabolomics and epigenetics can now produce the precise knowledge we need for 21st century medicine. New conceptions of causality are needed to form the basis of the new precision medicine.




The blood pressure-salt sensitivity paradigm: pathophysiologically sound yet of no practical value

2016-09-02T00:05:37-07:00

Sodium plays an important pathophysiological role in blood pressure (BP) values and in the development of hypertension, and epidemiological studies such as the Intersalt Study have shown that the increase in BP occurring with age is determined by salt intake. Recently, a meta-analysis of 13 prospective studies has also shown the close relationship between excess sodium intake and higher risk of stroke and total cardiovascular events. However, the BP response to changing salt intake displayed marked variability, as first suggested by Kawasaki et al. (The effect of high-sodium and low-sodium intakes on blood pressure and other related variables in human subjects with idiopathic hypertension. Am J Med 1978; 64: 193–198) and later by Weinberger et al. (Definitions and characteristics of sodium sensitivity and blood pressure resistance. Hypertension 1986; 8: II127–II134), who recognized the heterogeneity of the BP response to salt and developed the concept of salt sensitivity. We have a large body of evidence in favour of a major role of metabolic and neuro-hormonal factors in determining BP salt sensitivity in association with the effect of genetic variation. There is evidence that salt sensitivity influences the development of organ damage, even independently, at least in part, of BP levels and the occurrence of hypertension. In addition, several observational studies indicate that salt sensitivity is clearly associated with a higher rate of cardiovascular events and mortality, independently of BP levels and hypertension. A cluster of factors with well-known atherogenic potential such as hyperinsulinaemia, dyslipidaemia and microalbuminuria, all known to be prevalent in salt-sensitive hypertension, might at least partially explain the increased cardiovascular risk observed in salt-sensitive individuals. The gold standard for the evaluation of BP salt sensitivity is the BP response to a moderate reduction of salt intake for several weeks; nevertheless, these protocols often suffer from poor patient compliance with dietary instructions. To overcome this problem, short-term tests have been propo[...]



Pro: Reducing salt intake at population level: is it really a public health priority?

2016-09-02T00:05:37-07:00

A reduction in salt intake reduces blood pressure, stroke and other cardiovascular events, including chronic kidney disease, by as much as 23% (i.e. 1.25 million deaths worldwide). It is effective in both genders, at any age, in all ethnic groups, and in high-, middle- and low-income countries. Population salt reduction programmes are both feasible and effective (preventive imperative). Salt reduction programmes are cost-saving in all settings (high-, middle- and low-income countries) (economic imperative). Public health policies are powerful, rapid, equitable and cost-saving (political imperative). The important shift in public health has not occurred without obstinate opposition from organizations concerned primarily with the profits deriving from population high salt intake and less with public health benefits. A key component of the denial strategy is misinformation (with ‘pseudo’ controversies). In general, poor science has been used to create uncertainty and to support inaction. This paper summarizes the evidence in favour of a global salt reduction strategy and analyses the peddling of well-worn myths behind the false controversies.




Opponent's comments

2016-09-02T00:05:37-07:00




Con: Reducing salt intake at the population level: is it really a public health priority?

2016-09-02T00:05:37-07:00

Scientific evidence to support the recommended salt intake of <5.8 g/day is virtually non-existent. There are no randomized controlled trials (RCTs) to investigate the effect of salt reduction (SR) below 5.8 g on health outcomes. The effect of SR on blood pressure (BP) reaches maximal efficacy at 1 week. RCTs in healthy individuals lasting at least 1 week show that the effect of SR on BP is <1 mmHg, but that SR has significant side effects, including increases in renin, aldosterone, noradrenalin, adrenalin, cholesterol and triglycerides. Still, disregarding confounders and side effects, health authorities use BP effects obtained in studies of pre-hypertensive and hypertensive patients to recommend SR in the healthy population and use these biased BP effects in statistical models indirectly to project millions of saved lives. These fantasy projections are in contrast to real data from prospective observational population studies directly associating salt intake with mortality, which show that salt intake <5.8 g/day is associated with an increased mortality of ~15%. The population studies also show that a very high salt intake >12.2 g is associated with increased mortality. However, since <5% of populations consume such high amounts of salt, SR at the population level should not be a public health priority. Consequently, this policy should be abolished, not because any attempt to implement it has failed, and not because it costs taxpayers and food consumers unnecessary billions of dollars, but because, if implemented, it might kill people instead of saving them.




Opponent's comments

2016-09-02T00:05:37-07:00




Moderator's view: Salt, cardiovascular risk, observational research and recommendations for clinical practice

2016-09-02T00:05:37-07:00

In observational studies, blood pressure (BP), cholesterol and nutritional status biomarkers, including sodium intake, coherently show a J- or U-shaped relationship with health outcomes. Yet these data may reflect a stable sodium intake or a reduced intake due to comorbidities or intercurrent disease, or an intentional decrease in salt intake. Adjusting for comorbidities and risk factors may fail to eliminate confounding. For cholesterol and BP, we base our recommendations for prevention and treatment on interventional (experimental) studies. For sodium, we lack the perfect large-scale trial we would need, but substantial circumstantial information derived from interventional studies cannot be ignored. The objection that modelling the risk of salt excess for cardiovascular disease events based on the effect of salt intake on BP is unjustified fails to consider a recent meta-analysis showing that, independently of the intervention applied, intensive BP-lowering treatment (average BP 133/76 mmHg), compared with the less intensive treatment (140/81 mmHg), is associated with a 14% risk reduction for major cardiovascular events. In this knowledge context, inertia, i.e. awaiting the ‘mother trial’, is not justified. While recognizing that this trial may still be needed and that actual data, rather than modelled data, are the ideal solution, for now, the World Health Organization recommendation of reducing salt intake to <2 g/day of sodium (5 g/day of salt) in adults stands.




Chronicity following ischaemia-reperfusion injury depends on tubular-macrophage crosstalk involving two tubular cell-derived CSF-1R activators: CSF-1 and IL-34

2016-09-02T00:05:37-07:00

Two structurally unrelated ligands activate the macrophage colony stimulating factor receptor (CSF-1R, c-fms, CD115): M-CSF/CSF-1 and interleukin-34 (IL-34). Both ligands promote macrophage proliferation, survival and differentiation. IL-34 also activates the protein-tyrosine phosphatase receptor (PTP-ζ, PTPRZ1). Both receptors and cytokines are increased during acute kidney injury. While tubular cell-derived CSF-1 is required for kidney repair, Baek et al. (J Clin Invest 2015; 125: 3198–3214) have now identified tubular epithelial cell-derived IL-34 as a promoter of kidney neutrophil and macrophage infiltration and tubular cell destruction during experimental kidney ischaemia-reperfusion, leading to chronic injury. IL-34 promoted proliferation of both intrarenal macrophages and bone marrow cells, increasing circulating neutrophils and monocytes and their kidney recruitment. Thus, injured tubular cells release two CSF-1R activators, one (CSF-1) that promotes tubular cell survival and kidney repair and another (IL-34) that promotes chronic kidney damage. These results hold promise for the development of IL-34-targeting strategies to prevent ischaemia-reperfusion kidney injury in contexts such as kidney transplantation. However, careful consideration should be given to the recent characterization by Bezie et al. (J Clin Invest 2015; 125: 3952–3964) of IL-34 as a T regulatory cell (Treg) cytokine that modulates macrophage responses so that IL-34-primed macrophages potentiate the immune suppressive capacity of Tregs and promote graft tolerance.




Kelch-like 3/Cullin 3 ubiquitin ligase complex and WNK signaling in salt-sensitive hypertension and electrolyte disorder

2016-09-02T00:05:38-07:00

Pseudohypoaldosteronism type II (PHAII) is a hereditary disease characterized by salt-sensitive hypertension, hyperkalemia and thiazide sensitivity. Mutations in with-no-lysine kinase 1 (WNK1) and WNK4 genes are reported to cause PHAII. Rigorous studies have demonstrated that WNK kinases constitute a signaling cascade with oxidative stress-responsive gene 1 (OSR1), Ste20-related proline-alanine-rich kinase (SPAK) and the solute carrier family 12a (SLC12a) transporter, including thiazide-sensitive NaCl cotransporter. The WNK–OSR1/SPAK-SLC12a signaling cascade is present in the kidneys and vascular smooth muscle cells (VSMCs) and regulates salt sensitivity physiologically, i.e. urinary sodium excretion and arterial tone by various hormonal and dietary factors. However, although it was clear that the abnormal activation of this signaling cascade is the molecular basis of PHAII, the molecular mechanisms responsible for the physiological regulation of WNK signaling and the effect of WNK4 mutations on PHAII pathogenesis are poorly understood. Two additional genes responsible for PHAII, Kelch-like 3 (KLHL3) and Cullin 3 (CUL3), were identified in 2012. WNK1 and WNK4 have been shown to be substrates of KLHL3–CUL3 E3 ubiquitin ligase both in vitro and in vivo. In PHAII, the loss of interaction between KLHL3 and WNK4 induces increased levels of WNK kinases due to impaired ubiquitination. These results indicate that WNK signaling is physiologically regulated by KLHL3/CUL3-mediated ubiquitination. Here, we review recent studies investigating the pathophysiological roles of the WNK signaling cascade in the kidneys and VSMCs and recently discovered mechanisms underlying the regulation of WNK signaling by KLHL3 and CUL3.




Glomerular filtration rate decline as a surrogate end point in kidney disease progression trials

2016-09-02T00:05:38-07:00

Chronic kidney disease (CKD) is strongly associated with increased risks of progression to end-stage kidney disease (ESKD) and mortality. Clinical trials evaluating CKD progression commonly use a composite end point of death, ESKD or serum creatinine doubling. However, due to low event rates, such trials require large sample sizes and long-term follow-up for adequate statistical power. As a result, very few interventions targeting CKD progression have been tested in randomized controlled trials. To overcome this problem, the National Kidney Foundation and Food and Drug Administration conducted a series of analyses to determine whether an end point of 30 or 40% decline in estimated glomerular filtration rate (eGFR) over 2–3 years can substitute for serum creatinine doubling in the composite end point. These analyses demonstrated that these alternate kidney end points were significantly associated with subsequent risks of ESKD and death. However, the association between eGFR decline and clinical end points, and the consistency of treatment effects on both, were influenced by baseline eGFR, follow-up duration and acute hemodynamic effects. The investigators concluded that a 40% eGFR decline is broadly acceptable as a kidney end point across a wide baseline eGFR range and that a 30% eGFR decline may be acceptable in some situations. Although these alternate kidney end points could potentially allow investigators to conduct shorter-duration clinical trials with smaller sample sizes, thereby generating evidence to guide clinical decision-making in a timely manner, it is uncertain whether these end points will improve trial efficiency and feasibility. This review critically appraises the evidence, strengths and limitations pertaining to eGFR end points.
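
For orientation, the surrogate end point discussed above is simply a relative decline in eGFR between baseline and follow-up; for example, a fall from 60 to 36 mL/min/1.73 m2 meets the 40% threshold:

```latex
\[
\text{relative eGFR decline} \;=\; \frac{\mathrm{eGFR}_{\mathrm{baseline}} - \mathrm{eGFR}_{\mathrm{follow\text{-}up}}}{\mathrm{eGFR}_{\mathrm{baseline}}},
\qquad
\frac{60 - 36}{60} = 0.40 \;(40\%).
\]
```

For comparison, serum creatinine doubling is commonly equated with an eGFR decline of roughly 57%, which is why the smaller 30-40% declines allow events to accrue sooner.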




Changes in inflammatory biomarkers after renal revascularization in atherosclerotic renal artery stenosis

2016-09-02T00:05:38-07:00

Background

Atherosclerotic renal artery stenosis (ARAS) activates oxidative stress and chronic inflammatory injury. Contrast imaging and endovascular stenting pose potential hazards for acute kidney injury, particularly when superimposed upon reduced kidney perfusion.

Methods

We measured sequential early and long-term changes in circulating inflammatory and injury biomarkers in 12 ARAS subjects subjected to computed tomography imaging and stent revascularization compared with essential hypertensive (EH) subjects of similar age under fixed sodium intake and medication regimens in a clinical research unit.

Results

NGAL, TIMP-2, IGFBP7, MCP-1 and TNF-α were all elevated before intervention. Post-stenotic kidney volume, perfusion, blood flow and glomerular filtration rate (GFR) were lower in ARAS than in EH subjects. TIMP-2 and IGFBP7 fell briefly, then rose over 18 h after contrast imaging and stent deployment. Circulating NGAL decreased and remained lower for 27 h. These biomarkers in ARAS returned to baseline after 3 months, while kidney volume, perfusion, blood flow and GFR increased, but remained lower than in EH subjects.

Conclusions

These divergent patterns of inflammatory signals are consistent with cell cycle arrest (TIMP-2, IGFBP7) and relative protection from acute kidney injury after imaging and stenting. Sustained basal elevation of circulating and renal venous inflammatory biomarkers supports ongoing, possibly episodic, renal stress in ARAS that limits toxicity from stent revascularization.




Metallothioneins and renal ageing

2016-09-02T00:05:38-07:00

Background

Human lifespan is increasing continuously and about one-third of the population >70 years of age suffers from chronic kidney disease. The pathophysiology of the loss of renal function with ageing is unclear.

Methods

We determined age-associated gene expression changes in zero-hour biopsies of deceased donor kidneys without laboratory signs of impaired renal function, defined as a last serum creatinine >0.96 mg/dL in females and >1.18 mg/dL in males, using microarray technology and the Significance Analysis of Microarrays routine. Expression changes of selected genes were confirmed by quantitative polymerase chain reaction and in situ hybridization and immunohistochemistry for localization of respective mRNA and protein. Functional aspects were examined in vitro.

Results

Donors were classified into three age groups (<40, 40–59 and >59 years; Groups 1, 2 and 3, respectively). In Group 3 especially, genes encoding metallothionein (MT) isoforms were expressed at significantly higher levels than in Group 1; localization studies revealed predominant staining in renal proximal tubular cells. RPTEC/TERT1 cells overexpressing MT2A were less susceptible to cadmium chloride–induced cytotoxicity and hypoxia-induced apoptosis, both models for increased generation of reactive oxygen species.

Conclusions

Increased expression of MTs in the kidney with ageing might be a protective mechanism against increased oxidative stress, which is closely related to the ageing process. Our findings indicate that MTs are functionally involved in the pathophysiology of ageing-related processes.




Extended versus standard azathioprine maintenance therapy in newly diagnosed proteinase-3 anti-neutrophil cytoplasmic antibody-associated vasculitis patients who remain cytoplasmic anti-neutrophil cytoplasmic antibody-positive after induction of remission: a randomized clinical trial

2016-09-02T00:05:38-07:00

Background

Cytoplasmic anti-neutrophil cytoplasmic antibody (C-ANCA) positivity at remission has been associated with an increased relapse rate in patients with proteinase 3 anti-neutrophil cytoplasmic antibody-associated vasculitis (PR3-AAV) after a switch to azathioprine maintenance therapy. We therefore hypothesized that extended azathioprine maintenance therapy could reduce the incidence of relapse in this setting.

Methods

Patients newly diagnosed with PR3-AAV at 12 centres in The Netherlands during 2003–11 who received a standardized induction regimen consisting of oral cyclophosphamide and corticosteroids were enrolled (n = 131). Patients were randomized to standard or extended azathioprine maintenance therapy when C-ANCA was positive at the time of stable remission. Standard maintenance treatment consisted of azathioprine (1.5–2.0 mg/kg) until 1 year after diagnosis and subsequent tapering to 25 mg every 3 months. Extended azathioprine maintenance therapy (1.5–2.0 mg/kg) was continued until 4 years after diagnosis and tapered thereafter. The primary endpoint was relapse-free survival at 4 years after diagnosis.

Results

In patients with PR3-AAV who were C-ANCA positive at the time of stable remission, relapse-free survival at 4 years after diagnosis did not differ significantly between standard azathioprine (n = 24) and extended azathioprine (n = 21) maintenance therapy (P = 0.40). There was also no significant difference in relapse-free survival between patients receiving standard azathioprine (n = 106) versus extended azathioprine maintenance therapy (n = 21; P = 0.94). In addition, there was no difference in the relapse rate between patients with PR3-AAV who were C-ANCA positive (n = 45) at the time of remission versus patients who became C-ANCA negative at the time of remission (n = 82; P = 0.6[...]



Relationship of proximal tubular injury to chronic kidney disease as assessed by urinary kidney injury molecule-1 in five cohort studies

2016-09-02T00:05:38-07:00

Background

The primary biomarkers used to define CKD are serum creatinine and albuminuria. These biomarkers have directed focus on the filtration and barrier functions of the kidney glomerulus even though albuminuria results from tubule dysfunction as well. Given that proximal tubules make up ~90% of kidney cortical mass, we evaluated whether a sensitive and specific marker of proximal tubule injury, urinary kidney injury molecule-1 (KIM-1), is elevated in individuals with CKD or with risk factors for CKD.

Methods

We measured urinary KIM-1 in participants of five cohort studies from the USA and Sweden. Participants had a wide range of kidney function and were racially and ethnically diverse. Multivariable linear regression models were used to test the association of urinary KIM-1 with demographic, clinical and laboratory values.

Results

In pooled, multivariable-adjusted analyses, log-transformed, creatinine-normalized urinary KIM-1 levels were higher in those with lower eGFR {β = –0.03 per 10 mL/min/1.73 m2 [95% confidence interval (CI) –0.05 to –0.02]} and greater albuminuria [β = 0.16 per unit of log albumin:creatinine ratio (95% CI 0.15–0.17)]. Urinary KIM-1 levels were higher in current smokers, lower in blacks than nonblacks and lower in users versus nonusers of angiotensin-converting enzyme inhibitors and angiotensin receptor blockers.

Conclusion

Proximal tubule injury appears to be an integral and measurable element of multiple stages of CKD.
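
As an illustration only (not the study's analysis code), the following sketch shows the kind of multivariable linear model described above: log-transformed, creatinine-normalized urinary KIM-1 regressed on eGFR (per 10 mL/min/1.73 m2), log albumin:creatinine ratio and illustrative covariates. The data are simulated and all variable names are hypothetical.

```python
# Minimal sketch (not the cohort analysis): multivariable linear regression of
# log-transformed, creatinine-normalized urinary KIM-1 on eGFR and log(ACR).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
egfr = rng.normal(75, 20, n)           # mL/min/1.73 m2 (simulated)
log_acr = rng.normal(2.5, 1.0, n)      # log albumin:creatinine ratio (simulated)
age = rng.normal(60, 10, n)
smoker = rng.integers(0, 2, n)
# Simulated outcome with effect sizes in the spirit of the abstract
log_kim1 = 6.0 - 0.03 * (egfr / 10) + 0.16 * log_acr + 0.2 * smoker + rng.normal(0, 0.5, n)

df = pd.DataFrame({"log_kim1": log_kim1, "egfr_per10": egfr / 10.0,
                   "log_acr": log_acr, "age": age, "smoker": smoker})

# The egfr_per10 coefficient is the change in log KIM-1 per 10 mL/min/1.73 m2 of eGFR;
# the log_acr coefficient is the change per unit of log ACR.
fit = smf.ols("log_kim1 ~ egfr_per10 + log_acr + age + smoker", data=df).fit()
print(fit.params)
print(fit.conf_int())
```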




Individual long-term albuminuria exposure during angiotensin receptor blocker therapy is the optimal predictor for renal outcome

2016-09-02T00:05:38-07:00

Background

Albuminuria reduction due to angiotensin receptor blockers (ARBs) predicts subsequent renoprotection. Relating the initial albuminuria reduction to subsequent renoprotection assumes that the initial ARB-induced albuminuria reduction remains stable during follow-up. The aim of this study was to assess individual albuminuria fluctuations after the initial ARB response and to determine whether taking individual albuminuria fluctuations into account improves renal outcome prediction.

Methods

Patients with diabetes and nephropathy treated with losartan or irbesartan in the RENAAL and IDNT trials were included. Patients with >30% reduction in albuminuria 3 months after ARB initiation were stratified by the subsequent change in albuminuria until Month 12 into enhanced responders (>50% albuminuria reduction), sustained responders (20–50% reduction) and response escapers (<20% reduction). The predictive performance of the individual albuminuria exposure until Month 3 was compared with that of the exposure over the first 12 months using receiver operating characteristic (ROC) curves.

Results

Following ARB initiation, 388 (36.3%) patients showed a >30% reduction in albuminuria. Among these patients, the albuminuria level further decreased in 174 (44.8%), remained stable in 123 (31.7%) and increased in 91 (23.5%) patients. Similar albuminuria fluctuations were observed in patients with <30% albuminuria reduction. Renal risk prediction improved when the albuminuria exposure during the first 12 months was used instead of the initial Month 3 change [area under the ROC curve: 0.78 (95% CI 0.75–0.82) versus 0.68 (0.64–0.72); P < 0.0001].

Conclusions

Following the initial response to ARBs, a large within-patient albuminuria variability is observed. Hence, inco[...]
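
As a hedged illustration of the comparison described above (not the trial's own code), the sketch below computes areas under the ROC curve for a short-term versus a longer-term albuminuria exposure as predictors of a binary renal endpoint, using simulated data and hypothetical variable names.

```python
# Illustrative only (not the RENAAL/IDNT analysis): comparing the discrimination of
# a Month 3 versus a Month 0-12 albuminuria exposure for a binary renal endpoint.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 500
exposure_12m = rng.normal(0.0, 1.0, n)                # hypothetical 12-month exposure (standardized)
exposure_3m = exposure_12m + rng.normal(0.0, 1.0, n)  # noisier early snapshot of the same signal

# Simulated renal endpoint driven mainly by the 12-month exposure
p = 1 / (1 + np.exp(-(-1.0 + 1.5 * exposure_12m)))
endpoint = rng.binomial(1, p)

auc_3m = roc_auc_score(endpoint, exposure_3m)
auc_12m = roc_auc_score(endpoint, exposure_12m)
print(f"AUC, Month 3 exposure:    {auc_3m:.2f}")
print(f"AUC, Month 0-12 exposure: {auc_12m:.2f}")
```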



Use of urine biomarker-derived clusters to predict the risk of chronic kidney disease and all-cause mortality in HIV-infected women

2016-09-02T00:05:38-07:00

Background

Although individual urine biomarkers are associated with chronic kidney disease (CKD) incidence and all-cause mortality in the setting of HIV infection, their combined utility for prediction remains unknown.

Methods

We measured eight urine biomarkers previously shown to be associated with incident CKD and mortality risk among 902 HIV-infected women in the Women's Interagency HIV Study: N-acetyl-β-d-glucosaminidase (NAG), kidney injury molecule-1 (KIM-1), alpha-1 microglobulin (α1m), interleukin 18, neutrophil gelatinase-associated lipocalin, albumin-to-creatinine ratio, liver fatty acid-binding protein and α-1-acid glycoprotein. A group-based cluster method classified participants into three distinct clusters using the three most distinguishing biomarkers (NAG, KIM-1 and α1m), independent of the study outcomes. We then evaluated associations of each cluster with incident CKD (estimated glomerular filtration rate <60 mL/min/1.73 m2 by cystatin C) and all-cause mortality, adjusting for traditional and HIV-related risk factors.

Results

Over 8 years of follow-up, 177 CKD events and 128 deaths occurred. The clustering partitioned women into three groups containing 301 (Cluster 1), 470 (Cluster 2) and 131 (Cluster 3) participants. CKD incidence was 13, 21 and 50% across the three clusters; mortality rates were 7.3, 13 and 34%. After multivariable adjustment, Cluster 3 remained associated with a nearly 3-fold increased risk of both CKD and mortality relative to Cluster 1 (both P < 0.001). The addition of the multi-biomarker cluster to the multivariable model improved discrimination for CKD (c-statistic 0.72 to 0.76, P = 0.0029), but only modestly for mortality (c = 0.79–0.8[...]
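
A minimal sketch, on simulated data, of the general workflow described above: cluster participants on three urine biomarkers and compare outcome rates across clusters. k-means is used here purely as a simple stand-in for the group-based cluster method the authors applied; all values and names are hypothetical.

```python
# Minimal sketch (not the WIHS analysis): cluster participants on three urine
# biomarkers, then tabulate CKD incidence by cluster. Data are simulated.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n = 900
df = pd.DataFrame({
    "NAG": rng.lognormal(1.0, 0.5, n),
    "KIM1": rng.lognormal(6.0, 0.7, n),
    "A1M": rng.lognormal(2.0, 0.6, n),
})

# Log-transform and standardize the biomarkers before clustering
X = StandardScaler().fit_transform(np.log(df[["NAG", "KIM1", "A1M"]]))
df["cluster"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Hypothetical incident-CKD indicator, more likely at higher biomarker levels
risk = 1 / (1 + np.exp(-(X.sum(axis=1) - 2.0)))
df["incident_ckd"] = rng.binomial(1, risk)

# Proportion with incident CKD in each cluster
print(df.groupby("cluster")["incident_ckd"].mean())
```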



PLA2R antibodies, glomerular PLA2R deposits and variations in PLA2R1 and HLA-DQA1 genes in primary membranous nephropathy in South Asians

2016-09-02T00:05:38-07:00

Background

Antibodies to the M-type phospholipase A2 receptor (PLA2R) correlate with clinical activity of primary membranous nephropathy (PMN). Risk alleles in the PLA2R1 and HLA-DQA1 genes are associated with PMN. Whether these alleles are associated with the development of anti-PLA2R is unknown. In this prospective study we evaluated anti-PLA2R, enhanced glomerular staining for PLA2R and variations in the PLA2R1 and HLA-DQA1 genes in Indian patients with PMN and examined their association with response to treatment.

Methods

A total of 114 adult PMN patients were studied. Anti-PLA2R was estimated before treatment and after 6 and 12 months of therapy. Enhanced glomerular staining for PLA2R was assessed on fresh frozen tissue. Genotype analysis was performed in the recruited patients and in 95 healthy controls by TaqMan assays for six single-nucleotide polymorphisms (SNPs; rs4664308, rs3749119, rs3749117, rs4664308, rs3828323 and rs2187668). Patients were followed up monthly for a period of 12 months.

Results

Of the 114 patients, 66.7% showed elevated serum anti-PLA2R by ELISA and 64.9% by indirect immunofluorescence. About 75% had enhanced glomerular staining for PLA2R, and 82% of patients had PLA2R-related disease. A reduction in serum anti-PLA2R titre was significantly associated with remission of nephrotic syndrome (P = 0.0003) at 6 and 12 months. More than 85% of patients showing a >90% reduction in anti-PLA2R titre achieved remission of the nephrotic state, whereas among those showing a <50% reduction in titres, 87.5% had a persistent nephrotic state. The SNPs rs3749119, rs3749117 and rs4664308 in PLA2R1 and rs2187668 in HLA-DQA1 were significantly associated with PMN. The SNP rs2187668 was associated with anti-PLA2R positivity. Pati[...]



Fibroblast growth factor 23 correlates with volume status in haemodialysis patients and is not reduced by haemodialysis

2016-09-02T00:05:38-07:00

Background

Recent data suggest a role for fibroblast growth factor 23 (FGF-23) in volume regulation. In haemodialysis patients, a large ultrafiltration volume (UFV) reflects poor volume control, and both FGF-23 and a large UFV are risk factors for mortality in this population. We studied the association between FGF-23 and markers of volume status, including UFV, as well as the intradialytic course of FGF-23, in a cohort of haemodialysis patients.

Methods

We carried out an observational, post hoc analysis of 109 prevalent haemodialysis patients who underwent a standardized, low-flux haemodialysis session with a constant ultrafiltration rate. We measured UFV, plasma copeptin and echocardiographic parameters, including cardiac output, end-diastolic volume and left ventricular mass index, at the onset of the haemodialysis session. We measured the intradialytic course of plasma C-terminal FGF-23 (corrected for haemoconcentration) and serum phosphate levels at 0, 1, 3 and 4 h after the onset of haemodialysis and analysed changes with a linear mixed-effects model.

Results

Median age was 66 (interquartile range 51–75) years, 65% of patients were male, weekly Kt/V was 4.3 ± 0.7 and dialysis vintage was 25.4 (8.5–52.5) months. In univariable analysis, pre-dialysis plasma FGF-23 was associated with UFV, end-diastolic volume, cardiac output, early diastolic velocity e' and plasma copeptin. In multivariable regression analysis, UFV correlated with FGF-23 (standardized β: 0.373, P < 0.001, model R2: 57%), independent of serum calcium and phosphate. The association between FGF-23 and echocardiographic volume markers was lost for all but cardiac output upon adjustment for UFV. Overall, [...]
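
A minimal sketch, on simulated data, of a linear mixed-effects analysis of the intradialytic FGF-23 course with a random intercept per patient, in the spirit of the Methods above. The haemoconcentration correction factor, effect sizes and column names are hypothetical assumptions, not values from the study.

```python
# Illustrative sketch (not the study code): mixed-effects model of intradialytic FGF-23.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
hours = np.array([0.0, 1.0, 3.0, 4.0])   # sampling times after dialysis onset
rows = []
for pid in range(30):                     # 30 hypothetical patients
    baseline = rng.lognormal(7.0, 0.8)    # pre-dialysis C-terminal FGF-23 (arbitrary units)
    for t in hours:
        raw = baseline * np.exp(0.02 * t + rng.normal(0, 0.05))
        hct_factor = 1.0 + 0.03 * t       # hypothetical haemoconcentration correction
        rows.append({"patient": pid, "hour": t, "fgf23": raw / hct_factor})

df = pd.DataFrame(rows)
df["log_fgf23"] = np.log(df["fgf23"])

# Random intercept per patient; fixed effect of time since the start of dialysis
model = smf.mixedlm("log_fgf23 ~ hour", data=df, groups=df["patient"]).fit()
print(model.summary())
```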



Mortality trends among Japanese dialysis patients, 1988-2013: a joinpoint regression analysis

2016-09-02T00:05:38-07:00

Background

Evaluation of mortality trends in dialysis patients is important for improving their prognoses. The present study aimed to examine temporal trends in deaths (all-cause, cardiovascular, noncardiovascular and the five leading causes) among Japanese dialysis patients.

Methods

Mortality data were extracted from the Japanese Society for Dialysis Therapy registry. Age-standardized mortality rates were calculated by direct standardization against the 2013 dialysis population. The average annual percentage change (APC) and the corresponding 95% confidence interval (CI) were computed for trends using joinpoint regression analysis.

Results

A total of 469 324 deaths occurred, of which 25.9% were from cardiac failure, 17.5% from infectious disease, 10.2% from cerebrovascular disorders, 8.6% from malignant tumors and 5.6% from cardiac infarction. All-cause mortality declined significantly, with an APC of –3.7% (95% CI –4.2 to –3.2) per year from 1988 through 2000, and then declined more gradually, with an APC of –1.4% (95% CI –1.7 to –1.2) per year, during 2000–13. The improved mortality rates were mainly due to decreased deaths from cardiovascular disease, with mortality rates from noncardiovascular disease exceeding those from cardiovascular disease in the last decade. Among the top five causes of death, cardiac failure showed a marked decrease in mortality rate, whereas rates of death from infectious disease remained stable during the study period [APC 0.1 (95% CI –0.2 to 0.3)].

Conclusions

Significant progress has been made, particularly with regard to the decrease in age-standardized mortality rates.[...]
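
For illustration only (not the registry analysis), the sketch below shows direct age standardization of a mortality rate against a reference population and a simple annual percentage change derived from a log-linear trend; joinpoint regression additionally searches for trend change points, which is omitted here. All numbers are made up.

```python
# Illustrative sketch: direct age standardization and annual percent change (APC).
import numpy as np

# Age-specific deaths and person-years for one calendar year (hypothetical age bands)
deaths       = np.array([120, 450, 900, 1500])
person_years = np.array([40000, 60000, 50000, 30000])
# Weights of the reference population, e.g. the 2013 dialysis population
std_weights  = np.array([0.15, 0.30, 0.35, 0.20])

age_specific_rates = deaths / person_years
standardized_rate = np.sum(age_specific_rates * std_weights)
print(f"Age-standardized mortality rate: {standardized_rate * 1000:.1f} per 1000 patient-years")

# APC from a log-linear fit of yearly rates (hypothetical declining series)
years = np.arange(1988, 2001)
rates = 0.09 * np.exp(-0.037 * (years - 1988))
slope = np.polyfit(years, np.log(rates), 1)[0]
apc = (np.exp(slope) - 1) * 100
print(f"APC: {apc:.1f}% per year")   # about -3.6% per year for these inputs
```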



Phosphorus metabolism in peritoneal dialysis- and haemodialysis-treated patients

2016-09-02T00:05:38-07:00

Background

Phosphorus control is generally considered to be better in peritoneal dialysis (PD) patients than in haemodialysis (HD) patients. Predialysis phosphorus concentrations are misleading as a measure of phosphorus exposure in HD, as they neglect significant dialysis-related fluctuations.

Methods

Parameters of mineral metabolism, including parathyroid hormone (PTH) and fibroblast growth factor-23 (FGF-23), were determined in 79 HD and 61 PD patients. In PD, phosphorus levels were determined mid-morning. In HD, time-averaged phosphorus concentrations were modelled from measurements before and after the mid-week dialysis session. Weekly renal, dialytic and total phosphorus clearances, as well as total phosphorus mass removal, were calculated from urine and dialysate collections.

Results

Time-averaged serum phosphorus concentrations in HD (3.5 ± 1.0 mg/dL) were significantly lower than the mid-morning concentrations in PD (5.0 ± 1.4 mg/dL, P < 0.0001). In contrast, predialysis phosphorus concentrations (4.6 ± 1.4 mg/dL) did not differ from those in PD. PTH and FGF-23 levels were significantly higher in PD. Despite higher residual renal function, total phosphorus clearance was significantly lower in PD (P < 0.0001). Total phosphorus mass removal, conversely, was significantly higher in PD (P < 0.05).

Conclusions

Our data suggest that time-averaged phosphorus concentrations are higher in patients treated with PD than in patients treated with HD. Despite better preserved renal function, total phosphorus clearance is lower in patients treated with PD. Additional studies are needed to confirm these [...]



High-urgency kidney transplantation in the Eurotransplant Kidney Allocation System: success or waste of organs? The Eurotransplant 15-year all-centre survey

2016-09-02T00:05:38-07:00

Background

In the Eurotransplant Kidney Allocation System (ETKAS), transplant candidates can be considered for high-urgency (HU) status in case of a life-threatening inability to undergo renal replacement therapy. Data on the outcomes of HU transplantation are sparse and the benefit is controversial.

Methods

We systematically analysed data from 898 Eurotransplant HU kidney transplant recipients from 61 transplant centres between 1996 and 2010 and investigated 5-year patient and graft outcomes and differences between relevant subgroups.

Results

Kidney recipients with HU status were younger (median 43 versus 55 years) and spent less time on the waiting list than non-HU recipients (34 versus 54 months). They received grafts with significantly more mismatches (mean 3.79 versus 2.42; P < 0.001) and the percentage of retransplantations was markedly higher (37.5 versus 16.7%). Patient survival (P = 0.0053) and death with a functioning graft (DwFG; P < 0.0001) after HU transplantation were significantly worse than in non-HU recipients, whereas graft outcome was comparable (P = 0.094). Analysis according to the different HU indications revealed that recipients listed as HU because of an imminent lack of access for dialysis had significantly worse patient survival (P = 0.0053) and DwFG (P = 0.0462) than recipients listed because of psychological problems and suicidality due to dialysis. In addition, retransplantation had a negative impact on patient and graft outcomes.

Conclusions

Facing organ shortages, increasing wait times and considerable mortality on dialysis, we question the current policy of HU allocat[...]



Estimated nephron number of the remaining donor kidney: impact on living kidney donor outcomes

2016-09-02T00:05:38-07:00

Background

It has been demonstrated that low birth weight gives rise to a reduced nephron number, with increased risks of hypertension and renal disease. Its impact on renal function in kidney donors, however, has not been addressed.

Methods

To investigate the impact of birth weight, kidney weight, kidney volume and estimated nephron number on kidney function, we collected data from 91 living kidney donors before nephrectomy and at +12, +36 and +60 months after nephrectomy.

Results

Birth weight showed a positive correlation with estimated glomerular filtration rate (eGFR) at +12, +36 and +60 months after nephrectomy (P < 0.05). The strongest link was observed in donors >50 years old (R = 0.535, P < 0.001 at +12 months). Estimated nephron number and eGFR showed a strong positive correlation at +12, +36 and +60 months after nephrectomy (R = 0.540, 0.459 and 0.506, respectively; P < 0.05). Daily proteinuria at +12 months showed a negative correlation with birth weight (P = 0.009). Donors with new-onset hypertension had significantly lower birth weights and higher uric acid levels (P < 0.05). Kidney weight and volume had no impact on donor outcomes (P > 0.05).

Conclusions

A low nephron number predisposes donors to a lower remaining eGFR, hypertension and proteinuria. The stronger correlation in older donors may be attributable to a reduced renal functional reserve resulting from the age-related decline in renal function.




'I feel stronger and younger all the time' – perspectives of elderly kidney transplant recipients: thematic synthesis of qualitative research

2016-09-02T00:05:38-07:00

Background

Kidney transplantation offers improved survival and quality of life to an increasing number of elderly patients with end-stage kidney disease. However, elderly kidney transplant recipients may face unique challenges due to a higher burden of comorbidity, greater cumulative risk of immunosuppression-related complications and increasing frailty. We aimed to describe the perspectives of elderly kidney transplant recipients.

Methods

Electronic databases were searched to April 2015. Qualitative studies were eligible if they reported views from elderly kidney transplant recipients (≥60 years). Thematic synthesis was used to analyse the findings.

Results

Twenty-one studies involving >116 recipients were included. We identified seven themes. ‘Regaining strength and vitality’ meant valuing the physical and psychosocial improvements in daily functioning and life participation. ‘Extending life’ was the willingness to accept any organ, including extended criteria kidneys, to prolong survival. ‘Debt of gratitude’ entailed conscious appreciation toward their donor while knowing they were unable to repay their sacrifice. ‘Moral responsibility to maintain health’ motivated adherence to medication and lifestyle recommendations out of an ethical duty to protect their gift for graft survival. ‘Unabating and worsening forgetfulness’ hindered self-management. ‘Disillusionment with side effects and complications’ reflected disappointment and exasperation with the unintended consequences o[...]



Next generation sequencing and functional analysis of patient urine renal progenitor-derived podocytes to unravel the diagnosis underlying refractory lupus nephritis

2016-09-02T00:05:38-07:00

Often the cause of refractory lupus nephritis (RLN) remains unclear. We performed next-generation sequencing of podocyte genes in an RLN patient and identified compound heterozygosity for the APOL1 risk alleles G1 and G2 and a novel homozygous c.[1049C>T]+[1049C>T] NPHS1 gene variant of unknown significance. To test for causality, renal progenitor cells isolated from the urine of this patient were differentiated into podocytes in vitro. These podocytes showed aberrant nephrin trafficking, an altered cytoskeletal structure, lysosomal leakage and increased detachment compared with podocytes isolated from controls. Thus, lupus podocytopathy can be confirmed as a cause of RLN by functional genetics on patient-derived podocytes.




Announcements

2016-09-02T00:05:38-07:00