Subscribe: Nephrology Dialysis Transplantation - current issue
http://ndt.oxfordjournals.org/rss/current.xml
Language: English
Tags: ckd, clinical, creatinine, dialysis, disease, donor, kidney disease, kidney, patients, rate, renal, risk, studies, study

Nephrology Dialysis Transplantation Current Issue





Published: Thu, 02 Nov 2017 00:00:00 GMT

Last Build Date: Thu, 02 Nov 2017 07:50:12 GMT

 



Announcements

2017-11-02

News from ERA-EDTA:






Social media in medicine: a game changer?

2017-10-12

medicine and nephrology, social media, social network



The association of donor and recipient age with graft survival in paediatric renal transplant recipients in a European Society for Paediatric Nephrology/European Renal Association–European Dialysis and Transplantation Association Registry study

2017-09-15

Abstract
Background
The impact of donor age in paediatric kidney transplantation is unclear. We therefore examined the association of donor–recipient age combinations with graft survival in children.
Methods
Data for 4686 first kidney transplantations performed in 13 countries in 1990–2013 were extracted from the ESPN/ERA-EDTA Registry. The effect of donor and recipient age combinations on 5-year graft-failure risk, stratified by donor source, was estimated using Kaplan–Meier survival curves and Cox regression, while adjusting for sex, primary renal diseases with a high risk of recurrence, pre-emptive transplantation, year of transplantation and country.
Results
The risk of graft failure in older living donors (50–75 years old) was similar to that of younger living donors {adjusted hazard ratio [aHR] 0.74 [95% confidence interval (CI) 0.38–1.47]}. Deceased donor (DD) age was non-linearly associated with graft survival, with the highest risk of graft failure found in the youngest donor age group [0–5 years; compared with donor ages 12–19 years; aHR 1.69 (95% CI 1.26–2.26)], especially among the youngest recipients (0–11 years). DD age had little effect on graft failure in recipients aged 12–19 years.
Conclusions
Our results suggest that donations from older living donors provide excellent graft outcomes in all paediatric recipients. For young recipients, the allocation of DDs over the age of 5 years should be prioritized.
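The Kaplan–Meier curves used in the Methods can be illustrated with a minimal sketch; the following stdlib-Python function and the (time, event) data it runs on are hypothetical, not the registry analysis itself, which used formal survival software with Cox adjustment:

```python
# Minimal Kaplan-Meier survival estimator (illustrative sketch only).
def kaplan_meier(times, events):
    """times: follow-up in years; events: 1 = graft failure, 0 = censored.
    Returns [(event time, survival probability)] pairs."""
    data = sorted(zip(times, events))
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)    # failures at time t
        n_t = sum(1 for tt, _ in data if tt >= t)  # number still at risk at t
        if d > 0:
            surv *= 1 - d / n_t                    # multiply by conditional survival
            curve.append((t, surv))
        i += sum(1 for tt, _ in data if tt == t)   # skip past all subjects at time t
    return curve

# Hypothetical example: 6 grafts, failures at years 1 and 3, rest censored
curve = kaplan_meier([1, 2, 3, 4, 5, 5], [1, 0, 1, 0, 0, 0])
```

The survival estimate drops only at observed failure times; censored subjects leave the risk set without changing the curve.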



Effect of human leukocyte antigen mismatching on the outcomes of pediatric kidney transplantation: a systematic review and meta-analysis

2017-08-23

Abstract
Background
Kidney transplantation is regarded as the optimal treatment for pediatric patients with end-stage renal disease. Here, we address a controversial topic in pediatric kidney transplantation by performing a quantitative evaluation of the effect of human leukocyte antigen (HLA) mismatching on the outcomes of pediatric kidney transplantation.
Methods
We systematically searched PubMed, EMBASE and the Cochrane Library from their inception to 31 December 2016 for cohort studies assessing the risk ratio (RR) of HLA mismatching on pediatric kidney transplantation. Outcome measures included graft failure, rejection and all-cause mortality. RRs and 95% confidence intervals (CIs) were used as estimates of effect size in random-effect models.
Results
Eighteen studies comprising a total of 26 018 pediatric recipients were included in the evaluation. Compared with 0–1 HLA-DR mismatch, 2 mismatches significantly increased the risk of graft failure at 1 year (RR: 1.41, 95% CI: 1.11–1.80), 3 years (RR: 1.28, 95% CI: 1.08–1.52), 5 years (RR: 1.21, 95% CI: 1.04–1.41) and 10 years (RR: 1.30, 95% CI: 1.02–1.67). For HLA-A + B, the 5-year graft failure risk was higher for 2–4 mismatches compared with 0–1 mismatch (RR: 3.17, 95% CI: 1.20–8.36), but not for 3–4 compared with 0–2 mismatches (RR: 1.49, 95% CI: 0.79–2.80).
Conclusions
Based on pooled analysis, HLA-DR and HLA-A + B are important factors affecting post-transplant outcomes, especially graft failure, in pediatric recipients. Additional randomized controlled trials with higher quality evidence are needed for further investigation.
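The random-effects pooling of risk ratios described in the Methods can be sketched with the standard DerSimonian–Laird estimator; this stdlib-Python illustration uses made-up study data, not values from the meta-analysis above:

```python
import math

def pool_rr_random_effects(rrs, ci_lows, ci_highs):
    """DerSimonian-Laird random-effects pooling of risk ratios.
    Study-level standard errors are recovered from the 95% CI widths on the log scale."""
    y = [math.log(rr) for rr in rrs]                       # log risk ratios
    se = [(math.log(hi) - math.log(lo)) / (2 * 1.96)       # SE(log RR) from CI
          for lo, hi in zip(ci_lows, ci_highs)]
    w = [1 / s**2 for s in se]                             # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar)**2 for wi, yi in zip(w, y))   # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)                # between-study variance
    w_star = [1 / (s**2 + tau2) for s in se]               # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    se_pooled = math.sqrt(1 / sum(w_star))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

# Hypothetical three-study example
rr, lo, hi = pool_rr_random_effects([1.4, 1.2, 1.3], [1.1, 0.9, 1.0], [1.8, 1.6, 1.7])
```

When between-study heterogeneity (tau²) is zero, the weights reduce to the fixed-effect case; heterogeneity widens the pooled CI.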



Does Kidney Donor Risk Index implementation lead to the transplantation of more and higher-quality donor kidneys?

2017-08-21

Abstract
Background
The Kidney Donor Risk Index (KDRI) is a quantitative evaluation of the quality of donor organs and is implemented in the US allocation system. This single-centre study investigates whether the implementation of the KDRI in our decision-making process to accept or decline an offered deceased donor kidney increases our acceptance rate.
Methods
From April 2015 until December 2016, we prospectively calculated the KDRI for all deceased donor kidney offers allocated by Eurotransplant to our centre. The numbers of transplanted versus declined kidney offers during the study period were compared with a historical set of donor kidney offers.
Results
After implementation of the KDRI, 26.1% (75/288) of all offered donor kidneys were transplanted, compared with 20.7% (136/657) in the previous period (P < 0.001). The median KDRI of all transplanted donor kidneys during the second period was 0.97 [Kidney Donor Profile Index (KDPI) 47%], a value significantly higher than the median KDRI of 0.85 (KDPI 34%) during the first period (P = 0.047). A total of 68% of patients for whom a first-offered donor kidney was declined during this period were transplanted after a median waiting time of 386 days, mostly with a lower KDRI donor kidney.
Conclusions
Implementing the KDRI in our decision-making process increased the transplantation rate by 26%. The KDRI can be a supportive tool when considering whether to accept or decline a deceased donor kidney offer. More data are needed to validate this score in other European centres.
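The KDRI-to-KDPI conversion quoted in the Results (e.g. KDRI 0.97 ≈ KDPI 47%) is essentially a percentile lookup against a reference donor population; in practice the OPTN publishes an annual mapping table for this. A hedged stdlib-Python sketch with an entirely hypothetical reference distribution:

```python
from bisect import bisect_right

def kdri_to_kdpi(kdri, reference_kdris):
    """Map a KDRI value to a KDPI percentile: the percentage of donors in the
    reference population with a KDRI at or below this donor's value.
    Illustrative only; real allocation uses the published OPTN mapping table."""
    ranked = sorted(reference_kdris)
    return 100 * bisect_right(ranked, kdri) / len(ranked)

# Hypothetical reference population of 10 donor KDRI scores
reference = [0.6, 0.7, 0.8, 0.85, 0.9, 0.97, 1.1, 1.3, 1.5, 1.8]
kdpi = kdri_to_kdpi(0.97, reference)  # percentile rank of a KDRI-0.97 donor
```

A higher KDPI therefore means a donor kidney of lower expected quality relative to the reference year's donor pool.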



Long-term outcome in 145 patients with assumed benign immunoglobulin A nephropathy

2017-08-17

Abstract
Background
Patients with immunoglobulin A nephropathy (IgAN) who present with mild to moderate proteinuria and normal renal function are assumed to have excellent short-term renal prognosis, but the long-term prognosis is uncertain.
Methods
Patients were selected from the Norwegian Kidney Biopsy Registry based on the following criteria: diagnostic renal biopsy performed in the period 1988–99, with estimated glomerular filtration rate (eGFR) ≥60 mL/min/1.73 m2 and proteinuria <1 g/24 h at the time of biopsy. Patients were invited for a nephrological examination with a review of medical history and investigation of blood pressure, urinary findings and eGFR.
Results
A total of 145 patients attended the examination, performed by the first author, a median of 22 (interquartile range 19–25) years after diagnosis. At the examination, 27 patients (18.6%) had a ≥50% decrease in GFR, of whom 4 (2.8%) had developed end-stage renal disease (ESRD). The mean duration from renal biopsy to a ≥50% decrease in GFR was 17.3 ± 5.1 years in our cohort. Clinical remission was observed in 42 (29.0%) patients. Renal biopsies were re-examined utilizing the Oxford classification criteria. Mesangial hypercellularity was found in 12.3%, endocapillary proliferation in 10.7% and segmental glomerulosclerosis in 23.8%. All biopsies were scored as T0 (tubular atrophy in <25% of the cortical area). None of the clinical or histopathological variables recorded at the time of biopsy could identify patients with progressive disease. Cumulative risks of a ≥50% decrease in eGFR were 2.1% after 10 years, 4.1% after 15 years, 13.9% after 20 years and 24.7% after 25 years.
Conclusions
We have shown that 18.6% of patients with assumed benign IgAN had progressive disease after a median duration of 22 years and that these patients could not be predicted at the time of biopsy. Our study demonstrates that an extended follow-up period is needed when assessing prognosis in this group of patients.
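The eGFR criterion used for patient selection (≥60 mL/min/1.73 m2) is computed from serum creatinine; as an illustration, here is a stdlib-Python sketch of one widely used creatinine-based equation, CKD-EPI 2009 (coefficients reproduced from memory of the published equation and without the race coefficient; this is not necessarily the equation the study used, and should be verified before any real use):

```python
def egfr_ckd_epi_2009(scr_mg_dl, age, female):
    """CKD-EPI 2009 creatinine equation (sketch, without race coefficient).
    scr_mg_dl: serum creatinine in mg/dL. Returns eGFR in mL/min/1.73 m^2."""
    kappa = 0.7 if female else 0.9     # sex-specific creatinine knot
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = (141
            * min(ratio, 1.0) ** alpha     # applies below the knot
            * max(ratio, 1.0) ** -1.209    # applies above the knot
            * 0.993 ** age)                # age decline
    if female:
        egfr *= 1.018
    return egfr

# e.g. a 50-year-old man with creatinine 1.2 mg/dL lands in the 60-80 range
e = egfr_ckd_epi_2009(1.2, 50, False)
```

The spline form (different exponents below and above the sex-specific creatinine knot) is what lets the equation fit both low and high creatinine ranges.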



The longer the better: follow-up in seemingly ‘benign’ immunoglobulin A nephropathy

2017-07-18

IgA nephropathy (IgAN), the most common primary glomerulonephritis in Caucasian and Asian patients, can have a highly variable course ranging from minor urinary abnormalities to rapidly progressive kidney failure. Cases coming to medical attention probably represent the tip of an iceberg, as up to 1.5% of European and Asian populations may have asymptomatic IgAN [1–3]. Thus IgAN is easily missed in countries without urinary screening programmes in children and young adults, and given the frequent reluctance to perform a kidney biopsy in a patient with isolated microhaematuria, little proteinuria, normotension and normal glomerular filtration rate (GFR). If biopsies are performed in these latter patients, the diagnosis is IgAN in 14–40% if there is no evidence of a urologic pathology [4–7]. In such IgAN patients, the prognosis is generally believed to be excellent and, in fact, up to a third of the patients will experience a spontaneous complete remission of the disease over the next 5–10 years [8–11]. The large majority of non-remitting patients will maintain good renal function, and over a 10-year follow-up <5% of patients will have progressive disease (i.e. proteinuria increasing beyond 1 g/day and/or GFR decreasing) [9, 12]. So far, no reliable clinical predictors prospectively identify these few progressive patients. Thus there is currently wide consensus, backed by guidelines [13], simply to monitor these patients periodically.



Regulation of miR-29b and miR-30c by vitamin D receptor activators contributes to attenuate uraemia-induced cardiac fibrosis

2017-04-28

Abstract
Background
Uraemic cardiomyopathy, a process mainly associated with increased myocardial fibrosis, is the leading cause of death in chronic kidney disease patients and can be prevented by vitamin D receptor activators (VDRAs). Since some microRNAs (miRNAs) have emerged as regulators of the fibrotic process, we aimed to analyse the role of specific miRNAs in VDRA prevention of myocardial fibrosis as well as their potential use as biomarkers.
Methods
Wistar rats were nephrectomized and treated intraperitoneally with equivalent doses of two VDRAs: calcitriol and paricalcitol. Biochemical parameters, cardiac fibrosis, miRNA (miR-29b, miR-30c and miR-133b) levels in the heart and serum and expression of their target genes collagen I (COL1A1), matrix metalloproteinase 2 (MMP-2) and connective tissue growth factor (CTGF) in the heart were evaluated.
Results
Both VDRAs attenuated cardiac fibrosis, achieving a statistically significant difference in the paricalcitol-treated group. Increases in RNA and protein levels of COL1A1, MMP-2 and CTGF and reduced expression of miR-29b and miR-30c, known regulators of these pro-fibrotic genes, were observed in the heart of chronic renal failure (CRF) rats and were attenuated by both VDRAs. In serum, significant increases in miR-29b, miR-30c and miR-133b levels were observed in CRF rats, which were prevented by VDRA use. Moreover, vitamin D response elements were identified in the three miRNA promoters.
Conclusions
VDRAs, particularly paricalcitol, attenuated cardiac fibrosis acting on COL1A1, MMP-2 and CTGF expression, partly through regulation of miR-29b and miR-30c. These miRNAs and miR-133b could be useful serum biomarkers for cardiac fibrosis and also potential new therapeutic targets.



Urinary podocyte and TGF-β1 mRNA as markers for disease activity and progression in anti-glomerular basement membrane nephritis

2017-04-17

Abstract
Background
Podocyte depletion causes glomerulosclerosis, with persistent podocyte loss being a major factor driving disease progression. Urinary podocyte mRNA is potentially useful for monitoring disease progression in both animal models and in humans. To determine whether the same principles apply to crescentic glomerular injury, a rat model of anti-glomerular basement membrane (anti-GBM) nephritis was studied in parallel with a patient with anti-GBM nephritis.
Methods
Podocyte loss was measured by Wilms’ Tumor 1-positive podocyte nuclear counting and density, glomerular epithelial protein 1 or synaptopodin-positive podocyte tuft area and urinary podocyte mRNA excretion rate. Glomerulosclerosis was evaluated by Azan staining and urinary transforming growth factor (TGF)-β1 mRNA excretion rate.
Results
In the rat model, sequential kidney biopsies revealed that after a threshold of 30% podocyte loss, the degree of glomerulosclerosis was linearly associated with the degree of podocyte depletion, compatible with podocyte depletion driving the sclerotic process. Urinary podocyte mRNA correlated with the rate of glomerular podocyte loss. In treatment studies, steroids prevented glomerulosclerosis in the anti-GBM model in contrast to angiotensin II inhibition, which lacked a protective effect, and urinary podocyte and TGF-β1 mRNA markers more accurately reflected both the amount of podocyte depletion and the degree of glomerulosclerosis compared with proteinuria under both scenarios. In a patient successfully treated for anti-GBM nephritis, urinary podocyte and TGF-β1 mRNA reflected treatment efficacy.
Conclusion
These results emphasize the role of podocyte depletion in anti-GBM nephritis and suggest that urinary podocyte and TGF-β1 mRNA could serve as markers of disease progression and treatment efficacy.



Effect of parathyroidectomy and cinacalcet on quality of life in patients with end-stage renal disease-related hyperparathyroidism: a systematic review

2017-04-10

Abstract
Background
Patients with end-stage renal disease (ESRD) have a decreased quality of life (QoL), which is attributable in part to ESRD-related hyperparathyroidism (HPT). Both cinacalcet and parathyroidectomy (PTx) are treatments for advanced HPT, but their effects on QoL are unclear. We performed a systematic review to evaluate the impact of cinacalcet and PTx on QoL.
Methods
A systematic literature search was performed using PubMed and EMBASE databases to identify relevant articles. The search was based on the following keywords: ‘parathyroidectomy’ or ‘cinacalcet’, ‘secondary hyperparathyroidism’ or ‘renal hyperparathyroidism’ combined with ‘quality of life’ or ‘SF-36’ or ‘symptomatology’. Only studies reporting on QoL at baseline and during follow-up were included. QoL scores were extracted from the selected manuscripts and weighted means were calculated. Due to a lack of available data on QoL improvement in patients using cinacalcet, a meta-analysis could not be performed.
Results
In all, eight articles met our inclusion criteria. Of these, five reported the effect of PTx on QoL. All PTx studies were observational and non-controlled. The physical component scores of the 36-item Medical Outcomes Study Short-Form Health Survey increased significantly, by a weighted mean of 35.5% (P < 0.05). Mental component scores increased by 13.7% (P < 0.05). Parathyroidectomy assessment of symptoms scores improved from 561 preoperatively to 302 postoperatively (−259 points; −46.2%). Visual analogue scale scores decreased significantly for skin itching (46.6%), joint pain (30.4%) and muscle weakness (28.7%) (P < 0.05). Three studies on the effect of cinacalcet on QoL were included, including one randomized controlled trial. None of these studies showed significant improvement of physical component and mental component scores.
Conclusions
PTx improved QoL in patients treated for ESRD-related HPT, whereas cinacalcet did not. The difference in impact between PTx and cinacalcet on QoL has not been compared directly.



Microbiome perturbation by oral vancomycin reduces plasma concentration of two gut-derived uremic solutes, indoxyl sulfate and p-cresyl sulfate, in end-stage renal disease

2017-03-31

Abstract
Background
Observational studies have suggested a relationship between the plasma concentration of indoxyl sulfate (IS) and p-cresyl sulfate (PCS), small gut-derived ‘uremic solutes’, and the high incidence of uremic cardiomyopathy in patients with end-stage renal disease (ESRD). IS and PCS are derived from the metabolism of dietary components (tryptophan and tyrosine) by gut bacteria. This pilot study was designed to examine the effects of a poorly absorbable antibiotic (vancomycin) on the plasma concentration of two gut-derived ‘uremic solutes’, IS and PCS, and on the composition of the gut microbiome.
Methods
Plasma concentrations of IS and PCS were measured by MS-HPLC. The gut microbiome was assessed in stool specimens sequenced for the 16S rRNA gene targeting the V4 region.
Results
The pre-dialysis mean plasma concentrations of both IS and PCS were markedly elevated. Following the administration of vancomycin (Day 0), the IS and PCS concentrations decreased at Day 2 or Day 5 and returned to baseline by Day 28. Following vancomycin administration, several changes in the gut microbiome were observed. Most striking was the decrease in diversity, a finding that was evident on Day 7 and was still evident at Day 28. There was little change at the phylum level but at the genus level, broad population changes were noted. Changes in the abundance of several genera appeared to parallel the concentration of IS and PCS.
Conclusions
These findings suggest that alteration of the gut microbiome, by an antibiotic, might provide an important strategy in reducing the levels of IS and PCS in ESRD.



Creatinine generation from kinetic modeling with or without postdialysis serum creatinine measurement: results from the HEMO study

2017-03-31

Abstract
Background
A convenient method to estimate the creatinine generation rate and measures of creatinine clearance in hemodialysis patients using formal kinetic modeling and standard pre- and postdialysis blood samples has not been described.
Methods
We used data from 366 dialysis sessions characterized during follow-up month 4 of the HEMO study, during which cross-dialyzer clearances for both urea and creatinine were available. Blood samples taken at 1 h into dialysis and 30 min and 60 min after dialysis were used to determine how well a two-pool kinetic model could predict creatinine concentrations and other kinetic parameters, including the creatinine generation rate. An extrarenal creatinine clearance of 0.038 L/kg/24 h was included in the model.
Results
Diffusive cross-dialyzer clearances of urea [230 (SD 37) mL/min] correlated well (R2 = 0.78) with creatinine clearances [164 (SD 30) mL/min]. When the effective diffusion volume flow rate was set at 0.791 times the blood flow rate for the cross-dialyzer clearance measurements at 1 h into dialysis, the mean calculated volume of creatinine distribution averaged 29.6 (SD 7.2) L, compared with 31.6 (SD 7.0) L for urea (P < 0.01). The modeled creatinine generation rate [1183 (SD 463) mg/day] averaged 100.1% (SD 29; median 99.3) of that predicted in nondialysis patients by an anthropometric equation. A simplified method for modeling the creatinine generation rate using the urea distribution volume and urea dialyzer clearance, without use of the postdialysis serum creatinine measurement, gave results [1187 (SD 475) mg/day] that closely matched the formally modeled value (R2 = 0.971).
Conclusions
Our analysis confirms previous findings of similar distribution volumes for creatinine and urea. After taking extra-renal clearance into consideration, the creatinine generation rate in dialysis patients is similar to that in nondialysis patients. A simplified method based on urea clearance and urea distribution volume not requiring a postdialysis serum creatinine measurement can be used to yield creatinine generation rates that closely match those determined from standard modeling.
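The two-pool kinetics underlying the Methods amount to a pair of mass-balance ODEs: an accessible pool cleared by the dialyzer and a sequestered pool exchanging with it. A simplified stdlib-Python sketch with hypothetical parameters (loosely scaled to the clearances and generation rates above; the HEMO analysis used formal kinetic modeling, not this toy integrator):

```python
def two_pool_dialysis(v1, v2, kd, kc, g, c0, t_end_min, dt=0.1):
    """Two-pool solute kinetics during dialysis (illustrative Euler integration).
    v1, v2: compartment volumes (L); kd: dialyzer clearance (L/min);
    kc: intercompartmental clearance (L/min); g: generation rate (mg/min);
    c0: initial concentration (mg/L). Returns (c1, c2) at t_end_min."""
    c1 = c2 = c0
    t = 0.0
    while t < t_end_min:
        dc1 = (g - kd * c1 + kc * (c2 - c1)) / v1  # accessible pool: generation,
        dc2 = (kc * (c1 - c2)) / v2                # dialyzer removal, exchange
        c1 += dc1 * dt
        c2 += dc2 * dt
        t += dt
    return c1, c2

# Hypothetical 240-min creatinine session: kd 0.16 L/min (~160 mL/min),
# g 0.8 mg/min (~1150 mg/day), total volume 30 L
c1, c2 = two_pool_dialysis(v1=10, v2=20, kd=0.16, kc=0.8, g=0.8, c0=100, t_end_min=240)
```

During dialysis the accessible pool falls faster than the sequestered one (c1 < c2), which is why concentrations rebound after the session ends.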



Differences in acute kidney injury ascertainment for clinical and preclinical studies

2017-03-23

Abstract
Background
Acute kidney injury (AKI) is a common clinical condition directly associated with adverse outcomes. Several AKI biomarkers have been discovered, but their use in clinical and preclinical studies has not been well examined. This study aims to investigate the differences between clinical and preclinical studies on AKI biomarkers.
Methods
We performed a systematic review of clinical and preclinical interventional studies that considered AKI biomarkers in enrollment criteria and/or outcome assessment and described the main differences according to their setting, the inclusion of biomarkers in the definition of AKI and the use of biomarkers as primary or secondary end points.
Results
In the 151 included studies (76 clinical, 75 preclinical), clinical studies have prevalently focused on cardiac surgery (38.1%) and contrast-associated AKI (17.1%), while the majority of preclinical studies have focused on either ischemia–reperfusion injury or drug-induced AKI (42.6% each). A total of 57.8% of clinical studies defined AKI using the standard criteria and only 19.7% of these studies used AKI biomarkers in the definition of renal injury. Conversely, the majority of preclinical studies defined AKI according to the increase in serum creatinine and blood urea nitrogen, and 32% included biomarkers in that definition. The percentage of both clinical and preclinical studies with biomarkers as a primary end point has not significantly increased in the last 10 years; however, preclinical studies are more likely to use AKI biomarkers as a primary end point compared with clinical studies [odds ratio 2.31 (95% confidence interval 1.17–4.59); P = 0.016].
Conclusion
Differences between clinical and preclinical studies are evident and may affect the translation of preclinical findings in the clinical setting.



Nutritional assessment of elderly patients on dialysis: pitfalls and potentials for practice

2017-03-22

Abstract
The chronic kidney disease (CKD) population is aging. Currently a high percentage of patients treated on dialysis are older than 65 years. As patients get older, several conditions contribute to the development of malnutrition, namely protein energy wasting (PEW), which may be compounded by nutritional disturbances associated with CKD and with the dialysis procedure. Elderly patients on dialysis are therefore vulnerable to the development of PEW, and awareness of how to identify and subsequently manage nutritional disturbances is important. In clinical practice, the nutritional assessment of patients on dialysis usually includes methods to assess PEW, such as the subjective global assessment, the malnutrition inflammation score, and anthropometric and laboratory parameters. Studies investigating measures of nutritional status specifically tailored to the elderly on dialysis are scarce; therefore, the same methods and cutoffs used for the general adult population on dialysis are applied to the elderly. Given this scenario, the aim of this review is to discuss specific considerations for the nutritional assessment of elderly patients on dialysis, addressing shortcomings in the interpretation of markers and providing clinical practice guidance for assessing the nutritional status of these patients.



One-year efficacy and safety of the iron-based phosphate binder sucroferric oxyhydroxide in patients on peritoneal dialysis

2017-02-23

Abstract
Background
Sucroferric oxyhydroxide is a noncalcium, iron-based phosphate binder that demonstrated sustained serum phosphorus control, good tolerability and lower pill burden compared with sevelamer carbonate (sevelamer) in a Phase 3 study conducted in dialysis patients. This subanalysis examines the efficacy and tolerability of sucroferric oxyhydroxide and sevelamer in the peritoneal dialysis (PD) patient population.
Methods
The initial study (NCT01324128) and its extension (NCT01464190) were multicenter, Phase 3, open-label, randomized (2:1), active-controlled trials comparing sucroferric oxyhydroxide (1.0–3.0 g/day) with sevelamer (2.4–14.4 g/day) in dialysis patients over 52 weeks in total.
Results
In the overall study, 84/1055 (8.1%) patients received PD and were eligible for efficacy analysis (sucroferric oxyhydroxide, n = 56; sevelamer, n = 28). The two groups were broadly comparable to each other and to the overall study population. Serum phosphorus concentrations decreased comparably with both phosphate binders by week 12 (mean change from baseline: −0.6 mmol/L). Over 52 weeks, sucroferric oxyhydroxide effectively reduced serum phosphorus concentrations to a similar extent as sevelamer; 62.5% and 64.3% of patients, respectively, were below the Kidney Disease Outcomes Quality Initiative target range (≤1.78 mmol/L). This was achieved with a lower pill burden (3.4 ± 1.3 versus 8.1 ± 3.7 tablets/day) with sucroferric oxyhydroxide compared with sevelamer. Treatment adherence rates were 91.2% with sucroferric oxyhydroxide and 79.3% with sevelamer. The proportion of patients reporting at least one treatment-emergent adverse event was 86.0% with sucroferric oxyhydroxide and 93.1% with sevelamer. The most common adverse events with both treatments were gastrointestinal: diarrhea and discolored feces with sucroferric oxyhydroxide and nausea, vomiting and constipation with sevelamer.
Conclusions
Sucroferric oxyhydroxide is noninferior to sevelamer for controlling serum phosphorus in patients undergoing PD, while providing a relatively low pill burden and a high rate of adherence.



Racial variation in cardiovascular disease risk factors among European children on renal replacement therapy—results from the European Society for Paediatric Nephrology/European Renal Association – European Dialysis and Transplant Association Registry

2017-02-01

Abstract
Background
Racial differences in overall mortality rates have been found in children on renal replacement therapy (RRT). We used data from the European Society for Paediatric Nephrology/European Renal Association – European Dialysis and Transplant Association Registry to study racial variation in the prevalence of cardiovascular disease (CVD) risk factors among European children on RRT.
Methods
We included patients aged <20 years who, between 2006 and 2013, (i) initiated dialysis treatment or (ii) had a renal transplant vintage of ≥1 year. Racial groups were defined as white, black, Asian and other. The CVD risk factors assessed included uncontrolled hypertension, obesity, hyperphosphataemia and anaemia. Differences between racial groups in CVD risk factors were examined using generalized estimating equation (GEE) models while adjusting for potential confounders.
Results
In this study, 1161 patients on dialysis and 1663 patients with a transplant were included. The majority of patients in both groups were white (73.8% and 79.9%, respectively). The crude prevalence of the CVD risk factors was similar across racial groups. However, after adjustment for potential confounders, Asian background was associated with higher risk of uncontrolled hypertension both in the dialysis group [odds ratio (OR): 1.27; 95% confidence interval (CI): 1.01–1.64] and the transplant group (OR: 1.37; 95% CI: 1.11–1.68) compared with white patients. Patients of Asian and other racial background with a renal transplant had a higher risk of anaemia compared with white patients (OR: 1.50; 95% CI: 1.15–1.96 and OR: 1.45; 95% CI: 1.01–2.07, respectively). Finally, the mean number of CVD risk factors among dialysis patients was higher in Asian patients (1.83, 95% CI: 1.64–2.04) compared with white patients (1.52, 95% CI: 1.40–1.65).
Conclusions
We found a higher prevalence of modifiable CVD risk factors in Asian children on RRT. Early identification and management of these risk factors could potentially improve long-term outcomes.



Initiation of erythropoiesis-stimulating agents and outcomes: a nationwide observational cohort study in anaemic chronic kidney disease patients

2016-09-24

Abstract
Background
In 2012, new clinical guidelines were introduced for the use of erythropoiesis-stimulating agents (ESA) in chronic kidney disease (CKD) patients, recommending lower haemoglobin (Hb) target levels and thresholds for ESA initiation. These changes resulted in lower Hb levels in these patients. However, there is limited evidence on when ESA should be initiated and on the safety of a low-Hb initiation policy.
Methods
In this observational inception cohort study, Swedish, nephrology-referred, ESA-naïve CKD patients (n = 6348) were enrolled when their Hb dropped below 12.0 g/dL, and they were followed for mortality and cardiovascular events. Four different ESA initiation strategies were evaluated applying dynamic marginal structural models: (i) begin ESA immediately, (ii) begin ESA when Hb <11.0 g/dL, (iii) begin ESA when Hb <10.0 g/dL and (iv) never begin ESA, each in comparison with ‘current practice’ [the observed (factual) survival of the entire study cohort]. The adjusted 3-year survival following ESA begun over a range of Hb (from <9.0 to 12.0 g/dL) was evaluated, after adjustment for covariates at baseline and during follow-up.
Results
Overall, 36% were treated with ESA. Mortality during follow-up was 33.4% of the ESA-treated and 27.9% of the non-treated subjects. The adjusted 3-year survival associated with ESA initiation improved for subjects with initial Hb <9.0 to 11 g/dL and then decreased again for those with Hb above 11.5 g/dL. Initiating ESA at Hb <11.0 g/dL and <10.0 g/dL was associated with improved survival compared with ‘current practice’ [hazard ratio (HR) 0.83; 95% confidence interval (CI) 0.79–0.89 and 0.90; 95% CI 0.86–0.94, respectively] and did not increase the risk of a cardiovascular event (HR 0.93; 95% CI 0.87–1.00).
Conclusion
In non-dialysis patients with CKD, ESA initiation at Hb < 10.0–11.0 g/dL is associated with improved survival in patients otherwise treated according to guidelines.



Mineral metabolism factors predict accelerated progression of common carotid intima-media thickness in chronic kidney disease: the NEFRONA study

2016-08-26

Abstract
Background
The leading cause of premature death in chronic kidney disease (CKD) is cardiovascular disease (CVD), but risk assessment in renal patients is challenging. The aim of the study was to analyse the factors that predict accelerated progression of common carotid intima-media thickness (CCIMT) in a CKD cohort after 2 years of follow-up (2010–12).
Methods
The study included 1152 patients from the NEFRONA cohort with CKD stages 3–5D and without a clinical history of CVD. CCIMT was measured at the far wall on both common carotids. CCIMT progression was defined as the change between CCIMT at baseline and at 24 months for each side, averaged and normalized as change per year. Accelerated progressors were defined as those with a CCIMT change ≥75th percentile.
Results
The median CCIMT progression rate was 0.0125 mm/year, without significant differences between CKD stages. The cut-off value for defining accelerated progression was 0.0425 mm/year. After adjustment, age was a common factor among all CKD stages. Traditional cardiovascular risk factors, such as diabetes and systolic blood pressure, were predictors of progression in CKD stages 4–5, whereas high-density lipoprotein and low-density lipoprotein cholesterol predicted progression in women in stage 3. Mineral metabolism factors predicting accelerated progression were serum phosphorus in stages 3 and 5D; low 25-hydroxyvitamin D and parathyroid hormone levels >110 pg/mL in stages 4–5 and intact parathyroid hormone levels out of the recommended range in stage 5D.
Conclusions
Mineral metabolism parameters might predict accelerated CCIMT progression from early CKD stages.



Thyroid function, reduced kidney function and incident chronic kidney disease in a community-based population: the Atherosclerosis Risk in Communities study

2016-08-18

Abstract
Background
Reduced kidney function is a common public health problem that increases risk for a wide variety of adverse outcomes, making the identification of potentially modifiable factors associated with the development of incident chronic kidney disease (CKD) important. Alterations in the hypothalamic–pituitary–thyroid axis have been linked to reduced kidney function, but the association of thyroid function with the development of incident CKD is largely uncharacterized.
Methods
Concentrations of thyroid stimulating hormone (TSH), free thyroxine (FT4), triiodothyronine (T3) and thyroid peroxidase antibody (TPOAb) were quantified in 12 785 black and white participants of the ongoing community-based prospective Atherosclerosis Risk in Communities study. Thyroid markers and clinical categories of thyroid dysfunction (euthyroidism, combined subclinical and overt hypothyroidism, combined subclinical and overt hyperthyroidism) were also evaluated for their association with reduced kidney function (estimated glomerular filtration rate <60 mL/min/1.73 m2) at study baseline and with incident CKD over a median follow-up time of 19.6 years.
Results
Higher TSH and FT4 as well as lower T3 concentrations were strongly and independently associated with reduced kidney function at study baseline. The clinical entities hypothyroidism and hyperthyroidism were also associated with higher odds of reduced kidney function at baseline, but these associations did not reach statistical significance. However, neither the markers of thyroid function nor the clinical categories of thyroid dysfunction (hypothyroidism, hyperthyroidism or TPOAb positivity) were associated with incident CKD in adjusted analyses.
Conclusions
Elevated TSH and FT4 and reduced T3 concentrations were associated with reduced kidney function cross-sectionally. The lack of association with the development of incident CKD suggests that altered thyroid function in the general population is not causally related to CKD development, but screening for thyroid status may be especially relevant in persons with reduced kidney function.



Urinary proteomics predict onset of microalbuminuria in normoalbuminuric type 2 diabetic patients, a sub-study of the DIRECT-Protect 2 study

2016-08-09

Abstract
Background
Early prevention of diabetic nephropathy has not been successful, as early interventions have shown conflicting results, partly because early and precise indicators of disease development are lacking. Urinary proteomics has shown promise in this regard and could identify those at high risk who might benefit from treatment. In this study we investigate its utility in a large type 2 diabetic cohort with normoalbuminuria.
Methods
We performed a post hoc analysis in the Diabetic Retinopathy Candesartan Trials (DIRECT-Protect 2 study), a multicentre randomized controlled clinical trial. Patients were allocated to candesartan or placebo, with the aim of slowing the progression of retinopathy. The secondary endpoint was development of persistent microalbuminuria (three of four samples). We used a previously defined chronic kidney disease risk score based on proteomic measurement of 273 urinary peptides (CKD273-classifier). A Cox regression model for the progression of albuminuria was developed and evaluated with integrated discrimination improvement (IDI), continuous net reclassification index (cNRI) and receiver operating characteristic curve statistics.
Results
Seven hundred and thirty-seven patients were analysed and 89 developed persistent microalbuminuria (12%) with a mean follow-up of 4.1 years. At baseline the CKD273-classifier predicted development of microalbuminuria during follow-up, independent of treatment (candesartan/placebo), age, gender, systolic blood pressure, urine albumin excretion rate, estimated glomerular filtration rate, HbA1c and diabetes duration, with hazard ratio 2.5 [95% confidence interval (CI) 1.4–4.3; P = 0.002] and area under the curve 0.79 (95% CI 0.75–0.84; P < 0.0001). The CKD273-classifier improved the risk prediction (relative IDI 14%, P = 0.002; cNRI 0.10, P = 0.043).
Conclusions
In this cohort of patients with type 2 diabetes and normoalbuminuria from a large intervention study, the CKD273-classifier was an independent predictor of microalbuminuria. This may help identify high-risk normoalbuminuric patients for preventive strategies for diabetic nephropathy.
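The "three of four samples" secondary endpoint in the Methods can be sketched as a simple rule. The threshold and names below are illustrative assumptions (a conventional lower bound for microalbuminuria), not values taken from the trial protocol:

```python
# Sketch of the persistent-microalbuminuria endpoint (elevated urine albumin
# excretion in at least three of four samples). The 20 ug/min threshold is a
# conventional lower bound for microalbuminuria, assumed here for illustration.

MICROALBUMINURIA_LOWER = 20.0  # urine albumin excretion rate, ug/min (assumed)

def persistent_microalbuminuria(uaer_samples, threshold=MICROALBUMINURIA_LOWER):
    """Return True if at least three of the four samples exceed the threshold."""
    if len(uaer_samples) != 4:
        raise ValueError("expected four consecutive samples")
    return sum(1 for u in uaer_samples if u > threshold) >= 3

print(persistent_microalbuminuria([25.0, 30.0, 18.0, 22.0]))  # → True
```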



Prognostic enrichment design in clinical trials for autosomal dominant polycystic kidney disease: the HALT-PKD clinical trial

2016-08-02

Abstract
Background
Patients with mild autosomal dominant polycystic kidney disease (ADPKD) are less likely to be informative in randomized clinical trials (RCTs). We previously developed an imaging classification of ADPKD (typical diffuse cyst distribution Class 1A–E and atypical cyst distribution Class 2) for prognostic enrichment design in RCTs. We investigated whether using this classification would have increased the power to detect a beneficial treatment effect of rigorous blood pressure (BP) control on HALT-PKD participants with early disease (Study A).
Methods
Post hoc analysis of the early disease HALT-PKD study, an RCT that studied the effect of rigorous versus standard BP control on rates of total kidney volume (TKV) increase and estimated glomerular filtration rate (eGFR) decline in ADPKD patients with eGFR >60 mL/min/1.73 m2.
Results
Five hundred and fifty-one patients were classified by two observers (98.2% agreement) into Class 1A (6.2%), 1B (20.3%), 1C (34.1%), 1D (22.1%), 1E (11.8%) and 2 (5.4%). The TKV increase and eGFR decline became steeper from Class 1A through 1E. Rigorous BP control had been shown to be associated with slower TKV increase, without a significant overall effect on the rate of eGFR decline (faster in the first 4 months and marginally slower thereafter). Merging Classes 1A and 2 (lowest severity), 1B and 1C (intermediate severity) and 1D and 1E (highest severity) detected stronger beneficial effects on TKV increase and eGFR decline in Classes 1D and 1E with a smaller number of patients.
Conclusions
Strategies for prognostic enrichment, such as image classification, should be used in the design of RCTs for ADPKD to increase their power and reduce their cost.
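The class-merging step described in the Results is a plain mapping from imaging class to severity stratum, which can be sketched directly (the stratum labels are assumptions for illustration):

```python
# Sketch of collapsing the ADPKD imaging classes into the three severity
# strata used in the Results (stratum labels are assumed, not from the paper).

SEVERITY_GROUP = {
    "1A": "lowest",       "2": "lowest",
    "1B": "intermediate", "1C": "intermediate",
    "1D": "highest",      "1E": "highest",
}

def severity_stratum(imaging_class):
    """Map an imaging class (e.g. '1C') to its merged severity stratum."""
    return SEVERITY_GROUP[imaging_class]

print(severity_stratum("1C"), severity_stratum("1E"))  # → intermediate highest
```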



Arterial stiffness and its relationship to clinic and ambulatory blood pressure: a longitudinal study in non-dialysis chronic kidney disease

2016-07-29

Abstract
Background
Both arterial stiffness and systolic blood pressure (BP) are established cardiovascular risk factors, yet little is known about their interrelationship in chronic kidney disease (CKD). The goal of this prospective study was to describe the trajectory of aortic pulse wave velocity (PWV) and BP and to compare the longitudinal interrelationship of BP (clinic and 24 h ambulatory recording) with the PWV.
Methods
Clinic BP was taken in two ways: at the time of the measurement of the PWV (Clinic-S) and as an average of triplicate measurements on three separate occasions within 1 week (Clinic-M). Twenty-four-hour ambulatory BP was measured using a validated monitor and PWV was measured in the aorta using an echo-Doppler technique.
Results
Among 255 veterans with CKD followed for up to 4 years, the rate of change of log PWV was inversely related to the baseline PWV; the trajectories were variable among individuals and the net population change was no different from zero. In contrast, systolic BP increased significantly and linearly, and a strong relationship was seen between cross-sectional and longitudinal changes in Clinic-M systolic BP and log PWV. A longitudinal relationship between Clinic-S and log PWV was absent. In the case of 24-h ambulatory BP, a strong cross-sectional relationship with log PWV was seen for awake and 24 h systolic BP, but not for sleep systolic BP.
Conclusion
Among people with CKD, the PWV changes over time and is inversely related to the baseline PWV. An average of clinic BP measurements taken over three visits, but not single measurements, is useful to assess the PWV and its change over time. Differences exist between ambulatory BP monitoring recordings during the sleep and awake states in their ability to predict the PWV. Taken together, these data support the view that among those with CKD not on dialysis, targeting clinic BP taken on multiple occasions using a standardized methodology or daytime ambulatory systolic BP may slow the progression of arterial damage.
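The inverse relation between the annualized change in log PWV and baseline PWV reported in the Results can be illustrated with an ordinary least-squares slope computed on synthetic values. All numbers below are fabricated for illustration only, not study data:

```python
# Minimal sketch of relating the annualized change in log PWV to baseline PWV,
# as in the longitudinal analysis (OLS slope on synthetic, illustrative data).
import math

baseline_pwv = [7.0, 9.0, 11.0, 13.0, 15.0]   # m/s at study entry (synthetic)
followup_pwv = [7.8, 9.5, 11.2, 12.8, 14.5]   # m/s after 2 years (synthetic)

# annualized change in log PWV for each subject
dlog = [(math.log(f) - math.log(b)) / 2.0
        for b, f in zip(baseline_pwv, followup_pwv)]

# simple least-squares slope of dlog on baseline PWV
n = len(baseline_pwv)
mx = sum(baseline_pwv) / n
my = sum(dlog) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(baseline_pwv, dlog))
         / sum((x - mx) ** 2 for x in baseline_pwv))

print(slope < 0)  # a negative slope reproduces the inverse relationship → True
```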