September 03, 2019

How artificial intelligence impacts the treatment of kidney disease


Health care is entering a new era where the advent of “big data” brings tremendous opportunities to revolutionize the profession. In order to be purposeful, data must be analyzed, interpreted and used to improve patient care. Exploring the associations among different pieces of information derived from large and diverse datasets to enable intelligent and informed decisions is now possible with the emergence of artificial intelligence (AI).

AI is “the science and engineering of making intelligent machines, especially intelligent computer programs,” as described by the computer science pioneer John McCarthy.1 AI is considered the engine of the fourth industrial revolution, since it is anticipated to transform and impact every possible sector of our lives, just as electricity did in the early 1900s (see Figure 1).

Len A. Usvyat, PhD (left); Sheetal Chaudhuri, MS (center); and Andrew Long, PhD (right), and colleagues understand artificial intelligence is an aid, not a replacement, for professional caregivers.

Source: Gerry Evelyn

AI in kidney care and health care uses algorithms and software engineering principles to approximate the decisions made by clinicians in the analysis of complex health care data. Traditionally, computer-based algorithms in health care have encoded expert knowledge on medical decisions as a set of rules. These rules are subsequently applied to draw conclusions about specific clinical scenarios. AI algorithms, however, strive to learn patterns from the data without explicitly programmed rules.
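The distinction can be sketched in a few lines of code. Every threshold and weight below is purely illustrative, not clinical guidance: the rule-based function applies fixed, expert-chosen cutoffs, while the learned model scores the same inputs with weights that would be fitted from historical data.

```python
# Contrast between a hand-coded clinical rule and a model that learns its
# decision boundary from data. All cutoffs and weights are illustrative.

def rule_based_flag(egfr, acr):
    """Expert-encoded rule: flag risk from fixed laboratory cutoffs."""
    return egfr < 60 or acr > 30

def learned_flag(features, weights, bias):
    """A learned model scores the same inputs with weights fitted from
    historical data rather than hand-set cutoffs."""
    score = sum(w * x for w, x in zip(weights, features)) + bias
    return score > 0

# The rule fires whenever a fixed threshold is crossed...
print(rule_based_flag(egfr=55, acr=10))                            # True
# ...while the learned model's decision depends on its fitted weights.
print(learned_flag([55, 10], weights=[-0.05, 0.02], bias=2.5))     # False
```

The practical difference is that the rule never changes unless an expert rewrites it, whereas the learned weights are re-estimated whenever new data arrive.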

See the companion sidebar, "Back to basics on AI," for background before continuing with our cover story.

In the profession of nephrology, various initiatives have incorporated machine learning (ML). One such example is an ML model developed to predict CKD progression. The model was developed and validated using demographic, clinical and the most recent laboratory data from two independent Canadian cohorts of patients with CKD stages 3 to 5.12 This CKD progression model has been validated externally and in international populations,13,14 and is used for triage of CKD care in Canada.15,16 Other studies have developed ML models to accurately predict outcomes and graft survival after kidney transplant.17,18 ML models have just begun to be applied in health care and have the potential to improve patient care paradigms when implemented at the point of care in clinical applications in nephrology.

AI in kidney care

Large dialysis organizations (LDOs) have been at the forefront of the big data revolution, which has contributed a vast number of epidemiological insights to the literature. Fresenius Medical Care (FMC) is a large integrated kidney disease care organization that includes an LDO, a specialty pharmacy, outpatient endovascular centers, central laboratories, renal care coordination and management, software tools for nephrologists, and medical device and drug companies, among many others. In North America alone, this LDO has collected demographic, clinical, treatment and laboratory de-identified data from more than one million patients, which can be harnessed to train ML models (see Figure 3). Data from this large integrated kidney disease care organization have been leveraged to develop, pilot and deploy several ML predictive models that provide clinical decision support to aid clinicians and improve care models and ultimately, the quality and quantity of life of patients with CKD and ESRD.

Examples of some of the ML models in the dialysis clinic operations environment include a 12-month (long-term) and 7-day (imminent) hospital admission risk stratification in patients on in-center dialysis.19,20 Implementation of reporting risk stratifications and subsequent patient-centered clinical evaluations and interventions have revealed positive initial results, including a 23% decrease in 12-month hospital admission rates.20

Another AI tool deployed includes a CKD progression model that identifies the trajectory of the eGFR based on two or more historic values.21 The model has been integrated in an EMR system by Acumen Physician Solutions and is being used in nephrology practices across the United States to aid in confirming patient prognosis and providing patient education. Preliminary signals based on limited adoption indicated that nephrologists who used the tool had about a 2% lower rate of central venous catheters in patients who progressed to ESKD and initiated hemodialysis (HD). Other models developed include an end-of-life ML model and a prediction model to identify a decreasing trend in overall functional status in patients on dialysis.22 These models might have utility in identifying patients who could be assessed for timely palliative care.

AI in home dialysis

For home therapy, we have developed an ML model to predict patients on PD who are at risk of developing peritonitis in the next month. Early detection of peritonitis could reduce the risk of occurrence of severe infections in patients on PD.

Figure 1: Phases of the industrial revolution are shown. The first phase of the Industrial Revolution introduced mechanical production using steam-powered engines, followed by mass production of goods using electricity, automation of production using information technology and, most recently, intelligent automation using AI techniques.

Source: Fresenius Medical Care

To identify the risk of peritonitis, an ML model was trained on historical infection and patient data types (see Figure 4) to identify which active patients on PD were at risk of developing peritonitis within the next month. The model was trained using a machine learning algorithm called the XGBoost classifier.23 The performance of the model was evaluated using the area under the receiver operating characteristic curve (AUROC),24 which was about 0.66.25
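As a rough illustration of this train-and-evaluate loop, the sketch below fits a gradient-boosted tree classifier to synthetic data and scores it with AUROC. scikit-learn's GradientBoostingClassifier stands in for the XGBoost classifier named above, and the features are random placeholders, not patient data.

```python
# Hedged sketch of the training/evaluation loop described in the article:
# a gradient-boosted tree classifier scored by area under the ROC curve.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 8))     # stand-ins for infection history, labs, etc.
# Synthetic outcome: weakly driven by the first two features plus noise.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=2.0, size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

# Risk scores are predicted probabilities of the positive class.
scores = model.predict_proba(X_te)[:, 1]
auc = roc_auc_score(y_te, scores)
print(f"AUROC: {auc:.2f}")
```

An AUROC of 0.5 corresponds to random guessing and 1.0 to perfect ranking, which puts the reported 0.66 in context: the model ranks at-risk patients better than chance, but far from perfectly.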

Beginning in December 2018, the peritonitis ML model was used in a nationwide rollout in the United States. The ML model generated risk scores on a monthly basis for active patients on PD treated by the LDO within its integrated kidney disease care organization. As one of the top predictors of peritonitis is that the patient is new to PD, the clinical team scheduled a nurse home visit for all patients within the first 30 days of PD and a phone call between 30 and 90 days after starting the modality.

For the remaining patients, the ML model segmented patients into three risk groups for peritonitis. Nurse home visits were scheduled for high-risk patients, and medium-risk patients received a phone call. An assessment was created to track actions, vital assessments and specific interventions performed for each phone call and home visit. Initial findings have shown a decreasing trend in peritonitis episodes. A thorough analysis is ongoing to quantitatively assess the effectiveness of the new AI risk-directed care paradigm.
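The risk-directed workflow above amounts to bucketing monthly model scores into tiers and mapping each tier to an action. The sketch below illustrates that logic; the cutoffs and patient identifiers are made up, since the article does not publish the thresholds actually used.

```python
# Minimal sketch of a risk-directed care workflow: monthly risk scores are
# bucketed into three tiers, each mapped to a clinical action.
# The cutoffs (0.33 / 0.66) are illustrative, not the deployed thresholds.

def triage(score, low_cut=0.33, high_cut=0.66):
    if score >= high_cut:
        return "nurse home visit"    # high risk
    if score >= low_cut:
        return "phone call"          # medium risk
    return "routine care"            # low risk

# Hypothetical monthly scores for three patients on PD.
monthly_scores = {"patient_a": 0.81, "patient_b": 0.45, "patient_c": 0.12}
for pid, score in monthly_scores.items():
    print(pid, "->", triage(score))
```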

Figure 3: Data collected at an LDO of a large integrated kidney disease care organization in North America (as of June 2018) are shown. This LDO has a vast amount of clinical data collected from more than 1.1 million patients.

AI in vascular access

Renal Research Institute, a specialized research team with expertise in computational biomedicine and data analytics, is working on an AI-based classification model in collaboration with Azura Vascular Care, a network of outpatient vascular and ambulatory surgery centers, to detect and diagnose arteriovenous fistula aneurysm (AVFA) stages in the United States. The team has developed a convolutional neural network (CNN) to automatically classify AVFA stages. A CNN is a form of deep learning (DL) that is applied specifically to image recognition. CNNs consist of layers that receive an input image and perform mathematical operations to predict the output.26 The team has collected 15- to 20-second panning videos from 30 patients with two AVFA categories:

  • AVFA stage 2 with enlarged AVF with hypopigmented skin; and
  • AVFA stage 3 with enlarged fistula with open ulcer.

The extracted video frames were used to create an image set, and a CNN model was trained using a cloud-based ML platform. The CNN was able to automatically classify AVFA stages with greater than 90% classification accuracy. Using this model in a clinical application may reduce workload for physicians, provide timely AVFA diagnosis and improve patient care.
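The layer-by-layer arithmetic a CNN performs can be illustrated with a single convolutional filter followed by a small classification head. All weights below are random placeholders (a real model learns them from the labeled frames), and the two outputs correspond to the AVFA stage 2 and stage 3 categories.

```python
# Illustrative forward pass of a CNN layer as described above: a convolution
# filters the image, a nonlinearity follows, and a dense head maps the pooled
# feature to the two AVFA stage classes. Weights here are random placeholders.
import numpy as np

rng = np.random.default_rng(0)

def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation) over a single channel."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = rng.random((32, 32))                     # one grayscale video frame
kernel = rng.normal(size=(3, 3))                 # one learnable filter
features = np.maximum(conv2d(image, kernel), 0)  # ReLU activation
pooled = features.mean()                         # global average pooling
logits = pooled * rng.normal(size=2)             # dense head: 2 classes
probs = np.exp(logits) / np.exp(logits).sum()    # softmax: stage 2 vs. stage 3
print("P(stage 2), P(stage 3) =", probs)
```

A production model stacks many such filter layers and trains the weights by backpropagation; this sketch only shows the shape of the computation.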

Figure 4: Data utilized by the machine learning peritonitis risk model at a large integrated kidney disease care organization are shown. This figure shows how various clinical data elements like peritonitis infection history, treatment history, nurse assessment, lab values, demographics, comorbidities, lifestyle and quality of living data are provided as input into the peritonitis prediction model.

Source: Fresenius Medical Care

AI in anemia management

Outside the United States, the large integrated kidney disease care organization has successfully developed, validated and deployed an anemia control model (ACM) using ML that provides an individualized prediction of optimal erythropoietin stimulating agent (ESA) and IV iron dosing recommendations to maintain target hemoglobin (Hgb) and iron indices.27-30 The ACM is a class 1 medical device that has been approved for marketing in the European Union (Council Directive 93/42/EEC). The ACM uses an artificial neural network (ANN) model to reproduce human erythropoiesis and erythrocyte lifespan in each patient and recommend a suitable ESA and IV iron dose upon each Hgb measurement received, with the aim of reaching predefined clinical targets. This ML model considers an array of factors, including the following demographic and current/historic factors:

  • Hgb, transferrin saturation, and ferritin levels;
  • ESA and IV iron dose, frequency and accumulation;
  • weight, adequacy and nutritional status; and
  • inflammatory status and mineral bone disorder status.
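A dose-recommendation loop of the kind this description implies can be sketched as follows. The linear response function is a stand-in for the ACM's neural network, and every coefficient, dose and target below is illustrative, not clinical guidance: candidate ESA doses are scored by predicted next-month Hgb, and the dose whose prediction lands closest to target is recommended.

```python
# Sketch of a model-guided dosing loop: predict the Hgb response to each
# candidate ESA dose and recommend the dose closest to target.
# The linear model is a placeholder for the ACM's neural network, and all
# coefficients, doses and targets are illustrative, not clinical guidance.

def predict_hgb(current_hgb, esa_dose_ug, iron_mg):
    """Placeholder response model: small positive response to ESA and iron."""
    return current_hgb + 0.01 * esa_dose_ug + 0.002 * iron_mg

def recommend_dose(current_hgb, iron_mg, target=11.0,
                   candidates=(0, 20, 40, 60, 80)):
    """Pick the candidate dose whose predicted Hgb is closest to target."""
    return min(candidates,
               key=lambda d: abs(predict_hgb(current_hgb, d, iron_mg) - target))

print(recommend_dose(current_hgb=10.2, iron_mg=100))   # 60
```

The real ACM replaces the placeholder with a patient-specific ANN and updates its recommendation at every new Hgb measurement, which is what allows it to dampen the Hgb cycling discussed below.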

The ACM has been integrated into the EMR of the dialysis organization to optimize the anemia management process. Also, the ACM is being used by external dialysis providers in the European Union and recommendations are provided through an application.

Leveraging ML to individualize anemia management has yielded promising results. Barbieri and colleagues found in a retrospective analysis that the ACM’s prediction of Hgb levels 1 month in the future was within 0.6 g/dL of the observed value.27 A multisite observational clinical trial of the ACM conducted in the Czech Republic, Portugal and Spain found that ACM-guided management was significantly associated with a 0.17 mg/kg/month decrease in the median dose of darbepoetin alfa, about 6% more patients on HD achieving Hgb targets, a decrease in the fluctuation of Hgb levels over time, and decreases in the rates of hospital days, cardiovascular events and transfusions.28 Figure 5 shows the longitudinal changes in Hgb levels and darbepoetin alfa dose before and after use of the ACM. Continued prospective testing in Spain has yielded consistent beneficial findings.30 A retrospective analysis of the ACM’s ability to predict Hgb levels 3 months in the future found predictions were within 0.75 g/dL of the observed value.29 This ML model appears to provide tighter control of anemia management, minimizing ESA use and reducing the risks of negative outcomes, while improving achievement of targets and streamlining nephrologists’ workflows.

Figure 5. Hemoglobin (Hgb) series and erythropoietic-stimulating agent (ESA) administrations for a sample patient are shown. (Top graphic) The Hgb temporal evolution is plotted. The vertical dotted line represents the time of Anemia Control Model (ACM) introduction; green circles identify Hgb values resulting from confirmed suggestions. (Bottom graphic) ESA administrations are plotted. The first ACM suggestion was rejected (as indicated by the red circle on the resulting Hgb value), whereas all subsequent suggestions were accepted. Correspondingly, a reduction in Hgb cycling can be observed. Darbo = darbepoetin alfa. Figure reproduced with permission of International Society of Nephrology/Elsevier.28 Source: Barbieri C, et al. Kidney Int. 2016;doi:10.1016/j.kint.2016.03.036.

Source: International Society of Nephrology/ Elsevier

Conclusion

The rise of AI has the potential to redefine health care delivery. It should be viewed as a decision support tool to extend human insight and cognition and not something that will replace human medical decision-making on interventions. It should be employed to assist overburdened providers and support their endeavor of achieving best practices and outcomes for the patients they serve.

However, factors such as the clinical effectiveness of an AI solution, and accountability in case of error, need to be carefully considered before implementing such solutions. Furthermore, policies and regulations need to evolve to guide the development of AI at an appropriate pace and scale. Most importantly, the value of an AI solution in delivering better outcomes needs to be demonstrated to patients, physicians and providers to foster their trust.

Disclosures: The authors report no relevant financial disclosures.
