· A Federated Learning-based platform securely and rapidly enables access to anonymized, structured health data for research and collaboration.
· This will enable collaborative engagement across the healthcare ecosystem to create innovative AI solutions.
Aster Innovation and Research Centre, the innovation hub of Aster DM Healthcare, has joined with Intel Corporation and CARPL to announce a state-of-the-art ‘Secure Federated Learning Platform.’ This collaboration will enable the development of AI-enabled health tech solutions where data can securely reside where it is generated. The collaboration will boost innovation in areas such as drug discovery, diagnosis, genomics, and predictive healthcare. It will also allow clinical trials to access relevant data sets in a secure and distributed manner.
A single patient generates nearly 80 MB of imaging and EMR data annually; according to 2017 estimates, RBC Capital Markets projects that “by 2025, the compound annual growth rate of data for healthcare will reach 36%. Genomic data alone is predicted to be 2–40 exabytes by 2025—eclipsing the amount of data acquired by all other technological platforms.”
AI-enabled solutions in areas such as medical imaging are helping to address pressing challenges in healthcare such as staffing shortages and aging populations. However, accessing silos of relevant data spread across the different hospitals, geographies, and other health systems while complying with regulatory policies is a massive challenge.
Commenting on this first-of-its-kind collaboration, Dr Azad Moopen, Founder, Chairman and Managing Director of Aster DM Healthcare, said, “Aster Innovation and Research Centre is glad to partner with technology giants like CARPL and Intel to bring highly progressive healthcare solutions through digital advances and Artificial Intelligence. The Secure Federated Learning initiative will help analyse data, support the development of predictive mechanisms for patients, enable second opinions on treatments, and, most importantly, affirm data security and patient confidentiality. So far, only a few such initiatives have been conducted, especially in the healthcare space. This collaborative platform with world leaders will open doors for many players in the sector to participate in developing accessible healthcare solutions.”
Nivruti Rai, Country Head, Intel India & Vice President, Intel Foundry Services, said, “AI applications are at the cusp of revolutionizing healthcare through timely and effective screening, diagnosis and treatment of diseases. Getting access to high quality training datasets and addressing limitations in the form of regulatory frameworks and geographic boundaries are critical imperatives. I am happy to announce that Aster and Intel have collaborated to address these challenges and deployed a first-of-its-kind Secure Federated Learning Platform in India. It offers a real-world solution by addressing key aspects like security, trust and confidentiality for optimal use of data. This solution will be offered as a service to be used by both AI researchers and data custodians in their pursuit of advancing AI innovation and extensive impact in healthcare. It marks a paradigm shift by getting the compute to the data rather than getting the data to the compute. Our joint intent is to make this platform available to the health ecosystem to solve some of the large-scale healthcare problems and enable quality, affordable and at-scale healthcare.”
Dr. Vidur Mahajan, Chief Executive Officer, CARPL.ai said, “There is no doubt that de-centralised data storage and subsequent training of AI models in a federated manner is the future, especially since lack of generalisability of AI is becoming a bigger problem. We are glad to partner with brands that are doyens of their respective fields – Aster in healthcare and Intel in computation – to enable the extraction, anonymisation, annotation and delivery of data to the AI models through CARPL. Our mission is to take AI from bench to clinic, and this is another example of the same.”
How it works
Federated Learning (FL) is a method of training AI algorithms on data stored at multiple decentralized sources without moving that data. To facilitate the adoption of federated learning, Intel has led the development of OpenFL, an open-source framework for training machine learning algorithms that provides a solution to “data silos” by leveraging Intel’s security technology.
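The core idea can be illustrated with a minimal federated-averaging sketch. This is not the OpenFL API; it is a simplified, self-contained simulation in which two hypothetical sites train locally on data that never leaves them, and only model weights are sent to an aggregator. All names and data below are illustrative.

```python
import numpy as np

def local_train(weights, X, y, lr=0.1, epochs=5):
    """Simulate one site's local training: logistic-regression
    gradient steps on data that never leaves the site."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1 / (1 + np.exp(-X @ w))   # sigmoid predictions
        grad = X.T @ (preds - y) / len(y)  # logistic-loss gradient
        w -= lr * grad
    return w

def federated_average(site_weights, site_sizes):
    """FedAvg: aggregate site models, weighted by dataset size."""
    total = sum(site_sizes)
    return sum(w * (n / total) for w, n in zip(site_weights, site_sizes))

# Two hypothetical sites; each holds its own synthetic dataset.
rng = np.random.default_rng(0)
sites = []
for n in (200, 100):
    X = rng.normal(size=(n, 3))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)
    sites.append((X, y))

# Federated rounds: only weights travel, never raw data.
global_w = np.zeros(3)
for _ in range(10):
    updates = [local_train(global_w, X, y) for X, y in sites]
    global_w = federated_average(updates, [len(y) for _, y in sites])

print(global_w.round(2))
```

In a production platform the aggregation step would run inside a trusted execution environment and the "sites" would be real hospitals, but the data-stays-local pattern is the same.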
Intel® Software Guard Extensions (Intel® SGX) offers hardware-based memory protection by isolating specific application code and data in memory. This secure FL solution enables the protection of workload intellectual property (IP) and secures health data with its custodians. OpenFL was combined with CARPL’s rich data extract, transform, and load (ETL) capabilities for end-to-end AI model training.
The capability of this platform was demonstrated using hospital data from the Kerala, Bengaluru, and Vijayawada clusters of Aster Hospitals. Over 125,000 chest X-ray images, including 18,573 images selected from over 30,000 unique patients’ data from Bengaluru, were used to train a CheXNet AI model with two-node Federated Learning across the Bengaluru and Vijayawada sites to detect abnormalities in X-ray reports. The 18,573 unique images additionally provided a 3% accuracy boost, as they represented real-world data that was otherwise not available for training the AI model.
Advantages for the healthcare ecosystem
• Allows data scientists from multiple organizations to perform AI training without sharing raw data
• Gives healthcare providers and other ecosystem partners access to larger datasets for developing AI models in preventative and predictive medicine
• Ensures organizational data compliance and governance, since raw data is never shared and security and privacy are guaranteed
• Increases the accuracy of AI model training due to access to larger datasets
The success of this pilot paves the way for the next level of engagement: democratising access to health data across organisational and geographical boundaries without compromising data privacy and security.