
Developing the Foundation for Healthcare Data Integration

In many countries, patient data resides across multiple point-of-care systems, and in a variety of structured and unstructured formats. Often, various point-of-care systems (clinics, primary physicians, hospitals, pharmacies, etc.) within a health system collect the same information and can’t share that data with other point-of-care systems, leading to an inefficient system and a fragmented picture of a patient’s health.

Breaking down the barriers to data access while adhering to privacy and security principles is an immense challenge and a tremendous opportunity. Standardized, granular patient data that can be shared across systems provides the foundation for low-cost, high-quality healthcare, in part by enabling frontline health systems to function optimally and effectively. Such data is also essential for AI algorithms to deliver insights.

In such a scenario, data standardization and aggregation is a top priority. We want to know:
  • What are the measures of successful data standardization and aggregation, or of systems interoperability? How has this been assessed in other sectors or industries, such as finance?
  • What is the best analog for this type of data standardization and aggregation from another sector or industry?
  • What technologies can help with data standardization and aggregation? How can AI and ML help?
  • What considerations need to be made with respect to an infrastructure that still protects privacy and security?

Comments

  • Shashi Posts: 596 admin
    Hi @scveena, @jonc101, @ajchenx, @synhodo, @reubenwenisch, @mashizaq, @ymedan, @nastyahaut, @Sujana, @joshnesbit, @namkugkim, @Haruyo, @bngejane, @RickyM, @Stefania - Curious to know if AI / ML can help in data standardization and aggregation?
  • ymedan Posts: 127 ✭✭✭
    The only "technology" that can help in data standardization and aggregation is human collaboration and data sharing. The current situation, at least on my side of the pond, is that data is fragmented and locked in islands.
    IMHO, the role of ML/AI is in extracting knowledge from data lakes/oceans. Humans need to set the standards for content structure and interfaces (as was done with many "omics").
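As an illustration of the division of labour ymedan describes, here is a minimal sketch in Python: humans define the target schema and the field mappings (which an ML model could eventually propose for review), and automation applies them across heterogeneous sources. All field names, source systems, and values below are hypothetical.

```python
# Minimal sketch: human-defined schema + mappings, applied automatically.
# All field names and source systems below are hypothetical.

# Human-defined target schema: every aggregated record must carry these fields.
TARGET_SCHEMA = ["patient_id", "birth_year", "systolic_bp", "diastolic_bp"]

# Human-curated mappings from each source system's field names to the schema.
# In practice, an ML model could propose these mappings for humans to confirm.
FIELD_MAPS = {
    "clinic_a": {"pid": "patient_id", "yob": "birth_year",
                 "bp_sys": "systolic_bp", "bp_dia": "diastolic_bp"},
    "hospital_b": {"PatientID": "patient_id", "BirthYear": "birth_year",
                   "SBP": "systolic_bp", "DBP": "diastolic_bp"},
}

def standardize(record: dict, source: str) -> dict:
    """Rename a source record's fields into the shared schema, dropping extras."""
    mapping = FIELD_MAPS[source]
    mapped = {mapping[k]: v for k, v in record.items() if k in mapping}
    # Always return the same structure, filling fields the source did not supply.
    return {field: mapped.get(field) for field in TARGET_SCHEMA}

if __name__ == "__main__":
    print(standardize({"pid": "A-17", "yob": 1984, "bp_sys": 128, "bp_dia": 82}, "clinic_a"))
    print(standardize({"PatientID": "B-09", "BirthYear": 1990, "SBP": 117, "DBP": 74, "Ward": "3E"}, "hospital_b"))
```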
  • Shashi Posts: 596 admin
    edited September 2020
    Thanks @ymedan for sharing your thoughts.

    Hello @aylin, @sarahb, @sarahkhenry, @ukarvind, @ingmarweber, @KarenBett, @Pavel - This discussion is related to one of the other prizes we're designing: Frontline Health. I know you have expertise in technology, specifically AI. I wonder if you have any suggestions on data standardization and aggregation across various platforms? Any analogs from a different sector or industry on data standardization and aggregation would also be helpful in understanding this topic.
  • bngejane Posts: 76 ✭✭
    edited September 2020
    Hey team, please check out Ocean Protocol:
    1. https://port.oceanprotocol.com/t/dataset-structure-formatting-standards/50?u=bngejane.
    2. https://oceanprotocol.com/technology/compute-to-data
    I think the approach that Ocean has embarked on can address some of the issues raised here.
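To make the compute-to-data idea in the second link above concrete, here is a minimal conceptual sketch. This is not the Ocean Protocol API; the class and function names are hypothetical and only illustrate the pattern: analysis code travels to the data holder, and only approved aggregate results ever leave the silo.

```python
# Conceptual sketch of compute-to-data (NOT the Ocean Protocol API; names are hypothetical).
from statistics import mean

class DataSilo:
    """A data holder that runs vetted computations locally and releases only aggregates."""

    def __init__(self, records):
        self._records = records  # raw records never leave this object

    def run(self, computation):
        # A real deployment would also vet the computation and check that the
        # output is a safe aggregate (e.g., minimum cohort size) before release.
        return computation(self._records)

def mean_systolic_bp(records):
    """Example computation shipped to the silo: returns an aggregate, not raw rows."""
    return mean(r["systolic_bp"] for r in records)

if __name__ == "__main__":
    clinic = DataSilo([{"systolic_bp": 128}, {"systolic_bp": 117}, {"systolic_bp": 135}])
    print("Aggregate released to the requester:", clinic.run(mean_systolic_bp))
```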
  • Shashi Posts: 596 admin
    Thanks @bngejane for sharing these resources. Great insights!

    Hi @Shabbir, @jda @fbaothman @acowlagi @elekaja @MachineGenes @Nitesh @yuanluo @dykki @kenjisuzuki @SArora - As tech experts in the digital health space, you might have inputs to share on data standardization and aggregation. Please join the discussion.
  • Nitesh Posts: 7
    This is a challenge, but not one that cannot be solved. From my experience in India and Africa, connected devices are rare, and even the few centres that have them still record patient data on paper; in the case of continuous monitoring, as in an ICU, data is recorded only at defined intervals. A few centres have started entering this data into an app or webpage feeding a central repository. Another thing we have seen in the field is that everyone assumes frontline health workers want a tool for entering this data, so everyone builds an app; health workers end up with multiple apps for data entry, which they find difficult because this is not their primary job and they are already burdened. So a data collection tool should work in tandem with the existing workflows and tools already used by the healthcare system. There also needs to be some incentive for caregivers to share/enter this data until automation is in place.
    This approach has worked only within a single-caregiver system, such as a public health service or a hospital chain. To enable cross-caregiver data transfer, there has to be a partnership, like Partners In Health in Boston.
    To start, we could work within the public healthcare system of a state/county/district to prove a model and showcase its benefits to local administrators, with a view to full integration into a larger system.
    For data security and privacy, there are already many standards available, such as HIPAA compliance, which can be built into the data collection system.
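One small building block of the privacy-by-design Nitesh mentions is pseudonymization at the point of collection. The sketch below (hypothetical field names; on its own this does not make a system HIPAA-compliant) replaces direct identifiers with a keyed hash before a record is synced to a central repository.

```python
# Minimal sketch: pseudonymize records before syncing them upstream.
# Field names are hypothetical; this illustrates one building block, not full compliance.
import hashlib
import hmac

SECRET_KEY = b"change-me"  # in practice, managed per deployment and never hard-coded

DIRECT_IDENTIFIERS = {"name", "phone"}

def pseudonymize(record: dict) -> dict:
    """Return a copy safe to sync: direct identifiers replaced by a stable keyed hash."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    raw_id = record["name"] + record["phone"]
    out["pseudonym"] = hmac.new(SECRET_KEY, raw_id.encode(), hashlib.sha256).hexdigest()[:16]
    return out

if __name__ == "__main__":
    visit = {"name": "Jane Doe", "phone": "555-0100", "temp_c": 38.2, "date": "2020-09-01"}
    print(pseudonymize(visit))  # identifiers are gone; the pseudonym still links repeat visits
```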
  • DidierC Posts: 10 ✭✭
    Hello,
    Thank you for the questions and reactions. I have the impression that a question like "Which kind of data is the most important to aggregate to enable better medical research?" could be added, but maybe it is another topic.

    Cheers. Didier
  • Shashi Posts: 596 admin
    Thanks @Nitesh for sharing those insights. @DidierC - We agree that we need to understand which data is important to aggregate - AI could be extremely helpful in data selection.

    Hi @cimdal2, @preciouslunga, @owen, @Debbie_Rogers, @biki, @sakeuriticus, @lswright, @CHardaker, @Hongsoo, @care2communities, @kkatara, @alabriqu, @abejanis, @ShmuleyG and @rguimara - As tech experts and entrepreneurs, you might have inputs to share on the technologies that can help with data standardization and aggregation. Please join the discussion.
  • HeatherSutton Posts: 77 XPRIZE
    @ymedan ~ I really like your line: "The only 'technology' that can help in data standardization and aggregation is human collaboration and data sharing. " It highlights the essential element of human cooperation when it comes to making real systemic change. Do you have any ideas on how to best foster this kind of collaboration between stakeholders in health? In some ways, it seems to me that a competition (such as ours) is the perfect way to bring these stakeholders together and to catalyze some of this necessary collaboration. But I'm also wondering, in this collaborative process, who takes the lead? Who guides and moderates these types of discussions amongst stakeholders and makes sure that there is alignment on the direction forward?

  • HeatherSutton Posts: 77 XPRIZE
    @bngejane ~ Indeed, thanks for sharing the link to Ocean Protocol. It looks like one of our colleagues is an advisor on the Ocean Protocol, so we'll definitely set up some time to explore it in more detail with him. Do you happen to know of any digital health companies, agencies, or countries that have adopted the Ocean Protocol?
  • HeatherSutton Posts: 77 XPRIZE
    @Nitesh ~ Thank you for sharing your experience from the field. Thank you too for mentioning Partners in Health. I read up about it -- very interesting!
  • HeatherSutton Posts: 77 XPRIZE
    @DidierC ~ You hit the nail on the head! We absolutely must ask which kind of data is the most important to aggregate to enable better research. It seems as though this would not only help downstream, by producing clean, standardized data, but could also help upstream by relieving the burden frontline health workers face in collecting data. If the data collection process is more curated, that could mean fewer unnecessary fields to populate in their digital tools.

    Since the question is so important (and also so big), how does a country come to answer that question? What key perspectives are needed?
  • Shashi Posts: 596 admin
    Hi @meallen3, @Lauren, @C_Castellaz, @baldhame9, @Mohanad530, @aassif_lg, @nothmany, @Marthavjennings, @shamakarkal, @rajpanda, @Kwenz - Curious to know if you have any inputs to share on the questions raised and the comments so far.
  • Kwenz Posts: 2
    The Health Data Collaborative's "Digital Health Systems and Interoperability Working Group" has a range of helpful toolkits and guidelines, notably the "Health Information Systems Interoperability Maturity Toolkit: Self-Assessment Tool", which proposes a unified approach to data standardization and quality.
    The Community Data Working Group developed a toolkit which integrates and builds upon various tools and methods (historical and current) designed to assess data quality at facility level, taking into account good practices and lessons learned from many country contexts. Module 1 looks at Framework and Metrics, Module 2 at a Desk Review of Data Quality, and Module 3 at Data Verification and System Assessment (see the sketch after this comment).

    The World Bank ID4D group developed A Catalog of Technical Standards for Digital Identification Systems, which includes technical standards, an overview of standards-setting bodies, and frameworks for interoperability, and is interesting to review. The application to health is captured in the document The Role of Digital Identification for Healthcare: The Emerging Use Cases.

    Considerations with respect to infrastructure that protects privacy and security:
    1. Legal and policy frameworks must be in place first and foremost.
    2. The technology must be appropriate for the entire country context, not just the city centre: it must work both online and offline in places with limited or no connectivity and unreliable energy sources.
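Following up on the data quality modules Kwenz cites, here is a small illustrative sketch (hypothetical fields and threshold) of the kind of facility-level completeness metric such toolkits formalize.

```python
# Illustrative sketch of a facility-level data completeness check.
# Fields and the threshold are hypothetical; real toolkits define their own metrics.

COMPLETENESS_TARGET = 0.90  # hypothetical target share of fully reported records

def completeness(reports, required_fields):
    """Share of reports in which every required field is filled in."""
    if not reports:
        return 0.0
    complete = sum(
        1 for r in reports
        if all(r.get(f) not in (None, "") for f in required_fields)
    )
    return complete / len(reports)

if __name__ == "__main__":
    facility_reports = [
        {"month": "2020-07", "anc_visits": 41, "deliveries": 12},
        {"month": "2020-08", "anc_visits": 37, "deliveries": None},  # missing value
        {"month": "2020-09", "anc_visits": 44, "deliveries": 15},
    ]
    score = completeness(facility_reports, ["anc_visits", "deliveries"])
    print(f"Completeness: {score:.0%} (target {COMPLETENESS_TARGET:.0%})")
```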
  • Shashi Posts: 596 admin
    @Kwenz - Thanks Kristen for sharing insights and awesome reports on HIS interoperability maturity and data quality frameworks and metrics.
    We have started exploring future use cases for AI and ML in healthcare. If you have inputs or experiences on this, please share them here.
  • mashizaq Posts: 47 ✭✭
    @Shashi With the advent of computer systems and their potential, digitization of clinical exams and medical records has become a standard and widely adopted practice in healthcare. In 2003, the Institute of Medicine (a division of the National Academies of Sciences, Engineering, and Medicine) chose the term "electronic health records" to represent records maintained for improving the healthcare sector to the benefit of patients and clinicians. Electronic health records (EHRs), as defined by Murphy, Hanken and Waters, are computerized medical records for patients: any information relating to the past, present or future physical/mental health or condition of an individual that resides in electronic system(s) used to capture, transmit, receive, store, retrieve, link and manipulate multimedia data for the primary purpose of providing healthcare and health-related services.
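To make the definition above concrete, here is a minimal, hypothetical sketch of the kind of structured record an EHR system captures, stores, retrieves, and links. Real systems use richer standards (for example, HL7 FHIR resources); the simplified structures below are illustrative only.

```python
# Minimal, hypothetical sketch of a structured patient record; not a real EHR standard.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Observation:
    code: str         # what was measured, e.g. "blood_pressure_systolic"
    value: float
    unit: str
    recorded_at: str  # ISO 8601 timestamp

@dataclass
class PatientRecord:
    patient_id: str
    birth_year: int
    observations: List[Observation] = field(default_factory=list)

    def add(self, obs: Observation) -> None:
        """Append a new observation to the patient's longitudinal record."""
        self.observations.append(obs)

if __name__ == "__main__":
    record = PatientRecord(patient_id="A-17", birth_year=1984)
    record.add(Observation("blood_pressure_systolic", 128, "mmHg", "2020-09-01T09:30:00Z"))
    print(record)
```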
  • kenjisuzuki Posts: 5
    My topic is not AI to aggregate or standardize patient data across multiple point-of-care systems, but there is a new technology that can build an AI that integrates multiple local AIs from multiple point-of-care systems. If your purpose for the aggregation and standardization is to use patient data from multiple point-of-care systems to build good AI for those patients, the new technology would work, in theory. The new technology is called federated learning. It is simply described, from AI developers' point of view, as collaborative machine learning without centralized training data. With federated learning, one can build an integrated AI for patients across multiple point-of-care systems, multiple regions, or multiple countries. First, AI developers build multiple local AIs with the patient data in individual point-of-care systems. Local AIs are built locally, without bringing patient data out of the point-of-care systems, so there is no concern about patient privacy or identity leakage. Instead of collecting patient data from outside, an integrating AI developer collects the local AI models and integrates them into a bigger, better integrated AI for all patients. It's still quite a new technology that emerged a couple of years ago in university labs, so it's unknown whether this idea would really work in the real world.
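As a toy illustration of the federated idea kenjisuzuki describes, the sketch below fits a simple linear model separately at two simulated sites and then averages only the model parameters, weighted by sample counts. The data, model, and weighting are illustrative assumptions, not a production federated learning algorithm.

```python
# Toy sketch of federated averaging: only model parameters are shared, never patient data.
# The synthetic data and the simple linear model are illustrative assumptions.
import numpy as np

def local_fit(x, y):
    """Fit slope and intercept of y = a*x + b by least squares on one site's local data."""
    a, b = np.polyfit(x, y, deg=1)
    return np.array([a, b])

def federated_average(local_params, sample_counts):
    """Combine local parameter vectors into one model, weighted by local sample counts."""
    return np.average(np.vstack(local_params), axis=0, weights=np.asarray(sample_counts, float))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two point-of-care sites with private data; in a real deployment the raw arrays
    # would never leave each site -- only the fitted parameters are transmitted.
    sites = []
    for n in (50, 80):
        x = rng.uniform(20, 80, size=n)          # e.g., patient age
        y = 0.5 * x + 70 + rng.normal(0, 5, n)   # e.g., a synthetic vital sign
        sites.append((x, y))

    params = [local_fit(x, y) for x, y in sites]
    counts = [len(x) for x, _ in sites]
    print("Per-site parameters:  ", [p.round(2) for p in params])
    print("Federated parameters: ", federated_average(params, counts).round(2))
```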