
Prize Design Summary

XPRIZE Posts: 193 admin
edited December 2020 in Gender Data Gap
The first data challenge in a series, the XPRIZE Mental Health Gender Data Challenge, will incentivize teams to collect missing data that informs our understanding of how women experience depression and of the factors that lead them to experience higher rates of depression and related concepts of distress.

The challenge comprises two rounds:

Round One: Proposal Submission

In Round One, teams will submit detailed proposals to a panel of expert judges appointed by XPRIZE in tandem with XPRIZE’s university partner. Once the competition is active, teams will have up to five months to submit their proposals. We expect proposals to meet the following minimum requirements:
  • Innovation Potential: Proposed methodological innovations or new forms of data collection or analysis.
  • Location: Detail on the proposed research setting and evidence of the team’s understanding of the context and culture.
  • Methodology: A description of the plan for collecting gender-disaggregated, qualitative and quantitative data on depression.
  • Partnerships: A description of current and potential collaborative partnerships with government ministries, advocacy groups, research centers and universities, NGOs, and other organizations.
  • Qualifications: Team member CVs and bios detailing qualifications and expertise relevant to the challenge.
  • Budget: An overview of the proposed operational plan and budget.
  • Ethical Framework: A locally adapted ethical framework that ensures the protection of human subjects, informed consent procedures, and data privacy and security protocols.
  • Secure Data Accessibility: Teams must consent to XPRIZE open access and accessibility requirements.

Round Two: Data Collection

During Round Two: Data Collection, teams will obtain Institutional Review Board (IRB) approval, complete research and data collection in the field (fieldwork), and ensure that data is prepared for submission. At the end of the round, judges will evaluate the quality and type of data collected by teams and their effectiveness in carrying out the goals set forth in their Round One proposals.

Quantitative Data Requirements: Judges will evaluate teams’ quantitative data against a set of criteria such as the following (an illustrative sketch of basic automated checks appears after this list):
  • Interoperability: Data files must be provided in standard, machine-readable formats that other researchers and tools can readily access and use.
  • Secure Open Access: Data collected by teams must be stored securely and must protect participant privacy.
  • Metadata: Data collected must have the appropriate metadata and contextual information.
  • Contextual Coverage: Data points collected must include a minimum set of demographic variables such as age, sex, gender, marital status, and employment status (with additional data points to be determined by our expert partnerships).
  • Statistically Significant Sample Size: Sample sizes must be large enough to support statistically significant findings for the research goals and population under scrutiny.
  • Clean: Teams will be expected to detect and remove inaccurate or duplicative records to ensure that data collected is optimized for analysis.
  • Complete: Judges will evaluate teams’ effectiveness in filling all required data fields.
  • Construct and Internal Validity: Teams must provide evidence of the relationship between the data collected and depression diagnosis.
  • Disaggregation: Data collected by teams must be sex and gender disaggregated.
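
The criteria above lend themselves to simple automated checks before submission. The sketch below is purely illustrative and not part of the prize design; it assumes a flat CSV submission and hypothetical column names (participant_id, phq9_score, and so on) to show how duplicate records, completeness, required demographic fields, and sex/gender disaggregation might be verified.

```python
# Illustrative only: basic pre-submission checks against the quantitative criteria.
# Column names such as "participant_id" and "phq9_score" are assumptions, not
# requirements from the prize design.
import pandas as pd

REQUIRED_FIELDS = [
    "participant_id", "age", "sex", "gender",
    "marital_status", "employment_status", "phq9_score",
]

def basic_submission_checks(path: str) -> dict:
    """Run simple duplicate, completeness, and disaggregation checks on a CSV file."""
    df = pd.read_csv(path)  # Interoperability: a plain, machine-readable CSV

    # Contextual Coverage: every required demographic field must be present.
    missing_columns = [c for c in REQUIRED_FIELDS if c not in df.columns]

    # Clean: flag duplicate participant records for review rather than dropping them silently.
    duplicates = (
        int(df.duplicated(subset="participant_id").sum())
        if "participant_id" in df.columns else None
    )

    # Complete: share of non-empty cells across the required fields that exist.
    present = [c for c in REQUIRED_FIELDS if c in df.columns]
    completeness = float(df[present].notna().mean().mean()) if present else 0.0

    # Disaggregation: participant counts broken down by sex and gender.
    disaggregation = (
        df.groupby(["sex", "gender"]).size().to_dict()
        if {"sex", "gender"}.issubset(df.columns) else None
    )

    return {
        "missing_columns": missing_columns,
        "duplicate_participant_records": duplicates,
        "completeness_rate": completeness,
        "counts_by_sex_and_gender": disaggregation,
    }

if __name__ == "__main__":
    # "round_two_submission.csv" is a hypothetical file name.
    print(basic_submission_checks("round_two_submission.csv"))
```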

Qualitative Data Requirements: Judges will evaluate qualitative data and the team’s report of their data collection process according to criteria such as:
  • Themes: Does the data address a minimum of 90% of the specific themes of interest identified by XPRIZE?
  • Triangulation: Has the team triangulated their findings with other sources (people, data, observations)? Have multiple methods been used to come to specific data points or conclusions? Are the methods consistent with the questions?
  • Saturation: Was saturation achieved? Does the theme saturation have adequate justification based on the evidence available?
  • Confirmability: Does the team leave an “audit trail” showing where their data came from and how it was collected? Are all procedures communicated?
  • Internal validity/Member checking: Are the data trustworthy? Is there evidence that the data came from someone in that community and is a view that a person in that community would have?
  • Reflexivity: Is there an articulation of how the researchers were reflexive and how they accounted for their presence and influence on the research process and data collection?

In your opinion, what is missing, or what isn’t necessary?

We look forward to seeing all of your insights and feedback!

Comments

  • mhackett Posts: 14 ✭✭
    1. Data are plural - the singular is datum
    2. There seems to be some confusion regarding qualitative criteria. Are XPRIZE mandating the qualitative approach/method?
    3. If XPRIZE are identifying Themes before data are collected this demonstrates a lack of understanding of qualitative methods. Themes are identified by analysing transcriptions for commonly repeated utterances and topics, grouped and prioritized based on information received. If XPRIZE stipulates that information must be provided on 90% of their themes of interest, it negates the purpose of qualitative interviews -> identifying what is important to the participants. If that pre-specified 'theme/topic' is not important to respondents it won't emerge as a theme. This 90% criterion may not be met for very good reason.
    4. You appear to be specifying deductive coding if you are prespecifying themes. Are you also providing a coding framework to go with this type of coding?
    5. Not all qualitative methods require triangulation - so again what method are you stipulating?
    6. Not all qualitative research requires saturation - so what method/framework/theory are you stipulating researchers must use?
    7. I don't think I have ever seen an audit trail for qualitative research - this is common for quantitative research and it should be moved there
    8. You will need to provide criteria for how you will assess internal validity as I can't work it out
  • Shashi Posts: 596 admin
    @mhackett - Thanks, Prof. Maree, for your feedback. Our research team has taken note of all the points and will reply.

    Hello @jpayne5, @AnnalijnUBC, @MarianneSeney and @WD_Research - It would be nice to hear your feedback on the final prize design. Thanks.
  • mhackett Posts: 14 ✭✭
    Also, while I'm here, some further thoughts on the quantitative criteria:
    1. Statistically significant sample size - this is appropriate for a trial where we are looking for differences between interventions, or for observational studies where we might have a theory that x increases chances of y - and we know the usual frequency of y and what we would consider a clinically important difference. So this might apply to the study of factors that lead women to experience higher rates of depression and related concepts of distress, but not the understanding of how women experience depression (qualitative). (A worked sketch of this kind of calculation follows this comment.)
    2. For completeness, you will need to think about what this means, i.e., do you mean all respondents answering all questions (which is possible if every question includes a 'don't know/don't care/choose not to answer' option), or x% of respondents answering y% of questions? Many people choose not to answer questions they view as personal, incriminating or stigmatising.
    3. I would like to think that Construct and Internal Validity would be covered during the proposal phase where the applicants would indicate that they would use culturally validated methods to assess the various endpoints.
    4. You might like to consider information from the Enhancing the QUAlity and Transparency Of health Research (EQUATOR) Network, available here: https://www.equator-network.org/reporting-guidelines/strobe/ - specifically the STROBE and SRQR guidelines.
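To make the sample-size point in item 1 above concrete, the following is a minimal editorial sketch, not part of the thread or the prize design: it solves for the number of participants per group given an assumed baseline prevalence of the outcome and the smallest difference considered important. The 20% and 28% figures are placeholders, and the statsmodels library is assumed to be available.

```python
# Illustrative only: a two-group sample-size calculation of the kind described above.
# The prevalence figures below are placeholder assumptions, not values from the design.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

baseline_prevalence = 0.20  # assumed usual frequency of the outcome (y)
exposed_prevalence = 0.28   # baseline plus the smallest difference considered important

# Cohen's h effect size for comparing two proportions.
effect_size = proportion_effectsize(exposed_prevalence, baseline_prevalence)

# Solve for participants per group at 5% two-sided alpha and 80% power.
n_per_group = NormalIndPower().solve_power(
    effect_size=effect_size,
    alpha=0.05,
    power=0.80,
    ratio=1.0,
    alternative="two-sided",
)

print(f"Approximately {n_per_group:.0f} participants per group would be needed.")
```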
  • jpayne5 Posts: 6
    I agree with the above comments. I'm also concerned about mixing qualitative and quantitative methods in the same competition; it might be better to let teams choose one method or the other. If you get teams trying to do everything, the product is likely to be shoddy. Maybe two different challenges?
  • Aaron_Denham Posts: 33 XPRIZE
    Thanks for the feedback, everyone. Unfortunately, we could not post all of the design details here; those details would speak to some of the concerns and ideas raised. However, once this advances to the next stage, we will take your feedback on board, in collaboration with our partners, to build the comprehensive study design. Once that is complete, we'll make it public and open for comment.
  • jpayne5 Posts: 6
    Hi All- Just wondering where this challenge is in the process? Very exciting stuff....
  • Kathleen_Hamrick Posts: 66 XPRIZE
    Hi @jpayne5 it's good to hear from you — thanks for reaching out! The design is complete and under review. We hope to share more information soon!