Multidisciplinary ONC Work Group #9: Meeting Summary

PCOR Privacy and Security Research Scenario Initiative and Legal Analysis and Ethics Framework Development

March 30, 2016, 1:00 PM – 2:00 PM

Meeting Summary

Introductions

The NORC team began the ninth work group meeting for the Patient Centered Outcomes Research (PCOR) Privacy and Security Research Scenario Initiative with introductions. Eric Goplerud, from the NORC team, led the call. Other presenters included Jane Hyatt Thorpe, Lara Cartwright-Smith, and Elizabeth Gray, from George Washington University (GW); Daniella Meeker, from the University of Southern California; and Ioana Singureanu, from Eversolve. The project representative from the Office of the National Coordinator for Health Information Technology (ONC) was Devi Mehta, Privacy Policy Analyst. Please see Appendix A below for a full list of meeting participants.

The meeting covered four main areas:

■ Three new scenarios related to precision medicine:
  ■ One about disclosures that affect family privacy
  ■ One focused on testing and disclosure of genomic data to a minor
  ■ One on the use of genetic biomarkers for research
■ An update on previous scenarios undergoing refinement
■ A walkthrough of other scenarios being considered for development
■ Logistics and next steps

Key Questions

The NORC team is attempting to answer specific questions for each research data use scenario. The NORC team is interested in the "who", "what", and "why" in order to understand how the data move between different parties. The GW team will review these issues more closely after the initial development process and consider the legal, ethical, and policy dimensions. The multi-stakeholder group will need to consider the arrangements and agreements that govern data movement between parties sharing/exchanging data.

■ What arrangements/agreements exist between the sources of data and the entity combining the data?
■ For what purpose(s) are the data sets being combined?
MEETING #5 SUMMARY, FEBRUARY 5, 2016 | 1

Consent will be an important consideration in the research data use scenarios and the legal and ethical framework.

■ Was consent obtained for the specific use of data?
■ What was the process for obtaining consent or waiver?

Understanding the barriers to collecting, accessing, and using data for the research community will also be very important to this project. The multi-stakeholder group should consider any barriers or impediments they have experienced while conducting research that could potentially have been mitigated by more flexibility in the legal structure.

Working Through Research Data Use Scenarios

The NORC team developed three scenarios related to precision medicine: disclosures that affect family privacy, testing and disclosure of genomic data to a minor, and use of genetic biomarkers for research. These scenarios were inspired, in part, by the work underway with the Precision Medicine Initiative, as well as work in the field. The NIH provided feedback on the three scenarios presented today.

Research Data Use Scenario: Disclosures that Affect Family Privacy

Daniella Meeker described the first scenario, about an unintended compromise of a family member's privacy in a case where research participants might answer questions about their family members on research-related questionnaires. It is important to think of ways in which this kind of information might need to be disclosed ethically. Under what circumstances would we have an obligation to disclose information about family members, or to family members? The idea in this scenario is that a repository holds environmental health data as well as demographic data and participant-reported data about family history.
In the course of analysis, the researcher discovers information about environmental health risks that could affect not only the research participants but also their family members.

Questions for stakeholder consideration: The following questions relate to informed consent regarding research discoveries and study results of potential interest to both participants and family members:

o At the time of consent, should participants have an opportunity to opt in/out of receiving information related to study results?
o Is there a case in which family members whose information (as reported by the study participant) is stored in research repositories should consent or assent so that they may be informed of study results?
o What factors should IRBs consider when making determinations about return of results to patients?
o When participants find that they are at risk for certain conditions, what policies and procedures should be in place for contacting and counseling patients?
o What obligations does the researcher have, if any, about informing participants or family members of findings? Are counseling resources required?
o What guidelines should researchers consider in the process of informing participants of potential privacy risks to family members?
o Should researchers inform family members initially that their information has been reported as part of the study?
o Does the presence of genetic data in the sample affect privacy considerations?
o What are the legal and privacy implications of "big data" collection and analysis?
One stakeholder felt strongly that researchers have no obligation to inform family members who are not involved in the research, since they were not consulted at the outset and may not even know that a family member is participating in the research project. Stakeholders discussed the idea that privacy should not override common sense. In an extreme case, if an individual has an infectious disease, it is difficult to put up boundaries around the research and refuse to recognize the consequences of the research findings. In the case of genomic research, family history may be examined as part of an individual's data, and it may be the case that the person at greatest risk for a condition is someone other than the research participant. Whether that information should be shared is an ethical issue as well as a health care issue. There are also circumstances in which the significance of what is found may affect the whole family. Additionally, stakeholders felt the definition of "family" needs to be broader.

Another specific example of these issues comes from a case many years ago involving the French Data Protection Commission. Researchers found that an intervention would prevent the participants from going blind, but contact was prohibited on privacy grounds. On one hand, once you create an obligation to contact people and give them information, you run the risk of creating legal liabilities; on the other, there is also the possibility of a lawsuit if you do not notify people. The decision to contact someone may depend on factors such as the specific risk to that individual with regard to the condition of interest. Daniella asked the group whether anyone is aware of guidelines for these kinds of situations.
Another stakeholder responded that we may first want to look into guidelines on informing the individuals themselves about findings, since the American College of Medical Genetics has guidelines for return of results to individuals. It might then be possible to extrapolate guidelines on disclosing to family members. Eric asked the group whether including language about contacting family members in the informed consent adds to, resolves, or complicates these issues. One stakeholder replied that contacting people always complicates issues. However, other circumstances may require contact, such as when an intervention may result in better outcomes for people. There is a need to establish clear boundaries for what you are willing to do as a researcher. Ioana summarized the discussion with the conclusion that guidance would be useful for researchers to include in their protocols. She suggested that if there is a specific risk in not disclosing information to people other than the study participant, then there may be an obligation to disclose; and if there is no risk, then there is no obligation. The stakeholders agreed that without guidelines, every researcher who faces this problem deals with it differently, which could lead to drastically different outcomes. One solution could be some sort of decision tree to clarify what actions researchers should take in various situations related to disclosure of findings. Eric asked the stakeholder group whether there are any populations for which disclosure might require greater sensitivity (e.g., substance use, HIV risk, or TB risk). Lara from GW added that some states do have laws, particularly for HIV, that prohibit providers from disclosing that information to a partner.

Research Data Use Scenario: Testing and Disclosure of Genomic Data to a Minor

Daniella then introduced the second scenario the NORC team prepared related to precision medicine.
This scenario focuses on Huntington's disease in particular. In this case, the research results are fairly definitive as to disease markers (as opposed to risk factors). There is a high likelihood that if a researcher sees the Huntington's disease genes, the participant will display symptoms in the present or future. This study includes not only patients who are seeking care, but also their family members, particularly minors who are children of the Huntington's patients. In this case, the parents give consent on behalf of the minor, and the minors provide assent (which is consistent with preconditions discussed in the context of other scenarios). Participants are also asked whether the researchers can conduct follow-up with them after the study.

Questions for stakeholder consideration:

o Do considerations change if researchers return results to a participating minor before the minor has reached the age of consent? What are the relevant state laws?
o Should participants opt in/out of receiving these types of results?
o Should parents have access to results about their children, or should the research team retain that information until the minor has reached the age of consent and can consent/decline to receive the results directly?
o What are researchers' obligations for the timing, level of detail, security, and education around the return of participant results?
o Is it necessary to discuss these policies/gaps as part of the consent process to ensure the participant understands the risks of participating in the research? Potential issues:
  o Loss of autonomy as to when the participant finds out their disease status
  o Conflicts between when a parent wants their child to find out about disease status vs. legal requirements for disclosure
o How secure is the system used to administer/process results?

There are two important angles here: the information system used for return of results, and how to navigate dynamics between family members when there is certainty about what the results imply. A stakeholder brought up an example in which an individual from Duke had his entire genome sequenced and discovered he had certain genetic predictors of disease. He had young daughters and struggled to determine at what age he would tell them. It comes down to personal preference about what you do or do not want to know, whether the condition is treatable, whether intervention makes a difference to an outcome, and at what age you reveal the information. It is essential to do research into the factors that affect the answers to some of these questions.

One stakeholder has experience conducting research related to granular consent and is actively working to understand proxy consent. In that work, which is limited in numbers and settings, they ask participants whether they want their EHR data to be used for research in a deidentified manner, who they want to share information with, and which components. The findings so far are that people are concerned with family history, personal history, financial information, and disclosure of their information to for-profit institutions. The participant should decide whether or not to share that information, unless there is a real, definitive intervention that could be offered, but those examples are extremely rare. In her view, contacting family members without explicit participant consent to do so is ethically wrong. Eric asked the stakeholder group to weigh in on whether the certainty or a greater probability of a diagnosis or a condition developing (e.g., Huntington's disease) affects the concerns and issues the stakeholders introduced in the previous scenario.
One stakeholder suggested the team should consider whether there are any actionable steps after diagnosis. Eric then asked whether the probability of affecting the outcome, on a sliding scale (e.g., a diagnosis that will have only an incremental benefit in terms of what can be changed in the patient's outcome versus a more dramatic treatment that can be offered), changes how the stakeholders would consider the scenario.

Daniella asked the stakeholders to continue to express their thoughts on these questions offline if possible. She also asked them to relay their own experiences in terms of use cases and the legal and ethical boundaries they have felt need additional clarification in their own research. She asked them to consider the obligation or liability researchers incur when protecting patients within the course of PCOR, including privacy and security concerns. Ioana was also interested in how the NORC team should explain how to manage this kind of information and how long a researcher should store and secure the information (e.g., in case future research revealed something actionable). Granular consent already requires several management and consent procedures, and these kinds of re-contact procedures can add a new layer to the use case models. Ioana further explained that the NORC team is trying to determine, from a workflow standpoint and without getting into the details of the data types related to CLIN2, the key privacy or ethical decision points that must be evaluated by humans or machines and which workflow steps and activities can be automated.
The team would like to communicate these requirements in a format that is understandable to a diverse group of policymakers, implementers, legal experts, and other interested parties that may use the products of this initiative downstream. A stakeholder responded that he agreed this process may have several different stages, but these possibilities (e.g., re-contact procedures) should be part of the consent process. Daniella asked whether he thought the IRB should be responsible for ensuring all of these possibilities are clarified in the consent process, since this may require a shift in existing guidelines and an increase in their rigor. The stakeholder responded that this is an important question for participants to be aware of before entering a study. Eric asked the stakeholders to consider whether there is a qualitative difference between a communicable risk and a genetic risk in a study, specifically genetic transmission or genetic risk markers. Certain scenarios may include index subjects who have a disease, or have been exposed to a disease, that can be transmitted to a family member or others. He asked the stakeholders to think about whether there is a different quality of risk for an infectious condition versus a genetic marker.

Research Data Use Scenario: Use of Genetic Biomarkers for Research

Daniella introduced the third precision medicine scenario to the multi-stakeholder group, which focused on the use of genetic biomarkers in research. The researcher in this scenario plans a study of gene networks within the human genome thought to be involved in metabolic regulation. The purpose of the study is to better understand the genetic risk factors associated with metabolic disorders. As a precondition, all necessary approvals from the IRB have been obtained to conduct the study.
The researchers collect blood and saliva samples from participants to test for genetic biomarkers of interest and combine data from these genetic samples with genetic data obtained from the coalition database. No data from HIPAA-covered entities are used in this study. Participants volunteer for the study in response to public advertising. Under the terms of the federal grant, coalition members must share all data collected under grant funding in order to support each other's research and further the public good (e.g., biomarkers, loci for certain diseases). Therefore, the researcher will share and obtain data from the other members of the coalition, and sets up data use agreements (DUAs) with these organizations. Patients consent to the study with the understanding that their data will be used for multiple research studies over time. They will not be contacted to renew consent because the IRB judged re-use of the data to be minimal risk, since any direct identifiers associated with the data will be removed and participants will not need to provide additional samples.

After the study received approval, computer science researchers published findings demonstrating a way to re-identify patients with advanced cryptography methods. These cryptographic methods put the researcher's sample and the coalition's data at risk. In addition, the cryptographers make their methods publicly available, thereby increasing the likelihood of their use for reidentification and increasing the risk to participants who contributed genetic data to the coalition.
Daniella asked the multi-stakeholder group to consider the following questions related to the scenario:

o What types of laws, policies, or governance structures could be established to address reidentification issues?
o Should consent forms include information about reidentification risk?
o How do you allow patients the opportunity to control what types of research are conducted with their data?
o By what methods should data be deidentified to minimize the risk of reidentification? Does an enclave or a repository where computing is centralized reduce risk? What policies should be applied?
o What potential group and individual harms exist even when data are de-identified?
o What additional data may need to be collected for research that, in tandem with genetic biomarker data (clinical, environmental, or otherwise), might increase the risk of reidentifiability?
o Should the biospecimen samples be destroyed once researchers collect the data of interest? Would this help protect patients?
o What obligations do individuals who have discovered re-identification algorithms have to alert public institutions or the research teams themselves about their findings before making them available in research journals or other forums?

These questions demonstrate the tension between individuals who are trying to innovate data reidentification methods and individuals trying to protect privacy through security mechanisms. One stakeholder responded that he published a legislative proposal in a law journal a few years ago focused on the risks of reidentification. He posited that everything should be controlled with a data use agreement (DUA), and the DUA would plug into the statute voluntarily. The DUA would set the terms and make all parties accountable by providing for liability and enforcement.
The key is that any individual or institution allowed access to the data must sign the requisite DUA outlining what they can and cannot do with the data, including whether and how they are permitted to reidentify data with any available technology. This requirement addresses many of the problems introduced in this scenario. Daniella asked the stakeholders whether this kind of law would protect against adversaries. This also raised the question of whether researchers should be treated as adversaries or instead given some sort of special consideration. These kinds of agreements often rest on assumptions that raise security concerns. The stakeholders responded that this scenario is not just a matter of security, although that is always a challenge, but also of what an individual or other entity is allowed to do with the data. These entities can be held liable for giving an unauthorized party access to the data, and for any resulting issues, or for using the data in a way not permitted by the DUA. Ioana believes this raises two distinct issues: one where an entity is not authorized to access the data at all, and a second where an entity is authorized to access the data but is still not allowed to reidentify patients, even if they have the means or technology, given the terms laid out in the DUA. Another stakeholder indicated that she agrees that entities in this kind of scenario should certainly have a DUA. Although the DUA is not meant to treat the researcher as an adversary, it does provide protection for the whole research community in case the data are abused. The rules and regulations
In some cases, you may have citizen scientists that are not even covered under an institution. The DUA can increase the responsibility of citizen scientists in precision medicine that have access to the data but are not covered under an institution. The DUAs can potentially reduce the risk because unregulated use of the data can have consequences. However, despite these protections, the informed consent documents should still clarify when there is some risk a participant’s data could be reidentified. The stakeholder also commented that she believes that individuals who have discovered reidentification algorithms certainly have an obligation to alert public institutions or the research teams before they publish their findings. Daniella asked the stakeholders to consider the significant amount of communication with research participants in precision medicine in reference to consent forms. She asked what level of authorization researchers should ask from participants for consent for any individual use as regulated by a DUA. One stakeholder commented that asking research participants to decipher a complicated DUA and provide opinions on every component may be unfeasible, especially if every participant has a different opinion. Another stakeholder countered that it is still important to maintain consistent communication with participants in PCOR and it is important to actually consult with participants on these issues, otherwise researchers may not be truly engaging participants as they should in PCOR. However, another stakeholder commented that almost every website web application has a terms of service agreement, and generally users accept the default without reading those terms of services. The researchers would actually need to carefully educate research participants on every aspect of the terms of participation. This would require significant resources and effort. 
A stakeholder countered that researchers need to simplify these terms of participation, perhaps offering granular consent options that let participants indicate whether they would like their data used for other studies without further consent, or whether they would like to be contacted monthly or whenever their data are included in new research. Another stakeholder responded that this may still be unfeasible because there is a limit to the granularity with which it is possible to contact participants.

Scenarios under Comment and Revision

The NORC team is actively revising and refining the research data use scenarios presented at previous work group meetings. These scenarios have associated feedback forms on Confluence, where the multi-stakeholder group may continue to provide feedback. The NORC team welcomes this feedback and will discuss and integrate it accordingly. The links to these forms have been added to the slide deck for work group meeting #9, which will be posted on Confluence and attached to the meeting invite.

Next Steps

The agenda and slide deck from this meeting will be posted on Confluence for the multi-stakeholder group to review. The NORC team will revise the scenarios based on our discussion during work group meeting #9 and update the precision medicine scenarios on Confluence. We have three remaining work group meetings next month. The next meeting will be held next week, on Wednesday, April 6, from 1:00-2:00 PM EST. The NORC team will send out the agenda and slides for this meeting next week as usual.
Appendix A

ONC PCOR Research Data Use Scenarios Work Group Meeting #9
March 30, 2016
List of Invitees

Meeting Moderators

Eric Goplerud – NORC at the University of Chicago (NORC)
Katherine Donaldson – NORC at the University of Chicago (NORC)
Gagan Jindal – NORC at the University of Chicago (NORC)
Lara Cartwright-Smith – George Washington University (GWU)
Elizabeth Gray – George Washington University (GWU)
Jane Hyatt Thorpe – George Washington University (GWU)
Daniella Meeker – University of Southern California (USC)
Ioana Singureanu – Eversolve LLC
Helen Caton-Peters – Office of the National Coordinator for Health Information Technology (ONC)
Devi Mehta – Office of the National Coordinator for Health Information Technology (ONC)

Meeting Participants

Alexa Limeres – Centers for Disease Control (CDC)
Anjanette Raber – Oregon Health & Science University (OHSU)
Bob Gellman – Privacy and Information Policy Consultant
Curt Mueller – Health Resources and Services Administration (HRSA)
Dennis McCarty – Oregon Health & Science University (OHSU)
Ed Hammond – Duke University
Eleanor Celeste – White House Policy Analyst
Hunter Hardy – Children's Hospital of Los Angeles
Jeremy Maxwell – Office of the National Coordinator for Health Information Technology (ONC)
Kathryn Marchesini – Office of the National Coordinator for Health Information Technology (ONC)
Loria Pollack – Centers for Disease Control (CDC)
Lucila Ohno-Machado – University of California San Diego
Michelle Murray – Office of the National Coordinator for Health Information Technology (ONC)
Taunton Paine – National Institutes of Health