Ethical Issues for the Future of Digital Pathology
Pathology services across the globe are evolving as digital methods transform cancer diagnostics. Growing numbers of hospitals now harness the flexibility of digital histology slides to improve clinical workflows and thus patient care. The widespread use of digital images also opens new opportunities for AI-driven research, raising the possibility of new discoveries and the automation of routine diagnostic tasks. The question that concerns us here is: what are the ethical consequences of these changes? Below we highlight four foundational ethical issues for the future of digital pathology, namely privacy, choice, equity, and trust.
Privacy
How can patient privacy be ensured when histopathology images are shared for AI research?
It is a well-established principle of medical law and ethics that health professionals keep patient information confidential unless they have the patient's explicit consent to share it, or sharing falls under one of several common exceptions to the duty of confidentiality (in the UK, for instance, these include anonymisation of data, the existence of a legal requirement to share, a clear public interest in sharing, and so on). Anonymisation is arguably the most common way images are shared without compromising confidentiality and, fortunately, histopathology images can easily and automatically be stripped of information that might disclose a patient's identity, suggesting minimal risks in this regard.
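To illustrate the kind of automated stripping described above, the following Python sketch removes direct identifiers from a slide's metadata record and replaces the patient ID with a salted one-way hash. The field names are hypothetical and the identifier list is illustrative only; real de-identification pipelines follow a governed, validated specification and use dedicated tools.

```python
import hashlib

# Illustrative list of metadata fields treated as direct identifiers.
# A real de-identification specification would be far more exhaustive.
DIRECT_IDENTIFIERS = {"patient_name", "date_of_birth", "nhs_number", "address"}

def deidentify(record: dict, salt: str) -> dict:
    """Return a copy of a slide metadata record with direct identifiers
    removed and the patient ID replaced by a salted one-way hash."""
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "patient_id" in clean:
        digest = hashlib.sha256((salt + clean["patient_id"]).encode()).hexdigest()
        clean["patient_id"] = digest[:16]  # pseudonym; not reversible without the salt
    return clean

record = {
    "patient_name": "Jane Doe",
    "patient_id": "A12345",
    "date_of_birth": "1960-01-01",
    "stain": "H&E",
    "scanner": "ScannerModelX",  # hypothetical scanner name
}
print(deidentify(record, salt="site-secret"))
```

The salted hash preserves linkage within a dataset (the same patient always maps to the same pseudonym) while preventing trivial reversal, which is one reason linkage across datasets, discussed below, remains the harder problem.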
The situation is more complicated, though not necessarily more problematic, when linking images with other clinical datasets. Recent research on anonymisation practices suggests that big data linkages raise a theoretical possibility of re-identification (1–3). It is perhaps true that, with enough data, a patient's identity could be reconstructed without permission, though it would certainly not be straightforward and would depend heavily on the kind of data being shared, who it is shared with, what other data researchers have access to, and so on. Complete anonymity may not be guaranteed in a theoretical sense, then, but in many ways this is no different from the non-digital information doctors possess, where unwarranted disclosure is likewise always a theoretical possibility. Fortunately, there exist multiple forms of protection, and a wealth of expertise in hospital data extraction teams, to mitigate that possibility: robust de-identification practices following state-of-the-art anonymisation techniques; laws and professional duties obliging clinicians to the highest standards of confidentiality; data sharing contracts between data controllers and researchers specifying exactly what can be done with data (including data destruction policies); and correctives in the case of breaches, such as, in the UK, reporting to the Information Commissioner's Office and developing a plan of action for limiting the damage. Taken collectively, these layers of protection constitute a framework that, when implemented, helps assure that confidentiality can be maintained.
Choice
Should patients have choice in how and with whom de-identified data is shared?
Many legal and bioethical precedents allow for the sharing of de-identified patient data without explicit opt-in consent from the patient. If data no longer identifies a patient, the argument goes, sharing it poses no harm to them, and so it can be shared freely. This does not mean, however, that patients hold no interests in de-identified data. In our experience, many patients express a desire for some kind of choice in how pathology data flows, even if they cannot be identified from it. This is often due to general social concerns about how such data is used: they may wish not to support, for instance, what they perceive as inappropriate uses of data. The problem, then, is this: given that consent is not required for sharing de-identified data, how much choice should patients be given in the process, and how can that choice be enabled?
Providing some degree of control over data sharing would go a long way towards improving public confidence in, and the acceptability of, big data and AI-driven digital pathology, though there is no clear and easy answer on how to do it, because multiple options are available. At a societal level, it could be achieved by listening to public opinion, as expressed through surveys, focus groups, patient and public involvement groups, or citizens' juries. At the individual level, it could come from providing patients with opportunities to opt out of data sharing. Which is preferable remains an open question, though some combination of both would be ideal.
Equity
Will the benefits of AI-driven digital pathology be shared equally?
The motivation for digitising pathology services is to improve patient care, and patients are justified in wanting reassurance of the centrality of this value. This is especially important given two commonly perceived problem areas that patients voice in relation to medical AI research: the role of commercial involvement on the one hand, and the possibility of algorithmic bias on the other. Both challenge the equity of AI-driven pathology research: in the former case, through concern that private incentives may eclipse or corrode public benefits; in the latter, through concern that systematic digital exclusions may arise, for instance through algorithmic bias. Neither possibility is an inevitable outcome for digital pathology, and much work is going into ensuring that remains the case for medical AI research in general (4–6). It should also be remembered that digital pathology may even be a benefit in this regard, insofar as it can play a role in lessening health inequalities overall by enabling new forms of expertise and distributing them to harder-to-reach areas through remote assessment.
Trust
How do we ensure patients have full confidence in AI-driven digital pathology?
It is clear that the benefits of digital pathology will not be realised if digital diagnosis does not have the public's trust. Trust, however, is not something to be expected, but something to be earned. Hence, the issue facing the future of digital pathology is how to evidence that it is trustworthy. Strengthening commitments to privacy, choice, and equity can all help in that regard, by highlighting that digital pathology is motivated by the right kind of values. In addition, we think there is much to be done in the area of patient and public involvement and engagement, including developing and validating an integrated model which leverages existing approaches: lay representation on data access committees (DACs), patient and public involvement (PPI) groups, and deliberative democratic projects such as citizens' juries, citizen panels, and citizen assemblies, which we collectively call "citizen forums" (7). Recognising the value and benefits of collaborating not just across commercial and academic organisations, but with citizens as well, goes some way towards achieving the maximum benefit for patient care. The effort, leadership, and determination required to do this must not be underestimated, and everyone must play their part in ensuring that, at each step of that journey, authentic involvement of patients and citizens nurtures public support and trust.
Further reading
For more information on each of the ethical challenges of digital pathology, and how some hospital departments are responding to those challenges, you can read our article:
McKay, Francis, Bethany J Williams, Graham Prestwich, Daljeet Bansal, Nina Hallowell, and Darren Treanor. 2022. “The Ethical Challenges of Artificial Intelligence-Driven Digital Pathology.” The Journal of Pathology: Clinical Research 8 (3): 209–16. https://doi.org/10.1002/cjp2.263.
About the presenters
Francis McKay is a medical anthropologist and researcher at the Ethox Centre, working on the ethics of digital pathology and AI-driven health within the Northern Pathology Imaging Cooperative (NPIC). He conducts ethnographic research across West Yorkshire and neighbouring regions on the emergent ethical concerns around the digitalisation of health. From 2019 to 2020 he was a post-doctoral scholar at the Berkeley Center for New Media and a research fellow for the Berggruen Institute's "Transformations of the Human" Project. From 2016 to 2019 he was the Earl S Johnson Instructor in Anthropology for the University of Chicago's Master of Arts Program in the Social Sciences.
Graham Prestwich retired from a career in the medicines industry in 2007 and established his own engineering company. In 2012, he joined the Board of NHS Leeds North Clinical Commissioning Group as Lay Member for Patient and Public Involvement, serving until March 2018. In that role, he established the Patient Assurance Group and chaired the Primary Care Commissioning Committee and the Remuneration and Nominations Committee. He is the Lay Member of the Leeds Area Prescribing Committee and was inspired to develop the "Me and My Medicines" campaign with the involvement of many local people. He is a member of the Board of Healthwatch Leeds and is currently Lead for Patient and Public Involvement at the Yorkshire and Humber Academic Health Science Network. He is the Patient Director at HN Company and a Trustee of Omnis CIC.
Dr. Bethany Williams is a Specialty Doctor at Leeds Teaching Hospitals NHS Trust and the University of Leeds, and the Lead for Training, Validation and PPI at the National Pathology Imaging Co-Operative in the United Kingdom. She has published extensively in the fields of digital pathology patient safety, evidence based digital pathology training and validation and effective digital deployment. Her body of research earned her the Pathological Society’s medal for research impact, and she is regularly invited to speak at international conferences as an authority on the digital pathology evidence base, practical deployment and patient safety.
Dr. Treanor is a Consultant Histopathologist at Leeds Teaching Hospitals NHS Trust and Honorary Clinical Associate Professor at the University of Leeds, UK. He is also Guest Professor in Digital Pathology at Linköping University, Sweden, where he works with the university and hospital teams on digital pathology research in a well-established project where 100% of clinical slides are scanned.
References
- Lubarsky B. Re-Identification of “Anonymized Data.” 1 GEO TECH REV 202 [Internet]. 2017 [cited 2020 Sep 25]; Available from: https://perma.cc/86RR-JUFT
- Sweeney L. Discrimination in Online Ad Delivery. arXiv:1301.6822 [cs] [Internet]. 2013 Jan 28 [cited 2019 Sep 12]; Available from: http://arxiv.org/abs/1301.6822
- Information Commissioner’s Office. Anonymisation: managing data protection risk code of practice [Internet]. Cheshire: Information Commissioner’s Office; 2012. Available from: https://ico.org.uk/media/1061/anonymisation-code.pdf
- Cole CL, Sengupta S, Rossetti S, Vawdrey DK, Halaas M, Maddox TM, et al. Ten principles for data sharing and commercialization. J Am Med Inform Assoc. 2020;00(0):4.
- Simm K. Benefit Sharing: From Compensation to Collaboration. In: Laurie G, Dove E, Ganguli-Mitra A, McMillan C, Postan E, Sethi N, et al., editors. The Cambridge Handbook of Health Research Regulation. Cambridge University Press; 2021. p. 148–57.
- STANDING Together Working Group. STANDING Together [Internet]. 2022 [cited 2022 Apr 5]. Available from: https://www.datadiversity.org/
- McKay F, Williams BJ, Prestwich G, et al. Public governance of medical artificial intelligence research in the UK: an integrated multiscale model. Res Involv Engagem. 2022;8:21. https://doi.org/10.1186/s40900-022-00357-7; Available from: https://rdcu.be/cOMbj
Leica Biosystems Knowledge Pathway content is subject to the Leica Biosystems website terms of use, available at: Legal Notice. The content, including webinars, training presentations and related materials, is intended to provide general information regarding particular subjects of interest to health care professionals and is not intended to be, and should not be construed as, medical, regulatory or legal advice. The views and opinions expressed in any third-party content reflect the personal views and opinions of the speaker(s)/author(s) and do not necessarily represent or reflect the views or opinions of Leica Biosystems, its employees or agents. Any links contained in the content which provide access to third-party resources or content are provided for convenience only.
For the use of any product, the applicable product documentation, including information guides, inserts and operation manuals should be consulted.
Copyright © 2024 Leica Biosystems division of Leica Microsystems, Inc. and its Leica Biosystems affiliates. All rights reserved. LEICA and the Leica Logo are registered trademarks of Leica Microsystems IR GmbH.