Using Computational Ethnography to Enhance the Curation of Real-world Data (RWD) for Chronic Pain and Invisible Disability Use Cases
Rhonda J. Moore, PhD, FRSA, US Department of Health and Human Services, Food and Drug Administration, rhondajmoore3@gmail.com
Ross Smith, FRSA, Microsoft Corporation / University College Dublin, Rosss@microsoft.com
Qi Liu, PhD, US Department of Health and Human Services, Food and Drug Administration, Office of Clinical Pharmacology, Qi.Liu@fda.hhs.gov
Abstract
Chronic pain is a significant source of suffering, disability, and societal cost in the US. Although the ability to detect a person’s risk for developing persistent pain is desirable for timely assessment, management, and treatment, and for reduced health care costs, no objective measure of clinical pain intensity exists. Recent Artificial Intelligence (AI) methods have been deployed in clinical decision-making and assessment tools to enhance pain risk detection across core social and clinical domains. Yet risk assessment models are only as “good” as the data they are based on, so ensuring fairness is a critical component of equitable care in both the short and long term. This paper takes an intersectional and public health approach to AI fairness in the context of pain and invisible disability, suggesting that computational ethnography is a multimodal and participatory real-world data (RWD) methodology that can enhance the curation of intersectional knowledge bases, thereby expanding the existing boundaries of AI fairness in terms of inclusiveness and transparency for pain and invisible disability use cases.
Introduction
Chronic pain is a complex biopsychosocial, environmental, and subjective experience expressed as multifaceted and heterogeneous social and clinical phenotypes [1-13]. Global prevalence estimates of chronic pain range from 8% to 60%, reflecting differences in the methodologies used to determine prevalence rates and in the populations studied. Over 100 million Americans continue to suffer from pain [10, 16]. The estimated cost to society is between $500 and $635 billion annually (USD) [1, 10, 16]. The ability to detect a person’s risk for developing persistent pain is clinically desirable for timely assessment, management, and treatment, with the goal of reducing long-term morbidity and health care costs [10-12, 16]. However, at present there is still no objective measure of clinical pain intensity [11-15].
Pain is a common symptom, yet it is often difficult to define, communicate, and sometimes to prove [1-9, 15]. In the US, pain remains an important source of morbidity and of personal and social suffering, and significant disparities exist across populations [10-12, 16]. Pain also means different things to different people and is assessed differently across subpopulations, further contributing to persistent disparities [4-7, 15]. In addition to patient-related barriers, clinician- and system-related barriers have increasingly been shown to hinder the effective management, assessment, and treatment of pain, with implications for poor outcomes and increased health care costs in both national and global care settings [4, 10, 16].
Chronic pain often intersects the lived experience of disability [4-7], overlapping what Trewin (2019) notes as "differences from other protected attributes like race and gender, in terms of extreme diversity and data privacy" [9, 17]. Disability results from the interaction between people with impairments and the institutions, socio-cultural factors, and environmental factors that hinder an individual’s full and effective participation in society [5-9, 17-24]. In addition, many chronic pain patients struggle with invisible disabilities (e.g., fibromyalgia, migraine) that are not readily apparent and may also be hidden from the clinical purview by seemingly generalized, difficult-to-convey, and non-specific symptoms [7-9, 20-24]. Existing data curation methods reinforce these data gaps, drawing lines around clinically verifiable pain and disability experiences while rendering others invisible, marginalized, or outliers [7, 8, 22]. Consequently, patients may suffer for years because their symptoms are misclassified, leading to delayed diagnosis and treatment and to poor pain and disability outcomes [7-9, 21].
AI APPROACHES TO PAIN
Artificial Intelligence methods are increasingly being deployed to assist clinical decision-making, including pain risk detection across core social and clinical domains, particularly in areas such as image recognition, speech processing, language translation, textual analysis, and self-learning [13-14, 23-33]. Broadly, these methods are used to better identify and classify what is potentially “relevant” to a specific pain patient, potentially increasing the probability of an accurate diagnosis, treatment, and prediction of outcomes. A detailed review is beyond the scope of this paper; instead, we direct the reader to some of the relevant research and reviews on this topic [13-14, 23-33]. Lötsch & Ultsch (2018) discuss some of the ways that AI and machine learning (AI-ML) have been used across a variety of scenarios, such as pain phenotype prediction from complex case data, structure detection in complex pain-related data, exploration of data sets by reversing the analytical focus of classifier building, and pattern detection [13]. Other important work describes the use of AI with neuroimaging and autonomic measurements to improve the classification of high and low pain states and potentially to predict pain intensity (e.g., fluctuations), as well as remote monitoring, virtual coaching to increase adherence to treatments and improve clinical outcomes, mental health applications, digital health technologies, and reinforcement learning [23-32].
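As a concrete illustration of the classification work cited above, the sketch below trains a simple high/low pain-state classifier on synthetic multimodal features. It is a minimal sketch only: the feature names (an autonomic measure, an imaging-derived score, a self-reported NRS rating) and the data are hypothetical placeholders, not any specific study's pipeline.

```python
# Minimal, illustrative sketch of a high/low pain-state classifier of the
# kind described above. All feature names and data are hypothetical
# placeholders, not any published study's pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500

# Hypothetical multimodal features: an autonomic signal (heart-rate
# variability), a neuroimaging-derived score, and a self-reported NRS rating.
X = np.column_stack([
    rng.normal(50, 10, n),   # hrv_ms (assumed autonomic measure)
    rng.normal(0, 1, n),     # imaging_score (assumed neuroimaging feature)
    rng.integers(0, 11, n),  # nrs_self_report (0-10 numerical rating scale)
])
# Synthetic label: "high pain state" loosely tied to the NRS column.
y = (X[:, 2] + rng.normal(0, 2, n) > 6).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))
```

In practice, models of this kind are evaluated not only on aggregate accuracy but also on how performance varies across the subpopulations discussed in the next section.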
PAIN AND DISABILITY: A PROBLEM OF FAIRNESS?
Despite all this important work leveraging AI, chronic pain and invisible disability remain a considerable challenge to existing AI frameworks due to different types of bias. Clinical pain assessment still relies mainly on a patient’s ability to communicate a self-assessment of pain (e.g., numerical rating scales (NRS), visual analogue scales (VAS)), behavioral signs, and clinical judgement [1, 3, 6]. Chronic pain is also an experience related to culture, emotion, mind, spirit, and body that everyone experiences differently, and significant patient, clinician, and system barriers persist [2-7]. Moreover, while people with disabilities (PWDs) are four times more likely to report their health as fair or poor than people without disabilities (40.3% vs. 9.9%), there are still significant problems with the ways ‘disability’ has been labelled, defined, and analysed across different surveillance frameworks (e.g., state and national databases) [34-35]. The exact prevalence of chronic pain and invisible disability is also underreported and under-recognized [33-35]. All these issues contribute to different types of bias at different points across the life course of the data, as integral parts of the data inputs and outputs: in judgments about what data are relevant to the problem, in curation decisions about what data to collect or exclude, and in processes of screening, assessment, exclusion, and analysis of patients’ social and clinical data based on the perceived visibility and judged validity of the pain and disability.
Guo et al. (2019) have cautioned that “widely deployed AI systems may not work properly for people with disabilities, or worse, may actively discriminate against them” [18]. As we have indicated, the sources of bias for chronic pain and invisible disability are many, as are the unintended consequences: reinforcement bias in assessment, management, and treatment; perpetuation of outlier bias; oversimplification and erasure of important differences and storied contexts; allostatic bias; and reduced data transparency, all leading to missed opportunities and poor outcomes in care experiences across diverse populations. As an alternative method and tool for understanding and mitigating bias, we suggest that computational ethnography, as a multimodal and participatory real-world data (RWD) methodology, can potentially expand the boundaries of AI fairness in terms of inclusiveness and transparency for pain and invisible disability use cases.
COMPUTATIONAL ETHNOGRAPHY
Ethnography is often described as a qualitative method and approach concerned with learning about people through contextual data immersion (e.g., interviews, observation, participant-observation, case studies, archival research; micro-level data) [33-39]. It is characterized by in-depth observation of individuals and groups, with an awareness of the sociocultural, historical, and political contexts that influence interaction and experience [33-34]. Computational ethnography extends ethnography’s traditional qualitative methodological toolkit by adding computational methods. It continues to leverage the strength of ethnography in understanding the lived experiences of the individuals and groups who give the data meaning, while also placing a participatory lens on how larger systems, including norms, values, and assumptions, are culturally encoded in and reproduced through the design of sociotechnical data-driven systems [36-43]. As a complement to existing AI methods, it can add to comprehensive data collection and analytic processes in the following ways (not an exhaustive list):
- A set of tools that can scale ethnography, improving definitions and understandings of context-specific bias and fairness knowledges (e.g., assisting in the clarification of nuances in calibration, predictive equality, or statistical parity; see the fairness-metric sketch after this list) [24, 41-50].
- Enhance understandings of context-situated bias and fairness knowledges and enhance transparency. For example, provide insight into which fairness features are acceptable to include in AI models, under what conditions, and for what purpose; and address fundamental concerns about internal and external validity, leading to improved understandings of fairness through blindness, fairness through awareness, etc. [24, 41, 45-47].
- Begin to address responsibility and accountability gaps in frameworks that realize risk-based approaches (e.g., understand and potentially address the underpinnings of representational harms, promote inclusive representation, and better characterize risk to high-risk users) [24, 41, 45-48].
- Improve the detection of social bias introduced into training data through inaccurate labeling and classification [17-19, 24, 45-47].
- Enable better outlier detection and improved transparency in AI models, revealing underlying assumptions and affordances [41-45].
- Leverage ethnographic approaches to provide a more detailed and “fair” examination of how AI system design fosters inequality, placing a critical lens on the role of AI in developing certain kinds of identities and communities while suppressing others [17-19, 45-49].
- Improve data labeling transparency and enhance the construction of balanced data sets. This includes co-creative participatory design strategies to clarify representational definitions and to include the perspectives of diverse designers, advocates, and end users, especially those most vulnerable to the unintended consequences of these technologies [19, 41-43].
- Engage, intersectionally, the perspectives of a wide range of diverse stakeholders in the design of sociotechnical products (e.g., via diverse designers, user testing, and Community-Based Participatory Research) [7-9, 41-43, 51-57].
- Continue developing an inclusive social justice AI fairness narrative and a design-for-social-good framework that engage diverse stakeholders who live with chronic pain and invisible disability [39-43, 51-57].
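To ground the fairness definitions named in the first two bullets above, the sketch below computes two standard group-fairness quantities, statistical parity difference and per-group calibration, on synthetic model outputs. The group attribute, decision threshold, and data are illustrative assumptions, not a prescribed pipeline.

```python
# Minimal sketch of two group-fairness checks named above: statistical
# parity and per-group calibration. All data here are synthetic and
# illustrative; the group labels and threshold are assumptions for the demo.
import numpy as np

rng = np.random.default_rng(1)
n = 1000
group = rng.integers(0, 2, n)          # hypothetical binary group attribute
score = rng.uniform(0, 1, n)           # model's predicted risk of chronic pain
label = (rng.uniform(0, 1, n) < score).astype(int)  # synthetic outcomes
pred = (score >= 0.5).astype(int)      # assumed decision threshold

# Statistical parity difference: gap in positive-prediction rates by group.
spd = pred[group == 1].mean() - pred[group == 0].mean()
print(f"statistical parity difference: {spd:+.3f}")

# Calibration by group: within score bins, do predicted risks match
# observed outcome rates for each group?
bins = np.linspace(0, 1, 6)
for g in (0, 1):
    m = group == g
    idx = np.digitize(score[m], bins) - 1
    obs = [label[m][idx == b].mean() for b in range(5) if (idx == b).any()]
    print(f"group {g} observed outcome rate per score bin: {np.round(obs, 2)}")
```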
The inclusion of computational ethnography as a method can also be aligned with ongoing AI fairness and disability research, conversations, and exploratory tools. Together, this work can lead to improvements in chronic pain and invisible disability related outcomes and care for all people.
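As a minimal illustration of how ethnographic analysis can be scaled computationally, the sketch below tallies co-occurrences of analyst-assigned thematic codes across coded excerpts, in the spirit of the array-based representations described in the computational ethnography literature [41, 43]. The codes and excerpts are invented for the example and do not come from a real codebook.

```python
# Minimal sketch of one computational-ethnography step: tallying how often
# pairs of analyst-assigned thematic codes co-occur across coded field-note
# or interview excerpts. Codes and excerpts are invented for illustration.
from collections import Counter
from itertools import combinations

# Hypothetical coded excerpts: each excerpt carries the qualitative codes
# an ethnographer assigned to it (assumed labels, not a real codebook).
coded_excerpts = [
    {"invisible_symptoms", "dismissal_by_clinician", "work_impact"},
    {"invisible_symptoms", "self_advocacy"},
    {"dismissal_by_clinician", "delayed_diagnosis", "invisible_symptoms"},
    {"self_advocacy", "work_impact"},
]

# Count pairwise code co-occurrence; frequent pairs can flag themes (e.g.,
# symptom invisibility co-occurring with dismissal) for deeper reading.
pair_counts = Counter()
for codes in coded_excerpts:
    pair_counts.update(combinations(sorted(codes), 2))

for pair, count in pair_counts.most_common(3):
    print(count, pair)
```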
Summary
AI can amplify both risks and opportunities; looking forward, it is beneficial to leverage the unique opportunities across the pain and disability spectrum to better inform AI policy and design. In this paper, we have described how chronic pain can often overlap the experience of invisible disability. We also briefly discussed broad trends in AI that aim to predict risk for pain and related disability with the goal of improving clinical outcomes in the short and long term. We note that while these efforts are certainly important, they do not fully address the problem of potential bias that impacts the equitable design of AI sociotechnical systems, including the potential unintended consequences for people living with chronic pain and invisible disability. Finally, we briefly highlight how computational ethnography, as an alternative methodology, can potentially enhance inclusiveness and fairness for chronic pain and disability use cases, and is an important addition to the AI fairness and disability toolkits.
ACKNOWLEDGMENTS
We thank Rebecca Racz, PharmD (FDA, CDER), Phaedra Boinodiris (IBM) and Christopher Schoppet (Computer Scientist, USDA) for their kind support of this project.
Disclaimer
The contents of this article reflect the views of the authors and should not be construed to represent the FDA’s views or policies. No official support or endorsement by the FDA is intended or should be inferred.
References
- IASP, 1994. Part III: Pain Terms, A Current List with Definitions and Notes on Usage (pp 209-214). Classification of Chronic Pain, Second Edition, IASP Task Force on Taxonomy, edited by H. Merskey and N. Bogduk, IASP Press, Seattle, 1994. http://www.iasp-pain.org.
- Cheatle MD. Biopsychosocial Approach to Assessing and Managing Patients with Chronic Pain. Med Clin North Am. 2016;100(1):43-53. doi:10.1016/j.mcna.2015.08.007.
- Shanthanna H, Strand NH, Provenzano DA, et al. Caring for patients with pain during the COVID-19 pandemic: consensus recommendations from an international expert panel. Anaesthesia. 2020;75(7):935-944. doi:10.1111/anae.15076.
- Lee P, Le Saux M, Siegel R, et al. Racial and ethnic disparities in the management of acute pain in US emergency departments: Meta-analysis and systematic review. Am J Emerg Med. 2019;37(9):1770-1777. doi:10.1016/j.ajem.2019.06.014.
- Taylor JL, Drazich BF, Roberts L, et al. Pain in low-income older women with disabilities: a qualitative descriptive study [published online ahead of print, 2020 May 31]. J Women Aging. 2020;1-22. doi:10.1080/08952841.2020.1763895
- Kempner J. Invisible people with invisible pain: A commentary on "Even my sister says I'm acting like a crazy to get a check": Race, gender, and moral boundary-work in women's claims of disabling chronic pain. Soc Sci Med. 2017;189:152-154. doi:10.1016/j.socscimed.2017.06.009. Epub 2017 Jun 10.
- Kiesel L. Chronic Pain—the Invisible Disability. Harvard Health Blog. May 2017 Retrieved from: https://www.health.harvard.edu/blog/chronic-pain-the-invisible-disability-2017042811360.
- Morgan P. Invisible Disabilities: Break Down The Barriers. Forbes. 3/20/2020. Retrieved from: https://www.forbes.com/sites/paulamorgan/2020/03/20/invisible-disabilities-break-down-the-barriers/#4ee2be48fa50.
- Trewin S, Basson S, Muller M, Branham S, Treviranus J, Gruen D, Hebert D, Lyckowski N, Manser E. Considerations for AI Fairness for People with Disabilities. AI Matters. 2019;5(3):40-63. ACM.
- Dahlhamer J, Lucas J, Zelaya C, Nahin R, et al. Prevalence of Chronic Pain and High-Impact Chronic Pain Among Adults — United States, 2016. MMWR. 2018;67(36):1001-1006.
- Grimmer-Somers K, Vipond N, Kumar S, Hall G. A review and critique of assessment instruments for patients with persistent pain. J Pain Res. 2009;2:21-47. Published 2009 Mar 11. doi:10.2147/jpr.s4949.
- Dawes TR, Eden-Green B, Rosten C, et al. Objectively measuring pain using facial expression: is the technology finally ready? Pain Manag. 2018;8(2):105-113. doi:10.2217/pmt-2017-0049.
- Lötsch J, Ultsch A. Machine learning in pain research. Pain. 2018;159(4):623-630. doi:10.1097/j.pain.0000000000001118
- Tack C. Artificial intelligence and machine learning | applications in musculoskeletal physiotherapy. Musculoskelet Sci Pract. 2019;39:164-169. doi:10.1016/j.msksp.2018.11.012
- Butow P, Sharpe L. The impact of communication on adherence in pain management. Pain. 2013;154:S101-S107.
- Gaskin DJ, Richard P. The Economic Costs of Pain in the United States. The Journal of Pain. 2012;13(8):715-724. https://doi.org/10.1016/j.jpain.2012.03.009.
- Trewin, S. AI Fairness for People with Disabilities: Point of View. arXiv:1811.10670 [cs.AI]
- Guo A, Kamar E, Vaughan JW, Wallach H, Morris MR. Toward Fairness in AI for People with Disabilities: A Research Roadmap. ASSETS 2019 Workshop on AI Fairness for People with Disabilities, October 2019. ACM.
- Trewin S. How to tackle AI bias for people with disabilities. VentureBeat, 2018. Retrieved from: https://venturebeat.com/2018/12/03/how-to-tackle-ai-bias-for-people-with-disabilities/
- Wylezinski LS, Gray JD, Polk JB, Harmata AJ, Spurlock CF. Illuminating an Invisible Epidemic: A Systemic Review of the Clinical and Economic Benefits of Early Diagnosis and Treatment in Inflammatory Disease and Related Syndromes. J Clin Med. 2019;8(4):493.
- Lillywhite A, Wolbring G. Coverage of ethics within the artificial intelligence and machine learning academic literature: The case of disabled people. Assist Technol. 2019 Apr 17:1-7.
- Connor CW. Artificial Intelligence and Machine Learning in Anesthesiology. Anesthesiology. 2019;131(6):1346-1359. doi:10.1097/ALN.0000000000002694
- Forstenpointner J, Moller P, Sendel M, Reimer M, Hullemann P, Baron R. Stratification of patients with unclassified pain in the FabryScan database. J Pain Res. 2019;12:2223-2230.
- Mitchell S, Potash E, Barocas S, D’Amour A, Lum K. Prediction-Based Decisions and Fairness: A Catalogue of Choices, Assumptions, and Definitions. arXiv:1811.07867, November 2018.
- Rahman QA, Janmohamed T, Pirbaglou M, et al. Defining and Predicting Pain Volatility in Users of the Manage My Pain App: Analysis Using Data Mining and Machine Learning Methods. J Med Internet Res. 2018;20(11):e12001. Published 2018 Nov 15. doi:10.2196/12001
- Tropea P, Schlieter H, Sterpi I, et al. Rehabilitation, the Great Absentee of Virtual Coaching in Medical Care: Scoping Review. J Med Internet Res. 2019;21(10):e12805. Published 2019 Oct 1. doi:10.2196/12805
- Howard A, Borenstein J. The Ugly Truth About Ourselves and Our Robot Creations: The Problem of Bias and Social Inequity. Sci Eng Ethics. 2018;24(5):1521-1536.
- Forsyth AW, Barzilay R, Hughes KS, Lui D, Lorenz KA, Enzinger A, Tulsky JA, Lindvall C. Machine learning methods to extract documentation of breast cancer symptoms from electronic health records. J Pain Symptom Manage. 2018 Jun;55(6):1492–1499. doi: 10.1016/j.jpainsymman.2018.02.016.
- Lötsch J, Geisslinger G, Walter C. Generating knowledge from complex data sets in human experimental pain research. Schmerz. 2019 Sep 2. doi: 10.1007/s00482-019-00412-5. [Epub ahead of print]
- Lötsch J, Alfredsson L, Lampa J. Machine-learning based knowledge discovery in rheumatoid arthritis related registry data to identify predictors of persistent pain. Pain. 2019 Aug 30. doi: 10.1097/j.pain.0000000000001693. [Epub ahead of print]
- Ortiz-Catalan M, Guðmundsdóttir RA, Kristoffersen MB, Zepeda-Echavarria A, Caine-Winterberger K, Kulbacka-Ortiz K, Widehammar C, Eriksson K, Stockselius A, Ragnö C, Pihlar Z, Burger H, Hermansson L. Phantom motor execution facilitated by machine learning and augmented reality as treatment for phantom limb pain: a single group, clinical trial in patients with chronic intractable phantom limb pain. Lancet. 2016 Dec 10;388(10062):2885–2894. doi: 10.1016/S0140-6736(16)31598-7.
- Triantafyllidis AK, Tsanas A. Applications of Machine Learning in Real-Life Digital Health Interventions: Review of the Literature. J Med Internet Res. 2019 Apr; 21(4): e12286.
- Krahn GL, Walker DK, Correa-De-Araujo R. Persons with disabilities as an unrecognized health disparity population. Am J Public Health. 2015;105(S2):S198-S206.
- Altman BM, Bernstein A. Disability and Health in the United States, 2001-2005. Hyattsville, MD: National Center for Health Statistics, 2008.
- Moore RJ (Ed). Handbook of Pain and Palliative Care—Biopsychosocial and Environmental Perspectives for the Life Course. New York, Springer, 2019.
- Gooberman-Hill R. Ethnographies of Pain: Culture, Context and Complexity. Br J Pain. 2015; 9(1): 32–35.
- Atkinson PA, Delamont S, Coffey A, Lofland J and Lofland LH (eds). Handbook of Ethnography. Los Angeles: SAGE, 2007.
- Harper P. Ethnomethodological ethnography and its application in nursing. Journal of Research in Nursing. 2008;13(4): 311–323. https://doi.org/10.1177/1744987108090722.
- Agar M.H. Qualitative Research Method series: Speaking of ethnography. Newbury Park CA.: SAGE Publications, 1984.
- Wallack L. Building a Social Justice Narrative for Public Health. Health Education & Behavior. 2019; 46(6), 901–904.
- Abramson CM, Joslyn J, Rendle KA, Garrett SB, Dohan D. The promises of computational ethnography: Improving transparency, replicability, and validity for realist approaches to ethnographic analysis. Ethnography. 2018;19(2):254-284. https://doi.org/10.1177/1466138117725340.
- Dohan D, Sánchez-Jankowski M. Using computers to analyze ethnographic field data: Theoretical and practical considerations. Annual Review of Sociology. 1998;24:477-498.
- Abramson CM, Dohan D. Beyond text: Using arrays to represent and analyze ethnographic data. Sociological Methodology. 2015;45:272-319.
- Gaudiano C. Inclusion Is Invisible: How To Measure It. Forbes. Apr 23, 2019. Retrieved from: https://www.forbes.com/sites/paologaudiano/2019/04/23/inclusion-is-invisible-how-to-measure-it/#4791072da3d2.
- Patton DU, Frey WR, McGregor KA, McKeown K, Moss E. Contextual analysis of social media: the promise and challenge of eliciting context in social media posts with NLP. AIES '20: Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society. February 2020. Retrieved from: https://safelab.socialwork.columbia.edu/sites/default/files/2020-03/CASM.pdf
- Hutchinson B, Prabhakaran V, Denton E, et al. Social Biases in NLP Models as Barriers for Persons with Disabilities. 2020. ACM. Retrieved from: https://arxiv.org/pdf/2005.00813.pdf
- Hutchinson B, Mitchell M. 50 Years of Test (Un)fairness: Lessons for Machine Learning. Proceedings of FAT* 2019. Retrieved from: https://arxiv.org/abs/1811.10104
- Corbett-Davies S, Pierson E, Feller A, Goel S, Huq A. Algorithmic Decision Making and the Cost of Fairness. Proceedings of KDD '17. Retrieved from: arXiv:1701.08230v4.
- Kleinberg J, Mullainathan S. 2019. Simplicity Creates Inequity: Implications for Fairness, Stereotypes, and Interpretability. In Proceedings of the 2019 ACM Conference on Economics and Computation (EC '19). ACM, New York, NY, USA, 807-808. DOI: https://doi.org/10.1145/3328526.3329621.
- Corbett-Davies S, Goel S. The Measure and Mismeasure of Fairness: A Critical Review of Fair Machine Learning. August 14, 2018. Retrieved from: https://arxiv.org/pdf/1808.00023.pdf.
- Stylianou N. Ethics must be at centre of AI technology, says Lords report. Retrieved from: https://news.sky.com/story/ethics-must-be-at-centre-of-ai-technology-says-lords-report-11333333.
- West S, Whittaker M, Crawford K. Discriminating Systems: Gender, Race and Power in AI. AI Now Institute. 2019. Retrieved from: https://ainowinstitute.org/discriminatingsystems.html.
- Benjamin R. Race After Technology: Abolitionist Tools for the New Jim Code. Polity, 2019. ISBN 9781509526390.
- Benjamin R. Captivating Technology: Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life. Duke University Press, 2019. ISBN 978-1-4780-0381-6.
- Hanna A et al. Towards a Critical Race Methodology in Algorithmic Fairness. Conference on Fairness, Accountability, and Transparency (FAT*), Barcelona, Spain. January 27-30, 2020. Retrieved from: https://arxiv.org/pdf/1912.03593.pdf
- Wolbring G, Lashewicz B. Home care technology through an ability expectation lens. J Med Internet Res. 2014 Jun 20;16(6):e155. doi:10.2196/jmir.3135.
- The United Nations, General Assembly Reports and Resolutions. 73rd Session, 2018-2019. Inclusive development for and with persons with disabilities (73/142). Retrieved from: https://www.un.org/development/desa/disabilities/resources/general-assembly.html.