The chimera of algorithmic objectivity: difficulties of machine learning in the development of a non-normative notion of health
DOI: https://doi.org/10.12795/IETSCIENTIA.2022.i01.03
Keywords: Machine learning, Health, Objectivity, Normativity
Abstract
This essay explores whether machine learning, a sub-discipline of artificial intelligence, can contribute to a more objective approach to the formulation of concepts and descriptions. Taking as an example the definition of health proposed by Christopher Boorse, the paper discusses and contrasts a series of possibilities and problems that may arise when machine learning is applied to solving some of the difficulties this theory encounters. Based on this analysis, the paper concludes that both supervised and unsupervised machine learning entail elements of normativity and subjectivity that make it unfeasible to develop concepts and descriptions in the neutral and objective manner the theory requires. This does not invalidate machine learning for the evaluative analysis of health; rather, it highlights and makes explicit the subjective elements present in it.
References
Adamson, A. S., & Smith, A. (2018). Machine learning and health care disparities in dermatology. JAMA Dermatology, 154(11), 1247-1248.
Anderson, C. (2008). The end of theory: The data deluge makes the scientific method obsolete. Wired Magazine. Retrieved November 15, 2021, from https://www.wired.com/2008/06/pb-theory/
Beery, T. A. (1995). Gender bias in the diagnosis and treatment of coronary artery disease. Heart & Lung, 24(6), 427-435.
Benjamin, R. (2019). Race after technology: Abolitionist tools for the new Jim Code. Cambridge: Polity.
Blakeley, J. (2020). We built reality: How social science infiltrated culture, politics, and power. Oxford: Oxford University Press.
Boden, M. (2016). AI: Its nature and future. Oxford: Oxford University Press.
Boorse, C. (1975). On the distinction between disease and illness. Philosophy and Public Affairs, 5(1), 49–68.
Boorse, C. (1977). Health as a theoretical concept. Philosophy of Science, 44(4), 542-573.
Boorse, C. (1997). A rebuttal on health. In J. M. Humber & R. F. Almeder (Eds.), What is disease? (pp. 1–143). Totowa: Humana Press.
Boorse, C. (2014). A second rebuttal on health. Journal of Medicine and Philosophy, 39, 683–724.
Bostrom, N. (2014). Superintelligence: Paths, Dangers, Strategies. Oxford: Oxford University Press.
Bowker, G. C., & Star, S. L. (2000). Sorting things out: Classification and its consequences. Cambridge: MIT Press.
Carnap, R. (1998). Der logische Aufbau der Welt (Vol. 514). Hamburg: Felix Meiner Verlag.
Casacuberta, D., & Vallverdú, J. (2014). E-science and the data deluge. Philosophical Psychology, 27(1), 126-140.
Domingos, P. (2021). We must stop militant liberals from politicizing artificial intelligence. The Spectator. Retrieved November 15, 2021, from https://spectator.us/militant-liberals-politicizing-artificial-intelligence/
Ereshefsky, M. (2009). Defining “health” and “disease”. Studies in History and Philosophy of Biological and Biomedical Sciences, 40(3), 221–227.
Esteva, A., Kuprel, B., Novoa, R. A., Ko, J., Swetter, S. M., Blau, H. M., & Thrun, S. (2017). Dermatologist-level classification of skin cancer with deep neural networks. Nature, 542(7639), 115-118.
Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. New York: St. Martin’s Press.
Feyerabend, P. (1993). Against method. London: Verso.
Fink, C., Blum, A., Buhl, T., Mitteldorf, C., Hofmann-Wellenhof, R., Deinlein, T., Stolz, W., Trennheuser, L., Cussigh, C., Deltgen, D., Winkler, J. K., Toberer, F., Enk, A., Rosenberger, A., & Haenssle, H. A. (2020). Diagnostic performance of a deep learning convolutional neural network in the differentiation of combined naevi and melanomas. Journal of the European Academy of Dermatology and Venereology, 34(6), 1355-1361.
Gammelgaard, A. (2000). Evolutionary biology and the concept of disease. Medicine, Health Care and Philosophy, 3, 109-116.
Gomez-Uribe, C. A., & Hunt, N. (2016). The Netflix recommender system: Algorithms, business value, and innovation. ACM Transactions on Management Information Systems, 6(4), 1-19.
Harcourt, B. (2001). Illusion of order: The false promise of broken windows policing. Cambridge: Harvard University Press.
Heaven, W. D. (2020). La IA de plegamiento de proteínas de Google resuelve un histórico desafío de la biología [Google's protein-folding AI solves a historic challenge in biology]. MIT Technology Review. Retrieved January 20, 2021, from https://www.technologyreview.es/s/12935/la-ia-de-plegamiento-de-proteinas-de-google-resuelve-un-historico-desafio-de-la-biologia
Hinton, E. (2016). From the war on poverty to the war on crime. Cambridge: Harvard University Press.
Huntington, A., y Gilmour, J. A. (2005). A life shaped by pain: Women and endometriosis. Journal of Clinical Nursing, 14(9), 1124-1132.
Kingma, E. (2007). What is it to be healthy? Analysis, 67(2), 128–133.
Kingma, E. (2014). Naturalism about health and disease: Adding nuance for progress. The Journal of Medicine and Philosophy: A Forum for Bioethics and Philosophy of Medicine, 39(6), 590-608.
Kosinski, M. (2021). Facial recognition technology can expose political orientation from naturalistic facial images. Scientific Reports, 11(1), 100.
Kovács, J. (1998). The concept of health and disease. Medicine, Health Care and Philosophy, 1, 31-39.
Marcus, G., & Davis, E. (2019). Rebooting AI: Building artificial intelligence we can trust. New York: Vintage.
Millikan, R. (1984). Language, truth, and other biological categories. Cambridge: MIT Press.
Murphy, D. (2015). Concepts of disease and health. In The Stanford Encyclopedia of Philosophy. Retrieved April 15, 2016, from http://plato.stanford.edu/archives/spr2015/entries/health-disease/
Nordenfelt, L. (2007). The concepts of health and illness revisited. Medicine, Health Care and Philosophy, 10, 5-10.
Nordenfelt, L. (2016). A defence of a holistic concept of health. In É. Giroux (Ed.), Naturalism in the philosophy of health: History, philosophy and theory of the life sciences (pp. 209-225). Dordrecht: Springer.
OECD. (2019). The heavy burden of obesity: The economics of prevention. Paris: OECD Publishing.
Radder, H. (2009). Why technologies are inherently normative. In D. Gabbay, P. Thagard, & J. Woods (Eds.), Handbook of the philosophy of science (pp. 887-921). Amsterdam: Elsevier.
Rudin, C. (2019). Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead. Nature Machine Intelligence, 1(5), 206-215.
SEEDO. (n.d.). IMC. Retrieved January 19, 2021, from https://www.seedo.es/index.php/imc
Van Fraassen, B. C. (1980). The scientific image. Oxford: Oxford University Press.
Vorvick, L. (2019). Pulse. U.S. Department of Health and Human Services. Retrieved February 10, 2021, from https://medlineplus.gov/ency/article/003399.htm
Wang, Y., & Kosinski, M. (2018). Deep neural networks are more accurate than humans at detecting sexual orientation from facial images. Journal of Personality and Social Psychology, 114(2), 246-257.
License
Copyright (c) 2022 Ariel Guersenzvaig, David Casacuberta
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Authors who publish in this journal agree to the following terms:
- Authors retain copyright and grant the journal the right of first publication, with the work simultaneously licensed under a Creative Commons license that allows others to share it, provided the author's name and its first publication in IUS ET SCIENTIA are acknowledged.
- Authors may enter into separate, additional non-exclusive distribution agreements for the published version of the work (e.g., depositing it in an institutional repository or publishing it in a monographic volume), provided that its initial publication in this journal is acknowledged.
- Authors are permitted and encouraged to disseminate their work online (e.g., in institutional repositories or on their own websites) before and during the submission process, as this can lead to productive exchanges and increase citation of the published work.