Issues in Interdisciplinarity 2018-19/Truth in Medicine

Introduction

Effective treatment depends upon an accurate diagnosis, so medicine relies on an understanding of truth at every stage, from initial diagnosis to treatment and prevention. This Wikibooks chapter explores the issues that arise from variations in truth within the medical field, where an estimated 15% of medical cases involve an inaccurate diagnosis. Patients, doctors and Artificial Intelligence (AI) are the sources from which truth must be acquired and used, and an interdisciplinary approach helps trace the origins of these discrepancies. The philosophical dimension of the issue considers the concepts of strong and weak AI, their ethical ramifications, and the problems that could arise from applying AI in medicine, drawing on the work of philosophers such as John Searle to evaluate proposed solutions to misdiagnosis. Finally, the chapter analyses the social and economic impacts of misdiagnosis.

Medical truth

There are many reasons why the truth in medicine is not always uncovered, leading to misdiagnosis, delayed recovery and wasted resources. The initial diagnosis is vital; however, practitioners categorise around 15% of patients as ‘difficult patients’ because of their behaviour.[1]

Patient: A patient’s behaviour, such as rude comments, demands and threats towards the doctor, directly influences the diagnosis of their illness.[2] Although the time doctors take to reach a diagnosis does not vary, diagnostic accuracy for ‘difficult patients’ has been found to drop by around 20%.[3] Difficulty in obtaining truthful information from patients, or a lack of co-operation, may waste time and resources.

Doctor: Doctors rely on sense perception and reasoning when diagnosing patients, but these ways of knowing are prone to human error. In a difficult consultation, the practitioner’s initial diagnosis, which is not necessarily accurate, can become their final diagnosis. The uncomfortable environment is distracting, so later statements made by the patient may not be fully considered and the true illness may never be uncovered.[4] A patient’s behaviour may also trigger emotional responses, in particular anger, which affects judgement because the doctor is no longer using their full capacity to evaluate their findings.[5] In addition, more cognitive capacity is directed towards managing the difficult situation, leaving less for the analytical process of diagnosis.[6]


Illness: Even when patients suffer from the same illness, each medical case is unique because each person’s body, medical history and personal experiences differ. Chronic pain, defined as pain lasting longer than 12 weeks, illustrates this: the only similarity between patients is the duration of the problem, yet each person’s experience is different. The diagnosis of chronic pain depends entirely on the patient’s description of their experience, such as the degree of pain, its location and the type of sensation. Pain is subjective, so the effectiveness of different treatments is personal and no single treatment will cure every case.[7] This lack of a universal truth in treatment makes finding one time-consuming, possibly prolonging the patient’s suffering.

Ethical Ramifications of Artificial Intelligence

To avoid the issues surrounding the misdiagnosis of difficult patients by doctors, artificial intelligence could be used to provide a diagnosis unimpeded by emotion. Tests of this new technology have shown that artificial intelligence outperforms doctors in some respects, since it can test thousands of hypotheses in under a second.[8] However, ethical considerations arise around data protection and the ways in which this information could be abused. Maintaining a large online database for medical AI could create huge vulnerabilities for modern society as hacking becomes increasingly commonplace. There are also worries about governments using such information as a form of covert surveillance.[9]

Strong AI vs Weak AI

John Searle’s distinction between strong AI and weak AI calls into question the nature of AI and whether it can really replace human interaction. Strong AI refers to AI that attempts to fully replicate the actions and thought processes of a human, whereas weak AI is built only as an information-processing machine.[10] One could argue that medical AI needs only to be weak AI, acting as a catalyst for running through the possibilities at the diagnostic stage, as sketched below. Because the development of such a system relies on the capacity of its programmers to encode a capable diagnostic process, the technology is only as strong as the research and programmers behind it. This raises the issue of who is responsible when negative outcomes arise from an AI-informed diagnosis.[11]
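To make the distinction concrete, the following is a minimal sketch, in Python, of what a purely weak-AI diagnostic aid could amount to: a hypothetical rule-based matcher that only ranks candidate conditions by how far they overlap with the symptoms a patient reports. The condition names, symptom lists and scoring rule are illustrative assumptions and do not describe any real clinical system.

```python
# Minimal sketch of a hypothetical "weak AI" diagnostic aid.
# It only ranks candidate conditions by overlap with reported symptoms;
# all data below is illustrative, not clinical knowledge.

from typing import Dict, List, Set, Tuple

# Hypothetical knowledge base mapping conditions to characteristic symptoms.
KNOWLEDGE_BASE: Dict[str, Set[str]] = {
    "condition_a": {"fever", "cough", "fatigue"},
    "condition_b": {"joint_pain", "fatigue", "rash"},
    "condition_c": {"chronic_pain", "fatigue", "sleep_disturbance"},
}


def rank_candidates(reported: Set[str]) -> List[Tuple[str, float]]:
    """Score each condition by the fraction of its symptoms the patient reports.

    This is pure information processing (weak AI): the system does not
    understand the patient, it only narrows the possibilities for the
    clinician to review.
    """
    scores = []
    for condition, symptoms in KNOWLEDGE_BASE.items():
        overlap = len(reported & symptoms) / len(symptoms)
        scores.append((condition, round(overlap, 2)))
    return sorted(scores, key=lambda pair: pair[1], reverse=True)


if __name__ == "__main__":
    patient_symptoms = {"fatigue", "chronic_pain"}
    for condition, score in rank_candidates(patient_symptoms):
        print(f"{condition}: {score}")
```

Even in this toy form, the output is only a ranked shortlist: deciding what to do with it, and who answers for the outcome, remains a human question, which is exactly the responsibility issue raised above.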

Social impact

Misdiagnosis is an under-recognised problem: in developed countries most patients will be misdiagnosed at least once in their lifetime,[12] often with life-threatening consequences. Yet very few organisations are engaged in systematic efforts to reduce the frequency of misdiagnosis.[13]

Those impacted by misdiagnosis are:

Patients: One in ten patients is harmed by the treatment they receive in hospital.[14] Between 40,000 and 80,000 patients die every year because of misdiagnosis.[12] Those most likely to be misdiagnosed are patients in poorer health and the elderly.[15] Misdiagnosis is particularly common in patients suffering from cancer and fractures, notably scaphoid fractures.[16] Patients sometimes have to pay for lifelong care for permanent disabilities. Furthermore, misdiagnosis has repercussions on the patient’s social life, such as the loss of a family member or the loss of the ability to work.

Institutions: On a larger scale, hospitals lose a significant amount of time because patients need to be re-diagnosed, which damages both their reputation and their income.[13] Businesses lose experienced employees, reducing their productivity, and on top of that insurance payouts increase.

Economic impact

The cost of misdiagnosis is increasing faster than any other component of health care expenditure, affecting the budgets of patients, hospitals and public programmes. According to the Institute of Medicine (IOM), 30% of annual health care spending in the United States, around 750 billion US dollars, is wasted on unnecessary services.[12] Similarly, in the United Kingdom, National Health Service (NHS) hospitals lose £197.2 million per year to cases of misdiagnosis,[16] and the average cost of a misdiagnosis comes to $386,849 per claim. Misdiagnosis therefore has a huge financial impact not only on the resources of hospitals and patients but also on public assistance programmes.
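As a rough consistency check, and assuming both numbers attributed to the IOM report are accurate, a 30% share amounting to 750 billion US dollars implies a total annual US health care spend of roughly

\[
\text{total spending} \approx \frac{\$750\ \text{billion}}{0.30} = \$2.5\ \text{trillion per year.}
\]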

Conclusion

As medical knowledge improves, aided by advances in technology, it is likely that many of these problems will in future be solved with the help of AI. Even as AI develops, there remain strong concerns about its safety and about its effectiveness when operating without human oversight. For now, we remain dependent on the expertise of trained physicians. Finally, reducing the lack of truth in medicine depends on efficient communication between patient and doctor and on the progress of research into understanding diseases.

  1. Davies M. Managing challenging interactions with patients. BMJ [Internet]. 2013 [cited 8 December 2018];347:f4673. Available from: https://www.bmj.com/content/347/bmj.f4673
  2. Ovens H. Part I: the difficult patient: medical and legal approaches. Can Fam Physician. [Internet]. 1989;35:1797-802. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2280874/?page=1
  3. Inflammation; 'Difficult' patients increase doctors' misdiagnosis risk regardless of case complexity. NewsRx Health 2016 Apr 03:8. Available from: https://search.proquest.com/docview/1775446165
  4. Mamede S, Van Gog T, Schuit S, Van den Berge K, Van Daele P, Bueving H et al. Why patients’ disruptive behaviours impair diagnostic reasoning: a randomised experiment. BMJ Quality & Safety [Internet]. 2016 [cited 8 December 2018];26(1):13-18. Available from: https://search.proquest.com/docview/1883802763
  5. Lerner J, Tiedens L. Portrait of the angry decision maker: how appraisal tendencies shape anger's influence on cognition. Journal of Behavioral Decision Making [Internet]. 2006 [cited 8 December 2018];19(2):115-137. Available from: https://onlinelibrary.wiley.com/doi/epdf/10.1002/bdm.515
  6. Mamede S, Van Gog T, Schuit S, Van den Berge K, Van Daele P, Bueving H et al. Why patients’ disruptive behaviours impair diagnostic reasoning: a randomised experiment. BMJ Quality & Safety [Internet]. 2016 [cited 8 December 2018];26(1):13-18. Available from: https://search.proquest.com/docview/1883802763
  7. Chronic Pain: Symptoms, Diagnosis, & Treatment | NIH MedlinePlus the Magazine [Internet]. Medlineplus.gov. 2018 [cited 9 December 2018]. Available from: https://medlineplus.gov/magazine/issues/spring11/articles/spring11pg5-6.html
  8. AI for healthcare: balancing efficiency and ethics. Bengaluru, India: Infosys; 2017.
  9. Artificial intelligence (AI) in healthcare and research. London, UK: Nuffield council on Bioethics; 2018.
  10. Bringsjord S, Govindarajulu NS. Artificial Intelligence [Internet]. Plato.stanford.edu. 2018 [cited 7 December 2018]. Available from: https://plato.stanford.edu/archives/fall2018/entries/artificial-intelligence/
  11. Griffiths S. The big ethical questions for artificial intelligence (AI) in healthcare – Nuffield Bioethics [Internet]. Nuffield Bioethics. 2018 [cited 7 December 2018]. Available from: http://nuffieldbioethics.org/news/2018/big-ethical-questions-artificial-intelligence-ai-healthcare
  12. PinnacleCare. The human cost and financial impact of misdiagnosis. White paper [Internet]. 2016. Available from: https://pinnaclecare.com/forms/download/Human-Cost-Financial-Impact-Whitepaper.pdf
  13. Graber ML, Wachter RM, Cassel CK. Bringing diagnosis into the quality and safety equations. JAMA. 2012;308(12):1211–1212. doi:10.1001/2012.jama.11913. Available from: https://jamanetwork.com/journals/jama/fullarticle/1362034
  14. Devlin K, Smith R. One in six NHS patients 'misdiagnosed'. The Telegraph [Internet]. 2009. Available from: https://www.telegraph.co.uk/news/health/news/6216559/One-in-six-NHS-patients-misdiagnosed.html
  15. Carter MW, et al. BMJ [Internet]. 2014 [cited 9 December 2018]. Available from: https://www.ncbi.nlm.nih.gov/pubmed/24871958
  16. Graysons. What are really the top misdiagnosed conditions in NHS hospitals in 2014/15? Graysons.co.uk [Internet]. 2015 [cited 3 December 2018]. Available from: https://www.graysons.co.uk/advice/the-top-misdiagnosed-conditions-in-nhs-hospitals/