Hi everyone. This text opens up a new realm of topics compared to what I’ve written about so far. It is still very relevant to, and needed in, physiotherapy, and I also can’t fully separate these topics from philosophy. I will attempt to explain what evidence-based practice is, why it is needed in healthcare, and how its underpinning philosophy can be helpful even in our day-to-day lives.
I wasn’t planning to write this text now. However, there is a greater need for us to examine evidence-based practice than I thought. While writing my previous texts, I also assumed that readers would likely understand why I kept putting references throughout them. Maybe I was assuming more than I should, and a good first step when arguing for anything is making sure that everyone involved in the discussion has the same understanding of the concepts involved.
Looking at my own profession – because part of my personal philosophy is that being critical should start with being critical of ourselves – I’ve recently become more aware that even within a profession that is supposed to base its practice on science, as is expected of medicine and the other health professions, this is still mostly not the case. The underlying reasons for this are complex, but they start at the most basic level: a lack of understanding of what the scientific method even is and why it remains the best way of generating knowledge that we have.
I’ve become aware of this knowledge gap both anecdotally – through seeing what people I know and peers of mine share on social media and how they argue certain concepts (Covid seems to have greatly highlighted the general lack of understanding of science) – and through the scientific literature, which shows that, for the treatment and management of musculoskeletal conditions, close to 50% of physiotherapists do not follow evidence-based guidelines (Zadro, O’Keeffe & Maher, 2019). Close to 50% of people in a modern healthcare profession, often working in hospitals and integrated into national health services, do not follow evidence-based guidelines in their daily practice!

This will be the first part of my introduction to evidence-based practice. But to understand why we need to be concerned with this, we first need to look at what evidence-based practice is.
The term was initially coined in 1992 – as ‘Evidence-Based Medicine’ specifically – by Gordon Guyatt and ‘The Evidence-Based Medicine Working Group’ he was chairing at the time (Guyatt et al, 1992). It was described as a paradigm that “…de-emphasizes intuition, unsystematic clinical experience, pathophysiologic rationale as sufficient grounds for clinical decision making and stresses the examination of evidence from clinical research.”
This epistemological conflict isn’t new: at least since the time of Hippocrates (whom I will be covering in a post in the near future), there has been an ongoing debate between unverified personal clinical experience and rigorous systematic research (Djulbegovic & Guyatt, 2017).
Some of you might by now be asking: “What is the problem with basing my decisions on my personal experience? It has served me well in life so far. You’re just a nerd who wants to feel superior by bringing others down.”
All I ask is for a chance to explain, because to understand why we should do something differently, we first must recognise the problem with the way we currently do it.
Throughout our day-to-day, we have to make a lot of decisions, and some decisions are harder than others. The difficulty goes up when we have to make a decision about someone else’s health, with our job potentially on the line depending on the outcome – a clinical decision. When you work in healthcare, you have to make a lot of these decisions on top of your normal day-to-day ones. So it makes sense that we try to make them as quickly as we can – everyone has a lot to do during their day!
Cognitive scientists have theorised that, in decision making and behavioural choice, our minds run two types of processes simultaneously (Evans, 2008; Phua and Tan, 2013; Houlihan, 2018; Monteiro et al, 2020), first described by Wason and Evans (1974). These processes are often known as Type 1 and Type 2 (Monteiro et al, 2020), and have the following characteristics:
- Type 1 processes do not require working memory, are autonomous, and are often described as unconscious and faster (Evans, 2008; Evans and Stanovich, 2013; Phua and Tan, 2013; Houlihan, 2018; Monteiro et al, 2020).
- Type 2 processes require working memory, involve mental simulation, and are often described as conscious and slower (Evans, 2008; Evans and Stanovich, 2013; Phua and Tan, 2013; Houlihan, 2018; Monteiro et al, 2020).
Both types of processes are very helpful in our everyday personal and professional lives. The literature describes more experienced clinicians as tending to use Type 1 processes for pattern recognition, based on previous experience and intuition, switching to Type 2 processes when they come upon something they haven’t encountered before (Norman, 2009; Phua and Tan, 2013; Monteiro et al, 2020). This is where clinical experience can have value.
However, as we know, none of us is perfect, or as rational and objective as we would like to think, and so all of us commit errors, particularly in a context as complex as healthcare (Phua and Tan, 2013; Richardson, 2014; Saposnik et al, 2016). This is because things like our prior beliefs and our emotions influence both how we perceive external information and our conscious and unconscious reasoning processes (Phua and Tan, 2013; Houlihan, 2018).
In addition, in order to make decisions quicker within our limited ability to process information, we often take mental shortcuts for problem-solving, formally called heuristics (Richardson, 2014; Saposnik et al, 2016; Monteiro et al, 2020). When these mental shortcuts are overused, they lead to errors in reasoning, called cognitive biases (Phua and Tan, 2013). Heuristics and biases were first described by Tversky and Kahneman (1974), who, through a series of studies with psychology undergraduates, demonstrated how often heuristics are used and how they can lead to mistakes. It is important to highlight that both Type 1 and Type 2 processes can lead to errors and bias (Phua and Tan, 2013; Monteiro et al, 2020).

Here are some examples:
- Availability bias: we tend to consider things that are easier to recall – either because they happened recently or because of their impact on us – as occurring more frequently than they actually do; for example, a clinician who has recently seen a rare condition may overestimate how likely it is in the next patient with similar symptoms (Tversky and Kahneman, 1974; Norman, 2009; Monteiro et al, 2020).
- Confirmation bias: we tend, even unconsciously, to seek out and pay more attention to data that confirms our hypothesis, sometimes ignoring opposing data (Tversky and Kahneman, 1974; Norman, 2009; Monteiro et al, 2020).
- Hindsight bias: when we already know the outcome of an event, that knowledge influences our understanding of how the outcome came to be, making things seem connected when they actually weren’t, or making us miss the crucial effect of an event leading to that outcome (Tversky and Kahneman, 1974; Monteiro et al, 2020).
Just to reinforce how fallible and irrational our minds can be, here is an illustration of all the cognitive biases identified to date.
However, the limitations of our minds and cognitive processes don’t end here. It is not simply about the way we think about topics. The knowledge we have, or don’t have, as a basis for our thinking also influences our decisions, with a lack of knowledge also being pointed to in research as a source of errors (Phua and Tan, 2013; Monteiro et al, 2020). This helps explain why neither identifying biases nor applying debiasing strategies has been shown to reduce errors in clinical diagnosis (Monteiro et al, 2020). It is also very difficult to differentiate which errors come from a lack of knowledge and which come from mistakes in our thinking (Norman, 2009).
I’m not going to go into much more detail here, as we have already started touching on psychology, neuroscience, and meta-cognition. I am not an expert in any of these, and an in-depth exploration of them, despite sounding incredibly interesting, is not required to get across the point I want to make in this first part: our perception of the world and of what happens in it is very flawed, and warrants some suspicion when thinking about complex topics – such as clinical practice. And this includes my own perceptions.
Thus, opinion, be it my own, yours, or anyone else’s, unsupported by facts or critically analysed information, is not a trustworthy basis for claims about the world, particularly about health constructs, diagnostic tests, or the benefits of clinical treatments.

Thinking that we can understand the complex world we live in based on just our own experience, absent any systematic criticism, is nothing short of overconfidence in our own knowledge and capabilities. And let me tell you that overconfidence has been highlighted as one of the most common biases, as well as the one leading to the most diagnostic errors in healthcare (Phua and Tan, 2013; Saposnik et al, 2016).
We need something to make us double-check our own conclusions: some type of system that criticises our reasoning, juxtaposes it with opposing reasoning and experiences, and provides us with quality knowledge built from data gathered from the observable world.
In the next part, I will explain that such a system has already been created and continues to evolve in the way it works, while maintaining a core philosophy of trying to reduce errors as much as possible: evidence-based practice.
Thank you for reading and until the next one.
References:
Djulbegovic, B., & Guyatt, G. H. (2017). Progress in evidence-based medicine: A quarter century on. The Lancet, 390(10092), 415–423. https://doi.org/10.1016/S0140-6736(16)31592-6
Evans, J. S. B. T. (2008). Dual-processing accounts of reasoning, judgment, and social cognition. Annual Review of Psychology, 59, 255–278. https://doi.org/10.1146/annurev.psych.59.103006.093629
Evans, J. S. B. T., & Stanovich, K. E. (2013). Dual-Process Theories of Higher Cognition: Advancing the Debate. Perspectives on Psychological Science, 8(3), 223–241. https://doi.org/10.1177/1745691612460685
Guyatt, G., Cairns, J., Churchill, D., Cook, D., Haynes, B., Hirsh, J., Irvine, J., Levine, M., Levine, M., Nishikawa, J., Sackett, D., Brill-Edwards, P., Gerstein, H., Gibson, J., Jaeschke, R., Kerigan, A., Neville, A., Panju, A., Detsky, A., … Tugwell, P. (1992). Evidence-based medicine: A new approach to teaching the practice of medicine. JAMA, 268(17), 2420–2425.
Houlihan, S. (2018). Dual-process models of health-related behaviour and cognition: A review of theory. Public Health, 156, 52–59. https://doi.org/10.1016/j.puhe.2017.11.002
Monteiro, S., Sherbino, J., Sibbald, M., & Norman, G. (2020). Critical thinking, biases and dual processing: The enduring myth of generalisable skills. Medical Education, 54(1), 66–73. https://doi.org/10.1111/medu.13872
Norman, G. (2009). Dual processing and diagnostic errors. Advances in Health Sciences Education, 14(Suppl 1), 37–49.
Phua, D. H., & Tan, N. C. (2013). Cognitive aspect of diagnostic errors. Annals of the Academy of Medicine, Singapore, 42(1), 33–41.
Richardson, L. G. (2014). Awareness of Heuristics in Clinical Decision Making. Clinical Scholars Review, 7(1), 16–23. https://doi.org/10.1891/1939-2095.7.1.16
Sackett, D. L., Rosenberg, W. M. C., Gray, J. A. M., Haynes, R. B., & Richardson, W. S. (1996). Evidence based medicine: What it is and what it isn’t. BMJ, 312, 71–72.
Saposnik, G., Redelmeier, D., Ruff, C. C., & Tobler, P. N. (2016). Cognitive biases associated with medical decisions: a systematic review. BMC Medical Informatics and Decision Making, 16(1), 1–14. https://doi.org/10.1186/s12911-016-0377-1
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.
Wason, P. C., & Evans, J. St. B. T. (1974). Dual processes in reasoning? Cognition, 3(2), 141–154.
Zadro, J., O’Keeffe, M., & Maher, C. (2019). Do physical therapists follow evidence-based guidelines when managing musculoskeletal conditions? Systematic review. BMJ Open, 9(10). https://doi.org/10.1136/bmjopen-2019-032329