Nurse practitioners (NPs) perform assessments, make diagnoses, provide treatment, and manage illnesses (American Association of Nurse Practitioners, 2015). These diagnoses are different from those made by nurses who are not NPs. The diagnostic role is comparable to that of physicians and is central to the practice of both professions. In many cases, a patient's life depends on a correct and timely diagnosis. But how do NPs and physicians come to the right diagnosis?
Diagnostic reasoning is used both colloquially among providers and in the literature to describe the process by which providers arrive at a correct diagnosis, but articles discussing the topic rarely assign the term a specific definition. Much has been written about the diagnostic reasoning of physicians, but there is a dearth of literature on the diagnostic reasoning of NPs. The aim of this article is to review the concept of diagnostic reasoning, specifically that of physicians and NPs, and to discuss threats to accurate diagnosis. This article presents the method of literature review, reviews definitions and use, and discusses strengths and gaps.
Literature Review Method
A series of database searches was performed without date limitations to avoid exclusion of potentially salient sentinel literature. PubMed®, Web of Science®, and PsycINFO® were queried with the search criteria diagnostic reasoning and filters to exclude editorials and non-English language articles and to include peer-reviewed literature within the health care sciences. Duplicates were removed and abstracts were screened for relevance (Figure).
Figure. Literature review flow diagram.
Attempts to narrow further by adding the search criteria nurse practitioner yielded a prohibitively small sample size (n = 8). Given the practical similarities between NP and physician diagnosing, the decision was made to include articles that discussed the process of coming to a medical diagnosis by NPs, NP students, physicians, physician assistants, and medical students. The diagnostic reasoning of colleagues in closely related fields that do not perform medical diagnosing functions was excluded. As such, articles discussing orthodontics, occupational therapy, and nursing (nonadvanced practice) were not considered. The remaining articles were imported into EndNote X7 ( http://endnote.com) reference management software, and the duplicates were removed. Dimensional analysis, which has roots in grounded theory, is a method of selecting texts using theoretical choices to inform the researcher on the complexity and contextuality of a concept (Caron & Bowers, 2000; Kools, McCarthy, Durham, & Robrecht, 1996). This method of sampling was conducted from the 169 articles, seeking new conceptual insight into diagnostic reasoning. Conceptual saturation was reached when a critical mass of explanatory dimensions was obtained, at which point data collection was closed, resulting in final inclusion of 26 articles (Kools et al., 1996). Of these, seven were original research, four were case studies, three were thought pieces, and eight were reviews.
Definitions and Application of the Concept
Although the term diagnostic reasoning appears throughout this large body of literature, it is rarely given an explicit definition, leaving the reader to colloquial notions of its meaning. A notable exception is the definition by Appel, Wadas, Talley, and Williams (2013): “Diagnostic reasoning is the complex cognitive process used by clinicians from many health care disciplines to ascertain a correct diagnosis and therefore prescribe appropriate treatment for patients” (p. 125).
Several common themes emerged: Cognitive Biases and Debiasing Strategies, the Dual Process Theory, Diagnostic Error, and Patient Harm. Bias is systematic error, and in this context cognitive biases can be defined as faulty beliefs that affect decision making and can occur as the result of using heuristics in the diagnostic process (Croskerry, Singhal, & Mamede, 2013a; Zwaan, Thijs, Wagner, & Timmermans, 2013). There are more than 30 types of cognitive bias (Monteiro & Norman, 2013), but six are noted specifically in the literature included in this article:
- Premature closure (Brosinski, 2014; Elstein, 2009; Ilgen, Eva, & Regehr, 2016; Mamede, van Gog, van den Berge, van Saase, & Schmidt, 2014; Monteiro & Norman, 2013; Pirret, Neville, & La Grow, 2015).
- Search satisficing (Sherbino, Kulasegaram, Howey, & Norman, 2014).
- Availability (Brosinski, 2014; Mamede et al., 2014; Sherbino et al., 2014).
- Anchoring (Brosinski, 2014; Ilgen et al., 2016).
- Base rate neglect (Thammasitboon & Cutrer, 2013).
- Diagnosis momentum (Thammasitboon & Cutrer, 2013; Thammasitboon, Thammasitboon, & Singhal, 2013).
In addition to these specific subtypes of cognitive bias, bias affecting diagnostic reasoning accuracy in general is described in three additional articles (Lambe, O'Reilly, Kelly, & Curristan, 2016; Shimizu, Matsumoto, & Tokuda, 2013; Zwaan et al., 2013).
Cognitive Biases Moderate Diagnostic Reasoning's Effect on Timely and Correct Diagnosis
Premature closure is an automatic process that occurs when the provider closes the diagnostic reasoning process without fully considering all the salient cues (Mamede et al., 2014). Mamede et al. (2014) studied internal medicine residents' (n = 72) diagnostic reasoning with a within-subjects design through the use of case studies in which the authors included salient distracting features. Several versions of the same cases were presented with the salient distracting feature early or late in the case. They found that the distracting cue, when placed earlier in the case study, produced more diagnostic errors, purportedly because the resident physicians clung to the early distractor and ignored subsequent data suggesting an alternate diagnosis.
Brosinski (2014) presented a hypothetical case study in which a patient presents with hemiparesis suggesting acute ischemic stroke. Given that stroke is a clinical diagnosis and an emergency, with guidelines suggesting a course of treatment starting within minutes of arrival, providers may be prone to premature closure. As the case evolves, a history of epilepsy is elicited, which suggests Todd's paralysis, a transient postictal hemiplegia requiring no treatment. If the provider encountering this patient prematurely closes the diagnostic reasoning process, he or she may expose the patient to unnecessary risk by incorrectly prescribing thrombolytics.
Search satisficing is a subtype of premature closure in which searches for further evidence are terminated after a diagnosis is reached (Sherbino et al., 2014). Sherbino et al. (2014) studied two groups of medical students during an emergency medicine rotation: an intervention group that received didactic content on cognitive biases, including search satisficing (n = 145), and a control group that did not (n = 46). All students were given cases in a computerized format, some of which contained a second diagnosis. Only half of the students in each group initiated a search for a secondary diagnosis. Although the intervention was found to be unhelpful, the study demonstrated the ubiquity of search satisficing among medical students.
Availability bias causes the provider to overestimate the probability of a diagnosis following recent exposure to that diagnosis (Mamede et al., 2014). Thirty-six internal medicine residents (18 first-year and 18 second-year) were given case studies in three phases, with the first phase designed to provide exposure to several diagnoses (Mamede et al., 2010). Diagnostic accuracy scores were measured at each phase. Similar cases in subsequent phases were designed to induce availability bias based on the priming in phase one. The second-year internal medicine residents made significantly more errors in the second phase, misdiagnosing by choosing a diagnosis encountered in phase one. In the third phase, the residents were given a template to guide their thinking, writing down findings that supported or refuted their diagnosis, which resulted in higher accuracy scores.
Anchoring is another subtype of premature closure in which providers commit to a particular diagnosis, minimizing information that does not support the diagnosis to which they have attached their proverbial anchor (Ilgen et al., 2016). For example, in the above case of hemiparesis (Brosinski, 2014), a provider may anchor to the diagnosis of acute ischemic stroke even after a history of epilepsy is elicited.
Base Rate Neglect
Base rate neglect is a bias in the way providers estimate the probability of a diagnosis. This form of bias occurs when two independent probabilities are erroneously combined, ignoring the base rate and leading to under- or overestimation of the probability of a diagnosis (Thompson, 2003). Thammasitboon and Cutrer (2013) presented the case of a 12-day-old infant with vomiting and bloody stool. The infant had an abdominal radiograph consistent with bowel intussusception, a diagnosis that is uncommon in children younger than 3 months of age but is the most common cause of bowel obstruction in children 6 to 36 months of age. The infant was ultimately found to have bowel malrotation, a condition more likely to present during the first weeks of life. The correct diagnosis was delayed because providers were affected by base rate neglect, failing to account for the incidence of the disease in the subgroup of the population at hand.
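Base rate neglect can be made concrete with a back-of-the-envelope Bayes calculation. The sketch below uses entirely hypothetical numbers (the priors and test characteristics are invented for illustration, not drawn from clinical data) to show how the same imaging finding can support a diagnosis strongly in one age group and only weakly in another once the base rate is taken into account:

```python
# Hypothetical illustration of base rate neglect using Bayes' rule.
# All probabilities below are invented for demonstration, not clinical data.

def posterior(prior, sensitivity, false_positive_rate):
    """P(disease | positive finding) via Bayes' rule."""
    true_pos = prior * sensitivity
    false_pos = (1 - prior) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# Assume a radiographic finding suggests intussusception with the same
# test characteristics in both age groups (hypothetical values):
sensitivity = 0.90
false_positive_rate = 0.10

# The base rate of intussusception differs sharply by age (invented figures):
prior_neonate = 0.01   # rare before 3 months of age
prior_toddler = 0.30   # common cause of obstruction at 6 to 36 months

p_neonate = posterior(prior_neonate, sensitivity, false_positive_rate)
p_toddler = posterior(prior_toddler, sensitivity, false_positive_rate)

print(f"P(intussusception | finding), neonate: {p_neonate:.2f}")  # about 0.08
print(f"P(intussusception | finding), toddler: {p_toddler:.2f}")  # about 0.79
```

Under these assumed numbers, the identical finding that makes intussusception likely in a toddler leaves it improbable in a neonate; a provider who ignores the age-specific base rate, as in the case above, weighs the finding the same in both groups.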
Diagnosis momentum is a subtype of anchoring and premature closure in which the power of suggestion by colleagues is taken at face value. Thammasitboon and Cutrer (2013) defined diagnosis momentum as “the tendency for an opinion or working diagnosis to become almost certain when it is passed from person to person, thereby suppressing further evaluation.” In their case of the infant with bowel malrotation, the initial working diagnosis of intussusception was passed from providers at the initial hospital to the tertiary center and then among providers at the tertiary center. This combination of biases contributed to delayed diagnosis.
Debiasing Strategies May Mediate Cognitive Bias Effect on Timely and Accurate Diagnosis
After enumerating a number of cognitive biases that contribute to diagnostic delays and diagnostic errors, one may wonder what remedies are available. No consensus exists on the effectiveness of debiasing strategies. Croskerry, Singhal, and Mamede (2013a) advocated for a view in which clinicians can change thinking patterns through awareness of bias and feedback. A staged process for adoption of debiasing strategies is presented, beginning with awareness, through making changes, and ending with maintenance (Croskerry, Singhal, & Mamede, 2013b). They go on to provide several cognitive forcing functions, such as training on theories of reasoning and medical decision making, bias inoculation, simulation training, computerized cognitive tutoring, metacognition, slow-down strategies, group decision strategy, and clinical decision support systems to force diagnostic reasoning out of bias-prone thought processes into a more analytic process.
In a large narrative review of cognitive interventions to reduce diagnostic error (42 studies included) (Graber et al., 2012), the authors provided a counterpoint to Croskerry et al. (2013a, 2013b), noting that several authors expressed uncertainty about the efficacy of debiasing given the difficulty of changing subconscious processes. For example, the Sherbino et al. (2014) study of medical students in the emergency department found that providing education on cognitive forcing strategies did not improve diagnostic accuracy.
Shimizu et al. (2013) studied the effects of a differential diagnosis checklist and a debiasing checklist on Japanese medical students' performance on a series of case studies. In this nonrandomized study, one group was asked to write three differential diagnoses before and after being given a differential diagnosis checklist and a debiasing checklist, whereas the control group was instructed to make intuitive diagnoses without access to these checklists. The differential diagnosis checklist provided more than 30 possible diagnoses for the particular chief complaint, with frequently missed diagnoses and “must not miss” diagnoses highlighted. They found that the differential diagnosis checklist improved diagnostic accuracy, particularly in the difficult cases. The debiasing checklist prompted the students to consider several patterns of flawed thinking but was not found to reduce diagnostic error in this study.
Croskerry (2009) presented a universal model of diagnostic reasoning that adapts the dual process of reasoning to medicine. The theoretical basis for the dual process framework is a default-interventionist model (Monteiro & Norman, 2013). The dual process framework consists of two discrete cognitive systems, labeled system 1 and system 2 (Croskerry, 2009). Decisions made with system 1 are intuitive, are based on pattern recognition, experience, and heuristics, and are largely unconscious. In this model, system 2 is engaged when signs and symptoms do not fit into a typical illness script or are unfamiliar to the clinician. System 2 uses deliberate, purposeful, conscious thinking. In Croskerry's model, these systems can interact with each other, primarily through system 2 providing a rational override to system 1. As most medical conditions present typically and are easily recognized, perhaps most diagnoses are reached intuitively (Graber et al., 2012).
Dual process framework is the dominant model (Monteiro & Norman, 2013) and was referenced in many of the articles in the literature review (Brosinski, 2014; Croskerry, 2009; Durham, Fowler, & Kennedy, 2014; Gehlhar, Klimke-Jung, Stosch, & Fischer, 2014; Kellogg, Coute, & Garra, 2015; Lambe et al., 2016; Mamede et al., 2014; Monteiro et al., 2015; Monteiro & Norman, 2013; Pirret et al., 2015; Shimizu et al., 2013; Stolper et al., 2013; Thammasitboon & Cutrer, 2013; Wiswell, Tsao, Bellolio, Hess, & Cabrera, 2013). Dual process framework, although dominant, is not without critique. Monteiro and Norman (2013) explicated an assumption within dual process that systems 1 and 2 can be consciously decoupled. They further noted that many of the cognitive biases typically assigned to system 1 actually involve reasoning and seeking out more information, processes that fit into system 2. Many of the interventions that attempted to improve diagnostic reasoning revolve around shifting the thought process from system 1 to system 2, which may explain the many negative trials. Monteiro and Norman (2013) contended that dual process addresses only the cognitive processing side of errors in diagnostic reasoning and proposed a second set of processes, categorization and recognition, to address aspects of memory or recall.
Diagnostic Error, Accuracy of Diagnosis, and Harm
Diagnostic error “is considered to have occurred if the diagnosis is incorrect or does not fully address the patient's problems regardless of any occurrence of an adverse event (or patient harm)” (Thammasitboon et al., 2013, p. 228). Thammasitboon et al. (2013) proposed a framework that asserts several important points regarding diagnostic error: (a) harm may or may not have occurred, (b) the error may or may not have been preventable, (c) the clinician may or may not be at fault, and (d) the error, especially if no harm occurred, may go unnoticed. Pirret et al. (2015) used a comparative design with purposeful sampling to compare NP (n = 30) and physician (n = 16) diagnostic reasoning through a think-aloud protocol applied to a complex case scenario, finding no statistically significant difference in diagnostic accuracy between the professional groups. Although they used the terms diagnostic error and diagnostic accuracy extensively in their literature review and discussion, they left the definitions implicit. As described above, a differential diagnosis checklist may improve diagnostic accuracy (Shimizu et al., 2013). A systematic review of interventions to improve diagnostic accuracy found mixed results and expressed concern that findings may be artifactual because most studies used inexperienced clinicians (Lambe et al., 2016). A retrospective chart review evaluated the etiology of diagnostic error in 247 patients presenting with dyspnea (Zwaan et al., 2013). In 163 (45%) of the identified cases, inappropriately selective information gathering was implicated. Through follow-up interviews with treating physicians, they found that inappropriate selectivity in information processing and gathering led to patient harm, both from delayed or incorrect diagnosis and from contextually inappropriate testing, although harm occurred in only a few of the reviewed cases.
Strengths and Gaps in Definition and Use
The term diagnostic reasoning is commonly used among clinicians and seems to have a well-accepted colloquial meaning. However, the most striking gap is the lack of any explicit definition of diagnostic reasoning in the vast majority of the studies. The inferential definitions lead to a lack of clarity. Ilgen et al. (2016) framed this issue when they provided two complementary aspects of diagnosis: “1) the correct solution resulting from a diagnostic reasoning process, and 2) a dynamic aid to an ongoing clinical reasoning process” (p. 435). In addition to the lack of explicit definitions, a search of PubMed and Web of Science with the search criteria diagnostic reasoning AND concept analysis yielded only one concept analysis—a concept analysis of the clinical judgment of nurses (Simmons, 2010).
NPs diagnose patients as physicians do; however, although much has been written in graduate and postgraduate medical education, there is a paucity of literature on the diagnostic reasoning of NPs. The notable exception is Pirret et al.'s (2015) comparison of diagnostic reasoning styles between NPs and physicians. Chiffi and Zanotti (2015) contrasted medical and nursing diagnosis but did not mention where NPs fit into this dichotomy. Much of the remaining literature addresses prelicensure and postgraduate medical education.
The dual process theory provides a compelling framework to understand the thought processes involved in diagnostic reasoning. Although skeptics question the ability of people to deliberately and consciously separate intuitive from analytic thought, an awareness of unconscious tendencies toward nonanalytic thought may be useful for clinicians to check the assumptions in their differential diagnosis.
Debiasing is the logical solution to cognitive bias affecting diagnostic reasoning. Croskerry et al. (2013a, 2013b) proposed a staged approach of cognitive bias mitigation, beginning with awareness and ultimately completing the metacognitive exercise of forcing oneself to consider necessary alternative diagnoses. Different types of bias may require different solutions, and some may be more resistant to change than others (Croskerry et al., 2013b). Clinicians should take the initiative to self-monitor for situations in which more analytic thought is necessary. This ability should ultimately come from repeated purposeful practice, but the role transition to NP practice seems the ideal time to begin the first stage of the approach.
There is no consensus as to the etiology of cognitive bias; proposed sources include the use of heuristics and intuitive decision making. Many other inputs into the accuracy of clinical decision making have also been proposed, such as team factors, resource limitations, personality, gender, and sleep deprivation, among others (Croskerry et al., 2013a). For example, an NP in a hectic situation with multiple high-acuity patients competing for attention will initially need to rely on heuristics and intuition to begin a diagnostic evaluation and make a tentative diagnosis and treatment plan. Once each patient is stabilized, the NP can return to each case and apply uninterrupted analytic thought. Returning to the patient after a quick initial decision has been made, in an effort to apply an additional thoughtful approach, is a practical way to avoid premature closure and diagnostic error.
Conclusions and Implications for Education and Practice
Diagnostic reasoning has been practiced in medicine since the profession's inception, but the definition of diagnostic reasoning has remained largely implicit. Although most physicians and NPs generally understand what is meant by the term, a rich and scholarly definition is not included in most publications. Threats to accurate and timely diagnosis, such as cognitive bias, were reviewed. Although there is no consensus on the efficacy of debiasing strategies, they are logically intuitive and may be useful. The dual process theory's conceptual division of unconscious and intuitive versus conscious and logical decision making may be of use to NP educators and preceptors who guide their students into a new way of thinking about patients. Much of the literature cited above used simulation of varying levels of fidelity. Simulation may be an effective tool for mentoring students through complex diagnostic reasoning and a pragmatic laboratory for researchers interested in this topic. Facilitating this cognitive transition is important for professional development and patient safety.
- American Association of Nurse Practitioners. (2015). Scope of practice for nurse practitioners. Retrieved from https://www.aanp.org/images/documents/publications/scopeofpractice.pdf
- Appel, S.J., Wadas, T.M., Talley, M.H. & Williams, A.M. (2013). Teaching diagnostic reasoning: Transitioning from a live to a distance accessible online classroom in an adult acute care nurse practitioner program. Journal of Nursing Education and Practice, 3(12). doi:10.5430/jnep.v3n12p125 [CrossRef]
- Brosinski, C.M. (2014). Implementing diagnostic reasoning to differentiate Todd's paralysis from acute ischemic stroke. Advanced Emergency Nursing Journal, 36, 78–86. doi:10.1097/TME.0000000000000007 [CrossRef]
- Caron, C.D. & Bowers, B.J. (2000). Methods and application of dimensional analysis: A contribution to concept and knowledge development in nursing. In Rodgers, B.L. & Knafl, K.A. (Eds.), Concept development in nursing: Foundations, techniques, and applications (2nd ed., pp. 285–319). Philadelphia, PA: W.B. Saunders.
- Chiffi, D. & Zanotti, R. (2015). Medical and nursing diagnoses: A critical comparison. Journal of Evaluation in Clinical Practice, 21, 1–6. doi:10.1111/jep.12146 [CrossRef]
- Croskerry, P. (2009). A universal model of diagnostic reasoning. Academic Medicine, 84, 1022–1028. doi:10.1097/ACM.0b013e3181ace703 [CrossRef]
- Croskerry, P., Singhal, G. & Mamede, S. (2013a). Cognitive debiasing 1: Origins of bias and theory of debiasing. BMJ Quality & Safety, 22(Suppl. 2), ii58–ii64. doi:10.1136/bmjqs-2012-001712 [CrossRef]
- Croskerry, P., Singhal, G. & Mamede, S. (2013b). Cognitive debiasing 2: Impediments to and strategies for change. BMJ Quality & Safety, 22(Suppl. 2), ii65–ii72. doi:10.1136/bmjqs-2012-001713 [CrossRef]
- Durham, C.O., Fowler, T. & Kennedy, S. (2014). Teaching dual-process diagnostic reasoning to doctor of nursing practice students: Problem-based learning and the illness script. Journal of Nursing Education, 53, 646–650. doi:10.3928/01484834-20141023-05 [CrossRef]
- Elstein, A.S. (2009). Thinking about diagnostic thinking: A 30-year perspective. Advances in Health Sciences Education, 14(Suppl. 1), 7–18. doi:10.1007/s10459-009-9184-0 [CrossRef]
- Gehlhar, K., Klimke-Jung, K., Stosch, C. & Fischer, M.R. (2014). Do different medical curricula influence self-assessed clinical thinking of students? GMS Zeitschrift für Medizinische Ausbildung, 31(2), Document 23. doi:10.3205/zma000915 [CrossRef]
- Graber, M.L., Kissam, S., Payne, V.L., Meyer, A., Sorensen, A., Lenfestey, N. & Singh, H. (2012). Cognitive interventions to reduce diagnostic error: A narrative review. BMJ Quality & Safety, 21, 535–557. doi:10.1136/bmjqs-2011-000149 [CrossRef]
- Ilgen, J.S., Eva, K.W. & Regehr, G. (2016). What's in a label? Is diagnosis the start or the end of clinical reasoning? Journal of General Internal Medicine, 31, 435–437. doi:10.1007/s11606-016-3592-7 [CrossRef]
- Kellogg, A.R., Coute, R.A. & Garra, G. (2015). Diagnostic reasoning for ST-Segment Elevation Myocardial Infarction (STEMI) interpretation is preserved despite fatigue. Journal of Graduate Medical Education, 7, 27–31. doi:10.4300/JGME-D-14-00056.1 [CrossRef]
- Kools, S., McCarthy, M., Durham, R. & Robrecht, L. (1996). Dimensional analysis: Broadening the conception of grounded theory. Qualitative Health Research, 6, 312–330. doi:10.1177/104973239600600302 [CrossRef]
- Lambe, K.A., O'Reilly, G., Kelly, B.D. & Curristan, S. (2016). Dual-process cognitive interventions to enhance diagnostic reasoning: A systematic review. BMJ Quality & Safety, 25, 808–820. doi:10.1136/bmjqs-2015-004417 [CrossRef]
- Mamede, S., van Gog, T., van den Berge, K., Rikers, R., van Saase, J.L., van Guldener, C. & Schmidt, H.G. (2010). Effect of availability bias and reflective reasoning on diagnostic accuracy among internal medicine residents. JAMA, 304, 1198–1203. doi:10.1001/jama.2010.1276 [CrossRef]
- Mamede, S., van Gog, T., van den Berge, K., van Saase, J.L.C.M. & Schmidt, H.G. (2014). Why do doctors make mistakes? A study of the role of salient distracting clinical features. Academic Medicine, 89, 114–120. doi:10.1097/ACM.0000000000000077 [CrossRef]
- Monteiro, S.D., Sherbino, J.D., Ilgen, J.S., Dore, K.L., Wood, T.J., Young, M.E. & Howey, E. (2015). Disrupting diagnostic reasoning: Do interruptions, instructions, and experience affect the diagnostic accuracy and response time of residents and emergency physicians? Academic Medicine, 90, 511–517. doi:10.1097/ACM.0000000000000614 [CrossRef]
- Monteiro, S.M. & Norman, G. (2013). Diagnostic reasoning: Where we've been, where we're going. Teaching and Learning in Medicine, 25(Suppl. 1), S26–S32. doi:10.1080/10401334.2013.842911 [CrossRef]
- Pirret, A.M., Neville, S.J. & La Grow, S.J. (2015). Nurse practitioners versus doctors diagnostic reasoning in a complex case presentation to an acute tertiary hospital: A comparative study. International Journal of Nursing Studies, 52, 716–726. doi:10.1016/j.ijnurstu.2014.08.009 [CrossRef]
- Sherbino, J., Kulasegaram, K., Howey, E. & Norman, G. (2014). Ineffectiveness of cognitive forcing strategies to reduce biases in diagnostic reasoning: A controlled trial. Canadian Journal of Emergency Medicine, 16, 34–40.
- Shimizu, T., Matsumoto, K. & Tokuda, Y. (2013). Effects of the use of differential diagnosis checklist and general de-biasing checklist on diagnostic performance in comparison to intuitive diagnosis. Medical Teacher, 35(6), e1218–e1229. doi:10.3109/0142159X.2012.742493 [CrossRef]
- Stolper, C.F., Van de Wiel, M.W., De Vet, H.C., Rutten, A.L., Van Royen, P., Van Bokhoven, M.A. & Dinant, G.J. (2013). Family physicians' diagnostic gut feelings are measurable: Construct validation of a questionnaire. BMC Family Practice, 14, 1. doi:10.1186/1471-2296-14-1 [CrossRef]
- Thammasitboon, S. & Cutrer, W.B. (2013). Diagnostic decision-making and strategies to improve diagnosis. Current Problems in Pediatric and Adolescent Health Care, 43, 232–241. doi:10.1016/j.cppeds.2013.07.003 [CrossRef]
- Thammasitboon, S., Thammasitboon, S. & Singhal, G. (2013). Diagnosing diagnostic error. Current Problems in Pediatric and Adolescent Health Care, 43, 227–231. doi:10.1016/j.cppeds.2013.07.002 [CrossRef]
- Thompson, C.C. (2003). Clinical experience as evidence in evidence-based practice. Journal of Advanced Nursing, 43, 230–237. doi:10.1046/j.1365-2648.2003.02705.x [CrossRef]
- Wiswell, J., Tsao, K., Bellolio, M.F., Hess, E.P. & Cabrera, D. (2013). “Sick” or “not-sick”: Accuracy of System 1 diagnostic reasoning for the prediction of disposition and acuity in patients presenting to an academic ED. American Journal of Emergency Medicine, 31, 1448–1452. doi:10.1016/j.ajem.2013.07.018 [CrossRef]
- Zwaan, L., Thijs, A., Wagner, C. & Timmermans, D.R.M. (2013). Does inappropriate selectivity in information use relate to diagnostic errors and patient harm? The diagnosis of patients with dyspnea. Social Science and Medicine, 91, 32–38. doi:10.1016/j.socscimed.2013.05.001 [CrossRef]