Dr. Rankin is Assistant Professor, and Ms. Elena is Instructor, Faculty of Nursing, University of Calgary, Calgary, Alberta; Ms. Malinsky is Assistant Professor, Faculty of Health and Social Development, University of British Columbia Okanagan, Kelowna, and Ms. Tate is Instructor, Bachelor of Science in Nursing Program, North Island College, Courtenay, British Columbia, Canada.
This study was supported in part by Vancouver Island University (formerly Malaspina University-College), Western Region Canadian Association of Schools of Nursing, and North Island College.
The authors have no financial or proprietary interest in the materials presented herein.
The authors thank Marilyn Chapman, Lori Crawford, Ruth Dubois, Donna Malyon Ginther, Diane Jacquest, Mary Lougheed, Mary Anne Moloney, and Coby Tschanz, who are the team of faculty researchers who contributed to the data collection and analysis for this study.
Address correspondence to Janet M. Rankin, PhD, RN, Assistant Professor, Faculty of Nursing, University of Calgary, 2500 University Dr. NW, Calgary, Alberta, Canada T2N 1N4; e-mail: firstname.lastname@example.org.
Making decisions to promote or fail nursing students is often difficult and stressful. This article describes one research team’s experience with an institutional ethnographic study designed to examine what actually happens when nurse educators evaluate nursing students in their practicum experiences. Our research team members are all practicing nurse educators, and our research interest arose from the challenges we face in our efforts to maintain a relational teaching practice while being involved in the work of assessment and evaluation. These frustrating experiences became the entry point for our research. In the field notes from our early discussions, a nurse educator commented:
Somehow grading and evaluating is distracting from my teaching.... I can’t be in the moment, be present with the student while making judgments about her practice at the same time. I don’t always remember to take notes and then at the end of our session I have an overall feeling of “things will be ok” or “I hope things will be ok” or “this could be trouble” but can’t always retrieve specific examples. I can’t think about teaching at the same time as I am gathering evidence. It’s frustrating.
These kinds of contradictions that nurse educators experience in their teaching and evaluation work are not well understood. Using institutional ethnography, we set out to investigate how the evaluation practices of nurse educators are socially organized.
In this article, we describe how we worked to accomplish what Dorothy Smith (2005, p. 4) referred to as an “ontological shift” to position ourselves as institutional ethnography researchers. Institutional ethnography’s emphasis on ontology required us to maintain a constant and tangible connection with the material world of people’s everyday activities. This ontological shift is critical to an institutional ethnography project as it necessitates a specific way to think about social phenomena. Maintaining a “determinedly empirical” (McCoy, 2007, p. 706) stance, institutional ethnography researchers avoid using models and preestablished theoretical explanations common to other qualitative and quantitative projects. Institutional ethnographers do not group data into categories and themes. Nor do they work to identify concepts or formulate any level of theory. Institutional ethnography requires a fundamental adjustment in the way data are reviewed, organized, and analyzed. Our research team learned how to make this adjustment; we learned how to approach our topic in a way that allowed us first to unravel our ordinary everyday activities and then to see how these activities empirically link into institutional practices.
It is important to describe the work we did to make the ontological shift because it enabled us to examine our assumptions and understandings about our teaching and evaluation practices. Our findings provided a different view into the tensions embedded in evaluation as we tracked our data into established practices of due process. Through questioning our taken-for-granted work to establish due process, we could begin to see some hidden contradictions.
Institutional Ethnography: A Method of Inquiry
Institutional ethnography is a method of inquiry developed by Canadian sociologist Dorothy Smith (1987, 1990, 2001, 2005, 2006). Institutional ethnographers think in terms of the social organization of knowledge and understand that the social world is constituted in the activities of people. Smith’s approach is a critique of conventional social inquiry. Her reading of Marx’s materialist theory coalesced with her work as a feminist when she began to recognize:
ruptures between [women’s] everyday experience and the dominant forms of knowledge which, while seemingly neutral and general, concealed a standpoint in particular experiences of gender, race and class.
The institutional ethnography approach is intended to provide analytical descriptions of how things are happening. In particular, it is directed toward examining how experiences that are contradictory and troubling are organized and shaped. The contribution of this methodology is its capacity to expand the knowledge of those who know their work close up.
Relying on the expert knowledge and know-how of nurse educators who function inside the tensions of evaluation work, our focus was on describing and analyzing how that work may be dominated and shaped by forces outside the local purview and purposes of the nurse educators. We identified and tracked clues that were buried in nurse educators’ descriptions of their work. These clues connected to the broad organizational structures governing evaluation work. Although at times we were tempted by our understanding of traditional research methods, we did not apply or develop theory nor gather data to test a theory. We did not use systems of categorizing, counting, or measuring. Our goal was not to generalize about nurse educators’ evaluation work; rather, we wanted to discover the social processes that organize generalizing practices of evaluation across similar settings.
Technical Tools of Institutional Ethnography Research
Institutional ethnographers use terms that support researchers in focusing on material events (DeVeau, 2008). A particular understanding of these terms informs the technical tools that underpin institutional ethnography: the research problematic, the standpoint, social relations, ruling relations, and texts.
To accomplish the tracking of nurse educators’ evaluation work, institutional ethnography researchers formulate a problematic to guide and ground their investigation. Most often, the problematic for inquiry is developed during the preliminary stage of data collection and early analysis when the researcher begins to discover how troubles in the setting are positioned as sequences of action. The problematic arises at moments of disjuncture when something happening locally is at odds with how it is known about officially or ideologically (Smith, 1987). The researcher is required to pay close attention to the taken-for-granted or business-as-usual events that do not align with what those in the setting intend or need to happen. For instance, in our research, we carefully examined nurse educators’ work in education settings and began to notice how some of the routine practices that we thought were accomplishing one thing, such as fairness and transparency in student practice evaluation, were actually accomplishing something contradictory that made the instructor-student relationship difficult.
Taking a specific research standpoint is critical to being able to develop an inquiry that begins in the expert knowledge rooted in people’s activities. A standpoint maintains the institutional ethnographer’s commitment to expand what can be known by those whose troubles are being investigated. The researcher works on behalf of the expert insiders to learn how their practices are organized. In our study, we took the standpoint of nurse educators; we stood on their side of the evaluation troubles to learn from them about their work. The activities and experiences that demanded our attention during data collection were always the substance of what was actually happening, and we endeavored to learn what people knew about how those happenings were organized. The data were not categorically organized for their similarities or themes. Instead, the data were closely examined to determine how activities hook people into a broad social organization and were then traced to the institutional organization of nurse educators’ evaluation work using institutional ethnography’s formulation of social relations.
A social relation is something happening that links individuals together. For this research, the social relations being examined were the practices nurse educators engage in as they produce their everyday evaluation work. Nurse educators’ evaluation decisions and actions do not happen in isolation; they are linked together in complex and purposeful ways. For example, a nurse educator whose work with a student happens in a particular place and time is inextricably linked into the standards and practices that arise within educational, health agency, and professional institutions. The institutional documents and policies that coordinate nurse educators’ evaluation activities, such as the assessment forms and the requirement for feedback to students, are examples of the routine practices that institutional ethnographers refer to as social relations.
Ruling Relations and Texts
The critical interest for an institutional ethnographer is to examine social relations for their capacity to rule. A ruling relation is a practice accomplished in the local setting that inserts institutional or extra-local interests into that local time and place that may not arise from inside the needs and interests of the individuals working there. Smith (2005) said, “Ruling relations direct attention to the distinctive translocal forms of social organization and social relations mediated by texts” (p. 227). For the purpose of institutional ethnography, text is defined as any document that has a “relatively fixed and replicable character” (DeVault & McCoy, 2006, p. 34). Texts include hardcopy organizational forms, reports, protocols, and policies. They also include less formal institutional texts such as memos and, more recently, e-mails or other electronic textual media. To examine the ruling relations that permeate nurse educators’ evaluation work, our research team paid particular attention to the documents being used and produced. This attention to texts and the documentary activities that are conducted in contemporary institutions is a particularly important analytic feature of ruling relations. Texts provide a critical source of data because they are an important material feature of many of the activities that constitute people’s daily courses of action.
Research Team Work: The Ontological Shift
The research team convened around our shared interest in exploring our work of evaluating students. We started with the research question: How does it happen that, within a curriculum that is designed to be emancipatory, transformative, and embedded within caring relationships, teachers describe serious tensions and contradictions arising in their evaluation experiences?
The ethnographic approach directed us to gather detailed descriptive data about the activities that teachers engage in with students, professional colleagues, and administrators. Data collection included gathering field notes during faculty meetings focused on student evaluation, conducting teacher interviews with specific questions about how the evaluation work happened, and writing personal reflections on student evaluation practices. In addition, a variety of forms and documents (texts) were collected as they were identified from the field notes, interviews, and reflections. We attempted to work systematically through each piece of data during the teleconferences and face-to-face meetings of our research team.
More often than not, our attempts to review the data engaged us in circling conversations. We were routinely drawn into explanations of what we already understood to be happening. Our data resonated with our experiences and how we knew our evaluation work from inside its social relations. Reading the data, we identified with educators’ descriptions of their efforts to remain unbiased and impartial in their evaluation work. We understood the challenges educators faced when they were giving students feedback. We knew the importance of assessing and documenting patterns of unsafe practice. We recognized the critical implications of patient safety as it relates to teaching students in practice settings.
It was during these early forays into the data that it became apparent how difficult it was to go beyond our taken-for-granted accounts. Our insider knowledge about how to be impartial, give feedback, and identify patterns of behavior consistently hooked us into our preformed ideas about how we thought evaluation ought to proceed. We had to learn to direct our attention back to the ontological and empirical terrain of actual day-to-day activities; for example, What are the activities of establishing impartiality, giving feedback, and noticing patterns? What is actually happening? We regularly had to remind ourselves to avoid getting pulled into what Smith (2001) referred to as the “blob ontology” (p. 166) of our preformed ideas without letting go of the expert knowledge that came from being positioned inside the institutional practices. We needed to move beyond the common language of sociology that purports to describe an organization but fails to capture what is real for the individuals involved (Smith, 2001).
Our own explanations offered only a partial, and what institutional ethnographers recognize as an ideological, view into how evaluation practices are actually organized. Although our explanations and perceptions about our work may indeed have been relevant, these subjective experiences were not at the center of this research. Rather, we were interested in teachers’ institutional knowledge about what directs evaluation practices. The relevant line of questioning was: How do we know to do this or that? What is this work accomplishing? We developed analytic strategies that held us to the empirical ground of our data, of things actually going on—the social world being put together to happen as it does, through the activities of knowledgeable people. We also developed a process of interrogating the data that led us to formulate a problematic.
We decided to use the data to develop pictorial representations of the institutions within which our evaluation work happened. This was a pragmatic exercise that helped us turn our attention to the empirics of our study. From the data and our own knowledge as educators, we created two pictorial schemata that identified as many as possible of the institutional and regulatory processes that we knew or conjectured had some connection to our routine evaluative work with nursing students.
Figure 1 focuses on the policies and processes of the colleges and universities where we work and relates these to our evaluation practices. The schematic includes semester evaluation requirements, standards for midterm and final evaluation, and the texts and forms used to document student progress. It also incorporates the college or university calendars with their attendant requirements for registration, withdrawal, assessment, promotion, appeals, and so forth. Other regulatory features of the educational institutions include the curriculum and the curriculum guide, principles for ethical teaching practice, and documents related to matters such as human rights and student appeals.
Figure 1. Regulatory Education Regime. Note. FTE = Full-Time Equivalent.
Figure 2 details the professional, nursing, and health care agencies that intersect with nursing education. The schematic makes a connection between the evaluation work of nurse educators and agencies with practice colleagues who mentor students during their practicum and employers who hire the graduates of our program. These colleagues and employers all work within their own institutional realms that are organized by a variety of policies and protocols, such as guidelines for monitoring students in preceptorships and orientation programs for new graduates. The schematic also makes connections between nurse educators and their professional associations. All of these practices and regulations link into nurse educators’ work of evaluating nursing students.
Figure 2. Regulatory Nursing Regime. Note. HSPNET = Health Sciences Placement Network.
When we had oriented ourselves to the institutional features of our ethnographic study and had our schemata in hand, we again reviewed the transcripts and textual samples that had been collected. We looked through the data to determine what was happening and how what was happening linked into the regulatory regimes we had identified.
Our expert knowledge as teachers thus became a useful source of data. For example, when we read descriptions about faculty going into careful detail about their expectations of students, we recognized this as a component of evaluation work. When we attempted to link these faculty practices into the regulatory regimes, we recognized that they were directed in part by the requirement of the educational regime to be transparent and also by the professional standards of practice. Similarly, when we analyzed teachers’ descriptions of how they decided to provide written feedback to students, we found traces in their talk and in our observations of a work process directed toward tracking patterns that we recognized as an institutional requirement for transparency and fairness.
As a way to engage with the large volume of data, we began to organize it into instances of work. Although this system of data analysis held some resemblance to the “theme and category” approach that is familiar to researchers using other methods, it was significantly different because our instances-of-work headings consistently sustained the connections back into the data, into something that was tangible and material. At the same time, this approach provided the links into the practices outside the setting, also tangible and material. This strategy sustained the ontological shift, keeping us grounded in the empirical focus of our analysis.
Our data management system provided us with a way to begin to scrutinize components of our everyday work that we had previously glossed over as instances of competent teaching. Smith (1987) wrote:
If we cease to take them [everyday experiences] for granted, if we strip away everything we imagine we know of how they came about (and ordinarily that is very little), if we examine them as they happen within the everyday world, they become fundamentally mysterious.
In rendering the instances-of-work fundamentally mysterious, we began to unearth what teachers’ work processes were actually accomplishing. We looked for the connections between what was happening inside our evaluation work and the extra-local practices that link and coordinate our work into the regulatory regimes we had sketched on our schemata. We paid attention to what was going on and actively looked for the disjuncture between what we supposed was happening and what our research approach was now allowing us to see differently.
Formulating the Problematic
Our data analysis revealed empirically how the social organization of our evaluation work places us on a line-of-fault between the regulatory demands of the institutions and our teaching intentions. We began to see the flaws in our taken-for-granted assumptions and accounts of our work. This disjuncture became our research problematic.
To demonstrate how we were able to see the research problematic, we used the following data excerpt taken from field notes generated at a meeting where nurse educators consulted with one another about a student’s progress during a practicum experience. Ostensibly, progress meetings of this sort are designed to support teachers’ effectiveness as they discuss their students’ practice with colleagues to develop individualized teaching strategies. The meetings also serve as an early warning system to intervene in a timely and supportive way with students who are experiencing difficulties. The field notes from the progress meeting record a discussion about a first-year student. The student is in the second semester of the nursing program, and her weekly practicum is in a residential care facility where she is learning to care for dependent older adults. The student and the practice teacher had worked two shifts together, and the progress meeting was the first of the semester. The issue presented by the teacher to her colleagues was the student’s lack of preparation for practice. The following is an excerpt from the field notes that recorded the discussion:
Practice teacher: The student was not prepared. This isn’t attitude.… I don’t want it to be about me working harder than [the] student.
Teaching colleague: Could she tell you verbally what should be in the care plan?
Practice teacher: No.
Teaching colleague: Is she overwhelmed?
In this progress meeting, the talk appeared to be focused on strategies to promote student learning. Concern, and perhaps frustration, had been expressed, and the teachers' activities were directed toward understanding the student's progress and experience. The notes reflect the practice teacher's concerns about the student as she consults with her colleagues. The practice teacher identified that the student was not prepared for practice because she had not developed a written care plan. The teaching colleague asked whether the student could verbally explain her plan of care as she worked to understand the issue of the student's preparation. The question "Is she overwhelmed?" was directed toward understanding and supporting the student's beginning practice. These teachers' work could be described as the enactment of an intention to develop the relational pedagogy and critical feminist philosophies that are curricular foundations for the nursing programs being studied (Collaborative Nursing Program, 2004).
As the discussion evolved, the student’s preparatory work on the care plan continued to be a central interest. However, in the following excerpt, there is a subtle shift. On the surface, the work at the meeting appears to support student learning. However, if we apply the institutional ethnographic method of inquiry to guide a critical analysis of what is actually happening, a tension emerges, as evident in this dialogue:
Teaching colleague 1: In the care plan, what are her foci statements?
Teaching colleague 2: Do you need another set of eyes to see what is passed?
Teaching colleague 3: We have this bar and we don’t let them in [to practice] until they make the bar.
The focus was brought back to the care plan, the documentary tool being used to assess the student's preparedness. It was suggested that perhaps another teacher could review the document to offer more insight into whether the student had met the requirements for being prepared. This is the shift. The issue of the student passing has explicitly entered the discussion. The shift is barely noticeable because it slipped so subtly into the talk; however, at this juncture, the focus shifted away from the student's learning about how to take care of her patient and was directed toward the evaluation decision to be made. Although this was early in the semester, the ground was being developed to support faculty decisions about whether the student may continue. The final entry in the field notes asked:
Practice teacher: How can we maximize the student’s potential?
The data excerpt reflects the teachers’ intentions, but our analysis revealed how these are infiltrated and ruled by a requirement to evaluate.
There is a dual work process happening here. There is no doubt that the teachers are committed to supporting student success. However, when the data are scrutinized to explicate the taken-for-granted enactments of competent teaching practices, the disjuncture emerges. Here at the beginning of the semester, work practices are being enacted that may be used to establish whether the student can meet the requirements. We have identified a juncture in teachers’ work when teaching and guiding a student’s learning is overtaken by activities directed toward gathering evidence. We uncovered the built-in contradiction that is supported by the nurse educator’s comment in the introductory section of this article: “I can’t think about teaching at the same time as I am gathering evidence. It’s frustrating.”
We argue that the teachers’ work at the progress meeting was to develop a case that could be used against the student and be held up as fair and transparent if the student should decide to contest it. This sets in motion practices that seem at odds with how this student might be coached and supported to develop her study skills and her nursing work with older adult patients. When teachers activate another set of eyes to review a student’s written work and compare it with standards of preparation, something else is going on. These activities are vested in documentary and institutional practices. In the local contexts of a teacher’s work with a student, teachers are being organized to enact contradictory practices that produce an ideological account to serve broad institutional goals. What we identified here was an instance of teachers’ work that was directed toward producing due process. When we looked more closely at due process, we discovered that it is a term that emerged from the legal discourse. In teaching work, due process is a set of practices that are linked into the student’s right to appeal a teacher’s grade allocation. Ideologically, it is based on a concept of fundamental fairness. It is the standard to which a teacher is held to determine whether his or her teaching practices have been fair and impartial. In this preliminary analysis, we found evidence to suggest that practices conceptualized as due process actually organize us to gather evidence to fail. We can now follow these clues into the extra-local arenas of their enactment and interrogate our data for empirical evidence of due process.
We have formulated a research problematic at the juncture of the local and extra-local practices of due process. The problematic provides the focus for our second-stage interviews, during which we talked to deans and registrars about their expert knowledge of how due process works when it is used as a measure for decisions about student appeals. It provides one direction for our research as we explore how practices of due process are linked into the student appeal process and how these reflexively get played out in the evaluation practices of teachers. As we follow the work of enacting due process into the day-to-day activities of the various players within the education regime, we are discovering how the activation of due process seems to be a significant point of tension in many of the practices and interactions teachers have with students. We continue to question what these practices accomplish in the work of teachers.
In the early stage of this research, we developed what we believe is a useful analysis about how contradictory practices are built into evaluation work. Institutional ethnography has supported our ontological shift and has provided the theoretical framework through which we can suspend our taken-for-granted assumptions about competent evaluation. We have come to realize that engaging in this study from a committed research standpoint has involved the research team in a consciousness-raising project that requires us to rethink our work with students.
By questioning our taken-for-granted understanding of due process, related to the concepts of fairness, transparency, and rights, and by examining what due process actually accomplishes on behalf of students, we can begin to track it as a set of practices that work across all educational settings. This raises important ethical questions: Do students understand how these processes work? Are they really fair and transparent processes that are organized in the interests of students, or are they in fact relations of ruling that protect universities and colleges? These are the sorts of thorny questions institutional ethnographic research raises. Although our research project is still underway, we are compelled to circulate these important questions to stimulate a critical discussion about what our evaluation work is actually accomplishing.
We invite our colleagues in nursing education to focus attention differently on what goes on in our evaluation practices and to begin to question the integrity of couching our work as being in the interest of students’ learning when what actually may be happening are practices focused on gathering evidence to be used against them. We can begin to identify how an alternate knowledge of our evaluation practices might alter our interactions with students. If we cannot circumvent the requirement for due process, then we can, at minimum, explicate the work for students so that they too can be clear about what is actually being accomplished. We can work with faculty colleagues and students to make our work more transparent, not ideologically, but pragmatically. We can open up due process and learn how to talk about its real terms.
- Collaborative Nursing Program in British Columbia. (2004). Collaborative curriculum guide. Victoria, British Columbia, Canada: Author.
- DeVault, M., & McCoy, L. (2006). Institutional ethnography: Using interviews to investigate ruling relations. In D.E. Smith (Ed.), Institutional ethnography as practice (pp. 15–44). Lanham, MD: Rowman & Littlefield.
- DeVeau, J.L. (2008). Examining the institutional ethnographer’s toolkit. Socialist Studies, 4(2), 1–20.
- McCoy, L. (2007). Institutional ethnography and constructionism. In J.A. Holstein & J.F. Gubrium (Eds.), Handbook of constructionist research (pp. 701–714). New York, NY: Guilford Press.
- Smith, D.E. (1987). The everyday world as problematic: A feminist sociology. Toronto, Ontario, Canada: University of Toronto Press.
- Smith, D.E. (1990). The conceptual practices of power: A feminist sociology of knowledge. Toronto, Ontario, Canada: University of Toronto Press.
- Smith, D.E. (2001). Texts and the ontology of organizations and institutions. Studies in Cultures, Organizations, and Societies, 7(2), 159–198. doi:10.1080/10245280108523557 [CrossRef]
- Smith, D.E. (2005). Institutional ethnography: A sociology for people. Lanham, MD: AltaMira Press.
- Smith, D.E. (Ed.). (2006). Institutional ethnography as practice. Lanham, MD: Rowman & Littlefield.