Using Thematic Content Analysis in Evaluation

No matter how qualitative data is collected, it is necessary to reduce and analyze the raw data in order to extract information and determine its meaning. While there are many ways to approach this analytic process, thematic content analysis (Braun & Clarke, 2006) and interpretative phenomenological analysis (Smith, Flowers, & Larkin, 2009) are particularly flexible and useful. Thematic content analysis can be used with virtually any type of qualitative data. Interpretative phenomenological analysis is more time consuming and complex but yields detailed information on interviewees’ personal understandings of phenomena related to the evaluation questions. It is best reserved for the analysis of in-depth interviews.

Thematic content analysis is one of the most straightforward ways to analyze qualitative data (Maguire & Delahunt, 2017). Braun and Clarke (2006) suggest a six-step analytic process: (1) become familiar with the data, (2) generate initial codes, (3) search for themes, (4) review themes, (5) define themes, and (6) write up the results. To start, the school counselor reads through all the raw data to get a general sense of what it contains and makes general notes to guide subsequent analyses. Next, the school counselor works through the data, identifying the sections that pertain to the evaluation question being examined and coding each identified section with a short phrase that summarizes its meaning. After the data is coded, the school counselor examines the codes to identify clusters of related codes, or “themes.” The themes are given names that reflect aspects of answers to the evaluation questions. These themes are then reviewed to determine whether they make sense, are supported by the data, are organized appropriately, and present a comprehensive picture of the data. At this point, existing themes may be combined or divided and new codes may be added. The school counselor then defines these refined themes and describes how the themes appear to be related to each other. The themes, definitions, and description of hypothesized relationships are then written up in a form that will appear in an evaluation report.
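The coding and theming steps described above can be sketched programmatically. The following is a minimal illustration only; the excerpts, codes, and theme names are hypothetical examples invented for this sketch, not data from any actual evaluation.

```python
from collections import defaultdict

# Hypothetical coded excerpts (step 2): each raw excerpt from a transcript
# or open-ended survey response has been tagged with a short code.
coded_excerpts = [
    ("I never know who to ask about college forms.", "unclear guidance"),
    ("The counselor's door always seems closed.", "low accessibility"),
    ("Group sessions helped me plan my classes.", "helpful planning"),
    ("Nobody explained the scholarship deadlines.", "unclear guidance"),
    ("It was easy to get an appointment this year.", "high accessibility"),
]

# Steps 3 and 5: cluster related codes under named themes.
themes = {
    "Access to counseling services": {"low accessibility", "high accessibility"},
    "Clarity of information": {"unclear guidance"},
    "Perceived usefulness": {"helpful planning"},
}

# Step 4 (reviewing themes): gather the excerpts supporting each theme,
# so the counselor can check that every theme is grounded in the data.
support = defaultdict(list)
for excerpt, code in coded_excerpts:
    for theme, codes in themes.items():
        if code in codes:
            support[theme].append(excerpt)

for theme, excerpts in support.items():
    print(f"{theme}: {len(excerpts)} supporting excerpt(s)")
```

In practice this bookkeeping is usually done with qualitative analysis software or by hand; the sketch simply makes the code-to-theme mapping and its review explicit.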

Using Interpretative Phenomenological Analysis in Evaluation

Smith et al. (2009) have described a similar five-step process for analyzing qualitative data. First, the data from the first respondent is reviewed several times by the school counselor, who takes notes on the respondent’s beliefs and personal understandings. Second, the school counselor re-reviews the data and notes and writes concise themes that reflect the respondent’s expressions. Third, these themes are clustered into meaningful chunks, and superordinate themes are identified. Fourth, the school counselor creates a table of the themes that identifies the most salient and important ones. Fifth, the school counselor moves on to analyze the data from the next respondent, either starting fresh or using the existing table of themes as the starting point for the subsequent analysis. The endpoint of the process is a summary of the themes that reflect how all respondents understood and evaluated phenomena related to the particular evaluation question being investigated.
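The cumulative theme table at the heart of this process can also be sketched in code. Again, the respondents, themes, and expressions below are hypothetical, invented purely for illustration; salience is approximated here by how many respondents expressed each superordinate theme.

```python
# Hypothetical per-respondent results of steps 1-3: each respondent's
# expressions have been condensed into themes grouped under
# superordinate themes.
respondent_themes = {
    "Respondent 1": {
        "Feeling supported": ["counselor listened", "felt understood"],
        "Uncertainty about the future": ["worried about college"],
    },
    "Respondent 2": {
        "Feeling supported": ["easy to talk to counselor"],
        "Peer comparison": ["friends seem more prepared"],
    },
}

# Steps 4-5: build a cumulative table of superordinate themes, noting
# which respondents expressed each one and their supporting expressions.
theme_table = {}
for respondent, themes in respondent_themes.items():
    for superordinate, expressions in themes.items():
        entry = theme_table.setdefault(
            superordinate, {"respondents": [], "expressions": []}
        )
        entry["respondents"].append(respondent)
        entry["expressions"].extend(expressions)

# List themes with the most widely shared (most salient) first.
for name in sorted(theme_table, key=lambda t: -len(theme_table[t]["respondents"])):
    entry = theme_table[name]
    print(f"{name}: {len(entry['respondents'])} respondent(s)")
```

Carrying the same table forward respondent by respondent mirrors the choice Smith et al. describe between starting fresh for each respondent and building on the existing table.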

Ensuring the Trustworthiness of Qualitative Findings

Because of the subjective nature of qualitative approaches to evaluation, it is necessary to attend to the trustworthiness of the evaluation findings. According to Guba (1981), trustworthy findings are findings that are credible, dependable, confirmable, and transferable. Every qualitative evaluation should employ multiple strategies to ensure trustworthiness (Shenton, 2004).

Evaluations that use well-accepted methods for collecting and analyzing qualitative data are more likely to yield trustworthy findings. If the school counselor clearly identifies their own beliefs and biases at the initial stages of the evaluation, they will be in a better position to conduct an evaluation that leads to trustworthy findings. The school counselor can do this by writing a few brief paragraphs summarizing their reflections on the program and its operations. The triangulation of findings to verify their consistency across different groups (and/or across different methods) also helps ensure trustworthiness. In addition, periodic debriefing with peers during the evaluation also helps a school counselor conduct an unbiased qualitative evaluation. Finally, having interviewees and/or stakeholders review the analyses and interpretations of qualitative evaluations further ensures trustworthiness.

BOX 8.2

In qualitative evaluations, steps must be taken to ensure the trustworthiness of the data.

Qualitative Approaches and Culturally Responsive Evaluation

Qualitative evaluation approaches present some unique challenges in multicultural settings. In order to conduct a culturally responsive evaluation, special care must be taken in the collection, analysis, and interpretation of qualitative data.

Language preferences and facility present a challenge to the collection of valid information. Ideally, interviews ought to be conducted in the language of the stakeholders. The next best alternative is conducting an interview through an interpreter.

Being able to bridge differences in language, however, is only a start. Interviewees need to feel comfortable and safe in interview settings. Interviewers need to be sensitive to the societal dynamics that may be operating in the interview setting. These dynamics will affect interviewee comfort and openness. Interviewers should actively work to create a safe interview setting (Centers for Disease Control and Prevention, 2010; Public Policy Associates, 2015).

In addition, interviewee nonverbal behavior affects the conduct of interviews. Culture has a major influence on the cues (e.g., eye contact, shifts in gaze, gestures, pauses) that interviewers use to gauge interest and understanding and to contextualize verbal expression (Frierson, Hood, & Hughes, 2010). Because nonverbal communication operates at a subliminal level, misinterpretation is highly likely. Interviewers need sufficient experience with cross-cultural communication to accurately ascertain the meaning of interviewees’ nonverbal communication (Public Policy Associates, 2015).

Group interviews (e.g., focus groups) can be especially problematic and need to be planned, conducted, and interpreted in a culturally responsive manner. For example, with parent stakeholder focus groups, it is important to consider whether the groups should be composed of culturally heterogeneous or homogeneous parents. In making this decision, a school counselor should ask himself or herself:

  • Considering the evaluation questions and the multicultural dynamics, are members of all the parental subgroups likely to feel safe enough to contribute openly to the focus group discussion?
  • Are there some potential advantages associated with having a multicultural discussion of the evaluation questions?
  • Are there any anticipatable problems that may arise in a multicultural discussion of the evaluation questions?
  • Do I have adequate interview skills to effectively manage these potential problems (e.g., conflict that may occur in this discussion)?

Cultural responsiveness also needs to be expressed in the analysis and interpretation of qualitative evaluation data. The meaning of a piece of qualitative data can only be accurately ascertained within the cultural context in which the data was collected. School counselors need to take care to apply analytic procedures and interpret findings with proper consideration of respondents’ cultural frame of reference. School counselors need to be able to identify when their own biases are affecting analysis and interpretation. They also need to have sufficient experience and cultural competence with members of the target group to fully comprehend the meaning and nuances of the responses (Public Policy Associates, 2015). Finally, to ensure trustworthiness, it is important to have respondents and stakeholders review the analyses and interpretations.

Summary

Using qualitative evaluation approaches in evaluating a school counseling program, its associated activities, and interventions is necessary to answer important questions related to program improvement. Qualitative approaches to evaluation align naturally with the existing skillset of school counselors. While qualitative data can be collected in many ways, open-ended survey questions, structured individual interviews, and focus group interviews are particularly well suited to evaluation of a school counseling program. Similarly, thematic content analysis and interpretative phenomenological analysis are well-accepted methods for the analysis of qualitative data and well suited for evaluation of a school counseling program. No matter which methods are employed, every qualitative evaluation should include multiple strategies to ensure the trustworthiness of its findings. In order to conduct a culturally responsive evaluation, special care must be taken in the collection, analysis, and interpretation of qualitative data.

References

American School Counselor Association. (2012). The ASCA national model: A framework for school counseling programs (3rd ed.). Alexandria, VA: Author.

Anderson, G. (1998). Fundamentals of educational research (2nd ed.). London: The Falmer Press.

Bloor, M., Frankland, J., Thomas, M., & Robson, K. (2001). Focus groups in social research. Thousand Oaks, CA: Sage.

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3, 77–101.

Centers for Disease Control and Prevention. (2010). Practical strategies for culturally competent evaluation. Atlanta, GA: US Department of Health and Human Services.

Denscombe, M. (2007). The good research guide: For small-scale research projects (3rd ed.). New York, NY: Open University Press.

Frierson, H. T., Hood, S., & Hughes, G. B. (2010). A guide to conducting culturally responsive evaluations. In J. Frechtling (Ed.), The 2010 user-friendly handbook for project evaluation (pp. 75–93). Washington, DC: National Science Foundation.

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. New York, NY: Aldine de Gruyter.

Guba, E. G. (1981). Criteria for assessing the trustworthiness of naturalistic inquiries. Educational Communication and Technology Journal, 29, 75–91.

Krueger, R., & Casey, M. A. (2000). Focus groups: A practical guide for applied research (3rd ed.). London: Sage.

Maguire, M., & Delahunt, B. (2017). Doing a thematic analysis: A practical, step-by-step guide for learning and teaching scholars. All Ireland Journal of Teaching and Learning in Higher Education, 8, 1–14.

Patton, M. Q. (2015). Qualitative research and evaluation methods (4th ed). Thousand Oaks, CA: Sage.

Public Policy Associates. (2015). Considerations for conducting evaluation using a culturally responsive/racial equity lens. Lansing, MI: Author.

Rallis, S. F., & Rossman, G. B. (2017). An introduction to qualitative research (4th ed.). Thousand Oaks, CA: Sage.

Shenton, A. K. (2004). Strategies for ensuring trustworthiness in qualitative research projects. Education for Information, 22, 63–75.

Smith, J. A., Flowers, P., & Larkin, M. (2009). Interpretative phenomenological analysis: Theory, method and research. London: Sage.
