Implementing evidence-based practice in sport psychology: the 7As
Knowledge translation involves using research in ways that help people and is a process consisting of several steps. Figure 13.2 outlines the 7 As illustrating the common steps in knowledge transfer (de Groot, van der Woude, van Hell, & Nieweg, 2013; Melnyk, Fineout-Overholt, Stillwell, & Williamson, 2010; Steglitz, Warnick, Hoffman, Johnston, & Spring, 2015; Straus, Glasziou, Richardson, & Haynes, 2019).
Andersen (Andersen, Van Raalte, & Harris, 2000) described sport psychology as “endlessly fascinating,” and he aimed to help his supervisees develop profound interests in their clients, their own behaviours, and the consulting relationship. Becoming enthralled helps transcend dichotomous thinking about good and bad that is often unhelpful when examining why service delivery has or has not gone well. Such a binary assessment stifles the deep reflection associated with professional growth and evidence-based practice.
Figure 13.2 The 7As.
The second step involves asking answerable client-focused questions (Spring et al., 2012). Questions can address any aspect of service delivery, such as interventions, assessment, or consulting relationships (Steglitz et al., 2015), and can arise from various sources (de Groot et al., 2013). For example, questions can emerge from consultancy or from interactions among colleagues. The different sources give rise to general (or background) and specific (or foreground) questions (Spring et al., 2012).
Background questions deal with general information about service delivery, and examples include:
• What psychological strategies help heal strained coach-athlete relationships?
• How can musicians manage performance anxiety?
• How might practitioners encourage solid therapeutic relationships with children?
Background questions are useful for continued professional development and reflect a commitment to lifelong learning.
Specific questions are tailored towards individual clients and may follow formats associated with systematic review methodology, such as PICOT (participant, intervention, control, outcome, time). An example specific question could be: in adult musicians (P), do cognitive behavioural interventions (I) help them manage performance anxieties (O) more quickly (T) than spontaneous remission (C)? The formats from systematic review methodology can help refine and convert vague ideas into answerable questions (Straus et al., 2019).
Evidence-based practitioners search for both scientific and local knowledge, because each influences consulting processes and outcomes (Rousseau & Gunia, 2016). Locating original research can be time-consuming, especially for comprehensive searches (Tod, 2020). Accessing systematic reviews and practice guidelines, however, may be sufficient for a practitioner’s purposes (Spring et al., 2012; Steglitz et al., 2015). Consultants, particularly those in private practice, might not have ready access to scientific evidence and will benefit from having colleagues working in academia. Practitioners can also search consultant-friendly journals, such as The Sport Psychologist or the Journal of Sport Psychology in Action. The efficiency and effectiveness of searches can be enhanced by framing questions according to formats such as PICOT, because these formats assist individuals in identifying relevant combinations of keywords (Tod, 2020).
Local knowledge may not be subject to the same levels of quality assurance as scientific evidence, but it still helps in tailoring services to clients’ needs and circumstances (C.H. Brown, Gould, & Foster, 2005). Examples include players’ statistics, performance results, typical weather conditions, and organizational culture. Although the rise of systematic reviewing as a scientific method has enhanced people’s skill in locating research, the same achievements are yet to be attained with regards to local knowledge (Rousseau & Gunia, 2016). Nevertheless, such context-specific information helps practitioners strike a balance between intervention fidelity and flexibility to ensure clients receive optimal services (Cook et al., 2017).
Critical appraisal involves the meticulous, methodical inspection of a study’s credibility, worth, and bearing on a specific question, client group, or context (Tod, 2020). Rigour, suitability, and relevance constitute critical appraisal (Liabo et al., 2017). Rigour refers to internal validity. Suitability refers to the match between the study’s method and the practitioner’s question. Relevance refers to the study’s significance for the question and context. Melnyk et al. (2010) provide three guiding questions to help practitioners consider the core dimensions of critical appraisal:
• Are the results valid?
• What are the results and are they important?
• Will the results help me care for and assist my clients?
These three questions’ usefulness will vary with the reader’s purpose. A practitioner may find them sufficient for helping clients. Scientists may find them too general to guide a review they intend to publish in a scholarly journal.
Stephen King (2000, p. 201) wrote, “when you write a book, you spend day after day scanning and identifying the trees. When you’re done, you have to step back and look at the forest.” His words also reflect critical appraisal because evaluating research involves dissecting each study to a fine level of detail. It is easy to become overly critical, an attitude Sackett, Richardson, Rosenberg, and Haynes (1997) labelled critical appraisal nihilism. Not every limitation influences results, and imperfect studies can still help us answer our questions. Critical appraisal’s purpose becomes clear when we examine trends across a body of literature. Trends allow us to make informed decisions about how much we can trust and apply the findings from research.
Applying research can be a political exercise requiring shared decision-making among practitioners, clients, and other stakeholders (Steglitz et al., 2015). Stakeholders in sport psychology include coaches, parents, and sporting organizations. The purpose is to devise an action plan that integrates research evidence, clinical expertise, and clients’ preferences, needs, and circumstances (Melnyk et al., 2010). Depending on the context, applying the evidence may occur at the individual or group level (de Groot et al., 2013). At the individual level, such as when consultants are working one-on-one with clients, integrating evidence is about personalising interventions. At the organizational level, such as in team situations, there is the added need to balance personalizing services with being consistent across the group. Few cookbook recipes exist to guide practitioners in integrating multiple sources of evidence. Different sources of evidence sometimes conflict. In ambiguous situations, practitioners can work with the people involved to identify the actions with the best chance of helping clients and avoiding harm.
Evaluating the process and outcome of applied work is a core principle in evidence-based practice (American Psychological Association, 2006). Assessment ideally occurs both when scientific knowledge is implemented and when practitioners operate in the absence of guiding research (Rousseau & Gunia, 2016). In the latter case, assessment can help justify novel ways of working. Optimal assessment is cyclical, with each iteration leading to enhanced tailoring of knowledge to the client’s specific preferences, needs, and circumstances (Spring et al., 2012; Steglitz et al., 2015). Also, assessment that begins prior to applying knowledge helps ascertain its effectiveness by supplying baseline data (Tod, 2020). When helping athletes, for example, baseline assessment might include objective and subjective performance indicators, thought patterns, and emotions. Practitioners might also gather data from different people. Monitoring and assessment help consultants detect flaws and mistakes, learn improved ways of working, and observe where results and processes in service delivery differ from those in scientific studies (Melnyk et al., 2010). Practitioners who evaluate their work also gain insights into their knowledge, skills, and competencies. Their increased self-awareness can fuel personal improvement (Rousseau & Gunia, 2016).
Evidence-based practice allows consultants to work in effective, creative, safe, and ethical ways. Clients achieve desired goals, find solutions to difficult problems, attain satisfaction, secure peace of mind, and derive meaning in their lives. Practitioners who communicate their effective and ineffective service delivery experiences help fellow consultants, researchers, students, and stakeholders (Melnyk et al., 2010). Colleagues and trainees learn new ways of helping clients, have novel grist for the reflective mill, and avoid reinventing methods and strategies. Researchers ascertain new topics to examine that extend theory. Stakeholders discover the contributions sport, exercise, and performance psychology make to their endeavours.
There are various ways practitioners can disseminate their accounts to the community (Melnyk et al., 2010). At a local level, for example, many professionals are members of groups that meet regularly to share experiences. At a national level, practitioners may belong to professional organizations that hold conferences, seminars, and workshops at which case studies might be welcome. Professional organizations also publish newsletters, magazines, or journals that disseminate applied work. At the global level, international conferences, textbooks, and scholarly journals are feasible outlets. Several sport psychology journals, for example, publish case studies, such as The Sport Psychologist, the Journal of Sport Psychology in Action, and Case Studies in Sport and Exercise Psychology.