
Open Access 01.12.2022 | Letter to the Editor

A review of attitude research that is specific, accurate, and comprehensive within its stated scope: responses to Aarons

Authors: Jessica Fishman, Catherine Yang, David S. Mandell

Published in: Implementation Science | Issue 1/2022

Notes
This comment refers to the article available at https://doi.org/10.1186/s13012-021-01153-9.

Dear Editors-in-Chief (Implementation Science):
Thank you for the opportunity to respond to Dr. Aarons’ letter regarding our article “Attitude theory and measurement in implementation science: a secondary review of empirical studies and opportunities for advancement” [1]. Dr. Aarons raises three main concerns about our review: (1) that we failed to attribute the EBPAS to him as its creator; (2) whether the EBPAS measures attitudes; and (3) whether our review should have included additional studies using the EBPAS. Below, we address each.
First, Dr. Aarons states that we should have made an attribution to him when referencing the developers of the EBPAS. We did cite Aarons and colleagues in the version of the manuscript that was accepted for publication; it appears the journal mistakenly changed the reference. We hope that this can be rectified and thank Dr. Aarons for bringing it to our attention.
Second, we respectfully disagree with Dr. Aarons about whether the EBPAS measures attitudes. As defined in the social psychology literature from which the term emanates, an attitude towards a behavior, such as using an evidence-based practice, refers to how strongly one believes that performing that behavior would have favorable or unfavorable consequences [2–4]. In implementation science, one’s attitude towards a particular evidence-based practice would represent the perceived advantages and disadvantages of using it [2–4]. There are many published methodological accounts of how to adapt validated attitude measurement approaches, and these approaches differ fundamentally from the EBPAS items and response options.
In the 15-item version of the EBPAS [5], almost all items deviate conceptually from an attitude. As an example, several items ask respondents to report “how likely” they are to use EBP under different circumstances. In psychology, such items would be considered conceptually similar to behavioral intention, not attitudes [6, 7]. The more recent 36-item version of the EBPAS [8] also includes items that are conceptually closer to other psychological constructs. For example, the following item is conceptually related to self-efficacy: “I don’t know how to fit evidence-based practice into my administrative work.” We do not mean to diminish the importance of measuring constructs other than attitudes, but it is useful to distinguish between distinct psychological constructs, which play different roles in causal models for predicting and changing behavior.
We also disagree about the importance of measuring attitudes towards specific behaviors rather than general categories of behaviors. EBPAS items refer to general categories of behavior, such as trying “new practices,” “evidence-based practices,” or “evidence-based treatment” [5, 8]. Yet, over several decades, a large attitude literature in psychology has empirically demonstrated the advantages of measuring attitudes towards a specific behavior, rather than general categories of behavior [2–4].
Consistent with the results from psychology, the implementation science literature has started to document how practitioners’ attitudes can vary greatly among evidence-based practices [9–12]. For example, we have found that therapists’ attitudes vary towards different components of cognitive-behavioral therapy [10]. Given this variability, a measure of attitudes towards “evidence-based practice” or even “cognitive behavioral therapy” would sacrifice psychometric performance, including predictive validity [9–12]. Depending on the specific evidence-based practice, other psychological variables also can vary [9–12].
A related concern is that practitioners often lack familiarity with the phrase “evidence-based practice,” as Dr. Aarons and colleagues have acknowledged [5, 8]. The EBPAS directions state that “evidence-based practice” refers to any intervention that is supported by “empirical research,” but practitioners may still be confused due to a lack of knowledge [5, 8]. For example, Aarons wrote, “Familiarity with the term ‘evidence-based practice’ among program managers was low” [5]. He added that respondents had “only a low level of familiarity with even the terminology of EBP,” including the descriptor “empirically supported treatment” [5]. Additionally, practitioners may not know which practices have been designated as “evidence-based,” “research-based,” or “empirically supported.” Responses to the EBPAS may therefore differ depending on one’s knowledge, which is problematic if the goal is to measure attitudes.
Finally, Dr. Aarons points out that we did not include many studies that use the EBPAS. The EBPAS was featured only briefly because our review was not focused on the EBPAS. Dr. Aarons also suggests that our study selection was biased. We are surprised by this concern because we explicitly stated the inclusion and exclusion criteria, which relied on a rigorous systematic review co-authored by Aarons [13], and we adhered to those criteria.
We agree with Dr. Aarons that future reviews could change the inclusion criteria and generate a different sample of studies. Indeed, in our review [1], we called for this additional research, and we welcome replication with a different sample. Dr. Aarons correctly notes that there are thousands of articles that could be reviewed if different inclusion criteria were used. He suggests that the studies we reviewed are not representative of all implementation studies concerned with attitudes. Because we lack reviews of how these other implementation studies define or measure attitudes, whether our results are representative remains an open question.
Implementation science has been described as “somewhat elusive” because it has not yet developed distinct construct definitions [14]. Our review documents conceptual ambiguity and suggests that a definition of attitudes (from psychology) could be useful for implementation research [1]. Our review also provides specific examples of how implementation scientists measure attitudes in ways that differ from each other and from validated approaches used in social psychology. As implementation science strives to develop standardized measurement approaches, some of the rigorously developed methods from social psychology could offer valuable scientific opportunities.

Acknowledgements

Not applicable.

Declarations

Ethics approval and consent to participate
The need for approval was waived.
Consent for publication
Not applicable.

Competing interests

The authors declare that they have no competing interests.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
1. Fishman J, Yang C, Mandell D. Attitude theory and measurement in implementation science: a secondary review of empirical studies and opportunities for advancement. Implement Sci. 2021;16:87.
2. Albarracin D, Johnson B, Zanna M. Handbook of attitudes. Mahwah: Erlbaum; 2005.
3. Ajzen I, Fishbein M. Attitude-behavior relations: a theoretical analysis and review of empirical research. Psychol Bull. 1977;84:888–918.
4. Fishbein M, Ajzen I. Predicting and changing behavior: the reasoned action approach. 2015.
5. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: the Evidence-Based Practice Attitude Scale (EBPAS). Ment Health Serv Res. 2004;6(2):61–74.
6. Fishman J, Lushin V, Mandell DS. Predicting implementation: comparing validated measures of intention and assessing the role of motivation when designing behavioral interventions. Implement Sci Commun. 2020;1(81).
7. Sheeran P. Intention-behavior relations: a conceptual and empirical review. Eur Rev Soc Psychol. 2002;12:1–36.
8. Rye M, Torres EM, Friborg O, Skre I, Aarons GA. The Evidence-based Practice Attitude Scale-36 (EBPAS-36): a brief and pragmatic measure of attitudes to evidence-based practice validated in US and Norwegian samples. Implement Sci. 2017;12(1):44.
9. Maddox BB, Crabbe SR, Fishman JM, Beidas RS, Brookman-Frazee L, Miller JS, et al. Factors influencing the use of cognitive-behavioral therapy with autistic adults: a survey of community mental health clinicians. J Autism Dev Disord. 2019;49(11):4421.
10. Wolk CB, Becker-Haimes EM, Fishman J, Affrunti NW, Mandell DS, Creed TA. Variability in clinician intentions to implement specific cognitive-behavioral therapy components. BMC Psychiatry. 2019;19(1):406.
11. Fishman J, Beidas R, Reisinger E, Mandell DS. The utility of measuring intentions to use best practices: a longitudinal study among teachers supporting students with autism. J School Health. 2018;88:388–95.
12. Becker-Haimes EM, Mandell DS, Fishman J, et al. Assessing causal pathways and targets of implementation variability for EBP use (Project ACTIVE): a study protocol. Implement Sci Commun. 2021;2:44.
13. Lewis CC, Boyd MR, Walsh-Bailey C, et al. A systematic review of empirical studies examining mechanisms of implementation in health. Implement Sci. 2020;15:21.
14. Martinez RG, Lewis CC, Weiner BJ. Instrumentation issues in implementation science. Implement Sci. 2014;9:118.
Metadata
Title: A review of attitude research that is specific, accurate, and comprehensive within its stated scope: responses to Aarons
Authors: Jessica Fishman, Catherine Yang, David S. Mandell
Publication date: 01.12.2022
Publisher: BioMed Central
Published in: Implementation Science, Issue 1/2022
Electronic ISSN: 1748-5908
DOI: https://doi.org/10.1186/s13012-022-01200-z
