A Future for Mixed-Methods in Pharm Ed?

By Michael J. Peeters, PharmD, MEd, FCCP, BCPS

What is the nature of our obsession with certain aspects of assessment in pharmacy education? Over the past decade, a number of assessment instruments have been created in pharmacy education (including some that I have developed). On reflection, it seems the culture of pharmacy education is fixated on standardized, quantitative instruments to measure everything. “All we need is the right instrument to measure this, and then we will know.” Quantitative, closed-ended assessment instruments (i.e., those where participants select responses from a pre-determined list of options) are seen as a cure-all in pharmacy academia. …But are they really?

Broadening our Perspective on Assessment

I often scan other areas of education (such as medical education and general education), and it is interesting that the description and development of closed-ended assessment instruments are infrequent topics there. Moreover, I am increasingly noticing qualitative methods such as reflective writing and portfolios being used for programmatic assessment. Even the Accreditation Council for Pharmacy Education highlights a role for reflection and portfolios within its guidance for the 2016 PharmD Standards.

I do not want to foray into the quantitative/qualitative debate that raged in the social sciences during the 1990s, but suffice it to say that all research methods have limitations. It is reasonable to deduce that research/assessment instruments that are entirely quantitative (i.e., closed-ended) or entirely qualitative (i.e., open-ended) will have limitations of their own. However, as McLaughlin et al. pointed out, mixed-methods is a promising approach going forward.1

Within pharmacy education research, it may be best to use more than one approach within an assessment. For example, a quantitative instrument may provide useful information, but you may need to supplement it (i.e., mix it) with qualitative findings such as reflective writings, interviews, and/or focus groups. Triangulate (i.e., integrate multiple data sources) your quantitative and qualitative findings.

Some Examples of Mixed-Methods Educational Research:

  • A mixed-methods analysis in assessing students’ professional development2
    • This investigation of PharmD students’ professionalism development is a good example of mixed-methods. Over the course of three years, students were periodically assessed both quantitatively and qualitatively, with the results triangulated into the investigation’s conclusions about professionalism development.
  • A final-year rotation portfolio from medical education3
    • These portfolios included student scores on a handful of clerkship-based performance assessments, curricular progress-test scores, a curricular OSCE score, 360° evaluations from other healthcare team members on the rotation, critical appraisals of topics, students’ reflective writing on experiences/successes/challenges, as well as mentor feedback.4
    • Thus, these portfolios included quantitative test scores, qualitative student reflections, and faculty mentors’ qualitative feedback. All of these data were triangulated by a committee of faculty mentors into a more meaningful and clearer pass-versus-remediation decision.
  • Mixed-methods use in programmatic assessment4
    • Cogently, these authors remind pharmacy educators that programmatic assessment is a program of multiple learning assessments; it is not program evaluation, a term with which it is commonly confused in pharmacy academia.
    • Many colleges/schools of pharmacy already use mixed-methods in their programmatic evaluation by integrating interviews (qualitative), questionnaires (quantitative, and possibly with some qualitative comments), course grades (quantitative), etc. Triangulating these evidence sources can foster a clearer picture for programmatic assessment decision-making.
    • The more diverse information one has, qualitative and quantitative, the more accurate an assessment is likely to be.

Despite our efforts, there has been no one-size-fits-all, silver-bullet assessment method, and there likely never will be. Programmatic assessment should involve the integration of multiple learning assessments,5 using a collection of quantitative and qualitative methods. One question remains, however: what is the appropriate “mix”? That is a difficult question to answer, because there is no single “correct” ratio. What is clearly wrong, though, is to involve either quantitative or qualitative data only marginally or inconsequentially. Use all quantitative and qualitative sources substantively.

Moving Forward

Properly conducting mixed-methods research, with its collection and analysis of meaningful qualitative data, can take significantly more effort than using quantitative data alone. However, the resulting picture for conclusions and decision-making will be clearer, fairer, more rigorous, more valid (including more reliable), better able to accommodate life’s complexity, and more defensible to scrutiny (including legal challenges). Mixed-methods should be a predominant approach for future pharmacy educators. Articles in Currents in Pharmacy Teaching and Learning’s Methodology Matters series, which describe qualitative and quantitative methods in much more detail, can be an excellent resource as you proceed.

Do you have more examples of non-quantitative approaches to pharmacy education research to share?

References:

  1. McLaughlin JE, Bush AA, Zeeman JM. Mixed methods: expanding research methodologies in pharmacy education. Curr Pharm Teach Learn. 2016; 8(5):715-721.
  2. Peeters MJ, Vaidya VA. A mixed-methods analysis in assessing students’ professional development by using an Assessment for Learning approach. Am J Pharm Educ. 2016; 80(5):article 77.
  3. Driessen EW, Van Tartwijk J, Govaerts M, Teunissen P, van der Vleuten CPM. The use of programmatic assessment in the clinical workplace: a Maastricht case report. Med Teach. 2012; 34:226-231.
  4. Fielding DW, Regehr G. A call for an integrated program of assessment. Am J Pharm Educ. 2017; 81(4):article 77.
  5. van der Vleuten CPM, Schuwirth LWT. Assessing professional competence: from methods to programmes. Med Educ. 2005; 39(3):309-317.

Michael Peeters is a non-tenure-track faculty member at the University of Toledo College of Pharmacy. His educational scholarship interests include educational psychometrics, learning assessments, development of learners, and interprofessional education.


Pulses is a scholarly blog supported by Currents in Pharmacy Teaching and Learning
