Associate Professor; Centre for Research in Applied Measurement and Evaluation
6-110 Education North Building
Cheryl has earned the following academic degrees: BSc (Queen’s University), BEd (Queen’s University), MA (University of Alabama), and PhD (Queen’s University). She is a faculty member in the research-intensive Centre for Research in Applied Measurement and Evaluation (CRAME) in the Measurement, Evaluation, and Cognition (MEC) program area. Cheryl conducts research, teaches, and supervises graduate students in the areas of mixed methods research, program evaluation, qualitative research, classroom assessment, and health sciences education. She served as Associate Chair, Undergraduate for the Department of Educational Psychology (2014-2015) and as coordinator of the Educational Assessment undergraduate course (EDPY 301) from 2009 to 2013. In 2009, she co-founded The Alberta Clinical and Community-based Evaluation Research Team (ACCERT) with Dr. Jacqueline Pei. Please feel free to contact her about graduate programs, research collaborations, and evaluation services.
Cheryl’s career as an educator and evaluator spans two decades working with partner communities across diverse contexts both internationally and within Canada. Cheryl taught high school science internationally at the American International School in Quito, Ecuador and with West Island College’s international program aboard the tallship S/V Concordia. She served as director of the Concordia-based International program before returning to teach science and French in the Halton District School Board (Ontario). She has worked as a program evaluator in a wide array of sectors across Canada and with international development agencies in South America and Africa. Cheryl joined the University of Alberta in the Faculty of Education in 2008.
Together with faculty and graduate students, ACCERT specializes in community-involved program evaluation and applied social research with a focus on building capacity both within our university-based team and with the clients and stakeholder organizations we work with. Since our inception, the team has been involved in small, single-site evaluation and research projects (e.g., with not-for-profit organizations) as well as several large, system-level projects involving provincial governments. Capitalizing on the diverse expertise across team members, we have worked in a wide variety of sectors, such as education, justice, social services, health care, mental health, and early childhood development. We have also worked with programs for traditionally marginalized populations and communities, including youth, women, Aboriginal groups, and people with disabilities. We have conducted evaluations of programs and services delivered in both in-person and computer-mediated environments. ACCERT’s faculty and associates are recognized experts in evaluation, research methods, and content areas ranging from assessment of learning in diverse contexts to specialized populations such as community members diagnosed with Fetal Alcohol Spectrum Disorders (FASD). In addition to more traditional summative (i.e., outcomes-oriented) evaluations, we also offer expertise in formative (i.e., improvement-oriented) evaluations and developmental (i.e., innovation-oriented) evaluations.
Cheryl’s research focuses on enhancing the learning environments within three contexts: organizations, classrooms, and clinical settings. Her projects examine a range of issues
concerned with developing and implementing innovative programs within dynamic situations. These issues include how to facilitate evidence-based program development in organizations (e.g., government ministries, school boards), how to provide learners (e.g., pre-service teachers, medical students) with timely access to feedback, and how to engage instructors (e.g., teachers, clinical educators) in innovative instructional developments. Across her projects, she collaborates with graduate students, clinicians, teachers, and academics; implements qualitative, quantitative, and mixed methods research designs; and disseminates research to participants, scholars, and practitioners using a variety of methods (e.g., papers, presentations, and websites). Importantly, these project findings contribute to developments in the fields of mixed methods research, program evaluation, qualitative research, and assessment.
Mixed Methods Research:
The use of mixed methods as a research vehicle is not new; quantitative and qualitative data have been collected together for many decades. The field of mixed methods continues to gain attention and acceptance among researchers because it provides access to insights that are inaccessible through either qualitative or quantitative research alone. Cheryl brings to the field of mixed methods expertise in a variety of qualitative and quantitative research methods as well as experience as the mixed methods practitioner for research teams. Current interests involve advancing integrative mixed methods research practices that are complexity-responsive and maintain methodological coherence among the research elements within dynamic research contexts.
Cheryl is an associate editor for the Journal of Mixed Methods Research (JMMR – IF=1.6 - 2016) and President of the Mixed Methods International Research Association (MMIRA). MMIRA provides individuals from across the world a venue for connecting, interacting, and discussing mixed methods research and practice issues (see http://MMIRA.org for more information). She is a member of the Mixed Methods Evaluation topical interest group of the American Evaluation Association and the Mixed Methods Research special interest group of the American Educational Research Association, and is a regular contributor to annual meetings of mixed methods researchers and to the Department of Educational Psychology graduate student Mixed Methods reading group. In 2016, Cheryl guest co-edited two special issues on mixed methods research with Dr. Anthony Onwuegbuzie in the International Journal of Qualitative Methods. Cheryl is currently working on a book under contract with Sage Publications (UK), Mixed Methods Research: A Practical Guide.
Representative Publications & Presentations
Poth, C. (in press). The curious case of complexity: Implications for mixed methods research practices. International Journal of Multiple Research Approaches.
Onwuegbuzie, A., & Poth, C. (2016). Editors' afterword: Toward evidence-based guidelines for reviewing mixed methods research manuscripts submitted to journals. International Journal of Qualitative Methods, 15(1), 1-13. doi:10.1177/1609406916628986
Poth, C., & Onwuegbuzie, A. (2015). Editorial: How mixed methods informs and enhances qualitative research. Special issue on Mixed Methods Research, International Journal of Qualitative Methods, 14(2), 1-4.
Poth, C., & Pei, J. (2016, August). Six habits of mixed methods researchers and evaluators enabling social change through complex educational interventions. Paper presented at the Mixed Methods International Research Association global conference, Durham, UK.
Poth, C. (2014). What constitutes effective learning experiences in a mixed methods research course? An examination from the student perspective. International Journal of Multiple Research Approaches, 8 (1), 74-86. doi:10.5172/mra.2014.8.1.74
Poth, C., McCallum, K., & Atkinson, E. (2014, June). Towards enhanced online teacher professionalism: A mixed methods examination of pre-service teachers’ perspectives. Paper presented at the Mixed Methods International Research Association conference, Boston, MA.
Poth, C. (2012). Exploring the role of mixed methods practitioner within educational research teams: A cross case comparison of the research planning process. Special issue on “Mixed Methods Research in Education,” International Journal of Multiple Research Approaches, 6, 315-332.
Turner, S. R., White, J., & Poth, C. (2012). Learning the CanMEDS roles in a near-peer shadowing program: A mixed methods randomized control trial. Medical Teacher, 34, 888-892. doi: 10.3109/0142159X.2012.716179
Developmental evaluation represents a radical shift from traditional evaluation approaches in that conducting evaluative inquiry is not predicated on pre-established evaluation goals, time constraints, or a detached role for the evaluator (Patton, 2010). The developmental evaluator is charged with stimulating discussions and using evaluative logic that in turn facilitates data-informed decisions supportive of ongoing organizational and program development. Cheryl’s specific research interest is in how developmental evaluation can promote evaluation use within dynamic contexts. Organizational theories informed by complexity science provide an innovative framework in which to anchor this research. A recent Social Sciences and Humanities Research Council of Canada Standard Research Grant has supported work in building an evaluation capacity network among early childhood stakeholders. Examples of recently completed and ongoing funded evaluations include work with the Alberta Mentoring Partnership, Alberta Centre for Child, Family and Community Research, Alberta Education, and Alberta Rural Development Network.
Cheryl is an editorial board member of the Canadian Journal of Program Evaluation (CJPE) and served as the Alberta/NWT chapter representative to the National Council from 2011 to 2013. She is a Credentialed Evaluator with the Canadian Evaluation Society, a member of the Canadian and American professional evaluation organizations, and a regular contributor to the annual meetings and publications of the CES and the American Evaluation Association (AEA). Recently, Cheryl guest co-edited a special issue on evaluation use with Dr. Michelle Searle in the Canadian Journal of Program Evaluation. Her graduate students have interned at Alberta Innovates, Evaluation and Research Services, and Community and University Partnerships.
Representative Publications & Presentations
Poth, C., Anderson-Draper, M., & El Hassar, B. (2017). Internship experiences: Influential mentoring practices for navigating challenges and optimizing learning during an evaluation internship experience. Canadian Journal of Program Evaluation, 31(3), 374-396. doi: 10.3138/CJPE.325
Poth, C., & Searle, M. (2017). Editorial Introduction – Setting the evaluation use context, Canadian Journal of Program Evaluation, 31(3), 275-283. doi: 10.3138/CJPE.387
Gokiert, R., Kingsley, B., Poth, C., & Tremblay, M. (2017, May). Fostering an enabling environment for meaningful evaluation: A network approach. Paper presented at the annual meeting of the Canadian Evaluation Society, Vancouver, BC.
Poth, C., Pei, J., Atkinson, E., & Hanson, T. (2016). Evaluating system change initiatives: Advancing the need for adapting evaluation practices. Canadian Journal of Program Evaluation, 31(2), 242-252. doi: 10.3138/CJPE.263
El Hassar, B., Poth, C., & Gokiert, R. (2016, October). Measuring evaluation capacity building: What do pilot instrument results tell us about the multi-method design process? Paper presented at the annual meeting of the American Evaluation Association, Atlanta, GA.
Poth, C., Lamarche, M. K., Yapp, A., Sulla, E., & Chisamore, C. (2014). Towards a definition of evaluation within the Canadian context: Who knew this would be so difficult? Canadian Journal of Program Evaluation, 29, 1-18, doi.10.3138/cjpe.BR Poth 1
Radil, A.I., Williams, J., Cormier, D.C., Pei, J., Poth, C., Seeger, S. & Regher, E. (2014). The Wellness Resiliency and Partnership (WRaP) Project (pp. 37). Edmonton, Alberta: University of Alberta.
Offrey, L., Leung, W., El Hassar, B., Pei, J., & Poth, C. (2014). Mackenzie network supportive living initiative: Evaluation design. Edmonton, Alberta: University of Alberta.
Yapp, A., Sulla, E., & Poth, C. (2014, November). Maintaining relevance in a post-secondary educational assessment course through developmental evaluation use. Paper presented at the annual meeting of the American Evaluation Association, Denver, CO.
Atkinson, E. Radil, A., Buhr, E., Tremblay, M., Pei, J., & Poth, C. (2014, October). The Prevention Conversation: Evaluating participant experiences of an FASD prevention initiative. Paper presented at the Alberta FASD Conference, Edmonton, AB.
Chudnovsky, K., & Poth, C. (2013, June). What constitutes impactful evaluation experiences for building CES competencies within a program evaluation course? Paper presented at the annual meeting of the Canadian Evaluation Society, Toronto, ON.
Poth, C., Brower, K., Yapp, A., & Pei, J. (2013). An Evaluation Framework: In support of ongoing development, improvement, and assessment. Report for the Alberta Mentoring Partnership, Edmonton, AB.
Poth, C., Howery, K., & Pinto, D. (2012). Addressing the challenges encountered during a developmental evaluation: Implications for evaluation practice, Canadian Journal of Program Evaluation, 26, 39-48.
Poth, C., Pei, J., & ACCERT team (2012). Fetal Alcohol Spectrum Disorder (FASD) 10-year strategic plan: A summary of the year five evaluation project. Report for Alberta Centre for Child, Family, and Community Research, Edmonton, AB.
Advances in qualitative research involve new procedures for generating and integrating data. Qualitative inquiry continues to demand attention as new approaches emerge and gain credibility. Current interests are focused on enhancing access to practical resources, opportunities for learning about innovative qualitative approaches, and integrating technology into data collection and analysis procedures.
Cheryl is an editorial board member of the International Journal of Qualitative Methods (IJQM, IF=0.62 - 2016) and a member of the advisory board as well as a member scholar of the International Institute for Qualitative Methodology (IIQM). She is a regular contributor to the Thinking Qualitatively workshop series.
Representative Publications & Presentations
Creswell, J., & Poth, C. (2017). Qualitative inquiry & research design (4th ed.). Thousand Oaks, CA: Sage.
Poth, C., Pei, J., Job, J., & Wyper, K. (2014). Towards intentional, reflective, and assimilative classroom practices with students with FASD. The Teacher Educator, 49, 247-264. doi: 10.1080/08878730.2014.933642
Job, J., Poth, C., Pei, J., Wyper, J., O’Riordan, T., & Taylor, L. (2014). Combining visual methods with focus groups: An innovative approach for capturing the multifaceted and complex work of FASD prevention specialists. Special issue on International Perspectives on Fetal Alcohol Spectrum Disorders, International Journal of Alcohol and Drug Research, 3(1), 71-80. doi: 10.7895/ijadr.v3i1.129
Job, J., Poth, C., Pei, J., Carter-Pasula, B., Brandell, D., & Macnab, J. (2013). Toward better collaboration in the education of students with Fetal Alcohol Spectrum Disorders: Voices of teachers, administrators, caregivers, and allied professionals. Qualitative Research in Education, 2(1), 38-64. doi: 10.4471/qre.2013.15
Poth, C., Pei, J., & Job, J.* (2012, May). Towards Collaborative Research Practices: A case study with The Alberta Centre for Child, Family and Community Research. Paper presented at the 8th International Congress of Qualitative Inquiry, Urbana-Champaign, IL.
Alberta Education (2007) identifies the assessment of student learning as one of the most complex responsibilities teachers undertake, one that requires the application of professional judgment. Many policies seek to embed both assessment for learning (also called formative assessment) and assessment of learning (also called summative assessment). Together, these form a more balanced framework that can empower students as partners in their own learning. This shift in practice requires developing the knowledge, skills, and attributes for a new classroom assessment approach. Cheryl’s research interests build upon her experiences as a classroom teacher, exploring the development of pre-service and beginning teachers’ classroom assessment practices. She is also interested in the relationship between student motivation and assessment practices that support learning, inform instructional practice, and communicate achievement.
Cheryl is a member of Canadian and American professional educational research organizations and regularly contributes to the annual meetings of the American Educational Research Association (AERA) and Canadian Society for Studies in Education (CSSE).
Representative Publications & Presentations
Daniels, L. M., & Poth, C. (2017). Relationships between pre-service teachers’ conceptions of assessment, approaches to instruction, and assessment: An Achievement Goal Theory perspective. Educational Psychology, 7. doi: 10.1080/01443410.2017.1293800
Poth, C. (2017, April). A mixed methods investigation of influences to and impacts of participation in formative assessments. Paper presented at the annual meeting of the American Educational Research Association, San Antonio, TX.
Poth, C., McCallum, K., & Tang, W. (2016). Teacher e-professionalism: An examination of western Canadian pre-service teachers’ perceptions, attitudes, and Facebook behaviours. Alberta Journal of Educational Research, 62, 39-60.
Poth, C., Riedel, A., & Luth, R. (2015). Framing student perspectives into the higher education institutional review policy process. Canadian Journal of Higher Education, 45(4), 361-382. Retrieved from http://ojs.library.ubc.ca/index.php/cjhe/article/view/184831
Pei, J., Job, J., Poth, C., O’Brien-Langer, A., & Tang, W. (2015). Enhancing learning environments for students affected by Fetal Alcohol Spectrum Disorders: An exploratory study of Canadian pre-service teacher knowledge and conceptions. Journal of Education and Training Studies, 3(5), 134-143. doi: 10.11114/jets.v3i5.955
Daniels, L. M., Poth, C., Hutchison, M., & Papile, C. (2014). Validating the Conceptions of Assessment-III scale in Canadian pre-service teachers. Educational Assessment Journal, 19, 139-158. doi: 10.1080/10627197.2014.903654
Poth, C. (2013). What assessment knowledge and skills do initial teacher education programs address? A western Canadian perspective. Alberta Journal of Educational Research, 58, 634-656.
Pei, J., Job, J., Poth, C., & Atkinson, E. (2013). Assessment for intervention of children with Fetal Alcohol Spectrum Disorders: Perspectives of classroom teachers, administrators, caregivers, and allied professionals. Psychology, 4(3A), 325-334. doi: 10.4236/psych.2013.43A047
Miller, T., Wagner, A., Poth, C., & Daniels, L. (2013, June). Do students choose formative opportunities? A cross-case comparison of two instructional approaches. Paper presented at the annual meeting of the Canadian Society for Studies in Education, Victoria, BC.
Poth, C. (2013, January). Why are pre-service teachers not prepared for their assessment responsibilities? Keynote presented at the Education Society in Edmonton, Edmonton, AB.
Poth, C. (2012, August). Enhancing the learning environment in large class, multi-sectional courses: Key features of a team instructional approach. Workshop presented at the Centre for Teaching and Learning’s Teaching Big Symposium: The Joy of Large Classes, Edmonton, AB.
Luhanga, U., Leighton, L., & Poth, C. (2012, June). Beyond conceptual boundaries: Using student feedback to enhance teaching and learning in higher education. Paper presented at the annual meeting of the Society for Teaching and Learning in Higher Education, Montreal, QC.
Reidel, A., Poth, C., & Luhanga, U. (2012, May). A mixed methods design exploring underlying reasons for student assessment preferences. Paper presented at the annual meeting of the Canadian Society for Studies in Education, Waterloo, ON.
Health Sciences Education
Innovative approaches to health sciences education are constantly emerging. Among the most recent are competency-based medical education and inter-professional education learning opportunities. Specifically, health education has undergone a shift towards competency-based models of education in response to calls for greater accountability in the professions. Competency-based medical education approaches physician training by focusing on outcomes, emphasizing abilities, de-emphasizing time-based training, and promoting greater learner-centeredness (Frank, Mungroo, Ahmad, Wang, DeRossi, & Horsley, 2010). Recent research highlights feedback and clinical opportunities tied to progress towards achievement of competency as critical supports for learning (Tetzlaff, Dannefer, & Fishleder, 2009).
Dr. Poth is involved in health sciences education innovations as a methodologist and evaluator. Previously she was the principal evaluator for a 3-year inter-professional education initiative at Queen’s University funded by Health Canada. She collaborates with the Department of Family Medicine on the development and implementation of their Competency-Based Achievement System (CBAS). CBAS is an innovative framework that emphasizes assessment for learning in the development of competencies for Family Medicine residents. Dr. Poth has also served as a research methodologist with the Health Sciences Council at the University of Alberta in the development of instruments to measure medical communication skills.
Representative Publications & Presentations
Hall, M., Poth, C., Manns, P., & Beaupre, L. (2016). An exploration of physiotherapists’ decisions whether to supervise physiotherapy students: Results from a national survey. Physiotherapy Canada, 68(2), 141-148. doi: 10.3138/ptc.2014-88E
Hall, M., Manns, P., Poth, C., & Beaupre, L. (2016). Examining the need for a new instrument to evaluate Canadian physiotherapy students during clinical education experiences. Physiotherapy Canada, 68(2), 151-155. doi: 10.3138/ptc.2014-89E
Ross, S., Humphries, P., Poth, C., & Donoff, M. (2013, June). Competency-based education: One program’s experience. Paper presented at the annual meeting of the Canadian Society for Studies in Education, Victoria, BC.
Turner, S. R., White, J., & Poth, C. (2012). 12 tips for developing a near-peer shadowing program. Medical Teacher, 34, 792-795. doi: 10.3109/0142159X.2012.684914
Turner, S. R., White, J., Poth, C., & Rogers. W. T. (2012). Preparing students for clerkship: A resident shadowing program, Academic Medicine, 87, 1288-1291.
Ross, S., Poth, C., Donoff, M., Papile, C., Humphries, P., Stasiuk, S., & Georgis, R. (2012). Involving users in the refinement of the competency-based achievement system: An innovative approach to competency-based assessment, Medical Teacher, 34, e143-e147. doi: 10.3109/0142159X.2012.644828.
Cheryl’s teaching philosophy represents her approach to facilitating effective teaching and learning environments. Her approach is continually being shaped by her experiences as a learner and instructor, as well as by her beliefs and research related to what facilitates learning. To that end, she seeks opportunities that will engage students in their own learning process through implementing innovative learning strategies aimed at extending their existing knowledge and relevant skills.
Cheryl continually seeks feedback from her students, reflects on her interactions with students, and adapts her learning goals, instructional activities, and assessment methods. Her view of an effective instructor is one who facilitates rich learning environments by providing responsive and supportive mentorship during the learners’ journey. This view is well encapsulated in Louisa May Alcott’s assertion: “I am not afraid of storms, for I am learning how to sail my ship.” She views her role as a university teacher as a privilege as well as a great responsibility. The importance of conceptualizing and modeling learning as a lifelong process cannot be overstated given her role as a mentor to future professionals (inclusive of but not limited to teachers, counselors, and researchers). Cheryl was awarded the University of Alberta’s Provost’s Award for Early Achievement of Excellence in Undergraduate Teaching in 2013.
Cheryl’s campus-based course offerings span four areas: mixed methods, program evaluation, research design, and classroom assessment. Please see the University Calendar for course descriptions and Bear Tracks for upcoming course offerings.
EDPY 604 Mixed Methods Research is a doctoral course offered every second year with the aim of introducing course participants to the knowledge and skills required for undertaking a mixed methods study. This course embeds practical applications of mixed methods research-specific competencies and draws upon background coursework and/or experiences with qualitative and quantitative research data and methods.
EDPY 615 Program Evaluation is a doctoral course offered every fall (and on occasion in winter term) with the aim of introducing course participants to the complexity of social program evaluation as a consultative process. This course embeds a community service component to provide practical application of the evaluation design process.
EDPY 501: Introduction to Methods of Educational Research is a required masters course within many of our departmental programs. It is offered most terms with the aim of introducing course participants to the decisions and processes involved in educational research. This course embeds a practical component and works towards producing a research proposal in the area of interest of the course participant.
EDPY 303 Educational Assessment is a required undergraduate course within our teacher education program; multiple sections are offered on campus in both fall and winter terms, and collaborative programs also offer off-campus options. As coordinator of the EDPY 303 Educational Assessment course, Cheryl, in collaboration with course instructors and graduate teaching assistants, introduces pre-service teachers to current assessment practices using a team instructional approach.
Cheryl is a regular contributor to webinars and workshops on campus and globally. Most recently she has offered workshops for the Mixed Methods International Research Association, the International Institute for Qualitative Methodology, a national teaching conference in Norway, and the Centre for Teaching and Learning at the University of Alberta. The following are examples of recently offered workshops.
Comparing Five Qualitative Research Approaches: Exploring differences and similarities across study designs and procedures
This session prepares participants to choose a qualitative approach that best fits their study purpose. Together we will explore the designs and procedures inherent to five qualitative research approaches: narrative research, phenomenology, grounded theory, ethnography, and case study. The session will be organized around four key questions: What are the origins and defining features of each approach? What types of data and methods are associated with each approach? What data analysis and writing structures are commonly used for each approach? What challenges and ethical considerations are likely to be encountered with each approach? Participants are encouraged to bring study ideas that they can explore during the interactive workshop. Participants are also encouraged to attend fundamentals in qualitative research and more advanced workshops specific to one of the approaches.
Using Case Studies in Qualitative Research: A Primer
With a focus on the topic of case study research, this workshop is intended to provide an introductory overview of the approach spanning planning considerations to conducting procedures. Case studies have a rich history spanning disciplines as a strategy of inquiry, a
methodology, and a research method. This session will be organized around three key questions: What are the defining features of a case study? What must be considered when planning case study (or multi-case) research? What are the procedures for conducting a case study (or multi-case study)? Various hands-on activities will be incorporated into the workshop with the aim that participants gain not only a theoretical but also a practical understanding of doing case study research. Participants are encouraged to bring ideas for case studies that they can explore during the interactive workshop. Opportunities for applying their own areas of interest and/or projects to discuss in small groups will be embedded.
Qualitative Analysis Bootcamp: Practical Guidance for Beginning Your Analysis
This session provides an overview of the process of qualitative data analysis and a general introduction to qualitative data analysis software. It aims to unpack the process of qualitative data analysis and provide practical guidance, because analyzing text and multiple other forms of data presents a challenge for qualitative researchers. The process of analysis spans managing and organizing the data, reading and memoing emergent ideas, describing and classifying codes and themes, developing and assessing interpretations, and representing and visualizing the data. This session is organized around the following questions: How can the overall data analysis be conceptualized as a spiral and what are the key decision points? What are some issues that should be anticipated and how might they be mitigated? What does qualitative data analysis software offer and how do the programs differ? Participants are advised that this is an interactive workshop. Participants are also encouraged to attend fundamentals in qualitative research and/or coding and software workshops.
Introducing and Unpacking the Design Process of Mixed Methods Research: Fundamentals of the Field and Approach
With a focus on introducing those with little or no knowledge of mixed methods research to the field, this session develops a conceptual foundation for how to design and convey a mixed methods research study plan in any discipline. Designing and conveying mixed methods research studies requires specific knowledge and skills within the field. This session will situate the field of mixed methods and introduce the concept of methodological congruence and what it means for enhancing rigour within mixed methods research. We will then unpack the design process to illuminate key decision points related to the point(s) of interface for qualitative and quantitative types of data. We will discover how these decisions either align with existing mixed methods designs or lead to new designs. In the afternoon we will engage in the process of designing a mixed methods study in small groups. Finally, we will outline writing structures for successful research proposals and engage in discussion about effective strategies for defending the design to those new (or not!) to mixed methods research. Participants are encouraged to bring ideas for mixed methods studies that they can explore during the workshop and to build upon existing knowledge of qualitative and quantitative research design processes. Opportunities for applying their own areas of interest and/or projects will be embedded. Participants are also encouraged to attend the Managing and Communicating Mixed Methods Research workshop.
Managing and Communicating Mixed Methods Research: Advanced Practice
With a focus on advancing the knowledge and skills of those with some familiarity and/or experience with mixed methods research, this session develops a conceptual foundation for how to manage and communicate a mixed methods research study in any discipline. Managing and communicating mixed methods research studies requires specific knowledge and skills in data procedures and writing structures. This session will unpack the process of managing a mixed methods research study and provide practical guidance for identifying opportunities for presenting and publishing research outcomes. The session is organized around the following questions: How can the overall data procedures be conceptualized and what are the key decision points? What are common pitfalls that should be anticipated and how might these be mitigated? What are the criteria for evaluating quality in mixed methods research manuscripts and how are these similar to and different from those for qualitative and quantitative research? Participants are encouraged to bring ideas and/or current mixed methods studies that can be developed during the workshop, as practical opportunities for applying their emerging understandings will be embedded. Some familiarity with mixed methods research is required, and participants are encouraged to have attended the Introducing and Unpacking the Design Process of Mixed Methods Research workshop or to have equivalent experience.