Conference Workshops

Workshop A – Justin Shaffer & Brian Sato, UC Irvine
Hands-On Design of an Education Research Study in Your Classroom: From Start to Finish
Study design is one of the most difficult and essential parts of rigorously assessing classroom teaching and pedagogical methods. True control groups are often impossible to come by, and confounding factors are abundant. This workshop is offered for novice and advanced educators alike who would like to learn how to conduct effective educational research studies in their college science classrooms. Participants will work with the facilitators to critique published educational research studies and identify the essential factors that must be taken into account when designing a study. Participants will also design a study that they can conduct at their home institutions and will receive feedback on that design from their peers and the facilitators. The facilitators will also share their experiences designing, conducting, analyzing, and publishing educational research studies in lower- and upper-division college biology classes at UC Irvine, and discuss how this research has informed and improved their teaching.

Workshop B – Wenliang He, UC Irvine
Data Manipulation and Exploratory Analysis using R
Statistical analysis is crucial for assessing the effectiveness of a variety of instructional techniques, yet STEM instructors often find themselves ill-equipped with the tools and skills needed to analyze their own data. This workshop will focus on data manipulation and exploratory analysis. It starts with a short crash course on R, a powerful tool for statistical analysis. We will then turn to data manipulation, reshaping existing data and creating new variables. Subsequent exploratory analysis includes generating descriptive statistics and using a variety of graphs to examine the distributions of, and relations between, variables. We will also learn to output descriptive statistics in tables of publishable quality and to use exploratory analysis to check data integrity and inform subsequent statistical modeling. By the end of the workshop, participants should be able to carry out routine data manipulation and exploratory data analysis tasks in R.
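As a flavor of what these steps look like in practice, the sketch below uses a made-up gradebook (the workshop's actual datasets and packages are not specified in this description) to create a new variable, compute descriptive statistics by group, draw simple exploratory graphs, and export a small descriptive table in base R.

    # Minimal base-R sketch; the gradebook data are invented for illustration.
    set.seed(1)
    grades <- data.frame(
      section  = rep(c("A", "B"), each = 50),
      pretest  = round(rnorm(100, mean = 62, sd = 10)),
      posttest = round(rnorm(100, mean = 74, sd = 12))
    )

    # Data manipulation: create a new variable from existing columns
    grades$gain <- grades$posttest - grades$pretest

    # Exploratory analysis: descriptive statistics and simple graphs
    summary(grades$gain)                                   # five-number summary plus mean
    aggregate(gain ~ section, data = grades, FUN = mean)   # group means
    hist(grades$gain, main = "Distribution of score gains", xlab = "Gain")
    boxplot(gain ~ section, data = grades, ylab = "Gain")  # relation between variables

    # A small descriptive table that could be exported toward a manuscript
    desc <- aggregate(cbind(pretest, posttest, gain) ~ section, data = grades, FUN = mean)
    write.csv(desc, "descriptives_by_section.csv", row.names = FALSE)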

Workshop C – Debra Pires & Liz Roth-Johnson, UC Los Angeles
How to create the "right" question: Aligning assessment instruments with student learning outcomes
How do we, as instructors, know how much our students have learned? Measuring student learning in a meaningful and systematic way depends on the development and implementation of appropriate assessments. This workshop will introduce participants to different types of assessments (formative and summative) and provide hands-on practice creating assessment questions aligned with learning outcomes specific to participants' own courses. We will focus specifically on creating suitable multiple-choice and multiple true-false assessments, which can be readily implemented in classrooms of any size and can be used to measure higher-order levels of student thinking. Such assessments provide a useful way for instructors to track student learning both within a single course and across multiple courses. Participants are strongly encouraged to bring at least three student learning outcomes and a copy of a past exam to use as a starting point for the workshop.

Workshop D – Victoria Bhavsar, Cal Poly Pomona
Project-planning a SoTL Investigation with Completion as the Goal
SoTL projects are especially vulnerable to interruption when other priorities arise, and detailed project planning can keep a SoTL project on track to completion. This workshop presents a planning methodology for SoTL projects: identifying the large pieces of the project and clarifying milestones for their completion, identifying a full complement of actionable steps for each piece, and planning action timelines. Participants will also select options that work for them from an array of research-based best practices for making progress on writing projects. The workshop applies to a wide variety of disciplines, because planning a SoTL project with the goal of keeping it moving is not tied to a specific disciplinary orientation. The session will begin with an overview of project planning methodology tailored to SoTL. Participants will then work with a case study, using the case project's research question and course description to work through the planning steps, and will finally consider their own SoTL questions, articulating specific steps to move their projects forward. The case study is based on a project in a general education agricultural science class that investigated a multi-step writing assignment. The study examined differences between inexperienced and experienced students in 1) the complexity of the questions generated, 2) the quality of the information located, and 3) the accuracy of students' assessments of information quality. More advanced students demonstrated more robust questioning and better information literacy skills, and a student survey showed that students considered the writing assignment a valuable learning experience.

Workshop E – William Grisham, UC Los Angeles
Statistical and Design Considerations in Discipline Based Educational Research
Integrating teaching, assessment, and education research to enhance student learning in STEM requires 1) developing suitable assessment instruments, 2) using research designs with sufficient statistical power to detect significant effects, and 3) designing appropriate controls and comparison groups so that results can be meaningfully interpreted. This workshop will help participants develop suitable assessment instruments and design studies that are both appropriately powered and well considered.

Assessment instruments need to be both reliable and valid. This workshop will examine common means of determining an assessment instrument's reliability (test-retest, split-half, and Cronbach's alpha) as well as means of making instruments more reliable via item analysis. It will also examine types of validity (content and construct) and ways to determine these types of validity numerically.
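For a concrete sense of the reliability calculations involved, the sketch below computes Cronbach's alpha directly from its definition and a simple corrected item-total correlation for item analysis. The item responses are invented for illustration and are not workshop materials.

    # Hypothetical item-level responses: 5 items scored 0/1 for 8 students
    items <- matrix(c(1,1,1,0,1,
                      1,0,1,1,1,
                      0,0,1,0,0,
                      1,1,1,1,1,
                      0,1,0,0,1,
                      1,1,0,1,1,
                      0,0,0,0,1,
                      1,1,1,1,0),
                    ncol = 5, byrow = TRUE)

    # Cronbach's alpha: (k/(k-1)) * (1 - sum of item variances / variance of total score)
    cronbach_alpha <- function(x) {
      k         <- ncol(x)                 # number of items
      item_var  <- apply(x, 2, var)        # variance of each item
      total_var <- var(rowSums(x))         # variance of the total score
      (k / (k - 1)) * (1 - sum(item_var) / total_var)
    }
    cronbach_alpha(items)

    # Simple item analysis: correlation of each item with the rest of the scale;
    # low or negative values flag items that hurt reliability
    item_total <- sapply(seq_len(ncol(items)),
                         function(i) cor(items[, i], rowSums(items[, -i])))
    item_total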

Further, this workshop will examine statistical considerations in Discipline Based Education Research, including statistical power and effect size. Studies should be sufficiently powered to detect differences when they are real, and there are design considerations that make this more likely. Effect sizes are measures that relate to external validity and the degree of real-world applicability of findings.
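These power and effect-size ideas can be explored with base R's power.t.test(); the numbers below are illustrative assumptions rather than recommendations from the workshop.

    # How many students per group are needed to detect a difference of 5 points
    # (SD = 12) with 80% power at alpha = .05 in a two-group comparison?
    power.t.test(delta = 5, sd = 12, sig.level = 0.05, power = 0.80)

    # Equivalently, state the difference as a standardized effect size
    # (Cohen's d = delta / sd); d = 5/12 is roughly a small-to-medium effect
    d <- 5 / 12
    power.t.test(delta = d, sd = 1, sig.level = 0.05, power = 0.80)

    # After a study, an observed effect size can be computed from group summaries
    cohens_d <- function(m1, m2, s1, s2, n1, n2) {
      pooled_sd <- sqrt(((n1 - 1) * s1^2 + (n2 - 1) * s2^2) / (n1 + n2 - 2))
      (m1 - m2) / pooled_sd
    }
    cohens_d(m1 = 78, m2 = 73, s1 = 11, s2 = 13, n1 = 60, n2 = 60)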

Lastly, this workshop will consider threats to internal validity, including extraneous history, subject maturation, repeated testing, instrumentation changes, subject mortality, and regression to the mean. Many of these threats can be sidestepped either by taking sensible precautions or by adding appropriate comparison groups.

Workshop F - Stanley Lo, UC San Diego
Qualitative data in educational assessment: What are they and how do we deal with them?
This interactive workshop will engage participants in hands-on analysis of qualitative data in educational research and assessment. Through these activities, participants will learn about sources of data that can be used to assess student outcomes and how to collect them, analyze and code data, discuss the importance of reliability and validity, explore different qualitative methodologies and theoretical frameworks, and apply these understandings to their own work. Specifically, we will begin by examining three sets of sample data: how faculty write exam questions, what students see as the purpose of undergraduate laboratory courses, and how students approach problem solving in biology. These data sets are chosen to highlight increasingly sophisticated ways of handling qualitative data: analyzing data with an established coding scheme, determining inter-rater reliability, developing a coding scheme de novo, ensuring validity, and using theoretical frameworks in data analysis. After learning the fundamentals of working with qualitative data, participants will develop plans for collecting and analyzing qualitative data to assess student outcomes in their own courses.
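As one small example of the quantitative side of qualitative coding, the sketch below computes simple percent agreement and Cohen's kappa for two hypothetical raters applying a made-up coding scheme; the workshop's actual data sets are not reproduced here.

    # Hypothetical codes assigned by two raters to the same ten student responses
    rater1 <- c("concept", "procedure", "concept", "other", "procedure",
                "concept", "other", "concept", "procedure", "concept")
    rater2 <- c("concept", "procedure", "other",   "other", "procedure",
                "concept", "other", "concept", "concept",   "concept")

    agreement <- mean(rater1 == rater2)    # simple percent agreement

    # Cohen's kappa corrects agreement for chance, using the confusion matrix
    tab <- table(rater1, rater2)
    po  <- sum(diag(tab)) / sum(tab)                       # observed agreement
    pe  <- sum(rowSums(tab) * colSums(tab)) / sum(tab)^2   # chance agreement
    kappa <- (po - pe) / (1 - pe)

    agreement
    kappa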

Workshop G – Catherine Nameth, UC Los Angeles
Getting the data you need: Designing student surveys for your course or program
STEM faculty and instructors are experts in their subject matter but rarely have much, if any, formal training in assessing student learning. Staff and administrators of STEM programs may be proficient in program management but lack the knowledge needed to design surveys that gather student perceptions of their learning in a STEM course or program. It can be tricky, if not daunting, to design an assessment tool that is targeted to the appropriate group of students or program and that gathers not just any data, but demographic, quantitative, and qualitative data useful to faculty and staff. Through group work and discussion, and using an organizational chart designed by the presenter, participants will gain practical knowledge for designing student surveys that are context-specific, clear, and free of assumptions.