This project aims to enhance student learning in foundational engineering courses through oral exams. The adaptive, dialogic nature of oral exams gives instructors an opportunity to better understand students' thought processes, thus holding promise for improving both the assessment of conceptual mastery and students' learning attitudes and strategies. However, the issues of oral exam reliability, validity, and scalability have not been fully addressed. As with any assessment format, careful design is needed to maximize the benefits of oral exams to student learning and minimize potential concerns. Compared to traditional written exams, oral exams have a unique design space involving a large range of parameters, including the type of oral assessment questions, the grading criteria, how the exams are administered, how questions are communicated and presented to students, how feedback is provided, and other logistical considerations such as the weight of the oral exam in the overall course grade and the frequency of oral assessment. To address scalability in high-enrollment classes, a key element of the project is the involvement of the entire instructional team (instructors and teaching assistants). The project will therefore create a new training program that prepares faculty and teaching assistants to administer oral exams, including considerations of issues such as bias and accommodations for students with disabilities. The purpose of this study is to create a framework for integrating oral exams into core undergraduate engineering courses, complementing existing assessment strategies, by: (1) creating a guideline for optimizing the oral exam design parameters to achieve the best student learning outcomes; and (2) creating a new training program to prepare faculty and teaching assistants to administer oral exams. The project will implement an iterative design strategy using an evidence-based approach to evaluation.
The effectiveness of the oral exams will be evaluated by tracking student improvement on conceptual questions across consecutive oral exams in a single course, as well as across subsequent courses. Since its start in January 2021, the project has been well underway. In this poster, we present a summary of the results from year 1: (1) an exploration of the oral exam design parameters and their impact on students' engagement with, and perception of, oral exams for learning; (2) the effectiveness of the newly developed instructor and teaching assistant training programs; (3) the development of evaluation instruments to gauge the project's success; and (4) instructors' and teaching assistants' experiences and perceptions.
Nate Delson, UCSD; Curt Schurgers, UCSD; Marko Lubarda, UCSD; Carolyn Sandoval, UCSD.