Handing Rose-Colored Glasses to STEM Faculty: Institutional Priorities in Disciplinary Identity Development

“What gets measured gets managed.” – Quote often attributed to Peter Drucker

As departments decide what criteria to use for reappointment, promotion, and tenure, they need to understand how their choices shape what it means to contribute to a disciplinary field. As it stands, many of the criteria sidestep measures of quality teaching and emphasize discipline-specific research output, such as external funding. With faculty scrambling to produce another publication and fulfill their service requirements, there is often little evidence available beyond student evaluations for judging an instructor's quality of teaching. If departments do not value systematic reflection on teaching practices in a tangible way, they are handing instructors rose-colored glasses for viewing their teaching. When department chairs or deans base tenure and promotion primarily on outputs unrelated to teaching and student learning, they enable instructors to see their teaching through a self-serving bias.

In this post, we offer recommendations for supporting scholarship on teaching and learning, based on a Community of Practice (CoP) framing of disciplinary identity and illustrated with two examples from our ongoing research.

Our Research Context at Hispanic-Serving Institutions

Our research examines incentives for and barriers to undergraduate STEM faculty conducting scholarship on their teaching, in the context of Hispanic-serving institutions (HSIs)ᵃ. In conducting and analyzing 40 interviews, we found that some faculty were reflective about their teaching practice but lacked departmental incentives to rigorously investigate whether their classes were having their intended impact on students.

For example, some faculty spoke confidently about their teaching based on thank-you messages from past students. Several platforms exist that instructors can use to evaluate the equity of their teaching practices (e.g., EQUIP, GORP, and LASSO), yet faculty rarely reported using such tools to judge their impact on students. If instructors' teaching ability is affirmed and measured by student evaluations, it is unsurprising that they do not take the time to reflect in a more systematic way.

How Communities of Practice Shape STEM Disciplinary Identity

Wenger1 developed a conception of identity in the context of Communities of Practice that pivots between the individual and the social perspective, so that each can be discussed with reference to the other. Identity in a CoP can be understood in terms of a person's perception of self as they participate in the negotiation of meaning. The negotiation of meaning, the process by which a community understands and interprets the world, occurs through the complementary actions of participation and reification. Participation refers to taking part in activities associated with a given community; for a faculty member, examples include teaching a lesson, collecting data, or discussing an idea with a colleague. Reifications are the concrete objects a community produces to give form to its joint experiences; examples include a publication, writing on a whiteboard, or an award for outstanding teaching. Reifications often reflect what the community finds meaningful. It is through lived experiences of participation, the creation of reifications, and seeing their own impact on others through these activities that people construct who they are.

When we use this framing to understand how disciplinary identity development for STEM faculty relates to institutional support for their research and teaching, a pattern emerges: the reifications affirmed by the community are publications unrelated to teaching.

How STEM Instructors Measured Their Impact

Instructors tended to evaluate their own teaching in ways subject to self-serving bias. Some faculty frankly reflected that they do not know their impact on students. Others reported assessing their impact by considering course evaluations, grades on assignments and exams, classroom interactions, office-hour interactions, feedback from past students, and peer observations. These avenues for assessing impact are open to implicit bias and to cherry-picking anecdotes. Exams are often written by the instructors themselves. Many instructors attributed poor student performance to student factors rather than to their own teaching. And student evaluations have been found to be biased by a number of confounding factors, such as students' gender or racial preferences.2,3 Faculty can push their departments to take more rigorous approaches to evaluating instruction by sharing research on the large biases in student evaluations of teaching that privilege White men.4-6

We interviewed 40 STEM faculty across chemistry, physics, biology, and mathematics. Four of these faculty were discipline-based education researchers. Of the 36 who were not, only two had attempted a systematic assessment of their own teaching beyond student evaluations, and both reported not getting much further in the process than data collection. The non-systematic ways STEM instructors measured their impact seem likely to perpetuate inequities in the classroom rather than support the diversification of the field, which is particularly troubling in the HSI context. Yet it is understandable that faculty have not engaged more with education research if we approach the question from a CoP perspective.

Disciplinary Identity and Scholarship on Teaching and Learning

The development of a disciplinary identity can be at odds with doing research on teaching and learning. Denita, a full professor of mathematics, stated:

“I think what really matters, at least in my department, is published papers. […] The teaching, as long as it’s okay, you didn’t get into trouble. […] I certainly have to show papers in my area. So maybe out of the five, maybe I could get away with one on teaching. But I need to make sure that I have papers in my own discipline.”

From Denita's perspective, the valued reifications in her disciplinary CoP did not include papers on teaching and learning. Rather, a paper on teaching would be something she had to "get away with." Umay, a chemistry associate professor, explained:

“In terms of my promotion, the language that’s used is research from your own lab […] Like I should be able to do educational research, right? But I’m not an education faculty. So it’s kind of complicated. I mean, you want to grow in your field, but there’s [only] so much you can do, because then you’re not going to have time to do hardcore research.”

For Umay, the documents outlining promotion reify that education research is not a valued activity in her CoP. Engaging in educational research would therefore not help develop her identity as a chemist, and might even threaten it by taking time away from her other research.

Call to Action: Affirming Disciplinary Identity

Institutions and departments can emphasize research on teaching in tangible ways that affirm it as a valuable form of participation in the CoP. Research on teaching and learning can strengthen a field and broaden the diversity of people who enter it. Denita continued that it would make sense for her department to cultivate respect for research on teaching and learning because "we are an HSI and we can make a difference in providing understanding and knowledge to the nation on best practices for Latino students and first-generation college students." Departments can and should reorganize tenure and promotion to value measures of teaching beyond student evaluations.

If departments gave research on teaching more weight as faculty make their tenure cases, that would affirm the value of scholarship on teaching and learning as part of participation in the disciplinary CoP.

To engage in reflective scholarship on their teaching, faculty can build collaborations with education researchers in their discipline who span the boundaries of both communities. We foster such collaborations through workshops that support faculty in conducting systematic reviews of their teaching, which advance equity and can lead to publications and grant funding. Systematic reflection need not be this formal: it can start with a simple observation or question, some reliable data, and a few colleagues to reflect with and learn from. For example, the LASSO platform lets instructors administer research-based assessments outside of class and supports examining the overall efficacy and equity of a course as measured by those data. The EQUIP tool lets instructors quantify who talks during class and use that information to look for inequities in their teaching. Reflecting on results from either tool with colleagues can raise hard questions and lead to adopting well-established solutions, such as the Learning Assistant Model,7 or to developing and testing novel ones.

Science has trained us to believe that science is objective and neutral. As educators, we may be inclined to rely on that neutrality to distance ourselves from the inequities and injustices that our instruction perpetrates and perpetuates.

But the science shows that we can choose strategies that broaden participation, and that we hold considerable sway over our students' outcomes.

When instructors used Learning Assistants, their failure rates were almost half those of the same instructors teaching without them (17.1% versus 31.1%),8 and these benefits improved equity in course outcomes. Students also learn more in courses that use active-engagement strategies,9 and the more active engagement, the better.10 One hundred and seven schools have Learning Assistant programs, and 449 schools have Supplemental Instruction programs; both can provide educators with resources and a network of peers for reflecting on and transforming their instruction.

Develop a plan, attend a workshop, make a small change, collect and analyze some data, find collaborators, build momentum, and ask your institution for the resources to meet its commitments to diversity. We are not neutral umpires calling balls and strikes; as educators, we set the rules. As scientists, we can draw on our skills and training to find and use tools that both broaden participation in the sciences and better prepare students.


Footnotes

ᵃ HSIs: Institutions with 25% or more full-time Hispanic students can apply for HSI status.11


Acknowledgements

The authors thank all the members of the STEM Equity Project (www.stemequity.net) for setting up the work that led to this investigation. Special thanks to Eleanor Close and Ben Van Dusen who worked closely with the data that led to the ideas presented here. This research was supported by the National Science Foundation’s Improving Undergraduate STEM (IUSE) program under the award DUE-1928596. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect views of the National Science Foundation.

References