
Quantitative Psychology Brownbag

Y. Andre Wang
Mon, November 14, 2022
12:30 pm - 1:30 pm
Zoom

Dr. Y. Andre Wang
Department of Psychology  
University of Toronto, Scarborough

Title: Power Analysis for Parameter Estimation in Structural Equation Modeling

Abstract: Despite the widespread and rising popularity of structural equation modeling (SEM) in psychology, there is still much confusion surrounding how to choose an appropriate sample size for SEM. Dominant guidance on this topic consists primarily of sample-size rules of thumb that are not backed by research, along with power analyses for detecting model misspecification. Missing from most current practice is power analysis for detecting a target effect (e.g., a regression coefficient between latent variables). In the first part of my talk, I distinguish between power to detect model misspecification and power to detect a target effect, report the results of a simulation study on the latter type of power, and introduce a user-friendly Shiny app, pwrSEM, for conducting power analysis for detecting target effects in structural equation models. In the second part of my talk, I reflect on the pros and cons of building a user-friendly statistical tool, and I consider epistemological reasons for (vs. against) conducting power analysis for complex models in the first place.
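The kind of power analysis the abstract describes is simulation-based: repeatedly generate data under an assumed effect size, fit the model, and count how often the target parameter is detected. As a hedged illustration of that logic (not the pwrSEM implementation, which fits full latent-variable models in R), the sketch below uses an ordinary regression slope as a stand-in for a target coefficient; the effect size, sample size, and alpha level are placeholder assumptions:

```python
import numpy as np
from scipy import stats

def estimate_power(b=0.3, n=200, n_reps=2000, alpha=0.05, seed=1):
    """Monte Carlo power to detect a regression slope b at sample size n.

    Simulates n_reps datasets under the assumed effect, tests the slope
    in each, and returns the proportion of significant results.
    """
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_reps):
        x = rng.normal(size=n)             # predictor
        y = b * x + rng.normal(size=n)     # outcome with assumed effect b
        result = stats.linregress(x, y)
        if result.pvalue < alpha:
            hits += 1
    return hits / n_reps

power = estimate_power()
```

Setting `b = 0` turns the same routine into a Type I error check: the "power" estimate should then hover near the alpha level, which is a quick sanity test for any simulation-based power tool.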

Dr. Andre Wang's research incorporates insights and methods from social, cognitive, and quantitative psychology to better understand how people connect abstract ideas to concrete experiences.

The distinction between abstract ideas and concrete experiences exists not only in the phenomena that researchers study but also in the research process itself: how psychologists think about analytic methods may not connect with how they experience and use those methods. Abstract ideals about research practice (e.g., increasing statistical power) often fail to translate into concrete practice due to barriers such as resource constraints or a lack of accessible tools. As a result, researchers may end up relying on suboptimal methods or even drawing inaccurate inferences (e.g., making Type II errors based on underpowered studies). Importantly, disparities in access to resources can exacerbate this problem. I explore how researchers can better connect their abstract ideas about research practice to the concrete and often messy reality of doing research, and I develop free, open-access tools for this purpose.