Just Do It!

I want to chime in about the exciting new section in Perspectives on Psychological Science dedicated to replication. (Note: Sanjay and David have more insightful takes!) This is an important development, and I hope other journals follow with similar policies and guidelines. I have had many conversations about methodological issues with colleagues over the last several years, and I am constantly reminded of how academic types can talk themselves into inaction at the drop of a hat. The fact that something this big is actually happening in a high-profile outlet is breathtaking (but in a good way!).

Beyond the shout-out to Perspectives, I want to make a modest proposal: donate 5 to 10% of your time to replication efforts. This might sound like a heavy burden, but I think it is a worthy goal. It is also easier to achieve with some creative multitasking. Steer a few of those undergraduate honors projects toward a meaningful replication study, or have first-year graduate students pick a study and try to replicate it during their first semester on campus. Then take an active role in the process to make these efforts worthwhile for the scientific community. Beyond that, let yourself be curious! If you read about an interesting study, try to replicate it. Just do it.

I also want to make an additional plug for a point Richard Lucas and I make in an upcoming comment (the title of our piece is my fault): support the journals that value replications by reviewing for them and providing them with content (i.e., submissions), and (gasp!) consider refusing to support journals that do not support replication studies or endorse sound methodological practices. Just do it (or not).

I will end with some shameless self-promotion and perhaps a useful reminder about reporting practices. Debby Kashy and I (along with Robert Ackerman and Daniel Russell) were kind of prescient in our 2009 paper about research practices in PSPB. Here is what we wrote (see p. 1139):

“All in all, we hope that researchers strive to find replicable effects, the building blocks of a cumulative science. Indeed, Steiger (1990) noted, “An ounce of replication is worth a ton of inferential statistics” (p. 176). As we have emphasized throughout, clear and transparent reporting is vital to this aim. Providing enough details in the Method and Results sections allows other researchers to make meaningful attempts to replicate the findings. A useful heuristic is for authors to consider whether the draft of their paper includes enough information so that another researcher could collect similar data and replicate their statistical analyses.”


