
The New Editorial Team of PSPB Addresses Editorial Philosophy

Our new team of four takes the reins in January 2017; some things will change at PSPB, but most will not. The Personality and Social Psychology Bulletin has always been a place for newer, creative ideas, and we will continue to seek papers that showcase creativity, progress, and innovation. We will continue the practice of seeking the highest quality, most rigorous, most informative manuscripts in personality and social psychology. We encourage you to submit to PSPB.

PSPB began in the mid-1970s as a mimeographed chapbook held together with two staples, filled with Society business, opinion pieces, newsletter items, and empirical articles. To appreciate how excellent PSPB was from the very start, see the first formal issue. It has come a long way; over four decades, PSPB has become an important and widely distributed outlet for empirical research articles, with occasional items of special interest. We thank Duane Wegener, Lee Fabrigar, and the team of Associate Editors for maintaining PSPB’s high standards and profile in the field.

The implicit goal of PSPB is to promote scientific progress in social-personality psychology. Progress in science emerges from a tension between innovation and selection. Ideas are considered, concepts are operationalized, data are collected and analyzed, the results are described and interpreted. The publication of an article is by no means the final word on scientific contribution (Fisher, 1947). Once a paper is accepted by the editorial team, PSPB’s reading public begins the process of post-publication review. What do the data mean? Are they interpreted reasonably? How can we use other research to interpret these findings? Can we believe the conclusions?

SPSP recently published a statement on scientific progress which began: “Science advances largely by correcting errors, and scientific progress involves learning from mistakes. By eliminating errors in methods and theories, we provide a stronger evidentiary basis for science that allows us to better describe events, predict what will happen, and solve problems” (SPSP Board of Directors, 2016). PSPB plays a fundamental role in this process. But PSPB also receives, selects, and promotes new and innovative ideas, methods, and findings. Proposing innovative ideas or methods on the one hand, and selecting out errors and mistakes on the other, represent two important scientific values. We will be guided by both. Social-personality psychology is a community of scientists working to develop theory, create manipulations and interventions, and solve puzzles and human problems. Scientists compare theories to data and compare theories to each other; we are skeptical and selective (and sometimes slow) in considering confirmations and disconfirmations. This is what separates science from other practices (Thagard, 1978), and PSPB is one place for this to happen.

What is quality research? What is an informative paper? There are many, many ways to make scientific contributions. At PSPB, there will be no litmus test for what is an acceptable paper—each paper will be evaluated based on its particular aims and merits. The general standard will be information value—what can we learn from this paper? (See also Funder et al., 2014.) This standard will be applied to theories, hypotheses, research designs, data sets, and replications; the editorial team, as informed by the reviews, will make judgments about publication based on the value and importance of the paper, and this judgment will be based on what we can reasonably learn from the work. Logic, theoretical clarity and care, scholarly review, methodological rigor, statistical power, careful design, and careful interpretation of results are all important, and we intend to celebrate (and select for) them because they are what make a contribution informative.

Each submission will be evaluated based on its particular aims and merits; the more clearly authors explain what they did and why—what they found and how—the better reviewers, editors, and readers can engage in the collective process of criticism and the growth of knowledge. Peer review and the editorial decisions that rely on it are a human process, and thus imperfect. However, as in all human endeavor, imperfection prevents neither beauty nor truth. When authors make the aims and merits of their work clear they best enable honest and constructive evaluation by reviewers and editors. When reviewers and editors make the basis of their evaluation clear, they best enable honest and constructive re-evaluation by authors. At its best, peer review is a principled (sometimes passionate) intellectual exchange. This is the standard we aim to achieve.

There has been a great deal of discussion lately about the honest and transparent reporting of research. This has been a creative era of suggestions, nostrums, bans, cures, and desiderata. Over the last few years, SPSP has instituted a number of requirements and recommendations to guide honest and transparent reporting (see Kashy et al., 2009). Following the 2013 SPSP Presidential Task Force on Publication and Research Practices, SPSP’s statement of Best Practices and its recommendations for “Improving the dependability of research in personality and social psychology” (Funder et al., 2014) informed current PSPB submission guidelines. For several years PSPB has required that all stimulus and other materials be included with a submission, to be published online upon publication. Current PSPB submission guidelines reiterate SPSP’s (2013) Data Sharing Policy. And, of course, PSPB submission guidelines require compliance with the American Psychological Association’s Ethical Principles of Psychologists and Code of Conduct standards regarding research and publication. The PSPB submission guidelines also prescribe the use of the 2009 APA Publication Manual, which includes Journal Article Reporting Standards for the honest and transparent reporting of research. PSPB is also a member of the Committee on Publication Ethics (COPE), which guides editorial practice, including responses to research and publication misconduct. (We encourage readers to visit COPE’s website for clear communication about appropriate conduct in submission, review, decisions, and conflicts of interest.)

The pre-registration of methods, data collection plans, and data analysis plans can play a role in increasing transparent scientific practice; research that involves pre-registration is welcome at PSPB, and the process should be described in the manuscript and detailed in supplementary materials. Because pre-registration is not always possible with the innovative or exploratory work we expect to publish at PSPB, it is not required. When guided by these complementary and comprehensive standards, responsible reporting will add information value and enable the informed evaluation of a paper given its aims and merits.

PSPB currently offers double-blind review (authors do not know who the reviewers are; reviewers do not know who the authors are) as an option to authors. Beginning March 1st, 2017, double-blind review will be required of all manuscripts. All three journals sponsored by SPSP (Personality and Social Psychology Review, Social Psychological and Personality Science, and PSPB) now share the same policy.

The new century has brought an explosion of interest in social and personality psychology—we’ve seen a substantial increase in the number of print journals, online journals, popular books, blogs, new professional societies, conferences, workshops, and media coverage. An even greater explosion can be seen in the number of scientists seeking to join our ranks; attendance at SPSP meetings continues to grow, applications for graduate school are steadily rising, new Ph.D. programs are developing, membership in SPSP continues to climb. This makes publication intensely competitive. The scarcity of highly valued publication resources is especially difficult for younger scientists, who must early in their careers vie for space with experienced scientists and established research labs, but the same high standards apply to all submissions. A high level of competition has a positive effect on scientific progress (Kitcher, 1993), and we strongly encourage junior scientists to submit their work.

We will evaluate contributions for statistical power. In addition to reporting p-values for significant effects, authors should also report the power they had to detect hypothesized effects of interest. If you hypothesized a null effect, reviewers will want to know whether you used a sample large enough to detect a small effect, so that they may attach some confidence to the null.

Authors should be clear about how they conducted power analyses (what formulae, which software programs were used?) and what values they used to estimate power. When conducting a priori analyses to determine sample size, please report the parameter estimates used in these analyses and how they were obtained (e.g., from pilot data or prior research). We encourage people to be transparent in the analysis and reporting of a priori power, consistent with the goals of transparency and clarity in reporting all statistical analyses. For the critical hypothesis tests, this information belongs in the body of the paper. For more complicated issues, minor tests, or lengthy discussion, description of power calculations may appear in the online supplement. If you did not conduct a power analysis prior to conducting the study, simply say so. Unresolved issues of statistical power should be addressed in the General Discussion.
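To make the reporting expectations above concrete, here is a minimal sketch of an a priori power analysis of the kind authors might report. It uses the standard normal-approximation formula for a two-group comparison; the function name, the target effect size of d = 0.4, and the default alpha and power values are illustrative assumptions, not prescriptions from this editorial, and dedicated software (e.g., G*Power) will give slightly more exact answers.

```python
# A priori sample-size calculation for a two-sample comparison, using the
# normal approximation: n per group ~= 2 * (z_{1-alpha/2} + z_{1-beta})^2 / d^2.
# All parameter values here are illustrative, not journal requirements.
import math
from statistics import NormalDist


def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate per-group n needed to detect standardized effect d
    with a two-sided test at the given alpha and power."""
    z = NormalDist().inv_cdf  # inverse standard-normal CDF
    n = 2 * (z(1 - alpha / 2) + z(power)) ** 2 / d ** 2
    return math.ceil(n)


# E.g., to detect d = 0.4 with 80% power at alpha = .05 (two-sided):
print(n_per_group(0.4))  # → 99 participants per group
```

Reporting the inputs to such a calculation (the assumed effect size, where it came from, alpha, and the power target) is exactly the transparency the guidelines above call for.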

We are in discussion with the SPSP Publication Committee about how to treat replications. Conceptual replications, because they extend operationalizations and test theories in new ways (Crandall & Sherman, 2016), have informational value, and will be considered for publication in the normal course of journal operations. Until an alternate outlet for exact, direct, close, or literal replications appears, we will consider publishing them in PSPB. “Replication only” is less likely than new contributions to meet PSPB standards—the results must be informative beyond a mere “yes, replicated” or “no, didn’t replicate.” There are many excellent exemplars of useful and informative replication projects (e.g., Ebersole et al., 2016; Luttrell, Petty, & Xu, 2017) of the type PSPB seeks to publish.

The development and advancement of theory is a useful component of many contributions. Generating a theoretical advance will always be a positive component of a paper. However, one result of a multidimensional approach to judging papers is that some empirical and methodological contributions may be so compelling, interesting, or surprising that they deserve to be published before their theoretical value is known (Greenwald, 2012). Sometimes, there is nothing so practical as a good set of data.

We will ask authors to include a “constraints on generality” paragraph in their Discussion sections (see Simons, Shoda & Lindsay, 2016). A section that covers limits on generality should define the scope of the ideas that the data support, by describing the population of interest, stimuli, and operations the authors are studying, as well as some sense of what they are not studying. This will aid in understanding the original data, and guide interpretation of subsequent replications, both direct and conceptual. Although not required, such a discussion would certainly increase the information value of a paper, and guide readers—skeptical and supportive—to a better understanding of what is claimed and what is not.

All journals in social and personality psychology are seeking to publish rigorous and high quality research; PSPB is no different. PSPB is a place for innovation and experiment. New ideas must be tested with rigor and openness, with sensible procedures and analyses, but we will not shy away from new concepts, new methods, or fresh takes on old problems. Careful selection must not limit innovation.

-Chris Crandall, Colin Wayne Leach, Michael Robinson, and Tessa West



Crandall, C. S., & Sherman, J. W. (2016). On the scientific superiority of conceptual replications for scientific progress. Journal of Experimental Social Psychology, 66, 93–99.

Ebersole, C. R., Atherton, O. E., Belanger, A. L., Skulborstad, H. M., Allen, J. M., Banks, J. B., ... & Brown, E. R. (2016). Many Labs 3: Evaluating participant pool quality across the academic semester via replication. Journal of Experimental Social Psychology, 67, 68-82.

Fisher, R. A. (1947). The design of experiments (4th ed.). Edinburgh, UK: Oliver and Boyd.

Funder, D. C., Levine, J. M., Mackie, D. M., Morf, C. C., Sansone, C., Vazire, S., & West, S. G. (2014). Improving the dependability of research in personality and social psychology: Recommendations for research and educational practice. Personality and Social Psychology Review, 18, 3-12.

Greenwald, A. G. (2012). There is nothing so theoretical as a good method. Perspectives on Psychological Science, 7, 99-108.

Kashy, D. A., Donnellan, M. B., Ackerman, R. A., & Russell, D. W. (2009). Reporting and interpreting research in PSPB: Practices, principles, and pragmatics. Personality and Social Psychology Bulletin, 35, 1131-1142.

Kitcher, P. (1993). The advancement of science: Science without legend, objectivity without illusion. New York: Oxford University Press.

Luttrell, A., Petty, R. E., & Xu, M. (2017). Replicating and fixing failed replications: The case of need for cognition and argument quality. Journal of Experimental Social Psychology, 69, 178-183.

Simons, D. J., Shoda, Y., & Lindsay, D. S. (2016). Constraints on generality (COG): A proposed addition to all empirical papers. Manuscript submitted for publication.

SPSP Board of Directors (2016). The state of our science: Executive board perspectives. Blog post, March 28, 2016.

Thagard, P. R. (1978). Why astrology is a pseudoscience. In P. D. Asquith & I. Hacking (Eds.), Philosophy of Science Association, Volume 1. East Lansing, MI: Philosophy of Science Association.