Meta-Psychology, 2023, vol 7, MP.2022.3271
https://doi.org/10.15626/MP.2022.3271
Article type: Commentary
Published under the CC-BY4.0 license
Open data: Not Applicable
Open materials: Not Applicable
Open and reproducible analysis: Not Applicable
Open reviews and editorial process: Yes
Preregistration: No
Edited by: Rickard Carlsson
Reviewed by: David A. Neequaye, Elizabeth Tenney
Analysis reproduced by: Not Applicable
All supplementary files can be accessed at OSF: https://doi.org/10.17605/OSF.IO/TW9GA

Unfortunately, Journals in Industrial, Work, and Organizational Psychology Still Fail to Support Open Science Practices

Joachim Hüffmeier and Marc Mertes
TU Dortmund University

Currently, journals in Industrial, Work, and Organizational (IWO) Psychology collectively do too little to support Open Science Practices. To address this problematic state of affairs, we first point out numerous problems that characterize the IWO Psychology literature. We then describe seven frequent arguments, which all lead to the conclusion that the time is not ripe for IWO Psychology to broadly adopt Open Science Practices. To change this narrative and to promote the necessary change, we reply to these arguments and explain how Open Science Practices can contribute to a better future for IWO Psychology with more reproducible, replicable, and reliable findings.
Keywords: Open Science Practices, Reproducibility, Replicability, Openness, Transparency

It is unfortunate how slowly positive change is coming to the Industrial, Work, and Organizational Psychology (IWO Psychology) and the broader Management literature.1 The field is riddled with problems, such as (i) low statistical power, (ii) non-transparent research practices and a lack of data-sharing, (iii) a high prevalence of Questionable Research Practices (QRPs, e.g., hypothesizing after the results are known [HARKing] or non-disclosure of unsupported hypotheses), (iv) many false positive findings, (v) publication bias and a substantial file drawer problem (i.e., findings that are not published because they are not statistically significant), (vi) a bias towards novelty at the expense of replication studies and cumulative science, and (vii) the low replicability of its findings. Importantly, a promising cure for many of these problems has long been available: Open Science Practices (OSPs; see Table 1 for an overview of OSPs and their effectiveness).2 However, as our own (Torka et al., 2023) and other research (Tipu and Ryan, 2021) shows, most IWO Psychology and Management journals generally do little to support researchers' use of OSPs. For instance, our analysis of the policies of IWO Psychology and Management journals showed that only five of 202 analysed journals (2.5%) offered registered reports as a publication option, and only one journal (0.5%) provided Open Science Badges (Torka et al., 2023). If anything, the journals seem to endorse "business as usual", which prevents overdue improvements of the state of the literature.

In the following, we will illustrate that the listed problems do in fact exist, specifically in the IWO Psychology/Management field, and that there are hardly any excuses for not taking action on the part of the journals.
We will do so by presenting typical arguments that we observed in our own studies with scientists (a survey with scientists in IWO Psychology, Hüffmeier et al., n.d.) and journal editors (a survey with editors of IWO Psychology journals, Torka et al., 2023) and (over-)heard in informal conversations with colleagues. Then, we will reply to these arguments (see Table 2 for an overview of the seven arguments and our refutations).

The first argument: OSPs are for scientific fields that evidentially have documented problems with the replicability of their findings like Social Psychology. It is of course true that replicability (or rather the lack thereof) is better documented in other fields, especially Social Psychology.3 However, the replicability of reported results is low across many research domains. These domains include, but are not limited to, Management

1 Because much IWO Psychology research is published in Management journals (e.g., Journal of Organizational Behavior or Journal of Management), the two fields cannot really be separated.
2 Technically, some of these measures, such as journals' support of replications, do not necessarily make science more open, transparent, or accessible, although they improve science. Other authors therefore speak of "open science and reform practices" rather than of OSPs (see Tenney et al., 2021). However, to keep with established conventions, we still use the term "OSPs" in this manuscript.
3 Replicability means that findings from new (replication) studies, which converge with those of the original studies, "can be obtained with other random samples drawn from a multidimensional space that captures the most important facets of the research design" (Asendorpf et al., 2013, p. 109; see, for instance, Open Science Foundation, 2015).
(Bergh et al., 2017) and neighbouring disciplines such as Marketing (Simmons and Nelson, 2019) and Economics (Camerer et al., 2016). There is little reason to assume that the situation is fundamentally different in IWO Psychology research (see also Goldfarb and King, 2016) because the incentives and publishing practices in all these fields are highly comparable and, thus, equally problematic. Finally, the methodological problems of IWO Psychology and Management are not restricted to replicability (see also the next argument).

The second argument: Show us the evidence that our field does in fact have severe methodological problems. Maybe then we will be willing to start supporting OSPs. We will use our above list to substantiate the prevailing methodological problems. First, low statistical power is very common in IWO Psychology and Management studies. One recent study found that only 37% of considered studies had a power of at least .80 (Paterson et al., 2016; see also Mone et al., 1996). Second, at least for IWO Psychology and for strategic management research, research practices are so non-transparent that it is often impossible to reproduce reported findings even when the data are available (see Bergh et al., 2017; for an overview, see Artner et al., 2021). However, related efforts often fail already one step earlier because researchers are unwilling to share their data (e.g., Tenopir et al., n.d.; Wicherts et al., 2006). Third, there is converging evidence across many studies that Questionable Research Practices (QRPs) are widespread in the field (e.g., Banks et al., 2016; O'Boyle Jr et al., 2017). The problem is even more prevalent for articles appearing in prestigious journals such as Organizational Behavior and Human Decision Processes or the Academy of Management Journal (Kepes et al., 2022).
Fourth, the risk of committing Type I errors (i.e., rejecting a true null hypothesis or obtaining a false positive finding) is directly associated with low statistical power, which is prevalent in our field (see above). Another perspective on the same issue is the rate of supported hypotheses in a field, which has been found to be especially high for the overarching field of Economics and Business (i.e., no further differentiation was made within this field; Fanelli, 2012), a clearly worrying finding for the state of the literature. Fifth, many pertinent journals do not publish statistically non-significant results (Tenney et al., 2021), deeming them either irrelevant or unworthy of publication. Therefore, such negative results typically remain in researchers' file drawers (i.e., publication bias; Harrison et al., 2017; O'Boyle Jr et al., 2014). Sixth, nearly all journals in the field stress that new manuscripts must contribute theoretical and empirical extensions to the current knowledge (e.g., Group and Organization Management seeks "[. . . ] the work of scholars and professionals who extend management and organization theory [. . . ]. Innovation, conceptual sophistication, methodological rigor, and cutting-edge scholarship are the driving principles"). This coincides with the underrepresentation of replication studies in the field, as shown by Ryan and Tipu (2022), who estimate in their quantitative analysis that less than 1.5% of published research in the business and management literature consists of replication studies. The one-sided quest for novelty, together with the prevailing disinterest in replication studies that is well-documented for most journals (Tipu and Ryan, 2021; see also Evanschitzky et al., 2007; Tenney et al., 2021), limits our collective ability to establish a cumulative knowledge base and to "differentiate 'truth from nonsense'" (Kidwell et al., 2014, p. 304).
The third argument: We would like to support OSPs, but they are made exclusively for experimental (laboratory) research. There are no suitable templates for other approaches (correlative [field] research, secondary data analyses, qualitative studies, etc.). This is not true, and it has not been for a while. While the first OSPs and preregistration templates were indeed often developed with a focus on experimental (laboratory) research (e.g., Van't Veer and Giner-Sorolla, 2021), many further developments followed. There are now templates that allow preregistering analyses of pre-existing data (Mertens and Krypotos, 2019), systematic reviews (Van den Akker et al., 2020), meta-analyses (Moreau and Gamble, 2022), and qualitative studies (Haven et al., 2020; Kern and Gleditsch, 2017). Moreover, extant templates originally developed for experimental research can be adapted for all kinds of research with relative ease. We argue that a preregistration not fitting the template perfectly is better than no preregistration at all. While a preregistration should always contain certain information (e.g., how the sample size is determined and what measures will be used), every effort to limit researcher degrees of freedom (and thereby possibilities to engage in QRPs) via preregistration is a step in the right direction.

Based on our own experience, we can recommend the template from the aspredicted.org website, which is also offered via the Open Science Framework (OSF; http://osf.io). The template can be used without a word limit or length restrictions on the OSF, whereas the aspredicted.org website has a word limit. The template is simple and short and can easily be adapted to a variety of study types. In different projects of our research group, we have used it, for instance, for experimental studies, correlational studies, analyses of pre-existing data, meta-analyses, and qualitative studies.
Thus, although its original focus may have been experimental research, it is clearly not restricted to such studies. However, the templates that were designed for specific study types are of course less generic and facilitate the declaration of necessary study-specific details (e.g., study eligibility criteria or the literature search strategy for meta-analyses; see Moreau and Gamble, 2022).

The fourth argument: If journals implement OSPs, it raises the bar and makes publishing more difficult. This especially applies to certain kinds of research, for instance research on minorities, hard-to-reach or small populations.4 Admittedly, such concerns about gatekeeping can be justified because the requirements for publication would increase. For instance, when preregistering a study, researchers are asked to provide an a priori justification of their sample size (see, for instance, the templates on the aspredicted.org website or by Van't Veer and Giner-Sorolla, 2021). This often means collecting [much] larger samples as compared to conducting a study without a sample size justification (Mone et al., 1996; Paterson et al., 2016). However, providing a sample size justification does not always mean that collecting a large sample is necessary (and often it is not done; see Bakker et al., 2020). Resource constraints and/or studying hard-to-reach or small populations are legitimate justifications for the realized sample size (Lakens, 2022; although collecting surprisingly large samples is more often possible than researchers might think at first, Vazire, 2015). However, while there are often good reasons to conduct and publish research with rather small samples, scientists should then actively acknowledge the potential, goals, and limits of their statistical analyses. Moreover, it can be debated how problematic a higher bar for publishing would actually be.
In fact, there has been at least some agreement for some time now (e.g., Nelson et al., 2012; Vazire, 2018) that individual researchers should publish fewer manuscripts while increasing their quality. To allow for making stronger scientific claims (Vazire, 2018), researchers should improve the methods they apply, including the use of OSPs, but not excluding further improvements in other methodological areas.

The fifth argument: It does not make much of a difference if journals actively support OSPs. Researchers do not want to use them. While it may be true that first initiatives to foster the use of OSPs in a field are not necessarily met with enthusiasm by most researchers, there is no reason to be pessimistic. As is the case with any innovation, people take it up at different speeds, and it takes a while for change to affect the habits of the majority. But researchers do willingly take up these new measures, especially if esteemed journals lead the movement. The psychological flagship journal Psychological Science, for instance, has been an "early adopter" of OSPs since 2014 and has actively supported (but not enforced) the use of OSPs. The journal saw several positive results of its new policy rather quickly: Since the introduction of Open Science Badges (see Table 1), the data sharing rate for published articles has increased. In fact, when researchers earned an Open Data Badge rather than merely indicating data availability, the data "were more likely to be actually available, correct, usable, and complete" (Kidwell et al., 2016). The higher rate of published replication studies in the journal since the introduction of the "Preregistered Direct Replication" article format indicates another positive change.

The sixth argument: Journals that actively support OSPs experience a competitive disadvantage because scientists consider them as less attractive target journals.
Journals like Leadership Quarterly or the Journal of Business and Psychology endorsed and supported the use of OSPs relatively early. If anything, these journals benefitted from this decision: Although their strongly positive development in terms of journal metrics such as the journal impact factor is certainly driven by various factors and decisions, their articulated attitude towards OSPs did at least not hurt enough to prevent this development (see also the recent development of the Journal of Applied Psychology after more recently introducing transparency-related changes). And of course, there are other journals that have not yet embraced OSPs and did not have a comparably positive development in the same time span.

The seventh argument: Journals that do at least encourage some OSPs do more than others and they therefore do enough. While some journals do actively support the use of selected OSPs (e.g., the Journal of Personnel Psychology or Group and Organization Management offering hybrid registered report submission; see Gardner, 2020), these efforts are not very visible. Researchers typically have to search actively for this option. If they do not know it is offered or do not know what to look for, there is a good chance that they will not even find the option on a journal website. Moreover, supporting only one OSP does not and cannot address all of the problems we listed above. To do so, it would be much more effective to actively support all OSPs (see Table 1).

IWO Psychology and Management Journals Should Do More to Support OSPs

Positive change is not coming to our field automatically. Illustrating this notion, a current study (Tenney

4 We would like to thank our reviewer Elizabeth Tenney for suggesting this argument.
Table 1
Open Science Practices, their definitions, and demonstrated and assumed benefits for the field

Transparency requirements for data, method and code, material or stimuli
Definition: As a minimum requirement, authors indicate whether they will make their data, analytic methods used in the analysis (i.e., methods and code), and research materials used to conduct the research (i.e., material or stimuli) available to any researcher.
Demonstrated and assumed benefits:
• Studies employing high statistical power, complete methodological transparency, and preregistration are highly replicable and more replicable than past studies in prior multi-lab replication efforts (Protzko et al., 2020).

Preregistration
Definition: Preregistration is defined as "specifying your research plan in advance of your study and submitting it to a registry" (Center for Open Science, n.d.-a).
Demonstrated and assumed benefits:
• Preregistered studies are more transparent concerning the reporting of their findings than non-preregistered studies and also report a lower rate of confirmed hypotheses (Toth et al., 2021).
• Researchers with experience using preregistrations (n = 299) reported mostly positive experiences; they believed that preregistrations had improved the quality of their research projects and "that the benefits outweigh the challenges" (Sarafoglou et al., 2021).

Registered Reports
Definition: Registered reports are "a publishing format used by over 250 journals that emphasizes the importance of the research question and the quality of methodology by conducting peer review prior to data collection" (Center for Open Science, n.d.-c).
Demonstrated and assumed benefits:
• Researchers blinded to the rated article type rated registered reports more positively across a number of criteria, including the rigorousness of the employed methods and the analysis as well as the overall manuscript quality and the importance of produced discoveries (Soderberg et al., 2021).

Open Science Badges
Definition: Open Science Badges "are incentives for researchers to share data, materials, or to preregister" (Center for Open Science, n.d.-b). Specific badges indicate that an article was preregistered, or that its data or its material has been made publicly available.
Demonstrated and assumed benefits:
• Open Science Badges increase data and materials sharing (Kidwell et al., 2016).
• Shared data are "more likely to be actually available, correct, usable, and complete" when researchers earn an Open Data badge than when they only indicate data availability (Kidwell et al., 2016).

Replications
Definition: Replications are "a fundamental feature of the scientific process" (Zwaan et al., 2018). When conducting replications, researchers critically test the robustness and validity of scientific discoveries.
Demonstrated and assumed benefits:
• Replications ensure the robustness of published research because "a finding needs to be repeatable to count as a scientific discovery" (Zwaan et al., 2018).

Table 2
The seven arguments treated in this commentary and the refutations of these arguments

(1) Argument: OSPs are for scientific fields that evidentially have documented problems with the replicability of their findings like Social Psychology.
Refutation: The replicability of research findings is consistently rather low for many scientific fields, including many neighbouring fields of IWO Psychology (beyond Social Psychology). Due to the extant similarities in publishing practices and incentives across disciplines, it is unlikely that the situation is different in IWO Psychology.

(2) Argument: Show us the evidence that our field does in fact have severe methodological problems. Maybe then we will be willing to start supporting OSPs.
Refutation: IWO Psychology has the following well-documented methodological problems: (i) low statistical power, (ii) non-transparent research practices and a lack of data-sharing, (iii) a high prevalence of Questionable Research Practices, (iv) many false positive findings, (v) publication bias and a substantial file drawer problem, and (vi) a bias towards novelty at the expense of replication studies and cumulative science.

(3) Argument: We would like to support OSPs, but they are made exclusively for experimental (laboratory) research. There are no suitable templates for other approaches (correlative [field] research, secondary data analyses, qualitative studies, etc.).
Refutation: Suitable templates have been specifically developed for the analysis of extant (correlational) data, qualitative studies, meta-analyses, etc. Moreover, existing templates can be easily adapted.

(4) Argument: If journals implement OSPs, it raises the bar and makes publishing more difficult. This especially applies to certain kinds of research, for instance research on minorities, hard-to-reach or small populations.
Refutation: If journals implement OSPs, it would probably often raise the bar for publishing research. It is, however, wrong that OSPs would always require larger sample sizes. Moreover, raising the bar would probably be good for the scientific enterprise.

(5) Argument: It does not make much of a difference if journals actively support OSPs. Researchers do not want to use them.
Refutation: It may take time, but researchers do want to use OSPs if journals implement them and incentivize their use, as for instance the case of Psychological Science shows.

(6) Argument: Journals that actively support OSPs experience a competitive disadvantage because scientists consider them as less attractive target journals.
Refutation: The limited evidence that we have does not support this argument. Early OSP adopters among the journals (Journal of Business and Psychology, Leadership Quarterly) fared pretty well in comparison to non-adopters.

(7) Argument: Journals that do at least encourage some OSPs do more than others and they therefore do enough.
Refutation: These selected efforts are typically not sufficiently visible. Supporting only a part of the OSPs cannot fully and effectively address the field's problems.

et al., 2021) found that less than one percent of articles in the field's flagship journals are preregistered; less than one percent of the publications are replication studies (for comparable results, see Ryan and Tipu, 2022) or report null results; and for less than three percent, authors indicate that they openly shared their data or their materials. These low rates most likely reflect the journal policies concerning OSPs (cf. Torka et al., 2023). For example, concerning replications, Tipu and Ryan (2021) showed that only 4.7% of more than 600 analysed business and management journals explicitly considered replication studies, while "238 (39.7%) were implicitly dismissive of replication studies, and the remaining 3 (0.5%) journals were explicitly disinterested in considering replication studies for publication" (Tipu and Ryan, 2021, p. 101; for comparable results concerning replications and also further OSPs, see Torka et al., 2023).

With this contribution, we would like to invite and challenge IWO Psychology and Management journals to foster researchers' use of as many OSPs as possible. To be clear, we do not suggest forcing researchers to use certain OSPs. We rather ask the journals to contribute to the needed cultural change in the field's research practices by (i) encouraging and incentivizing methodological transparency and the use of preregistrations (e.g., by offering Open Science Badges), (ii) offering registered reports as an equitable publishing format, and (iii) explicitly inviting well-designed replications.
These measures, while cheap and easy to implement, can increase researchers' perceptions of a journal as an attractive outlet for their high-quality research, increase the quality of research overall and the resulting trust in it, and change the field for the better by addressing the systemic roots of QRPs and the low replicability of findings.

Author Contact
Joachim Hüffmeier, Department of Psychology, TU Dortmund University, Emil-Figge-Straße 50, 44227 Dortmund, Germany. E-mail: joachim.hueffmeier@tu-dortmund.de (corresponding author); Marc Mertes, E-mail: marc.mertes@tu-dortmund.de

Conflict of Interest and Funding
The authors have no conflicts of interest. There was no specific source of funding.

Author Contributions
Joachim Hüffmeier wrote the manuscript and Marc Mertes provided revisions.

Open Science Practices
This article is conceptual and is not eligible for Open Science badges. The entire editorial process, including the open reviews, is published in the online supplement.

References
Artner, R., Verliefde, T., Steegen, S., Gomes, S., Traets, F., Tuerlinckx, F., & Vanpaemel, W. (2021). The reproducibility of statistical results in psychological research: An investigation using unpublished raw data. Psychological Methods, 26(5), 527–546. https://doi.org/10.1037/met0000365
Asendorpf, J. B., Conner, M., De Fruyt, F., De Houwer, J., Denissen, J. J. A., Fiedler, K., Fiedler, S., Funder, D. C., Kliegl, R., Nosek, B. A., Perugini, M., Roberts, B. W., Schmitt, M., van Aken, M. A. G., Weber, H., & Wicherts, J. M. (2013). Recommendations for increasing replicability in psychology. European Journal of Personality, 27(2), 108–119. https://doi.org/10.1002/per.1919
Bakker, M., Veldkamp, C. L., van den Akker, O. R., van Assen, M. A., Crompvoets, E., Ong, H. H., & Wicherts, J. M. (2020). Recommendations in pre-registrations and internal review board proposals promote formal power analyses but do not increase sample size. PLoS ONE, 15(7), e0236079.
https://doi.org/10.1371/journal.pone.0236079
Banks, G. C., Rogelberg, S. G., Woznyj, H. M., Landis, R. S., & Rupp, D. E. (2016). Evidence on questionable research practices: The good, the bad, and the ugly. Journal of Business and Psychology, 31, 323–338. https://doi.org/10.1007/s10869-016-9456-7
Bergh, D. D., Sharp, B. M., Aguinis, H., & Li, M. (2017). Is there a credibility crisis in strategic management research? Evidence on the reproducibility of study findings. Strategic Organization, 15(3), 423–436. https://doi.org/10.1177/1476127017701076
Camerer, C. F., Dreber, A., Forsell, E., Ho, T.-H., Huber, J., Johannesson, M., Kirchler, M., Almenberg, J., Altmejd, A., Chan, T., Heikensten, E., Holzmeister, F., Imai, T., Isaksson, S., Nave, G., Pfeiffer, T., Razen, M., & Wu, H. (2016). Evaluating replicability of laboratory experiments in economics. Science, 351(6280), 1433–1436. https://doi.org/10.1126/science.aaf0918
Center for Open Science. (n.d.-a). Future-proof your research. Preregister your next study. https://www.cos.io/initiatives/prereg
Center for Open Science. (n.d.-b). Open science badges enhance openness, a core value of scientific practice. https://www.cos.io/initiatives/badges
Center for Open Science. (n.d.-c). Registered reports: Peer review before results are known to align scientific values and practices. https://www.cos.io/initiatives/registered-reports
Evanschitzky, H., Baumgarth, C., Hubbard, R., & Armstrong, J. S.
(2007). Replication research's disturbing trend. Journal of Business Research, 60(4), 411–415. https://doi.org/10.1016/j.jbusres.2006.12.003
Fanelli, D. (2012). Negative results are disappearing from most disciplines and countries. Scientometrics, 90, 891–904. https://doi.org/10.1007/s11192-011-0494-7
Gardner, W. L. (2020). Farewell from the outgoing editor. Group & Organization Management, 45(6), 762–767. https://doi.org/10.1177/1059601120980536
Goldfarb, B., & King, A. A. (2016). Scientific apophenia in strategic management research: Significance tests & mistaken inference. Strategic Management Journal, 37(1), 167–176. https://doi.org/10.1002/smj.2459
Harrison, J. S., Banks, G. C., Pollack, J. M., O'Boyle, E. H., & Short, J. (2017). Publication bias in strategic management research. Journal of Management, 43(2), 400–425. https://doi.org/10.1177/0149206314535438
Haven, T. L., Errington, T. M., Gleditsch, K. S., van Grootel, L., Jacobs, A. M., Kern, F. G., Piñeiro, R., Rosenblatt, F., & Mokkink, L. B. (2020). Preregistering qualitative research: A Delphi study. International Journal of Qualitative Methods, 19, 1–13. https://doi.org/10.1177/1609406920976417
Hüffmeier, J., Mertes, M., Schultze, T., Nohe, C., Mazei, J., & Zacher, H. (n.d.). Prevalence, problems, and potential: A survey on selected open science practices among IWO psychologists. [Manuscript in preparation].
Kepes, S., Keener, S. K., McDaniel, M. A., & Hartman, N. S. (2022). Questionable research practices among researchers in the most research-productive management programs. Journal of Organizational Behavior. Advance online publication.
Kern, F. G., & Gleditsch, K. S. (2017). Exploring pre-registration and pre-analysis plans for qualitative inference. https://t1p.de/nvb9i
Kidwell, M. C., Lazarević, L. B., Baranski, E., Hardwicke, T. E., Piechowski, S., Falkenberg, L.
S., Kennett, C., Slowik, A., Sonnleitner, C., Hess-Holden, C., Errington, T. M., Fiedler, S., & Nosek, B. A. (2014). Facts are more important than novelty: Replication in the education sciences. Educational Researcher, 43(6), 304–316. https://doi.org/10.3102/0013189X14545513
Kidwell, M. C., Lazarević, L. B., Baranski, E., Hardwicke, T. E., Piechowski, S., Falkenberg, L. S., Kennett, C., Slowik, A., Sonnleitner, C., Hess-Holden, C., Errington, T. M., Fiedler, S., & Nosek, B. A. (2016). Badges to acknowledge open practices: A simple, low-cost, effective method for increasing transparency. PLoS Biology, 14(5), e1002456. https://doi.org/10.1371/journal.pbio.1002456
Lakens, D. (2022). Sample size justification. Collabra: Psychology, 8(1), 33267.
Mertens, G., & Krypotos, A.-M. (2019). Preregistration of analyses of preexisting data. Psychologica Belgica, 59(1), 338–352. https://doi.org/10.5334/pb.493
Mone, M. A., Mueller, G. C., & Mauland, W. (1996). The perceptions and usage of statistical power in applied psychology and management research. Personnel Psychology, 49(1), 103–120. https://doi.org/10.1111/j.1744-6570.1996.tb01793.x
Moreau, D., & Gamble, B. (2022). Conducting a meta-analysis in the age of open science: Tools, tips, and practical recommendations. Psychological Methods, 27(3), 426–432. https://doi.org/10.1037/met0000351
Nelson, L. D., Simmons, J. P., & Simonsohn, U. (2012). Let's publish fewer papers. Psychological Inquiry, 23(3), 291–293. https://doi.org/10.1080/1047840X.2012.705245
O'Boyle Jr, E. H., Banks, G. C., & Gonzalez-Mulé, E. (2017). The chrysalis effect: How ugly initial results metamorphosize into beautiful articles. Journal of Management, 43, 376–399. https://doi.org/10.1177/0149206314527133
O'Boyle Jr, E. H., Rutherford, M. W., & Banks, G. C. (2014). Publication bias in entrepreneurship research: An examination of dominant relations to performance. Journal of Business Venturing, 29(6), 773–784.
Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716. https://doi.org/10.1126/science.aac4716
Paterson, T. A., Harms, P., Steel, P., & Credé, M. (2016). An assessment of the magnitude of effect sizes: Evidence from 30 years of meta-analysis in management. Journal of Leadership & Organizational Studies, 23(1), 66–81. https://doi.org/10.1177/1548051815614321
Protzko, J., Krosnick, J., Nelson, L. D., Nosek, B. A., Axt, J., Berent, M., Buttrick, N., DeBell, M., Ebersole, C. R., Lundmark, S., MacInnis, B., O'Donnell, M., Perfecto, H., Pustejovsky, J. E., Roeder, S., Walleczek, J., & Schooler, J. W. (2020). High replicability of newly-discovered social-behavioral findings is achievable. PsyArXiv. https://doi.org/10.31234/osf.io/n2a9x
Ryan, J. C., & Tipu, S. A. (2022). Business and management research: Low instances of replication studies and a lack of author independence in replications. Research Policy, 51(1), 104408. https://doi.org/10.1016/j.respol.2021.104408
Sarafoglou, A., Kovacs, M., Bakos, B. E., Wagenmakers, E.-J., & Aczel, B. (2021). A survey on how preregistration affects the research workflow: Better science but more work. PsyArXiv. https://doi.org/10.31234/osf.io/6k5gr
Simmons, J. P., & Nelson, L. D. (2019). Data Replicada. Data Colada. https://datacolada.org/81
Soderberg, C. K., Errington, T. M., Schiavone, S. R., Bottesini, J., Thorn, F. S., Vazire, S., Esterling, K. M., & Nosek, B. A. (2021). Initial evidence of research quality of registered reports compared with the standard publishing model. Nature Human Behaviour, 5(8), 990–997. https://doi.org/10.1038/s41562-021-01142-4
Tenney, E. R., Costa, E., Allard, A., & Vazire, S. (2021). Open science and reform practices in organizational behavior research over time (2011 to 2019). Organizational Behavior and Human Decision Processes, 162, 218–223. https://doi.org/10.1016/j.obhdp.2020.10.015
Tenopir, C., Allard, S., Douglass, K., Aydinoglu, A., Wu, L., Read, E., & Manoff, M. (2011). Data sharing by scientists: Practices and perceptions. PLoS ONE, 6(6), e21101. https://doi.org/10.1371/journal.pone.0021101
Tipu, S. A. A., & Ryan, J. C. (2021). Are business and management journals anti-replication? An analysis of editorial policies. Management Research Review, 45(1), 101–117. https://doi.org/10.1108/MRR-01-2021-0050
Torka, A.-K., Mazei, J., Bosco, F., Cortina, J., Götz, M., Kepes, S., O'Boyle, E., & Hüffmeier, J. (2023). How well are open science practices implemented in industrial, work, and organizational psychology and management? Management Research Review. Manuscript under journal review.
Toth, A. A., Banks, G. C., Mellor, D., O'Boyle, E. H., Dickson, A., Davis, D. J., DeHaven, A., Bochantin, J., & Borns, J. (2021). Study preregistration: An evaluation of a method for transparent reporting. Journal of Business and Psychology, 36, 553–571.
Van den Akker, O., Peters, G. J., Bakker, C., Carlsson, R., Coles, N. A., Corker, K. S., ..., & Yeung, S. K. (2020). ProSysRev: A generalized form for registering producible systematic reviews. MetaArXiv. https://osf.io/preprints/metaarxiv/3nbea/
Van't Veer, A. E., & Giner-Sorolla, R. (2016). Pre-registration in social psychology—A discussion and suggested template. Journal of Experimental Social Psychology, 67, 2–12. https://doi.org/10.1016/j.jesp.2016.03.004
Vazire, S. (2015). Super power. https://sometimesimwrong.typepad.com/wrong/2015/11/super-power.html
Vazire, S. (2018). Implications of the credibility revolution for productivity, creativity, and progress. Perspectives on Psychological Science, 13(4), 411–417. https://doi.org/10.1177/1745691617751884
Wicherts, J. M., Borsboom, D., Kats, J., & Molenaar, D. (2006). The poor availability of psychological research data for reanalysis. American Psychologist, 61(7), 726–728. https://doi.org/10.1037/0003-066X.61.7.726
Zwaan, R. A., Etz, A., Lucas, R. E., & Donnellan, M. B. (2018). Making replication mainstream. Behavioral and Brain Sciences, 41, e120. https://doi.org/10.1017/S0140525X17001972