- desirable to identify experimental studies and distinguish them from e.g. corpus studies
- possible automatic approximation: look for terms like "participants" / "speakers" / "listeners" plus "experiment" etc. (in headers), and p-value notation ("p >/=/<")
possible criteria:
- "experiment" in the abstract
- "experiment" in a header
- "participant" (and similar terms) in a header
- ""
- maybe check keywords
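The criteria above can be sketched as a simple keyword screen. A minimal sketch, assuming plain-text input split into abstract, headers, and body; the term lists and regexes are illustrative placeholders, not final criteria:

```python
import re

# Hypothetical keyword screen: flag a paper as (likely) experimental if
# cue terms appear in the abstract or a header, or if p-value notation
# ("p <", "p =", "p >") appears anywhere in the body.
CUE_TERMS = re.compile(r"\b(experiment\w*|participant\w*|speaker\w*|listener\w*)\b", re.I)
P_VALUE = re.compile(r"\bp\s*[<>=]\s*\.?\d")

def looks_experimental(abstract: str, headers: list[str], body: str) -> bool:
    if CUE_TERMS.search(abstract):
        return True
    if any(CUE_TERMS.search(h) for h in headers):
        return True
    return bool(P_VALUE.search(body))
```

Hits would still go to manual coding; the screen only narrows the candidate set.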
- manual checks should probably be coded by both of us and then discussed where our codings diverge
- statcheck results (http://statcheck.io/)
- already done; we may still need to manually check the flagged errors
- manual check: I will check these manually
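For the manual error check it may help to re-extract the reported test statistics ourselves. A statcheck-style first step, sketched below: pull APA-style t-test reports ("t(df) = value, p = value") so the reported p can later be recomputed from the statistic, as statcheck does. The regex is an illustrative simplification, not statcheck's actual pattern:

```python
import re

# Extract reported t-tests of the form "t(df) = value, p {<,=,>} value".
T_TEST = re.compile(
    r"t\s*\(\s*(\d+(?:\.\d+)?)\s*\)\s*=\s*(-?\d+(?:\.\d+)?)\s*,\s*"
    r"p\s*([<>=])\s*(\.?\d+(?:\.\d+)?)",
    re.I,
)

def extract_t_tests(text: str) -> list[dict]:
    """Return one dict per reported t-test found in the text."""
    out = []
    for df, t, rel, p in T_TEST.findall(text):
        out.append({"df": float(df), "t": float(t), "rel": rel, "p": float(p)})
    return out
```

Analogous patterns would be needed for F, chi-square, and r reports.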
- check for significant results (first hypothesis mentioned) (see Scheel et al. 2020: https://psyarxiv.com/p6e9c/)
- automation: "test* the hypothes*" (Fanelli, 2010), "predicted that" (Scheel et al. 2020)
- "For the majority of Registered Reports (49), we identified one hypothesis-introduction phrase; the remaining ones used two (16 RRs), three (4 RRs), or four (1 RR) different phrases or had no identifiable hypothesis introduction (1 RR). In this total set of 97 hypothesis introductions, we found 64 unique phrases showing substantial linguistic variation (see Tables 2 and 3). [...] the five most frequent word stems were ‘hypothes∗’ (34 occurrences), ‘replicat∗’ (24), ‘test∗’ (20), ‘examine∗’ (8), and ‘predict∗’" (Scheel et al. 2020: 7)
- check p-values?
- manually check 10 journals as a trial run
- manual checks:
- check the magnitude of acoustic results
- for that we need to identify acoustic studies
- maybe search for terms like "acoustic anal*"
- focus on f0 (Hz), duration (ms) and intensity (dB)
- also need to check JND literature to set thresholds (ask Dan)
- if effect doesn't pass JND, manually check whether the paper discusses this
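The JND screen could look like the sketch below. The threshold values are PLACEHOLDERS to be replaced with values from the JND literature (per the note above: ask Dan); only the comparison logic is fixed:

```python
# Placeholder JND thresholds per acoustic measure:
# f0 in Hz, duration in ms, intensity in dB.
JND = {"f0_hz": 1.0, "duration_ms": 10.0, "intensity_db": 1.0}  # NOT final values

def passes_jnd(measure: str, effect_magnitude: float) -> bool:
    """True if the absolute effect size exceeds the assumed JND threshold."""
    return abs(effect_magnitude) >= JND[measure]
```

Effects that do not pass would then be routed to the manual check of whether the paper discusses their perceptual relevance.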
- follow Makel et al.: search for "replic*" and then manually check the content
- manually check a subsample (10)
- check for preregist*
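Both keyword searches can share one flagging step. A minimal sketch following the Makel et al. approach (flag on "replic*", here extended with "preregist*"); hits go on to manual content checks:

```python
import re

# Keyword flags; the patterns mirror the wildcard searches in the notes.
FLAGS = {
    "replication": re.compile(r"\breplicat\w*", re.I),
    "preregistration": re.compile(r"\bpre-?regist\w*", re.I),
}

def flag_paper(text: str) -> dict[str, bool]:
    """Return a flag per category indicating whether the pattern occurs."""
    return {name: bool(rx.search(text)) for name, rx in FLAGS.items()}
```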
- Check hyperlinks
- What is actually available (raw data, data table, materials, scripts)?
- raw data
- data table
- statistical scripts
- material
- exposition/illustration files
- but the paper claims availability
- no data
- dead link
- data not available
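The hyperlink check could be automated with a minimal link-rot probe like the sketch below (standard library only). A real run would need a GET fallback for servers that reject HEAD, rate limiting, and polite user-agent headers; this shows only the core idea:

```python
import urllib.request
import urllib.error

def link_status(url: str, timeout: float = 5.0) -> str:
    """Classify a data/materials URL as 'ok', 'http <code>', or 'dead link'."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return "ok" if resp.status < 400 else f"http {resp.status}"
    except urllib.error.HTTPError as e:
        return f"http {e.code}"
    except Exception:  # malformed URL, DNS failure, timeout, refused connection
        return "dead link"
```

"dead link" hits would then map onto the coding categories above (dead link vs. data not available vs. availability merely claimed).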
———
Covariates:
- readability
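For the readability covariate, one standard option is Flesch Reading Ease. A self-contained sketch with a naive vowel-group syllable counter; a dedicated package (e.g. textstat) would be more robust, this just makes the computation explicit:

```python
import re

def count_syllables(word: str) -> int:
    # Naive heuristic: one syllable per contiguous vowel group, minimum 1.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    """Flesch Reading Ease: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    n = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)
```

The syllable heuristic over- and under-counts some English words, so scores should be treated as a relative covariate, not an absolute measure.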