Our links are on break until the new year, but here are a couple of catch-up links now that our job market series has finished:
- BITSS had its annual conference (program and live video for the different talks posted online). Lots of discussion of the latest in transparency and open science. Includes a replication exercise with all AEJ applied papers: “69 of 162 eligible replication attempts (42.6%) successfully replicated the article's analysis. A further 68 (42%) were at least partially successful. A total of 98 out of 303 (32.3%) relied on confidential or proprietary data, and were thus not reproducible by this project.” And slides by Evers and Moore that should cause you to question any analysis done using Poissons or Negative Binomials.
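The slides themselves are the place to go for the specifics. As general background on why count models deserve this scrutiny, one well-known fragility of the Poisson model is overdispersion: the model forces the variance to equal the mean, and real count data rarely cooperate. The stdlib sketch below (all parameters are illustrative assumptions of mine, not taken from the slides) simulates gamma-Poisson counts whose variance is several times their mean, which is exactly the situation in which naive Poisson standard errors are badly understated.

```python
import math
import random

random.seed(42)

def draw_gamma_poisson(mean, dispersion):
    """One overdispersed count: a Poisson draw whose rate is itself
    gamma-distributed (i.e., a gamma-Poisson / negative-binomial mixture)."""
    rate = random.gammavariate(1.0 / dispersion, mean * dispersion)
    # Knuth's inversion method for a Poisson draw (fine for modest rates)
    limit, k, p = math.exp(-rate), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

draws = [draw_gamma_poisson(mean=5.0, dispersion=1.0) for _ in range(20000)]
m = sum(draws) / len(draws)
v = sum((x - m) ** 2 for x in draws) / len(draws)
# A Poisson model would imply variance ≈ mean; here the mixture implies
# variance ≈ mean + dispersion * mean**2, so roughly 30 against a mean of 5.
print(f"mean={m:.2f}, variance={v:.2f}")
```

Fitting a plain Poisson to data like these would give you the right mean but far-too-small standard errors, which is one standard reason to at least report robust or overdispersion-adjusted inference.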
At the BITSS conference, Andrew Foster also gave an update on the JDE’s pilot of registered reports/pre-results review. Some points he noted:
- They see this process as potentially encouraging people to take on riskier projects where there are states of the world in which the interventions might not work, or where there is a chance that someone else might scoop the paper by doing something similar before you are done – so the insurance function could be useful.
- To date they have had 21 submissions, of which 2 have been accepted, 10 rejected (in an average of 6 weeks), and 9 are under review. The main reasons for rejection are i) concerns that a null result will be imprecise or uninformative; ii) power calculations that seem overly optimistic or insufficiently justified; iii) papers that fail to adequately justify what the contribution to the literature is – making clear that this is not just submitting your pre-analysis plan, but working hard to make clear why we should care about the answer; and iv) not providing sufficient detail on research design.
- He noted a couple of things that differ from registered reports in psychology and other sciences: i) the much longer timelines of many econ studies, and all the associated implementation issues and delays outside the control of researchers that this brings; and ii) that researchers may want to avoid too much publicity about ongoing studies – something I have previously blogged about, and which I am pleased the JDE is being sensitive to.
- He reiterated that they see this as being particularly useful in helping young researchers, and that they do not want to close the door to people taking accepted stage 1 proposals and trying top general interest journals with the paper first (but they may at some point put a statute of limitations on how long you have to do this).
- They will make clearer going forward that they will only consider papers in which at most baseline data has been collected at the time of submission – in the early stages they have had some submissions where data are being collected but not yet analyzed.
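On the power-calculation point above: what referees are pushing back on can be seen in the standard normal-approximation formula for a two-arm trial with equal allocation. The sketch below is generic textbook arithmetic (effect sizes in standard-deviation units), not the JDE’s own procedure:

```python
from statistics import NormalDist

def two_arm_power(n_per_arm, effect_sd, alpha=0.05):
    """Approximate power of a two-sided test comparing two equal-size arms,
    with the treatment effect expressed in standard-deviation units."""
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    se = (2.0 / n_per_arm) ** 0.5  # SE of the difference in standardized means
    return NormalDist().cdf(effect_sd / se - z_crit)

# A 0.2 sd effect needs roughly 393 observations per arm for 80% power;
# halving the sample drops power to around 50% – the kind of optimism
# that gets a stage 1 proposal sent back.
print(round(two_arm_power(393, 0.2), 2))  # ~0.80
print(round(two_arm_power(200, 0.2), 2))  # ~0.52
```

The point of requiring this in a registered report is that the assumed effect size and sample attrition have to be defended before the data come in, not reverse-engineered afterwards.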
- Vox talks with Mushfiq Mobarak on why programs fail to scale.
- Dave Evans takes on the Economist’s claim that too often teachers are the problem (on the Let’s Talk Development blog).
- Job opening with me: I am looking for a field coordinator with experience in running randomized trials and/or in supervising surveys and field implementation to work on a new project on migration from The Gambia. If you are interested, please see the job announcement for more details and apply here.