For papers to have scientific impact, they need to impress our peers in their roles as referees, journal editors, and members of conference committees. Does better writing help our papers make it past these gatekeepers? In this study, we estimate the effect of writing quality on how 30 economists judge papers written by PhD students in economics. Each economist judged five papers in their original version and five different papers that had been language edited; no economist saw both versions of the same paper. Our results show that writing matters: compared to the original versions, economists judge edited versions as higher quality, are more likely to accept them for a conference, and believe they have a better chance of being accepted at a good journal.
Writing experts judged the edited papers as 0.6 standard deviations (SD) better written overall (1.22 points on an 11-point scale). They further judged the language-edited papers as allowing the reader to find the key message more easily (0.58 SD), having fewer mistakes (0.67 SD), being easier to read (0.53 SD), and being more concise (0.50 SD). These large improvements in writing quality translated into smaller but still substantial effects on economists' evaluations. Economists evaluated the edited versions as 0.2 SD better overall (0.4 points on an 11-point scale). They were also 8.4 percentage points more likely to accept the paper for a conference and 4.1 percentage points more likely to believe that the paper would be published in a good economics journal. Our heterogeneity analysis shows that the effects of language editing on writing quality and perceived academic quality are particularly large when the original versions were poorly written.