Sprint Report #6: Enhanced Editorial Reports
Internal statistics in OJS are crucial to improving editorial workflow management. These metrics (e.g., the number of submissions in the system, average time to review, acceptance rates, etc.) support internal evaluations, facilitate reporting to funding agencies, and enable users to report on a journal’s readership and editorial performance. In short, they allow journals to run a “health check.”
Originally raised at our PKP 2017 Sprint in Montréal, work on OJS’s internal editorial reporting is ongoing. Our SFU Sprint in May 2019 enabled us to dive deep again, with sprinters looking specifically at improvements to journal-level reports in OJS 3+. In Montréal, several fields were requested for addition to our existing statistical reports. This work was completed for OJS 3.1.2 and was reviewed as the first task of the SFU Sprint. In particular, this meant going through the table created in 2017 that lists the minimum required fields for two OJS reports: Articles and Reviews.
Reports are generated in CSV format, which is typically opened in a spreadsheet application. Our SFU sprinters discovered that about 80% of the requested fields have been added to the exported CSV files since the Montréal sprint, but that some fields are still pending. They re-prioritized, adding new fields (e.g., Latest Decision and Latest Decision Date) and removing others (e.g., Gender). The updated spreadsheet can be found here.
The group also reworked the Journal Health Reports document, a project sponsored by the University of Minnesota Libraries Publishing Services. Renamed the OJS Editorial Report, this feature will provide at-a-glance statistics for journal editors: it won’t be necessary to download the CSV file in order to check the data, and relevant statistics will be available on the fly, with filter options by date range. Sprint work included adding more fields and filters, specifying what each type of data means and where to find it in the OJS database, and documenting how to calculate average rates and processing times. The updated “OJS Editorial Report” document can be found here, and a GitHub entry to track progress is here.
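To illustrate the kind of calculation involved, here is a minimal sketch of computing an acceptance rate and average days-to-decision from an exported CSV, filtered by date range. The column names (Submission ID, Date Submitted, Latest Decision, Latest Decision Date) and sample data are illustrative assumptions, not the exact headers of the OJS report.

```python
import csv
import io
from datetime import date

# Illustrative sample only: column names and values are hypothetical,
# not the exact headers of the exported OJS report.
SAMPLE_CSV = """\
Submission ID,Date Submitted,Latest Decision,Latest Decision Date
101,2019-01-10,Accept,2019-03-01
102,2019-02-05,Decline,2019-02-25
103,2019-04-12,Accept,2019-06-20
"""

def editorial_summary(csv_text, start, end):
    """Acceptance rate and mean days-to-decision for submissions whose
    Latest Decision Date falls within [start, end]."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    in_range = [
        r for r in rows
        if start <= date.fromisoformat(r["Latest Decision Date"]) <= end
    ]
    if not in_range:
        return None
    accepted = sum(r["Latest Decision"] == "Accept" for r in in_range)
    days = [
        (date.fromisoformat(r["Latest Decision Date"])
         - date.fromisoformat(r["Date Submitted"])).days
        for r in in_range
    ]
    return {
        "submissions": len(in_range),
        "acceptance_rate": accepted / len(in_range),
        "avg_days_to_decision": sum(days) / len(days),
    }

summary = editorial_summary(SAMPLE_CSV, date(2019, 1, 1), date(2019, 12, 31))
print(summary)
```

The same logic applies whether the statistics are computed in a spreadsheet, a script like this, or directly by the report feature itself.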
Statistics and reporting are an ongoing conversation, one we anticipate will need future sprint contributions. Since the SFU Sprint, we have been working on the back end – coding and developing – with continued support and sponsorship from the University of Minnesota Libraries Publishing Services, and extensive review and guidance from SciELO (who have participated in both sprints). Progress on the OJS Editorial Report can be seen on GitHub. Community feedback is welcome.
Thank you to our SFU Sprint team: Alex Mendonça (SciELO Brazil), James MacGregor (Public Knowledge Project), John Barneson (University of Minnesota), and Kevin Hawkins (University of North Texas).