A scientific paper consists of a constellation of artifacts that extend beyond the document itself: software, hardware, evaluation data and documentation, raw survey results, mechanized proofs, models, test suites, benchmarks, and so on. In some cases, the quality of these artifacts is as important as that of the document itself. Last year, 71% of accepted OSDI papers participated in the artifact evaluation process. Based on last year’s success, OSDI ‘23 will continue to run an optional artifact evaluation process combined with USENIX ATC ‘23.
The artifact evaluation process will consider the availability and functionality of the artifacts associated with each paper, along with the reproducibility of the paper’s key results and claims using those artifacts. Artifact evaluation is single-blind. Artifacts will be held in confidence by the evaluation committee.
All (conditionally) accepted OSDI papers are encouraged to participate in artifact evaluation. Because the time between paper acceptance and artifact submission is short, we strongly encourage authors to start preparing their artifacts for evaluation while their papers are still under consideration by the OSDI Program Committee. See the Submitting an Artifact section for details on the submission process.
Questions about the process can be directed to email@example.com.
- Notification for paper authors: Thursday, March 23, 2023
- Artifact registration deadline: Thursday, April 6, 2023, AOE
- Artifact submission deadline: Monday, April 24, 2023, AOE
- Kick-the-tires response period: Monday, May 1 – Sunday, May 7, 2023
- Artifact decisions announced: Thursday, May 25, 2023
- OSDI final papers deadline: Thursday, June 1, 2023
Note: For an artifact to be considered, at least one contact author for the submission must be reachable via email and respond to questions in a timely manner during the kick-the-tires period.
Benefits and Goals
The dissemination of artifacts benefits our science and engineering as a whole. Their availability encourages replicability and reproducibility and enables authors to build on top of each other’s work. It can also help resolve questions about cases not considered by the original authors more definitively. Finally, it confers direct and indirect benefits to the authors themselves.
The goal of artifact evaluation is to incentivize authors to invest in their broader scientific community by producing artifacts that illustrate their claims, enable others to validate those claims, and accelerate future scientific progress by providing a platform for others to start from. A paper with artifacts that have passed the artifact evaluation process is recognized in two ways: first by badges that appear on the paper’s first page, and second by an appendix that details the artifacts.
Eventually, the assessment of a paper’s accompanying artifacts may guide the decision-making about papers: that is, the Artifact Evaluation Committee (AEC) would inform and advise the Program Committee (PC). For now, artifact evaluation will begin only after paper acceptance decisions have already been made. Artifact evaluation is optional, although we hope all papers will participate.
Each paper sets up certain expectations and claims of its artifacts based on its content. The AEC will read the paper and then judge whether the artifacts meet those expectations. Thus, the AEC’s decision will be that the artifacts do or do not “conform to the expectations set by the paper.” Ultimately, the AEC expects that high-quality artifacts will be:
- consistent with the paper
- as complete as possible
- well documented
- easy to reuse, facilitating further research
The AE process at OSDI ‘23 is a continuation of the AE process at OSDI ‘22 and was inspired by multiple other conferences, such as USENIX Security, SOSP, and several SIGPLAN conferences. See artifact-eval.org for the origins of the AE process, and sysartifacts.github.io for previous AE processes held at systems conferences.