An artifact can qualify for one, two, or all three of the following badges, which are defined by the ACM:
To earn this badge, the version of the artifact that was evaluated must be made permanently and publicly available for retrieval. Valid hosting options include publisher repositories (e.g., the ACM Digital Library), institutional repositories (e.g., provided by your university), and open commercial repositories (e.g., GitHub), but not personal webpages. On repositories that can be modified, a stable reference to the evaluated version (e.g., a commit hash) is required. Beyond availability, this badge imposes no further requirements on functionality, correctness, or documentation. Artifacts in this category can include both software systems and datasets.
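For repositories that can be modified, the "stable reference" requirement can be met by tagging the evaluated commit and citing its hash. A minimal sketch using git; the repository setup and the tag name are illustrative, not prescribed by the badge:

```shell
# Illustrative only: create a throwaway repository standing in for the artifact.
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q .
git -c user.email=author@example.org -c user.name="Author" \
    commit -q --allow-empty -m "artifact snapshot"

# Tag the exact version that was submitted for evaluation...
git tag -a artifact-evaluated -m "Version submitted for artifact evaluation"

# ...and record its commit hash, the stable reference to cite in the paper.
git rev-parse --verify "artifact-evaluated^{commit}"
```

Pushing the tag to the public repository (`git push origin artifact-evaluated`) then lets anyone retrieve exactly the evaluated version.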
To earn this badge, all artifacts associated with the paper must undergo an independent audit by members of the AEC, but the artifacts need not be publicly released. The submitted artifacts will be evaluated on three aspects: (i) Documentation: are the artifacts documented well enough to be used independently? (ii) Completeness: do they include all key components described in the paper? (iii) Exercisability: do they include the scripts and data needed to run the experiments described in the paper?
To earn this badge, in addition to the artifact being functional, the main results of the paper must be independently obtained by members of the AEC on the same (or a comparable) setup as the authors'. The authors must provide the artifact (e.g., source code, scripts, input data, and the infrastructure necessary to run the experiments described in the paper), and the reviewers will attempt to reproduce the results using the authors' setup, within the allowed tolerance. The badge is awarded if the reviewers are able to produce results consistent with the main claims of the paper.
Note that the ACM Publications Board interchanged the definitions of the words “reproduced” and “replicated” in July 2020. The text above reflects the new definition.
Artifact Evaluation, especially reproducing results, requires a commitment of time and effort from both authors and reviewers. Please see the Call for Artifacts for more details on the expectations for authors, submitted artifacts, and reviewers.