Artifact Evaluation Track

Authors of accepted research papers are invited to submit an artifact to the ICPE Artifact Track. According to ACM’s “Result and Artifact Review and Badging” policy, an “artifact” is “a digital object that was either created by the authors to be used as part of the study or generated by the experiment itself […] software systems, scripts used to run experiments, input datasets, raw data collected in the experiment, or scripts used to analyze results”. A formal review of such artifacts not only ensures that the “study is repeatable” by the same team; if the artifacts are available online, other researchers “can replicate the findings” as well, and the artifacts can be reused by other teams.

In this spirit, the ICPE 2022 Artifacts Track exists to review, promote, share and catalog the research artifacts produced by full papers accepted to the research track. Artifacts of interest include (but are not limited to):

  • Tools, libraries or frameworks, which are implementations of systems or algorithms essential for the results described in the associated paper, possibly also useful in other work.

  • Data or repositories, which are essential for the results described in the associated paper, ideally also useful in other work. The authors must ensure that the artifacts are available from a stable URL or DOI with an archival plan, such as the SPEC RG Zenodo repository (a personal web page is not sufficient).

If you require an exception from the conditions above, please mail the chairs before submitting.

What do you get out of it?

If your artifact is accepted, it will receive one of the following badges in the text of the paper and in the ACM Digital Library:

  • Artifacts Evaluated - Functional: The artifacts are complete, well-documented, and allow reviewers to obtain the same results as the paper.

  • Artifacts Evaluated - Reusable: As above, but the artifacts are of such a high quality that they can be reused as is on other data sets, or for other purposes.

  • Artifacts Available: For artifacts made permanently available. This will only be awarded in conjunction with one of the Artifacts Evaluated badges.

Regarding archival, all accepted artifacts will be indexed on the conference web site.

How to submit?

Submissions are made via EasyChair.

Submission deadlines are listed on the important dates page.

To facilitate artifact review, we request the submission of a 1-page PDF to the ICPE submission site that includes a brief summary of the artifact, the link to the artifact (can be password protected), as well as the specification of software and hardware requirements. Your artifact should be made available as a link to a single archive file using a widely available compressed archive format (preferably zip or tar.gz). The repository or archive must contain:

  • Paper PDF: The camera-ready version of the accompanying research paper, so that the reviewers can properly judge the artifact.

  • A README file that (1) describes the software and hardware requirements of the artifact, (2) describes the artifact and includes relative links to the files that constitute it, and (3) provides a getting-started guide that enables the reviewers to run, execute, or analyze your artifact, including step-by-step instructions on how to evaluate it.

  • Your Artifact: The artifact itself, which may include, but is not limited to, source code, executables, data, a virtual machine image, and documents (please use open formats for documents). Please make sure that your artifact is self-contained (with the exception of pointers to external tools or libraries, which we will not consider being part of the evaluated artifact, but which we will try to use when evaluating the artifact).

  • [Optional] Video: Optionally, authors are encouraged to submit a link to a short video (maximum 5 minutes) demonstrating the artifact.
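As a rough sketch of the requirements above, an artifact archive might be assembled as follows. All file and directory names here are illustrative assumptions, not prescribed by this call; only the presence of the paper PDF, the README, and the artifact itself is required.

```shell
#!/bin/sh
# Illustrative packaging of an ICPE artifact into a single tar.gz archive.
# Names like "icpe22-artifact", "code/", and "data/" are hypothetical examples.
set -e

mkdir -p icpe22-artifact/code icpe22-artifact/data

# Required: camera-ready paper and a README covering requirements,
# contents, and a getting-started guide (created empty here for the sketch).
touch icpe22-artifact/paper.pdf
touch icpe22-artifact/README.md

# The artifact itself: source code, scripts, datasets, etc.
touch icpe22-artifact/code/run_experiments.sh
touch icpe22-artifact/data/results.csv

# Package everything as one compressed archive in a widely available format.
tar -czf icpe22-artifact.tar.gz icpe22-artifact

# Inspect the archive contents before uploading.
tar -tzf icpe22-artifact.tar.gz
```

Keeping the README at the top level of the archive, with relative links into `code/` and `data/`, makes it easy for reviewers to find their way around without extra instructions.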

To submit an artifact for your accepted ICPE 2022 full research track paper, it is important to keep in mind: a) how accessible you are making your artifact to other researchers, and b) that the ICPE artifact evaluators will have very limited time for assessing each artifact. Artifacts whose configuration and installation take an undue amount of time may be rejected. If you envision difficulties, please provide your artifact in an easily ported form, such as a virtual machine image or a container image.

Review Process and Selection Criteria

Each artifact will go through an anonymous review in which it is evaluated against the expectations set by the paper. Submitted artifacts undergo a two-phase evaluation:

  • Kicking the tires: reviewers check the artifact integrity and look for any possible setup problems that may prevent it from being properly evaluated (e.g., corrupted or missing files, VM does not start, immediate crashes on the simplest example). Authors are informed of the outcome of this phase and can help resolve any issues found during a brief author response period.

  • Artifact assessment: reviewers evaluate the artifacts, checking if they live up to the expectations created by the paper.

Since portability bugs are easy to introduce, the review committee may issue additional requests to authors during the assessment to fix such bugs in the artifact. The resulting version of the artifact should be considered “final” and should allow reviewers to decide about artifact acceptance and badges.

Artifacts will be scored using the following criteria:

Artifacts Evaluated - Functional:

  • Documented: Is it accompanied by relevant documentation making it easy to use?

  • Consistent: Is the artifact relevant to the associated paper, and does it contribute in some inherent way to the generation of its main results?

  • Complete: To the extent possible, are all components relevant to the paper in question included? (Proprietary artifacts need not be included. If they are required to exercise the package then this should be documented, along with instructions on how to obtain them. Proxies for proprietary data should be included so as to demonstrate the analysis.)

  • Exercisable: If the artifact is executable, is it easy to download, install, or execute? Included scripts and/or software used to generate the results in the associated paper can be successfully executed, and included data can be accessed and appropriately manipulated.

Artifacts Evaluated - Reusable:

  • The artifacts associated with the paper are of a quality that significantly exceeds minimal functionality. That is, they have all the qualities of the Artifacts Evaluated – Functional level, but, in addition, they are very carefully documented and well-structured to the extent that reuse and repurposing is facilitated. In particular, norms and standards of the research community for artifacts of this type are strictly adhered to.

Artifacts Available:

  • Author-created artifacts relevant to this paper have been placed on a publicly accessible archival repository. A DOI or link to this repository along with a unique identifier of the object is provided.

Artifact Evaluation Chairs

  • Emma Söderberg, Lund University, Sweden
  • Simon Eismann, University of Würzburg, Germany


Artifact Evaluation Committee