Artifacts are files produced by the test runner, usually screenshots, videos, or traces. The reporter uploads artifacts to a custom S3 bucket, which can be obtained from any S3 provider, such as AWS, DigitalOcean, and others. This makes artifact storage independent from Application: if you decide to stop using it, you still control your data. It is also up to you to clean up old artifacts when you no longer need them.

Using external storage gives you full control over how the storage is used. You can either clean up old test artifacts or, on the contrary, extend the storage to keep the full history for all periods. S3 was chosen as the de-facto standard for file storage, so all cloud providers support it. If you don't have S3 storage yet, you can purchase it from your favorite cloud provider and create a new bucket (storage space) on it. Application has no direct access to the artifacts storage.

Test artifacts can be uploaded to a bucket with public (default) or private access:

  1. The reporter sends a file to S3 storage and sends the link back.
  2. Application displays artifacts by the link in public mode, or by pre-signed links in private mode.

Note: Application won't bill you for stored artifacts, as they are stored in your bucket. If you don't have an S3 bucket yet, it costs no more than $5 a month to purchase one from one of the cloud providers. S3 was chosen as the de-facto standard for file storage in cloud hosting. All popular hosting providers except Microsoft Azure support the S3 protocol for storing files.

If a test fails, the test runner may produce a screenshot, video, or trace of the failed test. These files will be picked up by the reporter, uploaded to the S3 bucket, and attached to the test.


There are two options for setting S3 bucket credentials:

  • via environment variables
  • by connecting to Application

The recommended way is to set S3 bucket credentials as environment variables:

  • S3_REGION - Region where your bucket is located.
  • S3_BUCKET - Bucket name.
  • S3_ACCESS_KEY_ID - Access key.
  • S3_SECRET_ACCESS_KEY - Secret access key.
  • S3_ENDPOINT - Endpoint URL; required for providers other than AWS.

These variables can be stored in a .env file if tests are executed locally, or added to the CI configuration. Please keep S3_SECRET_ACCESS_KEY secret.
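For instance, a local .env file might look like this (all values below are placeholders, not real credentials):

```
S3_REGION=us-west-1
S3_BUCKET=artifacts
S3_ACCESS_KEY_ID=11111111111111111111
S3_SECRET_ACCESS_KEY=2222222222222222222222222222222222222222222
# S3_ENDPOINT is only needed for providers other than AWS
```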

If you use Application, you can set those variables inside Settings > Artifacts and share credentials with the reporter.

In this case the reporter will obtain S3 credentials from the server and use them to save artifacts.


👀 By default, test artifacts are uploaded to the bucket with the public-read permission.

In this case the uploaded files will be publicly accessible on the Internet. These public links will be used by Application, as well as by GitHub and GitLab pipelines, to display images.

➿ To upload files to a bucket with private access, add the TESTOMATIO_PRIVATE_ARTIFACTS=1 environment variable. Then provide the same S3 credentials in the "Settings > Artifacts" section of a project, so Application can connect to the same bucket and fetch the uploaded artifacts.

Links to files will be pre-signed and will expire automatically after 10 minutes.



Minio

It is important to add the S3_FORCE_PATH_STYLE variable for a Minio setup.
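For example, a Minio configuration might look like this (the endpoint and credentials below are the Minio defaults, used as placeholders):

```
S3_ENDPOINT=http://localhost:9000
S3_BUCKET=artifacts
S3_ACCESS_KEY_ID=minioadmin
S3_SECRET_ACCESS_KEY=minioadmin
S3_FORCE_PATH_STYLE=true
```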

Google Cloud Storage

Google Cloud Storage can work via S3 protocol if Interoperability mode is enabled.

Open Cloud Storage. Create a bucket for artifacts, then inside Settings select "Interoperability". Create an Access Key and a Secret Key for the current user.
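With Interoperability enabled, the same environment variables can point at Google Cloud Storage through its S3-compatible endpoint, storage.googleapis.com. A sketch (bucket name and keys are placeholders):

```
S3_ENDPOINT=https://storage.googleapis.com
S3_BUCKET=artifacts
S3_ACCESS_KEY_ID=your-interoperability-access-key
S3_SECRET_ACCESS_KEY=your-interoperability-secret
```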


Adding Artifacts

JavaScript Frameworks

Reporter has built-in support for test artifacts of the following frameworks:

  • Playwright
  • CodeceptJS
  • Cypress
  • webdriverio

If a screenshot, a video, or a trace was saved and attached to a test, the reporter will automatically upload any of these as artifacts.

Usually, artifacts are stored automatically for failed test cases. To send arbitrary files as artifacts, use the addArtifact function from the @testomatio/reporter package.

const { addArtifact } = require('@testomatio/reporter');

// attach a file to the current test by its path
addArtifact('path/to/file.png');


Use this syntax to also add a name for the uploaded artifact:

addArtifact({ name: 'Screenshot', path: './img/file.png' });

A common use case would be to take a screenshot and add it to the current test. Here is sample code for it:

const { addArtifact } = require('@testomatio/reporter');

// inside test
test('my test', async () => {

  // saveScreenshot is a custom function that takes a screenshot,
  // saves it to a file, and returns the path to this file
  const pathToFile = await saveScreenshot();
  // Ensure that the file at `pathToFile` exists and is not a promise.
  addArtifact(pathToFile);
});

If S3 credentials are not set, the artifact file will be ignored.

Warning: addArtifact doesn't work (yet) in parallel mode for the mocha or jest frameworks. We are working on support for these frameworks in upcoming versions.

Other Frameworks

For tests in Java, Python, C#, and other languages that are not (yet) integrated with the reporter, use the JUnit format for reporting.

To attach a file to a test as an artifact, print the file name to the console with the file:// prefix and upload the XML report.

If S3 credentials are set, files will be uploaded to the bucket and attached to tests in the report.

Java Example:

Attaching a screenshot to the Java test

// inside Java test
// assume you created a screenshot
// which is located at `pathToScreenshot`
System.out.println("file://" + pathToScreenshot);

C# Example:

Attaching a screenshot to the C# test

// assume you created a screenshot
// which is located at `pathToScreenshot`
Console.WriteLine("file://" + pathToScreenshot);


Python Example:

Attaching a screenshot to the Python test

print("file://" + pathToScreenshot)


Ruby Example:

Attaching a screenshot to the Ruby test

puts "file://" + path_to_screenshot


I don't have an S3 Bucket

Well then, just get one. Even if your company doesn't provide one, you can purchase S3 storage yourself.

Publishing Artifacts from Docker Container

If your tests are running within a Docker container, pass all the environment variables explicitly:

-e S3_ACCESS_KEY_ID=11111111111111111111 \
-e S3_SECRET_ACCESS_KEY=2222222222222222222222222222222222222222222 \
-e S3_BUCKET=artifacts \
-e S3_REGION=us-west-1 \

Environment variables for S3 are not working

This can be caused by various reasons. As an alternative approach, you can set S3 credentials inside Application and enable shared credentials.

How to clean up old artifacts?

At this moment we don't provide any tools for cleaning up files. It is recommended to write a custom cleanup tool for your S3 bucket.
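As a sketch, a cleanup script could list the objects in the bucket and delete those older than a given age. The helper below only selects the expired keys; the actual listing and deletion would be done with your S3 client of choice (for example, the AWS SDK's ListObjectsV2 and DeleteObjects calls). All names here are illustrative, not part of the reporter API:

```javascript
// Hypothetical helper: given S3 object listings (shaped like the
// Contents array returned by ListObjectsV2), select keys older than
// maxAgeDays for deletion.
function selectExpired(objects, maxAgeDays, now = Date.now()) {
  const cutoff = now - maxAgeDays * 24 * 60 * 60 * 1000;
  return objects
    .filter(obj => obj.LastModified.getTime() < cutoff)
    .map(obj => ({ Key: obj.Key }));
}

// Example listing, as an S3 client would return it:
const listing = [
  { Key: 'run-1/screenshot.png', LastModified: new Date('2020-01-01') },
  { Key: 'run-2/video.webm', LastModified: new Date() },
];

console.log(selectExpired(listing, 30));
// Pass the selected keys to your client's delete call,
// e.g. DeleteObjects with { Delete: { Objects: expired } }.
```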

Contributors: TetianaKhomenko