AI-Powered Features
Testomat.io introduces AI-powered generative features to simplify and enhance your test management workflows. These tools leverage artificial intelligence to assist QA engineers by automating test documentation, generating actionable insights, and providing answers about their projects.
Testomat.io uses Groq (not to be confused with Grok, the model developed by Elon Musk’s xAI) as its main AI provider. Groq was founded in 2016 by a group of former Google engineers.
Groq runs open-source models such as Llama and Mixtral and does not train models of its own. However, we urge you to ensure compliance with data privacy regulations when sharing sensitive information. Enable AI features only if you are sure that your data is not sensitive.
AI-Powered Chat with Tests
The ‘Chat with Tests’ feature is an AI-powered assistant that lets you ask questions about the existing tests in your project. The AI analyzes your test repository and responds with insights, summaries, or clarifications based on the actual test content.
This interactive capability makes it easier to explore, understand, and manage large sets of tests without manually browsing through them.
You can use the ‘Chat with Tests’ feature at the Project or Folder level.
Use ‘Chat with Tests’ Feature at the Project Level
- Go to the ‘Tests’ page.
- Click the ‘Chat with Tests’ AI icon displayed in the header.
- Select a pre-configured AI prompt offered by Testomat.io and update it as needed:
- Summarize this project, list all features tested, separate by sections, use bullet points: use this prompt if you want a short overview of your project.
- Suggest new test cases for the first suite in the project: use this prompt if you want AI to generate new test cases.
- Create plan with 30 tests for smoke testing max. Pick at least one test from each suite, trying to cover most crucial features: use this prompt if you want AI to generate a smoke test plan for you.
OR create your own AI prompt.
- Click the ‘Ask’ button.
Use ‘Chat with Tests’ at the Folder Level
You can also use ‘Chat with Tests’ at the folder level to analyze and summarize information within the selected folder:
- Go to the ‘Tests’ page.
- Select the Folder.
- Click the ‘Chat with Tests’ button.
Summarize Suite Description Based on Test Cases
You can automatically generate a suite description by analyzing the test cases within it. This saves time by eliminating manual suite documentation and ensures that descriptions accurately reflect the test content:
- Go to the ‘Tests’ page.
- Select a Suite with test cases.
- Click the ‘Summarize’ button.
The AI-generated response will include a suggested suite summary and suggested actions.
You can copy the AI-generated response (1) or regenerate it (2); if the suggestion is unsatisfactory, you can also edit and improve it, change its formatting, or add specific sections using the ‘Follow up’ input field (3):
On the ‘Suggested Actions’ side, you can save the description directly to your suite (4).
If your suite already has a description, you can click the ‘Show Diff’ button (5) to compare your current description with the AI’s suggestion.
Suggest Test Cases
You can also use AI to enhance your test coverage by creating additional test cases based on the test cases already in your suite, or based on the Suite description or Requirements. This feature makes it easier to create comprehensive test suites.
- Open a Test Suite that already contains Test Cases.
- Click the ‘Extra menu’ button.
- Select the ‘Suggest Tests’ option.
You can review the suggested tests, select those that align with your needs, and add them directly to the suite. You can also generate more test cases by clicking the ‘Suggest More Tests’ button (1).
Testomat.io recommends adding only the necessary test cases to your suite!
If your test suite is linked to requirements (e.g., a User Story in Jira), AI will suggest checking your existing test cases for redundancy via the ‘Remove Redundant Tests’ button (2).
You can remove redundant test cases directly within the AI-assistance window:
This feature accelerates test creation, enhances coverage by identifying overlooked scenarios, and streamlines workflows by reducing manual effort while maintaining test quality.
Suggest Test Case Description
This feature allows you to create a test case description based solely on its name, or to improve a description you previously added to your test case.
- Open a Test Case.
- Click the ‘Suggest Description’ button.
Generate Test Case Description Based on Test Code
Use AI to analyze your test code and produce detailed test descriptions. This bridges the gap between technical code and human-readable documentation, improving collaboration between technical and non-technical team members:
- Go to the ‘Tests’ page.
- Select a Test Case with code.
- Click the ‘Write Description from Code’ button.
A Test Summary is created:
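To make the idea concrete, here is a toy sketch (plain Node.js, not Testomat.io’s actual implementation, which uses AI) of how an automated test’s steps can be turned into a human-readable summary. The test code, selectors, and comments below are all hypothetical:

```javascript
// A sample automated test, represented as its source text.
// (Hypothetical Playwright-style test; used here only as input data.)
const testCode = `
test('user can log in', async ({ page }) => {
  await page.goto('/login');            // open the login page
  await page.fill('#email', 'a@b.com'); // enter credentials
  await page.click('#submit');          // submit the form
  await expect(page).toHaveURL('/dashboard');
});
`;

// Naive "analysis": extract the inline comments as plain-language steps.
// The real feature understands the code itself, not just its comments.
const steps = [...testCode.matchAll(/\/\/\s*(.+)$/gm)].map(m => m[1].trim());

// Assemble a numbered, human-readable summary of the test.
const description = 'Steps:\n' + steps.map((s, i) => `${i + 1}. ${s}`).join('\n');
console.log(description);
```

The point of the sketch is only the direction of the mapping: technical test code in, readable documentation out.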
Generate Code Based on Test Case Description
Provide a test description, and the AI generates the corresponding test automation code. Please note that the generated code may not be fully comprehensive.
The code is generated based on the project framework settings and other tests in the suite. Use it as boilerplate code only.
To check your Project framework settings, go to the Project Settings page:
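The opposite direction, from description to code, can be sketched the same way. The generator below is a toy stand-in (not Testomat.io’s AI, which also takes your framework settings and sibling tests into account); the test name and steps are invented for illustration:

```javascript
// Toy sketch: turn a test case description (name + steps) into
// boilerplate automated-test code. The real AI output is richer,
// but should likewise be treated as boilerplate to refine by hand.
function generateBoilerplate(name, steps) {
  const body = steps
    .map(step => `  // TODO: implement step: ${step}`)
    .join('\n');
  return `test('${name}', async () => {\n${body}\n});`;
}

// Hypothetical test case description as input:
const code = generateBoilerplate('User can reset password', [
  'Open the password reset page',
  'Submit a registered email address',
  'Verify that a confirmation message is shown',
]);
console.log(code);
```

Each step becomes a TODO in the test body, which mirrors how you would use the generated code: as a scaffold to fill in, not a finished test.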
Generate Bug Description Based on the Test Case
When you execute tests and create a new defect, Testomat.io will automatically suggest a concise, context-aware bug title and description. These suggestions are based on the test case content and its execution results, helping teams report issues faster and more consistently.
Why this is useful:
- Speeding up defect logging: Testers can instantly use or refine AI-suggested bug details, reducing time spent writing repetitive or obvious issue reports.
- Maintaining consistent bug reporting standards: The AI helps standardize descriptions across team members, which improves clarity and communication with developers.
- Assisting less experienced testers: Junior team members or non-technical testers can rely on AI-generated suggestions as a starting point, ensuring important details aren’t missed.
Analyze Failed Automated Test Cases
Use AI to analyze your failed automated tests to understand and summarize the main reasons why your tests fail.
- Go to the ‘Runs’ page.
- Open a finished automated run.
- Click the ‘Clusterize Errors’ button.
Example of error clusterization:
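Conceptually, clusterization groups failures whose error messages differ only in details (selectors, timeouts, ids) into a few root-cause buckets. The sketch below illustrates that idea with naive message normalization; the real feature uses AI, and the failures listed are hypothetical:

```javascript
// Hypothetical failed tests with their raw error messages.
const failures = [
  { test: 'login shows dashboard', error: 'TimeoutError: waiting for selector "#submit" failed: 5000ms' },
  { test: 'signup shows welcome',  error: 'TimeoutError: waiting for selector "#register" failed: 5000ms' },
  { test: 'cart total updates',    error: 'AssertionError: expected 3 to equal 4' },
];

// Normalize messages: strip quoted selectors and numbers so that
// errors differing only in those details collapse into one cluster.
const normalize = msg => msg.replace(/"[^"]*"/g, '"…"').replace(/\d+/g, 'N');

// Group tests by their normalized error message.
const clusters = new Map();
for (const f of failures) {
  const key = normalize(f.error);
  if (!clusters.has(key)) clusters.set(key, []);
  clusters.get(key).push(f.test);
}

// Print each cluster: two timeouts collapse together, the assertion stands alone.
for (const [error, tests] of clusters) {
  console.log(`${tests.length} test(s): ${error}`);
}
```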
Explain Autotest Failures Based on Logs
Using the stack trace, test code, execution logs, and a screenshot of the failure, AI will identify and explain the reasons behind failures. This helps reduce debugging time by providing actionable insights directly within the Testomat.io UI. It also offers possible fixes.
As in the previous case, this is available only for finished automated runs with 5+ failures.
- Go to the ‘Runs’ page.
- Open a finished automated run.
- Click on the Failed Test Case.
- Click the ‘Explain Failure’ button.
Test Run Summary
Testomat.io lets you use an AI-powered feature to analyze and summarize your finished test runs. It highlights risk areas and provides recommendations for improvements based on test results.
- Go to the ‘Runs’ page.
- Select a finished test run for statistics analysis.
- Click the ‘Run Summary’ button.