Why AI Test Generation?
TestMap.ai’s AI Test Generation feature turns work items from your existing tools—like GitHub PRs and issues—into test cases automatically. You connect an integration once, define which items to process and how often, and the system generates test cases in your TestMap project. This post walks you through the full path: integration → configuration → execution → history.
Step 1: Add an Integration
Everything starts in Organization Settings → Integrations. Here you connect TestMap.ai to the platform that hosts your work items.
- In Add New Integration, choose the platform (currently GitHub is available; Azure DevOps and Jira are coming soon).
- Click GitHub, then complete the form with your credentials and options (e.g., token, repositories).
- After saving, the integration appears under Your Integrations and can be used for AI Test Generation.
Important: You need at least one integration set up before you can create an AI test generation job.
Step 2: Configure the AI Test Generation Job
Go to Organization Settings → AI Test Generation. The flow is two steps: choose the integration, then define the job.
Choose the integration
Select which integration to use (e.g., the GitHub integration you just created). Once selected, you can proceed to the configuration form.
Define the job
You’ll set:
- Name — A label for this configuration (e.g., “Repo X PRs”).
- Work item types / Labels — For GitHub, choose the PR or Issue labels to match. At least one label is required.
- Statuses — When applicable, which item statuses to include.
- Timer interval — How often the job runs automatically: every 5 min, 15 min, 30 min, 1 hr, 6 hr, 12 hr, or once per day.
- TestMap project — The project where generated test cases will be created.
- Enabled — Whether the configuration is active for manual and scheduled runs.
You can also turn on options such as including AI-generated test cases, including existing test cases with a similarity threshold, and running test runs automatically after generation. When you save, the configuration appears in your list; you can edit, enable/disable, or delete it anytime.
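To make the fields concrete, the configuration above can be pictured as a simple object. This is an illustrative sketch only — the field names, types, and defaults are assumptions for the example, not TestMap.ai's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class TestGenJobConfig:
    """Hypothetical shape of an AI Test Generation configuration."""
    name: str                          # e.g., "Repo X PRs"
    labels: list[str]                  # GitHub PR/Issue labels; at least one required
    project: str                       # target TestMap project for generated cases
    statuses: list[str] = field(default_factory=list)  # optional status filter
    interval_minutes: int = 60         # 5, 15, 30, 60, 360, 720, or 1440
    enabled: bool = True               # active for manual and scheduled runs
    include_existing: bool = False     # include existing test cases
    similarity_threshold: float = 0.8  # used when include_existing is on
    auto_run_tests: bool = False       # run test runs after generation

    def __post_init__(self):
        if not self.labels:
            raise ValueError("At least one label is required")

config = TestGenJobConfig(name="Repo X PRs", labels=["bug"], project="Web App")
```

The `__post_init__` check mirrors the rule that a configuration cannot be saved without at least one label.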
Step 3: How the Job Runs
Jobs can run in two ways:
Manual execution
On the AI Test Generation screen, use the Run button on a saved configuration, or go to AI Test Generation → Jobs and start a job by choosing a configuration. The system creates a new job, processes work items according to your settings (labels, statuses, etc.), and generates test cases in the selected TestMap project.
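Conceptually, a run walks the integration's work items and keeps only those that match your configured labels and statuses. A rough sketch of that selection step (the work-item shape and field names here are assumptions for illustration):

```python
def matches(item: dict, labels: list[str], statuses: list[str]) -> bool:
    """Keep a work item if it carries at least one configured label
    and, when statuses are configured, an allowed status."""
    if not set(item.get("labels", [])) & set(labels):
        return False
    if statuses and item.get("status") not in statuses:
        return False
    return True

items = [
    {"id": 1, "labels": ["bug"], "status": "open"},
    {"id": 2, "labels": ["docs"], "status": "open"},
    {"id": 3, "labels": ["bug"], "status": "closed"},
]
selected = [i for i in items if matches(i, ["bug"], ["open"])]
# only item 1 carries the "bug" label AND the "open" status
```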
Automatic (scheduled) execution
If the configuration is enabled and has a timer interval, the system schedules the job to run at that frequency (e.g., every hour). When the time comes, a new job is created and executed with the same configuration. Disabled configurations are not run until you turn them back on.
Step 4: Job History and Filters
Under Organization Settings → AI Test Generation → Jobs you see the execution history: status (e.g., completed, failed, running), configuration used, project, integration, number of tests generated, and date/time.
Filters
- Status — Filter by job status (completed, failed, running, etc.).
- Config — Filter by the saved configuration name.
- Only jobs with processed items — When on (default), the list shows only jobs that actually processed items (generated at least one test case or had items marked as processed). Jobs that only had skipped or ignored items are hidden. Turn it off to see all jobs for the organization.
The list is paginated (e.g., 10 jobs per page). You can move between pages and see the total that match your filters (“Showing X–Y of Z jobs”).
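The filters and pagination above can be sketched as two small functions. This is an illustrative model only — the job fields and function names are assumptions, not TestMap.ai's API:

```python
def filter_jobs(jobs, status=None, config=None, only_processed=True):
    """Mirror the history filters: status, config name, and the
    'only jobs with processed items' toggle (on by default)."""
    out = jobs
    if status:
        out = [j for j in out if j["status"] == status]
    if config:
        out = [j for j in out if j["config"] == config]
    if only_processed:
        out = [j for j in out if j["processed_items"] > 0]
    return out

def page(jobs, number, size=10):
    """Return one page plus a 'Showing X-Y of Z jobs' label."""
    start = (number - 1) * size
    chunk = jobs[start:start + size]
    label = f"Showing {start + 1}-{start + len(chunk)} of {len(jobs)} jobs"
    return chunk, label

jobs = [{"status": "completed", "config": "Repo X PRs", "processed_items": n % 3}
        for n in range(25)]
visible = filter_jobs(jobs, status="completed")  # drops jobs with 0 processed items
chunk, label = page(visible, 1)                  # first page of 10
```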
Quick Summary
- Integration — In Organization Settings → Integrations, add an integration (e.g., GitHub).
- Job configuration — In Organization Settings → AI Test Generation, choose the integration and set name, labels/item types, timer interval, TestMap project, and options; save.
- Execution — Run manually from the UI or let the system run automatically on the defined interval when the config is enabled.
- History — In AI Test Generation → Jobs, view jobs and use Status, Config, and “Only jobs with processed items” to focus on runs that generated tests.
With this flow, you keep integrations, job settings, and job history in one place—and spend less time on manual test case creation.
Ready to try it?
Set up your first integration and AI test generation job in TestMap.ai. Free until Feb 28, 2026.
Get Started with TestMap.ai