Pentest Robots are available on all paid plans.
What are pentest robots?
Pentest robots are automation workflows that run multiple security tools in sequence. Each robot defines a series of scans where output from one tool feeds into the next. You configure a target, start the robot, and it handles the rest. Robots can be shared with team members. See Sharing robots for details.
Default robots
Several pre-built robots are included on all plans:
| Robot | What it does |
|---|---|
| The HTTP Lockpicker | Scans for web services, crawls for login interfaces, then brute-forces them |
| Domain Recon | Discovers subdomains, port scans them, and runs Website Recon on all HTTP/S ports. Populates the Attack Surface |
| All Domains Recon | Extended version of Domain Recon. Discovers company domains (filtered by certainty >= 80%), then runs the full subdomain/port/recon chain |
| Log4Shell Detector (CVE-2021-44228) | Discovers web apps, crawls pages and forms, and injects payloads to test for CVE-2021-44228 |
| Treasure Hunter (domain) | Finds subdomains, port scans each one, then runs URL Fuzzer on all HTTP/S ports to discover hidden files and directories |
| Treasure Hunter (host) | Port scans the target host (top 1000), then runs URL Fuzzer on each HTTP/S port |
| Auto HTTP Login Bruteforcer | Discovers password-protected URLs (HTTP 401) and brute-forces them with common credentials across all HTTP/S ports |
| Website Scanner - All Ports | Discovers all HTTP/S ports (1-65535), then runs Website Scanner on each |
| Website Scanner - Top 1000 Ports | Discovers HTTP/S ports (top 1000), then runs Website Scanner on each |
| Deep WordPress Scan | Runs Website Scanner first, then if WordPress is detected, runs the WordPress Scanner for CMS-specific vulnerabilities |
| Network Scanner - Critical CVEs (domain) | Discovers subdomains, identifies machines behind the domain, runs Sniper Detection Modules on each |
| Network Scanner - Full (domain) | Same as above, but runs Network Scanner with OpenVAS Full & Fast plus Sniper Detection Modules |
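The chaining pattern these robots share can be sketched as a simple pipeline: each tool's output becomes the next tool's target list. The functions below are illustrative stand-ins (no real scans happen), not the platform's API.

```python
# Hypothetical sketch of a Domain Recon-style robot: discover subdomains,
# port scan each, then run recon on every HTTP/S service found.
# All three "tools" are hard-coded stand-ins for illustration only.

def find_subdomains(domain):
    """Stand-in for Subdomain Finder: returns discovered hostnames."""
    return [f"www.{domain}", f"mail.{domain}"]

def scan_ports(host):
    """Stand-in for a port scanner: returns open ports for a host."""
    return [80, 443]

def run_recon(host, port):
    """Stand-in for Website Recon on one HTTP/S service."""
    return f"recon of {host}:{port}"

def domain_recon_robot(domain):
    results = []
    for host in find_subdomains(domain):       # step 1: discovery
        for port in scan_ports(host):          # step 2: port scan
            if port in (80, 443, 8080, 8443):  # filter: HTTP/S ports only
                results.append(run_recon(host, port))  # step 3: recon
    return results

print(domain_recon_robot("example.com"))
```

Every robot in the table above is a variation on this shape: a discovery step, one or more filters, and a scanning step fed by the filtered results.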
Running a robot
Pick a robot from the list, set your target, and start it. The robot runs each tool in its workflow automatically.
Scheduling robots
Robots can run on a schedule.
Enable scheduling
Toggle the scheduling option and configure:
- Frequency: how often the robot should run (once, daily, weekly, monthly, quarterly, yearly)
- Start date: when to begin the schedule
Scheduled robot scans appear in your Scheduled Scans list where you can manage, pause, or cancel them.
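As a rough illustration, the frequency options above map to a next-run date like this. This is a minimal sketch, not the platform's actual scheduling logic; month-end clamping (e.g. Jan 31 + 1 month) is deliberately omitted.

```python
# Illustrative next-run calculation for the listed frequencies.
from datetime import date, timedelta

def next_run(start: date, frequency: str):
    """Return the run after `start`, or None for a one-off schedule."""
    if frequency == "once":
        return None
    if frequency == "daily":
        return start + timedelta(days=1)
    if frequency == "weekly":
        return start + timedelta(weeks=1)
    # monthly, quarterly, and yearly advance by whole months
    months = {"monthly": 1, "quarterly": 3, "yearly": 12}[frequency]
    total = start.month - 1 + months
    return start.replace(year=start.year + total // 12, month=total % 12 + 1)

print(next_run(date(2024, 11, 15), "quarterly"))  # 2025-02-15
```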
Managing robots
Viewing robot details
Click on any robot to see:
- Description of what the robot does
- The workflow and tools it uses
- Available actions (edit, delete) depending on your permissions
Editing robots
For robots you own, you can update the name and description. Renaming a robot also updates the name on all its past scan runs in the Scans history.
Deleting robots
Deleting a robot removes the robot configuration. Past scan runs are not deleted. They remain on the Scans page but show “[Robot deleted]” as their name. To remove a past run, delete it from the Scans page.
Robot execution
Tools execute in the defined order. Some robots include filters that control which results trigger subsequent steps.
Node types
The workflow diagram has four node types:
- Tool: runs one or more scans. Shows status and finding counts as work progresses.
- Filter: tests conditions against each result from the previous tool. Only matching results move to the next step. The node shows how many results passed through.
- Extractor: pulls new targets out of a tool’s results to feed the next tool. For example, it might pull hostnames from Subdomain Finder results, or build URLs from a port scan’s list of open HTTP ports.
- Reducer: works like an extractor, but processes all scan results together before producing targets. Its primary use is deduplication: when Subdomain Finder returns multiple subdomains that resolve to the same IP, a reducer picks one hostname per IP to avoid scanning the same server twice.
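The filter, extractor, and reducer behaviors can be sketched in a few lines. Everything below is illustrative: the domain data is made up, and the resolver is a hard-coded dict standing in for real DNS resolution.

```python
# Illustrative sketches of filter, extractor, and reducer nodes.

def filter_results(results, predicate):
    """Filter: only results matching the condition reach the next step."""
    return [r for r in results if predicate(r)]

def extract_urls(open_ports):
    """Extractor: build URLs from a port scan's open HTTP/S ports."""
    scheme = {80: "http", 8080: "http", 443: "https", 8443: "https"}
    return [f"{scheme[p]}://{host}:{p}"
            for host, ports in open_ports.items()
            for p in ports if p in scheme]

def reduce_by_ip(hostnames, resolve):
    """Reducer: keep one hostname per resolved IP so the same server
    is never scanned twice."""
    seen = {}
    for name in hostnames:
        seen.setdefault(resolve(name), name)  # first hostname per IP wins
    return list(seen.values())

# Filter, as in All Domains Recon's certainty >= 80% step:
domains = [{"name": "example.com", "certainty": 95},
           {"name": "example.org", "certainty": 60}]
print(filter_results(domains, lambda d: d["certainty"] >= 80))

# Reducer: two subdomains resolve to the same IP; one survives.
ips = {"www.example.com": "192.0.2.1",
       "web.example.com": "192.0.2.1",
       "mail.example.com": "192.0.2.2"}
print(reduce_by_ip(ips, ips.get))

# Extractor: open ports 80 and 443 become URLs; port 22 is dropped.
print(extract_urls({"www.example.com": [80, 443, 22]}))
```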
Monitoring progress
The scan result page shows the robot workflow as an interactive diagram. Each node updates as the robot runs. Tool nodes show the tool name, a status indicator, and finding counts by severity. When scans start for that step, an “X / Y Finished Scans” button appears. Click it to open a panel listing each individual scan with its target, status, and finding summary. From that panel, click any scan to open its full result in a new tab. Scans that failed to start appear in the same panel with their error message. You can pan the diagram by clicking and dragging. Use the + and − buttons to zoom.
Results
When the robot finishes, the diagram shows the final state with finding counts at each tool step. To see findings in detail, click the Finished Scans button on any tool node to open individual scans from that step.
Scans and reports
Each robot run appears as a single entry on the Scans page. The individual tool scans that run in the background are hidden from the list. When all tools finish and you have email notifications configured, you get one email notification with an aggregated PDF report attached. The report is also saved to the Reports page. Configure notification rules on the Notifications page. Deleting a robot run from the Scans page removes the entire run and all its child scans. No warning is shown before this happens.
Example use cases
| Goal | Robot to use |
|---|---|
| Map a new domain’s exposure | Domain Recon or All Domains Recon |
| Find hidden files and sensitive data | Treasure Hunter (domain or host) |
| Regular web vulnerability checks | Schedule Website Scanner - Top 1000 Ports weekly |
| Test for a specific CVE | Log4Shell Detector |
| Full web + CMS assessment | Deep WordPress Scan (if WordPress) or Website Scanner - All Ports |
Sharing robots
You can share robots with team members. Shared robots appear alongside default robots, so team members can run them on their own targets.
Permission levels
| Permission | What the team member can do |
|---|---|
| No access | Cannot see or use your robots |
| View | Can view and run robots, but not edit or delete |
| Edit | Full access to view, run, edit, and delete robots |
How to share robots
- Go to Team in the sidebar
- Select the team members you want to configure sharing for
- Click Share
- Set the Robots permission level
- Click Save
Best practices
Start with default robots
The pre-built robots cover common use cases. Try them before building custom workflows.
Watch your scanned assets quota
Robots can generate many scans across many targets, especially domain-level robots that discover subdomains first. Each unique target counts against your scanned assets limit. Check your quota before running broad discovery robots.
Use scheduling for monitoring
Set up recurring robot runs to catch new vulnerabilities as they appear.
Test on non-production first
Verify robot behavior on test assets before running against production systems.
Share robots with your team
Give teammates View or Edit access so everyone can run the same proven workflows on their own targets. See Sharing robots above for the steps.