## Accessing the Performance Report
Navigate to More Reports > Performance Report from the left sidebar. Select your team and date range — the report loads automatically.
## What the Report Includes
The Performance Report compiles key metrics into a single, readable summary.
### DORA Metrics Summary

| Metric | What's shown |
| --- | --- |
| Change Lead Time | 85th percentile time from first commit to deploy, with a breakdown by stage |
| Deployment Frequency | Total deployments and average deployments per week |
| Change Failure Rate | Percentage of deployments that included a hotfix |
| Mean Time to Recovery (MTTR) | Average incident resolution time (requires PagerDuty or OpsGenie) |
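Several of these metrics are percentile-based rather than averages. As an illustrative sketch of how an 85th-percentile figure such as Change Lead Time can be computed from raw data (the function name, sample values, and nearest-rank method are assumptions for illustration, not necessarily how the report computes it):

```python
def percentile_85(values):
    """Nearest-rank 85th percentile of a list of numbers."""
    ordered = sorted(values)
    # Nearest-rank method: ceil(0.85 * n) gives the 1-based rank.
    rank = -(-85 * len(ordered) // 100)  # ceiling division via negation
    return ordered[rank - 1]

# Hypothetical lead times in hours, first commit to deploy.
lead_times_hours = [4, 6, 8, 12, 20, 30, 48, 72, 96, 120]
p85 = percentile_85(lead_times_hours)  # 96 hours with this sample
```

The nearest-rank method is only one of several percentile definitions; tools differ in whether and how they interpolate between ranks.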
### PR Metrics

| Metric | What's shown |
| --- | --- |
| PRs Merged | Total count for the period |
| Review Time | 85th percentile time from PR opened to merged |
| Throughput | PRs merged per developer per week |
| First Response Time | 85th percentile time until the first reviewer action |
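Throughput normalizes merge volume by team size and period length. A minimal sketch of that calculation, using hypothetical numbers rather than values from a real report:

```python
def throughput(prs_merged, developers, period_days):
    """PRs merged per developer per week over a reporting period."""
    weeks = period_days / 7
    return prs_merged / developers / weeks

# Example: 84 PRs merged by 6 developers over a 28-day period.
rate = throughput(84, 6, 28)  # 3.5 PRs per developer per week
```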
### Trend Indicators
Each metric includes a trend arrow showing whether performance improved, declined, or stayed flat compared to the previous period of equal length.
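A trend arrow of this kind amounts to comparing the current period against the previous period of equal length. The sketch below is illustrative: the function name, the 2% "flat" threshold, and the arrow labels are assumptions, not the report's actual logic.

```python
def trend_arrow(current, previous, lower_is_better=False, flat_pct=2.0):
    """Return 'up' (improved), 'down' (declined), or 'flat' for a metric."""
    if previous == 0:
        return "flat"  # no baseline to compare against
    change_pct = (current - previous) / previous * 100
    if abs(change_pct) < flat_pct:
        return "flat"
    # For lead time, failure rate, etc., a decrease is an improvement.
    improved = change_pct < 0 if lower_is_better else change_pct > 0
    return "up" if improved else "down"

# Lead time dropped from 40h to 30h: lower is better, so this is "up".
arrow = trend_arrow(30, 40, lower_is_better=True)
```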
## Viewing a Report
1. Go to More Reports > Performance Report.
2. Select the team (or All Teams).
3. Choose a date range: preset options include Last 7 Days, Last 30 Days, Last Quarter, and Custom.
4. The report loads automatically with your selected filters.
## Downloading the Report
Click the Download button to save the report. Use this to share with leadership, attach to retrospective notes, or archive for future comparison.
Tips:

- Download before changing filters if you want to keep a snapshot of the current view.
- The downloaded report reflects the data at the time of download; it does not update afterward.
## Interpreting the Data

### What good looks like
| Metric | Strong performance | Needs work |
| --- | --- | --- |
| Change Lead Time | < 1 day | > 1 week |
| Deployment Frequency | Multiple times per day | Less than weekly |
| Change Failure Rate | < 5% | > 15% |
| MTTR | < 1 hour | > 1 day |
| Review Time | < 24 hours | > 3 days |
These benchmarks come from the DORA State of DevOps research; your team's targets may vary based on context. For benchmarks based on Haystack's own aggregated data from 2000+ teams, see the Benchmarks page in your dashboard.
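The lower-is-better rows of the benchmark table can be captured in a small lookup. The following is an illustrative sketch only: the metric keys, units, and band names are my own naming, and Deployment Frequency is omitted because higher is better for that metric.

```python
# Thresholds transcribed from the benchmark table, in hours or percent.
# Each entry is (strong_if_below, needs_work_if_above).
BENCHMARKS = {
    "change_lead_time_hours": (24, 168),   # < 1 day strong, > 1 week weak
    "change_failure_rate_pct": (5, 15),
    "mttr_hours": (1, 24),
    "review_time_hours": (24, 72),
}

def rate_metric(name, value):
    """Classify a metric value as 'strong', 'middle', or 'needs work'."""
    strong, weak = BENCHMARKS[name]
    if value < strong:
        return "strong"
    if value > weak:
        return "needs work"
    return "middle"

assessment = rate_metric("change_failure_rate_pct", 12)  # "middle"
```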
### Common patterns
- High lead time but good review time: Development Time is the bottleneck. Look at PR size and branch lifetime.
- Good throughput but high failure rate: the team is shipping fast but introducing regressions. Tighten testing and review practices.
- Low deployment frequency with fast lead time: deployments may be batched. Consider smaller, more frequent releases.
## FAQ
Q: Can I schedule reports to be generated automatically?
A: Automatic scheduling is not currently available. Open the Performance Report page and download when needed.

Q: What date range should I use?
A: For regular check-ins, Last 30 Days provides a stable view. For retrospectives, match the sprint or quarter length. Avoid very short ranges (< 7 days), as they may not have enough data points.

Q: Can Members view and download reports, or only Admins?
A: Both Admin and Member roles can view and download Performance Reports.

Q: Why is MTTR missing from my report?
A: MTTR requires a PagerDuty or OpsGenie integration. If neither is connected, the MTTR section will not appear in the report.