Value Stream Analytics
Value Stream Analytics traces each Jira issue’s journey from creation to completion, measuring how long work spends in development, code review, merge, and testing phases. It combines data from Jira, GitLab, and GitHub into a single timeline so you can see exactly where time is spent — and where it gets stuck.
Use it to answer questions like:
- How long does it take our team to ship a typical story?
- Are merge requests sitting in review for days before anyone looks at them?
- Which epic has the longest cycle time, and what is causing the delay?
Getting Started
Navigate to Value Stream in the sidebar. The page expects two kinds of data sources to be configured and synced:
- Atlassian (Jira) — provides issue lifecycle events (creation, status changes, assignments, comments, resolution).
- GitLab and/or GitHub — provide development events (commits, branches, merge requests, code reviews).
If either source type is missing, the report will still run, but its metrics will be incomplete. For example, without GitLab or GitHub data the development and review phases cannot be measured.
Selecting Issues
The search bar at the top of the page offers two modes, selectable from a dropdown:
- JQL — type a Jira Query Language expression directly. For example, `project = NXP AND issuetype = Story AND status changed to "Done" after -30d` returns stories completed in the last 30 days. Press Enter or click Run report to submit.
- Filter — select from your organization’s saved Jira filters. The dropdown is searchable — start typing to narrow the list.
Choose the mode that fits your workflow. JQL gives full flexibility; saved filters are convenient for queries your team runs repeatedly.
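If your team runs the same JQL with different windows, it can help to template it. A minimal illustrative sketch (the helper name is hypothetical; the JQL itself matches the example above):

```python
def completed_stories_jql(project: str, days: int) -> str:
    """Build a JQL expression for stories completed in the last `days` days."""
    return (
        f'project = {project} AND issuetype = Story '
        f'AND status changed to "Done" after -{days}d'
    )
```

Calling `completed_stories_jql("NXP", 30)` reproduces the example query shown above.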
Running the Report
Two action buttons appear next to the search bar:
- Run report — fetches the matching issues, resolves their parent-child hierarchy, collects all related activity events from every configured data source, and computes cycle time metrics. The results appear immediately below.
- Analyze with AI — does everything Run report does, then sends the structured data to the configured LLM provider (Claude or OpenAI) for a written analysis. The AI report includes bottleneck identification, review effectiveness observations, and prioritized recommendations.
URL parameters update as you interact with the controls, so you can bookmark or share a specific report configuration.
Understanding the Report
Summary Panel
Two cards appear at the top of the report:
Aggregate Metrics shows a quick overview of the result set:
| Metric | Description |
|---|---|
| Issues analyzed | Total number of issues returned by the query |
| Completed | Issues whose status is Done, Closed, or Resolved |
| Avg cycle time | Average total cycle time across all issues with metrics |
| Avg dev time | Average development phase duration |
| Avg review wait | Average time from MR/PR creation to first review |
| Avg review time | Average time from first review to approval or merge |
| Avg merge time | Average time from approval to merge |
| Avg QA time | Average time in testing/QA phase |
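As a sketch of how such averages might be computed, assuming durations are stored in hours with `None` for issues where a phase never occurred (helper names are illustrative):

```python
def average_hours(durations):
    """Average over issues that actually have a value for this metric."""
    values = [d for d in durations if d is not None]
    return sum(values) / len(values) if values else None

def format_duration(hours: float) -> str:
    """Render hours as a days/hours label, e.g. 123 -> '5d 3h'."""
    days, rem = divmod(int(hours), 24)
    return f"{days}d {rem}h" if days else f"{rem}h"
```

Averaging only over issues that have the metric keeps a phase that most workflows skip from dragging the average toward zero.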
Longest Cycle Times lists the three issues with the highest total cycle time. Click any of them to scroll directly to that issue in the list below.
Issue Hierarchy List
Below the summary, issues are displayed in a grid with the following columns:
- Issue info — Jira key (links to Jira), summary, status badge, issue type, and assignee.
- Progress — for parent issues, a progress bar showing how many children are completed out of the total (e.g., 3/5).
- Cycle time — the total cycle time for the issue. Parent issues show an aggregated “span” value (see Parent Metric Aggregation below).
- Details toggle — click the chart icon to expand the detail view for that issue.
Issues are organized in a tree structure based on Jira’s parent-child relationships. Parent issues (epics, initiatives) can be collapsed or expanded by clicking the chevron. Children appear indented beneath their parent.
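One plausible way to derive that tree from a flat result set, assuming each issue record carries its Jira key and an optional parent key (the field names here are assumptions):

```python
def build_tree(issues):
    """Group issues into root keys and a parent -> children map.

    `issues` is a list of dicts with 'key' and optional 'parent'.
    Issues whose parent is outside the result set are treated as roots.
    """
    keys = {issue["key"] for issue in issues}
    children, roots = {}, []
    for issue in issues:
        parent = issue.get("parent")
        if parent in keys:
            children.setdefault(parent, []).append(issue["key"])
        else:
            roots.append(issue["key"])
    return roots, children
```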
Detail View
Expanding any issue reveals two sections:
Cycle Metrics Bar — a color-coded horizontal bar that visualizes how the total cycle time breaks down across phases:
| Color | Phase |
|---|---|
| Blue | Development |
| Amber | Review wait |
| Purple | Review |
| Green | Merge |
| Orange | QA / Testing |
A legend below the bar shows each phase with its duration. Phases with zero duration are omitted.
Timeline — a chronological list of every event associated with the issue. Each entry shows the timestamp, source tag (Jira, GitLab, or GitHub), event type, the person who performed the action, and a detail summary. For example:
2026-02-06 03:57 [GitLab] Commit pushed by Alice Johnson: feat: Add startup options (+75 -3)
2026-02-06 05:01 [Jira] Issue assigned by Alice Johnson: Assigned to Bob Smith
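Conceptually, the timeline is the union of the per-source event lists sorted by timestamp. A sketch, assuming each event is a `(timestamp, source, description)` tuple with ISO-8601 timestamps (which sort correctly as plain strings):

```python
def merge_timeline(*event_lists):
    """Merge events from Jira, GitLab, and GitHub into one chronological list."""
    return sorted(
        (event for events in event_lists for event in events),
        key=lambda event: event[0],  # sort by timestamp only; ties keep input order
    )
```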
Parent Metric Aggregation
When a parent issue (such as an epic) has children, its cycle time is computed as the span of all descendant events — not an average. The system collects every event from the parent and all of its children and grandchildren, merges them into a single chronological timeline, and then computes cycle metrics from that combined sequence.
This means a parent’s total cycle time reflects the wall-clock time from when the first child started work to when the last child was completed. The cycle time column shows the label “span” to distinguish this from a leaf issue’s own metrics.
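Under that definition, the span reduces to first-event-to-last-event wall-clock time over the merged descendant events. A minimal sketch, assuming timestamps are `datetime` objects:

```python
from datetime import datetime

def span_hours(timestamps):
    """Wall-clock hours from the earliest to the latest descendant event."""
    if not timestamps:
        return None
    return (max(timestamps) - min(timestamps)).total_seconds() / 3600
```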
Cycle Time Metrics
Each issue’s cycle time is broken into five phases that together make up the total. The start and end points of each phase are determined by specific events in the issue’s timeline:
| Metric | Starts when | Ends when |
|---|---|---|
| Total cycle time | Issue moves to “In Progress” or “In Development” (or first commit if no status change) | Issue is closed or resolved |
| Dev time | Same as total cycle time start | First merge request or pull request is created |
| Review wait | MR/PR is created | First review comment is posted (or MR/PR is approved, if no comments) |
| Review time | First review comment | MR/PR is approved (or merged, if no explicit approval) |
| Merge time | MR/PR is approved | MR/PR is merged |
| QA time | MR/PR is merged, or issue status changes to a testing phase (whichever comes first) | Issue is closed or resolved |
QA status detection — the following Jira statuses are recognized as the start of a QA/testing phase: In Test, In Testing, Testing, In QA, QA, Ready for Test, Ready for Testing, Ready for QA. Status matching is case-insensitive.
If an issue’s workflow skips a phase entirely (for example, a merge request is merged without a separate approval step), that phase will show as empty (–) in the metrics table and will not appear in the cycle metrics bar.
In-Progress Metrics
For issues that are still open, metrics are computed against the current time rather than waiting for the phase to complete. These in-progress metrics are clearly marked:
- In the detail view metrics table, ongoing values show an (ongoing) suffix — for example, “5d 3h (ongoing)”.
- In the issue hierarchy’s cycle time column, ongoing values are prefixed with ~ — for example, “~5d 3h”.
Only the currently active phase uses the current time as its endpoint. Completed phases always show their actual durations. For example, if an issue’s merge request was created three days after development started, dev time shows as “3d” (completed), while review wait shows as “2d (ongoing)” if no review has been posted yet.
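A sketch of that fallback-to-now rule, with `now` injectable so the behavior is deterministic (the function name is illustrative):

```python
from datetime import datetime, timezone

def phase_hours(start, end, now=None):
    """Return (hours, ongoing). A phase with no end yet is the active one,
    so it is measured against the current time and flagged as ongoing."""
    if start is None:
        return None, False
    if end is not None:
        return (end - start).total_seconds() / 3600, False
    now = now or datetime.now(timezone.utc)
    return (now - start).total_seconds() / 3600, True
```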
Event Types
Value Stream Analytics recognizes the following events from each data source:
Jira events:
| Event | Description |
|---|---|
| Issue created | A new issue was created |
| Issue assigned | The issue was assigned or reassigned |
| Status changed | The issue’s workflow status changed (e.g., To Do to In Progress) |
| Issue commented | A comment was added to the issue |
| Issue resolved | The issue was marked as resolved |
| Issue closed | The issue was closed |
GitLab / GitHub events:
| Event | Description |
|---|---|
| Branch created | A new branch was created |
| Commit pushed | One or more commits were pushed (includes lines added/removed) |
| MR/PR created | A merge request or pull request was opened |
| MR/PR marked ready | A draft MR/PR was marked as ready for review |
| Review comment | A reviewer left a comment on the MR/PR |
| MR/PR approved | The MR/PR was approved |
| MR/PR merged | The MR/PR was merged |
Events are matched to Jira issues by Jira issue keys found in commit messages, branch names, and MR/PR titles and descriptions.
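Jira keys follow a predictable shape (uppercase project key, hyphen, number), so this kind of matching is typically done with a small regular expression along these lines (a sketch, not necessarily the exact pattern the app uses):

```python
import re

# Project key: uppercase letter followed by letters/digits, then "-" and a number.
ISSUE_KEY_RE = re.compile(r"\b([A-Z][A-Z0-9]+-\d+)\b")

def extract_issue_keys(text):
    """Find Jira issue keys in a commit message, branch name, or MR/PR title."""
    return ISSUE_KEY_RE.findall(text)
```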
AI Analysis
The Analyze with AI button generates a written report by sending the full structured data — issue hierarchy, timelines, metrics, and associated merge requests — to the configured LLM provider.
The AI analysis typically covers:
- Cycle time breakdown and bottleneck identification — which phases are taking the longest and which issues are most affected.
- Code review effectiveness — how quickly reviews happen, whether reviews are thorough or rubber-stamped.
- Development patterns — commit frequency, branch strategies, and collaboration signals.
- Risk assessment — issues that have been in progress for an unusually long time or are missing expected events.
- Recommendations — specific, actionable suggestions to improve delivery speed and quality.
An LLM provider (Claude or OpenAI) must be configured in Settings > LLM for this feature to work. If no provider is configured, the button will not be available.
The quality of the analysis depends on the volume and richness of the underlying data. Ensure your data sources are synced and the query covers enough history to give the LLM meaningful patterns to work with.