
[None][test] Fix token throughput name to output_token_throughput #13890

Open
fredricz-20070104 wants to merge 2 commits into NVIDIA:main from fredricz-20070104:feature/refact_perf_metric_name

Conversation

@fredricz-20070104 (Collaborator) commented May 8, 2026

Summary by CodeRabbit

  • Tests
    • Updated performance metric tracking to differentiate output token throughput, replacing the generic token throughput measurement. Performance test configurations and metric extraction now properly identify and track output token throughput across benchmark, server, and aggregated metrics.
  1. Fix token throughput name to output_token_throughput

Signed-off-by: FredricZ-2007 <226039983+fredricz-20070104@users.noreply.github.com>
@fredricz-20070104 fredricz-20070104 requested a review from a team as a code owner May 8, 2026 07:49
@fredricz-20070104 fredricz-20070104 changed the title [None][test] Fix token throughput name to output_token_throughput [None][test] Fix token throughput name to output_token_throughput May 8, 2026
@fredricz-20070104 fredricz-20070104 requested a review from ruodil May 8, 2026 07:52
coderabbitai (bot) commented May 8, 2026


📝 Walkthrough

The PR updates the performance metric infrastructure by renaming a metric type from TOKEN_THROUGHPUT to OUTPUT_TOKEN_THROUGHPUT. The renamed enum constant is defined in utils.py, and all references in test_perf.py (metric parsing regex maps, regression thresholds, metric-to-string mappings, and metric execution lists) are updated to use the new constant.

Changes

Metric Type Standardization

| Layer | File(s) | Summary |
| --- | --- | --- |
| Enum Definition | tests/integration/defs/perf/utils.py | PerfMetricType enum replaces TOKEN_THROUGHPUT with OUTPUT_TOKEN_THROUGHPUT. |
| Metric Configuration | tests/integration/defs/perf/test_perf.py | Benchmark metric regex maps, aggregated server metric regex maps, regression threshold configuration, metric-to-string mapping, and metric execution lists all updated to reference the new OUTPUT_TOKEN_THROUGHPUT constant. |

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~8 minutes

🚥 Pre-merge checks | ✅ 4 | ❌ 1

❌ Failed checks (1 warning)

| Check name | Status | Explanation | Resolution |
| --- | --- | --- | --- |
| Description check | ⚠️ Warning | The PR description is minimal and does not follow the repository template; it is missing key sections such as a detailed description, test coverage, and the PR checklist. | Expand the description to include a detailed explanation of the change, the affected test files, and verification that existing tests cover these modifications. |

✅ Passed checks (4 passed)

| Check name | Status | Explanation |
| --- | --- | --- |
| Title check | ✅ Passed | The title clearly and concisely describes the main change: renaming the token throughput metric to output_token_throughput in test code. |
| Docstring Coverage | ✅ Passed | No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check. |
| Linked Issues check | ✅ Passed | Check skipped because no linked issues were found for this pull request. |
| Out of Scope Changes check | ✅ Passed | Check skipped because no linked issues were found for this pull request. |

✏️ Tip: You can configure your own custom pre-merge checks in the settings.



coderabbitai (bot) left a comment

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
tests/integration/defs/perf/test_perf.py (1)

1-1: ⚠️ Potential issue | 🟠 Major | ⚡ Quick win

Update the file header year for this modified Python file.

Line 1 still ends at 2025, but this file is modified in 2026 and should reflect the latest modification year.

Proposed fix

```diff
-# SPDX-FileCopyrightText: Copyright (c) 2022-2025 NVIDIA CORPORATION & AFFILIATES. All rights reserved.
+# SPDX-FileCopyrightText: Copyright (c) 2022-2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved.
```

As per coding guidelines: “Include NVIDIA copyright header on all new files; update year on modified files” and “All source files (.cpp, .h, .cu, .py) should contain an NVIDIA copyright header with the year of latest modification and Apache 2.0 license notice”.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@tests/integration/defs/perf/test_perf.py` at line 1, Update the file header
line that currently reads "SPDX-FileCopyrightText: Copyright (c) 2022-2025
NVIDIA CORPORATION & AFFILIATES. All rights reserved." to reflect the latest
modification year 2026, and ensure the Apache-2.0 license notice is present in
the header; locate the header string in tests/integration/defs/perf/test_perf.py
(the top comment) and change the year range to 2022-2026 and add or verify the
accompanying Apache 2.0 license line.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Enterprise

Run ID: 19d43141-ae10-468d-b548-8535733840b3

📥 Commits

Reviewing files that changed from the base of the PR and between 01618fa and 707d6bf.

📒 Files selected for processing (2)
  • tests/integration/defs/perf/test_perf.py
  • tests/integration/defs/perf/utils.py

@fredricz-20070104 fredricz-20070104 enabled auto-merge (squash) May 8, 2026 07:54
@fredricz-20070104 (Collaborator, Author) commented:
/bot skip --comment "only affect qa's test"

github-actions (bot) commented May 8, 2026

⚠️ Bot command ignored: The /bot command must appear at the very beginning of the comment (no leading blank lines or spaces). Please post a new comment with /bot as the first character.
