
Conversation

@hassiebp (Contributor) commented Feb 9, 2026

Only subtract known token detail subcategories in _parse_usage_model to fix incorrect input/output totals caused by priority-tier metadata (fixes #11896).



Important

Fix issue #11896 by updating _parse_usage_model to skip subtraction for 'priority' token details and adding tests in test_langchain_usage.py.

  • Behavior:
    • Modify _parse_usage_model in CallbackHandler.py to skip subtraction for token details with keys starting with "priority".
    • Introduce _should_subtract_token_detail() to determine if a token detail should be subtracted.
  • Tests:
    • Add test_langchain_usage.py with tests test_parse_usage_model_skips_priority_subtraction and test_parse_usage_model_subtracts_known_details to verify correct behavior of _parse_usage_model.

This description was created by Ellipsis for 04fca34.

Co-authored-by: Hassieb Pakzad <hassiebp@users.noreply.github.com>
cursor bot commented Feb 9, 2026

Cursor Agent can help with this pull request. Just @cursor in comments and I'll start working on changes in this branch.
Learn more about Cursor Agents

@hassiebp hassiebp closed this Feb 9, 2026
@hassiebp hassiebp reopened this Feb 9, 2026
@hassiebp hassiebp changed the title Issue 11896 resolution fix(langchain-openai): do not subtract priority token counts Feb 9, 2026
@hassiebp hassiebp closed this Feb 9, 2026