runtime: capture token usage once per stream #1478
Open
Pnkcaht wants to merge 3 commits into docker:main
Conversation
Signed-off-by: pnkcaht <samzoovsk19@gmail.com>
Member
@Pnkcaht Could you please rebase? Is it possible to add a test that shows the issue?
Force-pushed from 6571944 to d04fb38
Contributor
Author
I performed a rebase and added an example to the test file, which compiles and runs correctly. @dgageot

Context
Token usage always shows 0% in the TUI because some providers emit token usage multiple times during streaming, including partial or zero-valued snapshots.
Before this change, the runtime would overwrite previously captured usage with later snapshots, causing the final session token usage to be zero even when tokens were actually consumed.
This PR fixes token usage reporting by capturing usage once per stream and treating it as immutable, making the behavior provider-agnostic.
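A minimal sketch of the idea, using hypothetical names (`streamState`, `TokenUsage`, `recordUsage`) rather than the actual types in this repository:

```go
// Illustrative sketch only: streamState, TokenUsage and recordUsage are
// hypothetical names, not the real types used by the runtime.
package runtime

// TokenUsage holds the token counts reported by a provider.
type TokenUsage struct {
	InputTokens  int
	OutputTokens int
}

// streamState tracks per-stream bookkeeping for a single response stream.
type streamState struct {
	usage         TokenUsage
	usageCaptured bool
}

// recordUsage keeps the first valid (non-zero) usage snapshot for the
// stream and treats it as immutable, ignoring the partial or zero-valued
// snapshots some providers emit while streaming.
func (s *streamState) recordUsage(u TokenUsage) {
	if s.usageCaptured {
		return
	}
	if u.InputTokens == 0 && u.OutputTokens == 0 {
		return
	}
	s.usage = u
	s.usageCaptured = true
}
```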
References
Fixed #1475
What I did
- Capture token usage once per stream and treat it as immutable.
- Ignore later snapshots that do not contain valid usage data.
- Keep the behavior consistent across providers (OpenAI, Anthropic, Gemini, Bedrock).
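A hedged sketch of what the added test could look like, reusing the hypothetical names from the sketch above (the actual test in this PR may differ):

```go
// Hypothetical test sketch; names mirror the illustrative sketch above.
package runtime

import "testing"

func TestTokenUsageCapturedOncePerStream(t *testing.T) {
	s := &streamState{}

	// A zero-valued snapshot emitted early in the stream must be ignored.
	s.recordUsage(TokenUsage{})

	// The first valid snapshot is captured...
	s.recordUsage(TokenUsage{InputTokens: 120, OutputTokens: 45})

	// ...and later snapshots must not overwrite it.
	s.recordUsage(TokenUsage{InputTokens: 0, OutputTokens: 0})

	if got := s.usage; got.InputTokens != 120 || got.OutputTokens != 45 {
		t.Fatalf("expected usage to stay at the first valid snapshot, got %+v", got)
	}
}
```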
Before / After
Before: (screenshot of the TUI before the fix)
After: (screenshot of the TUI after the fix)