
Conversation

@mikayla-maki (Member) commented on Jan 14, 2026

This feature cost $15.

↑ (Up): tokens we're sending to the model (input tokens)
↓ (Down): tokens we've received from the model (output tokens)

(Screenshot: the new split token display, January 14, 2026.)

Release Notes:

  • Changed the token display for OpenAI models to reflect their separate input and output limits.
mikayla-maki and others added 2 commits January 12, 2026 13:59
…dels

For OpenAI and OpenAI-compatible providers, display separate input and
output token counts with their respective limits:

  ↑ 9k / 300k  ↓ 1.4k / 100k

This provides more granular visibility into token usage, showing:
- Input tokens used / max input tokens (context - output limit)
- Output tokens used / max output tokens

Other providers continue to show the combined format: "used / max"

Changes:
- Add `supports_split_token_display()` to LanguageModel trait
- Add `input_tokens` and `max_output_tokens` to acp_thread::TokenUsage
- Implement split display for OpenAI, OpenAI-compatible, xAI, and cloud providers
- Update thread_view to render split format with arrows

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
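
The format described above can be sketched roughly as follows. This is illustrative only: the struct, helper names, and abbreviation rules are assumptions rather than Zed's actual code, and the 400k context / 100k output limits are just the numbers implied by the example line above.

```rust
// Illustrative sketch of the split token display described in the commit
// message. Names and rounding rules are assumptions, not Zed's real API.

struct TokenUsage {
    input_tokens: u64,  // tokens sent to the model so far
    output_tokens: u64, // tokens received from the model so far
}

/// Abbreviate a token count the way the UI shows it, e.g. 9_000 -> "9k",
/// 1_400 -> "1.4k" (assumed rounding rules).
fn format_token_count(tokens: u64) -> String {
    if tokens >= 1_000 {
        let k = tokens as f64 / 1_000.0;
        if (k - k.round()).abs() < 0.05 {
            format!("{}k", k.round() as u64)
        } else {
            format!("{:.1}k", k)
        }
    } else {
        tokens.to_string()
    }
}

/// Render "↑ input / max_input  ↓ output / max_output", where the input
/// budget is the context window minus the reserved output limit.
fn split_token_display(usage: &TokenUsage, max_context: u64, max_output: u64) -> String {
    let max_input = max_context.saturating_sub(max_output);
    format!(
        "↑ {} / {}  ↓ {} / {}",
        format_token_count(usage.input_tokens),
        format_token_count(max_input),
        format_token_count(usage.output_tokens),
        format_token_count(max_output),
    )
}

fn main() {
    let usage = TokenUsage { input_tokens: 9_000, output_tokens: 1_400 };
    // Prints: ↑ 9k / 300k  ↓ 1.4k / 100k
    println!("{}", split_token_display(&usage, 400_000, 100_000));
}
```

Deriving the input budget as "context minus output limit" keeps the two gauges independent: the input side reports only the space actually available for the prompt, not tokens reserved for the reply.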
@cla-bot added the cla-signed label ("The user has signed the Contributor License Agreement") on Jan 14, 2026
mikayla-maki and others added 2 commits January 14, 2026 12:18
…dels

For OpenAI and OpenAI-compatible providers, display separate input and
output token counts with their respective limits:

  ↑ 9k / 300k  ↓ 1.4k / 100k

This provides more granular visibility into token usage, showing:
- Input tokens used / max input tokens (context - output limit)
- Output tokens used / max output tokens

Other providers continue to show the combined format: "used / max"

Changes:
- Add `supports_split_token_display()` to LanguageModel trait
- Add `input_tokens` to acp_thread::TokenUsage
- Implement split display for OpenAI, OpenAI-compatible, xAI, and cloud providers
- Update thread_view to render split format with arrows

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
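
A minimal sketch of the opt-in from the change list, assuming `supports_split_token_display()` is a trait method with a default of `false` that opted-in providers override; the real `LanguageModel` trait and the thread_view rendering code differ, and the provider types below are placeholders.

```rust
// Hypothetical shape of the opt-in: providers that report separate input and
// output limits override the default. Not Zed's actual trait definition.

trait LanguageModel {
    /// Whether the UI should render the split "↑ input / max  ↓ output / max"
    /// format. Defaults to the combined "used / max" format.
    fn supports_split_token_display(&self) -> bool {
        false
    }
}

struct OpenAiModel;  // stands in for OpenAI, OpenAI-compatible, xAI, cloud
struct OtherModel;   // stands in for providers that keep the combined format

impl LanguageModel for OpenAiModel {
    fn supports_split_token_display(&self) -> bool {
        true
    }
}

impl LanguageModel for OtherModel {}

/// Rough analogue of the branch the view takes when rendering token usage.
fn display_mode(model: &dyn LanguageModel) -> &'static str {
    if model.supports_split_token_display() {
        "split: ↑ input / max_input  ↓ output / max_output"
    } else {
        "combined: used / max"
    }
}

fn main() {
    println!("{}", display_mode(&OpenAiModel)); // split
    println!("{}", display_mode(&OtherModel));  // combined
}
```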
@mikayla-maki force-pushed the split-token-display-openai branch from 11a0e96 to 5d30fb0 on January 14, 2026 20:26
@maxdeviant changed the title Split token display OpenAI on Jan 14, 2026
@mikayla-maki marked this pull request as ready for review on January 14, 2026 22:29
@mikayla-maki merged commit 9c5fc6e into main on Jan 14, 2026 (24 checks passed)
@mikayla-maki deleted the split-token-display-openai branch on January 14, 2026 22:29