Conversation

@Vidit-Ostwal
Contributor

@Vidit-Ostwal Vidit-Ostwal commented Nov 6, 2025

More context here

Fixes #3845

#3739 (comment)


Note

Add response_id to LLMStreamChunkEvent and plumb it through streaming flows (LiteLLM and native providers) with updated tests.

  • Events
    • Add response_id to LLMStreamChunkEvent in events/types/llm_events.py.
  • Core LLM (LiteLLM streaming)
    • Capture chunk id as response_id and include in LLMStreamChunkEvent emissions.
    • Pass response_id into _handle_streaming_tool_calls and related emissions in llm.py.
  • Base LLM
    • Extend _emit_stream_chunk_event to accept and emit response_id.
  • Native Providers (streaming)
    • Anthropic: extract event.message.id and emit with chunks.
    • Azure: include update.id in chunk events.
    • Bedrock: use contentBlockIndex as response_id for deltas.
    • Gemini: forward chunk.response_id in chunk events.
    • OpenAI: forward chunk.id during streaming (including structured streaming path).
  • Tests
    • Update streaming tests to set/validate response_id in Azure, OpenAI, and event utility tests.
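The changes above can be illustrated with a minimal sketch. The class and field names here are assumed from the summary, not taken from the crewAI source; it shows how a `response_id` on each chunk event lets a consumer tell interleaved streaming responses apart, which is the "no differentiator" bug this PR fixes.

```python
# Hypothetical sketch; the real LLMStreamChunkEvent lives in
# events/types/llm_events.py and carries more fields than shown here.
from dataclasses import dataclass
from typing import Optional

@dataclass
class LLMStreamChunkEvent:
    chunk: str
    response_id: Optional[str] = None  # new: ties each chunk to its originating LLM response

def reassemble(events: list) -> dict:
    """Group streamed chunks by response_id so concurrent responses can be reassembled."""
    grouped: dict = {}
    for ev in events:
        grouped.setdefault(ev.response_id, []).append(ev.chunk)
    return {rid: "".join(parts) for rid, parts in grouped.items()}

# Two responses streaming concurrently; response_id is the differentiator.
events = [
    LLMStreamChunkEvent(chunk="Hel", response_id="resp-1"),
    LLMStreamChunkEvent(chunk="Hi there", response_id="resp-2"),
    LLMStreamChunkEvent(chunk="lo", response_id="resp-1"),
]
print(reassemble(events))  # {'resp-1': 'Hello', 'resp-2': 'Hi there'}
```

On the provider side, each backend supplies whatever per-response identifier it exposes (e.g. `chunk.id` for OpenAI/LiteLLM, `event.message.id` for Anthropic, `update.id` for Azure), so the value's shape varies by provider.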

Written by Cursor Bugbot for commit 51b78f3.


@Vidit-Ostwal Vidit-Ostwal mentioned this pull request Nov 6, 2025
Contributor Author

@Vidit-Ostwal Vidit-Ostwal left a comment


Adding documentation for easy reference.

@Vidit-Ostwal Vidit-Ostwal changed the title Adding chunk_id in Streaming Response Adding response_id in Streaming Response Nov 6, 2025
@Vidit-Ostwal
Contributor Author

Vidit-Ostwal commented Nov 6, 2025

Fixes #3845

@Vidit-Ostwal
Contributor Author

Requesting a review on this one, @lucasgomide.

@Vidit-Ostwal
Contributor Author

Just a gentle ping on this one, @greysonlalonde.
I think this will complement the recent PR adding streaming support for crews and flows.

@greysonlalonde
Contributor

Hey @Vidit-Ostwal thanks for this! Will get it in soon, we're on holiday right now

@Vidit-Ostwal
Copy link
Contributor Author

Hey @Vidit-Ostwal thanks for this! Will get it in soon, we're on holiday right now

Yup, no worries.

Development

Successfully merging this pull request may close these issues.

[BUG] LLMStreamChunkEvent has no differentiator
