Conversation
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: b5a7071be7
Summary
Fix OpenAI v1 streaming instrumentation to preserve the original `openai.Stream` and `openai.AsyncStream` objects instead of replacing them with Langfuse wrapper types. This keeps the native stream contract intact for `_iterator`, `.response`, `close()`/`aclose()`, `async for`, and manual `__anext__()` while still collecting streamed chunks for Langfuse generation updates.

Changes

- `async for` and `__anext__()`

Verification

- `uv run --frozen ruff check langfuse/openai.py tests/unit/test_openai.py`
- `uv run --frozen pytest tests/unit/test_openai.py`

Refs LFE-8788.
Disclaimer: Experimental PR review
Greptile Summary
This PR fixes OpenAI v1 streaming instrumentation by patching `_iterator` and `close`/`aclose` in place on the native `openai.Stream` and `openai.AsyncStream` objects, rather than replacing them with Langfuse wrapper types. The approach is sound: the `_iterator` replacement correctly intercepts all iteration paths (`__iter__`, `__next__`, `__anext__`, `async for`), as confirmed by the SDK's implementation, and the `is_finalized` flag reliably prevents double-finalization.

Confidence Score: 5/5
Safe to merge — core patching logic is correct, double-finalization is guarded, and tests cover all three iteration patterns.
All remaining findings are P2 style suggestions. The `_finalize_stream_response_async` wrapper is harmless. No P0/P1 logic bugs found; the `_iterator` replacement approach is confirmed sound by the OpenAI SDK's implementation. No files require special attention.
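To see why a single `_iterator` swap covers every iteration path, consider a stripped-down stand-in for the SDK's stream type. `MiniStream` below is illustrative only; the claim, per the review above, is that the real `openai.Stream` routes `__next__` and `__iter__` through `self._iterator` in the same way.

```python
class MiniStream:
    """Mimics how a Stream routes all sync iteration through the private
    `_iterator` attribute (illustrative stand-in, not the real SDK class)."""

    def __init__(self, chunks):
        self._iterator = iter(chunks)

    def __next__(self):
        # Manual next(stream) reads _iterator at call time.
        return self._iterator.__next__()

    def __iter__(self):
        # For-loops also drain _iterator.
        for item in self._iterator:
            yield item


def trace(raw, seen):
    # Replacement generator: records each chunk before passing it on.
    for chunk in raw:
        seen.append(chunk)
        yield chunk
```

Replacing `stream._iterator = trace(stream._iterator, seen)` then intercepts both `next(stream)` and `for chunk in stream` with one patch, since both paths dereference `self._iterator`.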
Important Files Changed
- `langfuse/openai.py` — reworks `_instrument_openai_stream`/`_instrument_openai_async_stream` to monkey-patch `_iterator` and `close`/`aclose` on native `openai.Stream` objects, preserving type identity while still collecting chunks for Langfuse; the logic is sound, but `_finalize_stream_response_async` is a no-op async wrapper.
- `tests/unit/test_openai.py` — adds tests covering `for`, `async for`, and manual `__anext__` usage; `DummyOpenAIStream`/`DummyOpenAIAsyncStream` correctly bypass `super().__init__` and set `_iterator` directly, which is appropriate for unit-level testing.

Sequence Diagram
```mermaid
sequenceDiagram
    participant User
    participant Stream as openai.Stream
    participant Traced as traced_iterator (generator)
    participant Raw as raw_iterator
    participant Finalize as finalize_once
    participant Gen as LangfuseGeneration
    Note over User,Gen: _instrument_openai_stream patches _iterator and close in place
    User->>Stream: for chunk in stream (or next())
    Stream->>Traced: __next__() via self._iterator
    Traced->>Raw: next item
    Raw-->>Traced: chunk
    Traced->>Traced: items.append(chunk), set completion_start_time
    Traced-->>Stream: yield chunk
    Stream-->>User: chunk
    Note over Traced: On exhaustion or close()
    Traced->>Finalize: finally block (is_finalized guard)
    Finalize->>Gen: _finalize_stream_response → generation.end()
    alt User calls stream.close() explicitly
        User->>Stream: close() [patched to traced_close]
        Stream->>Stream: await original_close() (closes HTTP)
        Stream->>Finalize: finally in traced_close
        Finalize->>Finalize: is_finalized=True → no-op if already finalized
    end
```

Reviews (1): Last reviewed commit: "fix(openai): preserve native v1 stream c…"
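The explicit-close path in the diagram has a direct async counterpart. Below is a hedged sketch of what patching `_iterator` and `aclose` on an async stream might look like; `MiniAsyncStream`, `instrument_async`, and the method names are assumptions for illustration, not the actual Langfuse or OpenAI SDK code.

```python
import asyncio


class MiniAsyncStream:
    """Stand-in for openai.AsyncStream: routes `async for`, __anext__(),
    and aclose() through the private `_iterator` attribute."""

    def __init__(self, chunks):
        async def gen():
            for c in chunks:
                yield c

        self._iterator = gen()

    def __aiter__(self):
        return self

    async def __anext__(self):
        return await self._iterator.__anext__()

    async def aclose(self):
        await self._iterator.aclose()


def instrument_async(stream, on_complete):
    """Patch the async stream in place, mirroring the sync approach:
    one finalize-once guard covers exhaustion, errors, and aclose()."""
    raw = stream._iterator
    items = []
    finalized = False

    def finalize_once():
        nonlocal finalized
        if not finalized:
            finalized = True
            on_complete(items)

    async def traced():
        try:
            async for chunk in raw:
                items.append(chunk)
                yield chunk
        finally:
            finalize_once()

    stream._iterator = traced()

    original_aclose = stream.aclose

    async def traced_aclose():
        try:
            await original_aclose()  # still closes the underlying transport
        finally:
            finalize_once()

    stream.aclose = traced_aclose
    return stream
```

As in the sync case, the stream keeps its native type, so `async for`, manual `__anext__()`, and an explicit `aclose()` all behave as before while chunks are collected exactly once.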