Fix js_repl hangs on U+2028/U+2029 dynamic tool responses #14421
aaronl-openai merged 4 commits into main
Conversation
Replace the readline-based kernel stdin reader with byte-oriented JSONL framing and add regression coverage for U+2028/U+2029 dynamic tool responses. Co-authored-by: Codex <noreply@openai.com>
Update the high-level js_repl regression to assert against the current codex.tool() return shape on latest main. Co-authored-by: Codex <noreply@openai.com>
@codex review
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 5075936900
```js
process.stdin.on("data", (chunk) => {
  inputBuffer =
    inputBuffer.length === 0 ? Buffer.from(chunk) : Buffer.concat([inputBuffer, chunk]);
```
Avoid quadratic buffer copies when parsing stdin frames
`Buffer.concat([inputBuffer, chunk])` reallocates and copies the entire accumulated frame on every `data` event until a newline arrives. For large or highly chunked JSONL messages, this becomes O(n²) in CPU and memory and can stall or OOM `js_repl` under load. The previous readline path did not introduce this repeated full-copy pattern.
@aaronl-openai can you see how hard this would be to fix? my local codex recommends something like:
- append incoming chunks to an array
- scan for \n across queued chunks
- only assemble bytes when a full frame is ready
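The steps above can be sketched as follows. This is a minimal illustration, not the merged patch; `feed` and `handleFrame` are hypothetical names chosen for the example. Incoming chunks are queued as zero-copy `subarray` views and only assembled into a single buffer once a full newline-terminated JSONL frame is present, so each byte is copied at most once per frame:

```javascript
// Sketch: queue stdin chunk slices, scan for the 0x0A newline byte,
// and assemble bytes only when a complete JSONL frame is available.
const NEWLINE = 0x0a;
const queued = []; // Buffer slices that do not yet contain a frame terminator

function feed(chunk, handleFrame) {
  let start = 0;
  for (let i = 0; i < chunk.length; i++) {
    if (chunk[i] !== NEWLINE) continue;
    // One concat per complete frame: queued prefix + the slice up to the newline.
    const frame = Buffer.concat([...queued, chunk.subarray(start, i)]);
    queued.length = 0;
    start = i + 1;
    handleFrame(JSON.parse(frame.toString("utf8")));
  }
  // Stash any trailing partial frame without copying it.
  if (start < chunk.length) queued.push(chunk.subarray(start));
}
```

Because the scan operates on raw bytes, a frame containing U+2028/U+2029 (UTF-8 `e2 80 a8` / `e2 80 a9`, neither of which contains `0x0a`) passes through intact rather than being split mid-message.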
Avoid repeatedly concatenating the accumulated js_repl stdin buffer on every data event by queuing chunk slices until a full JSONL frame is available. Co-authored-by: Codex <noreply@openai.com>
Resolve the js_repl test module move by keeping main's mod_tests.rs split and carrying forward the separator regression on the merged branch. Co-authored-by: Codex <noreply@openai.com>
Summary
Dynamic tool responses containing literal U+2028 / U+2029 would cause await codex.tool(...) to hang even though the response had already arrived. This PR replaces the kernel’s readline-based stdin handling with byte-oriented JSONL framing that handles these characters properly.
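As a minimal illustration of the bug class (hypothetical snippet, not the kernel code): `JSON.stringify` leaves U+2028/U+2029 unescaped in its output, so any framing layer that treats those characters as line breaks will split a frame mid-JSON. Byte-oriented framing splits only on the `0x0a` newline byte and leaves the payload intact:

```javascript
// A JSON payload containing a raw U+2028 must arrive as ONE frame.
const payload = JSON.stringify({ text: "a\u2028b" }); // U+2028 survives stringify unescaped
const bytes = Buffer.from(payload + "\n", "utf8");

// Byte-oriented framing: split only on the 0x0A newline byte.
// U+2028 encodes as e2 80 a8 in UTF-8, so none of its bytes is 0x0a
// and the frame boundary falls exactly at the trailing newline.
const nl = bytes.indexOf(0x0a);
const frame = bytes.subarray(0, nl).toString("utf8");
const parsed = JSON.parse(frame); // raw U+2028 is legal inside a JSON string
```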
Testing
cargo test -p codex-core