
Client-side exception upon recursive append to streamable UI in production #1627

Closed
amithmkini opened this issue May 17, 2024 · 2 comments
Labels: ai/rsc, bug (Something isn't working)


@amithmkini

Description

I'm trying to perform multiple calls to the LLM using streamUI to generate a tool -> text reply. When I call streamUI recursively while appending the results to a streamable UI object, it works on the development server but fails with a client-side error in the production build (npm start).

I assumed it was some issue with streamUI, but I managed to reproduce it with just a chain of createStreamableUI calls. In the production build, the first response renders, but the page crashes when response.append is called again.

Expected output:
Response from: 0
Response from: 1
Response from: 2

Actual output:
Response from: 0
<White screen with the text: Application error: a client-side exception has occurred (see the browser console for more information).>

The console has the following error:

TypeError: Cannot read properties of null (reading 'children')
<minified traceback>

I tried to expand the traceback by disabling minification and enabling production source maps, but all I could see was that it pointed to lines in react-dom and the scheduler package.
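
For reference, enabling production source maps and disabling minification can be done roughly like this (a sketch: productionBrowserSourceMaps is a documented Next.js option, while the webpack override is one assumed way to turn off minification; it is not part of the original report):

// next.config.js (sketch)
/** @type {import('next').NextConfig} */
const nextConfig = {
  // Emit browser source maps for production builds.
  productionBrowserSourceMaps: true,
  // Assumed approach: disable minification so stack frames keep readable names.
  webpack: (config) => {
    config.optimization.minimize = false
    return config
  },
}

module.exports = nextConfig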

Versions

  • AI SDK: 3.1.11
  • Next.js: 14.2.3

Code example

import { createStreamableUI } from 'ai/rsc'
import { nanoid } from 'nanoid' // or a local id helper
// SpinnerMessage and BotMessage are local UI components.

async function askLLM(
  iteration: number,
  response: ReturnType<typeof createStreamableUI>
) {
  const result = createStreamableUI(<SpinnerMessage />);
  response.append(result.value);

  // Simulate multiple calls to the API with a fire-and-forget IIFE
  (async () => {
    // Wait for 2 seconds before responding
    await new Promise((resolve) => setTimeout(resolve, 2000))

    result.done(<BotMessage content={"Response from: " + iteration} />)

    // Ask the AI 3 times before ending the conversation
    if (iteration < 2) {
      await askLLM(iteration + 1, response)
    } else {
      response.done()
    }
  })();
}

async function submitUserMessage() {
  "use server"

  const response = createStreamableUI()
  await askLLM(0, response)

  return {
    id: nanoid(),
    display: response.value
  }
}
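
For completeness, here is a minimal sketch of the client side that renders the returned display value (the component shape and the wiring via the ai/rsc useUIState/useActions hooks are assumptions; the createAI setup is omitted):

'use client'

import { useActions, useUIState } from 'ai/rsc'

export function Chat() {
  // Assumed UI state shape: { id: string; display: React.ReactNode }[]
  const [messages, setMessages] = useUIState()
  const { submitUserMessage } = useActions()

  return (
    <div>
      {messages.map((message: any) => (
        <div key={message.id}>{message.display}</div>
      ))}
      <button
        onClick={async () => {
          const response = await submitUserMessage()
          setMessages((current: any[]) => [...current, response])
        }}
      >
        Send
      </button>
    </div>
  )
}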

Additional context

I feel like there's a bug in Next.js itself rather than in the AI SDK.

I've created a repo that reproduces the issue with minimal code: https://github.com/amithmkini/ai-sdk-debug/

The repo pins Next.js 14.1.4, and the above code works as expected on that version. Upgrading Next.js to 14.2.0 breaks it.

@unstubbable
Contributor

This issue has been fixed in React, and the fix will ship with the upcoming Next.js release. You can verify it in your repro with these commands:

bun add next@canary react@beta react-dom@beta eslint-config-next@canary
bun run build && bun run start

Reference:

@amithmkini
Author

Yep, canary Next.js works with this code. Thank you! Should I go ahead and close this bug, or wait for the canary to come out as a full release?

@lgrammel added the ai/rsc and bug (Something isn't working) labels on May 23, 2024