Replies: 5 comments
-
Happens with both Next.js 13.3 app dir API routes and pages ones. If I hardcode how I assign it, it seems to be ignored.
-
This seems to be relevant: https://vercel.com/docs/concepts/functions/edge-functions/streaming#caveats. It suggests ways to handle chunks that may or may not be JSON parseable yet. But that doesn't help me when too many chunks are buffered in the first place and I want to force them to go out one by one. Otherwise, I can't iterate through an array and provide individual before and after messages for an operation on an element that are seen in real time. I would have to use websockets, and couldn't do that within the API route options of Next.js.
-
This is just how `ReadableStream` works. You don't have control over how the chunks are delivered; chunking is adjusted dynamically based on network conditions (e.g. how far away the servers are or how fast the connection is).
-
Hey @jschuur, did you find a workaround?
-
My best solution at this point has been to suffix each stringified object with a newline:

```js
const encoder = new TextEncoder();
const encoded = encoder.encode(`${JSON.stringify(obj)}\n`);
controller.enqueue(encoded);
```

and in the consumer you'll have to expect partial chunks, something like this:

```js
const decoder = new TextDecoder();
let partial = "";
while (true) {
  const { done, value } = await reader.read();
  if (done) {
    break;
  }
  const decoded = decoder.decode(value, { stream: true });
  const lines = decoded.split(SKYFFEL_LINE_SEPARATOR).filter(Boolean);
  for (const line of lines) {
    // accumulate until the buffered text parses as a complete object
    partial += line;
    try {
      const json = JSON.parse(partial);
      partial = "";
      yield json;
    } catch (error) {
      // still a partial object; keep buffering
    }
  }
}
```

(`SKYFFEL_LINE_SEPARATOR` matches the `"\n"` suffix used on the producer side.) Not the prettiest, but it works.
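To sanity-check that consumer logic, here's a standalone harness (my own sketch, not from the repo; it assumes Node 18+ and `"\n"` as the separator) that feeds the parser byte chunks deliberately split mid-object:

```javascript
// Standalone sketch (assumptions: Node 18+, "\n" as the line separator).
const encoder = new TextEncoder();
const decoder = new TextDecoder();

// Two NDJSON records, re-split at arbitrary byte offsets the way a
// network transport might merge or split them.
const payload = `${JSON.stringify({ step: 1 })}\n${JSON.stringify({ step: 2 })}\n`;
const bytes = encoder.encode(payload);
const chunks = [bytes.slice(0, 5), bytes.slice(5, 16), bytes.slice(16)];

function* parseNdjsonChunks(chunks) {
  let partial = "";
  for (const value of chunks) {
    const decoded = decoder.decode(value, { stream: true });
    for (const line of decoded.split("\n").filter(Boolean)) {
      partial += line; // buffer until it parses as a complete object
      try {
        const json = JSON.parse(partial);
        partial = "";
        yield json;
      } catch (error) {
        // cut off mid-object; wait for the next chunk
      }
    }
  }
}

const results = [...parseNdjsonChunks(chunks)];
console.log(results); // both objects come out whole despite the splits
```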
-
Summary

I'm trying to stream JSON data in a long-running Edge function (via a `ReadableStream`), but I'm running into issues because sometimes multiple queued chunks are sent out together (which breaks the JSON stringified encoding), and I don't see a way to force an encoded chunk to be sent out. There is also an 'object mode' for streams, but this seems to apply to `Readable` instances and not the `ReadableStream` that I'm using.

Additional information

I have an edge function that looks like this:
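(For illustration only, a minimal sketch of an edge function with this shape, hypothetical and not the actual code from the linked repo, might look like:)

```javascript
// Hypothetical sketch, not the repo's actual code: an Edge-style handler
// that enqueues one status message per feed URL it checks, using the
// web-standard ReadableStream/Response available in Edge runtimes (and Node 18+).
function GET() {
  const encoder = new TextEncoder();
  const urls = ["https://turbo.build/rss.xml", "https://turbo.build/feed.xml"];
  const stream = new ReadableStream({
    start(controller) {
      for (const url of urls) {
        // each enqueue is intended to arrive as its own chunk client-side
        controller.enqueue(encoder.encode(`Looking up ${url}...`));
      }
      controller.close();
    },
  });
  return new Response(stream);
}
```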
Combined with a custom hook that makes the POST request and a component to show the data as chunks come in (using `getReader` and `TextDecoder`), this generates an output like this:

Note that in the output each newline is a new chunk received on the client side. So the first one was `Looking up https://turbo.build/rss.xml...` and the second one was `Looking up https://turbo.build/rss.xml... (FAIL)Looking up https://turbo.build/feed.xml...`. Somehow my first outgoing chunk went out right away, but all subsequent ones were paired up with the next one. Client-side console logs also confirm that the chunks arrived merged together. You can imagine that if I'd JSON stringified these chunks, the concatenated ones wouldn't have parsed.

I'm not sure if `objectMode` would help here, but I also don't understand enough about streams to rewrite this to use `stream.Readable`.

Example

https://github.com/jschuur/next-edge-streaming-bug