# What is fetchstream-js?
fetchstream-js is a drop-in replacement for `fetch()` and `axios.get()` for JSON endpoints. Same URL, same server, same `RequestInit` options — but instead of blocking until the full response body arrives, it hands you values as bytes stream in.
For any non-trivial JSON response (hundreds of KB and up), that's the difference between a blank screen for 2–3 seconds and a UI that paints its first row in ~100 ms.
## The problem with fetch and axios
Both `fetch()`'s `res.json()` and `axios.get()` buffer the entire response body before resolving. You don't get any data — not one object, not one row — until the last byte has arrived.
```js
// ❌ fetch — UI blocks for the full download time
const res = await fetch("/api/products");
const data = await res.json(); // ⏳ waits for all 5 MB
render(data);
```

```js
// ❌ axios — identical behavior
const { data } = await axios.get("/api/products"); // ⏳ waits for all 5 MB
render(data);
```

For a 5 MB JSON list over a real network, that's a 3-second blank screen before your user sees anything.
## The fix
fetchstream-js parses the response incrementally from the raw byte stream. Values fire the moment their closing } arrives on the wire — long before the server has finished writing the last byte.
```js
// ✅ Renders as bytes arrive — first row in ~120 ms
import { fetchStream } from "fetchstream-js";

await fetchStream("/api/products").live(({ data }) => render(data));
```

`.live()` is the headline API: it hands you a fresh `{ data, chunks, done, path }` wrapper each tick. `data` is the same in-place-mutating tree (zero-copy reads); the outer wrapper is a new reference each time, so it slots straight into React/Vue/Svelte state.
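That wrapper contract is what makes UI frameworks happy, and it can be sketched in a few lines of plain JavaScript. Everything here (`createLiveMirror`, its shape) is an illustrative assumption, not the library's internals:

```js
// Sketch of the "fresh wrapper, mutating tree" pattern (illustrative,
// NOT fetchstream-js internals). The inner `data` object is mutated in
// place for zero-copy reads; each tick hands subscribers a brand-new
// outer object, so reference-equality checks (React state) see a change.
function createLiveMirror(onTick) {
  const data = {};             // grows in place as values arrive
  let done = false;
  return {
    write(key, value) {        // a parser would call this per completed value
      data[key] = value;
      onTick({ data, done });  // new wrapper reference every tick
    },
    end() {
      done = true;
      onTick({ data, done });
    },
  };
}

// Each snapshot is a different object, but they share one tree:
const snaps = [];
const mirror = createLiveMirror((s) => snaps.push(s));
mirror.write("first", 1);
mirror.write("second", 2);
mirror.end();
console.log(snaps[0] !== snaps[1]);           // true (fresh wrappers)
console.log(snaps[0].data === snaps[1].data); // true (same mutating tree)
```

The same trade-off applies to the real library: reads are cheap because nothing is cloned, but a snapshot is only a point-in-time view if you read it before the next tick mutates the tree.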
## Use it exactly like fetch
`fetchStream()` mirrors the WHATWG `fetch()` signature 1:1 — same URL types, same `RequestInit`, same `AbortController`. If you know `fetch()`, you already know this library.
```js
const ac = new AbortController();

fetchStream("/api/products", {
  method: "GET",
  headers: { authorization: "Bearer …" },
  signal: ac.signal,
}).live(setSnap);

// later…
ac.abort();
```

## How it works
- `fetchStream(url)` starts a native `fetch()` and gets a body `ReadableStream`
- Each chunk (`Uint8Array`) is fed into a hand-rolled state machine
- The parser emits SAX-like events (`onStartObject`, `onKey`, `onValue`, …)
- A picker layer watches the path stack and only materializes subtrees you subscribed to
- Completed matches fire your callback — or the live mirror grows the root object in place
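To make the pipeline concrete, here is a toy, self-contained version of the first three steps: walk byte chunks, decode them incrementally, and fire a callback the moment a top-level object's closing brace arrives. It is a deliberately naive sketch — it ignores braces inside strings, which the real state machine must handle — and `scanTopLevelObjects` is an illustrative name, not the library's API:

```js
// Naive sketch: track brace depth across chunks; when a top-level
// object's closing "}" arrives, parse and emit it immediately.
// (Braces inside strings are NOT handled; illustration only.)
async function scanTopLevelObjects(byteChunks, onObject) {
  const decoder = new TextDecoder();
  let depth = 0;
  let current = "";                       // object text being assembled
  for await (const chunk of byteChunks) { // works for a ReadableStream too
    const text = decoder.decode(chunk, { stream: true });
    for (const ch of text) {
      if (ch === "{") {
        if (depth === 0) current = "";    // a new top-level object begins
        depth++;
      }
      if (depth > 0) current += ch;
      if (ch === "}" && --depth === 0) {
        onObject(JSON.parse(current));    // fires before later bytes exist
      }
    }
  }
}

// Exercise it with chunks split mid-key, as a real network would split them:
const enc = new TextEncoder();
const chunks = ['[{"id":1},{"id', '":2}]'].map((s) => enc.encode(s));
const seen = [];
scanTopLevelObjects(chunks, (o) => seen.push(o.id))
  .then(() => console.log(seen)); // [ 1, 2 ]
```

Note that state (`depth`, `current`, the decoder) lives across chunk boundaries — that is what lets the first object fire while the second is still on the wire.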
## Two ways to consume the stream
### 1. `.live()` — the easy one (start here)
A single `data` value grows in place as the document streams in, wrapped in a fresh `{ data, chunks, done, path }` outer object each tick. Pass the wrapper straight to React state and you're done.
```js
fetchStream(url).live(setSnap);
```

Built-in `requestAnimationFrame` throttling (default in browsers) means one re-render per frame even if the parser mutates the tree thousands of times per second.
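The throttling idea can be sketched independently of the parser. This is a hypothetical coalescing notifier, not the library's implementation; it falls back to `setTimeout` where `requestAnimationFrame` does not exist (e.g. Node):

```js
// Coalesce many synchronous ticks into one callback per frame.
// (Illustrative sketch, not fetchstream-js source.)
function makeThrottledNotifier(cb) {
  const schedule =
    typeof requestAnimationFrame === "function"
      ? (fn) => requestAnimationFrame(fn)
      : (fn) => setTimeout(fn, 0);        // non-browser fallback
  let latest = null;
  let scheduled = false;
  return (snapshot) => {
    latest = snapshot;                    // keep only the newest snapshot
    if (scheduled) return;                // a flush is already queued
    scheduled = true;
    schedule(() => {
      scheduled = false;
      cb(latest);                         // one callback per frame
    });
  };
}

// 1000 parser ticks inside one frame collapse into a single re-render:
let renders = 0;
const notify = makeThrottledNotifier(() => renders++);
for (let i = 0; i < 1000; i++) notify({ tick: i });
setTimeout(() => console.log(renders), 20); // 1
```

Dropping intermediate snapshots is safe here precisely because each one wraps the same growing tree: the last snapshot of a frame already contains everything the skipped ones did.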
### 2. `.on(path, cb)` — per-match callbacks
Fire once per fully-formed subtree, e.g. to append items one-by-one:
```js
fetchStream(url).on("$.items.*", (item) => list.push(item));
```

Great when you don't want the whole tree in memory.
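A matcher for that small path subset is simple enough to sketch. Assume the parser keeps a stack of the keys/indices it is currently inside; `matchesPath` below is an illustrative helper, not the library's exported API, and it supports only literal segments and `*` (no recursive descent, matching the subset described later):

```js
// Match a pattern like "$.items.*" against the parser's path stack.
// "$" is the root; "*" matches exactly one key or array index.
function matchesPath(pattern, stack) {
  const parts = pattern.split(".");          // e.g. ["$", "items", "*"]
  if (parts[0] !== "$") return false;        // patterns are root-anchored
  if (parts.length - 1 !== stack.length) return false; // depth must match
  return parts
    .slice(1)
    .every((p, i) => p === "*" || p === String(stack[i]));
}

console.log(matchesPath("$.items.*", ["items", 0]));       // true
console.log(matchesPath("$.items.*", ["items", 3]));       // true
console.log(matchesPath("$.items.*", ["meta", "total"]));  // false
console.log(matchesPath("$.items.*", ["items", 0, "id"])); // false (too deep)
```

Because depth must match exactly and there is no `..`, the check is O(depth) per event, which is what keeps a picker layer cheap enough to run on every parser event.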
## When to use it
Use fetchstream-js instead of fetch / axios when the response is:
- A list (>50 rows) your UI renders progressively
- Any JSON payload where time-to-first-row beats total download time as the KPI
- A slow backend that generates data server-side and streams it out
- A payload you want to cancel mid-download if the user navigates away
Keep using fetch / axios for small responses (a single config object, a 5-row list, an auth token). The fixed parser overhead doesn't pay off at that size.
## What it is not
- Not a JSON Lines (NDJSON) parser — it reads plain `application/json`, the same content type `fetch().json()` handles
- Not a JSONPath implementation — it supports a deliberately small, fast subset (no recursive descent `..`)
- Not a general-purpose HTTP client — for multipart uploads, interceptors, request batching, stick with `axios` or `ky`
Ready to wire it up? → React quick start