Usage in Deno
```typescript
import * as mod from "node:stream";
```
A stream is an abstract interface for working with streaming data in Node.js.
The `node:stream` module provides an API for implementing the stream interface.
There are many stream objects provided by Node.js. For instance, a [request to an HTTP server](https://nodejs.org/docs/latest-v22.x/api/http.html#class-httpincomingmessage)
and [`process.stdout`](https://nodejs.org/docs/latest-v22.x/api/process.html#processstdout) are both stream instances.
Streams can be readable, writable, or both. All streams are instances of [`EventEmitter`](https://nodejs.org/docs/latest-v22.x/api/events.html#class-eventemitter).
To access the `node:stream` module:
```js
import stream from 'node:stream';
```
The `node:stream` module is useful for creating new types of stream instances.
It is usually not necessary to use the `node:stream` module to consume streams.
c
default.Duplex
Duplex streams are streams that implement both the `Readable` and `Writable` interfaces.
Examples of `Duplex` streams include:
* `TCP sockets`
* `zlib streams`
* `crypto streams`
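A minimal sketch of what implementing one looks like (the class name and buffering behavior here are illustrative, not part of the API): a `Duplex` subclass supplies both `_write()` and `_read()`.

```js
import { Duplex } from 'node:stream';

// Illustrative sketch: the writable side stores chunks and the readable
// side replays them. Write and end() before consuming the readable side.
class BufferingDuplex extends Duplex {
  constructor(options) {
    super(options);
    this.chunks = [];
  }

  _write(chunk, encoding, callback) {
    this.chunks.push(chunk);
    callback(); // signal that the chunk was handled
  }

  _read() {
    // Emit one stored chunk per call; pushing null ends the readable side.
    this.push(this.chunks.length > 0 ? this.chunks.shift() : null);
  }
}

const duplex = new BufferingDuplex();
duplex.write('ping');
duplex.end();
duplex.on('data', (chunk) => console.log(chunk.toString())); // prints 'ping'
```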
c
default.PassThrough
The `stream.PassThrough` class is a trivial implementation of a `Transform` stream that simply passes the input bytes across to the output. Its purpose is
primarily for examples and testing, but there are some use cases where `stream.PassThrough` is useful as a building block for novel sorts of streams.
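A small usage sketch (the variable names are illustrative): a `PassThrough` used as a tap that observes chunks while forwarding them unchanged.

```js
import { PassThrough } from 'node:stream';

// The tap sees every chunk but does not modify the data flowing through.
const tap = new PassThrough();
tap.on('data', (chunk) => console.log('forwarded:', chunk.toString()));
tap.write('hello');
tap.end();
```

In a real pipeline the tap would sit between a source and a destination, e.g. `source.pipe(tap).pipe(destination)`.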
c
c
default.Stream
No documentation available
c
default.Transform
Transform streams are `Duplex` streams where the output is in some way
related to the input. Like all `Duplex` streams, `Transform` streams
implement both the `Readable` and `Writable` interfaces.
Examples of `Transform` streams include:
* `zlib streams`
* `crypto streams`
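An illustrative sketch (the class name is not part of the API): a `Transform` subclass implements `_transform()`, which receives each input chunk and hands the output downstream via the callback.

```js
import { Transform } from 'node:stream';

// Illustrative sketch: upper-case each chunk as it passes through.
class UpperCaseTransform extends Transform {
  _transform(chunk, encoding, callback) {
    // First argument is an error (or null), second is the output chunk.
    callback(null, chunk.toString().toUpperCase());
  }
}

const upper = new UpperCaseTransform();
upper.on('data', (chunk) => console.log(chunk.toString())); // prints 'HELLO'
upper.write('hello');
upper.end();
```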
c
c
internal.Duplex
Duplex streams are streams that implement both the `Readable` and `Writable` interfaces.
Examples of `Duplex` streams include:
* `TCP sockets`
* `zlib streams`
* `crypto streams`
c
internal.PassThrough
The `stream.PassThrough` class is a trivial implementation of a `Transform` stream that simply passes the input bytes across to the output. Its purpose is
primarily for examples and testing, but there are some use cases where `stream.PassThrough` is useful as a building block for novel sorts of streams.
c
c
internal.Stream
No documentation available
c
internal.Transform
Transform streams are `Duplex` streams where the output is in some way
related to the input. Like all `Duplex` streams, `Transform` streams
implement both the `Readable` and `Writable` interfaces.
Examples of `Transform` streams include:
* `zlib streams`
* `crypto streams`
c
c
ReadableBase
No documentation available
- _construct
- _destroy
- _read
- addListener
- asIndexedPairs
- closed
- destroy
- destroyed
- drop
- emit
- errored
- every
- filter
- find
- flatMap
- forEach
- from
- isDisturbed
- isPaused
- iterator
- map
- on
- once
- pause
- prependListener
- prependOnceListener
- push
- read
- readable
- readableAborted
- readableDidRead
- readableEncoding
- readableEnded
- readableFlowing
- readableHighWaterMark
- readableLength
- readableObjectMode
- reduce
- removeListener
- resume
- setEncoding
- some
- take
- toArray
- unpipe
- unshift
- wrap
c
WritableBase
No documentation available
f
default.addAbortSignal
A stream to attach a signal to.
Attaches an AbortSignal to a readable or writable stream. This lets code
control stream destruction using an `AbortController`.
Calling `abort` on the `AbortController` corresponding to the passed `AbortSignal` will behave the same way as calling `.destroy(new AbortError())` on the
stream, and `controller.error(new AbortError())` for webstreams.
```js
import { addAbortSignal } from 'node:stream';
import fs from 'node:fs';
const controller = new AbortController();
const read = addAbortSignal(
  controller.signal,
  fs.createReadStream('object.json'),
);
// Later, abort the operation closing the stream
controller.abort();
```
Or using an `AbortSignal` with a readable stream as an async iterable:
```js
const controller = new AbortController();
setTimeout(() => controller.abort(), 10_000); // set a timeout
const stream = addAbortSignal(
  controller.signal,
  fs.createReadStream('object.json'),
);
(async () => {
  try {
    for await (const chunk of stream) {
      await process(chunk);
    }
  } catch (e) {
    if (e.name === 'AbortError') {
      // The operation was cancelled
    } else {
      throw e;
    }
  }
})();
```
Or using an `AbortSignal` with a ReadableStream:
```js
const controller = new AbortController();
const rs = new ReadableStream({
  start(controller) {
    controller.enqueue('hello');
    controller.enqueue('world');
    controller.close();
  },
});
addAbortSignal(controller.signal, rs);
finished(rs, (err) => {
  if (err) {
    if (err.name === 'AbortError') {
      // The operation was cancelled
    }
  }
});
const reader = rs.getReader();
reader.read().then(({ value, done }) => {
  console.log(value); // hello
  console.log(done); // false
  controller.abort();
});
```
f
default.duplexPair
The utility function `duplexPair` returns an Array with two items,
each being a `Duplex` stream connected to the other side:
```js
const [ sideA, sideB ] = duplexPair();
```
Whatever is written to one stream is made readable on the other. It provides
behavior analogous to a network connection, where the data written by the client
becomes readable by the server, and vice-versa.
The Duplex streams are symmetrical; one or the other may be used without any
difference in behavior.
f
N
default.finished
A readable and/or writable stream/webstream.
A function to get notified when a stream is no longer readable, writable
or has experienced an error or a premature close event.
```js
import { finished } from 'node:stream';
import fs from 'node:fs';
const rs = fs.createReadStream('archive.tar');
finished(rs, (err) => {
  if (err) {
    console.error('Stream failed.', err);
  } else {
    console.log('Stream is done reading.');
  }
});
rs.resume(); // Drain the stream.
```
Especially useful in error handling scenarios where a stream is destroyed
prematurely (like an aborted HTTP request), and will not emit `'end'` or `'finish'`.
The `finished` API provides a [promise version](https://nodejs.org/docs/latest-v22.x/api/stream.html#streamfinishedstream-options).
`stream.finished()` leaves dangling event listeners (in particular `'error'`, `'end'`, `'finish'` and `'close'`) after `callback` has been
invoked. The reason for this is so that unexpected `'error'` events (due to
incorrect stream implementations) do not cause unexpected crashes.
If this is unwanted behavior then the returned cleanup function needs to be
invoked in the callback:
```js
const cleanup = finished(rs, (err) => {
  cleanup();
  // ...
});
```
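The promise version linked above lives in `node:stream/promises`. A self-contained sketch, using an in-memory readable so no `archive.tar` file is required:

```js
import { finished } from 'node:stream/promises';
import { Readable } from 'node:stream';

const rs = Readable.from(['some', 'data']);
rs.resume(); // Drain the stream.
await finished(rs);
console.log('Stream is done reading.');
```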
f
default.finished.__promisify__
No documentation available
f
default.getDefaultHighWaterMark
Returns the default highWaterMark used by streams.
Defaults to `65536` (64 KiB), or `16` for `objectMode`.
f
default.isErrored
Returns whether the stream has encountered an error.
f
default.isReadable
Returns whether the stream is readable.
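A quick sketch covering both helpers above, using `Readable.from` to supply an in-memory stream:

```js
import { isErrored, isReadable, Readable } from 'node:stream';

const r = Readable.from(['a']);
console.log(isReadable(r)); // true: not ended or destroyed yet
console.log(isErrored(r)); // false: no error has occurred
```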
f
N
default.pipeline
A module method to pipe between streams and generators, forwarding errors,
cleaning up properly, and providing a callback when the pipeline is complete.
```js
import { pipeline } from 'node:stream';
import fs from 'node:fs';
import zlib from 'node:zlib';
// Use the pipeline API to easily pipe a series of streams
// together and get notified when the pipeline is fully done.
// A pipeline to gzip a potentially huge tar file efficiently:
pipeline(
  fs.createReadStream('archive.tar'),
  zlib.createGzip(),
  fs.createWriteStream('archive.tar.gz'),
  (err) => {
    if (err) {
      console.error('Pipeline failed.', err);
    } else {
      console.log('Pipeline succeeded.');
    }
  },
);
```
The `pipeline` API provides a [`promise version`](https://nodejs.org/docs/latest-v22.x/api/stream.html#streampipelinesource-transforms-destination-options).
`stream.pipeline()` will call `stream.destroy(err)` on all streams except:
* `Readable` streams which have emitted `'end'` or `'close'`.
* `Writable` streams which have emitted `'finish'` or `'close'`.
`stream.pipeline()` leaves dangling event listeners on the streams
after the `callback` has been invoked. In the case of reuse of streams after
failure, this can cause event listener leaks and swallowed errors. If the last
stream is readable, dangling event listeners will be removed so that the last
stream can be consumed later.
`stream.pipeline()` closes all the streams when an error is raised.
Using `pipeline` with an HTTP request and response can lead to unexpected behavior,
because on error `pipeline` destroys the socket before the expected response is sent.
See the example below:
```js
import fs from 'node:fs';
import http from 'node:http';
import { pipeline } from 'node:stream';
const server = http.createServer((req, res) => {
  const fileStream = fs.createReadStream('./fileNotExist.txt');
  pipeline(fileStream, res, (err) => {
    if (err) {
      console.log(err); // No such file
      // this message can't be sent once `pipeline` already destroyed the socket
      return res.end('error!!!');
    }
  });
});
```
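The promise version linked above lives in `node:stream/promises`. A self-contained sketch with in-memory source, transform, and sink instead of files:

```js
import { pipeline } from 'node:stream/promises';
import { PassThrough, Readable, Writable } from 'node:stream';

// Collect what reaches the sink so the result can be inspected.
const chunks = [];
await pipeline(
  Readable.from(['hello', ' ', 'world']),
  new PassThrough(),
  new Writable({
    write(chunk, encoding, callback) {
      chunks.push(chunk.toString());
      callback();
    },
  }),
);
console.log(chunks.join('')); // prints 'hello world'
```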
f
default.pipeline.__promisify__
No documentation available
f
default.setDefaultHighWaterMark
Sets the default highWaterMark used by streams.
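A short sketch pairing it with `getDefaultHighWaterMark` (the 128 KiB value is just an example):

```js
import { getDefaultHighWaterMark, setDefaultHighWaterMark } from 'node:stream';

// Raise the default for byte-mode streams to 128 KiB.
setDefaultHighWaterMark(false, 128 * 1024);
console.log(getDefaultHighWaterMark(false)); // 131072
```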
f
internal.addAbortSignal
A stream to attach a signal to.
Attaches an AbortSignal to a readable or writable stream. This lets code
control stream destruction using an `AbortController`.
Calling `abort` on the `AbortController` corresponding to the passed `AbortSignal` will behave the same way as calling `.destroy(new AbortError())` on the
stream, and `controller.error(new AbortError())` for webstreams.
```js
import { addAbortSignal } from 'node:stream';
import fs from 'node:fs';
const controller = new AbortController();
const read = addAbortSignal(
  controller.signal,
  fs.createReadStream('object.json'),
);
// Later, abort the operation closing the stream
controller.abort();
```
Or using an `AbortSignal` with a readable stream as an async iterable:
```js
const controller = new AbortController();
setTimeout(() => controller.abort(), 10_000); // set a timeout
const stream = addAbortSignal(
  controller.signal,
  fs.createReadStream('object.json'),
);
(async () => {
  try {
    for await (const chunk of stream) {
      await process(chunk);
    }
  } catch (e) {
    if (e.name === 'AbortError') {
      // The operation was cancelled
    } else {
      throw e;
    }
  }
})();
```
Or using an `AbortSignal` with a ReadableStream:
```js
const controller = new AbortController();
const rs = new ReadableStream({
  start(controller) {
    controller.enqueue('hello');
    controller.enqueue('world');
    controller.close();
  },
});
addAbortSignal(controller.signal, rs);
finished(rs, (err) => {
  if (err) {
    if (err.name === 'AbortError') {
      // The operation was cancelled
    }
  }
});
const reader = rs.getReader();
reader.read().then(({ value, done }) => {
  console.log(value); // hello
  console.log(done); // false
  controller.abort();
});
```
f
internal.duplexPair
The utility function `duplexPair` returns an Array with two items,
each being a `Duplex` stream connected to the other side:
```js
const [ sideA, sideB ] = duplexPair();
```
Whatever is written to one stream is made readable on the other. It provides
behavior analogous to a network connection, where the data written by the client
becomes readable by the server, and vice-versa.
The Duplex streams are symmetrical; one or the other may be used without any
difference in behavior.
f
N
internal.finished
A readable and/or writable stream/webstream.
A function to get notified when a stream is no longer readable, writable
or has experienced an error or a premature close event.
```js
import { finished } from 'node:stream';
import fs from 'node:fs';
const rs = fs.createReadStream('archive.tar');
finished(rs, (err) => {
  if (err) {
    console.error('Stream failed.', err);
  } else {
    console.log('Stream is done reading.');
  }
});
rs.resume(); // Drain the stream.
```
Especially useful in error handling scenarios where a stream is destroyed
prematurely (like an aborted HTTP request), and will not emit `'end'` or `'finish'`.
The `finished` API provides a [promise version](https://nodejs.org/docs/latest-v22.x/api/stream.html#streamfinishedstream-options).
`stream.finished()` leaves dangling event listeners (in particular `'error'`, `'end'`, `'finish'` and `'close'`) after `callback` has been
invoked. The reason for this is so that unexpected `'error'` events (due to
incorrect stream implementations) do not cause unexpected crashes.
If this is unwanted behavior then the returned cleanup function needs to be
invoked in the callback:
```js
const cleanup = finished(rs, (err) => {
  cleanup();
  // ...
});
```
f
internal.finished.__promisify__
No documentation available
f
internal.getDefaultHighWaterMark
Returns the default highWaterMark used by streams.
Defaults to `65536` (64 KiB), or `16` for `objectMode`.
f
internal.isErrored
Returns whether the stream has encountered an error.
f
internal.isReadable
Returns whether the stream is readable.
f
N
internal.pipeline
A module method to pipe between streams and generators, forwarding errors,
cleaning up properly, and providing a callback when the pipeline is complete.
```js
import { pipeline } from 'node:stream';
import fs from 'node:fs';
import zlib from 'node:zlib';
// Use the pipeline API to easily pipe a series of streams
// together and get notified when the pipeline is fully done.
// A pipeline to gzip a potentially huge tar file efficiently:
pipeline(
  fs.createReadStream('archive.tar'),
  zlib.createGzip(),
  fs.createWriteStream('archive.tar.gz'),
  (err) => {
    if (err) {
      console.error('Pipeline failed.', err);
    } else {
      console.log('Pipeline succeeded.');
    }
  },
);
```
The `pipeline` API provides a [`promise version`](https://nodejs.org/docs/latest-v22.x/api/stream.html#streampipelinesource-transforms-destination-options).
`stream.pipeline()` will call `stream.destroy(err)` on all streams except:
* `Readable` streams which have emitted `'end'` or `'close'`.
* `Writable` streams which have emitted `'finish'` or `'close'`.
`stream.pipeline()` leaves dangling event listeners on the streams
after the `callback` has been invoked. In the case of reuse of streams after
failure, this can cause event listener leaks and swallowed errors. If the last
stream is readable, dangling event listeners will be removed so that the last
stream can be consumed later.
`stream.pipeline()` closes all the streams when an error is raised.
Using `pipeline` with an HTTP request and response can lead to unexpected behavior,
because on error `pipeline` destroys the socket before the expected response is sent.
See the example below:
```js
import fs from 'node:fs';
import http from 'node:http';
import { pipeline } from 'node:stream';
const server = http.createServer((req, res) => {
  const fileStream = fs.createReadStream('./fileNotExist.txt');
  pipeline(fileStream, res, (err) => {
    if (err) {
      console.log(err); // No such file
      // this message can't be sent once `pipeline` already destroyed the socket
      return res.end('error!!!');
    }
  });
});
```
f
internal.pipeline.__promisify__
No documentation available
f
internal.setDefaultHighWaterMark
Sets the default highWaterMark used by streams.
I
I
I
I
I
I
I
default.StreamOptions
No documentation available
I
I
I
I
I
I
I
I
internal.StreamOptions
No documentation available
I
I
T
ComposeFnParam
No documentation available
T
default.PipelineCallback
No documentation available
T
default.PipelineDestination
No documentation available
T
default.PipelineDestinationIterableFunction
No documentation available
T
default.PipelineDestinationPromiseFunction
No documentation available
T
default.PipelinePromise
No documentation available
T
default.PipelineSource
No documentation available
T
default.PipelineSourceFunction
No documentation available
T
default.PipelineTransform
No documentation available
T
default.PipelineTransformSource
No documentation available
T
default.TransformCallback
No documentation available
T
internal.PipelineCallback
No documentation available
T
internal.PipelineDestination
No documentation available
T
internal.PipelineDestinationIterableFunction
No documentation available
T
internal.PipelineDestinationPromiseFunction
No documentation available
T
internal.PipelinePromise
No documentation available
T
internal.PipelineSource
No documentation available
T
internal.PipelineSourceFunction
No documentation available
T
internal.PipelineTransform
No documentation available
T
internal.PipelineTransformSource
No documentation available
T
internal.TransformCallback
No documentation available
v
default.consumers
No documentation available
v
default.promises
No documentation available
v
internal.consumers
No documentation available
v
internal.promises
No documentation available