stream.pipeline(streams, callback)
stream.pipeline(source[, ...transforms], destination, callback)
- streams <Stream[]> | <Iterable[]> | <AsyncIterable[]> | <Function[]>
- source <Stream> | <Iterable> | <AsyncIterable> | <Function>
  - Returns: <Iterable> | <AsyncIterable>
- ...transforms <Stream> | <Function>
  - source <AsyncIterable>
  - Returns: <AsyncIterable>
- destination <Stream> | <Function>
  - source <AsyncIterable>
  - Returns: <AsyncIterable> | <Promise>
- callback <Function> Called when the pipeline is fully done.
  - err <Error>
  - val Resolved value of Promise returned by destination.
- Returns: <Stream>
A module method to pipe between streams and generators, forwarding errors, properly cleaning up, and providing a callback when the pipeline is complete.
const { pipeline } = require('node:stream');
const fs = require('node:fs');
const zlib = require('node:zlib');

// Use the pipeline API to easily pipe a series of streams
// together and get notified when the pipeline is fully done.
// A pipeline to gzip a potentially huge tar file efficiently:
pipeline(
  fs.createReadStream('archive.tar'),
  zlib.createGzip(),
  fs.createWriteStream('archive.tar.gz'),
  (err) => {
    if (err) {
      console.error('Pipeline failed.', err);
    } else {
      console.log('Pipeline succeeded.');
    }
  },
);
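As the signature above shows, the transforms and the destination may also be plain functions that take an async iterable and return one. A minimal sketch, assuming placeholder file names, that upper-cases a text file with an async generator as the middle stage:

const { pipeline } = require('node:stream');
const fs = require('node:fs');

pipeline(
  fs.createReadStream('lowercase.txt'),
  // An async generator transform: it receives the previous stage as an
  // async iterable and yields transformed chunks downstream.
  async function* (source) {
    source.setEncoding('utf8'); // Work with strings rather than `Buffer`s.
    for await (const chunk of source) {
      yield chunk.toUpperCase();
    }
  },
  fs.createWriteStream('uppercase.txt'),
  (err) => {
    if (err) {
      console.error('Pipeline failed.', err);
    } else {
      console.log('Pipeline succeeded.');
    }
  },
);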
The pipeline API provides a promise version.
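For example, the same gzip pipeline written with the promise version from node:stream/promises (the file names are again placeholders):

const { pipeline } = require('node:stream/promises');
const fs = require('node:fs');
const zlib = require('node:zlib');

async function run() {
  // The returned promise resolves when the pipeline is fully done
  // and rejects on error, so no completion callback is needed.
  await pipeline(
    fs.createReadStream('archive.tar'),
    zlib.createGzip(),
    fs.createWriteStream('archive.tar.gz'),
  );
  console.log('Pipeline succeeded.');
}

run().catch(console.error);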
stream.pipeline() will call stream.destroy(err) on all streams except:
- Readable streams which have emitted 'end' or 'close'.
- Writable streams which have emitted 'finish' or 'close'.
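A minimal sketch of this cleanup behavior, using illustrative in-memory streams: the source errors mid-stream, so pipeline destroys the destination, which has emitted neither 'finish' nor 'close':

const { pipeline, Readable, Writable } = require('node:stream');

// A source that produces one chunk and then fails.
const source = Readable.from((async function* () {
  yield 'partial data';
  throw new Error('boom');
})());

const sink = new Writable({
  write(chunk, encoding, callback) {
    callback();
  },
  destroy(err, callback) {
    // Invoked because pipeline destroys the still-unfinished sink.
    console.log('sink destroyed');
    callback(err);
  },
});

pipeline(source, sink, (err) => {
  console.error('Pipeline failed.', err.message); // 'boom'
});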
stream.pipeline() leaves dangling event listeners on the streams after the callback has been invoked. In the case of reuse of streams after failure, this can cause event listener leaks and swallowed errors. If the last stream is readable, dangling event listeners will be removed so that the last stream can be consumed later.
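For example (a sketch with placeholder file names), a pipeline can end on a readable Transform stream, and the stream returned by pipeline() can then be consumed separately:

const { pipeline } = require('node:stream');
const fs = require('node:fs');
const zlib = require('node:zlib');

// pipeline() returns the last stream. Because a Gunzip stream is
// readable, it can still be consumed after the pipeline is wired up.
const gunzip = pipeline(
  fs.createReadStream('archive.tar.gz'),
  zlib.createGunzip(),
  (err) => {
    if (err) console.error('Pipeline failed.', err);
  },
);

gunzip.pipe(fs.createWriteStream('archive.tar'));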
stream.pipeline() closes all the streams when an error is raised. Using an IncomingRequest with pipeline could lead to unexpected behavior, because it destroys the socket without sending the expected response. See the example below:
const fs = require('node:fs');
const http = require('node:http');
const { pipeline } = require('node:stream');

const server = http.createServer((req, res) => {
  const fileStream = fs.createReadStream('./fileNotExist.txt');
  pipeline(fileStream, res, (err) => {
    if (err) {
      console.log(err); // No such file
      // This message can't be sent once `pipeline` has already destroyed the socket.
      return res.end('error!!!');
    }
  });
});
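One way to avoid this in the missing-file case (a sketch, not part of the official example): wait for the file stream to open before handing res to pipeline, and answer the error yourself while the socket is still intact. Errors that occur mid-transfer will still destroy the socket.

const server = http.createServer((req, res) => {
  const fileStream = fs.createReadStream('./fileNotExist.txt');
  fileStream.on('error', () => {
    // pipeline has not touched the socket yet,
    // so a proper response can still be sent.
    res.statusCode = 404;
    res.end('error!!!');
  });
  fileStream.on('open', () => {
    pipeline(fileStream, res, (err) => {
      if (err) console.error('Pipeline failed.', err);
    });
  });
});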