Problem:
While using decryptStream to stream-decrypt large S3 objects, I noticed a memory increase roughly equal to the size of the object.
I think the issue is that _decryptStream returns a duplexify wrapper and internally runs a pipeline:
```typescript
const stream = new Duplexify(parseHeaderStream, decipherStream)
pipeline(
  parseHeaderStream,
  verifyStream,
  decipherStream,
  new PassThrough(),
  (err: Error) => {
    if (err) stream.emit('error', err)
  }
)
```
The caller reads from decipherStream via the duplexify wrapper. The pipeline also pushes decipherStream's output into the PassThrough. Since nothing ever reads from that PassThrough, its internal buffer appears to grow without bound.
Solution:
Replacing the PassThrough with a no-op Writable that discards chunks fixes the memory growth while still absorbing the destroy() call:
```typescript
const drain = new Writable({
  write(_chunk, _encoding, callback) {
    callback()
  },
})

pipeline(
  parseHeaderStream,
  verifyStream,
  decipherStream,
  drain,
  (err: Error) => {
    if (err) stream.emit('error', err)
  }
)
```
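For contrast with the `PassThrough` case, here is a minimal sketch (again assuming only Node's built-in `stream` module) of the same no-op `Writable` in isolation. Because each `write` callback fires immediately, chunks are acknowledged and discarded rather than buffered:

```typescript
import { Writable } from 'node:stream'

// No-op sink: acknowledges every chunk immediately and retains nothing.
const drain = new Writable({
  write(_chunk, _encoding, callback) {
    callback()
  },
})

for (let i = 0; i < 1000; i++) {
  drain.write(Buffer.alloc(1024))
}

// Each write callback completed synchronously, so nothing accumulates.
console.log(drain.writableLength) // 0
```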
I tested this change locally, and memory usage stopped growing while streaming.