decryptStream causes unbounded memory growth #1656

@farispoljcic-abh

Description

Problem:

While using decryptStream to stream-decrypt large S3 objects, I noticed that memory usage grows by roughly the size of the object being decrypted.

I think the issue is that _decryptStream returns a duplexify wrapper and internally runs a pipeline:

const stream = new Duplexify(parseHeaderStream, decipherStream)

pipeline(
  parseHeaderStream,
  verifyStream,
  decipherStream,
  new PassThrough(),
  (err: Error) => {
    if (err) stream.emit('error', err)
  }
)

The caller reads from decipherStream via the duplexify wrapper, but the pipeline also pipes decipherStream's output into the PassThrough. Since nothing ever reads from that PassThrough, its internal buffer grows without bound.

Solution:

Replacing the PassThrough with a no-op Writable that discards chunks fixes the memory growth while still giving the pipeline a terminal destination that absorbs the destroy() call:

const drain = new Writable({
  write(_chunk, _encoding, callback) {
    callback() // acknowledge and discard the chunk immediately
  },
})

pipeline(
  parseHeaderStream,
  verifyStream,
  decipherStream,
  drain,
  (err: Error) => {
    if (err) stream.emit('error', err)
  }
)

I tested this change locally and the memory usage stopped increasing while streaming.
