Streams
Writing Streaming Libraries
Created by Evan Oxfeld / @evanoxfeld
Goals
- Writing Streams Today
- Writing Streams Tomorrow (Today)
Using Streams
fs.createReadStream('zip path').pipe(
  unzip.Extract({ path: 'output/path' }))
Using Streams
fs.createReadStream('zip').pipe(
  unzip.Parse()).pipe(
  fstream.Writer('out'))
Using Streams
readableStream.pipe(
  duplexStream).pipe(
  writableStream)
pipe() returns the destination stream starting in Node 0.6
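A minimal, runnable illustration of that chain using only core modules (zlib's Gzip is a duplex/transform stream); the file names here are just placeholders:

var fs = require('fs');
var zlib = require('zlib');

fs.createReadStream(__filename)                 // readable: this script itself
  .pipe(zlib.createGzip())                      // duplex (transform)
  .pipe(fs.createWriteStream('out.gz'));        // writable

Because pipe() returns the destination, the two .pipe() calls chain exactly like the slides above.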
Readable Streams (Present)
- Emit 'data' events
- Optionally implement pause() and resume()
Issues
- No on('pipe') method
- pause() isn't a guarantee (see the sketch below)
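A minimal sketch of a classic (pre-0.10) readable stream; TickReadable is a made-up example that emits a chunk on a timer. Note that pause() here is purely advisory, which is exactly the guarantee problem above:

var Stream = require('stream');
var util = require('util');

function TickReadable() {
  Stream.call(this);
  this.readable = true;
  this._paused = false;
  var self = this;
  this._timer = setInterval(function () {
    if (self._paused) return;              // advisory: nothing forces us to honor pause()
    self.emit('data', new Buffer('tick\n'));
  }, 100);
}
util.inherits(TickReadable, Stream);

TickReadable.prototype.pause = function () { this._paused = true; };
TickReadable.prototype.resume = function () { this._paused = false; };
TickReadable.prototype.destroy = function () {
  clearInterval(this._timer);
  this.emit('end');
};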
Writable Streams (Present)
- Implement write() and end()
Issues
- Backpressure
- Hope the source stream buffers
To exert backpressure, return false from write(). To resume, emit 'drain'
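A minimal sketch of a classic writable stream (SlowWritable is a made-up example) that always exerts backpressure: write() returns false, and 'drain' is emitted once the pretend flush finishes:

var Stream = require('stream');
var util = require('util');

function SlowWritable() {
  Stream.call(this);
  this.writable = true;
}
util.inherits(SlowWritable, Stream);

SlowWritable.prototype.write = function (data) {
  var self = this;
  setTimeout(function () {
    self.emit('drain');       // ready for more data
  }, 100);                    // pretend flushing takes 100ms
  return false;               // ask the source to stop writing for now
};

SlowWritable.prototype.end = function (data) {
  if (data) this.write(data);
  this.writable = false;
  this.emit('close');
};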
Backpressure
[Diagram: data flowing from a Readable stream to a Writable stream]
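The diagram amounts to the logic pipe() runs for you; a simplified sketch (error handling and cleanup mostly omitted):

function simplePipe(readable, writable) {
  readable.on('data', function (chunk) {
    if (writable.write(chunk) === false) {
      readable.pause();       // destination buffer is full
    }
  });
  writable.on('drain', function () {
    readable.resume();        // destination caught up
  });
  readable.on('end', function () {
    writable.end();
  });
  return writable;            // returning the destination lets calls chain like pipe()
}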
Enter "Streams2"
Streams in 0.10
- Readable streams are now pull ("suck") streams
- read([size]) is the pull-side counterpart of write(data)
- read(size) returns null if fewer than size bytes are buffered
- write() continues to return false if the buffer is full
- Can use Readable to wrap old-style streams
- Same friendly pipe API
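Consuming a 0.10 readable with the pull API: 'readable' fires when data is buffered, and read() returns null once the internal buffer is drained. (Wrapping an old-style stream is just new Readable().wrap(oldStream).) The file being read here is arbitrary:

var fs = require('fs');
var rs = fs.createReadStream(__filename);    // any 0.10 readable works here

rs.on('readable', function () {
  var chunk;
  while ((chunk = rs.read()) !== null) {     // pull until the internal buffer is empty
    console.log('got %d bytes', chunk.length);
  }
});
rs.on('end', function () {
  console.log('done');
});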
Writing 0.10 Streams
- Readable: implement _read(size, cb)
- Writable: implement _write(data, cb)
- Duplex: implement _read() and _write()
- Transform: implement _transform(data, outputFn, cb)
- PassThrough: no method to implement
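For comparison, the API that shipped in 0.10 final passes an encoding argument and queues output with this.push() instead of an output function (similarly, _read(size) has no callback and _write gains an encoding argument). A minimal uppercasing Transform sketch; UpperCase is a made-up class:

var Transform = require('stream').Transform;
var util = require('util');

function UpperCase(options) {
  Transform.call(this, options);
}
util.inherits(UpperCase, Transform);

UpperCase.prototype._transform = function (chunk, encoding, callback) {
  this.push(chunk.toString().toUpperCase());  // queue transformed output
  callback();                                 // this chunk is fully handled
};

process.stdin.pipe(new UpperCase()).pipe(process.stdout);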
unzip.Parse
Parse.prototype._transform = function (data, outputFn, callback) {
  // Forward the chunk to the internal pull stream that feeds the zip parser.
  if (this._pullStream.write(data)) {
    return callback();
  }
  // Internal buffer is full: wait for 'drain' before accepting more input,
  // propagating backpressure to whatever is piped into Parse.
  this._pullStream.once('drain', callback);
};
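Deferring the callback until the internal pull stream drains is what pushes backpressure from the zip parser back to the source piped into Parse. On the consuming side, usage looks roughly like this (the 'entry' event and its properties are recalled from node-unzip's README, so treat the details as approximate):

var fs = require('fs');
var unzip = require('unzip');

fs.createReadStream('archive.zip')
  .pipe(unzip.Parse())
  .on('entry', function (entry) {
    if (entry.type === 'File') {
      entry.pipe(fs.createWriteStream('out/' + entry.path));
    } else {
      entry.autodrain();      // discard entries we don't care about
    }
  });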
Use Streams2 Today
var Transform = require('stream').Transform;
if (!Transform) {
  Transform = require('readable-stream/transform');
}
@substack: "Streams make programming in node simple, elegant, and composable."