I've been trying to find libraries (to avoid re-inventing the wheel) to read a CSV file one line at a time and push each value to a downstream process.
However, I need the read and processing to only happen once per second. Is there a way to do this on Node?
1 Answer
#1
I came across a similar problem recently, as I needed a way to read the file one line at a time and not move to the next until I was done processing the previous one.
I solved it by using promises, stream.pause(), and stream.resume(). You could do it like this:
const Promise = require('bluebird');
const fs = require('fs');
const byline = require('byline');

function readCSV(file, callback) {
  // Wrap the raw file stream so it emits one line per 'data' event.
  let stream = fs.createReadStream(file);
  stream = byline.createStream(stream);

  stream.on('data', (line) => {
    // Stop the flow until the current line has been handled.
    stream.pause();
    Promise.resolve(line.toString())
      .then(callback)
      // Wait one second before resuming, so at most one line is processed per second.
      .then(() => setTimeout(() => stream.resume(), 1000));
  });
}

readCSV('file.csv', console.log);
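If you'd rather skip the extra dependencies, the same throttling can also be done with Node's built-in readline module and async iteration (Node 12+). This is only a rough sketch of that idea, with the one-second delay hard-coded:

const fs = require('fs');
const readline = require('readline');

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function readCSV(file, callback) {
  const rl = readline.createInterface({
    input: fs.createReadStream(file),
    crlfDelay: Infinity, // treat \r\n as a single line break
  });

  // The for-await loop only pulls the next line once both awaits finish,
  // so each line is processed at most once per second.
  for await (const line of rl) {
    await callback(line);
    await sleep(1000);
  }
}

readCSV('file.csv', console.log);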