fs

Watching for file changes on complete file transfer

前提是你 submitted on 2019-12-06 12:28:18
I have a simple node script that watches for file changes and copies the file to a remote server over ssh:

fs.watch(filename, function (curr, prev) {
    // copy file to the remote server
});

However, the file I'm watching is uploaded via FTP, so the file changes with every chunk of data I receive and the callback fires each time. Is there any way to react to changes only once the complete file has been transferred? Thanks in advance.

I know this is an old question, but for anyone else in the same situation there is now the module below: https://www.npmjs.com/package/remote-file-watcher For Linux,
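A common workaround, if a dedicated module is not an option, is to debounce the watch events and act only once the file has been quiet for a while. Below is a minimal sketch of that idea; the watched path, the 2-second quiet period and the placeholder copy step are assumptions, not part of the original question.

const fs = require('fs');

// Fire onComplete only after no change events have arrived for quietMs.
function watchUntilStable(filename, onComplete, quietMs = 2000) {
  let timer = null;
  fs.watch(filename, () => {
    if (timer) clearTimeout(timer);
    timer = setTimeout(() => {
      fs.stat(filename, (err, stats) => {
        if (err) return console.error(err);
        onComplete(filename, stats);   // e.g. copy the file to the remote server here
      });
    }, quietMs);
  });
}

watchUntilStable('/ftp/incoming/upload.bin', (file) => {
  console.log('transfer appears complete:', file);
});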

Sending response after forEach

不问归期 submitted on 2019-12-06 05:26:46
(Please note this is not a duplicate of two similarly titled questions; those two questions use Mongoose and the answers apply to Mongoose queries only.) I have a list of directories, and each of these directories contains a file. I want to return a JSON list with the contents of each of these files. I can load the files no problem, but because I'm looping over the array with forEach, my empty response is sent before I've actually loaded the contents of the files:

function getInputDirectories() {
    return fs.readdirSync(src_path).filter(function(file) {
        return fs.statSync(path.join(src_path, file))
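One way to avoid sending the response early is to turn each file read into a promise and respond only once all of them have resolved. The sketch below assumes an Express handler, a src_path variable and a per-directory file name (input.json), none of which are spelled out in the truncated question:

const fs = require('fs');
const path = require('path');

app.get('/inputs', function (req, res) {
  // Keep only the entries of src_path that are directories.
  const dirs = fs.readdirSync(src_path).filter(function (file) {
    return fs.statSync(path.join(src_path, file)).isDirectory();
  });

  // Read one file per directory; res.json runs only after every read finishes.
  Promise.all(dirs.map(function (dir) {
    return fs.promises.readFile(path.join(src_path, dir, 'input.json'), 'utf8');
  }))
    .then(function (contents) { res.json(contents); })
    .catch(function (err) { res.status(500).send(err.message); });
});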

Returning the content of multiple files in node.js

假装没事ソ submitted on 2019-12-06 04:59:06
I'm using the fs module of node.js to read all the files of a directory and return their content, but the array I use to store the content is always empty.

Server-side:

app.get('/getCars', function(req, res){
    var path = __dirname + '/Cars/';
    var cars = [];
    fs.readdir(path, function (err, data) {
        if (err) throw err;
        data.forEach(function(fileName){
            fs.readFile(path + fileName, 'utf8', function (err, data) {
                if (err) throw err;
                files.push(data);
            });
        });
    });
    res.send(files);
    console.log('complete');
});

ajax function:

$.ajax({
    type: 'GET',
    url: '/getCars',
    dataType: 'JSON',
    contentType:
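As in the previous question, res.send runs before any of the asynchronous readFile callbacks have pushed data (and the code pushes to files while declaring cars). A minimal sketch of one possible fix, using fs.promises and async/await so the response is sent only after every file has been read; the route and directory come from the question, error handling is simplified:

const fs = require('fs');
const path = require('path');

app.get('/getCars', async function (req, res) {
  const dir = path.join(__dirname, 'Cars');
  try {
    const fileNames = await fs.promises.readdir(dir);
    const cars = await Promise.all(
      fileNames.map(name => fs.promises.readFile(path.join(dir, name), 'utf8'))
    );
    res.send(cars);   // sent only after every file has been read
  } catch (err) {
    res.status(500).send(err.message);
  }
});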

How to check in Node.js if a file is open/being written to?

喜欢而已 submitted on 2019-12-06 04:43:31
I found many answers for C++, C#, etc. but haven't yet found one for Node.js. Here is my case: I've created a module which watches for file/directory changes (new or updated, it doesn't care about deleted files) on my FTP server. Once I notice a new file, or an existing one being changed, I get a notification and tell my module B to sync this file to Sourceforge.net. The module works fine and reacts to changes instantly. But what I'm not satisfied with yet is that when a new file is added, that file
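Node's fs module has no portable "is this file still open for writing by another process?" check, so a common workaround is to poll the file and treat it as finished once its size stops changing. A minimal sketch of that idea; the polling interval and the number of stable checks are assumptions:

const fs = require('fs');

function waitUntilWriteFinished(file, cb, intervalMs = 1000, stableChecks = 3) {
  let lastSize = -1;
  let stable = 0;
  const timer = setInterval(() => {
    fs.stat(file, (err, stats) => {
      if (err) return;                       // file may not be readable yet
      if (stats.size === lastSize) {
        if (++stable >= stableChecks) {      // size unchanged for several checks
          clearInterval(timer);
          cb(file);
        }
      } else {
        stable = 0;
        lastSize = stats.size;
      }
    });
  }, intervalMs);
}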

HDFS Shell and Java operations

大城市里の小女人 submitted on 2019-12-06 02:35:34
HDFS Shell operations

1. Basic syntax: bin/hadoop fs <command> OR bin/hdfs dfs <command>. dfs is the implementation class of fs.

2. Command reference:

[Tesla@hadoop102 hadoop-2.7.2]$ bin/hadoop fs
[-appendToFile <localsrc> ... <dst>]
[-cat [-ignoreCrc] <src> ...]
[-checksum <src> ...]
[-chgrp [-R] GROUP PATH...]
[-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
[-chown [-R] [OWNER][:[GROUP]] PATH...]
[-copyFromLocal [-f] [-p] <localsrc> ... <dst>]
[-copyToLocal [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
[-count [-q] <path> ...]
[-cp [-f] [-p] <src> ... <dst>]
[-createSnapshot <snapshotDir> [<snapshotName>]]
[-deleteSnapshot <snapshotDir> <snapshotName>]
[-df
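For example, a few common invocations of the syntax above (the paths, file name and user directory are illustrative, not from the original post):

hadoop fs -mkdir -p /user/Tesla/input                  # create a directory in HDFS
hadoop fs -copyFromLocal data.txt /user/Tesla/input    # upload a local file
hadoop fs -cat /user/Tesla/input/data.txt              # print the file's contents
hadoop fs -cp /user/Tesla/input/data.txt /tmp          # copy within HDFS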

Remove last n lines from file using nodejs

為{幸葍}努か submitted on 2019-12-06 00:23:52
I'm trying to remove the last 3 lines from a file using fs as part of nodejs. I'm currently reading the file into memory and then writing it again without the 3 lines, but I'm sure there is a more efficient way that doesn't involve reading the whole file into memory.

My code now:

fs.readFile(filename, function (err, data) {
    if (err) throw err;
    theFile = data.toString().split("\n");
    theFile.splice(-3, 3);
    fs.writeFile(filename, theFile.join("\n"), function (err) {
        if (err) {
            return console.log(err);
        }
        console.log("Removed last 3 lines");
        console.log(theFile.length);
    });
});

Let's create a huge
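One memory-friendlier approach (a sketch under assumptions, not an answer from the original thread): read only a fixed-size tail of the file, count newlines backwards to find where the last n lines start, and truncate the file at that offset. It assumes plain UTF-8 text and that the last n lines fit inside the tail buffer; the huge.log file name is a placeholder:

const fs = require('fs');

function removeLastLines(filename, n, tailSize = 64 * 1024) {
  const stats = fs.statSync(filename);
  const fd = fs.openSync(filename, 'r+');
  const start = Math.max(0, stats.size - tailSize);
  const buf = Buffer.alloc(stats.size - start);
  fs.readSync(fd, buf, 0, buf.length, start);

  // Walk backwards over the tail, counting newlines, to find the cut position.
  let cut = buf.length;
  let newlines = 0;
  for (let i = buf.length - 1; i >= 0 && newlines < n; i--) {
    if (buf[i] === 0x0a) {   // '\n'
      newlines++;
      cut = i;
    }
  }

  fs.ftruncateSync(fd, start + cut);
  fs.closeSync(fd);
}

removeLastLines('huge.log', 3);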

fs.readFileSync is not a function Meteor, React

时光怂恿深爱的人放手 submitted on 2019-12-05 20:58:46
I'm getting 'fs.readFileSync is not a function' in the Chrome debugger after trying to call readFileSync().

I require it:
const fs = require('fs');

call the function:
let content = fs.readFileSync('/path/to/my/file.stuff');

and attempt to display the content:
console.log(content);

I get nothing. When I do console.log(fs); I appear to get a generic javascript object. I'm completely stuck.

Meteor version: 1.5.1
npm version: 3.10.10
node version: v6.10.1

Thanks for all the answers! I have confirmed that you cannot use fs on the client side. Instead, I made another local simple express node api
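A minimal sketch of the kind of workaround the asker describes (not their actual code): a small Express service reads the file with fs on the server, and the browser fetches it over HTTP instead of calling fs directly. The route name and port are arbitrary choices:

const express = require('express');
const fs = require('fs');

const app = express();

// Read the file on the server and return its contents to the browser.
app.get('/file', (req, res) => {
  fs.readFile('/path/to/my/file.stuff', 'utf8', (err, content) => {
    if (err) return res.status(500).send(err.message);
    res.send(content);
  });
});

app.listen(3100, () => console.log('file API listening on port 3100'));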

Why is fs.createReadStream … pipe(res) locking the read file?

谁都会走 submitted on 2019-12-05 20:33:11
I'm using express to stream audio & video files according to this answer. Relevant code looks like this:

function streamMedia(filePath, req, res) {
    // code here to determine which bytes to send, compute response headers, etc.
    res.writeHead(status, headers);
    var stream = fs.createReadStream(filePath, { start, end })
        .on('open', function() {
            stream.pipe(res);
        })
        .on('error', function(err) {
            res.end(err);
        });
}

This works just fine to stream bytes to <audio> and <video> elements on the client. However, after these requests are served, another express request can delete the file being streamed
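If the underlying issue is that the read stream's file descriptor stays open (which on Windows keeps the file locked), one mitigation is to make sure the stream is destroyed as soon as the response ends or the client disconnects. A sketch of that idea, assuming status, headers, start and end have already been computed as in the question; this is not the asker's final fix:

const fs = require('fs');

function streamMedia(filePath, req, res, status, headers, start, end) {
  res.writeHead(status, headers);
  const stream = fs.createReadStream(filePath, { start, end });
  stream.pipe(res);
  stream.on('error', (err) => res.destroy(err));
  // Free the file descriptor when the client goes away or the response ends,
  // so a later request can delete the file.
  res.on('close', () => stream.destroy());
}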

Node.js file copy: fs.createReadStream & fs.createWriteStream & pipe

强颜欢笑 submitted on 2019-12-05 15:48:41
File layout:
e:nodejs/filecopy/demo.js
e:nodejs/filecopy/1/result.txt
e:nodejs/filecopy/2

demo.js:

var fs = require('fs');
var rOption = {
    flags: "r",
    encoding: null,
    mode: 0666
};
var wOption = {
    flags: 'a',
    encoding: null,
    mode: 0666
};
var fileReadStream = fs.createReadStream('filecopy/1/result.txt', rOption);
var fileWriteStream = fs.createWriteStream('filecopy/2/new_result.txt', wOption);
fileReadStream.on('data', function(data){
    fileWriteStream.write(data);
});
fileReadStream.on('end', function(){
    console.log("readStream end");
    fileWriteStream.end();
});

Running this copies result.txt from e:nodejs/filecopy/1/ to e:nodejs
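The title also mentions pipe; a minimal sketch of the same copy using pipe() instead of manual data/end handlers (same paths as above):

var fs = require('fs');
fs.createReadStream('filecopy/1/result.txt')
  .pipe(fs.createWriteStream('filecopy/2/new_result.txt', { flags: 'a' }));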

How to follow a (changing) log file in node.js

时光总嘲笑我的痴心妄想 submitted on 2019-12-05 12:43:53
OK, this might appear to be an easy question, but I couldn't find the answer here, so I am posting it in the hope that someone has encountered a similar problem. I need to monitor a symlink which points to a web server file (/var/log/lighttpd/error.log to be more specific; thanks to Linus G Thiel I figured out how to follow symlinks). I know I can set up fs.fileWatch to monitor it, but I should also point out that the error.log file gets rotated at a specific time, depending on the log daemon settings. When that happens, fs.fileWatch stops working. I also know I can spawn a
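One rotation-tolerant approach (a sketch, not an answer from the original thread) is to poll with fs.watchFile and restart from the beginning of the file whenever the inode changes or the size shrinks, which is what a rotation looks like from the watcher's point of view:

const fs = require('fs');

function follow(file, onData) {
  let pos = 0;
  let ino = null;
  fs.watchFile(file, { interval: 1000 }, (curr) => {
    if (curr.ino !== ino || curr.size < pos) {   // rotated or truncated: start over
      ino = curr.ino;
      pos = 0;
    }
    if (curr.size > pos) {
      // Read only the newly appended bytes and hand them to the callback.
      const stream = fs.createReadStream(file, { start: pos, end: curr.size - 1 });
      stream.on('data', (chunk) => onData(chunk.toString()));
      pos = curr.size;
    }
  });
}

follow('/var/log/lighttpd/error.log', (text) => process.stdout.write(text));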