I'm using collectionFS and gm for some image manipulation in my meteor app. I'm creating a temporary file:
var fs = Npm.require('fs');
var filename = '/tmp/gm_' + Date.now();
var read = file.createReadStream('source');
var temp = fs.createWriteStream(filename);
gm(read)
.crop(100, 100, 10, 10)
.stream().pipe(temp);
// Do some more things
// remove temp-file
At the end I want to delete this temporary file. How do I do that? I'm not very familiar with streams...
I'm thinking of something like fs.remove(filename).
You are looking for fs.unlinkSync(filename); or, if you'd rather not block, its asynchronous counterpart fs.unlink.
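If the file should only be removed once gm has actually finished writing to it, you can hook the cleanup to the write stream's finish event (a sketch based on the variables from the question; the asynchronous fs.unlink works just as well as fs.unlinkSync):
// remove the temp file once gm has finished writing to it
temp.on('finish', function() {
  // ... do the remaining work with the temp file here ...
  fs.unlink(filename, function(err) {
    if (err) console.error('could not remove temp file:', err);
  });
});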
I'd like to write a Thunderbird add-on that encrypts stuff. For this, I have already extracted all the data from the compose window. Now I have to save it to files and run a local executable for encryption. But I have found no way to save the files and execute an executable on the local machine. How can I do that?
I found the File and Directory Entries API documentation, but it does not seem to work. I always get undefined when trying to get the object with this code:
var filesystem = FileSystemEntry.filesystem;
console.log(filesystem); // --> undefined
At least, is there a working add-on that I can examine to find out how this works and maybe what permissions I have to request in the manifest.json?
NOTE: Must work cross-platform (Windows and Linux).
The answer is that WebExtensions are currently not able to execute local files. Saving to some local folder on the disk is not possible either.
Instead, you need to add a WebExtension Experiment to your project and use the legacy APIs there. In the experiment you can use IOUtils and FileUtils to reach your goal:
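Both functions shown below live inside such an experiment. For orientation, here is a minimal sketch of how the experiment implementation is usually registered (the file name experiment/api.js and the namespace experiment are assumptions on my side; they have to match your schema.json and the experiment_apis entry in manifest.json):
// experiment/api.js (hypothetical file name)
var { ExtensionCommon } = ChromeUtils.import("resource://gre/modules/ExtensionCommon.jsm");

// the variable name must match the namespace declared for the experiment
var experiment = class extends ExtensionCommon.ExtensionAPI {
  getAPI(context) {
    return {
      experiment: {
        async execute(executable, arrParams) {
          // implementation shown below under "Execute a file"
        },
        async writeFileBinary(filename, data) {
          // implementation shown below under "Save an attachment to disk"
        },
      },
    };
  }
};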
Execute a file:
In your background JS file:
var ret = await browser.experiment.execute("/usr/bin/executable", [ "-v" ]);
In the experiment you can execute like this:
var { ExtensionCommon } = ChromeUtils.import("resource://gre/modules/ExtensionCommon.jsm");
var { FileUtils } = ChromeUtils.import("resource://gre/modules/FileUtils.jsm");
var { Services } = ChromeUtils.import("resource://gre/modules/Services.jsm");
var { XPCOMUtils } = ChromeUtils.import("resource://gre/modules/XPCOMUtils.jsm");
XPCOMUtils.defineLazyGlobalGetters(this, ["IOUtils"]);
async execute(executable, arrParams) {
  var fileExists = await IOUtils.exists(executable);
  if (!fileExists) {
    Services.wm.getMostRecentWindow("mail:3pane")
      .alert("Executable [" + executable + "] not found!");
    return false;
  }
  var progPath = new FileUtils.File(executable);
  let process = Cc["@mozilla.org/process/util;1"].createInstance(Ci.nsIProcess);
  process.init(progPath);
  process.startHidden = false;
  process.noShell = true;
  process.run(true, arrParams, arrParams.length);
  return true;
},
Save an attachment to disk:
In your background JS file you can do something like this:
var f = await messenger.compose.getAttachmentFile(attachment.id);
var blob = await f.arrayBuffer();
var t = await browser.experiment.writeFileBinary(tempFile, blob);
In the experiment you can then write the file like this:
async writeFileBinary(filename, data) {
  // IOUtils.write() expects a Uint8Array, so wrap the incoming ArrayBuffer
  var uint8 = new Uint8Array(data);
  // then we can save it
  var ret = await IOUtils.write(filename, uint8);
  return ret;
},
IOUtils documentation:
https://searchfox.org/mozilla-central/source/dom/chrome-webidl/IOUtils.webidl
FileUtils documentation:
https://searchfox.org/mozilla-central/source/toolkit/modules/FileUtils.jsm
I'm using node.js and pdf2json parser to parse a pdf file.
Currently it is working with a local pdf file.
But I'm trying to get a pdf-file through the URL/HTTP Module of node.js and I want to open this file to parse it.
Is there any possibility to parse/work with an online pdf?
let query = url.parse(req.url, true).query;
let pdfLink = query.pdf;
...
pdfParser.loadPDF(pdfLink + "");
So the URL of the PDF is passed as a query parameter, like this: https://localhost:8080/?pdf=http://whale-cms.de/pdf.pdf
Is there any way to parse the PDF directly from such an online link?
Thanks in advance.
I was just faced with the same problem and found a solution:
var request = require('request');
var PDFParser = require("pdf2json");
var pdfUrl = "http://localhost:3000/cdn/storage/PDFFiles/sk87bAfiXxPre428b/original/sk87bAfiXxPre428b"
var pdfParser = new PDFParser();
var pdfPipe = request({url: pdfUrl, encoding:null}).pipe(pdfParser);
pdfPipe.on("pdfParser_dataError", err => console.error(err) );
pdfPipe.on("pdfParser_dataReady", pdf => {
let usedFieldsInTheDocument = pdfParser.getAllFieldsTypes();
console.log(usedFieldsInTheDocument)
});
Source from:
https://github.com/modesty/pdf2json/issues/65
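Since the request package has since been deprecated, here is a similar sketch using Node's built-in https module together with pdf2json's parseBuffer() (parseBuffer is available in recent pdf2json versions; the URL below is only a placeholder):
const https = require("https");
const PDFParser = require("pdf2json");

const pdfUrl = "https://example.com/some.pdf";   // placeholder URL

https.get(pdfUrl, res => {
  const chunks = [];
  res.on("data", chunk => chunks.push(chunk));
  res.on("end", () => {
    const pdfParser = new PDFParser();
    pdfParser.on("pdfParser_dataError", err => console.error(err));
    pdfParser.on("pdfParser_dataReady", () => {
      console.log(pdfParser.getAllFieldsTypes());
    });
    pdfParser.parseBuffer(Buffer.concat(chunks));
  });
});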
Cheers
I have a large json file that looks like that:
[
{"name": "item1"},
{"name": "item2"},
{"name": "item3"}
]
I want to stream this file (pretty easy so far), run an asynchronous function (that returns a promise) for each line, and edit that line once the promise resolves or rejects.
The resulting file could then look like this:
[
{"name": "item1", "response": 200},
{"name": "item2", "response": 404},
{"name": "item3"} // not processed yet
]
I do not wish to create another file, I want to edit on the fly the SAME FILE (if possible!).
Thanks :)
This doesn't really answer the question, but I don't think it can be answered in a satisfactory way anyway, so here are my two cents.
I assume that you know how to stream line by line and run the function, and that the only problem you have is editing the file that you are reading from.
Consequences of inserting
It is not possible to natively insert data into the middle of any file (which is what you want to do by changing the JSON live). A file can only grow at its end.
So inserting 10 bytes of data at the beginning of a 1GB file means that you need to write 1GB to the disk (to move all the data 10 bytes further).
Your filesystem does not understand JSON, and just sees that you are inserting bytes in the middle of a big file so this is going to be very slow.
So, yes, it is possible to do:
Write a wrapper over the file API in NodeJS with an insert() method.
Then write some more code to be able to know where to insert bytes into a JSON file without loading the whole file and without producing invalid JSON at the end.
Now I would not recommend it :)
=> Read this question: Is it possible to prepend data to a file without rewriting?
Why do it then?
I assume that you want to either:
Be able to kill your process at any time, and easily resume work by reading the file again.
Retry partially treated files to fill only the missing bits.
First solution: Use a database
Abstracting the work that needs to be done to live-edit files at random places is the sole reason databases exist.
They all exist only to abstract the magic behind UPDATE mytable SET name = 'a_longer_name_than_the_name_that_was_there_before' WHERE name = 'short_name'.
Have a look at LevelUP/Down, sqlite, etc...
They will abstract all the magic that needs to be done in your JSON file!
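For example, with SQLite the "edit this line" step turns into a single UPDATE per item (a sketch assuming the better-sqlite3 package and a one-time import of your existing JSON array):
const fs = require('fs');
const Database = require('better-sqlite3');

const db = new Database('items.db');
db.exec('CREATE TABLE IF NOT EXISTS items (name TEXT PRIMARY KEY, response INTEGER)');

// one-time import of the existing JSON array
const insert = db.prepare('INSERT OR IGNORE INTO items (name) VALUES (?)');
for (const item of JSON.parse(fs.readFileSync('./data.json', 'utf8'))) {
  insert.run(item.name);
}

// as each async operation settles, update just that row;
// unprocessed items simply keep response = NULL
const update = db.prepare('UPDATE items SET response = ? WHERE name = ?');
update.run(200, 'item1');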
Second solution: Use multiple files
When you stream your file, write two new files!
One contains the current position in the input file and the lines that need to be retried.
The other one contains the expected result.
You will also be able to kill your process at any time and restart; a rough sketch of this idea follows.
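A rough sketch of that idea (the file names progress.json and results.ldjson are just examples, and the async work itself is left as a placeholder):
var fs = require('fs');
var readline = require('readline');

var progressFile = './progress.json';   // { lastLine: <index of last processed line> }
var resultsFile = './results.ldjson';   // one processed line per line

// resume from wherever the previous run stopped
var lastLine = fs.existsSync(progressFile)
  ? JSON.parse(fs.readFileSync(progressFile, 'utf8')).lastLine
  : -1;

var out = fs.createWriteStream(resultsFile, { flags: 'a' });
var lineNo = -1;

readline.createInterface({ input: fs.createReadStream('./data.json') })
  .on('line', function(line) {
    lineNo++;
    if (lineNo <= lastLine) return;   // already handled in a previous run
    // ... run your async work here and attach the response to the line ...
    out.write(line + '\n');
    fs.writeFileSync(progressFile, JSON.stringify({ lastLine: lineNo }));
  });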
According to this answer, writing to the same file while reading from it is not reliable. As a commenter there says, it is better to write to a temporary file, then delete the original and rename the temp file over it.
To create a stream of lines you can use byline. Then for each line, apply some operation and pipe it out to the output file.
Something like this:
var fs = require('fs');
var stream = require('stream');
var util = require('util');
var LineStream = require('byline').LineStream;
function Modify(options) {
stream.Transform.call(this, options);
}
util.inherits(Modify, stream.Transform);
Modify.prototype._transform = function(chunk, encoding, done) {
var self = this;
setTimeout(function() {
// your modifications here, note that the exact regex depends on
// your json format and is probably the most brittle part of this
var modifiedChunk = chunk.toString();
if (modifiedChunk.search('response:[^,}]+') === -1) {
modifiedChunk = modifiedChunk
.replace('}', ', response: ' + new Date().getTime() + '}') + '\n';
}
self.push(modifiedChunk);
done();
}, Math.random() * 2000 + 1000); // to simulate an async modification
};
var inPath = './data.json';
var outPath = './out.txt';
fs.createReadStream(inPath)
.pipe(new LineStream())
.pipe(new Modify())
.pipe(fs.createWriteStream(outPath))
.on('close', function() {
// replace input with output
fs.unlink(inPath, function() {
fs.rename(outPath, inPath, function(err) { if (err) console.error(err); });
});
});
Note that the above results in only one async operation happening at a time. You could also save the modifications to an array and once all of them are done write the lines from the array to a file, like this:
var fs = require('fs');
var stream = require('stream');
var LineStream = require('byline').LineStream;
var modifiedLines = [];
var modifiedCount = 0;
var inPath = './data.json';
var allModified = new Promise(function(resolve, reject) {
fs.createReadStream(inPath).pipe(new LineStream()).on('data', function(chunk) {
modifiedLines.length++;
var index = modifiedLines.length - 1;
setTimeout(function() {
// your modifications here
var modifiedChunk = chunk.toString();
if (modifiedChunk.search('response:[^,}]+') === -1) {
modifiedChunk = modifiedChunk
.replace('}', ', response: ' + new Date().getTime() + '}');
}
modifiedLines[index] = modifiedChunk;
modifiedCount++;
if (modifiedCount === modifiedLines.length) {
resolve();
}
}, Math.random() * 2000 + 1000);
});
}).then(function() {
fs.writeFile(inPath, modifiedLines.join('\n'), function(err) { if (err) console.error(err); });
}).catch(function(reason) {
console.error(reason);
});
If instead of lines you wish to stream chunks of valid json which would be a more robust approach, take a look at JSONStream.
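For completeness, a small sketch of the JSONStream approach (assuming the JSONStream package; parse('*') emits one parsed element per array entry):
var fs = require('fs');
var JSONStream = require('JSONStream');

fs.createReadStream('./data.json')
  .pipe(JSONStream.parse('*'))   // one parsed object per array element
  .on('data', function(item) {
    // item is e.g. { name: 'item1' }; run your async work here
  })
  .on('end', function() {
    // the whole array has been read
  });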
As mentioned in the comment, the file you have is not proper JSON, although it is valid JavaScript. To generate proper JSON, JSON.stringify() could be used. I also think nonstandard JSON would make life difficult for anyone else who has to parse it, so I would recommend producing a new output file instead of keeping the original one.
However, it is still possible to parse the original file. This can be done via eval('(' + procline + ')');, but it is not secure to feed external data into node.js like this.
const fs = require('fs');
const readline = require('readline');
const fr = fs.createReadStream('file1');
const rl = readline.createInterface({
input: fr
});
rl.on('line', function (line) {
if (line.match(new RegExp("\{name"))) {
var procline = "";
if (line.trim().split('').pop() === ','){
procline = line.trim().substring(0,line.trim().length-1);
}
else{
procline = line.trim();
}
var lineObj = eval('(' + procline + ')');
lineObj.response = 200;
console.log(JSON.stringify(lineObj));
}
});
The output would be like this:
{"name":"item1","response":200}
{"name":"item2","response":200}
{"name":"item3","response":200}
Which is line-delimited JSON (LDJSON) and could be useful for streaming stuff, without the need for leading and trailing [, ], or ,. There is an ldjson-stream package for it as well.
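To consume that output later, each line can be parsed on its own; here is a sketch using Node's built-in readline module (the out.ldjson file name is just an example):
const fs = require('fs');
const readline = require('readline');

const rl = readline.createInterface({ input: fs.createReadStream('./out.ldjson') });

rl.on('line', line => {
  if (!line.trim()) return;        // skip blank lines
  const obj = JSON.parse(line);    // each line is a complete JSON object
  console.log(obj.name, obj.response);
});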
Below is my test directory structure:
I have made a script which works on a PSD file in the finals folder. My aim is to save it to the tifs folder. This is the code I have:
app.activeDocument.saveAs(file."../tifs", TiffSaveOptions, true, Extension.LOWERCASE);
I am well and truly stuck. I have tried so many combinations and everything throws an error. I just want to come out of the finals folder, go into the tifs folder, and save.
Any help would be much appreciated. :)
You've not set up your file path correctly. I suspect "../tifs" isn't working as you'd hoped. Here it is in full.
// Flatten the tiff
app.activeDocument.flatten();
// set up the new directory
// make sure you change this or
// have a folder in c:\testpsd\tifs
var myFolder = "c:\\testpsd\\tifs"; // add extra escape slash
// get the documents name
var myFileName = app.activeDocument.name;
// remove its extension
var myDocName = myFileName.substring(0,myFileName.length -4);
// set the new filename and path
var myFilePath = myFolder + "/" + myDocName + ".tiff";
// tiff file options
var tiffFile = new File(myFilePath);
tiffSaveOptions = new TiffSaveOptions();
tiffSaveOptions.byteOrder = ByteOrder.MACOS;
tiffSaveOptions.layers = false;
tiffSaveOptions.transparency = true;
tiffSaveOptions.alphaChannels = true;
tiffSaveOptions.embedColorProfile = false;
tiffSaveOptions.imageCompression = TIFFEncoding.TIFFLZW;
tiffSaveOptions.saveImagePyramid = false;
// finally save out the document
activeDocument.saveAs(tiffFile, tiffSaveOptions, false, Extension.LOWERCASE);
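If you would rather not hard-code the folder, you can derive it from the document's own location instead (a sketch, assuming the open PSD sits inside the finals folder and tifs is its sibling, as in the question's directory structure):
// derive the tifs folder from the open document instead of hard-coding it
var docFolder = app.activeDocument.path;                // Folder containing the PSD (finals)
var tifsFolder = new Folder(docFolder.parent + "/tifs");
if (!tifsFolder.exists) {
    tifsFolder.create();
}
var myFolder = tifsFolder.fsName;                       // use this in place of "c:\\testpsd\\tifs"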
I'm trying to save a file in Illustrator using Javascript but I keep getting an error.
Here is what works, but is not what I want:
// save as
var dest = "~/testme.pdf";
saveFileToPDF(dest);
function saveFileToPDF (dest) {
var doc = app.activeDocument;
if ( app.documents.length > 0 ) {
var saveName = new File ( dest );
saveOpts = new PDFSaveOptions();
saveOpts.compatibility = PDFCompatibility.ACROBAT5;
saveOpts.generateThumbnails = true;
saveOpts.preserveEditability = true;
alert(saveName);
doc.saveAs( saveName, saveOpts );
}
}
The var "dest" saves the file to the root of my Mac user account. I simply want to save the file relative to the source document in a subfolder, so I tried this:
var dest = "exports/testme.pdf";
This brings up a dialogue with ".pdf" highlighted, properly awaiting input inside the "exports" folder that I already created. I can type something and it will save, but it ignores the file name "testme.pdf" that was specified in the code. I can type "cheese" over the highlighted ".pdf" it knows I want, and it will save "cheese.pdf" in the folder "exports".
I also tried these with no luck:
var dest = "exports/testme";
var dest = "/exports/testme.pdf";
var dest = "testme.pdf";
etc., etc.
What am I missing?
To use saveAs without a dialog popping up, you need to use the global property userInteractionLevel:
var originalInteractionLevel = userInteractionLevel;
userInteractionLevel = UserInteractionLevel.DONTDISPLAYALERTS;
...
userInteractionLevel = originalInteractionLevel;
Since you want to save relative to your document, first find the path of your current document as follows:
var path = app.activeDocument.path;
var dest = path + "/exports/testme.pdf";
You can also check whether the exports folder exists, and create it from the script if it doesn't:
var path = app.activeDocument.path;
var exportFolder = Folder(path + "/exports");
if(!exportFolder.exists){
exportFolder.create();
}
var dest = exportFolder + "/testme.pdf";
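Putting it together with the saveFileToPDF function from the question, the whole thing becomes (a sketch that only reuses names already defined above):
// build the destination relative to the open document, create the
// subfolder if needed, then call the question's saveFileToPDF function
var docPath = app.activeDocument.path;
var exportFolder = new Folder(docPath + "/exports");
if (!exportFolder.exists) {
    exportFolder.create();
}

var originalInteractionLevel = userInteractionLevel;
userInteractionLevel = UserInteractionLevel.DONTDISPLAYALERTS;
try {
    saveFileToPDF(exportFolder + "/testme.pdf");
} finally {
    userInteractionLevel = originalInteractionLevel;
}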