I am trying to download multiple files from a OneDrive folder. My code is below, but it only downloads the last file, not all of them:
for (const f in files) {
    var fileURL = files[f]["@microsoft.graph.downloadUrl"];
    var fileName = JSON.stringify(files[f].name).slice(1, -1);
    var request = https.get(fileURL, function (response) {
        console.log(fileURL);
        if (response.statusCode == 200) {
            var file = fs.createWriteStream(`./temp/${userId}/${fileName}`);
            response.pipe(file);
        }
        request.setTimeout(60000, function () {
            request.destroy();
        });
    });
}
That is, the console log prints
FILE_URL1
FILE_URL1
FILE_URL1
rather than
FILE_URL1
FILE_URL2
FILE_URL3
Note that if the console.log(fileURL) is placed before the https.get call, it prints all three file URLs. I'm not sure if it's a problem with the loop or something else. I am quite new to JavaScript, so I don't know a lot.
Replace the var with const or let and you will see a different result.
See: What's the difference between using "let" and "var"?
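To illustrate why that change matters, here is a small self-contained sketch (not the asker's OneDrive code): with var there is one shared, function-scoped binding that every callback closes over, while let creates a fresh block-scoped binding per loop iteration.

```javascript
// With `var`, all callbacks close over the same variable, so they all
// see the value from the last iteration. With `let`, each iteration
// gets its own binding, so each callback sees its own value.
function collectWithVar() {
    var callbacks = [];
    for (var i = 0; i < 3; i++) {
        var url = 'FILE_URL' + (i + 1); // one shared binding
        callbacks.push(function () { return url; });
    }
    return callbacks.map(function (cb) { return cb(); });
}

function collectWithLet() {
    var callbacks = [];
    for (let i = 0; i < 3; i++) {
        let url = 'FILE_URL' + (i + 1); // fresh binding per iteration
        callbacks.push(function () { return url; });
    }
    return callbacks.map(function (cb) { return cb(); });
}

console.log(collectWithVar()); // [ 'FILE_URL3', 'FILE_URL3', 'FILE_URL3' ]
console.log(collectWithLet()); // [ 'FILE_URL1', 'FILE_URL2', 'FILE_URL3' ]
```

This is exactly the pattern in the question: the https.get callbacks run later, after the loop has finished, so with var they all read the final value of fileURL.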
I would like to write a Thunderbird add-on that encrypts stuff. For this, I have already extracted all the data from the compose window. Now I have to save it into files and run a local executable for encryption. But I have found no way to save files and execute an executable on the local machine. How can I do that?
I found the File and Directory Entries API documentation, but it does not seem to work. I always get undefined when trying to get the object with this code:
var filesystem = FileSystemEntry.filesystem;
console.log(filesystem); // --> undefined
At least, is there a working AddOn that I can examine to find out how this is working and maybe what permissions I have to request in the manifest.json?
NOTE: Must work cross-platform (Windows and Linux).
The answer is that WebExtensions are currently not able to execute local files. Saving to a local folder on the disk is also not possible.
Instead, you need to add a WebExtension Experiment to your project and use the legacy APIs there. You can use the IOUtils and FileUtils modules to reach your goal:
Execute a file:
In your background JS file:
var ret = await browser.experiment.execute("/usr/bin/executable", [ "-v" ]);
In the experiment you can execute like this:
var { ExtensionCommon } = ChromeUtils.import("resource://gre/modules/ExtensionCommon.jsm");
var { FileUtils } = ChromeUtils.import("resource://gre/modules/FileUtils.jsm");
var { XPCOMUtils } = ChromeUtils.import("resource://gre/modules/XPCOMUtils.jsm");
XPCOMUtils.defineLazyGlobalGetters(this, ["IOUtils"]);

async execute(executable, arrParams) {
    var fileExists = await IOUtils.exists(executable);
    if (!fileExists) {
        Services.wm.getMostRecentWindow("mail:3pane")
            .alert("Executable [" + executable + "] not found!");
        return false;
    }
    var progPath = new FileUtils.File(executable);
    let process = Cc["@mozilla.org/process/util;1"].createInstance(Ci.nsIProcess);
    process.init(progPath);
    process.startHidden = false;
    process.noShell = true;
    process.run(true, arrParams, arrParams.length);
    return true;
},
Save an attachment to disk:
In your background JS file you can do it like this:
var f = await messenger.compose.getAttachmentFile(attachment.id);
var blob = await f.arrayBuffer();
var t = await browser.experiment.writeFileBinary(tempFile, blob);
In the experiment you can then write the file like this:
async writeFileBinary(filename, data) {
    // first convert the ArrayBuffer to the Uint8Array that IOUtils.write expects
    var uint8 = new Uint8Array(data);
    // then save it
    var ret = await IOUtils.write(filename, uint8);
    return ret;
},
IOUtils documentation:
https://searchfox.org/mozilla-central/source/dom/chrome-webidl/IOUtils.webidl
FileUtils documentation:
https://searchfox.org/mozilla-central/source/toolkit/modules/FileUtils.jsm
How do I transfer a zip archive generated on the server back to the client? I'm using AngularJS and SailsJS. Currently I set the HTTP headers to match the content type, generate the archive using archiver, and pipe the data into the res object before calling res.end().
The file data is successfully placed inside the XHR response, but the file is never downloaded on the client's side - unless I make an API call to zipFiles (see the code below).
How do I fix this?
zipFiles: async function (req, res) {
    var archiver = require('archiver');
    var year = req.allParams().year;
    var quarter = req.allParams().quarter;
    /*
     * FIXME: This is dangerous, the same code is present in api/controllers/sirka/SirkaShStatController.js
     * FIXME: A globally-available file should contain all relevant paths
     */
    var src_path = __some__path__;
    var file_name = `download.zip`;
    // Set HTTP headers to match the contents of the response
    res.writeHead(200, {
        'Content-Type': 'application/zip',
        'Content-Disposition': `attachment; filename=${file_name}`,
    });
    var archive = archiver('zip');
    archive.on('error', function (err) {
        throw err;
    });
    // Once the archive has been finished (by archive.finalize()) send the file
    archive.on('finish', function () {
        sails.log.info('Archive finished, sending...');
        res.end();
    });
    // Pipe the archive data into the response object
    archive.pipe(res);
    // Append files found in src_path at the top level of the archive
    archive.directory(src_path, false);
    archive.finalize();
}
After a lot of searching and tinkering I've finally managed to solve the issue. I'll try to explain the different approaches that I took and their results.
1st approach
Generate the ZIP-file in-memory and transfer the binary data back to the user through the request.
This approach failed (see the original question): since the call to zip the files was made through XHR/AJAX, even though it was possible to pipe the data into the response, it couldn't be fetched on the client side.
2nd approach
Create the zip-file on the server, then represent the binary data as a Buffer. With this approach, I could simply return the buffer back to the caller by calling res.ok(data) once the zip-file was fully generated:
var archiver = require('archiver');
var archive = archiver('zip');
var fs = require('fs');
var output = fs.createWriteStream(dst_path);
archive.on('error', function(err) {
throw err;
});
// Once the archive has been finished (by archive.finalize()) send the file
archive.on('finish', function() {
sails.log.info('Archive finished, sending...');
});
output.on('close', function () {
var data = fs.readFileSync(dst_path);
console.log(data);
console.log(Buffer.byteLength(data));
return res.ok(data);
})
// Pipe the archive data into the response object
archive.pipe(output);
// Append files found in src_path at the top level of the archive
archive.directory(src_path, false);
archive.finalize();
Then on the client side I simply receive the data, convert it to a Uint8Array and wrap it in another array. The wrapping is necessary since the data makes up one whole part of the Blob.
Then once the Blob is generated, I create an ObjectURL for it and attach the link to an invisible a element that is automatically clicked.
var dataBuffer = res["data"];
var binaryData = new Uint8Array(dataBuffer);
var blobParts = [binaryData];
var blob = new Blob(blobParts, {type: 'application/zip'});
var downloadUrl = URL.createObjectURL(blob);
var a = document.createElement('a');
document.body.appendChild(a);
a.style = "display: none";
a.href = downloadUrl;
a.download = `Sygehusstatistik-${year}-K${quarter}.zip`;
a.click()
I've had issues with the generated zip-file being placed inside itself recursively. To avoid that, ensure that src_path != dst_path.
I am not using the multer package because I am not using Express, so I am not sure how multer could work with Sails.js.
Anyway, I am trying to upload multiple files to S3. At first I used a for loop, which did not work because the for loop is synchronous and the file upload is asynchronous.
Then I read that recursion would work, so I tried that, but somehow it still doesn't.
The files are uploaded, but the size isn't right for all of them.
The size might be bigger or smaller than the original, and when I download a file, say a doc file, I either get an error saying it's not an MS Word file or its contents are scrambled. If it's a PDF, it says it failed to open the PDF file.
If I try with only one file, it works sometimes, but not always.
Did I do something wrong with the codes below?
s3_upload_multi: async function (req) {
    try {
        let fieldName = req._fileparser.upstreams[0].fieldName;
        let files = req.file(fieldName)._files;
        let return_obj = [];
        const upload_rec = files => {
            if (files.length <= 0) return return_obj;
            const f = files.pop();
            const fileUpload = f.stream;
            const s3 = new AWS.S3();
            // https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#putObject-property
            s3.putObject({ // uses the S3 SDK
                Bucket: sails.config.aws.bucket,
                Key: 'blahblahblahblahblah',
                Body: fileUpload._readableState.buffer.head.data, // buffer from file
                ACL: 'public-read',
            }, function (err, data) {
                if (err) reject(err);
                return_obj.push(data);
                console.log(return_obj, 'return_obj');
            });
            return upload_rec(files);
        };
        upload_rec(files);
    } catch (e) {
        console.log(e, 'inside UploadService');
        return false;
    }
}
Thanks in advance for any advice and suggestions.
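The core problem described above (a synchronous loop or recursion driving asynchronous uploads, so the next upload starts before the previous one finishes) can be sketched with plain Promises. In this sketch, uploadOne is a hypothetical stand-in for the s3.putObject call, not part of the original code:

```javascript
// Hypothetical stand-in for a callback-style upload such as s3.putObject,
// wrapped in a Promise so it can be awaited.
function uploadOne(file) {
    return new Promise(function (resolve) {
        setTimeout(function () {
            resolve({ key: file.name, size: file.data.length });
        }, 10);
    });
}

// Await each upload before starting the next one, so uploads never overlap
// and the results arrive in order.
async function uploadAll(files) {
    const results = [];
    for (const f of files) {
        results.push(await uploadOne(f));
    }
    return results;
}

uploadAll([
    { name: 'a.pdf', data: 'aaa' },
    { name: 'b.doc', data: 'bbbb' },
]).then(function (results) {
    console.log(results.length); // 2
});
```

With the real SDK, the same shape applies: wrap each s3.putObject call in a Promise (resolve in its callback) and await it inside the loop, rather than recursing while the previous callback is still pending.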
I have a function in my NW.js app that downloads a bunch of files from the server and saves them in the folder chosen by the user, with the names sent from the server. I do not know the names of the files in advance: the URLs I am using are randomly-generated strings that I got from another server, and this server looks up each hash to see which file it corresponds to.
var regexp = /filename=\"(.*)\"/gi;
media_urls.forEach(function (url) {
    var req = client.request(options, function (res) {
        var file_size = parseInt(res.headers['content-length'], 10);
        var content_disposition = res.headers['content-disposition'];
        var name = regexp.exec(content_disposition)[1];
        var path = Path.join(save_dir, name);
        var file = fs.createWriteStream(path);
        file.on('error', function (e) {
            console.log(e);
            req.abort();
        });
        res.on('data', function (chunk) {
            file.write(chunk);
        });
        res.on('end', function () {
            file.end();
        });
    });
    req.on('error', function (e) {
        console.log(e);
    });
    req.end();
});
I keep getting ENOENT errors when this code runs. This doesn't make sense to me, because the file is supposed to be created at this point, so of course it doesn't exist yet!
Why am I getting this error instead of having the file downloaded?
The file names coming from the server had colons (:) in them. A colon is a valid filename character on Linux ext4, but not on Windows NTFS.
Changing
var name = regexp.exec(content_disposition)[1];
to
var name = regexp.exec(content_disposition)[1].replace(':', '-');
solved this particular problem.
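One caveat worth adding (my addition, not part of the original answer): calling replace with a string pattern only replaces the first occurrence. If a filename can contain several colons, a global regular expression handles all of them:

```javascript
// replace(':', '-') would only fix the first colon; /:/g fixes them all
var content_disposition = 'attachment; filename="video:12:30.mp4"';
var name = /filename="(.*)"/i.exec(content_disposition)[1].replace(/:/g, '-');
console.log(name); // "video-12-30.mp4"
```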
I am working on an offline application using HTML5 and jQuery for mobile. I want to back up files from local storage using JSZip. Below is a code snippet of what I have done:
if (localStorageKeys.length > 0) {
    for (var i = 0; i < localStorageKeys.length; i++) {
        var key = localStorageKeys[i];
        if (key.search(_instrumentId) != -1) {
            var data = localStorage.getItem(localStorageKeys[i]);
            var zip = new JSZip();
            zip.file(localStorageKeys[i] + ".txt", data);
            var datafile = document.getElementById('backupData');
            datafile.download = "DataFiles.zip";
            datafile.href = window.URL.createObjectURL(zip.generate({ type: "blob" }));
        }
        else {
        }
    }
}
In the code above I am looping through the localStorage contents and saving each entry in text format. The challenge I am facing is how to create several text files inside DataFiles.zip, as currently I am only able to create one text file inside the zipped folder. I am new to JavaScript, so bear with any ambiguity in my question.
Thanks in advance.
Just keep calling zip.file().
Look at the example from their documentation page (comments mine):
var zip = new JSZip();
// Add a text file with the contents "Hello World\n"
zip.file("Hello.txt", "Hello World\n");
// Add another text file with the contents "Goodbye, cruel world\n"
zip.file("Goodbye.txt", "Goodbye, cruel world\n");
// Add a folder named "images"
var img = zip.folder("images");
// Add a file named "smile.gif" to that folder, from some Base64 data
img.file("smile.gif", imgData, {base64: true});
zip.generateAsync({type:"base64"}).then(function (content) {
location.href="data:application/zip;base64," + content;
});
The important thing is to understand the code you've written - learn what each line does. If you do this, you'd realize that you just need to call zip.file() again to add another file.
Adding to @Jonathon Reinhart's answer:
You could also set both the file name and path at the same time.
// create a file and a folder
zip.file("nested/hello.txt", "Hello World\n");
// same as
zip.folder("nested").file("hello.txt", "Hello World\n");
If you receive a list of files (from the UI, an array, or whatever), you can compress them into a folder and then archive it. The code is something like this:
function upload(files) {
    var zip = new JSZip();
    let archive = zip.folder("test");
    files.map(function (file) {
        // note: add each file to the archive folder, not to the input list
        archive.file(file.name, file.raw, { base64: true });
    });
    return archive.generateAsync({
        type: "blob",
        compression: "DEFLATE",
        compressionOptions: {
            level: 6
        }
    }).then(function (content) {
        // send to server or whatever operation
    });
}
This worked for me with multiple JSON files. Maybe it helps.
In case you want to zip files and need a base64 output, you can use the code below:
import * as JSZip from 'jszip';

var zip = new JSZip();
zip.file("Hello.json", this.fileContent);
zip.generateAsync({ type: "base64" }).then(function (content) {
    const base64Data = content;
});