I'm trying to download a file (without saving it) and convert it, but readFileSync throws an error that the file can't be found: "Error: ENOENT: no such file or directory, open 'https://s3.amazonaws.com/appforest_uf/f1631452514756x615162562554826200/testdoc.txt'"
Code:
var path = require('path');
const fs = require('fs');
const https = require('https');
const file_x = 'https://s3.amazonaws.com/appforest_uf/f1631452514756x615162562554826200/testdoc.txt';
var filename = path.basename(file_x);
const FileBuffer = fs.readFileSync(file_x);
const fileParam = {
  value: FileBuffer,
  options: {
    filename: filename,
    contentType: 'application/octet-stream'
  }
};
console.log(fileParam);
And I get the error that the file can't be found. The URL itself works OK, so do I need to do something else to download from a URL?
Did you try looking for similar questions? For example, readFileSync not reading URL as file. There it is clearly stated that the fs API only deals with local files.
You will need to use the https module to stream the file.
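For example, here is a minimal sketch that streams the download into an in-memory Buffer, reusing the URL and the fileParam shape from the question:

const https = require('https');
const path = require('path');

const file_x = 'https://s3.amazonaws.com/appforest_uf/f1631452514756x615162562554826200/testdoc.txt';

https.get(file_x, (res) => {
  const chunks = [];
  res.on('data', (chunk) => chunks.push(chunk));
  res.on('end', () => {
    // The equivalent of the FileBuffer that readFileSync was expected to return
    const FileBuffer = Buffer.concat(chunks);
    const fileParam = {
      value: FileBuffer,
      options: {
        filename: path.basename(file_x),
        contentType: 'application/octet-stream'
      }
    };
    console.log(fileParam);
  });
}).on('error', (err) => console.error(err));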
The fs module only reads local files, not URLs. Alternatively, you can use a package like axios, which lets you make HTTP requests from Node.js.
Example usage:
const axios = require('axios');
axios.get('https://s3.amazonaws.com/appforest_uf/f1631452514756x615162562554826200/testdoc.txt').then(response => {
  // The file contents live in response.data, not the response object itself
  console.log(response.data);
});
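If, as in the question, you need the raw bytes as a Buffer rather than a string, axios accepts a responseType option. A sketch along the same lines, with file_x as in the question:

const axios = require('axios');
const path = require('path');

const file_x = 'https://s3.amazonaws.com/appforest_uf/f1631452514756x615162562554826200/testdoc.txt';

axios.get(file_x, { responseType: 'arraybuffer' }).then(response => {
  // In Node, response.data is a Buffer when responseType is 'arraybuffer'
  const FileBuffer = Buffer.from(response.data);
  console.log(path.basename(file_x), FileBuffer.length);
});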
This is my first Electron/Node application. I'm trying to use a JSON file as a datastore, so I created a simple one, index.json, under the app folder next to index.js|css|html.
I installed the npm package jsonfile, which loads just fine.
When I try to load my JSON file, an error is raised claiming that there is no JSON file, and I can see in the DevTools Sources tab that my JSON file is not there (not loaded).
I tried a force reload from the Electron app menu.
Here is the code in my file that reads the JSON:
const jsonfile = require('jsonfile')
const file = '/index.json';
var json;
jsonfile.readFile(file)
  .then(obj => json = obj)
  .catch(error => console.error(error))
------------ Edit
Correcting the path name to index.json or ./index.json raises the same issue.
You can use the native fs (filesystem) module.
let path = "index.json"
const fs = require('fs');
const json = JSON.parse(fs.readFileSync(path));
Thanks for your support
For me, the issue was more about file system handling than Electron.
All I did was chmod my project folder to make sure I can read and write the index.json datastore:
sudo chmod -R 777 /opt/workspaces/electron/myElectronPrpjectFolder
Then, for better path resolution, I used the basic idea from the Electron archetype; it is more error-safe:
const path = require('path')
const file = path.join(__dirname,'index.json');
var json;
var html = ""; // The returned object.
$(document).ready(function () {
  jsonfile.readFile(file)
    .then(obj => {
      // Deep copy of the parsed object
      json = JSON.parse(JSON.stringify(obj));
      console.log(JSON.stringify(json));
      parseIssues(json.children);
      document.getElementById('a').innerHTML = html;
    })
    .catch(error => console.error(error));
});
You can see that I'm using jQuery in this snippet, but it also works without jQuery.
In summary: a better path resolution policy, with the right privileges granted on the folder.
Thanks
Is it possible to use res.download() after writing a file to the filesystem?
router.get('/exportjson', (req, res, next) => {
  let json = `{"#dope":[{"set":"","val":"200"}],"comment":"comment","folderType":"window"}`
  const file = `${__dirname}/upload-folder/export.JSON`;
  fs.writeFile('file', json, 'application/json', function(){
    res.download(file);
  })
})
I'm not sure I fully understand your question, but I'm assuming you want to be able to save that json data to the path /upload-folder/export.json and then allow the browser to download the file using res.download() at the path GET /exportjson.
You've got a couple of issues. First, fs.writeFile takes a file path as the first argument, and you are passing the literal string 'file'. With your code, the data would be written to the current working directory in a file named file. You probably want to use the path module to build the path to the file you want to write, like so:
const path = require('path');
const jsonFilePath = path.join(__dirname, '../upload-folder/export.json');
Assuming the code is at routes/index.js, this path would point to the root directory of the project to the file upload-folder/export.json.
The data you want to write is in your variable json, but you have it stored as a string. I would actually leave it as an object:
let json = {
  "#dope": [
    {
      "set": "",
      "val": "200"
    }
  ],
  "comment": "comment",
  "folderType": "window"
};
And then call JSON.stringify on it when you pass it to fs.writeFile as the second argument. You will also need to pass in the utf-8 option as the third argument, not application/json:
fs.writeFile(jsonFilePath, JSON.stringify(json), 'utf-8', function(err) {
In the callback to fs.writeFile, you want to call res.download and pass it the path to the file that you just wrote to the filesystem, which is stored in jsonFilePath (you had this part right, I just changed the variable name):
res.download(jsonFilePath);
Here is the relevant portion of the router file that has code to get everything working correctly:
const fs = require('fs');
const path = require('path');
const jsonFilePath = path.join(__dirname, '../upload-folder/export.json');
router.get('/exportjson', (req, res, next) => {
  let json = {
    "#dope": [
      {
        "set": "",
        "val": "200"
      }
    ],
    "comment": "comment",
    "folderType": "window"
  };
  fs.writeFile(jsonFilePath, JSON.stringify(json), 'utf-8', function(err) {
    if (err) return console.log(err);
    res.download(jsonFilePath);
  });
});
Assuming this file lives in /routes/index.js, the file would be saved at /upload-folder/export.json.
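With that in place, opening GET /exportjson in the browser should both write the file and prompt a download of export.json.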
I am trying to get JSON values using Node.js, but it is not working. I have searched related questions on Stack Overflow, but I always end up with [object Object]. I do not know why I am getting this. Can anyone resolve this issue?
file.json:
{
  "scripts": {
    "mr": "place",
    "kg": "time",
    "bh": "sec"
  }
}
extension.js:
var fs = require("fs");
var file = JSON.parse(fs.readFileSync("c:\\xampp\\htdocs\\projects\\file.json", "utf8"));
console.log(file);
This is not a duplicate. I have tried many approaches, but none worked.
Note: I am using this code inside my Visual Studio Code extension.
In Node, you can import a JSON file as if it were a JavaScript module:
const file = require('./file.json')
console.log(file)
See is there a require for json in node.js for more info
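One caveat worth knowing: require() caches the parsed result, so if file.json changes while the process is running, you keep getting the old data. A small sketch of re-reading with fs instead, using the same file name as above:

const fs = require('fs');

// Unlike require('./file.json'), this re-reads the file on every call
const fresh = JSON.parse(fs.readFileSync('./file.json', 'utf8'));
console.log(fresh.scripts);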
const data = require("./file.json")
console.log(data.scripts)
Try this one out; it is simple.
const fs = require('fs');
const path = require('path');

console.log(path.join(__dirname, '../file.json'));
let file = JSON.parse(fs.readFileSync(path.join(__dirname, '../file.json'), "utf8"));
__dirname gives you the directory of the current file; I used path.join to build a path relative to it.
In my case, I put the JSON file in the parent directory.
In the following path:
app/test/vehicle/
there are several .js files. Is there any way in JavaScript to get the number of .js files contained in the above-mentioned path?
I am using the WebStorm environment.
Yes, using node:
const fs = require('fs');
const dir = './directory';

fs.readdir(dir, (err, files) => {
  // Count only the .js entries, per the question
  console.log(files.filter(file => file.endsWith('.js')).length);
});
https://stackoverflow.com/a/43747896/3650835
Since JavaScript doesn't have local filesystem access when run in the browser, there is no way to access local files from browser-side JavaScript. If you are asking about enumerating files hosted on a CDN or something similar in that directory, there is still no way to tell which requests will return a 2xx.
If you are using nodejs, use the fs.readdir function:
const fs = require('fs');
fs.readdir('app/test/vehicle/', (err, files) => {
  files.forEach(file => { console.log(file); });
})
There is no way to access the file system from a web page.
If you are using nodejs, then you can use fs.readdir or fs.readdirSync:
const fs = require('fs');
const dir = './directory';
fs.readdir(dir, (err, files) => {
  console.log(files.filter(file => file.endsWith(".js")).length);
});
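The synchronous variant mentioned above would look like this (a sketch, using the path from the question):

const fs = require('fs');

// Blocking, but fine for scripts and one-off tooling
const count = fs.readdirSync('app/test/vehicle/')
  .filter(file => file.endsWith('.js'))
  .length;
console.log(count);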
Where are you trying to get the list of files? On the client side it is impossible: JavaScript does not have access to the client file system. If you are trying to do it on the server side using, for example, Node.js, then you can do something like:
var filesys = require('fs');
var files = filesys.readdirSync('/app/test/vehicle/');
I am trying to unzip a gzipped file in Node but I am running into the following error.
Error: incorrect header check
at Zlib._handle.onerror (zlib.js:370:17)
Here is the code that causes the issue.
'use strict'
const fs = require('fs');
const request = require('request');
const zlib = require('zlib');
const path = require('path');
var req = request('https://wiki.mozilla.org/images/f/ff/Example.json.gz').pipe(fs.createWriteStream('example.json.gz'));

req.on('finish', function() {
  var readstream = fs.createReadStream(path.join(__dirname, 'example.json.gz'));
  var writestream = fs.createWriteStream('example.json');
  var inflate = zlib.createInflate();
  readstream.pipe(inflate).pipe(writestream);
});
// Note: using the file system because the files will eventually be much larger
Am I missing something obvious? If not, how can I determine what is throwing the error?
The file is gzipped, so you need to use zlib.Gunzip instead of zlib.Inflate.
Also, streams are very efficient in terms of memory usage, so if you want to perform the retrieval without storing the .gz file locally first, you can use something like this:
request('https://wiki.mozilla.org/images/f/ff/Example.json.gz')
  .pipe(zlib.createGunzip())
  .pipe(fs.createWriteStream('example.json'));
Otherwise, you can modify your existing code:
var gunzip = zlib.createGunzip();
readstream.pipe(gunzip).pipe(writestream);
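Putting the pieces together with the file names from the question, and adding 'error' handlers, a sketch might look like this (without the handlers, a stream error crashes the process):

const fs = require('fs');
const path = require('path');
const zlib = require('zlib');

const readstream = fs.createReadStream(path.join(__dirname, 'example.json.gz'));
const writestream = fs.createWriteStream('example.json');
const gunzip = zlib.createGunzip(); // gunzip, not inflate, for .gz files

readstream.pipe(gunzip).pipe(writestream);

readstream.on('error', (err) => console.error('read failed:', err));
gunzip.on('error', (err) => console.error('gunzip failed:', err));
writestream.on('error', (err) => console.error('write failed:', err));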