I just started working with the Microsoft Azure Storage SDK for Node.js (https://github.com/Azure/azure-storage-node) and have already successfully uploaded my first PDF files to the cloud storage.
However, now I've started looking at the documentation in order to download my files as a Node Buffer (so I don't have to use fs.createWriteStream), but the documentation doesn't give any examples of how this works. The only thing it says is "There are also several ways to download files. For example, getFileToStream downloads the file to a stream:", and then it shows a single example, which uses fs.createWriteStream, which I don't want to use.
I was also unable to find anything on Google that really helped me, so I was wondering if anybody has experience with this and could share a code sample?
The getFileToStream function needs a writable stream as a parameter. If you want all the data written to a Buffer instead of a file, you just need to create a custom writable stream.
const { Writable } = require('stream');

let chunks = [];
const myWriteStream = new Writable({
  write(chunk, encoding, callback) {
    chunks.push(chunk); // collect each chunk as it arrives
    callback();
  }
});

myWriteStream.on('finish', function () {
  // all the data is stored inside this dataBuffer
  let dataBuffer = Buffer.concat(chunks);
});
Then pass myWriteStream to the getFileToStream function:
fileService.getFileToStream('taskshare', 'taskdirectory', 'taskfile', myWriteStream, function (error, result, response) {
  if (!error) {
    // file retrieved
  }
});
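Note that the download is asynchronous, so dataBuffer is only available inside the stream's 'finish' handler (or after it has fired), not immediately after calling getFileToStream.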
I have a program that allows users to upload video files to Firebase Storage. If the file is not an MP4, I send it to a third-party video converting service to convert it to MP4. That service then hits a webhook (a Firebase Function) with the URL and other information about the converted file.
Right now I'm trying to download the file to the tmp dir in Firebase Functions and then send it to Firebase Storage. I have the following questions:
Can I bypass downloading the file to the function's tmp dir and just save it directly to Storage? If so, how?
I'm currently having trouble downloading to the function's tmp dir; my code is below. The function is returning: Function execution took 6726 ms, finished with status: 'crash'
import { join } from 'path'
import { tmpdir } from 'os'

export async function downloadExternalFile(url: string, fileName: string) {
    const axios = await import('axios')
    const fs = await import('fs')
    const workingDir = join(tmpdir(), 'downloadedFiles')
    const tmpFilePath = join(workingDir, fileName)
    // Make sure the target directory exists before opening the write stream
    await fs.promises.mkdir(workingDir, { recursive: true })
    const writer = fs.createWriteStream(tmpFilePath)
    const response = await axios.default.get(url, { responseType: 'stream' })
    response.data.pipe(writer)
    await new Promise<void>((resolve, reject) => {
        writer.on('error', err => {
            writer.close()
            reject(err)
        })
        // 'finish' fires once all data has been flushed to the file
        writer.on('finish', () => {
            resolve()
        })
    })
}
As mentioned above in the comments section, you can use the Cloud Storage Node.js SDK to upload your file to Cloud Storage.
Please take a look at the SDK Client reference documentation where you can find numerous samples and more information about this Cloud Storage client library.
Also, I'd like to remind you that you can bypass writing to /tmp by using a pipeline. According to the documentation for Cloud Functions, "you can process a file on Cloud Storage by creating a read stream, passing it through a stream-based process, and writing the output stream directly to Cloud Storage."
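To illustrate, here is a minimal sketch of that pipeline approach, assuming the converted file is fetched with axios; the bucket name and destination path are placeholders you would replace with your own:

const axios = require('axios');
const { Storage } = require('@google-cloud/storage');

async function streamToStorage(url, destPath) {
    const storage = new Storage();
    const bucket = storage.bucket('my-project.appspot.com'); // hypothetical bucket name
    const response = await axios.get(url, { responseType: 'stream' });

    // Pipe the HTTP response straight into a Cloud Storage write stream,
    // so nothing is ever written to the function's /tmp directory
    await new Promise((resolve, reject) => {
        response.data
            .pipe(bucket.file(destPath).createWriteStream({ resumable: false }))
            .on('error', reject)
            .on('finish', resolve);
    });
}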
Last but not least, always delete temporary files from the Cloud Function's local system. Failing to do so can eventually result in out-of-memory issues and subsequent cold starts.
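For example, at the end of the download function above (using its fs import and tmpFilePath):

// Remove the temporary file once it's no longer needed
await fs.promises.unlink(tmpFilePath);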
I have a local JSON file which I intend to read/write from a Node.js Electron app. I am not sure, but I believe that instead of using readFile() and writeFile(), I should get a FileHandle to avoid multiple open and close actions.
So I've tried to grab a FileHandle from fs.promises.open(), but the problem seems to be that I am unable to get a FileHandle for an existing file without truncating it down to 0 bytes.
const { resolve } = require('path');
const fsPromises = require('fs').promises;

function init() {
  // Save table name
  this.path = resolve(__dirname, '..', 'data', `test.json`);
  // Create/open the JSON file
  fsPromises
    .open(this.path, 'wx+')
    .then(fileHandle => {
      // We only get a file handle if the file doesn't exist yet,
      // because of the 'wx+' flag
      this.fh = fileHandle;
    })
    .catch(err => {
      if (err.code === 'EEXIST') {
        // File exists
      }
    });
}
Am I doing something wrong? Are there better ways to do it?
Links:
https://nodejs.org/api/fs.html#fs_fspromises_open_path_flags_mode
https://nodejs.org/api/fs.html#fs_file_system_flags
Because JSON is a text format that has to be read or written all at once and can't easily be modified or appended to in place, you're going to have to read the whole file or write the whole file at once.
So, your simplest option is to just use fs.promises.readFile() and fs.promises.writeFile() and let the library open the file, read/write it and close it. Opening and closing a file in a modern OS takes advantage of disk caching, so if you're reopening a file you opened not long ago, it's not going to be a slow operation. Further, since Node.js performs these operations in secondary threads in libuv, they don't block the main thread either, so this is generally not a performance issue for your server.
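For instance, a minimal sketch of that simple approach (the loadJson/saveJson names are just for illustration):

const fsPromises = require('fs').promises;

async function loadJson(path) {
    const text = await fsPromises.readFile(path, 'utf8');
    return JSON.parse(text);
}

async function saveJson(path, obj) {
    await fsPromises.writeFile(path, JSON.stringify(obj, null, 2), 'utf8');
}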
If you really wanted to open the file once and hold it open, you would open it for reading and writing using the r+ flag as in:
const fileHandle = await fsPromises.open(this.path, 'r+');
Reading the whole file is simple, as the fileHandle object has a .readFile() method.
const text = await fileHandle.readFile({ encoding: 'utf8' });
For writing the whole file from an open filehandle, you would have to truncate the file, then write your bytes, then flush the write buffer to ensure the last bit of the data got to the disk and isn't sitting in a buffer.
await fileHandle.truncate(0); // clear previous contents
let {bytesWritten} = await fileHandle.write(mybuffer, 0, someLength, 0); // write new data
assert(bytesWritten === someLength);
await fileHandle.sync(); // flush buffering to disk
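Putting it together, here's a minimal sketch of a wrapper that keeps one handle open for repeated JSON reads and writes (the openJson/readJson/writeJson names are just for illustration):

const fsPromises = require('fs').promises;

async function openJson(path) {
    const fileHandle = await fsPromises.open(path, 'r+');

    return {
        async readJson() {
            const text = await fileHandle.readFile({ encoding: 'utf8' });
            return JSON.parse(text);
        },
        async writeJson(obj) {
            const text = JSON.stringify(obj);
            await fileHandle.truncate(0);            // clear previous contents
            await fileHandle.write(text, 0, 'utf8'); // write from position 0
            await fileHandle.sync();                 // flush buffering to disk
        },
        close: () => fileHandle.close()
    };
}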
Is there any way to get the actual data instead of a download URL from Firebase Storage? In my case, I store a string (some amount of HTML) in the storage and I want to get the actual data when it's needed.
But I can't figure out how to do it. According to the Firebase documentation I can get the downloadable URL, but I can't fetch the actual data.
Here is the function to fetch data from the storage (in my test case I can get the URL properly, but I need the actual data):
// Create a reference to the file we want to download
var starsRef = storageRef.child('images/stars.jpg');

// Get the download URL
starsRef.getDownloadURL().then(function (url) {
  // Insert url into an <img> tag to "download"
});
Thanks
Update 17.02.2020
I solved my problem; my mistake! It's possible to download a file from storage using an AJAX request, which is mentioned in the docs. Here is the simple function I defined, which returns a promise; after it resolves you can get the actual file/data.
async function updateToStorage(pathArray, dataToUpload) {
  let address = pathArray.join("/");
  // Create a storage ref
  let storageRef = firebase.storage().ref(address);
  // Upload the data in string format (a Firebase upload task)
  let uploadSnapshot = await storageRef.putString(dataToUpload);
  let url = await uploadSnapshot.ref.getDownloadURL();
  // Fetch the uploaded content back via its download URL
  const res = await fetch(url);
  const content = await res.text();
  return content;
}
// where avatarData is the string content to upload
const avatar = await updateToStorage(['storage', 'uid', 'avatarUrl'], avatarData);
// avatar will be the actual data after download
The Cloud Storage for Firebase APIs for JavaScript running in web browsers actually don't provide a way to download the raw data of a file. This is different on Android and iOS. Notice that StorageReference doesn't have any direct accessors for data, unlike the Android and iOS equivalents. I don't know why this is. Consider it a feature request that you can file with Firebase support.
You will probably need to set up some sort of API endpoint that your code can call, routed through the web server that serves your web site, or through something else that supports CORS, so that you can make a request from the browser that crosses web domains without security issues.
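For example, here is a hypothetical sketch of such an endpoint as an HTTPS Cloud Function that reads the file server-side with the Admin SDK and returns its contents (the function name, query parameter, and open CORS header are illustrative only):

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.getFileContents = functions.https.onRequest(async (req, res) => {
    res.set('Access-Control-Allow-Origin', '*'); // lock this down for production
    const filePath = req.query.path; // e.g. 'images/stars.jpg'
    // Download the file's bytes server-side, where CORS doesn't apply
    const [data] = await admin.storage().bucket().file(filePath).download();
    res.send(data);
});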
I am using ssh2-sftp-client to get files from a remote server, and I'm running into an issue reading and downloading these files once I get() them.
At first, I was able to use the get() method to download the file when the API was hit; I could also return the whole file contents in a console.log statement. Then it started returning Buffer content, so I updated with this:
npm install ssh2-sftp-client#3.1.0
And now I get a ReadableStream.
function getFile(req, res) {
  sftp.connect(config).then(() => {
    return sftp.get(process.env.SFTP_PATH + '/../...xml', true);
  }).then((stream) => {
    const outFile = fs.createWriteStream('...xml');
    stream.on('data', (c) => {
      console.log(`Received ${c.length} bytes of data.`);
      outFile.write(c);
      res.send('ok');
    });
    stream.on('close', function () {
    });
  }).catch((err) => {
    console.log(err, 'catch error');
  });
};
I have the above code that returns a stream but I'm not sure how to get the file - the write() method doesn't seem to work here.
Any advice or suggestions on how I can use this library to read and download files would be greatly appreciated.
First, don't use version 3.x. That version has been deprecated. The most recent version is v4.1.0 and has had significant cleanup work to fix a number of small bugs.
If all you want to do is download the files, then use the fastGet() method. It takes 2 args, source path and destination path. It is a lot faster than plain get as it does the download in parallel.
If you don't want to do that, then the get() method has a number of options. If you only pass in one arg (source) it will return a buffer. If you pass in 2 args, the second arg must be either a string (path to local file) or a writeable stream. If a writeable stream, the data will be piped into that stream.
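For example, a short sketch of both options (assuming config holds your connection settings, and the remote path is a placeholder):

const Client = require('ssh2-sftp-client');
const sftp = new Client();

async function download() {
    await sftp.connect(config); // config: { host, port, username, password }

    // Fastest option: download straight to a local file, using parallel reads
    await sftp.fastGet('/remote/path/file.xml', './file.xml');

    // Alternatively, with a single argument get() resolves with a Buffer
    const buffer = await sftp.get('/remote/path/file.xml');
    console.log(`Downloaded ${buffer.length} bytes`);

    await sftp.end();
}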
I am using an API for a Twitch.tv streaming bot called DeepBot.
Here is the link to it on github https://github.com/DeepBot-API/client-websocket
My goal is to create a text document listing all the information pulled from the bot using the command api|get_users|. The bot's response is always a JSON object. How can I take the JSON object from the bot and save it as a text file?
Edit: My code
var WebSocket = require('ws');
var ws = new WebSocket('ws://Ip and Port/');

ws.on('open', function () {
  console.log('sending API registration');
  ws.send('api|register|SECRET');
});

ws.on('close', function close() {
  console.log('disconnected');
});

ws.on('message', function (message) {
  console.log('Received: ' + message);
});

ws.on('open', function () {
  ws.send('api|get_users|');
});
Well, that depends on what your setup is. You posted this under JavaScript, so I guess you are either:
using a browser to make the websocket connection, in which case there is no direct way to save a file on the client, but in HTML5 you can store key/value pairs with localStorage;
using Node.js (server-side JavaScript), in which case the code is as below;
some other setup that I can't guess, in which case you might tell us a little more about it.
In a browser with HTML5 capabilities:
// where msg is an object returned from the API
localStorage.setItem('Some key', JSON.stringify(msg));
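To read it back later:

// Retrieve and parse the stored object
var msg = JSON.parse(localStorage.getItem('Some key'));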
In Node.js:
var fs = require("fs"); // fs is built into Node.js; no npm install needed
// where msg is an object returned from the API
fs.writeFile("some-file.json", JSON.stringify(msg), function (err) {
if (err) throw err;
});
Edit: OK, thanks for clearing it up.
I believe Blag's solution is the way to go.
Good luck with your project!
If it's for a client-side JS save:
Create a file in memory for user to download, not through server
and
Convert JS object to JSON string
is what you need. (I haven't tested it, but it'll look like this:)
var j = {"name":"binchen"};
var s = JSON.stringify(j);
window.location = 'data:text/plain;charset=utf-8,'+encodeURIComponent(s);
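Note: modern browsers block top-level navigation to data: URLs, so if the line above doesn't trigger a download, a more reliable variant (a sketch, untested like the original) is a temporary anchor element:

var j = { "name": "binchen" };
var s = JSON.stringify(j);
var a = document.createElement('a');
a.href = 'data:text/plain;charset=utf-8,' + encodeURIComponent(s);
a.download = 'data.json'; // suggested filename
a.click();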