I was trying to write to a file using Deno.writeFile
await Deno.writeFile('./file.txt', 'some content')
But got the following cryptic error:
error: Uncaught TypeError: arr.subarray is not a function
at Object.writeAll ($deno$/buffer.ts:212:35)
at Object.writeFile ($deno$/write_file.ts:70:9)
What's the right way to write files in Deno?
There are multiple ways to write a file in Deno. All of them require the --allow-write flag and will throw if an error occurs, so you should handle errors appropriately.
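For instance, a minimal sketch of catching a permission error, which applies to any of the write APIs below:

try {
  await Deno.writeTextFile('./file.txt', 'some content');
} catch (err) {
  if (err instanceof Deno.errors.PermissionDenied) {
    console.error('Missing write permission: re-run with --allow-write');
  } else {
    throw err; // some other I/O error
  }
}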
Using Deno.writeFile
This API takes a Uint8Array, not a string, which is why you got that error. It also takes an optional WriteFileOptions object.
const res = await fetch('http://example.com/image.png');
const imageBytes = new Uint8Array(await res.arrayBuffer());
await Deno.writeFile('./image.png', imageBytes);
There's also a synchronous API (which blocks the event loop, as synchronous APIs do in Node.js).
Deno.writeFileSync('./image.png', imageBytes);
Writing strings
The easiest way is to use Deno.writeTextFile
await Deno.writeTextFile('./file.txt', 'some content');
You can also use Deno.writeFile with TextEncoder.
const encoder = new TextEncoder(); // to convert a string to Uint8Array
await Deno.writeFile('./file.txt', encoder.encode('some content'));
Streaming
Deno.open returns an FsFile, which exposes a WritableStream via its .writable property, so you can pipe a stream directly into it.
const res = await fetch('https://example.com/csv');
const file = await Deno.open('./some.csv', { create: true, write: true });
await res.body.pipeTo(file.writable);
// pipeTo closes the file once the stream finishes, so no explicit file.close() is needed
If you have a Reader instead of a ReadableStream, you can convert it to a ReadableStream using readableStreamFromReader from std/streams:
import { readableStreamFromReader } from "https://deno.land/std@0.156.0/streams/mod.ts";
// ...
const readable = readableStreamFromReader(someReader);
await readable.pipeTo(file.writable);
Low-level APIs
Using Deno.open and Deno.writeAll (or Deno.writeAllSync)
const file = await Deno.open('./image.png', { write: true, create: true });
/* ... */
await Deno.writeAll(file, imageBytes);
file.close(); // You need to close it!
See OpenOptions for the full list of flags. If you want to append, you would do:
{ append: true }
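For example, appending a line with Deno.writeTextFile, which accepts the same options object:

await Deno.writeTextFile('./log.txt', 'another line\n', { append: true });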
And you can also use even lower-level APIs such as Deno.write or Writer.write.
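As a minimal sketch of the latter: FsFile.write resolves with the number of bytes written, which may be fewer than the buffer's length, so you loop until everything is flushed:

const file = await Deno.open('./file.txt', { write: true, create: true });
const bytes = new TextEncoder().encode('some content');
let nwritten = 0;
while (nwritten < bytes.length) {
  nwritten += await file.write(bytes.subarray(nwritten));
}
file.close();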
You can use ensureDir to safely write files to possibly non-existent directories:
import { ensureDir } from "https://deno.land/std@0.54.0/fs/ensure_dir.ts";

ensureDir("./my/dir")
  .then(() => Deno.writeTextFile("./my/dir/file.txt", "some content"));
The file's containing directory can be derived via dirname:
import { dirname } from "https://deno.land/std@0.54.0/path/mod.ts";
const file = "./my/dir/file.txt";
ensureDir(dirname(file)).then(() => Deno.writeTextFile(file, "some content"));
An alternative is ensureFile, which asserts that the file exists:
import { ensureFile } from "https://deno.land/std/fs/ensure_file.ts";

ensureFile(file).then(/* your file write method */);
This variant is slightly less verbose, at the cost of one extra filesystem operation (creating the file if it doesn't exist).
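For example, combined with the file path from the previous snippet:

await ensureFile(file);
await Deno.writeTextFile(file, "some content");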
Related
I am new to TypeScript and Node.
I have this function
sftp.connect(config) // CONNECT TO SFTP
  .then(() => {
    sftp.list(remoteFilePath) // LIST THE FILES IN THE FILEPATH
      .then((list) => {
        list.forEach((index) => { // FOR EVERY FILE IN THE FOLDER, DOWNLOAD IT
          const fileName = remoteFilePath + index.name;
          console.log(fileName);
          sftp.fastGet(fileName, "/Users/Bob/" + index.name)
            .then((value) => {
              console.log(value);
              sftp.end();
            });
        });
      });
  })
  // .then(() => {
  //   sftp.end();
  // })
  .catch(err => {
    console.error(err.message);
  });
and I'm using the ssh2-sftp-client library. My question: is it possible for this library to get the contents of a file as opposed to downloading it? I plan on making this function into a Lambda function.
At the moment, the variable value contains text telling me that the file has been downloaded to my designated path.
If you want to get the contents of the file, you can read it using the fs module after downloading it:
// using the ES6 module syntax
import { readFileSync } from "fs"
const data = readFileSync("./file.txt")
If you want to get the contents of the file without writing it to disk, you have to pass a different destination. Use the ssh2-sftp-client get method instead of fastGet; it accepts a Stream as the destination. If you use a Stream, you have to pipe it somewhere. Here's an example using process.stdout, which is a writable stream:
// ...
sftp.get(fileName, process.stdout)
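Alternatively, if you omit the destination argument entirely, get should resolve with a Buffer containing the file contents (per the ssh2-sftp-client docs), which is probably the best fit for a Lambda. A sketch, assuming sftp is already connected:

const contents = await sftp.get(fileName); // no destination: resolves with a Buffer
console.log(contents.toString('utf-8'));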
Since zlib has been added to Node.js, I'd like to ask about unzipping .gz files in async/await style, without using streams, one file at a time.
In the code below I am using fs-extra instead of the standard fs, and TypeScript instead of JavaScript, but for the answer it doesn't matter whether it uses JS or TS code.
import fs from 'fs-extra';
import path from "path";
import zlib from 'zlib';
(async () => {
  try {
    // folder which is full of .gz files
    const dir = path.join(__dirname, '..', '..', 'folder');
    const files: string[] = await fs.readdir(dir);
    for (const file of files) {
      // read the files one by one
      const
        file_content = fs.createReadStream(`${dir}/${file}`),
        write_stream = fs.createWriteStream(`${dir}/${file.slice(0, -3)}`),
        unzip = zlib.createGunzip();
      file_content.pipe(unzip).pipe(write_stream);
    }
  } catch (e) {
    console.error(e);
  }
})();
As of now, I have this code based on streams, which works, but in various Stack Overflow answers I haven't found any example with async/await, only this one, and it also uses streams I guess.
So, is it even possible?
// inside an async function
const read_file = await fs.readFile(`${dir}/${file}`)
const unzip = await zlib.unzip(read_file); // note: zlib.unzip is callback-based, so this doesn't work as written
// write output of unzip to file or console
I understand that this task will block the main thread. That's OK for me, since I'm writing a simple day-schedule script.
Seems I have figured it out, but I am still not a hundred percent sure about it. Here is an example of the full IIFE:
(async () => {
  try {
    // folder which is full of .gz files
    const dir = path.join(__dirname, '..', '..', 'folder');
    const files: string[] = await fs.readdir(dir);
    // parallel run
    await Promise.all(files.map(async (file: string) => {
      // make sure that we only handle .gz files
      if (file.match(/gz$/g)) {
        const buffer = await fs.readFile(`${dir}/${file}`);
        // unzipSync is synchronous, so no await is needed here;
        // using .toString() is a must if you want readable data instead of a Buffer
        const data = zlib.unzipSync(buffer, { finishFlush: zlib.constants.Z_SYNC_FLUSH }).toString();
        // from here, you can write data to a new file, or parse it
        const json = JSON.parse(data);
        console.log(json);
      }
    }));
  } catch (e) {
    console.error(e);
  } finally {
    process.exit(0);
  }
})();
If you have many files in one directory, you can use await Promise.all(files.map(file => fn(file))) to run the task in parallel, as above. Also, in my case I needed to parse JSON, so remember the usual caveats of JSON.parse.
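If you ever need a version that doesn't block the event loop, a sketch using util.promisify on the callback-based zlib.gunzip should work too (unzipFile is a hypothetical helper name):

import { promisify } from 'util';
import zlib from 'zlib';
import fs from 'fs-extra';

const gunzip = promisify(zlib.gunzip);

// hypothetical helper: unzip one .gz file asynchronously
async function unzipFile(srcPath: string): Promise<string> {
  const buffer = await fs.readFile(srcPath);
  const data = await gunzip(buffer);
  return data.toString();
}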
I have been trying to upload a file to Firebase Storage using a callable Firebase Cloud Function.
All I am doing is fetching an image from a URL using axios and trying to upload it to Storage.
The problem I am facing is that I don't know how to save the axios response and upload it to Storage.
First, how do I save the received file in the temp directory that os.tmpdir() points to?
Then, how do I upload it to Storage?
Here I am receiving the data as an arraybuffer, then converting it to a Blob and trying to upload it.
Here is my code. I think I am missing a major part.
If there is a better way, please recommend it. I've been looking through a lot of documentation and ended up with no clear solution. Please guide me. Thanks in advance.
const bucket = admin.storage().bucket();
const path = require('path');
const os = require('os');
const fs = require('fs');

module.exports = functions.https.onCall((data, context) => {
  try {
    return new Promise((resolve, reject) => {
      const {
        imageFiles,
        companyPIN,
        projectId
      } = data;
      const filename = imageFiles[0].replace(/^.*[\\\/]/, '');
      const filePath = `ProjectPlans/${companyPIN}/${projectId}/images/${filename}`; // path I am trying to upload to in Firebase Storage
      const tempFilePath = path.join(os.tmpdir(), filename);
      const metadata = {
        contentType: 'application/image'
      };
      axios
        .get(imageFiles[0], { // URL for the image
          responseType: 'arraybuffer',
          headers: {
            accept: 'application/image'
          }
        })
        .then(response => {
          console.log(response);
          const blobObj = new Blob([response.data], {
            type: 'application/image'
          });
          return blobObj;
        })
        .then(async blobObj => {
          return bucket.upload(blobObj, {
            destination: tempFilePath // Here I am wrong: how to set the path of the downloaded blob file?
          });
        }).then(buffer => {
          resolve({ result: 'success' });
        })
        .catch(ex => {
          console.error(ex);
        });
    });
  } catch (error) {
    // unknown: 500 Internal Server Error
    throw new functions.https.HttpsError('unknown', 'Unknown error occurred. Contact the administrator.');
  }
});
I'd take a slightly different approach and avoid using the local filesystem at all, since it's just tmpfs and will cost you memory that your function is already using to hold the buffer/blob. It's simpler to skip it and write directly from that buffer to GCS using the save method on the GCS file object.
Here's an example. I've simplified out a lot of your setup, and I am using an HTTP function instead of a callable one. Likewise, I'm using a public Stack Overflow image rather than your original URLs. In any case, you should be able to use this template and modify it back to what you need (e.g. change the prototype, and remove the HTTP response and replace it with the return value you need):
const functions = require('firebase-functions');
const axios = require('axios');
const admin = require('firebase-admin');
admin.initializeApp();

exports.doIt = functions.https.onRequest((request, response) => {
  const bucket = admin.storage().bucket();
  const IMAGE_URL = 'https://cdn.sstatic.net/Sites/stackoverflow/company/img/logos/so/so-logo.svg';
  const MIME_TYPE = 'image/svg+xml';
  return axios.get(IMAGE_URL, { // URL for the image
    responseType: 'arraybuffer',
    headers: {
      accept: MIME_TYPE
    }
  }).then(response => {
    console.log(response); // only to show we got the data, for debugging
    const destinationFile = bucket.file('my-stackoverflow-logo.svg');
    return destinationFile.save(response.data).then(() => { // note: defaults to resumable upload
      return destinationFile.setMetadata({ contentType: MIME_TYPE });
    });
  }).then(() => { response.send('ok'); })
    .catch((err) => { console.log(err); });
});
As a commenter noted, the axios request in the above example makes an external network request, so you will need to be on the Blaze or Flame plan for that. However, that alone doesn't appear to be your current problem.
Likewise, this also defaults to using a resumable upload, which the documentation does not recommend when you are uploading large numbers of small (<10MB) files, as there is some overhead.
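For small files you can opt out by passing options to save; a sketch, assuming the save options of the Cloud Storage Node.js client:

return destinationFile.save(response.data, { resumable: false }).then(() => {
  return destinationFile.setMetadata({ contentType: MIME_TYPE });
});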
You asked how this might be used to download multiple files. Here is one approach. First, let's assume you have a function that returns a promise that downloads a single file given its URL (I've abridged this from the above, but it's basically identical except that IMAGE_URL becomes the filename parameter). Note that it does not return a final result such as response.send(), and there's an implicit assumption that all the files share the same MIME_TYPE:
function downloadOneFile(filename) {
  const bucket = admin.storage().bucket();
  const MIME_TYPE = 'image/svg+xml';
  return axios.get(filename, ...)
    .then(response => {
      const destinationFile = ...
    });
}
Then, you just need to iteratively build a promise chain from the list of files. Let's say they are in imageUrls. Once built, return the entire chain:
let finalPromise = Promise.resolve();
imageUrls.forEach((item) => { finalPromise = finalPromise.then(() => downloadOneFile(item)); });
// if needed, add a final .then() section for the actual function result
return finalPromise.catch((err) => { console.log(err) });
Note that you could also build an array of the promises and pass them to Promise.all() -- that would likely be faster as you would get some parallelism, but I wouldn't recommend that unless you are very sure all of the data will fit inside the memory of your function at once. Even with this approach, you need to make sure the downloads can all complete within your function's timeout.
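For completeness, a sketch of that parallel variant, with the same memory and timeout caveats:

// parallel variant: all downloads start at once
return Promise.all(imageUrls.map((url) => downloadOneFile(url)))
  .catch((err) => { console.log(err); });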
I'm still trying to grok my way through streams in general. I have been able to stream a large file using multiparty from within form.on('part'). But I need to defer the invocation and resolve the stream before it's read. I have tried PassThrough, through, and through2, but have gotten different results; mainly it hangs, and I can't figure out what to do or how to debug it. I'm open to all alternatives. Thanks for all insights.
import multiparty from 'multiparty'
import { PassThrough } from 'stream'
import through from 'through'
import through2 from 'through2'
import qs from 'qs' // used below in form.on('close')

export function promisedMultiparty(req) {
  return new Promise((resolve, reject) => {
    const form = new multiparty.Form()
    const form_files = []
    let q_str = ''

    form.on('field', (fieldname, value) => {
      if (value) q_str = appendQStr(fieldname, value, q_str)
    })

    form.on('part', async (part) => {
      if (part.filename) {
        const pass1 = new PassThrough() // this hangs at 10%
        const pass2 = through(function write(data) { // this hangs from the beginning
          this.queue(data)
        },
        function end() {
          this.queue(null)
        })
        const pass3 = through2() // this hangs at 10%
        /*
        // This way works for large files, but I want to defer
        // invocation
        const form_data = new FormData()
        form_data.append(savepath, part, {
          filename,
        })
        const r = request.post(url, {
          headers: {
            'transfer-encoding': 'chunked'
          }
        }, responseCallback(resolve))
        r._form = form
        */
        form_files.push({
          part: part.pipe(pass1),
          // part: part.pipe(pass2),
          // part: part.pipe(pass3),
        })
      } else {
        part.resume()
      }
    })

    form.on('close', () => {
      resolve({
        fields: qs.parse(q_str),
        forms: form_files,
      })
    })

    form.parse(req)
  })
}
P.S. I'm sure the title could be better; if someone knows the proper terms, please suggest them. Thanks.
I believe this is because you are not using through2 correctly, i.e. not actually emptying the buffer once it's full (that's why it hangs at 10% on bigger files but works on smaller ones).
I believe an implementation like this should do it:
const pass2 = through2(function (chunk, encoding, next) {
  // do something with the data

  // Use this only if you want to send the data further to another stream reader
  // Note: from your implementation you don't seem to need it
  // this.push(chunk)

  // This is what tells through2 it's ready to empty the
  // buffer and read more data
  next()
})
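Then pipe the part through it, as you already do in the question:

form_files.push({ part: part.pipe(pass2) })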
I am new to Node.js and am trying to set up a server where I get the EXIF information from an image. My images are on S3, so I want to be able to just pass in the S3 URL as a parameter and grab the image from it.
I am using the ExifImage project below to get the EXIF info, and according to their documentation:
"Instead of providing a filename of an image in your filesystem you can also pass a Buffer to ExifImage."
How can I load an image from a URL into a buffer in Node, so I can pass it to the ExifImage function?
ExifImage Project:
https://github.com/gomfunkel/node-exif
Thanks for your help!
Try setting up request like this:
var request = require('request').defaults({ encoding: null });
request.get(s3Url, function (err, res, body) {
  // process exif here
});
Setting encoding to null will cause request to output a buffer instead of a string.
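A sketch of feeding that buffer straight to node-exif (constructor usage taken from the project's README):

var ExifImage = require('exif').ExifImage;
var request = require('request').defaults({ encoding: null });

request.get(s3Url, function (err, res, body) {
  // body is a Buffer because encoding is null
  new ExifImage({ image: body }, function (error, exifData) {
    if (error) console.log('Error: ' + error.message);
    else console.log(exifData);
  });
});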
Use axios:
const response = await axios.get(url, { responseType: 'arraybuffer' })
const buffer = Buffer.from(response.data) // response.data is already binary; no text encoding applies
import fetch from "node-fetch";
let fimg = await fetch(image.src)
let fimgb = Buffer.from(await fimg.arrayBuffer())
I was able to solve this only after reading that encoding: null is required, and providing it as a parameter to request.
This will download the image from the URL and produce a buffer with the image data.
Using the request library -
const request = require('request');
let url = 'http://website.com/image.png';
request({ url, encoding: null }, (err, resp, buffer) => {
  // Use the buffer
  // buffer contains the image data
  // typeof buffer === 'object'
});
Note: omitting encoding: null will result in an unusable string, not a buffer. Buffer.from won't work correctly either.
This was tested with Node 8.
Use the request library.
request('<s3imageurl>', function(err, response, buffer) {
  // Do something
});
Also, node-image-headers might be of interest to you. It sounds like it takes a stream, so it might not even have to download the full image from S3 in order to process the headers.
Updated with correct callback signature.
Here's a solution that uses the native https library.
import { get } from "https";

function urlToBuffer(url: string): Promise<Buffer> {
  return new Promise((resolve, reject) => {
    const data: Uint8Array[] = [];
    get(url, (res) => {
      res
        .on("data", (chunk: Uint8Array) => {
          data.push(chunk);
        })
        .on("end", () => {
          resolve(Buffer.concat(data));
        })
        .on("error", (err) => {
          reject(err);
        });
    });
  });
}

const imageUrl = "https://i.imgur.com/8k7e1Hm.png";
const imageBuffer = await urlToBuffer(imageUrl);
Feel free to delete the types if you're looking for javascript.
I prefer this approach because it doesn't rely on 3rd party libraries or the deprecated request library.
request is deprecated and should be avoided if possible.
Good alternatives include got (Node.js only) and axios (which also supports browsers).
Example of got:
npm install got
Using the async/await syntax:
const got = require('got');
const url = 'https://www.google.com/images/branding/googlelogo/2x/googlelogo_color_272x92dp.png';

(async () => {
  try {
    const response = await got(url, { responseType: 'buffer' });
    const buffer = response.body;
  } catch (error) {
    console.log(error); // for HTTP errors, got exposes the body at error.response.body
  }
})();
You can do it this way:
import axios from "axios";

async function getFileContentById(download_url: string): Promise<Buffer> {
  const response = await axios.get(download_url, {
    responseType: "arraybuffer",
  });
  return Buffer.from(response.data);
}
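Usage would then look something like this (the URL is just a placeholder):

const buffer = await getFileContentById("https://example.com/some-file");
console.log(buffer.byteLength);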