To send a PDF file from a Node.js server to a client I use the following code:
const pdf = printer.createPdfKitDocument(docDefinition);
const chunks = [];
pdf.on("data", (chunk) => {
chunks.push(chunk);
});
pdf.on("end", () => {
const pdfBuffered = `data:application/pdf;base64, ${Buffer.concat(chunks).toString("base64")}`;
res.setHeader("Content-Type", "application/pdf");
res.setHeader("Content-Length", pdfBuffered.length);
res.send(pdfBuffered);
});
pdf.end();
Everything is working correctly; the only issue is that the stream here uses a callback approach rather than async/await.
I've found a possible solution:
const { pipeline } = require("stream/promises");
async function run() {
await pipeline(
fs.createReadStream('archive.tar'),
zlib.createGzip(),
fs.createWriteStream('archive.tar.gz')
);
console.log('Pipeline succeeded.');
}
run().catch(console.error);
But I can't figure out how to adapt the initial code to the one with stream/promises.
You can manually wrap your PDF code in a promise like this and then use it as a function that returns a promise:
function sendPDF(docDefinition) {
return new Promise((resolve, reject) => {
const pdf = printer.createPdfKitDocument(docDefinition);
const chunks = [];
pdf.on("data", (chunk) => {
chunks.push(chunk);
});
pdf.on("end", () => {
const pdfBuffered =
`data:application/pdf;base64, ${Buffer.concat(chunks).toString("base64")}`;
resolve(pdfBuffered);
});
pdf.on("error", reject);
pdf.end();
});
}
sendPDF(docDefinition).then(pdfBuffer => {
res.setHeader("Content-Type", "application/pdf");
res.setHeader("Content-Length", pdfBuffer.length);
res.send(pdfBuffer);
}).catch(err => {
console.log(err);
res.sendStatus(500);
});
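With that helper in place, the same call can also be written with async/await. A minimal sketch, assuming an Express-style async route handler (the app and the /pdf path are placeholders):
// Hypothetical Express route; sendPDF is the promise-returning helper above
app.get("/pdf", async (req, res) => {
  try {
    const pdfBuffer = await sendPDF(docDefinition);
    res.setHeader("Content-Type", "application/pdf");
    res.setHeader("Content-Length", pdfBuffer.length);
    res.send(pdfBuffer);
  } catch (err) {
    console.log(err);
    res.sendStatus(500);
  }
});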
Because there are many data events, you can't promisify just the data portion. You will still have to listen for each data event and collect the data.
You can only convert a callback-API to async/await if the callback is intended to only be executed once.
The one you found online works, because you're just waiting for the whole stream to finish before the callback runs once. What you've got is callbacks that execute multiple times, on every incoming chunk of data.
There are other resources you can look at to make streams nicer to consume, like RxJS, or the upcoming ECMAScript proposal to add observables to the language. Both of these are designed to handle the scenario where a callback can execute multiple times, which is something async/await cannot do.
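As a side note, on recent Node.js versions Readable streams are also async iterable, so the repeated data events can be consumed with a for await...of loop. A minimal sketch, assuming createPdfKitDocument returns a standard Node.js Readable stream (which PDFKit documents are):
async function buildPdfDataUri(docDefinition) {
  const pdf = printer.createPdfKitDocument(docDefinition);
  pdf.end(); // finalize the document so the stream will eventually end
  const chunks = [];
  for await (const chunk of pdf) { // each iteration awaits the next chunk
    chunks.push(chunk);
  }
  return `data:application/pdf;base64, ${Buffer.concat(chunks).toString("base64")}`;
}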
Related
How do I write to a Node PassThrough stream, then later read that data? When I try, the code hangs as though no data is sent. Here's a minimal example (in TypeScript):
const stream = new PassThrough();
stream.write('Test chunk.');
stream.end();
// Later
const chunks: Buffer[] = [];
const output = await new Promise<Buffer>((resolve, reject) => {
stream.on('data', (chunk) => {
chunks.push(Buffer.from(chunk));
});
stream.on('error', (err) => reject(err));
stream.on('end', () => {
resolve(Buffer.concat(chunks));
});
});
Please note that I can't attach the event listeners before writing to the stream: I don't know at the time of writing how I'm going to be reading from it. My understanding of a Transform stream like PassThrough was that it "decoupled" the Readable from the Writable, so that you could access them asynchronously.
Your code works for me; the promise resolves to a buffer containing "Test chunk.".
It will fail, however, if the readable side of the stream has already started emitting data when the stream.on('data', (chunk) => {...}) is executed. I could force such a behavior by enclosing the // Later part of your code in a setTimeout and inserting an additional
stream.on("data", () => {});
before that. This command will cause the stream to start emitting. Could that have happened in your case?
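For illustration, here is a minimal sketch of that forced failure; the early no-op listener is the only addition to your original code:
const { PassThrough } = require('stream');

const stream = new PassThrough();
stream.write('Test chunk.');
stream.end();

stream.on('data', () => {}); // switches the stream into flowing mode right away

setTimeout(async () => {
  const chunks = [];
  const output = await new Promise((resolve, reject) => {
    stream.on('data', (chunk) => chunks.push(Buffer.from(chunk)));
    stream.on('error', reject);
    stream.on('end', () => resolve(Buffer.concat(chunks)));
  });
  console.log(output); // never reached: 'data' and 'end' fired before these listeners existed
}, 0);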
To be on the safe side, end the "early" part of your code with stream.pause() and begin the "later" part with stream.resume(), for example:
const output = await new Promise<Buffer>((resolve, reject) => {
stream.resume();
stream.on('data', (chunk) => chunks.push(Buffer.from(chunk)));
stream.on('error', (err) => reject(err));
stream.on('end', () => resolve(Buffer.concat(chunks)));
});
I use the Fetch API to retrieve my data from the backend as a stream.
I decrypt the data chunk by chunk and then concat the content back together into the original file.
I have found that the stream seems to provide data differently each time, making the chunks different. How can I force the stream to provide the chunks in the original sequence?
fetch(myRequest, myInit).then(response => {
var tmpResult = new Uint8Array();
const reader = response.body.getReader();
return new ReadableStream({
start(controller) {
return pump();
function pump() {
return reader.read().then(({ done, value }) => {
// When no more data needs to be consumed, close the stream
if (value) {
//values here are different in order every time
//making my concatenated values different every time
controller.enqueue(value);
var decrypted = cryptor.decrypt(value);
var arrayResponse = decrypted.toArrayBuffer();
if (arrayResponse) {
tmpResult = arrayBufferConcat(tmpResult, arrayResponse);
}
}
// Enqueue the next data chunk into our target stream
if (done) {
if (counter == length) {
callback(obj);
}
controller.close();
return;
}
return pump();
});
}
}
})
})
The documentation tells us that:
Each chunk is read sequentially and output to the UI, until the stream
has finished being read, at which point we return out of the recursive
function and print the entire stream to another part of the UI.
I made a test program with node, using node-fetch:
import fetch from 'node-fetch';
const testStreamChunkOrder = async () => {
return new Promise(async (resolve) => {
let response = await fetch('https://jsonplaceholder.typicode.com/todos/');
let stream = response.body;
let data = '';
stream.on('readable', () => {
let chunk;
while (null !== (chunk = stream.read())) {
data += chunk;
}
})
stream.on('end', () => {
resolve(JSON.parse(data).splice(0, 5).map((x) => x.title));
})
});
}
(async () => {
let results = await Promise.all(Array.from({ length: 10 }, () => testStreamChunkOrder()))
let joined = results.map((r) => r.join(''));
console.log(`Is every result same: ${joined.every((j) => j.localeCompare(joined[0]) === 0)}`)
})()
This one fetches some random todo-list json and streams it chunk-by-chunk, accumulating the chunks into data. When the stream is done, we parse the full json and take the first 5 elements of the todo-list and keep only the titles, after which we then return the result asynchronously.
This whole process is done 10 times. When we have 10 streamed title-lists, we go through each title-list and join the title names together to form a string. Finally we use .every to see if each of the 10 strings are the same, which means that each json was fetched and streamed in the same order.
So I believe the problem lies somewhere else - the streaming itself is working correctly. While I did use node-fetch instead of the actual Fetch API, I think it is safe to say that the actual Fetch API works as it should.
Also I noticed that you are directly calling response.body.getReader(), but when I looked at the documentation, the body.getReader call is done inside another then statement:
fetch('./tortoise.png')
.then(response => response.body)
.then(body => {
const reader = body.getReader();
This might not matter, but considering everything else in your code, such as the excessive wrapping and returning of functions, I think your problems could go away just by reading a couple of tutorials on streams and cleaning up the code a bit. And if not, you will still be in a better position to figure out whether the problem is in one of the many functions you are unwilling to expose. Asynchronous code is inherently difficult to debug, and a lack of information around such code makes it even harder.
I'm assuming you're using the cipher/decipher family of methods in Node's crypto library. We can simplify this using streams by first piping the readable body into a decipher Transform stream (a stream that is both readable and writable) via Readable#pipe().
const { createDecipheriv } = require('crypto');
const { createWriteStream } = require('fs');
const { pipeline } = require('stream');
// change these to match your encryption scheme and key retrieval
const algo = 'aes-256-cbc';
const key = 'my5up3r53cr3t';
// put your initialization vector you've determined here
// leave null if you are not (or the algo doesn't support) using an iv
const iv = null;
// creates the decipher TransformStream
const decipher = createDecipheriv(algo, key, iv);
// write plaintext file here
const destFile = createWriteStream('/path/to/destination.ext');
fetch(myRequest, myInit)
.then(response => response.body)
.then(body => body.pipe(decipher).pipe(destFile))
.then(stream => stream.on('finish', () => console.log('done writing file')));
You may also pipe this to be read out in a buffer, pipe to the browser, etc, just be sure to match your algorithm, key, and iv wherever you're defining your cipher/decipher functions.
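For example, a hedged sketch of the "read out in a buffer" variant mentioned above, reusing the same assumed algo, key, and iv:
const chunks = [];
fetch(myRequest, myInit)
  .then(response => response.body)
  .then(body => {
    const decrypted = body.pipe(createDecipheriv(algo, key, iv));
    decrypted.on('data', chunk => chunks.push(chunk));
    decrypted.on('end', () => {
      const plaintext = Buffer.concat(chunks);
      // use the decrypted buffer here
    });
  });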
If we take the pattern in that MDN example seriously, we should use the controller to enqueue the decrypted data (not the still encrypted value), and aggregate the results with the stream returned by the first promise. In other words...
return fetch(myRequest, myInit)
// Retrieve its body as ReadableStream
.then(response => {
const reader = response.body.getReader();
return new ReadableStream({
start(controller) {
return pump();
function pump() {
return reader.read().then(({ done, value }) => {
// When no more data needs to be consumed, close the stream
if (done) {
controller.close();
return;
}
// do the computational work on each chunk here and enqueue
// *the result of that work* on the controller stream...
const decrypted = cryptor.decrypt(value);
controller.enqueue(decrypted);
return pump();
});
}
}
})
})
// Create a new response out of the stream
.then(stream => new Response(stream))
// Create an object URL for the response
.then(response => response.blob())
.then(blob => blob.arrayBuffer())
.then(arrayResponse => {
// arrayResponse is the properly sequenced result
// if the caller wants a promise to resolve to this, just return it
return arrayResponse;
// OR... the OP code makes reference to a callback. if that's real,
// call the callback with this result
// callback(arrayResponse);
})
.catch(err => console.error(err));
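If the chain above is wrapped in a function that returns the promise (say fetchAndDecrypt, a hypothetical name), the caller can consume the properly sequenced result directly:
// fetchAndDecrypt is a hypothetical wrapper around the fetch chain above
fetchAndDecrypt(myRequest, myInit).then(arrayResponse => {
  // arrayResponse holds the decrypted bytes in their original order
});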
I'm trying to get things working correctly in a Blazor Server Side App. I have an Uploader component, but it doesn't InvokeAsync after each promise is resolved on the client side. It waits for all images to load and then invokes the C# method. How would I get it to invoke the C# method after each image is loaded?
I know JavaScript is single-threaded, but I also tried with web workers and it still does the same thing.
Sample repo can be found here
https://dev.azure.com/twinnaz/BlazorUploader
Gif of what's happening.
https://imgur.com/a/aF4AQUf
It should be able to invoke the C# method asynchronously in parallel from the JavaScript file, if my thinking is correct.
This issue is related to both Blazor and JS. On the JS side, you are not awaiting GenerateImageData.
You should use a modern for…of loop instead, in which await will work as expected:
GetFileInputFiles = async (instance, fileInput) => {
var files = Array.from(fileInput.files);
for (const image of files) {
var imagedata = await readUploadedFileAsText(image);
console.log("sending");
_ = await instance.invokeMethodAsync('GenerateImageData', imagedata);
console.log("sent");
};
};
On the Blazor side, I suggest rewriting GenerateImageData as:
[JSInvokable]
public async Task GenerateImageData(string data)
{
System.Console.WriteLine( "Receiving" );
ImageBase64.Add(data);
await Task.Delay(1);
StateHasChanged();
System.Console.WriteLine( "Received" );
}
Result:
More detailed info about JS issue: Using async/await with a forEach loop
More detailed info about Blazor issue: Blazor - Display wait or spinner on API call
GeneratePreviewImages = async (dotNet, fileInput) => {
const files = Array.from(fileInput.files);
const imageSrcs = files.map(file => getPreviewImageSrc(file));
loop(imageSrcs, dotNet);
};
const loop = async (arr, dotNet) => {
for await (const src of arr) {
console.log(src);
dotNet.invokeMethodAsync('GenerateImageData', src);
}
};
const getPreviewImageSrc = async (file) => {
return URL.createObjectURL(file);
};
I'm trying to get my head around promises. I think I can see how they work in the sense that you can say do Step 1, Step 2, and then Step 3, for example.
I have created this download function using node-fetch (which uses native Promises)
## FileDownload.js
const fetch = require('node-fetch');
const fs = require('fs');
module.exports = function(url, target) {
fetch(url)
.then(function(res) {
var dest = fs.createWriteStream(target);
res.body.pipe(dest);
}).then(function(){
console.log(`File saved at ${target}`)
}).catch(function(err){
console.log(err)
});
}
So this all executes in order and I can see how that works.
I have another method that then converts a CSV file to JSON (again using a promise)
## CSVToJson.js
const csvjson = require('csvjson');
const fs = require('fs');
const write_file = require('../helpers/WriteToFile');
function csvToJson(csv_file, json_path) {
return new Promise(function(resolve, reject) {
fs.readFile(csv_file, function(err, data){
if (err)
reject(err);
else
var data = data.toString();
var options = {
delimiter : ',',
quote : '"'
};
const json_data = csvjson.toObject(data, options);
write_file(json_path, json_data)
resolve(data);
});
});
}
module.exports = {
csvToJson: csvToJson
}
When I call these functions one after another the second function fails as the first has not completed.
Do I need to wrap these two function calls inside another promise, even though on their own they each have promises implemented?
Please advise if I am totally misunderstanding this.
When I call these functions one after another the second function fails as the first has not completed.
There are two issues with the first:
It doesn't wait for the file to be written; all it does is set up the pipe, without waiting for the process to complete
It doesn't provide any way for the caller to know when the process is complete
To deal with the first issue, you have to wait for the finish event on the destination stream (which pipe returns). To deal with the second, you need to return a promise that won't be fulfilled until that happens. Something along these lines (see ** comments):
module.exports = function(url, target) {
// ** Return the end of the chain
return fetch(url)
.then(function(res) {
// ** Unfortunately, `pipe` is not Promise-enabled, so we have to resort
// to creating a promise here
return new Promise((resolve, reject) => {
var dest = fs.createWriteStream(target);
res.body.pipe(dest)
.on('finish', () => resolve()) // ** Resolve on success
.on('error', reject); // ** Reject on error
});
}).then(result => {
console.log(`File saved at ${target}`);
return result;
});
// ** Don't `catch` here, let the caller handle it
}
Then you can use then and catch on the result to proceed to the next step:
theFunctionAbove("/some/url", "some-target")
.then(() => {
// It worked, do the next thing
})
.catch(err => {
// It failed
});
(Or async/await.)
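For example, a minimal async/await sketch of the same call, assuming it runs inside an async function:
try {
  await theFunctionAbove("/some/url", "some-target");
  // It worked, do the next thing
} catch (err) {
  // It failed
}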
Side note: I haven't code-reviewed it, but a serious issue in csvToJson jumped out, along with a minor one, and @Bergi has highlighted a second serious issue:
It's missing { and } around the else logic
The minor issue is that you have var data = data.toString(); but data was a parameter of that function, so the var is misleading (but harmless)
It doesn't properly handle errors in the part of the code in the else part of the readFile callback
We can fix both by doing a resolve in the else and performing the rest of the logic in a then handler:
function csvToJson(csv_file, json_path) {
return new Promise(function(resolve, reject) {
fs.readFile(csv_file, function(err, data){
if (err)
reject(err);
else
resolve(data);
});
})
.then(data => {
data = data.toString();
var options = {
delimiter : ',',
quote : '"'
};
const json_data = csvjson.toObject(data, options);
write_file(json_path, json_data);
return data;
});
}
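With both functions returning promises, a hedged sketch of calling them one after another (the module paths and URL are assumptions based on the file names above):
// Assumed module paths; adjust to your project layout
const download = require('./FileDownload');
const { csvToJson } = require('./CSVToJson');

download('https://example.com/data.csv', './data.csv')
  .then(() => csvToJson('./data.csv', './data.json'))
  .then(() => console.log('CSV converted and saved as JSON'))
  .catch(err => console.log(err));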
What I want to do is upload files to the server, then get the URL of each uploaded file and preview it. There can be more than one file. For that purpose I have written the following code:
let filesURL=[];
let promises=[];
if(this.state.files_to_upload.length>0) {
for(let i=0; i<this.state.files_to_upload.length; i++) {
promises.push(this.uploadFilesOnServer(this.state.files_to_upload[i]))
}
Promise.all(promises).then(function(result){
console.log(result);
result.map((file)=>{
filesURL.push(file);
});
});
console.log(filesURL);
}
const uploadedFilesURL=filesURL;
console.log(uploadedFilesURL);
console.log(filesURL); gives me the values returned by Promise.all.
And I want to use these values only when Promise.all completes properly. But I am facing the problem that the line console.log(uploadedFilesURL); executes first, irrespective of Promise.all, and gives me undefined values. I think I am not using promises correctly, can anyone please help me?
The uploadFilesOnServer code is:
uploadFilesOnServer(file)
{
let files=[];
let file_id='';
const image=file;
getImageUrl().then((response) => {
const data = new FormData();
data.append('file-0', image);
const {upload_url} = JSON.parse(response);
console.log(upload_url);
updateProfileImage(upload_url, data).then ((response2) => {
const data2 = JSON.parse(response2);
file_id=data2;
console.log(file_id);
files.push(file_id);
console.log(files);
});
});
return files;
}
No, promises are asynchronous and as such don't work the way you think. If you want to execute something after a promise has completed, you must put it into the promise's then callback. Here is an example based on your code:
uploadFilesOnServer(file) {
let files=[];
let file_id='';
const promise = getImageUrl()
.then((imageUrlResponse) => {
const data = new FormData();
data.append('file-0', file);
const { upload_url } = JSON.parse(imageUrlResponse);
console.log(upload_url);
return updateProfileImage(upload_url, data);
})
.then ((updateImageResponse) => {
file_id= JSON.parse(updateImageResponse);
console.log(file_id);
files.push(file_id);
console.log(files);
return files;
});
return promise;
}
let filesPromise = Promise.resolve([]);
if(this.state.files_to_upload.length > 0) {
const promises = this.state.files_to_upload.map((file) => {
return this.uploadFilesOnServer(file);
});
filesPromise = Promise.all(promises).then((results) => {
console.log(results);
return [].concat(...results);
});
}
// This is the final console.log of you (console.log(uploadedFilesURL);)
filesPromise.then((filesUrl) => console.log(filesUrl));
A good book to read about ES6 in general and promises in particular is Understanding ECMAScript 6 by Nicholas C. Zakas.
Edit:
Here is a simple explanation of the example code:
uploadFilesOnServer is a function that takes a file, uploads it, and returns the file URL, in the form of a promise, when the upload completes in the future. The promise will call its then callback when it gets the URL.
By using the map function, we create a list of URL promises, the results of executing uploadFilesOnServer on each file in the list.
The Promise.all method waits for all the promises in the list to complete, joins the lists of URL results, and creates a promise whose result is the combined list of URLs. We need this because there is no guarantee that all of the promises will complete at once, and we need to gather all the results in one callback for convenience.
We get the URLs from the then callback.
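A tiny standalone illustration of that aggregation step:
// Each promise resolves to an array of URLs, like uploadFilesOnServer above
const p1 = Promise.resolve(['url-a']);
const p2 = Promise.resolve(['url-b']);

Promise.all([p1, p2]).then(results => {
  // results is [['url-a'], ['url-b']]; flatten it into a single list
  console.log([].concat(...results)); // ['url-a', 'url-b']
});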
You have to do this on the .then part of your Promise.all()
Promise.all(promises)
.then(function(result){
console.log(result);
result.map((file)=>{
filesURL.push(file);
});
return true; // return from here to go to the next promise down
})
.then(() => {
console.log(filesURL);
const uploadedFilesURL=filesURL;
console.log(uploadedFilesURL);
})
This is the way async code works. You cannot expect your console.log(filesURL); to work correctly if it is called synchronously after the async call that sends the files to the server.
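A tiny illustration of that ordering:
const uploadPromise = Promise.resolve(['someURL']);
uploadPromise.then(urls => console.log('inside then:', urls)); // logs second
console.log('right after the call');                           // logs first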
Regarding your code, there are several problems:
1. uploadFilesOnServer must return a Promise, as it is async. Therefore:
uploadFilesOnServer(file)
{
let files=[];
let file_id='';
const image=file;
return getImageUrl().then((response) => {
const data = new FormData();
data.append('file-0', image);
const {upload_url} = JSON.parse(response);
console.log(upload_url);
return updateProfileImage(upload_url, data).then((response2) => {
const data2 = JSON.parse(response2);
file_id=data2;
console.log(file_id);
files.push(file_id);
console.log(files);
return files;
});
});
}
2. Inside your main function body you can access the results of the Promise.all execution only in its respective then handler.
As a side note, I would recommend using ES2017 async/await features with a transpiler like Babel or TypeScript. This will greatly reduce the nesting and complications of writing such async code.
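For example, a hedged async/await sketch of the calling code, assuming uploadFilesOnServer returns a promise as in point 1:
async function uploadAll(filesToUpload) {
  // wait for every upload to finish, then flatten the per-file results
  const results = await Promise.all(filesToUpload.map(file => uploadFilesOnServer(file)));
  const filesURL = [].concat(...results);
  console.log(filesURL); // defined only once every upload has resolved
  return filesURL;
}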