How to synchronously send mails through nodemailer?

I'm creating an app using nodejs with the nodemailer module to send mails.
My app reads a list of names and emails, creates a PNG diploma file with jimp for each entry (based on the name and email), stores it, and sends it through nodemailer to the corresponding address; after all of this is done I want to delete each file. All of this has to happen sequentially, because generating the PNG diploma and sending the email both take some time.
The syntax of my list is:
const list = [
  [name1, email1#mail.com],
  [name2, email2#mail.com],
  [ ... ],
  [namex, emailx#mail.com]
]
I want to wait for each mail to be sent, because Gmail seems to have trouble handling many mails sent at the same time; after sending 13 or 15 mails it throws the following error:
error: { Error: Data command failed: 421 4.7.0 Temporary System Problem. Try again later (10). x12sm4645987otk.1 - gsmtp
So, in order to achieve this, I iterate over the list with a classic for loop (a forEach loop runs the callbacks asynchronously and doesn't let me keep control over the diploma generation) and process each position of the list:
// Iterating over the mails array
for (let i = 0; i < list.length; i++) {
  // Little msg to know what is going on
  console.log(`Processing address ${i} out of ${list.length}`)
  const element = list[i]
  // diplomaData is an object which contains the data needed (such as name, course, etc.) to create the diploma
  diplomaData.name = element[0];
  // diplomaDir is the path where each diploma gets stored; it is returned by the generateDiploma function
  diplomaDir = await generator.generateDiploma(diplomaData)
  // Once the diploma is generated, I pass its path to the generateMailContent function;
  // it returns an object which contains text like a greeting, congratulations and, of course, the diploma path
  mailContent = await mailer.generateMailContent(element, diplomaDir)
  // So the only thing pending is to send the email with the corresponding content
  await mailer.sendMail(mailContent)
  // I've commented this call out because, even though it is declared as async, it runs before the mail has actually been sent
  // await utilities.remove(diplomaDir)
}
This is my sendMail function:
exports.sendMail = async (mailOptions) => {
  transporter.sendMail(mailOptions, (err, info) => {
    if (err) {
      console.log("error: ", err);
    } else {
      console.log(`Mail sent successfully!`);
    }
  });
}
So, in a few words, my problem is that nodemailer seems to launch all the mails at the same time, after the loop has finished (I can confirm this because the "Processing address ..." logs appear in my console before the ones from nodemailer). I just want to make this process absolutely synchronous. Could anybody help me, please? :(

Your sendMail function is not asynchronous in nature. It kicks off an asynchronous operation (i.e. transporter.sendMail) and then immediately returns undefined (as there is no return statement).
exports.sendMail = function (mailOptions) {
  return new Promise(function (resolve, reject) {
    transporter.sendMail(mailOptions, (err, info) => {
      if (err) {
        console.log("error: ", err);
        reject(err);
      } else {
        console.log(`Mail sent successfully!`);
        resolve(info);
      }
    });
  });
}
Now when you await mailer.sendMail(mailContent), a promise is returned and there is actually something to await: the resolution or rejection of that promise.
Be sure to have a try/catch block enclosing any await operators.
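To tie it together, here is a minimal sketch of the calling loop from the question with each awaited call wrapped in try/catch (the variable and helper names are the ones from the question; the per-iteration error handling is my own addition):

for (let i = 0; i < list.length; i++) {
  console.log(`Processing address ${i} out of ${list.length}`)
  const element = list[i]
  diplomaData.name = element[0];
  try {
    const diplomaDir = await generator.generateDiploma(diplomaData)
    const mailContent = await mailer.generateMailContent(element, diplomaDir)
    // resolves (or rejects) only after transporter.sendMail has called back
    await mailer.sendMail(mailContent)
    // safe to clean up now; the mail for this recipient has been handed off
    await utilities.remove(diplomaDir)
  } catch (err) {
    // one failed diploma or send no longer aborts the whole run
    console.log(`Failed to process address ${i}: `, err)
  }
}

As a side note, recent versions of nodemailer also return a promise from transporter.sendMail when no callback is passed, so simply returning transporter.sendMail(mailOptions) from sendMail may work as well; worth checking against the nodemailer version in use.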


Terminate asynchronous firebase-function properly [duplicate]

This question already has an answer here:
Why is my PDF not saving intermittently in my Node function?
(1 answer)
Closed last year.
As described in the Firebase docs, it is required to "resolve functions that perform asynchronous processing (also known as "background functions") by returning a JavaScript promise" (https://firebase.google.com/docs/functions/terminate-functions?hl=en). Otherwise it might happen that "the Cloud Functions instance running your function does not shut down before your function successfully reaches its terminating condition or state" (same page).
In this case I am trying to adapt demo code for PDF generation written by Volodymyr Golosay on https://medium.com/firebase-developers/how-to-generate-and-store-a-pdf-with-firebase-7faebb74ccbf.
The demo uses 'https.onRequest' as its trigger and fulfills the termination requirement with 'response.send(result)'. In my adaptation I need to use a 'document.onCreate' trigger and therefore need a different way to terminate.
In other functions I can fulfill this requirement by using async/await, but here I am struggling to get a stable function with good performance. The function shown below logs "finished with status: 'ok'" after 675 ms, but around 2 minutes later it logs again that the PDF file has been saved (see the screenshot of the logger).
What should I do to terminate the function properly?
// adapting the demo code by Volodymyr Golosay published on https://medium.com/firebase-developers/how-to-generate-and-store-a-pdf-with-firebase-7faebb74ccbf
// library installed -> npm install pdfmake
const functions = require("firebase-functions");
const admin = require("firebase-admin");
admin.initializeApp();
const db = admin.firestore();
const Printer = require('pdfmake');
const fonts = require('pdfmake/build/vfs_fonts.js');

const fontDescriptors = {
  Roboto: {
    normal: Buffer.from(fonts.pdfMake.vfs['Roboto-Regular.ttf'], 'base64'),
    bold: Buffer.from(fonts.pdfMake.vfs['Roboto-Medium.ttf'], 'base64'),
    italics: Buffer.from(fonts.pdfMake.vfs['Roboto-Italic.ttf'], 'base64'),
    bolditalics: Buffer.from(fonts.pdfMake.vfs['Roboto-Italic.ttf'], 'base64'),
  }
};

exports.generateDemoPdf = functions
  // triggered by 'document.onCreate', while the demo uses 'https.onRequest'
  .firestore
  .document('collection/{docId}')
  .onCreate(async (snap, context) => {
    const printer = new Printer(fontDescriptors);
    const chunks = [];
    // define the content of the pdf file
    const docDefinition = {
      content: [{
        text: 'PDF text is here.',
        fontSize: 19
      }]
    };

    const pdfDoc = printer.createPdfKitDocument(docDefinition);
    pdfDoc.on('data', (chunk) => {
      chunks.push(chunk);
    });
    pdfDoc.on('end', async () => {
      const result = Buffer.concat(chunks);
      // Upload generated file to Cloud Storage
      const docId = "123456789"
      const bucket = admin.storage().bucket();
      const fileRef = bucket.file(`${docId}.pdf`, {
        metadata: {
          contentType: 'application/pdf'
        }
      });
      await fileRef.save(result);
      console.log('result is saved');
      // NEEDS PROPER TERMINATION HERE?? NEEDS TO RETURN A PROMISE?? FIREBASE DOCS: https://firebase.google.com/docs/functions/terminate-functions?hl=en
      // the demo with 'https.onRequest' uses the following line to terminate the function properly:
      // response.send(result);
    });
    pdfDoc.on('error', (err) => {
      return functions.logger.log('An error occurred!');
    });
    pdfDoc.end();
  });
I think everything is fine in your code. It seems it simply takes 1m 34s to render the file and save it to storage.
The Cloud Function will be terminated automatically when all micro and macro tasks are done, right after your last await.
To check how long it takes and whether it terminates right after saving, you can run the Firebase emulator on your local machine.
You will see the logs in the terminal and can simultaneously watch the storage bucket.
I suspect you did terminate properly - that's the nature of promises. Your function "terminated" with a 200 status, returning a PROMISE for the results of the PDF save. When the PDF save actually terminates later, the result is logged and the promise resolved. This behavior is WHY you return the promise.
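If the goal is for the function itself to wait for the upload before terminating, one common pattern is to wrap the stream events in a promise and return that promise from the handler. This is only a sketch built from the question's code, not a drop-in replacement:

exports.generateDemoPdf = functions.firestore
  .document('collection/{docId}')
  .onCreate((snap, context) => {
    const printer = new Printer(fontDescriptors);
    const pdfDoc = printer.createPdfKitDocument({ content: [{ text: 'PDF text is here.', fontSize: 19 }] });
    const chunks = [];

    // Return a promise that settles only after the 'end' handler has uploaded the file,
    // so Cloud Functions keeps the instance alive until the save has finished.
    return new Promise((resolve, reject) => {
      pdfDoc.on('data', (chunk) => chunks.push(chunk));
      pdfDoc.on('error', reject);
      pdfDoc.on('end', () => {
        const bucket = admin.storage().bucket();
        const fileRef = bucket.file('123456789.pdf');
        fileRef.save(Buffer.concat(chunks), { metadata: { contentType: 'application/pdf' } })
          .then(resolve)   // resolve only after the upload promise settles
          .catch(reject);
      });
      pdfDoc.end();
    });
  });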

Node.js: how to return an array from within an asynchronous callback for use in another file

File called testing.js
I can do whatever I like with the data inside saveWeatherData, but I cannot call this function and return the data without getting 'undefined'.
For example, if I try the code below inside saveWeatherData it prints out the summary as expected:
console.log(`The summary of the weather today is: ${dataArray[0]}`);
However, I want to use these values in another file, such as a server file that, when connected to, will display the weather summary, temperature, etc.
So I need to return an array with these values in it, so that I can call this function and get my data stored in an array for further use.
I know that the reason dataArray comes back undefined is the asynchronous code:
the array is returned before we have gotten the data via the callback.
My question: is there any way to do what I am trying to do?
I tried my best to explain the problem and what I want to do; hopefully it's understandable.
Would I have to use a callback inside of a callback, to call back here and return the data once it has been fetched?
I just can't get my head around it and have tried multiple things to get the result I am looking for.
My last idea, and something I would prefer not to do, is to use the 'fs' module to save the data to a text or JSON file and read it back in my other files.
I feel I'm close but can't get over the last hurdle, so I've decided to ask for a little help; even just point me in the right direction and I'll continue trying to figure it out.
Phew...
Thank you for your time!
const request = require("request");

let dataArray = [];

let saveWeatherData = function (weatherData) {
  dataArray = weatherData;
  return dataArray;
};

let getWeatherData = function (callback) {
  request({
    url: `https://api.forecast.io/forecast/someexamplekey/1,-1`,
    json: true
  }, (error, response, body) => {
    // Creating array to hold weather data until we can save it using callback...
    let array = [];
    if (error) {
      console.log("Unable to connect with Dark Sky API servers.")
    } else {
      console.log(`Successfully connected to Dark Sky API servers!\n`);
      array.push(body.currently.summary, body.currently.temperature, body.currently.apparentTemperature, body.currently.windSpeed, body.currently.windBearing);
      callback(array);
    }
  });
};

getWeatherData(saveWeatherData);

module.exports = {
  saveWeatherData
};
My Other File...
File called server.js
const http = require("http");
const testing = require("./testing");

function onRequest(request, response) {
  let data = testing.saveWeatherData();
  console.log(`A user made a request: ${request.url}`);
  response.writeHead(200, {"context-type": "text/plain"});
  response.write("<!DOCTYPE html>");
  response.write("<html>");
  response.write("<head>");
  response.write("<title>Weather</title>");
  response.write("</head>");
  response.write("<body>");
  response.write("Weather summary for today: " + data[0]);
  response.write("</body>");
  response.write("</html>");
  response.end();
}
http.createServer(onRequest).listen(8888);
console.log("Server is now running on port 8888...");
I'm still not sure what you are trying to do. However, I think you're not exporting what you're supposed to be exporting. To avoid the use of so many callbacks you may use async/await.
Change this part of your server.js:
async function onRequest(request, response) {
  let data = await testing.getWeatherData();
  console.log(`A user made a request: ${request.url}`);
  response.writeHead(200, { 'context-type': 'text/plain' });
  response.write('<!DOCTYPE html>');
  response.write('<html>');
  response.write('<head>');
  response.write('<title>Weather</title>');
  response.write('</head>');
  response.write('<body>');
  response.write('Weather summary for today: ' + data[0]);
  response.write('</body>');
  response.write('</html>');
  response.end();
}
And this part of your testing.js:
let getWeatherData = function () {
  return new Promise(resolve =>
    request(
      {
        url: `https://api.darksky.net/forecast/someexamplekey/1,-1`,
        json: true
      },
      (error, response, body) => {
        // Creating array to hold weather data until we can save it using callback...
        let array = [];
        if (error) {
          console.log('Unable to connect with Dark Sky API servers.');
        } else {
          console.log(`Successfully connected to Dark Sky API servers!\n`);
          array.push(
            body.currently.summary,
            body.currently.temperature,
            body.currently.apparentTemperature,
            body.currently.windSpeed,
            body.currently.windBearing
          );
          resolve(array);
        }
      }
    )
  );
};

module.exports = {
  getWeatherData
};
It will check for new weather data on each request. If you want to save the result to avoid checking every single time, you will need to do something else, but I think for a weather app the important thing is to keep it up to date.
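If you did want to avoid hitting the API on every single request, one option is a small in-memory cache with a timestamp; the sketch below is only an illustration (the 10-minute TTL and the cachedData/cachedAt names are assumptions, not part of the code above):

const CACHE_TTL_MS = 10 * 60 * 1000; // assumed: refresh at most every 10 minutes
let cachedData = null;
let cachedAt = 0;

let getWeatherDataCached = async function () {
  if (cachedData && Date.now() - cachedAt < CACHE_TTL_MS) {
    return cachedData; // serve the previous result while it is still fresh
  }
  cachedData = await getWeatherData(); // fall back to the real API call from above
  cachedAt = Date.now();
  return cachedData;
};

module.exports = {
  getWeatherData,
  getWeatherDataCached
};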

Chaining routes in NodeJs with values after sending API response

I want to chain routes in NodeJS with values after sending the API response to the end-user.
WHY: The uploaded files would be somewhat large (5-50 MB each) and require some processing, and I cannot make my API user wait (or time out) while my NodeJS code is working. So I need to: 1) upload the files and immediately send success to the user, and 2) process the files (a few promises) and return/log success/failure for a notification system.
My individual code blocks are done and working fine (i.e. the upload service and the file-processing service both pass their tests and work nicely when tested individually).
Now, with the upload API in place, I've added the following code:
router.post('/upload', upload.array('upload_data', multerMaxFiles), (req, res, next) => {
  //// some uploading and processing stuff - works nicely
  res.json({ 'message': 'File uploaded successfully.' }); // shown to API client nicely
  console.log("what next? " + utilz.inspect(uploaded_file_paths)) // prints file names on console
  next();
});
PROBLEM:
app.use('/api', uploadRoute); // The above route
// want some processing to be done
app.use(function (req, res, next) {
  // **want those uploaded file names here**
  // tried a few response-object options but it throws an error
});
OR
use something like ....
app.use(someFunction(uploaded_file_names)); // **want those uploaded file names as params**
PS:
Any promise placed after the file-upload success results in 'Error: Can't set headers after they are sent.', so writing anything there doesn't help.
Any suggestions, folks?
--
N Baua
Once you've sent a response back to the browser (to keep it from timing out during your long processing time), that http request is done. You cannot send any more data on it, and trying to do so will trigger a server-side error. You cannot "chain routes" the way you're asking, because you simply can't send more data over that http request once you've sent the first response.
There are two common ways to deal with this issue.
1. As part of your initial response, send back a transaction ID and then have the client poll back every few seconds with an Ajax call asking what the final status of that transaction is. The server returns "in progress" until the work is finally done, and then it returns the final status.
2. Connect a webSocket or socket.io connection from client to server. As part of your initial response to the upload, send back a transaction ID. Then, when the transaction is done server-side, send a notification on the webSocket or socket.io connection for that particular client with the transaction ID and the final status. The client can then respond accordingly to that final status. You can either keep the webSocket/socket.io connection open for other requests or close it afterwards.
Using either technique, you could also return/send a progress value (like percent complete) that the client could use to display completion progress. This is generally very helpful on the client-side to keep an impatient user from giving up or refreshing the page. If they can see that the processing is proceeding, they won't give up thinking that maybe it stopped working.
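As a rough illustration of the first option (polling), here is a sketch with an in-memory job store; the /status route, the processFiles helper and crypto.randomUUID (Node 14.17+) are assumptions made for the example, not part of the original code:

const express = require('express');
const crypto = require('crypto');
const app = express();

const jobs = {}; // transactionId -> 'in progress' | 'done' | 'failed'

app.post('/upload', (req, res) => {
  const transactionId = crypto.randomUUID();
  jobs[transactionId] = 'in progress';

  // respond immediately so the client is not kept waiting
  res.json({ message: 'File uploaded successfully.', transactionId });

  // kick off the long-running work after the response has been sent
  processFiles(req)                       // hypothetical processing function
    .then(() => { jobs[transactionId] = 'done'; })
    .catch(() => { jobs[transactionId] = 'failed'; });
});

// the client polls this every few seconds with the transactionId it received
app.get('/status/:transactionId', (req, res) => {
  res.json({ status: jobs[req.params.transactionId] || 'unknown' });
});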
This should work with res.write(), but I think it depends on the client's caching.
I tried this and at first it did not work in my Firefox.
app.get('/test', function (req, res) {
  var count = 0;
  var interval = setInterval(function () {
    if (count++ === 100) {
      clearInterval(interval);
      res.end();
    }
    res.write('This is line #' + count + '\n');
  }, 100);
});
After I increased the frequency and number of writes it seems to work, so try:
router.post('/upload', upload.array('upload_data', multerMaxFiles), (req, res, next) => {
  //// some uploading and processing stuff - works nicely
  res.write(JSON.stringify({ 'message': 'File uploaded successfully.' })); // shown to API client nicely
  console.log("what next? " + utilz.inspect(uploaded_file_paths)) // prints file names on console
  next();
});
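One thing to keep in mind with the res.write() approach: the response is still open after next(), so something further down the chain still has to call res.end() once the processing finishes, otherwise the client will sit there waiting for the response to complete.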
// STEP (1)
// Example: simulate uploading multiple files with chained api calls. In this example the parameters "idFile" and "arrayidFileExample" are what matter.
// You should create and send this data from the view.
{
  "idFile": "04fe640f6e4w", // id of the first file
  "arrayidFileExample": ["04fe640f6e4w","03g5er4g65erg","g0er1g654er65g4er","0g4er4g654reg654re"] // simulated idFiles array from the view
}

// STEP (2)
// your upload files api
app.post('/upload', function (req, res) {
  // Express headers, not important in this code
  res.header("Access-Control-Allow-Origin", "*");
  res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");

  let arrayIdFiles = req.body.arrayidFileExample; // create arrayIdFiles
  let isBlock = false; // flag to block the api call loop
  let countIndex = 0;
  let currentIndex = 0;

  // STEP (3)
  // If the next "arrayIdFiles" index does not exist, set isBlock to true. Remember: this flag blocks the api call loop.
  for (let count in arrayIdFiles) {
    if (req.body.idFile == arrayIdFiles[count]) {
      console.log("current index --> ", countIndex)
      countIndex++;
      console.log("next index --> ", countIndex)
      if (arrayIdFiles[countIndex] == undefined) {
        isBlock = true;
      }
      break;
    }
    countIndex++;
    currentIndex++;
  }

  // STEP (4)
  // If isBlock is false, call the upload api with the next idFile. This simulates a "recursive api loop".
  if (isBlock == false) {
    postUploadFile(
      'http://localhost:3500/upload',
      {
        "idFile": arrayIdFiles[currentIndex + 1], // send the next idFile from arrayIdFiles
        "arrayidFileExample": arrayIdFiles // initial arrayIdFiles
      });
  }

  // STEP (6)
  // response json example
  const json = JSON.stringify({
    error: false,
    statusCode: 200,
    body: {
      message: 'current id file ' + req.body.idFile,
    }
  });
  res.write(json);
  return res.end();
});

// STEP (5)
// call the "/upload" api post
const postUploadFile = (url = '', body = {}) => {
  return new Promise((resolve, reject) => {
    axios.post(url, body).then(response => {
      return resolve(response.data);
    }).catch(error => {});
  });
};

// server listen instance
server.listen(3500, () => {
});

NodeJS Undefined JSON object

Just posting a question, as I have seen some similar questions on here but none with a method that works for me.
I'm new to NodeJS and playing around with requesting data from an API. For my test here I'm just trying to pull ticker prices based on the input of a prompt from the user.
This works fine; however, the object property I try to read afterwards comes back undefined.
This is the code I am using to try and make this work:
prompt.start();
prompt.get(['coin'], function (err, result) {
  request({url: `https://min-api.cryptocompare.com/data/price?fsym=${result.coin}&tsyms=BTC,USD`, json: true}, function (err, res, json) {
    if (err) {
      throw err;
    }
    console.log(json);
    var json = JSON.stringify(json);
    var string2 = JSON.parse(json);
    console.log(string2.btc_price);
    console.log(json);
  });
  console.log('Retrieving: ' + result.coin);
});
The API request works; however, the JSON it returns looks like this across my 3 console logs:
{ set_attributes: { btc_price: 1, usd_price: 15839.35 } }
undefined
{"set_attributes":{"btc_price":1,"usd_price":15839.35}} -- (Stringify'd response)
I want to be able to extract btc_price and usd_price as variables. I've tried a few different methods and can't figure out where exactly I'm going wrong. Any help would be greatly appreciated!
Cheers,
J
When you attempt to extract the btc_price attribute, it's actually nested, so your second console.log should read console.log(string2.set_attributes.btc_price);
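For example, based on the response shape shown above, both values can be pulled out in one go with destructuring:

// string2 is the parsed response from the question
const { btc_price, usd_price } = string2.set_attributes;
console.log(btc_price); // 1
console.log(usd_price); // 15839.35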
axios has more stars on GitHub, more followers on GitHub and more forks.
Features
Make XMLHttpRequests from the browser
Make http requests from node.js
Supports the Promise API
Intercept request and response
Transform request and response data
Cancel requests
Automatic transforms for JSON data
Client side support for protecting against XSRF
Using async / await
// Make a request for the current BTC/USD price of a given symbol
async function getPrice(symbol) {
  try {
    // note: backticks are required, otherwise ${symbol} is not interpolated
    const response = await axios.get(`https://min-api.cryptocompare.com/data/price?fsym=${symbol}&tsyms=BTC,USD`);
    const preload = response.data;
    return `preload.BTC = ${preload.BTC}; preload.USD = ${preload.USD}`;
  } catch (error) {
    console.log(error);
  }
}

getPrice('ETH').then(console.log);
// logs e.g.: preload.BTC = 0.04689; preload.USD = 742.85

SQL Select a value from database

For a Discord bot I have a command that changes the prefix for that guild. It 'works fine' in that it updates the prefix in my database (MySQL Workbench), but commands still trigger for ANY prefix: if you stick any character in front of a command it triggers, instead of only the prefix stored in the database.
This is my code to check the prefix:
let prefix = "!";
connection.query(`SELECT * FROM guilds WHERE guildid = ${message.guild.id}`, (error, commands) => {
  if (error) throw error;
  if (commands.length) { // guild exists in database
    commands.forEach(value =>
      prefix = value.prefix;
      console.log(value.prefix); // returns correct prefix from database
    });
  } else {
    prefix = "!";
  }
});
It's a little challenging trying to read your code with the arrow shorthand. I'm rather sure this is where your error is coming from:
prefix = value.prefix; console.log(value.prefix);
Do you mean to be logging the original value.prefix before reassigning it?
Try this code.
commands.forEach(function (value) {
  console.log(value.prefix);
  prefix = value.prefix;
})
Javascript Docs on .forEach()
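For completeness, a sketch of that corrected forEach back inside the query callback; since connection.query is itself asynchronous, anything that depends on the fetched prefix has to run inside this callback (handleCommand is a hypothetical placeholder, not part of the original code):

let prefix = "!";
connection.query(`SELECT * FROM guilds WHERE guildid = ${message.guild.id}`, (error, commands) => {
  if (error) throw error;
  if (commands.length) { // guild exists in database
    commands.forEach(function (value) {
      console.log(value.prefix); // correct prefix from the database
      prefix = value.prefix;
    });
  }
  // use the prefix here, inside the callback, once it has actually been set
  if (message.content.startsWith(prefix)) {
    handleCommand(message); // hypothetical command handler
  }
});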
