Javascript 'then' not working as expected in Ripple-lib calls

I'm trying to create a simple example of payments over the XRPL using Ripple-lib. The idea is to send several payments to different accounts stored in an array. I've made it kind of work in a different way than expected, but when using the 'then' method (as the docs recommend) it does not work at all.
I'm a total newbie to Javascript, so I don't have a good grasp of the language, asynchronous coding, or promises. When using the 'then' paradigm, the code stops working and no output can be seen in the console. This is the code I'm currently using. In the comments inside the 'sendXRP' function I explain the problem. How can this be re-arranged? Between the two ways, which is the proper one to code it?
'use strict';
const RippleAPI = require('ripple-lib').RippleAPI;
const sender = 'r*********************************';
const secret = 's****************************';
const destinations = ['r*********************************',
  'r*********************************',
  'r*********************************'];
const amount = 5;
// Instantiate Ripple API
const api = new RippleAPI({
  server: "wss://s.altnet.rippletest.net:51233"
});
run();
async function sendXRP(amount, fee, destination, memo) {
  // Update amount
  amount = (amount - fee).toString();
  // Build payment
  const payment = {
    source: {
      address: sender,
      maxAmount: {
        value: amount,
        currency: 'XRP'
      }
    },
    destination: {
      address: destination,
      amount: {
        value: amount,
        currency: 'XRP'
      }
    },
    memos: [
      {
        data: memo
      }
    ]
  };
  // Build instructions
  const instructions = {
    maxLedgerVersionOffset: 5
  };
  console.log('Sending ' + amount + ' to ' + destination);
  // THIS KIND OF WORKS FOR NOW
  // Prepare the payment
  const preparedTX = await api.preparePayment(sender, payment, instructions);
  // Sign the payment
  const signedTX = api.sign(preparedTX.txJSON, secret);
  // Submit the payment
  const result = await api.submit(signedTX['signedTransaction']);
  // Return TX hash on successful TX
  if ('resultCode' in result && result['resultCode'] == 'tesSUCCESS') {
    return signedTX.id;
  } else {
    return null;
  }
  // THIS IS MORE SIMILAR TO HOW IT IS DONE IN THE DOCS! NOT WORKING!
  // ALSO, HOW DO I RETURN THE RESULT OF API.SIGN TO THE MAIN FUNCTION?
  // Prepare the payment
  // api.preparePayment(sender, payment, instructions).then(preparedTX => {
  //   // Sign the payment
  //   api.sign(preparedTX.txJSON, secret).then(signedTX => {
  //     // Submit the payment
  //     api.submit(signedTX['signedTransaction']);
  //   })
  // }).catch(console.error);
}
function run() {
  // Connect to Ripple server
  api.connect().then(() => {
    return api.getFee();
  }).then(async fee => {
    for (var i in destinations) {
      var hash = await sendXRP(amount, Number(fee), destinations[i], 'memotext');
      console.log(hash);
    }
  }).then(() => {
    return api.disconnect();
  }).catch(console.error);
}

Could it be that some of the transactions failed to send? If a transaction failed, the result variable in sendXRP would still hold the transaction result, but since you return null whenever the result code is not tesSUCCESS, that result information is discarded.
const result = await api.submit(signedTX['signedTransaction']);
if ('resultCode' in result && result['resultCode'] == 'tesSUCCESS') {
  return signedTX.id;
} else {
  return null;
}
Before, when I tried submitting transactions consecutively, they would fail and return the error code tefPAST_SEQ.
"The sequence number of the transaction is lower than the current sequence number of the account sending the transaction." from https://developers.ripple.com/tef-codes.html
I recommend removing the if ('resultCode' in result ...) block and checking the full transaction result. If the transactions fail with the tefPAST_SEQ error, my solution is to set the account sequence in the instructions manually, or to add a setTimeout delay after each submit.
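For illustration, here is a minimal sketch of pinning the sequence manually. It assumes ripple-lib's api.getAccountInfo, which resolves with the account's current sequence; the other names follow the question's code.
// Fetch the current account sequence once, then assign an explicit,
// incrementing sequence to each prepared payment so consecutive
// submits cannot collide (the cause of tefPAST_SEQ).
const accountInfo = await api.getAccountInfo(sender);
let sequence = accountInfo.sequence;
for (const destination of destinations) {
  const instructions = {
    maxLedgerVersionOffset: 5,
    sequence: sequence++ // pin and advance the sequence per payment
  };
  // prepare, sign, and submit as in sendXRP above...
}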

Related

Google Apps Script Working on backend but not on sheets

I am trying to create a script that pulls from the CoinMarketCap API and displays the current price. The script works fine on the back end when I assign the variable a value. However, when I try to run the function in Sheets, the returned value is null.
function marketview(ticker) {
  var url = "https://pro-api.coinmarketcap.com/v1/cryptocurrency/quotes/latest?CMC_PRO_API_KEY=XXX&symbol=" + ticker;
  var data = UrlFetchApp.fetch(url);
  const jsondata = JSON.parse(data);
  Logger.log(jsondata.data[ticker].quote['USD'].price)
}
My execution logs show that the script is running, but when I use the function in Sheets and try to quote ETH, for example, the script runs for BTC.
When I do this on the back end and assign ETH, the script works fine and returns the right quote. Any ideas on what I'm missing?
I did the same with the CoinGecko API and had an issue with all my requests being rejected with a quota exceeded error.
I understood that Google Sheets servers' IP addresses were already spamming the CoinGecko server (I was obviously not the only one to try this).
This is why I used an external service, apify.com, to pull the data and re-expose it over their API.
This is my Apps Script coingecko.gs:
/**
 * get latest coingecko market prices dataset
 */
async function GET_COINGECKO_PRICES(key, actor) {
  const coinGeckoUrl = `https://api.apify.com/v2/acts/${actor}/runs/last/dataset/items?token=${key}&status=SUCCEEDED`
  return ImportJSON(coinGeckoUrl);
}
You need ImportJSON function, available here: https://github.com/bradjasper/ImportJSON/blob/master/ImportJSON.gs
Then in a cell I write =GET_COINGECKO_PRICES(APIFY_API_KEY,APIFY_COINGECKO_MARKET_PRICES); you will have to create two named fields, APIFY_API_KEY and APIFY_COINGECKO_MARKET_PRICES, for this to work.
Then register on apify.com and create an actor by forking the apify-webscraper actor.
I set the start URL to https://api.coingecko.com/api/v3/coins/list, which gives me the total number of existing cryptos (approx. 11,000 as of today) and the number of pages, so I can run the requests concurrently (the rate limit is 10 concurrent requests on CoinGecko). Then I just replace /list with /markets and set the proper limit to get all the pages I need.
I use the following for the tasks page function:
async function pageFunction(context) {
  let marketPrices = [];
  const ENABLE_CONCURRENCY_BATCH = true;
  const PRICE_CHANGE_PERCENTAGE = ['1h', '24h', '7d'];
  const MAX_PAGE_TO_SCRAP = 10;
  const MAX_PER_PAGE = 250;
  const MAX_CONCURRENCY_BATCH_LIMIT = 10;
  await context.waitFor(5000);
  const cryptoList = readJson();
  const totalPage = Math.ceil(cryptoList.length / MAX_PER_PAGE);
  context.log.info(`[Coingecko total cryptos count: ${cryptoList.length} (${totalPage} pages)]`)

  function readJson() {
    try {
      const preEl = document.querySelector('body > pre');
      return JSON.parse(preEl.innerText);
    } catch (error) {
      throw Error(`Failed to read JSON: ${error.message}`)
    }
  }

  async function loadPage($page) {
    try {
      const params = {
        vs_currency: 'usd',
        page: $page,
        per_page: MAX_PER_PAGE,
        price_change_percentage: PRICE_CHANGE_PERCENTAGE.join(','),
        sparkline: true,
      }
      let pageUrl = `${context.request.url.replace(/\/list$/, '/markets')}?`;
      pageUrl += [
        `vs_currency=${params.vs_currency}`,
        `page=${params.page}`,
        `per_page=${params.per_page}`,
        `price_change_percentage=${params.price_change_percentage}`,
      ].join('&');
      context.log.info(`GET page ${params.page} URL: ${pageUrl}`);
      const page = await fetch(pageUrl).then((response) => response.json());
      context.log.info(`Done GET page ${params.page} size ${page.length}`);
      marketPrices = [...marketPrices, ...page];
      return page
    } catch (error) {
      throw Error(`Fail to load page ${$page}: ${error.message}`)
    }
  }

  try {
    if (ENABLE_CONCURRENCY_BATCH) {
      const fetchers = Array.from({ length: totalPage }).map((_, i) => {
        const pageIndex = i + 1;
        if (pageIndex > MAX_PAGE_TO_SCRAP) {
          return null;
        }
        return () => loadPage(pageIndex);
      }).filter(Boolean);
      while (fetchers.length) {
        await Promise.all(
          fetchers.splice(0, MAX_CONCURRENCY_BATCH_LIMIT).map((f) => f())
        );
      }
    } else {
      let pageIndex = 1
      let page = await loadPage(pageIndex)
      while (page.length !== 0 && pageIndex <= MAX_PAGE_TO_SCRAP) {
        pageIndex += 1
        page = await loadPage(pageIndex)
      }
    }
  } catch (error) {
    context.log.info(`Fetchers failed: ${error.message}`);
  }

  context.log.info(`End: Updated ${marketPrices.length} prices for ${cryptoList.length} cryptos`);
  const data = marketPrices.sort((a, b) => a.id.toLowerCase() > b.id.toLowerCase() ? 1 : -1);
  context.log.info(JSON.stringify(data.find((item) => item.id.toLowerCase() === 'bitcoin')));

  function sanitizer(item) {
    item.symbol = item.symbol.toUpperCase()
    return item;
  }
  return data.map(sanitizer)
}
I presume you are hitting the same issue I had with CoinMarketCap, and that you could do the same with it.
You're not returning anything to the sheet, just logging it. Return it:
return jsondata.data[ticker].quote['USD'].price
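Put together, the custom function from the question becomes (unchanged except for the final line, which returns the value instead of logging it):
function marketview(ticker) {
  var url = "https://pro-api.coinmarketcap.com/v1/cryptocurrency/quotes/latest?CMC_PRO_API_KEY=XXX&symbol=" + ticker;
  var data = UrlFetchApp.fetch(url);
  const jsondata = JSON.parse(data);
  // Return the price so the value lands in the calling cell
  return jsondata.data[ticker].quote['USD'].price;
}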

I don't understand how to get over Stripe's rate limit

I am trying to develop the backend of an e-commerce website using Stripe and NodeJS (Express, to be precise).
When the server starts, I try to fetch my products from Stripe. But after the first stripe.products.list call I get an error saying that I exceeded the API rate limit. This should not happen: as the Stripe docs say, the rate is limited to 25 requests/sec in test mode, whereas I am waiting 10 SECONDS before making my second call.
Please find below the function I use to make my calls. I simply use it in a loop with a sleep() function before each call.
async function fetchFromLastObj(last_obj) {
  const data = stripe.products.list({
    active: true,
    limit: maxRetrieve,
    starting_after: last_obj,
  })
    .then((resp) => {
      console.log(`Retrieved ${resp.data.length} products.`);
      return resp.data;
    })
    .catch((e) => { });
  return data;
}
The sleep function:
const { promisify } = require('util')
const sleep = promisify(setTimeout)
The loop in question:
var last_obj_seen = null;
var nb_iters = 0;
// fetching all products from stripe
while (true) {
  console.log(`Iteration ${nb_iters + 1}...`)
  let fetchedList = [];
  if (last_obj_seen == null) {
    fetchedList = await fetchFirstBatch();
  } else {
    fetchedList = await fetchFromLastObj(last_obj_seen);
  }
  fetchedList = Array.from(fetchedList);
  if (fetchedList.length == 0) { break; };
  last_obj_seen = fetchedList.slice(-1)[0];
  await sleep(10000);
  fetchPrices(fetchedList)
    .then((fetchedListWithPrices) => {
      saveList(fetchedListWithPrices); // not asynchronous
    })
    .catch((err) => { console.error("While fetching products from Stripe..."); console.error(err); });
  nb_iters += 1;
  if (nb_iters > 100) { throw Error("Infinite loop error"); }
  if (nb_iters !== 0) {
    console.log("Waiting before request...");
    await sleep(10000);
  }
}
console.log("Done.");
Rather than handling pagination logic yourself you can use the auto-pagination feature of the official Stripe libraries.
Our libraries support auto-pagination. This feature easily handles fetching large lists of resources without having to manually paginate results and perform subsequent requests.
In Node 10+ you can do this, for example:
for await (const product of stripe.products.list()) {
  // Do something with product
}
The Stripe Node library will handle pagination under the hood for you.
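Applied to the question's code, a minimal sketch could look like this (fetchPrices and saveList are the question's own helpers; the list parameters mirror the question's):
// Let the library paginate; no manual starting_after bookkeeping or sleeps.
const products = [];
for await (const product of stripe.products.list({ active: true })) {
  products.push(product);
}
const withPrices = await fetchPrices(products);
saveList(withPrices); // not asynchronous, per the question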

How can I return different values from a function depending on code inside an Axios promise? NodeJS

I have a block of code that calls an API and saves the results, whether there are differences or not. I would like to return different values for DATA, as laid out in the code. But this is obviously not working, since it's returning undefined.
let compare = (term) => {
  let DATA;
  // declare empty array where we will push every thinkpad computer for sale.
  let arrayToStore = [];
  // declare page variable, that will be the amount of pages based on the primary results
  let pages;
  // this is the initial get request to calculate the amount of iterations depending on result quantities.
  axios.get('https://api.mercadolibre.com/sites/MLA/search?q=' + term + '&condition=used&category=MLA1652&offset=' + 0)
    .then(function (response) {
      // begin calculation of pages
      let amount = response.data.paging.primary_results;
      // since we only care about the primary results, this is fine. Since there are 50 items per page, we divide
      // amount by 50, and round it up, since the last page can contain less than 50 items
      pages = Math.ceil(amount / 50);
      // here we begin the for loop.
      for (i = 0; i < pages; i++) {
        // So for each page we will do an axios request in order to get results
        // Since each page is 50 as offset, then i should be multiplied by 50.
        axios.get('https://api.mercadolibre.com/sites/MLA/search?q=' + term + '&condition=used&category=MLA1652&offset=' + i * 50)
          .then((response) => {
            const cleanUp = response.data.results.map((result) => {
              let image = result.thumbnail.replace("I.jpg", "O.jpg");
              return importante = {
                id: result.id,
                title: result.title,
                price: result.price,
                link: result.permalink,
                image: image,
                state: result.address.state_name,
                city: result.address.city_name
              }
            });
            arrayToStore.push(cleanUp);
            console.log(pages, i)
            if (i === pages) {
              let path = ('./compare/yesterday-' + term + '.json');
              if (fs.existsSync(path)) {
                console.log("Loop Finished. Reading data from Yesterday")
                fs.readFile('./compare/yesterday-' + term + '.json', (err, data) => {
                  if (err) throw err;
                  let rawDataFromYesterday = JSON.parse(data);
                  // test
                  // first convert both items to check to JSON strings in order to check them.
                  if (JSON.stringify(rawDataFromYesterday) !== JSON.stringify(arrayToStore)) {
                    // Then check difference using id, otherwise it did not work. Using lodash to help.
                    let difference = _.differenceBy(arrayToStore[0], rawDataFromYesterday[0], 'id');
                    fs.writeFileSync('./compare/New' + term + '.json', JSON.stringify(difference));
                    // if they are different save the new file.
                    // Then send it via mail
                    console.log("different entries, wrote difference to JSON");
                    let newMail = mail(difference, term);
                    fs.writeFileSync('./compare/yesterday-' + term + '.json', JSON.stringify(arrayToStore));
                    DATA = {
                      content: difference,
                      message: "These were the differences, items could be new or deleted.",
                      info: "an email was sent, details are the following:"
                    }
                    return DATA;
                  } else {
                    console.log("no new entries, cleaning up JSON");
                    fs.writeFileSync('./compare/New' + term + '.json', []);
                    DATA = {
                      content: null,
                      message: "There were no difference from last consultation",
                      info: "The file" + './compare/New' + term + '.json' + ' was cleaned'
                    }
                    return DATA;
                  }
                });
              } else {
                console.error("error");
                console.log("file did not exist, writing new file");
                fs.writeFileSync('./compare/yesterday-' + term + '.json', JSON.stringify(arrayToStore));
                DATA = {
                  content: arrayToStore,
                  message: "There were no registries of the consultation",
                  info: "Writing new file to ' " + path + "'"
                }
                return DATA;
              }
            }
          })
      }
    }).catch(err => console.log(err));
}
module.exports = compare
module.exports = compare
So I export this compare function, which I call in my app.js.
What I want is for this compare function to return the DATA object, so I can display the actual messages on the front end.
My hope was to put this compare(term) function inside a route in app.js, like so:
app.get("/api/compare/:term", (req, res) => {
let {term} = req.params
let data = compare(term);
res.send(data);
})
But as I said, it's returning undefined. I tried with async/await, and with returning the whole first axios call, but I'm always getting undefined.
Thank you
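For what it's worth, the general shape being asked about is to make compare return a promise that resolves to DATA, and to await it in the route. A minimal sketch, eliding the paging and file-comparison logic from the question:
let compare = async (term) => {
  // await the request instead of discarding its promise
  const response = await axios.get('https://api.mercadolibre.com/sites/MLA/search?q=' + term + '&condition=used&category=MLA1652&offset=' + 0);
  let DATA = { content: null, message: "...", info: "..." };
  // ... build DATA from the paged results, awaiting each request ...
  return DATA; // compare(term) now resolves to DATA
};

app.get("/api/compare/:term", async (req, res) => {
  let { term } = req.params;
  let data = await compare(term); // wait for the resolved DATA
  res.send(data);
});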

Google Cloud Function in Javascript completes before the function is completed

To explain my use case: I have a GCP compute engine running an in-house-built application with support for a RESTful API. What I want to do is read the RESTful API to check if there are any updates to records in the application.
If there are new records, I want to add them to a BigQuery table that is used to build a report in Data Studio.
The issue I have is that the function completes before the insert into BigQuery is completed. I have added async/await, but I don't seem to get the right formula for this to work for me, so I'm turning to the community for input. I appreciate any advice I can get on this. Here is my code:
'use strict';
// Request data from a URL
var axios = require('axios');
var https = require('https');
// Firebase Functions
var functions = require('firebase-functions');
const admin = require('firebase-admin');
// Initialise app
admin.initializeApp();
// Setting timeout in seconds - default is 1 second.
// The maximum value for timeoutSeconds is 540, or 9 minutes. Valid values for memory are:
// 128MB, 256MB, 512MB, 1GB, 2GB
const runtimeOpts = {
  timeoutSeconds: 300,
  memory: '512MB'
}
exports.getEmilyNoonreporttoBigQuery = functions
  .runWith(runtimeOpts)
  .region('europe-west1')
  .https.onRequest(async (req, res) => {
    try {
      // Imports the Google Cloud client library
      const {BigQuery} = require('@google-cloud/bigquery');
      // Create a client
      const bigquery = new BigQuery();
      // Make use of a dataset
      const dataset = bigquery.dataset('noonreport');
      // Make use of a table
      const table = dataset.table('noonreport');
      // The API key
      let apikey = 'API-KEY';
      // Table to get data from
      var apitable = 'noon_report';
      // Where the data comes from
      var shipid = '1';
      // Get the current date
      var today = new Date();
      var dd = String(today.getDate()).padStart(2, '0');
      var mm = String(today.getMonth() + 1).padStart(2, '0'); // January is 0!
      var yyyy = today.getFullYear();
      today = yyyy + '-' + mm + '-' + dd;
      var url = 'https://emily.apps.gl3/api/' + shipid + '/' + apitable + '?apikey=' + apikey + '&syncdate=' + today;
      console.log('today', today);
      console.log('url', url);
      // At request level
      const agent = new https.Agent({
        rejectUnauthorized: false
      });
      // axios.get(url)
      axios.get(url, { httpsAgent: agent })
        .then(resp => {
          try {
            console.log("Response " + resp);
            for (let artno in resp.data.noon_report) {
              // Shorthand for the nested report object
              const report = resp.data.noon_report[artno].noon_report;
              // Create the BigQuery row
              var row = {
                ship: report.ship,
                local_time: report.local_time || report.report_date,
                status: report.status,
                location: report.location,
                course: report.course,
                next_port: report.next_port,
                ETD: report.ETD,
                ETA: report.ETA,
                distance_made: report.distance_made,
                stoppage: report.stoppage,
                avg_speed: report.avg_speed,
                mgo_rob: report.mgo_rob,
                mgo_consumed: report.mgo_consumed,
                mgo_received: report.mgo_received,
                fw_rob: report.fw_rob,
                fw_consumed: report.fw_consumed,
                fw_produced: report.fw_produced,
                fw_received: report.fw_received,
                underway_hours: report.underway_hours,
                me_rh: report.me_rh,
                heli_flight_hours: report.heli_flight_hours,
                heli_fuel_consumed: report.heli_fuel_consumed,
                heli_fuel_rob: report.heli_fuel_rob,
                name_of_pilot: report.name_of_pilot,
                nature_of_flight: report.nature_of_flight,
                wind_direction: report.wind_direction,
                wind_force: report.wind_force,
                sea_state: report.sea_state,
                weather: report.weather,
                visibility: report.visibility,
                barometer: report.barometer,
                air_temp: report.air_temp,
                remarks: report.remarks,
                cur_timestamp: report.cur_timestamp,
                cancelled: report.cancelled,
                arrivaldep: report.arrivaldep,
                shorepw: report.shorepw,
                lo_rob: report.lo_rob,
                lo_consumed: report.lo_consumed,
                petrol_rob: report.petrol_rob,
                petrol_consumed: report.petrol_consumed,
                heli_fuel_received: report.heli_fuel_received,
                petrol_received: report.petrol_received,
                lo_received: report.lo_received,
                campaign: report.campaign,
                projectLeader: report.projectLeader,
                visitorsOpen: report.visitorsOpen,
                fundsOpen: report.fundsOpen,
                vipsOpen: report.vipsOpen,
                pressOpen: report.pressOpen,
                volsOpen: report.volsOpen,
                officeOpen: report.officeOpen,
                clockChange: report.clockChange,
                operPrepared: report.operPrepared,
                techPrepared: report.techPrepared,
                port_of_call: report.port_of_call || "No Author Defined",
                time_zone: report.time_zone,
                report_date: report.report_date,
                report_by: report.report_by,
                berth_anchor_hours: report.berth_anchor_hours,
                ship_activity: report.ship_activity,
                uuid: report.uuid,
                is_submit: report.is_submit,
                helicopter_used: report.helicopter_used,
                position_lat: report.position_lat,
                position_lon: report.position_lon,
                me1_distance: report.me1_distance,
                me1_uw_hours: report.me1_uw_hours,
                me2_distance: report.me2_distance,
                me2_uw_hours: report.me2_uw_hours,
                me1_2_distance: report.me1_2_distance,
                me1_2_uw_hours: report.me1_2_uw_hours,
                edrive_distance: report.edrive_distance,
                edrive_uw_hours: report.edrive_uw_hours,
                sail_distance: report.sail_distance,
                sail_uw_hours: report.sail_uw_hours,
                e_motorsail_distance: report.e_motorsail_distance,
                e_motorsail_uw_hours: report.e_motorsail_uw_hours,
                me_motorsail_distance: report.me_motorsail_distance,
                me_motorsail_uw_hours: report.me_motorsail_uw_hours,
                motoring_edrive_distance: report.motoring_edrive_distance,
                motoring_edrive_uw_hours: report.motoring_edrive_uw_hours,
                drifting_hours: report.drifting_hours,
                country: report.author
              };
              console.log("ROW TO INSERT " + JSON.stringify(row));
              insertBigQuery(row, table);
            }
            console.log("For loop end");
            res.status(200).send("OK");
          } catch (error) {
            // Handle the error
            console.log(error);
            res.status(500).send(error);
          }
        })
    } catch (error) {
      // Handle the error
      console.log(error);
      res.status(500).send(error);
    }
    // This query inserts data after charges completed
    async function insertBigQuery(row, table) {
      return await table.insert(row, function(err, apiResponse) {
        // console.log('Insert', apiResponse);
        if (!err) {
          console.log("[BIGQUERY] - Saved.");
        } else {
          console.error(`error table.insert: ${JSON.stringify(err)}`)
          // To finish http function return value
        }
      });
    }
  });
I have a for loop to unpack the RESTful API data and build a row to insert into BigQuery. I use Cloud Scheduler to trigger this function with the HTTP trigger.
The URL I'm using is for an internal app, so it is not reachable from outside. I get the data and I unpack it, but the function finishes before the data has been inserted into BigQuery.
I tried to add an await to the line where I call the BigQuery insert function:
await insertBigQuery(row, table);
It did not work; looking for help.
I think I see a couple of issues. If we look at the API for BigQuery table objects, called insert, we see that it returns a Promise. Great. We also see that it takes an optional callback function. I'm not certain that you should use both: you either say the outcome will be known via a promise that will subsequently be resolved, or you say that the outcome will be told to you via a callback. I'm not sure that both will be satisfied. I'd suggest just using Promises.
However, I think the bigger issue is with this logic:
async function insertBigQuery(row, table) {
  return await table.insert(row, function(err, apiResponse) {
    // console.log('Insert', apiResponse);
    if (!err) {
      console.log("[BIGQUERY] - Saved.");
    } else {
      console.error(`error table.insert: ${JSON.stringify(err)}`)
      // To finish http function return value
    }
  });
}
Netting this down, you have:
async function funcName() {
  return await asyncFuncCall();
}
I think this may be your problem. By prefixing your function (funcName) with async, you are declaring that the function will return a Promise and that the caller will NOT block waiting for the return; the caller will itself receive a promise.
I've got a sneaking suspicion that what you really want is:
async function funcName() {
  return asyncFuncCall();
}
and then where you are calling funcName() you want either:
let finalResult = await funcName();
or
funcName().then((finalResult) => { ... logic ... });
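In the question's handler, that means collecting one promise per row and waiting for all of them before ending the function. A sketch under the answer's suggestion (buildRow is a hypothetical stand-in for the question's row-mapping code):
// Use the promise API only and keep one promise per inserted row.
const inserts = [];
for (let artno in resp.data.noon_report) {
  const row = buildRow(resp.data.noon_report[artno].noon_report); // hypothetical helper
  inserts.push(table.insert(row)); // no callback; insert returns a Promise
}
await Promise.all(inserts); // wait for every insert to settle
res.status(200).send("OK"); // only now let the function finish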

Limit number of requests per minute using Supertest

We're using supertest with TypeScript to test our APIs.
For some of them (e.g. user registration, change password, etc.) an email is sent that is required for confirmation (user confirm token, reset password token, etc.).
In order to achieve this, we decided to use GuerillaMail, as it's a simple disposable email client with an API. After doing the prerequisites (setting the email address via their API), the following piece of code does its job in a couple of cases:
private async getEmailId(sid_token: string, emailType: EmailType): Promise<string> {
  var mail;
  var mailToken = this.getSidToken(sid_token);
  // Keep trying until the email gets in the inbox.
  // Infinite loop is prevented by the jest framework timeout.
  while (!mail) {
    const result = await request(this.guerillaMailApiUrl)
      .get('')
      .query({
        f: 'check_email',
        seq: 0,
        sid_token: mailToken
      });
    if (result.body.list != undefined) {
      mail = result.body.list.filter(m => m.mail_subject == emailType && m.mail_from == 'email#domain.com' && m.mail_read == 0)[0];
    } else {
      mail = undefined;
    }
  }
  return mail.mail_id;
}
However, it comes with a limitation of 20 requests per minute, a limitation that is causing tests to fail.
Is there a way to limit the number of requests made?
LATER EDIT:
I made it work by creating a delay:
async delay(ms: number) {
  return new Promise(resolve => setTimeout(resolve, ms));
}
and calling it right before exiting the while loop:
await this.delay(5000);
Is there a cleaner/nicer/more efficient way of achieving this?
This is a rate limiter that I used in my past projects: Bottleneck (https://www.npmjs.com/package/bottleneck).
const limiter = new Bottleneck({
  maxConcurrent: 20,
  minTime: 60000
});

while (!mail) {
  // set limiter here
  const result = await limiter.schedule(() => request(this.guerillaMailApiUrl)
    .get('')
    .query({
      f: 'check_email',
      seq: 0,
      sid_token: mailToken
    }));
  if (result.body.list != undefined) {
    mail = result.body.list.filter(m => m.mail_subject == emailType && m.mail_from == 'email#domain.com' && m.mail_read == 0)[0];
  } else {
    mail = undefined;
  }
}
Hope it helps
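One note on the configuration above: Bottleneck's minTime is the minimum gap between job launches, so minTime: 60000 allows only one request per minute. To sit just under GuerillaMail's 20-requests-per-minute cap, something closer to this should work:
// 60000 ms / 20 requests = at most one request every 3 seconds
const limiter = new Bottleneck({
  maxConcurrent: 1, // the polling loop is sequential anyway
  minTime: 3000
});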
