This is my first time building a full stack React project, so any help would be appreciated.
Essentially I'm building an endpoint that I would like to handle around 200 requests a second. From what I've read online that seems like a reasonable number, but since this is my first time doing this I'm not sure if that's true.
Currently, using setTimeout(), I was able to push and resolve around 2000 requests (or 3000 if you count the preflight requests) in about 3 minutes. That works out to roughly 10-20 requests per second, which is pretty slow. If I fire the requests without waiting for each one to resolve, I can process around 90 requests per second. That still isn't the throughput I need, and I'd also prefer that the requests resolve much faster.
So my current question is: would it be better to use the bluebird library, since it sounds much faster? Another issue I'm running into is that when I try to make around 100k+ requests I run out of network resources, and I'm not sure how to free them up or otherwise handle that.
Below is my code
const stepData = async () => {
    try {
        const body = {record, testtime, current_ma, capacity_mah, specap_mah_g, voltage_v}
        assignToBody(body)
        const response = await fetch('http://localhost:5000/stepdata', {
            method: "POST",
            headers: {"Content-Type": "application/json"},
            body: JSON.stringify(body)
        })
    } catch (err) {
        console.log(err.message)
        setTimeout(delayQue, 5000)
        fail = true
        rate = rate + 5
        console.log(rate)
    }
}
const delayQue = () => {
    if (count % 500 == 0 && count > 0 && !fail) {
        setTimeout(delayQue, 10000)
        assignVals(row)
        row++
        count++
    } else if (row < vals.length && !fail) {
        setTimeout(delayQue, rate)
        assignVals(row)
        row++
        count++
    } else {
        fail = false
        setTimeout(function () { rate = rate - 5 }, 30000)
    }
}
Functions attached to delayQue() that are called through setTimeout():
const assignVals = (row) => {
    for (let i = 0; i < possibleCols.length; i++) {
        if (cols.includes(possibleCols[i])) {
            valsToPass[i] = vals[row][cols.lastIndexOf(possibleCols[i])]
        } else {
            valsToPass[i] = null;
        }
    }
    stepData()
}
const assignToBody = (body) => {
    for (let i = 0; i < possibleCols.length; i++) {
        if (cols.includes(possibleCols[i])) {
            body[possibleCols[i]] = valsToPass[cols.lastIndexOf(possibleCols[i])]
        }
    }
}
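For what it's worth, one common way to push throughput higher without exhausting sockets is to send the rows in fixed-size concurrent batches with Promise.all, rather than pacing individual requests with setTimeout(). A minimal sketch, assuming the same /stepdata endpoint and a hypothetical rows array of ready-made body objects (the batch size of 50 is only an illustrative starting point):

// Hypothetical sketch: POST `rows` in batches of `batchSize` concurrent requests.
const postInBatches = async (rows, batchSize = 50) => {
    for (let start = 0; start < rows.length; start += batchSize) {
        const batch = rows.slice(start, start + batchSize);
        // Fire one batch concurrently and wait for it to settle before starting
        // the next, which keeps the number of open connections bounded.
        await Promise.all(batch.map(body =>
            fetch('http://localhost:5000/stepdata', {
                method: 'POST',
                headers: {'Content-Type': 'application/json'},
                body: JSON.stringify(body)
            }).catch(err => console.log(err.message))
        ));
    }
};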
Related
I have a mobile app API in my project.
When a payment is done, I need to check the powerStatus in MongoDB.
The powerStatus is changed via a POST request by another piece of software. I first get getpaymentInfo, which has two parameters, paymentInfo and powerStatus. I need to wait until powerStatus becomes true within some specific time. I am using the code below for this, but I don't think it is good, because the loop runs up to 500 times and queries the database on every iteration.
How can I do this properly, waiting until this value returns true without interrupting the operation of other functions during the wait?
var loopCounter = 0;
var getpaymentInfo = await PowerOn.findOne({productSerialNumber: serialNumber});
getpaymentInfo = await PowerOn.findOne({productSerialNumber: serialNumber});
while ((getpaymentInfo.powerStatus != true) && (loopCounter < 500)) {
    getpaymentInfo = await PowerOn.findOne({productSerialNumber: serialNumber});
    if (getpaymentInfo.powerStatus == true) {
        console.log("Wait..", loopCounter)
        break;
    }
    loopCounter++;
}
console.log("after loop powerStatus", getpaymentInfo.powerStatus)
So I'm attempting to write a Google parser.
The idea of my tool is that it takes search queries, searches Google for them, and returns URLs. It is working well so far, but now I'm trying to set up paging of the results and I'm having trouble. My code is:
const needle = require("needle") //for making the get request
const sp = require("serp-parser") //for parsing data from the request
const queryup = "watch movies online free" //my search data
const query = encodeURI(queryup) //my search data so google can read it
var page = 0; //initializing the page counter
let pages = 5; //setting amount of pages to loop through
for (var i = 0; i < pages; i++) { //my loop
    needle.get(`https://www.google.com/search?q=${query}&start=${page}`, function(err, response){ //MY MAIN PROBLEM <<<--- The issue is it's adding to the page value but it's not affecting it here, why?
        page += 10 //adding to page value (every 10 page value is 1 extra page)
        console.log(`----- Page number: ` + page / 10 + " -----") //logging the number of the page to confirm that it is indeed increasing the page value
        let results = response.body; //defining the body of my request
        parser = new sp.GoogleNojsSERP(results); //initializing the parser
        let parsed = parser.serp //parsing the body
        let objarray = parsed.organic; //parsed body (returns as an array of json objects)
        for (var i = 0; i < objarray.length; i++) { //loop the logging of each url
            let url = objarray[i].url //defining url
            console.log(url) //logging each url
        }
    });
}
without a billion comments:
const needle = require("needle")
const sp = require("serp-parser")
const queryup = "watch movies online free"
const query = encodeURI(queryup)
var page = 0;
let pages = 5;
for (var i = 0; i < pages; i++) {
    needle.get(`https://www.google.com/search?q=${query}&start=${page}`, function(err, response){
        //^^^^^ MY MAIN PROBLEM <<<--- The issue is it's adding to the page value but it's not affecting it here, why?
        page += 10
        console.log(`----- Page number: ` + page / 10 + " -----")
        let results = response.body;
        parser = new sp.GoogleNojsSERP(results);
        let parsed = parser.serp
        let objarray = parsed.organic;
        for (var i = 0; i < objarray.length; i++) {
            let url = objarray[i].url
            console.log(url)
        }
    });
}
This seems to be an issue with async.
I'm not familiar with needle, but I know that external queries are basically never synchronous.
The problem you're experiencing is basically that all five needle.get() calls are made while the loop runs, before any of the callbacks have had a chance to fire, so every request is built with the same page value; only afterwards do the callbacks increment page, which is too late to affect the URLs.
Under the hood, the engine runs all of the synchronous loop code first, and only THEN handles the responses to your web queries.
A trip through the needle npm docs tells me that you can use an alternative syntax to get needle to return a promise instead, which can then be wrapped in an async function and managed with await so the requests run sequentially, which is what you're after:
const needle = require('needle');
const sp = require('serp-parser');
const queryup = 'watch movies online free';
const query = encodeURI(queryup);
let page = 0;
const pages = 5;
const googler = async function () {
    for (let i = 0; i < pages; i++) {
        try {
            const response = await needle('get', `https://www.google.com/search?q=${query}&start=${page}`);
            console.log('----- Page number: ' + page / 10 + ' -----');
            const results = await response.body;
            const parser = new sp.GoogleNojsSERP(results);
            const parsed = parser.serp;
            const objarray = parsed.organic;
            for (let i = 0; i < objarray.length; i++) {
                const url = objarray[i].url;
                console.log(url);
            }
        } catch (err) {
            console.error(err);
        }
        page += 10;
    }
};
googler();
The key differences:
Per the needle docs, rather than the request method being a method on the needle object, it's instead the first argument you pass directly to invoking needle itself as a function.
When you manage promises with await, a rejected promise throws an error that should be caught with a traditional try/catch block; I've done that here. Though if needle is anything like node-fetch, it probably almost never throws, but it's good practice.
One of my extensions automatically changed your var declarations to let and not-reassigned let declarations to const; you're welcome to change them back.
This is a classic asynchronous problem. Add another console.log() immediately before the needle.get() call (and after the for statement) and you will see what is going wrong: All of the needle.get() calls execute before any of the callbacks where you do the page += 10. Then, after the for loop completes, all of the callbacks are executed. But it is too late for this to have any effect on the start= parameter.
One way to fix this could be to move the body of this for loop (the needle.get() and its callback) into a separate function. Initialize your variables and call this function once. Then at the end of the callback, do your page += 10 and update any other variables you need to, and call this function again from there if there are more pages left that you want to load. If you have completed all of the pages, then don't make that call. The for loop is not needed with this technique.
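For example, a minimal sketch of that approach, reusing the question's needle, sp, and query setup (the fetchPage name is only illustrative):

let page = 0;
const pages = 5;

// Request one page, and only request the next one from inside the callback,
// after the current page has been handled.
function fetchPage() {
    needle.get(`https://www.google.com/search?q=${query}&start=${page}`, function (err, response) {
        if (err) return console.error(err);
        console.log('----- Page number: ' + (page / 10 + 1) + ' -----');
        const parser = new sp.GoogleNojsSERP(response.body);
        for (const result of parser.serp.organic) {
            console.log(result.url);
        }
        page += 10;
        if (page < pages * 10) fetchPage(); // more pages left, so request the next one
    });
}

fetchPage();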
Or, you could keep your current code but move the page += 10 after the callback but still inside the outer for loop. That way this variable will be incremented as you expect. I don't necessarily recommend this, as Google may get unhappy about receiving the get requests so rapidly and may start blocking your calls or throwing CAPTCHAs at you.
There may be an issue of whether this kind of scraping is allowed by Google's Terms of Service, but I will leave that question to you and your legal advisors.
Also, I would avoid using var anywhere. Use const or let instead, and prefer const over let except when you need to reassign the variable.
One tip: in most cases where you use a numeric for loop to iterate over an array, the code will be cleaner if you use a for..of loop. For example, this bit of code:
let parsed = parser.serp
let objarray = parsed.organic;
for (var i = 0; i < objarray.length; i++) {
    let url = objarray[i].url
    console.log(url)
}
could be more simply written as:
for (const result of parser.serp.organic) {
    console.log(result.url)
}
(I know that is just a bit of debug code, but this is a good habit to get into.)
Finally, watch your indentation and be sure to indent nested blocks or functions. I took the liberty of adding some indentation for you.
I have an issue where the NodeList sometimes loads before it is used, but other times it loads only after it has been used, causing an error because the list is undefined.
(Screenshots omitted: one showing what I wish would appear all the time, one showing the error I sometimes receive, and one showing the code I believe is the issue but don't know how to fix.)
I have done some searching online and I think it is related to the code being asynchronous or synchronous (I have not learned about this yet, so I am unsure if I am correct). Here's my code. Context: getNeighbourhoodData() is called from the onload of my HTML page's body.
function getNeighbourhoodData() {
    var request = new XMLHttpRequest();
    request.open('GET', neighbourhood_url, true);
    //This function will be called when data returns from the web api
    request.onload = function () {
        //get all the restaurant records into our neighbourhood array
        neighbourhood_array = JSON.parse(request.responseText);
        //get User data
        displayNeighbourhoods();
    };
    //This command starts the calling of the restaurant web api
    request.send();
}
function displayNeighbourhoods() {
    var list = document.getElementsByName("neiList");
    console.log(list);
    num = 0;
    alphabet_array = ["A","B","C","D","E","F","G","H","I","J","K","L","M","N","O","P","Q","R","S","T","U","V","W","X","Y","Z"];
    console.log(alphabet_array);
    for (var count = 0; count < neighbourhood_array.length; count++) {
        var neighbourhood = neighbourhood_array[count].Neighbourhood;
        if (neighbourhood_array[count].Neighbourhood.startsWith(alphabet_array[num]) == true) {
            var cell = '<li><a class="a--grey" href="/restByNeighbourhood.html" onclick="getName(this)" name="Paya Lebar">' + neighbourhood + '</a></li>';
            list[num].insertAdjacentHTML('beforeend', cell);
            if (count >= neighbourhood_array.length - 1 && num <= 25) {
                num += 1;
                count = -1;
                console.log(num);
            }
        } else if (count >= neighbourhood_array.length - 1 && num <= 25) {
            num += 1;
            count = -1;
            console.log(num);
        } else if (num >= 26) {
            break;
        } else {
            continue;
        }
    }
}
JavaScript is single-threaded, which means only one thing can happen at a time. However, with async calls you can "act" like a multi-threaded language.
For example, the built-in fetch() function returns a Promise that you can await.
async function loadURLodContent() {
    const result = await fetch(/* url-path */);
}
So you can await Promises and write async functions that return Promises.
But this topic isn't an easy one. I'd really recommend getting into Promises and async calls as soon as possible, because you're going to encounter them sooner or later if you develop for the web.
But to your problem... at least from my point of view, you're not giving enough information. Tracer69 has a good proposal for that in the comments.
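For illustration, here is a minimal sketch of the question's getNeighbourhoodData() rewritten with fetch and async/await, assuming the same neighbourhood_url, neighbourhood_array, and displayNeighbourhoods() from the question:

async function getNeighbourhoodData() {
    try {
        const response = await fetch(neighbourhood_url);
        // parse the JSON body once the response has actually arrived
        neighbourhood_array = await response.json();
        // only now is the data guaranteed to exist, so it is safe to render it
        displayNeighbourhoods();
    } catch (err) {
        console.error(err);
    }
}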
I am working on an app that requires calls to the Foursquare Places API, which has a 2-calls-per-second quota. The app pulls a list of places, and then has to separately fetch the pictures for each place. I have attempted to do this within a forEach function and within a for-in loop. I have tried everything I can think of or find research on to make this work (from using setTimeout in various situations, to creating promises with timeouts and incorporating them in many different ways), but I have been unable to find a solution that fits my particular async/await fetch situation.
To be clear: the application is operational and my "else" statement is kicking in, but it only kicks in because I am exceeding the per-second quota. So the code is there and working; I just want to be able to show the photos instead of the generic icons. I can get the photos to work if I wait long enough, as if the server forgets for a second. My total daily quota is well above anything I could ever reach in a dev environment, so the per-second limit has to be what is getting me in trouble!
If anyone can help, I would appreciate it greatly!
const renderVenues = (venues) => {
    for (let i = 0; i < $venueDivs.length; i++) {
        const $venue = $venueDivs[i];
        const venue = venues[i];
        let newUrl = `https://api.foursquare.com/v2/venues/${venue.id}/photos?client_id=${clientId}&client_secret=${clientSecret}&v=20190202`;
        const getPics = async () => {
            try {
                const picResp = await fetch(newUrl);
                if (picResp.ok) {
                    const picJson = await picResp.json();
                    const photo = picJson.response.photos.items[0];
                    const venueImgSrc = `${photo.prefix}300x300${photo.suffix}`;
                    let venueContent = `<h2>${venue.name}</h2><h4 style='padding-top:15px'>${venue.categories[0].name}</h4>
                        <img class="venueimage" src="${venueImgSrc}"/>
                        <h3 style='padding-top:5px'>Address:</h3>
                        <p>${venue.location.address}</p>
                        <p>${venue.location.city}, ${venue.location.state}</p>
                        <p>${venue.location.country}</p>`;
                    $venue.append(venueContent);
                } else {
                    const venueIcon = venue.categories[0].icon;
                    const venueImgSrc = `${venueIcon.prefix}bg_64${venueIcon.suffix}`;
                    let venueContent = `<h2>${venue.name}</h2><h4 style='padding-top:15px'>${venue.categories[0].name}</h4>
                        <img class="venueimage" src="${venueImgSrc}"/>
                        <h3 style='padding-top:5px'>Address:</h3>
                        <p>${venue.location.address}</p>
                        <p>${venue.location.city}, ${venue.location.state}</p>
                        <p>${venue.location.country}</p>`;
                    $venue.append(venueContent);
                }
            } catch (error) {
                console.log(error)
                alert(error)
            }
        }
        getPics();
    }
    $destination.append(`<h2>${venues[0].location.city}, ${venues[0].location.state}</h2>`);
}
//and then below, I execute the promise(s) that this is included with.
getVenues().then(venues =>
    renderVenues(venues)
)
On each iteration, you can await a Promise that resolves after 0.6 seconds:
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

const renderVenues = async (venues) => {
    for (let i = 0; i < $venueDivs.length; i++) {
        // ...
        await getPics();
        // no need for a trailing delay after all requests are complete:
        if (i !== $venueDivs.length - 1) {
            await delay(600);
        }
    }
    $destination.append(...)
};
If you find yourself doing a lot of throttling like this in your application, the module https://github.com/SGrondin/bottleneck provides a nice interface for expressing these limits.
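For reference, a minimal sketch of the same rate limit expressed with bottleneck, assuming its maxConcurrent/minTime options and reusing the getPics idea from the question:

const Bottleneck = require("bottleneck");

// At most one request in flight, with at least 600 ms between starts,
// which stays under a 2-calls-per-second quota.
const limiter = new Bottleneck({ maxConcurrent: 1, minTime: 600 });

// Each call is queued and spaced out automatically.
const getPicsThrottled = () => limiter.schedule(() => getPics());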
I've run into an unexplainable error condition when I deploy the code below to my production site. Locally, all of these functions were tested and working.
I've tried gaining insight into the promises where it seems to fail, but they don't trip any exceptions. usersApiGateway.getNonce is a GET request, and I verified that I get the correct return value when I send the request in Postman.
My AWS logs show that it executes up until the point designated below and then just stops. After maybe 10 minutes it quits attempting to call the process via asyncTimeout, and it is then permanently stuck. Any help on how I can find the root cause of this is appreciated.
The offending function:
async createBuyOrder(tokenAddress, amountSell, amountBuy) {
    amountSell = new BigNumber(amountSell);
    amountBuy = new BigNumber(amountBuy);
    let nonce = '';
    [amountBuy, amountSell, nonce] = await Promise.all([
        web3Service.convertToWei(tokenAddress, amountBuy),
        web3Service.convertToWei(ETHER_ADDRESS, amountSell),
        usersApiGateway.getNonce(this.adminAccount)
    ]);
    // FUNCTION FAILS TO RETURN ANYTHING AFTER THIS POINT
    // Console.log for nonce, amount buy/sell/buy order below fail to show up
    // that is what I mean by not working
    const buyOrder = {
        "addressBuy" : tokenAddress,
        "amountBuy"  : amountBuy,
        "addressSell": ETHER_ADDRESS,
        "amountSell" : amountSell,
        "nonce"      : nonce,
    }
    await this.createOrder(buyOrder);
}
This is called from this function:
async populateOrderBook(tokenAddress, numOrders = 1) {
    for (let i = numOrders; i > 0; i--) {
        for (let j = -1; j < numOrders - i; j++) {
            try {
                await this.createBuyOrder(tokenAddress, BUYORDER_amountSell, BUYORDER_amountBuy);
                await this.createSellOrder(tokenAddress, SELLORDER_amountBuy, SELLORDER_amountSell);
            } catch (exc) {
                console.log(exc)
            }
        }
    }
}
Which is called periodically in an init() function of the class
asyncTimeout(async () => {
    try {
        await Promise.all([
            this.populateOrderBook(tokenAddress, 3)
        ]);
    } catch (exc) {
        console.error('Error while populating order book from Kyber');
        console.error(exc);
    }
}, 60000);
I've tested the seemingly offending web3Service function that it appears to hang on, and it works just fine locally:
async convertToWei(tokenAddress, amount) {
    const numberOfDecimals = await this.tokenDecimals(tokenAddress);
    return new BigNumber(toBaseUnit(amount, numberOfDecimals));
}
It turns out my node connection to the Ethereum blockchain was not functioning. I used that connection to determine how many decimal places to use for my convertToWei call, and since the connection was down it just got stuck waiting on a call that could never resolve.
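One way to keep a hang like this from silently stalling the whole job is to race the slow call against a timeout so the failure surfaces as an error instead; a minimal sketch (the withTimeout wrapper and the 10-second limit are only illustrative):

// Wrap any promise so it rejects after `ms` milliseconds instead of hanging forever.
const withTimeout = (promise, ms, label = 'operation') =>
    Promise.race([
        promise,
        new Promise((_, reject) =>
            setTimeout(() => reject(new Error(`${label} timed out after ${ms} ms`)), ms))
    ]);

// Hypothetical usage: the catch in populateOrderBook would then log a timeout
// error instead of the process stalling without a trace.
// const nonce = await withTimeout(usersApiGateway.getNonce(this.adminAccount), 10000, 'getNonce');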