I am trying to send a POST HTTP request with data, but it was taking too long, so I followed the same example code from the Electron documentation and it turned out to be slow too. It takes around 40 seconds for the 'response' event to fire and return data.
Sample code from https://electronjs.org/docs/api/net
const { app } = require('electron')

app.on('ready', () => {
  const { net } = require('electron')
  const request = net.request('https://github.com')
  request.on('response', (response) => {
    console.log(`STATUS: ${response.statusCode}`)
    console.log(`HEADERS: ${JSON.stringify(response.headers)}`)
    response.on('data', (chunk) => {
      console.log(`BODY: ${chunk}`)
    })
    response.on('end', () => {
      console.log('No more data in response.')
    })
  })
  request.end()
})
I used the Node.js http module and it works fine, but I am wondering why Electron's net module takes that long to return results.
I've faced the same problem. In my case, my company uses a proxy, so my requests spent too much time (about 40 seconds, as in your case). My solution was to use Node HTTP requests instead (with axios, setting proxy: false).
Is it necessary to check if the axios.get() response status is 200?
useEffect(() => {
  const fetch = async () => {
    try {
      const response = await axios.get('/api');
      if (response.status === 200) {
        setState(response.data);
      }
    } catch (error) {
      console.error(error);
    }
  };
  fetch();
}, []);
or I can do this
useEffect(() => {
  const fetch = async () => {
    try {
      const { data } = await axios.get('/api');
      setState(data);
    } catch (error) {
      console.error(error);
    }
  };
  fetch();
}, []);
If it is, what is the best practice?
Normally you don't need to check. Technically, though, it depends on how your team interprets the HTTP protocol.
Even though it's discouraged, I've seen teams that totally disregard the standard semantics of HTTP status codes, blindly set the status code to 200 in almost all cases, and encode the success/failure state within the data payload, e.g. data.success == true or false. In that case, checking the status code means nothing.
Axios is just a library that facilitates message exchange over HTTP. The real workhorse is the HTTP protocol itself. You can even customize axios to determine which cases are deemed an "error", as shown in this question. Better to consult your backend colleagues to understand their server's response conventions and reach a consensus within your team.
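For illustration, axios exposes a validateStatus option in its request config that controls which status codes resolve rather than reject the promise; this is a minimal sketch (the 2xx-only rule shown here mirrors axios's default behavior):

```javascript
// Only statuses for which this returns true resolve the promise;
// everything else rejects with an error you can catch.
const validateStatus = (status) => status >= 200 && status < 300;

// Passed to axios via the request config, e.g.:
// axios.get('/api', { validateStatus });
```

If your backend always returns 200 and signals failure in the payload, a function like this buys you nothing, which is exactly why the team convention matters more than the check itself.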
Currently I'm trying to make two fetches when the website first loads. I'm getting my data via SFTP, and if I fetch each endpoint alone it works. But if I try to fetch both at the same time, my image is "forever" fetching in the network tab and never shows up. It seems that they block each other, but I don't understand this, because they work asynchronously?
This is my main Page, where the image should load:
useEffect(() => {
  fetch(`${URLS.apiUrl}/image/label`)
    .then(async (x) => {
      let resp = await x.json()
      if (resp !== undefined && resp.id === undefined && resp.data === undefined) {
        setNoImageAvailable(true)
        return
      }
      setImage(resp.data)
    })
}, [reRenderValue])
Now I made another component for a better overview, which is linked into my main page and uses its own useEffect (I already tried putting both fetches in one useEffect, but that does not work either).
useEffect(() => {
  fetch(`${URLS.apiUrl}/files/gaugeNames`)
    .then(async (v) => {
      try {
        let resp = await v.json()
        setFilenames(resp)
      } catch {
        console.log('Could not parse response')
      }
    })
}, [reload])
As I already said, I'm using SFTP to get my data from an FTP server; this is how I do it:
async getGaugeFiles(req, res, next) {
  // res.json([]);
  // Connect the SFTP client
  await SFTPClient.connect()
  // Get all gauge files from the host
  let fileList = (await SFTPClient.listFiles(environment.FTP_PATH_GAUGE))?.map(f => `gauge_${f.name}`)
  // Disconnect the SFTP client
  await SFTPClient.disconnect()
  return res.json(fileList)
}
I already checked the return value and it gives me the correct fileList, so that part works well.
Now comes my problem: I also get an error in my backend if I try to fetch both routes. It says "Error downloading file {FILENAME}: Error: _get: No response from server" and also "ERR_GENERIC_CLIENT".
This comes from my last function, where I'm loading my files:
async download(file, path) {
  try {
    return await this.client.get(`${path}/${file}`)
  } catch (err) {
    console.log(`Error downloading file ${file}:`, err)
    return null
  }
}
Fixed it with a timer. With a one-second timer I let the first fetch go through, and after that I can fetch the other things. Not the best solution, but currently the only one that works.
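The symptoms suggest both routes share one SFTP client, so the second request's connect/disconnect races with the first. Instead of a timer, the backend could serialize access to the client; here is a minimal sketch under that assumption (the serialize helper and the commented route handlers are hypothetical names, not part of the code above):

```javascript
// Chain every SFTP task onto the previous one so only one runs at a time.
let queue = Promise.resolve();

function serialize(task) {
  const result = queue.then(() => task());
  // Keep the chain alive even if a task fails.
  queue = result.catch(() => {});
  return result;
}

// Usage: both routes wrap their SFTP work, so connect/disconnect never overlap:
// serialize(() => handleImageRequest());
// serialize(() => handleGaugeFilesRequest());
```

Unlike a fixed timer, this waits exactly as long as the first task actually takes, and it still works if a third route is added later.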
I am trying to download the distance between 2 locations from the TomTom API.
Protractor will not let me use:
* fetch - "fetch is not defined - please use import"
* import - "Cannot use import statement outside of module"
* adding "type": "module" to package.json - Protractor stops working, as not all of the code is an ES module
* browser.get - opens the page with the JSON data, but I cannot extract it.
Is there any other way? I tried importing the JSON into a different file and exporting response.data, but the module error stops me from doing that as well.
Protractor is for testing Angular web pages, but you can have the browser execute arbitrary JavaScript. To use fetch, you need to go through window:
function getTomTomData() {
  // Replace this with your TomTom API call, and transform the response
  return window.fetch(TOM_TOM_URL);
}

browser.executeScript(getTomTomData).then(res => {
  // Do something with the response
});
I did not manage to run node-fetch in my script, as Protractor kept rejecting the import. I managed to sort it out with require('https'):
const https = require('https');

let measureDistance = function(pickup, dropoff) {
  let url = `https://api.tomtom.com/routing/1/calculateRoute/${pickup[0]}%2C${pickup[1]}%3A${dropoff[0]}%2C${dropoff[1]}/json?routeType=shortest&avoid=unpavedRoads&key=uwbU08nKLNQTyNrOrrQs5SsRXtdm4CXM`;
  // https.get does not return a promise, so wrap it to make the result awaitable
  return new Promise((resolve, reject) => {
    https.get(url, res => {
      let body = '';
      res.on('data', chunk => {
        body += chunk;
      });
      res.on('end', () => {
        try {
          let json = JSON.parse(body);
          resolve(json.routes[0].summary.lengthInMeters);
        } catch (error) {
          reject(error);
        }
      });
    }).on('error', error => {
      reject(error);
    });
  });
};
Also, I used to put require at the top of the file, as in Ruby, which seemed to be another source of issues.
This is the code I have written using Express and Node.js:
const express = require("express");
const https = require("https");
const app = express();

app.get("/", function(req, res) {
  // URL with my API key
  const url = "https://api.spoonacular.com/recipes/random?apiKey=...&number=1";
  https.get(url, function(response) {
    response.on("data", function (data) {
      console.log(JSON.parse(data));
      // const theRecipe = JSON.parse(data);
      console.log(data);
    });
  });
  res.send("The server is up and running");
});

app.listen(3000, function () {
  console.log("Server started at port 3000");
});
When I refresh my webpage on localhost, I get the following error on the console:

SyntaxError: Unexpected end of JSON input
    at JSON.parse ()
    at IncomingMessage. (C:\Users\ArunBohra\Desktop\FoodRecipes\app.js:12:33)
Can anyone find the problem with my code?
The data event fires when a chunk of data from the response has arrived. You are trying to parse the first chunk as if it were the complete JSON text.
You need to collect the pieces from each data event but wait until the end event fires before joining them together into the complete string of JSON that you can parse.
There is an example of fetching and parsing JSON in the documentation.
You might want to look at modules like axios and node-fetch, which take care of that (and the JSON parsing) for you while providing modern Promise-based APIs.
If you have a new enough version of Node, you can use the native Fetch API.
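For instance, a minimal sketch of the native-fetch route (fetch is available globally since Node 18; the URL keeps the question's elided API key placeholder as-is):

```javascript
// Node 18+ exposes fetch globally, so no extra package is required.
const url = "https://api.spoonacular.com/recipes/random?apiKey=...&number=1";

async function getRecipe() {
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  // response.json() buffers the whole body and parses it in one step,
  // so the chunk-by-chunk problem never arises.
  return response.json();
}
```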
If you use a package like node-fetch, you can get the whole thing in one go, instead of what you have now, which receives the data in chunks:
const fetch = require('node-fetch');

const url = "https://api.spoonacular.com/recipes/random?apiKey=...&number=1";

fetch(url)
  .then(response => response.json())
  .then(data => console.log(data));
In addition to the other answers, you can do it without another package:
https.get(url, function (response) {
  let result = "";
  response.on("data", function (chunk) {
    result += chunk;
  });
  response.on("end", () => {
    // Now you have the combined result and can safely JSON.parse it
    console.log(result);
  });
});
I've read many topics here, but nothing similar was helpful. My problem is that I often get an ECONNREFUSED error while using axios.all with GET requests in Node.js (in about 50% of requests). At the same time, curl works great.
This is my JS code:
const axios = require('axios');

async function makeGetRequest() {
  let config = {
    headers: { "User-Agent": 'curl/7.64.1' }
  };
  try {
    const [BYN, RUR] = await axios.all([
      axios.get('https://www.nbrb.by/api/exrates/rates/145', config),
      axios.get('https://www.nbrb.by/api/exrates/rates/298', config),
    ]);
    return [BYN.data, RUR.data];
  } catch (error) {
    return error;
  }
}

makeGetRequest().then((value) => {
  console.log("VAL: ", value)
})
As you can see, I tried to manipulate the headers in order to imitate curl's, but this doesn't work.
The command:
curl https://www.nbrb.by/api/exrates/rates/145
works fine. But I need a server-side (SSR) response for my Gatsby site.
The error:
ECONNREFUSED
is a connection-level (TCP) error which means the server didn't accept your connection.
It's probably not a problem with your code at all; you just need to make sure the rate limits and allowed user agents of the web server you're trying to reach permit your calls to that API.
As selfagency commented, you may be hitting the maximum rate of requests per second, and IMO that is likely your case, because you are dispatching 2 concurrent requests.
So, try the following:
const axios = require("axios");

async function makeGetRequest() {
  let config = {
    headers: { "User-Agent": "curl/7.64.1" }
  };
  try {
    const BYN = await axios.get(
      "https://www.nbrb.by/api/exrates/rates/145",
      config
    );
    const RUR = await axios.get(
      "https://www.nbrb.by/api/exrates/rates/298",
      config
    );
    return [BYN.data, RUR.data];
  } catch (error) {
    return error;
  }
}

makeGetRequest().then(value => {
  console.log("VAL: ", value);
});
This way you will be doing only 1 request at a time, so you may stay under the rate limit. BTW, this is very common for free-tier APIs that want you to pay to raise your limits.
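If the refusals persist even with sequential requests, a small retry helper with exponential backoff is a common complement; this is a hedged sketch (withRetry is a hypothetical helper, not part of axios):

```javascript
// Retry a failing async operation with exponential backoff between attempts.
async function withRetry(fn, retries = 3, baseDelayMs = 500) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= retries) throw err; // give up after the last retry
      // Wait baseDelayMs, then 2x, 4x, ... before the next attempt.
      await new Promise(resolve => setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
}

// Usage, e.g.:
// const BYN = await withRetry(() => axios.get('https://www.nbrb.by/api/exrates/rates/145', config));
```

The growing delay gives a rate-limiting server time to recover instead of hammering it with immediate retries.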