Make a ping in react native - javascript

I want to ping a page from my React Native app, to test whether a URL is reachable or not.
I tried to use ping-litle, but I get an error message and I can't find a solution:
UnableToResolveError: Unable to resolve module react-native-ping-litle from /Users/tetar/Desktop/myombox_react/MY_OM_BOX/index.ios.js: Module does not exist in the module map or in these directories:
/Users/tetar/Desktop/myombox_react/MY_OM_BOX/node_modules
, /Users/tetar/node_modules

Actually, I did this:
const request = new XMLHttpRequest(); // create the request object first
request.onreadystatechange = (e) => {
    if (request.readyState !== 4) {
        return;
    }
    if (request.status === 200) {
        console.log('success'); // just print a success on the console
    } else {
        console.log('error'); // just print an error on the console
    }
};
request.open('GET', 'http://192.168.0.254/'); // put your address here
request.send();
I don't know if it's the best way but it's working for me
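For completeness, a fetch-based sketch works in React Native as well, since fetch ships with it; the address is the same placeholder as above:
fetch('http://192.168.0.254/')
    .then((response) => {
        if (response.ok) {
            console.log('success'); // reachable and returned a 2xx status
        } else {
            console.log('error'); // reachable, but returned an error status
        }
    })
    .catch(() => console.log('error')); // network failure: the URL is not reachable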

It might not be a good idea, but you can set a timeout for your fetch requests.
function checkStatus(url, timeout, callback) {
    return new Promise(function (resolve, reject) {
        // If the timer fires before fetch settles, call the callback and reject
        const timer = setTimeout(() => {
            callback();
            reject(new Error('Timeout'));
        }, timeout);
        fetch(url)
            .then((data) => { clearTimeout(timer); resolve(data); })
            .catch((e) => { clearTimeout(timer); reject(e); });
    });
}
So you pass it a timeout value (10 or 15 seconds, expressed in milliseconds since that is what setTimeout expects), a URL, and a callback.
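A minimal usage sketch, assuming the timeout is passed in milliseconds and reusing the placeholder address from the first answer:
checkStatus('http://192.168.0.254/', 10000, () => console.log('timed out'))
    .then((response) => console.log('reachable, status', response.status))
    .catch((err) => console.log('unreachable:', err.message));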
Hope it helps

Is it bad practice to use a global XMLHttpRequest?

I am working on the example from this page, trying to learn some javascript: https://developer.mozilla.org/en-US/docs/Web/Guide/AJAX/Getting_Started
And I have a small web server and a status page that I made, which I want to use to check the status of the web server. The way I have it built, I want to check the status of the web server on load rather than with a button.
HTML
Just using <body onload="pingSite();"> to try to automatically run the check when the page loads.
Javascript
<script>
let xhr = new XMLHttpRequest(); // <--- The question and the problem.

function pingSite() {
    var xhr = new XMLHttpRequest();
    response = 'xyz';
    // If an XMLHTTP instance cannot be created
    if (!xhr) {
        response = 'Internal error: Cannot create XMLHTTP instance.'
        return response;
    }
    xhr.onreadystatechange = HandlePing;
    xhr.open('GET', 'mysite.com');
    xhr.send();
    return response;
}

function HandlePing() {
    // If the request has finished
    if (xhr.readyState === XMLHttpRequest.DONE) {
        try {
            // If the status is 200 (i.e. we have received a response indicating the server is up)
            if (xhr.status === 200) {
                alert("Server is up");
            } else {
                alert("There was a problem with the request");
            }
        // If the server is down.
        } catch (error) {
            alert(`Caught Exception: ${error.description}`);
        }
    } else {
        response = 'Pinging...';
    }
}
</script>
The problem I have is twofold:
1.) The only way I can get this to work is by creating a global variable in my script, above both of the functions. I have a gut feeling this is really dangerous and bad practice. Is it, and if so, what is a better way to approach this problem?
2.) The way I have it set up seems to work, but it doesn't return any indication that it worked at all. There is no alert. There is no response. The console is empty. Am I missing something? Do I need an event handler in the HTML despite the fact I am doing it onload?
You could dodge the global variable by creating an anonymous inline function that passes xhr as an argument:
xhr.onreadystatechange = () => HandlePing(xhr);
Or with bind:
// same as arrow function above but harder to read
xhr.onreadystatechange = HandlePing.bind(null, xhr);
Or, assuming you don't need it anywhere else, you could move the HandlePing function declaration into the pingSite function:
function pingSite() {
    function handlePing() {
        if (xhr.readyState === XMLHttpRequest.DONE) {
            // ...
        }
    }
    const xhr = new XMLHttpRequest();
    // ...other stuff...
    xhr.onreadystatechange = handlePing;
    // ...
}
A few additional thoughts:
You should use fetch instead of XMLHttpRequest.
You should use addEventListener instead of attaching an onload attribute to the body.
fetch uses Promises, which make it easier to manage asynchronous behavior.
The skeletal implementation might look something like this:
window.addEventListener('load', handleLoadEvent);

function handleLoadEvent(e) {
    fetch('http://example.com')
        .then(response => {
            // do stuff with the response
        })
        .catch(error => {
            // deal with errors
        });
}
And if you didn't want to pollute the global namespace with the handleLoadEvent function, you could wrap this all in an IIFE:
(function () {
    window.addEventListener('load', handleLoadEvent);

    function handleLoadEvent(e) {
        fetch('http://example.com')
            .then(response => {
                // do stuff with the response
            })
            .catch(error => {
                // deal with errors
            });
    }
})()
Or if you prefer async/await you could write the handleLoadEvent function that way:
async function handleLoadEvent(e) {
    try {
        const response = await fetch('http://example.com');
        // do stuff with response
    } catch (e) {
        // deal with error
    }
}

Javascript Fetch API status returning undefined

With the deprecation of XMLHttpRequest, I have been trying to rewrite a JavaScript function that checks if a URL exists by using Fetch. My console.log shows the correct value, but my return statement is always undefined. What am I doing wrong?
function urlExists(url) {
    var request = new Request(url);
    fetch(request).then(function(response) {
        console.log(response.status);
        return response.status != 404;
    });
}
EDIT: I jumped the gun on a bug based on this error message in console [Deprecation] Synchronous XMLHttpRequest on the main thread is deprecated because of its detrimental effects to the end user's experience. My bug was actually elsewhere in my code. Sorry for the confusion!
The problem is that you're trying to treat an asynchronous operation as if it were synchronous. Each return statement in your code only returns from its own enclosing function, so the value never reaches the caller of urlExists. You need a callback to hand back the response from fetch.
function urlExists(url, callback) {
    var request = new Request(url);
    fetch(request).then(function(response) {
        console.log(response.status);
        callback(response.status != 404);
    });
}

/* Usage */
urlExists('http://example.com', (isExist) => {
    if (isExist) {
        console.log('URL found');
    } else {
        console.log('URL not found');
    }
})
The code you have provided does not handle the error case; that might be the reason it was not working.
function urlExists(url) {
    var request = new Request(url);
    return fetch(request).then(function(response) {
        return (response.status != 404);
    }, function(error) {
        return false;
    });
}
urlExists("https://jsonplaceholder.typicode.com/todos/1").then(result => { console.log(result); });
urlExists("https://google.com").then(result => { console.log(result); });
I have tested this and it works fine.
Convert it into this
function urlExists(url) {
    return new Promise((resolve, reject) => {
        var request = new Request(url);
        fetch(request).then(function(response) {
            resolve(response.status != 404);
        });
    });
}
To use it:
urlExists("https://www.google.com").then(result => {
    console.log(result);
});
Your function returns a promise, so make sure you use await or .then(...) in the calling code to access the result correctly.
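A minimal sketch of both styles, assuming the promise-returning urlExists above (the URL is only illustrative):
urlExists("https://www.google.com").then(exists => {
    console.log(exists ? 'URL found' : 'URL not found');
});

// or, inside an async function
async function check() {
    const exists = await urlExists("https://www.google.com");
    console.log(exists ? 'URL found' : 'URL not found');
}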

I need to read a text file from JavaScript

I'm writing a webpage with some JavaScript to read text-format data files from the server on user request. Once a text file has been loaded, I need to manipulate the data somewhat.
I have been using XMLHttpRequest for the loading; however, now I see that synchronous requests are "deprecated". I can't start manipulating the data before it's loaded, so what can I do in this case?
Use an asynchronous request (or fetch, see below, which is also asynchronous):
function doGET(path, callback) {
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 4) {
            // The request is done; did it work?
            if (xhr.status == 200) {
                // ***Yes, use `xhr.responseText` here***
                callback(xhr.responseText);
            } else {
                // ***No, tell the callback the call failed***
                callback(null);
            }
        }
    };
    xhr.open("GET", path);
    xhr.send();
}

function handleFileData(fileData) {
    if (!fileData) {
        // Show error
        return;
    }
    // Use the file data
}

// Do the request
doGET("/path/to/file", handleFileData);
Or using promises, which are the more modern way to handle callbacks (but keep reading):
function doGET(path) {
    return new Promise(function(resolve, reject) {
        var xhr = new XMLHttpRequest();
        xhr.onreadystatechange = function() {
            if (xhr.readyState == 4) {
                // The request is done; did it work?
                if (xhr.status == 200) {
                    // Yes, use `xhr.responseText` to resolve the promise
                    resolve(xhr.responseText);
                } else {
                    // No, reject the promise
                    reject(xhr);
                }
            }
        };
        xhr.open("GET", path);
        xhr.send();
    });
}

// Do the request
doGET("/path/to/file")
    .then(function(fileData) {
        // Use the file data
    })
    .catch(function(xhr) {
        // The call failed, look at `xhr` for details
    });
Here in 2019, there's no reason to use XHR wrapped in a promise like that, just use fetch:
function doGET(url) {
    return fetch(url).then(response => {
        if (!response.ok) {
            throw new Error("HTTP error " + response.status); // Rejects the promise
        }
        return response.text(); // Resolves the promise with the file's text
    });
}
Since you want to handle a local file, try this. Make use of XMLHttpRequest:
function readFile(file)
{
    var f = new XMLHttpRequest();
    f.open("GET", file, false);
    f.onreadystatechange = function ()
    {
        if (f.readyState === 4)
        {
            if (f.status === 200 || f.status == 0)
            {
                var res = f.responseText;
                alert(res);
            }
        }
    }
    f.send(null);
}
Then you have to call it with a File:\\ path:
readFile('File:\\\yourpath');

How do I re-invoke the promise after the first call?

How can I re-invoke the promise after the first call?
I have this issue where .then is executed only once, after the first click; you won't get this console.log("Success!", response); executed on any click after that. But I need it to be reusable. Is that possible?
usage:
$( document ).ready(function() {
    get('http://api.icndb.com/jokes/random').then(function(response) {
        console.log("Success!", response);
    }, function(error) {
        console.error("Failed!", error);
    });
});
promise function:
function get(url) {
    // Return a new promise.
    return new Promise(function(resolve, reject) {
        $(".promise").click(function() {
            // do lots of other stuff here...
            // Do the usual XHR stuff
            var req = new XMLHttpRequest();
            req.open('GET', url);
            req.onload = function() {
                // This is called even on 404 etc
                // so check the status
                if (req.status == 200) {
                    // Resolve the promise with the response text
                    resolve(req.response);
                } else {
                    // Otherwise reject with the status text
                    // which will hopefully be a meaningful error
                    reject(Error(req.statusText));
                }
            };
            // Handle network errors
            req.onerror = function() {
                reject(Error("Network Error"));
            };
            // Make the request
            req.send();
        });
    });
}
There's nothing wrong with writing your own promisified get() function, which is exactly what jQuery's $.ajax() or Angular's $http (and others) give you.
All you need to do is rearrange your code slightly so that:
get() is a general purpose utility, not tied to a particular event
get() is called from event handler(s) as required.
$(function() {
    function get(url) {
        return new Promise(function(resolve, reject) {
            var req = new XMLHttpRequest();
            req.open('GET', url);
            req.onload = function() {
                if (req.status == 200) {
                    resolve(req.response);
                } else {
                    reject(Error(req.statusText));
                }
            };
            req.onerror = function() {
                reject(Error("Network Error"));
            };
            req.send();
        });
    }

    $(".promise").click(function() {
        // do lots of other stuff here...
        get('http://api.icndb.com/jokes/random').then(function(response) {
            console.log("Success!", response);
        }, function(error) {
            console.error("Failed!", error);
        });
    });
});
All I've done here is move your lines of code into a different order.
As I explained in my comment, a promise can only be used once. Once it is resolved or rejected, its state is set forever and it will never call the existing .then() handlers again. So you can't use a promise for something that you want called each time an event occurs. You're probably back to callbacks for that, like this, which seems perfectly appropriate for this situation:
$( document ).ready(function() {
    get('http://api.icndb.com/jokes/random', function(response) {
        console.log("Success!", response);
    }, function(error) {
        console.error("Failed!", error);
    });
});

function get(url, success, fail) {
    $(".promise").click(function() {
        // do lots of other stuff here...
        // Do the usual XHR stuff
        var req = new XMLHttpRequest();
        req.open('GET', url);
        req.onload = function() {
            // This is called even on 404 etc
            // so check the status
            if (req.status == 200) {
                // Call the success callback with the response text
                success(req.response);
            } else {
                // Otherwise call the fail callback with the status text
                // which will hopefully be a meaningful error
                fail(Error(req.statusText));
            }
        };
        // Handle network errors
        req.onerror = function() {
            fail(Error("Network Error"));
        };
        // Make the request
        req.send();
    });
}

How to force a program to wait until an HTTP request is finished in JavaScript?

Is there a way in JavaScript to send an HTTP request to an HTTP server and wait until the server responds with a reply? I want my program to wait until the server replies and not to execute any other command that is after this request. If the HTTP server is down I want the HTTP request to be repeated after a timeout until the server replies, and then the execution of the program can continue normally.
Any ideas?
Thank you in advance,
Thanasis
EDIT: Synchronous requests are now deprecated; you should always handle HTTP requests in an async way.
There is a 3rd parameter to XmlHttpRequest's open(), which indicates whether you want the request to be asynchronous (and so handle the response through an onreadystatechange handler).
So if you want it to be synchronous (i.e. wait for the answer), just specify false for this 3rd argument.
You may also want to set a limited timeout property for your request in this case, as it would block the page until reception.
Here is an all-in-one sample function for both sync and async:
function httpRequest(address, reqType, asyncProc) {
    var req = window.XMLHttpRequest ? new XMLHttpRequest() : new ActiveXObject("Microsoft.XMLHTTP");
    if (asyncProc) {
        req.onreadystatechange = function() {
            if (this.readyState == 4) {
                asyncProc(this);
            }
        };
    }
    req.open(reqType, address, !(!asyncProc));
    req.send();
    return req;
}
which you could call this way:
var req = httpRequest("http://example.com/aPageToTestForExistence.html", "HEAD"); // In this example you don't want to GET the full page contents
alert(req.status == 200 ? "found!" : "failed"); // We didn't provide an async proc, so this will be executed only after the request completes
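On the timeout mentioned above: modern browsers only honor the timeout property on asynchronous requests (setting it on a synchronous request in a page context throws), so a sketch of a timed request would look like this, with the URL and values purely illustrative:
var req = new XMLHttpRequest();
req.open("HEAD", "http://example.com/aPageToTestForExistence.html", true); // true = asynchronous
req.timeout = 5000; // milliseconds
req.ontimeout = function () {
    alert("request timed out");
};
req.onload = function () {
    alert(req.status == 200 ? "found!" : "failed");
};
req.send();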
You can perform a synchronous request. jQuery example:
$(function() {
    $.ajax({
        async: false,
        // other parameters
    });
});
You should take a look at jQuery's AJAX API. I highly recommend using a framework like jQuery for this stuff. Manually doing cross-browser ajax is a real pain!
You can use the XMLHttpRequest object to send your request. Once the request is sent, you can check the readyState property to identify the current state. readyState can have the following states:
Uninitialized - Has not started loading yet
Loading - Is loading
Interactive - Has loaded enough and the user can interact with it
Complete - Fully loaded
For example:
xmlhttp.open("GET", "somepage.xml", true);
xmlhttp.onreadystatechange = checkData;
xmlhttp.send(null);

function checkData()
{
    alert(xmlhttp.readyState);
}
Hope this will help.
For modern browsers, I would use fetch instead of XMLHttpRequest.
async function job() {
    const response = await fetch("https://api.ipify.org?format=json", {}) // type: Promise<Response>
    if (!response.ok) {
        throw Error(response.statusText)
    }
    return response.text()
}

async function onCommit() {
    const result = await job()
    // The following will run after the `job` is finished.
    console.log(result)
}
For the full fetch syntax, see MDN. A longer example:
<button onclick="onCommit()">Commit</button>
<script>
    function onCommit() {
        new Promise((resolve, reject) => {
            resolve(job1())
        }).then(job1Result => {
            return job2(job1Result)
        }).then(job2Result => {
            return job3(job2Result)
        }).catch(err => { // If job1, job2, or job3 throws an error, we will catch it here.
            alert(err)
        })
    }

    async function testFunc(url, options) {
        // options: https://developer.mozilla.org/en-US/docs/Web/API/WindowOrWorkerGlobalScope/fetch
        const response = await fetch(url, options) // type: Promise<Response>
        if (!response.ok) {
            const errMsg = await response.text()
            throw Error(`${response.statusText} (${response.status}) | ${errMsg} `)
        }
        return response
    }

    async function job1() {
        console.log("job1")
        const response = await testFunc("https://api.ipify.org?format=json", {})
        return await response.json()
    }

    async function job2(job1Data) {
        console.log("job2")
        console.log(job1Data)
        const textHeaders = new Headers()
        textHeaders.append('Content-Type', 'text/plain; charset=utf-8')
        const options = {"headers": textHeaders}
        const response = await testFunc("https://api.ipify.org/?format=text", options)
        // throw Error(`test error`) // You can uncomment this line to trigger the error.
        return await response.text()
    }

    function job3(job2Data) {
        console.log("job3")
        console.log(job2Data)
    }
</script>
For this you can start a loader in JavaScript as soon as the page starts loading, and then close it when the request finishes or your DOM is ready.
What I am trying to say is: as the page load starts, start a loader. Then the page can make multiple synchronous requests using ajax; until you get a response, do not close the loader.
After receiving the desired response in the final call, you can close the loader.
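A rough sketch of that loader pattern, using asynchronous fetch calls rather than the synchronous ajax mentioned above; the element id and URLs are hypothetical:
const loader = document.getElementById('loader'); // hypothetical loader element
loader.style.display = 'block'; // show the loader as the page starts loading

Promise.all([fetch('/data/first'), fetch('/data/second')]) // whatever requests you need
    .then((responses) => {
        // handle the responses here
    })
    .finally(() => {
        loader.style.display = 'none'; // close the loader once the final response has arrived
    });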
I have a similar situation in a game built with Three.js and Google Closure. I have to load 2 resources, and Three and Closure do not allow me to make these synchronous.
Initially I naively wrote the following:
main() {
    ...
    var loaded = 0;
    ...
    // Load Three geometry
    var loader = new THREE.JSONLoader();
    loader.load("x/data.three.json", function(geometry) {
        ...
        loaded++;
    });
    // Load my engine data
    goog.net.XhrIo.send("x/data.engine.json", function(e) {
        var obj = e.target.getResponseJson();
        ...
        loaded++;
    });
    // Wait for callbacks to complete
    while (loaded < 2) {}
    // Initiate an animation loop
    ...
};
The loop that waits for the callbacks to complete never ends; from the point of view of the loop, loaded never gets incremented. The problem is that the callbacks are not fired until main returns (at least on Chrome, anyway).
One solution might be to have each callback check whether it is the last to complete, and then go on to initiate the animation loop.
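A sketch of that idea, with startAnimationLoop standing in for whatever code actually kicks off the render loop:
var loaded = 0;
function resourceLoaded() {
    loaded++;
    if (loaded === 2) {
        startAnimationLoop(); // hypothetical: initiate the animation loop once both resources are in
    }
}

// Load Three geometry
var loader = new THREE.JSONLoader();
loader.load("x/data.three.json", function(geometry) {
    // ...store the geometry...
    resourceLoaded();
});
// Load my engine data
goog.net.XhrIo.send("x/data.engine.json", function(e) {
    var obj = e.target.getResponseJson();
    // ...store the engine data...
    resourceLoaded();
});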
Another solution - perhaps a more direct answer to what you are asking (how to wait for each load before initiating another) - would be to nest the callbacks as follows:
// Load Three geometry
var loader = new THREE.JSONLoader();
loader.load("x/data.three.json", function(geometry) {
    ...
    // Load my engine data
    goog.net.XhrIo.send("x/data.engine.json", function(e) {
        var obj = e.target.getResponseJson();
        ...
        // Initiate an animation loop
        ...
    });
});
};
This is an old question but wanted to provide a different take.
This is an async function that creates a promise that resolves with the Http object when the request is complete. This allows you to use the more modern async/await syntax when working with XMLHttpRequest.
async function sendRequest() {
    const Http = new XMLHttpRequest();
    const url = 'http://localhost:8000/';
    Http.open("GET", url);
    Http.send();
    if (Http.readyState === XMLHttpRequest.DONE) {
        return Http;
    }
    let res;
    const p = new Promise((r) => res = r);
    Http.onreadystatechange = () => {
        if (Http.readyState === XMLHttpRequest.DONE) {
            res(Http);
        }
    }
    return p;
}
Usage
const response = await sendRequest();
const status = response.status;
if (status === 0 || (status >= 200 && status < 400)) {
    // The request has been completed successfully
    console.log(response.responseText);
} else {
    // Oh no! There has been an error with the request!
    console.log(`Server Error: ${response.status}`)
}
For those using axios, you can wrap it in an async IIFE and then await it:
(async () => {
    let res = await axios.get('https://example.com');
    // do stuff with the response
})();
Note, I haven't done any error checking here.
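If you do want error handling, a minimal sketch is a try/catch around the await (the URL is still illustrative); with axios defaults, both network failures and non-2xx statuses land in the catch:
(async () => {
    try {
        const res = await axios.get('https://example.com');
        // do stuff with the response
    } catch (err) {
        console.error(err);
    }
})();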
