I'm working on a single-page application, and I am currently building a jQuery AJAX wrapper function that all of my calls go through.
A typical page might make three AJAX calls. My idea is that if a user's internet goes out, I hold these AJAX calls in an array, then run a timer that keeps checking whether the user has internet. Once they're back online, I run the calls. So no calls run while the user is offline (except the connectivity check), and once they're back online the app does what they wanted. So in my beforeSend I have this:
beforeSend: function ($xhr) {
    self.ajaxPool.push($xhr);
    if (!self.validateConnection()) {
        console.log(self.ajaxPool);
    }
}
So my question is: when the connection comes back and I loop through my array, what function can I call on each $xhr object to say, "Hey, do what you were supposed to now"? For example, $xhr.complete()? I've logged the pool while the connection is down to look at the objects, but none of the functions they expose look like they'd do the trick.
I would ditch the beforeSend entirely, since you're using a pool anyway, and do something like this...
// where you want to make your initial request somewhere in code,
// do this INSTEAD of calling $.ajax(options)
ajaxPool.push(options);

// an interval to handle the queue
setInterval(function () {
    // nothing in the pool, so do nothing
    if (ajaxPool.length < 1) return;
    // not online, so don't try to send
    if (!validateConnection()) return;
    // we're online, so make the ajax request
    doAjax(ajaxPool.shift());
}, 200);

function doAjax(options) {
    // add callbacks, since you seem to be using a single callback
    options.complete = function (data) { ... };
    // make the call
    $.ajax(options);
}
You could make this more object-oriented, and you could pause the interval when you know it's not being used, but I think this gets the basic concept across.
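That "pause the interval" idea can be sketched like this. It's a sketch, not drop-in code: `validateConnection` and the sender are injected as parameters (hypothetical names) so the queue stays testable, and the interval only exists while the pool actually has work to do:

```javascript
// A pausable request queue: the interval only runs while the pool
// has entries, and stops itself once the pool is drained.
function createAjaxQueue(validateConnection, sendFn) {
    var pool = [];
    var timer = null;

    function tick() {
        if (pool.length === 0) {
            // nothing left: pause the interval until the next push()
            clearInterval(timer);
            timer = null;
            return;
        }
        if (!validateConnection()) return; // offline: retry next tick
        sendFn(pool.shift());              // online: send oldest request
    }

    return {
        push: function (options) {
            pool.push(options);
            if (timer === null) timer = setInterval(tick, 200); // resume
        },
        size: function () { return pool.length; }
    };
}
```

In the browser, `sendFn` would just be `$.ajax` and `validateConnection` your existing check.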
I have been using Offline JS for my RIA applications. It's really reliable for your offline scenarios:
Monitors ajax requests looking for failure
Confirms the connection status by requesting an image or fake resource
Automatically grabs ajax requests made while the connection is down and remakes them after the connection is restored.
https://github.com/HubSpot/offline
I'm building an Angular app where about a thousand people will connect simultaneously to book a ticket. I want only "XYZ" of them at a time to access the registration Angular component. The others will see a "waiting room" component until it's their turn.
I set up the whole thing like this:
The user enters the page.
I make an HTTP call to the Express server.
The server checks whether the "connections" collection contains fewer than XYZ docs.
If true, it unlocks the registration component and, with an HTTP POST request, creates a new doc in the DB. If false, it leaves it hidden and shows the waiting-room component.
When the user leaves the page, their doc in the "connections" collection gets destroyed with an HTTP DELETE call.
Fully working.
The problem: I now want a kind of "priority" system, because as it stands, if you just refresh you may get lucky and gain access even if you only just arrived, while someone else has been waiting since the 1990s. So I introduced a "priority" system: when the user makes the first HTTP call, if the user is not allowed in, the server creates a timestamp and pushes it into an array.
const timestamps = [];
// ...
// this below is in the http get req
Connessione.countDocuments({}, (err, count) => {
    if (count <= nmax) {
        console.log("Ok");
        res.status(200).json({ allowed: true });
    } else {
        const timestamp = req.params.timestamp;
        timestamps.push(timestamp);
        console.log("Semo troppi");
        res.status(401).json({ allowed: false });
    }
});
The idea is to listen for DB changes and, when there are only XYZ-1 docs in the collection, notify the Angular frontend belonging to the first timestamp: "Hey there, we're done. You can go" and unlock the registration component for them.
The problem is that I can't keep making HTTP requests from Angular every second until there's a free place...
Is there any method to send a request to the server and, when the server says OK, have it call Angular and say "Hey dude. You can go!"?
Hope you understood my question. If not, ask me in the comments.
Thanks in advance
Even I had trouble with sockets in the beginning, so I'll try to explain the concept in a simple way. Whenever you write an API or endpoint you have a one-way connection, i.e. you send a request to the server and it returns some response, as shown below.
Event 1:
(Client) -> Request -> (Server)
Event 2:
(Client) <- Response <- (Server)
For APIs, you cannot get a response without a request.
To overcome this, as of now I can think of two possible ways.
Using sockets. With sockets you can create a two-way connection, something like this:
(Server) <-> data <-> (Client)
It means you can pass data both ways, client to server and server to client. So whenever an event occurs (some data is added or updated in the database), you can emit or broadcast it to the client, and the client can listen on the socket and receive it.
In your case, since it's a two-way connection, you can emit data from Angular too, and the server can listen for it the same way.
I've attached a few links at the bottom; please have a look.
Using XML/AJAX requests. This is not the preferable method: using setInterval you can call the server every 5 seconds or so and do the operation needed.
setInterval(ajaxCall, 5000); // 5000 ms == 5 seconds

function ajaxCall() {
    // do your AJAX stuff here
}
Links:
https://socket.io/docs/
https://alligator.io/angular/socket-io/
We have a desktop and a web application, which share some database records. I am currently trying to save information into our user record (which both applications currently share), and when it is open in the desktop application the user record is locked.
Right now, the web application simply throws an error if the user record is locked when trying to make edits/update information.
I have been tasked with updating the user record with certain information as soon as it becomes unlocked (we are more concerned with it being locked for only a few minutes at a time, rather than hours).
One of the things I have to save to the user object is the menu positions of a JavaScript-built sortable menu... so I want a jQuery AJAX call that works in the background and keeps attempting to save the menu item positions until the user record is unlocked.
I have a generic handler (.ashx) set up that calls the update methods in the user object in order to save the menu positions and other data. I know how to make ajax calls and all that, I just can't wrap my head around how to have a background process that continually tries to save until the record is unlocked.
Maybe a worker function similar to:
(function worker() {
$.ajax({
url: 'ajax/test.html',
success: function (data) {
$('.result').html(data);
},
complete: function () {
// Schedule the next request when the current one's complete
setTimeout(worker, 5000);
}
});
})();
...which has a timeout, or a setInterval? I admittedly don't have any code written for this, other than the back-end methods, which check for the locked record and attempt to save to the user object.
Can anyone help?
EDIT
try
{
UniFile userSecurityFile = CurrentSession.CreateUniFile("USER.SECURITY");
userSecurityFile.WriteField(this.UserId, 443);
}
catch (UniFileException ex)
{
    if (ex.IsRecordLockedError())
    {
        // surface a clearer error to the caller instead of swallowing it
        throw new RecordAccessException(RecordAccessIssueType.Locked, "Could not save because record is locked");
    }
    throw;
}
You shouldn't even attempt to solve this using Javascript - JS is good to get data from the server or to post data to it but you have no way to know if the same user is using another browser at the same time or another machine - these may all generate the same write request to the server at the same time.
You need to have thread-safe server-side entrypoints that perform CRUD operations on your data. You can call these entry points using JS / Ajax but these functions will have to catch exceptions from the underlying database layer and return some kind of error code (or other error status) to the browser that tells the UI layer that the write attempt failed. There's no way to guarantee that the save will work without errors and the client may have to retry several times to succeed.
As the server-side code attempts to complete the write operation, you'll have to make sure you deal with concurrency problems in the data layer of your application. For more information, read Concurrency control - this, however, is a huge topic and a simplified article won't get you far. You'll have to decide if you can allow multiple write attempts (last one wins) or if all write attempts on a changed record should fail.
Edit:
If I misunderstood your question and your server-side code already handles all of the above, then in the browser all you need to do is set up an Ajax call, wait for the return value, and if it failed, retry up to N times. In that case, your sample code is roughly what you need.
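A sketch of that retry-up-to-N-times loop, assuming (hypothetically) that the .ashx handler reports a still-locked record as a `locked` flag in its JSON response; `saveFn` stands in for the actual $.ajax call and returns a Promise:

```javascript
// Retry a save up to maxTries times, waiting delayMs between attempts,
// for a server that answers { locked: true } while the record is locked.
function saveWithRetry(saveFn, maxTries, delayMs) {
    return saveFn().then(function (result) {
        if (!result.locked) return result;            // saved successfully
        if (maxTries <= 1) {
            throw new Error('record stayed locked');  // give up
        }
        return new Promise(function (resolve) {       // wait, then retry
            setTimeout(resolve, delayMs);
        }).then(function () {
            return saveWithRetry(saveFn, maxTries - 1, delayMs);
        });
    });
}
```

If the record stays locked after maxTries attempts, the promise rejects and the UI can tell the user the save failed.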
I want to long poll a script on my server from within a phonegap app, to check for things like service messages, offers etc.
I'm using this technique in the js:
(function poll() {
    $.ajax({
        url: "/php/notify.php",
        success: function (results) {
            // do stuff here
        },
        dataType: 'json',
        complete: poll,
        timeout: 30000
    });
})();
which starts a new poll as soon as the previous one completes or times out after 30 seconds (I'll be stopping the polling when the app is 'paused' to avoid extra load).
I am not sure how to set up the PHP though. I can set it up so it doesn't return anything and just loops through the script, but how do I make it return a response as soon as I decide I want to send a message to the app? My PHP code so far is:
<?php
include 'message.php';

$counter = 1;
while ($counter > 0) {
    // if the $message variable (from the included file) has data,
    // send the message back to the app
    if ($message != '') {
        // break out of the while loop if we have data
        break;
    }
}

// if we get here we've broken out of the while loop, so we have a message, but make sure
if ($message != '') {
    // send data back
    print(json_encode($message));
}
?>
message.php contains a $message variable (an array), which is normally blank but contains data when I want it to. The problem is that when I update the $message var in message.php, it doesn't send a response back to the app; instead it waits until it has timed out and the poll() function starts again.
So my question is: how do I set up the PHP so I can update the message on my server and have it sent out instantly to anyone polling?
Long polling is actually very resource-intensive for what it achieves.
The problem you have is that it's constantly opening new connections, which in my opinion is highly inefficient. For your situation, there are two ways to achieve what you need, the preferred one being web sockets (I'll explain both):
Server Sent Events
To avoid the inefficient Ajax timeout code, you may want to look into Server-Sent Events, an HTML5 technology designed to handle "long polling" for you. Here's how it works:
In JS:
var source = new EventSource("/php/notify.php");
source.onmessage = function (event) {
    document.getElementById("result").innerHTML += event.data + "<br>";
};
In PHP:
You can send notifications & messages using the SSE response format. I don't have any code at hand, but if you want me to create an example, I'll update this answer with it.
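For reference (a sketch, not PHP-specific), the wire format an SSE endpoint must produce is tiny: a `text/event-stream` content type and `data:` lines, with a blank line terminating each message:

```text
Content-Type: text/event-stream
Cache-Control: no-cache

data: {"message": "your notification here"}

data: a second message

```

Any server-side language that can keep the response open and flush output as messages arrive can serve this format.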
This causes the browser to hold a single open connection to the endpoint (your PHP file), listening for updates sent by the server, and to reconnect automatically if the connection drops. Still one connection per client, but far less chatty than hand-rolled polling.
WebSockets
Websockets are another ballgame completely, and are really great.
Long polling & SSEs work by the client opening requests to the server and "listening" for any information that is generated. The problem is that this is resource-intensive and consequently quite inefficient. The way around it is to open a single sustained connection called a web socket.
StackOverflow, Facebook & all the other "real-time" functionality you enjoy on these services is handled with web sockets, and they work in much the same way as SSEs: you open a connection in JavaScript & listen for any updates coming from the server.
Although we've never hand-rolled any websocket technology ourselves, I'd strongly recommend using one of the third-party socket services (for reliability & extensibility). Our favourite is Pusher.
setInterval(function () {
    // send ajax request and update chat window
}, 1000);
Is there any better way to update the chat with new messages? Is setInterval the right way to do it?
There are two major (or at least most popular) options.
Pulling
First is pulling, which is what you are doing: every x (milli)seconds you check whether the server has anything new.
This is the HTML4 way (excluding Flash etc., so plain HTML/JS). For PHP it's not the best approach, because a single user opens a lot of connections per minute (with your example code, 60 connections per minute).
It is also recommended to wait for the response before scheduling the next request: if, for example, you request an update every second but the response takes 2 seconds, you are hammering your server. See tymeJV's answer for more info.
Pushing
Next is pushing. This is more the HTML5 way, implemented with websockets. What happens is that the client "listens" on a connection and waits to be updated; when an update arrives, it triggers an event.
This is not great to implement in plain PHP, because you need a constant connection, and your server will be overrun in no time, since PHP can't push connections to the background (the way Java can, if I'm correct).
I personally made a small chat app and used Pusher. It works perfectly. I only used the free version, so I don't know how expensive it is.
Pretty much, yes, with one minor tweak: rather than encapsulate an AJAX call inside an interval (which could result in a pile-up of unreturned requests if something goes bad on the server), throw a setTimeout into the AJAX callback to create a recursive call. Consider:
function callAjax() {
    $.ajax(options).done(function () {
        // do your response
        setTimeout(callAjax, 2000);
    });
}
callAjax();
In the example Todos app for backbone.js, this takes place:
clearCompleted: function() {
_.each(Todos.done(), function(todo){ todo.clear(); });
return false;
},
This deletes multiple models by sending out multiple http DELETE requests to whatever service is backing the app. In the example's case that is no problem b/c they are using a local storage solution.
But when I try a similar process with a database on the backend (sqlite/datamapper/sinatra) the fact that it sends off multiple delete http requests simultaneously causes the db to lock and send back an error.
Is this something any of you have run into?
I can think of two ways around it:
Have a destroyBatch() that sends an array of id's into a DELETE call, and have sinatra sniff out the multiple ids and handle the deletes all at once server-side.
Have a destroyAsync() on the client-side that pushes the ids into a queue and calls destroy() on the models one-by-one in an async chain reaction until they are all gone ( but you would see them being deleted one by one on the screen with a pause in between each).
Do either of those solutions seem reasonable, or am I a frail goose flapping wildly?
-j
Option 2 is not viable: your user can click back or close the window, and the deletion will not complete. So out with that one.
This leaves us to:
Fix your initial problem of locks in the DB :D
Send all ids to be deleted at once.
I would try to solve the initial problem first: what is causing the locks? I am pretty sure that in development mode Sinatra processes a single request at a time, so sending a bunch of deletes will actually be serialized in the backend processing. That is another question altogether, tied to the SQLite error returned.
As for sending the deletions in batches: it is a good idea, but it deviates from the standard RESTful controller, so you will have to handle it yourself, as Backbone does not provide a way to do this. You can add a deleteAll method on the collection and handle the sync from there (do not forget to fire events if you are relying on them).
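A rough sketch of that deleteAll idea, kept framework-agnostic so it stays testable: `sendFn` stands in for $.ajax / Backbone.sync, and the `/todos/batch_destroy` endpoint is hypothetical, i.e. your Sinatra route would read the ids and delete them in one transaction:

```javascript
// Batch the ids into a single DELETE instead of one request per model.
function deleteAll(models, sendFn) {
    var ids = models.map(function (m) { return m.id; });
    return sendFn({
        url: '/todos/batch_destroy', // hypothetical batch endpoint
        type: 'DELETE',
        data: { ids: ids }
    }).then(function () {
        // on success, the caller removes the models locally & fires events
        return ids;
    });
}
```

On a Backbone collection you'd wrap this so that, once the promise resolves, each model is removed and its events fire, keeping views in sync.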