I have been working on building an instant messaging system on a website (kind of like Facebook and Gmail). I have JavaScript poll the server for new messages.
If the user has multiple instances of the site open, is there any way to prevent each one from making requests?
You can assign each "new" load of the page a UUID and drop requests from all UUIDs that are not the most recent one for that user. You need to send the UUID back with each request. If you want to get advanced, you can have the JavaScript on the page check the response to see whether the server says it's an old UUID and that it should stop making requests.
Register each connection with a GUID generated on the fly in the browser. Check the GUID and username pair to see which page was the owner last. On page load, declare yourself a new window and announce that you're taking ownership. Sort of PageJustLoadedMakeMeOwner(myGuid, username).
Then have that GUID-identified page update the server regularly to assert its ownership.
If it stops updating the server, have rules on the server that allow the next page that contacts it to take ownership for that username.
Have pages that have lost ownership self-demote to polling only once a minute or so.
The check to see whether a given page is the owner for that username is really fast; it takes almost no time as far as the client is concerned, so the AJAX there doesn't really restrict you.
Sort of an AmIOwner(username, myGuid) check (probably done every five seconds or so). If true, do the thing you want to happen. If false, poll to see whether ownership of the page is vacant. If it is, take ownership; if not, poll again in xx seconds to see whether the owner slot has become vacant.
Does that make any sort of sense?
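To make that concrete, here is a minimal client-side sketch of the loop described above (jQuery is used only for brevity; the /makeMeOwner and /amIOwner endpoints, generateGuid(), currentUsername, and pollForNewMessages() are hypothetical names, not anything your server already provides):

// Sketch only: endpoint names and helpers below are assumptions.
var myGuid = generateGuid();      // any client-side GUID/UUID generator
var username = currentUsername;   // however you identify the logged-in user

// On page load, claim ownership: PageJustLoadedMakeMeOwner(myGuid, username)
$.post('/makeMeOwner', { guid: myGuid, username: username });

setInterval(function () {
    // AmIOwner(username, myGuid) check, every five seconds or so
    $.get('/amIOwner', { guid: myGuid, username: username }, function (resp) {
        if (resp.isOwner) {
            pollForNewMessages();   // this tab does the real, frequent polling
        } else if (resp.ownerVacant) {
            // previous owner stopped checking in; take ownership
            $.post('/makeMeOwner', { guid: myGuid, username: username });
        }
        // otherwise another tab owns the polling; check again next tick
    });
}, 5000);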
You could do something for multiple instances in the same browser, but there's nothing you can do if the user has multiple browsers open. (Granted, that's not a very common scenario.)
If you still want to give it a try, probably the easiest way would be to keep a timestamp of the last request in a cookie and make a new request only once a certain threshold has passed. You might still run into a small race until the multiple instances settle down, but if you use a fuzzy time period for the polls, the instances should quickly settle into a stable state where one of them makes the call and the others reuse the result of the last call.
The main advantage of that approach is that the requests can be made by any of the instances, so you don't have to worry about negotiating a "primary" instance that makes the calls and figuring out a fallback algorithm if the user closes the "primary" one. The main drawback is that, since it's a fuzzy, timing-based algorithm, it does not fully eliminate the race conditions, and occasionally two instances will make the request. You'll have to fine-tune the timing a bit to minimize that case, but you can't fully prevent it.
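A rough sketch of that cookie-based approach (readCookie/writeCookie and fetchNewMessages are placeholders, the threshold and jitter values are arbitrary, and sharing the fetched result between tabs is left out):

// Sketch only: cookie helpers and fetchNewMessages() are assumed to exist elsewhere.
var THRESHOLD = 10 * 1000; // minimum gap between server requests

function maybePoll() {
    var last = parseInt(readCookie('lastPollTs') || '0', 10);
    var now = Date.now();
    if (now - last > THRESHOLD) {
        // This tab "wins" this round; the others will see the fresh timestamp and skip.
        writeCookie('lastPollTs', String(now));
        fetchNewMessages();
    }
    // Fuzzy interval: a random offset so the instances drift apart and stop racing.
    setTimeout(maybePoll, THRESHOLD + Math.random() * 5000);
}
maybePoll();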
Related
I have a variable in JavaScript which is created by reading a number from the HTML, adding a number to it and then writing it back to the HTML.
I want to make it so that no matter what browser/what user you are, you are seeing the latest version of the variable. Currently, if I refresh the page the number resets to 0 (the default value). I want it so that if I update the number to 1, someone viewing it from another browser will also see 1 and not 0.
I've seen that cookies are an option, however I thought cookies were client side only? So that would mean that only I would see the latest version of the variable.
I've seen that sessions are another option, are sessions server side? And would they do the job that I am after?
Is there another way of doing this I haven't considered?
Thanks in advance
I want to make it so that no matter what browser/what user you are, you are seeing the latest version of the variable.
You need to send your updates from the browser to a server, and then have that server relay your updates to all the other clients. There are many choices for how to do this, with various tradeoffs and complexity.
One method is to simply take that number and send it to the server. Then, on the next page load, the server injects the new number into the page it outputs (or serves it up via an API call, over AJAX, via the Fetch API, server-sent events, WebSockets, etc.). If you do this, though, you will need to decide how to handle concurrency. What happens if two people load the page at the same time?
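As a minimal sketch of that first idea, assuming a hypothetical /counter endpoint on your own server that stores the number and returns the current value as JSON:

// Sketch only: '/counter' and the element id 'number' are assumptions.
// Read the latest value when the page loads.
fetch('/counter')
    .then(function (res) { return res.json(); })
    .then(function (data) {
        document.getElementById('number').textContent = data.value;
    });

// Send an updated value to the server.
function saveNumber(newValue) {
    return fetch('/counter', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ value: newValue })
    });
}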
The general system you're describing is called Operational Transform, and this is a rabbit hole you probably don't want to go down right now. Just understand that there's no magic that synchronizes things across the planet perfectly and at the same time. Your system has to account for inherent delays in some way.
I've seen that cookies are an option, however I thought cookies were client side only?
Yes, cookies are client-side. They're sent to the server with every request, but that's not a useful tool for you, aside from session identification.
I've seen that sessions are another option, are sessions server side?
They can be, but you would need a way to know who the user is across browsers. Normally, a session ID is stored in a cookie.
Is it because you have limited space to store session information for a logged-in user, or because of security concerns about allowing a user to stay logged in for an extended period, or probably a mix of both?
Is this done at the back-end or front-end?
Assuming it is a back-end requirement, I have seen a lot of JavaScript code that creates a widget alerting the end user of a time-out. However, I don't see how it can really be coherent with the server, other than as a guideline saying you haven't performed an AJAX operation for a certain amount of time and hence your session will be timed out, without actually checking the back end, because if you do check the back end then you are actually extending your session.
Also, in general, what is the criterion that extends the session of a logged-in user? Does he have to fire an AJAX request to the back end (assuming a SPA), or is it enough if he clicks on an input field? If so, do we keep a timer that gets cleared each time this happens? (Again, not really coherent with the server, but it works practically.) I know this is a broad question. Any pointers would be helpful.
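For the front-end part, the "timer that gets cleared" idea might look roughly like this (a sketch only; the timeout length and warnUser() are assumptions, and as noted it is only a guideline that mirrors, rather than checks, the server's session):

// Sketch: reset an idle timer on user activity, warn shortly before the assumed timeout.
var IDLE_LIMIT = 15 * 60 * 1000; // assumed 15-minute server-side session timeout
var idleTimer;

function resetIdleTimer() {
    clearTimeout(idleTimer);
    idleTimer = setTimeout(function () {
        warnUser(); // placeholder: show the "your session is about to expire" widget
    }, IDLE_LIMIT - 60 * 1000); // warn one minute before the assumed timeout
}

['click', 'keydown', 'mousemove'].forEach(function (evt) {
    document.addEventListener(evt, resetIdleTimer);
});
resetIdleTimer();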
[PS: If this is more about theory, I could just move it to another site in the SO network? I thought it was a relevant question for beginners.]
This may be a bit of a tricky one (for me at least, but you guys may be smarter). I need to capture the timestamp of exactly when a reader clicks a link in an email. However, this link is not a hyperlink to another webpage; it is a link formatted as a GET request with query strings that will automatically submit a form.
Here is the tricky part: the form processing is not handled by PHP or .NET or any other server-side language. It is a form engine that is hosted and managed by a cloud-based marketing platform that captures and displays the form submission data (so I have no access to the code behind the scenes).
Now, if this weren't an email, I'd say it is simple enough to just use JavaScript. However, JavaScript doesn't work well with email, if at all (I'm just assuming there are some email clients out there that support JavaScript).
How would you go about capturing the timestamp for when the link is clicked without using any type of scripting? Is this even possible?
The best solution I could come up with was to have the link point to an intermediate page with JavaScript to capture the timestamp and then redirect to the form submission. The only problem with that is that it will only capture the timestamp of the page load and not of the actual click.
There is no way to do what you want "without any type of scripting". If no scripting is done, no functionality may be added or changed.
The best option is the very one you suggested: use an intermediary page that records the request time. Barring unusual circumstances (such as a downed server), the time between a link being clicked and the request reaching the server will be less than 1 second.
Do you really need a higher resolution or accuracy than ~1s? What additional gain is there from having results on the order of milliseconds or microseconds? I can't imagine a scenario in which you'd have tangible benefits from such a thing, though if you do have one I'd love to hear it.
My initial thought was to say that what you're trying to do can't be done without some scripting capability, but I suppose it truly depends on what you're trying to accomplish overall.
While there is ambiguity in what you're trying to accomplish from what you have written, I'm going to make an assumption: you're trying to record interaction with a particular email.
Depending on the desired resolution, this is very possible; in fact, it's something that most businesses have been doing for years.
To begin my explanation of the technique, consider this common functionality in most mail clients (web-based or otherwise):
Click here to display images below
The reason this exists is that the images loaded into the message you're reading often come from a remote server not hosted by the mail client. In the process of requesting an image, a great deal of information about yourself is given to that outside server via the HTTP headers of your request, including, among other things, a timestamp for the request. The button above exists to prevent that from happening without your consent.
That said, it's also important to note how other mail providers, most notably Gmail, are approaching this now. The aforementioned technique is so common (used by advertisers and by other, more nefarious parties for phishing, malware, etc.) that Google has decided to start caching all mail images themselves. The result is that the email looks exactly the same, but all requests for images are instead directed at Google's cached versions.
Long story short, you can get a timestamp noting interaction with an email via an image request, but such metric collection in general, regardless of whether it's done in the manner I've outlined, is something mail clients try to prevent, at least at some level.
EDIT - To relate this back to what you mention in your question and your idea of having some intermediary page: you could skip having that page and instead point an image request at a server you control.
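For illustration, the email would embed an image pointing at that server, and the server would log the request time before returning a transparent 1x1 GIF. A minimal Node.js sketch, assuming you do control such a server (the host name, port, and query parameter are made up):

// Sketch only: a tiny tracking-pixel endpoint.
// The email would contain something like:
//   <img src="http://tracker.example.com:8080/pixel.gif?id=RECIPIENT_ID">
var http = require('http');
var url = require('url');

// A 1x1 transparent GIF, base64-encoded.
var PIXEL = Buffer.from('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7', 'base64');

http.createServer(function (req, res) {
    var query = url.parse(req.url, true).query;
    // Log who triggered the request and exactly when.
    console.log(new Date().toISOString(), query.id);
    res.writeHead(200, { 'Content-Type': 'image/gif' });
    res.end(PIXEL);
}).listen(8080);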
In our application, we are painting the navigation component using JavaScript/jQuery, and because of authorization this involves complex logic.
The navigation component is required on almost all authenticated pages, so whenever the user navigates from one page to another, the complex logic is repeated on every page.
I am sure that, under particular conditions, the results of these complex calculations will not change for a certain period, so I feel recalculation is unnecessary under those conditions.
So I want to store/cache the results on the browser/client side. One solution, I feel, would be creating a cookie with the results.
I need suggestions on whether this is a good approach. If not, what else can I do here?
If you can rely on modern browsers, the HTML5 web storage options are a good bet.
http://www.html5rocks.com/en/features/storage
Quote from above
There are several reasons to use client-side storage. First, you can make your app work when the user is offline, possibly sync'ing data back once the network is connected again. Second, it's a performance booster; you can show a large corpus of data as soon as the user clicks on to your site, instead of waiting for it to download again. Third, it's an easier programming model, with no server infrastructure required. Of course, the data is more vulnerable and the user can't access it from multiple clients, so you should only use it for non-critical data, in particular cached versions of data that's also "in the cloud". See "Offline": What does it mean and why should I care? for a general discussion of offline technologies, of which client-side storage is one component.
if (typeof(Storage) !== "undefined") {
    // this will store and retrieve a key/value pair for the browser session
    sessionStorage.setItem('your_key', 'your_value');
    sessionStorage.getItem('your_key');

    // this will store and retrieve a key/value pair persistently for the domain
    localStorage.setItem('your_key', 'your_value');
    localStorage.getItem('your_key');
}
You can also try HTML5 Local Storage or Web SQL, which give you more options. Web SQL support is very limited compared to Local Storage. Have a look at http://diveintohtml5.info/storage.html
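Applied to your case, a sketch of caching the computed navigation result in localStorage with a simple expiry (the key name and one-hour lifetime are arbitrary, and buildNavigation() stands in for your existing complex logic):

// Sketch: cache the expensive navigation result with a time-to-live.
var NAV_CACHE_KEY = 'navCache';        // arbitrary key name
var NAV_CACHE_TTL = 60 * 60 * 1000;    // assumed one-hour validity

function getNavigation(buildNavigation) {
    var cached = localStorage.getItem(NAV_CACHE_KEY);
    if (cached) {
        var entry = JSON.parse(cached);
        if (Date.now() - entry.savedAt < NAV_CACHE_TTL) {
            return entry.data;         // reuse the cached result, skip the recalculation
        }
    }
    var data = buildNavigation();      // your existing complex calculation
    localStorage.setItem(NAV_CACHE_KEY,
        JSON.stringify({ savedAt: Date.now(), data: data }));
    return data;
}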
I'm currently fooling around with AJAX. Right now, I created a Markdown previewer that updates on change of a textarea. (I guess you know that from somewhere... ;-) ).
Now, I'm trying to figure out how to update a page when an event is fired from another client: an asynchronous message board, so to speak. A user writes something, an event is fired, and the post is written.
But on the other clients' pages, the new post is of course not yet available until they reload and get the updated list of posts from the database.
Now, how can you get this to work asynchronously, so that the moment one client does something, all the other clients get to know that he did it?
I don't think this can be done completely in AJAX, but I also have no idea whatsoever how to implement this on the server side, as it would require a page reload to inform the other clients of the event.
I'm thinking of creating a file or database entry that holds a hash of the current state of the data. Whenever a client loads the page, it saves this hash. Then, a timer (does this exist in JavaScript?) checks the hash every few seconds.
As soon as anyone changes the database, the hash is recalculated. If the script sees that the hash has changed and differs from the one it saved, it reloads the contents from the database and saves the new hash.
Is that even going to work?
Polling that is as light as possible is really the best solution here. Even if you used a socket or something, that's still basically a live connection waiting around that will likely have to poll itself (albeit in a more efficient way).
20 queries in 10 minutes with responses like {"updates":false} shouldn't even put a dent in your application. Imagine someone browsing your site and requesting 20 pages plus the related images/scripts/etc. (even with some caching involved): there could easily be hundreds of requests, requiring all sorts of wasted database queries for information displayed on pages they don't actually care about.
You could use polling. For example, each client might send continuous AJAX requests to the server, say every 30 seconds, to see if new posts are available and, if so, show them:
setInterval(function() {
    // TODO: Send an AJAX request to the server and fetch new posts.
    // If new posts are available, update the DOM.
}, 30 * 1000);
On the other hand, when someone decides to write a new post, you send an AJAX (or non-AJAX) request to the server to store the post in the database.
Another, less commonly used approach is the concept of Comet and the HTML5 WebSockets implementation, which allows clients to be notified by the server of changes using push.
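For completeness, a bare-bones browser-side WebSocket sketch (the URL and message format are assumptions, and you would still need a WebSocket-capable server behind it):

// Sketch only: 'wss://example.com/posts' is a hypothetical endpoint.
var socket = new WebSocket('wss://example.com/posts');

socket.onmessage = function (event) {
    var post = JSON.parse(event.data); // assumes the server pushes new posts as JSON
    appendPostToDom(post);             // placeholder for your DOM update
};

// Publishing a new post over the same connection (call only after the socket has opened).
function publishPost(text) {
    socket.send(JSON.stringify({ text: text }));
}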