What is the current best practice and method for loading a webpage whose server-side script takes 10-15 seconds to run?
Option 1: User clicks a link > server-side script runs > HTML page is returned (blank page for 10-15 seconds).
Option 2: User clicks a link > HTML page is returned immediately (with a progress bar) > AJAX POST request to the server > script completes > result is returned to the page.
Other options (threading?)
I am running Google App Engine (Python) Standard Environment.
Best practice would be for the script to not take 10-15 seconds.
What is your script doing? Is it generating something that you can pre-compute and cache or save in Google Cloud Storage?
If you're daisy-chaining datastore queries together, is there something you can do to make them happen async in tandem?
If it really has to take 10-15 seconds, then I'd say option 2 is a must:
User clicks a link > HTML page is returned immediately (with a progress bar) > AJAX POST request to the server > script completes > result is returned to the page.
The way we're doing it is using the AJAX approach (the second one), which is what everyone else does.
You can use Task Queues to run your script asynchronously and return the result to the front end using FCM (Firebase Cloud Messaging).
You should also try to break the script into multiple tasks so that it runs faster.
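To illustrate the idea of breaking the work into tasks, here is a minimal sketch. The chunking is plain Python; the commented enqueue call and the `/work_chunk` worker URL are hypothetical, not from the question:

```python
def chunk_ranges(total_rows, chunk_size):
    """Split a long job into (start, end) row ranges, one per task."""
    return [(start, min(start + chunk_size, total_rows))
            for start in range(0, total_rows, chunk_size)]

# Each range would then be enqueued as its own task, e.g. with a
# hypothetical worker handler:
#   taskqueue.add(url='/work_chunk', params={'start': s, 'end': e})
for s, e in chunk_ranges(1000, 250):
    print(s, e)
```

The tasks then run in parallel on separate instances instead of serially in one 10-15 second request.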
The Issue
Recently, I deployed a page to production containing javascript which used setInterval() to ping a webservice every few seconds. The behavior can be summed up as follows:
Every X seconds, Javascript on the upcomingEvents.aspx page calls the hitWebService() function, which sits in the hitWebService.js file.
The X interval value proved to be too small, so I removed all references to hitWebService(), the hitWebService.js file itself, and the web service it was trying to reach.
Attempts to hit the web service from normal IP addresses dropped off, but I am still getting attempted hits from a number of users who use a proxy service.
My theory is that my upcomingEvents.aspx and hitWebService.js have been cached by the proxy service. Indeed, when I log the referrer strings when a user hits the error page (every so often, one of these users will get redirected here), they are being referred from upcomingEvents.aspx.
The issue is that the attempts to hit this web service are filling up the IIS logs at an uncomfortable rate, and are causing unnecessary traffic on the server.
What I have attempted
Removed web service completely
Deleted the hitWebService.js file, then replaced it with a dummy file
Changed content expiration on IIS so that content expires immediately
Added Response.Cache.SetCacheability(HttpCacheability.NoCache) to the .vb codebehind on page
Completely republished site with changes
Restarted IIS, and also fully stopped and started it
Interesting bits
I can alter the VB codebehind on upcomingEvents.aspx to log session details, etc., and it seems to update almost instantly for the proxy service users
The Question
If my theory is correct, and the proxy server is indeed caching the files hitWebService.js and upcomingEvents.aspx, are there any other routes I can go down to force a code refresh, considering the above strategies haven't worked?
Thanks a lot,
In my case, I had an AJAX call being cached by ASP.NET. I used a parameter containing the JavaScript date value so that each call has a different query string.
like this:
function addQueryStringAntiCache(url)
{
    var d = new Date();
    var n = d.getTime(); // milliseconds since epoch, effectively unique per call
    return url + (url.indexOf("?") == -1 ? "?" : "&") + "nocache=" + n;
}
You can do the same thing for scripts:
<script src="myScript.js?v=8912398314812"></script>
In case you have access to the machine, use Fiddler to check whether the browser even makes a request for the files or serves them from its own cache. If it is the cache, you can try adding a meta tag or HTTP headers to prevent it from caching them.
Hope it helps.
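For reference, a typical set of response headers that tells browsers and intermediaries not to cache a resource (how strictly they are honored varies by proxy):

```
Cache-Control: no-store, no-cache, must-revalidate, max-age=0
Pragma: no-cache
Expires: 0
```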
I'm making a website with the Steam API.
I was trying it out by getting the friends list of the signed in person.
But the more friends you have, the longer it takes to load the page.
So I made the page start to load the friends as soon as the page is done loading.
If I try to refresh the page or sign out while the page is requesting the friends list, the browser keeps loading until the friends list has been fetched, and only then does it refresh the page or sign out.
How do I fix this so I can refresh the page without having to wait for the request to be fully performed?
Here is the jQuery I use to load in the PHP file:
$(function() {
$('#friends').load("friendstest.php");
});
Please tell me if you need more information.
This can be due to a session write lock.
If you make a request and it takes a long time, the user's session file is locked for writing, so another request can't open it for writing until the previous request releases the file, which happens at the end of the PHP script.
If your PHP script is not going to write any data to the session, you can call the session_write_close() function to release the lock (http://php.net/manual/en/function.session-write-close.php), so that other requests can open the session file.
So at the beginning of your friendstest.php, right after session_start(), you can call session_write_close().
Instead of loading the entire friends list in a single call, you can split it into multiple calls with something like:
var upto = 0;
function loadMoreFriends() {
    // fetch the next page of friends and append the returned HTML
    $.get("friendstest.php?limit=10&start=" + upto, function(html) {
        $('#friends').append(html);
    });
    upto += 10;
}
In this example, upto records how many friends have been loaded so far. The PHP would return only 10 friends, starting from where the page is up to.
You would call this function on page load and then whenever you would like to load more friends to the page (scroll to bottom or similar).
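The PHP endpoint itself is not shown in the answer, but the slicing logic it would implement is simple. A sketch in Python with illustrative names:

```python
def friends_page(friends, start, limit):
    """Return one page of the friends list for a paged endpoint."""
    return friends[start:start + limit]

# A 25-friend list served 10 at a time: the third request
# (start=20) returns only the last five friends.
```

Because each request is small, a reload or sign-out only has to wait for one short request instead of the whole list.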
I currently have a PHP script running in the background, reading a big table and sending the results to an API as it goes. This is done with an unbuffered query.
On the top of this script, I've put
ignore_user_abort(true);
set_time_limit(0);
These make sure the script runs in the background until it is done. I also have JavaScript that gets progress reports from this script. But when the page is reloaded, the progress starts over and the script starts sending the data to the API again, duplicating data.
I was wondering if there is a way to let the user continue on the script. So the script is already running in the background, but is there a way to return the user to the script so it'll be like they never left?
EX:
The user starts importing, and the import is at 200 rows out of 1 million. They refresh the page and the page says 202 rows out of 1 million: 202 because time has passed and more rows have been imported while the user was away, since the script keeps executing in the background.
Thank you in advance
You can use a websocket for this case. When you establish a new connection to the websocket, you can store an identifier for it in a cookie. Then, when the page is reloaded, you can re-establish the connection to the websocket server. On top of that, you need a websocket server that can read those cookie variables.
Could this be done using a cookie?
Each time it runs, update said cookie; then, if the page is refreshed, use the information in the cookie to start where it left off and update the page?
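A framework-agnostic sketch of that idea (the names are illustrative, not from the question): the import records its progress server-side under a job id, the client keeps only the job id in a cookie, and a reload simply reads the record instead of restarting the import:

```python
import uuid

# In-memory store for illustration only; a real deployment would use a
# database or cache so the state survives across requests and processes.
PROGRESS = {}

def start_job(total_rows):
    """Begin an import and return the job id the client stores in a cookie."""
    job_id = str(uuid.uuid4())
    PROGRESS[job_id] = {"done": 0, "total": total_rows}
    return job_id

def report(job_id):
    """What the page shows after a reload: current progress, not a restart."""
    state = PROGRESS.get(job_id)
    if state is None:
        return "unknown job"
    return "%(done)d rows out of %(total)d" % state
```

On page load, the front end checks the cookie: if a job id is present and still known to the server, it shows report(job_id) instead of starting a new import.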
I have a web service where any user can start a process that takes somewhere between 15 seconds and 10 minutes to complete. The process is started with a POST request and the user is redirected to a page that shows the current progress of the process (e.g. https://example.com/progress-status/123).
My current implementation is to send HTTP header Refresh with value n;url=https://example.com/progress-status/123 where n is automatically changed between 5 and 120 according to expected time to completion and current server load. As a result, the progress status is automatically updated once every 5 seconds or more. After the progress has been completed, the status page will immediately redirect (HTTP 301 and Location header) to the completed job.
This works otherwise nicely but causes ugly flickering in Opera 42.0, which considers this to mean forced reload and skip all caches. I'm sending correct Cache-Control headers so using cached result for everything would be fine but Refresh header causes all caches to be ignored. (The status page contains some images and links to static CSS files so it does not make any sense to refresh those resources for every poll request.)
Is there any way to implement polling just the HTML page without JavaScript? I know that I could poll just the current status with Ajax query and then update the part of the currently visible page with the updated information. However, that will not work if user disables JavaScript. Rest of the service works without JavaScript so requiring JavaScript for something this simple seems bad. (I already have a GET submit button on the progress status page to force refresh manually.)
I know that HTTP Refresh header is not defined in HTTP 1.0 or HTTP 1.1 so this is a bit grey area. I'm looking for something that works in real world without JavaScript.
You can try the HTML meta refresh:
<meta http-equiv="refresh" content="5"><!-- reloads the page after 5 seconds //-->
W3Schools doc on <meta>
Of course only put it if progress is < 100% :)
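Server-side, emitting the tag conditionally might look roughly like this (a Python sketch; the real service's templating is unknown, so the function and markup are illustrative):

```python
def status_page(percent_done):
    """Render the status body; embed the refresh tag only while running."""
    head = '<meta http-equiv="refresh" content="5">' if percent_done < 100 else ''
    return ('<html><head>%s</head>'
            '<body>Progress: %d%%</body></html>') % (head, percent_done)
```

Once the job is complete, the tag is omitted and the page stops reloading itself.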
Not completely without JavaScript, but you don't have to reload the whole page. Instead, you could listen to SSE (server-sent events) and update the status value in the element of the page that contains it.
I can't think of any way to do that, much less one that follows good practice: use JavaScript.
You can use a noscript tag to advise your users that JavaScript is necessary:
<noscript>
<h1>JavaScript is not enabled, please check your browser settings.</h1>
</noscript>
Realistically, almost everyone has JavaScript enabled nowadays.
Do Google Maps, Facebook, etc. work without JavaScript? No.
Why should your web app be any different?
I completely agree with this phrase from Matthew Trow's answer:
I think sacrificing functionality for 99% of users to accommodate 1% is sheer bloody mindedness.
I know how to do that with JavaScript, but I need a secure way to do it.
Anybody can view the page source, get the link and not wait 5 seconds.
Is there any solution? I'm working with JavaScript and Django.
Thanks!
The only secure way would be to put the logic on the server and check the time there. Make an AJAX call to the server: if the elapsed time is under 5 seconds, do not return the HTML; if it is greater, return the HTML to show.
Another option is to have the link point to your server: if the elapsed time is less than five seconds it redirects the user to a different page, and if it is greater than 5 it redirects them to the correct content.
Either way, it requires you to keep track of the session time on the server and take that responsibility away from the client.
Use a server-side timeout: whenever there is an (AJAX) request from the client for the download link with a timestamp, compare the client-sent timestamp with the current time and derive how long the request needs to be held at the server side to make up ~5 seconds. By comparing timestamps you can achieve an accurate waiting time, as network delays are taken into account automatically.
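A sketch of that server-side check in Python, since the question mentions Django (the function and variable names are illustrative): the server remembers when the page was served and computes how much longer it still has to hold the request back.

```python
import time

WAIT_SECONDS = 5

def seconds_to_hold(page_served_at, now=None):
    """How much longer the server should delay returning the download link."""
    if now is None:
        now = time.time()
    return max(0.0, WAIT_SECONDS - (now - page_served_at))
```

The view would sleep for seconds_to_hold(...) (or reject early requests outright) before returning the link, so reading the page source gains the user nothing.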
You can use AJAX: retrieve the button's source code from your back end and insert it into your page.
Something like
$.get('url', function(sourceCode) {
$('#midiv').html(sourceCode);
});