How can I call or invoke a JavaScript function from the Application_Start of global.asax in an ASP.NET MVC (C#) application?
You can remember the last "invoked" time in Session or in cookies (which is easier for JavaScript but worse for performance, etc.) and then check it on the client:
function check() {
    var now = new Date().getTime();
    // or read the server-side value: var lasttime = <%= Session["lasttime"] %>;
    // timeout (in ms) is assumed to be defined elsewhere
    if (now - $.cookie("lasttime") > timeout) {
        $.cookie("lasttime", now);
        performAction();
    }
    window.setTimeout(check, 1000);
}
You can call this function once from $(document).ready().
But note that it may take the browser several seconds to render the page, or it may hit a 404 or other error and the page will be inactive... JavaScript is not a reliable way to do scheduled actions.
Another way is to keep your timer on the server. A JavaScript function like the one above would just ask the server for it from time to time, passing a user ID or something like that. This prevents the timer from being reset during a page reload, but you would have to make requests too often. So the best solution is to combine the two techniques:
Run timer on server
When the page renders, set var inited = false;
Run the function above, but like this: if (!inited) timer = $.getJSON("/timer?uid=x"); once you have the precise current timer you can continue with JavaScript only, without further server requests — see the sketch below.
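A rough sketch of that combination (the /timer?uid=x endpoint and performAction() come from the snippets above; the secondsLeft response field is an assumption for illustration):

var inited = false;
var remaining = 0; // seconds left until the action, as reported by the server

function check() {
    if (!inited) {
        // ask the server for the authoritative timer once
        $.getJSON("/timer?uid=x", function (data) {
            remaining = data.secondsLeft; // assumed response field
            inited = true;
        });
    } else if (--remaining <= 0) {
        performAction();
        inited = false; // re-sync with the server after the action fires
    }
    window.setTimeout(check, 1000);
}

$(document).ready(check);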
"The javascript function gets the data to be shown to the User from database through jquery. The javascript function will be executed periodically using setTimeout"
This wouldn't be the place to do it.
Have you thought about using your master page?
Since JavaScript executes on the client side and global.asax executes on the server side, you cannot do that.
How about checking an Application-level variable on load of your landing page (a master page would also do), registering whatever JavaScript you need there, and then setting the variable.
You can skip the registration if the variable is already set.
I have an ASP.NET Web Forms application which is used for searching my database. It has a page that contains a GridView and also a text input field for filtering the results.
The input field has an onkeyup event which triggers a postback via JavaScript to refresh the GridView from code-behind.
In code-behind I also have a method that saves the state of the last filter request; if the new request does not equal the previous filter request, I run a new query against the database.
My problem: the onkeyup event on the text input may generate multiple postbacks. Only the first postback executes the query against the database; the following postbacks do not execute it and overwrite the first results, so my GridView keeps its old state.
<script type="text/javascript">
    function DoUpdateGridView() {
        var timeout = null;
        clearTimeout(timeout);
        timeout = setTimeout(function () {
            Sys.WebForms.PageRequestManager.getInstance()._doPostBack('<%=GUIGridUpdatePanel.ClientID%>', '<%=GUIGridUpdatePanel.UniqueID%>');
        }, 1000);
    };
</script>
As you can see, I set a delay before executing the JavaScript, but this only delays the execution of the code; it does not interrupt the previous calls.
I think I could cache the first results and return them.
But I would like to know all the possible solutions to this problem.
Can there be a way to act only on the latest event from the JavaScript?
But then somebody could send multiple requests and my web application might go down (DDoS).
You can't interrupt what is already happening on the server, but there is an issue with your "delay" code: you need to persist the timeout variable between calls.
Try:
<script type="text/javascript">
    // New position for the timeout variable, so it persists between calls
    var timeout = null;
    function DoUpdateGridView() {
        // This will now clear the existing timeout
        clearTimeout(timeout);
        timeout = setTimeout(function () {
            // For debug purposes only
            console.log("About to postback");
            Sys.WebForms.PageRequestManager.getInstance()._doPostBack('<%=GUIGridUpdatePanel.ClientID%>', '<%=GUIGridUpdatePanel.UniqueID%>');
        }, 1000);
    };
</script>
When two (or more) postbacks are made from the same client at the same time, they send the same ViewState data to the server as part of each request. Control state is part of ViewState. Assuming Session is being handled in the default manner, these requests are processed by the server serially.
Let's say the server alters controls while handling the first request. Then the second request is processed. But since both postbacks were generated at the same time, the second request has the same ViewState (and thus the same control state) as the first request, so the state of the controls that the server sees is the state before the first request was processed, and not the state after the first request was processed!
The simplest solution is to store, in the Session, the pieces of state you are using to decide whether to change the controls; Session is kept in memory on the server by default, which makes the two requests distinguishable.
I'm working on a closed system web application to aid companies in their everyday online commerce chores. That means on the one hand that it won't be open to the public, on the other: it will have to deal with large amounts of data while maintaining a fluent work experience.
This is why I turned to web workers in JS to run all sorts of database access and data loading in the background.
My understanding is that not only does the main UI/main JS remain uninterrupted, but the different web workers also run without hindering each other.
I now have the following setup:
mainJS: function statusCheck which runs on pageload:
function statusCheck() {
if(typeof(w__statusCheck) == "undefined") {
var w__statusCheck = new Worker("...statusCheck.js");
w__statusCheck.postMessage("go");
w__statusCheck.onmessage = function(e) {
var message = JSON.parse(e.data);
if(message.text!=undefined) displayMessage(message.text);
}
}
statusCheck.js which is the worker simply goes like this:
function checkStatus() {
console.log("statusCheck started");
// I will leave standard parts out:
// creating and testing the ajax variable against different browsers
ajaxRequest.onreadystatechange = function() {
if(ajaxRequest.readyState == 4) {
self.postMessage(ajaxRequest.responseText);
var timer;
timer = self.setTimeout(function(){
checkStatus();
}, 1000);
}
}
ajaxRequest.open("GET", "...worker_statusCheck.php", true);
ajaxRequest.send(null);
}
this.onmessage = function(e){
checkStatus();
};
As you can see, this restarts itself every second (for now); the interval might be longer in production.
worker_statusCheck.php simply gets different things from the database and knits them into a JSON object which gives me the system status.
This works beautifully.
Now I have another worker which is supposed to be initiated by a click on a link, to effectively call some PHP to perform actions:
mainJS loadWorker
function loadWorker(url="") {
console.log("loadWorker started");
if(url!="") {
var uniqueID = "XXX" // creating a random ID based on timestamp and Math.random()
if(typeof(window[uniqueID]) == "undefined") {
var variables = { ajaxURL: url };
window[uniqueID] = new Worker("....loadWorker.js");
window[uniqueID].postMessage(JSON.stringify(variables));
window[uniqueID].onmessage = function(e) {
var message = JSON.parse(e.data);
if(message["success"]!=undefined) {
variables["close"] = "yes";
window[uniqueID].postMessage(JSON.stringify(variables));
}
}
}
With every click on a certain link this gets called, creates a uniquely named worker, runs it, receives the data and tells the worker to close().
The PHP again does its thing and writes a progress update to the DB after each step of the lengthy procedure. I fetch these progress updates from the DB with the repeating statusCheck above.
Now, I can see the entries in the DB with timestamp, so I know they get written each at their time.
So, both workers do their job and run reliably. But I have noticed that whenever I initiate the manual (randomly named) worker, the statusCheck actually stops performing. It just gets stuck... I was able to confirm this with console output from both workers. So it's not the main JS that seems stuck; the statusCheck itself actually pauses... and resumes when loadWorker is done.
Am I missing something fundamental here? Any insight would be appreciated since I'm new to this concept of web workers.
Thanx :)
Your question doesn't include enough detail to truly figure out what exactly goes wrong. I can confirm that two web workers can operate at the same time, even with synchronous operations; I tested this both for for loops and for synchronous XHR requests.
There are multiple things I would recommend though.
First: unless you're processing the data with some CPU-heavy algorithm, web workers are a waste of time here. XHR requests do not block the main thread (unless you explicitly make them synchronous).
In statusCheck() you declare var w__statusCheck, which makes it a local variable. Therefore it is not visible from the outer scope, and the worker might get garbage-collected once no code is running in it.
Do not use XMLHttpRequest.onreadystatechange. Use onload and onerror.
Random unique IDs for variables are almost always wrong. If you need to store the worker reference at all, either give it a reasonable name (e.g. the URL it's supposed to load) or use an incremental ID.
Do NOT stringify data that you post to a web worker. It's already handled for you by the browser (via structured cloning), possibly in a more optimal manner. Needlessly converting the data is the single most common mistake people make with web workers.
Also, when posting a question, please at least make sure the code makes sense; in your post the curly braces do not match.
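Pulling the points about scope, onload/onerror, and structured cloning together, the polling could look roughly like this (a sketch only; file names are placeholders, and displayMessage is the function from your question):

// main JS: keep the worker reference in an outer scope so it is not
// redeclared (and potentially garbage-collected) on every call
var statusWorker = null;

function statusCheck() {
    if (statusWorker === null) {
        statusWorker = new Worker("statusCheck.js");
        // post a plain object; the browser clones it for you, no JSON.stringify needed
        statusWorker.postMessage({ command: "go" });
        statusWorker.onmessage = function (e) {
            if (e.data.text !== undefined) displayMessage(e.data.text);
        };
    }
}

// statusCheck.js (worker): use onload/onerror instead of onreadystatechange
function checkStatus() {
    var ajaxRequest = new XMLHttpRequest();
    ajaxRequest.onload = function () {
        // pass the parsed status object back; again, no manual stringify required
        self.postMessage(JSON.parse(ajaxRequest.responseText));
        setTimeout(checkStatus, 1000); // schedule the next poll
    };
    ajaxRequest.onerror = function () {
        setTimeout(checkStatus, 5000); // back off a little on errors
    };
    ajaxRequest.open("GET", "worker_statusCheck.php", true);
    ajaxRequest.send(null);
}

self.onmessage = function (e) {
    if (e.data.command === "go") checkStatus();
};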
Alright.. I figured it out:
I was looking in all the wrong places. It turns out I had initialized my PHP session in all the PHP scripts that are called by the workers, and my two parallel workers each called one. So the session file was locked by the first PHP script and the second had to wait until it was released again. It was not the workers or the JS being hindered, it was the PHP.
I have now taken the session initialization out of my statusCheck.php and it works like a charm. I will keep it in the others that handle the user-input responses, because there it actually makes sense: the user clicks the button "compile data XY", which is run by the worker and takes a while. Impatient as they are, they already click the next button "show this data"... and thanks to the locked session file I get a neat little queue for those actions. :)
I still will take above recommendations to heart and see to it to improve my code. :)
This has been driving me nuts for a while now. I have an ajax call like so:
function update()
{
    $.get("update.php", function(data)
    {
        $("#output-progress").html(data);
    });
}
And I call it like so:
window.setInterval(function()
{
    update();
}, 2000);
Then I have another calc function which is also called:
function calc()
{
    $.get("calc.php", function(data)
    {
        // whole bunch of lines to re-render page
    });
}
So the idea is that while calc() is running, the update() will periodically update a div on the progress.
However, what is happening is this: if I open the console and check, the calls to update() are triggered every 5 seconds, but they just stall and complete only after calc() has returned. I first thought this was a browser/jQuery issue, but if I log both functions into separate log files in PHP, then update() gets logged only after calc() finishes!
I'm not sure what is going on here; any pointers are greatly appreciated!
Most likely you are using sessions, and both calc.php and update.php access session data. To ensure data consistency, access to session data is locked, so only one PHP process can access the session at a time. This means that while calc.php holds the session, no other page request can read it.
What you will want to do is call session_write_close() after calc.php has finished anything that might require access to the session, and before it starts its time-consuming task.
session_write_close() writes the current session's data and releases the lock. Once calc.php is no longer holding on to the session, requests from update.php can read it.
I am having some trouble with a bit of code. I have a function that does some stuff to some data, calls a remote system (activating a script on that system and passing in the data), and then makes another call to the same system to activate a different script (which acts on the data saved above). The problem is that the 1st call to the remote system appears to get lost in the execution.
This runs in Safari and uses jQuery; the function is tied to a button click, which is wired up in the JavaScript code with an onclick handler (i.e. it is not defined in the HTML button definition).
Here's a rough breakdown of the function (cleaned out for viewing purposes - I hope I left enough to make it clear):
function compareJSON() {
    // loop through the objects, testing and changing data
    // ...
    dataSession = ({ /* build object for output */ });
    $.each(dataSession.chapters, function(indexC, value) {
        // compare objects to some others, testing and changing data
    });
    // ...
    // Call remote script on other system
    urlString = "url://blah.dee.com/Blar?script=SaveJSON&$JSONobject=";
    window.location = urlString + JSON.stringify(dataSession);
    // Call remote script on other system
    window.location = "url://blah.dee.com/Blar?script=EditJSON";
}
The last few lines of code are the two calls. They use window.location to actually trigger the remote system, passing the data through the URL. But I need BOTH scripts to get called and run. It appears that only the LAST script in the sequence ever gets run; if I switch them around, it's still only whatever is in last place.
Is there something about the window.location that doesn't actually process until the end of the function?
This script actually used to be a series of separate function calls, but I figured I was running into asynchronous execution that was causing the various script calls to not register. But once I put the code into this single function, it was still happening.
Any clues would be helpful.
Thanks,
J
Modifying the value of window.location is reserved exclusively for cases in which you'd like to cause a browser redirect.
It looks like you want to trigger a page request instead. You say you already have jQuery loaded; if so, you can trigger such a request using jQuery.get or a similar function.
For example:
// Loads the myscript.php page in the background
$.get('myscript.php');
// You can also pass data (in the form of an object as the second argument)
$.get('myscript.php', { name: "John", time: "2pm" });
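If both remote scripts must run, and in order, the two requests can be chained so the second fires only after the first completes. This is only a sketch: it assumes the endpoints from your question are reachable over HTTP, since $.get cannot load a custom url:// scheme.
function runRemoteScripts(dataSession) {
    // first request: save the data on the remote system
    $.get("http://blah.dee.com/Blar", { script: "SaveJSON", "$JSONobject": JSON.stringify(dataSession) })
        .done(function () {
            // second request: only fires once SaveJSON has completed
            $.get("http://blah.dee.com/Blar", { script: "EditJSON" });
        });
}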
I have a div that uses jQuery to load a file/contents with a JavaScript function:
function DoWork() {
// Do Stuff
}
Let's say the user can reload the div and pull the same file/contents with the same JS function DoWork(). The problem is that when the file is reloaded, the previously loaded DoWork() is still running. How can I kill the previously fired DoWork() and restart it?
JavaScript is single-threaded, which means only one thing can be executing at a given moment. If DoWork is already "running", then either a) it is blocking all other JS code and you have no choice but to let it finish, since you have no way to execute any interruption code until it finishes on its own, or b) DoWork is scheduled to fire on an interval via setTimeout() or setInterval().
If it's the latter case, setTimeout() and setInterval() return an ID. Store that ID somewhere and call clearTimeout(doWork_timeout_id) or clearInterval(doWork_interval_id) according to how you started it.
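For instance, a minimal sketch of the setTimeout case (the scheduleDoWork wrapper name is made up for illustration):

var doWork_timeout_id = null;

function scheduleDoWork() {
    // cancel the previously scheduled run, if any, before scheduling a new one
    if (doWork_timeout_id !== null) clearTimeout(doWork_timeout_id);
    doWork_timeout_id = setTimeout(DoWork, 1000);
}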
You can build a simple wrapper that uses setTimeout, and then have each call to DoWork call clearTimeout first. I don't really like this solution because you waste CPU on setTimeout.
Another option would be to use a web worker inside DoWork (it will do lots of other good things for you if you are working with big data, since it runs in another thread); then you have the option to send a 'stop' message each time you start the work of DoWork().
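Roughly, the main-thread side of that idea might look like the sketch below (the worker file name and message format are made up); the worker itself is also single-threaded, so it can only honour the 'stop' flag between chunks of work:

var doWorkWorker = new Worker("doWork.js"); // hypothetical worker script

function DoWork() {
    // ask the worker to abandon whatever it is currently doing...
    doWorkWorker.postMessage({ command: "stop" });
    // ...then start over with the fresh request
    doWorkWorker.postMessage({ command: "start" });
}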
Are you using ajax to load the div's contents? If so, a better way is as follows:
var doWorkAjax = null;
function DoWork() {
    // abort the previous request if it is still in flight
    if (doWorkAjax) doWorkAjax.abort();
    doWorkAjax = $.ajax({
        url: url,
        data: data,
        success: function(result) {
            // ....
            doWorkAjax = null;
        }
    });
}