I am writing a webpage with the following structure:
One section (table A) depends on another section (table B);
Another section (table B) has elements that require recalculation on each update. The calculation is handled by external tools, and will cause an event when finished.
In order to guarantee correctness, table A needs to be updated only after the other table (table B) is fully updated (i.e., done with its computation). However, I don't know how to achieve this effectively, and I could not find any wait facility within JavaScript.
For now, I am using the following method:
Declare a global variable updated and initialize it to false;
After the first table receives input, I run an empty while loop until updated is true;
Add a listener; once the calculation is done and the event is received, set updated to true.
This seems unintuitive to me, but I cannot think of any other way of doing it. Is there any good way to do this?
Thanks for any input!
In 2022, it's useful to be able to turn an event listener into a Promise (which can be used in promise chains or async/await code). A clean way to make one:
function getPromiseFromEvent(item, event) {
  return new Promise((resolve) => {
    const listener = () => {
      item.removeEventListener(event, listener);
      resolve();
    };
    item.addEventListener(event, listener);
  });
}

async function waitForButtonClick() {
  const div = document.querySelector("div");
  const button = document.querySelector("button");
  div.innerText = "Waiting for you to press the button";
  await getPromiseFromEvent(button, "click");
  div.innerText = "The button was pressed!";
}

waitForButtonClick();
<button>ClickMe</button>
<div></div>
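Applied to the question, the same helper can wait for the "calculation finished" event before table A is refreshed. This is only a sketch: calculationBus, "calculation-done" and updateTableA are placeholders, since the question doesn't show the real names.
// calculationBus and "calculation-done" stand in for whatever object and
// event your external tool actually emits when it finishes recalculating
async function refreshTableA() {
  await getPromiseFromEvent(calculationBus, "calculation-done");
  // table B is fully recalculated at this point, so table A can safely read from it
  updateTableA();
}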
Add a listener; once the calculation is done and the event is received, set updated to true.
Instead of setting updated to true and then waiting for updated to become true, just do whatever you want to do in the listener.
myEventBus.addListener(function () {
  // do whatever
  updateTable();
  alert('table updated!');
});
Doing empty while loops is a bad idea. Not only do you burn CPU cycles, but JavaScript is single-threaded, so you will loop forever without giving anyone a chance to change the variable.
What you can do is rewrite the table that has other people depending on it to "fire an event itself". There are many ways to do this, but basically you just want it to call a "continuation" function instead of blindly returning. This function can be predefined, or you can pass it as a parameter somewhere.
// This is just illustrative.
// Your actual code will probably be very different from this.
function update_part() {
  // do something
  signal_finished_part();
}

var parts_done = 0;

function signal_finished_part() {
  parts_done++;
  if (parts_done >= 5) {
    signal_all_parts_done();
  }
}

function signal_all_parts_done() {
  // do something to table A
}
You could write a callback function for whatever triggers the update. To avoid messy callbacks, you could use promises too, and update parts of the table depending on the data retrieved in the update operation. Open to suggestions.
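A rough sketch of the promise-based version, where externalTool, updateTableB and updateTableA are made-up names for illustration: the "calculation finished" event is wrapped in a promise once, then the table updates are chained on it.
// externalTool, updateTableB and updateTableA are placeholders
function recalculate() {
  return new Promise((resolve) => {
    externalTool.addEventListener("done", (e) => resolve(e.detail), { once: true });
    externalTool.start();
  });
}

recalculate().then((result) => {
  updateTableB(result); // table B is consistent now...
  updateTableA(result); // ...so table A can safely be derived from it
});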
Related
I'm working on the "Approve All" button. The process here is that when I click "Approve All," each individual "Approve" button will be triggered as a "click" all at once, and then each will send a POST request to the controller. However, when I clicked the Approve All button, there was a race condition causing the controller to return Error 500: Internal Server Error. I have tried using JS setTimeout() with a value of 1500*iter, but when the iterator gets higher, for example at i = 100, it would take 1500*100 => 150000 ms (150 s). I hope that explains the problem clearly. Is there a way to prevent such a case?
Here is my code, I'm using JQuery:
let inspection = $this.parents("li").find("ul button.approve"); // this will get all 'approve' buttons to be clicked at once
inspection.each((i, e) => {
  (function () {
    setTimeout(function () {
      $(e).data("note", r);
      $(e).click();
    }, 1500 * i); // this acts like a queue, but when i > 100, it takes even longer to send POST requests.
  })(this, i, e, r);
});
// then, each iteration will send a POST request to the controller.
$("#data-inspection ul button.approve").on("click", function() {
// send POST requests
});
Any help would be much appreciated. Thank you.
That 500 error may also be the server crashing from being unable to process all the requests simultaneously.
What I'd recommend is using an event-driven approach instead of setTimeout. Your 1500ms is basically a guess - you don't know whether clicks will happen too quickly, or if you'll leave users waiting unnecessarily.
I'll demonstrate without jQuery how to do it, and leave the jQuery implementation up to you:
// use a .js- class to target buttons your buttons directly,
// simplifying your selectors, and making them DOM agnostic
const buttonEls = document.querySelectorAll('.js-my-button');
const buttonsContainer = document.querySelector('.js-buttons-container');
const startRequestsEvent = new CustomEvent('customrequestsuccess');
// convert the DOMCollection to an array when passing it in
const handleRequestSuccess = dispatchNextClickFactory([...buttonEls]);
buttonsContainer.addEventListener('click', handleButtonClick);
buttonsContainer.addEventListener(
'customrequestsuccess',
handleRequestSuccess
);
// start the requests by dispatching the event buttonsContainer
// is listening for
buttonsContainer.dispatchEvent(startRequestsEvent);
// This function is a closure:
// - it accepts an argument
// - it returns a new function (the actual event listener)
// - the returned function has access to the variables defined
// in its outer scope
// Note that we don't care what elements are passed in - all we
// know is that we have a list of elements
function dispatchNextClickFactory(elements) {
  let pendingElements = [...elements];

  function dispatchNextClick() {
    // get the first element that hasn't been clicked
    const element = pendingElements.find(Boolean);

    if (element) {
      const clickEvent = new MouseEvent('click', {bubbles: true});
      // dispatch a click on the element
      element.dispatchEvent(clickEvent);
      // remove the element from the pending elements
      pendingElements = pendingElements.filter((_, i) => i > 0);
    }
  }

  return dispatchNextClick;
}
// use event delegation to mitigate adding n number of listeners to
// n number of buttons - attach to a common parent
function handleButtonClick(event) {
  const {target} = event;

  if (target.classList.contains('js-my-button')) {
    fetch(myUrl)
      .then(() => {
        // dispatch event to DOM indicating request is complete when the
        // request succeeds; bubbles lets the listener on buttonsContainer hear it
        const completeEvent = new CustomEvent('customrequestsuccess', {bubbles: true});
        target.dispatchEvent(completeEvent);
      });
  }
}
There are a number of improvements that could be made, but the main ideas are that:
one should avoid magic numbers - we don't know how slowly or quickly requests are going to be processed
requests are asynchronous - we can determine explicitly when they succeed or fail
DOM events are powerful
when a DOM event is handled, we do something with the event
when some event happens that we want other things to know about, we can dispatch custom events. We can attach as many handlers to as many elements as we want for each event we dispatch - it's just an event, and any element may do anything with that event. For example, we could make every element in the DOM flash by attaching a listener for a specific event to every element
Note: this code is untested
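As a stripped-down illustration of the custom-event idea on its own (the .js-status class and the message text are made up here):
const statusEl = document.querySelector('.js-status');

// any element can listen for a custom event...
statusEl.addEventListener('customrequestsuccess', () => {
  statusEl.textContent = 'a request finished';
});

// ...and any code can dispatch one; bubbles: true also lets ancestors hear it
statusEl.dispatchEvent(new CustomEvent('customrequestsuccess', { bubbles: true }));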
I have a question/problem with a while loop.
I need to wait until something changes outside the while loop.
Let's say I have this while loop:
window.changeMe = true;
while(window.changeMe){
}
Now I have these two options:
Change the changeMe variable via the console/JavaScript execution
Change the changeMe variable via a WebSocket event
But neither is working: if I change the variable directly, it is not changed.
If I trigger a WebSocket event, its handler is not getting called.
Maybe it's blocked... so is there any other way to change the variable?
I know I can use await and it's already working that way, but the problem is that these functions with while loops are called via an addon,
and using many awaits looks kind of ugly for the addon creator :(
A system with setTimeout & callbacks also works, but it also looks kind of ugly...
Yes, you are correct. An infinite while loop occupies the main thread and prevents the JavaScript event loop from executing any other code.
To imitate the same behavior, you can implement your own while loop that is friendly to asynchronous events and external code execution. You have to use:
recursion instead of a loop (each iteration schedules the next one, so the call stack does not grow),
setTimeout as the scheduling mechanism, so other parts of your code can run asynchronously in between iterations.
EXAMPLE:
window.changeMe = true;
let stop = setTimeout(() => { console.log("External change stop"); window.changeMe = false; }, 4000)
var whileLoop = () => {
  console.log("Inside: ", window.changeMe);
  return window.changeMe
    ? setTimeout(() => { whileLoop(); }, 0)
    : false;
};
whileLoop()
console.log("Outside: ", window.changeMe)
Here is a fiddle:
https://jsfiddle.net/qwmosfrd/
Here is a setInterval fiddle:
https://jsfiddle.net/2s6pa1jo/
Promise return value example fiddle:
https://jsfiddle.net/0qum6gnf/
JavaScript is single-threaded. If you have while (true) {}, then nothing else outside the while loop can change the state of your program. You need to change your approach. You probably want to set up event listeners instead or put this inside an async function so you can use await to release execution, or some other asynchronous API. But plain vanilla while () {} is synchronous and cannot be affected by other things while it is running.
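For instance, a minimal sketch of the async/await variant; the sleep helper is not built in, it is defined here:
// sleep just resolves after ms milliseconds
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function waitForChange() {
  while (window.changeMe) {
    await sleep(50); // yields to the event loop so other code gets a chance to flip changeMe
  }
  console.log("changeMe is now false");
}

waitForChange();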
You can't use a while loop in that way in nodejs.
Nodejs runs your Javascript in a single thread and the overall architecture of the environment is event driven. What your while loop is doing is a spin loop so while that loop is running, no other events can ever run. You have to return control back to the event loop before any other events can run. That means that timers, network events, etc... cannot run while your spin loop is running. So, in nodejs, this is never the right way to write code. It will not work.
The one exception could be if there was an await inside the loop which would pause the loop and allow other events to run.
So, while this is running:
while(window.changeMe){
}
No other events can run and thus nothing else gets a chance to change the changeMe property. Thus, this is just an infinite loop that can never complete and nothing else gets a chance to run.
Instead, you want to change your architecture to be event driven so that whatever changes the changeMe property emits some sort of event that other code can listen to so it will get notified when a change has occurred. This can be done by having the specific code that changes the property also notify listeners or it can be done by making the property be a setter method so that method can see that the property is being changed and can fire an event to notify any interested listeners that the value has changed.
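A minimal sketch of the setter idea in Node.js; the state object, the 'changeMe' event name, and where the property actually lives are all illustrative, since the question doesn't show them:
const EventEmitter = require('events');

const state = new EventEmitter();
let _changeMe = true;

// expose changeMe through a setter so every assignment fires an event
Object.defineProperty(state, 'changeMe', {
  get() { return _changeMe; },
  set(value) {
    _changeMe = value;
    state.emit('changeMe', value);
  }
});

state.on('changeMe', (value) => {
  if (!value) console.log('changeMe was cleared, continue here');
});

// elsewhere (a WebSocket handler, the console, ...):
state.changeMe = false;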
I want to port some old software to JavaScript. These programs are typically not event driven and just run in a loop. They only pause to get input from the input stream. I can't just convert them to JavaScript since there is no equivalent of a classic Pascal or C read instruction. I thought it would be possible to use an input field which would fire an onchange event. My program would then be suspended until the event fires. But apparently you can't suspend a JS program.
My second attempt was to set a flag on the onchange event. My program stays in a loop until the flag is set and then reads the value of the input field. But to prevent the browser from getting blocked by this loop I need some sleep functionality between two polls. Apparently there is no equivalent of a sleep function in JS.
How can this be done?
You could use Promises and await/async to create code that looks like what you know from blocking code.
But it is important to note that this is not blocking code. At the await, other code waiting to be executed can interleave.
function waitForInput(id) {
  // create a Promise that resolves at the input event
  return new Promise((resolve, reject) => {
    let elm = document.getElementById(id)
    function listener(evt) {
      // remove the listener so that no memory leaking occurs
      elm.removeEventListener('input', listener)
      // resolve the promise with the current value of the element
      resolve(elm.value)
    }
    // call the listener on the input event
    elm.addEventListener('input', listener, false);
  })
}

(async function() {
  while (true) {
    console.log('before waitForInput')
    console.log(await waitForInput('test'))
    console.log('after waitForInput')
  }
}())
<input id="test">
If it is a good idea to solve it that way depends on the exact use-case. In general you should check how the task you want to perform should be solved in the new environment, instead of forcing the old style into the new environment.
If you're coming from a language that does something like...
while (true) {
  x = readInput();
  processInput(x);
}
Then you're correct, there is no direct equivalent in JavaScript. You need to forget about looping, and instead think of everything that happens in your loop before it blocks on user input as one part that sets up an event, and everything that happens after as a callback that handles that event.
The above (very trivial) program would be rewritten in JavaScript as something like:
readInput().then((x) => { processInput(x) });
I have the following code;
$("#myID").click(function () {
//do something
})
At some point, a user action on another part of the webpage needs to change the action that occurs on the click e.g.
$("#myID").click(function () {
//do something different
})
My question is, what's the correct/most efficient way of doing this? Currently I'm just implementing the second chunk of code above, but will this cause some odd behaviour? I.e., will there now be two different actions performed on click, or does the second block of code override the first?
They will both execute, so no, the second call does not overwrite the first.
Basic jsFiddle example
And as pimvdb notes, they will be executed in the order they were bound.
You can always unbind the click function first: http://api.jquery.com/unbind/
Right, they "stack". I.e. $("#myID") will maintain a list of event handlers, and execute both on click.
If you no longer want the original handler, you need to unbind it, using $("#myID").off('click'), or if you're using an old version of jQuery, $("#myID").unbind('click'). http://api.jquery.com/unbind/
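For example, one way to swap the handler (using .off/.on, available since jQuery 1.7):
// drop every existing click handler on #myID, then attach the new one
$("#myID").off("click").on("click", function () {
  // do something different
});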
In your code, both clicks will be executed.
Try to unbind the click event first:
$("#myID").unbind("click")
http://api.jquery.com/unbind/
You can add a global variable isAnotherEvent = false and then check in the click event which part of the code you need to execute; to execute the other part, simply set isAnotherEvent = true.
var isAnotherEvent = false;

$("#myID").click(function () {
  if (!isAnotherEvent) {
    //do something
  } else {
    //do something else
  }
});

$("#btnChangeEvent").click(function () {
  isAnotherEvent = true;
});
My users are presented a basically a stripped down version of a spreadsheet. There are textboxes in each row in the grid. When they change a value in a textbox, I'm performing validation on their input, updating the collection that's driving the grid, and redrawing the subtotals on the page. This is all handled by the OnChange event of each textbox.
When they click the Save button, I'm using the button's OnClick event to perform some final validation on the amounts, and then send their entire input to a web service, saving it.
At least, that's what happens if they tab through the form to the Submit button.
The problem is, if they enter a value, then immediately click the save button, SaveForm() starts executing before UserInputChanged() completes -- a race condition. My code does not use setTimeout, but I'm using it to simulate the sluggish UserInputChanged validation code:
<script>
  var amount = null;
  var currentControl = null;

  function UserInputChanged(control) {
    currentControl = control;
    // use setTimeout to simulate slow validation code
    setTimeout(ValidateAmount, 100);
  }

  function SaveForm() {
    // call web service to save value
    document.getElementById("SavedAmount").innerHTML = amount;
  }

  function ValidateAmount() {
    // various validationey functions here
    amount = currentControl.value; // save value to collection
    document.getElementById("Subtotal").innerHTML = amount;
  }
</script>
Amount: <input type="text" onchange="UserInputChanged(this)">
Subtotal: <span id="Subtotal"></span>
<button onclick="SaveForm()">Save</button>
Saved amount: <span id="SavedAmount"></span>
I don't think I can speed up the validation code -- it's pretty lightweight, but apparently, slow enough that code tries to call the web service before the validation is complete.
On my machine, ~95ms is the magic number between whether the validation code executes before the save code begins. This may be higher or lower depending on the users' computer speed.
Does anyone have any ideas how to handle this condition? A coworker suggested using a semaphore while the validation code is running and a busy loop in the save code to wait until the semaphore unlocks - but I'd like to avoid using any sort of busy loop in my code.
Use the semaphore (let's call it StillNeedsValidating). If the SaveForm function sees the StillNeedsValidating semaphore is up, have it activate a second semaphore of its own (which I'll call FormNeedsSaving here) and return. When the validation function finishes, if the FormNeedsSaving semaphore is up, it calls the SaveForm function on its own.
In jankcode:
var StillNeedsValidating = false;
var FormNeedsSaving = false;

function UserInputChanged(control) {
  StillNeedsValidating = true;
  // do validation
  StillNeedsValidating = false;
  if (FormNeedsSaving) SaveForm();
}

function SaveForm() {
  if (StillNeedsValidating) { FormNeedsSaving = true; return; }
  // call web service to save value
  FormNeedsSaving = false;
}
Disable the save button during validation.
Set it to disabled as the first thing validation does, and re-enable it as it finishes.
e.g.
function UserInputChanged(control) {
  // --> disable button here --<
  currentControl = control;
  // use setTimeout to simulate slow validation code (production code does not use setTimeout)
  setTimeout(ValidateAmount, 100);
}
and
function ValidateAmount() {
  // various validationey functions here
  amount = currentControl.value; // save value to collection
  document.getElementById("Subtotal").innerHTML = amount; // update subtotals
  // --> enable button here if validation passes --<
}
You'll have to adjust when you remove the setTimeout and make the validation one function, but unless your users have superhuman reflexes, you should be good to go.
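Filled in, it might look roughly like this, assuming the Save button is given an id (say id="SaveButton", which the original markup doesn't have):
function UserInputChanged(control) {
  document.getElementById("SaveButton").disabled = true; // block saving while validating
  currentControl = control;
  setTimeout(ValidateAmount, 100);
}

function ValidateAmount() {
  // various validationey functions here
  amount = currentControl.value;
  document.getElementById("Subtotal").innerHTML = amount;
  document.getElementById("SaveButton").disabled = false; // validation finished, saving is safe again
}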
I think the timeout is causing your problem... if that's going to be plain code (no asynchronous AJAX calls, timeouts etc) then I don't think that SaveForm will be executed before UserInputChanged completes.
A semaphore or mutex is probably the best way to go, but instead of a busy loop, just use a setTimeout() to simulate a thread sleep. Like this:
var busy = false;

function UserInputChanged(control) {
  busy = true;
  currentControl = control;
  // use setTimeout to simulate slow validation code (production code does not use setTimeout)
  setTimeout(ValidateAmount, 100);
}

function SaveForm() {
  if (busy) {
    setTimeout(SaveForm, 10);
    return;
  }
  // call web service to save value
  document.getElementById("SavedAmount").innerHTML = amount;
}

function ValidateAmount() {
  // various validationey functions here
  amount = currentControl.value; // save value to collection
  document.getElementById("Subtotal").innerHTML = amount; // update subtotals
  busy = false;
}
You could set up a recurring function that monitors the state of the entire grid and raises an event that indicates whether the entire grid is valid or not.
Your 'submit form' button would then enable or disable itself based on that status.
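A rough sketch of that polling idea; gridIsValid and the SaveButton id are made-up names for whatever the page actually exposes:
// hypothetical: gridIsValid() inspects every row and returns true/false
setInterval(function () {
  document.getElementById("SaveButton").disabled = !gridIsValid();
}, 250);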
Oh I see a similar response now - that works too, of course.
When working with async data sources you can certainly have race conditions because the JavaScript process thread continues to execute directives that may depend on the data which has not yet returned from the remote data source. That's why we have callback functions.
In your example, the call to the validation code needs to have a callback function that can do something when validation returns.
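In terms of the question's code, that could look something like this; the onValidated callback parameter and someTextbox are illustrative additions, not part of the original code:
function UserInputChanged(control, onValidated) {
  currentControl = control;
  setTimeout(function () {
    ValidateAmount();
    if (onValidated) onValidated(); // tell the caller validation has finished
  }, 100);
}

// validate, then save only once the callback fires
UserInputChanged(someTextbox, SaveForm);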
However, when making something with complicated logic or trying to troubleshoot or enhance an existing series of callbacks, you can go nuts.
That's the reason I created the proto-q library: http://code.google.com/p/proto-q/
Check it out if you do a lot of this type of work.
You don't have a race condition; race conditions cannot happen in JavaScript since JavaScript is single-threaded, so two threads cannot interfere with each other.
The example that you give is not a very good example. The setTimeout call will put the called function in a queue in the JavaScript engine and run it later. If at that point you click the save button, the setTimeout function will not be called until AFTER the save is completely finished.
What is probably happening in your JavaScript is that the onClick event is called by the JavaScript engine before the onChange event is called.
As a hint, keep in mind that JavaScript is single-threaded, unless you use a JavaScript debugger (Firebug, Microsoft Script Debugger). Those programs intercept the thread and pause it. From that point on, other threads (either via events, setTimeout calls or XMLHttp handlers) can then run, making it seem that JavaScript can run multiple threads at the same time.