I am seeing that sometimes none of the events fire after calling indexedDB.open().
If I set a timeout and inspect openRequest, its readyState is 'done'. If I run a transaction on the db in openRequest.result, it works fine.
My guess is that in some cases the openRequest completes before we get a chance to attach onsuccess and the other event handlers; in other words, it does not get executed in the next event loop iteration.
As a workaround, I can inspect the state after some amount of time if no event has fired. However, this approach is hacky and fragile.
Does anyone know of a better way to address this?
var db;
var openRequest = window.indexedDB.open("todos", 1);

openRequest.onerror = function(event) {
  console.error('error in open', event);
};

openRequest.onsuccess = (event) => {
  console.log('success', event);
};

openRequest.onupgradeneeded = (event) => {
  console.log('upgradeneeded', event);
  db = event.target.result;
  db.createObjectStore("toDoList", { keyPath: "taskTitle" });
};

openRequest.onblocked = (event) => {
  console.log('blocked', event);
};

setTimeout(() => {
  console.log('timeout');
  console.log(openRequest.readyState); // equals "done"
}, 10000);
Either a "success" or an "error" event must fire when readyState becomes "done". If that's not happening then you've found a browser bug.
As noted in a comment, you'll want db = event.target.result in the onsuccess handler as well, otherwise db will not be set if an upgrade was not necessary. Are you certain that this isn't the source of your error? (i.e. maybe success was firing, you just weren't capturing the result?)
in some cases the openRequest execution is complete before we get to attach 'onsuccess' or other event handlers
If that happened it would be a browser bug. Are you seeing consistent behavior across browsers? Can you reliably reproduce this?
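Side note: request events fire in a task after the current script finishes, so handlers attached synchronously after open() cannot be "too late". One way to make that explicit is a promise wrapper; this is a sketch, and the helper name is mine, not part of the IndexedDB API:

```javascript
// Hypothetical helper: wrap any IDBRequest-shaped object in a Promise.
// The handlers are attached synchronously inside the executor, so no
// success/error event can slip past them.
function requestToPromise(request) {
  return new Promise((resolve, reject) => {
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}

// Usage with the code above would look like:
// requestToPromise(window.indexedDB.open("todos", 1)).then(db => { /* ... */ });
```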
I'm working on the "Approve All" button. The process is: when I click "Approve All," each individual "Approve" button is triggered with a "click" all at once, and each sends a POST request to the controller. However, when I clicked the "Approve All" button, a race condition caused the controller to return Error 500: Internal Server Error. I have tried using JS setTimeout() with a delay of 1500*iter, but when the iterator gets high, for example i = 100, it takes 1500*100 => 150000ms (150s). I hope that explains the problem clearly. Is there a way to prevent such a case?
Here is my code, I'm using JQuery:
let inspection = $this.parents("li").find("ul button.approve"); // gets all 'approve' buttons to be clicked at once
inspection.each((i, e) => {
  setTimeout(function () {
    $(e).data("note", r);
    $(e).click();
  }, 1500 * i); // acts like a queue, but when i > 100 it takes even longer to send POST requests
});
// then, each iteration will send a POST request to the controller.
$("#data-inspection ul button.approve").on("click", function() {
// send POST requests
});
Any help would be much appreciated. Thank you.
That 500 error may also be the server crashing from being unable to process all the requests simultaneously.
What I'd recommend is using an event-driven approach instead of setTimeout. Your 1500ms is basically a guess - you don't know whether clicks will happen too quickly, or if you'll leave users waiting unnecessarily.
I'll demonstrate how to do it without jQuery, and leave the jQuery implementation up to you:
// use a .js- class to target buttons your buttons directly,
// simplifying your selectors, and making them DOM agnostic
const buttonEls = document.querySelectorAll('.js-my-button');
const buttonsContainer = document.querySelector('.js-buttons-container');
const startRequestsEvent = new CustomEvent('customrequestsuccess');
// convert the DOMCollection to an array when passing it in
const handleRequestSuccess = dispatchNextClickFactory([...buttonEls]);
buttonsContainer.addEventListener('click', handleButtonClick);
buttonsContainer.addEventListener(
'customrequestsuccess',
handleRequestSuccess
);
// start the requests by dispatching the event buttonsContainer
// is listening for
buttonsContainer.dispatchEvent(startRequestsEvent);
// This function is a closure:
// - it accepts an argument
// - it returns a new function (the actual event listener)
// - the returned function has access to the variables defined
// in its outer scope
// Note that we don't care what elements are passed in - all we
// know is that we have a list of elements
function dispatchNextClickFactory(elements) {
  let pendingElements = [...elements];
  function dispatchNextClick() {
    // get the first element that hasn't been clicked
    const element = pendingElements.find(Boolean);
    if (element) {
      const clickEvent = new MouseEvent('click', {bubbles: true});
      // dispatch a click on the element
      element.dispatchEvent(clickEvent);
      // remove the element from the pending elements
      pendingElements = pendingElements.slice(1);
    }
  }
  return dispatchNextClick;
}
// use event delegation to mitigate adding n number of listeners to
// n number of buttons - attach to a common parent
function handleButtonClick(event) {
  const {target} = event;
  if (target.classList.contains('js-my-button')) {
    fetch(myUrl)
      .then(() => {
        // dispatch an event indicating the request is complete; it must
        // bubble so the listener on the container receives it
        const completeEvent = new CustomEvent('customrequestsuccess', {bubbles: true});
        target.dispatchEvent(completeEvent);
      });
  }
}
There are a number of improvements that can be made, but the main ideas are:
one should avoid magic numbers - we don't know how slowly or quickly requests will be processed
requests are asynchronous - we can determine explicitly when they succeed or fail
DOM events are powerful
when a DOM event is handled, we do something with the event
when some event happens that we want other things to know about, we can dispatch custom events. We can attach as many handlers to as many elements as we want for each event we dispatch - it's just an event, and any element may do anything with it. For example, we could make every element in the DOM flash by attaching a listener for a specific event to every element
Note: this code is untested
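For comparison, the same no-magic-numbers idea can be sketched with async/await, assuming each approval can be expressed as a function returning a promise (e.g. a fetch call; `approveUrl` below is a placeholder, not from the question):

```javascript
// Run async tasks strictly one at a time: each task starts only after
// the previous one has settled, so the server never sees a burst.
async function runSequentially(tasks) {
  const results = [];
  for (const task of tasks) {
    results.push(await task()); // wait for each request before starting the next
  }
  return results;
}

// e.g. runSequentially([...buttons].map(b => () => fetch(approveUrl(b))))
```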
I'm trying to consume an SSE response in JavaScript. The onopen and onerror handlers work well, but onmessage does not.
var conn = new EventSource(`${config.serverHost}/log/stream/${this.streamId}`);

conn.onopen = (evt) => {
  console.log('connected to ' + logSourceId);
};

conn.onmessage = (evt) => {
  console.log(evt);
};

conn.onerror = (evt) => {
  console.error(evt);
};
The connection establishes successfully, and the received events can be seen in the Chrome Network recording, as the following image shows. But onmessage is never triggered!
Any comment will be appreciated.
You appear to be using named events (something along the lines of "log streami...").
To support these, you need to use the addEventListener format.
// assuming the event name is "log streaming"
conn.addEventListener("log streaming", e => {
  console.log("log streaming", e);
});
FYI, onmessage only handles un-named events, i.e. event streams without an event field.
event
A string identifying the type of event described. If this is specified, an event will be dispatched on the browser to the listener for the specified event name; the website source code should use addEventListener() to listen for named events. The onmessage handler is called if no event name is specified for a message.
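For context, the event name lives in the stream itself. A hypothetical payload illustrating the difference (the JSON bodies are made up):

```
event: log streaming
data: {"line": "only addEventListener('log streaming', ...) receives this"}

data: {"line": "no event field, so this one is delivered to onmessage"}
```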
I'm writing a Chrome extension. It records users' behavior while they browse web pages by adding event listeners to customers' pages via a Chrome content script.
Code in content script looks like:
var recordingEvents = ['click', 'input', 'change'];
recordingEvents.forEach(function (e) {
  window.addEventListener(e, handler, true);
});
Example of custom page:
<script>
function reload() {
  var ifrw = document.getElementById("iframeResult").contentWindow;
  ifrw.document.open();
  ifrw.document.write("<div>abc</div>");
  ifrw.document.close();
}
</script>
<body>
<input type="submit" onclick="reload();" value="Reload" />
<iframe id="iframeResult"></iframe>
</body>
It uses document.open and document.write to rewrite the content of the iframe.
Here is the question. My event listeners are attached to the window object, and document.open removes all of its event listeners, as the picture below shows.
Is there a way to keep document.open from removing event listeners? Or to observe document.open, so I can manually re-add the listeners after it runs?
I found this question while trying to solve exactly the same problem.
Here is the spec: https://html.spec.whatwg.org/multipage/webappapis.html#dom-document-open — it says that on document.open the current document is destroyed and replaced with a fresh one. I had hoped that some events like "load" would still be preserved; no luck.
Here is my detection code:
const testEventName = 'TestEvent';
let tm;

function onTestEvent() {
  clearTimeout(tm);
}

function listenToTestEvent() {
  document.addEventListener(testEventName, onTestEvent);
}

listenToTestEvent();

function onLostEvents() {
  console.log('events are lost');
  listenToTestEvent();
  // DO THING HERE
}

function checkIfEventsAreLost() {
  // set the timeout first; the dispatched event's listener (if still
  // attached) clears it synchronously before it can ever fire
  tm = setTimeout(onLostEvents);
  document.dispatchEvent(new CustomEvent(testEventName));
}

new MutationObserver(checkIfEventsAreLost).observe(document, { childList: true });
When the document is recreated, its childList changes (a new documentElement node); this is the best trigger I've thought of to detect document replacement.
Note that event listeners fire before setTimeout(..., 0) callbacks.
This is a detailed explanation of why #Viller's answer works. I'm making this a new answer since it didn't fit into a comment.
The TestEvent event is a custom event used to detect when the listeners previously set up on a document have been removed.
In particular, this accounts for the case of document.open, which removes all listeners not only from the document but also from the window.
The general idea is to set up a listener for a custom event called TestEvent, which clears a timeout. That timeout is set up only when the document mutates, which is detected by a mutation observer.
Since the timeout schedules its callback to run no earlier than the next tick of the event loop, it can be cleared before then, avoiding the execution of its callback altogether. And since the TestEvent handler clears that timeout, the fact that the timeout was cleared implies the listener is still attached. On the other hand, if the timeout is not cleared before the next tick, that signifies the listeners were removed and a new "setup" is needed.
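The timing argument can be checked in isolation with the standard EventTarget (also available in Node); this is just a sketch of the mechanism, not the extension code:

```javascript
// A listener clears the timeout synchronously during dispatchEvent,
// before the zero-delay callback ever gets a chance to run.
const target = new EventTarget();
let lost = false;
let tm;

target.addEventListener('TestEvent', () => clearTimeout(tm));

function checkIfEventsAreLost() {
  tm = setTimeout(() => { lost = true; });      // fires only if nobody clears it
  target.dispatchEvent(new Event('TestEvent')); // runs the listener synchronously
}

checkIfEventsAreLost();
// `lost` remains false: the listener ran before the next tick
```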
According to MDN:
The Document.open() method [...] come(s) with some side effects. For example:
All event listeners currently registered on the document, nodes inside
the document, or the document's window are removed.
Below I provide a module (onGlobalListenerRemoval) where one can easily register some callback functions to get notified whenever listeners get cleared. This uses the same working principle as the code in Viller's answer.
Usage principle:
onGlobalListenerRemoval.addListener(() => {
  alert("All event listeners got removed!");
});
Module code:
const onGlobalListenerRemoval = (() => {
  const callbacks = new Set();
  const eventName = "listenerStillAttached";

  window.addEventListener(eventName, _handleListenerStillAttached);

  new MutationObserver((entries) => {
    const documentReplaced = entries.some(entry =>
      Array.from(entry.addedNodes).includes(document.documentElement)
    );
    if (documentReplaced) {
      const timeoutId = setTimeout(_handleListenerDetached);
      window.dispatchEvent(new CustomEvent(eventName, { detail: timeoutId }));
    }
  }).observe(document, { childList: true });

  function _handleListenerDetached() {
    // reattach event listener
    window.addEventListener(eventName, _handleListenerStillAttached);
    // run registered callbacks
    callbacks.forEach((callback) => callback());
  }

  function _handleListenerStillAttached(event) {
    clearTimeout(event.detail);
  }

  return {
    addListener: c => void callbacks.add(c),
    hasListener: c => callbacks.has(c),
    removeListener: c => callbacks.delete(c)
  };
})();
Given the code above:
binaryServer = BinaryServer({port: 9001});
binaryServer.on('connection', function(client) {
  console.log("new connection");
  client.on('stream', function(stream, meta) {
    console.log('new stream');
    stream.on('data', function(data) {
      //actions
    });
    stream.on('end', function() {
      //actions
    });
  });
});
From this I'd say that client inherits the features of binaryServer. If I call console.log(client.id) inside the stream events, I can see which client generated the given event. Now I want to know whether every single event is exclusive to one client; in other words, does a 'data' event fire for every single client (that generates data), and is no 'data' event generated while the actions are running?
You're registering a listener for the "connection" event, which binaryServer can emit. When a "connection" event happens, the registered listener receives an argument, which you chose to call client. client in this case is an object and does not inherit features of binaryServer.
"data" happens for every client, but will have unique results for each client, since you register an event listener for every client.
If two events are triggered one after the other, the callback function of the first event is called, and after that the second event's callback function is called. See the following example code:
var event = new Event('build');
var i = 0;

// Listen for the event.
document.addEventListener('build', function (e) {
  console.log(i++);
}, false);

// Dispatch the event.
document.dispatchEvent(event);
document.dispatchEvent(event);
JSFiddle (watch console)
Information about JavaScript inheritance
Information about JavaScript event loop
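The synchronous ordering can also be observed outside the browser with the standard EventTarget (a sketch, not tied to any particular library):

```javascript
// dispatchEvent is synchronous: every listener finishes before it returns,
// so two dispatches run their callbacks strictly in order.
const target = new EventTarget();
const log = [];

target.addEventListener('build', () => log.push('handled'));

target.dispatchEvent(new Event('build'));
log.push('between dispatches');
target.dispatchEvent(new Event('build'));
// log: ['handled', 'between dispatches', 'handled']
```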
I set up a websocket with socket.io in the browser console such that
socket.socket.connected
returns true. But if I then add:
socket.on('connect', function () { console.log('some'); });
nothing happens, i.e. 'some' is never logged. This is from the official socket.io page:
var socket = io.connect();
socket.on('connect', function () {
  socket.emit('ferret', 'tobi', function (data) {
    console.log(data);
  });
});
I suppose this code works. But I now suspect it only works because the socket is not yet connected when the listener is set up (i.e. socket.on...), and only because of the delay in establishing the connection is the listener active by the time the connection is established. If that is correct, it would mean that if the client were under an unreasonably high load after io.connect() is called, such that the connection is established before the listener is set up, I'd be in trouble because the listener would never fire.
Is this correct?
How can I set up a listener that fires in both cases:
The connection is not yet established when the listener is setup
The connection is already established when the listener is setup
Thank you and best regards
callBackOnConnect: function(socket, callback) {
  if (socket.socket.connected) {
    callback();
  } else {
    socket.on('connect', function() {
      callback();
    });
  }
}
AFAIK this is safe - though admittedly not pretty.
Although I couldn't find it in the documentation, I ran some tests that put a lot of strain on the client after io.connect() was called. In all cases, the client's 'connect' event fired well after the last listener had been set up. For example (thanks to www.sean.co.uk for the JavaScript delay):
var socket = io.connect();

// www.sean.co.uk
function pausecomp(millis) {
  var date = new Date();
  var curDate = null;
  do { curDate = new Date(); }
  while (curDate - date < millis);
}

// keeps processing for 5 seconds to simulate
// any CPU-intensive process
pausecomp(5000);

// listener gets set up after the delay
socket.on('connect', function() {
  // the time that the connect event fires is compared
  // with the listeners' set-up time
  console.log(new Date().getTime() - setUpTime);
});

// the time that the listeners are set up is logged
var setUpTime = new Date().getTime();
Every time I ran this, I got a positive result (rather than no result at all). Interestingly, the higher the pause, the longer after setup it took for 'connect' to fire. I know that's not proof, but it's at least suggestive.
Regarding when the listeners should be set up: as long as they come after io.connect() and aren't inside callbacks, they'll be set up before the server fires the 'connection' event, which as far as I can tell comes before the client's 'connect' event. I'd be grateful if someone could point me to a resource that details how and when these built-in events are called, but as far as I can tell they just work without worrying about these things.