I wonder if any of you can help me with this? I have developed a web application that uses an IndexedDB database with 95 object stores. The DB installs fine on Chrome, but usually produces an error on IE (where I would like to get it running, for reasons I won't discuss here). The problem is creating the 95 object stores in the onupgradeneeded routine (even if I put no data in them). Here is an example of the code:
encode(NoofCKlistItems, "NoofCKlistItems");
encode(pgtxt, "pgtxt");
// there are 95 statements like the ones above

function encode(j_ThisField, StoreName) {
    var objectStore = dbInterview.createObjectStore(StoreName);
    for (var i in j_ThisField) {
        var itemnumber = parseInt(j_ThisField[i][0], 10);
        objectStore.put(j_ThisField[i][1], itemnumber);
    }
}
I cannot think of a way round this. I have tried chaining the 95 calls to the encode function with callbacks, but that made no difference. I have tried putting a delay at the end of the encode function, but then it loses the transaction. The only things that reliably work are (a) halving the number of object stores, or (b) putting an alert at the end of the encode function, so the program has to stop at every variable (which means clicking 95 times!).
I have not found a way of adding to the object stores once the DB is installed. An upgrade event removes everything and starts again, so the problem remains.
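For reference, this is roughly the kind of incremental upgrade I was hoping would work ("interviewDB" and "extraStore" are placeholder names, not my real code): a version bump whose upgrade handler only creates the stores that are missing.
var request = indexedDB.open("interviewDB", 2); // bump the version number
request.onupgradeneeded = function (event) {
    var db = event.target.result;
    // Only create a store if it is not already there, so existing
    // stores (and their data) are left alone.
    if (!db.objectStoreNames.contains("extraStore")) {
        db.createObjectStore("extraStore");
    }
};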
Any ideas?
Many thanks
Steve Moss
I have solved the problem. I simply put a check at the end of the encode routine to see whether the last of the 95 variables is being written to the IndexedDB. If it is the last one, an alert is fired to say the download is complete. This seems to give IE the breathing space it needs to finish processing. So the encode routine becomes:
function encode(j_ThisField, StoreName) {
    var objectStore = dbInterview.createObjectStore(StoreName);
    for (var i in j_ThisField) {
        var itemnumber = parseInt(j_ThisField[i][0], 10);
        objectStore.put(j_ThisField[i][1], itemnumber);
    }
    counter++;
    if (StoreName === <THE LAST VARIABLE NAME>) { alert('complete'); }
}
It works every time.
Sorry, I missed a crucial line. I also had to add an alert when the counter reaches 45. This relates to having found that IE copes with half of the 90 variables, but not the whole 90. So the encode routine is:
function encode(j_ThisField, StoreName) {
    var objectStore = dbInterview.createObjectStore(StoreName);
    for (var i in j_ThisField) {
        var itemnumber = parseInt(j_ThisField[i][0], 10);
        objectStore.put(j_ThisField[i][1], itemnumber);
    }
    counter++;
    if (counter === 45) { alert('Click to continue downloading'); }
    if (StoreName === <THE LAST VARIABLE NAME>) { alert('Click to complete the download'); }
}
Thanks for your suggestions
This may be quite a naive question, but I really need some help.
Prior to writing this post, I was programming on JSBin. It turns out that, without realizing it, I ran a setInterval loop prompting for user input, and it kept on looping, making me unable to click anywhere to change the code and fix the loop. It kept repeating and repeating. It got to the point where I had to refresh and lose all my hard-written code (I was not logged in, so my code was not saved)! I want to avoid that next time.
So, my question is: how do I stop any such setInterval loop, so that I am able to access my code, change it, and re-run it? Below is code that demonstrates my issue, if you try running it on JSBin.com (obviously, it is not the code I wrote before). As you can see, I cannot click on my code to change it (or save it) in any way, which means I lose all my code!
This may seem like a useless question, but I really want to know ways to fix it, and perhaps fixing it from the developer tools will help me get familiar with the overwhelming set of tools it has :P. So please help me if you know a solution.
Thank you for taking your time to help me! I appreciate it.
setInterval(demo, 1);

function demo() {
    var name = prompt("Enter your name: ");
}
Another option is to search the developer tools' "Elements" panel for the iframe (this should be doable even if the main document is unresponsive due to prompt's blocking) - then just right-click the iframe element and remove it, no need to type any JavaScript. (Or, if you prefer, you can select the iframe with querySelector and remove it, e.g. document.querySelector('iframe').remove().)
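If you would rather do it from the console, a one-liner along these lines should work, assuming JSBin renders the output in an iframe:
// Remove the first iframe on the page (JSBin's output pane, assuming that is where it lives).
document.querySelector('iframe').remove();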
That's kind of a hack and should only be used in cases like the one described in the OP, but:
Almost all implementations use integers as timer ids, and the id just gets incremented on every call.
So what you can do is clear all the timeouts that were created on the page.
To do so, you first need to find out which timer id we are currently at, then call clearTimeout or clearInterval (they do the same thing) in a loop until you reach that last id:
function stopAllTimers() {
    const timerid = setTimeout(_ => {}); // first grab the current id
    let i = 0;
    while (i < timerid) {
        clearTimeout(i); // clear all earlier ids
        i++;
    }
}

btn.onclick = stopAllTimers;

// some stoopid orphan intervals
setInterval(() => console.log('5000'), 5000);
setInterval(() => console.log('1000'), 1000);
setInterval(() => console.log('3000'), 3000);

const recursive = () => {
    console.log('recursive timeout');
    setTimeout(recursive, 5000);
};
recursive();
<button id="btn">stop all timeouts</button>
Assuming the dev tools are closed, hit Esc and F12 nearly simultaneously. This should open the dev tools. If it doesn't, keep trying until it does.
Once they are open, hit Esc and F8. Again, retry until it halts JavaScript execution at some arbitrary point in the code.
In the "Sources" tab, locate the generated script for what you wrote (offhand I don't know what it would look like from within JSBin) and literally delete the var name = prompt("Enter your name: "); line. Hitting F8 again will continue execution as if the "new" code were running. This should free you up to copy/paste your code from the site itself before you refresh the page.
I am using a jQuery plugin called jQuery Phoenix. It is a plugin that makes using localStorage with forms very easy.
My question:
This script auto-saves every second. I have a form with around 275 fields, and it saves all of the fields.
(I know saving that many fields that often is a bit overkill, but it's the default setting. I'm going to change it to save on an onblur event, but it will still be saving 275 fields every time the person changes fields.)
If it is saving that often, will I run into any kind of performance issues in browsers?
I do not know much about localStorage or how it affects performance, especially when saving this many fields of data that often.
As has been mentioned, you can optimise your use of localStorage a fair bit from what is proposed, but that isn't the question.
LocalStorage is pretty fast, as confirmed in some tests written about here:
https://gomakethings.com/how-fast-is-vanilla-js-localstorage/
There, they were seeing speeds of around 12ms to write an object of 10,000 values and read it back again.
I had a play with storing each item on its own, and that tended to take between 70-100ms; but when you're 'only' dealing with around 300 values it's less than 2ms. This was the same whether the values were strings or integers.
Here's the code I used, which is largely based on the code in the linked article:
// Timestamp before the test
var start = performance.now();

// Set/get data to localStorage
var count = 10000;
for (var i = 0; i < count; i++) {
    localStorage.setItem(`perfTest_${i}`, i);
    var result = localStorage.getItem(`perfTest_${i}`);
    if (parseInt(result, 10) !== i) {
        console.error(`${result} !== ${i}`);
    }
}

// Timestamp after the test
var end = performance.now();

// Duration of the test
console.log('It took ' + (end - start) + 'ms.');
The easiest way to run these tests is just to paste them into your browser's console!
Performance is surely going to differ, and it could be an issue, since the internal implementation of localStorage is browser-specific.
There is also a time difference between the first read and subsequent reads.
Besides that, localStorage has a limited capacity (though it can be changed).
Also, you won't want to stringify everything before saving; that would be inefficient.
Note: if all 275 fields are not changing simultaneously, you can save only the fields that changed.
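For example, here is a rough sketch of saving only the field that changed, when it loses focus, rather than everything on a timer ("#myForm" and the "phoenix_" key prefix are made-up names for illustration, not part of the plugin):
// Persist only the field that just lost focus.
// "#myForm" and the "phoenix_" key prefix are made up for this example.
document.querySelector('#myForm').addEventListener('blur', function (event) {
    var field = event.target;
    if (field.name) {
        localStorage.setItem('phoenix_' + field.name, field.value);
    }
}, true); // use capture, because blur does not bubble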
I'm trying to create code that requires the least number of bytes and that works for all browsers including IE 7.
In this example, the program calls dosomething('x1') and dosomething('x2').
If I have code like this:
var items, item, index, count;
items = Array('x1', 'x2');
count = items.length;
for (index = 0; index < count; index++) {
    item = items[index];
    dosomething(item);
}
Could I reduce it to this and have it still function exactly the same in all browsers:
var a = Array('x1', 'x2'), c = a.length, i;
for (i = 0; i < c; i++) {
    f(a[i]);
}
I understand that I changed the variable names and the name of the called function, but my goal is to use the fewest bytes possible while still having the code execute.
I'm just not sure if declaring a variable equal to a property of a value from a previous variable in the same list of declarations would actually return correct results.
In other words, does var a=Array('x1','x2'),c=a.length... work, or do I have to specifically do var a=Array('x1','x2');var c=a.length; to make it work in all browsers including IE 7?
This is what the Google Closure Compiler service returned:
var a,b,c,d;a=["x1","x2"];d=a.length;for(c=0;c<d;c++)b=a[c],dosomething(b);
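As a quick sanity check that later declarations in a single var statement can read the earlier ones (they are evaluated left to right), something like the following can be pasted into a console; the names are only for illustration:
var a = Array('x1', 'x2'), c = a.length, i;
// c is initialised after a, so this shows 2 even in old browsers
alert(c);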
You can find many different JavaScript compressors online to automate the process you are hand-coding now. Still, it's always good to understand how they work, as it helps you write code that compresses better.
As for IE, you can test your code by changing the emulation settings in the IE debugger panel. Just press F12, click the Emulation tab, and adjust the document mode to 7 (IE7).
Hope this is enough to get you started in the right direction.
You can use Array.prototype.map from IE 9 onwards:
var items = Array('x1', 'x2');
items.map(dosomething);
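Since the return value of map is not used here, Array.prototype.forEach (also available from IE 9) arguably expresses the intent more directly; a small sketch:
var items = Array('x1', 'x2');
// forEach calls dosomething once per item; no new array is built.
items.forEach(function (item) {
    dosomething(item);
});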
What I want to do: Group all the like elements on a page (of a certain kind) into an object which I can later iterate on -- or apply sweeping changes to every element within.
My code successfully accomplishes the given task, but when the number of elements grows to 200-300+, performance drops off drastically and users have noticed. I have isolated the offending lines of code and want to know if there is another way of accomplishing the same thing.
The add() function appears to be the problematic operation, based on timers I have placed around it. At first the time required to perform the operation is .001 seconds, but it grows until, once the number of elements reaches 300, it takes ~.1 of a second for each additional element AND continues slowing down.
I have researched jQuery performance-enhancing techniques (and more) and have implemented a few of them (namely 3), but they have not given me any meaningful performance increase. Amazingly, this code completes within 1 second (!) in Firefox (300+ calls to add()), while Chrome and IE take roughly 10-20x longer or more...
Here is my code:
rowsToChange = $([]);

// Grab all the ids greater than whichever one I'm currently looking at:
var arr = $.makeArray($("[id^=stackLocatorLinkFillUp]:gt(" + (uniqueID - 1) + ")"));

for (var i = 0; i < arr.length; i++) {
    $this = arr[i];

    // <<< VARIOUS CONDITIONALS that make this as selective as possible REMOVED >>>

    startTimer = new Date().getTime();
    // **************************
    // PROBLEMATIC LINE FOLLOWS when 200+ records:
    rowsToChange = rowsToChange.add($this);
    // Grows from .001 to .1xx after 300 iterations
    console.log("innertiming:" + (new Date().getTime() - startTimer) / 1000);
    // **************************
}
The end result looks like this (via Chrome Inspector):
[<div style="display:none" id="stackLocatorLinkFillUp1">itemType=BOUND&ccLocale=PERIODICAL</div>,
<div style="display:none" id="stackLocatorLinkFillUp2">itemType=BOUND&ccLocale=PERIODICAL</div>,
...
]
Eventually I process all these as follows (which I love the simplicity of!):
var superlink = "...new <a> goodness to display for all elements...";
rowsToChange.html(superlink).css("display","block");
This looked like it could be a valid solution (different add method?) but I would prefer to continue gathering a list of objects together so that the last line can work its magic.
('am not i am' pointed out that the following is not true -- regarding concatenation; thanks, 'am not i am')
It seems like the add() operation must be concatenating strings since that appears to be one of the main problems others face. But transforming my add() statement into += doesn't look like it works.
Thanks for checking this out;
Chrome: 18.0.1025.142 m
Firefox: 11.0
IE: 8.0.7600.16385
First observation: add saves the previous element set. Try rowsToChange = jQuery.merge(rowsToChange, [$this]); instead.
Second observation: it seems as though rowsToChange will end up being the exact same element set as the one you called $.makeArray on. Why not just save the original set?
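For instance, a rough sketch of keeping the original jQuery set and narrowing it with .filter() instead of rebuilding it element by element (the test inside filter stands in for the conditionals removed from the question):
// Keep one jQuery set and narrow it with .filter();
// the body of the filter callback stands in for the removed conditionals.
var rowsToChange = $("[id^=stackLocatorLinkFillUp]:gt(" + (uniqueID - 1) + ")")
    .filter(function () {
        return true; // <<< VARIOUS CONDITIONALS would go here >>>
    });
rowsToChange.html(superlink).css("display", "block");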
DCoder shows how to appropriately merge the information together if you are using a for loop. However, if you come here and are using a .each() loop, use what follows.
The main difference is that the array brackets are unnecessary or necessary depending on the structure of this. It also seems to be generally accepted that .each() is at least slightly slower than a native JavaScript for loop (evidence from 2009; timing test copied from the question above).
var $this, rowsToChange = $([]);

// slower than a for loop
$("[id^=stackLocatorLinkFillUp]:gt(" + (uniqueID - 1) + ")").each(function () {
    // If statements <removed> that decide whether or not to include in the new container
    $this = $(this); // probably unnecessary under most situations
    rowsToChange = jQuery.merge(rowsToChange, $this);
});
Operate on every piece of the new sub-group decided upon by the removed if statements!
rowsToChange.html("...");
Thanks to everyone who viewed the question, took the time to answer, voted it up, etc.!
I am writing a Greasemonkey script. Recently I had this same problem twice, and I have no idea why it is happening.
function colli() {
    .....
    var oPriorityMass = bynID('massadderPriority'); // my own document.getElementById() function
    var aPriorities = [];
    if (oPriorityMass) {
        for (var cEntry = 0; cEntry < oPriorityMass.childNodes.length; cEntry++) {
            var sCollNumber = oPriorityMass.childNodes[cEntry].getAttribute('coll');
            if (bynID('adder' + sCollNumber + '_check').checked)
                aPriorities.push(parseInt(sCollNumber));
        }
    }
    .....
}
So the mystery of this is: one day I had oPriorityMass named oPriority. It was working fine, but the whole function was not yet complete, and I started working on other functions for my script. These functions have no connection with each other.
A few days later I decided to go back to my function in the above example and finish it. I ran a test on it without modifying anything and got an error in Firefox 4's JavaScript error console saying that oPriority.childNodes[cEntry] is undefined. NOTE: a few days back I had tested it in exactly the same way and there was no such problem at all.
Ok, so I decided to rename oPriority to oPriorityMass. Magically, the problem got solved.
At first I thought maybe there was some conflict between two objects with the same name being used in different functions, which somehow continued to live even outside function scope. My script is currently over 6000 lines big, but I did a search and found that oPriority was not mentioned anywhere else but in this exact function.
Can somebody tell me how and why this is happening? As I mentioned, the same thing has now happened twice; it happened in different functions, but with the same problem: node.childNodes[c] is undefined, yet node is not null and node.childNodes.length shows the correct child count.
What is going on? How do I avoid such problems?
Thank you
EDIT: The error given by the error console is
Error: uncaught exception: TypeError: oPriorityMass.childNodes[cEntry] is undefined
In response to Brock's comment:
GM_log(oPriorityMass.childNodes[cEntry]) returns undefined as the message. So node.childNodes[c] is the thing that is undefined in general.
My script creates a div window. Later, the above function uses elements in this div. The elements do have unique IDs, and I am 100% sure the original site doesn't know about them.
My script has a start/stop button to run one function or the other when I need to.
I have been refreshing the page and running my script function now. I have noticed that sometimes (but not always) the script will fail with the described error on the first run; however, if I run it again (without refreshing the page), it starts working.
The page has JavaScript that modifies it. It changes some of its element widths so the layout adapts when the browser is resized. But I know it has no effect on my div, as my div is left unchanged when I resize the browser.
EDIT2:
function bynID(sID) {
    return top.document.getElementById(ns(sID));
}

function ns(sText) {
    return g_sScriptName + '_' + sText;
}
The ns function just adds the script name in front of the ID. I use it when creating HTML elements so my elements never have the same id as anything on the web page. So bynID() is a simple function that saves some typing when I need to get an element by ID.
I have modified my colli() function to include a check:
if (oPriorityMass) {
    if (!oPriorityMass.childNodes[0]) {
        GM_log('Retrying');
        setTimeout(loadPage, 2000);
        return;
    }
    for (var cEntry = 0; cEntry < oPriorityMass.childNodes.length; cEntry++) {
        var sCollNumber = oPriorityMass.childNodes[cEntry].getAttribute('coll');
        if (bynID('adder' + sCollNumber + '_check').checked)
            aPriorities.push(parseInt(sCollNumber));
    }
}
The loadPage function does one AJAX call, then I run a few XPath queries on the result, but the actual contents are never appended/shown on the page, just kept inside a document.createElement('div'); then this function calls colli(). So now, with my function modified, I checked the error console and saw that it may take up to 5 retries for it to start working correctly. 5 x 2 seconds, that's 10 seconds. It is not always exactly 5 retries; it varies. There's got to be something else going on?
In Firefox, childNodes can include #text nodes. You should check to make sure that childNodes[cEntry] has nodeType == 1 or has a getAttribute method before trying to call it. e.g.
<div id="d0">
</div>
<div id="d1"></div>
In the above, in Firefox and similar browsers (i.e. Gecko-based browsers and WebKit-based browsers like Safari), d0 has one child node (a text node), and d1 has no child nodes.
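A quick way to see the difference in a console, using the two divs above:
var d0 = document.getElementById('d0');
var d1 = document.getElementById('d1');
console.log(d0.childNodes.length);      // 1 (just a whitespace text node)
console.log(d1.childNodes.length);      // 0
console.log(d0.childNodes[0].nodeType); // 3 (TEXT_NODE), so no getAttribute method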
So I would do something like:
var sCollNumber, el0, el1;
if (oPriorityMass) {
    for (var cEntry = 0; cEntry < oPriorityMass.childNodes.length; cEntry++) {
        el0 = oPriorityMass.childNodes[cEntry];
        // Make sure we have an HTMLElement that will
        // have a getAttribute method
        if (el0.nodeType == 1) {
            sCollNumber = el0.getAttribute('coll');
            el1 = bynID('adder' + sCollNumber + '_check');
            // Make sure el1 is not falsey before attempting to
            // access properties
            if (el1 && el1.checked)
                // Never call parseInt on strings without a radix
                // Or use some other method to convert to Number
                aPriorities.push(parseInt(sCollNumber, 10));
        }
    }
}
Given that sCollNumber seems like it is a string integer (just guessing but it seems likely), you can also use:
Number(sCollNumber)
or
+sCollNumber
whichever suits and is more maintainable.
So, according to your last edit, it now works, with the delay, right?
But when I suggested the delay, it was not meant to trigger (even more?) AJAX calls while waiting!!
NOT:
if (!oPriorityMass.childNodes[0]) {
GM_log('Retrying');
setTimeout(loadPage,2000);
return;
More like:
setTimeout (colli, 2000);
So the ajax and the other stuff that loadPage does could explain the excessive delay.
The random behavior could be caused by:
return top.document.getElementById(ns(sID));
This will cause erratic behavior if any frames or iframes are present, and you do not block operation on frames. (If you do block such operation then top is redundant and unnecessary.)
GM does not operate correctly in such cases -- depending on what the script does -- often seeming to "switch" from top scope to frame scope or vice versa.
So, it's probably best to change that to:
return document.getElementById (ns (sID) );
And make sure you have:
if (window.top != window.self) //-- Don't run on frames or iframes
return;
as the top lines of code.
Beyond that, it's near impossible to see the problem, because of insufficient information.
Either boil the problem into a Complete, Self Contained, Recipe for duplicating the failure.
OR, post or link to the Complete, Unedited, Script.