function buildRpt() {
    /* Initialize document here */
    db.transaction(
        function (trn) {
            trn.executeSql(
                'SELECT * FROM master_table',
                [],
                function (trn, result) {
                    for (var i = 0; i < result.rows.length; i++) {
                        var row = result.rows.item(i);
                        /* Add record data to document here */
                        trn.executeSql(
                            'SELECT * FROM detail_table WHERE id = ?',
                            [row.id],
                            function (trn, rslt) {
                                /* Add detail info to document */
                                for (var j = 0; j < rslt.rows.length; j++) {
                                    var tmpRow = rslt.rows.item(j);
                                    /* Update document */
                                }
                            },
                            errorHandler
                        );
                    }
                },
                errorHandler
            );
        }
    );
}
I need to get information from the client-side database and use it to populate a document. I can iterate through the records of a single table and I'm fine. However, when I try to run another query to get detail information for each record, JavaScript executes the inner query asynchronously. I looked at other questions but I am still foggy on how to accomplish this task. In the code above, the nested executeSql is processed asynchronously, so the detail information never ends up in the document. Is there a way to get the detail information for each record into the document where it belongs?
I'm not sure exactly what the question is. If the problem is that nothing comes back, use console.log to check whether any data is arriving; the issue is likely the reference to the DOM that is being updated. If the issue is getting requests back in a particular order, read on.
Deferred objects
It would help to know whether you have a framework available to you. Since you are issuing asynchronous requests in a loop, you need something to synchronize those requests, as you will not know in what order they return. For example:
requestRows ->
    rows.each ->
        ajax call -> // these return in any order
If you insist on doing this framework-free, look into deferred objects and be prepared to do a lot of work. Otherwise, if you use Dojo, see http://dojotoolkit.org/reference-guide/dojo/Deferred.html; with jQuery, I have written something that does this: https://gist.github.com/1219564
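With jQuery available, a minimal sketch of the report code above using deferreds might look like this (master_table and detail_table are the placeholder names from the question):
function buildRpt() {
    /* Initialize document here */
    db.transaction(function (trn) {
        trn.executeSql('SELECT * FROM master_table', [], function (trn, result) {
            var deferreds = [];
            for (var i = 0; i < result.rows.length; i++) {
                // An IIFE keeps each row bound to its own deferred
                (function (row) {
                    var dfd = $.Deferred();
                    trn.executeSql(
                        'SELECT * FROM detail_table WHERE id = ?',
                        [row.id],
                        function (trn, rslt) {
                            dfd.resolve({ row: row, details: rslt.rows });
                        },
                        function (trn, err) { dfd.reject(err); }
                    );
                    deferreds.push(dfd.promise());
                })(result.rows.item(i));
            }
            // Fires only after every detail query has resolved
            $.when.apply($, deferreds).done(function () {
                var pairs = Array.prototype.slice.call(arguments);
                pairs.forEach(function (pair) {
                    /* Add pair.row and pair.details to the document */
                });
            });
        }, errorHandler);
    });
}
The done handler receives one {row, details} pair per master row, in the original row order, so the document can be assembled in a single place once all the detail queries finish.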
Related
I'm trying to get a list of all code snippets of a GitLab (not GitHub!) project. The GitLab API allows me to collect at most 100 snippets per call, but provides Link headers for the other pages (next, prev, first and last; see this part of the GitLab API).
What is the best way to perform asynchronous jQuery.get calls to all the pages and then pass all results in a single callback function?
Neither of the two ideas I have come up with seems very attractive:
Make a dummy call for the first page to get the total number of pages (X-Total-Pages in getResponseHeader), then generate all the jQuery.get calls and pass them to jQuery.when. Not too bad, but the call to the first page is wasted.
Run the jQuery.get calls in a while loop and have each of them store its data in some global variable before launching the callback on that variable. The disadvantage is that the calls are not run in parallel, and the global variable does not look like a clean solution.
Since this should be a somewhat common problem, I was hoping there would be a clean solution somewhere.
EDIT
Here is an implementation of the first idea, to illustrate what I am asking for, namely: how do I avoid the first $.getJSON call, whose results are not used?
function getAllSnippets(callback) {
    // Make a dummy call to get the number of pages
    var data = {"private_token": "my_private_token", "per_page": 100, "page": 1};
    $.when(
        $.getJSON("https://myserver/api/snippets", data)
    ).then(function (data, textStatus, jqXHR) {
        // Get the number of pages (the header value is a string)
        var numPages = parseInt(jqXHR.getResponseHeader('X-Total-Pages'), 10);
        console.log(numPages + ' pages in total');
        // Generate queries for every page, including page 1 again
        var promises = [];
        for (var iPage = 1; iPage <= numPages; iPage++) {
            var pageData = {"private_token": "my_private_token", "per_page": 100, "page": iPage};
            promises.push($.getJSON("https://myserver/api/snippets", pageData));
        }
        // Collect and merge the results from all calls
        Promise.all(promises).then(function (results) {
            var answers = [];
            for (var iPage = 0; iPage < numPages; iPage++) {
                answers = $.merge(answers, results[iPage]);
            }
            callback(answers);
        }, function (err) {
            alert("Failed to collect code snippets: " + err);
        });
    });
}
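For comparison, a sketch of how the wasted request could be avoided: keep the first response as page 1 and only fetch pages 2 through numPages (same endpoint and token as above; the function name is illustrative):
function getAllSnippetsDirect(callback) {
    var data = {"private_token": "my_private_token", "per_page": 100, "page": 1};
    $.getJSON("https://myserver/api/snippets", data).then(function (firstPage, textStatus, jqXHR) {
        var numPages = parseInt(jqXHR.getResponseHeader('X-Total-Pages'), 10);
        // Seed the promise list with the already-fetched page 1
        var promises = [Promise.resolve(firstPage)];
        for (var iPage = 2; iPage <= numPages; iPage++) {
            promises.push($.getJSON("https://myserver/api/snippets",
                {"private_token": "my_private_token", "per_page": 100, "page": iPage}));
        }
        Promise.all(promises).then(function (results) {
            // Flatten the array of pages into one array of snippets
            callback([].concat.apply([], results));
        });
    });
}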
I have an issue.
So, my story is:
I have a 30 GB file (JSON) of all reddit posts in a specific timeframe.
I will not insert all values of each post into the table.
I have followed this series, where the author coded what I'm trying to do in Python.
I tried to follow along (in NodeJS), but when I'm testing it, it's way too slow. It inserts one row every 5 seconds, and with 500,000+ reddit posts that would literally take years.
So here's an example of what I'm doing:
var fs = require('fs');
var oboe = require('oboe');

var readStream = fs.createReadStream(location);
oboe(readStream)
    .done(async function(data) {
        let { parent_id, body, created_utc, score, subreddit } = data;
        let comment_id = data.name;
        // Checks if there is a comment with this post's parent id in the table
        getParent(parent_id, function(parent_data) {
            // Checks if there is a comment with the same parent id, and then checks which one has the higher score
            getExistingCommentScore(parent_id, function(existingScore) {
                // other code above, but it isn't relevant to my question
                // this function adds the query I made to a list
                addToTransaction();
            });
        });
    });
Basically, what that does is start a read stream and then pass it to a module called oboe.
I then get JSON back.
Then it checks if there is a parent already saved in the database, and then checks if there is an existing comment with the same parent id.
I need to use both functions in order to get the data that I need (only keeping the "best" comment).
This is roughly what addToTransaction looks like:
function addToTransaction(query) {
    // Adds the query to a list, then checks if that list holds 1000 or more
    transactions.push(query);
    if (transactions.length >= 1000) {
        connection.beginTransaction(function(err) {
            if (err) throw new Error(err);
            for (var n = 0; n < transactions.length; n++) {
                let thisQuery = transactions[n];
                connection.query(thisQuery, function(err) {
                    if (err) throw new Error(err);
                });
            }
            connection.commit();
            transactions = [];
        });
    }
}
What addToTransaction does is take the queries I made and push them onto a list, check the length of that list, and then create a new transaction, execute all those queries in a for loop, and finally commit (to save).
The problem is, it's so slow that the callback function I made doesn't even get called.
My question (finally) is: is there any way I could improve the performance?
(If you're wondering why I am doing this, it is because I'm trying to create a chatbot.)
I know I've posted a lot, but I tried to give you as much information as I could so you have a better chance of helping me. I appreciate any answers, and I will answer any questions you have.
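For what it's worth, the usual fix for this kind of load is to batch rows into one multi-row INSERT rather than issuing a query per row. A sketch only, assuming the mysql driver and a comments table matching the fields above (both names are illustrative):
var pending = [];

function addToBatch(row) {
    pending.push([row.comment_id, row.parent_id, row.body,
                  row.score, row.subreddit, row.created_utc]);
    if (pending.length >= 1000) {
        var rows = pending;
        pending = [];
        // The nested-array placeholder expands to (...),(...),... so
        // 1000 rows travel in a single statement and round trip
        connection.query(
            'INSERT INTO comments (comment_id, parent_id, body, score, subreddit, created_utc) VALUES ?',
            [rows],
            function (err) { if (err) throw err; }
        );
    }
}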
I am modifying a third-party web client application in which I only have access to certain js files.
The search function is limited to searching one given server node at a time, so as a workaround I hardcoded all the server nodes and created a for loop, invoking the "search" several times at different nodes.
The server responses (in the form of a FORM, without getters) are automatically handled by a callback, which then renders the view of the form. This means I am only able to display the last response, and thus only one set of results.
To handle this, I added $trs = $(tr).clone(true) to the callback function, saving all the rows from previous forms, and then I gave the last "search" call another callback, which appends the collected rows from $trList and displays the last form complete with the results from all nodes.
But the result is inconsistent: sometimes it displays results from only one server node. I suspect this is caused by variation in server response times, which changes which form renders last. I tried to add a delay with setTimeout, but that keeps me from getting any results at all.
I am very new to all this web programming (JS and jQuery both; well, CSS and HTML even, lol) and I would like to ask for your suggestions on a better way to handle this.
Thank you!
_handleConfigSubmit: function (form, error) {
    //alert("_handleConfigSubmit");
    if (form) {
        var formView = new jabberwerx.ui.XDataFormView(form);
        var that = this;
        formView.event("xdataItemSelected").bind(function(evt) {
            that.jq.find(".muc_search_button_join").removeAttr("disabled");
            var resultTable = that.jq.find(".muc_search_results table.result_table");
            resultTable.find("tr.selected").removeClass("selected");
            that._selectedItem = evt.data.selected;
            resultTable.find("tr#" + evt.data.selected._guid).addClass("selected");
        });
        var searchResultsDiv = jabberwerx.$(".muc_search_results", this.jq);
        searchResultsDiv.empty();
        this.update();
        var dim = {
            width: searchResultsDiv.width(),
            height: searchResultsDiv.height()
        };
        formView.render().appendTo(searchResultsDiv);
        formView.dimensions(dim);
        $trs = $("table.result_table tbody>tr:not(:first)").clone(true);
        if ($trList != null) {
            $trList = $trList.add($trs);
        } else {
            $trList = $trs;
        }
        $("table.result_table tbody>tr:not(:first)").remove();
        if (ctr <= 3) {
            $("table.result_table tbody").append($trList);
        } else {
            ctr++;
        }
    } else {
        this._showError(error);
    }
}
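A common pattern for this kind of fan-out is to count completed callbacks and render only once every node has responded, instead of relying on timing. A rough sketch, where the node count and function name are illustrative and not part of the jabberwerx API:
var NODE_COUNT = 4;              // assumed number of server nodes searched
var pendingNodes = NODE_COUNT;
var $collectedRows = $();        // empty jQuery set to accumulate rows

function onNodeResult() {
    // Harvest this form's result rows, then clear them from the table
    var $rows = $("table.result_table tbody > tr:not(:first)").clone(true);
    $collectedRows = $collectedRows.add($rows);
    $("table.result_table tbody > tr:not(:first)").remove();
    pendingNodes--;
    if (pendingNodes === 0) {
        // Every node has responded, in whatever order; render everything
        $("table.result_table tbody").append($collectedRows);
    }
}
Calling onNodeResult() at the end of _handleConfigSubmit's success branch would replace the ctr bookkeeping, and no setTimeout is needed.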
Following typical REST standards, I broke up my resources into separate endpoints and calls. The two main objects in question here are List and Item (and of course, a list has a list of items, as well as some other data associated with it).
So if a user wants to retrieve his lists, he might make a GET request to api/Lists.
Then the user might want to get the items in one of those lists, and make a GET to api/ListItems/4, where 4 was found from the List.listId retrieved in the previous call.
This is all well and good: the options.complete attribute of $.ajax lets me point to a callback method, so I can streamline these two events.
But things get very messy if I want to get the items for all the lists in question. For example, let's assume I have a library function called makeGetRequest that takes the endpoint and a callback function, to make this code cleaner. Simply retrieving 3 lists' worth of items the naive way results in this:
var success1 = function(elements) {
    var success2 = function(elements) {
        makeGetRequest("api/ListItems/3", finalSuccess);
    };
    makeGetRequest("api/ListItems/2", success2);
};
makeGetRequest("api/ListItems/1", success1);
Disgusting! This is the kind of thing we're smacked across the wrists for in Programming 101 and pointed to loops. But how can you do this with a loop, without relying on external storage?
for (var i = 0; i < values.length; i++) {
    makeGetRequest("api/ListItems/" + values[i], successFunction);
}

function successFunction(items) {
    // I am called once per list, each time with only ONE list's worth of items!
}
And even with storage, I would have to know when all the requests have finished and retrieved their data, and then call some master function that takes all the collected data and does something with it.
Is there a practice for handling this? This must have been solved many times before...
Try using a stack of endpoint parameters:
var params = [];
var results = [];
params.push({endpoint: "api/ListItems/1"});
params.push({endpoint: "api/ListItems/2"});
params.push({endpoint: "api/ListItems/3"});
params.push({endpoint: "api/ListItems/4"});
Then you can make it recursive in your success handler:
function getResources(endPoint) {
    var options = {}; // Ajax options
    options.success = function (data) {
        results.push({endpoint: endPoint, data: data});
        if (params.length > 0) {
            getResources(params.shift().endpoint);
        } else {
            theMasterFunction(results);
        }
    };
    $.ajax(endPoint, options);
}
And you can start it with a single call like this:
getResources(params.shift().endpoint);
Edit:
To keep everything self-contained and out of the global scope, you can wrap it in a function and provide a callback:
function downloadResources(callback) {
    var endpoints = [];
    var results = [];
    endpoints.push({endpoint: "api/ListItems/1"});
    endpoints.push({endpoint: "api/ListItems/2"});
    endpoints.push({endpoint: "api/ListItems/3"});
    endpoints.push({endpoint: "api/ListItems/4"});
    function getResources(endPoint) {
        var options = {}; // Ajax options
        options.success = function (data) {
            results.push({endpoint: endPoint, data: data});
            if (endpoints.length > 0) {
                getResources(endpoints.shift().endpoint);
            } else {
                callback(results);
            }
        };
        $.ajax(endPoint, options);
    }
    getResources(endpoints.shift().endpoint);
}
In use:
downloadResources(function(data) {
    // Do stuff with your data set
});
dmck's answer is probably your best bet. However, another option is to support bulk requests, so that your API accepts requests like api/ListItems/?id=1&id=2&id=3.
You could also add an API search endpoint, if that fits your personal aesthetic better.
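A sketch of the client side of such a bulk request, assuming the server accepts repeated id parameters; $.param with traditional = true produces exactly the id=1&id=2&id=3 form:
var query = $.param({ id: [1, 2, 3, 4] }, true);  // "id=1&id=2&id=3&id=4"
$.getJSON("api/ListItems/?" + query, function (lists) {
    // One response containing the items for every requested list
    theMasterFunction(lists);
});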
I am new to jQuery and have recently been amazed at how powerful those callbacks are. But here I have some logic and I am not sure what the best approach is.
Basically, I have a bunch of tables in a Web SQL database in Chrome. I need to go through, say, table A, table B and table C, and submit each row of each table to the server. Each row represents a bunch of complicated logic and data URLs, and the tables have to be submitted in the order A -> B -> C.
The regular Java way would be:
TableA.SubmitToServer()
{
    query table A;
    foreach(row in tableA.rows)
    {
        int nRetID = submitToServer(row);
        //do other updates....
    }
}
Similar for TableB and TableC.
Then just call:
TableA.SubmitToServer();
TableB.SubmitToServer();
TableC.SubmitToServer();
//that is very clear and easy.
But with jQuery, it will probably be:
db.transaction(function (tx) {
    var strQuery = "select * from TableA";
    tx.executeSql(strQuery, [], function (tx, result) {
        for (var i = 0; i < result.rows.length; i++) {
            submitTableARowToServer(tx, result.rows.item(i), function (tx, result) {
                //do some other related updates based on this row from tableA
                //also need to upload some related files to server...
            });
        }
    },
    errorCallback);
});
As you can see, there are already enough nested callbacks. Now, where should I put the processing for TableB and TableC? They all need similar logic, and they can only run after everything from TableA is done. So, what is the best way to do this with jQuery?
Thanks
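One way to sequence this with jQuery is to have each table's submit return a deferred that resolves when all of its rows are done, and then chain the tables. A minimal sketch, assuming the per-row submit is rewritten to return a jQuery promise (submitRowToServer is an illustrative name, not an existing function):
function submitTable(tableName) {
    var dfd = $.Deferred();
    db.transaction(function (tx) {
        tx.executeSql("select * from " + tableName, [], function (tx, result) {
            var rowPromises = [];
            for (var i = 0; i < result.rows.length; i++) {
                // Each row submission is assumed to return a jQuery promise
                rowPromises.push(submitRowToServer(result.rows.item(i)));
            }
            // Resolve once every row of this table has been submitted
            $.when.apply($, rowPromises).done(dfd.resolve).fail(dfd.reject);
        }, function (tx, err) {
            dfd.reject(err);
            return true; // roll the transaction back
        });
    });
    return dfd.promise();
}

submitTable("TableA")
    .then(function () { return submitTable("TableB"); })
    .then(function () { return submitTable("TableC"); })
    .fail(function (err) { alert("Submit failed: " + err); });
Each .then waits for the previous table's promise, so B starts only after every row of A has been submitted, which mirrors the sequential Java version above.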