How to ensure orderly processing result of a websocket's callbacks? - javascript

I have an application which listens to a websocket endpoint and processes the data received from it and saves it to a database.
The problem of race condition arises when two callbacks are invoked concurrently (for example: one task may begin processing, then another task may begin processing and update the database, then the first task may update the database - so in the end the database updates are out of order).
The solution I thought of was to record the exact time a callback is called, process the data, then attach that time to the data passed to the database; in the database, compare this time with the last update time and act accordingly.
One possible problem I thought of is that the time may be recorded out of order (for example: consider the scenario where the first callback is called, then the second callback is called and the time is recorded, then the time is recorded for the first callback).
How would you do it the right way? Are there solutions to this problem, or other ways to go about it?
EDIT: To be more specific: as I intend the program to be as real-time as possible, I'd like the most up-to-date callback to be processed without delay (without waiting for all previous callbacks to finish processing), but to ensure that the end result of the processing (as recorded in the database) adheres to the order in which the callbacks arrived (is not corrupt).

You can have the data handler callback return a promise that resolves when it's finished.
Each time you get a new data from the socket, wait for that promise before handling it, then store the resulting promise for the next data to wait for.
That would look like this:
let ready = Promise.resolve();

socket.on(..., data => {
    ready = ready.then(() => processData(data));
});
This will have no effect on any other code.
EDIT: To do expensive work outside the lock, you can write:
socket.on(..., data => {
    const result = doExpensiveWork(data); // Returns a promise
    ready = Promise.all([result, ready]).then(([result]) => insertData(result));
});
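One caveat worth adding (my note, not part of the original answer): if processData ever rejects, every later link in the chain rejects too and the queue stalls. A catch on each link keeps the chain alive:
ready = ready
    .then(() => processData(data))
    .catch(err => console.error('processing failed; continuing with next message', err));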

Related

Is non-parallel access to a method in node JS guaranteed?

JavaScript is single-threaded, and Node.js uses an asynchronous, event-driven design pattern, which means that multiple actions can be in flight at the same time while a program executes.
With this in mind, here is some pseudo code:
myFunction() // main flow

var httpCallMade = false // a global variable

async myFunction() {
    const someData = await callDB() // LINE 1 network call
    renderMethod() // LINE 2 flow1
}

redisPubSubEventHandler() { // a method that is called from redis subscription asynchronously somewhere from a background task in the program
    renderMethod() // LINE 3 flow2
}

renderMethod() {
    if (!httpCallMade) {
        httpCallMade = true // set a global flag
        const res = makeHTTPCall() // an asynchronous network call. returns a promise.
    } // I want to ensure that this block is "synchronized" and is not accessible by flow1 and flow2 simultaneously!
}
myFunction() is called in the main thread - while redisPubSubEventHandler() is called asynchronously from a background task in the program. Both flows would end in calling renderMethod(). The idea is to ensure makeHTTPCall() (inside renderMethod) is only allowed to be called once
Is it guaranteed that renderMethod() would never be executed in parallel by LINE2 and LINE3 at the same time? My understanding is that as soon as renderMethod() is executed - event loop will not allow anything else to happen in server - which guarantees that it is only executed once at a given time (even if it had a network call without await).
Is this understanding correct?
If not, how do I make synchronize/lock entry to renderMethod?
JavaScript is single-threaded. Therefore, unless you are deliberately using threads (e.g. worker_threads in node.js), no function in the current thread can be executed by two parallel threads at the same time.
This explains why javascript has no mutex or semaphore capability - because generally it is not needed (note: you can still have race conditions because asynchronous code may be executed in a sequence you did not expect).
There is a general confusion that asynchronous code means parallel code execution (multi-threading). It can, but most of the time, when a system is labeled asynchronous, non-blocking, or event-oriented INSTEAD of multi-threaded, it means the system is single-threaded.
In this case asynchronous means parallel WAIT. Not parallel code execution. Code is always executed sequentially - only, due to the ability of waiting in parallel you may not always know the sequence the code is executed in.
There are parts of javascript that execute in a separate thread. Modern browsers execute each tab and iframe in its own thread (but each tab or iframe is itself single-threaded). But script cannot cross tabs, windows or iframes, so this is a non-issue. Script may access objects inside iframes, but this is done via an API and the script itself cannot execute in the foreign iframe.
Node.js and some browsers also do DNS queries in a separate thread because there is no standardized cross-platform non-blocking API for DNS queries. But this is C code and not your javascript code. Your only interaction with this kind of multi-threading is when you pass a URL to fetch() or XMLHttpRequest().
Node.js also implements file I/O, zip compression and cryptographic functions in separate threads, but again this is C code, not your javascript code. All results from these separate threads are returned to you asynchronously via the event loop, so by the time your javascript code processes the result we are back to executing sequentially in the main thread.
Finally, both node.js and browsers have worker APIs (web workers for browsers, worker threads for node.js). However, both of these APIs use message passing to transfer data (in node only a pointer is passed in the message, thus the underlying memory is shared), and this still protects functions from having their variables overwritten by another thread.
In your code, both myFunction() and redisPubSubEventHandler() run in the main thread. It works like this:
myFunction() is called; it returns immediately when it encounters the await.
a bunch of functions are declared and compiled.
we reach the end of your script:
// I want to ensure that this method is "synchronized" and is not called by flow1 and flow2 simultaneously!
}
<----- we reach here
now that we have reached the end of script we enter the event loop...
either the callDB or the redis event completes and our process gets woken up
the event loop figures out which handler to call based on what event happened
either the await returns and calls renderMethod(), or redisPubSubEventHandler() gets executed and calls renderMethod()
In either case both your renderMethod() calls will execute on the main thread. Thus it is impossible for renderMethod() to run in parallel.
It is possible for renderMethod() to be half-executed when another call to renderMethod() happens, IF it contains the await keyword. This is because the first call is suspended at the await, allowing the interpreter to call renderMethod() again before the first call completes. But note that even in this case you are only in trouble if you have an await between the if and httpCallMade = true.
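To make that hazard concrete, here is a hedged sketch (names follow the question's pseudo code; somethingAsync and makeHTTPCall are placeholders I've stubbed in so the example runs):
// Placeholders so the sketch is self-contained:
const somethingAsync = () => new Promise(r => setTimeout(r, 10));
const makeHTTPCall = () => Promise.resolve('response');

let httpCallMade = false;

// Broken variant: an await sits between the check and the flag assignment.
async function renderMethodBroken() {
    if (!httpCallMade) {
        await somethingAsync();  // first call suspends here...
        httpCallMade = true;     // ...so a second caller can pass the check first
        await makeHTTPCall();
    }
}

// Safe variant: the flag is set synchronously, before any await.
async function renderMethodSafe() {
    if (!httpCallMade) {
        httpCallMade = true;     // no suspension between check and set
        await makeHTTPCall();
    }
}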
You need to differentiate between synchronous and asynchronous, and single- and multi-threaded.
JavaScript is single-threaded so no two lines of the same execution context can run at the same time.
But JavaScript allows asynchronous code execution (async/await), so the code in an execution context does not need to run in the order it appears in the source; different parts of the code can be executed interleaved (not overlapped). This could be called "running in parallel", but I think that is misleading.
event-driven design pattern, which means that multiple actions are taken at the same time while executing a program.
There are certain actions that can happen at the same time, like IO, multiprocessing (WebWorkers), but that is (with respect to JavaScript Code execution) not multi-threaded.
Is it guaranteed that renderMethod() would never be executed in parallel by LINE2 and LINE3 at the same time?
Depends on what you define as "parallel" and "at the same time".
Parts of the logic you describe in renderMethod() will run interleaved (since you do the request asynchronously), so renderMethod(){ if(!httpCallMade) { could be executed multiple times before you get the response (not the Promise) back from makeHTTPCall, but the code lines will never be executed at the same time.
My understanding is that as soon as renderMethod() is executed - event loop will not allow anything else to happen in server - which guarantees that it is only executed once at a given time (even if it had a network call without await).
The problem here is, that you somehow need to get the data from your async response.
Therefore you either need to mark your function as async and use const res = await makeHTTPCall(), which allows code interleaving at the point of the await, or use .then(…) with a callback, which will be executed asynchronously at a later point (after you have left the function).
But from the beginning of the function to the first await or .then, no interleaving can take place.
So your httpCallMade = true guard would prevent another makeHTTPCall from taking place before the currently running one is finished, under the assumption that you set httpCallMade back to false only when the request is finished (in the .then callback, or after the await).
// I want to ensure that this method is "synchronized" and is not called by flow1 and flow2 simultaneously!
As soon as you get a result in an asynchronous way, you can't go back to synchronous code execution. So you need a guard like httpCallMade to prevent the logic described in renderMethod from running multiple times interleaved.
Your question really comes down to:
Given this code:
var flag = false;

function f() {
    if (!flag) {
        flag = true;
        console.log("hello");
    }
}
and considering that flag is not modified anywhere else, and many different, asynchronous events may call this function f...:
Can "hello" be printed twice?
The answer is no: if this runs on an ECMAScript compliant JS engine, then the call stack must be empty first before the next job is pulled from an event/job queue. Asynchronous tasks/reactions are pushed on an event queue. They don't execute before the currently executing JavaScript has run to completion, i.e. up until the call stack is empty. So they never interrupt running JavaScript code pre-emptively.
This is true even if these asynchronous tasks/events/jobs are scheduled by other threads, lower-level non-JS code,...etc. They all must wait their turn to be consumed by the JS engine. And this will happen one after the other.
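A quick, self-contained way to see run-to-completion in action (my illustration, not from the answer):
// The timer's 0 ms timeout expires almost immediately, yet its callback
// cannot interrupt the loop; it only runs once the call stack is empty.
setTimeout(() => console.log('timer fired'), 0);
for (let i = 0; i < 1e8; i++) {} // long synchronous busy-work
console.log('loop done');
// Always prints "loop done" first, then "timer fired".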
For more information, see the ECMAScript specification on "Job". For instance 8.4 Jobs and Host Operations to Enqueue Jobs:
A Job is an abstract closure with no parameters that initiates an ECMAScript computation when no other ECMAScript computation is currently in progress.
[...]
Only one Job may be actively undergoing evaluation at any point in time.
Once evaluation of a Job starts, it must run to completion before evaluation of any other Job starts.
For example, promises generate such jobs -- See 25.6.1.3.2 Promise Resolve Functions:
When a promise resolve function is called with argument resolution, the following steps are taken:
[...]
Perform HostEnqueuePromiseJob(job.[[Job]], job.[[Realm]]).
It sounds like you want to do something like a 'debounce', where any event will cause makeHttpCall() to execute, but it should only be executing once at a time, and should execute again after the current call finishes if another event occurred while it was executing. So like this:
DB Call is made, and makeHttpCall() should execute
While makeHttpCall() is executing, you get a redis pub/sub event that should execute makeHttpCall() again, but that is delayed because it is already executing
Still before the first call is done, another DB call is made and requires makeHttpCall() to execute again. But even though you have received two events, you only need to have it called one time to update something with the most recent information you have.
The first call to makeHttpCall() finishes, but since there have been two events, you need to make a call again.
const makeHttpCall = () => new Promise(resolve => {
    // resolve after 2 seconds
    setTimeout(resolve, 2000);
});

// returns a function to call that will call your function
const createDebouncer = (fn) => {
    let eventCounter = 0;
    let inProgress = false;

    const execute = () => {
        if (inProgress) {
            eventCounter++;
            console.log('execute() called, but call is in progress.');
            console.log(`There are now ${eventCounter} events since last call.`);
            return;
        }
        console.log(`Executing... There have been ${eventCounter} events.`);
        eventCounter = 0;
        inProgress = true;
        fn().then(() => {
            console.log('async function call completed!');
            inProgress = false;
            if (eventCounter > 0) {
                // make another call if there are pending events since the last call
                execute();
            }
        });
    };

    return execute;
};

let debouncer = createDebouncer(makeHttpCall);

document.getElementById('buttonDoEvent').addEventListener('click', () => {
    debouncer();
});
<button id="buttonDoEvent">Do Event</button>

NodeJS http and extremely large response bodies

At the moment, I'm trying to request a very large JSON object from an API (particularly this one) which, depending on various factors, can be upwards of a few MB. The problem, however, is that NodeJS takes forever to do anything and then just runs out of memory: the first line of my response callback never executes.
I could request each item individually, but that is a tremendous amount of requests. To quote a dev behind the new API:
Until now, if you wanted to get all the market orders for Tranquility you had to request every type per region individually. That would generally be 50+ regions multiplied by upwards of 13,000 types. Even if it was just 13,000 types and 50 regions, that is 650,000 requests required to get all the market information. And if you wanted to get all the data in the 5-minute cache window, it would require almost 2,200 requests per second.
Obviously, that is not a great idea.
I'm trying to get the array items into redis for use later, then follow the next url and repeat until the last page is reached. Is there any way to do this?
EDIT:
Here's the problem code. Visiting the URL works fine in-browser.
// ...
REGIONS.forEach((region) => {
    LOG.info(' * Grabbing data for `' + region.name + '#' + region.id + '`');
    var href = url + region.id + '/orders/all/', next = href;
    var page = 1;
    while (!!next) {
        https.get(next, (res) => {
            LOG.info(' * * Page ' + page++ + ' responded with ' + res.statusCode);
            // ...
The first LOG.info line executes, while the second does not.
It appears that you are doing a while(!!next) loop which is the cause of your problem. If you show more of the server code, we could advise more precisely and even suggest a better way to code it.
JavaScript runs your code single-threaded. That means one thread of execution runs to completion before any other events can be processed.
So, if you do:
while (!!next) {
    https.get(..., (res) => {
        // hoping this will run
    });
}
Then, your callback to https.get() will never get called. Your while loop just keeps running forever. As long as it is running, the callback from the https.get() can never get called. That request has likely long since completed, and there's an event sitting in the internal JS event queue waiting to call the callback, but until your while() loop finishes, that event can't be processed. So you have a deadlock: the while() loop is waiting for something else to run to change its condition, but nothing else can run until the while() loop is done.
There are several other ways to do serial async iterations. In general, you can't use .forEach() or while().
Here are several schemes for async looping:
Node.js: How do you handle callbacks in a loop?
While loop with jQuery async AJAX calls
How to synchronize a sequence of promises?
How to use after and each in conjunction to create a synchronous loop in underscore js
Or, the async library which you mentioned also has functions for doing async looping.
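For completeness, here is a minimal sketch of one such scheme (my addition, not from the answer): wrap https.get in a promise and await each page before requesting the next, so the loop condition is only re-checked after the response arrives. The next field name is an assumption; adjust it to the real API's paging format.
const https = require('https');

function getJSON(url) {
    return new Promise((resolve, reject) => {
        https.get(url, (res) => {
            let body = '';
            res.on('data', (chunk) => body += chunk);
            res.on('end', () => resolve(JSON.parse(body)));
        }).on('error', reject);
    });
}

async function grabAllPages(startUrl) {
    let next = startUrl;
    let page = 1;
    while (next) {                        // safe now: we await inside the loop
        const data = await getJSON(next);
        console.log('Page ' + page++ + ' received');
        // ...store the page's items in redis here...
        next = data.next;                 // assumed paging field
    }
}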
First of all, a few MB of JSON payload is not exactly huge. So the route handler code might require some close scrutiny.
However, to actually deal with huge amounts of JSON, you can consume your request as a stream. JSONStream (along with many other similar libraries) allow you to do this in a memory efficient way. You can specify the paths you need to process using JSONPath (XPath analog for JSON) and then subscribe to the stream for matching data sets.
Following example from the README of JSONStream illustrates this succinctly:
var request = require('request')
  , JSONStream = require('JSONStream')
  , es = require('event-stream')

request({url: 'http://isaacs.couchone.com/registry/_all_docs'})
    .pipe(JSONStream.parse('rows.*'))
    .pipe(es.mapSync(function (data) {
        console.error(data)
        return data
    }))
Use the stream functionality of the request module to process large amounts of incoming data. As data comes through the stream, parse it to a chunk of data that can be worked with, push that data through the pipe, and pull in the next chunk of data.
You might create a transform stream to manipulate a chunk of data that has been parsed and a write stream to store the chunk of data.
For example:
var stream = request({ url: your_url })
    .pipe(parseStream)
    .pipe(transformStream)
    .pipe(writeStream);

stream.on('finish', () => {
    setImmediate(() => process.exit(0));
});
For info on creating streams, see https://bl.ocks.org/joyrexus/10026630

JavaScript Double Null Check and Locking

In a language with threads and locks it is easy to implement a lazy load by checking the value of a variable, if it's null then lock the next section of code, check the value again and then load the resource and assign. This prevents it from being loaded multiple times and causes threads after the first to wait for the first thread to complete the action that's needed.
Pseudo code:
if (myvar == null) {
    lock(obj) {
        if (myvar == null) {
            myvar = getData();
        }
    }
}
return myvar;
JavaScript runs in a single thread, however, it still has this type of issue because of asynchronous execution while one call is waiting on a blocking resource. In this Node.js example:
var allRecords;

module.exports = function getAllRecords(callback) {
    if (allRecords) {
        return callback(null, allRecords);
    }
    db.getRecords({}, function(err, records) {
        if (err) {
            return callback(err);
        }
        // Use existing object if it has been
        // set by another async request to this
        // function
        allRecords = allRecords || records;
        return callback(null, allRecords);
    });
}
I'm lazy loading all the records from a small DB table the first time this function is called and then returning the in-memory records on subsequent calls.
Problem: If multiple async requests are made to this function at the same time then the table is going to be loaded unnecessarily from the DB multiple times.
In order to solve this I could simulate a locking mechanism by creating a var lock; variable and setting it to true while the table is loading. I would then put the other async calls into a setTimeout() loop and check back on this variable every (say) 1 second until the data was available and then allow them to return.
The problems with that solution are:
It's fragile: what if the first async call throws and never unsets the lock?
How many times do we loop back into the timer before giving up?
How long should the timer be set for? In some environments 1 second might be way too long and inefficient.
Is there a best practise for solving this in JavaScript?
On the first call to the service, initialize an array. Start the fetch operation. Create a Promise, store it in the array.
On subsequent calls, if the data is there, return an already-fulfilled Promise. If not, add another Promise to the array and return that.
When the data arrives, resolve all the waiting Promise objects in the list. (You can throw away the list once the data's there.)
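A minimal sketch of that idea (mine, not the answerer's exact code): in practice the promise itself can serve as both the lock and the cache, so the bookkeeping array disappears. db.getRecords is the question's callback-style API.
let allRecordsPromise = null;

function getAllRecords() {
    if (!allRecordsPromise) {
        // First caller starts the query; every caller shares the same promise.
        allRecordsPromise = new Promise((resolve, reject) => {
            db.getRecords({}, (err, records) => {
                if (err) {
                    allRecordsPromise = null; // clear the lock so a later call can retry
                    return reject(err);
                }
                resolve(records);
            });
        });
    }
    return allRecordsPromise;
}
Note that rejecting also clears the cached promise, which addresses the "what if the first call throws" concern from the question.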
I really like the promise solution in the other answer -- very clever, very interesting. Promises aren't the dominant methodology, so you may need to educate the team. I'm going to go in another direction though.
What you're after is a memoize function -- an in-memory key/value cache of expensive results. JavaScript: The Good Parts has a memoize sample towards the end. Lodash has a memoize function. These assume synchronous processing, so they don't account for your scenario -- which is to say they'd hit the database lots of times until one of the "threads" replied.
The async library also has a memoize function that does exactly what you want. In its innards, it keeps a queue array of callbacks, and once it gets the answer, it both caches it and calls all the callbacks.
If you're into inventing, by all means, use promises. If you'd just like a plug-n-play answer, use async#memoize.
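A rough usage sketch, assuming async#memoize keeps the callback-queueing behavior described above (the loader below is hypothetical, in the question's callback-last style):
var async = require('async');

// Callback-last loader, the shape async#memoize expects.
function loadAllRecords(callback) {
    db.getRecords({}, callback);
}

var getAllRecords = async.memoize(loadAllRecords);

// Concurrent callers share one in-flight DB query; later calls hit the cache.
getAllRecords(function (err, records) { /* use records */ });
getAllRecords(function (err, records) { /* use records */ });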

AngularJs: Have method return synchronously when it calls $http or $resource internally

Is there a way to wait on a promise so that you can get the actual result from it and return that instead of returning the promise itself? I'm thinking of something similar to how the C# await keyword works with Tasks.
Here is an example of why I'd like to have a method like canAccess() that returns true or false instead of a promise so that it can be used in an if statement. The method canAccess() would make an AJAX call using $http or $resource and then somehow wait for the promise to get resolved.
It would look something like this:
$scope.canAccess = function(page) {
    var resource = $resource('/api/access/:page');
    var result = resource.get({page: page});
    // how to await this and not return the promise but the real value
    return result.canAccess;
}
Is there any way to do this?
In general that's a bad idea. Let me tell you why. JavaScript in a browser is basically a single threaded beast. Come to think of it, it's single threaded in Node.js too. So anything you do to not "return" at the point you start waiting for the remote request to succeed or fail will likely involve some sort of looping to delay execution of the code after the request. Something like this:
var semaphore = false;
var superImportantInfo = null;

// Make a remote request.
$http.get('some wonderful URL for a service').then(function (results) {
    superImportantInfo = results;
    semaphore = true;
});

while (!semaphore) {
    // We're just waiting.
}

// Code we're trying to avoid running until we know the results of the URL call.
console.log('The thing I want for lunch is... ' + superImportantInfo);
But if you try that in a browser and the call takes a long time, the browser will think your JavaScript code is stuck in a loop and pop up a message in the user's face giving the user the chance to stop your code. JavaScript therefore structures it like so:
// Make a remote request.
$http.get('some wonderful URL for a service').then(function (results) {
    // Code we're trying to avoid running until we know the results of the URL call.
    console.log('The thing I want for lunch is... ' + results);
});

// Continue on with other code which does not need the super important info or
// simply end our JavaScript altogether. The code inside the callback will be
// executed later.
The idea being that the code in the callback will be triggered by an event whenever the service call returns. Because event driven is how JavaScript likes it. Timers in JavaScript are events, user actions are events, HTTP/HTTPS calls to send and receive data generate events too. And you're expected to structure your code to respond to those events when they come.
Can you not structure your code such that it thinks canAccess is false until such time as the remote service call returns and it maybe finds out that it really is true after all? I do that all the time in AngularJS code where I don't know what the ultimate set of permissions I should show to the user is because I haven't received them yet or I haven't received all of the data to display in the page at first. I have defaults which show until the real data comes back and then the page adjusts to its new form based on the new data. The two way binding of AngularJS makes that really quite easy.
Use a .get() callback function to ensure you get a resolved resource.
Helpful links:
Official docs
How to add call back for $resource methods in AngularJS
You can't - there aren't any features in Angular, Q (promises) or JavaScript (at this point in time) that let you do that.
You will when ES7 happens (with await).
You can if you use another framework or a transpiler (as suggested in the article linked - Traceur transpiler or Spawn).
You can if you roll your own implementation!
My approach was to create a function using plain old JavaScript objects, as follows:
var globalRequestSync = function (pUrl, pVerbo, pCallBack) {
    var httpRequest = new XMLHttpRequest();
    httpRequest.onreadystatechange = function () {
        if (httpRequest.readyState == 4 && httpRequest.status == 200) {
            pCallBack(httpRequest.responseText);
        }
    };
    httpRequest.open(pVerbo, pUrl, false); // `false` makes the request synchronous
    httpRequest.send(null);
};
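A usage sketch (my addition): because the third argument to open() is false, the request is synchronous, so the callback has already fired by the time globalRequestSync returns. Keep in mind that synchronous XHR blocks the main thread and is deprecated in browsers.
var canAccess = false;
globalRequestSync('/api/access/home', 'GET', function (responseText) {
    canAccess = JSON.parse(responseText).canAccess; // response shape assumed
});
// canAccess is usable right here, at the cost of having frozen the UI.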
I recently had this problem and made a utility called 'syncPromises'. This basically works by sending it what I called an "instruction list": an array of functions to be called in order. You call the first one to kick things off, then dynamically attach a new .then() when each response comes back to run the next item in the instruction list, so you'll need to keep track of the index.
// instructionList is an array of functions, each returning a promise.
function syncPromises(instructionList) {
    var defer = $q.defer();

    function next(i) {
        if (i >= instructionList.length) {
            defer.resolve();
            return;
        }
        // Each function in the instructionList needs to return a promise
        instructionList[i]().then(function () {
            next(i + 1);
        });
    }

    next(0);
    return defer.promise;
}
This I found gave us the most flexibility.
You can automatically push operations etc. to build an instruction list, and you're also able to append as many .then() response handlers as you need in the calling function. You can also chain multiple syncPromises calls that will all happen in order.
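A hypothetical usage sketch (mine), with each entry being a function that returns a promise:
syncPromises([
    function () { return $http.get('/api/step1'); },
    function () { return $http.get('/api/step2'); },
    function () { return $http.get('/api/step3'); }
]).then(function () {
    console.log('all steps completed, strictly in order');
});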

WebSQL Transaction Won't Run In JS Callback Functions

I am using PhoneGap and jQuery Mobile. I am trying to get some JSON data from a remote location and then populate a local WebSQL database with it. Here is my JS function:
function getLocations() {
    var tx = window.openDatabase('csdistroloc', '1.0', 'Distro DB', 1000000);
    tx.transaction(function(tx) {
        tx.executeSql('DROP TABLE IF EXISTS locations'); // this line works!
        tx.executeSql('CREATE TABLE IF NOT EXISTS locations (id, name, address, postalcode, phone, category)'); // this line works!
        $.ajax({
            url: "http://mydomain.com/api.php",
            dataType: 'json',
            data: { action: "getlocations" },
            success: function(data) {
                tx.executeSql("INSERT INTO locations (id, name, address, postalcode, phone, category) VALUES (2,'cheese','232','seven',5,6)"); // this line produces an error
            }
        });
    }, dberror, dbsuccess);
}
Running the above function gives me an error "INVALID_STATE_ERR: DOM Exception 11" on the line noted above. It does the same thing when I am actually trying to use the returned JSON data to insert data. I have also tried the $.getJSON technique with the exact same result.
Any advice would be appreciated!
Although the accepted answer is correct, I would like to expand upon it because I encountered the same problem and that answer doesn't say why it doesn't work as the OP had it.
When you create a transaction in Web SQL, the transaction processing remains alive only so long as there are statements queued up in the transaction. As soon as the pipeline of statements in the transaction dries up, the engine closes (commits) the transaction. The idea is that when the function(tx) { ... } callback runs:
1. It executes all of the statements it needs to. executeSql is asynchronous, so it returns immediately even though the statement has not yet been executed.
2. It returns control back to the Web SQL engine.
At this point the engine notices that there are statements queued up and runs them to completion before closing the transaction. In your case, what happens is this:
1. You call executeSql twice to queue up two statements.
2. You request something through ajax.
3. You return.
The engine runs the two statements that it has queued up. The ajax request is also running asynchronously, but it must access the network, which is slow, so it likely has not completed yet. At this point, the statement queue is empty and the Web SQL engine decides that it's time to commit and close the transaction! It has no way of knowing that another statement is coming later. By the time the ajax call returns and attempts to execute the INSERT INTO locations, it's too late: the transaction is already closed.
The solution suggested by the accepted answer works: don't use the same transaction inside the ajax callback but create a new one. Unfortunately, it has the pitfall you would expect from using 2 transactions instead of 1: the operation is no longer atomic. That may or may not be important for your application.
If atomicity of the transaction is important to you, your only 2 recourses are:
1. Do everything (all 3 statements) in one transaction inside the ajax callback. This is what I recommend; it's very likely that waiting until after the ajax request completes before creating the table is compatible with your application requirements (see the sketch after this list).
2. Perform the ajax request synchronously, as explained here. I don't recommend that: asynchronous programming in JavaScript is a good thing.
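Here is a hedged sketch of recourse 1, reassembled from the question's own code (db stands for the opened database handle, and dberror/dbsuccess are the question's handlers):
$.ajax({
    url: "http://mydomain.com/api.php",
    dataType: 'json',
    data: { action: "getlocations" },
    success: function(data) {
        // One transaction, opened only after the data has arrived,
        // so the statement queue never runs dry mid-way.
        db.transaction(function(tx) {
            tx.executeSql('DROP TABLE IF EXISTS locations');
            tx.executeSql('CREATE TABLE IF NOT EXISTS locations (id, name, address, postalcode, phone, category)');
            tx.executeSql('INSERT INTO locations (id, name, address, postalcode, phone, category) VALUES (?,?,?,?,?,?)',
                [2, 'cheese', '232', 'seven', 5, 6]);
        }, dberror, dbsuccess);
    }
});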
By the way, I encountered the problem in the context of Promises, in code that looked something like this:
// XXX don't do this, it doesn't work!
db.transaction(function(tx) {
    new Promise(function(resolve, reject) {
        tx.executeSql(
            "SELECT some stuff FROM table ....", [],
            function(tx, result) {
                // extract the data that are needed for
                // the next step
                var answer = result.rows.item( .... ).some_column;
                resolve(answer);
            }
        );
    }).then(function(answer) {
        tx.executeSql(
            "UPDATE something else",
            // The answer from the previous query is a parameter to this one
            [ ... , answer, .... ]
        );
    });
});
The problem is that, with promises, the chained .then() clause is not run immediately upon resolution of the original promise. It is only queued for later execution, much like the ajax request. The only difference is that, unlike the slow ajax request, the .then() clause runs almost immediately. But "almost" is not good enough: it may or may not run soon enough to slip in the next SQL statement into the queue before the transaction gets closed; accordingly, the code may or may not produce the invalid state error depending on timing and/or browser implementation.
Too bad: Promises would have been useful inside SQL transactions. The above pseudo-example can easily be rewritten without promises, but some use cases could take great advantage of chains of many .then()s, as well as things like Promise.all, which can make sure that an entire collection of SQL statements runs in any order but all complete prior to some other statement.
I would first suggest not naming your database variable 'tx' but rather db or database. This could be a variable-naming problem, since both the function parameter and your database variable are called tx.
EDIT: I had this same problem and solved it by making the query within the callback its own transaction. Like so:
success: function(data) {
    tx.transaction(function(transaction) {
        transaction.executeSql("INSERT INTO locations (id, name, address, postalcode, phone, category) " +
            "VALUES (2,'cheese','232','seven',5,6)"); // no more DOM exception!
    });
}
I think the problem is that by the time the callback fires, the outer transaction has already completed, because WebSQL's transactions are not synchronous.
We do have a way to lock the transaction while you do AJAX or any other async operation. Basically, before calling AJAX you need to start a dummy DB operation, and on success of that operation check whether the AJAX is done or not, calling the same dummy operation again until your AJAX is done. When the AJAX is done you can then reuse the transaction object to do the next set of executeSqls. This approach is thoroughly explained in this article here. (I hope someone will not delete this answer too, as someone did earlier on a similar question.)
Try this approach
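A rough sketch of that keep-alive trick (my reconstruction of the linked article's idea, not tested code): queue a no-op statement over and over so the transaction never runs dry, then do the real work once the AJAX flag flips. db, dberror and dbsuccess are the question's handles.
var ajaxDone = false;
var ajaxData = null;

db.transaction(function(tx) {
    $.getJSON('http://mydomain.com/api.php', { action: 'getlocations' }, function(data) {
        ajaxData = data;
        ajaxDone = true;
    });
    function keepAlive(tx) {
        tx.executeSql('SELECT 1', [], function(tx) {
            if (ajaxDone) {
                // Real work, still inside the original transaction.
                tx.executeSql("INSERT INTO locations (id, name, address, postalcode, phone, category) VALUES (2,'cheese','232','seven',5,6)");
            } else {
                keepAlive(tx); // queue another no-op; the transaction stays open
            }
        });
    }
    keepAlive(tx);
}, dberror, dbsuccess);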
