Asynchronously call faceapi.detectAllFaces() - javascript

I am using face-api.js to detect faces in a given image. Everything is configured as described on GitHub and detection is working.
However, I want to call faceapi.detectAllFaces() asynchronously, meaning I don't want to wait for its result, so I can't call await faceapi.detectAllFaces(). I tried the code below:
In cam.js
async function detectFace()
{
    document.getElementById('camPic').src = getCurrentImageAsBase64(); // getCurrentImageAsBase64() gets the image in base64 format from the canvas
    return faceapi.detectAllFaces(document.getElementById('camPic'), new faceapi.SsdMobilenetv1Options());
}
In index.jsp
$('#camPrcd').click(function(e)
{
    e.preventDefault();
    detectFace().then((arr) => console.log(arr));
    console.log('detection called');
});
After the button click, I can see the "detection called" message on the console, and after some time the detection result gets logged from the then() block.
But my observation is that while detectFace() is running, the HTML page feels hung (I am not able to click any button); only once arr is printed on the console can I click on the page again.
So even though the "detection called" message is printed immediately, before detection happens, faceapi.detectAllFaces() does not appear to be doing its work asynchronously.
Note: This happens only on the first call to faceapi.detectAllFaces(). According to the author of face-api.js, the face detection model gets compiled during the first call, so detection takes longer on the first call than on subsequent calls.
So is there any way I can start detection while keeping the web page responsive, and have a callback function handle the detection result when it's finished?
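One detail worth noting, given the first-call compilation behaviour described above: a possible mitigation (not from the original post, just a hedged sketch) is to "warm up" the detector once at page load on a throwaway input, so the expensive compilation does not happen inside the click handler. This only moves the blocking work to load time rather than making it truly non-blocking; the dummy canvas below is illustrative:
// Hypothetical warm-up: run one detection on a blank canvas at startup
// so the model is already compiled before the user clicks the button.
async function warmUpDetector() {
    const dummyCanvas = document.createElement('canvas');
    dummyCanvas.width = 128;
    dummyCanvas.height = 128;
    await faceapi.detectAllFaces(dummyCanvas, new faceapi.SsdMobilenetv1Options());
    console.log('face detector warmed up');
}
warmUpDetector(); // fire-and-forget; subsequent detections should be much faster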

Related

Angular: $http.get() only fires every second onpopstate trigger

I have an AngularJS app that makes a call to an API and returns a bunch of data that users can then filter by tags for greater granularity in the results. Each time a tag is clicked to filter the data, the app makes a new $http.get() call, and the URL is modified with the appropriate query parameters so that the user can save the permalink and come back to any particular data set.
I'm trying to give the app proper history handling with window.history.pushState(), and passing the relevant query parameters for each history object as state data. I'm using window.onpopstate to detect when the back/forward buttons are clicked, and using that to make the new $http.get() call with the relevant state data from the history.
For some reason, the $http.get() function only fires on every second popstate, and then it makes two calls. It's almost as if there's some caching going on, but I haven't been able to find the culprit. This behaviour persists in both directions, backwards and forwards, and is consistently every second event. I've verified that window.history.length is only incremented by 1 for every tag added/removed, that the state data is being successfully sent, that new search queries are being correctly assembled, and that the request path is correct. It's just not firing. What's going on??
To illustrate, the behaviour flow looks like this:
Load page at /default
Add first tag: URL is /default&tags=a, $http.get() returns new data
Add second tag: URL is /default&tags=a,b, $http.get() returns new data
Add third tag: URL is /default&tags=a,b,c, $http.get() returns new data
Add fourth tag: URL is /default&tags=a,b,c,d, $http.get() returns new data
First back button event
window.onpopstate fires, URL is now /default&tags=a,b,c
No network changes
Second back button event
window.onpopstate fires, URL is now /default&tags=a,b
$http.get() fires, sends network request for data with /default&tags=a,b,c
$http.get() fires again, sends network request for data with /default&tags=a,b
dataset for /default&tags=a,b loads
Third back button event
window.onpopstate fires, URL is now /default&tags=a
No network changes
Fourth back button event
window.onpopstate fires, URL is now /default
$http.get() fires, sends network request for data with /default&tags=a
$http.get() fires again, sends network request for data with /default
dataset for /default loads
Relevant code snippet:
$scope.apiRequest = function(options, callback) {
    // Omitted: a bunch of functions to build query
    // based on user-selected tags.
    // I've verified that this is working correctly.
    $http.get(path)
        .then(function(response) {
            console.log('http request submitted');
            if (callback) {
                callback(response.data.response, response.data.count, response.data.facets);
                console.log('data returned');
            }
        }, function(response) {
            console.log('there has been an error');
        });
}
Neither the success nor error events fire. I've tried using $http.get().then().catch() to see if there might be something else going on, but for some reason I keep getting an error in my console that says that ...catch() is not a valid function, which is in and of itself bewildering. Any ideas?
Thanks!
This sounds indicative of a function not cycling through the $digest loop. In this case you may attempt to add $scope.$apply(); as the last line in your window.onpopstate handler function to kick off the $digest cycle and execute your function call.
The article Notes On AngularJS Scope Life-Cycle helped me to better understand the $digest cycle and how you can force it to run with $scope.$apply(). Keep in mind you want to use $scope.$apply() sparingly, but in some cases you are forced to kick off the cycle, especially with async callbacks.
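A minimal sketch of what that might look like, assuming the handler is registered somewhere $scope is accessible (the callback body is illustrative):
window.onpopstate = function(event) {
    // Re-run the API request using the state data saved with pushState
    $scope.apiRequest(event.state, function(data, count, facets) {
        // update $scope with the returned data here
    });
    // Force a $digest so Angular notices work triggered outside its own cycle
    $scope.$apply();
};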

Javascript window.location calls getting lost?

I am having some trouble with a bit of code. I have a function that does some stuff to some data, calls a remote system (activating a script on that system and passing in the data), and then makes another call to the same system to activate a different script (which acts on the data saved above). The problem is that the 1st call to the remote system appears to get lost in the execution.
This is being run in Safari and uses jQuery; the function is tied to a button click, which is defined in the JavaScript code with an onclick function (i.e. it is not defined in the HTML button definition).
Here's a rough breakdown of the function (cleaned out for viewing purposes - I hope I left enough to make it clear):
function compareJSON() {
    // loop through the objects, testing and changing data
    // ...
    dataSession = ({ /* build object for output */ });
    $.each( dataSession.chapters, function( indexC, value ) {
        // compare objects to some others, testing and changing data
    });
    // ...
    // Call remote script on other system
    urlString = "url://blah.dee.com/Blar?script=SaveJSON&$JSONobject=";
    window.location = urlString + JSON.stringify(dataSession);
    // Call remote script on other system
    window.location = "url://blah.dee.com/Blar?script=EditJSON";
}
The last few lines of code make the two calls. They use window.location to actually trigger the remote system, passing the data through the URL. But I need BOTH scripts to get called and run. It appears that only the LAST script in the sequence ever gets run. If I switch them around, it is still only whatever is in last place.
Is there something about the window.location that doesn't actually process until the end of the function?
This script actually used to be a series of separate function calls, but I figured I was running into asynchronous execution that was causing the various script calls to not register. But once I put the code into this single function, it was still happening.
Any clues would be helpful.
Thanks,
J
Modifying the value of window.location is reserved exclusively for instances in which you'd like to cause a browser redirect.
It looks like you want to trigger a page request instead. You say you already have jQuery loaded, if so, you can trigger such a request using jQuery.get or a similar function.
For example:
// Loads the myscript.php page in the background
$.get('myscript.php');
// You can also pass data (in the form of an object as the second argument)
$.get('myscript.php', { name: "John", time: "2pm" });
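Applied to the original function, the two window.location assignments might become something like the sketch below. This assumes the target is an HTTP(S) endpoint the page is allowed to call (the url:// scheme in the question is a custom protocol, which $.get cannot request), and it chains the second call off the first so the edit script only runs after the save completes:
// First request: save the JSON (runs in the background)
$.get('http://blah.dee.com/Blar', { script: 'SaveJSON', '$JSONobject': JSON.stringify(dataSession) }, function() {
    // Second request: trigger the edit script once the save has completed
    $.get('http://blah.dee.com/Blar', { script: 'EditJSON' });
});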

Javascript: how to make changes to the document before function finishes running?

I want to create a function that, when called, would display a "Loading..." message and then display the results as soon as it finishes. When I do it like this:
function load() {
    $('#status').html("loading...");
    /* do something */
    ...
    $('#status').html("done");
    $('#results').html(result);
}
The "loading" message never appears; after a while, all the user sees is the "done" message and the results. How can I make the "loading" text appear at the moment I want it to?
If "do something" is synchronous, the browser never gets a chance to update the UI between the two content changes.
To have the changes appear you need to do something like:
$('#status').html('loading...');
setTimeout(function() {
    // do something
    $('#status').html('done');
}, 0);
The setTimeout call gives the UI a chance to update the display before jumping into your "something".
Note that if possible you should try to ensure that "something" doesn't tie up your browser for a long time. Try to break the task up into multiple chunks, where each chunk is also dispatched using setTimeout().
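A rough sketch of that chunking idea (the data and chunk size here are placeholders, not from the original answer):
var items = []; // placeholder data to process
for (var i = 0; i < 50000; i++) { items.push(i); }

function processInChunks(list, chunkSize, done) {
    var index = 0;
    function processChunk() {
        var end = Math.min(index + chunkSize, list.length);
        for (; index < end; index++) {
            // do the expensive per-item work here
        }
        if (index < list.length) {
            // yield to the browser so it can repaint and handle input, then continue
            setTimeout(processChunk, 0);
        } else {
            done();
        }
    }
    processChunk();
}

$('#status').html('loading...');
processInChunks(items, 1000, function() {
    $('#status').html('done');
});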
Hard to tell without seeing the "stuff" part, but I hope this helps a little:
function load() {
    $('#status').html("loading...");
    function onLoaded(result) {
        $('#status').html("done");
        $('#results').html(result);
    }
    // do your stuff here
    // Not being able to see the "stuff", I guess you do some AJAX or something
    // else which is asynchronous? If you do, at the end of your callback, add
    // onLoaded(result)
}
The key is in the "do something". It depends on what you want to do but I would expect that you want to use jQuery's load() function.
Many jQuery functions accept 'callback functions' which are executed after the task is complete. The callback function section of the load() documentation should explain everything.
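For example, a minimal sketch using load() and its completion callback (the 'results.html' URL is just a placeholder):
$('#status').html('loading...');
// load() fetches the URL, inserts the response into #results,
// and runs the callback once the request has completed
$('#results').load('results.html', function() {
    $('#status').html('done');
});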

How do I write a JS function to do something and then return once that process is done?

I'm adding some functionality to an existing function. I need to insert an additional step in the middle of the current routine. I know how to go to the 2nd function but I don't know what to do to return to the main function once the 2nd routine completes.
function step1(){
    perform ajax call to see if student is assigned to a project
    step1_subfunction()
    // wait here until step1_subfunction is done
    do some more stuff with response from user
}
function step1_subfunction(){
    prompt user via jQuery dialog, 'Add or move employee to the project?'
    // return to step1 with answer returned from user and resume
}
I'd google this but I don't know if this "process" has a name. Back in my days of COBOL, we called this gosub.
UPDATED:
Step1 performs an ajax call to see if an employee has been assigned to a project. If the response.status = 'Assigned', the user will be asked via a jQuery dialog box, "Do you want to copy or move the employee to this project?". The jQuery dialog box will be step1_subroutine. The answer will be passed back to the step1 function. The remaining part of step1 will simply be to place a value in a hidden text field of "copy" or "move".
What you have will perform what you are describing, but may not make the data from the user available to function step1() without a return in function step1_subfunction(). Below I've modified your example code to demonstrate the passing of values back.
function step1(){
    // do some stuff
    var returnValFromFunction = step1_subfunction();
    // wait here until step1_subfunction is done
    // Now use returnValFromFunction, it contains the information from the user
    do some more stuff with response from user
}
function step1_subfunction(){
    prompt user for some information
    // return to step1 with information returned from user and resume
    return userResponse;
}
What you've written should work - JavaScript is single-threaded, so have you tried it?
JavaScript doesn't have subroutines specifically; just create a function that returns and ignore the result, as you have done. When the second routine completes, the scope and execution will continue in the first function.
Just do nothing.
What you are trying to achieve is just a "function call",
so it will automatically return to its caller's "stack frame" once executed.
You can make the AJAX object synchronous - i.e., no code will continue until it gets a response. It's the third parameter of open (true is asynchronous, false is synchronous).
xmlhttpobject.open('POST', 'url', false);
There are cases where a synchronous call is fine but it should always be avoided if possible.
The other alternative, which would likely require some logic changes in your code but would be better in the long run, is to bind the onreadystatechange event. This fires every time the state of the XMLHttpRequest object changes - you can check that the status is 200 and the readyState is 4 to make sure the request has completed successfully.
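A minimal sketch of that asynchronous form, using a plain XMLHttpRequest (the URL and handler name are placeholders):
var xhr = new XMLHttpRequest();
xhr.open('POST', 'url', true); // true = asynchronous
xhr.onreadystatechange = function() {
    // readyState 4 means the request is done; status 200 means it succeeded
    if (xhr.readyState === 4 && xhr.status === 200) {
        handleResponse(xhr.responseText); // illustrative handler
    }
};
xhr.send();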
Here's a better reference. Good luck.

How can I capture a click on the browser page without any possible effect on the site's robustness?

My JavaScript code is added to random websites. I would like to be able to report to my server when a (specific) link/button on the website is clicked. However, I want to do it without any possible interruption to the website's execution under any circumstances (such as an error in my code, my server being down, etc.). In other words, I want the site to perform its default action regardless of my code.
The simple way to do it is to add an event listener for the click event, call the server synchronously to make sure the call is registered, and then let the click execute. But I don't want my site and code to be able to cause the click not to complete.
Any other ideas on how to do that?
As long as you don't return false; inside your callback and your AJAX is asynchronous, I don't think you'll have any problems with your links not working.
$("a.track").mousedown(function(){ $.post("/tools/track.php") })
I would also suggest encapsulating this whole logic inside a try{} catch() block so that any errors encountered will not prevent the normal click behaviour from continuing.
Perhaps something like this? I haven't tested it so it may contain some typos, but the idea is the same...
<script type="text/javascript">
function mylinkwasclicked(id){
    try{
        // this function is called asynchronously
        setTimeout(function(){ handlingfunctionname(id); }, 10);
    }catch(e){
        // whatever error occurred, nothing was interrupted
    }
    // always return true to allow the execution of the link
    return true;
}
</script>
then your link could look like this:
<a id="linkidentifier" href="somelink.html" onclick="mylinkwasclicked(5)" >click me!</a>
or you could add the onclick dynamically:
<script type="text/javascript">
var link = document.getElementById('linkidentifier');
link.onclick=function(){mylinkwasclicked(5);};
</script>
Attach this snippet as the link's click handler:
(new Image()).src = 'http://example.com/track?url=' + escape(this.href)
+ '&' + Math.random();
It is asynchronous (the 'pseudo image' is loaded in the background)
It can be cross domain (unlike ajax)
It uses only the most basic JavaScript functionality
It can, however, miss some clicks, due to site unloading before the image request is done.
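A sketch of wiring that up (the a.track selector is borrowed from the earlier answer and is purely illustrative):
$('a.track').mousedown(function() {
    try {
        // Fire-and-forget tracking ping; the browser loads the "image" in the background
        (new Image()).src = 'http://example.com/track?url=' + escape(this.href)
            + '&' + Math.random();
    } catch (e) {
        // swallow any error so the link's default behaviour is never affected
    }
    // no return false, so the normal navigation proceeds
});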
The click should be processed normally.
1) If your JavaScript code has an error, the page might show an error icon in the status bar, but it will continue processing; it won't hang.
2) If your AJAX request is asynchronous, the page will make that request and process the click simultaneously. If your server is down and the AJAX request happening in the background times out, it won't cause the click event to go unprocessed.
If you make the request to your server synchronously, you'll block the execution of the original event handler until the response is received. If you make it asynchronously, the original behaviour of the link or button may be a form post or a change of the document's URL, which will interrupt your asynchronous request.
Delay the page exit just long enough to ping your server url
function link_clicked(el)
{
    try {
        var img = new Image();
        img.src = 'http://you?url=' + escape(el.href) + '&rand=' + Math.random();
        window.onbeforeunload = wait;
    } catch(e) {}
    return true;
}
function wait()
{
    for (var a = 0; a < 100000000; a++) {}
    // do not return anything or a message window will appear
}
So what we've done is add a small delay to the page exit to give the outbound ping enough time to register. You could also just call wait() in the click handler, but that would add an unnecessary delay to links that don't exit the page. Set the delay to whatever gives good results without noticeably slowing down the user. Anything more than a second would be rude, but a second is a long time for a request round trip that returns no data. Just make sure your server doesn't need any longer to process the request, or simply dump to a log and process it later.
