I am using an image processing API (Blitline).
By its nature, image processing takes a while to complete, let's say 3-6 seconds.
After submitting a job, the API immediately returns the future URL of my processed image, but for those 3-6 seconds that URL will return a 404, since the image has not yet finished processing.
As soon as the job finishes, the Blitline service sends a Postback to a PHP script on my server, telling me it's done.
At this point, I want to show the processed image to the user.
Is there a technology that will load the image in the user's browser at the moment the postback comes in?
I know it could be done with JavaScript polling, e.g. checking every 2 seconds whether the postback has come in yet.
But I wonder if there is a more modern way to do this?
Another issue that has to be dealt with is S3 latency. Just because an image has been uploaded to S3, and S3 has responded that it got it, doesn't mean the image is publicly available immediately. It is generally available within a few milliseconds, but this can sometimes stretch into seconds.
Since you have the URL, you can just poll S3 for the image. Here is an example:
https://coderwall.com/p/hy_qjw
The example tries to load a hidden image from S3. If it fails, it tries again in a few seconds (you can adjust the setTimeout). This works whether you are waiting for Blitline to finish or waiting for S3 to make the image available.
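For illustration, a rough sketch of that polling approach (the bucket URL, element id and 2-second interval are placeholders you would adjust):

function pollForImage(url, onReady) {
  var img = new Image();                   // hidden image used only to test availability
  img.onload = function () {
    onReady(img.src);                      // the image is publicly available now
  };
  img.onerror = function () {
    setTimeout(function () {               // still a 404 (Blitline or S3 not ready) - retry
      pollForImage(url, onReady);
    }, 2000);
  };
  img.src = url + '?t=' + Date.now();      // cache-buster so retries aren't served from cache
}

pollForImage('https://s3.amazonaws.com/my-bucket/processed.jpg', function (src) {
  document.getElementById('result').src = src;
});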
Related
Pretty new to web dev so my apologies if it's not very clear.
I have an image processor in my backend that takes a varying amount of time to process an image depending on its size, type, etc., and any number of images may be sent. I am sending each image in its own API call from my front end so that all the images can be processed in parallel.
On the front end I don't want to wait for the final response; I want to show each response as soon as my backend sends it back (say the backend receives 5 images and finishes the 4th one first: I want to show the 4th image right away and keep showing images as they arrive).
I have tried Promise.all(), but then I still have to wait until all the images have been processed. Is there another method that can help?
Simply use await instead of Promise.all() and it should work fine.
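For example, firing the uploads independently and awaiting each one on its own might look like this (the endpoint, field name, response shape and element ids are assumptions for the sketch):

const files = [...document.querySelector('#file-input').files];

files.forEach(async (file) => {
  const body = new FormData();
  body.append('image', file);                            // field name assumed for this sketch

  // Each request runs on its own; nothing waits for the other images.
  const response = await fetch('/api/process-image', { method: 'POST', body });
  const { url } = await response.json();                 // assumes the backend returns { url: ... }

  // Show this image as soon as its own request finishes.
  const img = document.createElement('img');
  img.src = url;
  document.querySelector('#results').appendChild(img);
});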
I have an anchor that, when clicked, downloads a file that first needs to be generated. This takes a few seconds, so to the user it seems like nothing is happening. I want to display a loading animation while waiting for the first byte (after which the browser shows the download progress), but I can't figure out how to do that.
I don't want to perform an ajax request and register for the progress events, because I want the browser's download manager to handle the actual download.
Is there a way to be notified when the download is actually starting?
I ended up solving it by creating 2 endpoints:
1. an endpoint that generates the file, buffers the result and returns a URL pointing at the second endpoint
2. an endpoint that returns the generated (buffered) result
The HTML button the user clicks asks the first endpoint for the URL, which doesn't return until the file has been generated. While that request is running, a loading animation is displayed to indicate to the user that the file is being generated. When the URL is returned, I stop the animation, create an (invisible) anchor, set its href to that URL, attach it to the document and click it, causing the browser to download the file (which now starts downloading immediately).
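Roughly, the client side of that flow could look like this (the endpoint paths and element ids are made up for the sketch):

$('#download-button').on('click', function () {
  $('#loading-spinner').show();                 // file is being generated

  // First endpoint: doesn't respond until the file is generated, then returns its URL.
  $.get('/generate-file', function (url) {
    $('#loading-spinner').hide();

    // Second endpoint: serves the buffered result; the browser handles the download.
    var a = document.createElement('a');
    a.href = url;
    a.style.display = 'none';
    document.body.appendChild(a);
    a.click();
  });
});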
On my website, I run a server-side script with data sent from the webpage. The problem is that the script execution (plus data transit) takes 5 to 30 seconds, which can be quite a long wait for the user, so I was wondering what the best way to create a loading/progress bar is. Do I run a first function on the server that estimates the duration and sends it to the client, or do I use repeated AJAX requests that each return the current status?
The advantage of the second option is that it's easier to just report the status after each part of the script than to estimate the duration from the data alone. It could also be more precise, since it is based on the actual time taken... But I think it could use more server resources. And if I choose that approach, what is the best delay between requests?
Thank you for your answers
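For what it's worth, the second approach (repeated status requests) might look roughly like this; the /job-status endpoint, its response shape and the one-second interval are assumptions:

function updateProgress(jobId) {
  $.getJSON('/job-status', { id: jobId }, function (status) {
    $('#progress-bar').css('width', status.percent + '%');      // assumes { percent: 0-100, done: true/false }
    if (!status.done) {
      setTimeout(function () { updateProgress(jobId); }, 1000);  // poll again in a second
    }
  });
}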
I know how to show the link after a 5-second wait with JavaScript, but I need a secure way to do it.
Anybody can view the page source, grab the link and skip the 5-second wait.
Is there any solution? I'm working with javascript and django.
Thanks!
The only secure way is to put the logic that checks the time on the server. Make an Ajax call to the server: if the elapsed time is under 5 seconds, do not return the HTML; if it is greater than 5 seconds, return the HTML to show.
The other option is to have the link point to your server: if the elapsed time is less than 5 seconds it redirects them to a different page, and if it is greater than 5 seconds it redirects them to the correct content.
Either way, it requires you to keep track of the session time on the server and take that logic out of the client.
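As a sketch of the first option, the client can just ask the server whether the wait is over and only then inject the HTML (the /check-wait endpoint and its response shape are hypothetical):

function checkWait() {
  $.get('/check-wait', function (response) {
    if (response.ready) {
      $('#link-container').html(response.html);  // server decided 5 seconds have passed
    } else {
      setTimeout(checkWait, 1000);               // ask again shortly
    }
  });
}
checkWait();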
Use a server-side timeout: whenever there is an (AJAX) request from the client for the download link, sent with a timestamp, compare the client-sent timestamp with the current time and work out how long the request needs to be held on the server side to make up ~5 seconds. By comparing timestamps you can get the waiting time almost exactly right, since network delays are taken into account automatically.
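A minimal sketch of the client side of that idea (the endpoint name is made up; the delay calculation itself happens on the server):

// Send the click timestamp along with the request for the link.
$.get('/download-link', { clickedAt: Date.now() }, function (html) {
  $('#link-container').html(html);   // server held the response until ~5 seconds were up
});

// On the server (pseudocode): delay = max(0, 5000 - (serverNow - clickedAt)); sleep(delay); respond.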
You can use AJAX: retrieve the button's source code from your back end and insert it on your page.
Something like
$.get('url', function (sourceCode) {   // 'url' is your back-end endpoint that returns the button markup
  $('#midiv').html(sourceCode);        // inject the returned markup into the page
});
I've got a bunch of JavaScript working on this page so that users can fill in the form, which includes a file upload field. They can add these forms to a "queue", which is just a series of iframes with the form data moved into them. With the click of a button it goes through each form and submits them one at a time. When each form is submitted I show a gif so there is visible activity. When the processing page is finished it spits some jQuery back at the iframe and gives a success or error message.

This works great as long as the files are not too large. It seems that larger files (near 1 GB) result in a condition where the jQuery from the processing page never shows up in the iframe. This is disastrous because the submitting page will not continue to submit forms unless it gets some sort of response. The user is also left with a spinning image that never goes away, and is unsure whether even one large file has actually uploaded.

I've tried setting max_execution_time and max_input_time to an hour, but this doesn't help at all. Currently I'm using jQuery/JavaScript to loop through each form and submit it. Can anyone tell me why this is happening and/or how to resolve this issue?
You can set the timeout for a jQuery ajax request to be longer. From the documentation:
timeout (Number)
Set a timeout (in milliseconds) for the request. This will override any global timeout set with $.ajaxSetup(). The timeout period starts at the point the $.ajax call is made; if several other requests are in progress and the browser has no connections available, it is possible for a request to time out before it can be sent. In jQuery 1.4.x and below, the XMLHttpRequest object will be in an invalid state if the request times out; accessing any object members may throw an exception. In Firefox 3.0+ only, script and JSONP requests cannot be cancelled by a timeout; the script will run even if it arrives after the timeout period.
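For reference, overriding the timeout for a single upload request might look like this (the endpoint, form id and one-hour value are just placeholders):

var formData = new FormData($('#upload-form')[0]);   // hypothetical form id

$.ajax({
  url: '/upload-handler',            // hypothetical processing endpoint
  type: 'POST',
  data: formData,
  processData: false,                // required so jQuery doesn't try to serialize the FormData
  contentType: false,
  timeout: 3600000,                  // one hour, in milliseconds
  success: function (response) { /* update the iframe / spinner here */ },
  error: function (xhr, status) { /* status will be 'timeout' if it still expires */ }
});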