How to improve upload speed of files to Google Cloud Storage? - javascript

I have implemented JavaScript code to upload files in multiple chunks to Google Cloud Storage.
Below is the flow I follow to upload a file:
1. The user selects a file to upload via the JavaScript client web app {request is from the ASIA region}
2. The JavaScript client app asks our app server, implemented in Node.js {hosted on Google Cloud Compute Engine - US region}, to allow the file upload {authorization}
3. The Node.js app server returns a signed URL to the client app
4. The client app starts uploading the file to Google Storage in multiple chunks using that signed URL
5. On successful upload, the client reports to the app server
I am able to upload files in multiple chunks, but I have observed that the upload is 2-3 times slower when I host the Node.js app server in Google Cloud's US region rather than on the same machine from which I run the client app.
Please let me know if you have a solution to improve the upload performance.
There is a workaround mentioned in the Google Cloud signed URL documentation:
Resumable uploads are pinned in the region they start in. For example,
if you create a resumable upload URL in the US and give it to a client
in Asia, the upload still goes through the US. Performing a resumable
upload in a region where it wasn't initiated can cause slow uploads.
To avoid this, you can have the initial POST request constructed and
signed by the server, but then give the signed URL to the client so
that the upload is initiated from their location. Once initiated, the
client can use the resulting session URI normally to make PUT requests
that do not need to be signed.
But with that reference, I couldn't find any code sample for:
1. Once the client receives the signed URL from the server, how is the initial JSON API call constructed?
2. What is the expected response of that first call, and how do I extract the session URI?
3. How do I use the session URI to upload further chunks?

You may be confusing two separate GCS features. GCS allows for resumable uploads to be authorized to third parties without credentials in a couple of ways.
First, and preferred, is signed URLs. Your server sends a signed URL to a client that allows that client to begin a resumable upload.
Second, and less preferred due to the region pinning you mention above, is having the server initiate a resumable upload itself and then pass the upload ID to the client.
It sounds like you want the first thing but are using the second.
Using signed URLs requires making use of the XML API, which handles resumable uploads in a similar way to the JSON API: https://cloud.google.com/storage/docs/xml-api/resumable-upload
You'll want to sign that very first POST call to create an upload and pass that URL to the user to invoke on their own.
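Here is a minimal browser-side sketch of that flow (the function name and chunk size are illustrative; it assumes the bucket's CORS configuration allows these requests and exposes the Location response header). It also answers the three questions above: the initial call is a plain POST carrying the x-goog-resumable header, the expected response is a 201 whose Location header is the session URI, and chunks are PUT to that URI with Content-Range headers.

// Sketch only, not production code. Assumes the server included the
// "x-goog-resumable: start" header when signing the initial POST.
async function uploadFileInChunks(signedPostUrl, file) {
  // 1) Initiate the resumable session from the client's own region.
  const initResponse = await fetch(signedPostUrl, {
    method: 'POST',
    headers: { 'x-goog-resumable': 'start', 'Content-Type': file.type },
  });
  if (initResponse.status !== 201) {
    throw new Error('Failed to initiate resumable upload: ' + initResponse.status);
  }

  // 2) The session URI comes back in the Location response header.
  const sessionUri = initResponse.headers.get('Location');

  // 3) PUT the chunks to the session URI; these requests need no signing.
  //    Every chunk except the last must be a multiple of 256 KiB.
  const chunkSize = 8 * 256 * 1024; // 2 MiB
  for (let start = 0; start < file.size; start += chunkSize) {
    const end = Math.min(start + chunkSize, file.size);
    const response = await fetch(sessionUri, {
      method: 'PUT',
      headers: { 'Content-Range': 'bytes ' + start + '-' + (end - 1) + '/' + file.size },
      body: file.slice(start, end),
    });
    // GCS answers 308 for intermediate chunks and 200/201 for the last one.
    if (response.status !== 308 && !response.ok) {
      throw new Error('Chunk upload failed: ' + response.status);
    }
  }
}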

Related

Is it possible to upload a file to Dropbox without storing it on my server first using PHP?

Currently, I have a PHP app running on Heroku using a PostgreSQL database. I want my users to be able to upload an image to a folder on my Dropbox, and store other information (in this case, product information such as price, title, weight, and the location of the image on Dropbox) in my database.
Right now, I'm using a file input inside an HTML form to submit the image by posting the whole form to my server (including the image), and then I use cURL to send the image to dropbox and wait for the response to succeed. On success, I create my database record that has the other information I mentioned earlier.
This works well for small files, but Heroku has a 30-second timeout that I can't change. For large files, the whole file uploads to the server, and then it uploads to Dropbox. These two upload operations are time-intensive and take more time than the timeout allows.
I had the idea of sending the file to dropbox using javascript (jQuery ajax commands specifically) so that it's handled by the client, and then POSTing to my server on success, but I'm worried about how secure that is since I would need to have my own authorization tokens in the source code that the client can view.
Is there any way for PHP to send a file from the client to an external URL without it touching the server? How do I do this securely?
This sounds like a good fit for the Dropbox API /2/files/get_temporary_upload_link endpoint.
You can call that on your server to retrieve the temporary upload link, and then pass that link down to the browser. You can then have some JavaScript code perform the upload directly from the browser using that link.
Since only the /2/files/get_temporary_upload_link endpoint call requires your Dropbox access token (whereas the temporary upload link itself doesn't), you can keep your access token secret on the server only, without exposing it to the client. And since the upload happens directly from the browser to the Dropbox servers, you don't have to pass the file data through your own server, avoiding the timeout issue.
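The question uses PHP, but the two halves of that flow look roughly like this in JavaScript (a sketch; the path, duration, and variable names are illustrative):

// Server side (Node.js 18+): exchange the secret access token for a
// short-lived upload link. The token never leaves the server.
const linkResponse = await fetch('https://api.dropboxapi.com/2/files/get_temporary_upload_link', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer ' + process.env.DROPBOX_ACCESS_TOKEN,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    commit_info: { path: '/product-images/upload.jpg', mode: 'add', autorename: true },
    duration: 3600, // seconds the link stays valid
  }),
});
const { link } = await linkResponse.json();
// ...return link to the browser...

// Browser side: POST the raw file bytes straight to Dropbox, so the
// file never passes through the Heroku dyno.
await fetch(link, {
  method: 'POST',
  headers: { 'Content-Type': 'application/octet-stream' },
  body: fileInput.files[0],
});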

Request timeout while uploading movie to app service

I am having trouble uploading a big movie to the Azure App Service I created. I get a request timeout after 4-5 minutes while uploading the movie (greater than 150 MB). For the frontend, I am using Vue.js and send multiple files with Promise.allSettled. I don't have any issues when running it locally. For the backend, I am using Node.js (Fastify) with the multer package and its in-memory storage option. Once I receive a file, I upload it to Azure Blob Storage.
Do I have to send the movie data in chunks from the frontend to the backend? How do I achieve that when I have multiple files?
Can we use Socket.IO? I tried it, but my browser freezes if I send a big file, and I am totally new to sockets.
I am not sure how I can fix this issue. It would be great if someone could guide me and show me an example.
Looking forward to hearing from you.
Thanks,
meet
There are a couple of common causes of upload problems like this:
1. Check the timeout in your axios request (front end), because the client has to wait until all the files have been uploaded to the server (https://github.com/axios/axios#creating-an-instance).
2. Check the hosting configuration of your domain; if your backend service sits behind nginx, check the upload size limit (https://www.tecmint.com/limit-file-upload-size-in-nginx/).
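As a sketch, raising the client-side timeout via an axios instance looks like this (the backend URL is a placeholder; note that axios's default timeout is 0, i.e. no limit, so this only matters if a timeout was configured somewhere):

import axios from 'axios';

// If a timeout is configured, it has to cover the whole upload.
const uploadClient = axios.create({
  baseURL: 'https://your-app.example.com', // placeholder backend URL
  timeout: 10 * 60 * 1000, // e.g. 10 minutes, or 0 to disable entirely
});

await uploadClient.post('/upload', formData, {
  headers: { 'Content-Type': 'multipart/form-data' },
});

On the nginx side, the directive to raise is client_max_body_size, as described in the linked article.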

Is Cloudinary API secure if I do the following

I am using the base API URL below, as in many examples: https://api.cloudinary.com/v1_1/name.
But if I use dotenv to hide the cloud name and the upload preset, as below, will that keep my API secure, or will people be able to find them from the image URLs that are returned when an image is uploaded?
formData.append('upload_preset', process.env.REACT_APP_UPLOAD_PRESET);
`https://api.cloudinary.com/v1_1/${process.env.REACT_APP_CLOUD_NAME}/image/upload`,
If you're allowing uploads from client-side code and are sending them from the client to Cloudinary directly, users will always be able to see the cloud name and upload preset name you use, if not easily in your app's source, then certainly via a proxy or other debug tools.
However, that's expected, and it is the reason the unsigned upload option exists: unsigned uploads let you perform uploads in cases where the client can't authenticate itself with a server component, by using the upload preset you specify. What happens to the uploaded files is determined by the pre-configured options in the upload preset, so you can name them a certain way, put them in a specific folder, add tags, edit the images via resizing or other transformations before they're saved, etc.
If you don't want to expose the cloud name or upload preset name, you'll need to pass the files to a server endpoint you control and then upload them to Cloudinary from there. That would put you in the same basic situation, where the client code has the ability to upload files without authentication [or using authentication your users can see and can copy], except it would be the endpoint on your server allowing that rather than Cloudinary's /v1_1/.
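For reference, a minimal unsigned upload from the browser looks like this (a sketch using the same environment variables as the question):

const formData = new FormData();
formData.append('file', fileInput.files[0]);
formData.append('upload_preset', process.env.REACT_APP_UPLOAD_PRESET);

const response = await fetch(
  'https://api.cloudinary.com/v1_1/' + process.env.REACT_APP_CLOUD_NAME + '/image/upload',
  { method: 'POST', body: formData }
);
// The response includes the delivery URL of the stored image, which
// itself contains the cloud name - another reason hiding it is futile.
const { secure_url } = await response.json();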

Upload file to personal Dropbox from public website without exposing private key

I receive images for processing on my site, and I want to skip uploading them to my local server and then adding them to a Dropbox folder from there, by implementing direct upload to Dropbox from the web browser.
The Dropbox API is pretty simple, but it has one problem: I need to expose the API key to the end user, and that key allows downloading all the pictures from my account.
So if I use the Dropbox API, one user can download images uploaded by others, which is not an acceptable scenario.
Is there any way around this limitation of exposing a full-access API key to website end users?
Unfortunately, no, in order to upload directly to Dropbox via the Dropbox API, the client needs the access token. Further, the Dropbox API doesn't offer an upload-only permission.
Fundamentally, clients can't keep secrets, so any access token that exists client-side (in this case, in the browser) can be extracted and abused.
Edit:
The Dropbox API now offers some new pieces of functionality that may be useful here:
A) The Dropbox API now offers "scopes", which you can use to restrict an app or access token to a limited set of functionality, such as the ability to write but not read files.
You can find more information about the release in our blog post here:
https://dropbox.tech/developers/now-available--scoped-apps-and-enhanced-permissions
B) The Dropbox API now offers the /2/files/get_temporary_upload_link endpoint, which can be used to get a URL that a client can POST to in order to upload to a Dropbox account without an access token:
https://www.dropbox.com/developers/documentation/http/documentation#files-get_temporary_upload_link

How to secure an API used only from front-end (Ajax call)

Well, I created an API to manage attachment uploads for our websites and store them in Amazon S3 buckets.
The scenario: a visitor/user fills in the form and wants to submit it with an attachment. Once the file is selected and the button is clicked, an Ajax request fires to the microservice API so it can store the file in S3, do some processing, and then return the direct link or identifier.
The question is: how can we authenticate the user, for example with a short-lived token or something like that, without the token being hijacked or misused?
In JavaScript everything is visible to the visitor, and we try not to put any heavy processing in the backend.
If I understood your question correctly, you have a web interface through which files are uploaded to an S3 bucket, and you need to make sure that in a certain backend API (such as REST) all file upload commands carry authentication and authorization.
The answer is highly dependent on your architecture, but generally speaking, all JavaScript calls are nothing but HTTP calls. So you need HTTP authentication/authorization. In general, the most straightforward method for REST over HTTP is basic authentication, in which the client sends a credential in every single request. This may sound odd at first, but it is quite standard, since HTTP is supposed to be stateless.
So the short answer, at least for the scenario I just described, would be to ask the user to provide the credentials that Javascript will keep in the client side, then send basic authentication which the REST interface can understand. The server-side processes will then get such information and decide whether a certain file can be written in a certain S3 bucket.
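A minimal sketch of what that looks like from the browser (the endpoint and variable names are illustrative; basic credentials must only ever travel over HTTPS):

// Base64-encode "username:password" for the Authorization header.
const credentials = btoa(username + ':' + password);

const response = await fetch('https://api.example.com/attachments', {
  method: 'POST',
  headers: { 'Authorization': 'Basic ' + credentials },
  body: formData, // the attachment from the form's file input
});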
