Request timeout while uploading movie to app service - javascript

I am having trouble uploading a big movie to the Azure App Service I created. I get a request timeout after 4-5 minutes while uploading the movie (greater than 150 MB). For the frontend, I am using Vue.js and send multiple files via Promise.allSettled. I don't have any issues when running locally. For the backend, I am using Node.js (Fastify) with the multer package and the in-memory storage option. Once I receive a file, I upload it to Azure Blob Storage.
Do I have to send the movie data in chunks from the frontend to the backend? How do I achieve that when I have multiple files?
Can we use Socket.IO?
I tried using Socket.IO; however, my browser freezes if I send a big file, and I am totally new to sockets.
I am not sure how to fix this issue. It would be great if someone could guide me and show me an example.
Looking forward to hearing from you guys
thanks,
meet

Problems when uploading files to a server:
Check the timeout in your axios request (frontend), because you have to wait until all the files are uploaded to the server (https://github.com/axios/axios#creating-an-instance); see the sketch after this list.
Check the hosting configuration of your domain: if you are hosting your backend service behind nginx, check its upload size limit (https://www.tecmint.com/limit-file-upload-size-in-nginx/).
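A minimal sketch of the axios side, assuming the backend exposes an /upload endpoint; the base URL, endpoint, and timeout value are placeholders to adapt:

```javascript
// upload.js - browser-side sketch: an axios instance with a generous timeout,
// sending each file in its own request and collecting results with allSettled.
import axios from "axios";

const api = axios.create({
  baseURL: "https://your-app.azurewebsites.net", // placeholder backend URL
  timeout: 10 * 60 * 1000, // 10 minutes; proxies and hosts may still enforce their own limits
});

async function uploadFiles(files) {
  const uploads = Array.from(files).map((file) => {
    const form = new FormData();
    form.append("file", file);
    // axios sets the multipart boundary automatically for FormData bodies
    return api.post("/upload", form);
  });
  return Promise.allSettled(uploads);
}
```

Note that raising the client timeout only helps if the hosting platform and any reverse proxies in front of it also allow long-running requests.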

Related

Best way to handle GCP file downloads

I'm currently working on a project that uses GCP Storage to store recorded audio files. The way it is designed, a user requests a file from the Node.js server, which in turn pipes the content of the file from GCP to the client. However, I'm seeing a big spike in latency when multiple users (<10) download files at the same time. My thought is that the Node.js server should just return the URI of the GCP file to the client, and the client should download it directly from GCP using an authentication token. Am I correct, or should Node.js indeed act as a proxy between GCP and the client to pipe the data?
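For reference, a minimal sketch of the direct-download approach the question proposes, assuming the @google-cloud/storage Node.js client; the bucket and object names are placeholders:

```javascript
// signed-download.js - return a short-lived signed URL instead of piping bytes
// through the Node.js server.
const { Storage } = require("@google-cloud/storage");

const storage = new Storage();

async function getDownloadUrl(bucketName, fileName) {
  const [url] = await storage
    .bucket(bucketName)
    .file(fileName)
    .getSignedUrl({
      version: "v4",
      action: "read",
      expires: Date.now() + 15 * 60 * 1000, // valid for 15 minutes
    });
  return url; // hand this to the client; it downloads directly from GCS
}
```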

Express upload a file on load balancing?

Recently I made an app in Node.js with a load-balancing feature. I made one server just for the DB itself and others for handling requests. The problem is that in the app you can upload a file with multer, and Express stores the uploaded file on the server that handled the request and serves it with express.static.
For example, I have 4 servers: one for the DB, 2 for the apps, and 1 for the load balancer.
When the load balancer sends a request to the app-1 server, the file is uploaded to the app-1 server, not to the DB server. So when I try to access the file from the app-2 server, the file doesn't exist.
Is there any way to solve this problem?
Or a better way to use the load balancer?
I'm new to load balancers. Thanks.
Where are you storing the file that has been uploaded through app-1? You can, for example, store it in a bucket and keep the reference in your SQL DB. Then, when app-2 tries to access the file, it directly asks the DB, which is shared between your apps.
To answer your question, you have 2 solutions:
maintaining session persistence with your load balancer, so a user who sends a request to app-1 keeps hitting app-1 until their session expires;
having a stateless backend design, meaning that you don't need sessions, so any user can send a request to any app instance you have and it will behave the same.
I would go with solution 2; it's easier once you get your head around the stateless concept.
Having an external bucket, and not relying on your app's internal memory and storage, is a good way to implement a stateless architecture.
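A minimal sketch of that stateless setup, assuming Express, multer with in-memory storage, and the @google-cloud/storage client; the bucket name, route, and DB call are placeholders:

```javascript
// upload-route.js - any app instance can serve this route because the file
// lands in a shared bucket, not on the instance's local disk.
const express = require("express");
const multer = require("multer");
const { Storage } = require("@google-cloud/storage");

const app = express();
const upload = multer({ storage: multer.memoryStorage() });
const bucket = new Storage().bucket("my-shared-uploads"); // placeholder bucket name

app.post("/upload", upload.single("file"), async (req, res) => {
  const blob = bucket.file(`${Date.now()}-${req.file.originalname}`);
  await blob.save(req.file.buffer, { contentType: req.file.mimetype });

  // Keep only the reference in the shared DB so app-2 can find the file later,
  // e.g. await db.query("INSERT INTO uploads (path) VALUES ($1)", [blob.name]);

  res.json({ path: blob.name });
});

app.listen(3000);
```

The same idea works with any object store (S3, Azure Blob Storage, etc.); the important part is that no app instance keeps files on its own filesystem.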

Image upload to Heroku App

I am currently investigating a way to upload images to a Heroku app where I have a Python application that takes in the images, has them classified, and saves the results in a .csv file.
The images can be selected for upload via a website that uses JavaScript and HTML.
My question now is: how would I best enable the upload from the website to the Heroku app?
Bear in mind that the frontend is currently running on my local machine and that I want to use Heroku as a backend to take in either images or strings.
Will I need an SSH connection to a separate web server? Will I need to use Amazon S3?
I'm not looking for a complete solution to my problem per se, but if someone could point me in the right direction as to what I will need to solve it, that would be great.
You could upload an image to Heroku; however, there are two problems with that:
Heroku's router times out requests after 30 seconds, which means that if your users have a spotty connection and/or huge files, the upload will fail.
Heroku's ephemeral filesystem means that you must process the file in your web process, because workers run on different dynos and don't have access to your web dyno's filesystem. So the processing also has to fit inside that 30-second window.
Your best bet is to have your users upload their files directly to S3 from their browsers. We had a good experience with the filestack.com JS widget, but there are other ways.
Your page will then ping your backend with the newly uploaded file's S3 URL, and the backend will launch an asynchronous job on a Heroku worker to process it.
This neatly solves all the issues with timeouts and blocking your web dynos.
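A minimal browser-side sketch of that flow, assuming the backend exposes hypothetical /presign and /process endpoints (both names are placeholders, not part of any real API):

```javascript
// direct-upload.js - the browser uploads straight to S3, then notifies the backend.
async function uploadImage(file) {
  // 1. Ask the backend for a presigned S3 PUT URL for this file (hypothetical endpoint).
  const { uploadUrl, s3Url } = await fetch(
    `/presign?filename=${encodeURIComponent(file.name)}`
  ).then((res) => res.json());

  // 2. Upload directly to S3, so Heroku's 30-second router limit never applies.
  await fetch(uploadUrl, {
    method: "PUT",
    headers: { "Content-Type": file.type },
    body: file,
  });

  // 3. Tell the backend where the file lives so a worker dyno can process it asynchronously.
  await fetch("/process", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ s3Url }),
  });
}
```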

How to improve upload speed of files to google storage?

I have implemented JavaScript code to upload files in multiple chunks to Google Cloud Storage.
Below is the flow I execute to upload a file:
1. The user selects a file to upload in the JavaScript client web app {the request comes from the ASIA region}.
2. The JavaScript client app asks our app server, implemented in Node.js {hosted on Google Cloud Compute Engine in the US region}, to authorize the file upload.
3. The Node.js app server returns a signed URL to the client app.
4. The client app starts uploading the file to Google Storage in multiple chunks using that signed URL.
5. On successful upload, the client reports back to the app server.
I am able to upload files in multiple chunks, but I have observed that the upload is 2-3 times slower if I host the Node.js app server in the Google Cloud US region rather than on the same machine from which I am running the client app.
Please let me know if you have a solution for improving the upload performance.
There is a workaround mentioned in the Google Cloud signed URL documentation:
Resumable uploads are pinned in the region they start in. For example,
if you create a resumable upload URL in the US and give it to a client
in Asia, the upload still goes through the US. Performing a resumable
upload in a region where it wasn't initiated can cause slow uploads.
To avoid this, you can have the initial POST request constructed and
signed by the server, but then give the signed URL to the client so
that the upload is initiated from their location. Once initiated, the
client can use the resulting session URI normally to make PUT requests
that do not need to be signed.
But with that reference:
I couldn't find any code sample showing how, once the client receives the signed URL from the server, the initial JSON API call should be constructed.
What is the expected response to that first call, and how do I extract the session URI?
How do I use the session URI to upload further chunks?
You may be confusing two separate GCS features. GCS allows for resumable uploads to be authorized to third parties without credentials in a couple of ways.
First, and preferred, is signed URLs. Your server sends a signed URL to a client, which allows that client to begin a resumable upload.
Second, and less preferred due to the region pinning you mention above, is having the server initiate a resumable upload itself and then pass the upload ID to the client.
It sounds like you want the first thing but are using the second.
Using signed URLs requires making use of the XML API, which handles resumable uploads in a similar way to the JSON API: https://cloud.google.com/storage/docs/xml-api/resumable-upload
You'll want to sign that very first POST call to create an upload and pass that URL to the user to invoke on their own.
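A minimal sketch of that flow, assuming the @google-cloud/storage Node.js client on the server and fetch on the client; the bucket name, object name, and chunk size are placeholders:

```javascript
// Server side: sign the initial resumable POST so the client can initiate the
// upload session from its own region.
const { Storage } = require("@google-cloud/storage");

async function getResumableSignedUrl(bucketName, objectName) {
  const [url] = await new Storage()
    .bucket(bucketName)
    .file(objectName)
    .getSignedUrl({
      version: "v4",
      action: "resumable",
      expires: Date.now() + 15 * 60 * 1000,
    });
  return url;
}

// Client side: POST once to create the session, then PUT chunks against the
// session URI returned in the Location header. For browser clients, the
// bucket's CORS config must expose the Location response header.
async function uploadInChunks(signedUrl, file, chunkSize = 8 * 1024 * 1024) {
  const init = await fetch(signedUrl, {
    method: "POST",
    headers: { "x-goog-resumable": "start" },
  });
  const sessionUri = init.headers.get("Location");

  // Chunk sizes must be multiples of 256 KiB; each PUT carries a Content-Range.
  for (let start = 0; start < file.size; start += chunkSize) {
    const end = Math.min(start + chunkSize, file.size);
    await fetch(sessionUri, {
      method: "PUT",
      headers: { "Content-Range": `bytes ${start}-${end - 1}/${file.size}` },
      body: file.slice(start, end),
    });
  }
}
```

Intermediate chunks are answered with a 308 status until the final chunk completes; a production version would check each response and resume from the last committed byte.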

Where to store a large amount of media files in a Node.js eco-system?

Long-time lurker, first-time poster. Hi.
I've got a Node.js backend server serving React.js on the front end. I'm currently uploading a huge amount of mp3 and wav files to the server itself. That is, a user uploads a file in my front end, and I create a folder on the server the Node instance is running on and store the mp3/wav there, pertaining to that user.
The project is moving out of development into production, and I'm wondering, from a scalability perspective: a) how bad is this practice, b) what my best options are for hosting, and c) what alternatives there are to storing files on the server itself.
There is an existing user base of about 500 users, each of which uploads about 600MB - 1.5GB of media every 1.5 months.
Any insight would be great, as searching seems inconclusive. Thanks!
I suggest you integrate with a cloud storage/CDN service, e.g. Dropbox, Google, or AWS. They have very flexible APIs, including role-based access and authentication.
Even if you want to keep files on your own server, I suggest running a separate server only for uploading/downloading files, with OAuth-based authentication.
In case you also want to go for streaming, there are cloud services that offer streaming support, like wows or airplayit.
