Image upload to Heroku App - javascript

I am currently investigating a way to upload images to a Heroku app where I have a Python application that takes in the images, classifies them, and saves the results in a .csv file.
The images can be selected for upload via a website that uses JavaScript and HTML.
My question now is: how would I best enable the upload from the website to the Heroku app?
Bear in mind that the frontend is currently running on my local machine and that I want to use Heroku as a backend to take in either images or strings.
Will I need an SSH connection to a separate web server? Will I need to use Amazon S3?
I'm not looking for a complete solution to my problem per se, but if someone could point me in the right direction as to what I will need to solve my problem, that would be great.

You could upload an image to Heroku; however, there are two problems with that:
The Heroku router times out requests after 30 seconds, which means that if your users have a spotty connection and/or huge files, the upload will fail.
Heroku's ephemeral filesystem means that you must process the file in your web process, because workers run on different dynos and don't have access to your web dyno's filesystem. So that's another strike against the 30-second timeout.
Your best bet is to have your users upload their files directly to S3 from their browsers. We had a good experience with filestack.com's JS widget, but there are other ways.
Your page will then ping your backend with the newly uploaded file's S3 URL. The backend will launch an asynchronous job on a Heroku worker to process it.
This neatly solves all the issues with timeouts and blocking your web dynos.
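A minimal sketch of that flow, assuming your backend exposes a hypothetical /sign-upload endpoint that hands back a presigned S3 PUT URL (all endpoint names here are illustrative, not part of any existing API):

```javascript
// Browser side: upload straight to S3, then notify the backend.
// Assumes a hypothetical /sign-upload endpoint on the Heroku app that
// returns { uploadUrl, fileUrl } for a presigned S3 PUT.
async function uploadImage(file) {
  // 1. Ask the backend for a presigned URL (fast, well under 30 s).
  const res = await fetch('/sign-upload?filename=' + encodeURIComponent(file.name));
  const { uploadUrl, fileUrl } = await res.json();

  // 2. PUT the file directly to S3; the Heroku router never sees the bytes.
  await fetch(uploadUrl, {
    method: 'PUT',
    headers: { 'Content-Type': file.type },
    body: file,
  });

  // 3. Ping the backend with the S3 URL so a worker dyno can classify it.
  await fetch('/process', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ url: fileUrl }),
  });
}
```

The point of the split is that only steps 1 and 3 touch your dyno, and both are quick JSON round trips; the slow byte transfer in step 2 goes to S3.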


How to fetch data as .csv file from client with node.js application

I have written a game in JavaScript with the p5.js library. Now I want to host the game on a server to conduct a survey on a service like Amazon Mechanical Turk. Ideally the clients receive a URL to the game and play it while in-game actions are tracked and stored in Node.js or on the server, then exported as a .csv file once they are done playing. After they finish the game, the .csv file should be sent automatically to a location that I can then access. I have zero experience in server hosting or similar topics.
So a couple of questions arise:
Is a hosting service like Heroku suitable for hosting the game?
Do I need to use Node.js to make this happen?
Which of those two would extract the data and store it to a .csv? And where is the file stored?
How do I get or access the .csv afterwards?
Any alternative takes to solve the problem?
Thanks a lot in advance!
github repository: https://github.com/luuuucaaa/schaeffers-charade
game on github pages: https://luuuucaaa.github.io/schaeffers-charade/
If I were you, I would do it like below:
Host
Since your project is basically static HTML & JavaScript content, AWS S3's static hosting would be sufficient (also, the current GitHub Pages site is another option if you just want to host it).
Hosting in a Node.js environment is also possible using webpack serving, but it requires additional work (though if you require other npm packages to generate the .csv file, you need webpack anyway to bundle the JS file and attach it to the HTML).
Data Storing
Two approaches are worth considering:
The first is to store it on the filesystem: generate the .csv via a JS script within your app and save it where the app is hosted (if you go with S3, you can access it afterwards, but I'm not sure whether it can write objects by script).
The second is to post the data to another API endpoint (for example, building an API Gateway on AWS that triggers a Lambda, which stores it on S3); a sketch of this follows.
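A minimal sketch of that second option, assuming a placeholder API Gateway URL and an invented row format (nothing here reflects your actual setup):

```javascript
// Browser side: turn tracked in-game actions into a CSV string and POST it.
// The endpoint URL is a placeholder for e.g. an AWS API Gateway route
// backed by a Lambda that writes the request body to S3.
const actions = []; // push { time, event } objects here during play

function actionsToCsv(rows) {
  const header = 'time,event';
  const body = rows.map(r => `${r.time},${r.event}`).join('\n');
  return header + '\n' + body;
}

async function submitResults() {
  await fetch('https://example.execute-api.amazonaws.com/prod/results', {
    method: 'POST',
    headers: { 'Content-Type': 'text/csv' },
    body: actionsToCsv(actions),
  });
}
```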
These are merely examples and I don't know exactly what you want to achieve, but take them into consideration. Good luck. Cool game, BTW.

Request timeout while uploading movie to app service

I am having trouble uploading a big movie to the Azure App Service which I created. I get a request timeout after 4-5 minutes while uploading the movie (greater than 150 MB). For the frontend, I am using Vue.js and send multiple files using a Promise.allSettled call. I don't have any issues while using it locally. For the backend, I am using Node.js (Fastify) with the multer package and the in-memory storage option. Once I receive a file, I upload it to Azure Blob Storage.
Do I have to send movie data in chunks from the frontend to the backend? How do I achieve that when I have multiple files?
Can we use Socket.IO?
I tried using Socket.IO; however, my browser freezes if I send a big file, and I am totally new to sockets.
I am not sure how I can fix this issue. It would be great if someone could guide me and show me an example.
Looking forward to hearing from you guys.
Thanks,
meet
Problems like this with uploading files to a server usually come down to two things:
Check the timeout in your axios request (frontend), because you have to wait until all the files are uploaded to the server (https://github.com/axios/axios#creating-an-instance).
Check the domain hosting configuration: if you are hosting your backend service behind nginx, check the upload limit (https://www.tecmint.com/limit-file-upload-size-in-nginx/).
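As a sketch of the first check, assuming axios on the frontend (the timeout value here is illustrative):

```javascript
// Front end: axios aborts slow requests; for large uploads raise (or
// disable) the limit when creating the instance. 0 means "no timeout".
import axios from 'axios';

const uploadClient = axios.create({
  baseURL: '/api',
  timeout: 10 * 60 * 1000, // 10 minutes; use 0 to disable entirely
});

// Server side (nginx), the matching knob is the directive:
//   client_max_body_size 500M;
```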

Angular - how to test Internet upload speed without backend?

I want to upload a file into the folder from which my Angular app is served while running on localhost. I'm not able to find any solution that doesn't use a backend.
For example, I just want to upload an image file and have that file copied into a specified folder of the project. This should be done only with Angular, without using any backend script or hitting any API endpoint.
Depending on your web host, you can make your assets folder accessible via FTP.
Making an FTP call from JavaScript (Angular is JavaScript) isn't that difficult, and there are plenty of examples and questions about it on the internet (like this).
Why you wouldn't do that:
The credentials for your FTP connection will be accessible in the compiled JavaScript code. With a little bit of effort, everyone can find them.
Each gate you open through the web host's firewall is an extra vulnerability. That's why everybody will recommend you add an API endpoint for uploading files, so that you keep holding the strings of what may be uploaded; a sketch of such an endpoint follows.
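A minimal sketch of such an endpoint, using Express and multer purely as illustrative choices (route name, destination folder, and limits are all made up):

```javascript
// Node side: a minimal upload endpoint so you stay in control of
// what may be uploaded. Express and multer are illustrative choices.
const express = require('express');
const multer = require('multer');

const upload = multer({
  dest: 'uploads/',                       // files land here, not in assets/
  limits: { fileSize: 5 * 1024 * 1024 },  // reject anything over 5 MB
});

const app = express();

app.post('/api/upload', upload.single('image'), (req, res) => {
  // req.file holds metadata about the stored file (path, size, mimetype).
  res.json({ ok: true, filename: req.file.filename });
});

app.listen(3000);
```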
Edit:
As I read your question again and all the sub-answers, I figured out that you are building a native-like app with no backend, just an Angular single-page frontend application. And I can understand why (you can run this on every platform in an application that supports JavaScript), but the problem you are encountering is only the first of a whole series.
If this is the case, I wouldn't call it uploading, as you would store it locally.
But the good news is that you have localStorage at your disposal to store temporary data on the client's HDD. It isn't a very large space, but it is something...
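A minimal sketch of the localStorage route, assuming the file is small enough to fit (the key name is illustrative):

```javascript
// Browser side: stash a small file in localStorage as a data URL.
// localStorage is limited (commonly around 5 MB), so this only suits
// small files.
function saveLocally(file) {
  const reader = new FileReader();
  reader.onload = () => localStorage.setItem('savedImage', reader.result);
  reader.readAsDataURL(file); // result is a base64 data: URL string
}

// Later, e.g. on the next visit:
const dataUrl = localStorage.getItem('savedImage'); // null if nothing stored
```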
The assets folder is one of the statically served folders of the Angular app. It is located on the server, so you can't add files to it without hitting the server (HTTP server, API, or whatever else...).
Even when running your app on localhost, there's a web server under the hood, so it behaves exactly the same as a deployed application, and you can't add files to the assets folder via the Angular app.
I don't know what exactly you want to do with your uploaded files, but:
If you want to use them on the client side only, and in one user session, then you can just store the file in a JavaScript variable and do what you want with it (see the sketch after this list).
If you want to share them across users, or across user sessions, then you need to store them on the server, and you can't bypass an API or some HTTP server configuration.
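A minimal sketch of the first option, assuming hypothetical element IDs:

```javascript
// Browser side: keep the selected file in memory for this session only.
let currentFile = null;

document.querySelector('#file-input').addEventListener('change', (e) => {
  currentFile = e.target.files[0];
  // e.g. preview it without any server round trip:
  const img = document.querySelector('#preview');
  img.src = URL.createObjectURL(currentFile);
});
```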
Based on your clarification in one of your comments:
I'm trying to develop a small speed test application in which user can upload any file from his system to check upload and download speed.
The only way to avoid having your own backend is to use a third-party API.
There are some dedicated speed test websites, which also provide API access. E.g.:
https://myspeed.today
http://www.speedtest.net
https://speedof.me/api.html
Some more: https://duckduckgo.com/?q=free+speedtest+api
Note that many of these APIs are paid services.
Also, I've been able to find this library https://github.com/ddsol/speedtest.net, which might indicate that speedtest.net has some kind of free API tier. But this is up to you to investigate.
This question might also be of help, as it shows using speedtest.net in React Native: Using speedtest.net api with React Native
You can use a third-party library such as ng-speed-test, an Angular library which uses an image hosted on a third-party server (i.e. GitHub) to test internet speed.
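For illustration, here is the underlying technique such libraries use (this is not ng-speed-test's actual API): download a file of known size from a third-party host and time it. The URL and size below are placeholders.

```javascript
// Measure download speed by timing a fetch of a known-size file.
async function measureDownloadMbps(url, sizeBytes) {
  const start = performance.now();
  // Cache-buster so the browser actually hits the network.
  const res = await fetch(`${url}?r=${Math.random()}`, { cache: 'no-store' });
  await res.arrayBuffer(); // wait until the full body has arrived
  const seconds = (performance.now() - start) / 1000;
  return (sizeBytes * 8) / seconds / 1e6; // bytes -> bits -> Mbps
}

// Usage (hypothetical 1 MB test image):
// measureDownloadMbps('https://example.com/1mb.jpg', 1024 * 1024)
//   .then(mbps => console.log(mbps.toFixed(1), 'Mbps'));
```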

Where to store a large amount of media files in a Node.js eco-system?

Long-time lurker, first-time poster. Hi.
I've got a Node.js backend server, serving React.js on the frontend. I'm currently uploading a huge number of mp3 and wav files to the server itself. That is, a user uploads a file in my frontend, and I create a folder on the server the Node instance is running on and store the mp3/wav there, pertaining to that user.
The project is moving out of development into production, and I'm wondering from a scalability perspective a) how bad this practice is, b) what my best options are for hosting, and c) what the alternatives to storing files on the server itself are.
There is an existing user base of about 500 users, each of which uploads about 600MB - 1.5GB of media every 1.5 months.
Any insight would be great, as search seems inconclusive. Thanks!
I suggest you integrate with cloud storage providers, e.g. Dropbox, Google, or AWS. They have very flexible APIs, including role-based access and authentication.
Even if you want to keep the files on your own server, I suggest running a separate server only to upload/download files, with OAuth-based authentication; a sketch of such a service follows below.
In case you also want to go for streaming, there are also cloud servers which offer streaming support, like Wowza, Air Playit, etc.
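A minimal sketch of such a separate upload service, using Express, multer, and the AWS SDK (v2) purely as illustrative choices; the bucket name and routes are placeholders, and auth is omitted:

```javascript
// Node side: a dedicated upload service that forwards media to S3.
const express = require('express');
const multer = require('multer');
const AWS = require('aws-sdk');

const s3 = new AWS.S3(); // credentials come from env vars or an IAM role
// Note: memoryStorage buffers the whole file in RAM; for very large
// files prefer a streaming approach (e.g. the multer-s3 package).
const upload = multer({ storage: multer.memoryStorage() });
const app = express();

app.post('/media/:userId', upload.single('track'), async (req, res) => {
  const result = await s3.upload({
    Bucket: 'my-media-bucket', // placeholder bucket name
    Key: `${req.params.userId}/${req.file.originalname}`,
    Body: req.file.buffer,
    ContentType: req.file.mimetype,
  }).promise();
  res.json({ url: result.Location });
});

app.listen(4000);
```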

The anatomy of uploading

I am wondering what the general consensus is for uploading moderately large files. I have a web app, and every time a user uploads a file (typically larger than 5 MB), the web server tends to hang until the file upload is finished.
The above seems normal, because a single upload can take up a single HTTP request handler. Do web devs take this into consideration and either:
a) Pay for more HTTP handlers
b) Use some other method to overcome this, such as AJAX or another approach
I've heard that it is quite normal for web apps to have a few HTTP request handlers to take care of this, which will cost quite a bit more. On the other hand, if cost is an issue, then some have suggested uploading directly to the web server or a storage service (e.g. Amazon S3) via Flash + AJAX. The latter method takes a bit of scripting and is a bit messy.
My second concern:
By using AJAX to upload files onto a server, does this still take up a whole HTTP request handler? i.e. does the server hang until the upload is finished?
Even with Flash, I would still need to specify a URL to upload to. The URL would be one of the actions on my controller, which would mean that processing still takes place on the server side. Is this right so far?
I was thinking: if I were, on the other hand, to use one of the upload scripts (Plupload, Uploadify, SWFUpload, etc.) to upload directly to Amazon S3, then the processing is handled on the S3 server instead of the local web server, which won't hang the web app at all. Am I understanding this correctly?
Would like to hear your feedback.
For large uploads you should use non-blocking, evented servers like Node.js, Twisted on Python, AnyEvent on Perl, or EventMachine on Ruby. The thread-per-connection model is just too expensive for long-running connections.
It is not uncommon for Node.js users to have so many simultaneous connections that they actually hit their operating system's limits while still not using all their resources; for example, see this question asked by someone who was concerned about having only 30 thousand simultaneous connections, and who then managed to reach over 60 thousand connections on a single server with 4 GB of RAM.
The point is that if you are concerned about your connections blocking your server from serving new requests, then you shouldn't use a blocking server in the first place. A minimal illustration follows.
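A minimal sketch of why an evented server doesn't block, assuming plain Node.js with no framework (the paths and filenames are illustrative):

```javascript
// Node side: the event loop keeps serving other requests while an upload
// streams in chunk by chunk; no thread is parked per connection.
const http = require('http');
const fs = require('fs');
const path = require('path');

http.createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/upload') {
    // Stream the body straight to disk; backpressure is handled by pipe().
    const dest = fs.createWriteStream(path.join(__dirname, 'upload.bin'));
    req.pipe(dest).on('finish', () => res.end('done'));
  } else {
    res.end('hello'); // still answered instantly during uploads
  }
}).listen(8080);
```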
I'm currently developing a web app which handles multiple image uploads simultaneously. I researched far and wide, and the best option I found was SWFUpload. It is super easy to implement and highly customizable. Users can select multiple files from the dialogue box, add them to a queue, and get actual progress feedback from the browser, so the lag isn't such a big deal to the user.
Though, bah... it uses Flash to initialize the dialogue box, but everything else is handled with good old JavaScript.
A great working example is carbonmade.com.
Thanks for the responses so far.
Unfortunately, our host Heroku does not support non-blocking, evented servers. I've also tried Flash + JavaScript based uploaders like SWFUpload and Uploadify. Some variations of the mentioned plugins worked, and some didn't. I spent countless hours of trial and error, but didn't like how the code was being integrated into my Rails app.
In the end, I went with manually uploading the file directly to S3, following this link. That approach also enables a response back from the S3 server to notify us that an upload was successful, giving us the path to the uploaded file so that we can then create a background job (via Redis + Resque) to process the file.
In the future, if you are going to do direct uploading to S3 via Rails, please check out my sample projects below. You will save yourself many, many headaches, and it's not very "messy" :)
Sample project using Rails 3, Flash and MooTools-based FancyUploader to upload directly to S3: https://github.com/iwasrobbed/Rails3-S3-Uploader-FancyUploader
Sample project using Rails 3, Flash/Silverlight/GoogleGears/BrowserPlus and jQuery-based Plupload to upload directly to S3: https://github.com/iwasrobbed/Rails3-S3-Uploader-Plupload
By the way, you can do post-processing with Paperclip using something like this blog post describes:
http://www.railstoolkit.com/posts/fancyupload-amazon-s3-uploader-with-paperclip
