Best way to handle Google Drive uploads from a back-end server - javascript

I was wondering what could be the best way to handle Google Drive uploads from my web application.
In this web application, the Google Drive access tokens are held by the back-end server.
I have two simple ways in my mind currently:
Upload directly via front-end:
The front-end app can make an authenticated request to my back-end for a Drive access token. My back-end returns the access token, and the front-end can then upload the file directly to Google Drive.
This approach is relatively simple: there is only a single point of failure in the upload process, and my back-end doesn't have to deal with any of the upload logic. On the other hand, I have to expose the Drive access token to the front-end, which may or may not be acceptable.
Proxying the upload via my back-end:
With this approach my front-end will first upload the file to my back-end, and my back-end will then upload it to Google Drive. This way I don't have to expose the access token to the client, but it has a lot of disadvantages: the upload process now has multiple points of failure, and my back-end needs to implement all of the upload logic, including handling large file uploads. That's why I am not so comfortable with this approach.
Is there any better / standard way of handling this?

I would suggest uploading files directly from the front-end. That's how it's usually done; it's more efficient, and it doesn't necessarily mean less security. If you have any doubts about this, you can open a new question explaining the technologies you use and the purpose of the web app.
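A minimal sketch of that flow, assuming a hypothetical back-end endpoint /api/drive-token that returns a short-lived access token for the authenticated user (the upload itself goes straight to the Drive v3 simple-upload endpoint):

// Browser-side sketch: fetch a short-lived token from our own
// back-end, then upload the file directly to the Drive API.
async function uploadToDrive(file) {
  // /api/drive-token is a hypothetical endpoint on our back-end.
  const { accessToken } = await (await fetch('/api/drive-token')).json();

  // Simple upload (uploadType=media); metadata such as the file name
  // can be set afterwards with a PATCH, or via a multipart upload.
  const res = await fetch(
    'https://www.googleapis.com/upload/drive/v3/files?uploadType=media',
    {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${accessToken}`,
        'Content-Type': file.type || 'application/octet-stream',
      },
      body: file,
    }
  );
  if (!res.ok) throw new Error(`Drive upload failed: ${res.status}`);
  return res.json(); // response includes the new file's id
}

If you go this route, have the back-end issue tokens with the narrowest scope that works (e.g. drive.file) and a short lifetime, so a leaked token is of limited use.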

Related

Download Azure blob using purely JavaScript and no Node.js?

I have a Cordova application which downloads a zip file as a blob from Azure. Since I am very new to Azure, I would like to know: is it okay, security-wise, to access an Azure blob with a SAS URL from the Cordova application?
My point is that I would need to append the shared access signature (SAS) token to the blob URL, something like below:
https://myazureportal.container.blobs/myblob?MY_SAS
This way my JavaScript code will have the SAS hard-coded. What is the correct approach, given that I would prefer to access the blob using JavaScript only, preferably without writing any server-side code?
If I use the SAS inside the JavaScript files of my Cordova application, is it a security flaw? If so, is there any approach to implement this using purely JavaScript?
Things I tried:
I created a back-end Web API service in ASP.NET Core, and that way I am able to download the blob file, but what I am looking for is a pure JavaScript approach.
Apart from the point mentioned by Eric about code being decompiled, there are a few other things you would need to worry about.
If you are embedding the SAS URL in your application, you will have to make it long-lived, i.e. with an expiry date far out in the future. That's a security risk and goes against best practices.
A shared access signature is created using an account key and becomes invalid the moment you regenerate that account key. If you're embedding a SAS URL in your application and have to regenerate your account key for any reason, your SAS URL becomes essentially useless.
You can learn more about the best practices for SAS Token here: https://learn.microsoft.com/en-us/azure/storage/common/storage-sas-overview#best-practices-when-using-sas.
Yes, it is a security flaw, as your app can be decompiled and your code inspected. If you want to keep this approach, at least put a login in front of a back-end that sends the SAS to your front-end.
Ideally you would do everything in the back-end and return the blob to your front-end.
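To make that concrete, here's a rough sketch (Node.js with the @azure/storage-blob package) of a back-end handler issuing a short-lived, read-only SAS URL on demand; the account name/key and blob names are placeholders:

// Server-side sketch: generate a short-lived, read-only SAS for one
// blob, to be returned to an authenticated client on request.
const {
  StorageSharedKeyCredential,
  generateBlobSASQueryParameters,
  BlobSASPermissions,
} = require('@azure/storage-blob');

function makeShortLivedSasUrl(accountName, accountKey, containerName, blobName) {
  const credential = new StorageSharedKeyCredential(accountName, accountKey);
  const sas = generateBlobSASQueryParameters(
    {
      containerName,
      blobName,
      permissions: BlobSASPermissions.parse('r'),       // read-only
      expiresOn: new Date(Date.now() + 15 * 60 * 1000), // valid for 15 minutes
    },
    credential
  ).toString();
  return `https://${accountName}.blob.core.windows.net/${containerName}/${blobName}?${sas}`;
}

Because the SAS is minted per request with a short expiry, nothing long-lived ends up baked into the Cordova bundle.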

Angular - how to test internet upload speed without a backend?

I want to upload a file into the folder from which my Angular app is served while running on localhost. I'm not able to find any solution that doesn't use a backend.
For example, I just want to upload an image file, and that file should be copied into a specified folder of the project. This should be done with Angular only, without using any backend script or hitting any API endpoint.
Depending on your webhost, you can make your assets-folder accessible via FTP.
Making an FTP call from JavaScript (Angular is JavaScript) isn't that difficult, and there are plenty of examples and questions about it on the internet (like this).
Why you wouldn't do that:
The credentials for your FTP connection will be accessible in the compiled JavaScript code. With a little bit of effort, anyone can find them.
Each gate you open through the webhost's firewall is an extra vulnerability. That's why everybody will recommend adding an API endpoint for uploading files, so that you stay in control of what may be uploaded.
Edit:
Reading your question again along with all the sub-answers, I think I've figured out that you are building a native-like app with no back-end: just an Angular single-page front-end application. I can understand why (you can run this on every platform that supports JavaScript), but the problem you are encountering is only the first of a whole series.
If this is the case, I wouldn't call it uploading, as you would be storing the file locally.
But the good news is that you have localStorage at your disposal to store temporary data on the client's disk. It isn't a very large space, but it is something...
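A minimal sketch of that, stashing a user-selected file in localStorage as a data URL (the quota is typically around 5 MB, so this only suits small files):

// Browser-side sketch: persist a selected file across sessions by
// storing it in localStorage as a data URL.
const input = document.querySelector('input[type="file"]');
input.addEventListener('change', () => {
  const reader = new FileReader();
  reader.onload = () => localStorage.setItem('savedFile', reader.result);
  reader.readAsDataURL(input.files[0]);
});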
The assets folder is one of the statically served folders of the Angular app. It is located on the server so you can't add files to it without hitting the server (HTTP server, API, or whatever else...).
Even when running your app on localhost, there's a web server under the hood, so it behaves exactly the same as a deployed application, and you can't add files to the assets folder via the Angular app.
I don't know what exactly you want to do with your uploaded files, but:
If you want to use them on the client side only, within one user session, then you can just store the file in a JavaScript variable and do what you want with it
If you want to share them across users, or across user sessions, then you need to store them on the server, and you can't bypass an API or some HTTP server configuration
Based on your clarification in one of your comments:
I'm trying to develop a small speed test application in which user can upload any file from his system to check upload and download speed.
The only way to avoid having your own backend is to use a 3rd-party API.
There are some dedicated speed test websites, which also provide API access. E.g.:
https://myspeed.today
http://www.speedtest.net
https://speedof.me/api.html
Some more: https://duckduckgo.com/?q=free+speedtest+api
Note that many of these APIs are paid services.
Also, I've been able to find this library https://github.com/ddsol/speedtest.net, which might indicate that speedtest.net has some kind of free API tier. But this is up to you to investigate.
This question might also be of help, as it shows using speedtest.net in React Native: Using speedtest.net api with React Native
You can use a third-party library such as ng-speed-test, an Angular library which tests internet speed using an image hosted on a third-party server (i.e. GitHub).
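If you'd rather roll it yourself, here's a rough sketch of the usual technique: time the download of a file of known size and derive megabits per second. The test-file URL below is a placeholder for any CORS-enabled file you host (or one provided by a third-party service):

// Browser-side sketch: estimate download speed by timing a fetch.
const TEST_FILE_URL = 'https://example.com/1mb.bin'; // placeholder
const TEST_FILE_BYTES = 1_000_000;                   // known file size

async function measureDownloadMbps(url, sizeBytes) {
  const start = performance.now();
  const res = await fetch(`${url}?cacheBust=${Date.now()}`); // defeat caching
  await res.arrayBuffer(); // wait for the entire body
  const seconds = (performance.now() - start) / 1000;
  return (sizeBytes * 8) / seconds / 1e6; // bits transferred / time, in Mbps
}

measureDownloadMbps(TEST_FILE_URL, TEST_FILE_BYTES)
  .then(mbps => console.log(`~${mbps.toFixed(1)} Mbps`));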

Upload file to personal Dropbox from public website without exposing private key

I receive images for processing on my site, and I want to skip uploading them to my local server and then adding them to a Dropbox folder from there, by implementing direct upload to Dropbox from the web browser.
The Dropbox API is pretty simple, but it has one problem: I need to expose the API key to the end user, and that key allows downloading all the pictures from my account.
So if I use the Dropbox API, one user can download images uploaded by others, which is not an acceptable scenario.
Is there any way around having to expose a full-access API key to the website's end users?
Unfortunately, no, in order to upload directly to Dropbox via the Dropbox API, the client needs the access token. Further, the Dropbox API doesn't offer an upload-only permission.
Fundamentally, clients can't keep secrets, so any access token that exists client-side (in this case, in the browser) can be extracted and abused.
Edit:
The Dropbox API now offers some new pieces of functionality that may be useful here:
A) The Dropbox API now offers "scopes", which you can use to restrict an app or access token to a limited set of functionality, such as the ability to write but not read files.
You can find more information about the release in our blog post here:
https://dropbox.tech/developers/now-available--scoped-apps-and-enhanced-permissions
B) The Dropbox API now offers the /2/files/get_temporary_upload_link endpoint, which can be used to get a URL that a client can POST to in order to upload to a Dropbox account without an access token:
https://www.dropbox.com/developers/documentation/http/documentation#files-get_temporary_upload_link
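A rough sketch of option B, with placeholder paths: the back-end (which holds the access token) requests a temporary link, and the browser POSTs the raw file bytes to it:

// Back-end sketch (Node.js 18+ for global fetch): exchange the
// server-held access token for a temporary upload link tied to a
// specific Dropbox path.
async function getTemporaryUploadLink(accessToken) {
  const res = await fetch('https://api.dropboxapi.com/2/files/get_temporary_upload_link', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${accessToken}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      commit_info: { path: '/uploads/image.jpg', mode: 'add', autorename: true },
      duration: 3600, // link validity in seconds
    }),
  });
  return (await res.json()).link;
}

// Front-end (browser) sketch: upload directly, no token exposed.
async function uploadViaTemporaryLink(link, file) {
  await fetch(link, {
    method: 'POST',
    headers: { 'Content-Type': 'application/octet-stream' },
    body: file,
  });
}

This keeps the access token entirely server-side while still letting the bytes travel browser-to-Dropbox.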

What is the best place to store and organize images for a NoSQL database

I'm developing a web application using parse.com where a news item may have 20 images. What is the best place to store the images to avoid problems in the future, like degraded database performance, high costs, and such?
1) Store them in the NoSQL database as object-type data
2) Store them in a folder and save path pointers in the NoSQL database
What are the pros and cons of going each way?
Parse isn't a great choice for a content hosting provider, because you will be pressing against the storage cap, not to mention the limitations on bandwidth usage.
Use Parse as a general-purpose backend for user authentication and app data, then host the images on another service such as AWS Simple Storage Service (S3), and reference those resources using Cloud Code web hooks. In case you're wondering, Parse actually uses Amazon to host all of its infrastructure.
Another option could be to access the images directly using AWS API Gateway. Once you have the images stored in S3, you will be able to automatically create native client APIs for your project.
When you upload a file, it will be stored on disk (that is how Parse works); in your database, you store only a link to that file. Generally, your images will not necessarily be stored in one location or on one server. For the sake of scalability, you want to take advantage of Content Delivery Networks (CDNs), which can replicate your images to multiple servers across the world. Then, no matter where an image is stored, you can access it anywhere via the single link stored in your database.
You can store the paths to the images in the database, but you want to store the actual images on S3 and access them via CloudFront (CDN).
You can set up CloudFront in front of your S3 bucket, which will allow all your images to be accessed via AWS's CDN.
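So the record in Parse ends up holding just a URL. A sketch, with a hypothetical NewsImage class and a placeholder CloudFront URL:

// Node.js sketch using the Parse JS SDK: store the CDN link, not the bytes.
const Parse = require('parse/node'); // Parse.initialize(...) assumed done

async function saveImagePointer(newsId, cdnUrl) {
  const image = new Parse.Object('NewsImage'); // hypothetical class
  image.set('newsId', newsId);
  image.set('url', cdnUrl); // e.g. 'https://d1234.cloudfront.net/news/42/cover.jpg'
  return image.save();
}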

Save data to S3 using JavaScript or jQuery

I want to collect data entered by the user in a browser and save it to Amazon S3. Is this something I can do with JavaScript/jQuery?
I know this is an old question, but I had the same issue and think I've found a solution. S3 has a REST interface to which you can POST data directly, without exposing your AWS Secret Key. So you can construct an AJAX POST request to your S3 bucket endpoint using JavaScript or jQuery. You can specify an access policy in the request as well, which restricts upload access to only certain buckets and certain directories.
Amazon verifies the authenticity of your requests using an HMAC signature which you provide in the request. The signature is constructed using details about the request and your AWS Secret Key, which only you and Amazon know, so fraudulent requests can't be made without someone having a valid signature.
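To illustrate, here's a server-side sketch (Node.js) of signing such a POST policy with the legacy Signature Version 2 scheme this answer describes; the bucket name and key prefix are placeholders, and newer applications should prefer Signature Version 4 presigned POSTs:

// Server-side sketch: build and sign an S3 POST policy (SigV2-era).
// The secret key never leaves the server; only the signed policy does.
const crypto = require('crypto');

function signS3PostPolicy(secretAccessKey) {
  const policy = {
    expiration: new Date(Date.now() + 10 * 60 * 1000).toISOString(),
    conditions: [
      { bucket: 'my-upload-bucket' },                 // placeholder bucket
      ['starts-with', '$key', 'uploads/'],            // restrict key prefix
      ['content-length-range', 0, 10 * 1024 * 1024],  // cap size at 10 MB
    ],
  };
  const policyBase64 = Buffer.from(JSON.stringify(policy)).toString('base64');
  const signature = crypto
    .createHmac('sha1', secretAccessKey)
    .update(policyBase64)
    .digest('base64');
  // Embed policyBase64 and signature as fields of the browser's POST form.
  return { policyBase64, signature };
}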
Yes, it is possible, and as I already pointed out in the comments on the accepted answer, there are legitimate and useful ways to do so without compromising security and credentials.
You can post objects to S3 directly from the browser:
http://docs.amazonwebservices.com/AmazonS3/latest/API/RESTObjectPOST.html
Bad idea:
1) Think of how much fun people could have emptying your bank account once they find your S3 credentials embedded in your JavaScript code.
2) The JavaScript would be loaded from your server but would be trying to talk to Amazon's servers, which is forbidden as cross-domain communication (absent CORS support).
Something like this you'd want to handle on the server. You could easily whip up an AJAX interface to send the data client browser -> your server -> Amazon. That way your S3 credentials are stored on your server and not transmitted willy-nilly to everyone using your site.
Maybe have a look at Node.js and try the aws-sdk package:
npm install aws-sdk
Here are a blog post and documentation I found about how to upload files to S3: this blog and the AWS docs.
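For reference, a minimal server-side sketch with that package (aws-sdk v2); the bucket and key are placeholders, and credentials are assumed to come from the environment:

// Node.js sketch: upload a local file to S3 with the aws-sdk v2 package.
const AWS = require('aws-sdk');
const fs = require('fs');

const s3 = new AWS.S3();
s3.upload(
  {
    Bucket: 'my-upload-bucket',  // placeholder
    Key: 'uploads/data.json',
    Body: fs.createReadStream('./data.json'),
    ContentType: 'application/json',
  },
  (err, data) => {
    if (err) return console.error('Upload failed:', err);
    console.log('Uploaded to', data.Location);
  }
);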
There are a variety of issues with attempting to access S3 via client-side code:
There is no way to secure your credentials.
Many responses are in XML instead of JSON, and the XML parsing engine in JavaScript is heavy and slow.
Authenticating the requests would require JavaScript implementations of HMAC-SHA1.
There are issues with making cross-domain requests from JavaScript without routing through a proxy.
All-in-all, there are no feasible solutions for client-side JavaScript at the moment. If you're interested in server-side JavaScript, there are some S3 classes floating around GitHub for Node.js.
