Downloading a file using JWT for Authentication - javascript

This article states that:
When using cookies, you can trigger a file download and stream the contents easily. However, in the tokens world, where the request is done via XHR, you can't rely on that.
Is there something tricky about downloading a file using XHR from a URL that requires a JWT for authorization?
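A minimal sketch of that XHR request, assuming the API accepts a bearer-style token in the Authorization header (the URL, token, and callback names below are illustrative, not from a specific API):

```javascript
// Sketch: download a protected file via XHR with the token in a header.
function fetchProtectedFile(url, token, onBlob) {
  const xhr = new XMLHttpRequest();
  xhr.open('GET', url);
  xhr.responseType = 'blob'; // keep the raw bytes out of a JS string
  xhr.setRequestHeader('Authorization', 'Bearer ' + token);
  xhr.onload = () => {
    if (xhr.status === 200) onBlob(xhr.response); // xhr.response is a Blob
  };
  xhr.send();
}
```

The resulting Blob can then be handed to the user with an object URL and a programmatic anchor click, which is what the last question below deals with.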

Related

How to use OpenStack on the client side

I'm trying to build a file upload/download service for my website using OpenStack object storage. I have no problem doing it via PHP and the OpenStack PHP SDK, but when I try to do it from JavaScript, I can't find a good SDK or method.
I'm not using Node; I have a PHP server and a JavaScript client. I would like to upload and download files directly from the JavaScript client, without the files passing through the PHP server. I managed to create OpenStack tokens with the PHP SDK; maybe I could send those to the JavaScript client so it can authenticate? It's been a week of searching with no solution...
OpenStack has an S3-compatibility plugin, which widens your choice of libraries and SDKs.
Otherwise, you can forge a temporary URL server-side; your PHP library almost certainly has tooling for this. The URL can then be used client-side to PUT the file.
The temporary URL is forged so that it grants temporary write-only access for the upload. The same mechanism can also grant read-only access to selected objects.
So either the client asks your PHP backend for a place to upload and gets the URL back, or the client sends the upload to your PHP backend, which forges the link and redirects the request to that URL.

Stream response from server to local file in javascript?

We have a REST API endpoint that streams multiple GB of data in its response. We currently use XHR with responseType: 'blob', but in our web interface we would like to stream that response to a file instead of holding the entire response in memory and then trying to save it. We've poked around the Fetch API but still can't quite figure out how to make something like that work. What are we missing?
The canonical HTTP way of doing this is a chunked response: instead of using XHR, you point the browser at the file's URI and have the server respond in a way that indicates it supports chunking.
You're essentially trying to reinvent the wheel (of downloading files) with XHR and JavaScript.
Point the browser at the proper URI and the file will be streamed directly to the client's disk. That supports resuming broken downloads out of the box!
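If you do stay in JavaScript, the Fetch API's response.body stream lets you process the payload chunk by chunk instead of buffering it as one blob. A sketch (the Response below is built locally just so the reading loop is visible; in practice it would come from fetch(url)):

```javascript
// Sketch: consume a response incrementally with the Streams API instead of
// responseType: 'blob'. Memory use stays bounded by the chunk size, not the
// total payload size.
async function countBytes(response) {
  const reader = response.body.getReader();
  let total = 0;
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    total += value.byteLength; // each chunk arrives as a Uint8Array
  }
  return total;
}

// Stand-in for a fetch() result, for illustration only:
const demo = new Response(new Blob(['hello world']));
```

Each chunk can be handed off (for example, to a writable stream) as it arrives, which is the piece that responseType: 'blob' cannot give you.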

Is it safe to upload to Azure Blob Storage through an XHR request in JavaScript?

I am implementing a project with Angular 2 as the front end and Laravel as the back end.
In this project I upload files to Azure Blob Storage through an XHR request from the front end.
Whenever the XHR request fires, the Azure Blob Storage URL shows up in the network tab of the console, and this URL contains the blob's signature.
Is it OK to implement this functionality through XHR in JavaScript? Any suggestions would help me a lot.
I think this is fine as long as you keep the SAS token valid only for the duration required to upload the file (it can be tricky to guess how long an upload will actually take).
There are a few other things you could do to make it more secure:
Only include the Write or Create permission in the SAS token. Don't include permissions like Read or Delete if you don't need them. That way, the user can only upload a file with this SAS token and do nothing else.
As mentioned above, keep the SAS token validity duration short.
If possible, get a SAS token scoped to the specific file the user is uploading rather than to the entire blob container. That way, the user can only upload that one file.
Include IP ACLing in your SAS token so that the user can't share the SAS URL with other users. IP ACLing ensures the SAS URL can only be used from the IP address(es) included in the token.
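For completeness, here is a sketch of what the direct-to-blob upload request looks like with such a SAS URL. The account, container, and token in the URL are placeholders; the x-ms-blob-type: BlockBlob header is required by the Blob service for a single-shot PUT:

```javascript
// Sketch: assemble the direct-to-blob PUT. The SAS URL itself would be
// minted server-side; this helper only builds the request shape.
function buildUploadRequest(sasUrl, file) {
  return {
    url: sasUrl,
    method: 'PUT',
    headers: {
      'x-ms-blob-type': 'BlockBlob', // required by the Blob REST API for Put Blob
      'Content-Type': file.type || 'application/octet-stream',
    },
    body: file,
  };
}

// Placeholder SAS URL and a Blob standing in for a user-selected File:
const req = buildUploadRequest(
  'https://myaccount.blob.core.windows.net/avatars/me.png?sv=...&sig=...',
  new Blob(['fake image bytes'], { type: 'image/png' })
);
```

In the browser this becomes fetch(req.url, req), or the equivalent XHR call.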

How to download large files in javascript with OAuth without storing the whole file in browser?

I have a closed-source SaaS webapp that requires OAuth for my users to access some data. Much of the data available is binary, and would be much easier dealt with if users could download files, rather than doing everything in the browser.
I can write javascript and deploy it in the webapp, so I can trigger the OAuth authentication and add the required Authorization token header to a data request.
It is not possible to send authorization in any other way (e.g., query parameter) besides an HTTP header, so I can't make simple HTML anchor tags with URLs to allow users to download data as files.
I believe I can use Blob URLs to enable downloading this data, falling back to data URLs for older browsers.
Two questions:
Is there an easier way for me to allow my users to download data as files, but still inject an HTTP header?
Can I stream the data, so I don't need a 100MB data URL?
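On the first question, a sketch of the common Blob-URL approach, assuming a fetch-capable browser (the URL, token, and filename are placeholders). Note that it still buffers the whole payload into a Blob, so it answers question 1 but only partially question 2:

```javascript
// Sketch: request with the Authorization header, buffer the body into a
// Blob, then hand it to the user via a temporary object URL.
async function downloadWithAuth(url, token, filename) {
  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${token}` }, // header, not query param
  });
  if (!res.ok) throw new Error(`download failed: HTTP ${res.status}`);
  const blob = await res.blob();       // whole payload held in the Blob
  const objectUrl = URL.createObjectURL(blob);
  const a = document.createElement('a'); // browser-only from here on
  a.href = objectUrl;
  a.download = filename;
  a.click();
  URL.revokeObjectURL(objectUrl);
}
```

For truly unbounded sizes you would need to consume response.body as a stream, but at ~100 MB a Blob URL already avoids materializing a giant data-URL string.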

Upload image to S3 from webapp without storing it in server

I have a web application with a Javascript Frontend and a Java Backend.
I have a use case where users can upload their profile pictures. The use case is simple: send the image from the user's browser and store it in S3, with a maximum file size. Currently the front end sends the image data stream to the back end, which then stores the image in S3 using the AWS Java SDK.
The back end currently has to store the image on the file system first in order to know the file size (and avoid reading more bytes than the allowed maximum), since S3 requires the PUT Object request to include the Content-Length.
Is there any other way I can do this with AWS? Using another service, maybe Lambda? I don't like having to store the file on the file system first and then open another stream to send it to S3.
Thank you very much in advance.
You might get the file size on the client side, as mentioned here, but consider browser support.
You shouldn't share your keys with client-side code. I believe Query String Authentication (pre-signed URLs) should be used in this scenario.
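A sketch of that client-side check: File objects from a file input expose a byte count via .size, and a Blob (which has the same property) stands in for one below. The 2 MB limit is made up:

```javascript
// Sketch: reject oversized images before they ever leave the browser.
const MAX_BYTES = 2 * 1024 * 1024; // hypothetical 2 MB profile-picture limit

function isAcceptableSize(fileOrBlob) {
  return fileOrBlob.size <= MAX_BYTES; // .size is the byte count (File API)
}

const small = new Blob(['x'.repeat(1024)]);                // 1 KiB
const large = new Blob([new Uint8Array(3 * 1024 * 1024)]); // 3 MiB
```

The server must still enforce the limit itself; the client-side check only saves the user a wasted upload.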
Assuming your maximum file size is less than your available memory, why can't you just read it into a byte[] or something similar that you can send to S3 without writing it to disk? You'd also get the size that way.
It depends on the S3 Java client you're using. If it lets you read from a ByteArrayInputStream, you should be able to do that.
Looks like you can use an InputStream. See the Javadoc:
ObjectMetadata metadata = new ObjectMetadata();
metadata.setContentLength(imageBytes.length); // S3 needs Content-Length up front
s3Client.putObject(new PutObjectRequest(
        bucketName, key, new ByteArrayInputStream(imageBytes), metadata));
