I have a web application with a JavaScript frontend and a Java backend.
I have a use case where users can upload their profile pictures. The use case is simple: send the image from the user's browser and store it in S3, enforcing a maximum file size. Currently the frontend sends the image data stream to the backend, which then stores the image in S3 using the AWS Java SDK.
The backend currently has to write the image to the file system first in order to learn its size (and avoid reading more bytes than the allowed maximum), since S3 requires the PUT Object request to be sent with the file's Content-Length.
Is there any other way I can do this with AWS? Using another service, maybe Lambda? I don't like having to store the file on the file system first and then open another stream to send it to S3.
Thank you very much in advance.
You might get the file size on the client side as mentioned here, but consider browser support.
You shouldn't share your keys with the client-side code. I believe Query String Authentication (i.e. presigned URLs) should be used in this scenario.
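A minimal sketch of that approach with the AWS SDK for Java v1 (the bucket name, key, and expiry are placeholders, not from the question). One caveat: a plain presigned PUT does not enforce a maximum object size by itself, so the size cap still needs a client-side check or an S3 POST policy with a content-length-range condition.

    import com.amazonaws.HttpMethod;
    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.model.GeneratePresignedUrlRequest;
    import java.net.URL;
    import java.util.Date;

    public class PresignedUpload {
        // Returns a time-limited URL the browser can PUT the image to
        // directly, so the AWS credentials never leave the server.
        public static URL uploadUrl(AmazonS3 s3) {
            Date expiration = new Date(System.currentTimeMillis() + 15 * 60 * 1000L);
            GeneratePresignedUrlRequest request =
                    new GeneratePresignedUrlRequest("my-bucket", "profile-pictures/user-123.jpg")
                            .withMethod(HttpMethod.PUT)
                            .withExpiration(expiration);
            return s3.generatePresignedUrl(request);
        }
    }

The browser then PUTs the raw file bytes straight to the returned URL, and the file never passes through your backend.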
Assuming your maximum file size is less than your available memory, why can't you just read it into a byte[] or something similar that you can send to S3 without writing it to disk? You should also be able to get the size that way.
It depends on the S3 Java client you are using. If it allows you to read from a ByteArrayInputStream, then you can skip the temporary file entirely.
Looks like you can use an InputStream. See the javadoc:
    new PutObjectRequest(String bucketName,
                         String key,
                         InputStream input,
                         ObjectMetadata metadata)
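Putting those two answers together, a hedged sketch with the AWS SDK for Java v1 (the 5 MB cap, bucket, and key are illustrative, not from the question):

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.model.ObjectMetadata;
    import com.amazonaws.services.s3.model.PutObjectRequest;
    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.InputStream;

    public class ProfilePictureUploader {
        private static final int MAX_BYTES = 5 * 1024 * 1024; // illustrative 5 MB cap

        // Buffers the request stream in memory, rejecting anything over the
        // cap, then uploads to S3 with an explicit Content-Length -- no temp file.
        public static void upload(AmazonS3 s3, String bucket, String key, InputStream in)
                throws IOException {
            ByteArrayOutputStream buffer = new ByteArrayOutputStream();
            byte[] chunk = new byte[8192];
            int n;
            while ((n = in.read(chunk)) != -1) {
                if (buffer.size() + n > MAX_BYTES) {
                    throw new IOException("Image exceeds maximum allowed size");
                }
                buffer.write(chunk, 0, n);
            }
            byte[] bytes = buffer.toByteArray();

            ObjectMetadata metadata = new ObjectMetadata();
            metadata.setContentLength(bytes.length); // satisfies S3's Content-Length requirement
            s3.putObject(new PutObjectRequest(bucket, key,
                    new ByteArrayInputStream(bytes), metadata));
        }
    }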
Related
Currently, I have a PHP app running on Heroku using a PostgreSQL database. I want my users to be able to upload an image to a folder on my Dropbox, and to store other information (in this case, product information such as price, title, weight, and the location of the image on Dropbox) in my database.
Right now, I'm using a file input inside an HTML form to submit the image by posting the whole form to my server (including the image), and then I use cURL to send the image to Dropbox and wait for the response to succeed. On success, I create the database record with the other information I mentioned earlier.
This works well for small files, but Heroku has a 30-second timeout that I can't change. For large files, the whole file uploads to my server and then uploads to Dropbox. These two upload operations are time-intensive and take more time than the timeout allows.
I had the idea of sending the file to Dropbox using JavaScript (jQuery AJAX calls, specifically) so that it's handled by the client, and then POSTing to my server on success, but I'm worried about how secure that is, since I would need to put my own authorization tokens in source code that the client can view.
Is there any way for PHP to send a file from the client to an external URL without it touching the server? How do I do this securely?
This sounds like a good fit for the Dropbox API /2/files/get_temporary_upload_link endpoint.
You can call that on your server to retrieve the temporary upload link, and then pass that link down to the browser. You can then have some JavaScript code perform the upload directly from the browser using that link.
Since only the /2/files/get_temporary_upload_link endpoint call requires your Dropbox access token (whereas the temporary upload link itself doesn't), you can keep your access token secret on the server only, without exposing it to the client. And since the upload happens directly from the browser to the Dropbox servers, you don't have to pass the file data through your own server, avoiding the timeout issue.
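The asker's server is PHP, but the flow is the same in any backend. A hedged sketch of the server-side call in Java using java.net.http (Java 11+); the path and duration in the request body are placeholders:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class DropboxTempLink {
        // Asks Dropbox for a short-lived upload URL. The access token never
        // leaves the server; only the returned link is handed to the browser.
        public static String getTemporaryUploadLink(String accessToken) throws Exception {
            String body = "{\"commit_info\": {\"path\": \"/uploads/image.jpg\", \"mode\": \"add\"},"
                    + " \"duration\": 3600}";
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://api.dropboxapi.com/2/files/get_temporary_upload_link"))
                    .header("Authorization", "Bearer " + accessToken)
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            return response.body(); // JSON with a "link" field holding the upload URL
        }
    }

The browser then POSTs the raw file bytes to the returned link with Content-Type: application/octet-stream; no Dropbox token is involved in that request.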
I am trying to build an image upload system with Node and Azure Blob Storage. I have decided to use the Azure Blob browser JS SDK to upload images. I have a dilemma in this process: how do I tell the server the blob name? I thought of the following approach, but it has a serious problem:
Let's say I give the blob a UUID and send the same UUID to the server. But the client JS can be changed, so the real blob name and the one sent to the server can differ.
It might be that my approach is completely trash. Please reply; I am a newbie to web development.
If I understand your question, I'm doing something similar in ASP.NET and SQL, where I store the blob itself in Azure Storage, but in a SQL table each image is really just a row of metadata: an internal numerical seed ID, a "user-defined" name (which does not have to be unique across the system), and the GUID used as the blob name in Azure.
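One way to close the trust gap from the question: never accept a client-supplied blob name at all; mint the name on the server, record it, and only then hand the client an upload credential scoped to that exact blob. A rough sketch in Java (the SQL statement in the comment is hypothetical):

    import java.util.UUID;

    public class BlobNames {
        // The server generates the blob name itself and records it in the
        // metadata row BEFORE handing the client an upload URL scoped to
        // exactly that blob, so the name in the database and the name the
        // client can write to cannot diverge, whatever the client JS does.
        public static String issueBlobName(String userDefinedName) {
            String blobName = UUID.randomUUID().toString();
            // hypothetical: INSERT INTO images (blob_name, display_name)
            //               VALUES (blobName, userDefinedName)
            return blobName;
        }
    }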
First off, I know this seems illogical when I could just send the download URL to the server. The issue with that is that users can access these download links, and for those who can, I need to be able to download the file. I can't really explain why, as I am under an NDA.
I am trying to download a file from a URL via the client (browser) and stream the data directly to the server, where the file is saved, so that the client essentially acts as a "middleman" and the file never needs to be saved to the client's machine.
I have been experimenting with socket.io-stream and socket.io-file, but I am having issues with both. socket.io-stream lets me upload a specific file from the client to the server, but the uploaded file has a size of 0 KB, and the project doesn't have any examples on GitHub.
socket.io-file has examples, which I followed, and I currently have it set up so that I can use an input tag to select a file and upload it to the server successfully.
From what I can see, the socket.io-file upload function takes a File object as its parameter.
So I have two questions really:
Is there a plugin for JS (Browser) & NodeJs (Server) that would allow me to do this?
or
How can I create a File object from an external URL?
I solved this in the end by using a Chrome extension to download the file as a Blob object, passing the object to the content script, and then using socket.io-stream to upload it to the server.
In a web app I'm using Cropit to let the user upload and crop an image. Then I need to upload this image to the backend server.
Cropit returns the cropped image in Data URI format (type string).
My question is: what's the best way to now upload the cropped image to the backend server?
So far I've figured out two options here:
1. Send the Data URI from the client as a plain string, convert it to binary data on the server, and save the image to disk.
2. Convert the Data URI to binary on the client and attach it to a FormData input, then send that to the server and save the image to disk.
If I understand correctly, there's no native JS way to send a Data URI as multipart/form-data. Is that right?
Is it better (i.e. more performant / safer) to use approach 1 or 2? Or is it preferable to do it in another way that I didn't mention?
Thanks!
The best way is to send the Data URI via a POST request.
I've found that the fastest, most reliable method is to turn the image into a data URI.
With a multipart upload there can be issues sending the data and encoding the correct values, and it becomes a sticky mess. With the resources we have today, a data URI is the best suggestion.
Edit: Depending on the type of server you are using, multipart upload might be the smarter option. For example, an AWS Lambda may only allow 5 MB of data in the request. In that case the best option would be a presigned URL with multipart upload to S3, handled from the frontend.
Essentially, the correct solution depends on your architecture.
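For reference, the server-side half of approach 1 (decoding the posted Data URI) is only a few lines in most stacks; a sketch in Java, with the target path left to the caller:

    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.Base64;

    public class DataUriDecoder {
        // Turns "data:image/png;base64,iVBOR..." into bytes and writes them out.
        public static void saveDataUri(String dataUri, Path target) throws Exception {
            int comma = dataUri.indexOf(',');
            if (comma < 0 || !dataUri.substring(0, comma).endsWith(";base64")) {
                throw new IllegalArgumentException("Expected a base64 data URI");
            }
            byte[] bytes = Base64.getDecoder().decode(dataUri.substring(comma + 1));
            Files.write(target, bytes);
        }
    }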
I hope you are aware of the issues with HTML5 canvas-based image editing regarding varying image sizes, canvas sizes, and picture quality.
Since you have the image as a Data URI, you can write it to a file in your server-side logic (I am not sure which server-side technologies you use).
Another approach is to use the CLI tool ImageMagick (http://www.imagemagick.org/) on the server side to process the image: upload the file to the server along with the edit information, and apply the edits there. For gathering the edit information on the client, you can use a client-side JavaScript library such as imgAreaSelect (http://odyniec.net/projects/imgareaselect/).
As you said, we can save the Data URL as a file using the server, and we can also save it locally using JavaScript, with the code below.
Using PHP:
    // Rewrite "data:image/png;base64,..." to "data://image/png;base64,..."
    // so PHP's data:// stream wrapper can decode and read it.
    $uri_ = 'data://' . substr($uri, 5);
    $binary = file_get_contents($uri_);
For JavaScript, refer to this GitHub project.
The best way to choose:
If you want to give the file to the user to download as soon as they finish editing the image, go with the JavaScript method, because it runs on the client machine and creates the file faster than the server side would.
If you want to save the file on the server for later use, create the file with your server-side code.
I have a web app (MVC4) with a file upload, and I made a terrible mistake: I let clients upload files directly to my server (an Azure virtual machine).
In order to do that, I set (in the Web.config):
    <!-- in <system.web><httpRuntime>: -->
    maxRequestLength="2097151"
    executionTimeout="3600"
    <!-- in <system.webServer><security><requestFiltering><requestLimits>: -->
    maxAllowedContentLength="4294967295"
Now I understand that this is not the way to do it.
So what I want is to upload the users' files directly to my Azure Blob Storage, without the files ever reaching my site.
I've managed to upload files to my storage with C#, but I don't want to send the files to the server side and handle them there, because that means the files are first uploaded to the server and then moved to blob storage, and that is not good for me because I'm dealing with very large files.
I need to transfer the files to the blob storage without going through the server.
How can I do it? I haven't managed to find many articles addressing this issue; I've read about SAS and CORS, which help address the problem, but without actual guidelines to follow.
You are right that CORS and SAS are the correct way to do this. I would recommend reading the Create and Use a SAS with the Blob Service article for an introduction. Please also see our demo at Build 2014 for a sample application that lists and downloads blobs in JavaScript using CORS and SAS.
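The asker is working in C#, but for illustration here is a hedged sketch of the server-side piece in Java, using the azure-storage-blob v12 SDK (which postdates the question; the container name and expiry are placeholders): the server issues a short-lived, write-only SAS URL, and the browser PUTs the file straight to it.

    import com.azure.storage.blob.BlobClient;
    import com.azure.storage.blob.BlobServiceClientBuilder;
    import com.azure.storage.blob.sas.BlobSasPermission;
    import com.azure.storage.blob.sas.BlobServiceSasSignatureValues;
    import java.time.OffsetDateTime;

    public class SasIssuer {
        // Returns a URL the browser can upload to directly; the storage
        // account key stays on the server.
        public static String uploadUrl(String connectionString, String blobName) {
            BlobClient blob = new BlobServiceClientBuilder()
                    .connectionString(connectionString)
                    .buildClient()
                    .getBlobContainerClient("uploads")   // placeholder container
                    .getBlobClient(blobName);

            BlobServiceSasSignatureValues sas = new BlobServiceSasSignatureValues(
                    OffsetDateTime.now().plusMinutes(15), // short-lived
                    new BlobSasPermission().setCreatePermission(true).setWritePermission(true));

            return blob.getBlobUrl() + "?" + blob.generateSas(sas);
        }
    }

You also need a CORS rule on the storage account allowing PUT from your site's origin, or the browser will block the upload request.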