Hypothesis:
I have some JSON files in an Amazon S3 bucket.
In the same bucket I have an HTML/JS viewer that uses those JSON files as its data source.
I'd like users to be able to see the data only through my HTML/JS viewer, and not, for example, by downloading the JSON files locally or using a third-party viewer.
Is there any way to achieve this?
This isn't possible with only browser-side code served from a static S3 bucket. No matter how hard you obfuscate or encrypt the data, you still have to ship the code that unlocks and displays it.
The only way you can (more or less) ensure they have to view your advertisement is to serve up the HTML of the viewer itself. It sounds like this JSON isn't changing much if you're just serving it out of an S3 bucket, so you can probably pre-render the HTML with your advertisement and put that in the S3 bucket, rather than serving JSON plus an HTML viewer.
Hi, I just signed up to Amazon S3 and have uploaded images. My question is: should I store my CSS and JS files in S3 too, or can this cause issues with the site?
You can use S3, but be careful with your bucket permissions, and ideally put CloudFront in front of it.
The ideal way to serve your css and js files in a 'serverless' way is to store these files in S3 and have it served by CloudFront.
You can follow this tutorial to configure CloudFront to serve your static content from S3.
This will not only save you money (requests served from CloudFront are generally cheaper than serving the same files straight from S3), but will also get data to your users faster, since CloudFront is a CDN (Content Delivery Network).
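If you want the bucket to be readable only through CloudFront, the "be careful of your bucket permissions" advice above can be made concrete with an origin access identity. A sketch of such a bucket policy (the OAI id and bucket name are placeholders for your own values):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::cloudfront:user/CloudFront Origin Access Identity EXAMPLE-OAI-ID"
      },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::example-bucket/*"
    }
  ]
}
```

With a policy like this (and public access otherwise blocked), objects can only be fetched via the CloudFront distribution that owns the identity, not by hitting the S3 URL directly.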
I'm using the JavaScript SDK for AWS S3. I'm attempting to upload a file to my bucket and, right after that, get that file's data back from S3. So I'm facing two issues:
I need to execute two functions in sequence: one to upload and another to fetch the file's data. I'm using the client-side JavaScript SDK for AWS S3 and I don't know how to be sure all files are fully uploaded before starting to fetch them from the bucket.
Also, the object data does not include the URL of the file, so I don't know how to get it.
Any help would be appreciated.
I have a web application with a Javascript Frontend and a Java Backend.
I have a use case where users can upload their profile pictures. The use case is simple: send the image from the user's browser and store it in S3, with a maximum file size. Currently the Frontend sends the image data stream to the Backend, which then stores the image in S3 using the AWS Java SDK.
The Backend currently has to store the image on a file system first in order to know its size (and avoid reading more bytes than the allowed maximum), since S3 requires the Content-Length to be sent with the PUT Object request.
Is there another way to do this with AWS? Using another service, maybe Lambda? I don't like having to store the file on a file system first and then open a stream again to send it to S3.
Thank you very much in advance.
You might get the file size on the client side as mentioned here but consider browser support.
You shouldn't share your keys with the client-side code. I believe Query String Authentication (a pre-signed URL) should be used in this scenario.
Assuming your maximum file size is smaller than your available memory, why can't you just read it into a byte[] or something similar that you can send to S3 without writing it to disk? You should also be able to get the size that way.
It depends on the S3 Java client you are using. If it allows you to read from a ByteArrayInputStream, then you should be able to do that.
Looks like you can use InputStream. See javadoc.
new PutObjectRequest(String bucketName,
                     String key,
                     InputStream input,
                     ObjectMetadata metadata)
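Concretely, a small JDK-only helper can enforce the maximum size while reading the request stream into memory, so nothing has to touch the file system (a sketch; the class and method names are made up for illustration). Note that if you hand the SDK a raw InputStream without setting the content length on ObjectMetadata, it will buffer the whole stream in memory anyway, so it's worth setting the length explicitly:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class BoundedRead {
    /** Reads the stream fully into memory, failing fast if it exceeds maxBytes. */
    static byte[] readUpTo(InputStream in, int maxBytes) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int n;
        while ((n = in.read(chunk)) != -1) {
            if (buf.size() + n > maxBytes) {
                // Stop as soon as the limit is crossed instead of reading to the end.
                throw new IOException("Upload exceeds " + maxBytes + " bytes");
            }
            buf.write(chunk, 0, n);
        }
        return buf.toByteArray();
    }
}
```

With the bytes in hand, the exact size is known: set it with ObjectMetadata.setContentLength(data.length) and wrap the array in a ByteArrayInputStream for the PutObjectRequest shown above.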
Is there a better online storage service than Amazon's S3, which requires multipart uploads and the whole file to be buffered on my server before it's uploaded to them?
I want a service that I can stream uploads to directly (through my server) without any buffering.
Assuming that what you have is a complete file that you want to end up stored on Amazon, there isn't much else you can do.
You can do streaming to S3 by using the low-level API: http://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectOps.html
The only alternatives are to transfer things piece-wise and then later reassemble them. For example, you could use Kinesis Firehose to upload individual file-chunks to S3. Then you'd need some other job to come along and mash the pieces back together into the original file.
You don't have to buffer the entire file on your server before uploading to S3. S3's multipart upload allows you to upload each part separately; parts can be as small as 5MB, i.e. your server only has to buffer 5MB at a time. I use this technique in goofys to simulate streaming writes.
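Sketched in Java, the idea is to fill a fixed-size buffer from the incoming stream and ship each full buffer as one part. Only the slicing helper below is runnable as-is; the actual SDK calls (InitiateMultipartUpload, UploadPart, CompleteMultipartUpload) are indicated in comments, and the class name is made up for illustration:

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class PartSlicer {
    static final int PART_SIZE = 5 * 1024 * 1024; // S3's minimum part size (except for the last part)

    /** Splits a stream into byte arrays of at most partSize; each would become one UploadPart call. */
    static List<byte[]> slice(InputStream in, int partSize) throws IOException {
        List<byte[]> parts = new ArrayList<>();
        byte[] buf = new byte[partSize];
        int filled = 0, n;
        while ((n = in.read(buf, filled, partSize - filled)) != -1) {
            filled += n;
            if (filled == partSize) {
                parts.add(buf.clone()); // in a real upload: issue UploadPartRequest here
                filled = 0;
            }
        }
        if (filled > 0) parts.add(Arrays.copyOf(buf, filled)); // final, possibly short, part
        // After the last part you would call CompleteMultipartUpload with the collected ETags;
        // only one partSize buffer is ever held in memory at a time.
        return parts;
    }
}
```

In a real upload you would not collect the parts in a list at all, but upload each one as soon as its buffer is full and keep only the returned part ETags.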
I have a web app (MVC4) with a file upload, and I made a terrible mistake: I let clients upload files directly to my server (an Azure virtual machine).
In order to do it, I set (in the WebConfig) :
maxRequestLength="2097151"
executionTimeout="3600"
maxAllowedContentLength="4294967295"
Now I understand that this is not the way to do it.
What I want is to upload user files directly to Azure Blob Storage, without the files passing through my site.
I've managed to upload files to my storage with C#, but I don't want to send the files to the server side and handle them there, because that means the files are first uploaded to the server and then moved to blob storage, which is bad for me since I'm dealing with very large files.
I need to transfer the files to blob storage without going through the server.
How can I do it? I didn't manage to find many articles addressing this issue; I've only read that SAS and CORS help address the problem, but without actual guidelines to follow.
You are right that CORS and SAS are the correct way to do this. I would recommend reading the Create and Use a SAS with the Blob Service article for an introduction. Please also see our demo at Build 2014 for a sample application that lists and downloads blobs in JavaScript using CORS and SAS.
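The flow is: the server hands out a short-lived SAS for the target blob, the client appends it to the blob URL and PUTs the bytes straight to that URL, so the file never passes through your site. A minimal Java sketch of the raw REST call (the URL and token are placeholders and the helper methods are illustrative, not part of any SDK; Blob storage requires the x-ms-blob-type header on a Put Blob request):

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class SasUpload {
    /** Appends a SAS token to a blob URL, preserving any existing query string. */
    static String withSas(String blobUrl, String sasToken) {
        String token = sasToken.startsWith("?") ? sasToken.substring(1) : sasToken;
        return blobUrl + (blobUrl.contains("?") ? "&" : "?") + token;
    }

    /** PUTs the bytes directly to the SAS URL and returns the HTTP status code. */
    static int put(String urlWithSas, byte[] body) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(urlWithSas).openConnection();
        conn.setRequestMethod("PUT");
        conn.setDoOutput(true);
        conn.setRequestProperty("x-ms-blob-type", "BlockBlob"); // required by the Put Blob operation
        conn.setFixedLengthStreamingMode(body.length); // stream the body, don't buffer it
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body);
        }
        return conn.getResponseCode(); // Blob storage answers 201 Created on success
    }
}
```

In the real application the browser would make this PUT (which is where CORS comes in), and the SAS would be generated server-side with a short expiry and write-only permissions for just that one blob.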