Create a file for upload in Rails 3 using JavaScript

My web app allows users to record their geolocation data. I need to somehow get that geolocation data into a file so I can put it in s3 storage. I have no idea how to go about this, but the controller already has file uploads to s3 set up using paperclip.
Is there some way to generate a file with javascript and then attach that file when the user clicks save? The other option I was thinking is that I could add a bunch of strings to the body using jQuery .data() method, but then I don't know how to attach a string as a file in my rails 3 form.
Any help is appreciated.

Maybe you should try out Amazon's SimpleDB instead of S3 for this? It would be more appropriate than creating files in S3 just to store data.
Amazon recently released a Ruby SDK for their web services, SimpleDB included:
https://github.com/amazonwebservices/aws-sdk-for-ruby
Edit: Or better yet, forget using SimpleDB directly. I had forgotten that that SDK includes an implementation of ActiveModel called AWS::Record. That should make this trivial.
I'm assuming you're on Heroku or something and don't have a method of data persistence?
Edit: Looking quickly at Paperclip's assign method, there's a chance this would work:
yourmodel.your_paperclip_attachment = StringIO.new(params[:your_posted_geolocation_data])
Paperclip appears to handle creating a tempfile from the stream.
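On the client side, the question's "generate a file with JavaScript" part can be done by serializing the recorded samples and sending them as a multipart file part. A minimal sketch, assuming a hypothetical `user[geodata]` parameter name on a Rails endpoint (both the field name and the URL are made up, not from the question):

```javascript
// Turn recorded geolocation samples into a JSON string for upload.
function serializeSamples(samples) {
  return JSON.stringify(samples.map(function (s) {
    return { lat: s.lat, lng: s.lng, at: s.at };
  }));
}

// Wrap the string in a Blob so the browser sends it as a file part;
// Paperclip (or the StringIO trick above) handles it server-side.
function buildGeodataForm(samples) {
  var blob = new Blob([serializeSamples(samples)], { type: "application/json" });
  var form = new FormData();
  form.append("user[geodata]", blob, "geodata.json"); // filename is arbitrary
  return form;
}

// On save, POST the form via XHR instead of the normal submit (sketch):
// var xhr = new XMLHttpRequest();
// xhr.open("POST", "/users/42", true);
// xhr.send(buildGeodataForm(recordedSamples));
```

Alternatively, the JSON string can simply go in a hidden input and be wrapped with StringIO server-side, as the answer shows.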

Related

How to retrieve file names from a specific directory using JavaScript or jQuery?

I want to write some code using only JavaScript and/or jQuery that accesses a static directory, retrieves the names of some icons I saved there (SVG icons), and displays the names to the user. I couldn't do that with the File API and I have no idea where to start.
I'm going to assume from your mention of jQuery and the File API that you're trying to do this within a browser.
You can't. It just isn't allowed; there is no mechanism that provides it.
If you're in control of the machine where you want this information to be accessed, you can run a server process on it that can do that; code in the browser can then make a request to the server code to request the information. But there's no browser-only way to do it.
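The server process can be very small. A minimal Node sketch of the idea (the `/icons` route and `./static/icons` path are hypothetical, purely for illustration):

```javascript
// Pure helper: keep only .svg names, sorted for stable display.
function svgNames(fileNames) {
  return fileNames
    .filter(function (name) { return /\.svg$/i.test(name); })
    .sort();
}

// Server wiring (sketch): respond to /icons with the directory listing
// as JSON; the browser code then fetches /icons and renders the names.
// var fs = require("fs");
// var http = require("http");
// http.createServer(function (req, res) {
//   if (req.url === "/icons") {
//     res.setHeader("Content-Type", "application/json");
//     res.end(JSON.stringify(svgNames(fs.readdirSync("./static/icons"))));
//   }
// }).listen(3000);
```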

Firebase Storage download to computer

So I am trying to figure out how to download an array of images to a user's computer. I have been storing everything through calls to my server, as I feel more secure using Firebase on the server. So on a button click on the client I can get back an array of the images in my Firebase Storage bucket.
Button click -> call server -> get a return of the array of urls from firebase
Now is there a way to download these to the users computer? Prompt them to choose a file path or download them directly?
I know I can do a single download auto by this:
var a = $("<a>").attr("href", url).attr("download", "img.png").appendTo("body");
a[0].click();
a.remove();
I have tested that a single URL auto-downloads on the button click, but I don't feel like I should have to loop through the array one at a time to download them all, nor do I know whether that would work. I assume it would, since a single URL works.
Is there a better way?
There is no way to download multiple files in one request from Firebase Storage. If you want to allow downloading of multiple files, you'll have to store them in a single (say zip) file and use the approach you already do today for downloading that file.
Alternatively you can use the Google Cloud Storage API to download a bunch of files. See this answer for more on that, but be aware that the Google Cloud Storage API is meant for use on an app server and not directly in your web page.
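If zipping server-side isn't an option, looping over the URLs client-side does work; the one caveat is pacing the clicks so the browser doesn't suppress them. A sketch (the 500 ms gap is a guess, not a documented browser limit):

```javascript
// Derive a download filename from a URL: strip the query string,
// keep the last path segment.
function fileNameFromUrl(url) {
  return url.split("?")[0].split("/").pop() || "download";
}

// Trigger one browser download per URL, spacing the synthetic clicks out.
function downloadAll(urls) {
  urls.forEach(function (url, i) {
    setTimeout(function () {
      var a = document.createElement("a");
      a.href = url;
      a.download = fileNameFromUrl(url);
      document.body.appendChild(a);
      a.click();
      a.remove();
    }, i * 500);
  });
}
```

Note that the `download` attribute only takes effect for same-origin URLs in current browsers, which is another point in favor of the single-zip approach.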

Google drive refetch/refresh the previously fetched file

We are using the Google Picker to fetch a file into our application. Previously it was a one-time thing, but we noticed that because we don't allow data manipulation in our app, users need to make their changes on Drive and go through the upload/fetch process again. We want to simplify the workflow and let users do a one-click refresh/resync of the file (a spreadsheet). To do that I am thinking of saving the file_id in my app, though I'll still need an OAuth token to build the service and fetch the file. Is that the right approach, or can you suggest another mechanism I could follow?
The current Google Picker workflow uses JS on the client side, which provides the OAuth token, the fileId, and the name of the file. On the server side I use Python to build the service and fetch the actual file.
In my opinion, your process, i.e. saving the file ID and the access/refresh tokens and fetching the file when you need it, is the correct way to go.
I have built an application where we manage our files using google drive. In short my process was:
User consent and Auth.
Use picker to upload files and upon successful operation save necessary file data.
Fetch the file (with the access token, or use the refresh token to get a new access token) whenever we need to display it to the user.
I hope this helps.
Cheers
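Step 2 above is where the file ID gets captured. A client-side sketch of the Picker callback (the `/files` endpoint is hypothetical; the `{ action, docs }` shape is what the Picker passes to its callback, with `"picked"` as the action value on a successful pick):

```javascript
// Pull the picked file IDs out of the Picker callback payload.
function pickedFileIds(pickerData) {
  // Payload looks like { action: "picked", docs: [{ id: ..., name: ... }] }.
  if (pickerData.action !== "picked") return [];
  return pickerData.docs.map(function (doc) { return doc.id; });
}

// Wiring (sketch): hand each ID to the server, which stores it and can
// rebuild the Drive service later to refetch the file on one click.
// function pickerCallback(data) {
//   pickedFileIds(data).forEach(function (id) {
//     $.post("/files", { file_id: id });
//   });
// }
```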

Android data transfer between apps and website

I have built a website which provides a lot of data, and because of company rules I can't get the data directly from the database for my app version. The only way is to use a WebView to show the existing website, but I want to pull some data out of the website for my app to process.
I would like to get a value called productID from the website and save it as a record in the app's local database. This save will happen in the Android app. Is there a way to do this?
Is there a way to get data from the website? Is JavaScript a possibility for this case?
Just a few keywords to Google for you: look into building a REST API on your server (you can write it in PHP) that returns the data as JSON to your app.
Use GSON in your app to map the JSON onto a Java object. And keep in mind to do all network calls in a thread/AsyncTask. You can have a look at the Volley library provided by Google.
Have fun with your Project ;) Step by step, and you will get it!
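To make the suggestion concrete, this is roughly the JSON the endpoint would return; a small JavaScript sketch of the server-side serialization (field names are hypothetical — GSON on the Android side just needs a matching Java class):

```javascript
// Build the JSON payload a /products REST endpoint might return.
function productPayload(products) {
  return JSON.stringify({
    products: products.map(function (p) {
      return { productID: p.id, name: p.name };
    })
  });
}
```

On the Android side, GSON would deserialize this into a class with a `List<Product>` field named `products`.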

showing image from Amazon S3 bucket only with javascript

Hypothesis: I have thousands of images in different folders in an Amazon S3 bucket. I'd like to make them accessible to unlogged users as a slideshow, but I don't want to deal with a database and poor server performance (in case of too many users at the same time), so I'd like to use only JavaScript.
The problem is that I would still have to deliver the file list to the client, since I can't use XMLHttpRequest to fetch and parse the XML file that Amazon provides when you browse a bucket, because (I expect) the browsing page would have to be located on my web server.
I think I should write some server-side code that creates, after every upload/modification, an updated file list to share with users, but I'm not sure it's a good idea.
Can anybody suggest the best way to proceed?
Happy New Year!
Possible answer, tell me what you think about it:
Amazon provides ListBucket operation http://docs.amazonwebservices.com/AmazonS3/latest/API/SOAPListBucket.html
I can choose how many results to get at once using the max-keys and marker (for pagination) parameters (example: http://download.terracotta.org/?max-keys=5).
I will obtain an XML file (as small as I want) that I can parse locally with JS in a "list.html" file, for example.
I could then include this list.html file (which should print just the definition of an array of images) in an iframe in my slideshow.html file on my web server.
Too dirty?
The Amazon S3 JavaScript API has a method, bucket.list() that will list the contents of a bucket.
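With the AWS SDK for JavaScript, the answer's suggestion looks roughly like this. A sketch, assuming a CORS-enabled, publicly listable bucket — the bucket name `my-slideshow-bucket` is made up:

```javascript
// Pure helper: map an S3 listing response to its object keys.
// listObjects returns { Contents: [{ Key: ..., Size: ... }, ...] }.
function keysFromListing(listing) {
  return (listing.Contents || []).map(function (obj) { return obj.Key; });
}

// Wiring (sketch), using the SDK's paginated listing with MaxKeys/Marker
// just like the raw ListBucket call in the question:
// var AWS = require("aws-sdk");
// var s3 = new AWS.S3();
// s3.listObjects(
//   { Bucket: "my-slideshow-bucket", MaxKeys: 5, Marker: "" },
//   function (err, data) {
//     if (err) return console.error(err);
//     var imageUrls = keysFromListing(data).map(function (key) {
//       return "https://my-slideshow-bucket.s3.amazonaws.com/" + key;
//     });
//     // feed imageUrls to the slideshow; data.IsTruncated tells you
//     // whether to issue another request with Marker set to the last key
//   }
// );
```

This keeps everything client-side, avoiding both the iframe trick and hand-parsing the ListBucket XML.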
