So I am trying to figure out how to download an array of images to a user's computer. I have been routing everything through my server because I feel more secure using Firebase on the server. So on a button click on the client, I get back an array of URLs for the images in my Firebase Storage bucket.
Button click -> call server -> get back the array of URLs from Firebase
Now, is there a way to download these to the user's computer? Prompt them to choose a file path, or download them directly?
I know I can trigger a single automatic download like this:
var a = $("<a>").attr("href", url).attr("download", "img.png").appendTo("body");
a[0].click();
a.remove();
I have tested a single-URL download that triggers automatically on the button click, but I don't feel like I should have to loop through the array one file at a time to download them all, and I don't even know whether that would work. I would assume it would, since a single URL works.
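For reference, the brute-force version I have in mind would just reuse that anchor trick in a loop, something like this (only a sketch, and I'm aware some browsers may block or throttle multiple programmatic downloads):

function downloadAll(urls) {
    // urls is the array returned from the server
    urls.forEach(function (url, i) {
        var a = $("<a>")
            .attr("href", url)
            .attr("download", "img-" + i + ".png")
            .appendTo("body");
        a[0].click();
        a.remove();
    });
}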
Is there a better way?
There is no way to download multiple files in one request from Firebase Storage. If you want to allow downloading of multiple files, you'll have to store them in a single (say, zip) file and use the approach you already use today for downloading that file.
Alternatively you can use the Google Cloud Storage API to download a bunch of files. See this answer for more on that, but be aware that the Google Cloud Storage API is meant for use on an app server and not directly in your web page.
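For illustration, a server-side zip endpoint could look roughly like this (just a sketch using the Cloud Storage Node.js client and the archiver npm package in an Express handler; the bucket name, route, and file list are placeholders):

const express = require('express');
const archiver = require('archiver');
const { Storage } = require('@google-cloud/storage');

const app = express();
const bucket = new Storage().bucket('my-app.appspot.com'); // placeholder bucket name

app.get('/download-zip', (req, res) => {
  // Object paths to bundle; in practice you'd look these up per user.
  const filePaths = ['images/one.png', 'images/two.png'];

  res.attachment('images.zip');      // sets Content-Disposition so the browser downloads it
  const archive = archiver('zip');
  archive.pipe(res);                 // stream the zip straight into the response

  filePaths.forEach((path) => {
    // Stream each object from Cloud Storage into a zip entry of the same name.
    archive.append(bucket.file(path).createReadStream(), { name: path });
  });

  archive.finalize();
});

That way the client still downloads a single URL, exactly like the anchor approach you already have.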
Related
I am trying to create an "upload to OneDrive" button on my website. I would just like the button to open a user authentication window, so that the user can upload a file created on my website directly to their own OneDrive.
I have already done the same thing with Google Drive, and it was so simple that I don't understand why I can't find a solution for OneDrive (and Dropbox too, for that matter).
My stack is Node.js/Python/JS.
Create a database to store the files. Try using MongoDB (via the mongodb Node package) to create a database to store the uploaded files.
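If you go that route, a minimal sketch with GridFS might look like this (assuming a local MongoDB and a file already saved to a temp path; the database name, bucket name, and paths are all placeholders):

const fs = require('fs');
const { MongoClient, GridFSBucket } = require('mongodb');

async function storeUploadedFile(tempPath, filename) {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const db = client.db('myapp');                        // placeholder database name
  const bucket = new GridFSBucket(db, { bucketName: 'uploads' });

  // Stream the temp file into GridFS under the original filename.
  await new Promise((resolve, reject) => {
    fs.createReadStream(tempPath)
      .pipe(bucket.openUploadStream(filename))
      .on('finish', resolve)
      .on('error', reject);
  });

  await client.close();
}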
I aim to create a ReactJS program that works offline. That is, the program will keep its information stored locally, and only when it needs to update will it connect to the internet and download the data through an API.
At this point, I chose to store the information coming from the API in localStorage.
componentDidMount() {
  // Fetch the latest data once the component mounts,
  // then cache it in localStorage for offline use.
  fetch('api.url')
    .then(response => response.json())
    .then(data => localStorage.setItem('fromAPI', JSON.stringify(data)))
    .catch(error => console.error('Could not refresh offline data', error));
}
My question is: the API will also return photos, and I need to download those photos and store them locally as well. How can I do that?
What do you mean store them locally?
Your web app cannot access the user's hard drive as it pleases. You can download resources with a download prompt, or save them straight to the user's configured download folder, but to reuse them you'd need the user to hand them back to you (with input[type=file]). I guess this is not what you're looking for.
Alternatively, you can store files within localStorage, presumably base64-encoded. (Although I think UTF-16 might work as well.) Just beware of localStorage's size limitation, around 5-10 MB depending on the browser.
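For example, a rough sketch of that approach (the image URL and storage key are placeholders):

// Fetch an image, base64-encode it via FileReader, and cache it in localStorage.
fetch('https://example.com/photo.jpg')
  .then(function (response) { return response.blob(); })
  .then(function (blob) {
    var reader = new FileReader();
    reader.onloadend = function () {
      try {
        // reader.result is a data: URL containing the base64-encoded image
        localStorage.setItem('photo', reader.result);
      } catch (e) {
        // Typically a QuotaExceededError once you hit the 5-10 MB limit
        console.error('Could not cache the image', e);
      }
    };
    reader.readAsDataURL(blob);
  });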
I didn't quite understand: do you want to save the image URL?
If you don't, you can save the base64 of the image; read up on how to convert an image to base64.
Here is a nice package you can use:
https://www.npmjs.com/package/image-to-base64
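Usage is roughly like this (following the example on the package's npm page; the image URL is a placeholder):

const imageToBase64 = require('image-to-base64');

// Pass a path or URL to the image; the promise resolves with the base64 string,
// which you could then store in localStorage as described above.
imageToBase64('https://example.com/photo.jpg')
  .then((base64) => console.log(base64))
  .catch((error) => console.error(error));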
We are using the Google Picker to fetch a file into our application. Previously this was a one-time thing, but because we don't allow data manipulation in our app, the user has to make their changes in Drive and then go through the upload/fetch process again. We want to simplify the workflow and let the user do a one-click refresh/resync of the file (a spreadsheet). To do that, I am thinking of saving the file_id in my app, though I'll still need an oAuthToken to build the service and fetch the file. Is that the right approach, or is there another mechanism I should follow?
The current Google Picker workflow uses JS on the client side, which provides the oAuthToken, fileId, and name of the file. On the server side I use Python to build the service and fetch the actual file.
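Roughly, the client-side callback currently looks like this (simplified sketch; the endpoint name and the currentOauthToken variable are placeholders for what my page already has):

function pickerCallback(data) {
  if (data[google.picker.Response.ACTION] === google.picker.Action.PICKED) {
    var doc = data[google.picker.Response.DOCUMENTS][0];
    // Send the server what it needs to re-fetch the file later.
    fetch('/api/drive-file', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        fileId: doc[google.picker.Document.ID],
        name: doc[google.picker.Document.NAME],
        oauthToken: currentOauthToken
      })
    });
  }
}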
In my opinion, your process, i.e. saving the file ID and the access/refresh tokens and fetching the file when you need it, is the correct way to go.
I have built an application where we manage our files using Google Drive. In short, my process was:
User consent and Auth.
Use picker to upload files and upon successful operation save necessary file data.
Fetch the file (with the access token, or use the refresh token to get a new access token) whenever we need to show it to the user, as sketched below.
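A rough sketch of that refresh step (shown in Node for illustration, though your server side is Python; the HTTP call is the same either way, and the client credentials here are placeholders):

// Node 18+ (global fetch). Exchange a stored refresh token for a fresh access token.
async function getFreshAccessToken(refreshToken) {
  const res = await fetch('https://oauth2.googleapis.com/token', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({
      client_id: process.env.GOOGLE_CLIENT_ID,        // placeholder credentials
      client_secret: process.env.GOOGLE_CLIENT_SECRET,
      refresh_token: refreshToken,
      grant_type: 'refresh_token'
    })
  });
  const json = await res.json();
  return json.access_token; // use this to call the Drive API with the saved file ID
}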
I hope this helps.
Cheers
My web app allows users to record their geolocation data. I need to somehow get that geolocation data into a file so I can put it in S3 storage. I have no idea how to go about this, but the controller already has file uploads to S3 set up using Paperclip.
Is there some way to generate a file with JavaScript and then attach that file when the user clicks save? The other option I was thinking of is that I could attach a bunch of strings to the body using jQuery's .data() method, but then I don't know how to attach a string as a file in my Rails 3 form.
Any help is appreciated.
Maybe you should try out Amazon's SimpleDB instead of S3 for this? It would be more appropriate than creating files just to store data in S3.
Amazon recently released a Ruby SDK for their web services, SDB included:
https://github.com/amazonwebservices/aws-sdk-for-ruby
Edit: Or better yet, forget using SDB directly. I had forgotten that that SDK includes an implementation of ActiveModel called AWS::Record, which should make this trivial.
I'm assuming you're on Heroku or something and don't have a method of data persistence?
Edit: Looking quickly at Paperclip's assign method, there's a chance this would work:
yourmodel.your_paperclip_attachment = StringIO.new(params[:your_posted_geolocation_data])
Paperclip appears to handle creating a tempfile from the stream.
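On the client side, the matching piece could be as simple as this (a sketch; the hidden field id just mirrors the placeholder param name in the Ruby line above):

// Capture the user's position and stash it in a hidden field inside the Rails form,
// so it arrives in params and can be wrapped in StringIO as shown above.
navigator.geolocation.getCurrentPosition(function (position) {
  var payload = JSON.stringify({
    latitude: position.coords.latitude,
    longitude: position.coords.longitude,
    timestamp: position.timestamp
  });
  $('#your_posted_geolocation_data').val(payload);
});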
I have a rather large db (as in many records). I'd rather let the client download a pre-built db instead of forcing them to load a bunch of text, then insert all the records before being able to use the db.
The closest thing to a spec I can find is this:
https://developer.apple.com/library/content/documentation/iPhone/Conceptual/SafariJSDatabaseGuide/UsingtheJavascriptDatabase/UsingtheJavascriptDatabase.html
It doesn't mention anything about being able to download a database, but I thought someone on SO might have a solution.
Just build the DB in the simulator or via some GUI, then make it available on the web via a link to the DB file on your web server.
In the app, check a flag you define to see whether the DB has been downloaded yet. If it hasn't, download the DB just like any other file over HTTP and store it in your Documents directory.
Then set your flag so the app knows it has downloaded the DB and doesn't need to do it again.
When you need to access the DB, just point SQLite at the DB file you placed in the Documents directory.