Firebase Multiple Images - javascript

I'm sort of new at Firebase and I was wondering if anyone knows how to upload multiple images to Firebase at once? I am creating a website where I want to have several upload-file buttons and use JavaScript to have all of those images uploaded at once to Firebase Storage under the same ID. Thank you in advance!

First, look at the JS FileList documentation for an example snippet of how to let the user select multiple images (files) for upload.
Then have a look at the Firebase Storage upload-files documentation for an example of how to upload a single file.
A Promise.all over the individual upload tasks is probably the best way to handle multiple images...

Something like this should do the trick:
// set it up
firebase.storage().ref().constructor.prototype.putFiles = function (files) {
  var ref = this;
  return Promise.all(files.map(function (file) {
    return ref.child(file.name).put(file);
  }));
};

// use it!
firebase.storage().ref().putFiles(files).then(function (metadatas) {
  // Get an array of file metadata
}).catch(function (error) {
  // If any task fails, handle it here
});
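To feed putFiles a plain array, you still need to wire it to a multi-file input. A minimal sketch, assuming an <input type="file" multiple> element on the page (the element and function names here are illustrative, not from the question):

```javascript
// Convert an array-like FileList into a real array so array methods like
// .map work reliably across browsers.
function fileListToArray(fileList) {
  return Array.prototype.slice.call(fileList);
}

// Hypothetical wiring: assumes the putFiles helper from the answer above
// has been installed, and inputEl is an <input type="file" multiple>.
function wireUploader(inputEl, storageRef) {
  inputEl.addEventListener('change', function (e) {
    var files = fileListToArray(e.target.files);
    storageRef.putFiles(files).then(function (metadatas) {
      console.log('Uploaded ' + metadatas.length + ' files');
    });
  });
}
```

The conversion step matters because a FileList is array-like but not a true Array in older environments.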

Related

How to upload Bulk amount of json data with images to the firebase realtime and storage respectively

I have a bulk amount of data in CSV format. I am able to upload that data with Python by converting the rows to dictionaries in a loop, and the whole dataset gets uploaded.
But now I want to upload the bulk data to Firebase and the images to Storage, and I want to link each document to its image, because I am working on an e-commerce React app, so that I can retrieve documents along with their images.
Which is a good way to do this? Should I do it with JavaScript or Python?
I uploaded the data manually to Firebase by importing it there, but I am still unable to upload the bulk images to Storage, and also unable to create references between them. Please point me to a source where I can find a solution.
This is tough, because it's hard to fully understand exactly how your images and CSVs are linked. Generally, though, if you need to link something to items stored in Firebase, you can get a link either manually (go into Storage, click an item, and the 'Name' field on the right-hand side is a link), or you can get it when you upload the item. For example, I have my images stored in Firebase and a Postgres database with a table storing their locations. In my API (Express), when I post the image to blob storage, I create the URL of the item and insert it as an entry in my table, as well as setting it to be the blob's name. I'll put the code here, but obviously it's a completely different architecture to your problem, so I'll try to highlight the important bits (it's also JS, not Python, sorry!):
const uploadFile = async () => {
  var filename = "" + v4.v4() + ".png"; // uses the uuid library to create a unique value
  const options = {
    destination: filename,
    resumable: true,
    validation: "crc32c",
    metadata: {
      metadata: {
        firebaseStorageDownloadTokens: v4.v4(),
      },
    },
  };
  storage
    .bucket(bucketName)
    .upload(localFilename, options, function (err, file) {});
  pg.connect(connectionString, function (err, client, done) {
    client.query(
      `INSERT INTO table (image_location) VALUES ('${filename}')`, // inserts the filename we set earlier into the postgres table
      function (err, result) {
        done();
        if (err) return console.error(err);
        console.log(result.rows.length);
      }
    );
  });
  console.log(`${filename} uploaded to ${bucketName}.`);
};
Once you have a reference between the two like this, you can fetch the row from the table first, then use the stored location to pull in the image.
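For a Firebase-only version of the same pattern (no Postgres), you can upload the image and then write its download URL onto the matching Firestore document. A minimal sketch, assuming the web SDK; the `products` collection name and the `uploadAndLink` helper are illustrative, not from the answer above:

```javascript
// Hypothetical helper: uploads one image and stores its download URL on a
// Firestore document, so the document and image stay linked.
// `storageRef` and `firestore` are assumed to be initialized Firebase handles.
function uploadAndLink(storageRef, firestore, productId, file) {
  var imageRef = storageRef.child('products/' + productId + '/' + file.name);
  return imageRef.put(file)
    .then(function (snapshot) { return snapshot.ref.getDownloadURL(); })
    .then(function (url) {
      return firestore.collection('products').doc(productId)
        .set({ imageUrl: url }, { merge: true })
        .then(function () { return url; });
    });
}
```

Using `{ merge: true }` means the URL is added to an existing product document without overwriting its other fields, which fits the bulk-import flow where the documents are created first from the CSV.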

Directus v9 file-type validations before upload

Is there any way to validate or limit which file types/extensions can be uploaded, before the upload actually happens?
I've tried a couple of custom hooks, but the uploads all just went through.
Hooks tried:
files.create.before
items.create.before
module.exports = function registerHook({ exceptions }) {
  const { InvalidPayloadException } = exceptions;
  return {
    "files.create.before": async function (input) {
      console.log(input);
      throw new InvalidPayloadException("title");
    },
  };
};
In the Directus hooks documentation, if you scroll down past the Filter or Action events, you will see a small block listing the system collection names where files cannot be intercepted during create/update.
Maybe they're trying to keep it as a pure file manager that can store every file type, and then bind files to items through a pivot table.
You could try creating a custom user interface where you limit the allowed file extensions yourself (not sure if it works).

iterate through firebase storage to display multiple images

Currently I am able to fetch a specific picture knowing its title and location in the storage, but I want to be able to show all pictures in one folder inside my storage knowing the location of that storage folder but not the content titles.
I have tried using the below code (projectID is the folder which I need to show all the elements of) but it doesn't seem to work. I am new to javascript so I apologize for the wrong function call of .once.
const childRef = storageRefer.child(`${projectID}`);
childRef.once("value", function (snapshot) {
  snapshot.forEach(function (child) {
    child.getDownloadURL().then(function (url) {
      console.log(url);
    });
  });
});
This code should log the URL of every image, but all I get is an error about the .once function. If anyone knows what I am doing wrong, or a better method for getting all the images in one folder inside my storage, that would be super helpful, thanks!
Edit:
Looking back at this I realized I could store the location of the images into a database for them as I can easily iterate through a database without knowing what is inside and call to storage to get the image, but that seems sloppy?
At the time this answer was written, there was no API call in Firebase Storage to list all files in a folder. If you need such functionality, you should store the metadata of the files (such as the download URLs) in a place where you can list them. Firebase Firestore is perfect for this, and also lets you easily share the URLs with others.
var listRef = firebase.storage().ref().child('profiles/');
listRef.listAll().then(function (res) {
  res.items.forEach(function (itemRef) {
    itemRef.getDownloadURL().then(function (link) {
      console.log(link);
    });
  });
});
This snippet logs a download URL for every photo saved under the "profiles/" folder in Storage.
I hope it works for you.
#Shodmoth Check out this new firebase link (https://firebase.google.com/docs/storage/web/list-files) for how to list all the files in a folder.
// Create a reference under which you want to list
var listRef = storageRef.child('files/uid');

// Find all the prefixes and items.
listRef.listAll().then(function (res) {
  res.prefixes.forEach(function (folderRef) {
    console.log(folderRef);
  });
  res.items.forEach(function (itemRef) {
    console.log(itemRef); // can call .getDownloadURL() on each itemRef
  });
}).catch(function (error) {
  // Uh-oh, an error occurred!
});
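For folders with a very large number of files, listAll() fetches everything in one go, while the related list() API pages through results. A hedged sketch of paging, assuming a Firebase Storage reference; the page size of 100 and the helper name are arbitrary:

```javascript
// Hypothetical helper: pages through a folder with list() instead of
// listAll(), accumulating item references across pages.
// `ref` is assumed to be a Firebase Storage reference.
function listAllPaged(ref, pageToken, acc) {
  acc = acc || [];
  return ref.list({ maxResults: 100, pageToken: pageToken }).then(function (res) {
    acc = acc.concat(res.items);
    if (res.nextPageToken) {
      // More pages remain: recurse with the token for the next page.
      return listAllPaged(ref, res.nextPageToken, acc);
    }
    return acc;
  });
}
```

Paging keeps memory bounded and lets you start processing the first items before the whole listing finishes.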

Firebase Cloud FireStore: Insert Large Array

Disclaimer: I have been programming for about 4 months now, so I'm still very new to programming.
I am using Firebase Cloud Firestore for a database and inserting CSV files with large amounts of data in them. Each file can be about 100k records in length. The project I'm working on requires a user to upload these CSV's from a web page.
I created a file uploader and I'm using the PapaParse JS tool to parse the csv, and it does so very nicely, it returns an array instantly, even if it's very long. I've tried to use the largest files I can find and then logging it to the console, it's very fast.
The problem is when I then take that array it gives me and loop through it and insert that into Cloud Firestore. It works, the data is inserted exactly how I want it. But it's very slow. And if I close the browser window it stops inserting. Inserting only 50 records takes about 10-15 seconds. So with files of 100k records, this is not going to work.
I was considering using Cloud Functions, but before I now try and learn how all that works, maybe I'm just not doing this in an efficient way? So I thought to ask here.
Here is the JS
// Get the file uploader in the DOM
var uploader = document.getElementById('vc-file-upload');

// Listen for when a file is uploaded
uploader.addEventListener('change', function (e) {
  // Get the file
  var file = e.target.files[0];
  // Parse the CSV file and insert into Firestore
  const csv = Papa.parse(file, {
    header: true,
    complete: function (results) {
      console.log(results);
      var sim = results.data;
      var simLength = sim.length;
      for (var i = 0; i < simLength; i++) {
        var indSim = sim[i];
        var iccid = indSim.iccid;
        const docRef = firestore.collection('vc_uploads').doc(iccid);
        docRef.set({ indSim }).then(function () {
          console.log('Insert complete.');
        }).catch(function (err) {
          console.log('Got an error: ' + err);
        });
      }
    }
  });
});
It will almost certainly be faster overall if you just upload the file (perhaps to Cloud Storage) and perform the database operations in Cloud Functions or some other backend, and it will complete even if the user leaves the app.
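Before reaching for Cloud Functions, note that the loop in the question issues one network round-trip per document. Firestore batched writes commit up to 500 documents per request, which is usually dramatically faster even from the browser. A minimal sketch under those assumptions; the `insertRows` helper name is illustrative, and the collection name mirrors the question's code:

```javascript
// Split an array into chunks of at most `size` elements.
function chunk(arr, size) {
  var out = [];
  for (var i = 0; i < arr.length; i += size) {
    out.push(arr.slice(i, i + size));
  }
  return out;
}

// Hypothetical helper: commits rows in batches of 500 (Firestore's
// per-batch write limit). `firestore` is assumed to be an initialized
// Firestore handle.
function insertRows(firestore, rows) {
  var commits = chunk(rows, 500).map(function (group) {
    var batch = firestore.batch();
    group.forEach(function (row) {
      batch.set(firestore.collection('vc_uploads').doc(row.iccid), row);
    });
    return batch.commit();
  });
  return Promise.all(commits);
}
```

This still stops if the user closes the browser window, so the answer's point stands: for 100k records, uploading the raw file and processing it in a backend is the robust option.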

Remove all files from Mongo GridFs (both files and chunks)

I'm writing some tests against MongoDB GridFS and want to make sure that I start each test with a clean slate, so I want to remove everything, both from fs.files and fs.chunks.
What's the easiest way to do that?
If GridFS has its own database, I would just drop the database via the mongo shell with db.dropDatabase(). Alternatively, if there are collections in the database you would like to keep apart from fs.files and fs.chunks, you could drop those two collections explicitly with db.collection.drop(). In both cases you can run the command from a driver rather than through the shell if desired.
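With the Node.js driver, the same clean-slate reset can be scripted into a test setup. A minimal sketch, assuming the default `fs` bucket prefix and a connected Db handle; the helper name is illustrative:

```javascript
// Hypothetical test-setup helper: drops the default GridFS collections so
// each test starts from a clean slate. It ignores the "namespace not found"
// error raised when a collection does not exist yet.
async function resetGridFs(db) {
  for (const name of ['fs.files', 'fs.chunks']) {
    try {
      await db.collection(name).drop();
    } catch (err) {
      if (err.codeName !== 'NamespaceNotFound') throw err;
    }
  }
}
```

Call it in a beforeEach hook so every test begins with empty files and chunks collections.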
If you want to remove all GridFS files, try something like this (pymongo):
for grid_out in fs.find():
    fs.delete(grid_out._id)
The simple way to remove all files and chunks is
let bucket = new GridFSBucket(db, { bucketName: 'gridfsdownload' });
bucket.drop(function (error) { /* do something */ });
This will remove the collections as well. If you want to keep the collections, you need to do something like this:
let allFiles = await db.collection('uploaded.files').find({}).toArray();
for (let file of allFiles) {
  await bucket.delete(file._id);
}
