I'm working on a project using IPFS, and I'm trying to create a website that allows users to upload files directly from their browser to IPFS. My goal is for the website to be front-end only, but whenever I add a file to IPFS and check its hash on https://gateway.ipfs.io/ipfs/hash-here, nothing happens. This made me think the files are probably not getting uploaded to IPFS because I'm not running a node on my local machine. Is this correct?
const Buffer = require('safe-buffer').Buffer;

export default function uploadFiles(node, files) {
  let reader = new FileReader();
  reader.onloadend = () => {
    // strip the data-URL prefix, keeping only the base64 payload
    let byteData = reader.result.split('base64,')[1];
    let fileData = Buffer.from(byteData, 'base64'); // decode as base64, not utf8
    node.files.add(fileData, (err, res) => {
      if (err) {
        throw err;
      }
      let hash = res[0].hash;
      console.log(hash); // prints a hash that isn't visible on the gateway
      node.files.cat(hash, (err, res) => {
        if (err) {
          throw err;
        }
        let data = '';
        res.on('data', (d) => {
          data = data + d;
        });
        res.on('end', () => {
          // console.log(data);
          // console.log(atob(data));
        });
      });
    });
  };
  reader.readAsDataURL(files['0']);
}
Are you running a js-ipfs node in your browser? Did you get a chance to look at the examples folder in the js-ipfs repo? URL here: https://github.com/ipfs/js-ipfs/tree/master/examples
If you add a file to your node and the node stays online, the IPFS gateway node will be able to fetch the content from your browser node.
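For reference, here is a minimal sketch of starting an in-browser js-ipfs node and adding content to it. This assumes a recent js-ipfs release where Ipfs.create() returns a promise and add() resolves to an object with a cid; older releases used the callback style shown in the question:

const Ipfs = require('ipfs');

async function main() {
  // spins up a full IPFS node inside the browser tab
  const node = await Ipfs.create();
  const { cid } = await node.add('hello world');
  // a gateway can only resolve this while the browser node stays online
  console.log(cid.toString());
}

main();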
So I created my first big project: https://rate-n-write.herokuapp.com/
In brief, this is a blog app where the user can write reviews and publish them along with pictures.
I have used Firebase as the database to store the articles. The app works fine on localhost, but whenever I try to upload an image on Heroku, I get an error.
The error is showing up in line number 8 of the following code (editor.js):
uploadInput.addEventListener('change', () => {
  uploadImage(uploadInput, "image");
})

const uploadImage = (uploadFile, uploadType) => {
  const [file] = uploadFile.files;
  if (file && file.type.includes("image")) {
    const formdata = new FormData();
    formdata.append('image', file);
    // Error shows up here in the fetch line
    fetch('/upload', {
      method: 'post',
      body: formdata
    }).then(res => res.json())
      .then(data => {
        if (uploadType == "image") {
          addImage(data, file.name);
        } else {
          bannerPath = `${location.origin}/${data}`;
          banner.style.backgroundImage = `url("${bannerPath}")`;
        }
      })
    const change_text = document.getElementById("uploadban");
    change_text.innerHTML = " ";
  } else {
    alert("upload Image only");
  }
}
This is just a snippet of the whole editor.js file.
Is it because I am trying to upload the file to the project directory? (server.js snippet below):
app.post('/upload', (req, res) => {
  let file = req.files.image;
  let date = new Date();
  // image name
  let imagename = date.getDate() + date.getTime() + file.name;
  // image upload path
  let path = 'public/uploads/' + imagename;
  // create upload
  file.mv(path, (err, result) => {
    if (err) {
      throw err;
    } else {
      // our image upload path
      res.json(`uploads/${imagename}`)
    }
  })
})
Do I need to use an online storage service like AWS S3?
Heroku is not suitable for persistent storage of data: its filesystem is ephemeral, so the uploaded pictures are deleted whenever the dyno restarts (see Heroku's documentation on the ephemeral filesystem).
I would suggest using a third-party object storage service such as
Cloudinary or AWS S3.
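For example, the /upload route could hand the file to Cloudinary instead of the local filesystem. A minimal sketch, assuming the cloudinary npm package, a CLOUDINARY_URL environment variable, and express-fileupload configured with useTempFiles: true (none of which appear in the original snippet):

const cloudinary = require('cloudinary').v2;

app.post('/upload', (req, res) => {
  const file = req.files.image;
  // upload the temp file to Cloudinary rather than the dyno's disk
  cloudinary.uploader.upload(file.tempFilePath, {}, (err, result) => {
    if (err) return res.status(500).json(err.message);
    // secure_url is a durable URL that survives dyno restarts
    res.json(result.secure_url);
  });
});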
How to download a file with Node.js from the Google Drive API
I don't need anything special. I only want to download a file from Google Drive and then save it to a given directory on the client.
app.get("/download",function(req,res){
const p38290token = new google.auth.OAuth2(CLIENT_ID, CLIENT_SECRET, REDIRECT_URI);
p38290token.setCredentials({ refresh_token: token.acc });
const p38290Id = google.drive({
version: "v3",
auth: p38290token,
});
var dest = fs.createWriteStream("./test.png");
try {
p38290Id.files.get({
fileId: "1daaxy0ymKbMro-e-JnexmGvM4WzW-3Hn",
alt: "media"
}, { responseType: "stream" },
(err, res) => {
res.data
.on("end", () => {
console.log("Done");
})
.on("error", err => {
console.log("Error", err);
})
.pipe(dest); // i want to sent this file to client who request to "/download"
}
)
} catch (error) {
}
})
I want the file to download automatically when someone visits www.xyz.com/download.
The issue seems to be with this line:
var dest = fs.createWriteStream("./test.png");
You are using a file system command which is meant to interact with files on the server. Your question makes it clear that you wish for express to deliver the contents of the file over to the client making the HTTP request.
For that you can just use the res parameter of the route callback function. You declare it on this line:
app.get("/download",function(req,res){
In your case I'd remove the dest variable completely and simply pipe the file to res like so (note that the callback also names its second parameter res, shadowing the route's res, so rename one of them first):
.pipe(res);
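Put together, a sketch of the full route might look like this (based on the question's code; driveRes is the renamed callback parameter, and the Content-Disposition header is an extra suggestion so the browser saves the file instead of rendering it):

app.get("/download", function (req, res) {
  const p38290token = new google.auth.OAuth2(CLIENT_ID, CLIENT_SECRET, REDIRECT_URI);
  p38290token.setCredentials({ refresh_token: token.acc });
  const drive = google.drive({ version: "v3", auth: p38290token });
  drive.files.get(
    { fileId: "1daaxy0ymKbMro-e-JnexmGvM4WzW-3Hn", alt: "media" },
    { responseType: "stream" },
    (err, driveRes) => {
      if (err) return res.status(500).send(err.message);
      // prompt a download on the client instead of rendering inline
      res.setHeader("Content-Disposition", 'attachment; filename="test.png"');
      driveRes.data.pipe(res);
    }
  );
});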
Have a look at this answer as well.
I'm trying to get the contents of a file using the Google Drive API v3 in Node.js.
I read in this documentation that I get a stream back from drive.files.get({fileId, alt: 'media'}), but that isn't the case. I get a promise back.
https://developers.google.com/drive/api/v3/manage-downloads
Can someone tell me how I can get a stream from that method?
I believe your goal and situation are as follows.
You want to retrieve the stream type from the method drive.files.get.
You want to achieve this using googleapis with Node.js.
You have already done the authorization process for using the Drive API.
For this, how about the following answer? In this case, please use the responseType option.
Pattern 1:
In this pattern, the file is downloaded as the stream type and it is saved as a file.
Sample script:
const fs = require("fs");

var dest = fs.createWriteStream("###"); // Please set the filename of the saved file.
drive.files.get(
  { fileId: id, alt: "media" },
  { responseType: "stream" },
  (err, { data }) => {
    if (err) {
      console.log(err);
      return;
    }
    data
      .on("end", () => console.log("Done."))
      .on("error", (err) => {
        console.log(err);
        return process.exit();
      })
      .pipe(dest);
  }
);
Pattern 2:
In this pattern, the file is downloaded as the stream type and it is put to the buffer.
Sample script:
drive.files.get(
  { fileId: id, alt: "media" },
  { responseType: "stream" },
  (err, { data }) => {
    if (err) {
      console.log(err);
      return;
    }
    let buf = [];
    data.on("data", (e) => buf.push(e));
    data.on("end", () => {
      const buffer = Buffer.concat(buf);
      console.log(buffer);
    });
  }
);
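As a small usage note (not part of the original answer): once the chunks are concatenated, a plain-text file's contents can be decoded directly inside the "end" handler, where buffer is in scope:

const text = buffer.toString("utf8");
console.log(text);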
Reference:
Google APIs Node.js Client
I'm currently seeking some help with my Cloud Function that is triggered by a Cloud Storage upload. It checks whether the file is a video; if so, we process the video through ffmpeg to extract a single frame to be used as a poster image later.
It all seems to work, except that the upload of the image back to Cloud Storage doesn't. At this point my Cloud Function doesn't produce any errors at all, so I have no clue why the image upload isn't working. I would greatly appreciate it if anyone with experience could review my Cloud Function below and provide some insight into why it's not working. Thank you!
Note: A screenshot of the Cloud Function log is provided below the code snippet.
const admin = require('firebase-admin'); // Firebase Admin SDK
const functions = require('firebase-functions'); // Firebase Cloud Functions
const gcs = require('@google-cloud/storage')(); // Cloud Storage Node.js Client
const path = require('path'); // Node.js file and directory utility
const os = require('os'); // Node.js operating system-related utility
const fs = require('fs'); // Node.js file system API
const ffmpeg = require('fluent-ffmpeg');
const ffmpegPath = require('@ffmpeg-installer/ffmpeg').path;
const ffprobePath = require('@ffprobe-installer/ffprobe').path;

// Initialize Firebase Admin
admin.initializeApp(functions.config().firebase);

// Listen for changes in the Cloud Storage bucket
exports.storageFunction = functions.storage.object()
  .onChange((event) => {
    const file = event.data; // The Storage object.
    const fileBucket = file.bucket; // The Storage bucket that contains the file.
    const filePath = file.name; // File path in the bucket.
    const fileName = path.basename(filePath); // Get the file name.
    const fileType = file.contentType; // File content type.
    if (!fileType.startsWith('video/')) {
      return;
    }
    const bucket = gcs.bucket(fileBucket);
    const tempFilePath = path.join(os.tmpdir(), fileName);
    const tempFolderPath = os.tmpdir();
    // Download video to temp directory
    return bucket.file(filePath).download({
      destination: tempFilePath
    }).then(() => {
      console.log('Video downloaded locally to', tempFilePath);
      // Generate screenshot from video
      ffmpeg(tempFilePath)
        .setFfmpegPath(ffmpegPath)
        .setFfprobePath(ffprobePath)
        .on('filenames', (filenames) => {
          console.log(`Will generate ${filenames}`);
        })
        .on('error', (err) => {
          console.log(`An error occurred: ${err.message}`);
        })
        .on('end', () => {
          console.log(`Output image created at ${tempFilePath}`);
          const targetTempFileName = `${fileName}.png`;
          const targetFilePath = path.join(path.dirname(filePath), targetTempFileName);
          console.log(targetTempFileName);
          console.log(targetFilePath);
          // Uploading the image.
          return bucket.upload(tempFilePath, { destination: targetFilePath })
            .then(() => {
              console.log('Output image uploaded to', filePath);
            })
            .catch((err) => {
              console.log(err.message);
            });
        })
        .screenshots({
          count: 1,
          folder: tempFolderPath
        });
    });
  });
Cloud Function Log
It looks like you're trying to return a promise from the ffmpeg callback API:
.on('end', () => {
  return bucket.upload(tempFilePath, { destination: targetFilePath })
    .then(...)
})
I don't know the ffmpeg API, but I'm almost certain that will not cause the function to wait for the upload to complete. Instead, you need to return a promise directly from your function that resolves only after all the async work is complete.
If the last item of work is inside a callback, and you need to wait for that, you can wrap the entire thing into a new promise and manually resolve it at the right time. In pseudocode:
return new Promise((resolve, reject) => {
  // ffmpeg stuff here...
  .on('end', () => {
    // the last bit of work here...
    bucket.upload(...)
      .then(() => { resolve() })
  })
})
Notice how the resolve method provided by the new promise is being called to indicate when that promise should itself resolve.
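Applied to the function above, the sketch might look like this (an illustration, not tested; it also uploads the generated screenshot captured from the 'filenames' event rather than tempFilePath, which in the original code points at the downloaded video):

return bucket.file(filePath).download({ destination: tempFilePath })
  .then(() => new Promise((resolve, reject) => {
    let screenshotName; // filled in by the 'filenames' event below
    ffmpeg(tempFilePath)
      .setFfmpegPath(ffmpegPath)
      .setFfprobePath(ffprobePath)
      .on('filenames', (filenames) => {
        screenshotName = filenames[0];
      })
      .on('error', reject)
      .on('end', () => {
        const localScreenshotPath = path.join(tempFolderPath, screenshotName);
        const targetFilePath = path.join(path.dirname(filePath), `${fileName}.png`);
        // resolve the outer promise only once the upload has finished
        bucket.upload(localScreenshotPath, { destination: targetFilePath })
          .then(() => resolve())
          .catch(reject);
      })
      .screenshots({ count: 1, folder: tempFolderPath });
  }));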
I am creating an Electron application in which I record data from the webcam and desktop; at the end of the recording session, I want to save the data to a file in the background. I do not know how to write the data from a Blob to a file directly. Any suggestions?
Below is my current handler for the MediaRecorder stop event.
this.mediaRecorder.onstop = (e) => {
  var blob = new Blob(this.chunks,
    { 'type': 'video/mp4; codecs=H.264' });
  var fs = require('fs');
  var fr = new FileReader();
  var data = null;
  fr.onload = () => {
    data = fr.result;
    fs.writeFile("test.mp4", data, err => {
      if (err) {
        return console.log(err);
      }
      console.log("The file was saved!");
    });
  };
  fr.readAsArrayBuffer(blob);
}
You can do it using FileReader and Buffer.
In the renderer process, send the event to the main process to save the file with the buffer:
function saveBlob(blob) {
  let reader = new FileReader()
  reader.onload = function () {
    if (reader.readyState == 2) { // DONE: the ArrayBuffer is ready
      var buffer = Buffer.from(reader.result) // Buffer.from replaces the deprecated new Buffer()
      ipcRenderer.send(SAVE_FILE, fileName, buffer) // fileName comes from the surrounding scope
      console.log(`Saving ${JSON.stringify({ fileName, size: blob.size })}`)
    }
  }
  reader.readAsArrayBuffer(blob)
}
Get back the confirmation:
ipcRenderer.on(SAVED_FILE, (event, path) => {
  console.log("Saved file " + path)
})
(SAVE_FILE and SAVED_FILE are constant strings containing the event names.)
and in the main process:
ipcMain.on(SAVE_FILE, (event, path, buffer) => {
  outputFile(path, buffer, err => {
    if (err) {
      event.sender.send(ERROR, err.message)
    } else {
      event.sender.send(SAVED_FILE, path)
    }
  })
})
outputFile is from 'fs-extra'.
Handling Node file operations in the main process is preferred; see Electron's security suggestions.
If you do not want to use the main process, you can use 'electron-remote' to create background processes to write the file. Additionally, you can invoke ffmpeg in the background process to compress/encode the file into a different format.