Does anyone have any examples of how I can handle the files that are sent to FeathersJS?
I have the following client-side code, completely separate from FeathersJS, but I'm having trouble actually accessing those files in my service.
var req = request.post('http://uploadhost/upload').set('Authorization', 'Bearer '+this.props.authtoken);
this.state.files.forEach(file => {
req.attach(file.name, file);
});
req.end(this.callback);
FeathersJS just extends Express, so you need to add a multipart parser, like multer, if you are decoding form data (which it looks like you are).
const multer = require('multer');
const multipartMiddleware = multer();
// Upload Service with multipart support
app.use('/uploads',
// multer parses the file field named 'uri'.
// Without extra options the file data is
// temporarily kept in memory
multipartMiddleware.single('uri'),
// another middleware, this time to
// transfer the received file to feathers
function(req,res,next){
req.feathers.file = req.file;
next();
},
blobService({Model: blobStorage})
);
Ultimately, Feathers uses its blob service to create the files.
const blobService = require('feathers-blob');
// `fs` here is the fs-blob-store package, not Node's built-in fs module
const fs = require('fs-blob-store');
const blobStorage = fs(__dirname + '/uploads');
I hope this helps clarify my comment.
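The small middleware in the chain above is the key piece: multer puts the parsed upload on req.file, and copying it onto req.feathers makes it available to the Feathers service as params. A minimal sketch of that bridge, exercised against a mock request (the mock field names are illustrative):

```javascript
// Sketch: the bridge middleware between multer and a Feathers service.
// multer puts the parsed upload on req.file; Feathers exposes req.feathers
// to services/hooks as `params`, so we copy the file across.
function transferUploadedFile(req, res, next) {
  req.feathers = req.feathers || {};
  req.feathers.file = req.file;
  next();
}

// Mock request showing what the service will see (names are illustrative).
const mockReq = { file: { originalname: 'avatar.png', buffer: Buffer.from('...') } };
transferUploadedFile(mockReq, {}, function next() {});
// mockReq.feathers.file is now the multer file object
```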
When trying to upload a stream into a Google Cloud Storage bucket, I am getting Error: Not Found when using the get method, and Error: socket hang up after a few seconds' delay when using the request method.
Everything with Firebase seems to be initialized fine, and when I log the stream I see the data coming through, but what would be the best way to write a file to GCS using a remote URL?
const storage = firebase.storage()
const bucket = storage.bucket("bucket/path")
const file = bucket.file("filename.pdf")
const url = "https://url/to/file/filename.pdf"
https.get(url, async (res) => {
console.log(res)
res.pipe(file.createWriteStream())
})
The cause of the issue was passing the folder path as part of the bucket name instead of as part of the file name.
The bucket name is shown in the Storage console; do not include a folder path in it.
Bucket name example:
gs://bucket.appspot.com
(remove the gs:// when passing it as a value)
const bucket = storage.bucket("bucketname")
const file = bucket.file("bucket/path/filename.pdf")
I am using jdenticon to generate user avatars on signup in a node/express app.
Running locally, I can do this by:
Generate identicon using jdenticon
Save file locally
Upload local file to cloudinary
Here's how I do this
const cloudinary = require("cloudinary");
cloudinary.config({
cloud_name: 'my-account-name',
api_key: process.env.CLOUDINARY_API,
api_secret: process.env.CLOUDINARY_SECRET
});
// 1. Generate identicon
const jdenticon = require("jdenticon");
const fs = require("fs");
const size = 600;
const value = String(newUser.username);
const svg = jdenticon.toPng(value, size); // despite the name, this is a PNG buffer
let file = "uploads/" + value + ".png";
// 2. Save file locally
fs.writeFileSync(file, svg);
// 3. Upload local file to cloudinary
let avatar = await cloudinary.v2.uploader.upload(file);
// Do stuff with avatar object
This works great for running my app locally. However, as I understand it, I can't store images on Heroku (if this is not the case then that would be great to know, and would simplify things massively), so I will need to save the generated identicon directly to cloudinary.
How can I upload the generated image (svg = jdenticon.toPng(value, size);) directly to cloudinary, without first saving?
Any help would be appreciated!
jdenticon.toPng returns a buffer, I believe, and Cloudinary's upload_stream method accepts a buffer, so you should be able to just do:
const data = jdenticon.toPng(value, size);
const options = {}; // optional
cloudinary.v2.uploader.upload_stream(options, (error, result) => {
if (error) {
throw error;
}
console.log('saved .....');
console.log(result);
}).end(data);
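Since upload_stream is callback-based while the original code uses await, one option is to wrap it in a Promise. A sketch (the uploader parameter stands in for cloudinary.v2.uploader; the function name is mine):

```javascript
// Sketch: wrap a callback-style upload_stream(options, cb) API in a Promise
// so it composes with async/await.
function uploadBuffer(uploader, buffer, options = {}) {
  return new Promise((resolve, reject) => {
    uploader.upload_stream(options, (error, result) => {
      if (error) return reject(error);
      resolve(result);
    }).end(buffer); // write the buffer and close the stream
  });
}
```

Then the original flow becomes: const avatar = await uploadBuffer(cloudinary.v2.uploader, jdenticon.toPng(value, size));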
My app is built with MEAN, and I am a Docker user too. The purpose of my app is to create and download a CSV file. I have already created my file, compressed it, and placed it in a temp folder (the file will be removed after the download). This part is on the Node.js server side and works without problems.
I have already tried several things, like res.download, which is supposed to download the file directly in the browser, but nothing happens. I tried to use a Blob in the AngularJS part, but it doesn't work.
The getData function creates and compresses the file (it exists; I can reach it directly when I look where the app is saved).
exports.getData = function getData(req, res, next){
var listRequest = req.body.params.listURL;
var stringTags = req.body.params.tagString;
//The name of the compressed CSV file
var nameFile = req.body.params.fileName;
var query = url.parse(req.url, true).query;
//The function which create the file
ApollineData.getData(listRequest, stringTags, nameFile)
.then(function (response){
var filePath = '/opt/mean.js/modules/apolline/client/CSVDownload/'+response;
const file = fs.createReadStream(filePath);
res.download(filePath, response);
})
.catch(function (response){
console.log(response);
});
};
My main problem is to download this file directly in the browser without using any variable because it could be huge (like several GB). I want to download it and then delete it.
There is nothing wrong with res.download.
The reason res.download doesn't work for you is probably that you are using AJAX to fetch the resource. Do a regular navigation instead, or, if it requires some POST data or another method, create a form and submit it.
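A sketch of the regular-navigation approach: build the query string from what used to be the POST params and assign it to window.location, so the browser itself receives the Content-Disposition header that res.download sets and streams the file to disk. Names and the server's query handling are assumptions, not from the original post:

```javascript
// Sketch: trigger a browser download via navigation instead of AJAX.
function buildDownloadUrl(url, params) {
  // Move the former POST params into the query string
  // (the server would then read them from req.query instead of req.body).
  const query = new URLSearchParams(params).toString();
  return query ? url + "?" + query : url;
}

function triggerDownload(url, params) {
  // A plain navigation (not XHR/fetch) lets the browser honour
  // Content-Disposition and save the response without buffering it in JS.
  window.location.href = buildDownloadUrl(url, params);
}
```

If the server must keep reading from req.body, the form-submit variant mentioned above works the same way: a hidden form with method="POST", submitted programmatically.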
Using jsPDF, I have generated a PDF file on the client side (AngularJS). I am able to successfully download the file. Now I have another function to send the PDF via email to the user's email address.
Problem:
When the user clicks the send email button, the PDF which I created using jsPDF should be uploaded to the server, where I should be able to attach the PDF file received from the client to an email. Is it possible to do this? I am not sure whether we can do this or not.
Can we send the doc object in var doc = new jsPDF('p', 'pt'); to Node.js, then render it and finally attach it to the email?
If the above task is not possible then let me know about other possibilities.
P.S: I am using nodemailer for sending emails.
I have put together sample code for both the client and server side, tested and working. Please modify it according to your needs.
Server side: Hosted on Cloud9 for testing, hence it takes the public IP and port provided by the host via the process.env object. Change the listener according to your hosting environment.
Note: Please read the inline comments for better understanding.
var express = require('express');
var app = express();
var fs = require('fs');
var bodyParser = require('body-parser');
var formidable = require('formidable'),
form = new formidable.IncomingForm();
app.post("/", function(req, res, next) {
form.parse(req, function(err, fields, files) {
console.log("File received:\nName:"+files.pdf.name+"\ntype:"+files.pdf.type);
});
form.on('end', function() {
/* this.openedFiles[0].path -- object Contains the path to the file uploaded
------- Use NodeMailer to process this file or attach to the mail-----------------*/
console.log("PDF raw data:"+ fs.readFileSync(this.openedFiles[0].path, "utf8"));
res.status(200).send("thank you");
});
})
.listen(process.env.PORT, process.env.IP, function() {
console.log('Server app listening');
});
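As the inline comment in the form.on('end') handler suggests, the temporary file formidable leaves behind can be handed straight to nodemailer, whose attachments option accepts a file path. A hedged sketch of the message object only (addresses are placeholders and the transporter setup is omitted):

```javascript
// Sketch: building the nodemailer message for the uploaded PDF.
// `tmpPath` is formidable's temporary file path (this.openedFiles[0].path above);
// the from/to addresses are placeholders.
function buildMail(tmpPath, originalName) {
  return {
    from: 'app@example.com',
    to: 'user@example.com',
    subject: 'Your PDF',
    text: 'Please find the generated PDF attached.',
    attachments: [
      // nodemailer streams the file from disk itself when given a path
      { filename: originalName, path: tmpPath }
    ]
  };
}

// Then, inside the 'end' handler:
// transporter.sendMail(buildMail(this.openedFiles[0].path, files.pdf.name), callback);
```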
Client Side: Fiddler
Note: I didn't paste imgData as SO has a character limit; refer to the fiddle link for the complete client code. Change the request URL to your server. The client side uses the Blob API, which is an HTML5 standard, so test it on HTML5-compliant browsers.
var imgData = [[**Copy the same image provided by JSpdf example. Check the fiddler for complete code**]]
var doc = new jsPDF();
doc.setFontSize(40);
doc.text(35, 25, "Octonyan loves jsPDF");
doc.addImage(imgData, 'JPEG', 15, 40, 180, 180);
var data = new Blob([doc.output()], {
type: 'application/pdf'
});
var formData = new FormData();
formData.append("pdf", data, "myfile.pdf");
var request = new XMLHttpRequest();
request.open("POST", "https://saltjs-nirus.c9.io"); // Change to your server
request.send(formData);
References:
https://developer.mozilla.org/en-US/docs/Web/API/FormData/Using_FormData_Objects
https://github.com/felixge/node-formidable
http://mrrio.github.io/jsPDF/ (Example reference)
https://developer.mozilla.org/en/docs/Web/API/Blob
Basically, I wrote a server that responds with a JS file (object format) to users who request it. The JS file is generated from two config files, which I call config1.js and config2.js.
Here is my code:
var express = require('express');
var app = express();
var _ = require('underscore');
app.use('/config.js', function (req, res) {
var config1 = require('config1');
var config2 = require('config2');
var config = _.extend(config1, config2);
res.set({'Content-Type': 'application/javascript'});
res.send(JSON.stringify(config));
});
From what I understand, every time I make a request to /config.js, it should fetch the latest code in the config1 and config2 files, even after I start the server. However, if I start the server, make some modifications in config1.js, and then make the request, it still returns the old code. Can anyone explain this, and how to fix it? Thanks.
You should not use require to load your files, because that is not its purpose: it caches the loaded file (see this post for more information), which is why you get the same content every time you make a request.
Use a tool like concat-files instead, or concat it "by hand" if you prefer.
Concatenating files and extending objects aren't equivalent operations. You can read the files via the 'fs' module, parse the objects, extend, and send.