Using jsPDF, I have generated a PDF file on the client side (AngularJS). I can successfully download the file. Now I have another function to send the PDF via email to the user's email address.
Problem:
When the user clicks the send-email button, the PDF I created with jsPDF should be uploaded to the server, where I should be able to attach it to an email. Is this possible? I am not sure whether it can be done.
Can we send the doc object from var doc = new jsPDF('p', 'pt'); to Node.js, render it there, and finally attach it to the email?
If the above is not possible, please let me know about other options.
P.S: I am using nodemailer for sending emails.
Here is sample code for both the client and server side, tested and working. Please modify it according to your needs.
Server side: hosted on Cloud9 for testing, so it takes the public IP and port provided by the host via the process object. Change the listener according to your hosting environment.
Note: please read the inline comments for better understanding.
var express = require('express');
var app = express();
var fs = require('fs');
var formidable = require('formidable');

app.post("/", function(req, res, next) {
  // Create a new form per request so 'end' listeners don't accumulate
  var form = new formidable.IncomingForm();
  form.parse(req, function(err, fields, files) {
    console.log("File received:\nName: " + files.pdf.name + "\ntype: " + files.pdf.type);
  });
  form.on('end', function() {
    /* this.openedFiles[0].path contains the path to the uploaded file.
       ------- Use Nodemailer here to process this file or attach it to the mail ------- */
    console.log("PDF raw data: " + fs.readFileSync(this.openedFiles[0].path, "utf8"));
    res.status(200).send("thank you");
  });
})
.listen(process.env.PORT, process.env.IP, function() {
  console.log('Server app listening');
});
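Where the inline comment says to use Nodemailer, a minimal sketch of the mail options might look like the following. The addresses, subject, and temp path below are placeholders (not from the original); in the real handler the path comes from this.openedFiles[0].path, and transporter would be created separately with nodemailer.createTransport and your own SMTP settings.

```javascript
// Hypothetical mail options for nodemailer's sendMail; addresses and the
// temp path are placeholders for illustration only.
const mailOptions = {
  from: 'sender@example.com',        // assumption: your sending address
  to: 'user@example.com',            // assumption: the user's address
  subject: 'Your generated PDF',
  text: 'Please find the PDF attached.',
  attachments: [{
    filename: 'myfile.pdf',
    path: '/tmp/upload_abc123.pdf'   // in practice: this.openedFiles[0].path
  }]
};
// transporter.sendMail(mailOptions, function (err, info) { ... });
console.log(mailOptions.attachments[0].filename);
```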
Client side: JSFiddle
Note: I didn't paste imgData since SO has a character limit; refer to the fiddle link for the complete client code. Change the request URL to your server. The client side uses the Blob API, an HTML5 standard, so test it on HTML5-compliant browsers.
var imgData = [[**Copy the same image provided by JSpdf example. Check the fiddler for complete code**]]
var doc = new jsPDF();
doc.setFontSize(40);
doc.text(35, 25, "Octonyan loves jsPDF");
doc.addImage(imgData, 'JPEG', 15, 40, 180, 180);
var data = new Blob([doc.output()], {
  type: 'application/pdf'
});
var formData = new FormData();
formData.append("pdf", data, "myfile.pdf");
var request = new XMLHttpRequest();
request.open("POST", "https://saltjs-nirus.c9.io"); // Change to your server
request.send(formData);
References:
https://developer.mozilla.org/en-US/docs/Web/API/FormData/Using_FormData_Objects
https://github.com/felixge/node-formidable
http://mrrio.github.io/jsPDF/ (Example reference)
https://developer.mozilla.org/en/docs/Web/API/Blob
Related
Does anyone have any examples of how I can handle files that are sent to FeathersJS?
I have the following client side, completely separate from FeathersJS, but I'm having trouble actually accessing the files in my service.
var req = request.post('http://uploadhost/upload').set('Authorization', 'Bearer '+this.props.authtoken);
this.state.files.forEach(file => {
req.attach(file.name, file);
});
req.end(this.callback);
FeathersJS just extends Express. You need to add a multipart parser, like multer, if you are decoding form data (which it looks like you are).
const multer = require('multer');
const multipartMiddleware = multer();

// Upload service with multipart support
app.use('/uploads',
  // multer parses the file named 'uri'.
  // Without extra params the data is
  // temporarily kept in memory.
  multipartMiddleware.single('uri'),
  // Another middleware, this time to
  // transfer the received file to Feathers.
  function(req, res, next){
    req.feathers.file = req.file;
    next();
  },
  blobService({Model: blobStorage})
);
Ultimately, Feathers uses its blob service to create the files.
const blobService = require('feathers-blob');
const fs = require('fs-blob-store'); // storage backend used by feathers-blob
const blobStorage = fs(__dirname + '/uploads');
I hope this helps clarify my comment.
I'm generating a PDF using the PDFKit library and I would like to insert the generated PDF into a MongoDB collection. On the client side a user should be able to see the list of files in the collection and choose one of these files to download.
As of now I'm saving the PDF in the collection as a Uint8Array and I can then click on the file at the front end to download it.
However, my problem is that the file seems to be corrupt. It will not open in Adobe Reader or in Chrome.
I've tried saving it to the collection with and without PDFKit's compression.
Is this possible to do, or is my approach flawed?
Any help with this would be great and thanks in advance!
Server Side
Most of the code here is based on this post on the PDFKit GitHub.
var doc = new PDFDocument();
var stream = require('stream');
var converter = new stream.PassThrough();
doc.pipe(converter);
var data = [];
converter.on('data', function(chunk) {
  data.push(chunk);
});
converter.on('end', Meteor.bindEnvironment(function () {
  var buffer = Buffer.concat(data);
  PdfFiles.insert({ data: buffer, userID: userId });
}));
// Adding content to the PDF
doc.end();
Client Side
'click .downloadPDF': function (event) {
  event.preventDefault();
  var file = UserFiles.findOne({ userID: Meteor.userId() });
  var FileSaver = require('file-saver');
  var blob = new Blob(file.data, { type: "application/pdf" });
  FileSaver.saveAs(blob, "AwesomePDF");
}
I got the code to work as intended simply by changing:
var blob = new Blob(file.data, {type: "application/pdf"});
to
var blob = new Blob([file.data], {type: "application/pdf"});
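For context on why that one-character change matters: the Blob constructor expects an array of parts (strings, ArrayBuffers, typed arrays), so wrapping the buffer in an array preserves its raw bytes instead of coercing them. A minimal sketch (the byte values below are just an illustration):

```javascript
// Blob takes an *array* of parts; wrapping the typed array keeps the
// raw bytes intact rather than stringifying its elements.
const data = new Uint8Array([37, 80, 68, 70]); // the bytes of "%PDF"
const blob = new Blob([data], { type: 'application/pdf' });
console.log(blob.size); // 4
```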
I'm trying to stream audio from a server containing an audio file to a client using BinaryJS. My code was inspired by the code in this question: Playing PCM stream from Web Audio API on Node.js
Here's what my server code looks like:
// create a BinaryServer using BinaryJS
var BinaryServer = require('binaryjs').BinaryServer;
// gotta be able to access the filesystem
var fs = require('fs');
// create our server listening on a specific port
var server = BinaryServer({port: 8080});
// do this when a client makes a request
server.on('connection', function(client){
  // get the audio file
  var file = fs.createReadStream(__dirname + '/JDillaLife.mp3');
  // convert to int16
  var len = file.length;
  var buf = new Int16Array(len);
  while(len--){
    buf[len] = data[len]*0xFFFF;
  }
  var Stream = client.send();
  Stream.write(buf.buffer);
  console.log("Server contacted and file sent");
});
console.log('Server running on port 8080');
And my client code:
var Speaker = require('speaker');
var speaker = new Speaker({
  channels: 2,
  bitDepth: 32,
  sampleRate: 44100,
  signed: true
});
var BinaryClient = require('binaryjs').BinaryClient;
var client = BinaryClient('http://localhost:8080');
client.on('stream', function(stream, meta){
  stream.on('data', function(data){
    speaker.write(data);
  });
});
This is a very rough draft and I'm almost certain it won't play nicely right away, but right now it throws an error that seems to involve the line var buf = new Int16Array(len);, and I'm not sure why. It says there's a "type error", but I don't see why that would happen when assigning a new object to an empty variable. I'm new to JS (and to untyped languages in general), so is this an issue with what I'm assigning?
I think the issue here is that you're accessing file.length, while file is a Stream object, which doesn't have a length property. So what you're doing is effectively:
new Int16Array(undefined);
hence the type error.
fs.createReadStream is documented here: https://nodejs.org/api/fs.html#fs_fs_createreadstream_path_options
You can use the Stream object to read data in chunks using stream.read(256), as documented here: https://nodejs.org/api/stream.html#stream_readable_read_size
Hopefully that gives you enough pointers to proceed!
I'm developing a web app that can upload large files into Azure Blob Storage.
As a backend, I am using Windows Azure Mobile Services (the web app will generate content for mobile devices) in Node.js.
My client can successfully send chunks of data to the backend, and everything looks fine, but at the end the uploaded file is empty. The data upload was prepared by following this tutorial: it works perfectly when the file is small enough to be uploaded in a single request, but the process fails when the file needs to be broken into chunks. It uses the ReadableStreamBuffer from the tutorial.
Can somebody help me?
Here the code:
Back-end : createBlobBlockFromStream
[...]
//Get references
var azure = require('azure');
var qs = require('querystring');
var appSettings = require('mobileservice-config').appSettings;
var accountName = appSettings.STORAGE_NAME;
var accountKey = appSettings.STORAGE_KEY;
var host = accountName + '.blob.core.windows.net';
var container = "zips";
//console.log(request.body);
var blobName = request.body.file;
var blobExt = request.body.ext;
var blockId = request.body.blockId;
var data = new Buffer(request.body.data, "base64");
var stream = new ReadableStreamBuffer(data);
var streamLen = stream.size();
var blobFull = blobName+"."+blobExt;
console.log("BlobFull: "+blobFull+"; id: "+blockId+"; len: "+streamLen+"; "+stream);
var blobService = azure.createBlobService(accountName, accountKey, host);
//console.log("blockId: "+blockId+"; container: "+container+";\nblobFull: "+blobFull+"streamLen: "+streamLen);
blobService.createBlobBlockFromStream(blockId, container, blobFull, stream, streamLen,
  function(error, response){
    if(error){
      request.respond(statusCodes.INTERNAL_SERVER_ERROR, error);
    } else {
      request.respond(statusCodes.OK, {message : "block created"});
    }
  });
[...]
Back-end: commitBlobBlock
[...]
var azure = require('azure');
var qs = require('querystring');
var appSettings = require('mobileservice-config').appSettings;
var accountName = appSettings.STORAGE_NAME;
var accountKey = appSettings.STORAGE_KEY;
var host = accountName + '.blob.core.windows.net';
var container = "zips";
var blobName = request.body.file;
var blobExt = request.body.ext;
var blobFull = blobName+"."+blobExt;
var blockIdList = request.body.blockList;
console.log("blobFull: "+blobFull+"; blockIdList: "+JSON.stringify(blockIdList));
var blobService = azure.createBlobService(accountName, accountKey, host);
blobService.commitBlobBlocks(container, blobFull, blockIdList, function(error, result){
  if(error){
    request.respond(statusCodes.INTERNAL_SERVER_ERROR, error);
  } else {
    request.respond(statusCodes.OK, result);
    blobService.listBlobBlocks(container, blobFull);
  }
});
[...]
The second method returns the correct list of block IDs, so I think the second part of the process works fine. I suspect it is the first method that fails to write the data inside the block, as if it creates empty blocks.
On the client side, I read the file as an ArrayBuffer using the FileReader API.
Then I convert it into a Base64-encoded string using the following code. This approach works perfectly if I create the blob in a single call, which is fine for small files.
[...]
//data contains the ArrayBuffer read by the FileReader API
var requestData = new Uint8Array(data);
var binary = "";
for (var i = 0; i < requestData.length; i++) {
  binary += String.fromCharCode(requestData[i]);
}
[...]
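For completeness, the Base64 step that follows the loop above would look something like this (the sample bytes are made up; btoa is the browser path, and the Buffer fallback is only so the sketch also runs under Node):

```javascript
// Build the binary string as in the snippet above, then base64-encode it
// before sending it as request.body.data.
const bytes = new Uint8Array([72, 105, 33]); // the bytes of "Hi!"
let binary = '';
for (let i = 0; i < bytes.length; i++) {
  binary += String.fromCharCode(bytes[i]);
}
const base64 = (typeof btoa === 'function')
  ? btoa(binary)
  : Buffer.from(binary, 'binary').toString('base64');
console.log(base64); // "SGkh"
```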
Any idea?
Thank you,
Ric
Which version of the Azure Storage Node.js SDK are you using? It looks like you might be using an older version; if so I would recommend upgrading to the latest (0.3.0 as of this writing). We’ve improved many areas with the new library, including blob upload; you might be hitting a bug that has already been fixed. Note that there may be breaking changes between versions.
Download the latest Node.js Module (code is also on Github)
https://www.npmjs.org/package/azure-storage
Read our blog post: Microsoft Azure Storage Client Module for Node.js v. 0.2.0 http://blogs.msdn.com/b/windowsazurestorage/archive/2014/06/26/microsoft-azure-storage-client-module-for-node-js-v-0-2-0.aspx
If that’s not the issue, can you check a Fiddler trace (or equivalent) to see if the raw data blocks are being sent to the service?
Not sure if you're still suffering from this problem, but I was experiencing the exact same thing and came across this while looking for a solution. Well, I found one, so I thought I'd share.
My problem was not with how I pushed the blocks but with how I committed them. My little proxy server had no knowledge of prior commits; it just pushed the data it was sent and committed it. The trouble was that I wasn't including the previously committed blocks in the commit message, so each commit overwrote them.
So my solution:
var opts = {
  UncommittedBlocks: [IdOfJustCommittedBlock],
  CommittedBlocks: [IdsOfPreviouslyCommittedBlocks]
};
blobService.commitBlobBlocks('containerName', 'blobName', opts, function(e, r){});
For me, the bit that broke everything was the format of the opts object: I wasn't providing an array of previously committed block names. It's also worth noting that I had to base64-decode the existing block names, because
blobService.listBlobBlocks('containerName', 'fileName', 'type IE committed', fn)
returns each block with its name base64-encoded.
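A minimal sketch of that decode step (the sample block name below is made up for illustration):

```javascript
// listBlobBlocks returns block names base64-encoded, so decode each one
// before reusing it in the CommittedBlocks list of the next commit.
const encodedNames = ['YmxvY2stMDAwMDAx']; // hypothetical name from the listing
const blockNames = encodedNames.map(function (name) {
  return Buffer.from(name, 'base64').toString('utf8');
});
console.log(blockNames); // [ 'block-000001' ]
```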
Just for completeness, here's how I push my blocks; req comes from the Express route:
var blobId = blobService.getBlockId('blobName', 'lengthOfPreviouslyCommittedArray + 1 as Int');
var length = req.headers['content-length'];
blobService.createBlobBlockFromStream(blobId, 'containerName', 'blobName', req, length, fn);
Also, during the upload I hit a strange issue where the content-length header caused it to break, so I had to delete it from the req.headers object.
Hope this helps and is detailed enough.
I've searched for a while for a solution to this problem but haven't found much.
My goal is to receive a message from a UDP client, which the server receives and forwards to a web client, which plays an audio clip each time a message is received. However, for some reason the audio will not play. If I open the page straight from my directory, the audio plays, but if I try to access it through localhost it fails to load. Does anyone know of a solution?
Here is the client side javascript.
var mySound = new Audio('/public/audio/Bloom.mp3');
mySound.load();
var socket = io.connect('http://localhost');
socket.on('message', function(data){
  console.log(data);
  $('#content').text(data);
  mySound.play();
  //document.getElementById('audiotag1').play();
});
This page is served by server.js, a node.js file using socket.io and express. I don't receive any errors from my console.log.
Here is the server.js
var app = require('express')()
  , server = require('http').Server(app)
  , io = require('socket.io')(server)
  , dgram = require('dgram');
var httpPort = 1234;
var udpPort = 5000;
server.listen(httpPort);
app.use(server.express.static( __dirname + '/public'));
app.get('/', function(request, response){
  var ipAddress = request.socket.remoteAddress;
  console.log("New express connection from: " + ipAddress);
  response.sendfile(__dirname + '/public/index.html');
});
var udpSocket = dgram.createSocket('udp4', function(msgBuffer){
  var jsonMessage = JSON.parse(msgBuffer);
  io.sockets.emit('message', JSON.stringify(jsonMessage));
});
udpSocket.bind(udpPort, '127.0.0.1');
You can see the error Chrome reports at this link:
http://postimg.org/image/xkv7a2kwb/
Does anyone have any ideas on how to fix this?
This error usually occurs when you have an incorrect Express config for static files.
From your client-side JavaScript I can see that you store your static files in the public folder.
So somewhere on the server side you should have something like this:
app.use('/public', express.static('/path/to/public'));
This way, a request for http://localhost/public/audio/Bloom.mp3 will be served with the file /path/to/public/audio/Bloom.mp3.
Caveat:
Since Express 4 there is "no more app.use(app.router)" and "all routing methods will be added in the order in which they appear". This means you have to put the above line of code before your first app.get(...), app.post(...), etc.
Hope this information will help someone.
We can upload the tone into cloud storage and then use that source, like below:
playAudio() {
  let audio = new Audio();
  audio.src = 'https://res.cloudinary.com/Apple_Notification.mp3';
  audio.load();
  audio.play();
}
Trigger this function on the client side whenever a new message comes in.