How to send additional data along with form data in Angular - javascript

I am sending attached files/images to the back end with FormData, along with the content of the mail, and then injecting them into Nodemailer. I am using multer as middleware, if that helps.
component.ts
let fileList: FileList = event.target.files;
if (fileList.length > 0) {
  this.formData = new FormData();
  for (let i = 0; i < fileList.length; i++) {
    let file: File = fileList[i];
    this.formData.append('uploadFile', file, file.name);
  }
}
service.ts (this works, and I can fetch the files from req.files in Node)
sendData(formData) {
  let headers = new HttpHeaders();
  return this.http.post('http://localhost:3000/mail', formData);
}
service.ts (doesn't work, but this is what I need to do)
sendEmail(formData, email) {
  let data = { data: email, formData: formData };
  return this.http.post('http://localhost:3000/mail', data);
}
In this case both req.files and req.body.formData come back undefined.
I tried appending the FormData to the headers, but I don't know the proper way to append/fetch it there, or whether that is even possible.
Node.js Part
var storage = multer.diskStorage({
  destination: (req, file, callback) => {
    req.body.path = [];
    callback(null, './storage');
  },
  filename: (req, file, callback) => {
    let filename = Date.now() + '-' + file.originalname;
    req.body.path.push(filename);
    callback(null, filename);
  }
});
var upload = multer({ storage: storage });
app.post('/mail', upload.any(), nodemailer);
nodemailer.js
module.exports = (req, res) => {
  console.log(req.files); // undefined when the data is sent as JSON
  // Code irrelevant for this question
}

For this you have to do it like this:
sendEmail(formData, email) {
  formData.append('email', email);
  return this.http.post('http://localhost:3000/mail', formData);
}
The reason: you have to send the form as multipart/form-data, but you are trying to send the data as JSON, so the server will not receive anything until you pass the whole thing as above.
sendEmail(formData, email) {
  let data = { data: email, formData: formData };
  return this.http.post('http://localhost:3000/mail', data);
}
As you can see, in this function you are sending JSON.

Related

Send a byte array to WCF service

I am trying to send a PDF file from JavaScript to a REST WCF service.
The service expects an array of bytes with the following signature.
The trick is in the byte array parameter; all the others are working fine.
[OperationContract]
[WebInvoke(UriTemplate = "rest/{sessionToken}/ImportNewTemplate?commit={commit}&createApplication={createApplication}&templateName={templateName}&option={option}")]
[CloudMethod(Group = "02. Templates", Description = "Import a new template in the platform.", HelpFile = "ListPaperTemplate.aspx")]
[CloudParameter(Name = "sessionToken", Description = "session token", HelpFile = "ServiceAPIDoc.aspx?q=sessionToken")]
[CloudParameter(Name = "createApplication", Description = "Create a standalone application linked to this template.")]
[CloudParameter(Name = "commit", Description = "Commit the upload ? if true, the template will be imported, else the return just allow you to preview template description.")]
[CloudParameter(Name = "templateName", Description = "Name of the new template. Only valid for single pdf upload. If the files are zipped, the file name in the zip will be used instead")]
[CloudParameter(Name = "templateFile", Description = "Can be a PDF file, or a zip file containing a flat pdf + xml definition", HelpFile = "ServiceAPIDoc.aspx?q=templateFile")]
CloudObjects.TemplateImportation ImportNewTemplate(string sessionToken, bool commit, bool createApplication, byte[] templateFile, string templateName, string option);
This is what I use on the JavaScript end to send the PDF file:
const file = e.target.files[0];

// Encode the file using the FileReader API
const reader = new FileReader();
var fileByteArray = [];
reader.onloadend = async (e) => {
  const arrayBuffer = e.target.result,
        array = new Uint8Array(arrayBuffer);
  for (const a of array) {
    console.log(a);
    fileByteArray.push(a);
  }
  let ret = await dispatch('createTemplate', { name: this.newForm.name, pdf: fileByteArray, save: false });
  await this.$store.dispatch('hideLoadingScreen');
  // Logs data:<type>;base64,wL2dvYWwgbW9yZ...
};
reader.onerror = async () => {
  await this.$store.dispatch('hideLoadingScreen');
};
reader.onabort = async () => {
  await this.$store.dispatch('hideLoadingScreen');
};
await this.$store.dispatch('showLoadingScreen');
reader.readAsArrayBuffer(file);
And here is the code to send it to the REST service:
let url = `${getters.getServiceUrl}ImportNewTemplate?templateName=${name}&commit=${save || true}`;
const xhr = new XMLHttpRequest();
xhr.open("POST", url, false);
xhr.setRequestHeader('Content-Type', 'application/json');
let response = await xhr.send(pdf);
However, every time I get an error from the service when it tries to deserialize the byte array.
The exception message is: 'There was an error deserializing the object of type System.Byte[]. End element 'root' from namespace '' expected.'
I have tried a lot of alternatives but nothing works.
Any suggestions are welcome!
Thanks
For those interested, the trick was to JSON.stringify the array before sending it.
So: xhr.send(JSON.stringify(pdf))
would do the trick.

How do I send compressed data from Laravel server to frontend JS (Solved)

Solved: I added the solution I used in an answer below.
I have a compressed JSON file in my storage folder with path storage/app/public/data.json.gz, and I am trying to send this data to my JS frontend via a fetch request. Sending the data works, but I am having trouble decompressing it back to JSON in JS so I can use it in my code. I've read that the browser might be able to decompress this automatically, but I'm not sure how to enable that. The reason I am sending the data compressed is that it is 130 MB of data that shrinks down to 7 MB when compressed, and I am hoping that sending less data will speed up the site for users.
Laravel route that sends compressed file
Route::get('/chunks/{index}', function ($index) {
    $path = 'public/chunks_'.$index.'.json.gz';
    if (!Storage::exists($path)) {
        abort(404);
    }
    return Storage::response($path);
});
Currently I am using the fetch API to get the data
JS Code
let chunks = await getZipFile("/chunks/0", []).then((data) => {
  return data;
});

public static async getZipFile(url: string, params: any = {}, method = "GET", headers: any = {
  "Content-Type": "application/zip",
}) {
  headers['X-Requested-With'] = 'XMLHttpRequest';
  let options: any = {
    'method': method,
    'headers': headers
  };
  url += "?" + new URLSearchParams(params).toString();
  const result = await fetch(url, options).then((response) => response);
  return result;
}
Any help would be appreciated. Currently I can retrieve the compressed data and convert it to a string with result.text(), but I have not been able to figure out how to decompress it. I tried using zlib to decompress but got the error Can't resolve './zlib_bindings'. So I am looking for a way to decompress with zlib (or something similar), or for how to configure the server/browser to decompress automatically.
I ended up taking Moradnejad's answer and used zip.js; here is the updated code.
Laravel Route:
Instead of declaring a route to send the file, I used a Laravel symbolic link to serve a .zip file from my public storage (https://laravel.com/docs/9.x/filesystem#the-public-disk).
Also, in case it is helpful, here is the command I wrote to create the .zip file from the files in my storage.
public function handle()
{
    $fileNames = [];
    for ($i = 0; $i < 10000/1000; $i++) {
        array_push($fileNames, 'public/chunks_'.$i.'.json');
    }
    $this->zipFiles($fileNames, './storage/app/public', './storage/app/public/chunks.zip');
    return 0;
}
public function zipFiles($files, $path, $zipFileNameAndPath) {
    $zip = new ZipArchive;
    $zip->open($zipFileNameAndPath, ZipArchive::CREATE);
    foreach ($files as $file) {
        $zip->addFile($path.'/'.$file, $file);
    }
    $zip->close();
}
Updated JS request code: I used result.blob() to return a blob of the data.
public static zipRequest(url: string) {
  return this.getZipFile(url);
}

public static async getZipFile(url: string, params: any = {}, method = "GET", headers: any = {
  "Content-Type": "application/zip",
}) {
  headers['X-Requested-With'] = 'XMLHttpRequest';
  let options: any = {
    'method': method,
    'headers': headers
  };
  if ("GET" === method) {
    url += "?" + new URLSearchParams(params).toString();
  } else {
    // add csrf token to post request
    options.headers["X-CSRF-TOKEN"] = document.querySelector<HTMLElement>('meta[name="csrf-token"]')!.getAttribute('content');
    options.body = JSON.stringify(params);
  }
  const result = await fetch(url, options).then((response) => response);
  return result.blob();
}
Updated JS code to handle the blob result: I am using zip.js to get all 10 JSON files out of the .zip data, and then I merge the 10 JSON files together.
import * as zip from "@zip.js/zip.js";

async function getAllChunks() {
  let chunks = await Helper.getZipFile("storage/chunks.zip", []).then(async (data) => {
    let allChunks: any = [];
    let textReader = new zip.BlobReader(data);
    let zipReader = new zip.ZipReader(textReader);
    let entries = await zipReader.getEntries();
    for (let i = 0; i < entries.length; i++) {
      let entry = entries[i];
      if (entry.getData) {
        let textWriter = new zip.TextWriter();
        let jsonString = await entry.getData(textWriter);
        let chunks = await JSON.parse(jsonString);
        allChunks.push.apply(allChunks, chunks);
      }
    }
    return allChunks;
  });
  return chunks;
}
You're mixing two ideas. HTTP responses can be compressed and decompressed at the transport level, below your application code: if compression is enabled on the server, the browser decompresses the response by itself.
What you have here is a compressed file. No frontend or ajax call will decompress it automatically for you.
Solutions:
Either enable compression for HTTP responses and rely on it to handle compression automatically, and send the data uncompressed in this version.
Or use a frontend library, like 'zip.js', to decompress the compressed file when you receive it.
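A third option, if you want to keep serving the .gz file as-is: modern browsers and Node 18+ ship a built-in DecompressionStream, so no library is needed. A sketch, assuming the /chunks/{index} route from the question returns the raw gzip bytes:

```javascript
// Fetch a .json.gz and inflate it client-side with the built-in
// DecompressionStream (Chrome 80+, Firefox 113+, Safari 16.4+, Node 18+).
async function fetchGzippedJson(url) {
  const response = await fetch(url);
  // Pipe the raw response bytes through a gzip inflater...
  const inflated = response.body.pipeThrough(new DecompressionStream('gzip'));
  // ...then collect the decompressed stream back into text and parse it.
  const text = await new Response(inflated).text();
  return JSON.parse(text);
}

// usage sketch: const chunks = await fetchGzippedJson('/chunks/0');
```

Note that this only helps when the server sends the file without a Content-Encoding: gzip header; if that header is set, the browser already inflated the body for you.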

Create csv and post as multipart/form-data in JavaScript

How can I create equivalent code in JavaScript?
Given ['col1', 'col2'] and ['val1', 'val2'], or the string 'col1,col2\r\nval1,val2\r\n',
I want to be able to create a CSV file object in memory, without reading/writing to disk, and then POST it.
Python:
from io import StringIO
import csv
import requests
f = StringIO()
w = csv.writer(f)
w.writerow(['col1', 'col2'])
w.writerow(['val1', 'val2'])
input_byte = f.getvalue().encode('UTF-8')
headers = {'Content-Type':"multipart/form-data"}
endpoint = "http://localhost:8085/predict"
files = {'file': ('input.csv', input_byte)}
response = requests.post(endpoint, files=files)
Here is my code in JavaScript so far:
let data = [['col1', 'col2'],
            ['val1', 'val2']];
// convert to csv format
let csvContent = data.map(e => e.join(",")).join("\r\n") + "\r\n";

// I believe a FormData object is required to send multipart/form-data
// I do not think I am passing my csv data correctly
let body = new FormData();
let buff = Buffer.from(csvContent, "utf-8");
body.append("file", buff, { filename: 'input.csv' });
let response = await fetch("http://localhost:8085/predict", {
  method: 'POST',
  body: body,
  headers: { 'Content-Type': "multipart/form-data" }
});
EDIT:
I was able to send a CSV file, but I had to write it to disk first. Is it possible to avoid that?
let data = [['col1', 'col2'],
            ['val1', 'val2']];
// convert to csv format
let csvContent = data.map(e => e.join(",")).join("\r\n") + "\r\n";

// save a csv file
let path = './files/' + Date.now() + '.csv';
fs.writeFile(path, csvContent, (err) => {
  if (err) {
    console.error(err);
  }
});

let body = new FormData();
// this works :)
// but how can I create a csv buffer without writing to a disk?
body.append("file", fs.createReadStream(path), { filename: 'input.csv' });
let response = await fetch("http://localhost:8085/predict", {
  method: 'POST',
  body: body,
});
I was able to solve my question with the following script:
const FormData = require('form-data');
const Readable = require('stream').Readable;
const fetch = require("node-fetch");

let data = [['col1', 'col2'],
            ['val1', 'val2']];
// convert to csv format
let csvContent = data.map(e => e.join(",")).join("\r\n") + "\r\n";
const stream = Readable.from(csvContent);

let body = new FormData();
body.append("file", stream, { filename: 'input.csv' });
let response = await fetch("http://localhost:8085/predict", {
  method: 'POST',
  body: body,
});

Send canvas.toDataURL images to nodejs

I'm trying to send an image from a front-end script to my server.
Front-end script:
var img_data = canvas.toDataURL('image/jpg'); // contains screenshot image
// Insert here POST request to send image to server
And I'm trying to accept the data on the backend and store it in req.files so I can access it like this:
const get_image = (req, res) => {
  const File = req.files.File.tempFilePath;
}
How can I send the image to the server and then read it like in the example above?
Your img_data is a base64 data URL string, which you can send to the server directly in a POST request,
e.g.
await fetch('/api/path', { method: 'POST', headers: { "content-type": "application/json"}, body: JSON.stringify({ file: img_data }) });
On your backend, you can convert this string to binary and save it to a file.
var fs = require('fs');
app.post('/api/path', async (req, res) => {
  const img = req.body.file;
  var regex = /^data:.+\/(.+);base64,(.*)$/;
  var matches = img.match(regex); // match against img (the original snippet referenced an undefined `string`)
  var ext = matches[1];
  var data = matches[2];
  var buffer = Buffer.from(data, 'base64'); // file buffer
  // ... do whatever you want with the buffer
  fs.writeFileSync('imagename.' + ext, buffer); // skip this step if you do not need to save to a file
  // ... return res to the client
})
You have to convert it to a Blob first and then append it to a FormData object. The form becomes the body of the request that you send to the server.
canvas.toBlob(function(blob) {
  var form = new FormData(),
      request = new XMLHttpRequest();
  form.append("image", blob, "filename.png");
  request.open("POST", "/upload", true);
  request.send(form);
}, "image/png");
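The same upload can be written with fetch, which also sets the multipart boundary automatically. A sketch, where '/upload' and the 'image' field name are the hypothetical values from the snippet above:

```javascript
// fetch equivalent of the XMLHttpRequest upload above: wrap the canvas blob
// in a FormData and let fetch set the multipart Content-Type itself.
function uploadCanvas(canvas, url) {
  return new Promise((resolve, reject) => {
    canvas.toBlob(async (blob) => {
      try {
        const form = new FormData();
        form.append('image', blob, 'filename.png');
        const response = await fetch(url, { method: 'POST', body: form });
        resolve(response.status);
      } catch (err) {
        reject(err);
      }
    }, 'image/png');
  });
}

// usage sketch: await uploadCanvas(document.querySelector('canvas'), '/upload');
```

On the server this arrives as a regular multipart file part, so middleware like multer or express-fileupload will expose it under req.files.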

formData.append() is not sending file to the server?

I am writing a React.js component that will upload multiple photos at a time. I am currently trying to send a batch of photos to the server, but I cannot seem to get the files to append to the FormData.
I call this function on the onChange event of the input field:
batchUpload(e) {
  e.preventDefault();
  let files = e.target.files;
  for (let i = 0; i < files.length; i++) {
    let file = files[i],
        reader = new FileReader();
    reader.onload = (e) => {
      let images = this.state.images.slice();
      if (file.type.match('image')) {
        images.push({ file, previewURL: e.target.result });
        this.formData.append('files', file); // THIS IS NOT APPENDING THE FILE CORRECTLY
        this.setState({ images });
      }
    };
    reader.readAsDataURL(file);
  }
  this.props.setFormWasEdited(true);
}
Then once the save button is pressed I run this function:
saveClick(goBack, peopleIdArray) {
  if (this.state.images.length > 0) {
    let formData = this.formData;
    formData.append('token', Tokens.findOne().token);
    formData.append('action', 'insertPhotoBatch');
    formData.append('tags', peopleIdArray);
    formData.append('date', dateString());
    for (var pair of formData.entries()) {
      console.log(pair[0] + ', ' + JSON.stringify(pair[1]));
    }
    let xhr = new XMLHttpRequest();
    xhr.open('POST', Meteor.settings.public.api, true);
    xhr.onload = (e) => {
      if (xhr.status === 200) {
        // upload success
        console.log('XHR success');
      } else {
        console.log('An error occurred!');
      }
    };
    xhr.send(formData);
  } else {
    // signifies error
    return true;
  }
}
Everything seems to be fine until I append the files to the formData. What am I doing wrong?
If I'm not mistaken, your problem is with this.formData.append('files', file);
Running this line in a for loop will get you one field with all the files appended to each other, resulting in an invalid file.
Instead, you must use the file "array" syntax used in forms, like so:
this.formData.append('files[]', file);
This way you get the files on the server side as $_FILES['files']['name'][0], $_FILES['files']['name'][1], ... and likewise for the other properties of the files array.
I hope you have solved your issues already. I am still stuck not understanding why it seems that my formData is not bringing anything to my server, but I did find an issue with your code.
When you use
JSON.stringify(pair[1])
the result looks like an empty object. If you instead try
pair[1].name
you'd see that append actually did attach your file.
const config = {
  headers: { 'content-type': 'multipart/form-data' }
};
const formData = new FormData();
Object.keys(payload).forEach(item => {
  formData.append(item, payload[item]);
});
Pass this formData to your API.
