Problems sending a PDF from the backend - javascript

I am doing a small project with ASP.NET Core (backend) and VueJS (frontend); the two communicate via axios.
I want to send a PDF file that sits at a local path on the server and show it in a new browser window.
I've been looking for information for a couple of days and I only get more confused.
This is the closest I've gotten, and I honestly don't know what I'm doing :(
(I apologize for my English)
VueJS & Vuetify
this.$proxies.menusProxy
  .getFileStream(this.eitem.identificador, {
    params: {
      file: "Clientes.pdf",
    },
  })
  .then((x) => {
    console.log("Done: ", x);
  })
  .catch((x) => {
    console.log("Error: ", x);
  });
ASP.NET
public async Task<HttpResponseMessage> GetFileStream(string file)
{
    var uploads = Path.Combine("c:\\Shared\\06W03R0821105PLAE1\\" + file);

    // Copy the file into memory so it can be returned as the response body
    MemoryStream responseStream = new MemoryStream();
    Stream fileStream = System.IO.File.Open(uploads, FileMode.Open);
    fileStream.CopyTo(responseStream);
    fileStream.Close();
    responseStream.Position = 0;

    HttpResponseMessage response = new HttpResponseMessage();
    response.StatusCode = HttpStatusCode.OK;
    response.Content = new StreamContent(responseStream);
    string contentDisposition = string.Concat("attachment; filename=", file);
    response.Content.Headers.ContentDisposition =
        ContentDispositionHeaderValue.Parse(contentDisposition);
    return response;
}
Console response: (screenshot of the browser console output, not reproduced here)
Thank you very much to all
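For reference, a minimal sketch of the usual blob-based approach on the Vue/axios side. The endpoint path here is a placeholder, not the asker's actual API:
// request the PDF as binary so axios doesn't decode it as text
axios
  .get('/api/menus/' + id + '/file', { // hypothetical endpoint
    params: { file: 'Clientes.pdf' },
    responseType: 'blob',
  })
  .then((response) => {
    const blob = new Blob([response.data], { type: 'application/pdf' });
    const url = window.URL.createObjectURL(blob);
    window.open(url); // shows the PDF in a new browser window/tab
  });
On the server side, the action would need to return the raw file bytes with Content-Type application/pdf (and an inline rather than attachment Content-Disposition) for the browser to render the PDF instead of downloading it.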

Related

Return a CSV File from Flask to Angular Client

I have an app which runs an Angular frontend and a Flask backend. I have created a button that triggers an API call to Flask, which queries my database and returns a dataframe (df) in CSV format.
I believe I have coded the Flask part correctly, as I can't see any errors in the logs. However, I do get an error on the client side:
SyntaxError: Unexpected token '', ""... is not valid JSON
I suspect it's because my subscription to the data is done incorrectly, but I am unsure of what needs to happen.
Angular (When the Download Button is clicked, this is triggered)
fullDownload() {
  let responseData: any;
  const date = this.selectedDate;
  const id = this.selectedId;
  this.seriveFile.getFullData(date, id).subscribe(
    data => {
      responseData = new Blob([data], { type: 'text/csv' });
      const url = window.URL.createObjectURL(responseData);
      window.open(url);
    },
    error => {
      this.errorMessage = error.error.error;
    }
  );
}
Angular (The service it calls)
public getFullData(value, id) {
  let params = new HttpParams();
  params = params.append('date', value);
  params = params.append('id', id);
  return this.http.get<any>(`${this.baseAPIURL}/api/example/download-data`, { params, responseType: 'blob' as 'json' });
}
Flask (make_response needs to be imported; the lines below live inside the route handler)
from flask import make_response

resp = make_response(df.to_csv(index=False))
resp.headers["Content-Disposition"] = "attachment; filename=export.csv"
resp.headers["Content-Type"] = "text/csv"
return resp
This should download the file:
fullDownload() {
  const date = this.selectedDate;
  const id = this.selectedId;
  this.seriveFile.getFullData(date, id).subscribe(
    data => {
      const filename = 'csv.csv';
      const a = document.createElement('a');
      a.href = window.URL.createObjectURL(data);
      a.download = filename;
      a.click();
    },
    error => {
      this.errorMessage = error.error.error;
    }
  );
}
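One small refinement, my own addition rather than part of the answer: release the object URL after the click, so the browser can free the blob:
const url = window.URL.createObjectURL(data);
const a = document.createElement('a');
a.href = url;
a.download = 'csv.csv';
a.click();
window.URL.revokeObjectURL(url); // free the blob once the download has started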

Parsing a large XML file shuts down the server (Node.js)

I have written code in Node.js that receives XML data from the server as a stream and then parses it. Locally it works fine, but when I deploy it, it shuts down the Node server.
It takes 100% of the CPU. Is there any way to reduce the CPU usage?
Any expert opinion?
const axios = require('axios'),
  node_xml_stream = require('node-xml-stream'),
  httpAdapter = require('axios/lib/adapters/http');

let response;
try {
  sails.log.info('Before http stream request');
  response = await axios.get(process.env.SRO_FEED_URL, {
    responseType: 'stream',
    adapter: httpAdapter
  });
} catch (error) {
  sails.log.error('http request error ', error);
}

const stream = response.data;
sails.log.info('After the feed response');

const parser = new node_xml_stream();
stream.pipe(parser);

let t_name, id, company;
parser.on('opentag', function (name, attrs) {
  if (name === 'job') {
    // let attr = attrs;
  }
  t_name = name;
});
parser.on('text', function (text) {
  if (t_name === 'id') {
    id = text;
  }
  if (t_name === 'company') {
    company = text;
  }
});
The code works fine locally, but when I deploy it, it crashes the server.
I'm having a tough time finding a node package that can parse large XML files that are 1 GB+ in size.
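One hedged idea, using nothing beyond the stream API the code above already has: pause the HTTP stream while each element is handled, so parsing yields to the event loop instead of monopolizing the CPU. handleJob below is a hypothetical stand-in for the real per-job work:
parser.on('opentag', function (name, attrs) {
  if (name === 'job') {
    stream.pause();           // apply backpressure to the HTTP stream
    setImmediate(function () {
      handleJob(attrs);       // hypothetical handler for one <job> element
      stream.resume();        // let the parser continue afterwards
    });
  }
  t_name = name;
});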

How can I send an image from Node.js to React.js?

I have the following setup, by which I send the image, from its URL, to be edited and sent back to be uploaded to S3. The problem I currently have is that the image arrives on S3 corrupted, and I am wondering if there's trouble in my code that's causing the issue.
Server side:
const gm = require('gm');           // graphicsmagick wrapper (assumed from the calls below)
const request = require('request'); // used for the S3 GET below

function convertImage(inputStream) {
  return gm(inputStream)
    .contrast(-2)
    .stream();
}

app.get('/resize/:imgDetails', (req, res, next) => {
  let params = req.params.imgDetails.split('&');
  let fileName = params[0];
  console.log(fileName);
  let tileType = params[1];
  console.log(tileType);
  res.set('Content-Type', 'image/jpeg');
  let url = `https://${process.env.Bucket}.s3.amazonaws.com/images/${tileType}/${fileName}`;
  convertImage(request.get(url)).pipe(res);
});
Client side:
axios.get('/resize/' + fileName + '&' + tileType)
  .then(res => {
    /** PUT FILE ON AWS **/
    var img = res;
    axios.post('/sign_s3_sized', {
      fileName: fileName,
      tileType: tileType,
      ContentType: 'image/jpeg'
    })
      .then(response => {
        var returnData = response.data.data.returnData;
        var signedRequest = returnData.signedRequest;
        var url = returnData.url;
        this.setState({ url: url });
        // Put the fileType in the headers for the upload
        var options = {
          headers: {
            'Content-Type': 'image/jpeg'
          }
        };
        axios.put(signedRequest, img, options)
          .then(result => {
            this.setState({ success: true });
          })
          .catch(error => {
            console.log('ERROR: ' + JSON.stringify(error));
          });
      })
      .catch(error => {
        console.log(JSON.stringify(error));
      });
  })
  .catch(error => console.log(error));
Before going any further, I can assure you that uploading images via this setup minus the convertImage() step works; otherwise the image gets put on S3 corrupted.
Any pointers as to what the issue behind the corrupted image might be?
Is my understanding of streams here lacking, perhaps? If so, what should I change?
Thank you!
EDIT 1:
I tried not running the image through the graphicsmagick API at all (request.get(url).pipe(res);) and the image is still corrupted.
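A pattern consistent with that symptom (my reading, not confirmed in the thread): without an explicit responseType, axios decodes the response body as a UTF-8 string, which mangles raw JPEG bytes. Requesting a blob keeps the bytes intact:
axios.get('/resize/' + fileName + '&' + tileType, { responseType: 'blob' })
  .then(res => {
    const img = res.data; // a Blob holding the raw image bytes
    // ...sign the S3 request and PUT `img` as before
  });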
EDIT 2:
I gave up at the end and just uploaded the file from Node.js straight to S3; it turned out to be better practice anyway.
So if your end goal is to upload the image to the S3 bucket using Node.js, there is a simple way using the multer-s3 node module, as sketched below.
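A minimal sketch of that route (assuming aws-sdk v2 and multer-s3 v2; the bucket name, route path, and field name are placeholders):
const multer = require('multer');
const multerS3 = require('multer-s3');
const AWS = require('aws-sdk');

const s3 = new AWS.S3();
const upload = multer({
  storage: multerS3({
    s3: s3,
    bucket: 'my-bucket',                     // placeholder bucket
    contentType: multerS3.AUTO_CONTENT_TYPE, // detect image/png, image/jpeg, ...
    key: (req, file, cb) => cb(null, 'images/' + file.originalname),
  }),
});

// multer-s3 streams the multipart upload straight to S3;
// the resulting object URL lands on req.file.location
app.post('/upload', upload.single('image'), (req, res) => {
  res.json({ location: req.file.location });
});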

AWS S3 Upload after GET Request to Image, Not Uploading Correctly

I'm trying to upload an image to my AWS S3 bucket after downloading the image from another URL using Node (using request-promise-native & aws-sdk):
'use strict';

const config = require('../../../configs');
const AWS = require('aws-sdk');
const request = require('request-promise-native');

AWS.config.update(config.aws);
let s3 = new AWS.S3();

function uploadFile(req, res) {
  function getContentTypeByFile(fileName) {
    var rc = 'application/octet-stream';
    var fn = fileName.toLowerCase();
    if (fn.indexOf('.png') >= 0) rc = 'image/png';
    else if (fn.indexOf('.jpg') >= 0) rc = 'image/jpg';
    return rc;
  }

  let body = req.body,
    params = {
      "ACL": "bucket-owner-full-control",
      "Bucket": 'testing-bucket',
      "Content-Type": null,
      "Key": null,  // Name of the file
      "Body": null  // File body
    };

  // Grabs the filename from a URL
  params.Key = body.url.substring(body.url.lastIndexOf('/') + 1);
  // Setting the content type
  params.ContentType = getContentTypeByFile(params.Key);

  request.get(body.url)
    .then(response => {
      params.Body = response;
      s3.putObject(params, (err, data) => {
        if (err) { console.log(`Error uploading to S3 - ${err}`); }
        if (data) { console.log("Success - Uploaded to S3: " + data.toString()); }
      });
    })
    .catch(err => { console.log(`Error encountered: ${err}`); });
}
The upload succeeds when I test it out; however, after trying to redownload the image from my bucket, it won't display. Additionally, I notice that after uploading the file with my function, the object listed in the bucket is much larger than the originally uploaded image. I'm trying to figure out where I've gone wrong but cannot find it. Any help is appreciated.
Try opening the faulty file in a text editor; you will see some errors written in it.
You can try using s3.upload instead of putObject; it works better with streams. A combined sketch follows.
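A sketch combining both suggestions (assuming the same request-promise-native and aws-sdk v2 setup as above): encoding: null makes request resolve with a Buffer instead of a UTF-8 string, which is the usual reason binary bodies grow and corrupt, and s3.upload accepts buffers and streams directly:
request.get({ url: body.url, encoding: null })
  .then(buffer => {
    params.Body = buffer; // a real Buffer, not a mangled string
    s3.upload(params, (err, data) => {
      if (err) { console.log(`Error uploading to S3 - ${err}`); }
      else { console.log(`Success - uploaded to ${data.Location}`); }
    });
  })
  .catch(err => { console.log(`Error encountered: ${err}`); });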

Uploading base64 encoded Image to Amazon S3 via Node.js

Yesterday I did a deep-night coding session and created a small node.js/JS app (well, actually CoffeeScript, but CoffeeScript is just JavaScript, so let's say JS).
What's the goal:
the client sends a canvas data URI (png) to the server (via socket.io)
the server uploads the image to Amazon S3
Step 1 is done.
The server now has a string à la
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAMgAAADICAYAAACt...
My question is: what are my next steps to "stream"/upload this data to Amazon S3 and create an actual image there?
knox (https://github.com/LearnBoost/knox) seems like an awesome lib to PUT something to S3, but what I'm missing is the glue between the base64-encoded image string and the actual upload action.
Any ideas, pointers and feedback welcome.
For people who are still struggling with this issue, here is the approach I used with the native aws-sdk:
var AWS = require('aws-sdk');
AWS.config.loadFromPath('./s3_config.json');
var s3Bucket = new AWS.S3({ params: { Bucket: 'myBucket' } });
Inside your router method (ContentType should be set to the content type of the image file):
var buf = Buffer.from(req.body.imageBinary.replace(/^data:image\/\w+;base64,/, ""), 'base64');
var data = {
  Key: req.body.userId,
  Body: buf,
  ContentEncoding: 'base64',
  ContentType: 'image/jpeg'
};
s3Bucket.putObject(data, function (err, data) {
  if (err) {
    console.log(err);
    console.log('Error uploading data: ', data);
  } else {
    console.log('successfully uploaded the image!');
  }
});
s3_config.json file:
{
  "accessKeyId": "xxxxxxxxxxxxxxxx",
  "secretAccessKey": "xxxxxxxxxxxxxx",
  "region": "us-east-1"
}
Here's the code from one article I came across, posting below:
const imageUpload = async (base64) => {
  const AWS = require('aws-sdk');
  const { ACCESS_KEY_ID, SECRET_ACCESS_KEY, AWS_REGION, S3_BUCKET } = process.env;

  AWS.config.setPromisesDependency(require('bluebird'));
  AWS.config.update({ accessKeyId: ACCESS_KEY_ID, secretAccessKey: SECRET_ACCESS_KEY, region: AWS_REGION });

  const s3 = new AWS.S3();

  // Strip the data-URI prefix and decode the base64 payload
  const base64Data = Buffer.from(base64.replace(/^data:image\/\w+;base64,/, ""), 'base64');
  const type = base64.split(';')[0].split('/')[1];
  const userId = 1;

  const params = {
    Bucket: S3_BUCKET,
    Key: `${userId}.${type}`, // type is not required
    Body: base64Data,
    ACL: 'public-read',
    ContentEncoding: 'base64', // required
    ContentType: `image/${type}` // required. Notice the back ticks
  };

  let location = '';
  let key = '';
  try {
    const { Location, Key } = await s3.upload(params).promise();
    location = Location;
    key = Key;
  } catch (error) {
    console.log(error); // don't swallow upload failures silently
  }
  console.log(location, key);
  return location;
};

module.exports = imageUpload;
Read more: http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#upload-property
Credits: https://medium.com/@mayneweb/upload-a-base64-image-data-from-nodejs-to-aws-s3-bucket-6c1bd945420f
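Since the helper is exported, usage from a route could look roughly like this (the route and request shape are my assumptions, not from the article):
const imageUpload = require('./imageUpload');

app.post('/avatar', async (req, res) => {
  // req.body.image is expected to be a full data URI string
  const location = await imageUpload(req.body.image);
  res.json({ location }); // public S3 URL of the uploaded image
});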
OK, this one is the answer for how to save canvas data to a file.
Basically it looks like this in my code:
buf = new Buffer(data.dataurl.replace(/^data:image\/\w+;base64,/, ""), 'base64')

req = knoxClient.put('/images/' + filename, {
  'Content-Length': buf.length,
  'Content-Type': 'image/png'
})

req.on('response', (res) ->
  if res.statusCode is 200
    console.log('saved to %s', req.url)
    socket.emit('upload success', imgurl: req.url)
  else
    console.log('error %d', req.statusCode)
)

req.end(buf)
The accepted answer works great, but if someone needs to accept any file instead of just images, this regexp works great:
/^data:.+;base64,/
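Applied to the decoding step, that looks like this (a sketch; dataUri stands for whatever data URI string was received):
const buf = Buffer.from(dataUri.replace(/^data:.+;base64,/, ''), 'base64');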
For Laravel developers, this should work:
/* upload the file */
$path = Storage::putFileAs($uploadfolder, $uploadFile, $fileName, "s3");
Make sure to set your .env file properly before calling this method.
