I am trying to send a YAML file as a base64 string so that this code works:
const response = await octokit.request('GET /repos/{owner}/{repo}/git/blobs/{file_sha}', {
  owner: 'DevEx',
  repo: 'hpdev-content',
  file_sha: fileSha,
  headers: {
    authorization: `Bearer ${githubConfig?.token}`,
  },
});
const decoded = Buffer.from(response.data.content, 'base64').toString('utf8');
In the above code, response.data.content should contain the data.
I have this route:
router.get('/repos/:owner/:repo/git/blobs/:file_sha', (req, res) => {
  // TODO: do we need to do anything with the path params?
  // eslint-disable-next-line @typescript-eslint/no-unused-vars
  const { owner, repo, file_sha } = req.params;
  const contents = writeUsersReport();
  const encoded = Buffer.from(contents, 'binary').toString('base64');
  res.send(encoded);
});
The code is working fine, except that the client code expects the base64 string in a property called content:
const decoded = Buffer.from(response.data.content, 'base64').toString('utf8');
But the string is in response.data.
How can I set the content property instead?
How about sending a JSON response from your server side containing an object with a content property, instead of the encoded string directly?
// ...
const encoded = Buffer.from(contents, 'binary').toString('base64');
res.json({content:encoded});
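A minimal sketch of the adjusted route, reusing the writeUsersReport helper from the question:

router.get('/repos/:owner/:repo/git/blobs/:file_sha', (req, res) => {
  const contents = writeUsersReport();
  const encoded = Buffer.from(contents, 'binary').toString('base64');
  // Wrap the base64 string in a JSON body so the client can read response.data.content.
  // The encoding field just mirrors the shape of GitHub's blob API response.
  res.json({ content: encoded, encoding: 'base64' });
});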
Related
How do I download the document I receive in the response in React?
Here is my Node.js app. fetchContracts is a function which gets data from MongoDB and then generates an Excel file using the json2xls npm package.
It returns like this:
const xls = json2xls(contracts);
return xls;
If I write the file with fs.writeFileSync(path.join(__dirname, filename), xls, 'binary'); it successfully generates an xlsx file on the server.
But I need to send the file in the response without writing it to disk. For this, I made some experiments that you can see below.
export const EXPORT_EXCEL: SessionedAsyncControllerType = async (req: SessionedRequest, res: Response) => {
  const fileName = 'hello_world.xlsx'
  const fileType = 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
  const xls = await fetchContracts({}, "fileName.xlsx")
  const fileData = xls;
  res.writeHead(200, {
    'Content-Disposition': `attachment; filename="${fileName}"`,
    'Content-Type': fileType,
  })
  const download = Buffer.from(fileData, 'base64')
  res.end(download)
}
I get a response like this.
But I don't know how to download the file from the response in React.
On the React side:
return api.get(`api/excel`).then((response: any) => {
  console.log(response);
})
I just log it to the console. How can I directly download the file that comes in the Node response in React.js?
Try this
return api.get(`api/excel`).then((response: any) => {
  const outputFilename = `${Date.now()}.xlsx`;

  // If you want to download file automatically using link attribute.
  const url = URL.createObjectURL(new Blob([response.data]));
  const link = document.createElement('a');
  link.href = url;
  link.setAttribute('download', outputFilename);
  link.click();
})
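For a binary file like xlsx, the request usually also needs to ask for binary data; otherwise response.data arrives as a decoded string and the resulting Blob can come out corrupted. A sketch, assuming your api wrapper forwards options to axios:

// Sketch: request the body as a Blob so new Blob([response.data]) keeps the bytes intact.
return api.get(`api/excel`, { responseType: 'blob' }).then((response: any) => {
  const outputFilename = `${Date.now()}.xlsx`;
  const url = URL.createObjectURL(new Blob([response.data]));
  const link = document.createElement('a');
  link.href = url;
  link.setAttribute('download', outputFilename);
  link.click();
  URL.revokeObjectURL(url); // release the object URL once the download has been triggered
});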
I am using ipfs-http-client to read the contents of a file from Infura. How do I use the cat functionality to correctly get the data in a string/JSON format?
const client = create({
  url: ipfsUrl(),
  headers: {
    authorization: ipfsAuthPhrase(),
  },
});

const cidformat = "f" + cid.substring(2);
const cidV0 = new CID(cidformat).toV0().toString();

const resp = await client.cat(cidV0);
let content = [];
for await (const chunk of resp) {
  content = [...content, ...chunk];
}
console.log(content.toString());
Right now I am just getting an array of bytes in the console log.
From this point on, it's just a matter of decoding the content buffer.
If the content is some JSON:
const raw = Buffer.from(content).toString('utf8')
console.log(JSON.parse(raw))
If the content is an image:
Buffer.from(content).toString('base64')
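As a side note, collecting the chunks and joining them with Buffer.concat avoids spreading every byte into a plain array. A small sketch, assuming the same client and cidV0 as above:

// Sketch: gather the Uint8Array chunks from client.cat and concatenate them once.
const chunks = [];
for await (const chunk of client.cat(cidV0)) {
  chunks.push(chunk);
}
const buffer = Buffer.concat(chunks);
console.log(JSON.parse(buffer.toString('utf8'))); // if the content is JSON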
Here is the code I'm trying to work with:
let Client = require('ssh2-sftp-client');
let sftp = new Client();
var csv = require("csvtojson");
sftp.connect({
  host: 'HOST',
  port: 'PORT',
  username: 'USERNAME',
  password: 'PASSWORD'
}).then(() => {
  return sftp.get('/home/user/etc/testfile.csv');
}).then((data) => {
  csv()
    .fromString(data.toString()) // changed this from .fromStream(data)
    .subscribe(function(jsonObj) { // single json object will be emitted for each csv line
      // parse each json asynchronously
      return new Promise(function(resolve, reject) {
        resolve();
        console.log(jsonObj);
      });
    });
}).catch((err) => {
  console.log(err, 'catch error');
});
I can read back the CSV data and can see it being converted to JSON in console.log(jsonObj), but the data is unreadable, all '\x00o\x00n\x00' ...
I'm not sure what to do in the line:
// parse each json asynchronously
Could anyone help to figure out how to parse the CSV/JSON after it comes back from the buffer?
The null bytes (\x00) point towards an encoding/decoding issue. The CSV file is probably encoded as UTF-16, but Buffer.toString() decodes the data as UTF-8 by default. You can change that to data.toString('utf16le') (or its alias data.toString('ucs2')) to force the correct encoding.
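Applied to the snippet above, that is a one-line change (a sketch, assuming the file really is UTF-16LE):

// Sketch: decode the SFTP buffer as UTF-16LE before handing it to csvtojson.
csv()
  .fromString(data.toString('utf16le')) // was data.toString(), which assumed UTF-8
  .subscribe(function(jsonObj) {
    console.log(jsonObj);
  });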
Alright, so I'm fairly new to Node/Express. I've got an Angular chart, and a React-based Node exporting service.
When you click a chart's export button, it should create an SVG, pass it to the export service in an axios.post request, convert it to a React component, and export the HTML to a PNG via an in-memory image stream, which should then be downloaded on the client when a successful response comes back.
I've tried a Blob, a Buffer, hacking an image.src, etc. I can't seem to get it.
The response passes back the raw image string, so all of that works (if that's what I should be returning)... but I can't get it to download on the client. After digging, I found out that res.download only works with GET, but I specifically need a POST here to pass the SVG and other random data up to the export server (too large for query params).
I'm not sure if it's just that I'm converting it incorrectly (stream) and I can't parse it correctly on the client, or if I'm missing something key.
Here are some snippets:
Angular Service
// ...stuff...
exportPNG(chart) {
  const exportServiceURL = 'someUrl.com';
  const svg = chart.toSvg();
  const params = { svg, ...someOtherData };
  const exportUrl = `${exportServiceURL}/exports/image/chart`;
  return axios.post(exportUrl, params).then(res => {
    // ... I need to download the response's image somehow, here?
    // res.data looks like "�PNG IHDR�9]�{pHYs��IDATx^�}T[Ǚ�w6�\(-��c[؍�� ')�...
  });
}
React/Export service:
// ...stuff...
async getChart(req, res) {
  const deferred = async.defer();
  const filename = 'reportingExport.png';
  // Returns HTML, wrapped around the SVG string.
  const html = this.getChartHtml(req.body);
  const stream = await this.htmlToImageStream(html, filename);
  stream.on('open', () => this.pipeStream(req, res, stream, filename));
}
async htmlToImageStream(html, tempFilename) {
  const deferred = async.defer();
  webshot(html, tempFilename, { siteType: 'html' }, err => {
    if (err) deferred.error(err);
    const stream = fs.createReadStream(tempFilename);
    // Does cleanup, fs.unlink, not important
    stream.on('end', () => this.imageExportStreamEnd(stream, tempFilename));
    deferred.resolve(stream);
  });
  return deferred.promise;
}
pipeStream(req, res, stream, filename) {
  res.set('Content-Type', 'application/png');
  // res.attachment(filename); // this didn't work
  res.download(`${path.join(__dirname, '../../../')}${filename}`, filename);
  stream.pipe(res); // Seems to return an actual PNG
}
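One common pattern (a sketch, not the original author's solution) is to keep the POST, ask axios for binary data, and trigger the download on the client from the response, much like the xlsx example earlier. This reuses exportUrl and params from the Angular service above:

// Sketch: download the PNG returned by the POST on the client side.
// Assumes the export service writes the raw PNG bytes to the response.
return axios.post(exportUrl, params, { responseType: 'blob' }).then(res => {
  const url = URL.createObjectURL(new Blob([res.data], { type: 'image/png' }));
  const link = document.createElement('a');
  link.href = url;
  link.setAttribute('download', 'reportingExport.png');
  link.click();
  URL.revokeObjectURL(url);
});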
I am new to Node.js and am trying to set up a server where I get the EXIF information from an image. My images are on S3, so I want to be able to just pass in the S3 URL as a parameter and grab the image from it.
I am using the ExifImage project below to get the EXIF info, and according to their documentation:
"Instead of providing a filename of an image in your filesystem you can also pass a Buffer to ExifImage."
How can I load an image from a URL into a Buffer in Node so I can pass it to the ExifImage function?
ExifImage Project:
https://github.com/gomfunkel/node-exif
Thanks for your help!
Try setting up request like this:
var request = require('request').defaults({ encoding: null });
request.get(s3Url, function (err, res, body) {
//process exif here
});
Setting encoding to null will cause request to output a buffer instead of a string.
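From there, the buffer in body can be handed to node-exif. A sketch, assuming the ExifImage constructor from the node-exif README (new ExifImage({ image: <Buffer> }, callback)):

// Sketch: read EXIF data from the downloaded buffer using node-exif.
var ExifImage = require('exif').ExifImage;
var request = require('request').defaults({ encoding: null });

request.get(s3Url, function (err, res, body) {
  if (err) return console.error(err);
  try {
    new ExifImage({ image: body }, function (error, exifData) {
      if (error) return console.error('Error: ' + error.message);
      console.log(exifData); // EXIF tags parsed from the image buffer
    });
  } catch (error) {
    console.error('Error: ' + error.message);
  }
});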
Use axios:
const response = await axios.get(url, { responseType: 'arraybuffer' })
const buffer = Buffer.from(response.data)
import fetch from "node-fetch";
let fimg = await fetch(image.src)
let fimgb = Buffer.from(await fimg.arrayBuffer())
I was able to solve this only after reading that encoding: null is required and providing it as a parameter to request.
This will download the image from url and produce a buffer with the image data.
Using the request library -
const request = require('request');
let url = 'http://website.com/image.png';
request({ url, encoding: null }, (err, resp, buffer) => {
  // Use the buffer
  // buffer contains the image data
  // typeof buffer === 'object'
});
Note: omitting encoding: null will result in an unusable string and not a buffer. Buffer.from won't work correctly either.
This was tested with Node 8
Use the request library.
request({ url: '<s3imageurl>', encoding: null }, function(err, response, buffer) {
  // Do something with the image buffer
});
Also, node-image-headers might be of interest to you. It sounds like it takes a stream, so it might not even have to download the full image from S3 in order to process the headers.
Here's a solution that uses the native https library.
import { get } from "https";
function urlToBuffer(url: string): Promise<Buffer> {
  return new Promise((resolve, reject) => {
    const data: Uint8Array[] = [];
    get(url, (res) => {
      res
        .on("data", (chunk: Uint8Array) => {
          data.push(chunk);
        })
        .on("end", () => {
          resolve(Buffer.concat(data));
        })
        .on("error", (err) => {
          reject(err);
        });
    });
  });
}
const imageUrl = "https://i.imgur.com/8k7e1Hm.png";
const imageBuffer = await urlToBuffer(imageUrl);
Feel free to delete the types if you're looking for javascript.
I prefer this approach because it doesn't rely on 3rd party libraries or the deprecated request library.
request is deprecated and should be avoided if possible.
Good alternatives include got (Node.js only) and axios (which also supports browsers).
Example of got:
npm install got
Using the async/await syntax:
const got = require('got');
const url = 'https://www.google.com/images/branding/googlelogo/2x/googlelogo_color_272x92dp.png';
(async () => {
  try {
    const response = await got(url, { responseType: 'buffer' });
    const buffer = response.body;
  } catch (error) {
    console.log(error.response.body);
  }
})();
You can do it this way:
import axios from "axios";
async function getFileContentById(download_url: string): Promise<Buffer> {
  const response = await axios.get(download_url, {
    responseType: "arraybuffer",
  });
  return Buffer.from(response.data);
}
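For example, a hypothetical call to the helper above (the URL is just a placeholder):

const imageBuffer = await getFileContentById("https://example.com/image.png");
console.log(imageBuffer.length); // number of bytes downloaded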