Using HTTP2 directly in NestJS - javascript

Is there any way to use HTTP/2 directly in NestJS (methods like pushStream, etc.), or should I use Express for this kind of feature?

No, there is no built-in support so far. But as long as you are on a recent Node.js, you can use the native http2 module directly.
You have two options:
1. Write a custom NestJS transport (I wonder why no one has done that so far).
2. Mix it into your code yourself, e.g. here is a JS http2 client consuming a streaming endpoint (a server-side pushStream sketch follows the snippet).
const http2 = require('http2')

const session = http2.connect('http://localhost:8088')
session.on('error', (err) => console.error(err))

const body = {
  sql: "SELECT * FROM SIGNALS EMIT CHANGES;",
  properties: { "ksql.streams.auto.offset.reset": "latest" }
}

const req = session.request({
  ':method': 'POST',
  ':path': '/query-stream',
  'content-type': 'application/json'
})
req.write(JSON.stringify(body), 'utf8')
req.end()

req.on('response', (headers) => {
  // we can log each response header here
  for (const name in headers) {
    console.log(`a header: ${name}: ${headers[name]}`)
  }
})

req.setEncoding('utf8')
let data = ''
req.on('data', (chunk) => {
  data += chunk
  console.log(`\n${data}`)
})
req.on('end', () => {
  console.log(`\n${data}`)
  session.close()
})
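Since the question specifically asks about pushStream, here is a minimal server-side sketch using Node's built-in http2 module on its own, not wired into Nest; the port and the pushed path are placeholders. (If you use Nest's Fastify adapter, Fastify also exposes an http2 option that you may be able to pass through, but that is outside the scope of this snippet.)

const http2 = require('http2')

// plain-text (h2c) server for illustration; browsers require http2.createSecureServer with key/cert
const server = http2.createServer()

server.on('stream', (stream, headers) => {
  if (headers[':path'] === '/' && stream.pushAllowed) {
    // push a hypothetical extra resource alongside the main response
    stream.pushStream({ ':path': '/extra.json' }, (err, pushStream) => {
      if (err) return console.error(err)
      pushStream.respond({ ':status': 200, 'content-type': 'application/json' })
      pushStream.end(JSON.stringify({ pushed: true }))
    })
  }
  stream.respond({ ':status': 200, 'content-type': 'text/plain' })
  stream.end('main response')
})

server.listen(8088)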

Related

How to upload a file into Firebase Storage from a callable https cloud function

I have been trying to upload a file to Firebase Storage using a callable Firebase Cloud Function.
All I am doing is fetching an image from a URL using axios and trying to upload it to Storage.
The problem I am facing is that I don't know how to save the response from axios and upload it to Storage.
First, how do I save the received file in the temp directory that os.tmpdir() creates?
Then, how do I upload it into Storage?
Here I am receiving the data as an arraybuffer, converting it to a Blob, and trying to upload it.
Here is my code. I think I am missing a major part.
If there is a better way, please recommend it. I've been looking through a lot of documentation and ended up with no clear solution. Please guide me. Thanks in advance.
// (requires implied by the snippet)
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const axios = require('axios');

const bucket = admin.storage().bucket();
const path = require('path');
const os = require('os');
const fs = require('fs');

module.exports = functions.https.onCall((data, context) => {
  try {
    return new Promise((resolve, reject) => {
      const {
        imageFiles,
        companyPIN,
        projectId
      } = data;
      const filename = imageFiles[0].replace(/^.*[\\\/]/, '');
      const filePath = `ProjectPlans/${companyPIN}/${projectId}/images/${filename}`; // path I am trying to upload to in Firebase Storage
      const tempFilePath = path.join(os.tmpdir(), filename);
      const metadata = {
        contentType: 'application/image'
      };
      axios
        .get(imageFiles[0], { // URL for the image
          responseType: 'arraybuffer',
          headers: {
            accept: 'application/image'
          }
        })
        .then(response => {
          console.log(response);
          const blobObj = new Blob([response.data], {
            type: 'application/image'
          });
          return blobObj;
        })
        .then(async blobObj => {
          return bucket.upload(blobObj, {
            destination: tempFilePath // Here I am wrong... how do I set the path of the downloaded blob file?
          });
        }).then(buffer => {
          resolve({ result: 'success' });
        })
        .catch(ex => {
          console.error(ex);
        });
    });
  } catch (error) {
    // unknown: 500 Internal Server Error
    throw new functions.https.HttpsError('unknown', 'Unknown error occurred. Contact the administrator.');
  }
});
I'd take a slightly different approach and avoid using the local filesystem at all, since it's just tmpfs and will cost you memory that your function is already using to hold the buffer/blob; it's simpler to avoid it and write directly from that buffer to GCS using the save method on the GCS File object.
Here's an example. I've simplified out a lot of your setup, and I am using an HTTP function instead of a callable. Likewise, I'm using a public Stack Overflow image and not your original URLs. In any case, you should be able to use this template and modify it back to what you need (e.g. change the prototype, remove the HTTP response, and replace it with the return value you need):
const functions = require('firebase-functions');
const axios = require('axios');
const admin = require('firebase-admin');

admin.initializeApp();

exports.doIt = functions.https.onRequest((request, response) => {
  const bucket = admin.storage().bucket();
  const IMAGE_URL = 'https://cdn.sstatic.net/Sites/stackoverflow/company/img/logos/so/so-logo.svg';
  const MIME_TYPE = 'image/svg+xml';
  return axios.get(IMAGE_URL, { // URL for the image
    responseType: 'arraybuffer',
    headers: {
      accept: MIME_TYPE
    }
  }).then(response => {
    console.log(response); // only to show we got the data, for debugging
    const destinationFile = bucket.file('my-stackoverflow-logo.svg');
    return destinationFile.save(response.data).then(() => { // note: defaults to a resumable upload
      return destinationFile.setMetadata({ contentType: MIME_TYPE });
    });
  }).then(() => { response.send('ok'); })
    .catch((err) => { console.log(err); });
});
As a commenter noted, in the above example the axios request itself makes an external network request, so you will need to be on the Blaze or Flame plan for that. However, that alone doesn't appear to be your current problem.
Likewise, this also defaults to using a resumable upload, which the documentation does not recommend when you are doing a large number of small (<10MB) files, as there is some overhead.
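For reference, here is a hypothetical adaptation of the same template back to a callable function, with resumable uploads disabled as suggested above; the imageUrl and destinationPath inputs are placeholder names, not part of the original question:

// minimal sketch, assuming the same functions/admin/axios setup as above
exports.fetchAndStore = functions.https.onCall((data, context) => {
  const bucket = admin.storage().bucket();
  const MIME_TYPE = 'image/svg+xml';
  return axios.get(data.imageUrl, { // hypothetical input: URL of the image to fetch
    responseType: 'arraybuffer',
    headers: { accept: MIME_TYPE }
  }).then(response => {
    const destinationFile = bucket.file(data.destinationPath); // hypothetical input: target path in the bucket
    // resumable: false avoids the resumable-upload overhead for small files
    return destinationFile.save(response.data, { resumable: false })
      .then(() => destinationFile.setMetadata({ contentType: MIME_TYPE }));
  }).then(() => ({ result: 'success' }));
});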
You asked how this might be used to download multiple files. Here is one approach. First, let's assume you have a function that returns a promise that downloads a single file given its filename (I've abridged this from the above, but it's basically identical except for the change of IMAGE_URL to filename). Note that it does not return a final result such as response.send(), and there's an implicit assumption that all the files share the same MIME_TYPE:
function downloadOneFile(filename) {
  const bucket = admin.storage().bucket();
  const MIME_TYPE = 'image/svg+xml';
  return axios.get(filename, ...)
    .then(response => {
      const destinationFile = ...
    });
}
Then, you just need to iteratively build a promise chain from the list of files. Let's say they are in imageUrls. Once built, return the entire chain:
let finalPromise = Promise.resolve();
imageUrls.forEach((item) => { finalPromise = finalPromise.then(() => downloadOneFile(item)); });
// if needed, add a final .then() section for the actual function result
return finalPromise.catch((err) => { console.log(err) });
Note that you could also build an array of the promises and pass them to Promise.all() -- that would likely be faster as you would get some parallelism, but I wouldn't recommend that unless you are very sure all of the data will fit inside the memory of your function at once. Even with this approach, you need to make sure the downloads can all complete within your function's timeout.
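A minimal sketch of that Promise.all() variant, assuming the same imageUrls list and downloadOneFile() helper as above:

// parallel variant; make sure memory and the function timeout can accommodate all downloads at once
return Promise.all(imageUrls.map((item) => downloadOneFile(item)))
  .then(() => {
    // add the actual function result here if needed
  })
  .catch((err) => { console.log(err); });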

Stream file upload to s3 via express server

I have an Express endpoint where I currently handle uploading of files. Large files were taking lots of memory because I was using bodyParser, which buffers the entire file in memory before calling my handler function.
I removed the bodyParser middleware from this endpoint, and I'm struggling to properly use streams to stream the file upload -> Express -> S3.
Here are the docs on the S3 upload method; it accepts a buffer or a stream:
https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#upload-property
route
router.put('/files/:filename', putHandler({ s3Client: s3Client }))
I tried this, which streams the file to my handler method, but it doesn't seem to be streaming it to the s3.upload method (no surprise, really):
function put ({ s3Client }) {
  return (req, res) => {
    ...
    let whenFileUploaded = new Promise((resolve, reject) => {
      // const { Readable } = require('stream')
      // const inStream = new Readable({
      //   read() {}
      // })
      let data = ''
      req.on('data', function (chunk) {
        req.log.debug('in chunk')
        data += chunk
        // inStream.push(chunk)
      })
      req.on('end', function () {
        req.log.debug('in end')
      })
      s3Client.upload(
        {
          Key: filepath,
          Body: data,
          SSECustomerAlgorithm: 'AES256',
          SSECustomerKey: sseKey.id.split('-').join('')
        },
        {
          partSize: 16 * 1024 * 1024, // 16mb
          queueSize: 1
        },
        (err, data) => err ? reject(err) : resolve(data)
      )
    })
My guess is that I need to create a stream, pipe the req.on('data', ...) chunks into my stream, and then set Body: inStream, which you can see I attempted with the commented-out code, but that didn't seem to work either.
Help?
Turns out the answer is actually very simple. All I had to do was pass the req object.
function put ({ s3Client }) {
  return (req, res) => {
    ...
    let whenFileUploaded = new Promise((resolve, reject) => {
      s3Client.upload(
        {
          Key: filepath,
          Body: req, // <-- NOTE THIS LINE
          SSECustomerAlgorithm: 'AES256',
          SSECustomerKey: sseKey.id.split('-').join('')
        },
        {
          partSize: 16 * 1024 * 1024, // 16mb
          queueSize: 1
        },
        (err, data) => err ? reject(err) : resolve(data)
      )
    })
The way I found this out is that I looked at the Express source code for what a req object is, and I saw that it is an http.IncomingMessage object - https://github.com/expressjs/express/blob/master/lib/request.js#L31
Then I looked at the Node docs and saw that http.IncomingMessage implements the Readable Stream interface:
It implements the Readable Stream interface, as well as the following
additional events, methods, and properties.
https://nodejs.org/docs/latest-v9.x/api/http.html#http_class_http_incomingmessage
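As an aside, the commented-out Readable approach from the question would also have worked, as long as the hand-made stream is ended when the request finishes. A minimal sketch (using the same hypothetical filepath and sseKey as above):

const { Readable } = require('stream')
const inStream = new Readable({ read () {} })

req.on('data', (chunk) => inStream.push(chunk))
req.on('end', () => inStream.push(null)) // signal end-of-stream so the upload can complete

s3Client.upload(
  {
    Key: filepath,
    Body: inStream, // any Readable stream works as Body, just like req itself
    SSECustomerAlgorithm: 'AES256',
    SSECustomerKey: sseKey.id.split('-').join('')
  },
  {
    partSize: 16 * 1024 * 1024, // 16mb
    queueSize: 1
  },
  (err, data) => err ? reject(err) : resolve(data)
)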

Streaming JSON data to React results in unexpected end of JSON input

I'm trying to stream a lot of data from a NodeJS server that fetches the data from Mongo and sends it to React. Since it's quite a lot of data, I've decided to stream it from the server and display it in React as soon as it comes in. Here's a slightly simplified version of what I've got on the server:
const getQuery = async (req, res) => {
  const { body } = req;
  const query = mongoQueries.buildFindQuery(body);
  res.set({ 'Content-Type': 'application/octet-stream' });
  Log.find(query).cursor()
    .on('data', (doc) => {
      console.log(doc);
      const data = JSON.stringify(doc);
      res.write(`${data}\r\n`);
    })
    .on('end', () => {
      console.log('Data retrieved.');
      res.end();
    });
};
Here's the React part:
fetch(url, { // this fetch fires the getQuery function on the backend
  method: "POST",
  body: JSON.stringify(object),
  headers: {
    "Content-Type": "application/json",
  }
})
  .then(response => {
    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    const pump = () =>
      reader.read().then(({ done, value }) => {
        if (done) return this.postEndHandler();
        console.log(value.length); // !!!
        const decoded = decoder.decode(value);
        this.display(decoded);
        return pump();
      });
    return pump();
  })
  .catch(err => {
    console.error(err);
    toast.error(err.message);
  });
}

display(chunk) {
  const { data } = this.state;
  try {
    const parsedChunk = chunk.split('\r\n').slice(0, -1);
    parsedChunk.forEach(e => data.push(JSON.parse(e)));
    return this.setState({ data });
  } catch (err) {
    throw err;
  }
}
It's 50/50 whether it completes with no issues or fails on React's side of things. When it fails, it's always because of an incomplete JSON object in parsedChunk.forEach. I did some digging, and it turns out that every time it fails, the console.log I marked with three exclamation marks shows 65536. I'm 100% certain it has something to do with my streams implementation and that I'm not queuing the chunks correctly, but I'm not sure whether I should be fixing it client- or server-side. Any help would be greatly appreciated.
Instead of implementing your own NDJSON-like streaming JSON protocol, which is basically what you are doing here (with all of the pitfalls of dividing the stream into chunks and packets, which is not always under your control), you can take a look at some of the existing tools that were created to do exactly what you need, e.g.:
http://oboejs.com/
http://ndjson.org/
https://www.npmjs.com/package/stream-json
https://www.npmjs.com/package/JSONStream
https://www.npmjs.com/package/clarinet
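For completeness: the 65536 you are logging is just the reader's buffer size, so a JSON document can be split across two reads. If you keep the hand-rolled protocol, the client has to carry the trailing partial line over to the next chunk. A minimal sketch of display() doing that (same component state as in the question):

display(chunk) {
  const { data } = this.state;
  // keep any trailing partial line around until the next chunk arrives
  const text = (this.leftover || '') + chunk;
  const lines = text.split('\r\n');
  this.leftover = lines.pop(); // either '' or an incomplete JSON document
  lines.filter(line => line.length > 0).forEach(line => data.push(JSON.parse(line)));
  this.setState({ data });
}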

Can't read content of online file in Node.js with XMLHttpRequest on the client [duplicate]

How can I make an HTTP request from within Node.js or Express.js? I need to connect to another service. I am hoping the call is asynchronous and that the callback contains the remote server's response.
Here is a snippet of some code from a sample of mine. It's asynchronous and returns a JSON object. It can do any form of GET request.
Note that there are more optimal ways (it's just a sample) - for example, instead of concatenating the chunks, you could put them into an array and join it, etc. Hopefully, it gets you started in the right direction:
const http = require('http');
const https = require('https');

/**
 * getJSON: RESTful GET request returning JSON object(s)
 * @param options: http options object
 * @param callback: callback to pass the results JSON object(s) back
 */
module.exports.getJSON = (options, onResult) => {
  console.log('rest::getJSON');
  const port = options.port == 443 ? https : http;
  let output = '';

  const req = port.request(options, (res) => {
    console.log(`${options.host} : ${res.statusCode}`);
    res.setEncoding('utf8');

    res.on('data', (chunk) => {
      output += chunk;
    });

    res.on('end', () => {
      let obj = JSON.parse(output);
      onResult(res.statusCode, obj);
    });
  });

  req.on('error', (err) => {
    // res.send('error: ' + err.message);
  });

  req.end();
};
It's called by creating an options object like:
const options = {
  host: 'somesite.com',
  port: 443,
  path: '/some/path',
  method: 'GET',
  headers: {
    'Content-Type': 'application/json'
  }
};
And providing a callback function.
For example, in a service, I require the REST module above and then do this:
rest.getJSON(options, (statusCode, result) => {
  // I could work with the resulting HTML/JSON here. I could also just return it
  console.log(`onResult: (${statusCode})\n\n${JSON.stringify(result)}`);
  res.statusCode = statusCode;
  res.send(result);
});
UPDATE
If you're looking for async/await (linear, no callback), promises, compile time support and intellisense, we created a lightweight HTTP and REST client that fits that bill:
Microsoft typed-rest-client
Try using the simple http.get(options, callback) function in node.js:
var http = require('http');

var options = {
  host: 'www.google.com',
  path: '/index.html'
};

var req = http.get(options, function(res) {
  console.log('STATUS: ' + res.statusCode);
  console.log('HEADERS: ' + JSON.stringify(res.headers));

  // Buffer the body entirely for processing as a whole.
  var bodyChunks = [];
  res.on('data', function(chunk) {
    // You can process streamed parts here...
    bodyChunks.push(chunk);
  }).on('end', function() {
    var body = Buffer.concat(bodyChunks);
    console.log('BODY: ' + body);
    // ...and/or process the entire body here.
  })
});

req.on('error', function(e) {
  console.log('ERROR: ' + e.message);
});
There is also a general http.request(options, callback) function which allows you to specify the request method and other request details.
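For instance, a minimal http.request() sketch for a POST with a JSON body (httpbin.org is used purely as a placeholder endpoint):

var http = require('http');

var payload = JSON.stringify({ hello: 'world' });

var req = http.request({
  host: 'httpbin.org',
  path: '/post',
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Content-Length': Buffer.byteLength(payload)
  }
}, function(res) {
  var chunks = [];
  res.on('data', function(chunk) { chunks.push(chunk); });
  res.on('end', function() {
    console.log('STATUS: ' + res.statusCode);
    console.log('BODY: ' + Buffer.concat(chunks));
  });
});

req.on('error', function(e) { console.log('ERROR: ' + e.message); });
req.write(payload); // unlike http.get(), the body must be written and end() called explicitly
req.end();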
Request and Superagent are pretty good libraries to use.
note: request is deprecated, use at your risk!
Using request:
var request = require('request');

request.get('https://someplace', options, function (err, res, body) {
  if (err) { /* TODO: handle err */ }
  if (res.statusCode === 200) { /* etc. */ }
  // TODO: do something with the response
});
You can also use Requestify, a really cool and very simple HTTP client I wrote for Node.js, and it supports caching.
Just do the following for GET method request:
var requestify = require('requestify');

requestify.get('http://example.com/api/resource')
  .then(function(response) {
    // Get the response body (JSON parsed or jQuery object for XMLs)
    response.getBody();
  });
This version is based on the function initially proposed by bryanmac; it uses promises, has better error handling, and is rewritten in ES6.
let http = require("http"),
    https = require("https");

/**
 * getJSON: REST get request returning JSON object(s)
 * @param options: http options object
 */
exports.getJSON = function (options) {
  console.log('rest::getJSON');
  let reqHandler = +options.port === 443 ? https : http;

  return new Promise((resolve, reject) => {
    let req = reqHandler.request(options, (res) => {
      let output = '';
      console.log('rest::', options.host + ':' + res.statusCode);
      res.setEncoding('utf8');

      res.on('data', function (chunk) {
        output += chunk;
      });

      res.on('end', () => {
        try {
          let obj = JSON.parse(output);
          // console.log('rest::', obj);
          resolve({
            statusCode: res.statusCode,
            data: obj
          });
        } catch (err) {
          console.error('rest::end', err);
          reject(err);
        }
      });
    });

    req.on('error', (err) => {
      console.error('rest::request', err);
      reject(err);
    });

    req.end();
  });
};
As a result, you don't have to pass in a callback function; instead, getJSON() returns a promise. In the following example the function is used inside of an Express route handler:
router.get('/:id', (req, res, next) => {
  rest.getJSON({
    host: host,
    path: `/posts/${req.params.id}`,
    method: 'GET'
  }).then(({ statusCode, data }) => {
    res.json(data);
  }, (error) => {
    next(error);
  });
});
On error it delegates the error to the server error handling middleware.
Unirest is the best library I've come across for making HTTP requests from Node. It aims to be a multiplatform framework, so learning how it works on Node will serve you well if you need to use an HTTP client on Ruby, PHP, Java, Python, Objective-C, .NET or Windows 8 as well. As far as I can tell, the unirest libraries are mostly backed by existing HTTP clients (e.g. on Java, the Apache HTTP client; on Node, Mikeal's Request library) - Unirest just puts a nicer API on top.
Here are a couple of code examples for Node.js:
var unirest = require('unirest')

// GET a resource
unirest.get('http://httpbin.org/get')
  .query({'foo': 'bar'})
  .query({'stack': 'overflow'})
  .end(function(res) {
    if (res.error) {
      console.log('GET error', res.error)
    } else {
      console.log('GET response', res.body)
    }
  })

// POST a form with an attached file
unirest.post('http://httpbin.org/post')
  .field('foo', 'bar')
  .field('stack', 'overflow')
  .attach('myfile', 'examples.js')
  .end(function(res) {
    if (res.error) {
      console.log('POST error', res.error)
    } else {
      console.log('POST response', res.body)
    }
  })
You can jump straight to the Node docs here
Check out shred. It's a node HTTP client created and maintained by spire.io that handles redirects, sessions, and JSON responses. It's great for interacting with rest APIs. See this blog post for more details.
Check out httpreq: it's a node library I created because I was frustrated there was no simple http GET or POST module out there ;-)
For anyone who is looking for a library to send HTTP requests in Node.js, axios is also a good choice. It supports Promises :)
Install (npm): npm install axios
Example GET request:
const axios = require('axios');

axios.get('https://google.com')
  .then(function (response) {
    // handle success
    console.log(response);
  })
  .catch(function (error) {
    // handle error
    console.log(error);
  });
Github page
Update 10/02/2022
Node.js integrates fetch in v17.5.0 in experimental mode. Now, you can use fetch to send requests just as you do on the client side. For now, it is an experimental feature, so be careful.
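A minimal sketch with the built-in fetch (on Node 17.x it sits behind the --experimental-fetch flag; it is enabled by default in later releases; the URL is just a placeholder):

// run with: node --experimental-fetch example.js (the flag is not needed on newer Node versions)
fetch('https://jsonplaceholder.typicode.com/todos/1')
  .then((res) => {
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    return res.json();
  })
  .then((data) => console.log(data))
  .catch((err) => console.error(err));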
If you just need to make simple GET requests and don't need support for any other HTTP methods, take a look at simple-get:
var get = require('simple-get');

get('http://example.com', function (err, res) {
  if (err) throw err;
  console.log(res.statusCode); // 200
  res.pipe(process.stdout); // `res` is a stream
});
Use reqclient: unlike request and many other libraries, it is not designed for scripting purposes. Reqclient lets you specify in the constructor many configurations that are useful when you need to reuse the same configuration again and again: base URL, headers, auth options, logging options, caching, etc. It also has useful features like query and URL parsing, automatic query encoding, JSON parsing, etc.
The best way to use the library is to create a module that exports the object pointing to the API, along with the necessary configuration to connect with it:
Module client.js:
let RequestClient = require("reqclient").RequestClient

let client = new RequestClient({
  baseUrl: "https://myapp.com/api/v1",
  cache: true,
  auth: {user: "admin", pass: "secret"}
})

module.exports = client
And in the controllers where you need to consume the API, use it like this:
let client = require('client')
//let router = ...

router.get('/dashboard', (req, res) => {
  // Simple GET with Promise handling to https://myapp.com/api/v1/reports/clients
  client.get("reports/clients")
    .then(response => {
      console.log("Report for client", response.userId) // REST responses are parsed as JSON objects
      res.render('clients/dashboard', {title: 'Customer Report', report: response})
    })
    .catch(err => {
      console.error("Ups!", err)
      res.status(400).render('error', {error: err})
    })
})

router.get('/orders', (req, res, next) => {
  // GET with query (https://myapp.com/api/v1/orders?state=open&limit=10)
  client.get({"uri": "orders", "query": {"state": "open", "limit": 10}})
    .then(orders => {
      res.render('clients/orders', {title: 'Customer Orders', orders: orders})
    })
    .catch(err => someErrorHandler(req, res, next))
})

router.delete('/orders', (req, res, next) => {
  // DELETE with params (https://myapp.com/api/v1/orders/1234/A987)
  client.delete({
    "uri": "orders/{client}/{id}",
    "params": {"client": "A987", "id": 1234}
  })
    .then(resp => res.status(204))
    .catch(err => someErrorHandler(req, res, next))
})
reqclient supports many features, but it has some that are not supported by other
libraries: OAuth2 integration and logger integration
with cURL syntax, and always returns native Promise objects.
If you ever need to send a GET request to an IP address as well as a domain (other answers did not mention that you can specify a port variable), you can make use of this function:
function getCode(host, port, path, queryString) {
  console.log("(" + host + ":" + port + path + ")" + "Running httpHelper.getCode()")

  // Construct url and query string
  const requestUrl = url.parse(url.format({
    protocol: 'http',
    hostname: host,
    pathname: path,
    port: port,
    query: queryString
  }));

  console.log("(" + host + path + ")" + "Sending GET request")

  // Send request
  console.log(url.format(requestUrl))
  http.get(url.format(requestUrl), (resp) => {
    let data = '';

    // A chunk of data has been received.
    resp.on('data', (chunk) => {
      console.log("GET chunk: " + chunk);
      data += chunk;
    });

    // The whole response has been received. Print out the result.
    resp.on('end', () => {
      console.log("GET end of response: " + data);
    });
  }).on("error", (err) => {
    console.log("GET Error: " + err);
  });
}
Don't forget to require the modules at the top of your file:
const http = require("http");
const url = require('url');
Also bear in mind that you may use the https module for communicating over a secured network. In that case these two lines would change:
const https = require("https");
...
https.get(url.format(requestUrl), (resp) => { ......
You can use the request module and a Promise in Express to make any request:
const promise = require('promise'); // not strictly needed here; the native Promise is used below
const requestModule = require('request');

const curlRequest = (requestOption) => {
  return new Promise((resolve, reject) => {
    requestModule(requestOption, (error, response, body) => {
      try {
        if (error) {
          throw error;
        }
        if (body) {
          try {
            body = (body) ? JSON.parse(body) : body;
            resolve(body);
          } catch (error) {
            resolve(body);
          }
        } else {
          throw new Error('something wrong');
        }
      } catch (error) {
        reject(error);
      }
    })
  })
};

const option = {
  url: uri,
  method: "GET",
  headers: {
  }
};

curlRequest(option).then((data) => {
}).catch((err) => {
})

How do I pass a React DOM state's information to the backend (NodeJS) when a button is invoked?

I'm using this logic on the frontend, but I'm having some trouble actually receiving that data on the backend. I'm using the Sails.js framework. Any suggestions?
handleSubmit = () => {
  // Gathering together the data you want to send to the API
  const payload = {
    subject: this.state.subject,
    message: this.state.message,
  };
  this.handleAjaxRequest(payload);
};

// Method to send data to the backend
// Making the request - I'm using axios here.
handleAjaxRequest = (payload) => {
  let request = axios({
    method: 'post',
    url: '/api/',
    data: payload,
    headers: { 'Content-Type': 'application/json' } // axios expects headers as an object, not a string
  });

  // Do stuff with the response from your backend.
  request.then(response => {
    console.debug(response.data);
  })
    .catch(error => {
      console.error(error);
    })
};
I used to do this using Express and didn't have these problems.
Any help, method, or suggestion is more than welcome :)
Please forgive my ignorance, I'm just here to learn.
Okay, so the first thing I had to do was generate a new RESTful API using the command sails generate api data. In the package.json file I set up a proxy that points to the backend's endpoint, like this: "proxy": "http://localhost:1337". You don't strictly need to do this, but if you don't, you have to include that URL part in every request; because it doesn't change, it's pretty convenient to set it up.
On the frontend, I made a function sendData() that takes the necessary data from my previous component (depending on what the user selected) and sends that data to the backend using axios:
sendData = () => {
  const { relYear } = this.props.history.location.state.dev;
  const { relMonth } = this.props.history.location.state.dev;
  const selectedMonth = moment().month(relMonth).format("MM");
  const finalSelect = parseInt(relYear + selectedMonth, 10);

  axios.post('/data', { 'selectedDate': finalSelect })
    .then(res => console.log('Data sent'))
    .catch(err => console.error(err));
}
On the backend I fetched the data, did the calculations, and sent the result back to the frontend of my app:
getApiData = () => {
  // return the promise so callers can wait for the data (returning an array directly would yield it while still empty)
  return axios.get('/data')
    .then(res => {
      const first = Object.values(res.data.pop()).shift(); // Getting the relevant 'selectedDate'
      return [first];
    })
    .catch(err => console.error(err));
}
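For reference, here is a hypothetical sketch of what the receiving side could look like as a classic Sails controller action; the controller and action names are placeholders, and if you rely on the blueprint routes that sails generate api data creates, you may not need a custom action at all. A matching custom route in config/routes.js would look something like 'POST /data': 'DataController.receive'.

// api/controllers/DataController.js (hypothetical custom action)
module.exports = {
  receive: function (req, res) {
    const selectedDate = req.body.selectedDate; // posted by sendData() on the frontend
    if (!selectedDate) {
      return res.badRequest('selectedDate is required');
    }
    // ...do the calculations here...
    return res.json({ selectedDate: selectedDate });
  }
};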
