I am trying to use the Actions SDK on my own server. The actions I made show up in Google Assistant, but they don't work: the Assistant just closes without showing any errors. This is my code:
'use strict';
const express = require('express');
const bodyParser = require('body-parser');
var exps = express();
exps.use(bodyParser.json());
const {actionssdk} = require('actions-on-google');
const app = actionssdk({debug: true});
const asyncTask = () => new Promise(
resolve => setTimeout(resolve, 1000)
);
exps.post('/', function(request, response) {
app.intent('actions.intent.MAIN', (conv) => {
return asyncTask()
.then(() => conv.ask('Hi, this is a test!'));
});
});
express().use(bodyParser.json(), app).listen(3000);
In the Request and Debug tabs, both Errors and Response are empty.
I think the issue is that you are creating two different express objects. One gets mounted on the '/' path but isn't set up to listen on any port. The other listens on a port but doesn't have any paths set up for it to handle.
Changing your listener line to
exps.use(bodyParser.json(), app).listen(3000);
will make it so the express object where you've setup the '/' path will also be the one listening on the port.
It also appears that your webhook is listening at the '/' path, but you've specified the webhook in your actions.json file as using the '/node/' path. (It is a little difficult to read the screenshot, which is why we ask that you post the text rather than a screenshot.) If you either change your webhook to listen on '/node/' or change the actions.json file to use '/', it should work.
Looking at the documentation (https://developers.google.com/actions/assistant/responses) suggests that you are attempting to call conv.ask() incorrectly. I would imagine you'd need something like this:
conv.ask(new SimpleResponse({speech: 'Hi, this is a test!', text: 'Hi, this is a test!'}));
Related
I am trying to fix an issue reported by the Checkmarx scanning tool. I tried sanitizing both err and req in the route module below, but it still complains about the same error.
index.js
const express = require('express')
const router = express.Router()
const fs = require('fs')
const config = require('config')
var createDOMPurify = require('dompurify');
var _require = require('jsdom'),
JSDOM = _require.JSDOM;
var window = new JSDOM('').window;
var DOMPurify = createDOMPurify(window);
function sanitizeError(value){
return DOMPurify.sanitize(value);
}
function sanitizeObject(obj) {
var sanitizedObject = {};
Object.keys(obj).forEach(function (key) {
sanitizedObject[key] = sanitizeError(obj[key]);
});
return sanitizedObject;
}
//error handler route
router.use('/error',(err, req, res, next) => {
//sanitizeObject(req)
req.logger.error('uncaught error page', sanitizeError(err))
res.redirect('/another-error-page')
})
module.exports = router
Checkmarx Error:
Reflected_XSS error. It is referring to the line req.logger.error in the above module
The application's router.use embeds untrusted data in the generated output with error, at line x of \routes\index.js. This untrusted data is embedded straight into the output without proper sanitization or encoding, enabling an attacker to inject malicious code into the output.
The attacker would be able to alter the returned web page by simply providing modified data in the user input error, which is read by the router.use method at line x of \routes\index.js. This input then flows through the code straight to the output web page, without sanitization.
This can enable a Reflected Cross-Site Scripting (XSS) attack.
Checkmarx does not have DOMPurify in its list of recognized sanitizers. What it does recognize are the ESAPI library and the xss-filters and htmlescape packages:
https://www.npmjs.com/package/xss-filters
https://www.npmjs.com/package/node-esapi
https://www.npmjs.com/package/htmlescape
While technically your code can prevent XSS, I would rewrite it using any of the packages above. For instance, using xss-filters:
var xssFilters = require('xss-filters');
function sanitizeError(value){
return xssFilters.inHTMLData(value);
}
I want to return JavaScript from an Express server, not an HTML document.
I tried
const express = require('express');
const app = express();
app.get('/', (req, res) => {
res.type('.js');
res.send("var danny = new Human();");
});
app.listen(5000, () => {
console.log('Example app listening at http://localhost:5000');
});
but I just get <pre>var danny = new Human();</pre> as an HTML response :/
Thank you for your help :)
You're misinterpreting the result. You are sending JS to the browser. The browser just isn't doing what you expect with it.
If you type the URL of some JS into the address bar of the browser, then it will generate an HTML document to display that script's source code.
Browsers will only execute JS if it is loaded with a <script> element in an HTML document (or via a browser extension).
Edit:
You can use res.sendFile(__dirname + "/path/to/file.js")
I decided to investigate the best possible way to handle a large amount of traffic with a NodeJS server. I did a small test on 2 DigitalOcean servers, each with 1GB RAM / 2 CPUs.
No-Cluster server code:
// Include Express
var express = require('express');
// Create a new Express application
var app = express();
// Add a basic route – index page
app.get('/', function (req, res) {
res.redirect('http://www.google.co.il');
});
// Bind to a port
app.listen(3000);
console.log('Application running');
Cluster server code:
// Include the cluster module
var cluster = require('cluster');
// Code to run if we're in the master process
if (cluster.isMaster) {
// Count the machine's CPUs
var cpuCount = require('os').cpus().length;
// Create a worker for each CPU
for (var i = 0; i < cpuCount; i += 1) {
cluster.fork();
}
// Code to run if we're in a worker process
} else {
// Include Express
var express = require('express');
// Create a new Express application
var app = express();
// Add a basic route – index page
app.get('/', function (req, res) {
res.redirect('http://www.walla.co.il');
});
// Bind to a port
app.listen(3001);
console.log('Application running #' + cluster.worker.id);
}
And I sent stress-test requests to those servers. I expected the cluster server to handle more requests, but it didn't happen: both servers crashed at the same load, even though 2 node processes were running on the cluster server and only 1 on the non-cluster server.
Now I wonder why? Did I do anything wrong?
Maybe something else is making the servers reach their breaking point? Both servers crashed at ~800 rps.
Now I wonder why? Did I do anything wrong?
Your test server doesn't do anything other than a res.redirect(). If your request handlers use essentially no CPU, then you aren't going to be CPU bound at all and you won't benefit from involving more CPUs. Your cluster will be bottlenecked at the handling of incoming connections which is going to be roughly the same with or without clustering.
Now, add some significant CPU usage to your request handler and you should get a different result.
For example, change to this:
// Add a basic route – index page
app.get('/', function (req, res) {
// spin CPU for 200ms to simulate using some CPU in the request handler
let start = Date.now();
while (Date.now() - start < 200) {}
res.redirect('http://www.walla.co.il');
});
Running tests is a great thing, but you have to be careful what exactly you're testing.
What #jfriend00 says is correct; you aren't actually doing enough heavy lifting to justify this. However, you're also not actually sharing the load. See here:
app.listen(3001);
You can't bind two services onto the same port and have the OS magically load-balance them[1]; try adding an error handler on app.listen() and see if you get an error, e.g.
app.listen(3001, (err) => { if (err) console.error(err); });
If you want to do this, you'll have to accept everything in your master, then instruct the workers to do the task, then pass the results back to the master again.
It's generally easier not to do this in your Node program though; your frontend will still be the limiting factor. An easier (and faster) way may be to put a special purpose load-balancer in front of multiple running instances of your application (i.e. HAProxy or Nginx).
[1]: That's actually a lie; sorry. You can do this by specifying SO_REUSEPORT when doing the initial bind call, but you can't explicitly specify that in Node, and Node doesn't specify it for you...so you can't in Node.
I'm trying to create a webapp for a web art class using node (w/ npm) and express. The idea is to have the body of the site be all one color, but anyone can text the site a hexcode/CSS color at a Twilio number and the color of the site will instantly change to that color value.
Essentially how it works is: the server receives a POST request from Twilio at http://example.com/message, which contains the body of the text message. It writes the value to a temporary file at ~/app/.data/color.tmp, which is accessed by the client with a jQuery .get() call to http://example.com/color, which returns the saved color.
So here's the problem: I got a version of the app working on glitch.me, so I know that this code can work, but I'm having a lot of trouble getting it to work on my domain. I installed the app and can start it with npm, and it successfully shows me the HTML page, but the Chrome devtools show the script is receiving a 403 when it tries to access /color. Also, new texts to my site aren't changing the color value in /.data/color.tmp. I thought it might be a permissions issue but I checked them and they seem fine.
Here's the server file and the script on the index.html page:
app/server.js
var express = require('express');
var bodyParser = require('body-parser');
var fs = require('fs');
var app = express();
app.use(bodyParser.urlencoded({extended: false}));
var dataPath = '.data/color.tmp';
// set a new color (saves posted color to disk)
app.post("/message", function (request, response) {
var dataStr = JSON.stringify(request.body.Body);
fs.writeFile(dataPath, dataStr);
response.end();
});
// get the saved color (reading from disk)
app.get("/color", function (request, response) {
var dataStr = fs.readFileSync(dataPath).toString();
response.send(JSON.parse(dataStr));
});
app.get("/", function (request, response) {
response.sendFile(__dirname + '/views/index.html');
});
var listener = app.listen(process.env.PORT, function () {
console.log('listening on port ' + listener.address().port);
});
app/views/index.html
<script>
// checks server for color value and sets background
function checkForColorChange() {
$.get('/color', function getColorComplete(data) {
document.body.style.backgroundColor = data;
console.log(data);
})
}
// Poll the server at 2000ms interval
setInterval(checkForColorChange, 2000);
checkForColorChange();
</script>
Anyway, I feel like I must be missing something really obvious if it worked so easily on Glitch and won't on my website, but I've been stuck for a few days and am not making any progress! Any help would be so appreciated. Let me know if anything's unclear too.
(See update below for a working example)
Original answer
There are a few problems with your code:
you're not checking for errors
you're using blocking functions
you're implicitly relying on file permissions but you're not checking it
you're using string concatenation instead of path.join to join paths
you're constantly polling for new data instead of waiting for it to change
you're not catching exceptions of functions that can raise exception
you're not waiting for async operations to finish and you don't handle errors
The main problem you're experiencing right now is most likely file permissions. The good news is that you don't need any file access for what you're doing, and using files for it isn't optimal anyway. All you need is to store the color in a variable, if you don't need it to persist between server restarts; and even if you do, I would use a simple database for that.
For example:
// some initial value:
var color = '#ffffff';
app.post("/message", function (request, response) {
// assign to the outer variable - redeclaring it with `var` would shadow it
color = request.body.Body;
response.end();
});
// get the saved color (now just reading the variable)
app.get("/color", function (request, response) {
response.send(color);
});
app.get("/", function (request, response) {
response.sendFile(__dirname + '/views/index.html');
});
var listener = app.listen(process.env.PORT, function () {
console.log('listening on port ' + listener.address().port);
});
This is the first change that I would use - don't rely on the file system, permissions, race conditions etc.
Another problem with your code was using blocking functions inside request handlers. You should never use any blocking function (those with "Sync" in their name) outside the first tick of the event loop, i.e. after startup.
Another improvement that I would make would be using WebSocket or Socket.io instead of polling for data on regular intervals. This would be quite easy to code. See this answer for examples:
Differences between socket.io and websockets
A plus of doing that would be that all of your students would get the color changed instantly and at the same time instead of in random moments spanning 2 seconds.
Update
I wrote an example of what I was describing above.
The POST endpoint is slightly different - it uses /color route and color=#abcdef instead of /message and Body=... but you can easily change it if you want - see below.
Server code - server.js:
// requires, restored for completeness:
const path = require('path');
const http = require('http');
const express = require('express');
const socket = require('socket.io');
const bodyParser = require('body-parser');
const app = express();
const server = http.Server(app);
const io = socket(server);
let color = '#ffffff';
app.use(bodyParser.urlencoded({ extended: false }));
app.use('/', express.static(path.join(__dirname, 'html')));
io.on('connection', (s) => {
console.log('Socket.io client connected');
s.emit('color', color);
});
app.post('/color', (req, res) => {
color = req.body.color;
console.log('Changing color to', color);
io.emit('color', color);
res.send({ color });
});
server.listen(3338, () => console.log('Listening on 3338'));
HTML page - index.html:
<!doctype html>
<html lang=en>
<head>
<meta charset=utf-8>
<meta name=viewport content="width=device-width, initial-scale=1">
<title>Node Live Color</title>
<link href="/style.css" rel=stylesheet>
</head>
<body>
<h1>Node Live Color</h1>
<script src="/socket.io/socket.io.js"></script>
<script src="/script.js"></script>
</body>
</html>
Style sheet - style.css:
body {
transition: background-color 2s ease;
background-color: #fff;
}
Client-side JavaScript - script.js:
var s = io();
s.on('color', function (color) {
document.body.style.backgroundColor = color;
});
What is particularly interesting is how simple the client-side code is.
For your original endpoint use this in server.js:
app.post('/message', (req, res) => {
color = req.body.Body;
console.log('Changing color to', color);
io.emit('color', color);
res.end();
});
Full example is available on GitHub:
https://github.com/rsp/node-live-color
I tested it locally and on Heroku. The repository also includes a deploy-to-Heroku button so you can test it yourself.
Enjoy.
I think the problem is in var dataStr = fs.readFileSync(dataPath).toString();. Please change your dataPath as follows:
var dataPath = __dirname + '/.data/color.tmp';
And also make sure that the file has read/write permission for the process running the app.
I have two node apps/services running together:
1. main app
2. second app
The main app is responsible for showing all the data from the different apps in the end. For now I put some code from the second app into the main app and it works, but I want it decoupled, i.e. the code of the second app would not live in the main app but be injected at runtime somehow,
as if the second service registered itself with the main app and its code were injected.
The code is just the two modules below. Is it possible to do this in nodejs?
const Socket = require('socket.io-client');
const client = require("./config.json");
module.exports = (serviceRegistry, wsSocket) =>{
var ws = null;
var consumer = () => {
var registration = serviceRegistry.get("tweets");
console.log("Service: " + registration);
//Check if service is online
if (registration === null) {
if (ws != null) {
ws.close();
ws = null;
console.log("Closed websocket");
}
return
}
var clientName = `ws://localhost:${registration.port}/`
if (client.hosted) {
clientName = `ws://${client.client}/`;
}
//Create a websocket to communicate with the client
if (ws == null) {
console.log("Created");
ws = Socket(clientName, {
reconnect: false
});
ws.on('connect', () => {
console.log("second service is connected");
});
ws.on('tweet', function (data) {
wsSocket.emit('tweet', data);
});
ws.on('disconnect', () => {
console.log("Disconnected from blog-twitter")
});
ws.on('error', (err) => {
console.log("Error connecting socket: " + err);
});
}
}
//Check service availability
setInterval(consumer, 20 * 1000);
}
In the main module I put this code, and I want to decouple it by injecting it somehow at runtime. An example would be very helpful...
You will have to use the vm module to achieve this. More technical info here: https://nodejs.org/api/vm.html. Let me explain how you can use it:
You can use the vm.Script API to compile js code which you want to run later. See the description from the official documentation:
Creating a new vm.Script object compiles code but does not run it. The
compiled vm.Script can be run later multiple times. It is important to
note that the code is not bound to any global object; rather, it is
bound before each run, just for that run.
Now when you want to insert or run this code, you can use script.runInContext API.
Another good example from their official documentation:
'use strict';
const vm = require('vm');
let code =
`(function(require) {
const http = require('http');
http.createServer( (request, response) => {
response.writeHead(200, {'Content-Type': 'text/plain'});
response.end('Hello World\\n');
}).listen(8124);
console.log('Server running at http://127.0.0.1:8124/');
})`;
vm.runInThisContext(code)(require);
Another example, using a js file directly:
var fs = require('fs');
var app = fs.readFileSync(__dirname + '/' + 'app.js', 'utf8');
vm.runInThisContext(app);
You can use this approach for the conditional code which you want to insert.
You can create a package from one of your apps and then reference the package in the other app.
https://docs.npmjs.com/getting-started/creating-node-modules
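As a minimal sketch of what that could look like with the consumer module from the question (the package name "tweets-consumer" and the stubs below are made up): the second app's package exports a factory, and after installing it the only coupling left in the main app is a require and a call:

```javascript
// In the second app's package (hypothetically published as "tweets-consumer"),
// index.js would export the factory, just like the module in the question:
//   module.exports = (serviceRegistry, wsSocket) => { ... };
const createTweetsConsumer = (serviceRegistry, wsSocket) => {
  // Poll the registry and forward tweets to the main app's socket
  return setInterval(() => {
    const registration = serviceRegistry.get('tweets');
    if (registration) wsSocket.emit('tweet', { port: registration.port });
  }, 20 * 1000);
};

// In the main app, after `npm install tweets-consumer`, you would write:
//   const createTweetsConsumer = require('tweets-consumer');
const timer = createTweetsConsumer(
  { get: () => ({ port: 3002 }) },              // stub service registry
  { emit: (event, data) => console.log(event) } // stub websocket
);
clearInterval(timer); // stop the sketch's polling timer
```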
There are several ways to decouple two applications. One easy way is with the pub/sub pattern (in case you don't need a response).
(Now, if you have an application that is very coupled, it will be very difficult to decouple unless you do some refactoring.)
zeromq offers a very good implementation of pub/sub and is very fast.
e.g.
import zmq from "zmq";
// subscriber side
const socket = zmq.socket('sub');
socket.connect('tcp://127.0.0.1:5545');
socket.subscribe('sendConfirmation');
socket.on('message', function (topic, message) {
// you can get the data from message.
// something like:
const msg = message.toString('ascii');
const data = JSON.parse(msg);
// do some actions.
// .....
});
//don't forget to close the socket.
process.on('SIGINT', () => {
debug("... closing the socket ....");
socket.close();
process.exit();
});
//-----------------------------------------
import zmq from "zmq";
// publisher side
const socket = zmq.socket('pub');
socket.bind('tcp://127.0.0.1:5545');
socket.send(['sendConfirmation', someData]);
process.on('SIGINT', function() {
socket.close();
});
This way you could have two different containers (docker) for your modules, just be sure to open the corresponding port.
What I don't understand is why you inject wsSocket and then also create a new Socket. Probably what I would do is just send the
socket id, and then use it like:
const socketId = "/#" + data.socketId;
io.sockets.connected[socketId].send("some message");
You could also use another solution like kafka instead of zmq; just consider that it is slower, but it will keep the logs.
Hope this can get you an idea of how to solve your problem.
You can use npm link feature.
The linking process consists of two steps:
Declaring a module as a global link by running npm link in the module's root folder
Installing the linked module in your target module (app) by running npm link <module-name> in the target folder
This works pretty well unless one of your local modules depends on another local module. In this case, linking fails because it cannot find the dependent module. In order to solve this issue, one needs to link the dependent module to the parent module and then install the parent into the app.
https://docs.npmjs.com/cli/link