Hello all, I want to send hex commands to my device using Node-RED. I'm new to Node-RED and JavaScript; what is the proper way to send this command? I want to write it in a function node.
The device is connected via a serial port over an RS232 interface.
The things that I have tried:
1-
var buf = Buffer.from([0xaa,0x3f,0x00,0x00,0xf0,0x11]).toString;
msg.payload = buf;
return msg;
2-
var buf = Buffer.from([0xaa,0x3f,0x00,0x00,0xf0,0x11]);
msg.payload = buf;
return msg;
3-
var buf = Buffer.from(['aa','3f','00','00','f0','11']);
msg.payload = buf;
return msg;
And some other approaches, such as not using a function node at all. In most cases I get no response. I'm confident in the physical connections, since different software (UART Assistant), over the same port with the same settings, gets a response from the device for the same command.
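For reference, attempt 2 (a plain Buffer in msg.payload) is the usual way to hand binary data to the serial out node, so if the wiring is good, the port settings in the serial config node (baud rate, data bits, parity, stop bits) are the usual suspects. A variant of attempt 2 that also logs the outgoing bytes (node.warn is the standard function-node logger) can help confirm what is actually written:

// Same as attempt 2, but log the bytes so the debug sidebar shows
// exactly what is handed to the serial out node.
var buf = Buffer.from([0xaa, 0x3f, 0x00, 0x00, 0xf0, 0x11]);
node.warn("sending: " + buf.toString("hex")); // "aa3f0000f011"
msg.payload = buf;
return msg;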
I've been playing around with the MPR121 Shield Capacitive Touch Sensor by Adafruit. In the Arduino IDE, there is example code you can simply download and run, and it works perfectly: when I touch pin 11, for example, it prints "11 touched", and when I release it, it prints "11 released". Great!
Now the problem comes when I try to get that data into NW.js. Using Chrome's serial API in NW.js, I can connect to the port my Arduino is on and try to read whatever data the Arduino is sending. However, as I try to read the data, the only thing I receive is an ArrayBuffer filled with zero bytes. I am really not sure what is happening here, as the device works perfectly when run from the Arduino IDE, but it returns basically nothing with chrome.serial.
Does anyone have a tip or an idea of what's going on here? If I do a console.log(info.data), I only get an ArrayBuffer of zero bytes.
Thanks
Here is my code:
const ab2str = require('arraybuffer-to-string');

nw.Window.get().showDevTools();

let buffer = "";

chrome.serial.getDevices(devices => {
    devices.forEach(device => console.log(device));
});

// let port = "COM3";
let port = "/dev/cu.usbmodem142401";

chrome.serial.connect(port, {bitrate: 1000000}, () => {
    console.log("Serialport connected:" + port);
    chrome.serial.onReceive.addListener(onDataReceived);
});

function onDataReceived(info) {
    let lines = ab2str(info.data).split("\n");
    lines[0] = buffer + lines[0];
    buffer = lines.pop();
    lines.forEach(line => {
        const [type, value] = line.split("=");
        console.log(type, value);
    });
}
The Tx and Rx baud rates have to be the same to decode the information properly; the Arduino IDE handles that for you in the first case, but you need to handle it manually in the second. In serial port communication a single bit is transferred at a time, unlike parallel ports where all bits are available for reading at once, so the rate at which the information is transmitted (Tx) must match the rate at which it is received (Rx); otherwise bits are lost and you get wrong information. The Arduino IDE handles most of these issues for you; if I'm not wrong, the IDE lets you change the baud rate, but the default is 9600.
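In practice that means the bitrate passed to chrome.serial.connect has to match the rate in the sketch's Serial.begin() call. A minimal sketch of the fix, assuming the Arduino sketch uses the default 9600 baud:

// Match the sketch's Serial.begin(9600); the 1000000 in the original code
// cannot decode a 9600-baud stream.
chrome.serial.connect(port, {bitrate: 9600}, () => {
    console.log("Serialport connected:" + port);
    chrome.serial.onReceive.addListener(onDataReceived);
});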
I'm creating a test network where a Node.js application communicates with a C++ application via ZeroMQ, using Google Protocol Buffers for the message structure.
Data from C++ to JavaScript can be serialized, sent, received, and deserialized without issue. However, I'm having an issue moving in the opposite direction.
If the Protobuf message going from JavaScript to C++ contains an integer set to a value over 127, the data is mangled somewhere along the way and many bytes of the payload change.
Note that every message sent in both directions has a 4-byte header that (seemingly) works correctly, so I won't include that in the discussion for now.
I've noticed that on the JS side before sending, the bytes for the protobuf serialized data will look something like this:
26,4,129,192,136,24,64,40 stored in type Buffer (npm).
However, on the C++ side after receiving, I'm getting:
26,4,239,191,189,239,191,189 stored in a std::string container.
I have verified that the bytes in the JS Buffer do not change up to the point where the ZeroMQ send is called. After that, I can't tell what happens or when the values change.
I've tried forcing every use of Buffer to Uint8Array, with no luck. I haven't tried much else because I'm not sure what the issue is.
This is the basic flow of the message creation on the JavaScript side.
var data = new ProtoData.Data1();
data.setTemp(128); // type is int32
var payload = data.serializeBinary();
var size = payload.length + 4; // 16 bits
var head1 = 4; // 8 bits
var head2 = 4; // 8 bits
var payload_buf = Buffer.from(payload);
// create the header
var header = Buffer.allocUnsafe(4);
header.writeUInt16LE(size, 0);
header.writeUInt8(head1, 2);
header.writeUInt8(head2, 3);
var msg = Buffer.concat([header, payload_buf]);
zmqPubSock.sock.send(msg);
On the C++ side, a completely different message is received. Keep in mind that if data.setTemp() is given a value of 127 or less, the message is received on the C++ side without issue.
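For what it's worth, 239, 191, 189 is exactly the UTF-8 encoding of U+FFFD, the replacement character, which usually means raw bytes were pushed through a UTF-8 string conversion somewhere between send and receive. One way to narrow down where that happens is a JS subscriber that logs the raw frame bytes; a minimal sketch assuming the zeromq npm package's classic callback API and a hypothetical endpoint:

// Subscribe on the publisher's endpoint and dump the raw frame bytes.
const zmq = require('zeromq');
const sub = zmq.socket('sub');

sub.connect('tcp://127.0.0.1:5555'); // hypothetical endpoint
sub.subscribe('');

sub.on('message', (frame) => {
    // frame is a Buffer; if these bytes already differ from what was sent,
    // the corruption happens before the C++ side is involved.
    console.log(Array.from(frame));
});

If the subscriber sees the original bytes (... 129, 192, 136 ...), the UTF-8 conversion is happening on the C++ receive path instead.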
I was just introduced to Node-RED after asking around for suggestions on an IoT setup. I have a piece of JavaScript code that sends hex-encoded data to a WebSocket.
I am trying to replicate this using Node-RED, and I am having some trouble figuring out which node to use for sending the data.
Vanilla JavaScript:
function connectToSocket() {
    // Try to connect to the socket
    try {
        // Create our socket connection
        connection = new WebSocket('ws://' + gatewayIP + ':8000');
        connection.binaryType = "arraybuffer";
    } catch (e) {
        // Failed to create the socket connection: log an error message
        logMessage('Failed to connect to socket');
        return;
    }
}

connection.send('\x02\x00\x01\x04\x26\x2D');
I have tried sending this as a string and as a JSON object in msg.payload, but it does not trigger the device the way the plain JS function does when I run it in a browser.
What would be an appropriate format to send this hex string in?
What you want to send is a buffer, and the inject node can't generate a buffer at this point. The easiest way to do this is to insert a function node between the inject node and the WebSocket out node.
The function node should contain something like:
msg.payload = Buffer.from("\x02\x00\x01\x04\x26\x2D");
return msg;
This will swap the payload for a buffer with the right values.
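The same buffer can also be built from an array of byte values, which produces identical bytes and avoids escape-string pitfalls:

// Equivalent to Buffer.from("\x02\x00\x01\x04\x26\x2D").
msg.payload = Buffer.from([0x02, 0x00, 0x01, 0x04, 0x26, 0x2D]);
return msg;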
EDIT:
For Node.js 0.10.x you should use something like the following, as Buffer.from() was only introduced in Node.js 4.x:
msg.payload = new Buffer("\x02\x00\x01\x04\x26\x2D");
return msg;
I am trying to emulate Chrome's native messaging feature using Firefox's Add-on SDK. Specifically, I'm using the child_process module along with the emit method to communicate with a Python child process.
I am able to send messages to the child process successfully, but I am having trouble getting messages sent back to the add-on. Chrome's native messaging feature uses stdin/stdout. The first 4 bytes of every message in each direction represent the size in bytes of the message that follows, so the receiver knows how much to read. Here's what I have so far:
Add-on to Child Process
var utf8 = new TextEncoder("utf-8").encode(message);
var latin = new TextDecoder("latin1").decode(utf8);
emit(childProcess.stdin, "data", new TextDecoder("latin1").decode(new Uint32Array([utf8.length])));
emit(childProcess.stdin, "data", latin);
emit(childProcess.stdin, "end");
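A note on the length header above: decoding a Uint32Array as latin1 yields one character per byte but inherits the platform's byte order, and the WHATWG "latin1" label actually maps to windows-1252, which remaps bytes 0x80-0x9F. A sketch of the same three emits with the prefix pinned to little-endian (matching struct.unpack('i') on typical little-endian machines) and a plain 1:1 char-code mapping, reusing the utf8 and latin variables from above:

// Build the 4-byte length prefix with an explicit little-endian DataView.
var header = new Uint8Array(4);
new DataView(header.buffer).setUint32(0, utf8.length, true); // true = little-endian
// String.fromCharCode maps each byte value straight to its char code.
emit(childProcess.stdin, "data", String.fromCharCode.apply(null, header));
emit(childProcess.stdin, "data", latin);
emit(childProcess.stdin, "end");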
Child Process (Python) from Add-on
text_length_bytes = sys.stdin.read(4)
text_length = struct.unpack('i', text_length_bytes)[0]
text = sys.stdin.read(text_length).decode('utf-8')
Child Process to Add-on
sys.stdout.write(struct.pack('I', len(message)))
sys.stdout.write(message)
sys.stdout.flush()
Add-on from Child Process
This is where I'm struggling. I have it working when the length is less than 128 (a single UTF-8 byte). For instance, if the length is 55, this works:
childProcess.stdout.on('data', (data) => { // data is "7" (char code 55)
    var utf8Encoded = new TextEncoder("utf-8").encode(data);
    console.log(utf8Encoded[0]); // 55
});
But, like I said, it does not work for all numbers. I'm sure I have to do something with TypedArrays, but I'm struggling to put everything together.
The problem here is that Firefox reads stdout as a UTF-8 stream by default. Since UTF-8 doesn't use the full first byte (values above 127 start multi-byte sequences), you get corrupted characters, for example for 255. The solution is to tell Firefox to read in binary encoding, which means you'll have to parse the actual message content manually later on.
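You can see the corruption concretely with the standard TextDecoder/TextEncoder (a quick sketch, runnable in modern Node or a browser):

// A lone 0xFF byte is invalid UTF-8, so decoding substitutes U+FFFD,
// which itself re-encodes as the three bytes 239, 191, 189.
const mangled = new TextDecoder("utf-8").decode(Uint8Array.of(0xff));
console.log(new TextEncoder().encode(mangled)); // Uint8Array [ 239, 191, 189 ]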
Spawn the child with a null encoding to get the raw bytes:

var childProcess = spawn("mybin", [ '-a' ], { encoding: null });

Your listener would then work like this:
var decoder = new TextDecoder("utf-8");

var readIncoming = (data) => {
    // Read the first four bytes, which give the size of the following message.
    // Using byteOffset keeps this correct on subarrays of the original buffer.
    var size = new DataView(data.buffer, data.byteOffset).getUint32(0, true);
    //TODO: handle size > data.byteLength - 4
    // Read the message itself (subarray takes an end index, hence 4 + size).
    var message = decoder.decode(data.subarray(4, 4 + size));
    //TODO: do stuff with message
    // Read the next message if there are more bytes.
    if (data.byteLength > 4 + size)
        readIncoming(data.subarray(4 + size));
};

childProcess.stdout.on('data', (data) => {
    // Convert the data string to a byte array.
    // The bytes got converted by char code, see https://dxr.mozilla.org/mozilla-central/source/addon-sdk/source/lib/sdk/system/child_process/subprocess.js#357
    var bytes = Uint8Array.from(data, (c) => c.charCodeAt(0));
    readIncoming(bytes);
});
Maybe this is similar to this problem:
Chrome native messaging doesn't accept messages of certain sizes (Windows)
Windows-only: Make sure that the program's I/O mode is set to O_BINARY. By default, the I/O mode is O_TEXT, which corrupts the message format as line breaks (\n = 0A) are replaced with Windows-style line endings (\r\n = 0D 0A). The I/O mode can be set using __setmode.
I want to encrypt an input stream and send it to another server via TCP. So far so good. Everything runs smoothly until the connection is closed: in almost every case the final cipher block is incomplete and the script crashes with "wrong final block length", although I turned auto padding on.
It seems like auto padding only works when using the legacy interface. Am I doing something wrong here?
var net = require("net")
, crypto = require("crypto");
var credentials = { algorithm: "aes192", password: "password" }
, decipher = crypto.createDecipher(credentials.algorithm, credentials.password)
, cipher = crypto.createCipher(credentials.algorithm, credentials.password);
decipher.setAutoPadding(true);
cipher.setAutoPadding(true);
net.createServer(function(socket) {
    socket.pipe(socket);
}).listen(2000);
var socket = net.connect(2000);
socket.pipe(decipher).pipe(process.stdout);
process.stdin.pipe(cipher).pipe(socket);
socket.write("Too short.");
socket.end();
In my ideal Node.js world, the (de)cipher stream would automatically pad the last block when the source stream closes. I think this is a design flaw.
Apart from opening an issue, how can I work around this behaviour? Do I have to put a byte counter between the socket and the (de)cipher streams?
You have set up your pipes like this:
stdin | cipher | socket (loopback) | decipher | stdout
But you bypass the encryption by writing directly to the socket, using them like this:
socket (loopback) | decipher | stdout
Try this code instead:
var net = require("net")
, crypto = require("crypto");
var credentials = { algorithm: "aes192", password: "password" }
, decipher = crypto.createDecipher(credentials.algorithm, credentials.password)
, cipher = crypto.createCipher(credentials.algorithm, credentials.password);
decipher.setAutoPadding(false); //set to false to keep the padding
cipher.setAutoPadding(true);
//Loopback
server = net.createServer(function(socket) {
    socket.pipe(socket);
});
server.listen(2000);
var socket = net.connect(2000);
//cipher to the loopback socket, to decipher and stdout
cipher.pipe(socket).pipe(decipher).pipe(process.stdout);
//write some data
cipher.write("Too short.");
//Clean exit
cipher.end();
server.unref();
For the purpose of demonstration, I disabled auto padding on the decipher object so you can see the leftover padding. Piping the program into xxd (at the command line, not in Node) gives me this output:
$ nodejs so.js | xxd
0000000: 546f 6f20 7368 6f72 742e 0606 0606 0606 Too short.......
With 0x06 repeated 6 times. That is standard PKCS#7 padding: "Too short." is 10 bytes and the AES block size is 16 bytes, so the cipher appends 6 padding bytes, each holding the value 6.
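If you keep auto padding disabled on the decipher side, stripping the padding yourself is a one-liner. A minimal sketch, assuming well-formed PKCS#7 input and doing no validation:

// Strip PKCS#7 padding: the value of the last byte is the pad length.
function pkcs7Unpad(buf) {
    var padLength = buf[buf.length - 1];
    return buf.slice(0, buf.length - padLength);
}

// Example: pkcs7Unpad(Buffer.from("Too short.\x06\x06\x06\x06\x06\x06"))
// returns a buffer containing just "Too short."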