Node.js: Stopping transfer at EOF while sending binary data through UART - javascript

I'm trying to open a binary file './test' and send its contents, one byte at a time, to an external device through the UART. The external device echoes back each byte.
The regular file './test' and the buffer 'dta' are both 19860 bytes long, yet the code keeps sending bytes from beyond the end of 'dta' well after 'a' becomes greater than 'dta.length', and I can't figure out why. Any ideas?
var fs = require('fs');
var SerialPort = require("serialport").SerialPort;
var serialPort = new SerialPort("/dev/ttyAMA0", {baudrate: 115200}, false);

stats = fs.statSync(__dirname + "/test");
dta = new Buffer(stats.size);
dta = fs.readFileSync(__dirname + "/test");
a = 0;

serialPort.open(function (error) {
    if (error) {
        console.log('failed to open: ' + error);
    } else {
        serialPort.write(dta[a]);
    }
});

serialPort.on('data', function (data) {
    a++;
    if (a < dta.length) serialPort.write(dta[a]);
});
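A plausible cause, sketched under the assumption that serialport's 'data' event can deliver more than one echoed byte at a time (the question doesn't confirm how the echoes arrive): incrementing 'a' once per event would then undercount the echoes, and passing the bare number dta[a] to write() instead of a Buffer may also misbehave. A hedged rework of the handler:

// Hedged sketch: advance the index by the number of echoed bytes actually
// received, and write a 1-byte Buffer slice rather than the number dta[a].
serialPort.on('data', function (data) {
    a += data.length;                          // count every echoed byte
    if (a < dta.length) {
        serialPort.write(dta.slice(a, a + 1)); // next byte, as a Buffer
    }
});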

Related

How to write BLE write characteristic over 512B

I have a client attempting to send images to a server over BLE.
Client Code
// Boilerplate to set up the connection and whatnot
sendFile.onclick = async () => {
    var fileList = document.getElementById("myFile").files;
    var fileReader = new FileReader();
    if (fileReader && fileList && fileList.length) {
        fileReader.readAsArrayBuffer(fileList[0]);
        fileReader.onload = function () {
            var imageData = fileReader.result;
            // Server doesn't get data if I don't do this chunking
            imageData = imageData.slice(0, 512);
            const base64String = _arrayBufferToBase64(imageData);
            document.getElementById("ItemPreview").src = "data:image/jpeg;base64," + base64String;
            sendCharacteristic.writeValue(imageData);
        };
    }
};
Server Code
MyCharacteristic.prototype.onWriteRequest = function (data, offset, withoutResponse, callback) {
    // It seems this will not print out if the client sends over 512B.
    console.log(this._value);
};
My goal is to send small images (just ~6 KB). These are still so small that I'd prefer to use BLE over a BT serial connection. Is the only way to do this to chunk the data and then stream the chunks over?
Current 'Chunking' Code
const MAX_LENGTH = 512;
for (let i = 0; i < bytes.byteLength; i += MAX_LENGTH) {
    const end = (i + MAX_LENGTH > bytes.byteLength) ? bytes.byteLength : i + MAX_LENGTH;
    const chunk = bytes.slice(i, end);
    sendCharacteristic.writeValue(chunk);
    await sleep(1000);
}
The above code works; however, it sleeps between sends. I'd rather not do this, because there's no guarantee a previous packet has finished sending, and I could end up sleeping longer than needed.
I'm also perplexed about how the server code would know that the client has finished sending all bytes and can then assemble them. Is there some kind of pattern for achieving this?
BLE characteristic values can only be 512 bytes, so yes, the common way to send larger data is to split it into multiple chunks. Use "Write Without Response" for best performance (the MTU minus 3 must be at least as big as your chunk).
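A minimal sketch of that approach using the Web Bluetooth API, assuming the characteristic supports writeValueWithoutResponse(); awaiting each write replaces the fixed sleep, and a zero-length final packet is one assumed application-level convention that tells the server the transfer is complete:

// Hedged sketch: sequential chunked writes without fixed sleeps.
const MAX_LENGTH = 512;

async function sendInChunks(characteristic, bytes) {
    for (let i = 0; i < bytes.byteLength; i += MAX_LENGTH) {
        const end = Math.min(i + MAX_LENGTH, bytes.byteLength);
        // The returned promise resolves once the packet has been queued,
        // so the next write never overtakes the previous one.
        await characteristic.writeValueWithoutResponse(bytes.slice(i, end));
    }
    // Empty packet as an assumed end-of-transfer marker: the server treats
    // a 0-byte write as "all chunks received, assemble now".
    await characteristic.writeValueWithoutResponse(new ArrayBuffer(0));
}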

.zip file download with reactjs has corrupted content

My backend (in Node.js) uses a web service that allows downloading a file from a distant server. It has to "return" the download to my ReactJS web app in the frontend.
The download works when testing the service with Swagger. Thus, my backend returns a string with the raw data, which I re-interpret in order to download the file.
My frontend handling is as follows:
downloadFile() {
    let path = this.props.result.link;
    let name = this.props.result.fileName;
    downloader.downloadAFile(path, name).then(response => {
        if (response) {
            if (response.success == true) {
                let bytes = this.data64ToArrayBuffer(response.data);
                var file = new File([bytes], this.props.result.fileName + ".zip", {type: "application/zip"});
                fileSaver.saveAs(file);
            } else {
                console.log(response);
                if (response.data == "Fichier introuvable.") {
                    window.alert(response.data);
                }
            }
        }
    }).catch(function (error) {
        console.log(error);
    });
}

// Named data64ToArrayBuffer so the call above resolves; despite the
// name, it does no base64 decoding.
data64ToArrayBuffer = (data) => {
    var binaryString = data;
    console.log(binaryString);
    var binaryLen = binaryString.length;
    var bytes = new Uint8Array(binaryLen);
    for (var i = 0; i < binaryLen; i++) {
        var ascii = binaryString.charCodeAt(i);
        bytes[i] = ascii;
    }
    return bytes;
}
For a file named "test.txt", containing "test" as text, at the path "/PATH/TO/FOLDER", the API should offer a zip file to download, named "test.txt.zip", containing the whole folder tree /PATH/TO/FOLDER with the mentioned file in the final folder.
My download in the frontend is effective in some way, since it downloads a zip with the right name, the right size, and the right folder tree inside.
However, the test.txt file in the final folder of the path seems to be corrupted: when I try to open it in 7-Zip, I get the error "CRC failed : test.txt".
I suspect it is a problem with how I handle the raw data. I'll take any hints and leads, since I am effectively blocked.
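One hedged lead, assuming the backend actually returns the bytes base64-encoded (the helper name data64ToArrayBuffer hints at that): decode with atob() before building the byte array, since passing raw binary through a text response usually mangles bytes above 0x7F. A sketch:

// Hedged sketch: decode a base64 payload into bytes before wrapping it
// in a File. Only valid if response.data really is base64-encoded.
data64ToArrayBuffer = (base64Data) => {
    var binaryString = atob(base64Data); // base64 -> raw binary string
    var bytes = new Uint8Array(binaryString.length);
    for (var i = 0; i < binaryString.length; i++) {
        bytes[i] = binaryString.charCodeAt(i);
    }
    return bytes;
}

Alternatively, asking the HTTP layer for binary directly (e.g. responseType: 'arraybuffer' in XHR or axios) avoids the string round-trip altogether.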

How to convert JavaScript array to binary data and back for WebSocket?

I have this example array:
[{
    id: 1,
    name: "test",
    position: [1234, 850], // random position on the map
    points: 100 // example points
}];
Here is what I want to do:
1. Convert the array into binary data, and send the binary to my WebSocket server.
2. On the server, decode the binary into an array and make changes.
3. Convert the array back into binary, and send it to the client.
4. On the client, decode the binary back into an array.
This is my actual code:
var connection = new WebSocket('wss://my_website.eu:1234');
connection.binaryType = "arraybuffer"; // must be lowercase, or the assignment is ignored
connection.onmessage = function (event) {
    // console.log(event);
    if (event.data instanceof window["ArrayBuffer"]) {
        var data3 = JSON.parse(String.fromCharCode.apply(null, new Uint16Array(event.data)));
        console.log(data3);
    } else {
        console.log(event.data); // Blob {size: 0, type: ""}
    }
};

$("body").mousemove(function (event) {
    var data = {
        name: "lol",
        pos: [event.pageX, event.pageY]
    };
    // convert to binary frame
    var data2 = new Uint16Array(data);
    console.log(data2); // []
    // try to convert back to array
    var data3 = String.fromCharCode.apply(null, new Uint16Array(data2));
    console.log(data3); // empty
    connection.send(data2); // Binary Frame (Opcode 2, mask) | length: 0
});
Server-side code:
connection.on('message', function (message) {
    for (var i = players.length - 1; i >= 0; i--) {
        players[i].connection.send(message.binaryData);
    }
});
LATEST EDIT READ FROM HERE
I can now send a message as a binary frame to a WebSocket server. I found functions that convert a string to a binary type and send it to a WS server.
Now I have a problem: the function below is not working on the server side. Example code:
var data = {
    name: "value"
};
connection.send(JSON.stringify(data));
This code works fine. Now, when I try to send it as an ArrayBuffer:
var data = {
    name: "value"
};
connection.send(StringToArrayBuffer(JSON.stringify(data)));

the output is not a binary frame. It is just the string "[object ArrayBuffer]".
I also tried:
connection.send(JSON.stringify(data), {binary: true, mask: false});
but this is sending the message as a normal string, not a binary frame.
So, how can I send a binary frame from a WebSocket server to a client? Only echoing back a received binary message works:

connection.on('message', function (message) {
    for (var i = players.length - 1; i >= 0; i--) {
        playerConnection[i].send(message.binaryData);
    }
});
First of all, browsers treat binary data differently than Node.js. In the browser, binary data is seen as a Blob or an ArrayBuffer, while Node.js works with Buffer rather than ArrayBuffer. I won't go too deep into this, but you need to handle data differently in the browser and in Node.js.
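For illustration, a small sketch of bridging the two in Node.js (the four-byte buffer is an arbitrary example):

// Buffer.from(arrayBuffer) creates a Buffer view over the same memory;
// buf.buffer goes the other way (mind byteOffset/byteLength for views).
const ab = new ArrayBuffer(4);
const buf = Buffer.from(ab);                 // shares the ArrayBuffer's memory
console.log(buf.length, buf.buffer === ab);  // 4 true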
When using WebSocket on the browser side, data is transmitted as either string or binary; if binary is used, you have to set binaryType, and in this particular case I will use 'arraybuffer'.
As for converting strings to buffers, I suggest using standard UTF-8, as there are two ways of encoding UTF-16: for example, '\u0024' is stored as 00 24 in UTF-16BE and as 24 00 in UTF-16LE. That is, if you are going to use UTF-16, you should use TextEncoder and TextDecoder. Otherwise you can simply do this:
strToAB = str =>
    new Uint8Array(str.split('')
        .map(c => c.charCodeAt(0))).buffer;

ABToStr = ab =>
    new Uint8Array(ab).reduce((p, c) =>
        p + String.fromCharCode(c), '');

console.log(ABToStr(strToAB('hello world!')));
For UTF-16, the browser code should be something like:
const ENCODING = 'utf-16le';
var ws = new WebSocket('ws://localhost');
ws.binaryType = 'arraybuffer';
ws.onmessage = event => {
    let str = new TextDecoder(ENCODING).decode(event.data),
        json = JSON.parse(str);
    console.log('received', json);
};
ws.onopen = () => {
    let json = { client: 'hi server' },
        str = JSON.stringify(json);
    console.log('sent', json);
    // JSON.toString() returns "[object Object]", which isn't what you want,
    // so ws.send(json) would send the wrong data.
    // Caveat: current browsers' TextEncoder always emits UTF-8 and ignores
    // the constructor argument, so true UTF-16 needs a manual encoder.
    ws.send(new TextEncoder(ENCODING).encode(str));
};
On the server side, data is stored as a Buffer, which more or less handles everything natively. You do, however, need to specify the encoding unless it is UTF-8.
const ENCODING = 'utf-16le';
// You may use a different websocket implementation, but the core
// logic remains the same, as they all build on top of Buffer.
var WebSocketServer = require('websocket').server,
    http = require('http'),
    // This is only here so WebSocketServer can be initialized.
    wss = new WebSocketServer({
        httpServer: http.createServer()
            .listen({ port: 80 })});

wss.on('request', request => {
    var connection = request.accept(null, request.origin);
    connection.on('message', msg => {
        if (msg.type === 'binary') {
            // In Node.js, Buffer#toString(encoding) gives the string
            // representation of the buffer.
            let str = msg.binaryData.toString(ENCODING);
            console.log(`message : ${str}`);
            // Send data back to the browser.
            let json = JSON.parse(str);
            json.server = 'Go away!';
            str = JSON.stringify(json);
            // Buffer.from(string, encoding) builds a Buffer from a string;
            // the default encoding is UTF-8 (new Buffer() is deprecated).
            let buf = Buffer.from(str, ENCODING);
            connection.sendBytes(buf);
        }
    });
});
Try it. Sending data example (the object has to be serialized to a string first; a Uint16Array built directly from an array of objects comes out empty):

var data = [{
    id: 1,
    name: "test",
    position: [1234, 850], // random position on the map
    points: 100 // example points
}];

// Serialize to JSON, then store each character code in a Uint16Array
var str = JSON.stringify(data);
var data2 = new Uint16Array(str.length);
for (var i = 0; i < str.length; i++) {
    data2[i] = str.charCodeAt(i);
}
socket.send(data2.buffer);
In your WebSocket onmessage handler, try this:

function onMessage(event) {
    if (event.data instanceof window["ArrayBuffer"]) {
        var data3 = JSON.parse(String.fromCharCode.apply(null, new Uint16Array(event.data)));
    }
}

How to write a file from an ArrayBuffer in JS

I am trying to write a file uploader for the Meteor framework.
The principle is to split the file on the client, from an ArrayBuffer, into small chunks of 4096 bytes that are sent to the server through a Meteor.method.
The simplified code below is the part of the client that sends a chunk to the server; it is repeated until offset reaches data.byteLength:
// data is an ArrayBuffer
var total = data.byteLength;
var offset = 0;

var upload = function () {
    var length = 4096; // chunk size
    // adjust the last chunk size
    if (offset + length > total) {
        length = total - offset;
    }
    // I am using Uint8Array to create the chunk
    // because it can be passed to the Meteor.method natively
    var chunk = new Uint8Array(data, offset, length);
    if (offset < total) {
        // Send the chunk to the server and tell it what file to append to
        Meteor.call('uploadFileData', fileId, chunk, function (err, length) {
            if (!err) {
                offset += length;
                upload();
            }
        });
    }
};
upload(); // start uploading
The simplified code below is the part on the server that receives the chunk and writes it to the file system:
var fs = Npm.require('fs');
var Future = Npm.require('fibers/future');

Meteor.methods({
    uploadFileData: function (fileId, chunk) {
        var fut = new Future();
        var path = '/uploads/' + fileId;
        // I tried that with no success
        chunk = String.fromCharCode.apply(null, chunk);
        // how to write the chunk that is a Uint8Array to the disk?
        fs.appendFile(path, chunk, 'binary', function (err) {
            if (err) {
                fut.throw(err);
            } else {
                fut.return(chunk.length);
            }
        });
        return fut.wait();
    }
});
I failed to write a valid file to disk. Actually, the file is saved, but I cannot open it; when I view the content in a text editor, it is similar to the original file (a jpg, for example) but some characters are different. I think it could be an encoding problem, as the file size is not the same, but I don't know how to fix it...
Saving the file was as easy as creating a new Buffer from the Uint8Array object:
// chunk is the Uint8Array object
fs.appendFile(path, Buffer.from(chunk), function (err) {
    if (err) {
        fut.throw(err);
    } else {
        fut.return(chunk.length);
    }
});
Building on Karl.S's answer, this worked for me, outside of any framework:
fs.appendFileSync(outfile, Buffer.from(arrayBuffer));
Just wanted to add that in newer Meteor you can avoid some callback hell with async/await. await will also throw and propagate the error up to the client:
const fs = require('fs').promises; // promise API, so appendFile can be awaited

Meteor.methods({
    uploadFileData: async function (file_id, chunk) {
        var path = 'somepath/' + file_id; // be careful with this, make sure to sanitize file_id
        await fs.appendFile(path, Buffer.from(chunk)); // new Buffer() is deprecated
        return chunk.length;
    }
});

Node.js - Reading CSV-file not working with line numbers > 500

I am currently struggling to run my Node.js server.
What I want to do:
1. Upload a CSV file from a mobile device to my local server and save it on the file system.
2. Read each line of the .csv file and save each row to my MongoDB database.
Uploading and saving the file works flawlessly. Reading the .csv file and saving each row to the database only works for files with a small number of lines.
I don't know the exact number of lines at which it stops working; it seems to differ every time I read a file.
Sometimes (if there are more than 1000 lines) the CSV reader I use doesn't even start processing the file. Other times it reads only 100-200 lines and then stops.
Here is how I upload the file:
var fs = require('fs');
var sys = require("sys");
var url = require('url');
var http = require('http');

http.createServer(function (request, response) {
    sys.puts("Got new file to upload!");

    var urlString = url.parse(request.url).pathname;
    var pathParts = urlString.split("/");
    var deviceID = pathParts[1];
    var fileName = pathParts[2];

    sys.puts("DeviceID: " + deviceID);
    sys.puts("Filename: " + fileName);

    sys.puts("Start saving file");
    var tempFile = fs.createWriteStream(fileName);
    request.pipe(tempFile);
    sys.puts("File saved");

    // Starting a new child process which reads the file
    // and inserts each row to the database
    var task = require('child_process').fork('databaseInsert.js');
    task.on('message', function (childResponse) {
        sys.puts('Finished child process!');
    });
    task.send({
        start: true,
        deviceID: deviceID,
        fileName: fileName
    });
    sys.puts("After task");

    response.writeHead(200, {
        "Content-Type": "text/plain"
    });
    response.end('MESSAGE');
}).listen(8080);
This all works fine.
Now the code of the child process (databaseInsert.js):
var sys = require("sys");
var yaCSV = require('ya-csv');
var Db = require('mongodb').Db;
var dbServer = require('mongodb').Server;
process.on('message', function(info) {
sys.puts("Doing work in child process");
var fileName = info.fileName;
var deviceID = info.deviceID;
sys.puts("Starting db insert!");
var dbClient = new Db('test', new dbServer("127.0.0.1", 27017, {}), {
w : 1
});
dbClient.open(function(err, client) {
if (err) {
sys.puts(err);
}
dbClient.createCollection(deviceID, function(err, collection) {
if (err) {
sys.puts("Error creating collection: " + err);
} else {
sys.puts("Created collection: " + deviceID);
var csvReader = yaCSV.createCsvFileReader(fileName, {
columnsFromHeader : true,
'separator' : ';'
});
csvReader.setColumnNames([ 'LineCounter', 'Time', 'Activity',
'Latitude', 'Longitude' ]);
var lines = 0;
csvReader.addListener('data', function(data) {
lines++;
sys.puts("Line: " + data.LineCounter);
var docRecord = {
fileName : fileName,
lineCounter : data.LineCounter,
time : data.Time,
activity : data.Activity,
latitude : data.Latitude,
longitude : data.Longitude
};
collection.insert(docRecord, {
safe : true
}, function(err, res) {
if (err) {
sys.puts(err);
}
});
});
}
});
});
process.send('finished');
});
At first I didn't use a child process, but I had the same behaviour as I have now, so I tested this.
Hopefully someone with Node.js experience can help me.
I think your issue is that you are trying to read the tempFile while it is still being written to. Right now you pipe the request into the file stream (which proceeds in parallel and asynchronously) and immediately start the reader process. The reader then starts reading the file in parallel with the write operations. If the reader is faster (it usually will be), it reads the first couple of records, then encounters an end of file and stops reading.
To remedy this, start the reader process only after the writing has completely finished, i.e., move everything from sys.puts("File saved"); onward into a callback that runs once tempFile is done (see http://nodejs.org/api/stream.html#stream_writable_end_chunk_encoding_callback).
Reading a file while it is still being written to, akin to the tail command in Unix, is fairly hard to my understanding (google for details on how difficult it is to implement a proper tail).
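A minimal sketch of that fix, listening for the write stream's 'finish' event (which fires once the piped request has been fully flushed to disk):

var tempFile = fs.createWriteStream(fileName);
request.pipe(tempFile);
tempFile.on('finish', function () {
    sys.puts("File saved");
    // Only now is it safe to fork the reader process
    var task = require('child_process').fork('databaseInsert.js');
    task.on('message', function (childResponse) {
        sys.puts('Finished child process!');
    });
    task.send({
        start: true,
        deviceID: deviceID,
        fileName: fileName
    });
});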
Are you familiar with mongoimport/mongoexport?
I used it in the past to export from my db to a CSV file, so you can do the opposite after you upload the file from the mobile client to the server.
It runs from the shell, but you can invoke it from code using Node.js's child_process spawn, as sketched below.
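A hedged sketch of that idea (the database name and --headerline are illustrative assumptions; note also that mongoimport only understands comma- or tab-separated input, so a ';'-separated file would need converting first):

// Spawn mongoimport to bulk-load the uploaded CSV into MongoDB.
var spawn = require('child_process').spawn;
var task = spawn('mongoimport', [
    '--db', 'test',            // assumed database name
    '--collection', deviceID,
    '--type', 'csv',
    '--headerline',            // assumes the first CSV line holds column names
    '--file', fileName
]);
task.on('close', function (code) {
    sys.puts('mongoimport exited with code ' + code);
});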
