Integrate JavaScript with response

I have two JavaScript files and I would like to combine them to get an output that can be treated as success or failure. I'm not sure if this is the right direction or if I have got this completely wrong.
The initial script clones the repo:
const shell = require('shelljs')
const path = '/home/test/'
shell.cd(path)
shell.exec('git clone https://github.com/test/mic.git')
This is a JavaScript file and it does clone the repo: I run node git.js and it simply clones the repo.
Now I have another script which needs to take the result of the above script and pass it to a variable which then says whether it is a success or a failure.
var body = response.getResponseBody();
var res = JSON.parse(body);
if (res.flag < 1) {
    api.setValue("flag", "failed");
}
Is there a way to integrate these scripts to get the right result?
All I want is to know whether the first script succeeded or failed, and to get that status as a result which can be passed as a flag to another variable.
Any direction is really helpful.

shell.exec takes a callback.
If there is an error, the code parameter in the callback will be non-zero:
shell.exec('git clone https://github.com/test/mic.git', (code, stdout, stderr) => {
    if (code === 0) {
        // No error
    } else {
        // Had an error
    }
})
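Putting the two pieces together, a minimal sketch might look like the following. It assumes api.setValue is the same helper used in the second script, reuses the path and repo URL from the question, and bases the flag purely on the exit code of git clone.

const shell = require('shelljs')

shell.cd('/home/test/')
shell.exec('git clone https://github.com/test/mic.git', (code, stdout, stderr) => {
    // git exits with 0 on success and non-zero on failure
    if (code === 0) {
        api.setValue("flag", "success")   // api.setValue assumed from the second script
    } else {
        console.error(stderr)
        api.setValue("flag", "failed")
    }
})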

Related

Required JSON file has old values from before running the program

I am writing a Discord bot using JavaScript (discord.js).
I use json files to store my data and of course always need the latest data.
I do the following steps:
I start the bot
I run a function that requires the config.json file every time a message is sent
I increase the xp a user gets from the message he sent
I update the users xp in the config.json
I log the data
So now after logging the first time (i.e. sending the first message) I get the data that was in the JSON file before I started the bot (makes sense). But after sending the second message, I expect the xp value to be higher than before, because the data should have been updated, the file loaded anew, and the data logged again.
(Yes I do update the file every time. When I look in the file by myself, the data is always up to date)
So is there any reason the file is not updated after requiring it the second time? Does require not reload the file?
Here is my code:
function loadJson() {
    var jsonData = require("./config.json")
    // here I navigate through my json file and end up getting to the ... That won't be needed I guess :)
    return jsonData
}
// edits the xp of a user
function changeUserXP(receivedMessage) {
    let xpPerMessage = getJsonData(receivedMessage)["levelSystemInfo"].xpPerMessage
    jsonReader('./config.json', (err, data) => {
        if (err) {
            console.log('Error reading file:', err)
            return
        }
        // increase the user's xp
        data.guilds[receivedMessage.guild.id].members[receivedMessage.author.id].xp += Number(xpPerMessage)
        data.guilds[receivedMessage.guild.id].members[receivedMessage.author.id].stats.messagesSent += 1
        fs.writeFile('./test_config.json', JSON.stringify(data, null, 4), (err) => {
            if (err) console.log('Error writing file:', err)
        })
    })
}
client.on("message", (receivedMessage) => {
changeUserXP(receivedMessage)
console.log(loadJson(receivedMessage))
});
I hope the code helps :)
If my question was not precise enough or if you have further questions, feel free to comment
Thank you for your help <3
This is because require() reads the file only once and caches it. In order to read the same file again, you should first delete its key (the key is the path to the file) from require.cache.
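For example, here is a minimal sketch of a loader that always returns fresh data: the first variant clears the require cache before re-requiring, and the second bypasses require() entirely and reads the file with fs (loadJsonFresh is just an illustrative name; both assume the same ./config.json path as the question).

const fs = require("fs")

function loadJson() {
    // drop the cached copy so require() re-reads config.json from disk
    delete require.cache[require.resolve("./config.json")]
    return require("./config.json")
}

// alternative: skip the require cache altogether
function loadJsonFresh() {
    return JSON.parse(fs.readFileSync("./config.json", "utf8"))
}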

Changing an array entry within a json object

I've pulled JSON from a file using Node.js's fs.createReadStream() and I'm now finding it difficult to write data back into the file (already parsing and then stringifying as appropriate).
The Discord bot I'm developing deletes text channels and then 'recreates' them (with the same title) to clear chat - it grabs the channel IDs dynamically and puts them in a file until the channels are deleted.
However, the file-writing procedure ends in errors.
This was my first attempt:
let channels_json = fs.createReadStream()
//let channels_json = fs.readFileSync(`${__dirname}\\..\\json\\channels.json`);
let obj = (JSON.parse(channels_json)).channelsToClear;
let i = 0;
obj.forEach(id => {
    i++;
    if (id === originalId) {
        obj[i] = channela.id;
    }
});
obj += (JSON.parse(channels_json)).infoChannel;
obj += "abc";
json = JSON.stringify(obj);
channels_json.write(json);
This was my second:
let id_to_replace = message.guild.channels.get(channels[channel]).id;
//let channels_json = fs.readFileSync(`${__dirname}\\..\\json\\channels.json`);
let obj;
let channels_json = fs.createReadStream(`${__dirname}\\..\\json\\channels.json`, function(err, data){
    if (err) throw err;
    obj = JSON.parse(data);
    if (obj["channelsToClear"].indexOf(id_to_replace) > -1) {
        obj["channelsToClear"][obj["channelsToClear"].indexOf(id_to_replace)] = channela.id;
        //then replace the json file with new parsed one
        channels_json.writeFile(`${__dirname}\\..\\json\\channels.json`, JSON.stringify(obj), function(){
            console.log("Successfully replaced channels.json contents");
        });
        //channels_json.end();
    }
});
The desired outcome was to update the 'channelsToClear' array within the JSON file with the new channel IDs. The console/Node output varied, all of it to do with "create channels with an options object" or "Buffer.write" (all irrelevant), and the JSON file remained unchanged.
You're using Streams incorrectly. You can't write back out through a read stream.
For a simple script like this, streaming is probably overkill. Streams are quite a bit more complicated and, while worth it for high-efficiency applications, not for something that looks like it is going to be relatively quick.
Use fs.readFileSync and fs.writeFileSync to read and write the data instead.
As far as actually searching for and replacing the channel goes, I think either approach would work, but assuming there is only ever going to be one replacement, the second approach is probably better.
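A minimal sketch of that synchronous approach, reusing id_to_replace and channela.id from the second attempt and the same channels.json path as the question:

const fs = require("fs");
const path = require("path");

const filePath = path.join(__dirname, "..", "json", "channels.json");
const obj = JSON.parse(fs.readFileSync(filePath, "utf8"));

const index = obj.channelsToClear.indexOf(id_to_replace);
if (index > -1) {
    // swap in the recreated channel's ID and write the whole object back
    obj.channelsToClear[index] = channela.id;
    fs.writeFileSync(filePath, JSON.stringify(obj, null, 2));
    console.log("Successfully replaced channels.json contents");
}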

Execute code after SQLite database has been written to hard storage

I have the following example code (on a Node.js server) that should insert data into an SQLite table and then run a child process which copies the SQLite database file to another directory. The problem is that the copied version does not contain the newly inserted data. When I set a timeout before executing the command everything works, but I would prefer to use a callback or event.
const sqlite3 = require('sqlite3').verbose();
const db = new sqlite3.Database('test.db');
const exec = require('child_process').exec;
db.serialize(function() {
    var val = Date.now() / 1000;
    db.run("INSERT INTO test (val) VALUES (?);", [val]);
    db.close();
    exec('/bin/cp -rf /path0/test.db /path1/');
});
As the documentation says about the close function:
Database#close([callback])
Closes the database.
callback (optional): If provided, this function will be called when the database was closed successfully or when an error occurred. The first argument is an error object. When it is null, closing succeeded. If no callback is provided and an error occurred, an error event with the error object as the only parameter will be emitted on the database object. If closing succeeded, a close event with no parameters is emitted, regardless of whether a callback was provided or not.
You should be able to provide a callback to the close function to be run after the db is closed. If I understand your code correctly, it should be something like this:
const sqlite3 = require('sqlite3').verbose();
const db = new sqlite3.Database('test.db');
const exec = require('child_process').exec;
db.serialize(function() {
    var val = Date.now() / 1000;
    db.run("INSERT INTO test (val) VALUES (?);", [val]);
    db.close(() => { exec('/bin/cp -rf /path0/test.db /path1/') });
});
The reference is the sqlite3 module's API documentation for Database#close.
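A slightly more defensive sketch of the same idea: only start the copy once close() reports success, and surface any error from the copy itself (the paths are the placeholders from the question).

db.serialize(function() {
    var val = Date.now() / 1000;
    db.run("INSERT INTO test (val) VALUES (?);", [val]);
    db.close((err) => {
        if (err) {
            console.error('Failed to close database:', err);
            return;
        }
        // the file is fully flushed once close() succeeds, so copying is safe now
        exec('/bin/cp -rf /path0/test.db /path1/', (copyErr) => {
            if (copyErr) console.error('Copy failed:', copyErr);
        });
    });
});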

redis.exceptions.ResponseError: MOVED error in redis set operation

I have created a Redis cluster with 30 instances (15 masters / 15 nodes). With Python code I connected to these instances, found the masters, and then wanted to add some keys to them.
def settomasters(port, host):
    r = redis.Redis(host=host, port=port)
    r.set("key" + port, "value")
Error:
redis.exceptions.ResponseError: MOVED 12539 127.0.0.1:30012
If I try to set a key from redis-cli -c -p portofmyinstance, I sometimes get a redirection message that tells me where the key is stored.
I know that in the case of get requests, for example, a smart client is needed in order to redirect the requests to the correct node (the node that holds the key), otherwise a MOVED error occurs. Is it the same situation here? Do I need to catch the redis.exceptions.ResponseError and try the set again?
while True:
    try:
        r.set("key", "value")
        break
    except:
        print "error"
        pass
My first try was the code above, but without success; the set operation never succeeds.
On the other hand, the JavaScript code below does not throw an error, and I cannot figure out why:
var redis = require('redis-stream'),
    client = new redis(30001, '127.0.0.1');

// Open stream
var stream = client.stream();

// Example of setting 200 records
for (var record = 0; record < 200; record++) {
    var command = ['set', 'qwerty' + record, 'QWERTYUIOP'];
    stream.redis.write(redis.parse(command));
}

stream.on('close', function () {
    console.log('Completed!');
});

// Close the stream after batch insert
stream.end();
Any help will be appreciated, thanks.
With a Redis cluster you can use the normal redis client only if you "find, for a certain key, the slot it belongs to and then the slots that each master serves. With this information I can set keys to the correct node without MOVED redirection errors," as @Antonis said. Otherwise you need a cluster-aware client such as redis-py-cluster: http://redis-py-cluster.readthedocs.io/en/master/

Can't seem to save the output of a client.shell() command to a variable

I am using node-webkit and adbkit to try to read a line from an Android build.prop and do something depending on what that line is.
Full script at http://pastebin.com/7N7t1akk
The gist of it is this:
var model = client.shell(devices, "su -c 'grep ro.product.model /system/build.prop'" );
alert(model)
I want to read ro.product.model from build.prop into the variable model.
As a test I'm simply attempting to create an alert that displays the return value of this shell command, in my case ro.product.model=KFSOWI, but whenever I run this script with a connected device the alert shows [object Object].
Edit:
I've just realized that client.getProperties(serial[, callback]) would probably work better, but I don't understand these functions (specifically the callback) very well.
I am very new to this area of JavaScript and hope someone can offer some insight.
JavaScript is an asynchronous programming language; it is built on callbacks. Every function should have a callback with data passed to it. If you look at the documentation, you have client.shell(serial, command[, callback]), so the data from executing client.shell() will be passed to the callback. You should assign a function that processes the callback; for your case it will be this:
client.shell(devices, "su -c 'grep ro.product.model /system/build.prop'", function(data) {
    console.log(data);
});
P.S. There is no alert in Node.js.
According to the documentation, you can catch the output in the 2nd argument of client.shell()'s callback:
client.shell(devices, "su -c 'grep ro.product.model /system/build.prop'", function(err, output) {
    if (err) {
        console.log(err);
    }
    console.log(output);
});
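Depending on the adbkit version, the output in that callback may be a stream rather than a plain string, in which case it needs to be collected before logging (the same readAll idea shown further down). A sketch, assuming the usual adb = require('adbkit') import:

client.shell(devices, "su -c 'grep ro.product.model /system/build.prop'", function(err, output) {
    if (err) return console.log(err);
    // output is a stream; read it all into a Buffer before using the text
    adb.util.readAll(output).then(function(buffer) {
        console.log(buffer.toString().trim()); // e.g. ro.product.model=KFSOWI
    });
});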
Use async / await for clearer code.
const data = await client.shell(devices, "su -c 'grep ro.product.model /system/build.prop'" );
console.log(data); // => "Samsung.TM395"
Of course, this will only work if the code is inside an async function.
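For instance, a minimal sketch wrapping the same call in an async IIFE:

(async () => {
    const data = await client.shell(devices, "su -c 'grep ro.product.model /system/build.prop'");
    console.log(data);
})();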
Streamed data.
For streamed data with adbkit, you will need to do a little more to read the entire stream and then output the results, like so:
const stream = await adbClient.shell( config.udid, "ime list -s" ) // adb command to list possible input devices (e.g. keyboards, etc.).
const result = await adb.util.readAll( stream );
console.log( result.toString() ); // => com.sec.android.inputmethod/.SamsungKeypad
