I am trying to create a terminal app that will run indefinitely and will have the ability to read from the terminal.
I tried to use the "readline" API but the app terminates without waiting for any input.
I added a "while(true)" loop but it seems that the thread gets stuck in the loop and does not respond to my input.
I also need a series of random numbers.
To accomplish that I added an interval of 1000 ms, and the result was the same as with the while loop.
To summarize, I need to create an app that reads from the terminal and creates random numbers on a given interval.
Any guidance will be appreciated.
Edit 1
Some additional information I just thought I should give you.
I tried to put either the readline call or the interval in a separate forked process, but nothing changed.
I also tried to use recursion for the readline call.
Edit 2
Although I accepted amangpt777's answer, I would like to point out another problem that you might encounter.
I was calling my script like this: 'clear | node ./script.js' on Windows PowerShell.
I believe that it was the pipe that was blocking my input.
I don't know if this can happen on Linux; I haven't tested it.
I just add it here so you can keep it in mind.
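For reference, you can detect this situation by checking process.stdin.isTTY (a check I am adding here for completeness; it was not in my original script). When stdin comes from a pipe instead of the terminal, isTTY is undefined:

if (!process.stdin.isTTY) {
  // stdin is a pipe or a file, not an interactive terminal,
  // so readline will never receive keyboard input
  console.warn('stdin is not a TTY; interactive input is unavailable');
}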
I am not sure what you are trying to accomplish here, but the following code will take input from the user using readline and keep storing the input in an array. Note that I have left some commented-out code which can be uncommented if you want a publish/subscribe model. Also note that you will need to add more code to sanitize and validate your input. I hope this gives you some pointers to achieve what you want:
var readline = require('readline');
//var redis = require('redis');
//let subscriber = redis.createClient();
//let publisher = redis.createClient();

let numEntered = [];

var r1 = readline.createInterface({
  "input": process.stdin,
  "output": process.stdout
});

// subscriber.subscribe('myFunc');
// subscriber.on('message', (channel, msg) => {
//   //Your logic
// });

function printMyArr() {
  console.log("Numbers entered till now: ", numEntered);
}

function askNumber() {
  askQuestion('Next Number?\n')
    .then(ans => {
      handleAnswer(ans);
    })
    .catch(err => {
      console.log(err);
    });
}

function handleAnswer(inputNumber) {
  if (inputNumber === 'e') {
    console.log('Exiting!');
    r1.close();
    process.exit();
  } else {
    numEntered.push(parseInt(inputNumber));
    //publisher.publish('myFunc', parseInt(inputNumber));
    // OR
    printMyArr();
    askNumber();
  }
}

function askQuestion(q) {
  return new Promise((resolve, reject) => {
    r1.question(q, (ans) => {
      return resolve(ans);
    });
  });
}

function init() {
  askQuestion('Enter Stream. Press e and enter to end input stream!\n')
    .then(ans => {
      handleAnswer(ans);
    })
    .catch(err => {
      console.log(err);
    });
}

init();
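As a side note on the interval part of your question: unlike a while(true) loop, setInterval does not block the event loop, so it can run alongside readline without freezing input. A minimal sketch (my own addition, assuming the 1000 ms interval mentioned in the question):

var readline = require('readline');

var rl = readline.createInterface({ input: process.stdin, output: process.stdout });

// emit a random number every second; this does not block readline
var timer = setInterval(function() {
  console.log('random:', Math.random());
}, 1000);

rl.on('line', function(line) {
  if (line.trim() === 'e') { // 'e' to exit, mirroring the code above
    clearInterval(timer);
    rl.close();
  } else {
    console.log('you typed:', line);
  }
});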
Related
I planned to create a Discord server with bots. There are quite a lot of them (6 in total) and they are just supposed to be fictional characters with some background story. I'm quite new and all of that is way too complicated for me to code myself, therefore I ask for your help! I just want to have a nice server for my friends and me with enjoyable bots, and all of these desperate hours of trying to get some useful code are driving me nuts.
I only managed to get one bot to do stuff, using the prefix "-".
It can change its status (watching, listening, playing) and the name of the thing it's doing.
I'm not quite sure why streaming doesn't work or if that's possible in general, but it would be really cool if it did.
My status code: (1st Problem)
client.once('ready', () => {
  console.log('Bot is ready!');
  if (config.activity.streaming == true) {
    client.user.setActivity(config.activity.game, {type: 'WATCHING'}); //STREAMING, PLAYING, LISTENING
  } else {
    client.user.setActivity(config.activity.game, {url: 'https://twitch.tv/usrname'});
    client.user.setStatus('idle'); //dnd, idle, online, invisible
  }
});
config.json
"activity": {
"streaming": true,
"game": "Whatevergame"
}
}
As I said, streaming is not working for some reason and the status (idle, dnd..) is also not working.
2nd Problem
If I try to add other bots with the login, it will log both bots on, but only one of them will work, which is actually pretty logical since the commands are all made for only one bot. So I'm trying to figure out how to get them all packed into the main file.
3rd Problem
I used a try...catch block to execute commands, which I set up beforehand, and if there is none, it sends an error message. See for yourself:
client.on('message', message => {
  if (!message.content.startsWith(prefix) || message.author.bot) return;
  const args = message.content.slice(prefix.length).split(/ +/);
  const command = args.shift().toLowerCase();
  try {
    client.commands.get(command).execute(message, args);
  }
  catch {
    message.channel.send("I don't know that, sorry.");
  }
});
So every time I type another command, one which I do not want the bot to respond to, it will respond with "I don't know [...]". It would be sufficient to just set up another prefix for the "other command" to fix that problem, so the bot knows that for every command starting with a prefix such as "-", it has to send an error message if that command does not exist. But for other prefixes, e.g. "?", it's supposed to execute the other command(s).
4th Problem
My (current) last problems are the welcome messages. My code:
index.js
const welcome = require("./welcome");
welcome(client)
welcome.js
module.exports = (client) => {
  const channelId = '766761508427530250' // welcome channel
  const targetChannelId = '766731745960919052' // rules and info
  client.on('guildMemberAdd', (member) => {
    console.log(member)
    const message = `New user <#${member.id}> joined the server. Please read through ${member.guild.channels.cache.get(targetChannelId).toString()} to gain full access to the server!`
    const channel = member.guild.channels.cache.get(channelId)
    channel.send(message)
  })
}
The code is working perfectly fine, however it would be way more exciting with a little more variety. I'm trying to get multiple welcome messages that get randomly chosen by the bot. I thought about Math.floor as an approach, but I'm not quite sure how that would work.
Thank you for reading through my text and I hope that I will soon be able to enjoy the server with my guys!
Cheers!
First problem
I'm not sure why ClientUser.setActivity() and ClientUser.setStatus() are not working. In the STREAMING case, it might be because you didn't specify the type of activity. Either way, there's an easier way to do what you're doing, which is ClientUser.setPresence(). This method is kind of like a combination of the other two.
client.once('ready', () => {
  console.log('Bot is ready!');
  config.activity.streaming
    ? client.user.setPresence({
        activity: {
          name: config.activity.game,
          type: 'STREAMING',
          url: 'https://twitch.tv/usrname',
        },
        status: 'idle', // note: the idle icon is overwritten by the STREAMING icon, so this won't do much
      })
    : client.user.setPresence({
        activity: { name: config.activity.game, type: 'WATCHING' },
      });
});
Second Problem
It's pretty hard to make multiple bots, both duplicates of each other, in one file. I would recommend just using a lot of Array.prototype.forEach() loops to apply all events and such to both clients.
[1, 2, 3].forEach((num) =>
  console.log(`The element I'm currently iterating a function through is ${num}`)
);

// example main file
const { Client, Collection } = require('discord.js');
const [roseClient, sunflowerClient] = [new Client(), new Client()];

// adding properties:
[roseClient, sunflowerClient].forEach((client) => (client.commands = new Collection()));

// events
[roseClient, sunflowerClient].forEach((client) =>
  client.on('message', (message) => {
    // ...
  })
);

// login
roseClient.login('token1');
sunflowerClient.login('token2');
Third problem
Again, forEach() loops save the day (❁´◡`❁). This time, however, you should actually use Array.prototype.every(), which will return true if every element of an array passes the given test.
Basically, if we were to use a normal forEach() loop, then even if one of the prefixes found the match, the other wouldn't and the error message would always be sent out. So instead we'll use every() to only send out an error message if both prefixes find no match.
// what if we only wanted the error message if *every* number was 3
[1, 2, 3].forEach((num) => {
  if (num === 3) console.error('Error message');
});

console.log('--------------------');

// now it will only send if all numbers were three (they're not)
if ([1, 2, 3].every((num) => num === 3))
  console.error('Error message');
client.on('message', (message) => {
  if (
    ['-', '?'].every((prefix) => {
      if (!message.content.startsWith(prefix) || message.author.bot) return false;
      const args = message.content.slice(prefix.length).split(/ +/);
      const command = args.shift().toLowerCase();
      try {
        client.commands.get(command).execute(message, args);
        // a command was found and executed, so this prefix did not pass the test
        // (the test being finding no match) and the error message should not be displayed
        return false;
      } catch {
        // no command was found for this prefix (it did pass the test); if the other
        // prefix finds no match either, the error message should be displayed
        return true;
      }
    })
  ) {
    message.channel.send("I don't know that, sorry.");
  }
});
Fourth Problem
You're on the right track! Math.floor() is definitely the right way to get a random element from an array.
function chooseFood() {
  // make an array
  const foods = ['eggs', 'waffles', 'cereal', "nothing (●'◡'●)", 'yogurt'];
  // produce a random integer from 0 to the length of the array
  const index = Math.floor(Math.random() * foods.length);
  // find the food at that index
  console.log(`Today I will eat ${foods[index]}`);
}
module.exports = (client) => {
  client.on('guildMemberAdd', (member) => {
    console.log(member);
    const welcomes = [
      `Welcome ${member}!`,
      `Woah! Didn't see you there ${member}; welcome to the server!`,
      `${member} enjoy your stay at ${member.guild}!`,
    ];
    const message = `${
      welcomes[Math.floor(Math.random() * welcomes.length)]
    } Please read through ${member.guild.channels.cache.get(
      '766731745960919052'
    )} to gain full access to the server!`;
    member.guild.channels.cache.get('766761508427530250').send(message);
  });
};
That was a mouthful (╯°□°)╯︵ ┻━┻
I'm new to the idea of asynchronous code, and am still trying to wrap my brain around how everything works.
I'm building a Node Express application which will interface with a database. When running in a development environment I want it to interface with a Sqlite database. (The production database will not use Sqlite. This only applies to creating a small development environment.)
My problem is I'm having trouble controlling the execution order and timing of queries to the database.
I would like to build my SqliteDatabase.js file such that it can only execute queries sequentially, despite the fact that functions in this file will be called by other parts of the program that are running asynchronously.
How can I achieve this?
For reference, here is how I currently have my SqliteDatabase.js file set up:
var debug = require('debug')('app:DATABASE');
var sqlite = require('sqlite3').verbose();

open = function() {
  var db = new sqlite.Database('./test-database.db', sqlite.OPEN_READWRITE | sqlite.OPEN_CREATE, function(err) {
    if (err) {
      debug("We've encountered an error opening the sqlite database.");
      debug(err);
    } else {
      debug("Sqlite database opened successfully.");
    }
  });
  return db;
}

executeSQLUpdate = function(sql, next) {
  var db = open();
  db.serialize(function() {
    console.log("executing " + sql);
    db.run(sql);
    db.close();
    next();
  });
}

exports.executeSQLUpdate = executeSQLUpdate;
Is there some way to build a queue, and make it so when the "executeSQLUpdate" function is called, the request is added to a queue, and is not started until all previous requests have been completed?
To give an example, take a look at this code which utilises my SqliteDatabase.js file:
var database = require('../../bin/data_access/SqliteDatabase.js');

var createTestTableStmt = "CREATE TABLE IF NOT EXISTS Test(\n" +
  "Name TEXT PRIMARY KEY NOT NULL UNIQUE,\n" +
  "Age INT NOT NULL,\n" +
  "Gender TEXT NOT NULL\n" +
  ");";

var clearTestTableStmt = "DELETE FROM Test;";

var testInsertStmt = "INSERT INTO Test (Name, Age, Gender)\n" +
  "VALUES (\"Connor\", 23, \"Male\");";

createTable = function() {
  database.executeSQLUpdate(createTestTableStmt, clearTable);
}

clearTable = function() {
  database.executeSQLUpdate(clearTestTableStmt, insertRow);
}

insertRow = function() {
  database.executeSQLUpdate(testInsertStmt, function() {
    console.log("Done!");
  });
}

createTable();
9 times out of 10 the above code works fine, but every once in a while, the "insert row" function is called before the "clearTable" function is called, which throws an error because of a violated database constraint.
How can I change my implementation of the SqliteDatabase.js file to avoid this issue?
You can use an async function with await to do this. This code will wait for each asynchronous database call to complete before executing the next line.
async function createTable() {
  await database.executeSQLUpdate(createTestTableStmt);
  await database.executeSQLUpdate(clearTestTableStmt);
  await database.executeSQLUpdate(testInsertStmt);
  console.log("Done!");
}
Your console.log statement will only execute once all three have completed.
I should also mention that you need a try...catch block around the three database calls to trap any errors and provide an alternate exit point if something should go wrong.
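Note that await only waits if executeSQLUpdate actually returns a Promise, which the callback-based version in the question does not. A minimal sketch of a promise-returning variant (an assumption about how the function could be adapted, reusing the same open() helper from above):

executeSQLUpdate = function(sql) {
  return new Promise(function(resolve, reject) {
    var db = open();
    db.run(sql, function(err) {
      db.close(function() {
        // settle the promise only after the database handle is closed
        if (err) reject(err);
        else resolve();
      });
    });
  });
};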
I realized why the callback function next() was sometimes being called before db.run(sql) had finished.
It turns out that db.run() is itself an asynchronous function. I updated my code and added a callback to the db.run() line to make sure we don't skip ahead until it's done.
Here's what it looks like now:
executeSQLUpdate = function(sql, next) {
  var db = open();
  db.run(sql, function(err) {
    db.close(function() {
      if (next) next(err);
    });
  });
}
Nesting each asynchronous function in the previous function's callback makes each function execute in order.
Thanks to everyone who gave me hints that helped me figure out what the problem was.
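If you do want an actual queue, as originally asked, rather than relying on callers to nest callbacks, here is a minimal sketch of a promise-chain queue inside SqliteDatabase.js (a suggestion on top of the accepted fix, not code from the question):

// every call is appended to one promise chain, so updates run strictly in order
var queue = Promise.resolve();

executeSQLUpdate = function(sql, next) {
  queue = queue.then(function() {
    return new Promise(function(resolve) {
      var db = open();
      db.run(sql, function(err) {
        db.close(function() {
          if (next) next(err);
          resolve();
        });
      });
    });
  });
  return queue;
};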
Is it possible to cancel a regex.match operation if takes more than 10 seconds to complete?
I'm using a huge regex to match a specific text, and sometimes it may work, and sometimes it can fail...
regex: MINISTÉRIO(?:[^P]*(?:P(?!ÁG\s:\s\d+\/\d+)[^P]*)(?:[\s\S]*?))PÁG\s:\s+\d+\/(\d+)\b(?:\D*(?:(?!\1\/\1)\d\D*)*)\1\/\1(?:[^Z]*(?:Z(?!6:\s\d+)[^Z]*)(?:[\s\S]*?))Z6:\s+\d+
Working example: https://regex101.com/r/kU6rS5/1
So I want to cancel the operation if it takes more than 10 seconds. Is it possible? I'm not finding anything related on SO.
Thanks.
You could spawn a child process that does the regex matching and kill it off if it hasn't completed in 10 seconds. Might be a bit overkill, but it should work.
fork is probably what you should use, if you go down this road.
If you'll forgive my non-pure functions, this code would demonstrate the gist of how you could communicate back and forth between the forked child process and your main process:
index.js
const { fork } = require('child_process');

const processPath = __dirname + '/regex-process.js';
const regexProcess = fork(processPath);
let received = null;

regexProcess.on('message', function(data) {
  console.log('received message from child:', data);
  clearTimeout(timeout);
  received = data;
  regexProcess.kill(); // or however you want to end it. just as an example.
  // you have access to the regex data here.
  // send to a callback, or resolve a promise with the value,
  // so the original calling code can access it as well.
});

const timeoutInMs = 10000;
let timeout = setTimeout(() => {
  if (!received) {
    console.error('regexProcess is still running!');
    regexProcess.kill(); // or however you want to shut it down.
  }
}, timeoutInMs);

regexProcess.send('message to match against');
regex-process.js
function respond(data) {
  process.send(data);
}

function handleMessage(data) {
  console.log('handling message:', data);
  // run your regex calculations in here
  // then respond with the data when it's done.

  // the following is just to emulate
  // a synchronous computational delay
  for (let i = 0; i < 500000000; i++) {
    // spin!
  }

  respond('return regex process data in here');
}

process.on('message', handleMessage);
This might just end up masking the real problem, though. You may want to consider reworking your regex like other posters have suggested.
Another solution I found here:
https://www.josephkirwin.com/2016/03/12/nodejs_redos_mitigation/
It is based on the use of the vm module, with no process fork.
That's pretty neat.
const util = require('util');
const vm = require('vm');

var sandbox = {
  regex: /^(A+)*B/,
  string: "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAC",
  result: null
};

var context = vm.createContext(sandbox);
console.log('Sandbox initialized: ' + vm.isContext(sandbox));
var script = new vm.Script('result = regex.test(string);');

try {
  // One could argue that if a RegExp hasn't finished in a given time,
  // then it is likely to take exponential time.
  script.runInContext(context, { timeout: 1000 }); // milliseconds
} catch (e) {
  console.log('ReDoS occurred', e); // Take some remedial action here...
}

console.log(util.inspect(sandbox)); // Check the results
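A minimal sketch of how this could be wrapped into a reusable helper (the safeRegexTest name and signature are my own, not from the linked article):

const vm = require('vm');

// returns true/false from regex.test, or null if the evaluation timed out
function safeRegexTest(regex, string, timeoutMs) {
  const sandbox = { regex: regex, string: string, result: null };
  const context = vm.createContext(sandbox);
  const script = new vm.Script('result = regex.test(string);');
  try {
    script.runInContext(context, { timeout: timeoutMs });
    return sandbox.result;
  } catch (e) {
    return null; // timed out (or some other error occurred)
  }
}

console.log(safeRegexTest(/^(A+)*B/, 'AAAAAAAAAAAAAAAAAAAAAAAAAC', 1000)); // null on ReDoS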
I've got an rxjs observer (really a Subject) that tails a file forever, just like tail -f. It's awesome for monitoring logfiles, for example.
This "forever" behavior is great for my application, but terrible for testing. Currently my application works but my tests hang forever.
I'd like to force an observer change to complete early, because my test code knows how many lines should be in the file. How do I do this?
I tried calling onCompleted on the Subject handle I returned, but at that point it's basically cast as an observer and you can't force it to close; the error is:
Object # has no method 'onCompleted'
Here's the source code:
function ObserveTail(filename) {
  source = new Rx.Subject();

  if (fs.existsSync(filename) == false) {
    console.error("file doesn't exist: " + filename);
  }

  var lineSep = /[\r]{0,1}\n/;
  tail = new Tail(filename, lineSep, {}, true);

  tail.on("line", function(line) {
    source.onNext(line);
  });

  tail.on('close', function(data) {
    console.log("tail closed");
    source.onCompleted();
  });

  tail.on('error', function(error) {
    console.error(error);
  });

  this.source = source;
}
And here's the test code that can't figure out how to force forever to end (tape style test). Note the "ILLEGAL" line:
test('tailing a file works correctly', function(tid) {
  var lines = 8;
  var i = 0;
  var filename = 'tape/tail.json';
  var handle = new ObserveTail(filename);
  touch(filename);

  handle.source
    .filter(function (x) {
      try {
        JSON.parse(x);
        return true;
      } catch (error) {
        tid.pass("correctly caught illegal JSON");
        return false;
      }
    })
    .map(function(x) { return JSON.parse(x) })
    .map(function(j) { return j.name })
    .timeout(10000, "observer timed out")
    .subscribe(
      function(name) {
        tid.equal(name, "AssetMgr", "verified name field is AssetMgr");
        i++;
        if (i >= lines) {
          handle.onCompleted(); // XXX ILLEGAL
        }
      },
      function(err) {
        console.error(err)
        tid.fail("err leaked through to subscriber");
      },
      function() {
        tid.end();
        console.log("Completed");
      }
    );
})
It sounds like you solved your problem, but to your original question
I'd like to force an observer change to complete early, because my test code knows how many lines should be in the file. How do I do this?
In general the use of Subjects is discouraged when you have better alternatives, since they tend to be a crutch for people to use programming styles they are familiar with. Instead of trying to use a Subject, I would suggest that you think about what each event would mean in an Observable's life cycle.
Wrap Event Emitters
There already exists a wrapper for the EventEmitter#on/off pattern in the form of Observable.fromEvent. It handles clean-up and keeps the subscription alive only while there are listeners. Thus ObserveTail can be refactored into
function ObserveTail(filename) {
  return Rx.Observable.create(function(observer) {
    var lineSep = /[\r]{0,1}\n/;
    tail = new Tail(filename, lineSep, {}, true);

    var line = Rx.Observable.fromEvent(tail, "line");
    var close = Rx.Observable.fromEvent(tail, "close");
    var error = Rx.Observable.fromEvent(tail, "error")
      .flatMap(function(err) { return Rx.Observable.throw(err); });

    //Only take events until close occurs and wrap in the error for good measure
    //The latter two are terminal events in this case.
    return line.takeUntil(close).merge(error).subscribe(observer);
  });
}
This has several benefits over the vanilla use of Subjects: one, you will now actually see the error downstream, and two, this will handle cleaning up your events when you are done with them.
Avoid *Sync Methods
Then this can be rolled into your file existence check without the use of existsSync
//If it doesn't exist then we are done here
//You could also throw from the filter if you want an error tracked
var source = Rx.Observable.fromNodeCallback(fs.exists)(filename)
  .filter(function(exists) { return exists; })
  .flatMap(ObserveTail(filename));
Next you can simplify your filter/map/map sequence down by using flatMap instead.
var result = source.flatMap(function(x) {
  try {
    return Rx.Observable.just(JSON.parse(x));
  } catch (e) {
    return Rx.Observable.empty();
  }
},
//This allows you to map the result of the parsed value
function(x, json) {
  return json.name;
})
.timeout(10000, "observer timed out");
Don't signal, unsubscribe
How do you signal a stop when streams only travel in one direction? We rarely actually want to have an Observer directly communicate with an Observable, so a better pattern is to not actually "signal" a stop but to simply unsubscribe from the Observable and leave it up to the Observable's behavior to determine what it should do from there.
Essentially your Observer really shouldn't care about your Observable more than to say "I'm done here".
To do that you need to declare a condition at which you want to stop.
In this case since you are simply stopping after a set number in your test case you can use take to unsubscribe. Thus the final subscribe block would look like:
result
  //After lines is reached this will complete.
  .take(lines)
  .subscribe(
    function(name) {
      tid.equal(name, "AssetMgr", "verified name field is AssetMgr");
    },
    function(err) {
      console.error(err)
      tid.fail("err leaked through to subscriber");
    },
    function() {
      tid.end();
      console.log("Completed");
    }
  );
Edit 1
As pointed out in the comments, in the case of this particular API there isn't a real "close" event, since Tail is essentially an infinite operation. In this sense it is no different from a mouse event handler; we will stop sending events when people stop listening. So your block would probably end up looking like:
function ObserveTail(filename) {
  return Rx.Observable.create(function(observer) {
    var lineSep = /[\r]{0,1}\n/;
    tail = new Tail(filename, lineSep, {}, true);

    var line = Rx.Observable.fromEvent(tail, "line");
    var error = Rx.Observable.fromEvent(tail, "error")
      .flatMap(function(err) { return Rx.Observable.throw(err); });

    //Take lines until an error occurs (the terminal event here) and stop
    //watching the file once the last subscriber has unsubscribed.
    return line
      .finally(function() { tail.unwatch(); })
      .merge(error).subscribe(observer);
  }).share();
}
The addition of the finally and the share operators creates an object which will attach to the tail when a new subscriber arrives and will remain attached as long as there is at least one subscriber still listening. Once all the subscribers are done however we can safely unwatch the tail.
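For completeness, a sketch of how the pieces above might fit together in the original tape test (assuming the same touch helper, filename, and line count from the question):

test('tailing a file works correctly', function(tid) {
  var lines = 8;
  var filename = 'tape/tail.json';
  touch(filename);

  ObserveTail(filename)
    .flatMap(function(x) {
      try {
        return Rx.Observable.just(JSON.parse(x));
      } catch (e) {
        tid.pass("correctly caught illegal JSON");
        return Rx.Observable.empty();
      }
    }, function(x, json) { return json.name; })
    .timeout(10000, "observer timed out")
    .take(lines) // completes after the expected number of lines
    .subscribe(
      function(name) { tid.equal(name, "AssetMgr", "verified name field is AssetMgr"); },
      function(err) { tid.fail("err leaked through to subscriber"); },
      function() { tid.end(); console.log("Completed"); }
    );
});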
I'm using this Gumroad-API npm package in order to fetch data from an external service (Gumroad). Unfortunately, it seems to use a .then() construct which can get a little unwieldy as you will find out below:
This is my meteor method:
Meteor.methods({
  fetchGumroadData: () => {
    const Gumroad = Meteor.npmRequire('gumroad-api');
    let gumroad = new Gumroad({ token: Meteor.settings.gumroadAccessKey });

    let before = "2099-12-04";
    let after = "2014-12-04";
    let page = 1;
    let sales = [];

    // Recursively defined to continue fetching the next page if it exists
    let doThisAfterResponse = (response) => {
      sales.push(response.sales);
      if (response.next_page_url) {
        page = page + 1;
        gumroad.listSales(after, before, page).then(doThisAfterResponse);
      } else {
        let finalArray = R.unnest(sales);
        console.log('result array length: ' + finalArray.length);
        Meteor.call('insertSales', finalArray);
        console.log('FINISHED');
      }
    }

    gumroad.listSales(after, before, page).then(doThisAfterResponse); // run
  }
});
Since the NPM package exposes the Gumroad API using something like this:
gumroad.listSales(after, before, page).then(callback)
I decided to do it recursively in order to grab all pages of data.
Let me try to re-cap what is happening here:
The journey starts on the last line of the code shown above.
The initial page is fetched, and doThisAfterResponse() is run for the first time.
We first dump the returned data into our sales array, and then we check if the response has given us a link to the next page (as an indication as to whether or not we're on the final page).
If so, we increment our page count and we make the API call again with the same function to handle the response again.
If not, this means we're at our final page. Now it's time to format the data using R.unnest and finally insert the finalArray of data into our database.
But a funny thing happens here. The entire execution halts at the Meteor.call() and I don't even get an error output to the server logs.
I even tried switching out the Meteor.call() for a simple: Sales.insert({text: 'testing'}) but the exact same behaviour is observed.
What I really need to do is to fetch the information and then store it into the database on the server. How can I make that happen?
EDIT: Please also see this other (much more simplified) SO question I made:
Calling a Meteor Method inside a Promise Callback [Halting w/o Error]
I ended up ditching the NPM package and writing my own API call. I could never figure out how to make my call inside the .then(). Here's the code:
fetchGumroadData: () => {
  let sales = [];
  const fetchData = (page = 1) => {
    let options = {
      data: {
        access_token: Meteor.settings.gumroadAccessKey,
        before: '2099-12-04',
        after: '2014-12-04',
        page: page,
      }
    };
    HTTP.call('GET', 'https://api.gumroad.com/v2/sales', options, (err, res) => {
      if (err) { // API call failed
        console.log(err);
        throw err;
      } else { // API call successful
        sales.push(...res.data.sales);
        res.data.next_page_url ? fetchData(page + 1) : Meteor.call('addSalesFromAPI', sales);
      }
    });
  };
  fetchData(); // run the function to fetch data recursively
}