Writing on stdin when using dockerode - javascript

How can I write to the container's stdin when using the dockerode library? I tried doing it in multiple ways, but nothing seems to work.
My current code that is not able to write to stdin:
import Docker from 'dockerode';

export async function nameToStdName(
  pluginName: string,
  pluginDescription: string,
  pluginId: number,
  numberOfDuplicates: number
) {
  const docker = new Docker();
  const input = `${pluginName}; ${pluginDescription}`;

  // Run the docker container and pass input from a string
  const dockerImageName = 'name-to-stdname';
  const dockerCmd = [
    'python', '/app/main.py',
    '-i', pluginId.toString(),
    '-v', numberOfDuplicates.toString(),
  ];
  const options = {
    cmd: dockerCmd,
    AttachStdin: true,
    AttachStdout: true,
    Tty: false,
  };
  const container = await docker.createContainer({
    Image: dockerImageName,
    ...options,
  });
  await container.start();

  const stream = await container.attach({
    stream: true,
    stdin: true,
    stdout: true,
  });

  // Handle output from container's stdout
  let name = '';
  stream.on('data', (data: Buffer) => {
    console.log(`Received output: ${data.toString()}`);
    name += data.toString();
  });

  // Pass input to container's stdin
  stream.write(input);
  await container.wait();
  return name;
}
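One likely culprit in code like this: `AttachStdin` alone does not give the container a stdin pipe; the Docker Engine API also wants `OpenStdin` (and usually `StdinOnce`) set at creation time, the command must be passed as `Cmd` (capitalized), and the attach stream should be ended so the Python script sees EOF. A minimal sketch of the create options, hedged against the Docker Engine API field names:

```javascript
// Sketch (assumed Docker Engine API field names): AttachStdin alone is
// not enough; the container must also be created with OpenStdin so a
// stdin pipe exists, and the command key is Cmd, not cmd.
function buildStdinOptions(image, cmd) {
  return {
    Image: image,
    Cmd: cmd,
    AttachStdin: true,
    AttachStdout: true,
    AttachStderr: true,
    OpenStdin: true,   // allocate a stdin pipe for the container
    StdinOnce: true,   // close stdin once the attached client detaches
    Tty: false,
  };
}

// Assumed dockerode flow (not run here):
// const container = await docker.createContainer(
//   buildStdinOptions('name-to-stdname', dockerCmd));
// await container.start();
// const stream = await container.attach(
//   { stream: true, stdin: true, stdout: true, hijack: true });
// stream.write(input);
// stream.end(); // signal EOF so a blocking read on stdin returns
```

Ending the stream matters: a script that reads stdin to EOF (e.g. `sys.stdin.read()`) will otherwise block forever and `container.wait()` never resolves.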

Related

Node.js child_process.spawn no output using stdio: 'inherit'

I'm using Node.js child_process.spawn() to execute a few command lines in CMD and get the output.
I have encountered a few issues:
When I try to spawn the process without the stdio: 'inherit' option, the CMD freezes after executing the last command and won't print out the results.
When I add the stdio: 'inherit' option, I get the results printed to my terminal, but I can't catch the output with child.stdout.on.
Is there any possible way to capture the terminal output, or to keep the process from getting stuck?
function executeCommands () {
  const firstCommand = 'do something1'
  const secondCommand = 'do something2'
  const thirdCommand = 'do something3'
  let child = require('child_process').spawn(
    `${firstCommand} && ${secondCommand} && ${thirdCommand}`,
    [],
    { shell: true, stdio: 'inherit' }
  )
  child.stdout.setEncoding('utf8')
  child.stdout.on('data', (data) => {
    console.log('stdout', data)
  })
  child.stdio.on('data', (data) => {
    console.log('stdio', data)
  })
  child.stderr.on('data', (data) => {
    console.log('stderr', data.toString())
  })
}
Use child_process
const { execSync } = require("node:child_process");
const npmVersion = execSync("npm -v", { encoding: "utf-8" });
console.log(npmVersion);
// 8.15.0
If you want to use spawnSync:
const { spawnSync } = require("node:child_process");
const npmVersion = spawnSync("npm", ["-v"], { encoding: "utf-8" });
console.log(npmVersion.stdout);
// 8.15.0

string to bufferstream not always writing data

I have a cloud function receiving a JSON string on a Pub/Sub topic.
The goal is to extract some data into a new JSON string,
then parse it as JSONL,
and finally stream it to Google Cloud Storage.
I notice that sometimes the files contain data and sometimes they do not.
The Pub/Sub side is working fine and data is coming into this cloud function just fine.
I tried adding some async/awaits where it seemed they might fit, but I'm afraid it has to do with the bufferstream; both are topics I have trouble getting my head around.
What could be the issue?
const stream = require('stream');
const { Storage } = require('@google-cloud/storage');

// Initiate the source
const bufferStream = new stream.PassThrough();
// Creates a client
const storage = new Storage();

// save stream to bucket
const toBucket = (message, filename) => {
  // Write your buffer
  bufferStream.end(Buffer.from(message));
  const myBucket = storage.bucket(process.env.BUCKET);
  const file = myBucket.file(filename);
  // Pipe the 'bufferStream' into a 'file.createWriteStream' method.
  bufferStream.pipe(file.createWriteStream({
    validation: 'md5',
  }))
    .on('error', (err) => { console.error(err); })
    .on('finish', () => {
      // The file upload is complete.
      console.log(`${filename} is uploaded`);
    });
};

// extract correct fields
const extract = (entry) => ({
  id: entry.id,
  status: entry.status,
  date_created: entry.date_created,
  discount_total: entry.discount_total,
  discount_tax: entry.discount_tax,
  shipping_total: entry.shipping_total,
  shipping_tax: entry.shipping_tax,
  total: entry.total,
  total_tax: entry.total_tax,
  customer_id: entry.customer_id,
  payment_method: entry.payment_method,
  payment_method_title: entry.payment_method_title,
  transaction_id: entry.transaction_id,
  date_completed: entry.date_completed,
  billing_city: entry.billing.city,
  billing_state: entry.billing.state,
  billing_postcode: entry.billing.postcode,
  coupon_lines_id: entry.coupon_lines.id,
  coupon_lines_code: entry.coupon_lines.code,
  coupon_lines_discount: entry.coupon_lines.discount,
  coupon_lines_discount_tax: entry.coupon_lines.discount_tax,
});

// format json to jsonl
const format = async (message) => {
  let jsonl;
  try {
    // extract only the necessary
    const jsonMessage = JSON.parse(message);
    const rows = jsonMessage.map((row) => {
      const extractedRow = extract(row);
      return `${JSON.stringify(extractedRow)}\n`;
    });
    // join all lines as one string with no join symbol
    jsonl = rows.join('');
    console.log(jsonl);
  } catch (e) {
    console.error('jsonl conversion failed');
  }
  return jsonl;
};

exports.jsonToBq = async (event, context) => {
  const message = Buffer.from(event.data, 'base64').toString();
  const { filename } = event.attributes;
  console.log(filename);
  const jsonl = await format(message, filename);
  toBucket(jsonl, filename);
};
It's fixed by moving the `bufferStream` const into the `toBucket` function.

Create a file from user input Node.js

So I'm making a Discord bot, and I want to have a file called createcommand.js, so that when you run it with node createcommand.js it starts asking for user input. It asks "What should the name be?" and creates a file from the input with that name; then it asks "What should it send?" and adds the command to the file. I've tried this (https://pastebin.com/UARJcExh):
#!/usr/bin/env node
const fs = require('file-system');
const ncp = require('ncp');

const init = async function () {
  const modulee = require('../index.js')
  const readline = require('readline').createInterface({
    input: process.stdin,
    output: process.stdout
  })
  const name =
    readline.question(`Welcome, what should the name of the command be?`, (input) => {
      js.name = input
      let config = {}
      readline.question(`What should it output`, (input) => {
        js.output = input;
        readline.close();
      })
    })
}

var js = `
exports.run = async (client, message, args, level) => {
  message.channel.send(output)
};
exports.conf = {
  enabled: true,
  guildOnly: false,
  aliases: [],
  permLevel: "User"
};
exports.help = {
  name: "support",
  category: "Miscelaneous",
  description: "",
  usage: name
};
module.exports = config;`;

fs.appendFile(`${js.name}`, js, function (err) {
  if (err) {
    throw new Error(err);
  }
})
init();
But it doesn't work.

Testing custom process.stdio mock hangs in Jest

I am trying to create a node module based on Node's internal process.stdio: an object with stdin, stdout and stderr that is not tied to process in any way, to pass around and mock.
It is proving difficult to test this file for some reason. Even the minimal test of new Stdio() seems to "block" or "hang" Jest. --forceExit works, but there is other odd behavior too. Is there anything specific about the code below that would cause the process to hang?
Here it is:
const tty = require('tty')

export function getStdout () {
  var stdout
  const fd = 1
  stdout = new tty.WriteStream(fd)
  stdout._type = 'tty'
  stdout.fd = fd
  stdout._isStdio = true
  stdout.destroySoon = stdout.destroy
  stdout._destroy = function (er, cb) {
    // Avoid errors if we already emitted
    er = er || new Error('ERR_STDOUT_CLOSE')
    cb(er)
  }
  // process.on('SIGWINCH', () => stdout._refreshSize())
  return stdout
}

export function getStderr () {
  var stderr
  const fd = 2
  stderr = new tty.WriteStream(fd)
  stderr._type = 'tty'
  stderr.fd = fd
  stderr._isStdio = true
  stderr.destroySoon = stderr.destroy
  stderr._destroy = function (er, cb) {
    // Avoid errors if we already emitted
    er = er || new Error('ERR_STDOUT_CLOSE')
    cb(er)
  }
  // process.on('SIGWINCH', () => stderr._refreshSize())
  return stderr
}

export function getStdin () {
  var stdin
  const fd = 0
  stdin = new tty.ReadStream(fd, {
    highWaterMark: 0,
    readable: true,
    writable: false
  })
  stdin.fd = fd
  stdin.on('pause', () => {
    if (!stdin._handle) { return }
    stdin._readableState.reading = false
    stdin._handle.reading = false
    stdin._handle.readStop()
  })
  return stdin
}

export function setupStdio () {
  var stdio = {}
  Object.defineProperty(stdio, 'stdout', {
    configurable: true,
    enumerable: true,
    get: getStdout
  })
  Object.defineProperty(stdio, 'stderr', {
    configurable: true,
    enumerable: true,
    get: getStderr
  })
  Object.defineProperty(stdio, 'stdin', {
    configurable: true,
    enumerable: true,
    get: getStdin
  })
  return stdio
}

export default class Stdio {
  constructor () {
    const { stdin, stderr, stdout } = setupStdio()
    this.stdin = stdin
    this.stderr = stderr
    this.stdout = stdout
    return this
  }
}
This hangs the process in Jest, why?
import Stdio from './index'

test('works', () => {
  const x = new Stdio()
  expect(x).toBeTruthy()
})

node streams - get maximum call stack exceeded

I can't seem to figure out why I'm getting this error with my stream pipeline. I think I have exhausted all paths, so is there something I'm missing? Here is what I have:
var myCsvStream = fs.createReadStream('csv_files/myCSVFile.csv');
var csv = require('fast-csv');
var myData = [];

var myFuncs = {
  parseCsvFile: function (filepath) {
    var csvStream;
    csvStream = csv
      .parse({ headers: true, objectMode: true, trim: true })
      .on('data', function (data) {
        myData.push(data);
      })
      .on('end', function () {
        console.log('done parsing counties')
      });
    return csvStream;
  }
}

myCsvStream
  .pipe(myFuncs.parseCsvFile())
  .pipe(process.stdout);
The process.stdout is just so I can see that the data can continue on to the next stream; however, when adding pipe(process.stdout), or even a through2 duplex stream, I get this maximum call stack exceeded error. Any ideas?
I think you should write it this way:
var myCsvStream = fs.createReadStream('csv_files/myCSVFile.csv');
var csv = require('fast-csv');

var csvStream = csv
  .parse({ headers: true, objectMode: true, trim: true })
  .on('data', function (data) {
    myData.push(data);
  })
  .on('end', function () {
    console.log('done parsing counties')
  });

myCsvStream
  .pipe(csvStream)
  .pipe(process.stdout);
After you can wrap it all up in a function.