I'm trying to build an MP4 converter, but if the client cancels the request, the conversion keeps running. How do I stop the conversion when the client cancels the request?
Converter code:
// Start the ffmpeg child process
const ffmpegProcess = cp.spawn(ffmpeg, [
// Remove ffmpeg's console spamming
'-loglevel', '8', '-hide_banner',
// Redirect/Enable progress messages
'-progress', 'pipe:3',
// Set inputs
'-i', 'pipe:4',
'-i', 'pipe:5',
// Map audio & video from streams
'-map', '0:a',
'-map', '1:v',
// Keep encoding
'-c:v', 'copy',
// Define output file
token + formato,
], {
windowsHide: true,
stdio: [
/* Standard: stdin, stdout, stderr */
'inherit', 'inherit', 'inherit',
/* Custom: pipe:3, pipe:4, pipe:5 */
'pipe', 'pipe', 'pipe', 'pipe'
],
});
ffmpegProcess.stdio[3].on('data', (err) => {
console.log(res.status())
});
audio.pipe(ffmpegProcess.stdio[4]);
video.pipe(ffmpegProcess.stdio[5]);
ffmpegProcess.stdio[6].on('error', (err) => {
// Remove the token from the object
delete inUseTokens[token];
res.status(500).send(err.message);
});
ffmpegProcess.stdio[6].on('close', () => {
console.log("converted!")
res.render('downloated', {formato: formato, title: titulo, token: token, thumbnail: thumbnail, seconds: seconds})
})
});
I don't know how to solve this...
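One approach (a sketch, not from the original code): Node/Express requests emit 'close' when the client disconnects, and the spawned child can be stopped with .kill(). The helper name killOnClientCancel and the exact wiring are my own; adapt them to the handler above.

```javascript
// Sketch: kill the ffmpeg child when the client disconnects before the
// response has been sent. `req`/`res` are the usual Express/http objects
// and `proc` is the handle returned by cp.spawn. The function name and
// wiring are illustrative, not from the original code.
function killOnClientCancel(req, res, proc) {
  req.on('close', () => {
    // If the response already finished, 'close' is just the normal end
    // of the connection; only treat it as a cancellation otherwise.
    if (!res.writableEnded) {
      proc.kill('SIGKILL'); // stop the conversion immediately
    }
  });
}
```

In the handler above you would call killOnClientCancel(req, res, ffmpegProcess) right after spawning, and this is also a natural place to delete inUseTokens[token] and remove the partial output file.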
I'm using this code to stream into a file, but the created file is empty. Is there something wrong with my code?
const fileStream = pinoms.prettyStream(
{
prettyPrint: {
colorize: true,
levelFirst: true,
translateTime: "yyyy-dd-mm, h:MM:ss TT",
},
},
pinoms.destination({
dest: './my-file', // omit for stdout
minLength: 4096, // Buffer before writing
sync: true}) // Synchronous logging
)
const streams = [
{stream: fileStream}
]
const logger = pinoms(pinoms.multistream(streams))
logger.info('HELLO %s!', 'World')
In the documentation it says:
const prettyStream = pinoms.prettyStream(
{
prettyPrint:
{ colorize: true,
translateTime: "SYS:standard",
ignore: "hostname,pid" // add 'time' to remove timestamp
},
prettifier: require('pino-pretty') // not required, just an example of setting prettifier
// as well it is possible to set destination option
}
);
So it should be possible.
PS: I know there is the option to pass in a write stream created with fs, but I want the time formatted.
I found a workable solution for me. Since v7, pino provides the multistream function itself. Now I can do everything I wanted: use destination and also make the timestamp pretty.
const pino = require('pino')
const pretty = require('pino-pretty')
const streams = [
{stream: pino.destination('test.log')},
{stream: pretty({
colorize: true,
sync: true,
})}
]
const logger = pino({level: 'info', timestamp: pino.stdTimeFunctions.isoTime}, pino.multistream(streams))
logger.info('HELLO %s!', 'World')
I get the following error when I try to run the command: artillery run realtime_transcribing_test.yaml:
TypeError: Cannot read property 'capture' of null.
realtime_transcribing_test.yaml:
config:
target: "ws://localhost:8001/calls/live-calls"
processor: "./binary-payload.js"
phases:
- duration: 60
arrivalRate: 5
scenarios:
- engine: "ws"
flow:
- send:
rate: 48000
format: 1
language: "en-IN"
user_id: "Test client"
- think: 1
- loop:
- function: "sendBinaryData"
- send: "{{payload}}"
- think: 1
count: 100
binary-payload.js:
module.exports = {
sendBinaryData
};
function sendBinaryData(userContext, events, done) {
navigator.mediaDevices
.getUserMedia({ audio: true, video: false })
.then(stream => {
const mediaRecorder = new MediaRecorder(stream, {
mimeType: 'audio/webm',
});
mediaRecorder.addEventListener('dataavailable', event => {
if (event.data.size > 0) {
userContext.vars.payload = event.data;
}
});
mediaRecorder.start(100);
setTimeout(event => {
mediaRecorder.stop();
}, 100);
});
return done();
}
Both files are placed in the same directory. From my findings so far, this is a very generic error thrown by Artillery. I have also verified that the YAML file is valid. Please help me understand the issue with my configuration.
I am using a Node.js 12.x Lambda function to invoke certain commands on one of my EC2 instances. I have made sure that:
the SSM agent is installed on the EC2 instance;
proper roles are assigned to the Lambda function, with the policies AmazonEC2ReadOnlyAccess, AmazonSSMFullAccess, and AWSLambdaExecute.
Below is the lambda code:
var AWS = require('aws-sdk');
const ssm = new AWS.SSM();
AWS.config.update({region:'ap-south-1'});
exports.handler = function(event, context) {
var ec2 = new AWS.EC2();
ec2.describeInstances(function(err, data) {
if(err) {
console.log(err, err.stack);
}
else {
let instance = data.Reservations[0].Instances[0].InstanceId;
console.log("\nInstance Id: ", instance);
ssm.sendCommand({
DocumentName: "AWS-RunShellScript",
InstanceIds: [ instance ],
TimeoutSeconds: 3600,
Parameters: {
commands: ['ifconfig']
}
}, function(err, data) {
if (err) {
console.log("\nError:", err);
} else {
console.log("\nSuccess: ", data);
}
context.done(null, 'Function Finished!');
})
}
});
};
When I invoke this function manually I am getting the status as pending. Below is the output log.
Response:
"Function Finished!"
Request ID:
"748b280a-4277-42a1-a0c3-************"
Function logs:
START RequestId: 748b280a-4277-42a1-a0c3-************ Version: $LATEST
2020-11-05T08:52:26.895Z 748b280a-4277-42a1-a0c3-************ INFO
Inside describe instances:
2020-11-05T08:52:26.952Z 748b280a-4277-42a1-a0c3-************ INFO
Instance Id: i-016f4673e082a829e
2020-11-05T08:52:27.237Z 748b280a-4277-42a1-a0c3-************ INFO
Success: {
Command: {
CommandId: '8b7a3b6d-4a7a-4259-9c82-************',
DocumentName: 'AWS-RunShellScript',
DocumentVersion: '',
Comment: '',
ExpiresAfter: 2020-11-05T10:52:27.220Z,
Parameters: { commands: [Array] },
InstanceIds: [ 'i-****************' ],
Targets: [],
RequestedDateTime: 2020-11-05T08:52:27.220Z,
Status: 'Pending',
StatusDetails: 'Pending',
OutputS3BucketName: '',
OutputS3KeyPrefix: '',
MaxConcurrency: '50',
MaxErrors: '0',
TargetCount: 1,
CompletedCount: 0,
ErrorCount: 0,
DeliveryTimedOutCount: 0,
ServiceRole: '',
NotificationConfig: {
NotificationArn: '',
NotificationEvents: [],
NotificationType: ''
},
CloudWatchOutputConfig: { CloudWatchLogGroupName: '', CloudWatchOutputEnabled: false },
TimeoutSeconds: 3600
}
}
END RequestId: 748b280a-4277-42a1-a0c3-************
REPORT RequestId: 748b280a-4277-42a1-a0c3-************ Duration: 677.90 ms Billed Duration: 700 ms Memory Size: 128 MB Max Memory Used: 96 MB
Why is the status not success? When I manually use 'RunCommand' it works properly.
What am I doing wrong?
The command status shows as Pending because sendCommand is asynchronous: it only queues the command, and the callback fires before the command has actually run. The status then moves from Pending to InProgress to Success (or Failed).
If you take the command ID (CommandId: '8b7a3b6d-4a7a-4259-9c82-************' in the output above) and look it up in Systems Manager > Run Command, by the time you search for it, it will already show Success or Failed.
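If you want the Lambda itself to wait for the final result, you can poll with SSM's getCommandInvocation. A sketch in the same AWS SDK v2 callback style as the code above; the helper name waitForCommand and the 1-second interval are my choices:

```javascript
// Poll until the command leaves the Pending/InProgress states, then hand
// the final invocation (Status, StandardOutputContent, ...) to the callback.
function waitForCommand(ssm, commandId, instanceId, cb) {
  ssm.getCommandInvocation(
    { CommandId: commandId, InstanceId: instanceId },
    function (err, data) {
      if (err) return cb(err);
      if (data.Status === 'Pending' || data.Status === 'InProgress') {
        // Not finished yet: check again in a second.
        return setTimeout(function () {
          waitForCommand(ssm, commandId, instanceId, cb);
        }, 1000);
      }
      cb(null, data); // Success, Failed, Cancelled, TimedOut, ...
    }
  );
}
```

You would call it from the sendCommand callback with data.Command.CommandId and the instance id, and only call context.done once it finishes. Note that getCommandInvocation can briefly return an InvocationDoesNotExist error right after sendCommand, so a production version should retry on that too.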
I am trying to create a script for email verification: when we create an account on the site, a verification mail is sent to the given address, and we then have to open that mail and verify it (by clicking the link or fetching the code). I tried this solution, but I am stuck on a specific error.
This is the code I am trying:
describe("Sample test case", function () {
function getLastEmail() {
let deferred = protractor.promise.defer();
console.log("Waiting for an email...");
mailListener.on("mail", function(mail){
deferred.fulfill(mail);
});
return deferred.promise;
}
beforeAll(function () {
browser.waitForAngularEnabled(false);
browser.get("https://mail.google.com/mail/");
isAngularSite(false);
browser.sleep(3000);
});
it("should login with a registration code sent to an email", function () {
element(by.id("username")).sendKeys("MyemailID");
element(by.id("password")).sendKeys("mypassword");
element(by.id("loginButton")).click();
browser.controlFlow().await(getLastEmail()).then(function (email) {
expect(email.subject).toEqual("BillPledge Email Verification");
expect(email.headers.to).toEqual("support@billpledge.com");
// extract registration code from the email message
let pattern = /Registration code is: (\w+)/g;
let regCode = pattern.exec(email.text)[1];
console.log(regCode);
});
});
});
and this is my conf file.
// An example configuration file.
exports.config = {
// The address of a running selenium server.
// seleniumAddress: 'http://localhost:4444/wd/hub',
// if you are using protractor's webdriver-manager locally, you cannot use seleniumAddress
// If the webdriver-manager needs to start a local server, you can use
selenium: 'http://localhost:4445/wd/hub',
seleniumPort: 4445, // Port matches the port above
// Capabilities to be passed to the webdriver instance.
capabilities: {
'browserName': 'chrome'
},
// Spec patterns are relative to the current working directly when
// protractor is called.
specs: ['./e2eTest/GmailTest.js'],
// Options to be passed to Jasmine-node.
jasmineNodeOpts: {
showColors: true,
defaultTimeoutInterval: 300000
},
allScriptsTimeout: 200000,
onPrepare: function () {
global.isAngularSite = function (flag) {
browser.ignoreSynchronization = !flag;
};
browser.driver.manage().window().maximize();
//To generate the report.
let HtmlReporter = require('protractor-beautiful-reporter');
jasmine.getEnv().addReporter(new HtmlReporter({
baseDirectory: 'Results/Report'
}).getJasmine2Reporter());
let reporter = new HtmlReporter({
baseDirectory: 'Results/Report'
});
let path = require('path');
new HtmlReporter({
baseDirectory: 'Results/Report'
, preserveDirectory: false
, cssOverrideFile: 'css/style.css'
, takeScreenShotsForSkippedSpecs: true
, screenshotsSubfolder: 'images'
, jsonsSubfolder: 'jsons'
, takeScreenShotsOnlyForFailedSpecs: false
, gatherBrowserLogs: true
, pathBuilder: function pathBuilder(spec, descriptions, results, capabilities) {
// Return '<browser>/<specname>' as path for screenshots:
// Example: 'firefox/list-should work'.
return path.join(capabilities.caps_.browser, descriptions.join('-'));
}
, metaDataBuilder: function metaDataBuilder(spec, descriptions, results, capabilities) {
// Return the description of the spec and if it has passed or not:
return {
description: descriptions.join(' ')
, passed: results.passed()
};
}
});
let MailListener = require("mail-listener2");
// here goes your email connection configuration
let mailListener = new MailListener({
username: "myemailid",
password: "mypassword",
host: "imap.gmail.com",
port: 993, // imap port
tls: true,
tlsOptions: {rejectUnauthorized: false},
mailbox: "INBOX", // mailbox to monitor
searchFilter: ["UNSEEN", "FLAGGED"], // the search filter being used after an IDLE notification has been retrieved
markSeen: true, // all fetched email will be marked as seen and not fetched next time
fetchUnreadOnStart: true, // use it only if you want to get all unread email on lib start. Default is `false`,
mailParserOptions: {streamAttachments: true}, // options to be passed to mailParser lib.
attachments: true, // download attachments as they are encountered to the project directory
attachmentOptions: {directory: "attachments/"} // specify a download directory for attachments
});
mailListener.start();
mailListener.on("server:connected", function () {
console.log("Mail listener initialized");
});
global.mailListener = mailListener;
},
onCleanUp: function () {
mailListener.stop();
},
};
But while executing the script I get the following errors:
Error: Please log in via your web browser:
https://support.google.com/mail/accounts/answer/78754 (Failure)
and
Failed: browser.controlFlow(...).await is not a function
I know I am making a mistake somewhere, but I am not able to figure it out. Can anyone help point it out and solve it, or at least offer some suggestion that would help get this script running properly?
Thanks
Try using browser.wait(getLastEmail()) instead of browser.controlFlow().await(getLastEmail()): the control flow object has no await method, while browser.wait accepts a promise and blocks until it resolves.
I have a local instance of MongoDB, started with --master, and I can see in the startup log that a replica set oplog (file: /data/db/local.1) is created. It appears local.1 is updated every time I perform an insert().
I am trying to tail / stream the oplog with the following code...
MongoDB.connect('mongodb://localhost:27017/test',(e,db) => {
if ( e ) { return console.log('Connect MongoDB Error',e); }
console.log('Connected MongoDB');
Server.DB = db;
db.collection('oplog.rs').find({}).toArray((e,d) => {
console.log('oplog.rs');
console.log(e);
console.log(d);
})
var updates = db.collection('oplog.rs').find({},{
tailable: true,
awaitData: true,
noCursorTimeout: true,
oplogReplay: true,
numberOfRetries: Number.MAX_VALUE
}).stream();
updates.on('data',(data) => { console.log('data',data); });
updates.on('error',(e) => { console.log('error',e); });
updates.on('end',(end) => { console.log('end',end); });
});
The code logs the following on start:
oplog.rs
null
[]
However, there is no output from the stream. If I set numberOfRetries to a lower value, I get the error...
error { MongoError: No more documents in tailed cursor
at Function.MongoError.create (/Users/peter/node_modules/mongodb-core/lib/error.js:31:11)
at nextFunction (/Users/peter/node_modules/mongodb-core/lib/cursor.js:637:50)
at /Users/peter/node_modules/mongodb-core/lib/cursor.js:588:7
at queryCallback (/Users/peter/node_modules/mongodb-core/lib/cursor.js:211:18)
at Callbacks.emit (/Users/peter/node_modules/mongodb-core/lib/topologies/server.js:116:3)
at Connection.messageHandler (/Users/peter/node_modules/mongodb-core/lib/topologies/server.js:282:23)
at Socket.<anonymous> (/Users/peter/node_modules/mongodb-core/lib/connection/connection.js:273:22)
at emitOne (events.js:96:13)
at Socket.emit (events.js:189:7)
at readableAddChunk (_stream_readable.js:176:18)
name: 'MongoError',
message: 'No more documents in tailed cursor',
tailable: true,
awaitData: true }
end undefined
oplog.rs is used by replica sets. By starting mongod with --master you are using long-deprecated master-slave replication, which writes to the local.oplog.$main collection, so there is nothing in oplog.rs.
You need to start mongod with --replSet to get a replica set oplog. Read more about how to deploy and configure replica sets.
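For example, for a single-node development setup (the replica set name rs0 is arbitrary): start the server with mongod --replSet rs0 instead of --master, then initiate the set once from the mongo shell (which is a JavaScript REPL):

```javascript
// In the mongo shell, after starting `mongod --replSet rs0`:
rs.initiate()                                  // create the one-member replica set
rs.status().myState                            // 1 once this node has become PRIMARY
db.getSiblingDB("local").getCollectionNames()  // now includes "oplog.rs"
```

After that, the tailable-cursor code in the question should start receiving documents from oplog.rs.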