Twilio agent conference outbound call recording - javascript

I would like to get the recordings for the outbound calls that I make. Basically, we have a conference which I join, and then I use the outbound call API to add the destination calls to it. See the code below.
Code to create the conference:
const conference_props = {
  beep: true,
  startConferenceOnEnter: true,
  endConferenceOnExit: false,
  maxParticipants: 2,
  eventCallbackUrl: '/callback',
  record: 'record-from-start',
  recordingStatusCallback: '/callback',
  trim: true
};
const dial = twiml.dial();
dial.conference('conferencename', conference_props);
Code to add the calls:
phone
  .conferences('conferencename')
  .participants.create({
    to: '+123455666',
    from: '+123455666',
    earlyMedia: true,
    record: true,
    trim: true,
    startConferenceOnEnter: true,
    endConferenceOnExit: true,
    conferenceStatusCallback: '/callback'
  })
  .then(participant => console.log(participant.sid), (err) => {
    console.log(err);
  });
However, the only callbacks I get are at the end of the conference. This previously worked fine when using the old-style conferences, dialling a call and then adding it to the conference on connect, but I want the early media (ringing noise) that the agent conference provides.

I have resolved this. The way I did it was to modify the outbound call part: the URL has to be absolute, e.g. https://www.blah.com/callback, and it has to be set on statusCallback, not conferenceStatusCallback.
So:
phone
  .conferences('conferencename')
  .participants.create({
    to: '+123455666',
    from: '+123455666',
    earlyMedia: true,
    record: true,
    trim: true,
    startConferenceOnEnter: true,
    endConferenceOnExit: true,
    statusCallback: 'https://www.example.com/callback'
  })
  .then(participant => console.log(participant.sid), (err) => {
    console.log(err);
  });
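If you also want to pull the recordings afterwards rather than only reacting to the callback, here is a minimal sketch using the standard twilio-node client. It assumes you already have the conference SID (for example from the status callback payload); the client setup and the placeholder SID are assumptions, not part of the original code.

// Sketch: list recordings for a known conference with twilio-node.
const twilio = require('twilio');
const client = twilio(process.env.TWILIO_ACCOUNT_SID, process.env.TWILIO_AUTH_TOKEN);

client.conferences('CFXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX') // conference SID, e.g. from the callback
  .recordings
  .list({ limit: 20 })
  .then(recordings => recordings.forEach(r => console.log(r.sid, r.duration, r.status)))
  .catch(err => console.error(err));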

Related

Quickblox WebRTC issue: onCallListener is not working inside my react app

So, basically I'm trying to receive a call from the provider in my app. For that purpose QuickBlox gives us a listener for incoming calls, onCallListener. Here is my code snippet, which should work but doesn't.
const calleesIds = [4104]
const sessionType = QB.webrtc.CallType.VIDEO
const additionalOptions = {}
let callSession = QB.webrtc.createNewSession(calleesIds, sessionType, null, additionalOptions)
console.log(callSession, "SESSION")

const mediaParams = {
  audio: true,
  video: true,
  options: {
    muted: true,
    mirror: true,
  },
  elemId: "myVideoStream"
}

QB.webrtc.onCallListener = function(session: any, extension: object) {
  callSession = session
  console.log('asdasd')
  // if you are going to take a call
  session.getUserMedia(mediaParams, function (error: object, stream: object) {
    if (error) {
      console.error(error)
    } else {
      session.accept(extension)
      session.attachMediaStream("videoStream", stream)
    }
  })
}
P.S. I also integrated chat, which works perfectly!
Found the solution myself! When you create a user and a dialog id, search for that user in the QuickBlox dashboard by the dialogId and check its settings: you will see that userId and providerId are the same, which is wrong. Put your userId in the userId field and save that. After that your video calling listeners will work fine.
P.S. Also, on the backend, replace the provider token with the user token.
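For context, onCallListener only fires when another client actually starts a call towards you. A minimal sketch of the caller side using the QuickBlox WebRTC API (the empty extension object and the reuse of calleesIds/mediaParams from the snippet above are assumptions) would be:

// Caller side (sketch): create a session to the callee(s), grab media, then start the call.
const outgoingSession = QB.webrtc.createNewSession(calleesIds, QB.webrtc.CallType.VIDEO)
outgoingSession.getUserMedia(mediaParams, function (error, stream) {
  if (error) {
    console.error(error)
  } else {
    outgoingSession.attachMediaStream("myVideoStream", stream)
    outgoingSession.call({}) // extension payload is app-specific
  }
})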

Trouble with React Native Push Notification

I'm currently working on an Android mobile app.
It's a kitchen recipes app. The app sends notifications to the user during the day.
In the settings of the app, the user can choose how many notifications they receive and at what times (11 am to 7 pm, for example).
This is where the problem begins:
I use the react-native-push-notification library with the following code:
static LocalNotif(string)
{
  PushNotification.localNotification({
    vibrate: true, // (optional) default: true
    vibration: 300, // vibration length in milliseconds, ignored if vibrate=false, default: 1000
    title: "Vérifier vos produit", // (optional)
    message: string, // (required)
    largeIcon: "ic_launcher",
    smallIcon: "ic_notification",
  });
}
Next, I use react-native-background-fetch to send a notification even if the app is not running:
static async backFetch(delay_to_next_notif)
{
  BackgroundFetch.configure({
    minimumFetchInterval: 3600
  }, async (taskId) => {
    // This is the fetch-event callback.
    console.log("[BackgroundFetch] taskId: ", taskId);
    // Use a switch statement to route task-handling.
    switch (taskId) {
      case 'com.foo.customtask':
        this.LocalNotif("test")
        break;
      default:
        console.log("Default fetch task");
    }
    // Finish, providing received taskId.
    BackgroundFetch.finish(taskId);
  });
  // Step 2: Schedule a custom "oneshot" task "com.foo.customtask" to execute 5000ms from now.
  BackgroundFetch.scheduleTask({
    taskId: "com.foo.customtask",
    forceAlarmManager: true,
    delay: delay_to_next_notif // <-- milliseconds
  });
}
The behaviour of react-native-background-fetch is very strange: sometimes I never receive the notification.
Is it possible to use a push notification library and create a routine so that the user receives notifications at specific times during the day, even if the app is not running?
You can use the PushNotification.configure method and set your state depending on whether your app is in the foreground or background, something like this:
async componentDidMount() {
  await this.requestUserPermission();
  PushNotification.configure({
    onNotification: (notification) => {
      console.log('NOTIFICATION', notification);
      if (notification.foreground === false) {
        console.log('app is in background')
      }
      this.setState({
        msg: notification.message.body
          ? notification.message.body
          : notification.message,
        forground: notification.foreground,
      });
    },
  });
}
and in your return you can do something like this:
{this.state.forground === true
  ? showMessage({
      message: this.state.msg,
      backgroundColor: '#1191cf',
      type: 'default',
      duration: 10000,
      icon: 'success',
      onPress: () => {
        console.log('app is in forground')
      },
    })
  : null}
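For the original question of firing notifications at fixed times of day even when the app is closed, a minimal sketch using react-native-push-notification's localNotificationSchedule could look like the following. The hour calculation and the scheduleDailyNotif helper are assumptions, not part of the original code.

// Sketch: schedule a repeating local notification for a fixed hour of the day.
// `hour` is a hypothetical value taken from the user's settings.
static scheduleDailyNotif(hour, message) {
  const fireDate = new Date();
  fireDate.setHours(hour, 0, 0, 0);
  if (fireDate <= new Date()) {
    // If that time has already passed today, start tomorrow.
    fireDate.setDate(fireDate.getDate() + 1);
  }
  PushNotification.localNotificationSchedule({
    message: message,     // (required)
    date: fireDate,       // first trigger time
    repeatType: 'day',    // repeat every day
    allowWhileIdle: true, // fire even while the device is dozing
  });
}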

How to reconnect Pubnub after internet reconnect in javascript?

I have a PubNub instance, and I want to know how to handle reconnection when the internet goes down and comes back up, with a given number of retries. The documentation points to the right pieces, but I am unable to put them into code.
Help would be greatly appreciated.
My code:
this.pubnub = new PubNub({
  subscribeKey: this.serverDetails.authInfo.subscribeKey,
  authKey: this.serverDetails.authInfo.authKey,
  uuid,
  restore: true,
  ssl: true
});
this.listeners = {
  message: msgEvent => {
    console.log(msgEvent);
  },
  status: statusEvent => {
  }
};
this.pubnub.addListener(this.listeners);
Set restore: true in your init code.
this.pubnub = new PubNub({
  subscribeKey: this.serverDetails.authInfo.subscribeKey,
  authKey: this.serverDetails.authInfo.authKey,
  uuid,
  ssl: true,
  restore: true // this allows reconnect to restore your channel subscription
});
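For the retry part of the question, a minimal sketch of a status listener that reacts to the network categories and calls pubnub.reconnect() a limited number of times might look like this. The retry counter and limit are assumptions added for illustration, not part of the PubNub API.

// Sketch: retry reconnection a bounded number of times when the network drops.
const MAX_RETRIES = 5; // assumed limit
let retryCount = 0;    // assumed counter

this.listeners = {
  message: msgEvent => {
    console.log(msgEvent);
  },
  status: statusEvent => {
    if (statusEvent.category === 'PNNetworkDownCategory' && retryCount < MAX_RETRIES) {
      retryCount++;
      this.pubnub.reconnect(); // ask the SDK to re-establish connectivity
    }
    if (statusEvent.category === 'PNNetworkUpCategory' ||
        statusEvent.category === 'PNReconnectedCategory') {
      retryCount = 0; // back online, reset the counter
    }
  }
};
this.pubnub.addListener(this.listeners);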

WebRTC channelMessage not receiving first message

When sending a message using WebRTC sendDirectlyToAll, the message is never received the first time, but it is every time after that.
I've stripped the code down to a very simple state now, but it's still the same. Anyone got a clue about why this is happening?
Here is the code:
var webrtc = new SimpleWebRTC({
  localVideoEl: 'localVideo',
  remoteVideosEl: 'remoteVideos',
  autoRequestMedia: false,
  media: {
    video: true,
    audio: false
  },
  localVideo: {
    autoplay: true,
    mirror: true,
    muted: true
  }
});

$("#chat-send-button").on("click", function (e) {
  sendMessage();
});

function sendMessage() {
  console.log("sendMessage");
  const chatMessage = $("#chat-message-input");
  webrtc.sendDirectlyToAll(
    "chat",
    "info", {
      "chatmessage": chatMessage.val()
    }
  )
  chatMessage.val("");
}

webrtc.on("channelMessage", function (peer, channel, data) {
  console.log(peer);
  console.log(channel);
  console.log("data", data);
  $("#chat-message-container").text(data.payload.chatmessage);
});
You probably need to wait for the WebRTC connection to be established before allowing the user to send a message. Do you make use of the readyToCall event described in the documentation? https://github.com/SimpleWebRTC/SimpleWebRTC#3-tell-it-to-join-a-room-when-ready
(A link to an editable, runnable code snippet might help.)
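As a sketch of that suggestion, following the readyToCall pattern from the linked README (the room name here is a placeholder):

// Sketch: only join the room once SimpleWebRTC reports it is ready.
webrtc.on("readyToCall", function () {
  // Local media is ready; joining the room sets up peers and their data channels.
  webrtc.joinRoom("my-room"); // placeholder room name
});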

Using the node.js google cloud speech to text, how can I get the status of a current job?

I managed to trigger a job with:
const config = {
  languageCode: 'en-US',
  enableSpeakerDiarization: true,
  audioChannelCount: 2,
  enableSeparateRecognitionPerChannel: true,
  useEnhanced: true,
  profanityFilter: false,
  enableAutomaticPunctuation: true,
};
const audio = {
  uri: `gs://${filePath}`
}
const requestObj = {
  config: config,
  audio: audio
}
return speechClient.longRunningRecognize(requestObj)
I get back an object with a name. I want to use that with https://cloud.google.com/speech-to-text/docs/reference/rest/v1/LongRunningRecognizeMetadata (via the Node.js package) to get the current status of the job.
How do I do it?
return speechClient.longrunning.Operation()
seems not to exist.
Looks like you can do it with:
return speechClient.operationsClient.getOperation({ name: googleName })
This is not super well documented.
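As a sketch of what checking the status with that call might look like (the checkTranscriptionStatus helper and the gax-style [response] destructuring are assumptions, not from the original answer):

// Sketch: poll the long-running operation by name and check whether it has finished.
async function checkTranscriptionStatus(speechClient, operationName) {
  const [operation] = await speechClient.operationsClient.getOperation({ name: operationName });
  console.log('done:', operation.done);
  console.log('metadata:', operation.metadata); // LongRunningRecognizeMetadata (may need decoding)
  return operation;
}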
