Google datastore entity creation and update - javascript

I am attempting to update an entity in my datastore kind using sample code from here https://cloud.google.com/datastore/docs/reference/libraries. The actual code is something like this:
// Imports the Google Cloud client library
const Datastore = require('@google-cloud/datastore');
// Your Google Cloud Platform project ID
const projectId = 'YOUR_PROJECT_ID';
// Creates a client
const datastore = new Datastore({
  projectId: projectId,
});
// The kind for the new entity
const kind = 'Task';
// The name/ID for the new entity
const name = 'sampletask1';
// The Cloud Datastore key for the new entity
const taskKey = datastore.key([kind, name]);
// Prepares the new entity
const task = {
  key: taskKey,
  data: {
    description: 'Buy milk',
  },
};
// Saves the entity
datastore
  .save(task)
  .then(() => {
    console.log(`Saved ${task.key.name}: ${task.data.description}`);
  })
  .catch(err => {
    console.error('ERROR:', err);
  });
I tried to create a new entity using this code, but when I ran it and checked the Datastore console, no entities were created. I am also unable to update an existing entity. What could be the reason for this?
I am writing the code in Google Cloud Functions. This is the log when I run this function:
{
insertId: "-ft02akcfpq"
logName: "projects/test-66600/logs/cloudaudit.googleapis.com%2Factivity"
operation: {…}
protoPayload: {…}
receiveTimestamp: "2018-06-15T09:36:13.760751077Z"
resource: {…}
severity: "NOTICE"
timestamp: "2018-06-15T09:36:13.436Z"
}
{
insertId: "000000-ab6c5ad2-3371-429a-bea2-87f8f7e36bcf"
labels: {…}
logName: "projects/test-66600/logs/cloudfunctions.googleapis.com%2Fcloud-functions"
receiveTimestamp: "2018-06-15T09:36:17.865654673Z"
resource: {…}
severity: "ERROR"
textPayload: "Warning, estimating Firebase Config based on GCLOUD_PROJECT. Intializing firebase-admin may fail"
timestamp: "2018-06-15T09:36:09.434Z"
}

I have tried the same code and it works for me. However, I have noticed that there was a delay before the entities appeared in Datastore. In order to update and overwrite existing entities, use .upsert(task) instead of .save(task) (link to GCP documentation). You can also use .insert(task) instead of .save(task) to store new entities.
Also check that the project id is correct and that you are inspecting the entities for the right kind.
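For reference, a minimal sketch of the update path, reusing taskKey from the code above (the new description value is only an illustration):
// Creates the entity if it does not exist yet, overwrites it otherwise
const updatedTask = {
  key: taskKey,
  data: {
    description: 'Buy milk and bread',
  },
};

datastore
  .upsert(updatedTask)
  .then(() => {
    console.log(`Upserted ${updatedTask.key.name}`);
  })
  .catch(err => {
    console.error('ERROR:', err);
  });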

Related

onMessage event is not triggered when agent sends a message amazon-connect-chatjs

I'm integrating an Amazon Connect chatbot on my end and I want to establish a connection between a customer and an agent. In order to do so, I have used the onMessage event to retrieve agent messages on my platform, but right now it is not triggered.
I initially used the aws-sdk and @aws-sdk/client-connectparticipant libraries to send messages, where I have called multiple SDK APIs in this order
startChatContact -> createParticipantConnection -> sendEvent -> sendMessage
This was done in order to establish a connection between a client and an agent, so that they can send messages to each other as well. With these SDKs, I was successfully able to send messages from customer to agent, but in order to retrieve the messages back, I initially used the getTranscript API, which I was calling every 2 seconds to check for any updated messages. I also stored message IDs on my end to avoid any duplicate entries.
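Roughly, that polling looked like the sketch below (a simplified version; the connection token handling and the dedupe store are just placeholders):
const { ConnectParticipantClient, GetTranscriptCommand } = require("@aws-sdk/client-connectparticipant");

const client = new ConnectParticipantClient({ region: "us-west-2" });
const seenMessageIds = new Set(); // dedupe store (placeholder)

async function pollTranscript(connectionToken) {
  // fetch the latest transcript items for this participant connection
  const { Transcript } = await client.send(new GetTranscriptCommand({
    ConnectionToken: connectionToken,
    MaxResults: 15,
  }));
  for (const item of Transcript || []) {
    if (item.Type === "MESSAGE" && !seenMessageIds.has(item.Id)) {
      seenMessageIds.add(item.Id);
      // forward item.Content / item.ParticipantRole to the UI here
    }
  }
}

// called every 2 seconds
// setInterval(() => pollTranscript(participantConnectionToken), 2000);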
But now I'm looking for a better solution and for that, I have used amazon-connect-chatjs library and I have used the code below:
import "amazon-connect-streams";
import "amazon-connect-chatjs";
connect.contact(contact => {
if (contact.getType() !== connect.ContactType.CHAT) {
// applies only to CHAT contacts
return;
}
contact.onAccepted(() => {
const cnn = contact.getConnections().find(cnn => cnn.getType() === connect.ConnectionType.AGENT);
const agentChatSession = connect.ChatSession.create({
chatDetails: cnn.getMediaInfo(),
options: {
region: "us-west-2"
},
type: connect.ChatSession.SessionTypes.AGENT,
websocketManager: connect.core.getWebSocketManager()
});
});
});
I have also tried creating a customer chat session, but it is not working either:
const customerChatSession = connect.ChatSession.create({
  chatDetails: {
    contactId: "...",
    participantId: "...",
    participantToken: "..."
  },
  options: { // optional
    region: "us-west-2"
  },
  type: connect.ChatSession.SessionTypes.CUSTOMER
});
Apart from that, I have used the onTyping event, but it is not getting triggered either.
Please let me know if I'm missing anything or doing anything wrong here.
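For reference, this is roughly how I understand the handlers should be wired up once the session is created (a sketch based on the amazon-connect-chatjs README; the chatDetails values are placeholders):
const customerChatSession = connect.ChatSession.create({
  chatDetails: {
    contactId: "...",
    participantId: "...",
    participantToken: "..."
  },
  options: {
    region: "us-west-2"
  },
  type: connect.ChatSession.SessionTypes.CUSTOMER
});

customerChatSession.onMessage(event => {
  // fires for messages from any participant, including the agent
  console.log("message received", event.data);
});

customerChatSession.onTyping(event => {
  console.log("typing event", event.data);
});

// connect() establishes the websocket; without a successful connect, no events fire
customerChatSession.connect().then(() => {
  console.log("chat session connected");
});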
Update 1
If I add a console.log after creating the session:
const customerChatSession = connect.ChatSession.create({
  chatDetails: {
    contactId: "...",
    participantId: "...",
    participantToken: "..."
  },
  options: { // optional
    region: "us-west-2"
  },
  type: connect.ChatSession.SessionTypes.CUSTOMER
});
console.log("Here");
then it does not even print Here.
There is no error in the console.
To create the agentChatSession, a websocket manager is needed:
connect.core.getWebSocketManager()
but this is undefined as well.

Google cloud dataflow job creation error: "Cannot set worker pool zone. Please check whether the worker_region experiments flag is valid"

I am trying to create a Dataflow job to index a BigQuery table into Elasticsearch with the Node package @google-cloud/dataflow (v1beta3).
The job works fine when it is created and launched from the Google Cloud console, but I get the following error when I try it in Node:
Error: 3 INVALID_ARGUMENT: (b69ddc3a5ef1c40b): Cannot set worker pool zone. Please check whether the worker_region experiments flag is valid. Causes: (b69ddc3a5ef1cd76): An internal service error occurred.
I tried to specify the experiments parameter in various ways, but I always end up with the same error.
Has anyone managed to get a similar Dataflow job working? Or do you have any information about Dataflow experiments?
Here is the code:
const { JobsV1Beta3Client } = require('@google-cloud/dataflow').v1beta3
const dataflowClient = new JobsV1Beta3Client()
const response = await dataflowClient.createJob({
  projectId: 'myGoogleCloudProjectId',
  location: 'europe-west1',
  job: {
    launch_parameter: {
      jobName: 'indexation-job',
      containerSpecGcsPath: 'gs://dataflow-templates-europe-west1/latest/flex/BigQuery_to_Elasticsearch',
      parameters: {
        inputTableSpec: 'bigQuery-table-gs-adress',
        connectionUrl: 'elastic-endpoint-url',
        index: 'elastic-index',
        elasticsearchUsername: 'username',
        elasticsearchPassword: 'password'
      }
    },
    environment: {
      experiments: ['worker_region']
    }
  }
})
Thank you very much for your help.
After many attempts, I managed yesterday to find out how to specify the worker region.
It looks like this:
await dataflowClient.createJob({
  projectId,
  location,
  job: {
    name: 'jobName',
    type: 'Batch',
    containerSpecGcsPath: 'gs://dataflow-templates-europe-west1/latest/flex/BigQuery_to_Elasticsearch',
    pipelineDescription: {
      inputTableSpec: 'bigquery-table',
      connectionUrl: 'elastic-url',
      index: 'elastic-index',
      elasticsearchUsername: 'username',
      elasticsearchPassword: 'password',
      project: projectId,
      appName: 'BigQueryToElasticsearch'
    },
    environment: {
      workerPools: [
        { region: 'europe-west1' }
      ]
    }
  }
})
It's not working yet; I still need to find the correct way to provide the other parameters, but the Dataflow job is now created in the Google Cloud console.
For anyone who is struggling with this issue, I finally found out how to launch a Dataflow job from a template.
There is a function launchFlexTemplate that works the same way as job creation in the Google Cloud console.
Here is the final call, working correctly:
const { FlexTemplatesServiceClient } = require('@google-cloud/dataflow').v1beta3
// instantiate the Flex Templates client
const dataflowClient = new FlexTemplatesServiceClient()
const response = await dataflowClient.launchFlexTemplate({
  projectId: 'google-project-id',
  location: 'europe-west1',
  launchParameter: {
    jobName: 'job-name',
    containerSpecGcsPath: 'gs://dataflow-templates-europe-west1/latest/flex/BigQuery_to_Elasticsearch',
    parameters: {
      apiKey: 'elastic-api-key', // mandatory but not used if you provide username and password
      connectionUrl: 'elasticsearch endpoint',
      index: 'elasticsearch index',
      elasticsearchUsername: 'username',
      elasticsearchPassword: 'password',
      inputTableSpec: 'bigquery source table', // projectId:datasetId.table
      // parameters to upsert the elasticsearch index
      propertyAsId: 'table index used for elastic _id',
      usePartialUpdate: true,
      bulkInsertMethod: 'INDEX'
    }
  }
})

Dynamic filename in Winston dailyrotate for Promtail/Loki/Grafana

My NodeJS application writes logs with Winston. These logs then will be picked up by Promtail, to be saved to S3 by Loki and then processed in a dashboard in Grafana.
I want to create logs in Winston with a rotation frequency of 30 minutes. I want the logs to first be stored in my folder "/home/gad-web/gad-logs" while they are still being appended to. When they are rotated, I want to move them to "/home/gad-web/gad-logs-rotated". Promtail will be looking at this specific folder.
I want to use dynamic filenames for the different logs being written out, so that I can easily assign static labels to each file separately using Promtail, rather than having to process each log line and assign a dynamic label to each line of one large file.
my file logger.mjs looks like this (formats, levels and other irrelevant data is left out):
const logDir = '/home/gad-web/gad-logs'
const logDirRotated = '/home/gad-web/gad-logs-rotated'

let winstonGdprProofFormat = winston.format.combine(...)

let winstonDailyRotateFileTransport = new winston.transports.DailyRotateFile({
  frequency: '30m',
  format: winstonGdprProofFormat,
  filename: `${logDir}/all-gdpr-proof-%DATE%.log`,
  datePattern: 'YYYY-MM-DD HH-mm',
})

// Move the file to another location after it is rotated, so it can be picked up by Promtail
winstonDailyRotateFileTransport.on('rotate', function (oldFilenamePath, newFilenamePath) {
  let pathToMoveTo = `${logDirRotated}/${path.basename(oldFilenamePath)}`
  fs.rename(oldFilenamePath, pathToMoveTo, function (err) {
    if (err) throw err
  })
})

let winstonTransports = []
if (process.env.environment !== 'local') {
  winstonTransports.push(winstonConsoleTransport)
  winstonTransports.push(winstonDailyRotateFileTransport)
} else {
  winstonTransports.push(winstonConsoleWithColorsTransport)
}

const logger = winston.createLogger({
  level: process.env.environment !== 'local' ? 'info' : 'debug',
  levels: winstonLevels,
  transports: winstonTransports,
})

export function log (obj) {
  let { level, requestId, method, uri, msg, time, data } = obj
  if (!level) {
    level = 'info'
  }
  logger.log({
    level: level,
    requestId: requestId,
    method: method,
    uri: uri,
    msg: msg,
    time: time,
    data: data,
  })
}
It is being called in files that write logs like this:
import { log } from '../config/logger.mjs'
...
function writeRequestLog (start, request, requestId) {
  let end = new Date().getTime()
  let diff = end - start
  log({ level: 'info', requestId: requestId, method: request.method, uri: request.path, msg: null, time: `${diff}ms`, data: JSON.stringify(request.query) })
}
Since the file is imported directly, it is immediately executed, and the winstonDailyRotateFileTransport is created using ${logDir}/all-gdpr-proof-%DATE%.log as the filename. How do I go about instantiating this with a dynamic filename, so that I get log files rotated every 30 minutes for a bunch of dynamically created files?
I tried creating a class in JS, but I quickly got into trouble because of the .on('rotate', ...) handler defined on the winstonDailyRotateFileTransport, and I'm also not sure what other implications creating a class for this might have (since this logger will be used in many places in my code).
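To illustrate the direction I'm thinking of, something like this hypothetical factory (a rough, untested sketch; createRotatingTransport is just a name I made up):
// Hypothetical factory: one 30-minute rotating transport per log name,
// moving rotated files so Promtail can pick them up (untested sketch)
function createRotatingTransport (logName) {
  const transport = new winston.transports.DailyRotateFile({
    frequency: '30m',
    format: winstonGdprProofFormat,
    filename: `${logDir}/${logName}-%DATE%.log`,
    datePattern: 'YYYY-MM-DD HH-mm',
  })
  transport.on('rotate', function (oldFilenamePath, newFilenamePath) {
    const pathToMoveTo = `${logDirRotated}/${path.basename(oldFilenamePath)}`
    fs.rename(oldFilenamePath, pathToMoveTo, function (err) {
      if (err) throw err
    })
  })
  return transport
}
But I don't know how this interacts with having many transports on one logger, or whether each dynamic file should get its own createLogger instance instead.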

Power BI REST API - JS - Programmatically Setting Visual Level Filter Error

I am trying to build a web application with buttons to filter visuals from a Power BI report.
Following the documentation, I am able to get the visual onto my application, but when I set a filter using the Visual.setFilters() method it throws an error saying "Setting visual level filters is not supported.".
Visual in Web Application:
Error from Developer Console:
Code:
var accessToken = '@ViewBag.AccessToken';
if (!accessToken || accessToken == "") {
  return;
}

var basicFilter = {
  $schema: "http://powerbi.com/product/schema#basic",
  target: {
    table: "Products",
    column: "Product"
  },
  operator: "In",
  values: ["Sova"],
  filterType: 'BasicFilter'
}

// Get models. models contains enums that can be used.
var models = window['powerbi-client'].models;

// Gross Margin Tile
var embedConfiguration = {
  type: 'visual',
  accessToken: accessToken,
  id: 'REPORT_ID',
  pageName: 'ReportSection',
  visualName: 'VisualContainer7',
  embedUrl: 'REPORT_EMBEDD_URL',
  dashboardId: 'DASHBOARD_ID',
  tokenType: models.TokenType.Aad,
  filters: []
};

var $tileContainer = $('#grossMarginTile');
var grossMarginTile = powerbi.embed($tileContainer.get(0), embedConfiguration);

grossMarginTile.setFilters([basicFilter])
  .catch(errors => {
    console.log(errors)
  });
I am new to this; any help will be appreciated. What am I doing wrong?
It looks like you have an older version of the JS SDK; update to the latest and this will be resolved.
I also noticed that you are doing visual embedding but have dashboardId in the embedConfiguration. This is unnecessary.
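For example, a trimmed embed configuration without dashboardId could look like this (a sketch; the IDs and URL are the placeholders from the question):
var embedConfiguration = {
  type: 'visual',
  accessToken: accessToken,
  id: 'REPORT_ID',
  pageName: 'ReportSection',
  visualName: 'VisualContainer7',
  embedUrl: 'REPORT_EMBEDD_URL',
  tokenType: models.TokenType.Aad,
  filters: []
};

var grossMarginTile = powerbi.embed($('#grossMarginTile').get(0), embedConfiguration);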

calling json file from firebase function

I have 3 groups (HTML); inside every group are four people, and those people are the same in all 3 groups but with different numbers (votes/credits, whatever). I also have a JSON file with the values for those people inside each group. My HTML file reads my JSON file without a problem.
I'm using Dialogflow's Inline Editor to work with the Google Assistant. The same way my HTML (JavaScript) reads those values from the JSON file, I want to be able to load the person.json file as well. I have edited it many times but could not manage to call the person.json URL.
For example:
"Hey google, tell me Alex credits"
< here it should read from my json file which is 73 >
Here is the code: person1.html
var response = await fetch("laurel.json");
var arr = await response.json();
var laurel = arr[1];

var dflt = {
  min: 0,
  max: 100,
  // donut: true,
  gaugeWidthScale: 1.1,
  counter: true,
  hideInnerShadow: true
}

var ee1 = new r({
  id: 'ee1',
  value: laurel['Jennifer'],
  title: 'Jennifer ',
  defaults: dflt
});
var ee2 = new r({
  id: 'ee2',
  value: laurel['Peter'],
  title: 'Peter',
  defaults: dflt
});
var ee3 = new r({
  id: 'ee3',
  value: laurel['Justin'],
  title: 'Justin',
  defaults: dflt
});
var ee4 = new r({
  id: 'ee4',
  value: laurel['Alex'],
  title: 'Alex',
  defaults: dflt
});
});
The Inline Editor's index.js:
intentMap.set('persons1', someFunction);
function someFunction(agent) {
  agent.add(`Alex credits are 73 `);
}
// // below to get this function to be run when a Dialogflow intent is matched
// function yourFunctionHandler(agent) {
// agent.add(`This message is from Dialogflow's Cloud Functions for Firebase editor!`);
// agent.add(new Card({
// title: `Title: this is a card title`,
// imageUrl: 'https://developers.google.com/actions/images/badges/XPM_BADGING_GoogleAssistant_VER.png',
// text: `This is the body text of a card. You can even use line\n breaks and emoji! 💁`,
// buttonText: 'This is a button',
// buttonUrl: 'https://assistant.google.com/'
// })
// );
// agent.add(new Suggestion(`Quick Reply`));
// agent.add(new Suggestion(`Suggestion`));
// agent.setContext({ name: 'weather', lifespan: 2, parameters: { city: 'Rome' }});
// }
// // Uncomment and edit to make your own Google Assistant intent handler
// // uncomment `intentMap.set('your intent name here', googleAssistantHandler);`
// // below to get this function to be run when a Dialogflow intent is matched
// function googleAssistantHandler(agent) {
// let conv = agent.conv(); // Get Actions on Google library conv instance
// conv.ask('Hello from the Actions on Google client library!') // Use Actions on Google library
// agent.add(conv); // Add Actions on Google library responses to your agent's response
// }
// // See https://github.com/dialogflow/dialogflow-fulfillment-nodejs/tree/master/samples/actions-on-google
// // for a complete Dialogflow fulfillment library Actions on Google client library v2 integration sample
// Run the proper function handler based on the matched Dialogflow intent name
let intentMap = new Map();
intentMap.set('Default Welcome Intent', welcome);
intentMap.set('Default Fallback Intent', fallback);
// intentMap.set('your intent name here', yourFunctionHandler);
// intentMap.set('your intent name here', googleAssistantHandler);
agent.handleRequest(intentMap);
});
You've got a couple of options here. Either you switch your plan to Blaze and make a network request for the JSON data that is hosted on the public URL mentioned earlier,
or you put the JSON data into the Firebase Realtime Database, since it is also accessible as a JSON document on a public URL if you set the database rules to be publicly readable.
database.rules.json
{
  "rules": {
    "persons": {
      ".read": true
    }
  }
}
After that you can access your JSON data in Firebase at this URL: https://<your-project-id>.firebaseio.com/persons.json
That should work effectively the same way for access from your web app.
For your AoG fulfillment you can simply use the Firebase SDK to access the data:
firebase.database().ref('persons').child('personId');
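Inside the Dialogflow fulfillment, a handler could then look roughly like this (a sketch, assuming the value lives under persons/Alex and that firebase-admin has already been initialized):
const admin = require('firebase-admin');

function someFunction (agent) {
  // Read Alex's credits from the Realtime Database and reply with them
  return admin.database().ref('persons/Alex').once('value')
    .then(snapshot => {
      agent.add(`Alex credits are ${snapshot.val()}`);
    });
}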
