Insert Meteor data into an external MongoDB - JavaScript

I have 2 droplets on DigitalOcean: one running Meteor and the other running MongoDB. I want to store the data in the MongoDB on the other droplet instead of Meteor's bundled MongoDB. I was able to log in via the shell with the mongo ip_address:27017/eTable -u uName -p passwd command, and I also managed to connect to the external database by setting the MONGO_URL environment variable, but I am not able to insert data into it. I have tried different variations of the insert query, but no luck. How can I do this?
EDIT:
I don't get any errors. Here is what I have tried so far:
var database = new MongoInternals.RemoteCollectionDriver("mongodb://uname:passwd@ip:27017/eTable");
eTable = new Mongo.Collection('eTable', {
  _driver: database
});
And here is the code with which I am trying to insert:
Template.intro.rendered = function() {
  db.eTable.insert({
    _id: sess,
    visit: 'True',
    Navigate: 'False',
    Created: 'False',
    Drops: 'False',
    Uploads: 'False',
    ts: new Date()
  }, function(err, data) {
    if (err) {
      console.log(err);
    } else {
      console.log("Inserted");
    }
  });
}
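For what it's worth, one likely culprit in the snippet above: db is Mongo shell syntax and is undefined in Meteor client or server code, so the insert should presumably go through the eTable collection handle created with the custom driver. A minimal sketch of that variant (assuming sess is defined elsewhere in the original code):

Template.intro.rendered = function() {
  // Insert through the Meteor collection handle backed by the
  // RemoteCollectionDriver, not the shell-style `db` object
  eTable.insert({
    _id: sess, // `sess` assumed to be defined elsewhere
    visit: 'True',
    Navigate: 'False',
    Created: 'False',
    Drops: 'False',
    Uploads: 'False',
    ts: new Date()
  }, function(err) {
    if (err) {
      console.log(err);
    } else {
      console.log("Inserted");
    }
  });
};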

Related

Google cloud dataflow job creation error: "Cannot set worker pool zone. Please check whether the worker_region experiments flag is valid"

I am trying to create a Dataflow job to index a BigQuery table into Elasticsearch with the Node package @google-cloud/dataflow (v1beta3).
The job works fine when it is created and launched from the Google Cloud console, but I get the following error when I try it from Node:
Error: 3 INVALID_ARGUMENT: (b69ddc3a5ef1c40b): Cannot set worker pool zone. Please check whether the worker_region experiments flag is valid. Causes: (b69ddc3a5ef1cd76): An internal service error occurred.
I tried to specify the experiments params in various ways, but I always end up with the same error.
Has anyone managed to get a similar Dataflow job working? Or do you have information about Dataflow experiments?
Here is the code:
const { JobsV1Beta3Client } = require('@google-cloud/dataflow').v1beta3
const dataflowClient = new JobsV1Beta3Client()
const response = await dataflowClient.createJob({
  projectId: 'myGoogleCloudProjectId',
  location: 'europe-west1',
  job: {
    launch_parameter: {
      jobName: 'indexation-job',
      containerSpecGcsPath: 'gs://dataflow-templates-europe-west1/latest/flex/BigQuery_to_Elasticsearch',
      parameters: {
        inputTableSpec: 'bigQuery-table-gs-adress',
        connectionUrl: 'elastic-endpoint-url',
        index: 'elastic-index',
        elasticsearchUsername: 'username',
        elasticsearchPassword: 'password'
      }
    },
    environment: {
      experiments: ['worker_region']
    }
  }
})
Thank you very much for your help.
After many attempts, I managed yesterday to find out how to specify the worker region.
It looks like this:
await dataflowClient.createJob({
  projectId,
  location,
  job: {
    name: 'jobName',
    type: 'Batch',
    containerSpecGcsPath: 'gs://dataflow-templates-europe-west1/latest/flex/BigQuery_to_Elasticsearch',
    pipelineDescription: {
      inputTableSpec: 'bigquery-table',
      connectionUrl: 'elastic-url',
      index: 'elastic-index',
      elasticsearchUsername: 'username',
      elasticsearchPassword: 'password',
      project: projectId,
      appName: 'BigQueryToElasticsearch'
    },
    environment: {
      workerPools: [
        { region: 'europe-west1' }
      ]
    }
  }
})
It is not fully working yet; I still need to find the correct way to provide the other parameters, but the Dataflow job now gets created in the Google Cloud console.
For anyone struggling with this issue: I finally found out how to launch a Dataflow job from a template.
There is a function launchFlexTemplate that works the same way as the job creation in the Google Cloud console.
Here is the final code, working correctly:
const { FlexTemplatesServiceClient } = require('@google-cloud/dataflow').v1beta3
const dataflowClient = new FlexTemplatesServiceClient()
const response = await dataflowClient.launchFlexTemplate({
  projectId: 'google-project-id',
  location: 'europe-west1',
  launchParameter: {
    jobName: 'job-name',
    containerSpecGcsPath: 'gs://dataflow-templates-europe-west1/latest/flex/BigQuery_to_Elasticsearch',
    parameters: {
      apiKey: 'elastic-api-key', // mandatory but not used if you provide username and password
      connectionUrl: 'elasticsearch endpoint',
      index: 'elasticsearch index',
      elasticsearchUsername: 'username',
      elasticsearchPassword: 'password',
      inputTableSpec: 'bigquery source table', // projectId:datasetId.table
      // parameters to upsert the Elasticsearch index
      propertyAsId: 'table index used for the elastic _id',
      usePartialUpdate: true,
      bulkInsertMethod: 'INDEX'
    }
  }
})
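One practical note: await is only valid inside an async function in a plain Node script, so the call above presumably lives in one. A minimal sketch of such a wrapper (the function name is illustrative):

const { FlexTemplatesServiceClient } = require('@google-cloud/dataflow').v1beta3

// Illustrative wrapper: `await` must run inside an async function
async function launchIndexationJob() {
  const dataflowClient = new FlexTemplatesServiceClient()
  const response = await dataflowClient.launchFlexTemplate({
    // same projectId / location / launchParameter object as above
  })
  console.log(response)
}

launchIndexationJob().catch(console.error)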

How to create an offline MySQL database in JavaScript?

I have created a sample.js file with the following code:
var mysql = require('mysql');
Typically, I would connect to my online database using:
var pool = mysql.createPool({
  host: 'den1.mysql5.gear.host',
  user: 'myst',
  password: 'hidden',
  database: "myst"
});
and then do
var connection = pool.getConnection(function(err, connection) {
  // do whatever, like connection.query
});
How can I create a local database file and access that, instead of using server side databases?
Edit: USING ONLY MySQL!
If you do not know, please do not answer. I am not looking for an alternative (since most alternatives cause Node to delete packages needed by discord.js for some reason).
MySQL is quite heavy to implement as a front-end database, as there are size and speed limitations; I would prefer to use it on the back end only. But if you want a database on the front end, you can use db.js. IndexedDB is present in most modern browsers, and db.js is a wrapper that consumes it to implement a database on the front end. Here is the sample provided in the documentation:
<script src='/scripts/db.js'></script>

var server;
db.open({
  server: 'my-app',
  version: 1,
  schema: {
    people: {
      key: { keyPath: 'id', autoIncrement: true },
      // Optionally add indexes
      indexes: {
        firstName: { },
        answer: { unique: true }
      }
    }
  }
}).done(function(s) {
  server = s;
});
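Once the open() promise resolves, the server object exposes each store defined in the schema. A small usage sketch following the pattern in the db.js docs (the record values here are illustrative):

// Add a record, then query it back (both return promises)
server.people.add({
  firstName: 'Aaron',
  answer: 42
}).done(function(item) {
  console.log('added', item);
});

server.people.query()
  .filter('firstName', 'Aaron')
  .execute()
  .done(function(results) {
    console.log(results);
  });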

Google datastore entity creation and update

I am attempting to update an entity in my datastore kind using sample code from here https://cloud.google.com/datastore/docs/reference/libraries. The actual code is something like this:
// Imports the Google Cloud client library
const Datastore = require('@google-cloud/datastore');

// Your Google Cloud Platform project ID
const projectId = 'YOUR_PROJECT_ID';

// Creates a client
const datastore = new Datastore({
  projectId: projectId,
});

// The kind for the new entity
const kind = 'Task';
// The name/ID for the new entity
const name = 'sampletask1';
// The Cloud Datastore key for the new entity
const taskKey = datastore.key([kind, name]);

// Prepares the new entity
const task = {
  key: taskKey,
  data: {
    description: 'Buy milk',
  },
};

// Saves the entity
datastore
  .save(task)
  .then(() => {
    console.log(`Saved ${task.key.name}: ${task.data.description}`);
  })
  .catch(err => {
    console.error('ERROR:', err);
  });
I tried to create a new entity using this code, but when I ran it and checked the Datastore console, there were no entities created. I am also unable to update an existing entity. What could be the reason for this?
I am writing the code in Google Cloud Functions. This is the log when I run the function:
{
  insertId: "-ft02akcfpq"
  logName: "projects/test-66600/logs/cloudaudit.googleapis.com%2Factivity"
  operation: {…}
  protoPayload: {…}
  receiveTimestamp: "2018-06-15T09:36:13.760751077Z"
  resource: {…}
  severity: "NOTICE"
  timestamp: "2018-06-15T09:36:13.436Z"
}
{
  insertId: "000000-ab6c5ad2-3371-429a-bea2-87f8f7e36bcf"
  labels: {…}
  logName: "projects/test-66600/logs/cloudfunctions.googleapis.com%2Fcloud-functions"
  receiveTimestamp: "2018-06-15T09:36:17.865654673Z"
  resource: {…}
  severity: "ERROR"
  textPayload: "Warning, estimating Firebase Config based on GCLOUD_PROJECT. Intializing firebase-admin may fail"
  timestamp: "2018-06-15T09:36:09.434Z"
}
I have tried the same code and it works for me. However, I noticed that there was a delay before the entities appeared in Datastore. To update and overwrite existing entities, use .upsert(task) instead of .save(task) (link to GCP documentation). You can also use .insert(task) instead of .save(task) to store new entities.
Also check that the project ID is correct and that you are inspecting entities of the right kind.
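If it helps, here is a minimal sketch of the upsert variant, reusing the task entity and client from the question's snippet (untested here):

// Upsert: overwrites the entity if the key already exists, creates it otherwise
datastore
  .upsert(task)
  .then(() => {
    console.log(`Upserted ${task.key.name}`);
  })
  .catch(err => {
    console.error('ERROR:', err);
  });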

Meteor insert collection within server callback not working

Basically, I am trying to get a WSS feed going from Poloniex and update a collection with it, so that I can have the 'latest' prices in a collection (I will update and overwrite existing entries) and show them on a web page. For now, I have the WSS feed working and am just trying to insert some of the data into the collection to see if it works, but it doesn't, and I can't figure out why!
Note: The collection works, I've manually inserted a record with the shell.
Here is the code I have now:
import { Meteor } from 'meteor/meteor';
import * as autobahn from "autobahn";
import { Mongo } from 'meteor/mongo';
import { SimpleSchema } from 'meteor/aldeed:simple-schema';

// quick DB
Maindb = new Mongo.Collection('maindb');
Maindb.schema = new SimpleSchema({
  place: { type: String },
  pair: { type: String },
  last: { type: Number, defaultValue: 0 }
});

Meteor.startup(() => {
  var wsuri = "wss://api.poloniex.com";
  var Connection = new autobahn.Connection({
    url: wsuri,
    realm: "realm1"
  });

  Connection.onopen = function(session) {
    function tickerEvent(args, kwargs) {
      console.log(args[0]);
      Maindb.insert({ place: 'Poloniex', pair: args[0] });
    }
    session.subscribe('ticker', tickerEvent);

    Connection.onclose = function() {
      console.log("Websocket connection closed");
    };
  };

  Connection.open();
});
The console logs the feed but then the insert does not work.
I looked online, and it said that to get an insert to work inside a 'non-Meteor' function, you need to use Meteor.bindEnvironment, which I did:
I changed
function tickerEvent(args, kwargs) {
  console.log(args[0]);
  Maindb.insert({ place: 'Poloniex', pair: args[0] });
}
which became
var tickerEvent = Meteor.bindEnvironment(function(args, kwargs) {
  console.log(args[0]);
  Maindb.insert({ place: 'Poloniex', pair: args[0] });
});
tickerEvent();
This doesn't do anything - not even print the feed to my console. Using the same structure but simply removing Meteor.bindEnvironment prints to the console again, but still doesn't update the collection.
Am I doing something wrong?
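One observation that may explain the symptom: the rewritten version calls tickerEvent() immediately with no arguments, so args is undefined and the handler likely throws before the subscription ever fires. A minimal sketch of the bindEnvironment variant that only passes the wrapped handler to session.subscribe (untested):

Connection.onopen = function(session) {
  // Wrap the handler so collection calls work inside the
  // non-Meteor (autobahn) callback; do not invoke it here
  var tickerEvent = Meteor.bindEnvironment(function(args, kwargs) {
    console.log(args[0]);
    Maindb.insert({ place: 'Poloniex', pair: args[0] });
  });
  session.subscribe('ticker', tickerEvent);
};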

MeteorJS Infinite loop when using meteor call and meteor method

I have a sample code that goes like this:
Client Helper:
getUsername: function (userId) {
  Meteor.call("getUsername", userId, function (err, result) {
    if (!err) {
      Session.set("setUsername", result);
    } else {
      console.log(err);
    }
  });
  return Session.get("setUsername");
}
Server
Meteor.methods({
  "getUsername": function (userId) {
    var x = Meteor.users.find({ _id: userId }, { fields: { username: 1 } }).fetch()[0];
    return x.username;
  }
});
The result of this code is an infinite loop of the username being passed to the client. Is there a way to stop the loop and pass only the data that is needed on the client? I believe the reactivity is causing the data to loop infinitely, and I am not sure how to stop it. I tried using "reactive": false on my query on the server, but it does not work.
If you want to access the username everywhere in client templates (which is why you put it into the session), I would not set it in a template helper. I would set it on startup and read it from the session in template helpers (without calling the server method).
If you need the username in just one template, so you want to return its value from your template helper, do not put it into the session; just return it in your server method callback.
Based on your sample code, I assume you have a set of posts and you are retrieving the user name based on the user ID for each post. Instead of doing it that way, you should use the publish-composite package to publish the related users as well, as shown below.
Meteor.publishComposite('getPosts', function (postIds) {
  return [{
    find: function() {
      return Posts.find({ _id: { $in: postIds } });
      // you can also do -> return Posts.find();
      // or -> return Posts.find({ /* whatever your selector is to get the posts you need */ });
    },
    children: [{
      find: function(post) {
        return Meteor.users.find({
          _id: post.userId // or the correct field in your post document to get the user id
        }, {
          fields: {
            "profile": 1
          }
        });
      }
    }]
  }];
});
This way your publication will take care of publishing related users along with posts. You don't need to use methods and call them each time.
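For completeness, a hedged sketch of the matching client side (template and variable names here are hypothetical, and note the publication above only publishes the profile field, so adjust the projection if you need username):

// Client: subscribe to the composite publication when the template is created
Template.posts.onCreated(function () {
  this.subscribe('getPosts', postIds); // `postIds` assumed to be available in scope
});

// Helper: read the user from the local collection; reactive, no Meteor.call needed
Template.posts.helpers({
  getUsername: function (userId) {
    var user = Meteor.users.findOne({ _id: userId });
    return user && user.profile && user.profile.name; // adjust to your profile shape
  }
});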
