I have created a sample.js file with the following code:
var mysql = require('mysql');
Typically, I would connect to my online database using:
var pool = mysql.createPool({
  host: 'den1.mysql5.gear.host',
  user: 'myst',
  password: 'hidden',
  database: 'myst'
});
and then do
pool.getConnection(function(err, connection) {
  // do whatever, like connection.query
});
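For context, a full round trip through the pool looks something like this (a minimal sketch; the query is just a placeholder):

pool.getConnection(function (err, connection) {
  if (err) throw err;
  connection.query('SELECT 1 + 1 AS two', function (err, rows) {
    connection.release(); // always return the connection to the pool
    if (err) throw err;
    console.log(rows[0].two); // 2
  });
});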
How can I create a local database file and access that, instead of using server side databases?
Edit: USING ONLY MySQL!
If you do not know, please do not answer. I am not looking for an alternative (since installing most alternatives causes npm to remove packages needed by discord.js, for some reason).
MySQL is quite heavy for a front-end database, as there are size and speed limitations. I would prefer to use it on the back end only, but if you want a database on the front end you can use db.js. Most modern browsers ship with IndexedDB, and db.js is a wrapper that consumes it to provide a front-end database. Here's the sample provided in the documentation:
<script src='/scripts/db.js'></script>
var server;
db.open({
  server: 'my-app',
  version: 1,
  schema: {
    people: {
      key: { keyPath: 'id', autoIncrement: true },
      // Optionally add indexes
      indexes: {
        firstName: {},
        answer: { unique: true }
      }
    }
  }
}).done(function (s) {
  server = s;
});
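Once the open promise resolves, the server object exposes the stores defined in the schema. A minimal usage sketch (the record fields are illustrative, not part of the documentation sample above):

// Add a record to the 'people' store
server.people.add({ firstName: 'Aaron', answer: 42 }).done(function (item) {
  console.log('added', item); // item now carries the auto-incremented id
});

// Query by the indexed field
server.people.query('firstName').only('Aaron').execute().done(function (results) {
  console.log(results); // matching records
});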
I am trying to create a Dataflow job to index a BigQuery table into Elasticsearch with the Node package google-cloud/dataflow.v1beta3.
The job works fine when it's created and launched from the Google Cloud console, but I get the following error when I try it in Node:
Error: 3 INVALID_ARGUMENT: (b69ddc3a5ef1c40b): Cannot set worker pool zone. Please check whether the worker_region experiments flag is valid. Causes: (b69ddc3a5ef1cd76): An internal service error occurred.
I tried to specify the experiments parameter in various ways, but I always end up with the same error.
Has anyone managed to get a similar Dataflow job working? Or do you have information about Dataflow experiments?
Here is the code:
const { JobsV1Beta3Client } = require('@google-cloud/dataflow').v1beta3

const dataflowClient = new JobsV1Beta3Client()

const response = await dataflowClient.createJob({
  projectId: 'myGoogleCloudProjectId',
  location: 'europe-west1',
  job: {
    launch_parameter: {
      jobName: 'indexation-job',
      containerSpecGcsPath: 'gs://dataflow-templates-europe-west1/latest/flex/BigQuery_to_Elasticsearch',
      parameters: {
        inputTableSpec: 'bigQuery-table-gs-adress',
        connectionUrl: 'elastic-endpoint-url',
        index: 'elastic-index',
        elasticsearchUsername: 'username',
        elasticsearchPassword: 'password'
      }
    },
    environment: {
      experiments: ['worker_region']
    }
  }
})
Thank you very much for your help.
After many attempts, I managed yesterday to find out how to specify the worker region.
It looks like this:
await dataflowClient.createJob({
  projectId,
  location,
  job: {
    name: 'jobName',
    type: 'Batch',
    containerSpecGcsPath: 'gs://dataflow-templates-europe-west1/latest/flex/BigQuery_to_Elasticsearch',
    pipelineDescription: {
      inputTableSpec: 'bigquery-table',
      connectionUrl: 'elastic-url',
      index: 'elastic-index',
      elasticsearchUsername: 'username',
      elasticsearchPassword: 'password',
      project: projectId,
      appName: 'BigQueryToElasticsearch'
    },
    environment: {
      workerPools: [
        { region: 'europe-west1' }
      ]
    }
  }
})
It's not working yet; I still need to find the correct way to provide the other parameters, but the Dataflow job is now created in the Google Cloud console.
For anyone struggling with this issue, I finally found out how to launch a Dataflow job from a template.
There is a function launchFlexTemplate that works the same way as job creation in the Google Cloud console.
Here is the final function working correctly:
const { FlexTemplatesServiceClient } = require('@google-cloud/dataflow').v1beta3

const dataflowClient = new FlexTemplatesServiceClient()

const response = await dataflowClient.launchFlexTemplate({
  projectId: 'google-project-id',
  location: 'europe-west1',
  launchParameter: {
    jobName: 'job-name',
    containerSpecGcsPath: 'gs://dataflow-templates-europe-west1/latest/flex/BigQuery_to_Elasticsearch',
    parameters: {
      apiKey: 'elastic-api-key', // mandatory, but not used if you provide username and password
      connectionUrl: 'elasticsearch endpoint',
      index: 'elasticsearch index',
      elasticsearchUsername: 'username',
      elasticsearchPassword: 'password',
      inputTableSpec: 'bigquery source table', // projectId:datasetId.table
      // parameters to upsert the Elasticsearch index
      propertyAsId: 'table field used for the elastic _id',
      usePartialUpdate: true,
      bulkInsertMethod: 'INDEX'
    }
  }
})
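Note that await only works inside an async function (or with top-level await in an ES module); a minimal wrapper sketch:

async function launchIndexationJob() {
  // the launchFlexTemplate call from above goes here
  const response = await dataflowClient.launchFlexTemplate({ /* ... as above ... */ })
  console.log(response)
}

launchIndexationJob().catch(console.error)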
I have a very basic Feathers service which stores data in Mongoose using the feathers-mongoose package. The issue is with the get functionality. My model is as follows:
module.exports = function (app) {
  const mongooseClient = app.get('mongooseClient');
  const { Schema } = mongooseClient;
  const messages = new Schema({
    message: { type: String, required: true }
  }, {
    timestamps: true
  });
  return mongooseClient.model('messages', messages);
};
When a user runs a GET command:
curl http://localhost:3030/messages/test
I have the following requirements:
1. The GET above essentially tries to convert test to an ObjectID. What I would like it to do instead is run a query against the message attribute, i.e. { message: "test" }. I am not sure how I can achieve this, and there is not enough documentation to understand how to write or change this in the hooks.
2. I want to return a custom HTTP error code when a row is not found or does not match some of my criteria. How can I achieve this?
Can someone please help?
Thanks
In a Feathers before hook you can set context.result, in which case the original database call will be skipped. So the flow is:
In a before get hook, try to find the message by name
If it exists, set context.result to what was found
Otherwise do nothing, which falls back to the original get by id
This is how it looks:
async context => {
  const messages = await context.service.find({
    ...context.params,
    query: {
      $limit: 1,
      name: context.id
    }
  });

  if (messages.total > 0) {
    context.result = messages.data[0];
  }

  return context;
}
How to create custom errors and set the error code is documented in the Errors API.
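As a sketch of that second requirement (assuming the standard @feathersjs/errors package; this is not part of the hook above), throwing a Feathers error from a hook sets the matching HTTP status code, e.g. 404 for NotFound:

const { NotFound } = require('@feathersjs/errors');

// Inside a hook: signal a 404 when nothing matched the criteria
if (messages.total === 0) {
  throw new NotFound(`No message named '${context.id}' found`);
}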
My scenario:
I am working on a desktop-based application where my big challenge is keeping data in a relational DB (offline) and syncing accordingly (the company has its own syncing algorithm). I am using Electron and Vue.js on the client side, and electron-builder to build the desktop app. I am able to write migrations with raw SQL or various ORMs.
What I want:
When the app is installed on a desktop, I want to create the database file and apply all the migrations on the client's computer. I just don't know how to do that part. I also looked into the electron-builder docs, but didn't understand. I need an example, any idea?
Please help me. Thanks
After doing a lot of research, I found an awesome solution provided by Sequelize.js: the library Umzug (GitHub). Let's look at the implementation...
/**
 * Created by Ashraful Islam
 */
const path = require('path');
const Umzug = require('umzug');
const database = /* Imported my database config here */;

const umzug = new Umzug({
  storage: 'sequelize',
  storageOptions: {
    sequelize: database
  },
  // see: https://github.com/sequelize/umzug/issues/17
  migrations: {
    params: [
      database.getQueryInterface(), // queryInterface
      database.constructor, // DataTypes
      function () {
        throw new Error('Migration tried to use old style "done" callback. Please upgrade to "umzug" and return a promise instead.');
      }
    ],
    path: './migrations',
    pattern: /\.js$/
  },
  logging: function () {
    console.log.apply(null, arguments);
  }
});

function logUmzugEvent(eventName) {
  return function (name, migration) {
    console.log(`${name} ${eventName}`);
  };
}

function runMigrations() {
  return umzug.up();
}

umzug.on('migrating', logUmzugEvent('migrating'));
umzug.on('migrated', logUmzugEvent('migrated'));
umzug.on('reverting', logUmzugEvent('reverting'));
umzug.on('reverted', logUmzugEvent('reverted'));

module.exports = {
  migrate: runMigrations
};
Idea behind the scenes
I explicitly declare the migration directory and define the file-matching pattern. Umzug simply reads the files from there and runs the DB migrations. An example migration file follows...
// 000_Initial.js
"use strict";

module.exports = {
  up: function(migration, DataTypes) {
    return migration.createTable('Sessions', {
      sid: {
        type: DataTypes.STRING,
        allowNull: false
      },
      data: {
        type: DataTypes.STRING,
        allowNull: false
      },
      createdAt: {
        type: DataTypes.DATE
      },
      updatedAt: {
        type: DataTypes.DATE
      }
    }).then(function() {
      return migration.addIndex('Sessions', ['sid']);
    });
  },
  down: function(migration, DataTypes) {
    return migration.dropTable('Sessions');
  }
};
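To tie it together, the exported migrate function can be called at app startup, before any window is opened. A minimal sketch, assuming the module above is saved as ./db/migrate.js (a path I'm inventing for illustration):

// In Electron's main process, run pending migrations before opening windows
const { migrate } = require('./db/migrate');

migrate()
  .then(function () {
    console.log('All migrations performed successfully');
    // safe to open the app window now
  })
  .catch(function (err) {
    console.error('Migration failed:', err);
  });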
I'm using the multicast-dns Node module to attempt to make this work.
Looking up custom.local in the browser gives me the console message I set up, but I'm unable to see my actual server running (which is doing so at localhost:12345, where 12345 is a dynamic number). I want to be able to see my local server when visiting custom.local. Is this possible?
Here's some code:
mdns.on("query", query => {
if (query.questions[0] && query.questions[0].name === "custom.local") {
console.log(query);
mdns.respond({
answers: [
{
name: "custom.local",
type: "SRV",
data: {
port: n.get("p"), // dynamic port
weight: 0,
priority: 10,
target: ip // local IP
}
}, {
name: "custom.local",
type: "A",
data: ip,
ttl: 300
}
]
});
}
});
EDIT: I can connect to my local server just fine, that wasn't an issue.
Quoting cfreak:
You can't put port numbers in DNS. DNS is only for looking up an IP by name. For your browser to see it by the name alone you need a proxy program in front of your service or you need to run the service itself on port 80. Port numbers really shouldn't be dynamic. You should specify it in the setup of your service.
That answers my question and offers next steps. Thanks!
UPDATE: Figured out how to do what I was trying to do. Here's some code!
FOUND A SOLUTION, WOOP WOOP!
I'm using this module, but tweaked the source a bit (only because I have dynamic ports, and because I feel like it).
/* jshint undef: true, unused: true, esversion: 6, node: true */
"use strict";

//
// G E T
// P A C K A G E S
import express from "express";
import http from "http";
import local from "./server/local";

const n = express();

n.get("/", (req, res) => {
  res.send("Welcome home");
});

//
// L A U N C H
const server = http.createServer(n);
server.listen(0, () => {
  const port = server.address().port;
  local.add(port, "custom.local");
});
Hope this helps you as well, future Internet searcher! :D
Don't let negative folks on other SE sites bring you down. :virtual fist bump:
I have 2 droplets on DigitalOcean: one running Meteor and the other running MongoDB. I want to store the data in the MongoDB on the other droplet instead of Meteor's own MongoDB. I was able to log in via shell with the mongo ip_address:27017/eTable -u uName -p passwd command. I also managed to successfully connect to the external database by setting the MONGO_URL variable, but I am not able to insert data into the external database. I tried different variations of the insert query, but no luck. How can I do this?
EDIT:
I don't get any errors. Here is what I have tried so far:
var database = new MongoInternals.RemoteCollectionDriver("mongodb://uname:passwd@ip:27017/eTable");
eTable = new Mongo.Collection('eTable', {
  _driver: database
});
And here is the code with which I am trying to insert:
Template.intro.rendered = function() {
  db.eTable.insert({
    _id: sess,
    visit: 'True',
    Navigate: 'False',
    Created: 'False',
    Drops: 'False',
    Uploads: 'False',
    ts: new Date()
  }, function(err, data) {
    if (err) {
      console.log(err);
    } else {
      console.log("Inserted");
    }
  });
};
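For reference, with a collection defined through new Mongo.Collection as above, Meteor inserts normally go through that collection object rather than a global db handle. A minimal sketch (untested against this exact setup; the fields are just copied from above):

// Insert via the Mongo.Collection instance defined earlier
eTable.insert({
  _id: sess,
  visit: 'True',
  ts: new Date()
}, function(err, id) {
  if (err) {
    console.log(err);
  } else {
    console.log('Inserted with _id', id);
  }
});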