How can I create d3.js graph from data on mongodb server using node.js?
D3.js includes helpers for requesting remote data as JSON or as text (e.g. CSV) from a URL.
In a setup that is not security-sensitive (such as local development or a demo environment), you could use Mongo's simple REST API fairly directly if you enable it; it returns JSON for objects.
Or you could build a simple HTTP server (in Python, Perl, or Go, say) that execs the mongoexport tool with the right parameters to produce CSV output from Mongo (via subprocess in Python, backticks or qx{} in Perl, or os/exec in Go).
If you already have data in Mongo and you've got Node already set up, then maybe that's what you want to use.
If so, there are people out there who have used Node.js with npm modules for MongoDB specifically to drive a D3.js visualization.
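A common shape for that pipeline is an Express route that queries MongoDB and returns JSON for D3 to fetch. A sketch under assumptions: the field names, collection name, and route below are hypothetical, not from the question:

```javascript
// Reshape MongoDB documents into the flat { label, value } rows a
// D3 chart typically binds to. The field names "name" and "count"
// are placeholders for whatever your schema actually stores.
function toD3Rows(docs) {
  return docs.map(function (doc) {
    return { label: doc.name, value: Number(doc.count) };
  });
}

// Wiring sketch (commented out: assumes Express and the official
// `mongodb` driver, with `db` an open database handle):
//
// app.get("/data", function (req, res) {
//   db.collection("stats").find({}).toArray(function (err, docs) {
//     if (err) return res.status(500).end();
//     res.json(toD3Rows(docs)); // the page then calls d3.json("/data", ...)
//   });
// });

console.log(JSON.stringify(toD3Rows([{ name: "a", count: "3" }])));
// → [{"label":"a","value":3}]
```

Keeping the reshaping in a plain function like toD3Rows means it can be unit-tested without a running database.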
I find that the examples for the Affdex JavaScript SDK use the SDK in the frontend webpage.
Is it possible to use the SDK in Node.js? I'd like to save the emotion data to MongoDB. Or can I pass the emotion data I get in the frontend to Node.js in the backend, then save it from Node.js to MongoDB? According to the documentation, the CameraDetector constructor expects four parameters { divRoot, width, height, faceMode }, which means I need to include a div element, but Node.js cannot manipulate frontend elements natively.
The current published version of the SDK cannot be used from Node.js, not because it needs a DOM, but because internally it uses MEMFS to load the expression models.
Is there a technique or module I can use in my node project that will take in JSON from a rest API and store it in some sort of readable object?
I understand that when I get data from a REST API, Express's body parser will give me a JSON object and I can traverse the tree from there. My issue is: what if a contributor to my project has no idea what the JSON from the REST API looks like?
In Java you can tell, since Jackson will map it to a Java bean; in JavaScript I feel like you are going in blind. If a contributor is using Sublime and doesn't have a debugger, how should they figure out what the JSON object looks like?
In Node.js, you can use the express-jsonschema package to validate incoming JSON at your REST endpoints. If the JSON does not comply with the schema, you can respond with a Bad Request error (HTTP status 400).
See the package's page on npmjs.com for more details.
The AJV package can also be used to validate incoming JSON; it is among the fastest JSON Schema validators:
https://www.npmjs.com/package/ajv
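Both packages validate against a JSON Schema, which doubles as documentation of the payload shape for contributors. The core idea can be sketched by hand; this is NOT the ajv or express-jsonschema API, just a toy checker for required keys and primitive types:

```javascript
// Toy shape checker (hand-rolled; NOT the ajv or express-jsonschema
// API): reports missing required keys and primitive-type mismatches.
function checkShape(schema, obj) {
  const errors = [];
  for (const key of schema.required || []) {
    if (!(key in obj)) errors.push("missing: " + key);
  }
  for (const [key, type] of Object.entries(schema.properties || {})) {
    if (key in obj && typeof obj[key] !== type) {
      errors.push(key + " should be " + type);
    }
  }
  return errors; // an empty array means the object matches
}

const schema = { required: ["id"], properties: { id: "number", name: "string" } };
console.log(checkShape(schema, { id: 1, name: "ok" })); // []
console.log(checkShape(schema, { name: 42 }));          // [ 'missing: id', 'name should be string' ]
```

In an Express middleware you would run the checker on req.body and answer with status 400 when the error list is non-empty.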
I have been using SQL for quite a while now. For a Node.js project, I wanted to make the package smaller and easier to manage. I thought that by getting rid of MySQL I could make the database part of the server itself, using a CSV file or something similar plus a variable array. How could I achieve this?
I am yet to try any code, but this is how I am planning on going about it at the moment:
A basic Express web server handling GET/POST requests: import a CSV file into an array, then do a basic variable comparison, something like:
if (vararray === "foo") {
    return '"foo" exists in the database.';
}
I am still relatively new to javascript.
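Under those assumptions (small data, exact-match lookups, no commas inside fields), the plan above can be sketched in plain Node; the file contents and column names here are invented:

```javascript
// Parse a headered CSV string into row objects and check membership.
// No quoting support: fields must not contain commas or newlines.
function parseCsv(text) {
  const [header, ...lines] = text.trim().split("\n");
  const cols = header.split(",");
  return lines.map(line => {
    const cells = line.split(",");
    return Object.fromEntries(cols.map((c, i) => [c, cells[i]]));
  });
}

// Exact-match lookup over the loaded rows.
function exists(rows, column, value) {
  return rows.some(row => row[column] === value);
}

const rows = parseCsv("name,role\nfoo,admin\nbar,user");
console.log(exists(rows, "name", "foo")); // true
console.log(exists(rows, "name", "baz")); // false
```

In the Express handler you would read the file once at startup (e.g. parseCsv(fs.readFileSync("db.csv", "utf8"))) and rewrite it on POST; beyond toy sizes, an embedded store such as SQLite is usually less fragile.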
How can I run a Python or JavaScript file from a Postgres function?
I have two Postgres databases with totally different structures,
but I need to sync them.
I will write a script that reads data from the local database and writes it into the server database, in any language (Python, Java, JavaScript/Node).
But the main problem is that I do not know how to call that script from a Postgres procedure or function.
So the question is: how can I call or run any script from a Postgres procedure?
Use PL/Perl, PL/Python, or PL/Java and use the language's system utilities, e.g. subprocess in PL/Python, Runtime.getRuntime().exec in PL/Java, and so on.
For example in plpython:
CREATE LANGUAGE plpythonu; -- create the language if it does not exist
CREATE OR REPLACE FUNCTION venue.test_execute()
RETURNS void AS
$BODY$
import subprocess
# the script must be executable (shebang + chmod +x), or prefix the
# argument list with its interpreter, e.g. ['python', '/path/to/script/test.py']
subprocess.call(['/path/to/script/test.py'])
$BODY$
LANGUAGE plpythonu;
SELECT venue.test_execute();
CONSIDERATIONS
You can indeed launch a script from a Postgres function, but this is not the preferred way to synchronize data between databases: you will probably hit issues with absolute and relative paths to scripts and data, and with file-system permissions. Instead, use a database function to export data in a known format and call your synchronization script from outside the database. Another option is to use the dblink module.
You can create a PL/Perl function and from it call your script to sync the databases.
I'm working on a small project where I read some parameters from a SQLite database. The data is generated by C code running on a Linux server. I then want to write a JavaScript script to fetch the data from the database. I've tried alasql.js, but it takes a very long time (~1 minute) to get the list of parameters from two tables.
SELECT sensors.id, sensors.sensorname, sensors.sensornumber,
       information.sensorvalue, information.timestamp
FROM sensors
INNER JOIN information ON sensors.id = information.sensorid
I've been reading about IndexedDB, but it seems it only works from JavaScript, not from C code. Please correct me if I'm wrong. The key requirement is a database that C code can write to and JavaScript can read from. The database can be read either via the file:// scheme or from an IP address.
Would appreciate any help regarding this problem. Thanks!
Unfortunately, SQLite's internal file format is hard to parse quickly in JavaScript. One reason is that the only browser-side library that can read it is SQL.js, and it is relatively slow because it cannot read selected pages from the database file, only the whole file.
One option: you can switch from the SQLite format to CSV or TSV plain-text formats, which can be sent to the browser and parsed easily and quickly with jQuery.CSV, PapaParse, AlaSQL, or any other CSV parsing library, like:
alasql('SELECT * FROM TSV("mydata.txt", {headers: true})', [], function (data) {
    // data
});
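If the two tables arrive as parsed rows (from CSV/TSV or JSON), the INNER JOIN from the query above can also be done in a few lines of plain JavaScript, without a SQL engine. The sample rows here are invented:

```javascript
// Recreate "sensors INNER JOIN information ON sensors.id = information.sensorid"
// over plain arrays of row objects, using a Map as the join index.
function innerJoin(sensors, information) {
  const byId = new Map(sensors.map(s => [s.id, s]));
  return information
    .filter(row => byId.has(row.sensorid))
    .map(row => {
      const s = byId.get(row.sensorid);
      return {
        id: s.id,
        sensorname: s.sensorname,
        sensornumber: s.sensornumber,
        sensorvalue: row.sensorvalue,
        timestamp: row.timestamp,
      };
    });
}

const sensors = [{ id: 1, sensorname: "temp", sensornumber: 7 }];
const information = [{ sensorid: 1, sensorvalue: 21.5, timestamp: "2016-01-01" }];
console.log(innerJoin(sensors, information));
```

Indexing one side in a Map keeps the join linear rather than quadratic in the number of rows.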
Another alternative: you can write a simple server in Node.js with native SQLite support and then serve the requested records as JSON via Ajax to your application (like in this article).