I'm working on a small project where I read some parameters from a SQLite database. The data is generated on a Linux server running C code. I then want to create a script in JavaScript to fetch the data from the database. I've tried alasql.js, but it takes a very long time (~1 minute) before I get the list of parameters from two tables:
SELECT sensors.id, sensors.sensorname, sensors.sensornumber,
       information.sensorvalue, information.timestamp
FROM sensors
INNER JOIN information ON sensors.id = information.sensorid
I've been reading about IndexedDB, but it seems it only works with JavaScript, not with C code. Please correct me if I'm wrong. The key requirement here is a database that supports writing from C code and reading from JavaScript. The database can be read either via the file:// scheme or from an IP address.
Would appreciate any help regarding this problem. Thanks!
Unfortunately, SQLite's internal file format is difficult to parse quickly in JavaScript. That is one reason the only browser-side library that can read it, SQL.js, is relatively slow: it cannot read selected pages from the database file, only the whole file.
One option: you can switch from the SQLite format to CSV or TSV plain-text formats, which can be easily and quickly sent to the browser and parsed with jQuery.CSV, PapaParse, AlaSQL, or any other CSV parsing library, like:
alasql('SELECT * FROM TSV("mydata.txt", {headers: true})', [], function (data) {
    // data
});
Another alternative: you can write a simple server in Node.js with native SQLite support and then provide the requested records in JSON format via Ajax to your application (like in this article); a rough sketch follows below.
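A minimal sketch of that approach, assuming the express and sqlite3 npm packages and a database file named sensors.db (the package choice, port, endpoint path, and file name are all assumptions):

// Minimal sketch: a Node.js server exposing the joined sensor data as JSON.
// Assumes `npm install express sqlite3` and a SQLite file named sensors.db.
const express = require('express');
const sqlite3 = require('sqlite3');

const app = express();
const db = new sqlite3.Database('sensors.db', sqlite3.OPEN_READONLY);

app.get('/sensors', (req, res) => {
    const sql = 'SELECT sensors.id, sensors.sensorname, sensors.sensornumber, ' +
                'information.sensorvalue, information.timestamp ' +
                'FROM sensors INNER JOIN information ON sensors.id = information.sensorid';
    db.all(sql, [], (err, rows) => {
        if (err) return res.status(500).json({ error: err.message });
        res.json(rows); // the browser can now fetch('/sensors') and render the rows
    });
});

app.listen(3000, () => console.log('Listening on http://localhost:3000'));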
Related
I am learning while developing my Java web app using JSF, and I was asked to study using JavaScript to create a datatable/report dynamically and present it via .html/.xhtml. I am required to use Datatables.net to produce the table. The backend is Java instead of PHP. Since this approach is uncommon, I have no idea how to go about it. My thought so far is:
MySQL data --(converted/processed by Java code)--> JSON --(read by)--> JavaScript
How do I process the SQL data? Can I directly run a query like this inside Java:
SELECT
  CONCAT("[",
    GROUP_CONCAT(
      CONCAT("{name:'", NAME, "'"),
      CONCAT(",email:'", email, "'"),
      CONCAT(",address:'", address, "'}")
    ),
  "]") AS JSON FROM customer
If so, how do I parse the data out and make sure JavaScript can read it?
Another way I found is to use a library like Gson or something similar.
Explanations or examples about this issue are much appreciated.
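For reference, a minimal sketch of the JavaScript side, with Datatables.net consuming a JSON array from the Java backend (the /customers endpoint URL, table id, and column names are assumptions):

// Minimal sketch: DataTables reading JSON produced by the Java backend.
// Assumes an endpoint at /customers returning
// [{"name": "...", "email": "...", "address": "..."}, ...].
$(document).ready(function () {
    $('#customerTable').DataTable({
        ajax: {
            url: '/customers',
            dataSrc: ''          // the response is a plain JSON array
        },
        columns: [
            { data: 'name' },
            { data: 'email' },
            { data: 'address' }
        ]
    });
});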
I have a web page that calculates data about a specific thing. This page does not have a database to store data, so I want to make an Excel file on the server to store the data in when the user clicks 'submit'. The problem is that I want to create only one Excel file and append new data to it. Can I do that with JavaScript or jQuery?
EDIT: I would prefer to make this an offline project.
The question is broad, but within the scope of an answer:
1) You could use the Google Sheets API to store the values in a sheet in Google's cloud. (Yes, you can use JavaScript for this.)
2) You could create a Node.js server, use JavaScript to write to a CSV or Excel file (research npm modules based on your requirements), and expose it through a REST API for your application; a rough sketch of this option follows below.
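A minimal sketch of option 2, assuming the express npm package; the /submit path, the field names, and data.csv are all hypothetical:

// Minimal sketch: a Node.js endpoint that appends one row to a CSV file.
const express = require('express');
const fs = require('fs');

const app = express();
app.use(express.json());

app.post('/submit', (req, res) => {
    const { name, value } = req.body;          // hypothetical submitted fields
    const row = '"' + name + '",' + value + '\n';
    fs.appendFile('data.csv', row, (err) => {  // creates the file if it does not exist
        if (err) return res.status(500).send(err.message);
        res.send('Row appended');
    });
});

app.listen(3000);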
Yes, this is possible.
Steps:
1. Keep one spreadsheet file on the server (for example, test.csv).
2. Every time you want to append data to it, read the data from test.csv into a JSON array and append the new data to that array.
3. Delete test.csv from the server.
4. Use file APIs to create a new CSV file named test.csv with the data generated in step 2.
(This is independent of Node/Google Sheets and can be used with just HTML and JS, or if your project is in PHP; a sketch of the cycle follows below.)
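A minimal Node.js sketch of the read, append, rewrite cycle described above (the file name and record shape are assumptions):

// Minimal sketch of the steps above: read test.csv, append a record, rewrite the file.
const fs = require('fs');

function appendRecord(record) {
    // Step 2: read the existing CSV into an array of rows.
    const text = fs.existsSync('test.csv') ? fs.readFileSync('test.csv', 'utf8') : '';
    const rows = text.split('\n').filter((line) => line.length > 0);

    // Append the new data.
    rows.push(record.join(','));

    // Steps 3-4: replace test.csv with the updated contents.
    fs.writeFileSync('test.csv', rows.join('\n') + '\n');
}

appendRecord(['2024-01-01', '42']); // hypothetical record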
I have working code for this somewhere in my backup and will try to add a link to it here.
P.S. This is probably my first answer here, so please excuse any issues.
You will need to install Node.js on your machine to do this locally (https://nodejs.org/en/download/); file manipulation in the browser would require disabling a lot of security features.
Then, if your purpose is only to store data, a CSV file would be easier and lighter (and you can still open it with Excel). Run npm search csv to find packages for working with CSV.
If you really need an Excel file, you can install the exceljs package with npm install exceljs; the docs are at https://github.com/exceljs/exceljs. A rough sketch follows below.
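A minimal sketch of appending a row with exceljs (the data.xlsx file name and row contents are assumptions, and the file is assumed to exist already):

// Minimal sketch: append one row to an existing data.xlsx using exceljs.
const ExcelJS = require('exceljs');

async function appendRow(values) {
    const workbook = new ExcelJS.Workbook();
    await workbook.xlsx.readFile('data.xlsx'); // assumes the file already exists
    const sheet = workbook.worksheets[0];      // first worksheet
    sheet.addRow(values);
    await workbook.xlsx.writeFile('data.xlsx');
}

appendRow(['2024-01-01', 42]).catch(console.error);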
How can I create a D3.js graph from data on a MongoDB server using Node.js?
D3.js includes ways to request non-local data, either as JSON or as text (CSV), via URLs and the like.
In a setup that is not security sensitive (like local development or a demo environment), you could fairly directly use the Mongo REST API if you enable it, which will give you JSON output for objects.
Or you could build a simple HTTP server (in Python, Perl, or Go, all of which can exec an external process) that runs the mongoexport tool with the right parameters to produce CSV output from Mongo.
If you already have data in Mongo and you've got Node set up, then maybe that's what you want to use.
If so, there are examples out there of using Node.js with npm modules for MongoDB specifically to drive a D3.js visualization; a minimal sketch follows below.
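A rough sketch of that pipeline, assuming the express and mongodb npm packages; the database name, collection name, and /data path are hypothetical:

// Minimal sketch: a Node.js endpoint serving MongoDB documents as JSON for D3.
const express = require('express');
const { MongoClient } = require('mongodb');

const app = express();
const client = new MongoClient('mongodb://localhost:27017');

app.get('/data', async (req, res) => {
    const docs = await client.db('mydb').collection('readings').find({}).toArray();
    res.json(docs);
});

async function main() {
    await client.connect();
    app.listen(3000, () => console.log('Listening on http://localhost:3000'));
}
main().catch(console.error);

// In the browser, D3 can then load it with:
//   d3.json('/data').then((data) => { /* bind data to SVG elements */ });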
I'm relatively new to JavaScript, but I want to get some data from a CSV file that is saved online and updated each hour.
The data should be displayed in a table later on, but I have some problems saving it to an array. The CSV file is comma-separated, has 9 columns and over 6,000 rows, and is one long string of text with no line breaks. The first row contains usernames, and each username with special characters is enclosed in quotation marks.
I've tried several pieces of code over the past few days, but none worked. Can I parse an online CSV file into an array at all? Is there an alternative, like using SQL or saving the file to my server?
Remember: the file is updated each hour.
NOTE: There are not really problems with the code samples I've found; all of them were tested by others and seemed to work, but only for local files, not actual URLs!
You can use the jquery-csv plugin (https://code.google.com/p/jquery-csv/); it can convert multi-line CSV into a 2D array using $.csv.toArrays(csv) or into objects using $.csv.toObjects(csv). Check this post or this one for more info:
$.ajax({
    url: "urlto/filename.csv",
    dataType: "text",
    success: function (data) {
        var arr = $.csv.toArrays(data); // parse the CSV text into a 2D array
        onComplete(arr);
    }
});

function onComplete(arr) {
    // Your array here
}
You can have a look at Papa Parse for a solid and full-featured CSV parsing library; a minimal sketch follows below.
The setInterval JavaScript function lets you update the data every hour, in case you decide to develop this part on the client.
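A minimal sketch of that client-side approach with Papa Parse (the URL is an assumption, and the file is assumed to have a header row):

// Minimal sketch: fetch and parse the remote CSV with Papa Parse, refreshed hourly.
function loadCsv() {
    Papa.parse('https://example.com/data.csv', {
        download: true,   // fetch the file from the URL
        header: true,     // treat the first row as column names
        complete: function (results) {
            console.log(results.data); // array of row objects
        }
    });
}

loadCsv();
setInterval(loadCsv, 60 * 60 * 1000); // re-fetch every hour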
Is there an alternative like with SQL or saving the file to my server?
Yes, there are alternatives; the right architecture depends on your use case: how many visitors will view the results, how critical your application is, how reliable the data source is, and so on. If you're not sure about these, you should talk to a web developer with more experience with such questions.
You may want to parse the CSV file every hour on the server and store a copy of the data there to serve to your visitors. This way, if the upstream data source is unavailable, you still have a copy of the data from the past.
P.S.:
I've tried several codes over the past few days, but none worked
Stack Overflow is about getting help with specific problems in your code rather than asking general questions (answers to those can usually be found with a search engine).
I am developing a client-side application with HTML5.
My data (text only) is stored in a JSON file (70 MB).
I want to implement a function to search for all occurrences of a term in my data file.
Does an open-source implementation of this exist? Or what is the best way to implement it?
Thanks for your opinions.
If you are planning to use JavaScript on the client side, take a look at http://kiro.me/projects/fuse.html.
However, 70 MB of data is simply too much for any browser to handle, and a client-side search implementation is definitely not recommended.
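For a much smaller dataset, a minimal sketch of client-side search with Fuse.js (the record shape and search keys are assumptions):

// Minimal sketch: fuzzy text search with Fuse.js over a small in-memory dataset.
// The record shape and keys are hypothetical; a 70 MB file belongs on the server.
const data = [
    { title: 'First document', body: 'Some text worth searching' },
    { title: 'Second document', body: 'More text with occurrences of a term' }
];

const fuse = new Fuse(data, { keys: ['title', 'body'] });

console.log(fuse.search('occurrences')); // matching records, ranked by relevance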