I need to get GTFS data and show the raw data on screen. I tried to use the gtfs-stream module, but I'm not sure how to get data from it.
var request = require('request')
var gtfs = require('gtfs-stream')

var gtfs_source = 'https://www.bkk.hu/gtfs/budapest_gtfs.zip'
request.get(gtfs_source).pipe(gtfs.enhanced()).on('data', (entity) => {
  console.log(entity)
})
gtfs-utils has a rich set of functions, but it cannot read data from a compressed file.
For a site that I'm making, I wanted an additional feature that uses a plugin on GitHub called cratedigger, which uses WebGL and Three.js. It is a 3D virtual crate that contains vinyl records and simulates "crate digging" through albums. The plugin gets the data for the vinyl titles, artists, and covers from a string in JSON format that is parsed with JSON.parse and stored in a const variable. The original code (index.js):
const data = JSON.parse('[{"title":"So What","artist":"Miles Davis","cover":"http://cdn-images.deezer.com/images/cover/63bf5fe5f15f69bfeb097139fc34f3d7/400x400-000000-80-00.jpg","year":"2001","id":"SOBYBNV14607703ACA","hasSleeve":false},{"title":"Stolen Moments","artist":"Oliver Nelson","cover":"http://cdn-images.deezer.com/images/cover/99235a5fbe557590ccd62a2a152e4dbe/400x400-000000-80-00.jpg","year":"1961","id":"SOCNMPH12B0B8064AA","hasSleeve":false},{"title":"Theme for Maxine","artist":"Woody Shaw","cover":"http://cdn-images.deezer.com/images/cover/bb937f1e1d57f7542a64c74b13c47900/400x400-000000-80-00.jpg","year":"1998","id":"SOMLSGW131343841A7","hasSleeve":false}]');
You can view the source code here; the code above is on line 3.
For my site, though, I want the titles, artists, covers, etc. to come from my MySQL database. So what I did is: when I click a button on my main site, it runs an SQL query against my database and converts the result into a .json file:
//getdatabase.php
<?php
include 'includes/session.php';
include 'includes/header.php';

$conn = $pdo->open();
$data = $conn->query("
    SELECT name as title,
           artist_name as artist,
           concat('http://website.com/images/', photo) as cover,
           year(date_created) as year,
           id,
           hasSleeve
    from products
    where category_id = '5';
")->fetchAll(PDO::FETCH_ASSOC);

foreach($data as &$row){
    $row['hasSleeve'] = filter_var($row['hasSleeve'], FILTER_VALIDATE_BOOLEAN);
}

$json_string = json_encode($data);
$file = 'cratedigger.js/src/htdocs/data.json';
file_put_contents($file, $json_string);

$pdo->close();
header('location: cratedigger.js/lib/index.html');
?>
Afterwards, it redirects to the index of the cratedigger plugin. To retrieve the data in the .json file, I used the Fetch API in the index.js file under the plugin's src folder. So I replaced the original code in the plugin with this:
//replaced the long line of const data = JSON.parse('[{"title":...]'); with this:
let data = [];
fetch('data.json').then(function(resp) {
  return resp.json();
})
.then(function(data) {
  console.log(data); // outputs data from my database
  return data;
});
console.log(data); // output is 0 or none
I use node.js to build and test this plugin, and when I test it with this code, the crate appears but with no records in it (a blank wooden crate). In the console, the log inside the fetch callback does output the data from the JSON file, but the log outside the fetch outputs zero or nothing. I figured that fetch is asynchronous, so the second console.log didn't output any data because it didn't wait for fetch to finish.
And that's my problem. I want to replace the original code that uses a string in JSON format with data from my database. Some of the solutions I came up with that didn't work are:
Use async and await - this is still asynchronous, so it couldn't store my data in the variable.
XMLHttpRequest - this is mostly asynchronous too, and its synchronous mode is deprecated.
Place fetch inside a function - my problem with this is that the variable "data" used in the original code is used/called in other parts of the source files, for example:
function fillInfoPanel(record) {
  if (record.data.title) {
    titleContainer.innerHTML = record.data.title;
  }
}
//or this//
cratedigger.loadRecords(data, true, () => {
  bindEvents();
});
So calling a function like myData() to get the data from fetch wouldn't work, as I need it to be inside the data variable. I'm not sure how I'm supposed to replace the data variable used in other parts of the code with a function.
Any suggestions that might work? I don't necessarily need to use fetch to retrieve the data. I'm more into HTML, PHP & CSS, and not that familiar with JavaScript, so I'm stuck here. Node.js is something I learned only a week ago, so the code initially confused me. I've read articles and watched YouTube tutorials about JavaScript, JSON, etc., but they confused me more than they helped with my problem.
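For context, the usual way around this kind of problem is to start the app from inside the promise chain, so nothing reads `data` before fetch resolves. A minimal sketch (not cratedigger's actual code; `fetchData` is a hypothetical stand-in for fetch('data.json') so the snippet is self-contained):

```javascript
// Stand-in for fetch('data.json').then(resp => resp.json())
function fetchData() {
  return Promise.resolve([{ title: 'So What', artist: 'Miles Davis' }]);
}

// Gather everything that used the old `data` constant into one entry point,
// e.g. cratedigger.loadRecords(data, true, () => bindEvents());
function startApp(data) {
  return data.length; // placeholder for the real startup work
}

// Nothing touches the records until the promise resolves.
fetchData().then(function (data) {
  startApp(data);
});
```

The point is that `data` never needs to exist as a module-level variable: the code that consumes it is only ever called with the resolved value.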
I have a JSON file (total.json) with the given data:
var data = {"trololo":{"info":"61511","path".... }}
I need to get the object "info" and then print the value "61511" in an alert window.
I include my JSON like this:
var FILE = 'total'
var data_file_names = {};
data_file_names[FILE] = 'total.json';
And then I use it like this:
var data_trololo = data_file_names[FILE];
Please help me print the object "info". Maybe there is another way to solve this problem.
You need to make an AJAX call to load the JSON file. Then you can access the object as in the example below.
Note: your JSON wasn't properly formatted.
var data = {
  "trololo": {
    "info": ["61511", "path"]
  }
};
console.log(data.trololo.info[0]); // this will print 61511
Usually one makes an AJAX call to read the file on the server.
But if you are OK with using HTML5 features, the link below shows how to read the file in the browser itself. The File API, being part of the HTML5 spec, is stable across browsers.
http://www.html5rocks.com/en/tutorials/file/dndfiles/
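To connect this back to the question: once the contents of total.json reach the page (however they are loaded), extracting "info" is just a parse and a property access. A minimal sketch, with the file contents inlined as a string (the "path" value here is made up, since the question truncates it):

```javascript
// Stand-in for the contents of total.json; the "path" value is hypothetical.
var fileContents = '{"trololo":{"info":"61511","path":"/some/path"}}';

var data = JSON.parse(fileContents);
var info = data.trololo.info;
// In the browser, alert(info); would pop up "61511"
console.log(info);
```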
My application needs to read in a large dataset and pipe it to the client to manipulate with D3.js. The problem is, on large datasets, the reading/loading of the file contents could take a while. I want to solve this using streams. However, I'm unsure of how to do so in the context of the Sails framework.
What I want to do is read the contents of the file and pipe it to a rendered page. However, I can't figure out how to pipe it through if I use something like res.view('somePage', { data: thePipedData });.
I currently have something like this:
var datastream = fs.createReadStream(path.resolve(DATASET_EXTRACT_PATH, datatype, dataset, dataset + '.csv'));
datastream.pipe(res);
...
return res.view('analytics', { title: 'Analytics', data: ??? });
What's the best way to approach this?
Based on your example it seems like the best course of action would be to set up a separate endpoint to serve just the data, and include it on the client via a regular <script> tag.
MyDataController.js
getData: function(req, res) {
  /* Some code here to determine datatype and dataset based on params */

  // Wrap the data in a Javascript string
  res.write("var theData = '");

  // Open a read stream for the file
  var datastream = fs.createReadStream(
    path.resolve(DATASET_EXTRACT_PATH, datatype, dataset, dataset + '.csv')
  );

  // Pipe the file to the response. Set {end: false} so that res isn't closed
  // when the file stream ends, allowing us to continue writing to it.
  datastream.pipe(res, {end: false});

  // When the file is done streaming, finish the Javascript string
  datastream.on('end', function() {
    res.end("';");
  });
}
MyView.ejs
<script type="text/javascript" src="/mydata/getdata?datatype=<%=datatype%>&etc.."></script>
MyViewController.js
res.view('analytics', {datatype: 'someDataType', etc...});
A slight variation on this strategy would be to use a JSONP-style approach; rather than wrapping the data in a variable in the data controller action, you would wrap it in a function. You could then call the endpoint via AJAX to get the data. Either way you'd have the benefit of a quick page load since the large data set is loaded separately, but with the JSONP variation you'd also be able to easily show a loading indicator while waiting for the data.
I am new to JavaScript and D3 and cannot figure out how to let users upload a CSV file and display a scatterplot using D3. I am using the file input tag to let the user select a file, but I am not sure what the next step should be. Is there a way to read the CSV file, store its contents in a D3 array, and then display a graph using that array?
Thanks in advance
Look into the d3.csv function (https://github.com/mbostock/d3/wiki/CSV). Here is a simple example:
// load up the example.csv file
d3.csv('example.csv', function(error, data) {
  if (error) {
    console.log(error);
    return;
  }
  // `data` is an array of objects, one per row,
  // whose fields match the example.csv input file.
  console.log(data);
  // do more stuff with `data` related to
  // drawing the scatterplots.
});
There are a number of examples online showing you how to go from a data array to a scatterplot...it's a matter of modifying those examples to fit your specific data format.
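Since the question is about a user-selected file rather than one on the server, here is a hedged sketch of the upload half: the File API reads the chosen file as text in the browser, and d3.csv.parse (the d3 v3 API) turns that text into the row array. The element id `csvFile` is an assumption; wire `setupUpload` to your page's actual input.

```javascript
// d3 v3: d3.csv.parse converts CSV text into an array of row objects.
function parseCsvText(text) {
  return d3.csv.parse(text);
}

// Wires a file input (assumed: <input type="file" id="csvFile">) so that
// choosing a file reads it and parses the rows for the scatterplot.
function setupUpload() {
  document.getElementById('csvFile').addEventListener('change', function (e) {
    var reader = new FileReader();
    reader.onload = function () {
      var rows = parseCsvText(reader.result);
      console.log(rows); // feed these rows into the scatterplot code
    };
    reader.readAsText(e.target.files[0]);
  });
}
```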
I'm creating an Android app which takes in some JSON data. Is there a way to set up a directory such as:
http://......./jsons/*.json
Alternatively, is there a way to add to a JSON file called a.json and extend the array it contains, pretty much appending more data to the .json file so it grows in size?
It could be done in PHP or Javascript.
Look into parsing JSON: you can use the JSON.parse() function. I'm not sure about listing all the JSON files in a directory; maybe someone else can explain that.
var data = '{"name":"Ray Wilson",' +
    '"position":"Staff Author",' +
    '"courses":[' +
    '"JavaScript & Ajax",' +
    '"Building Facebook Apps"]}';
var info = JSON.parse(data);
//var infostoring = JSON.stringify(info);
One way to add to a json file is to parse it, add to it, then save it again. This might not be optimal if you have large amounts of data but in that case you'll probably want a proper database anyway (like mongo).
Using PHP:
$json_data = json_decode(file_get_contents('a.json'));
array_push($json_data, 'some value');
file_put_contents('a.json', json_encode($json_data));