Robust open source Node.js based projects for learning? - javascript

I'm working on learning Node.js in particular (specifically tower.js). I'd like to see some very robust open source application examples that revolve around more complex scenarios, particularly in the mapping area. I haven't yet found a solid example that I feel I can sink my teeth into, but I'm hoping someone can point me towards a Google Maps or Open MapQuest style application built atop Node, if possible.
Any suggestions in this vein? Any robust example app is fine, though one of those scenarios would be much more apropos.

I've been working on an Open MapQuest based application on Node.js myself. This is the best full-stack reference I could find: Node.js, Express, Leaflet, PostGIS. But it's far from robust; in fact, it's vulnerable to SQL injection!
In general, you'll need two things:
a client-side map renderer like Leaflet, which is incredibly awesome.
a geospatial database like MongoDB, which has built-in geospatial indexing and which I'm using, or PostGIS, which they used in that tutorial.
Then, just follow a Tower tutorial, and create a RESTful endpoint that queries nearby items of interest given a longitude and latitude. You probably won't be able to use Tower Models, since they don't support geospatial queries AFAIK. On the client side, check if your map's been moved, and fetch the endpoint to update nearby items.
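To give a rough idea, here is a minimal sketch of what such an endpoint could look like, assuming Express and a MongoDB collection named pois with a 2dsphere index on its location field (all names here are placeholders, not taken from any particular tutorial):

```js
const express = require('express');
const { MongoClient } = require('mongodb');

const app = express();
const client = new MongoClient('mongodb://localhost:27017');

// GET /nearby?lng=...&lat=... returns points of interest near the given coordinate.
app.get('/nearby', async (req, res) => {
  const lng = parseFloat(req.query.lng);
  const lat = parseFloat(req.query.lat);

  // Requires a 2dsphere index: db.pois.createIndex({ location: '2dsphere' })
  const pois = await client.db('geodemo').collection('pois').find({
    location: {
      $near: {
        $geometry: { type: 'Point', coordinates: [lng, lat] },
        $maxDistance: 5000 // metres
      }
    }
  }).limit(50).toArray();

  res.json(pois);
});

client.connect().then(() => app.listen(3000));
```

On the Leaflet side, the moveend event is a reasonable place to refresh the nearby items:

```js
map.on('moveend', async () => {
  const { lng, lat } = map.getCenter();
  const items = await (await fetch(`/nearby?lng=${lng}&lat=${lat}`)).json();
  // Clear old markers, then add one per item; GeoJSON coordinates are [lng, lat],
  // while Leaflet markers take [lat, lng].
  items.forEach(item => {
    L.marker([item.location.coordinates[1], item.location.coordinates[0]]).addTo(map);
  });
});
```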

Related

Selection of technology implementation?

I need to develop a small application integrated into SharePoint. I am using the on-premise 2019 version. The application consists of data entry and SQL queries for loading data into the fields. I want to use the latest approaches for this purpose. I still have time, but I don't know which way to go.
I looked at a lot of no-code solutions, but as I understand it, I need to move in the direction of JS. At the same time, I want to store the entered data in SharePoint lists or somewhere similar. But how can I get data from the external MSSQL database: should I use Node.js, a web part, REST? I am very confused, please help me. I have developed C# Windows Forms applications with ADO.NET, so if I know the direction I will figure it out in JS.
So following up with the comments, you could implement the following stack:
React or Vue: To implement a simple GUI to perform basic CRUD on your DB
NodeJS: To handle the data manipulation (if any) and push things to SharePoint
Take a look at the very popular MERN stack for inspiration on the app structure. (MERN uses MongoDB, but the logic would be the same if you swap that bit for MSSQL.)
Using NodeJS you can leverage Sharepoint REST API using the available libraries. Check out this tutorial for details on the implementation.
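As a rough illustration of the NodeJS-to-SharePoint piece, here is a minimal sketch of reading from and writing to a SharePoint list over its REST API. The site URL, list name, and auth headers are placeholders; on-premise SharePoint 2019 typically requires NTLM authentication, which a library such as node-sp-auth can supply.

```js
const fetch = require('node-fetch');

const site = 'https://sharepoint.example.com/sites/mysite'; // placeholder site URL
const authHeaders = { /* e.g. headers produced by node-sp-auth */ };

// Read all items from a list.
async function getItems(listTitle) {
  const res = await fetch(
    `${site}/_api/web/lists/getbytitle('${listTitle}')/items`,
    { headers: { ...authHeaders, Accept: 'application/json;odata=verbose' } }
  );
  const data = await res.json();
  return data.d.results;
}

// Add an item to a list. Writes need a request digest from the contextinfo endpoint.
async function addItem(listTitle, title) {
  const ctx = await fetch(`${site}/_api/contextinfo`, {
    method: 'POST',
    headers: { ...authHeaders, Accept: 'application/json;odata=verbose' }
  });
  const digest = (await ctx.json()).d.GetContextWebInformation.FormDigestValue;

  await fetch(`${site}/_api/web/lists/getbytitle('${listTitle}')/items`, {
    method: 'POST',
    headers: {
      ...authHeaders,
      Accept: 'application/json;odata=verbose',
      'Content-Type': 'application/json;odata=verbose',
      'X-RequestDigest': digest
    },
    body: JSON.stringify({
      __metadata: { type: `SP.Data.${listTitle}ListItem` }, // the list item type name varies per list
      Title: title
    })
  });
}
```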

Is Elasticsearch a separate database or does it work with MongoDB or others?

Is Elasticsearch a database itself? Is it safe to use it as my primary database? Is it secure as my primary database to store sensitive data?
Elasticsearch is a standalone database. Its main use case is searching text and running text/number-related queries such as aggregations. Generally, it's not recommended to use Elasticsearch as the main database, as some operations, such as indexing (inserting values), are more expensive than in other databases.
You can use Elasticsearch along with any other database such as MongoDB or MySQL, where the other databases can act as the primary database, and you can sync Elasticsearch with your primary database for the "searchable" parts of the data.
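A minimal sketch of that sync pattern, using the official @elastic/elasticsearch client (v8-style API; older client versions wrap the request in a body field), with placeholder index and field names:

```js
const { Client } = require('@elastic/elasticsearch');
const es = new Client({ node: 'http://localhost:9200' });

// Called whenever a product record changes in the primary database (MongoDB, MySQL, ...).
async function syncProduct(product) {
  await es.index({
    index: 'products',
    id: String(product.id), // reuse the primary key so updates overwrite the old copy
    document: { name: product.name, description: product.description }
  });
}

// Full-text search served from Elasticsearch; the primary DB stays the source of truth.
async function searchProducts(text) {
  const result = await es.search({
    index: 'products',
    query: { match: { description: text } }
  });
  return result.hits.hits.map(hit => hit._source);
}
```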
Elasticsearch works well with a number of other products from Elastic such as Logstash for logging purposes and Kibana for visualization purposes.
The Elasticsearch homepage has a well-written description of it and its main use cases.
You might want to look at the link below to understand Elasticsearch's usefulness as a database and what trade-offs you have to make in order to use it as a primary database:
https://www.elastic.co/blog/found-elasticsearch-as-nosql
In general, Elasticsearch has primarily been used as an index store for retrieving/searching data very fast. Elasticsearch is powered by Lucene, a high-performance text search engine library, which makes it a very powerful tool for providing a full-text search layer on top of an application. But it is usually recommended that your "source of truth" database be kept separate from the Elasticsearch index data, because, owing to the nature of its primary operation (full-text search), it has not focused on other aspects of a database such as durability, security, and write consistency. Hope this helps.
This question is from 2018, and at that time I probably would have answered with a definitive no - don't use Elasticsearch as your only and primary database.
But since ~2015 a lot of resiliency issues have been found and addressed, and in recent years so many features, specifically stability and resiliency features, have been added that it's definitely something to consider, given the right use cases and leveraging the right features in the right way.
I have used the ELK stack only once, to monitor the log file from an application. The 'database' used is Elasticsearch, and it does seem as though it is positioned to be a primary database that is also "open source and free to use."
Elasticsearch is a distributed, RESTful search and analytics engine capable of solving a growing number of use cases. As the heart of the Elastic Stack, it centrally stores your data so you can discover the expected and uncover the unexpected.
My understanding is that the ELK stack comprises three tools: Elasticsearch, Logstash, and Kibana. Elasticsearch is a search and analytics engine. Logstash is a server-side data processing pipeline that ingests data from multiple sources simultaneously, transforms it, and then sends it to a "stash" like Elasticsearch. Kibana lets users visualize the data in Elasticsearch with charts and graphs.
That's from their site at:
https://www.elastic.co/elk-stack
I am curious what others have to say and will have to follow your question. I currently work with Oracle and SQL Server for our application and would like to see how we could leverage additional database software in the future. Open source is always intriguing.

How to render point cloud data in browser with iTowns2

I am attempting to use iTowns2 (https://github.com/iTowns/itowns2) to visualize point cloud data in the browser. According to the README: "[iTowns'] first purpose was the visualisation of street view images and terrestrial lidar point cloud."
From this I glean that there should be instances of people using iTowns to visualize point cloud data somewhere online. I've been looking for days and I can't find an example of someone using iTowns2 to visualize point cloud data in the browser.
The example in the GH repo renders a globe in the browser but no point cloud. There is an iTowns/iTowns2-sample-data repo which has a bunch of point cloud data, but no instructions on how to use the data and no references to other resources.
Has anyone used this package to show point cloud data in the browser? Does anyone know an article or resource that demonstrates doing this with iTowns2? Does anyone know of a different library for rendering point cloud data with examples and/or better documentation?
Ideally I would be able to track down the source code for something like this: http://www.itowns-project.org/#demo
The documentation is quite ambiguous, and judging by the GitHub issues it looks like the library is under heavy refactoring.
I took a quick look at the repo and realized that it is just using Potree for point cloud visualization:
http://potree.org/
So you can just use Potree directly, which is better documented.
In addition to this, it's quite trivial to set up your own point cloud visualizer using Three.js.
Just take a look at the Points object:
https://threejs.org/docs/#api/objects/Points
And this example:
https://github.com/mrdoob/three.js/blob/master/examples/webgl_buffergeometry_points.html
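To give a rough idea, a bare-bones Three.js point cloud viewer can be as short as the sketch below; the positions are random placeholders standing in for your lidar data:

```js
import * as THREE from 'three';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(60, window.innerWidth / window.innerHeight, 0.1, 100);
camera.position.z = 5;

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// One flat array of x, y, z triplets; replace with coordinates from your point cloud.
const count = 10000;
const positions = new Float32Array(count * 3);
for (let i = 0; i < positions.length; i++) {
  positions[i] = (Math.random() - 0.5) * 4;
}

const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));

const material = new THREE.PointsMaterial({ size: 0.02, color: 0x88ccee });
scene.add(new THREE.Points(geometry, material));

renderer.setAnimationLoop(() => renderer.render(scene, camera));
```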
Three.js also includes some 3D format loaders, like ply:
https://github.com/mrdoob/three.js/blob/master/examples/webgl_loader_ply.html
If you are interested in using las files you might also want to look at:
https://github.com/verma/plasio
Let's provide an early 2018 update! (source: I'm a maintainer)
iTowns now supports visualizing pointclouds directly. You can test it here: http://www.itowns-project.org/itowns/examples/pointcloud.html
If you want to test your own data, please visit http://www.itowns-project.org/itowns/examples/pointcloud.html?selector=1
We currently support output from PotreeConverter and lopocs. We plan to add the 3D Tiles point cloud format soon.
We did indeed use Potree for point clouds before, but that was not ideal, partly because we diverge on some technology/design choices, but mainly because using Potree prevented us from tightly integrating point cloud visualization in iTowns. For instance, iTowns stops its rendering loop when it has nothing to do (which saves a lot of CPU), and Potree does not. It also allows us to implement our own culling/SSE/network priority... heuristics.
Potree currently has better graphical post-processing of point clouds, although we also plan to add EDL and other improvements (occlusion, for instance) soon. And of course, the advantage of iTowns is that it's not limited to point clouds but can display a variety of data types, from rasters to vectors; see the examples page and especially this example of a point cloud on a globe.
But the main difference between these two projects is that Potree aims at being a standalone viewer (AFAIK), whereas iTowns is more of a framework for implementing your own app! Potree remains a big source of inspiration for us concerning point clouds; big kudos to their maintainer :-)
(Btw, the github has moved to https://github.com/iTowns/itowns)

python data analysis overlay image of US

I am familiar with coding in Python for the work I do in bioinformatics. I've recently been asked to do a different type of analysis -- analyzing data and then overlaying it on a map of the US. I figure I will need to use JavaScript after I write the Python code to do the data analysis, but I am not familiar with creating images. What is the best way to combine my Python data analysis with code that will produce a dynamic image?
Thanks for your help.
My solutions:
1) As other people have said, you could try the Google Maps APIs and write a bit of code.
2) Or you can use OpenStreetMap, which I would prefer.
I have built several LBS-based apps and websites, so I know how to place coordinates on maps.
If you want to finish this quickly and make it nicer, you may try this combination:
Django as the framework,
PostgreSQL as the DB backend,
PostGIS as the geolocation handler,
OpenStreetMap as the map viewer.
My summary:
Solution 1) is quicker, but it requires some hard-coding effort on your part.
Solution 2) is a bit slower but full-featured, and it is much more extensible for future development.
Hope this helps.
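For the JavaScript display side mentioned in the question, a minimal Leaflet + OpenStreetMap sketch could look like the following, assuming the Python analysis writes its results to a GeoJSON file (the file name and map element id are placeholders):

```js
// Renders an OpenStreetMap base layer centred on the US and overlays GeoJSON
// produced by the Python analysis step. Assumes Leaflet is loaded and an
// element <div id="map"></div> exists on the page.
const map = L.map('map').setView([39.8, -98.6], 4); // roughly the centre of the US

L.tileLayer('https://tile.openstreetmap.org/{z}/{x}/{y}.png', {
  attribution: '&copy; OpenStreetMap contributors'
}).addTo(map);

fetch('analysis_results.geojson')            // placeholder file written by the Python script
  .then(response => response.json())
  .then(data => L.geoJSON(data).addTo(map)); // draws the analysis results over the base map
```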

Building a geo focused app: what should I know?

I am looking to build an application that relies heavily on geographic data.
The application will use HTML5's ability to get GPS data and will do computations such as finding the nearest street, finding the shortest path between two points, etc. I was thinking of using a platform such as Google Maps, so it will most likely be written in JavaScript. However, I might offload the client's CPU by doing the heavy computations server-side (possibly in C++ or a scripting language).
Is there any technology, framework, standard, etc. that I should know about before I start coding?
Many SQL-based relational databases have spatial awareness that can help with GPS coordinates.
MySQL's spatial extensions are one example. Here is an article on it. I gather that if you can convert GPS data to spatially related fields, then you can do things like select the row nearest to another, or the first northerly row ...
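For instance, here is a hedged sketch of a nearest-rows query from Node.js using the mysql2 driver and MySQL's ST_Distance_Sphere function (MySQL 5.7+); the table and column names are placeholders, and the location column is assumed to be a POINT storing longitude/latitude:

```js
const mysql = require('mysql2/promise');

// Finds the ten rows nearest to a GPS coordinate, with the distance in metres.
async function nearestPlaces(lng, lat) {
  const conn = await mysql.createConnection({ host: 'localhost', user: 'app', database: 'geo' });
  const [rows] = await conn.execute(
    `SELECT id, name,
            ST_Distance_Sphere(location, POINT(?, ?)) AS metres
     FROM places
     ORDER BY metres
     LIMIT 10`,
    [lng, lat] // ST_Distance_Sphere treats POINT(x, y) as (longitude, latitude)
  );
  await conn.end();
  return rows;
}
```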
Make sure to understand projections and geographic datum correction. How you do it will depend on your choice of technologies, obviously, but if you don't understand those issues they will bite you badly.
You can find my glowing review of the Google Maps API here.
