I'm trying to make a choropleth map visualization with fixed polygon regions, but variable data for each region that depends on a query. Currently I have the polygon coordinates in KML and SHP format, and I could convert them to GeoJSON if needed.
Basically what I want it to do is load a map with those regions once, and then update the values and fill colors of those regions whenever new data is requested or received through an AJAX callback. Most importantly, it shouldn't unnecessarily reload the region polygons; ideally I could just supply a JSON object of region IDs and their new values.
I already tried using the Google Maps API, but I can't seem to avoid generating an entirely new KML file each time I want to load new values. This forces me to reload the region shapes unnecessarily, even though they never change. I tried caching the KML client-side in a JS object, updating its values each time new data is received and then re-setting the map (using geoxml3), but this performs rather slowly (since the entire KML is iterated, and since it's a pretty large file that includes all the region coordinates). Fusion Tables didn't work for me either, because I need to fetch the data from my own database, and from what I understand Fusion Tables only lets you query an FT table.
As far as I know the Gmaps API probably isn't going to work for me. Which other solution could suit my needs best?
Google just added GeoJSON support to the Maps API. This example does something similar to what is described above:
https://developers.google.com/maps/articles/combining-data
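With the Data layer you load the region GeoJSON once and only re-style on each AJAX update. A minimal sketch of the re-styling half (the function names, the 0..max value range, and the linear white-to-red scale are my assumptions; the comments show where the actual `google.maps.Data` calls would go):

```javascript
// Load polygons once:  map.data.addGeoJson(regions);
// On each AJAX update: const styles = buildStyles(newValues, maxValue);
//                      map.data.setStyle(f => styles[f.getProperty('id')]);
// The geometry is never re-parsed, only the styles change.

// Simple linear color scale from white (0) to red (max).
function valueToColor(value, max) {
  const t = Math.max(0, Math.min(1, value / max));
  const shade = Math.round(255 * (1 - t));
  return `rgb(255,${shade},${shade})`;
}

// Given a JSON object of {regionId: value}, build a style lookup
// that the Data layer's setStyle callback can index by region id.
function buildStyles(values, max) {
  const styles = {};
  for (const [id, v] of Object.entries(values)) {
    styles[id] = { fillColor: valueToColor(v, max), fillOpacity: 0.7 };
  }
  return styles;
}
```

The JSON payload from the server then only needs the region IDs and values, exactly as described in the question.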
Related
The UK Environment Agency has a page to view water monitoring data from stations placed across the UK. The data for each point can be exported as a .csv, but it doesn't contain the spatial data.
I want to extract the spatial data for each green marker (shown below) either individually or as a single list, but I can't seem to find that anywhere via the page or in the page's code.
So far I have tried:
Requesting a JSON of the data by scouring the HTML and attempting an AJAX call - I was able to find the <div> classes for each marker, but only the absolute position of the markers, not the spatial coordinates. My attempt at an AJAX call brought up 'undefined' errors, and I don't know enough about the page's structure to craft the correct code to call up anything.
I also tried looking for a JSON with the data via the 'Network' and 'XHR' tabs on my Browser's Development Tools (F12), but the only JSON I found is empty.
Looking for the 'ViewerPort.info' and scouring through the .js scripts to locate the coordinates - I found something promising with spatial coordinates (download) from the viewerport, but it doesn't identify which point is which, so I have no idea whether these are correct, and it would require filling in the other metadata (station name, monitor ID, etc.) manually, which defeats the purpose.
Short of linking every option I found and attempted: an extensive search on Google and Stack Overflow on extracting marker coordinates from embedded Google Maps frames.
This previous question seems to imply that how to access this information can be unique to each website, and that it might not be available at all for privacy/security reasons. Since this information is publicly available and the T&Cs permit extracting data from the page, I was hoping the above would save me from manually inputting the points by eye.
Have I missed an alternative method to extract these spatial points, or has the site been built in such a way that I can't extract them and should stop trying?
I am working on a web application based on OpenLayers, Geoserver, Java and JavaScript.
The user needs to select an area on the map and that portion has to be downloaded as an image. I tried hitting the WMS using Ajax -
http://localhost:8080/geoserver/wms?request=GetMap&service=WMS&version=1.1.0&layers=geoworkspace:STRUCTURE,&styles=&srs=EPSG:27700&bbox=526274.1873390013,196214.08896841796,526277.1040062243,196217.2973028639&width=1200&height=1200&format_options=dpi:300;antialiasing:on&format=image%2Fpng8
As a result, I get all the required layers except for the underlying base map. Is there a way to get the base map too?
Is there an alternative approach to this requirement?
Enable CORS on the server (GeoServer) for AJAX requests:
https://gist.github.com/essoen/91a1004c1857e68d0b49f953f6a06235
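The gist above walks through this; the short version for a standalone (Jetty-based) GeoServer install is to enable the CORS filter in webapps/geoserver/WEB-INF/web.xml (depending on your GeoServer version it may already be there, commented out). A typical fragment, assuming you want to allow all origins:

```xml
<!-- Jetty-based standalone install; a Tomcat deployment would use
     org.apache.catalina.filters.CorsFilter instead. -->
<filter>
  <filter-name>cross-origin</filter-name>
  <filter-class>org.eclipse.jetty.servlets.CrossOriginFilter</filter-class>
  <init-param>
    <param-name>allowedOrigins</param-name>
    <param-value>*</param-value>
  </init-param>
</filter>
<filter-mapping>
  <filter-name>cross-origin</filter-name>
  <url-pattern>/*</url-pattern>
</filter-mapping>
```

Restart GeoServer after the change; in production you'd restrict allowedOrigins to your application's domain rather than `*`.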
If your base layer is on the same server, then you can request it in a single GetMap operation by providing its name in the comma-separated list in the layers request parameter. Note, though, that this creates a single image that merges the layers, so you'll need to take care with the order in which you list them.
So
http://ogc2.bgs.ac.uk/cgi-bin/UGA_ARGI/ows?service=WMS&version=1.3.0&request=GetMap&width=700&height=450&styles=,&layers=ARTISAN,ARTISANC&format=application/openlayers&crs=EPSG:4326&bbox=-2.000000,29.000000,4.500000,37.000000&
The ARTISANC layer is drawn on top of the ARTISAN layer.
and
http://ogc2.bgs.ac.uk/cgi-bin/UGA_ARGI/ows?service=WMS&version=1.3.0&request=GetMap&width=700&height=450&styles=,&layers=ARTISANC,ARTISAN&format=application/openlayers&crs=EPSG:4326&bbox=-2.000000,29.000000,4.500000,37.000000&
The ARTISAN layer is drawn on top of the ARTISANC layer.
So in your case you'd want the base layer to be listed first.
Also note that as you have two layers, you should have two styles, so the styles request parameter looks like styles=,&, or you could just use styles& to force the defaults.
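If you're assembling the request client-side, a small helper keeps the layer order and the matching styles list in sync. A sketch using the BGS endpoint from the examples above (the parameter values are illustrative; note that URLSearchParams percent-encodes commas as %2C, which WMS servers decode normally):

```javascript
// Build a single GetMap URL that merges several layers into one image.
// Layers are drawn in list order: the first entry is the bottom (base) layer.
function buildGetMapUrl(endpoint, layers, params) {
  const query = new URLSearchParams({
    service: 'WMS',
    version: '1.3.0',
    request: 'GetMap',
    layers: layers.join(','),               // base layer first
    styles: layers.map(() => '').join(','), // one (default) style per layer
    ...params,
  });
  return `${endpoint}?${query}`;
}

const url = buildGetMapUrl(
  'http://ogc2.bgs.ac.uk/cgi-bin/UGA_ARGI/ows',
  ['ARTISAN', 'ARTISANC'],
  { width: '700', height: '450', crs: 'EPSG:4326',
    bbox: '-2,29,4.5,37', format: 'image/png' }
);
```

For the original question, the base map layer would simply be listed before geoworkspace:STRUCTURE.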
I would like to convert GeoJSON into MBTiles, either on the client or server side, to be loaded in Leaflet. I have data that is inserted into PostgreSQL every day, and I want to turn it into MBTiles when a user views the map. I have used tippecanoe and mbutil to convert it into the format I need, but I need it to have the most up-to-date data.
I have also tried to use the mapbox dataset API but I couldn't get that to work the way I need it to.
My problem is that I have dynamic data that is updated regularly, and I want to convert it to MBTiles containing vector tiles. One possible solution is to have the server run the conversion every morning so the tiles are fresh each day; I just cannot find a way to do this properly.
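The daily server-side refresh described above can be wired up as a small cron job. A sketch, assuming ogr2ogr and tippecanoe are installed and the paths, database name, and table name are placeholders for your setup:

```shell
#!/bin/sh
# Export fresh GeoJSON from PostgreSQL, rebuild the .mbtiles with
# tippecanoe, then move it into place so the tile server picks it up
# (use a temp dir on the same filesystem if you need the move to be atomic).
ogr2ogr -f GeoJSON /tmp/data.geojson \
  "PG:dbname=mydb" -sql "SELECT * FROM my_table"
tippecanoe -o /tmp/data.mbtiles --force -zg /tmp/data.geojson
mv /tmp/data.mbtiles /var/tiles/data.mbtiles

# Example crontab entry to run this at 03:00 every day:
# 0 3 * * * /usr/local/bin/refresh_tiles.sh
```

tippecanoe's -zg flag guesses an appropriate max zoom from the data, and --force overwrites the previous output file.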
EDIT:
geojson-vt looks like something that could work for me, but I cannot figure out how to render its output in Leaflet 1.0.3.
I am visualizing some government data using Google Maps JS API. Currently every time the user changes a filter value it grabs the entirety of the JSON data again, filters it, and creates a marker for each row that passes the filter validation. This is slow because it's re-downloading the JSON every time you change the form the filters are located in.
There are two ways to approach caching and displaying the data dynamically: storing the received JSON once and destroying/recreating markers based on the filter, or by creating all the markers at once and only displaying those that match filters.
Is there a performance difference between these two options? Both make sense to me, I'm just not sure how to tell whether one is better than the other. How can I assess how 'heavy' google maps markers are for the user?
The two suggested approaches are definitely going to be faster than the original strategy, where the JSON data is re-fetched on each filter change.
I guess there are advantages and disadvantages to each method.
If you are not going to retrieve the JSON data on each filter change, then the data could essentially be out of date; but if the use case is that the JSON data rarely gets updated, this consideration can be dropped.
Having the JSON data cached and creating all of the markers upfront will make the map take a bit longer than usual to load at the start, as you need to create all the markers first; the other way round, you only create a subset of the markers, hence it's quicker to load.
It really comes down to how many markers there are, and what the typical usage pattern of the map is.
If there are a million markers and the typical filter change causes 100,000 markers to be regenerated, then you're better off generating the markers upfront and just tweaking their visibility accordingly.
Similarly, if you have a million markers and the typical filter only causes 1 or 2 of them to appear, then destroying and recreating would probably be faster.
Anyway, as a user I would rather have the map take a bit longer to load at the start, sacrificing 1-2 seconds, and then have the markers change instantaneously when I'm playing with the filters. Hope this helps.
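The "create upfront, toggle visibility" variant can be sketched like this (function and field names are my own; in the Maps API you would call `marker.setMap(map)` / `marker.setMap(null)` instead of flipping a flag):

```javascript
// rows:    the JSON fetched once at startup
// markers: row id -> the marker object created once for that row
// predicate: the current filter, e.g. row => row.year > 2010
function applyFilter(rows, markers, predicate) {
  let shown = 0;
  for (const row of rows) {
    const visible = predicate(row);
    // With Google Maps markers: markers[row.id].setMap(visible ? map : null);
    markers[row.id].visible = visible;
    if (visible) shown++;
  }
  return shown; // how many markers are currently displayed
}
```

No JSON is re-fetched and no markers are constructed inside the filter handler, which is where the original version lost its time.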
Fellow Overflowers,
I am working on a project in which I need to place (pin) a stream of data on a map (Google in my case). One data record consists of 11 columns, and the last 2 are "city" and "country".
The data source is an HTML page using the usual table tags; this is the business model and cannot be changed. I managed to parse and analyze the records using Nokogiri and finally store them in an array.
The idea is to pin each record on the map and show the remaining 9 columns in a balloon.
The catch: the data is refreshed every minute.
I cannot figure out the approach: should I use arrays or a database to save the data? The average number of records displayed on the map at the same time is 120.
And if anybody has implemented something similar, could you comment on the performance?
Thanks a bunch...
Petros
I think you will run into problems when geocoding country and city every minute: the probability that you hit the quota limit or encounter some failure is very high. The rest (refreshing with AJAX and removing/adding 100+ markers) will be no problem.
So if I were doing it, I would create a local database of geocodes (latitude and longitude) keyed by country+city. I would gather the geocodes that are not yet present in the database via google.maps.Geocoder right in the JavaScript, and send them back with AJAX to append to the table. I suppose your country+city database would become complete rapidly, so you wouldn't need extensive geocoding anymore; and for exceptions you always have the geocoder ready in JavaScript to resolve a new city. It should work like a charm.
An alternative is to geocode on the server side, right after you get a new portion of data. But I would prefer the first approach.
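The caching half of the first approach can be sketched as a small memoizing wrapper (function names are mine; `geocode` stands for any async lookup, e.g. one wrapping google.maps.Geocoder, resolving "City, Country" to coordinates):

```javascript
// Wrap an async geocoding function so each country+city pair is
// looked up at most once across the one-minute refresh cycles.
function makeCachedGeocoder(geocode, cache = new Map()) {
  return async function lookup(city, country) {
    const key = `${city}, ${country}`;
    if (!cache.has(key)) {
      // Cache miss: geocode once, store the result. This is also the
      // point where you would POST the new geocode back via AJAX so the
      // server-side table fills up over time.
      cache.set(key, await geocode(key));
    }
    return cache.get(key);
  };
}
```

On page load the cache would be seeded from the server's geocode table, so with ~120 recurring cities the geocoder is rarely called at all after the first day.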