I am trying to create a JSON file to test some react-native map polygon functionality. I have GeoJSON available from a project that uses Leaflet maps, and I need to reformat the lat/lng points. I have drilled down from the top level to geometry.coordinates, but I am stuck on what to do next. Since I only need the end result for testing, any library can be used in the plunker to get the desired result.
Here is where I am at:
[
[
[
-106.75845,
34.659846
],
[
-106.81188,
34.649485
],
[
-106.80648,
34.646378
],
[
-106.75845,
34.659846
]
],
[
[
-106.70432,
34.650473
],
[
-106.79663,
34.720663
],
[
-106.7278,
34.637498
],
[
-106.70432,
34.650473
]
]
]
This is what I need the end result to look like: plunker
// desired result
var result = [[{
latitude: 0,
longitude: 0
}, {
latitude: 0,
longitude: 0
}, {
latitude: 0,
longitude: 0
}], [{
latitude: 0,
longitude: 0
}, {
latitude: 0,
longitude: 0
}, {
latitude: 0,
longitude: 0
}]];
Try this updated plunker. Note that GeoJSON positions are stored in [longitude, latitude] order, so the indices are swapped when building the objects:
obj = obj.map(function (innerArray) {
return innerArray.map(function (value) {
// GeoJSON order is [longitude, latitude]
return { latitude: value[1], longitude: value[0] };
});
});
console.log(obj);
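As a quick sanity check, the same transform can be run on a slice of the sample data above with plain Node and no libraries (note again that in the source GeoJSON the first number of each pair is the longitude, since latitudes cannot exceed 90):

```javascript
// One ring from the sample data, in GeoJSON [longitude, latitude] order
const coords = [
  [
    [-106.75845, 34.659846],
    [-106.81188, 34.649485],
  ],
];

// Map each [lng, lat] pair to a { latitude, longitude } object
const result = coords.map(ring =>
  ring.map(([lng, lat]) => ({ latitude: lat, longitude: lng }))
);

console.log(result[0][0]); // { latitude: 34.659846, longitude: -106.75845 }
```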
Related
We have a dictionary with cities, using a unique id as the key:
cities: {
'adfjlx9w': {
name: 'New York',
latitude: 4,
longitude: -7
},
'favkljsl9': {
name: 'Copenhagen',
latitude: 2,
longitude: -18
}
}
We need to convert our dictionary into Geojson in order to place the cities on a map, but cannot use the typical route below, as it is not an array of objects:
GeoJSON.parse(cities, {
Point: ['latitude', 'longitude']
});
What is the fastest and best way to do this?
If I understand correctly, you need to extract the latitude and longitude data for each value of the cities object into an array of objects of shape:
{ latitude, longitude }
One approach would be to use Object#values, which returns an array of the values of cities, and then optionally use Array#map to transform each object into a new object with only the latitude and longitude values:
const cities = {
'adfjlx9w': {
name: 'New York',
latitude: 4,
longitude: -7
},
'favkljsl9': {
name: 'Copenhagen',
latitude: 2,
longitude: -18
}
}
const latlngArray = Object
// Extract value array from cities
.values(cities)
// Map each value to lat/lng only object
.map(item => ({ latitude : item.latitude, longitude : item.longitude }))
console.log(latlngArray);
/* Pass latlngArray to GeoJSON.parse
GeoJSON.parse(latlngArray, {
Point: ['latitude', 'longitude']
});
*/
Hope that helps!
Something like this should work.
const citiesArray = Object.values(cities);
GeoJSON.parse(citiesArray, {
Point: ['latitude', 'longitude']
});
According to the GeoJSON Specification, to include the id and name properties with latitude/longitude coordinates, you would need to parse the object as type FeatureCollection with a property features.
Each features array object should be of type Feature, with properties and geometry values. The properties value should contain the metadata, and the geometry value should be of type Point with a coordinates array holding the position in [longitude, latitude] order, as the specification requires.
const cities = {
'adfjlx9w': {
name: 'New York',
latitude: 4,
longitude: -7
},
'favkljsl9': {
name: 'Copenhagen',
latitude: 2,
longitude: -18
}
}
let citiesGeoJSON = Object.entries(cities)
.reduce((
_cities,
[cityID, cityData],
) => {
let city = {
"type": "Feature",
"properties": {
"id": cityID,
"name": cityData.name,
},
"geometry": {
"type": 'Point',
"coordinates": [
cityData.latitude,
cityData.longitude,
],
},
}
_cities.features.push(city)
return _cities
}, {
"type": "FeatureCollection",
"features": [],
})
There is a GeoJSON Linter online.
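Before pasting into a linter, a minimal structural self-check can catch the most common mistakes (wrong type strings, coordinates that are not a two-number array). This is just a sketch of such a check for a FeatureCollection of Points, not a replacement for a real validator:

```javascript
// Minimal shape check for a GeoJSON FeatureCollection of Point features
function looksLikePointCollection(geojson) {
  return geojson.type === 'FeatureCollection' &&
    Array.isArray(geojson.features) &&
    geojson.features.every(f =>
      f.type === 'Feature' &&
      f.geometry && f.geometry.type === 'Point' &&
      Array.isArray(f.geometry.coordinates) &&
      f.geometry.coordinates.length === 2 &&
      f.geometry.coordinates.every(n => typeof n === 'number'));
}

const sample = {
  type: 'FeatureCollection',
  features: [{
    type: 'Feature',
    properties: { id: 'adfjlx9w', name: 'New York' },
    geometry: { type: 'Point', coordinates: [-7, 4] }, // [longitude, latitude]
  }],
};

console.log(looksLikePointCollection(sample)); // true
```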
When using $.extend(true, {}, obj1, obj2), the resulting object loses any numerically-indexed data from obj1 at the indexes that also exist in obj2, because arrays are merged index-by-index rather than concatenated.
Take this sample data for instance:
var base_json_obj = {
'regions': [
'NC',
'-Greensboro',
'VA',
'-Richmond'
],
'vehicles': [
'Ford:Escape',
'Nissan:Altima'
],
'turnover': {
'min': '0d',
'max': '5d'
}
};
var new_json_obj = {
'regions': [
'FL',
'-Miami'
],
'vehicles': [
'Hyundai:Sonata'
],
'turnover': {
'min': '1d',
'max': '6d'
}
};
var resulting_object = $.extend(true, {}, base_json_obj, new_json_obj);
The resulting object is
{
'regions': [
'FL',
'-Miami',
'VA',
'-Richmond'
],
'vehicles': [
'Hyundai:Sonata',
'Nissan:Altima'
],
'turnover': {
'min': '1d',
'max': '6d'
}
}
And here is the expected output. Notice that regions has all 6 values and that vehicles has all 3 values.
{
'regions': [
'NC',
'-Greensboro',
'VA',
'-Richmond',
'FL',
'-Miami'
],
'vehicles': [
'Ford:Escape',
'Nissan:Altima',
'Hyundai:Sonata'
],
'turnover': {
'min': '1d',
'max': '6d'
}
}
Is there a way to modify the call to $.extend() or use $.merge() in some way to achieve this?
$.extend(true, ...) treats arrays like objects, merging by value and index, so just as 'min' is overwritten, the value at index 0 will be replaced.
You'll need to write something that handles this case. The $.extend() function is relatively complicated, but a good place to start:
Github - jQuery Core Source
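A recursive merge that concatenates arrays instead of overwriting them by index could look roughly like this. This is only a sketch for plain objects and arrays, not a drop-in $.extend replacement (it ignores prototypes, Dates, and other special cases):

```javascript
// Deep-merge two plain objects, concatenating arrays instead of
// merging them index-by-index the way $.extend(true, ...) does.
function mergeConcat(a, b) {
  const out = {};
  for (const key of new Set([...Object.keys(a), ...Object.keys(b)])) {
    const va = a[key], vb = b[key];
    if (Array.isArray(va) && Array.isArray(vb)) {
      out[key] = va.concat(vb);               // arrays: append
    } else if (va && vb && typeof va === 'object' && typeof vb === 'object') {
      out[key] = mergeConcat(va, vb);         // nested objects: recurse
    } else {
      out[key] = vb !== undefined ? vb : va;  // scalars: second object wins
    }
  }
  return out;
}

const merged = mergeConcat(
  { regions: ['NC', '-Greensboro'], turnover: { min: '0d' } },
  { regions: ['FL', '-Miami'], turnover: { min: '1d' } }
);
console.log(merged);
// { regions: ['NC', '-Greensboro', 'FL', '-Miami'], turnover: { min: '1d' } }
```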
You can use a loop:
var base_json_obj = {
'regions': [
'NC',
'-Greensboro',
'VA',
'-Richmond'
],
'vehicles': [
'Ford:Escape',
'Nissan:Altima'
],
'turnover': {
'min': '0d',
'max': '5d'
}
};
var new_json_obj = {
'regions': [
'FL',
'-Miami'
],
'vehicles': [
'Hyundai:Sonata'
],
'turnover': {
'min': '1d',
'max': '6d'
}
};
var resulting_object = {};
for (var key in base_json_obj) {
// the values from base and new json objects respectively
var val1 = base_json_obj[key], val2 = new_json_obj[key];
resulting_object[key] = Array.isArray(val1)? // if array
val1.concat(val2): // concatenate them, else (objects)
Object.assign({}, val1, val2); // assign the second object to the first one
}
console.log(resulting_object);
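One caveat: the loop above only visits keys from base_json_obj, so a key that exists only in new_json_obj would be silently dropped. Iterating the union of both key sets avoids that. A sketch, using hypothetical sample objects:

```javascript
const base = { regions: ['NC'], onlyInBase: [1] };
const extra = { regions: ['FL'], onlyInNew: [2] };

const result = {};
for (const key of new Set([...Object.keys(base), ...Object.keys(extra)])) {
  const v1 = base[key], v2 = extra[key];
  if (v1 === undefined || v2 === undefined) {
    result[key] = v1 !== undefined ? v1 : v2;  // present on one side only
  } else if (Array.isArray(v1)) {
    result[key] = v1.concat(v2);               // arrays: concatenate
  } else {
    result[key] = Object.assign({}, v1, v2);   // objects: shallow merge
  }
}
console.log(result);
// { regions: ['NC', 'FL'], onlyInBase: [1], onlyInNew: [2] }
```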
I have a JSON file like below:
{
"soils": [{
"mukey": "658854",
"mukeyName": "Meggett-Kenansville-Garcon-Eunola-Blanton-Bigbee (s1517)",
"sl_source": "Fl soil map",
"cokey": "3035468",
"soilName": "Eunola",
"comppct_r": 20,
"compArea": "9.96"
}],
"asfirs": [{
"long": -82.96896600817682,
"lat": 29.977675992923395
}],
"polygon": [{
"rings": [
[
[-9235836.910744485,
3501136.0564117758
],
[-9235798.692230342,
3500237.921329426
],
[-9236553.507884657,
3500667.87961353
],
[-9235836.910744485,
3501136.0564117758
]
]
],
"spatialReference": {
"wkid": 102100,
"latestWkid": 3857
}
}]
}
I want to extract the value of the polygon key into another JSON object like below:
{
"rings": [
[
[-9161396.799823288,
3453315.140590871
],
[-9160708.866568722,
3453095.3841345515
],
[-9161349.02668061,
3452751.4175072685
],
[-9161396.799823288,
3453315.140590871
]
]
],
"spatialReference": {
"wkid": 102100,
"latestWkid": 3857
}
}
Now when I do it using
var key3 = 'polygon';
var newPolygonJSON = polygonJson[key3];
var text = JSON.stringify(newPolygonJSON);
where polygonJson contains my initial JSON file, I get an extra pair of [] brackets, which is not allowing me to create a proper JSON file, like below:
[{
"rings": [
[
[-9235836.910744485,
3501136.0564117758
],
[-9235798.692230342,
3500237.921329426
],
[-9236553.507884657,
3500667.87961353
],
[-9235836.910744485,
3501136.0564117758
]
]
],
"spatialReference": {
"wkid": 102100,
"latestWkid": 3857
}
}]
How can I get rid of those [] brackets or extract the value properly?
The extra [] brackets appear because the value of the polygon key is itself an array (containing a single object), so stringifying it produces an array. To extract just the object, stringify the first (and only) element of that array:
var key3 = 'polygon';
var newPolygonJSON = polygonJson[key3];
var text = JSON.stringify(newPolygonJSON[0]);
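If the polygon key might be missing or empty, a small guard avoids stringifying undefined. A sketch, using abbreviated sample data:

```javascript
const polygonJson = {
  polygon: [{
    rings: [[[-9235836.9, 3501136.0], [-9235798.7, 3500237.9]]],
    spatialReference: { wkid: 102100, latestWkid: 3857 },
  }],
};

// The "polygon" value is an array with one element; unwrap it first
const [polygon] = polygonJson.polygon || [];
const text = polygon ? JSON.stringify(polygon) : null;

console.log(text[0]); // '{' — a plain object, no outer [ ] brackets
```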
Is there any way we can query and get location data using a MongoDB geospatial query that matches the following criteria?
Getting all locations that are part of the intersection between two boxes or, in general, two polygons.
For example, can we get in the query output only those locations that are within the yellow area, which is the common area of the purple and red geometric objects [polygons]?
My study of mongodb document so far
http://docs.mongodb.org/manual/reference/operator/query/geoWithin/
This provides results that are within one or more polygons [ I am looking for the intersection of these individual polygon results as output ]
Use case
db.places.find( {
loc: { $geoWithin: { $box: [ [ 0, 0 ], [ 100, 100 ] ] } }
} )
Above query provides results within a rectangle geometric area [ I am looking for locations that are common to two such individual queries ]
db.places.find( {
loc: { $geoWithin: { $box: [ [ 0, 0 ], [ 100, 100 ] ] } }
} )
db.places.find( {
loc: { $geoWithin: { $box: [ [ 50, 50 ], [ 90, 120 ] ] } }
} )
So looking at this with a fresh mind the answer is staring me in the face. The key thing that you have already stated is that you want to find the "intersection" of two queries in a single response.
Another way to look at this is you want all of the points bound by the first query to then be "input" for the second query, and so on as required. That is essentially what an intersection does, but the logic is actually literal.
So just use the aggregation framework to chain the matching queries. For a simple example, consider the following documents:
{ "loc" : { "type" : "Point", "coordinates" : [ 4, 4 ] } }
{ "loc" : { "type" : "Point", "coordinates" : [ 8, 8 ] } }
{ "loc" : { "type" : "Point", "coordinates" : [ 12, 12 ] } }
And the chained aggregation pipeline, just two queries:
db.geotest.aggregate([
{ "$match": {
"loc": {
"$geoWithin": {
"$box": [ [0,0], [10,10] ]
}
}
}},
{ "$match": {
"loc": {
"$geoWithin": {
"$box": [ [5,5], [20,20] ]
}
}
}}
])
So if you consider that logically, the first result will find the points that fall within the bounds of the initial box or the first two items. Those results are then acted on by the second query, and since the new box bounds start at [5,5] that excludes the first point. The third point was already excluded, but if the box restrictions were reversed then the result would be the same middle document only.
How this works is quite unique to the $geoWithin query operator as compared to various other geo functions:
$geoWithin does not require a geospatial index. However, a geospatial index will improve query performance. Both 2dsphere and 2d geospatial indexes support $geoWithin.
So the results are both good and bad. Good in that you can do this type of operation without an index in place, but bad because once the aggregation pipeline has altered the collection results after the first query operation, no further index can be used. So any performance benefit of an index is lost when merging the "set" results from anything after the initial Polygon/MultiPolygon as supported.
For this reason I would still recommend that you calculate the intersection bounds "outside" of the query issued to MongoDB. Even though the aggregation framework can do this due to the "chained" nature of the pipeline, and even though resulting intersections will get smaller and smaller, your best performance is a single query with the correct bounds that can use all of the index benefits.
There are various methods for doing that, but for reference here is an implementation using the JSTS library, which is a JavaScript port of the popular JTS library for Java. There may be others or other language ports, but this has simple GeoJSON parsing and built in methods for such things as getting the intersection bounds:
var async = require('async'),
util = require('util'),
jsts = require('jsts'),
mongo = require('mongodb'),
MongoClient = mongo.MongoClient;
var parser = new jsts.io.GeoJSONParser();
var polys= [
{
type: 'Polygon',
coordinates: [[
[ 0, 0 ], [ 0, 10 ], [ 10, 10 ], [ 10, 0 ], [ 0, 0 ]
]]
},
{
type: 'Polygon',
coordinates: [[
[ 5, 5 ], [ 5, 20 ], [ 20, 20 ], [ 20, 5 ], [ 5, 5 ]
]]
}
];
var points = [
{ type: 'Point', coordinates: [ 4, 4 ] },
{ type: 'Point', coordinates: [ 8, 8 ] },
{ type: 'Point', coordinates: [ 12, 12 ] }
];
MongoClient.connect('mongodb://localhost/test',function(err,db) {
db.collection('geotest',function(err,geo) {
if (err) throw err;
async.series(
[
// Insert some data
function(callback) {
var bulk = geo.initializeOrderedBulkOp();
bulk.find({}).remove();
async.each(points,function(point,callback) {
bulk.insert({ "loc": point });
callback();
},function(err) {
bulk.execute(callback);
});
},
// Run each version of the query
function(callback) {
async.parallel(
[
// Aggregation
function(callback) {
var pipeline = [];
polys.forEach(function(poly) {
pipeline.push({
"$match": {
"loc": {
"$geoWithin": {
"$geometry": poly
}
}
}
});
});
geo.aggregate(pipeline,callback);
},
// Using external set resolution
function(callback) {
var geos = polys.map(function(poly) {
return parser.read( poly );
});
var bounds = geos[0];
for ( var x=1; x<geos.length; x++ ) {
bounds = bounds.intersection( geos[x] );
}
var coords = parser.write( bounds );
geo.find({
"loc": {
"$geoWithin": {
"$geometry": coords
}
}
}).toArray(callback);
}
],
callback
);
}
],
function(err,results) {
if (err) throw err;
console.log(
util.inspect( results.slice(-1), false, 12, true ) );
db.close();
}
);
});
});
The full GeoJSON "Polygon" representations are used there, since that is what JTS can understand and work with. Chances are any input you might receive for a real application would be in this format as well, rather than using conveniences such as $box.
So it can be done with the aggregation framework, or even parallel queries merging the "set" of results. But while the aggregation framework may do it better than merging sets of results externally, the best results will always come from computing the bounds first.
In case anyone else looks at this, as of mongo version 2.4, you can use $geoIntersects to find the intersection of GeoJSON objects, which supports intersections of two polygons, among other types.
{
<location field>: {
$geoIntersects: {
$geometry: {
type: "<GeoJSON object type>" ,
coordinates: [ <coordinates> ]
}
}
}
}
There is a nice write up on this blog.
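Filled in for the second box from the question, the filter document would look roughly like the sketch below (written as a plain object; the loc field name and the coordinates come from the earlier examples, and the box corners are expanded into a closed polygon ring in [longitude, latitude] order):

```javascript
// $geoIntersects filter for a concrete GeoJSON Polygon; the ring is the
// box [[50, 50], [90, 120]] expanded to corners and closed on itself.
const filter = {
  loc: {
    $geoIntersects: {
      $geometry: {
        type: 'Polygon',
        coordinates: [[
          [50, 50], [90, 50], [90, 120], [50, 120], [50, 50],
        ]],
      },
    },
  },
};

// Usage (assuming a connected collection): db.places.find(filter)
console.log(filter.loc.$geoIntersects.$geometry.type); // 'Polygon'
```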
I have a JSON like this:
{
"default": [
[
1325876000000,
0
],
[
1325876000000,
0
],
[
1325876000000,
0
],
[
1325876000000,
0
]
],
"direct": [
[
1328196800000,
0
],
[
1328196800000,
100
],
[
1328196800000,
0
],
[
1328196800000,
0
]
],
"Sales": [
[
1330517600000,
0
],
[
1330517600000,
0
],
[
1330517600000,
90
],
[
1330517600000,
0
]
],
"Support": [
[
1332838400000,
0
],
[
1332838400000,
0
],
[
1332838400000,
0
],
[
1332838400000,
0
]
]
}
I want to generate an array containing the name of each item and the first value of the corresponding array. The result should look like this:
ticks = [["default", 1325876000000],["direct", 1328196800000],["Sales", 1330517600000],["Support", 1332838400000]]
The names like default, direct, Sales, Support are dynamic, so I can't do jsondata.Support.
What I tried:
ticks = []
for (var key in jsondata) {
arraynew = [];
arraynew.push(key)
}
but I don't know how to push the values. Help please.
You just need to access the sub-array.
var ticks = [];
for (var key in jsondata) {
ticks.push( [ key, jsondata[key][0][0] ] );
}
The expression jsondata[key] gets you the outer array corresponding to each key. Then, jsondata[key][0] gets you the first of the sub-arrays, and adding the final [0] to that gets you the first value in the first sub-array.
Note that you're not guaranteed to get the keys back in any particular order.
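If a stable order matters (for example, for chart ticks), sorting the keys first makes the result deterministic. A sketch with an abbreviated version of the data:

```javascript
const jsondata = {
  direct: [[1328196800000, 0]],
  default: [[1325876000000, 0]],
};

// Sort keys alphabetically, then pair each key with its first timestamp
const ticks = Object.keys(jsondata)
  .sort()
  .map(key => [key, jsondata[key][0][0]]);

console.log(ticks);
// [['default', 1325876000000], ['direct', 1328196800000]]
```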