JSON Response from web service - best practices question - javascript

I'm writing a simple web service that returns a JSON response. It'll be heavily used, so I want to make the JSON response as small as possible for performance reasons. I'm on the fence over a design decision; penny for your thoughts!
My JSON response from the server looks like this:
{
  "customers": [
    {
      "id": "337",
      "key": "APIfe45904c"
    },
    {
      "id": "338",
      "key": "somethingDifferent"
    },
    {
      "id": "339",
      "key": "APIfe45904c"
    },
    {
      "id": "340",
      "key": "APIfe45904c"
    }
  ]
}
The APIfe45904c key here is used in about 60-70% of the records, so I could modify the JSON response to remove the repeated information and add a default_key, i.e. if there's no key specified, the client should assume the default_key, like this:
{
  "default_key": "APIfe45904c",
  "customers": [
    {
      "id": "337"
    },
    {
      "id": "338",
      "key": "somethingDifferent"
    },
    {
      "id": "339"
    },
    {
      "id": "340"
    }
  ]
}
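For illustration, here is a minimal sketch of the lookup a client would need under the default_key scheme (the helper name resolveKeys is made up):

// Hypothetical client-side helper: fall back to default_key
// whenever a record omits its own key.
function resolveKeys(response) {
  return response.customers.map(function (customer) {
    return {
      id: customer.id,
      key: customer.key || response.default_key
    };
  });
}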
No client is using the web service yet, so this wouldn't break anything. Is this good practice? It works, and makes for a small JSON response, but I'm conflicted. I like the KISS principle for developers using the service, but I also want as small a JSON response as possible.
I was tempted to replace customers with c, id with i, and key with k to help reduce the response size, but I figured this would be a problem if I want other clients to start using it. Should I drop the idea of default_key for the same reason?
Each JSON response will likely be no more than 200 lines of id/key pairs, so I don't need to incorporate pagination, etc.

I would keep it simple, as you say, and then use gzip to compress it. It should compress very well, as it is repetitive, and it remains convenient for programmers.
See here for pointers in outputting gzip headers for AJAX: Is gzip encoding compatible with JSON?
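For instance, a minimal sketch assuming a Node/Express backend with the npm compression middleware (the question doesn't name a server stack, so this setup is an assumption):

// Gzip every response, including JSON, when the client sends
// Accept-Encoding: gzip; Express and compression handle the headers.
const express = require('express');
const compression = require('compression');

const app = express();
app.use(compression());

app.get('/customers', function (req, res) {
  res.json({ customers: [{ id: '337', key: 'APIfe45904c' }] });
});

app.listen(3000);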

Unless you have very special performance needs, I would always choose clarity over brevity. Especially for an API that is going to be used by many developers.

You should use the consistent format where each record has an id and a key field. What you lose in bandwidth you gain by not having to pre-process the JSON on the client side.
I tend to analyze my JSON data structures like you do, but in the end it isn't worth the tiny bit of space you save. Your JSON data structure looks good... have you seen Twitter's JSON data structure? Now that is ugly.

I would go with the default key idea, but I wouldn't go as far as shortening the attribute names, since that can be confusing. Perhaps you could accept an argument in the web service call (via the query string) that specifies whether the client wants shortened attribute names.
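As a rough sketch of that idea (the short flag and the serializer function are hypothetical):

// Hypothetical serializer: emit short attribute names only when the
// client opts in, e.g. with ?short=1 on the query string.
function serializeCustomers(customers, useShortNames) {
  if (!useShortNames) {
    return { customers: customers };
  }
  return {
    c: customers.map(function (customer) {
      return { i: customer.id, k: customer.key };
    })
  };
}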

Related

Creating a new Phabricator task with javascript

I am trying to connect to the Phabricator Conduit API and create a task via JavaScript bound to a Google Sheet.
The Conduit API docs linked here don't really explain much. I have seen better API documentation!
Below is what I have in mind, but this is cURL and I have no idea how to turn it into JavaScript, or whether it would work or not. I appreciate the help.
curl https://secure.phabricator.com/api/maniphest.edit \
  -d api.token=api-token \
  -d param=[
    {
      "type": "title",
      "value": "A value from a cell on the googlesheet"
    },
    {
      "type": "description",
      "value": "A value from a cell on the googlesheet"
    },
    {
      "type": "subscribers.add",
      "value": "A value from a cell on the googlesheet"
    }
  ]
Generally speaking the steps are:
First, generate an API token in:
https://phabricator.yourdomain.com/settings/user/username/page/apitokens/
where phabricator.yourdomain.com must be replaced with the domain where you have Phabricator installed, and username must be replaced with your administration user name.
Then, assuming you have installed Phabricator at phabricator.yourdomain.com, you can request the API methods with URLs of the following form:
https://phabricator.yourdomain.com/api/method_name?parameter1=value1&parameter2=value2...
where method_name must be replaced by the name of a real method from this catalog:
https://secure.phabricator.com/conduit/
For example, if you want to read the contents of task number 125, with a generated API token of value api-svhcp2a3qmgkkjfa5f6sh7cm4joz, use the method maniphest.info to complete a URL like this:
http://phabricator.yourdomain.com/api/maniphest.info?api.token=api-svhcp2a3qmgkkjfa5f6sh7cm4joz&task_id=125&output=json
This URL can be directly tested in your preferred browser to obtain a JSON response with the information about task number 125 (make sure that task ID exists). Firefox will even show the returned JSON in a human-readable fashion.
These working URLs can then be used in JavaScript, for example:
window.location.href = 'http://phabricator.yourdomain.com/api/maniphest.info?api.token=api-svhcp2a3qmgkkjfa5f6sh7cm4joz&task_id=125&output=json';
or as an asynchronous Ajax call.
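For example, a minimal sketch of the same request as an asynchronous call with the browser's fetch API (the token and task ID are the placeholder values from above):

// Fetch task 125 from Conduit and log the parsed JSON response.
fetch('http://phabricator.yourdomain.com/api/maniphest.info' +
    '?api.token=api-svhcp2a3qmgkkjfa5f6sh7cm4joz&task_id=125&output=json')
  .then(function (response) { return response.json(); })
  .then(function (result) { console.log(result); });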
I had a similar problem to yours (I used HTTParty with Ruby).
To solve it I used the following body (using your example):
"transactions[0][type]=title&transactions[0][value][0]=A value from a cell on the googlesheet&transactions[1][type]=description&transactions[1][value]=A value from a cell on the googlesheet&transactions[2][type]=subscribers.add&transactions[2][value][0]=A value from a cell on the googlesheet"

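Since the original question involved a Google Sheet, here is a minimal sketch of sending that same form-encoded body from Google Apps Script with UrlFetchApp (the token is a placeholder; passing a plain object as payload makes UrlFetchApp form-encode it):

// Sketch, assuming Google Apps Script: POST the transactions[n][...]
// fields above to Conduit's maniphest.edit endpoint.
function createManiphestTask() {
  var payload = {
    'api.token': 'api-token', // replace with your real token
    'transactions[0][type]': 'title',
    'transactions[0][value][0]': 'A value from a cell on the googlesheet',
    'transactions[1][type]': 'description',
    'transactions[1][value]': 'A value from a cell on the googlesheet',
    'transactions[2][type]': 'subscribers.add',
    'transactions[2][value][0]': 'A value from a cell on the googlesheet'
  };
  var response = UrlFetchApp.fetch('https://secure.phabricator.com/api/maniphest.edit', {
    method: 'post',
    payload: payload
  });
  Logger.log(response.getContentText());
}
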
I have some raw JSON stored in a variable and just want to post it to an API

I have been traversing Stack Overflow and everywhere else on the web to try and find a solution to my issue.
I am working in JavaScript and attempting to POST a small section of JSON to an endpoint in the API I know is working (I have completed the GET and POST manually in Postman).
Here is my issue:
I don't really want to do the "GET" in my programme; I just want to either reference the file or even just store it in a little variable.
So for example I have in my code:
var OauthUpload = {
  "objects": [
    {
      "name": "api",
      "serviceID": 16,
      "properties": {}
    }
  ],
  "version": "integration",
  "environment": "redshift"
};
Then I am trying to reference this in the JS function:
function ApiPostOauth (port) {
  $.post(OauthUpload, "http://docker.dc.test.com:" + getActualPort(port) + "/rest/v1/oauth/import", runner);
}
But I am having no joy! I have seen a few different solutions, but none seem to fit for me.
Basically I want a way to just:
Reference my own JSON as a variable and then insert that, so my function ApiPostOauth has it available before it runs?
Thanks guys
Steve
I have put together an example for your use. When executing this code, the server will return the same object it is sent, so the OauthUpload object is sent as the request body and the server returns the exact same object. (Note: if you don't see output in the output panel when running the sample, leave a comment and I will restart the server.) This is here to demonstrate:
[EDIT] - after re-reading your question, it appears you would like to pass the OauthUpload object into the function. I've updated the example.
You have a mistake in your call to jQuery's post() method. As shown in the comments, the first two arguments are reversed in your call to post().
Since you didn't pick up on that, I decided to provide an example using your code. Since I don't have your server, I stood up a server for this example. So the URL and port will be different, but the AJAX call will be the same.
Please pay close attention to the OauthUpload object. Notice the property names are no longer surrounded by ". I removed these quotes because they seemed to be causing you confusion about JavaScript objects and JSON strings (there is no such thing as a "JSON object", regardless of what you read on the web - JSON is a string format).
Next, look at the differences between the call made to $.post() in my example and your code. You will see the URL comes first in that call.
let url = "//SimpleCORSEnabledServer--randycasburn.repl.co/rest/v1/oauth/import";
let OauthUpload = {
  objects: [{
    name: "api",
    serviceID: 16,
    properties: {}
  }],
  version: "integration",
  environment: "redshift"
};

ApiPostOauth(OauthUpload);

function ApiPostOauth(data) {
  $.post(url, data, runner);
}

function runner(data) {
  document.querySelector('pre').textContent = JSON.stringify(data, null, 2);
}

<pre></pre>
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>

Double parse JSON data (or use another method to split)

Wondering what the best way is to break a large data object from AJAX apart. If I send just one portion (say, paths) I use JSON.parse(data). What I'd really like to do is split the object apart first into its individual blocks, then be able to do something like JSON.parse(data['paths']).
Here's a clipped sample of the JSON data
{
  "paths": {
    "type": "FeatureCollection",
    "features": [{
      "type": "Feature",
      "geometry": {
        "type": "MultiLineString",
        "coordinates": [
          [
            [-122.32074805731085, 47.634990818586026],
            [-122.32074412999432, 47.63497931696752],
            [-122.32107629703529, 47.63465666282262]
          ]
        ]
      },
      "properties": {
        "path_name": "Woodland path"
      },
      "id": 2
    }]
  },
  "beds": {
    "type": "FeatureCollection",
    "features": [{
      "type": "Feature",
      "geometry": {
        "type": "MultiPolygon",
        "coordinates": [
          [
            [
              [-122.32073753862116, 47.6347629704532],
              [-122.32071585642394, 47.63470617810399],
              [-122.32073753862116, 47.6347629704532]
            ]
          ]
        ]
      },
      "properties": {
        "bed_name": "Azalea Triangle"
      },
      "id": 1
    }]
  }
}
Here's what I have in JavaScript (CoffeeScript syntax):
$.ajax
  dataType: 'text'
  url: 'map.json'
  success: (data) ->
Here's the Rails code that generates the call
data = { buildings: @geopaths, lawns: @geobeds }
respond_to do |format|
  format.json { render json: data }
  format.html
end
UPDATE:
I had sort of avoided explaining what I was trying to do because I thought it would confuse the issue. In a nutshell: I am collecting data from a database and sending it to JavaScript to be displayed as layers on a map. Each layer has a name (paths, beds, etc.) and gets encoded as GeoJSON in Rails before being sent to JavaScript with an AJAX command. If I only send one layer, I have no trouble parsing the data and getting it onto the map. A typical line of code would look like pathMarkers = L.geoJSON(JSON.parse(data)).
I now need to pass multiple layers to the map. My understanding is that AJAX can only handle one object, so I combine both paths and beds into one object. When I get to the JavaScript side I don't know what to do. In other words, I need to get only the portion of the object that has path data for the pathMarkers, and only the portion that has bed data for the bedMarkers.
Graphically this is what I'm trying to do:
paths = a bunch of GeoJSON data
beds = a bunch of GeoJSON data
Use AJAX to send paths and beds to javascript
build pathMarkers with JSON.parse for the paths data
build bedsMarkers with JSON.parse for the beds data
I could build a sample and post it to bitbucket if it would help.
Assuming I understood correctly and your concern is bringing distinct data layers into a geo library like Leaflet.js, a single AJAX request is fine unless the JSON payload is so large that it crashes the browser.
As you don't provide much of your code, the following is a general example of how you would do it.
First you create the map object. Obviously :)
const map = L.map(id).setView([1.2345, -1.2345], 10);
Start the AJAX request to fetch the geoJSON file.
$.ajax({
  dataType: "json",
  url: '/json/lives/here.json',
  data: {} /* any props you'd like to pass as query string to your server */,
  success: success
});
And the crux of the issue: "How do I access each feature collection?"
The success or done callback is where you can be sure you received the data, and can add it to the map.
jQuery's AJAX method, when called with dataType: 'json', automatically runs JSON.parse() for you (and a couple of other things). Once the JSON is parsed, it can be accessed as any other object in JS.
At this point the success callback receives the JSON-turned-into-object, which you can access with traditional JS methods. Like so:
function success (data) {
  // data is the parsed JSON. It is now just a JS object.
  // Below: for every "key" in the data object, pass its feature collection
  // to L.geoJSON() and add it to the map.
  for (var layerName in data) {
    if (data.hasOwnProperty(layerName)) {
      L.geoJSON(data[layerName], {/* options */}).addTo(map);
    }
  }
}
To answer your point about AJAX and a single object: AJAX is just like any other browser request.
Yes, you send one request at a time, and likewise you receive one response from the server. But the response can contain absolutely any data, so what you are doing server-side is totally OK!
In your case the data consists of a JSON text file, which is later parsed and turned into a JS object for you to actually do something with. The object contains a bunch of "keys" (beds, paths) and all you need to do is iterate over each of those keys and pass each one to Leaflet's geoJSON method for rendering.
I've run into this problem before with incredibly large payloads on mobile devices (iOS with Phonegap, to be precise). You may want to look into a library named OboeJS, at http://www.juancaicedo.com/oboe.js-website/.
Essentially, that library streams the JSON request so that you can process it in chunks. You should be able to use this to suit your needs.
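A minimal sketch of that approach, assuming the oboe global from that library and the map and map.json names used earlier in this question:

// Stream map.json and hand each top-level feature collection to
// Leaflet as soon as it has been parsed, instead of waiting for
// the whole payload.
oboe('map.json')
  .node('!.*', function (featureCollection, path) {
    // path[0] is the layer name, e.g. "paths" or "beds"
    L.geoJSON(featureCollection).addTo(map);
    return oboe.drop; // discard the parsed node to keep memory flat
  });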

Create json object with javascript

I've tested my REST service with success using Advanced Rest Client, where I'm sending a payload that looks like this:
{
  "comments": "test comments",
  "internal_verification_areas": [
    {
      "area_id": "1",
      "selected": "1",
      "notes": "notes1",
      "status": "1"
    },
    {
      "area_id": "2",
      "selected": "0",
      "notes": "notes2",
      "status": "0"
    }
  ]
}
As mentioned, my REST function executes with success.
I then moved to implement the whole thing in my web interface and created the internal_verification_areas object as follows:
var verification_areas = {
  internal_verification_areas: [
    {
      area_id: "1", // no need to quote variable names
      selected: "1",
      notes: "noter",
      status: "1"
    },
    {
      area_id: "2", // no need to quote variable names
      selected: "1",
      notes: "noter2",
      status: "1"
    }
  ]
};
The whole thing is then fed into my request like this (the comments parameter is fetched from a textarea):
$.post("createInternalVerification.php", {comments: $('textarea#korrigeringer').val(), internal_verification_areas: verification_areas});
createInternalVerification.php will json encode the data and request the service.
The problem is that I get an error saying: "Integrity constraint violation: 1048 Column 'area_id' cannot be null". I assume there is something wrong with my posted data, but I can't figure out what. From my POV, my Advanced Rest Client payload looks similar to the payload I send from my web interface.
EDIT:
I've noticed that the network tab (Google Chrome) shows some differences in my payload. I'm returning internal_verification_areas in my response to analyze the difference.
(MY WEB INTERFACE RECEIVES)
{"error":false,"message":"Intern efterprovning oprettet","test":{"internal_verification_areas":[{"area_id":"1","selected":"1","notes":"noter","status":"1"},{"area_id":"2","selected":"1","notes":"noter2","status":"1"},{"area_id":"3","selected":"1","notes":"noter3","status":"1"}]}}
(ADVANCED REST CLIENT RECEIVES)
{"error":false,"message":"Intern efterprovning oprettet","test":[{"area_id":"1","selected":"1","notes":"jAAAAAAA","status":"1","id":"4"},{"area_id":"2","selected":"0","notes":"NEEEEEJ","status":"0","id":"5"}]}
Guess I messed up my understanding of objects and arrays. Turns out my web interface was sending an object wrapping the array rather than the array itself. Changing it (as shown below) fixed my mistake. I'm so sorry zerkms for wasting your precious time and causing an immense annoyance to your unicum skilled mind. I find it more and more frightening to post questions on Stack Overflow in the presence of such skilled and godlike figures, who constantly remind minions such as myself that Stack Overflow has become the very bedrock of arrogant developers.
var internal_verification_areas = [
  {
    area_id: "1", // no need to quote variable names
    selected: "1",
    notes: "noter",
    status: "1"
  },
  {
    area_id: "2", // no need to quote variable names
    selected: "1",
    notes: "noter2",
    status: "1"
  },
  {
    area_id: "3", // no need to quote variable names
    selected: "1",
    notes: "noter3",
    status: "1"
  }
];
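With that fix, the corresponding call (a sketch based on the snippet earlier in the question) becomes:

// The array is now passed directly, so the PHP side receives
// internal_verification_areas as a plain array of area objects.
$.post("createInternalVerification.php", {
  comments: $('textarea#korrigeringer').val(),
  internal_verification_areas: internal_verification_areas
});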

Large JSON data for mobile app : should I use array or object?

Given JSON data that is a dictionary of 50,000 to 300,000 entries.
Given that I'm building a hybrid app (HTML5/JS/CSS) on mobile, with potentially slow devices.
I get my data as an array, but since my users will constantly interact with the data, for the sake of speed and performance, should I use, query, and edit an array like this (note: I know the target word, "zoo"):
var dict = [
  { "word": "acadia", "fr": ... },
  { "word": "acaria", "fr": ... },
  { ... },
  ...
];
but I don't have the index; I just have the value "zoo" with which to find the { "word": "zoo" } object.
Or should I use, query, and edit an object like this:
var dict = {
  "acadia": { "word": "acadia", "fr": ... },
  "acaria": { "word": "acaria", "fr": ... },
  "...": { ... },
  ...
};
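To make the difference concrete, here is a minimal sketch of the two lookups (dictArray and dictObject are made-up names for the two shapes above):

// Array form: finding "zoo" means scanning entries until a match.
var fromArray = dictArray.find(function (entry) { return entry.word === 'zoo'; });

// Object form: "zoo" is a direct property access, no scan needed.
var fromObject = dictObject['zoo'];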
Array
Objects are slower than arrays.
The code to write an object is longer than the code to write an array.
When I load a lot of data like that, I compress the response as follows:
obj = {
  info: { en: 0, fr: 1, es: 2, it: 3 },
  data: [
    ['acadia', '...', '...'],
    ['acaria', '...', '...'],
  ]
};

// access
var wordNumber = 0, lng = obj.info.en;
obj.data[wordNumber][lng];
but it also depends on how you structure/index everything.
So maybe
obj = {
  'acadia': ['fr...', 'es...', 'it...'],
  'acaria': ['fr...', 'es...', 'it...'],
};

// access:
var word = 'acadia', lng = 0;
var result = obj[word] ? obj[word][lng] : 'word does not exist';
is faster, as it has direct access...
but no duplicates and no disallowed characters; you need to check for that.
Not sure if fr is for the French word... correct me if I'm wrong and I'll edit the code.
By the way, I loaded a 20 MB JSON file on an iPad 1 without problems; it contained the EXIF data of 20k images, and it was written very badly. You could use cache.manifest or WebSQL to store it permanently. Considering how badly my JSON file was written, the dictionary with 300k entries should be about the same size if you use arrays.
But for such big data, and if you update constantly, you should also consider a server-side language and a proper DB, where you update only the necessary data.
EDIT
data.json
{
  info: ['en', 'fr', 'it'],
  data: [
    ['enword1', 'frword1', 'itword1'],
    ['enword2', 'frword2', 'itword2'],
    //.....
  ]
}
Then store everything in a WebSQL DB and use that for offline.
To create an even smaller JSON file:
{
  info: ['en', 'fr', 'it'],
  data: [
    ['enword1,frword1,itword1'],
    ['enword2,frword2,itword2'],
    //.....
  ]
}
then use split(','); words must not contain ,.
Smaller still, so the JSON file is very small:
{
info:['en','fr','it'],
data:'enword1,frword1,itword1|enword2,frword2,itword2'
}
Words must not contain , or |.
var words = data.split('|'), l = words.length, word;
while (l--) {
  word = words[l].split(','); // en,fr,it
  // insert into webSQL
}
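As a sketch of that insert step, assuming the (now-deprecated) WebSQL openDatabase API mentioned above is available:

// Minimal sketch: open a WebSQL database and insert each split word row.
// WebSQL is deprecated and not available in all browsers; check support first.
var db = openDatabase('dict', '1.0', 'dictionary', 5 * 1024 * 1024);
db.transaction(function (tx) {
  tx.executeSql('CREATE TABLE IF NOT EXISTS words (en, fr, it)');
  var words = data.split('|'), l = words.length, word;
  while (l--) {
    word = words[l].split(','); // en,fr,it
    tx.executeSql('INSERT INTO words (en, fr, it) VALUES (?, ?, ?)',
                  [word[0], word[1], word[2]]);
  }
});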
But you need to test whether the browser can handle that easily. You need to find an equilibrium between file size and how long it takes to insert all these words.
