I understand why we use JSON instead of XML to exchange data between browser and server, but I don't understand why we would use only the string type when JSON has six different value datatypes. Why can't we use integers, Booleans, or any of the other value types?
I hope you understand what I'm trying to say. Thanks in advance.
If I understand correctly, the limitation comes from the way data has to be encoded to be sent over HTTP and, ultimately, over the wire. Your JSON object (or XML, etc.) is ultimately just a payload for HTTP (which is itself just a payload for TCP, and so on).
HTTP inherently does not, and should not, identify data types in the payload; to HTTP it is just an array of bytes. You can choose how to represent this array, i.e. how to encode it: it can be text (ASCII, UTF-8, etc.) or binary, but it has to be uniform for the whole payload.
HTTP does offer different ways to encode the payload, which the receiver can interpret by looking at the Content-Type header and decoding the data accordingly.
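For instance, a minimal Node.js sketch (using the built-in http module; the port and payload are illustrative) that declares the payload's encoding through the Content-Type header:

    const http = require('http');

    http.createServer((req, res) => {
        // The body is just bytes; the header tells the receiver how to decode them.
        const payload = JSON.stringify({ count: 3, ok: true });
        res.writeHead(200, { 'Content-Type': 'application/json; charset=utf-8' });
        res.end(payload);
    }).listen(8080);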
Hope this helps.
why we are using only string type of JSON
Uhm, we're not. I believe you're misunderstanding something here. HTTP responses can really contain anything; every time you download a PDF or an image from a web server, the web server is sending a binary payload, which can literally be anything. So it's not even true that all HTTP bodies must be text.
To exchange data between systems, you send bytes. For those bytes to mean anything, you need an encoding scheme. Image formats have a particular way in which bytes need to be arranged, and when they are arranged properly, you can send pictures with them. The same goes for PDFs, video, audio, and anything else (including text).
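As a quick Node.js illustration (a minimal sketch using the built-in Buffer type), the same three bytes decode to different text depending on which encoding scheme you apply:

    // 0xC3 0xA9 is "é" in UTF-8 but two separate characters in Latin-1.
    const bytes = Buffer.from([0x68, 0xc3, 0xa9]);
    console.log(bytes.toString('utf8'));    // "hé"
    console.log(bytes.toString('latin1'));  // "hÃ©"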
If you want to send structured data, you need to express that structure somehow. How do you send, for example, a PHP array over HTTP…? (Substitute the equivalent list data structure in your language of choice.) You can't. A PHP array is a specific data structure in the memory of a PHP runtime; sending it as-is over HTTP has no meaning (because it deals with internal pointers and such). The array needs to be serialised first. There are many possible serialisation methods, some of them using binary data and some using formats which are human-readable to varying degrees. You could simply join all array elements with commas and .split(',') them again on the other end, but that's rather simplistic and misses many complex cases and edge cases.
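To see where that falls apart, here is a minimal JavaScript sketch in which a single value containing a comma corrupts the round trip:

    const arr = ['a', 'b,c', 3];
    const wire = arr.join(',');   // "a,b,c,3"
    wire.split(',');              // ['a', 'b', 'c', '3']: four strings instead of three mixed values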
JSON and XML (and YAML and whatnot) are human-readable formats which can serialise data structures like arrays (and dictionaries and numbers and booleans etc.), and which happen to be text-based (purposely, to make them developer-friendly). You can use any of the data types JSON allows; nothing prevents you from doing so, and not using them would be insane. JSON and XML also happen to be two formats easily parsed with tools built into every browser. You could use any other binary format too, but then you'd have to parse it manually in JavaScript.
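By contrast, a JSON round trip (a minimal sketch using the JSON object built into every browser and Node.js) keeps the same array intact, nesting and types included:

    const arr = ['a', 'b,c', 3];
    const wire = JSON.stringify(arr);   // '["a","b,c",3]'
    JSON.parse(wire);                   // ['a', 'b,c', 3]: structure and number preserved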
Communication between browser and server can be done in many ways; it's just that JSON comes out of the box. You can use protobuf, XML, and a plethora of other data serialization techniques to communicate with the server, as long as both sides understand the format. On the browser side, you would probably have to implement the protobuf, XML, etc. serialization/deserialization yourself in JavaScript.
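For instance, fetching a binary payload in the browser could look like the following sketch; the /api/orders endpoint and the application/x-protobuf content type are hypothetical:

    // Browser side: pull the raw bytes out of the response for a custom decoder.
    async function loadOrders() {
        const res = await fetch('/api/orders', {
            headers: { Accept: 'application/x-protobuf' },
        });
        const bytes = new Uint8Array(await res.arrayBuffer());
        // hand `bytes` to whatever deserializer both sides agreed on
        return bytes;
    }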
Any valid JSON is permitted for data exchange. The keys are quoted strings, but the values can be strings, numbers, booleans, arrays, or other objects. Before transmission everything is serialised into a single string, and the receiving side parses it back into the correct types.
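A minimal sketch of that round trip:

    const payload = { id: 42, active: true, tags: ['a', 'b'], meta: { score: 3.14 } };
    const wire = JSON.stringify(payload);   // a single string goes over the wire
    const back = JSON.parse(wire);
    typeof back.id;             // 'number'
    typeof back.active;         // 'boolean'
    Array.isArray(back.tags);   // true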
I have a Node.js application and need to convert a Buffer object into an array of bytes to use as the payload of a C/C++ API. I tried Uint8Array and Array.prototype.slice.call, but they don't seem to work as expected: the API could not decode them correctly.
Is there any way I can do this?
Thanks.
Problem solved. I made a mistake earlier. After retesting, converting the Buffer to a Uint8Array works; it can be passed to the C/C++ API as an array of bytes.
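For reference, a sketch of the conversion (buf here is a placeholder Buffer): Uint8Array.from copies the bytes, while the Uint8Array view constructor shares memory with the Buffer:

    const buf = Buffer.from([0x01, 0x02, 0x03]);

    // Copy: an independent Uint8Array holding the same bytes.
    const copy = Uint8Array.from(buf);

    // View: shares memory with buf; respects the Buffer's own offset and length.
    const view = new Uint8Array(buf.buffer, buf.byteOffset, buf.length);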
What is the correct way to insert a Buffer as a BLOB, 1-to-1, without serializing it to a string or hex? Is it possible at all from Node.js?
On my backend I already have Buffers of binary data and want to store them with minimal overhead using prepared statements. I'm getting an ER_TRUNCATED_WRONG_VALUE_FOR_FIELD (errno: 1366) Incorrect string value error. It looks like the library just assumes the buffer contains a valid UTF-8 string.
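Roughly what I'm trying, as a sketch (the blobs table and data column are made up):

    const mysql = require('mysql');
    const conn = mysql.createConnection({ /* connection config */ });

    const buf = Buffer.from([0xde, 0xad, 0xbe, 0xef]);

    // Expectation: with a real BLOB column, the driver escapes the Buffer as a
    // hex literal (X'deadbeef') and the raw bytes are stored 1-to-1.
    conn.query('INSERT INTO blobs (data) VALUES (?)', [buf], (err) => {
        if (err) throw err;
    });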
I used Protostuff to transform a JSON input I have into a byte array. The Java code is:
    // poligonsStr is a List<String> of JSON polygon strings, defined elsewhere
    LinkedBuffer buffer = LinkedBuffer.allocate(1024);
    Schema<String> orderSchema = RuntimeSchema.getSchema(String.class);
    List<byte[]> byteslist = new ArrayList<byte[]>();
    for (String p : poligonsStr) {
        buffer.clear();
        byteslist.add(ProtostuffIOUtil.toByteArray(p, orderSchema, buffer));
    }
The problem is that I don't know the algorithm that is used, or how I can decode the result with the JavaScript client (Node.js). I also saw there is a very good algorithm called Smile implemented for Protostuff in the com.dyuproject.protostuff project, but I would like to know how to get a schema with that library; I haven't managed that yet.
I would like to know which is better to use: ProtostuffIOUtil or SmileIOUtil?
And how do I use it? And how do I decode the result with JavaScript?
Protostuff's binary encoding is different from protobuf's, and as far as I know there is no JavaScript library that can decode protostuff-encoded data at the moment.
Smile is not supported by web browsers out of the box, but there are libraries that can decode it.
In my opinion, there are two optimal ways to encode data on the server using the Protostuff library and decode it with JavaScript on the client side:
Use protobuf encoding; it is good when the size of the encoded data matters. On the server side, use ProtobufIOUtil to serialize your data into the protobuf binary format. On the client side, you can use https://github.com/dcodeIO/ProtoBuf.js/ to decode the binary data from the server (see the client-side sketch below).
Use JSON encoding; it is the native format for JavaScript and will usually be parsed faster than binary protobuf-encoded data. On the server side, use JsonIOUtil (from the protostuff-json module) to serialize your data into JSON text. On the client side, it is supported out of the box.
Here is an example of how to serialize your POJO into protobuf binary using Protostuff: HelloService.java
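On the client side, decoding those protobuf bytes could look like the following sketch with the protobufjs library (assuming a hello.proto that defines a Hello message; the names are illustrative):

    const protobuf = require('protobufjs');

    protobuf.load('hello.proto').then((root) => {
        const Hello = root.lookupType('Hello');
        // `payload` is the raw byte array received from the server.
        const message = Hello.decode(new Uint8Array(payload));
        console.log(message);
    });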
I have a system with two processes, one in Java and one in Node.js. The Node.js process is a web front end: it ingests data and sends it to a queue for processing, where it is consumed by the Java process. The data is a string of user data collected from browser code. I create the string in Node.js using JSON.stringify(data) and push it to Kinesis, the AWS version of Kafka. The Java process receives the raw bytes, creates a String object, and then parses the JSON.
My question is this: I am not sure what decoding I should instruct Java to use on the raw bytes. Right now it "just works" with the default decoding, but I feel this is a bad idea, since the default decoding could be platform-dependent. Should I use a Buffer on the Node.js side and encode the string as UTF-8 before I push it to the queue? That way I could explicitly set UTF-8 as the decoding on the Java side. Is this a best practice? Any advice is much appreciated.
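What I have in mind on the Node.js side, as a sketch (names are illustrative; the Java counterpart would be new String(bytes, StandardCharsets.UTF_8)):

    const data = { user: 'alice', clicks: 3 };

    // Encode explicitly as UTF-8 instead of relying on a platform default.
    const payload = Buffer.from(JSON.stringify(data), 'utf8');

    // ...push `payload` to the Kinesis stream as-is.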