Here are the two scenarios.
1. We build an XML SOAP request by hand, send it with XMLHttpRequest to a WCF SOAP service, get the response back, and use XPath to parse the data and fill out a drop-down list.
2. We send a JSON request to a REST WCF service, get a JSON response back, and assign the values to a drop-down list.
Which scenario is faster? My sense tells me #2 but I could be wrong.
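For reference, scenario #2 boils down to roughly this sketch (the /api/items endpoint, the response shape, and the myDropDown element are simplified placeholders rather than the real code):

    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/api/items');
    xhr.setRequestHeader('Accept', 'application/json');
    xhr.onload = function () {
        var items = JSON.parse(xhr.responseText);            // e.g. [{"id":1,"name":"First"}, ...]
        var select = document.getElementById('myDropDown');  // the <select> being filled
        items.forEach(function (item) {
            var option = document.createElement('option');
            option.value = item.id;
            option.textContent = item.name;
            select.appendChild(option);
        });
    };
    xhr.send();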
JSON will be faster, since JSON is essentially JavaScript. But that shouldn't be the main motivation; parsing the data will presumably be only a small part of your application anyway.
On the other hand, browsers are also well trained to parse XML.
The main difference is that XML, and therefore SOAP, is larger to send to the client, so the transfer may be a bigger slowdown than the parsing.
Anyway, if you want to know, you should just test and profile instead of guessing or asking.
Option two would generally be faster than option one, as JSON is a much simpler format than XML.
However, if you really need the parsing to be fast, you shouldn't use either; you should use a custom format that is really fast to parse using simple string operations, for example a comma-separated string that can be parsed with a split(',').
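As a rough sketch of the difference (both payload strings are made up for illustration):

    // The same three values in a custom comma-separated format and in JSON.
    var csvPayload  = 'red,green,blue';
    var jsonPayload = '["red","green","blue"]';

    var fromCsv  = csvPayload.split(',');    // one string operation, no parser involved
    var fromJson = JSON.parse(jsonPayload);  // the built-in JSON parser

    console.log(fromCsv);   // ["red", "green", "blue"]
    console.log(fromJson);  // ["red", "green", "blue"]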
After profiling in my scenario, I found that JSON is actually much faster in terms of processing time within the browser.
Related
I understand why we are using JSON to exchange data between browser and server in place of XML, but I could not understand why we would use only the string type of JSON even though we have six different value datatypes. I mean, why can't we use integer or Boolean or any other value datatype?
Hope you guys understand what I'm trying to say. Thanks in advance.
If I understand correctly, the limitation is because of the way data needs to be encoded to be sent over HTTP and ultimately over the wire. Your JSON object (or XML, etc.) is ultimately just a payload for HTTP (which is just a payload for TCP in turn, and so on).
HTTP inherently does not and should not identify data types in the payload; to HTTP it is just an array of bytes. You can select how to represent this array, i.e. how to encode it; it can be text (ASCII, UTF-8, etc.) or binary, but it has to be uniform for the whole payload.
HTTP does offer different ways of encoding the payload, which the receiver can interpret by looking at the Content-Type header and decoding the data accordingly.
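A minimal sketch of that, where the /api/save endpoint is made up for illustration:

    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/api/save');
    xhr.setRequestHeader('Content-Type', 'application/json; charset=utf-8');
    xhr.send(JSON.stringify({ id: 7, active: true })); // the body is just text; the header says how to decode it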
Hope this helps.
"why we are using only string type of JSON"
Uhm, we're not. I believe you're misunderstanding something here. HTTP responses can really contain anything; every time you download a PDF or an image from a web server, the web server is sending a binary payload, which can literally be anything. So it's not even true that all HTTP bodies must be text.
To exchange data between systems, you send bytes. For these bytes to mean anything, you need an encoding scheme. Image formats have a particular way in which bytes need to be arranged, and when properly doing so, you can send pictures with them. Same for PDFs, video, audio, and anything else (including text).
If you want to send structured data, you need to express that structure somehow. How do you send, for example, a PHP array over HTTP? (Substitute your equivalent list data structure in your language of choice.) You can't. A PHP array is a specific data structure in the memory of a PHP runtime; sending it as-is over HTTP has no meaning (because it deals with internal pointers and such). The array needs to be serialised first. There are many possible serialisation methods, some of them using binary data, and some using formats which are human readable to varying degrees. You could simply join all array elements with commas and .split(',') them again on the other end, but that's rather simplistic and misses many more complex cases and edge cases.
JSON and XML (and YAML and whatnot) are human readable formats which can serialise data structures like arrays (and dictionaries and numbers and booleans etc), and which happen to be text-based (purposely, to make them developer-friendly). You can use any of those data types JSON allows. Nothing prevents you from doing so, and not using them is insane. JSON and XML also happen to be two formats easily parsed with tools built into every browser. You could use any other binary format too, but then you'd have to manually parse it in Javascript.
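As a quick sketch of that round trip (the object below is made up, but note that every JSON value type survives the trip):

    var payload = {
        name: 'example',        // string
        count: 3,               // number
        active: true,           // boolean
        tags: ['a', 'b', 'c'],  // array
        meta: { nested: null }  // nested object and null
    };

    var wireFormat = JSON.stringify(payload); // the string that actually travels over HTTP
    var rebuilt    = JSON.parse(wireFormat);  // the structure restored on the other end

    console.log(typeof rebuilt.count);   // "number"
    console.log(typeof rebuilt.active);  // "boolean"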
Communication between browser and server can be done in many ways. It's just that JSON comes out of the box. You can use protobuf, XML and a plethora of other data serialization techniques to communicate with the server, as long as both sides understand the communication medium. On the browser side, you would probably have to implement protobuf, XML, etc. serialization/deserialization yourself in JavaScript.
Any valid JSON is permitted for data exchange. The keys are quoted strings, but the values can be strings, numbers, booleans, arrays, or other objects. Before transmission everything is serialized into a string, and the receiving side parses it back into the correct types.
I am fetching JSON data from my local server and was wondering what functions I should run my data through before printing it on the page in HTML. Just want to ensure everything is secure and any special characters like quotes are handled properly.
Thanks!
If you are using legal JSON and you are using a real JSON parser, not eval(), then your JSON is safe. It can't contain executable code, only data definitions.
You are certainly free in your client code to take the parsed JSON and run a bunch of sanity checks on the data to make sure it makes sense and passes any specific tests you might want to run on it, but you won't have to worry about code injection if you are using real JSON and a real JSON parser. That is one of the advantages of using JSON - it is a data-only format.
If you're worried about someone hijacking your server and returning bogus data, then you can try to secure the endpoint with HTTPS and run any obvious sanity checks in the client.
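As a rough sketch of that advice (the field name here is made up): JSON.parse rejects anything that isn't legal JSON, and assigning to textContent inserts the value as plain text, so quotes and angle brackets in the data are never interpreted as HTML.

    var jsonText = '{"title":"Widgets \\"R\\" Us <deluxe>"}';
    var data = JSON.parse(jsonText);   // throws on anything that isn't legal JSON

    var heading = document.createElement('h2');
    heading.textContent = data.title;  // shown literally, never parsed as markup
    document.body.appendChild(heading);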
What would be the best format for storing a relatively large amount of data (essentially a big hashmap) for quick retrieval using javascript? It would need to support Unicode as well.
XML, JSON?
Gigantic JavaScript objects are generally a sign that you're trying to do something you really shouldn't be doing. XML is even worse; it has to be parsed to form meaningful data.
In this case an AJAX query to a RESTful interface in front of a proper database backend would probably serve you well.
JavaScript object access (particularly for any query beyond accessing a single item by its key) is very slow compared to even a basic database.
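For the single-key lookup case, a plain object is usually enough; a sketch with made-up data (Unicode keys are fine, since JavaScript strings are Unicode):

    var lookup = {
        'café': { id: 1, label: 'Café' },
        '東京': { id: 2, label: 'Tokyo' }
    };

    console.log(lookup['東京'].label); // "Tokyo" - single-item access by key is cheap
    // Anything richer than this (filtering, sorting, joins) is where the
    // database backend suggested above starts to make sense.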
There is a nice piece of research by the people at Flickr on this topic. They ended up using CSV over XML and JSON.
JSON definitely beats XML for performance reasons.
But a query against a database on the backend would probably be the only feasible solution once a certain scale is reached, since local resources in the browser cannot match a proper database for retrieval from a large data store.
Mostly I have just used XML files to store config info and to provide elementary data persistence. Now I am building a website where I need to store some XML-type data. However, I am already using JSON extensively throughout the whole thing. Is it bad to store JSON directly instead of XML, or should I store XML and introduce an XML parser?
Not bad at all. Although there are more XML editors, so if you're going to need to manually edit the files, XML may be better.
Differences between using XML and JSON are:
It's a lot easier to find an editor that supports a nice way to edit XML. I'm aware of no editors that do this for JSON, but there might be some, I hope :)
Extreme portability/interoperability - not everything can read JSON natively whereas pretty much any language/framework these days has XML libraries.
JSON takes up less space (see the short comparison after this list).
JSON may be faster to process, ESPECIALLY in a JavaScript app where it's native data.
JSON is more human readable for programmers (this is subjective but everyone I know agrees so).
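To make the size point concrete, here is the same made-up record in both formats:

    var asXml  = '<person><name>Ada</name><age>36</age><admin>true</admin></person>';
    var asJson = '{"name":"Ada","age":36,"admin":true}';

    console.log(asXml.length);  // 65 characters
    console.log(asJson.length); // 36 characters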
Now, please notice the common thread: any of the benefits of using pure XML listed above are 100% lost immediately as soon as you store JSON as XML payload.
Therefore, the guidelines are as follows:
If wide interoperability is an issue and you talk to something that can't read JSON (like a DB that can read XML natively), use XML.
Otherwise, I'd recommend using JSON.
NEVER EVER use JSON as XML payload unless you must use XML as a transport container due to existing protocol needs AND the cost of encoding and decoding JSON to/from XML is somehow prohibitively high as compared to network/storage lossage due to double encoding (I have a major trouble imagining a plausible scenario like this, but who knows...)
It's just data, like XML. There's nothing about it that would preclude saving it to disk.
Define "bad". They're both just plain-text formats. Knock yourself out.
If you're storing the data as a cache (meaning it was in one format and you had to process it programmatically to "make" it JSON), then I say no problem. As long as the consumer of your JSON reads native JSON, it's standard practice to save cache data to disk or memory.
However, if you're storing a configuration file in JSON which needs human interaction to "process", then I may reconsider. Using JSON for simple key:value pairs is cool, but for anything beyond that the format may be too compact (meaning nested { and [ brackets can be hard to decipher).
One potential issue with JSON, when there is deep nesting, is readability: you may actually see something like ]]]}], which makes debugging difficult.
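If a human does need to read or edit the file, one small mitigation is to ask the serializer to indent (the object below is made up):

    var config = { db: { host: 'localhost', port: 5432 }, debug: true };

    console.log(JSON.stringify(config));
    // {"db":{"host":"localhost","port":5432},"debug":true}

    console.log(JSON.stringify(config, null, 2));
    // indented output, one property per line, much easier to eyeball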
I am storing a JSON string in the database that represents a set of properties. In the code behind, I export it and use it for some custom logic. Essentially, I am using it only as a storage mechanism. I understand XML is better suited for this, but I read that JSON is faster and preferred.
Is it a good practice to use JSON if the intention is not to use the string on the client side?
JSON is a perfectly valid way of storing structured data and simpler and more concise than XML. I don't think it is a "bad practice" to use it for the same reason someone would use XML as long as you understand and are OK with its limitations.
Whether it's good practice I can't say, but it strikes me as odd. XML fields in your SQL database are at least queryable (SQL Server 2000 or later, MySQL and others), but more often than not they're a last resort for metadata.
JSON is usually the carrier between JavaScript and your back end, not the storage itself, unless you have a JSON-based document-oriented database such as CouchDB or SOLR, as JSON lends itself perfectly to storing documents.
Not to say I don't agree with using JSON as a simple (that is, not serializing references) data serializer over XML, but I won't go into a JSON vs XML rant just for the sake of it :).
If you're not using JSON for its portability between two languages, and you're positive you will never query the data from SQL, you would be better off with the default serialization from .NET.
You're not the only one using JSON for data storage. CouchDB is a document-oriented database which uses JSON for storing information. They initially stored that info as XML, then switched to JSON. Querying is done via good old HTTP.
By the way, CouchDB received some attention lately and is backed by IBM and the Apache Software Foundation. It is written in Erlang and promises a lot (IMHO).
Yeah I think JSON is great. It's an open standard and can be used by any programming language. Most programming languages have libraries to parse and encode JSON as well.
JSON is just a markup language like XML, and due to its more restrictive specification it is smaller in size and faster to generate and parse. So from that aspect it's perfectly acceptable to use in the database if you need ad-hoc structured markup (and you're absolutely sure that using tables isn't possible and/or isn't the best solution).
However, you don't say what database you're using. If it is SQL Server 2005 or later then the database itself has extensive capabilities when dealing with XML, including an XML data type, XQuery support, and the ability to optimise XQuery expressions as part of an execution plan. So if this is the case I'd be much more inclined to use XML and take advantage of the database's facilities.
Just remember that JSON has all of the same problems that XML has as a back-end data store; namely, it in no way replaces a relational database or even a fixed-size binary format. If you have millions of rows and need random access, you will run into all the same fundamental performance issues with JSON that you would with XML. I doubt you're intending to use it for this kind of data store scenario, but you never know... stranger things have been tried :)
JSON is shorter, so it will use less space in your database. I'd probably use it instead of XML or writing my own format.
On the other hand, searching for matches will suck - you're going to have to use "where json like '% somevalue %'" which will be painfully slow. This is the same limitation with any other text storage strategy (xml included).
You shouldn't use either JSON or XML for storing data in a relational database. JSON and XML are serialization formats which are useful for storing data in files or sending over the wire.
If you want to store a set of properties, just create a table for it, with each row being a property. That way you can query the data using ordinary SQL.