Javascript Fastest Local Database - javascript

What would be the best format for storing a relatively large amount of data (essentially a big hashmap) for quick retrieval using javascript? It would need to support Unicode as well.
XML, JSON?

Gigantic JavaScript objects are generally a sign that you're trying to do something you really shouldn't be doing. XML is even worse; it has to be parsed before it yields meaningful data.
In this case an AJAX query to a RESTful interface backed by a proper database would probably serve you well.
JavaScript object access (particularly for any query beyond accessing a single item by its key) is very slow compared to even a basic database.
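To make that distinction concrete, here is a small illustrative sketch (the data shape is invented): looking up one item by its key is effectively constant time, while anything resembling a query degenerates into a full scan.

// Lookup by key in a plain object (or Map) is effectively O(1).
const people = { u1: { name: "Ana" }, u2: { name: "Boris" } };
const one = people.u2; // fast: direct access by key

// Any other "query" means walking every entry: O(n).
const startsWithA = Object.values(people).filter(p => p.name.startsWith("A"));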

There is a nice piece of research from the people at Flickr on this topic. They ended up using CSV over XML and JSON.
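As a rough sketch of what a CSV-backed lookup could look like in the browser (the "id,name" layout and the data are made up, and a naive split(',') only works when values contain no commas):

// Build a hashmap from a small CSV payload.
const csv = "id,name\n1,Tokyo\n2,Osaka";
const [header, ...rows] = csv.split("\n");
const byId = {};
for (const row of rows) {
  const [id, name] = row.split(","); // naive: assumes no commas inside values
  byId[id] = name;
}
console.log(byId["2"]); // "Osaka"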

JSON definitely beats XML on performance.
But a query against a database on the backend is probably the only feasible solution once a certain scale is reached, since local resources in the browser cannot match a database's retrieval speed over a large data store.

Related

Large JSON object in browser client

I have a large JSON object of about 8 MB from a server stored in a browser client. If I shorten the names of keys that are duplicated across these lists, will that give a performance boost at all when manipulating the lists and updating the objects?
{ "VenueLocationID" : 12 }
{ "vid" : 12 }
If you're transferring 8MB of data from server to client, probably. However, there are better ways to improve performance. If your JSON is coming through an HTTP response, activating gzip compression could net even better performance without reducing readability.
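If the back end happens to be Node/Express, for instance, the standard compression middleware is one way to get that gzip behaviour; the route and payload below are only placeholders:

// Sketch: gzip JSON responses in an Express app.
const express = require("express");
const compression = require("compression");

const venues = [{ VenueLocationID: 12 }]; // stand-in for the real ~8 MB payload

const app = express();
app.use(compression()); // compresses responses when the client sends Accept-Encoding: gzip

app.get("/venues", (req, res) => {
  res.json(venues);
});

app.listen(3000);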
The best way to tune performance is to profile the application -- find out where the bottlenecks are, then address them. Profilers can sometimes find things that I never thought would be issues.
Another thing to look at is how the JSON is being built. I've helped some systems out by streaming the serialization: instead of serializing (stringifying) one huge array, I serialized each element and wrote it to the response stream, surrounded by the typical delimiters ('[', ']', and ',').
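A minimal sketch of that approach in Node, where `res` is the writable HTTP response and `items` stands in for the huge array:

// Write a JSON array element by element instead of stringifying it all at once.
function writeJsonArray(res, items) {
  res.write("[");
  items.forEach((item, i) => {
    if (i > 0) res.write(",");       // delimiter between elements
    res.write(JSON.stringify(item)); // serialize one element at a time
  });
  res.write("]");
  res.end();
}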

Which is faster in JavaScript, JSON or SOAP parsing?

Here are the two scenarios.
We are using a manually built XML SOAP request with XMLHttpRequest, sending it to a WCF SOAP service, getting back the response, and using XPath to parse the data and fill out a drop-down list.
We are sending a JSON request to a REST WCF service, getting a JSON response back, and assigning the values to a drop-down list.
Which scenario is faster? My sense tells me #2, but I could be wrong.
JSON will be faster, since JSON is essentially JavaScript. But that shouldn't be the main motivation; parsing the data will presumably be only a small part of your application anyway.
On the other hand, browsers are also well trained to parse XML.
The main difference is that XML, and therefore SOAP, is larger to send to the client, so the transfer may be a bigger slowdown than the parsing.
Anyway, if you want to know, you should just test and profile instead of guessing or asking.
Option two would generally be faster than option one, as JSON is a much simpler format than XML.
However, if you really need the parsing to be fast, you shouldn't use either; you should use a custom format that is really fast to parse using simple string operations, for example a comma-separated string that can be parsed with a split(',').
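For instance, a flat list of values can be "parsed" in one call; anything with commas inside the values would of course need a real parser:

const payload = "Alice,Bob,Carol"; // custom comma-separated format
const names = payload.split(",");  // ["Alice", "Bob", "Carol"]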
After profiling in my scenario, I found that JSON is actually much faster in terms of processing time within the browser.

Best Server-Side Data Storage Method for Simple Data

I'm coding a website that involves storing very simple data, just a very long list of names with no additional data, on the server. As this data is so simple, I don't really want to use MySQL (it would be a bit too clunky), so I'm asking: what's the best way to store very simple data on the server?
I would definitely favour speed over anything else, and easy access to the data via JavaScript and AJAX would be very good too, as the rest of the site is coded in JavaScript/jQuery. I don't really care if the data can be viewed freely (as it will be available anyway), as long as it can't be changed by unauthorised users.
There are a lot of things to think about with this.
Is the information the same for everyone, with a single set that applies to all users, or is there a separate set of data for each user?
How is the data going to be served to the client? My guess is that you would have a web service of some kind that returns JSON.
From a security standpoint, do you want someone to be able to just "grab" the data and run?
Personally I find that a database is often a better choice, but otherwise I would use an XML file. Keep in mind, though, that you have to be careful when loading/reading XML files to serve web requests, to prevent potential file-locking issues.
Use an XML file that is web-accessible. Then you can query the XML file from the browser if need be, and still parse/write it in PHP. You'll want to use the flock function in PHP to make sure that two instances of a page don't try to write to the file at the same time.
Write it to a file and save the data as a serialized object. This way when you read in the data it's instantly accessible as the variable type you need (array, obj, etc). This will be faster than XML parsing.
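In a Node back end, for example, the whole list could live in one JSON file and come back as a ready-to-use array; the file name here is just a placeholder:

// Sketch: persist a simple list of names as a JSON file.
const fs = require("fs");
const FILE = "names.json"; // placeholder path

function saveNames(names) {
  fs.writeFileSync(FILE, JSON.stringify(names)); // serialize the whole array
}

function loadNames() {
  if (!fs.existsSync(FILE)) return [];
  return JSON.parse(fs.readFileSync(FILE, "utf8")); // instantly usable as an array
}

saveNames(["Alice", "Bob"]);
console.log(loadNames()); // ["Alice", "Bob"]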

Is it bad to store JSON on disk?

Mostly I have just used XML files to store config info and to provide elementary data persistence. Now I am building a website where I need to store some XML-type data. However, I am already using JSON extensively throughout the whole thing. Is it bad to store JSON directly instead of XML, or should I store the XML and introduce an XML parser?
Not bad at all. Although there are more XML editors, so if you're going to need to manually edit the files, XML may be better.
Differences between using XML and JSON are:
It's a lot easier to find an editor that supports a nice way to edit XML. I'm aware of no editors that do this for JSON, but there might be some, I hope :)
Extreme portability/interoperability - not everything can read JSON natively whereas pretty much any language/framework these days has XML libraries.
JSON takes up less space
JSON may be faster to process, ESPECIALLY in a JavaScript app where it's native data.
JSON is more human readable for programmers (this is subjective but everyone I know agrees so).
Now, please notice the common thread: any of the benefits of using pure XML listed above are 100% lost immediately as soon as you store JSON as XML payload.
Therefore, the guidelines are as follows:
If wide interoperability is an issue and you talk to something that can't read JSON (like a DB that can read XML natively), use XML.
Otherwise, I'd recommend using JSON.
NEVER EVER use JSON as XML payload unless you must use XML as a transport container due to existing protocol needs AND the cost of encoding and decoding JSON to/from XML is somehow prohibitively high as compared to network/storage lossage due to double encoding (I have a major trouble imagining a plausible scenario like this, but who knows...)
UPDATED: Removed Unicode bullets as per info in comments
It's just data, like XML. There's nothing about it that would preclude saving it to disk.
Define "bad". They're both just plain-text formats. Knock yourself out.
If you're storing the data as a cache (meaning it was in one format and you had to process it programmatically to "make" it JSON), then I say no problem. As long as the consumer of your JSON reads native JSON, it's standard practice to save cache data to disk or memory.
However, if you're storing a configuration file in JSON which needs human interaction to "process", then I might reconsider. Using JSON for simple key:value pairs is fine, but beyond that the format may be too compact (nested { and [ brackets can be hard to decipher).
One potential issue with JSON, when there is deep nesting, is readability: you may actually see ]]]}] at the end of a structure, making debugging difficult.
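If compactness is the pain point, note that JSON.stringify takes an indentation argument, so the on-disk copy does not have to be a single line; a quick illustration:

const config = { server: { host: "localhost", ports: [80, 443] } };
JSON.stringify(config);          // '{"server":{"host":"localhost","ports":[80,443]}}'
JSON.stringify(config, null, 2); // indented output, much easier to read and diff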

Is JSON used only for JavaScript?

I am storing a JSON string in the database that represents a set of properties. In the code behind, I export it and use it for some custom logic. Essentially, I am using it only as a storage mechanism. I understand XML is better suited for this, but I read that JSON is faster and preferred.
Is it a good practice to use JSON if the intention is not to use the string on the client side?
JSON is a perfectly valid way of storing structured data and simpler and more concise than XML. I don't think it is a "bad practice" to use it for the same reason someone would use XML as long as you understand and are OK with its limitations.
Whether it's good practice I can't say, but it strikes me as odd. XML fields in your SQL database are at least queryable (SQL Server 2000 or later, MySQL, and others), but more often than not they are a last resort for metadata.
JSON is usually the carrier between JavaScript and your back end, not the storage itself, unless you have a JSON-based document-oriented database such as CouchDB or Solr, as JSON lends itself perfectly to storing documents.
Not to say I don't agree with using JSON as a simple (that is, not serializing references) data serializer over XML, but I won't go into a JSON vs XML rant just for the sake of it :).
If you're not using JSON for its portability between two languages, and you're positive you will never query the data from SQL, you will be better off with the default serialization from .NET.
You're not the only one using JSON for data storage. CouchDB is a document-oriented database which uses JSON for storing information. They initially stored that info as XML, then switched to JSON. Querying is done via good old HTTP.
By the way, CouchDB received some attention lately and is backed by IBM and the Apache Software Foundation. It is written in Erlang and promises a lot (IMHO).
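Because CouchDB serves documents as JSON over plain HTTP, reading one back from browser code is trivial; the database and document names below are hypothetical:

// Fetch a stored JSON document straight from CouchDB's HTTP API.
fetch("http://localhost:5984/mydb/some-doc-id")
  .then(res => res.json())
  .then(doc => console.log(doc));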
Yeah I think JSON is great. It's an open standard and can be used by any programming language. Most programming languages have libraries to parse and encode JSON as well.
JSON is just a markup language like XML, and due to its more restrictive specification it is smaller in size and faster to generate and parse. So from that aspect it's perfectly acceptable to use in the database if you need ad-hoc structured markup (and you're absolutely sure that using tables isn't possible and/or the best solution).
However, you don't say what database you're using. If it is SQL Server 2005 or later then the database itself has extensive capabilities when dealing with XML, including an XML data type, XQuery support, and the ability to optimise XQuery expressions as part of an execution plan. So if this is the case I'd be much more inclined to use XML and take advantage of the database's facilities.
Just remember that JSON has all of the same problems that XML has as a back-end data store; namely, it in no way replaces a relational database or even a fixed-size binary format. If you have millions of rows and need random access, you will run into all the same fundamental performance issues with JSON that you would with XML. I doubt you're intending to use it for this kind of data store scenario, but you never know... stranger things have been tried :)
JSON is shorter, so it will use less space in your database. I'd probably use it instead of XML or writing my own format.
On the other hand, searching for matches will suck - you're going to have to use "where json like '% somevalue %'", which will be painfully slow. This is the same limitation as with any other text storage strategy (XML included).
You shouldn't use either JSON or XML for storing data in a relational database. JSON and XML are serialization formats which are useful for storing data in files or sending over the wire.
If you want to store a set of properties, just create a table for it, with each row being a property. That way you can query the data using ordinary SQL.
