Regularly Updated JSON Overwrite vs Parse - javascript

I have a frontend application in JavaScript with jQuery.
I receive a JSON object every few seconds.
I have to incorporate this JSON object into my local data.
I can either take the object and completely replace a local object with it, or I can check each entry and only replace the changed entries.
In case it makes a difference:
I don't know the structure of the object in advance.
The object's maximum depth is usually 2 (e.g. { "a": { "b": "c" } }).
Which one is faster in terms of processing time?
Is there even a straight answer, or does it depend on the object and/or the browser?
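For reference, a minimal sketch of the two options being compared; localData, incoming, and mergeChanged are illustrative names, not from the question:

// Option 1: replace the whole local object with the incoming one.
localData = incoming;

// Option 2: walk the incoming object and only overwrite entries that changed.
function mergeChanged(target, source) {
  for (var key in source) {
    if (typeof source[key] === "object" && source[key] !== null) {
      target[key] = target[key] || {};
      mergeChanged(target[key], source[key]); // depth is usually <= 2 here
    } else if (target[key] !== source[key]) {
      target[key] = source[key];
    }
  }
}
mergeChanged(localData, incoming);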

Related

Can I use SQL to store my Javascript objects?

I'm new to programming, and I have been working on a small project with vanilla JavaScript. I was using a lot of document.getElementById() calls, and I stored all of these in a JavaScript object in a separate file, but I was wondering if I could just store that object in a SQL file to make my project more organized.
I'm not sure if that's possible. I know that SQL stores data, so would I be able to store my JS object in a SQL file and import that object into my separate JavaScript files?
I'm trying to make sure I can do what I want before I decide to start learning SQL, but if it does what I need, I was going to start incorporating it for organization, so I can learn it as I create projects.
You can use the JSON.stringify function to convert your JavaScript objects into strings. Note, however, that the only values within the object that survive the conversion are objects, arrays, strings, numbers, and the literals null, true, and false. Any references to functions or class instances will be lost. You can convert the string back into a JavaScript object using JSON.parse.
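A small sketch of that round trip (the object is made up for illustration); note that the function property is silently dropped:

var obj = { name: "widget", count: 3, greet: function () { return "hi"; } };
var str = JSON.stringify(obj);  // '{"name":"widget","count":3}' (greet is gone)
var back = JSON.parse(str);     // a plain object with only name and count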
One thing to consider before you do this is whether you need to perform database queries on the data you are storing within the JavaScript object. If you need to search on that data, you should store the information directly in tables in the database. If you don't, converting the data to a string and saving it is fine. Since it sounds as though you are using the data for your own purposes, extracting all of it from the database shouldn't be an intensive task, and you can write your own scripts to parse it.
Definitely, you can store it as a JSON blob:
https://learn.microsoft.com/en-us/sql/relational-databases/json/store-json-documents-in-sql-tables?view=sql-server-ver15

Optimize slow search algorithm - javascript, JSON and localstorage

I'm building a guestlist app for my company using PHP, JavaScript/jQuery/Ajax, JSON, and localStorage. It will be used on smartphones, primarily the iPhone 4. I'm using localStorage as my cache since the search part of the application has to work in offline mode.
I'm having performance issues while searching through a guestlist.
The app's flow looks like this (for this example I'm working with a guestlist that contains 600 guests):
1. Retrieve all guests from the server with PHP, encode them as JSON, and send them back to JS via AJAX. This works fine.
2. Parse the PHP responseText (called phpData) using JSON.parse:
var parsedMysqlData = JSON.parse(phpData);
which gives us an array containing 600 objects looking like this:
Object: {
    Id: int,
    EventId: int,
    guestInfo: string,
    guestlist: string,
    guestName: string,
    reference: string,
    total: int,
    used: int
}
3. Save the array to the user's localStorage using JSON.stringify:
localStorage.setItem(0, JSON.stringify(parsedMysqlData));
4. When the user starts searching, we get his search string, then retrieve our guestlist from localStorage using JSON.parse like this:
var currentGuestlist = JSON.parse(localStorage.getItem(0));
And then iterate through our objects with this for-loop, trying to match his search string against the guests in the array currentGuestlist:
for (var i = 0; i < currentGuestlist.length; i++) {
    // match currentGuestlist[i].guestName against what the user typed in the search form
}
For some reason this takes a really long time on an iPhone 4. Searching through 600 objects freezes the phone for about 4 seconds before returning the matches.
Before storing arrays of JSON objects in localStorage and parsing them with JSON, I simply stored unordered strings in localStorage, and it worked a whole lot faster. The JSON objects add structure to the data stored in localStorage, which is crucial. So I guess the speed issue has to have something to do with the fact that I'm using JSON objects? How can I structure my data in localStorage in an organized way while still maintaining as good speed performance as before?
Lastly, any kind of tips or advice on which techniques you would use to make this app as lightweight and fast as possible is greatly appreciated.
Are you fetching the list from localStorage for every search? Don't do that; instead, write it to localStorage only when it changes, and otherwise keep it in memory as a data structure.
Simply having objects instead of plain strings cannot be the reason for the slowness, as everything in JavaScript is an object already, so it should only be slower by a constant factor.
Furthermore, if this is about autocomplete-like behaviour, I suggest you throttle the search. Also consider that if the user types "Ma" in the box, the list gets filtered, and the user then adds "tt" for "Matt", only the previously filtered matches need to be considered, as sketched below.
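A sketch of that suggestion, using the guestName field from the question (the other names are illustrative): parse the list once, keep it in memory, and when the new query merely extends the previous one, filter the previous matches instead of the full list:

var fullList = JSON.parse(localStorage.getItem(0)); // parse once, not on every keystroke
var lastQuery = "";
var lastMatches = fullList;

function search(query) {
  // If the user only extended the previous query, re-check previous matches only.
  var source = query.indexOf(lastQuery) === 0 ? lastMatches : fullList;
  lastMatches = source.filter(function (guest) {
    return guest.guestName.toLowerCase().indexOf(query.toLowerCase()) !== -1;
  });
  lastQuery = query;
  return lastMatches;
}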
Perhaps your setup would be more suited to the WebSQL database instead of storing a JSON object in localStorage. WebSQL is now deprecated, but it's pretty well supported in WebKit browsers, and I'm using it with good results on a couple of projects.
You can read more about it here: http://html5doctor.com/introducing-web-sql-databases/

JavaScript: Web Worker and Typed Arrays

I have a web worker (started with new Worker()) that does some processing and is supposed to return a Float32Array.
It seems, however, that after the worker postMessage()s the data, it goes through serialization and deserialization to JSON, and what I end up with when receiving the message is a plain JavaScript array (with all of the properties the original typed array had).
A trivial workaround would be to just recreate the typed array from the JavaScript array, but that's wasteful and takes up time and memory.
Is there a better way to do this? Some way to tell the JSON deserialization to instantiate a Float32Array instead of a JavaScript array? Or a way to otherwise transfer the binary data?
All browsers that support workers (except IE10) support what are called transferable objects. This means that if you have an ArrayBuffer (i.e. the .buffer property of your typed array), you can include, as the second parameter of postMessage, a list of ArrayBuffers whose ownership you want to transfer. This is much, much faster than copying.
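A minimal sketch of that pattern (assuming the worker has produced a Float32Array called floats):

// In the worker: transfer the underlying ArrayBuffer instead of copying it.
self.postMessage(floats.buffer, [floats.buffer]);

// On the main thread: rebuild a typed-array view over the received buffer.
worker.onmessage = function (e) {
  var result = new Float32Array(e.data); // no copy of the underlying data
};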
Update: this seems to be a Chrome bug at the moment:
http://code.google.com/p/chromium/issues/detail?id=73313
Typed arrays are preserved in Firefox 4.

Using jQuery to set/retrieve multiple key/value pairs in one cookie using an array

My goal here is to cut down the number of cookies I'm using for things like persistent state by setting and retrieving multiple key/value pairs within a single cookie using an array.
I'm mostly interested in knowing if someone else has already done this and has written a plugin for it, as I would think this has been done before.
You could serialize the object into a JSON string. That would make it super simple to reload the object. See question 191881 for information on how to serialize an object to JSON in JavaScript.
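For example, a minimal sketch without any plugin (the cookie name appState is made up):

// Pack several key/value pairs into one cookie as a JSON string.
var state = { sidebarOpen: true, theme: "dark" };
document.cookie = "appState=" + encodeURIComponent(JSON.stringify(state)) + "; path=/";

// Read it back: find the cookie and parse the JSON payload.
var match = document.cookie.match(/(?:^|;\s*)appState=([^;]*)/);
var restored = match ? JSON.parse(decodeURIComponent(match[1])) : {};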

Passing a large dataset to the client - Javascript arrays or JSON?

I'm passing a table of up to 1000 rows, consisting of name, ID, latitude and longitude values, to the client.
The list will then be processed by Javascript and converted to markers on a Google map.
I initially planned to do this with JSON, as I want the code to be readable and easy to deal with, and because we may be adding more structure to it over time.
However, my colleague suggested passing it down as a Javascript array, as it would reduce the size greatly.
This made me think that maybe JSON is a bit redundant. After all, for each row defined, the name of each field is output repeatedly, whereas for an array, the position of the cells indicates the field.
However, would there really be a performance improvement by using an array?
The site uses GZIP compression. Is this compression effective enough to take care of any redundancy found in a JSON string?
[edit]
I realize JSON is just a notation.
But my real question is: which notation is best, performance-wise?
If I use fully named attributes, then I can have code like this:
var x = resultset.rows[0].name;
Whereas if I don't, it will look less readable, like so:
var x = resultset.rows[0][2];
My question is: would the sacrifice in code readability be worth it for the performance gains? Or not?
Further notes:
According to Wikipedia, the Deflate compression algorithm (used by gzip) performs 'Duplicate string elimination'. http://en.wikipedia.org/wiki/DEFLATE#Duplicate_string_elimination
If this is correct, I have no reason to be concerned about any redundancy in JSON, as it's already been taken care of.
JSON is just a notation (JavaScript Object Notation), and it includes JS arrays, even if there is the word "object" in its name.
See its grammar at http://json.org/, which defines an array like this (quoting):
An array is an ordered collection of values. An array begins with [ (left bracket) and ends with ] (right bracket). Values are separated by , (comma).
This means this (taken from JSON Data Set Sample) would be valid JSON:
[ 100, 500, 300, 200, 400 ]
Even though it doesn't include or declare any object at all.
In your case, I suppose you could use an array, storing data by position rather than by name; a sketch follows.
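A hedged sketch of that approach (the field names are guesses based on the question): send the column names once, send the rows positionally, and rebuild readable objects on the client:

var payload = {
  cols: ["id", "name", "lat", "lng"],
  rows: [[1, "Depot", 51.50, -0.12], [2, "Office", 48.85, 2.35]]
};
// Rebuild named objects from the positional rows.
var markers = payload.rows.map(function (row) {
  var obj = {};
  payload.cols.forEach(function (col, i) { obj[col] = row[i]; });
  return obj;
});
var x = markers[0].name; // readable access, compact transfer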
If you are worried about size, you could "compress" that data on the server side yourself and decompress it on the client side, but I wouldn't do that: it would mean you'd need more processing time/power on the client side...
I'd rather go with gzipping the page that contains the data: you'll have nothing to do, it's fully automatic, and it works just fine; the difference in size will probably not be noticeable.
I suggest using a simple CSV format. There is a nice article on the Flickr Development Blog where they talked about their experience with such a problem. But the best thing would be to try it on your own.
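For instance, a tiny sketch of parsing such a CSV payload client-side (the field layout is made up, and it assumes names never contain commas):

var csv = "1,Depot,51.50,-0.12\n2,Office,48.85,2.35";
var markers = csv.split("\n").map(function (line) {
  var f = line.split(",");
  return { id: +f[0], name: f[1], lat: +f[2], lng: +f[3] };
});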
