Using object methods on Node server and web client over JSON? - javascript

I have a Javascript library that I want to use on a web browser and also on a Node.js backend. In the library, I have multiple objects with methods defined like so:
function foo() {
    this.bar = 200;
    this.someMethod = function(baz) {
        return this.bar + baz;
    };
}
var x = new foo();
And I can use it in the client or the Node.js server by doing the following:
x.someMethod(5);
=> (returns 205)
Now, when I JSON.stringify my object, it shows up without the method.
var string = JSON.stringify(x);
=> {"bar":200}
Which means I can't unpack the JSON on my server and use the same methods.
var parsed = JSON.parse(string);
document.write(parsed.someMethod(5));
=> (doesn't do anything. the methods are gone!)
In a class-based system I'd just use a copy constructor, something that would reconstruct the object from JSON.
function foo_copy_constructor(parsed_json) {
    var f = new foo();
    f.bar = parsed_json.bar;
    return f;
}
var z = foo_copy_constructor(parsed);
z.someMethod(5);
=> (returns 205 like it should)
( jsfiddle: http://jsfiddle.net/7FdDe/ )
Basically, is there a better way than this?
Many of my objects contain instances of other objects I've written, with their own methods, and building a copy constructor for every object seems like it would get tedious since both the client and the server use the same library and object definitions. I know that JavaScript is based on prototypes, but I don't really understand them yet since I've just started with JavaScript and am used to Python and class-based languages.
Thanks for any and all help!

JSON.stringify skips values it can't represent as JSON, and functions are skipped by default. However, if a value has a toJSON method, JSON.stringify uses whatever that method returns. So you could simply add a toJSON method to your functions. (Remember, functions are objects too.)
function A() {
    this.method = function() { console.log(1); };
}
var c = new A();
JSON.stringify(c);
// => "{}"

A.prototype.otherMethod = function() { console.log(1); };
var c = new A();
JSON.stringify(c);
// => "{}"

Function.prototype.toJSON = function() { return this.toString(); };
JSON.stringify(c);
// => '{"method":"function () { console.log(1); }"}'
However, when parsing this back, you get the function as a string. So you have to convert the strings back to functions with something like this:
var d = JSON.parse(JSON.stringify(c));
Object.keys(d).forEach(function(k) {
    // If the value starts with "function"
    if (/^function/.test(d[k])) {
        // Get the body of the function
        var f = d[k].match(/{(.*)}/).pop();
        // Replace the string with a real function
        d[k] = new Function(f);
    }
});
d.method();
// => 1
However, instead of messing with JavaScript like this, I'd suggest using a well-tested library like now.js instead.

Related

Javascript new operator help needed - dynamic object using the new operator in javascript

So I was looking to get an idea working but I can't seem to figure it out. I would like to do something like the following.
// my old-fashioned code, but not very flexible
function runCode() {
    var x = new myObject();
    return x.run();
}
What I'd really like to be able to do:
// pass in a string parameter naming the object to create
// objNameToUse is the name of an object to new up, like "myObject"
function runCode(objNameToUse) {
    var x = new Object(objNameToUse);
    return x.run();
}
runCode("myObject");
How could I do something like this using javascript?
You can make a mapping from names of available constructors to the constructors themselves, like with an object:
var someConstructors = Object.create(null);
someConstructors.myObject = myObject;
…

function runCode(objNameToUse) {
    var constructor = someConstructors[objNameToUse];
    var x = new constructor();
    return x.run();
}
runCode("myObject");
Object.create(null) creates an object that doesn't inherit the properties of Object.prototype, so you don't get unexpected extra "keys" like hasOwnProperty. This is important because the usual reason to do this is that you want to create objects based on a string outside of your control (e.g. user input); in many other cases, you could just pass myObject directly to runCode as a function instead of a name.
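As a quick illustration of that difference (these variable names are just for the example):
// A plain {} inherits keys from Object.prototype; Object.create(null) does not.
var plain = {};
var bare = Object.create(null);

console.log('hasOwnProperty' in plain); // true  - inherited from Object.prototype
console.log('hasOwnProperty' in bare);  // false - no prototype chain at all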
So, make sure this isn't actually what you're looking for in terms of "flexibility":
function runCode(constructor) {
    var x = new constructor();
    x.run();
}
runCode(myObject);
If you're absolutely sure that you 100% control which class name is passed to your function at any time, you can use eval:
class myObject {
    run() {
        console.log('run');
    }
}
function runCode(objNameToUse) {
    var x = new (eval(objNameToUse))();
    return x.run();
}
runCode('myObject');

How to pass custom class instances through Web-Workers?

Since Web Workers JSON-serialize data between threads, something like this doesn't work:
worker.js
function Animal() {}
Animal.prototype.foobar = function() {}
self.onmessage = function(e) {
    self.postMessage({animal: new Animal()})
}
main.js
let worker = new Worker('worker.js')
worker.onmessage = function(e) {
    console.log(e.data)
}
worker.postMessage('go!')
The outcome would be a simple object with the loss of the foobar prototype method.
Is it possible to transfer the custom object back to the main thread without losing its prototype methods? Like, would this be possible with ArrayBuffer? I'm not familiar with that stuff, so I'm a bit lost.
Assuming you control both the main thread and the worker, you can define the Animal function on both sides.
Then you add a toJSON method to Animal (on both sides) that emits the info you need to recreate the object (and maybe an attribute holding the class name).
You define a reviver that applies the reverse process.
Then, whenever you post, you always JSON.stringify the object.
In onmessage you JSON.parse the message data with the reviver.
function Animal(name, age) {
    var private_name = name;
    this.public_age = age;
    this.log = function() {
        console.log('Animal', private_name, this.public_age);
    };
    this.toJSON = function() {
        return {
            __type__: 'Animal',                        // name of the class
            __args__: [private_name, this.public_age]  // same args as the constructor
        };
    };
}
Animal.prototype.age = function() {
    return this.public_age;
};

var a = new Animal('boby', 6);
worker.postMessage(JSON.stringify(a));

function reviver(key, value) {
    if (value && value.__type__) {
        var constructor = reviver.register[value.__type__];
        if (!constructor) throw Error('__type__ not recognized');
        // create an object with the right prototype, then run the constructor on it
        var newObject = Object.create(constructor.prototype);
        constructor.apply(newObject, value.__args__);
        return newObject;
    }
    return value;
}
reviver.register = {}; // you can register any classes
reviver.register['Animal'] = Animal;

worker.onmessage = function(m) {
    var a = JSON.parse(m.data, reviver);
};
There is a simple way without setting prototypes and without converting to a string with JSON.stringify. You need to build two functions:
toObject(instance): obj — instance is a class instance and is converted to a plain object.
toInstanceClass(obj): instance — obj is a plain object and the function returns an instance of your class.
You pass the plain obj to the worker; in the worker you build your class instance from it, do all your operations, and post the result back as a plain obj.
In your main thread you rebuild your class instance from the obj returned by the worker; that's all.
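A minimal sketch of that idea (the Animal class and the toObject/toInstanceClass names here are only illustrative, and it assumes the instance carries plain data properties):
// Convert between class instances and plain data objects so they can cross
// the worker boundary without losing prototype methods.
function Animal(name) { this.name = name; }
Animal.prototype.speak = function() { return this.name + ' says hi'; };

function toObject(instance) {
    // keep only the own data properties; methods live on the prototype anyway
    return Object.assign({}, instance);
}

function toInstanceClass(obj) {
    // re-attach the prototype on the receiving side
    return Object.assign(Object.create(Animal.prototype), obj);
}

// main thread (assumed): worker.postMessage(toObject(new Animal('boby')));
// worker side (assumed):  var animal = toInstanceClass(e.data); animal.speak();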

Difference between creating javascript objects

When creating an object to use JS in an OO manner, is there any difference within a JS engine (apart from being able to define a constructor) between:
var Uploader = Uploader || {};
and
var Uploader = function() {
}
and
function Uploader() {
}
Especially when later, you wish to do something along the lines of
Uploader.DOM = {
    Create: function(file) {
    }
};
Is it all down to personal preference? Or is there a real difference?
Objects:
var MyObj = {
    myArr: [1,2,3],
    find: function(/*some arguments*/) {
        //some logic that finds something in this.myArr
    }
};
In the MyObj.find function, the this keyword will point to MyObj (which somewhat resembles how this works in languages that have classes). You can use this to do mix-ins:
var MyObj2 = {
    myArr: [4,2,6]
};
MyObj2.find = MyObj.find;
In the MyObj2.find function, the this keyword will point to MyObj2.
Objects also support getters and setters (works on IE9+ and all good browsers):
var MyObj = {
    myArr: [1,2,3],
    find: function(/*some arguments*/) {
        //some logic that finds something in this.myArr
    },
    get maxValue() {
        return Math.max.apply(null, this.myArr); // maxValue is calculated on the fly
    },
    a_: null,
    get a () {
        return this.a_;
    },
    set a (val) {
        //fire a change event, do input validation
        this.a_ = val;
    }
};
Now the max value in the array can be accessed like this: MyObj.maxValue. I also added a property a_. It can't be named the same as its getter and setter, so I appended an underscore. Appending or prepending underscores is a naming convention for private variables which should not be accessed directly.
var qwe = MyObj.a // get a_
MyObj.a = 'something'; //set a_
Functions:
var u = new Uploader(); // will throw an exception
var Uploader = function() { }
Uploader is defined at runtime here. It does not exist yet when I try to instantiate it.
var u = new Uploader(); //will work
function Uploader() {}
Uploader is a function declaration here, which is hoisted, so it will work.
Functions can be used with the revealing pattern to conceal some members. Functions don't support getters and setters directly, but you can put objects inside functions.
function myFunc() {
    function privateFunc() {}
    function publicFunc() {}
    var obj = {
        //members, getters, setters
    };
    return {
        publicFunc: publicFunc,
        obj: obj
    };
}
You can call publicFunc on the object returned by myFunc() because it is returned, but you cannot use privateFunc outside because it is not returned.
Revealing-pattern functions are usually not meant to be instantiated. This is because when you instantiate one, everything inside is copied to the new instance, so it uses up more memory than if you added functions via the prototype.
myFunc.prototype.someFunc = function() {};
This way, all instances of myFunc share the same instance of someFunc.
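A small sketch of that difference (the Counter constructor here is made up for illustration):
function Counter() {
    this.incOwn = function () {};             // a new function object per instance
}
Counter.prototype.incShared = function () {}; // one function shared by all instances

var c1 = new Counter(), c2 = new Counter();
console.log(c1.incOwn === c2.incOwn);         // false - separate copies
console.log(c1.incShared === c2.incShared);   // true  - the same shared function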
Conclusion: with functions you can simulate a private access modifier, while in objects the this keyword acts somewhat similar to what you'd expect in a language that has classes. But you can always use call, apply and bind to change the context (i.e. what the 'this' keyword will be) of a function.
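For example, a minimal sketch of changing the context with call, apply and bind (the greet function is made up for illustration):
function greet(greeting) {
    return greeting + ', ' + this.name;
}
var alice = { name: 'Alice' };

console.log(greet.call(alice, 'Hello'));  // "Hello, Alice"
console.log(greet.apply(alice, ['Hi']));  // "Hi, Alice"
var greetAlice = greet.bind(alice);
console.log(greetAlice('Hey'));           // "Hey, Alice"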

Best approach for creating an object from deserializing

I'm looking for a good approach for creating an object from serialized data. Let's assume there is an object defined like this:
function A()
{}
A.prototype.a = "";
and serialized data: "a".
So which approach is better, and why?
1. Create a static method deserialize:
A.deserialize = function( data )
{
    var a = new A();
    a.a = data;
    return a;
}
and it will be called like this:
var a = A.deserialize("a");
2. Create a method on the prototype:
A.prototype.deserialize = function ( data )
{
    this.a = data;
}
and it will be called like this:
var a = new A();
a.deserialize( "a" );
3. Process data in the constructor:
function A(data)
{
    this.a = data;
}
Consider that the data can be of different types, for example a string, JSON, or an ArrayBuffer.
I'm looking for a more generic solution. Does it matter how I create the object?
You can use a generic solution for (de)serializing objects as JSON and make a utility function out of it. See more at How to serialize & deserialize Javascript objects?
If you prefer deserializing each object type by itself:
Static method approach
Pros:
The static method handles the whole object creation and value setting cleanly, without creating any unnecessary temp objects and such. By far the cleanest solution.
This method doesn't require the object to be aware of the serialization process, and it can easily be added to an existing solution.
Prototype method approach
Cons:
This variant pollutes the prototype chain.
You have to create an object just for the sake of creating an object, unless you want to fill it up from inside. That can be problematic if, for instance, there is logic in the constructor that needs to be executed.
Process data in constructor
Cons:
The constructor needs to be overloaded. It can be difficult to recognize whether the data passed to the constructor is normal or serialized data, e.g. normally it takes some string and the serialized data is a string as well. (A sketch of the static-method variant handling different input types follows below.)
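For illustration only, here is a hedged sketch of how the static-method variant could dispatch on the input types mentioned in the question (the type checks and the TextDecoder use are assumptions, not part of the question):
// Hypothetical: one static deserialize entry point that accepts a raw string,
// an already-parsed object, or an ArrayBuffer and normalizes it before building A.
A.deserialize = function (data) {
    var a = new A();
    if (data instanceof ArrayBuffer) {
        a.a = new TextDecoder().decode(data); // assuming UTF-8 encoded text
    } else if (typeof data === 'string') {
        a.a = data;
    } else {
        a.a = data.a;                         // plain object parsed from JSON
    }
    return a;
};

var fromString = A.deserialize("a");
var fromObject = A.deserialize({ a: "a" });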
I wrote this jQuery function; it makes it very easy to serialize any form data.
$.fn.serializeObject = function () {
    var result = {};
    this.each(function () {
        var this_id = (this.id.substring(this.id.length - 2) == '_s') ? this.id.substring(0, this.id.length - 2) : this.id.replace('_val', '');
        if (this.type == 'radio') {
            this_id = this.name;
            result[this_id.toLowerCase()] = $('input[name=' + this_id + ']:checked').val();
        }
        else if (this.type == 'checkbox') {
            result[this_id.toLowerCase()] = $('#' + this_id).prop('checked');
        }
        else {
            if (this_id.indexOf('___') > -1) {
                this_id = this_id.substring(0, this_id.indexOf('___'));
            }
            result[this_id.toLowerCase()] = this.value;
        }
    });
    return result;
};
You can easily call it by doing var form_vars = $('#div input, #div select, #div textarea').serializeObject().
You can add additional properties to the object by doing form_vars.property = 'value'; you can even add JS arrays and JSON objects to it. Then you can use $.ajax to submit.

Is encapsulation required for JavaScript objects?

I'm developing a Node.js application and looking for ways to create the data model.
The data being sent to/from the client is JSON. Since the database is MongoDB, the data to/from the db is JSON as well.
I'm new to JS, and I found many JS libraries dedicated to creating encapsulated objects. Is that still required?
What are the possible consequences of just defining models as simple JS objects, and using prototype-based inheritance where necessary?
Any help is appreciated. Thanks.
What are the possible consequences of just defining models as simple JS objects, and using prototype-based inheritance where necessary?
IMO, you will lose maintainability over time as your application size increases, or as your team grows and many developers start working on the same code.
In other words: without proper encapsulation it is easy to modify objects one doesn't own, easy to meddle with the parts that you don't want to be touched.
If you are writing a library/framework of some sort where just the APIs are exposed to the user and you don't have proper encapsulation, one could probably bring everything down with just one modification.
For example:
var myObj = {
    mySecretPrivateCrucialFunction: function () {
        // all the awesome crucial logic
    },
    // This is what you want other parts of the code ...
    // ... to be interacting with in this object
    myPublicMethod: function () {
        // some logic
        this.mySecretPrivateCrucialFunction();
        // some thing else
    }
};
I can do this.
myObj.mySecretPrivateCrucialFunction = function () {
    alert('yay..i got you..');
};
But if you do it this way, you don't give that chance:
var myObj = (function () {
    var mySecretPrivateCrucialFunction = function () {
        // all the awesome crucial logic
    };
    // This is what you want other parts of the code ...
    // ... to be interacting with in this object
    return {
        myPublicMethod: function () {
            // some logic
            mySecretPrivateCrucialFunction();
            // some thing else
        }
    };
})();
In case you want to make all your properties hidden/private and still want to get a JSON representation of the object, you can do something like this:
var myObj = (function () {
    // your private properties
    var prop1 = 1;
    var prop2 = 2;
    // your getters
    var getProp1 = function () {
        return prop1;
    };
    var getProp2 = function () {
        return prop2;
    };
    // your setters
    var setProp1 = function (newValue) {
        prop1 = newValue;
    };
    var setProp2 = function (newValue) {
        prop2 = newValue;
    };
    // your JSON representation of the object
    var toString = function () {
        return JSON.stringify({
            prop1: prop1,
            prop2: prop2
        });
    };
    // Object that gets exposed
    return {
        "getProp1": getProp1,
        "getProp2": getProp2,
        "setProp1": setProp1,
        "setProp2": setProp2,
        "toString": toString
    };
})();
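A short usage example of what the snippet above exposes:
myObj.setProp1(42);
console.log(myObj.getProp1());  // 42
console.log(myObj.toString());  // '{"prop1":42,"prop2":2}'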
There are two ways to approach this question as I see it:
I'm assuming that your data is being transferred as a JSON string.
"[{\"key\":\"val\"}]" is showing up in your responses, and you are then putting them through JSON.parse to turn them into viable arrays and objects.
So the first way would be to make "class-instances" (no new or inheritance necessary, just a constructor function which encapsulates data and exposes an interface, based on the data-type).
function makeTweet (data) {
    var tweet = {
            from_user : data.from_user || "anonymous"
            /* ... */
        },
        toString = function () {},
        public_interface = {
            toString : toString
            /* getters, etc */
        };
    return public_interface;
}
I know you already know this stuff, but consider a situation where you've got two or three different data types inside of the same process (like at the end of the line, when you're ready to print to the client), and you have a process which is reading and writing to public fields on every object. If different objects have different properties, things end poorly, or end in a sea of flaky if statements.
The other way to look at it might be an entity/service system
function handleTweets (tweetArr) {
    var templateFormat = system.output_format,
        string = "";
    if (templateFormat === "HTML") {
        string = HTMLtemplateTweets(tweetArr);
    } else { /* ... */ }
    return string;
}
function HTMLtemplateTweets (tweetArr) {}
function JSONtemplateTweets (tweetArr) {}
function XMLtemplateTweets (tweetArr) {}
...
With the point being that you would turn the JSON string into an array of data-only objects, and feed them down a line of type-specific library/system functions.
This would be more like a very simplified entity/system approach, rather than an OO (as classically accepted) approach.
Now, your data safety comes from making sure that your objects only ever go down one intended path, and thus transforms will be predictable for every object of that type.
If you want "inheritance" for performance/memory purposes in JS, then this might also be a direction to look.
