I made a Character class shared by the client and server in the same js file.
The server instantiates these characters and stores them in a characterList object. I send it to the client with socket.emit('updated_characters', characterList), but the client gets back an object of plain Objects instead of an object of Characters, so I can't use my Character methods on them.
How can I get around that?
You can't send custom object types directly using socket.io. socket.io uses JSON as the transport format for objects and JSON does not have the ability to record or rebuild custom object types.
So, what you need to do is create a serialized format for your objects (which will also likely be JSON). When you read the data on the other end, you construct a custom object in your own code and pass it this data in the constructor, from which it initializes itself.
The canonical way to do this is to create a serialize method for your custom object that creates a new plain object containing all the relevant info that needs to be sent over the wire and no references to other objects (since those can't be serialized). Keep in mind that objects often have housekeeping information that does not necessarily need to be sent to the other end. Then, create an initialize method for your custom object that can take this serialized data and properly initialize the object you use on the other end.
Here's a simple example:
function MyObject(data) {
    if (data) {
        this.init(data);
    } else {
        this.count = 0;
        this.greeting = "hello";
    }
}

MyObject.prototype = {
    init: function(data) {
        this.greeting = data.greeting;
        this.count = data.count;
    },
    getData: function() {
        return {greeting: this.greeting, count: this.count};
    },
    talk: function() {
        say(this.greeting);
    }
};
Then, on the sending end of things, if you have an instance of your object in item, you would send the data with:
socket.emit('updated_characters', item.getData());
And, on the receiving side of things, you would have this:
socket.on('updated_characters', function(data) {
    var item = new MyObject(data);
    // do something with the item here
});
socket.emit( 'updated_characters', JSON.stringify(characterList) );
We have to call JSON.stringify so the data is sent as a JSON string that the front end can identify as JSON and display.
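On the receiving side you would then parse the string and rebuild your instances. A minimal sketch, assuming your Character constructor accepts the plain data object (like the MyObject example above) and that characterList is keyed by character id:
socket.on('updated_characters', function(json) {
    var data = JSON.parse(json); // plain objects, no Character methods
    var characterList = {};
    Object.keys(data).forEach(function(id) {
        // rebuild real Character instances so their methods are usable again
        characterList[id] = new Character(data[id]);
    });
});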
Related
I want to store an object as a string, and then convert it back to an object and call a method on this object.
user.delete() // this works
self.user = JSON.stringify(user)
const storeUser = JSON.parse(self.user)
storeUser.delete() // Error: delete is not a function
If JSON.parse and JSON.stringify allowed methods to be copied, your code would be insecure and people could run arbitrary code on your server/computer, since a JSON string normally comes from an external source.
You should give your User class a way to parse a JSON string and create a new instance based on it:
class User {
    constructor(someRandomProperty) {
        // ... code
    }

    static fromJson(json) {
        let parsedJson;
        try {
            parsedJson = JSON.parse(json);
        } catch (error) {
            throw new Error("Invalid Json User");
        }
        return new User(parsedJson.someRandomProperty);
    }

    delete() {
        // .... code
    }
}
And then you would recreate the user like this:
User.fromJson(yourJsonStringHere)
As has been said in comments, functions/methods are not part of the JSON format so they are not present when you serialize to JSON. Same with class names and such. So, when you call JSON.stringify(), it does not include any information about what type of class it is or any of the methods associated with that object. When you then call JSON.parse(), all you will get back is instance data in a plain object. That's just one of the limitations of JSON.
Normally, you will not serialize code such as methods. Instead, you serialize what type of object it is along with the relevant instance data. When you want to reinstantiate the object, you look at the data to see what type it is, call the constructor for that type, and pass it the data you serialized from the prior object; the constructor is written to take exactly the data you are serializing so it can properly initialize the new object.
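A minimal sketch of that idea, assuming hypothetical ClassA and ClassB whose constructors accept the plain instance data:
// hypothetical registry mapping serialized type names to constructors
var constructors = { ClassA: ClassA, ClassB: ClassB };

function serialize(obj) {
    // record the type name alongside the instance data (methods are dropped by JSON)
    return JSON.stringify({ type: obj.constructor.name, data: obj });
}

function deserialize(str) {
    var wrapper = JSON.parse(str);
    // rebuild an instance of the recorded type from its plain data
    return new constructors[wrapper.type](wrapper.data);
}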
Hi, I have some issues with the parse method.
As you can see in the Backbone.js documentation, the parse method on a collection has this syntax:
collection.parse(response, options)
1) Why should we use/override the parse method, and what is its main use?
2) I read some articles and gathered that the parse method gives us the data structure for the client side.
3) I have trouble understanding the arguments of the parse method.
- What is options for?
Can you give me an example of using the parse method with both parameters?
Thanks!
The docs have a nice summary:
parse is called by Backbone whenever a collection's models are returned by the server, in fetch. The function is passed the raw response object, and should return the array of model attributes to be added to the collection. The default implementation is a no-op, simply passing through the JSON response.
http://backbonejs.org/#Collection-parse
1) You should return an array of model attributes. If your JSON response is already just such an array, you don't need to do anything. Typically the parse override is used simply to point at the right part inside the JSON object. For example, if your response was like this:
{
    httpCode: 200,
    responseMessage: 'success',
    data: [ {model1}, {model2} ... ]
}
Then you would need to override parse to point to the data key:
parse: function(response) {
    return response.data;
}
2) They meant that the response arg is the object which was returned by the server.
3) The second options arg holds the options that were passed to the .fetch call. You don't need to worry about it unless you want to do some specific logic based on the URL, the HTTP method or anything else that can be passed to fetch (including jQuery.ajax options and some Backbone-specific ones like reset).
4)
parse: function(response, options) {
    // For some reason POST requests return a different data structure.
    if (options.method === 'POST') {
        return response.data;
    }
    return response;
}
I'm having a problem where the cached object doesn't contain the correct data, so I figured that if I can push the most up-to-date version to the browser cache it will solve my problem.
How do you update your localStorage with a new object? So if I had a controller that had an assessment updated, how can I push that assessment object up to localStorage?
To do that with native JavaScript, you would do something like this:
localStorage.setItem('itemKey', JSON.stringify(yourObject));
var item = JSON.parse(localStorage.getItem('itemKey'));
Within the context of angular, you should make a localStorage service as a wrapper around localStorage and inject it into your service or controller. Or you could inject $window and use it directly off of that like: $window.localStorage
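A minimal sketch of such a wrapper, assuming a hypothetical service named 'storage' registered on your existing app module (myApp) and a hypothetical controller using it:
myApp.factory('storage', function($window) {
    return {
        set: function(key, value) {
            // stringify so objects survive the string-only localStorage
            $window.localStorage.setItem(key, JSON.stringify(value));
        },
        get: function(key) {
            var raw = $window.localStorage.getItem(key);
            return raw ? JSON.parse(raw) : null;
        }
    };
});

// usage in a controller
myApp.controller('AssessmentCtrl', function($scope, storage) {
    storage.set('assessment', $scope.assessment);
    $scope.assessment = storage.get('assessment');
});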
A response specifically for the asker of this duplicate question:
LocalStorage can only store strings, which is why you're stringifying your object before storing it. To manipulate the stored string as an object, you can pass it to JSON.parse (assuming it's properly JSON-formatted). Then to store the modified version, you need to convert it back into a string.
// Creates an object, converts it to a JSON string and stores the string as "ship"
const ship = { name: "black pearl", captain: "Jack Sparrow" };
localStorage.setItem("ship", JSON.stringify(ship));

// Retrieves the string and converts it back to a JavaScript object
const retrievedString = localStorage.getItem("ship");
const parsedObject = JSON.parse(retrievedString);

// Modifies the object, converts it to a string and replaces the existing "ship" in localStorage
parsedObject.name = "newName";
const modifiedAndStringifiedForStorage = JSON.stringify(parsedObject);
localStorage.setItem("ship", modifiedAndStringifiedForStorage);
If the object is in JSON format (not sure if Angular uses a different format) you could probably use the setItem() and getItem() methods to update and retrieve local storage items!
For example taken from the following post:
http://thejackalofjavascript.com/storing-objects-html5-local-storage/
var me = {name:'myname',age:99,gender:'myGender'};
localStorage.setItem("user",me);
//fetch object
console.log(localStorage.getItem("user")); // will return "[object Object]"
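To get the actual object back rather than the string "[object Object]", stringify before storing and parse after retrieving, as the earlier answers show:
localStorage.setItem("user", JSON.stringify(me));
var storedMe = JSON.parse(localStorage.getItem("user"));
console.log(storedMe.name); // "myname"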
You can use the full-featured Angular module angular-local-storage:
An AngularJS module that gives you access to the browser's local storage, with cookie fallback
set
myApp.controller('MainCtrl', function($scope, localStorageService) {
    //...
    function submit(key, val) {
        return localStorageService.set(key, val);
    }
    //...
});
get
myApp.controller('MainCtrl', function($scope, localStorageService) {
    //...
    function getItem(key) {
        return localStorageService.get(key);
    }
    //...
});
setItem won't work here; it will create another item in localStorage with the same name.
Instead, directly use:
localStorage.item = (whatever change you want in the item)
I have a simple model called BaseModel that extends Backbone.Model. According to the Backbone.js documentation, I've overridden the parse method to work with a preexisting API.
The parse method is called in two different situations: when I fetch from a collection to grab all the data, and when I fetch from a model to grab specific data.
Within BaseModel, to differentiate the behavior, I'm doing the following.
parse: function(response, options) {
    var result;
    if (options.collection) {
        // I'm coming from the fetch that belongs to the collection
        // manipulate result here...
    } else {
        // I'm coming from the fetch that belongs to the model
        // manipulate result here...
    }
    return result;
}
Is this approach valid, or is there a better way to achieve this?
Edit 1
I thought about Andrew's answer, but the situation I need to manage is unusual. When the parse method is called the first time (from the collection), data is parsed and properties for the model are created. Then, when the parse method is called from the model itself, additional data is parsed and the resulting properties are merged with the first ones.
Edit 2
In my situation, for example, the response coming from the collection contains an array of objects where each object has a property a. A conversion can be applied, e.g. to a date object. The response coming from the model contains b; here, too, a conversion can be applied. In the end both properties are merged into the same model, but they come from different fetch calls.
Notice that the response in the collection is already an array, so I do not need to differentiate or split anything here. I just know that if I come from the collection I will find a, otherwise b.
Read the fetch on the collection as "give me all the models", and the other call as "given a model returned from the collection, enrich it with details".
Backbone.Collection also has a parse method. In my opinion the correct way is to implement it for both your BaseModel and your Collection.
The collection's parse method only needs to convert the data into an array of unparsed models. Backbone then delegates to the BaseModel parse method automatically to parse each one individually.
e.g.,
var BaseModel = Backbone.Model.extend({
    parse: function(response, options) {
        var result;
        // I'm coming from the fetch that belongs to the model
        // manipulate result here...
        return result;
    }
});

var BaseCollection = Backbone.Collection.extend({
    model: BaseModel,
    parse: function(response, options) {
        // I'm coming from the fetch that belongs to the collection
        // Turn it into an array.
        return response.split('mydelim');
    }
});
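As a usage sketch (assuming a hypothetical /items endpoint that returns a 'mydelim'-separated string), fetching the collection runs BaseCollection's parse once on the raw response and then BaseModel's parse on each resulting item:
var items = new BaseCollection();
items.url = '/items'; // hypothetical endpoint
items.fetch({
    success: function(collection) {
        // BaseCollection.parse split the response into an array,
        // then BaseModel.parse ran on each element individually
        console.log(collection.length);
    }
});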
From your edit 2, it looks like your approach is the right idea. I would, however, say that if I were to do it, I would test the returned object for its properties rather than the context of the call, so I don't need to care about the data source:
parse: function(response, options) {
    var result = {};
    if (response.a) {
        result.c = response.a;
    } else if (response.b) {
        result.c = response.b;
    }
    // ...
    return result;
}
I have a basic backbone model, its urlRoot attribute is set and the corresponding target on the server side returns a correct JSON output (both JSON string and application/json header).
I call a fetch like this:
var athlete = new Athlete({ id: 1 });
athlete.fetch();
at this point if I add a
console.log(athlete);
I can see the model, and inspecting it in firebug I can open the attributes object and see all the values returned from the server.
BUT if I do a:
console.log(athlete.get('name'));
I get undefined (the name appears under the attributes in the DOM inspection I mentioned above)
also doing a:
console.log(athlete.attributes);
returns an object containing only {id: 1} which is the argument I passed while creating the model.
If I create the model like this:
var athlete = new Athlete(<JSON string copypasted from the server response>);
then everything works fine, the .get() method returns whatever I ask, and athlete.attributes shows all the values.
What am I doing wrong?
fetch is asynchronous, which means that the data won't be available yet if you call console.log(athlete.get('name')) immediately after the fetch.
Use events to be notified when the data is available, for example:
var athlete = new Athlete({id: 1});
athlete.on("change", function (model) {
console.log(model.get('name'));
});
athlete.fetch();
or add a callback to your fetch
var athlete = new Athlete({ id: 1 });
athlete.fetch({
    success: function (model) {
        console.log(model.get('name'));
    }
});
or take advantage of the promise returned by fetch:
athlete.fetch().then(function () {
    console.log(athlete.get('name'));
});
Just a quick remark about using events in this example: in my case it did not work with change, because that event fires on every change. So sync does the trick.
var athlete = new Athlete({id: 1});
athlete.on("sync", function (model) {
console.log(model.get('name'));
});
athlete.fetch();