Can an object become 'stuck' inside another object (because it's deep)? - javascript

This question is not specific to this particular scenario. The scenario could concern any complex/deep JavaScript object, but for me to visualize the question I need a scene. Please don't answer specifically about the supplied example; answer only in terms of cloning, scope and objects.
Brief Outline
If I have stored a websocket object inside another object, could I later move this websocket out of the storage object and put it into another object? Kind of like when you pop or splice an array: the array item is not only removed from the array, it's also returned to you (not obliterated but transferred, 'plucked from'). Or is the websocket object stuck/tied to the storage object {}? (If so, in what state would the object be in a lower scope? What is it 'then'?)
Obviously:
// pseudo code
var finalobject = Object.assign({}, storageobject.socket);
// not in Node.js? probably a bad idea anyway
var finalobject = storageobject.socket;
// only a shallow reference
delete storageobject.socket;
finalobject.test = 'abc123';
// Obvious TypeError: Cannot set property... bla bla
What would I have, though, if I did this?
function lowerscope(x){return x;}
var sameobject=lowerscope(storageobject.socket);
Is sameobject now Dolly the sheep? A shallow reference, a deep reference, or a copy?
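For what it's worth, here is a minimal, runnable sketch of that check; a plain object stands in for the real socket (that substitution is an assumption):
// Minimal sketch: a plain object stands in for the real socket (assumption).
var storageobject = { socket: { readyState: 1 } };

function lowerscope(x) { return x; }

var sameobject = lowerscope(storageobject.socket);

// The parameter x and the return value are just additional references
// to the same object; no copy is made.
console.log(sameobject === storageobject.socket); // true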
My Question And The Reason Why I Think It's Not A Stupid Question
If I first assign the socket in the storage object to finalobject
and then delete storageobject.socket, then I can't use finalobject or set properties on it, because that was just a shallow reference to what I just deleted.
So what exactly is going on if I pass finalobject through a function like so? (Again, pseudo code.)
var finalobject = storageobject.socket;

function appInit(mySocket) {
  // do app stuff here with mySocket
  // set up some functions...
  return; // * come back out to main scope
}

appInit(finalobject);
delete storageobject.socket; // * what am I deleting??
Is the object cloned into appInit? Or does it live there as just another reference?
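For reference, here is a runnable sketch of that exact scenario; again, a plain object stands in for the websocket (an assumption), so only the reference behaviour is illustrated, not the real socket API:
// A plain object stands in for the websocket (assumption).
var storageobject = {
  socket: { send: function (msg) { console.log('send:', msg); } }
};

var finalobject = storageobject.socket; // another reference to the same object

function appInit(mySocket) {
  // mySocket is yet another reference to that same object,
  // not a clone made for this function's scope.
  mySocket.test = 'abc123';
}

appInit(finalobject);

// delete removes the 'socket' property from storageobject;
// the object itself survives as long as something still references it.
delete storageobject.socket;

console.log(finalobject.test);     // 'abc123'
console.log(storageobject.socket); // undefined
finalobject.send('still here');    // logs: send: still here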
Example of my code flow (optional; you can skip this):
This part is (too) specific to my case and is here only in case someone asks to see code or 'needs' to understand why I ask.
I have put (created a reference to) my newly created (Node.js ws) WebSocket in another object:
var uid = '7657rrfdt6e6t'; // unique id
var socketsReference = {}; // main scope

socketServer.on('connection', function(mainSocket) {
  var socketsGroup = {}; // local scope
  mainSocket.uid = uid;
  socketsGroup[1] = mainSocket;
  socketsReference[uid] = socketsGroup;
  socketsGroup[1].send('uid:' + uid);
});
So that is the main socket that is stored in the object. It has just sent its uid to the client so that it can set that as its uid property too.
The next thing I do is connect the client to a secondary socket. I want the secondary socket to end up with a similar unique id, uid + '.s2', as the main one. So I connect the second socket and then, over that new connection, send a message from the client to the server which tells the server-side socket the uid of the mainSocket (a client-side sketch of this handshake follows the code below):
secondarySocket.on('message', function(data) {
  // data is assumed here to already be a parsed object containing the main uid
  this.uid = data.uid + '.s2';
  socketsReference[data.uid][2] = this;
});
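For context, the client side of that handshake might look roughly like this; the URLs and message format are assumptions, not the actual code:
// Browser-side sketch; the URLs and message format are assumptions.
var mainSocket = new WebSocket('wss://example.com/main');
var myUid;

mainSocket.onmessage = function (event) {
  // The server sends 'uid:7657rrfdt6e6t' right after the connection opens.
  if (typeof event.data === 'string' && event.data.indexOf('uid:') === 0) {
    myUid = event.data.slice(4);

    // Open the secondary socket and tell the server which main uid it belongs to.
    var secondarySocket = new WebSocket('wss://example.com/secondary');
    secondarySocket.onopen = function () {
      secondarySocket.send(JSON.stringify({ uid: myUid }));
    };
  }
};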
Later in the application I take the correct user-specific socket group and inject it into the main application, where the connected user is in his own scope.
var mySockets = socketsReference[my_uid];
appInit(mySockets);

Related

Socket.io client within an object

I am using Node.js, with Socket.IO for communications to the client on my server...
For example:
On my server, I have a User class, which contains basic information and functions about each user. Each time someone connects, a new User object will be created and added to a users array, with the Socket.IO client object passed to it. Here is the code:
// Set up server, Socket.IO, etc.
var users = [];

var User = function(client) {
  this.value = "some random value";
  this.client = client;
  this.client.on("event", function(data) {
    // Do stuff with data
  });
};

socket.on("connection", function(client) {
  users.push(new User(client));
});
My problem is this: when receiving messages with Socket.IO's .on(), I want to do stuff with the User object that owns the client. But accessing this doesn't give me the User object, but rather the client object (or at least I think so; it isn't the User but some Socket.IO object). Even when I point the .on() handler at a function in my object, like this.event in my User object, I still can't access my User object with this. I tried creating a local variable within each object called self and setting it to this, like this: self = this;, but then I can only edit self, not this.
Any ideas?
this.client.on("event", function(data) {
  console.log(this.value === "some random value"); // true
}.bind(this));
bind sets the this keyword inside the callback to the provided value, i.e. the User object.
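An equivalent pattern, sketched here on the same User constructor, is to capture this in a local variable inside the constructor (rather than the global self the question mentions):
var User = function(client) {
  var self = this; // local to this constructor call, so one per User instance
  this.value = "some random value";
  this.client = client;
  this.client.on("event", function(data) {
    // `self` still refers to the owning User here, even though `this` does not.
    console.log(self.value === "some random value"); // true
  });
};
Because self is declared with var inside the constructor, each User gets its own copy and it can't be clobbered by other connections.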

Freeze objects using web-worker

I have an array of collections which needs to be frozen using a web worker. The sample below shows freezing a single collection.
var worker = new Worker("worker.js");

worker.onmessage = function (e) { // WATCH MESSAGES FROM THE WORKER
  var data = e.data;
  // TEST: frozen collection property changes here in main scope. Weird!!!
};

worker.postMessage(JSON.stringify({
  'collection': someHugeJSON_object
}));

// In my worker.js
function deepFreeze(obj) {
  // my freezing logic
}

onmessage = function (e) {
  var data = JSON.parse(e.data);
  var freezedJSON_object = deepFreeze(data.collection);
  // TEST: collection property does not change in worker scope after freezing
  // DONE FREEZING EVERYTHING... PHEW!!!
  postMessage({
    'collection': freezedJSON_object
  });
};
Are the enumerability, configurability, and writability of an object's properties restricted to a particular scope?
When you call postMessage(obj) you don't send obj itself; it's cloned using the structured clone algorithm.
The MDN page is rather explicit about what happens to frozen objects:
Property descriptors, setters, and getters (as well as similar metadata-like features) are not duplicated. For example, if an object is marked read-only using a property descriptor, it will be read-write in the duplicate, since that's the default condition.
So you can't freeze an object in a Web Worker and send it back to the main thread.
By the way, you don't have to call JSON.stringify on messages passed to a Web Worker.
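If the end goal is simply a frozen collection on the main thread, one option, sketched here under the assumption that the same deepFreeze helper is also available on the main thread and returns its argument, is to let the worker do any heavy processing and freeze the result only after it arrives:
var worker = new Worker("worker.js");

worker.onmessage = function (e) {
  // The structured clone arriving here is a plain, writable copy,
  // so freeze it on the main thread, which is where the frozen state is needed.
  var frozenCollection = deepFreeze(e.data.collection);
  // frozenCollection now stays read-only in this scope.
};

// No JSON.stringify needed: structured cloning handles plain objects.
worker.postMessage({ collection: someHugeJSON_object });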

Parse.com cloud function - manually modify object fields before sending to client

I'm trying to limit the visibility of some fields of the Parse User object in a cloud function.
I have a "Product" class with a pointer named "owner" to the "User" that uploaded the item.
I also have a cloud function called "getProducts"; I use query.include("owner") to get the owner data at the same time.
What I want to achieve is that the output of "getProducts" will be a list of products, but the "owner" object will contain only certain fields, such as "firstName" or "facebookId".
I don't want to return other sensitive data to the client even though I'm not presenting it (such as location, email, family name, etc.).
After searching I've seen 2 possible solutions.
1.) Split the User class into 2 classes, one of them a "Private" class with an ACL just for the user.
2.) The second approach, which I prefer, is to edit the fields in the cloud function, but I can't seem to change the "owner" object on the "product" object. I'm getting the error:
"Error: Uncaught Tried to save an object with a pointer to a new, unsaved object. (Code: 141, Version: 1.2.19)"
var output = [];
_.each(results, function(result) {
  var responseData = {};
  var owner = result.get("owner");

  // Remove fields from the user object
  var itemOwnerId = owner.id;
  var itemOwnerFirstName = owner.firstName;
  var itemOwnerFacebookID = owner.facebookID;

  var itemOwner = new Parse.User();
  itemOwner.id = itemOwnerId;
  itemOwner.firstName = itemOwnerFirstName;
  itemOwner.facebookID = itemOwnerFacebookID;

  result.set("owner", itemOwner);
  responseData.item = result;
  output.push(responseData);
});
It seems that calling result.set("owner", itemOwner) isn't allowed, and it throws the exception:
Error: Uncaught Tried to save an object with a pointer to a new, unsaved object. (Code: 141, Version: 1.2.19)
What am I doing wrong?
The SDK doesn't allow an object that has been changed to be serialized into a response.
A hacky way to work around this would be:
result.dirty = function() { return false; };
This would disable the check and allow you to return the modified object.
If you wanted to re-enable it later, you'd need to store the original value of result.dirty and reassign it later.
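Applied to the loop from the question, a rough sketch of that workaround (keeping the question's field-copying style, which may or may not be what Parse expects for custom fields) could look like this:
var output = [];
_.each(results, function(result) {
  var owner = result.get("owner");

  // Build a stripped-down owner with only the fields to expose,
  // as in the question.
  var itemOwner = new Parse.User();
  itemOwner.id = owner.id;
  itemOwner.firstName = owner.firstName;
  itemOwner.facebookID = owner.facebookID;

  result.set("owner", itemOwner);

  // Hack from the answer: disable the dirty check so the modified
  // object can be serialized into the response. Keep the original
  // function around in case it needs to be reassigned later.
  var originalDirty = result.dirty;
  result.dirty = function() { return false; };

  output.push({ item: result });
});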

Create a copy of a module instead of an instance in Node.js

This would be my first question ever on Stack Overflow; I hope this goes well.
I've been working on a game (using the Corona SDK) and I used Node.js to write a small server to handle some chat messages between my clients, no problems there.
Now I'm working on expanding this little server to do some more. What I was thinking of doing is creating an external file (module) that holds an object with all the functions and variables I need to represent a Room in my game's "Lobby", where 2 people can go to play against each other. Each time I have 2 players ready to play, I would create a copy of this empty room for them and then initialize the game in that room.
So I have an array in my main project file where each cell is a room, and my plan was to import my module into that array so I can init the game in that specific "room"; the players would play, the game would go on, and all would be well... but... here's my code in main.js:
var new_game_obj = require('./room.js');
games[room_id] = new_game_obj();
games[room_id].users = [user1_name,user2_name];
Now, in my room.js, I have something of the sort:
var game_logistics = {};
game_logistics.users = [];

game_logistics.return_users_count = function() {
  return game_logistics.users.length;
};

module.exports = function() {
  return game_logistics;
};
So far so good, and this works just fine; I can simply call:
games[room_id].return_users_count()
And I will get 0, or 1, or 2, depending of course on how many users have joined this room.
The problem starts once I open a new room. Since Node.js reuses the single instance of the module I've created rather than making a copy of it, if I now create a new room, even if I eliminated and/or deleted the old room, it will have all the information from the old room which I've already updated, and not be a new clean room. Example:
var new_game_obj = require('./room.js');
games["room_1"] = new_game_obj();
games["room_2"] = new_game_obj();
games["room_1"].users = ["yuval","lahav"];
_log(games["room_1"].return_users_count()); // outputs 2...
_log(games["room_2"].return_users_count()); // outputs 2...
Even doing this:
var new_game_obj = require('./room.js');
games["room_1"] = new_game_obj();
var new_game_obj2 = require('./room.js');
games["room_2"] = new_game_obj2();
games["room_1"].users = ["yuval","lahav"];
_log(games["room_1"].return_users_count()); // outputs 2...
_log(games["room_2"].return_users_count()); // outputs 2...
This gives the same result; it is all the same instance of the same module in all the "copies" I make of it.
So my question is as simple as that: how do I create a "clean" copy of my original module instead of just reusing the same instance over and over again and ending up with one messy room?
What you're doing is this (replacing your require() call with what gets returned):
var new_game_obj = function() {
  return game_logistics;
}
So, every time you call new_game_obj, you return the same instance of game_logistics.
Instead, you need to make new_game_obj return a new instance of game_logistics:
// room.js
function Game_Logistics() {
  this.users = [];
  this.return_users_count = function() {
    return this.users.length;
  };
}

module.exports = function() {
  return new Game_Logistics();
};
This is quite a shift in mentality. You'll see that we're using new on Game_Logistics in module.exports to return a new instance of Game_Logistics each time it's called.
You'll also see that inside Game_Logistics, this is being used everywhere rather than Game_Logistics; this is to make sure we're referencing the correct instance of Game_Logistics rather than the constructor function.
I've also capitalized your game_logistics function to adhere to the widely-followed naming convention that constructor functions should be capitalized (more info).
Taking advantage of the prototype chain in JavaScript is recommended when you're working with multiple instances of functions. You can peruse various articles on "JavaScript prototypal inheritance" (e.g. this one), but for now, the above will accomplish what you need.
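For illustration, a sketch of the same room module with the counting method moved onto the prototype, so that every room shares one copy of it while keeping its own users array, might look like this:
// room.js
function Game_Logistics() {
  this.users = []; // per-instance state
}

// Shared by every instance via the prototype chain.
Game_Logistics.prototype.return_users_count = function() {
  return this.users.length;
};

module.exports = function() {
  return new Game_Logistics();
};
With this version, games["room_1"] and games["room_2"] each keep their own users array, and return_users_count exists only once in memory.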

Reuse serialized reference to the "this" keyword

First things first: I'm not sure whether the information that I'm going to provide will be enough, I will happily add additional information if needed.
I'm serializing a complex structure into JSON format; Field[i][0] is the "this" reference to an object.
[Firebug's output of JSON.stringify(myObj)]
This is all fine and working as long as I keep it all in JS. But now I have the requirement to serialize it and send it to my backend to get the reference + computed information back.
Now how do I map back to the reference I had before? How do I bind this ref back to an object?
This $$hash thing looks internal and proprietary, so I haven't even bothered trying something like Object[$$hash] = ref or whatever.
This general idea probably seems pretty whack, but the result is returned asynchronously and I need an identifier to bind the new information back to the original object. Obviously I could just make up my own identifier for that, but I was wondering whether there's an option to solve it this way.
EDIT
The objects are created like this:
var arrayOfObj = [];
arrayOfObj.push(new Object.With.SomeSettersAndGetters());
The Object has a method like
function GetRef(){
return this;
}
which I'm using to keep an ID/reference throughout my code.
Thank you!
Update
If you want to update a series of instances and make many Ajax requests, then you need to look at Ajax long polling and queueing techniques. You won't be able to preserve the reference across the wire, but regardless of which Ajax technique you use, make use of the trick below to preserve the reference on the client side.
Add long polling on top and you're good to go.
The idea is this:
Assume the server will respond in JSON format. If you need to refer to the original references, here's my two cents:
Update the exact references when the server replies. Say you have 10 instances of Something stored in an array. On a successful response, you use the methods in the Something class to update the specific instances in whatever way you want.
/**
 * The array with Something instances.
 * @type {Array.<Something>}
 */
var instances = [];

/**
 * The Ajax success function.
 * @param {Event} event The event object.
 */
function ajaxSuccess(event) {
  var response = event.target.responseText;
  var actualResponse = JSON.parse(response);
  for (var i = 0, len = actualResponse.length; i < len; i++) {
    instances[i].setWhatever(actualResponse[i].whatever);
  }
}
The above is a more procedural approach. If you want full-blown OOP in JS, then think in terms of modular design patterns. Say you have a module that loads data into some place. Basically, everything related to that module is an instance property.
var myModule = function() {
  this.whatever = 1;
};

myModule.prototype.loadMore = function() {
  var request = new XMLHttpRequest(),
      that = this; // store a reference to this instance.

  // Wrap the handler so onSuccess runs with the instance as `this`.
  request.onreadystatechange = function(event) {
    if (request.readyState === 4) {
      that.onSuccess(event);
    }
  };
  request.send(); // etc. (request.open(...) omitted, as in the original)
};

myModule.prototype.onSuccess = function(event) {
  var response = JSON.parse(event.target.responseText);
  this.whatever = response.whatever;
};

var moduleInstance = new myModule();
moduleInstance.loadMore();
// Now the scope is always preserved. The callback function will be executed in the right scope.
Let's assume on the backend side of things, you have a model class that mimics your client side JavaScript model. Say you want to update a reference inside a model that displays text. I use Scala on the backend, but look at the fields/properties and ignore the syntax.
case class Article(
  title: String,     // these are my DB fields for an Article.
  punchline: String,
  content: String,
  author: String
)
// now assume the client is making a request and the server returns the JSON
// for an article. So the reply would be something like:
{"title": "Sample title", "punchline": "whatever", "content": "bla bla bla boring", "author": "Charlie Sheen"};
// when you do
var response = JSON.parse(event.target.responseText);
// response will become a JavaScript object with the exact same properties.
// again, my backend choice is irrelevant.
// Now assume I am inside the success function, which gets called in the same scope
// as the original object, so it refers TO THE SAME THING.
// the trick is to maintain the reference with var that = this.
// otherwise the onSuccess function will be called in global scope.
// now because it's pointing to the same object.
// I can update whatever I want.
this.title = response.title;
this.punchline = response.punchline;
this.content = response.content;
this.author = response.author;
// or I can put it all in a single variable.
this.data = response;
What you need to remember is that scope needs to be preserved. That's the trick.
When I do var that = this; I copy a reference to the model instance. The reference is remembered through the enclosing function's scope (the closure), not the current scope.
Then I tell the XMLHttpRequest object to call that.onSuccess when it is complete. Because I used that, the onSuccess function is called in the scope of the current object. So inside the onSuccess function, this will point to the original this, the same instance.
JavaScript remembers it for me when I write var that = this;
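Since the question also mentions making up an identifier for the asynchronous round trip, here is a small sketch of that option as well; all the names (register, instancesById, computed) are illustrative, not part of the original code:
// Illustrative sketch: map client-generated ids to instances.
var instancesById = {};
var nextId = 0;

function register(instance) {
  var id = 'obj_' + (nextId++);
  instancesById[id] = instance;
  return id;
}

// Include the id in the payload sent to the backend...
var payload = arrayOfObj.map(function (obj) {
  return { id: register(obj), data: obj };
});

// ...and when the asynchronous reply comes back, look each instance up again.
function onReply(responseItems) {
  responseItems.forEach(function (item) {
    var original = instancesById[item.id];
    if (original) {
      original.computed = item.computed; // attach the computed information
    }
  });
}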
