Should data functions be properties of a data array? - javascript

I have a dataset and some functions that manipulate that data. I would like to store the data and functions in a logical structure that is readable and easy to use without violating good practices.
I see 2 options:
1. Define the data as an array and add the manipulation functions to the array, which is possible because an Array is a kind of Object.
Example:
var dataSet = [1, 2, 3, 4];
dataSet.add = function (newData) {
    if (newData === badData) {
        console.log('bad data!');
        return;
    }
    dataSet.push(newData);
};
Pros: Cleaner, more readable, easier to use
Cons: We are modifying an Array instance, which means it will not behave as expected; copying the object, for example, will cause it to lose its added functions, so we either have to avoid copying it or document that limitation.
2. Define a new Object and define the data as a property of the object along with all the manipulation functions.
Example:
var dataSet = {
    data: [1, 2, 3, 4],
    add: function (newData) {
        if (newData === badData) {
            console.log('bad data!');
            return;
        }
        this.data.push(newData);
    }
};
Pros: Object works as expected, can be duplicated easily, and packages the data and functions neatly.
Cons: The data cannot be referenced as the canonical representation of the object simply by writing 'dataSet'. Data manipulation becomes tedious to write in some cases, because we constantly have to refer to dataSet.data instead of just dataSet.
Which of these 2 options is best and why? What are some examples of these options in use today?
Feel free to offer better options as well.
Edit:
Option 2a: Use a prototype if you have more than one dataset.
Option 3: Don't attach the data to the functions at all. Instead create a separate array for the data outside of the functions object and put all the data functions into their own object.
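For illustration, option 3 might look something like the sketch below (not from the original post; badData is still just a placeholder for whatever the validation rejects):
// The data lives on its own...
var data = [1, 2, 3, 4];
// ...and the functions live in a separate object that takes the array as an argument.
var dataFunctions = {
    add: function (arr, newData) {
        if (newData === badData) {
            console.log('bad data!');
            return;
        }
        arr.push(newData);
    }
};
dataFunctions.add(data, 5);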
Thanks!

In my opinion the second option is better, because, as you said, modifying a native object is bad practice.
Actually there is a third option: you can create a class that takes the native data structure and exposes methods on it.
That way, you will be able to create multiple instances of the same class.
class DataSet {
    constructor(data) {
        this.data = data;
    }
    add(newData) {
        if (newData === badData) {
            console.log('bad data!');
            return;
        }
        this.data.push(newData);
    }
}
The same implementation, but in ES5:
var DataSet = (function () {
    function DataSet(data) {
        this.data = data;
    }
    DataSet.prototype.add = function (newData) {
        if (newData === badData) {
            console.log('bad data!');
            return;
        }
        this.data.push(newData);
    };
    return DataSet;
}());
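Usage would then look something like this (a sketch; badData is still just a placeholder for whatever your validation rejects):
var badData = null; // hypothetical "bad" value for the example
var primes = new DataSet([2, 3, 5]);
var evens = new DataSet([2, 4, 6]);
primes.add(7);    // primes.data is now [2, 3, 5, 7]
evens.add(null);  // logs 'bad data!' and leaves evens.data unchanged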


best way to handle callback initialization and functions that hold data between function calls

I have this code that is called in an ajax callback once the data is fetched:
function onFetchCallback(data) {
    onFetchCallback.accumData ??= [];
    onFetchCallback.timeLine ??= [];
    onFetchCallback.tempValues1 ??= [];
    onFetchCallback.tempValues2 ??= [];
    onFetchCallback.char;
    const hasNulls = data.includes(null);
    if (!hasNulls) {
        //push values into different arrays
    } else {
        //push the rest of no nulls if there is any...
    }
}
I don't find this clean, because I am checking whether the arrays that accumulate the data are initialized on every callback call. I think it would be better to have the callback function initialized first, so that the arrays are created, and then call the functions that will store the data in the arrays.
So I did:
function onFetchCallback() {
    function init() {
        onFetchCallback.accumData ??= [];
        onFetchCallback.timeLine ??= [];
        onFetchCallback.tempValues1 ??= [];
        onFetchCallback.tempValues2 ??= [];
        onFetchCallback.char;
    }
    function store(data) {
        const hasNulls = data.includes(null);
        if (!hasNulls) {
            //push values into different arrays
        } else {
            //push the rest of no nulls if there is any...
        }
    }
    onFetchCallback.init = init;
    onFetchCallback.store = store;
}
So then when I need to use my callback I do:
onFetchCallback();
onFetchCallback.init();
myWhateverFunc(onFetchCallback.store);
myWhateverFunc being the function that calls the callback:
function myWhateverFunc(callback) {
    $.ajax({
        //whatever
    })
    .done(function (data) {
        callback(data); //CALL
    });
}
This works and I find it super JavaScript-y, so I do it all the time, meaning the onFetchCallback initialization plus the other method calls to handle the function members. I do not know JS in depth, so I would like to know if there are any flaws with this pattern, or if there is a better/cooler/more JavaScript-ish way to do this.
The pattern you're using has a lot of resemblance to the constructor function pattern, which is more commonly used in JavaScript.
An implementation of your code in the constructor function pattern would look like this:
function FetchCallback() {
    this.accumData = [];
    this.timeLine = [];
    this.tempValues1 = [];
    this.tempValues2 = [];
    this.char;
}
FetchCallback.prototype.store = function (data) {
    const hasNulls = data.includes(null);
    if (!hasNulls) {
        // push values into different arrays
    } else {
        // push the rest of no nulls if there is any...
    }
};
It enables you to create an object with properties and methods which are predefined. This removes the hassle of repetition when you need multiple instances of this same object.
To use the constructor you'll need to create a new instance with the new keyword. This will return an object with all the properties and methods set.
const fetchCallback = new FetchCallback();
// Note the .bind() method!
myWhateverFunc(fetchCallback.store.bind(fetchCallback));
Edit
You'll need to explicitly set the value of this to the created instance stored in fetchCallback. You can do this with the bind() method, which explicitly tells JavaScript which object this should refer to.
The reason to do this is that whenever you pass the store method as the callback to myWhateverFunc, it loses its context with the FetchCallback function. You can read more about this in this post.
The main difference between this and your code is that here the FetchCallback function is left unaltered, whereas your function is reconfigured every time you call onFetchCallback() and onFetchCallback.init(). The constructor pattern will result in more predictable behavior, albeit that the this keyword has a learning curve.
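As an aside, instead of bind() you could also wrap the call in an arrow function; the effect is the same as the bind() call above (a minimal sketch):
const fetchCallback = new FetchCallback();
// The arrow function closes over fetchCallback, so `this` inside store() stays correct.
myWhateverFunc(data => fetchCallback.store(data));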

javascript array value as variable name , how I get them from function

I'm storing an array in a variable and then calling a method of the function N on it; how do I get the array's values inside function N?
I really want to simulate the built-in JavaScript array methods, and I don't want to pass the array as a parameter to achieve it. For example,
var p1 = [1,2,3,4,5]; p1.push(6);
function _Array() {
    this._this = this;
}
_Array.prototype.show = function () {
    this._this.forEach(function (item) { alert(item); }); //how to print 1,2,3,4,5
};
var p1 = [1,2,3,4,5];
p1 = new _Array();
//p1._Array.call(p1); //not work
// new _Array().show.call(p1); //not work
// p1.show(); //not work
You have to store the array in the instance:
function N(arr) {
    this._this = arr
}
N.prototype.say = function () {
    this._this.forEach(function (item) {
        console.log(item)
    })
}
p1 = new N([1, 2, 3, 4, 5])
p1.say()
If you are insistent on wanting to write a method that takes the array by reference, you can modify the array prototype like so:
Array.prototype.show = function () {
    this.forEach(item => alert(item));
}
However, it is a VERY BAD IDEA to modify the built-in object prototypes, as this can cause conflicts with external libraries that implement their own "show" function used differently, or cause incompatibilities with future versions of JS that implement such a method.
It would be far more prudent in most situations to pass the array as a parameter, unless you have a very specific reason why you're not doing so. In that case, you should at least prefix the method with some sort of project identifier to minimize the chances of conflicts occurring.
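For example, with a made-up "myproj" prefix (illustrative only):
// The prefix reduces the chance of clashing with libraries or future built-in methods.
Array.prototype.myprojShow = function () {
    this.forEach(item => alert(item));
};
[1, 2, 3, 4, 5].myprojShow(); // alerts 1, 2, 3, 4, 5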

Accessing Svelte component properties in a callback?

Imagine that you have a lot of properties in a component:
let a = 'foo';
let b = 'bar';
// ...
let z = 'baz';
You then want to do something like update all of them from an external callback, like in another library (i.e. something that isn't and can't be a Svelte component itself).
A simple use case is just an AJAX method to load in a bunch of data (assume this ajax function works and you can pass it a callback):
onMount(async function () {
    ajax('/data', function (data) {
        a = data.a;
        b = data.b;
        // ...
        z = data.z;
    });
});
This works, but it's incredibly boilerplate-heavy. What I'd really like is a way to loop through all the properties so they can be assigned programmatically, especially without prior knowledge on the outside library/callback's part.
Is there no way to get access to a Svelte component and its properties so you can loop through them and assign them from an outside function?
Vue has a simple solution to this, because you can pass the component around, and still check and assign to its properties:
var vm = this;
ajax('/data', function (data) {
    for (var key in data) {
        if (vm.hasOwnProperty(key)) {
            vm[key] = data[key];
        }
    }
});
I have seen some solutions to this, but they're all outdated - none of them work with Svelte 3.
Apologies if this has been asked before. I've spent days trying to figure this out to avoid all that extra boilerplate and the closest I could find is Access Component Object in External Callback? which does not have an answer right now.
If possible, you could put the ajax call in the parent component, store the data it returns in a temporary object, and then pass that on to the component using the spread operator.
<Component { ...dataObject }></Component>
let dataObject = {};
onMount(async function () {
    ajax('/data', function (data) {
        dataObject = data;
    });
});
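Inside the child component, each property you want to receive still needs to be declared as a prop; a minimal sketch (the names a, b and z are just the ones from the question):
<!-- Component.svelte -->
<script>
    export let a = 'foo';
    export let b = 'bar';
    export let z = 'baz';
</script>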
You can reduce the boilerplate by using destructuring:
onMount(async function () {
    ajax('/data', data => {
        ({ a, b, ..., z } = data);
    });
});
But if you have a very large number of variables, you might be better off just putting them in an object in the first place:
let stuff;
onMount(async function () {
    ajax('/data', data => {
        stuff = data;
    });
});

How to subscribe to object changes?

I wonder how to subscribe to the changes of a JavaScript object, e.g. like Redux does. I read through a lot of JS documentation but I couldn't find a non-deprecated way to handle this problem (Object.prototype.watch() as well as Object.observe() are deprecated). Moreover I read a few Stack Overflow questions regarding this topic, but they are all at least 5 years old. To visualize my problem I'll show an example.
This could be an object I want to watch for:
const store = {
    anArray: [
        'Hi',
        'my',
        'name',
        'is'
    ]
}
.. and this a function which changes the store object:
function addAName() {
    store.anArray.push('Bob')
}
My goal in this example is to trigger the following function every time the store object changes
function storeChanged() {
    console.log('The store object has changed!')
}
Thank you in advance!
Have you tried using Proxy from ES6? I think this is what you are looking for.
You only have to define a set trap in the Proxy handler (the validator object) like this:
let validator = {
    set: function (target, key, value) {
        console.log(`The property ${key} has been updated with ${value}`);
        target[key] = value; // actually store the value, otherwise the assignment is lost
        return true;
    }
};
let store = new Proxy({}, validator);
store.a = 'hello';
// console => The property a has been updated with hello
To solve this problem without any indirection (i.e. while still using the object directly) you can use a Proxy.
By wrapping all nested objects with observable you can edit your store freely, and _base keeps track of which property has changed.
const observable = (target, callback, _base = []) => {
    for (const key in target) {
        if (typeof target[key] === 'object')
            target[key] = observable(target[key], callback, [..._base, key])
    }
    return new Proxy(target, {
        set(target, key, value) {
            if (typeof value === 'object') value = observable(value, callback, [..._base, key])
            callback([..._base, key], target[key] = value)
            return true // the set trap should report success with a boolean
        }
    })
}
const a = observable({
    a: [1, 2, 3],
    b: { c: { d: 1 } }
}, (key, val) => {
    console.log(key, val);
})
a.a.push(1)
a.b.c.d = 1
a.b = {}
a.b.c = 1
You can use Object.defineProperty() to create reactive getters/setters. It has good browser support and looks handy.
function Store() {
    let array = [];
    Object.defineProperty(this, 'array', {
        get: function () {
            console.log('Get:', array);
            return array;
        },
        set: function (value) {
            array = value;
            console.log('Set:', array)
        }
    });
}
var store = new Store();
store.array; //Get: []
store.array = [11]; //Set: [11]
store.array.push(5) //Get: [11] (push mutates the returned array directly and does not trigger the setter)
store.array = store.array.concat(1, 2, 3) //Get: [11, 5] then Set: [11, 5, 1, 2, 3]
It's non-trivial.
There are several approaches that different tools (Redux, Angular, KnockoutJS, etc.) use.
Channeling changes through functions - This is the approach Redux uses (more). You don't directly modify things, you pass them through reducers, which means Redux is aware that you've changed something.
Diffing - Literally comparing the object tree to a previous copy of the object tree and acting on changes made. At least some versions of Angular/AngularJS use(d) this approach.
Wrapping - (Kind of a variant on #1) Wrapping all modification operations on all objects in the tree (such as the push method on your array) with wrappers that notify a controller that the object they're on has been modified, by wrapping those methods (and replacing simple data properties with accessor properties) and/or using Proxy objects. KnockoutJS uses a version of this approach.
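A tiny sketch of the wrapping idea (illustrative only, not how any particular library does it), replacing push on one specific array so changes can be reported:
function watchArray(arr, onChange) {
    const originalPush = arr.push;
    arr.push = function (...items) {
        const result = originalPush.apply(this, items);
        onChange('push', items); // notify whoever is interested
        return result;
    };
    return arr;
}
const anArray = watchArray(['Hi', 'my', 'name', 'is'], (op, items) => {
    console.log('The store object has changed!', op, items);
});
anArray.push('Bob'); // logs: The store object has changed! push ['Bob']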
You can try this npm package
tahasoft-event-emitter
Your code will look like this:
import { EventEmitter } from "tahasoft-event-emitter";
const store = {
    anArray: ["Hi", "my", "name", "is"]
};
const onStoreChange = new EventEmitter();
function addAName(name) {
    onStoreChange.emit(name);
    store.anArray.push(name);
}
function storeChanged(name) {
    console.log("The store object has changed! New name is " + name);
}
onStoreChange.add((name) => storeChanged(name));
addAName("Bob");
If you are not interested in the new value of name, you can write it more simply, like this:
import { EventEmitter } from "tahasoft-event-emitter";
const store = {
    anArray: ["Hi", "my", "name", "is"]
};
const onStoreChange = new EventEmitter();
function addAName() {
    store.anArray.push("Bob");
    onStoreChange.emit();
}
function storeChanged() {
    console.log("The store object has changed!");
}
onStoreChange.add(storeChanged);
addAName();
Whenever you call addAName, storeChanged will be called.
Here is the example on CodeSandbox: EventEmitter on an object status change.
You can check the package's GitHub repository to see how it is implemented. It is very simple.
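If you'd rather avoid the dependency, a minimal event emitter along the same lines is only a few lines of code (a sketch, not the actual package's implementation):
class EventEmitter {
    constructor() {
        this.listeners = [];
    }
    add(listener) {
        this.listeners.push(listener);
    }
    emit(...args) {
        // Call every registered listener with the emitted arguments.
        this.listeners.forEach((listener) => listener(...args));
    }
}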
I am not sure there is a way to do it with native functionality, and even if there is, I think you won't be able to do it without some sort of abstraction.
It doesn't even work natively in React/Redux either, as you need to call setState explicitly to trigger changes. I recommend a plain observer pattern, whose implementation roughly looks like this:
var store = {
    names: ['Bob', 'Jon', 'Doe']
}
var storeObservers = [];
Now, simply push in your observer functions; it doesn't matter if they are part of some component or module:
storeObservers.push(MyComponent.handleStoreChange)
Now simply expose a function to change the store, similar to setState:
function changeStore(newStore) {
    // note: the parameter must not shadow the outer `store` variable
    store = newStore
    storeObservers.forEach(function (observer) {
        observer()
    })
}
You can obviously modularize this to handle more complex situations, or if you only want to change part of the state and allow observers to bind callbacks to partial state changes.
Alejandro Riera's answer using Proxy is probably the correct approach in most cases. However, if you're willing to include a (small-ish, ~90kb) library, Vue.js can have watched properties on its instances, which can sort of do what you want.
It is probably overkill to use it for just change observation, but if your website has other uses for a reactive framework, it may be a good approach.
I sometimes use it as an object store without an associated element, like this:
const store = new Vue({
    data: {
        anArray: [
            'Hi',
            'my',
            'name',
            'is'
        ]
    },
    watch: {
        // whenever anArray changes, this function will run
        anArray: function () {
            console.log('The store object has changed:', this.anArray);
        }
    }
});
function addAName() {
    // push random strings as names
    const newName = '_' + Math.random().toString(36).substr(2, 6);
    store.anArray.push(newName);
}
// demo
setInterval(addAName, 5000);

Javascript object, Node and Jade [duplicate]

I am trying to develop an offline HTML5 application that should work in most modern browsers (Chrome, Firefox, IE 9+, Safari, Opera). Since IndexedDB isn't supported by Safari (yet), and WebSQL is deprecated, I decided on using localStorage to store user-generated JavaScript objects and JSON.stringify()/JSON.parse() to put in or pull out the objects. However, I found out that JSON.stringify() does not handle methods. Here is an example object with a simple method:
var myObject = {};
myObject.foo = 'bar';
myObject.someFunction = function () {/*code in this function*/}
If I stringify this object (and later put it into localStorage), all that will be retained is myObject.foo, not myObject.someFunction().
//put object into localStorage
localStorage.setItem('myObject', JSON.stringify(myObject));
//pull it out of localStorage and parse it back into an object
myObject = JSON.parse(localStorage.getItem('myObject'));
//undefined!
myObject.someFunction
I'm sure many of you probably already know of this limitation/feature/whatever you want to call it. The workaround that I've come up with is to create an object with the methods(myObject = new objectConstructor()), pull out the object properties from localStorage, and assign them to the new object I created. I feel that this is a roundabout approach, but I'm new to the JavaScript world, so this is how I solved it. So here is my grand question: I'd like the whole object (properties + methods) to be included in localStorage. How do I do this? If you can perhaps show me a better algorithm, or maybe another JSON method I don't know about, I'd greatly appreciate it.
Functions in JavaScript are more than just their code. They also have scope. Code can be stringified, but scope cannot.
JSON.stringify() will only encode values that JSON supports: objects whose values can be objects, arrays, strings, numbers and booleans. Anything else will be ignored or throw errors. Functions are not a supported entity in JSON. JSON handles pure data only; functions are not data, but behavior with more complex semantics.
That said, you can change how JSON.stringify() works. The second argument is a replacer function. So you could force the behavior you want by forcing the stringification of functions:
var obj = {
    foo: function() {
        return "I'm a function!";
    }
};
var json = JSON.stringify(obj, function(key, value) {
    if (typeof value === 'function') {
        return value.toString();
    } else {
        return value;
    }
});
console.log(json);
// {"foo":"function () { return \"I'm a function!\" }"}
But when you read that back in you would have to eval the function string and set the result back to the object, because JSON does not support functions.
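A rough sketch of the reading-back side (assuming you trust the stored string, since evaluating stored code is a security risk):
var parsed = JSON.parse(json, function (key, value) {
    // Very naive check: treat strings that look like function source as functions.
    if (typeof value === 'string' && value.indexOf('function') === 0) {
        return new Function('return ' + value)();
    }
    return value;
});
console.log(parsed.foo()); // "I'm a function!"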
All in all encoding functions in JSON can get pretty hairy. Are you sure you want to do this? There is probably a better way...
Perhaps you could instead save raw data, and pass that to a constructor from your JS loaded on the page. localStorage would only hold the data, but your code loaded onto the page would provide the methods to operate on that data.
// contrived example...
var MyClass = function (data) {
    this.firstName = data.firstName;
    this.lastName = data.lastName;
}
MyClass.prototype.getName = function () {
    return this.firstName + ' ' + this.lastName;
}
// localStorage only stores strings, so stringify on the way in...
localStorage.peopleData = JSON.stringify([{
    firstName: 'Bob',
    lastName: 'McDudeFace'
}]);
// ...and parse on the way out
var peopleData = JSON.parse(localStorage.peopleData);
var bob = new MyClass(peopleData[0]);
bob.getName() // 'Bob McDudeFace'
We don't need to save the getName() method to localStorage. We just need to feed that data into a constructor that will provide that method.
If you want to stringify objects that happen to contain functions, you can use JSON.stringify() with its second parameter, a replacer (the version below simply skips the functions). To prevent cyclic references from breaking the stringification you can use a cache array (var cache = []).
In our project we use lodash. We use the following function to generate logs. It can also be used to save objects to localStorage.
var stringifyObj = function(obj) {
    var cache = []
    return JSON.stringify(obj, function(key, value) {
        if (
            _.isString(value) ||
            _.isNumber(value) ||
            _.isBoolean(value)
        ) {
            return value
        } else if (_.isError(value)) {
            return value.stack || ''
        } else if (_.isPlainObject(value) || _.isArray(value)) {
            if (cache.indexOf(value) !== -1) {
                return
            } else {
                // cache each item
                cache.push(value)
                return value
            }
        }
    })
}
// create a circular object
var circularObject = {}
circularObject.circularObject = circularObject
// stringify an object
$('body').text(
    stringifyObj(
        {
            myBoolean: true,
            myString: 'foo',
            myNumber: 1,
            myArray: [1, 2, 3],
            myObject: {},
            myCircularObject: circularObject,
            myFunction: function () {}
        }
    )
)
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/lodash.js/4.17.4/lodash.min.js"></script>
This does not preserve functions as requested, but it is a way to store variables locally...
<html>
<head>
    <title>Blank</title>
    <script>
        // localStorage only stores strings, so seed it with a JSON string
        if (localStorage.g === undefined) localStorage.g = JSON.stringify({});
        var g = JSON.parse(localStorage.g);
    </script>
</head>
<body>
    <input type=button onClick="localStorage.g=JSON.stringify(g, null, ' ')" value="Save">
    <input type=button onClick="g=JSON.parse(localStorage.g)" value="Load">
</body>
</html>
Keep all variables in object g. Example:
g.arr=[1,2,3];
Note that for some types, such as Date, you'll need to rehydrate the value after loading, e.g.:
g.date = new Date(g.date);
This stores data locally per page: different pages get different g objects.
