Do we need to identify the data we are using? - javascript

I just completed a React component. It calls a service in an action and puts the result into the state in a reducer. Then the React mechanism updates the component, and then the DOM.
When I finished the component, it checked that it was getting the values it expected from the service. After several rounds of pull requests and comments, there is no reference to any particular data left in the component. The data received from the service is just kept as “data” and passed to whoever needs it, without examination or alteration. Even when I pass the data to a subcomponent, I pass the whole data object using a JSX spread, and the subcomponent takes the pieces it needs.
I am very uncomfortable programming this way. If the service returns unexpected data, all I will be able to tell the business is “it broke.” If the server is supposed to send person.arm.hand, and it sends person without an arm property, the program will throw an exception when it tries to dereference person.arm.hand.
I am told that I should rely on the integration tests. Checks for expected values shouldn’t be put into the code.
Is that standard industry practice?
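For concreteness, a minimal sketch of the kind of runtime check in question, using the hypothetical person.arm.hand shape from above; warning and bailing out is just one possible policy:

function renderHand(person) {
  // Defensive check: report a meaningful error instead of letting a TypeError surface
  if (!person || !person.arm || !person.arm.hand) {
    console.warn("Service response is missing person.arm.hand", person);
    return null;
  }
  return person.arm.hand;
}

// Without the guard, reading person.arm.hand throws a TypeError
// when the service omits the arm property.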

Related

Wait for response from emitted message?

I'm having trouble wrapping my head around the following concept.
I'm sending OSC messages to query the status of instruments in Ableton, so I have an emitter/receiver combo going on. The thing is that I'd like to avoid having to keep up some sort of global state and wrap everything around it.
I communicate with Ableton in the following fashion:
sender.emit("/live/device", queryData);
receiver.on("/live/device", function (responseData) {
  // process response here...
});
So you can tell that I'm not really sure when I get data back, and I cannot really sequence new queries based on responses.
What I'd like to do is simply:
query number of instruments on ONE certain channel
get number back
query parameters of each instrument of that channel based on first query
receive parameters back
But the problem is that I have no idea how to wrap event listeners to respond to these queries, or rather how to sequence them in a way that is non-blocking while still avoiding some sort of global state.
Querying data and storing Promises to be resolved by the event listener seems like a solution, but then I'm stuck on how to pass them back into the sequence.
After some research, it seems that this kind of behavior breaks the whole concept of event listeners, but then I suppose the whole point is to have some global state to keep track of what is going on, right?
Event listeners tell you about some asynchronous action coming from a user action or any other interrupt. Depending on the API you are facing, it might re-use event listeners for replies instead of providing a promise or callback return for the send API. If the server has multiple clients interacting with it, it might also want to tell all clients at the same time when their state changes.
If you are sure there is no way to directly provide a callback in the send method for a reply to your request, and a request does not yield a promise that resolves with the reply at some point, there are usually workarounds.
Option 1: Send context, receive it back
There are APIs that allow sending a "context" object or string along with the request. The API then passes this context to the event listeners, together with the payload, whenever it answers that specific question. This way, the context part of the payload can be checked to see whether it is the answer to your request. You could then write your own little wrapper functions for a more direct send/reply pattern.
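A rough sketch of such a wrapper for the emitter/receiver pair from the question, assuming receiver behaves like a Node EventEmitter and that the API echoes a context field (here called id) back in its reply; the /live/device/param address and the devices field are made up for illustration:

let nextId = 0;

function query(address, data) {
  return new Promise(function (resolve) {
    const id = ++nextId;
    function onReply(responseData) {
      // Only resolve for the reply that carries our context back
      if (responseData.id === id) {
        receiver.removeListener(address, onReply);
        resolve(responseData);
      }
    }
    receiver.on(address, onReply);
    sender.emit(address, Object.assign({ id: id }, data));
  });
}

// Sequencing then becomes straightforward:
query("/live/device", { channel: 1 })
  .then(function (reply) {
    return Promise.all(reply.devices.map(function (device) {
      return query("/live/device/param", { channel: 1, device: device });
    }));
  })
  .then(function (allParams) {
    console.log(allParams);
  });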
Option 2: Figure out the result data, if it fits your request
If the resulting data has something specific to match on, like keys on a JSON object, it may be possible to find out what the request was.
Option 3: Use state on your side to keep track of everything
In most cases where I have seen such APIs, the server didn't care much about requests and only sent out its current state whenever it was changed by some kind of request. The client needs to replicate the state of the server by listening to all events if it wants to show the current server state.
In most situations where I faced this issue, I thought about Option 1 or 2 but ended up with Option 3 anyway: other clients or hardware switches might interfere with my client UI and change the server state without me listening for that change. That way I would lose information that invalidates my UI, so I would need to listen and replicate the state of the server/machine/hardware anyway.
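For completeness, a minimal sketch of Option 3 with the same emitter/receiver pair; the { channel, devices } payload shape is invented for illustration:

const liveState = {};  // local replica of the server's state, keyed by channel

receiver.on("/live/device", function (responseData) {
  // Every reply (and every broadcast triggered by other clients) updates the replica
  liveState[responseData.channel] = responseData.devices;
});

// The rest of the app reads the replica instead of waiting for a specific reply
function getDevices(channel) {
  return liveState[channel] || [];
}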

Partial state changes for Vaadin's AbstractJavascriptComponent

I'm implementing a JavaScript-based Vaadin component that will need to show and update a relatively large data set. I'm doing this by extending AbstractJavaScriptComponent.
I'm trying to keep the JS side as "dumb" as possible, delegating user interactions to the server using RPC, which then updates the shared state. The JS connector wrapper's onStateChange function is then called with the new state, which causes the DOM to be updated accordingly.
I have 2 problems:
I don't want to transfer the whole data set each time a small part gets updated.
I don't want to entirely rebuild the UI each time either.
I can solve the second problem by keeping the previous state and comparing parts of it to find out what changed and only make the necessary DOM changes.
But that still leaves the first problem.
Do I have to stop using Vaadin's shared state mechanism and instead only use RPC for communicating the changes to the state?
Update:
I've been doing some testing, and it certainly appears that Vaadin's shared state mechanism is horrible in terms of efficiency:
Whenever the component calls getState() in order to update some property in the state object (or even without updating anything), the whole state object is transferred. The only way to avoid this, as far as I can see, is to not use the shared state mechanism and instead use RPC calls to communicate specific state changes to the client.
There are some issues with the RPC approach that will need to be resolved, for example: if you change a value multiple times within a single request/response cycle, you don't want to make the RPC call multiple times. Instead, you want only the last value to be sent just like the shared state mechanism only sends the final state in the response. You can keep dirty flags for each part of the state that you want to send separately (or just keep a copy of the previous state and compare), but then you need to somehow trigger the RPC call at the end of the request handling. How can this be done?
Any ideas on this are welcome!
Update 2:
Vaadin 8 fixes the root issue: it sends only the changed state properties. Also, it doesn't call onStateChange() on the JS connector anymore when only doing an RPC call (and not changing any state).
The OP is correct in stating that shared state synchronisation is inefficient for AbstractJavaScriptComponent-based components. The entire state object is serialised and made available to the JavaScript connector's onStateChange method whenever the connector is marked as dirty. Other, non-JavaScript components handle state updates more intelligently by sending only the changes. The exact place in the code where this happens is line 97 of com.vaadin.server.LegacyCommunicationManager.java:
boolean supportsDiffState = !JavaScriptConnectorState.class
.isAssignableFrom(stateType);
I'm not sure why state update is handled differently for AbstractJavaScriptComponent-based components. Maybe it's to simplify the javascript connector and remove the need to assemble a complete state object from deltas. It would be great if this could be addressed in a future version.
As you suggest, you could dispense with JavaScriptComponentState completely and rely on server->client RPC for updates. Keep dirty flags in your server-side component, or compare the old state and the new state by any mechanism you want.
To coalesce the changes and send only one RPC call for each change, you could override beforeClientResponse(boolean initial) in your server-side component. This is called just before sending a response to the client and is your chance to add a set of RPC calls to update the client-side component.
Alternatively, you could override encodeState, where you have free rein to send exactly whatever JSON you like to the client. You could choose to add a list of changes to the base JSON object returned by super.encodeState. Your JavaScript connector could then interpret it as appropriate in its onStateChange method.
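On the connector side, interpreting such a change list could look roughly like this; the changes array and its { path, value } entries are an assumed format produced by the encodeState override, not something Vaadin defines:

window.com_example_MyComponent = function () {
  var self = this;

  // Hypothetical helper: patch only the affected part of the DOM
  function applyChange(path, value) {
    var el = self.getElement().querySelector('[data-path="' + path + '"]');
    if (el) {
      el.textContent = value;
    }
  }

  this.onStateChange = function () {
    var state = self.getState();
    (state.changes || []).forEach(function (change) {
      applyChange(change.path, change.value);
    });
  };
};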
Edited to add: calling getState() in your server-side component will mark the connector as dirty. If you want to get state without marking it as dirty then use getState(false) instead.
Following our discussion about this, I've created a drop-in replacement for AbstractJavaScriptComponent that transmits state deltas and includes some extra enhancements. It's in the very early stages but should be useful.
https://github.com/emuanalytics/vaadin-enhancedjavascript
The solution is deceptively simple: basically re-enabling state difference calculation by bypassing this line of code in com.vaadin.server.LegacyCommunicationManager.java:
boolean supportsDiffState = !JavaScriptConnectorState.class
.isAssignableFrom(stateType);
The implementation of the solution is complicated by the fact that the Vaadin classes are not easily extended so I've had to copy and re-implement 6 classes.
Vaadin's shared state works exactly like you want out of the box: when a component is added to the DOM for the first time, the whole shared state is transferred from server to client so that the component can be displayed. After that, only changes are transferred. For example, if one changes the caption of a visible component by calling component.setCaption("new caption"), Vaadin only transfers that new caption text to the client and "merges" it into the client-side shared state instance of the component.

How to integrate Redux with very large data-sets and IndexedDB

I have an app that uses a sync API to get its data, and it needs to store all of the data locally.
The data set itself is very large, and I am reluctant to store it in memory, since it can contain thousands of records. Since I don't think the actual data structure is relevant, let's assume I am building an email client that needs to be accessible offline, and that I want my storage mechanism to be IndexedDB (which is async).
I know that a simple solution would be to not have the data structure as part of my state object and only populate the state with the required data (e.g. store the email content in the state when an EMAIL_OPEN action is triggered). This is quite simple, especially with redux-thunk.
However, this would mean compromising on 2 things:
The user data is no longer part of the "application state", although in truth it is. The sync behavior is complex, and removing it from the app state machine hurts the elegance of the redux concepts (the way I understand them).
I really like the redux architecture and would like all of my logic to go through it, not just the view state.
Are there any best practices on how to use redux with not-in-memory state properties? The thing I find hardest to wrap my head around is that redux relies on synchronous APIs, so I cannot replace my state object with an async state object (unless I remove redux completely and replace it with my own async implementation and connector).
I couldn't find an answer using Google, but if there are already good resources on the subject I would love to be pointed to them as well.
UPDATE:
The question was answered, but I wanted to give a better explanation of how I implemented it, in case someone runs into the same problem:
The main idea is to maintain change lists for both client and server using plain redux reducers, and to use a connector that listens to these change lists to update IDB and to update the server with client changes (see the sketch below):
When the client makes changes, use reducers to update the client change list.
When the server sends updates, use reducers to update the server change list.
A connector listens to the store and, on state change, updates IDB. It also maintains an internal list of items that were modified.
When updating the server, use the list of modified items to pull the delta from IDB and send it to the server.
When accessing the data, use normal actions to pull from IDB (e.g. using redux-thunk).
The only caveat with this approach is that, since the real state is stored in IDB, we do lose some of the value of having one state object (and it is also harder to rewind/fast-forward the state).
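A minimal sketch of the connector part, using the idb-keyval library for the IndexedDB side; the clientChangeList state shape, the key prefix and the ./store module are assumptions for illustration:

// connector.js - listens to the store and mirrors client changes into IndexedDB
import { set } from "idb-keyval";
import store from "./store";  // hypothetical module exporting the redux store

let previousChanges = [];

store.subscribe(() => {
  // Assumed state shape: state.clientChangeList is an array of { id, record }
  const changes = store.getState().clientChangeList;
  if (changes === previousChanges) return;  // same reference, nothing new to persist
  previousChanges = changes;

  changes.forEach((change) => {
    // Write-through: IndexedDB holds the full record bodies
    set("email:" + change.id, change.record);
  });
});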
I think your first hunch is correct. If(!) you can't store everything in the store, you have to store less in the store. But I believe I can make that solution sound much better:
IndexedDB just becomes another endpoint, much like any server API you consume. When you fetch data from the server, you forward it to IndexedDB, from where your store is then populated. The store gets just what it needs and caches it as long as it doesn't get too big or stale.
It's really not different than, say, Facebook consuming their API. There's never all the data for a user in the store. References are implemented with IDs and these are loaded when required.
You can keep all your logic in redux. Just create actions as usual for user actions and data changes, get the data you need and process it. The interface is still completely defined by the user data because you always have the information in the store that is needed to GET TO the rest of it when needed. It's just somewhat condensed, i.e. you only save the total number of messages or the IDs of a mailbox until the user navigates to it.
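As a rough sketch of that idea with redux-thunk, an action creator can read from IndexedDB on demand and put only the opened email into the store; the idb-keyval key scheme and the action type names are assumptions:

import { get } from "idb-keyval";

// Thunk: load one email body from IndexedDB into the store only when it is opened
export function openEmail(emailId) {
  return (dispatch) => {
    dispatch({ type: "EMAIL_OPEN_REQUESTED", emailId });
    return get("email:" + emailId).then((email) => {
      dispatch({ type: "EMAIL_OPEN", email });
    });
  };
}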

Strategies for data coupling when working with node.js + react server side rendering + flux + mongodb

I'm trying to find a way to lower the possibility of mistakes when working with three tiers of information. Let me try to explain.
I'm building a web app with:
Node.js
mongodb
react (with server side rendering)
flux (alt.js)
browserify
The data flows can be one of these two:
User asks for a page -> data helper gets the proper data from the db -> it is passed to the alt.js bootstrap to fill all the stores -> react is asked to build the app (renderToString) and the components render the view -> the result is returned to the client
User updates something -> a flux action is sent (calling the server with ajax) -> data helper prepares the data to be saved in the db -> the result is saved and returned to the client -> the store updates its state -> the react component updates its view
There are three places that need to know the data structure:
The data helper on the server, which extracts the proper data from the data structure and sends it to mongodb, or gets the data from the db and builds the data structure
The flux store, which updates its state after a user action
The component, which renders the view from the state
This means that if I want to change the data structure (even just to rename one of the properties), I will have to change it in three places, which is risky and prone to mistakes.
Is there a way to achieve data coupling in JS?
I have been looking into this somewhat, but for a client-side application only. We considered going with an immutable data structure solution, of which there are several.
In the end we instead went with a message bus solution based on PubSubJS, to broadcast changed state to all parts of the app. We coupled this with a helper function responsible for updating the state of the data structure, so that all updates are controlled by that function.
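A minimal sketch of that pattern; the topic name, state shape and PubSubJS usage here are illustrative rather than taken from the original code:

import PubSub from "pubsub-js";

const appState = { emails: [] };  // the single shared data structure (invented shape)

// The only function allowed to modify the data structure
function updateState(mutator) {
  mutator(appState);
  PubSub.publish("state.changed", appState);  // broadcast to every interested part of the app
}

// Any part of the app can react to changes without touching the structure directly
PubSub.subscribe("state.changed", (topic, state) => {
  console.log("emails now:", state.emails.length);
});

updateState((state) => {
  state.emails.push({ id: 1, subject: "hello" });
});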
I think the feature you want is a compile-time check of the data structure, which is a compiler feature, and JavaScript is not a compiled language. So my suggestion is to change languages. I've worked with TypeScript for a long time and it works well for me. It is a compiled language, and JavaScript is what it compiles to. I think it can fit your needs once you have defined your interfaces.

dojo.store.Observable, JSON REST and queryEngine

Does anybody know how to use the JsonRest store in dojo with an Observable wrapper, like the one in dojo.store.Observable?
What do I need, server side, to implement the store and make it work as an Observable one? What about the client side?
The documentation (http://dojotoolkit.org/reference-guide/1.7/dojo/store/Observable.html) says:
If you are using a server side store like the JsonRest store, you will need to provide a queryEngine in order for the update objects to be properly included or excluded from queries. If a queryEngine is not available, observe listener will be called with an undefined index.
But, I have no idea what they mean. I have never created a store myself, and am not 100% familiar with queryEngine (to be honest, I find it a little confusing). Why is queryEngine needed? What does the doc mean by "undefined index"? And how do you write a queryEngine for a JsonRest store? Shouldn't I use some kind of web socket for an observable REST store, since other users might change the data as well?
Confused!
I realize this question is a bit old, but here's some info for future reference. Since this is a multi-part question, I'll break it down into separate pieces:
1) Server-side Implementation of JsonRest
There's a pretty decent write up on implementing the server side of the JsonRest store. It shows exactly what headers JsonRest will generate and what content will be included in the requests. It helps form a mental model of how the JsonRest API is converted into HTTP.
2) Query Engine
Earlier in the same page, how query() works client side is explained. Basically, the query() function needs to be able to receive an object literal (ex: {title:'Learning Dojo',categoryid:5}) and return the objects in the store that match those conditions. "In the store" meaning already loaded into memory on the client, not on the server.
Depending on what you're trying to do, there's probably no need to write your own queryEngine anyway -- just use the built-in SimpleQueryEngine if you're building your own custom store. The engine just needs to be handed an object literal and it adds the whole dojo query() api for you.
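For example, something along these lines should give you an observable JsonRest store with a client-side query engine; the target URL and the observe callback body are illustrative:

require([
  "dojo/store/JsonRest",
  "dojo/store/Observable",
  "dojo/store/util/SimpleQueryEngine"
], function (JsonRest, Observable, SimpleQueryEngine) {
  var store = new Observable(new JsonRest({
    target: "/api/books/",          // illustrative REST endpoint
    queryEngine: SimpleQueryEngine  // lets Observable decide where updated objects fall in query results
  }));

  var results = store.query({ categoryid: 5 });
  results.observe(function (object, removedFrom, insertedInto) {
    // removedFrom / insertedInto are indexes into the result set (-1 means not present)
    console.log("result set changed", object, removedFrom, insertedInto);
  }, true);  // true: also be notified of in-place updates to objects in the result set
});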
3) Observables
My understanding is that the Observables monitor client side changes in the collection of objects (ex: adding or removing a result) or even within a specific object (ex: post 5 has changed title). It does NOT monitor changes that happen server-side. It simply provides a mechanism to notify other aspects of the client-side app that data changed so that all aspects of the page stay synchronized.
There's a whole write up on using Observables under the headings 'Collection Data Binding' and 'Object Data Binding: dojo/Stateful'.
4) Concurrency
There are two things you'd want to do in order to keep your client-side data synchronized with the server-side data: a) polling for changes from other users on the server, and b) using transactions to send data to the server.
a) To poll for changes to the data, you'd want to have your object store track the active query in a variable. Then, use setTimeout() or setInterval() to run the query in the background again every so often. Make sure that widgets or other aspects of your application use Observables to monitor changes in the query result set(s) they depend on. That way, changes on the server by other users would automatically be reflected throughout your application.
b) Use transactions to combine actions that must be combined. Then, make sure the server sends back HTTP 200 status codes (meaning "it worked!"). If a transaction returns an HTTP status in the 400s, then it didn't work for some reason, and you need to requery the data because something changed on the backend. For example, the record you want to update was deleted, so you can't update it. There's a write up on transactions as well under the heading 'Transactional'.
