Node/React: How to handle jQuery AJAX when rendering on server? - javascript

I have a small webapp in Node/Express that renders initial HTML server side with react-dom. The page is then populated client side with a $.ajax call to the API inside componentDidMount. The HTML loads immediately, but there's no useful content until React starts and completes that GET.
This is wasteful. It would be better to hit the API while rendering the initial HTML, but I don't know a clean way to implement this. It seems like I could get what I want by declaring a global $ in Node with a stubbed get method, but that feels dirty.
How do I implement $.ajax when rendering a React component server side?
The code is public on GitHub. Here's a component with $.get and here's my API.

componentDidMount doesn't run on the server; it runs only client side after the first render, so the AJAX request will never happen on the server. You should do it in a static method instead (there are other ways of doing it).
It would be better to choose superagent or axios, which can make AJAX requests on both the client and the server.
You then have to put the result of the AJAX request into a global variable as the initial state.
It's better if you follow some example repos, like this one:
See https://github.com/erikras/react-redux-universal-hot-example
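A minimal sketch of that static-method approach, assuming axios; the DataList component, fetchData name, and /api/data route are placeholders, not taken from the question's repo:
import React from 'react';
import axios from 'axios';

class DataList extends React.Component {
  // The server calls this before rendering and passes the result in as a prop;
  // the client can reuse the same method in componentDidMount to refresh.
  static fetchData() {
    return axios.get('http://localhost:3000/api/data').then(res => res.data);
  }

  render() {
    return <ul>{this.props.items.map(item => <li key={item.id}>{item.name}</li>)}</ul>;
  }
}

// Server side, inside the Express route:
// DataList.fetchData().then(items => {
//   const html = ReactDOMServer.renderToString(<DataList items={items} />);
//   // ...send html, embedding items as the initial state for the client
// });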

Here's how I solved this.
I moved my AJAX call out of componentDidMount so that it is called while rendering the initial HTML on the server.
I declared my own global $ in Node with a get method that calls the router directly. This is what it looks like:
// Stub just enough of jQuery's API that the component's $.get works during
// server rendering: the fake req/res pair hands the route handler's response
// straight to the callback instead of going over the network.
global.$ = {
  get: (url, cb) => {
    const req = {url: url};
    const res = {
      send: data => cb(data),
      // status() ignores the code but still forwards the body to the callback
      status: () => {
        return {send: data => cb(data)};
      }
    };
    return api_router(req, res);
  }
};
Some caveats
If this feels like a questionable hack to you, that's ok. It feels like a questionable hack to me too. I'm still open to suggestions.
#stamina-loop's suggestion of replacing jQuery's AJAX with a module that works on both the server and the client is a good one that would solve this problem. For most people I would recommend that approach. I chose not to because it seemed wasteful to go over the network just to call a route handler that is adjacent in code. It could be made less wasteful with a fancy nginx config that redirects outbound API calls back to the same box without making a round trip. I'm still thinking about that.
I've since learned that using jQuery alongside React is likely to cause problems. I'll be replacing it with something else down the road.
For most use cases it will still make sense to keep the AJAX in componentDidMount and to load the initial HTML without it. That way time-to-first-byte is as low as possible. The kinds of things loaded from RESTful APIs are usually not needed for SEO, and users are used to waiting a few extra milliseconds for them (Facebook does it, so can you).

Related

How do I get around Angular undoing my monkey patching with its own monkey patching?

I'd like to intercept XHR requests for the Google Maps API so I can run them through my own proxy server as a way of keeping my API key private.
Angular has its own HttpInterceptors, but they'll only intercept XHR requests which are made using Angular's HttpClient, not any requests made outside of the Angular framework by the Maps API. I'd think that monkey patching XMLHttpRequest.open() would be the best way to get at the requests going to the Maps API, which I've done like this:
var oldXHROpen = XMLHttpRequest.prototype.open;
XMLHttpRequest.prototype.open = function(method, url, async, username, password) {
  console.log(url);
  return oldXHROpen.apply(this, arguments);
};
The above code is placed in a <script> in the <head> section of my index.html, so it's definitely executed before any Angular code is executed.
The patch works... for a short time. I see URLs for a few assets loaded by my code logged out, but then this message appears:
Angular is running in the development mode. Call enableProdMode() to enable the production mode.
After that just one more URL gets logged, and that's the last intercept I make. XHR requests continue to be processed, but my patch never sees them happen.
I'm sure this has something to do with zone.js, but I still don't understand how it could happen. Since I redefine XMLHttpRequest.prototype.open before Angular or zone.js even get a chance to see the original open() function, which is tucked away in the oldXHROpen variable, how does a direct connection to the native open() ever manage to happen again, bypassing my patch?
It appears I was laboring under the false premise that the client side of the Google Maps API would have to make XHR requests to get its work done.
It doesn't. It's all done by loading images, CSS, and fonts, with a little JSONP thrown in.
My monkey patching actually does work, and doesn't get undone by Angular.

Request Caching with Circuit-Breaker (Opossum) in Node.js

Based on the Netflix Hystrix circuit-breaker design pattern, I was trying to do the following:
const circuitBreaker = require('opossum');
const request = require('request-promise');

const circuit = circuitBreaker(request.get);
circuit.fallback(() => Promise.resolve({result: []}));
I have 3 Node.js services deployed. They use a circuit breaker (opossum) to make REST calls between them. I have a fallback method which handles the scenario where a service goes down. I was wondering if something like request caching can be used alongside the circuit breaker to return a cached response whenever the fallback is invoked. If yes, how can I achieve this?
P.S.: request is my client for making REST calls.
As far as I know, opossum does not provide an out-of-the-box solution for this. You have to implement some mechanism yourself to cache the latest successful call. In my view, the best way to do it is probably to use a distributed cache like Redis and cache the latest successful response, but make sure the entry is temporary (give it a TTL): you don't want to return old, deprecated data.
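A minimal sketch of that idea, keeping the question's factory-style opossum usage; a plain Map stands in for Redis, and the TTL value and service URL are made up:
const circuitBreaker = require('opossum');
const request = require('request-promise');

const cache = new Map();      // url -> {body, storedAt}; swap in Redis for real use
const MAX_AGE_MS = 60 * 1000; // treat entries older than a minute as stale

// Wrap the call so every successful response is cached by URL.
async function getAndCache(url) {
  const body = await request.get(url);
  cache.set(url, {body: body, storedAt: Date.now()});
  return body;
}

const circuit = circuitBreaker(getAndCache);

// The fallback receives the original fire() arguments, so it can look up the
// cached response for the same URL before giving up.
circuit.fallback(url => {
  const entry = cache.get(url);
  if (entry && Date.now() - entry.storedAt < MAX_AGE_MS) {
    return entry.body;
  }
  return {result: []}; // last resort, as in the question
});

circuit.fire('http://service-b/api/items').then(console.log);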
I'm not sure if this can help but you can try:
circuitBreaker(request.get, { cache: true });
You can see a bit more detail in this test file

Next.js: fetching data in getInitialProps(): server-side vs client-side

I'm using Next.js, and I have a custom server using Express. I have a page that requires some data from the database.
getInitialProps(), when running on the server, could just grab the data from the database and return it, without any problems.
However, getInitialProps() can also run on the client side (when the user initially requests a different page, then navigates to this one). In that case, since I'm on the client side, I obviously can't just fetch the data from the database - I have to use AJAX to talk to the server and ask it to retrieve it for me.
Of course, this also means that I have to define a new Express route on the server to handle this request, and it will contain exactly the same code as the server-side part of getInitialProps(), which is very undesirable.
What's the best way to handle this?
getInitialProps() always receives the request and response as parameters which are only set on the server:
static async getInitialProps({req}){
  if(req){
    // called on server
  } else {
    // called on client
  }
}
https://github.com/zeit/next.js#fetching-data-and-component-lifecycle
Since no good solution seemed to have existed, I have created and published a library to provide a simple and elegant solution to this problem: next-express.
In your getInitialProps you should be making an HTTP request to a new Express route that holds your logic for fetching from the database. That logic should never live in the UI layer.
This route is then called regardless of whether you are on the client or on the server - you don't need to do any code branching.
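A minimal sketch of that approach, assuming an isomorphic fetch (e.g. isomorphic-unfetch) and a hypothetical /api/page-data Express route that wraps the database query:
import fetch from 'isomorphic-unfetch';

Page.getInitialProps = async ({ req }) => {
  // On the server we need an absolute URL; in the browser a relative one works.
  const baseUrl = req ? `${req.protocol}://${req.get('host')}` : '';
  const res = await fetch(`${baseUrl}/api/page-data`);
  const data = await res.json();
  return { data };
};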
Make an API distinct from your Next.js app. Think of the Next app as a frontend client that happens to render pages on the server.
With time, new solutions come around.
Next.js has introduced a new method, getServerSideProps, primarily for such use cases.
getServerSideProps runs only on the server and never in the browser.
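A minimal getServerSideProps sketch; loadFromDatabase is a placeholder for whatever query the server-side branch of getInitialProps used to run:
// Runs on every request, on the server only.
export async function getServerSideProps() {
  const data = await loadFromDatabase(); // hypothetical DB helper
  return { props: { data } };
}

function Page({ data }) {
  return <pre>{JSON.stringify(data, null, 2)}</pre>;
}

export default Page;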
For me, the quickest way is to get the data from __NEXT_DATA__:
MyApp.getInitialProps = async (): Promise<AppCustomProps> => {
  const isInBrowser = typeof window !== 'undefined';
  if (isInBrowser) {
    const appCustomPropsString =
      document.getElementById('__NEXT_DATA__')?.innerHTML;
    if (!appCustomPropsString) {
      throw new Error(`__NEXT_DATA__ script was not found`);
    }
    const appCustomProps = JSON.parse(appCustomPropsString).props;
    return appCustomProps;
  }
  // server side, where I actually fetch the data from db/cms and return it
};

How to subscribe via DDP connections to other Meteor servers on the server side?

I'd like to synchronize Data between two Meteor apps. Therefore I have published a collection with the data in question on both apps (which obviously run the same Meteor version 0.8.1.2 with the exact same packages).
When I run
var testConnection = DDP.connect('http://10.0.10.20:3003/');
var newCollection = new Meteor.Collection('remoteData', testConnection);
testConnection.subscribe('remoteData');
console.log('Data list starts here:');
newCollection.find().forEach(function(data){console.log(data)});
on any client, I do get a list of all the data as expected. Server side there is nothing, so newCollection stays empty (I also know from debugging that the server does actually execute testConnection.subscribe('remoteData') and that the other server executes everything within its corresponding publish function, just like it does for clients).
I tried it this way because the poster here https://stackoverflow.com/a/18360441 mentioned that something like this works on both client and server. Looking in the docs for subscribe (http://docs.meteor.com/#meteor_subscribe), it says it only works on the client, which would explain why nothing happens on my server, but it would be a bit strange, as DDP.connect (http://docs.meteor.com/#ddp_connect) seems to be meant for both client and server and supports subscribe.
So do I miss something here? And what would be the best way to get a subscribe like functionality between two servers if subscribe really does not work in this scenario?
I know I can work with custom Meteor.methods but this seems a bit like a crutch compared to how nice it would work with subscribe, so I would be very interested in any better solution...
As user728291 pointed out, the problem was that the server in this case isn't waiting for this.ready() in the publish function on the other side, so when newCollection.find() is called on the server, newCollection is still empty (it will receive the data shortly after). It seems that on the client, newCollection.find() waits for this.ready() from the server's publish function (though I'm not at all sure about this; the reason it works on the client may be entirely different).
Anyhow, you are on the safe side when you always trigger find() in the callback of subscribe, which interprets any function argument as the onReady callback (http://docs.meteor.com/#meteor_subscribe).
So what guaranteed works on server and client is
var testConnection = DDP.connect('http://10.0.10.20:3003/');
var newCollection = new Meteor.Collection('remoteData', testConnection);
testConnection.subscribe('remoteData', function() {
  console.log('Data list starts here:');
  newCollection.find().forEach(function(data){console.log(data)});
});

How to avoid too many ajax calls and cache json data on the client side

I have a calendar application and it loads all of the event data using AJAX and JSON results. The issue is that I have different views, and right now I have to re-call the server when I change views.
Is there any recommendation for ways I can cache this data on the client side and check whether I have already loaded these events before firing off more AJAX calls?
What is the best practice for this ?
Like hvgotcodes said, an MVC framework would help; try backbone.js (http://documentcloud.github.com/backbone/), for instance.
Alternatively, you might want to consider using jStorage (http://www.jstorage.info/). Every time you need to make an AJAX call, check first whether the result is already in your storage object, and only run the AJAX call if it isn't. On the other end, whenever an AJAX call finishes, store the results in the storage object. Make sure you have some kind of index (a CalendarEvent id) to reference when looking it up in the data store. You might want to add some kind of "expire time" to the data in your storage, too: a timestamp set after the AJAX call, so you can re-request up front if the data is out of date.
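A minimal sketch of that check-cache-then-AJAX flow with jStorage; the key scheme and the /events endpoint are made up for illustration:
function loadEvent(eventId, cb) {
  var key = 'event-' + eventId;
  var cached = $.jStorage.get(key); // returns null when the key is missing
  if (cached !== null) {
    cb(cached);
    return;
  }
  $.getJSON('/events/' + eventId, function (data) {
    // Expire after 5 minutes so server-side edits eventually show up.
    $.jStorage.set(key, data, {TTL: 5 * 60 * 1000});
    cb(data);
  });
}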
It's called MVC.
You need to construct a data model for your application and write some sort of Record objects; then you can determine their status. So your application would have some sort of CalendarEvent model, and when you load data from the server, you would instantiate instances.
So when changing views, you would first check to see if you had the model object for that view, and if you did, you wouldn't need to load it from the server (unless you want to check for changes).
Your scheme doesn't need to be that complicated. If you load events by Id, you can do something like
window.App = {};
window.App.Models = {};
when you load a record you could put
window.App.Models[id] = InstanceOfYourRecord
and that way it's pretty fast to look for records. Or just use a framework (like SproutCore) that has a robust data layer.
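A minimal sketch of that lookup, with a hypothetical /events endpoint:
window.App = {};
window.App.Models = {};

function getEvent(id, cb) {
  if (window.App.Models[id]) { // already loaded for an earlier view
    cb(window.App.Models[id]);
    return;
  }
  $.getJSON('/events/' + id, function (data) {
    window.App.Models[id] = data; // cache the record for later views
    cb(data);
  });
}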
I had similar issues on a recent project.
Conceptually, I have the "real" data model (DM) kept on the server, persisted to a database.
To make life sane, the client keeps its own local data model. Outside of the client DM, all the client code thinks it's pulling results locally.
When reading data (GET) from the client DM it:
checks the cache for existing results
invokes appropriate AJAX queries when cached data is not available, then caches the results.
When changing data (POST) via the client DM it:
invalidates the cache as appropriate
invokes appropriate AJAX queries
emits custom jQuery event indicating client DM changed
Note that this client DM also:
centralizes AJAX error handling
tracks AJAX calls still in-flight. (Lets us warn users when leaving pages with unsaved changes).
allows a drop-in, dummy replacement for unit testing, where all the calls hit local data and are completely synchronous.
Implementation notes:
I coded this as a JavaScript class called DataModel. As the design becomes more complex, it makes sense to break the responsibilities down further into separate objects.
jQuery's custom events let you easily implement the observer pattern. Client components update themselves from the client DM whenever it indicates data has changed.
JSON in your remote API helps simplify the code. My client DM stores the JSON results directly in its cache.
The client DM function arguments include callbacks, so everything can naturally be passed along via AJAX when needed: function listAll( contactId, cb ) { ... }
My project only allowed single user logins. If outside parties can change the server datamodel, some sort of has-data-changed probe should be fired regularly to ensure the client cache is still valid.
For my app, multiple client components would request the same data when receiving a client DM changed event. This resulted in multiple AJAX calls with the same info. I fixed this problem with a getJsonOnce() helper, which manages a queue of client component call-backs awaiting the same result.
Example function in my implementation:
listAll:
function( contactId, cb ) {
  // pull from cache
  if ( contactId in this.notesCache ) {
    cb( this.notesCache[contactId] );
    return;
  }
  // init queue if needed
  this.listAllQueue[contactId] = this.listAllQueue[contactId] || [];
  // pull from server
  var self = this;
  dataModelHelpers.getJsonOnce(
    '/teafile/api/notes.php',
    {'req': 'listAll', 'contact': contactId},
    function(resp) { self.notesCache[contactId] = resp; },
    this.listAllQueue[contactId],
    cb
  );
}
The getJsonOnce() helper makes sure that if multiple client components request the exact same (uncached) data, that we only send out a single AJAX request and inform everyone once it comes in.
The notesCache is just a simple javascript object:
this.notesCache = {};
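getJsonOnce() itself isn't shown above; here is a minimal sketch of one possible shape, matching the listAll call:
var dataModelHelpers = {
  // Queue callbacks per pending request so the same uncached query only
  // triggers one AJAX call; everyone queued is informed when it returns.
  getJsonOnce: function (url, params, onStore, queue, cb) {
    queue.push(cb);
    if (queue.length > 1) {
      return; // a request for this data is already in flight
    }
    $.getJSON(url, params, function (resp) {
      onStore(resp); // let the caller cache the result first
      while (queue.length) {
        queue.shift()(resp); // flush everyone who asked while it was pending
      }
    });
  }
};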
