What is the simplest way to call $httpBackend manually?
Yes, I am aware that you are supposed to use $http instead, but this is a special use case: I am augmenting $exceptionHandler and want to send a log message back to the server. I can't use $http because it triggers an $apply, which could retrigger an exception and cause an infinite loop that locks up the browser.
Most examples (and my own code) have used jQuery to issue the Ajax call that logs the error. I'm trying to solve it without using jQuery.
Warning: this is deliberately an undocumented API, and I'm certain the Angular team reserves the right to change its signature.
That being said, here is the minimal backend call I've been able to create:
$httpBackend('POST', '/some/url',            // method and URL
    JSON.stringify(buildLogInfo()),          // request body
    function(status, resp, headerString) {   // response callback
        console.log('manual backend call', status, resp, headerString);
    },
    {"Content-Type": "application/json"}     // request headers
);
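For context, here is a rough sketch (not part of the original answer) of how that call could sit inside a $exceptionHandler decorator, which is the use case described above. The module name 'app' is illustrative and buildLogInfo() is the same helper used in the snippet.

angular.module('app').config(function($provide) {
    $provide.decorator('$exceptionHandler', function($delegate, $httpBackend) {
        return function(exception, cause) {
            $delegate(exception, cause); // keep Angular's default logging
            // Log to the server through $httpBackend instead of $http, so no
            // $apply is triggered and no exception loop can start.
            $httpBackend('POST', '/some/url',
                JSON.stringify(buildLogInfo()),
                angular.noop,
                {"Content-Type": "application/json"});
        };
    });
});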
Related
I have a small webapp in Node/Express that renders the initial HTML server side with react-dom. The page is then populated client side with a $.ajax call to the API inside componentDidMount. The HTML loads immediately, but there's no useful content until React starts up and that GET completes.
This is wasteful. It would be better to hit the API while rendering the initial HTML, but I don't know a clean way to implement this. It seems I could get what I want by declaring a global $ in Node with a stubbed get method, but that feels dirty.
How do I implement $.ajax when rendering a React component server side?
The code is public on Github. Here's a component with $.get and here's my API.
componentDidMount doesn't run on the server; it only runs on the client after the first render, so the AJAX request will never happen during server rendering. You should do it in a static method instead (there are other ways to do it).
It would be better to choose superagent or axios, which can make AJAX requests on both the client and the server.
You then have to put the result of the AJAX request into a global variable and use it as the initial state.
It also helps to follow some existing repos, like this one:
See https://github.com/erikras/react-redux-universal-hot-example
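As a rough illustration of that approach (not taken from the linked repo), the server could fetch the data with axios before rendering and expose it on a global for the client; the route, API URL, component, and __INITIAL_STATE__ name below are all illustrative:

const express = require('express');
const axios = require('axios');
const React = require('react');
const ReactDOMServer = require('react-dom/server');

const app = express();

// Stand-in for the component that currently does $.get in componentDidMount;
// here it simply receives the data as props.
const ItemList = (props) =>
    React.createElement('ul', null,
        props.items.map((item, i) => React.createElement('li', {key: i}, item.name)));

app.get('/', (req, res, next) => {
    // Fetch the data up front instead of waiting for componentDidMount on the client.
    axios.get('http://localhost:3000/api/items')
        .then(response => {
            const initialState = {items: response.data};
            const html = ReactDOMServer.renderToString(React.createElement(ItemList, initialState));
            // Expose the same data globally so the client can pick it up as initial state.
            res.send('<div id="root">' + html + '</div>' +
                     '<script>window.__INITIAL_STATE__ = ' + JSON.stringify(initialState) + '</script>');
        })
        .catch(next);
});

app.listen(3000);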
Here's how I solved this.
I moved my AJAX out of componentDidMount so that it is called while rendering the initial HTML on the server.
I declared my own global $ in Node with a get method that calls the router directly. This is what it looks like:
// Minimal stub of jQuery's $.get for the server: instead of going over the
// network, it builds fake req/res objects and calls the Express router directly.
global.$ = {
  get: (url, cb) => {
    const req = {url: url};
    const res = {
      // when the route handler calls res.send, forward the data to the callback
      send: data => cb(data),
      status: () => {
        // support res.status(...).send(...) as well
        return {send: data => cb(data)};
      }
    };
    return api_router(req, res); // api_router is the app's Express API router
  }
};
Some caveats
If this feels like a questionable hack to you, that's ok. It feels like a questionable hack to me too. I'm still open to suggestions.
#stamina-loop's suggestion of replacing jQuery's AJAX with a module that works on both the server and the client is a good one that would solve this problem, and for most people I would recommend that approach. I chose not to because it seemed wasteful to go over the network just to call a route handler that is adjacent in code. It could be made less wasteful with a fancy nginx config that redirects outbound API calls back to the same box without a real round trip; I'm still thinking about that.
I've since learned that using jQuery alongside React is likely to cause problems. I'll be replacing it with something else down the road.
For most use cases it will still make sense to keep the AJAX in componentDidMount and to load the initial HTML without it; that way time-to-first-byte stays as low as possible. The kinds of things loaded from RESTful APIs are usually not needed for SEO, and users are used to waiting a few extra milliseconds for them (Facebook does it, so can you).
I am building a web app with several JS libraries (AngularJS, OpenLayers, ...) and need a way to intercept all AJAX responses so that, when the logged-in user's session has expired (the response comes back with a 401 Unauthorized status), I can redirect the user to the login page.
I know AngularJS offers interceptors to manage such scenarios, but I wasn't able to find a way to achieve the same injection for OpenLayers requests, so I opted for a vanilla JS approach.
Here I found this piece of code...
(function(open) {
    XMLHttpRequest.prototype.open = function(method, url, async, user, pass) {
        this.addEventListener("readystatechange", function() {
            console.log(this.readyState); // this one I changed
        }, false);
        open.call(this, method, url, async, user, pass);
    };
})(XMLHttpRequest.prototype.open);
...which I adapted, and it looks like it behaves as expected (I've only tested it in the latest Google Chrome).
Since it modifies the prototype of XMLHttpRequest, I'm wondering how dangerous this could turn out to be, or whether it could cause serious performance issues. And would there be any valid alternative?
Update: how to intercept requests before they get sent
The previous trick works OK. But what if, in the same scenario, you want to inject some headers before the request gets sent? Do the following:
(function(send) {
    XMLHttpRequest.prototype.send = function(data) {
        // in this case I'm injecting an access token (e.g. accessToken) in the request headers before it gets sent
        if (accessToken) this.setRequestHeader('x-access-token', accessToken);
        send.call(this, data);
    };
})(XMLHttpRequest.prototype.send);
This type of function hooking is perfectly safe and is done regularly on other methods for other reasons.
And the performance impact is really just one extra function call per .open(), plus whatever code you execute yourself, which is probably immaterial when a networking call is involved.
In IE, this won't catch code that uses the ActiveXObject method of doing Ajax. Well-written code looks for the XMLHttpRequest object first and uses it if available (it has been there since IE 7), but there could be code that uses the ActiveXObject method whenever it's present, which remained true through much later versions of IE.
In modern browsers there are other ways to issue Ajax calls, such as the fetch() interface, so if you want to hook all Ajax calls you have to hook more than just XMLHttpRequest.
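For example, a minimal sketch of wrapping fetch() in the same spirit (the 401 check and /login redirect mirror the question's use case and are illustrative):

(function(originalFetch) {
    window.fetch = function() {
        return originalFetch.apply(this, arguments).then(function(response) {
            // session expired: send the user to the login page
            if (response.status === 401) {
                window.location.href = '/login';
            }
            return response;
        });
    };
})(window.fetch);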
This won't catch XMLHttpRequests for some versions of IE (9 and below). Depending upon the library, they may look for IE's proprietary ActiveX control first.
And, of course, all bets are off if you are using a non-strict DOCTYPE under IE, but I'm sure you knew that.
Reference: CanIuse
As kindly pointed out by Firefox AMO Editor Rob W,
The following code changes the behavior of XMLHttpRequest. By default,
if the third ("async") parameter is not specified, it defaults to
true. When it is specified and undefined, it is equivalent to "false",
which turns the request into a synchronous HTTP request. This causes the
UI to block while the request is being processed, and some features of
the XMLHttpRequest API are disabled too.
...
To fix this, replace open.call(....) with open.apply(this, arguments);
And here is a reference link:
https://xhr.spec.whatwg.org/#the-open()-method
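Putting that advice together with the original snippet, a hedged sketch of the corrected hook could look like this (the 401 check and /login URL are illustrative):

(function(open) {
    XMLHttpRequest.prototype.open = function() {
        this.addEventListener("readystatechange", function() {
            // session expired: the server answered 401 Unauthorized
            if (this.readyState === 4 && this.status === 401) {
                window.location.href = '/login';
            }
        }, false);
        // apply() forwards exactly the arguments the caller passed,
        // so the default async=true behaviour is preserved
        open.apply(this, arguments);
    };
})(XMLHttpRequest.prototype.open);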
Try this
let oldXHROpen = window.XMLHttpRequest.prototype.open;
window.XMLHttpRequest.prototype.open = function(method, url, async, user, password) {
    console.log({method});
    // Show loader
    this.addEventListener('load', function() {
        console.log('load: ' + this.responseText);
        // Hide loader
    });
    return oldXHROpen.apply(this, arguments);
};
I have a simple angular app that has two views that are loaded using ngRoute. I need to do some clean up on the server when the user navigates between views and when the user leaves the page (refreshes window, closes tab, or closes browser).
My first stop was here: Showing alert in angularjs when user leaves a page. It solved the first case where the user navigates between views. I've handled the clean up like this:
$scope.$on('$locationChangeStart', function (event) {
    var answer = confirm("Are you sure you want to leave this page?");
    if (answer) {
        // api is a service that I wrote. It uses angular $http service to
        // handle communications and works in all other cases.
        api.unlock($scope.id);
    } else {
        event.preventDefault();
    }
});
However, I haven't been able to handle the case where the user leaves the page. Following the above answer and this Google Groups post: https://groups.google.com/forum/#!topic/angular/-PfujIEdeCY I tried this:
window.onbeforeunload = function (event) {
api.unlock($scope.id); //I don't necessarily need a confirmation dialogue, although that would be helpful.
};
But it did not work. I then read here (How to execute ajax function onbeforeunload?) that the request needs to be synchronous, and here (How to $http Synchronous call with AngularJS) that Angular does not support synchronous calls (although that answer might be outdated).
I've also tried calling $http directly (without the service) but that did not work either. At this point I'm stuck. Any suggestions / leads would be really appreciated.
Thanks in advance!
The problem is that Angular always makes ASYNCHRONOUS calls with the $http service (and you can't change that behavior). Since the page exits before the $http service has a chance to emit the request, you are not getting the desired result.
If you want a workaround without using any third-party libraries, try the following code:
var onBeforeUnload = function () {
    var data = angular.toJson($scope.id...); // make sure your $scope is legal here...
    var xmlhttp = new XMLHttpRequest();
    xmlhttp.open("POST", "apiURL/unlock", false); // the false makes the call synchronous
    xmlhttp.setRequestHeader("Content-type", "application/json");
    xmlhttp.setRequestHeader("Authorization", token);
    xmlhttp.send(data);
};
It will block the UI until the request completes, so try to make the server fast or the user will notice.
The solution is to call $rootScope.$digest() after firing your $http calls in your onbeforeunload callback. The reason is that Angular's $http methods wrap the config in an immediately resolved promise, which means the AJAX request doesn't actually get fired until the next tick. Other browsers appear to allow one more tick after the onbeforeunload tick, but not IE11 (the only version of IE in which I tested). Forcing the digest cycle avoids the need to wait for the next tick.
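As a rough sketch of that idea, inside a controller where $http, $rootScope, and $scope are injected (the /api/unlock endpoint is illustrative):

window.onbeforeunload = function () {
    $http.post('/api/unlock', { id: $scope.id });
    // $http only queues the request behind an already-resolved promise; force a
    // digest so the XHR is actually dispatched before the page unloads.
    $rootScope.$digest();
};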
Looking for some help with a best practice.
I have a module which I am setting a few custom headers. No big deal here:
$httpProvider.defaults.headers.common['token'] = function() {
return token;
};
token is a value that I must $http.get() on the page load.
My initial thought was to put this in my controller, but after thinking about it, it made more sense to do it in the module configuration on page load, where I am setting my custom headers:
var app = angular.module('app',['ngRoute', 'ngResource'],function($httpProvider) {
// Custom headers
});
My question is two part:
Is this the best way to do this?
If it is, how do I make a $http.get() request inside of the module config?
app.config, as you might have noticed, won't let you use services like $http (or any service you define yourself): it runs before they are instantiated. Try putting the call in your app.run instead; it runs after config and has no restriction against using services.
Whether it's the right approach is harder to answer, as it depends on the exact use case. Since $http calls are asynchronous, you cannot just call your backend when the app starts and be sure the token exists in your controllers or services; the HTTP call might not have returned yet. This can be a problem if you expect to use the token right away.
A better option, again depending on the use case, might be to use a resolve function on any route that needs the token. A route will hold off on loading its controller and template until its resolve function has finished, so with this method you can be 100% sure the token exists once the controller runs.
This video has a good intro to resolves.
The two can also be combined: run the HTTP call in your app.run, and then use a resolve function to make sure the token exists before the controller loads.
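A hedged sketch of that combination (the /api/token endpoint, route, template, and controller names are illustrative):

var app = angular.module('app', ['ngRoute']);

app.run(function($http, $rootScope) {
    // Kick off the request as soon as the injector is ready and keep the promise around.
    $rootScope.tokenPromise = $http.get('/api/token').then(function(response) {
        $rootScope.token = response.data.token;
        return response.data.token;
    });
});

app.config(function($routeProvider) {
    $routeProvider.when('/', {
        templateUrl: 'home.html',
        controller: 'HomeCtrl',
        resolve: {
            // The controller is not instantiated until this promise resolves,
            // so the token is guaranteed to be available.
            token: function($rootScope) {
                return $rootScope.tokenPromise;
            }
        }
    });
});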
I am writing a Backbone application which should interface to a REST API.
My problem arises when a user deletes a model that has already been deleted by someone else. In my opinion, the backend should just return success (200), as the model is deleted anyway. But the people developing the server side have a different opinion, hence what I get is a 404. For comparison, when the request actually fails - hence the model is still alive - the response code is 400, or possibly 401 for authorization issues.
Since I get an error, I do not actually remove the model. What I am trying to do is modify this behaviour: if I get a 404 error while deleting a model, it should be treated as success. But I am not really sure what the most convenient way to handle this is.
Ideally I would like to avoid putting this logic inside model.destroy, since that would lead to repetition. I could put it inside a destroy method of a superclass, but models override this method anyway, each with its own logic, so it gets messy. I would prefer that at that point model.destroy receive a success, without knowing that the actual response was a 404.
On the other hand, I am not sure how to put this logic inside Backbone.sync, short of rewriting the whole function.
What is the most transparent way to transform all 404 responses to DELETE requests into success?
It's a hack, but should do the trick:
model.destroy({
    error: function(model, resp, options) {
        // the server says 404, but the model is gone either way: fake a success
        if (resp.status == 404) {
            resp.status = 200;
            options.success(model, resp);
        }
    }
});
Btw, as of Backbone 0.9, destroy() and create() are optimistic: the model is removed from (or added to) its collection before the server responds, unless you pass {wait: true}.
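If you'd rather keep that logic out of every destroy call, the more transparent route the question hints at is wrapping Backbone.sync itself. A hedged outline (callback signatures differ slightly across Backbone versions, so treat it as a sketch):

var originalSync = Backbone.sync;
Backbone.sync = function(method, model, options) {
    if (method === 'delete') {
        var originalError = options.error;
        options.error = function(xhr) {
            if (xhr && xhr.status === 404) {
                // The model is already gone on the server, so report success upstream.
                if (options.success) options.success({});
            } else if (originalError) {
                originalError.apply(this, arguments);
            }
        };
    }
    return originalSync.apply(this, arguments);
};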