Here's my code.
$scope.init = function () {
    $scope.urlParam = $location.search();
    if ($scope.urlParam.token == undefined || $scope.urlParam.id == undefined) {
        $scope.is_start = false;
        alert("您没有权限投票"); // "You are not authorized to vote"
    } else {
        $http.get('/vote/validate_querystring?token=' + $scope.urlParam.token + "&id=" + $scope.urlParam.id)
            .success(function (response) {
                if (response.status == 3) {
                    $scope.is_start = false;
                    alert("您没有权限投票"); // "You are not authorized to vote"
                } else if (response.status == 4) {
                    $scope.is_start = false;
                    alert("您已完成投票"); // "You have already voted"
                } else {
                    $http.get('/vote/r_vote_setting')
                        .success(function (response) {
                            if (response.status == 1) {
                                $scope.is_start = false;
                                alert("投票尚未开始"); // "Voting has not started yet"
                            } else {
                                $scope.is_start = true;
                                $scope.voteData = response.data;
                            }
                        });
                }
            });
    }
};
I put this function in ng-init so it is invoked every time the page is loaded.
As you can see, there are two $http.get calls in this function. The problem is that when I hit the back button to go back to this page, the $http.get('/vote/validate_querystring?token=...') call is loaded from the browser cache, while $http.get('/vote/r_vote_setting') makes a new request to the server. I found this in the Chrome console:
Request URL:http://localhost:8080/vote/validate_querystring?token=202cb962ac59075b964b07152d234b70&id=1
Request Method:GET
Status Code:200 OK (from disk cache)
Remote Address:[::1]:8080
Referrer Policy:no-referrer-when-downgrade
Request URL:http://localhost:8080/vote/r_vote_setting
Request Method:GET
Status Code:200 OK
Remote Address:[::1]:8080
Referrer Policy:no-referrer-when-downgrade
I want to know why this happens and how to make both of them send a request to the server rather than use the cache when hitting the back button.
You can use the cache: false option of $http.
$http.get('/vote/validate_querystring?token=' + $scope.urlParam.token + "&id=" + $scope.urlParam.id, { cache: false });
Or use $httpProvider to set caching to false globally:
myModule.config(['$httpProvider', function ($httpProvider) {
    //initialize get if not there
    if (!$httpProvider.defaults.headers.get) {
        $httpProvider.defaults.headers.get = {};
    }
    // Answer edited to include suggestions from comments
    // because previous version of code introduced browser-related errors

    //disable IE ajax request caching
    $httpProvider.defaults.headers.get['If-Modified-Since'] = 'Mon, 26 Jul 1997 05:00:00 GMT';
    // extra
    $httpProvider.defaults.headers.get['Cache-Control'] = 'no-cache';
    $httpProvider.defaults.headers.get['Pragma'] = 'no-cache';
}]);
You can also include a time parameter in the call. That makes every call unique and avoids unwanted caching for only that specific call, instead of having to mess with the $http default settings and potentially affect every other call on the page.
$http({
    url: sampleUrl,
    method: 'GET',
    params: {
        time: new Date().getTime()
    }
}).error(function (err) {
    console.log('Error encountered: ' + err);
}).success(function (data) {
    console.log(data);
});
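Applied to the call from the question, a hedged sketch might look like this; using the params option also means the token and id get URL-encoded for you:
// Sketch only: cache-busting the validate_querystring call from the question.
$http.get('/vote/validate_querystring', {
    params: {
        token: $scope.urlParam.token,
        id: $scope.urlParam.id,
        time: new Date().getTime() // unique per call, so the cached copy is never reused
    }
}).success(function (response) {
    // handle response.status as before
});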
Related
I have been fumbling with AngularJS for weeks now, trying to cobble together parts of this, that, and the other thing to create a file-serving web application with a Django backend. I thought things were going well until I found myself trying to upload a file along with all of my other form data. My HTML form consistently showed up as having no file attached during the validation step before sending the request. Well, that's no good! Anyway, this ended up being some manner of unsupported operation, for one reason or another. I turned to ng-file-upload, a third-party file upload service for AngularJS. The most current iteration of ng-file-upload uses AngularJS 1.6-style requests, while my third-party registration application, angular-django-registration-auth, uses the $http API from before 1.6.
I need to update the third-party registration application, but it has the following code.
'request': function(args) {
    // Let's retrieve the token from the cookie, if available
    if ($cookies.token) {
        $http.defaults.headers.common.Authorization = 'Token ' + $cookies.token;
    }
    // Continue
    params = args.params || {}
    args = args || {};
    var deferred = $q.defer(),
        url = this.API_URL + args.url,
        method = args.method || "GET",
        params = params,
        data = args.data || {};
    // Fire the request, as configured.
    $http({
        url: url,
        withCredentials: this.use_session,
        method: method.toUpperCase(),
        headers: {'X-CSRFToken': $cookies['csrftoken']},
        params: params,
        data: data
    })
    .success(angular.bind(this, function (data, status, headers, config) {
        deferred.resolve(data, status);
    }))
    .error(angular.bind(this, function (data, status, headers, config) {
        console.log("error syncing with: " + url);
        // Set request status
        if (data) {
            data.status = status;
        }
        if (status == 0) {
            if (data == "") {
                data = {};
                data['status'] = 0;
                data['non_field_errors'] = ["Could not connect. Please try again."];
            }
            // or if the data is null, then there was a timeout.
            if (data == null) {
                // Inject a non field error alerting the user
                // that there's been a timeout error.
                data = {};
                data['status'] = 0;
                data['non_field_errors'] = ["Server timed out. Please try again."];
            }
        }
        deferred.reject(data, status, headers, config);
    }));
    return deferred.promise;
},
Beginning at var deferred = (this is defining a promise object, right?) I am unclear on what is going on. The assignments are easy to understand for the most part, with the exception of the promise object (how does data = args.data || {}; end up on the right-hand side of one of $http's compound assignments?), but what exactly is happening in the success and error cases where angular.bind() is called? I can't seem to find any good examples where Angular seems to bind to a promise.
Fixed this with then() calls after finding some decent resources. Here is what my code ended up looking like; I'm including my logging because it may help someone else.
"request": function(args) {
// Let"s retrieve the token from the cookie, if available
if($cookies.get("token")){
$http.defaults.headers.common.Authorization = "Token " + $cookies.get("token");
}
// Continue
params = args.params || {};
args = args || {};
var deferred = $q.defer(),
url = this.API_URL + args.url,
method = args.method || "GET",
params = params,
data = args.data || {};
// Fire the request, as configured.
$http({
url: url,
withCredentials: this.use_session,
method: method.toUpperCase(),
headers: {"X-CSRFToken": $cookies["csrftoken"]},
params: params,
data: data
})
.then(function(response) {
console.log("Success case: " + url);
console.log("Headers: " + JSON.stringify(response.headers(),null, 4));
console.log("Config: " + response.config);
console.log("Status: " + response.status);
console.log('Response: ');
console.log('JSON: ' + JSON.stringify(response.data, null, 4));
deferred.resolve(response.data, response.status);
}, function(response) {
console.log("Error case: " + url);
console.log("Headers: " + JSON.stringify(response.headers(),null, 4));
console.log("Config: " + response.config);
console.log("Status: " + response.status);
console.log('Response: ');
console.log('JSON:' + JSON.stringify(response.data, null, 4));
if(response.data){ response.data.status = status; }
if(status == 0){
if(response.data == ""){
response.data = {};
response.data["status"] = 0;
response.data["non_field_errors"] = ["Could not connect. Please try again."];
}
if(data == null){
response.data = {};
response.data["status"] = 0;
response.data["non_field_errors"] = ["Server timed out. Please try again."];
}
}
deferred.reject(response.data, response.status, response.headers, response.config);
});
return deferred.promise;
},
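As for the question about angular.bind(): it simply returns the given function with `this` pre-bound to the first argument, much like Function.prototype.bind; in the original snippet it just keeps `this` pointing at the service object inside the success and error callbacks, it does not bind anything to the promise. A tiny sketch with made-up names:
// Sketch only: angular.bind(context, fn) returns fn with `this` fixed to context.
var service = {
    API_URL: '/api/',
    logUrl: function (path) {
        console.log(this.API_URL + path); // `this` must be `service` for this to work
    }
};
var bound = angular.bind(service, service.logUrl);
bound('users'); // logs "/api/users"
// Roughly equivalent to the native:
var nativeBound = service.logUrl.bind(service);
nativeBound('users');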
I'm trying to use "pre-fetching" and fetch "collect" techniques to cache JS, CSS, and other assets in a SPA application.
To pre-fetch scripts, I have tried code very much like this snippet:
self.addEventListener('install', function (event) {
    var now = Date.now();
    var urlsToPrefetch = [
        'static/pre_fetched.txt',
        'static/pre_fetched.html'
    ];
    event.waitUntil(
        caches.open(CURRENT_CACHES.prefetch).then(function (cache) {
            var cachePromises = urlsToPrefetch.map(function (urlToPrefetch) {
                var url = new URL(urlToPrefetch, location.href);
                url.search += (url.search ? '&' : '?') + 'cache-bust=' + now;
                var request = new Request(url, {mode: 'no-cors'});
                return fetch(request).then(function (response) {
                    if (response.status >= 400) {
                        throw new Error('request for ' + urlToPrefetch +
                            ' failed with status ' + response.statusText);
                    }
                    return cache.put(urlToPrefetch, response);
                }).catch(function (error) {
                    console.error('Not caching ' + urlToPrefetch + ' due to ' + error);
                });
            });
            return Promise.all(cachePromises).then(function () {
                console.log('Pre-fetching complete.');
            });
        }).catch(function (error) {
            console.error('Pre-fetching failed:', error);
        })
    );
});
Full code can be checked here
After pre-fetching, I have almost all critical scripts in the cache (such as angular.js, modules, controllers, and maybe some jQuery plugins), so I add a fetch event listener to collect all the other scripts that are loaded asynchronously by require.js.
self.addEventListener('fetch', function (event) {
    if (event.request.method === "GET" && tests_to_know_if_it_is_a_js_or_css) {
        event.respondWith(
            caches.match(event.request)
                .then(function (response) {
                    if (response) {
                        logger && console.log('From Cache', event.request.url);
                        return response;
                    }
                    // IMPORTANT: Clone the request. A request is a stream and
                    // can only be consumed once. Since we are consuming this
                    // once by cache and once by the browser for fetch, we need
                    // to clone the request.
                    var fetchRequest = event.request.clone();
                    return fetch(fetchRequest).then(
                        function (response) {
                            // Check if we received a valid response
                            if (!response || response.status !== 200 || response.type !== 'basic') {
                                return response;
                            }
                            // IMPORTANT: Clone the response. A response is a stream
                            // and because we want the browser to consume the response
                            // as well as the cache consuming the response, we need
                            // to clone it so we have 2 streams.
                            var responseToCache = response.clone();
                            caches.open(CURRENT_CACHES['general-cache'])
                                .then(function (cache) {
                                    try {
                                        logger && console.log('Add to Cache', event.request.url);
                                        cache.put(event.request, responseToCache);
                                    } catch (e) {
                                        console.error(e);
                                    }
                                });
                            return response;
                        }
                    );
                })
        );
    }
});
Sorry, I can't find the original script this one was based on.
Both are working very well, but not as expected. The second fetch adds the file to the cache again; I think it's because caches.match(event.request) doesn't really match. So I added a console.log to see both request objects: the synthetic one created by the pre-fetch and the one cloned from the fetch.
(Console screenshots of the synthetic and the cloned request objects omitted.)
So, I'm not sure whether I can overwrite these properties so the synthetic request matches the cloned one. Can I do that safely? How can I solve this?
PS: This code doesn't run as a common script; the snippet is just to organize the idea.
I did not find any reference to confirm my solution, but it works.
The solution was to create two different caches and "normalize" the request by cloning it into a synthetic request, removing all references and keeping only the basics:
var CURRENT_CACHES = {
    'prefetch-cache': 'prefetch-cache-v' + CACHE_VERSION, // Prefetch cache
    'general-cache': 'general-cache-v' + CACHE_VERSION,
};
The prefetch-cache is responsible for storing all the files that I want to prefetch in my service worker, and the general-cache is for all other files (it makes sense when you have a SPA and want to accumulate some requests such as translation files, JS components, CSS, and other assets).
You can make an array with the URIs of all the files that you want to prefetch:
var urlsToPrefetch = [
    // JS
    "plugin/angular/angular.min.js", "plugin/requirejs/require.min.js", "app/main.js", "app/app.js", "app/includes.js",
    // CSS
    "styles/css/print.css", "styles/css/bootstrap.css", "styles/css/fixes.css",
    // HTML
    "app/layout/partials/menu.tpl.html", "app/layout/public.tpl.html",
    // JSON
    "app/i18n/languages.json", "app/i18n/pt-br.json", "app/i18n/en.json"
];
In the install event you should create new Requests for all the files from this array and store them in the prefetch-cache:
self.addEventListener('install', function (event) {
    logger && console.log('Handling install event:', event);
    //var now = Date.now();
    // All of these logging statements should be visible via the "Inspect" interface
    // for the relevant SW accessed via chrome://serviceworker-internals
    if (urlsToPrefetch.length > 0) {
        logger && console.log('Handling install event. Resources to prefetch:', urlsToPrefetch.length, "resources");
        event.waitUntil(
            caches.open(CURRENT_CACHES['prefetch-cache']).then(function (cache) {
                var cachePromises = urlsToPrefetch.map(function (urlToPrefetch) {
                    urlToPrefetch += '?v=' + CACHE_VERSION;
                    // This constructs a new URL object using the service worker's script location as the base
                    // for relative URLs.
                    //var url = new URL(urlToPrefetch + '?v=' + CACHE_VERSION, location.href);
                    var url = new URL(urlToPrefetch, location.href);
                    // Append a cache-bust=TIMESTAMP URL parameter to each URL's query string.
                    // This is particularly important when precaching resources that are later used in the
                    // fetch handler as responses directly, without consulting the network (i.e. cache-first).
                    // If we were to get back a response from the HTTP browser cache for this precaching request
                    // then that stale response would be used indefinitely, or at least until the next time
                    // the service worker script changes triggering the install flow.
                    //url.search += (url.search ? '&' : '?') + 'v=' + CACHE_VERSION;
                    // It's very important to use {mode: 'no-cors'} if there is any chance that
                    // the resources being fetched are served off of a server that doesn't support
                    // CORS (http://en.wikipedia.org/wiki/Cross-origin_resource_sharing).
                    // In this example, www.chromium.org doesn't support CORS, and the fetch()
                    // would fail if the default mode of 'cors' was used for the fetch() request.
                    // The drawback of hardcoding {mode: 'no-cors'} is that the response from all
                    // cross-origin hosts will always be opaque
                    // (https://slightlyoff.github.io/ServiceWorker/spec/service_worker/index.html#cross-origin-resources)
                    // and it is not possible to determine whether an opaque response represents a success or failure
                    // (https://github.com/whatwg/fetch/issues/14).
                    var request = new Request(url, {mode: 'no-cors'});
                    return fetch(request).then(function (response) {
                        logger && console.log('Add to Cache (Prefetch)', url.href);
                        if (!response || response.status !== 200 || response.type !== 'basic') {
                            throw new Error('request for ' + urlToPrefetch +
                                ' failed with status ' + response.statusText);
                        }
                        //var responseToCache = response.clone();
                        // Use the original URL without the cache-busting parameter as the key for cache.put().
                        // return cache.put(urlToPrefetch, responseToCache);
                        return cache.put(urlToPrefetch, response);
                    }).catch(function (error) {
                        logger && console.error('Not caching ' + urlToPrefetch + ' due to ' + error);
                    });
                });
                return Promise.all(cachePromises).then(function () {
                    logger && console.log('Pre-fetching complete.');
                });
            }).catch(function (error) {
                logger && console.error('Pre-fetching failed:', error);
            })
        );
    }
    // Perform install steps
    // if (urlsToPrefetch.length > 0) {
    //     event.waitUntil(
    //         caches.open(CURRENT_CACHES['perma-cache'])
    //             .then(function (cache) {
    //                 return cache.addAll(urlsToPrefetch);
    //             })
    //     );
    // }

    // `skipWaiting()` forces the waiting ServiceWorker to become the
    // active ServiceWorker, triggering the `onactivate` event.
    // Together with `Clients.claim()` this allows a worker to take effect
    // immediately in the client(s).
    return self.skipWaiting();
});
For all the other files that will be stored in the cache in the future, you must handle them in the fetch event listener and store those requests in the general-cache:
self.addEventListener('fetch', function (event) {
    //console.log(event);
    if (event.request.method === "GET") {
        var qSFilter = "" + ((event.request.url).split('?'))[0]; // Strip the query string
        //console.log(event.request.url, qSFilter, qSFilter.split(CACHE_SCOPE), CACHE_SCOPE);
        var leUrl = (qSFilter.split(CACHE_SCOPE))[1];
        // It is possible to add some logic here to skip backend calls and other uncachable requests
        if (/^(app|style|plugin).*(js|css|html|jpe?g|png|gif|json|woff2?)$/.test(leUrl)
            || /^backend\/server\/file\/i18n\/((?!client).+)\//.test(leUrl)
            || /^backend\/server\/static\/images\/.*$/.test(leUrl)
            || /^backend\/server\/static\/style.*$/.test(leUrl)
        ) {
            var url = new URL(leUrl + '?v=' + CACHE_VERSION, location.href);
            var synthetic = new Request(url, {mode: 'no-cors'});
            //console.log(event.request, response.clone(), synthetic);
            event.respondWith(
                // caches.match(event.request)
                caches.match(synthetic)
                    .then(function (response) {
                        // Cache hit - return response
                        if (response) {
                            logger && console.log('From Cache', event.request.url);
                            return response;
                        }
                        // IMPORTANT: Clone the request. A request is a stream and
                        // can only be consumed once. Since we are consuming this
                        // once by cache and once by the browser for fetch, we need
                        // to clone the request.
                        var fetchRequest = event.request.clone();
                        return fetch(fetchRequest).then(
                            function (response) {
                                // Check if we received a valid response
                                if (!response || response.status !== 200 || response.type !== 'basic') {
                                    return response;
                                }
                                // IMPORTANT: Clone the response. A response is a stream
                                // and because we want the browser to consume the response
                                // as well as the cache consuming the response, we need
                                // to clone it so we have 2 streams.
                                var responseToCache = response.clone();
                                caches.open(CURRENT_CACHES['general-cache'])
                                    .then(function (cache) {
                                        try {
                                            logger && console.log('Add to Cache', event.request.url, qSFilter, leUrl);
                                            cache.put(event.request, responseToCache);
                                        } catch (e) {
                                            console.error(e);
                                        }
                                    });
                                return response;
                            }
                        );
                    })
            );
        }
    }
});
The full working script can be accessed here:
https://gist.github.com/LeonanCarvalho/0527526a6b784b23facf56fa3cc12d22
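One companion piece, not shown above but commonly paired with versioned cache names like these, is an activate handler that deletes caches left over from older CACHE_VERSIONs. A hedged sketch, assuming CURRENT_CACHES is defined as in the answer:
// Sketch only: drop any cache whose name is not in the current CURRENT_CACHES set.
self.addEventListener('activate', function (event) {
    var expectedCacheNames = Object.keys(CURRENT_CACHES).map(function (key) {
        return CURRENT_CACHES[key];
    });
    event.waitUntil(
        caches.keys().then(function (cacheNames) {
            return Promise.all(
                cacheNames.map(function (cacheName) {
                    if (expectedCacheNames.indexOf(cacheName) === -1) {
                        // This cache belongs to an older version, so delete it.
                        return caches.delete(cacheName);
                    }
                })
            );
        })
    );
});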
I am learning by trying to build a database that tracks comics. I can POST new comics and GET them with no trouble, but when I try to PUT, I run into a problem: I keep getting a 400 Bad Request even though I think all the information is correct and matches up. I'm not sure what else is wrong.
All I am trying to do is update the comic list so you can track your physical and digital copies of the comic.
Thanks in advance.
Here is my DBController.cs:
[Authorize]
public IHttpActionResult Put(Comic comic)
{
    string UserId = User.Identity.GetUserId();
    if (UserId == null) return BadRequest("You Must Be Logged In to Edit");
    else if (comic.UserId == UserId || User.IsInRole("Admin"))
    {
        using (ApplicationDbContext db = new ApplicationDbContext())
        {
            var currentComic = db.Comics.Find(comic.ComicId);
            currentComic.IsDigital = comic.IsDigital;
            currentComic.IsPhysical = comic.IsPhysical;
            db.SaveChanges();
        }
        return Ok();
    }
    else return BadRequest("Insufficient privileges");
}
}
Here is my CollectionController.js:
$scope.physical = false;
$scope.digital = false;

$scope.updateComic = function (ComicId) {
    var comic = {
        ComicId: ComicId,
        IsPhysical: $scope.physical,
        IsDigital: $scope.digital,
    };
    return MarvelApiFactory.editComic(comic).then(function (data) {
    });
};
And my ApiFactory.js
var editComic = function (comic) {
    var deferred = $q.defer();
    $http({
        url: '/api/ComicDB',
        method: "PUT",
        headers: { Authorization: "Bearer " + localStorage.getItem('token') },
        data: comic
    }).success(function () {
        deferred.resolve();
    }).error(function () {
        deferred.reject();
    });
    return deferred.promise;
};

return {
    editComic: editComic,
};
Here is my .html:
<button class="btn btn-danger" ng-click="updateComic(x.ComicId)">Save</button>
And lastly, my error messages. Sorry, I'm not really sure how/what you need. Last night when I was figuring this out, I had clicked on the network tab and was able to find inner exceptions and such. Either I can't find them this time, or I didn't get any. But this is from my JS console:
PUT http://localhost:53612/api/ComicDB 400 (Bad Request)
angular.js:9827 (anonymous function)
angular.js:9628 sendReq
angular.js:9344 serverRequest
angular.js:13189 processQueue
angular.js:13205 (anonymous function)
angular.js:14401 Scope.$eval
angular.js:14217 Scope.$digest
angular.js:14506 Scope.$apply
angular.js:21440 (anonymous function)
jquery-1.10.2.js:5109 jQuery.event.dispatch
jquery-1.10.2.js:4780 elemData.handle
PUT and DELETE are not enabled in IIS Express and IIS 8 by default. You can enable these verbs by following these steps:
1. Open the applicationHost.config file on the machine running the Web API application. The file is located at %userprofile%\documents\IIS{Express|8}\config.
2. Scroll down to the bottom of the applicationHost.config file and look for a handler entry that starts with <add name="ExtensionlessUrl-Integrated-4.0"...
3. In the "verb" attribute add PUT and DELETE, so that the "verb" attribute looks like: verb="GET,HEAD,POST,DEBUG,PUT,DELETE"
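For illustration, after the edit the handler entry would look roughly like the line below. Only the verb attribute changes; the other attribute values shown here are placeholders and should be kept exactly as they already are in your applicationHost.config:
<add name="ExtensionlessUrl-Integrated-4.0" path="(keep existing value)" verb="GET,HEAD,POST,DEBUG,PUT,DELETE" type="(keep existing value)" preCondition="(keep existing value)" />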
I would like my AngularJS app to capture "no internet connection" just like Chrome does. When Chrome captures it, it shows net::ERR_INTERNET_DISCONNECTED in the console log.
Angular's HTTP interceptor is not able to capture it. The only data I get is
rejection.data.status undefined
rejection.status 0
So far, this has been working great. I noticed that the status is 0 when it can't contact anything. This is inside my http interceptor
responseError: function (rejection) {
    if (rejection.status == 0) {
        // put whatever behavior you would like to happen
    }
    // .......
    return $q.reject(rejection);
}
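For completeness, a minimal sketch of how an interceptor like this might be registered; the module and factory names here are assumptions, not part of the original code:
// Sketch only: registering the responseError interceptor above.
// 'myApp' and 'connectionInterceptor' are hypothetical names.
angular.module('myApp')
    .factory('connectionInterceptor', ['$q', function ($q) {
        return {
            responseError: function (rejection) {
                if (rejection.status == 0) {
                    // no response from the server: likely offline or connection refused
                }
                return $q.reject(rejection);
            }
        };
    }])
    .config(['$httpProvider', function ($httpProvider) {
        $httpProvider.interceptors.push('connectionInterceptor');
    }]);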
Here is a simple script that could help you. You could do this in different ways, even by loading an image and calling a function when it fails.
function getNetworkStatus(callback, timeout) {
    var x = new XMLHttpRequest();
    x.timeout = timeout;
    x.onreadystatechange = function () {
        if (x.readyState == 4) {
            callback(x.status == 200);
        }
    };
    x.onerror = function (e) {
        callback(false);
    };
    x.ontimeout = function () {
        callback(false);
    };
    x.open("GET", "http://ip-api.com/json/");
    x.send();
}

getNetworkStatus(function (isOnline) {
    console.info(isOnline ? "ONLINE" : "OFFLINE");
}, 60000);
UPDATE
We define this interceptor in $httpProvider so that it strictly returns an error whenever a call does not complete successfully.
angular.module('MyApp', []).config([
    '$httpProvider',
    function ($httpProvider) {
        $httpProvider.interceptors.push([
            '$q',
            function ($q) {
                return {
                    responseError: function (res) {
                        console.info("Failed to open url: " + res.config.url, res);
                        // Angular returns "success" by default, but we will call "error" if data were not obtained.
                        if (res.data == null && res.status === 0 && res.statusText === "") {
                            return $q.reject(res); // callback error()
                        }
                        return res; // return default success()
                    }
                };
            }
        ]);
    }
]).run(['$http', function ($http) { // --TEST--
    //$http.get("http://ip-api.com/json/").success(function(){ // page online
    $http.get("https://ip-api.com/json/").success(function () { // try "https" (page offline, to test)
        console.info("My great page is loaded - We are online :)", arguments);
    }).error(function () {
        console.info("ERROR: My great online page is not loaded :/ - possibility of connection loss?", arguments);
    });
}]);
You can change this to any trusted URL that you think will never be offline, for example Google, but remember that the URL must allow cross-origin access so it does not trigger an "Access-Control-Allow-Origin" error. http://ip-api.com/json/ is fine.
I'm trying to convert an unstructured jQuery application to MVC AngularJS, but at one point I'm kind of lost or I just don't get it. (Unfortunately I'm not a JavaScript god, so the error might also be there.)
This is the snippet of the original jQuery code.
$.ajax({
    type: "GET",
    url: "rest/users/" + userId + "/requests",
    accept: "application/json; charset=utf-8",
    statusCode: {
        200: function (data) {
            // do something
        },
        404: function () {
            window.location = 'index.html';
        },
        500: function () {
            alert("Server Error!");
        }
    }
});
A simple REST call where the HTTP response code is used to navigate. Unfortunately I can't make this work in AngularJS.
Here is my Controller:
// Inside the RequestController
$scope.requests = RequestModel.getRequestsByUser($rootScope.currentUser.userId);

if ($scope.requests.statusCode == 200) {
    // do something
} else if ($scope.requests.statusCode == 404) {
    $location.path('/notFound');
} else if ($scope.requests.statusCode == 500) {
    $location.path('/error');
} // PROBLEM: The if/else statement is never true since statusCode is not available
Here is my Model:
// Inside the RequestModel
this.getRequestsByUser = function (userId) {
    var RequestResource = $resource('../rest/users/' + userId + "/requests");
    var requestList = RequestResource.get({}, function (response, getResponseHeaders) {
        // PROBLEM: The property "statusCode" is "unavailable" in the Controller even though it was set here
        requestList.statusCode = 200;
        console.log("SUCCESS: getRequestsByUser() -> StatusCode: " + requestList.statusCode);
        console.log(requestList);
    }, function (response, getResponseHeaders) {
        requestList.statusCode = response.status;
        console.log("FAILED: getRequestsByUser() -> StatusCode: " + response.status);
    });
    return requestList;
};
This doesn't work since "statusCode" is "unavailable" inside my controller. The REST call works and the data binding to the view is also fine; I'm just not able to implement the "navigation part". Am I missing something like $watch properties or asynchronous behavior, or is my approach just incorrect?
Thanks for your help!
You can make better use of resource parameter mapping in your service:
// Inside service
this.requestsByUser = $resource('../rest/users/:userId/requests', {userId: '@userId'});
That way you'll be able to reuse the same resource for different REST actions (e.g. POST, DELETE).
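For illustration, a hedged sketch of what that reuse could look like; the userId value and the request body fields below are made up:
// Sketch only: reusing the same $resource for other REST actions.
var resource = RequestModel.requestsByUser;

// POST ../rest/users/42/requests
resource.save({userId: 42}, {title: 'New request'}, function (created) {
    console.log('created', created);
});

// DELETE ../rest/users/42/requests
resource.delete({userId: 42}, function () {
    console.log('deleted');
});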
And controller code to handle statuses (response handlers were moved to controller):
// Inside the RequestController
$scope.requests = RequestModel.requestsByUser.get(
    {userId: $rootScope.currentUser.userId},
    function (response) { // success handler
        if (response.status == 200) {
            // do something
        }
    },
    function (response) { // error handler
        if (response.status == 404) {
            $location.path('/notFound');
        } else if (response.status == 500) {
            $location.path('/error');
        }
    }
);
Another way around this is to use the $q service to return promises from your service, but the solution provided above seems cleaner to me.
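For completeness, a hedged sketch of that $q-based alternative, reusing the names from the question; this is only one way to wire it up:
// Inside the RequestModel: wrap the $resource call and return a promise.
this.getRequestsByUser = function (userId) {
    var deferred = $q.defer();
    $resource('../rest/users/' + userId + '/requests').get({},
        function (data) { deferred.resolve(data); },          // success: resolve with the data
        function (response) { deferred.reject(response); });  // error: reject with the HTTP response
    return deferred.promise;
};

// Inside the RequestController: navigate based on the rejection's status.
RequestModel.getRequestsByUser($rootScope.currentUser.userId)
    .then(function (requests) {
        $scope.requests = requests;
    }, function (response) {
        if (response.status == 404) {
            $location.path('/notFound');
        } else if (response.status == 500) {
            $location.path('/error');
        }
    });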