Dealing with cache and updated JSON files in AngularJS - javascript

I'm building an app where I add and delete items from a JSON file.
I've encountered the following problem: when I delete an item, the change is reflected in the frontend (the item disappears), but it takes a couple of hard page reloads for the app to read the new JSON file produced by my PHP script instead of the cached one.
If I reload only once, it reads the cached JSON file, which doesn't reflect the changes made.
Is there any way to deal with this issue directly in AngularJS?
Here's my Angular code:
$scope.remove = function(array, index) {
    if ($scope.totsselected) {
        array.splice(index, 1);
        $http.post("deleteall.php", {
            data: array
        })
        .then(function(response) {
            $http.get('data/all.json')
                .then(function(response) {
                    $scope.productesgenerals = response.data;
                    console.log($scope.productesgenerals);
                })
                .catch(function(error) {
                });
        });
    }
};
And my PHP code:
<?php
$contentType = explode(';', $_SERVER['CONTENT_TYPE']);
$rawBody = file_get_contents("php://input"); // Read body
$data = json_decode($rawBody); // Then decode it
$all = $data->data;
$jsonData = json_encode($all);
file_put_contents('data/all.json', $jsonData);
?>

It sounds like you have $http caching turned on. Try disabling it for this request:
$http.get('data/all.json', {cache: false})
https://docs.angularjs.org/api/ng/service/$http#caching
If that doesn't work (it is still cached), then it sounds like server-side caching. You can bust this by sending a unique query string.
$http.get('data/all.json?_=' + Date.now(), {cache: false})
This will make each request a unique request and should prevent the server side caching.
One caveat is that since you are ignoring the caching, you lose all the performance benefits of caching.
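If you want the cache-busting in one place instead of sprinkled through your controllers, one option is an $http request interceptor. This is a minimal sketch (the interceptor name, the module variable app, and the URL check are my own assumptions, not part of your app):
app.factory('noCacheInterceptor', function() {
    return {
        request: function(config) {
            // Append a unique timestamp to GET requests for the JSON file only
            if (config.method === 'GET' && config.url.indexOf('data/all.json') !== -1) {
                config.params = config.params || {};
                config.params._ = Date.now();
            }
            return config;
        }
    };
});

app.config(['$httpProvider', function($httpProvider) {
    $httpProvider.interceptors.push('noCacheInterceptor');
}]);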

Related

If in cache, use cached version, otherwise GET new JSON

I'm after some advice/guidance regarding end-user performance.
I have put together a small client-side 'person search' which sources its data from a JSON file. The issue I am having is that the server which compiles the JSON is old/slow (not-for-profit community group). As a result, users have to wait between 3 and 6 seconds before they can interact with the page. I know efficiencies can be made in how I request the JSON data, but I am still very new to JavaScript. The JSON is being compiled as:
print(JSON.stringify(datarecord));
This is how I am currently requesting the JSON file, and then using...
var request = new XMLHttpRequest();
request.open("GET", "linktojson.json", false); // synchronous request
request.send(null);
var a = JSON.parse(request.responseText);
$(document).ready(function() {
    // Javascript things here
    console.log(a);
});
I suspect as a result, every time the user accesses the page, a request is made. Then the JSON is compiled (slowly) and then the page is ready for use.
I have been looking at storing the JSON data in the user's cache, so that the JSON data only needs to be fetched once per session; however, the load time still appears slow.
This is how I have updated my code.
var request = new XMLHttpRequest();
request.open("GET", "linktojson.json", false);
request.send(null);
localStorage.setItem("request", request.responseText); // store the raw JSON text
$(document).ready(function() {
    var jsonString = localStorage.getItem("request");
    var retrievedObject = JSON.parse(jsonString);
    console.log(retrievedObject);
    // Javascript things here
});
I have also been able to confirm that the bottleneck is occurring at the JSON request point to the server. I tested this by saving a copy of the JSON produced, then using this 'static' JSON file instead, and the page will render in under a second.
I then went further and used the following JSON file (28K records), and the page still rendered quickly.
LINK: https://raw.githubusercontent.com/prust/wikipedia-movie-data/master/movies.json
Sorry if this is a little long-winded. I wanted to describe the problem as best as possible.
If you want to utilize caching, you have to check the cache first before you make a request. Otherwise it won't make much sense.
Also, synchronous blocking requests are deprecated; you should switch over to fetch instead of XMLHttpRequest if possible.
A function that first checks whether the data is in the cache before making the request could look like this:
async function getData() {
    let data = localStorage.getItem('request')
    // check if data is in cache
    if (data === null) {
        // if it is not in cache then request it
        const response = await fetch('linktojson.json')
        // parse the json response
        data = await response.json()
        // store the data in the cache
        localStorage.setItem('request', JSON.stringify(data));
    } else {
        // if it exists then parse it
        data = JSON.parse(data)
    }
    // return the data
    return data
}
You could then do something like this:
getData()
    .then(data => {
        // ensure DOM is ready
        $(() => {
            console.log('do something with data', data)
        })
    })
    .catch(err => {
        console.log('error occurred')
    })
The code could also be written like this:
function waitForDomReady() {
    return new Promise(resolve => $(resolve))
}

async function run() {
    try {
        let data = await getData();
        await waitForDomReady();
        console.log('do something with data', data)
    } catch (err) {
        console.log('error occurred')
    }
}

run()
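One caveat: localStorage has no built-in expiry, so whatever you store stays until it is explicitly removed. If the data can go stale, a simple time-to-live check helps. This is a sketch on top of the function above (the 24-hour window and the request_time key are arbitrary assumptions):
async function getDataWithTTL(maxAgeMs = 24 * 60 * 60 * 1000) {
    const cached = localStorage.getItem('request');
    const cachedAt = Number(localStorage.getItem('request_time'));
    // serve from cache only while it is younger than maxAgeMs
    if (cached !== null && Date.now() - cachedAt < maxAgeMs) {
        return JSON.parse(cached);
    }
    const response = await fetch('linktojson.json');
    const data = await response.json();
    localStorage.setItem('request', JSON.stringify(data));
    localStorage.setItem('request_time', String(Date.now()));
    return data;
}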

NodeJS - How to get cookies from server response

I want to use Node.js as a tool for website scraping. I have already implemented a script which logs me in to the system and parses some data from the page.
The steps are defined like:
Open login page
Enter login data
Submit login form
Go to desired page
Grab and parse values from the page
Save data to file
Exit
Obviously, the problem is that my script has to log in every time, and I want to eliminate that. I want to implement some kind of cookie management system, where I can save cookies to a .txt file, and then during the next request I can load the cookies from the file and send them in the request headers.
This kind of cookie management system is not hard to implement, but the problem is how to access cookies in Node.js. The only way I have found is using the request library's response object, where you can do something like this:
request.get({
    headers: requestHeaders,
    uri: user.getLoginUrl(),
    followRedirect: true,
    jar: jar,
    maxRedirects: 10
}, function(err, res, body) {
    if (err) {
        console.log('GET request failed, here is the error');
        console.log(err);
    }
    // Get cookies from the response
    var responseCookies = res.headers['set-cookie'];
    var requestCookies = '';
    for (var i = 0; i < responseCookies.length; i++) {
        var oneCookie = responseCookies[i];
        oneCookie = oneCookie.split(';');
        requestCookies = requestCookies + oneCookie[0] + ';';
    }
});
The content of the requestCookies variable can now be saved to a .txt file and loaded the next time the script is executed; this way you can avoid logging the user in every time the script runs.
Is this the right way, or is there a method which returns cookies?
NOTE: If you want to set up your request object to automatically resend received cookies on every subsequent request, use the following lines during object creation:
var request = require("request");
request = request.defaults({jar: true}); // Send cookies on every subsequent request
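If you want the jar itself to survive between runs, one approach is to serialize it to disk. This is a sketch assuming the tough-cookie package (which the request library uses internally for its jars); the file name is arbitrary, and wiring the restored jar back into request's jar option is left out:
var tough = require('tough-cookie');
var fs = require('fs');

// Save the jar after scraping
function saveJar(jar, file) {
    fs.writeFileSync(file, JSON.stringify(jar.serializeSync()));
}

// Restore the jar on the next run (fresh jar if no file exists yet)
function loadJar(file) {
    if (!fs.existsSync(file)) return new tough.CookieJar();
    var serialized = JSON.parse(fs.readFileSync(file, 'utf8'));
    return tough.CookieJar.deserializeSync(serialized);
}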
In my case, I've used the 'http' library like the following:
http.get(url, function(response) {
    variable = response.headers['set-cookie'];
});
This function gets a specific cookie value from a server response (in Typescript):
function getResponseCookieValue(res: Response, param: string) {
    const setCookieHeader = res.headers.get('Set-Cookie');
    const parts = setCookieHeader?.match(new RegExp(`(^|, )${param}=([^;]+); `));
    const value = parts ? parts[2] : undefined;
    return value;
}
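Usage might look like this inside an async function (a sketch; PHPSESSID and the URL are hypothetical, and note that in browsers Set-Cookie is a forbidden response header, so this is typically only readable in server-side environments):
const res = await fetch('https://example.com/login', { method: 'POST' });
const sessionId = getResponseCookieValue(res, 'PHPSESSID');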
I use Axios personally.
axios.request(options).then(function (response) {
    console.log(response.config.headers.Cookie)
}).catch(function (error) {
    console.error(error)
});

Reconciling WP API, AngularJS, and $resource/$http requests (getting X-WP-TotalPages)

I'm putting together an app using WP-API with an Angular frontend. I'm developing locally, with the data I'm trying to use loaded in from a remote server. Doing a $resource request for all posts works great.
But, I'm trying now to get the result of the X-WP-TotalPages header and can't figure out how to do it. Here's the code as it stands:
var request = function() {
    var fullRoute = 'http://dylangattey.com/wp-json/posts/';
    var defaultGet = {
        method: 'GET',
        transformResponse: function(data, headers) {
            var response = {};
            response.posts = JSON.parse(data);
            response.headers = headers();
            console.log(headers['X-WP-TotalPages']);
            return response;
        }
    };
    return $resource(fullRoute, {}, {
        get: defaultGet,
        query: defaultGet
    });
};
// This gives me back the first 10 posts
request().query();
Chrome and curl both show that X-WP-TotalPages should be equal to 2 as a header. However, it just logs undefined.
Am I missing something? No matter whether I use $http or $resource I get the same result. I also have the same issue whether I use a remote site or a local WP installation on localhost. Really, I just want to know the total number of pages or even just the total number of posts for a given request, so if there's a better way to do it, I'd love to know.
You probably need to control the access to the specific headers you want on the server side.
See this thread for a brief discussion, or MDN on Access-Control-Expose-Headers.
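Two things usually have to line up here. On the server, the header must be whitelisted for cross-origin reads, e.g. with Access-Control-Expose-Headers: X-WP-TotalPages. On the client, note that the headers argument passed to transformResponse in AngularJS is a getter function, not a plain object, so indexing it with brackets always yields undefined. A sketch of the corrected transform (assuming the server exposes the header):
transformResponse: function(data, headers) {
    var response = {};
    response.posts = JSON.parse(data);
    // headers is a getter function; call it with the header name
    // (AngularJS matches header names case-insensitively)
    response.totalPages = parseInt(headers('X-WP-TotalPages'), 10);
    return response;
}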

Posting data to JSON - Angular .post

I am working on an application and am having an issue posting to a .json file in my assets. I am working with an Angular-based application. The error code I get is 404 with a response of: Cannot POST /assets/data/card-stack.json. The problem is, when I use my GET to retrieve the JSON data it works perfectly. It is only when I use .post. Here is what I am doing:
$http.get('./../../assets/data/card-stack.json').success(function(data) {
    $scope.cards = data;
    // Set the showdown images from the card data grabbed from the card-stack.json file
    $scope.showdowns = [
        $scope.cards[0].url,
        $scope.cards[1].url,
        $scope.cards[2].url,
        $scope.cards[3].url
    ];
});

// Simple POST request example (passing data):
$http.post('./../../assets/data/card-stack.json', {url: './../images/banana.jpg'})
    .success(function(data, status, headers, config) {
        // this callback will be called asynchronously
        // when the response is available
    })
    .error(function(data, status, headers, config) {
        console.log(data);
        // called asynchronously if an error occurs
        // or server returns response with an error status.
    });
Suggestions?
A .json file is a static file that just contains json data so you won't be able to post to it. Instead you would need to use a server side page or service such as php to process the posted data.
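A minimal sketch of such an endpoint, here in Node with Express purely for illustration (the route, port, and file path are assumptions):
const express = require('express');
const fs = require('fs');
const app = express();

app.use(express.json());

// Accept a posted card and append it to the JSON file on disk
app.post('/cards', (req, res) => {
    const file = 'assets/data/card-stack.json';
    const cards = JSON.parse(fs.readFileSync(file, 'utf8'));
    cards.push(req.body);
    fs.writeFileSync(file, JSON.stringify(cards, null, 2));
    res.json(cards);
});

app.listen(3000);
The Angular $http.post would then target this route instead of the static .json file.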
Try setting the $http default header:
$http.defaults.headers.post["Content-Type"] = "application/json";

Storing KnockoutJS modeled data with AmplifyJS

I'm trying to figure out a way to cache my knockoutJS SPA data and I've been experimenting with amplifyJS. Here's one of my GET functions:
UserController.prototype.getUsers = function() {
    var self = this;
    return $.ajax({
        type: 'GET',
        url: self.Config.api + 'users'
    }).done(function(data) {
        self.usersArr(ko.utils.arrayMap(data.users, function(item) {
            // run each item through model
            return new self.Model.User(item);
        }));
    }).fail(function(data) {
        // failed
    });
};
Here's the same function, "amplified":
UserController.prototype.getUsers = function() {
    var self = this;
    if (amplify.store('users')) {
        self.usersArr(ko.utils.arrayMap(amplify.store('users'), function(item) {
            // run each item through model
            return new self.Model.User(item);
        }));
    } else {
        return $.ajax({
            type: 'GET',
            url: self.Config.api + 'users'
        }).done(function(data) {
            self.usersArr(ko.utils.arrayMap(data.users, function(item) {
                // run each item through model
                return new self.Model.User(item);
            }));
        }).fail(function(data) {
            // failed
        });
    }
};
This works as expected, but I'm not sure about the approach I used, because it will also require extra work on the addUser, removeUser and editUser functions. And seeing as I have many more similar functions throughout my app, I'd like to avoid the extra code if possible.
I've found a way of handling things with the help of ko.extenders, like so:
this.usersArr = ko.observableArray().extend({ localStore: 'users' });
I then use the ko.extenders.localStore function to update the local storage data whenever it detects a change inside the observableArray. On init it writes to the observableArray if local storage data exists for the 'users' key, and on changes it updates the local storage data.
My problem with this approach is that I need to run my data through the model, and I couldn't find a way to do that from the localStore function, which is kept in a separate file.
Has any of you worked with KO and Amplify? What approach did you use? Should I use the first one or try a combination of the two and rewrite the extender in a way that it only updates the local storage without writing to the observableArray on init?
Following the discussion in the question's comments, I suggested using native HTTP caching instead of adding another caching layer on the client by means of an extra library.
This would require implementing a conditional request scheme.
Such a scheme relies on freshness information in the Ajax response headers via the Last-Modified (or ETag) HTTP headers, plus other headers that influence browser caching (like Cache-Control with its various options).
The browser transparently sends an If-Modified-Since (or If-None-Match) header to the server when the same resource (URL) is requested subsequently.
The server can respond with HTTP 304 Not Modified if the client's information is still up-to-date. This can be a lot faster than re-creating a full response from scratch.
From the Ajax request's point of view (jQuery or otherwise), a response works the same way whether it actually came from the server or from the browser's cache; the latter is just a lot faster.
Carefully adapting the server side is necessary for this, the client side on the other hand does not need much change.
The benefit of implementing conditional requests is reduced load on the server and faster response behavior on the client.
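On the client, jQuery can opt into this scheme with the ifModified flag. A sketch against the getUsers function above (this only has an effect once the server sends Last-Modified or ETag):
$.ajax({
    type: 'GET',
    url: self.Config.api + 'users',
    // send If-Modified-Since / If-None-Match on repeat requests
    // and report a 304 as a success with status 'notmodified'
    ifModified: true
}).done(function(data, textStatus) {
    if (textStatus !== 'notmodified') {
        // fresh data arrived; re-map it into the view model
    }
});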
A specialty of Knockout to improve this even further:
If you happen to use the mapping plugin to map raw server data to a complex view model, you can define a key function as part of the options that control the mapping process. Its purpose is to match parts of your view model against parts of the source data.
This way parts of the data that already have been mapped will not be mapped again, the others are updated. That can help reduce the client's processing time for data it already has and, potentially, unnecessary screen updates as well.
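A sketch of such a key function with the mapping plugin (assuming each user object carries a unique id property; the property names here are illustrative):
var mappingOptions = {
    users: {
        // match existing view-model entries to incoming data by id,
        // so unchanged users are updated in place instead of recreated
        key: function(data) {
            return ko.utils.unwrapObservable(data.id);
        },
        create: function(options) {
            return new self.Model.User(options.data);
        }
    }
};
ko.mapping.fromJS(serverData, mappingOptions, viewModel);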
