How to provide an XMLHttpRequest promise to a JavaScript function - javascript

I'm messing around with the autocomplete plugin available at https://www.npmjs.com/package/bootstrap-4-autocomplete, and the following works:
$('#id').autocomplete({
    source: {'test1': 1, 'test2': 2, 'test3': 3}
});
Instead of local JSON, I will need to make an XMLHttpRequest, and was thinking of something like the following. While I don't get an error, I also don't get anything:
$('#id').autocomplete({
    source: function() {
        var xhttp = new XMLHttpRequest();
        xhttp.onreadystatechange = function() {
            if (this.readyState == 4 && this.status == 200) {
                return JSON.parse(this.responseText);
            }
        };
        xhttp.open(method, url, true);
        xhttp.send();
    }
});
The plugin's author made the following remark a while back:
I don't have plans to directly invoke any url inside the lib. What you can do is set autocomplete to your textfield after your ajax call returns, which you can do with jQuery, like this:
$.ajax('myurl').then((data) => $('#myTextfield').autocomplete({ source: data }));
You don't have to worry about setting autocomplete to a field multiple times, it is supposed to work like this when you need to change the source.
I tried it, and as expected $.ajax() initiated an XMLHttpRequest on page load, not, as desired, when the user enters a character into the search input.
How can I make an XMLHttpRequest to source the data into the plugin? I am assuming that I should be using a promise; if not, I would still appreciate any assistance.
Thanks

Well, that's how the plugin is supposed to work. Its meat and potatoes is the createItems function, called on the keyup event and responsible for filling the dropdown with items. Here's its key part (version 1.3.0):
function createItems(field: JQuery<HTMLElement>, opts: AutocompleteOptions) {
    const lookup = field.val() as string;
    // ...
    let count = 0;
    const keys = Object.keys(opts.source);
    for (let i = 0; i < keys.length; i++) {
        const key = keys[i];
        const object = opts.source[key];
        const item = {
            label: opts.label ? object[opts.label] : key,
            value: opts.value ? object[opts.value] : object,
        };
        if (item.label.toLowerCase().indexOf(lookup.toLowerCase()) >= 0) {
            items.append(createItem(lookup, item, opts));
            if (opts.maximumItems > 0 && ++count >= opts.maximumItems) {
                break;
            }
        }
    }
    // skipped the rest
}
As you can see, each time createItems is called, it goes through the source object, collecting all the items whose label contains the lookup string.
So all the data is expected to already be there, and to be processable synchronously. That's the plugin's way, with all the good and bad that comes with this approach.
The best thing the plugin's author could have suggested here (without going against what the plugin is about) is using AJAX to prepopulate the data before calling autocomplete, and that's actually what he did in that comment.
Now, what can be done here? One might think it's enough to just transform createItems into an async function, for example by calling source if it's a function, expecting its result to be a Promise, and taking the results of that AJAX call to repopulate source. It seems seductively simple, except for that lookup loop in the middle...
But unfortunately it's not that simple: there are several caveats to be aware of. What should happen, for example, if the user stops typing (triggering the first AJAX call), then types some more, then stops again (triggering another AJAX call), but the first response actually arrives later? That bug has plagued a lot of autocomplete implementations I've worked with; sadly, it's not easy to reproduce if you're testing only with fast network connections (let alone only on localhost).
That seems to be just one of the reasons the author decided against extending the plugin. After all, it was built to solve one specific task, and it does that well. So unless you want to fork it and essentially rewrite it into a 'two strategies' plugin, I'd suggest looking somewhere else.
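For illustration only, here is a minimal sketch of the workaround described above: wait until the user pauses typing, fetch matching items, then (re)initialize the plugin with the returned data. The /search endpoint, the q parameter, the 300 ms delay, and the assumption that the server already returns data in the {label: value} shape the plugin expects are all mine, not the plugin's:
var debounceTimer;
var latestRequest = 0;
$('#id').on('keyup', function () {
    var query = $(this).val();
    clearTimeout(debounceTimer);
    debounceTimer = setTimeout(function () {
        var requestId = ++latestRequest;                  // remember which request is the newest
        $.getJSON('/search', { q: query }).then(function (data) {
            if (requestId !== latestRequest) {
                return;                                   // a newer request has started; drop this stale response
            }
            // re-setting the source on the field is exactly what the plugin author suggests
            $('#id').autocomplete({ source: data });
        });
    }, 300);
});
The requestId check is one simple guard against the out-of-order responses mentioned above; without it, a slow early request could overwrite the result of a later one.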

Related

Detecting net::ERR_FILE_NOT_FOUND with only JS

I have recreated a blueprint, which has 60+ rooms, as an inline SVG.
There are functions that display information, such as pictures, when you select or hover a room. I'm using one div container to display the pictures by setting its background property to url('path-of-image.ext'), as can be seen below.
var cla = document.getElementsByClassName('cla');
for (var i = 0; i < cla.length; i++) {
    cla[i].addEventListener('mouseenter', fun);
}
function fun() {
    var str = 'url("media/' + this.id.slice(4) + '.jpg")';
    pictureFrame.style.background = str;
    pictureFrame.style.backgroundSize = 'cover';
    pictureFrame.style.backgroundPosition = 'center';
}
The reason I'm not using the background property's shorthand is that I plan on animating the background-position property with a transition.
However, not all rooms have pictures. Hence the console throws the following error, GET ... net::ERR_FILE_NOT_FOUND, when you select or hover over those rooms. The error doesn't cause the script to break, but I would prefer not to run that code every single time a room is hovered, especially when the room doesn't have pictures.
Even though I know this can be done imperatively with if/else statements, I'm trying to do this programmatically since there are so many individual rooms.
I've tried using try/catch, but this doesn't seem to detect this sort of error.
Any ideas?
Is it even possible to detect this kind of error?
You could attempt to read it using FileReader and catch/handle the NotFoundError.
If it were to error, you could assign it to an object or array which you would first check upon hover. If the file was in that array, you could avoid attempting to read it again and just handle it however you like.
Here is a good article by Nicholas Zakas on using FileReader
First off, I would see if there is a way of checking whether the file exists before the document even loads, so that you don't make unnecessary requests. If you have a database on the backend that can manage this, that would serve you very well in the long term.
Since you make it sound like the only way you know a file exists is by requesting it, here's a method that will allow you to try this:
function UrlExists(url) {
    // synchronous HEAD request: blocks until the server answers
    var http = new XMLHttpRequest();
    http.open('HEAD', url, false);
    http.send();
    return http.status != 404;
}
This won't request the image twice, because of browser caching. As you can see, synchronous requests like that are themselves being deprecated, so overall the best remedy is to check before the page even loads: if you have a database or data structure of any sort, add a class or property to the element indicating whether the image exists. Then, in your existing method, you can call something like document.getElementsByClassName('cla-with-image') to get only the elements you've determined have an image (much more efficient than trying to load images that don't exist).
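As a small sketch of that suggestion, and assuming a hypothetical cla-with-image class rendered by the backend for rooms that do have a picture, the listener registration from the question could be limited to those rooms:
var rooms = document.getElementsByClassName('cla-with-image');
for (var i = 0; i < rooms.length; i++) {
    // only rooms known to have a picture ever trigger the background swap
    rooms[i].addEventListener('mouseenter', fun);
}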
If you end up using that UrlExists method, then you can just modify your existing method to be
function fun() {
    var url = 'media/' + this.id.slice(4) + '.jpg';
    if (UrlExists(url)) {
        var str = 'url(' + url + ')';
        pictureFrame.style.background = str;
        pictureFrame.style.backgroundSize = 'cover';
        pictureFrame.style.backgroundPosition = 'center';
    }
}
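If you would rather avoid the synchronous request, a non-blocking variation on the same idea (just a sketch, reusing the pictureFrame element and media/ path from the question) is to probe each image once with an Image object, remember the rooms whose pictures are missing, and skip them on later hovers:
var missingImages = {};                         // room id -> true once a picture is known to be absent
function fun() {
    var roomId = this.id.slice(4);
    if (missingImages[roomId]) {
        return;                                 // we already know there is no picture for this room
    }
    var url = 'media/' + roomId + '.jpg';
    var probe = new Image();                    // loading it here also warms the browser cache
    probe.onload = function () {
        pictureFrame.style.background = 'url("' + url + '")';
        pictureFrame.style.backgroundSize = 'cover';
        pictureFrame.style.backgroundPosition = 'center';
    };
    probe.onerror = function () {
        missingImages[roomId] = true;           // don't try this room again
    };
    probe.src = url;
}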

Ajax call in "for" loops skips odd/even iterations

I'm asking here because we are stuck on something we don't know how to solve. I must admit we have already searched Stack Overflow and search engines for a solution, but we didn't manage to implement one or solve the problem.
I am trying to create a JavaScript function that:
detects in my HTML page all the occurrences of an HTML tag: <alias>
replaces its content with the result of an Ajax call (sending the content of the tag to the Ajax.php page), plus localStorage management
at the end, unwraps it from the <alias> tag and leaves the content returned from the Ajax call
The only problem is that in both cases it skips some iterations.
We have done some research and it seems that the "problem" is that Ajax is asynchronous, so it does not wait for the response before going on with the process. We have also seen that "async: false" is not a good solution.
Here is the relevant part of my script, with some brief descriptions:
// includes an icon in the page to display the correct change
function multilingual(msg, i) {
    // code
}
// function to make an ajax call, or a "cache call" if the value is in localStorage for a variable
function sendRequest(o) {
    console.log(o.variab + ': running sendRequest function');
    // check if a value for that variable is stored, and if it was stored less than 1 hour ago
    if (window.localStorage && window.localStorage.getItem(o.variab) && window.localStorage.getItem(o.variab + '_exp') > +new Date - 60 * 60 * 1000) {
        console.log(o.variab + ': value from localStorage');
        // replace <alias> content with the cached value
        var cached = window.localStorage.getItem(o.variab);
        elements[o.counter].innerHTML = cached;
        // including icon for multilingual post
        console.log(o.variab + ': calling multilingual function');
        multilingual(window.localStorage.getItem(o.variab), o.counter);
    } else {
        console.log(o.variab + ': starting ajax call');
        // not stored yet, or older than an hour
        console.log('variable=' + o.variab + '&api_key=' + o.api_key + '&lang=' + o.language);
        $.ajax({
            type: 'POST',
            url: my_ajax_url,
            data: 'variable=' + o.variab + '&api_key=' + o.api_key + '&lang=' + o.language,
            success: function(msg) {
                // ajax call: store the new value and expiration, then replace <alias> inner html with the new value
                window.localStorage.setItem(o.variab, msg);
                var content = window.localStorage.getItem(o.variab);
                window.localStorage.setItem(o.variab + '_exp', +new Date);
                console.log(o.variab + ': replacement from ajax call');
                elements[o.counter].innerHTML = content;
                // including icon for multilingual post
                console.log(o.variab + ': calling multilingual function');
                multilingual(msg, o.counter);
            },
            error: function(msg) {
                console.warn('an error occurred during ajax call');
            }
        });
    }
}
// loop over each <alias> element found
// initial settings
var elements = document.body.getElementsByTagName('alias'),
    elem_n = elements.length,
    counter = 0;
var i = 0;
for (; i < elem_n; i++) {
    var flag = 0;
    console.info('var i=' + i + ' - Now working on ' + elements[i].innerHTML);
    sendRequest({
        variab: elements[i].innerHTML,
        api_key: settings.api_key,
        language: default_lang,
        counter: i
    });
    $(elements[i]).contents().unwrap().parent();
    console.log(elements[i].innerHTML + ': wrap removed');
}
I hope that some of you can provide valid solutions and/or examples, because we are stuck on this problem :(
From our tests, when the value comes from the cache, the 1st/3rd/5th ... values are replaced correctly;
when the value comes from Ajax, the 2nd/4th .. values are replaced.
Thanks in advance for your help :)
Your elements array is a live NodeList. When you unwrap things in those <alias> tags, the element disappears from the list. So, you're looking at element 0, and you do the ajax call, and then you get rid of the <alias> tag around the contents. At that instant, element[0] becomes what used to be element[1]. However, your loop increments i, so you skip the new element[0].
There's no reason to use .getElementsByTagName() anyway; you're using jQuery, so use it consistently:
var elements = $("alias");
That'll give you a jQuery object that will (mostly) work like an array, so the rest of your code won't have to change much, if at all.
To solve issues like this in the past, I've done something like the code below: you send the target along with the function running the AJAX call, and don't use any global variables, because those may change as the for loop runs. Try passing everything you'll use into the parameters of the function, including the target, like I've done:
function loadContent(target, info) {
    // ajax call
    // on success replace target with new data
}
$('alias').each(function () {
    loadContent($(this), info);
});
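Filling in that outline, a runnable sketch might look like the following. The request body mirrors the one built in the question, the localStorage caching is left out to keep the sketch short, and using replaceWith to both insert the returned content and remove the <alias> wrapper is my interpretation of the question's two steps, not something stated in the answer:
function loadContent(target, info) {
    $.ajax({
        type: 'POST',
        url: my_ajax_url,
        data: info,
        success: function (msg) {
            target.replaceWith(msg);   // new content in, <alias> wrapper out
        },
        error: function () {
            console.warn('an error occurred during ajax call');
        }
    });
}
$('alias').each(function () {
    loadContent($(this), 'variable=' + $(this).html() + '&api_key=' + settings.api_key + '&lang=' + default_lang);
});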

What is more effective: every time make an ajax request or slice a result in func?

I have a JSON data of news like this:
{
    "news": [
        {"title": "some title #1", "text": "text", "date": "27.12.15 23:45"},
        {"title": "some title #2", "text": "text", "date": "26.12.15 22:35"},
        ...
    ]
}
I need to get a certain number of items from this list, depending on an argument to a function. As I understand it, this is called pagination.
I can get the Ajax response and slice it immediately, so that every time the function is called it makes an Ajax request.
Like this:
function showNews(page) {
    var newsPerPage = 5,
        firstArticle = newsPerPage * (page - 1);
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 4) {
            var newsArr = JSON.parse(xhr.responseText);
            newsArr.news = newsArr.news.slice(firstArticle, newsPerPage * page);
            addNews(newsArr);
        }
    };
    xhr.open("GET", url, true);
    xhr.send();
}
Or I can store the whole result in newsArr and slice it in an additional function, addNews, sorted by pages:
function addNews(newsArr, newsPerPage) {
    var amount = newsArr.news.length,            // total number of news items
        pages = Math.ceil(amount / newsPerPage), // counts number of pages
        pagesData = {};
    for (var i = 0; i <= pages; i++) {
        var min = i * newsPerPage,       // min index of current page in loop
            max = (i + 1) * newsPerPage; // max index of current page in loop
        newsArr.news.forEach(createPageData);
    }
    function createPageData(item, j) {
        if (j + 1 <= max && j >= min) {
            if (!pagesData["page" + (i + 1)]) {
                pagesData["page" + (i + 1)] = {news: []};
            }
            pagesData["page" + (i + 1)].news.push(item);
        }
    }
}
So, the simple question is: which variant is more effective? The first one loads the server and the second loads the user's memory. What would you choose in my situation? :)
Thanks for the answers. I understood what I wanted, but there are so many good answers that I can't choose the best one.
It is actually a primarily opinion-based question.
For me, the pagination approach looks better because it will not produce a "lag" before displaying the news. From the user's point of view, the page will load faster.
As for me, I would do pagination plus preloading of the next page, i.e. always store the contents of the next page so that you can show it without a delay; when the user moves to the last loaded page, load another one.
Loading all the news is definitely a bad idea. If you have 1000 news records, then every user will have to load all of them, even if he isn't going to read a single one.
In my opinion, the "fewer requests == better" rule doesn't apply here. It is not guaranteed that a user will read all the news. If StackOverflow loaded every question it has each time you opened the main page, then both StackOverflow and its users would have huge problems.
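For what it's worth, here is a minimal sketch of that "paginate on the server, preload the next page" idea. The page and perPage query parameters, the renderNews callback, and the response shape are assumptions about the backend, not something given in the question:
var pageCache = {};                             // page number -> array of news items
function loadPage(page, callback) {
    if (pageCache[page]) {                      // already preloaded: no request needed
        callback(pageCache[page]);
        return;
    }
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function () {
        if (xhr.readyState == 4 && xhr.status == 200) {
            pageCache[page] = JSON.parse(xhr.responseText).news;
            callback(pageCache[page]);
        }
    };
    // hypothetical endpoint that slices the list on the server side
    xhr.open("GET", url + "?page=" + page + "&perPage=5", true);
    xhr.send();
}
function showNews(page) {
    loadPage(page, renderNews);                 // renderNews: whatever draws the items on the page
    loadPage(page + 1, function () {});         // quietly preload the next page
}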
If the max number of records that your service returns is around 1000, then I don't think it is going to create a huge payload or memory issue (judging by the nature of your data), so I think option 2 is better because:
the number of service calls will be lower;
since the user will not see any lag while paginating, the experience of using the site will be better.
As a rule of thumb:
fewer requests == better
but that's not always possible. You may run out of memory or network capacity if the data you store is huge, i.e. you may need pagination on the server side. Actually, server-side pagination should be the default approach, and then you think about improvements (e.g. local caching) if you really need them.
So what you should do is try all scenarios and see how well they behave in your concrete situation.
I prefer to fetch all the data up front and show it conditionally: when the user clicks the next button, the data is already there, so you just hide and show it with jQuery.
Calling Ajax every time is a bad idea.
But you also need an Ajax call for new data if the data may have changed after some period of time.
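A tiny sketch of that hide/show approach, assuming every article is already rendered as an element with a hypothetical .news-item class and that a hypothetical #next button drives the paging:
function showPage(page, perPage) {
    var items = $('.news-item');                               // everything is already in the DOM
    items.hide();
    items.slice((page - 1) * perPage, page * perPage).show();  // reveal only the requested page
}
var currentPage = 1;
$('#next').on('click', function () {
    showPage(++currentPage, 5);
});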

Javascript How to check if a call is being made, and kill it if it is

Using jQuery, I'm writing a website API call in JavaScript, which so far works pretty well. When a person updates a number in a text input, it makes a call to the API and updates a field with the response. It gets problematic, however, when a user quickly makes a lot of changes. The JavaScript then seems to pile up all the queries and somehow runs them side by side, which gives the field to be updated a kind of stressed look.
I think one way of giving the user a more relaxed interface is to only start the API call after the user finished editing the input field more than half a second ago. I can of course set a timeout, but after the timeout I need to check whether there is already a call under way. If there is, it would need to be stopped/killed/disregarded, and then the new call simply started.
First of all, does this seem like a logical way of doing it? Next, how do I check if a call is underway? And lastly, how do I stop/kill/disregard the call that is busy?
All tips are welcome!
[EDIT]
As requested, here some of the code I already have:
function updateSellAmount() {
    $("#sellAmount").addClass('loadgif');
    fieldToBeUpdated = 'sellAmount';
    var buyAmount = $("#buyAmount").val();
    var sellCurrency = $("#sellCurrency").val();
    var buyCurrency = $("#buyCurrency").val();
    var quoteURL = "/api/getQuote/?sellCurrency=" + sellCurrency
        + "&buyAmount=" + buyAmount
        + "&buyCurrency=" + buyCurrency;
    $.get(quoteURL, function(data, textStatus, jqXHR) {
        if (textStatus == "success") {
            $("#sellAmount").val(data);
            $("#sellAmount").removeClass('loadgif');
        }
    });
    if (fieldToBeUpdated == 'sellAmount') {
        setTimeout(updateSellAmount, 10000);
    }
}
$("#buyAmount").on("change keyup paste", function() {
    updateSellAmount();
});
If you make your AJAX call like this:
var myAjaxDeferred = $.ajax("....");
You can check it later with:
if (myAjaxDeferred.state() === "pending") {
// this call is still working...
}
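To also cover the "kill it" part of the question, one common pattern (just a sketch, not part of the answer above) is to debounce the handler and abort any request that is still pending before starting a new one:
var pendingQuote = null;        // jqXHR of the request currently in flight, if any
var debounceTimer = null;
$("#buyAmount").on("change keyup paste", function () {
    clearTimeout(debounceTimer);
    debounceTimer = setTimeout(function () {
        // if an earlier request is still running, abort it so only the latest one wins
        if (pendingQuote && pendingQuote.state() === "pending") {
            pendingQuote.abort();
        }
        var quoteURL = "/api/getQuote/?sellCurrency=" + $("#sellCurrency").val()
            + "&buyAmount=" + $("#buyAmount").val()
            + "&buyCurrency=" + $("#buyCurrency").val();
        pendingQuote = $.get(quoteURL).done(function (data) {
            $("#sellAmount").val(data);
        });
    }, 500);                    // the half-second pause mentioned in the question
});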

Ajax not firing on prototype - I think I need to unset or remove a javascript method - causing havoc help!

I'm working with the script.aculo.us library. However, I'm facing some issues with the inclusion of a JSON library alongside Prototype. It adds toJSONString and parseJSONString methods to all objects automatically, and frankly this is causing havoc in places. For example, I can't seem to use the Ajax.Updater function, and I suspect it's because of the toJSONString method that has been attached to the options object I pass to it.
Is there any way to unset or at least somehow remove a function that has been added to Object?
EDIT:
Actually I'm trying to make an Ajax request and I'm facing an issue in the
Ajax.Updater = Class.create(Ajax.Request, ...
part of the Prototype library. At the point where it's supposed to execute and post an AJAX request, it doesn't, especially at:
$super(url, options);
To be precise, I'm using the sortable and editable table grid at this URL:
http://cloud.millstream.com.au/millstream.com.au/upload/code/tablekit/index.html
Basically you click on a table cell to edit it and push the OK button to confirm. Upon clicking the button, an Ajax request is made.
The editable feature of the table calls Ajax.Updater as follows, in a submit function:
submit: function(cell, form) {
    var op = this.options;
    form = form ? form : cell.down('form');
    var head = $(TableKit.getHeaderCells(null, cell)[TableKit.getCellIndex(cell)]);
    var row = cell.up('tr');
    var table = cell.up('table');
    var ss = '&row=' + (TableKit.getRowIndex(row) + 1) + '&cell=' + (TableKit.getCellIndex(cell) + 1) + '&id=' + row.id + '&field=' + head.id + '&' + Form.serialize(form);
    this.ajax = new Ajax.Updater(cell, op.ajaxURI || TableKit.option('editAjaxURI', table.id)[0], Object.extend(op.ajaxOptions || TableKit.option('editAjaxOptions', table.id)[0], {
        postBody: ss,
        onComplete: function() {
            var data = TableKit.getCellData(cell);
            data.active = false;
            data.refresh = true; // mark cell cache for refreshing, in case cell contents have changed and sorting is applied
        }
    }));
},
The problem is that the request is never made and I keep pushing the OK button to no avail.
EDIT:
I'm still stumped here. I've even tried calling the Ajax.Updater function on my own and it won't work at all. It's like this function has officially been rendered useless all of a sudden. I've made the changes you suggested, but all to no avail :( Frankly I'm running out of options here; another idea would be to ditch this TableKit and look for something else with similar functionality in the hope that THAT MIGHT work!
It sounds like those methods are being added to the prototype of Object. By adding to Object's prototype, the library is automatically giving everything that derives from Object (in other words, everything) those methods as well. You might want to do some reading on prototypal inheritance in JavaScript to get a better handle on this.
Anyway, you can remove those methods by doing this:
delete Object.prototype.toJSONString;
delete Object.prototype.parseJSONString;
You can delete anything from an object using "delete":
a.toJSON = function () {};
delete a.toJSON;
a.toJSON(); // => error: toJSON is not a function
a.toJSON;   // => undefined
However I don't think that what is happening happens because of what you think is happening :) Maybe give more details on the problem you have with Ajax.Updater?
Seen the edit. OK, can you also post the actual line of code that calls Ajax.Updater and, also important, explain in detail how the options object you feed to it is made?
Also, please make sure you're doing something like this:
new Ajax.Updater(stuff)
and NOT just:
Ajax.Updater(stuff)
You NEED to create a new Updater object and use "new" (most probably you're already doing that, just making sure).
OK, I'm still not sure what is getting passed to Ajax.Updater, since you extend stuff that I can't see, but try this: remove the "&" from the beginning of the variable ss, and in the options object use parameters: ss instead of postBody: ss.
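Applied to the submit function quoted above, that suggestion would look roughly like this (a sketch only; the ajaxURI/ajaxOptions lookups are kept exactly as in the original):
var ss = 'row=' + (TableKit.getRowIndex(row) + 1) +           // no leading '&' any more
         '&cell=' + (TableKit.getCellIndex(cell) + 1) +
         '&id=' + row.id +
         '&field=' + head.id +
         '&' + Form.serialize(form);
this.ajax = new Ajax.Updater(cell,
    op.ajaxURI || TableKit.option('editAjaxURI', table.id)[0],
    Object.extend(op.ajaxOptions || TableKit.option('editAjaxOptions', table.id)[0], {
        parameters: ss,                                        // instead of postBody
        onComplete: function() {
            var data = TableKit.getCellData(cell);
            data.active = false;
            data.refresh = true;
        }
    }));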
delete obj.property
In this case:
delete obj.toJSONString;
delete obj.parseJSONString;
