I built a search engine based on Elasticsearch and connected it to a web page using AJAX and jQuery. When someone enters a search, all matching results are displayed on a single web page if there are fewer than 30 of them. In most cases I know the total number of matches is more than 600, but the web page only displays 30 results.
In the console it shows this:
data: Object
  hits: Object
    hits: Array[10]
    max_score: 1.2333
    total: 650
Here the total number of matching results is 650, but the web page displays only 30.
How can I implement pagination? The code I use to display the results on the web page is:
$.ajax({
    url: '/elastic/',
    type: 'GET',
    data: {"data": text},
    success: function (response) {
        $('.pagination').remove();
        data = JSON.parse(response);
        console.log(data);
        for (var hit in data.data.hits.hits) {
            var source = data.data.hits.hits[hit]._source;
            $('.div').append(source.user_name + ' / ' +
                source.name + '<br/>');
        }
    }
});
searchText = text;
This is where I need to implement pagination. I tried to find a relevant pagination example here but didn't find anything, and the number of results is different for every search: sometimes there are 500 matching results and sometimes 10.
Can someone please give me a hint, some guidance, or an example of how pagination can be implemented? I have been trying for the last few days.
You can use the from and size parameters (as query parameters or as part of the request data). Alternatively, there is also the Scroll API; you can read about it at https://www.elastic.co/guide/en/elasticsearch/reference/2.4/search-request-scroll.html
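For example, here is a minimal client-side sketch of from/size paging, assuming your /elastic/ endpoint forwards the extra from and size parameters on to Elasticsearch (the page size of 30 and the loadPage helper are illustrative names, not part of your code):

var PAGE_SIZE = 30;

function loadPage(text, page) {
    $.ajax({
        url: '/elastic/',
        type: 'GET',
        // "from" and "size" are assumed to be passed through to Elasticsearch by the backend
        data: { data: text, from: page * PAGE_SIZE, size: PAGE_SIZE },
        success: function (response) {
            var data = JSON.parse(response);
            $('.div').empty();
            for (var hit in data.data.hits.hits) {
                var source = data.data.hits.hits[hit]._source;
                $('.div').append(source.user_name + ' / ' + source.name + '<br/>');
            }
            // data.data.hits.total gives the total match count, so the number of
            // pages is Math.ceil(data.data.hits.total / PAGE_SIZE)
        }
    });
}

loadPage(text, 0); // first page; loadPage(text, 1) fetches results 30-59, and so on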
I'm working on a project for freeCodeCamp and I've been stuck on this part all day. I'm pulling data from the Wikipedia API and I'm able to work with it, yet I'm not sure what the syntax should look like for what I'm trying to achieve. Here is a link to an example of the data I'm working with: Wikipedia API Search.
Now, in my HTML I have a Bootstrap modal that appears after the user submits the form, with a list group inside that holds the data returned from the search.
This is the code I have so far:
$(document).ready(function () {
    $('#searchForm').on('submit', function (e) {
        e.preventDefault();
        $('#wikiSearch').modal('show');
        var usersearch = document.getElementById('userinput').value;
        var apiURL = "https://en.wikipedia.org/w/api.php?action=opensearch&search="
            + usersearch + "&format=json&callback=?";
        $.ajax({
            url: apiURL,
            contentType: "application/json; charset=utf-8",
            dataType: 'json',
            type: 'GET',
            success: function (data) {
                data[1].forEach(function (item) {
                    $('#results').append("<tr><td><a href='#'>" + item + "</a></td></tr>");
                });
                data[2].forEach(function (item) {
                    $('#brief').append("<tr><td>" + item + "</td></tr>");
                });
            }
        });
    });
});
For each group in my modal I want to display one result from the search. I figured I would be able to use a nested forEach, but it's not returning the results I wanted. I've tried using map and also tried writing a long nested for loop, and I feel like I might be doing more harm than good when it comes to learning since I'm only getting more confused now. Thanks for any input.
To show only the first row from the search results, use the first index of the nested array:
$('#results').append("<tr><td><a href='#'>" + data[1][0] + "</a></td></tr>")
$('#brief').append("<tr><td>" + data[2][0] + "</td></tr>")
Don't forget to add a check for the possible undefined value inside the data array.
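For example, a minimal sketch of that check, using the same response layout as above (data[1] holds the titles and data[2] the descriptions):

// guard against missing entries before touching the DOM
if (data[1] && data[1][0] !== undefined) {
    $('#results').append("<tr><td><a href='#'>" + data[1][0] + "</a></td></tr>");
}
if (data[2] && data[2][0] !== undefined) {
    $('#brief').append("<tr><td>" + data[2][0] + "</td></tr>");
}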
Objective: Parse JSON from an API where results are listed across multiple pages.
I am new to JSON and to working with data in general. I want to know how to write a function that updates the URL, outputs the results for each page, and stops when it reaches a page that is empty.
The problem is based on a Shopify URL that serves JSON data for practice purposes; it is not part of a real application.
https://shopicruit.myshopify.com/admin/orders.json?page=1&access_token=c32313df0d0ef512ca64d5b336a0d7c6
Each page has 50 objects. I'm making a $.ajax request to the URL, but the URL has page=1 as a query parameter,
$.ajax({
    url: "https://shopicruit.myshopify.com/admin/orders.json?page=1&access_token=c32313df0d0ef512ca64d5b336a0d7c6",
    method: 'get',
    dataType: 'JSON'
}).done(function (response) {
    // ...
});
so the response I get back only contains the results for page one (obviously). I know there are more pages because if I manually put a 2 in place of the 1 I can see different data, and this goes on for multiple pages. I have tried removing the page parameter and setting it to all and any, but these just display page 1. I thought leaving the page parameter out would cure the problem, but it does not.
How do I get all the pages of data with an ajax call?
Is it a function that wraps the $.ajax call, increments page, and makes a new call for each page? I still don't know how to write that, sadly.
The Shopify API docs do give some examples of how to display "all data", but I tried what they suggested and it did not work, so I'm not sure it's applicable to the problem. Just in case it is:
https://help.shopify.com/api/reference/order
Here is a simplistic answer: it will keep fetching pages until there is clearly no more data, i.e. once a page returns fewer than limit orders.
function getAllData(page) {
    return $.ajax({
        url: "https://shopicruit.myshopify.com/admin/orders.json?page=" + (page || 1) + "&limit=250&access_token=c32313df0d0ef512ca64d5b336a0d7c6",
        method: 'get',
        dataType: 'JSON'
    }).then(function (response) {
        if (page && response.orders.length == 250) {
            return getAllData(page + 1).then(function (more) {
                return response.orders.concat(more);
            });
        }
        return response.orders;
    });
}

getAllData(1).then(function (orders) {
    // orders is an array of orders
});
Note that I've used limit=250 to fetch 250 orders at a time.
I say this is simplistic because, while it does get all the data, you have to wait until everything is retrieved before you can use any of it. That may take too long for your user experience, but it should get you to a place where you can start.
There's logic in the code such that if page is falsy (0 or omitted), only the first page is retrieved regardless of how many items it contains, so you could do something like:
getAllData().then(function (page1data) {
    // do something with page 1
}).then(function () {
    return getAllData(2); // start at page 2
}).then(function (restOfData) {
    // do something with restOfData, pages 2+
});
One thing I'm not sure of is

.then(function(response){

You may need to change this to

.then(function(r){
    var response = r[0];

because I'm not 100% certain of jQuery's .then callback arguments.
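For what it's worth, a quick sketch of how the arguments arrive for a single $.ajax call (to the best of my knowledge the parsed data comes first, so the code above should work unchanged; the array wrapping only happens with things like $.when over several requests):

$.ajax({ url: "/some/endpoint", dataType: "json" }).then(function (data, textStatus, jqXHR) {
    // data is already the parsed JSON body, not an array wrapper
    console.log(textStatus); // "success"
});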
I am trying to update values on my page when a user selects what they want to filter, but I do not want to refresh the web page constantly. As an example, think of a real estate website where you filter by location and the types of housing come back with a count (e.g., apartment [4], townhouse [0], studio [5]). The types of housing are always there, but it's the numbers I am interested in updating: whenever you change the filter, new numbers pop up. What I am doing is filtering questions based on topic, subject, etc.
Is there any way to do this in Node.js? What I have working so far requires the page to refresh.
$.ajax({
    type: "POST",
    url: "/user/calculatequestions",
    data: {
        // filter data here...
    },
    success: function () { },
    error: function () { }
});
The '/user/calculatequestions' goes through an app.post and renders a new page with new variables.
Thanks in advance,
S
You can get the values of the filter through JavaScript and send the filter data through Ajax.
Example of input:
<input id="field" name="field1" type="text" >
JavaScript function (together with the Ajax call):
var data = {};
data.filter_data1 = document.getElementById('field').value;
$.ajax({
    type: "POST",
    url: "/user/calculatequestions",
    data: data,
    success: function () { },
    error: function () { }
});
Then, in the Node handler for /user/calculatequestions you can get the filter parameter with:
var filter_data = req.body.filter_data1;
The success callback receives the data after Node has processed it; then you update the components (inputs, tables, lists, etc.) in the front end.
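As a rough sketch of that flow (assuming an Express-style app and a hypothetical countQuestions helper that runs your filter query; the element IDs are illustrative too), the handler sends JSON back instead of rendering a page, and the success callback writes the counts into the DOM:

// server side (Express): respond with JSON instead of res.render()
app.post('/user/calculatequestions', function (req, res) {
    var filter = req.body.filter_data1;
    countQuestions(filter, function (err, counts) { // countQuestions is a hypothetical helper
        if (err) { return res.status(500).json({ error: 'query failed' }); }
        res.json(counts); // e.g. { apartment: 4, townhouse: 0, studio: 5 }
    });
});

// client side: update the numbers without reloading the page
$.ajax({
    type: "POST",
    url: "/user/calculatequestions",
    data: { filter_data1: document.getElementById('field').value },
    success: function (counts) {
        $('#apartment-count').text(counts.apartment);
        $('#townhouse-count').text(counts.townhouse);
        $('#studio-count').text(counts.studio);
    },
    error: function () { }
});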
See this question please: How to refresh table data using Ajax, Json and Node.js
setInterval(function () {
    $("ul").empty();
    var tag = $("#add").data('tag'), maxid = $("#add").data('maxid');
    $.ajax({
        type: 'GET',
        url: 'ajax.php',
        data: {
            tag: tag,
            max_id: maxid
        },
        dataType: 'json',
        cache: false,
        success: function (data) {
            // Output data
            $.each(data.images, function (i, src) {
                $('ul#photos').append('<li><img src="' + src + '"></li>');
            });
            // Store new maxid
            $('#add').data('maxid', data.next_id);
        }
    });
}, 20000);
I'm trying to load one picture at a time at an interval of 20 seconds. However, for a certain hashtag with only 27 photos, it loads well until the last 20, which all load at once even though I'm limiting it to just one. It's always the last 20.
How do I load them one at a time for the last 20?
It's difficult to say exactly without seeing your PHP script, but what I can say is that you are iterating over the array of returned photos (using $.each) and appending every photo from that array to your DOM.
So one option is: don't iterate over the array at all and just access its first index (data.images[0]). If you can't figure out why your server-side code is returning more photos than you want (which you should investigate), grab all the photos in one request and set a timer that adds one of the returned photos every 20 seconds. That would also mean fewer network requests, so it may be a good solution for you; a sketch of that approach is below.
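A minimal sketch of that second approach, assuming the same ajax.php response shape (data.images and data.next_id) as in the question:

// fetch the whole batch once, then drip one image into the DOM every 20 seconds
function fetchAndQueuePhotos() {
    var tag = $("#add").data('tag'), maxid = $("#add").data('maxid');
    $.ajax({
        type: 'GET',
        url: 'ajax.php',
        data: { tag: tag, max_id: maxid },
        dataType: 'json',
        cache: false,
        success: function (data) {
            var queue = data.images.slice(); // copy so we can shift() from it
            $('#add').data('maxid', data.next_id);
            var timer = setInterval(function () {
                if (queue.length === 0) { clearInterval(timer); return; }
                $('ul#photos').append('<li><img src="' + queue.shift() + '"></li>');
            }, 20000);
        }
    });
}

fetchAndQueuePhotos();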
If you do want to make up to 20 ajax requests (probably not an optimal solution) and you are still getting more images back than you want, then your PHP needs to be investigated, and right now you're only showing us the client-side code.
I'm making a messaging system and I am currently reloading the content of the div holding the messages every 10 seconds using jQuery's .load(), but I have a problem: when I add a "Select all" button, a "Delete selected" button, etc., the reload every 10 seconds replaces both the buttons and the messages, so any selected messages get deselected.
What I would like to know is how to load in only the new messages rather than reloading the whole div. I know Gmail does not reload the whole div, because it works properly there.
This is my JavaScript function that reloads the div and changes the page title (which contains the inbox count) so it stays updated:
function title() {
    setTimeout("document.title = $('#heading').text();", 500);
}

function ajaxStuff() {
    setTimeout("$('#heading').load('/employee/message/inbox.php #h1_head'); $('#messages').load('/employee/message/inbox.php #messages_inner'); title(); ajaxStuff();", 10000);
}

ajaxStuff();
Here is how I have the inbox set up:
Basically what I want to do is load in new messages with AJAX but somehow not refresh the div. I tried looking at Gmail's source but there's too much to go through and they make it confusing with a bunch of random classes and IDs.
Note: I have searched this on Google for a while now and did not find anything.
In response to comments:
I don't think a tutorial is warranted here. Change your server code to return the "new" messages with a class="new" attribute, then use:
$.ajax({
    url: "/employee/message/inbox.php",
    success: function (result) {
        $(result).find(".new").prependTo("#heading");
    }
});
Of course, that code may need some modifications to fit your environment/return data.
When checking for new messages, send the ID of the newest message you already have in your request. Then your PHP will return only the messages newer than that, which you append to your existing data.
jQuery.ajax({
    type: 'get',
    dataType: 'text',
    url: "/employee/message/inbox.php",
    data: {
        from_user: from_user,
        to_user: to_user,
        message_id: message_id,
        something_else_you_need_to_send: its_value,
        t: Math.random()
    },
    success: function (data, textStatus) {
        // whatever you need to do with the result returned from php (server)
    }
});
Then in your SQL query you do:
select * from table
where user_id = user_id_from_ajax
and message_id > message_id_from_ajax
Update: in your PHP you use
$from_user = $_REQUEST['from_user'];
$to_user = $_REQUEST['to_user'];
$message_id = $_REQUEST['message_id'];
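To tie it together, here is a rough sketch of the client side of that idea (the #messages container, the data-latest-id attribute, and the JSON response shape are assumptions for illustration; the snippet above uses dataType: 'text', so you would either switch to 'json' as here or parse the text yourself):

function pollNewMessages() {
    var latestId = $('#messages').data('latest-id') || 0; // newest message we already show
    jQuery.ajax({
        type: 'get',
        dataType: 'json',
        url: '/employee/message/inbox.php',
        data: { message_id: latestId, t: Math.random() },
        success: function (messages) {
            // append only the new rows; nothing existing is touched,
            // so checkboxes and selections survive the update
            $.each(messages, function (i, msg) {
                $('#messages').append('<div class="message" data-id="' + msg.id + '">' + msg.text + '</div>'); // escape msg.text in real code
                latestId = Math.max(latestId, msg.id);
            });
            $('#messages').data('latest-id', latestId);
        }
    });
}

setInterval(pollNewMessages, 10000);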