I'm trying to create a paginated list of repositories where the user clicks a page number and sees 10 items at a time. But the GitHub API returns different results for the same page request (e.g. page=1), so the user sees different results when they go back to a previous page.
This is what my request looks like.
await octokit.request('GET /search/repositories', {
  q: "learn&code in:readme+language:javascript",
  sort: "stars",
  order: "desc",
  per_page: 10,
  page: page
})
I want to programmatically call this API with different page numbers, but I expect to somehow get the same results for the same page number. How do I solve this?
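For reference, calling the same search programmatically for several page numbers might look roughly like the sketch below. The Octokit setup, the token name, and the ES-module/top-level-await context are assumptions; this only illustrates the paging loop, not a fix for the inconsistent ordering.

```javascript
import { Octokit } from "octokit";

// Assumes a personal access token in GITHUB_TOKEN (illustrative name only).
const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });

// Same search as in the question, 10 repositories per page.
async function fetchPage(page) {
  const response = await octokit.request("GET /search/repositories", {
    q: "learn&code in:readme+language:javascript",
    sort: "stars",
    order: "desc",
    per_page: 10,
    page,
  });
  return response.data.items;
}

// Walk through the first three pages one after another.
for (let page = 1; page <= 3; page++) {
  const items = await fetchPage(page);
  console.log(`page ${page}:`, items.map((repo) => repo.full_name));
}
```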
I am starting to build my first app using pokeapi.co. I had an idea to make a drop-down list of all ~1000 Pokémon and wanted to pull the data from the API. When I request data from "https://pokeapi.co/api/v2/pokemon" it does give me an array of Pokémon names, but it is limited to 20. Is there a way I can set the limit to the maximum, or to all? I am also not sure if this is the best way to implement the drop-down menu, so any additional advice or approach is welcome.
For pokeapi.co, calling any API endpoint without a resource ID or name will return a paginated list of available resources for that API. By default, a list "page" will contain up to 20 resources. If you would like to change this just add a 'limit' query parameter to the GET request, e.g. ?limit=1000. Like this:
https://pokeapi.co/api/v2/ability/?limit=1000
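As a rough sketch of how that could feed the drop-down mentioned in the question (the element ID "pokemon-select" and the use of fetch are assumptions, not part of the API):

```javascript
// Fetch up to 1000 Pokémon names in one request and fill a <select> element.
async function loadPokemonList() {
  const response = await fetch("https://pokeapi.co/api/v2/pokemon?limit=1000");
  const data = await response.json();

  // Assumes a <select id="pokemon-select"> exists somewhere on the page.
  const select = document.getElementById("pokemon-select");
  for (const pokemon of data.results) {
    const option = document.createElement("option");
    option.value = pokemon.name;
    option.textContent = pokemon.name;
    select.appendChild(option);
  }
}

loadPokemonList();
```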
In my opinion, requesting 1000 items at a time is not an elegant way to do it, not to mention the long time the user has to wait for the returned data. Instead, I suggest turning it into a table with a search option, splitting it into pages of 20-30 items each, and fetching more data as soon as the user moves to a new page.
So I have a product page where I see several different products that are being generated only when I click on the product. I see the dynamic content being loaded in the XHR tab in Network Console with a single GET Request for every item I click on.
Is there a way to select every product on that page and scrape specific data just like normal? I've scraped websites that don't use AJAX with ease, but I'm pretty stumped here. I was going to scrape the products in order, but when I went to the headers and looked at the request URL, I noticed that the item code for each product is not in order. For example, the first item on the page has a product code of 46837 and the one right below it has a product code of 68392. Or should I just be using Selenium or Splash to capture these AJAX calls?
Thanks!
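If the product detail endpoints really are plain GET requests, as the Network tab suggests, one option is to call them directly instead of driving a browser. A rough fetch-based sketch, where the URL pattern is purely a placeholder and the two product codes are just the examples mentioned above:

```javascript
// Product codes copied from the XHR request URLs seen in the Network tab;
// they are not sequential, so they have to be collected, not guessed.
const productCodes = [46837, 68392];

async function fetchProduct(code) {
  // Placeholder URL -- copy the real "Request URL" from the Network tab.
  const response = await fetch(`https://example.com/api/products/${code}`);
  return response.json();
}

async function scrapeAll() {
  const products = await Promise.all(productCodes.map(fetchProduct));
  console.log(products);
}

scrapeAll();
```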
I have a page with client-side pagination and filtering. The page lists around 200-300 products.
The page contains some filters like Category, Manufacturer and Weight.
When any page number or filter is clicked, I manipulate the page content on the client side using jQuery.
Everything is working fine up to this point. Now there is a use case where I am facing a problem.
Let's say a user comes to our product listing page, clicks on some of the filters and gets a list of products.
Now he clicks on a particular product, which redirects him to the product page to view the details of the product.
But when the user clicks the back button, he gets the page in its initial state without any filter selected.
Is there any way the user can get the page with the previously selected filters when clicking the back button?
You can use some of the following to store data across multiple pages.
Store data in cookies.
Store data in local storage.
Store data in the session on the server.
Make the data part of your URL (use the hash or query string for the filter parameters). Note that changing the query string causes a page reload.
If using cookies, local storage, or hash, you'll need to add JavaScript code to your page that loads and applies the stored data on page load.
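As a rough sketch of the localStorage variant (the storage key, the filter names and the applyFilters function are illustrative assumptions):

```javascript
// Save the current filter selection before navigating away.
function saveFilters(filters) {
  localStorage.setItem("productFilters", JSON.stringify(filters));
}

// On page load, restore and re-apply any stored filters.
window.addEventListener("DOMContentLoaded", () => {
  const stored = localStorage.getItem("productFilters");
  if (stored) {
    applyFilters(JSON.parse(stored)); // assumed existing client-side filter routine
  }
});

// Example usage with made-up values:
// saveFilters({ category: "Sticks", manufacturer: "SomeBrand", weight: "85" });
```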
There are a number of ways to do this:
If you are dealing with HTML5 history and a single-page application, then you are not reloading the page. But based on your question, I assume this is not what you are dealing with.
Store something in the URL. For an example of this, look at the filters on TotalHockey, e.g. http://www.totalhockey.com/Search.aspx?category_2=Sticks%2fComposite%20Sticks&chan_id=1&div_main_desc=Intermediate&category_1=Sticks so when you go backwards, the URL contains the entire state.
Use localStorage, if you have a browser that supports it.
Use cookies with the $.cookie API.
Store it in the session on the server.
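And a rough sketch of the URL approach from the list above, using the hash so the back button restores the previous filter state (the filter names and the applyFilters routine are assumptions):

```javascript
// Write the current filters into the URL hash whenever they change.
function saveFiltersInHash(filters) {
  location.hash = new URLSearchParams(filters).toString();
}

// Read the filters back out of the hash and re-apply them.
function restoreFiltersFromHash() {
  if (!location.hash) return;
  const params = new URLSearchParams(location.hash.slice(1));
  applyFilters(Object.fromEntries(params)); // assumed existing filter routine
}

// Runs on initial load and whenever the user navigates back/forward.
window.addEventListener("DOMContentLoaded", restoreFiltersFromHash);
window.addEventListener("hashchange", restoreFiltersFromHash);

// Example usage with made-up values:
// saveFiltersInHash({ category: "Sticks", weight: "85" });
```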
You can store the search filter data in the session just after the filter form is submitted, and on each AJAX request (loading your product listing) you can check the search filter inputs stored in the session and show the data accordingly. If the search session is empty, show the whole listing.
You can also store the full AJAX request URL (if the GET method is used) in the session after searching the records, and hit that particular URL again after coming back from the product detail page.
I have a PHP page which displays the result of a particular solution submitted by the user. The page displays a "running" image while the solution is checked in the back end, and when the checking is finished the result is stored in the database. Once the result is stored in the database, the same PHP page which was displaying the "running" image should display the image and result that it got from the database, so it needs to refresh repeatedly to fetch the result from the database.
I have used an iframe for that part of the page and passed the solution id using SESSION to the page the iframe loads, and that page fetches the data from the database. The problem is that, because of the refreshing, when a different solution (with a different solution id) is submitted in another browser tab, both the previous and the current tab show the current page, since the solution id is passed using the SESSION variable. I tried Ajax also but did not get the desired result.
I want each browser tab to display the result of the solution which was submitted in that tab only. How can I do this? I searched the net but did not find anything useful.
Basically, your problem is that:
you allow the same session in two different browser tabs (or windows), so the session ID is not unique between solutions, yet
you store the solution id in the session, so the session ID would have to be unique between solutions.
You can work around your problem by removing either of these two conditions: either do not allow session re-use (a bad idea IMHO), or use the solution id rather than the session id in your AJAX call for the iframe refresh, e.g. as a GET parameter.
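A rough sketch of the second option, polling with the solution id as a GET parameter instead of relying on the shared session (the endpoint name checkResult.php, the data attribute, and the response shape are all assumptions):

```javascript
// solutionId would be rendered into the page for the solution submitted in
// THIS tab, e.g. echoed from PHP into a data attribute on <body>.
const solutionId = document.body.dataset.solutionId;

async function pollResult() {
  // Pass the id explicitly as a GET parameter, not via the session.
  const response = await fetch(`checkResult.php?solution_id=${solutionId}`);
  const result = await response.json(); // assumed shape: { done: boolean, html: string }

  if (result.done) {
    document.getElementById("result").innerHTML = result.html;
  } else {
    setTimeout(pollResult, 2000); // check again in 2 seconds
  }
}

pollResult();
```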
I am working on a big site, and the site has a search module. Searching is done using a lot of user-submitted values, so for pagination I must pass all this data to the next page, and appending the values to the URL makes the URL very long.
So how can I solve this issue? I am planning to use a JavaScript-based page submission (POST) with all the values in hidden fields, then read all the values on the next page.
Will it cause any problems? Or should I use the database to keep the search criteria?
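For reference, a minimal sketch of the JavaScript hidden-field POST described above (the target URL and field names are illustrative assumptions):

```javascript
// Build a form with one hidden field per search value and submit it via POST.
function postSearchCriteria(criteria, targetUrl) {
  const form = document.createElement("form");
  form.method = "POST";
  form.action = targetUrl; // placeholder, e.g. "results.php"

  for (const [name, value] of Object.entries(criteria)) {
    const input = document.createElement("input");
    input.type = "hidden";
    input.name = name;
    input.value = value;
    form.appendChild(input);
  }

  document.body.appendChild(form);
  form.submit();
}

// Example usage with made-up criteria:
// postSearchCriteria({ keyword: "laptop", minPrice: "100", page: "2" }, "results.php");
```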
I would create a server-side object, possibly with a database backend, which is updated by the different pages.
It is, in my opinion, the clearest and easiest solution. Passing parameters from page to page, whether by POST, JavaScript or cookies, will work too, but in my experience it is more of a quirk.
Also, if a search query is so complex that it needs multiple pages to create it, it might be helpful for the user to have all the data stored on the server, so he can change it more easily by switching back and forth between the different pages.
I would store all the search criteria in some kind of session store on the server when the initial search is triggered.
For pagination I would retrieve the criteria from the session store and then just show the appropriate results. I would also append some kind of key to the pagination links (so this would be the only hidden POST field) under which the search criteria can be found.
Even though the session is per user, you might have several search windows open within the same session, and you don't want to mess them up with the pagination.
In order to make a reliable search with pagination, we need to do a bit more than normal.
We need to handle the following cases.
Once the search is done, the user may go back and forward with the browser. Here, if you are doing a form submission on every page, it would be an overload. Also, if the user presses the browser refresh button, it will unnecessarily warn them that data is being resubmitted.
Searching on a large database with lots of criteria is costly. Hence, optimization is important.
So you should do the following:
Not submit data on every page change.
Not store data in a cookie. (This is not secure and not even reliable.)
For a large database with a complex query, cache the result in the session.
In case you need very up-to-date, real-time results, ignore point (3) and try doing a partial search for every page.
Thus, for your case, you can do the following:
When the user searches for the first time, make the form POST its data to a search page.
This search page will store the search query in the session and generate a unique id for it.
Now render the result page. The result page will be passed the search id (generated in point 2) and the page number, e.g. result.aspx?searchId=5372947645&page=2
The result page will pick up the query from the session using the searchId and then provide results based on the page number sent.
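A rough sketch of that flow, written here with Express and express-session purely for illustration (the original example URL is ASP.NET; the route names, the session shape and runSearch are assumptions):

```javascript
const express = require("express");
const session = require("express-session");
const crypto = require("crypto");

const app = express();
app.use(express.urlencoded({ extended: true }));
app.use(session({ secret: "change-me", resave: false, saveUninitialized: false }));

// Steps 1-2: the search form POSTs here; the criteria are stored in the
// session under a freshly generated searchId.
app.post("/search", (req, res) => {
  const searchId = crypto.randomUUID();
  req.session.searches = req.session.searches || {};
  req.session.searches[searchId] = req.body; // the submitted search criteria
  res.redirect(`/results?searchId=${searchId}&page=1`);
});

// Steps 3-4: the result page picks the query back up from the session by
// searchId and serves the requested page of results.
app.get("/results", (req, res) => {
  const { searchId, page } = req.query;
  const criteria = (req.session.searches || {})[searchId];
  if (!criteria) return res.status(404).send("Unknown or expired search");

  // runSearch is a placeholder for the real paginated database query.
  const results = runSearch(criteria, Number(page) || 1);
  res.json({ searchId, page: Number(page) || 1, results });
});

app.listen(3000);
```

Keying the stored criteria by searchId rather than putting them directly on the session also keeps several search windows within the same session from interfering with each other, as noted earlier.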
Using hidden fields and the POST method should be fine too, as long as you are able to read them correctly on the next page.
To supplement Sarfraz's answer...
It's not necessary to use JavaScript to make a POST.
<form action="destination_url" method="POST">
...
</form>