In Gmail, if you check email 4, then move to a different set of 50 or 25 records and check selection 26, both 4 and 26 stay selected as you move back and forth.
How does Google do this?
Would it be possible to do something like this on a page that fetches only 50 records, and when NEXT is clicked, goes back to the DB for the next set of 50 records?
You don't technically change pages; it's all the same page, and the content is changed dynamically with JavaScript.
Take a close look at the URL: only the hash part of it changes, which means you don't really load new pages when you click things in Gmail. The elements shown to you are simply swapped with JavaScript.
Something similar could be done across real page loads if you use localStorage or sessionStorage.
You could do the page you're describing with Ajax techniques.
The inner pages are most likely loaded using AJAX. Much like iframes, you monitor the links that are clicked and load only the inner part of what you're after, so that you aren't loading things twice...
The selections could be saved in JavaScript or cookies... I would probably store them as a JavaScript array of selected checkboxes personally, depending on how much load you're already putting on the user.
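A minimal sketch of that idea, assuming the list is re-rendered by an Ajax page switch, each checkbox has the class row-check and a data-id attribute, and sessionStorage holds the selections so they survive page switches (all of these names are made up for the example):
// Remember checked rows across Ajax "pages" using sessionStorage.
function loadSelections() {
  return JSON.parse(sessionStorage.getItem('selectedIds') || '[]');
}
function saveSelections(ids) {
  sessionStorage.setItem('selectedIds', JSON.stringify(ids));
}
// Call this after every Ajax render to restore the check marks for the current page.
function restoreChecks() {
  var selected = loadSelections();
  document.querySelectorAll('.row-check').forEach(function (box) {
    box.checked = selected.indexOf(box.dataset.id) !== -1;
  });
}
// Keep the stored list in sync whenever a checkbox is toggled.
document.addEventListener('change', function (e) {
  if (!e.target.classList.contains('row-check')) return;
  var selected = loadSelections().filter(function (id) { return id !== e.target.dataset.id; });
  if (e.target.checked) selected.push(e.target.dataset.id);
  saveSelections(selected);
});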
I need to store some dynamic variables in a JSON file (I was told this would work) to be able to load them on multiple pages. For example, a customer should be able to select one thing on one page (a highlighted div), then go to another page, select some items there, and be able to go back to the first page with that div still selected (variables remembered). I tried to Google and search here but found nothing matching my description. Please help me!
Try using localStorage: store the values the user selects in localStorage and check for them on page ready or load. If a value exists in localStorage, show that value or highlight the div as your functionality requires; if localStorage doesn't have a value, treat it as the user's first visit to the page.
But be careful with localStorage: you have to clear it when you're done with it, otherwise stale data keeps piling up.
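A minimal sketch of that approach, assuming the selectable divs carry a class of selectable and a unique data-key attribute (both names are made up for the example):
// On every page load, re-apply the highlight the user picked earlier.
document.addEventListener('DOMContentLoaded', function () {
  var savedKey = localStorage.getItem('highlightedDiv');
  if (savedKey) {
    var saved = document.querySelector('.selectable[data-key="' + savedKey + '"]');
    if (saved) saved.classList.add('highlighted');
  }
  // When a div is clicked, move the highlight and remember the choice.
  document.querySelectorAll('.selectable').forEach(function (div) {
    div.addEventListener('click', function () {
      var old = document.querySelector('.selectable.highlighted');
      if (old) old.classList.remove('highlighted');
      div.classList.add('highlighted');
      localStorage.setItem('highlightedDiv', div.dataset.key);
    });
  });
});
// When the selection is no longer needed, clear it to avoid stale data:
// localStorage.removeItem('highlightedDiv');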
I need some help.
I have an AngularJS application. It has two tabs: the first is for viewing items and the second is for adding items. I want the viewing tab to refresh once I add new items from the second tab. I know how to refresh a page with
window.location.reload();
but I would like to know how I can refresh the other page as well when a function is called. Is there a way to refresh all HTML views with a click?
Thank you.
Not sure I fully understood, but it seems like you have two browser tabs (or windows) open and you want one to affect the other.
Since they're separate, they're also running two different app instances, meaning that stuff like $scope.$apply() just won't affect the other one. Basically, you need to switch from "communication across the app" to "communication across the domain". To achieve that you can use localStorage (or a simple cookie, database, file, etc.).
The idea is that one view would check for a certain session variable (say, every two seconds) and, if that variable is set, force an update and clear it.
The other view, of course, would be in charge of setting that variable to the session once an item is added.
That's probably the simplest approach.
If you want instant changes and constant communication, check out WebSockets, but I think that might be overkill for this.
See this simple example:
VIEW: http://jsfiddle.net/1dshqpay/
ADD: http://jsfiddle.net/u7n46Lsk/
They are two completely separate apps/URLs, but adding an item from ADD will show up in VIEW.
(I couldn't create it here as two snippets because they're sandboxed and don't support localStorage.)
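For reference, a stripped-down sketch of the same polling idea outside the fiddles; the itemsChanged key name and the reload-based refresh are assumptions for the example:
// ADD view: raise a flag once an item has been added successfully.
function notifyItemAdded() {
  localStorage.setItem('itemsChanged', Date.now().toString());
}
// VIEW view: poll for the flag every two seconds; when it is set,
// clear it and refresh the list (here by simply reloading the page).
setInterval(function () {
  if (localStorage.getItem('itemsChanged')) {
    localStorage.removeItem('itemsChanged');
    window.location.reload();
  }
}, 2000);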
I am currently doing some work on a research database where they have decided that they want to be able to share links to articles from the site on social networks (Facebook, Twitter, LinkedIn and Google+).
Preferably this should be done through the share buttons provided by the respective networks. I quickly got the buttons working and displayed correctly on the site by following the implementation instructions from each network.
My problem is a consequence of the site offering the possibility of showing 1000 (1K) posts on a single search-result page. This means that when such a page is created, it needs to create 1000 share buttons for each social network (4000 in total).
Sadly, this seems to overwhelm the browser: it offers to stop the JavaScript provided by the social networks, and whether you choose to stop it or not, the page ends up deadlocked waiting for a response from the social networks and never finishes loading.
My guess is that the large number of asynchronous requests means the browser somehow misses some of the responses and thus ends up waiting forever for responses that will never come.
As mentioned, it is only a problem with such a large number of posts; if a page displays, for example, 100 posts (400 share buttons), it works perfectly.
While it could be argued that 1000 posts on a single page is overkill, limiting the maximum number of displayed posts is sadly not an option.
My question, therefore, is whether any of you know of a way to solve this kind of problem, or if my only real option is to create custom share buttons that don't need to be created through the JavaScript provided by the social networks?
The following references lead to the documentation for each of the share buttons.
Twitter
Facebook
LinkedIn
Google+
For all these buttons, there is a main JS file that does the heavy lifting.
So, for LinkedIn, add the script tag:
<script src="//platform.linkedin.com/in.js" type="text/javascript"></script>
once in the page. Then use the script below as a placeholder for your LinkedIn button wherever you need it (don't forget to replace the data-url attribute in the script below).
<script type="IN/Share" data-url="http://developer.linkedin.com/plugins/share-plugin-generator" data-counter="top"></script>
Similarly for Twitter, the script tag below needs to be added once in the page; its job is to fetch the main JS file and add it to the page.
<script>!function(d,s,id){var js,fjs=d.getElementsByTagName(s)[0];if(!d.getElementById(id)){js=d.createElement(s);js.id=id;js.src="//platform.twitter.com/widgets.js";fjs.parentNode.insertBefore(js,fjs);}}(document,"script","twitter-wjs");</script>
The anchor below can be added as many times as you want, wherever you want. Replace the data-url attribute with the URL that should be tweeted when the button is clicked.
<a href="https://twitter.com/share" class="twitter-share-button" data-url="http://example.com/your-page">Tweet</a>
When you get the code for the FB or Google+ Like button, you will similarly get a script that needs to be added once, plus the markup to be added wherever you need it.
EDIT:
Based on your comment below: the scripts will certainly cause issues, because they need to convert each and every placeholder into a good-looking 'like' button. Below are a few ways to improve the performance:
Run these scripts only on page load (i.e., add the main scripts at load time).
Using setTimeout or setInterval, work on 100 placeholders at a time (requires changes to the main scripts).
Lazy-load the initialization of the like buttons: initialize them only when the user scrolls and they come into view (requires changes to the main scripts).
Recommended approach: keep just one set of like buttons. When the user hovers over a search result, move this set of buttons into that div and change the URL-related attributes on the buttons. This way only one set of buttons is ever shown, and initializing it takes no time at all; a sketch of this follows below.
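A rough jQuery sketch of that recommended approach, assuming each search result has a class of result and a data-url attribute, an empty div with id share-holder exists somewhere on the page, and that the LinkedIn and Twitter loader scripts expose IN.parse() and twttr.widgets.load() for re-scanning dynamically added placeholders (all of these names and hooks are assumptions, not guarantees):
// Move the single set of share buttons into whichever result is hovered.
$(document).on('mouseenter', '.result', function () {
  var url = $(this).data('url');
  var holder = $('#share-holder').appendTo(this).empty()[0];

  // LinkedIn placeholder for this result's URL.
  var li = document.createElement('script');
  li.type = 'IN/Share';
  li.setAttribute('data-url', url);
  holder.appendChild(li);

  // Twitter placeholder (the standard share anchor) for this result's URL.
  var tw = document.createElement('a');
  tw.href = 'https://twitter.com/share';
  tw.className = 'twitter-share-button';
  tw.setAttribute('data-url', url);
  tw.textContent = 'Tweet';
  holder.appendChild(tw);

  // Ask each network's script to (re)scan just this container.
  if (window.IN && IN.parse) IN.parse(holder);
  if (window.twttr && twttr.widgets) twttr.widgets.load(holder);
});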
What is the best way to do pagination? I also need to save the current page, so that when I click a link it remembers the page I was on. So if I'm on page 2 of the pagination, click a link, and then come back to the pagination page, it should remember that I was on page 2.
I get the results/data from a JSON request that supports an offset and a limit.
$.getJSON(base_url+'/ajax/get_news/4/!OFFSET!/!LIMIT!/true', func...
where !LIMIT! is how many results it shows and !OFFSET! is, well, the offset :D When I click a link, it makes that request, goes through the results, and appends them to the page.
What is the best way to save the page, cookies? Should I fetch all the results and then paginate them somehow, or make a new request whenever the user changes page?
Some tutorial or "hands-on" example would be awesome. Normal instructions/guides are difficult to understand since my first language isn't English.
It appears you have two questions:
1) How to save page state (what page you are on): if the application must continue to use Ajax, then you should look at storing the state in the URL, as described here:
http://ajaxpatterns.org/Unique_URLs
2) Regarding where to do the pagination, I think it depends on the size of the data to paginate. If it is small and you are not worried about the data changing between page views, do it all in JavaScript. Otherwise, do it server-side.
Okay, so I should use !OFFSET! and !LIMIT! to do the pagination. I believe I just need to change those numbers from the pagination links (pages 1 2 3 4, etc.) to get it working. But I don't know where to start :/
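As a possible starting point, here is a rough sketch that keeps the current page in the URL hash and requests one page at a time through the get_news endpoint from the question; the LIMIT value, the #page- hash format, the .page-link markup, and the renderNews() helper are all assumptions for the example:
var LIMIT = 20; // assumed page size

// Read the page number back out of the hash, e.g. "#page-2" -> 2.
function currentPage() {
  var match = window.location.hash.match(/^#page-(\d+)$/);
  return match ? parseInt(match[1], 10) : 1;
}

// Fetch and render one page, and record it in the hash so it is remembered.
function loadPage(page) {
  window.location.hash = '#page-' + page;
  var offset = (page - 1) * LIMIT;
  $.getJSON(base_url + '/ajax/get_news/4/' + offset + '/' + LIMIT + '/true', function (data) {
    renderNews(data); // hypothetical helper that appends the results to the page
  });
}

// Restore the remembered page on load (e.g. when coming back from an article).
$(function () {
  loadPage(currentPage());
});

// Pagination links such as <a href="#" class="page-link" data-page="2">2</a>
$(document).on('click', '.page-link', function (e) {
  e.preventDefault();
  loadPage(parseInt($(this).data('page'), 10));
});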
I am looking for a way to display a list of websites one at a time from a URL list. I'm fine with a very manual solution. I found an AJAX solution where each "page" is displayed in a tab, but it is very heavy: if I have 50 pages that I want users to page through one at a time, this solution essentially pulls all 50 pages onto the one page. Do you know of a framework that does the same thing but only loads one page at a time? Thank you very much for the advice and help. Here is the site I found - http://css-tricks.com/jquery-ui-tabs-with-nextprevious/
You could load the URLs into an array and then create a 'next' button that loads the next URL into a div, replacing the previous one.
Do you require doing this with JavaScript?
It might be easier to curl the pages using PHP, then echo the returned data into the HTML as an array. Then let the user change which part of the returned array they are looking at using next and prev buttons.
If you pre-load each one it will be heavy, as you have noted.
This idea is screaming for AJAX. With proper AJAX calls, you would only load a page once it has actually been selected by its tab, and any page previously loaded into the area would be dumped. You shouldn't actually need to physically switch tabs if you're using the src attribute of an iframe: simply changing the src and forcing it to refresh should do the trick. If you are performing a screen scrape through a remote web service, then you could simply use jQuery/AJAX to rewrite the innerHTML of the panel in question.
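As an illustration of the iframe variant, a minimal sketch that only ever loads the currently selected URL; the viewer, prev, and next element IDs and the example URLs are placeholders:
// Assumes <iframe id="viewer"></iframe> plus #prev and #next buttons in the page.
var urls = [
  'http://example.com/page1',
  'http://example.com/page2',
  'http://example.com/page3'
];
var index = 0;

function show(i) {
  index = Math.max(0, Math.min(urls.length - 1, i));   // clamp to the list bounds
  document.getElementById('viewer').src = urls[index]; // loads just this one page
}

document.getElementById('prev').addEventListener('click', function () { show(index - 1); });
document.getElementById('next').addEventListener('click', function () { show(index + 1); });

show(0); // start with the first URL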