We have been building online catalogs for motorcycle parts for our online store.
Originally we did this with a program that turned PDFs into Flash catalogs, but as you know, Flash can't run on a lot of devices, and those devices are now extremely common.
So we have built an alternative in HTML: we exported each page of the original PDFs as images and assembled them into a jQuery flipbook.
http://www.thehogfathermotorcycles.com/catalog/jcata/mag.php?a=cce&b=2014&c=1
Some of these catalogs (like the one above) are over 1,000 pages. I think you can see the problem coming by now... how on earth do we stop all 1,000 images loading at once?
In a perfect world we would load, say, 20, and as the user flips through the catalog, new images would load behind that initial 20.
I'm really stumped by this; how would you do it? We tried lazy loading, but it only works when you scroll vertically, not when you flip pages.
any help would be seriously appreciated.
Simple answer? Don't add thousands of images to an HTML page.
More verbose, helpful answer: you could create a PHP script that paginates the catalogue items, then use AJAX to load the next "page" via pagination links. Destroy the previous page and replace it with the next requested one.
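To sketch what that might look like on the client: the helper below computes a sliding window of page numbers around the page the reader is currently on, and the jQuery wiring swaps image URLs in and out accordingly. The selectors, attribute names, and URL scheme here are illustrative assumptions, not taken from the original catalog.

```javascript
// Pure helper: which page numbers should be loaded when the reader
// is looking at page `current`? Returns a window of `windowSize`
// consecutive pages, clamped to [1, totalPages].
function pageWindow(current, windowSize, totalPages) {
  var half = Math.floor(windowSize / 2);
  var start = Math.max(1, current - half);
  var end = Math.min(totalPages, start + windowSize - 1);
  start = Math.max(1, end - windowSize + 1); // re-anchor near the end
  var pages = [];
  for (var p = start; p <= end; p++) pages.push(p);
  return pages;
}

// Browser-only jQuery wiring (assumed markup: <img data-page="N">).
// Pages inside the window get a real src; pages far from the reader
// have their src removed so the browser can release them.
function refreshImages(current, windowSize, totalPages) {
  var wanted = pageWindow(current, windowSize, totalPages);
  $('img[data-page]').each(function () {
    var p = Number($(this).attr('data-page'));
    if (wanted.indexOf(p) !== -1) {
      // Hypothetical URL scheme for the page scans.
      $(this).attr('src', '/catalog/pages/' + p + '.jpg');
    } else {
      $(this).removeAttr('src');
    }
  });
}
```

You would call `refreshImages(n, 20, 1000)` from the flipbook's page-turn callback, so only around 20 images are ever live at once.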
Related
I am currently building a website, and in an effort to prevent unnecessary data from being loaded I decided to split the website into several divs and load content into them on demand.
Because of this, when I click the back button I don't go to the previous location on the site, but to wherever I was browsing before. Is there a way to solve this without rewriting the entire site? For instance, on my site there is a members page that is loaded via JavaScript using $('#content').load('members.php?id=$id');
One idea would be creating a fake location... index.php#fakelocation (which contains the specific content I just loaded).
Can anyone give me a push in the right direction (or, if this is impossible, I'd like to hear that too)?
I think what you're looking for is a combination of the History API and AJAX.
Lucky for you, there's a great library called PJAX that combines these technologies.
Without knowing more about how your backend works, I can't comment on additional steps to optimize the whole application, but PJAX is friendly with any number of server-side technologies.
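For context, here is roughly what PJAX does under the hood, applied to the members page from the question. The `#content` selector and the `members.php` URL come from the question; the helper names and the rest of the wiring are illustrative assumptions.

```javascript
// Pure helper: the URL we both fetch via AJAX and push into the
// browser history, so the address bar matches the loaded content.
function memberUrl(id) {
  return 'members.php?id=' + encodeURIComponent(id);
}

// Browser-only wiring: load the fragment, then record a real history
// entry so Back/Forward behave normally.
function showMember(id, push) {
  var url = memberUrl(id);
  $('#content').load(url); // the same AJAX load as before
  if (push) {
    history.pushState({ memberId: id }, '', url);
  }
}

function installBackHandler() {
  // When the user presses Back/Forward, re-render the recorded state
  // instead of leaving the page.
  window.addEventListener('popstate', function (e) {
    if (e.state && e.state.memberId) {
      showMember(e.state.memberId, false); // re-render, don't push again
    }
  });
}
```

PJAX packages this pattern up (plus fallbacks for older browsers), which is why it's usually easier than hand-rolling it.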
I'm managing a site that offers online courses, with each class given on an HTML page.
Each class is on its own page, and it is NOT possible to split a class across multiple pages.
My problem is that the browsers grind to a halt on loading the classes.
There are about 2,000 DOM elements, 100 of which are images, about 2 PDF documents (in the 200 KB to 20 MB range), and 1 to 3 YouTube videos.
What can I do to make the site load "progressively"? If I were to print a class (there's a print button), it would take between 20 and 60 pages.
Here (http://i.imgur.com/7VRLlPb.jpg) is a screenshot of one of the entries.
The text is smudged because it is somewhat private.
You can try 'progressive loading' techniques.
http://docforge.com/wiki/Web_application/Progressive_loading
It's not simple to do, however; you need a good grasp of JavaScript to implement it efficiently.
There are lots of libraries available to help, but first get a solid idea of the progressive loading techniques themselves:
http://www.slideshare.net/stoyan/progressive-downloads-and-rendering
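As a concrete example of one common progressive-loading technique: keep heavy media out of the initial HTML (a `data-src` attribute instead of `src`) and fill in the real `src` only when the element scrolls near the viewport. The attribute name, the 400px margin, and the jQuery selectors below are assumptions for illustration.

```javascript
// Pure helper: should an element at `elemTop` be loaded yet, given
// the current scroll position? `margin` loads content slightly
// before it becomes visible, so the user rarely sees a gap.
function shouldLoad(elemTop, scrollTop, viewportHeight, margin) {
  return elemTop < scrollTop + viewportHeight + margin;
}

// Browser-only jQuery wiring: on each scroll, promote any
// <img data-src="..."> that is close to the viewport.
function lazyLoadImages() {
  var scrollTop = $(window).scrollTop();
  var viewportHeight = $(window).height();
  $('img[data-src]').each(function () {
    var $img = $(this);
    if (shouldLoad($img.offset().top, scrollTop, viewportHeight, 400)) {
      $img.attr('src', $img.attr('data-src')).removeAttr('data-src');
    }
  });
}
// Hook it up once in the page:
// $(window).on('scroll', lazyLoadImages); lazyLoadImages();
```

The same idea works for the YouTube embeds: render a thumbnail placeholder and only inject the iframe when it comes into view.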
Still on the same project.
I currently have a theme I'm working on for Tumblr. It's incredibly image-heavy, so my goal is to load the page only once (or a few times) while still being able to reach the next page of posts via pagination. The native pagination requires a full page load to get to older posts. What I believe I'm trying to achieve is loading with AJAX, like this?
(Tumblr also has an established script along these lines, infinite scrolling, which loads more posts as you scroll to the bottom. That convinces me this is possible, but unfortunately, as far as I know, what I want hasn't been built yet.)
So, can anyone help me, even if it's only to point me toward an already established plugin somewhere? I've run into quite the problem, since I know very little about scripting and nothing about AJAX. I'm still quite new at this!
Again, here is the entire code of the theme currently, if that will help you with your answer. (Obviously, my posts render in the "POSTSGOHERE" div.) And, just in case, here is the script for infinite scrolling that definitely works on tumblr.
(No one point out how I haven't chained all my plugins yet, I'm a little cautious since they currently work and I don't want to break anything!)
Thanks again, stackoverflow.
EDIT: To clarify, I don't want infinite scrolling. I want pagination via AJAX, like in the example. Infinite scrolling would also slow down my page, since you can only load so many posts before it harms the speediness of the theme. It's very cumbersome and messy for this page.
Also, guys, I can't comment on your replies for some reason? I'll just edit up here to reply.
Hi
My web site provides instant filtering of articles via JavaScript.
Initially, the 12 freshest article summaries are displayed.
Summaries of ALL articles are put into a JavaScript cache object (rendered by the server in script tags).
When the user clicks on tags, the corresponding article summaries are taken from the JS cache object and inserted into the page as HTML fragments.
Does this have a negative impact on how SEO-friendly my web site is?
The main problem is clear: only 12 "static" URLs are displayed, and the others appear programmatically only on user interaction.
How can I make the site SEO-friendly while keeping this nice filtering feature?
If I add an "all articles" link that loads a separate page with all articles, will that solve the SEO problems?
The way to make this work for search engines, for users who don't have JavaScript, and also in your funky way is to build this feature in stages.
Stage 1: Get a working "Paged" version of this page working, so it shows 12 results and you can click on "next page" and "last page" and maybe even on various page numbers.
Stage 2: Implement the filter using a form-post and have it change the results shown in the page view.
Stage 3: Add JavaScript over the top of the working form and have it display the results the normal post would display. You can also replace the full-page-reload for paging with JavaScript safe in the knowledge that it all works without JavaScript.
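A rough sketch of what stage 3 might look like, assuming a hypothetical `#filter-form` and an `#articles` container (neither name is from the question): the form keeps working without JavaScript, and the script merely intercepts the submit for users who have it.

```javascript
// Pure helper: turn the form's fields into the same query string the
// server-side, no-JavaScript version of the filter already understands.
function filterQuery(params) {
  return Object.keys(params)
    .map(function (k) {
      return encodeURIComponent(k) + '=' + encodeURIComponent(params[k]);
    })
    .join('&');
}

// Browser-only enhancement: intercept the submit and swap in the
// filtered results without a full page reload. If this script never
// runs, the plain form post still works.
function enhanceFilterForm() {
  $('#filter-form').on('submit', function (e) {
    e.preventDefault();
    var qs = filterQuery({ tag: $('#tag').val(), page: 1 });
    // Hypothetical endpoint: the same URL the form posts to.
    $('#articles').load('/articles?' + qs);
  });
}
```

Because the URL and parameters match the non-JavaScript version, crawlers and JavaScript-disabled users see exactly the same content the enhanced page shows.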
Most people use an AJAX request rather than storing an ever-increasing list in a JavaScript array.
Crawlers (or most of them) don't enable JavaScript while crawling, so JavaScript-powered content won't be indexed, and search engines will consider your site smaller than it actually is. This will penalize your pages.
Making a "directory" page can be a solution, but if you do so, search engines will send users to those static pages, not through your JavaScript viewer homepage.
Anyway, I would not recommend making content viewable only via JavaScript:
- it's not SEO friendly;
- it's not friendly to users with JavaScript disabled;
- the history functions (back and forward) can't be used;
- middle-click "open in a new tab/window" won't work;
- you can't bookmark JavaScript-generated content.
So is your nice feature nice enough to lose the points mentioned above? There are ways to reconcile your feature with all those points, but that's far from easy to do.
I have a page which appears to load fully, but I actually have to wait a further 6-10 seconds for things like buttons to become fully functional.
In IE you can still see the browser loading bar at full for this time after the page displays.
Does anyone know why this might be? I stripped out all the javascript and it still does it.
I get that on my pages sometimes when I don't compress my images. Large image files, or other large media, would be the first place I would check.
Something else I normally look at when speeding up a page is the time it takes to fetch external services (feed parsing, etc.), though since you stripped out all your JavaScript, that shouldn't be the problem.
The problem here turned out to be the way the page was rendering from the asp:Repeater.
I created a DataGrid instead, and that seemed to eliminate the problem. No idea why, as I'd expect it to be the other way around if anything.