I'm managing a site that offers online courses, with classes given on HTML pages.
Each class is on its own page, and it is NOT possible to split a class across multiple pages.
My problem is that the browsers grind to a halt on loading the classes.
There are about 2000 DOM elements, 100 of which are images, about 2 PDF documents (in the 200k-20MB range), and 1 to 3 youtube videos.
What can I do to make the site load "progressively"? If I were to print a class (there's a print button), it would run between 20 and 60 pages.
Here (http://i.imgur.com/7VRLlPb.jpg) is a screenshot of one of the entries.
The text is smudged because the contents are somewhat private.
You can try "progressive loading" techniques.
http://docforge.com/wiki/Web_application/Progressive_loading
It's not simple to do, however. You need a fair amount of JavaScript knowledge to do it efficiently.
There are a lot of libraries available for this.
But first, get an idea of progressive loading techniques:
http://www.slideshare.net/stoyan/progressive-downloads-and-rendering
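For the images (usually the biggest offenders), here is a minimal sketch of one such technique, assuming you change the markup to use a data-src placeholder attribute (an assumption, not something your page has today):

    function loadVisibleImages() {
      var lookAhead = window.innerHeight + 200; // load slightly ahead of the viewport
      Array.prototype.forEach.call(document.querySelectorAll('img[data-src]'), function (img) {
        if (img.getBoundingClientRect().top < lookAhead) {
          img.src = img.getAttribute('data-src'); // start the real download now
          img.removeAttribute('data-src');        // so we don't process it twice
        }
      });
    }
    window.addEventListener('scroll', loadVisibleImages);
    loadVisibleImages(); // fill the first screenful immediately

The same idea extends to the PDFs and YouTube embeds: render a lightweight placeholder and only swap in the heavy element when it scrolls into view.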
Related
We have been building online catalogs for motorcycle parts for our online store.
Originally we did this with a program that turned PDFs into Flash catalogs, but as you know Flash can't run on a lot of devices, and those devices are now getting super common.
So we built an alternative in HTML: we exported each page of the original PDFs as images and built them into a jQuery book.
http://www.thehogfathermotorcycles.com/catalog/jcata/mag.php?a=cce&b=2014&c=1
Some of these catalogs (like the one above) are over 1000 pages. I think you see the problem coming by now... How on earth do we stop all 1000 images loading at once?
In a perfect world we'd load, say, 20, and as the user flips through the catalog, new images would load behind those 20.
I really am stumped by this; how would you do it? We tried lazy load, but it doesn't work unless you are scrolling vertically.
Any help would be seriously appreciated.
Simple answer? Don't add thousands of images to an HTML page.
More verbose, helpful answer: you could create a PHP script that paginates the catalogue items, then use AJAX to load the next "page" via pagination links, as in the sketch below. Destroy the previous page and replace it with the next requested page.
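A rough sketch of that idea using jQuery (which the catalog already loads); catalog.php, #catalog, #next and #prev are assumed names standing in for your real endpoint and markup:

    var currentPage = 1;

    function showPage(page) {
      // catalog.php?page=N is a hypothetical endpoint returning one page's worth of <img> markup
      $.get('catalog.php', { page: page }, function (html) {
        $('#catalog').html(html); // destroy the previous page, insert the new one
        currentPage = page;
      });
    }

    $('#next').on('click', function () { showPage(currentPage + 1); });
    $('#prev').on('click', function () { showPage(Math.max(1, currentPage - 1)); });

    showPage(1); // initial load

Since the book flips horizontally rather than scrolling, hooking the load into the page-turn event sidesteps the vertical-only limitation of the usual lazy-load plugins.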
In print mode, I would like to render a link such as:
link
like this:
link (page 11)
(assuming the target is on page 11 in the print preview).
It is possible to add page numbers to the page footer using counters in plain CSS, but can I use them in a more "the way I want it" fashion?
The solution doesn't need to be cross-browser (FF/Chrome is enough); any CSS/JavaScript tricks are allowed.
As I wrote in my blog post "Printing web pages", printing web pages is in a sorry state.
Unless you print only (one column) text without images and tables, it's already hard enough to get a printout which resembles the screen content at least somewhat (yeah, I exaggerate here but not much).
The best browser for printing has been, for about a decade now, Opera. All other browsers suck more or less.
CSS could help a lot but no browser implements the necessary parts. See Target counters in the Generated Content for Paged Media module - this would do exactly what you need.
Okay, after getting the rant out of the way, here are some obstacles which you will need to solve:
The page numbers will start to shift as soon as you start adding text to existing links. So if you write "blabla (page #12)", that will probably be page #13 once you get there.
To be able to know what goes onto which page, you will have to split the web page into pages yourself. If you have mostly text, this isn't terribly hard: just create one div per page, make sure the browser doesn't print it across a page break (good luck with that :-( ), and then move DOM elements into the divs until they fill a page ... if you can find out what the page size is.
To avoid the shifting problem, add " (page #0000)" to all links to make sure they occupy enough room, as in the sketch below. Later, they will be replaced with shorter texts, so pages with a lot of links might have some empty space at the bottom, but that's better than the alternatives.
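A hedged sketch of that placeholder trick in JavaScript; the div-per-page class name (.print-page) is an assumption, and the hard part, the pagination itself, is left out:

    var links = document.querySelectorAll('a[href^="#"]');
    var pages = document.querySelectorAll('.print-page');

    // Pass 1: reserve room so the added text can't shift pages later.
    Array.prototype.forEach.call(links, function (link) {
      link.appendChild(document.createTextNode(' (page #0000)'));
    });

    // Pass 2: once pagination is stable, fill in the real page numbers.
    Array.prototype.forEach.call(links, function (link) {
      var target = document.querySelector(link.getAttribute('href'));
      Array.prototype.forEach.call(pages, function (page, i) {
        if (target && page.contains(target)) {
          link.lastChild.textContent = ' (page ' + (i + 1) + ')';
        }
      });
    });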
You should be able to write something like that (and debug it) in just six months or so ...
Which is why everyone uses a rendering engine on the server which can produce HTML and, say, PDF output. PDF allows you exact control over the page layout. You just need to find a rendering engine that supports text references. I suggest LaTeX; it's a bit unwieldy but it gets the job done.
Could this work?
@page {
  @bottom-right {
    content: counter(page) " of " counter(pages);
  }
}
(from Page numbers with CSS/HTML)
I am building a one page webapp and it's starting to get pretty big. There are several components to the app, each one meticulously styled. On average the app has a DOM element count of 1200+. I have been warned by my YSlow scan that this is too many, and that I should have no more than 700 DOM elements.
I am usually quite strict and efficient with my markup and I doubt I would be able to trim much off. I tend to use a lot of DOM elements to get the styling exactly right and working cross browser.
How can I dramatically cut the number of DOM elements?
Will I have to load more of the content on demand (AJAX) instead of all on page load?
Does a large amount of DOM elements have a big impact on performance?
I would love to hear people's experience with this and any solutions you may have...
The number of DOM elements only enters into the picture if you're doing a lot of DOM and/or CSS manipulation on the page via JavaScript. Scanning for an ID in a page with 50,000 elements is always going to be slower than in a page with only 500. Changing a CSS style which is inherited by most of the page will most likely lead to more redrawing/reflowing than it would on a simpler page, etc...
The only way to cut element count is to simplify the page.
We've built a single-page web app. Initially YSlow worried me, as we had 2,000+ DOM objects in the page.
After some work we got all the other YSlow items to green, and we ended up living with it (around 1,800 right now) as the app is very fast in various browsers.
But we don't support IE6 and IE7, and it could be different for these browsers.
How can I dramatically cut the number of DOM elements?
By using only those elements that are necessary. If you want more elaborate advice, post your code.
Will I have to load more of the content on demand (AJAX) instead of all on page load?
If you want your page to perform better on start-up, you can do that.
Does a large amount of DOM elements have a big impact on performance?
Not necessarily.
You can render elements on demand when the user clicks a button, or use lazy loading the way Twitter does, as in the sketch below.
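A minimal sketch of the Twitter-style approach: render the next batch when the user nears the bottom of the page. window.allItems and the #list container are assumptions standing in for your real data and markup:

    var items = window.allItems || [];
    var rendered = 0;
    var BATCH = 20;

    function renderNextBatch() {
      var container = document.getElementById('list');
      items.slice(rendered, rendered + BATCH).forEach(function (item) {
        var div = document.createElement('div');
        div.textContent = item.title; // swap in your real template here
        container.appendChild(div);
      });
      rendered += BATCH;
    }

    window.addEventListener('scroll', function () {
      if (window.innerHeight + window.pageYOffset >= document.body.offsetHeight - 300) {
        renderNextBatch();
      }
    });

    renderNextBatch(); // first screenful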
Hi
My web site provides instant filtering of articles via JavaScript.
Initially, the 12 most recent article summaries are displayed.
Summaries of ALL articles are put into a JavaScript cache object (rendered by the server in script tags).
When the user clicks on tags, the corresponding article summaries are taken from the JS cache object and inserted into the page as HTML pieces.
Does this have a negative impact on how SEO-friendly my web site is?
The main problem is clear: only 12 "static" URLs are displayed, and the others appear programmatically only on user interaction.
How do I make the site SEO-friendly while keeping this nice filtering feature?
If I add an "all articles" link that loads a separate page with all articles, will that solve the SEO problems?
The way to make this work for search engines and for users who don't have JavaScript, while also keeping your funky behaviour, is to build the feature in stages.
Stage 1: Get a working "paged" version of this page, so it shows 12 results and you can click on "next page" and "last page", and maybe even on various page numbers.
Stage 2: Implement the filter using a form post and have it change the results shown in the page view.
Stage 3: Add JavaScript over the top of the working form and have it display the results the normal post would display (see the sketch below). You can also replace the full-page reload for paging with JavaScript, safe in the knowledge that it all works without JavaScript.
Most people use an AJAX request rather than storing an ever-increasing list in a JavaScript array.
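A sketch of stage 3, layering AJAX over the working form; #filter-form and #results are assumed IDs, and the server is assumed to return the same HTML fragment it renders for the no-JS post:

    $('#filter-form').on('submit', function (e) {
      e.preventDefault(); // with JS disabled, the form still posts normally
      $.post($(this).attr('action'), $(this).serialize(), function (html) {
        $('#results').html(html); // drop the returned fragment into the page
      });
    });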
Crawlers (or most of them) don't execute JavaScript while crawling. Therefore, JavaScript-powered content won't be indexed, and your site will look smaller to search engines than it really is. This will penalize your pages.
Making a "directory" page can be a solution, but if you do so, search engines will send users to those static pages, and not through your JavaScript viewer homepage.
Anyway, I would not recommend making content viewable only with JavaScript:
- first, it's not SEO-friendly;
- then, it's not friendly to users with JavaScript disabled;
- the history functions (back and next) become impossible to use;
- middle-click "open in a new tab/window" won't work;
- also, you can't bookmark JavaScript-generated content.
So is your nice feature nice enough to lose the points mentioned above?
There are ways to reconcile your feature with all those points, but that's far from easy to do.
I have a large amount of XHTML content that I would like to display in WebKit. Unfortunately, WebKit is running on a mobile platform with fairly slow hardware, so loading a large HTML file all at once is REALLY slow. Therefore I would like to load the HTML content gradually. I have split my data into small chunks and am looking for the proper way to feed them into WebKit.
Presumably I would use JavaScript? What's the right way of doing it through JavaScript? I'm trying document.body.innerHTML += 'some data', which doesn't seem to do very much - possibly because my chunks may not be valid standalone HTML. I've also tried document.body.innerText += 'some data', which doesn't seem to work either.
Any suggestions?
This sounds like a perfect candidate for Ajax based "lazy" content loading that starts loading content while the user scrolls down the page. There are several jQuery plugins for this. This is one of them.
You will have to have valid chunks for this in any case, though. Also, I'm not sure how your hardware is going to react to this. If the problem is RAM or disk, you may encounter the same problems no matter how you load the data. Lazy loading only makes sense if the bottleneck is the connection or the speed at which the page is parsed and rendered.
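If you do have valid standalone chunks, here is a small sketch of appending them without the full re-parse that innerHTML += causes; the chunks array is an assumption standing in for however you retrieve your data:

    function loadChunks(chunks, i) {
      if (i >= chunks.length) return;
      document.body.insertAdjacentHTML('beforeend', chunks[i]); // append without re-parsing the body
      setTimeout(function () { loadChunks(chunks, i + 1); }, 0); // yield to the UI between chunks
    }
    loadChunks(myChunks, 0); // myChunks: an array of valid HTML fragments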
Load it as needed via AJAX. I have a similar circumstance: as the user scrolls near the end of the page, it loads another 50 entries. Because each entry contains many JS events, too many entries degrade performance; so after 200 records, I remove 50 from the other end of the list as the user scrolls (see the sketch below).
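A sketch of that windowing approach; loadEntries(offset, count, callback) is a hypothetical AJAX helper that returns the next batch as an HTML string, and #entries is an assumed container:

    var MAX_ENTRIES = 200;
    var PAGE_SIZE = 50;
    var offset = 0;

    function appendMore() {
      loadEntries(offset, PAGE_SIZE, function (html) {
        var list = document.getElementById('entries');
        list.insertAdjacentHTML('beforeend', html); // append the new batch
        offset += PAGE_SIZE;
        while (list.children.length > MAX_ENTRIES) {
          list.removeChild(list.firstElementChild); // trim the other end of the window
        }
      });
    }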
No need to reinvent the wheel. Use jQuery's ajax method (and the many shorthand variants out there): http://api.jquery.com/category/ajax/