Integrating a Flash component in Joomla without reloading it on each page

Is there a Joomla component that makes it possible to integrate a Flash SWF on a Joomla website without reloading it on each page (by using an IFRAME or anything else)?

Hello anonymous (nice name!)
This would not be ideal at all: if you were to put the Flash content in an iframe, every other piece of content loaded by Joomla would also need to be loaded into separate iframes to prevent the entire document from reloading.
This is impossible (well, ALMOST) given the nature of the Joomla core and how it is constructed at run time.
Many people (clients) request this feature of Flash not reloading on each new page. It is the nature of the web and XHTML (or any front-end web language) that for new content to be displayed, the document must be reloaded. Yes, there are asynchronous methods such as AJAX, but those are out of the question for Joomla unless you spend ages hacking away at the core.
If you had simple "views" or different objects you wanted to show within your Flash movie, and you didn't want the ENTIRE movie to start from frame 1 each time, you could append URL variables to the Flash source and then, based on those variables, control what content is shown in the movie.
The SWF file will still reload (it has to if the document is reloading), but at least you can serve up different content within it.
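A minimal sketch of that approach using SWFObject (the file name, container id, and "view" variable are assumptions for illustration); in ActionScript 3 the value would be read from loaderInfo.parameters.view:

// Pass the current page's "view" to the SWF via FlashVars, so the movie
// can jump straight to the right content instead of replaying from frame 1.
var flashvars = { view: "gallery" };           // assumed variable name/value
var params = { allowScriptAccess: "always" };
swfobject.embedSWF("main.swf", "flashContainer", "800", "600", "9.0.0",
                   false, flashvars, params, {});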
I hope this steers you in the right direction.
Kind regards,
Simon

Related

Accessing local files for a client-side-only website, to be distributed as an HTML download the user opens with a browser

Due to the safety rules of the same-origin policy (SOP), I am unable to load certain local files when opening an index HTML file directly in a browser. Using a "live-server" plugin works fine, as all the files in that case are "on the same server". I need to distribute the website as a client-side-only app: a folder and an HTML file to be opened with your browser. Solutions to the problem always seem to require setting up a server. Is there any way to avoid that and keep everything on the client?
I am making a mathematics e-book that I want to distribute as a website people can download. I want it to be client-only and a download, since if it were to become popular I wouldn't be able to afford running a server (as I would be studying at that time). I have chosen HTML and JavaScript over EPUB, as they are much more powerful and allow for tons of interactivity (and much more efficient development).
So far I have a browser.html file that loads individual pages with jQuery's .load(). This browser.html file contains both HTML and JavaScript; the CSS is in an external file. The individual pages have many pictures that are also stored locally. As the pages are contained in subfolders, the picture URLs go up into their parent folders and into the assets folder, like: ../../../Assets/Chapter1/Talopgaver og intuition/Misc\F\solsystem.png. I use custom elements (shadow DOM) to handle various complex aspects such as questions and answers, along with certain other things. Other than jQuery, I also make use of MathJax and a "polymer" library that helps with cross-browser support of custom elements. All the pages in a chapter are loaded at the start and then put into an array (this makes it fast to scroll through pages, as you often do in books). Each page (as a string) is modified slightly to automate certain tedious parts of development.
I have tried to open the browser.html file in Chrome, Firefox, Internet Explorer, and Edge. They all load the HTML that browser.html itself contains (properly styled, even), but none of them load any external pages. Interestingly, one of the images used in the browser.html file still works (I would think that would be a local file too, no?). I have tried turning off the Ajax calls and the external CSS, but nothing changed. I have searched for other people with similar problems, but all the answers just recommended setting up a server.
When loading the page with a live-server plugin, the result looks something like this:
browser.html page opened with "live-server" visual studio code plugin by Dey, Ritwick
When opening the browser.html page directly using chrome, it looks like this:
browser.html page opened directly with chrome
The error I get (after having removed an Ajax .get() call) isn't particularly descriptive: simply "Failed to load resource: net::ERR_FILE_NOT_FOUND" from "platform.js:1". Even if I turn off the call that starts loading pages, it gives me exactly the same error messages.
Looking at the network reports: with live-server it looks very ordinary; without it, it's pretty weird. It says it takes hours to load browser.html, even though that clearly isn't the case. It fails to load platform.js after spending 22 seconds trying. The network report looks a bit healthier when the call to load pages is turned off: it gives up on platform.js faster (8 seconds), yet still supposedly takes hours to load browser.html.
Though it shouldn't ultimately be necessary, I have linked the entire browser.html document below, along with an example of a page it might load (the example in the first picture above).
browser.html. Too big for a stackexchange code-block embed
Page in previous picture (page 37)
Any help is appreciated!
EDIT: The main problem seems to be the loading of pages using jQuery's .load(). Even on a simple test site, that operation is just not possible without running on a server.
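For reference, a minimal sketch of the failing pattern (the page path and element id are assumptions): .load() issues an XMLHttpRequest under the hood, and browsers block XHR to file:// URLs under the same-origin policy, so this works when served over HTTP but not when browser.html is opened directly.

// Works from a server; blocked (net::ERR_... errors) when opened via file://.
$('#page-container').load('Chapter1/page37.html', function (response, status) {
  if (status === 'error') {
    console.log('Failed to load page - likely blocked by the SOP.');
  }
});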

What are the benefits of putting jQuery references in Masterpage?

I'm currently developing a website with ASP.NET and I always check its performance through Firebug...
Now, my question is,
is it better to put all jQuery references in the master page?
(Reference all first)
or is it better to put a specific jQuery reference only on the content page that needs it?
(Reference specific only)
Thank you!
I would think it's best to put your stylesheets in the master page, as long as they are site-wide styles. You should also think about compressing them into one download, so you reduce HTTP requests.
Maintenance-wise, I would put each reference where it belongs, depending on the scope of the object your jQuery code references.
For instance, if it's something to do with, say, the main navigation that's present on all rendered pages (and therefore certainly in your master page), then you had better have your $("#navBar") code close to your <div id="navBar">...</div>, i.e. on your master page.
If, on the other hand, it's something related to a specific content page, say that shiny carousel (and its specific jQuery plugin) you need on your homepage, you had better have $('#carousel').carousel(2); close to your <div id="carousel">, i.e. on your HomePage.aspx content page.
And while you're dispatching things to where they belong: if you can tell for sure that the carousel plugin is only required on your homepage, you had better include the plugin on your HomePage.aspx content page only.
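A minimal sketch of that page-local initialization, using the ids from the example above (both assumed):

$(function () {
  var $carousel = $('#carousel');   // markup exists only on HomePage.aspx
  if ($carousel.length) {           // harmless no-op on every other page
    $carousel.carousel(2);          // plugin call taken from the example
  }
});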
Not only will you ease your maintenance, but you will also get performance benefits, as you will be more likely to initialize variables only when they are used, putting a little less memory overhead on the browser. The same goes for loading plugin-related resources (you may not want every page load bloated because your master page links to everything required somewhere on your site).
Generally speaking, I would encourage you to identify any libraries or plugins which you would be using on multiple pages and include them in a place which will automatically put them within those pages.
For maintainability, I usually mash/minify all of my third-party libraries (such as jQuery, jQuery UI, Backbone, etc.) into a single JS file, along with any plugins for them which I know I will be using throughout the site. The downside of doing this is that you may have one very large JS file which loads the first time the user visits a page - the upside: the client caches that file, and the user doesn't have to load it again.
The general rule of thumb is: minimize the number of bytes that the user has to download, and minimize the number of HTTP requests the user has to make throughout your site. So, by compressing these kinds of files into a single download and letting it exist on every page with the same URL, you can have a single request which generally gets a 304 Not Modified response and no download. This is far better than having 5 different plugins loaded on different pages, each of which makes a separate HTTP call, even if those calls all receive 304 responses.
Referencing jQuery separately on each content page is not the proper approach.
Reference jQuery once in the master page and use it on all content pages; that is the proper, optimized approach.

JavaScript and SWF communication

I have a sizeable interactive SWF file, embedded in my HTML using SWFObject. I can communicate with the SWF from JavaScript and it works perfectly. But it has no preloader, and because the file is big I want to show a loading image or SWF, and only when the file has loaded completely show my play button using JavaScript.
How can I tell if the file has loaded completely?
I tried a bunch of solutions, but none of them were successful. First, I tried to create a preloader in Flash and load my external SWF, then send a message to JavaScript on the complete event using ExternalInterface. It worked, but I couldn't communicate with the main SWF's ActionScript from JS anymore.
I found some JavaScript libraries that are supposed to fire an event when the file is loaded, but the event fires as soon as loading starts successfully (i.e., the SWF file is there and begins loading), not when it finishes.
I would also strongly recommend you incorporate your play/pause, etc., functionality directly into your Flash program - there is really no need to use JavaScript for this!
But if you have to use JS, you could show the load progress by creating an internal preloader, by loading an external SWF into the same application domain (whereas if you don't use JS, you don't need to worry about application domains), or by using SWFBridge to establish two-way communication between two separate SWFs.
The internal preloader is a pretty neat solution, if you want all of your data embedded into just one file - at the expense of having to create additional frames, and having to think about where exactly to put your class instances in your FLA (you can't use "Embed into frame 1").
Loading into the same application domain is more elegant, especially if you use your Flash IDE mostly for coding (and not design), or if you have external data anyway. Plus, it's a good way to create modular applications.
SWFBridge is really good if you have ActionScript 2 SWFs, or legacy files without directly available source code - but if all you need is a simple preloader and some JavaScript, I would probably not use it in this case.
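If you do go the ExternalInterface route the question mentions, the JavaScript side could look roughly like this (a sketch; the callback and element names are assumptions, and the SWF is assumed to call them via ExternalInterface.call as it loads):

// Called by the SWF, e.g. ExternalInterface.call('onSwfProgress', loaded, total)
window.onSwfProgress = function (loaded, total) {
  var pct = Math.round(100 * loaded / total);
  document.getElementById('loadingLabel').textContent = 'Loading ' + pct + '%';
};
// Called once on the complete event: hide the loader, reveal the play button.
window.onSwfReady = function () {
  document.getElementById('loadingLabel').style.display = 'none';
  document.getElementById('playButton').style.display = 'inline-block';
};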

Web Development Best Practices - How to Support JavaScript Disabled

What is the best thing to do when a user doesn't have JavaScript enabled? What is the best way to deliver content to that kind of user? What is the best way to keep a site readable by search engines?
I can think of two ways to achieve this, but do not know what is better (or if a 3rd option is better):
Rely on the meta-refresh tag to redirect users to a non-JavaScript version of the site. Wrap the meta-refresh tag in a noscript tag so it will be ignored by those with JavaScript.
Rely on an iframe tag located within the body tag to deliver a non-JavaScript version of the site. Wrap the iframe tag in a noscript tag so it will be ignored by those with JavaScript.
I would also appreciate high-profile examples of the correct or incorrect way to do this.
--------- ADDITION TO QUESTION -----------
Here is an example of what I have done in the past to address this: http://photocontest.highpoint.edu/
I want to make sure there aren't better ways to do this.
You are talking about graceful degradation: designing and building the site to work with JavaScript, then making the site still work with JavaScript turned off. The easiest thing to do is include the HTML noscript tag somewhere near the top of your page with a message saying that the site REQUIRES JavaScript or things won't work right. SO is a perfect example of this. Most of the buttons at the top of the screen run via JavaScript; turn it off and you get a nice red banner, and the drop-down JS effects are gone.
I prefer progressive enhancement development: get the site working in its entirety without JavaScript / Flash / CSS3 / whatever, THEN enhance it bit by bit (still including the noscript tag) to improve the user experience. This ensures you have a fully working, readable website regardless of whether you're a disabled user with a screen reader or a search engine, whilst providing a good user experience for users with newer browsers.
Bottom line: for any dynamically generated content (for example, page elements generated via AJAX) there has to be a static-page alternative where this content is available via a standard link. If you are using JavaScript for tabbed content, then show all the content in a way that is consistent with the rest of the webpage.
An example is http://www.bbc.co.uk/news/. Turn off JavaScript and you have a full page of written content, pictures, links etc. Turn on JavaScript and you get scrolling news stories, tabbed content, scrolling pictures and so on.
I'm going to be naughty and post links to wikipedia:
Progressive Enhancement
Graceful Degradation
You have another option: just load the same page, but make it work for noscript users (progressive enhancement / graceful degradation).
A simple example:
You want to load content into a div with Ajax: make an <a> tag linking to the full page with the new content (noscript behavior), and bind the <a> tag with jQuery to intercept clicks and load the content with Ajax (script behavior).
$('a.ajax').click(function () {
  var anchor = $(this);
  // Fetch the linked page and extract its #content into this page's #content.
  $('#content').load(anchor.attr('href') + ' #content');
  return false;  // prevent the normal navigation when JS is available
});
I'm not entirely sure if Progressive Enhancement is considered to be best practice these days but it's the approach I personally favour. In this case you write your server side code so that it functions like a standard web 1.0 web app (no JavaScript) to provide at least enough functionality for the system to work without JavaScript. You then start layering JavaScript functionality on top of this to make the system more user friendly. If done properly you should end up with a web app that at least provides enough functionality to be useful for non-JavaScript users.
A related process is known as graceful degradation, which works in a similar way but starts with the assumption that a user has JavaScript enabled, and builds in workarounds for cases where they don't. This has a drawback, however, in that if you overlook something, you can leave a non-JavaScript user without a fallback.
Progressive enhancement example for a search page: build your search page so that it normally returns an HTML page of search results, but also add a flag that can be set via GET which makes it return XML or JSON instead. On the search page, include a script that makes an AJAX request to the search page with the flag appended to the query string, then replaces the main content of the page with the result of the AJAX call. JavaScript users get the benefit of AJAX, but those without JavaScript still get a workable search page.
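A sketch of that pattern (the form id, the "format" flag, and the response shape are assumptions): non-JS users submit the form normally and get a full HTML page, while JS users get results swapped in via Ajax.

$('#searchForm').submit(function () {
  // Ask the same search endpoint for JSON instead of the full HTML page.
  $.getJSON($(this).attr('action'), $(this).serialize() + '&format=json',
    function (data) {
      var $results = $('#results').empty();
      $.each(data.results, function (i, item) {
        $results.append($('<li>').text(item.title));  // assumed result shape
      });
    });
  return false;  // suppress the normal submission when JS is available
});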
http://en.wikipedia.org/wiki/Progressive_enhancement
If your application must have javascript to function then there's nothing you can do except show them a polite message in a noscript tag.
Otherwise, you should be thinking the other way around.
Build your site without JS.
Give an awesome user experience and make it fully functional.
Add JS and make the UX even more functional. Layer the JS on top.
So if the user doesn't have JS, your site will still fall back to the state after step two.
As for crawling: if your site depends on AJAX and a lot of JS to work, you can make Google aware of it: http://code.google.com/web/ajaxcrawling/docs/getting-started.html
One quick tip that may help you: just install lynx, a command-line web browser, and you'll immediately see how Google and other SEO crawlers see your site (and blind people too). This is very useful. Of course, in a command-line window there are no graphics and JavaScript is disabled.
If you're doing "serious" Ajax (e.g. client-side routing) the following technique could be useful:
Use URLs without GET/"?" parameters (it makes your life easier later on)
Use http://baseurl.com/#!/path/to/resource for client-side routing
Implement rendering of a non-script HTML version of your site (an "HTML snapshot", as Google calls it) at http://baseurl.com/path/to/resource
Wrap the whole content of your HTML snapshot in noscript tags and redirect via top.location.href to the full version of the site (see the sketch after this list)
Handle http://baseurl.com/?_escaped_fragment_=/path/to/resource - it should redirect via a 301 response to http://baseurl.com/path/to/resource
Use <a> tags only for GET links, use forms for POST/PUT/DELETE links - unstyle the hell out of them if necessary
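A sketch of the redirect from the snapshot step above (paths assumed): the HTML snapshot carries its static content inside noscript tags, plus one small script that sends JS-capable browsers to the hashbang version.

// On http://baseurl.com/path/to/resource (the HTML snapshot): noscript
// users read the static content below; everyone else gets redirected.
top.location.href = 'http://baseurl.com/#!' + window.location.pathname;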
A nice piece of example code for links, which I found while researching "how to write proper Ajax code":
Resource
This is of course a pretty complex solution, but it should enable both SEO (including non-search-engine crawlers) and accessibility. The problem is that you have to be able to render your page both server- and client-side.
One solution could be to use a templating framework like Mustache, for which implementations exist on many platforms.
Use something like {{#pagelet}}/path/to/partial{{/pagelet}} for dynamic parts of your page - example: {{#pagelet}}/image/{{image_id}}/preview{{/pagelet}}
In your client-side rendering, pagelet would be implemented so that it is dynamically replaced with content loaded via Ajax.
In your server-side rendering, pagelet would just be rendered directly (if in doubt, just curl the pagelet and render it right away; or, if you can write the code asynchronously, do it just as you would client-side: write a temporary span into a buffer, start fetching all the pagelets, replace the temporary spans as the pagelets arrive, and flush the buffer once all pagelets have been rendered).
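A minimal sketch of the client-side half (the data attribute and markup are assumptions): the rendered template leaves one placeholder span per pagelet, and each is filled in via Ajax.

// Template output assumed to contain e.g.
// <span data-pagelet="/image/42/preview"></span> for each dynamic part.
$('span[data-pagelet]').each(function () {
  var $slot = $(this);
  $.get($slot.attr('data-pagelet'), function (html) {
    $slot.replaceWith(html);  // swap the placeholder for the fetched pagelet
  });
});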
That's the best general design I found so far. You can deep link into your app, it's search engine friendly and it should force you to build a page that gracefully degrades.
P.S.: One advantage of the techniques described above is that both the Ajax- and the "Web 1.0"-rendering of a page could profit from memcached-caching of whole pagelets.
I would prefer to code the page without JavaScript, and then, if JavaScript is enabled, redirect users to a similar page with JavaScript (the same concept as progressive enhancement).
redirecting with javascript

Speed up web site loading

I am looking for the best way to speed up the load time of my js.
The problem is that I am working with a very large site that uses the jQuery framework. Because the site is also loading Facebook Connect, AddThis sharing, Google Analytics and another tracking code, jQuery is delayed a few seconds, certain elements like the calendar just pop in, and my users are complaining that things take too long.
I did a test in Google Chrome and the average load time is 4s, which is too much.
I am already minifying, and jQuery/jQuery UI are being loaded from Google. What's the best way to approach this?
Make fewer HTTP calls by combining images, scripts and CSS; using a Content Delivery Network for your static images and CSS might also help!
You are not likely to be able to do much more about the load time of the external scripts, so what you can do is to change the order that things happen in the page so that the external scripts are loaded after you have initialised the page.
Scripts are loaded and executed in a serial manner, so if you change their order in the source code, you also change the order they are loaded.
Instead of using the ready event in jQuery, you can put your initialising code inline in the page, after all the content but before the body closing tag. That way the elements that you want to access are loaded when the script runs, and you can put external scripts below the initialising code to make them load after.
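A sketch of that ordering (the element id and init function are assumptions): this script block sits inline after the markup it needs, before the closing body tag, so no ready handler is required, and the slow external scripts are included after it.

// Inline script, placed after <div id="calendar"> in the document:
var calendar = document.getElementById('calendar');  // element already parsed
initCalendar(calendar);  // hypothetical init function, runs without waiting
// Third-party scripts (analytics, social widgets) go below this block,
// so they load after the page is already initialised.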
Small technical changes (such as serving the JS from Google, minifying, etc.) will only get you so far.
It seems you simply have lots of dynamic stuff going on in your page. Have you thought of an asynchronous way of building your content? One option might be to place placeholders instead of the external content and load it asynchronously, so that when all the scripts are loaded and ready, all you need to do is throw the markup into the placeholders.
This will create a better user experience: instead of the user waiting 10 seconds for the entire page, it will start loading incrementally after 2 seconds (and still fully load after 10).
In addition to Yuval's answer, some options that might or might not bring you a speed gain:
The load time of external libraries is something beyond your control. Try to include them as late as possible, or better still, dynamically after the page has loaded (see the sketch after these options). This way your page won't stall if Google Analytics or Facebook have another hiccup.
It is not necessarily faster to load jQuery from Google. Consider putting jQuery, jQuery UI and as much of your own JS as reasonable into a single file, minify and gzip it, and let the server serve the gzipped version where possible. Note that the gain in speed depends largely on what your users have cached and where. If they already have jQuery from Google in their cache, this technique might make the page load slower.
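A sketch of that dynamic inclusion (the URL is a placeholder): inject the script tag only after the window load event, so a slow tracker can never block your own code.

// Load a third-party script after everything else has finished loading.
window.addEventListener('load', function () {
  var s = document.createElement('script');
  s.src = 'https://example.com/analytics.js';  // placeholder URL
  s.async = true;
  document.body.appendChild(s);
});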
The bottom line is that after some basic optimization you're left with experimenting. You must find out what your average user has in their cache, whether the page is accessed directly via deep links, and whether you can smuggle some JS or CSS (or even images) into their cache via a previous "landing page".
Make sure you deliver your content gzip/deflate compressed. Combine multiple JavaScript files into one file, which helps reduce the number of HTTP requests.
P.S. Here's a test tool to check if compression is configured:
http://www.gidnetwork.com/tools/gzip-test.php
