JavaScript to replace server used for all images?

Given a static HTML file and its accompanying static CSS file that have references to a static server holding images, would it be possible to use JavaScript to prevent the page from loading all images from their src attribute and url() definition and instead replace them with another server?
Trying to do something along the lines of
$('body').html($('body').html().replaceAll('src.example.com', 'target.example.com'));
This doesn't prevent the images from being loaded in the first place. Any ideas?
Background story, in case it helps you suggest something different: I'd like my website to display images inside China from a different local Chinese server. That might help with the throttling that non-Chinese sites experience in China. Choosing a Chinese CDN for serving the whole world seems like a bad idea and relying on a service resolving DNS differently per country seems somewhat risky.
So currently we have the .com and .cn domains and want the .cn domain to be served from the same server, but have its images served from a local Chinese CDN. While the website does generate dynamic pages, they are cached statically, and generating different pages per domain would mean more effort in the page-generation step. That's why I thought that perhaps JavaScript could help out by rewriting all the image URLs.

With a combination of tricks like using the base tag and onclick and onsubmit event handlers you could achieve your objective, but I'm almost sure it isn't a nice solution.
Using a script in the head, or as the first thing in your body, you could insert a base tag, which changes the default server for every relative URL on your site (form actions, link hrefs, stylesheets, images), including relative url() references in CSS embedded in the page.
Then, to undo the server change for links and forms, you would have to handle click and submit events and fix the target according to the current website URL, or modify them all at once when the page is ready.
Could this work? Yes, but I do believe the best solution would be to change the URLs server-side or let the CDN handle the geotargeting.
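
A minimal sketch of the trick, assuming the images are referenced by relative URLs and that img.example.cn is a hypothetical CDN host (both are assumptions, adjust to your setup):

// Must run in the head, before the browser requests any resource.
if (location.hostname.endsWith('.cn')) {
  document.write('<base href="https://img.example.cn/">');
}
// Once the DOM is ready, point links and forms back at the real site,
// because the base tag redirects those too:
document.addEventListener('DOMContentLoaded', function () {
  document.querySelectorAll('a[href], form[action]').forEach(function (el) {
    var attr = el.tagName === 'A' ? 'href' : 'action';
    // Resolve the raw attribute against the page URL instead of the base.
    el.setAttribute(attr, new URL(el.getAttribute(attr), location.href).href);
  });
});

Note that a base tag only affects relative URLs; if your markup points at src.example.com absolutely, the rewrite has to happen server-side or at the CDN anyway.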

Related

Fewer HTTP requests on server - WordPress site

I have recently run a gtmetrix.com speed test on my website.
The main areas of focus on my site seem to be stylesheets and JS and making fewer HTTP requests. It tells me...
This page has 44 external Javascript scripts. Try combining them into one.
This page has 33 external stylesheets. Try combining them into one.
Is it possible to combine these all into one? I have one stylesheet with my child theme; the rest come from the parent theme and Bootstrap CSS/JS.
Also, does this have much of an effect on the speed of my site?
If your CSS files are stored on your own server, you can concatenate them into one file in any order. If your JS files are stored on your own server, you can also carefully concatenate them into one single file, but to be safe the order in which they are concatenated should match the order in which they are called now, relative to each other and to the HTML, so universal stuff like jQuery should probably come first and very specific custom scripts should be appended last. Make sure to wrap code in $(document).ready(function () {}) as appropriate.
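
For locally hosted files, a tiny Node sketch of an ordered concatenation (the filenames are made up; list yours in the order they currently load):

// concat.js - run with: node concat.js
const fs = require('fs');

// Hypothetical filenames; order matters: libraries first, custom code last.
const files = ['jquery.js', 'bootstrap.js', 'parent-theme.js', 'child-theme.js'];

// Join with a semicolon and newline to guard against files that omit
// a trailing semicolon.
fs.writeFileSync('bundle.js', files.map(f => fs.readFileSync(f, 'utf8')).join(';\n'));
console.log('wrote bundle.js');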
If your CSS and JS files are stored on external servers, as you indicated, then you can either download them and store them locally instead, or fetch them all server-side on page load with cURL (or the equivalent for your language) instead of client-side and assemble them in memory each time the page loads. I'm not sure why you would do the latter, because it would take up a lot of server resources and probably make the site slower, but it is possible.
Realistically, what I would do to improve your speed is, instead of working on concatenating the files, which will not provide much benefit, work on seeing which files you actually need and then getting rid of the ones you don't.

Advantage of embedded scripts in HTML to save an extra HTTP request?

I was looking at why embedding scripts in HTML is better than using external scripts, and one point was that it saves an extra HTTP request. But I want to know how this would play out in a real-life situation. Can someone give me an example of this happening, and how?
Thank you.
When a browser gets an HTML file, it parses that file to figure out what to display. Any time it hits a tag that requires an additional resource (e.g. a script tag with a src attribute, an img tag, a link tag, etc.) it will send another request to download that resource.
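A minimal illustration (app.js is a made-up filename): the first script costs one extra request, the second ships inside the HTML response itself.

<!-- External: the browser makes one more HTTP request for app.js,
     but can cache the file across pages. -->
<script src="app.js"></script>

<!-- Embedded: the code arrives with the HTML, no extra request,
     but it can't be cached independently of the page. -->
<script>
  console.log('loaded with the page');
</script>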
I think embedding the JavaScript code in the page isn't the best solution, as it means you can't let the browser cache the JS that's common across all pages, but it is good to concatenate and minify scripts so that you only download one or two bundles.
There's a lot of good discussion at the YSlow site: http://yslow.org/

Change href of anchor tag through HTML/CSS

I have a webpage (say http://www.example1.com) which contains an anchor tag that points to a different website (say http://www.example2.com). I have to test it on testing servers (for which the URLs are www.example1-test.com and example2-test.com) before publishing it. When the code is on example1-test.com the link should point to example2-test.com, and when it is on example1.com it should point to example2.com.
But I cannot use JavaScript or manually change code while switching the servers. I can only use html and css. Is there a way to do this?
I know this is a weird question, and CSS is used for styling and not to write logic, but I cannot use JavaScript for this unless there is a way to enable JavaScript in the browser through some HTML tags (say some meta tags, etc.).
I could put in two separate links pointing to example1.com and example2.com respectively and show/hide them depending on the environment, but then the question is where to put that condition (I cannot use JavaScript).
For testing I used to change my hosts file and redirect the live domains to my test IPs on the internal network. I follow links and my browser thinks I'm hitting example2.com, but actually my hosts file points at the IP of example2-test.com, so I see the new functionality. When both go live I know visitors will experience exactly what I've just tested, but using live DNS.
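
For example, hosts entries like these (the IPs are placeholders for your internal test servers):

# /etc/hosts (or C:\Windows\System32\drivers\etc\hosts on Windows)
# Point the live domains at the internal test machines while testing.
203.0.113.10  www.example1.com
203.0.113.11  www.example2.com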
Nowadays URLs in my applications are generated using routes and environments. I can specifically tell my framework that in development environments I expect URLs to be generated with XYZ base but behave differently in production.
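
A framework-agnostic sketch of that idea (the hostnames and the environment variable are assumptions; real frameworks have their own config mechanism):

// Pick the external base URL from the runtime environment.
var bases = {
  development: 'http://www.example2-test.com',
  production: 'http://www.example2.com'
};
var env = process.env.NODE_ENV || 'development'; // Node-style; adjust to your stack
var externalBase = bases[env];
// Links are then generated as externalBase + '/some/path'.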
Another option is Greasemonkey scripts. Greasemonkey is a browser extension that runs user scripts to manipulate pages, navigate, automate and more.
I can't think of a client-side way to achieve what you're looking for without the use of JavaScript or browser plugins.

Is it possible to cache an HTML page in a way that it can be requested from multiple URIs?

I'm in the process of building a single-page web application with friendly URLs. As an example, for a chat you could request /chat/roomname or /chat/roomname2 to connect to a different chatroom. Since this is an SPA however, both of those would lead to the same HTML contents.
Is it possible to tell the browser to cache both pages as a single page (as in, going to /chat/roomname would also cache /chat/roomname2 in the browser), or something that would give a similar effect? This way the HTML contents could be large and only have to be loaded once.
Alternatively I could do /#/chat/roomname or similar, though I'd prefer not to if the above is possible.
Separate paths are separate things for the browser. Even the same page with different query parameters is treated as different (hence the "cache busting" technique). However...
You can:
Off-load all scripts to external files so the browser caches them separately from the page.
Do the same with CSS as well.
Keep the page to a bare minimum.
If the resources (scripts, styles, images) won't be modified for a long period of time, you can set a longer expiry for them.
Instead of loading the entire layout as HTML, you can use AJAX to fly in templates (see the sketch below). That way, those layout requests get cached as well. Have JS assemble them when they're needed.
In the end, your pages will be devoid of all JS, CSS and markup that can be flown in and assembled using JS, which will make the page lighter than usual. You can take it a step further by minifying the scripts and styles, compressing images, and compressing the HTML.
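
A minimal sketch of the fly-in approach, assuming a /templates/chat.html fragment and an #app container (both made up); the fragment is cached like any other static resource when the server sends a long Cache-Control max-age:

// Fetch the shared chat template; the browser's HTTP cache serves
// repeat requests across /chat/roomname and /chat/roomname2 alike.
fetch('/templates/chat.html')
  .then(function (res) { return res.text(); })
  .then(function (html) {
    document.getElementById('app').innerHTML = html;
    // Then read the room name from location.pathname and connect.
  });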

How to set up a dynamic website with JavaScript only (no server side)

Here's my problem: I want to build a website, mostly static but with some dynamic parts (a little blog for news, etc..).
My webserver can only do static files (it's actually a public Dropbox directory!) but I don't want to repeat the layout in every HTML page!
Now, I see two possible solutions here: either I create an index.htm page that emulates site navigation with JavaScript and AJAX, or I create all the different HTML pages and then somehow import the layout bits with JavaScript.
I'm looking for ideas and suggestions on how to implement this and which libraries to use, or maybe there's even something tailored exactly to what I need?
Thanks!!
I would define the site layout in your index.html file, and then use JavaScript and Ajax to load the actual content into a content div on the page. That way your content files (fetched by Ajax) will be more or less plain HTML, with CSS classes defined in index.html. Also, I wouldn't recommend building a blog in pure HTML and JavaScript; it wouldn't be very interactive, with no comments, ratings, etc. You could, however, store your blog content in XML and then fetch and display it with Ajax and JavaScript.
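A minimal jQuery sketch of that shell (the #content id and the nav markup are assumptions):

// index.html keeps the layout; nav clicks swap only the content pane.
$(function () {
  $('nav a').on('click', function (e) {
    e.preventDefault();
    // Load the plain-HTML fragment into the content div.
    $('#content').load($(this).attr('href'));
  });
});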
While on the subject of XML, you could implement all your site content in XML. You should also store the list of pages (for generating navigation) as XML.
Just another way: you can generate static HTML on your computer and upload the result to Dropbox. Look at Emacs Muse.
jQuery allows you to easily load a section of one page into another page. I recommend loading common navigation sections into the different pages, rather than the other way around, to avoid back/forward problems. Layout can be done with a separate CSS file rather than with tables, to minimize the amount of repeated code. For the blog, you could put each blog entry in a separate file and load each section individually.
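For instance, jQuery's .load() accepts a URL plus a selector, so each static page can pull in just the shared fragment (shared.html and the ids are made up):

// On every page: fetch shared.html and insert only its #nav section.
$('#nav').load('shared.html #nav');
// Likewise, a blog page could fly in individual entries:
$('#latest').load('entries/first-post.html');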
However, I would just use something already available. TiddlyWiki, for example, is a self-contained wiki that is all in one file. It's very customizable, and there's already a blog plug-in available for it. You can work on the site on your hard drive or USB drive, and then you can upload it to the web when done. There's nothing more to it.
Have you considered using publishing software on your computer to combine your content with a template, resulting in a set of static pages that you can then upload to the dropbox?
Some options in this regard come to mind:
Movable Type - can output static HTML which can then be uploaded to the server
Adobe Dreamweaver
Apple iWork Pages
To handle comments, you can use Disqus. It inserts a complete comment system into your site using just JavaScript.
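The embed boils down to a placeholder div plus a script that pulls in Disqus's loader; roughly like this (use your own registered shortname, and check Disqus's current universal code, which changes over time):

<div id="disqus_thread"></div>
<script>
  // 'your-shortname' is the site identifier registered with Disqus.
  var disqus_shortname = 'your-shortname';
  (function () {
    var s = document.createElement('script');
    s.async = true;
    s.src = 'https://' + disqus_shortname + '.disqus.com/embed.js';
    (document.head || document.body).appendChild(s);
  })();
</script>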
You can use Google Closure Templates. It's one of the fastest and most versatile JavaScript templating solutions around.
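
Closure Templates live in .soy files and compile to plain JavaScript functions; a hello-world shaped like the project's documentation (the namespace and parameter are illustrative):

// greet.soy
{namespace examples.simple}

/**
 * Greets a person.
 * @param name The person's name.
 */
{template .helloName}
  Hello {$name}!
{/template}

After running the Soy-to-JS compiler, the template is just a function you can call from the page:

// The compiled template returns the rendered string.
document.getElementById('out').innerHTML =
    examples.simple.helloName({name: 'Ana'});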
