I've noticed that my Facebook app sometimes runs slowly, and when I checked, the cause was the all.js file loading slowly from Facebook's server. So I copied the file onto my own server and tested it.
Everything seems to work fine, and it actually runs faster. My question is: are there any bugs or problems with doing this?
The problem here is that you're shifting a dependency, and by extension the maintenance of that dependency, onto your own application. While it's hosted on Facebook's servers, they can update it to fix bugs or add features; your local copy will silently fall behind.
If it's taking a long time to load, you should bring it up on their support forums.
Your page has to load the all.js file in any case, and Facebook's servers should be faster than the server hosting your website, so in theory loading the JS file from Facebook should be faster.
A better approach would be to cache the file for some time. This will make page loads after the initial one much, much faster.
As others have mentioned, the all.js file is constantly updated with bug fixes and new features, so it is better to fetch the newest version from Facebook than to manually refresh a copy on your own server every so often.
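Whichever server the file comes from, it also helps to load it asynchronously so a slow response can't block page rendering. A sketch of the asynchronous loading pattern (this roughly mirrors the snippet Facebook's own docs describe; the element id is just a marker):

```js
// Load the Facebook SDK asynchronously so a slow server can't block rendering.
(function (d, s, id) {
  var js, fjs = d.getElementsByTagName(s)[0];
  if (d.getElementById(id)) return;        // bail out if already loaded
  js = d.createElement(s);
  js.id = id;
  js.src = "//connect.facebook.net/en_US/all.js";
  fjs.parentNode.insertBefore(js, fjs);    // insert before the first script tag
}(document, 'script', 'facebook-jssdk'));
```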
You can run into problems when Facebook updates the API: you would need to re-download the file regularly and frequently (every 5 minutes?) to stay current.
Related
Last year we put a website online that was built with Gatsby and used a service worker (SW) for caching files. A month ago, we launched a complete rework of this website on a completely different tech stack (i.e., no longer using Gatsby or a SW). The new version runs on Webflow.
My problem is that the previous service worker is still installed in our customers' browsers. Any previous visitor sees the old version of the website, because the old SW cached everything.
I have no way of pushing another JS script to unregister the SW, because all the JS files are cached by that same SW! The old version of the website now effectively runs offline, and I don't see a way of removing the old SW and its caches (without asking customers to remove the SW manually, of course).
Does anyone have any ideas? :D
EDIT with recent observations:
The solution is certainly to provide a new sw.js file.
The easy way would have been to have Webflow (where the new website is hosted) serve a new sw.js file. Unfortunately, it does not seem to be possible (see here).
But I could 301-redirect sw.js to a different location. Unfortunately (again), browsers don't follow redirects when fetching a service worker script.
So I am still stuck there.
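For reference, here is the kill-switch sw.js I would serve if I ever get control of that URL again: a sketch that installs over the old worker, wipes its caches, and unregisters itself. It assumes nothing about the old worker's cache names, so it deletes them all:

```js
// sw.js - a "kill switch" worker that replaces the old one and cleans up.
self.addEventListener('install', () => {
  self.skipWaiting(); // activate immediately, don't wait for old tabs to close
});

self.addEventListener('activate', (event) => {
  event.waitUntil((async () => {
    // Delete every cache the old worker created
    const keys = await caches.keys();
    await Promise.all(keys.map((key) => caches.delete(key)));
    // Remove this registration entirely
    await self.registration.unregister();
    // Reload any open tabs so they fetch fresh content from the network
    const tabs = await self.clients.matchAll({ type: 'window' });
    tabs.forEach((tab) => tab.navigate(tab.url));
  })());
});
```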
On Cloudflare, I want to disable caching so that I see my website changes immediately after I've pushed them live.
Things I've tried:
Turned on Development Mode.
Created a cache bypass in Page Rules.
Purged an individual web page.
Purged the entire website.
Set the cache to expire every 2 hours.
None of the above worked.
Tech I'm using:
Angular 2
SystemJS
TypeScript, which is compiled to JavaScript at build time.
Firebase for hosting and the database.
Cloudflare for SSL, etc.
The only way people see my website changes is if they hard refresh.
The main problem is a JavaScript file called app.js, which contains all the JavaScript for my Angular app. The browser doesn't seem to try to re-fetch the resource.
I've changed app.js to app.js?1490959855777, and it still doesn't fetch the file again.
I basically want users to see my new JS without having to hard refresh.
Based on the discussion above, it looks like the caching is happening in the browser, since a hard refresh does get the new file contents.
I think what happened is that Cloudflare told the browser to hold onto that file for a very long time, and the browser is honoring that instruction.
Because you can't ask your users to do a hard refresh, you'll need to rename the static files that are being cached so aggressively.
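For example, a tiny Node build step along these lines (a sketch; the ./public directory, app.js, and index.html paths are placeholders for your real layout):

```js
// rename-with-hash.js - copy app.js to a content-hashed filename and
// update index.html to reference it. Paths here are placeholders.
const crypto = require('crypto');
const fs = require('fs');
const path = require('path');

const publicDir = path.join(__dirname, 'public');
const source = path.join(publicDir, 'app.js');

// Hash the file contents so the name changes only when the code changes
const hash = crypto.createHash('md5')
  .update(fs.readFileSync(source))
  .digest('hex')
  .slice(0, 8);

const hashedName = `app.${hash}.js`;
fs.copyFileSync(source, path.join(publicDir, hashedName));

// Point index.html at the new filename
const indexPath = path.join(publicDir, 'index.html');
const html = fs.readFileSync(indexPath, 'utf8');
fs.writeFileSync(indexPath, html.replace(/app(\.[0-9a-f]{8})?\.js/, hashedName));

console.log(`wrote ${hashedName} and updated index.html`);
```

Because the hash is derived from the file contents, the cache key changes exactly when the code changes, so even an aggressive edge cache can never serve stale JS against fresh HTML.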
This is not a programming question per se. I am using a free web host called getfreehosting, and I use their online file manager to transfer files. From time to time, the changes I make to the source code are NOT reflected immediately after I upload them. That is, when I run my application in Chrome and view the page source, I realize the JavaScript running is still the old version! In most cases this doesn't happen, but when it does it is extremely frustrating. I've tried clearing the browser's cache. I've even tried editing the file directly on their servers. Sometimes that solves the problem, but other times it doesn't.
Is this a common issue encountered when transferring files to a web host? Or perhaps this is one of the downsides of using a free web host?
Thanks.
You can try clearing your browser's cache, or the ol' CTRL+F5 refresh trick. Otherwise, the hosting provider may be using a caching layer to help ease resource usage.
It is the responsibility of the server to tell the browser, when the script files are served, how long they may be cached (1 hour, 1 day, 1 month, etc.). This is a server-side setting.
Caching is very important for both server-side efficiency and client-side performance so you don't want to defeat it completely.
You can either shorten the server-side cache lifetime, or you can put a version number in your script filenames (as jQuery does), so that when you revise a script you give it a new name like "myscript-v12.js" and update the corresponding HTML files to refer to it. Then, as soon as the browser gets the new HTML file, it is guaranteed to fetch the new JS file, because that filename could never have been in the browser cache.
If this is just an issue for you personally while developing and revising your site, then just clear your browser cache after you upload new files; when your browser next loads the page, it won't have any version in the cache and will be forced to get the new version from the server.
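On a server you do control, the cache lifetime is set wherever the static files are served from. A sketch with Express (a hypothetical setup, since a free host typically doesn't expose this setting):

```js
// Serve static files, but tell browsers to revalidate scripts on every load.
const express = require('express');
const app = express();

app.use(express.static('public', {
  setHeaders(res, filePath) {
    if (filePath.endsWith('.js')) {
      // no-cache means "store it, but check with the server before reusing it"
      res.setHeader('Cache-Control', 'no-cache');
    }
  }
}));

app.listen(3000);
```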
Modern browsers have a cache. Try clearing your cache before you browse your website.
I version all of my client-side JS files like "/js/myfile.js?v=3903948" so that my clients don't need to clear their browser cache to get the updated files. But every time I push an update, without fail, at least one person ends up running the old version and gets some kind of error. I used to think this was just a matter of them already being on the page during the release and needing to reload, but it happened to me today when I had definitely not been on the page beforehand. I browsed to the live site and was running the old code; I needed to refresh that page to get the new file.
What can cause this?
P.S. I was using Chrome on Windows 7, but I have seen clients report this on all sorts of browsers.
If your main web page can itself be cached, then the old version of that page may be requesting the old version of the JS file. JS-file versioning works best if the page that actually refers to the JS file cannot be cached, or has a very short cache time.
I agree with jfriend00 that the web page itself is being cached and is thus requesting the old JavaScript version.
To prevent this, you can load the JavaScript file via an AJAX (POST) request, either asking the server for the accurate (latest) version number to download, or requesting the JavaScript itself and inserting it, e.g., into the head of the page.
Edit: see for example here
I make a quick AJAX request to the server for the version it expects the client to have, then force a page refresh if the client's script is old.
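Something along these lines (a sketch; the /api/version endpoint and the baked-in constant are hypothetical):

```js
// Version stamped into this bundle at build time (hypothetical value)
const CURRENT_VERSION = '3903948';

// Ask the server which version it expects clients to be running
fetch('/api/version', { cache: 'no-store' })
  .then((res) => res.text())
  .then((serverVersion) => {
    if (serverVersion.trim() !== CURRENT_VERSION) {
      // The client is running stale code; reload to pick up the new files
      window.location.reload();
    }
  });
```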
It seems that a proxy or load balancer is serving old content instead of new. Also check the IIS/web server settings for how these files are cached and expired.
You can check what is going on on the wire with tools like Fiddler.
There is probably a better title for what I'd like to accomplish, but the details should be helpful.
I've recently learned that specifying a script's src path as //some.domain.com rather than http://some.domain.com or https://some.domain.com causes the browser to request the script using whichever protocol was used to load the page. This works great when the page is loaded from a site, but I often debug on my local system, where the protocol is file:, and of course errors occur whenever resources or scripts aren't found.
Other than changing the src paths, is there a better way to debug locally? I imagine there is a code solution that detects when the page is running locally versus loaded from a domain, but I haven't found examples yet.
Install a product such as WampServer; then you'll have a localhost web server you can test everything on. This is how I do it, and it works like a charm.
There are similar products available for ASP or other non-PHP server-side technologies (you didn't specify); if you're just doing HTML + JS, then any old server will do.
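That said, if you do want the in-code detection you imagined, something like this sketch works when scripts are injected dynamically (the domain is a placeholder from your example):

```js
// Fall back to an explicit protocol when the page was opened from the
// local filesystem (file://); otherwise stay protocol-relative.
var protocol = window.location.protocol === 'file:' ? 'http:' : '';
var script = document.createElement('script');
script.src = protocol + '//some.domain.com/some-script.js';
document.head.appendChild(script);
```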