There is probably a better title for what I'd like to accomplish, but the details below should be helpful.
I've recently learned that specifying a script's src path as //some.domain.com rather than http://some.domain.com or https://some.domain.com causes the browser to request the script using whichever protocol was used to load the page. This works great when the page is loaded from a site, but I often debug on my local system, so the protocol is file:, and of course errors occur whenever resources or scripts aren't found.
Other than changing src paths, is there a better way to debug locally? I imagine there's a code solution that detects when the page is running locally rather than loaded from a domain, but I haven't found examples yet.
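One common approach (not from the original posts, just a sketch) is to check window.location.protocol at runtime and use an explicit protocol when the page was opened from disk. The domain comes from the question above; the script name lib.js is a placeholder:

    // Sketch: if the page was opened from disk (file:), protocol-relative
    // URLs ("//some.domain.com/lib.js") resolve to file:// and fail.
    // Detect that case and load the script with an explicit protocol instead.
    var isLocal = window.location.protocol === 'file:';
    var base = (isLocal ? 'https:' : '') + '//some.domain.com';

    var script = document.createElement('script');
    script.src = base + '/lib.js';   // "lib.js" is a placeholder name
    document.head.appendChild(script);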
Install a product such as WampServer; then you'll have a localhost web server you can test everything on. This is how I do it, and it works like a charm.
There are similar products available for ASP or other non-PHP server-side technologies (you didn't specify); if you are just doing HTML + JS, then any old server will do.
Related
I have a locally-stored project whose directory structure is the following (I minimized non-relevant folders):
What I want to do is add a <header> to an HTML file, like index.html, whose contents are loaded from an external HTML file, so that all I have to write in index.html is <header>, and my solution loads the content automatically.
To do this, I'd like to use JavaScript (preferably jQuery, but I'll accept other solutions if they work where jQuery doesn't, or if they execute faster than jQuery).
I don't think I should use an <iframe>, because it would probably increase loading times more than using jQuery/JavaScript (which, like I said, is what works now, when the website is live).
Right now, I'm using jQuery's .load() function. I don't know much about jQuery, but I've been told that it should work locally, and it doesn't, for me.
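For reference, the relevant call looks roughly like this (a sketch based on the description above, not the actual main_script.js):

    // Sketch: replace the empty <header> with the contents of header.html.
    // Works when the page is served over HTTP, but fails with a
    // cross-origin error when the page is opened via file://.
    $(function () {
      $('header').load('header.html');
    });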
My browser's console shows me the problem:
jquery-3.1.1.min.js:4 XMLHttpRequest cannot load file:///C:/Users/GalGr/Desktop/eiomw/header.html. Cross origin requests are only supported for protocol schemes: http, data, chrome, chrome-extension, https, chrome-extension-resource.
And I'm trying to overcome it.
This code works on my live website. It might not be up to date with the code of the files I linked to below, but that doesn't matter; their code is what matters.
This is the index.html file:
index.html
This is the header.html file:
header.html
This is main_script.js:
main_script.js
The reason you're having a problem with this locally is mainly down to security measures in your browser.
Essentially, whenever you use jQuery's load() function, it makes a separate HTTP request (an approach known as AJAX) for the file or URL you give it.
Modern browsers enforce that the URL you request using AJAX methods comes from the same origin (server) as the page, as a security feature to stop pages silently loading content from anywhere on the internet in the background. In your case it seems like this shouldn't affect you, because you're browsing your pages locally and the request you're making with load() is also for a local file (header.html).
However, I am assuming you're just opening up the page directly in your browser, so your browser's URL will look something like 'file:///C:/Users...' (similar example in the error message you gave). This means your browser is directly reading the file from disk and interpreting it as HTML to display the page. It seems likely you don't actually have a local HTTP server hosting the page, otherwise the URL would start with 'http://'. It is for this reason that the browser is giving the security error, even though your AJAX request for header.html is technically from the same source as the page it is executed on.
Your live site, on the other hand, is served by an HTTP server, so everything works fine there: requests go over HTTP as normal, and this security feature does not get in your way.
I would suggest that you simply install an HTTP server locally on your dev machine. You don't even need to 'install' one per se; there are plenty of development HTTP servers that just run standalone, so you start them up when you want to browse your local HTML files. As you appear to be on Windows, I'd check out either IIS (Windows' HTTP server) or IIS Express (like IIS but runs standalone). There are also many others available, like Apache, Nginx, etc.
If you do this, you can host your pages on something like 'http://localhost/index.html'. Then any AJAX requests you make for local files will work fine, just as they do on your live server.
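If you'd rather not install IIS, a tiny Node.js static server is another option (a sketch, assuming Node is installed; the port and the very simple URL-to-file mapping are arbitrary):

    // serve.js - minimal static file server for local testing (sketch)
    var http = require('http');
    var fs = require('fs');
    var path = require('path');

    http.createServer(function (req, res) {
      // Map "/" to index.html, everything else to the matching file on disk.
      var file = req.url === '/' ? 'index.html' : req.url.slice(1);
      fs.readFile(path.join(__dirname, file), function (err, data) {
        if (err) {
          res.writeHead(404);
          res.end('Not found');
          return;
        }
        res.writeHead(200);
        res.end(data);
      });
    }).listen(8080, function () {
      console.log('Serving on http://localhost:8080/');
    });

Run it with node serve.js from the project folder and browse to http://localhost:8080/index.html; the load('header.html') request then goes over HTTP and the same-origin check passes.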
Hope that makes sense, and that I'm not telling you something you already know.
Why not use something more straightforward, like mustache.js?
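For example, the shared header could live in a small Mustache template and be rendered into the page. This is only a sketch: the template, the links, and the target element are made up for illustration, and it assumes mustache.js is included on the page:

    // Sketch: render a shared header template into the <header> element.
    var headerTemplate =
      '<nav>' +
      '  <a href="{{home}}">Home</a>' +
      '  <a href="{{about}}">About</a>' +
      '</nav>';

    document.querySelector('header').innerHTML =
      Mustache.render(headerTemplate, { home: 'index.html', about: 'about.html' });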
I found a solution:
Using PhpStorm's built-in local server, I was able to emulate a server that handles my requests and responses.
I have a website that has grown somewhat large and is built on a super-restrictive platform (SBI). There you have to follow their file structure, put everything in the appropriate folder, and then upload each and every file through their interface manually. I have a cool HTML5 template and some JavaScript with a lot of little files and images, so it was just way easier to upload all of this to my OTHER DOMAIN, hosted by HostGator, using FileZilla, and then refer to the CSS and JS files from my SBI site at their location on my HostGator domain.
Are there any potential issues with this method?
The reason I am asking is that yesterday I came across Google's article on serving resources from a consistent URL: https://developers.google.com/speed/docs/best-practices/payload#duplicate_resources However, I might be misunderstanding what it means. When I put my actual URL into Google's PageSpeed Insights (https://developers.google.com/speed/pagespeed/insights), it advises me to serve resources from a consistent URL, but in the details it doesn't complain about my CSS and JS files; it complains about Facebook only, like this:
Suggestions for this page:
The following resources have identical contents, but are served from different URLs. Serve these resources from a consistent URL to save 1 request(s) and 24.3KiB.
http://static.ak.facebook.com/.../xd_arbiter.php?...
https://s-static.ak.facebook.com/.../xd_arbiter.php?...
I appreciate you reading this. Thanks in advance!
Serving static content from a different domain is common practice; I don't see any issues there. It's as safe and reliable as the server you are using to serve it.
The Facebook warning could mean you are loading the same FB API script twice, or it may just be some black magic done by the FB devs.
You should not have any problems with hosting your files on a different site. Your users may experience a slightly slower page load because their machine has to do more DNS lookups; on the other hand, most web browsers only download a maximum of two files from a host simultaneously, so doubling your hosts can double your simultaneous downloads. That warning about Facebook is because the same script is being downloaded twice from two different places, which is not ideal, but I'm not familiar with the Facebook API so I'm not sure if that can be helped.
I've been noticing that sometimes my Facebook app runs slow, and when I checked, it was because the all.js file was not loading from the Facebook server, so I copied the file onto my server and tested it.
Everything seems to work fine, and actually it runs faster. My question is: do you know if there are bugs or errors in doing this?
The problem here is that now you're shifting a dependency, and by extension the maintenance of that dependency to your local application. If it's hosted on Facebook's servers, they can update it to fix bugs or add features.
If it's taking a long time to load, you should bring it up on their support forums.
Your page has to load the all.js file in any case. Facebook's servers should be faster than the server which hosts your website, so theoretically loading the JS file from Facebook should be faster.
A better approach would be to cache the file for some time. This will make page loads after the initial one much, much faster.
As people have mentioned, the all.js file is updated constantly with bug fixes etc., so it is always better to get the newest version of the file instead of manually updating it on your server from time to time.
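For context, the asynchronous loader Facebook documented for all.js at the time looked roughly like this; loading the SDK this way keeps the file on Facebook's CDN, where the browser can cache it (the app ID is a placeholder):

    // Classic asynchronous loader for the (old) all.js Facebook SDK.
    window.fbAsyncInit = function () {
      FB.init({ appId: 'YOUR_APP_ID', xfbml: true });  // placeholder app ID
    };

    (function (d, s, id) {
      var js, fjs = d.getElementsByTagName(s)[0];
      if (d.getElementById(id)) return;                  // already loaded
      js = d.createElement(s);
      js.id = id;
      js.src = '//connect.facebook.net/en_US/all.js';    // protocol-relative URL
      fjs.parentNode.insertBefore(js, fjs);
    }(document, 'script', 'facebook-jssdk'));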
You can have problems when Facebook updates the API: you would need to update the file regularly and frequently (every 5 minutes?).
Ok, so I'm lost here, frustrated, and pulling my hair out. Plus I'm probably about to be fired or take a pay cut.
I moved files from a development server to my local machine. The files are consistent (I used a diff tool) and all the dependencies are there. It works for the most part. The problem is that some of the JavaScript (not all) is just not working. We're using jQuery and a lot of plugins for it. I've checked with the Web Developer plugin in Firefox and all the JS files are loading. I cleared the cache in both Firefox and Chrome multiple times to no avail. The development server is a Windows server running WAMP. My local machine is running Ubuntu. Somebody tell me what I missed.
Download Firebug as a Firefox extension and view the HTTP requests and responses.
The easiest way may be to use the 'Net' tab to determine whether your script is making a request.
It is very likely a source-domain issue. There is no work-around for this: the AJAX request and the source data must be on the same domain.
It may have something to do with JavaScript's security limitations. (In certain circumstances) You can only operate on URLs or pages from the current domain, which most likely changed when you moved the files off the other server. More here.
Are you running the files via a webserver, or just opening the files directly? If it's the latter, you'll want to set up a server on your local machine for local testing, and serve the files using it. Otherwise, you'll very likely run into the domain restrictions others have mentioned above.
You may need to host the site using a local server. VS Code has an extension called Live Server. You need to set up a workspace in order for it to work. The port used on my machine was 5500.
You need to make sure any dependencies for your JavaScript are running on your server, or the JavaScript will not execute. These dependencies are listed in the package.json file.
For example, if you require Express, you need to be running Node or the JavaScript won't execute in your web browser.
In the terminal:
node app.js
Any dependencies that are not installed and running on the server will not execute.
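For illustration, a minimal app.js behind that command might look like this (a sketch, assuming Express has been installed with npm install express; the folder name and port are assumptions):

    // app.js - minimal Express server that serves the site's static files.
    var express = require('express');
    var app = express();

    // Serve index.html, scripts, and stylesheets from the "public" folder
    // (the folder name is an assumption for this sketch).
    app.use(express.static('public'));

    app.listen(3000, function () {
      console.log('Listening on http://localhost:3000/');
    });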
Are you accessing the HTML pages through the web server, and not simply double-clicking the file to open it?
Also, if you have the Web Developer toolbar installed, click "Disable", then "Disable JavaScript", and make sure "All JavaScript" isn't ticked.
I noticed that when I open an HTML file locally by double-clicking on it, it will not "run" the same as if I had it on a web server and opened it with an HTTP GET request.
I need to have a local HTML file a user can open by double-clicking on it. This HTML file has several jQuery load() calls such as this:
$("#content").load("http://somepage.com/index.html");
I want to update several divs with content from remote sites.
This works fine if I have this file on a web server, but not if I double-click it in Windows Explorer... How can I "make" the file "run" as it would on a web server?
I think you pretty much cannot. This has to do with domain-access restrictions, which are there to avoid cross-site scripting and the like.
The files on your hard drive are especially limited; think what life would be like if scripts were allowed to treat your whole hard drive as a single domain.
If you want things to work properly you need to be running a server. XAMPP is a pretty good bet as it's easy to install and set up.
Any non-AJAX JavaScript will work fine as is, though, as long as the paths used to include any CSS or JS are relative.
You can't do this locally. You have to have it hosted somewhere for this to work. It's done this way for the sake of security.
What are you trying to do that you "need" to have this?