As of yesterday, my web application can no longer access the JavaScript files and certain style sheets on my server (everything works offline).
It started happening suddenly, and during my attempts to sort out the issue I also made a dummy application to test it:
http://intra.gchss.edu.mt
The scripts aren't accessible even though they are physically present. Tomcat reports: HTTP Status 404 - The requested resource (/scripts/javascript/script.js) is not available.
In my attempts I also tried creating several other applications, and I tweaked context.xml to stop Tomcat from caching:
<Context cachingAllowed="false" cacheMaxSize="0" cacheTTL="1">
    <WatchedResource>WEB-INF/web.xml</WatchedResource>
</Context>
But they all behave the same way. (I know it's irrelevant for new applications, but I tried it anyway.)
I checked the directory and file permissions, and they're 777, so they should be accessible and readable by anyone. But I still can't access them.
There are two things I haven't tried yet and would rather not try:
Re-installing Tomcat
Putting the scripts in the main directory.
Any thoughts? I'm running Tomcat 5.5 on Debian.
Thanks!
The Problem:
I edit an asset file such as .js or .css via my code editor, Sublime Text 3. I then save those files to the server via an SFTP plugin in Sublime. When I then refresh the live website from my Chrome browser to view the changes (I have a plugin that flushes the browser cache so I see new changes), I sometimes get an error in the Chrome console that reads:
net::ERR_HTTP2_PROTOCOL_ERROR 200
The browser is not served the requested file. When I check the error log on the server, I see the following:
[alert] 657967#657967: *188534 pread() read only 7497 of 7498
My server is set up with Nginx running as the web server and reverse proxy in front of Apache.
How can I make it so that Nginx does not fail to serve the requested files even though they were just edited? Maybe it could send back a cached copy until the new file is fully written? Please advise, because this is driving me nuts and I have no idea how to overcome it.
My workflow for JavaScript consists of writing code and refreshing the live site to view the web console in Chrome. I need to be able to view the changes I made on the server via the browser. I don't like local environments. I've tried to Google this topic many times with no luck, so any help would be much appreciated.
I have a locally-stored project whose directory structure is the following (I minimized non-relevant folders):
What I want to do is add a <header> element to an HTML file, like index.html, whose contents are loaded from an external HTML file, so that all I have to write in index.html is <header>, and my solution loads the content automatically.
To do this, I'd like to use JavaScript (preferably jQuery, but I'll accept other solutions if they work where jQuery doesn't, or if they execute faster than jQuery).
I don't think I should use an <iframe>, as it would probably increase loading times more than jQuery/JavaScript (which, like I said, is what works now, when the website is live).
Right now, I'm using the jQuery .load() function. I don't know much about jQuery, but I've been told it should work locally - and for me, it doesn't.
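The relevant call is essentially this (a minimal sketch - the actual selector and path come from my project):

// main_script.js - once the DOM is ready, fetch header.html and inject it into <header>
$(function () {
    $('header').load('header.html');
});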
My browser's console shows me the problem:
jquery-3.1.1.min.js:4 XMLHttpRequest cannot load file:///C:/Users/GalGr/Desktop/eiomw/header.html. Cross origin requests are only supported for protocol schemes: http, data, chrome, chrome-extension, https, chrome-extension-resource.
And I'm trying to overcome it.
This code works on my live website - it might not exactly match the code in the files linked below, but that doesn't matter; it's their code that matters.
This is the index.html file: index.html
This is the header.html file: header.html
This is the main_script.js file: main_script
The reason you're having a problem with this locally mainly comes down to security measures in your browser.
Essentially, whenever you use jQuery's load() function, it makes a separate HTTP request (an approach known as AJAX) for the file or URL you give it.
Modern browsers enforce that a URL requested via AJAX comes from the same origin (server) as the page, as a security feature to stop pages from silently loading content from anywhere on the internet in the background. In your case it seems like this shouldn't affect you, because you're browsing your pages locally and the request you're making with load() is also for a local file (header.html).
However, I am assuming you're just opening the page directly in your browser, so your browser's URL looks something like 'file:///C:/Users...' (similar to the example in the error message you gave). This means your browser is reading the file straight from disk and interpreting it as HTML to display the page. It seems likely you don't actually have a local HTTP server hosting the page, otherwise the URL would start with 'http://'. That is why the browser raises the security error, even though your AJAX request for header.html technically comes from the same source as the page it runs on.
Your live site, by contrast, is hosted by an HTTP server, so everything works fine there: requests go over HTTP as normal and this security feature doesn't get in your way.
I would suggest that you simply run an HTTP server locally on your dev machine. You don't even need to 'install' one per se; there are plenty of development HTTP servers that run standalone, so you can start one up whenever you want to browse your local HTML files. As you appear to be on Windows, I'd check out IIS (Windows' HTTP server) or IIS Express (like IIS, but standalone). There are also many others available, like Apache, Nginx, etc.
If you do this, you can host your pages at something like 'http://localhost/index.html'. Then any AJAX requests you make for local files will work fine, just like on your server.
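If you have Node.js available, for example, a throwaway static server is only a few lines (a rough sketch - the file name, port, and MIME table here are arbitrary choices):

// serve.js - minimal static file server for local testing (no external packages)
const http = require('http');
const fs = require('fs');
const path = require('path');

const types = { '.html': 'text/html', '.js': 'text/javascript', '.css': 'text/css' };

http.createServer((req, res) => {
    // map the request URL onto the current folder, defaulting to index.html
    const file = path.join(__dirname, req.url === '/' ? 'index.html' : req.url);
    fs.readFile(file, (err, data) => {
        if (err) { res.writeHead(404); res.end('Not found'); return; }
        res.writeHead(200, { 'Content-Type': types[path.extname(file)] || 'text/plain' });
        res.end(data);
    });
}).listen(8080, () => console.log('Serving on http://localhost:8080'));

Run it with node serve.js from your project folder and browse to http://localhost:8080/index.html; the load() request for header.html then goes over HTTP, just like on your live site.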
Hope that makes sense, and that I'm not telling you something you already know.
Why not use something more straightforward, like mustache.js?
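For what it's worth, basic usage is just this (a minimal sketch, assuming mustache.js is already included on the page):

// render a template string against a data object and inject the result
var rendered = Mustache.render('<h1>{{title}}</h1>', { title: 'My header' });
document.querySelector('header').innerHTML = rendered;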
I found a solution:
Using PhpStorm's built-in local server, I was able to serve the project so that my requests and responses are handled over HTTP.
I use a number of CDN links in my web application for JavaScript and CSS, e.g.:
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.2.3/jquery.min.js"
integrity="sha384-I6F5OKECLVtK/BL+8iSLDEHowSAfUo76ZL9+kGAgTRdiByINKJaqTPH/QVNS1VDb"
crossorigin="anonymous"></script>
Usually everything works fine, but sometimes I get this message in Firebug console:
None of the "sha256" hashes in the integrity attribute match the content of the subresource.
If that happens, my JavaScript doesn't load and my application is broken. A simple refresh resolves it. Rather than getting rid of the CDN links and hosting the files myself, I would like to fix this. Is this a common problem?
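For reference, my understanding is that the integrity value is just a base64-encoded digest of the file's bytes, so a downloaded copy can be checked with a few lines of Node (a sketch - the local file name is assumed):

// verify-sri.js - recompute the sha384 value for a local copy of the script
const crypto = require('crypto');
const fs = require('fs');

const body = fs.readFileSync('jquery.min.js');
const digest = crypto.createHash('sha384').update(body).digest('base64');
console.log('sha384-' + digest); // compare this against the integrity attribute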
One possible explanation for this is that your system time is sufficiently off. I was running Debian in a VirtualBox instance and had hibernated the host machine a few times without touching the VM again. That's when I noticed certain web pages weren't loading properly in Firefox within the VM. Once I got here, it occurred to me to check the system time; sure enough, it was off by nearly two hours. ntp was not installed, so I installed that package (sudo aptitude install ntp), verified that the date/time was updated with date, and then tested Firefox again. The problematic web pages (including this one) worked.
Make sure your network connection is working, or, if your browser is set up to use a network proxy, that the proxy is working too.
I was seeing this message when loading HTML locally (e.g., File -> Open File) in a browser where the integrity check would fail because I did not have the network proxy (via SSH tunnel) working at the time. As soon as I restored my network connectivity, the page loaded and these messages went away (assuming, of course, that the integrity attribute values are correct).
There is probably a better title for what I'd like to accomplish, but the details should be helpful.
I've recently learned that specifying a script's src path as //some.domain.com rather than http://some.domain.com or https://some.domain.com causes the browser to request the script using whichever protocol was used to load the page. This works great when the page is loaded from a site, but I often debug on my local system, so the protocol is file, and of course errors occur whenever resources or scripts aren't found.
Other than changing the src paths, is there a better way to debug locally? I imagine there is a code solution that detects whether the page is running locally or was loaded from a domain, but I haven't found examples yet.
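To illustrate, what I'm imagining is something along these lines (an untested sketch - the domain and path are placeholders):

// fall back to a concrete protocol when the page was opened from disk,
// otherwise reuse whatever protocol loaded the page
var protocol = window.location.protocol === 'file:' ? 'http:' : window.location.protocol;

var script = document.createElement('script');
script.src = protocol + '//some.domain.com/script.js';
document.head.appendChild(script);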
Install a product such as WampServer; then you'll have a localhost web server you can test everything on. This is how I do it, and it works like a charm.
There are similar products available for ASP or other non-PHP server-side technologies (you didn't specify); if you are just doing HTML + JS, then any old server will do.
OK, so I'm lost here, frustrated, and pulling my hair out. Plus I'm probably about to be fired or take a pay cut.
I moved files from a development server to my local machine. The files are consistent (I used a diff tool) and all the dependencies are there. It works for the most part. The problem is that some of the JavaScript (not all of it) is just not working. We're using jQuery and a lot of plugins for it. I've checked with the Web Developer plugin in Firefox, and all the JS files are loading. I've cleared the cache in both Firefox and Chrome multiple times, to no avail. The development server is a Windows server running WAMP. My local machine is running Ubuntu. Somebody tell me what I missed.
Download Firebug as a Firefox extension and view the HTTP requests and responses.
The easiest way may be to use the 'Net' tab to determine whether your script is making a request.
Very likely it is a source-domain issue. There is no workaround for this: the AJAX request and the source data must be on the same domain.
It may have something to do with JavaScript's security limitations: in certain circumstances, you can only operate on URLs or pages from the current domain, which most likely changed when you moved the files off the other server. More here.
Are you running the files via a web server, or just opening them directly? If it's the latter, you'll want to set up a server on your local machine for local testing and serve the files through it. Otherwise, you'll very likely run into the domain restrictions others have mentioned above.
You may need to host the site using a local server. VS Code has an extension called Live Server; you need to set up a workspace for it to work. The port used on my machine was 5500.
You need to make sure any dependencies for your JavaScript are installed and running on the server, or the JavaScript will not execute. These dependencies are listed in the package.json file.
For example, if you require Express, you need to be running Node or the JavaScript won't execute in your web browser (see the sketch below).
In the terminal:
node app.js
Any dependencies that are not installed and running on the server will not execute.
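For example, a minimal Express app looks something like this (a sketch - it assumes Express was installed with npm install express and that the static assets live in ./public):

// app.js - start the server and serve everything in ./public, including your JavaScript files
const express = require('express');
const app = express();

app.use(express.static('public'));
app.listen(3000, () => console.log('Listening on http://localhost:3000'));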
Are you accessing the HTML pages through the web server, and not simply double-clicking the files to open them?
Also, if you have the Web Developer toolbar installed, click "Disable", then "Disable JavaScript", and make sure "All JavaScript" isn't ticked.