I developed a website for a school project, and locally on my PC it works like a charm! The CSS and JavaScript are perfect, and the website is very fluid and dynamic.
When I uploaded it to 000webhost, the website works well except for the JavaScript: the carousel images, sliders, etc. are static and don't move.
The JavaScript is embedded in each HTML file using <script></script> tags rather than kept in a separate .js file.
It works perfectly when run locally; this problem only occurs when I run it from the web host.
Please view the website and assist in any way possible; it will be much appreciated.
https://dut-it-tutors.000webhostapp.com/index.html
Thanks.
You're loading the jQuery library from a CDN without HTTPS. As your site is served over HTTPS, the browser blocks the request to your non-HTTPS resource because of mixed content.
To fix this, load jQuery from a CDN over HTTPS.
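For example, a minimal sketch of the fix (the CDN URL and version here are assumptions; use whichever release your site actually needs):

```html
<!-- An https:// (or protocol-relative //) URL avoids the mixed-content block -->
<script src="https://code.jquery.com/jquery-3.7.1.min.js"></script>
```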
Also, simply check the spelling of the file name, including upper and lower case:
<script src="./userLogin.js"></script>
is wrong when the actual file name is userlogin.js; it should be
<script src="./userlogin.js"></script>
File paths on most web hosts are case-sensitive (the servers typically run Linux), unlike on a local Windows machine, which is why a site can work locally and break once uploaded.
I'm working on a project to download a website two layers deep for offline browsing, but I'm facing problems with the CSS, JS, and images.
Right now my code saves the index HTML file and rewrites all the links as absolute URLs to avoid broken hrefs, but the result still doesn't work for offline browsing.
My question is: how can I write a script that downloads only two layers of the website and stores all the CSS, JS, and images for full offline browsing?
P.S. I know I can just use requests and write the files locally, but how do I put each file into the correct folder, e.g. /far/boo/image.png or /far/boo/css.css?
Thanks to the comment above, which pointed me in the right direction, I found my answer.
I ended up using requests with stream=True and a custom User-Agent header, plus a loop over the URLs, to do the job. Define head first:

```python
import requests

# I found a suitable User-Agent value at https://httpbin.org/headers
head = {"User-Agent": "Mozilla/5.0 ..."}

r = requests.get("http://somesites.com/far.boo", stream=True, headers=head)
```

To put each file into the correct folder, you can take the path component of the URL (for example with urllib.parse.urlparse), create the directories with os.makedirs, and write the response content there.
It's a bit ugly, but it works correctly.
Reference: download image from url using python urllib but receiving HTTP Error 403: Forbidden
I tried uploading my website to the free 000webhost service. My website loads fine locally, but when I upload it through FileZilla and reload the page I get an error that says "The jQuery library must be included before the smoothscroll.js file. The plugin will not work properly."
I have a smooth-scrolling JavaScript file and some jQuery in my code that I got from somebody else. Does anybody have any idea why my website loads fine locally, but the smooth scrolling doesn't load when I upload my site to a host? And yes, the jQuery is before the JavaScript.
Check for these possible problems:
- The files don't exist on the server at that path at all.
- You moved the files but didn't change the path in index.html.
- You changed the path in index.html but didn't move the files there.
Also check the files of the particular plugin on the hosting server; some files may be missing or not uploaded completely.
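If the paths all check out, confirm that jQuery really is included before the plugin on the uploaded copy of the page as well. A minimal sketch of the expected order (the CDN URL, version, and js/ folder are assumptions; match them to your actual files):

```html
<!-- jQuery must load first so the plugin can find it -->
<script src="https://code.jquery.com/jquery-3.7.1.min.js"></script>
<!-- path and case must match the server exactly, e.g. js/smoothscroll.js vs js/SmoothScroll.js -->
<script src="js/smoothscroll.js"></script>
```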
I am learning OpenLayers 3, and I ran into a problem while trying to pull basemaps into the browser using the JS Bin online editor.
If I write the exact same code in a local text editor (Notepad++), everything works as it should, but not when I am using JS Bin.
Here is the link with the code:
https://jsbin.com/wijoha/edit?html,css,console,output
Can you help me figure out what is wrong with it? I've already spent a couple of hours trying to solve the issue but can't get my head around it...
Looking at the console on the JS Bin you added (the browser window's console, rather than the JS Bin one), the CSS is not being loaded because you are attempting to pull an HTTP resource into an HTTPS page. The error message reads:
Mixed Content: The page at 'https://null.jsbin.com/runner' was loaded over HTTPS, but requested an insecure stylesheet 'http://openlayers.org/en/v3.10.1/css/ol.css'. This request has been blocked; the content must be served over HTTPS.
Additionally, your JS file ol.js is not loading at all, as openlayers.org does not seem to accept serving the file over HTTPS (for me at least, in Chrome).
Instead, try serving everything over HTTP (including the URL of JS Bin itself); here is a working example:
http://jsbin.com/focoxoxabo/edit?html,css,console,output
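For reference, the include tags in an all-HTTP version would look something like this (the stylesheet URL comes from the error message above; the /build/ol.js path is an assumption based on how the v3.10.1 release was typically hosted):

```html
<!-- plain http:// URLs, matching the http:// JS Bin page -->
<link rel="stylesheet" href="http://openlayers.org/en/v3.10.1/css/ol.css">
<script src="http://openlayers.org/en/v3.10.1/build/ol.js"></script>
```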
There is probably a better title for what I'd like to accomplish, but the details should be helpful.
I've recently learned that specifying a script's src path as //some.domain.com rather than http://some.domain.com or https://some.domain.com causes the browser to request the script using whichever protocol was used to load the page. This works great when the page is loaded from a site, but I often debug on my local system, so the protocol is file:, and of course errors occur whenever resources or scripts aren't found.
Other than changing src paths, is there a better way to debug locally? I imagine there is a code solution that detects when the page is running locally versus loaded from a domain, but I haven't found examples yet.
Install a product such as WampServer; then you'll have a localhost web server you can test everything on. This is how I do it, and it works like a charm.
There are similar products available for ASP or other non-PHP server-side technologies (you didn't specify); if you are just doing HTML + JS, then any old server would do.
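If you do want a code-level fallback instead, you can check location.protocol and inject the script tag with an explicit scheme when the page was opened from disk. A minimal sketch, assuming the domain from the question and an https: fallback (app.js is a hypothetical file name):

```html
<script>
  // Opened from disk, location.protocol is "file:", so a protocol-relative
  // URL like //some.domain.com/app.js would resolve to file://some.domain.com/...
  // Pick an explicit scheme in that case and inject the script tag ourselves.
  var scheme = location.protocol === 'file:' ? 'https:' : location.protocol;
  var s = document.createElement('script');
  s.src = scheme + '//some.domain.com/app.js'; // hypothetical script path
  document.head.appendChild(s);
</script>
```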
I noticed that when I open an HTML file locally by double-clicking on it, it will not "run" the same as it would if it were on a web server and opened via an HTTP GET request.
I need to have a local HTML file that a user can open by double-clicking on it. This HTML file has several jQuery load calls such as this:
$("#content").load("http://somepage.com/index.html");
I want to update several divs with content from remote sites.
This works fine if I have the file on a web server, but not if I double-click it in Windows Explorer... How can I "make" the file "run" as it would on a web server?
I think you pretty much cannot. This has to do with cross-domain access restrictions, which are there to prevent cross-site scripting and the like.
Files on your hard drive are especially limited; think what life would be like if they were allowed to treat your whole hard drive as a single domain.
If you want things to work properly, you need to be running a server. XAMPP is a pretty good bet, as it's easy to install and set up.
Any non-AJAX JavaScript will work fine as is, though, as long as the paths used to include any CSS or JS are relative.
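Once the files are served from a local server such as XAMPP, a same-origin load call works. A minimal sketch (the file names are hypothetical; note that loading a different domain like somepage.com would still be blocked unless that site sends CORS headers):

```html
<div id="content"></div>
<script src="jquery.min.js"></script>
<script>
  // Served from http://localhost/, this same-origin request succeeds;
  // the same call from a page opened via file:// is blocked by the browser.
  $("#content").load("partials/news.html");
</script>
```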
You can't do this locally; you have to have it hosted somewhere for it to work. It's done this way for the sake of security.
What are you trying to do that you "need" this for?