I'm working on a project to download a website two layers deep for offline browsing.
However, I'm having trouble with the CSS, JS, and images.
Right now my code saves the index HTML file and converts all the links to absolute URLs to avoid broken hrefs,
but that still doesn't work for offline browsing.
My question is: how can I write a script that downloads only two layers of a website and stores all the CSS, JS, and images for full offline browsing?
PS. I know I can just use requests and write the files locally, but how do I put them in the correct folders?
e.g.
/far/boo/image.png or /far/boo/css.css
Thanks to the comment above for pointing me in the right direction to find my answer.
I ended up using requests.get("http://somesites.com/far.boo", stream=True, headers=head) with a loop to do the job.
Define head first:
head = {"User-Agent": "Mozilla/5.0 ..."}
I found mine at https://httpbin.org/headers
It's a bit ugly, but it works correctly.
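Here's a minimal sketch of that loop, assuming you've already collected a list of asset URLs to fetch (the domain and file names below are placeholders from the question). It mirrors each URL's path into a local folder, which answers the "how do I put them in the correct folders" part:

    import os
    from urllib.parse import urlparse

    import requests

    head = {"User-Agent": "Mozilla/5.0 ..."}  # find yours at https://httpbin.org/headers

    def save_asset(url, out_root="site"):
        """Download one asset and mirror its URL path under out_root,
        e.g. http://somesites.com/far/boo/css.css -> site/far/boo/css.css"""
        path = urlparse(url).path.lstrip("/")
        local_path = os.path.join(out_root, path)
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        resp = requests.get(url, stream=True, headers=head)
        resp.raise_for_status()
        with open(local_path, "wb") as f:
            for chunk in resp.iter_content(chunk_size=8192):
                f.write(chunk)

    # hypothetical list of asset URLs collected while parsing the pages
    for url in ["http://somesites.com/far/boo/image.png",
                "http://somesites.com/far/boo/css.css"]:
        save_asset(url)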
Reference: download image from url using python urllib but receiving HTTP Error 403: Forbidden
I developed a website for a school project, and locally on my PC it works like a charm! The CSS and JavaScript are perfect, and the website is fluid and dynamic.
When I uploaded it to 000webhost, the website works well except for the JavaScript: the carousel images, sliders, etc. are static and don't move.
The JavaScript is embedded in each HTML file using <script></script> tags; it is not a separate .js file.
It works perfectly when run locally; the problem only occurs when I run it from the web host.
Please view the website and assist however possible; it would be much appreciated.
https://dut-it-tutors.000webhostapp.com/index.html
Thanks.
You're loading the jQuery library from a CDN without HTTPS. Since your site is served over HTTPS, the browser blocks the request to the non-HTTPS resource because of mixed content.
To fix this, load jQuery from the CDN over HTTPS.
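For example (the exact CDN path and version here are assumptions; use whichever build your pages already reference):

    <!-- blocked on an HTTPS page: -->
    <script src="http://code.jquery.com/jquery-3.6.0.min.js"></script>

    <!-- allowed: the same resource over HTTPS -->
    <script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>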
Simply check the spelling of the file name.
<script src="./userLogin.js"></script>
is wrong when the actual filename is userlogin.js; it should be
<script src="./userlogin.js"></script>
Check the upper and lower case: file paths are case-sensitive on most web servers, even when they aren't on your local machine.
I am learning OpenLayers 3 and ran into a problem while trying to pull basemaps into the browser using the JS Bin online editor.
If I write the exact same code in a local text editor (Notepad++), everything works as it should, but not when I use JS Bin.
Here is the link with the code:
https://jsbin.com/wijoha/edit?html,css,console,output
Can you help me figure out what is wrong? I've already spent a couple of hours trying to solve the issue but can't get my head around it...
Looking at the console on the JS Bin you linked (the browser window's console, rather than the JS Bin one), the CSS is not being loaded because you are requesting an HTTP resource from an HTTPS page. The error message reads:
Mixed Content: The page at 'https://null.jsbin.com/runner' was loaded over HTTPS, but requested an insecure stylesheet 'http://openlayers.org/en/v3.10.1/css/ol.css'. This request has been blocked; the content must be served over HTTPS.
Additionally, your JS file ol.js is not loading at all, as openlayers.org does not seem to serve the file over HTTPS (for me at least, in Chrome).
Instead, try serving everything over HTTP (including the URL of JS Bin itself). Here is a working example:
http://jsbin.com/focoxoxabo/edit?html,css,console,output
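Concretely, the includes would look like this (the ol.js path is an assumption, inferred from the ol.css path in the error message):

    <link rel="stylesheet" href="http://openlayers.org/en/v3.10.1/css/ol.css">
    <script src="http://openlayers.org/en/v3.10.1/build/ol.js"></script>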
I am trying to add a JavaScript picture gallery created with Wowslider to my Blogger blog.
I followed the instructions on the Wowslider website, and I thought I could serve the necessary files, including the images, from my own server.
Unfortunately, after I set it all up, it didn't work. Using Firebug, I discovered that the files on my server won't load on Blogger pages because my site uses HTTP and Blogger uses HTTPS, so I was getting a "Blocked loading mixed active content" error.
As far as I can see, there's no way for me to upload a directory of JavaScript, CSS, HTML, and image files anywhere on the Blogger server.
Converting my website, which is hosted by a service outside my control, to HTTPS is not an option.
Is there any way I can host my Wowslider picture gallery in such a way that Blogger will display it?
It turns out the problem is not as bad as I thought.
Blogger only serves pages over HTTPS while you're logged in and editing them. When a visitor is just viewing the blog, it's served over HTTP.
This means that if you embed a Wowslider in Blogger, you won't be able to see it while you're editing. However, you, and everybody else, will be able to see it once you publish, log out, and view the page as a visitor.
So in the end it works; it's just a little confusing because you can't see the end result until after you publish.
I have a web app (sencha/phonegap) that includes a feature allowing users to click on buttons that link to Wikipedia articles. This obviously works fine if the device has internet access, but I get numerous requests to make the app work when the app is offline too. To accomplish this, I'd like to give the user the option to download the linked articles/webpages for offline access. When the device does not have internet access, the app would instead display the saved version (which might be stale/out-of-date, but is better than nothing). What are possible ways to accomplish this task?
My first thought was to somehow use the HTML manifest to cache the pages in the phone's browser, which sounds possible in the Android browser, but iOS apparently has a 5 MB browser cache limit, which is too small.
My next thought was to save the needed HTML and associated files and bundle them up inside the app. But this seems a cumbersome approach: the app becomes much larger than it needs to be, and the webpages are stale as of the date the app was installed.
Using JavaScript, is it possible to download webpages, which I could then save (on the SD card, for example) for access later?
Or is there a more elegant approach?
If anyone could point me in the right direction it would be much appreciated.
In pure JavaScript you can make an Ajax request to download a page, then use the FileWriter to write the responseText to a file on the file system. However, that won't help you with images; you'll need the FileTransfer.download() command to get the binary image files.
If I were you I'd (see the sketch after this list):
1. Use Ajax to download the HTML.
2. Parse the HTML looking for images.
3. Use FileTransfer.download() to get the images.
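A rough sketch of those three steps, assuming the Cordova/PhoneGap file and file-transfer plugins are installed and this runs after the deviceready event (the page URL and target directory are placeholders, and error handling is minimal):

    function savePageForOffline(pageUrl) {
        // 1. Use Ajax to download the HTML
        var xhr = new XMLHttpRequest();
        xhr.open("GET", pageUrl);
        xhr.onload = function () {
            var html = xhr.responseText;

            // 2. Parse the HTML looking for images
            var doc = new DOMParser().parseFromString(html, "text/html");
            var imgs = doc.getElementsByTagName("img");

            // 3. Use FileTransfer.download to get each image
            // (note: relative img paths would need resolving against pageUrl)
            for (var i = 0; i < imgs.length; i++) {
                var src = imgs[i].src;
                var name = src.substring(src.lastIndexOf("/") + 1);
                var target = cordova.file.dataDirectory + name;
                new FileTransfer().download(
                    encodeURI(src),
                    target,
                    function (entry) { console.log("saved " + entry.toURL()); },
                    function (err) { console.log("download failed", err); }
                );
            }
            // the HTML itself could then be written out with FileWriter, as above
        };
        xhr.send();
    }

    // hypothetical usage
    savePageForOffline("https://en.wikipedia.org/wiki/Offline_reader");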
I noticed that when I open an HTML file locally by double-clicking it, it will not "run" the same as when it sits on a web server and is opened via an HTTP GET request.
I need a local HTML file a user can open by double-clicking it. This HTML file makes several jQuery load calls such as this:
$("#content").load("http://somepage.com/index.html");
I want to update several divs with content from remote sites.
This works fine if I have the file on a web server, but not if I double-click it in Windows Explorer... How can I "make" the file "run" as it would on a web server?
I think you pretty much cannot. This has to do with domain-access restrictions, which are there to prevent cross-site scripting and the like.
Files on your hard drive are especially limited: think what life would be like if websites were allowed to treat your whole hard drive as a single domain.
If you want things to work properly, you need to run a server. XAMPP is a pretty good bet, as it's easy to install and set up.
Any non-Ajax JavaScript will work fine as-is, though, as long as the paths to included CSS or JS files are relative.
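If XAMPP feels like overkill, any static file server will do; for example, with Python 3 installed, this one-liner (a lighter-weight alternative I'm suggesting, not something the answer above specifically recommends) serves the current folder:

    # run in the folder containing your HTML file, then open http://localhost:8000/
    python -m http.server 8000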
You can't do this locally; you have to have the file hosted somewhere for it to work. It's done this way for the sake of security.
What are you trying to do that you "need" this?