I noticed that when I open an HTML file locally by double-clicking it, it will not "run" the same way as when it sits on a web server and is opened via an HTTP GET request.
I need a local HTML file that a user can open by double-clicking it. The file has several jQuery load calls such as this:
$("#content").load("http://somepage.com/index.html");
I want to update several divs with content from remote sites.
This works fine if I have the file on a web server, but not if I double-click it in Windows Explorer... How can I "make" the file "run" as it would on a web server?
I think you pretty much cannot. This has to do with domain-access restrictions, which are there to avoid cross-site scripting and the like.
Files on your hard drive are especially limited - think what life would be like if scripts were allowed to treat your whole hard drive as a single domain.
If you want things to work properly you need to be running a server. XAMPP is a pretty good bet as it's easy to install and set up.
Any non-AJAX JavaScript will work fine as is, though, as long as the paths used to include any CSS or JS files are relative.
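If a full XAMPP install feels like overkill, a tiny static file server is enough for local testing. Below is a rough sketch in Node.js (assuming Node is installed; the port and file handling are arbitrary, and requests to other domains are still subject to the same cross-domain restrictions):
var http = require('http');
var fs = require('fs');
var path = require('path');

var ROOT = __dirname; // serve files from the folder this script lives in
var PORT = 8000;      // arbitrary local port

http.createServer(function (req, res) {
  // Map "/" to index.html, everything else to a file under ROOT.
  var filePath = path.join(ROOT, req.url === '/' ? 'index.html' : req.url);
  fs.readFile(filePath, function (err, data) {
    if (err) {
      res.writeHead(404);
      res.end('Not found');
      return;
    }
    res.writeHead(200);
    res.end(data);
  });
}).listen(PORT); // then open http://localhost:8000/ instead of double-clicking the file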
You can't do this locally. You have to have it hosted somewhere for this to work. It's done this way for the sake of security.
What are you trying to do that you "need" to have this?
I'm trying to link this page http://www.bauer.uh.edu/parks/f1471m.htm to my text editor (TextMate) on a Mac.
I even copied the code and pasted it into an HTML file, made a .js file (the one I'm trying to practice with) in the same folder, and tried referencing it with a local script tag, with no success.
I tried using src="http://www.bauer.uh.edu/parks/f1471m.htm" with no luck.
Am I doing something wrong?
I'm making a few assumptions here...
From what you're describing, I think you're trying to directly edit the file. Unless you have write access on that server, what you're doing isn't going to work. Some web development software will allow you to do this, but most text editors don't.
Can you work on the file locally (on your computer's file system) and upload it to the server through FTP?
Many browsers give you the ability to save a page locally these days. That would set up the proper structure for you on your own machine. (Firefox can do this using Save Page.)
I am attempting to put together a short script that I want to place on a Google Sites page. What I want to do is select a file that is local to my browser and then redirect it to a folder on a local Windows server. I have been playing with file upload and then download, but am essentially getting nowhere. HTML5 has a "download" attribute, but Sites appears to disallow it for some reason.
We use a mix of browsers here: IE (8-10), Firefox, and Chrome.
I have seen many good solutions, but they all seem to need server-side code, which I cannot do.
Any advice is appreciated.
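For what it's worth, here is a sketch of the client-only route the question describes: picking a file with a file input and re-offering it through the HTML5 download attribute. It assumes an <input type="file" id="picker"> on the page, it only hands the file back through the browser's save dialog (getting it into a folder on a Windows server still needs server-side code or a shared drive), and IE 8/9 do not support the download attribute at all:
document.getElementById('picker').addEventListener('change', function (event) {
  var file = event.target.files[0]; // the file the user selected
  if (!file) { return; }

  // Build a link that re-saves the chosen file via the download attribute.
  var link = document.createElement('a');
  link.href = URL.createObjectURL(file); // object URL pointing at the local file
  link.download = file.name;             // suggested file name for the save dialog
  link.textContent = 'Save a copy of ' + file.name;
  document.body.appendChild(link);
});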
I have a web app (sencha/phonegap) that includes a feature allowing users to click on buttons that link to Wikipedia articles. This obviously works fine if the device has internet access, but I get numerous requests to make the app work when the app is offline too. To accomplish this, I'd like to give the user the option to download the linked articles/webpages for offline access. When the device does not have internet access, the app would instead display the saved version (which might be stale/out-of-date, but is better than nothing). What are possible ways to accomplish this task?
My first thought was to somehow use the HTML cache manifest to cache the pages in the phone's browser, which sounds possible in the Android browser, but iOS apparently has a 5MB browser cache limit - too small.
My next thought was to save the needed html & associated files and bundle them up inside the app. But this seems a rather cumbersome approach, the app becomes much larger than it needs to be, and the webpages are stale back to the date the app was installed.
Using JavaScript, is it possible to download webpages, which I could then save (on the SD card, for example) for access later?
Or is there a more elegant approach?
If anyone could point me in the right direction it would be much appreciated.
In pure JavaScript you can make an Ajax request to download a page, then use a FileWriter to write the responseText to a file on the file system. However, that won't help you when it comes to images; you'll need the FileTransfer.download() command to get the binary image files.
If I were you, I'd:
1. Use Ajax to download the HTML.
2. Parse the HTML looking for images.
3. Use FileTransfer.download to get the images.
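A rough sketch of those steps with the PhoneGap/Cordova File and FileTransfer plugins might look like the following. The storage paths and error handling are simplified, relative image URLs would still need to be resolved against the page URL, and exact plugin APIs vary between Cordova versions:
function savePageForOffline(pageUrl, saveDir) {
  // 1. Download the HTML with a plain Ajax request.
  var xhr = new XMLHttpRequest();
  xhr.open('GET', pageUrl, true);
  xhr.onload = function () {
    var html = xhr.responseText;

    // 2. Parse the HTML and collect the image URLs.
    var doc = new DOMParser().parseFromString(html, 'text/html');
    var imgs = doc.getElementsByTagName('img');
    var imageUrls = [];
    for (var i = 0; i < imgs.length; i++) {
      imageUrls.push(imgs[i].getAttribute('src'));
    }

    // 3. Fetch each image with FileTransfer.download.
    imageUrls.forEach(function (imageUrl, index) {
      var target = saveDir + '/img_' + index;
      new FileTransfer().download(encodeURI(imageUrl), target,
        function (entry) { console.log('saved ' + entry.fullPath); },
        function (error) { console.log('image download failed', error); });
    });

    // Write the HTML itself to the file system with a FileWriter.
    window.requestFileSystem(LocalFileSystem.PERSISTENT, 0, function (fileSystem) {
      fileSystem.root.getFile('page.html', { create: true }, function (fileEntry) {
        fileEntry.createWriter(function (writer) {
          writer.write(html);
        });
      });
    }, function (e) { console.log('filesystem error', e); });
  };
  xhr.send();
}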
There is probably a better title for what I'd like to accomplish, but the details should be helpful.
I've recently learned that specifying a script's src path as //some.domain.com rather than http://some.domain.com or https://some.domain.com will cause the browser to request the script using whichever protocol was used to load the page. This works great when the page is loaded from a site, but often I debug on my local system, so the protocol is file, and of course errors occur whenever resources or scripts aren't found.
Other than changing src paths, is there a better way to debug locally? I imagine there is a code solution that detects when the page is running locally versus loaded from a domain, but I haven't found examples yet.
Install a product such as WampServer and you'll have a localhost web server you can test everything on. This is how I do it, and it works like a charm.
There are similar products available for ASP or other non-PHP server-side technologies (you didn't specify), if you are just doing HTML + JS then any old server would do.
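As for the in-code detection the question mentions, a minimal sketch could look like this: check whether the page came from the file system and fall back to an explicit protocol for protocol-relative URLs (the CDN URL below is just a placeholder):
(function () {
  var isLocal = window.location.protocol === 'file:';
  var prefix = isLocal ? 'http:' : ''; // keep the protocol-relative form when served over http/https

  // Inject the script tag with an explicit protocol when running locally.
  var script = document.createElement('script');
  script.src = prefix + '//some.domain.com/library.js';
  document.getElementsByTagName('head')[0].appendChild(script);
}());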
During a normal browsing session I want to edit a specific JavaScript file before the browser receives it, since once it gets there it's impossible to edit. Is there any tool for this? For what I need, I can't just save it and edit it on my disk.
I'm ready to learn how to program it myself, but if anyone can point out more or less what I have to do I'd be very grateful. I'd have to intercept the packets until I have the whole file, while blocking the browser from receiving any part of it, then edit it manually and forward it on to the same port.
I don't think I can do this just using pcap. I've read a bit about Scapy, but I'm not sure whether it can help me either.
Thanks in advance.
You'd need to implement some sort of proxy, or hook into an existing one, and intercept the file as it's being downloaded and replace it.
Not trivial for a beginner, but a good learning project.
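As a starting point for that learning project, here is a sketch of a minimal HTTP forward proxy in Node.js that hands back a locally edited copy when the requested URL matches the target script and passes everything else through. The URL and file name are placeholders, HTTPS isn't handled, and the browser has to be pointed at the proxy (e.g. localhost:8080) in its connection settings:
var http = require('http');
var fs = require('fs');
var url = require('url');

var TARGET_URL = 'http://example.com/static/app.js'; // script to replace (placeholder)
var LOCAL_FILE = './patched-app.js';                 // locally edited copy (placeholder)

http.createServer(function (clientReq, clientRes) {
  // When the browser uses an HTTP proxy, the request line carries the full URL.
  if (clientReq.url === TARGET_URL) {
    // Serve the edited local file instead of fetching the original.
    clientRes.writeHead(200, { 'Content-Type': 'application/javascript' });
    fs.createReadStream(LOCAL_FILE).pipe(clientRes);
    return;
  }

  // Forward every other request to its real destination untouched.
  var parsed = url.parse(clientReq.url);
  var proxyReq = http.request({
    host: parsed.hostname,
    port: parsed.port || 80,
    path: parsed.path,
    method: clientReq.method,
    headers: clientReq.headers
  }, function (proxyRes) {
    clientRes.writeHead(proxyRes.statusCode, proxyRes.headers);
    proxyRes.pipe(clientRes);
  });
  clientReq.pipe(proxyReq);
}).listen(8080);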
If you are happy to replace the file with a local one, rather than editing it in transit, then I would* use Charles and its Map Local feature.
Actually, "did". This helped me debug a problem with a browser and a JS file I couldn't edit yesterday.
You can probably achieve whatever it is you're trying to do by using the Firefox Firebug plugin, Chrome's developer tools, or the Firefox Greasemonkey plugin.
Or you could add the file's domain to your hosts file and point that domain at your local machine (running a web server), then edit and save that JavaScript file locally and serve it from your own web server.