I wonder what happens if I have two web pages on my server that use the same script and CSS files.
If a user opens the first web page and their browser downloads the external script and CSS files, and then the user goes to my second web page, which uses the same script and CSS files, will those files get downloaded again? If so, how can I avoid this?
Thanks a lot!
I'm new here. My problem: I have a tab loaded in Chrome, but the webpage is not accessible anymore.
The page contents are still on the server, at least the thumbnails.
Is there a way to download this tab entirely, with its page structure, as HTML for viewing later?
I haven't turned off my PC for two days, only hibernated it. I've been searching for a way but can't find any resource.
Edit: the webpage is a MEGA.nz folder which has ~100 folders in it. I think it's very hard to download with the page structure intact, because it loads a file-explorer application (server-side, I think) when the folder and files first load.
Either File > Save Page As... (in Chrome)
or
FTP into the web server and download the whole folder/page from the public_html folder, if it is a static page. If it's not, you'll only get template files, and no content data.
or if you're just interested in reading the content:
File > Print > Save as PDF
I'm not sure how to go about this, but I want to be able to load content from a file (e.g. txt, html, etc.) into my TinyMCE instance.
So basically: a button that opens a file browser and then loads the content of the selected file into TinyMCE.
I've seen this for images and image browsers, but I'm not sure how to do it for arbitrary files.
This would be done locally, not on a server.
Nick -
Prior to HTML 5, JavaScript in the browser could not read files directly from the hard drive; this restriction exists for security reasons:
https://en.wikipedia.org/wiki/JavaScript#Security
In HTML 5 there are new APIs that allow you to get to files. This page has a decent overview:
http://www.html5rocks.com/en/tutorials/file/dndfiles/
I would note that browser support for this is not yet universal:
http://caniuse.com/#feat=fileapi
Assuming you can use the File API you should be able to do what you want after a user selects a file.
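To make the File API approach concrete, here is a minimal sketch. The `readFileAsText` helper and the `#open-file` input id are hypothetical names; `tinymce.activeEditor.setContent(...)` is the real TinyMCE call for replacing the editor's content. This assumes a browser recent enough to support the promise-based `Blob.text()` method (older browsers would use `FileReader.readAsText` instead):

```javascript
// Hypothetical sketch: read a user-selected file as text via the File API,
// then hand the text to a callback. In a real page the callback would call
// tinymce.activeEditor.setContent(text) to load it into the editor.
function readFileAsText(file, onLoad) {
  // File inherits from Blob, which exposes a promise-based .text() method.
  file.text().then(onLoad);
}

// Browser wiring (sketch): a plain <input type="file"> acts as the "button".
// document.querySelector('#open-file').addEventListener('change', (e) =>
//   readFileAsText(e.target.files[0], (text) =>
//     tinymce.activeEditor.setContent(text)));

// Standalone demo with an in-memory Blob (Blob is also a global in Node 18+):
readFileAsText(new Blob(['<p>hello</p>']), (text) => console.log(text));
```

Note that the user must pick the file through the file input each time; the File API deliberately gives no way to open an arbitrary path without a user gesture.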
How do I load only the HTML of a web page in Selenium?
I need only the HTML of the requested page, without the CSS and JavaScript.
If you need Selenium for web scraping then, strictly speaking, you would still need the JavaScript and CSS files, since they can play a significant part in how the page loads and renders. For example, several parts of a page can be loaded with additional AJAX calls, or inserted via custom JavaScript logic.
Also, if you only want the HTML part of a page, why do you need to involve a real browser at all?
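If all you need is the markup the server returns (not the DOM after scripts have run), a plain HTTP request is enough. A minimal sketch, assuming the page does not require JavaScript to produce its content (`getRawHtml` is a hypothetical helper name):

```javascript
// Sketch: fetch the raw HTML of a page without a browser. No CSS is
// downloaded and no JavaScript is executed -- you get the markup as served.
// fetch() is a global in browsers and in Node 18+.
async function getRawHtml(url) {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.text();
}

// Usage (sketch):
// getRawHtml('https://example.com').then((html) => console.log(html));
```

For pages whose content is built client-side, this returns the markup before any scripts run, which is exactly the case where Selenium is still needed.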
If you still want to prevent the JS and CSS files from loading, you can configure certain permissions in Firefox by tweaking FirefoxProfile preferences; see:
Do not want images to load and CSS to render on Firefox in Selenium WebDriver tests with Python
FirefoxDriver: how to disable javascript,css and make sendKeys type instantly?
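The preferences those answers rely on look roughly like this (a config sketch; in Firefox's `permissions.default.*` preferences, `2` means "block"):

```javascript
// Sketch: Firefox preferences commonly used to block resource loading,
// set on a FirefoxProfile before starting the WebDriver session.
const prefs = {
  'permissions.default.stylesheet': 2, // do not load CSS
  'permissions.default.image': 2,      // do not load images
  'javascript.enabled': false,         // do not execute JavaScript
};
```

Keep in mind the caveat above: with JavaScript disabled, any content the page normally builds client-side will simply be missing from the HTML you get back.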
Currently in my app I load an HTML page in a UIWebView using its loadRequest method.
The HTML includes a number of JS and CSS files. I need to store this HTML page along with the necessary JS and CSS files: after the page has loaded, I want to give the user the option of storing it for offline viewing.
I have tried a lot, but I couldn't store the JS or CSS files.
I have seen another post (How to save a locally loaded HTML file in UIWebview) about saving the HTML loaded in a UIWebView, but couldn't find anything about the JS and CSS files.
Let me know whether this is possible, and if so, how.
You can do it using ASIHTTPRequest. Check this answer; it has a nice description of the process.
Hope this helps. :)
I have a website that uses hashes for navigation, so it uses JavaScript/AJAX to load new pages. One file, index.php, loads when you first open the site; everything after that is controlled via JavaScript. I'm now using an iframe to upload photos. The problem is that when I click links to go to other pages while the iframe is uploading (the iframe lives on the index.php page, so it isn't affected by switching pages), the browser just waits; it seems to wait until the file upload is complete before loading the page I requested.

I have seen websites that handle this fine, though. My initial thought is that they use a separate server to store their files, apart from the one the website is hosted on, so the upload and the page requests don't take bandwidth and resources from each other. Would this be the case?