Given a URL, how could a script find what resources are loaded? - javascript

Given a URL (e.g. localhost:8000), how can a script find what resources a browser will load (via HTTP requests)?
For example, let's suppose that the / route responds with:
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Test</title>
<link rel="stylesheet" href="/css/style.css">
</head>
<body>
some content
<img src="/foo.png" alt="">
</body>
</html>
The resources which will be loaded are:
/css/style.css
/foo.png
This is simple (just a DOM iteration via cheerio or similar), but it's not as native as I think it should be.
Iterating over the response HTML will not catch additional resources such as CSS @imports, background-image URLs, and so on.
What is the native way to get the list of CSS, images and maybe other resources which are loaded by the browser?
Maybe it is possible via jsdom?

As @adeneo suggested, the missing keywords were headless browser. I found it very simple with the zombie library. Below you can see a small example; however, the documentation is a great resource.
// Dependencies
var Browser = require("zombie");

// Load localhost:9000
Browser.localhost("localhost", 9000);

// Load the page from localhost,
// including js, css, images and iframes
var browser = new Browser({
    features: "scripts css img iframe"
});

// Open the page and list the resources
browser.visit("/", function (error) {
    console.log(browser.resources.map(function (c) {
        return c.request.url;
    }));
});
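The same listing can also be produced with other headless browsers. For example, here is a minimal sketch with Puppeteer (not part of the original answer; it assumes puppeteer is installed and the app listens on localhost:9000):
// Hypothetical alternative: collect every request URL with Puppeteer
const puppeteer = require("puppeteer");

(async function () {
    const browser = await puppeteer.launch();
    const page = await browser.newPage();
    const urls = [];
    // Record the URL of every HTTP request the page triggers
    page.on("request", function (request) {
        urls.push(request.url());
    });
    await page.goto("http://localhost:9000/", { waitUntil: "networkidle0" });
    console.log(urls);
    await browser.close();
})();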

Related

"Missing PDF" every time I use Grapecity PDF Viewer

I'm building an app that will have an interactive PDF form on a server (in HTML, CSS, JS). I have been trying to use the Grapecity PDF viewer, but to no avail. I've followed the documentation to a T, using these resources: one, two, three.
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, maximum-scale=1, user-scalable=no, shrink-to-fit=no">
<meta name="theme-color" content="#000000">
<title>GC Viewer Demo | PDF Plugin</title>
<link rel="stylesheet" href="https://cdn.materialdesignicons.com/2.8.94/css/materialdesignicons.min.css">
<script>
function loadPdfViewer(selector) {
var viewer = new GcPdfViewer(selector, { renderInteractiveForms: true });
viewer.addDefaultPanels();
viewer.open("HelloWorld.pdf");
}
</script>
</head>
<body onload="loadPdfViewer('#root')">
<div id="root"></div>
<script type="text/javascript" src="gcpdfviewer.js"></script>
</body>
</html>
I currently have the "HelloWorld.pdf" and the gcpdfviewer javascripts in the same folder as the above index.html but every time I test the code in the browser, the PDF viewer loads, but the PDF doesn't, giving me an error that states "missing PDF."
This is really bothering me because the PDF is exactly where it's supposed to be, I think.
I'm currently not using a license key, but the documentation makes it seem like I don't need one. Maybe that's the issue.
Any ideas?
Edit - Here are the console errors in Chrome:
The pdf worker has been disabled. Note, rendering PDF in foreground thread can slow pdf viewer performance.
ce # gcpdfviewer.js:1
index.html:1 Access to XMLHttpRequest at 'file:///C:/HelloWorld.pdf' from origin 'null' has been blocked by CORS policy: Cross origin requests are only supported for protocol schemes: http, data, chrome, chrome-extension, https.
HelloWorld.pdf:1 Failed to load resource: net::ERR_FAILED
index.html:1 Uncaught (in promise) V
Firstly, the issue is not caused by using the unlicensed version.
The issue occurs because you are trying to run the sample locally from the file system. To overcome this, host the application on a local server and the PDF will be loaded in the PdfViewer.
While loading a PDF in the PdfViewer, an XMLHttpRequest checks the origin; this is null in the case of the file system, hence the error when accessing the file.
Here is the documentation link for configuring the PdfViewer:
https://www.grapecity.com/documents-api-pdf/docs/online/view-pdf.html
Regards,
Manish Gupta
Thank you for using GCPDF Viewer. Is the filename exactly the same? The OS might be case-sensitive.
Can you look in the browser network tab and watch for the request going to retrieve the PDF file? Is it looking in the same location where you have placed the file?
Which server software are you running? Is it serving the PDF file?
http://www.grapecity.com
As stated earlier, for security reasons, it is not possible to access files on your local filesystem via JavaScript, you need to set up a web server and open PDF files using the web server url.
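For example, a minimal static server run from the folder that contains index.html, gcpdfviewer.js and HelloWorld.pdf would be enough (a sketch using only Node's built-in modules, not part of the original answer):
// Hypothetical minimal static server, so the viewer and HelloWorld.pdf
// are served over http:// instead of file://.
var http = require("http");
var fs = require("fs");
var path = require("path");

http.createServer(function (req, res) {
    var file = path.join(__dirname, req.url === "/" ? "index.html" : req.url);
    fs.readFile(file, function (err, data) {
        if (err) {
            res.writeHead(404);
            return res.end("Not found");
        }
        res.writeHead(200);
        res.end(data);
    });
}).listen(8080, function () {
    console.log("Serving on http://localhost:8080");
});
Alternatively, something like npx http-server run from that folder achieves the same thing.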
But if you really want to do it, there is another workaround - start Chrome with disabled web security, and then open the index.html page from the local file system, for example:
"c:\Program Files (x86)\Google\Chrome\Application\chrome.exe" --user-data-dir="C:/temp/CustomChromeSession" --disable-web-security "file:///C:/temp/gcpdfviewer-test/index.html"
Note, this workaround is not recommended for security reasons, and this approach may be disabled by browser developers later.
Here's a screenshot of how it works

Play cannot reference external javascript

A very basic question
Cannot load external javascript resource on server
I am working on a Play framework project. I've made a basic HTML view with some JavaScript. It works correctly when I have my JS code in the actual view.
However, when I tried moving the JS code to a separate file and loading it using
<script src="main.js"></script>
it works correctly when opened directly in Chrome. However, when I run it on the server, it fails, and the Chrome dev console prints the following message:
GET http://localhost:9000/main.js 404 (Not Found)
I've tried setting up a GET route on the target URL, but I cannot pass main.js as an argument to the Ok method:
def getmainJs()= Action {
Ok()
}
Is there a painless way to access the JS code, or do I have to go through the process of setting up the JavascriptRouter mentioned here? The app is only going to be 2 views, so I kind of don't care about scalability.
I created an example of how to serve a JavaScript file:
Routes:
GET /foo sk.ygor.stackoverflow.q53319493.controller.ApplicationController.foo
GET /assets/*file controllers.Assets.versioned(file)
View:
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Title</title>
</head>
<body>
<script src="#routes.Assets.versioned("main.js")"></script>
</body>
</html>

Problems with map's script API

I'm trying to use a maps API (Yandex, but I have seen a similar problem with Google Maps too) and include its script in the head:
<!DOCTYPE html>
<html lang="en">
<head>
<title>site</title>
<meta charset="UTF-8">
<script src="https://cdnjs.cloudflare.com/ajax/libs/postscribe/2.0.8/postscribe.min.js">
<script async src="https://api-maps.yandex.ru/1.1/index.xml" type="text/javascript"></script>
<script type="text/javascript">
window.onload = function () {
var map = new YMaps.Map(document.getElementById("YMapsID"));
map.setCenter(new YMaps.GeoPoint(59.938518, 30.323342), 10);
};
</script>
</head>
Here is body:
<body>
<div class="map-block_map-item">
<div id="YMapsID" style="width:100%;height:100%">
</div>
</div>
</body>
But when the page loads, I get this warning:
"It isn't possible to write into a document from an asynchronously-loaded external script unless it is explicitly opened."
If I don't use the async attribute, I get this message:
A parser-blocking, cross site (i.e. different eTLD+1) script, https://api-maps.yandex.ru/1.1/_YMaps.js?v=1.1.21-58, is invoked via document.write. The network request for this script MAY be blocked by the browser in this or a future page load due to poor network connectivity. If blocked in this page load, it will be confirmed in a subsequent console message. See https://www.chromestatus.com/feature/5718547946799104 for more details
How can I solve this problem? As I understand it, this happens on very slow connections (but I have a rather fast connection, so why?) and it causes document.write scripts to be blocked. (I'm new to JS, so yes, I have googled it, but I didn't understand how it can be improved besides asynchronous script loading.)
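For reference, since postscribe is already loaded in the question's head, one common pattern for document.write-based loaders (a sketch only, not an answer from this thread; it replaces the async API script tag) is to let postscribe perform the write and initialise the map in its done callback:
<script type="text/javascript">
// Sketch only: postscribe (already loaded above) intercepts the API's internal
// document.write calls, so the loader can run after the page has finished parsing.
window.onload = function () {
    postscribe(document.body, '<script src="https://api-maps.yandex.ru/1.1/index.xml"><\/script>', {
        done: function () {
            var map = new YMaps.Map(document.getElementById("YMapsID"));
            map.setCenter(new YMaps.GeoPoint(59.938518, 30.323342), 10);
        }
    });
};
</script>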

I'm trying to include jQuery in Dreamweaver, but it's not being recognized, neither as a file nor from a CDN

I've tried including it in my files, and from a CDN. I followed a tutorial about it, although it's hardly a tutorial to follow since all I had to do was include the link after <head>. Still, it's not being recognized, and plugins aren't being recognized and just appear as text. Does anyone have an idea what could be the problem? Is it perhaps a setting in Dreamweaver that has to be changed?
<!DOCTYPE HTML>
<html>
<head>
<meta charset="utf-8">
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"></script>
<link href="myfirststylesheet.css" rel="stylesheet" type="text/css">
<title>Hallo wereld</title>
</head>
</html>
It's worth noting that a hard-coded http or https in src hinders the inclusion of the CDN files at times. Remove the protocol and try it this way; the request will then use either http or https depending on how the hosting server serves the page:
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"> </script>
EDIT
Further reading on why this helps (if not running the file from your computer but through some server, even localhost):
Can I change all my http:// links to just //?
http-and-https-with-google-cdn
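Separately, a quick way to check whether the CDN copy was actually loaded is to test window.jQuery and fall back to a local copy (a sketch; the local path js/jquery-1.10.2.min.js is only an example):
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"></script>
<script>
// If the CDN copy failed to load, window.jQuery is undefined, so write a local fallback tag.
window.jQuery || document.write('<script src="js/jquery-1.10.2.min.js"><\/script>');
</script>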

Dynamically Trigger HTML5 Cache Manifest file?

I am using the new cache manifest functionality from HTML5 to cache my web app so it will work offline. The content is cached automatically when the page is loaded with the following html element:
<html lang="en" manifest="offline.manifest">
This works fine. However, I want to give my users the option of whether they want the content cached offline. So, here is my question:
Is there any way to trigger that an application be cached at runtime, using JavaScript, and not have it done automatically when the page is loaded?
For example, something like this (using jquery):
----------------index.html--------------
<head>
<meta charset="utf-8" />
<script src="http://code.jquery.com/jquery-1.4.4.min.js"></script>
<script type="text/javascript" src="Main.js"></script>
</head>
<body>
<button id="cacheButton">Cache Page</button>
</body>
</html>
---------Main.js---------
$(document).ready(function () {
    $('#cacheButton').click(onCacheButtonClick);
});

function onCacheButtonClick(event) {
    console.log("Setting Offline Manifest");
    $('#htmlRoot').attr("manifest", "offline.manifest");
}
-------------offline.manifest-------------
CACHE MANIFEST
#version .85
#root
index.html
scripts/main.js
#jquery assets
http://code.jquery.com/jquery-1.4.4.min.js
Basically, when the button is clicked, I dynamically set the manifest attribute of the html element. This works (in the sense that the attribute is set), but it does not cause the browser to then cache the page.
Any suggestions?
You dynamically trigger caching by adding an iframe that points to an empty page containing the actual cache manifest.
offline.html:
<!DOCTYPE html>
<html manifest="offline.appcache">
<head>
<title></title>
</head>
<body>
</body>
</html>
Make sure to add index.html to the cache manifest.
Then just add something like:
<iframe src="offline.html" width="0" height="0">
to document.body dynamically to trigger caching.
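A minimal sketch of that injection (the function name below is just an example):
// Inject a hidden iframe pointing at the manifest-carrying page to start caching on demand
function startOfflineCaching() {
    var iframe = document.createElement("iframe");
    iframe.src = "offline.html";
    iframe.width = 0;
    iframe.height = 0;
    document.body.appendChild(iframe);
}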
After many weeks spent with offline caching, the answer is no: you either cache or you don't cache; setting the manifest attribute on the client side has no effect.
You could consider offering an alternate URL for the caching version; be aware that the page is also implicitly cached as a "master entry".
I am at a loss to understand why you would want to offline-cache jQuery, though, since it is likely to be served with a very long expiry anyway.
You may wish to consider offline storage as an alternative: store the text of the scripts and inject them into the DOM on load. If they are not stored yet, fetch them using Ajax and inject the response, as creating a script tag with the src set won't load the script while offline.
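A rough sketch of that idea, using localStorage as the offline store (the key name and URL below are only examples):
// Hypothetical helper: keep the script source in localStorage and inject it as an inline script
function loadScriptOffline(url, storageKey) {
    var source = localStorage.getItem(storageKey);
    if (source) {
        injectScript(source);
        return;
    }
    // Not stored yet: fetch the text with Ajax, store it, then inject it
    var xhr = new XMLHttpRequest();
    xhr.open("GET", url);
    xhr.onload = function () {
        localStorage.setItem(storageKey, xhr.responseText);
        injectScript(xhr.responseText);
    };
    xhr.send();
}

function injectScript(source) {
    var script = document.createElement("script");
    script.textContent = source; // inline text, not src, so it runs without a network request
    document.head.appendChild(script);
}

loadScriptOffline("scripts/main.js", "cached-main-js");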
Depending on your application, it may be possible to use a modified version of @schibum's approach by breaking down your app into "mini" apps and then caching the sub-sections in iframes. Consider this example:
index.html
<html manifest="indexx.manifest">
<head>
<script src="jquery-2.1.4.min.js"></script>
<script src="index.js"></script>
<title>Index</title>
</head>
<body>
<ul>
<li>One
<li>Two
<li>Three
</ul>
<iframe id="if" />
</body>
</html>
index.manifest
CACHE MANIFEST
# 3
index.html
jquery-2.1.4.min.js
index.js
index.js
$(document).ready(function () {
    var pages = ['1.html', '2.html', '3.html'];
    var LoadNext = function () {
        var page = pages.shift();
        if (page !== undefined) {
            console.log("Attempting to load " + page);
            // Point the iframe at the next sub-page so that its manifest gets cached
            $('#if').attr('src', page);
        } else {
            console.log("All done");
        }
    };
    // Each time the iframe finishes loading a sub-page, move on to the next one
    $('#if').load(function () {
        console.log("Finished loading");
        LoadNext();
    });
    LoadNext();
});
1.html
<html manifest="1.manifest">
<head>
<title>One</title>
</head>
<body>
<img src="1.jpg" width="50%">
</body>
</html>
1.manifest
CACHE MANIFEST
# 2
1.html
1.jpg
{2,3}.{html,manifest} follow 1.{html,manifest}'s form.
As a result, each sub-page (1.html, 2.html, 3.html) has its own manifest and is thus cached independently. index.html has its own (minimal) manifest, so caching that unconditionally is not nearly as network-heavy as caching the entire app. The JavaScript is then responsible for pre-loading every page in the iframe so that it gets cached.
Load index.html, then go offline and see how the sub-pages work, too.
An obvious drawback is that any assets shared between pages (e.g. jQuery) must be redundantly cached.
One thing you must remember: do not cache the manifest file itself. So all you need to do is refresh the page with a different version of the manifest file according to your user's selection. You can dynamically generate the manifest file itself; any change to that file will cause a cache refresh. The best practice to trigger re-caching is to change the version comment in the manifest file; something like changing # ver1 - 01/01/2018 to # ver2 - 02/02/2018 will do the trick.
So you cannot change it on the client side, but you can do it server side.
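For illustration, a server-side sketch of that idea (Express is assumed here; the route, flag and file list are only examples, not part of the original answer):
// Hypothetical server-side sketch: generate the manifest per user choice
// and bump the version comment to trigger re-caching.
const express = require("express");
const app = express();

app.get("/offline.manifest", function (req, res) {
    const optedIn = req.query.offline === "1"; // example opt-in flag
    res.set("Content-Type", "text/cache-manifest");
    res.set("Cache-Control", "no-store"); // never cache the manifest itself
    const lines = ["CACHE MANIFEST", optedIn ? "# ver2 - offline enabled" : "# ver1 - online only"];
    if (optedIn) {
        lines.push("index.html", "scripts/main.js");
    }
    lines.push("NETWORK:", "*");
    res.send(lines.join("\n"));
});

app.listen(3000);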
