The problem definition
On my page, www.xxx.com/page, there is a script:
<script type="text/javascript" src="main.1234.js"></script>
The browser resolves it as www.xxx.com/main.1234.js instead of www.xxx.com/page/main.1234.js.
More details about my setup
In fact, there are two applications running on the same server machine:
www.xxx.com/ (app #1)
www.xxx.com:82/ (app #2)
The page actually exists in app #2 as www.xxx.com:82/page. If I access the page directly, everything works as it should (i.e. the browser resolves links as expected).
But my setup is a bit more complicated. My goal is to hide app #2 from public access and make it available only via app #1 as www.xxx.com/page. To achieve that, I set up app #1 so that when a user requests www.xxx.com/page, app #1 under the hood performs a request to www.xxx.com:82/page of app #2 and returns the received content to the user.
From the user's point of view, everything should look as if the content of www.xxx.com:82/page resides under www.xxx.com/page. And it almost works. The only problem is that the browser resolves URLs as described under "The problem definition". How can I fix it?
Additional info, hope it may help
I suppose the answer is hidden in the responses; the cause may be that the browser receives different response headers in the two cases. The following are the headers the browser receives in each case:
Response from app #1 (www.xxx.com/page) where browser incorrectly resolves URLs:
Cache-Control:private
Content-Length:775
Content-Type:text/html;charset=UTF-8
Date:Fri, 19 Jan 2018 11:34:40 GMT
Expires:Thu, 01 Jan 1970 00:00:00 UTC
Set-Cookie:zimidy-initialSessionIdHash=-226086716; Path=/
Strict-Transport-Security:max-age=31536000 ; includeSubDomains
X-Content-Type-Options:nosniff
X-Frame-Options:SAMEORIGIN
X-XSS-Protection:1; mode=block
Response from app #2 (www.xxx.com:82/page) where browser correctly resolves URLs:
Accept-Ranges:bytes
Cache-Control:public, max-age=0
Connection:keep-alive
Date:Fri, 19 Jan 2018 11:33:16 GMT
ETag:W/"307-1610e1964c4"
Last-Modified:Fri, 19 Jan 2018 11:06:40 GMT
X-Powered-By:Express
The URL main.1234.js is resolved relative to the location your page is in. The URL /main.1234.js is resolved relative to the base URL (the site root). You probably meant the root-relative form. If your path is foo/bar/mypage, then linking main.1234.js will search for the file in foo/bar/. If you put a slash at the start, the browser will search for the file from the site root.
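The resolution rules above can be checked with the WHATWG URL API (available in modern browsers and Node); the foo/bar/mypage path is the hypothetical example from the answer:

```javascript
const page = 'http://www.xxx.com/foo/bar/mypage';

// Relative: resolved against the "directory" of the page URL.
const relative = new URL('main.1234.js', page).href;
// → http://www.xxx.com/foo/bar/main.1234.js

// Root-relative: resolved against the site root, wherever the page is.
const rootRelative = new URL('/main.1234.js', page).href;
// → http://www.xxx.com/main.1234.js
```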
RolandStarke gave me advice that helped me solve the problem.
Also, an explanation of the behaviour can be found here.
For relative URLs to work properly, a trailing slash is required. I used it in the link from app #2, but not in the link from app #1. After adding the trailing slash, everything started working.
So, this link doesn't work properly: www.xxx.com/app
But this one, with a trailing slash, works as expected: www.xxx.com/app/
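The difference the trailing slash makes can be verified directly with the WHATWG URL API (available in modern browsers and Node):

```javascript
// Without the trailing slash, 'app' is treated as a file name and dropped
// during relative resolution.
const withoutSlash = new URL('main.1234.js', 'http://www.xxx.com/app').href;
// → http://www.xxx.com/main.1234.js

// With the trailing slash, 'app/' is treated as a directory.
const withSlash = new URL('main.1234.js', 'http://www.xxx.com/app/').href;
// → http://www.xxx.com/app/main.1234.js
```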
Related
I'm very confused. I've got an AJAX call which causes a login form to be processed, and creates a cookie on a successful login. The web browser is not registering the cookie.
In troubleshooting, I narrowed it down to something to do with calling the site via AJAX, rather than navigating to it directly.
e.g. I created a simple page "test" which returns the following output:
HTTP/1.1 200 OK
X-Powered-By: Express
Set-Cookie: token=ABCDEFG; Domain=localhost; Path=/
Content-Type: application/json; charset=utf-8
Content-Length: 19
ETag: W/"13-S4werj8PuppRlonJZs+jKA"
Date: Wed, 23 Sep 2015 22:09:03 GMT
Connection: keep-alive
{"message":"value"}
If I navigate directly to the page, the cookie is created in the browser.
If I make an AJAX call to the page, the cookie is not created in the browser.
e.g:
$.get('http://localhost:8081/test');
I've found similar posts which state that this happens with AJAX if the domain or the path is not defined, but as you can see, I defined both and still no dice.
If it matters, the majority of my testing has been on Firefox, but I did do at least a couple of tests on Chrome.
Any help you have would greatly be appreciated. I'm confused by this, as everything I read suggests this should be possible.
To clarify further:
1) I'm not seeing the cookie created when reviewing CookieManager+ addon for Firefox.
2) I'm also not seeing the cookie added to subsequent requests to the same host (even the same port).
3) What I read seems to suggest that cookies are tied to a host, not a port (But that doesn't seem to be the issue based on #1 and #2):
Are HTTP cookies port specific?
Try setting withCredentials in your request. Note that $.get() has no slot for XHR options, so use $.ajax():
$.ajax({ type: "GET", url: "http://localhost:8081/test", xhrFields: { withCredentials: true } });
Alternatively, try setting the crossDomain flag:
$.ajax({ type: "GET", url: "http://localhost:8081/test", crossDomain: true });
If you're trying to do this in Angular, as I was, this is how you do it there:
$http doesn't send cookie in Requests
config(function ($httpProvider) {
    $httpProvider.defaults.withCredentials = true;
});
I'm a developer of a landing page authoring system. To make our pages load faster, we use a force-cache technique: all the resources (images/JS/CSS/fonts) in a landing page are set to expire in a year. The file paths are unique and the files never change.
It should work fine, but there is a problem when browsing those pages in Chrome: when it loads JavaScript files that were already cached (*), the browser sometimes connects to the server with an If-Modified-Since header (getting a 304 reply) and sometimes it doesn't. I don't want the browser to contact the server at all for JavaScript files that are already cached; it slows down the page load.
(*) already cached: just enter the page more than once, not using F5 (which obviously causes reloading) but by pressing Enter in the address bar, or by opening a new tab and entering the same address.
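The force-cache setup described above can be sketched as a small header-building helper. This is a hedged sketch, not the author's actual server code; the `immutable` directive is an assumption on top of the original setup (a later Cache-Control extension, supported by modern browsers) that explicitly asks the browser not to revalidate fresh entries:

```javascript
const ONE_YEAR = 31536000; // seconds

// Hypothetical helper: far-future caching headers for fingerprinted
// assets whose URLs are unique and whose content never changes.
function cacheHeaders(nowMs = Date.now()) {
  return {
    // 'immutable' is not in the original setup; it tells the browser to
    // skip conditional revalidation entirely while the entry is fresh.
    'Cache-Control': `public, max-age=${ONE_YEAR}, immutable`,
    'Expires': new Date(nowMs + ONE_YEAR * 1000).toUTCString(),
  };
}
```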
An example page can viewed here: http://mati.ravpage.co.il/placeholders
I analyzed the issue by opening Chrome's dev tools (F12), inspecting the Network tab, and sorting by status. Sometimes only the "placeholders" entry gets a 304; sometimes the JavaScript entries do too.
For the Javascript file bugs_mati_...1391870865.js, the HTTP headers are:
Access-Control-Allow-Origin:*
Cache-Control:max-age=31536000, public
Connection:keep-alive
Content-Encoding:gzip
Content-Type:application/javascript
Date:Sat, 08 Feb 2014 14:57:08 GMT
Expires:Sun, 08 Feb 2015 14:55:25 GMT
Last-Modified:Sat, 08 Feb 2014 14:47:48 GMT
Server:NetDNA-cache/2.2
Transfer-Encoding:chunked
Vary:Accept-Encoding,User-Agent
X-Cache:HIT
I don't understand why Chrome behaves the way it does. I can't even reproduce the issue reliably; the behavior is erratic. I am lost.
I would appreciate any help,
Thank you in advance,
Mati
I have a web page that references a large number of JS and image files. When the page is loaded a second time, 304 requests go to the server. I would like to get an HTTP 200 (from cache) status for cached objects rather than 304.
I am using ASP.NET 4, IIS 7.
Setting the Expires header is not working; the browser still sends requests that come back as 304. I want HTTP status 200 (cache).
Please let me know if there is any technique for this.
You've said that setting the expiry doesn't help, but it does if you set the right headers.
Here's a good example: Google's library hosting. Here are the (relevant) headers Google sends back if you ask for a fully-qualified version of a library (e.g., jQuery 1.7.2, not a partially-specified version like jQuery 1 or jQuery 1.7):
Date:Thu, 07 Jun 2012 14:43:04 GMT
Cache-Control:public, max-age=31536000
Expires:Fri, 07 Jun 2013 14:43:04 GMT
(Date isn't really relevant, I just include it so you know when the headers above were generated.) There, as you can see, Cache-Control has been set explicitly with a max-age of a year, and the Expires header also specifies an expiration date a year from now. Subsequent access (within a year) to the same fully-qualified library doesn't even result in an If-Modified-Since request to the server, because the browser knows the cached copy is fresh.
This description of caching in HTTP 1.1 in RFC 2616 should help.
I have a Flash application that sends a getURL request for an image file every 60 seconds.
This works fine in all browsers except IE9 with Internet Option set to automatically check for newer versions of stored pages.
I set up Charles Proxy (http://xk72.com) to watch the requests being sent by my Flash app and confirmed that the request is being suppressed by IE9 when the setting is set to Auto, but it works fine if I change the setting to check every time I visit the webpage. This, however, is not an option! I need this to work in all browsers regardless of how the options are set.
Even if I do a page refresh (F5), the ASP page does not reload. The only way to get it to reload is close the browser and restart it.
I have tried adding content headers to disable caching but it does not appear to work.
For example, here are my response headers:
HTTP/1.1 200 OK
Date Sun, 02 Oct 2011 23:58:31 GMT
Server Microsoft-IIS/6.0
X-Powered-By ASP.NET
Expires Tue, 09 Nov 2010 14:59:39 GMT
Cache-control no-cache
max-age 0
Content-Length 9691
Content-Type text/html
Set-Cookie ASPSESSIONIDACQBSACA=ECJPCLHADMFBDLCBHLJFPBPH; path=/
Cache-control private
I have read the Microsoft blog (http://blogs.msdn.com/b/ie/archive/2010/07/14/caching-improvements-in-internet-explorer-9.aspx) which states that if I add the content headers, the browser should respect my request, but it obviously does not.
I don't think this is a Flash issue since the html page that holds the Flash object will not even reload.
You can append a random number to the end of the url.
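The random-number trick can be sketched as a small helper; the function name and the `_` query key are made up for illustration:

```javascript
// Append a unique query value so IE9 treats every request as a new URL
// and never serves it from its cache.
function bustCache(url) {
  const sep = url.includes('?') ? '&' : '?';
  return url + sep + '_=' + Date.now();
}
```

In the Flash app's case, the image URL passed to getURL would be rebuilt with a helper like this before each 60-second request fires.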
I'm sending 2 cookies to the browser. One is a browser identifier which expires in 1 year, and the other is a session tracker without an expiration. The response headers for a fresh request look like this
HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8
X-XSS-Protection: 0
ETag: "b502a27282a5c621f34d522c3fcc8e3e"
Set-Cookie: bid=ahFmaXJld29ya3Njb21wdXRlcnIPCxIHQnJvd3NlchimigcM; expires=Fri, 12-Aug-2011 05:21:55 GMT; Path=/
Set-Cookie: rid=1281569589; Path=/about
Expires: Wed, 11 Aug 2010 23:33:09 GMT
Cache-Control: private, max-age=345600
Date: Wed, 11 Aug 2010 23:33:09 GMT
I'm trying to access both cookies from JavaScript on the page.
In Firefox and Chrome document.cookie gives me this
"rid=1281568223; bid=ahFmaXJld29ya3Njb21wdXRlcnIPCxIHQnJvd3Nlchj2nAYM"
In IE6, IE7, IE8 document.cookie only gives me this
"bid=ahFmaXJld29ya3Njb21wdXRlcnIPCxIHQnJvd3Nlchj2nAYM"
Is the 'path' attribute in my rid cookie throwing off IE or would it be the missing expiration date (which I thought was supposed to be optional)? I assume it is not the fact that I'm setting more than 1 cookie, because that's done all the time.
IE will only allow you access to those cookies if you are in a subdirectory! So if you set the cookie's path to be /about, and your page is actually /about then you cannot access it.
So it seems for IE you can access the cookie on pages beneath /about like /about/us but not on a page that is /about itself. Go figure :/
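Given that behaviour, the safest workaround is to set the cookie with Path=/ so every page, including /about itself, can read it. Below is a minimal cookie-reading sketch; the helper name is hypothetical, and in the browser you would pass it document.cookie:

```javascript
// Read one cookie by name from a cookie string such as document.cookie
// (assumes names and values are not URL-encoded).
function readCookie(cookieString, name) {
  const entry = cookieString
    .split('; ')
    .find((c) => c.startsWith(name + '='));
  return entry ? entry.slice(name.length + 1) : null;
}
```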
Alexis and Rishi I think have this spot on. And this is the only place on the web I've found this information on how IE treats cookies with paths. And what a ball ache! IE strikes again.
Incidentally, in IE 11 at least, it performs a 'starts with' comparison on the full path, so a cookie set with a path of '/abou' can be accessed on the '/about' page. Though in my current project this is little consolation, as I'm not in a position to assume that taking one character off the end of the path will reliably identify unique paths in a site.
I am also having a similar issue with IE. I'm setting three cookies without a path (so assumed to be "/"). I am running in a dev environment on my own machine. When I open the page as http://localhost/page.aspx, I get the expected result and my JavaScript can find the cookies. However, if I load the same page as http://mymachine.mydomain.com/page.aspx, I can watch (in the debugger) the same three cookies being added to the response, but when I get to the JavaScript function that looks for them, all my cookies are null. Needless to say, this works fine on Firefox.