What is the cookie path if I am using XAMPP? - javascript

I am trying to find a cookie that I created with the Cookies.js plugin, using Google Chrome against a site served by XAMPP. I know the cookie is being created successfully because my website loads up the settings that I write into the cookie, but I can't find it in htdocs, which I believe is the root folder. The cookie path is the default (root) path.

So after a bit of poking around, I finally found the cookie.
In Chrome, it's stored in either of the two paths below:
C:\Users\your username\AppData\Local\Google\Chrome\User Data\Default
C:\Users\your username\AppData\Local\Google\Chrome\User Data\Default\Local Storage
The answer is here: https://superuser.com/questions/459426/where-does-chrome-store-its-cookie-file
Or it can be accessed by typing the following into the URL bar:
chrome://settings/cookies
In Firefox, it can be accessed at
C:\Users\user\AppData\Local\Mozilla\Firefox\Profiles\random characters
The answer is also here: https://superuser.com/questions/387372/where-does-firefox-keep-cookies
I checked the paths myself, so I am pretty sure they are right. Correct me if I am wrong. I also noticed that the paths change between some versions of the same browser, so beware of that too.

Cookies are not created in the /htdocs folder; they are held only by the browser. For more information check
http://www.allaboutcookies.org/
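If you just want to confirm that the cookie exists, you can also inspect it from the page itself. A minimal sketch, assuming a Cookies.js-style API; the cookie name "settings" is only an illustrative placeholder:
// Run this in the browser console on the page served by XAMPP.
// document.cookie lists every cookie visible to JavaScript for this origin.
console.log(document.cookie);
// Most Cookies.js-style libraries also expose a getter; the name "settings"
// is an assumption, substitute whatever key you used when writing the cookie.
console.log(Cookies.get('settings'));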

Related

How to Change Firefox Proxy Settings Programmatically?

I'm launching Firefox via the command line and I'd like to launch a specific Firefox profile with a proxy. According to this answer on Stack Overflow, Firefox proxy settings are stored in prefs.js in the Firefox profile folder, and it is necessary to edit this file to launch FF with a proxy.
I've edited the file as follows:
user_pref("network.proxy.ftp", "1.0.0.1");
user_pref("network.proxy.ftp_port", 00000);
user_pref("network.proxy.gopher", "1.0.0.1");
user_pref("network.proxy.gopher_port", 00000);
user_pref("network.proxy.http", "1.0.0.1");
user_pref("network.proxy.http_port", 22222);
user_pref("network.proxy.no_proxies_on", "localhost, 1.0.0.1");
user_pref("network.proxy.socks", "1.0.0.1");
user_pref("network.proxy.socks_port", 00000);
user_pref("network.proxy.ssl", "1.0.0.1");
user_pref("network.proxy.ssl_port", 00000);
user_pref("network.proxy.type", 1);
Note: the IP address and port used above are for demonstration purposes.
However, I'm encountering two problems:
1) Firefox completely ignores these settings and launches without any proxy at all
2) When Firefox exits, the text modification is reverted/deleted
Note: When I edited the text file above, Firefox was not running. I know there's a disclaimer at the top of prefs.js:
If you make changes to this file while the application is running, the
changes will be overwritten when the application exits.
But there were no live instances of Firefox running at the time I edited the above file.
Manually creating different FF profiles (as suggested by another user) with different proxies is not an option, as everything needs to be done programmatically, without manual intervention.
Does Firefox still support setting a proxy via prefs.js? If not, what is the current working solution to launch Firefox via the command line with a proxy in Java?
Thanks
A proxy auto-config (PAC) file is what you are looking for.
Docs here.
Define a file name.pac that contains the JavaScript function
function FindProxyForURL(url, host)
Inside the file you can use any JavaScript you'd like to decide which proxy to use. Set the path to your .pac file in the Firefox settings, under automatic proxy configuration. Remember to use a file:// URL.
To set up automatic file switching, configure Firefox to point at a single file and overwrite that file programmatically every time you want it to change. You could keep copies of all the options and copy one of them into the target file right before launching (see the sketch further below).
An example of a super simple pac file is this:
function FindProxyForURL(url, host) {
    return 'PROXY proxy.example.com:8080; DIRECT';
}
It will always return the same proxy for every URL.
Passwords are not explicitly supported by the PAC standard, but there are different ways to approach this. Firefox will prompt you for a login if it thinks it needs one, and you could also embed the password into the URL (username:password@proxy.example.com). Additionally, a tool like Proxy Login Automator could allow you to use passwords and to set the proxy dynamically without having to fight with Firefox.
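To automate the file switching described above, here is a minimal sketch in Node.js (the original question asks for Java, but the two steps, copy the file and spawn the browser, translate directly; all file paths, PAC file names and the profile name are assumptions):
// swap-pac.js: copy the chosen PAC file over the one Firefox is configured
// to read, then launch Firefox with a dedicated profile.
const fs = require('fs');
const { spawn } = require('child_process');
const chosen = process.argv[2] || 'pac/office.pac';   // one of your prepared PAC files (assumption)
const target = 'C:/proxy/current.pac';                // the file Firefox points at (assumption)
fs.copyFileSync(chosen, target);
// -P selects a named profile, -no-remote keeps it separate from any running instance
spawn('firefox', ['-P', 'automation-profile', '-no-remote'], { stdio: 'inherit', detached: true });
For example, node swap-pac.js pac/home.pac would launch Firefox with the "home" proxy rules in place.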

connect.sid not generated in Bluemix

We have some JavaScript code working fine on local Node.js (and MongoDB).
But when we moved the code to Bluemix, something does not work.
Let me explain.
I see in the Chrome browser that there is a cookie named "connect.sid" when all is OK (local server).
And "connect.sid" is not present when the code runs in Bluemix, so I cannot use "request.session".
Any clues ?
One probable reason could be that the httpOnly flag is being set. Please do check on this.
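For context, connect.sid is the default cookie name issued by express-session, so flags like httpOnly live in its cookie options. A minimal sketch, assuming the app uses express-session; the secret and option values are illustrative only:
const express = require('express');
const session = require('express-session');
const app = express();
// If the platform terminates TLS in front of the app and secure cookies are used,
// the app must trust the proxy, otherwise the session cookie is never set.
app.set('trust proxy', 1);
app.use(session({
  secret: 'replace-with-a-real-secret',   // illustrative value
  resave: false,
  saveUninitialized: false,
  cookie: {
    httpOnly: true,   // cookie is sent to the server but hidden from document.cookie
    secure: false     // set to true only when the site is served over HTTPS
  }
}));
app.get('/', (req, res) => {
  req.session.views = (req.session.views || 0) + 1;
  res.send('Views this session: ' + req.session.views);
});
app.listen(process.env.PORT || 3000);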

Magento install copied - admin menu doesn't work

I cloned an existing Magento 1.7.2 installation on the same server with a test subdomain. The frontend seems to work, and I can log in to the admin. The admin menu doesn't work, however: no dropdowns, and copying URL paths doesn't work either. I've searched online, and most answers date back to 2008 and suggest that it's a permissions issue. So I've changed the permissions of folders and files to 755 and 644, but there are still no working menus. The cache (var/cache) is empty.
These menus are JavaScript-generated. The following error message is from the console:
Error: TypeError: Element.addClassName is not a function
To be clear, the solution is not in the JavaScript but in something on the server. This install works on the same server in another directory with another domain.
Any ideas how to fix this?
The error
Error: TypeError: Element.addClassName is not a function
indicates some JavaScript on your page can't call the addClassName method.
The addClassName method is added to Element by the Prototype JavaScript framework.
That means it's very likely your browser can't download the prototype.js file. Since it can't download this file, the addClassName method is never defined, and you get the error you're seeing.
Look at the source code of your admin pages and find the script tag that includes the version of Prototype shipped with your version of Magento.
<script type="text/javascript" src="http://magento.example.com/js/prototype/prototype.js"></script>
Take the URL from this script tag and load it in your browser.
My guess is you'll get a 404 because the file is missing, or a forbidden error because the file has incorrect permissions, or some other web server error that prevents the file from being shown. It's also possible that the link is pointing to an older domain name that's based on a value configured or cached in Magento.
Track down the source of that problem, and you'll be good to go.
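As a quick check before digging further, you can verify in the browser console whether Prototype actually loaded on the broken admin page (a sketch; the expected values assume a standard Magento 1.x admin):
// Run in the browser console on the broken admin page.
console.log(typeof Prototype);             // expected "object" if prototype.js loaded
console.log(typeof Element.addClassName);  // expected "function"; "undefined" reproduces the menu error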
Another reason could be that the skin and CSS settings are not correct for your environment.
I've just moved a site from live to local, and the skin/CSS/media URLs were configured for a subdomain, so I looked in the core_config_data table and updated the URLs.
Please check if you have set merge JS or merge CSS to yes; you can update this via the DB if you can't do it via the admin menu:
SELECT * FROM core_config_data WHERE path LIKE 'dev%'
Change merge_css and merge_js from 1 to 0.
In my case I changed the permissions of the folder and its files and subfolders recursively, and it started working. Try it once.

Permission Denied using Javascript/jQuery on a local file

function publish(text) {
$('#helpdiv').prepend(text);
}
function get_help(topic) {
$.get(topic, publish);
}
<p>Hi. click here for more help.</p>
<div id="helpdiv"></div>
I've inherited the chunk of HTML and JavaScript above (snippet). It is/was going to be used as local help. Currently it is online only and it works fine. However, when I copy the files locally, I get "Permission Denied" in Internet Explorer, and Chrome doesn't do anything when I "click here for more help". What it's supposed to do is load the help content from inline-help.html and display it in the helpdiv div. Now here is the kicker: if I take the same files and copy them to inetpub on my PC and load them as http://localhost/hello.html, it functions perfectly.
Presumably this is a security thing where the "local" zone isn't allowing me to load files off of the user's HD? But I'm not really sure what's going on and would like to understand this problem further and potentially come up with a workaround.
Any insight is greatly appreciated.
jQuery's "get" uses XMLHttpRequest, which doesn't work on local files, unfortunately. If you really need to be able to fetch local data (or data from a different domain) asynchronously, you should use dynamic script tags. However, that means the data file has to be reformatted as JSON data.
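A minimal sketch of that dynamic-script-tag approach (the file name inline-help.js and the callback name are assumptions, not part of the original page): the help content is wrapped in a small .js file that calls a callback defined on the page, and the page loads it by appending a script element, which works from file:// where XMLHttpRequest is blocked.
// inline-help.js -- the help content, reformatted as executable JavaScript (illustrative)
showHelp('<p>Help text goes here.</p>');
// on the main page
function showHelp(html) {
  // called by the loaded script; injects the help markup
  document.getElementById('helpdiv').innerHTML = html;
}
function getHelp(src) {
  // appending a script tag is allowed even when the page is opened from disk
  var s = document.createElement('script');
  s.src = src;
  document.body.appendChild(s);
}
// usage: getHelp('inline-help.js');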
I don't think your browser is allowing you to run JavaScript against local files (using the file:/// access method), but when you load it from http://localhost/ it works fine.
You need to either develop on a website or use your localhost server.

Javascript security / cross scripting on same server

I have some JavaScript that I need to work via the following:
http://localhost
http://servername
http://www.domainnamefortheserver.com
When I run the script from http://servername with an IFRAME referencing the domain - it does not load.
Is there a way to get the Javascript security model to recognize the server name, localhost and the domain as the same "domain"?
Thanks
If you are running on UNIX, you can edit /etc/hosts to add a fake DNS entry for your server, e.g.
127.0.0.1 localhost www.domainnamefortheserver.com
Then you can always connect to it by the correct name, even when it's not on the live site yet. Don't try to break the JavaScript security model directly.
This will also work on OS X. Windows works differently, I expect.
If you are using a server-side language to generate the page, you may be able to set the security domain like so:
document.domain = $CURRENT_HOSTNAME;
So the security domain will be the domain the user requested. This is a shot in the dark, but I hope it helps nonetheless.
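For reference, the usual document.domain pattern only works when the page and the frame share a common parent domain; a minimal sketch with illustrative host names (it cannot make localhost, servername and www.domainnamefortheserver.com look alike):
// In the parent page at http://app.example.com
document.domain = 'example.com';
// In the framed page at http://static.example.com
document.domain = 'example.com';
// Both documents now belong to the same security domain and may script each other.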
Use root relative URIs:
href="/foo/bar"
rather than absolute URIs:
href="http://example.com/foo/bar"
That way the document will be loaded from the same hostname.
What do you mean by "my references are to the domain name"?
If you load scripts in your page on http://servername (using <script src=''>), they will have access to everything on http://servername, even if they come from another domain.
However, if you try to make AJAX calls to the other domain, then you have a problem. You can use the trick explained by Christopher, i.e. making aliases to the domain.
