I want to inject some JavaScript code during an HTTP request, before the page's onload handler runs. How can I achieve this? Is it even possible? I have done it with a Chrome extension, but I want a different method that works in all browsers, something like injecting the code through the URL bar while opening the page. The injection should work on every site I open.
Another neat way to do it is to have the request go through a proxy. The proxy can inject the JS by modifying the source before it reaches the client. This is how most ad-injecting proxies do it.
But then it's a security issue. That's why, just as you would have to tell the user to install an extension, you'd have to tell the user to configure the proxy. Otherwise, it's a no-go.
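As a rough illustration, here is a minimal sketch of such an injecting proxy in Node.js, using only the built-in http module. It only handles plain HTTP (intercepting HTTPS would additionally require the client to trust the proxy's certificate), and the injected payload and port are just placeholders:

const http = require('http');

// Placeholder payload; a real proxy would inject whatever script you need.
const INJECTED = '<script>console.log("injected before onload")</script>';

http.createServer((clientReq, clientRes) => {
  // Proxy-style requests carry an absolute URL in the request line.
  const url = new URL(clientReq.url);
  const upstream = http.request({
    host: url.hostname,
    port: url.port || 80,
    path: url.pathname + url.search,
    method: clientReq.method,
    // Ask for uncompressed bodies so string rewriting works.
    headers: { ...clientReq.headers, 'accept-encoding': 'identity' },
  }, (res) => {
    const isHtml = /text\/html/.test(res.headers['content-type'] || '');
    if (!isHtml) {
      // Pass non-HTML responses through untouched.
      clientRes.writeHead(res.statusCode, res.headers);
      res.pipe(clientRes);
      return;
    }
    // Buffer HTML so we can rewrite it; the length changes, so drop content-length.
    let body = '';
    res.on('data', (chunk) => { body += chunk; });
    res.on('end', () => {
      const headers = { ...res.headers };
      delete headers['content-length'];
      clientRes.writeHead(res.statusCode, headers);
      clientRes.end(body.replace('</body>', INJECTED + '</body>'));
    });
  });
  clientReq.pipe(upstream);
}).listen(8080);

Pointing the browser's HTTP proxy setting at localhost:8080 would then route pages through it.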
Yet another way to do it is to have the owners of the site embed a script tag that points to a permanent path. Behind that path, you can serve any script you like, which may in turn load more scripts.
It can be as short as:
<script src="path/to/a/permanent/location"></script>
This is typically how publicly served APIs like Google's work. Behind their permanent path is a script that loads all the APIs you need. But, as I said in the previous section, you need permission, this time in the form of having the webmaster embed the script on the page.
I'm trying to solve a problem similar to the ones in this question and this one: tracking a sub-session for users by browser tab. What I'm trying to accomplish, though, is finding a way to set a request parameter that is sent back to the server with each request, whether it's a simple synchronous link click, a form post, or an Ajax request. My hope was that I could set something in the HTML head that would be sent every time, though I haven't seen anything to suggest that's possible. I'm looking for a solution that doesn't require wrapping every server request in JavaScript just to include the desired parameter.
We're using Rails on the backend.
This is a difficult question to answer without knowing which server-side language you are using. Regardless, with PHP, Perl, Python, or any other language, you could generate a unique ID and append it as a query string to your links. If you don't want ugly links, you can:
A. use .htaccess to perform a mod_rewrite and clean up the URL
B. use an additional cookie to track the sub-session
C. use pure JavaScript, though that isn't very reliable since it depends on the client browser (see the sketch below)
The method used to track it depends on what capabilities you have: a database, a text file, etc.
Again, not much info to go on.
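To illustrate option C: sessionStorage is scoped per browser tab, so a per-tab ID kept there and appended to links and forms gets most of the way. This is only a sketch; the parameter name "subsession" is a placeholder, and dynamically added links and Ajax calls would need the same treatment:

// Generate the per-tab ID once; sessionStorage survives reloads but not new tabs.
const TAB_ID = sessionStorage.tabId ||
  (sessionStorage.tabId = Math.random().toString(36).slice(2));

document.addEventListener('DOMContentLoaded', () => {
  // Links: carry the ID as a query parameter.
  for (const a of document.querySelectorAll('a[href]')) {
    const url = new URL(a.href, location.href);
    url.searchParams.set('subsession', TAB_ID);
    a.href = url.toString();
  }
  // Forms: add a hidden input so posts carry the ID too.
  for (const form of document.querySelectorAll('form')) {
    const input = document.createElement('input');
    input.type = 'hidden';
    input.name = 'subsession';
    input.value = TAB_ID;
    form.appendChild(input);
  }
});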
On this site, which runs on Joomla and uses a plugin called Matukio, the links on the left used to work but no longer do, and we are desperate to figure out why.
The company that makes the plugin replied with the info below, but it sounds like he's just listing things that aren't really the cause (JUri::root() is working fine, the VWO code was already there back when the links worked, etc.).
I doubt anyone can give much insight based on this limited info, but I'm taking a shot because Stack Overflow is literally the best site on the internet for help. If anyone has ideas on things I should look at or test, please share.
FROM MATUKIO:
You have JavaScript errors on your page (not caused by Matukio):
07:43:49.475 ReferenceError: Heatmap is not defined
<anonym>events:129
1events:129:3
ga('set', 'VWO', Heatmap);
Additionally the following is failing:
07:44:55.490 Loading of mixed contents "http://www.workwave.com/index.php?option=com_matukio&view=requests&format=raw&task=route_link&link=index.php%3FItemid%3D283%26option%3Dcom_matukio%26view%3Deventlist%26art%3D0%26catids%3D0%26search%3D%26limit%3D10%26dateid%3D2%26fees%3D0%26locations%3D0%26organizers%3D0%26ordering%3D1%26start%3D0&Itemid=283" was blocked .1jquery.min.js:5:25679
As you see, the link looks right, but the protocol is wrong, i.e. http instead of https. Are you using any plugin for https redirection? Or htaccess? It seems that JUri::root() is not working correctly on your instance. Joomla has a setting for https in the global config.
Kind regards,
Yves
It's hard to tell without more information, but it looks like your site is using HTTPS. That's a good thing, but one of its restrictions is that a page served over a secure connection can't load resources from a server over the non-secure HTTP protocol (it's kind of like buying a fancy lock for your front door and then leaving the window open).
The error is most likely coming from the Heatmap library. If you're pulling this library from a CDN, try changing the URL to "https://" instead of "http://"; that should fix it.
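In other words, the fix amounts to changing the scheme in the script tag (the CDN path here is a hypothetical placeholder):

<!-- blocked on an HTTPS page: -->
<script src="http://cdn.example.com/heatmap.js"></script>
<!-- loads fine over TLS: -->
<script src="https://cdn.example.com/heatmap.js"></script>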
Let's say I have a website example.com and a page example.com/DHAS5KJ1H45GAS.html. There are no links to said page anywhere except in a MySQL database.
I'm assuming there is a setting to make search engines ignore the page (noindex, nofollow).
So my question is: is there a program that can find all of a website's pages?
PS: I'm trying to make a page accessible only to users who know a passcode (without registration); I have a MySQL database of code/link pairs.
Maybe there is a better way to do this?
If you give a file a random name, don't link it anywhere publicly, and don't have directory indexing enabled in your web server, there is theoretically no way it would be found by anyone without the link.
Do keep in mind, however, that anything you put in the URL gets stored in the user's browser history (someone who REALLY wanted to get in might use brute-force CSS history knocking to exploit this if your codes weren't sufficiently random), and that it would be pretty easy for anyone who had access to share the URL.
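For Apache specifically, directory indexing can be switched off with a single directive in an .htaccess file (assuming mod_autoindex is what generates the listings):

Options -Indexes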
Create a robots.txt file in the root of your web directory, and set the content to:
User-agent: *
Disallow: /
This will keep compliant search engine crawlers from indexing the site. You can make the rules as specific as you like, allowing or disallowing individual pages if need be.
If the file has a constant name you can use mod_rewrite. Simply create a .htaccess file in the folder where the file is located and put in the following code:
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteRule ^/?DHAS5KJ1H45GAS\.html$ - [F,L]
</IfModule>
So, a while back I wrote a script in Python to automatically register me for classes at my school (by creating a browser with mechanize, signing in with my user/password, clicking the buttons, etc.).
Lately I've been trying to learn JavaScript/HTML/CSS, and I was thinking JavaScript would be better suited for this kind of thing, but I don't really understand how that would work yet, because the only way I've seen JavaScript used is linked from an HTML file and run when you load the HTML.
I was wondering how I would create a standalone JavaScript program to do that kind of thing without HTML, and how I would run it. (Do I download an interpreter of some kind?)
I couldn't find information about this through Googling, though I'm sure it's out there; I just don't know what it's called (I tried "web mining" and "web crawling", but that doesn't seem to be it).
No, that's the wrong use. If you are going to try to use JS externally to manipulate a browser page, you'll have to open the web console in your browser and paste the code in, which is totally impractical.
Node.js does let you write JavaScript that has full file system access, etc. on your computer, so that would be your best bet, but your question is pretty vague, so I can't tell if it's exactly what you need.
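For a feel of what that looks like, here is a minimal standalone sketch (Node 18+, which ships a built-in fetch) doing roughly what a mechanize script does. The URLs, field names, and session-cookie handling are all hypothetical placeholders for whatever the school's site actually uses; run it with node register.js:

// Log in by posting the form fields, then reuse the session cookie.
async function register() {
  const login = await fetch('https://school.example.com/login', {
    method: 'POST',
    headers: { 'content-type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({ user: 'me', password: 'secret' }),
    redirect: 'manual', // keep the response that carries the cookie
  });
  const cookie = login.headers.get('set-cookie') || '';

  // "Click" the registration link by requesting it with the session cookie.
  const res = await fetch('https://school.example.com/register?class=CS101', {
    headers: { cookie },
  });
  console.log('registration request returned', res.status);
}

register().catch(console.error);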
Say a webpage loads an external JavaScript file at load time. Is there any Firefox plugin I could use to modify that JavaScript before the page actually processes it?
(Not just JavaScript specifically.)
Thanks in advance.
(Also, I'm pretty sure the Tamper Data plugin only changes header data, not the actual content being received.)
For everyone who has never used Tamper Data: it is for OUTGOING requests. Tamper Data can modify the ENTIRE request, except the URL, which requires you to replay the request.
Using GreaseMonkey you can write stand-alone user scripts that modify any element of the page before it loads.
Here is THE GUIDE you want which explains GreaseMonkey.
Here is a massive collection of GreaseMonkey "UserScripts"; the site contains many examples of what you are looking for.
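As a taste, here is a sketch of a user script that swaps out an external script before it runs. Everything in it is hypothetical (the matched site, the targeted file name, and the replacement payload), and it relies on the non-standard, Firefox-only beforescriptexecute event, which fires before each script element executes:

// ==UserScript==
// @name     Replace external script (sketch)
// @match    *://example.com/*
// @run-at   document-start
// ==/UserScript==

window.addEventListener('beforescriptexecute', function (e) {
  if (e.target.src && e.target.src.indexOf('app.js') !== -1) {
    e.preventDefault(); // cancel the page's copy before it runs
    const s = document.createElement('script');
    s.textContent = 'console.log("patched script ran instead");';
    document.head.appendChild(s); // run the replacement instead
  }
});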
You have a couple of options:
Tamper Data will modify POST parameters (and GET too, really, since you can modify the URL).
You can also combine FoxyProxy (https://addons.mozilla.org/en-US/firefox/addon/2464) with any number of free interactive proxies (Fiddler, Paros, Burp, Charles).
Finally, you can choose not to use a proxy at all and write a Greasemonkey script.
I think you'll have the most luck with the FoxyProxy + proxy approach. Unfortunately, it's not a single addon.
The minimalistic, browser-agnostic approach would be to write your own bookmarklet. For example, I have found the Show Hiddens bookmarklet extremely useful for debugging form submissions. While extremely simple, that bookmarklet does things Tamper Data cannot.
I have found it here:
http://www.squarefree.com/bookmarklets/forms.html
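To give an idea of the pattern (this is a sketch in the same spirit, not the actual Show Hiddens code): a bookmarklet is just a javascript: URL saved as a bookmark, shown here formatted for readability but collapsed to one line in practice. This one turns every hidden form field into a visible, editable one:

javascript:(function () {
  for (const el of document.querySelectorAll('input[type=hidden]')) {
    el.type = 'text'; /* make the field visible and editable */
    el.style.outline = '2px solid red'; /* highlight what was hidden */
  }
})();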
Also, the Forms tab in the Web Developer toolbar has some useful options.
If you want to change a downloadable resource, use Opera, set its cache to never expire, and modify the cached files. That's how I did it a year or two ago, successfully.
I believe GreaseMonkey can modify the data in the page, though I'm not sure whether it runs before or after the page loads.
Check out TamperMonkey for Chrome: http://tampermonkey.net/
Or if you want to do it manually, in Chrome, it's really simple.
In Chrome, browse to chrome://extensions, then drag your .js file onto that page.
Chrome will automatically create a manifest.json file in the Chrome AppData folder.
You can edit that manifest.json to filter which websites your script runs on.
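For reference, a minimal manifest for a content script looks something like this (the extension name, match pattern, and file name are placeholders); the "matches" array is what filters which sites the script runs on:

{
  "manifest_version": 3,
  "name": "My injected script",
  "version": "1.0",
  "content_scripts": [
    {
      "matches": ["*://*.example.com/*"],
      "js": ["myscript.js"],
      "run_at": "document_start"
    }
  ]
}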