prevent bots / scrapers executing javascript to get output [closed] - javascript

I see a lot about CAPTCHAs and submission forms / methods to block bots and content scrapers / leechers, but nothing about blocking those who take the entire JavaScript contents and execute it to obtain and view what it outputs.
Is it possible to prevent bots from executing JavaScript to obtain the output?
I have looked at if statements within JavaScript checking screen resolution, keyboard, mouse, touch screen and other basic signals of a human user, but it is a hard area to find information on.
if (bot) {
  // Don't execute the JavaScript; don't let the bot get the real output.
  return;
}
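
For what it's worth, here is a minimal sketch of the kind of heuristic check described above. The signals and the renderRealOutput function are illustrative assumptions, and none of this is reliable: a headless browser can spoof every one of these values.

// Illustrative heuristics only; a determined scraper can spoof all of these.
function looksLikeBot() {
  // Headless Chrome and similar automation tools often expose navigator.webdriver.
  if (navigator.webdriver) return true;

  // A zero-sized screen can hint at a non-interactive environment.
  if (!screen.width || !screen.height) return true;

  // Some older headless setups report an empty language list.
  if (navigator.languages !== undefined && navigator.languages.length === 0) return true;

  return false;
}

if (looksLikeBot()) {
  // Skip producing the real output for suspected bots.
} else {
  renderRealOutput(); // hypothetical function that produces the real content
}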

The only known mechanism is minification and obfuscation of your JavaScript functions. Change them on every deploy, or every day through a scripted process. Another thing is to avoid exposing your methods on the global window object.
You may want to look at WebAssembly, but not all browsers have adopted it yet.
There is no straightforward way to achieve this perfectly. If people put in enough time, they can crack it.
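As an illustration of the minify-and-mangle step mentioned above, here is a sketch using the terser package; the file names and build wiring are assumptions, and any minifier with name mangling would work similarly.

// Build step: minify and mangle app.js before every deploy.
// Assumes `npm install terser`; file names are illustrative.
const { minify } = require("terser");
const fs = require("fs");

async function build() {
  const source = fs.readFileSync("app.js", "utf8");
  const result = await minify(source, {
    compress: true,
    mangle: { toplevel: true } // rename top-level functions and variables as well
  });
  fs.writeFileSync("app.min.js", result.code);
}

build();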

Related

Hacked Site - SSH to remove a large body of javascript from 200+ files [closed]

I have been asked to clean a hacked site on Apache (PHP, JS, HTML), which I can do, and I have implemented security features. However, there is still JS injected into about 2000+ JavaScript files. The injected code is the same on every page, about 5500 characters long, with !'' characters interspersed.
Ideally I'd like to run an SSH command that would find and remove this long code from every page it is on. All of the examples of find, grep, sed, etc. only show it for very short strings with no special characters.
Any help appreciated.
There's no point in trying to fix the server in place. Wipe it down to bare metal and redeploy from source control.
Once someone has gotten into your boxes, there's no way to ensure they're really gone unless you burn it all down. There's certainly no magic command that can figure it out for you.
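That said, if you still need to strip the injected code temporarily (for triage, or to diff against source control), one way to sidestep the regex-escaping problem is plain fixed-string replacement. Below is a minimal Node sketch, assuming the injected snippet has been saved verbatim to a file; the paths are illustrative.

// Remove an exact injected snippet from every .js file under a directory.
// Plain string replacement means special characters need no escaping.
const fs = require("fs");
const path = require("path");

const injected = fs.readFileSync("injected-snippet.txt", "utf8"); // the exact malicious code
const root = "/var/www/html"; // illustrative web root

function walk(dir) {
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      walk(full);
    } else if (entry.name.endsWith(".js")) {
      const before = fs.readFileSync(full, "utf8");
      const after = before.split(injected).join(""); // fixed-string removal
      if (after !== before) fs.writeFileSync(full, after);
    }
  }
}

walk(root);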

What is the best way of testing that removing unused tables (migration) does not affect the application [closed]

We have some very old tables which we do not use. I am planning to remove those. My initial plan is to rename the tables and test whether our application is affected in any way. But I am not sure how to test the application and make sure that it's not affected.
Ideally, you should have tests for your application, which are going to break if anything is using the deleted tables.
Assuming that you don't have tests, the next best way is to run a global search in your codebase and look for those models / table names being used in the code. If you still don't feel confident, you can manually go through every page and make sure that nothing is broken. Depending on the size of your app, that might be really slow and painful, but it's what you get for not writing tests from the start :P
Good luck!
Run all your automated tests. If you don't have any, right now is always the best time to start adding them.

Program to repeatedly get the contents of a webpage [closed]

I wish to get the contents of a web page that requires me to be logged in (and one that I do not have control over, e.g. Twitter or Facebook). For example, I can have Chrome running and see Ajax updating the page, but I want to periodically get the contents of this page and somehow save it. I don't mind leaving a computer running to achieve this...
You can use any HTTP client to achieve this (like curl). Depending on the site, it will take some investigation of how requests are made, in what order, the post data, the encryption, the user agent, cookies, headers, etc.
It could take some time to find the right recipe.
Generally these sites don't want you to do this though, so don't be surprised when you run up against CAPTCHAs or other clever methods of preventing exactly what you're trying to do.
Chances are, if you have to ask, you won't get in. But have fun.
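As an illustration of the approach above, here is a minimal Node sketch that polls a page on a schedule and saves each snapshot. It assumes Node 18+ for the built-in fetch; the URL, cookie value, and interval are placeholders you would have to work out for the specific site.

// Periodically fetch a logged-in page and save each snapshot to disk.
const fs = require("fs");

const url = "https://example.com/feed";           // placeholder URL
const cookie = "session=PASTE_YOUR_COOKIE_HERE";  // copied from a logged-in browser session

async function snapshot() {
  const res = await fetch(url, {
    headers: {
      "Cookie": cookie,
      "User-Agent": "Mozilla/5.0" // some sites reject unfamiliar user agents
    }
  });
  const body = await res.text();
  fs.writeFileSync(`snapshot-${Date.now()}.html`, body);
}

setInterval(snapshot, 5 * 60 * 1000); // every 5 minutes
snapshot();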

AJAX Microgames [closed]

If you're not familiar with the concept of a Microgame, check out this video of WarioWare Twisted.
I'm interested in setting up a site where users can play series of browser-based Microgames which are delivered to them by a server. Ideally this would allow me to crowdsource the games and have an open submission system. What sort of scheme could I use to make this work?
I'm thinking that one way to do it would be to have each game consist of:
A javascript file that defines a MicroGame object that controls a rectangular portion of the screen, gets input and timing information from the main page, then calls back to the main page with a "Success" or "Failure" message.
A folder of assets that must be downloaded before the game executes.
Is this possible to do, client-side within a browser? Where would be a good place to start figuring this out?
There are a lot of open issues here. The biggest problem is: in what language do they submit games that you can execute safely on the players' machines? That said, there are tools like this out there. You could look at the excellent Play My Code for inspiration.
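As a sketch of the contract the question describes between the host page and a submitted game, here is one possible shape. All names are hypothetical; the host page would download the game's script and assets, then drive it through methods like these.

// Hypothetical contract between the host page and a submitted microgame.
class MicroGame {
  // container: the rectangular DOM element the game may draw into
  // onFinish: callback the game must invoke with "Success" or "Failure"
  constructor(container, onFinish) {
    this.container = container;
    this.onFinish = onFinish;
  }

  // Called once after the game's asset folder has been downloaded.
  start() {}

  // Called by the host page every frame with elapsed time and current input state.
  update(elapsedMillis, input) {}

  // Called by the host page when the time limit expires without a result.
  timeUp() {
    this.onFinish("Failure");
  }
}

// Host-side usage (simplified):
// const game = new SomeSubmittedGame(document.getElementById("stage"), result => {
//   console.log("Game reported:", result); // "Success" or "Failure"
// });
// game.start();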

Javascript - Dynamically Create it? [closed]

I have a set of users with different permissions. Depending on what permissions they have, they should only have access to certain JavaScript files. In terms of speed, is it better if, on every visit, I check the permissions of that user, create one JavaScript file that contains ALL the JavaScript commands accessible to that user, and load that file into the view?
Or is it better to have multiple javascript files, call them page#_permission# (for instance, page1_permission10.js), and just load the corresponding files every time the page loads?
Thanks
It is probably faster to load in only the JavaScript that is needed BUT...
It probably will not be significant enough to warrant the effort. Furthermore, you may find yourself in debugging hell just to save a few ms.
Firefox and many other browsers have built-in tools that show how long each part of a page takes to load. You can profile your own site the same way and locate the bottlenecks.
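If you go with the per-permission files from the question, here is a minimal sketch of loading the matching script at runtime. The file-naming scheme follows the question's own example (page1_permission10.js), and the permission value is assumed to be rendered into the page by the server.

// Load the script that matches the current page and the user's permission level.
function loadPermissionScript(page, permission) {
  const script = document.createElement("script");
  script.src = `/js/page${page}_permission${permission}.js`;
  script.onerror = () => console.warn("No script for this page/permission combination");
  document.head.appendChild(script);
}

// Example: permission level rendered server-side into the page.
loadPermissionScript(1, 10);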
