How to pull reviews from an outside website that does not have an open API? [closed] - javascript

I need to automatically pull reviews from a job-posting page for my client. The problem is that it's a medium-sized, local website without a public API. Is there any way to automatically update my website based on information from another website? The data would consist of the rating, the reviewer's name, the message, and the date.

A library like Puppeteer is a solid choice for this if you want to stick with Node.js. Before you scrape, be sure to review the Terms of Service of the website you're scraping, but I've used Puppeteer for this purpose and it's generally very simple and convenient.
You use DOM selectors just as you would in plain JavaScript, and you can then grab text or attribute values wherever you want.
Then you just need to be careful with how you run these crawlers so you don't abuse the website. One crawl per minute is probably a reasonable minimum if I were to guess, but it's up to you! A sketch of the approach is below.
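For illustration, here is a minimal Puppeteer sketch. The URL and the CSS selectors (.review, .rating, and so on) are hypothetical placeholders; you would replace them with whatever the actual reviews page uses.

```js
const puppeteer = require('puppeteer');

async function scrapeReviews() {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  // Hypothetical reviews URL; wait until network activity settles.
  await page.goto('https://example.com/jobs/reviews', { waitUntil: 'networkidle2' });

  // Extract one object per review card found on the page.
  const reviews = await page.$$eval('.review', nodes =>
    nodes.map(node => ({
      rating: node.querySelector('.rating')?.textContent.trim(),
      reviewer: node.querySelector('.reviewer-name')?.textContent.trim(),
      message: node.querySelector('.message')?.textContent.trim(),
      date: node.querySelector('.date')?.textContent.trim(),
    }))
  );

  await browser.close();
  return reviews;
}

scrapeReviews().then(reviews => console.log(reviews));
```

You could run this on a schedule (e.g. a cron job) and write the results into your own database, which keeps the scraping frequency under your control.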

Related

Is it possible to record lead analytics for "Source", "Medium", and "Campaign" on a custom HTML form? [closed]

I know this question is a bit odd, but I have a CRM platform integrated with a MySQL database.
Each new row in the MySQL database becomes a lead in the CRM system, but now I need to track all the source-related information for each lead. It seems like a complicated task, and I don't really know if it's possible.
I've spent hours looking for a solution and still haven't found anything useful.
You have to programmatically replicate what Google Analytics does and save the values in a cookie or database.
For example, when a user lands on the website with UTM parameters in the URL, you can read those values. If there are no UTM parameters, check whether there is a gclid parameter in the URL; if so, the source/medium is google / cpc. If that parameter isn't there either, check document.referrer: if it's google.com, your source and medium will be google / organic; otherwise they will be website / referral.
You save the data on the first visit and pass it along in the form. This is the same principle Google Analytics makes use of.
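Here is a minimal sketch of that first-touch attribution logic. The attribution cookie name is a hypothetical choice; adapt the storage and the form hand-off to your own setup.

```js
// Decide source/medium/campaign in the priority order described above.
function detectAttribution() {
  const params = new URLSearchParams(window.location.search);

  // 1. Explicit UTM parameters win.
  if (params.get('utm_source')) {
    return {
      source: params.get('utm_source'),
      medium: params.get('utm_medium') || '(none)',
      campaign: params.get('utm_campaign') || '(none)',
    };
  }

  // 2. A gclid parameter means a Google Ads click.
  if (params.get('gclid')) {
    return { source: 'google', medium: 'cpc', campaign: '(gclid)' };
  }

  // 3. Fall back to the referrer.
  if (document.referrer.includes('google.')) {
    return { source: 'google', medium: 'organic', campaign: '(none)' };
  }
  if (document.referrer) {
    return {
      source: new URL(document.referrer).hostname,
      medium: 'referral',
      campaign: '(none)',
    };
  }
  return { source: '(direct)', medium: '(none)', campaign: '(none)' };
}

// Persist the first-touch attribution so later pages (and the form) can read it.
if (!document.cookie.includes('attribution=')) {
  const value = encodeURIComponent(JSON.stringify(detectAttribution()));
  document.cookie = `attribution=${value}; path=/; max-age=${60 * 60 * 24 * 90}`;
}
```

On form submission you would read the cookie back and copy its fields into hidden inputs, so each new MySQL row (and therefore each CRM lead) carries its attribution.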

Program to repeatedly get the contents of a webpage [closed]

I want to get the contents of a web page that requires me to be logged in (and one that I do not have control over, e.g. Twitter or Facebook). For example, I can have Chrome running and watch Ajax requests update the page, but I want to periodically fetch the contents of this page and somehow save it. I don't mind leaving a computer running to achieve this...
You can use any HTTP software to achieve this (like curl). Depending on the site, it will take some investigation into how the requests are made: in what order, with what POST data, encryption, user agent, cookies, headers, etc.
It could take some time to find the right recipe.
Generally these sites don't want you to do this, though, so don't be surprised when you run up against captchas or other clever methods for preventing exactly what you're trying to do.
Chances are, if you have to ask, you won't get in. But have fun. A sketch of the basic approach is below.
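For illustration, here is a minimal Node.js sketch of a periodic authenticated fetch. It assumes Node 18+ (which ships a global fetch) and that you've copied a valid session cookie out of a logged-in browser session; the URL and cookie value are placeholders.

```js
const fs = require('fs');

// Placeholders: the page you want and the cookie header copied from your browser.
const TARGET_URL = 'https://example.com/feed';
const SESSION_COOKIE = 'session=paste-your-cookie-here';

async function fetchAndSave() {
  const res = await fetch(TARGET_URL, {
    headers: {
      Cookie: SESSION_COOKIE,
      // Mimic a normal browser so the site serves the usual page.
      'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)',
    },
  });
  const body = await res.text();
  // Save each snapshot with a timestamped filename.
  fs.writeFileSync(`snapshot-${Date.now()}.html`, body);
}

// Fetch once every five minutes; adjust the interval to taste.
setInterval(fetchAndSave, 5 * 60 * 1000);
fetchAndSave();
```

Note that session cookies expire, so a setup like this needs periodic re-authentication, and it does nothing to get around captchas or other anti-automation measures.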

Is it possible to use the Twitter API to feed tweets into a local text file? [closed]

I am using a projection mapping programme, with part of the project taking text from a local text file, and inserting it into the programme.
Instead of manually filling this text file with pre-populated content, is it possible to use the Twitter API to feed tweets into this local text file?
I'm not very familiar with using JavaScript to access Twitter's API, but you should be able to achieve that using twit.
Depending on the amount of data you want to obtain, the streaming API will be the better option: it gives you a large volume of data with less likelihood of hitting a rate limit.
If you have more questions regarding that specific package, you should give it a try (plenty of examples are available) and post specific questions with the proper tags (i.e. javascript, twit). A sketch is below.
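Here is a minimal sketch using twit's streaming API to append tweet text to a local file. The credentials and the track keyword are placeholders you would replace with your own.

```js
const Twit = require('twit');
const fs = require('fs');

const T = new Twit({
  consumer_key: 'YOUR_CONSUMER_KEY',
  consumer_secret: 'YOUR_CONSUMER_SECRET',
  access_token: 'YOUR_ACCESS_TOKEN',
  access_token_secret: 'YOUR_ACCESS_TOKEN_SECRET',
});

// Open a filtered stream and write each matching tweet to tweets.txt.
const stream = T.stream('statuses/filter', { track: 'projection mapping' });

stream.on('tweet', tweet => {
  fs.appendFile('tweets.txt', tweet.text + '\n', err => {
    if (err) console.error('write failed:', err);
  });
});
```

The projection mapping program can then keep re-reading tweets.txt as it grows.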

AJAX Microgames [closed]

If you're not familiar with the concept of a Microgame, check out this video of WarioWare Twisted.
I'm interested in setting up a site where users can play series of browser-based Microgames which are delivered to them by a server. Ideally this would allow me to crowdsource the games and have an open submission system. What sort of scheme could I use to make this work?
I'm thinking that one way to do it would be to have each game consist of:
A JavaScript file that defines a MicroGame object, which controls a rectangular portion of the screen, gets input and timing information from the main page, then calls back to the main page with a "Success" or "Failure" message.
A folder of assets that must be downloaded before the game executes.
Is this possible to do, client-side within a browser? Where would be a good place to start figuring this out?
There are a lot of open issues here. The biggest problem is: what language do users submit games in that you can execute safely on players' machines? That said, there are tools like this out there. You could look at the excellent Play My Code for inspiration. A sketch of one possible MicroGame contract is below.
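For illustration, here is a hypothetical sketch of the MicroGame contract the question describes. The method names and the host-side driver are assumptions, not an existing API.

```js
// Hypothetical contract between the host page and a submitted game.
class MicroGame {
  // The host hands the game a container element and a completion callback.
  constructor(container, onFinish) {
    this.container = container; // rectangular portion of the screen
    this.onFinish = onFinish;   // call with true (Success) or false (Failure)
  }

  // The host calls this each frame with input state and remaining time.
  update(input, msRemaining) {
    if (msRemaining <= 0) {
      this.onFinish(false); // time ran out: report failure
      return;
    }
    // ...game-specific logic would go here...
  }
}

// The host page would drive the game loop, e.g.:
// const game = new MicroGame(document.getElementById('stage'), won => loadNext(won));
// setInterval(() => game.update(readInput(), deadline - Date.now()), 16);
```

The hard part remains sandboxing untrusted submissions; an iframe with a restrictive sandbox attribute is one common approach, but that is a design question in its own right.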

Javascript - Dynamically Create it? [closed]

I have a set of users with different permissions. Depending on their permissions, they should only have access to certain JavaScript files. In terms of speed, is it better if, on every visit, I check the user's permissions, create one JavaScript file that contains ALL the JavaScript commands accessible to that user, and load that file into the view?
Or is it better to have multiple JavaScript files named page#_permission# (for instance, page1_permission10.js), and just load the corresponding files every time the page loads?
Thanks
It is probably faster to load only the JavaScript that is needed, BUT...
It probably will not be significant enough to warrant the effort. Furthermore, you may find yourself in debugging hell just to save a few ms.
Firefox and many other browsers have built-in tools that show how long each part of a page takes to load. You can run the same analysis on your site and locate the bottlenecks before deciding. A sketch of the per-permission loading approach is below.
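If you do go the multiple-files route, here is a minimal sketch of loading a permission-specific script on demand. The /api/current-user/permission endpoint is hypothetical; the file-naming scheme follows the question.

```js
// Inject a script tag and resolve once it has loaded.
function loadScript(src) {
  return new Promise((resolve, reject) => {
    const script = document.createElement('script');
    script.src = src;
    script.onload = resolve;
    script.onerror = () => reject(new Error(`failed to load ${src}`));
    document.head.appendChild(script);
  });
}

// Fetch the user's permission level, then pull in only the matching file.
fetch('/api/current-user/permission')
  .then(res => res.json())
  .then(({ page, permission }) =>
    loadScript(`/js/page${page}_permission${permission}.js`)
  )
  .catch(err => console.error(err));
```

Note that hiding JavaScript this way is an optimization, not a security boundary: any real permission checks still have to happen on the server.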
