Closed. This question is opinion-based. It is not currently accepting answers.
I want to create an HTML file which pulls information from one of several possible databases.
The database will consist of students' names and target grades, and there will be one for each class.
The page created will effectively have a block/button for each student with a drop-down box which contains a grade.
I would like to send this data to another database (one for each class) to track progress over time; effectively, when submitting the class's grades I want to add a column to the database.
I have been learning programming for the last 12 months and have a reasonable understanding of HTML, JavaScript and Python (I have also just started learning some jQuery). I don't want a full-fledged solution, but if someone could point me in the right direction I would be happy to do the hard work.
My primary question is: what language do I need to learn to read from and write to the database? Any other suggestions about scripting are more than welcome.
I would use PHP and JavaScript (jQuery is an easy-to-use framework). PHP can handle the "backend" operations like reading from and writing to the database; HTML/CSS/JavaScript can handle the view.
If you want to do all of this without the annoying page refreshes, you can take a look at AJAX; there is a rough sketch of that below the links.
Some links to help you get started:
http://www.php.net/manual/en/getting-started.php
http://learn.jquery.com/javascript-101/getting-started/
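To give a feel for how the pieces fit together, here is a rough sketch of the jQuery/AJAX side. The save_grades.php endpoint, the selectors and the field names are all made up for illustration; the PHP script behind it would read the POSTed data and write it to the class database.

```javascript
// Hypothetical markup: each student row carries data-student-id and a <select class="grade">.
$('#save-grades').on('click', function () {
    var grades = [];
    $('.student-row').each(function () {
        grades.push({
            student: $(this).data('student-id'),
            grade: $(this).find('select.grade').val()
        });
    });

    // POST the collected grades to the (hypothetical) PHP backend,
    // which writes them to the class database.
    $.post('save_grades.php', { grades: JSON.stringify(grades) })
        .done(function (response) {
            console.log('Grades saved:', response);
        })
        .fail(function () {
            console.error('Saving grades failed');
        });
});
```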
Closed. This question needs to be more focused. It is not currently accepting answers.
I'm in need of automatically getting reviews from a job posting page for my client. The problem is that it's a medium-sized, local website without a public API. Are there any ways to automatically update your website based on information from another website? The data would consist of the rating, name of the reviewer, message and date.
A library like Puppeteer is a premium choice for this if you want to stick with Node.js. When you scrape, just be sure to consider the Terms of Service of the website you're scraping, but I've used Puppeteer for this purpose and it's generally very simple and convenient.
You use DOM selectors as you would with JavaScript, and can then grab text or attribute values wherever you want.
Then you just need to be careful with how you run these crawlers so you don't abuse the website. One crawl per minute is probably a minimum if I were to guess, but it's up to you!
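Here is a rough sketch of what that can look like; the URL and the CSS selectors (.review, .rating, .reviewer, .message, .date) are placeholders, so inspect the actual page to find the real ones.

```javascript
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // Placeholder URL -- replace with the real reviews page.
  await page.goto('https://example.com/job-posting/reviews', { waitUntil: 'networkidle2' });

  // Pull rating, reviewer name, message and date out of each review block.
  const reviews = await page.$$eval('.review', (nodes) =>
    nodes.map((n) => ({
      rating: n.querySelector('.rating')?.textContent.trim(),
      name: n.querySelector('.reviewer')?.textContent.trim(),
      message: n.querySelector('.message')?.textContent.trim(),
      date: n.querySelector('.date')?.textContent.trim(),
    }))
  );

  console.log(reviews);
  await browser.close();
})();
```

From there you can write the results into your own site's database on whatever schedule you settle on.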
Closed. This question needs to be more focused. It is not currently accepting answers.
I have built an admin section where I can see all the contents from the database in a table. Everything is fine, but if the database table has thousands of rows I have to scroll down very deep. Is there any way I can make links like `1, 2, 3, Next` the way Google search results do? Or is there an even better way?
Isn't it going to be something like:
if ($(document).height() > 2000) {
    // what should I do?
}
Looks like you are using jQuery. If you are going to implement everything on your own it may be time-consuming, although it would be fun!
To save some time, there are quite a few plugins available to achieve this. Refer to:
https://datatables.net/
http://www.jqueryscript.net/table/Client-side-HTML-Table-Pagination-Plugin-with-jQuery-Paging.html
Since you may have a huge amount of data, you should use server-side pagination instead of client-side; a minimal setup is sketched below. Here is another SO link which explains the difference between them: Pagination: Server Side or Client Side?
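As a minimal sketch, assuming jQuery and the DataTables plugin are already loaded, the admin table has id="admin-table", and a hypothetical /admin/rows endpoint returns rows in DataTables' server-side JSON format:

```javascript
$(document).ready(function () {
    $('#admin-table').DataTable({
        processing: true,   // show a "processing" indicator while fetching
        serverSide: true,   // request one page of rows from the server at a time
        pageLength: 25,     // rows per page; DataTables renders the 1, 2, 3, Next links
        ajax: '/admin/rows' // hypothetical endpoint
    });
});
```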
Closed. This question needs to be more focused. It is not currently accepting answers.
I am a newbie at NetSuite scripting. I need to move 4 million customer records from one subsidiary to another via a NetSuite script. I have also created a saved search for those particular records. CSV import took too much time. Is it possible to move them via script?
Your best bet is doing it through CSV, as it will take longer using a script. If you have multiple queues, you may want to activate multi-threading in the CSV import options.
Use a SuiteScript 2.0 map/reduce script. It will automatically reschedule itself and it is much faster.
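A minimal sketch of that map/reduce approach, assuming a saved search with id customsearch_cust_to_move and a target subsidiary internal id of 5 (both are placeholders you would replace):

```javascript
/**
 * @NApiVersion 2.0
 * @NScriptType MapReduceScript
 */
define(['N/record', 'N/search'], function (record, search) {

    // Feed the saved search of customers to be moved into the map stage.
    function getInputData() {
        return search.load({ id: 'customsearch_cust_to_move' }); // placeholder search id
    }

    // Each map invocation receives one search result; update its subsidiary.
    function map(context) {
        var result = JSON.parse(context.value);
        record.submitFields({
            type: record.Type.CUSTOMER,
            id: result.id,
            values: { subsidiary: 5 } // placeholder target subsidiary internal id
        });
    }

    // Log any customers that failed so they can be retried.
    function summarize(summary) {
        summary.mapSummary.errors.iterator().each(function (key, error) {
            log.error('Failed to move customer ' + key, error);
            return true;
        });
    }

    return {
        getInputData: getInputData,
        map: map,
        summarize: summarize
    };
});
```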
As @Soviut said, please share the code that you already have and have tried.
@Adolfo is correct, though, in that it will probably be faster doing this via CSV import.
If your customers have any transactions, you will have to unlink them first, as a record cannot be moved between subsidiaries if it has transactions associated with it.
Closed. This question is opinion-based. It is not currently accepting answers.
I'm just starting to learn web development and I've been wondering if I can build solid websites using only HTML, CSS and maybe some PHP.
I don't want anything complicated; I want to start simple, just to get used to those languages before starting to learn JS, because I feel it's a bit more complicated and I'm not good with programming languages.
I wanna be able to create something like this: http://enactus.org/
I can't comment, so I'll post this as an answer; please don't downvote.
You can create a website, but that site won't be an interactive one. What I mean is that the site will not be able to get data from users and save it in a database (well, you can if you want to use some PHP, but PHP also involves the kind of logic you would typically use in JavaScript), or have buttons that do things without refreshing the page. To summarize: if you want to learn website design, HTML and CSS are the way to go first, and they are a great start for beginners like us. If you want your site to be interactive, start learning JavaScript and PHP.
Closed. This question needs to be more focused. It is not currently accepting answers.
I want to get the dates booked and the price from the Airbnb page https://www.airbnb.com.sg/rooms/2781352, under its "Calendar" tab.
I am quite new to this, and I want to use Python to do it. Can I?
And what else should I learn: JavaScript, PHP?
For extracting data from web pages, my first stop is BeautifulSoup. It is designed for just this purpose, and is excellent at it. Combine it with the great requests HTTP library (so much better and easier than urllib/urllib2/etc.) for fetching the pages.
Both of these are Python modules; there is no need to learn any other programming language to do this, although it greatly helps to have an understanding of HTML and DTDs (Document Type Definitions) when setting up paths.