Best practice to send big JSON data to Node.js - javascript

Node.js: I am building a survey that has 21 questions in it, and the answers are free-form text, so the JSON becomes very big. My question is: how should I send my data to the backend? Should I send the whole JSON, or should I store it in a file and then send that? What is the best practice here? (Front end: jQuery; backend: Node.js)

Define "very big". Cause I seriously doubt that your users will write answers that weight more then say 50Kb. Note that average kindle ebook weights ~2.5Mb with ~300 pages (https://www.quora.com/What-is-the-average-file-size-of-an-e-book). And even this size is not really big for a request.
Anyway, to improve the UX it is worth considering splitting the survey into multiple pages and sending each page to the server one by one. For that you have to store those pages on the server side, of course (possibly with timeouts in case a user decides not to continue).
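A minimal sketch of that page-by-page approach, assuming a hypothetical /survey/page endpoint (Express is used on the Node side purely for illustration):

// Client (jQuery): send one page of answers as a JSON body
function submitPage(surveyId, pageNumber, answers) {
  return $.ajax({
    url: '/survey/page',                 // hypothetical endpoint
    method: 'POST',
    contentType: 'application/json',
    data: JSON.stringify({ surveyId: surveyId, page: pageNumber, answers: answers })
  });
}

// Server (Node + Express): accept JSON bodies up to 1 MB, far more than text answers need
const express = require('express');
const app = express();
app.use(express.json({ limit: '1mb' }));

const pages = {}; // in-memory store for illustration only; use a real database in practice
app.post('/survey/page', (req, res) => {
  const body = req.body;
  pages[body.surveyId] = pages[body.surveyId] || {};
  pages[body.surveyId][body.page] = body.answers;
  res.sendStatus(204);
});

app.listen(3000);

Even sending all 21 answers in a single request would be fine; the split is for UX, not for payload size.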

Related

What's the fastest way to web-scrape and get different items from a website?

I've been working on a web-scraping project written in Python using Selenium and requests. Each time I need data I send a request to get it (or use Selenium), and each request takes time.
My question is: is there any option to get a lot of different data (such as a product's name + price + shipping) in one request, in Python or even in JavaScript?
Given a url for example: https://www.amazon.com/Dell-Inspiron-7573-i7-8550U-Windows/dp/B07NRC8ZXC/ref=sr_1_1_sspa?keywords=laptop+i7&qid=1572590892&sr=8-1-spons&psc=1&spLa=ZW5jcnlwdGVkUXVhbGlmaWVyPUEyNjFPSFdTOEpVODdQJmVuY3J5cHRlZElkPUEwNTU5Nzk1MlBGWFkxU0JKOVlLNiZlbmNyeXB0ZWRBZElkPUEwNjU0MzYwM0NRT01ER1oxSDdMOCZ3aWRnZXROYW1lPXNwX2F0ZiZhY3Rpb249Y2xpY2tSZWRpcmVjdCZkb05vdExvZ0NsaWNrPXRydWU=
NOTE: you marked this question as a duplicate of this question - How to click on Load More button within Google Trends and print all the titles through Selenium and Python
PLEASE, those are different questions. I'm asking about getting this product's price + name + shipping, which sit under different "scopes"; she asked about getting the same kind of "objects" (such as names)!
So if you know any other way to get those things instead of sending three different requests, I would love to hear about it... I've heard that there is a way to combine a few requests into one big request in JavaScript; is that true?
I'm still waiting for an answer, anyone?
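For what it's worth, the product name, price, and shipping text all arrive in the same HTML response, so one request per page is enough; the work is in parsing, not fetching. A rough Node sketch, assuming the axios and cheerio packages and purely hypothetical selectors (Amazon's markup changes frequently):

const axios = require('axios');     // assumed HTTP client
const cheerio = require('cheerio'); // assumed server-side HTML parser

async function scrapeProduct(url) {
  const response = await axios.get(url, {
    headers: { 'User-Agent': 'Mozilla/5.0' } // many sites reject default agents
  });
  const $ = cheerio.load(response.data);
  // One download, several extractions -- all selectors here are hypothetical
  return {
    name: $('#productTitle').text().trim(),
    price: $('.a-price .a-offscreen').first().text().trim(),
    shipping: $('#deliveryBlockMessage').text().trim()
  };
}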

Moving sensitive logic out of browser

I've made a nice little math quiz app in Rails 5 that generates random algebra problems for the user. All of the random variables, and the check of whether the user's answers are correct, are handled in the view with JavaScript. I want to move this logic to the backend so the user can't cheat with browser tools like Inspect. I'm not sure where to put this in a Rails app or how it would work.
Does it go in rails/lib, or do I make an AJAX call? The problems are not hard-coded in a database, just randomly generated. So I need to generate some random numbers in the backend and send them to the view to display something like 3x - 2 = 5x + 12. Then the user's answer should be checked against the solution for this random problem in the backend, and the result sent back to the front-end quiz form, which is keeping track of the score.
See here for the basic Rails folder structure and the usage of each folder.
Secondly, straight to the point: you can write that kind of logic either in the controller (not recommended) or in a service object (recommended).
I recommend using services for that; see here for why.
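On the client side, the quiz page then only ever sees the problem text and a verdict, never the solution. A sketch of the two calls, assuming hypothetical /quiz/new and /quiz/check routes backed by that service:

// Fetch a freshly generated problem; the server keeps the solution to itself
function loadProblem() {
  return $.getJSON('/quiz/new').then(function (data) {
    $('#problem').text(data.problem);              // e.g. "3x - 2 = 5x + 12"
    $('#quiz-form').data('problemId', data.id);    // server-side lookup key
  });
}

// Send only the user's answer; correctness is decided on the server
function checkAnswer(answer) {
  return $.post('/quiz/check', {
    id: $('#quiz-form').data('problemId'),
    answer: answer
  }).then(function (result) {
    if (result.correct) { incrementScore(); }      // your existing score-keeping code
  });
}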

Save a large data set to a MySQL table from JavaScript without POST?

I have a largish amount of server-side data that I need to store in a MySQL table. [I'm a novice, working through the learning curve of JavaScript & PHP.]
I'm thinking it's best to stringify the JavaScript array into JSON and send that to a PHP page to save to the database. Once the data's in a PHP array, I know how* to get it into the database; I'm just not sure of the best way to get it there.
I can't POST (like this example) since the maximum length of a POST string is 2048 characters, and I have maybe 10-20 KB of data.
I'd rather not use AJAX or Node.js (like this example), for the sake of simplicity and since this is a one-off (but both are on my list to learn in the future!).
Or would it be best to create a temp text file on the server with JavaScript and then call a PHP page to load and process the data? (Although I can't find examples of how to do that without using POST.)
I understand the difference between server-side & client-side (thanks to this great explanation), but the size limit of POST seems to be my issue?
*Also, I'm a little unsure as to when/how it's necessary to encode data (like with this deprecated mysql_real_escape_string example) for storage with {JSON/posting/DB tables/text}. In this case my data could contain 'single' & "double" quotes (but no foreign characters 国外 वर्ण), which [in my short experience] seem like the only cases where it will be an issue?
Thanks!
The confusion here is between two different limits: the ~2048-character limit you have read about applies to URLs, and therefore to GET requests, not to POST. A POST body has no such hard limit; in PHP it is bounded only by server settings such as post_max_size (8 MB by default), so 10-20 KB of JSON is not a problem at all. The more fundamental point is that JavaScript is a client-side language while PHP is a server-side language: JavaScript running in the browser cannot write to server-side files or to the database directly, so some HTTP request (POST or GET) to a server-side script is unavoidable. Since JavaScript is client-side, the user can see and edit the code, while a PHP script lives on the server and only its output is visible. So, in short: you cannot do what you are asking without POST or GET, and you do need a server-side script such as PHP (Python is also very useful if you are thinking of learning more about web backends).
There are numerous examples of how to do this that you can find with a simple Google search; here is a great example: send data to MySQL with AJAX + jQuery + PHP.
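The client half of that pattern is short. A sketch, assuming a hypothetical save.php endpoint; note that with a JSON content type, PHP reads the payload from php://input rather than $_POST:

var rows = buildServerData(); // your existing code that builds the JS array

$.ajax({
  url: 'save.php',                      // hypothetical endpoint
  method: 'POST',
  contentType: 'application/json',
  data: JSON.stringify(rows),           // 10-20 KB is trivially small for a POST body
  success: function () { console.log('saved'); },
  error: function (xhr) { console.error('save failed', xhr.status); }
});

// On the PHP side, json_decode(file_get_contents('php://input'), true)
// gives the array back, and prepared statements take care of the quoting
// question from the footnote above.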
Hope I could clarify your question.

Firebase performance doubts with big data

I have some doubts about the best approaches and performance with Firebase 3.x.
My question might be stupid, but I am relatively new to this and need to understand it better.
If I have, for example, a Firebase node with thousands or even millions of entries, for example user comments, and I do a simple:
$scope.user_comments = $firebaseArray(ref.child('user_comments'));
What actually happens? Is the entire dataset transferred to my browser in this case, or is this more like an open DB connection, where only the data I later ask for is transferred, for example the comments of a single user_id?
What I mean is: is this more like MySQL, where I connect to the DB but no data is sent to my browser until I select a bunch of it, or is the simple command
$scope.user_comments = $firebaseArray(ref.child('user_comments'));
already transferring the entire object to my browser and local memory?
Sorry if this is somehow a stupid question, but I wonder what is best to do with big object structures, and how to distribute them later to make sure I don't transfer unneeded data nonstop.
Thanks for some input on this in advance.
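For context: attaching $firebaseArray to a node synchronizes that entire node to the client, so the first form really would download everything. The usual remedy is to attach it to a query instead, so only the matching children are transferred; a sketch assuming AngularFire over the Firebase 3.x web SDK:

// Only this user's 50 most recent comments are synchronized,
// not the whole user_comments node
var query = ref.child('user_comments')
  .orderByChild('user_id')
  .equalTo(someUserId)
  .limitToLast(50);

$scope.user_comments = $firebaseArray(query);

To stay fast at millions of entries, this also needs an index on user_id (an ".indexOn" entry in the security rules).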

Alternative to creating large client side Javascript array of objects via JSON file?

I have a website that contains graphs which display employee activity records. There are tiers of data (i.e. region -> state -> office -> manager -> employee -> activity record), and each time you click the graph it drills down a level to display more specific information. The highest level (region) requires me to load ~1,000 objects into an array, and the lowest level is ~500,000 objects. I am populating the graphs via a JSON-formatted text file using:
$.ajax({
  url: 'data/jsondata.txt',
  dataType: 'json',
  success: function (data) {
    largeArray = data.employeeRecords;
  }
});
Is there an alternative method I could use without hindering response time/performance? I am caught up in the thought that I must preload all of the data client-side, otherwise there will be lag time if I need to fetch it on a user click. If anyone can point me to best practices, and maybe even explain what is considered "too much" client-side data, I'd appreciate it.
FYI, I'm restricted to using an old web server, and if I want to do anything server-side I'd be limited to classic ASP; otherwise it has to be client-side. Thank you!
If your server responds quickly
In this case, you can probably simply load data on demand when a user clicks. The server is quick, so why bother trying to be smarter for no gain.
If the server is quick but not quick enough, then you might be able to preload the next level while drawing the first. E.g., if you have just rendered the graph at the "office" level, silently preload the "manager" level's data while the user is still reacting to the screen update.
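A sketch of that preload idea, assuming a hypothetical classic-ASP endpoint that serves one drill-down level at a time and a response shape with a children array:

var levelCache = {};

// Fetch one level of the hierarchy, memoized so repeat clicks cost nothing
function loadLevel(level, parentId) {
  var key = level + ':' + parentId;
  if (!levelCache[key]) {
    levelCache[key] = $.getJSON('data/level.asp', { level: level, parent: parentId });
  }
  return levelCache[key];
}

// On click: draw the requested level, then quietly warm the next one down
function drillDown(level, parentId) {
  loadLevel(level, parentId).then(function (data) {
    drawChart(data);                          // your existing charting code
    (data.children || []).forEach(function (child) {
      loadLevel(level + 1, child.id);         // preload while the user is reading
    });
  });
}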
If the server is too slow for data on demand
In this case you probably need to model exactly where it is slow and address that. There are several things in play here, and your question doesn't say exactly which:
1. Is the server slow to query the database? If yes, fix that; there is little you can do client-side to solve it.
2. Is the server slow to package the data for transmission? Harder to fix; is the server big enough?
3. Is network transmission slow? Then you need to send less data or get users onto faster bandwidth.
4. Is browser unpack time slow (i.e. the delay decoding the data before your script can chart it)? Change how you package the data, or send less of it at a time, such as in chunks (see the sketch after this list).
5. Can browsers handle 500,000 objects? You should be able to just monitor the memory of the browser you are using; there are opinions both ways on this. It will really depend on the target users' browsers/hardware.
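A sketch of the chunking idea from item 4, assuming the big file has been pre-split into numbered pieces (data/chunk0.json, data/chunk1.json, ...) so the browser can parse and chart each slice as it arrives instead of decoding 500k objects in one go:

function loadChunks(chunkCount, onChunk, onDone) {
  (function next(i) {
    if (i >= chunkCount) { onDone(); return; }
    $.getJSON('data/chunk' + i + '.json').then(function (data) {
      onChunk(data.employeeRecords); // hand each slice to the chart immediately
      next(i + 1);                   // sequential, to keep memory spikes down
    });
  })(0);
}

// e.g. loadChunks(20, appendToChart, finishChart); -- both callbacks are yours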
You might like to look at this question - What is the most efficient way of sending data for a very large playlist over http? - as it shows an alternative way of sending and handling data which I've found to be much quicker for step 4 above. Of course, at 500k objects you will no longer be able to use localStorage, but I've been experimenting with downloading millions of array elements and it works OK (still a WIP). I don't use jQuery, so I'm not sure how usable this is for you either.
Best practice? Sorry, I cannot help with that part of the question.
