So I'm looking for an efficient way to work with the SQL data I'm going to store.
For now it's pretty simple: I have a table with multiple attributes that I want to collect via JS.
I don't know if putting this into an HTML table would be the best option; I don't think so, since I believe it would make it harder to get rows with specific attributes.
The information I want to extract from the database would look like this:
Event1 ID1 start end duration week attribute1 .... attributeX
Event2 ID2 start end duration week attribute1 .... attributeX
..
If the user asks for events from week no. 3, I want to connect to the database only once, get all the events from that week, and then process that data so that specific events with specific attribute values appear in specific places on the page.
Does anyone know the best way to store the data in order to do that kind of thing?
On the PHP side you can store your data like this (using variables in place of the literal values):
$array = [];
$array[] = [
    'event'      => $event,      // e.g. "Event1"
    'id'         => $id,
    'start'      => $start,
    'end'        => $end,
    'duration'   => $duration,
    'week'       => $week,
    'attributes' => [$attribute1, /* ... */ $attributeX]
];
Then you should use the json_encode function to send it back to the AJAX call, and with that you should have a nice and pretty JS object that you can use to do what you want.
Just as a memo for myself, and perhaps to help others out.
I found out yesterday that this is a good way to do it:
$data = [];
while ($row = $query->fetch()) {
    $data[] = $row;
}
echo json_encode($data);
where $data is the array that will contain all the SQL information, like so:
$data[0]['nameEvent'], $data[0]['id'] ..... $data[0]['attributeX']
...
$data[n]['nameEvent'] ....
This type of storage is very convenient and makes it easy to work with the DB information however I wish.
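To connect this back to the week-filtering use case from the question, here is a sketch of what the JS side might do once the JSON produced by json_encode($data) has been parsed. The sample rows and field names are assumptions based on the layout shown above.

```javascript
// Hypothetical parsed result of the PHP endpoint's json_encode($data):
// a plain array of row objects, one per event.
const rows = [
  { nameEvent: "Event1", id: 1, week: 3, duration: 60 },
  { nameEvent: "Event2", id: 2, week: 4, duration: 30 },
  { nameEvent: "Event3", id: 3, week: 3, duration: 45 }
];

// One round trip to the server, then all week-filtering happens
// client-side on the in-memory array.
function eventsForWeek(data, week) {
  return data.filter(row => row.week === week);
}

// The week-3 events here are Event1 and Event3.
console.log(eventsForWeek(rows, 3).map(r => r.nameEvent));
```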
Related
For my current project I fill an array by using the sql query
"SELECT names FROM students";
and pushing every returned row into an array named $names_array.
Then I use
foreach($names_array as $value) {
echo "<option>".$value."</option>";
}
to fill a datalist with options, so the user can find a name via the list's autocomplete or enter a name that is not yet in the array.
Now here is the issue: if I click on an existing name, I need to take a couple of other pieces of data from the table and fill in other input fields automatically.
So let's say the database table also has each student's age, birth date, guardian's number, and guardian's email.
How do I check whether the typed-in student already exists and, if they do, get their additional data from the table?
If I could somehow get the entered name into PHP, I could just look it up in the table, which would be a lot faster, but I've tried doing this and can't seem to get it done.
I was using a very inefficient method where I json_encode an array gathered from the SQL query
"SELECT * FROM students";
and then use
echo "<script>var names = ".$names_json."</script>";
to be able to fetch it in JS. After parsing it and looping through it I can find my necessary data, but since the database table already has 6000 rows and is still growing, it's starting to take a while to loop through, especially if the name I'm searching for is near the end of the array. This can take anywhere from 1 to 15 seconds, during which the website is completely frozen and looks like it crashed, until it finishes and does what I need with the data.
I've tried using the solution offered here but that doesn't seem to change anything.
Does anyone know of a better way to do what I'm essentially already doing without temporarily freezing the website? Or maybe a completely different way of getting the other pieces of data? Thanks in advance.
To prevent the script from freezing the page while it loads, you can add the defer attribute, like so:
echo "<script src='long-logic.js' defer></script>";
(Note that defer only affects external scripts loaded via src; it is ignored on inline scripts.)
To search the array easily, you can sort it by the searched value, then use binary search.
You can also store it in an object literal, where the key is the name and the value is an object of all that student's data. It will require some memory, but it makes the search super fast.
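A minimal sketch of the object-literal idea above: build the index once after parsing, then each lookup is a single property access instead of a scan over 6000 entries. The field names (names_name, etc.) are assumptions.

```javascript
// Hypothetical parsed student rows; field names are assumptions.
const students = [
  { names_name: "Alice", age: 17, guardian_email: "a@example.com" },
  { names_name: "Bob",   age: 16, guardian_email: "b@example.com" }
];

// Build the name -> record index once (O(n))...
const byName = {};
for (const s of students) {
  byName[s.names_name] = s;
}

// ...then every lookup is O(1): no loop, no page freeze.
console.log(byName["Bob"].age); // → 16
```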
First, on the server side: paginate/limit; do not "select all":
SELECT names FROM students WHERE names LIKE ? ORDER BY names LIMIT 20;
Second, on the client side: lazy-load via AJAX, but only after the user has typed, for example, 3 characters of the name.
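A sketch of the client half of this suggestion: a small guard that only fires the AJAX request once 3 characters have been typed. The endpoint name and the commented wiring are hypothetical; only the guard itself is exercised here.

```javascript
// Only query the server once the user has typed enough characters;
// the 3-character threshold matches the suggestion above.
function shouldQuery(term, minChars = 3) {
  return term.trim().length >= minChars;
}

// Hypothetical browser wiring, shown as a comment:
// input.addEventListener("input", e => {
//   if (shouldQuery(e.target.value)) {
//     fetch("search.php?q=" + encodeURIComponent(e.target.value))
//       .then(r => r.json())
//       .then(renderOptions); // fill the <datalist> with <= 20 rows
//   }
// });

console.log(shouldQuery("abc"), shouldQuery("ab")); // → true false
```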
I guess I should answer this question in case anyone else stumbles onto the same issue.
I changed the foreach loop slightly by adding the ID as a data-id attribute on the options:
foreach ($names_array as $value) {
    echo "<option data-id='".$value['names_id']."'>".$value['names_name']."</option>";
}
Through JS (and jQuery) you can obtain the ID of the chosen student like this:
currentVal = $("#inputID").val();
currentID = $("#listID option[value='" + currentVal + "']").attr('data-id');
Now you can find the index of the chosen student in namesArray like this:
if (currentID != undefined && currentVal != "" && currentVal != " ") {
    arrayIndex = namesArray.findIndex(x => x.names_id == currentID);
    currentArray = namesArray[arrayIndex];
}
where namesArray is the 'names' variable (JSON-parsed) which I echo in the script shown in the question; the && conditions prevent it from even checking the array if the ID is undefined or the input is empty.
Ok, so I've been reading and reading and searching and searching, and strangely it doesn't seem like my scenario has really been covered anywhere.
I have an app that creates a list of products. I want a simple view that can sort the products and page through them.
For reference, here is a simple representation of the data in Firebase:
app
  stock
    unique_id
      name
      url
      imageUrl
      price
When creating the list, I have multiple threads using the push method on my Firebase references:
new Firebase(firebaseUrl).child('stock').push({
name: "name",
price: 123
});
This gives me a lovely "hash" collection on the stock property of the app.
So what I'd now like to do is have a table to sort and page through the records that were placed in the stock hash.
I make a GET request to my server to a url like /stock?limit=10&skip=10&sort=name%20asc. This particular url would be the second page where the table contained 10 records per page and was sorted by the name property in ascending order.
Currently in my query handler I have this:
var firebaseRef = new Firebase(firebaseUrl).child('stock');
if (this.sortDesc) {
firebaseRef = firebaseRef
.orderByChild(this.sortProperty)
.endAt()
.limitToFirst(this.limitAmount);
} else {
firebaseRef = firebaseRef
.orderByChild(this.sortProperty)
.limitToFirst(this.limitAmount);
if (this.skipAmount > 0) {
firebaseRef = firebaseRef.startAt(this.skipAmount);
}
}
firebaseRef.once('value', function (snapshot) {
var results = [];
snapshot.forEach(function (childSnapshot) {
results.push(childSnapshot.val());
});
callback(null, results);
});
I'm running into a couple of problems. I'm going to split this into two cases, ascending and descending queries.
Ascending query
The orderByChild and limitToFirst calls seem to work correctly in the ascending case. This means I can change which property gets an ascending sort and how many results to return. What I am not able to get working is skipping n records so that paging works. In the example query above I'm going to the second page, but I do not get results 11-20; instead I get the same 10 records as the first page.
Descending query
In this case I cannot begin to figure out how to tell Firebase to order by a property of the object, identified by its unique key, in a descending fashion. The closest I've read is to use endAt() and then limit. The docs say limit is deprecated, and this still doesn't help me with paging.
I tried drawing doodles picturing how this would work. I came up with: order by the property, start at the 'end' of the collection, and then limit back to the page size. While this still wouldn't solve paging, I would expect it to give me the last n records, where n is the size of the page. I get no results.
I suppose I could use firebaseRef = firebaseRef.orderByChild(this.sortProperty).limitToLast(this.limitAmount + this.skipAmount); and, in the result callback, use the forEach loop to take the first n records (or would it be the last? I'm not sure how that iteration would work), where n = this.limitAmount. This just seems inefficient. Wouldn't it be better to limit the query instead of spending CPU cycles discarding data that has already come over the wire? Or is this the relational-DB query mindset overriding the correct thought process for NoSQL?
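For what it's worth, the limitToLast workaround described in the question can be sketched as a pure-array helper: given the ascending-ordered rows that limitToLast(limitAmount + skipAmount) returns, keep the first limitAmount of them and reverse to get the requested descending page. The Firebase call itself is unchanged; only the client-side trimming is shown, on plain arrays.

```javascript
// rows: the ascending-ordered values returned by a query like
// orderByChild(sortProperty).limitToLast(limitAmount + skipAmount).
// Keeping the first limitAmount and reversing yields the requested
// descending page (skipping skipAmount records from the top).
function descendingPage(rows, limitAmount) {
  return rows.slice(0, limitAmount).reverse();
}

// With sorted values 1..5, limitToLast(2 + 2) would return [2, 3, 4, 5];
// the second descending page (limit 2, skip 2) is then:
console.log(descendingPage([2, 3, 4, 5], 2)); // → [3, 2]
```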
Further Confusion
After posting this I've kept working on a solution. I've gotten close a few times, but I keep running into a filtering issue: how can I filter a set of items on one property while still sorting on another? Jeez! I want a user to be able to get all the stock that isn't sold out and order it by price.
Finally
Why hasn't this basic example been fleshed out on any of the Firebase "Getting Started" pages? Showing tabular data, paging through it, sorting, and filtering seem like things EVERY web developer comes across. I'm using ng-table in an Angular app to drive the view, but regardless of platform, the queries I'm trying to generate seem practical on any platform Firebase supports. Perhaps I'm missing something! Please educate me!
Firebase and NoSQL
I've come up with this simple scenario that I often run into with web applications: I want to show tabular data, then filter, page, and sort it. Very simple. Very common. Writing a SQL statement for this would be dead easy. Why is the query so complicated for something like Firebase? Is this common to all NoSQL solutions? There is no relational data being stored, so a relational database seems unnecessary. Yet it seems like I could hack together a little flat file to do this storage, since the ability to make Firebase do these simple tasks is not made clear in its API or docs. FRUSTRATED!!!
I want to query objects from the Parse DB through JavaScript that have exactly 1 of some specific relation object. How can this criterion be achieved?
So I tried something like this, but equalTo() acts as a "contains" and that's not what I'm looking for. My code so far, which doesn't work:
var query = new Parse.Query("Item");
query.equalTo("relatedItems", someItem);
query.lessThan("relatedItems", 2);
It seems Parse does not provide an easy way to do this.
Without any other fields, if you know all the items then you could do the following:
var innerQuery = new Parse.Query('Item');
innerQuery.containedIn('relatedItems', [all items except someItem]);
var query = new Parse.Query('Item');
query.equalTo('relatedItems', someItem);
query.doesNotMatchKeyInQuery('objectId', 'objectId', innerQuery);
...
Otherwise, you might need to get all records and do filtering.
Update
Because of the relation data type, there is no way to include the relation's content in the results; you need to do another query to get it.
A workaround might be to add an itemCount column, keep it updated whenever the item relation is modified, and then do:
query.equalTo('relatedItems', someItem);
query.equalTo('itemCount', 1);
There are a couple of ways you could do this.
I'm working on a project now where I have cells composed of users.
I currently have an afterSave trigger that does this:
const count = await cell.relation("members").query().count();
cell.set("memberCount", count); // Parse JS objects use set(), not put()
This works pretty well.
There are other ways that I've considered in theory, but I've not used them yet.
The right way would be to hack the ability to use select with dot notation to grab a virtual field called relatedItems.length in the query, but that would probably only work for me because I use Postgres ... Mongo seems to be extremely limited in its ability to do this sort of thing, which is why I would never make a database out of blobs of JSON in the first place.
You could do a similar thing with an afterFind trigger. I'm experimenting with that now. I'm not sure if it will confuse Parse to get back an attribute which does not exist in its schema, but I'll find out by the end of today. I have found that if I jam an artificial attribute into the objects in the trigger, they are returned along with the other data. What I'm not sure about is whether Parse will decide that the object is dirty, or, worse, decide that I'm creating a new attribute and store it to the database ... which could be filtered out with a beforeSave trigger, but not until after the data had all been sent to the cloud.
There is also a place where I had to do several queries from several tables and would have ended up with a lot of redundant data. So I wrote a cloud function which did the queries and then returned a couple of lists of objects, plus a few lists of objectId strings which served as indexes. This worked pretty well for me. And tracking the last load time and sending it back when I needed to update my data allowed me to limit myself to objects which had changed since my last query.
This time I'm looking for something really special.
In my PHP page I have a table generated by JavaScript; here is the example:
Example Page
This table shows racing game results. I didn't write the JS, and I can't change the format of the results.
What I need is to parse these results into variables, to generate championship results: giving points to drivers, adding up points across multiple series, etc...
I tried:
Parsing with DOMDocument, and also substr, but as the table is JS-generated it can't work.
>>this solution<< which sounded good but doesn't work.
Do you guys have any idea how to get an exploitable array?
If not, what do you suggest as an alternative solution? I'm not able to reproduce the JS in a PHP function; it's too hard.
Here is the JS : click
Thank you !
Here's how you'd be able to parse the initial array in PHP:
$url = 'http://mxs-concept.com/liveresults/test.php';
$page = file_get_contents($url); // fetch the page
preg_match('/resultslines=(.+)print_race_analysis/sim', $page, $matches); // find javascript array
$js = preg_replace('/,\s+\]$/', ']', $matches[1]); // fix last comma before "]"
$array = json_decode($js); // decode the array
var_dump($array); // see the exact array you had in javascript
However, after reading your edit, it seems that you'd need more work if you take the PHP route. You might want to try installing Node.js and making it parse the JS, but that might be overkill. If I were you, I'd translate the JS to PHP even if it takes a day, because the Node.js route might take a week instead.
I hope somebody can help me out; I have no idea how to solve this problem.
So first, here's a PHP array: http://pastie.org/private/s99d8w7cbhjd2yucdijw
If I do print_r for MySQL ID=1, it looks like this: http://pastie.org/private/5b9n86dnxlp96afpiwvjeg
Now I'm pushing the data to JavaScript:
var sec_0_f = [];
<?php foreach ($action_events["2nd"]["0"]["Free"] as $key => $value) : ?>
sec_0_f.push('<?php echo $value; ?>');
<?php endforeach ?>
and so on..
The array contains messages, which vary each time depending on the MySQL ID called. However, I need to make some sort of sequence, so each message would display in #notice ( http://pastie.org/private/ge5ceqpihkbl82hs3ya3g ), and each #notice would then trigger an animation (#notice animation1, #notice animation2, etc.), depending on the number of messages.
So there would be a maximum of 20 JavaScript arrays filled with data.
Here comes the needed "system", which is split into 2 sections. The 1st one would sequentially display 2 "Free" and 1 "Corner" message in #notice, and each would trigger a function, but not in exact order, if you know what I mean. For each message type ("Free", "Corner", etc.) I have premade functions. The function names are "Event"+"Team"+"Number".
Here's example: http://pastie.org/private/9j6rcf5f8lb5jmegbbqtog
(I would actually need to put some callback in there for when it's done.) But for each MySQL ID the same function would need to be grabbed, so I'm thinking about creating an additional field under each ID with some data, to make sure the same function is called each time.
Sorry for the long message, but I've been trying to come up with something for days without success. Does anybody have any idea how I should approach this? Did I take the right method?
If I understand correctly, is this (click for demo) what you are trying to do?
It basically fills the actions into an array (like you do in PHP), then steps through that array: takes the first object, displays it, waits a second, then steps to the next object, and so on until all the actions have been displayed sequentially.
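The stepping pattern described above can be sketched without the DOM; here show is a stand-in for the #notice update plus its animation, and done fires once the sequence ends. All names are placeholders.

```javascript
// Walk an array of messages: display one, wait, then move on to the
// next, until every action has been shown in order.
function playSequence(messages, show, delayMs, done) {
  let i = 0;
  function step() {
    if (i >= messages.length) {
      if (done) done(); // sequence finished
      return;
    }
    show(messages[i], i); // e.g. update #notice and start animation i
    i += 1;
    setTimeout(step, delayMs); // wait, then show the next message
  }
  step();
}

// Usage sketch: play two hypothetical messages one second apart.
playSequence(["Free - Team A", "Corner - Team B"],
             msg => console.log(msg), 1000);
```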