A Brief Description...
I have an editable jqWidgets grid on my site, but unlike a traditional editable grid that updates the database as you edit each row, I want to update the entire grid at once when a 'Save' button is pressed.
With this in mind, I can only see two possible options:
Somehow stringify the entire grid's contents and send a single AJAX request to the server.
Or, loop through each row and execute a separate AJAX request for every individual row in the grid.
The issue is that I can't decide which method is better, as both have complications. With option 1, the query string could become enormous and exceed POST size or memory limits, since the user would be sending the entire contents of the grid in a single request. The latter solution, however, fires an AJAX request for every row in the grid. Imagine if there were 100, or even 1000 rows!
My Question
So, can anybody think of an effective way to achieve this without exceeding memory limits, but whilst also avoiding making multiple AJAX requests?
Further Information
In case the above wasn't clear, consider this JavaScript array:
[
    { name: 'Ben', age: 23, occupation: 'Developer' },
    { name: 'Charlie', age: 24, occupation: 'Receptionist' },
    { name: 'Jemima', age: 18, occupation: 'Designer' }
]
Now, what is the best method for passing all of that information to PHP in a single request?
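To make option 1 concrete: the array above can be serialized with JSON.stringify and sent as one request, then decoded server-side with PHP's json_decode. A minimal sketch; the save.php endpoint and the rows parameter name are assumptions, not part of the question:

```javascript
// Sketch: serialize the whole grid into one payload and send a single
// request. 'save.php' and the 'rows' parameter are hypothetical names.
var rows = [
    { name: 'Ben', age: 23, occupation: 'Developer' },
    { name: 'Charlie', age: 24, occupation: 'Receptionist' },
    { name: 'Jemima', age: 18, occupation: 'Designer' }
];

var payload = JSON.stringify(rows); // one string, one POST

// Client side (jQuery): $.post('save.php', { rows: payload });
// Server side (PHP):    $rows = json_decode($_POST['rows'], true);
```

Note that the practical limit here is usually server configuration (e.g. PHP's post_max_size) rather than browser memory, so a few hundred rows in one request is normally fine.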
On your source object, there is an updaterow function that takes the parameters rowid, newdata, and commit:
updaterow: function(rowid, newdata, commit)
newdata is a JSON object representing the updated row from the grid, and commit is a callback you must invoke for the change to be accepted.
Within this function, you can take the updated row data and store it in an associative array keyed by rowid (so a row that is changed several times is only sent back once). Finally, call
commit(true)
You can then have a function, called when a button is clicked, that performs your AJAX POST with the updated row data:
var editedRows = {};

var source = {
    localdata: [], // your data for the grid
    datatype: 'json',
    datafields: [
        // define columns and types
    ],
    updaterow: function (rowid, datarow, commit) {
        editedRows[rowid] = datarow; // keyed by rowid, so each row appears once
        commit(true);
    }
};

var saveData = function () {
    // post editedRows, clear editedRows on success
};
Related
Background Information
In short, I'm looking to achieve "mostly" what's shown here ...
http://demos.telerik.com/kendo-ui/treelist/remote-data-binding
... except it's a bit of a mind bender, and in my case the data comes from more than one base endpoint URL.
I am trying to build a generic query building page that allows users to pick a context, then a "type" (or endpoint) and then from there build a custom query on that endpoint.
I have managed to get to the point where I do this for a simple query, but now I'm trying to handle more complex scenarios where I retrieve child, or deeper, data items from the endpoint in question.
With this in mind ...
The concept
I have many endpoints, not all of which are OData, but most follow OData v4 rules. Having selected an endpoint, I am trying to build a "TreeGrid" that exposes the expansion options available to the query.
All my endpoints have a custom function called GetMetadata() which describes the type information for that endpoint. An endpoint is, for the most part, a REST CRUD<T> implementation that may or may not have further custom functions to handle a few other business scenarios.
So, given an HTTP GET request to something like ...
~/SomeContext/SomeType/GetMetadata()
... I would get back an object that looks much like an MVC / WebAPI Metadata container.
That object has a property called "Properties", some entries of which are scalar and some of which are complex (as defined in the data).
I am trying to build a TreeListDataSource or a HierarchicalDataSource object that I can bind to the Kendo TreeList control for only the complex properties. It needs to dynamically build the right GET URL for the metadata and list the complex properties for each type, based on the property information from the parent type, with the root endpoint defined by other controls on the page.
The Problem
I can't figure out how to configure the Kendo datasource object for the TreeGrid to get the desired output, possibly for one of two reasons ...
The TreeListDataSource object, as per the demo shown here: http://demos.telerik.com/kendo-ui/treelist/local-data-binding , seems to imply that the hierarchy-based control wants a flat data source.
I can't figure out how to configure the datasource in such a way that I could pass in the parent meta information (data item from the source) in order to build the right endpoint url for the get request.
function getDatasource(rootEndpoint) {
    return {
        pageSize: 100,
        filter: { logic: 'and', filters: [{ /* TODO: possibly filter properties in here? */ }] },
        type: 'json',
        transport: {
            read: {
                url: function (data) {
                    // TODO: figure out how to set this based on the parent
                    var result = my.api.rootUrl + rootEndpoint + "/GetMetadata()";
                    return result;
                },
                dataType: 'json',
                beforeSend: my.api.beforeSend
            }
        },
        schema: {
            model: {
                id: 'Name',
                fields: {
                    Type: { field: 'Type', type: 'string' },
                    Template: { field: 'Template', type: 'string' },
                    DisplayName: { field: 'DisplayName', type: 'string' },
                    ShortDisplayName: { field: 'ShortDisplayName', type: 'string' },
                    Description: { field: 'Description', type: 'string' },
                    ServerType: { field: 'ServerType', type: 'string' }
                }
            },
            parse: function (data) {
                // "data" passed in here is a meta container: a single object
                // that contains a Properties array.
                $.each(data.Properties, function (idx, item) {
                    item.ParentType = data;
                    item.Parent = ??? where do I get this ???
                });
                return data.Properties;
            }
        }
    };
}
Some of my problem may come down to the fact that metadata inherently doesn't have primary keys. I wondered if using parse to attach a generated GUID as the key might be an idea, but I think Kendo then uses that id in the request to the API when asking for children.
So it turns out that Kendo is just not geared up to do anything more than serve up data from a single endpoint. What I'm doing here is a little more complex than that, and furthermore, because the data is "not entity type data", I don't have common things like keys and foreign keys.
With that in mind, I chose to take the problem away from Kendo altogether and simply handle the situation with a bit of a "hack that behaves like a normal Kendo expand, but not really" ...
In a TreeGrid, when Kendo shows an expandable row, it renders an expand arrow in the first cell (the element with the k-i-expand class targeted below).
With no expanded data, or a data source that is bound to a server, this cell is not rendered.
So I faked it in place and added an extra class to my version: not-loaded.
That meant I could hook up a custom block of JS on click of my "fake expand" to build the right URL, do my custom stuff, fake/create some ids, and then hand the data to the data source.
expandList.on('click', '.k-i-expand.not-loaded', function (e) {
    var source = expandList.data("kendoTreeList");
    var cell = $(e.currentTarget).closest('td');
    var selectedItem = source.dataItem($(e.currentTarget).closest('tr'));

    my.type.get(selectedItem.ServerType, ctxList.val(), function (meta) {
        var newData = JSLINQ(meta.Properties)
            .Select(function (i) {
                i.id = selectedItem.id + "/" + i.Name;
                i.parentId = selectedItem.id;
                i.Selected = my.type.ofProperty.isScalar(i);
                i.TemplateSource = buildDefaultTemplateSourceFor(i);
                return i;
            })
            .ToArray();

        for (var i in newData) {
            source.dataSource.add(newData[i]);
        }

        $(e.currentTarget).remove();
        source.expand(selectedItem);
        buildFilterGrid();
        generate();
    });
});
This way, Kendo gets what it expects for a TreeList, "a flat set with parent-child relationships", and I do all the heavy lifting.
I used a bit of JSLINQ magic to make the heavy lifting a bit more "C#-like" (I prefer C#, after all). In short, it grabs the parent item that was expanded and uses its id as the parent id, then generates a new id for each current item as parent.id + "/" + current.Name. This way everything is unique: two properties on an object can't have the same name, and where two objects are referenced by the same parent, the prefix of the parent's property name makes the reference unique.
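The id scheme described above boils down to a small pure function. This is only a sketch of the mapping in isolation, with illustrative field names following the snippet earlier:

```javascript
// Sketch of the id generation: children get parent.id + "/" + Name,
// which is unique because property names are unique per object.
function buildChildItems(parent, properties) {
    return properties.map(function (p) {
        return {
            id: parent.id + "/" + p.Name,
            parentId: parent.id,
            Name: p.Name
        };
    });
}

var children = buildChildItems({ id: "Order" }, [{ Name: "Customer" }, { Name: "Lines" }]);
// children[0].id is "Order/Customer"; children[1].parentId is "Order"
```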
It's not the ideal solution, but this is how things go with Telerik: a hack here, a hack there, and usually it's possible to make it work!
Something tells me there's a smarter way to do this though!
I have a REST endpoint that sends and receives objects of the form
[
{id: 1, Name: "Type"},
{id: 2, Name: "Type:Subtype"},
...
]
I want to display this in an editable tree, using Sencha Ext JS 6. I am confused about where and when to transform the data, and how to keep changes synchronized without side effects. My current (not nice) method is to reload the data and then reset the tree's store using the converted values, but that collapses all of the expanded nodes.
I can get and save entries using a model and a store
I can convert the data into a form suitable for use in a treepanel
I do not know the "right" way to do so, and to have changes in either store reflected in the other.
For clarity, the converted tree store has this data structure:
[
    {
        text: "Type",
        children: [
            {
                text: "Subtype",
                isLeaf: true
            }
        ]
    }
]
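One way to keep the transformation free of side effects is a pure function that splits each Name on ":" and builds nested nodes; the text/children/isLeaf field names follow the structure above, and everything else is an assumption:

```javascript
// Sketch: convert the flat REST records into the nested tree structure.
function toTree(records) {
    var roots = [];
    var index = {}; // path -> node, so "Type:Subtype" finds its parent

    records.forEach(function (rec) {
        var path = '';
        var siblings = roots;
        rec.Name.split(':').forEach(function (part) {
            path = path ? path + ':' + part : part;
            var node = index[path];
            if (!node) {
                node = { text: part, children: [] };
                index[path] = node;
                siblings.push(node);
            }
            siblings = node.children;
        });
    });

    // mark childless nodes as leaves, matching the target structure
    (function markLeaves(nodes) {
        nodes.forEach(function (n) {
            if (n.children.length === 0) {
                n.isLeaf = true;
                delete n.children;
            } else {
                markLeaves(n.children);
            }
        });
    })(roots);

    return roots;
}

var tree = toTree([{ id: 1, Name: "Type" }, { id: 2, Name: "Type:Subtype" }]);
// tree[0].text is "Type"; tree[0].children[0] is { text: "Subtype", isLeaf: true }
```

Because the function never touches either store, it can be reused whether you rebuild the TreeStore or patch it in place.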
Came up with the answer: a save button.
Bound the load event of the Rest store to the controller, and in that handler copied the values into the TreeStore.
Edits are made to the TreeStore, which is kept purely client side.
The Save button copies the new data back into the Rest store and calls sync(). Easy to follow, and the code is kept mostly declarative. I can also provide default values in the ViewModel's data field. Easy to do; a shame it's not made clear anywhere in the documentation.
I've placed an AJAX GET call in an Enyo view. The GET calls a web service which returns an array of records including the column headers.
My aim is to dynamically build a table with this returned array, where a row is created for each index and columns for each header within the index.
What I do know how to do in Enyo is create one record by mapping the AJAX response headers to component fields:
this.$.actionsTaken.setContent( inResponse.ActionsTaken);
But I'm not sure how to do that dynamically and create the table on the fly.
So for example when I inspect the web service response my index 0 contains the following: (Where ActionsTaken, Application and EM are the col headers.)
{
    ActionsTaken: "Tested uptime",
    Application: "2011 Hanko",
    EM: "EM102 "
}
Question:
How can you dynamically build a table from a JSON array?
The AJAX GET:
fetch: function () {
    var ajax = new enyo.Ajax({
        url: "http://testservice.net/WebService/GetHistory",
        async: false,
        handleAs: "json",
        xhrFields: { withCredentials: true }
    });
    ajax.go(this.data);
    ajax.response(this, "gotResponse");
    ajax.error(this, function (inSender, inError) {
        Misc.hideMask();
        ViewLibrary.back();
        Misc.showToast("Error retrieving data");
    });
},
gotResponse: function (inSender, inResponse) {
    var debugVar = inResponse;
    this.$.actionsTaken.setContent(inResponse.ActionsTaken);
    this.$.configurationItemLogicalName_value.setContent(inResponse.Application);
    this.$.eM.setContent(inResponse.EM);
},
The components that hold the data values:
{name: "outage_table", kind: "FittableRows", components: [
    {content: "Actions Taken", name: "actionsTaken"},
    {content: "Application", name: "application"},
    {content: "EM", name: "eM"}
]}
Depending on the complexity of all your data, you might be able to do this fairly simply. Iterate through your array and on each object, you can then iterate through its keys and create each column with its data.
Something like:
for (var k in theObject) {
    // make column k
    // add theObject[k] to it
}
I think the problem is that you have created named components that match this example data, but it is unknown whether those will always be the same keys.
There is a kind (enyo.DataTable, which, surprisingly, I have never used) that you might use instead. It lets you add rows (no headers) so you would make your first row from all the object keys, then the next row would be those keys' data. It is derived from DataRepeater so there may be some implementation to work out, such as using an enyo.Collection to store your data and then set the collection to the DataTable.
The other alternative that matches closer to what you have would be to just dynamically create the components as you need them:
this.$.outage_table.createComponents([{...}]);
this.$.outage_table.render(); // need to re-render when dynamically adding components
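Building on that last alternative, the component descriptors can be generated from the response without naming any keys up front. A hedged sketch; the descriptor shape is the plain {content: ...} form used above, and buildTableComponents is a hypothetical helper:

```javascript
// Sketch: build a header row plus one row per record from whatever keys
// the first record happens to have.
function buildTableComponents(rows) {
    if (!rows || !rows.length) {
        return [];
    }
    var headers = Object.keys(rows[0]);
    // first row: the column headers themselves
    var comps = [{ components: headers.map(function (h) { return { content: h }; }) }];
    // then one row of cells per record
    rows.forEach(function (row) {
        comps.push({ components: headers.map(function (h) { return { content: row[h] }; }) });
    });
    return comps;
}

// this.$.outage_table.createComponents(buildTableComponents(inResponse));
// this.$.outage_table.render();
```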
Is there a way to configure a JsonRestStore to work with an existing web service that returns an array of objects which is not at the root-level of the JSON response?
My JSON response is currently similar to this:
{
    message: "",
    success: true,
    data: [
        { name: "Bugs Bunny", id: 1 },
        { name: "Daffy Duck", id: 2 }
    ],
    total: 2
}
I need to tell the JsonRestStore that it will find the rows under "data", but I can't see a way to do this from the documentation. Schema seems like a possibility, but I can't make sense of it from the docs (or from what I find on Google).
My web services return data in a format expected by stores in Ext JS, but I can't refactor years worth of web services now (dealing with pagination via HTTP headers instead of query string values will probably be fun, too, but that's a problem for another day).
Thanks.
While it's only barely called out in the API docs, there is an internal method in dojox/data/JsonRestStore named _processResults that happens to be easily overridable for this purpose. It receives the data returned by the service and the original Deferred from the request, and is expected to return an object containing items and totalCount.
Based on your data above, something like this ought to work:
var CustomRestStore = declare(JsonRestStore, {
    _processResults: function (results) {
        return {
            items: results.data,
            totalCount: results.total
        };
    }
});
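Stripped of the Dojo plumbing, the override is just a mapping from the service's envelope to the items/totalCount contract, which can be sanity-checked on its own:

```javascript
// The same transformation _processResults performs, in isolation.
function processResults(results) {
    return {
        items: results.data,
        totalCount: results.total
    };
}

var mapped = processResults({
    message: "",
    success: true,
    data: [{ name: "Bugs Bunny", id: 1 }, { name: "Daffy Duck", id: 2 }],
    total: 2
});
// mapped.items has 2 entries; mapped.totalCount is 2
```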
The idea with dojo/store is that reference stores are provided, but they are intended to be customized to match whatever data format you want. For example, https://github.com/sitepen/dojo-smore has a few additional stores (e.g. one that handles Csv data). These stores provide good examples for how to handle data that is offered under a different structure.
There's also the new dstore project, http://dstorejs.io/ , which is going to eventually replace dojo/store in Dojo 2, but works today against Dojo 1.x. This might be easier for creating a custom store to match your data structure.
We're still using ExtJS 3.3 in this project because it's an older project and (unfortunately) we don't have time to upgrade at the moment.
I have an editor grid panel with a restful JSON store. The JSON store has autoSave set to false, because I need to perform a potentially long running task on the server when something has changed. So I decided to create a "Save" button so the store is only saved once the user has completed the modifications. As the store is sending multiple requests to the server, I'm listening to the save event of the store to fire another request starting the long running operation after all data has been saved.
And here comes the problem: According to the documentation the third parameter of the event handler is the data that has been saved, grouped by the store operation (create, update or destroy), like this:
{
    create: [ /* created records here */ ],
    update: [ /* updated records here */ ],
    destroy: [ /* deleted records here */ ]
}
My problem is that I'm receiving only the "created" records. For the other operations, the array is always empty even when there should be records. If there were no records for "update" or "destroy", the data object doesn't contain that key (which is correct).
Let's say there were one updated, two created and no deleted records in the last save operation, the data I receive would look like this:
{
    create: [
        { /* record 1 data here */ },
        { /* record 2 data here */ }
    ],
    update: [] // <-- should not be empty!
}
I don't know why update and destroy are always empty. Can anyone help me?
Or maybe you have another suggestion how I can solve the original problem, performing an "after-save" task using the IDs of all created/updated/deleted records.
I managed to work around this issue.
Apparently the beforesave event receives the correct parameters. I'm now storing the dirty records in a temporary property in the store before saving them and read the property afterwards in the save event.
listeners: {
    beforesave: function (store, data) {
        // in my concrete case it's okay to join all records
        var all = (data.create || []).concat(data.update || []).concat(data.destroy || []);
        store.tempData = all;
    },
    save: function (store) {
        // do something with store.tempData
    }
}
As the records in tempData are references, it doesn't matter that they may be phantom records (without a server-generated ID) when beforesave is called. By the time save is called, these records have already been replaced with "real" records.
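With that workaround in place, the after-save task only needs the ids out of store.tempData. A hedged sketch: the collectIds helper and the '/after-save' URL are illustrative, not from the original code:

```javascript
// Sketch: pull the (now server-assigned) ids out of the saved records.
function collectIds(records) {
    var ids = [];
    for (var i = 0; i < records.length; i++) {
        ids.push(records[i].id);
    }
    return ids;
}

// In the save listener, fire the follow-up request with those ids:
// save: function (store) {
//     Ext.Ajax.request({
//         url: '/after-save', // hypothetical endpoint
//         params: { ids: collectIds(store.tempData).join(',') }
//     });
// }
```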