Use Case: I would like to map over an array of SKUs in the received JSON, do a GET request for each SKU, obtain the product_id from that response, and re-create the array, replacing each SKU (in order) with its respective product_id.
Step Function Steps:
Input Code is received
Map Task -> GET Request -> ??
New Object created with product_id
New Object is used for business use case
Input Code Example:
{
    "data": {
        "product": {
            "configurable_product_links": [
                "SKU1",
                "SKU2",
                "SKU3",
                "SKU4"
            ]
        }
    }
}
Output Code Example:
{
    "data": {
        "product": {
            "configurable_product_links": [
                "product_id_1",
                "product_id_2",
                "product_id_3",
                "product_id_4"
            ]
        }
    }
}
I will be using the callback (task token) pattern with the Step Function for queuing requests into the API I obtain the product_ids from.
A Map state can handle this use case. Its ItemProcessor field specifies a Lambda task to perform the lookup, and the output is an array of results that preserves the order of the input ItemsPath array.
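For illustration, here is a minimal sketch of such a Map state in ASL, using the callback (task token) integration mentioned above; the state names, the Lambda function name skuLookup, and the paths are placeholders, not a definitive definition:

"LookupProductIds": {
    "Type": "Map",
    "ItemsPath": "$.data.product.configurable_product_links",
    "ResultPath": "$.data.product.configurable_product_links",
    "ItemProcessor": {
        "StartAt": "GetProductId",
        "States": {
            "GetProductId": {
                "Type": "Task",
                "Resource": "arn:aws:states:::lambda:invoke.waitForTaskToken",
                "Parameters": {
                    "FunctionName": "skuLookup",
                    "Payload": {
                        "sku.$": "$",
                        "taskToken.$": "$$.Task.Token"
                    }
                },
                "End": true
            }
        }
    },
    "End": true
}

Because ResultPath points back at configurable_product_links, the array of returned product_ids replaces the SKU array in place and in order.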
Although Map iterations receive a single input value by default, you can pass arbitrary inputs to each iteration. Map states have an ItemSelector property, which "overrides the values of the input array items before they're passed on to each Map state iteration." Within it you have access to the $$ context object and the $ input object:

"ItemSelector": {
    "Index.$": "$$.Map.Item.Index",
    "Value.$": "$$.Map.Item.Value",
    "Secret.$": "$.previousTask.secret"
}
When mapping through an array in Step Functions, the iterations themselves cannot access previous task information from outside the Map state.
When doing a GET request, credentials are required, and it is bad practice to hard-code them. On AWS, Secrets Manager can be used.
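As a sketch, reading the credentials from Secrets Manager in Node.js with the AWS SDK v3 could look like this (the secret name product-api/credentials is a placeholder):

const { SecretsManagerClient, GetSecretValueCommand } = require("@aws-sdk/client-secrets-manager");

const client = new SecretsManagerClient({});

// Fetch and parse the JSON secret that holds the API credentials.
async function getApiCredentials() {
    const result = await client.send(
        new GetSecretValueCommand({ SecretId: "product-api/credentials" })
    );
    return JSON.parse(result.SecretString);
}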
Thus, this issue was solved by iterating a loop using Lambda.
We count the number of elements in the array and loop through until the end. This removes the limitations of the Map task while still achieving the requirements.
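A minimal sketch of that iterator Lambda (the iterator field names index, count, and continueLoop are my own convention, not a fixed API):

// Advances the index and reports whether more SKUs remain to process.
exports.handler = async (event) => {
    const index = event.iterator.index + 1;
    return {
        index: index,
        count: event.iterator.count,
        continueLoop: index < event.iterator.count
    };
};

A Choice state in the state machine then branches on continueLoop: back to the GET request task while it is true, and out of the loop once it is false.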
Related
I have a database full of guitar objects. I also have a service record that has a many-to-one relationship with the guitars: a guitar can have many service records, but a service record can only have one guitar.
I am trying to return a JSON list that has this kind of format:
guitar: [{
    "guitarId": 1,
    "guitarMake": "Fender",
    "guitarModel": "Telecaster",
    "serviceRecords": [
        { "serviceRecordId": 1 },
        { "serviceRecordId": 2 }
    ]
}]
What would be the most efficient way to create this function on my Express server so I get an object returned like this? Should I go the GraphQL route, or should I make two function calls to construct this list and return it?
I am developing a Node.js project in which the end user can create and update outlets in a MongoDB database. Now, let's say there are hundreds of fields in the outlet schema, but the user may not update each and every field; one may update 2 fields, another may update 3 fields. So, I want to create a function that can handle each type of request. I don't know whether this can be done or whether there is some other way, but I am new to this; can you suggest something suitable for my project? Thanks in advance!
Sorry for the confusion earlier.
I am developing a project in Node.js for retail outlets. Each outlet has over 50 fields in the database when registering. Registration is fine: the POST request via the API specifies all the data required.
But when I plan to update any of those fields, I am not sure which approach to take. Sometimes only 1 field will be changed, other times a bunch of them with no order/sequence.
Example :
{"id":"ABCD", "name":"MNOPQR"}
{"id":"ABCD", "time":123, "name":"ZYX"}
So here, in the first query I need to update only the name, while in the next I need to update both name and time.
Is there any way I can manage dynamic JSON parsing on the server end and update only those fields (in the database) that are mentioned in the request?
You have several approaches you can use.
One is to accept an object with the changes:
function doTheChanges(changes) {
    Object.keys(changes).forEach(name => {
        const value = changes[name];
        // Use `name` (the field name) and `value` (the value) to do the update
    });
}
Called like this:
doTheChanges({foo: "bar", biz: "baz"});
...to change the foo field to "bar" and the biz field to "baz". (Names that have invalid identifier chars, etc., can be put in double quotes: {"I'm an interesting field name": "bar"}.)
Alternately, you could accept an array of changes with name and value properties:
function doTheChanges(changes) {
    changes.forEach(({name, value}) => {
        // Use `name` (the field name) and `value` (the value) to do the update
    });
}
Called like this:
doTheChanges([
    {
        name: "foo",
        value: "bar"
    },
    {
        name: "biz",
        value: "baz"
    }
]);
You could also just accept a varying number of parameters, where each parameter is an object with name and value properties:
function doTheChanges(...changes) {
    changes.forEach(({name, value}) => {
        // Use `name` (the field name) and `value` (the value) to do the update
    });
}
Called like this:
doTheChanges(
    {
        name: "foo",
        value: "bar"
    },
    {
        name: "biz",
        value: "baz"
    }
);
Note that that's very similar to the array option.
Or use the builder pattern, but it's probably overkill here.
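For completeness, a sketch of what that builder could look like (the UpdateBuilder class is purely illustrative):

class UpdateBuilder {
    constructor() {
        this.changes = {};
    }
    set(name, value) {
        this.changes[name] = value;
        return this; // returning `this` allows chaining
    }
    apply() {
        doTheChanges(this.changes);
    }
}

new UpdateBuilder().set("foo", "bar").set("biz", "baz").apply();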
Instead of a PUT request you can use a PATCH request, so that only the values you change get affected in the database.
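For instance, a minimal sketch of such a PATCH handler in Express with mongoose, assuming an Outlet model for the schema described in the question (the route path and module path are assumptions):

const express = require("express");
const Outlet = require("./models/outlet"); // hypothetical model module

const app = express();
app.use(express.json());

app.patch("/outlets/:id", async (req, res) => {
    // $set only touches the fields present in the request body, so a body
    // of {"name": "MNOPQR"} updates just the name and leaves the rest alone.
    const updated = await Outlet.findByIdAndUpdate(
        req.params.id,
        { $set: req.body },
        { new: true }
    );
    res.json(updated);
});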
I've placed an AJAX GET call in an Enyo view. The GET calls a web service which returns an array of records including the column headers.
My aim is to dynamically build a table with this returned array, where a row is created for each index and columns for each header within the index.
What I do know in terms of Enyo is how to create one record by mapping the AJAX response headers to component fields:
this.$.actionsTaken.setContent(inResponse.ActionsTaken);
But I'm not sure how to do that dynamically and create the table on the fly.
So for example when I inspect the web service response my index 0 contains the following: (Where ActionsTaken, Application and EM are the col headers.)
{
    ActionsTaken: "Tested uptime",
    Application: "2011 Hanko",
    EM: "EM102 "
}
Question:
How can you dynamically build a table from a JSON array?
The AJAX GET:
fetch: function() {
    var ajax = new enyo.Ajax({
        url: "http://testservice.net/WebService/GetHistory",
        async: false,
        handleAs: "json",
        xhrFields: {withCredentials: true}
    });
    ajax.go(this.data);
    ajax.response(this, "gotResponse");
    ajax.error(this, function(inSender, inError) {
        Misc.hideMask();
        ViewLibrary.back();
        Misc.showToast("Error retrieving data");
    });
},
gotResponse: function(inSender, inResponse) {
    var debugVar = inResponse;
    this.$.actionsTaken.setContent(inResponse.ActionsTaken);
    this.$.configurationItemLogicalName_value.setContent(inResponse.Application);
    this.$.eM.setContent(inResponse.EM);
},
The components that hold the data values:
{name: "outage_table", kind: "FittableRows", components: [
    {content: "Actions Taken", name: "actionsTaken"},
    {content: "Application", name: "application"},
    {content: "EM", name: "eM"}
]}
Depending on the complexity of all your data, you might be able to do this fairly simply. Iterate through your array and on each object, you can then iterate through its keys and create each column with its data.
Something like:
for (var k in theObject) {
    // make column k
    // add theObject[k] to it
}
I think the problem is that you have created named components that match this example data, but it is unknown whether those keys will always be the same.
There is a kind (enyo.DataTable, which, surprisingly, I have never used) that you might use instead. It lets you add rows (no headers) so you would make your first row from all the object keys, then the next row would be those keys' data. It is derived from DataRepeater so there may be some implementation to work out, such as using an enyo.Collection to store your data and then set the collection to the DataTable.
The other alternative that matches closer to what you have would be to just dynamically create the components as you need them:
this.$.outage_table.createComponents([{...}]);
this.$.outage_table.render(); // need to re-render when dynamically adding components
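Putting that together, a hedged sketch of a gotResponse that builds one row per record (the FittableColumns row layout is my assumption; outage_table comes from your components):

gotResponse: function(inSender, inResponse) {
    // Assumes inResponse is an array of objects like the index 0 example.
    var rows = inResponse.map(function(record) {
        return {
            kind: "FittableColumns",
            // One column per key, holding that key's value.
            components: Object.keys(record).map(function(key) {
                return {content: record[key]};
            })
        };
    });
    this.$.outage_table.createComponents(rows, {owner: this});
    this.$.outage_table.render();
}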
I am trying to fetch a series of objects from my Node.js-based MongoDB instance using Restangular; however, I am not able to guarantee that the number of objects I wish to grab by ID will always be the same.
To demonstrate, here is a code snippet of the principle:
Restangular
    .several('users',
        userList[0], userList[1], userList[2], userList[3], userList[4],
        userList[5], userList[6], userList[7], userList[8], userList[9])
    .get().then(function (users) { //...
userList is an array of IDs passed in as a part of a method:
requestUsersById = function (userList) { //...
The problem is that I cannot guarantee the size of the array. Is there a way to pass an array of IDs using Restangular? Or, am I just stuck making separate requests for each?
The ideal result would be something like:
Restangular
    .several('users', userList)
    .get().then(function (users) { //...
The Restangular API doesn't seem to natively support this, but I believe you can accomplish what you're trying to do by using the apply() method.
In this case, you'd prepend the collection name users to the head of your userList array. Try this out:
// Copy the userList into a new array to preserve the initial list
// (in case you use it somewhere else); plain assignment would only copy
// the reference, and unshift() would then mutate the original array.
var usersQuery = userList.slice();
// Shift the collection name to be the first parameter in the array
usersQuery.unshift("users");
// Perform the Restangular call
Restangular
    .several.apply(null, usersQuery)
    .get().then(function (users) { //...
I'm using $pull to pull a subdocument within an array of a document.
Don't know if it matters but in my case the subdocuments contain _id so they are indexed.
Here are JSONs that describes the schemas:
user: {
    _id: String,
    items: [UserItem]
}

UserItem: {
    _id: String,
    score: Number
}
Now my problem is this: I am using $pull to remove certain UserItems from a User.
var delta = {$pull: {}};
delta.$pull.items = {};
delta.$pull.items._id = {$in: ["mongoID1", "mongoID2" /* ... */]};

User.findOneAndUpdate(query, delta, {multi: true}, function(err, data) {
    //whatever...
});
What I get in data here is the User object after the change, when what I wish to get is the items that were removed from the array (satellite data).
Can this be done with one call to Mongo, or do I have to make 2 calls: 1 find and 1 $pull?
Thanks for the help.
You really cannot do this, or at least there is nothing that is going to return the "actual" elements that were "pulled" from the array in any response, even with the newer WriteResponse objects available in the newer Bulk Operations API (which is kind of the way forward).
The only way you can really do this is by "knowing" the elements you are "intending" to "pull", and then comparing that to the "original" state of the document before it was modified. The basic MongoDB .findAndModify() method allows this, as do the mongoose wrappers .findByIdAndUpdate() and .findOneAndUpdate().
Basic usage premise:
var removing = [ "MongoId1", "MongoId2" ];

Model.findOneAndUpdate(
    query,
    { "$pull": { "items": { "_id": { "$in": removing } } } },
    { "new": false },
    function(err, doc) {
        // `doc` is the document *before* the update, so the pulled
        // items are still present in its array.
        var removed = doc.items.filter(function(item) {
            return removing.indexOf(item._id) != -1;
        });
        if (removed.length > 0)
            console.log(removed);
    }
);
Or something along those lines. You basically "turn around" the default mongoose .findOneAndUpdate() (same for the other methods) behavior and ask for the "original" document before it was modified. If the elements you asked to "pull" were present in the array then you report them, or inspect/return true or whatever.
So the mongoose methods differ from the reference implementation by returning the "new" document by default. Turn this off, and then you can "compare".
Further notes: "multi" is not a valid option here; the method modifies "one" document by definition. Also, you state that the array sub-documents contain an _id. This is added by mongoose by default, but those _id values in the array are "not indexed" unless you specifically define an index on that field. The only default index is the "primary document" _id.
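For reference, a sketch of declaring such an index on the schema from the question (only the userSchema.index() line is the addition):

var mongoose = require("mongoose");

var userSchema = new mongoose.Schema({
    _id: String,
    items: [{
        _id: String,
        score: Number
    }]
});

// The embedded _id values inside the array are only indexed
// if you ask for it explicitly:
userSchema.index({ "items._id": 1 });

var User = mongoose.model("User", userSchema);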