How to structure a JSON call - javascript

I have an endpoint which I call with Axios and the response looks like (for example):
items: [
  {
    url: "https://api.example1...",
    expirationDate: "2019-11-15T00:00:00+01:00"
  },
  {
    url: "https://api.example2...",
    expirationDate: "2019-12-20T00:00:00+01:00"
  },
  {
    url: "https://api.example3...",
    expirationDate: "2020-01-17T00:00:00+01:00"
  },
...and so on.
If I go to one of the URLs in the browser, the structure of the JSON is:
fooBar: {
  url: "https://api.foo...",
  id: "123",
  type: "INDEX",
  source: "Foobar",
  quotes: {
    url: "https://api.bar..."
  }
},
I need to get the quotes from the first two URLs in items[] dynamically, because items disappear once their expirationDate is older than today's date.
How can this be achieved? Thanks in advance!

If I understand the requirements correctly you need to:
get the list of items
get item details for first two items (to extract links to quotes)
get quotes for first two items
You can use promise chaining to execute these operations maintaining the order:
const getQuotes = (item) => axios.get(item.url)
  .then(resp => axios.get(resp.data.fooBar.quotes.url));

axios.get('/items') // here should be the url that you use to get the items array
  .then(resp => resp.data.items.slice(0, 2).map(getQuotes)) // adjust if your response isn't wrapped in `items`
  .then(quotes => Promise.all(quotes))
  .then(quotes => {
    console.log(quotes);
  });
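If you prefer async/await over explicit chaining, a minimal equivalent sketch (assuming, as above, that '/items' stands in for your items endpoint and that its response has the shape { items: [...] }) would be:
const getQuotes = async (item) => {
  const detail = await axios.get(item.url);          // item details ({ fooBar: ... })
  return axios.get(detail.data.fooBar.quotes.url);   // quotes for that item
};

const loadFirstTwoQuotes = async () => {
  const resp = await axios.get('/items');            // your items endpoint
  const quotes = await Promise.all(resp.data.items.slice(0, 2).map(getQuotes));
  console.log(quotes);
};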

Please find my proposal below. I have included two examples: you can get the whole quotes object, or just the URL inside the quotes object.
This is just a console.log, but you could easily, e.g., append this data to a div in the HTML or pass the URL to some other function.
$(function() {
  const address = 'https://api.example1...';

  function loadQuotes() {
    $.ajax({
      url: address,
      dataType: 'json',
    }).done(function(response) {
      response.forEach(el => {
        console.log(el.quotes);
        // or, if you want to be more specific: console.log(el.quotes.url);
      });
    });
  }

  loadQuotes();
});
If these are nested objects, just go through fooBar first.
For example, change the .done part to:
.done(function(response) {
  let quotes = response.fooBar.quotes;
  quotes.forEach(el => {
    console.log(el);
    // or, if you want to be more specific: console.log(el.url);
  });
});

Related

How can I limit the objects from a group in a query in Gatsby?

I have this query in my code which allows me to build a tag cloud for this blog front page
tagCloud: allContentfulBlogPost {
  group(field: tags, limit: 8) {
    fieldValue
  }
}
It passes data that I map in my component using {data.tagCloud.group.map(tag => (...))};. The code works nicely, but the result isn't limited by the limit: 8 I'm passing in group(field: tags, limit: 8) in the query above: it renders all the tags, not only the first eight.
I've also tried the skip argument, without success, just to see whether it has any effect.
Is this the proper way to limit the count for my mapping component in Gatsby?
The Contentful source plugin doesn't define arguments on any of the nodes it creates, unfortunately. Instead you would need to create these yourself. The easiest way to do that is through the createResolvers API.
Here's a similar example from a project of mine:
// in gatsby-node.js
exports.createResolvers = ({ createResolvers }) => {
  createResolvers({
    SourceArticleCollection: {
      // Add articles from the selected section(s)
      articles: {
        type: ["SourceArticle"],
        args: {
          // here's where the `limit` argument is added
          limit: {
            type: "Int",
          },
        },
        resolve: async (source, args, context, info) => {
          // this function just needs to return the data for the field;
          // in this case, I'm able to fetch a list of the top-level
          // entries that match a particular condition, but in your case
          // you might want to instead use the existing data in your
          // `source` and just slice it in JS.
          const articles = await context.nodeModel.runQuery({
            query: {
              filter: {
                category: {
                  section: {
                    id: {
                      in: source.sections.map((s) => s._ref),
                    },
                  },
                },
              },
            },
            type: "SourceArticle",
          })

          return (articles || []).slice(0, args.limit || source.limit || 20)
        },
      },
    },
  })
}
Because resolvers run as part of the data-fetching routines that support the GraphQL API, this will run server-side at build-time and only the truncated/prepared data will be sent down to the client at request time.
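Once a resolver like this is in place, the new argument can be used wherever the field is queried. A hypothetical page query against the types above might look like this (the field names follow the resolver sketch; adjust them to your own schema):
// in a page component (hypothetical field names)
import { graphql } from "gatsby"

export const query = graphql`
  {
    sourceArticleCollection {
      articles(limit: 8) {
        id
      }
    }
  }
`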

How can I loop over multiple ajax requests, and then execute code after everything has completed? [duplicate]

This question already has answers here:
Wait until all jQuery Ajax requests are done?
(22 answers)
Closed 3 years ago.
Let me just start this by saying I've done a series of searches online, but can't seem to piece it together.
Requirements: Use jQuery :(
On click, I am using a .getJSON call to get an object with several layers.
Here's an example of the data:
myObj = {
  title: 'some name',
  items: [
    {
      name: 'item-1',
      url: '/item-1'
    },
    {
      name: 'item-2',
      url: '/item-4'
    },
    {
      name: 'item-3',
      url: '/item-4'
    },
    {
      name: 'item-4',
      url: '/item-4'
    },
  ]
}
I want to loop through all of the urls, and call an .ajax operation on them, and then store the new data I get back in their respective objects.
It would look like this:
myObj = {
  title: 'some name',
  items: [
    {
      name: 'item-1',
      url: '/item-1',
      properties: { /* whole new set of data from url */ }
    },
    {
      name: 'item-2',
      url: '/item-4',
      properties: { /* whole new set of data from url */ }
    },
    {
      name: 'item-3',
      url: '/item-4',
      properties: { /* whole new set of data from url */ }
    },
    {
      name: 'item-4',
      url: '/item-4',
      properties: { /* whole new set of data from url */ }
    },
  ]
}
Once all of that is complete and each object has this new bit of data, I then want to do something with the myObj, like render it to a jquery template (ugh), but the new data MUST be inside of each item.
Here's what I have so far:
var myItems = myObj.items;

$(myItems).each(function(index, item) {
  var itemUrl = '/somestuff/' + item.url + '.js';
  $.getJSON(itemUrl).done(function(itemData) {
    item.data = itemData;
  });
}).promise().done(function() {
  // render data to template
});
The only problem I'm having is that sometimes the data (item.properties) doesn't exist yet when the template renders, so the template ends up trying to render undefined values.
I've tried unsuccessfully chaining .done(), and have now come across using .when(), but don't know how to write the line of code to make .when() work properly.
Any help is appreciated, and I'd be happy to clarify details.
If you capture the Promise (technically a jQuery Deferred object) generated by each AJAX request and add it to an array, you can then call $.when() to execute some code once all of the Promises are resolved. Something like this (untested):
var myItems = myObj.items;
var promises = [];

$(myItems).each(function(index, item) {
  var itemUrl = '/somestuff/' + item.url + '.js';
  var p = $.getJSON(itemUrl);
  p.then(function(itemData) {
    item.data = itemData;
    return itemData;
  });
  promises.push(p);
});

$.when.apply($, promises).then(function() {
  // render data to template...
});
This is probably preferable to chaining the done() callbacks, because it still allows the requests to execute in parallel, which is likely to be faster (although this is somewhat dependent on the server, but that's a separate issue).
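For completeness, a minimal sketch of the render step once everything has resolved (renderTemplate is a hypothetical stand-in for your jQuery template call):
$.when.apply($, promises).then(function() {
  // every item now carries its fetched payload on item.data
  myObj.items.forEach(function(item) {
    console.log(item.name, item.data);
  });
  // renderTemplate(myObj); // hypothetical: pass the enriched object to your template
});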

Cypress request: empty array in body

I'm running into some trouble while testing my API with Cypress (I'm using version 2.1.0).
I am sending a request to my endpoint and want to verify how it reacts when I send an empty array as a parameter. The problem is that somehow Cypress must be parsing the body I give it and removing the empty array.
My code is the following:
cy.request({
  method: 'PUT',
  url,
  form: true,
  body: {
    name: 'Name',
    subjects: []
  }
})
.then((response) => {
  expect(response.body).to.have.property('subjects');
  const { subjects } = response.body;
  expect(subjects.length).to.eq(0);
});
// API receives only the parameter name, and no subjects
When I am sending an empty array of subjects, the endpoint will delete all the associated subjects, and return the object with an empty array of subjects. It is working as it should, and my software in use is working as it should.
When Cypress sends this request, the endpoint does not receive the subjects parameter at all, which is a very different thing: in that case the subjects should not be touched.
Is there a way to avoid this "rewriting" by Cypress and send the body exactly as I write it?
The test works when setting form: false.
it.only('PUTs a request', () => {
  const url = 'http://localhost:3000/mythings/2'
  cy.request({
    method: 'PUT',
    url: url,
    form: false,
    body: {
      name: 'Name',
      subjects: []
    }
  })
  .then((response) => {
    expect(response.body).to.have.property('subjects');
    const { subjects } = response.body;
    expect(subjects.length).to.eq(0);
  });
})
I set up a local rest server with json-server to check out the behavior.
If I try to PUT a non-empty array with form: true
cy.request({
  method: 'PUT',
  url: url,
  form: true,
  body: {
    name: 'Name',
    subjects: ['x']
  }
})
looking at db.json after the test has run, I see the item index migrating into the key,
"mythings": [
{
"name": "Name",
"subjects[0]": "x",
"id": 2
}
],
so perhaps form means simple properties only.
Changing to form: false gives a proper array
{
  "mythings": [
    {
      "name": "Name",
      "subjects": ["x"],
      "id": 2
    }
  ]
}
which can then be emptied out by posting an empty array.
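As an illustration of why this happens (this is not a claim about Cypress internals, just typical form-urlencoding behaviour, shown here with the qs package): empty arrays are dropped entirely and the index of non-empty arrays moves into the key, which matches both observations above.
const qs = require('qs');

qs.stringify({ name: 'Name', subjects: [] });
// -> "name=Name"                    (subjects never reaches the API)

qs.stringify({ name: 'Name', subjects: ['x'] });
// -> "name=Name&subjects%5B0%5D=x"  (decoded key is "subjects[0]")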

Need help parsing jquery datatables editor data

I have a data format that I receive from the jQuery DataTables Editor which looks like the one below, and I need to parse it so that I can store it in a db, but I have not figured out a way of doing so.
{
  action: 'edit',
  'data[1][Name]': 'Some Text ',
  'data[1][Rating]': '1',
  'data[1][Division]': 'Some Text '
}
What is the best way to parse this form of data using JavaScript? The Editor library comes with a PHP library for parsing the data, but I am using Node.js for the backend.
If you want to convert data[] into a literal, you could do something like this:
var prop, fieldName, literal = {};

for (prop in data) {
  if (prop != 'action') {
    fieldName = prop.match(/\[(.*?)\]/g)[1].replace(/\]|\[/g, '');
    literal[fieldName] = data[prop];
  }
}
It will produce a literal like
{Name: "Some Text ", Rating: "1", Division: "Some Text "}
which can then be inserted into MongoDB, for example.
It simply loops through data, extracts the second [...] group of each key, and takes the content of that bracket as the property name in the literal. I do not at all claim this is the best method.
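If you also need to keep the row index (so that data[1][Name] and data[2][Name] from a multi-row edit don't overwrite each other), a slightly more general sketch of the same idea could look like this:
const rows = {};

for (const prop in data) {
  const match = prop.match(/^data\[(.+?)\]\[(.+?)\]$/);
  if (match) {
    const rowId = match[1];
    const field = match[2];
    rows[rowId] = rows[rowId] || {};
    rows[rowId][field] = data[prop];
  }
}

// rows -> { '1': { Name: 'Some Text ', Rating: '1', Division: 'Some Text ' } }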
I have a new and maybe a bit more systematic approach that avoids the risk of '[]' characters appearing inside the regexed strings. A very simple way is to use a custom ajax function; I have used my own data here:
const editor = new $.fn.dataTable.Editor({
  ajax: (method, url, data, success, error) => {
    $.ajax({
      type: 'POST',
      url: '/updateproductcode',
      data: JSON.stringify(data),
      success: (json) => {
        success(json);
      },
      error: (xhr, error, thrown) => {
        error(xhr, error, thrown);
      }
    });
  },
  table: '#mytable',
  idSrc: 'productcode',
  fields: ...
Then on the server side you receive an object whose only key is the stringified data:
{
  '{"action":"edit","data":{"08588001339265":{"productcode":"08588001339265","name":"does_not_existasdfadsf","pdkname":"Prokain Penicilin G 1.5 Biotika ims.inj.s.10x1.5MU","suklcode":"0201964","pdkcode":"2895002"}}}': ''
}
If you parse the key of it with JSON.parse(Object.keys(req.body)[0]), you get your results:
{ action: 'edit',
  data: {
    '08588001339265': {
      productcode: '08588001339265',
      name: 'does_not_existasdfadsf',
      pdkname: 'Prokain Penicilin G 1.5 Biotika ims.inj.s.10x1.5MU',
      suklcode: '0201964',
      pdkcode: '2895002'
    }
  }
}
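On the Node.js side, a minimal sketch of the receiving route could look like this (assuming an Express app with URL-encoded body parsing; the route path simply follows the ajax config above, and the response shape is only indicative):
const express = require('express');
const app = express();

app.use(express.urlencoded({ extended: true }));

app.post('/updateproductcode', (req, res) => {
  // the whole JSON string arrives as the single key of req.body
  const payload = JSON.parse(Object.keys(req.body)[0]);
  console.log(payload.action, payload.data);
  res.json({ data: [] }); // send back whatever your Editor setup expects
});

app.listen(3000);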

Adding keys to objects in a parse object

I'm working with the Parse JavaScript SDK and I want to add a key to an object in a Parse 'object' when saving it.
For example:
var saveGif = new SaveGifTags();

saveGif.save({
  type: req.body.type,
  tag: req.body.tags[i],
  gifObjects: newGif
}, {
  success: function(data) {
    console.log(data);
    res.json(data);
  },
  error: function(data) {
    console.log(data);
  }
});
gifObjects is my object in the Parse class. I tried to do something like this:
gifObjects: gifObjects[gifObj.id] = newGif
newGif, of course, is the object I want to save. This gave me an error saying gifObjects is not defined. So I tried something like this:
gifObjects[gifObj.id]: newGif
That didn't work either. I want to create something like this:
{
  hgs32: {
    url: '',
    image: ''
  },
  64522: {
    url: '',
    image: ''
  }
}
any suggestions?
Figured it out. Not sure what the downvote is for, but I had to create the object before I saved it.
var gifObjects = {};
gifObjects[gifObj.id] = newGif;
and then the save
saveGif.save({
  type: req.body.type,
  tag: req.body.tags[i],
  gifObjects: gifObjects
},
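As a side note, with ES2015 computed property names the key can also be built inline, which is essentially what the second attempt in the question was aiming for:
var gifObjects = { [gifObj.id]: newGif }; // computed property name (ES2015)
This produces the same { 'hgs32': { ... } } shape without the separate assignment step.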
