EmberJS is not loading up the model correctly - javascript

At a loss on this one.
I'm using Ember and Ember Data. I've got an extra implementation of ic-ajax to make GET, POST and PUT calls. Anyway, I'm trying to make a GET call and then turn the results into model instances.
return this.GET('/editor')
  .then((data) => {
    return data.drafts.map((draftData) => {
      let draft = this.store.find('draft', draftData.id);
      console.log(draft.get('type'));
      return draft;
    });
  });
My API returns proper data under data.drafts. This map is supposed to return an array of promises that resolve to draft models. It does not: each one resolves to a draft model that only has id, date, and title, and nothing else. I have 25 other attributes.
In another part of the application I'm fetching drafts with findAll on the model, and those models look fine. But when I try store.findRecord('draft', id) I get these incomplete objects.
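For reference, a minimal sketch (not the poster's code) of how those find promises could be consumed so attributes are only read after they resolve, assuming the single-draft endpoint returns the full payload for each draft:
// A minimal sketch: wait for every findRecord promise before reading attributes.
// Assumes /drafts/:id returns the complete set of attributes.
return this.GET('/editor')
  .then((data) => {
    return Promise.all(
      data.drafts.map((draftData) => this.store.findRecord('draft', draftData.id))
    );
  })
  .then((drafts) => {
    // Attributes are only reliably readable once each promise has resolved.
    drafts.forEach((draft) => console.log(draft.get('type')));
    return drafts;
  });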
-- edit
This is what my reopenClass method looks like for getting an array of objects from the server and turning them into Ember objects:
search(criteria) {
  let query = { search: criteria };
  let adapter = this.store.adapterFor('application');
  let url = adapter.buildURL('article', 'search');
  return adapter.ajax(url, 'GET', { data: query }).then((response) => {
    let articleRecords = response.articles.map((article) => {
      let record;
      try {
        // Try to create a new record from the raw payload...
        record = this.store.createRecord('article', article);
      } catch (e) {
        // ...and fall back to the record already in the store if creation throws.
        record = this.store.peekRecord('article', article.id);
      }
      return record;
    });
    return articleRecords;
  });
},
So far I can't find a better way to pull this off.
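For illustration only, a hedged alternative sketch, under the assumption that the 'article' serializer can normalize the search payload: push the records into the store instead of creating them, so Ember Data does not treat them as new, unsaved records.
// A hedged sketch, not from the post: normalize each raw article and push it
// into the store, avoiding the createRecord / peekRecord try-catch entirely.
search(criteria) {
  let adapter = this.store.adapterFor('application');
  let url = adapter.buildURL('article', 'search');
  return adapter.ajax(url, 'GET', { data: { search: criteria } }).then((response) => {
    return response.articles.map((article) =>
      this.store.push(this.store.normalize('article', article))
    );
  });
},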

Related

How do I create an array from Promise results?

I'm using React to build a web app. At one point I have a list of ids, and I want to use those to retrieve a list of items from a database, get a list of metrics from each one, and then push those metrics to an array. My code so far is:
useEffect(() => {
  const newMetrics = [];
  currentItems.forEach((item) => {
    const url = `items/listmetrics/${item.id}`;
    Client.getData(url).then(async (metrics) => {
      let promises = metrics.map((metricId: string) => {
        // Get metric info
        const urlMetric = `metrics/${metricId}`;
        return Client.getData(urlMetric);
      });
      await Promise.all(promises).then((metrics: Array<any>) => {
        metrics.forEach((metric: MetricModel) => {
          const metricItem = {
            id: metric.id,
            metricName: metric.name
          };
          newMetrics.push(metricItem);
        });
      });
    });
  });
  setMetrics(newMetrics);
}, [currentItems]);
where "metrics" is a state variable, set by setMetrics.
This appears to work OK, but when I try to access the resulting metrics array it seems to be in the wrong format. If I try to read the value of metrics[0] it says it's undefined, although I know there are several items in metrics. In the console, metrics is listed as an empty array ([]), whereas a normal array of two objects would be shown as (2) [{...}, {...}].
I'm not confident with using Promise.all, so I suspect that that's where I've gone wrong, but I don't know how to fix it.
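One likely culprit, offered as a hedged suggestion rather than a verified fix: setMetrics(newMetrics) runs synchronously, before any of the promises inside the forEach have resolved, so React stores the still-empty array. A sketch that waits for every request before setting state (Client.getData, currentItems, setMetrics and the metric fields are the names from the question and are assumed to behave as described there):
// Hedged sketch: state is set once, after every request has resolved.
useEffect(() => {
  let cancelled = false;

  const load = async () => {
    // One request per item; each resolves to that item's array of metric ids.
    const idLists = await Promise.all(
      currentItems.map((item) => Client.getData(`items/listmetrics/${item.id}`))
    );

    // Fetch every metric, then build the final array in one go.
    const metricIds = idLists.flat();
    const metrics = await Promise.all(
      metricIds.map((metricId) => Client.getData(`metrics/${metricId}`))
    );

    if (!cancelled) {
      setMetrics(metrics.map((m) => ({ id: m.id, metricName: m.name })));
    }
  };

  load();
  return () => { cancelled = true; };
}, [currentItems]);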

Javascript : Filter a JSON object from API call to only get the infos I need

I'm trying to create a small project to practice API calls. I have created an async function that retrieves info about a track using the MusicBrainz API. You can check the result of the request here: https://musicbrainz.org/ws/2/recording/5935ec91-8124-42ff-937f-f31a20ffe58f?inc=genres+ratings+releases+artists&fmt=json (I chose Highway to Hell by AC/DC).
And here is what I have so far for reworking the JSON response of my request:
export const GET_JSON = async function (url) {
  try {
    const res = await Promise.race([
      fetch(url),
      timeout(CONSTANTS.TIMEOUT_SEC),
    ]);
    const data = await res.json();
    if (!res.ok) throw new Error(`${data.message} (${res.status})`);
    return data;
  } catch (err) {
    throw err;
  }
};
export const loadTrackDetail = async function (id) {
  try {
    const trackData = await GET_JSON(
      encodeURI(
        `${CONSTANTS.API_URL}${id}?inc=genres+artists+ratings+releases&fmt=json`
      )
    );
    details.trackDetails = {
      trackTitle: trackData.title,
      trackID: trackData.id,
      trackLength: trackData.length ?? "No duration provided",
      trackArtists: trackData["artist-credit"].length
        ? trackData["artist-credit"]
        : "No information on artists",
      trackReleases: trackData["releases"].length
        ? trackData["releases"]
        : "No information on releases",
      trackGenres: trackData["genres"].length
        ? trackData["genres"]
        : "No information on genres",
      trackRating: trackData.rating.value ?? "No rating yet",
    };
    console.log(details.trackDetails);
  } catch (err) {
    throw err;
  }
};
Now this isn't half bad, but the releases property, for example, is an array of objects (each one being a specific release that includes the track). For each of those releases I want to "reduce" the object to its id and title only; the rest doesn't interest me. Moreover, if the title of a release is the same as one already present, I'd like the entire object to be skipped rather than added to the new array.
I've thought about using a forEach loop, but I can't wrap my head around how to write it correctly, whether it's actually possible at all, or whether I should use Array.map, for example, or another iterative method.
If anyone has a nice way of doing this in pure JS (not jQuery!), efficient and clean, it'd be much appreciated!
Cheers
There are a few things that make this question a little difficult to answer, but I believe the below will get you pointed in the right direction.
You don't include the GET_JSON method, so your example isn't complete and can't be used immediately to iterate on.
In the example you bring, there isn't a name property on the objects contained in the releases array. I substituted name with title below to demonstrate the approach.
You state
Moreover, I'd like to say that if, for example, the name of a release
is similar to that of a previous one already present, the entire
object is not added to the new array.
But you don't define what you consider that would make releases similar.
Given the above, as stated, I assumed you meant title when you said name and I also assumed that what would constitute a similar release would be one with the same name/title.
Assuming those assumptions are correct, I just use fetch to retrieve the results. The response has a json method on it that will convert the response to a JSON object. Then I map each release to the smaller data set you are interested in (id, title) and reduce that array to remove 'duplicate' releases.
fetch('https://musicbrainz.org/ws/2/recording/5935ec91-8124-42ff-937f-f31a20ffe58f?inc=genres+ratings+releases+artists&fmt=json')
  .then(m => m.json())
  .then(j => {
    const reducedReleases = j.releases
      .map(release => ({ id: release.id, name: release.title }))
      .reduce((accumulator, currentValue, currentIndex, sourceArray) => {
        if (!accumulator.find(a => a.name === currentValue.name)) {
          accumulator.push(currentValue);
        }
        return accumulator;
      }, []);
    console.log(reducedReleases);
  });
const releasesReduced = [];
const titleNotExist = (title) => {
  return releasesReduced.every(release => {
    if (release.title === title) return false;
    return true;
  });
};
trackData["releases"].forEach(release => {
  if (titleNotExist(release.title))
    releasesReduced.push({ id: release.id, title: release.title });
});
console.log(releasesReduced);
The array details.trackDetails.trackReleases has a path to an id and name from different objects. If you meant ["release-events"] => ["area"]["id"] and ["area"]["name"], then see the demo below.
The demo uses flatMap() on each level of the path to extract "release-events", then "area", returning an array of objects:
[{name: area.name, id: area.id}, {name: area.name, id: area.id},...]
Then it runs the array of pairs through a for...of loop and sets each unique name/id pair into an ES6 Map. Finally it returns the Map as an object:
{name: id, name: id, ...}
To see this working, go to this Plunker.
const releaseEvents = (objArr) => {
  // Deep-clone the input so the original track data is left untouched.
  let trackClone = JSON.parse(JSON.stringify(objArr));
  let areas = trackClone.flatMap(obj => {
    if (obj["release-events"]) {
      let countries = obj["release-events"].flatMap(o => {
        if (o["area"]) {
          let area = {};
          area.name = o["area"]["name"];
          area.id = o["area"]["id"];
          return [area];
        } else {
          return [];
        }
      });
      return countries;
    } else {
      return [];
    }
  });
  let eventAreas = new Map();
  for (let area of areas) {
    if (!eventAreas.has(area.name)) {
      eventAreas.set(area.name, area.id);
    }
  }
  return Object.fromEntries([...eventAreas]);
};
console.log(releaseEvents(details.trackDetails.trackReleases));

Make Multiple Post Requests In Axios

What I have been trying to do is hit an endpoint for my blog posts and then, with that data, remove extra layout markup that came in from WordPress. I am using Axios to make the request, and the transformResponse option to modify the data and strip the extra markup from the "post_body" field in my response. This works on a single blog post, but when I run it against all my blog posts it returns an object of 20 or so posts. What I want to do is loop through those objects, transform the data, and then make a POST request to another API to publish each blog post. What I can't figure out is whether this will be possible once my promise is resolved. Would I be able to create another loop within the .then(), find my "post_body" object, and make the POST request? I'm not sure if I am thinking about this in the right way or not. Any help is much appreciated.
var fieldName = "et_pb";
var regExp = new RegExp("\\[\/?(" + fieldName + ".*?)\\]", "g");

function getBlogPosts() {
  return axios.get(allPosts, {
    transformResponse: axios.defaults.transformResponse.concat(function(data, headers) {
      // use data I passed into the function and the objects from the API
      // iterate over the returned objects; this gives one post at a time
      data.objects.forEach(function(i) {
        // use Object.keys to find the field names on each post
        Object.keys(i).forEach(function(k) {
          // if the key is "post_body", execute the code
          // console.log(k);
          if (k === "post_body") {
            // fire Regex
            data[k] = i[k].replace(regExp, '');
            // console.log(data[k])
          }
        })
      })
      return data;
    })
  })
}

axios.all([getBlogPosts()])
  .then(axios.spread(function(blogResponse) {
    console.log(blogResponse.data);
  }));
@James, you are correct. You can chain multiple requests as below, or you can go for async/await options.
axios.get(...) // for allPosts
  .then((response) => {
    return axios.get(...); // using response.data
  })
  .then((response) => {
    console.log('Response', response);
  });
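As for the "loop through the posts and publish each one" part of the question, here is a hedged sketch: the PUBLISH_URL endpoint and the payload field names are placeholders, not taken from the post. Each cleaned post is mapped to an axios.post call, and Promise.all resolves once every request has completed.
// Hedged sketch, not the poster's code: PUBLISH_URL and the payload shape
// are assumptions. One POST request per transformed blog post.
getBlogPosts()
  .then(function(blogResponse) {
    var posts = blogResponse.data.objects;
    var requests = posts.map(function(post) {
      return axios.post(PUBLISH_URL, {
        title: post.title,          // assumed field name
        post_body: post.post_body   // already stripped by transformResponse above
      });
    });
    return Promise.all(requests);
  })
  .then(function(responses) {
    console.log('Published ' + responses.length + ' posts');
  })
  .catch(function(error) {
    console.error(error);
  });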

Rxjs: updating values in observable stream with data from another observable, returning a single observable stream

Background
I'm trying to construct an observable stream of values from the Stash REST API of pull requests. Unfortunately, the information about whether or not a PR has merge conflicts lives at a different endpoint from the list of pull requests.
The list of open pull requests is visible at, say, http://my.stash.com/rest/api/1.0/projects/myproject/repos/myrepo/pull-requests
For each PR, the data on merge conflicts is visible at http://my.stash.com/rest/api/1.0/projects/myproject/repos/myrepo/pull-requests/[PR-ID]/merge
Using the atlas-stash package, I can create and subscribe to an observable stream of pull requests (updated every second):
let pullRequestsObs = Rx.Observable.create(function(o) {
  stash.pullRequests(project, repo)
    .on('error', function(error) { o.onError(error); })
    .on('allPages', function(data) {
      o.onNext(data);
      o.onCompleted();
    });
});

let pullRequestStream = pullRequestsObs
  .take(1)
  .merge(
    Rx.Observable
      .interval(1000)
      .flatMapLatest(pullRequestsObs)
  );

pullRequestStream.subscribe(
  (data) => {
    console.log(data);
    // do something with data
  },
  (error) => log.error(error),
  () => log.info('done')
);
This works as I want and expect. In the end, pullRequestStream is an observable whose values are lists of JSON objects.
My Goal
I would like the pullRequestStream values to be updated so that every element of the list includes information from the [PR-ID]/merge API.
I assume this can be achieved using a map on pullRequestStream, but I'm not succeeding in doing it.
let pullRequestWithMergeStream = pullRequestStream.map(function(prlist) {
  _.map(prlist, function(pr) {
    let mergeObs = Rx.Observable.create(function(o) {
      stash.pullRequestMerge(project, repo, pr['id'])
        .on('error', function(error) { o.onError(error); })
        .on('newPage', function(data) {
          o.onNext(data);
          o.onCompleted();
        }).take(1);
    });
    mergeObs.subscribe(
      (data) => {
        pr['merge'] = data;
        return pr; // this definitely isn't right
      },
      (error) => log.error(error),
      () => log.info('done')
    );
  });
});
With a bit of logging I can see that both the pull-request and the merge APIs are being hit correctly, but when I subscribe to pullRequestWithMergeStream I get undefined values.
Using return within the subscribe step inside a map doesn't work (and doesn't seem like it should), but I can't figure out what pattern/idiom would achieve what I want.
Is there a correct way of doing this? Have I gone completely down the wrong track?
tl;dr
Can I update values from an Rxjs.Observable with information from a different observable?
You could use flatMap or concatMap to have one task trigger another one, and forkJoin to request the merges in parallel and collect the results in one place. It is not tested, but it should go like this:
pullRequestStream.concatMap(function(prlist) {
  var arrayRequestMerge = prlist.map(function(pr) {
    return Rx.Observable.create(function(o) { ...same as your code });
  });
  return Rx.Observable.forkJoin(arrayRequestMerge)
    .do(function(arrayData) {
      prlist.map(function(pr, index) { pr['merge'] = arrayData[index]; });
    })
    .map(function() { return prlist; });
});
PS: I assumed prlist is an array.
UPDATE
Following your comment, here is a version that will run only maxConcurrent calls in parallel.
pullRequestStream.concatMap(function(prlist) {
  var arrayRequestMerge = prlist.map(function(pr, index) {
    return Rx.Observable.create(function(o) {
      stash.pullRequestMerge(project, repo, pr['id'])
        .on('error', function(error) { o.onError(error); })
        .on('newPage', function(data) {
          o.onNext({ data: data, index: index });
          o.onCompleted();
        }).take(1);
    });
  });
  var maxConcurrent = 2;
  return Rx.Observable.from(arrayRequestMerge)
    .merge(maxConcurrent)
    .do(function(obj) {
      prlist[obj.index]['merge'] = obj.data;
    })
    .map(function() { return prlist; });
});
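To make the pattern concrete outside the Stash-specific code, here is a small, self-contained sketch of the same forkJoin idea in RxJS 4 syntax. The fakeMergeInfo helper is purely illustrative and not part of any real API; it stands in for the per-item merge request.
// Hedged, self-contained sketch of "enrich each item with data from another
// observable, then re-emit the whole list" using forkJoin (RxJS 4 style).
let listStream = Rx.Observable.just([{ id: 1 }, { id: 2 }]);

let fakeMergeInfo = (id) => Rx.Observable.just({ conflicted: id % 2 === 0 });

let enrichedStream = listStream.concatMap(function(prlist) {
  return Rx.Observable
    .forkJoin(prlist.map((pr) => fakeMergeInfo(pr.id)))
    .map(function(mergeResults) {
      // Attach each merge result to its pull request, then emit the list once.
      prlist.forEach((pr, i) => { pr.merge = mergeResults[i]; });
      return prlist;
    });
});

enrichedStream.subscribe((prlist) => console.log(prlist));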

Inserting into Collection after Promises in a Meteor Method

I'm using this Gumroad-API npm package in order to fetch data from an external service (Gumroad). Unfortunately, it seems to use a .then() construct which can get a little unwieldy as you will find out below:
This is my Meteor method:
Meteor.methods({
  fetchGumroadData: () => {
    const Gumroad = Meteor.npmRequire('gumroad-api');
    let gumroad = new Gumroad({ token: Meteor.settings.gumroadAccessKey });

    let before = "2099-12-04";
    let after = "2014-12-04";
    let page = 1;
    let sales = [];

    // Recursively defined to continue fetching the next page if it exists
    let doThisAfterResponse = (response) => {
      sales.push(response.sales);
      if (response.next_page_url) {
        page = page + 1;
        gumroad.listSales(after, before, page).then(doThisAfterResponse);
      } else {
        let finalArray = R.unnest(sales);
        console.log('result array length: ' + finalArray.length);
        Meteor.call('insertSales', finalArray);
        console.log('FINISHED');
      }
    };

    gumroad.listSales(after, before, page).then(doThisAfterResponse); // run
  }
});
Since the NPM package exposes the Gumroad API using something like this:
gumroad.listSales(after, before, page).then(callback)
I decided to do it recursively in order to grab all pages of data.
Let me try to re-cap what is happening here:
The journey starts on the last line of the code shown above.
The initial page is fetched, and doThisAfterResponse() is run for the first time.
We first dump the returned data into our sales array, and then we check if the response has given us a link to the next page (as an indication as to whether or not we're on the final page).
If so, we increment our page count and we make the API call again with the same function to handle the response again.
If not, this means we're at our final page. Now it's time to format the data using R.unnest and finally insert the finalArray of data into our database.
But a funny thing happens here. The entire execution halts at the Meteor.call() and I don't even get an error output to the server logs.
I even tried switching out the Meteor.call() for a simple: Sales.insert({text: 'testing'}) but the exact same behaviour is observed.
What I really need to do is to fetch the information and then store it into the database on the server. How can I make that happen?
EDIT: Please also see this other (much more simplified) SO question I made:
Calling a Meteor Method inside a Promise Callback [Halting w/o Error]
I ended up ditching the NPM package and writing my own API call. I could never figure out how to make my call inside the .then(). Here's the code:
fetchGumroadData: () => {
  let sales = [];

  const fetchData = (page = 1) => {
    let options = {
      data: {
        access_token: Meteor.settings.gumroadAccessKey,
        before: '2099-12-04',
        after: '2014-12-04',
        page: page,
      }
    };
    HTTP.call('GET', 'https://api.gumroad.com/v2/sales', options, (err, res) => {
      if (err) { // API call failed
        console.log(err);
        throw err;
      } else { // API call successful
        sales.push(...res.data.sales);
        res.data.next_page_url ? fetchData(page + 1) : Meteor.call('addSalesFromAPI', sales);
      }
    });
  };

  fetchData(); // run the function to fetch data recursively
}
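For completeness, a hedged side note on the original symptom (the Meteor.call that silently halts inside .then() with no error): on the server, callbacks fired from outside Meteor's Fiber-based environment generally need to be wrapped with Meteor.bindEnvironment before they can call Meteor.call or write to collections. A sketch of how the original doThisAfterResponse from the question could be wrapped, assuming the package's promise otherwise behaves as shown above:
// Hedged sketch, not the accepted fix above: Meteor.bindEnvironment keeps the
// Meteor environment available inside the promise callback, so Meteor.call and
// collection writes can run instead of silently stalling.
let doThisAfterResponse = Meteor.bindEnvironment((response) => {
  sales.push(response.sales);
  if (response.next_page_url) {
    page = page + 1;
    gumroad.listSales(after, before, page).then(doThisAfterResponse);
  } else {
    Meteor.call('insertSales', R.unnest(sales));
  }
});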
