I've been trying to fill an array with metadata that I collect with Xray, and haven't had any success. The function is called by an API route on my server and gets the links from my application.
I seem to be struggling with promises, as it takes time to scrape the metadata, and I can't get the function to wait until the data has been collected before moving on. Perhaps I'm just not understanding how Xray works? Or maybe promises? I've tried everything I can think of, this being the most recent (and simplest) attempt:
function createCollection() {
  Promise.all(rawLinks.map(function(link) {
    linksArray.push(xray(link, 'title')(function(error, title) {
      console.log(title);
      return title;
    }))
  }))
  .then(linksArray => {
    console.log(linksArray);
  });
}
It's far from the most robust or elaborate solution I've tried, but it's the most recent one. First the console logs an array of undefined values, THEN it logs the individual titles.
I would be very thankful for any help, or direction on what to research. Like I've said, I feel as if I've exhausted all my ideas and don't know where to even look anymore.
Figured it out, this seems to be doing the trick!
// format links into an array of objects
var rawLinks = links.split(', ');
var linksArray = [];

createCollection();

function createCollection() {
  rawLinks.map(function(link) {
    var fillMetaPromise = new Promise(function(resolve, reject) {
      xray(link, 'title')(function(err, title) {
        var data = { title: title, link: link };
        resolve(data);
      });
    })
    .then(data => {
      processTitle(data.title, data.link);
    });
  });
}
function processTitle(title, link) {
  var object = {
    link: link,
    title: title
  };
  linksArray.push(object);
  console.log(linksArray);
}
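For what it's worth, the same result can be reached without pushing into a shared array from inside the callbacks: map each link to a promise and let Promise.all collect the resolved values in order. A minimal sketch of the pattern, using a stand-in scrapeTitle function in place of the real xray call (the names here are illustrative, not part of x-ray's API):

```javascript
// Stand-in for the xray(link, 'title')(cb) call, so the pattern is runnable.
// In real code you would wrap xray's callback in a Promise the same way.
function scrapeTitle(link) {
  return new Promise(function (resolve) {
    setTimeout(function () {
      resolve({ link: link, title: "Title of " + link });
    }, 10);
  });
}

// Promise.all waits for every scrape and preserves the input order.
function createCollection(rawLinks) {
  return Promise.all(rawLinks.map(scrapeTitle));
}

createCollection(["http://a.example", "http://b.example"])
  .then(function (linksArray) {
    console.log(linksArray); // all items present, in input order
  });
```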
Related
I'm trying to create a small project to work on API calls. I have created an async function that retrieves info about a track using the MusicBrainz API. You can check the result of the request here: https://musicbrainz.org/ws/2/recording/5935ec91-8124-42ff-937f-f31a20ffe58f?inc=genres+ratings+releases+artists&fmt=json (I chose Highway to Hell by AC/DC).
And here is what I have so far for reworking the JSON response of my request:
export const GET_JSON = async function (url) {
  try {
    const res = await Promise.race([
      fetch(url),
      timeout(CONSTANTS.TIMEOUT_SEC),
    ]);
    const data = await res.json();
    if (!res.ok) throw new Error(`${data.message} (${res.status})`);
    return data;
  } catch (err) {
    throw err;
  }
};
export const loadTrackDetail = async function (id) {
  try {
    const trackData = await GET_JSON(
      encodeURI(
        `${CONSTANTS.API_URL}${id}?inc=genres+artists+ratings+releases&fmt=json`
      )
    );
    details.trackDetails = {
      trackTitle: trackData.title,
      trackID: trackData.id,
      trackLength: trackData.length ?? "No duration provided",
      trackArtists: trackData["artist-credit"].length
        ? trackData["artist-credit"]
        : "No information on artists",
      trackReleases: trackData["releases"].length
        ? trackData["releases"]
        : "No information on releases",
      trackGenres: trackData["genres"].length
        ? trackData["genres"]
        : "No information on genres",
      trackRating: trackData.rating.value ?? "No rating yet",
    };
    console.log(details.trackDetails);
  } catch (err) {
    throw err;
  }
};
Now this isn't half bad, but the releases property, for example, is an array of objects (each one being a specific release on which the track is present), and for each of those releases I want to "reduce" the object to its id and title only. The rest doesn't interest me. Moreover, if the title of a release matches that of one already present, the entire object should not be added to the new array.
I've thought about doing a forEach, but I just can't wrap my head around how to write it correctly, whether it's actually possible at all, or whether I should use Array.map, for example, or another iterative method.
If anyone has a nice way of doing this in pure JS (not jQuery!), efficient and clean, it'd be much appreciated!
Cheers
There are a few things that make this question a little difficult to answer, but I believe the below will get you pointed in the right direction.
You don't include the GET_JSON method, so your example isn't complete and can't be used immediately to iterate on.
In the example you bring, there isn't a name property on the objects contained in the releases array. I substituted name with title below to demonstrate the approach.
You state
Moreover, I'd like to say that if, for example, the name of a release
is similar to that of a previous one already present, the entire
object is not added to the new array.
But you don't define what you consider that would make releases similar.
Given the above, as stated, I assumed you meant title when you said name and I also assumed that what would constitute a similar release would be one with the same name/title.
Assuming those assumptions are correct, I just use fetch to retrieve the results. The response has a json method on it that converts the response to a JSON object. Then I map each release to the smaller data set you are interested in (id, title), and reduce that array to remove 'duplicate' releases.
fetch('https://musicbrainz.org/ws/2/recording/5935ec91-8124-42ff-937f-f31a20ffe58f?inc=genres+ratings+releases+artists&fmt=json')
  .then(m => m.json())
  .then(j => {
    const reducedReleases = j.releases
      .map(release => ({ id: release.id, name: release.title }))
      .reduce((accumulator, currentValue) => {
        if (!accumulator.find(a => a.name === currentValue.name)) {
          accumulator.push(currentValue);
        }
        return accumulator;
      }, []);
    console.log(reducedReleases);
  });
const releasesReduced = [];

const titleNotExist = (title) => {
  return releasesReduced.every(release => {
    if (release.title === title) return false;
    return true;
  });
};

trackData["releases"].forEach(release => {
  if (titleNotExist(release.title))
    releasesReduced.push({ id: release.id, title: release.title });
});

console.log(releasesReduced);
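Both approaches above rescan the accumulated array for every release (find/every), which is quadratic in the number of releases. A one-pass alternative keyed on the title, using a Map, would look like this; a sketch assuming the releases are plain { id, title } objects:

```javascript
// Deduplicate releases by title in a single pass.
// The first occurrence of a title wins; later duplicates are skipped.
function dedupeByTitle(releases) {
  const seen = new Map();
  for (const release of releases) {
    if (!seen.has(release.title)) {
      seen.set(release.title, { id: release.id, title: release.title });
    }
  }
  return [...seen.values()];
}

const sample = [
  { id: "1", title: "Highway to Hell" },
  { id: "2", title: "Highway to Hell" },
  { id: "3", title: "Back in Black" },
];
console.log(dedupeByTitle(sample)); // two entries: ids "1" and "3"
```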
The array details.trackDetails.trackReleases has a path to an id and name from different objects. If you meant ["release-events"] => ["area"]["id"] and ["area"]["name"], then see the demo below.
Demo uses flatMap() on each level of path to extract "release-events" then "area" to return an array of objects
[{name: area.name, id: area.id}, {name: area.name, id: area.id},...]
Then it runs the array of pairs through a for...of loop and sets each unique name with its id into an ES6 Map. Finally it returns the Map as an object.
{name: id, name: id, ...}
To review this functioning, go to this Plunker
const releaseEvents = (objArr) => {
  let trackClone = JSON.parse(JSON.stringify(objArr));
  let areas = trackClone.flatMap(obj => {
    if (obj["release-events"]) {
      let countries = obj["release-events"].flatMap(o => {
        if (o["area"]) {
          let area = {};
          area.name = o["area"]["name"];
          area.id = o["area"]["id"];
          return [area];
        } else {
          return [];
        }
      });
      return countries;
    } else {
      return [];
    }
  });
  let eventAreas = new Map();
  for (let area of areas) {
    if (!eventAreas.has(area.name)) {
      eventAreas.set(area.name, area.id);
    }
  }
  return Object.fromEntries([...eventAreas]);
};

console.log(releaseEvents(details.trackDetails.trackReleases));
Specifically, given a list of data, I want to loop over that list and do a fetch for each element before combining it all afterward. The thing is, as written, the code iterates through the entire list immediately, starting all the operations at once. Then, even though the fetch operations are still running, the then call I have after all that runs before the data has been processed.
I read something about putting all the Promises in an array, then passing that array to a Promise.all() call, followed by a then that will have access to all that processed data as intended, but I'm not sure how exactly to go about doing it in this case, since I have nested Promises in this for loop.
for (var i in repoData) {
  var repoName = repoData[i].name;
  var repoUrl = repoData[i].url;
  (function(name, url) {
    Promise.all([
      fetch(`https://api.github.com/repos/${username}/${repoData[i].name}/commits`),
      fetch(`https://api.github.com/repos/${username}/${repoData[i].name}/pulls`)
    ])
      .then(function(results) {
        Promise.all([results[0].json(), results[1].json()])
          .then(function(json) {
            //console.log(json[0]);
            var commits = json[0];
            var pulls = json[1];
            var repo = {};
            repo.name = name;
            repo.url = url;
            repo.commitCount = commits.length;
            repo.pullRequestCount = pulls.length;
            console.log(repo);
            user.repositories.push(repo);
          });
      });
  })(repoName, repoUrl);
}
}).then(function() {
  var payload = new Object();
  payload.user = user;
  //console.log(payload);
  //console.log(repoData[0]);
  res.send(payload);
});
Generally when you need to run asynchronous operations for all of the items in an array, the answer is to use Promise.all(arr.map(...)) and this case appears to be no exception.
Also remember that you need to return values from your then callbacks in order to pass them on to the next then (or to the Promise.all aggregating everything).
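That "return from then" rule is easy to see in isolation, with stand-in values and no network involved:

```javascript
function getNumber() {
  return Promise.resolve(2);
}

// Returning from each then() passes the value down the chain.
const withReturn = getNumber()
  .then(n => n * 10)   // arrow expression body implicitly returns 20
  .then(n => n + 1);   // resolves to 21

// A block body without an explicit return passes undefined instead.
const withoutReturn = getNumber()
  .then(n => { n * 10; })
  .then(n => n);       // resolves to undefined
```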
When faced with a complex situation, it helps to break it down into smaller pieces. In this case, you can isolate the code to query data for a single repo into its own function. Once you've done that, the code to query data for all of them boils down to:
Promise.all(repoData.map(function (repoItem) {
  return getDataForRepo(username, repoItem);
}))
Please try the following:
// function to query details for a single repo
function getDataForRepo(username, repoInfo) {
  return Promise
    .all([
      fetch(`https://api.github.com/repos/${username}/${repoInfo.name}/commits`),
      fetch(`https://api.github.com/repos/${username}/${repoInfo.name}/pulls`)
    ])
    .then(function (results) {
      return Promise.all([results[0].json(), results[1].json()]);
    })
    .then(function (json) {
      var commits = json[0];
      var pulls = json[1];
      var repo = {
        name: repoInfo.name,
        url: repoInfo.url,
        commitCount: commits.length,
        pullRequestCount: pulls.length
      };
      console.log(repo);
      return repo;
    });
}

Promise.all(repoData.map(function (repoItem) {
  return getDataForRepo(username, repoItem);
})).then(function (retrievedRepoData) {
  console.log(retrievedRepoData);
  user.repositories = retrievedRepoData; // attach the collected repo data before responding
  var payload = { user: user };
  res.send(payload);
});
I'm using fast-csv's fromPath() method to read data from a file. I would like to write this data into an array (which I will subsequently sort). I would expect the code below to work for this purpose, but it does not:
var csv = require('fast-csv');
var dataArr = [];

csv.fromPath("datas.csv", { headers: true })
  .on("data", data => {
    console.log(data);
    // > { num: '4319', year: '1997', month: '4', day: '20', ...
    dataArr.push(data);
  });

console.log(dataArr);
// > []
I am able to read the data in the file with this code, but the array is not populated.
What is a good way to accomplish this, and why does the code above not work?
Well, I know that this question was asked a long time ago, but just now I've had to work with a CSV file while creating an API with Node.js. Being a typical programmer I googled "Reading from a file with fast-csv and writing into an array", but to date there isn't a proper response to the question, hence I decided to answer it.
The on handlers are asynchronous: execution in the main flow continues immediately, and the data is only complete once the end event fires, so that is where the array should be resolved.
var queryParameter = () => new Promise(resolve => {
  let returnLit = [];
  csv.fromPath("<fileName>", { headers: true })
    .on('data', (data) => {
      returnLit.push(data[<header name>].trim());
    })
    .on('end', () => {
      resolve(returnLit);
    });
});

var mainList = [];
queryParameter().then((res) => mainList = res);
If you want to validate something, pass an argument into queryParameter() and use that argument in your validation method.
The "on data" callback is asynchronous, and the commands that follow the callback will run before the callback finishes. This is why the code does not work, and this reasoning has been pointed out by others who have posted answers and comments.
As for a good way to accomplish the task, I have found that using the "on end" callback is a good fit; since the intention here is to "do something" with the whole data, after the file has been read completely.
var dataArr = [];

csv.fromPath("datas.csv", { headers: true })
  .on("data", data => {
    dataArr.push(data);
  })
  .on("end", () => {
    console.log(dataArr.length);
    // > 4187
  });
As of "fast-csv": "^4.1.3" the approach by #ChandraKumar no longer works
The fromPath function has been removed in favour of parseFile:
var queryParameter = () => new Promise(resolve => {
  let returnLit = [];
  csv.parseFile("<fileName>", { headers: true })
    .on('data', (data) => {
      returnLit.push(data[<header name>].trim());
    })
    .on('end', () => {
      resolve(returnLit);
    });
});

var mainList = [];
queryParameter().then((res) => mainList = res);
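One caveat with the last two lines: mainList is still empty on the line immediately after the .then(...) call; it is only populated once the promise resolves. A sketch of the ordering with a stand-in promise in place of the CSV parsing (the stub resolves on a microtask, the way the real one resolves on the "end" event):

```javascript
// Stand-in for queryParameter(): resolves asynchronously,
// like the real one does when the CSV "end" event fires.
const queryParameter = () => Promise.resolve(["row1", "row2"]);

let mainList = [];
queryParameter().then((res) => { mainList = res; });
console.log(mainList.length); // 0 here: the .then callback has not run yet

// await guarantees the data is ready before the next line runs
async function main() {
  const rows = await queryParameter();
  return rows.length;
}
```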
The "on data" callback of the module is asynchronous. Therefore, this line
console.log(dataArr);
will always return empty because it runs before the callback.
To fix this you need to process the array and sort it within the callback. For example:
var _ = require('lodash'); // assuming lodash for sortBy

var dataArr = [];
csv.fromPath("datas.csv", { headers: true })
  .on("data", data => {
    dataArr.push(data);
    var sorted = _.sortBy(dataArr, 'propertyX');
    // do something with 'sorted'
  });
At a loss on this one.
I'm using Ember and Ember data. I've got this extra implementation of ic-ajax to make GET, POST and PUT calls. Anyway, i'm trying to make a GET call then turn those results into model instances.
return this.GET('/editor')
  .then((data) => {
    return data.drafts.map((draftData) => {
      let draft = this.store.find('draft', draftData.id);
      console.log(draft.get('type'));
      return draft;
    });
  });
My API returns proper data as data.drafts. This map is supposed to return an array of promises that resolve to draft models. It does not. It resolves to a draft model that has id, date, and title, but that's it. I have 25 other attributes.
In another part of the application I'm getting drafts using findAll on the model, and those models look fine. But when I try store.findRecord('draft', id) I get these fake objects.
-- edit
This is what my reopenClass method looks like for getting an array of objects from the server and turning them into Ember objects:
search(criteria) {
  let query = { search: criteria };
  let adapter = this.store.adapterFor('application');
  let url = adapter.buildURL('article', 'search');
  return adapter.ajax(url, 'GET', { data: query }).then(response => {
    let articleRecords = response.articles.map((article) => {
      let record;
      try {
        record = this.store.createRecord('article', article);
      } catch (e) {
        record = this.store.peekRecord('article', article.id);
      }
      return record;
    });
    return articleRecords;
  });
},
So far I can't find a better way to pull this off.
I have a problem reading directories from the Dropbox API, dropbox.js: https://github.com/dropbox/dropbox-js
The read-directories function from the API looks something like this, and I want to assign the value with Angular so I can easily reach it in my HTML.
UPDATE AFTER HELPFUL QUESTION
client.readdir("/", function(error, entries, stat1, stat2) {
  if (error) {
    return showError(error);
  }
  $scope.dbItems = stat1;
  $scope.$digest();
});
The HTML-code:
<ul ng-repeat="dbItem in dbItems">
  <li><input type="image" src="img/folder.png">{{dbItem.name}}</li>
</ul>
My problem is that the Dropbox call takes some milliseconds, so I have to reload the page to print the collected data. I've tried some things with both Angular and jQuery promises but I can't get it to work. All the examples I find about promises use a setTimeout function, which is fairly easy to implement, but when I try to implement the same thing with Dropbox it doesn't work. Has anyone tried something similar?
UPDATE WITH MY PROBLEM
Now the HTML is updated correctly but I also want to join all of my Dropbox directories to be exported as JSON. I've updated the function above to look like:
$scope.listDB = function() {
  client.readdir(dirToString(), function(error, entries, stat1, stat2) {
    if (error) {
      return showError(error); // Something went wrong.
    }
    $scope.dbItems = stat2;
    $scope.$digest();
    testTemp = stat1._json;
    setFolders(testTemp);

    function setFolders(current) {
      for (var i = 0, folders = current.contents; i < folders.length; i++) {
        if (folders[i].is_dir) {
          folders[i].name = folders[i].path.replace(/\/([^)]+)\//, "");
          folders[i].name = folders[i].name.replace('/', "");
          $scope.listDBinner(folders[i].name);
          var inner = $scope.innerItems;
          folders[i].contents = inner;
          setFolders(folders[i]);
        }
      }
    }
  });
};
With listDBinner:
$scope.listDBinner = function(name) {
  client.readdir(name, function(error, entries, stat1, stat2) {
    if (error) {
      return showError(error); // Something went wrong.
    }
    $scope.innerItems = stat1;
    $scope.$digest();
    console.log($scope.innerItems);
  });
  console.log($scope.innerItems);
};
The problem is that the console.log of $scope.innerItems inside client.readdir is correct, while the one outside is just empty. I know this should probably be solved with promises of some kind, but I really can't get it to work.
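Since client.readdir only hands its results to a callback, one direction to research is wrapping that callback in a promise so the caller can wait for it. A sketch of the pattern with a stand-in client in place of the real Dropbox one (the readdir signature is only mirrored from the code above, not verified against dropbox.js):

```javascript
// Stand-in for client.readdir(path, cb): calls back asynchronously,
// the way the real Dropbox client does.
const client = {
  readdir(path, callback) {
    setTimeout(() => callback(null, ["fileA", "fileB"]), 10);
  }
};

// Wrap the callback API in a promise once...
function readdirAsync(path) {
  return new Promise((resolve, reject) => {
    client.readdir(path, (error, entries) => {
      if (error) reject(error);
      else resolve(entries);
    });
  });
}

// ...then consumers can wait for the data instead of racing it.
readdirAsync("/").then(entries => {
  console.log(entries); // logs after the callback has fired, never empty
});
```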