I am working on a small project, and I'm having a hard time achieving what I need. I've already asked for help on another matter in this project, but I'm at a loss again, this time, I think, because I've bitten off a bit more than I can chew.
So here goes: I'm using the MusicBrainz API to retrieve information about a specific track using its ID (its length, release date, artists, etc.).
I'm also trying to show, below the track details, the covers of all the releases on which the track appears. This requires a new request for each release. I manage to retrieve the URLs of the images I need and push them into an array, which I then map/join to produce <img> elements so the covers render in the HTML.
However, when I click the button to show more details about a specific track, which fires the controlTrackDetail function in my controller (I've tried to implement some basic MVC architecture as practice), the first part of the rendering (the general information from TrackView.render) is fine, but the second part (the covers) is not. I gathered that when I call my CoverView.renderCovers method, the array passed as a parameter is still empty, so nothing happens. If I don't empty the array and click the button again, it does show all my covers, but only because the URLs are those from the previous call to controlTrackDetail.
If anyone has any idea how I could tackle this, and only render the covers after all the unqReleaseCoversUrl(mbid.id) API calls fired in the forEach loop are done, that would help me plenty. Below you'll find the "important" snippets of code to (hopefully) explain my issue.
GET_JSON function (a race between a fetch and a timeout):
export const GET_JSON = async function (url) {
  try {
    const res = await Promise.race([
      fetch(url),
      timeout(CONSTANTS.TIMEOUT_SEC),
    ]);
    const data = await res.json();
    if (!res.ok) throw new Error(`${data.message} (${res.status})`);
    return data;
  } catch (err) {
    throw err;
  }
};
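For reference, the timeout helper raced against the fetch above isn't shown; it is essentially a promise that rejects after a given number of seconds. A minimal sketch of what it presumably looks like (the exact rejection message is an assumption):
const timeout = function (s) {
  // Never resolves; rejects after `s` seconds, so that
  // Promise.race([fetch(url), timeout(s)]) fails when the fetch is too slow.
  return new Promise(function (_, reject) {
    setTimeout(function () {
      reject(new Error(`Request took too long! Timeout after ${s} seconds`));
    }, s * 1000);
  });
};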
The model part, where I create the loadTrackDetail function to recover the information I need:
import { CONSTANTS } from "./config.js";
import {
  CONVERT_MILLIS_TO_MINS_SECONDS,
  GET_JSON,
  SHORTEN_STRING,
  CONSTRUCT_URL_PART,
} from "./helpers.js";
/*
https://musicbrainz.org/ws/2/recording/738920d3-c6e6-41c7-b504-57761bb625fd?inc=genres+artists+ratings+releases&fmt=json
loadTrackDetail("738920d3-c6e6-41c7-b504-57761bb625fd");
*/
export const details = {
  trackDetails: {},
  artistDetails: {},
  releaseDetails: {},
  coverUrlArray: [],
};
export const loadTrackDetail = async function (id) {
  try {
    const trackData = await GET_JSON(
      encodeURI(
        `${CONSTANTS.API_URL}${id}?inc=genres+artists+ratings+releases&fmt=json`
      )
    );
    details.trackDetails = {
      trackTitle: trackData.title ?? "No title provided",
      trackID: trackData.id,
      trackReleaseDate: trackData["first-release-date"] ?? "No date provided",
      trackLength: trackData.length
        ? CONVERT_MILLIS_TO_MINS_SECONDS(trackData.length)
        : "No duration provided",
      trackArtists: trackData["artist-credit"].length
        ? trackData["artist-credit"]
        : "No information on artists",
      trackReleasesBase: trackData["releases"].length
        ? trackData["releases"]
        : "No information on releases",
      trackReleasesCleanOne: trackData["releases"].length
        ? trackData["releases"].map((release) => ({
            id: release.id,
            title: release.title,
          }))
        : "No information on releases",
      trackGenres: trackData["genres"].length
        ? trackData["genres"]
        : "No information on genres",
      trackRating: trackData.rating.value ?? "No rating yet",
    };
    if (details.trackDetails.trackReleasesCleanOne.length > 0) {
      details.trackDetails.trackReleasesCleanOne.forEach((mbid) =>
        unqReleaseCoversUrl(mbid.id)
      );
    }
    details.coverUrlArray = details.coverUrlArray.filter(function (element) {
      return element !== undefined;
    });
    console.log(details.coverUrlArray);
  } catch (err) {
    throw err;
  }
};
export const unqReleaseCoversUrl = async function (mbid) {
  try {
    const coverData = await GET_JSON(
      encodeURI(`${CONSTANTS.COVER_API_URL}${mbid}`)
    );
    console.log(coverData.images);
    coverData.images.forEach((image) => {
      if (image.thumbnails["500"]) {
        details.coverUrlArray.push(image.thumbnails["500"]);
      }
    });
  } catch (err) {
    throw err;
  }
};
And finally, the controller part, fired when a button is clicked:
const controlTrackDetail = async function (trackID) {
  try {
    TrackView.renderSpinner();
    await detailsModel.loadTrackDetail(trackID);
    // 2) Rendering the track details
    TrackView.render(detailsModel.details.trackDetails);
    CoverView.renderSpinner();
    CoverView.renderCovers(detailsModel.details.coverUrlArray);
  } catch (err) {
    console.log(err);
  }
};
Like I said, this works... fine (though I guess it's not really clean), aside from the fact that I'm rendering my covers too early, and I'm just not sure how to delay that until the URLs have been pushed into the array.
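From what I understand, forEach fires the async unqReleaseCoversUrl calls without waiting for them, so I imagine the fix looks something like collecting the promises with map and awaiting them all before filtering; a sketch (untested against my real code):
// Inside loadTrackDetail, replacing the forEach: map each release id to the
// promise returned by unqReleaseCoversUrl, then wait for all of them before
// filtering, so the controller only renders once every URL has been pushed.
if (details.trackDetails.trackReleasesCleanOne.length > 0) {
  await Promise.all(
    details.trackDetails.trackReleasesCleanOne.map((mbid) =>
      unqReleaseCoversUrl(mbid.id)
    )
  );
}
details.coverUrlArray = details.coverUrlArray.filter(
  (element) => element !== undefined
);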
Thanks for reading!
PS: here are a few pictures to illustrate.
On first click, this is what I see (nothing appears below Cover Arts, where the covers should render):
After a few milliseconds, the array is filled with the URLs I need:
And when I go back to the results and click any button without emptying the array, it is used to render the covers with the URLs pushed the first time around (and the array will keep on growing!)
Related
So I have a table that I'm currently lazy loading, and I want to cache the data of the 5 most recently visited pages. When a user clicks the forward or backward button, I send the pageNumber they are going to and the direction they are heading (prev or next).
In my getData function, I check whether the page they are going to exists in a cachedData array. If yes, I dispatch that data without making an API call. If no, I look at the direction they are heading. For example, if they are going 1 -> 5 and then to page 6, I remove page 1 from the cachedData array and push page 6 in; and if they are going backward 6 -> 1, I remove page 6, since it's the farthest from the current page, and push page 1 instead. Below is the implementation:
const cachedData = [];

async function getData({ payload }) {
  const { pageNumber, ...currentPageInfo } = payload;
  const cachedPage = cachedData.find(page => page.pageNumber === pageNumber);
  if (!cachedPage) {
    const response = await fetchData(currentPageInfo);
    if (response) {
      if (cachedData.length >= 5) {
        if (currentPageInfo.direction === "next") cachedData.shift();
        else cachedData.pop();
      }
      const { pageInfo, dataList } = response.data;
      cachedData.push({
        dataList,
        pageNumber,
        pageInfo,
      });
      cachedData.sort((x, y) => x.pageNumber - y.pageNumber);
      dispatch(getDataSuccess({ res: dataList, pageInfo }));
    }
  } else {
    dispatch(
      getDataSuccess({
        res: cachedPage.dataList,
        pageInfo: cachedPage.pageInfo,
      })
    );
  }
}
Currently it's working, but I'm not sure I'm on the right track. Between the shifting, sorting, and finding, the performance isn't very good. The logic is odd because we use our own way of paginating, not the usual skip-and-take approach. But the important thing is whether anything in this code can be improved. Thank you very much.
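One commonly suggested direction, sketched here under the assumption that pageNumber is a unique integer key and that fetchData, dispatch, and getDataSuccess behave as in the snippet above: keying the cache on a Map gives O(1) lookup and drops the find/sort entirely, and evicting the cached page farthest from the current one handles both directions without tracking prev/next:
const cache = new Map(); // pageNumber -> { dataList, pageInfo }
const MAX_CACHED_PAGES = 5;

async function getData({ payload }) {
  const { pageNumber, ...currentPageInfo } = payload;
  const cached = cache.get(pageNumber); // O(1) instead of Array.prototype.find
  if (cached) {
    dispatch(getDataSuccess({ res: cached.dataList, pageInfo: cached.pageInfo }));
    return;
  }
  const response = await fetchData(currentPageInfo);
  if (!response) return;
  if (cache.size >= MAX_CACHED_PAGES) {
    // Evict the cached page numerically farthest from the one being viewed.
    let farthest;
    for (const key of cache.keys()) {
      if (farthest === undefined ||
          Math.abs(key - pageNumber) > Math.abs(farthest - pageNumber)) {
        farthest = key;
      }
    }
    cache.delete(farthest);
  }
  const { pageInfo, dataList } = response.data;
  cache.set(pageNumber, { dataList, pageInfo });
  dispatch(getDataSuccess({ res: dataList, pageInfo }));
}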
The project aims to study a new social media platform:
https://booyah.live/
My needs are:
1 - Collect data from profiles that follow a specific profile.
2 - My account uses this data to follow the collected profiles.
3 - Among other possible options, also unfollow the profiles I follow.
The problem found in the current script:
The profile data is, in theory, being collected, and the script runs perfectly to the end; but for some reason I can't pin down, instead of following all the collected profiles, it only follows the base profile.
For example:
I want to follow all 250 profiles that follow ID 123456
I run the script: booyahGetAccounts(123456);
In theory, the end result would be my account following 250 profiles
But in practice I end up following only profile 123456, so the count of people I'm following is 1
Complete Project Script:
const csrf = 'MY_CSRF_TOKEN';

async function booyahGetAccounts(uid, type = 'followers', follow = 1) {
  if (typeof uid !== 'undefined' && !isNaN(uid)) {
    const loggedInUserID = window.localStorage?.loggedUID;
    if (uid === 0) uid = loggedInUserID;
    const unfollow = follow === -1;
    if (unfollow) follow = 1;
    if (loggedInUserID) {
      if (csrf) {
        async function getUserData(uid) {
          const response = await fetch(`https://booyah.live/api/v3/users/${uid}`),
            data = await response.json();
          return data.user;
        }
        const loggedInUserData = await getUserData(loggedInUserID),
          targetUserData = await getUserData(uid),
          followUser = uid => fetch(`https://booyah.live/api/v3/users/${loggedInUserID}/followings`, { method: (unfollow ? 'DELETE' : 'POST'), headers: { 'X-CSRF-Token': csrf }, body: JSON.stringify({ followee_uid: uid, source: 43 }) }),
          logSep = (data = '', usePad = 0) => typeof data === 'string' && usePad ? console.log((data ? data + ' ' : '').padEnd(50, '━')) : console.log('━'.repeat(50), data, '━'.repeat(50));
        async function getList(uid, type, follow) {
          const isLoggedInUser = uid === loggedInUserID;
          if (isLoggedInUser && follow && !unfollow && type === 'followings') {
            follow = 0;
            console.warn('You already follow your followings. `follow` mode switched to `false`. Followings will be retrieved instead of followed.');
          }
          const userData = await getUserData(uid),
            totalCount = userData[type.slice(0, -1) + '_count'] || 0,
            totalCountStrLength = totalCount.toString().length;
          if (totalCount) {
            let userIDsLength = 0;
            const userIDs = [],
              nickname = userData.nickname,
              nicknameStr = `${nickname ? ` of ${nickname}'s ${type}` : ''}`,
              alreadyFollowedStr = uid => `User ID ${uid} already followed by ${loggedInUserData.nickname} (Account #${loggedInUserID})`;
            async function followerFetch(cursor = 0) {
              const fetched = [];
              await fetch(`https://booyah.live/api/v3/users/${uid}/${type}?cursor=${cursor}&count=100`).then(res => res.json()).then(data => {
                const list = data[type.slice(0, -1) + '_list'];
                if (list?.length) fetched.push(...list.map(e => e.uid));
                if (fetched.length) {
                  userIDs.push(...fetched);
                  userIDsLength += fetched.length;
                  if (follow) followUser(uid);
                  console.log(`${userIDsLength.toString().padStart(totalCountStrLength)} (${(userIDsLength / totalCount * 100).toFixed(4)}%)${nicknameStr} ${follow ? 'followed' : 'retrieved'}`);
                  if (fetched.length === 100) {
                    followerFetch(data.cursor);
                  } else {
                    console.log(`END REACHED. ${userIDsLength} accounts ${follow ? 'followed' : 'retrieved'}.`);
                    if (!follow) logSep(targetList);
                  }
                }
              });
            }
            await followerFetch();
            return userIDs;
          } else {
            console.log(`This account has no ${type}.`);
          }
        }
        logSep(`${follow ? 'Following' : 'Retrieving'} ${targetUserData.nickname}'s ${type}`, 1);
        const targetList = await getList(uid, type, follow);
      } else {
        console.error('Missing CSRF token. Retrieve your CSRF token from the Network tab in your inspector by clicking into the Network tab item named "bug-report-claims" and then scrolling down in the associated details window to where you see "x-csrf-token". Copy its value and store it into a variable named "csrf" which this function will reference when you execute it.');
      }
    } else {
      console.error('You do not appear to be logged in. Please log in and try again.');
    }
  } else {
    console.error('UID not passed. Pass the UID of the profile you are targeting to this function.');
  }
}
This question is a continuation of the answer to the question linked below:
Collect the full list of buttons to follow without having to scroll the page (DevTools Google Chrome)
Since I can't offer more bounty on that question, I created this one to offer the new bounty to anyone who can fix the bug and make the script work.
Test account on the Booyah website to use for testing:
Access via Google:
User: teststackoverflowbooyah@gmail.com
Password: quartodemilha
I have to admit that your code is really hard to read; it took me less time to rewrite everything from scratch.
Given that we need a piece of code to cut/paste into the browser's JavaScript console, and that it has to store some data (i.e. the expiration of followings and the permanent followings), a few considerations are needed.
We can consider the expiration of followings as volatile data: if it is lost, we can simply reset it to one day from the moment of the loss. window.localStorage is a perfect candidate for this kind of data. If we change web browser, the only drawback is that we lose the expirations, and we can tolerate them being reset to one day from the switch.
To store the list of permanent followings, on the other hand, we need storage that survives a browser change. The best idea that came to my mind is to create an alternative account that follows the users we never want to stop following. In my code I used uid 3186068 (a random user); once you have created your own alternative account, just replace the first line of the code block with its uid.
Another thing we need to take care of is error handling: APIs can always return errors. The approach I chose is to write myFetch, which retries the same call a few times in case of error; if the error persists, we are probably facing a temporary booyah.live outage and just need to retry a bit later.
To provide a comfortable interface, the code block gathers the uid from window.location: to follow the followers of a user, just cut/paste the code block into a tab opened on their profile. For example, I ran the code from a tab open on https://booyah.live/studio/123456?source=44.
Last, to unfollow users, the clean function is first called 5 minutes after we paste the code (so it does not conflict with the calls that follow followers) and then runs again one hour after it finishes its job. It accesses localStorage atomically, so you can have many instances running simultaneously in different tabs of the same browser without worrying about it. The only thing to take care of is that when window.location changes, all the JavaScript events in the tab are reset; so I suggest keeping a tab open on the home page, pasting the code block there, and forgetting about that tab: it will be the tab responsible for unfollowing users. Then open other tabs to do what you need; when you hit a user whose followers you want to follow, paste the block there, wait for the job to finish, and continue using the tab normally.
// The account we use to store followings
const followingsUID = 3186068;
// Gather the loggedUID from window.localStorage
const { loggedUID } = window.localStorage;
// Gather the CSRF token from the cookies
const csrf = document.cookie.split("; ").reduce((ret, _) => (_.startsWith("session_key=") ? _.substr(12) : ret), null);

// APIs could have errors, let's do some retries
async function myFetch(url, options, attempt = 0) {
  try {
    const res = await fetch("https://booyah.live/api/v3/" + url, options);
    const ret = await res.json();
    return ret;
  } catch(e) {
    // After too many consecutive errors, let's abort: we need to retry later
    if(attempt === 3) throw e;
    return myFetch(url, options, attempt + 1);
  }
}
function expire(uid, add = true) {
  const { followingsExpire } = window.localStorage;
  let expires = {};
  try {
    // Get and parse followingsExpire from localStorage
    expires = JSON.parse(followingsExpire);
  } catch(e) {
    // In case of error (e.g. new browsers) simply init to empty
    window.localStorage.followingsExpire = "{}";
  }
  if(! uid) return expires;
  // Set expire after 1 day
  if(add) expires[uid] = new Date().getTime() + 3600 * 24 * 1000;
  else delete expires[uid];
  window.localStorage.followingsExpire = JSON.stringify(expires);
}

async function clean() {
  try {
    const expires = expire();
    const now = new Date().getTime();
    for(const uid in expires) {
      if(expires[uid] < now) {
        await followUser(parseInt(uid), false);
        expire(uid, false);
      }
    }
  } catch(e) {}
  // Repeat clean in an hour
  window.setTimeout(clean, 3600 * 1000);
}

async function fetchFollow(uid, type = "followers", from = 0) {
  const { cursor, follower_list, following_list } = await myFetch(`users/${uid}/${type}?cursor=${from}&count=50`);
  const got = (type === "followers" ? follower_list : following_list).map(_ => _.uid);
  const others = cursor ? await fetchFollow(uid, type, cursor) : [];
  return [...got, ...others];
}

async function followUser(uid, follow = true) {
  console.log(`${follow ? "F" : "Unf"}ollowing ${uid}...`);
  return myFetch(`users/${loggedUID}/followings`, {
    method: follow ? "POST" : "DELETE",
    headers: { "X-CSRF-Token": csrf },
    body: JSON.stringify({ followee_uid: uid, source: 43 })
  });
}

async function doAll() {
  if(! loggedUID) throw new Error("Can't get 'loggedUID' from localStorage: try to login again");
  if(! csrf) throw new Error("Can't get session token from cookies: try to login again");
  console.log("Fetching current followings...");
  const currentFollowings = await fetchFollow(loggedUID, "followings");
  console.log("Fetching permanent followings...");
  const permanentFollowings = await fetchFollow(followingsUID, "followings");
  console.log("Syncing permanent followings...");
  for(const uid of permanentFollowings) {
    expire(uid, false);
    if(currentFollowings.indexOf(uid) === -1) {
      await followUser(uid);
      currentFollowings.push(uid);
    }
  }
  // Sync followingsExpire in localStorage
  for(const uid of currentFollowings) if(permanentFollowings.indexOf(uid) === -1) expire(uid);
  // Call first clean task in 5 minutes
  window.setTimeout(clean, 300 * 1000);
  // Gather uid from window.location
  const match = /\/studio\/(\d+)/.exec(window.location.pathname);
  if(match) {
    console.log("Fetching this user followers...");
    const followings = await fetchFollow(parseInt(match[1]));
    for(const uid of followings) {
      if(currentFollowings.indexOf(uid) === -1) {
        await followUser(uid);
        expire(uid);
      }
    }
  }
  return "Done";
}

await doAll();
The problem: I strongly suspect a booyah.live API bug
To test my code I ran it from https://booyah.live/studio/123456?source=44.
If I run it multiple times, I keep getting the following output:
Fetching current followings...
Fetching permanent followings...
Syncing permanent followings...
Following 1801775...
Following 143823...
Following 137017...
Fetching this user followers...
Following 16884042...
Following 16166724...
There is a bug somewhere! The expected output for subsequent executions in the same tab would be:
Fetching current followings...
Fetching permanent followings...
Syncing permanent followings...
Fetching this user followers...
After searching for the bug in my code without success, I checked the booyah.live APIs: if I navigate to the following URLs (the uids are the ones the code keeps following on subsequent executions)
https://booyah.live/studio/1801775
https://booyah.live/studio/143823
https://booyah.live/studio/137017
https://booyah.live/studio/16884042
https://booyah.live/studio/16166724
I can clearly see that I follow them, but if I navigate to https://booyah.live/following (the list of users I follow), I can't find them, not even if I scroll the page to the end.
Since I make exactly the same calls the website does, I strongly suspect the bug is in the booyah.live APIs, specifically in the way they handle the cursor parameter.
I suggest you open a ticket with the booyah.live support team. You could use the test account you provided us; I have already given you the details you need for that. ;)
I'm trying to create a small project to practice API calls. I have created an async function that retrieves info about a track using the MusicBrainz API. You can check the result of the request by clicking here: https://musicbrainz.org/ws/2/recording/5935ec91-8124-42ff-937f-f31a20ffe58f?inc=genres+ratings+releases+artists&fmt=json (I chose Highway to Hell by AC/DC).
And here is what I have so far for reworking the JSON response of my request:
export const GET_JSON = async function (url) {
  try {
    const res = await Promise.race([
      fetch(url),
      timeout(CONSTANTS.TIMEOUT_SEC),
    ]);
    const data = await res.json();
    if (!res.ok) throw new Error(`${data.message} (${res.status})`);
    return data;
  } catch (err) {
    throw err;
  }
};
export const loadTrackDetail = async function (id) {
  try {
    const trackData = await GET_JSON(
      encodeURI(
        `${CONSTANTS.API_URL}${id}?inc=genres+artists+ratings+releases&fmt=json`
      )
    );
    details.trackDetails = {
      trackTitle: trackData.title,
      trackID: trackData.id,
      trackLength: trackData.length ?? "No duration provided",
      trackArtists: trackData["artist-credit"].length
        ? trackData["artist-credit"]
        : "No information on artists",
      trackReleases: trackData["releases"].length
        ? trackData["releases"]
        : "No information on releases",
      trackGenres: trackData["genres"].length
        ? trackData["genres"]
        : "No information on genres",
      trackRating: trackData.rating.value ?? "No rating yet",
    };
    console.log(details.trackDetails);
  } catch (err) {
    throw err;
  }
};
Now this isn't half bad, but the releases property, for example, is an array of objects (each one being a specific release on which the track is present), and for each of those releases I want to "reduce" the object to its id and title only. The rest doesn't interest me. Moreover, if the title of a release is similar to that of one already present, the entire object should not be added to the new array.
I've thought about a forEach loop, but I just can't wrap my head around how to write it correctly, whether it's actually possible at all, or whether I should use Array.prototype.map, for example, or another iterative method.
If anyone has a nice way of doing this in pure JS (not jQuery!), efficient and clean, it'd be much appreciated!
Cheers
There are a few things that make this question a little difficult to answer, but I believe the below will get you pointed in the right direction.
You don't include the GET_JSON method, so your example isn't complete and can't immediately be iterated on.
In the example you link to, there isn't a name property on the objects contained in the releases array. I substituted name with title below to demonstrate the approach.
You state:
"Moreover, I'd like to say that if, for example, the name of a release is similar to that of a previous one already present, the entire object is not added to the new array."
But you don't define what you consider would make releases similar.
Given the above, I assumed you meant title when you said name, and I also assumed that what would constitute a similar release would be one with the same name/title.
If those assumptions are correct, I just use fetch to retrieve the results. The response has a json method on it that converts the response to a JSON object. I then map each release to the smaller data set you are interested in (id, title) and reduce that array to remove 'duplicate' releases.
fetch('https://musicbrainz.org/ws/2/recording/5935ec91-8124-42ff-937f-f31a20ffe58f?inc=genres+ratings+releases+artists&fmt=json')
  .then(m => m.json())
  .then(j => {
    const reducedReleases = j.releases
      .map(release => ({ id: release.id, name: release.title }))
      .reduce((accumulator, currentValue, currentIndex, sourceArray) => {
        if (!accumulator.find(a => a.name === currentValue.name)) {
          accumulator.push(currentValue);
        }
        return accumulator;
      }, []);
    console.log(reducedReleases);
  });
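A side note on the design: accumulator.find rescans the accumulator for every release, so the reduce is quadratic in the number of releases. If the list were large, a Map keyed on the title would keep the dedup linear; a sketch producing the same output shape, meant to sit inside the same .then(j => ...) handler:
// Linear-time variant of the dedup above: keep the first release seen
// for each title, using a Map keyed on the title.
const byTitle = new Map();
for (const release of j.releases) {
  if (!byTitle.has(release.title)) {
    byTitle.set(release.title, { id: release.id, name: release.title });
  }
}
const reducedReleases = [...byTitle.values()];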
const releasesReduced = [];

const titleNotExist = (title) => {
  return releasesReduced.every(release => release.title !== title);
};

trackData["releases"].forEach(release => {
  if (titleNotExist(release.title))
    releasesReduced.push({ id: release.id, title: release.title });
});

console.log(releasesReduced);
The array details.trackDetails.trackReleases has a path to an id and a name in different objects. If you meant ["release-events"] => ["area"]["id"] and ["area"]["name"], then see the demo below.
The demo uses flatMap() on each level of the path, extracting "release-events" and then "area", to return an array of objects:
[{name: area.name, id: area.id}, {name: area.name, id: area.id}, ...]
It then runs the array of pairs through a for...of loop, setting each unique name with its id into an ES6 Map, and finally returns the Map as an object:
{name: id, name: id, ...}
To see this in action, go to this Plunker.
const releaseEvents = (objArr) => {
  let trackClone = JSON.parse(JSON.stringify(objArr));
  let areas = trackClone.flatMap(obj => {
    if (obj["release-events"]) {
      let countries = obj["release-events"].flatMap(o => {
        if (o["area"]) {
          let area = {};
          area.name = o["area"]["name"];
          area.id = o["area"]["id"];
          return [area];
        } else {
          return [];
        }
      });
      return countries;
    } else {
      return [];
    }
  });
  let eventAreas = new Map();
  for (let area of areas) {
    if (!eventAreas.has(area.name)) {
      eventAreas.set(area.name, area.id);
    }
  }
  return Object.fromEntries([...eventAreas]);
};

console.log(releaseEvents(details.trackDetails.trackReleases));
I'm scraping a website with Apify. I want to scrape different types of pages and then combine the data into one dataset. Right now I have different sets of data for each kind of page (users, shots). How can I transfer data between pageFunction executions, e.g. to calculate the follower count for each shot author?
async function pageFunction(context) {
  const { request, log, jQuery } = context;
  const $ = jQuery;
  if (request.url.indexOf('/shots/') > 0) {
    const title = $('.shot-title').text();
    return {
      url: request.url,
      title
    };
  } else if (request.userData.label === "USER") {
    var followers_count = $('.followers .count').first().text();
    return {
      url: request.url,
      followers_count
    };
  }
}
If I understand the question correctly, you can pass the data through crawled pages and save only one item in the end. For this use case, you can work with userData, which you can pass with every request.
For example, if you would like to pass data from the /shots page to the USER page, you could do it like this (but it requires you to enqueue pages manually to control the flow of the data; this approach also expects that the /shots type of page is the first one you visit, continuing from there):
async function pageFunction(context) {
  const { request, log, jQuery } = context;
  const $ = jQuery;
  if (request.url.indexOf('/shots/') > 0) {
    const title = $('.shot-title').text();
    const userLink = 'some valid url to user page';
    // Add your request to the queue, with the title in the userData
    await context.enqueueRequest({
      url: userLink,
      userData: {
        label: 'USER',
        shotsTitle: title
      }
    });
  } else if (request.userData.label === "USER") {
    var followers_count = $('.followers .count').first().text();
    // Here you need to get the shotsTitle and return it
    return {
      url: request.url,
      followers_count,
      shotsTitle: request.userData.shotsTitle
    };
  }
}
If you need to share the data between runs of the actor, that is another topic; let me know if this helped.
I would also recommend going through the getting-started guide, which is here.
I'm using this Gumroad-API npm package in order to fetch data from an external service (Gumroad). Unfortunately, it seems to use a .then() construct which can get a little unwieldy as you will find out below:
This is my Meteor method:
Meteor.methods({
  fetchGumroadData: () => {
    const Gumroad = Meteor.npmRequire('gumroad-api');
    let gumroad = new Gumroad({ token: Meteor.settings.gumroadAccessKey });

    let before = "2099-12-04";
    let after = "2014-12-04";
    let page = 1;
    let sales = [];

    // Recursively defined to continue fetching the next page if it exists
    let doThisAfterResponse = (response) => {
      sales.push(response.sales);
      if (response.next_page_url) {
        page = page + 1;
        gumroad.listSales(after, before, page).then(doThisAfterResponse);
      } else {
        let finalArray = R.unnest(sales);
        console.log('result array length: ' + finalArray.length);
        Meteor.call('insertSales', finalArray);
        console.log('FINISHED');
      }
    };

    gumroad.listSales(after, before, page).then(doThisAfterResponse); // run
  }
});
Since the NPM package exposes the Gumroad API using something like this:
gumroad.listSales(after, before, page).then(callback)
I decided to do it recursively in order to grab all pages of data.
Let me try to recap what is happening here:
The journey starts on the last line of the code shown above.
The initial page is fetched, and doThisAfterResponse() is run for the first time.
We first dump the returned data into our sales array, and then we check if the response has given us a link to the next page (as an indication as to whether or not we're on the final page).
If so, we increment our page count and we make the API call again with the same function to handle the response again.
If not, this means we're at our final page. Now it's time to format the data using R.unnest and finally insert the finalArray of data into our database.
But a funny thing happens here. The entire execution halts at the Meteor.call() and I don't even get an error output to the server logs.
I even tried switching out the Meteor.call() for a simple Sales.insert({text: 'testing'}), but the exact same behaviour is observed.
What I really need to do is to fetch the information and then store it into the database on the server. How can I make that happen?
EDIT: Please also see this other (much simpler) SO question I asked:
Calling a Meteor Method inside a Promise Callback [Halting w/o Error]
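For what it's worth, a common cause of Meteor method calls silently halting inside promise callbacks is that the callback runs outside a Fiber, and Meteor.bindEnvironment is the usual remedy. A sketch of how it might be applied to the handler above (same names as in the question; untested here):
// Wrapping the response handler in Meteor.bindEnvironment restores the
// Meteor environment (Fiber), so Meteor.call / collection inserts work
// when invoked from a promise callback.
let doThisAfterResponse = Meteor.bindEnvironment((response) => {
  sales.push(response.sales);
  if (response.next_page_url) {
    page = page + 1;
    gumroad.listSales(after, before, page).then(doThisAfterResponse);
  } else {
    Meteor.call('insertSales', R.unnest(sales));
  }
});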
I ended up ditching the NPM package and writing my own API call. I could never figure out how to make my call inside the .then(). Here's the code:
fetchGumroadData: () => {
  let sales = [];
  const fetchData = (page = 1) => {
    let options = {
      data: {
        access_token: Meteor.settings.gumroadAccessKey,
        before: '2099-12-04',
        after: '2014-12-04',
        page: page,
      }
    };
    HTTP.call('GET', 'https://api.gumroad.com/v2/sales', options, (err, res) => {
      if (err) { // API call failed
        console.log(err);
        throw err;
      } else { // API call successful
        sales.push(...res.data.sales);
        res.data.next_page_url ? fetchData(page + 1) : Meteor.call('addSalesFromAPI', sales);
      }
    });
  };
  fetchData(); // run the function to fetch data recursively
}