Currently working with NextJS, but struggling to make an indexing page of sorts. With the router, I'm trying to get the page number by doing this:
let router = useRouter()
let page = isNaN(router.query.page) ? 1 : parseInt(router.query.page);
So that when I go to page=1, page=2 etc, I get new sets of data.
The functionality for this is called in the same main component, and is a React Query function:
const {data, status} = useQuery(["keycaps", {manu: manuId}, {prof: profileId}, {col: colorId}, {stat: statusId}], getKeycaps)
And said function looks like this:
const getKeycaps = async (key) => {
  const manuId = key.queryKey[1].manu
  const profileIds = key.queryKey[2].prof.map(id => `profile.id=${id}`)
  const colorIds = key.queryKey[3].col.map(id => `filter_colors.id=${id}`)
  const statId = key.queryKey[4].stat
  const profileQueryString = profileIds.join("&")
  const colorQueryString = colorIds.join("&")
  let urlParams = new URLSearchParams(document.location.search);
  let page = urlParams.get("page") == null ? 1 : parseInt(urlParams.get("page"));
  let start = (page * 10) - 10
  const data = await axios(`${process.env.REACT_APP_STRAPI_API}/keycaps?${manuId ? 'manufacturer.id=' + manuId + '&' : ''}${profileQueryString ? profileQueryString + '&' : ''}${colorQueryString ? colorQueryString + '&' : ''}_start=${start}&_limit=10`)
  return data.data
}
When a page is loaded directly, like pasting the URL of the index in (e.g. keycaps?page=2), it fetches the results just fine. However, if I start using navigation buttons like these:
<Link href={`/keycaps?page=${page - 1}`}> // next/link
  <div className="w-32 rounded-lg cursor-pointer">
    Prev
  </div>
</Link>
<Link href={`/keycaps?page=${page + 1}`}>
  <div className="w-32 rounded-lg cursor-pointer">
    Next
  </div>
</Link>
The whole thing starts to break down. Essentially, the page doesn't actually reload any data or results until the window loses and regains focus. So, if I press the Next button to go to the next page, it won't load the data until I do something like switch to a different window or browser tab, and when I come back the data all magically loads within a second.
I've tried this with next build && next start too, and this produces the same results. I just want to get the page results when the next and prev page buttons are pressed, and in a way that doesn't require the user to switch tabs to get content.
I will note that I do have a getStaticProps on this as well. It does the following:
export async function getStaticProps() {
  const allKeycaps = (await getAllKeycaps() || 'Error')
  return {
    props: { allKeycaps }
  }
}
This calls an API script, and said script does this:
async function fetchAPI(query, {variables} = {}) {
  const res = await fetch(`${process.env.REACT_APP_STRAPI_API}/graphql`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      query,
      variables,
    }),
  })
  const json = await res.json()
  if (json.errors) {
    console.error(json.errors)
    throw new Error('Failed to fetch API')
  }
  console.log('json', json.data, variables)
  return json.data
}
/* Keycap related grabs */
export async function getAllKeycaps() {
const data = await fetchAPI(
`
{
keycaps {
id
slug
name
published_at
updatedAt
profile {
id
name
}
manufacturer {
id
name
lead
}
status {
id
name
}
colors
filter_colors {
color
}
kits
designer
thumb {
formats
}
}
}
`
)
return data
}
Does anyone know how to get this to work, i.e. how to navigate between index pages like this? I've been trying to find Next.js tutorials that use this kind of pagination (page 1, page 2, etc.), but all I can find are examples of blog articles with slugs, no paginated index pages of any kind.
Thanks a lot in advance.
Answer found:
When setting data and status using useQuery, include the current page in the query key:
const curPage = router.query.page == null ? 1 : router.query.page
const {data, status} = useQuery(["keycaps", {manu: manuId}, {prof: profileId}, {col: colorId}, {stat: statusId}, {page: curPage}], getKeycaps)
And then, in getKeycaps
const page = key.queryKey[5].page
I guess the urlParams approach wasn't a good one, or at least not one React Query reacts to: reading document.location.search inside the fetcher never changes the query key, so nothing tells React Query to refetch when next/link navigates (the data only appeared after switching tabs because of its default refetchOnWindowFocus behaviour). Passing the router's page number through the query key works because a key change triggers a refetch.
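For anyone else hitting this, here is a minimal sketch of the combined fix. The query key is collapsed into a single object to keep it short, the manufacturer/profile/color filter handling from the original getKeycaps is omitted, and the component name is just for illustration:
import axios from "axios";
import { useRouter } from "next/router";
import { useQuery } from "react-query";

// Fetcher reads the page from the query key instead of document.location.search
const getKeycaps = async ({ queryKey }) => {
  const { page } = queryKey[1];
  const start = page * 10 - 10; // Strapi-style _start/_limit paging, 10 per page
  const res = await axios(`${process.env.REACT_APP_STRAPI_API}/keycaps?_start=${start}&_limit=10`);
  return res.data;
};

export default function KeycapIndex() {
  const router = useRouter();
  // router.query.page is undefined on the first render, so default to page 1
  const page = router.query.page == null ? 1 : parseInt(router.query.page, 10);

  // Because page is part of the key, next/link navigation changes the key
  // and React Query refetches; no tab switch needed.
  const { data, status } = useQuery(["keycaps", { page }], getKeycaps);

  if (status === "loading") return <p>Loading…</p>;
  if (status === "error") return <p>Something went wrong.</p>;
  return <pre>{JSON.stringify(data, null, 2)}</pre>;
}
If the flash of loading state on every page change bothers you, react-query's keepPreviousData option keeps the old page on screen while the next one loads.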
I'm trying to prevent the user from liking photos more than once when refreshing the page.
So the current incorrect flow is: click like on desired photo > number increments > refresh the page > click like on desired photo > number increments > refresh the page > *rinse & repeat*
The correct flow should be: click like on desired photo > number increments > refresh the page > click like on desired photo > number decrements > *rinse & repeat*
I know I'll need to map the liked UserIDs into local storage so that the browser remembers which photos were liked but I'm not sure how.
One of my many, many attempts in the else{} block is below. I've completely hit a wall and not sure how to achieve this.
Note: using localStorage is not new to me; as you can see, I'm using it for something unrelated in the return(). However, this mapping-likes-to-localStorage part is what has me stumped.
useEffect(() => {
  const headers = {
    "Accept": 'application/json',
    "Authorization": `Bearer ${authToken}`
  };
  axios.get('http://127.0.0.1:8000/api/get-user-uploads-data', {headers})
    .then(resp => {
      console.log(resp.data);
      setGridData(resp.data);
    }).catch(err => {
      console.log(err);
    });
}, []);
const handleLikesBasedOnUserId = (likedPhotoUserId, userName) => {
  let mapOfLikes = {};
  if(userLikedPhotos[likedPhotoUserId]) {
    // dislike
    delete userLikedPhotos[likedPhotoUserId];
    gridData.find(photo => photo.UserID === likedPhotoUserId).likes--;
    handleDislike(likedPhotoUserId, userName); // Send dislike incrementation via POST request
    // localstorage logic?
  } else {
    // like
    userLikedPhotos[likedPhotoUserId] = true;
    gridData.find(photo => photo.UserID === likedPhotoUserId).likes++;
    // attempt below
    mapOfLikes[likedPhotoUserId] = mapOfLikes[likedPhotoUserId] ?? 1;
    localStorage.setItem('mapOfLikes', JSON.stringify(mapOfLikes));
    handleLike(likedPhotoUserId, userName); // Send like incrementation via POST request
    // localstorage logic?
  }
  // Spread the userLikedPhotos to create a new object and force a rendering
  setUserLikedPhotos({...userLikedPhotos});
};
return (
  <>
    {gridData.map((photos, index) => (
      <div key={index}>
        <span className="likesAmt">❤️ {photos.likes}</span><br/>
        <Button variant="success" onClick={() => handleLikesBasedOnUserId(photos.UserID, photos.name)}>Like</Button><br/>
        <span className="name">{photos.name} {localStorage.getItem('UserID') === photos.UserID ? <h6 className="you">(You)</h6> : null}</span>
      </div>
    ))}
  </>
);
if (localStorage.getItem('mapOfLikes') !== null) {
  // dislikes logic
  localStorage.removeItem('mapOfLikes');
} else {
  // likes logic
  localStorage.setItem('mapOfLikes', JSON.stringify(mapOfLikes));
}
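For completeness, here is a minimal sketch of one way the localStorage persistence could work, reusing the names from the snippet above (userLikedPhotos, setUserLikedPhotos, handleLike, handleDislike). The likes-counter update on gridData is left out to keep it short:
// Inside the same component as above; assumes useEffect/useState are already imported there.
// Hydrate the liked map from localStorage once, on mount.
useEffect(() => {
  const stored = localStorage.getItem("userLikedPhotos");
  if (stored) {
    try {
      setUserLikedPhotos(JSON.parse(stored)); // e.g. { "42": true, "57": true }
    } catch (e) {
      localStorage.removeItem("userLikedPhotos"); // corrupted value, start over
    }
  }
}, []);

const handleLikesBasedOnUserId = (likedPhotoUserId, userName) => {
  // Copy first so state is never mutated in place
  const nextLiked = { ...userLikedPhotos };
  if (nextLiked[likedPhotoUserId]) {
    delete nextLiked[likedPhotoUserId];
    handleDislike(likedPhotoUserId, userName);
  } else {
    nextLiked[likedPhotoUserId] = true;
    handleLike(likedPhotoUserId, userName);
  }
  // Persist the whole map so a refresh remembers what was liked
  localStorage.setItem("userLikedPhotos", JSON.stringify(nextLiked));
  setUserLikedPhotos(nextLiked);
};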
The project aims to study a new social media platform:
https://booyah.live/
My needs are:
1 - Collect data from the profiles that follow a specific profile.
2 - Use this data to have my account follow the collected profiles.
3 - Among other possible options, also unfollow the profiles I follow.
The problem found in the current script:
In theory the profile data is being collected and the script runs all the way to the end, but for a reason I can't pin down, instead of following all the collected profiles it only follows the base profile.
For example:
I want to follow all 250 profiles that follow the ID 123456
I activate the booyahGetAccounts(123456); script
In theory the end result would be my account following 250 profiles
But in the end I only follow the 123456 profile itself, so the count of people I'm following is 1.
Complete Project Script:
const csrf = 'MY_CSRF_TOKEN';
async function booyahGetAccounts(uid, type = 'followers', follow = 1) {
if (typeof uid !== 'undefined' && !isNaN(uid)) {
const loggedInUserID = window.localStorage?.loggedUID;
if (uid === 0) uid = loggedInUserID;
const unfollow = follow === -1;
if (unfollow) follow = 1;
if (loggedInUserID) {
if (csrf) {
async function getUserData(uid) {
const response = await fetch(`https://booyah.live/api/v3/users/${uid}`),
data = await response.json();
return data.user;
}
const loggedInUserData = await getUserData(loggedInUserID),
targetUserData = await getUserData(uid),
followUser = uid => fetch(`https://booyah.live/api/v3/users/${loggedInUserID}/followings`, { method: (unfollow ? 'DELETE' : 'POST'), headers: { 'X-CSRF-Token': csrf }, body: JSON.stringify({ followee_uid: uid, source: 43 }) }),
logSep = (data = '', usePad = 0) => typeof data === 'string' && usePad ? console.log((data ? data + ' ' : '').padEnd(50, '━')) : console.log('━'.repeat(50),data,'━'.repeat(50));
async function getList(uid, type, follow) {
const isLoggedInUser = uid === loggedInUserID;
if (isLoggedInUser && follow && !unfollow && type === 'followings') {
follow = 0;
console.warn('You already follow your followings. `follow` mode switched to `false`. Followings will be retrieved instead of followed.');
}
const userData = await getUserData(uid),
totalCount = userData[type.slice(0,-1)+'_count'] || 0,
totalCountStrLength = totalCount.toString().length;
if (totalCount) {
let userIDsLength = 0;
const userIDs = [],
nickname = userData.nickname,
nicknameStr = `${nickname ? ` of ${nickname}'s ${type}` : ''}`,
alreadyFollowedStr = uid => `User ID ${uid} already followed by ${loggedInUserData.nickname} (Account #${loggedInUserID})`;
async function followerFetch(cursor = 0) {
const fetched = [];
await fetch(`https://booyah.live/api/v3/users/${uid}/${type}?cursor=${cursor}&count=100`).then(res => res.json()).then(data => {
const list = data[type.slice(0,-1)+'_list'];
if (list?.length) fetched.push(...list.map(e => e.uid));
if (fetched.length) {
userIDs.push(...fetched);
userIDsLength += fetched.length;
if (follow) followUser(uid);
console.log(`${userIDsLength.toString().padStart(totalCountStrLength)} (${(userIDsLength / totalCount * 100).toFixed(4)}%)${nicknameStr} ${follow ? 'followed' : 'retrieved'}`);
if (fetched.length === 100) {
followerFetch(data.cursor);
} else {
console.log(`END REACHED. ${userIDsLength} accounts ${follow ? 'followed' : 'retrieved'}.`);
if (!follow) logSep(targetList);
}
}
});
}
await followerFetch();
return userIDs;
} else {
console.log(`This account has no ${type}.`);
}
}
logSep(`${follow ? 'Following' : 'Retrieving'} ${targetUserData.nickname}'s ${type}`, 1);
const targetList = await getList(uid, type, follow);
} else {
console.error('Missing CSRF token. Retrieve your CSRF token from the Network tab in your inspector by clicking into the Network tab item named "bug-report-claims" and then scrolling down in the associated details window to where you see "x-csrf-token". Copy its value and store it into a variable named "csrf" which this function will reference when you execute it.');
}
} else {
console.error('You do not appear to be logged in. Please log in and try again.');
}
} else {
console.error('UID not passed. Pass the UID of the profile you are targeting to this function.');
}
}
This current question is a continuation of that answer from the link:
Collect the full list of buttons to follow without having to scroll the page (DevTools Google Chrome)
Since I can't offer more bounty on that question, I created this one to offer the new bounty to anyone who can fix the bug and make the script work.
Access account on Booyah website to use for tests:
Access by google:
User: teststackoverflowbooyah#gmail.com
Password: quartodemilha
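As a side note about the script itself, the only-follows-the-base-profile symptom looks consistent with the line if (follow) followUser(uid); inside followerFetch: uid there is still the target profile's id, not the ids that were just fetched, so every batch just re-follows the base profile. A minimal sketch of what that branch presumably intends, reusing fetched and followUser from the script above:
// Inside followerFetch: follow each freshly fetched follower id,
// instead of re-following the target uid on every batch
if (follow) {
  for (const followerUid of fetched) {
    followUser(followerUid);
  }
}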
I have to admit that your code is really hard to read; it took me less time to rewrite everything from scratch.
Given that we need a piece of code to be cut/pasted into the JavaScript console of a web browser, and that it has to store some data (i.e. the expiration of followings and the permanent followings), a few considerations are needed.
We can consider the expiration of followings as volatile data: if it is lost, it can simply be reset to one day after the moment we lost it. window.localStorage is a perfect candidate for storing this kind of data. If we change web browser, the only drawback is that we lose the expirations and can tolerate resetting them to one day after the switch.
To store the list of permanent followings, on the other hand, we need a store that survives a browser change. The best idea that came to my mind is to create an alternative account that follows the users we never want to stop following. In my code I used uid 3186068 (a random user); once you have created your own alternative account, just replace the first line of the code block with its uid.
Another thing we need to take care of is error handling: the API can always return errors. The approach I chose is to write myFetch, which retries the same call a few times in case of errors; if the error persists, we are probably facing a temporary booyah.live outage and just need to retry a bit later.
To provide a comfortable interface, the code block gathers the uid from window.location: to follow a user's followers, just cut/paste the code block into a tab opened on their profile. For example, I run the code from a tab open on https://booyah.live/studio/123456?source=44.
Last, to unfollow users the clean function is called 5 minutes after we paste the code (so it does not conflict with the calls that follow followers) and is then executed again one hour after it finishes its job. It is written to access localStorage in an atomic way, so you can have many copies running simultaneously in different tabs of the same browser without worrying about it. The only thing you need to take care of is that when window.location changes, all the JavaScript events in the tab are reset; so I suggest keeping a tab open on the home page, pasting the code block into it, and forgetting about that tab; it will be the tab responsible for unfollowing users. Then open other tabs to do what you need; when you hit a user whose followers you want to follow, paste the block there, wait until the job is finished, and continue using the tab normally.
// The account we use to store followings
const followingsUID = 3186068;
// Gather the loggedUID from window.localStorage
const { loggedUID } = window.localStorage;
// Gather the CSRF-Token from the cookies
const csrf = document.cookie.split("; ").reduce((ret, _) => (_.startsWith("session_key=") ? _.substr(12) : ret), null);
// APIs could have errors, let's do some retries
async function myFetch(url, options, attempt = 0) {
try {
const res = await fetch("https://booyah.live/api/v3/" + url, options);
const ret = await res.json();
return ret;
} catch(e) {
// After too many consecutive errors, let's abort: we need to retry later
if(attempt === 3) throw e;
return myFetch(url, options, attempt + 1);
}
}
function expire(uid, add = true) {
const { followingsExpire } = window.localStorage;
let expires = {};
try {
// Get and parse followingsExpire from localStorage
expires = JSON.parse(followingsExpire);
} catch(e) {
// In case of error (ex. new browsers) simply init to empty
window.localStorage.followingsExpire = "{}";
}
if(! uid) return expires;
// Set expire after 1 day
if(add) expires[uid] = new Date().getTime() + 3600 * 24 * 1000;
else delete expires[uid];
window.localStorage.followingsExpire = JSON.stringify(expires);
}
async function clean() {
try {
const expires = expire();
const now = new Date().getTime();
for(const uid in expires) {
if(expires[uid] < now) {
await followUser(parseInt(uid), false);
expire(uid, false);
}
}
} catch(e) {}
// Repeat clean in an hour
window.setTimeout(clean, 3600 * 1000);
}
async function fetchFollow(uid, type = "followers", from = 0) {
const { cursor, follower_list, following_list } = await myFetch(`users/${uid}/${type}?cursor=${from}&count=50`);
const got = (type === "followers" ? follower_list : following_list).map(_ => _.uid);
const others = cursor ? await fetchFollow(uid, type, cursor) : [];
return [...got, ...others];
}
async function followUser(uid, follow = true) {
console.log(`${follow ? "F" : "Unf"}ollowing ${uid}...`);
return myFetch(`users/${loggedUID}/followings`, {
method: follow ? "POST" : "DELETE",
headers: { "X-CSRF-Token": csrf },
body: JSON.stringify({ followee_uid: uid, source: 43 })
});
}
async function doAll() {
if(! loggedUID) throw new Error("Can't get 'loggedUID' from localStorage: try to login again");
if(! csrf) throw new Error("Can't get session token from cookies: try to login again");
console.log("Fetching current followings...");
const currentFollowings = await fetchFollow(loggedUID, "followings");
console.log("Fetching permanent followings...");
const permanentFollowings = await fetchFollow(followingsUID, "followings");
console.log("Syncing permanent followings...");
for(const uid of permanentFollowings) {
expire(uid, false);
if(currentFollowings.indexOf(uid) === -1) {
await followUser(uid);
currentFollowings.push(uid);
}
}
// Sync followingsExpire in localStorage
for(const uid of currentFollowings) if(permanentFollowings.indexOf(uid) === -1) expire(uid);
// Call first clean task in 5 minutes
window.setTimeout(clean, 300 * 1000);
// Gather uid from window.location
const match = /\/studio\/(\d+)/.exec(window.location.pathname);
if(match) {
console.log("Fetching this user followers...");
const followings = await fetchFollow(parseInt(match[1]));
for(const uid of followings) {
if(currentFollowings.indexOf(uid) === -1) {
await followUser(uid);
expire(uid);
}
}
}
return "Done";
}
await doAll();
The problem: I strongly suspect a booyah.live API bug
To test my code I run it from https://booyah.live/studio/123456?source=44.
If I run it multiple times, I keep getting the following output:
Fetching current followings...
Fetching permanent followings...
Syncing permanent followings...
Following 1801775...
Following 143823...
Following 137017...
Fetching this user followers...
Following 16884042...
Following 16166724...
There is a bug somewhere! The expected output for subsequent executions in the same tab would be:
Fetching current followings...
Fetching permanent followings...
Syncing permanent followings...
Fetching this user followers...
After looking for the bug in my code without success, I checked the booyah.live APIs: if I navigate to the following URLs (the uids are the ones the code keeps following on subsequent executions)
https://booyah.live/studio/1801775
https://booyah.live/studio/143823
https://booyah.live/studio/137017
https://booyah.live/studio/16884042
https://booyah.live/studio/16166724
I can clearly see that I follow them, but if I navigate to https://booyah.live/following (the list of users I follow) I can't find them, not even if I scroll the page to the end.
Since I make exactly the same calls the website does, I strongly suspect the bug is in the booyah.live APIs, specifically in the way they handle the cursor parameter.
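A quick way to double-check that suspicion from the console is to page through the followings list and look for a uid that was just followed. This is purely a debugging sketch, reusing myFetch and loggedUID from the block above; isFollowed is a made-up helper name:
// Walk the followings list page by page and report whether a given uid ever shows up
async function isFollowed(targetUid) {
  let cursor = 0;
  do {
    const data = await myFetch(`users/${loggedUID}/followings?cursor=${cursor}&count=50`);
    if ((data.following_list || []).some(user => user.uid === targetUid)) return true;
    cursor = data.cursor;
  } while (cursor);
  return false;
}

// Right after following 1801775 this should print true; printing false would point at the API
console.log(await isFollowed(1801775));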
I suggest you open a support ticket with the booyah.live support team. You could use the test account you provided us: you already shared the details needed to do that. ;)
I'm generating a PDF using https://pdfgeneratorapi.com/.
Right now I can only show the data one record at a time using the code below. Can anyone suggest how to show all the data with a loop, or in some other way?
The photos below show my template from PDF Generator API.
This is the code I'm using to generate the PDF:
let communicationWay1=[
{0:"dim"},
{1:"kal"}
];
let cstomerExpence1=[
{0:"dim"},
{1:"kal"}
];
let title="test";
let names="test";
let phone="test";
let email="test";
let maritalStatus="test";
let city="test";
let other="test";
const result = await wixData.query(collection)
.eq('main_user_email', $w('#mainE').text)
.find()
.then( (results) => {
if (results.totalCount>0) {
count=1;
// title=results.items[1].title;
names=results.items[0].names;
email=results.items[0].emial;
phone=results.items[0].phone;
maritalStatus=results.items[0].maritalStatus;
city=results.items[0].city;
other=results.items[0].cousterExpenses_other;
title=results.items[0].title;
communicationWay=results.items[0].communicationWay;
cstomerExpence=results.items[0].cstomerExpence;
}
if (results.totalCount>1) {
names1=results.items[1].names;
email1=results.items[1].emial;
phone1=results.items[1].phone;
maritalStatus1=results.items[1].maritalStatus;
city1=results.items[1].city;
other1=results.items[1].cousterExpenses_other;
title1=results.items[1].title;
communicationWay1=results.items[1].communicationWay;
cstomerExpence1=results.items[1].cstomerExpence;
}
} )
.catch( (err) => {
console.log(err);
} );
// Add your code for this event here:
const pdfUrl = await getPdfUrl
({title,names,email,phone,city,maritalStatus,other,communicationWay,cstomerExpence,title1,
names1,email1,phone1,city1,maritalStatus1,other1,communicationWay1,cstomerExpence1
});
if (count===0) { $w("#text21").show();}
else{ $w("#downloadButton").link=wixLocation.to(pdfUrl);}
The code below is the backend/JSW code.
I also want to open the PDF in a new tab. I know "_blank" can be used to open a new tab, but I'm not sure how to combine it with the URL.
import PDFGeneratorAPI from 'pdf-generator-api'
const apiKey = 'MYKEY';
const apiSecret = 'MYAPISECRET';
const baseUrl = 'https://us1.pdfgeneratorapi.com/api/v3/';
const workspace = "HELLO#gmail.com";
const templateID = "MYTEMPLATEID";
let Client = new PDFGeneratorAPI(apiKey, apiSecret)
Client.setBaseUrl(baseUrl)
Client.setWorkspace(workspace)
export async function getPdfUrl(data) {
const {response} = await Client.output(templateID, data, undefined, undefined, {output: 'url'})
return response
}
Just put it in a while loop with a boolean condition.
You can create a variable, for example allShowed, and set its value to false. After that, create another variable, for example numberOfDataToShow, and set it to the number of elements you want to display. Then create a counter, countShowed, initialized to 0.
Now create a while loop: while allShowed is false, you keep looping (and adding data).
Every time a piece of your data is shown, increment countShowed and keep adding/showing data. When countShowed equals numberOfDataToShow, set allShowed to true. The loop stops and all your data will have been shown.
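In code, the idea described above looks roughly like this (showItem is a hypothetical stand-in for however you render or collect one entry):
const items = [ /* the records you want to show, e.g. results.items */ ];
const numberOfDataToShow = items.length;
let countShowed = 0;
let allShowed = false;

while (!allShowed) {
  showItem(items[countShowed]); // hypothetical: render/collect one entry
  countShowed += 1;
  if (countShowed === numberOfDataToShow) {
    allShowed = true; // everything has been shown, stop looping
  }
}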
You would need to use the Container or Table component in PDF Generator API to iterate over a list of items. As @JustCallMeA said, you need to send an array of items. PDF Generator API now has an official Wix Velo (previously Corvid) tutorial with a demo page: https://support.pdfgeneratorapi.com/en/article/how-to-integrate-with-wix-velo-13s8135
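A rough sketch of what that could look like on the Wix page side, assuming the PDF template contains a table/container bound to a list called customers (that name, and the backend module path, are made up for this example):
import wixData from 'wix-data';
import { getPdfUrl } from 'backend/pdfApi'; // hypothetical backend module exposing the getPdfUrl shown above

// Inside the same async event handler as the question's page code.
// Build one array from the query results instead of numbered variables per row.
const results = await wixData.query(collection)
  .eq('main_user_email', $w('#mainE').text)
  .find();

const customers = results.items.map(item => ({
  title: item.title,
  names: item.names,
  email: item.emial, // the collection field is spelled "emial" in the question's code
  phone: item.phone,
  maritalStatus: item.maritalStatus,
  city: item.city,
  other: item.cousterExpenses_other,
  communicationWay: item.communicationWay,
  cstomerExpence: item.cstomerExpence
}));

// The backend call stays the same; the template's table/container iterates over `customers`
const pdfUrl = await getPdfUrl({ customers });

// To open the generated PDF in a new tab, set the button's link and target
$w('#downloadButton').link = pdfUrl;
$w('#downloadButton').target = '_blank';
Whether the PDF opens in a new tab then depends on the button's target property rather than on wixLocation.to, which navigates the current tab.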
I created a simple react-admin application that pulls from a custom REST API. The first page is displayed (default 10 per page). Click the Next button and nothing happens (it still sends page=1 to the API). Click a second time and the page advances to page 2 (page=2), as expected. Click a third time and it goes back to page 1 (page=1).
Then, if you click a fourth time, it goes to page 2, then click again, it goes to page 3, then click again, it goes back to page 1. It continues with this pattern, each round getting one page further before going back to page 1.
I'm able to get the correct results when calling the custom API outside of the react-admin app. I created a custom dataProvider to communicate with the API and maybe there's a problem with the getList function, but I can definitely see the page number passed into this function, and it lines up with the odd results (page 1; then 1, 2, 1; then 1, 2, 3, 1; etc.). The custom API expects the following query string for pagination: ?limit=10&page=1&orderBy=id&orderDir=ASC
The original react-admin tutorial returns 10 records. When I set the page limit to 5, it does seem to work OK (it advances to page 2 on the first click of Next), but without more records it's hard to test completely. My guess is that it would keep working, and that this is most certainly a problem with my code or the API (although, as I said, the API works fine outside the React app).
Here's my getList function:
const httpClient = (url, options = {}) => {
if (!options.headers) {
options.headers = new Headers({ Accept: 'application/json' });
}
const tokens = localStorage.getItem('tokens');
const objToken = JSON.parse(tokens);
options.user = {
authenticated: true,
token: `Bearer ${objToken.accessToken}`
};
return fetchUtils.fetchJson(url, options);
};
export default {
getList: (resource, params) => {
const { page, perPage } = params.pagination;
const { field, order } = params.sort;
const { q } = params.filter;
// Pagination and sort
let query = `limit=${perPage}&page=${page}&orderBy=${field}&orderDir=${order}`;
// Filter?
let useResource = '';
let useFilter = '';
if (q == null) {
// No filter: Use <resource>/ url
useResource = resource;
} else {
// Filter: Use append url with /find
useResource = `${resource}/find`;
useFilter = q;
console.log('useFilter: ', useFilter)
query += `&searchText=${useFilter}`;
}
const url = `${apiUrl}/${useResource}?${query}`;
return httpClient(url)
.then(({ json }) => ({
data: json.results,
total: json.totalRows,
}));
}, ...
Here's a screenshot of the issue:
EDIT:
It looks like the correct query string is being sent, but immediately after the first Next click (page=2), page=1 is automatically sent again, returning to page one. The same seems to happen on subsequent Next clicks as well. Thanks for helping out a newbie, but I just can't figure out why these extra calls back to page 1 are being made.
Fixed in react-admin 3.4.3.
I updated using npm update and pagination works correctly.
I have exactly the same behavior with react-admin 4.x.x.
What I was expecting:
Going to the next page when clicking on Next; with react-admin 3.19 this is how my application worked.
What happened instead:
When you click on the next page, pagination resets to 1!
Also, it does not take into account the pagination that I define:
on Chrome the default perPage is 5, even when I set it to 10.
chrome_pagination_issue
on Firefox the default perPage is 10, but I have the same issue
firefox_pagination_issue
Other information:
getList: (resource, params) => {
const { page, perPage } = params.pagination;
const { field, order } = params.sort;
console.log(params);
const query = {
...fetchUtils.flattenObject(params.filter),
_sort: field,
_order: order,
_start: (page - 1) * perPage,
_end: page * perPage,
_resource:resource
};
const url = `${apiUrl}/${resource}?${stringify(query)}`;
return httpClient(url).then(({ headers, json }) => {
if (!json.hasOwnProperty('totalElements')) {
throw new Error(
"The numberOfElements property must be must be present in the Json response"
);
}
return {
data: json.content,
total: parseInt(json.totalElements,10)
};
});
}
#4658
Environment
React-admin version: 4.0.1 , 4.0.2
React version:18
Strict mode disabled
Browser: chrome, firefox
My backend is a Spring Boot REST API.
I'm scraping a website with Apify. I want to scrape different types of pages and then combine the data into one dataset. Right now I have a separate set of data for each kind of page (users, shots). How can I transfer data between pageFunction executions, e.g. to attach the follower count to each shot's author?
async function pageFunction(context) {
  const { request, log, jQuery } = context;
  const $ = jQuery;
  if (request.url.indexOf('/shots/') > 0) {
    const title = $('.shot-title').text();
    return {
      url: request.url,
      title
    };
  } else if (request.userData.label === "USER") {
    var followers_count = $('.followers .count').first().text();
    return {
      url: request.url,
      followers_count
    };
  }
}
If I understand the question correctly, you can pass the data through crawled pages and save only one item in the end. For this use case, you can work with userData, which you can pass with every request.
For example, if you would like to pass the data from the /shots page to the USER page, you could do it like this (it requires you to enqueue pages manually to control the flow of the data; this approach also expects that the /shots page is the one you visit first and continue from):
async function pageFunction(context) {
  const { request, log, jQuery } = context;
  const $ = jQuery;
  if (request.url.indexOf('/shots/') > 0) {
    const title = $('.shot-title').text();
    const userLink = 'some valid url to user page'
    // add to the queue your request with the title in the userData
    await context.enqueueRequest({
      url: userLink,
      userData: {
        label: 'USER',
        shotsTitle: title
      }
    })
  } else if (request.userData.label === "USER") {
    var followers_count = $('.followers .count').first().text();
    // here you need to get the shotsTitle and return it
    return {
      url: request.url,
      followers_count,
      shotsTitle: request.userData.shotsTitle
    };
  }
}
If you need to share data between runs of the actor, that is another topic; let me know if this helped.
I would also recommend going through the Apify getting started guide.