Issue with Pagination in nodejs with mongoose and query optimization - javascript

I'm having an issue with a database query in Mongoose: I set a value but don't get the correct result, and I'm not sure why. I also want to optimize the database queries. With Mongoose I have to make one query to count how many records match the query params (for pagination) and a separate query with model.find({}) to fetch the actual records.
But the actual problem is with the pagination details I am trying to build.
For example, in the code below, if I set page = 1 and page_size = 10 and my row_count is 3, then I expect to get from = 1 and to = 1, but instead I am getting from = 1 and to = 11.
Not sure what I am doing wrong here.
const pagination = async (model, query, page_number, page_size, order, order_by, next) => {
  const pageS = parseInt(page_number)
  let page = +pageS || 1;
  const limit = parseInt(page_size)
  let per_page = +page_size || 10;
  if (page < 1) {
    page = 1;
  }
  if (per_page < 1) {
    per_page = 1;
  }
  const startIndex = (page - 1) * per_page;
  const endIndex = page * page_size
  const key = `${order}`
  const results = {}
  // here reading the data count from the database
  const resultCount = await model.countDocuments(query).exec();
  if (endIndex < resultCount) {
    results.next = {
      page: page + 1,
      page_size: limit
    }
  }
  if (startIndex > 0) {
    results.previous = {
      page: page - 1,
      page_size: limit
    }
  }
  try {
    // here trying to search the query with the applied pagination
    const data = await model.find(query)
      .limit(per_page)
      .skip(startIndex)
      .sort({ [key]: order_by })
      .exec()
    // here I am passing the details but not getting the exact to and from; from works as expected but to does not
    // Example: if I set page = 1, page_size = 10 and my row_count is 3 then I expect from = 1 and to = 1, but instead I am getting from = 1 and to = 11.
    const pagination_details = {
      data: data,
      meta: {
        page,
        page_size: per_page,
        row_count: parseInt(resultCount, 10),
        page_count: Math.ceil(resultCount / per_page),
        from: startIndex + 1,
        to: endIndex + 1,
        order: order,
        order_by: order_by
      }
    }
    return pagination_details
    next()
  } catch (e) {
    console.log(e);
    console.error(e);
  }
};
Can anyone help me get the right data and point out what mistake I am making here? It is probably a logical mistake.

You have forgotten to divide the start and end indexes by per_page to get page numbers; try replacing:
from:startIndex + 1,
to: endIndex + 1,
with:
from: Math.floor(startIndex / per_page) + 1,
to: Math.ceil(endIndex / per_page) + 1,
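Separately, the question also asks about optimizing the two database calls (count plus find). Below is a minimal sketch, not the poster's code and not part of the answer above: it assumes the two calls stay separate but can run concurrently, and it assumes from/to are meant as 1-based record positions clamped to the row count, which is a different reading than the page-number interpretation above.
// Sketch: variable names (model, query, per_page, startIndex, key, order_by) follow the question's function.
// Run the count and the page query concurrently; both return promises from .exec().
const [resultCount, data] = await Promise.all([
  model.countDocuments(query).exec(),
  model.find(query).limit(per_page).skip(startIndex).sort({ [key]: order_by }).exec(),
]);
// Treat from/to as record positions on the current page, clamped to the row count (an assumption).
const from = resultCount === 0 ? 0 : startIndex + 1;
const to = Math.min(startIndex + per_page, resultCount);
// e.g. page = 1, per_page = 10, resultCount = 3  ->  from = 1, to = 3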

Related

Lambda too slow to query DynamoDB with recursive promises - javascript, sharding

I'm trying to fetch the most recently created items in a DynamoDB table. For that I'm using a pattern described by Alex DeBrie in his DynamoDB book, plus sharding.
When a new item is created in the table, it also feeds a GSI with a GSI1PK made of the item's creation day plus a random shard number between 0 and 9. The SK is the item's unique ID.
GSI1
GSI1PK: truncated timestamp#[0-9]
GSI1SK: item id
There can be a few dozen recently created items, or thousands.
To fetch the most recent items I have three parameters:
Date: the current day
Limit: the total number of items to fetch
Days: the number of days back to look for items
As suggested in Alex DeBrie's book, the method to retrieve the items is a recursive function with promises.
The problem I'm facing is that my Lambda function is very slow.
In the scenario where not many items were created recently, the function has to go through all the days and shards one after another to fetch items.
For example:
If I want to fetch the last 100 items from the last 7 days and there are fewer than 100 items spread across the shards, the function will go through 70 queries (7 days x 10 shards) and take around 10 seconds to finish.
On the contrary, if I want to fetch 100 items from the last 7 days and hundreds of items were created recently, then it will take around a second to run.
Items are small, around 400 bytes each.
I'm running an on-demand capacity DynamoDB table.
Lambda is configured with memorySize: 1536MB
Node.js 16.x
Any ideas how I can make this run faster?
// dynamoDb and truncateTimestamp are defined elsewhere in the original code.
const getQueryParams = (createdAt, shard, limit) => {
  const params = {
    TableName: "table",
    IndexName: 'GSI1',
    KeyConditionExpression: "#gsi1pk = :gsi1pk",
    ExpressionAttributeNames: {
      "#gsi1pk": 'GSI1PK'
    },
    ExpressionAttributeValues: {
      ":gsi1pk": `${truncateTimestamp(createdAt).toISOString()}#${shard}` // e.g. 2023-02-09T00:00:00.000Z#8
    },
    ScanIndexForward: false,
    Limit: limit
  };
  return params;
}
const getItems = async () => {
  const items = []
  const number_of_days = 3;
  const getLatestItems = async ({ createdAt = new Date(), limit = 100, days = 0, shard = 0 }) => {
    const query_params = getQueryParams(createdAt, shard, limit);
    let max_items_to_fetch = limit;
    return dynamoDb.query(query_params).then(
      (data) => {
        // process data.
        if (data.Items) {
          data.Items.forEach((item) => {
            if (items.length < limit) {
              items.push(item);
            }
          })
          max_items_to_fetch = limit - data.Items.length;
        }
        if (items.length >= limit) {
          return items;
        }
        if (shard < 9) {
          // same day, next shard
          let params = {
            createdAt: new Date(createdAt.setDate(createdAt.getDate())),
            limit: max_items_to_fetch,
            days: days,
            shard: shard + 1,
          }
          return getLatestItems(params);
        } else if (days < number_of_days) {
          // previous day, first shard
          let params = {
            createdAt: new Date(createdAt.setDate(createdAt.getDate() - 1)),
            limit: max_items_to_fetch,
            days: days + 1,
            shard: 0,
          }
          return getLatestItems(params);
        }
        return items;
      },
      (error) => {
        throw new Error('Error getting all recent items')
      }
    );
  }
  return getLatestItems({});
};
export const main = async (event) => {
  const start = Date.now();
  const itemPromises = getItems();
  const res = await Promise.all([itemPromises]);
  const end = Date.now();
  console.log(`Execution time: ${end - start} ms`);
};
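Not part of the original post, but a direction often suggested for this layout: query all 10 shards of a day in parallel and only fall back to the previous day if the page is still not full. The sketch below reuses the question's getQueryParams and dynamoDb client, assumes dynamoDb.query() returns a promise (as the original .then() usage implies), and does not address ordering of results across shards.
// Sketch: one query per shard per day, issued concurrently, newest day first.
const getRecentItems = async ({ createdAt = new Date(), limit = 100, days = 7 } = {}) => {
  const items = [];
  for (let day = 0; day < days && items.length < limit; day++) {
    // Walk back one day at a time.
    const date = new Date(createdAt);
    date.setDate(date.getDate() - day);
    const remaining = limit - items.length;
    // Issue the 10 shard queries for this day in parallel.
    const results = await Promise.all(
      Array.from({ length: 10 }, (_, shard) =>
        dynamoDb.query(getQueryParams(date, shard, remaining))
      )
    );
    results.forEach((result) => {
      (result.Items || []).forEach((item) => {
        if (items.length < limit) items.push(item);
      });
    });
    // Note: items gathered from different shards may still need a final sort by creation time.
  }
  return items;
};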

How to make an algorithm for complex pagination in JS?

I'm trying to make pagination work, but it's a bit complex for me, so I need help. If someone has an idea how to do it, I would be thankful.
I need to show items that the user owns. One page can display a maximum of 12 items.
I have a pageData array that returns the number of items for each category. The category "shirts" has 2 items.
pageData = [
  { id: 'pants', totalItems: 15 },
  { id: 'shirts', totalItems: 2 },
  { id: 'dresses', totalItems: 13 }
]
On the first load I need to show 12 items (let's say 12 pants); on the second load (a click on the "load more" button) I need to show 3 more pants, 2 shirts and 7 dresses.
Metadata for an item has to be fetched via an API (this is just an example of how to get the id of a specific item; I'm trying to work it into the code somehow but don't have a clear idea how):
const pantsTotal = pageData.find((category) => category.id === 'pants').totalItems;
for (let i = 0; i < pantsTotal; i++) {
  const metadata = await api.getMetadata({
    userId: 'userId',
    id: i,
  });
}
This is some code that I have, but like I said, I don't really have a clear idea how to finish it:
const getItems = async (pageData, currentPage) => {
  const itemsPerPage = 12;
  let fetchedData = [];
  let total = 0;
  let start = currentPage * itemsPerPage;
  for (let i = 0; i < pageData.length; i++) {
    const minRange = total;
    const maxRange = minRange + itemsPerPage;
    if (start >= minRange && start <= maxRange) {
      const minId = minRange - total;
      const maxId = maxRange - total;
      for (let j = minId; j < maxId; j++) {
        const metadata = await api.getMetadata({
          userId: 'userId',
          id: j,
        });
        fetchedData.push(metadata);
      }
    }
    total += itemsPerPage;
  }
};
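This is not from the original post, but one way to sketch the algorithm: expand pageData into a flat, ordered list of (category, local index) references, slice out the 12 references for the requested page, and only then fetch metadata. The extra category field passed to api.getMetadata is an assumption, since the original call only shows userId and id.
// Sketch: flatten categories into one ordered list of { id: category, index } references.
const getItemsForPage = async (pageData, currentPage, itemsPerPage = 12) => {
  const refs = pageData.flatMap(({ id, totalItems }) =>
    Array.from({ length: totalItems }, (_, index) => ({ id, index }))
  );
  const start = currentPage * itemsPerPage;
  const pageRefs = refs.slice(start, start + itemsPerPage);
  // Fetch metadata only for the items on this page.
  return Promise.all(
    pageRefs.map(({ id, index }) =>
      api.getMetadata({ userId: 'userId', id: index, category: id }) // category field is assumed
    )
  );
};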

Why can't I get the result sorted after using pagination?

I am learning how to do pagination. I got a task to use skip and limit, and then to sort on top of that. I'm using a REST API with query parameters to pass the required values: skip, limit and sort.
My logic gets skip and limit working when I hit the GET API, but sorting doesn't work. Can anyone provide a workaround and also explain why mine isn't working?
Code:
const getAliens = async (req, res) => {
  try {
    const skip = parseInt(req.query.skip);
    const limit = parseInt(req.query.limit);
    const sort = req.query.sort;
    const sortOn = {};
    if (sort != null) {
      const key = sort.split(":")[0];
      const value = sort.split(":")[1] == "asc" ? 1 : -1;
      sortOn[key] === value;
    }
    const startIndex = (skip - 1) * limit;
    const endIndex = skip * limit;
    const response = await Alien.find()
      .sort(sortOn)
      .find()
      .limit(limit)
      .skip(startIndex)
      .exec();
    return ResponseUtils.success(res, response);
  } catch (e) {
    Log.error(e);
    return ResponseUtils.error(res);
  }
};
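Not from the original post, but for comparison, a common way to build the sort object looks like the sketch below. Note that it assigns with = (the code above uses ===, which only compares and discards the result) and skips sorting when no sort parameter is passed.
// Sketch: build a Mongoose sort object from a "field:asc" / "field:desc" query string.
const buildSort = (sort) => {
  const sortOn = {};
  if (sort) {
    const [key, direction] = sort.split(":");
    sortOn[key] = direction === "asc" ? 1 : -1; // '=' assigns; '===' only compares
  }
  return sortOn;
};
// e.g. buildSort("name:asc") -> { name: 1 }, usable as Alien.find().sort(buildSort(req.query.sort))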

Looking for a Cleaner Way To Make this Pagination Function in JavaScript

I want to write a method that takes an offset and a limit along with a passed-in array of objects. The objects always have an id column. I want to return a new array with the result based on the offset and limit. Here is an example, followed by my implementation, which I'm not happy with (too many fence and fence-post-like variables, which are always error-prone). Also, if limit == -1, then take the rest.
Maybe there is a better way with slice? Could reduce somehow help?
const baseArray = [{name: 'peter'},{name: 'terry'},{name: 'tammy'},{name: 'mary'}];
const offset = 1;
const limit = 2;
const speakerArray = getPaginatedData(baseArray,offset,limit);
speakerArray is [{ name: 'terry', cursor: 'dGVycnk=' }, { name: 'tammy', cursor: 'dGFtbXk=' }];
where the cursors are calculated with this line of code:
console.log(Buffer.from('terry').toString("base64"));
Here is my implementation, which I don't like:
const speakerArray = baseArray
  .filter((rec, index) => {
    return index > offset - 1 && (offset + limit > index || limit == -1);
  })
  .map((rec) => {
    rec.cursor = Buffer.from(rec.name.toString()).toString("base64");
    return rec;
  });
Well, the implementation with slice is
function getPaginatedData(array, offset, limit) {
  if (limit < 0) return array.slice(offset);
  return array.slice(offset, offset + limit);
}
const baseArray = [
  { name: "peter" },
  { name: "terry" },
  { name: "tammy" },
  { name: "mary" },
];
const offset = 1;
const limit = 2;
const speakerArray = getPaginatedData(baseArray, offset, limit);
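For completeness, a small sketch (my combination, not code from the post) that pairs the slice-based helper with the cursor mapping from the question, returning new objects instead of mutating the input:
function getPaginatedDataWithCursor(array, offset, limit) {
  const page = limit < 0 ? array.slice(offset) : array.slice(offset, offset + limit);
  // Build the base64 cursor from the name, as in the question, without mutating the originals.
  return page.map((rec) => ({
    ...rec,
    cursor: Buffer.from(rec.name.toString()).toString("base64"),
  }));
}
// getPaginatedDataWithCursor(baseArray, 1, 2) -> [{ name: 'terry', cursor: 'dGVycnk=' }, { name: 'tammy', cursor: 'dGFtbXk=' }]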

Best way to paginate with Mongoose and Node.js

I'm currently learning Mongoose and I have been building a pagination system. I wrote two versions, but I wonder which one of them is the better way to do what I want, in terms of performance and everything else.
Script 1:
app.get("/:page", (req, res) => {
  post.find({}).then((data) => {
    let per_page = 2; // set how many posts per page
    let num_page = Number(req.params.page);
    let max_pages = Math.ceil(data.length / per_page);
    if (num_page == 0 || num_page > max_pages) {
      res.render('404');
    } else {
      let starting = per_page * (num_page - 1)
      let ending = per_page + starting
      res.render('posts', { posts: data.slice(starting, ending), pages: max_pages, current_page: num_page });
    }
  });
});
Script 2:
app.get("/:page", (req, res) => {
  post.count({}, (err, len) => {
    let per_page = 2; // set how many posts per page
    let num_page = Number(req.params.page);
    let max_pages = Math.ceil(len / per_page);
    if (num_page == 0 || num_page > max_pages) {
      res.render('404');
    } else {
      let starting = per_page * (num_page - 1);
      let ending = per_page + starting;
      let promise = post.find({}).limit(per_page).skip(starting);
      promise.then((data) => {
        res.render('posts', { posts: data, pages: max_pages, current_page: num_page });
      });
    }
  });
});
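A side note, not from the original post: recent Mongoose versions deprecate Model.count() in favor of countDocuments() / estimatedDocumentCount(), so a sketch of Script 2 rewritten with async/await and countDocuments() might look like this:
// Sketch: same behavior as Script 2, but with async/await and countDocuments().
app.get("/:page", async (req, res) => {
  const per_page = 2; // set how many posts per page
  const num_page = Number(req.params.page);
  const len = await post.countDocuments({});
  const max_pages = Math.ceil(len / per_page);
  if (num_page == 0 || num_page > max_pages) {
    return res.render('404');
  }
  const starting = per_page * (num_page - 1);
  const data = await post.find({}).limit(per_page).skip(starting);
  res.render('posts', { posts: data, pages: max_pages, current_page: num_page });
});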
