I'm creating a REST API that fetches raw data from the internet, applies a regex to it, and returns the result in JSON format.
This is my function for getting the data as JSON.
First I use got() to fetch the raw data, then I apply ANIME_LIST_REGEX.exec() to extract the matching part so it can be parsed as JSON.
async function getAnimeList(url) {
  const {body} = await got(url);
  // ANIME_LIST_REGEX is defined elsewhere; capture group 1 holds the raw list
  let ANIME_LIST_DATA = ANIME_LIST_REGEX.exec(body);
  if (!ANIME_LIST_DATA) {
    return null;
  }
  return {
    animeList: ANIME_LIST_DATA[1]
  };
}
In this endpoint I'm retrieving the data from the first function, parsing the JSON, then returning it as the response.
app.get('/anime-list', async (req, res, next) => {
  const appData = await getAnimeList(URL_BASE_ANIME_LIST);
  // note: getAnimeList() can return null, which would make the next line throw
  var listJson = JSON5.parse(appData.animeList);
  res.json(listJson);
});
The issue is that the returned array is pretty big (around 5,000 objects), so the request takes a long time to return and render the array.
What I want to do is return a chunk of that array each time I call the function or hit the endpoint.
I've tried several methods, but none of them made sense.
Does anyone have an idea?
You can cut the large array down to a small chunk with Array.prototype.slice() (preferable to splice(), which mutates the array and would corrupt a cached copy). To determine the range of the chunk, you can pass query parameters to the endpoint.
app.get("/anime-list", async (req, res, next) => {
const appData = await getAnimeList(URL_BASE_ANIME_LIST)
var listJson = JSON5.parse(appData.animeList)
const from = req.query.from || 0
const count = req.query.count || 100
res.json(listJson.splice(from, count))
})
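For example, GET /anime-list?from=200&count=100 (hypothetical values) would return entries 200-299 of the list.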
However, as others have mentioned, calling getAnimeList() on every request will cause another performance problem. I highly suggest you refactor the function to cache the result.
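For example, a minimal module-level cache could look like the sketch below; the TTL value and the variable names are assumptions, not part of the original code:

// Sketch: fetch the list once and reuse it for subsequent requests.
const CACHE_TTL_MS = 10 * 60 * 1000; // assumed refresh interval: 10 minutes
let cache = { data: null, fetchedAt: 0 };

async function getAnimeListCached(url) {
  const now = Date.now();
  if (cache.data && now - cache.fetchedAt < CACHE_TTL_MS) {
    return cache.data; // serve the cached copy
  }
  const appData = await getAnimeList(url); // the function from the question
  cache = { data: appData, fetchedAt: now };
  return appData;
}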
Actually, I'm a new MongoDB/Express developer, and I want to optimize my backend data fetching.
I have tried many ways, but they didn't work.
I don't want to have to pass the full name as the parameter or search text.
I'm confused: is there any way to build a query that will get me partially matched data from MongoDB, the way includes("some text") works?
Here is my code. It works fine, but I don't want to filter after fetching:
//Getting products using a search query
app.get("/products", async (req, res) => {
  //I want to use this query to optimize data fetching
  let query = {}; //Optimization needed
  //To search products by product name (my current solution)
  if (req.query.name) {
    // query = {name: req.query.name} // Only gives a result for an exact match
    const cursor = productsCollection.find(query); // fetches every document
    const products = await cursor.toArray();
    const searchText = req.query.name.toLowerCase();
    // filtering in application code, after everything has been fetched
    const filteredProducts = products.filter((product) =>
      product.name.toLowerCase().includes(searchText)
    );
    res.send(filteredProducts);
  }
  //To get all products
  else {
    const cursor = productsCollection.find(query);
    const products = await cursor.toArray();
    res.send(products);
  }
});
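A partial match like includes() can be pushed into the query itself with MongoDB's $regex operator, so only matching documents are fetched. A minimal sketch (note: escaping of regex special characters in the user input is omitted here):

// Sketch: let MongoDB do the partial, case-insensitive match server-side.
app.get("/products", async (req, res) => {
  let query = {};
  if (req.query.name) {
    // "i" makes the match case-insensitive, like includes() after toLowerCase()
    query = { name: { $regex: req.query.name, $options: "i" } };
  }
  const products = await productsCollection.find(query).toArray();
  res.send(products);
});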
Background: I've been trying for the last two days to resolve this myself by looking at various examples from both this website and others, and I'm still not getting it. Whenever I try adding callbacks or async/await I get nowhere. I know this is where my problem is, but I can't resolve it myself.
I'm not from a programming background :( I'm sure it's a quick fix for the average programmer; I am well below that level.
When I console.log(final) within the 'ready' block it works as it should. When I escape that block, the output is 'undefined' if I console.log(final), or the GET request/server info if I use console.log(ready).
const request = require('request');

const ready =
  // I know 'request' is deprecated, but given my struggle with async/await (and callbacks)
  // in general, when I tried switching to axios I found it more confusing.
  request({url: 'https://www.website.com', json: true}, function(err, res, returnedData) {
    if (err) {
      throw err;
    }
    var filter = returnedData.result.map(entry => entry.instrument_name);
    var str = filter.toString();
    var addToStr = str.split(",").map(function(a) { return `"trades.` + a + `.raw", `; }).join("");
    var neater = addToStr.substr(0, addToStr.length - 2);
    var final = "[" + neater + "]";
    // * * * Below works here but not outside this block * * *
    // console.log(final);
  });

// console.log(final);
// returns 'final is not defined'

console.log(ready);
// returns server info of the GET request endpoint, because it logs before the data
// has actually been returned; the call is not being treated as async.

module.exports = ready;
Below is a short example of the JSON that is returned by website.com. The actual call has 200+ 'result' objects.
What I'm ultimately trying to achieve is:
1) return all values of "instrument_name";
2) perform some manipulations (adding 'trades.' to the beginning of each value and '.raw' to the end of each value);
3) place these manipulated values into an array:
["trades.BTC-26JUN20-8000-C.raw","trades.BTC-25SEP20-8000-C.raw"]
4) export/send this array to another file;
5) use the array as part of another request in a websocket connection. The array cannot be hardcoded into this new request, as its values change daily.
{
  "jsonrpc": "2.0",
  "result": [
    {
      "kind": "option",
      "is_active": true,
      "instrument_name": "26JUN20-8000-C",
      "expiration_timestamp": 1593158400000,
      "creation_timestamp": 1575305837000,
      "contract_size": 1
    },
    {
      "kind": "option",
      "is_active": true,
      "instrument_name": "25SEP20-8000-C",
      "expiration_timestamp": 1601020800000,
      "creation_timestamp": 1569484801000,
      "contract_size": 1
    }
  ],
  "usIn": 1591185090022084,
  "usOut": 1591185090025382,
  "usDiff": 3298,
  "testnet": true
}
Looking at your code, there are two problems related to the final and ready variables. The first is that you're trying to console.log(final) outside of its scope.
The second problem is that request doesn't immediately return the result of your API call. The reason is pretty simple: you're doing an asynchronous operation, and the result is only delivered to your callback. Your ready variable is just a reference to your request object.
I'm not sure about the context of your code or why you want to module.exports the ready variable, but I suppose you want to export the result. If that's the case, I suggest you export an async function that returns the response data instead of your request variable. This way you can control how to handle the response outside the module.
You can use the built-in fetch API instead of the deprecated request. I changed your code so that your module exports an asynchronous function called fetchData, which you can import somewhere and execute. It returns the manipulated names as a real array rather than a hand-built bracketed string (your original snippet also referenced an undeclared filter variable):
module.exports = {
  fetchData: async function fetchData() {
    try {
      // fetch() takes the URL directly; the body must be parsed as JSON explicitly
      const response = await fetch("https://www.website.com/");
      const returnedData = await response.json();
      const names = returnedData.result.map(entry => entry.instrument_name);
      // Build ["trades.<name>.raw", ...] directly instead of assembling a string
      return names.map(name => `trades.${name}.raw`);
    } catch (error) {
      console.error(error);
    }
  }
};
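A quick usage sketch (the './dataFetcher' path is a hypothetical name for the module above):

// Sketch: importing and calling the exported function from another file.
const { fetchData } = require('./dataFetcher');

fetchData().then(instruments => {
  console.log(instruments); // e.g. ["trades.26JUN20-8000-C.raw", ...]
});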
I hope this helps; otherwise please share more of your code. Much depends on where you want to display the fetched data, and on how you handle the loading and error states.
EDIT:
I can't get responses from this website, because you need an account as well as credentials for the API. Judging from your code and your questions:
1) return all values of "instrument_name"
Your map function works:
var filter = returnedData.result.map(entry => entry.instrument_name);
2) perform some manipulations (adding 'trades.' to the beginning of each value and '.raw' to the end of each value).
3) place these manipulations into an array. ["trades.BTC-26JUN20-8000-C.raw","trades.BTC-25SEP20-8000-C.raw"]
Both steps can be done with this one map call:
const manipulatedData = filter.map(val => `trades.${val}.raw`);
You can now use manipulatedData in your next request. Whether you can export this variable depends on the component you use it in. To be honest, regarding the websocket, it sounds easier to me not to split this logic into two separate components.
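As a rough illustration of point 5, a subscription over a websocket could look like the sketch below; the 'ws' package usage is real, but the endpoint URL and the subscribe-message shape are assumptions, so check the API's docs for the real format:

// Sketch only: the URL and the message shape below are placeholders.
const WebSocket = require('ws');
const { fetchData } = require('./dataFetcher'); // hypothetical module from above

const ws = new WebSocket('wss://www.website.com/ws');

ws.on('open', async () => {
  const channels = await fetchData(); // ["trades.<name>.raw", ...]
  ws.send(JSON.stringify({ method: 'subscribe', params: { channels } }));
});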
I am making an asynchronous request to a database and then running a loop on the resulting data, but I am getting only the last value when sending the response to the front end.
routes.post('/data/qualitative/bivariate', async (req, res) => {
  const { colName1, colName2 } = req.body;
  var colNameObj1 = {};
  var colNameArray1 = {};
  colNameObj1[colName1] = 1;
  colNameObj1[colName2] = 1;
  colNameObj1['_id'] = 0;
  //requesting data from database
  const data = await dataModel.find({}, colNameObj1);
  //filtering the data
  const newData = data.map((item) => {
    colNameArray1['x'] = item[colName1];
    colNameArray1['y'] = item[colName2];
    return colNameArray1;
  });
  //in response i am getting just the data from the last index
  res.json(newData);
});
In the response I am getting just the data from the last index. Please advise how I can handle this asynchronous request.
You must declare colNameArray1 inside the map function:
routes.post('/data/qualitative/bivariate', async (req, res) => {
  const { colName1, colName2 } = req.body;
  var colNameObj1 = {};
  colNameObj1[colName1] = 1;
  colNameObj1[colName2] = 1;
  colNameObj1['_id'] = 0;
  //requesting data from database
  const data = await dataModel.find({}, colNameObj1);
  const newData = data.map((item) => {
    var colNameArray1 = {}; // a fresh object per iteration
    colNameArray1['x'] = item[colName1];
    colNameArray1['y'] = item[colName2];
    return colNameArray1;
  });
  res.json(newData);
});
If you want all the rows, then you should push each object you create into an array. In the above code, you are overwriting the same object.
Try updating the code to the following:
routes.post('/data/qualitative/bivariate', async (req, res) => {
  const { colName1, colName2 } = req.body;
  var colNameObj1 = {};
  const dataList = []; // create an empty array
  colNameObj1[colName1] = 1;
  colNameObj1[colName2] = 1;
  colNameObj1['_id'] = 0;
  //requesting data from database
  const data = await dataModel.find({}, colNameObj1);
  data.forEach((item) => {
    // the object must be created inside the loop; pushing the same outer
    // object would store the same reference over and over
    const colNameArray1 = {};
    colNameArray1['x'] = item[colName1];
    colNameArray1['y'] = item[colName2];
    dataList.push(colNameArray1); // push to the array
  });
  res.json(dataList);
});
I found the issue, and it has nothing to do with the asynchronous request; as mentioned above, it's confirmed that I was overwriting the object. Cloning the object using spread ({...obj}) solved the issue. Thanks for the help.
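For completeness, the spread-based fix the asker describes presumably changes only the return line of the original map (a sketch):

// Sketch: push a shallow clone instead of the shared reference.
const newData = data.map((item) => {
  colNameArray1['x'] = item[colName1];
  colNameArray1['y'] = item[colName2];
  return { ...colNameArray1 }; // shallow copy, so each entry stays distinct
});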
I'm building an application using Node/Express/MongoDB (my first time with all of these) that will let me pull data from the DB and display it on an Express page. This is what the GET request looks like:
var str = "";
app.route('/view-reports').get(function(req, res) {
var cursor = collections.find({});
cursor.each(function(err, item) {
if (item != null) {
console.log(str);
str = str + "Substance: " + item.substance + "<br>";
}
if (err) {
console.log(err);
}
});
console.log(str);
res.send(str);
str = "";
});
I would expect this to return something like this:
Substance: a
Substance: b
Substance: c
However, the initial request does not return anything at all. The second request will return the above. If I enclose res.send(str) in an if conditional it simply will not load until a second request is made.
cursor.each() is asynchronous. That means it runs sometime LATER, after your res.send(str), so you get the previous version of str. You need to collect all the data first and send your response only when you have all the data.
If you want all the data, then you could use promises and .toArray() like this:
app.route('/view-reports').get(function(req, res) {
  collections.find({}).toArray().then(data => {
    let result = data.map(item => {
      return "Substance: " + item.substance + "<br>";
    }).join("");
    res.send(result);
  }).catch(err => {
    // database error
    console.log(err);
    res.sendStatus(500);
  });
});
Note: This also wisely gets rid of the str variable, which lived outside the scope of the request handler and could easily lead to a concurrency bug when multiple requests (from different users) were in flight at the same time.
Create a router specifically for substances and use it in app. Instead of <br> breaks you could build a ul; also, that processing should happen on the front end. Separate your concerns: the server shouldn't have to worry about rendering. One purpose per process.
Routers can be created per resource: create a router for substances, for cats, for dogs. Each individual router has its own GET, POST, DELETE, and PUT handlers that modify that resource, and app can use all the routers at once.
app.use(catRouter);
app.use(mooseRouter);
app.use(platypusRouter);

const { Router } = require('express');
const createError = require('http-errors');

let substanceRouter = new Router();

function buildElement(arr) {
  let start = '';
  arr.forEach(val => {
    if (!val) return;
    start += `Substance : ${val.substance}<br>`;
  });
  return start;
}

substanceRouter.get('/endpoint/whatever', function(req, res, next) {
  collections.find({}).toArray()
    .then(results => {
      if (!results) throw new Error('Resource Not Found');
      let output = buildElement(results);
      res.json(output);
      next();
    })
    .catch(err => next(createError(404, err.message)));
});

app.use(substanceRouter);
Alternately we can write:
let output = results
  .filter(sub => !!sub)
  .map(sub => `Substance : ${sub.substance}`)
  .join('<br>');
res.json(output);
But be advised this adds memory overhead: it generates completely new arrays to hold the intermediate results, consuming at worst O(n) memory.
Hi people, and happy holidays!
I'm trying to consume a stream of CSV rows with Highland. To do some special processing and avoid passing the headers down the stream, I'm calling .consume(), and then I wanted the outcome in an array. The problem is that the callback of .toArray() is never called. I'm just curious about this, as I've already changed my solution to .map().toArray(), but I still think .consume() would be a more elegant solution. ;) This is the piece of code that reads from a CSV file and processes the rows:
const fs = require('fs');
const _ = require('highland');

const dataStream = fs.createReadStream('file.csv', 'utf8');

_(dataStream)
  .split()
  .consume((err, row, push, next) => {
    console.log('consuming a row', row); // <-- this shows data
    if (err) {
      push(err);
      return next();
    }
    // This condition is needed as .consume() passes an extra {} at the end
    // of the stream <-- don't know why this happens!!
    if (typeof row !== 'string') {
      return next();
    }
    const cells = row.split(',');
    if (cells[0] === 'CODE') { // <-- header row
      const { columnNames, columnIndexes } = processHeader(cells); // <-- not relevant, works okay
      return next();
    }
    console.log('processing content here', processRow(cells)); // <-- not relevant, works okay
    push(null, processRow(cells));
    return next();
  })
  .toArray(rows => console.log('ROWS: ', rows)); // <-- I don't get anything here
Thanks a lot!
Both consume and toArray consume a stream, and you can only consume a stream once. If you try to consume it multiple times, the stream will be empty.
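As for the extra value the question's comment mentions: per Highland's documented consume() contract, the last value handed to the handler is the end-of-stream marker _.nil, and the handler is expected to forward it with push() so downstream operators like toArray() know the stream has ended; calling next() on it instead keeps the stream waiting forever. A minimal sketch of that pattern:

// Sketch, based on Highland's documented consume() contract.
.consume((err, row, push, next) => {
  if (err) {
    push(err);
    return next();
  }
  if (row === _.nil) {
    // forward the end-of-stream marker so .toArray() can fire
    return push(null, row);
  }
  push(null, row); // process/transform the row here as needed
  return next();
})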