JS functions with different parameters communicating results - javascript

I'm not sure if this is possible and can't wrap my head around it. I have a larger project where I want to combine a couple of callbacks in different files into one and simply get the data from the different functions. My issue is that I can't seem to fetch the last part which I'll demonstrate in the code.
File one, importing file three and calling the sendUpdate function with 'note' as param.
const three = require("./three")
const note = "test"
three.sendUpdate(note);
File two,
async function getUser(user) {
  return await otherfileFarAway.otherFunctionFarAway(user);
}

module.exports = {
  getUser
}
File three, where I want to bundle these two up. I'm importing file two
const two = require("./two");

async function sendUpdate(note) {
  const total = await two.getUser(user);
  console.log(note); // Works
  console.log(total); // Undefined, duuh
  try {
    const url
      .concat(total)
      .concat("/Notes");
    const result = await axios.post(
      url,
      {
        Comment: note
      },
    );
    return result.data;
  } catch (error) {
    logger.axiosError(error);
    return null;
  }
}

module.exports = {
  sendUpdate
}
How would I actually just call getUser in file two from file three and get the value it is getting in file two? Is that possible? If I call it without parameters I get nothing, if I send it with something I get undefined. If I define it beforehand with for example "let user;" I get nothing.
What am I doing wrong or is it just simply not possible to get the returned value?

I suppose you are using Node? If that is the case, you need to use exports to make your functions available via require; the Node.js docs on modules cover this.

Modify file two to the following to add some debug messages, so you'll see what's wrong:
async function getUser(user) {
  console.log(`====== executing getUser() in filetwo with param: `, user);
  const result = await otherfileFarAway.otherFunctionFarAway(user);
  console.log(`====== returning `, result);
  return result;
}

module.exports = {
  getUser
}
Your question seems to be How would I actually just call getUser in file two from file three and get the value it is getting in file two?.
The answer is that you're already calling filetwo.getUser(...) properly; otherwise you'd get a syntax error. The problem is probably that you're not passing the right parameter to it, so otherFunctionFarAway(...) returns nothing/undefined/null.
Edit
After your comment regarding overriding the user var in filetwo, a solution would be something along these lines:
create and export another function in filetwo which returns the correct user variable
in sendUpdate(...), before calling the present getUser(note) function, make another call to the above function
call getUser() with the result from point 2.
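The three steps above might look like the sketch below. This is a minimal, self-contained version: otherfileFarAway and the getCurrentUser lookup are stubbed here, since in the real project they live in other files and the user has to be looked up wherever it actually resides.

```javascript
// --- stand-in for the far-away module (stub; real one is elsewhere) ---
const otherfileFarAway = {
  otherFunctionFarAway: async (user) => "total-for-" + user,
};

// Step 1: a hypothetical helper in filetwo that knows the current user.
async function getCurrentUser() {
  return "alice"; // in the real code, look this up wherever it lives
}

// The existing filetwo function, unchanged.
async function getUser(user) {
  return otherfileFarAway.otherFunctionFarAway(user);
}

// Steps 2 and 3: filethree asks for the user first, then passes it along.
async function sendUpdate(note) {
  const user = await getCurrentUser();
  const total = await getUser(user);
  console.log(note, total);
  return total;
}

sendUpdate("test");
```

The point is that sendUpdate never had a user in scope; it must obtain one (here via getCurrentUser) before it can pass it to getUser.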

Related

How to get info from a JSON?

I'm new around here and I'm studying JS! In particular JSON! However, I have come across an exercise that I cannot solve, also because I do not understand what I am doing wrong. I need to extract the information about the planets from the StarWars API. So I do the classic fetch and as a result I get the generic information about the planets in the form of JSON.
However, I have to extract the planet names and I get stuck, because when I check the planetsData variable, it gives me undefined. Ergo the cycle I wrote to extract the names of the planets doesn't work for some reason.
So, my question is:
Why do I get "undefined" for the planetsData variable? Shouldn't I get the JSON, which displays correctly in the console?
Did I write the cycle correctly?
Thanks to whoever answers!
This is my code:
async function getPlanetsData() {
  const planetsData = await fetch("https://swapi.dev/api/planets")
    .then(data => data.json())
    .then(planets => { console.log(planets.results) }) // ---> Here I receive the JSON data
  for (let key in planetsData) {
    const someInfo = planetsData.results[key].name
    console.log(JSON.stringify(someInfo)) // ---> I don't understand why, but I don't get anything here. There is no response in the console, as if the call did not exist
  }
}
getPlanetsData()
You can write the same function in a different and clearer way; check the comments to understand the code!
async function getPlanetsData() {
  // 1. Fetch and wait for the response
  const response = await fetch("https://swapi.dev/api/planets");
  // 2. Handle the response; in this case we parse it as JSON.
  // The variable planetsData now holds the whole response body.
  const planetsData = await response.json();
  // console.log(planetsData); // this will print the whole object
  // 3. Return the actual data to the caller
  return planetsData;
}

// Function usage
// 4. Call the "getPlanetsData" function; when it completes, the ".then()" handler receives the "planetsData" that contains your information
getPlanetsData().then(planetsData => {
  // 5. Do whatever you want with your JSON object;
  // in this case I choose to print every planet name
  const results = planetsData.results; // results array of the object
  results.forEach(result => console.log(result.name));
});
It seems that you have the same issue as: read and save file content into a global variable
Tell us if it does solve your issue or not.
(UPDATE)
To answer your questions explicitly.
First question:
To get value into variable planetsData you can do this:
async function getPlanetsData() {
  const response = await fetch("https://swapi.dev/api/planets")
  const planetsData = await response.json()
  for (const planet of planetsData.results) {
    console.log(JSON.stringify(planet.name))
  }
}
getPlanetsData()
Second question:
You didn't write the cycle correctly.
To resolve promises it is preferable to choose between using await and .then() rather than mixing the two; in your version the .then() callback returned nothing, so planetsData ended up undefined and there was nothing to iterate.
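A minimal, self-contained contrast of the two patterns (fetchPlanets below is a stub standing in for the real fetch call, so no network is involved):

```javascript
// Stub standing in for fetch(...).then(r => r.json())
const fetchPlanets = () =>
  Promise.resolve({ results: [{ name: "Tatooine" }, { name: "Alderaan" }] });

// Mixing the styles: the .then() callback only logs and returns nothing,
// so the awaited value is undefined.
async function broken() {
  const planetsData = await fetchPlanets().then(p => {
    console.log(p.results);
  });
  return planetsData; // undefined
}

// Picking one style (await): the parsed object itself is assigned.
async function fixed() {
  const planetsData = await fetchPlanets();
  return planetsData.results.map(planet => planet.name);
}

broken().then(v => console.log(v));        // undefined
fixed().then(names => console.log(names)); // [ 'Tatooine', 'Alderaan' ]
```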

Angular get different values from nodejs API

If I call a GET on my Node.js server it works, but I get different values.
I have to submit my search parameters 1-3 times to get the right output.
My Node server receives the right search parameter and logs the right API response, but when I print it in my Angular app it uses old parameters. I don't understand this behavior.
Request path:
Angular -> Node.js -> 3rd-party API -> Node.js -> Angular
Is that path wrong? How can I get the current data in Angular?
APP.JS
app.get('/url', (req, res) => {
  var name = req.query.searchname;
  console.log("Output\n---------------------------")
  console.log(req.query.searchname);
  doGetRequest(name);
  return res.send(data);
})
test.service
test(searchName) {
  this.http.get<any>('http://localhost:3000/url', {
    params: {
      searchname: searchName
    }
  }).subscribe(data => this.totalAngularPackages = data)
  this.log();
}

log() {
  console.log(this.totalAngularPackages);
}
The variable this.totalAngularPackages is assigned asynchronously. Any statements that directly depend on it must be inside the subscription. By the time the console.log is run, the variable this.totalAngularPackages is still undefined or holds the previously assigned values. You could learn more about async requests here.
test(searchName) {
  this.http.get<any>('http://localhost:3000/url', { params: { searchname: searchName } })
    .subscribe(data => {
      this.totalAngularPackages = data;
      this.log(); // <-- must be inside the subscription
    });
}

log() {
  console.log(this.totalAngularPackages);
}
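Note that the Node handler in the question has the same kind of problem on the server side: doGetRequest(name) is never awaited and data is never assigned from it. A sketch of the server-side fix; doGetRequest and the Express req/res objects are stubbed here so it runs standalone, on the assumption that the real doGetRequest returns a promise of the API response:

```javascript
// Stub standing in for the real 3rd-party API call (assumption: it
// returns a promise of the response data).
const doGetRequest = async (name) => ({ searched: name, packages: 42 });

// The handler awaits the request and sends *this* request's data,
// instead of sending an unrelated stale variable.
const handler = async (req, res) => {
  const name = req.query.searchname;
  const data = await doGetRequest(name);
  return res.send(data);
};

// Fake req/res to exercise the handler without Express.
handler(
  { query: { searchname: "angular" } },
  { send: (data) => console.log(data) }
);
```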

Unable to send data through callbacks in Javascript

I am using Node.js, Express and the async library. I am trying to fetch data from Elasticsearch and return that information to the user.
Here is my code:
1. // function that generates the ES query I want
const generateEsQuery = (source, target) => (callback) => {
  let query = {} // some query that I generated
  callback(null, callback)
}

2. // functions that I want to call after fetching the query
const readElasticData = (data) => async (callback) => {
  const trafficData = await elasticClient.search({"ignoreUnavailable": true, index: data.indexes, body: {"query": data.query}}, {ignore: [404]});
  callback(null, trafficData)
}

async function readEsData(data, callback) {
  const trafficData = await elasticClient.search({"ignoreUnavailable": true, index: data.indexes, body: {"query": data.query}}, {ignore: [404]});
  callback(null, trafficData)
}

3. // Calling my functions
function myFunction(information) {
  // async.waterfall([generateEsQuery(information[0], information[1]), readElasticData], // The second function doesn't even run
  async.waterfall([generateEsQuery(information[0], information[1]), readEsData], // the second function runs but doesn't return the results to the callback
    function(err, results) {
      console.log("All Results", results);
      return results;
    });
}
I have two functions, one to generate an Elasticsearch query (1) and another to execute it (2), and I am trying to use the waterfall function of the async library to execute them one after another, but I am unable to fetch the results in the final callback.
Of the two functions I use to read data, the second one, "readEsData", at least runs after the "generateEsQuery" function. I am new to Node.js and I am not sure what I am doing wrong. I also do not want to combine the two functions, as I will be reusing the first one in other places.
I am following this approach: https://medium.com/velotio-perspectives/understanding-node-js-async-flows-parallel-serial-waterfall-and-queues-6f9c4badbc17
Question: How do I send the result of the second function through the callback?
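For what it's worth, two things in the code above look like the likely culprits: generateEsQuery passes the callback itself as the result (callback(null, callback) instead of callback(null, query)), and the curried readElasticData has the wrong shape for waterfall, which invokes each task as task(previousResults..., callback). A sketch of the corrected wiring; elasticClient is stubbed, the query shape is hypothetical, and the tiny waterfall below is only a stand-in for async.waterfall so the example runs without the library:

```javascript
// Minimal stand-in for async.waterfall (keep require('async').waterfall in real code).
function waterfall(tasks, done) {
  const next = (i) => (err, ...args) =>
    err || i === tasks.length ? done(err, ...args) : tasks[i](...args, next(i + 1));
  next(0)(null);
}

// Stubbed Elasticsearch client; the real one is @elastic/elasticsearch.
const elasticClient = {
  search: async (params) => ({ hits: { total: 2 }, index: params.index }),
};

// Fix 1: pass the generated query data to the callback, not the callback itself.
const generateEsQuery = (source, target) => (callback) => {
  const query = { match: { source, target } }; // hypothetical query shape
  callback(null, { indexes: "traffic-*", query });
};

// Fix 2: a plain (data, callback) signature matches what waterfall expects.
function readEsData(data, callback) {
  elasticClient
    .search({ ignoreUnavailable: true, index: data.indexes, body: { query: data.query } }, { ignore: [404] })
    .then((trafficData) => callback(null, trafficData))
    .catch(callback);
}

waterfall([generateEsQuery("a", "b"), readEsData], (err, results) => {
  console.log("All Results", results);
});
```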

Firebase Cloud Functions. Cannot read property parent of undefined (.ref)

I am trying to deploy the following function to firebase. The function deploys fine, but when the function triggers I get an error: cannot read property 'parent' of undefined. The error occurs in the first line I reference parent. I used console.log on snapshot and snapshot.ref, and although snapshot exists, snapshot.ref is undefined.
I have used snapshot.ref.parent in other cloud functions and it is working fine. There are two main differences with this function:
(a) it is an onUpdate (I have previously been using onCreate and onDelete)
(b) it is an async function.
exports.likeRating = functions.database.ref('Ocean/{waveId}/Likes').onUpdate(async (snapshot) => {
  let likes; let dislikes; let comments; let echoes;
  await snapshot.ref.parent.child('Dislikes').once('value').then(response => { dislikes = response.val(); return null });
  await snapshot.ref.parent.child('Likes').once('value').then(response => { likes = response.val(); return null });
  await snapshot.ref.parent.child('Comments').child('CommentsCount').once('value').then(response => { comments = response.val(); return null });
  await snapshot.ref.parent.child('Echoes').once('value').then(response => { echoes = response.val(); return null });
  snapshot.ref.parent.child('Rating').set(dislikes + likes + comments + echoes);
  return null;
});
Any ideas as to why I am getting this error? All help is appreciated.
That function will run significantly slower than it needs to, as you're waiting for your requests in series; you should await a Promise.all([<Promises>]) instead. Also, the return null is redundant.
I'm also not sure why you add everything up each time instead of incrementing the Rating value but maybe I didn't think about it as much as you.
If you look at the docs, the signature of the onUpdate callback is function(non-null functions.Change containing non-null functions.database.DataSnapshot, optional non-null functions.EventContext).
So the first param is a change object which contains before and after, both of type DataSnapshot; it is those properties you should be using, e.g. change.after.ref.
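Putting both points together, the handler body might look like the sketch below. The Realtime Database refs are faked with a tiny in-memory tree so the sketch runs standalone; in the real function this body goes inside functions.database.ref('Ocean/{waveId}/Likes').onUpdate(...) and the change argument is supplied by Firebase:

```javascript
// Corrected handler: uses change.after.ref and reads the four values in parallel.
const likeRatingHandler = async (change) => {
  const parent = change.after.ref.parent;
  const val = (p) => p.then((snap) => snap.val());
  const [dislikes, likes, comments, echoes] = await Promise.all([
    val(parent.child('Dislikes').once('value')),
    val(parent.child('Likes').once('value')),
    val(parent.child('Comments').child('CommentsCount').once('value')),
    val(parent.child('Echoes').once('value')),
  ]);
  return parent.child('Rating').set(dislikes + likes + comments + echoes);
};

// --- in-memory stand-in for the database, for demonstration only ---
const tree = { Dislikes: 1, Likes: 5, Comments: { CommentsCount: 2 }, Echoes: 3 };
const fakeRef = (node, key) => ({
  child: (name) => fakeRef(node[key], name),
  once: async () => ({ val: () => node[key] }),
  set: async (v) => { node[key] = v; },
});
const change = { after: { ref: { parent: fakeRef({ p: tree }, 'p') } } };

likeRatingHandler(change).then(() => console.log('Rating is now', tree.Rating)); // 11
```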

Save csv-parse output to a variable

I'm new to using csv-parse and this example from the project's github does what I need with one exception. Instead of outputting via console.log I want to store data in a variable. I've tried assigning the fs line to a variable and then returning data rather than logging it but that just returned a whole bunch of stuff I didn't understand. The end goal is to import a CSV file into SQLite.
var fs = require('fs');
var parse = require('..');

var parser = parse({delimiter: ';'}, function(err, data){
  console.log(data);
});

fs.createReadStream(__dirname+'/fs_read.csv').pipe(parser);
Here is what I have tried:
const fs = require("fs");
const parse = require("./node_modules/csv-parse");
const sqlite3 = require("sqlite3");
// const db = new sqlite3.Database("testing.sqlite");
let parser = parse({delimiter: ","}, (err, data) => {
  // console.log(data);
  return data;
});
const output = fs.createReadStream(__dirname + "/users.csv").pipe(parser);
console.log(output);
I was also struggling to figure out how to get the data from csv-parse back to the top-level that invokes parsing. Specifically I was trying to get parser.info data at the end of processing to see if it was successful, but the solution for that can work to get the row data as well, if you need.
The key was to wrap all the stream event listeners into a Promise, and within the parser's callback resolve the Promise.
function startFileImport(myFile) {
  // THIS IS THE WRAPPER YOU NEED
  return new Promise((resolve, reject) => {
    let readStream = fs.createReadStream(myFile);
    let fileRows = [];
    const parser = parse({
      delimiter: ','
    });
    // Use the readable stream api
    parser.on('readable', function () {
      let record;
      while (record = parser.read()) {
        if (record) { fileRows.push(record); }
      }
    });
    // Catch any error
    parser.on('error', function (err) {
      console.error(err.message);
    });
    parser.on('end', function () {
      const { lines } = parser.info;
      // RESOLVE OUTPUT THAT YOU WANT AT PARENT-LEVEL
      resolve({ status: 'Successfully processed lines: ', lines });
    });
    // This will wait until we know the readable stream is actually valid before piping
    readStream.on('open', function () {
      // This pipes the read stream into the parser
      readStream.pipe(parser);
    });
    // This catches any errors that happen while creating the readable stream (usually invalid names)
    readStream.on('error', function (err) {
      resolve({ status: null, error: 'readStream error' + err });
    });
  });
}
This is a question that suggests confusion about an asynchronous streaming API and seems to ask at least three things.
How do I get output to contain an array-of-arrays representing the parsed CSV data?
That output will never exist at the top-level, like you (and many other programmers) hope it would, because of how asynchronous APIs operate. All the data assembled neatly in one place can only exist in a callback function. The next best thing syntactically is const output = await somePromiseOfOutput() but that can only occur in an async function and only if we switch from streams to promises. That's all possible, and I mention it so you can check it out later on your own. I'll assume you want to stick with streams.
An array consisting of all the rows can only exist after reading the entire stream. That's why, in the author's "Stream API" example, all the rows are available only in the .on('end', ...) callback. If you want to do anything with all the rows present at the same time, you'll need to do it in the end callback.
From https://csv.js.org/parse/api/ note that the author:
uses the on readable callback to push single records into a previously empty array defined externally named output.
uses the on error callback to report errors
uses the on end callback to compare all the accumulated records in output to the expected result
...
const output = []
...
parser.on('readable', function(){
  let record
  while (record = parser.read()) {
    output.push(record)
  }
})
// Catch any error
parser.on('error', function(err){
  console.error(err.message)
})
// When we are done, test that the parsed output matched what expected
parser.on('end', function(){
  assert.deepEqual(
    output,
    [
      [ 'root','x','0','0','root','/root','/bin/bash' ],
      [ 'someone','x','1022','1022','','/home/someone','/bin/bash' ]
    ]
  )
})
As to the goal on interfacing with sqlite, this is essentially building a customized streaming endpoint.
In this use case, implement a customized writable stream that accepts the output of parser and sends rows to the database.
Then you simply chain pipe calls as
fs.createReadStream(__dirname+'/fs_read.csv')
  .pipe(parser)
  .pipe(your_writable_stream)
Beware: This code returns immediately. It does not wait for the operations to finish. It interacts with a hidden event loop internal to node.js. The event loop often confuses new developers who are arriving from another language, used to a more imperative style, and skipped this part of their node.js training.
Implementing such a customized writable stream can get complicated and is left as an exercise for the reader. It will be easiest if the parser emits a row, and then the writer can be written to handle single rows. Make sure you are able to notice errors somehow and throw appropriate exceptions, or you'll be cursed with incomplete results and no warning or reason why.
A hackish way to do it would have been to replace console.log(data) in let parser = ... with a customized function writeRowToSqlite(data) that you'll have to write anyway to implement a custom stream. Because of asynchronous API issues, using return data there does not do anything useful. It certainly, as you saw, fails to put the data into the output variable.
As to why output in your modified posting does not contain the data...
Unfortunately, as you discovered, this is usually wrong-headed:
const output = fs.createReadStream(__dirname + "/users.csv").pipe(parser);
console.log(output);
Here, the variable output will be a ReadableStream, which is not the same as the data contained in the readable stream. Put simply, it's like when you have a file in your filesystem, and you can obtain all kinds of system information about the file, but the content contained in the file is accessed through a different call.
