This is the code I have written using Express and Node.js:
const express = require("express");
const https = require("https");
const app = express();

app.get("/", function (req, res) {
  // URL with my API key
  const url = "https://api.spoonacular.com/recipes/random?apiKey=...&number=1";
  https.get(url, function (response) {
    response.on("data", function (data) {
      console.log(JSON.parse(data));
      // const theRecipe = JSON.parse(data);
      console.log(data);
    });
  });
  res.send("The server is up and running");
});

app.listen(3000, function () {
  console.log("Server started at port 3000");
});
When I refresh my webpage on localhost, I get the following error in the console:
SyntaxError: Unexpected end of JSON input
at JSON.parse ()
at IncomingMessage. (C:\Users\ArunBohra\Desktop\FoodRecipes\app.js:12:33)
Can anyone find the problem with my code?
The data event fires when a chunk of data from the response has arrived. You are trying to parse the first chunk as if it were the complete JSON text.
You need to collect the pieces from each data event but wait until the end event fires before joining them together into the complete string of JSON that you can parse.
There is an example of fetching and parsing JSON in the documentation.
You might want to look at modules like axios and node-fetch, which take care of that (and the JSON parsing) for you while providing modern Promise-based APIs.
If you have a new enough version of Node, you can use the native Fetch API.
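For example, a minimal sketch with the built-in fetch (assuming Node 18 or later, and the same Spoonacular URL with your own key filled in):
// Sketch: native fetch (Node 18+). Replace the apiKey placeholder with your own key.
const url = "https://api.spoonacular.com/recipes/random?apiKey=...&number=1";

fetch(url)
  .then((response) => response.json()) // parses the complete body for you
  .then((recipe) => console.log(recipe))
  .catch((err) => console.error(err));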
If you use a package like node-fetch, you can get the whole response in one go instead of handling it chunk by chunk as you do now:
const fetch = require('node-fetch');
const url = "https://api.spoonacular.com/recipes/random?apiKey=...&number=1";

fetch(url)
  .then(response => response.json())
  .then(data => console.log(data));
In addition to other answers, you can do it without another package.
https.get(url, function (response) {
  let result = "";
  response.on("data", function (chunk) {
    result += chunk;
  });
  response.on("end", () => {
    // now you have the combined result and can safely JSON.parse it
    console.log(result);
  });
});
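For context, here is a rough sketch of how that pattern could fit into the original route, parsing the combined body only once the end event fires (the apiKey placeholder is yours to fill in; sending the recipe back to the browser instead of the fixed string is just one option):
// Sketch only: the original route adapted to wait for the full response body.
app.get("/", function (req, res) {
  const url = "https://api.spoonacular.com/recipes/random?apiKey=...&number=1";
  https.get(url, function (response) {
    let body = "";
    response.on("data", (chunk) => (body += chunk));
    response.on("end", () => {
      const theRecipe = JSON.parse(body); // now the JSON is complete
      res.send(theRecipe);                // respond after the data has arrived
    });
  });
});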
Related
I am trying to download the distance between two locations from the TomTom API.
Protractor will not let me use:
* fetch - "fetch is not defined - please use import"
* import - "Cannot use import statement outside of module"
* when I add
{
  "type": "module"
}
to package.json, Protractor stops working, because not all of the code is an ES module
* browser.get - it opens the page with the JSON data, but I cannot extract it.
Is there any other way? I tried importing the JSON in a different file and exporting response.data, but the module error stops me from doing that as well.
Protractor is for testing Angular web pages, but you can have the browser execute arbitrary JavaScript. To use fetch, though, you need to go through window:
function getTomTomData() {
  // replace this with your TomTom API call, and transform the response
  return window.fetch(TOM_TOM_URL);
}

browser.executeScript(getTomTomData).then(res => {
  // do something with the response
});
I did not manage to run node-fetch in my script, as Protractor kept rejecting the import. I managed to sort it out with require('https'):
const https = require('https');

let measureDistance = async function (pickup, dropoff) {
  let howFar;
  let url = `https://api.tomtom.com/routing/1/calculateRoute/${pickup[0]}%2C${pickup[1]}%3A${dropoff[0]}%2C${dropoff[1]}/json?routeType=shortest&avoid=unpavedRoads&key=uwbU08nKLNQTyNrOrrQs5SsRXtdm4CXM`;
  await https.get(url, res => {
    let body = '';
    res.on('data', chunk => {
      body += chunk;
    });
    res.on('end', () => {
      try {
        let json = JSON.parse(body);
        howFar = json.routes[0].summary.lengthInMeters;
      } catch (error) {
        console.error(error.message);
      }
    }).on('error', (error) => {
      console.error(error.message);
    });
  });
};
Also, I used to put the require at the top of the file, like in Ruby, which seemed to be another issue.
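One caveat worth noting: https.get does not return a Promise, so the await in the snippet above has no effect and measureDistance resolves before howFar is set. If the caller needs the distance, the request can be wrapped in a Promise, roughly like this (a sketch, reusing the same URL template as above):
const https = require('https');

// Sketch: wrap the callback-style https.get in a Promise so the caller can await the distance.
function measureDistance(pickup, dropoff) {
  const url = `...same TomTom routing URL as in the snippet above...`;
  return new Promise((resolve, reject) => {
    https.get(url, res => {
      let body = '';
      res.on('data', chunk => (body += chunk));
      res.on('end', () => {
        try {
          resolve(JSON.parse(body).routes[0].summary.lengthInMeters);
        } catch (error) {
          reject(error);
        }
      });
    }).on('error', reject);
  });
}

// Usage: const meters = await measureDistance(pickup, dropoff);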
I am trying to pass simple data from my server to a JavaScript file loaded by another HTML page. I am testing by sending a single string from the server, but am not sure how to receive it. Server below:
const express = require('express')
const app = express()
const port = 3000

app.use(express.static("./assets"));

app.get('/', (req, res) => {
  //res.send('Hello World!')
  res.sendFile('./main.html', { root: __dirname });
})

app.listen(port, () => {
  console.log(`Example app listening at http://localhost:${port}`)
})

app.get('/get-test', async (_req, res) => {
  try {
    const test_string = "get-test request string";
    return res.send(test_string);
  } catch (e) { throw e; }
});
And in another javascript file I have the following:
async function testing() {
  const response = await fetch('/get-test');
  console.log(response);
}
testing();
The console.log gives me a whole Response object, and clicking through it I can't seem to find my test_string anywhere.
So I believe the get request worked, but how do I actually access the data inside that I want?
You need to call await response.text() before console.logging it.
So in this function:
async function testing() {
  const response = await fetch('/get-test');
  console.log(response);
}
testing();
You will need to read the body with await response.text() inside the console.log, like this:
async function testing() {
  const response = await fetch('/get-test');
  console.log(await response.text());
}
testing();
Why?
Since JavaScript is asynchronous, there is a delay when retrieving data; that is why async/await and promises exist. So when you make a GET request with fetch, you need to await the response. But reading the body (with text() or json()) also returns a promise, which means you need to await that as well. This makes more sense when you try to process the response as, let's say, JSON.
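For instance, if the /get-test endpoint were changed to return JSON rather than plain text, the same pattern would look like this (just a sketch of the idea):
async function testing() {
  const response = await fetch('/get-test');
  const data = await response.json(); // json() also returns a promise
  console.log(data);
}
testing();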
A little tip: when the console.log shows no error but no data either, it might be because of an unresolved promise.
I'm using the csv-parser npm package and doing a sample CSV parse. My only confusion is accessing the parsed array after running these functions. I understand I'm pushing the data in .on('data'), then doing a console.log(results); statement in .on('end') to show what's being stored. Why do I get undefined when I try to access results after running those functions? Doesn't results hold the parsed information?
const csv = require('csv-parser');
const fs = require('fs');
const results = [];

fs.createReadStream('demo.csv')
  .pipe(csv())
  .on('data', (data) => results.push(data))
  .on('end', () => {
    console.log(results);
  });
I came here to find the solution to the same issue.
Since this is an async operation, what works here is to call the function that acts on your parsed data once the end handler fires. Something like this should work in this situation:
const csv = require('csv-parser');
const fs = require('fs');
const results = [];

fs.createReadStream('demo.csv')
  .pipe(csv())
  .on('data', (data) => results.push(data))
  .on('end', () => {
    console.log(results);
    csvData(results);
  });

const csvData = (csvInfo) => {
  console.log(csvInfo);
  console.log(csvInfo.length);
};
I can get results in .on('end', () => { console.log(results); }), but if I put a console.log() after the createReadStream, results is undefined. Does that make sense? – Videoaddict101
Your stream acts asynchronously: your data and end handlers are called later, while the rest of your JavaScript continues to execute. So accessing your array right after the fs.createReadStream instruction will give you an empty array.
Understanding async is very important in JavaScript, even more so in Node.js.
Please have a look at the different resources for handling async code, like Promises and async/await.
You should use neat-csv, which is the endorsed wrapper for csv-parser and gives you a promise interface.
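A rough sketch of what that could look like (assuming neat-csv's default export, which accepts a CSV string or buffer and resolves to the parsed rows):
import fs from "fs";
import neatCsv from "neat-csv";

// Read the file and let neat-csv parse it; the result is a promise of row objects.
const rows = await neatCsv(fs.readFileSync("demo.csv"));
console.log(rows);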
That said, you can also create a promise yourself and resolve it in the on("end") callback:
import fs from "fs";
import csv from "csv-parser";

function getCsv(filename) {
  return new Promise((resolve, reject) => {
    const data = [];
    fs.createReadStream(filename)
      .pipe(csv())
      .on("error", (error) => reject(error))
      .on("data", (row) => data.push(row))
      .on("end", () => resolve(data));
  });
}

console.log(await getCsv("../assets/logfile0.csv"));
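Top-level await as in the last line needs an ES module; in a CommonJS file you could wrap the call instead (a small sketch):
// Sketch for CommonJS, where top-level await is not available.
getCsv("../assets/logfile0.csv")
  .then((rows) => console.log(rows))
  .catch((err) => console.error(err));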
I am trying to send a POST HTTP request with data, but it was taking too long! So I followed the same example code from the Electron documentation, and it turns out to be slow too. It is taking around 40 seconds for the 'response' event to fire and return data.
Sample code from https://electronjs.org/docs/api/net
const { app } = require('electron')

app.on('ready', () => {
  const { net } = require('electron')
  const request = net.request('https://github.com')
  request.on('response', (response) => {
    console.log(`STATUS: ${response.statusCode}`)
    console.log(`HEADERS: ${JSON.stringify(response.headers)}`)
    response.on('data', (chunk) => {
      console.log(`BODY: ${chunk}`)
    })
    response.on('end', () => {
      console.log('No more data in response.')
    })
  })
  request.end()
})
I used the Node.js http module and it works fine, but I am wondering why Electron's own net module takes that long to return results?
I've faced the same problem. In my case, my company uses a proxy, so my requests took too much time (about 40 seconds, as in your case). My solution was to use Node HTTP requests instead (with axios, passing proxy: false).
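If it helps, this is roughly what that looks like with axios (a sketch; the URL and payload are placeholders):
const axios = require('axios');

// proxy: false tells axios not to route the request through a proxy.
axios.post('https://example.com/api', { some: 'data' }, { proxy: false })
  .then((res) => console.log(res.status, res.data))
  .catch((err) => console.error(err.message));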
Using the native http.get() in Node.js, I'm trying to pipe an HTTP response to a stream that I can bind data and end events to.
I'm currently handling this for gzip data, using:
http.get(url, function(res) {
  if (res.headers['content-encoding'] == 'gzip') {
    res.pipe(gunzip);
    gunzip.on('data', dataCallback);
    gunzip.on('end', endCallback);
  }
});
Gunzip is a stream and this just works. I've tried creating streams (write streams, then read streams) and piping the response, but haven't had much luck. Any suggestions for replicating the same setup for non-gzipped content?
The response object from an HTTP request is an instance of a readable stream. Therefore, you collect the data with the data event, then use it when the end event fires.
var http = require('http');
var body = '';

http.get(url, function(res) {
  res.on('data', function(chunk) {
    body += chunk;
  });
  res.on('end', function() {
    // all data has been downloaded
  });
});
The readable.pipe(dest) would basically do the same thing, if body in the example above were a writable stream.
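For example, piping the response straight into a writable stream such as a file (a sketch; the url variable and file name are placeholders):
var http = require('http');
var fs = require('fs');

http.get(url, function(res) {
  var file = fs.createWriteStream('response-body.txt'); // any writable stream works here
  res.pipe(file);
  file.on('finish', function() {
    // the whole response body has been written
  });
});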
Nowadays the recommended way of piping is using the pipeline function. It is supposed to protect you from memory leaks.
const { createReadStream } = require('fs');
const { pipeline } = require('stream');
const { createServer, get } = require('http');

const errorHandler = (err) => err && console.log(err.message);

const server = createServer((_, response) => {
  pipeline(createReadStream(__filename), response, errorHandler);
  response.writeHead(200);
}).listen(8080);

get('http://localhost:8080', (response) => {
  pipeline(response, process.stdout, errorHandler);
  response.on('close', () => server.close());
});
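Newer Node versions also ship a promise-based pipeline in stream/promises, which pairs nicely with async/await (a sketch assuming Node 15+, adapted from the server above):
const { pipeline } = require('stream/promises');
const { createReadStream } = require('fs');

async function sendFile(response) {
  response.writeHead(200);
  // Resolves when the whole file has been piped, rejects on error.
  await pipeline(createReadStream(__filename), response);
}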
Another way of doing it, which gives more control, would be to use an async iterator:
async function handler(response) {
  let body = '';
  for await (const chunk of response) {
    let text = chunk.toString();
    console.log(text);
    body += text;
  }
  console.log(body.length);
  server.close();
}

get('http://localhost:8080', (response) => handler(response).catch(console.warn));