I was doing a fairly simple task of getting information from an API, but the response came back as a ReadableStream and things started to go dark, as always. I am trying to use the cat-facts API.
URL: https://alexwohlbruck.github.io/cat-facts/docs/
and render the results with VueJS. This is what I found on the internet, but this code just returns some random numbers.
Here is the code:
created() {
  this.getFacts();
},
methods: {
  getFacts() {
    let vm = this;
    fetch('http://localhost:8080/facts')
      .then(response => {
        const reader = response.body.getReader();
        return new ReadableStream({
          start(controller) {
            return pump();
            function pump() {
              return reader.read().then(({ done, value }) => {
                // When no more data needs to be consumed, close the stream
                if (done) {
                  controller.close();
                  return;
                }
                // Enqueue the next data chunk into our target stream
                vm.facts = value;
                controller.enqueue(value);
                return pump();
              });
            }
          }
        });
      })
      .catch(err => console.error(err));
  },
}
I am using vue.config.js to handle the CORS error:
module.exports = {
  devServer: {
    proxy: 'https://cat-fact.herokuapp.com/'
  }
}
I made the request with Postman and it worked just fine.
What happens if you do it like this:
async getFacts() {
  let vm = this;
  const response = await fetch('http://localhost:8080/facts');
  const myJson = await response.json();
  console.log(JSON.stringify(myJson));
}
Just wondering if you need to use pump at all.
EDIT
Without async/await
fetch('http://localhost:8080/facts')
  .then(response => response.json())
  .then(data => console.log(data));
I tried with AJAX and a PHP proxy using Guzzle to communicate with the API, and it worked! Just a simple line. But I am very confused about how the fetch method works!
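For completeness, here is a minimal sketch of how the JSON approach above could be wired into the Vue component, assuming facts is meant to hold the parsed array from the proxied endpoint (the original code assigned the raw Uint8Array chunk to facts, which is why it rendered as random numbers):

methods: {
  async getFacts() {
    try {
      const response = await fetch('http://localhost:8080/facts');
      // response.json() reads and parses the whole body,
      // so no manual ReadableStream pumping is needed
      this.facts = await response.json();
    } catch (err) {
      console.error(err);
    }
  },
}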
Related
I have an API called getQuote and a component called QuoteCard. Inside QuoteCard I'm trying to render an array of users that liked a quote. The API works fine, I have tested it, and the code below for getting the users works fine too.
const Post = async (url, body) => {
  let res = await fetch(url, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "accept": "*/*"
    },
    body: JSON.stringify(body)
  }).then(r => r.json());
  return res;
}

const getAllLikes = async () => {
  let users = await Post('api/getQuote', {
    id: "639e3aff914d4c4f65418a1b"
  })
  return users
}

console.log(getAllLikes())
The result works as expected.
However, when trying to map this promise result array to render it onto the page is where I have problems. I try to render like this:
<div>
  {getAllLikes().map((user) => (
    <p>{user}</p>
  ))}
</div>
However, I get an error that states:
getAllLikes(...).map is not a function
I don't understand why this is happening. Why can't I map the array? Is it because it's a promise or something?
And if anyone needs to see the getQuote API, here it is:
//Look ma I wrote an API by myself! :D
import clientPromise from "../../lib/mongodb";
const ObjectId = require('mongodb').ObjectId;
import nc from "next-connect";

const app = nc()

app.post(async function getQuote(req, res) {
  const client = await clientPromise;
  const db = client.db("the-quotes-place");
  try {
    let quote = await db.collection('quotes').findOne({
      _id: new ObjectId(req.body.id)
    })
    res.status(200).json(JSON.parse(JSON.stringify(quote.likes.by)));
  } catch (e) {
    res.status(500).json({
      message: "Error getting quote",
      success: false
    })
    console.error(e);
  }
})

export default app
Thanks for any help!
This is because getAllLikes is an async function, so it returns a promise, and a promise does not have a map function.
You can either save the result in a state variable (awaiting it first) or chain the call with .then.
A minimal reproducible example that works:
const getAllLikes = async () => {
  return ['a', 'b']
}
getAllLikes().then((r) => r.map((g) => { console.log(g) }))
Edit: The above code won't work if used directly in JSX, since the return value of getAllLikes will still be a promise. The solution is to save the result in a state variable and then use that in the render, as sketched below.
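A minimal sketch of the state-variable approach, assuming QuoteCard is a React function component and that getAllLikes resolves to an array of user names:

import { useEffect, useState } from "react";

function QuoteCard() {
  const [likes, setLikes] = useState([]);

  useEffect(() => {
    // Resolve the promise once on mount, then keep the plain array in state
    getAllLikes().then((users) => setLikes(users));
  }, []);

  // likes is a real array here, so .map works
  return (
    <div>
      {likes.map((user) => (
        <p key={user}>{user}</p>
      ))}
    </div>
  );
}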
I come from Angular, where I believe we call pipe on Observables (or Promises); map can then be called inside the pipe function:
observable$ = getAllLikes().pipe(map( user => <p>{user}</p>))
If there is no pipe, I can only think of manually subscribing (which is not a good practice)
sub$ = getAllLikes().subscribe( user => <p>{user}</p>)
// unsub from sub$ appropriately
// We do this from ngOnDestroy in angular
ngOnDestroy() {
this.sub$?.unsubscribe()
}
I have a task that requires fetching API data, with the constraint of only one outstanding API request at a time: each request must receive a response, or time out, before the next one is issued. Since fetch (or axios) returns a promise, I can't figure out how to wait for each promise to settle before issuing the next fetch.
I'm handed a large array of API URLs that must all be resolved in this one-at-a-time manner before continuing.
I'm using create-react-app's bundled dev server and the Chrome browser.
Curiously, accomplishing this via a node script is easy, because ‘await fetch’ actually waits. Not so in my browser environment, where all the fetch requests blast out at once, returning promises along the way.
Here’s a simple loop that results in the desired behavior as a node script. My question is how to achieve this one-outstanding-request-at-a-time synchronous serialization in the browser environment?
const fetchOne = async (fetchUrl) => {
  try {
    const response = await fetch(fetchUrl, { // Or axios instead
      "headers": {
        'accept': 'application/json',
        'X-API-Key': 'topSecret'
      },
      'method': 'GET'
    })
    const data = await response.json();
    if (response.status == 200) {
      return (data);
    } else {
      // error handling
    }
  } catch(error) {
    // different error handling
  }
}

const fetchAllData = async (fetchUrlArray) => {
  let fetchResponseDataArray = new Array();
  let fetchResponseDataObject = new Object(/*object details*/);
  for (var j=0; j<fetchUrlArray.length; j++) { // or forEach or map instead
    // Node actually synchronously waits between fetchOne calls,
    // but react browser environment doesn't wait, instead blasts them all out at once.
    // Question is how to achieve the one-outstanding-request-at-a-time synchronous
    // serialization in the browser environment?
    fetchResponseDataObject = await fetchOne(fetchUrlArray[j]);
    fetchResponseDataArray.push(fetchResponseDataObject);
  }
  return(fetchResponseDataArray);
}
If there's a problem, it's with code you haven't shown (perhaps in one of your components, or maybe in your project configuration).
Here's a runnable example derived from the problem you described, which mocks fetch and an API, showing you how to issue each network request sequentially (and handle potential errors along the way):
Note, handling potential errors at the boundaries where they might occur is a better practice than only having a top level try/catch: by doing so, you can make finer-grained decisions about what to do in response to each kind of problem. Here, each failed request is stored as [url, error] in a separate array so that you can programmatically make decisions if one or more requests failed. (Maybe you want to retry them in a subsequent step, or maybe you want to show something different in the UI, etc.). Note, there's also Promise.allSettled(), which might be useful to you now or in the future.
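As a side note, here is a tiny sketch of what Promise.allSettled() usage could look like; keep in mind it fires the requests in parallel, so it only applies if the one-at-a-time constraint is ever relaxed:

// Fires all requests at once and waits until every one has settled (fulfilled or rejected)
const fetchAllSettled = async (urls) => {
  const results = await Promise.allSettled(urls.map((url) => fetch(url)));
  results.forEach((result, i) => {
    if (result.status === 'fulfilled') console.log(urls[i], result.value.status);
    else console.error(urls[i], result.reason);
  });
};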
<div id="root"></div><script src="https://unpkg.com/react@17.0.2/umd/react.development.js"></script><script src="https://unpkg.com/react-dom@17.0.2/umd/react-dom.development.js"></script><script src="https://unpkg.com/@babel/standalone@7.16.4/babel.min.js"></script>
<script type="text/babel" data-type="module" data-presets="env,react">
const {useEffect, useState} = React;

const successChance = {
  fetch: 0.95,
  server: 0.95,
};

function mockApi (url, chance = successChance.server) {
  // Simulate random internal server issue
  const responseArgs = Math.random() < chance
    ? [JSON.stringify({time: performance.now()}), {status: 200}]
    : ['Oops', {status: 500}];
  return new Response(...responseArgs);
}

function mockFetch (requestInfo, _, chance = successChance.fetch) {
  return new Promise((resolve, reject) => {
    // Simulate random network issue
    if (Math.random() > chance) {
      reject(new Error('Network error'));
      return;
    }
    const url = typeof requestInfo === 'string' ? requestInfo : requestInfo.url;
    setTimeout(() => resolve(mockApi(url)), 100);
  });
}

// Return an object containing the response if successful (else an Error instance)
async function fetchOne (url) {
  try {
    const response = await mockFetch(url);
    if (!response.ok) throw new Error('Response not OK');
    const data = await response.json();
    return {data, error: undefined};
  }
  catch (ex) {
    const error = ex instanceof Error ? ex : new Error(String(ex));
    return {data: undefined, error};
  }
}

async function fetchAll (urls) {
  const data = [];
  const errors = [];
  for (const url of urls) {
    const result = await fetchOne(url);
    if (result.data) data.push([url, result.data]);
    else if (result.error) {
      // Handle this however you want
      errors.push([url, result.error]);
    }
  }
  return {data, errors};
}

function Example () {
  const [data, setData] = useState([]);
  const [loading, setLoading] = useState(false);

  useEffect(() => {
    const fetchData = async () => {
      setLoading(true);
      try {
        const {data, errors} = await fetchAll([
          'https://my.url/api/0',
          'https://my.url/api/1',
          'https://my.url/api/2',
          'https://my.url/api/3',
          'https://my.url/api/4',
          'https://my.url/api/5',
          'https://my.url/api/6',
          'https://my.url/api/7',
          'https://my.url/api/8',
          'https://my.url/api/9',
        ]);
        setData(data);
      }
      catch (ex) {
        console.error(ex);
      }
      setLoading(false);
    };
    fetchData();
  }, []);

  return (
    <div>
      <div>Loading: {loading ? '...' : 'done'}</div>
      <ul>
        {
          data.map(([url, {time}]) => (<li
            key={url}
            style={{fontFamily: 'monospace'}}
          >{url} - {time}</li>))
        }
      </ul>
    </div>
  );
}

ReactDOM.render(<Example />, document.getElementById('root'));
</script>
I am trying to call my REST API endpoint in Airtable from inside an AWS Lambda, with no success. I get no errors and no output.
If I call the same code using Node, it works.
I am able to use Axios in my code.
Pure airtable code (works)
var Airtable = require('airtable');
var base = new Airtable({apiKey: 'keyoMYSECRETKEY'}).base('Mybaseid');

base('MyBase').select({maxRecords: 3, view: "MyView"}).eachPage(function page(records, fetchNextPage) {
  // This function (`page`) will get called for each page of records.
  records.forEach(function(record) {
    console.log('Retrieved', JSON.stringify(record.get('Session Information')));
  });
  fetchNextPage();
}, function done(err) {
  if (err) { console.error(err); return; }
});
If I put it inside a Lambda handler, I get nothing.
const axios = require('axios')
const url = 'https://checkip.amazonaws.com/';
var Airtable = require('airtable');
var base = new Airtable({apiKey: 'keySECRETKEY'}).base('MYBASEID');
let response;

exports.lambdaHandler = async (event, context) => {
  try {
    base('MyBase').select({maxRecords: 3, view: "MyView"}).eachPage(function page(records, fetchNextPage) {
      records.forEach(function(record) { //HERE - NOTHING HAPPENS
        console.log('Retrieved', JSON.stringify(record.get('Session Information')));
      });
      fetchNextPage();
    }, function done(err) {
      if (err) { console.error(err); return; }
    });
    const ret = await axios(url); //THIS WORKS
    response = {
      'statusCode': 200,
      'body': JSON.stringify({
        message: 'hello world - boo',
        location: ret.data.trim()
      })
    }
  } catch (err) {
    console.log(err);
    return err;
  }
  return response
};
What am I missing so that I can call the Airtable API from inside an AWS Lambda?
It seems that your Lambda terminates before the API call you're trying to perform has completed.
I believe this can be solved with a synchronous Lambda, or with correct usage of promises and await.
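A minimal sketch of that idea, reusing the base, table, and view from the question: wrap the callback-based eachPage call in a promise so the handler actually waits for it before returning.

exports.lambdaHandler = async (event, context) => {
  // Collect the records inside a promise the handler can await
  const sessions = await new Promise((resolve, reject) => {
    const collected = [];
    base('MyBase').select({maxRecords: 3, view: "MyView"}).eachPage(
      function page(records, fetchNextPage) {
        records.forEach((record) => collected.push(record.get('Session Information')));
        fetchNextPage();
      },
      function done(err) {
        if (err) reject(err);
        else resolve(collected);
      }
    );
  });
  return {
    statusCode: 200,
    body: JSON.stringify({ sessions })
  };
};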
The best way to troubleshoot this is to go back to basics.
See if you can at least get a meaningful console log by wrapping a simpler fetch request in a Lambda handler:
const baseId = 'exampleAppId123';
const tableName = 'Table 1';
const api_key = 'keyExample123';
const url = `https://api.airtable.com/v0/${baseId}/${tableName}?api_key=${api_key}`;

exports.lambdaHandler = async () => {
  const res = await fetch(url)
    .then(res => res.json())
    .then(data => console.log(data))
    .then(() => {
      // do more stuff
    })
}
Then report back if you can't. Or better yet, report back either way as that's bound to help more people in the future.
Worst case? The above code still doesn't do anything. If that happens, I suggest going with @Shoty's first instinct and turning this code into a synchronous fetch request by removing the async/await syntax and returning chained thenables. Not that blocking behavior of this sort is acceptable from a UX perspective, but it should at least help with debugging.
I want to fetch a URL and process the response in chunks of a defined size. I also don't want to block while waiting for the whole chunk to be available. Is something like this available in the Fetch API?
An example of how it could look:
const response = await fetch(url)
const reader = response.body.getReader()
const chunk = await reader.read(CHUNK_SIZE)
There is support for consuming a fetch() response as a stream. See the MDN reference here. It appears that you need some boilerplate code for ReadableStream...
The code would look like this:
const workOnChunk = (chunk) => { console.log("do-work") };

// Fetch your stuff
fetch(url)
  // Retrieve its body as ReadableStream
  .then(response => response.body)
  // Boilerplate for the stream - refactor it out into a common utility.
  .then(rs => {
    const reader = rs.getReader();
    return new ReadableStream({
      async start(controller) {
        while (true) {
          const { done, value } = await reader.read();
          // When no more data needs to be consumed, break the reading
          if (done) {
            break;
          }
          // Do your work: check what `value` contains
          workOnChunk(value)
          // Optionally append the value if you need the full blob later.
          controller.enqueue(value);
        }
        // Close the stream
        controller.close();
        reader.releaseLock();
      }
    })
  })
  // Create a new response out of the stream (can be avoided?)
  .then(rs => new Response(rs))
  // Get the full body as a blob
  .then(response => response.blob())
  .then(blob => { console.log("Do something with full blob") })
  .catch(console.error)
NOTE: The node-fetch API is not exactly the same. If you are on Node.js, see node-fetch's stream support.
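If you specifically need chunks of a defined size, note that reader.read() hands back whatever-sized chunks happen to arrive, so one option is to buffer the bytes yourself. A minimal sketch, assuming CHUNK_SIZE is the target size in bytes and workOnChunk accepts a Uint8Array:

const CHUNK_SIZE = 1024; // assumed target chunk size in bytes

async function readInFixedChunks(url, workOnChunk) {
  const response = await fetch(url);
  const reader = response.body.getReader();
  let buffer = new Uint8Array(0);
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // Append the newly received bytes to the buffer
    const merged = new Uint8Array(buffer.length + value.length);
    merged.set(buffer);
    merged.set(value, buffer.length);
    buffer = merged;
    // Hand off full-sized chunks as soon as they are available
    while (buffer.length >= CHUNK_SIZE) {
      workOnChunk(buffer.slice(0, CHUNK_SIZE));
      buffer = buffer.slice(CHUNK_SIZE);
    }
  }
  // Whatever is left over is the final, possibly smaller, chunk
  if (buffer.length > 0) workOnChunk(buffer);
}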
Hello, I'm trying to test this API call, but I don't know how to test the status code of the response, since it is a real API call (and it has to stay that way), not a mocked one.
This is the function I'm testing:
export const getDataFromApi = (url) => {
  return axios.get(url)
    .then(({ data }) => data)
    .catch(err => console.log(err.toString()));
}
and this is the test:
describe('Read data from API', () => {
  test('Get result of the API call', (done) => {
    const apiUrl = "https://rickandmortyapi.com/api/character";
    getDataFromApi(apiUrl)
      .then(data => {
        expect(data).toBeDefined();
        expect(data.results.length).toBeGreaterThan(0);
        done();
      });
  });
});
How can I check whether the status code of the response is 200 or some other status code?
Also, is it necessary for me to keep that done call after the function executes? I know I have to use it with callbacks, but with this promise I'm not sure.
Axios has a single response object, returned in both the success and error paths, which contains the HTTP status code. An error is raised if the response is not in the 2xx range.
You can plumb the status code as a return object from your getDataFromApi() wrapper function, but you'll probably want the full response object for other checks (like headers). I recommend getting rid of the wrapper altogether.
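If you did want to keep the wrapper, here is a sketch of what plumbing the status through it might look like (not the only way to do it):

// Alternative wrapper that surfaces both the data and the status code
export const getDataFromApi = (url) => {
  return axios.get(url)
    .then(({ data, status }) => ({ data, status }))
    .catch(err => {
      console.log(err.toString());
      // err.response only exists when the server actually answered
      return { data: undefined, status: err.response ? err.response.status : undefined };
    });
}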
Without the wrapper, here are two different status checks using promises, one for success and one for failure:
describe('Read data from API', () => {
  test('Get successful result of the API call', async () => {
    const apiUrl = "https://rickandmortyapi.com/api/character";
    await axios.get(apiUrl)
      .then(r => {
        expect(r.data).toBeDefined();
        expect(r.data.results.length).toBeGreaterThan(0);
        expect(r.status).toBeGreaterThanOrEqual(200);
        expect(r.status).toBeLessThan(300);
      })
      .catch(e => {
        fail(`Expected successful response`);
      });
  });

  test('Get failure result of the API call', async () => {
    const apiUrl = "https://rickandmortyapi.com/api/character-bad";
    await axios.get(apiUrl)
      .then(r => {
        fail(`Expected failure response`);
      })
      .catch(e => {
        if (e.response) {
          expect(e.response.status).toBeGreaterThanOrEqual(400);
          expect(e.response.status).toBeLessThan(500);
        } else {
          throw e;
        }
      });
  });
});
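As for the done callback in your original test: with async tests like the ones above, Jest waits for the returned promise to settle, so done isn't needed. It's only required when a test's completion is signalled from a plain callback rather than a promise.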