I'm struggling to find documentation or examples of implementing an upload progress indicator using fetch.
This is the only reference I've found so far, which states:
Progress events are a high level feature that won't arrive in fetch for now. You can create your own by looking at the Content-Length header and using a pass-through stream to monitor the bytes received.
This means you can explicitly handle responses without a Content-Length differently. And of course, even if Content-Length is there it can be a lie. With streams you can handle these lies however you want.
How would I write "a pass-through stream to monitor the bytes" sent? If it makes any sort of difference, I'm trying to do this to power image uploads from the browser to Cloudinary.
NOTE: I am not interested in the Cloudinary JS library, as it depends on jQuery and my app does not. I'm only interested in the stream processing necessary to do this with native JavaScript and GitHub's fetch polyfill.
https://fetch.spec.whatwg.org/#fetch-api
Streams are starting to land in the web platform (https://jakearchibald.com/2016/streams-ftw/) but it's still early days.
Soon you'll be able to provide a stream as the body of a request, but the open question is whether the consumption of that stream relates to bytes uploaded.
Certain redirects can result in data being retransmitted to the new location, but streams cannot "restart". We can fix this by turning the body into a callback which can be called multiple times, but we need to be sure that exposing the number of redirects isn't a security leak, since it'd be the first time JS on the platform could detect that.
Some are questioning whether it even makes sense to link stream consumption to bytes uploaded.
Long story short: this isn't possible yet, but in future this will be handled either by streams, or some kind of higher-level callback passed into fetch().
My solution is to use axios, which supports this pretty well:
axios.request({
  method: "post",
  url: "/aaa",
  data: myData,
  onUploadProgress: (p) => {
    console.log(p);
    // this.setState({
    //   fileprogress: p.loaded / p.total
    // })
  }
}).then(data => {
  // this.setState({
  //   fileprogress: 1.0,
  // })
})
I have an example of using this in React on GitHub.
fetch: not possible yet
It sounds like upload progress will eventually be possible with fetch once it supports a ReadableStream as the body. This is currently not implemented, but it's in progress. I think the code will look something like this:
warning: this code does not work yet, still waiting on browsers to support it
async function main() {
  const blob = new Blob([new Uint8Array(10 * 1024 * 1024)]); // any Blob, including a File
  const progressBar = document.getElementById("progress");
  const totalBytes = blob.size;
  let bytesUploaded = 0;

  // Pass-through stream: pull each chunk from the blob, count its bytes,
  // then re-enqueue it for the upload
  const blobReader = blob.stream().getReader();
  const progressTrackingStream = new ReadableStream({
    async pull(controller) {
      const result = await blobReader.read();
      if (result.done) {
        console.log("completed stream");
        controller.close();
        return;
      }
      controller.enqueue(result.value);
      bytesUploaded += result.value.byteLength;
      console.log("upload progress:", bytesUploaded / totalBytes);
      progressBar.value = bytesUploaded / totalBytes;
    },
  });

  const response = await fetch("https://httpbin.org/put", {
    method: "PUT",
    headers: {
      "Content-Type": "application/octet-stream"
    },
    body: progressTrackingStream,
    duplex: "half", // required by browsers that do support request streams (Chrome >= 105, see below)
  });
  console.log("success:", response.ok);
}

main().catch(console.error);
upload: <progress id="progress" />
workaround: good ol' XMLHttpRequest
Instead of fetch(), it's possible to use XMLHttpRequest to track upload progress — the xhr.upload object emits a progress event.
async function main() {
  const blob = new Blob([new Uint8Array(10 * 1024 * 1024)]); // any Blob, including a File
  const uploadProgress = document.getElementById("upload-progress");
  const downloadProgress = document.getElementById("download-progress");

  const xhr = new XMLHttpRequest();
  const success = await new Promise((resolve) => {
    xhr.upload.addEventListener("progress", (event) => {
      if (event.lengthComputable) {
        console.log("upload progress:", event.loaded / event.total);
        uploadProgress.value = event.loaded / event.total;
      }
    });
    xhr.addEventListener("progress", (event) => {
      if (event.lengthComputable) {
        console.log("download progress:", event.loaded / event.total);
        downloadProgress.value = event.loaded / event.total;
      }
    });
    xhr.addEventListener("loadend", () => {
      resolve(xhr.readyState === 4 && xhr.status === 200);
    });
    xhr.open("PUT", "https://httpbin.org/put", true);
    xhr.setRequestHeader("Content-Type", "application/octet-stream");
    xhr.send(blob);
  });
  console.log("success:", success);
}

main().catch(console.error);
upload: <progress id="upload-progress"></progress><br/>
download: <progress id="download-progress"></progress>
Update: as the accepted answer says, upload progress is impossible with fetch for now, but the code below (which tracks download progress of the response) handled our problem for some time. I should add that we eventually had to switch to a library based on XMLHttpRequest.
const response = await fetch(url);
const total = Number(response.headers.get('content-length'));
const reader = response.body.getReader();
let bytesReceived = 0;

while (true) {
  const result = await reader.read();
  if (result.done) {
    console.log('Fetch complete');
    break;
  }
  bytesReceived += result.value.length;
  console.log('Received', bytesReceived, 'bytes of data so far');
}
thanks to this link: https://jakearchibald.com/2016/streams-ftw/
As already explained in the other answers, it is not possible with fetch, but with XHR. Here is my a-little-more-compact XHR solution:
const uploadFiles = (url, files, onProgress) =>
  new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.upload.addEventListener('progress', e => onProgress(e.loaded / e.total));
    xhr.addEventListener('load', () => resolve({ status: xhr.status, body: xhr.responseText }));
    xhr.addEventListener('error', () => reject(new Error('File upload failed')));
    xhr.addEventListener('abort', () => reject(new Error('File upload aborted')));
    xhr.open('POST', url, true);
    const formData = new FormData();
    Array.from(files).forEach((file, index) => formData.append(index.toString(), file));
    xhr.send(formData);
  });
Works with one or multiple files.
If you have a file input element like this:
<input type="file" multiple id="fileUpload" />
Call the function like this:
document.getElementById('fileUpload').addEventListener('change', async e => {
  const onProgress = progress => console.log('Progress:', `${Math.round(progress * 100)}%`);
  const response = await uploadFiles('/api/upload', e.currentTarget.files, onProgress);
  if (response.status >= 400) {
    throw new Error(`File upload failed - Status code: ${response.status}`);
  }
  console.log('Response:', response.body);
});
Also works with the e.dataTransfer.files you get from a drop event when building a file drop zone, as in the sketch below.
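For reference, a minimal drop-zone sketch reusing the same uploadFiles helper (the #drop-zone element is hypothetical):

const dropZone = document.getElementById('drop-zone'); // hypothetical drop target
// 'dragover' must be cancelled, otherwise the browser never fires 'drop'
dropZone.addEventListener('dragover', e => e.preventDefault());
dropZone.addEventListener('drop', async e => {
  e.preventDefault();
  const onProgress = progress => console.log('Progress:', `${Math.round(progress * 100)}%`);
  const response = await uploadFiles('/api/upload', e.dataTransfer.files, onProgress);
  console.log('Response:', response.body);
});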
I don't think it's possible. The draft states:
it is currently lacking [in comparison to XHR] when it comes to request progression
(old answer):
The first example in the Fetch API chapter gives some insight on how to do this:
If you want to receive the body data progressively:
function consume(reader) {
  var total = 0
  return new Promise((resolve, reject) => {
    function pump() {
      reader.read().then(({done, value}) => {
        if (done) {
          resolve()
          return
        }
        total += value.byteLength
        log(`received ${value.byteLength} bytes (${total} bytes in total)`)
        pump()
      }).catch(reject)
    }
    pump()
  })
}

fetch("/music/pk/altes-kamuffel.flac")
  .then(res => consume(res.body.getReader()))
  .then(() => log("consumed the entire body without keeping the whole thing in memory!"))
  .catch(e => log("something went wrong: " + e))
Apart from their use of the Promise constructor antipattern, you can see that response.body is a Stream from which you can read chunk by chunk using a Reader, and you can fire an event or do whatever you like (e.g. log the progress) for each of them.
However, the Streams spec doesn't appear to be quite finished, and I have no idea whether this already works in any fetch implementation.
with fetch: now possible with Chrome >= 105 🎉
How to:
https://developer.chrome.com/articles/fetch-streaming-requests/
Currently not supported by other browsers (that may have changed by the time you read this; please edit my answer accordingly).
Feature detection (source)
const supportsRequestStreams = (() => {
  let duplexAccessed = false;

  const hasContentType = new Request('', {
    body: new ReadableStream(),
    method: 'POST',
    get duplex() {
      duplexAccessed = true;
      return 'half';
    },
  }).headers.has('Content-Type');

  return duplexAccessed && !hasContentType;
})();
HTTP >= 2 required
The fetch will be rejected if the connection is HTTP/1.x.
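Once support is detected, the upload itself is just a fetch() with a stream body plus the duplex option (a minimal sketch; httpbin serves as a stand-in echo service, and stream is assumed to be a ReadableStream producing the request body, e.g. the progress-tracking stream from the earlier answer):

// Assumes supportsRequestStreams (above) is true and the connection is HTTP/2+.
const response = await fetch("https://httpbin.org/put", {
  method: "PUT",
  headers: { "Content-Type": "application/octet-stream" },
  body: stream,
  duplex: "half", // mandatory whenever the body is a stream
});
console.log("success:", response.ok);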
Since none of the answers solve the problem, here is an implementation idea: you can measure the upload speed by sending a small initial chunk of known size, and then estimate the total upload time as content-length / upload-speed.
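A rough sketch of that idea (the /probe endpoint and all names here are illustrative, not from any library):

// Upload a small probe of known size, time it, and extrapolate.
async function estimateUploadMs(totalBytes, probeBytes = 64 * 1024) {
  const probe = new Blob([new Uint8Array(probeBytes)]);
  const start = performance.now();
  await fetch("/probe", { method: "POST", body: probe }); // hypothetical endpoint
  const elapsedMs = performance.now() - start;
  const bytesPerMs = probeBytes / elapsedMs;
  return totalBytes / bytesPerMs; // estimated duration of the full upload in ms
}

The estimate is only as good as the connection is stable, and the probe costs an extra round trip, so treat the result as a ballpark figure.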
A possible workaround would be to utilize the new Request() constructor, then check the Request.bodyUsed Boolean attribute
The bodyUsed attribute’s getter must return true if disturbed, and
false otherwise.
to determine if the stream is disturbed
An object implementing the Body mixin is said to be disturbed if
body is non-null and its stream is disturbed.
Return the fetch() Promise from within .then() chained to a recursive .read() call of a ReadableStream once Request.bodyUsed is equal to true.
Note that this approach does not read the bytes of the Request.body as the bytes are streamed to the endpoint. Also, the upload could complete well before any response is returned in full to the browser.
const [input, progress, label] = [
  document.querySelector("input"),
  document.querySelector("progress"),
  document.querySelector("label")
];
const url = "/path/to/server/";

input.onmousedown = () => {
  label.innerHTML = "";
  progress.value = "0";
};

input.onchange = (event) => {
  const file = event.target.files[0];
  const filename = file.name;
  progress.max = file.size;

  const request = new Request(url, {
    method: "POST",
    body: file,
    cache: "no-store"
  });

  const upload = settings => fetch(settings);

  const uploadProgress = new ReadableStream({
    start(controller) {
      console.log("starting upload, request.bodyUsed:", request.bodyUsed);
      controller.enqueue(request.bodyUsed);
    },
    pull(controller) {
      if (request.bodyUsed) {
        controller.close();
      }
      controller.enqueue(request.bodyUsed);
      console.log("pull, request.bodyUsed:", request.bodyUsed);
    },
    cancel(reason) {
      console.log(reason);
    }
  });

  const [fileUpload, reader] = [
    upload(request)
      .catch(e => {
        reader.cancel();
        throw e;
      }),
    uploadProgress.getReader()
  ];

  const processUploadRequest = ({value, done}) => {
    if (value || done) {
      console.log("upload complete, request.bodyUsed:", request.bodyUsed);
      // set `progress.value` to `progress.max` here
      // if not awaiting server response
      // progress.value = progress.max;
      return reader.closed.then(() => fileUpload);
    }
    console.log("upload progress:", value);
    progress.value = +progress.value + 1;
    return reader.read().then(result => processUploadRequest(result));
  };

  reader.read().then(({value, done}) => processUploadRequest({value, done}))
    .then(response => response.text())
    .then(text => {
      console.log("response:", text);
      progress.value = progress.max;
      input.value = "";
    })
    .catch(err => console.log("upload error:", err));
};
I fished around for some time on this, so for everyone who comes across this issue too, here is my solution:
const form = document.querySelector('form');
const status = document.querySelector('#status');

// When the form gets submitted...
form.addEventListener('submit', async function (event) {
  // ...cancel the default behavior (full-page form submit)
  event.preventDefault();
  // Inform the user that the upload has begun
  status.innerText = 'Uploading..';
  // Create FormData from the form
  const formData = new FormData(form);
  // Send the request; fetch resolves once the response headers have arrived
  // (note: `request` is really the Response object, so the loop below tracks
  // the server's reply being received rather than the bytes being sent)
  const request = await fetch('https://httpbin.org/post', { method: 'POST', body: formData });
  // Number of bytes in the response body
  const bytesToUpload = request.headers.get('content-length');
  // Create a reader from the response body
  const reader = request.body.getReader();
  // Track how much data we have received so far
  let bytesUploaded = 0;
  // Get the first chunk from the reader
  let chunk = await reader.read();
  // While we have more chunks to go
  while (!chunk.done) {
    // Increase the number of bytes transferred
    bytesUploaded += chunk.value.length;
    // Inform the user how far we are
    status.innerText = 'Uploading (' + (bytesUploaded / bytesToUpload * 100).toFixed(2) + ')...';
    // Read the next chunk
    chunk = await reader.read();
  }
});
const response = await fetch('./foo.json');
const total = Number(response.headers.get('content-length'));
let loaded = 0;
// Async iteration of a ReadableStream works in Firefox and Node, but not yet
// in all browsers; iterate the body itself, not the reader
for await (const {length} of response.body) {
  loaded += length;
  const progress = ((loaded / total) * 100).toFixed(2); // toFixed(2) keeps two digits after the decimal point
  console.log(`${progress}%`); // or yourDiv.textContent = `${progress}%`;
}
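If the target browsers don't support async iteration of a ReadableStream, the equivalent reader loop looks like this (same logic, reusing response and total from the snippet above):

const reader = response.body.getReader();
let received = 0;
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  received += value.length;
  console.log(`${((received / total) * 100).toFixed(2)}%`);
}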
The key part is the ReadableStream: obj_response.body.
Sample:
const parse = (result) => {
  console.log(result);
  // ...
  return result.value ? true : false; // continue?
};

fetch('').then((response) => {
  const reader = response.body.getReader();
  const pump = () => reader.read().then(parse).then(cont => cont ? pump() : undefined);
  return pump();
});
You can test it by running it on a huge page, e.g. https://html.spec.whatwg.org/ or https://html.spec.whatwg.org/print.pdf. Press Ctrl+Shift+J to open the console and paste the code in.
(Tested on Chrome.)
I have an axios GET request which takes too long to resolve. This is for a site hosted on Heroku, which has a request timeout of 30 seconds. The following code resolves the request after about 50 seconds (which is surprisingly long, as there are only 21 URLs to loop through in playerLink). Therefore, the request is never resolved on the live site.
Here is the Promise code:
const PORT = 8000
const axios = require('axios')
const cheerio = require('cheerio')
const express = require('express')
const cors = require('cors')

const app = express()
app.use(cors())
app.listen(PORT, () => console.log(`server running on PORT ${PORT}`))

const players = 'https://www.trinethunder.com/sports/sball/2021-22/teams/trine?view=roster'
const playerStats = 'https://www.trinethunder.com'
const playerLink = []

app.get('/players', (req, res) => {
  function getPlayers() {
    return new Promise((resolve, reject) => {
      axios(players)
        .then((response) => {
          const html = response.data;
          const $ = cheerio.load(html);
          $("td.text.pinned-col > a", html).each(function () {
            var link = $(this).attr("href");
            // if link not yet in array, push to array
            if (playerLink.indexOf(playerStats + link) === -1) {
              playerLink.push(playerStats + link);
            }
          });
          resolve()
        })
        .catch((err) => {
          console.log(err);
        });
    });
  }

  function getPlayerStats() {
    setTimeout(async () => {
      const statsArray = []
      for (let i = 0; i < playerLink.length; i++) {
        await new Promise((resolve, reject) => {
          axios.get(playerLink[i])
            .then((response) => {
              const html = response.data;
              const $ = cheerio.load(html);
              const statName = [];
              const statDesc = [];
              const statNum = [];
              $("h2 > span:nth-child(1)", html).each(function () {
                var name = $(this).text();
                statName.push(name);
              });
              $(".stat-title", html).each(function () {
                var stat1 = $(this).text();
                statDesc.push(stat1);
              });
              $(".stat-value", html).each(function () {
                var stat2 = $(this).text();
                statNum.push(stat2);
              });
              // Conditional is here because sometimes statsArray
              // gets filled multiple times
              if (statsArray.length < 63) {
                statsArray.push(statName, statDesc, statNum);
              }
              resolve();
            })
            .catch((err) => console.log(err));
        });
      }
      res.json(statsArray)
    }, 400);
  }

  getPlayers()
    .then(getPlayerStats)
    .catch((err) => console.log(err));
});
Simplified Fetch statement for /players:
fetch('http://localhost:8000/players')
  .then(response => response.json())
  .then(data => {
    console.log(data)
  }).catch(err => console.log(err))
Please let me know if you see anything that may be slowing down the execution of the request.
I cleaned up the code, removed the setTimeout(), set it up for maximum parallelization, instrumented it, and made it runnable stand-alone. Having done so, the log it produces is below, and I see that getPlayers() takes 2413ms and the synchronous cheerio processing of the individual player requests takes a total of 6087ms. From start to finish, the whole thing takes 9415ms on my system.
This is significantly faster than what you report. The biggest structural change I made is that all the individual getPlayerStat requests are made in parallel, not in series, which (if the target server can handle it) shortens the total wait for network requests when getting player stats. I also removed the setTimeout() as that seemed like a hack for some other problem; once the code is structured properly for asynchronous handling, it should not be necessary.
Here's the detailed log if you want to see where all the detailed time is spent. You can run the code below on your own system to see what you get there:
000000: begin all
000006: begin getPlayers()
002419: end getPlayers()
002419: begin getPlayerStats
002420: begin get https://www.trinethunder.com/sports/sball/2021-22/players/makinzeromingersy0k
002423: begin get https://www.trinethunder.com/sports/sball/2021-22/players/emersynhaneyjnrb
002424: begin get https://www.trinethunder.com/sports/sball/2021-22/players/amandapratheruluw
002424: begin get https://www.trinethunder.com/sports/sball/2021-22/players/adrienneroseybff7
002425: begin get https://www.trinethunder.com/sports/sball/2021-22/players/emmabeyeri6zz
002426: begin get https://www.trinethunder.com/sports/sball/2021-22/players/aprilsellersi95s
002427: begin get https://www.trinethunder.com/sports/sball/2021-22/players/annakoeppl38q8
002427: begin get https://www.trinethunder.com/sports/sball/2021-22/players/annagilli8rl
002428: begin get https://www.trinethunder.com/sports/sball/2021-22/players/angelenaperry2scn
002429: begin get https://www.trinethunder.com/sports/sball/2021-22/players/laurenclausenfb4j
002430: begin get https://www.trinethunder.com/sports/sball/2021-22/players/emilywheaton1jym
002430: begin get https://www.trinethunder.com/sports/sball/2021-22/players/kaylyncoahranhp6r
002431: begin get https://www.trinethunder.com/sports/sball/2021-22/players/mercededaughertyiswy
002432: begin get https://www.trinethunder.com/sports/sball/2021-22/players/taylormurdockgeho
002432: begin get https://www.trinethunder.com/sports/sball/2021-22/players/lexiclark77gr
002433: begin get https://www.trinethunder.com/sports/sball/2021-22/players/ainsleyphillipsmfe9
002434: begin get https://www.trinethunder.com/sports/sball/2021-22/players/ellietrinexhe2
002434: begin get https://www.trinethunder.com/sports/sball/2021-22/players/ashleyswartouta714
002435: begin get https://www.trinethunder.com/sports/sball/2021-22/players/gisellerileybdb8
002436: begin get https://www.trinethunder.com/sports/sball/2021-22/players/elizabethkoch5umu
002436: begin get https://www.trinethunder.com/sports/sball/2021-22/players/scarlettelliott0bvt
003251: after get https://www.trinethunder.com/sports/sball/2021-22/players/kaylyncoahranhp6r
003596: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/kaylyncoahranhp6r
003599: after get https://www.trinethunder.com/sports/sball/2021-22/players/makinzeromingersy0k
003902: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/makinzeromingersy0k
003905: after get https://www.trinethunder.com/sports/sball/2021-22/players/emersynhaneyjnrb
004200: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/emersynhaneyjnrb
004203: after get https://www.trinethunder.com/sports/sball/2021-22/players/amandapratheruluw
004489: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/amandapratheruluw
004492: after get https://www.trinethunder.com/sports/sball/2021-22/players/emmabeyeri6zz
004771: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/emmabeyeri6zz
004773: after get https://www.trinethunder.com/sports/sball/2021-22/players/aprilsellersi95s
005060: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/aprilsellersi95s
005063: after get https://www.trinethunder.com/sports/sball/2021-22/players/elizabethkoch5umu
005345: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/elizabethkoch5umu
005348: after get https://www.trinethunder.com/sports/sball/2021-22/players/emilywheaton1jym
005638: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/emilywheaton1jym
005643: after get https://www.trinethunder.com/sports/sball/2021-22/players/ashleyswartouta714
005943: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/ashleyswartouta714
005951: after get https://www.trinethunder.com/sports/sball/2021-22/players/ainsleyphillipsmfe9
006243: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/ainsleyphillipsmfe9
006245: after get https://www.trinethunder.com/sports/sball/2021-22/players/adrienneroseybff7
006541: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/adrienneroseybff7
006545: after get https://www.trinethunder.com/sports/sball/2021-22/players/annagilli8rl
006821: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/annagilli8rl
006824: after get https://www.trinethunder.com/sports/sball/2021-22/players/mercededaughertyiswy
007111: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/mercededaughertyiswy
007118: after get https://www.trinethunder.com/sports/sball/2021-22/players/lexiclark77gr
007402: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/lexiclark77gr
007411: after get https://www.trinethunder.com/sports/sball/2021-22/players/angelenaperry2scn
007681: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/angelenaperry2scn
007685: after get https://www.trinethunder.com/sports/sball/2021-22/players/laurenclausenfb4j
007974: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/laurenclausenfb4j
007976: after get https://www.trinethunder.com/sports/sball/2021-22/players/scarlettelliott0bvt
008265: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/scarlettelliott0bvt
008267: after get https://www.trinethunder.com/sports/sball/2021-22/players/ellietrinexhe2
008553: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/ellietrinexhe2
008555: after get https://www.trinethunder.com/sports/sball/2021-22/players/gisellerileybdb8
008838: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/gisellerileybdb8
008840: after get https://www.trinethunder.com/sports/sball/2021-22/players/annakoeppl38q8
009129: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/annakoeppl38q8
009131: after get https://www.trinethunder.com/sports/sball/2021-22/players/taylormurdockgeho
009415: after cheerio parse https://www.trinethunder.com/sports/sball/2021-22/players/taylormurdockgeho
009415: end all
... data here
getPlayers() took 2413ms
cheerio processing took 6087ms
And, here's the stand-alone code that anyone can run:
const axios = require('axios');
const cheerio = require('cheerio');

const players = 'https://www.trinethunder.com/sports/sball/2021-22/teams/trine?view=roster';
const playerStats = 'https://www.trinethunder.com';

const zeroes = "000000000000000000000000000000";
function zeroPad(num, padLen) {
  let str = num + "";
  let padNum = padLen - str.length;
  if (padNum > 0) {
    str = zeroes.slice(0, padNum) + str;
  }
  return str;
}

const base = Date.now();
function log(...args) {
  let delta = Date.now() - base;
  let deltaPad = zeroPad(delta, 6);
  console.log(deltaPad + ": ", ...args);
}

let getPlayersT = 0;
let cheerioT = 0;

async function run() {
  async function getPlayers() {
    log("begin getPlayers()");
    let startT = Date.now();
    const playerLink = [];
    const response = await axios(players);
    const html = response.data;
    const $ = cheerio.load(html);
    $("td.text.pinned-col > a", html).each(function () {
      const link = $(this).attr("href");
      // if link not yet in array, push to array
      if (playerLink.indexOf(playerStats + link) === -1) {
        playerLink.push(playerStats + link);
      }
    });
    log("end getPlayers()");
    getPlayersT += Date.now() - startT;
    return playerLink;
  }

  async function getPlayerStats(playerLink) {
    log("begin getPlayerStats");
    const statsArray = [];
    await Promise.all(playerLink.map(async link => {
      log(`begin get ${link}`);
      const response = await axios.get(link);
      log(`after get ${link}`);
      const html = response.data;
      const startT = Date.now();
      const $ = cheerio.load(html);
      const statName = [];
      const statDesc = [];
      const statNum = [];
      $("h2 > span:nth-child(1)", html).each(function () {
        var name = $(this).text();
        statName.push(name);
      });
      $(".stat-title", html).each(function () {
        var stat1 = $(this).text();
        statDesc.push(stat1);
      });
      $(".stat-value", html).each(function () {
        var stat2 = $(this).text();
        statNum.push(stat2);
      });
      // Conditional is here because sometimes statsArray
      // gets filled multiple times
      if (statsArray.length < 63) {
        statsArray.push(statName, statDesc, statNum);
      }
      cheerioT += Date.now() - startT;
      log(`after cheerio parse ${link}`);
    }));
    return statsArray;
  }

  try {
    log("begin all");
    const playerLink = await getPlayers();
    const statsArray = await getPlayerStats(playerLink);
    log("end all");
    return statsArray;
  } catch (e) {
    console.log(e);
  }
}

run().then(result => {
  console.log(result);
  console.log(`getPlayers() took ${getPlayersT}ms`);
  console.log(`cheerio processing took ${cheerioT}ms`);
}).catch(err => {
  console.log("error");
});
I am trying to implement download functionality using streams in Node.js.
In the code I am trying to simulate an endpoint that sends data in chunks, something similar to paginated data, for example chunks of size 5000. Or, to make it clearer, we can send top and skip parameters to the endpoint to get a particular chunk of data; if no parameters are provided, it sends the first 5000 entries.
There are 2 cases that I am trying to take care of:
1. When the user cancels the download from the browser, how do I stop the continuous fetching of data from the endpoint?
2. When the user pauses the download from the browser, how do I pause the data fetching, and then resume it once the user resumes the download?
The first case can be taken care of using the 'close' event of the request: when the connection between the client and the server gets closed, I disconnect.
If anyone has a better way of implementing this please suggest.
I am having trouble handling the second case when the user pauses.
If anyone could guide me through this, or even provide a better solution to the overall problem(incl. handling the chunks of data), it would be really helpful.
const { createServer } = require('http');
const { Transform } = require('stream');
const axios = require('axios');

var c = 0;

class ApiStream extends Transform {
  constructor(apiCallback, res, req) {
    super();
    this.apiCallback = apiCallback;
    this.isPipeSetup = false;
    this.res = res;
    this.req = req;
  }

  // Will get data continuously
  async start() {
    let response;
    try {
      response = await this.apiCallback();
    } catch (e) {
      response = null;
    }
    if (!this.isPipeSetup) {
      this.pipe(this.res);
      this.isPipeSetup = true;
    }
    if (response) {
      response = response.data;
      if (Array.isArray(response)) {
        response.forEach((item) => {
          this.push(JSON.stringify(item) + "\n");
        });
      } else if (typeof response === "object") {
        this.push(JSON.stringify(response) + "\n");
      } else if (typeof response === "string") {
        this.push(response + "\n");
      }
      this.start();
    } else {
      this.push(null);
      console.log('Stream ended');
    }
  }
}

const server = createServer(async (req, res, stream) => {
  res.setHeader("Content-disposition", "attachment; filename=download.json");
  res.setHeader("Content-type", "text/plain");

  let disconnected = false;
  const filestream = new ApiStream(async () => {
    let response;
    try {
      if (disconnected) {
        console.log('Client connection closed');
        return null;
      }
      c++;
      response = await axios.get("https://jsonplaceholder.typicode.com/users");
      // Simulate delay in data fetching
      let z = 0;
      if (c >= 200) response = null;
      while (z < 10000) {
        let b = 0;
        while (b < 10000) {
          b += 0.5;
        }
        z += 0.5;
      }
    } catch (error) {
      res.status(500).send(error);
    }
    if (response) {
      return response;
    }
    return null;
  }, res, req);

  await filestream.start();

  req.on('close', (err) => {
    disconnected = true;
  });
});

server.listen(5050, () => console.log('server running on port 5050'));
I'm developing an AWS Lambda in TypeScript that uses Axios to get data from an API; that data is then filtered and put into DynamoDB.
The code looks as follows:
export {};
const axios = require("axios");
const AWS = require('aws-sdk');

exports.handler = async (event: any) => {
  const shuttleDB = new AWS.DynamoDB.DocumentClient();
  const startDate = "2021-08-16";
  const endDate = "2021-08-16";
  const startTime = "16:00:00";
  const endTime = "17:00:00";

  const response = await axios.post('URL', {
    data: {
      "von": startDate + "T" + startTime,
      "bis": endDate + "T" + endTime
    }
  }, {
    headers: {
      'x-rs-api-key': KEY
    }
  });

  const params = response.data.data;
  const putPromise = params.map(async (elem: object) => {
    delete elem.feat1;
    delete elem.feat2;
    delete elem.feat3;
    delete elem.feat4;
    delete elem.feat5;
    const paramsDynamoDB = {
      TableName: String(process.env.TABLE_NAME),
      Item: elem
    }
    shuttleDB.put(paramsDynamoDB).promise();
  });
  await Promise.all(putPromise);
};
This all works kind of fine. When the test button is pushed the first time, everything seems to work: I receive all the console.logs during development, but the data is not put into the DB. On the second try, the output is the same, but this time the data is successfully put into the DB.
Any ideas regarding this issue? How can I solve this problem and have the data put into the DB on the first try?
Thanks in advance!
You need to return the promise from the db call:
return shuttleDB.put(paramsDynamoDB).promise();
Also, Promise.all rejects early if any call fails (compared to Promise.allSettled, which waits for every promise), so it may be worth logging any errors that occur too.
Better still, take a look at transactWrite (https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/DynamoDB/DocumentClient.html#transactWrite-property) to ensure all or nothing gets written.
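Putting that together, a sketch of the corrected mapping (Promise.allSettled is used here so individual failures get logged instead of silently aborting the batch; everything else is unchanged from the question):

const putPromises = params.map((elem) => {
  delete elem.feat1; // ... same field cleanup as before
  const paramsDynamoDB = {
    TableName: String(process.env.TABLE_NAME),
    Item: elem
  };
  return shuttleDB.put(paramsDynamoDB).promise(); // returning the promise is the important part
});
const results = await Promise.allSettled(putPromises);
results
  .filter((r) => r.status === 'rejected')
  .forEach((r) => console.error('put failed:', r.reason));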
I have an input that sends an API call on submit to the Unsplash API. I am trying to convert this into a Netlify function but am not sure how to pass the params from the input into the function. I am trying to keep the API key hidden. I've never worked with the qs package; I looked up the docs but have not been able to quite figure it out.
script.js
const KEY = ""; // secret
const URL = `https://api.unsplash.com/search/photos?page=1&per_page=50&client_id=${process.env.KEY}`;
const input = document.querySelector(".input");
const form = document.querySelector(".search-form");
const background = document.querySelector(".background");
const overlay = document.querySelector(".overlay");
const header = document.querySelector(".title");
let results = [];

search = (searchTerm) => {
  let url = `${URL}&query=${searchTerm}`; // this should hit the netlify endpoint instead
  return fetch(url)
    .then((response) => response.json())
    .then((result) => {
      toggleStyles();
      header.appendChild(form);
      result.results.forEach((image) => {
        const galleryItem = document.createElement("div");
        galleryItem.className = "gallery-item";
        const imageDiv = document.createElement("div");
        imageDiv.className = "image-div";
        document.querySelector(".results-page").appendChild(galleryItem);
        galleryItem.appendChild(imageDiv);
        imageDiv.innerHTML =
          "<img class='image' src=" + image.urls.regular + ">";
        form.classList.remove("toggle-show");
        input.classList.add("header-expanded");
        form.addEventListener("submit", (e) => {
          e.preventDefault();
          document.querySelector(".results-page").remove();
        });
      });
      console.log(result.results);
      return results;
    });
};

toggleStyles = () => {
  const resultsContainer = document.createElement("div");
  resultsContainer.className = "results-page";
  document.body.appendChild(resultsContainer);
};

input.addEventListener("focus", (e) => {
  e.preventDefault();
  input.style = "font-family: 'Raleway', sans-serif";
  input.placeholder = "";
});

input.addEventListener("blur", (e) => {
  e.preventDefault();
  input.style = "font-family: FontAwesome";
  input.value = "";
  input.placeholder = "\uf002";
});

form.addEventListener("submit", (e) => {
  e.preventDefault();
  let searchTerm = input.value;
  search(searchTerm);
});
token-hider.js
const axios = require("axios");
const qs = require("qs");
exports.handler = async function (event, context) {
// apply our function to the queryStringParameters and assign it to a variable
const API_PARAMS = qs.stringify(event.queryStringParameters);
console.log("API_PARAMS", API_PARAMS);
// Get env var values defined in our Netlify site UI
// TODO: customize your URL and API keys set in the Netlify Dashboard
// this is secret too, your frontend won't see this
const { KEY } = process.env;
const URL = `https://api.unsplash.com/search/photos?page=1&per_page=50&client_id=${KEY}`;
console.log("Constructed URL is ...", URL);
try {
const { data } = await axios.get(URL);
// refer to axios docs for other methods if you need them
// for example if you want to POST data:
// axios.post('/user', { firstName: 'Fred' })
return {
statusCode: 200,
body: JSON.stringify(data),
};
} catch (error) {
const { status, statusText, headers, data } = error.response;
return {
statusCode: error.response.status,
body: JSON.stringify({ status, statusText, headers, data }),
};
}
};
I added the KEY as an Environment variable in my netlify UI, and am able to hit the function's endpoint. Any help is greatly appreciated as I am new to serverless functions but want to learn JAMstack really badly.
I think you are asking two different things here - how to use secret keys in Netlify functions and how to pass parameters to it from your front end code.
You can define environment variables in your Netlify site settings. These variables can then be used in your Netlify functions via process.env. So if you called your variable SECRET_KEY, then your Netlify function code (not your front end code!) would be able to read it via process.env.SECRET_KEY.
Looking over your code, it looks like you understand that: you use it in your function, but you also try to use process.env.KEY in the client-side code (script.js). You can remove that.
Your code gets the query string parameters, and it looks like you just need to add them to the end of the URL you hit, as sketched below. Did you try that?
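For example, a sketch of both sides (the /.netlify/functions/token-hider path follows Netlify's default function routing; the query parameter name is up to you):

// Front end: hit the Netlify function instead of the Unsplash API directly
search = (searchTerm) => {
  const url = `/.netlify/functions/token-hider?query=${encodeURIComponent(searchTerm)}`;
  return fetch(url).then((response) => response.json());
  // ...render results as before
};

// token-hider.js: append the stringified params to the upstream URL
const URL = `https://api.unsplash.com/search/photos?page=1&per_page=50&client_id=${KEY}&${API_PARAMS}`;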