I need to load URLs from an array in the browser
So for example: the API URL is www.example.com/dst=ad. Here ad is one input value, and I have a list of values that need to be executed one after another at a 5-10 second interval.
I'm familiar with doing this kind of task via a bash for loop, but I need to do this particular task in the browser only.
So here is exactly what I need:
for dst in ad ef gh mn; do curl "https://www.example.com/dst=$dst"; sleep 5; done
I want the browser to load the given list of values for the API URL one after another, because the API authenticates via single sign-on. I heard this can be done with JavaScript, but I'm not familiar with the language.
Can someone please provide any input on this?
Try this:
// fetch each URL in order, pausing `interval` ms between requests
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

const loadURL = async (interval, array) => {
  for (const element of array) {
    await fetch(`https://www.example.com/dst=${element}`);
    await delay(interval);
  }
};

loadURL(yourIntervalTime, yourArray);
I am creating a Chrome extension that automates some tasks on a certain website. The extension takes an array of data and sends the items one by one (mimicking a click on the button in the website that sends the data to the backend), so I am automating that repeating async task.
The problem is that I don't get any data returned from the website after clicking the send button, because I am not the owner or the developer of that website.
The website is like a book store: you search for a book by name and then do further work with it. My extension automates this process, so instead of searching for the book through the website UI, the extension has many inputs that accept multiple book names, then sends this collection to the website backend by mimicking clicks on the site's send button.
My problem here is that I have a function that works through each item in the collection once I click a button in the extension, but I need to wait in each cycle until the website finds the book. That asynchronous step is my real problem: I need to wait for the book-finding function to complete before I can do the further work.
//extension "add books" button
const addBooks = document.getElementById('add-books');
//books collected from the extension inputs
let bookCollection = [{...}, {...}, {...}]
addBooks.addEventListener('click', async () => {
  for (let book of bookCollection) {
    let { name, code, price, qty } = book;
    //trying to await the website finding the book
    await getBook(name);
    await addBook(price, qty);
  }
})
Unfortunately this doesn't work. The function should process every item in the collection, waiting for the async step to finish before repeating the cycle with the next book. How can I fix this code to make it work?
An async callback passed to addEventListener is fine by itself; the listener just ignores the promise it returns. What await cannot do is pause on a function that doesn't return a Promise: a getBook that just logs and returns undefined lets the loop continue immediately. Make getBook return a Promise that resolves when the website has actually found the book:
function getBook(name) {
  return new Promise((resolve) => {
    console.log("getBook", name);
    // call resolve() once the site signals that the book was found,
    // e.g. from a MutationObserver or a response handler
    resolve();
  });
}

document.body.addEventListener('click', async () => {
  await getBook("bob");
});
I have an interval that loads the HTML every 3 secs, but I want it to refresh the content rather than keep appending more markup under what was already rendered.
async function Products(){
setInterval(async function() {
const response = await fetch('http://localhost:3000/api/products');
const data = await response.json();
const mainContainer = document.getElementById("myData");
for (let obj of data) {
const div = document.createElement("div");
div.innerHTML = `${obj["sku"]}: ${obj["name"]}`;
mainContainer.appendChild(div);
}
}, 10000)
}
When I click a start button everything works, but how do I make it refresh the already-rendered HTML rather than repeatedly recreating it with the interval? Trying to figure out a good approach to this. Thanks.
Create and append a <div> immediately, and in the interval, assign to its innerHTML:
function Products() {
  const container = document.getElementById("myData").appendChild(document.createElement("div"));
  setInterval(async function () {
    const response = await fetch('http://localhost:3000/api/products');
    const data = await response.json();
    // one assignment replaces the previous rows instead of appending to them
    container.innerHTML = data
      .map((obj) => `<div>${obj["sku"]}: ${obj["name"]}</div>`)
      .join('');
  }, 10000);
}
I think what you are trying to implement is a series of observable elements, so you can update only the ones that have changed instead of all the data, something like React does with the virtual DOM.
Considering the code you already posted, refreshing every element at a set interval is a bad idea. What if you have 1000 users refreshing at the same time? What if that pushes your response time past 3 seconds?
That said, if you really want to build something like that, you have to find a way to load not all the products from the API, but only the ones that are different. If you want to keep this approach, here, in my opinion, is what you could do (see the sketch at the end of this answer):
Start by loading all the products on the page, but set an interval to check a new endpoint which tells you which products have been added since the last load.
Use either an index or a key to identify which product is which, so you know the ones you already have.
You need a way to know which products were updated since the last load.
That's a start. You can implement these in different ways. I would suggest having timestamps for creation and update times, so you can query only the products that fall after the timestamp of your last load.
Add a dynamic ID to your product elements (e.g. <div id=${sku}>Product Here</div>).
That way, you can track your products and recreate only the ones that changed or are new.
That's obviously a complicated way of emulating an open connection. If you want another solution, you could also open a socket to your API, which would push events for updated and created data and would ultimately make your 3-second ping obsolete, resulting in better scalability in my opinion.
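As a rough sketch of the timestamp-plus-keyed-IDs idea, reusing the question's endpoint and container (the ?since= query parameter is hypothetical and would need backend support):
let lastSync = 0;

async function refreshChangedProducts() {
  // hypothetical endpoint variant returning only products created/updated after `lastSync`
  const response = await fetch(`http://localhost:3000/api/products?since=${lastSync}`);
  const changed = await response.json();
  const mainContainer = document.getElementById("myData");
  for (const obj of changed) {
    // reuse the element keyed by SKU when it exists, otherwise create it
    let div = document.getElementById(obj["sku"]);
    if (!div) {
      div = document.createElement("div");
      div.id = obj["sku"];
      mainContainer.appendChild(div);
    }
    div.textContent = `${obj["sku"]}: ${obj["name"]}`;
  }
  lastSync = Date.now();
}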
Per the Evernote documentation for findNotesMetadata, the maximum number of notes returned from the server in one response is 250. I am trying to understand how to make multiple requests to retrieve the entire array if there are more than 250. Below is my current code.
const Evernote = require('evernote');
const developerToken = "...";
const client = new Evernote.Client({token: developerToken, sandbox: false});
const noteStore = client.getNoteStore();
const spec = {}
spec.includeTitle = true;
spec.includeTagGuids = true;
spec.includeAttributes = true;
spec.includeNotebookGuid = true;
const filter = new Evernote.NoteStore.NoteFilter({
words: '*',
});
noteStore.findNotesMetadata(filter, 0, 250, spec)
.then(noteobj => {
...
})
.catch( e => console.error(e));
My current code doesn't incorporate any loop yet, but it works for up to 250 notes. Between the Evernote SDK and dealing with promises, I'm not sure where to start. I have searched online quite a bit for a solution, both directly (looking at Evernote examples) and indirectly (looking at other REST API examples), without any luck. Any help is appreciated.
The offset param to findNotesMetadata is how you indicate the start index into the actual result set you want. In the case of the code you've shown, you're passing in 0 (it's the second param). That is telling the API that you want your results to begin with item 0 in the actual result set, up to a maximum of 250 results.
If you want to "page" through the result set in windows of 250 results, you can call the method again using 250 as the offset, and ask for the next 250 results. This is a fairly common design pattern for paging through result sets via a remote API or anything that has a resource constraint on retrieving data. You'll want to handle the cases when no more results are available: either because you get fewer back than the maxNotes that you ask for, or the corner case where you get exactly the max number but then zero on the following request. That's how you know to break out of your loop.
The Evernote API seems to offer a findNoteCounts method, which should give you an idea of how many actual results there would be, but as with all async systems, there's a theoretical race where that number changes between API calls.
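For illustration, a sketch of that paging loop built on the question's noteStore, filter, and spec (the fetchAllNotes wrapper is mine; the SDK's NotesMetadataList result exposes notes and totalNotes):
const pageSize = 250; // Evernote's documented maximum per response

async function fetchAllNotes() {
  const allNotes = [];
  let offset = 0;
  while (true) {
    const result = await noteStore.findNotesMetadata(filter, offset, pageSize, spec);
    allNotes.push(...result.notes);
    // stop when a window comes back short or we've collected the reported total
    if (result.notes.length < pageSize || allNotes.length >= result.totalNotes) break;
    offset += pageSize;
  }
  return allNotes;
}

fetchAllNotes()
  .then(notes => console.log(`Fetched ${notes.length} notes`))
  .catch(e => console.error(e));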
I'm trying to get the full page load time in seconds with Puppeteer in Node. For this I did some research on the API and other questions and created the following code:
/* First Configuration */
puppeteer.launch({
defaultViewport: { width: 1600, height: 800 }
}).then(async browser => {
const page = await browser.newPage();
await page.setCacheEnabled(false);
await page.goto('https://stackoverflow.com', {waitUntil: 'networkidle0'});
/* Get Page Metrics */
const perf = await page.metrics();
console.log(JSON.stringify(perf));
/* Get Page Evaluate */
const timing = await page.evaluate(() => {
const result = {};
for (const key of Object.keys(window.performance.timing.__proto__))
result[key] = window.performance.timing[key];
return result;
});
console.log(JSON.stringify(timing));
/* Show Results on Browser Close */
await browser.close().then(() => {
var fullyLoadEvaluate = (timing.loadEventEnd - timing.navigationStart);
console.log('Fully Load Time (Page Evaluate): ' + fullyLoadEvaluate);
var fullyLoadMetrics = (perf.LayoutDuration + perf.RecalcStyleDuration + perf.ScriptDuration + perf.TaskDuration);
console.log('Fully Load Time (Page Metrics): ' + fullyLoadMetrics);
/* Send Response to Server */
res.send('Check The Console');
});
});
Basically I use two approaches to collect metrics. One of them is page.metrics(), which returns the following data:
{"Timestamp":961736.600171,"Documents":8,"Frames":4,"JSEventListeners":375,"Nodes":8654,"LayoutCount":27,"RecalcStyleCount":31,"LayoutDuration":0.705517,"RecalcStyleDuration":0.144379,"ScriptDuration":0.527385,"TaskDuration":1.812213,"JSHeapUsedSize":11082496,"JSHeapTotalSize":20344832}
And the other, page.evaluate(), returns the following:
{"navigationStart":1556722407938,"unloadEventStart":0,"unloadEventEnd":0,"redirectStart":0,"redirectEnd":0,"fetchStart":1556722407938,"domainLookupStart":1556722408247,"domainLookupEnd":1556722408548,"connectStart":1556722408548,"connectEnd":1556722408737,"secureConnectionStart":1556722408574,"requestStart":1556722408738,"responseStart":1556722408940,"responseEnd":1556722409087,"domLoading":1556722408957,"domInteractive":1556722409995,"domContentLoadedEventStart":1556722409995,"domContentLoadedEventEnd":1556722410190,"domComplete":1556722412584,"loadEventStart":1556722412584,"loadEventEnd":1556722412589,"toJSON":{}}
In my example I'm testing the site https://stackoverflow.com. Like webpagetest.org and getmetrix.com, I'm trying to get the page's fully-loaded time.
I know this kind of value is inconsistent, but I wonder whether the values I'm calculating are right, and which of the two results is more correct: Fully Load Time (Page Evaluate) or Fully Load Time (Page Metrics)?
You can use page.metrics() to compare two points in time (e.g. before and after page.goto). The page.evaluate approach to read the data from the performance API is also a good alternative. As I already pointed out in the comment, it is not defined what should be considered a "full page load". Both approaches are valid.
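For instance, a minimal sketch of that before/after comparison (the URL is a placeholder; the field names are those page.metrics() returns):
const before = await page.metrics();
await page.goto('https://example.com', { waitUntil: 'load' });
const after = await page.metrics();

// the *Duration fields are cumulative seconds, so the differences
// approximate the work done while the page loaded
console.log(`Script time: ${(after.ScriptDuration - before.ScriptDuration).toFixed(3)}s`);
console.log(`Task time: ${(after.TaskDuration - before.TaskDuration).toFixed(3)}s`);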
It's even more complex
There are a number of things which people might consider "the page is loaded":
DOMContentLoaded event fired
Load event fired
Time it takes from navigation start until all resources embedded in the document (like images) are loaded
Time it takes from navigation start until all resources are loaded
Time until there are no more ongoing network requests.
...
You also have to consider whether you want network-related phases (like DNS) to be part of the measurement. Here is an example request (generated with the Chrome DevTools Network tab) showing how complex the timing of a single request might be.
There is also a document explaining each of these phases.
Simple approach
The simplest way to measure the load time is just to start measuring when the navigation starts and stop measuring after the page has loaded. This could be done like this:
const t1 = Date.now();
await page.goto('https://example.com');
const diff1 = Date.now() - t1;
console.log(`Time: ${diff1}ms`);
Note that there are also other APIs (page.metrics, process.hrtime, perf_hooks) to get more precise timestamps.
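For example, here is the same measurement using perf_hooks, Node's high-resolution timer (the URL is again a placeholder):
const { performance } = require('perf_hooks');

const t1 = performance.now();
await page.goto('https://example.com');
// performance.now() has sub-millisecond resolution, unlike Date.now()
console.log(`Time: ${(performance.now() - t1).toFixed(1)}ms`);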
You can also pass options to the page.goto function to change when its promise resolves, to something like this (quoted from the docs):
Consider navigation to be finished when there are no more than 0 network connections for at least 500ms
For that, you would have to use the setting networkidle0:
await page.goto('https://example.com', { waitUntil: 'networkidle0' });
There are also other events in the docs linked above you could use.
More complex: Use the Performance API
To get more precise results, you can use the Performance API as you already did in your code. Instead of going through the prototype of window.performance you can also use the functions performance.getEntries() or performance.toJSON() like this:
const perfData = await page.evaluate(() =>
JSON.stringify(performance.toJSON(), null, 2)
);
That way, you get data that looks like this:
{
"timeOrigin": 1556727036740.113,
"timing": {
"navigationStart": 1556727036740,
"unloadEventStart": 0,
"unloadEventEnd": 0,
"redirectStart": 0,
"redirectEnd": 0,
"fetchStart": 1556727037227,
"domainLookupStart": 1556727037230,
"domainLookupEnd": 1556727037280,
"connectStart": 1556727037280,
"connectEnd": 1556727037348,
"secureConnectionStart": 1556727037295,
"requestStart": 1556727037349,
"responseStart": 1556727037548,
"responseEnd": 1556727037805,
"domLoading": 1556727037566,
"domInteractive": 1556727038555,
"domContentLoadedEventStart": 1556727038555,
"domContentLoadedEventEnd": 1556727038570,
"domComplete": 1556727039073,
"loadEventStart": 1556727039073,
"loadEventEnd": 1556727039085
},
"navigation": {
"type": 0,
"redirectCount": 0
}
}
So if you want to know how long it took from navigationStart to loadEventStart you subtract one value from the other one (e.g. 1556727039073 - 1556727036740 = 2333 ms).
So which one to take?
This is up to your decision. In general, it is a good idea to use the Load event as a starting point. Waiting until all requests are finished might actually never happen because there are constantly resources being loaded in the background. Using networkidle2 as waitUntil option might be an alternative in case you don't want to use the load event.
In the end, however, it comes down to your use case which metric to use.
I am using the HTML5 Widget API to take a SoundCloud playlist with 14 songs and store the information about every song in an array.
I use the .getSounds method as follows:
function loadTracks(){
myPlayer.player.getSounds(function(ret){
myPlayer.playlistInfo = ret;
});
}
It correctly returns an array with 14 entries. The first 5 entries contain exactly what I want, but the last 9 have different information that does not seem to be related to the songs.
This is how the returned array looks.
I could recreate the problem with different playlists; I only get the correct information for the first 5 songs.
Does anyone have an idea how to solve this? I set up a CodePen for it.
Thanks.
I was struggling with the same problem. For me it looks like SC.Widget.Events.READY sometimes fires before the widget is really completely ready.
A safe solution is to poll with a function that checks the array for completeness until all the data you need is there. In my example on CodePen I check the array every 200 ms for all titles and break out of the loop on success.
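For reference, a minimal version of that polling idea, assuming the widget instance from the question (the 200 ms interval and the title check mirror what my CodePen does):
// resolves once getSounds() returns `expectedCount` entries that all carry a title
function waitForSounds(player, expectedCount, intervalMs = 200) {
  return new Promise((resolve) => {
    const timer = setInterval(() => {
      player.getSounds((sounds) => {
        if (sounds.length >= expectedCount && sounds.every((s) => s.title)) {
          clearInterval(timer);
          resolve(sounds);
        }
      });
    }, intervalMs);
  });
}

// usage inside the READY handler, e.g.:
// const sounds = await waitForSounds(myPlayer.player, 14);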
It looks like the data is all valid (resource type: sound, type: track), but the API is not returning the full set of information for each song in the playlist beyond the fifth. It only returns the artwork URL and extended information for the first 5; I believe the rest of the songs are still accessible by their id, though. If you need the extra information, you may have to query the SoundCloud API for each specific song beyond the fifth index (if length > 5), which will probably return the full info for each song. You'll have to make many queries with this method, however.
The real answer... it is not possible. getSounds won't return all the track info for a playlist at once like you expect it to. The reasons:
The Widget API is a half-baked, insufficient, poorly-documented, abandoned API.
The full SoundCloud API has more functionality, but has been closed off for many years and will likely never return.
SoundCloud has no real developer support at all anymore.
SoundCloud as a whole company seems to have been circling the drain financially for several years (I recall that several years ago they almost shut down before getting a new CEO). I speculate that this is causing the above shortcomings.
But I didn't create a new answer just to say that.
I need a media player for a redesign of my website. The SoundCloud widget is ugly, incomplete, and inflexible, but SoundCloud as a service already provides streaming audio and tracking of song plays/downloads/comments, which would be a big effort to reimplement. Also, for some reason, SoundCloud is the standard for embedding sharable audio on websites (look at any sample-library demo page). Bandcamp has a widget too, but it can only embed albums, not playlists; it doesn't show things like play counts or supporters, and it is also completely un-customizable. So I really wanted to find a way to make this dumb Widget API work.
Here is a REALLY HACKY AND UGLY way that I think works consistently. Use with caution.
It literally just steps through each track with .next() as fast as it can, calling getCurrentSound repeatedly on each until the full song info loads.
// run inside an async function, once you have the `player` instance
// and know how many tracks are in the widget (`tracks`)...
const fullTracks = [];
// do the following process same amount of times as number of tracks
for (let track = 0; track < tracks.length; track++) {
// make periodic attempts (up to a limit) to get the full info
for (let tries = 0; tries < 100; tries++) {
// promisify getCurrentSound (doesn't handle error)
const sound = await new Promise((resolve) => player.getCurrentSound(resolve));
// check if full info has loaded for current sound yet
if (sound.artwork_url) {
// record full info
fullTracks.push(sound);
// stop re-trying
break;
} else
// wait a bit before re-trying
await new Promise((resolve) => window.setTimeout(resolve, 10));
}
// move to next track
player.next();
}
// reverse the process; go all the way back to first track again
for (let track = 0; track < tracks.length; track++)
player.prev();
On my machine and internet connection, for 13 tracks, this takes about 300ms. It also throws a bunch of horrible errors in the console that you can't try/catch to suppress.