Update HTML data without creating more elements on the webpage - javascript

I have an interval that loads the HTML every 3 seconds, but I want it to refresh the existing content instead of appending more markup below what was already created.
async function Products() {
  setInterval(async function () {
    const response = await fetch('http://localhost:3000/api/products');
    const data = await response.json();
    const mainContainer = document.getElementById("myData");
    for (let obj of data) {
      const div = document.createElement("div");
      div.innerHTML = `${obj["sku"]}: ${obj["name"]}`;
      mainContainer.appendChild(div);
    }
  }, 10000)
}
When I click a start button everything works, but how do I make it refresh the already rendered HTML rather than repeatedly recreating it with the interval? I'm trying to figure out a good approach to this. Thanks

Create and append a <div> immediately, and in the interval, assign to its innerHTML:
function Products() {
  const container = document.getElementById("myData").appendChild(document.createElement("div"));
  setInterval(async function () {
    const response = await fetch('http://localhost:3000/api/products');
    const data = await response.json();
    container.innerHTML = '';
    for (let obj of data) {
      container.innerHTML += `${obj["sku"]}: ${obj["name"]}`;
    }
  }, 10000)
}

I think what you are trying to do is implement a series of observable elements, so that you can look only at the ones that have changed instead of all the data, something like React does with the virtual DOM.
Considering the code you already posted, refreshing elements at a set interval is a bad idea. What if you have 1000 users refreshing at the same time? What if it causes your response time to exceed 3 seconds?
That said, if you really want to build something like that, you have to find a way to load not all the products from the API, but only the ones that have changed. If you want to keep the interval approach, here is, in my opinion, what you could do:
Start by loading all the products on the page, but set an interval to check a new endpoint which tells you which products have been added since the last one.
Use either an index or a key to identify which product is which, so you know the ones you already have.
You need a way to know which products were updated since the last load.
That's a start. You can implement these in different ways. I would suggest keeping created and updated timestamps, so you can query only the records that fall after the last timestamp.
Add a dynamic ID to your product elements, e.g. <div id=${sku}>**Product Here**</div>.
That way, you can track your products and recreate only the ones that changed or are new; a minimal sketch of this keyed approach follows.
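To make that concrete, here is a minimal sketch of the keyed approach, reusing the endpoint and field names from the question (it still fetches the full list each time, so it saves DOM churn rather than bandwidth):

async function refreshProducts() {
  const response = await fetch('http://localhost:3000/api/products');
  const data = await response.json();
  const mainContainer = document.getElementById("myData");
  for (const obj of data) {
    // Look up the element keyed by sku; create it only the first time we see it.
    let div = document.getElementById(obj["sku"]);
    if (!div) {
      div = document.createElement("div");
      div.id = obj["sku"];
      mainContainer.appendChild(div);
    }
    // Existing elements are updated in place, so nothing piles up below them.
    div.textContent = `${obj["sku"]}: ${obj["name"]}`;
  }
}

setInterval(refreshProducts, 3000);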
That's obviously a complicated way of emulating an open connection. If you want another solution, you could also open a socket to your API, which would send events for created and updated data and would ultimately make your 3-second ping obsolete, resulting in better scalability in my opinion.
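A rough sketch of that socket idea; the ws://localhost:3000 endpoint and the event shape are assumptions on my part, since the question's API is plain HTTP:

const socket = new WebSocket('ws://localhost:3000');

socket.addEventListener('message', (event) => {
  // Assumed event shape: { type: 'created' | 'updated', product: { sku, name } }
  const { product } = JSON.parse(event.data);
  let div = document.getElementById(product.sku);
  if (!div) {
    div = document.createElement('div');
    div.id = product.sku;
    document.getElementById('myData').appendChild(div);
  }
  div.textContent = `${product.sku}: ${product.name}`;
});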

Related

How do I get multiple <a> tag href links (in the form of an array) from inside an iframe (Puppeteer)?

Pretty new to coding, which is why this question might be easily answered, but after scanning the internet for 2 days now and still not having a real solution, I thought I'd just ask here.
So, as the title explains, I have an iframe inside a website I want to scrape with an id attribute (we'll just call it iframeid) and somewhere inside this iframe I have a div container with a class attribute (we'll call it divclass) that contains - besides other elements - multiple <a> tags. My goal is to get an array in which all the links from those <a> tags are listed, but to date I've only achieved the following through research and a bit of luck:
const elementHandle = await page.waitForSelector('iframe#iframeid');
const frame = await elementHandle.contentFrame();
await frame.waitForSelector('div[class=divclass] a');
var x = 2; // a var to determine which a tag I want
const oneA = await frame.$('div[class=divclass] a:nth-child(' + x + ')');
const link = await (await oneA.getProperty('href'))._remoteObject.value;
console.log(link);
What it does is take a variable and pull the link of the corresponding <a> tag, but I can't figure out how to put it in a loop, and besides that, the number of <a> tags varies, which makes the loop even harder for me to code.
Wouldn't it even be possible to leave out the loop completely? I found similar Stack Overflow questions, but one, for example, only had one <a> tag, which seems to change the code completely.
In the end I just want a working piece of code that I, as a newbie, can understand but that is fairly compact at the same time.
Thanks for helping in advance!
EDIT
My solution with the help of a comment:
const elementHandle = await page.waitForSelector('iframe#iframeid');
const frame = await elementHandle.contentFrame();
const thisDiv = await frame.waitForSelector('div[class=divclass]');
const xpath_expression = '//a[@href]';
await frame.waitForXPath(xpath_expression);
const links = await frame.$x(xpath_expression);
const link_urls = await frame.evaluate((...links) => {
  return links.map(e => e.href);
}, ...links);
console.log(link_urls);
It does pull out some strange other links though, but I'm just going to filter them out afterwards.
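The strange links are most likely there because //a[@href] is an absolute XPath expression, so it matches every anchor in the frame rather than only those inside the div. A shorter sketch that stays scoped to the div, using Puppeteer's $$eval (with the same iframeid/divclass placeholders as above):

const elementHandle = await page.waitForSelector('iframe#iframeid');
const frame = await elementHandle.contentFrame();
await frame.waitForSelector('div[class=divclass] a');
// $$eval runs querySelectorAll inside the frame and maps the matches,
// so only anchors inside the target div end up in the array.
const linkUrls = await frame.$$eval('div[class=divclass] a', els => els.map(e => e.href));
console.log(linkUrls);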
As far as I know, every iframe can be treated as a separate page. Here is the reference I used for the same type of task: https://stackoverflow.com/a/54940865/17755263

Liferay 7: Using URL params and Javascript to prefill a form

I've been working with Liferay 7 for a while and needed to create a feedback form with prefilled values. I created a feedback form and a page where it's shown and where I could add JavaScript.
The user clicks a link ("Did you find this helpful? Yes/No") and it takes them to the feedback page with the page and answer as URL parameters.
URL: {feedback-page-url/} ?pageUrl=/services&answer=Yes
Now here's where the problems began. Liferay updates its values very confusingly, and while generic document.getElementsByName(...) etc. seemed to work at first, the values reverted when clicking on the page. The difficult part is updating the right values in the right elements, so they won't be overwritten by Liferay.
I provided an answer to my question below. Feel free to ask me anything, I'll try to help :)
Full code block at the end!
So I found a solution to this problem. Liferay creates an instance (_com_liferay...) and uses its values to stay up to date, so we need to get hold of it and update its values. You can do it manually by inspecting the page and finding your instance, but I have an automatic snippet that should get it for you.
The id we are searching for belongs to DDMFormPortlet, and the string we get this way is close to perfect. The string that document.querySelector() finds begins with p_p_id_com..., so we can use .substring to remove the unnecessary prefix and then append "form" to the end to make it complete. If you find a better way to get this, please share it :)
// _com_liferay_dynamic_data_mapping_form_web_portlet_DDMFormPortlet_INSTANCE_randomkey_form
const idFinder = function() {
  const idString = document.querySelector('[id*="DDMFormPortlet"]').id;
  return (idString.substring(6) + "form");
}
Now that we have the correct string, we'll look up the object on window that corresponds to it:
const formFieldArray = window[idFinder()];
Now if you try it just by itself, it most likely won't find anything, because the form loads slowly. I put all of this into a try-catch with setTimeout() to make sure everything works as intended. Now all we need to do is collect the information from our URL and set it in the correct places.
const params = new URLSearchParams(location.search);
const formAutoFiller = function (params) {
  try {
    const formFieldArray = window[idFinder()];
    // make sure you have the numbers according to your form!
    formFieldArray.pages[0].rows[0].columns[0].fields[0].value = params.get('pageUrl');
    formFieldArray.pages[0].rows[1].columns[0].fields[0].value = params.get('answer');
    // ...
  } catch (e) {
    // the form object may not exist yet; retry (see the full code below)
    setTimeout(formAutoFiller, 500, params);
  }
}
And finally, since the changed values are applied to the form only after clicking an input field, we'll move the focus to one of the input fields after the rest of the code has run:
document.getElementsByClassName("ddm-field-text")[1].focus();
A bit of cleanup for myself and we're done! Full JavaScript here:
const params = new URLSearchParams(location.search);
const idFinder = function() {
  const idString = document.querySelector('[id*="DDMFormPortlet"]').id;
  return (idString.substring(6) + "form");
}
const formAutoFiller = function (params) {
  try {
    const formFieldRows = window[idFinder()].pages[0].rows;
    formFieldRows[0].columns[0].fields[0].value = params.get('pageUrl');
    formFieldRows[1].columns[0].fields[0].value = params.get('answer');
    document.getElementsByClassName("ddm-field-text")[1].focus();
  } catch (e) {
    setTimeout(formAutoFiller, 500, params);
  }
}
formAutoFiller(params);

getAllData does not include element just inserted, even after callback

So I am using NeDB as a data store for a simple little project, and I want to create a little API that inserts something new and returns the list of all items in the datastore, including the new one.
async function willAddNewItem() {
  // db is a NeDB Datastore
  const existing = db.getAllData();
  const sortedIds = existing
    .map(item => item.id)
    .sort((one, another) => one - another);
  const id = sortedIds.length === 0
    ? 0
    : sortedIds.slice(-1)[0] + 1;
  const newItem = { id, foo: 'bar' };
  await new Promise(resolve => db.insert(newItem, () => resolve()));
  return db.getAllData()
    .sort((one, another) =>
      new Date(one.updatedAt).getTime() - new Date(another.updatedAt).getTime()
    );
}
However, every time I call this function, I get the list of items I had before inserting the new one. On the next call, the item I added last time would be there, but not the new one. Refreshing my page (which results in calling db.getAllData() again to populate the initial page) shows everything it should. It’s only missing immediately after I insert it. So it would appear that the callback on db.insert is not actually waiting until the insertion is complete (nor is there any documentation that I can find about this).
Adding an index on id fixes this, and I do want an index on id so that is good (a sketch of declaring it follows), but I still want to know why this is happening and where I have to be careful about it happening in the future. For that matter, I'm a little worried that the only reason adding the index to id worked is that it happened to be a little faster, so it was "done" before I called db.getAllData() when it wasn't before, and that this may not be consistent.
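For reference, a minimal sketch of declaring that index with NeDB's documented ensureIndex; the unique flag is my assumption about how id should behave:

// Declare a unique index on id; NeDB persists index definitions in the datafile.
db.ensureIndex({ fieldName: 'id', unique: true }, (err) => {
  if (err) console.error('Index creation failed:', err);
});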
I could use something like
const afterAdding = await new Promise(resolve =>
  db.insert(newItem, (_, newDoc) => resolve([...existing, newDoc]))
);
and return the sorted afterAdding, but this seems a little dubious to me (I’d really like to get exactly the db’s state rather than try to “recreate” it myself, assuming that insert has done exactly and only what I expected), and in any event I would like to know more about how NeDB works here.
Also, yes, I know I’m ignoring possible errors on the insert (I have checked to see if this is the cause; it is not), and I probably don’t want .getAllData() at all but rather some .find query. This is a basic test case as I get familiar with NeDB.
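As an aside, since NeDB's callbacks follow Node's (err, result) convention, one way to await the store's actual state instead of recreating it from existing is to promisify insert and find. This is only a sketch, and nextId() is a hypothetical stand-in for the id computation above:

const { promisify } = require('util');

const insertAsync = promisify(db.insert.bind(db));
const findAsync = promisify(db.find.bind(db));

async function willAddNewItem() {
  const newItem = { id: nextId(), foo: 'bar' }; // nextId(): hypothetical helper
  await insertAsync(newItem);
  // find() runs a real query, so it reflects the freshly inserted document.
  return findAsync({});
}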

Updating the cart number in Shopify with AJAX

Apologies if this is somewhere else but I can't find a better way of doing this.
I want to update the cart total number in realtime with AJAX anytime a product is added, removed, or quantity is changed.
I have it working but there has to be a better way. It feels wrong to poll constantly for changes and I certainly don't want to end up with a memory leak.
Right now I have the item count being fetched from JSON and then changing the number in the div by polling every second, giving the user the illusion that the number is being updated when they change something.
I've tried adding a listener to the add to cart button (works) as well as listening on the quantity selector (doesn't work).
I'm sure I'm just being a noob so any help is appreciated. Code below:
// Fetch the cart as JSON and update the cart quantity on the fly after the first product is added to the cart
function doPoll() {
  setTimeout(function() {
    $.getJSON('/cart.js', function(cart) {
      $('.cart-count').html(cart.item_count);
      doPoll();
    });
  }, 1000);
}
Update
So the fix was actually very simple. The reason I couldn't attach a listener to a particular element within the cart was that the cart wasn't loaded yet via AJAX (duh!)
So all I did was remove the constant polling and instead run my function any time the AJAX on the page had fully completed:
$(document).ajaxComplete(function() {
  // Do stuff here
});
Actually it's even simpler. If an AJAX-driven cart adjustment happens, Timber gives you an overridable hook. Just override ShopifyAPI.onCartUpdate, e.g.:
ShopifyAPI.onCartUpdate = function(cart) {
  // do something new with the cart contents
  $('.cart-count').html(cart.item_count);
};
Other than that, your cart count is also available on page load via Liquid, so if you combine the two you're covered:
<div class="cart-count">{{ cart.item_count }}</div>
If you have not implemented any other way to add to cart, it's very simple.
If you look in the AJAX functions file you mentioned, especially at the functions at lines 101 and 113, you can see that those functions are related to cart updates. Add your code $('.cart-count').html(cart.item_count); before callback(cart); and you'll be good to go.
A more up-to-date solution without jQuery: you can use fetch and then use the returned data to populate the cart count.
fetch('/cart.js')
  .then(response => response.text())
  .then((responseText) => {
    const data = JSON.parse(responseText);
    const counterEls = document.querySelectorAll('.js-cart-item-count');
    counterEls.forEach((element) => {
      element.innerHTML = data.item_count;
    });
  });

What is more effective: making an AJAX request every time, or slicing a stored result in a function?

I have JSON news data like this:
{
"news": [
{"title": "some title #1","text": "text","date": "27.12.15 23:45"},
{"title": "some title #2","text": "text","date": "26.12.15 22:35"},
...
]
}
I need to get a certain number of items from this list, depending on an argument to a function. As I understand it, this is called pagination.
I can get the AJAX response and slice it immediately, so that every time the function is called, it makes an AJAX request.
Like this:
function showNews(page) {
  var newsPerPage = 5,
      firstArticle = newsPerPage * (page - 1),
      xhr = new XMLHttpRequest(); // xhr was not declared in the original snippet
  xhr.onreadystatechange = function() {
    if (xhr.readyState == 4) {
      var newsArr = JSON.parse(xhr.responseText);
      newsArr.news = newsArr.news.slice(firstArticle, newsPerPage * page);
      addNews(newsArr);
    }
  };
  xhr.open("GET", url, true); // url is defined elsewhere
  xhr.send();
}
Or I can store the whole result in newsArr and slice it, sorted into pages, in the additional function addNews.
function addNews(newsArr, newsPerPage) {
  var amount = newsArr.news.length,            // total number of news items
      pages = Math.ceil(amount / newsPerPage), // number of pages
      pagesData = {};
  for (var i = 0; i < pages; i++) {
    var min = i * newsPerPage,       // min index of the current page in the loop
        max = (i + 1) * newsPerPage; // max index of the current page in the loop
    newsArr.news.forEach(createPageData);
  }
  function createPageData(item, j) {
    if (j + 1 <= max && j >= min) {
      if (!pagesData["page" + (i + 1)]) {
        pagesData["page" + (i + 1)] = { news: [] };
      }
      pagesData["page" + (i + 1)].news.push(item);
    }
  }
}
So, the simple question is: which variant is more effective? The first one loads the server and the second loads the user's memory. What would you choose in my situation? :)
Thanks for the answers. I understood what I wanted. But there are so many good answers that I can't choose the best one.
It is actually a primarily opinion-based question.
For me, the pagination approach looks better because it will not produce a "lag" before displaying the news. From the user's POV the page will load faster.
As for me, I would do pagination plus preloading of the next page, i.e., always store the contents of the next page so that you can show it without a delay. When a user moves to the last loaded page, fetch another one. A sketch of this follows.
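A sketch of that preload idea; fetchPage, the cache, and the /news?page= endpoint are illustrative stand-ins, not part of the question's code:

const cache = {};

async function fetchPage(page) {
  if (!cache[page]) {
    const response = await fetch('/news?page=' + page); // hypothetical paged endpoint
    cache[page] = await response.json();
  }
  return cache[page];
}

async function showPage(page) {
  const news = await fetchPage(page);
  addNews(news);
  fetchPage(page + 1); // warm the cache for the next page; deliberately not awaited
}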
Loading all the news is definitely a bad idea. If you have 1000 news records, then every user will have to load all of them, even if they aren't going to read a single one.
In my opinion, the less requests == better rule doesn't apply here. It is not guaranteed that a user will read all the news. If Stack Overflow loaded every question it has each time you opened the main page, then both Stack Overflow and its users would have huge problems.
If the max number of records your service returns is around 1000, then I don't think it is going to create a huge payload or memory issue (judging by the nature of your data), so I think option 2 is better because:
the number of service calls will be lower
since the user will not see any lag while paginating, their experience of using the site will be better.
As a rule of thumb:
less requests == better
but that's not always possible. You may run out of memory or network capacity if the data you store is huge, i.e. you may need pagination on the server side. Actually, server-side pagination should be the default approach, and then you think about improvements (e.g. local caching) if you really need them.
So what you should do is try all scenarios and see how well they behave in your concrete situation.
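For illustration, a minimal server-side slice in Express; the framework, route, and allNews variable are assumptions on my part, since the question doesn't describe the backend:

// Hypothetical Express handler: the server slices, so each client downloads one page only.
app.get('/news', (req, res) => {
  const page = parseInt(req.query.page, 10) || 1;
  const perPage = parseInt(req.query.perPage, 10) || 5;
  const start = perPage * (page - 1);
  res.json({ news: allNews.news.slice(start, start + perPage) }); // allNews: the full dataset
});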
I prefer to fetch all the data but show it conditionally: when the user clicks the next button, the data is already there, so just hide and show it using jQuery.
Calling AJAX every time is a bad idea.
But you also need to call AJAX for new data if the data has changed after some period of time.
