I made an Axios GET request to fetch data and display it with React:
import { useEffect, useState } from "react";
import axios from "axios";

export function Wareh() {
  const [wareh, setWareh] = useState([]);

  useEffect(() => {
    axios.get("http://localhost:1515/wareh").then((response) => {
      setWareh(response.data);
    });
  }, []);

  return wareh;
}
The problem here is that if I update my data, I have to refresh the page to see the update. Here is my question:
How can I make it so that any change that happens in the database is reflected in the fetched data automatically?
There are three ways to achieve this:
Long polling
In this technique, you set an interval and call the same endpoint periodically to refresh the data over time.
setInterval(() => fetchWareh(), 5000)
Let's assume you move your Axios call into a function named fetchWareh and call that function every 5 seconds.
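For example, a rough sketch of how that could look inside the component from the question (same endpoint, interval cleared on unmount):

import { useEffect, useState } from "react";
import axios from "axios";

export function Wareh() {
  const [wareh, setWareh] = useState([]);

  useEffect(() => {
    // The Axios call moved into a reusable function
    const fetchWareh = () =>
      axios
        .get("http://localhost:1515/wareh")
        .then((response) => setWareh(response.data));

    fetchWareh(); // initial load
    const intervalId = setInterval(fetchWareh, 5000); // re-fetch every 5 seconds

    // Stop polling when the component unmounts
    return () => clearInterval(intervalId);
  }, []);

  return wareh;
}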
Server-Sent Events
This approach is quite similar to the first one. You can read more about it here:
https://www.w3schools.com/html/html5_serversentevents.asp
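On the client side, a minimal sketch using the browser's EventSource API could look like this (the /wareh-stream endpoint is an assumption; your server would need to expose an SSE route that sends the updated list as JSON):

import { useEffect, useState } from "react";

export function Wareh() {
  const [wareh, setWareh] = useState([]);

  useEffect(() => {
    // EventSource keeps an open HTTP connection and receives server-pushed events
    const source = new EventSource("http://localhost:1515/wareh-stream");
    source.onmessage = (event) => setWareh(JSON.parse(event.data));

    // Close the connection when the component unmounts
    return () => source.close();
  }, []);

  return wareh;
}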
Implement sockets on the server & client
The most recommended way is to do it through socket.io to fetch real-time data.
Socket Documentation Here
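As a rough sketch of that approach (the "wareh:update" event name, the port, and the httpServer/updatedWareh variables are assumptions, not part of your code):

// Server (Node) - emit whenever the warehouse data changes
// httpServer is your existing HTTP server, updatedWareh is the fresh data
const io = require("socket.io")(httpServer, { cors: { origin: "*" } });
io.emit("wareh:update", updatedWareh);

// Client (React)
import { useEffect, useState } from "react";
import { io } from "socket.io-client";

export function Wareh() {
  const [wareh, setWareh] = useState([]);

  useEffect(() => {
    const socket = io("http://localhost:1515");
    socket.on("wareh:update", (data) => setWareh(data)); // update state on every push
    return () => socket.disconnect();
  }, []);

  return wareh;
}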
Related
I am using Firebase JavaScript Modular Web Version 9 SDK with my Vue 3 / TypeScript app.
My understanding is that when using Firestore real-time listeners with offline persistence it should work like this:
When the listener is started the callback fires with data read from the local cache, and then immediately after it also tries to read from the server to make sure the local cache has up to date values. If the server data matches the local cache the callback listener should only fire once with data read from the local cache.
When data changes, the callback listener fires with data read from the server. It uses that data to update the local cache.
When data doesn't change, all subsequent calls to the listener trigger a callback with data read from the local cache.
But I have set up offline persistence, created a listener for my Firestore data, and monitored where the reads were coming from...
And in my app I see an initial read from the local cache (expected), and then a second immediate read from the server (unexpected). And after that all subsequent reads are coming from the server (also unexpected).
During this testing none of my data has changed. So I expected all reads from the callback listener to be coming from the local cache, not the server.
And actually the only time I see a read from the local cache is when the listener is first started, but this was to be expected.
What could be the problem?
P.S. To make those "subsequent calls" I am navigating to a different page of my SPA and then coming back to the page where my component lives to trigger it again.
src/composables/database.ts
import { collection, onSnapshot, query, where } from 'firebase/firestore';
import type { DocumentData } from 'firebase/firestore';
import { ref } from 'vue';
// `db` is the initialized Firestore instance exported from my Firebase setup module

export const useLoadWebsite = () => {
  const q = query(
    collection(db, 'websites'),
    where('userId', '==', 'NoLTI3rDlrZtzWCbsZpPVtPgzOE3')
  );

  const firestoreWebsite = ref<DocumentData>();

  onSnapshot(q, { includeMetadataChanges: true }, (querySnapshot) => {
    const source = querySnapshot.metadata.fromCache ? 'local cache' : 'server';
    console.log('Data came from ' + source);

    const colArray: DocumentData[] = [];
    querySnapshot.docs.forEach((doc) => {
      colArray.push({ ...doc.data(), id: doc.id });
    });

    firestoreWebsite.value = colArray[0];
  });

  return firestoreWebsite;
};
src/components/websiteUrl.vue
<template>
  <div v-if="website?.url">{{ website.url }}</div>
</template>

<script setup lang="ts">
import { useLoadWebsite } from '../composables/database';

const website = useLoadWebsite();
</script>
Nothing is wrong. What you're describing is working exactly the way I would expect.
Firestore local persistence is not meant to be a full replacement for the backend. By default, it's meant to be a temporary data source in case the backend is not available. If the backend is available, then the SDK will prefer to ensure that the client app is fully synchronized with it, and serve all updates from that backend as long as it's available.
If you want to force a query to use only the cache and not the backend, you can programmatically specify the cache as the source for that query.
If you don't want any updates at all from the server for whatever reason, then you can disable network access entirely.
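For illustration only, here is a sketch of both options with the v9 modular SDK, reusing the q and db from your composable (getDocsFromCache does a one-time read served purely from the cache; disableNetwork stops all server traffic until it is re-enabled):

import { getDocsFromCache, disableNetwork } from 'firebase/firestore';

// One-time query answered only from the local cache
const cachedSnapshot = await getDocsFromCache(q);
console.log(cachedSnapshot.docs.map((doc) => doc.data()));

// Or cut off server access entirely, so listeners are served from the cache
await disableNetwork(db);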
See also:
Firestore clients: To cache, or not to cache? (or both?)
I figured out why I was getting a result different than expected.
The culprit was { includeMetadataChanges: true }.
As explained here in the docs, that option will trigger a listener event for metadata changes.
So the listener callback was also triggering on each metadata change, instead of just data reads and writes, causing me to see strange results.
After removing that it started to work as expected, and I verified it by checking it against the Usage graphs in Firebase console which show the number of reads and snapshot listeners.
Here is the full code with that option removed:
export const useLoadWebsite = () => {
  const q = query(
    collection(db, 'websites'),
    where('userId', '==', 'NoLTI3rDlrZtzWCbsZpPVtPgzOE3')
  );

  const firestoreWebsite = ref<DocumentData>();

  onSnapshot(q, (querySnapshot) => {
    const source = querySnapshot.metadata.fromCache ? 'local cache' : 'server';
    console.log('Data came from ' + source);

    const colArray: DocumentData[] = [];
    querySnapshot.docs.forEach((doc) => {
      colArray.push({ ...doc.data(), id: doc.id });
    });

    firestoreWebsite.value = colArray[0];
  });

  return firestoreWebsite;
};
I'm actually working on a ReactJS project with Laravel on the backend. I want to play a sound whenever I receive a notification. I tried, but I didn't find a solution.
This is my code:
useEffect(() => {
  const statusInterval = setInterval(() => {
    getData();
  }, 10000);
  return () => {
    clearInterval(statusInterval);
  };
}, []);

async function getData() {
  const response = await fetch(`${API_ENDPOINT}/api/listComplain`);
  const result = await response.json();
  setData(result);
}
I would also like to know if there's a better way than setInterval to get the data.
Thanks in advance for your help.
The classic approach would be to use WebSockets to push these instant notifications to clients, but that requires some setup and knowledge. I actually wrote a service that you can POST to, and it will send the data (like a string) to the clients, which is what you need: https://rejax.io
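For reference, a minimal sketch of the WebSocket route on the React side (the URL, message shape, and sound file are assumptions, not part of this answer):

useEffect(() => {
  // Connect to whatever WebSocket server your Laravel backend exposes
  const ws = new WebSocket("ws://localhost:6001/notifications");

  ws.onmessage = (event) => {
    const notification = JSON.parse(event.data);
    setData((prev) => [notification, ...prev]); // show the new entry immediately
    new Audio("/sounds/notification.mp3")       // play a sound on arrival
      .play()
      .catch(() => {});                         // browsers may block autoplay before user interaction
  };

  return () => ws.close();
}, []);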
I am using Quill as a rich text editor for my Node / Express web app.
Quill API has a method called "on" (https://quilljs.com/docs/api/#on) to fire an event every time the editor selection or text changes.
I am using this method to save the contents of the editor to a MySQL database, using quill.root.innerHTML to capture the entirety of the content in HTML format.
This works well, but my problem is that this approach fires a POST request to my Express endpoint for every keystroke of the user. I don't want to overwhelm the server and I don't need to save every keystroke variation.
One solution I imagined was to delay the DB query by 3 seconds and fire only one request with the most recent content of the editor.
I tried using setTimeout() to achieve this like so:
app.post('/editor', (req, res) => {
  let post = true;
  const id = req.query.id;
  const data = req.body.content;

  setTimeout(() => {
    if (post == true) {
      post = false;
      db.connection.query('UPDATE my_table SET content = ? WHERE id = ?', [data, id], (error) => {
        if (error) {
          return res.status(500).send("Internal server error");
        }
        res.status(200).send();
      });
    }
    console.log('data posted');
  }, 3000);
});
As you can see, I tried using a boolean. I know why this code doesn't work, but I couldn't figure out a way to "ignore" the requests that happen between the time intervals, and only fire a DB query with the latest data from the editor.
Thanks!
I managed to solve the problem using "debounce" from Underscore.js (http://underscorejs.org/#debounce). It works really well!
I did not touch the server route. I implemented it on the frontend. Here's what the code looks like now:
// debounce comes from Underscore.js (use _.debounce if Underscore is loaded globally)
import { debounce } from 'underscore';

const quill = new Quill('#editor', options);

function update() {
  let quillHtml = quill.root.innerHTML;
  let quillContent = {
    "contents": `${quillHtml}`
  };
  postData('/editor', id, quillContent);
}

// Only fire once the editor has been idle for 1 second
const lazyUpdate = debounce(update, 1000);
quill.on('text-change', lazyUpdate);
postData() is just a helper function to generate a POST request using fetch()
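For completeness, here is a hypothetical version of that helper, guessed from how it is called above (the parameter order and payload shape are assumptions):

// Hypothetical helper: POSTs the editor content as JSON to the given route
function postData(url, id, content) {
  return fetch(`${url}?id=${encodeURIComponent(id)}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ content: content.contents }),
  });
}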
I'm wondering if it's possible to fetch data only once to a running React app.
The goal that I want to achieve is this:
I have an app where I'm fetching user data from the JSONPlaceholder API, then passing this data to the component's state and assigning it to a usersArray variable.
While the app is running, some actions are performed, such as deleting entries from this usersArray.
WHAT IS HAPPENING:
After the page reloads, the main component is mounting once again and the data is fetched once again.
EXPECTATION:
I want to fetch this data only once and keep it forever. Is it possible to achieve this in React, or am I doing something wrong?
You could put the data in localStorage and always check if there is some data there before doing the request.
Example
class App extends React.Component {
  state = { users: [] };

  componentDidMount() {
    let users = localStorage.getItem("users");

    if (users) {
      users = JSON.parse(users);
      this.setState({ users });
    } else {
      fetch("https://jsonplaceholder.typicode.com/users")
        .then(res => res.json())
        .then(users => {
          this.setState({ users });
          localStorage.setItem("users", JSON.stringify(users));
        });
    }
  }

  handleClick = index => {
    this.setState(
      prevState => {
        const users = [...prevState.users];
        users.splice(index, 1);
        return { users };
      },
      () => {
        localStorage.setItem("users", JSON.stringify(this.state.users));
      }
    );
  };

  render() {
    return (
      <div>
        {this.state.users.map((user, index) => (
          <div key={user.id}>
            <span>{user.name}</span>
            <button onClick={() => this.handleClick(index)}>Remove</button>
          </div>
        ))}
      </div>
    );
  }
}
You can use session storage if you want the data to be retrieved once PER SESSION, or local storage if you want better control over the data's "expiration". Set a value in the fetch's callback, and check it before fetching again every time the app loads. The window.sessionStorage API is simple to use:
// sessionStorage is available globally as window.sessionStorage
// Save data to sessionStorage
sessionStorage.setItem('key', 'value');
// Get saved data from sessionStorage
var data = sessionStorage.getItem('key');
// Remove saved data from sessionStorage
sessionStorage.removeItem('key');
// Remove all saved data from sessionStorage
sessionStorage.clear();
The same syntax is used for window.localStorage.
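A small sketch of that "check before fetching" pattern (the key name and endpoint are placeholders):

async function loadUsersOnce() {
  const cached = sessionStorage.getItem("users");
  if (cached) {
    return JSON.parse(cached); // already fetched during this session
  }
  const res = await fetch("https://jsonplaceholder.typicode.com/users");
  const users = await res.json();
  sessionStorage.setItem("users", JSON.stringify(users)); // mark as fetched for the rest of the session
  return users;
}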
This is absolutely right:
After the page reloads, the main component is mounting once again and the data is fetched once again.
From my understanding, when you delete and then refresh, the deleted data comes back.
This is of course going to happen, as everything is getting stored in memory.
Solution: you must use a database. When you save data, save it to the DB; when you delete data, delete it from the DB; and when you fetch data, fetch it from the DB.
Any operation like update or delete will then work fine.
The question is WHY you are expecting something different.
It's only a fake API; a real one would work as expected. If you're correctly sending the change/delete requests, they will be persisted, and a refresh will read the updated data (if not cached).
If it's a SPA (Single Page App) there is no need for a refresh; it shouldn't happen. Maybe something went wrong with the router?
I need to use some data from a 3rd-party API in my app, poll the server for the needed data at a certain frequency, and make it available to the client. The easiest way would be to create a collection, update it, and make the data available to the client via pub/sub. But in this particular case I don't need to store that data or keep track of it, and it updates very frequently, so storing it in the DB would just be additional unneeded work. I would prefer to keep it in RAM somehow and make it available to the client in some other way than collections (perhaps returned from a method call). But I'm not sure how to do that. Could someone suggest a nice approach?
You could use the package meteor-publish-join to fetch data from an external API and publish it to the client periodically (disclaimer: I am the author):
Server:
import { JoinServer } from 'meteor-publish-join';

Meteor.publish('test', function() {

  // Publish a random value from an external API, plays well with promise, re-run every 10 seconds
  JoinServer.publish({
    context: this,
    name: 'withPromise',
    interval: 10000,
    doJoin() {
      const id = parseInt(Math.random() * 100, 10);

      return fetch(`https://jsonplaceholder.typicode.com/posts/${id}`)
        .then(res => res.json())
        .then(data => data.title)
        .catch(err => console.error(err));
    },
  });
});
Client:
import { JoinClient } from 'meteor-publish-join';
Meteor.subscribe('test');
// Get the values published within `test` publication. All these values are reactive
JoinClient.get('withPromise')
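For example, since the value is reactive, you could consume it inside a Tracker computation or a template helper (a minimal sketch, assuming the subscription above is active):

import { Tracker } from 'meteor/tracker';
import { JoinClient } from 'meteor-publish-join';

// Re-runs automatically whenever the published value changes
Tracker.autorun(() => {
  const title = JoinClient.get('withPromise');
  console.log('Latest title from the external API:', title);
});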